WorldWideScience

Sample records for scheduling process ii

  1. Prescriptions for schedule II opioids and benzodiazepines increase after the introduction of computer-generated prescriptions.

    Science.gov (United States)

    McGerald, Genevieve; Dvorkin, Ronald; Levy, David; Lovell-Rose, Stephanie; Sharma, Adhi

    2009-06-01

    Prescriptions for controlled substances decrease when regulatory barriers are put in place. The converse has not been studied. The objective was to determine whether a less complicated prescription writing process is associated with a change in the prescribing patterns of controlled substances in the emergency department (ED). The authors conducted a retrospective nonconcurrent cohort study of all patients seen in an adult ED between April 19, 2005, and April 18, 2007, who were discharged with a prescription. Prior to April 19, 2006, a specialized prescription form stored in a locked cabinet was obtained from the nursing staff to write a prescription for benzodiazepines or Schedule II opioids. After April 19, 2006, New York State mandated that all prescriptions, regardless of schedule classification, be generated on a specialized bar-coded prescription form. The main outcome of the study was to compare the proportion of Schedule III-V opioids to Schedule II opioids and benzodiazepines prescribed in the ED before and after the introduction of a less cumbersome prescription writing process. Of the 26,638 charts reviewed, 2.1% of the total number of prescriptions generated were for a Schedule II controlled opioid before the new system was implemented compared to 13.6% after (odds ratio [OR] = 7.3, 95% confidence interval [CI] = 6.4 to 8.4). The corresponding percentages for Schedule III-V opioids were 29.9% to 18.1% (OR = 0.52, 95% CI = 0.49 to 0.55) and for benzodiazepines 1.4% to 3.9% (OR = 2.8, 95% CI = 2.4 to 3.4). Patients were more likely to receive a prescription for a Schedule II opioid or a benzodiazepine after a more streamlined computer-generated prescription writing process was introduced in this ED. (c) 2009 by the Society for Academic Emergency Medicine.
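
    To make the reported effect sizes concrete, the sketch below computes an odds ratio and a Wald-type 95% confidence interval from a hypothetical 2x2 table of prescription counts; the counts are invented to roughly mirror the reported 2.1% to 13.6% shift and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
       a = Schedule II scripts after,  b = other scripts after,
       c = Schedule II scripts before, d = other scripts before.
       Counts are hypothetical, for illustration only."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts roughly shaped like the reported 2.1% -> 13.6% shift.
print(odds_ratio_ci(a=1360, b=8640, c=210, d=9790))
```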

  2. 21 CFR 1308.12 - Schedule II.

    Science.gov (United States)

    2010-04-01

    ... 9733 (26) Remifentanil 9739 (27) Sufentanil 9740 (28) Tapentadol 9780 (d) Stimulants. Unless ... which contains any quantity of the following substances having a stimulant effect on the central nervous ...

  3. Analyzing scheduling in the food-processing industry

    DEFF Research Database (Denmark)

    Akkerman, Renzo; van Donk, Dirk Pieter

    2009-01-01

    Production scheduling has been widely studied in several research areas, resulting in a large number of methods, prescriptions, and approaches. However, the impact on scheduling practice seems relatively low. This is also the case in the food-processing industry, where industry-specific characteristics induce specific and complex scheduling problems. Based on ideas about decomposition of the scheduling task and the production process, we develop an analysis methodology for scheduling problems in food processing. This combines an analysis of structural (technological) elements of the production process with an analysis of the tasks of the scheduler. This helps to understand, describe, and structure scheduling problems in food processing, and forms a basis for improving scheduling and applying methods developed in literature. It also helps in evaluating the organisational structures...

  4. 21 CFR 113.83 - Establishing scheduled processes.

    Science.gov (United States)

    2010-04-01

    ... commercial production runs should be determined on the basis of recognized scientific methods to be of a size ... Scheduled processes for ... production shall be adequately provided for in establishing the scheduled process. Critical factors, e.g. ...

  5. AsmL Specification of a Ptolemy II Scheduler

    DEFF Research Database (Denmark)

    Lázaro Cuadrado, Daniel; Koch, Peter; Ravn, Anders Peter

    2003-01-01

    Ptolemy II is a tool that combines different computational models for simulation and design of embedded systems. AsmL is a software specification language based on the Abstract State Machine formalism. This paper reports on development of an AsmL model of the Synchronous Dataflow domain scheduler...

  6. Pharmacists correcting schedule II prescriptions: DEA flip-flops continue.

    Science.gov (United States)

    Abood, Richard R

    2010-12-01

    The Drug Enforcement Administration (DEA) has in recent years engaged in flip-flopping over important policy decisions. The most recent example involved whether a pharmacist can correct a written schedule II prescription upon verification with the prescriber. For several years the DEA's policy permitted this practice. Then the DEA issued a conflicting policy statement in 2007 in the preamble to the multiple schedule II prescription regulation, causing a series of subsequent contradictory statements ending with the policy that pharmacists should follow state law or policy until the Agency issues a regulation. It is doubtful that the DEA's opinion in the preamble would in itself constitute legal authority, or that the Agency would try to enforce the opinion. Nonetheless, these flip-flop opinions have confused pharmacists, caused some pharmacies to have claims rejected by third party payors, and most likely have inconvenienced patients.

  7. 78 FR 55099 - Established Aggregate Production Quotas for Schedule I and II Controlled Substances and...

    Science.gov (United States)

    2013-09-09

    ... aggregate production quotas, an additional 25% of the estimated medical, scientific, and research needs as ... initial 2014 aggregate production quotas for controlled substances in Schedules I and II of the Controlled ...

  8. LHC Experiments Phase II - TDRs Approval Process

    CERN Document Server

    Forti, F

    2017-01-01

    The overall review process and steps of Phase II were described in CERN-LHCC-2015-077. As experiments submit detailed technical design reports (TDRs), the LHCC and UCG work in close connection to ensure a timely review of the scientific and technical feasibility as well as of the budget and schedule of the upgrade programme.

  9. A Bee Evolutionary Guiding Nondominated Sorting Genetic Algorithm II for Multiobjective Flexible Job-Shop Scheduling

    Directory of Open Access Journals (Sweden)

    Qianwang Deng

    2017-01-01

    Flexible job-shop scheduling problem (FJSP) is an NP-hard problem which inherits the characteristics of the job-shop scheduling problem (JSP). This paper presents a bee evolutionary guiding nondominated sorting genetic algorithm II (BEG-NSGA-II) for multiobjective FJSP (MO-FJSP) with the objectives to minimize the maximal completion time, the workload of the most loaded machine, and the total workload of all machines. It adopts a two-stage optimization mechanism during the optimizing process. In the first stage, the NSGA-II algorithm with T iteration times is first used to obtain the initial population N, in which a bee evolutionary guiding scheme is presented to exploit the solution space extensively. In the second stage, the NSGA-II algorithm with GEN iteration times is used again to obtain the Pareto-optimal solutions. In order to enhance the searching ability and avoid premature convergence, an updating mechanism is employed in this stage. More specifically, its population consists of three parts, and each of them changes with the iteration times. Furthermore, numerical simulations are carried out based on some published benchmark instances. Finally, the effectiveness of the proposed BEG-NSGA-II algorithm is shown by comparing the experimental results with those of some well-known existing algorithms.

  10. A Bee Evolutionary Guiding Nondominated Sorting Genetic Algorithm II for Multiobjective Flexible Job-Shop Scheduling.

    Science.gov (United States)

    Deng, Qianwang; Gong, Guiliang; Gong, Xuran; Zhang, Like; Liu, Wei; Ren, Qinghua

    2017-01-01

    Flexible job-shop scheduling problem (FJSP) is an NP-hard problem which inherits the characteristics of the job-shop scheduling problem (JSP). This paper presents a bee evolutionary guiding nondominated sorting genetic algorithm II (BEG-NSGA-II) for multiobjective FJSP (MO-FJSP) with the objectives to minimize the maximal completion time, the workload of the most loaded machine, and the total workload of all machines. It adopts a two-stage optimization mechanism during the optimizing process. In the first stage, the NSGA-II algorithm with T iteration times is first used to obtain the initial population N, in which a bee evolutionary guiding scheme is presented to exploit the solution space extensively. In the second stage, the NSGA-II algorithm with GEN iteration times is used again to obtain the Pareto-optimal solutions. In order to enhance the searching ability and avoid premature convergence, an updating mechanism is employed in this stage. More specifically, its population consists of three parts, and each of them changes with the iteration times. Furthermore, numerical simulations are carried out based on some published benchmark instances. Finally, the effectiveness of the proposed BEG-NSGA-II algorithm is shown by comparing the experimental results with those of some well-known existing algorithms.
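
    The two records above describe the same two-stage mechanism. As a rough illustration only, the sketch below runs a toy bi-objective permutation GA in two stages; it uses a plain Pareto filter in place of full NSGA-II ranking and omits the bee-evolutionary guiding scheme and the three-part updating mechanism, so it is a stand-in for the idea rather than the authors' algorithm.

```python
import random

N_JOBS = 8

def evaluate(perm):
    # Toy stand-ins for the paper's objectives; real FJSP evaluation is far richer.
    f1 = sum((i + 1) * j for i, j in enumerate(perm))     # pseudo "makespan"
    f2 = sum(abs(a - b) for a, b in zip(perm, perm[1:]))  # pseudo "workload"
    return f1, f2

def dominates(p, q):
    return all(x <= y for x, y in zip(p, q)) and any(x < y for x, y in zip(p, q))

def pareto_front(pop):
    objs = [evaluate(s) for s in pop]
    return [s for s, o in zip(pop, objs) if not any(dominates(o2, o) for o2 in objs)]

def order_crossover(a, b):
    i, j = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def mutate(p):
    i, j = random.sample(range(len(p)), 2)
    p = p[:]
    p[i], p[j] = p[j], p[i]
    return p

def evolve(pop, generations):
    for _ in range(generations):
        elite = pareto_front(pop)            # crude stand-in for NSGA-II ranking
        while len(elite) < len(pop):
            a, b = random.sample(pop, 2)
            elite.append(mutate(order_crossover(a, b)))
        pop = elite[:len(pop)]
    return pop

random.seed(0)
pop = [random.sample(range(N_JOBS), N_JOBS) for _ in range(20)]
pop = evolve(pop, generations=10)   # stage 1: build a guided initial population (T iterations)
pop = evolve(pop, generations=30)   # stage 2: refine toward the Pareto set (GEN iterations)
print(pareto_front(pop)[:3])
```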

  11. A Scheduling Model for the Re-entrant Manufacturing System and Its Optimization by NSGA-II

    Directory of Open Access Journals (Sweden)

    Masoud Rabbani

    2016-11-01

    In this study, a two-objective mixed-integer linear programming (MILP) model for the multi-product re-entrant flow shop scheduling problem has been designed. Two objectives are considered: one is maximization of the production rate and the other is minimization of the processing time. The system has m stations and can process several products at a time. The re-entrant flow shop scheduling problem is a well-known NP-hard problem and its complexity has been discussed by several researchers. Given that the NSGA-II algorithm is one of the strongest and most widely applicable algorithms for solving multi-objective optimization problems, it is used to solve this problem. To increase algorithm performance, the Taguchi technique is used to design experiments for tuning the algorithm's parameters. Numerical experiments are presented to show the efficiency and effectiveness of the model. Finally, the results of NSGA-II are compared with the SPEA2 algorithm (Strength Pareto Evolutionary Algorithm 2). The experimental results show that the proposed algorithm performs significantly better than SPEA2.

  12. Multi-Objective Flexible Flow Shop Scheduling Problem Considering Variable Processing Time due to Renewable Energy

    Directory of Open Access Journals (Sweden)

    Xiuli Wu

    2018-03-01

    Renewable energy is an alternative to non-renewable energy for reducing the carbon footprint of manufacturing systems. Finding out how to produce an energy-efficient scheduling solution when both renewable and non-renewable energy drive production is therefore of great importance. In this paper, a multi-objective flexible flow shop scheduling problem that considers variable processing time due to renewable energy (MFFSP-VPTRE) is studied. First, the optimization model of the MFFSP-VPTRE is formulated considering the periodicity of renewable energy and the limitations of energy storage capacity. Then, a hybrid non-dominated sorting genetic algorithm with variable local search (HNSGA-II) is proposed to solve the MFFSP-VPTRE. An operation- and machine-based encoding method is employed. A low-carbon scheduling algorithm is presented. Besides crossover and mutation, a variable local search is used to improve the offspring's Pareto set. The offspring and the parents are combined and those that dominate more are selected to continue evolving. Finally, two groups of experiments are carried out. The results show that the low-carbon scheduling algorithm can effectively reduce the carbon footprint under the premise of makespan optimization and that the HNSGA-II outperforms the traditional NSGA-II and can solve the MFFSP-VPTRE effectively and efficiently.

  13. 78 FR 37237 - Proposed Adjustments to the Aggregate Production Quotas for Schedule I and II Controlled...

    Science.gov (United States)

    2013-06-20

    ... class of controlled substance listed in schedules I and II and for ephedrine, pseudoephedrine, and ... disposal by the registrants holding individual manufacturing quotas for the class; (2) whether any ...

  14. Car painting process scheduling with harmony search algorithm

    Science.gov (United States)

    Syahputra, M. F.; Maiyasya, A.; Purnamawati, S.; Abdullah, D.; Albra, W.; Heikal, M.; Abdurrahman, A.; Khaddafi, M.

    2018-02-01

    Automotive painting programs use robots to paint the car body, improving the efficiency of the production system. The production system becomes more efficient when the scheduling of car orders takes into account the body shape of each car to be painted. Flow shop scheduling is a scheduling model in which all jobs to be processed flow in the same product direction/path. Scheduling problems arise when there are n jobs to be processed on the machines: it must be specified which job is done first and how jobs are allocated to the machines to obtain a feasible production schedule. The Harmony Search Algorithm is a music-inspired metaheuristic optimization algorithm, based on the observation that musicians search for a perfect harmony; this search for musical harmony is analogous to searching for the optimum in an optimization process. Based on the tests that have been carried out, the optimal car sequence with the minimum makespan value was obtained.
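
    The record does not state how solutions are encoded, so the sketch below uses a random-key encoding, one common way to apply harmony search to permutation flow shop sequencing; the job data and the HMS/HMCR/PAR values are illustrative.

```python
import random

# Toy flow shop: proc[j][m] = processing time of job j on machine m (invented data).
proc = [[4, 3, 6], [2, 5, 3], [5, 2, 4], [3, 6, 2], [6, 4, 5]]
N_JOBS, N_MACH = len(proc), len(proc[0])

def makespan(order):
    finish = [0] * N_MACH
    for j in order:
        for m in range(N_MACH):
            start = max(finish[m], finish[m - 1] if m else 0)
            finish[m] = start + proc[j][m]
    return finish[-1]

def decode(keys):
    # Random-key decoding: sort job indices by key value to get a permutation.
    return sorted(range(N_JOBS), key=lambda j: keys[j])

HMS, HMCR, PAR, ITERS = 10, 0.9, 0.3, 2000
random.seed(1)
memory = [[random.random() for _ in range(N_JOBS)] for _ in range(HMS)]

for _ in range(ITERS):
    new = []
    for j in range(N_JOBS):
        if random.random() < HMCR:                 # pick a value from harmony memory
            v = random.choice(memory)[j]
            if random.random() < PAR:              # pitch adjustment
                v = min(1.0, max(0.0, v + random.uniform(-0.1, 0.1)))
        else:                                      # random improvisation
            v = random.random()
        new.append(v)
    worst = max(memory, key=lambda h: makespan(decode(h)))
    if makespan(decode(new)) < makespan(decode(worst)):
        memory[memory.index(worst)] = new          # replace the worst harmony

best = min(memory, key=lambda h: makespan(decode(h)))
print(decode(best), makespan(decode(best)))
```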

  15. Bi-Objective Flexible Job-Shop Scheduling Problem Considering Energy Consumption under Stochastic Processing Times.

    Science.gov (United States)

    Yang, Xin; Zeng, Zhenxiang; Wang, Ruidong; Sun, Xueshan

    2016-01-01

    This paper presents a novel method for the optimization of the bi-objective Flexible Job-shop Scheduling Problem (FJSP) under stochastic processing times. The robust counterpart model and the Non-dominated Sorting Genetic Algorithm II (NSGA-II) are used to solve the bi-objective FJSP, considering the completion time and the total energy consumption under stochastic processing times. A case study on GM Corporation verifies that the NSGA-II used in this paper is effective and has advantages in solving the proposed model compared with HPSO and PSO+SA. The idea and method of the paper can be generalized widely in the manufacturing industry, because they can reduce the energy consumption of energy-intensive manufacturing enterprises with little additional investment when the new approach is applied to existing systems.

  16. Multimodal processes scheduling in mesh-like network environment

    Directory of Open Access Journals (Sweden)

    Bocewicz Grzegorz

    2015-06-01

    Multimodal process planning and scheduling play a pivotal role in many different domains including city networks, multimodal transportation systems, computer and telecommunication networks, and so on. A multimodal process can be seen as a process partially processed by locally executed cyclic processes. In that context, the concept of a Mesh-like Multimodal Transportation Network (MMTN), in which several isomorphic subnetworks interact with each other via distinguished subsets of common shared intermodal transport interchange facilities (such as a railway station, bus station or bus/tram stop) so as to provide a variety of demand-responsive passenger transportation services, is examined. Consider a mesh-like layout of a passenger transport network equipped with different lines including buses, trams, metro, trains, etc., where passenger flows are treated as multimodal processes. The goal is to provide a declarative model enabling one to state a constraint satisfaction problem aimed at scheduling multimodal transportation processes encompassing passenger flow itineraries. The main objective is then to provide conditions guaranteeing solvability of particular transport line scheduling, i.e. guaranteeing the right match-up of locally acting cyclic bus, tram, metro and train schedules to given passenger flow itineraries.

  17. Surgical scheduling: a lean approach to process improvement.

    Science.gov (United States)

    Simon, Ross William; Canacari, Elena G

    2014-01-01

    A large teaching hospital in the northeast United States had an inefficient, paper-based process for scheduling orthopedic surgery that caused delays and contributed to site/side discrepancies. The hospital's leaders formed a team with the goals of developing a safe, effective, patient-centered, timely, efficient, and accurate orthopedic scheduling process; smoothing the schedule so that block time was allocated more evenly; and ensuring correct site/side. Under the resulting process, real-time patient information is entered into a database during the patient's preoperative visit in the surgeon's office. The team found the new process reduced the occurrence of site/side discrepancies to zero, reduced instances of changing the sequence of orthopedic procedures by 70%, and increased patient satisfaction. Copyright © 2014 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  18. Fractional Programming for Communication Systems—Part II: Uplink Scheduling via Matching

    Science.gov (United States)

    Shen, Kaiming; Yu, Wei

    2018-05-01

    This two-part paper develops novel methodologies for using fractional programming (FP) techniques to design and optimize communication systems. Part I of this paper proposes a new quadratic transform for FP and treats its application for continuous optimization problems. In this Part II of the paper, we study discrete problems, such as those involving user scheduling, which are considerably more difficult to solve. Unlike the continuous problems, discrete or mixed discrete-continuous problems normally cannot be recast as convex problems. In contrast to the common heuristic of relaxing the discrete variables, this work reformulates the original problem in an FP form amenable to distributed combinatorial optimization. The paper illustrates this methodology by tackling the important and challenging problem of uplink coordinated multi-cell user scheduling in wireless cellular systems. Uplink scheduling is more challenging than downlink scheduling, because uplink user scheduling decisions significantly affect the interference pattern in nearby cells. Further, the discrete scheduling variable needs to be optimized jointly with continuous variables such as transmit power levels and beamformers. The main idea of the proposed FP approach is to decouple the interaction among the interfering links, thereby permitting a distributed and joint optimization of the discrete and continuous variables with provable convergence. The paper shows that the well-known weighted minimum mean-square-error (WMMSE) algorithm can also be derived from a particular use of FP; but our proposed FP-based method significantly outperforms WMMSE when discrete user scheduling variables are involved, both in terms of run-time efficiency and optimization results.
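
    For context, the quadratic transform from Part I is usually stated as follows (generic notation, not taken verbatim from the paper): a sum-of-ratios maximization is rewritten with auxiliary variables y_n so that, for fixed y_n, the problem is concave in the design variable x, and the two forms share the same optimum when x and the y_n are optimized alternately.

```latex
% Quadratic transform (generic statement; A_n(x) >= 0, B_n(x) > 0):
\max_{x}\ \sum_{n} \frac{A_n(x)}{B_n(x)}
\quad\longleftrightarrow\quad
\max_{x,\{y_n\}}\ \sum_{n} \left( 2 y_n \sqrt{A_n(x)} - y_n^{2} B_n(x) \right),
\qquad
y_n^{\star} = \frac{\sqrt{A_n(x)}}{B_n(x)} .
```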

  19. Study on multi-objective flexible job-shop scheduling problem considering energy consumption

    Directory of Open Access Journals (Sweden)

    Zengqiang Jiang

    2014-06-01

    Purpose: Build a multi-objective Flexible Job-shop Scheduling Problem (FJSP) optimization model in which the makespan, processing cost, energy consumption and cost-weighted processing quality are considered, then design a Modified Non-dominated Sorting Genetic Algorithm (NSGA-II) based on blood variation for the above scheduling model. Design/methodology/approach: A multi-objective optimization theory based on the Pareto optimal method is used in carrying out the optimization model. NSGA-II is used to solve the model. Findings: By analyzing the research status and shortcomings of multi-objective FJSP, we find that differences in scheduling also affect energy consumption in the machining process and environmental emissions. Therefore, job-shop scheduling requires not only guaranteeing the processing quality, time and cost, but also optimizing the operation plan of machines and minimizing energy consumption. Originality/value: A multi-objective FJSP optimization model is put forward, in which the makespan, processing cost, energy consumption and cost-weighted processing quality are considered. Based on the above model, a Blood-Variation-based NSGA-II (BVNSGA-II) is designed, in which the chromosome mutation rate is determined after calculating the blood relationship between two crossed chromosomes, the crossover and mutation strategy of NSGA-II is optimized, and the prematurity of the population is overcome. Finally, the performance of the proposed model and algorithm is evaluated through a case study, and the results prove the efficiency and feasibility of the proposed model and algorithm.

  20. Economic Benefit from Progressive Integration of Scheduling and Control for Continuous Chemical Processes

    Directory of Open Access Journals (Sweden)

    Logan D. R. Beal

    2017-12-01

    Performance of integrated production scheduling and advanced process control with disturbances is summarized and reviewed with four progressive stages of scheduling and control integration and responsiveness to disturbances: open-loop segregated scheduling and control, closed-loop segregated scheduling and control, open-loop scheduling with consideration of process dynamics, and closed-loop integrated scheduling and control responsive to process disturbances and market fluctuations. Progressive economic benefit from dynamic rescheduling and integrating scheduling and control is shown on a continuously stirred tank reactor (CSTR) benchmark application in closed-loop simulations over 24 h. A fixed horizon integrated scheduling and control formulation for multi-product, continuous chemical processes is utilized, in which nonlinear model predictive control (NMPC) and continuous-time scheduling are combined.

  1. Multi-core processing and scheduling performance in CMS

    International Nuclear Information System (INIS)

    Hernández, J M; Evans, D; Foulkes, S

    2012-01-01

    Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to effectively utilize the multi-core architecture. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in a much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model for computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs to have control over a larger quantum of resources, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues compared to the standard single-core processing workflows.

  2. Process simulations for the LCLS-II cryogenic systems

    Science.gov (United States)

    Ravindranath, V.; Bai, H.; Heloin, V.; Fauve, E.; Pflueckhahn, D.; Peterson, T.; Arenius, D.; Bevins, M.; Scanlon, C.; Than, R.; Hays, G.; Ross, M.

    2017-12-01

    Linac Coherent Light Source II (LCLS-II), a 4 GeV continuous-wave (CW) superconducting electron linear accelerator, is to be constructed in the existing two-mile Linac facility at the SLAC National Accelerator Laboratory. First light from the new facility is scheduled for 2020. The LCLS-II Linac consists of thirty-five 1.3 GHz and two 3.9 GHz superconducting cryomodules. The Linac cryomodules require cryogenic cooling for the superconducting niobium cavities at 2.0 K, a low temperature thermal intercept at 5.5-7.5 K, and a thermal shield at 35-55 K. The equivalent 4.5 K refrigeration capacity needed for Linac operations ranges from a minimum of 11 kW to a maximum of 24 kW. Two cryogenic plants with 18 kW of equivalent 4.5 K refrigeration capacity will be used to support the Linac cryogenic cooling requirements. The cryogenic plants are based on Jefferson Lab’s CHL-II cryogenic plant design, which uses the “Floating Pressure” approach to support a wide variation in the cooling load. In this paper, the cryogenic process for the integrated LCLS-II cryogenic system is described, along with process simulations for a 4.5 K cryoplant in combination with a 2 K cold compressor box and the Linac cryomodules.

  3. Schedules of Controlled Substances: Placement of FDA-Approved Products of Oral Solutions Containing Dronabinol [(-)-delta-9-transtetrahydrocannabinol (delta-9-THC)] in Schedule II. Interim final rule, with request for comments.

    Science.gov (United States)

    2017-03-23

    On July 1, 2016, the U.S. Food and Drug Administration (FDA) approved a new drug application for Syndros, a drug product consisting of dronabinol [(-)-delta-9-trans-tetrahydrocannabinol (delta-9-THC)] oral solution. Thereafter, the Department of Health and Human Services (HHS) provided the Drug Enforcement Administration (DEA) with a scheduling recommendation that would result in Syndros (and other oral solutions containing dronabinol) being placed in schedule II of the Controlled Substances Act (CSA). In accordance with the CSA, as revised by the Improving Regulatory Transparency for New Medical Therapies Act, DEA is hereby issuing an interim final rule placing FDA-approved products of oral solutions containing dronabinol in schedule II of the CSA.

  4. Designing scheduling concept and computer support in the food processing industries

    NARCIS (Netherlands)

    van Donk, DP; van Wezel, W; Gaalman, G; Bititci, US; Carrie, AS

    1998-01-01

    Food processing industries cope with a specific production process and a dynamic market. Scheduling the production process is thus important in being competitive. This paper proposes a hierarchical concept for structuring the scheduling and describes the (computer) support needed for this concept.

  5. Application of coupled symbolic and numeric processing to an advanced scheduling system for plant construction

    International Nuclear Information System (INIS)

    Kobayashi, Yasuhiro; Takamoto, Masanori; Nonaka, Hisanori; Yamada, Naoyuki

    1994-01-01

    A scheduling system has been developed by integrating symbolic processing functions for constraint handling and modification guidance with numeric processing functions for schedule optimization and evaluation. The system is composed of an automatic schedule generation module, an interactive schedule revision module and a schedule evaluation module. The goal of the problem solving is the flattening of the daily resource requirements throughout the scheduling period. The automatic schedule generation module optimizes the initial schedule according to the formulatable portion of the requirement description specified in a predicate-like language. A planning engineer refines the near-goal schedule through a knowledge-based interactive optimization process to obtain the goal schedule which fully covers the requirement description, using the interactive schedule revision module and the schedule evaluation module. A scheduling system has been implemented on the basis of the proposed problem-solving framework and experimentally applied to real-world-sized scheduling problems for plant construction. Using results from the overall plant construction scheduling, a section-schedule optimization process is described with emphasis on the symbolic processing functions. (author)

  6. International Literature Review on WHODAS II (World Health Organization Disability Assessment Schedule II

    Directory of Open Access Journals (Sweden)

    Federici, Stefano

    2009-06-01

    This review is a critical analysis of the study and utilization of the World Health Organization Disability Assessment Schedule II (WHODAS II), carried out by establishing specific criteria for evaluating the relevant international scientific literature. The WHODAS II is an instrument developed by the World Health Organisation in order to assess behavioural limitations and restrictions related to an individual's participation, independent of a medical diagnosis. This instrument was developed by the WHO's Assessment, Classification and Epidemiology Group within the framework of the WHO/NIH Joint Project on Assessment and Classification of Disablements. The aim is to ascertain the international dissemination level of the WHODAS II's utilization and, at the same time, to analyse the studies regarding the psychometric validation of the WHODAS II translation and adaptation in other languages and geographical contexts. In particular, our goal is to highlight which psychometric features have been investigated, focusing on the factorial structure, the reliability, and the validity of this instrument. International literature was searched through the main databases of indexed scientific production: the Cambridge Scientific Abstracts (CSA), PubMed, and Google Scholar, from 1990 through to December 2008. The following search terms were used: "whodas", in the field query, plus "title" and "abstract". The WHODAS II has been used in 54 studies, of which 51 are articles published in international journals, 2 are conference abstracts, and one is a dissertation abstract. Nevertheless, only 7 articles are published in journals and conference proceedings regarding disability and rehabilitation. Others have been published in medical and psychiatric journals, with the aim of identifying comorbidity correlations in clinical diagnosis concerning patients with mental illness. Just 8 out of 51 articles have studied the psychometric properties of the WHODAS II. The ...

  7. Job schedulers for Big data processing in Hadoop environment: testing real-life schedulers using benchmark programs

    Directory of Open Access Journals (Sweden)

    Mohd Usama

    2017-11-01

    At present, big data is very popular because it has proved to be highly successful in many fields such as social media, e-commerce transactions, etc. Big data describes the tools and technologies needed to capture, manage, store, distribute, and analyze petabyte or larger-sized datasets with different structures at high speed. Big data can be structured, unstructured, or semi-structured. Hadoop is an open source framework that is used to process large amounts of data in an inexpensive and efficient way, and job scheduling is a key factor for achieving high performance in big data processing. This paper gives an overview of big data and highlights the problems and challenges in big data. It then highlights the Hadoop Distributed File System (HDFS), Hadoop MapReduce, and various parameters that affect the performance of job scheduling algorithms in big data such as Job Tracker, Task Tracker, Name Node, Data Node, etc. The primary purpose of this paper is to present a comparative study of job scheduling algorithms along with their experimental results in a Hadoop environment. In addition, this paper describes the advantages, disadvantages, features, and drawbacks of various Hadoop job schedulers such as FIFO, Fair, Capacity, Deadline Constraints, Delay, LATE, Resource Aware, etc., and provides a comparative study among these schedulers.
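
    As a toy illustration of why the choice of scheduler matters (this is not Hadoop code and does not use any Hadoop API), the sketch below contrasts strict FIFO ordering with a simple per-user fair-share pick, which is the intuition behind the Fair scheduler.

```python
from collections import deque

# Each job: (user, job_id, tasks). Invented workload, not tied to any Hadoop API.
jobs = [("alice", "A1", 4), ("alice", "A2", 4), ("bob", "B1", 2), ("bob", "B2", 2)]

def fifo(jobs):
    # Run jobs strictly in submission order.
    return [jid for _, jid, _ in jobs]

def fair_share(jobs):
    # Round-robin across users so a long queue from one user
    # does not starve another user's short jobs.
    queues = {}
    for user, jid, _ in jobs:
        queues.setdefault(user, deque()).append(jid)
    order, users = [], deque(queues)
    while users:
        u = users.popleft()
        order.append(queues[u].popleft())
        if queues[u]:
            users.append(u)
    return order

print("FIFO order:", fifo(jobs))        # ['A1', 'A2', 'B1', 'B2']
print("Fair order:", fair_share(jobs))  # ['A1', 'B1', 'A2', 'B2']
```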

  8. Intelligence amplification framework for enhancing scheduling processes

    NARCIS (Netherlands)

    Dobrkovic, Andrej; Liu, Luyao; Iacob, Maria Eugenia; van Hillegersberg, Jos

    2016-01-01

    The scheduling process in a typical business environment consists of predominantly repetitive tasks that have to be completed in limited time and often contain some form of uncertainty. Intelligence amplification is a symbiotic relationship between a human and an intelligent agent. This ...

  9. Combined Noncyclic Scheduling and Advanced Control for Continuous Chemical Processes

    Directory of Open Access Journals (Sweden)

    Damon Petersen

    2017-12-01

    A novel formulation for combined scheduling and control of multi-product, continuous chemical processes is introduced in which nonlinear model predictive control (NMPC) and noncyclic continuous-time scheduling are efficiently combined. A decomposition into nonlinear programming (NLP) dynamic optimization problems and mixed-integer linear programming (MILP) problems, without iterative alternation, allows for computationally light solution. An iterative method is introduced to determine the number of production slots for a noncyclic schedule during a prediction horizon. A filter method is introduced to reduce the number of MILP problems required. The formulation’s closed-loop performance with both process disturbances and updated market conditions is demonstrated through multiple scenarios on a benchmark continuously stirred tank reactor (CSTR) application with fluctuations in market demand and price for multiple products. Economic performance surpasses cyclic scheduling in all scenarios presented. Computational performance is sufficiently light to enable online operation in a dual-loop feedback structure.

  10. ICPP calcined solids storage facility closure study. Volume II: Cost estimates, planning schedules, yearly cost flowcharts, and life-cycle cost estimates

    International Nuclear Information System (INIS)

    1998-02-01

    This document contains Volume II of the Closure Study for the Idaho Chemical Processing Plant Calcined Solids Storage Facility. This volume contains draft information on cost estimates, planning schedules, yearly cost flowcharts, and life-cycle costs for the four options described in Volume I: (1) Risk-Based Clean Closure; NRC Class C fill, (2) Risk-Based Clean Closure; Clean fill, (3) Closure to Landfill Standards; NRC Class C fill, and (4) Closure to Landfill Standards; Clean fill

  11. ICPP calcined solids storage facility closure study. Volume II: Cost estimates, planning schedules, yearly cost flowcharts, and life-cycle cost estimates

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-01

    This document contains Volume II of the Closure Study for the Idaho Chemical Processing Plant Calcined Solids Storage Facility. This volume contains draft information on cost estimates, planning schedules, yearly cost flowcharts, and life-cycle costs for the four options described in Volume I: (1) Risk-Based Clean Closure; NRC Class C fill, (2) Risk-Based Clean Closure; Clean fill, (3) Closure to Landfill Standards; NRC Class C fill, and (4) Closure to Landfill Standards; Clean fill.

  12. Ground Processing Optimization Using Artificial Intelligence Techniques, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The ultimate goal is the automation of a large amount of KSC's planning, scheduling, and execution decision making. Phase II will result in a complete full-scale...

  13. Design and development of cell queuing, processing, and scheduling modules for the iPOINT input-buffered ATM testbed

    Science.gov (United States)

    Duan, Haoran

    1997-12-01

    This dissertation presents the concepts, principles, performance, and implementation of input queuing and cell-scheduling modules for the Illinois Pulsar-based Optical INTerconnect (iPOINT) input-buffered Asynchronous Transfer Mode (ATM) testbed. Input queuing (IQ) ATM switches are well suited to meet the requirements of current and future ultra-broadband ATM networks. The IQ structure imposes minimum memory bandwidth requirements for cell buffering, tolerates bursty traffic, and utilizes memory efficiently for multicast traffic. The lack of efficient cell queuing and scheduling solutions has been a major barrier to building high-performance, scalable IQ-based ATM switches. This dissertation proposes a new Three-Dimensional Queue (3DQ) and a novel Matrix Unit Cell Scheduler (MUCS) to remove this barrier. 3DQ uses a linked-list architecture based on Synchronous Random Access Memory (SRAM) to combine the individual advantages of per-virtual-circuit (per-VC) queuing, priority queuing, and N-destination queuing. It avoids Head of Line (HOL) blocking and provides per-VC Quality of Service (QoS) enforcement mechanisms. Computer simulation results verify the QoS capabilities of 3DQ. For multicast traffic, 3DQ provides efficient usage of cell buffering memory by storing multicast cells only once. Further, the multicast mechanism of 3DQ prevents a congested destination port from blocking other less-loaded ports. The 3DQ principle has been prototyped in the Illinois Input Queue (iiQueue) module. Using Field Programmable Gate Array (FPGA) devices and SRAM modules, integrated on a Printed Circuit Board (PCB), iiQueue can process incoming traffic at 800 Mb/s. Using faster circuit technology, the same design is expected to operate at the OC-48 rate (2.5 Gb/s). MUCS resolves output contention by evaluating the weight index of each candidate and selecting the heaviest. It achieves near-optimal scheduling and has a very short response time. The algorithm originates from a ...

  14. Improved Low Power FPGA Binding of Datapaths from Data Flow Graphs with NSGA II -based Schedule Selection

    Directory of Open Access Journals (Sweden)

    BHUVANESWARI, M. C.

    2013-11-01

    FPGAs are increasingly being used to implement datapath-intensive algorithms for signal processing and image processing applications. In High Level Synthesis of Data Flow Graphs targeted at FPGAs, the effect of interconnect resources such as multiplexers must be considered since they contribute significantly to the area and switching power. We propose a binding framework for behavioral synthesis of Data Flow Graphs (DFGs) onto FPGA targets with power reduction as the main criterion. The technique uses a multi-objective GA, NSGA-II, for design space exploration to identify schedules that have the potential to yield low-power bindings from a population of non-dominated solutions. A greedy constructive binding technique reported in the literature is adapted for interconnect minimization. The binding is further subjected to a perturbation process by altering the register and multiplexer assignments. Results obtained on standard DFG benchmarks indicate that our technique yields better power-aware bindings than the constructive binding approach with little or no area overhead.

  15. Scheduling algorithms for automatic control systems for technological processes

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

    The wide use of automatic process control systems and of high-performance systems containing a number of computers (processors) creates opportunities for high-quality and fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation, and processing of big data arrays all require a high level of productivity and, at the same time, minimum time for data handling and delivery of results. In order to achieve the best times, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. Some basic task scheduling methods for multi-machine process control systems are considered in this paper, their advantages and disadvantages are brought to light, and some considerations for their use when developing software for automatic process control systems are given.
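
    The record does not list the specific methods surveyed, but a classic example of a basic multiprocessor scheduling heuristic of the kind discussed is longest-processing-time-first (LPT) list scheduling, sketched below with invented task times.

```python
import heapq

def lpt_schedule(task_times, n_procs):
    """Assign tasks to processors, longest first, always to the least-loaded one."""
    loads = [(0.0, p) for p in range(n_procs)]   # (current load, processor id)
    heapq.heapify(loads)
    assignment = {p: [] for p in range(n_procs)}
    for t, dur in sorted(enumerate(task_times), key=lambda x: -x[1]):
        load, p = heapq.heappop(loads)
        assignment[p].append(t)
        heapq.heappush(loads, (load + dur, p))
    makespan = max(load for load, _ in loads)
    return assignment, makespan

tasks = [7, 5, 4, 4, 3, 3, 2]          # invented processing times
print(lpt_schedule(tasks, n_procs=3))  # near-optimal makespan for this toy instance
```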

  16. SPANR planning and scheduling

    Science.gov (United States)

    Freund, Richard F.; Braun, Tracy D.; Kussow, Matthew; Godfrey, Michael; Koyama, Terry

    2001-07-01

    SPANR (Schedule, Plan, Assess Networked Resources) is (i) a pre-run, off-line planning and (ii) a runtime, just-in-time scheduling mechanism. It is designed to support primarily commercial applications in that it optimizes throughput rather than individual jobs (unless they have highest priority). Thus it is a tool for a commercial production manager to maximize total work. First the SPANR Planner is presented showing the ability to do predictive 'what-if' planning. It can answer such questions as, (i) what is the overall effect of acquiring new hardware or (ii) what would be the effect of a different scheduler. The ability of the SPANR Planner to formulate in advance tree-trimming strategies is useful in several commercial applications, such as electronic design or pharmaceutical simulations. The SPANR Planner is demonstrated using a variety of benchmarks. The SPANR Runtime Scheduler (RS) is briefly presented. The SPANR RS can provide benefit for several commercial applications, such as airframe design and financial applications. Finally a design is shown whereby SPANR can provide scheduling advice to most resource management systems.

  17. 78 FR 21818 - Schedules of Controlled Substances: Placement of Methylone Into Schedule I

    Science.gov (United States)

    2013-04-12

    ..., methamphetamine, and MDMA, Schedule I and II substances. These effects included elevated body temperature ... of reuptake of monoamines, and in vivo studies (microdialysis, locomotor activity, body temperature) ... Yet another commenter claimed that Schedule I placement would "cripple efforts at learning," make it ...

  18. Susceptibility of optimal train schedules to stochastic disturbances of process times

    DEFF Research Database (Denmark)

    Larsen, Rune; Pranzo, Marco; D’Ariano, Andrea

    2013-01-01

    ... study, an advanced branch and bound algorithm, on average, outperforms a First In First Out scheduling rule both in deterministic and stochastic traffic scenarios. However, the characteristics of the stochastic processes and the way a stochastic instance is handled turn out to have a serious impact ... (running and dwell times). In fact, the objective of railway traffic management is to reduce delay propagation and to increase disturbance robustness of train schedules at a network scale. We present a quantitative study of traffic disturbances and their effects on the schedules computed by simple and advanced ...

  19. Plant process computer replacements - techniques to limit installation schedules and costs

    International Nuclear Information System (INIS)

    Baker, M.D.; Olson, J.L.

    1992-01-01

    Plant process computer systems, a standard fixture in all nuclear power plants, are used to monitor and display important plant process parameters. Scanning thousands of field sensors and alarming out-of-limit values, these computer systems are heavily relied on by control room operators. The original nuclear steam supply system (NSSS) vendor for the power plant often supplied the plant process computer. Designed using sixties and seventies technology, a plant's original process computer has been obsolete for some time. Driven by increased maintenance costs and new US Nuclear Regulatory Commission regulations such as NUREG-0737, Suppl. 1, many utilities have replaced their process computers with more modern computer systems. Given that computer systems are by their nature prone to rapid obsolescence, this replacement cycle will likely repeat. A process computer replacement project can be a significant capital expenditure and must be performed during a scheduled refueling outage. The objective of the installation process is to install a working system on schedule. Experience gained by supervising several computer replacement installations has taught lessons that, if applied, will shorten the schedule and limit the risk of costly delays. Examples illustrating this technique are given. This paper and these examples deal only with the installation process and assume that the replacement computer system has been adequately designed, developed and factory tested.

  20. Scheduling of Conditional Process Graphs for the Synthesis of Embedded Systems

    DEFF Research Database (Denmark)

    Eles, Petru; Kuchcinski, Krzysztof; Peng, Zebo

    1998-01-01

    We present an approach to process scheduling based on an abstract graph representation which captures both dataflow and the flow of control. Target architectures consist of several processors, ASICs and shared busses. We have developed a heuristic which generates a schedule table so that the worst-case delay is minimized. Several experiments demonstrate the efficiency of the approach.

  1. Step-by-step cyclic processes scheduling

    DEFF Research Database (Denmark)

    Bocewicz, G.; Nielsen, Izabela Ewa; Banaszak, Z.

    2013-01-01

    Automated Guided Vehicles (AGVs) fleet scheduling is one of the big problems in Flexible Manufacturing System (FMS) control. The problem is more complicated when concurrent multi-product manufacturing and resource deadlock avoidance policies are considered. The objective of the research is to provide a declarative model enabling one to state a constraint satisfaction problem aimed at AGVs fleet scheduling subject to assumed itineraries of concurrently manufactured product types. In other words, assuming a given layout of the FMS's material handling and production routes of simultaneously manufactured orders, the main objective is to provide a declarative framework aimed at conditions allowing one to calculate the AGVs fleet schedule in online mode. An illustrative example of the relevant algebra-like driven step-by-step cyclic scheduling is provided.

  2. An Improved Multiobjective PSO for the Scheduling Problem of Panel Block Construction

    Directory of Open Access Journals (Sweden)

    Zhi Yang

    2016-01-01

    Uncertainty is common in ship construction. However, few studies have focused on scheduling problems under uncertainty in shipbuilding. This paper formulates the scheduling problem of panel block construction as a multiobjective fuzzy flow shop scheduling problem (FSSP) with a fuzzy processing time, a fuzzy due date, and the just-in-time (JIT) concept. An improved multiobjective particle swarm optimization called MOPSO-M is developed to solve the scheduling problem. MOPSO-M utilizes a ranked-order-value rule to convert the continuous position of particles into the discrete permutations of jobs, and an available mapping is employed to obtain the precedence-based permutation of the jobs. In addition, to improve the performance of MOPSO-M, archive maintenance is combined with global best position selection, and mutation and a velocity constriction mechanism are introduced into the algorithm. The feasibility and effectiveness of MOPSO-M are assessed in comparison with general MOPSO and the nondominated sorting genetic algorithm-II (NSGA-II).
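
    The ranked-order-value (ROV) rule mentioned above maps a particle's continuous position vector to a discrete job permutation. A minimal sketch of that mapping follows; the position values are invented and tie handling is left to Python's stable sort.

```python
def rov_permutation(position):
    """Ranked-order-value rule: the job with the smallest position value
    gets sequence rank 1, the next smallest rank 2, and so on."""
    order = sorted(range(len(position)), key=lambda j: position[j])
    ranks = [0] * len(position)
    for rank, job in enumerate(order, start=1):
        ranks[job] = rank
    return order, ranks

# A particle position in continuous space for 5 jobs (illustrative values):
pos = [0.82, 0.13, 0.45, 0.90, 0.30]
perm, ranks = rov_permutation(pos)
print(perm)   # processing sequence: [1, 4, 2, 0, 3]
print(ranks)  # rank of each job:    [4, 1, 3, 5, 2]
```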

  3. An Extended Genetic Algorithm for Distributed Integration of Fuzzy Process Planning and Scheduling

    Directory of Open Access Journals (Sweden)

    Shuai Zhang

    2016-01-01

    The distributed integration of process planning and scheduling (DIPPS) aims to simultaneously arrange the two most important manufacturing stages, process planning and scheduling, in a distributed manufacturing environment. Meanwhile, considering its advantage in representing the actual situation, the triangular fuzzy number (TFN) is adopted in DIPPS to represent machine processing and transportation times. In order to solve this problem and obtain the optimal or near-optimal solution, an extended genetic algorithm (EGA) with an innovative three-class encoding method and improved crossover and mutation strategies is proposed. Furthermore, a local enhancement strategy featuring machine replacement and order exchange is added to strengthen the local search capability of the basic genetic algorithm process. Experimental verification shows that EGA achieves satisfactory results in a very short period of time and demonstrates its powerful performance in dealing with the distributed integration of fuzzy process planning and scheduling (DIFPPS).

  4. Optimal methodology for a machining process scheduling in spot electricity markets

    International Nuclear Information System (INIS)

    Yusta, J.M.; Torres, F.; Khodr, H.M.

    2010-01-01

    Electricity spot markets have introduced hourly variations in the price of electricity. These variations allow energy efficiency to be increased by appropriately scheduling and adapting industrial production to the hourly cost of electricity, so as to obtain the maximum profit for the industry. In this article, a mathematical optimization model simulates the costs and the electricity demand of a machining process. The resulting problem is solved using the generalized reduced gradient approach to find the optimum production schedule that maximizes the industry's profit considering the hourly variations of the price of electricity in the spot market. Different price scenarios are studied to analyze the impact of spot market electricity prices on the optimal scheduling of the machining process and on the industry's profit. The benefit of applying the proposed model is shown, especially in cases of very high electricity prices.
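
    As a drastically simplified illustration of the underlying idea (the paper builds a full cost and demand model solved with the generalized reduced gradient method), the toy below simply places a fixed number of machining hours in the cheapest spot-price hours; the prices, power draw and required hours are invented.

```python
# Hourly spot prices (EUR/MWh) for one day -- illustrative numbers only.
prices = [42, 40, 38, 37, 39, 45, 60, 75, 80, 78, 70, 65,
          63, 62, 66, 72, 85, 95, 90, 76, 64, 55, 48, 44]

machine_kw = 150     # assumed average power draw of the machining process
hours_needed = 10    # assumed daily machining hours required

# Greedy: run in the cheapest hours (ignores sequencing, setups, labour shifts, ...).
chosen = sorted(range(24), key=lambda h: prices[h])[:hours_needed]
cost = sum(prices[h] for h in chosen) * machine_kw / 1000.0   # EUR/MWh -> EUR/kWh

print(sorted(chosen))
print(f"energy cost: {cost:.2f} EUR vs "
      f"{sum(prices[8:18]) * machine_kw / 1000.0:.2f} EUR for a fixed 08:00-18:00 shift")
```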

  5. Hypergraph+: An Improved Hypergraph-Based Task-Scheduling Algorithm for Massive Spatial Data Processing on Master-Slave Platforms

    Directory of Open Access Journals (Sweden)

    Bo Cheng

    2016-08-01

    Spatial data processing often requires massive datasets, and the task/data scheduling efficiency of these applications has an impact on the overall processing performance. Among the existing scheduling strategies, hypergraph-based algorithms capture the data sharing pattern in a global way and significantly reduce the total communication volume. Due to heterogeneous processing platforms, however, a single hypergraph partitioning for later scheduling may not be optimal. Moreover, these scheduling algorithms neglect the overlap between task execution and data transfer that could further decrease execution time. In order to address these problems, an extended hypergraph-based task-scheduling algorithm, named Hypergraph+, is proposed for massive spatial data processing. Hypergraph+ improves upon current hypergraph scheduling algorithms in two ways: (1) it takes platform heterogeneity into consideration, offering a metric function to evaluate the partitioning quality in order to derive the best task/file schedule; and (2) it can maximize the overlap between communication and computation. The GridSim toolkit was used to evaluate Hypergraph+ in an IDW spatial interpolation application on heterogeneous master-slave platforms. Experiments illustrate that the proposed Hypergraph+ algorithm achieves on average a 43% smaller makespan than the original hypergraph scheduling algorithm while still preserving high scheduling efficiency.

  6. The 12-item World Health Organization Disability Assessment Schedule II (WHO-DAS II: a nonparametric item response analysis

    Directory of Open Access Journals (Sweden)

    Fernandez Ana

    2010-05-01

    Background: Previous studies have analyzed the psychometric properties of the World Health Organization Disability Assessment Schedule II (WHO-DAS II) using classical omnibus measures of scale quality. These analyses are sample dependent and do not model item responses as a function of the underlying trait level. The main objective of this study was to examine the effectiveness of the WHO-DAS II items and their options in discriminating between changes in the underlying disability level by means of item response analyses. We also explored differential item functioning (DIF) in men and women. Methods: The participants were 3615 adult general practice patients from 17 regions of Spain, with a first diagnosed major depressive episode. The 12-item WHO-DAS II was administered by the general practitioners during the consultation. We used a non-parametric item response method (kernel smoothing), implemented with the TestGraf software, to examine the effectiveness of each item (item characteristic curves) and their options (option characteristic curves) in discriminating between changes in the underlying disability level. We examined composite DIF to determine whether women had a higher probability than men of endorsing each item. Results: Item response analyses indicated that the twelve items forming the WHO-DAS II perform very well. All items were determined to provide good discrimination across varying standardized levels of the trait. The items also had option characteristic curves that showed good discrimination, given that each increasing option became more likely than the previous one as a function of increasing trait level. No gender-related DIF was found on any of the items. Conclusions: All WHO-DAS II items were very good at assessing overall disability. Our results supported the appropriateness of the weights assigned to response option categories and showed an absence of gender differences in item functioning.

  7. Two-Agent Single-Machine Scheduling of Jobs with Time-Dependent Processing Times and Ready Times

    Directory of Open Access Journals (Sweden)

    Jan-Yee Kung

    2013-01-01

    Scheduling involving jobs with time-dependent processing times has recently attracted much research attention. However, multiagent scheduling with simultaneous consideration of jobs with time-dependent processing times and ready times is relatively unexplored. Inspired by this observation, we study a two-agent single-machine scheduling problem in which the jobs have both time-dependent processing times and ready times. We consider the model in which the actual processing time of a job of the first agent is a decreasing function of its scheduled position while the actual processing time of a job of the second agent is an increasing function of its scheduled position. In addition, each job has a different ready time. The objective is to minimize the total completion time of the jobs of the first agent with the restriction that no tardy job is allowed for the second agent. We propose a branch-and-bound algorithm and several genetic algorithms to obtain optimal and near-optimal solutions for the problem, respectively. We also conduct extensive computational experiments to test the proposed algorithms and examine the impacts of different problem parameters on their performance.
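
    The record does not give the exact positional dependence used, but a common form in this literature, shown here purely as an illustrative assumption, is a power function of the scheduled position k, decreasing for the first agent's jobs and increasing for the second agent's, with the objective stated in the abstract:

```latex
% Illustrative assumption, not necessarily the paper's exact model:
% position-dependent actual processing times for a job j of agent A or B
% scheduled in position k, with ready times r_j, start times S_j and due dates d_j.
p^{A}_{j,k} = p^{A}_{j}\, k^{a}, \quad a < 0 \quad \text{(agent A: decreasing in position)}, \qquad
p^{B}_{j,k} = p^{B}_{j}\, k^{b}, \quad b > 0 \quad \text{(agent B: increasing in position)},
\qquad
\min \sum_{j \in A} C_j \ \ \text{s.t.}\ \ C_j \le d_j \ \forall j \in B, \quad S_j \ge r_j \ \forall j .
```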

  8. Concurrent processes scheduling with scarce resources in small and medium enterprises

    Institute of Scientific and Technical Information of China (English)

    马嵩华

    2016-01-01

    Scarce resources, precedence and non-determined time-lags are three constraints commonly found in small and medium manufacturing enterprises (SMEs), which are deemed to block the application of workflow management systems (WfMS). To tackle this problem, a workflow scheduling approach is proposed based on a timing workflow net (TWF-net) and a genetic algorithm (GA). The workflow is modelled in the form of a TWF-net in favour of process simulation and resource conflict checking. After simplifying and reconstructing the set of workflow instances, the conflict resolution problem is transformed into a resource-constrained project scheduling problem (RCPSP), which can be efficiently solved by a heuristic method such as a GA. Finally, problems of various sizes are used to test the performance of the proposed algorithm and to compare it with a first-come-first-served (FCFS) strategy. The evaluation demonstrates that the proposed method is an effective approach for scheduling concurrent processes with precedence and resource constraints.

  9. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    Science.gov (United States)

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system based on intuitionistic fuzzy sets for a process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm takes as input the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of all the processes in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
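
    A minimal sketch of the dispatch loop described above is given below. A simple crisp weighting of nice value and burst time stands in for the paper's intuitionistic fuzzy inference engine, so the dp formula and weights are assumptions made only for illustration.

```python
# Hedged sketch of the dispatch loop: a crisp weighted priority stands in for the
# paper's intuitionistic fuzzy inference engine; weights and scaling are assumptions.

def dynamic_priority(nice, burst, w_nice=0.6, w_burst=0.4):
    # Lower nice value and shorter burst -> higher dynamic priority (both scaled to [0, 1]).
    return w_nice * (1 - nice / 40.0) + w_burst * (1 - min(burst, 100) / 100.0)

def dispatch_order(ready_queue):
    """ready_queue: list of (pid, nice, burst). Returns pids in dispatch order."""
    ranked = sorted(ready_queue,
                    key=lambda p: dynamic_priority(p[1], p[2]),
                    reverse=True)                    # highest dp goes to the CPU first
    return [pid for pid, _, _ in ranked]

print(dispatch_order([(1, 10, 80), (2, 0, 30), (3, 19, 5)]))   # -> [2, 3, 1]
```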

  10. Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes

    Science.gov (United States)

    Cropper, A. E.; Wang, Z.

    1995-08-01

    Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed, with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance, the latter being a result of the traceability made possible through a process involving mixing and splitting of material.

  11. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

    The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities and participation in society). The main purpose of this paper is the evaluation of the psychometric properties of each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.

  12. Integrated project scheduling and staff assignment with controllable processing times.

    Science.gov (United States)

    Fernandez-Viagas, Victor; Framinan, Jose M

    2014-01-01

    This paper addresses a decision problem related to simultaneously scheduling the tasks in a project and assigning the staff to these tasks, taking into account that a task can be performed only by employees with certain skills, and that the length of each task depends on the number of employees assigned. This type of problem usually appears in service companies, where task scheduling and staff assignment are closely related. An integer programming model for the problem is proposed, together with some extensions to cope with different situations. Additionally, the advantages of the controllable processing times approach are compared with those of fixed processing times. Due to the complexity of the integrated model, a simple GRASP algorithm is implemented in order to obtain good, approximate solutions in short computation times.

  13. Integrated Project Scheduling and Staff Assignment with Controllable Processing Times

    Directory of Open Access Journals (Sweden)

    Victor Fernandez-Viagas

    2014-01-01

    Full Text Available This paper addresses a decision problem related to simultaneously scheduling the tasks in a project and assigning the staff to these tasks, taking into account that a task can be performed only by employees with certain skills, and that the length of each task depends on the number of employees assigned. This type of problem usually appears in service companies, where task scheduling and staff assignment are closely related. An integer programming model for the problem is proposed, together with some extensions to cope with different situations. Additionally, the advantages of the controllable processing times approach are compared with those of fixed processing times. Due to the complexity of the integrated model, a simple GRASP algorithm is implemented in order to obtain good, approximate solutions in short computation times.

  14. QoS Differentiated and Fair Packet Scheduling in Broadband Wireless Access Networks

    Directory of Open Access Journals (Sweden)

    Zhang Yan

    2009-01-01

    Full Text Available This paper studies the packet scheduling problem in Broadband Wireless Access (BWA) networks. The key difficulties of the BWA scheduling problem lie in the high variability of wireless channel capacity and the unknown model of the packet arrival process. It is difficult for traditional heuristic scheduling algorithms to handle this situation and guarantee satisfying performance in BWA networks. In this paper, we introduce a learning-based approach for a better solution. Specifically, we formulate the packet scheduling problem as an average-cost Semi-Markov Decision Process (SMDP). Then, we solve the SMDP by using reinforcement learning. A feature-based linear approximation and the Temporal-Difference learning technique are employed to produce a near-optimal solution of the corresponding SMDP problem. The proposed algorithm, called Reinforcement Learning Scheduling (RLS), has an in-built capability of self-training. It is able to adaptively and promptly regulate its scheduling policy according to the instantaneous network conditions. Simulation results indicate that RLS outperforms two classical scheduling algorithms and simultaneously achieves: (i) effective QoS differentiation, (ii) high bandwidth utilization, and (iii) both short-term and long-term fairness.
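
    The learning core described above combines a feature-based linear value approximation with temporal-difference updates. The sketch below shows that core in a discounted TD(0) form (the paper itself uses an average-cost SMDP formulation); the feature vectors, cost signal, and step sizes are placeholders.

```python
import numpy as np

# Hedged sketch of the learning core: TD(0) with a linear value approximation.
# Features, cost signal, and step sizes are placeholders, and the discounted
# formulation here stands in for the paper's average-cost SMDP treatment.

class TDLinearScheduler:
    def __init__(self, n_features, alpha=0.01, gamma=0.95):
        self.w = np.zeros(n_features)
        self.alpha, self.gamma = alpha, gamma

    def value(self, features):
        return float(self.w @ features)              # approximate expected future cost

    def update(self, features, cost, next_features):
        td_error = cost + self.gamma * self.value(next_features) - self.value(features)
        self.w += self.alpha * td_error * features   # move the weights along the features

    def choose(self, candidate_features):
        # schedule the queue/packet whose post-decision state looks cheapest
        return int(np.argmin([self.value(f) for f in candidate_features]))
```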

  15. Uncertainty management by relaxation of conflicting constraints in production process scheduling

    Science.gov (United States)

    Dorn, Juergen; Slany, Wolfgang; Stary, Christian

    1992-01-01

    Mathematical-analytical methods as used in Operations Research approaches are often insufficient for scheduling problems. This is due to three reasons: the combinatorial complexity of the search space, conflicting objectives for production optimization, and the uncertainty in the production process. Knowledge-based techniques, especially approximate reasoning and constraint relaxation, are promising ways to overcome these problems. A case study from an industrial CIM environment, namely high-grade steel production, is presented to demonstrate how knowledge-based scheduling with the desired capabilities could work. By using fuzzy set theory, the applied knowledge representation technique covers the uncertainty inherent in the problem domain. Based on this knowledge representation, a classification of jobs according to their importance is defined which is then used for the straightforward generation of a schedule. A control strategy which comprises organizational, spatial, temporal, and chemical constraints is introduced. The strategy supports the dynamic relaxation of conflicting constraints in order to improve tentative schedules.

  16. FMEF Electrical single line diagram and panel schedule verification process

    International Nuclear Information System (INIS)

    Fong, S.K.

    1998-01-01

    Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process for essential electrical single-line drawings and panel schedules was established to benefit the operations lock and tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF.

  17. Downlink scheduling using non-orthogonal uplink beams

    KAUST Repository

    Eltayeb, Mohammed E.

    2014-04-01

    Opportunistic schedulers rely on the feedback of the channel state information of users in order to perform user selection and downlink scheduling. This feedback increases with the number of users, and can lead to inefficient use of network resources and scheduling delays. We tackle the problem of feedback design, and propose a novel class of nonorthogonal codes to feed back channel state information. Users with favorable channel conditions simultaneously transmit their channel state information via non-orthogonal beams to the base station. The proposed formulation allows the base station to identify the strong users via a simple correlation process. After deriving the minimum required code length and closed-form expressions for the feedback load and downlink capacity, we show that i) the proposed algorithm reduces the feedback load while matching the achievable rate of full feedback algorithms operating over a noiseless feedback channel, and ii) the proposed codes are superior to the Gaussian codes.
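
    A rough illustration of the correlation step is sketched below: users with favorable channels transmit their signatures simultaneously, and the base station identifies them by correlating the superimposed signal with every signature. The random binary codes, code length, detection threshold, and noise level are assumptions for the sketch, not the paper's code construction.

```python
import numpy as np

# Hedged sketch of the correlation step: strong users send random binary signatures
# simultaneously; the base station correlates the sum with every signature.
# Code construction, length, threshold, and noise level are illustrative assumptions.

rng = np.random.default_rng(0)
n_users, code_len = 8, 128
codes = rng.choice([-1.0, 1.0], size=(n_users, code_len)) / np.sqrt(code_len)

strong = {1, 4, 6}                                   # users with favorable channels
received = sum(codes[u] for u in strong) + 0.05 * rng.standard_normal(code_len)

correlations = codes @ received                      # one correlation per signature
detected = {u for u, c in enumerate(correlations) if c > 0.5}
print(sorted(detected))                              # ideally recovers users 1, 4 and 6
```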

  18. Downlink scheduling using non-orthogonal uplink beams

    KAUST Repository

    Eltayeb, Mohammed E.; Al-Naffouri, Tareq Y.; Bahrami, Hamid Reza Talesh

    2014-01-01

    Opportunistic schedulers rely on the feedback of the channel state information of users in order to perform user selection and downlink scheduling. This feedback increases with the number of users, and can lead to inefficient use of network resources and scheduling delays. We tackle the problem of feedback design, and propose a novel class of nonorthogonal codes to feed back channel state information. Users with favorable channel conditions simultaneously transmit their channel state information via non-orthogonal beams to the base station. The proposed formulation allows the base station to identify the strong users via a simple correlation process. After deriving the minimum required code length and closed-form expressions for the feedback load and downlink capacity, we show that i) the proposed algorithm reduces the feedback load while matching the achievable rate of full feedback algorithms operating over a noiseless feedback channel, and ii) the proposed codes are superior to the Gaussian codes.

  19. Proposal of Heuristic Algorithm for Scheduling of Print Process in Auto Parts Supplier

    Science.gov (United States)

    Matsumoto, Shimpei; Okuhara, Koji; Ueno, Nobuyuki; Ishii, Hiroaki

    We are interested in the print process in the manufacturing operations of an auto parts supplier as an actual problem. The purpose of this research is to apply our scheduling technique, developed at university, to the actual print process in a mass customization environment. Rationalization of the print process depends on the lot sizing. The manufacturing lead time of the print process is long, and in the present method production is planned depending on workers' experience and intuition. The construction of an efficient production system is an urgent problem. Therefore, in this paper, in order to shorten the entire manufacturing lead time and to reduce stock, we reexamine the usual lot sizing rule based on a heuristic technique, and we propose an improvement method which can plan a more efficient schedule.

  20. The development of stochastic process modeling through risk analysis derived from scheduling of NPP project

    International Nuclear Information System (INIS)

    Lee, Kwang Ho; Roh, Myung Sub

    2013-01-01

    There are many different factors to consider when constructing a nuclear power plant successfully, from planning to decommissioning. According to the PMBOK, all projects have nine domains from a holistic project management perspective. They are equally important to all projects; however, this study focuses mostly on the processes required to manage timely completion of the project and to conduct risk management. The overall objective of this study is to explain what risk analysis derived from the scheduling of an NPP project is, and to show how to implement stochastic process modeling through risk management. Building a nuclear power plant requires a great deal of time and fundamental knowledge of all engineering disciplines. That means that integrated project scheduling management covering so many activities is necessary and very important. Simulation techniques for scheduling of an NPP project using the Open Plan, Crystal Ball, and Minitab programs can be useful tools for designing optimal schedule planning. Thus far, the Open Plan and Monte Carlo programs have been used to calculate the critical path for scheduling network analysis. In addition, the Minitab program has been applied to monitor the scheduling risk. This approach to stochastic modeling through risk analysis of project activities is very useful for optimizing the schedules of activities using the Critical Path Method and managing the scheduling control of an NPP project. This study has shown a new approach to optimal scheduling of an NPP project; however, it does not consider the characteristics of activities according to the NPP site conditions. Hence, further research considering those factors is needed

  1. The development of stochastic process modeling through risk analysis derived from scheduling of NPP project

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kwang Ho; Roh, Myung Sub [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-10-15

    There are many different factors to consider when constructing a nuclear power plant successfully, from planning to decommissioning. According to the PMBOK, all projects have nine domains from a holistic project management perspective. They are equally important to all projects; however, this study focuses mostly on the processes required to manage timely completion of the project and to conduct risk management. The overall objective of this study is to explain what risk analysis derived from the scheduling of an NPP project is, and to show how to implement stochastic process modeling through risk management. Building a nuclear power plant requires a great deal of time and fundamental knowledge of all engineering disciplines. That means that integrated project scheduling management covering so many activities is necessary and very important. Simulation techniques for scheduling of an NPP project using the Open Plan, Crystal Ball, and Minitab programs can be useful tools for designing optimal schedule planning. Thus far, the Open Plan and Monte Carlo programs have been used to calculate the critical path for scheduling network analysis. In addition, the Minitab program has been applied to monitor the scheduling risk. This approach to stochastic modeling through risk analysis of project activities is very useful for optimizing the schedules of activities using the Critical Path Method and managing the scheduling control of an NPP project. This study has shown a new approach to optimal scheduling of an NPP project; however, it does not consider the characteristics of activities according to the NPP site conditions. Hence, further research considering those factors is needed.

  2. Refinery scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Magalhaes, Marcus V.; Fraga, Eder T. [PETROBRAS, Rio de Janeiro, RJ (Brazil); Shah, Nilay [Imperial College, London (United Kingdom)

    2004-07-01

    This work addresses the refinery scheduling problem using mathematical programming techniques. The solution adopted was to decompose the entire refinery model into a crude oil scheduling problem and a product scheduling problem. The envelope for the crude oil scheduling problem is composed of a terminal, a pipeline and the crude area of a refinery, including the crude distillation units. The solution method adopted includes a decomposition technique based on the topology of the system. The envelope for the product scheduling comprises all tanks, process units and products found in a refinery. Once the crude oil scheduling decisions are available, the product scheduling is solved using a rolling horizon algorithm. All models were tested with real data from PETROBRAS' REFAP refinery, located in Canoas, Southern Brazil. (author)

  3. A priority-based heuristic algorithm (PBHA) for optimizing integrated process planning and scheduling problem

    Directory of Open Access Journals (Sweden)

    Muhammad Farhan Ausaf

    2015-12-01

    Full Text Available Process planning and scheduling are two important components of a manufacturing setup. It is important to integrate them to achieve better global optimality and improved system performance. Numerous algorithm-based approaches exist to find optimal solutions for the integrated process planning and scheduling (IPPS) problem. Most of these approaches try to use existing meta-heuristic algorithms for solving the IPPS problem. Although these approaches have been shown to be effective in optimizing the IPPS problem, there is still room for improvement in terms of solution quality and algorithm efficiency, especially for more complicated problems. Dispatching rules have been successfully utilized for solving complicated scheduling problems, but have not been considered extensively for the IPPS problem. The present approach incorporates dispatching rules with the concept of prioritizing jobs, in an algorithm called the priority-based heuristic algorithm (PBHA). PBHA tries to establish job and machine priorities for selecting operations. Priority assignment and a set of dispatching rules are used simultaneously to generate both the process plans and the schedules for all jobs and machines. The algorithm was tested on a series of benchmark problems. The proposed algorithm was able to achieve superior results for the most complex problems presented in recent literature while utilizing fewer computational resources.

  4. Multi-processor network implementations in Multibus II and VME

    International Nuclear Information System (INIS)

    Briegel, C.

    1992-01-01

    ACNET (Fermilab Accelerator Controls Network), a proprietary network protocol, is implemented in a multi-processor configuration for both Multibus II and VME. The implementations are contrasted by the bus protocol and software design goals. The Multibus II implementation provides for multiple processors running a duplicate set of tasks on each processor. For a network connected task, messages are distributed by a network round-robin scheduler. Further, messages can be stopped, continued, or re-routed for each task by user-callable commands. The VME implementation provides for multiple processors running one task across all processors. The process can either be fixed to a particular processor or dynamically allocated to an available processor depending on the scheduling algorithm of the multi-processing operating system. (author)

  5. Biogenesis and proteolytic processing of lysosomal DNase II.

    Directory of Open Access Journals (Sweden)

    Susumu Ohkouchi

    Full Text Available Deoxyribonuclease II (DNase II) is a key enzyme in the phagocytic digestion of DNA from apoptotic nuclei. To understand the molecular properties of DNase II, particularly the processing, we prepared a polyclonal antibody against carboxyl-terminal sequences of mouse DNase II. In the present study, partial purification of DNase II using Con A Sepharose enabled the detection of endogenous DNase II by Western blotting. It was interesting that two forms of endogenous DNase II were detected--a 30 kDa form and a 23 kDa form. Neither of those forms carried the expected molecular weight of 45 kDa. Subcellular fractionation showed that the 23 kDa and 30 kDa proteins were localized in lysosomes. The processing of DNase II in vivo was also greatly altered in the liver of mice lacking cathepsin L. DNase II that was extracellularly secreted from cells overexpressing DNase II was detected as a pro-form, which was activated under acidic conditions. These results indicate that DNase II is processed and activated in lysosomes, while cathepsin L is involved in the processing of the enzyme.

  6. A Study of the Operating Room Scheduling System at Tripler Army Medical Center, Hawaii

    Science.gov (United States)

    1981-08-01

    The report addresses operating room scheduling, described as one of the most difficult administrative tasks that a modern hospital must face, and proposes a scheduling process based on a master posting sheet that incorporates the two-room system described earlier.

  7. Tuning COCOMO-II for Software Process Improvement: A Tool Based Approach

    Directory of Open Access Journals (Sweden)

    SYEDA UMEMA HANI

    2016-10-01

    Full Text Available In order to compete in the international software development market, software organizations have to adopt internationally accepted software practices, i.e. standards like ISO (International Standard Organization) or CMMI (Capability Maturity Model Integration), in spite of having scarce resources and tools. The aim of this study is to develop a tool which could be used to present an actual picture of Software Process Improvement benefits to software development companies. The few tools available to assist in making predictions are too expensive and do not cover datasets that reflect the cultural behavior of organizations for software development in developing countries. In extension to our previously reported research, which quantified the benefits of SDPI (Software Development Process Improvement) for Pakistani software development organizations, this research used sixty-two datasets from three different software development organizations against the set of metrics used in COCOMO-II (Constructive Cost Model 2000). It derived a verifiable equation for calculating ISF (Ideal Scale Factor) and tuned the COCOMO-II model to bring prediction capability for SDPI benefit measurement classes such as ESCP (Effort, Schedule, Cost, and Productivity). This research has contributed to the software industry by giving a reliable and low-cost mechanism for generating prediction models with high prediction accuracy. Hopefully, this study will help software organizations use this tool not only to predict ESCP but also to predict the exact impact of SDPI.
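
    For reference, the quantity being tuned is the COCOMO II post-architecture effort equation, PM = A * Size^E * prod(EM) with E = B + 0.01 * sum(SF). The sketch below uses the published default calibration A = 2.94 and B = 0.91; the scale-factor and effort-multiplier values in the example call are made-up illustrations, not data from the study.

```python
# Hedged sketch of the COCOMO II post-architecture effort equation with the
# published default calibration A = 2.94, B = 0.91. The scale factors and effort
# multipliers in the example call are made-up illustrations, not study data.

A, B = 2.94, 0.91

def cocomo_ii_effort(ksloc, scale_factors, effort_multipliers):
    E = B + 0.01 * sum(scale_factors)        # scale exponent
    effort_pm = A * ksloc ** E               # nominal effort in person-months
    for em in effort_multipliers:
        effort_pm *= em                      # apply cost drivers
    return effort_pm, E

pm, E = cocomo_ii_effort(ksloc=50,
                         scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                         effort_multipliers=[1.0, 0.87, 1.17])
print(f"exponent E = {E:.3f}, estimated effort = {pm:.1f} person-months")
```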

  8. A Flexible Job Shop Scheduling Problem with Controllable Processing Times to Optimize Total Cost of Delay and Processing

    Directory of Open Access Journals (Sweden)

    Hadi Mokhtari

    2015-11-01

    Full Text Available In this paper, the flexible job shop scheduling problem with machine flexibility and controllable processing times is studied. The main idea is that the processing times of operations may be controlled by the consumption of additional resources. The purpose of this paper is to find the best trade-off between processing cost and delay cost in order to minimize the total costs. The proposed model, flexible job shop scheduling with controllable processing times (FJCPT), is formulated as an integer non-linear programming (INLP) model and then converted into an integer linear programming (ILP) model. Due to the NP-hardness of FJCPT, conventional analytic optimization methods are not efficient. Hence, in order to solve the problem, a Scatter Search (SS), an efficient metaheuristic method, is developed. To show the effectiveness of the proposed method, numerical experiments are conducted. The efficiency of the proposed algorithm is compared with that of a genetic algorithm (GA) available in the literature for solving the FJSP problem. The results showed that the proposed SS provides better solutions than the existing GA.
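
    The trade-off at the heart of the model can be illustrated on a single machine: compressing an operation's processing time incurs a resource (processing) cost but can reduce the delay cost. The sketch below evaluates that total cost for one fixed sequence; all data and cost coefficients are illustrative, and the real FJCPT model additionally handles machine assignment.

```python
# Hedged sketch of the processing-cost/delay-cost trade-off on a single machine:
# compressing a job costs resources but can cut tardiness. Data are illustrative;
# the full FJCPT model also assigns operations to alternative machines.

def total_cost(sequence, base_p, compression, comp_cost, due, tardy_cost):
    t, cost = 0.0, 0.0
    for j in sequence:
        p = base_p[j] - compression[j]                 # controllable processing time
        cost += comp_cost[j] * compression[j]          # processing (compression) cost
        t += p
        cost += tardy_cost[j] * max(0.0, t - due[j])   # delay (tardiness) cost
    return cost

print(total_cost(["J1", "J2"],
                 base_p={"J1": 5, "J2": 8}, compression={"J1": 1, "J2": 2},
                 comp_cost={"J1": 3, "J2": 2}, due={"J1": 6, "J2": 12},
                 tardy_cost={"J1": 4, "J2": 5}))
```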

  9. Optimization of multi-objective integrated process planning and scheduling problem using a priority based optimization algorithm

    Science.gov (United States)

    Ausaf, Muhammad Farhan; Gao, Liang; Li, Xinyu

    2015-12-01

    For increasing the overall performance of modern manufacturing systems, effective integration of process planning and scheduling functions has been an important area of consideration among researchers. Owing to the complexity of handling process planning and scheduling simultaneously, most of the research work has been limited to solving the integrated process planning and scheduling (IPPS) problem for a single objective function. As there are many conflicting objectives when dealing with process planning and scheduling, real-world problems cannot be fully captured considering only a single objective for optimization. Therefore, considering the multi-objective IPPS (MOIPPS) problem is inevitable. Unfortunately, only a handful of research papers are available on solving the MOIPPS problem. In this paper, an optimization algorithm for solving the MOIPPS problem is presented. The proposed algorithm uses a set of dispatching rules coupled with priority assignment to optimize the IPPS problem for various objectives like makespan, total machine load, total tardiness, etc. A fixed-size external archive coupled with a crowding distance mechanism is used to store and maintain the non-dominated solutions. To compare the results with other algorithms, a C-metric based method has been used. Instances from four recent papers have been solved to demonstrate the effectiveness of the proposed algorithm. The experimental results show that the proposed method is an efficient approach for solving the MOIPPS problem.
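
    The archive maintenance mentioned above can be sketched with the standard non-dominance test and crowding-distance truncation shown below (minimization assumed). This is a generic implementation of the mechanism, not the authors' code.

```python
# Hedged sketch of fixed-size archive maintenance: keep non-dominated points and,
# on overflow, drop the most crowded one (generic implementation, minimization).

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def crowding_distance(front):
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        dist[order[0]] = dist[order[-1]] = float("inf")   # keep the extremes
        span = (front[order[-1]][k] - front[order[0]][k]) or 1.0
        for pos in range(1, n - 1):
            dist[order[pos]] += (front[order[pos + 1]][k]
                                 - front[order[pos - 1]][k]) / span
    return dist

def update_archive(archive, candidate, max_size):
    if any(dominates(a, candidate) for a in archive):
        return archive                                    # candidate is dominated
    archive = [a for a in archive if not dominates(candidate, a)] + [candidate]
    if len(archive) > max_size:
        d = crowding_distance(archive)
        archive.pop(d.index(min(d)))                      # remove the most crowded point
    return archive

arc = []
for point in [(10, 8), (9, 9), (8, 10), (9, 8), (12, 7)]:
    arc = update_archive(arc, point, max_size=3)
print(arc)
```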

  10. Thin film processes II

    CERN Document Server

    Kern, Werner

    1991-01-01

    This sequel to the 1978 classic, Thin Film Processes, gives a clear, practical exposition of important thin film deposition and etching processes that have not yet been adequately reviewed. It discusses selected processes in tutorial overviews with implementation guidelines and an introduction to the literature. Though edited to stand alone, when taken together, Thin Film Processes II and its predecessor present a thorough grounding in modern thin film techniques. Key Features: provides an all-new sequel to the 1978 classic, Thin Film Processes; introduces new topics, and sever

  11. NRC comprehensive records disposition schedule

    International Nuclear Information System (INIS)

    1983-05-01

    Effective January 1, 1982, NRC will institute records retention and disposal practices in accordance with the approved Comprehensive Records Disposition Schedule (CRDS). CRDS is comprised of NRC Schedules (NRCS) 1 to 4, which apply to the agency's program or substantive records, and General Records Schedules (GRS) 1 to 24, which apply to housekeeping or facilitative records. NRCS-I applies to records common to all or most NRC offices; NRCS-II applies to program records as found in the various offices of the Commission, Atomic Safety and Licensing Board Panel, and the Atomic Safety and Licensing Appeal Panel; NRCS-III applies to records accumulated by the Advisory Committee on Reactor Safeguards; and NRCS-IV applies to records accumulated in the various NRC offices under the Executive Director for Operations. The schedules are assembled functionally/organizationally to facilitate their use. Preceding the records descriptions and disposition instructions for both NRCS and GRS, there are brief statements on the organizational units which accumulate the records in each functional area, and other information regarding the schedules' applicability.

  12. A customizable system for real-time image processing using the Blackfin DSProcessor and the MicroC/OS-II real-time kernel

    Science.gov (United States)

    Coffey, Stephen; Connell, Joseph

    2005-06-01

    This paper presents a development platform for real-time image processing based on the ADSP-BF533 Blackfin processor and the MicroC/OS-II real-time operating system (RTOS). MicroC/OS-II is a completely portable, ROMable, pre-emptive, real-time kernel. The Blackfin Digital Signal Processors (DSPs), incorporating the Analog Devices/Intel Micro Signal Architecture (MSA), are a broad family of 16-bit fixed-point products with a dual Multiply Accumulate (MAC) core. In addition, they have a rich instruction set with variable instruction length and both DSP and MCU functionality thus making them ideal for media based applications. Using the MicroC/OS-II for task scheduling and management, the proposed system can capture and process raw RGB data from any standard 8-bit greyscale image sensor in soft real-time and then display the processed result using a simple PC graphical user interface (GUI). Additionally, the GUI allows configuration of the image capture rate and the system and core DSP clock rates thereby allowing connectivity to a selection of image sensors and memory devices. The GUI also allows selection from a set of image processing algorithms based in the embedded operating system.

  13. LEARNING SCHEDULER PARAMETERS FOR ADAPTIVE PREEMPTION

    OpenAIRE

    Prakhar Ojha; Siddhartha R Thota; Vani M; Mohit P Tahilianni

    2015-01-01

    An operating system scheduler is expected not to let the processor stay idle if there is any process ready or waiting for execution. This problem gains more importance as the number of processes always outnumbers the processors by large margins. It is in this regard that schedulers are provided with the ability to preempt a running process, by following a scheduling algorithm, and give us an illusion of simultaneous running of several processes. A process which is allowed t...

  14. Integration of scheduling and discrete event simulation systems to improve production flow planning

    Science.gov (United States)

    Krenczyk, D.; Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.

    2016-08-01

    The increased availability of data and computer-aided technologies such as MRP I/II, ERP and MES systems allows producers to be more adaptive to market dynamics and to improve production scheduling. Integration of production scheduling with computer modelling, simulation and visualization systems can be useful in the analysis of production system constraints related to the efficiency of manufacturing systems. An integration methodology based on a semi-automatic model generation method is proposed to eliminate problems associated with model complexity and the labour-intensive, time-consuming process of simulation model creation. Data mapping and data transformation techniques for the proposed method have been applied. This approach has been illustrated through examples of practical implementation of the proposed method using the KbRS scheduling system and the Enterprise Dynamics simulation system.

  15. Magnetite Dissolution Performance of HYBRID-II Decontamination Process

    International Nuclear Information System (INIS)

    Kim, Seonbyeong; Lee, Woosung; Won, Huijun; Moon, Jeikwon; Choi, Wangkyu

    2014-01-01

    In this study, we conducted a magnetite dissolution performance test of HYBRID-II (Hydrazine Based Reductive metal Ion Decontamination with sulfuric acid) as a part of decontamination process development. The decontamination performance of the HYBRID process was successfully tested, with acceptable decontamination factors (DF) obtained in a previous study. While follow-up studies such as the decomposition of the post-decontamination HYBRID solution and corrosion compatibility with the substrate metals of the target reactor coolant system have continued, we also sought an alternate version of the HYBRID process suitable especially for decommissioning. Inspired by the relationship between the radius of the reacting ion and the reactivity, we replaced the nitrate ion in HYBRID with the bigger sulfate ion to promote the dissolution reaction and named the result the HYBRID-II process. As a preliminary step toward assessing decontamination performance, we tested the magnetite dissolution performance of the developing HYBRID-II process and compared the results with those of the HYBRID process. The HYBRID process developed previously is known to have acceptable decontamination performance, but the relatively large volume of secondary waste induced by the anion exchange resin needed to treat the nitrate ion is one of the problems in making the HYBRID process applicable. Therefore, we devised the HYBRID-II process using sulfuric acid and tested its dissolution of magnetite under numerous conditions. From the results shown in this study, we can conclude that the HYBRID-II process improves the decontamination performance and potentially reduces the volume of secondary waste. Rigorous tests with metal oxide coupons obtained from a reactor coolant system will follow to prove the robustness of the HYBRID-II process in the future

  16. New heating schedule in hydrogen annealing furnace based on process simulation for less energy consumption

    International Nuclear Information System (INIS)

    Saboonchi, Ahmad; Hassanpour, Saeid; Abbasi, Shahram

    2008-01-01

    Cold rolled steel coils are annealed in batch furnaces to obtain desirable mechanical properties. Annealing operations involve heating and cooling cycles which take a long time due to the high weight of the coils being annealed. To reduce annealing time, a simulation code was developed that is capable of evaluating more effective schedules for annealing coils during the heating process. This code is additionally capable of accurately determining the furnace turn-off time for different coil weights and charge dimensions. After studying many heating schedules and considering the heat transfer mechanism in the annealing furnace, a new schedule with the most advantages was selected as the new operating condition in the hydrogen annealing plant. The performance of all the furnaces was adjusted to the new heating schedule after experiments had been carried out to ensure the accuracy of the code and the fitness of the new operating condition. Comparison of similar yields of cold rolled coils over two months revealed that the specific energy consumption of the furnaces under the new heating schedule decreased by 11%, the heating cycle time by 16%, and the hydrogen consumption by 14%

  17. New heating schedule in hydrogen annealing furnace based on process simulation for less energy consumption

    Energy Technology Data Exchange (ETDEWEB)

    Saboonchi, Ahmad [Department of Mechanical Engineering, Isfahan University of Technology, Isfahan 84154 (Iran); Hassanpour, Saeid [Rayan Tahlil Sepahan Co., Isfahan Science and Technology Town, Isfahan 84155 (Iran); Abbasi, Shahram [R and D Department, Mobarakeh Steel Complex, Isfahan (Iran)

    2008-11-15

    Cold rolled steel coils are annealed in batch furnaces to obtain desirable mechanical properties. Annealing operations involve heating and cooling cycles which take a long time due to the high weight of the coils being annealed. To reduce annealing time, a simulation code was developed that is capable of evaluating more effective schedules for annealing coils during the heating process. This code is additionally capable of accurately determining the furnace turn-off time for different coil weights and charge dimensions. After studying many heating schedules and considering the heat transfer mechanism in the annealing furnace, a new schedule with the most advantages was selected as the new operating condition in the hydrogen annealing plant. The performance of all the furnaces was adjusted to the new heating schedule after experiments had been carried out to ensure the accuracy of the code and the fitness of the new operating condition. Comparison of similar yields of cold rolled coils over two months revealed that the specific energy consumption of the furnaces under the new heating schedule decreased by 11%, the heating cycle time by 16%, and the hydrogen consumption by 14%. (author)

  18. Influence of intravenous self-administered psychomotor stimulants on performance of rhesus monkeys in a multiple schedule paradigm.

    Science.gov (United States)

    Hoffmeister, F

    1980-01-01

    Rhesus monkeys were trained to complete three multiple schedules. The schedules consisted of three components: a fixed interval (component 1), a variable interval (component 2), and a fixed ratio (component 3). During components 1 and 2, pressing lever 1 was always reinforced by food delivery. During component 3, pressing lever 2 resulted in either food delivery or intravenous infusions of saline solution, solutions of cocaine, of d-amphetamine, of phenmetrazine, or fenetylline. In schedule I, animals were presented with all three components independent of key-pressing behavior during components 1 and 2. In schedule II the availability of component 2 was dependent on completion of component 1. Component 3 was made available only on completion of component 2. Noncompletion of components 1 or 2 resulted in time-out of 15 and 10 min, respectively. Schedule III was identical with schedule II, except that in schedule III the completion of components was indicated only by a change in the lever lights. The influence of self-administered drugs on behavior in all three components was evaluated. Self-administration of psychomotor stimulants impaired the performance of animals and delayed completion of components 1 and 2 of schedules I, II, and III. The effects on behavior were similar with low drug intake in schedule III, moderate intake in schedule II, and high drug intake in schedule I. These effects were strong with self-administration of phenmetrazine, moderate with self-administration of cocaine and d-amphetamine, and weak with self-administration of fenetylline.

  19. Generation of Look-Up Tables for Dynamic Job Shop Scheduling Decision Support Tool

    Science.gov (United States)

    Oktaviandri, Muchamad; Hassan, Adnan; Mohd Shaharoun, Awaluddin

    2016-02-01

    The majority of existing scheduling techniques are based on static demand and deterministic processing times, while most job shop scheduling problems involve dynamic demand and stochastic processing times. As a consequence, the solutions obtained from traditional scheduling techniques are ineffective whenever changes occur in the system. Therefore, this research intends to develop a decision support tool (DST) based on promising artificial intelligence techniques that is able to accommodate the dynamics that regularly occur in the job shop scheduling problem. The DST was designed through three phases, i.e. (i) look-up table generation, (ii) inverse model development and (iii) integration of DST components. This paper reports the generation of look-up tables for various scenarios as a part of the development of the DST. A discrete event simulation model was used to compare the performance of the SPT, EDD, FCFS, S/OPN and Slack rules; the best performance measures (mean flow time, mean tardiness and mean lateness) and the job order requirements (inter-arrival time, due date tightness and setup time ratio) were compiled into look-up tables. The well-known 6/6/J/Cmax problem from Muth and Thompson (1963) was used as a case study. In the future, the performance measures of various scheduling scenarios and the job order requirements will be mapped using an ANN inverse model.
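
    The kind of experiment behind the look-up tables can be sketched as follows: apply one dispatching rule to a batch of job records and collect the resulting mean flow time and mean tardiness. The static single-machine pass below is only an illustration; the study itself used a dynamic discrete-event simulation model.

```python
# Hedged sketch of the kind of run behind the look-up tables: apply one dispatching
# rule to a set of jobs and record mean flow time and mean tardiness. A static
# single-machine pass is shown; the study used a dynamic discrete-event model.

RULES = {
    "SPT":  lambda j: j["p"],          # shortest processing time
    "EDD":  lambda j: j["due"],        # earliest due date
    "FCFS": lambda j: j["arrival"],    # first come, first served
}

def simulate(jobs, rule):
    t, flow, tardy = 0.0, [], []
    for j in sorted(jobs, key=RULES[rule]):
        t = max(t, j["arrival"]) + j["p"]
        flow.append(t - j["arrival"])
        tardy.append(max(0.0, t - j["due"]))
    return sum(flow) / len(jobs), sum(tardy) / len(jobs)

jobs = [{"arrival": 0, "p": 4, "due": 10},
        {"arrival": 1, "p": 2, "due": 5},
        {"arrival": 2, "p": 6, "due": 9}]
for rule in RULES:
    print(rule, simulate(jobs, rule))    # (mean flow time, mean tardiness) per rule
```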

  20. 9 CFR 318.303 - Critical factors and the application of the process schedule.

    Science.gov (United States)

    2010-01-01

    9 CFR 318.303 (Animals and Animal Products; Food Safety and Inspection Service, Department of Agriculture; Mandatory Meat and Poultry Products Inspection; Reinspection and Preparation of Products; Canning and Canned Products): Critical factors and the application of the process schedule.

  1. 9 CFR 381.303 - Critical factors and the application of the process schedule.

    Science.gov (United States)

    2010-01-01

    9 CFR 381.303 (Animals and Animal Products; Food Safety and Inspection Service, Department of Agriculture; Poultry Products Inspection and Voluntary Inspection and Certification; Poultry Products Inspection Regulations): Critical factors and the application of the process schedule.

  2. Decoupling algorithms from schedules for easy optimization of image processing pipelines

    OpenAIRE

    Adams, Andrew; Paris, Sylvain; Levoy, Marc; Ragan-Kelley, Jonathan Millar; Amarasinghe, Saman P.; Durand, Fredo

    2012-01-01

    Using existing programming tools, writing high-performance image processing code requires sacrificing readability, portability, and modularity. We argue that this is a consequence of conflating what computations define the algorithm, with decisions about storage and the order of computation. We refer to these latter two concerns as the schedule, including choices of tiling, fusion, recomputation vs. storage, vectorization, and parallelism. We propose a representation for feed-forward imagi...

  3. Genetic algorithm to solve the problems of lectures and practicums scheduling

    Science.gov (United States)

    Syahputra, M. F.; Apriani, R.; Sawaluddin; Abdullah, D.; Albra, W.; Heikal, M.; Abdurrahman, A.; Khaddafi, M.

    2018-02-01

    Generally, the scheduling process is done manually. However, this method has a low accuracy level, along with the possibility that one scheduled activity collides with another. When constructing the timetable for theory classes and practicums, there are numerous problems, such as collisions in lecturers' teaching schedules, collisions between schedules, practicum lesson schedules that collide with theory classes, and the limited number of classrooms available. In this research, a genetic algorithm is implemented to perform the theory class and practicum timetable scheduling process. The algorithm is used to process data containing lists of lecturers, courses, and classrooms, obtained from the information technology department at the University of Sumatera Utara. The result of the scheduling process using the genetic algorithm is the most optimal timetable that conforms to the available time slots, classrooms, courses, and lecturer schedules.
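
    As a rough sketch of the approach (with assumed data sizes, encoding, and rates rather than the authors'), a timetable can be encoded as one (timeslot, room) gene per course event, with fitness counting lecturer and room collisions:

```python
import random

# Hedged sketch of the GA: each course event gets a (timeslot, room) gene, and
# fitness counts lecturer and room collisions. Data sizes, encoding, and rates
# are assumptions for illustration, not the authors' implementation.

SLOTS, ROOMS = 20, 4

def random_timetable(events):
    return [(random.randrange(SLOTS), random.randrange(ROOMS)) for _ in events]

def conflicts(timetable, events):
    used_rooms, busy_lecturers, c = set(), set(), 0
    for (slot, room), ev in zip(timetable, events):
        c += (slot, room) in used_rooms                  # room double-booked
        c += (slot, ev["lecturer"]) in busy_lecturers    # lecturer double-booked
        used_rooms.add((slot, room))
        busy_lecturers.add((slot, ev["lecturer"]))
    return c

def evolve(events, pop_size=60, generations=200, mut_rate=0.1):
    pop = [random_timetable(events) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: conflicts(t, events))
        if conflicts(pop[0], events) == 0:
            break                                        # collision-free timetable found
        parents, children = pop[:pop_size // 2], []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(events))
            child = a[:cut] + b[cut:]                    # one-point crossover
            child = [(random.randrange(SLOTS), random.randrange(ROOMS))
                     if random.random() < mut_rate else gene for gene in child]
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda t: conflicts(t, events))

events = [{"lecturer": l} for l in "ABCA" * 5]           # 20 course events
print(conflicts(evolve(events), events))                 # 0 means no collisions remain
```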

  4. Scheduling job shop - A case study

    Science.gov (United States)

    Abas, M.; Abbas, A.; Khan, W. A.

    2016-08-01

    Scheduling in a job shop is important for the efficient utilization of machines in the manufacturing industry. A number of algorithms are available for the scheduling of jobs, which depend on the machine tools, indirect consumables and the jobs to be processed. In this paper a case study is presented for the scheduling of jobs when parts are processed on the available machines. Through time and motion study, setup time and operation time are measured as the total processing time for a variety of products having different manufacturing processes. Based on due dates, different levels of priority are assigned to the jobs, and the jobs are scheduled on the basis of priority. In view of the measured processing times, the processing times of some new jobs are estimated, and for efficient utilization of the available machines an algorithm is proposed and validated.

  5. 2007 Wholesale Power Rate Schedules : 2007 General Rate Schedule Provisions.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    2006-11-01

    This schedule is available for the contract purchase of Firm Power to be used within the Pacific Northwest (PNW). Priority Firm (PF) Power may be purchased by public bodies, cooperatives, and Federal agencies for resale to ultimate consumers, for direct consumption, and for Construction, Test and Start-Up, and Station Service. Rates in this schedule are in effect beginning October 1, 2006, and apply to purchases under requirements Firm Power sales contracts for a three-year period. The Slice Product is only available for public bodies and cooperatives who have signed Slice contracts for the FY 2002-2011 period. Utilities participating in the Residential Exchange Program (REP) under Section 5(c) of the Northwest Power Act may purchase Priority Firm Power pursuant to the Residential Exchange Program. Rates under contracts that contain charges that escalate based on BPA's Priority Firm Power rates shall be based on the three-year rates listed in this rate schedule in addition to applicable transmission charges. This rate schedule supersedes the PF-02 rate schedule, which went into effect October 1, 2001. Sales under the PF-07 rate schedule are subject to BPA's 2007 General Rate Schedule Provisions (2007 GRSPs). Products available under this rate schedule are defined in the 2007 GRSPs. For sales under this rate schedule, bills shall be rendered and payments due pursuant to BPA's 2007 GRSPs and billing process.

  6. Energy-Efficient Scheduling Problem Using an Effective Hybrid Multi-Objective Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Lvjiang Yin

    2016-12-01

    Full Text Available Nowadays, manufacturing enterprises face the challenge of just-in-time (JIT) production and energy saving. Therefore, the study of JIT production and energy consumption is necessary and important in manufacturing sectors. Moreover, energy saving can be attained by the operational method and the turn off/on idle machine method, which also increases the complexity of problem solving. Thus, most researchers still focus on small-scale problems with one objective in a single machine environment. However, the scheduling problem is a multi-objective optimization problem in real applications. In this paper, a single machine scheduling model with controllable processing and sequence-dependent setup times is developed for minimizing the total earliness/tardiness (E/T), cost, and energy consumption simultaneously. An effective multi-objective evolutionary algorithm called the local multi-objective evolutionary algorithm (LMOEA) is presented to tackle this multi-objective scheduling problem. To accommodate the characteristics of the problem, a new solution representation is proposed, which can convert discrete combinatorial problems into continuous problems. Additionally, a multiple local search strategy with a self-adaptive mechanism is introduced into the proposed algorithm to enhance the exploitation ability. The performance of the proposed algorithm is evaluated on problem instances in comparison with other multi-objective meta-heuristics such as the Nondominated Sorting Genetic Algorithm II (NSGA-II), Strength Pareto Evolutionary Algorithm 2 (SPEA2), Multiobjective Particle Swarm Optimization (OMOPSO), and Multiobjective Evolutionary Algorithm Based on Decomposition (MOEA/D). Experimental results demonstrate that the proposed LMOEA algorithm outperforms its counterparts for this kind of scheduling problem.

  7. Robust and Flexible Scheduling with Evolutionary Computation

    DEFF Research Database (Denmark)

    Jensen, Mikkel T.

    Over the last ten years, there have been numerous applications of evolutionary algorithms to a variety of scheduling problems. Like most other research on heuristic scheduling, the primary aim of the research has been on deterministic formulations of the problems. This is in contrast to real world...... scheduling problems which are usually not deterministic. Usually at the time the schedule is made some information about the problem and processing environment is available, but this information is uncertain and likely to change during schedule execution. Changes frequently encountered in scheduling...... environments include machine breakdowns, uncertain processing times, workers getting sick, materials being delayed and the appearance of new jobs. These possible environmental changes mean that a schedule which was optimal for the information available at the time of scheduling can end up being highly...

  8. Multi-objective group scheduling with learning effect in the cellular manufacturing system

    Directory of Open Access Journals (Sweden)

    Mohammad Taghi Taghavi-fard

    2011-01-01

    Full Text Available The group scheduling problem in cellular manufacturing systems consists of two major steps: sequencing the parts in each part-family, and sequencing the part-families entering the cell to be processed. This paper presents a new method for group scheduling problems in flow shop systems that minimizes makespan (Cmax) and total tardiness. In this paper, a position-based learning model in a cellular manufacturing system is utilized, where the processing time of each part-family depends on the entrance sequence of that part. The group scheduling problem is modeled by minimizing two objectives with a position-based learning effect as well as the assumption of setup times depending on the sequence of part-families. Since the proposed problem is NP-hard, two meta-heuristic algorithms are presented based on genetic algorithms, namely the non-dominated sorting genetic algorithm (NSGA-II) and the non-dominated rank genetic algorithm (NRGA). The algorithms are tested using randomly generated problems. The results include a set of Pareto solutions, and three different evaluation criteria are used to compare the results. The results indicate that the proposed algorithms are quite efficient, solving the problem in a short computational time.
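
    The position-based learning effect mentioned above is commonly written as p_actual = p * r^a for the job in position r with a learning index a < 0. The sketch below uses a = -0.322 (an 80% learning curve) purely as an illustrative assumption; the paper's exact learning model and flow shop setting are richer than this single-machine view.

```python
# Hedged sketch of a position-based learning effect: the job scheduled in position
# r takes p * r**a time, with a = -0.322 (an 80% learning curve) as an illustrative
# assumption; the paper's flow shop group-scheduling model is richer than this.

def actual_times(base_times, a=-0.322):
    return [p * (r ** a) for r, p in enumerate(base_times, start=1)]

print(actual_times([10, 10, 10]))        # later positions take progressively less time
print(sum(actual_times([10, 10, 10])))   # completion time of the batch on one machine
```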

  9. A Gas Scheduling Optimization Model for Steel Enterprises

    Directory of Open Access Journals (Sweden)

    Niu Honghai

    2017-01-01

    Full Text Available Regarding the scheduling problems of steel enterprises, this research designs a gas scheduling optimization model according to rules and priorities. Considering the different features and process changes of the gas units in actual production, a calculation model of process state and gas-consumption soft measurement, together with rules for scheduling optimization, is proposed to provide dispatchers with the real-time gas usage status of each process and thus help them schedule in a timely manner and reduce gas volume fluctuations. In the meantime, operation forewarning and alarm functions are provided to avoid abnormal situations in the scheduling, which has produced very good results in actual scheduling and ensures the safety of the gas pipe network system and the stability of production.

  10. ATLAS construction schedule

    CERN Multimedia

    Kotamaki, M

    The goal during the last few months has been to freeze and baseline as much as possible the schedules of various ATLAS systems and activities. The main motivations for the re-baselining of the schedules have been the new LHC schedule aiming at first collisions in early 2006 and the encountered delays in civil engineering as well as in the production of some of the detectors. The process was started by first preparing a new installation schedule that takes into account all the new external constraints and the new ATLAS staging scenario. The installation schedule version 3 was approved in the March EB and it provides the Ready For Installation (RFI) milestones for each system, i.e. the date when the system should be available for the start of the installation. TCn is now interacting with the systems aiming at a more realistic and resource loaded version 4 before the end of the year. Using the new RFI milestones as driving dates a new summary schedule has been prepared, or is under preparation, for each system....

  11. Analyzing the nursing organizational structure and process from a scheduling perspective.

    Science.gov (United States)

    Maenhout, Broos; Vanhoucke, Mario

    2013-09-01

    The efficient and effective management of nursing personnel is of critical importance in a hospital's environment, comprising approximately 25% of the hospital's operational costs. The nurse organizational structure and the organizational processes highly affect the nurses' working conditions and the quality of care provided. In this paper, we investigate the impact of different nurse organization structures and different organizational processes for a real-life situation in a Belgian university hospital. In order to make accurate nurse staffing decisions, the employed solution methodology incorporates shift scheduling characteristics in order to overcome the deficiencies of the many phase-specific methodologies that are proposed in the academic literature.

  12. Integrated multi-resource planning and scheduling in engineering project

    Directory of Open Access Journals (Sweden)

    Samer Ben Issa

    2017-01-01

    Full Text Available Planning and scheduling processes in project management are carried out sequentially in practice, i.e. project activities are planned first without visibility of resource limitations, and the project is then scheduled according to these pre-planned activities. There is a need to integrate these two processes. In this paper, we use a Branch and Bound approach for generating all the feasible and non-feasible project schedules with/without activity splitting, and with a new criterion called "the Minimum Moments of Resources Required around X-Y axes (MMORR)", we select the best feasible project schedule to integrate plan processing and schedule processing for engineering projects. The results illustrate that this integrated approach can effectively select the best feasible project schedule among alternatives, improve resource utilization, and shorten the project lead time.

  13. Constraint-based scheduling

    Science.gov (United States)

    Zweben, Monte

    1993-01-01

    The GERRY scheduling system developed by NASA Ames with assistance from the Lockheed Space Operations Company, and the Lockheed Artificial Intelligence Center, uses a method called constraint-based iterative repair. Using this technique, one encodes both hard rules and preference criteria into data structures called constraints. GERRY repeatedly attempts to improve schedules by seeking repairs for violated constraints. The system provides a general scheduling framework which is being tested on two NASA applications. The larger of the two is the Space Shuttle Ground Processing problem which entails the scheduling of all the inspection, repair, and maintenance tasks required to prepare the orbiter for flight. The other application involves power allocation for the NASA Ames wind tunnels. Here the system will be used to schedule wind tunnel tests with the goal of minimizing power costs. In this paper, we describe the GERRY system and its application to the Space Shuttle problem. We also speculate as to how the system would be used for manufacturing, transportation, and military problems.
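
    A minimal sketch of constraint-based iterative repair is given below: score a schedule by its constraint violations and repeatedly apply a local repair move aimed at the most violated constraint, occasionally accepting a worse schedule to escape local minima. The constraint and move representations, and the toy no-overlap demo, are assumptions for illustration, not GERRY's internals.

```python
import random

# Hedged sketch of constraint-based iterative repair: count violations, repair the
# worst constraint, accept non-worsening candidates (with a small escape chance).
# The constraint/move representations and the toy demo are illustrative assumptions.

def iterative_repair(schedule, constraints, repair_move, max_iters=1000):
    def total_violations(s):
        return sum(c(s) for c in constraints)

    for _ in range(max_iters):
        violated = [c for c in constraints if c(schedule) > 0]
        if not violated:
            return schedule                              # all constraints satisfied
        worst = max(violated, key=lambda c: c(schedule))
        candidate = repair_move(schedule, worst)         # local repair for that constraint
        if total_violations(candidate) <= total_violations(schedule):
            schedule = candidate
        elif random.random() < 0.05:                     # occasional escape move
            schedule = candidate
    return schedule

# Toy demo: two 2-unit tasks on one resource must not overlap.
def overlap(s):
    return int(abs(s["t1"] - s["t2"]) < 2)

def shift_later(s, _constraint):
    moved = dict(s)
    moved[random.choice(["t1", "t2"])] += 1              # nudge one task later
    return moved

print(iterative_repair({"t1": 0, "t2": 1}, [overlap], shift_later))
```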

  14. An Artificial Bee Colony Algorithm for the Job Shop Scheduling Problem with Random Processing Times

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2011-09-01

    Full Text Available Due to the influence of unpredictable random events, the processing time of each operation should be treated as a random variable if we aim at a robust production schedule. However, compared with the extensive research on the deterministic model, the stochastic job shop scheduling problem (SJSSP) has not received sufficient attention. In this paper, we propose an artificial bee colony (ABC) algorithm for the SJSSP with the objective of minimizing the maximum lateness (which is an index of service quality). First, we propose a performance estimate for preliminary screening of the candidate solutions. Then, the K-armed bandit model is utilized to reduce the computational burden in the exact evaluation (through Monte Carlo simulation) process. Finally, the computational results on test problems of different scales validate the effectiveness and efficiency of the proposed approach.
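
    The exact evaluation step mentioned above can be sketched as a Monte Carlo estimate of the expected maximum lateness of a candidate sequence under sampled processing times. The normal distributions, coefficient of variation, and sample count below are illustrative assumptions.

```python
import random

# Hedged sketch of the simulation-based evaluation: estimate the expected maximum
# lateness of a candidate sequence by sampling processing times. The normal
# distributions, coefficient of variation, and sample count are illustrative.

def max_lateness(sequence, sampled_p, due):
    t, lmax = 0.0, float("-inf")
    for j in sequence:
        t += sampled_p[j]
        lmax = max(lmax, t - due[j])
    return lmax

def expected_max_lateness(sequence, mean_p, due, n_samples=2000, cv=0.2):
    total = 0.0
    for _ in range(n_samples):
        sample = {j: max(0.0, random.gauss(mean_p[j], cv * mean_p[j])) for j in sequence}
        total += max_lateness(sequence, sample, due)
    return total / n_samples

print(expected_max_lateness(["J1", "J2", "J3"],
                            mean_p={"J1": 4, "J2": 6, "J3": 3},
                            due={"J1": 5, "J2": 12, "J3": 11}))
```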

  15. Space network scheduling benchmark: A proof-of-concept process for technology transfer

    Science.gov (United States)

    Moe, Karen; Happell, Nadine; Hayden, B. J.; Barclay, Cathy

    1993-01-01

    This paper describes a detailed proof-of-concept activity to evaluate flexible scheduling technology as implemented in the Request Oriented Scheduling Engine (ROSE) and applied to Space Network (SN) scheduling. The criteria developed for an operational evaluation of a reusable scheduling system are addressed, including a methodology to prove that the proposed system performs at least as well as the current system in function and performance. The improvement of the new technology must be demonstrated and evaluated against the cost of making changes. Finally, there is a need to show significant improvement in SN operational procedures. Successful completion of a proof-of-concept would eventually lead to an operational concept and implementation transition plan, which is outside the scope of this paper. However, a high-fidelity benchmark using actual SN scheduling requests has been designed to test the ROSE scheduling tool. The benchmark evaluation methodology, scheduling data, and preliminary results are described.

  16. An Improved Hierarchical Genetic Algorithm for Sheet Cutting Scheduling with Process Constraints

    OpenAIRE

    Yunqing Rao; Dezhong Qi; Jinling Li

    2013-01-01

    For the first time, an improved hierarchical genetic algorithm for the sheet cutting problem, which involves n cutting patterns for m non-identical parallel machines with process constraints, has been proposed within the integrated cutting stock model. The objective of the cutting scheduling problem is minimizing the weighted completion time. A mathematical model for this problem is presented, and an improved hierarchical genetic algorithm (ant colony—hierarchical genetic algorithm) is developed for better ...

  17. Energy Efficient Scheduling of Real Time Signal Processing Applications through Combined DVFS and DPM

    OpenAIRE

    Nogues , Erwan; Pelcat , Maxime; Menard , Daniel; Mercat , Alexandre

    2016-01-01

    International audience; This paper proposes a framework to design energy efficient signal processing systems. The energy efficiency is provided by combining Dynamic Frequency and Voltage Scaling (DVFS) and Dynamic Power Management (DPM). The framework is based on Synchronous Dataflow (SDF) modeling of signal processing applications. A transformation to a single rate form is performed to expose the application parallelism. An automated scheduling is then performed, minimizing the constraint of...

  18. A FAST AND ELITIST BI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR SCHEDULING INDEPENDENT TASKS ON HETEROGENEOUS SYSTEMS

    Directory of Open Access Journals (Sweden)

    G.Subashini

    2010-07-01

    Full Text Available To meet increasing computational demands, geographically distributed resources need to be logically coupled to make them work as a unified resource. In analyzing the performance of such distributed heterogeneous computing systems, scheduling a set of tasks to the available set of resources for execution is highly important. Task scheduling being an NP-complete problem, the use of metaheuristics is more appropriate for obtaining optimal solutions. Schedules thus obtained can be evaluated using several criteria that may conflict with one another, which requires a multi-objective problem formulation. This paper investigates the application of an elitist Nondominated Sorting Genetic Algorithm (NSGA-II) to efficiently schedule a set of independent tasks in a heterogeneous distributed computing system. The objectives considered in this paper are minimizing makespan and average flowtime simultaneously. The implementations of the NSGA-II algorithm and a Weighted-Sum Genetic Algorithm (WSGA) have been tested on benchmark instances for distributed heterogeneous systems. As NSGA-II generates a set of Pareto optimal solutions, to verify the effectiveness of NSGA-II over WSGA, a fuzzy-based membership value assignment method is employed to choose the best compromise solution from the obtained Pareto solution set.
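
    A hedged sketch of the fuzzy membership step mentioned above, which picks a best compromise solution from a Pareto front of (makespan, flowtime) points. The abstract does not give the authors' exact formula, so the standard linear membership function is used here; the sample front is invented for illustration.

```python
# Hedged sketch of the common linear fuzzy-membership rule used to pick a
# "best compromise" solution from an NSGA-II Pareto front (illustrative only).

def best_compromise(front):
    """front: list of (makespan, flowtime) tuples, all to be minimized."""
    n_obj = len(front[0])
    f_min = [min(sol[k] for sol in front) for k in range(n_obj)]
    f_max = [max(sol[k] for sol in front) for k in range(n_obj)]

    def membership(sol):
        # mu_k = 1 at the best observed value, 0 at the worst, linear between.
        mu = []
        for k in range(n_obj):
            if f_max[k] == f_min[k]:
                mu.append(1.0)
            else:
                mu.append((f_max[k] - sol[k]) / (f_max[k] - f_min[k]))
        return sum(mu)

    scores = [membership(sol) for sol in front]
    total = sum(scores)
    normalized = [s / total for s in scores]   # normalized membership values
    best_idx = max(range(len(front)), key=lambda i: normalized[i])
    return best_idx, front[best_idx]

pareto_front = [(120, 50.0), (110, 58.0), (100, 70.0), (95, 85.0)]
print(best_compromise(pareto_front))
```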

  19. Scheduling the scheduling task : a time management perspective on scheduling

    NARCIS (Netherlands)

    Larco Martinelli, J.A.; Wiers, V.C.S.; Fransoo, J.C.

    2013-01-01

    Time is the most critical resource at the disposal of schedulers. Hence, an adequate management of time from the schedulers may impact positively on the scheduler’s productivity and responsiveness to uncertain scheduling environments. This paper presents a field study of how schedulers make use of

  20. BIM-BASED SCHEDULING OF CONSTRUCTION

    DEFF Research Database (Denmark)

    Andersson, Niclas; Büchmann-Slorup, Rolf

    2010-01-01

    The potential of BIM is generally recognized in the construction industry, but the practical application of BIM for management purposes is, however, still limited among contractors. The objective of this study is to review the current scheduling process of construction in light of BIM...... and communicate. Scheduling on the detailed level, on the other hand, follows a stipulated approach to scheduling, i.e. the Last Planner System (LPS), which is characterized by involvement of all actors in the construction phase. Thus, the major challenge when implementing BIM-based scheduling is to improve...

  1. The Composition of the Master Schedule

    Science.gov (United States)

    Thomas, Cynthia C.; Behrend, Dirk; MacMillan, Daniel S.

    2010-01-01

    Over a period of about four months, the IVS Coordinating Center (IVSCC) each year composes the Master Schedule for the IVS observing program of the next calendar year. The process begins in early July when the IVSCC contacts the IVS Network Stations to request information about available station time as well as holiday and maintenance schedules for the upcoming year. Going through various planning stages and a review process with the IVS Observing Program Committee (OPC), the final version of the Master Schedule is posted by early November. We describe the general steps of the composition and illustrate them with the example of the planning for the Master Schedule of the 2010 observing year.

  2. 75 FR 34219 - Revision of Fee Schedules; Fee Recovery for FY 2010

    Science.gov (United States)

    2010-06-16

    ... Part II Nuclear Regulatory Commission 10 CFR Parts 170 and 171 Revision of Fee Schedules; Fee...-2009-0333 RIN 3150-AI70 Revision of Fee Schedules; Fee Recovery for FY 2010 AGENCY: Nuclear Regulatory..., inspection, and annual fees charged to its applicants and licensees. The amendments are necessary to...

  3. Plan and schedule for disposition and regulatory compliance for miscellaneous streams. Revision 1

    International Nuclear Information System (INIS)

    1994-12-01

    On December 23, 1991, the U.S. Department of Energy, Richland Operations Office (RL) and the Washington State Department of Ecology (Ecology) agreed to adhere to the provisions of Department of Ecology Consent Order No. DE 91NM-177 (Consent Order). The Consent Order lists regulatory milestones for liquid effluent streams at the Hanford Site to comply with the permitting requirements of Washington Administrative Code (WAC) 173-216 (State Waste Discharge Permit Program) or WAC 173-218 (Washington Underground Injection Control Program) where applicable. Hanford Site liquid effluent streams discharging to the soil column have been categorized in the Consent Order as follows: Phase I Streams, Phase II Streams, and Miscellaneous Streams. Phase I and Phase II Streams are addressed in two RL reports: "Plan and Schedule to Discontinue Disposal of Contaminated Liquids into the Soil Column at the Hanford Site" (DOE-RL 1987), and "Annual Status of the Report of the Plan and Schedule to Discontinue Disposal of Contaminated Liquids into the Soil Column at the Hanford Site". Miscellaneous Streams are those liquid effluent streams discharged to the ground that are not categorized as Phase I or Phase II Streams. Miscellaneous Streams discharging to the soil column at the Hanford Site are subject to the requirements of several milestones identified in the Consent Order. This document provides a plan and schedule for the disposition of Miscellaneous Streams. The disposition process for the Miscellaneous Streams is facilitated using a decision tree format. The decision tree and corresponding analysis for determining appropriate disposition of these streams is presented in this document

  4. Short term economic emission power scheduling of hydrothermal energy systems using improved water cycle algorithm

    International Nuclear Information System (INIS)

    Haroon, S.S.; Malik, T.N.

    2017-01-01

    Due to increasing environmental concerns, the demand for clean and green energy and concern about atmospheric pollution are growing. Hence, power utilities are forced to keep their emissions within the prescribed limits. Therefore, the minimization of fuel cost as well as exhaust gas emissions is becoming an important and challenging task in the short-term scheduling of hydro-thermal energy systems. This paper proposes a novel algorithm known as WCA-ER (Water Cycle Algorithm with Evaporation Rate) to investigate the short-term EEPSHES (Economic Emission Power Scheduling of Hydrothermal Energy Systems). WCA has its origins in the natural hydrologic cycle, i.e., the raining process forms streams, these streams flow towards rivers, and the rivers finally flow towards the sea. The worth of WCA-ER has been tested on the standard economic emission power scheduling of hydrothermal energy test system consisting of four hydropower and three thermal plants. The problem has been investigated for three case studies: (i) ECS (Economic Cost Scheduling), (ii) ES (Economic Emission Scheduling), and (iii) ECES (Economic Cost and Emission Scheduling). The results obtained show that WCA-ER is superior to many other methods in the literature in bringing lower fuel cost and emissions. (author)

  5. Adaptive Dynamic Process Scheduling on Distributed Memory Parallel Computers

    Directory of Open Access Journals (Sweden)

    Wei Shu

    1994-01-01

    Full Text Available One of the challenges in programming distributed memory parallel machines is deciding how to allocate work to processors. This problem is particularly important for computations with unpredictable dynamic behaviors or irregular structures. We present a scheme for dynamic scheduling of medium-grained processes that is useful in this context. The adaptive contracting within neighborhood (ACWN is a dynamic, distributed, load-dependent, and scalable scheme. It deals with dynamic and unpredictable creation of processes and adapts to different systems. The scheme is described and contrasted with two other schemes that have been proposed in this context, namely the randomized allocation and the gradient model. The performance of the three schemes on an Intel iPSC/2 hypercube is presented and analyzed. The experimental results show that even though the ACWN algorithm incurs somewhat larger overhead than the randomized allocation, it achieves better performance in most cases due to its adaptiveness. Its feature of quickly spreading the work helps it outperform the gradient model in performance and scalability.

  6. Scheduling theory, algorithms, and systems

    CERN Document Server

    Pinedo, Michael L

    2016-01-01

    This new edition of the well-established text Scheduling: Theory, Algorithms, and Systems provides an up-to-date coverage of important theoretical models in the scheduling literature as well as important scheduling problems that appear in the real world. The accompanying website includes supplementary material in the form of slide-shows from industry as well as movies that show actual implementations of scheduling systems. The main structure of the book, as per previous editions, consists of three parts. The first part focuses on deterministic scheduling and the related combinatorial problems. The second part covers probabilistic scheduling models; in this part it is assumed that processing times and other problem data are random and not known in advance. The third part deals with scheduling in practice; it covers heuristics that are popular with practitioners and discusses system design and implementation issues. All three parts of this new edition have been revamped, streamlined, and extended. The reference...

  7. Optimal production scheduling for energy efficiency improvement in biofuel feedstock preprocessing considering work-in-process particle separation

    International Nuclear Information System (INIS)

    Li, Lin; Sun, Zeyi; Yao, Xufeng; Wang, Donghai

    2016-01-01

    Biofuel is considered a promising alternative to traditional liquid transportation fuels. The large-scale substitution of biofuel can greatly enhance global energy security and mitigate greenhouse gas emissions. One major concern with the broad adoption of biofuel is the intensive energy consumption in biofuel manufacturing. This paper focuses on the energy efficiency improvement of biofuel feedstock preprocessing, a major process of cellulosic biofuel manufacturing. An improved scheme of the feedstock preprocessing considering work-in-process particle separation is introduced to reduce energy waste and improve energy efficiency. A scheduling model based on the improved scheme is also developed to identify an optimal production schedule that can minimize the energy consumption of the feedstock preprocessing under a production target constraint. A numerical case study is used to illustrate the effectiveness of the proposed method. The research outcome is expected to improve the energy efficiency and enhance the environmental sustainability of biomass feedstock preprocessing. - Highlights: • A novel method to schedule production in the biofuel feedstock preprocessing process. • A systems modeling approach is used. • Capable of optimizing preprocessing to reduce energy waste and improve energy efficiency. • A numerical case is used to illustrate the effectiveness of the method. • Energy consumption per unit production can be significantly reduced.

  8. Comparing Mixed & Integer Programming vs. Constraint Programming by solving Job-Shop Scheduling Problems

    Directory of Open Access Journals (Sweden)

    Renata Melo e Silva de Oliveira

    2015-03-01

    Full Text Available Scheduling is a key factor for operations management as well as for business success. From industrial job-shop scheduling problems (JSSP), many optimization challenges have emerged since the 1960s, when improvements such as bottleneck allocation, lead-time reduction, and reduced response time to requests have been continuously required. With this in perspective, this work aims to discuss three different optimization models for minimizing makespan. The three models were applied to 17 classical JSSP benchmark instances and produced different outputs. The first model relies on Mixed and Integer Programming (MIP) and optimized 60% of the studied problems. The other models were based on Constraint Programming (CP) and approached the problem in two different ways: (a) model CP-1 is a standard IBM algorithm whose restrictions have an interval structure, and it failed to solve 53% of the proposed instances; (b) model CP-2 approaches the problem with disjunctive constraints and optimized 88% of the instances. In this work, each model is individually analyzed and then compared considering: (i) optimization success performance, (ii) computational processing time, (iii) greatest resource utilization, and (iv) minimum work-in-process inventory. Results demonstrated that CP-2 presented the best results on criteria (i) and (ii), but MIP was superior on criteria (iii) and (iv); these findings are discussed in the final section of this work.
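
    To make the MIP side of the comparison concrete, the following is a hedged sketch of a tiny disjunctive (big-M) job-shop makespan model. The three-job instance is invented, and the open-source PuLP/CBC toolchain is assumed here in place of whatever solvers the authors used; the formulation is the textbook disjunctive model rather than necessarily the exact one in the paper.

```python
import pulp

# Hedged sketch: a tiny disjunctive (big-M) MIP for job-shop makespan.
# Instance and variable names are illustrative assumptions.

jobs = {                      # job -> ordered list of (machine, duration)
    "J1": [("M1", 3), ("M2", 2)],
    "J2": [("M2", 4), ("M1", 3)],
    "J3": [("M1", 2), ("M2", 3)],
}
M = sum(d for ops in jobs.values() for _, d in ops)    # big-M constant

prob = pulp.LpProblem("toy_jssp", pulp.LpMinimize)
start = {(j, k): pulp.LpVariable(f"s_{j}_{k}", lowBound=0)
         for j, ops in jobs.items() for k in range(len(ops))}
cmax = pulp.LpVariable("Cmax", lowBound=0)
prob += cmax                                           # objective: makespan

for j, ops in jobs.items():
    for k in range(len(ops) - 1):                      # in-job precedence
        prob += start[(j, k)] + ops[k][1] <= start[(j, k + 1)]
    prob += start[(j, len(ops) - 1)] + ops[-1][1] <= cmax

ops_on = {}                                            # machine -> operations
for j, ops in jobs.items():
    for k, (m, d) in enumerate(ops):
        ops_on.setdefault(m, []).append((j, k, d))

for m, op_list in ops_on.items():                      # disjunctive pairs
    for i in range(len(op_list)):
        for jdx in range(i + 1, len(op_list)):
            (j1, k1, d1), (j2, k2, d2) = op_list[i], op_list[jdx]
            y = pulp.LpVariable(f"y_{j1}{k1}_{j2}{k2}", cat="Binary")
            prob += start[(j1, k1)] + d1 <= start[(j2, k2)] + M * (1 - y)
            prob += start[(j2, k2)] + d2 <= start[(j1, k1)] + M * y

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("makespan =", pulp.value(cmax))
```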

  9. PEP-II injection timing and controls

    International Nuclear Information System (INIS)

    Bharadwaj, V.; Browne, M.; Crane, M.; Gromme, T.; Himel, T.; Ross, M.; Stanek, M.; Ronan, M.

    1997-07-01

    Hardware has been built and software written and incorporated in the existing SLC accelerator control system to control injection of beam pulses from the accelerator into the PEP-II storage rings currently under construction. Hardware includes a CAMAC module to delay the machine timing fiducial in order that a beam pulse extracted from a damping ring will be injected into a selected group of four 476 MHz buckets in a PEP-II ring. Further timing control is accomplished by shifting the phase of the bunches stored in the damping rings before extraction while leaving the phase of the PEP-II stored beam unchanged. The software which drives timing devices on a pulse-to-pulse basis relies on a dedicated communication link on which one scheduling microprocessor broadcasts a 128-bit message to all distributed control microprocessors at 360 Hz. PEP-II injection will be driven by the scheduling microprocessor according to lists specifying bucket numbers in arbitrary order, and according to scheduling constraints maximizing the useful beam delivered to the SLC collider currently in operation. These lists will be generated by a microprocessor monitoring the current stored per bucket in each of the PEP-II rings

  10. An Evaluation of Parallel Job Scheduling for ASCI Blue-Pacific

    International Nuclear Information System (INIS)

    Franke, H.; Jann, J.; Moreira, J.; Pattnaik, P.; Jette, M.

    1999-01-01

    In this paper we analyze the behavior of a gang-scheduling strategy that we are developing for the ASCI Blue-Pacific machines. Using actual job logs for one of the ASCI machines, we generate a statistical model of the current workload with hyper-Erlang distributions. We then vary the parameters of those distributions to generate various workloads, representative of different operating points of the machine. Through simulation we obtain performance parameters for three different scheduling strategies: (i) first-come first-served, (ii) gang-scheduling, and (iii) backfilling. Our results show that backfilling can be very effective for the common operating points in the 60-70% utilization range. However, for higher utilization rates, time-sharing techniques such as gang-scheduling offer much better performance.
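
    The backfilling strategy referred to above can be illustrated with a hedged, simplified version of the EASY-backfilling admission test: a waiting job may start out of order only if enough processors are idle now and it will not delay the reservation computed for the job at the head of the queue. The data layout and the single-reservation policy are simplifying assumptions, not the simulator used in the paper.

```python
# Hedged sketch of a conservative EASY-backfilling admission test.
# Field names and the single-reservation policy are illustrative assumptions.

def head_reservation(running, total_procs, head_procs):
    """Earliest time the head-of-queue job is guaranteed enough processors."""
    free = total_procs - sum(p for p, _ in running)
    if free >= head_procs:
        return 0.0
    for end_time, freed in sorted((end, p) for p, end in running):
        free += freed
        if free >= head_procs:
            return end_time
    return float("inf")                    # head job can never fit (bad input)

def can_backfill(candidate, running, total_procs, head_procs, now=0.0):
    procs, runtime = candidate
    free = total_procs - sum(p for p, _ in running)
    if procs > free:
        return False                       # not enough idle processors now
    return now + runtime <= head_reservation(running, total_procs, head_procs)

running = [(48, 10.0), (32, 25.0)]         # (processors, finish time) of running jobs
total_procs, head_procs = 128, 96          # head-of-queue job needs 96 processors
print(can_backfill((16, 8.0), running, total_procs, head_procs))   # True
print(can_backfill((16, 30.0), running, total_procs, head_procs))  # False
```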

  11. A comparison of mixed-integer linear programming models for workforce scheduling with position-dependent processing times

    Science.gov (United States)

    Moreno-Camacho, Carlos A.; Montoya-Torres, Jairo R.; Vélez-Gallego, Mario C.

    2018-06-01

    Only a few studies in the available scientific literature address the problem of having a group of workers that do not share identical levels of productivity during the planning horizon. This study considers a workforce scheduling problem in which the actual processing time is a function of the scheduling sequence to represent the decline in workers' performance, evaluating two classical performance measures separately: makespan and maximum tardiness. Several mathematical models are compared with each other to highlight the advantages of each approach. The mathematical models are tested with randomly generated instances available from a public e-library.
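
    Since the abstract does not state the exact position-dependent function used, the sketch below assumes a common deterioration form, actual time = base time × position^a, and evaluates a sequence for one worker against the two measures studied above (makespan and maximum tardiness). The instance data are illustrative.

```python
# Hedged sketch of a position-dependent processing-time model (a common
# deterioration form p_j * r**a for position r); illustrative only.

def evaluate_sequence(sequence, base_time, due_date, a=0.15):
    """Return (makespan, maximum tardiness) for one worker processing
    `sequence`, where the job in position r takes base_time[j] * r**a."""
    t, max_tardiness = 0.0, 0.0
    for position, job in enumerate(sequence, start=1):
        t += base_time[job] * position ** a          # fatigue slows the worker
        max_tardiness = max(max_tardiness, t - due_date[job])
    return t, max_tardiness

base_time = {"A": 4.0, "B": 6.0, "C": 3.0}
due_date  = {"A": 5.0, "B": 11.0, "C": 16.0}
for seq in (["A", "B", "C"], ["C", "B", "A"]):
    print(seq, evaluate_sequence(seq, base_time, due_date))
```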

  12. An improved scheduling algorithm for linear networks

    KAUST Repository

    Bader, Ahmed; Alouini, Mohamed-Slim; Ayadi, Yassin

    2017-01-01

    In accordance with the present disclosure, embodiments of an exemplary scheduling controller module or device implement an improved scheduling process such that the targeted reduction in schedule length can be achieved while incurring a minimal energy penalty by allowing for a large rate (or duration) selection alphabet.

  13. An improved scheduling algorithm for linear networks

    KAUST Repository

    Bader, Ahmed

    2017-02-09

    In accordance with the present disclosure, embodiments of an exemplary scheduling controller module or device implement an improved scheduling process such that the targeted reduction in schedule length can be achieved while incurring a minimal energy penalty by allowing for a large rate (or duration) selection alphabet.

  14. Efficient Load Scheduling Method For Power Management

    Directory of Open Access Journals (Sweden)

    Vijo M Joy

    2015-08-01

    Full Text Available An efficient load scheduling method to meet varying power supply needs is presented in this paper. At peak load times, the power generation system fails due to its instability. Traditionally, a load shedding process is used, in which unnecessary and extra loads are disconnected. The proposed method overcomes this problem by scheduling the load based on the requirement. Artificial neural networks are used for this optimal load scheduling process. An artificial neural network is used to generate the economic schedule because power generation from each source has a different cost. The total load required is the input of the network, and the power generation from each source and the power losses during transmission are the outputs of the neural network. Training and programming of the artificial neural networks are done using MATLAB.

  15. Value of flexible resources, virtual bidding, and self-scheduling in two-settlement electricity markets with wind generation - Part II: ISO Models and Application

    DEFF Research Database (Denmark)

    Kazempour, Jalal; Hobbs, Benjamin F.

    2017-01-01

    In Part II of this paper, we present formulations for three two-settlement market models: baseline cost-minimization (Stoch-Opt); and two sequential market models in which an independent system operator (ISO) runs real-time (RT) balancing markets after making day-ahead (DA) generating unit...... commitment decisions based upon deterministic wind forecasts, while virtual bidders arbitrage the two markets (Seq and SeqSS). The latter two models differ in terms of whether some slow-start generators can self-schedule in the DA market while anticipating probabilities of RT prices. Models in Seq and Seq......-SS build on components of the two-settlement equilibrium model (Stoch-MP) defined in Part I of this paper [1]. We then provide numerical results for all four models. A simple single-node case illustrates the economic impacts of flexibility, virtual bidding, and self-schedules, and is followed by a larger...

  16. Self-scheduling with Microsoft Excel.

    Science.gov (United States)

    Irvin, S A; Brown, H N

    1999-01-01

    Excessive time was being spent by the emergency department (ED) staff, head nurse, and unit secretary on a complex 6-week manual self-scheduling system. This issue, plus inevitable errors and staff dissatisfaction, resulted in a manager-led initiative to automate elements of the scheduling process using Microsoft Excel. The implementation of this initiative included: common coding of all 8-hour and 12-hour shifts, with each 4-hour period represented by a cell; the creation of a 6-week master schedule using the "count-if" function of Excel based on current staffing guidelines; entry of staff time-off requests by the department secretary; and fine-tuning of the schedule by the head nurse, with staff input, to provide even unit coverage. Outcomes of these changes included an increase in staff satisfaction, time saved by the head nurse, and staff work time saved because there was less arguing about the schedule. Ultimately, the automated self-scheduling method was expanded to the entire 700-bed hospital.

  17. Synthesis of zero effluent multipurpose batch processes using effective scheduling

    CSIR Research Space (South Africa)

    Gouws, JF

    2008-06-01

    Full Text Available as follows. Given, i) required production over a given time horizon, ii) product recipe and production times, iii) maximum number of processing vessels and storage vessels, and iv) maximum and minimum capacity of processing vessels and storage vessels... the cleaning operation, due to the three different products mixed. Each type of wastewater has the possibility of being stored in a distinct storage vessel. The minimum and maximum capacity of each storage vessel is 500kg and 1500kg, respectively...

  18. Phase II study of a 3-day schedule with topotecan and cisplatin in patients with previously untreated small cell lung cancer and extensive disease

    DEFF Research Database (Denmark)

    Sorensen, M.; Lassen, Ulrik Niels; Jensen, Peter Buhl

    2008-01-01

    INTRODUCTION: Treatment with a topoisomerase I inhibitor in combination with a platinum results in superior or equal survival compared with etoposide-based treatment in extensive disease small cell lung cancer (SCLC). Five-day topotecan is inconvenient and therefore shorter schedules of topotecan...... and cisplatin are needed. The aim of this phase II study was to establish the response rate and response duration in chemo-naive patients with SCLC receiving a 3-day topotecan and cisplatin schedule. METHODS: Simon's optimal two-stage design was used. Patients with previously untreated extensive disease SCLC...... age was 59 (range 44-74), 79% had performance status 0 or 1. Thirty-one patients completed all six cycles. Grade 3/4 anemia, neutrocytopenia, and thrombocytopenia were recorded in 9.5%, 66.7%, and 21.4% of patients, respectively. Fourteen percent of patients experienced neutropenic fever. No episodes

  19. D-Zero run II data management and access

    International Nuclear Information System (INIS)

    Lueking, L.

    1997-03-01

    During the Run II data taking period at Fermilab, scheduled to begin in 1999, D0 plans to accumulate at least 200 TB of raw and reconstructed data per year. Data access patterns observed in the Run I experience have been examined in an attempt to establish an efficient data access environment. The needs and models for storing and processing the upcoming data are discussed

  20. A hybrid job-shop scheduling system

    Science.gov (United States)

    Hellingrath, Bernd; Robbach, Peter; Bayat-Sarmadi, Fahid; Marx, Andreas

    1992-01-01

    The intention of the scheduling system developed at the Fraunhofer-Institute for Material Flow and Logistics is the support of a scheduler working in a job-shop. Due to the existing requirements for a job-shop scheduling system the usage of flexible knowledge representation and processing techniques is necessary. Within this system the attempt was made to combine the advantages of symbolic AI-techniques with those of neural networks.

  1. Maximum Lateness Scheduling on Two-Person Cooperative Games with Variable Processing Times and Common Due Date

    OpenAIRE

    Liu, Peng; Wang, Xiaoli

    2017-01-01

    A new maximum lateness scheduling model in which both cooperative games and variable processing times exist simultaneously is considered in this paper. The job variable processing time is described by an increasing or a decreasing function dependent on the position of a job in the sequence. Two persons have to cooperate in order to process a set of jobs. Each of them has a single machine and their processing cost is defined as the minimum value of maximum lateness. All jobs have a common due ...

  2. Multi-Satellite Observation Scheduling for Large Area Disaster Emergency Response

    Science.gov (United States)

    Niu, X. N.; Tang, H.; Wu, L. X.

    2018-04-01

    an optimal imaging plan, plays a key role in coordinating multiple satellites to monitor the disaster area. In this paper, to generate an imaging plan dynamically as disaster relief proceeds, we propose a dynamic satellite task scheduling method for large-area disaster response. First, an initial robust scheduling scheme is generated by a robust satellite scheduling model in which both the profit and the robustness of the schedule are simultaneously maximized. Then, we use a multi-objective optimization model to obtain a series of decomposition schemes. Based on the initial imaging plan, we propose a mixed optimization algorithm named HA_NSGA-II to allocate the decomposition results and thus obtain an adjusted imaging schedule. A real disaster scenario, i.e., the 2008 Wenchuan earthquake, is revisited in terms of rapid response using satellite resources and is used to evaluate the performance of the proposed method against state-of-the-art approaches. We conclude that our satellite scheduling model can optimize the usage of satellite resources so as to obtain images for disaster response in a more timely and efficient manner.

  3. Sustainable Scheduling of Cloth Production Processes by Multi-Objective Genetic Algorithm with Tabu-Enhanced Local Search

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2017-09-01

    Full Text Available The dyeing of textile materials is the most critical process in cloth production because of the strict technological requirements. In addition to the technical aspect, there have been increasing concerns over how to minimize the negative environmental impact of the dyeing industry. The emissions of pollutants are mainly caused by frequent cleaning operations which are necessary for initializing the dyeing equipment, as well as idled production capacity which leads to discharge of unconsumed chemicals. Motivated by these facts, we propose a methodology to reduce the pollutant emissions by means of systematic production scheduling. Firstly, we build a three-objective scheduling model that incorporates both the traditional tardiness objective and the environmentally-related objectives. A mixed-integer programming formulation is also provided to accurately define the problem. Then, we present a novel solution method for the sustainable scheduling problem, namely, a multi-objective genetic algorithm with tabu-enhanced iterated greedy local search strategy (MOGA-TIG. Finally, we conduct extensive computational experiments to investigate the actual performance of the MOGA-TIG. Based on a fair comparison with two state-of-the-art multi-objective optimizers, it is concluded that the MOGA-TIG is able to achieve satisfactory solution quality within tight computational time budget for the studied scheduling problem.

  4. The applicability of knowledge-based scheduling to the utilities industry

    International Nuclear Information System (INIS)

    Yoshimoto, G.; Gargan, R. Jr.; Duggan, P.

    1992-01-01

    The Electric Power Research Institute (EPRI), Nuclear Power Division, has identified the three major goals of high technology applications for nuclear power plants. These goals are to enhance power production through increasing power generation efficiency, to increase productivity of the operations, and to reduce the threats to the safety of the plant. Our project responds to the second goal by demonstrating that significant productivity increases can be achieved for outage maintenance operations based on existing knowledge-based scheduling technology. Its use can also mitigate potential safety threats by integrating risk assessment features into the scheduler. The scheduling approach uses advanced techniques enabling the automation of the routine scheduling decision process that previously was handled by people. The process of removing conflicts in scheduling is automated. This is achieved by providing activity representations that allow schedulers to express a variety of different scheduling constraints and by implementing scheduling mechanisms that simulate the kinds of processes that humans use to find better solutions from a large number of possible solutions. This approach allows schedulers to express detailed constraints between activities and other activities, resources (material and personnel), and requirements that certain states exist for their execution. Our scheduler has already demonstrated its benefit in improving the shuttle processing flow management at Kennedy Space Center. Knowledge-based scheduling techniques should be examined by utilities industry researchers, developers, operators, and management for application to utilities planning problems because of their great cost-benefit potential. 4 refs., 4 figs

  5. A meta-heuristic method for solving scheduling problem: crow search algorithm

    Science.gov (United States)

    Adhi, Antono; Santosa, Budi; Siswanto, Nurhadi

    2018-04-01

    Scheduling is one of the most important processes in industry, both in manufacturing and in services. Scheduling is the process of selecting resources to perform operations on tasks. Resources can be machines, people, tasks, jobs, or operations. The selection of an optimum sequence of jobs from a permutation is an essential issue in scheduling research, and the optimum sequence becomes the optimum solution of the scheduling problem. The scheduling problem is NP-hard, since the number of jobs in the sequence grows beyond what exact algorithms can process. In order to obtain optimum results, a method capable of solving complex scheduling problems in an acceptable time is needed. Meta-heuristics are methods usually used to solve scheduling problems. The recently published method called the Crow Search Algorithm (CSA) is adopted in this research to solve the scheduling problem. CSA is an evolutionary meta-heuristic method based on the behavior of crows in flocks. The calculation results of CSA for solving the scheduling problem are compared with other algorithms. From the comparison, it is found that CSA has better performance in terms of solution quality and calculation time than the other algorithms.
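
    A hedged sketch of the standard CSA update rule (flight length fl, awareness probability AP) is given below on a continuous test function. For a scheduling problem, the continuous positions would additionally be decoded into job permutations (for example via random keys), which is omitted here for brevity; parameter values and the objective are illustrative assumptions.

```python
import random

# Hedged sketch of the Crow Search Algorithm (CSA) with its standard update
# rule. Shown on a continuous test function; decoding to permutations omitted.

def sphere(x):                               # stand-in objective to minimize
    return sum(v * v for v in x)

def csa(objective, dim=5, n_crows=20, iters=200, fl=2.0, ap=0.1,
        lo=-5.0, hi=5.0):
    def rnd():
        return [random.uniform(lo, hi) for _ in range(dim)]
    pos = [rnd() for _ in range(n_crows)]
    mem = [p[:] for p in pos]                # each crow remembers its best spot
    mem_fit = [objective(m) for m in mem]

    for _ in range(iters):
        for i in range(n_crows):
            j = random.randrange(n_crows)    # crow i tries to follow crow j
            if random.random() >= ap:        # crow j unaware: move toward its memory
                new = [pos[i][k] + random.random() * fl * (mem[j][k] - pos[i][k])
                       for k in range(dim)]
            else:                            # crow j aware: random relocation
                new = rnd()
            new = [min(hi, max(lo, v)) for v in new]    # keep inside bounds
            pos[i] = new
            fit = objective(new)
            if fit < mem_fit[i]:             # update memory if improved
                mem[i], mem_fit[i] = new[:], fit
    best = min(range(n_crows), key=lambda i: mem_fit[i])
    return mem[best], mem_fit[best]

print(csa(sphere))
```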

  6. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Science.gov (United States)

    2010-04-01

    ... schedules of rates for emergency energy, spinning reserve or economy energy or in cases of coordination and...? (ii) A summary statement of all cost (whether fully distributed, incremental or other) computations...

  7. Minimizing tardiness for job shop scheduling under uncertainties

    OpenAIRE

    Yahouni , Zakaria; Mebarki , Nasser; Sari , Zaki

    2016-01-01

    International audience; Many disturbances can occur during the execution of a manufacturing scheduling process. To cope with this drawback, flexible solutions are proposed based on the offline and online phases of the schedule. Groups of permutable operations is one of the most studied flexible scheduling methods, bringing flexibility as well as quality to a schedule. The online phase of this method is based on a human-machine system allowing one schedule to be chosen in real time from a set...

  8. NASA Schedule Management Handbook

    Science.gov (United States)

    2011-01-01

    The purpose of schedule management is to provide the framework for time-phasing, resource planning, coordination, and communicating the necessary tasks within a work effort. The intent is to improve schedule management by providing recommended concepts, processes, and techniques used within the Agency and private industry. The intended function of this handbook is two-fold: first, to provide guidance for meeting the scheduling requirements contained in NPR 7120.5, NASA Space Flight Program and Project Management Requirements, NPR 7120.7, NASA Information Technology and Institutional Infrastructure Program and Project Requirements, NPR 7120.8, NASA Research and Technology Program and Project Management Requirements, and NPD 1000.5, Policy for NASA Acquisition. The second function is to describe the schedule management approach and the recommended best practices for carrying out this project control function. With regards to the above project management requirements documents, it should be noted that those space flight projects previously established and approved under the guidance of prior versions of NPR 7120.5 will continue to comply with those requirements until project completion has been achieved. This handbook will be updated as needed, to enhance efficient and effective schedule management across the Agency. It is acknowledged that most, if not all, external organizations participating in NASA programs/projects will have their own internal schedule management documents. Issues that arise from conflicting schedule guidance will be resolved on a case by case basis as contracts and partnering relationships are established. It is also acknowledged and understood that all projects are not the same and may require different levels of schedule visibility, scrutiny and control. Project type, value, and complexity are factors that typically dictate which schedule management practices should be employed.

  9. Conception of Self-Construction Production Scheduling System

    Science.gov (United States)

    Xue, Hai; Zhang, Xuerui; Shimizu, Yasuhiro; Fujimura, Shigeru

    With the high-speed innovation of information technology, many production scheduling systems have been developed. However, a lot of customization according to the individual production environment is required, and a large investment for development and maintenance is therefore indispensable. The direction taken to construct scheduling systems should therefore be changed. The final objective of this research is to develop a system that extracts the scheduling technique automatically from daily production scheduling work, so that the investment is reduced. This extraction mechanism should be applicable to various production processes for interoperability. Using the master information extracted by the system, production scheduling operators can be supported to accelerate the production scheduling work easily and accurately without any restriction on scheduling operations. By installing this extraction mechanism, it is easy to introduce a scheduling system without a great expense for customization. In this paper, a model for expressing a scheduling problem is first proposed. Then the guideline for extracting the scheduling information and using the extracted information is shown, and some applied functions are also proposed based on it.

  10. NASA Instrument Cost/Schedule Model

    Science.gov (United States)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model. NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves and a demonstration of the NICM tool suite.

  11. Topology-based hierarchical scheduling using deficit round robin

    DEFF Research Database (Denmark)

    Yu, Hao; Yan, Ying; Berger, Michael Stubert

    2009-01-01

    according to the topology. The mapping process could be completed through the network management plane or by manual configuration. Based on the knowledge of the network, the scheduler can manage the traffic on behalf of other less advanced nodes, avoid potential traffic congestion, and provide flow protection and isolation. Comparisons between hierarchical scheduling, flow-based scheduling, and class-based scheduling schemes have been carried out under a symmetric tree topology. Results have shown that the hierarchical scheduling scheme provides better flow protection and isolation from attack...
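
    The building block that the hierarchical scheme extends is ordinary Deficit Round Robin. A hedged sketch of its dequeue loop is shown below: each flow receives a quantum of credit per round and may transmit packets as long as its accumulated deficit covers the packet size. Flow names, packet sizes, and the quantum are illustrative.

```python
from collections import deque

# Hedged sketch of the basic Deficit Round Robin (DRR) dequeue loop.

def drr(flows, quantum, rounds=10):
    """flows: dict name -> deque of packet sizes. Returns transmission order."""
    deficit = {name: 0 for name in flows}
    sent = []
    for _ in range(rounds):
        for name, queue in flows.items():
            if not queue:
                deficit[name] = 0            # empty flows do not hoard credit
                continue
            deficit[name] += quantum
            while queue and queue[0] <= deficit[name]:
                pkt = queue.popleft()
                deficit[name] -= pkt
                sent.append((name, pkt))
    return sent

flows = {
    "voice": deque([200, 200, 200, 200]),
    "video": deque([1200, 1200, 1200]),
    "bulk":  deque([1500, 1500, 1500, 1500]),
}
for name, pkt in drr(flows, quantum=600):
    print(name, pkt)
```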

  12. IFR fuel cycle demonstration in the EBR-II Fuel Cycle Facility

    International Nuclear Information System (INIS)

    Lineberry, M.J.; Phipps, R.D.; Rigg, R.H.; Benedict, R.W.; Carnes, M.D.; Herceg, J.E.; Holtz, R.E.

    1991-01-01

    The next major milestone of the IFR (Integral Fast Reactor) program is engineering-scale demonstration of the pyroprocess fuel cycle. The EBR-II Fuel Cycle Facility has just entered a startup phase which includes completion of facility modifications, and installation and cold checkout of process equipment. This paper reviews the design and construction of the facility, the design and fabrication of the process equipment, and the schedule and initial plan for its operation. (author)

  13. IFR fuel cycle demonstration in the EBR-II Fuel Cycle Facility

    International Nuclear Information System (INIS)

    Lineberry, M.J.; Phipps, R.D.; Rigg, R.H.; Benedict, R.W.; Carnes, M.D.; Herceg, J.E.; Holtz, R.E.

    1991-01-01

    The next major milestone of the IFR program is engineering-scale demonstration of the pyroprocess fuel cycle. The EBR-II Fuel Cycle Facility has just entered a startup phase which includes completion of facility modifications, and installation and cold checkout of process equipment. This paper reviews the design and construction of the facility, the design and fabrication of the process equipment, and the schedule and initial plan for its operation. 5 refs., 4 figs

  14. Effects of practice schedule and task specificity on the adaptive process of motor learning.

    Science.gov (United States)

    Barros, João Augusto de Camargo; Tani, Go; Corrêa, Umberto Cesar

    2017-10-01

    This study investigated the effects of practice schedule and task specificity based on the perspective of adaptive process of motor learning. For this purpose, tasks with temporal and force control learning requirements were manipulated in experiments 1 and 2, respectively. Specifically, the task consisted of touching with the dominant hand the three sequential targets with specific movement time or force for each touch. Participants were children (N=120), both boys and girls, with an average age of 11.2years (SD=1.0). The design in both experiments involved four practice groups (constant, random, constant-random, and random-constant) and two phases (stabilisation and adaptation). The dependent variables included measures related to the task goal (accuracy and variability of error of the overall movement and force patterns) and movement pattern (macro- and microstructures). Results revealed a similar error of the overall patterns for all groups in both experiments and that they adapted themselves differently in terms of the macro- and microstructures of movement patterns. The study concludes that the effects of practice schedules on the adaptive process of motor learning were both general and specific to the task. That is, they were general to the task goal performance and specific regarding the movement pattern. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Technology Estimating 2: A Process to Determine the Cost and Schedule of Space Technology Research and Development

    Science.gov (United States)

    Cole, Stuart K.; Wallace, Jon; Schaffer, Mark; May, M. Scott; Greenberg, Marc W.

    2014-01-01

    As a leader in space technology research and development, NASA is continuing the development of the Technology Estimating process, initiated in 2012, for estimating the cost and schedule of low maturity technology research and development, where the Technology Readiness Level is less than TRL 6. NASA's Technology Roadmap consists of 14 technology areas. The focus of this continuing Technology Estimating effort included four Technology Areas (TA): TA3 Space Power and Energy Storage, TA4 Robotics, TA8 Instruments, and TA12 Materials, to confine the research to the most abundant data pool. This research report continues the development of technology estimating efforts completed during 2013-2014, and addresses the refinement of parameters selected and recommended for use in the estimating process, where the parameters developed are applicable to Cost Estimating Relationships (CERs) used in the parametric cost estimating analysis. This research addresses the architecture for administration of the Technology Cost and Scheduling Estimating tool, the parameters suggested for computer software adjunct to any technology area, and the identification of gaps in the Technology Estimating process.

  16. A Constraint Logic Programming Framework for the Synthesis of Fault-Tolerant Schedules for Distributed Embedded Systems

    DEFF Research Database (Denmark)

    Poulsen, Kåre Harbo; Pop, Paul; Izosimov, Viacheslav

    2007-01-01

    -execution for recovering from multiple transient faults. We propose three scheduling approaches, which each present a trade-off between schedule simplicity and performance, (i) full transparency, (ii) slack sharing and (iii) conditional, and provide various degrees of transparency. We have developed a CLP framework...

  17. An Efficient Randomized Algorithm for Real-Time Process Scheduling in PicOS Operating System

    Science.gov (United States)

    Helmy*, Tarek; Fatai, Anifowose; Sallam, El-Sayed

    PicOS is an event-driven operating environment designed for use with embedded networked sensors. More specifically, it is designed to support the concurrency in intensive operations required by networked sensors with minimal hardware requirements. Existing process scheduling algorithms of PicOS, a commercial tiny, low-footprint, real-time operating system, have their associated drawbacks. An efficient alternative algorithm, based on a randomized selection policy, has been proposed, demonstrated, and confirmed to be efficient and fair on average, and has been recommended for implementation in PicOS. Simulations were carried out, and performance measures such as Average Waiting Time (AWT) and Average Turn-around Time (ATT) were used to assess the efficiency of the proposed randomized version over the existing ones. The results show that the randomized algorithm is the most attractive for implementation in PicOS, since it is the fairest and has the lowest AWT and ATT on average compared with the other non-preemptive scheduling algorithms implemented in this paper.
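
    A hedged sketch of the kind of comparison described above: computing Average Waiting Time (AWT) and Average Turn-around Time (ATT) for a non-preemptive FCFS order versus a randomized selection order. All jobs are assumed to arrive at time zero, which is a simplification of the PicOS setting; the burst times are randomly generated for illustration.

```python
import random

# Hedged sketch: AWT/ATT comparison of non-preemptive FCFS vs randomized
# selection, assuming all jobs arrive at time zero (a simplification).

def awt_att(order, burst):
    t, waits, turnarounds = 0.0, [], []
    for job in order:
        waits.append(t)                     # time spent waiting before start
        t += burst[job]
        turnarounds.append(t)               # completion time = turnaround here
    n = len(order)
    return sum(waits) / n, sum(turnarounds) / n

burst = {i: random.randint(1, 20) for i in range(8)}      # job -> CPU burst
fcfs_order = sorted(burst)                                 # arrival order 0..7
random_order = random.sample(list(burst), len(burst))      # randomized policy

print("FCFS       AWT/ATT:", awt_att(fcfs_order, burst))
print("Randomized AWT/ATT:", awt_att(random_order, burst))
```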

  18. Constraint-based job shop scheduling with ILOG SCHEDULER

    NARCIS (Netherlands)

    Nuijten, W.P.M.; Le Pape, C.

    1998-01-01

    We introduce constraint-based scheduling and discuss its main principles. An approximation algorithm based on tree search is developed for the job shop scheduling problem using ILOG SCHEDULER. A new way of calculating lower bounds on the makespan of the job shop scheduling problem is presented and

  19. Process optimization and mechanistic studies of lead (II): Aspergillus ...

    African Journals Online (AJOL)

    The lead (II) accumulation potential of various biosorbents has been widely studied in the last few years, but an outstanding Pb(II)-accumulating biomass still seems crucial for bringing the process to a successful application stage. This investigation describes the use of non-living biomass of Aspergillus caespitosus for ...

  20. Chemical Processing Department monthly report for April 1958

    Energy Technology Data Exchange (ETDEWEB)

    Warren, J.H.

    1958-05-21

    The separations plants operated on schedule, and Pu production exceeded commitment. UO₃ production and shipments were also ahead of schedule. Purex operation under pseudo two-cycle conditions (elimination of HS and 1A columns, co-decontamination cycle concentrator HCP) was successful. Final U stream was 3× lower in Pu than ever before; γ activity in recovered HNO₃ was also low. Four of 6 special E metal batches were processed through Redox and analyzed. Boric acid is removed from solvent extraction process via aq waste. The filter in Task II hydrofluorinator was changed from carbon to Poroloy. Various modifications to equipment were made.

  1. Surgical scheduling categorization system (SSCS): A novel classification system to improve coordination and scheduling of operative cases in a tertiary pediatric medical system.

    Science.gov (United States)

    Gantwerker, Eric A; Bannos, Cassandra; Cunningham, Michael J; Rahbar, Reza

    2017-01-01

    To describe a surgical categorization system to create a universal nomenclature, delineating patient complexity as a first step toward developing a true risk stratification system. Retrospective database review of all otolaryngology surgical procedures performed in a tertiary pediatric hospital system over one academic year (July 2012-June 2013). All otolaryngology surgical procedures were reviewed, encompassing 8478 procedures on 5711 patients. The attending otolaryngologist assigned the surgical scheduling category (SSCS) at the time of case booking based on institution-specific guidelines. The guidelines are as follows: Category I was assigned to American Society of Anesthesiologists physical status classification (ASA) I/II patients, designating them appropriate for the institution's suburban ambulatory surgery centers; Category II was ASA I/II patients with social or transportation issues; Category III was ASA I/II patients who required case coordination with other medical or surgical departments; Category IV was reserved for patients of any ASA class whom the surgeon designated to be of a higher complexity. A total of 8478 procedures were analyzed, with 7198 having complete records. Of these, 48% were Category I, 13.6% were Category II, 1.9% were Category III, and 36.5% were Category IV. The ASA distribution was 34.7% ASA I, 50% ASA II, 13.39% ASA III, and 1.9% ASA IV. Although the largest proportion of patients were ASA II (50%), 39.6% of all ASA II were Category IV. Category IV was split into 54.2% ASA II and 34% ASA III, which shows that perioperative surgical concerns were not encompassed by the ASA system. This surgical categorization system streamlines surgical scheduling in a tertiary pediatric hospital system, particularly with respect to the designation of cases as ambulatory surgery center or main operating room appropriate. The case mix complexity is also readily apparent, enhancing recognition of the coordination and attention required for the perioperative management of high complexity

  2. Online scheduling of 2-re-entrant flexible manufacturing systems

    NARCIS (Netherlands)

    Pinxten, J. van; Waqas, U.; Geilen, M.; Basten, T.; Somers, L.

    2017-01-01

    Online scheduling of operations is essential to optimize productivity of flexible manufacturing systems (FMSs) where manufacturing requests arrive on the fly. An FMS processes products according to a particular flow through processing stations. This work focusses on online scheduling of re-entrant

  3. Planning and scheduling - A schedule's performance

    International Nuclear Information System (INIS)

    Whitman, N.M.

    1993-01-01

    Planning and scheduling is a process whose time has come to PSI Energy. With an awareness of the challenges ahead, individuals must look for ways to enhance corporate competitiveness. Working toward this goal means that each individual has to dedicate themselves to this more competitive corporate environment. Being competitive may be defined as the ability of each employee to add value to the corporation's economic well-being. The timely and successful implementation of projects greatly enhances competitiveness. Those projects that do not do well often suffer from lack of proper execution - not for lack of talent or strategic vision. Projects are consumers of resources such as cash and people. They produce a return when completed and will generate a better return when properly completed utilizing proven project management techniques. Completing projects on time, within budget, and meeting customer expectations is the way a corporation builds its future. This paper offers suggestions on implementing planning and scheduling and provides a review of results in the form of management reports.

  4. Some extensions of the discrete lotsizing and scheduling problem

    NARCIS (Netherlands)

    M. Salomon (Marc); L.G. Kroon (Leo); R. Kuik (Roelof); L.N. van Wassenhove (Luk)

    1991-01-01

    textabstractIn this paper the Discrete Lotsizing and Scheduling Problem (DLSP) is considered. DLSP relates to capacitated lotsizing as well as to job scheduling problems and is concerned with determining a feasible production schedule with minimal total costs in a single-stage manufacturing process.

  5. CMS multicore scheduling strategy

    International Nuclear Information System (INIS)

    Yzquierdo, Antonio Pérez-Calero; Hernández, Jose; Holzman, Burt; Majewski, Krista; McCrea, Alison

    2014-01-01

    In the next years, processor architectures based on much larger numbers of cores will be most likely the model to continue 'Moore's Law' style throughput gains. This not only results in many more jobs in parallel running the LHC Run 1 era monolithic applications, but also the memory requirements of these processes push the workernode architectures to the limit. One solution is parallelizing the application itself, through forking and memory sharing or through threaded frameworks. CMS is following all of these approaches and has a comprehensive strategy to schedule multicore jobs on the GRID based on the glideinWMS submission infrastructure. The main component of the scheduling strategy, a pilot-based model with dynamic partitioning of resources that allows the transition to multicore or whole-node scheduling without disallowing the use of single-core jobs, is described. This contribution also presents the experiences made with the proposed multicore scheduling schema and gives an outlook of further developments working towards the restart of the LHC in 2015.

  6. Cloud Service Scheduling Algorithm Research and Optimization

    Directory of Open Access Journals (Sweden)

    Hongyan Cui

    2017-01-01

    Full Text Available We propose a cloud service scheduling model that is referred to as the Task Scheduling System (TSS). In the user module, the processing time of each task follows a general distribution. In the task scheduling module, we take a weighted sum of makespan and flowtime as the objective function and use an Ant Colony Optimization (ACO) and a Genetic Algorithm (GA) to solve the problem of cloud task scheduling. Simulation results show that the convergence speed and output performance of our Genetic Algorithm-Chaos Ant Colony Optimization (GA-CACO) are optimal.
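
    A hedged sketch of the weighted-sum objective mentioned above, evaluated for one candidate task-to-VM assignment; both the ACO and the GA would share such a fitness function. The weights, task lengths, and VM speeds are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of a weighted-sum fitness combining makespan and flowtime for
# a task-to-VM assignment. Instance data and weights are illustrative.

def fitness(assignment, task_len, vm_speed, w_makespan=0.7, w_flowtime=0.3):
    finish = {vm: 0.0 for vm in vm_speed}
    flowtime = 0.0
    for task, vm in assignment.items():
        finish[vm] += task_len[task] / vm_speed[vm]   # task queued on its VM
        flowtime += finish[vm]                        # completion time of task
    makespan = max(finish.values())
    return w_makespan * makespan + w_flowtime * flowtime

task_len = {"t1": 400, "t2": 250, "t3": 600, "t4": 150}   # million instructions
vm_speed = {"vm1": 100, "vm2": 50}                        # MIPS
candidate = {"t1": "vm1", "t2": "vm2", "t3": "vm1", "t4": "vm2"}
print(fitness(candidate, task_len, vm_speed))
```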

  7. Dynamic Appliances Scheduling in Collaborative MicroGrids System

    Science.gov (United States)

    Bilil, Hasnae; Aniba, Ghassane; Gharavi, Hamid

    2017-01-01

    In this paper a new approach which is based on a collaborative system of MicroGrids (MG’s), is proposed to enable household appliance scheduling. To achieve this, appliances are categorized into flexible and non-flexible Deferrable Loads (DL’s), according to their electrical components. We propose a dynamic scheduling algorithm where users can systematically manage the operation of their electric appliances. The main challenge is to develop a flattening function calculus (reshaping) for both flexible and non-flexible DL’s. In addition, implementation of the proposed algorithm would require dynamically analyzing two successive multi-objective optimization (MOO) problems. The first targets the activation schedule of non-flexible DL’s and the second deals with the power profiles of flexible DL’s. The MOO problems are resolved by using a fast and elitist multi-objective genetic algorithm (NSGA-II). Finally, in order to show the efficiency of the proposed approach, a case study of a collaborative system that consists of 40 MG’s registered in the load curve for the flattening program has been developed. The results verify that the load curve can indeed become very flat by applying the proposed scheduling approach. PMID:28824226

  8. Optimal Algorithms and a PTAS for Cost-Aware Scheduling

    NARCIS (Netherlands)

    L. Chen; N. Megow; R. Rischke; L. Stougie (Leen); J. Verschae

    2015-01-01

    We consider a natural generalization of classical scheduling problems in which using a time unit for processing a job causes some time-dependent cost which must be paid in addition to the standard scheduling cost. We study the scheduling objectives of minimizing the makespan and the

  9. Online Scheduling in Manufacturing A Cumulative Delay Approach

    CERN Document Server

    Suwa, Haruhiko

    2013-01-01

    Online scheduling is recognized as the crucial decision-making process of production control at the phase of "being in production", according to the released shop floor schedule. Online scheduling can also be considered one of the key enablers for realizing prompt capable-to-promise as well as available-to-promise to customers, along with reducing production lead times in today's globalized competitive markets. Online Scheduling in Manufacturing introduces new approaches to online scheduling based on the concept of cumulative delay. The cumulative delay is regarded as consolidated information about uncertainties in a dynamic manufacturing environment and can be collected constantly, without much effort, at any point in time during schedule execution. In this approach, the cumulative delay of the schedule plays the important role of a criterion for deciding whether or not a schedule revision is carried out. The cumulative delay approach to triggering schedule revisions has the following capabilities for the ...

  10. Developing optimal nurses work schedule using integer programming

    Science.gov (United States)

    Shahidin, Ainon Mardhiyah; Said, Mohd Syazwan Md; Said, Noor Hizwan Mohamad; Sazali, Noor Izatie Amaliena

    2017-08-01

    Time management is the art of arranging, organizing and scheduling one's time for the purpose of generating more effective work and productivity. Scheduling is the process of deciding how to commit resources between a variety of possible tasks. Thus, it is crucial for every organization to have a good work schedule for its staff. The job of ward nurses at hospitals runs for 24 hours every day; therefore, nurses work under shift scheduling. This study is aimed at solving the nurse scheduling problem at an emergency ward of a private hospital. A 7-day work schedule for 7 consecutive weeks satisfying all the constraints set by the hospital is developed using Integer Programming. The work schedule obtained for the nurses gives an optimal solution in which all the constraints are satisfied successfully.
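
    A heavily simplified sketch of a shift-assignment integer program is given below, using the open-source PuLP modeller as an assumed tool (the abstract does not state which solver was used), with placeholder coverage and one-shift-per-day constraints rather than the hospital's actual rules.

      # Simplified nurse shift-assignment IP (illustrative; PuLP is an assumed
      # dependency and the constraints are placeholders, not the real rules).
      from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum

      nurses = range(6)
      days   = range(7)
      shifts = ["morning", "evening", "night"]
      demand = {"morning": 2, "evening": 2, "night": 1}   # nurses needed per shift

      prob = LpProblem("nurse_schedule", LpMinimize)
      x = LpVariable.dicts("x", (nurses, days, shifts), cat=LpBinary)

      # Objective: minimize total assigned shifts (placeholder objective).
      prob += lpSum(x[n][d][s] for n in nurses for d in days for s in shifts)

      for d in days:
          for s in shifts:
              # cover the demand on every day and shift
              prob += lpSum(x[n][d][s] for n in nurses) >= demand[s]
          for n in nurses:
              # at most one shift per nurse per day
              prob += lpSum(x[n][d][s] for s in shifts) <= 1

      prob.solve()
      print("assignments per nurse:",
            [sum(x[n][d][s].value() for d in days for s in shifts) for n in nurses])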

  11. Technology for planning and scheduling under complex constraints

    Science.gov (United States)

    Alguire, Karen M.; Pedro Gomes, Carla O.

    1997-02-01

    Within the context of law enforcement, several problems fall into the category of planning and scheduling under constraints. Examples include resource and personnel scheduling, and court scheduling. In the case of court scheduling, a schedule must be generated considering available resources, e.g., court rooms and personnel. Additionally, there are constraints on individual court cases, e.g., temporal and spatial, and between different cases, e.g., precedence. Finally, there are overall objectives that the schedule should satisfy such as timely processing of cases and optimal use of court facilities. Manually generating a schedule that satisfies all of the constraints is a very time consuming task. As the number of court cases and constraints increases, this becomes increasingly harder to handle without the assistance of automatic scheduling techniques. This paper describes artificial intelligence (AI) technology that has been used to develop several high performance scheduling applications including a military transportation scheduler, a military in-theater airlift scheduler, and a nuclear power plant outage scheduler. We discuss possible law enforcement applications where we feel the same technology could provide long-term benefits to law enforcement agencies and their operations personnel.

  12. The power of reordering for online minimum makespan scheduling

    OpenAIRE

    Englert, Matthias; Özmen, Deniz; Westermann, Matthias

    2014-01-01

    In the classic minimum makespan scheduling problem, we are given an input sequence of jobs with processing times. A scheduling algorithm has to assign the jobs to m parallel machines. The objective is to minimize the makespan, which is the time it takes until all jobs are processed. In this paper, we consider online scheduling algorithms without preemption. However, we do not require that each arriving job has to be assigned immediately to one of the machines. A reordering buffer with limited...

  13. Interactive Dynamic Mission Scheduling for ASCA

    Science.gov (United States)

    Antunes, A.; Nagase, F.; Isobe, T.

    The Japanese X-ray astronomy satellite ASCA (Advanced Satellite for Cosmology and Astrophysics) mission requires scheduling for each 6-month observation phase, further broken down into weekly schedules at a few minutes resolution. Two tools, SPIKE and NEEDLE, written in Lisp and C, use artificial intelligence (AI) techniques combined with a graphic user interface for fast creation and alteration of mission schedules. These programs consider viewing and satellite attitude constraints as well as observer-requested criteria and present an optimized set of solutions for review by the planner. Six-month schedules at 1 day resolution are created for an oversubscribed set of targets by the SPIKE software, originally written for HST and presently being adapted for EUVE, XTE and AXAF. The NEEDLE code creates weekly schedules at 1 min resolution using in-house orbital routines and creates output for processing by the command generation software. Schedule creation on both the long- and short-term scale is rapid, less than 1 day for long-term, and one hour for short-term.

  14. Algorithms for classical and modern scheduling problems

    OpenAIRE

    Ott, Sebastian

    2016-01-01

    Subject of this thesis is the design and the analysis of algorithms for scheduling problems. In the first part, we focus on energy-efficient scheduling, where one seeks to minimize the energy needed for processing certain jobs via dynamic adjustments of the processing speed (speed scaling). We consider variations and extensions of the standard model introduced by Yao, Demers, and Shenker in 1995 [79], including the addition of a sleep state, the avoidance of preemption, and variable speed lim...

  15. Nonblocking Scheduling for Web Service Transactions

    DEFF Research Database (Denmark)

    Alrifai, Mohammad; Balke, Wolf-Tilo; Dolog, Peter

    2007-01-01

    For improved flexibility and concurrent usage, existing transaction management models for Web services relax the isolation property of Web service-based transactions. Correctness of the concurrent execution then has to be ensured by commit order-preserving transaction schedulers. However, local schedulers of service providers typically take into account neither time constraints for committing the whole transaction nor the individual services' constraints when scheduling decisions are made. This often leads to unnecessary blocking of transactions by (possibly long-running) others. In this paper, we propose a novel nonblocking scheduling mechanism that is used prior to the actual service invocations. Its aim is to reach an agreement between the client and all participating providers on what transaction processing times have to be expected, accepted, and guaranteed. This enables service...

  16. How useful are preemptive schedules?

    NARCIS (Netherlands)

    Brucker, P.; Heitmann, S.; Hurink, J.L.

    2001-01-01

    Machine scheduling admits two options to process jobs. In a preemptive mode processing may be interrupted and resumed later even on a different machine. In a nonpreemptive mode interruptions are not allowed. Usually, the possibility to preempt jobs leads to better performance values. However, also

  17. A status report on the PBFA II construction project

    International Nuclear Information System (INIS)

    Barr, G.W.; Furaus, J.P.; Cook, D.L.; Shirley, C.G.

    1985-01-01

    The Particle Beam Fusion Accelerator II (PBFA II) is under construction at Sandia National Laboratories (SNL). PBFA II contains 36 individual power modules configured in a stacked radial geometry and synchronized to provide greater than 3.5 MJ of energy into the vacuum section in a single 55-ns-wide 90-TW peak power pulse. This R and D construction project is being implemented in a fast track schedule mode in which final design of the accelerator components occurs in parallel with the construction of the laboratory building and the accelerator tank. PBFA II is scheduled to become operational in January 1986 with its first multi-module shot into an applied-B ion diode that will generate and transport a beam of lithium ions. Plans are now being made for experimental work on PBFA II beyond the construction phase

  18. An improved sheep flock heredity algorithm for job shop scheduling and flow shop scheduling problems

    Directory of Open Access Journals (Sweden)

    Chandramouli Anandaraman

    2011-10-01

    Full Text Available Job Shop Scheduling Problem (JSSP) and Flow Shop Scheduling Problem (FSSP) are strongly NP-complete combinatorial optimization problems among the class of typical production scheduling problems. An improved Sheep Flock Heredity Algorithm (ISFHA) is proposed in this paper to find a schedule of operations that can minimize makespan. In ISFHA, the pairwise mutation operation is replaced by a single-point mutation process with a probabilistic property which guarantees the feasibility of the solutions in the local search domain. A Robust-Replace (R-R) heuristic is introduced in place of chromosomal crossover to enhance the global search and to improve the convergence. The R-R heuristic is found to enhance the exploring potential of the algorithm and enrich the diversity of neighborhoods. Experimental results reveal the effectiveness of the proposed algorithm, whose optimization performance is markedly superior to that of genetic algorithms and is comparable to the best results reported in the literature.
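
    To make the encoding concrete, the sketch below decodes an operation-based chromosome (a permutation of job ids) into a semi-active job-shop schedule and reports its makespan on a toy two-job instance. The ISFHA mutation and R-R operators themselves are not reproduced, and the instance data are invented.

      # Decode an operation-based chromosome into a semi-active job-shop
      # schedule and report its makespan (toy instance, illustrative only).

      def makespan(jobs, chromosome):
          """jobs[j] = [(machine, proc_time), ...] in technological order.
          The k-th occurrence of job j in the chromosome means
          'schedule the k-th operation of job j next'."""
          next_op  = [0] * len(jobs)
          job_end  = [0.0] * len(jobs)
          mach_end = {}
          for j in chromosome:
              machine, p = jobs[j][next_op[j]]
              start = max(job_end[j], mach_end.get(machine, 0.0))
              job_end[j] = mach_end[machine] = start + p
              next_op[j] += 1
          return max(job_end)

      jobs = [[(0, 3), (1, 2)],      # job 0: machine 0 then machine 1
              [(1, 4), (0, 1)]]      # job 1: machine 1 then machine 0
      print(makespan(jobs, [0, 1, 0, 1]))   # -> 6
      print(makespan(jobs, [1, 1, 0, 0]))   # -> 10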

  19. A Review Of Fault Tolerant Scheduling In Multicore Systems

    Directory of Open Access Journals (Sweden)

    Shefali Malhotra

    2015-05-01

    Full Text Available In this paper we discuss various fault-tolerant task scheduling algorithms for multicore systems, based on hardware and software. The hardware-based algorithm is a blend of Triple Modular Redundancy and Double Modular Redundancy, in which the Architectural Vulnerability Factor is considered in the scheduling decision alongside the EDF and LLF scheduling algorithms. In most real-time systems the dominant part is shared memory. A low-overhead software-based fault tolerance approach can be implemented at user-space level so that it does not require any changes at the application level. Here redundant multi-threaded processes are used, with which soft errors can be detected and recovered from. This method gives low-overhead, fast error detection and recovery; the overhead incurred ranges from 0 to 18 for selected benchmarks. The hybrid scheduling method is another scheduling approach for real-time systems: dynamic fault-tolerant scheduling gives a high feasibility rate, whereas task criticality is used to select the type of fault recovery method in order to tolerate the maximum number of faults.

  20. Enhanced round robin CPU scheduling with burst time based time quantum

    Science.gov (United States)

    Indusree, J. R.; Prabadevi, B.

    2017-11-01

    Process scheduling is a very important functionality of an operating system. The best-known process-scheduling algorithms are the First Come First Serve (FCFS) algorithm, Round Robin (RR) algorithm, Priority scheduling algorithm and Shortest Job First (SJF) algorithm. Compared to its peers, the Round Robin (RR) algorithm has the advantage that it gives a fair share of the CPU to the processes that are already in the ready queue. The effectiveness of the RR algorithm greatly depends on the chosen time quantum value. Through this research paper, we propose an enhanced algorithm called the Enhanced Round Robin with Burst-time based Time Quantum (ERRBTQ) process scheduling algorithm, which calculates the time quantum from the burst times of the processes already in the ready queue. The experimental results and analysis of the ERRBTQ algorithm clearly indicate improved performance when compared with conventional RR and its variants.
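
    The abstract does not give the exact ERRBTQ quantum formula, so the sketch below simply recomputes the quantum each round as the mean of the remaining burst times, one plausible burst-time-based choice, and reports turnaround and waiting times for a toy workload.

      # Round-robin simulation with a burst-time-derived quantum (the exact
      # ERRBTQ formula is not given above; the mean of the remaining burst
      # times is used here as an illustrative assumption).
      from statistics import fmean

      def round_robin(bursts, quantum_fn=fmean):
          remaining = dict(enumerate(bursts))
          finish, clock = {}, 0.0
          while remaining:
              quantum = quantum_fn(remaining.values())
              for pid in list(remaining):
                  run = min(quantum, remaining[pid])
                  clock += run
                  remaining[pid] -= run
                  if remaining[pid] <= 0:
                      finish[pid] = clock
                      del remaining[pid]
          turnaround = [finish[p] for p in sorted(finish)]   # all arrive at t=0
          waiting = [t - b for t, b in zip(turnaround, bursts)]
          return turnaround, waiting

      tat, wt = round_robin([24, 3, 3])
      print("turnaround:", tat, "waiting:", wt)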

  1. Iterative Relay Scheduling with Hybrid ARQ under Multiple User Equipment (Type II) Relay Environments

    KAUST Repository

    Nam, Sung Sik

    2018-01-09

    In this work, we propose an iterative relay scheduling with hybrid ARQ (IRS-HARQ) scheme which realizes fast jump-in/successive relaying and subframe-based decoding under multiple user equipment (UE) relay environments applicable to next-generation cellular systems (e.g., LTE-Advanced and beyond). The proposed IRS-HARQ aims to increase the achievable data rate by iteratively scheduling a relatively better UE relay closer to the end user in a probabilistic sense, provided that the relay-to-end-user link is operated in an open-loop and transparent mode. The latter is due to the fact that not only are there no dedicated control channels between the UE relay and the end user, but also no new cell is created. Under this open-loop and transparent mode, our proposed protocol is implemented by partially exploiting the channel state information based on the overhearing mechanism of ACK/NACK for HARQ. Furthermore, the iterative scheduling enables UE-to-UE direct communication with proximity, which offers spatial frequency reuse and energy saving.

  2. VPipe: Virtual Pipelining for Scheduling of DAG Stream Query Plans

    Science.gov (United States)

    Wang, Song; Gupta, Chetan; Mehta, Abhay

    There are data streams all around us that can be harnessed for tremendous business and personal advantage. For an enterprise-level stream processing system such as CHAOS [1] (Continuous, Heterogeneous Analytic Over Streams), handling of complex query plans with resource constraints is challenging. While several scheduling strategies exist for stream processing, efficient scheduling of complex DAG query plans is still largely unsolved. In this paper, we propose a novel execution scheme for scheduling complex directed acyclic graph (DAG) query plans with meta-data enriched stream tuples. Our solution, called Virtual Pipelined Chain (or VPipe Chain for short), effectively extends the "Chain" pipelining scheduling approach to complex DAG query plans.

  3. Recovery scheduling for industrial processes using graph constraints

    NARCIS (Netherlands)

    Saltik, M.B.; van Gameren, S.; Özkan, L.; Weiland, S.

    2017-01-01

    This paper considers a class of scheduling problems cast for processes that consist of several interconnected subprocesses. We model the temporal constraints (On-Off status) on each subprocess using labeled directed graphs to form the admissible set of schedules. Furthermore, we consider physical

  4. Multiple-Machine Scheduling with Learning Effects and Cooperative Games

    Directory of Open Access Journals (Sweden)

    Yiyuan Zhou

    2015-01-01

    Full Text Available Multiple-machine scheduling problems with position-based learning effects are studied in this paper. There is an initial schedule in this scheduling problem. The optimal schedule minimizes the sum of the weighted completion times; the difference between the initial total weighted completion time and the minimal total weighted completion time is the cost savings. A multiple-machine sequencing game is introduced to allocate the cost savings. The game is balanced if the normal processing times of jobs that are on the same machine are equal and an equal number of jobs are scheduled on each machine initially.

  5. Compositional schedulability analysis of real-time actor-based systems.

    Science.gov (United States)

    Jaghoori, Mohammad Mahdi; de Boer, Frank; Longuet, Delphine; Chothia, Tom; Sirjani, Marjan

    2017-01-01

    We present an extension of the actor model with real-time, including deadlines associated with messages, and explicit application-level scheduling policies, e.g., "earliest deadline first", which can be associated with individual actors. Schedulability analysis in this setting amounts to checking whether, given a scheduling policy for each actor, every task is processed within its designated deadline. To check schedulability, we introduce a compositional automata-theoretic approach, based on maximal use of model checking combined with testing. Behavioral interfaces define what an actor expects from the environment, and the deadlines for messages given these assumptions. We use model checking to verify that actors match their behavioral interfaces. We extend timed automata refinement with the notion of deadlines and use it to define compatibility of actor environments with the behavioral interfaces. Model checking of compatibility is computationally hard, so we propose a special testing process. We show that the analyses are decidable and automate the process using the Uppaal model checker.

  6. Phase I/II Study Evaluating Early Tolerance in Breast Cancer Patients Undergoing Accelerated Partial Breast Irradiation Treated With the MammoSite Balloon Breast Brachytherapy Catheter Using a 2-Day Dose Schedule

    International Nuclear Information System (INIS)

    Wallace, Michelle; Martinez, Alvaro; Mitchell, Christina; Chen, Peter Y.; Ghilezan, Mihai; Benitez, Pamela; Brown, Eric; Vicini, Frank

    2010-01-01

    Purpose: Initial Phase I/II results using balloon brachytherapy to deliver accelerated partial breast irradiation (APBI) in 2 days in patients with early-stage breast cancer are presented. Materials and Methods: Between March 2004 and August 2007, 45 patients received adjuvant radiation therapy after lumpectomy with balloon brachytherapy in a Phase I/II trial delivering 2800 cGy in four fractions of 700 cGy. Toxicities were evaluated using the National Cancer Institute Common Toxicity Criteria for Adverse Events v3.0 scale and cosmesis was documented at ≥6 months. Results: The median age was 66 years (range, 48-83) and median skin spacing was 12 mm (range, 8-24). The median follow-up was 11.4 months (5.4-48 months) with 21 patients (47%) followed ≥1 year, 11 (24%) ≥2 years, and 7 (16%) ≥3 years. At <6 months (n = 45), Grade II toxicity rates were 9% radiation dermatitis, 13% breast pain, 2% edema, and 2% hyperpigmentation. Grade III breast pain was reported in 13% (n = 6). At ≥6 months (n = 43), Grade II toxicity rates were 2% radiation dermatitis, 2% induration, and 2% hypopigmentation. Grade III breast pain was reported in 2%. Infection occurred in 13% (n = 6) at <6 months and 5% (n = 2) at ≥6 months. Persistent seroma at ≥6 months occurred in 30% (n = 13). Fat necrosis developed in 4 cases (2 symptomatic). Rib fractures were seen in 4% (n = 2). Cosmesis was good/excellent in 96% of cases. Conclusions: Treatment with balloon brachytherapy using a 2-day dose schedule resulted in acceptable rates of Grade II/III chronic toxicity and cosmetic results similar to those observed with a standard 5-day accelerated partial breast irradiation schedule.

  7. Scheduling and Optimization of Fault-Tolerant Embedded Systems with Transparency/Performance Trade-Offs

    DEFF Research Database (Denmark)

    Izosimov, Viacheslav; Pop, Paul; Eles, Petru

    2012-01-01

    In this article, we propose a strategy for the synthesis of fault-tolerant schedules and for the mapping of fault-tolerant applications. Our techniques handle transparency/performance trade-offs and use the fault-occurrence information to reduce the overhead due to fault tolerance. Processes and messages are statically scheduled, and we use process re-execution for recovering from multiple transient faults. We propose a fine-grained transparent recovery, where the property of transparency can be selectively applied to processes and messages. Transparency hides the recovery actions in a selected part of the application so that they do not affect the schedule of other processes and messages. While leading to longer schedules, transparent recovery has the advantage of both improved debuggability and less memory needed to store the fault-tolerant schedules.

  8. In-situ Moessbauer spectroscopy with MIMOS II

    Energy Technology Data Exchange (ETDEWEB)

    Fleischer, Iris, E-mail: fleischi@uni-mainz.de; Klingelhoefer, Goestar [Institute of Inorganic and Analytical Chemistry, Johannes Gutenberg University of Mainz (Germany); Morris, Richard V. [NASA Johnson Space Center (United States); Schroeder, Christian [University of Bayreuth and University of Tuebingen (Germany); Rodionov, Daniel [Institute of Inorganic and Analytical Chemistry, Johannes Gutenberg University of Mainz (Germany); Souza, Paulo A. de [Tasmanian ICT Centre (Australia); Collaboration: MIMOS II Team

    2012-03-15

    The miniaturized Moessbauer spectrometer MIMOS II was developed for the exploration of planetary surfaces. Two MIMOS II instruments were successfully deployed on the martian surface as payload elements of the NASA Mars Exploration Rover (MER) mission and have returned data since landing in January 2004. Moessbauer spectroscopy has made significant contributions to the success of the MER mission, in particular identification of iron-bearing minerals formed through aqueous weathering processes. As a field-portable instrument and with backscattering geometry, MIMOS II provides an opportunity for non-destructive in-situ investigations for a range of applications. For example, the instrument has been used for analyses of archaeological artifacts, for air pollution studies and for in-field monitoring of green rust formation. A MER-type MIMOS II instrument is part of the payload of the Russian Phobos-Grunt mission, scheduled for launch in November 2011, with the aim of exploring the composition of the martian moon Phobos. An advanced version of the instrument, MIMOS IIA, that incorporates capability for elemental analyses, is currently under development.

  9. Sport Tournament Automated Scheduling System

    Directory of Open Access Journals (Sweden)

    Raof R. A. A

    2018-01-01

    Full Text Available The organizers of sport events often face problems such as incorrect calculation of marks and scores, as well as difficulty in creating a good and reliable schedule. Much of the time, issues about the integrity of committee members and about human error also come into the picture. Therefore, the development of a sport tournament automated scheduling system is proposed. The system is able to generate the tournament schedule automatically as well as automatically calculate the scores of each tournament. The problems of scheduling the matches of a round-robin and a knock-out phase in a sport league are given focus. The problem is defined formally and its computational complexity is noted. A solution algorithm is presented using a two-step approach. The first step is the creation of a tournament pattern and is based on a known graph-theoretic method. The second one is an assignment problem, and it is solved using a constraint-based depth-first branch-and-bound procedure that assigns actual teams to numbers in the pattern. As a result, the scheduling process and knock-out phase become easy for the tournament organizer, while at the same time increasing the level of reliability.
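
    The first step described above, generating a round-robin tournament pattern by a known graph-theoretic method, is illustrated below with the classic circle (polygon) method; mapping the resulting numbers to actual teams is the separate assignment step the paper solves by branch and bound.

      # Round-robin tournament pattern via the classic circle (polygon) method.
      # Team numbers are later mapped to real teams in the assignment step.

      def round_robin_rounds(n_teams):
          teams = list(range(n_teams))
          if n_teams % 2:
              teams.append(None)               # bye for odd team counts
          n = len(teams)
          rounds = []
          for _ in range(n - 1):
              pairs = [(teams[i], teams[n - 1 - i]) for i in range(n // 2)
                       if teams[i] is not None and teams[n - 1 - i] is not None]
              rounds.append(pairs)
              # keep team 0 fixed, rotate the rest one position
              teams = [teams[0]] + [teams[-1]] + teams[1:-1]
          return rounds

      for r, pairs in enumerate(round_robin_rounds(6), start=1):
          print("round", r, pairs)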

  10. Contrast and autoshaping in multiple schedules varying reinforcer rate and duration.

    Science.gov (United States)

    Hamilton, B E; Silberberg, A

    1978-07-01

    Thirteen master pigeons were exposed to multiple schedules in which reinforcement frequency (Experiment I) or duration (Experiment II) was varied. In Phases 1 and 3 of Experiment I, the values of the first and second components' random-interval schedules were 33 and 99 seconds, respectively. In Phase 2, these values were 99 seconds for both components. In Experiment II, a random-interval 33-second schedule was associated with each component. During Phases 1 and 3, the first and second components had hopper durations of 7.5 and 2.5 seconds respectively. During Phase 2, both components' hopper durations were 2.5 seconds. In each experiment, positive contrast obtained for about half the master subjects. The rest showed a rate increase in both components (positive induction). Each master subject's key colors and reinforcers were synchronously presented on a response-independent basis to a yoked control. Richer component key-pecking occurred during each experiment's Phases 1 and 3 among half these subjects. However, none responded during the contrast condition (unchanged component of each experiment's Phase 2). From this it is inferred that autoshaping did not contribute to the contrast and induction findings among master birds. Little evidence of local contrast (highest rate at beginning of richer component) was found in any subject. These data show that (a) contrast can occur independently from autoshaping, (b) contrast assays during equal-valued components may produce induction, (c) local contrast in multiple schedules often does not occur, and (d) differential hopper durations can produce autoshaping and contrast.

  11. An Improved Version of Discrete Particle Swarm Optimization for Flexible Job Shop Scheduling Problem with Fuzzy Processing Time

    Directory of Open Access Journals (Sweden)

    Song Huang

    2016-01-01

    Full Text Available Fuzzy processing times occasionally exist in the job shop scheduling problem of a flexible manufacturing system. To deal with fuzzy processing time, fuzzy flexible job shop models have been established in several papers and have attracted numerous researchers' attention recently. In our research, an improved version of discrete particle swarm optimization (IDPSO) is designed to solve the flexible job shop scheduling problem with fuzzy processing time (FJSPF). In IDPSO, heuristic initialization methods based on triangular fuzzy numbers are developed, and a combination of six initialization methods is applied to initialize the machine assignment, while a random method is used to initialize the operation sequence. Then, some simple and effective discrete operators are employed to update each particle's position and generate new particles. In order to guide the particles effectively, we extend the global best position to a set with several global best positions. Finally, experiments are designed to investigate the impact of four parameters in IDPSO by the Taguchi method, and IDPSO is tested on five instances and compared with some state-of-the-art algorithms. The experimental results show that the proposed algorithm can obtain better solutions for FJSPF and is more competitive than the compared algorithms.
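
    A small sketch of the triangular-fuzzy-number arithmetic underlying such fuzzy processing times is shown below: addition, a crisp ranking value, and a max-like comparison used when accumulating fuzzy completion times. The graded-mean ranking used here is a common convention and an assumption, not necessarily the criterion used in IDPSO.

      # Triangular fuzzy processing times: addition, ranking and a max-like
      # comparison (graded-mean ranking is an illustrative choice).

      def tfn_add(a, b):
          return tuple(x + y for x, y in zip(a, b))

      def graded_mean(a):
          l, m, u = a
          return (l + 4 * m + u) / 6.0

      def tfn_max(a, b):
          # pick the "larger" fuzzy time by its crisp ranking value
          return a if graded_mean(a) >= graded_mean(b) else b

      p1 = (2, 3, 5)          # fuzzy processing time of operation 1
      p2 = (1, 4, 6)          # fuzzy processing time of operation 2
      completion = tfn_add(tfn_max(p1, p2), (1, 2, 3))   # toy recurrence step
      print(completion, graded_mean(completion))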

  12. Prescribed Travel Schedules for Fatigue Management

    Science.gov (United States)

    Whitmire, Alexandra; Johnston, Smith; Lockley, Steven

    2011-01-01

    The NASA Fatigue Management Team is developing recommendations for managing fatigue during travel and for shift work operations, as Clinical Practice Guidelines for the Management of Circadian Desynchrony in ISS Operations. The Guidelines provide the International Space Station (ISS) flight surgeons and other operational clinicians with evidence-based recommendations for mitigating fatigue and other factors related to sleep loss and circadian desynchronization. As much international travel is involved both before and after flight, the guidelines provide recommendations for: pre-flight training, in-flight operations, and post-flight rehabilitation. The objective is to standardize the process by which care is provided to crewmembers, ground controllers, and other support personnel such as trainers, when overseas travel or schedule shifting is required. Proper scheduling of countermeasures - light, darkness, melatonin, diet, exercise, and medications - is the cornerstone for facilitating circadian adaptation, improving sleep, enhancing alertness, and optimizing performance. The Guidelines provide, among other things, prescribed travel schedules that outline the specific implementation of these mitigation strategies. Each travel schedule offers evidence-based protocols for properly using the NASA-identified countermeasures for fatigue. This presentation will describe the travel implementation schedules and how these can be used to alleviate the effects of jet lag and/or schedule shifts.

  13. Instructions, multiple schedules, and extinction: Distinguishing rule-governed from schedule-controlled behavior.

    Science.gov (United States)

    Hayes, S C; Brownstein, A J; Haas, J R; Greenway, D E

    1986-09-01

    Schedule sensitivity has usually been examined either through a multiple schedule or through changes in schedules after steady-state responding has been established. This study compared the effects of these two procedures when various instructions were given. Fifty-five college students responded in two 32-min sessions under a multiple fixed-ratio 18/differential-reinforcement-of-low-rate 6-s schedule, followed by one session of extinction. Some subjects received no instructions regarding the appropriate rates of responding, whereas others received instructions to respond slowly, rapidly, or both. Relative to the schedule in operation, the instructions were minimal, partially inaccurate, or accurate. When there was little schedule sensitivity in the multiple schedule, there was little in extinction. When apparently schedule-sensitive responding occurred in the multiple schedule, however, sensitivity in extinction occurred only if differential responding in the multiple schedule could not be due to rules supplied by the experimenter. This evidence shows that rule-governed behavior that occurs in the form of schedule-sensitive behavior may not in fact become schedule-sensitive even though it makes contact with the scheduled reinforcers.

  14. Complete Element Abundances of Nine Stars in the r-process Galaxy Reticulum II

    Science.gov (United States)

    Ji, Alexander P.; Frebel, Anna; Simon, Joshua D.; Chiti, Anirudh

    2016-10-01

    We present chemical abundances derived from high-resolution Magellan/Magellan Inamori Kyocera Echelle spectra of the nine brightest known red giant members of the ultra-faint dwarf galaxy Reticulum II (Ret II). These stars span the full metallicity range of Ret II (-3.5 contaminated known r-process pattern. The abundances of lighter elements up to the iron peak are otherwise similar to abundances of stars in the halo and in other ultra-faint dwarf galaxies. However, the scatter in abundance ratios is large enough to suggest that inhomogeneous metal mixing is required to explain the chemical evolution of this galaxy. The presence of low amounts of neutron-capture elements in other ultra-faint dwarf galaxies may imply the existence of additional r-process sites besides the source of r-process elements in Ret II. Galaxies like Ret II may be the original birth sites of r-process enhanced stars now found in the halo. This paper includes data gathered with the 6.5 m Magellan Telescopes located at Las Campanas Observatory, Chile.

  15. The local–global conjecture for scheduling with non-linear cost

    NARCIS (Netherlands)

    Bansal, N.; Dürr, C.; Thang, N.K.K.; Vásquez, Ó.C.

    2017-01-01

    We consider the classical scheduling problem on a single machine, on which we need to schedule sequentially n given jobs. Every job j has a processing time pj and a priority weight wj, and for a given schedule a completion time Cj. In this paper, we consider the problem of minimizing the objective

  16. Evaluation of Selected Resource Allocation and Scheduling Methods in Heterogeneous Many-Core Processors and Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Ciznicki Milosz

    2014-12-01

    Full Text Available Heterogeneous many-core computing resources are increasingly popular among users due to their improved performance over homogeneous systems. Many developers have realized that heterogeneous systems, e.g. a combination of a shared-memory multi-core CPU machine with massively parallel Graphics Processing Units (GPUs), can provide significant performance opportunities to a wide range of applications. However, the best overall performance can only be achieved if application tasks are efficiently assigned to the different types of processor units in time, taking into account their specific resource requirements. Additionally, one should note that the available heterogeneous resources have been designed as general-purpose units, albeit with many built-in features accelerating specific application operations. In other words, the same algorithm or application functionality can be implemented as a different task for a CPU or a GPU. Nevertheless, from the perspective of various evaluation criteria, e.g. the total execution time or energy consumption, we may observe completely different results. Therefore, as tasks can be scheduled and managed in many alternative ways on both many-core CPUs and GPUs, and consequently have a huge impact on overall computing resource performance, there is a need for new and improved resource management techniques. In this paper we discuss results achieved during experimental performance studies of selected task scheduling methods in heterogeneous computing systems. Additionally, we present a new architecture for a resource allocation and task scheduling library which provides a generic application programming interface at the operating system level for improving scheduling policies, taking into account the diversity of tasks and the characteristics of heterogeneous computing resources.
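
    As a baseline for the kind of decision the paper studies, the sketch below places each task on the device (CPU or GPU) where it would finish earliest, given device-specific runtime estimates. The estimates are invented, and this greedy rule is only an illustration, not the resource-allocation library described above.

      # Minimal earliest-finish-time heuristic for a CPU/GPU pool: each task
      # has a device-specific runtime estimate and is placed where it would
      # finish first.  Illustrative only.

      def schedule(tasks, devices):
          """tasks: list of {device_name: runtime}; devices: list of names."""
          ready = {d: 0.0 for d in devices}
          placement = []
          for i, runtimes in enumerate(tasks):
              best = min(devices, key=lambda d: ready[d] + runtimes[d])
              ready[best] += runtimes[best]
              placement.append((i, best, ready[best]))
          return placement, max(ready.values())

      tasks = [{"cpu": 4.0, "gpu": 1.0},    # highly parallel kernel
               {"cpu": 2.0, "gpu": 3.0},    # branchy, CPU-friendly task
               {"cpu": 5.0, "gpu": 1.5}]
      plan, makespan = schedule(tasks, ["cpu", "gpu"])
      print(plan, "makespan:", makespan)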

  17. A Fe(II)/citrate/UV/PMS process for carbamazepine degradation at a very low Fe(II)/PMS ratio and neutral pH: The mechanisms.

    Science.gov (United States)

    Ling, Li; Zhang, Dapeng; Fan, Chihhao; Shang, Chii

    2017-11-01

    A novel Fe(II)/citrate/UV/PMS process for degrading a model micropollutant, carbamazepine (CBZ), at a low Fe(II)/PMS ratio and neutral pH has been proposed in this study, and the mechanism of radical generation in the system was explored. With a UV dose of 302.4 mJ/cm², an initial pH of 7, and CBZ, PMS, Fe(II) and citrate at initial concentrations of 10, 100, 12 and 26 μM, respectively, the CBZ degradation efficiency reached 71% in 20 min in the Fe(II)/citrate/UV/PMS process, which was 4.7 times higher than that in either the citrate/UV/PMS or Fe(II)/citrate/PMS process. The enhanced CBZ degradation in the Fe(II)/citrate/UV/PMS process was mainly attributed to the continuous activation of PMS by the UV-catalyzed regenerated Fe(II) from a Fe(III)-citrate complex, [Fe3O(cit)3H3]2-, which not only kept Fe(III) soluble at neutral pH, but also increased its molar absorbance and quantum yield by factors of 6.6 and 2.6, respectively, compared with ionic Fe(III). In the Fe(II)/citrate/UV/PMS process, the SO4•- produced from the fast reaction between PMS and the initially added Fe(II) contributed 11% of the CBZ degradation. The PMS activation by UV radiation and by regenerated Fe(II) contributed an additional 14% and 46% of the CBZ removal, respectively. The low iron and citrate doses and the fast radical generation at neutral pH make the Fe(II)/citrate/UV/PMS process suitable for degrading recalcitrant organic compounds in potable water. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Flow-shop scheduling problem under uncertainties: Review and trends

    Directory of Open Access Journals (Sweden)

    Eliana María González-Neira

    2017-03-01

    Full Text Available Among the different tasks in production logistics, job scheduling is one of the most important at the operational decision-making level to enable organizations to achieve competitiveness. Scheduling consists in the allocation of limited resources to activities over time in order to achieve one or more optimization objectives. Flow-shop (FS) scheduling problems encompass the sequencing processes in environments in which the activities or operations are performed in a serial flow. This type of configuration includes assembly lines and the chemical, electronic, food, and metallurgical industries, among others. Scheduling has mostly been investigated for the deterministic cases, in which all parameters are known in advance and do not vary over time. Nevertheless, in real-world situations, events are frequently subject to uncertainties that can affect the decision-making process. Thus, it is important to study scheduling and sequencing activities under uncertainties since they can cause infeasibilities and disturbances. The purpose of this paper is to provide a general overview of the FS scheduling problem under uncertainties and its role in production logistics and to draw up opportunities for further research. To this end, 100 papers about FS and flexible flow-shop scheduling problems published from 2001 to October 2016 were analyzed and classified. Trends in the reviewed literature are presented and finally some research opportunities in the field are proposed.

  19. Schedules of Controlled Substances: Temporary Placement of 4-Fluoroisobutyryl Fentanyl into Schedule I. Temporary scheduling order.

    Science.gov (United States)

    2017-05-03

    The Administrator of the Drug Enforcement Administration is issuing this temporary scheduling order to schedule the synthetic opioid, N-(4-fluorophenyl)-N-(1-phenethylpiperidin-4-yl)isobutyramide (4-fluoroisobutyryl fentanyl or para-fluoroisobutyryl fentanyl), and its isomers, esters, ethers, salts and salts of isomers, esters, and ethers, into schedule I pursuant to the temporary scheduling provisions of the Controlled Substances Act. This action is based on a finding by the Administrator that the placement of 4-fluoroisobutyryl fentanyl into schedule I of the Controlled Substances Act is necessary to avoid an imminent hazard to the public safety. As a result of this order, the regulatory controls and administrative, civil, and criminal sanctions applicable to schedule I controlled substances will be imposed on persons who handle (manufacture, distribute, reverse distribute, import, export, engage in research, conduct instructional activities or chemical analysis, or possess), or propose to handle, 4-fluoroisobutyryl fentanyl.

  20. 36 CFR 1258.12 - NARA reproduction fee schedule.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false NARA reproduction fee... ADMINISTRATION PUBLIC AVAILABILITY AND USE FEES § 1258.12 NARA reproduction fee schedule. (a) Certification: $15...) Unlisted processes: For reproductions not covered by this fee schedule, see also § 1258.4. Fees for other...

  1. KEKB and PEP-II B Factories

    International Nuclear Information System (INIS)

    Seeman, J.T.

    1997-01-01

    Two asymmetric B-Factories KEKB at KEK and PEP-II at SLAC are under construction, designed to study CP violation in the b-quark sector with a center of mass energy of 10.58 GeV. These two new accelerators are high luminosity two-ring two-energy e+e- colliders with one interaction point. There are many challenging accelerator physics and engineering issues associated with the high beam currents and high luminosities of these rings. The chosen solutions to these issues and the general parameters of the two rings are described in detail side-by-side. KEKB and PEP-II are well into the installation phase and are both scheduled to be completed in 1998. The particle physics programs are scheduled to start in 1999

  2. Scheduling techniques in the Request Oriented Scheduling Engine (ROSE)

    Science.gov (United States)

    Zoch, David R.

    1991-01-01

    Scheduling techniques in ROSE are presented in the form of viewgraphs. The following subject areas are covered: agenda; ROSE summary and history; NCC-ROSE task goals; accomplishments; ROSE timeline manager; scheduling concerns; current and ROSE approaches; initial scheduling; BFSSE overview and example; and summary.

  3. Permutation flow-shop scheduling problem to optimize a quadratic objective function

    Science.gov (United States)

    Ren, Tao; Zhao, Peng; Zhang, Da; Liu, Bingqian; Yuan, Huawei; Bai, Danyu

    2017-09-01

    A flow-shop scheduling model enables appropriate sequencing for each job and for processing on a set of machines in compliance with identical processing orders. The objective is to achieve a feasible schedule for optimizing a given criterion. Permutation is a special setting of the model in which the processing order of the jobs on the machines is identical for each subsequent step of processing. This article addresses the permutation flow-shop scheduling problem to minimize the criterion of total weighted quadratic completion time. With a probability hypothesis, the asymptotic optimality of the weighted shortest processing time schedule under a consistency condition (WSPT-CC) is proven for sufficiently large-scale problems. However, the worst case performance ratio of the WSPT-CC schedule is the square of the number of machines in certain situations. A discrete differential evolution algorithm, where a new crossover method with multiple-point insertion is used to improve the final outcome, is presented to obtain high-quality solutions for moderate-scale problems. A sequence-independent lower bound is designed for pruning in a branch-and-bound algorithm for small-scale problems. A set of random experiments demonstrates the performance of the lower bound and the effectiveness of the proposed algorithms.
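
    The sketch below computes the completion times of a permutation flow shop with the standard recurrence and evaluates the total weighted quadratic completion time for a plain WSPT ordering (jobs sorted by total processing time over weight). The paper's consistency condition and its algorithms are not reproduced; the job data are invented.

      # Permutation flow-shop completion times (standard recurrence) and the
      # total weighted quadratic completion time for a simple WSPT ordering.

      def completion_times(perm, p):
          """p[j][i] = processing time of job j on machine i."""
          m = len(p[0])
          prev = [0.0] * m                  # completion times of previous job
          completions = []
          for j in perm:
              row = [0.0] * m
              for i in range(m):
                  earlier = row[i - 1] if i else 0.0
                  row[i] = max(earlier, prev[i]) + p[j][i]
              completions.append(row[-1])
              prev = row
          return completions

      def weighted_quadratic(perm, p, w):
          return sum(w[j] * c ** 2 for j, c in zip(perm, completion_times(perm, p)))

      p = [[3, 2], [1, 4], [2, 2]]          # hypothetical 3 jobs x 2 machines
      w = [2.0, 1.0, 3.0]
      wspt = sorted(range(len(p)), key=lambda j: sum(p[j]) / w[j])
      print(wspt, weighted_quadratic(wspt, p, w))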

  4. Artificial intelligence techniques for scheduling Space Shuttle missions

    Science.gov (United States)

    Henke, Andrea L.; Stottler, Richard H.

    1994-01-01

    Planning and scheduling of NASA Space Shuttle missions is a complex, labor-intensive process requiring the expertise of experienced mission planners. We have developed a planning and scheduling system using combinations of artificial intelligence knowledge representations and planning techniques to capture mission planning knowledge and automate the multi-mission planning process. Our integrated object oriented and rule-based approach reduces planning time by orders of magnitude and provides planners with the flexibility to easily modify planning knowledge and constraints without requiring programming expertise.

  5. Hybrid Metaheuristics for Solving a Fuzzy Single Batch-Processing Machine Scheduling Problem

    Directory of Open Access Journals (Sweden)

    S. Molla-Alizadeh-Zavardehi

    2014-01-01

    Full Text Available This paper deals with the problem of minimizing the total weighted tardiness of jobs in a real-world single batch-processing machine (SBPM) scheduling problem in the presence of fuzzy due dates. In this paper, first a fuzzy mixed integer linear programming model is developed. Then, due to the complexity of the problem, which is NP-hard, we design two hybrid metaheuristics called GA-VNS and VNS-SA applying the advantages of genetic algorithm (GA), variable neighborhood search (VNS), and simulated annealing (SA) frameworks. Besides, we propose three fuzzy earliest due date heuristics to solve the given problem. Through computational experiments with several random test problems, a robust calibration is applied on the parameters. Finally, computational results on different-scale test problems are presented to compare the proposed algorithms.

  6. An Improved Hierarchical Genetic Algorithm for Sheet Cutting Scheduling with Process Constraints

    Directory of Open Access Journals (Sweden)

    Yunqing Rao

    2013-01-01

    Full Text Available For the first time, an improved hierarchical genetic algorithm for the sheet cutting problem which involves n cutting patterns for m non-identical parallel machines with process constraints has been proposed in the integrated cutting stock model. The objective of the cutting scheduling problem is minimizing the weighted completed time. A mathematical model for this problem is presented, an improved hierarchical genetic algorithm (ant colony-hierarchical genetic algorithm) is developed for better solution, and a hierarchical coding method is used based on the characteristics of the problem. Furthermore, to speed up convergence rates and resolve local convergence issues, a kind of adaptive crossover probability and mutation probability is used in this algorithm. The computational result and comparison prove that the presented approach is quite effective for the considered problem.

  7. An improved hierarchical genetic algorithm for sheet cutting scheduling with process constraints.

    Science.gov (United States)

    Rao, Yunqing; Qi, Dezhong; Li, Jinling

    2013-01-01

    For the first time, an improved hierarchical genetic algorithm for sheet cutting problem which involves n cutting patterns for m non-identical parallel machines with process constraints has been proposed in the integrated cutting stock model. The objective of the cutting scheduling problem is minimizing the weighted completed time. A mathematical model for this problem is presented, an improved hierarchical genetic algorithm (ant colony--hierarchical genetic algorithm) is developed for better solution, and a hierarchical coding method is used based on the characteristics of the problem. Furthermore, to speed up convergence rates and resolve local convergence issues, a kind of adaptive crossover probability and mutation probability is used in this algorithm. The computational result and comparison prove that the presented approach is quite effective for the considered problem.

  8. In-situ Mössbauer spectroscopy with MIMOS II

    International Nuclear Information System (INIS)

    Fleischer, Iris; Klingelhöfer, Göstar; Morris, Richard V.; Schröder, Christian; Rodionov, Daniel; Souza, Paulo A. de

    2012-01-01

    The miniaturized Mössbauer spectrometer MIMOS II was developed for the exploration of planetary surfaces. Two MIMOS II instruments were successfully deployed on the martian surface as payload elements of the NASA Mars Exploration Rover (MER) mission and have returned data since landing in January 2004. Mössbauer spectroscopy has made significant contributions to the success of the MER mission, in particular identification of iron-bearing minerals formed through aqueous weathering processes. As a field-portable instrument and with backscattering geometry, MIMOS II provides an opportunity for non-destructive in-situ investigations for a range of applications. For example, the instrument has been used for analyses of archaeological artifacts, for air pollution studies and for in-field monitoring of green rust formation. A MER-type MIMOS II instrument is part of the payload of the Russian Phobos-Grunt mission, scheduled for launch in November 2011, with the aim of exploring the composition of the martian moon Phobos. An advanced version of the instrument, MIMOS IIA, that incorporates capability for elemental analyses, is currently under development.

  9. Split Scheduling with Uniform Setup Times

    NARCIS (Netherlands)

    Schalekamp, F.; Sitters, R.A.; van der Ster, S.L.; Stougie, L.; Verdugo, V.; van Zuylen, A.

    2015-01-01

    We study a scheduling problem in which jobs may be split into parts, where the parts of a split job may be processed simultaneously on more than one machine. Each part of a job requires a setup time, however, on the machine where the job part is processed. During setup, a machine cannot process or

  10. Using Integer Programming for Airport Service Planning in Staff Scheduling

    Directory of Open Access Journals (Sweden)

    W.H. Ip

    2010-09-01

    Full Text Available Reliability and safety in flight are extremely necessary and depend on the adoption of a proper maintenance system. Therefore, it is essential for aircraft maintenance companies to perform manpower scheduling efficiently. One of the objectives of this paper is to provide an Integer Programming approach to determine optimal solutions to aircraft maintenance planning and scheduling, so that the planning and scheduling processes can become more efficient and effective. Another objective is to develop a set of computational schedules for maintenance manpower to cover all scheduled flights. In this paper, a sequential methodology consisting of 3 stages is proposed: the initial maintenance demand schedule, the maintenance pairing, and the maintenance group(s) assignment. Since scheduling is split up into different stages, different mathematical techniques have been adopted to cater for their own problem characteristics. Microsoft Excel is used throughout. Results from the first and second stages are input into the integer programming model using Microsoft Excel Solver to find the optimal solution. Also, Microsoft Excel VBA is used for devising a scheduling system in order to reduce manual processing and provide a user-friendly interface. In the results, optimal solutions are obtained for all stages, and the computation time is reasonable and acceptable. Besides, a comparison of peak time and non-peak time is discussed.

  11. Scheduling of Fault-Tolerant Embedded Systems with Soft and Hard Timing Constraints

    DEFF Research Database (Denmark)

    Izosimov, Viacheslav; Pop, Paul; Eles, Petru

    2008-01-01

    In this paper we present an approach to the synthesis of fault-tolerant schedules for embedded applications with soft and hard real-time constraints. We are interested in guaranteeing the deadlines for the hard processes even in the case of faults, while maximizing the overall utility. We use time/utility functions to capture the utility of soft processes. Process re-execution is employed to recover from multiple faults. A single static schedule computed off-line is not fault tolerant and is pessimistic in terms of utility, while a purely online approach, which computes a new schedule every time a process

  12. Flow-shop scheduling problem under uncertainties: Review and trends

    OpenAIRE

    Eliana María González-Neira; Jairo R. Montoya-Torres; David Barrera

    2017-01-01

    Among the different tasks in production logistics, job scheduling is one of the most important at the operational decision-making level to enable organizations to achieve competiveness. Scheduling consists in the allocation of limited resources to activities over time in order to achieve one or more optimization objectives. Flow-shop (FS) scheduling problems encompass the sequencing processes in environments in which the activities or operations are performed in a serial flow. This type of co...

  13. APGEN Scheduling: 15 Years of Experience in Planning Automation

    Science.gov (United States)

    Maldague, Pierre F.; Wissler, Steve; Lenda, Matthew; Finnerty, Daniel

    2014-01-01

    In this paper, we discuss the scheduling capability of APGEN (Activity Plan Generator), a multi-mission planning application that is part of the NASA AMMOS (Advanced Multi- Mission Operations System), and how APGEN scheduling evolved over its applications to specific Space Missions. Our analysis identifies two major reasons for the successful application of APGEN scheduling to real problems: an expressive DSL (Domain-Specific Language) for formulating scheduling algorithms, and a well-defined process for enlisting the help of auxiliary modeling tools in providing high-fidelity, system-level simulations of the combined spacecraft and ground support system.

  14. Understanding the costs and schedule of hydroelectric projects

    International Nuclear Information System (INIS)

    Merrow, E.W.; Schroeder, B.R.

    1991-01-01

    This paper is based on a study conducted for the World Bank which evaluated the feasibility of developing an empirically based ex ante project analysis system for hydroelectric projects. The system would be used to assess: the reasonableness of engineering-based cost and schedule estimates used for project appraisal and preliminary estimates used to select projects for appraisal; and the potential for cost growth and schedule slip. The system would help identify projects early in the project appraisal process that harbor significantly higher than normal risks of overrunning cost and schedule estimates

  15. Diverse task scheduling for individualized requirements in cloud manufacturing

    Science.gov (United States)

    Zhou, Longfei; Zhang, Lin; Zhao, Chun; Laili, Yuanjun; Xu, Lida

    2018-03-01

    Cloud manufacturing (CMfg) has emerged as a new manufacturing paradigm that provides ubiquitous, on-demand manufacturing services to customers through network and CMfg platforms. In CMfg system, task scheduling as an important means of finding suitable services for specific manufacturing tasks plays a key role in enhancing the system performance. Customers' requirements in CMfg are highly individualized, which leads to diverse manufacturing tasks in terms of execution flows and users' preferences. We focus on diverse manufacturing tasks and aim to address their scheduling issue in CMfg. First of all, a mathematical model of task scheduling is built based on analysis of the scheduling process in CMfg. To solve this scheduling problem, we propose a scheduling method aiming for diverse tasks, which enables each service demander to obtain desired manufacturing services. The candidate service sets are generated according to subtask directed graphs. An improved genetic algorithm is applied to searching for optimal task scheduling solutions. The effectiveness of the scheduling method proposed is verified by a case study with individualized customers' requirements. The results indicate that the proposed task scheduling method is able to achieve better performance than some usual algorithms such as simulated annealing and pattern search.
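
    A minimal illustration of candidate-service selection over a subtask directed graph is sketched below: subtasks are ordered topologically (Kahn's algorithm) and, for each one, the cheapest candidate service is picked. The subtasks, services and costs are hypothetical, and the improved genetic algorithm of the paper is not reproduced; this is only a baseline greedy.

      # Candidate-service selection over a subtask directed graph: compute a
      # topological execution order and pick the lowest-cost candidate per
      # subtask (illustrative baseline, not the paper's improved GA).
      from collections import deque

      def topological_order(edges, nodes):
          indeg = {n: 0 for n in nodes}
          succ = {n: [] for n in nodes}
          for u, v in edges:
              succ[u].append(v)
              indeg[v] += 1
          queue = deque(n for n in nodes if indeg[n] == 0)
          order = []
          while queue:
              u = queue.popleft()
              order.append(u)
              for v in succ[u]:
                  indeg[v] -= 1
                  if indeg[v] == 0:
                      queue.append(v)
          return order

      subtasks = ["design", "machining", "assembly"]
      edges = [("design", "machining"), ("machining", "assembly")]
      candidates = {"design": {"svcA": 10, "svcB": 8},        # service: cost
                    "machining": {"svcC": 15, "svcD": 12},
                    "assembly": {"svcE": 5}}

      for sub in topological_order(edges, subtasks):
          service = min(candidates[sub], key=candidates[sub].get)
          print(sub, "->", service)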

  16. Application of Zr/Ti-Pic in the adsorption process of Cu(II), Co(II) and Ni(II) using adsorption physico-chemical models and thermodynamics of the process

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, Denis Lima; Airoldi, Claudio [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Dept. de Quimica Inorganica]. E-mail: dlguerra@iqm.unicamp.br; Lemos, Vanda Porpino; Angelica, Romulo Simoes [Universidade Federal do Para (UFPa), Belem (Brazil); Viana, Rubia Ribeiro [Universidade Federal do Mato Grosso (UFMT), Cuiaba (Brazil). Inst. de Ciencias Exatas e da Terra. Dept. de Recursos Minerais

    2008-07-01

    The aim of this investigation is to study how Zr/Ti-Pic adsorbs metals. The physico-chemical properties of Zr/Ti-Pic have been optimized with pillarization processes, and Cu(II), Ni(II) and Co(II) adsorption from aqueous solution has been carried out, with maximum adsorption values of 8.85, 8.30 and 7.78 × 10⁻¹ mmol g⁻¹, respectively. The Langmuir, Freundlich and Temkin adsorption isotherm models have been applied to fit the experimental data with a linear regression process. The energetic effect caused by metal interaction was determined through calorimetric titration at the solid-liquid interface and gave a net thermal effect that enabled the calculation of the exothermic values and the equilibrium constant. (author)
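
    For readers unfamiliar with the linear-regression treatment of isotherms mentioned above, the sketch below fits the linearized Langmuir form Ce/qe = Ce/qmax + 1/(KL·qmax) by least squares. The equilibrium data are synthetic, not the paper's measurements.

      # Linearized Langmuir fit (Ce/qe = Ce/qmax + 1/(KL*qmax)) by least
      # squares, mirroring the linear-regression treatment described above.
      # The data points below are synthetic.
      import numpy as np

      ce = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # equilibrium conc. (mmol/L)
      qe = np.array([0.35, 0.52, 0.66, 0.75, 0.82])   # adsorbed amount (mmol/g)

      slope, intercept = np.polyfit(ce, ce / qe, 1)   # y = Ce/qe versus x = Ce
      q_max = 1.0 / slope
      k_l = 1.0 / (intercept * q_max)
      print(f"qmax = {q_max:.2f} mmol/g, KL = {k_l:.2f} L/mmol")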

  17. Maximum Lateness Scheduling on Two-Person Cooperative Games with Variable Processing Times and Common Due Date

    Directory of Open Access Journals (Sweden)

    Peng Liu

    2017-01-01

    Full Text Available A new maximum lateness scheduling model in which both cooperative games and variable processing times exist simultaneously is considered in this paper. The variable processing time of a job is described by an increasing or a decreasing function of the position of the job in the sequence. Two persons have to cooperate in order to process a set of jobs. Each of them has a single machine, and their processing cost is defined as the minimum value of the maximum lateness. All jobs have a common due date. The objective is to maximize the product of their rational positive cooperative profits. A division of the jobs should be negotiated to yield a reasonable cooperative profit allocation scheme acceptable to both. We propose sufficient and necessary conditions for the problems to have a positive integer solution.
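
    The sketch below computes the maximum lateness on a single machine with a common due date under a simple position-based processing-time model (actual time p·r^a), which is one common way to express positional learning; the exact cost function and game of the paper are not reproduced, and the numbers are invented.

      # Maximum lateness on one machine with a common due date and a
      # position-based learning model: the actual time of the job in position
      # r is p * r**a (a < 0 gives a learning effect).  Illustrative only.

      def max_lateness(jobs, due_date, a=-0.3):
          clock = 0.0
          worst = float("-inf")
          for position, p in enumerate(jobs, start=1):
              clock += p * position ** a
              worst = max(worst, clock - due_date)
          return worst

      jobs = [5.0, 3.0, 7.0, 2.0]
      # With a common due date, Lmax = Cmax - d, so any order that minimizes
      # the total actual processing time is optimal: with a < 0 the later
      # positions are cheaper, so the longer jobs go last.
      print(max_lateness(sorted(jobs), due_date=10.0))                 # shorter jobs first
      print(max_lateness(sorted(jobs, reverse=True), due_date=10.0))   # worse ordering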

  18. An Event-driven, Value-based, Pull Systems Engineering Scheduling Approach

    Science.gov (United States)

    2012-03-01

    combining a services approach to systems engineering with a kanban-based scheduling system. It provides the basis for validating the approach with agent-based simulations. The work investigates the use of flow-based pull scheduling techniques (kanban systems) in a rapid-response development setting.

  19. Project management with dynamic scheduling baseline scheduling, risk analysis and project control

    CERN Document Server

    Vanhoucke, Mario

    2013-01-01

    The topic of this book is known as dynamic scheduling, a term used to refer to three dimensions of project management and scheduling: the construction of a baseline schedule, the analysis of the project schedule's risk, and the project control phase during project progress. This dynamic scheduling point of view implicitly assumes that the usability of a project's baseline schedule is rather limited and that it only acts as a point of reference in the project life cycle.

  20. NASA scheduling technologies

    Science.gov (United States)

    Adair, Jerry R.

    1994-01-01

    This paper is a consolidated report on ten major planning and scheduling systems that have been developed by the National Aeronautics and Space Administration (NASA). A description of each system, its components, and how it could potentially be used in private industry is provided in this paper. The planning and scheduling technology represented by the systems ranges from activity-based scheduling employing artificial intelligence (AI) techniques to constraint-based, iterative repair scheduling. The space-related application domains in which the systems have been deployed vary from Space Shuttle monitoring during launch countdown to long-term Hubble Space Telescope (HST) scheduling. This paper also describes any correlation that may exist between the work done on different planning and scheduling systems. Finally, this paper documents the lessons learned from the work and research performed in planning and scheduling technology and describes the areas where future work will be conducted.

  1. 78 FR 26701 - Schedules of Controlled Substances: Placement of Lorcaserin Into Schedule IV

    Science.gov (United States)

    2013-05-08

    .... Phentermine Being Combined With Lorcaserin Eight commenters expressed concern about the probability that... follows: Several commenters were critical of DEA's handling of the scheduling process. The commenters did..., 1301.74, 1301.75(b) and (c), 1301.76, and 1301.77 on or after June 7, 2013. Labeling and Packaging. All...

  2. Schedulability-Driven Communication Synthesis for Time Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    We present an approach to static priority preemptive process scheduling for the synthesis of hard real-time distributed embedded systems where communication plays an important role. The communication model is based on a time-triggered protocol. We have developed an analysis for the communication delays, proposing four different message scheduling policies over a time-triggered communication channel. Optimization strategies for the synthesis of communication are developed, and the four approaches to message scheduling are compared using extensive experiments.

  3. Schedulability-Driven Communication Synthesis for Time Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    1999-01-01

    We present an approach to static priority preemptive process scheduling for the synthesis of hard real-time distributed embedded systems where communication plays an important role. The communication model is based on a time-triggered protocol. We have developed an analysis for the communication delays, proposing four different message scheduling policies over a time-triggered communication channel. Optimization strategies for the synthesis of communication are developed, and the four approaches to message scheduling are compared using extensive experiments.

  4. Schedulability-Driven Communication Synthesis for Time Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2004-01-01

    We present an approach to static priority preemptive process scheduling for the synthesis of hard real-time distributed embedded systems where communication plays an important role. The communication model is based on a time-triggered protocol. We have developed an analysis for the communication delays with four different message scheduling policies over a time-triggered communication channel. Optimization strategies for the synthesis of communication are developed, and the four approaches to message scheduling are compared using extensive experiments.

  5. Crane Scheduling for a Plate Storage

    DEFF Research Database (Denmark)

    Hansen, Jesper; Clausen, Jens

    2002-01-01

    Odense Steel Shipyard produces the world's largest container ships. The first step in producing the steel ships is handling the arrival and storage of steel plates until they are needed in production. This paper considers the problem of scheduling the two cranes that carry out the movements of plates into, around, and out of the storage. The system is required to create a daily schedule for the cranes, but also to handle possible disruptions during the execution of the plan. The problem is solved with a Simulated Annealing algorithm.

  6. Application of Tabu Search Algorithm in Job Shop Scheduling

    Directory of Open Access Journals (Sweden)

    Betrianis Betrianis

    2010-10-01

    Full Text Available Tabu search is a local search method used to solve combinatorial optimization problems. Its aim is to make the search for the best solution in a complex combinatorial optimization problem (NP-hard, e.g. the job shop scheduling problem) more effective and less computationally expensive, although without a guarantee of reaching the optimum. In this paper, tabu search is used to solve a job shop scheduling problem consisting of 3 (three) cases, namely the ordering packages of September, October and November, with the objective of minimizing the makespan (Cmax). For each ordering package, a combination of initial solution and tabu list length is examined. The results are then compared with 4 (four) other methods that use basic dispatching rules: Shortest Processing Time (SPT), Earliest Due Date (EDD), Most Work Remaining (MWKR) and First Come First Served (FCFS). Scheduling with the tabu search algorithm is sensitive to changes in these variables and gives a shorter makespan than scheduling with the four other methods.
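    A full job-shop tabu search is too long to reproduce here; as a hedged illustration of the same mechanism, the sketch below applies tabu search with pairwise swap moves and an aspiration criterion to a single-machine sequence, using total completion time as a stand-in objective. The job data, tabu tenure, and neighbourhood are assumptions for the example, not the study's setup.

```python
import itertools

JOBS = [5, 3, 8, 2, 7, 4]           # hypothetical processing times

def total_completion_time(seq):
    t, total = 0, 0
    for j in seq:
        t += JOBS[j]
        total += t
    return total

def tabu_search(iterations=100, tenure=5):
    current = list(range(len(JOBS)))
    best, best_cost = current[:], total_completion_time(current)
    tabu = {}                        # move -> iteration until which it stays tabu
    for it in range(iterations):
        candidates = []
        for i, j in itertools.combinations(range(len(current)), 2):
            move = (min(current[i], current[j]), max(current[i], current[j]))
            neighbour = current[:]
            neighbour[i], neighbour[j] = neighbour[j], neighbour[i]
            cost = total_completion_time(neighbour)
            # aspiration: a tabu move is allowed if it improves the best solution
            if tabu.get(move, -1) < it or cost < best_cost:
                candidates.append((cost, neighbour, move))
        cost, neighbour, move = min(candidates)
        current = neighbour
        tabu[move] = it + tenure
        if cost < best_cost:
            best, best_cost = neighbour[:], cost
    return best, best_cost

print(tabu_search())
```

Since the SPT order is optimal for total completion time on a single machine, the toy search should converge to the jobs sorted by processing time, which makes the example easy to sanity-check.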

  7. Incorporation of Tropical Cyclone Avoidance Into Automated Ship Scheduling

    Science.gov (United States)

    2014-06-01

    illustrated during World War II (Drury & Clavin, 2007), when Admiral Frederick "Bull" Halsey was maneuvering his Third Fleet and trying to refuel, while...or damaged beyond repair (Drury & Clavin, 2007). While this is an extreme historical case, it illustrates the dangers of not considering TC tracks...replenishment schedule that takes into account the supply levels of all the ships and maintains the supplies above required levels. With the proper inputs

  8. 75 FR 42831 - Proposed Collection; Comment Request for Form 1065, Schedule C, Schedule D, Schedule K-1...

    Science.gov (United States)

    2010-07-22

    .../or continuing information collections, as required by the Paperwork Reduction Act of 1995, Public Law... Income, Credits, Deductions and Other Items), Schedule L (Balance Sheets per Books), Schedule M-1 (Reconciliation of Income (Loss) per Books With Income (Loss) per Return)), Schedule M-2 (Analysis of Partners...

  9. The consideration and practice of data processing of WBS-II portal β monitor

    International Nuclear Information System (INIS)

    Du Xiangyang; Dong Qiangmin; Zhang Yong; Han Shuping; Wang Xiaodong; Fan Liya; Rao Xianming

    2001-01-01

    The main aspects of background and human-body measurement data processing in the WBS-II portal β monitor are discussed. A theoretical analysis of setting the high and low background-warning thresholds in the data processing was carried out, and some of the relevant reference values were provided to the local executives. The measurement 'blind zone' and the overall warning function of the data processing are discussed, and the structure, the monitoring process and the microcomputer hardware of the WBS-II portal β monitor are briefly introduced

  10. Hybrid Pareto artificial bee colony algorithm for multi-objective single machine group scheduling problem with sequence-dependent setup times and learning effects.

    Science.gov (United States)

    Yue, Lei; Guan, Zailin; Saif, Ullah; Zhang, Fei; Wang, Hao

    2016-01-01

    Group scheduling is significant for an efficient and cost-effective production system. However, setup times exist between the groups, and these need to be reduced by sequencing the groups efficiently. The current research focuses on a sequence-dependent group scheduling problem with the aim of minimizing the makespan and the total weighted tardiness simultaneously. In most production scheduling problems, the processing time of jobs is assumed to be fixed. However, the actual processing time of jobs may be reduced due to the "learning effect". The integration of sequence-dependent group scheduling with learning effects has rarely been considered in the literature. Therefore, the current research considers a single machine group scheduling problem with sequence-dependent setup times and learning effects simultaneously. A novel hybrid Pareto artificial bee colony algorithm (HPABC), incorporating some steps of the genetic algorithm, is proposed for this problem to obtain Pareto solutions. Furthermore, five different sizes of test problems (small, small-medium, medium, large-medium, large) are tested using the proposed HPABC. The Taguchi method is used to tune the effective parameters of the proposed HPABC for each problem category. The performance of HPABC is compared with three well-known multi-objective optimization algorithms: the improved strength Pareto evolutionary algorithm (SPEA2), the non-dominated sorting genetic algorithm II (NSGAII) and the particle swarm optimization algorithm (PSO). Results indicate that HPABC outperforms SPEA2, NSGAII and PSO and gives better Pareto optimal solutions in terms of diversity and quality for almost all instances across the different problem sizes.
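    The comparison with SPEA2, NSGAII and PSO rests on Pareto dominance between (makespan, total weighted tardiness) pairs. A minimal, generic non-dominated filter is sketched below as a reading aid; the objective vectors are invented and the code is not part of HPABC.

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (minimisation of every objective)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical (makespan, total weighted tardiness) pairs.
solutions = [(120, 35), (118, 40), (125, 30), (118, 38), (130, 28)]
print(pareto_front(solutions))
```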

  11. Analysis of Issues for Project Scheduling by Multiple, Dispersed Schedulers (distributed Scheduling) and Requirements for Manual Protocols and Computer-based Support

    Science.gov (United States)

    Richards, Stephen F.

    1991-01-01

    Although computerization has realized significant gains in many areas of operations, one area, scheduling, has enjoyed few benefits from automation. The traditional methods of industrial engineering and operations research have not proven robust enough to handle the complexities associated with the scheduling of realistic problems. To address this need, NASA has developed the computer-aided scheduling system (COMPASS), a sophisticated, interactive scheduling tool that is in widespread use within NASA and the contractor community. However, COMPASS provides no explicit support for the large class of problems in which several people, perhaps at various locations, build separate schedules that share a common pool of resources. This research examines the issue of distributed scheduling, as applied to application domains characterized by the partial ordering of tasks, limited resources, and time restrictions. The focus of this research is on identifying issues related to distributed scheduling, locating applicable problem domains within NASA, and suggesting areas for ongoing research. The issues that this research identifies are goals, rescheduling requirements, database support, the need for communication and coordination among individual schedulers, the potential for expert system support for scheduling, and the possibility of integrating artificially intelligent schedulers into a network of human schedulers.

  12. Stochastic scheduling on unrelated machines

    NARCIS (Netherlands)

    Skutella, Martin; Sviridenko, Maxim; Uetz, Marc Jochen

    2013-01-01

    Two important characteristics encountered in many real-world scheduling problems are heterogeneous machines/processors and a certain degree of uncertainty about the actual sizes of jobs. The first characteristic entails machine dependent processing times of jobs and is captured by the classical

  13. Microcomputer-based workforce scheduling for hospital porters.

    Science.gov (United States)

    Lin, C K

    1999-01-01

    This paper focuses on labour scheduling for hospital porters, who are the major workforce providing routine cleansing of wards, transportation and messenger services. Generating an equitable monthly roster for porters while meeting the daily minimum demand is a tedious task that had been performed manually by a supervisor. Given the variety of constraints and goals, a manual schedule usually took seven to ten days to produce. To be in line with the strategic goal of scientific management at an acute care regional hospital in Hong Kong, a microcomputer-based algorithm was developed to schedule the monthly roster. The algorithm, coded in Digital Visual Fortran 5.0 Professional, could generate a monthly roster in seconds. Implementation has been carried out since September 1998 and the results proved to be useful to hospital administrators and porters. This paper discusses both the technical and human issues involved during the computerization process.

  14. Human Performance-Aware Scheduling and Routing of a Multi-Skilled Workforce

    Directory of Open Access Journals (Sweden)

    Maikel L. van Eck

    2017-10-01

    Full Text Available Planning human activities within business processes is often based on the same methods and algorithms as are used in the area of manufacturing systems. However, human behaviour is quite different from machine behaviour: people's performance depends on a number of factors, including workload, stress and personal preferences. In this article we describe an approach for scheduling the activities of people that takes into account business rules and dynamic human performance in order to optimise the schedule. We formally describe the scheduling problem we address and discuss how it can be constructed from inputs in the form of business process models and performance measurements. Finally, we discuss and evaluate an implementation of our planning approach to show the impact of considering dynamic human performance in scheduling.

  15. Baseline development, economic risk, and schedule risk: An integrated approach

    International Nuclear Information System (INIS)

    Tonkinson, J.A.

    1994-01-01

    The economic and schedule risks of Environmental Restoration (ER) projects are commonly analyzed toward the end of the baseline development process. Risk analysis is usually performed as the final element of the scheduling or estimating processes for the purpose of establishing cost and schedule contingency. However, there is an opportunity for earlier assessment of risks, during development of the technical scope and Work Breakdown Structure (WBS). Integrating the processes of risk management and baselining provides for early incorporation of feedback regarding schedule and cost risk into the proposed scope of work. Much of the information necessary to perform risk analysis becomes available during development of the technical baseline, as the scope of work and WBS are being defined. The analysis of risk can actually be initiated early on during development of the technical baseline and continue throughout development of the complete project baseline. Indeed, best business practices suggest that information crucial to the success of a project be analyzed and incorporated into project planning as soon as it is available and usable

  16. Accelerating exact schedulability analysis for fixed-priority pre-emptive scheduling

    NARCIS (Netherlands)

    Hang, Y.; Jiale, Z.; Keskin, U.; Bril, R.J.

    2010-01-01

    The schedulability analysis for fixed-priority preemptive scheduling (FPPS) plays a significant role in the real-time systems domain. The so-called Hyperplanes Exact Test (HET) [1] is an example of an exact schedulability test for FPPS. In this paper, we aim at improving the efficiency of HET by
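    The record is truncated, but the exact tests it refers to belong to the same family as classical response-time analysis for fixed-priority preemptive scheduling. A textbook-style sketch of that analysis (not HET itself) is shown below; the task set is invented and deadlines are assumed equal to periods.

```python
import math

# Hypothetical task set: (C, T) = (worst-case execution time, period),
# listed highest priority first, deadlines equal to periods.
TASKS = [(1, 4), (2, 6), (3, 12)]

def response_time(i, tasks):
    """Classical iterative response-time analysis for task i."""
    C, T = tasks[i]
    R = C
    while True:
        interference = sum(math.ceil(R / Tj) * Cj for Cj, Tj in tasks[:i])
        R_next = C + interference
        if R_next == R:
            return R
        if R_next > T:            # deadline (= period) missed
            return None
        R = R_next

for i, (C, T) in enumerate(TASKS):
    R = response_time(i, TASKS)
    status = "schedulable" if R is not None else "NOT schedulable"
    print(f"task {i}: C={C}, T={T}, R={R} -> {status}")
```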

  17. Split scheduling with uniform setup times.

    NARCIS (Netherlands)

    F. Schalekamp; R.A. Sitters (René); S.L. van der Ster; L. Stougie (Leen); V. Verdugo; A. van Zuylen

    2015-01-01

    We study a scheduling problem in which jobs may be split into parts, where the parts of a split job may be processed simultaneously on more than one machine. Each part of a job requires a setup time, however, on the machine where the job part is processed. During setup, a

  18. A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.

    Science.gov (United States)

    Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary

    2017-12-01

    Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that improves programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈ 43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching the HTGS implementation achieves similar performance to the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k size matrices, respectively.
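    HTGS itself is a C++ framework; purely to illustrate the dependency-management idea described above, the sketch below runs a small task graph on a thread pool, releasing each task only once all of its predecessors have finished. The graph, the task names, and the worker function are invented and have nothing to do with the HTGS API.

```python
from concurrent.futures import ThreadPoolExecutor
import threading

# Hypothetical task graph: task name -> set of prerequisite task names.
GRAPH = {
    "load":   set(),
    "fft":    {"load"},
    "filter": {"load"},
    "merge":  {"fft", "filter"},
    "save":   {"merge"},
}

def run_graph(graph, work, max_workers=4):
    """Run tasks on a thread pool, releasing each task only when its
    prerequisites are done (a tiny stand-in for real dependency management)."""
    finished, submitted = set(), set()
    cond = threading.Condition()

    def worker(task):
        work(task)
        with cond:
            finished.add(task)
            cond.notify_all()

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        with cond:
            while len(finished) < len(graph):
                for task, deps in graph.items():
                    if task not in submitted and deps <= finished:
                        submitted.add(task)
                        pool.submit(worker, task)
                cond.wait(timeout=0.1)   # re-check after each completion

run_graph(GRAPH, lambda t: print("running", t))
```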

  19. A hybrid electromagnetism-like algorithm for a multi-mode resource-constrained project scheduling problem

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Sadeghi

    2013-08-01

    Full Text Available In this paper, two different sub-problems are considered in order to solve a resource-constrained project scheduling problem (RCPSP), namely (i) the assignment of modes to tasks and (ii) the scheduling of these tasks in order to minimize the makespan of the project. The modified electromagnetism-like algorithm deals with the first problem and creates an assignment of modes to activities. This list is used to generate a project schedule. When a new assignment is made, it is necessary to fix all mode-dependent requirements of the project activities and to generate a random schedule with the serial SGS method. A local search then optimizes the sequence of the activities. Also in this paper, a new penalty function is proposed for solutions which are infeasible with respect to non-renewable resources. The performance of the proposed algorithm has been compared with the best algorithms published so far on the basis of CPU time and number-of-generated-schedules stopping criteria. The reported results indicate excellent performance of the algorithm.

  20. Putting ROSE to Work: A Proposed Application of a Request-Oriented Scheduling Engine for Space Station Operations

    Science.gov (United States)

    Jaap, John; Muery, Kim

    2000-01-01

    Scheduling engines are found at the core of software systems that plan and schedule activities and resources. A Request-Oriented Scheduling Engine (ROSE) is one that processes a single request (adding a task to a timeline) and then waits for another request. For the International Space Station, a robust ROSE-based system would support multiple, simultaneous users, each formulating requests (defining scheduling requirements), submitting these requests via the internet to a single scheduling engine operating on a single timeline, and immediately viewing the resulting timeline. ROSE is significantly different from the engine currently used to schedule Space Station operations. The current engine supports essentially one person at a time, with a pre-defined set of requirements from many payloads, working in either a "batch" scheduling mode or an interactive/manual scheduling mode. A planning and scheduling process that takes advantage of the features of ROSE could produce greater customer satisfaction at reduced cost and reduced flow time. This paper describes a possible ROSE-based scheduling process and identifies the additional software component required to support it. Resulting changes to the management and control of the process are also discussed.

  1. Advertisement scheduling on commercial radio station using genetics algorithm

    Science.gov (United States)

    Purnamawati, S.; Nababan, E. B.; Tsani, B.; Taqyuddin, R.; Rahmat, R. F.

    2018-03-01

    At commercial radio stations, advertisement scheduling is done manually, which makes the resulting schedules ineffective. Playback time consists of two types, prime time and regular time. The radio advertisement scheduling discussed in this research is based on an ad playback schedule between 5 am and 12 am, organized in 15-minute blocks. Each block provides 3 advertisement slots with a maximum playback duration of 1 minute per ad. If the radio broadcast time per day is 12 hours, then the maximum number of ads that can be aired per day is 76. A further rule concerns prime time, the hours in which listeners are most likely to tune in, namely 4 am - 8 am and 6 pm - 10 pm. The number of plays of the same ad per day is limited to 5 during prime time and 8 during regular time. The radio scheduling process is carried out using a genetic algorithm composed of chromosome initialization, selection, crossover and mutation. Each chromosome consists of 3 genes and is evaluated by a fitness value calculated from the number of rule violations it contains, where rule 1 states that the number of plays per ad must not exceed 5 per day and rule 2 states that the same ad must never be scheduled two or more times on the same day and time. After the fitness value of each chromosome is obtained, selection, crossover and mutation are performed. The result of this research is an optimal whole-day advertising schedule, with an average accuracy of 83.79%.
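    The fitness described above counts rule violations; a small, self-contained version of that idea is sketched below. The slot layout, the play-count cap and the interpretation of rule 2 are simplified guesses for illustration, not the paper's exact encoding.

```python
from collections import Counter

# A candidate schedule for one day: (hour, slot) -> ad id.
schedule = {
    (5, 0): "A", (5, 1): "B", (5, 2): "A",
    (6, 0): "A", (6, 1): "C", (6, 2): "B",
    (7, 0): "A", (7, 1): "A", (7, 2): "A",
}

MAX_PLAYS_PER_DAY = 5   # illustrative cap (rule 1)

def violations(schedule):
    count = 0
    # Rule 1: no ad may exceed its daily play limit.
    plays = Counter(schedule.values())
    count += sum(max(0, n - MAX_PLAYS_PER_DAY) for n in plays.values())
    # Rule 2 (simplified): the same ad must not occupy two slots in the same hour.
    per_hour = Counter((hour, ad) for (hour, _slot), ad in schedule.items())
    count += sum(n - 1 for n in per_hour.values() if n > 1)
    return count

def fitness(schedule):
    """Higher is better; a violation-free schedule gets fitness 1.0."""
    return 1.0 / (1.0 + violations(schedule))

print(violations(schedule), fitness(schedule))
```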

  2. Duality-based algorithms for scheduling on unrelated parallel machines

    NARCIS (Netherlands)

    van de Velde, S.L.; van de Velde, S.L.

    1993-01-01

    We consider the following parallel machine scheduling problem. Each of n independent jobs has to be scheduled on one of m unrelated parallel machines. The processing of job J_j on machine M_i requires an uninterrupted period of positive length p_ij. The objective is to find an assignment of

  3. Web Publishing Schedule

    Science.gov (United States)

    Section 207(f)(2) of the E-Gov Act requires federal agencies to develop an inventory of information to be published on their Web sites, establish a schedule for publishing that information, make those schedules available for public comment, and post the schedules on the Web site.

  4. Fractional distillation as a strategy for reducing the genotoxic potential of SRC-II coal liquids: a status report

    Energy Technology Data Exchange (ETDEWEB)

    Pelroy, R.A.; Wilson, B.W.

    1981-09-01

    This report presents results of studies on the effects of fractional distillation on the genotoxic potential of Solvent Refined Coal (SRC-II) liquids. SRC-II source materials and distilled liquids were provided by Pittsburg and Midway Coal Mining Co. Fractional distillations were conducted on products from the P-99 process development unit operating under conditions approximating those anticipated at the SRC-II demonstration facility. Distillation cuts were subjected to chemical fractionation, in vitro bioassay and initial chemical analysis. Findings are discussed as they relate to the temperature at which various distillate cuts were produced. This document is the first of two status reports scheduled for 1981 describing these studies.

  5. PLACEMENT APPLICATIONS SCHEDULING LECTURE IN INTERNATIONAL PROGRAM UNIKOM BASED ANDROID

    Directory of Open Access Journals (Sweden)

    Andri Sahata Sitanggang

    2017-12-01

    Full Text Available One of the factors that determines how well lectures run, especially at a university, is the mapping and scheduling of courses. The scheduling process includes the available class times, the available rooms, the lectures to be scheduled, and the schedules of the lecturers who will teach. Such scheduling is expected to make it easier for students and lecturers to obtain lecture schedule information. With the emergence of Android applications embedded in mobile phones, the public can now access the internet quickly and everywhere, so the researchers offer a technology-based solution by building an Android application, since this technology provides functions that make it easier for students and lecturers to access information. The application is built using the prototype method and consists of two access levels, user and admin: the user side comprises register, login and scheduling modules, while the admin side comprises login, register and course schedule management modules for both administrative staff and lecturers. The application is integrated with the internet so that it works in real time.

  6. Schedule Matters: Understanding the Relationship between Schedule Delays and Costs on Overruns

    Science.gov (United States)

    Majerowicz, Walt; Shinn, Stephen A.

    2016-01-01

    This paper examines the relationship between schedule delays and cost overruns on complex projects. It is generally accepted by many project practitioners that cost overruns are directly related to schedule delays. But what does "directly related to" actually mean? Some reasons or root causes for schedule delays and associated cost overruns are obvious, if only in hindsight. For example, unrealistic estimates, supply chain difficulties, insufficient schedule margin, technical problems, scope changes, or the occurrence of risk events can negatively impact schedule performance. Other factors driving schedule delays and cost overruns may be less obvious and more difficult to quantify. Examples of these less obvious factors include project complexity, flawed estimating assumptions, over-optimism, political factors, "black swan" events, or even poor leadership and communication. Indeed, is it even possible the schedule itself could be a source of delay and subsequent cost overrun? Through literature review, surveys of project practitioners, and the authors' own experience on NASA programs and projects, the authors will categorize and examine the various factors affecting the relationship between project schedule delays and cost growth. The authors will also propose some ideas for organizations to consider to help create an awareness of the factors which could cause or influence schedule delays and associated cost growth on complex projects.

  7. Tank waste processing analysis: Database development, tank-by-tank processing requirements, and examples of pretreatment sequences and schedules as applied to Hanford Double-Shell Tank Supernatant Waste - FY 1993

    International Nuclear Information System (INIS)

    Colton, N.G.; Orth, R.J.; Aitken, E.A.

    1994-09-01

    This report gives the results of work conducted in FY 1993 by the Tank Waste Processing Analysis Task for the Underground Storage Tank Integrated Demonstration. The main purpose of this task, led by Pacific Northwest Laboratory, is to demonstrate a methodology to identify processing sequences, i.e., the order in which a tank should be processed. In turn, these sequences may be used to assist in the development of time-phased deployment schedules. Time-phased deployment is implementation of pretreatment technologies over a period of time as technologies are required and/or developed. The work discussed here illustrates how tank-by-tank databases and processing requirements have been used to generate processing sequences and time-phased deployment schedules. The processing sequences take into account requirements such as the amount and types of data available for the tanks, tank waste form and composition, required decontamination factors, and types of compact processing units (CPUS) required and technology availability. These sequences were developed from processing requirements for the tanks, which were determined from spreadsheet analyses. The spreadsheet analysis program was generated by this task in FY 1993. Efforts conducted for this task have focused on the processing requirements for Hanford double-shell tank (DST) supernatant wastes (pumpable liquid) because this waste type is easier to retrieve than the other types (saltcake and sludge), and more tank space would become available for future processing needs. The processing requirements were based on Class A criteria set by the U.S. Nuclear Regulatory Commission and Clean Option goals provided by Pacific Northwest Laboratory

  8. Adapting planning and scheduling concepts to an engineering perspective: Key issues and successful techniques

    International Nuclear Information System (INIS)

    Finnegan, J.M.

    1986-01-01

    Traditional approaches to engineering planning are slanted toward the formats and interests of downstream implementation, and do not always consider the form and criticality of the front-end engineering development process. These processes and scopes are less defined and more subjective than most construction and operations tasks, and require flexible scheduling methods. This paper discusses the characteristics and requirements of engineering schedules, presents concepts for approaching planning in this field, illustrates simple methods for developing and analyzing engineering plans, and describes how to evaluate schedule performance. Engineering plans are structured into a schedule hierarchy which delineates appropriate control and responsibilities, and is governed by key evaluation and decision milestones. Schedule risk analysis considers the uncertainty of engineering tasks and critical resource constraints. Methods to evaluate schedule performance recognize that engineers and managers are responsible for adequate planning, forecasting and quality decisions, even if they cannot control all factors influencing schedule results

  9. Iterative Relay Scheduling with Hybrid ARQ under Multiple User Equipment (Type II) Relay Environments

    KAUST Repository

    Nam, Sung Sik; Alouini, Mohamed-Slim; Choi, Seyeong

    2018-01-01

    -generation cellular systems (e.g., LTE-Advanced and beyond). The proposed IRS-HARQ aims to increase the achievable data rate by iteratively scheduling a relatively better UE relay closer to the end user in a probabilistic sense, provided that the relay-to-end user

  10. Galaxy S II

    CERN Document Server

    Gralla, Preston

    2011-01-01

    Unlock the potential of Samsung's outstanding smartphone with this jargon-free guide from technology guru Preston Gralla. You'll quickly learn how to shoot high-res photos and HD video, keep your schedule, stay in touch, and enjoy your favorite media. Every page is packed with illustrations and valuable advice to help you get the most from the smartest phone in town. The important stuff you need to know: Get dialed in. Learn your way around the Galaxy S II's calling and texting features. Go online. Browse the Web, manage email, and download apps with Galaxy S II's 3G/4G network (or create you

  11. 77 FR 42467 - Special Local Regulations; Fajardo Offshore Festival II, Fajardo, PR

    Science.gov (United States)

    2012-07-19

    ... 1625-AA08 Special Local Regulations; Fajardo Offshore Festival II, Fajardo, PR AGENCY: Coast Guard, DHS... Festival II, a series of high-speed boat races. The event is scheduled to take place on Sunday, September... the Fajardo Offshore Festival II. C. Discussion of Proposed Rule On September 16, 2012, Puerto Rico...

  12. A Framework for Process Reengineering in Higher Education: A case study of distance learning exam scheduling and distribution

    Directory of Open Access Journals (Sweden)

    M'hammed Abdous

    2008-10-01

    Full Text Available In this paper, we propose a conceptual and operational framework for process reengineering (PR) in higher education (HE) institutions. Using a case study aimed at streamlining exam scheduling and distribution in a distance learning (DL) unit, we outline a sequential and non-linear four-step framework designed to reengineer processes. The first two steps of this framework – initiating and analyzing – are used to initiate, document, and flowchart the process targeted for reengineering, and the last two steps – reengineering/implementing and evaluating – are intended to prototype, implement, and evaluate the reengineered process. Our early involvement of all stakeholders, and our in-depth analysis and documentation of the existing process, allowed us to avoid the traditional pitfalls associated with business process reengineering (BPR). Consequently, the outcome of our case study indicates a streamlined and efficient process with higher faculty satisfaction at a substantial cost reduction.

  13. Brokdorf nuclear power station: Construction scheduling and deadline control

    International Nuclear Information System (INIS)

    Lembcke, U.D.F.

    1986-01-01

    Scheduling, especially deadline control, for all installations of the Brokdorf nuclear power station was carried out centrally by the Project Management of Kraftwerk Union AG. 130 timetables encompassing some 13,000 activities were handled, which were interconnected and linked to 189 timetables (approx. 18,000 activities) from various specialized sections by means of data processing systems. Much space in time scheduling was taken by controls of software processing, especially of the preliminary inspection documents in the piping sector and of working documents for construction management in the control systems area. (orig.) [de

  14. Processing time tolerance-based ACO algorithm for solving job-shop scheduling problem

    Science.gov (United States)

    Luo, Yabo; Waden, Yongo P.

    2017-06-01

    Ordinarily, the Job Shop Scheduling Problem (JSSP) is known as an NP-hard problem whose uncertainty and complexity cannot be handled by linear methods. Thus, current studies on the JSSP concentrate mainly on applying different methods of improving heuristics for optimizing the JSSP. However, many obstacles to efficient optimization of the JSSP remain, namely low efficiency and poor reliability, which can easily trap the optimization process in local optima. Therefore, to solve this problem, a study on an Ant Colony Optimization (ACO) algorithm combined with constraint handling tactics is carried out in this paper. The problem is subdivided into three parts: (1) analysis of processing time tolerance-based constraint features in the JSSP, performed by the constraint satisfying model; (2) satisfying the constraints by considering consistency technology and the constraint spreading algorithm in order to improve the performance of the ACO algorithm, from which the JSSP model based on the improved ACO algorithm is constructed; (3) demonstrating the effectiveness of the proposed method, in terms of reliability and efficiency, through comparative experiments performed on benchmark problems. Consequently, the results obtained by the proposed method are better, and the applied technique can be used in optimizing the JSSP.
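    Without reproducing the constraint-handling layer described above, the heart of any ACO scheduler is a pheromone-biased construction of a sequence followed by evaporation and deposit. A minimal single-machine version is sketched below; the job data, parameters, and objective are invented, and the consistency/constraint-spreading logic of the paper is deliberately omitted.

```python
import random

JOBS = [4, 2, 7, 3, 5]               # hypothetical processing times
N = len(JOBS)
ALPHA, RHO, Q = 1.0, 0.1, 100.0      # pheromone weight, evaporation, deposit

def total_completion(seq):
    t = total = 0
    for j in seq:
        t += JOBS[j]
        total += t
    return total

def construct(tau):
    """Build one sequence, picking each next job with probability
    proportional to the pheromone on its (position, job) entry."""
    seq, unvisited = [], list(range(N))
    for pos in range(N):
        weights = [tau[pos][j] ** ALPHA for j in unvisited]
        j = random.choices(unvisited, weights=weights)[0]
        seq.append(j)
        unvisited.remove(j)
    return seq

def aco(iterations=200, ants=10):
    tau = [[1.0] * N for _ in range(N)]      # pheromone matrix: position x job
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        for seq in (construct(tau) for _ in range(ants)):
            cost = total_completion(seq)
            if cost < best_cost:
                best, best_cost = seq, cost
        # evaporation plus deposit along the best-so-far sequence
        tau = [[(1 - RHO) * t for t in row] for row in tau]
        for pos, j in enumerate(best):
            tau[pos][j] += Q / best_cost
    return best, best_cost

print(aco())
```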

  15. Web-Based Requesting and Scheduling Use of Facilities

    Science.gov (United States)

    Yeager, Carolyn M.

    2010-01-01

    Automated User's Training Operations Facility Utilization Request (AutoFUR) is prototype software that administers a Web-based system for requesting and allocating facilities and equipment for astronaut-training classes in conjunction with scheduling the classes. AutoFUR also has potential for similar use in such applications as scheduling flight-simulation equipment and instructors in commercial airplane-pilot training, managing preventive-maintenance facilities, and scheduling operating rooms, doctors, nurses, and medical equipment for surgery. Whereas requesting and allocation of facilities was previously a manual process that entailed examination of documents (including paper drawings) from different sources, AutoFUR partly automates the process and makes all of the relevant information available via the requester's computer. By use of AutoFUR, an instructor can fill out a facility-utilization request (FUR) form on line, consult the applicable flight manifest(s) to determine what equipment is needed and where it should be placed in the training facility, reserve the corresponding hardware listed in a training-hardware inventory database, search for alternative hardware if necessary, submit the FUR for processing, and cause paper forms to be printed. AutoFUR also maintains a searchable archive of prior FURs.

  16. A Procedure for scheduling and setting processing priority of MC requests

    CERN Document Server

    Balcar, Stepan

    2013-01-01

    My project involves designing and programming the basis of an open system that should help with scheduling the Monte Carlo production requests needed by CMS physicists for data analysis within the CMS collaboration. A primary requirement was to create a web interface that would be portable and independent of the control logic of the system. Another goal of the project was to make a scheduler for Monte Carlo production planning and to design and program interfaces between the various logical blocks of the system. Introduction: Many research groups at CERN that specialize in different areas of particle physics work with CMS. They are mostly scientists working at universities or research institutes in their countries. Their research consists of constructing models of elementary particles and subsequently verifying the behaviour of these models experimentally. All these groups of people create MC production requests which are to be executed using computing resources located at CERN and other institutes. T...

  17. 29 CFR 825.203 - Scheduling of intermittent or reduced schedule leave.

    Science.gov (United States)

    2010-07-01

    ... leave intermittently or on a reduced leave schedule for planned medical treatment, then the employee... 29 Labor 3 2010-07-01 2010-07-01 false Scheduling of intermittent or reduced schedule leave. 825... OF LABOR OTHER LAWS THE FAMILY AND MEDICAL LEAVE ACT OF 1993 Employee Leave Entitlements Under the...

  18. Residency Applicants Prefer Online System for Scheduling Interviews

    Directory of Open Access Journals (Sweden)

    Wills, Charlotte

    2015-03-01

    Full Text Available Introduction: Residency coordinators may be overwhelmed when scheduling residency interviews. Applicants often have to coordinate interviews with multiple programs at once, and relying on verbal or email confirmation may delay the process. Our objective was to determine applicant mean time to schedule and satisfaction using online scheduling. Methods: This pilot study is a retrospective analysis performed on a sample of applicants offered interviews at an urban county emergency medicine residency. Applicants were asked their estimated time to schedule with the online system compared to their average time using other methods. In addition, they were asked on a five-point anchored scale to rate their satisfaction. Results: Of 171 applicants, 121 completed the survey (70.8%). Applicants were scheduling an average of 13.3 interviews. Applicants reported scheduling interviews using the online system in a mean of 46.2 minutes (median 10, range 1-1800) from the interview offer, as compared with a mean of 320.2 minutes (median 60, range 3-2880) for other programs not using this system. This difference was statistically significant. In addition, applicants were more likely to rate their satisfaction using the online system as “satisfied” (83.5% vs 16.5%). Applicants were also more likely to state that they preferred scheduling their interviews using the online system rather than the way other programs scheduled interviews (74.2% vs 4.1%) and that the online system aided them in coordinating travel arrangements (52.1% vs 4.1%). Conclusion: An online interview scheduling system is associated with higher satisfaction among applicants both in coordinating travel arrangements and in overall satisfaction. [West J Emerg Med. 2015;16(2):352-354.]

  19. FLOWSHOP SCHEDULING USING A NETWORK APPROACH ...

    African Journals Online (AJOL)

    eobe

    time when the last job completes on the last machine. The objective ... more jobs in a permutation flow shop scheduling problem ... processing time of a job on a machine is zero, it ..... hybrid flow shops with sequence dependent setup times ...

  20. CO2 laser free-form processing of hard tissue

    Science.gov (United States)

    Werner, Martin; Klasing, Manfred; Ivanenko, Mikhail; Harbecke, Daniela; Steigerwald, Hendrik; Hering, Peter

    2007-07-01

    Drilling and surface processing of bone and tooth tissue belong to standard medical procedures (bores and embeddings for implants, trepanation etc.). Small circular bores can generally be produced quickly with mechanical drills. However, problems arise with angled drilling, with the need to execute drilling procedures without damaging sensitive soft-tissue structures underneath the bone, or with the attempt to mill small non-circular cavities in hard tissue with high precision. We present investigations on laser hard tissue "milling", which can be advantageous for solving these problems. The processing of bone is done with a CO2 laser (10.6 μm) with pulse durations of 50 - 100 μs, combined with a PC-controlled fast galvanic laser beam scanner and a fine water spray, which helps keep the ablation process effective and free of thermal side-effects. Laser "milling" of non-circular cavities 1 - 4 mm wide and about 10 mm deep can be especially interesting for dental implantology. In ex-vivo investigations we found conditions for fast laser processing of these cavities without thermal damage and with minimised tapering. This included the exploration of different filling patterns (concentric rings, crosshatch, parallel lines, etc.), definition of the maximal pulse duration, repetition rate and laser power, and the optimal water spray position. The optimised results give evidence for the applicability of pulsed CO2 lasers for biologically tolerable and effective processing of deep cavities in hard tissue.

  1. Identification of new fluorescence processes in the UV spectra of cool stars from new energy levels of Fe II and Cr II

    Science.gov (United States)

    Johansson, Sveneric; Carpenter, Kenneth G.

    1988-01-01

    Two fluorescence processes operating in atmospheres of cool stars, symbiotic stars, and the Sun are presented. Two emission lines, at 1347.03 and 1360.17 A, are identified as fluorescence lines of Cr II and Fe II. The lines are due to transitions from highly excited levels, which are populated radiatively by the hydrogen Lyman alpha line due to accidental wavelength coincidences. Three energy levels, one in Cr II and two in Fe II, are reported.

  2. Review process and quality assurance in the EBR-II probabilistic risk assessment

    International Nuclear Information System (INIS)

    Roglans, J.; Hill, D.J.; Ragland, W.A.

    1992-01-01

    A Probabilistic Risk Assessment (PRA) of the Experimental Breeder Reactor II (EBR-II), a Department of Energy (DOE) Category A reactor, has recently been completed at Argonne National Laboratory (ANL). Within the scope of the ANL QA Programs, a QA Plan specifically for the EBR-II PRA was developed. The QA Plan covered all aspects of the PRA development, with emphasis on the procedures for document and software control and on the internal and external review process. The effort spent on the quality assurance tasks for the EBR-II PRA has been repaid by acceptance of the work and confidence in the quality of the results

  3. Research on the ITOC based scheduling system for ship piping production

    Science.gov (United States)

    Li, Rui; Liu, Yu-Jun; Hamada, Kunihiro

    2010-12-01

    Manufacturing of ship piping systems is one of the major production activities in shipbuilding. The schedule of pipe production has an important impact on the master schedule of shipbuilding. In this research, the ITOC concept was introduced to solve the scheduling problems of a piping factory, and an intelligent scheduling system was developed. The system, in which a product model, an operation model, a factory model, and a knowledge database of piping production were integrated, automated the planning process and production scheduling. Details of the above points were discussed. Moreover, an application of the system in a piping factory, which achieved a higher level of performance as measured by tardiness, lead time, and inventory, was demonstrated.

  4. An Automatic Course Scheduling Approach Using Instructors' Preferences

    Directory of Open Access Journals (Sweden)

    Hossam Faris

    2012-03-01

    Full Text Available The University Course Timetabling (UCT) problem has been extensively researched in the last decade, and numerous approaches have been proposed to solve it. This paper proposes a new approach to arranging a sequence of meetings between instructors, rooms, and students in predefined periods of time while satisfying a set of constraints of various types. In addition, this paper proposes a new representation for course timetabling that keeps each time slot conflict-free and mines instructor preferences from previous schedules to avoid undesirable times for instructors. Experiments on real data showed that the approach achieved an increased satisfaction degree for each instructor and gave feasible schedules satisfying all hard constraints during construction. The generated schedules have high satisfaction degrees compared with schedules created manually. The experiments were conducted on data gathered from the computer science department and other related departments at Jordan University of Science and Technology, Jordan.

  5. When drugs in the same controlled substance schedule differ in real-world abuse, should they be differentiated in labeling?

    Science.gov (United States)

    Dasgupta, Nabarun; Henningfield, Jack E; Ertischek, Michelle D; Schnoll, Sidney H

    2011-12-01

    The prescription drugs regulated in the most restrictive controlled substance schedule for those with an approved therapeutic use vary widely in their real world risk of abuse and harm. Opioid analgesics have the highest rates of abuse, overdose death, drug abuse treatment needs and societal costs in comparison to other Schedule II drugs. Stimulants for attention-deficit/hyperactivity disorders (ADHD) account for substantially lower rates of abuse, harm, and public health impact. The scheduling of drugs is determined by the World Health Organization, the United States Food and Drug Administration, and other regulatory agencies, through a quasi-public process that relies heavily on pre-marketing studies that are conducted in highly controlled clinical settings. We propose that it is increasingly in the interest of science-based regulation and public health to recognize and communicate differences among drugs based on their real-world abuse and public health harm using surveillance data. Appropriate differentiation through labeling of drugs that will likely remain in the same schedule could provide powerful incentives for drug development and research, would aid prescriber/patient decision making by informing them of real differences across drugs within a schedule, and may also contribute to public health efforts to reduce drug abuse. There are risks of course, that include inadvertent perceptions that drugs labeled to be lower in risk are not taken as seriously as others in the same category. Challenges such as these, however, can be overcome and should not serve as barriers to objective communications regarding a drug's actual risks. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  6. Fears of Children in the United States: An Examination of the American Fear Survey Schedule with 20 New Contemporary Fear Items

    Science.gov (United States)

    Burnham, Joy J.

    2005-01-01

    Twenty contemporary fears (e.g., terrorist attacks, drive-by shootings, having to fight in a war) were added to E. Gullone and N. J. King's (1992) Australian Fear Survey Schedule for Children-II for use in the United States. The revised survey, the American Fear Survey Schedule for Children (J. J. Burnham, 1995), was investigated. The component…

  7. Computerized transportation model for the NRC Physical Protection Project. Versions I and II

    International Nuclear Information System (INIS)

    Anderson, G.M.

    1978-01-01

    Details on two versions of a computerized model for the transportation system of the NRC Physical Protection Project are presented. The Version I model permits scheduling of all types of transport units associated with a truck fleet, including truck trailers, truck tractors, escort vehicles and crews. A fixed-fleet itinerary construction process is used in which iterations on fleet size are required until the service requirements are satisfied. The Version II model adds an aircraft mode capability and provides for a more efficient non-fixed-fleet itinerary generation process. Test results using both versions are included

  8. The role of the production scheduling system in rescheduling

    Science.gov (United States)

    Kalinowski, K.; Grabowik, C.; Kempa, W.; Paprocka, I.

    2015-11-01

    The paper presents the rescheduling problem in the context of cooperation between a production scheduling system (PSS) and other units in an integrated manufacturing environment - decision makers and software systems. The main aim is to discuss the PSS functionality needed for maximizing automation of the rescheduling process, reducing the response time and improving the quality of the generated solutions. PSSs operate at the meeting point of the tactical and operational levels of planning and control, and play an important role in production preparation and control. On the basis of information about orders, technology and the production system state (e.g. resource availability), they prepare and/or update a detailed plan of production flow - a schedule. All necessary data for scheduling and rescheduling are usually collected in other systems of both organizational and technical production preparation, e.g. ERP, PLM, MES, CAPP or others, or they are entered directly by the decision-makers/operators. Data acquired in this way are often incomplete and inconsistent. Therefore, existing rescheduling software works interactively - it supports rather than replaces the human decision maker in task planning. When rescheduling, this interaction is particularly important because of the limited time available for making a decision. An additional problem arises in data acquisition, in exchanging data between systems, and in identifying new data sources and processing them. Different approaches to rescheduling are characterized, including solutions where all these operations are carried out by an autonomous system and those in which scheduling is performed only upon external request, for newly created scheduling data representing the current state of the production system.

  9. Multimodal Processes Rescheduling

    DEFF Research Database (Denmark)

    Bocewicz, Grzegorz; Banaszak, Zbigniew A.; Nielsen, Peter

    2013-01-01

    Cyclic scheduling problems concerning multimodal processes are usually observed in FMSs producing multi-type parts, where the Automated Guided Vehicles System (AGVS) plays the role of a material handling system. Schedulability analysis of concurrently flowing cyclic processes (SCCP) executed in the...

  10. The R-Shell approach - Using scheduling agents in complex distributed real-time systems

    Science.gov (United States)

    Natarajan, Swaminathan; Zhao, Wei; Goforth, Andre

    1993-01-01

    Large, complex real-time systems such as space and avionics systems are extremely demanding in their scheduling requirements. Current OS design approaches are quite limited in the capabilities they provide for task scheduling. Typically, they simply implement a particular uniprocessor scheduling strategy and do not provide any special support for network scheduling, overload handling, fault tolerance, distributed processing, etc. Our design of the R-Shell real-time environment facilitates the implementation of a variety of sophisticated but efficient scheduling strategies, including incorporation of all these capabilities. This is accomplished by the use of scheduling agents which reside in the application run-time environment and are responsible for coordinating the scheduling of the application.

  11. Exploring a QoS Driven Scheduling Approach for Peer-to-Peer Live Streaming Systems with Network Coding

    Science.gov (United States)

    Cui, Laizhong; Lu, Nan; Chen, Fu

    2014-01-01

    Most large-scale peer-to-peer (P2P) live streaming systems use a mesh to organize peers and leverage pull scheduling to transmit packets, providing robustness in dynamic environments. Pull scheduling, however, brings large packet delays. Network coding makes push scheduling feasible in mesh P2P live streaming and improves its efficiency, but it may also introduce some extra delay and coding computational overhead. To improve packet delay, streaming quality, and coding overhead, we propose a QoS-driven push scheduling approach in this paper. The main contributions of this paper are as follows: (i) we introduce a new network coding method to increase the content diversity and reduce the complexity of scheduling; (ii) we formulate push scheduling as an optimization problem and transform it into a min-cost flow problem that can be solved in polynomial time; (iii) we propose a push scheduling algorithm to reduce the coding overhead and conduct extensive experiments to validate the effectiveness of our approach. Compared with previous approaches, the simulation results demonstrate that the packet delay, continuity index, and coding ratio of our system can be significantly improved, especially in dynamic environments. PMID:25114968
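    The reduction to min-cost flow mentioned in contribution (ii) can be mimicked with an off-the-shelf solver. The toy instance below uses networkx (assumed to be installed); the graph, demands, capacities, and edge costs are invented and only illustrate the shape of such a formulation, not the paper's actual model.

```python
import networkx as nx

# Toy push-scheduling flow network: a source of coded packets, two relay
# peers with limited upload capacity, and two receivers that each demand
# one packet unit. Edge 'weight' stands in for expected delay.
G = nx.DiGraph()
G.add_node("src", demand=-2)                 # supplies two packet units
G.add_node("recv1", demand=1)
G.add_node("recv2", demand=1)
G.add_edge("src", "peerA", capacity=2, weight=1)
G.add_edge("src", "peerB", capacity=1, weight=2)
G.add_edge("peerA", "recv1", capacity=1, weight=1)
G.add_edge("peerA", "recv2", capacity=1, weight=3)
G.add_edge("peerB", "recv2", capacity=1, weight=1)

flow = nx.min_cost_flow(G)                   # dict of dicts: node -> node -> units
print(flow)
print("cost:", nx.cost_of_flow(G, flow))
```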

  12. 1170-MW(t) HTGR-PS/C plant application study report: SRC-II process application

    International Nuclear Information System (INIS)

    Rao, R.; McMain, A.T. Jr.

    1981-05-01

    The solvent refined coal (SRC-II) process is an advanced process being developed by Gulf Mineral Resources Ltd. (a Gulf Oil Corporation subsidiary) to produce a clean, non-polluting liquid fuel from high-sulfur bituminous coals. The SRC-II commercial plant will process about 24,300 tonnes (26,800 tons) of feed coal per stream day, producing primarily fuel oil plus secondary fuel gases. This summary report describes the integration of a high-temperature gas-cooled reactor operating in a process steam/cogeneration mode (HTGR-PS/C) to provide the energy requirements for the SRC-II process. The HTGR-PS/C plant was developed by General Atomic Company (GA) specifically for industries which require energy in the form of both steam and electricity. General Atomic has developed an 1170-MW(t) HTGR-PS/C design which is particularly well suited to industrial applications and is expected to have excellent cost benefits over other sources of energy

  13. A genetic algorithm-based job scheduling model for big data analytics.

    Science.gov (United States)

    Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei

    Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework, which implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. The existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes high energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
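
    As a rough illustration of the genetic-algorithm idea (not the authors' model), the sketch below evolves an execution order for a handful of analytics jobs so that the estimated total completion time shrinks; the runtime estimates are invented stand-ins for the paper's cluster-performance estimation module.

    ```python
    # Toy GA that evolves a job order minimizing estimated total completion time;
    # the runtime estimates stand in for the paper's cluster-performance estimation module.
    import random

    est_runtime = {"j1": 40, "j2": 10, "j3": 25, "j4": 5, "j5": 60}   # assumed minutes
    jobs = list(est_runtime)

    def total_completion_time(order):
        t, total = 0, 0
        for j in order:            # jobs run back-to-back in the given order
            t += est_runtime[j]
            total += t             # sum of completion times
        return total

    def crossover(a, b):           # keep a prefix of parent a, fill the rest from parent b
        cut = random.randint(1, len(a) - 1)
        head = a[:cut]
        return head + [j for j in b if j not in head]

    def mutate(order, rate=0.2):   # occasionally swap two positions
        order = order[:]
        if random.random() < rate:
            i, k = random.sample(range(len(order)), 2)
            order[i], order[k] = order[k], order[i]
        return order

    def genetic_schedule(pop_size=30, generations=100):
        pop = [random.sample(jobs, len(jobs)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=total_completion_time)
            elite = pop[: pop_size // 2]                  # keep the better half
            children = [mutate(crossover(*random.sample(elite, 2)))
                        for _ in range(pop_size - len(elite))]
            pop = elite + children
        return min(pop, key=total_completion_time)

    print(genetic_schedule())      # tends toward a shortest-job-first order
    ```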

  14. An Extended Flexible Job Shop Scheduling Model for Flight Deck Scheduling with Priority, Parallel Operations, and Sequence Flexibility

    Directory of Open Access Journals (Sweden)

    Lianfei Yu

    2017-01-01

    Full Text Available Efficient scheduling for the supporting operations of aircraft on the flight deck is critical to the aircraft carrier, and even several seconds' improvement may lead to a totally converse outcome of a battle. In this paper, we ameliorate the supporting operations of carrier-based aircraft and investigate three simultaneous operation relationships during the supporting process, including precedence constraints, parallel operations, and sequence flexibility. Furthermore, multifunctional aircraft have to take off synergistically and participate in a combat cooperatively. However, their takeoff order must be prioritized during the scheduling period in accordance with certain operational regulations. To efficiently prioritize the takeoff order while minimizing the total time budget of the whole takeoff duration, we propose a novel mixed integer linear programming (MILP) formulation for the flight deck scheduling problem. Motivated by the hardness of the MILP, we design an improved differential evolution algorithm combined with typical local search strategies to improve computational efficiency. We numerically compare the performance of our algorithm with the classical genetic algorithm and the normal differential evolution algorithm, and the results show that our algorithm obtains better scheduling schemes that can meet both the operational relations and the takeoff priority requirements.
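
    The paper's MILP and improved differential evolution are not reproduced in the abstract, so the sketch below only illustrates the general random-key idea: a continuous vector is decoded into a takeoff order and priority violations are penalized. Aircraft counts, support times, and the priority rule are assumptions, and SciPy's stock differential_evolution is used in place of the authors' improved variant.

    ```python
    # Toy random-key decoding for prioritized takeoff sequencing; SciPy's stock
    # differential_evolution stands in for the paper's improved variant.
    import numpy as np
    from scipy.optimize import differential_evolution

    prep_time = np.array([12.0, 8.0, 15.0, 10.0])   # assumed per-aircraft support times
    priority = np.array([2, 1, 2, 1])               # assumed rule: 1 must take off before 2

    def decode(keys):
        return np.argsort(keys)                     # smaller key -> earlier takeoff

    def objective(keys):
        order = decode(keys)
        finish = np.cumsum(prep_time[order])        # takeoff completion times in sequence
        ranks = np.empty(len(order))
        ranks[order] = np.arange(len(order))        # position of each aircraft in the order
        # penalize any priority-2 aircraft scheduled before a priority-1 aircraft
        penalty = sum(100.0 for i in range(len(order)) for j in range(len(order))
                      if priority[i] < priority[j] and ranks[i] > ranks[j])
        return finish.sum() + penalty               # simple stand-in for the time budget

    result = differential_evolution(objective, bounds=[(0, 1)] * len(prep_time), seed=0)
    print(decode(result.x))    # priority-1 aircraft come first in the decoded order
    ```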

  15. Constraint-based scheduling applying constraint programming to scheduling problems

    CERN Document Server

    Baptiste, Philippe; Nuijten, Wim

    2001-01-01

    Constraint Programming is a problem-solving paradigm that establishes a clear distinction between two pivotal aspects of a problem: (1) a precise definition of the constraints that define the problem to be solved and (2) the algorithms and heuristics enabling the selection of decisions to solve the problem. It is because of these capabilities that Constraint Programming is increasingly being employed as a problem-solving tool to solve scheduling problems. Hence the development of Constraint-Based Scheduling as a field of study. The aim of this book is to provide an overview of the most widely used Constraint-Based Scheduling techniques. Following the principles of Constraint Programming, the book consists of three distinct parts: The first chapter introduces the basic principles of Constraint Programming and provides a model of the constraints that are the most often encountered in scheduling problems. Chapters 2, 3, 4, and 5 are focused on the propagation of resource constraints, which usually are responsibl...
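
    A minimal flavor of what such propagation techniques do, under invented durations and precedence constraints, is sketched below: precedence relations are propagated until the earliest-start windows stop tightening. Real constraint-based schedulers use far stronger resource-constraint propagation (edge-finding, energetic reasoning), so this is only an illustration.

    ```python
    # Toy propagation of precedence constraints: earliest-start windows are tightened
    # until a fixpoint is reached (durations and precedences are invented).
    duration = {"A": 3, "B": 2, "C": 4, "D": 1}
    precedes = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]

    earliest_start = {t: 0 for t in duration}
    changed = True
    while changed:                                   # fixpoint iteration
        changed = False
        for before, after in precedes:
            bound = earliest_start[before] + duration[before]
            if bound > earliest_start[after]:        # tighten the time window
                earliest_start[after] = bound
                changed = True

    print(earliest_start)   # {'A': 0, 'B': 3, 'C': 3, 'D': 7}
    ```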

  16. Scheduling for decommissioning projects

    International Nuclear Information System (INIS)

    Podmajersky, O.E.

    1987-01-01

    This paper describes the Project Scheduling system being employed by the Decommissioning Operations Contractor at the Shippingport Station Decommissioning Project (SSDP). Results from the planning system show that the project continues to achieve its cost and schedule goals. An integrated cost and schedule control system (C/SCS) which uses the concept of earned value for measurement of performance was instituted in accordance with DOE orders. The schedule and cost variances generated by the C/SCS system are used to confirm management's assessment of project status. This paper describes the types of schedules and tools used on the SSDP project to plan and monitor the work, and identifies factors that are unique to a decommissioning project that make scheduling critical to the achievement of the project's goals. 1 fig

  17. Program reference schedule baseline

    International Nuclear Information System (INIS)

    1986-07-01

    This Program Reference Schedule Baseline (PRSB) provides the baseline Program-level milestones and associated schedules for the Civilian Radioactive Waste Management Program. It integrates all Program-level schedule-related activities. This schedule baseline will be used by the Director, Office of Civilian Radioactive Waste Management (OCRWM), and his staff to monitor compliance with Program objectives. Chapter 1 includes brief discussions concerning the relationship of the PRSB to the Program Reference Cost Baseline (PRCB), the Mission Plan, the Project Decision Schedule, the Total System Life Cycle Cost report, the Program Management Information System report, the Program Milestone Review, annual budget preparation, and system element plans. Chapter 2 includes the identification of all Level 0, or Program-level, milestones, while Chapter 3 presents and discusses the critical path schedules that correspond to those Level 0 milestones

  18. Second-order schedules of token reinforcement with pigeons: effects of fixed- and variable-ratio exchange schedules.

    Science.gov (United States)

    Foster, T A; Hackenberg, T D; Vaidya, M

    2001-09-01

    Pigeons' key pecks produced food under second-order schedules of token reinforcement, with light-emitting diodes serving as token reinforcers. In Experiment 1, tokens were earned according to a fixed-ratio 50 schedule and were exchanged for food according to either fixed-ratio or variable-ratio exchange schedules, with schedule type varied across conditions. In Experiment 2, schedule type was varied within sessions using a multiple schedule. In one component, tokens were earned according to a fixed-ratio 50 schedule and exchanged according to a variable-ratio schedule. In the other component, tokens were earned according to a variable-ratio 50 schedule and exchanged according to a fixed-ratio schedule. In both experiments, the number of responses per exchange was varied parametrically across conditions, ranging from 50 to 400 responses. Response rates decreased systematically with increases in the fixed-ratio exchange schedules, but were much less affected by changes in the variable-ratio exchange schedules. Response rates were consistently higher under variable-ratio exchange schedules than under comparable fixed-ratio exchange schedules, especially at higher exchange ratios. These response-rate differences were due both to greater pre-ratio pausing and to lower local rates under the fixed-ratio exchange schedules. Local response rates increased with proximity to food under the higher fixed-ratio exchange schedules, indicative of discriminative control by the tokens.

  19. Single-machine scheduling of proportionally deteriorating jobs by two agents

    OpenAIRE

    S Gawiejnowicz; W-C Lee; C-L Lin; C-C Wu

    2011-01-01

    We consider a problem of scheduling a set of independent jobs by two agents on a single machine. Every agent has its own subset of jobs to be scheduled and uses its own optimality criterion. The processing time of each job proportionally deteriorates with respect to the starting time of the job. The problem is to find a schedule that minimizes the total tardiness of the first agent, provided that no tardy job is allowed for the second agent. We prove basic properties of the problem and give a...

  20. An Improved Genetic Algorithm for Single-Machine Inverse Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Jianhui Mou

    2014-01-01

    Full Text Available The goal of scheduling is to arrange operations on suitable machines with an optimal sequence for corresponding objectives. In order to meet market requirements, scheduling systems must have enough flexibility against uncertain events. These events can change production status or processing parameters, and may even cause the original schedule to no longer be optimal or even feasible. Traditional scheduling strategies, however, cannot cope with these cases. Therefore, a new idea of scheduling called inverse scheduling has been proposed. In this paper, the single-machine inverse scheduling problem with weighted completion time (SMISP) is considered in a single-machine shop environment, and an improved genetic algorithm (IGA) with a local searching strategy is proposed. To improve the performance of the IGA, an efficient encoding scheme, a fitness evaluation mechanism, feasible initialization methods, and a local search procedure are employed. Because of the local improvement method, the proposed IGA can balance its exploration ability and exploitation ability. We adopt 27 instances to verify the effectiveness of the proposed algorithm. The experimental results illustrate that the proposed algorithm can generate satisfactory solutions. This approach has also been applied to solve the scheduling problem in a real Chinese shipyard and can bring some benefits.

  1. Time-critical multirate scheduling using contemporary real-time operating system services

    Science.gov (United States)

    Eckhardt, D. E., Jr.

    1983-01-01

    Although real-time operating systems provide many of the task control services necessary to process time-critical applications (i.e., applications with fixed, invariant deadlines), it may still be necessary to provide a scheduling algorithm at a level above the operating system in order to coordinate a set of synchronized, time-critical tasks executing at different cyclic rates. This paper examines the scheduling requirements for such applications and develops scheduling algorithms using services provided by contemporary real-time operating systems.
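
    One common way to coordinate synchronized tasks running at different cyclic rates, which this kind of work builds on, is a dispatch table computed over the hyperperiod. The sketch below uses invented task names and periods; it is not the scheduling algorithm developed in the paper.

    ```python
    # Toy dispatch table for synchronized tasks at different cyclic rates, built over
    # one hyperperiod (task names and periods are invented).
    from functools import reduce
    from math import gcd

    periods_ms = {"guidance": 40, "navigation": 80, "display": 200}

    def lcm(a, b):
        return a * b // gcd(a, b)

    hyperperiod = reduce(lcm, periods_ms.values())   # 400 ms for these rates
    minor_frame = reduce(gcd, periods_ms.values())   # 40 ms minor cycle

    # For each minor frame, list the tasks released at that instant.
    table = {tick: [task for task, p in periods_ms.items() if tick % p == 0]
             for tick in range(0, hyperperiod, minor_frame)}

    for tick, released in table.items():
        print(f"t={tick:3d} ms: run {released}")
    ```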

  2. Perceptions of randomized security schedules.

    Science.gov (United States)

    Scurich, Nicholas; John, Richard S

    2014-04-01

    Security of infrastructure is a major concern. Traditional security schedules are unable to provide omnipresent coverage; consequently, adversaries can exploit predictable vulnerabilities to their advantage. Randomized security schedules, which randomly deploy security measures, overcome these limitations, but public perceptions of such schedules have not been examined. In this experiment, participants were asked to make a choice between attending a venue that employed a traditional (i.e., search everyone) or a random (i.e., a probability of being searched) security schedule. The absolute probability of detecting contraband was manipulated (i.e., 1/10, 1/4, 1/2) but equivalent between the two schedule types. In general, participants were indifferent to either security schedule, regardless of the probability of detection. The randomized schedule was deemed more convenient, but the traditional schedule was considered fairer and safer. There were no differences between traditional and random schedule in terms of perceived effectiveness or deterrence. Policy implications for the implementation and utilization of randomized schedules are discussed. © 2013 Society for Risk Analysis.

  3. U-target irradiation at FRM II aiming the production of Mo-99 - A feasibility study

    International Nuclear Information System (INIS)

    Gerstenberg, H.; Mueller, C.; Neuhaus, I.; Roehrmoser, A.

    2010-01-01

    Following the shortage in radioisotope availability, the Technische Universitaet Muenchen and the Belgian Institut National des Radioelements conducted a joint study on the suitability of the FRM II reactor for the generation of Mo-99 as a fission product. A suitable irradiation channel was determined and neutronic calculations resulted in sufficiently high neutron flux densities to make FRM II a promising candidate for Mo-99 production. In addition, the feasibility study provides thermohydraulic calculations as input for the design and integration of the additional cooling circuit into the existing heat removal systems of FRM II. The required in-house processes for a regular uranium target irradiation programme have been defined and necessary upgrades identified. Finally, the required investment cost was estimated and a possible time schedule was given. (author)

  4. NMR investigation of dynamic processes in complexes of nickel(II) and zinc(II) with iminodiacetate, n-methyliminodiacetate and n-ethyliminodiacetate

    International Nuclear Information System (INIS)

    Wagner, M.R.

    1985-11-01

    Analysis of oxygen-17 bulk water relaxation rates with an aqueous solution of 1:1 Ni(II):ida reveals that two rate-limiting processes are involved in solvent exchange. Analysis of carbon-13 longitudinal relaxation rates of the bis-ligand complexes with zinc(II) is used to determine molecular tumbling rates and methyl rotation rates. The carbon-13 transverse relaxation rates for the carbons in the bis-ligand complex with Ni(II) are adequately fitted to the Solomon-Bloembergen equation. Three carboxylate carbon peaks are seen in the ¹³C spectrum of the 1:2 Ni(II):ida complex, which coalesce into a single peak above about 360 K. The mechanism and rate of ligand exchange are determined for the complexes Zn(II)L₂²⁻ (L = mida, eida) in aqueous solution by total lineshape analysis of the proton spectrum at 500 MHz

  5. Distributed continuous energy scheduling for dynamic virtual power plants

    International Nuclear Information System (INIS)

    Niesse, Astrid

    2015-01-01

    This thesis presents DynaSCOPE, a distributed control method for continuous energy scheduling in dynamic virtual power plants (DVPPs). DVPPs aggregate the flexibility of distributed energy units to address current energy markets. As an extension of the Virtual Power Plant concept, they show high dynamics in the aggregation and operation of energy units. Whereas operation schedules are set up for all energy units in a day-ahead planning procedure, incidents such as deviations from prognoses or outages may render these schedules infeasible during execution. Thus, a continuous scheduling process is needed to ensure product fulfillment. With DynaSCOPE, software agents representing single energy units solve this problem in a completely distributed heuristic approach. Using a stepped concept, several damping mechanisms are applied to allow minimum disturbance while continuously trying to fulfill the product as contracted at the market.

  6. A Photo Storm Report Mobile Application, Processing/Distribution System, and AWIPS-II Display Concept

    Science.gov (United States)

    Longmore, S. P.; Bikos, D.; Szoke, E.; Miller, S. D.; Brummer, R.; Lindsey, D. T.; Hillger, D.

    2014-12-01

    The increasing use of mobile phones equipped with digital cameras and the ability to post images and information to the Internet in real-time has significantly improved the ability to report events almost instantaneously. In the context of severe weather reports, a representative digital image conveys significantly more information than a simple text or phone relayed report to a weather forecaster issuing severe weather warnings. It also allows the forecaster to reasonably discern the validity and quality of a storm report. Posting geo-located, time stamped storm report photographs utilizing a mobile phone application to NWS social media weather forecast office pages has generated recent positive feedback from forecasters. Building upon this feedback, this discussion advances the concept, development, and implementation of a formalized Photo Storm Report (PSR) mobile application, processing and distribution system and Advanced Weather Interactive Processing System II (AWIPS-II) plug-in display software.The PSR system would be composed of three core components: i) a mobile phone application, ii) a processing and distribution software and hardware system, and iii) AWIPS-II data, exchange and visualization plug-in software. i) The mobile phone application would allow web-registered users to send geo-location, view direction, and time stamped PSRs along with severe weather type and comments to the processing and distribution servers. ii) The servers would receive PSRs, convert images and information to NWS network bandwidth manageable sizes in an AWIPS-II data format, distribute them on the NWS data communications network, and archive the original PSRs for possible future research datasets. iii) The AWIPS-II data and exchange plug-ins would archive PSRs, and the visualization plug-in would display PSR locations, times and directions by hour, similar to surface observations. Hovering on individual PSRs would reveal photo thumbnails and clicking on them would display the

  7. Proposed algorithm to improve job shop production scheduling using ant colony optimization method

    Science.gov (United States)

    Pakpahan, Eka KA; Kristina, Sonna; Setiawan, Ari

    2017-12-01

    This paper deals with the determination of a job shop production schedule in an automated environment. In this particular environment, machines and the material handling system are integrated and controlled by a computer center, where schedules are created and then used to dictate the movement of parts and the operations at each machine. This setting is usually designed to run an unmanned production process for a specified time interval. We consider here parts with various operation requirements. Each operation requires specific cutting tools. These parts are to be scheduled on machines, each having identical capability, meaning that each machine is equipped with a similar set of cutting tools and is therefore capable of processing any operation. The availability of a particular machine to process a particular operation is determined by the remaining lifetime of its cutting tools. We propose an algorithm based on the ant colony optimization method, implemented in MATLAB, to generate a production schedule which minimizes the total processing time of the parts (makespan). We test the algorithm on real industrial data, and the process shows a very short computation time. This contributes greatly to the flexibility and timeliness targeted in an automated environment.
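
    A heavily simplified sketch of the ant colony mechanics (pheromone trails plus a visibility heuristic) applied to sequencing a few jobs is shown below; the paper's actual model, with cutting-tool availability and a MATLAB implementation, is much richer, and all job data here are invented.

    ```python
    # Heavily simplified ACO for sequencing a few jobs: ants build sequences guided by
    # pheromone trails and a 1/processing-time heuristic (all job data invented).
    import random

    proc_time = {"J1": 7, "J2": 3, "J3": 9, "J4": 4}
    jobs = list(proc_time)
    tau = {(i, j): 1.0 for i in jobs + ["start"] for j in jobs}   # pheromone trails
    alpha, beta, rho, n_ants, n_iter = 1.0, 2.0, 0.1, 10, 50

    def total_flow_time(seq):
        t = total = 0
        for j in seq:
            t += proc_time[j]
            total += t
        return total

    def build_sequence():
        seq, current, remaining = [], "start", list(jobs)
        while remaining:
            weights = [tau[(current, j)] ** alpha * (1.0 / proc_time[j]) ** beta
                       for j in remaining]
            nxt = random.choices(remaining, weights=weights)[0]
            seq.append(nxt)
            remaining.remove(nxt)
            current = nxt
        return seq

    best = min((build_sequence() for _ in range(n_ants)), key=total_flow_time)
    for _ in range(n_iter):
        ants = [build_sequence() for _ in range(n_ants)]
        best = min(ants + [best], key=total_flow_time)
        for key in tau:                               # evaporation
            tau[key] *= (1 - rho)
        prev = "start"
        for j in best:                                # reinforce the current best sequence
            tau[(prev, j)] += 1.0 / total_flow_time(best)
            prev = j

    print(best, total_flow_time(best))
    ```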

  8. SIMULTANEOUS SCHEDULING AND OPERATIONAL OPTIMIZATION OF MULTIPRODUCT, CYCLIC CONTINUOUS PLANTS

    Directory of Open Access Journals (Sweden)

    A. Alle

    2002-03-01

    Full Text Available The problems of scheduling and optimization of operational conditions in multistage, multiproduct continuous plants with intermediate storage are simultaneously addressed. An MINLP model, called TSPFLOW, which is based on the TSP formulation for product sequencing, is proposed to schedule the operation of such plants. TSPFLOW yields a one-order-of-magnitude CPU time reduction as well as the solution of instances larger than those formerly reported (Pinto and Grossmann, 1994). Secondly, processing rates and yields are introduced as additional optimization variables in order to state the simultaneous problem of scheduling with operational optimization. Results show that trade-offs are very complex and that the development of a straightforward (rule of thumb) method to optimally schedule the operation is less effective than the proposed approach.

  9. SIMULTANEOUS SCHEDULING AND OPERATIONAL OPTIMIZATION OF MULTIPRODUCT, CYCLIC CONTINUOUS PLANTS

    Directory of Open Access Journals (Sweden)

    Alle A.

    2002-01-01

    Full Text Available The problems of scheduling and optimization of operational conditions in multistage, multiproduct continuous plants with intermediate storage are simultaneously addressed. An MINLP model, called TSPFLOW, which is based on the TSP formulation for product sequencing, is proposed to schedule the operation of such plants. TSPFLOW yields a one-order-of-magnitude CPU time reduction as well as the solution of instances larger than those formerly reported (Pinto and Grossmann, 1994). Secondly, processing rates and yields are introduced as additional optimization variables in order to state the simultaneous problem of scheduling with operational optimization. Results show that trade-offs are very complex and that the development of a straightforward (rule of thumb) method to optimally schedule the operation is less effective than the proposed approach.

  10. DIMACS Workshop on Interconnection Networks and Mapping, and Scheduling Parallel Computations

    CERN Document Server

    Rosenberg, Arnold L; Sotteau, Dominique; NSF Science and Technology Center in Discrete Mathematics and Theoretical Computer Science; Interconnection networks and mapping and scheduling parallel computations

    1995-01-01

    The interconnection network is one of the most basic components of a massively parallel computer system. Such systems consist of hundreds or thousands of processors interconnected to work cooperatively on computations. One of the central problems in parallel computing is the task of mapping a collection of processes onto the processors and routing network of a parallel machine. Once this mapping is done, it is critical to schedule computations within and communication among processors so that inputs for a process are available where and when the process is scheduled to be computed. The workshop brought together researchers from universities and laboratories, as well as practitioners involved in the design, implementation, and application of massively parallel systems. Focusing on interconnection networks of parallel architectures of today and of the near future, the book includes topics such as network topologies, network properties, message routing, network embeddings, network emulation, mappings, and efficient scheduling. This book contains the refereed pro...

  11. How should periods without social interaction be scheduled? Children's preference for practical schedules of positive reinforcement.

    Science.gov (United States)

    Luczynski, Kevin C; Hanley, Gregory P

    2014-01-01

    Several studies have shown that children prefer contingent reinforcement (CR) rather than yoked noncontingent reinforcement (NCR) when continuous reinforcement is programmed in the CR schedule. Preference has not, however, been evaluated for practical schedules that involve CR. In Study 1, we assessed 5 children's preference for obtaining social interaction via a multiple schedule (periods of fixed-ratio 1 reinforcement alternating with periods of extinction), a briefly signaled delayed reinforcement schedule, and an NCR schedule. The multiple schedule promoted the most efficient level of responding. In general, children chose to experience the multiple schedule and avoided the delay and NCR schedules, indicating that they preferred multiple schedules as the means to arrange practical schedules of social interaction. In Study 2, we evaluated potential controlling variables that influenced 1 child's preference for the multiple schedule and found that the strong positive contingency was the primary variable. © Society for the Experimental Analysis of Behavior.

  12. Autonomous scheduling technology for Earth orbital missions

    Science.gov (United States)

    Srivastava, S.

    1982-01-01

    The development of a dynamic autonomous system (DYASS) of resources for the mission support of near-Earth NASA spacecraft is discussed and the current NASA space data system is described from a functional perspective. The future (late 80's and early 90's) NASA space data system is discussed. The DYASS concept, the autonomous process control, and the NASA space data system are introduced. Scheduling and related disciplines are surveyed. DYASS as a scheduling problem is also discussed. Artificial intelligence and knowledge representation is considered as well as the NUDGE system and the I-Space system.

  13. Optimal Time-Abstract Schedulers for CTMDPs and Markov Games

    Directory of Open Access Journals (Sweden)

    Markus Rabe

    2010-06-01

    Full Text Available We study time-bounded reachability in continuous-time Markov decision processes for time-abstract scheduler classes. Such reachability problems play a paramount role in dependability analysis and the modelling of manufacturing and queueing systems. Consequently, their analysis has been studied intensively, and techniques for the approximation of optimal control are well understood. From a mathematical point of view, however, the question of approximation is secondary compared to the fundamental question whether or not optimal control exists. We demonstrate the existence of optimal schedulers for the time-abstract scheduler classes for all CTMDPs. Our proof is constructive: We show how to compute optimal time-abstract strategies with finite memory. It turns out that these optimal schedulers have an amazingly simple structure---they converge to an easy-to-compute memoryless scheduling policy after a finite number of steps. Finally, we show that our argument can easily be lifted to Markov games: We show that both players have a likewise simple optimal strategy in these more general structures.

  14. Flexible job-shop scheduling based on genetic algorithm and simulation validation

    Directory of Open Access Journals (Sweden)

    Zhou Erming

    2017-01-01

    Full Text Available This paper selects the flexible job-shop scheduling problem as its research object and constructs a mathematical model aimed at minimizing the maximum makespan. Taking the transmission reverse gear production line of a transmission corporation as an example, a genetic algorithm is applied to the flexible job-shop scheduling problem to obtain the specific optimal scheduling results with MATLAB. DELMIA/QUEST, based on 3D discrete event simulation, is applied to construct the physical model of the production workshop. On the basis of the optimal scheduling results, the logical links of the physical model for the production workshop are established; in addition, the appropriate process parameters are imported to run a virtual simulation of the production workshop. Finally, analysis of the simulated results shows that the scheduling results are effective and reasonable.

  15. Advance Resource Provisioning in Bulk Data Scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Balman, Mehmet

    2012-10-01

    Today's scientific and business applications generate massive data sets that need to be transferred to remote sites for sharing, processing, and long term storage. Because of increasing data volumes and enhancements in current network technology that provide on-demand high-speed data access between collaborating institutions, data handling and scheduling problems have reached a new scale. In this paper, we present a new data scheduling model with advance resource provisioning, in which data movement operations are defined with earliest start and latest completion times. We analyze the time-dependent resource assignment problem, and propose a new methodology to improve the current systems by allowing researchers and higher-level meta-schedulers to use data-placement as-a-service, so they can plan ahead and submit transfer requests in advance. In general, scheduling with time and resource conflicts is NP-hard. We introduce an efficient algorithm to organize multiple requests on the fly, while satisfying users' time and resource constraints. We successfully tested our algorithm in a simple benchmark simulator that we have developed, and demonstrated its performance with initial test results.
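
    The abstract's central objects are transfer requests bounded by earliest start and latest completion times. The sketch below is only a greedy admission check over such windows on a single hypothetical link, not the authors' algorithm; the link throughput and request data are assumptions.

    ```python
    # Toy admission check for transfer requests with earliest-start / latest-completion
    # windows on a single link (throughput and request data are invented).
    from dataclasses import dataclass

    @dataclass
    class Transfer:
        name: str
        volume_gb: float
        earliest: float    # earliest start time, hours
        latest: float      # latest completion time, hours

    LINK_GB_PER_HOUR = 450.0   # assumed effective link throughput

    def try_schedule(requests):
        """Place requests in order of latest completion; flag those that cannot fit."""
        placed, clock = [], 0.0
        for r in sorted(requests, key=lambda r: r.latest):
            start = max(clock, r.earliest)
            finish = start + r.volume_gb / LINK_GB_PER_HOUR
            if finish <= r.latest:
                placed.append((r.name, round(start, 2), round(finish, 2)))
                clock = finish
            else:
                placed.append((r.name, None, None))   # rejected / needs renegotiation
        return placed

    demo = [Transfer("climate-run", 900, 0.0, 4.0),
            Transfer("genome-set", 300, 1.0, 3.0),
            Transfer("archive-dump", 1200, 0.0, 12.0)]
    print(try_schedule(demo))
    ```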

  16. Task Balanced Workflow Scheduling Technique considering Task Processing Rate in Spot Market

    Directory of Open Access Journals (Sweden)

    Daeyong Jung

    2014-01-01

    Full Text Available Cloud computing is a paradigm that constitutes an advanced computing environment evolved from distributed computing, and it provides acquired computing resources in a pay-as-you-go manner. For example, Amazon EC2 offers Infrastructure-as-a-Service (IaaS) instances in three different ways with different prices, reliability, and performance. Our study is based on an environment using spot instances. Spot instances can significantly decrease costs compared to reserved and on-demand instances; however, spot instances give a less reliable environment than other instances. In this paper, we propose a workflow scheduling scheme that reduces the out-of-bid situation. Consequently, the total task completion time is decreased. The simulation results reveal that, compared to various instance types, our scheme achieves performance improvements in terms of an average combined metric of 12.76% over the workflow scheme that does not consider the processing rate. However, the cost of our scheme is higher than that of an instance with low performance and lower than that of an instance with high performance.

  17. Simultaneous decomplexation in blended Cu(II)/Ni(II)-EDTA systems by electro-Fenton process using iron sacrificing electrodes.

    Science.gov (United States)

    Zhao, Zilong; Dong, Wenyi; Wang, Hongjie; Chen, Guanhan; Tang, Junyi; Wu, Yang

    2018-05-15

    This research explored the application of the electro-Fenton (E-Fenton) technique for simultaneous decomplexation in blended Cu(II)/Ni(II)-EDTA systems using iron sacrificing electrodes. The standard discharge limits (0.3 mg L⁻¹ for Cu and 0.1 mg L⁻¹ for Ni in China) could be achieved after 30 min of reaction under the optimum conditions (i.e. initial solution pH of 2.0, H₂O₂ dosage of 6 mL L⁻¹ h⁻¹, current density of 20 mA/cm², inter-electrode distance of 2 cm, and sulfate electrolyte concentration of 2000 mg L⁻¹). The distinct differences in apparent kinetic rate constants (k_app) and intermediate removal efficiencies corresponding to the single and blended systems indicated a mutual promotion effect on the decomplexation between Cu(II) and Ni(II). Massive accumulation of Fe(III) favored the further removal of Cu(II) and Ni(II) by metal ion substitution. Species distribution results demonstrated that the decomplexation of metal-EDTA in the E-Fenton process was mainly attributed to the combination of various reactions, including the Fenton reaction together with anodic oxidation, electro-coagulation (E-coagulation) and electrodeposition. Unlike hypophosphite and citrate, the presence of chloride ion displayed favorable effects on the removal efficiencies of Cu(II) and Ni(II) at low dosage, but facilitated ammonia nitrogen (NH₄⁺-N) removal only at high dosage. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. The comparison of predictive scheduling algorithms for different sizes of job shop scheduling problems

    Science.gov (United States)

    Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.; Krenczyk, D.

    2016-08-01

    In the paper, a survey of predictive and reactive scheduling methods is carried out in order to evaluate how the ability to predict reliability characteristics influences robustness criteria. The most important reliability characteristics are Mean Time To Failure and Mean Time To Repair. The survey analysis is done for a job shop scheduling problem. The paper answers the question: which method generates robust schedules in the case of a bottleneck failure occurring before, at the beginning of, or after planned maintenance actions? The efficiency of predictive schedules is evaluated using the criteria: makespan, total tardiness, flow time, and idle time. The efficiency of reactive schedules is evaluated using a solution robustness criterion and a quality robustness criterion. This paper is the continuation of the research conducted in [1], where the survey of predictive and reactive scheduling methods was done only for small-size scheduling problems.

  19. Supporting Real-Time Operations and Execution through Timeline and Scheduling Aids

    Science.gov (United States)

    Marquez, Jessica J.; Pyrzak, Guy; Hashemi, Sam; Ahmed, Samia; McMillin, Kevin Edward; Medwid, Joseph Daniel; Chen, Diana; Hurtle, Esten

    2013-01-01

    Since 2003, the NASA Ames Research Center has been actively involved in researching and advancing the state of the art of planning and scheduling tools for NASA mission operations. Our planning toolkit SPIFe (Scheduling and Planning Interface for Exploration) has supported a variety of missions and field tests, scheduling activities for Mars rovers as well as crew on board the International Space Station and NASA Earth analogs. The scheduled plan is the integration of all the activities for the day(s). In turn, the agents (rovers, landers, spaceships, crew) execute from this schedule while the mission support team members (e.g., flight controllers) follow the schedule during execution. Over the last couple of years, our team has begun to research and validate methods that will better support users during real-time operations and execution of scheduled activities. Our team utilizes human-computer interaction principles to research user needs, identify workflow processes, prototype software aids, and user-test them. This paper discusses three specific prototypes developed and user tested to support real-time operations: Score Mobile, Playbook, and Mobile Assistant for Task Execution (MATE).

  20. Scheduling with Time Lags

    NARCIS (Netherlands)

    X. Zhang (Xiandong)

    2010-01-01

    Scheduling is essential when activities need to be allocated to scarce resources over time. Motivated by the problem of scheduling barges along container terminals in the Port of Rotterdam, this thesis designs and analyzes algorithms for various on-line and off-line scheduling problems

  1. Scheduling with Bus Access Optimization for Distributed Embedded Systems

    DEFF Research Database (Denmark)

    Eles, Petru; Doboli, Alex; Pop, Paul

    2000-01-01

    In this paper, we concentrate on aspects related to the synthesis of distributed embedded systems consisting of programmable processors and application-specific hardware components. The approach is based on an abstract graph representation that captures, at process level, both dataflow and the flow of control. Our goal is to derive a worst case delay by which the system completes execution, such that this delay is as small as possible; to generate a logically and temporally deterministic schedule; and to optimize parameters of the communication protocol such that this delay is guaranteed. The approach we have developed generates an efficient bus access scheme as well as the schedule tables for activation of processes and communications.

  2. Automated Scheduling Via Artificial Intelligence

    Science.gov (United States)

    Biefeld, Eric W.; Cooper, Lynne P.

    1991-01-01

    Artificial-intelligence software that automates scheduling was developed in the Operations Mission Planner (OMP) research project. The software is used both in the generation of new schedules and in the modification of existing schedules in view of changes in tasks and/or available resources. The approach is based on iterative refinement. Although the project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, it is also applicable to such terrestrial problems as scheduling production in a factory.

  3. Cavity Processing and Preparation of 650 MHz Elliptical Cell Cavities for PIP-II

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, Allan [Fermilab; Chandrasekaran, Saravan Kumar [Fermilab; Grassellino, Anna [Fermilab; Melnychuk, Oleksandr [Fermilab; Merio, Margherita [Fermilab; Reid, Thomas [Argonne (main); Sergatskov, Dmitri [Fermilab

    2017-05-01

    The PIP-II project at Fermilab requires fifteen 650 MHz SRF cryomodules as part of the 800 MeV LINAC that will provide a high intensity proton beam to the Fermilab neutrino program. A total of fifty-seven high-performance SRF cavities will populate the cryomodules and will operate in both pulsed and continuous wave modes. These cavities will be processed and prepared for performance testing utilizing adapted cavity processing infrastructure already in place at Fermilab and Argonne. The processing recipes implemented for these structures will incorporate state-of-the-art processing and cleaning techniques developed for 1.3 GHz SRF cavities for the ILC, XFEL, and LCLS-II projects. This paper describes the details of the processing recipes and associated chemistry, heat treatment, and cleanroom processes at the Fermilab and Argonne cavity processing facilities. This paper also presents single and multi-cell cavity test results with quality factors above 5·10¹⁰ and accelerating gradients above 30 MV/m.

  4. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    The research comprised the following high-level steps: identify and review primary data sources... However, detailed reviews of the OMB IT Dashboard data revealed that schedule data is highly aggregated, limited to program start date and program end date...

  5. Coordinating space telescope operations in an integrated planning and scheduling architecture

    Science.gov (United States)

    Muscettola, Nicola; Smith, Stephen F.; Cesta, Amedeo; D'Aloisi, Daniela

    1992-01-01

    The Heuristic Scheduling Testbed System (HSTS), a software architecture for integrated planning and scheduling, is discussed. The architecture has been applied to the problem of generating observation schedules for the Hubble Space Telescope. This problem is representative of the class of problems that can be addressed: their complexity lies in the interaction of resource allocation and auxiliary task expansion. The architecture deals with this interaction by viewing planning and scheduling as two complementary aspects of the more general process of constructing behaviors of a dynamical system. The principal components of the software architecture are described, indicating how to model the structure and dynamics of a system, how to represent schedules at multiple levels of abstraction in the temporal database, and how the problem solving machinery operates. A scheduler for the detailed management of Hubble Space Telescope operations that has been developed within HSTS is described. Experimental performance results are given that indicate the utility and practicality of the approach.

  6. Novel Hybrid Scheduling Technique for Sensor Nodes with Mixed Criticality Tasks

    Directory of Open Access Journals (Sweden)

    Mihai-Victor Micea

    2017-06-01

    Full Text Available Sensor networks are increasingly becoming a key technology for complex control applications. Their potential use in safety- and time-critical domains has raised the need for task scheduling mechanisms specially adapted to sensor node specific requirements, often materialized in predictable, jitter-less execution of tasks characterized by different criticality levels. This paper offers an efficient scheduling solution, named Hybrid Hard Real-Time Scheduling (H2RTS), which combines a static, clock driven method with a dynamic, event driven scheduling technique, in order to provide high execution predictability, while keeping a high node Central Processing Unit (CPU) utilization factor. From the detailed, integrated schedulability analysis of the H2RTS, a set of sufficiency tests are introduced and demonstrated based on the processor demand and linear upper bound metrics. The performance and correct behavior of the proposed hybrid scheduling technique have been extensively evaluated and validated both on a simulator and on a sensor mote equipped with an ARM7 microcontroller.
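
    The hybrid idea — fixed, clock-driven slots for hard tasks with event-driven work absorbing the slack — can be caricatured as below. This is not the H2RTS dispatcher; the task names, slot length, and one-event-per-slot slack assumption are all invented for illustration.

    ```python
    # Caricature of a hybrid slot: clock-driven hard tasks at fixed offsets, one
    # event-driven job filling the remaining slack (all names and timings invented).
    import heapq

    SLOT_MS = 10
    hard_tasks = {0: "sample_sensors", 5: "control_loop"}   # fixed offsets within a slot
    event_queue = []                                        # (priority, name) soft events

    def post_event(priority, name):
        heapq.heappush(event_queue, (priority, name))

    def run_slot(slot_index):
        timeline = []
        base = slot_index * SLOT_MS
        for offset, task in sorted(hard_tasks.items()):     # jitter-less, clock-driven part
            timeline.append((base + offset, task))
        if event_queue:                                     # event-driven part uses the slack
            _, name = heapq.heappop(event_queue)            # assume slack for one event per slot
            timeline.append((base + 8, name))
        return timeline

    post_event(2, "log_flush")
    post_event(1, "radio_rx")
    print(run_slot(0))   # [(0, 'sample_sensors'), (5, 'control_loop'), (8, 'radio_rx')]
    ```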

  7. Novel Hybrid Scheduling Technique for Sensor Nodes with Mixed Criticality Tasks.

    Science.gov (United States)

    Micea, Mihai-Victor; Stangaciu, Cristina-Sorina; Stangaciu, Valentin; Curiac, Daniel-Ioan

    2017-06-26

    Sensor networks are increasingly becoming a key technology for complex control applications. Their potential use in safety- and time-critical domains has raised the need for task scheduling mechanisms specially adapted to sensor node specific requirements, often materialized in predictable, jitter-less execution of tasks characterized by different criticality levels. This paper offers an efficient scheduling solution, named Hybrid Hard Real-Time Scheduling (H²RTS), which combines a static, clock driven method with a dynamic, event driven scheduling technique, in order to provide high execution predictability, while keeping a high node Central Processing Unit (CPU) utilization factor. From the detailed, integrated schedulability analysis of the H²RTS, a set of sufficiency tests are introduced and demonstrated based on the processor demand and linear upper bound metrics. The performance and correct behavior of the proposed hybrid scheduling technique have been extensively evaluated and validated both on a simulator and on a sensor mote equipped with an ARM7 microcontroller.

  8. Variable Neighborhood Search for Parallel Machines Scheduling Problem with Step Deteriorating Jobs

    Directory of Open Access Journals (Sweden)

    Wenming Cheng

    2012-01-01

    Full Text Available In many real scheduling environments, a job processed later needs a longer time than the same job when it starts earlier. This phenomenon is known as scheduling with deteriorating jobs, and it arises in many industrial applications. In this paper, we study a scheduling problem of minimizing the total completion time on identical parallel machines, where the processing time of a job is a step function of its starting time and a deteriorating date that is individual to each job. Firstly, a mixed integer programming model is presented for the problem. Then, a modified weight-combination search algorithm and a variable neighborhood search are employed to yield optimal or near-optimal schedules. To evaluate the performance of the proposed algorithms, computational experiments are performed on randomly generated test instances. The computational results show that the proposed approaches obtain near-optimal solutions in a reasonable computational time, even for large-sized problems.
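
    The step-deterioration effect itself is easy to state: a job keeps its base processing time if it starts no later than its individual deteriorating date, and otherwise incurs a fixed penalty. The sketch below evaluates total completion time for a given sequence under invented job data; it is not the paper's MIP or search algorithms.

    ```python
    # Step-deterioration rule: a job keeps its base time if it starts by its deteriorating
    # date, otherwise it incurs a fixed penalty (all job data invented).
    jobs = {                       # job: (base time, penalty, deteriorating date)
        "J1": (4, 3, 5),
        "J2": (6, 2, 8),
        "J3": (5, 4, 3),
    }

    def total_completion_time(sequence):
        t = total = 0
        for j in sequence:
            base, penalty, ddate = jobs[j]
            p = base if t <= ddate else base + penalty   # step function of the start time
            t += p
            total += t
        return total

    print(total_completion_time(["J1", "J2", "J3"]))   # completions 4, 10, 19 -> 33
    print(total_completion_time(["J3", "J1", "J2"]))   # completions 5, 9, 17 -> 31
    ```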

  9. Adaptive scheduling with postexamining user selection under nonidentical fading

    KAUST Repository

    Gaaloul, Fakhreddine

    2012-11-01

    This paper investigates an adaptive scheduling algorithm for multiuser environments with statistically independent but nonidentically distributed (i.n.d.) channel conditions. The algorithm aims to reduce feedback load by sequentially and arbitrarily examining the user channels. It also provides improved performance by realizing postexamining best user selection. The first part of the paper presents new formulations for the statistics of the signal-to-noise ratio (SNR) of the scheduled user under i.n.d. channel conditions. The second part capitalizes on the findings in the first part and presents various performance and processing complexity measures for adaptive discrete-time transmission. The results are then extended to investigate the effect of outdated channel estimates on the statistics of the scheduled user SNR, as well as some performance measures. Numerical results are provided to clarify the usefulness of the scheduling algorithm under perfect or outdated channel estimates. © 1967-2012 IEEE.

  10. Current status for TRR-II Cold Neutron Source

    International Nuclear Information System (INIS)

    Lee, C.H.; Guung, T.C.; Lan, K.C.; Wang, C.H.; Chan, Y.K.; Shieh, D.J.

    2001-01-01

    The Taiwan Research Reactor (TRR) project (TRR-II) is being carried out at the Institute of Nuclear Energy Research (INER) from October 1998 to December 2006. The purpose of the Cold Neutron Source (CNS) project is to build the entire CNS facility to generate cold neutrons within the TRR-II reactor. The objective of the CNS design is to install a CNS facility with a cold neutron beam brightness competitive with other facilities in the world. Based on the TRR-II CNS project schedule, the conceptual design for the TRR-II CNS facility has been completed and the mock-up test facility for the full-scale hydrogen loop has been designed. (author)

  11. Preemptive scheduling with rejection

    NARCIS (Netherlands)

    Hoogeveen, H.; Skutella, M.; Woeginger, Gerhard

    2003-01-01

    We consider the problem of preemptively scheduling a set of n jobs on m (identical, uniformly related, or unrelated) parallel machines. The scheduler may reject a subset of the jobs and thereby incur job-dependent penalties for each rejected job, and he must construct a schedule for the remaining

  12. Preemptive scheduling with rejection

    NARCIS (Netherlands)

    Hoogeveen, J.A.; Skutella, M.; Woeginger, G.J.; Paterson, M.

    2000-01-01

    We consider the problem of preemptively scheduling a set of n jobs on m (identical, uniformly related, or unrelated) parallel machines. The scheduler may reject a subset of the jobs and thereby incur job-dependent penalties for each rejected job, and he must construct a schedule for the remaining

  13. Crane Scheduling on a Plate Storage

    DEFF Research Database (Denmark)

    Hansen, Jesper

    2002-01-01

    OSS produces the world's largest container ships. The first process in producing the steel ships is handling the arrival and storage of steel plates until they are needed in production. Two gantry cranes carry out this task. The planning task is now to create a schedule of movements for the 2 cranes...

  14. Joint optimization of production scheduling and machine group preventive maintenance

    International Nuclear Information System (INIS)

    Xiao, Lei; Song, Sanling; Chen, Xiaohui; Coit, David W.

    2016-01-01

    Joint optimization models were developed combining group preventive maintenance of a series system and production scheduling. In this paper, we propose a joint optimization model to minimize the total cost, including production cost, preventive maintenance cost, minimal repair cost for unexpected failures, and tardiness cost. The total cost depends on both the production process and the machine maintenance plan associated with reliability. For the problems addressed in this research, any machine unavailability leads to system downtime. Therefore, it is important to optimize the preventive maintenance of machines because their performance impacts the collective production processing associated with all machines. Too lengthy preventive maintenance intervals may be associated with low scheduled machine maintenance cost, but may incur expensive costs for unplanned failures due to low machine reliability. Alternatively, too frequent preventive maintenance activities may achieve the desired high machine reliability, but unacceptably high scheduled maintenance cost. Additionally, production scheduling plans affect tardiness and maintenance cost. Two results are obtained when solving the problem: the optimal group preventive maintenance interval for machines, and the assignment of each job, including the corresponding start time and completion time. To solve this NP-hard problem, random keys genetic algorithms are used, and a numerical example is solved to illustrate the proposed model. - Highlights: • Group preventive maintenance (PM) planning and production scheduling are jointly optimized. • Maintenance interval and assignment of jobs are decided by minimizing total cost. • Relationships among system age, PM, and job processing time are quantified. • Random keys genetic algorithms (GA) are used to solve the NP-hard problem. • Random keys GA and Particle Swarm Optimization (PSO) are compared.

  15. Flow shop scheduling algorithm to optimize warehouse activities

    Directory of Open Access Journals (Sweden)

    P. Centobelli

    2016-01-01

    Full Text Available Successful flow-shop scheduling enables a more rapid and efficient process of order fulfilment in warehouse activities. Indeed, the manner and speed of order processing, and in particular the materials handling operations between the upper stocking area and the lower forward picking area, must be optimized. These two activities, drops and pickings, have a considerable impact on important performance parameters for supply chain wholesaler companies. In this paper, a new flow shop scheduling algorithm is formulated in order to process a greater number of orders, replacing the FIFO logic for the drop activities of a wholesaler company on a daily basis. System Dynamics modelling and simulation have been used to simulate the actual scenario and the output solutions. Finally, a Student's t-test validates the modelled algorithm, showing that it can be used for all wholesalers whose operations are based on drop and picking activities.

  16. CMS Planning and Scheduling System

    CERN Document Server

    Kotamaki, M

    1998-01-01

    The paper describes the procedures and the system to build and maintain the schedules needed to manage time, resources, and progress of the CMS project. The system is based on the decomposition of the project into work packages, which can be each considered as a complete project with its own structure. The system promotes the distribution of the decision making and responsibilities to lower levels in the organisation by providing a state-of-the-art system to formalise the external commitments of the work packages without limiting their ability to modify their internal schedules to best meet their commitments. The system lets the project management focus on the interfaces between the work packages and alerts the management immediately if a conflict arises. The proposed system simplifies the planning and management process and eliminates the need for a large, centralised project management system.

  17. Designing cyclic appointment schedules for outpatient clinics with scheduled and unscheduled patient arrivals

    NARCIS (Netherlands)

    Kortbeek, Nikky; Zonderland, Maartje E.; Braaksma, Aleida; Vliegen, Ingrid M. H.; Boucherie, Richard J.; Litvak, Nelly; Hans, Erwin W.

    2014-01-01

    We present a methodology to design appointment systems for outpatient clinics and diagnostic facilities that offer both walk-in and scheduled service. The developed blueprint for the appointment schedule prescribes the number of appointments to plan per day and the moment on the day to schedule the

  18. Designing cyclic appointment schedules for outpatient clinics with scheduled and unscheduled patient arrivals

    NARCIS (Netherlands)

    Kortbeek, Nikky; Zonderland, Maartje Elisabeth; Boucherie, Richardus J.; Litvak, Nelli; Hans, Elias W.

    2011-01-01

    We present a methodology to design appointment systems for outpatient clinics and diagnostic facilities that offer both walk-in and scheduled service. The developed blueprint for the appointment schedule prescribes the number of appointments to plan per day and the moment on the day to schedule the

  19. Unifying practice schedules in the timescales of motor learning and performance.

    Science.gov (United States)

    Verhoeven, F Martijn; Newell, Karl M

    2018-06-01

    In this article, we elaborate from a multiple time scales model of motor learning to examine the independent and integrated effects of massed and distributed practice schedules within- and between-sessions on the persistent (learning) and transient (warm-up, fatigue) processes of performance change. The timescales framework reveals the influence of practice distribution on four learning-related processes: the persistent processes of learning and forgetting, and the transient processes of warm-up decrement and fatigue. The superposition of the different processes of practice leads to a unified set of effects for massed and distributed practice within- and between-sessions in learning motor tasks. This analysis of the interaction between the duration of the interval of practice trials or sessions and parameters of the introduced time scale model captures the unified influence of the between trial and session scheduling of practice on learning and performance. It provides a starting point for new theoretically based hypotheses, and the scheduling of practice that minimizes the negative effects of warm-up decrement, fatigue and forgetting while exploiting the positive effects of learning and retention. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. NRC comprehensive records disposition schedule

    International Nuclear Information System (INIS)

    1992-03-01

    Title 44 United States Code, "Public Printing and Documents," regulations cited in the General Services Administration's (GSA) "Federal Information Resources Management Regulations" (FIRMR), Part 201-9, "Creation, Maintenance, and Use of Records," and regulations issued by the National Archives and Records Administration (NARA) in 36 CFR Chapter XII, Subchapter B, "Records Management," require each agency to prepare and issue a comprehensive records disposition schedule that contains the NARA-approved records disposition schedules for records unique to the agency and contains NARA's General Records Schedules for records common to several or all agencies. The approved records disposition schedules specify the appropriate duration of retention and the final disposition for records created or maintained by the NRC. NUREG-0910, Rev. 2, contains "NRC's Comprehensive Records Disposition Schedule" and the original authorized approved citation numbers issued by NARA. Rev. 2 totally reorganizes the records schedules from a functional arrangement to an arrangement by the host office. A subject index and a conversion table have also been developed for the NRC schedules to allow staff to identify the new schedule numbers easily and to improve their ability to locate applicable schedules

  1. Integrating Preventive Maintenance Scheduling As Probability Machine Failure And Batch Production Scheduling

    Directory of Open Access Journals (Sweden)

    Zahedi Zahedi

    2016-06-01

    Full Text Available This paper discusses an integrated model of batch production scheduling and machine maintenance scheduling. The batch production scheduling uses a minimum total actual flow time criterion, and the machine maintenance scheduling uses the probability of machine failure based on a Weibull distribution. The model assumes no nonconforming parts within the planning horizon. The model shows that an increase in the number of batches (length of the production run) up to a certain limit will minimize the total actual flow time. Meanwhile, an increase in the length of the production run implies an increase in the number of PM actions. An example is given to show how the model and algorithm work.
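
    The reliability ingredient of such models is the Weibull failure probability over a production run between PM actions. A minimal sketch with assumed shape and scale parameters (not taken from the paper):

    ```python
    # Weibull failure probability over a production run between PM actions
    # (shape and scale parameters are assumed, not taken from the paper).
    from math import exp

    BETA, ETA = 2.0, 400.0   # assumed Weibull shape and scale (hours)

    def failure_probability(t_hours):
        """F(t) = 1 - exp(-(t/eta)^beta): probability the machine has failed by time t."""
        return 1.0 - exp(-((t_hours / ETA) ** BETA))

    for run_length in (50, 100, 200, 400):
        print(run_length, round(failure_probability(run_length), 3))
    # Longer runs between PM actions shorten total flow time but raise failure risk,
    # which is the trade-off the integrated model balances.
    ```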

  2. "What Do I Teach for 90 Minutes?" Creating a Successful Block-Scheduled English Classroom.

    Science.gov (United States)

    Porter, Carol

    The story of how Mundelein High School (located in a northwest suburb of Chicago, Illinois) moved from a traditional schedule to a block schedule is told throughout this book as a way to blend theory with practice. The book addresses types of block schedules; key issues for effective preparation; professional development…

  3. Guidelines of Decommissioning Schedule Establishment

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jae Yong; Yun, Taesik; Kim, Younggook; Kim, Hee-Geun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    Decommissioning has recently become a highlighted issue in Korea due to the Permanent Shutdown (PS) of the Kori-1 plant. Since Korea Hydro and Nuclear Power (KHNP) Company decided on the PS of Kori-1 instead of continued operation, Kori-1 will be the first commercial reactor to be decommissioned in Korea. The Korean regulatory authority demands an Initial Decommissioning Plan (IDP) for all plants in operation and under construction. In addition, decommissioning should be considered for the completion of the life cycle of NPPs. To date, Korea has no experience with decommissioning of commercial reactors, and many uncertainties are expected due to site-specific factors. However, an optimized decommissioning process schedule is indispensable to the safety and economic efficiency of the project. Unlike the USA, Korea has no experience or know-how in operation and site management for decommissioning. Hence, in Korea, the establishment of the decommissioning schedule has to give more weight to safety than precedent cases did. A more economical and rational schedule will be composed by collecting and analyzing experience data and site-specific data and information as the decommissioning progresses. In the long term, KHNP, having the capability for NPP decommissioning, will pursue decommissioning business in Korea and foreign countries.

  4. Gain Scheduling for the Orion Launch Abort Vehicle Controller

    Science.gov (United States)

    McNamara, Sara J.; Restrepo, Carolina I.; Madsen, Jennifer M.; Medina, Edgar A.; Proud, Ryan W.; Whitley, Ryan J.

    2011-01-01

    One of NASA's challenges for the Orion vehicle is the control system design for the Launch Abort Vehicle (LAV), which is required to abort safely at any time during the atmospheric ascent portion of flight. The focus of this paper is the gain design and scheduling process for a controller that covers the wide range of vehicle configurations and flight conditions experienced during the full envelope of potential abort trajectories from the pad to exo-atmospheric flight. Several factors are taken into account in the automation process for tuning the gains, including the abort effectors, the environmental changes, and the autopilot modes. Gain scheduling is accomplished using a linear quadratic regulator (LQR) approach for the decoupled, simplified linear model throughout the operational envelope in time, altitude, and Mach number. The derived gains are then implemented into the full linear model for controller requirement validation. Finally, the gains are tested and evaluated in a non-linear simulation using the vehicle's flight software to ensure performance requirements are met. An overview of the LAV controller design and a description of the linear plant models are presented. Examples of the most significant challenges with the automation of the gain tuning process are then discussed. In conclusion, the paper considers the lessons learned throughout the process, especially in regard to automation, and examines the usefulness of the gain scheduling tool and process developed as applicable to non-Orion vehicles.
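    As a generic illustration of LQR-based gain scheduling over a flight-condition grid (not the Orion implementation), the sketch below solves a continuous-time Riccati equation for a made-up second-order pitch model at several Mach numbers and stores the resulting gains in a lookup table; the plant matrices, weights, and grid points are all assumed.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Solve the continuous-time algebraic Riccati equation and return K = R^-1 B^T P."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

def schedule_gains(mach_points, plant, Q, R):
    """Build a lookup table of LQR gains over a grid of Mach numbers."""
    return {m: lqr_gain(*plant(m), Q, R) for m in mach_points}

def plant_at_mach(mach):
    """Illustrative second-order pitch model whose stiffness/damping vary with Mach."""
    A = np.array([[0.0, 1.0],
                  [-4.0 * mach, -0.7 * mach]])
    B = np.array([[0.0], [1.0]])
    return A, B

if __name__ == "__main__":
    Q = np.diag([10.0, 1.0])      # state weighting (assumed)
    R = np.array([[1.0]])         # control weighting (assumed)
    table = schedule_gains([0.3, 0.6, 0.9, 1.2], plant_at_mach, Q, R)
    for mach, K in table.items():
        print(f"Mach {mach}: K = {K.ravel()}")
```

    In practice, the linear model at each gridpoint would come from trimming and linearizing the vehicle dynamics, and the stored gains would be interpolated between gridpoints in flight.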

  5. Parallel Machine Scheduling with Batch Delivery to Two Customers

    Directory of Open Access Journals (Sweden)

    Xueling Zhong

    2015-01-01

    Full Text Available In some make-to-order supply chains, the manufacturer needs to process and deliver products for customers at different locations. To coordinate production and distribution operations at the detailed scheduling level, we study a parallel machine scheduling model with batch delivery to two customers by a vehicle routing method. In this model, the supply chain consists of a processing facility with m parallel machines and two customers. A set of jobs containing n1 jobs from customer 1 and n2 jobs from customer 2 is first processed in the processing facility and then delivered to the customers directly without intermediate inventory. The problem is to find a joint schedule of production and distribution such that the tradeoff between the maximum arrival time of the jobs and the total distribution cost is minimized. The distribution cost of a delivery shipment consists of a fixed charge and a variable cost proportional to the total distance of the route taken by the shipment. We provide polynomial time heuristics with worst-case performance analysis for the problem. If m = 2 and (n1 - b)(n2 - b) < 0, we propose a heuristic with a worst-case ratio bound of 3/2, where b is the capacity of the delivery shipment. Otherwise, the worst-case ratio bound of the heuristic we propose is 2 - 2/(m + 1).

  6. The new German neutron source FRM-II

    International Nuclear Information System (INIS)

    Nuding, M.; Axmann, A.; Boening, K.

    2002-01-01

    The construction of a new high-flux research reactor, the FRM-II, is finished. This new reactor shall replace the existing FRM, which has been operated very successfully for about 43 years. The report first presents the main applications of the FRM-II and its core and plant design. After that, a description of the tests performed during the licensing procedure is given. At the end, some current topics are discussed and an outlook on the time schedule is presented.

  7. Scheduling Parallel Jobs Using Migration and Consolidation in the Cloud

    Directory of Open Access Journals (Sweden)

    Xiaocheng Liu

    2012-01-01

    Full Text Available An increasing number of high performance computing parallel applications leverage the power of the cloud for parallel processing. How to schedule the parallel applications to improve the quality of service is the key to hosting parallel applications successfully in the cloud. The large scale of the cloud makes parallel job scheduling more complicated, as even the simple parallel job scheduling problem is NP-complete. In this paper, we propose a parallel job scheduling algorithm named MEASY. MEASY adopts migration and consolidation to enhance the most popular EASY scheduling algorithm. Our extensive experiments on well-known workloads show that our algorithm takes very good care of the quality of service. For two common parallel job scheduling objectives, our algorithm produces an up to 41.1% and an average of 23.1% improvement on the average response time, and an up to 82.9% and an average of 69.3% improvement on the average slowdown. Our algorithm is robust in that it tolerates inaccurate CPU usage estimation and high migration cost. Our approach involves only a trivial modification of EASY and requires no additional techniques; it is practical and effective in the cloud environment.
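    For context, the EASY algorithm that MEASY extends is a backfilling rule: the head-of-queue job either starts or gets a reservation, and later jobs may jump ahead only if they do not delay that reservation. The sketch below is a minimal, assumed rendering of that rule (not the MEASY code); the job fields and shadow-time bookkeeping are simplified.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cpus: int
    runtime_est: float  # user-supplied runtime estimate

def easy_backfill(queue, free_cpus, running, now):
    """EASY backfilling: start the head job if it fits; otherwise reserve a start
    time for it and backfill later jobs that do not delay that reservation.
    `running` is a list of (estimated finish time, cpus) pairs for running jobs."""
    started = []
    if not queue:
        return started
    head = queue[0]
    if head.cpus <= free_cpus:
        started.append(queue.pop(0))
        return started
    # Shadow time: earliest moment enough CPUs are expected to be free for the head job.
    avail, shadow, extra = free_cpus, None, 0
    for finish, cpus in sorted(running):
        avail += cpus
        if avail >= head.cpus:
            shadow, extra = finish, avail - head.cpus
            break
    # Backfill: a later job may start now if it fits in the free CPUs and either
    # finishes before the shadow time or only uses CPUs the head job will not need.
    for job in list(queue[1:]):
        fits_now = job.cpus <= free_cpus
        harmless = shadow is None or now + job.runtime_est <= shadow or job.cpus <= extra
        if fits_now and harmless:
            queue.remove(job)
            started.append(job)
            free_cpus -= job.cpus
            if job.cpus <= extra:
                extra -= job.cpus
    return started

if __name__ == "__main__":
    q = [Job("A", 8, 10.0), Job("B", 2, 3.0), Job("C", 4, 1.0)]
    running = [(5.0, 6)]  # one running job releasing 6 CPUs at t = 5
    print([j.name for j in easy_backfill(q, free_cpus=2, running=running, now=0.0)])
```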

  8. Outage scheduling and implementation

    International Nuclear Information System (INIS)

    Allison, J.E.; Segall, P.; Smith, R.R.

    1986-01-01

    Successful preparation and implementation of an outage schedule and completion of scheduled and emergent work within an identified critical path time frame is a result of careful coordination by Operations, Work Control, Maintenance, Engineering, Planning and Administration, and others. At the Fast Flux Test Facility (FFTF), careful planning has been responsible for meeting all scheduled outage critical paths.

  9. Degradation of a xanthene dye by Fe(II)-mediated activation of Oxone process.

    Science.gov (United States)

    Wang, Y R; Chu, W

    2011-02-28

    A powerful oxidation process using sulfate radicals generated by a transition-metal-activated Oxone process has been evaluated in depth by monitoring the degradation of the xanthene dye Rhodamine B (RhB) in aqueous solution. Ferrous ion was chosen as the transition metal due to its potential catalytic effect and wide availability in dyeing industrial effluent. The effects of parameters including reactant dosing sequence, Fe(II)/Oxone molar ratio and concentration, solution pH, and inorganic salts on the process performance have been investigated. Total RhB removal was obtained within 90 min under an optimal Fe(II)/Oxone molar ratio of 1:1. The RhB degradation was found to follow two-stage kinetics, consisting of a rapid initial decay followed by a retarded stage. Additionally, experimental results indicated that the presence of certain anions had either a positive or a negative effect on the process. The inhibitory effect in the presence of SO(4)(2-) was elucidated by a proposed formula using the Nernst equation. Furthermore, dye mineralization in terms of TOC removal indicates that stepwise addition of Fe(II) and Oxone can improve the process performance by about 20%, and the retention time required can be greatly reduced compared with the conventional one-off dosing method. Copyright © 2010 Elsevier B.V. All rights reserved.

  10. Patients' preference for radiotherapy fractionation schedule in the palliation of symptomatic unresectable lung cancer

    International Nuclear Information System (INIS)

    Tang, J. I.; Lu, J. J.; Wong, L. C.

    2008-01-01

    Full text: The palliative radiotherapeutic management of unresectable non-small-cell lung cancer is controversial, with various fractionation (Fx) schedules available. We aimed to determine patients' choice of Fx schedule after involvement in a decision-making process using a decision board. A decision board outlining the various advantages and disadvantages apparent in the Medical Research Council study of Fx schedules (17 Gy in two fractions vs 39 Gy in 13 fractions) was discussed with patients who met Medical Research Council eligibility criteria. Patients were then asked to indicate their preferred Fx schedules, their reasons, and their level of satisfaction with being involved in the decision-making process. Radiation oncologists (ROs) could prescribe radiotherapy schedules irrespective of patients' preferences. Of 92 patients enrolled, 55% chose the longer schedule. English-speaking patients were significantly more likely to choose the longer schedule (P = 0.02, 95% confidence interval: 1.2-7.6). The longer Fx schedule was chosen for longer survival (90%) and better local control (12%). The shorter Fx schedule was chosen for shorter overall treatment duration (80%), cost (61%), and better symptom control (20%). In all, 56% of patients choosing the shorter schedule had their treatment altered by the treating RO, whereas only 4% of patients choosing the longer Fx schedule had their treatment altered, a difference that may reflect the ROs' own biases.

  11. Adaptive Incremental Genetic Algorithm for Task Scheduling in Cloud Environments

    Directory of Open Access Journals (Sweden)

    Kairong Duan

    2018-05-01

    Full Text Available Cloud computing is a new commercial model that enables customers to acquire large amounts of virtual resources on demand. Resources including hardware and software can be delivered as services and measured by specific usage of storage, processing, bandwidth, etc. In Cloud computing, task scheduling is a process of mapping cloud tasks to Virtual Machines (VMs). When binding the tasks to VMs, the scheduling strategy has an important influence on the efficiency of the datacenter and the related energy consumption. Although many traditional scheduling algorithms have been applied in various platforms, they may not work efficiently due to the large number of user requests, the variety of computation resources, and the complexity of the Cloud environment. In this paper, we tackle the task scheduling problem, which aims to minimize makespan, with a Genetic Algorithm (GA). We propose an incremental GA with adaptive probabilities of crossover and mutation. The mutation and crossover rates change according to generations and also vary between individuals. Large numbers of tasks are randomly generated to simulate various scales of the task scheduling problem in the Cloud environment. Based on the instance types of Amazon EC2, we implemented virtual machines with different computing capacities on CloudSim. We compared the performance of the adaptive incremental GA with that of Standard GA, Min-Min, Max-Min, Simulated Annealing, and Artificial Bee Colony Algorithm in finding the optimal scheme. Experimental results show that the proposed algorithm can achieve feasible solutions which have acceptable makespan with less computation time.
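    The record states only that crossover and mutation rates change with the generation and vary between individuals; the exact update rule is not given. The snippet below is therefore an assumed, illustrative adaptive-rate scheme (the names, ranges, and closeness measure are ours), meant to show the general idea rather than the paper's formula.

```python
def adaptive_rates(generation, max_generations, fitness, best, avg,
                   pc_range=(0.6, 0.9), pm_range=(0.01, 0.1)):
    """Illustrative adaptive crossover/mutation probabilities: the rates shrink as
    generations progress, and individuals close to the best fitness get lower rates."""
    progress = generation / max_generations
    # Individuals near the best fitness are disturbed less than poor ones.
    closeness = (fitness - avg) / (best - avg + 1e-9) if best > avg else 0.0
    closeness = max(0.0, min(1.0, closeness))
    pc = pc_range[1] - (pc_range[1] - pc_range[0]) * 0.5 * (progress + closeness)
    pm = pm_range[1] - (pm_range[1] - pm_range[0]) * 0.5 * (progress + closeness)
    return pc, pm

if __name__ == "__main__":
    # Example: makespan-style fitness where higher is better (e.g., -makespan).
    best, avg = -120.0, -180.0
    for gen in (0, 25, 50):
        pc, pm = adaptive_rates(gen, 50, fitness=-130.0, best=best, avg=avg)
        print(f"gen {gen:2d}: crossover={pc:.3f} mutation={pm:.3f}")
```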

  12. Schedulability-Driven Frame Packing for Multi-Cluster Distributed Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2003-01-01

    We present an approach to frame packing for multi-cluster distributed embedded systems consisting of time-triggered and event-triggered clusters, interconnected via gateways. In our approach, the application messages are packed into frames such that the application is schedulable. Thus, we have also proposed a schedulability analysis for applications consisting of mixed event-triggered and time-triggered processes and messages, and a worst-case queuing delay analysis for the gateways responsible for routing inter-cluster traffic. Optimization heuristics for frame packing aiming at producing a schedulable system have been proposed. Extensive experiments and a real-life example show the efficiency of our frame-packing approach.
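    The paper's heuristics optimize packing for schedulability; as a much simpler stand-in, the sketch below packs messages into frames with a bounded payload using first-fit decreasing. The payload size and message sizes are assumed values, and schedulability is not checked.

```python
def pack_messages(message_sizes, frame_payload=8):
    """First-fit decreasing packing of application messages into frames with a
    bounded payload (bytes). A stand-in for the paper's optimization heuristics:
    it only reduces the number of frames, not the response times."""
    frames = []  # each frame is a list of (message id, size)
    loads = []
    for mid, size in sorted(message_sizes.items(), key=lambda kv: -kv[1]):
        if size > frame_payload:
            raise ValueError(f"message {mid} exceeds the frame payload")
        for i, load in enumerate(loads):
            if load + size <= frame_payload:
                frames[i].append((mid, size))
                loads[i] += size
                break
        else:
            frames.append([(mid, size)])
            loads.append(size)
    return frames

if __name__ == "__main__":
    sizes = {"m1": 4, "m2": 3, "m3": 6, "m4": 2, "m5": 1}
    for i, frame in enumerate(pack_messages(sizes)):
        print(f"frame {i}: {frame}")
```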

  13. Artificial intelligence for the CTA Observatory scheduler

    Science.gov (United States)

    Colomé, Josep; Colomer, Pau; Campreciós, Jordi; Coiffard, Thierry; de Oña, Emma; Pedaletti, Giovanna; Torres, Diego F.; Garcia-Piquer, Alvaro

    2014-08-01

    The Cherenkov Telescope Array (CTA) project will be the next generation ground-based very high energy gamma-ray instrument. The success of the precursor projects (i.e., HESS, MAGIC, VERITAS) motivated the construction of this large infrastructure, which has been included in the roadmap of the ESFRI projects since 2008. CTA is planned to start the construction phase in 2015 and will consist of two arrays of Cherenkov telescopes operated as a proposal-driven open observatory. Two sites are foreseen in the southern and northern hemispheres. The CTA observatory will handle several observation modes and will have to operate tens of telescopes with highly efficient and reliable control. Thus, the CTA planning tool is a key element in the control layer for the optimization of the observatory time. The main purpose of the scheduler for CTA is the allocation of multiple tasks to one single array or to multiple sub-arrays of telescopes, while maximizing the scientific return of the facility and minimizing the operational costs. The scheduler considers long- and short-term varying conditions to optimize the prioritization of tasks. A short-term scheduler provides the system with the capability to adapt, in almost real time, the selected task to the varying execution constraints (i.e., Targets of Opportunity, health or status of the system components, environment conditions). The scheduling procedure ensures that long-term planning decisions are correctly transferred to the short-term prioritization process for a suitable selection of the next task to execute on the array. In this contribution we present the constraints on CTA task scheduling that helped classify it as a Flexible Job-Shop Problem case and find its optimal solution based on Artificial Intelligence techniques. We describe the scheduler prototype that uses a Guarded Discrete Stochastic Neural Network (GDSN), for an easy representation of the possible long- and short-term planning solutions, and Constraint

  14. Elementary sulfur in effluent from denitrifying sulfide removal process as adsorbent for zinc(II).

    Science.gov (United States)

    Chen, Chuan; Zhou, Xu; Wang, Aijie; Wu, Dong-hai; Liu, Li-hong; Ren, Nanqi; Lee, Duu-Jong

    2012-10-01

    The denitrifying sulfide removal (DSR) process can simultaneously convert sulfide, nitrate, and organic compounds into elementary sulfur (S(0)), di-nitrogen gas, and carbon dioxide, respectively. However, the S(0) formed in the DSR process consists of micro-sized colloids with negatively charged surfaces, making isolation of the S(0) colloids from other biological cells and metabolites difficult. This study proposed the use of S(0) in DSR effluent as a novel adsorbent for zinc removal from wastewaters. Batch and continuous tests were conducted for efficient zinc removal with S(0)-containing DSR effluent. The removal rates of zinc(II) increased with increasing pH. The formed S(0) colloids carried negative charge onto which zinc(II) ions could be adsorbed via electrostatic interactions. The zinc(II)-adsorbed S(0) colloids further enhanced the coagulation-sedimentation efficiency of suspended solids in DSR effluents. The DSR effluent therefore represents a promising coagulant for zinc(II)-containing wastewaters. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Project Schedule Simulation

    DEFF Research Database (Denmark)

    Mizouni, Rabeb; Lazarova-Molnar, Sanja

    2015-01-01

    Software projects frequently overrun both their budget and time. To improve the quality of initial project plans, we show in this paper the importance of (1) reflecting features' priorities/risk in task schedules and (2) considering uncertainties related to human factors in plan schedules. To make simulation tasks reflect features' priority as well as multimodal team allocation, enhanced project schedules (EPS), where remedial action scenarios (RAS) are added, were introduced. They reflect potential schedule modifications in case of uncertainties and promote a dynamic sequencing of the involved tasks rather than the static conventional approach.

  16. Cultural-Based Genetic Tabu Algorithm for Multiobjective Job Shop Scheduling

    Directory of Open Access Journals (Sweden)

    Yuzhen Yang

    2014-01-01

    Full Text Available The job shop scheduling problem, which has been dealt with by various traditional optimization methods over the decades, has proved to be an NP-hard problem that is difficult to solve, especially in the multiobjective case. In this paper, we propose a novel quad-space cultural genetic tabu algorithm (QSCGTA) to solve such problems. This algorithm provides a different structure from the original cultural algorithm in that it contains double belief spaces and population spaces. These spaces deal with different levels of populations globally and locally by applying genetic and tabu searches separately and exchanging information regularly to direct the process more effectively towards promising areas, along with modified multiobjective domination and transform functions. Moreover, we present a bidirectional shifting for the decoding process of job shop scheduling. The computational results we present demonstrate the effectiveness and efficiency of the cultural-based genetic tabu algorithm for the multiobjective job shop scheduling problem.

  17. Innovative Production Scheduling with Customer Satisfaction Based Measurement for the Sustainability of Manufacturing Firms

    Directory of Open Access Journals (Sweden)

    Sang-Oh Shim

    2017-12-01

    Full Text Available Scheduling problems for the sustainability of manufacturing firms in the era of the fourth industrial revolution are addressed in this research. In terms of open innovation, innovative production scheduling can be defined as scheduling using big data, cyber-physical systems, the internet of things, cloud computing, mobile networks, and so on. In this environment, one of the most important things is to develop an innovative scheduling algorithm for the sustainability of manufacturing firms. In this research, a flexible flowshop scheduling problem is considered with the properties of sequence-dependent setups and different process plans for jobs. In a flexible flowshop, there are serial workstations with multiple pieces of equipment that are able to process multiple lots simultaneously. Since scheduling in this type of shop is known to be extremely difficult, it is important to devise an efficient and effective scheduling algorithm. In this research, a heuristic algorithm is proposed based on a few dispatching rules and an economic lot size model with the objective of minimizing the total tardiness of orders. For the purposes of performance evaluation, a simulation study is conducted on randomly generated problem instances. The results imply that our proposed method outperforms the existing ones and greatly enhances the sustainability of manufacturing firms.

  18. CBS (Constraint-Based Scheduling) as a Determining Factor in the Success of Printing Companies

    Directory of Open Access Journals (Sweden)

    Hendra Achmadi

    2010-06-01

    Full Text Available The printing industry today faces intense competition, ranging from small home-based printers to businesses using offset machines that can print a hundred thousand copies per hour. Increasing competition demands a faster production time, from order entry and print proof through the production process to delivery to customers. When concurrent orders arrive, the PPIC (production planning and inventory control) staff often struggle to set production schedules for orders with overlapping delivery times, and orders may end up being refused because of scheduling difficulties, especially when the orders require the same offset machine and the same cylinder length while the number of cylinders is limited. Therefore, a printing company should be able to simulate production timing easily and implement it on the shop floor. CBS (Constraint-Based Scheduling) is a technique for scheduling production so that it runs smoothly and quickly and fulfills the promises made to customers. Several scheduling techniques can be used, including FCFS (First Come First Served), EDD (Earliest Due Date), and LCLS (Last Come Last Served). Better scheduling methods are thus required to obtain results quickly as schedules change.
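    As a small illustration of the dispatching rules named above (FCFS and EDD), the sketch below sequences a set of hypothetical orders on a single machine and reports the lateness of each order under either rule; the order data are made up.

```python
from dataclasses import dataclass

@dataclass
class Order:
    name: str
    arrival: float     # time the order entered the system
    due_date: float
    processing: float  # estimated processing time on the offset machine

def sequence(orders, rule="EDD"):
    """Sequence orders on a single machine by a dispatching rule:
    FCFS = by arrival time, EDD = by due date."""
    key = {"FCFS": lambda o: o.arrival, "EDD": lambda o: o.due_date}[rule]
    return sorted(orders, key=key)

def lateness(orders):
    """Completion time minus due date for each order when run back to back."""
    t, result = 0.0, {}
    for o in orders:
        t += o.processing
        result[o.name] = t - o.due_date
    return result

if __name__ == "__main__":
    orders = [Order("A", 0, 10, 4), Order("B", 1, 6, 3), Order("C", 2, 15, 5)]
    for rule in ("FCFS", "EDD"):
        print(rule, lateness(sequence(orders, rule)))
```

    EDD tends to reduce maximum lateness while FCFS preserves arrival order; a constraint-based scheduler layers resource constraints (machines, cylinders) on top of such an ordering.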

  19. On non-permutation solutions to some two machine flow shop scheduling problems

    NARCIS (Netherlands)

    V. Strusevich (Vitaly); P.J. Zwaneveld (Peter)

    1994-01-01

    In this paper, we study two versions of the two machine flow shop scheduling problem, where the schedule length is to be minimized. First, we consider the two machine flow shop with setup, processing, and removal times separated. It is shown that an optimal solution need not be a permutation schedule.

  20. Gain scheduling using the Youla parameterization

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    1999-01-01

    Gain scheduling controllers are considered in this paper. The gain scheduling problem where the scheduling parameter vector cannot be measured directly, but needs to be estimated, is considered. An estimate of the scheduling vector is derived by using the Youla parameterization, and its use in connection with H_inf gain scheduling controllers is discussed.

  1. Value Stream Mapping for Evaluation of Load Scheduling Possibilities in a District Heating Plant

    Directory of Open Access Journals (Sweden)

    Raivo Melsas

    2016-09-01

    Full Text Available The aim of this paper is to provide a solution for load scheduling by implementing value stream mapping, which is straightforward enough for production management. Decision makers in industry should have a clear understanding of the positive effect of load scheduling on production outcome and process availability. Value stream mapping is a well-known process optimization tool from the lean production philosophy. The aim of value stream mapping is to shorten the lead time of industrial processes and to reduce intermediate stock amounts. By complementing the value stream map with process energy intensity and the energy stored in intermediate stocks, we can identify load scheduling possibilities. Our methodology provides a tool that is understandable and traceable for industry-minded decision makers. Finally, we present a real-life test example of the new methodology, based on the production process of a district heating plant.

  2. NRC comprehensive records disposition schedule

    International Nuclear Information System (INIS)

    1982-07-01

    Effective January 1, 1982, NRC will institute records retention and disposal practices in accordance with the approved Comprehensive Records Disposition Schedule (CRDS). The CRDS is composed of NRC Schedules (NRCS) 1 to 4, which apply to the agency's program or substantive records, and General Records Schedules (GRS) 1 to 22, which apply to housekeeping or facilitative records. The schedules are assembled functionally/organizationally to facilitate their use. Preceding the records descriptions and disposition instructions for both the NRCS and GRS, there are brief statements on the organizational units that accumulate the records in each functional area, and other information regarding the schedules' applicability.

  3. Fabrication process for the PEP II RF cavities

    Energy Technology Data Exchange (ETDEWEB)

    Franks, R.M.; Rimmer, R.A. [Lawrence Berkeley National Lab., CA (United States); Schwarz, H. [Stanford Linear Accelerator Center, Menlo Park, CA (United States)

    1997-06-05

    This paper presents the major steps used in the fabrication of the 26 RF Cavities required for the PEP-II B-factory. Several unique applications of conventional processes have been developed and successfully implemented: electron beam welding (EBW), with minimal porosity, of .75 inch (19 mm) copper cross-sections; extensive 5-axis milling of water channels; electroplating of .37 inch (10 mm) thick OFE copper; tuning of the cavity by profiling beam noses prior to final joining with the cavity body; and machining of the cavity interior, are described here.

  4. Extension of the ACE solar panels is tested in SAEF-II

    Science.gov (United States)

    1997-01-01

    Extension of the solar panels is tested on the Advanced Composition Explorer (ACE) spacecraft in KSC's Spacecraft Assembly and Encapsulation Facility-II (SAEF-II). Scheduled for launch on a Delta II rocket from Cape Canaveral Air Station on Aug. 25, ACE will study low-energy particles of solar origin and high-energy galactic particles. The collecting power of instruments aboard ACE is 10 to 1,000 times greater than anything previously flown to collect similar data by NASA.

  5. How Home Health Nurses Plan Their Work Schedules: A Qualitative Descriptive Study.

    Science.gov (United States)

    Irani, Elliane; Hirschman, Karen B; Cacchione, Pamela Z; Bowles, Kathryn H

    2018-06-12

    To describe how home health nurses plan their daily work schedules and what challenges they face during the planning process. Home health nurses are viewed as independent providers and value the nature of their work because of the flexibility and autonomy they hold in developing their work schedules. However, there is limited empirical evidence about how home health nurses plan their work schedules, including the factors they consider during the process and the challenges they face within the dynamic home health setting. Qualitative descriptive design. Semi-structured interviews were conducted with 20 registered nurses who had greater than 2 years of experience in home health and were employed by one of the three participating home health agencies in the mid-Atlantic region of the United States. Data were analyzed using conventional content analysis. Four themes emerged about planning work schedules and daily itineraries: identifying patient needs to prioritize visits accordingly, partnering with patients to accommodate their preferences, coordinating visit timing with other providers to avoid overwhelming patients, and working within agency standards to meet productivity requirements. Scheduling challenges included readjusting the schedule based on patient needs and staffing availability, anticipating longer visits, and maintaining continuity of care with patients. Home health nurses make autonomous decisions regarding their work schedules while considering specific patient and agency factors, and overcome challenges related to the unpredictable nature of providing care in a home health setting. Future research is needed to further explore nurse productivity in home health and improve home health work environments. Home health nurses plan their work schedules to provide high quality care that is patient-centered and timely. The findings also highlight organizational priorities to facilitate continuity of care and support nurses while alleviating the burnout

  6. A General State-Space Formulation for Online Scheduling

    Directory of Open Access Journals (Sweden)

    Dhruv Gupta

    2017-11-01

    Full Text Available We present a generalized state-space model formulation particularly motivated by an online scheduling perspective, which allows modeling (1) task delays and unit breakdowns; (2) fractional delays and unit downtimes, when using a discrete-time grid; (3) variable batch sizes; (4) robust scheduling through the use of conservative yield estimates and processing times; (5) feedback on task-yield estimates before the task finishes; (6) task termination during its execution; (7) post-production storage of material in a unit; and (8) unit capacity degradation and maintenance. Through these proposed generalizations, we enable a natural way to handle routinely encountered disturbances and a rich set of corresponding counter-decisions, thereby greatly simplifying and extending the possible application of mathematical-programming-based online scheduling solutions to diverse application settings. Finally, we demonstrate the effectiveness of this model on a case study from the field of bio-manufacturing.

  7. An effective repetitive training schedule to achieve skill proficiency using a novel robotic virtual reality simulator.

    Science.gov (United States)

    Kang, Sung Gu; Ryu, Byung Ju; Yang, Kyung Sook; Ko, Young Hwii; Cho, Seok; Kang, Seok Ho; Patel, Vipul R; Cheon, Jun

    2015-01-01

    A robotic virtual reality simulator (Mimic dV-Trainer) can be a useful training method for the da Vinci surgical system. Herein, we investigate several repetitive training schedules and determine which is the most effective. A total of 30 medical students were enrolled and divided into 3 groups according to the training schedule. Group I performed the task for 1 hour daily for 4 consecutive days, group II performed the task once per week for 1 hour for 4 consecutive weeks, and group III performed the task for 4 consecutive hours in 1 day. The effects of training were investigated by analyzing the number of repetitions and the time required to complete the "Tube 2" simulation task when the learning curve plateau was reached. The point at which participants reached a stable score was evaluated using the cumulative sum control graph. The average time to complete the task at the learning curve plateau was 150.3 seconds in group I, 171.9 seconds in group II, and 188.5 seconds in group III. The number of task repetitions required to reach the learning curve plateau was 45 repetitions in group I, 36 repetitions in group II, and 39 repetitions in group III. There was continuous improvement in the time required to perform the task after 40 repetitions in group I only. There was a significant correlation between improvement in each trial interval and attempt, and the correlation coefficient (0.924) in group I was higher than that in group II (0.899) and group III (0.838). Daily 1-hour practice sessions performed for 4 consecutive days resulted in the best final score, continuous score improvement, and effective training while minimizing fatigue. This repetition schedule can be used to train novices effectively in the future. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  8. A comparative study of itraconazole in various dose schedules in the treatment of pulmonary aspergilloma in treated patients of pulmonary tuberculosis

    Directory of Open Access Journals (Sweden)

    Prahlad Rai Gupta

    2015-01-01

    Full Text Available Introduction: The optimal dose, duration, and efficacy of itraconazole in Indian patients with pulmonary aspergilloma (PA) are not clearly defined. Therefore, a study was carried out to resolve these issues in diagnosed cases of PA complicating previously treated patients of pulmonary tuberculosis. Materials and Methods: The study patients randomly received itraconazole either in a fixed dose schedule of 200 mg (group I), 200 mg twice daily (group II), or a variable dose schedule (group III), for 12 months. All the patients were followed up for the entire duration of the study for clinical, radiological, and immunological response. The side effects were recorded as and when reported by the patients and managed symptomatically. Results: A total of 60 patients were enrolled, 20 in each group. There were no intergroup differences with regard to age, sex, body weight, smoking status, alcohol intake, symptoms, potassium hydroxide (KOH) mount, fungal culture, pattern of radiological lesions, or anti-aspergillus antibody (anti-Asp-Ab) titers. The radiological response was poor in group I patients, as compared to the other groups, at two months (P < 0.05). The dose of itraconazole was increased in five of the patients in group I due to poor response. A higher number of group II patients suffered side effects, and the dose of itraconazole had to be decreased in three of these patients, but none of the patients on the variable dose schedule required a change in dose schedule. Conclusion: Thus, a weight-based variable dose schedule of itraconazole was found to be a more effective and safer modality in the management of PA than a fixed dose schedule.

  9. Time and Energy Efficient DVS Scheduling for Real-Time Pinwheel Tasks

    OpenAIRE

    Da-Ren, Chen; Young-Long, Chen; You-Shyang, Chen

    2014-01-01

    Dynamic voltage/frequency scaling (DVFS) is one of the most effective techniques for reducing energy use. In this paper, we focus on the pinwheel task model to develop a variable voltage processor with d discrete voltage/speed levels. Depending on the granularity of execution unit to which voltage scaling is applied, DVFS scheduling can be defined in two categories: (i) inter-task DVFS and (ii) intra-task DVFS. In the periodic pinwheel task model, we modified the definitions of both intra- an...

  10. Parallel-Machine Scheduling with Time-Dependent and Machine Availability Constraints

    Directory of Open Access Journals (Sweden)

    Cuixia Miao

    2015-01-01

    Full Text Available We consider the parallel-machine scheduling problem in which the machines have availability constraints and the processing time of each job is a simple linear increasing function of its starting time. For the makespan minimization problem, which is NP-hard in the strong sense, we discuss the Longest Deteriorating Rate algorithm and the List Scheduling algorithm; we also provide a lower bound on any optimal schedule. For the total completion time minimization problem, we analyze the strong NP-hardness, and we present a dynamic programming algorithm and a fully polynomial time approximation scheme for the two-machine problem. Furthermore, we extend the dynamic programming algorithm to the total weighted completion time minimization problem.
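    To make the deteriorating-job setting concrete, the sketch below implements a plain list scheduling step under one common "simple linear deterioration" assumption, p_j = b_j * s_j with machines available from time t0 = 1; sorting the rates in non-increasing order first corresponds to a longest-deteriorating-rate ordering. The model details and values are assumptions, not the paper's exact formulation.

```python
import heapq

def list_schedule(deterioration_rates, num_machines, t0=1.0):
    """List scheduling for simply deteriorating jobs: the actual processing time
    of a job with rate b started at time t is b * t (illustrative model, machines
    free at t0). Each job in the given order goes to the earliest-free machine."""
    machines = [(t0, m) for m in range(num_machines)]  # (time machine becomes free, id)
    heapq.heapify(machines)
    assignment = []
    for j, b in enumerate(deterioration_rates):
        start, m = heapq.heappop(machines)
        finish = start + b * start          # p_j = b_j * start  =>  C_j = start * (1 + b_j)
        assignment.append((j, m, start, finish))
        heapq.heappush(machines, (finish, m))
    makespan = max(f for _, _, _, f in assignment)
    return assignment, makespan

if __name__ == "__main__":
    # Longest Deteriorating Rate first: sort rates in non-increasing order.
    rates = sorted([0.5, 0.2, 0.8, 0.3, 0.6], reverse=True)
    plan, cmax = list_schedule(rates, num_machines=2)
    print(plan, "makespan =", cmax)
```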

  11. Interface between the production plan and the master production schedule in assembly environments

    OpenAIRE

    Moya Navarro, Marcos; Sánchez Brenes, Magaly

    2012-01-01

    In a production environment there is a direct relationship between the market and the manufacturing process of goods. When production takes place in an assembly environment, the process of production planning and scheduling becomes complex, and enterprises run the risk of losing competitive advantage through missed delivery dates and high production costs. Linear programming has become an appropriate tool for production planning and scheduling in complex manufacturing environmen...

  12. Multiuser switched diversity scheduling systems with per-user threshold

    KAUST Repository

    Nam, Haewoon

    2010-05-01

    A multiuser switched diversity scheduling scheme with per-user feedback threshold is proposed and analyzed in this paper. The conventional multiuser switched diversity scheduling scheme uses a single feedback threshold for every user, where the threshold is a function of the average signal-to-noise ratios (SNRs) of the users as well as the number of users involved in the scheduling process. The proposed scheme, however, constructs a sequence of feedback thresholds instead of a single feedback threshold such that each user compares its channel quality with the corresponding feedback threshold in the sequence. Numerical and simulation results show that thanks to the flexibility of threshold selection, where a potentially different threshold can be used for each user, the proposed scheme provides a higher system capacity than that for the conventional scheme. © 2006 IEEE.
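    As a rough illustration of the per-user threshold idea (not the paper's analytical scheme), the sketch below probes users in a fixed order and schedules the first one whose instantaneous SNR exceeds its own threshold, falling back to the best user when none qualifies; the thresholds, average SNRs, and the exponential fading model are assumed.

```python
import random

def switched_diversity_select(snrs, thresholds):
    """Probe users in order and schedule the first whose instantaneous SNR meets
    its own feedback threshold; if none qualifies, fall back to the best user.
    `snrs` and `thresholds` are dicts keyed by user id (illustrative model)."""
    for user in thresholds:                     # probing order = threshold sequence order
        if snrs[user] >= thresholds[user]:
            return user
    return max(snrs, key=snrs.get)              # fallback: best instantaneous SNR

if __name__ == "__main__":
    random.seed(1)
    # Per-user thresholds, e.g. scaled to each user's average SNR (assumed values).
    thresholds = {"u1": 8.0, "u2": 5.0, "u3": 3.0}
    avg_snr = {"u1": 10.0, "u2": 6.0, "u3": 4.0}
    for slot in range(3):
        snrs = {u: random.expovariate(1.0 / avg_snr[u]) for u in avg_snr}
        print(f"slot {slot}: scheduled", switched_diversity_select(snrs, thresholds))
```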

  13. Comparison Performance of Genetic Algorithm and Ant Colony Optimization in Course Scheduling Optimizing

    Directory of Open Access Journals (Sweden)

    Imam Ahmad Ashari

    2016-11-01

    Full Text Available Scheduling problems at a university are a complex type of scheduling problem. The scheduling process must be carried out at the turn of every semester. The core difficulty of scheduling courses at a university is the number of components that must be considered in building the schedule: students, lecturers, time slots, and rooms, together with limits and conditions that must be respected so that there are no collisions in the schedule, such as room clashes, lecturer clashes, and others. The most appropriate technique for resolving such a scheduling problem is optimization, which can give the best achievable results. Metaheuristic algorithms provide many ways to approach the optimal solution. In this paper, we use a genetic algorithm and an ant colony optimization algorithm, both metaheuristics, to solve the course scheduling problem. The two algorithms are tested and compared to determine which performs best. The algorithms were tested using course scheduling data from a university in Semarang. From the experimental results we conclude that the genetic algorithm has better performance than the ant colony optimization algorithm in solving this case of course scheduling.

  14. Amphetamine increases schedule-induced drinking reduced by negative punishment procedures.

    Science.gov (United States)

    Pérez-Padilla, Angeles; Pellón, Ricardo

    2003-05-01

    d-Amphetamine has been reported to increase schedule-induced drinking punished by lick-dependent signalled delays in food delivery. This might reflect a drug-behaviour interaction dependent on the type of punisher, because no such effect has been found when drinking was reduced by lick-contingent electric shocks. However, the anti-punishment effect of amphetamine could be mediated by other behavioural processes, such as a loss of discriminative control or an increase in the value of delayed reinforcers. The aim was to test the effects of d-amphetamine on the acquisition and maintenance of schedule-induced drinking reduced by unsignalled delays in food delivery. Rats received 10-s unsignalled delays initiated by each lick, either after polydipsia had been induced by a fixed-time 30-s food reinforcement schedule or from the outset of the experiment. Yoked-control rats received these same delays but independently of their own behaviour. d-Amphetamine (0.1-3.0 mg/kg, IP) was then tested. d-Amphetamine dose-dependently increased and then decreased punished schedule-induced drinking. The drug led to dose-dependent reductions when the delays were not contingent on licking or when they were applied from the outset of training. These results support the contention that d-amphetamine increases schedule-induced drinking that has previously been reduced by a negative punishment procedure. This effect cannot be attributed to the other potentially involved processes, and it therefore supports the idea that drug effects on punished behaviour depend on whether the punisher is a delay in food delivery or a shock delivery.

  15. It Is Not Just about the Schedule: Key Factors in Effective Reference Desk Scheduling and Management

    Science.gov (United States)

    Sciammarella, Susan; Fernandes, Maria Isabel; McKay, Devin

    2008-01-01

    Reference desk scheduling is one of the most challenging tasks in the organizational structure of an academic library. The ability to turn this challenge into a workable and effective function lies with the scheduler and indirectly the cooperation of all librarians scheduled for reference desk service. It is the scheduler's sensitivity to such…

  16. Real-time energy resources scheduling considering short-term and very short-term wind forecast

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Marco; Sousa, Tiago; Morais, Hugo; Vale, Zita [Polytechnic of Porto (Portugal). GECAD - Knowledge Engineering and Decision Support Research Center

    2012-07-01

    This paper proposes an energy resources management methodology based on three distinct time horizons: day-ahead scheduling, hour-ahead scheduling, and real-time scheduling. Each scheduling process uses updated generation and consumption operating conditions and the current status of the storage units and of electric vehicle storage. Besides the new operating conditions, the most accurate forecast values of wind generation and of consumption, obtained with short-term and very short-term methods, are used. A case study considering a distribution network with intensive use of distributed generation and electric vehicles is presented. (orig.)

  17. Approximating Preemptive Stochastic Scheduling

    OpenAIRE

    Megow Nicole; Vredeveld Tjark

    2009-01-01

    We present constant approximative policies for preemptive stochastic scheduling. We derive policies with a guaranteed performance ratio of 2 for scheduling jobs with release dates on identical parallel machines subject to minimizing the sum of weighted completion times. Our policies as well as their analysis apply also to the recently introduced more general model of stochastic online scheduling. The performance guarantee we give matches the best result known for the corresponding determinist...

  18. ATD-2 Surface Scheduling and Metering Concept

    Science.gov (United States)

    Coppenbarger, Richard A.; Jung, Yoon Chul; Capps, Richard Alan; Engelland, Shawn A.

    2017-01-01

    This presentation describes the concept of ATD-2 tactical surface scheduling and metering. The concept is composed of several elements, including data exchange and integration, surface modeling, surface scheduling, and surface metering, and the presentation explains each of them. Surface metering is implemented to balance demand and capacity. When surface metering is on, target times from the surface scheduler are converted to advisories for throttling demand. Through the scheduling process, flights with CTOTs will not get added metering delay (avoiding the potential for 'double delay'), and carriers can designate certain flights as exempt from metering holds. In Phase 1 at CLT, demand is throttled through advisories sent to ramp controllers, who issue pushback instructions to the flight deck: push now, or hold for an advised period of time (in minutes). The principles of surface metering can be applied more generally to other airports in the NAS to throttle demand via spot-release times (TMATs). The concept places a strong focus on optimal use of airport resources; flexibility enables stakeholders to vary the amount of delay they would like transferred to the gate, and practical aspects of executing surface metering in a turbulent real-world environment are addressed. The algorithms are designed both for short-term demand/capacity imbalances (banks) and for sustained metering situations, and they leverage automation to enable a surface metering capability without requiring additional positions. The work represents a first step in tactical/strategic fusion and provides longer look-ahead calculations to enable analysis of potential strategic surface metering usage.

  19. Schedules of Controlled Substances: Temporary Placement of ortho-Fluorofentanyl, Tetrahydrofuranyl Fentanyl, and Methoxyacetyl Fentanyl Into Schedule I. Temporary amendment; temporary scheduling order.

    Science.gov (United States)

    2017-10-26

    The Administrator of the Drug Enforcement Administration is issuing this temporary scheduling order to schedule the synthetic opioids, N-(2-fluorophenyl)-N-(1-phenethylpiperidin-4-yl)propionamide (ortho-fluorofentanyl or 2-fluorofentanyl), N-(1-phenethylpiperidin-4-yl)-N-phenyltetrahydrofuran-2-carboxamide (tetrahydrofuranyl fentanyl), and 2-methoxy-N-(1-phenethylpiperidin-4-yl)-N-phenylacetamide (methoxyacetyl fentanyl), into Schedule I. This action is based on a finding by the Administrator that the placement of ortho-fluorofentanyl, tetrahydrofuranyl fentanyl, and methoxyacetyl fentanyl into Schedule I of the Controlled Substances Act is necessary to avoid an imminent hazard to the public safety. As a result of this order, the regulatory controls and administrative, civil, and criminal sanctions applicable to Schedule I controlled substances will be imposed on persons who handle (manufacture, distribute, reverse distribute, import, export, engage in research, conduct instructional activities or chemical analysis, or possess), or propose to handle, ortho-fluorofentanyl, tetrahydrofuranyl fentanyl, and methoxyacetyl fentanyl.

  20. Planning and Scheduling for Environmental Sensor Networks

    Science.gov (United States)

    Frank, J. D.

    2005-12-01

    Environmental Sensor Networks are a new way of monitoring the environment. They comprise autonomous sensor nodes in the environment that record real-time data, which is retrieved, analyzed, integrated with other data sets (e.g. satellite images, GIS, process models) and ultimately lead to scientific discoveries. Sensor networks must operate within time and resource constraints. Sensors have limited onboard memory, energy, computational power, communications windows and communications bandwidth. The value of data will depend on when, where and how it was collected, how detailed the data is, how long it takes to integrate the data, and how important the data was to the original scientific question. Planning and scheduling of sensor networks is necessary for effective, safe operations in the face of these constraints. For example, power bus limitations may preclude sensors from simultaneously collecting data and communicating without damaging the sensor; planners and schedulers can ensure these operations are ordered so that they do not happen simultaneously. Planning and scheduling can also ensure best use of the sensor network to maximize the value of collected science data. For example, if data is best recorded using a particular camera angle but it is costly in time and energy to achieve this, planners and schedulers can search for times when time and energy are available to achieve the optimal camera angle. Planning and scheduling can handle uncertainty in the problem specification; planners can be re-run when new information is made available, or can generate plans that include contingencies. For example, if bad weather may prevent the collection of data, a contingent plan can check lighting conditions and turn off data collection to save resources if lighting is not ideal. Both mobile and immobile sensors can benefit from planning and scheduling. For example, data collection on otherwise passive sensors can be halted to preserve limited power and memory

  1. Data Model Approach And Markov Chain Based Analysis Of Multi-Level Queue Scheduling

    Directory of Open Access Journals (Sweden)

    Diwakar Shukla

    2010-01-01

    Full Text Available There are many CPU scheduling algorithms in the literature, such as FIFO, Round Robin, Shortest-Job-First, and so on. Multilevel queue scheduling is superior to these due to its better management of a variety of processes. In this paper, a Markov chain model is used for a general setup of multilevel queue scheduling, and the scheduler is assumed to perform random movement over the queues in each quantum of time. The performance of the scheduling is examined through a row-dependent data model. It is found that with increasing values of α and d, the chance of the system entering the waiting state reduces. At some interesting combinations of α and d, it diminishes to zero, which provides a clue regarding better choices of queues for high-priority jobs. It is found that if queue priorities are added to the scheduling intelligently, then better performance can be obtained. The data model helps in choosing appropriate preferences.
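    The record does not define its α and d parameters precisely, so the sketch below is only an assumed illustration of the general setup: a small Markov chain over two queues and a waiting state whose transition probabilities are parameterized by stand-in values of α and d, with the stationary distribution computed numerically.

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution pi of a Markov chain with transition matrix P,
    obtained as the eigenvector of P^T associated with eigenvalue 1."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

if __name__ == "__main__":
    # States: queue Q1, queue Q2, waiting. The transition probabilities are assumed
    # values standing in for the paper's alpha/d parameters (alpha > d assumed).
    alpha, d = 0.6, 0.2
    P = np.array([
        [alpha,     1 - alpha - d, d],    # from Q1
        [1 - alpha, alpha - d,     d],    # from Q2
        [0.5,       0.5,           0.0],  # from waiting, back to a queue
    ])
    assert np.allclose(P.sum(axis=1), 1.0)
    print("stationary distribution:", stationary_distribution(P))
```

    Larger d values push more probability mass toward the waiting state, which matches the record's observation that the waiting-state chance depends on the α and d combination.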

  2. OL-DEC-MDP Model for Multiagent Online Scheduling with a Time-Dependent Probability of Success

    Directory of Open Access Journals (Sweden)

    Cheng Zhu

    2014-01-01

    Full Text Available Focusing on the online multiagent scheduling problem, this paper considers a time-dependent probability of success and processing duration and proposes an OL-DEC-MDP (opportunity-loss decentralized Markov decision process) model that includes opportunity loss in the scheduling decision to improve overall performance. The success probability of job processing as well as the processing duration depends on the time at which the processing is started. The probability of an agent completing the assigned job is higher when the processing is started earlier, but the opportunity loss may also be high due to the longer engagement duration. As a result, the OL-DEC-MDP model introduces a reward function that accounts for the opportunity loss, which is estimated based on a prediction of the upcoming jobs obtained by sampling the job-arrival process. Heuristic strategies are introduced for computing the best starting time for an incoming job by each agent, and an incoming job is always scheduled to the agent with the highest reward among all agents under their best starting policies. The simulation experiments show that the OL-DEC-MDP model improves the overall scheduling performance compared with models that do not consider opportunity loss in heavy-load environments.

  3. A three-stage heuristic for harvest scheduling with access road network development

    Science.gov (United States)

    Mark M. Clark; Russell D. Meller; Timothy P. McDonald

    2000-01-01

    In this article we present a new model for the scheduling of forest harvesting with spatial and temporal constraints. Our approach is unique in that we incorporate access road network development into the harvest scheduling selection process. Due to the difficulty of solving the problem optimally, we develop a heuristic that consists of a solution construction stage...

  4. A Heuristic Scheduling Algorithm for Minimizing Makespan and Idle Time in a Nagare Cell

    Directory of Open Access Journals (Sweden)

    M. Muthukumaran

    2012-01-01

    Full Text Available Adopting a focused factory is a powerful approach for today's manufacturing enterprises. This paper introduces a basic manufacturing concept for a struggling manufacturer with limited conventional resources, providing an alternative solution to cell scheduling by implementing the Nagare cell technique. The Nagare cell is a Japanese concept with more objectives than a conventional cellular manufacturing system. It is a combination of manual and semiautomatic machines laid out as cells, which gives maximum output flexibility for all kinds of low-to-medium and medium-to-high volume production. The solution adopted is to create a dedicated group of conventional machines, all but one of which are already available on the shop floor. This paper focuses on the step-by-step development of a heuristic scheduling algorithm. In the algorithm, the total processing time of all products on each machine is calculated first, and these totals are then sorted by the shortest-processing-time rule to obtain the assignment schedule. Based on the assignment schedule, the Nagare cell layout is arranged for processing the products. In addition, the algorithm provides steps to determine the product ready time, machine idle time, and product idle time. The Gantt chart, the experimental analysis, and the comparative results are illustrated with five (1×8 to 5×8) scheduling problems. Finally, the objective of minimizing makespan and idle time with greater customer satisfaction is studied throughout.
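    The assignment-ordering step described above (sum each machine's processing times, then sort by the shortest-processing-time rule) is simple enough to sketch directly; the function and data below are illustrative, with made-up processing times.

```python
def spt_assignment_order(processing_times):
    """First step of the heuristic as described: sum the processing times of all
    products on each machine, then sort the machines by the shortest-processing-time
    rule. `processing_times[product][machine]` holds the unit processing times."""
    totals = {}
    for product, times in processing_times.items():
        for machine, t in times.items():
            totals[machine] = totals.get(machine, 0.0) + t
    # Machines ordered by increasing total load (SPT rule on the sums).
    return sorted(totals.items(), key=lambda kv: kv[1])

if __name__ == "__main__":
    data = {
        "P1": {"M1": 3.0, "M2": 5.0, "M3": 2.0},
        "P2": {"M1": 4.0, "M2": 1.0, "M3": 6.0},
    }
    print(spt_assignment_order(data))  # e.g. [('M2', 6.0), ('M1', 7.0), ('M3', 8.0)]
```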

  5. Barriers to participation in a phase II cardiac rehabilitation programme.

    Science.gov (United States)

    Mak, Y M W; Chan, W K; Yue, C S S

    2005-12-01

    To identify barriers to participation in a phase II cardiac rehabilitation programme and measures that may enhance participation. Prospective study. Regional hospital, Hong Kong. Cardiac patients recruited for a phase I cardiac rehabilitation programme from July 2002 to January 2003. Reasons for not participating in a phase II cardiac rehabilitation programme. Of the 193 patients recruited for a phase I cardiac rehabilitation programme, 152 (79%) patients, with a mean age of 70.3 years (standard deviation, 11.9 years), did not proceed to phase II programme. Eleven (7%) deaths occurred before commencement of phase II and 74 (49%) patients were considered physically unfit. Reasons for the latter included fractures, pain, or degenerative changes in the lower limbs (24%), and co-morbidities such as cerebrovascular accident (19%), chronic renal failure (11%), congestive heart failure (9%), and unstable angina (8%). Phase II rehabilitation was postponed until after completion of scheduled cardiac interventions in 13% of patients. Failure of physicians to arrange the pre-phase II exercise stress test as per protocol was reported in 7% of patients. Other reasons were reported: work or time conflicts (16%), non-compliance with cardiac treatment (5%), financial constraints (4%), self-exercise (3%), fear after exercise stress testing (3%), and patients returning to their original cardiologists for treatment (3%). A significant (79%) proportion of patients did not proceed to a phase II cardiac rehabilitation programme for a variety of reasons. These included physical unfitness, work or time conflicts, and need to attend scheduled cardiac interventions. Further studies are required to determine how to overcome obstacles to cardiac rehabilitation.

  6. Schedule-Aware Workflow Management Systems

    Science.gov (United States)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.

  7. Successful Implementation of Six Sigma to Schedule Student Staffing for Circulation Service Desks

    Science.gov (United States)

    Jankowski, Janiece

    2013-01-01

    In fall of 2011 the University at Buffalo Libraries circulation department undertook Six Sigma training for the purpose of overhauling its student scheduling process. The department was able to mitigate significant staffing budgetary reductions and resource reallocations and to overcome the unique challenges of scheduling student labor for a…

  8. Revisiting Symbiotic Job Scheduling

    OpenAIRE

    Eyerman , Stijn; Michaud , Pierre; Rogiest , Wouter

    2015-01-01

    Symbiotic job scheduling exploits the fact that in a system with shared resources, the performance of jobs is impacted by the behavior of other co-running jobs. By coscheduling combinations of jobs that have low interference, the performance of a system can be increased. In this paper, we investigate the impact of using symbiotic job scheduling for increasing throughput. We find that even for a theoretically optimal scheduler, this impact is very low, despite the subs...

  9. Collaborative Distributed Scheduling Approaches for Wireless Sensor Network

    Science.gov (United States)

    Niu, Jianjun; Deng, Zhidong

    2009-01-01

    Energy constraints restrict the lifetime of wireless sensor networks (WSNs) with battery-powered nodes, which poses great challenges for their large scale application. In this paper, we propose a family of collaborative distributed scheduling approaches (CDSAs) based on the Markov process to reduce the energy consumption of a WSN. The family of CDSAs comprises two approaches: a one-step collaborative distributed approach and a two-step collaborative distributed approach. The approaches enable nodes to learn the behavior information of their environment collaboratively and integrate sleep scheduling with transmission scheduling to reduce the energy consumption. We analyze the adaptability and practicality features of the CDSAs. The simulation results show that the two proposed approaches can effectively reduce nodes' energy consumption. Some other characteristics of the CDSAs, like buffer occupation and packet delay, are also analyzed in this paper. We evaluate the CDSAs extensively on a 15-node WSN testbed. The test results show that the CDSAs conserve energy effectively and are feasible for real WSNs. PMID:22408491

  10. Preliminary evaluation of alternative waste form solidification processes. Volume II. Evaluation of the processes

    International Nuclear Information System (INIS)

    1980-08-01

    This Volume II presents engineering feasibility evaluations of the eleven processes for solidification of nuclear high-level liquid wastes (HHLW) described in Volume I of this report. Each evaluation was based on a systematic assessment of the process with respect to six principal evaluation criteria: complexity of process; state of development; safety; process requirements; development work required; and facility requirements. The principal criteria were further subdivided into a total of 22 subcriteria, each of which was assigned a weight. Each process was then assigned a figure of merit, on a scale of 1 to 10, for each of the subcriteria. A total rating was obtained for each process by summing the products of the subcriteria ratings and the subcriteria weights. The evaluations were based on the process descriptions presented in Volume I of this report, supplemented by information obtained from the literature, including publications by the originators of the various processes. Waste form properties were, in general, not evaluated. This document describes the approach which was taken, the development and application of the rating criteria and subcriteria, and the evaluation results. A series of appendices set forth summary descriptions of the processes and the ratings, together with the complete numerical ratings assigned; two appendices present further technical details on the rating process.

  11. A stochastic simulation approach for production scheduling and ...

    African Journals Online (AJOL)

    International Journal of Engineering, Science and Technology ... management decisions related to production scheduling and investment planning. ... and indicate the value of promoting an information culture in the entire work forces. ... to support decision making in a BPR (Business Processes Re-engineering) scenario.

  12. A decomposition heuristics based on multi-bottleneck machines for large-scale job shop scheduling problems

    Directory of Open Access Journals (Sweden)

    Yingni Zhai

    2014-10-01

    Full Text Available Purpose: A decomposition heuristic based on multi-bottleneck machines for large-scale job shop scheduling problems (JSP) is proposed. Design/methodology/approach: In the algorithm, a number of sub-problems are constructed by iteratively decomposing the large-scale JSP according to the process route of each job. The solution of the large-scale JSP can then be obtained by iteratively solving the sub-problems. In order to improve the sub-problems' solving efficiency and the solution quality, a detection method for multi-bottleneck machines based on the critical path is proposed. The unscheduled operations can thereby be decomposed into bottleneck operations and non-bottleneck operations. According to the principle of “the bottleneck leads the performance of the whole manufacturing system” in TOC (Theory of Constraints), the bottleneck operations are scheduled by a genetic algorithm for high solution quality, and the non-bottleneck operations are scheduled by dispatching rules to improve the solving efficiency. Findings: In the process of the sub-problems' construction, partial operations in the previously scheduled sub-problem are moved into the successive sub-problem for re-optimization. This strategy improves the solution quality of the algorithm. In the process of solving the sub-problems, evaluating a chromosome's fitness by predicting the global scheduling objective value also improves the solution quality. Research limitations/implications: In this research, there are some assumptions which reduce the complexity of the large-scale scheduling problem. They are as follows: the processing route of each job is predetermined, and the processing time of each operation is fixed. There is no machine breakdown, and no preemption of the operations is allowed. These assumptions should be considered if the algorithm is used in an actual job shop. Originality/value: The research provides an efficient scheduling method for the
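
    A rough sketch of the decomposition idea: rank the machines, treat the top-ranked ones as bottlenecks, and split the operations accordingly so that only the bottleneck operations go to the expensive metaheuristic. The paper detects multi-bottleneck machines from the critical path; the sketch below substitutes a simpler total-workload proxy, and all data are invented.

```python
from collections import defaultdict

# Each operation: (job, machine, processing_time); routes are fixed per job.
operations = [
    ("J1", "M1", 5), ("J1", "M2", 3), ("J1", "M3", 4),
    ("J2", "M2", 6), ("J2", "M1", 2), ("J2", "M3", 5),
    ("J3", "M2", 4), ("J3", "M3", 3), ("J3", "M1", 6),
]

def detect_bottlenecks(ops, k=1):
    """Rank machines by total workload and treat the top-k as bottlenecks
    (a stand-in for the paper's critical-path-based detection)."""
    load = defaultdict(int)
    for _, machine, p in ops:
        load[machine] += p
    ranked = sorted(load, key=load.get, reverse=True)
    return set(ranked[:k])

bottlenecks = detect_bottlenecks(operations, k=1)
bottleneck_ops     = [op for op in operations if op[1] in bottlenecks]
non_bottleneck_ops = [op for op in operations if op[1] not in bottlenecks]
# bottleneck_ops would go to the GA; non_bottleneck_ops to fast dispatching rules.
print(bottlenecks, len(bottleneck_ops), len(non_bottleneck_ops))
```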

  13. Sharing Data for Production Scheduling Using the ISA-95 Standard

    Energy Technology Data Exchange (ETDEWEB)

    Harjunkoski, Iiro, E-mail: iiro.harjunkoski@de.abb.com; Bauer, Reinhard [ABB Corporate Research, Industrial Software and Applications, Ladenburg (Germany)

    2014-10-21

    In the development and deployment of production scheduling solutions, one major challenge is to establish efficient information sharing with industrial production management systems. Information comprising production orders to be scheduled, processing plant structure, product recipes, available equipment, and other resources is necessary for producing a realistic short-term production plan. Currently, a widely accepted standard for information sharing is missing. This often leads to the implementation of costly custom-tailored interfaces, or in the worst case the scheduling solution will be abandoned. Additionally, it becomes difficult to easily compare different methods on various problem instances, which complicates the re-use of existing scheduling solutions. In order to overcome these hurdles, a platform-independent and holistic approach is needed. Nevertheless, it is difficult for any new solution to gain wide acceptance within industry as new standards are often refused by companies already using a different established interface. From an acceptance point of view, the ISA-95 standard could act as a neutral data-exchange platform. In this paper, we assess if this already widespread standard is simple, yet powerful enough to act as the desired holistic data exchange for scheduling solutions.
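
    Conceptually, the exchange boils down to mapping a standardized production-schedule message onto the scheduler's internal order objects. The fragment below is a deliberately simplified, dictionary-based illustration loosely modeled on ISA-95 concepts (production requests, quantities, segment and equipment requirements); the element names are assumptions for illustration and do not reproduce the normative ISA-95/B2MML schema.

```python
# Illustrative, simplified payload loosely following ISA-95 concepts.
# Element names here are assumptions, not the normative schema.
production_schedule = {
    "ProductionSchedule": {
        "ID": "PS-2014-001",
        "ProductionRequest": [
            {
                "ID": "PR-1001",
                "Product": "PROD-A",
                "Quantity": {"Value": 500, "UnitOfMeasure": "kg"},
                "EarliestStartTime": "2014-10-21T06:00:00",
                "LatestEndTime": "2014-10-22T18:00:00",
                "SegmentRequirement": [
                    {"ProcessSegment": "REACTION",
                     "EquipmentRequirement": ["REACTOR-1", "REACTOR-2"]},
                    {"ProcessSegment": "PACKAGING",
                     "EquipmentRequirement": ["LINE-3"]},
                ],
            }
        ],
    }
}

def extract_orders(schedule):
    """Map the exchanged data onto the scheduler's internal order tuples."""
    for req in schedule["ProductionSchedule"]["ProductionRequest"]:
        yield req["ID"], req["Product"], req["Quantity"]["Value"]

print(list(extract_orders(production_schedule)))
```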

  14. Sharing Data for Production Scheduling Using the ISA-95 Standard

    International Nuclear Information System (INIS)

    Harjunkoski, Iiro; Bauer, Reinhard

    2014-01-01

    In the development and deployment of production scheduling solutions, one major challenge is to establish efficient information sharing with industrial production management systems. Information comprising production orders to be scheduled, processing plant structure, product recipes, available equipment, and other resources is necessary for producing a realistic short-term production plan. Currently, a widely accepted standard for information sharing is missing. This often leads to the implementation of costly custom-tailored interfaces, or in the worst case the scheduling solution will be abandoned. Additionally, it becomes difficult to easily compare different methods on various problem instances, which complicates the re-use of existing scheduling solutions. In order to overcome these hurdles, a platform-independent and holistic approach is needed. Nevertheless, it is difficult for any new solution to gain wide acceptance within industry as new standards are often refused by companies already using a different established interface. From an acceptance point of view, the ISA-95 standard could act as a neutral data-exchange platform. In this paper, we assess if this already widespread standard is simple, yet powerful enough to act as the desired holistic data exchange for scheduling solutions.

  15. Sharing data for production scheduling using the ISA-95 standard

    Directory of Open Access Journals (Sweden)

    Iiro Harjunkoski

    2014-10-01

    Full Text Available In the development and deployment of production scheduling solutions, one major challenge is to establish efficient information sharing with industrial production management systems. Information comprising production orders to be scheduled, processing plant structure, product recipes, available equipment and other resources is necessary for producing a realistic short-term production plan. Currently, a widely-accepted standard for information sharing is missing. This often leads to the implementation of costly custom-tailored interfaces, or in the worst case the scheduling solution will be abandoned. Additionally, it becomes difficult to easily compare different methods on various problem instances, which complicates the re-use of existing scheduling solutions. In order to overcome these hurdles, a platform-independent and holistic approach is needed. Nevertheless, it is difficult for any new solution to gain wide acceptance within industry as new standards are often refused by companies already using a different established interface. From an acceptance point of view, the ISA-95 standard could act as a neutral data-exchange platform. In this paper, we assess if this already widespread standard is simple, yet powerful enough to act as the desired holistic data-exchange for scheduling solutions.

  16. Power-scheduling - Introduction of the FPBG1 System

    International Nuclear Information System (INIS)

    Braun, A.; Meier, W.

    2006-01-01

    This article takes a look at the Scheduled Balance Group system (in German: 'Fahrplanbilanzgruppensystem', FPBG) which was successfully introduced in Switzerland in December 2005. The development of the system is described and the reasons for the development of a new concept for Switzerland are discussed. The system permits the use of standard European scheduling processes for national and international power traders and supports Switzerland's function as an important power hub in Europe. The basics behind the concept are discussed and the mechanisms of its functioning are illustrated in graphical form. The implementation of the system by the ETRANS company is looked at and various questions posed in this connection are answered

  17. A master surgical scheduling approach for cyclic scheduling in operating room departments

    NARCIS (Netherlands)

    van Oostrum, Jeroen M.; van Houdenhoven, M.; Hurink, Johann L.; Hans, Elias W.; Wullink, Gerhard; Kazemier, G.

    This paper addresses the problem of operating room (OR) scheduling at the tactical level of hospital planning and control. Hospitals repetitively construct operating room schedules, which is a time-consuming, tedious, and complex task. The stochasticity of the durations of surgical procedures

  18. Reusable rocket engine preventive maintenance scheduling using genetic algorithm

    International Nuclear Information System (INIS)

    Chen, Tao; Li, Jiawen; Jin, Ping; Cai, Guobiao

    2013-01-01

    This paper deals with the preventive maintenance (PM) scheduling problem of a reusable rocket engine (RRE), which differs from that of ordinary repairable systems, using a genetic algorithm. Three types of PM activities for the RRE are considered and modeled by introducing the concept of effective age. The impacts of PM on all subsystems' aging processes are evaluated based on an improvement factor model. Then the reliability of the engine is formulated by considering the accumulated time effect. After that, an optimization model subject to a reliability constraint is developed for RRE PM scheduling at fixed intervals. The optimal PM combination is obtained by minimizing the total cost over the whole life cycle for a hypothetical engine. Numerical investigations indicate that the subsystems' intrinsic reliability characteristics and the improvement factor of maintenance operations are the most important parameters in RRE PM scheduling management.
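
    A small sketch of the effective-age idea behind such models: preventive maintenance scales a subsystem's effective age down by an improvement factor, and reliability is then evaluated on that effective age. The Weibull reliability form, the parameter values and the reliability threshold below are assumptions for illustration, not the paper's exact formulation.

```python
import math

def effective_age_after_pm(age, improvement_factor):
    """PM rejuvenates a subsystem: the improvement factor b in [0, 1] is the
    fraction of effective age removed (b = 1 restores 'good as new',
    b = 0 leaves the age unchanged, i.e. minimal repair)."""
    return age * (1.0 - improvement_factor)

def weibull_reliability(effective_age, beta, eta):
    """Assumed Weibull reliability model evaluated on the effective age."""
    return math.exp(-((effective_age / eta) ** beta))

# One subsystem over 5 fixed intervals of 10 usage units each.
age, beta, eta, b = 0.0, 2.0, 60.0, 0.4
for interval in range(1, 6):
    age += 10                                  # usage accumulated this interval
    R = weibull_reliability(age, beta, eta)
    print(f"interval {interval}: effective age {age:.1f}, reliability {R:.3f}")
    if R < 0.95:                               # reliability constraint triggers PM
        age = effective_age_after_pm(age, b)
```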

  19. Immunization Schedules for Adults

    Science.gov (United States)

    Recommended immunization schedule for adults (19 years of age and older): the 2018 schedule of recommended vaccinations for adults, by age and by vaccine-preventable disease.

  20. Instant Childhood Immunization Schedule

    Science.gov (United States)

    Instant childhood immunization schedule tool, based on the recommended immunization schedule for children 0 through 6 years of age.

  1. Nontraditional work schedules for pharmacists.

    Science.gov (United States)

    Mahaney, Lynnae; Sanborn, Michael; Alexander, Emily

    2008-11-15

    Nontraditional work schedules for pharmacists at three institutions are described. The demand for pharmacists and health care in general continues to increase, yet significant material changes are occurring in the pharmacy work force. These changing demographics, coupled with historical vacancy rates and turnover trends for pharmacy staff, require an increased emphasis on workplace changes that can improve staff recruitment and retention. At William S. Middleton Memorial Veterans Affairs Hospital in Madison, Wisconsin, creative pharmacist work schedules and roles are now mainstays to the recruitment and retention of staff. The major challenge that such scheduling presents is the 8 hours needed to prepare a six-week schedule. Baylor Medical Center at Grapevine in Dallas, Texas, has a total of 45 pharmacy employees, and slightly less than half of the 24.5 full-time-equivalent staff work full-time, with most preferring to work one, two, or three days per week. As long as the coverage needs of the facility are met, Envision Telepharmacy in Alpine, Texas, allows almost any scheduling arrangement preferred by individual pharmacists or the pharmacist group covering the facility. Staffing involves a great variety of shift lengths and intervals, with shifts ranging from 2 to 10 hours. Pharmacy leaders must be increasingly aware of opportunities to provide staff with unique scheduling and operational enhancements that can provide for a better work-life balance. Compressed workweeks, job-sharing, and team scheduling were the most common types of alternative work schedules implemented at three different institutions.

  2. Optimizing Music Learning: Exploring How Blocked and Interleaved Practice Schedules Affect Advanced Performance.

    Science.gov (United States)

    Carter, Christine E; Grahn, Jessica A

    2016-01-01

    Repetition is the most commonly used practice strategy by musicians. Although blocks of repetition continue to be suggested in the pedagogical literature, work in the field of cognitive psychology suggests that repeated events receive less processing, thereby reducing the potential for long-term learning. Motor skill learning and sport psychology research offer an alternative. Instead of using a blocked practice schedule, with practice completed on one task before moving on to the next task, an interleaved schedule can be used, in which practice is frequently alternated between tasks. This frequent alternation involves more effortful processing, resulting in increased long-term learning. The finding that practicing in an interleaved schedule leads to better retention than practicing in a blocked schedule has been labeled the "contextual interference effect." While the effect has been observed across a wide variety of fields, few studies have researched this phenomenon in a music-learning context, despite the broad potential for application to music practice. This study compared the effects of blocked and interleaved practice schedules on advanced clarinet performance in an ecologically valid context. Ten clarinetists were given one concerto exposition and one technical excerpt to practice in a blocked schedule (12 min per piece) and a second concerto exposition and technical excerpt to practice in an interleaved schedule (3 min per piece, alternating until a total of 12 min of practice were completed on each piece). Participants sight-read the four pieces prior to practice and performed them at the end of practice and again one day later. The sight-reading and two performance run-throughs of each piece were recorded and given to three professional clarinetists to rate using a percentage scale. Overall, whenever there was a ratings difference between the conditions, pieces practiced in the interleaved schedule were rated better than those in the blocked schedule

  3. Optimizing Music Learning: Exploring How Blocked and Interleaved Practice Schedules Affect Advanced Performance

    Science.gov (United States)

    Carter, Christine E.; Grahn, Jessica A.

    2016-01-01

    Repetition is the most commonly used practice strategy by musicians. Although blocks of repetition continue to be suggested in the pedagogical literature, work in the field of cognitive psychology suggests that repeated events receive less processing, thereby reducing the potential for long-term learning. Motor skill learning and sport psychology research offer an alternative. Instead of using a blocked practice schedule, with practice completed on one task before moving on to the next task, an interleaved schedule can be used, in which practice is frequently alternated between tasks. This frequent alternation involves more effortful processing, resulting in increased long-term learning. The finding that practicing in an interleaved schedule leads to better retention than practicing in a blocked schedule has been labeled the “contextual interference effect.” While the effect has been observed across a wide variety of fields, few studies have researched this phenomenon in a music-learning context, despite the broad potential for application to music practice. This study compared the effects of blocked and interleaved practice schedules on advanced clarinet performance in an ecologically valid context. Ten clarinetists were given one concerto exposition and one technical excerpt to practice in a blocked schedule (12 min per piece) and a second concerto exposition and technical excerpt to practice in an interleaved schedule (3 min per piece, alternating until a total of 12 min of practice were completed on each piece). Participants sight-read the four pieces prior to practice and performed them at the end of practice and again one day later. The sight-reading and two performance run-throughs of each piece were recorded and given to three professional clarinetists to rate using a percentage scale. Overall, whenever there was a ratings difference between the conditions, pieces practiced in the interleaved schedule were rated better than those in the blocked schedule

  4. Optimizing music learning: Exploring how blocked and interleaved practice schedules affect advanced performance

    Directory of Open Access Journals (Sweden)

    Christine E Carter

    2016-08-01

    Full Text Available Repetition is the most commonly used practice strategy by musicians. Although blocks of repetition continue to be suggested in the pedagogical literature, work in the field of cognitive psychology suggests that repeated events receive less processing, thereby reducing the potential for long-term learning. Motor skill learning and sport psychology research offer an alternative. Instead of using a blocked practice schedule, with practice completed on one task before moving on to the next task, an interleaved schedule can be used, in which practice is frequently alternated between tasks. This frequent alternation involves more effortful processing, resulting in increased long-term learning. The finding that practicing in an interleaved schedule leads to better retention than practicing in a blocked schedule has been labeled the contextual interference effect. While the effect has been observed across a wide variety of fields, few studies have researched this phenomenon in a music-learning context, despite the broad potential for application to music practice. This study compared the effects of blocked and interleaved practice schedules on advanced clarinet performance in an ecologically valid context. Ten clarinetists were given one concerto exposition and one technical excerpt to practice in a blocked schedule (twelve minutes per piece and a second concerto exposition and technical excerpt to practice in an interleaved schedule (three minutes per piece, alternating until a total of twelve minutes of practice were completed on each piece. Participants sight-read the four pieces prior to practice and performed them at the end of practice and again one day later. The sight-reading and two performance run-throughs of each piece were recorded and given to three professional clarinetists to rate using a percentage scale. Overall, whenever there was a ratings difference between the conditions, pieces practiced in the interleaved schedule were rated

  5. RTNS-II irradiations and operations

    International Nuclear Information System (INIS)

    Logan, C.M.; Heikkinen, D.W.

    1982-01-01

    The objectives of this work are operation of RTNS-II (a 14-MeV neutron source facility), machine development, and support of the experimental program that utilizes this facility. Experimenter services include dosimetry handling, scheduling, coordination, and reporting. RTNS-II is dedicated to materials research for the fusion power program. Its primary use is to aid in the development of models of high-energy neutron effects. Such models are needed in interpreting and projecting to the fusion environment engineering data obtained in other neutron spectra. Irradiations were performed for a total of twenty-nine different experimenters during this quarter. A JEOL 200 CX TEM and other post-irradiation test equipment have been installed

  6. F/H Area Effluent Treatment Facility. Phase II. CAC basic data

    International Nuclear Information System (INIS)

    Collins, W.W.; O'Leary, C.D.

    1984-01-01

    Project objectives and requirements are listed for both Phase I and II. Schedule is listed with startup targeted for 1989. Storage facilities will be provided for both chemical and radioactive effluents. 8 figs., 19 tabs

  7. Flexibility Driven Scheduling and Mapping for Distributed Real-Time Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2002-01-01

    In this paper we present an approach to mapping and scheduling of distributed hard real-time systems, aiming at improving the flexibility of the design process. We consider an incremental design process that starts from an already existing system running a set of applications, with preemptive...

  8. Scheduling permutation flowshops with initial availability constraint: Analysis of solutions and constructive heuristics

    OpenAIRE

    Pérez González, Paz; Framiñán Torres, José Manuel

    2009-01-01

    In this paper, we address the problem of scheduling a set of jobs in a flowshop with makespan objective. In contrast to the usual assumption of machine availability presented in most research, we consider that machines may not be available at the beginning of the planning period, due to processing of previously scheduled jobs. We first formulate the problem, analyse the structure of solutions depending on a number of factors (such as machines, jobs, structure of the processing times, availabi...

  9. Effects of the amount and schedule of varied practice after constant practice on the adaptive process of motor learning

    Directory of Open Access Journals (Sweden)

    Umberto Cesar Corrêa

    2014-12-01

    Full Text Available This study investigated the effects of different amounts and schedules of varied practice, after constant practice, on the adaptive process of motor learning. Participants were one hundred and seven children with a mean age of 11.1 ± 0.9 years. Three experiments were carried out using a complex anticipatory timing task manipulating the following components in the varied practice: visual stimulus speed (experiment 1; sequential response pattern (experiment 2; and visual stimulus speed plus sequential response pattern (experiment 3. In all experiments the design involved three amounts (18, 36, and 63 trials, and two schedules (random and blocked of varied practice. The experiments also involved two learning phases: stabilization and adaptation. The dependent variables were the absolute, variable, and constant errors related to the task goal, and the relative timing of the sequential response. Results showed that all groups worsened the performances in the adaptation phase, and no difference was observed between them. Altogether, the results of the three experiments allow the conclusion that the amounts of trials manipulated in the random and blocked practices did not promote the diversification of the skill since no adaptation was observed.

  10. ADVANCED SCHEDULER FOR COOPERATIVE EXECUTION OF THREADS ON MULTI-CORE SYSTEM

    Directory of Open Access Journals (Sweden)

    O. N. Karasik

    2017-01-01

    Full Text Available Three architectures of the cooperative thread scheduler in a multithreaded application that is executed on a multi-core system are considered. Architecture A0 is based on the synchronization and scheduling facilities which are provided by the operating system. Architecture A1 introduces a new synchronization primitive and a single queue of blocked threads in the scheduler, which reduces the interaction activity between the threads and the operating system and significantly speeds up the processes of blocking and unblocking the threads. Architecture A2 replaces the single queue of blocked threads with dedicated queues, one for each of the synchronization primitives, extends the number of internal states of the primitive, reduces the interdependence of the scheduling threads, and further significantly speeds up the processes of blocking and unblocking the threads. All scheduler architectures are implemented on the Windows operating system and are based on User Mode Scheduling. Important experimental results are obtained for multithreaded applications that implement two block-parallel algorithms for solving systems of linear algebraic equations by Gaussian elimination. The algorithms differ in the way the data are distributed among threads and in the thread synchronization models. The number of threads varied from 32 to 7936. Architecture A1 shows an acceleration of up to 8.65% and architecture A2 shows an acceleration of up to 11.98% compared to architecture A0 for the block-parallel algorithms computing the triangular form and performing the back substitution. On the back substitution stage of the algorithms, architecture A1 gives an acceleration of up to 125%, and architecture A2 gives an acceleration of up to 413% compared to architecture A0. The experiments clearly show that the proposed architectures A1 and A2 outperform A0 depending on the number of thread blocking and unblocking operations, which happen during the execution of
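
    The difference between architectures A1 and A2 is essentially where blocked threads wait. The toy Python sketch below illustrates only that queueing difference (a single shared queue versus a dedicated queue per synchronization primitive); it is not the actual Windows User Mode Scheduling implementation, and all names are illustrative.

```python
from collections import defaultdict, deque

class CooperativeScheduler:
    """Toy cooperative scheduler: ready threads run in FIFO order; blocked
    threads wait either in one shared list (A1-style) or in a dedicated
    queue per synchronization primitive (A2-style)."""
    def __init__(self, per_primitive_queues=True):
        self.ready = deque()
        self.per_primitive = per_primitive_queues
        self.blocked_by_primitive = defaultdict(deque)  # A2-style storage
        self.blocked_shared = []                        # A1-style storage

    def block(self, thread, primitive):
        if self.per_primitive:
            self.blocked_by_primitive[primitive].append(thread)  # no search on wake-up
        else:
            self.blocked_shared.append((thread, primitive))      # one shared queue

    def signal(self, primitive):
        """Unblock one thread waiting on `primitive`."""
        if self.per_primitive:
            queue = self.blocked_by_primitive[primitive]
            if queue:
                self.ready.append(queue.popleft())               # O(1) wake-up
        else:
            for i, (thread, prim) in enumerate(self.blocked_shared):
                if prim == primitive:                            # linear scan
                    self.ready.append(self.blocked_shared.pop(i)[0])
                    break

    def next_thread(self):
        return self.ready.popleft() if self.ready else None

# Usage: waking "mutex-A" is O(1) with per-primitive queues instead of a scan.
sched = CooperativeScheduler(per_primitive_queues=True)
sched.block("T1", "mutex-A")
sched.block("T2", "event-B")
sched.signal("mutex-A")
print(sched.next_thread())   # T1
```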

  11. Network scheduling at Belene NPP construction site

    International Nuclear Information System (INIS)

    Matveev, A.

    2010-01-01

    Four types of schedules differing in the level of their detail are singled out to enhance the efficiency of Belene NPP Project implementation planning and monitoring: Level 1 Schedule – Summary Integrated Overall Time Schedule (SIOTS) is an appendix to EPC Contract. The main purpose of SIOTS is the large scale presentation of the current information on the Project implementation. Level 2 Schedule – Integrated Overall Time Schedule (IOTS) is the contract schedule for the Contractor (ASE JSC) and their subcontractors. The principal purpose of IOTS is the work progress planning and monitoring, the analysis of the effect of activities implementation upon the progress of the Project as a whole. IOTS is the reporting schedule at the Employer – Contractor level. Level 3 Schedules, Detail Time Schedules (DTS), are developed by those who actually perform the work and are agreed upon with Atomstroyexport JSC. The main purpose of DTS is the detail planning of Atomstroyexport subcontractor's activities. DTS are the reporting schedules at the level of Contractor - Subcontractor. Level 4 Schedules are the High Detail Time Schedules (HDTS), which are the day-to-day plans of work implementation and are developed, as a rule, for a week's time period. Each lower level time schedule details the activities of the higher level time schedule

  12. Long term scheduling technique for wastewater minimisation in multipurpose batch processes

    CSIR Research Space (South Africa)

    Nonyane, DR

    2012-05-01

    Full Text Available Applied Mathematical Modelling (2011), doi:10.1016/j.apm.2011.08.007. The effect of industrial activities on freshwater resources has become more apparent in the past few decades. This has led... Nomenclature (sets): P = time points; J = units; C = contaminants; Sin = ...

  13. Parents' Family Time and Work Schedules: The Split-Shift Schedule in Spain

    NARCIS (Netherlands)

    Gracia, P.; Kalmijn, M.

    2016-01-01

    This study used data on couples from the 2003 Spanish Time Use Survey (N = 1,416) to analyze how work schedules are associated with family, couple, parent–child, and non-family leisure activities. Spain is clearly an interesting case for the institutionalized split-shift schedule, a long lunch break

  14. Relative performance of priority rules for hybrid flow shop scheduling with setup times

    Directory of Open Access Journals (Sweden)

    Helio Yochihiro Fuchigami

    2015-12-01

    Full Text Available This paper focuses on the hybrid flow shop scheduling problem with explicit and sequence-independent setup times. This production environment is a multistage system with a unidirectional flow of jobs, wherein each stage may contain multiple machines available for processing. The optimized measure was the total time to complete the schedule (makespan). The aim was to propose new priority rules to support the scheduling and to evaluate their relative performance in the production system considered by the percentage of success, relative deviation, standard deviation of relative deviation, and average CPU time. Computational experiments indicated that the rules using the ascending order of the sum of the processing and setup times of the first stage (SPT1 and SPT1_ERD) performed better, jointly reaching more than 56% success.
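
    A minimal sketch of how such a priority rule can be applied: order the jobs by the stage-1 processing plus setup time (the SPT1 idea) and then list-schedule them through the stages, always taking the earliest-free machine. The dispatching inside each stage and the data are assumptions for illustration, not the paper's exact experimental procedure.

```python
def spt1_order(jobs):
    """SPT1: sequence jobs by ascending sum of processing and setup time at stage 1."""
    return sorted(jobs, key=lambda j: j["p"][0] + j["s"][0])

def hybrid_flowshop_makespan(sequence, machines_per_stage):
    """Schedule a fixed job sequence through the stages; at each stage a job is
    assigned to the machine that becomes free first (setup + processing)."""
    machine_free = [[0.0] * m for m in machines_per_stage]
    job_ready = {job["id"]: 0.0 for job in sequence}
    for stage in range(len(machines_per_stage)):
        for job in sequence:
            k = min(range(machines_per_stage[stage]),
                    key=lambda i: machine_free[stage][i])
            start = max(machine_free[stage][k], job_ready[job["id"]])
            finish = start + job["s"][stage] + job["p"][stage]
            machine_free[stage][k] = finish
            job_ready[job["id"]] = finish
    return max(job_ready.values())

jobs = [
    {"id": "A", "p": [4, 6], "s": [1, 2]},   # p = processing, s = setup, per stage
    {"id": "B", "p": [3, 2], "s": [2, 1]},
    {"id": "C", "p": [5, 4], "s": [1, 1]},
]
seq = spt1_order(jobs)
print([j["id"] for j in seq],
      hybrid_flowshop_makespan(seq, machines_per_stage=[2, 2]))
```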

  15. Scheduling Broadcasts in a Network of Timelines

    KAUST Repository

    Manzoor, Emaad A.

    2015-05-12

    Broadcasts and timelines are the primary mechanism of information exchange in online social platforms today. Services like Facebook, Twitter and Instagram have enabled ordinary people to reach large audiences spanning cultures and countries, while their massive popularity has created increasingly competitive marketplaces of attention. Timing broadcasts to capture the attention of such geographically diverse audiences has sparked interest from many startups and social marketing gurus. However, formal study is lacking on both the timing and frequency problems. In this thesis, we introduce, motivate and solve the broadcast scheduling problem of specifying the timing and frequency of publishing content to maximise the attention received. We validate and quantify three interacting behavioural phenomena to parametrise social platform users: information overload, bursty circadian rhythms and monotony aversion, which is defined here for the first time. Our analysis of the influence of monotony refutes the common assumption that posts on social network timelines are consumed piecemeal independently. Instead, we reveal that posts are consumed in chunks, which has important consequences for any future work considering human behaviour over social network timelines. Our quantification of monotony aversion is also novel, and has applications to problems in various domains such as recommender list diversification, user satiation and variety-seeking consumer behaviour. Having studied the underlying behavioural phenomena, we link schedules, timelines, attention and behaviour by formalising a timeline information exchange process. Our formulation gives rise to a natural objective function that quantifies the expected collective attention an arrangement of posts on a timeline will receive. We apply this formulation as a case-study on real-data from Twitter, where we estimate behavioural parameters, calculate the attention potential for different scheduling strategies and, using the

  16. A Conceptual Level Design for a Static Scheduler for Hard Real-Time Systems

    Science.gov (United States)

    1988-03-01

    The design of hard real-time systems is gaining a great deal of attention in the software engineering field as more and more real-world processes are...for these hard real-time systems. PSDL, as an executable design language, is supported by an execution support system consisting of a static scheduler, dynamic scheduler, and translator.

  17. Future aircraft networks and schedules

    Science.gov (United States)

    Shu, Yan

    2011-07-01

    Because of the importance of air transportation scheduling, the emergence of small aircraft and the vision of future fuel-efficient aircraft, this thesis has focused on the study of aircraft scheduling and network design involving multiple types of aircraft and flight services. It develops models and solution algorithms for the schedule design problem and analyzes the computational results. First, based on the current development of small aircraft and on-demand flight services, this thesis expands a business model for integrating on-demand flight services with the traditional scheduled flight services. This thesis proposes a three-step approach to the design of aircraft schedules and networks from scratch under the model. In the first step, both a frequency assignment model for scheduled flights that incorporates a passenger path choice model and a frequency assignment model for on-demand flights that incorporates a passenger mode choice model are created. In the second step, a rough fleet assignment model that determines a set of flight legs, each of which is assigned an aircraft type and a rough departure time is constructed. In the third step, a timetable model that determines an exact departure time for each flight leg is developed. Based on the models proposed in the three steps, this thesis creates schedule design instances that involve almost all the major airports and markets in the United States. The instances of the frequency assignment model created in this thesis are large-scale non-convex mixed-integer programming problems, and this dissertation develops an overall network structure and proposes iterative algorithms for solving these instances. The instances of both the rough fleet assignment model and the timetable model created in this thesis are large-scale mixed-integer programming problems, and this dissertation develops subproblem schemes for solving these instances. Based on these solution algorithms, this dissertation also presents

  18. Handling machine breakdown for dynamic scheduling by a colony of cognitive agents in a holonic manufacturing framework

    Directory of Open Access Journals (Sweden)

    T. K. Jana

    2015-09-01

    Full Text Available There is an ever-increasing need to provide quick yet improved solutions to dynamic scheduling, with better responsiveness and simple coordination mechanisms that adapt to changing environments. In this endeavor, a cognitive-agent-based approach is proposed to deal with machine failure. A Multi Agent based Holonic Adaptive Scheduling (MAHoAS) architecture is developed to frame the schedule by explicit communication between the product holons and the resource holons, in association with the integrated process planning and scheduling (IPPS) holon, under normal situations. In the event of a breakdown of a resource, cooperation is sought by implicit communication. Inspired by the cognitive behavior of human beings, a cognitive decision-making scheme is proposed that reallocates the incomplete task to another resource in the most optimized manner and tries to expedite the processing in view of the machine failure. A metamorphic algorithm is developed and implemented in Oracle 9i to identify the best candidate resource for task re-allocation. The integrated approach to process planning and scheduling, realized under a Multi Agent System (MAS) framework, facilitates dynamic scheduling with improved performance under such situations. The responsiveness of resources having cognitive capabilities helps to overcome the adverse consequences of resource failure in a better way.

  19. Postulated licensing schedule for an independent spent-fuel-storage installation

    International Nuclear Information System (INIS)

    Ludwick, J.D.

    1982-11-01

    A review of licensing requirements, processes, and anticipated actions for independent spent fuel storage installations (ISFSIs) was conducted in order to develop an estimated schedule and sequence of events for licensing a new ISFSI. This estimate will be useful to potential ISFSI owners in planning for the licensing of their facilities. It is concluded that, although many uncertainties exist with respect to such things as legal appeals, about 29 months are estimated to elapse between license application and license issuance for an ISFSI. This estimate is in reasonable agreement with a previous time estimate for licensing an ISFSI, and, taking into account the special circumstances involved, with the actual licensing schedule for the GE-Morris ISFSI. However, individual portions of the licensing schedule from each case studied sometimes vary significantly

  20. Range Scheduling Aid (RSA)

    Science.gov (United States)

    Logan, J. R.; Pulvermacher, M. K.

    1991-01-01

    Range Scheduling Aid (RSA) is presented in the form of the viewgraphs. The following subject areas are covered: satellite control network; current and new approaches to range scheduling; MITRE tasking; RSA features; RSA display; constraint based analytic capability; RSA architecture; and RSA benefits.

  1. Comparative Simulation Study of Production Scheduling in the Hybrid and the Parallel Flow

    Directory of Open Access Journals (Sweden)

    Varela Maria L.R.

    2017-06-01

    Full Text Available Scheduling is one of the most important decisions in production control. An approach is proposed for supporting users in solving scheduling problems by choosing the combination of the physical manufacturing system configuration and the material handling system settings. The approach considers two alternative manufacturing scheduling configurations in a two-stage product-oriented manufacturing system, exploring the hybrid flow shop (HFS) and the parallel flow shop (PFS) environments. For illustrating the application of the proposed approach, an industrial case from the automotive components industry is studied. The main aim of this research is to compare production scheduling in the hybrid and the parallel flow configurations, taking into account the makespan minimization criterion. Thus the HFS and the PFS performance is compared and analyzed, mainly in terms of the makespan, as the transportation times vary. The study shows that the performance of the HFS is clearly better when the work stations' processing times are unbalanced, either in nature or as a consequence of adding transport times to just one of the work stations' processing times, but it loses this advantage, becoming worse than the PFS configuration, when the work stations' processing times are balanced, either in nature or as a consequence of adding transport times to the work stations' processing times. This means that the physical layout configuration, along with the way transport times are included in the work stations' processing times, should be carefully taken into consideration due to its influence on the performance reached by both the HFS and PFS configurations.
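
    The comparison can be reproduced in miniature: compute the makespan of the same job set under a hybrid flow shop (any machine at each stage) and under a parallel flow shop (each job bound to one dedicated line), with and without a transport time added to the stage-1 processing times. The round-robin line assignment and the data below are assumptions for illustration only.

```python
def hfs_makespan(jobs, machines=(2, 2)):
    """Two-stage hybrid flow shop: at each stage a job takes the earliest-free
    machine of that stage, in the given job order."""
    free = [[0.0] * m for m in machines]
    ready = [0.0] * len(jobs)
    for stage in range(2):
        for j, p in enumerate(jobs):
            k = min(range(len(free[stage])), key=lambda i: free[stage][i])
            start = max(free[stage][k], ready[j])
            free[stage][k] = ready[j] = start + p[stage]
    return max(ready)

def pfs_makespan(jobs, lines=2):
    """Parallel flow shop: each job is bound to one dedicated two-machine line
    (assigned round-robin here, purely for illustration)."""
    m1, m2 = [0.0] * lines, [0.0] * lines
    for j, (p1, p2) in enumerate(jobs):
        line = j % lines
        m1[line] += p1                            # stage-1 completion on that line
        m2[line] = max(m2[line], m1[line]) + p2   # stage-2 completion
    return max(m2)

jobs = [(4, 6), (3, 2), (5, 4), (2, 5)]           # (stage-1, stage-2) processing times
transport = 1.0                                   # transport time added to stage 1 only
jobs_t = [(p1 + transport, p2) for p1, p2 in jobs]
print("no transport  :", hfs_makespan(jobs), pfs_makespan(jobs))
print("with transport:", hfs_makespan(jobs_t), pfs_makespan(jobs_t))
```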

  2. Immunization Schedules for Infants and Children

    Science.gov (United States)

    Recommended immunization schedule for infants and children (birth through 6 years of age): the 2018 schedule of recommended vaccinations for infants and children. If you have any questions, please talk to your doctor.

  3. Scheduling preemptable jobs on identical processors under varying availability of an additional continuous resource

    Directory of Open Access Journals (Sweden)

    Różycki Rafał

    2016-09-01

    Full Text Available In this work we consider a problem of scheduling preemptable, independent jobs, characterized by the fact that their processing speeds depend on the amounts of a continuous, renewable resource allocated to them at a time. Jobs are scheduled on parallel, identical machines, with the criterion of minimization of the schedule length. Since two categories of resources occur in the problem, discrete (the set of machines) and continuous, it is generally called a discrete-continuous scheduling problem. The model studied in this paper allows the total available amount of the continuous resource to vary over time, which is a practically important generalization that has not yet been considered for discrete-continuous scheduling problems. For this model we give some properties of optimal schedules, on the basis of which we propose a general methodology for solving the considered class of problems. The methodology uses a two-phase approach in which, firstly, an assignment of machines to jobs is defined and, secondly, for this assignment an optimal continuous resource allocation is found by solving an appropriate mathematical programming problem. In the approach various cases are considered, following from assumptions made on the form of the processing speed functions of jobs. For each case an iterative algorithm is designed, leading to an optimal solution in a finite number of steps.

  4. A Formal Product-Line Engineering Approach for Schedulers

    NARCIS (Netherlands)

    Orhan, Güner; Aksit, Mehmet; Rensink, Arend; Jololian, Leon; Robbins, David E.; Fernandes, Steven L.

    2017-01-01

    Scheduling techniques have been applied to a large category of software systems, such as, processor scheduling in operating systems, car scheduling in elevator systems, facility scheduling at airports, antenna scheduling in radar systems, scheduling of events, control signals and data in

  5. An Organizational and Qualitative Approach to Improving University Course Scheduling

    Science.gov (United States)

    Hill, Duncan L.

    2010-01-01

    Focusing on the current timetabling process at the University of Toronto Mississauga (UTM), I apply David Wesson's theoretical framework in order to understand (1) how increasing enrollment interacts with a decentralized timetabling process to limit the flexibility of course schedules and (2) the resultant impact on educational quality. I then…

  6. Implementing real-time robotic systems using CHIMERA II

    Science.gov (United States)

    Stewart, David B.; Schmitz, Donald E.; Khosla, Pradeep K.

    1990-01-01

    A description is given of the CHIMERA II programming environment and operating system, which was developed for implementing real-time robotic systems. Sensor-based robotic systems contain both general- and special-purpose hardware, and thus the development of applications tends to be a time-consuming task. The CHIMERA II environment is designed to reduce the development time by providing a convenient software interface between the hardware and the user. CHIMERA II supports flexible hardware configurations which are based on one or more VME-backplanes. All communication across multiple processors is transparent to the user through an extensive set of interprocessor communication primitives. CHIMERA II also provides a high-performance real-time kernel which supports both deadline and highest-priority-first scheduling. The flexibility of CHIMERA II allows hierarchical models for robot control, such as NASREM, to be implemented with minimal programming time and effort.
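
    The two scheduling policies mentioned (deadline-based and highest-priority-first) differ only in how the next ready task is selected. The sketch below is a generic illustration of that selection step, not CHIMERA II kernel code; the task fields and values are invented.

```python
def pick_next(tasks, policy="deadline"):
    """Select the next ready task under either policy: earliest deadline first,
    or highest (numerically largest) fixed priority first."""
    ready = [t for t in tasks if t["ready"]]
    if not ready:
        return None
    if policy == "deadline":
        return min(ready, key=lambda t: t["deadline"])
    return max(ready, key=lambda t: t["priority"])

tasks = [
    {"name": "servo-loop", "ready": True, "deadline": 5,  "priority": 10},
    {"name": "trajectory", "ready": True, "deadline": 20, "priority": 30},
    {"name": "logging",    "ready": True, "deadline": 50, "priority": 1},
]
print(pick_next(tasks, "deadline")["name"])   # servo-loop
print(pick_next(tasks, "priority")["name"])   # trajectory
```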

  7. The development of KMRR schedule and progress control system (KSPCS) for the master schedule of KMRR project

    International Nuclear Information System (INIS)

    Choi, Chang Woong; Lee, Tae Joon; Kim, Joon Yun; Cho, Yun Ho; Hah, Jong Hyun

    1993-07-01

    This report describes the development of the computerized schedule and progress control system for the master schedule of the KMRR project, using ARTEMIS 7000/386 CM (Ver. 7.4.2) and based on project management theory (PERT/CPM, PDM, and S-curve). This system has been used efficiently for the KMRR master schedule and will be utilized for the detailed scheduling of the KMRR project. (Author) 23 refs., 26 figs., 52 tabs

  8. The development of KMRR schedule and progress control system (KSPCS) for the master schedule of KMRR project

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Chang Woong; Lee, Tae Joon; Kim, Joon Yun; Cho, Yun Ho; Hah, Jong Hyun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1993-07-01

    This report describes the development of the computerized schedule and progress control system for the master schedule of the KMRR project, using ARTEMIS 7000/386 CM (Ver. 7.4.2) and based on project management theory (PERT/CPM, PDM, and S-curve). This system has been used efficiently for the KMRR master schedule and will be utilized for the detailed scheduling of the KMRR project. (Author) 23 refs., 26 figs., 52 tabs.

  9. Advanced oxidation removal of hypophosphite by O3/H2O2 combined with sequential Fe(II) catalytic process.

    Science.gov (United States)

    Zhao, Zilong; Dong, Wenyi; Wang, Hongjie; Chen, Guanhan; Wang, Wei; Liu, Zekun; Gao, Yaguang; Zhou, Beili

    2017-08-01

    Elimination of hypophosphite (HP) was studied as an example of nickel plating effluents treatment by O3/H2O2 and sequential Fe(II) catalytic oxidation process. Performance assessment performed with artificial HP solution by varying initial pH and employing various oxidation processes clearly showed that the O3/H2O2-Fe(II) two-step oxidation process possessed the highest removal efficiency when operating under the same conditions. The effects of O3 dosing, H2O2 concentration, Fe(II) addition and Fe(II) feeding time on the removal efficiency of HP were further evaluated in terms of apparent kinetic rate constant. Under improved conditions (initial HP concentration of 50 mg/L, 75 mg/L O3, 1 mL/L H2O2, 150 mg/L Fe(II) and pH 7.0), standard discharge (<0.5 mg/L in China) could be achieved, and the Fe(II) feeding time was found to be the limiting factor for the evolution of apparent kinetic rate constant in the second stage. Characterization studies showed that neutralization process after oxidation treatment favored the improvement of phosphorus removal due to the formation of more metal hydroxides. Moreover, as a comparison with lab-scale Fenton approach, the O3/H2O2-Fe(II) oxidation process had more competitive advantages with respect to applicable pH range, removal efficiency, sludge production as well as economic costs.

  10. Decentralized Job Scheduling in the Cloud Based on a Spatially Generalized Prisoner’s Dilemma Game

    Directory of Open Access Journals (Sweden)

    Gąsior Jakub

    2015-12-01

    Full Text Available We present in this paper a novel distributed solution to a security-aware job scheduling problem in cloud computing infrastructures. We assume that the assignment of the available resources is governed exclusively by the specialized brokers assigned to individual users submitting their jobs to the system. The goal of this scheme is allocating a limited quantity of resources to a specific number of jobs minimizing their execution failure probability and total completion time. Our approach is based on the Pareto dominance relationship and implemented at an individual user level. To select the best scheduling strategies from the resulting Pareto frontiers and construct a global scheduling solution, we developed a decision-making mechanism based on the game-theoretic model of Spatial Prisoner’s Dilemma, realized by selfish agents operating in the two-dimensional cellular automata space. Their behavior is conditioned by the objectives of the various entities involved in the scheduling process and driven towards a Nash equilibrium solution by the employed social welfare criteria. The performance of the scheduler applied is verified by a number of numerical experiments. The related results show the effectiveness and scalability of the scheme in the presence of a large number of jobs and resources involved in the scheduling process.
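
    The decision-making layer rests on a spatial Prisoner's Dilemma played on a cellular-automata grid. The sketch below implements only that generic dynamic (agents on a torus imitate their best-scoring neighbour each generation) with assumed payoff values; the brokers' scheduling objectives, Pareto frontiers and social welfare criteria from the paper are not reproduced.

```python
import random

SIZE = 10
T, R, P, S = 1.3, 1.0, 0.1, 0.0        # temptation > reward > punishment > sucker

def payoff(a, b):
    """Row player's payoff in one round (1 = cooperate, 0 = defect)."""
    if a == 1 and b == 1: return R
    if a == 1 and b == 0: return S
    if a == 0 and b == 1: return T
    return P

def neighbours(i, j):
    """Moore neighbourhood on a torus."""
    return [((i + di) % SIZE, (j + dj) % SIZE)
            for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]

def step(grid):
    score = {(i, j): sum(payoff(grid[i][j], grid[x][y]) for x, y in neighbours(i, j))
             for i in range(SIZE) for j in range(SIZE)}
    new = [[grid[i][j] for j in range(SIZE)] for i in range(SIZE)]
    for i in range(SIZE):
        for j in range(SIZE):
            best = max(neighbours(i, j) + [(i, j)], key=lambda c: score[c])
            new[i][j] = grid[best[0]][best[1]]   # imitate the best-scoring neighbour
    return new

grid = [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]
for _ in range(20):
    grid = step(grid)
print(sum(map(sum, grid)), "cooperators out of", SIZE * SIZE)
```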

  11. Single Machine Scheduling and Due Date Assignment with Past-Sequence-Dependent Setup Time and Position-Dependent Processing Time

    Directory of Open Access Journals (Sweden)

    Chuan-Li Zhao

    2014-01-01

    Full Text Available This paper considers single machine scheduling and due date assignment with setup time. The setup time is proportional to the length of the already processed jobs; that is, the setup time is past-sequence-dependent (p-s-d). It is assumed that a job's processing time depends on its position in a sequence. The objective functions include total earliness, the weighted number of tardy jobs, and the cost of due date assignment. We analyze these problems with two different due date assignment methods. We first consider the model with job-dependent position effects. For each case, by converting the problem to a series of assignment problems, we prove that the problems can be solved in O(n^4) time. For the model with job-independent position effects, we prove that the problems can be solved in O(n^3) time by providing a dynamic programming algorithm.
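
    For a given job sequence, the completion times under past-sequence-dependent setup and a position effect are easy to evaluate; the optimization itself is what the assignment-problem and dynamic-programming arguments in the paper handle. The sketch below assumes a polynomial position effect p * r^alpha and a setup proportional to the work already processed, and it evaluates only total earliness plus the weighted number of tardy jobs for a common due date; these modelling choices are illustrative assumptions.

```python
def completion_times(sequence, gamma=0.1, alpha=-0.1):
    """Completion times with past-sequence-dependent setup (gamma times the
    processing already done) and an assumed position effect p * r**alpha."""
    t, done, C = 0.0, 0.0, []
    for r, p in enumerate(sequence, start=1):
        actual = p * (r ** alpha)        # position-dependent processing time
        setup = gamma * done             # p-s-d setup time
        t += setup + actual
        done += actual
        C.append(t)
    return C

def objective(C, due, weights):
    """Total earliness plus the weighted number of tardy jobs, common due date."""
    earliness = sum(max(due - c, 0.0) for c in C)
    tardy = sum(w for c, w in zip(C, weights) if c > due)
    return earliness + tardy

seq = [5.0, 3.0, 4.0]                    # base processing times in sequence order
C = completion_times(seq)
print(C, objective(C, due=10.0, weights=[2, 1, 3]))
```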

  12. Forecasting and prevention of water inrush during the excavation process of a diversion tunnel at the Jinping II Hydropower Station, China.

    Science.gov (United States)

    Hou, Tian-Xing; Yang, Xing-Guo; Xing, Hui-Ge; Huang, Kang-Xin; Zhou, Jia-Wen

    2016-01-01

    Estimating groundwater inflow into a tunnel before and during the excavation process is an important task to ensure safety and schedule during the underground construction process. Here we report a case of the forecasting and prevention of water inrush at the Jinping II Hydropower Station diversion tunnel groups during the excavation process. The diversion tunnel groups are located in mountains and valleys and are subject to a high water pressure head. Three forecasting methods are used to predict the total water inflow of the #2 diversion tunnel. Furthermore, based on the accurate estimation of the water inrush around the tunnel working area, a theoretical method is presented to forecast the water inflow at the working area during the excavation process. The simulated results show that the total water flow is 1586.9, 1309.4 and 2070.2 m3/h using the Qshima method, Kostyakov method and Ochiai method, respectively. The Qshima method is the best one because it most closely matches the monitoring result. In view of the huge water inflow into the #2 diversion tunnel, reasonable drainage measures are arranged to prevent the potential disaster of water inrush. The groundwater pressure head can be determined using the water flow velocity from the advancing holes; then, the groundwater pressure head can be used to predict the possible water inflow. The simulated results show that the groundwater pressure head and water inflow are stable and relatively small around the region of the intact rock mass, but there is a sudden change around the fault region, with a large water inflow and groundwater pressure head. Different countermeasures are adopted to prevent water inrush disasters during the tunnel excavation process. Reasonably forecasting the characteristic parameters of water inrush is very useful for the formulation of prevention and mitigation schemes during the tunnel excavation process.

  13. Collaborative Distributed Scheduling Approaches for Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Zhidong Deng

    2009-10-01

    Full Text Available Energy constraints restrict the lifetime of wireless sensor networks (WSNs) with battery-powered nodes, which poses great challenges for their large scale application. In this paper, we propose a family of collaborative distributed scheduling approaches (CDSAs) based on the Markov process to reduce the energy consumption of a WSN. The family of CDSAs comprises two approaches: a one-step collaborative distributed approach and a two-step collaborative distributed approach. The approaches enable nodes to learn the behavior information of their environment collaboratively and to integrate sleep scheduling with transmission scheduling to reduce the energy consumption. We analyze the adaptability and practicality features of the CDSAs. The simulation results show that the two proposed approaches can effectively reduce nodes’ energy consumption. Some other characteristics of the CDSAs like buffer occupation and packet delay are also analyzed in this paper. We evaluate CDSAs extensively on a 15-node WSN testbed. The test results show that the CDSAs conserve the energy effectively and are feasible for real WSNs.

  14. Generation-Side Power Scheduling in a Grid-Connected DC Microgrid

    DEFF Research Database (Denmark)

    Hernández, Adriana Carolina Luna; Aldana, Nelson Leonardo Diaz; Meng, Lexuan

    2015-01-01

    In this paper, a constrained mixed-integer programming model for scheduling the active power supplied by the generation units in storage-based DC microgrids is presented. The optimization problem minimizes operating costs taking into account a two-stage mode operation of the energy storage system...... so that a more accurate model for optimization of the microgrid operation can be obtained. The model is used in a particular grid-connected DC microgrid that includes two renewable energy sources and an energy storage system which supply a critical load. The results of the scheduling process...

  15. Flowshop Scheduling Problems with a Position-Dependent Exponential Learning Effect

    Directory of Open Access Journals (Sweden)

    Mingbao Cheng

    2013-01-01

    Full Text Available We consider a permutation flowshop scheduling problem with a position-dependent exponential learning effect. The objective is to minimize the performance criteria of makespan and the total flow time. For the two-machine flow shop scheduling case, we show that Johnson’s rule is not an optimal algorithm for minimizing the makespan given the exponential learning effect. Furthermore, by using the shortest total processing times first (STPT rule, we construct the worst-case performance ratios for both criteria. Finally, a polynomial-time algorithm is proposed for special cases of the studied problem.
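
    A compact way to see the interaction between job order and a position-dependent learning effect is to compute the two-machine makespan for different sequences. The exponential form p * beta^(r-1) used below, and the data, are assumptions for illustration; the STPT ordering matches the rule named in the abstract.

```python
def makespan_with_learning(sequence, beta=0.9):
    """Two-machine permutation flowshop makespan where the time of a job in
    position r shrinks exponentially with r (assumed form: p * beta**(r-1))."""
    c1 = c2 = 0.0
    for r, (p1, p2) in enumerate(sequence, start=1):
        c1 += p1 * beta ** (r - 1)                 # machine-1 completion
        c2 = max(c1, c2) + p2 * beta ** (r - 1)    # machine-2 completion
    return c2

def stpt_order(jobs):
    """Shortest total processing time first (STPT)."""
    return sorted(jobs, key=lambda j: j[0] + j[1])

jobs = [(4, 3), (2, 6), (5, 2), (3, 3)]            # (machine-1, machine-2) base times
print(makespan_with_learning(stpt_order(jobs)))    # STPT sequence
print(makespan_with_learning(jobs))                # original order, for comparison
```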

  16. Schedule goals for civilian radioactive waste management - Can we have confidence?

    International Nuclear Information System (INIS)

    Bartlett, John W.

    1992-01-01

    The schedule goals for the Civilian Radioactive Waste Management Program are to begin spent fuel receipt from reactors in 1998 and to begin waste disposal in 2010. Although there are various reasons for these goals, the most important is to set demanding goals and be responsible for achieving them. Meeting these goals requires taking into account an array of facilitators and potential inhibitors that affect schedule confidence. Facilitators include actions to prioritize the program, and make its operations efficient. These include actions to baseline activities, emphasize communications with constituencies, use help from others, and facilitate the licensing process. Inhibitors include problems in monitored storage facilities negotiations, obstruction by the State of Nevada, funding deficiencies, and technical uncertainties at Yucca Mountain. At the present time, the program can, in principle meet its schedule goals. In the near-term, the linchpin of schedule confidence is Congressional action to match the Administration's commitment to progress. (author)

  17. Decentralized Ground Staff Scheduling

    DEFF Research Database (Denmark)

    Sørensen, M. D.; Clausen, Jens

    2002-01-01

    scheduling is investigated. The airport terminal is divided into zones, where each zone consists of a set of stands geographically next to each other. Staff is assigned to work in only one zone and the staff scheduling is planned decentralized for each zone. The advantage of this approach is that the staff... work in a smaller area of the terminal and thus spend less time walking between stands. When planning decentralized, the allocation of stands to flights influences the staff scheduling, since the workload in a zone depends on which flights are allocated to stands in the zone. Hence solving the problem... depends not only on the actual stand allocation but also on the number of zones and the layout of these. A mathematical model of the problem is proposed, which integrates the stand allocation and the staff scheduling. A heuristic solution method is developed and applied on a real case from British Airways, London...

  18. Schedule control in Ling Ao nuclear power project

    International Nuclear Information System (INIS)

    Xie Ahai

    2007-01-01

    Ling Ao Nuclear Power Station (LANP) is the first one built through self-reliance in China, with a power capacity of 990x2 MWe. The results of quality control, schedule control and cost control are satisfactory. The commercial operation dates of Unit 1 and Unit 2 were 28th May 2002 and 8th Jan. 2003 respectively, which were 48 days and 66 days ahead of the project schedule. This paper presents the practices of the self-reliance schedule control system in LANP. The paper includes 10 sections: schedule control system; targets of schedule control; schedule control at the early stage of the project; construction schedule; scheduling practice; Point curves; schedule control of design and procurement; a good practice of construction schedule control on site; commissioning and startup schedule; schedule control culture. Three figures are attached. The main contents of the self-reliance schedule control system are as follows: to draw up reasonable schedules and targets; to set up management mechanisms and procedures; to organize a powerful project management team; to establish a close monitoring system; to provide timely progress reports and statistics. The kinds of schedule control targets introduced are bar-chart schedules; milestones; Point curves; interface management; the hydraulic test schedule of auxiliary piping loops; and EMR/EMC/EESR issuance schedules. Six levels of bar-chart schedules were adopted in LANP, but the bar-chart schedules were not satisfactory for the complicated erection conditions on site, even using six levels of schedules. So a kind of Point curve was developed, and its advantages are explained. The scheduling method of three elements (activity, duration, logic) adopted in LANP is introduced. The duration of each piping activity in the LANP level 2 project schedule was calculated based on the relevant working Point quantities. The analysis and adjustment of Point curves are illustrated, i.e. balance of monthly quantities; possible production in the peakload

  19. Multiuser switched diversity scheduling schemes

    KAUST Repository

    Shaqfeh, Mohammad; Alnuweiri, Hussein M.; Alouini, Mohamed-Slim

    2012-01-01

    Multiuser switched-diversity scheduling schemes were recently proposed in order to overcome the heavy feedback requirements of conventional opportunistic scheduling schemes by applying a threshold-based, distributed, and ordered scheduling mechanism. The main idea behind these schemes is that slight reduction in the prospected multiuser diversity gains is an acceptable trade-off for great savings in terms of required channel-state-information feedback messages. In this work, we characterize the achievable rate region of multiuser switched diversity systems and compare it with the rate region of full feedback multiuser diversity systems. We propose also a novel proportional fair multiuser switched-based scheduling scheme and we demonstrate that it can be optimized using a practical and distributed method to obtain the feedback thresholds. We finally demonstrate by numerical examples that switched-diversity scheduling schemes operate within 0.3 bits/sec/Hz from the ultimate network capacity of full feedback systems in Rayleigh fading conditions. © 2012 IEEE.

  1. Updating Linear Schedules with Lowest Cost: a Linear Programming Model

    Science.gov (United States)

    Biruk, Sławomir; Jaśkowski, Piotr; Czarnigowska, Agata

    2017-10-01

    Many civil engineering projects involve sets of tasks repeated in a predefined sequence in a number of work areas along a particular route. A useful graphical representation of schedules of such projects is the time-distance diagram, which clearly shows what process is conducted at a particular point in time and in a particular location. With repetitive tasks, the quality of project performance is conditioned by the ability of the planner to optimize workflow by synchronizing the works and resources, which usually means that resources are planned to be continuously utilized. However, construction processes are prone to risks, and a fully synchronized schedule may become invalid if a disturbance (bad weather, machine failure etc.) affects even one task. In such cases, works need to be rescheduled, and another optimal schedule should be built for the changed circumstances. This typically means that, to meet the fixed completion date, durations of operations have to be reduced. A number of measures are possible to achieve such reduction: working overtime, employing more resources or relocating resources from less to more critical tasks, but they all come at a considerable cost and affect the whole project. The paper investigates the problem of selecting the measures that reduce durations of tasks of a linear project so that the cost of these measures is kept to a minimum, and proposes an algorithm that could be applied to find optimal solutions as the need to reschedule arises. Considering that civil engineering projects, such as road building, usually involve fewer process types than construction projects, the complexity of scheduling problems is lower, and precise optimization algorithms can be applied. Therefore, the authors put forward a linear programming model of the problem and illustrate its principle of operation with an example.
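
    A minimal version of such a crashing model can be written as a linear program: choose how much to shorten each task so that the fixed completion date of a serial (linear) project is met at minimum acceleration cost. The sketch below uses scipy.optimize.linprog; the serial-duration constraint, the crash limits and the unit costs are illustrative assumptions, not the authors' full model.

        # Linear-programming sketch of minimum-cost schedule crashing for a
        # serial (linear) project: shorten tasks just enough to meet the deadline.
        from scipy.optimize import linprog

        durations = [10, 8, 12, 6]        # planned task durations (days)
        max_cut   = [3, 2, 4, 1]          # most each task can be shortened (days)
        unit_cost = [500, 800, 300, 900]  # cost per day of acceleration
        deadline  = 30                    # required completion date (days)

        required_cut = sum(durations) - deadline   # total days that must be recovered
        # minimize unit_cost . x  subject to  sum(x) >= required_cut, 0 <= x <= max_cut
        res = linprog(c=unit_cost,
                      A_ub=[[-1.0] * len(durations)],   # -sum(x) <= -required_cut
                      b_ub=[-required_cut],
                      bounds=list(zip([0] * len(durations), max_cut)),
                      method="highs")
        print(res.x, res.fun)   # cheapest tasks are shortened first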

  2. SYSTEMATIC LITERATURE REVIEW ON RESOURCE ALLOCATION AND RESOURCE SCHEDULING IN CLOUD COMPUTING

    OpenAIRE

    B. Muni Lavanya; C. Shoba Bindu

    2016-01-01

    The objective of the work is to highlight the key features and afford the finest future directions in the research community of Resource Allocation, Resource Scheduling and Resource management from 2009 to 2016, exemplifying how research on Resource Allocation, Resource Scheduling and Resource management has progressively increased in the past decade by inspecting articles and papers from scientific and standard publications. The survey materialized in a three-fold process. Firstly, investigate on t...

  3. Optimizing the Steel Plate Storage Yard Crane Scheduling Problem Using a Two Stage Planning/Scheduling Approach

    DEFF Research Database (Denmark)

    Hansen, Anders Dohn; Clausen, Jens

    This paper presents the Steel Plate Storage Yard Crane Scheduling Problem. The task is to generate a schedule for two gantry cranes sharing tracks. The schedule must comply with a number of constraints and at the same time be cost efficient. We propose some ideas for a two stage planning...

  4. The triangle scheduling problem

    NARCIS (Netherlands)

    Dürr, Christoph; Hanzálek, Zdeněk; Konrad, Christian; Seddik, Yasmina; Sitters, R.A.; Vásquez, Óscar C.; Woeginger, Gerhard

    2017-01-01

    This paper introduces a novel scheduling problem, where jobs occupy a triangular shape on the time line. This problem is motivated by scheduling jobs with different criticality levels. A measure is introduced, namely the binary tree ratio. It is shown that the Greedy algorithm solves the problem to

  5. Scheduling with Learning Effects and/or Time-Dependent Processing Times to Minimize the Weighted Number of Tardy Jobs on a Single Machine

    Directory of Open Access Journals (Sweden)

    Jianbo Qian

    2013-01-01

    Full Text Available We consider single machine scheduling problems with learning/deterioration effects and time-dependent processing times, with due date assignment consideration, and our objective is to minimize the weighted number of tardy jobs. By reducing all versions of the problem to an assignment problem, we solve them in O(n^4) time. For some important special cases, the time complexity can be improved to be O(n^2) using dynamic programming techniques.
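
    The flavor of the assignment-problem reduction can be sketched as follows: when the cost of placing job j in sequence position r can be evaluated independently of the other assignments, the schedule reduces to a job-to-position assignment, solvable by the Hungarian algorithm. The cost function, common due date, and learning factor below are illustrative placeholders, not the paper's exact tardiness and due-date-assignment costs.

        # Sketch of a job-to-position assignment for a positional scheduling model.
        import numpy as np
        from scipy.optimize import linear_sum_assignment

        base = np.array([5.0, 3.0, 8.0, 6.0])   # normal processing times
        w    = np.array([2.0, 1.0, 4.0, 3.0])   # tardiness weights
        due  = 14.0                              # common due date (assumed)
        a    = 0.9                               # positional learning factor (assumed)

        n = len(base)
        cost = np.zeros((n, n))
        for j in range(n):
            for r in range(n):                   # position r (0-based)
                # rough positional completion-time estimate; tardy => pay weight w[j]
                est_completion = base[j] * a**r * (r + 1)
                cost[j, r] = w[j] if est_completion > due else 0.0

        rows, cols = linear_sum_assignment(cost)            # Hungarian algorithm
        order = [int(j) for _, j in sorted(zip(cols, rows))]  # job in each position
        print(order, cost[rows, cols].sum())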

  6. Integrated Production-Distribution Scheduling Problem with Multiple Independent Manufacturers

    Directory of Open Access Journals (Sweden)

    Jianhong Hao

    2015-01-01

    Full Text Available We consider the nonstandard parts supply chain with a public service platform for machinery integration in China. The platform assigns orders placed by a machinery enterprise to multiple independent manufacturers who produce nonstandard parts and makes a production schedule and batch delivery schedule for each manufacturer in a coordinated manner. Each manufacturer has only one plant with parallel machines and is located far away from the other manufacturers. Orders are first processed at the plants and then directly shipped from the plants to the enterprise in order to be finished before a given deadline. We study the above integrated production-distribution scheduling problem with multiple manufacturers to maximize a weighted sum of the profit of each manufacturer under the constraints that all orders are finished before the deadline and the profit of each manufacturer is not negative. According to the optimal condition analysis, we formulate the problem as a mixed integer programming model and use CPLEX to solve it.

  7. PLAStiCC: Predictive Look-Ahead Scheduling for Continuous dataflows on Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Kumbhare, Alok [Univ. of Southern California, Los Angeles, CA (United States); Simmhan, Yogesh [Indian Inst. of Technology (IIT), Bangalore (India); Prasanna, Viktor K. [Univ. of Southern California, Los Angeles, CA (United States)

    2014-05-27

    Scalable stream processing and continuous dataflow systems are gaining traction with the rise of big data due to the need for processing high velocity data in near real time. Unlike batch processing systems such as MapReduce and workflows, static scheduling strategies fall short for continuous dataflows due to the variations in the input data rates and the need for sustained throughput. The elastic resource provisioning of cloud infrastructure is valuable to meet the changing resource needs of such continuous applications. However, multi-tenant cloud resources introduce yet another dimension of performance variability that impacts the application’s throughput. In this paper we propose PLAStiCC, an adaptive scheduling algorithm that balances resource cost and application throughput using a prediction-based look-ahead approach. It addresses not only variations in the input data rates but also performance variability in the underlying cloud infrastructure. In addition, we propose several simpler static scheduling heuristics that operate in the absence of an accurate performance prediction model. These static and adaptive heuristics are evaluated through extensive simulations using performance traces obtained from public and private IaaS clouds. Our results show an improvement of up to 20% in the overall profit as compared to the reactive adaptation algorithm.

  8. A System for Automatically Generating Scheduling Heuristics

    Science.gov (United States)

    Morris, Robert

    1996-01-01

    The goal of this research is to improve the performance of automated schedulers by designing and implementing an algorithm that automatically generates heuristics for selecting a schedule. The particular application selected for applying this method solves the problem of scheduling telescope observations and is called the Associate Principal Astronomer. The input to the APA scheduler is a set of observation requests submitted by one or more astronomers. Each observation request specifies an observation program as well as scheduling constraints and preferences associated with the program. The scheduler employs greedy heuristic search to synthesize a schedule that satisfies all hard constraints of the domain and achieves a good score with respect to soft constraints expressed as an objective function established by an astronomer-user.
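
    A stripped-down greedy scheduler in this spirit might look like the sketch below: requests are considered in priority order and accepted at the earliest feasible time that respects their visibility windows (hard constraints). The request fields and the priority-then-deadline ordering are illustrative assumptions, not the APA scheduler's actual objective function.

        # Toy greedy scheduler for observation requests with hard time windows.
        def greedy_schedule(requests, horizon):
            """requests: list of dicts with 'id', 'duration', 'window' (start, end),
            and 'priority'. Returns a list of (id, start_time) assignments."""
            schedule, time_free = [], 0.0
            # consider high-priority, early-deadline requests first
            for req in sorted(requests, key=lambda r: (-r["priority"], r["window"][1])):
                earliest = max(time_free, req["window"][0])
                # hard constraints: must fit inside its window and the night
                if earliest + req["duration"] <= min(req["window"][1], horizon):
                    schedule.append((req["id"], earliest))
                    time_free = earliest + req["duration"]
            return schedule

        if __name__ == "__main__":
            requests = [
                {"id": "obs-A", "duration": 2.0, "window": (0.0, 6.0), "priority": 3},
                {"id": "obs-B", "duration": 1.5, "window": (1.0, 4.0), "priority": 5},
                {"id": "obs-C", "duration": 3.0, "window": (2.0, 10.0), "priority": 1},
            ]
            print(greedy_schedule(requests, horizon=10.0))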

  9. 76 FR 60359 - Phytosanitary Treatments; Location of and Process for Updating Treatment Schedules; Technical...

    Science.gov (United States)

    2011-09-29

    ... supporting information and data, to the Animal and Plant Health Inspection Service, Plant Protection and... that approved treatment schedules will instead be found in the Plant Protection and Quarantine...

  10. Integrated batch production and maintenance scheduling for multiple items processed on a deteriorating machine to minimize total production and maintenance costs with due date constraint

    Directory of Open Access Journals (Sweden)

    Zahedi Zahedi

    2016-04-01

    Full Text Available This paper discusses an integrated model of batch production and maintenance scheduling on a deteriorating machine producing multiple items to be delivered at a common due date. The model describes the trade-off between total inventory cost and maintenance cost as the production run length increases. The production run length is a time bucket between two consecutive preventive maintenance activities. The objective function of the model is to minimize the total cost, consisting of in-process and completed-part inventory costs, setup cost, preventive and corrective maintenance costs, and rework cost. The problem is to determine the optimal production run length and to schedule the resulting batches in order to minimize the total cost.

  11. Applying dynamic priority scheduling scheme to static systems of pinwheel task model in power-aware scheduling.

    Science.gov (United States)

    Seol, Ye-In; Kim, Young-Kuk

    2014-01-01

    Power-aware scheduling reduces CPU energy consumption in hard real-time systems through dynamic voltage scaling (DVS). In this paper, we deal with the pinwheel task model, which is known as a static and predictable task model and could be applied to various embedded or ubiquitous systems. In the pinwheel task model, each task's priority is static and its execution sequence can be predetermined. There have been many static approaches to power-aware scheduling in the pinwheel task model. In this paper, however, we show that the dynamic priority scheduling results of power-aware scheduling can also be applied to the pinwheel task model. This method is more effective than adopting the previous static priority scheduling methods in saving energy consumption and, as the system remains static, it is more tractable and applicable to small-sized embedded or ubiquitous computing. We also introduce a novel power-aware scheduling algorithm which exploits all slacks under preemptive earliest-deadline-first scheduling, which is optimal in uniprocessor systems. The dynamic priority method presented in this paper can be applied directly to static systems of the pinwheel task model. The simulation results show that the proposed algorithm, with an algorithmic complexity of O(n), reduces energy consumption by 10-80% over the existing algorithms.
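
    For intuition, the sketch below shows the basic DVS idea for a set of jobs that are all ready at time zero: pick the lowest constant speed at which earliest-deadline-first still meets every deadline, then run the jobs in deadline order at that speed. The task data, the feasibility bound of 1.0, and the energy model speed^2 * time are illustrative assumptions, not the paper's O(n) algorithm.

        # Toy EDF-with-voltage-scaling sketch for jobs all released at time zero.
        def edf_plan(tasks):
            """tasks: list of (work_at_full_speed, deadline). Returns (plan, energy)."""
            # lowest constant speed such that every cumulative deadline demand is met
            speed = max(sum(w for w, dl in tasks if dl <= d) / d for _, d in tasks)
            if speed > 1.0:
                raise ValueError("infeasible even at full speed")
            t, plan = 0.0, []
            for w, d in sorted(tasks, key=lambda x: x[1]):       # EDF order
                duration = w / speed
                plan.append((round(t, 3), round(duration, 3), round(speed, 3)))
                t += duration
            energy = speed ** 2 * t          # E ~ f^2 * time (illustrative model)
            return plan, energy

        if __name__ == "__main__":
            tasks = [(2.0, 10.0), (1.0, 4.0), (3.0, 20.0)]
            plan, energy = edf_plan(tasks)
            print(plan, round(energy, 3))    # all jobs finish exactly on time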

  12. A simple rule based model for scheduling farm management operations in SWAT

    Science.gov (United States)

    Schürz, Christoph; Mehdi, Bano; Schulz, Karsten

    2016-04-01

    For many interdisciplinary questions at the watershed scale, the Soil and Water Assessment Tool (SWAT; Arnold et al., 1998) has become an accepted and widely used tool. Despite its flexibility, the model is highly demanding when it comes to input data. At SWAT's core, the water balance and the modeled nutrient cycles are plant growth driven (implemented with the EPIC crop growth model). Therefore, land use and crop data with high spatial and thematic resolution, as well as detailed information on cultivation and farm management practices, are required. For many applications of the model, however, these data are unavailable. In order to meet these requirements, SWAT offers the option to trigger scheduled farm management operations by applying the Potential Heat Unit (PHU) concept. The PHU concept solely takes into account the accumulation of daily mean temperature for management scheduling. Hence, it contradicts several farming strategies that take place in reality, such as: i) planting and harvesting dates are set much too early or too late, as the PHU concept is strongly sensitive to inter-annual temperature fluctuations; ii) the timing of fertilizer application in SWAT often occurs simultaneously on the same date in each field; iii) and can also coincide with precipitation events. Particularly, the latter two can lead to strong peaks in modeled nutrient loads. To cope with these shortcomings we propose a simple rule based model (RBM) to schedule management operations according to realistic farmer management practices in SWAT. The RBM involves simple strategies requiring only data that are input into the SWAT model initially, such as temperature and precipitation data. The user provides boundaries of time periods for operation schedules to take place for all crops in the model. These data are readily available from the literature or from crop variety trials. The RBM applies the dates by complying with the following rules: i) Operations scheduled in the

  13. An Online Scheduling Algorithm with Advance Reservation for Large-Scale Data Transfers

    Energy Technology Data Exchange (ETDEWEB)

    Balman, Mehmet; Kosar, Tevfik

    2010-05-20

    Scientific applications and experimental facilities generate massive data sets that need to be transferred to remote collaborating sites for sharing, processing, and long term storage. In order to support increasingly data-intensive science, next generation research networks have been deployed to provide high-speed on-demand data access between collaborating institutions. In this paper, we present a practical model for online data scheduling in which data movement operations are scheduled in advance for end-to-end high performance transfers. In our model, the data scheduler interacts with reservation managers and data transfer nodes in order to reserve available bandwidth to guarantee completion of jobs that are accepted and confirmed to satisfy the preferred time constraint given by the user. Our methodology improves current systems by allowing researchers and higher level meta-schedulers to use data placement as a service where they can plan ahead and reserve the scheduler time in advance for their data movement operations. We have implemented our algorithm and examined possible techniques for incorporation into current reservation frameworks. Performance measurements confirm that the proposed algorithm is efficient and scalable.
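
    The advance-reservation bookkeeping can be pictured with the small sketch below: a reservation manager accepts a transfer only if the requested bandwidth fits under the link capacity in every time slot of the requested window. The slot granularity, capacity, and job parameters are illustrative assumptions, not the interface of the actual reservation frameworks discussed in the paper.

        # Minimal sketch of slot-based advance reservation for data transfers.
        class ReservationManager:
            def __init__(self, capacity_gbps, horizon_slots):
                self.capacity = capacity_gbps
                self.used = [0.0] * horizon_slots        # reserved bandwidth per slot

            def try_reserve(self, start_slot, end_slot, bandwidth_gbps):
                """Reserve bandwidth over [start_slot, end_slot); refuse the job if it
                would exceed the link capacity in any slot of the window."""
                window = range(start_slot, end_slot)
                if any(self.used[s] + bandwidth_gbps > self.capacity for s in window):
                    return False
                for s in window:
                    self.used[s] += bandwidth_gbps
                return True

        if __name__ == "__main__":
            rm = ReservationManager(capacity_gbps=10.0, horizon_slots=24)
            print(rm.try_reserve(0, 6, 6.0))   # accepted
            print(rm.try_reserve(2, 8, 6.0))   # rejected: slots 2-5 would exceed 10
            print(rm.try_reserve(6, 12, 6.0))  # accepted: no overlap with first job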

  14. Artificial intelligence approaches to astronomical observation scheduling

    Science.gov (United States)

    Johnston, Mark D.; Miller, Glenn

    1988-01-01

    Automated scheduling will play an increasing role in future ground- and space-based observatory operations. Due to the complexity of the problem, artificial intelligence technology currently offers the greatest potential for the development of scheduling tools with sufficient power and flexibility to handle realistic scheduling situations. Summarized here are the main features of the observatory scheduling problem, how artificial intelligence (AI) techniques can be applied, and recent progress in AI scheduling for Hubble Space Telescope.

  15. Distributed scheduling for autonomous vehicles by reinforcement learning; Kyoka gakushu ni yoru mujin hansosha no bunsangata scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Unoki, T.; Suetake, N. [Oki Electric Industry Co. Ltd., Tokyo (Japan)

    1997-08-20

    In this paper, we propose an autonomous vehicle scheduling schema for large physical distribution terminals publicly used as next generation wide area physical distribution bases. This schema uses a Learning Automaton for vehicle scheduling based on the Contract Net Protocol, in order to obtain useful emergent behaviors of agents in the system based on the local decision-making of each agent. The state of the automaton is updated at each instant on the basis of new information that includes the estimated arrival time of vehicles. Each agent estimates the arrival time of vehicles by using a Bayesian learning process. Using traffic simulation, we evaluate the schema in various simulated environments. The results show the advantage of the schema over the case where each agent is given the same criteria from the top down; instead, each agent voluntarily generates criteria via interactions with the environment, playing an individual role in the system. 22 refs., 5 figs., 2 tabs.

  16. Compilation time analysis to minimize run-time overhead in preemptive scheduling on multiprocessors

    Science.gov (United States)

    Wauters, Piet; Lauwereins, Rudy; Peperstraete, J.

    1994-10-01

    This paper describes a scheduling method for hard real-time Digital Signal Processing (DSP) applications, implemented on a multi-processor. Due to the very high operating frequencies of DSP applications (typically hundreds of kHz), runtime overhead should be kept as small as possible. Because static scheduling introduces very little run-time overhead, it is used as much as possible. Dynamic pre-emption of tasks is allowed if and only if it leads to better performance in spite of the extra run-time overhead. We essentially combine static scheduling with dynamic pre-emption using static priorities. Since we are dealing with hard real-time applications, we must be able to guarantee at compile-time that all timing requirements will be satisfied at run-time. We will show that our method performs at least as well as any static scheduling method. It also reduces the total amount of dynamic pre-emptions compared with run-time methods like deadline monotonic scheduling.

  17. The Gerda Phase II detector assembly

    Energy Technology Data Exchange (ETDEWEB)

    Bode, Tobias; Schoenert, Stefan [Physik-Department E15, Technische Universitaet Muenchen (Germany); Schwingenheuer, Bernhard [Max-Planck-Institut fuer Kernphysik, Heidelberg (Germany); Collaboration: GERDA-Collaboration

    2013-07-01

    Phase II of the Gerda (Germanium Detector Array) experiment will continue the search for the neutrinoless double beta decay (0νββ) of ^76Ge. Prerequisites for Phase II are an increased target mass and a reduced background index of < 10^-3 cts/(keV·kg·yr). Major hardware upgrades to achieve these requirements are scheduled for 2013. They include the deployment of a new radio-pure low-mass detector assembly. The structural properties of available radio-pure materials and the reduction of mass necessitate a change of the electrical contacting used to bias and read out the detectors. The detector assembly design and the favored contacting solution are presented.

  18. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  19. Organization of the construction and schedule of realization

    International Nuclear Information System (INIS)

    Szerovay, Antal; Vogel, Oszkar

    1988-01-01

    The four units of the Paks nuclear power plant, with all the auxiliary and service facilities, were constructed between 1974 and 1987. Major features of the building activities and the types of buildings, classified according to the connection between the facility and the technology, are listed. Processes and structures applied to the construction of the main reactor building are mentioned, emphasizing the importance of up-to-date building methods. To move the large equipment into the reactor building, Potain cranes of 400 Mp were used. As a result of an adequate schedule, the auxiliary buildings could be used as preparatory buildings during the construction of the plant. The energy and material supply and the schedule of the construction works are discussed. (V.N.) 6 refs.; 5 figs

  20. Alternative Work Schedules: Definitions

    Science.gov (United States)

    Journal of the College and University Personnel Association, 1977

    1977-01-01

    The term "alternative work schedules" encompasses any variation of the requirement that all permanent employees in an organization or one shift of employees adhere to the same five-day, seven-to-eight-hour schedule. This article defines staggered hours, flexible working hours (flexitour and gliding time), compressed work week, the task system, and…

  1. ENHANCED HYBRID PSO – ACO ALGORITHM FOR GRID SCHEDULING

    Directory of Open Access Journals (Sweden)

    P. Mathiyalagan

    2010-07-01

    Full Text Available Grid computing is a high performance computing environment for solving large scale computational demands. Grid computing involves resource management, task scheduling, security problems, information management and so on. Task scheduling is a fundamental issue in achieving high performance in grid computing systems. A computational grid is typically heterogeneous in the sense that it combines clusters of varying sizes, and different clusters typically contain processing elements with different levels of performance. In this work, heuristic approaches based on particle swarm optimization and ant colony optimization algorithms are adopted for solving task scheduling problems in the grid environment. Particle Swarm Optimization (PSO) is one of the latest nature-inspired evolutionary optimization techniques. It has a good global search ability and has been successfully applied to many areas such as neural network training. Due to the linear decrease of the inertia weight in PSO, the convergence rate becomes faster, which leads to a minimal makespan when used for scheduling. To make the convergence rate faster, the PSO algorithm is improved by modifying the inertia parameter, such that it produces better performance and gives an optimized result. The ACO algorithm is improved by modifying the pheromone updating rule. The ACO algorithm is hybridized with the PSO algorithm for efficient results and better convergence of the PSO algorithm.
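
    The PSO component with a linearly decreasing inertia weight can be sketched as follows: particles encode a task-to-machine mapping, fitness is the resulting makespan, and the inertia weight is reduced from w_max to w_min over the iterations. The encoding, parameter values, and instance data are illustrative assumptions; the ACO hybridization and the modified pheromone rule are not shown.

        # PSO sketch for grid task scheduling with linearly decreasing inertia weight.
        import random

        def makespan(mapping, task_len, machine_speed):
            load = [0.0] * len(machine_speed)
            for task, m in enumerate(mapping):
                load[m] += task_len[task] / machine_speed[m]
            return max(load)

        def pso_schedule(task_len, machine_speed, particles=20, iters=100,
                         w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, seed=1):
            rng = random.Random(seed)
            n, m = len(task_len), len(machine_speed)
            decode = lambda x: [int(v) % m for v in x]   # position -> machine index
            pos = [[rng.uniform(0, m) for _ in range(n)] for _ in range(particles)]
            vel = [[0.0] * n for _ in range(particles)]
            pbest = [p[:] for p in pos]
            pbest_fit = [makespan(decode(p), task_len, machine_speed) for p in pos]
            g = pbest_fit.index(min(pbest_fit))
            gbest, gbest_fit = pbest[g][:], pbest_fit[g]
            for it in range(iters):
                w = w_max - (w_max - w_min) * it / max(1, iters - 1)  # linear decrease
                for i in range(particles):
                    for d in range(n):
                        r1, r2 = rng.random(), rng.random()
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * r1 * (pbest[i][d] - pos[i][d])
                                     + c2 * r2 * (gbest[d] - pos[i][d]))
                        pos[i][d] = min(m - 1e-9, max(0.0, pos[i][d] + vel[i][d]))
                    fit = makespan(decode(pos[i]), task_len, machine_speed)
                    if fit < pbest_fit[i]:
                        pbest[i], pbest_fit[i] = pos[i][:], fit
                        if fit < gbest_fit:
                            gbest, gbest_fit = pos[i][:], fit
            return decode(gbest), gbest_fit

        if __name__ == "__main__":
            print(pso_schedule(task_len=[4, 8, 3, 7, 5, 6],
                               machine_speed=[1.0, 2.0, 1.5]))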

  2. Cleaning Schedule Operations in Heat Exchanger Networks

    Directory of Open Access Journals (Sweden)

    Huda Hairul

    2018-01-01

    Full Text Available Heat exchanger networks are known to be essential parts of the chemical industries. Unfortunately, since the performance of a heat exchanger in transferring heat from the hot stream to the cold stream decreases due to fouling, the heat exchanger needs to be cleaned periodically to restore its initial performance. A process of heating crude oil in a refinery plant was used as a case study. As many as eleven heat exchangers were used to heat crude oil before it was heated by a furnace to the temperature required by the crude unit distillation column. The purpose of this study is to determine the cleaning schedule of the heat exchangers in the heat exchanger network due to the decrease of the overall heat transfer coefficient by various percentages of the design value. A close study was made of the heat exchanger cleaning schedule in heat exchanger networks, using the decrease of the overall heat transfer coefficient as the target. The results showed that the higher the fouling value, the more often the heat exchanger is cleaned, because the overall heat transfer coefficient decreases more quickly.
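
    The scheduling logic can be illustrated with the toy sketch below: the overall heat transfer coefficient U decays from its clean value as fouling builds up, and a cleaning is triggered whenever U falls below a chosen percentage of the design value. The exponential decay model and all numbers are illustrative assumptions, not the paper's fouling data.

        # Toy cleaning-schedule generator driven by the decrease of U due to fouling.
        import math

        def cleaning_schedule(u_clean, decay_per_month, threshold_pct, horizon_months):
            """Return the months at which a cleaning is triggered."""
            u, cleanings = u_clean, []
            for month in range(1, horizon_months + 1):
                u *= math.exp(-decay_per_month)          # fouling degrades U each month
                if u <= threshold_pct * u_clean:
                    cleanings.append(month)
                    u = u_clean                          # cleaning restores performance
            return cleanings

        if __name__ == "__main__":
            # heavier fouling (larger decay rate) leads to more frequent cleaning
            for decay in (0.02, 0.05, 0.10):
                print(decay, cleaning_schedule(u_clean=500.0, decay_per_month=decay,
                                               threshold_pct=0.80, horizon_months=36))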

  3. Planning and Scheduling of Airline Operations

    Directory of Open Access Journals (Sweden)

    İlkay ORHAN

    2010-02-01

    Full Text Available The Turkish Civil Aviation sector grew at a rate of 53% between the years 2002-2008 owing to countrywide economic developments and the removal of some restrictions in the aviation field. Successful international companies in the sector use advanced computer-supported solution methods for their planning and scheduling problems. These methods have been providing significant competitive advantages to those companies. There are four major scheduling and planning problems in the airline sector: flight scheduling, aircraft scheduling, crew scheduling and disruption management. These scheduling and planning problems, faced by all airline companies, were examined in detail. Studies reveal that companies using the advanced methods might gain significant cost reductions. However, even then, the time required to solve large scale problems may not satisfy the decision quality desired by decision makers. In such cases, using modern decision methods integrated with advanced technologies offers companies an opportunity for significant cost advantages.

  4. Laser Welding Process Parameters Optimization Using Variable-Fidelity Metamodel and NSGA-II

    Directory of Open Access Journals (Sweden)

    Wang Chaochao

    2017-01-01

    Full Text Available An optimization methodology based on variable-fidelity (VF) metamodels and the nondominated sorting genetic algorithm II (NSGA-II) for laser bead-on-plate welding of stainless steel 316L is presented. The relationships between the input process parameters (laser power, welding speed and laser focal position) and the output responses (weld width and weld depth) are constructed by VF metamodels. In the VF metamodels, information from models of two fidelity levels is integrated: the low-fidelity (LF) model is a finite element simulation model that is used to capture the general trend of the metamodels, and the high-fidelity (HF) model, which is built from physical experiments, is used to ensure the accuracy of the metamodels. The accuracy of the VF metamodel is verified by actual experiments. To solve the optimization problem, NSGA-II is used to search for the multi-objective Pareto optimal solutions. The results of the verification experiments show that the obtained optimal parameters are effective and reliable.

  5. Development of Watch Schedule Using Rules Approach

    Science.gov (United States)

    Jurkevicius, Darius; Vasilecas, Olegas

    The software for schedule creation and optimization solves a difficult, important and practical problem. The proposed solution is an online employee portal where administrator users can create and manage watch schedules and employee requests. Each employee can log in with his/her own account and see his/her assignments, manage requests, etc. Employees set as administrators can perform the employee scheduling online, manage requests, etc. This scheduling software allows users not only to see the initial and optimized watch schedule in a simple and understandable form, but also to create special rules and criteria and input their business rules. Using these rules, the system will automatically generate the watch schedule.

  6. Distributed Hybrid Scheduling in Multi-Cloud Networks using Conflict Graphs

    KAUST Repository

    Douik, Ahmed

    2017-09-07

    Recent studies on cloud-radio access networks assume either signal-level or scheduling-level coordination. This paper considers a hybrid coordinated scheme as a means to benefit from both policies. Consider the downlink of a multi-cloud radio access network, where each cloud is connected to several base-stations (BSs) via high capacity links, and, therefore, allows for joint signal processing within the cloud transmission. Across the multiple clouds, however, only scheduling-level coordination is permitted, as low levels of backhaul communication are feasible. The frame structure of every BS is composed of various time/frequency blocks, called power-zones (PZs), which are maintained at a fixed power level. The paper addresses the problem of maximizing a network-wide utility by associating users to clouds and scheduling them to the PZs, under the practical constraints that each user is scheduled to a single cloud at most, but possibly to many BSs within the cloud, and can be served by one or more distinct PZs within the BSs’ frame. The paper solves the problem using graph theory techniques by constructing the conflict graph. The considered scheduling problem is, then, shown to be equivalent to a maximum-weight independent set problem in the constructed graph, which can be solved using efficient techniques. The paper then proposes solving the problem using both optimal and heuristic algorithms that can be implemented in a distributed fashion across the network. The proposed distributed algorithms rely on the well-chosen structure of the constructed conflict graph utilized to solve the maximum-weight independent set problem. Simulation results suggest that the proposed optimal and heuristic hybrid scheduling strategies provide appreciable gain as compared to the scheduling-level coordinated networks, with a negligible degradation to signal-level coordination.
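
    The graph step can be illustrated with the small sketch below: scheduling candidates (user/cloud/power-zone associations) become vertices of a conflict graph, edges join mutually exclusive candidates, and a schedule corresponds to a high-weight independent set. The greedy weight-over-degree rule shown here is a standard heuristic for illustration only, not the paper's optimal or proposed distributed algorithm.

        # Greedy maximum-weight independent set on a scheduling conflict graph.
        def greedy_mwis(weights, conflicts):
            """weights: dict vertex -> utility; conflicts: iterable of vertex pairs."""
            adj = {v: set() for v in weights}
            for a, b in conflicts:
                adj[a].add(b)
                adj[b].add(a)
            chosen, active = [], set(weights)
            while active:
                # highest weight per remaining conflict degree first
                v = max(active, key=lambda u: weights[u] / (len(adj[u] & active) + 1))
                chosen.append(v)
                active -= adj[v] | {v}      # drop v and everything it conflicts with
            return chosen, sum(weights[v] for v in chosen)

        if __name__ == "__main__":
            # vertices are hypothetical (user, cloud, power-zone) candidates
            weights = {"u1-c1-z1": 4.0, "u1-c2-z3": 3.5, "u2-c1-z2": 2.0, "u3-c2-z3": 1.5}
            conflicts = {("u1-c1-z1", "u1-c2-z3"),   # same user, two clouds
                         ("u1-c2-z3", "u3-c2-z3")}   # same power-zone
            print(greedy_mwis(weights, conflicts))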

  7. DAS, Uncoordinated Femto and Joint Scheduling Systems for In-Building Wireless Solutions

    DEFF Research Database (Denmark)

    Liu, Zhen; Sørensen, Troels Bundgaard; Wigard, Jeoren

    2011-01-01

    -data-rate services. Distributed antenna systems and Femto cells are cost-efficient techniques for this application. In this paper, their performance is evaluated in an LTE downlink context along with a proposed joint scheduling system, which maximizes the supported number of users under a QoS constraint....... The selection of the enterprise building model includes a general office building model described in the WINNER II project and a site-specific office building with large scale path-loss values retrieved from measurements. Results show superior performance of the Femto system compared to DAS in providing high...

  8. Decentralization and mechanism design for online machine scheduling

    NARCIS (Netherlands)

    Arge, Lars; Heydenreich, Birgit; Müller, Rudolf; Freivalds, Rusins; Uetz, Marc Jochen

    We study the online version of the classical parallel machine scheduling problem to minimize the total weighted completion time from a new perspective: We assume that the data of each job, namely its release date $r_j$, its processing time $p_j$ and its weight $w_j$ is only known to the job itself,

  9. Data analysis with the DIANA meta-scheduling approach

    International Nuclear Information System (INIS)

    Anjum, A; McClatchey, R; Willers, I

    2008-01-01

    The concepts, design and evaluation of the Data Intensive and Network Aware (DIANA) meta-scheduling approach for solving the challenges of data analysis being faced by CERN experiments are discussed in this paper. Our results suggest that data analysis can be made robust by employing fault tolerant and decentralized meta-scheduling algorithms supported in our DIANA meta-scheduler. The DIANA meta-scheduler supports data intensive bulk scheduling, is network aware and follows a policy centric meta-scheduling. In this paper, we demonstrate that a decentralized and dynamic meta-scheduling approach is an effective strategy to cope with increasing numbers of users, jobs and datasets. We present 'quality of service' related statistics for physics analysis through the application of a policy centric fair-share scheduling model. The DIANA meta-schedulers create a peer-to-peer hierarchy of schedulers to accomplish resource management that changes with evolving loads and is dynamic and adapts to the volatile nature of the resources

  10. Scheduling lessons learned from the Autonomous Power System

    Science.gov (United States)

    Ringer, Mark J.

    1992-01-01

    The Autonomous Power System (APS) project at NASA LeRC is designed to demonstrate the application of integrated intelligent diagnosis, control, and scheduling techniques to space power distribution systems. The project consists of three elements: the Autonomous Power Expert System (APEX) for Fault Diagnosis, Isolation, and Recovery (FDIR); the Autonomous Intelligent Power Scheduler (AIPS) to efficiently assign activity start times and resources; and power hardware (Brassboard) to emulate a space-based power system. The AIPS scheduler was tested within the APS system. This scheduler is able to efficiently assign available power to the requesting activities and share this information with other software agents within the APS system in order to implement the generated schedule. The AIPS scheduler is also able to cooperatively recover from fault situations by rescheduling the affected loads on the Brassboard in conjunction with the APEX FDIR system. AIPS served as a learning tool and an initial scheduling testbed for the integration of FDIR and automated scheduling systems. Many lessons were learned from the AIPS scheduler and are now being integrated into a new scheduler called SCRAP (Scheduler for Continuous Resource Allocation and Planning). This paper serves three purposes: an overview of the AIPS implementation, lessons learned from the AIPS scheduler, and a brief section on how these lessons are being applied to the new SCRAP scheduler.

  11. Comparative assessment of TRU waste forms and processes. Volume II. Waste form data, process descriptions, and costs

    International Nuclear Information System (INIS)

    Ross, W.A.; Lokken, R.O.; May, R.P.; Roberts, F.P.; Thornhill, R.E.; Timmerman, C.L.; Treat, R.L.; Westsik, J.H. Jr.

    1982-09-01

    This volume contains supporting information for the comparative assessment of the transuranic waste forms and processes summarized in Volume I. Detailed data on the characterization of the waste forms selected for the assessment, process descriptions, and cost information are provided. The purpose of this volume is to provide additional information that may be useful when using the data in Volume I and to provide greater detail on particular waste forms and processes. Volume II is divided into two sections and two appendixes. The first section provides information on the preparation of the waste form specimens used in this study and additional characterization data in support of that in Volume I. The second section includes detailed process descriptions for the eight processes evaluated. Appendix A lists the results of MCC-1 leach test and Appendix B lists additional cost data. 56 figures, 12 tables

  12. Optimization of Hierarchically Scheduled Heterogeneous Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Traian; Pop, Paul; Eles, Petru

    2005-01-01

    We present an approach to the analysis and optimization of heterogeneous distributed embedded systems. The systems are heterogeneous not only in terms of hardware components, but also in terms of communication protocols and scheduling policies. When several scheduling policies share a resource......, they are organized in a hierarchy. In this paper, we address design problems that are characteristic to such hierarchically scheduled systems: assignment of scheduling policies to tasks, mapping of tasks to hardware components, and the scheduling of the activities. We present algorithms for solving these problems....... Our heuristics are able to find schedulable implementations under limited resources, achieving an efficient utilization of the system. The developed algorithms are evaluated using extensive experiments and a real-life example....

  13. Sport Tournament Automated Scheduling System

    OpenAIRE

    Raof R. A. A; Sudin S.; Mahrom N.; Rosli A. N. C

    2018-01-01

    The organizers of sport events often face problems such as wrong calculation of marks and scores, as well as difficulty in creating a good and reliable schedule. Most of the time, issues about the level of integrity of committee members and also issues about errors made by humans come into the picture. Therefore, the development of a sport tournament automated scheduling system is proposed. The system will be able to automatically generate the tournament schedule as well as automatically calc...

  14. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital.

    Science.gov (United States)

    Lu, Lingbo; Li, Jingshan; Gisler, Paula

    2011-06-01

    Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital.
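
    A stripped-down Monte Carlo version of such a what-if analysis is sketched below: each scheduled outpatient test may be missing pre-approval or order information, a fraction of those orders is caught and fixed by central scheduling, and the remainder is denied for payment. All probabilities and dollar values are illustrative assumptions, not figures from the study.

        # Monte Carlo sketch of revenue losses from scheduling-process errors.
        import random

        def simulate_losses(n_orders, p_missing_info, p_caught_and_fixed,
                            avg_reimbursement, seed=42):
            rng = random.Random(seed)
            denied = 0
            for _ in range(n_orders):
                # order has an error AND the error slips past central scheduling
                if rng.random() < p_missing_info and rng.random() > p_caught_and_fixed:
                    denied += 1                       # claim denied: revenue lost
            return denied, denied * avg_reimbursement

        if __name__ == "__main__":
            # "what-if": tighter checking in central scheduling raises the catch rate
            for catch_rate in (0.50, 0.80, 0.95):
                denied, loss = simulate_losses(n_orders=10_000, p_missing_info=0.08,
                                               p_caught_and_fixed=catch_rate,
                                               avg_reimbursement=600.0)
                print(f"catch rate {catch_rate:.0%}: {denied} denials, ${loss:,.0f} lost")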

  15. Cost, Schedule, And Performance Elements For Comparison of Hydrodynamic Models of Near-Surface Unmanned Underwater Vehicle Operations

    Science.gov (United States)

    2017-12-01

    Naval Postgraduate School, Monterey, California. Systems Engineering Capstone Project Report. Approved for public release; distribution is unlimited. ...from the scope of this demonstration due to time constraints. Further study of this software would benefit similar cost, schedule, and performance...

  16. The Business Change Initiative: A Novel Approach to Improved Cost and Schedule Management

    Science.gov (United States)

    Shinn, Stephen A.; Bryson, Jonathan; Klein, Gerald; Lunz-Ruark, Val; Majerowicz, Walt; McKeever, J.; Nair, Param

    2016-01-01

    Goddard Space Flight Center's Flight Projects Directorate employed a Business Change Initiative (BCI) to infuse a series of activities coordinated to drive improved cost and schedule performance across Goddard's missions. This sustaining change framework provides a platform to manage and implement cost and schedule control techniques throughout the project portfolio. The BCI concluded in December 2014, deploying over 100 cost and schedule management changes including best practices, tools, methods, training, and knowledge sharing. The new business approach has driven the portfolio to improved programmatic performance. The last eight launched GSFC missions have optimized cost, schedule, and technical performance on a sustained basis to deliver on time and within budget, returning funds in many cases. While not every future mission will boast such strong performance, improved cost and schedule tools, management practices, and ongoing comprehensive evaluations of program planning and control methods to refine and implement best practices will continue to provide a framework for sustained performance. This paper will describe the tools, techniques, and processes developed during the BCI and the utilization of collaborative content management tools to disseminate project planning and control techniques to ensure continuous collaboration and optimization of cost and schedule management in the future.

  17. Short term scheduling of multiple grid-parallel PEM fuel cells for microgrid applications

    Energy Technology Data Exchange (ETDEWEB)

    El-Sharkh, M.Y.; Rahman, A.; Alam, M.S. [Dept. of Electrical and Computer Engineering, University of South Alabama, Mobile, AL 36688 (United States)

    2010-10-15

    This paper presents a short term scheduling scheme for multiple grid-parallel PEM fuel cell power plants (FCPPs) connected to supply electrical and thermal energy to a microgrid community. As in the case of regular power plants, short term scheduling of FCPPs is also a cost-based optimization problem that includes the cost of operation, thermal power recovery, and the power trade with the local utility grid. Due to the ability of the microgrid community to trade power with the local grid, the power balance constraint is not applicable; other constraints, such as the real power operating limits of the FCPPs and minimum up and down times, are therefore used. To solve the short term scheduling problem of the FCPPs, a hybrid technique based on evolutionary programming (EP) and a hill climbing (HC) technique is used. The EP is used to estimate the optimal schedule and the output power from each FCPP. The HC technique is used to monitor the feasibility of the solution during the search process. The short term scheduling problem is used to estimate the schedule and the electrical and thermal power output of five FCPPs supplying a maximum power of 300 kW. (author)

  18. 40 CFR 141.702 - Sampling schedules.

    Science.gov (United States)

    2010-07-01

    ... serving at least 10,000 people must submit their sampling schedule for the initial round of source water... submitting the sampling schedule that EPA approves. (3) Systems serving fewer than 10,000 people must submit... analytical result for a scheduled sampling date due to equipment failure, loss of or damage to the sample...

  19. Scheduling with Optimized Communication for Time-Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    1999-01-01

    We present an approach to process scheduling for synthesis of safety-critical distributed embedded systems. Our system model captures both the flow of data and that of control. The communication model is based on a time-triggered protocol. We take into consideration overheads due to communication...

  20. Prevalence of α(+)-Thalassemia in the Scheduled Tribe and Scheduled Caste Populations of Damoh District in Madhya Pradesh, Central India.

    Science.gov (United States)

    Singh, Mendi P S S; Gupta, Rasik B; Yadav, Rajiv; Sharma, Ravendra K; Shanmugam, Rajasubramaniam

    2016-08-01

    This study was carried out to ascertain the allelic frequency of α(+)-thalassemia (α(+)-thal) in the scheduled caste and scheduled tribe populations of the Damoh district of Madhya Pradesh, India. Random blood samples from scheduled tribe (267) and scheduled caste (168) individuals, considering the family as a sampling unit, were analyzed for the presence of the -α(3.7) (rightward) (NG_000006.1: g.34164_37967del3804) and -α(4.2) (leftward) (AF221717) deletions. α(+)-Thal was significantly higher in the scheduled tribals (77.9%) as compared to the scheduled caste population (9.0%). About 58.0% of scheduled tribals carried at least one chromosome with the -α(3.7) deletion and 20.0% of scheduled tribals carried the -α(4.2) deletion. The frequency of the -α(3.7) allele was 0.487 in the scheduled tribal population in comparison to 0.021 in scheduled castes. The allelic frequency of -α(4.2) was 0.103 and 0.024, respectively, in the above communities. No Hardy-Weinberg equilibrium for the α-thal gene (p population, indicating the presence of selection pressures in favor of the α-thal mutation and adaptation.

  1. Voltage scheduling for low power/energy

    Science.gov (United States)

    Manzak, Ali

    2001-07-01

    Power considerations have become an increasingly dominant factor in the design of both portable and desk-top systems. An effective way to reduce power consumption is to lower the supply voltage, since power is quadratically related to voltage. This dissertation considers the problem of lowering the supply voltage at (i) the system level and at (ii) the behavioral level. At the system level, the voltage of a variable voltage processor is dynamically changed with the work load. Processors with limited sized buffers as well as those with very large buffers are considered. Given the task arrival times, deadline times, execution times, periods and switching activities, task scheduling algorithms that minimize energy or peak power are developed for the processors equipped with very large buffers. A relation between the operating voltages of the tasks for minimum energy/power is determined using the Lagrange multiplier method, and an iterative algorithm that utilizes this relation is developed. Experimental results show that the voltage assignment obtained by the proposed algorithm is very close to that of the optimal energy assignment (0.1% error) and the optimal peak power assignment (1% error). Next, on-line and off-line minimum energy task scheduling algorithms are developed for processors with limited sized buffers. These algorithms have polynomial time complexity and present optimal (off-line) and close-to-optimal (on-line) solutions. A procedure to calculate the minimum buffer size given information about the size of the task (maximum, minimum), execution time (best case, worst case) and deadlines is also presented. At the behavioral level, resources operating at multiple voltages are used to minimize power while maintaining the throughput. Such a scheme has the advantage of allowing modules on the critical paths to be assigned to the highest voltage levels (thus meeting the required timing constraints) while allowing modules on non-critical paths to be assigned

  2. Routine environmental monitoring schedule, calendar year 1995

    International Nuclear Information System (INIS)

    Schmidt, J.W.; Markes, B.M.; McKinney, S.M.

    1994-12-01

    This document provides Bechtel Hanford, Inc. (BHI) and Westinghouse Hanford Company (WHC) a schedule of monitoring and sampling routines for the Operational Environmental Monitoring (OEM) program during calendar year (CY) 1995. Every attempt will be made to consistently follow this schedule; any deviation from this schedule will be documented by an internal memorandum (DSI) explaining the reason for the deviation. The DSI will be issued by the scheduled performing organization and directed to Near-Field Monitoring. The survey frequencies for particular sites are determined by the technical judgment of Near-Field Monitoring and may depend on the site history, radiological status, use, and general conditions. Additional surveys may be requested at irregular frequencies if conditions warrant. All radioactive waste sites are scheduled to be surveyed at least annually. Any newly discovered waste sites not documented by this schedule will be included in the revised schedule for CY 1995

  3. Resource-constrained project scheduling problem: review of past and recent developments

    Directory of Open Access Journals (Sweden)

    Farhad Habibi

    2018-01-01

    Full Text Available The project scheduling problem is both practically and theoretically of paramount importance. From the practical perspective, improvement of project scheduling, as a critical part of the project management process, can lead to successful project completion and a significant decrease in the relevant costs. From the theoretical perspective, project scheduling is regarded as one of the interesting optimization issues, which has attracted the attention of many researchers in the area of operations research. Therefore, the project scheduling issue has been extensively evaluated over time and has been developed from various aspects. In this research, the topics related to the Resource-Constrained Project Scheduling Problem (RCPSP) are reviewed, recent developments in this field are evaluated, and the results are presented for future studies. In this regard, first, the standard problem of RCPSP is expressed and related developments are presented from four aspects: resources, characteristics of activities, type of objective functions, and availability level of information. Following that, details of 216 articles on RCPSP published during 1980-2017 are presented. At the end, in line with the statistics obtained from the evaluation of previous articles, suggestions are made for future studies in order to help the development of new issues in this area.
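
    As a minimal illustration of the standard problem the review covers, the sketch below implements a serial schedule-generation scheme (SSGS) for a single renewable resource: activities are scheduled one by one at the earliest start that respects both precedence and the resource capacity. The priority list and the instance are made-up examples.

        # Serial schedule-generation scheme (SSGS) for a small RCPSP instance.
        def ssgs(durations, demands, preds, capacity, priority):
            """durations[i], demands[i] (single resource), preds[i]: set of
            predecessors, priority: order in which activities are scheduled."""
            horizon = sum(durations)
            usage = [0] * (horizon + max(durations) + 1)   # resource use per period
            start, finish = {}, {}
            for act in priority:
                est = max((finish[p] for p in preds[act]), default=0)  # precedence
                t = est
                # shift right until the resource profile can accommodate the activity
                while any(usage[t + k] + demands[act] > capacity
                          for k in range(durations[act])):
                    t += 1
                start[act], finish[act] = t, t + durations[act]
                for k in range(durations[act]):
                    usage[t + k] += demands[act]
            return start

        if __name__ == "__main__":
            durations = [3, 2, 4, 2]
            demands   = [2, 3, 2, 2]
            preds     = [set(), {0}, {0}, {1, 2}]
            print(ssgs(durations, demands, preds, capacity=4, priority=[0, 1, 2, 3]))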

  4. MULTICRITERIA HYBRID FLOW SHOP SCHEDULING PROBLEM: LITERATURE REVIEW, ANALYSIS, AND FUTURE RESEARCH

    Directory of Open Access Journals (Sweden)

    Marcia de Fatima Morais

    2014-12-01

    Full Text Available This research focuses on the Hybrid Flow Shop production scheduling problem, which is one of the most difficult problems to solve. The literature points to several studies that focus on the Hybrid Flow Shop scheduling problem with single-criterion objective functions. Despite the fact that many real world problems involve several objective functions, which can often compete and conflict, researchers have been led to concentrate their efforts on the development of methods that take this variant into consideration. The goal of the study is to review and analyze the methods in the literature for solving the Hybrid Flow Shop production scheduling problem with multicriteria functions. The analyses were performed on several papers that have been published over the years, considering the parallel machine types, the approach used to develop solution methods, the type of method developed, the objective function, the performance criterion adopted, and the additional constraints considered. The results of the review and analysis of 46 papers showed opportunities for future research on this topic, including the following: (i) use uniform and dedicated parallel machines, (ii) use exact and metaheuristic approaches, (iv) develop lower and upper bounds, relations of dominance and different search strategies to improve the computational time of the exact methods, (v) develop other types of metaheuristics, (vi) work with anticipatory setups, and (vii) add constraints faced by the production systems themselves.

  5. Single machine scheduling with time-dependent linear deterioration and rate-modifying maintenance

    OpenAIRE

    Rustogi, Kabir; Strusevich, Vitaly A.

    2015-01-01

    We study single machine scheduling problems with linear time-dependent deterioration effects and maintenance activities. Maintenance periods (MPs) are included into the schedule, so that the machine, that gets worse during the processing, can be restored to a better state. We deal with a job-independent version of the deterioration effects, that is, all jobs share a common deterioration rate. However, we introduce a novel extension to such models and allow the deterioration rates to change af...

  6. Options for Parallelizing a Planning and Scheduling Algorithm

    Science.gov (United States)

    Clement, Bradley J.; Estlin, Tara A.; Bornstein, Benjamin D.

    2011-01-01

    Space missions have a growing interest in putting multi-core processors onboard spacecraft. For many missions processing power significantly slows operations. We investigate how continual planning and scheduling algorithms can exploit multi-core processing and outline different potential design decisions for a parallelized planning architecture. This organization of choices and challenges helps us with an initial design for parallelizing the CASPER planning system for a mesh multi-core processor. This work extends that presented at another workshop with some preliminary results.

  7. Resource allocation in IT projects: using schedule optimization

    Directory of Open Access Journals (Sweden)

    Michael Chilton

    2014-01-01

    Full Text Available Resource allocation is the process of assigning resources to tasks throughout the life of a project. Despite sophisticated software packages devoted to keeping track of tasks, resources and resource assignments, it is often the case that project managers find some resources over-allocated and therefore unable to complete the assigned work in the allotted amount of time. Most scheduling software has provisions for leveling resources, but the techniques for doing so simply add time to the schedule and may cause delays in tasks that are critical to the project in meeting deadlines. This paper presents a software application that ensures that resources are properly balanced at the beginning of the project and eliminates the situation in which resources become over-allocated. It can be used in a multi-project environment and reused throughout the project as tasks, resource assignments and availability, and the project scope change. The application utilizes the bounded enumeration technique to formulate an optimal schedule for which both the task sequence and resource availability are taken into account. It is run on a database server to reduce the running time and make it a viable application for practitioners.

  8. Automated scheduling and planning from theory to practice

    CERN Document Server

    Ozcan, Ender; Urquhart, Neil

    2013-01-01

      Solving scheduling problems has long presented a challenge for computer scientists and operations researchers. The field continues to expand as researchers and practitioners examine ever more challenging problems and develop automated methods capable of solving them. This book provides 11 case studies in automated scheduling, submitted by leading researchers from across the world. Each case study examines a challenging real-world problem by analysing the problem in detail before investigating how the problem may be solved using state of the art techniques. The areas covered include aircraft scheduling, microprocessor instruction scheduling, sports fixture scheduling, exam scheduling, personnel scheduling and production scheduling. Problem solving methodologies covered include exact as well as (meta)heuristic approaches, such as local search techniques, linear programming, genetic algorithms and ant colony optimisation. The field of automated scheduling has the potential to impact many aspects of our lives...

  9. Computing Models of CDF and D0 in Run II

    International Nuclear Information System (INIS)

    Lammel, S.

    1997-05-01

    The next collider run of the Fermilab Tevatron, Run II, is scheduled for autumn of 1999. Both experiments, the Collider Detector at Fermilab (CDF) and the D0 experiment are being modified to cope with the higher luminosity and shorter bunch spacing of the Tevatron. New detector components, higher event complexity, and an increased data volume require changes from the data acquisition systems up to the analysis systems. In this paper we present a summary of the computing models of the two experiments for Run II

  10. Computing Models of CDF and D0 in Run II

    International Nuclear Information System (INIS)

    Lammel, S.

    1997-01-01

    The next collider run of the Fermilab Tevatron, Run II, is scheduled for autumn of 1999. Both experiments, the Collider Detector at Fermilab (CDF) and the D0 experiment are being modified to cope with the higher luminosity and shorter bunch spacing of the Tevatron. New detector components, higher event complexity, and an increased data volume require changes from the data acquisition systems up to the analysis systems. In this paper we present a summary of the computing models of the two experiments for Run II

  11. Single machine total completion time minimization scheduling with a time-dependent learning effect and deteriorating jobs

    Science.gov (United States)

    Wang, Ji-Bo; Wang, Ming-Zheng; Ji, Ping

    2012-05-01

    In this article, we consider a single machine scheduling problem with a time-dependent learning effect and deteriorating jobs. By the effects of time-dependent learning and deterioration, we mean that the job processing time is defined by a function of its starting time and the total normal processing time of the jobs preceding it in the sequence. The objective is to determine an optimal schedule so as to minimize the total completion time. This problem remains open for the case of -1 < a < 0, where a denotes the learning index; we show that an optimal schedule of the problem is V-shaped with respect to the job normal processing times. Three heuristic algorithms utilising the V-shaped property are proposed, and computational experiments show that the last heuristic algorithm performs effectively and efficiently in obtaining near-optimal solutions.

  12. CHIMERA II - A real-time multiprocessing environment for sensor-based robot control

    Science.gov (United States)

    Stewart, David B.; Schmitz, Donald E.; Khosla, Pradeep K.

    1989-01-01

    A multiprocessing environment for a wide variety of sensor-based robot systems, providing the flexibility, performance, and UNIX-compatible interface needed for fast development of real-time code, is addressed. The requirements imposed on the design of a programming environment for sensor-based robotic control are outlined. The details of the current hardware configuration are presented, along with the details of the CHIMERA II software. Emphasis is placed on the kernel, low-level interboard communication, the user interface, the extended file system, user-definable and dynamically selectable real-time schedulers, remote process synchronization, and generalized interprocess communication. A possible implementation of a hierarchical control model, the NASA/NBS standard reference model for telerobot control systems, is demonstrated.

  13. Direct demonstration of rapid insulin-like growth factor II receptor internalization and recycling in rat adipocytes. Insulin stimulates 125I-insulin-like growth factor II degradation by modulating the IGF-II receptor recycling process

    International Nuclear Information System (INIS)

    Oka, Y.; Rozek, L.M.; Czech, M.P.

    1985-01-01

    The photoactive insulin-like growth factor (IGF)-II analogue 4-azidobenzoyl-125I-IGF-II was synthesized and used to label specifically and covalently the Mr = 250,000 Type II IGF receptor. When rat adipocytes are irradiated after a 10-min incubation with 4-azidobenzoyl-125I-IGF-II at 10 degrees C and immediately homogenized, most of the labeled IGF-II receptors are associated with the plasma membrane fraction, indicating that receptors accessible to the labeling reagent at low temperature are on the cell surface. However, when the photolabeled cells are incubated at 37 degrees C for various times before homogenization, labeled IGF-II receptors are rapidly internalized with a half-time of 3.5 min as evidenced by a loss from the plasma membrane fraction and a concomitant appearance in the low density microsome fraction. The steady state level of cell surface IGF-II receptors in the presence or absence of IGF-II remains constant under these conditions, demonstrating that IGF-II receptors rapidly recycle back to the cell surface at the same rate as receptor internalization. Using the above methodology, it is shown that acute insulin action: 1) increases the steady state number of cell surface IGF-II receptors; 2) increases the number of ligand-bound IGF-II receptors that are internalized per unit of time; and 3) increases the rate of cellular 125I-IGF-II degradation by a process that is blocked by anti-IGF-II receptor antibody

  14. Impact of the Hydrocodone Schedule Change on Opioid Prescription Patterns in South Dakota.

    Science.gov (United States)

    Kuschel, Lauren M; Mort, Jane M

    2017-10-01

    Prescription opioid use is becoming increasingly common; consequently, opioid overdose deaths are increasing at an alarming rate. Hydrocodone, one of the most commonly abused opioids, was changed from a schedule III controlled substance to the more stringent schedule II to decrease abuse and diversion, effective Oct. 6, 2014. The objective of this study was to examine the impact of the hydrocodone schedule change on opioid prescribing in South Dakota. Opioid prescription patterns were examined in the following six-month phases: the baseline phase before the change, the transition phase when existing hydrocodone prescriptions could still be refilled, and the final phase. The South Dakota Board of Pharmacy Prescription Drug Monitoring Program provided aggregate monthly data for South Dakota opioid prescriptions (i.e., total number of prescriptions and days supplied), including urban and rural stratification. T-tests were performed on the monthly values for each phase to determine the significance of differences in prescription features between phases. The number of hydrocodone prescriptions significantly decreased 14 percent from baseline to final phase, while the days supplied per prescription significantly increased 7.4 percent. These changes were greater in rural areas than in urban areas. Conversely, the number of other opioid prescriptions significantly increased by 6.5 percent over this timeframe. The number of hydrocodone prescriptions decreased, while the days supplied per prescription increased. These changes were greater in rural areas than in urban areas. In addition, the number of other opioid prescriptions increased. These trends may reflect some unintended effects of the schedule change.

  15. PRACTICAL IMPLICATIONS OF LOCATION-BASED SCHEDULING

    DEFF Research Database (Denmark)

    Andersson, Niclas; Christensen, Knud

    2007-01-01

    The traditional method for planning, scheduling and controlling activities and resources in construction projects is the CPM-scheduling, which has been the predominant scheduling method since its introduction in the late 1950s. Over the years, CPM has proven to be a very powerful technique...... that will be used in this study. LBS is a scheduling method that rests upon the theories of line-of-balance and which uses the graphic representation of a flowline chart. As such, LBS is adapted for planning and management of workflows and, thus, may provide a solution to the identified shortcomings of CPM. Even...

  16. Environmental surveillance master sampling schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1991-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for the Surface Environmental Surveillance Project (SESP) and Ground-Water Monitoring Project. The routine sampling plan for the SESP has been revised this year to reflect changing site operations and priorities. Some sampling previously performed at least annually has been reduced in frequency, and some new sampling to be performed at a less than annual frequency has been added. Therefore, the SESP schedule reflects sampling to be conducted in calendar year 1991 as well as future years. The ground-water sampling schedule is for 1991. This schedule is subject to modification during the year in response to changes in Site operation, program requirements, and the nature of the observed results. Operational limitations such as weather, mechanical failures, sample availability, etc., may also require schedule modifications. Changes will be documented in the respective project files, but this plan will not be reissued. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs

  17. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L.E.

    1991-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for the Surface Environmental Surveillance Project (SESP) and Ground-Water Monitoring Project. The routine sampling plan for the SESP has been revised this year to reflect changing site operations and priorities. Some sampling previously performed at least annually has been reduced in frequency, and some new sampling to be performed at a less than annual frequency has been added. Therefore, the SESP schedule reflects sampling to be conducted in calendar year 1991 as well as future years. The ground-water sampling schedule is for 1991. This schedule is subject to modification during the year in response to changes in Site operation, program requirements, and the nature of the observed results. Operational limitations such as weather, mechanical failures, sample availability, etc., may also require schedule modifications. Changes will be documented in the respective project files, but this plan will not be reissued. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs.

  18. The Lot Sizing and Scheduling of Sand Casting Operations

    NARCIS (Netherlands)

    Hans, Elias W.; van de Velde, S.L.; van de Velde, Steef

    2011-01-01

    We describe a real world case study that involves the monthly planning and scheduling of the sand-casting department in a metal foundry. The problem can be characterised as a single-level multi-item capacitated lot-sizing model with a variety of additional process-specific constraints. The main

  19. Content Analysis Schedule for Bilingual Education Programs: Proyecto PAL.

    Science.gov (United States)

    Gonzalez, Castor

    This content analysis schedule for "Proyecto PAL" in San Jose, California, presents information on the history, funding, and scope of the project. Included are sociolinguistic process variables such as the native and dominant languages of students and their interaction. Information is provided on staff selection and the linguistic…

  20. A Two-Level Task Scheduler on Multiple DSP System for OpenCL

    Directory of Open Access Journals (Sweden)

    Li Tian

    2014-04-01

    Full Text Available This paper addresses the problem that multiple-DSP systems do not support OpenCL programming. With the proposed compiler, runtime, and kernel scheduler, an OpenCL application becomes portable not only between multiple CPUs and GPUs, but also to embedded multiple-DSP systems. Firstly, the LLVM compiler was adopted for source-to-source translation, in which the translated source is supported by CCS. Secondly, two-level schedulers were proposed to support efficient OpenCL kernel execution. The DSP/BIOS is used to schedule system-level tasks such as interrupts and drivers; however, its synchronization mechanism results in heavy overhead during task switching. We therefore designed an efficient second-level scheduler especially for OpenCL kernel work-item scheduling. The context switch process utilizes the 8 functional units and cross-path links, which is superior to DSP/BIOS in terms of task switching. Finally, dynamic loading and a software-managed cache were redesigned for OpenCL running on a multiple-DSP system. We evaluated the performance using some common OpenCL kernels from the NVIDIA, AMD, NAS, and Parboil benchmarks. Experimental results show that the DSP OpenCL can efficiently exploit the computing resources of multiple cores.

  1. Schedule and staffing of a nuclear power project

    International Nuclear Information System (INIS)

    Polliart, A.J.; Csik, B.

    1977-01-01

    Establishment of the construction schedule: a) preliminary construction schedule; b) PERT (Program Evaluation and Review Technique) analytical method; c) identification of key milestone target dates; d) interaction by participants and contributions to support the revised construction schedule. - Construction schedule control: a) ability to update and modify the construction schedule; b) alternate plans to circumvent restraints (problems); c) critical-path activity controls; d) continuous review and report system. - Updating: a) construction site reports to include 1) progress, 2) accomplishments, and 3) potential problems and alternate plans; b) progress reports on related support services; c) total assessment of participating groups on schedule; d) information required by management for decisions. - Typical causes for delays in the project schedule. (orig.) [de]

  2. Parental nonstandard work schedules during infancy and children's BMI trajectories

    Directory of Open Access Journals (Sweden)

    Afshin Zilanawala

    2017-09-01

    Full Text Available Background: Empirical evidence has demonstrated adverse associations between parental nonstandard work schedules (i.e., evenings, nights, or weekends) and child developmental outcomes. However, there are mixed findings concerning the relationship between parental nonstandard employment and children's body mass index (BMI), and few studies have incorporated information on paternal work schedules. Objective: This paper investigated BMI trajectories from early to middle childhood (ages 3-11) by parental work schedules at 9 months of age, using nationally representative cohort data from the United Kingdom. This study is the first to examine the link between nonstandard work schedules and children's BMI in the United Kingdom. Methods: We used data from the Millennium Cohort Study (2001‒2013, n = 13,021) to estimate trajectories in BMI, using data from ages 3, 5, 7, and 11 years. Joint parental work schedules and a range of biological, socioeconomic, and psychosocial covariates were assessed in the initial interviews at 9 months. Results: Compared to children in two-parent families where parents worked standard shifts, we found steeper BMI growth trajectories for children in two-parent families where both parents worked nonstandard shifts and children in single-parent families whose mothers worked a standard shift. Fathers' shift work, compared to standard shifts, was independently associated with significant increases in BMI. Conclusions: Future public health initiatives focused on reducing the risk of rapid BMI gain in childhood can potentially consider the disruptions to family processes resulting from working nonstandard hours. Contribution: Children in families in which both parents work nonstandard schedules had steeper BMI growth trajectories across the first decade of life. Fathers' nonstandard shifts were independently associated with increases in BMI.

  3. Limited Preemptive Scheduling in Real-time Systems

    OpenAIRE

    Thekkilakattil, Abhilash

    2016-01-01

    Preemptive and non-preemptive scheduling paradigms typically introduce undesirable side effects when scheduling real-time tasks, mainly in the form of preemption overheads and blocking, that potentially compromise timeliness guarantees. The high preemption overheads in preemptive real-time scheduling may imply high resource utilization, often requiring significant over-provisioning, e.g., pessimistic Worst Case Execution Time (WCET) approximations. Non-preemptive scheduling, on the other hand...

  4. Pre-emptive resource-constrained multimode project scheduling using genetic algorithm: A dynamic forward approach

    Directory of Open Access Journals (Sweden)

    Aidin Delgoshaei

    2016-09-01

    Full Text Available Purpose: Resource over-allocation is a big concern for project engineers in the process of scheduling project activities. The over-allocation drawback is frequently seen in practice after a project has been scheduled and causes the schedule to be useless. Modifying an over-allocated schedule is very complicated and needs a lot of effort and time. In this paper, a new and fast-tracking method is proposed to schedule large-scale projects, which can help project engineers to schedule the project rapidly and with more confidence. Design/methodology/approach: In this article, a forward approach for maximizing net present value (NPV) in the multi-mode resource-constrained project scheduling problem with discounted positive cash flows (MRCPSP-DCF) is proposed. The progress payment method is used and all resources are considered pre-emptible. The proposed approach maximizes NPV using unscheduled resources through the resource calendar in forward mode. For this purpose, a Genetic Algorithm is applied. Findings: The findings show that the proposed method is an effective way to maximize NPV in MRCPSP-DCF problems while activity splitting is allowed. The proposed algorithm is very fast and can schedule experimental cases with 1000 variables and 100 resources in a few seconds. The results are then compared with a branch-and-bound method and a simulated annealing algorithm, and it is found that the proposed genetic algorithm provides results of better quality. The algorithm is then applied to scheduling a hospital project in practice. Originality/value: The method can be used alone or as a macro in Microsoft Office Project® software to schedule MRCPSP-DCF problems or to modify resource over-allocated activities after scheduling a project. This can help project engineers to schedule project activities rapidly and with more accuracy in practice.
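
    The objective being maximized above is the net present value of the project's cash flows, with each payment discounted back to time zero by its completion time. The snippet below sketches only that NPV evaluation (the fitness a genetic algorithm would score); the discount rate, cash flows, and completion times are invented, and the full MRCPSP-DCF machinery (modes, resource calendars, pre-emption) is deliberately left out.

```python
import math

# Sketch of the NPV evaluation used as fitness in MRCPSP-DCF-style models:
# each positive cash flow received at a completion/payment time is discounted
# back to time zero. Discount rate and cash flows below are assumptions.

def schedule_npv(cash_flows, completion_times, rate_per_period):
    """Continuous discounting: NPV = sum CF_j * exp(-alpha * C_j)."""
    return sum(cf * math.exp(-rate_per_period * c)
               for cf, c in zip(cash_flows, completion_times))

if __name__ == "__main__":
    cash_flows = [100.0, 250.0, 180.0]        # progress payments per milestone
    early = [4, 9, 14]                        # completion times of one schedule
    late = [6, 12, 18]                        # the same payments finished later
    print("NPV (earlier finish):", round(schedule_npv(cash_flows, early, 0.01), 2))
    print("NPV (later finish):  ", round(schedule_npv(cash_flows, late, 0.01), 2))
```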

  5. Research on information models for the construction schedule management based on the IFC standard

    Directory of Open Access Journals (Sweden)

    Weirui Xue

    2015-05-01

    Full Text Available Purpose: The purpose of this article is to study the description and extension of the Industry Foundation Classes (IFC) standard in construction schedule management, which achieves information exchange and sharing among different information systems and stakeholders and facilitates collaborative construction in construction projects. Design/methodology/approach: Schedule information processing and coordination are difficult in complex construction projects. Building Information Modeling (BIM) provides the platform for exchanging and sharing information among information systems and stakeholders based on the IFC standard. By analyzing schedule planning, implementation, checking, and control, the information flow in schedule management is described based on IDEF. According to IFC4, the information model for schedule management is established, which includes not only each aspect of schedule management, but also cost management, resource management, quality management, and risk management. Findings: The information requirements for construction schedule management can be summarized into three aspects: schedule plan information, implementation information, and check and control information. The three aspects can be described through the existing and extended entities of IFC4, and the information models are established. Originality/value: The main contribution of the article is to establish the construction schedule management information model, which achieves information exchange and sharing in the construction project and facilitates the development of application software to meet the requirements of the construction project.

  6. Implementation of parallel processing in the basf2 framework for Belle II

    International Nuclear Information System (INIS)

    Itoh, Ryosuke; Lee, Soohyung; Katayama, N; Mineo, S; Moll, A; Kuhr, T; Heck, M

    2012-01-01

    Recent PC servers are equipped with multi-core CPUs, and it is desirable to utilize their full processing power for data analysis in large-scale HEP experiments. The software framework basf2 is being developed for use in the Belle II experiment, a new-generation B-factory experiment at KEK, and parallel event processing to utilize multi-core CPUs is part of its design for use in massive data production. The details of the implementation of event-parallel processing in the basf2 framework are discussed, with a report of a preliminary performance study in realistic use on a 32-core PC server.
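
    The idea of event-level parallelism can be sketched with generic Python multiprocessing: events are farmed out to worker processes and the per-event results are gathered back in the parent. This is only an analogy to the design described above, not the actual basf2 implementation; the event structure and the per-event computation are placeholders.

```python
# Generic event-parallel processing sketch: distribute events to a pool of
# worker processes and collect the per-event results. Not the basf2 code.

from multiprocessing import Pool

def process_event(event):
    """Stand-in for a reconstruction/analysis module chain."""
    event_id, hits = event
    return event_id, sum(hits) / len(hits)      # e.g. a mean "energy" per event

if __name__ == "__main__":
    events = [(i, [j * 0.1 for j in range(1, 50)]) for i in range(1000)]
    with Pool(processes=4) as pool:
        results = pool.map(process_event, events, chunksize=50)
    print("processed", len(results), "events; first:", results[0])
```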

  7. Clinch River Breeder Reactor Plant Project: construction schedule

    International Nuclear Information System (INIS)

    Purcell, W.J.; Martin, E.M.; Shivley, J.M.

    1982-01-01

    The construction schedule for the Clinch River Breeder Reactor Plant and its evolution are described. The initial schedule basis, changes necessitated by the evaluation of the overall plant design, and constructability improvements that have been effected to assure adherence to the schedule are presented. The schedule structure and hierarchy are discussed, as are tools used to define, develop, and evaluate the schedule

  8. Analysis on the nitrogen drilling accident of Well Qionglai 1 (II): Restoration of the accident process and lessons learned

    Directory of Open Access Journals (Sweden)

    Yingfeng Meng

    2015-12-01

    Full Text Available All the important events of the nitrogen drilling accident of Well Qionglai 1 were inferred and analyzed in Paper I. In this Paper II, based on the investigation information, the well log data, and some calculation and simulation results, and using the fault tree analysis method of safety engineering, every possible composition, possibility, and time schedule of the events of the accident of Well Qionglai 1 has been analyzed, the implications of the logging data have been revealed, and the process of the accident has been reconstructed. Some important understandings have been obtained: the objective cause of the accident is the rock burst and the events induced by it, while the subjective cause is that the blooie pipe could not bear the flow burden of the clasts from the rock burst and was blocked by them. The blockage of the blooie pipe caused high pressure at the wellhead, the high pressure made the blooie pipe burst, and natural gas escaped and caught fire. This paper also argues that rock burst in gas drilling in fractured tight sandstone gas zones is objective and unavoidable, but the accidents induced by rock burst can be avoided by improving the performance of the blooie pipe, wellhead assemblies, and drilling tool accessories with respect to downhole rock burst.

  9. Release procedure according to Paragraph 29 StrlSchV illustrated by the example of the nuclear research reactor TRIGA Heidelberg II; Durchfuehrung von Freigabeverfahren nach paragraph 29 am Beispiel des TRIGA Heidelberg II

    Energy Technology Data Exchange (ETDEWEB)

    Cremer, J. [Siempelkamp Nukleartechnik GmbH (SNT) (Germany)]; Sold, A. [Deutsches Krebsforschungszentrum Heidelberg (DKFZ) (Germany)]

    2005-07-01

    The aim of this lecture is to show the schedule of a release procedure according to Paragraph 29 StrlSchV, using the example of the decommissioning of the nuclear research reactor TRIGA Heidelberg II. It is illustrated by the effort undertaken by the radiation protection representative of this plant. Following this example, from planning through application, survey, and execution, the complex context of the release procedure becomes apparent. The newly applied measuring techniques, which require a certain amount of practice, and the responsibility of the radiation protection representative under radiation protection law play a relevant role. In small facilities such as the TRIGA Heidelberg II, radiation protection staff are employed according to the plant's size, and work is focussed on radiation protection for research and laboratories. The decommissioning process, with its wide range of radiation protection requirements, presents new challenges which have to be coordinated with the existing duties of the radiation protection representative. The supervision of and responsibility for the release procedure according to Paragraph 29 are the largest and most sensitive part of the decommissioning of the nuclear research reactor TRIGA Heidelberg II. (orig.)

  10. Preventive maintenance optimization for a multi-component system under changing job shop schedule

    International Nuclear Information System (INIS)

    Zhou Xiaojun; Lu Zhiqiang; Xi Lifeng

    2012-01-01

    Variability and small lot sizes are a common feature of many discrete manufacturing processes designed to meet a wide array of customer needs. Because of this, the job shop schedule often has to be continuously updated in reaction to changes in the production plan. Generally, the aim of preventive maintenance is to ensure production effectiveness, and therefore preventive maintenance models must be able to adapt to changes in the job shop schedule. In this paper, a dynamic opportunistic preventive maintenance model is developed for a multi-component system that considers changes in the job shop schedule. Whenever a job is completed, preventive maintenance opportunities arise for all the components in the system. An optimal maintenance practice is dynamically determined by maximizing the short-term cumulative opportunistic maintenance cost savings for the system. The numerical example shows that the scheme obtained by the proposed model can effectively address the preventive maintenance scheduling problem caused by changes in the job shop schedule and is more efficient than schemes based on two other commonly used preventive maintenance models.

  11. Discrete harmony search algorithm for scheduling and rescheduling the reprocessing problems in remanufacturing: a case study

    Science.gov (United States)

    Gao, Kaizhou; Wang, Ling; Luo, Jianping; Jiang, Hua; Sadollah, Ali; Pan, Quanke

    2018-06-01

    In this article, scheduling and rescheduling problems with increasing processing time and new job insertion are studied for reprocessing problems in the remanufacturing process. To handle the unpredictability of reprocessing time, an experience-based strategy is used. Rescheduling strategies are applied for considering the effect of increasing reprocessing time and the new subassembly insertion. To optimize the scheduling and rescheduling objective, a discrete harmony search (DHS) algorithm is proposed. To speed up the convergence rate, a local search method is designed. The DHS is applied to two real-life cases for minimizing the maximum completion time and the mean of earliness and tardiness (E/T). These two objectives are also considered together as a bi-objective problem. Computational optimization results and comparisons show that the proposed DHS is able to solve the scheduling and rescheduling problems effectively and productively. Using the proposed approach, satisfactory optimization results can be achieved for scheduling and rescheduling on a real-life shop floor.

  12. Heuristic Method for Decision-Making in Common Scheduling Problems

    Directory of Open Access Journals (Sweden)

    Edyta Kucharska

    2017-10-01

    Full Text Available The aim of the paper is to present a heuristic method for decision-making regarding an NP-hard scheduling problem with limitations related to tasks and to resources dependent on the current state of the process. The presented approach is based on the algebraic-logical meta-model (ALMM), which enables making collective decisions in successive process stages, not separately for individual objects or executors. Moreover, taking into account the limitations of the problem, it involves constructing only an acceptable solution and significantly reduces the amount of calculation. A general algorithm based on the presented method is composed of the following elements: preliminary analysis of the problem, techniques for choosing the decision at a given state, pruning of non-promising trajectories, a technique for selecting the initial state of the final part of the trajectory, and modification of the trajectory generation parameters. The paper includes applications of the presented approach to scheduling problems on unrelated parallel machines with a deadline and machine setup times dependent on the process state, where the relationships between tasks are defined by a graph. The article also presents the results of computational experiments.

  13. Project Robust Scheduling Based on the Scattered Buffer Technology

    Directory of Open Access Journals (Sweden)

    Nansheng Pang

    2018-04-01

    Full Text Available The research object in this paper is the sub-network formed by the predecessors that affect the solution activity. The paper studies three types of influencing factors from the predecessors that delay the starting time of the solution activity on the longest path, and analyzes the degree to which the different types of factors delay the solution activity's starting time. On this basis, through a comprehensive analysis of the various factors that influence the solution activity, the paper proposes a metric to evaluate the solution robustness of the project schedule, and this metric is taken as the optimization goal. The paper also adopts an iterative process to design a scattered-buffer heuristic algorithm for robust scheduling based on time buffers. The resource flow network is introduced in this algorithm, and a tabu search algorithm is used to solve the baseline schedule. For the generation of the resource flow network in the baseline schedule, a resource allocation algorithm that makes maximum use of the precedence relations is designed. Finally, the algorithm proposed in this paper and several other algorithms from the literature are compared in a simulation experiment; the experimental results show that the proposed algorithm is reasonable and feasible.

  14. Optimal deployment schedule of an active twist rotor for performance enhancement and vibration reduction in high-speed flights

    Directory of Open Access Journals (Sweden)

    Young H. YOU

    2017-08-01

    Full Text Available The best active twist schedules exploiting various waveform types are sought, taking advantage of a global search algorithm, for the reduction of hub vibration and/or power required of a rotor in high-speed conditions. The active twist schedules include two non-harmonic inputs formed from segmented step functions as well as the simple harmonic waveform input. An advanced Particle Swarm assisted Genetic Algorithm (PSGA) is employed as the optimizer. A rotorcraft Computational Structural Dynamics (CSD) code, CAMRAD II, is used to perform the rotor aeromechanics analysis. A Computational Fluid Dynamics (CFD) code is coupled with the CSD code for verification and for some physical insights. The PSGA optimization results are verified against a parameter sweep study performed using harmonic actuation. The optimum twist schedules according to the performance and/or vibration reduction strategy are obtained, and their optimization gains are compared between the actuation cases. A two-phase non-harmonic actuation schedule demonstrates the best outcome in decreasing the power required, while a four-phase non-harmonic schedule results in the best vibration reduction as well as simultaneous reductions in the power required and vibration. The mechanism behind the performance gains is identified by examining the section airloads, the angle-of-attack distribution, and the elastic twist deformation predicted by the present approaches.

  15. ETA-II experiments for determining advanced radiographic capabilities of induction linacs

    International Nuclear Information System (INIS)

    Weir, J.T.; Caporaso, G.J.; Clark, J.C.; Kirbie, H.C.; Chen, Y.J.; Lund, S.M.; Westenskow, G.A.; Paul, A.C.

    1997-05-01

    LLNL has proposed a multi-pulsed, multi-line of sight radiographic machine based on induction linac technology to be the core of the advanced hydrotest facility (AHF) being considered by the Department of Energy. In order to test the new technologies being developed for AHF we have recommissioned the Experimental Test Accelerator (ETA II). We will conduct our initial experiments using kickers and large angle bending optics at the ETA II facility. Our current status and our proposed experimental schedule will be presented

  16. Utilization Bound of Non-preemptive Fixed Priority Schedulers

    Science.gov (United States)

    Park, Moonju; Chae, Jinseok

    It is known that the schedulability of a non-preemptive task set with fixed priority can be determined in pseudo-polynomial time. However, since Rate Monotonic scheduling is not optimal for non-preemptive scheduling, the applicability of existing polynomial time tests that provide sufficient schedulability conditions, such as Liu and Layland's bound, is limited. This letter proposes a new sufficient condition for non-preemptive fixed priority scheduling that can be used for any fixed priority assignment scheme. It is also shown that the proposed schedulability test has a tighter utilization bound than existing test methods.
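
    For orientation, the classic Liu and Layland bound mentioned above states that a preemptive rate-monotonic task set is schedulable if its total utilization does not exceed n(2^(1/n) − 1). The snippet below computes that reference bound for an invented task set; the tighter non-preemptive bound proposed in the letter is not reproduced here.

```python
# Reference point mentioned above: Liu and Layland's utilization bound for
# preemptive rate-monotonic scheduling, U <= n*(2^(1/n) - 1). The letter's own
# non-preemptive bound is not reproduced here; the task set below is illustrative.

def utilization(tasks):
    """tasks: list of (worst-case execution time C, period T)."""
    return sum(c / t for c, t in tasks)

def liu_layland_bound(n):
    return n * (2 ** (1.0 / n) - 1)

if __name__ == "__main__":
    tasks = [(1, 4), (1, 5), (2, 10)]
    u, bound = utilization(tasks), liu_layland_bound(len(tasks))
    print(f"U = {u:.3f}, RM bound = {bound:.3f}, "
          f"{'passes' if u <= bound else 'needs an exact test'}")
```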

  17. The New Alvin and the Scheduling/Planning Processes for the National Deep Submergence Facility

    Science.gov (United States)

    Alberts, J.; Walden, B.

    2003-12-01

    Research. Operation of the NDSF remotely operated vehicle (ROV) assets can be arranged in a fly-away mode on appropriate vessels within the UNOLS fleet or on commercial vessels or foreign research vessels provided they are suitably equipped. Scheduling of the R/V ATLANTIS is arranged through UNOLS, as is the use of the ROVs on UNOLS ships. Coordination between funding agencies and the UNOLS scheduling process strives to provide the users with the optimal scheduling of the assets in a given year. Requests for at-sea use of these assets remain strong for the foreseeable future.

  18. Athabasca--special report No. 1, we are progressing on schedule

    Energy Technology Data Exchange (ETDEWEB)

    Moss, A E

    1966-09-01

    Since April of 1964 when the government of Alberta gave permission for the production of crude oil from the Athabasca Tar Sands, construction of a $240 million oil sands venture has been progressing according to schedule. This plant will be capable of processing 45,000 bbl of crude oil per day and will require the mining of approximately 135,000 tons of material per day. The scheduled completion date for this project is Sept. 1967. This huge project consists of the construction of various plant units and service facilities, preparation of the mine for production, construction of a highway and a multi-million dollar bridge, and the design and construction of a 16-in. pipeline 266 miles in length. A shortage of experienced engineers, supervisors, and skilled tradesmen has been the largest problem. In spite of the many problems encountered, approximately 50% of the construction has now been completed and the plant will be completed on schedule.

  19. Dynamic scheduling and analysis of real time systems with multiprocessors

    Directory of Open Access Journals (Sweden)

    M.D. Nashid Anjum

    2016-08-01

    Full Text Available This research work considers a cloud computing job-shop scheduling scenario. We consider m real-time jobs with various lengths and n machines with different computational speeds and costs. Each job has a deadline to be met, and the profit of processing a packet of a job differs from that of other jobs. Moreover, the considered deadlines are either hard or soft, and a penalty is applied if a deadline is missed, where the penalty is modeled as an exponential function of time. The scheduling problem has been formulated as a mixed integer non-linear programming problem whose objective is to maximize net profit. The formulated problem is computationally hard and not solvable in deterministic polynomial time. This research work proposes an algorithm named the Tube-tap algorithm as a solution to this scheduling optimization problem. Extensive simulation shows that the proposed algorithm outperforms existing solutions in terms of maximizing net profit and preserving deadlines.

  20. Artificial neural network (ANN) approach for modeling Zn(II) adsorption in batch process

    Energy Technology Data Exchange (ETDEWEB)

    Yildiz, Sayiter [Engineering Faculty, Cumhuriyet University, Sivas (Turkey)]

    2017-09-15

    Artificial neural networks (ANN) were applied to predict adsorption efficiency of peanut shells for the removal of Zn(II) ions from aqueous solutions. Effects of initial pH, Zn(II) concentrations, temperature, contact duration and adsorbent dosage were determined in batch experiments. The sorption capacities of the sorbents were predicted with the aid of equilibrium and kinetic models. The Zn(II) ions adsorption onto peanut shell was better defined by the pseudo-second-order kinetic model, for both initial pH, and temperature. The highest R² value in isotherm studies was obtained from Freundlich isotherm for the inlet concentration and from Temkin isotherm for the sorbent amount. The high R² values prove that modeling the adsorption process with ANN is a satisfactory approach. The experimental results and the predicted results by the model with the ANN were found to be highly compatible with each other.
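
    The modelling step described above can be sketched with a small multilayer perceptron that maps the batch-test factors (pH, initial concentration, temperature, contact time, adsorbent dose) to removal efficiency. The example below uses synthetic data and scikit-learn's MLPRegressor purely for illustration; the study's network architecture and measured data are not reproduced.

```python
# Minimal ANN sketch: fit (pH, concentration, temperature, time, dose) -> removal
# efficiency on synthetic data. Illustrative only; not the study's model or data.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform([2, 100, 20, 5, 1], [8, 1000, 45, 150, 7], size=(200, 5))
# toy response roughly increasing with pH, contact time and dose
y = 40 + 4 * X[:, 0] + 0.05 * X[:, 3] + 3 * X[:, 4] + rng.normal(0, 2, 200)

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), y)

sample = np.array([[5.0, 400.0, 25.0, 60.0, 4.0]])
print("predicted removal efficiency (%):", model.predict(scaler.transform(sample))[0])
```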

  1. Artificial neural network (ANN) approach for modeling Zn(II) adsorption in batch process

    International Nuclear Information System (INIS)

    Yildiz, Sayiter

    2017-01-01

    Artificial neural networks (ANN) were applied to predict adsorption efficiency of peanut shells for the removal of Zn(II) ions from aqueous solutions. Effects of initial pH, Zn(II) concentrations, temperature, contact duration and adsorbent dosage were determined in batch experiments. The sorption capacities of the sorbents were predicted with the aid of equilibrium and kinetic models. The Zn(II) ions adsorption onto peanut shell was better defined by the pseudo-second-order kinetic model, for both initial pH, and temperature. The highest R² value in isotherm studies was obtained from Freundlich isotherm for the inlet concentration and from Temkin isotherm for the sorbent amount. The high R² values prove that modeling the adsorption process with ANN is a satisfactory approach. The experimental results and the predicted results by the model with the ANN were found to be highly compatible with each other.

  2. Brucella abortus Inhibits Major Histocompatibility Complex Class II Expression and Antigen Processing through Interleukin-6 Secretion via Toll-Like Receptor 2

    Science.gov (United States)

    Barrionuevo, Paula; Cassataro, Juliana; Delpino, M. Victoria; Zwerdling, Astrid; Pasquevich, Karina A.; Samartino, Clara García; Wallach, Jorge C.; Fossati, Carlos A.; Giambartolomei, Guillermo H.

    2008-01-01

    The strategies that allow Brucella abortus to survive inside macrophages for prolonged periods and to avoid the immunological surveillance of major histocompatibility complex class II (MHC-II)-restricted gamma interferon (IFN-γ)-producing CD4+ T lymphocytes are poorly understood. We report here that infection of THP-1 cells with B. abortus inhibited expression of MHC-II molecules and antigen (Ag) processing. Heat-killed B. abortus (HKBA) also induced both these phenomena, indicating the independence of bacterial viability and involvement of a structural component of the bacterium. Accordingly, outer membrane protein 19 (Omp19), a prototypical B. abortus lipoprotein, inhibited both MHC-II expression and Ag processing to the same extent as HKBA. Moreover, a synthetic lipohexapeptide that mimics the structure of the protein lipid moiety also inhibited MHC-II expression, indicating that any Brucella lipoprotein could down-modulate MHC-II expression and Ag processing. Inhibition of MHC-II expression and Ag processing by either HKBA or lipidated Omp19 (L-Omp19) depended on Toll-like receptor 2 and was mediated by interleukin-6. HKBA or L-Omp19 also inhibited MHC-II expression and Ag processing of human monocytes. In addition, exposure to the synthetic lipohexapeptide inhibited Ag-specific T-cell proliferation and IFN-γ production of peripheral blood mononuclear cells from Brucella-infected patients. Together, these results indicate that there is a mechanism by which B. abortus may prevent recognition by T cells to evade host immunity and establish a chronic infection. PMID:17984211

  3. Integrated Job Scheduling and Network Routing

    DEFF Research Database (Denmark)

    Gamst, Mette; Pisinger, David

    2013-01-01

    We consider an integrated job scheduling and network routing problem which appears in Grid Computing and production planning. The problem is to schedule a number of jobs at a finite set of machines, such that the overall profit of the executed jobs is maximized. Each job demands a number of resources which must be sent to the executing machine through a network with limited capacity. A job cannot start before all of its resources have arrived at the machine. The scheduling problem is formulated as a Mixed Integer Program (MIP) and proved to be NP-hard. An exact solution approach using Dantzig... ...indicate that the algorithm can be used as an actual scheduling algorithm in the Grid or as a tool for analyzing Grid performance when adding extra machines or jobs. © 2012 Wiley Periodicals, Inc.

  4. Scheduling and Mapping in an Incremental Design Methodology for Distributed Real-Time Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2004-01-01

    In this paper we present an approach to mapping and scheduling of distributed embedded systems for hard real-time applications, aiming at a minimization of the system modification cost. We consider an incremental design process that starts from an already existing system running a set of applications... be added to the resulted system. Thus, we propose a heuristic which finds the set of already running applications which have to be remapped and rescheduled at the same time with mapping and scheduling the new application, such that the disturbance on the running system (expressed as the total cost implied by the modifications) is minimized. Once this set of applications has been determined, we outline a mapping and scheduling algorithm aimed at fulfilling the requirements stated above. The approaches have been evaluated based on extensive experiments using a large number of generated benchmarks as well as a real...

  5. A Data Scheduling and Management Infrastructure for the TEAM Network

    Science.gov (United States)

    Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.; Unwin, R.

    2009-04-01

    currently partnering with the San Diego Super Computer Center to build the data management infrastructure. Data collected from the three core protocols as well as others are currently made available through the TEAM Network portal, which provides the content management framework, the data scheduling and management framework, an administrative framework to implement and manage TEAM sites, collaborative tools, and a number of tools and applications utilizing Google Maps and Google Earth products. A critical element of the TEAM Network data management infrastructure is to make the data publicly available in as close to real time as possible (the TEAM Network Data Use Policy: http://www.teamnetwork.org/en/data/policy). This requires two essential tasks to be accomplished: 1) a data collection schedule has to be planned, proposed and approved for a given TEAM site. This is a challenging process, since TEAM sites are geographically distributed across the tropics and hence have different seasons in which they schedule field sampling for the different TEAM protocols. Capturing this information and ensuring that TEAM sites follow the outlined legal contract is key to the data collection process; and 2) a streamlined and efficient information management system is needed to ensure data collected from the field meet the minimum data standards (i.e., are of the highest scientific quality) and are securely transferred, archived, processed and rapidly made publicly available, as a finished consumable product, via the TEAM Network portal. The TEAM Network is achieving these goals by implementing an end-to-end framework consisting of the Sampling Scheduler application and the Data Management Framework. Sampling Scheduler: The Sampling Scheduler is a project management, calendar-based portal application that will allow scientists at a TEAM site to schedule field sampling for each of the TEAM protocols implemented at that site. The sampling scheduler addresses the specific requirements established in the

  6. Estimating exponential scheduling preferences

    DEFF Research Database (Denmark)

    Hjorth, Katrine; Börjesson, Maria; Engelson, Leonid

    2015-01-01

    Different assumptions about travelers' scheduling preferences yield different measures of the cost of travel time variability. Only few forms of scheduling preferences provide non-trivial measures which are additive over links in transport networks where link travel times are arbitrarily... of car drivers' route and mode choice under uncertain travel times. Our analysis exposes some important methodological issues related to complex non-linear scheduling models: One issue is identifying the point in time where the marginal utility of being at the destination becomes larger than the marginal utility of being at the origin. Another issue is that models with the exponential marginal utility formulation suffer from empirical identification problems. Though our results are not decisive, they partly support the constant-affine specification, in which the value of travel time variability

  7. Timing analysis for embedded systems using non-preemptive EDF scheduling under bounded error arrivals

    Directory of Open Access Journals (Sweden)

    Michael Short

    2017-07-01

    Full Text Available Embedded systems consist of one or more processing units which are completely encapsulated by the devices under their control, and they often have stringent timing constraints associated with their functional specification. Previous research has considered the performance of different types of task scheduling algorithms and developed associated timing analysis techniques for such systems. Although preemptive scheduling techniques have traditionally been favored, rapid increases in processor speeds combined with improved insights into the behavior of non-preemptive scheduling techniques have seen increased interest in their use for real-time applications such as multimedia, automation and control. However, when non-preemptive scheduling techniques are employed, there is a potential lack of error confinement should any timing errors occur in individual software tasks. In this paper, the focus is upon adding fault tolerance in systems using non-preemptive deadline-driven scheduling. Schedulability conditions are derived for fault-tolerant periodic and sporadic task sets experiencing bounded error arrivals under non-preemptive deadline scheduling. A timing analysis algorithm is presented based upon these conditions and its run-time properties are studied. Computational experiments show it to be highly efficient in terms of run-time complexity and competitive ratio when compared to previous approaches.
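
    To complement the analytical conditions described above, the sketch below simply simulates non-preemptive EDF for a synchronous periodic task set over one hyperperiod and counts deadline misses. The task parameters are invented, deadlines are taken equal to periods, and no error arrivals are modelled, so this is a sanity-check sketch rather than the paper's schedulability test.

```python
# Sketch: simulate non-preemptive EDF for a synchronous periodic task set over
# one hyperperiod and count deadline misses. Deadlines equal periods and no
# error arrivals are modelled; the task set below is illustrative only.

from math import gcd
from functools import reduce

def simulate_np_edf(tasks):
    """tasks: list of (C, T) with integer times and implicit deadlines D = T."""
    hyper = reduce(lambda a, b: a * b // gcd(a, b), (t for _, t in tasks))
    ready, time, misses, next_release = [], 0, 0, 0
    while time < hyper or ready:
        # release every job whose release time has passed (within the hyperperiod)
        while next_release <= time and next_release < hyper:
            for i, (c, t) in enumerate(tasks):
                if next_release % t == 0:
                    ready.append((next_release + t, c, i))   # (abs. deadline, C, id)
            next_release += 1
        if not ready:
            time = next_release          # processor idles until the next release
            continue
        ready.sort()                     # EDF: earliest absolute deadline first
        deadline, c, _ = ready.pop(0)
        time += c                        # job runs to completion (no preemption)
        if time > deadline:
            misses += 1
    return misses

if __name__ == "__main__":
    task_set = [(1, 4), (2, 6), (3, 12)]     # (execution time, period)
    print("deadline misses over one hyperperiod:", simulate_np_edf(task_set))
```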

  8. Group Elevator Peak Scheduling Based on Robust Optimization Model

    Directory of Open Access Journals (Sweden)

    ZHANG, J.

    2013-08-01

    Full Text Available Scheduling of an Elevator Group Control System (EGCS) is a typical combinatorial optimization problem. Uncertain group scheduling under peak traffic flows has recently become a research focus and a difficult problem. Robust Optimization (RO) is a novel and effective way to deal with uncertain scheduling problems. In this paper, a peak scheduling method based on an RO model for a multi-elevator system is proposed. The method is immune to the uncertainty of peak traffic flows; optimal scheduling is realized without knowing the exact number of waiting passengers at each calling floor. Specifically, an energy-saving-oriented multi-objective scheduling price is proposed, and an RO-based uncertain peak scheduling model is built to minimize this price. Because the uncertain RO model cannot be solved directly, it is transformed into a deterministic RO model by means of elevator scheduling robust counterparts. Because the solution space of elevator scheduling is enormous, an ant colony algorithm for elevator scheduling is proposed to solve the deterministic model in a short time. Based on this algorithm, optimal scheduling solutions are found quickly, and the group elevators are scheduled according to these solutions. Simulation results show that the method can effectively improve scheduling performance in the peak pattern, and efficient operation of the group elevators is realized by the RO scheduling method.

  9. NRC comprehensive records disposition schedule. Revision 3

    International Nuclear Information System (INIS)

    1998-02-01

    Title 44 US Code, "Public Printing and Documents," regulations issued by the General Services Administration (GSA) in 41 CFR Chapter 101, Subchapter B, "Management and Use of Information and Records," and regulations issued by the National Archives and Records Administration (NARA) in 36 CFR Chapter 12, Subchapter B, "Records Management," require each agency to prepare and issue a comprehensive records disposition schedule that contains the NARA-approved records disposition schedules for records unique to the agency and contains NARA's General Records Schedules for records common to several or all agencies. The approved records disposition schedules specify the appropriate duration of retention and the final disposition for records created or maintained by the NRC. NUREG-0910, Rev. 3, contains "NRC's Comprehensive Records Disposition Schedule," and the original authorized approved citation numbers issued by NARA. Rev. 3 incorporates NARA-approved changes and additions to the NRC schedules that have been implemented since the last revision dated March 1992, reflects recent organizational changes implemented at the NRC, and includes the latest version of NARA's General Records Schedule (dated August 1995).

  10. NRC comprehensive records disposition schedule. Revision 3

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-01

    Title 44 US Code, "Public Printing and Documents," regulations issued by the General Services Administration (GSA) in 41 CFR Chapter 101, Subchapter B, "Management and Use of Information and Records," and regulations issued by the National Archives and Records Administration (NARA) in 36 CFR Chapter 12, Subchapter B, "Records Management," require each agency to prepare and issue a comprehensive records disposition schedule that contains the NARA-approved records disposition schedules for records unique to the agency and contains NARA's General Records Schedules for records common to several or all agencies. The approved records disposition schedules specify the appropriate duration of retention and the final disposition for records created or maintained by the NRC. NUREG-0910, Rev. 3, contains "NRC's Comprehensive Records Disposition Schedule," and the original authorized approved citation numbers issued by NARA. Rev. 3 incorporates NARA-approved changes and additions to the NRC schedules that have been implemented since the last revision dated March 1992, reflects recent organizational changes implemented at the NRC, and includes the latest version of NARA's General Records Schedule (dated August 1995).

  11. Energy-efficient approach to minimizing the energy consumption in an extended job-shop scheduling problem

    Science.gov (United States)

    Tang, Dunbing; Dai, Min

    2015-09-01

    The traditional production planning and scheduling problems consider performance indicators like time, cost and quality as optimization objectives in manufacturing processes. However, environmentally friendly factors like the energy consumption of production have not been fully taken into consideration. Against this background, this paper presents an approach to modify a given schedule generated by a production planning and scheduling system in a job shop floor where machine tools can work at different cutting speeds. It adjusts the cutting speeds of the operations while keeping the original assignment and processing sequence of the operations of each job fixed, in order to obtain energy savings. First, the proposed approach, based on a mixed integer programming mathematical model, changes the total idle time of the given schedule to minimize energy consumption in the job shop floor while accepting the optimal solution of the scheduling objective, the makespan. Then, a genetic-simulated annealing algorithm is used to explore the optimal solution, since the problem is strongly NP-hard. Finally, the effectiveness of the approach is evaluated on small- and large-size instances, respectively. The experimental results show that the approach can save 5%-10% of the average energy consumption while accepting the optimal makespan in small-size instances; in addition, the average maximum energy saving ratio can reach 13%. It can save approximately 1%-4% of the average energy consumption and approximately 2.4% of the average maximum energy while accepting a near-optimal makespan in large-size instances. The proposed research provides an interesting point from which to explore energy-aware schedule optimization for a traditional production planning and scheduling problem.
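
    The core idea above, slowing operations into existing idle time so that energy drops while the makespan is preserved, can be illustrated with a tiny greedy rule: for each operation, choose the slowest (lowest-energy) speed level whose extra processing time still fits in the slack that follows it. The speed levels, energy values, and slack values below are assumptions; the paper itself uses a mixed integer programming model solved by a genetic-simulated annealing algorithm.

```python
# Sketch of the idea described above: for each operation, pick the slowest
# (least energy-hungry) speed level whose extra processing time still fits in
# the idle time that follows it, keeping the makespan unchanged.

def pick_speed(levels, slack):
    """levels: list of (processing_time, energy), fastest first.
    Returns the level minimising energy while respecting the available slack."""
    base_time = levels[0][0]
    feasible = [lv for lv in levels if lv[0] - base_time <= slack]
    return min(feasible, key=lambda lv: lv[1])

if __name__ == "__main__":
    # (time, energy) of one operation at three spindle-speed settings, fastest first
    levels = [(10.0, 50.0), (12.0, 41.0), (15.0, 36.0)]
    for slack in (0.0, 2.0, 6.0):
        t, e = pick_speed(levels, slack)
        print(f"slack {slack:4.1f} -> run with time {t}, energy {e}")
```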

  12. Cure Schedule for Stycast 2651/Catalyst 9.

    Energy Technology Data Exchange (ETDEWEB)

    Kropka, Jamie Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McCoy, John D. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    2017-11-01

    The Emerson & Cuming technical data sheet (TDS) for Stycast 2651/Catalyst 9 lists three alternate cure schedules for the material, each of which results in a different state of reaction and different material properties. Here, a cure schedule that attains full reaction of the material is defined. Using this cure schedule will eliminate variance in material properties due to changes in the cure state of the material, and the schedule will serve as the standard method for preparing material prior to property characterization. The following recommendation uses one of the schedules within the TDS and adds a “post cure” to obtain full reaction.

  13. Job shop scheduling problem with late work criterion

    Science.gov (United States)

    Piroozfard, Hamed; Wong, Kuan Yew

    2015-05-01

    Scheduling is considered a key task in many industries, spanning project-based scheduling, crew scheduling, flight scheduling, machine scheduling and more. In the machine scheduling area, job shop scheduling problems are considered important and highly complex; they are characterized as NP-hard. This paper addresses job shop scheduling problems with the late work criterion and non-preemptive jobs. The late work criterion is a fairly new objective function: it is a qualitative measure concerned with the late parts of jobs, unlike classical objective functions, which are quantitative measures. In this work, a simulated annealing algorithm was presented to solve the scheduling problem. In addition, an operation-based representation was used to encode solutions, and a neighbourhood search structure was employed to search for new solutions. The case studies are Lawrence instances taken from the Operations Research Library. Computational results of this probabilistic meta-heuristic algorithm were compared with those of a conventional genetic algorithm, and conclusions were drawn about the algorithm and the problem.
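
    For readers unfamiliar with the objective, the late work of a schedule can be computed directly from the operation intervals and job due dates: only the portion of processing performed after a job's due date counts, capped at the operation length. The sketch below uses made-up data, not the Lawrence instances, to show one way this might be evaluated.

```python
# Late work of a schedule: sum, over all operations, of the processing time
# that falls after the owning job's due date.

def late_work(op_intervals, due_dates):
    """op_intervals: {job: [(start, end), ...]}, due_dates: {job: due_date}."""
    total = 0.0
    for job, intervals in op_intervals.items():
        due = due_dates[job]
        for start, end in intervals:
            total += max(0.0, end - max(start, due))  # part executed past the due date
    return total

# Illustrative two-job schedule (assumed data)
schedule = {"J1": [(0, 3), (5, 9)], "J2": [(3, 5), (9, 12)]}
dues = {"J1": 8, "J2": 10}
print(late_work(schedule, dues))  # J1 contributes 1 (8..9), J2 contributes 2 (10..12) -> 3.0
```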

  14. Acquisition: Implementation of the DoD Management Control Program for Navy Acquisition Category II and III Programs

    National Research Council Canada - National Science Library

    2004-01-01

    ... deviations in cost, schedule, and performance requirements in acquisition program baselines for Acquisition Category II and III programs and in identifying whether program managers are reporting...

  15. Efficiency of Chitosan for the Removal of Pb(II), Fe(II) and Cu(II) Ions from Aqueous Solutions

    Directory of Open Access Journals (Sweden)

    Soheil Sobhanardakani

    2014-09-01

    Full Text Available Background: Heavy metals have been recognized as harmful environmental pollutants known to produce highly toxic effects on different organs and systems of both humans and animals. The aim of this paper is to evaluate the adsorption potential of chitosan for the removal of Pb(II), Fe(II) and Cu(II) ions from aqueous solutions. Methods: This study was conducted at laboratory scale. Chitosan was used as an adsorbent for the removal of Pb(II), Fe(II) and Cu(II) from aqueous solution. In batch tests, the effects of parameters such as solution pH (1.0-8.0), initial metal concentration (100-1000 mg L-1), contact time (5.0-150 min) and adsorbent dose (1.0-7.0 g) on the adsorption process were studied. Results: The results showed that the adsorption of Pb(II), Fe(II) and Cu(II) ions on chitosan strongly depends on pH. The experimental isothermal data were analyzed using the Langmuir and Freundlich equations; the removal process followed the Langmuir isotherm, and the maximum adsorption capacities of chitosan for Pb(II), Fe(II) and Cu(II) ions were 55.5 mg g−1, 71.4 mg g−1 and 59 mg g−1, respectively, under equilibrium conditions at 25±1 ºC. The adsorption process was well described by the pseudo-second-order rate model. Conclusion: The obtained results showed that chitosan is a readily available, economical adsorbent and was found suitable for removing Pb(II), Fe(II) and Cu(II) ions from aqueous solution.
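
    As a hedged illustration of how isotherm parameters such as the reported maximum adsorption capacities are typically obtained, the sketch below fits the linearized Langmuir form Ce/qe = Ce/q_max + 1/(K_L·q_max) by least squares; the data points are invented for demonstration and are not the study's measurements.

```python
# Fit the linearized Langmuir isotherm: plotting Ce/qe against Ce gives a line
# with slope 1/q_max and intercept 1/(K_L * q_max).
import numpy as np

Ce = np.array([20.0, 60.0, 150.0, 320.0, 600.0])   # equilibrium concentration (mg/L), assumed
qe = np.array([14.0, 30.0, 42.0, 50.0, 53.0])      # adsorbed amount (mg/g), assumed

slope, intercept = np.polyfit(Ce, Ce / qe, 1)      # least-squares linear fit
q_max = 1.0 / slope                                # maximum adsorption capacity (mg/g)
K_L = slope / intercept                            # Langmuir constant (L/mg)

print(f"q_max = {q_max:.1f} mg/g, K_L = {K_L:.4f} L/mg")
```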

  16. Strategies for accelerating the SLARette process

    International Nuclear Information System (INIS)

    Grewal, P.

    1997-01-01

    The SLARette (Spacer Location and Repositioning) process is continuing on several CANDU reactors where loose-fitting garter springs (spacers) were used, in order to prevent contact between the calandria tube and the pressure tube for the target life. With time, the sag in the fuel channel increases, and consequently so does the potential for contact between the pressure tube and the calandria tube. In addition, because of increasing sag in the pressure tubes and the increasing effect of fuel channel constrictions on the eddy current detection system, the Spacer Location and Repositioning activities are becoming more time consuming and difficult. For CANDU owners, station outage time is the most expensive item during SLARette campaigns. It is therefore beneficial to complete the SLARette process as early and as quickly as possible. New SLARette strategies can substantially accelerate the overall SLARette process and thus minimize outage time. Several strategies are available for performing the SLARette process: using the SLARette Mark II Delivery System; using the SLARette Advanced Delivery System; implementing creative fuel handling techniques; operating from both sides of the reactor using Mark II Delivery Systems; and operating from both sides using Advanced Delivery Systems. Each strategy offers different benefits, rates of fuel channel processing (SLARette activity), and schedule constraints. This paper provides the details of each strategy and compares them in terms of outage time, man-rem consumption, and constraints. (author)

  17. Understanding Applications of Project Planning and Scheduling in Construction Projects

    OpenAIRE

    AlNasseri, Hammad Abdullah

    2015-01-01

    Construction project life-cycle processes must be managed in a more effective and predictable way to meet project stakeholders’ needs. However, there is increasing concern about whether practical know-how effectively improves the understanding of the underlying theories of project management processes among construction organizations and their project managers. Project planning and scheduling are considered key and challenging tools in controlling and monitoring project performance, but many worldwide constru...

  18. MEDICAL STAFF SCHEDULING USING SIMULATED ANNEALING

    Directory of Open Access Journals (Sweden)

    Ladislav Rosocha

    2015-07-01

    Full Text Available Purpose: The efficiency of medical staff is a fundamental feature of healthcare facility quality. Better implementation of staff preferences in the scheduling problem may therefore not only improve the work-life balance of doctors and nurses but also result in better patient care. This paper focuses on the optimization of medical staff preferences in the scheduling problem. Methodology/Approach: We propose a medical staff scheduling algorithm based on simulated annealing, a well-known method from statistical thermodynamics. We define hard constraints, which are linked to legal and working regulations, and minimize the violations of soft constraints, which are related to the quality of work, psychological well-being, and work-life balance of staff. Findings: On a sample of 60 physicians and nurses from a gynecology department, we generated monthly schedules and optimized their preferences in terms of soft constraints. Our results indicate that the final value of the objective function optimized by the proposed algorithm has more than 18 times fewer soft-constraint violations than the initially generated random schedule that satisfied the hard constraints. Research Limitation/Implication: Even though the global optimality of the final outcome is not guaranteed, a desirable solution was obtained in reasonable time. Originality/Value of paper: We show that the designed algorithm is able to successfully generate schedules with regard to hard and soft constraints. Moreover, the presented method is significantly faster than standard schedule generation and is able to reschedule effectively thanks to the local neighbourhood search characteristics of simulated annealing.
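
    A minimal sketch of this kind of simulated-annealing roster optimization is shown below: the move generator preserves a simple hard constraint (a fixed number of staff on duty per day) while the objective counts soft-constraint violations (here only unmet day-off requests). The staff list, requests and penalty are illustrative assumptions, not the authors' constraint model.

```python
# Simulated annealing over rosters: hard feasibility is kept by construction,
# soft-constraint violations are minimized by probabilistic acceptance of moves.
import math
import random

random.seed(1)

staff = ["A", "B", "C", "D"]
days = list(range(14))
requests_off = {"A": {5, 6}, "B": {0}, "C": {12, 13}, "D": {3}}  # assumed preferences

# initial roster: exactly 2 of 4 staff work each day (the hard constraint)
roster = {d: random.sample(staff, 2) for d in days}

def soft_penalty(r):
    # one violation per shift assigned on a requested day off
    return sum(1 for d in days for s in r[d] if d in requests_off[s])

def neighbour(r):
    # swap one working person with one resting person on a random day,
    # so the two-per-day hard constraint still holds
    new = {d: list(v) for d, v in r.items()}
    d = random.choice(days)
    on = random.choice(new[d])
    off = random.choice([s for s in staff if s not in new[d]])
    new[d][new[d].index(on)] = off
    return new

temperature, cooling = 5.0, 0.95
current, cur_cost = roster, soft_penalty(roster)
for _ in range(2000):
    candidate = neighbour(current)
    cost = soft_penalty(candidate)
    # accept improvements always, worse moves with Boltzmann probability
    if cost < cur_cost or random.random() < math.exp((cur_cost - cost) / temperature):
        current, cur_cost = candidate, cost
    temperature = max(temperature * cooling, 0.01)

print("soft-constraint violations:", cur_cost)
```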

  19. Practical principles in appointment scheduling

    NARCIS (Netherlands)

    Kuiper, A.; Mandjes, M.

    2015-01-01

    Appointment schedules aim at achieving a proper balance between the conflicting interests of the service provider and her clients: a primary objective of the service provider is to fully utilize her available time, whereas clients want to avoid excessive waiting times. Setting up schedules that
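
    The trade-off described here can be made concrete with a small simulation sketch (an assumption-laden illustration, not the authors' model): with punctual arrivals at a fixed slot length and exponentially distributed service times, longer slots reduce client waiting at the cost of more provider idle time.

```python
# Simulate a single-server appointment schedule and report the waiting/idle trade-off.
import random

random.seed(0)

def simulate(slot, n_clients=15, n_runs=2000, mean_service=10.0):
    total_wait = total_idle = 0.0
    for _ in range(n_runs):
        free_at = 0.0
        for i in range(n_clients):
            arrival = i * slot                       # punctual arrivals every `slot` minutes
            total_wait += max(0.0, free_at - arrival)  # client waits if server is busy
            total_idle += max(0.0, arrival - free_at)  # server idles if client not yet due
            start = max(arrival, free_at)
            free_at = start + random.expovariate(1.0 / mean_service)
    return total_wait / (n_runs * n_clients), total_idle / n_runs

for slot in (8, 10, 12, 14):
    wait, idle = simulate(slot)
    print(f"slot={slot:>2} min  mean wait={wait:5.1f} min  total idle={idle:5.1f} min")
```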

  20. A model for generating master surgical schedules to allow cyclic scheduling in operating room departments

    NARCIS (Netherlands)

    van Oostrum, J.M.; van Houdenhoven, M.; Hurink, Johann L.; Hans, Elias W.; Wullink, Gerhard; Kazemier, G.

    2005-01-01

    This paper addresses the problem of operating room scheduling at the tactical level of hospital planning and control. Hospitals repetitively construct operating room schedules, which is a time consuming tedious and complex task. The stochasticity of the durations of surgical procedures complicates