WorldWideScience

Sample records for scheduling process ii

  1. 21 CFR 1308.12 - Schedule II.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Schedule II. 1308.12 Section 1308.12 Food and... 9733 (26) Remifentanil 9739 (27) Sufentanil 9740 (28) Tapentadol 9780 (d) Stimulants. Unless... which contains any quantity of the following substances having a stimulant effect on the central nervous...

  2. Intelligence amplification framework for enhancing scheduling processes

    NARCIS (Netherlands)

    Dobrkovic, Andrej; Liu, Luyao; Iacob, Maria Eugenia; van Hillegersberg, Jos

    2016-01-01

The scheduling process in a typical business environment consists of predominantly repetitive tasks that have to be completed in limited time and that often contain some form of uncertainty. Intelligence amplification is a symbiotic relationship between a human and an intelligent agent. This...

  3. AsmL Specification of a Ptolemy II Scheduler

    DEFF Research Database (Denmark)

    Lázaro Cuadrado, Daniel; Koch, Peter; Ravn, Anders Peter

    2003-01-01

Ptolemy II is a tool that combines different computational models for simulation and design of embedded systems. AsmL is a software specification language based on the Abstract State Machine formalism. This paper reports on the development of an AsmL model of the Synchronous Dataflow domain scheduler...

  4. Pharmacists correcting schedule II prescriptions: DEA flip-flops continue.

    Science.gov (United States)

    Abood, Richard R

    2010-12-01

    The Drug Enforcement Administration (DEA) has in recent years engaged in flip-flopping over important policy decisions. The most recent example involved whether a pharmacist can correct a written schedule II prescription upon verification with the prescriber. For several years the DEA's policy permitted this practice. Then the DEA issued a conflicting policy statement in 2007 in the preamble to the multiple schedule II prescription regulation, causing a series of subsequent contradictory statements ending with the policy that pharmacists should follow state law or policy until the Agency issues a regulation. It is doubtful that the DEA's opinion in the preamble would in itself constitute legal authority, or that the Agency would try to enforce the opinion. Nonetheless, these flip-flop opinions have confused pharmacists, caused some pharmacies to have claims rejected by third party payors, and most likely have inconvenienced patients.

  5. 21 CFR 113.83 - Establishing scheduled processes.

    Science.gov (United States)

    2010-04-01

    ... commercial production runs should be determined on the basis of recognized scientific methods to be of a size... CONTAINERS Production and Process Controls § 113.83 Establishing scheduled processes. Scheduled processes for... production shall be adequately provided for in establishing the scheduled process. Critical factors, e.g...

  6. Prescriptions for schedule II opioids and benzodiazepines increase after the introduction of computer-generated prescriptions.

    Science.gov (United States)

    McGerald, Genevieve; Dvorkin, Ronald; Levy, David; Lovell-Rose, Stephanie; Sharma, Adhi

    2009-06-01

    Prescriptions for controlled substances decrease when regulatory barriers are put in place. The converse has not been studied. The objective was to determine whether a less complicated prescription writing process is associated with a change in the prescribing patterns of controlled substances in the emergency department (ED). The authors conducted a retrospective nonconcurrent cohort study of all patients seen in an adult ED between April 19, 2005, and April 18, 2007, who were discharged with a prescription. Prior to April 19, 2006, a specialized prescription form stored in a locked cabinet was obtained from the nursing staff to write a prescription for benzodiazepines or Schedule II opioids. After April 19, 2006, New York State mandated that all prescriptions, regardless of schedule classification, be generated on a specialized bar-coded prescription form. The main outcome of the study was to compare the proportion of Schedule III-V opioids to Schedule II opioids and benzodiazepines prescribed in the ED before and after the introduction of a less cumbersome prescription writing process. Of the 26,638 charts reviewed, 2.1% of the total number of prescriptions generated were for a Schedule II controlled opioid before the new system was implemented compared to 13.6% after (odds ratio [OR] = 7.3, 95% confidence interval [CI] = 6.4 to 8.4). The corresponding percentages for Schedule III-V opioids were 29.9% to 18.1% (OR = 0.52, 95% CI = 0.49 to 0.55) and for benzodiazepines 1.4% to 3.9% (OR = 2.8, 95% CI = 2.4 to 3.4). Patients were more likely to receive a prescription for a Schedule II opioid or a benzodiazepine after a more streamlined computer-generated prescription writing process was introduced in this ED. (c) 2009 by the Society for Academic Emergency Medicine.
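
    The headline odds ratio can be recomputed directly from the proportions quoted above. A minimal sketch (Python) as a sanity check; the helper function is ours, and only the 2.1% and 13.6% figures from the abstract are used:

```python
# Recompute the Schedule II opioid odds ratio from the reported proportions:
# 2.1% of prescriptions before the new system vs. 13.6% after.

def odds_ratio(p_after: float, p_before: float) -> float:
    """Odds ratio of proportion p_after vs. p_before."""
    return (p_after / (1 - p_after)) / (p_before / (1 - p_before))

print(f"OR = {odds_ratio(0.136, 0.021):.1f}")  # -> OR = 7.3, as reported
```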

  7. Analyzing scheduling in the food-processing industry

    DEFF Research Database (Denmark)

    Akkerman, Renzo; van Donk, Dirk Pieter

    2009-01-01

Production scheduling has been widely studied in several research areas, resulting in a large number of methods, prescriptions, and approaches. However, the impact on scheduling practice seems relatively low. This is also the case in the food-processing industry, where industry-specific characteristics induce specific and complex scheduling problems. Based on ideas about decomposition of the scheduling task and the production process, we develop an analysis methodology for scheduling problems in food processing. This combines an analysis of structural (technological) elements of the production process with an analysis of the tasks of the scheduler. This helps to understand, describe, and structure scheduling problems in food processing, and forms a basis for improving scheduling and applying methods developed in literature. It also helps in evaluating the organisational structures...

  8. LHC Experiments Phase II - TDRs Approval Process

    CERN Document Server

    Forti, F

    2017-01-01

    The overall review process and steps of Phase II were described in CERN-LHCC-2015-077. As experiments submit detailed technical design reports (TDRs), the LHCC and UCG work in close connection to ensure a timely review of the scientific and technical feasibility as well as of the budget and schedule of the upgrade programme.

  9. Thin film processes II

    CERN Document Server

    Kern, Werner

    1991-01-01

This sequel to the 1978 classic, Thin Film Processes, gives a clear, practical exposition of important thin film deposition and etching processes that have not yet been adequately reviewed. It discusses selected processes in tutorial overviews with implementation guidelines and an introduction to the literature. Though edited to stand alone, when taken together, Thin Film Processes II and its predecessor present a thorough grounding in modern thin film techniques. Key Features: * Provides an all-new sequel to the 1978 classic, Thin Film Processes * Introduces new topics, and sever...

  10. Step-by-step cyclic processes scheduling

    DEFF Research Database (Denmark)

    Bocewicz, G.; Nielsen, Izabela Ewa; Banaszak, Z.

    2013-01-01

Automated Guided Vehicles (AGVs) fleet scheduling is one of the big problems in Flexible Manufacturing System (FMS) control. The problem is more complicated when concurrent multi-product manufacturing and resource deadlock avoidance policies are considered. The objective of the research is to provide a declarative model enabling one to state a constraint satisfaction problem aimed at AGVs fleet scheduling subject to assumed itineraries of concurrently manufactured product types. In other words, assuming a given layout of the FMS’s material handling and production routes of simultaneously manufactured orders, the main objective is to provide the declarative framework aimed at conditions allowing one to calculate the AGVs fleet schedule in online mode. An illustrative example of the relevant algebra-like driven step-by-step cyclic scheduling is provided.
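
    As a loose illustration of the declarative, constraint-satisfaction style of model described above, the sketch below (Python) searches for cyclic start offsets of two AGVs so that no shared resource is occupied by both in the same time slot. The routes, cycle length, and brute-force search are hypothetical stand-ins, not the paper's algebra-like framework:

```python
from itertools import product

CYCLE = 6  # common cycle length in time slots (assumed for illustration)
# routes[agv][t] = resource occupied at local time t (hypothetical data)
routes = {
    "AGV1": ["A", "B", "C", "C", "D", "A"],
    "AGV2": ["B", "C", "D", "A", "A", "B"],
}

def feasible(offsets):
    """No shared resource may be held by two AGVs in the same time slot."""
    for t in range(CYCLE):
        used = set()
        for agv, off in offsets.items():
            res = routes[agv][(t - off) % CYCLE]
            if res in used:
                return False      # resource conflict -> constraint violated
            used.add(res)
    return True

# Exhaustive search over start offsets -- a stand-in for a real CSP solver.
for offs in product(range(CYCLE), repeat=len(routes)):
    candidate = dict(zip(routes, offs))
    if feasible(candidate):
        print("feasible start offsets:", candidate)
        break
```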

  11. A Bee Evolutionary Guiding Nondominated Sorting Genetic Algorithm II for Multiobjective Flexible Job-Shop Scheduling

    Directory of Open Access Journals (Sweden)

    Qianwang Deng

    2017-01-01

Full Text Available Flexible job-shop scheduling problem (FJSP) is an NP-hard puzzle which inherits the job-shop scheduling problem (JSP) characteristics. This paper presents a bee evolutionary guiding nondominated sorting genetic algorithm II (BEG-NSGA-II) for multiobjective FJSP (MO-FJSP) with the objectives of minimizing the maximal completion time, the workload of the most loaded machine, and the total workload of all machines. It adopts a two-stage optimization mechanism during the optimizing process. In the first stage, the NSGA-II algorithm with T iteration times is first used to obtain the initial population N, in which a bee evolutionary guiding scheme is presented to exploit the solution space extensively. In the second stage, the NSGA-II algorithm with GEN iteration times is used again to obtain the Pareto-optimal solutions. In order to enhance the searching ability and avoid premature convergence, an updating mechanism is employed in this stage. More specifically, its population consists of three parts, and each of them changes with the iteration times. Furthermore, numerical simulations are carried out based on some published benchmark instances. Finally, the effectiveness of the proposed BEG-NSGA-II algorithm is shown by comparing its experimental results with those of some well-known existing algorithms.

  12. A Bee Evolutionary Guiding Nondominated Sorting Genetic Algorithm II for Multiobjective Flexible Job-Shop Scheduling.

    Science.gov (United States)

    Deng, Qianwang; Gong, Guiliang; Gong, Xuran; Zhang, Like; Liu, Wei; Ren, Qinghua

    2017-01-01

Flexible job-shop scheduling problem (FJSP) is an NP-hard puzzle which inherits the job-shop scheduling problem (JSP) characteristics. This paper presents a bee evolutionary guiding nondominated sorting genetic algorithm II (BEG-NSGA-II) for multiobjective FJSP (MO-FJSP) with the objectives of minimizing the maximal completion time, the workload of the most loaded machine, and the total workload of all machines. It adopts a two-stage optimization mechanism during the optimizing process. In the first stage, the NSGA-II algorithm with T iteration times is first used to obtain the initial population N, in which a bee evolutionary guiding scheme is presented to exploit the solution space extensively. In the second stage, the NSGA-II algorithm with GEN iteration times is used again to obtain the Pareto-optimal solutions. In order to enhance the searching ability and avoid premature convergence, an updating mechanism is employed in this stage. More specifically, its population consists of three parts, and each of them changes with the iteration times. Furthermore, numerical simulations are carried out based on some published benchmark instances. Finally, the effectiveness of the proposed BEG-NSGA-II algorithm is shown by comparing its experimental results with those of some well-known existing algorithms.
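
    Both of these records describe an NSGA-II variant; the core of any NSGA-II is fast nondominated sorting of candidates by their objective vectors. A minimal sketch with hypothetical objective triples (makespan, max machine workload, total workload), all to be minimized; the bee-guided stages of BEG-NSGA-II are not reproduced:

```python
def dominates(a, b):
    """True if a Pareto-dominates b (all objectives <=, at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(points):
    """Group solution indices into Pareto fronts F1, F2, ..."""
    S = {i: [] for i in range(len(points))}   # solutions dominated by i
    n = {i: 0 for i in range(len(points))}    # count of solutions dominating i
    fronts = [[]]
    for i, p in enumerate(points):
        for j, q in enumerate(points):
            if dominates(p, q):
                S[i].append(j)
            elif dominates(q, p):
                n[i] += 1
        if n[i] == 0:
            fronts[0].append(i)
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in S[i]:
                n[j] -= 1
                if n[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

# Hypothetical objective vectors for four candidate schedules
pts = [(10, 5, 30), (9, 6, 29), (12, 4, 31), (11, 7, 33)]
print(fast_nondominated_sort(pts))  # -> [[0, 1, 2], [3]]
```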

  13. 78 FR 37237 - Proposed Adjustments to the Aggregate Production Quotas for Schedule I and II Controlled...

    Science.gov (United States)

    2013-06-20

    ... class of controlled substance listed in schedules I and II and for ephedrine, pseudoephedrine, and... disposal by the registrants holding individual manufacturing quotas for the class; (2) whether any... the Aggregate Production Quotas for Schedule I and II Controlled Substances and Assessment of Annual...

  14. 78 FR 55099 - Established Aggregate Production Quotas for Schedule I and II Controlled Substances and...

    Science.gov (United States)

    2013-09-09

    ... aggregate production quotas, an additional 25% of the estimated medical, scientific, and research needs as... Production Quotas for Schedule I and II Controlled Substances and Established Assessment of Annual Needs for... initial 2014 aggregate production quotas for controlled substances in Schedules I and II of the Controlled...

  15. Combined Noncyclic Scheduling and Advanced Control for Continuous Chemical Processes

    Directory of Open Access Journals (Sweden)

    Damon Petersen

    2017-12-01

Full Text Available A novel formulation for combined scheduling and control of multi-product, continuous chemical processes is introduced, in which nonlinear model predictive control (NMPC) and noncyclic continuous-time scheduling are efficiently combined. A decomposition into nonlinear programming (NLP) dynamic optimization problems and mixed-integer linear programming (MILP) problems, without iterative alternation, allows for a computationally light solution. An iterative method is introduced to determine the number of production slots for a noncyclic schedule during a prediction horizon. A filter method is introduced to reduce the number of MILP problems required. The formulation’s closed-loop performance with both process disturbances and updated market conditions is demonstrated through multiple scenarios on a benchmark continuously stirred tank reactor (CSTR) application with fluctuations in market demand and price for multiple products. Economic performance surpasses cyclic scheduling in all scenarios presented. Computational performance is sufficiently light to enable online operation in a dual-loop feedback structure.

  16. International Literature Review on WHODAS II (World Health Organization Disability Assessment Schedule II

    Directory of Open Access Journals (Sweden)

    Federici, Stefano

    2009-06-01

Full Text Available This review is a critical analysis regarding the study and utilization of the World Health Organization Disability Assessment Schedule II (WHODAS II) as a basis for establishing specific criteria for evaluating relevant international scientific literature. The WHODAS II is an instrument developed by the World Health Organisation in order to assess behavioural limitations and restrictions related to an individual’s participation, independent from a medical diagnosis. This instrument was developed by the WHO’s Assessment, Classification and Epidemiology Group within the framework of the WHO/NIH Joint Project on Assessment and Classification of Disablements. The aim is to ascertain the international dissemination level of the WHODAS II’s utilization and, at the same time, to analyse the studies regarding the psychometric validation of the WHODAS II’s translation and adaptation in other languages and geographical contexts. In particular, our goal is to highlight which psychometric features have been investigated, focusing on the factorial structure, the reliability, and the validity of this instrument. International literature was researched through the main databases of indexed scientific production: the Cambridge Scientific Abstracts (CSA), PubMed, and Google Scholar, from 1990 through to December 2008. The following search terms were used: “whodas” in the field query, plus “title” and “abstract”. The WHODAS II has been used in 54 studies, of which 51 are articles published in international journals, 2 are conference abstracts, and one is a dissertation abstract. Nevertheless, only 7 articles were published in journals and conference proceedings regarding disability and rehabilitation. Others have been published in medical and psychiatric journals, with the aim of identifying comorbidity correlations in clinical diagnoses concerning patients with mental illness. Just 8 out of 51 articles have studied the psychometric properties of the WHODAS II. The...

  17. Surgical scheduling: a lean approach to process improvement.

    Science.gov (United States)

    Simon, Ross William; Canacari, Elena G

    2014-01-01

    A large teaching hospital in the northeast United States had an inefficient, paper-based process for scheduling orthopedic surgery that caused delays and contributed to site/side discrepancies. The hospital's leaders formed a team with the goals of developing a safe, effective, patient-centered, timely, efficient, and accurate orthopedic scheduling process; smoothing the schedule so that block time was allocated more evenly; and ensuring correct site/side. Under the resulting process, real-time patient information is entered into a database during the patient's preoperative visit in the surgeon's office. The team found the new process reduced the occurrence of site/side discrepancies to zero, reduced instances of changing the sequence of orthopedic procedures by 70%, and increased patient satisfaction. Copyright © 2014 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  18. Car painting process scheduling with harmony search algorithm

    Science.gov (United States)

    Syahputra, M. F.; Maiyasya, A.; Purnamawati, S.; Abdullah, D.; Albra, W.; Heikal, M.; Abdurrahman, A.; Khaddafi, M.

    2018-02-01

An automotive painting program paints the car body using robotic power, improving the efficiency of the production system. The production system becomes even more efficient if attention is paid to scheduling the car orders, taking into account the body shape of each car to be painted. Flow shop scheduling is a scheduling model in which all jobs to be processed flow in the same product direction/path. Scheduling problems arise when there are n jobs to be processed on machines, and it must be specified which job is done first and how jobs are allocated to the machines to obtain a scheduled production process. The Harmony Search Algorithm is a metaheuristic optimization algorithm based on music. The algorithm is inspired by the way musicians search for perfect harmony; this search for harmony is analogous to finding the optimum in an optimization process. Based on the tests that have been done, the optimal car sequence with minimum makespan value was obtained.
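
    A minimal sketch of harmony search adapted to a permutation flow shop such as the car-painting line above: harmonies are job sequences, pitch adjustment is a swap, and fitness is makespan. The processing times and parameters (HMS, HMCR, PAR) are hypothetical, not the paper's:

```python
import random

proc = [[4, 3], [2, 5], [6, 2], [3, 4]]   # proc[job][machine] times (made up)
HMS, HMCR, PAR, ITERS = 6, 0.9, 0.3, 500  # memory size, memory rate, pitch rate

def makespan(seq):
    """Completion time of the last job on the last machine (flow shop)."""
    m = len(proc[0])
    finish = [0] * m
    for j in seq:
        for k in range(m):
            prev = finish[k - 1] if k else 0  # this job's finish on machine k-1
            finish[k] = max(finish[k], prev) + proc[j][k]
    return finish[-1]

def random_seq():
    s = list(range(len(proc)))
    random.shuffle(s)
    return s

memory = [random_seq() for _ in range(HMS)]        # harmony memory
for _ in range(ITERS):
    if random.random() < HMCR:                     # draw a stored harmony
        new = random.choice(memory)[:]
        if random.random() < PAR:                  # pitch adjustment: swap jobs
            i, j = random.sample(range(len(new)), 2)
            new[i], new[j] = new[j], new[i]
    else:                                          # random improvisation
        new = random_seq()
    worst = max(memory, key=makespan)
    if makespan(new) < makespan(worst):            # replace the worst harmony
        memory[memory.index(worst)] = new

best = min(memory, key=makespan)
print("order:", best, "makespan:", makespan(best))
```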

  19. Fractional Poisson process (II)

    International Nuclear Information System (INIS)

    Wang Xiaotian; Wen Zhixiong; Zhang Shiying

    2006-01-01

In this paper, we propose a stochastic process W_H(t) (H ∈ (1/2, 1)) which we call the fractional Poisson process. The process W_H(t) is self-similar in the wide sense, displays long-range dependence, and has a fatter tail than a Gaussian process. In addition, it converges to fractional Brownian motion in distribution.

  20. Multimodal processes scheduling in mesh-like network environment

    Directory of Open Access Journals (Sweden)

    Bocewicz Grzegorz

    2015-06-01

Full Text Available Multimodal processes planning and scheduling play a pivotal role in many different domains including city networks, multimodal transportation systems, computer and telecommunication networks, and so on. A multimodal process can be seen as a process partially processed by locally executed cyclic processes. In that context, the concept of a Mesh-like Multimodal Transportation Network (MMTN), in which several isomorphic subnetworks interact with each other via distinguished subsets of common shared intermodal transport interchange facilities (such as a railway station, bus station or bus/tram stop) so as to provide a variety of demand-responsive passenger transportation services, is examined. Consider a mesh-like layout of a passenger transport network equipped with different lines including buses, trams, metro, trains, etc., where passenger flows are treated as multimodal processes. The goal is to provide a declarative model enabling one to state a constraint satisfaction problem aimed at multimodal transportation process scheduling encompassing passenger flow itineraries. The main objective is then to provide conditions guaranteeing solvability of particular transport line scheduling, i.e. guaranteeing the right match-up of locally acting cyclic bus, tram, metro and train schedules to given passenger flow itineraries.

  1. FMEF Electrical single line diagram and panel schedule verification process

    International Nuclear Information System (INIS)

    Fong, S.K.

    1998-01-01

Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process for essential electrical single-line drawings and panel schedules was established to benefit the operations lock and tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF.

  2. Multi-Objective Flexible Flow Shop Scheduling Problem Considering Variable Processing Time due to Renewable Energy

    Directory of Open Access Journals (Sweden)

    Xiuli Wu

    2018-03-01

Full Text Available Renewable energy is an alternative to non-renewable energy for reducing the carbon footprint of manufacturing systems. Finding an energy-efficient scheduling solution when both renewable and non-renewable energy drive production is therefore of great importance. In this paper, a multi-objective flexible flow shop scheduling problem that considers variable processing time due to renewable energy (MFFSP-VPTRE) is studied. First, the optimization model of the MFFSP-VPTRE is formulated considering the periodicity of renewable energy and the limitations of energy storage capacity. Then, a hybrid non-dominated sorting genetic algorithm with variable local search (HNSGA-II) is proposed to solve the MFFSP-VPTRE. An operation- and machine-based encoding method is employed. A low-carbon scheduling algorithm is presented. Besides crossover and mutation, a variable local search is used to improve the offspring’s Pareto set. The offspring and the parents are combined and those that dominate more are selected to continue evolving. Finally, two groups of experiments are carried out. The results show that the low-carbon scheduling algorithm can effectively reduce the carbon footprint under the premise of makespan optimization and that the HNSGA-II outperforms the traditional NSGA-II and can solve the MFFSP-VPTRE effectively and efficiently.

  3. Multi-core processing and scheduling performance in CMS

    International Nuclear Information System (INIS)

    Hernández, J M; Evans, D; Foulkes, S

    2012-01-01

Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to effectively utilize the multi-core architecture. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in a much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model of computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs to have control over a larger quantum of resources, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues compared to standard single-core processing workflows.

  4. Integrated project scheduling and staff assignment with controllable processing times.

    Science.gov (United States)

    Fernandez-Viagas, Victor; Framinan, Jose M

    2014-01-01

This paper addresses a decision problem related to simultaneously scheduling the tasks in a project and assigning the staff to these tasks, taking into account that a task can be performed only by employees with certain skills, and that the length of each task depends on the number of employees assigned. This type of problem usually appears in service companies, where task scheduling and staff assignment are closely related. An integer programming model for the problem is proposed, together with some extensions to cope with different situations. Additionally, the advantages of the controllable processing times approach are compared with those of fixed processing times. Due to the complexity of the integrated model, a simple GRASP algorithm is implemented in order to obtain good, approximate solutions in short computation times.

  5. Integrated Project Scheduling and Staff Assignment with Controllable Processing Times

    Directory of Open Access Journals (Sweden)

    Victor Fernandez-Viagas

    2014-01-01

Full Text Available This paper addresses a decision problem related to simultaneously scheduling the tasks in a project and assigning the staff to these tasks, taking into account that a task can be performed only by employees with certain skills, and that the length of each task depends on the number of employees assigned. This type of problem usually appears in service companies, where task scheduling and staff assignment are closely related. An integer programming model for the problem is proposed, together with some extensions to cope with different situations. Additionally, the advantages of the controllable processing times approach are compared with those of fixed processing times. Due to the complexity of the integrated model, a simple GRASP algorithm is implemented in order to obtain good, approximate solutions in short computation times.
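
    A minimal GRASP sketch in the spirit of these two records: greedy randomized construction of a skill-feasible task-to-employee assignment, followed by a simple improvement search. All data are hypothetical, and the paper's controllable processing times (durations shrinking with team size) are omitted for brevity:

```python
import random

tasks = {"t1": 4, "t2": 3, "t3": 5, "t4": 2}           # task -> duration
skilled = {"t1": ["e1", "e2"], "t2": ["e1"],           # task -> capable staff
           "t3": ["e2", "e3"], "t4": ["e1", "e3"]}

def makespan(assign):
    """Finish time of the busiest employee (tasks run back to back)."""
    load = {}
    for t, e in assign.items():
        load[e] = load.get(e, 0) + tasks[t]
    return max(load.values())

def construct(alpha=0.5):
    """Greedy randomized construction: pick among the least-loaded skilled
    employees; alpha sizes the restricted candidate list (greediness)."""
    assign, load = {}, {}
    for t in sorted(tasks, key=tasks.get, reverse=True):
        cands = sorted(skilled[t], key=lambda e: load.get(e, 0))
        rcl = cands[: max(1, int(len(cands) * alpha))]
        e = random.choice(rcl)
        assign[t] = e
        load[e] = load.get(e, 0) + tasks[t]
    return assign

def local_search(assign):
    """Move one task to another skilled employee while it improves."""
    improved = True
    while improved:
        improved = False
        for t in tasks:
            for e in skilled[t]:
                trial = dict(assign, **{t: e})
                if makespan(trial) < makespan(assign):
                    assign, improved = trial, True
    return assign

best = min((local_search(construct()) for _ in range(20)), key=makespan)
print(best, makespan(best))
```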

  6. Bi-Objective Flexible Job-Shop Scheduling Problem Considering Energy Consumption under Stochastic Processing Times.

    Science.gov (United States)

    Yang, Xin; Zeng, Zhenxiang; Wang, Ruidong; Sun, Xueshan

    2016-01-01

This paper presents a novel method for the optimization of the bi-objective Flexible Job-shop Scheduling Problem (FJSP) under stochastic processing times. The robust counterpart model and the Non-dominated Sorting Genetic Algorithm II (NSGA-II) are used to solve the bi-objective FJSP with consideration of the completion time and the total energy consumption under stochastic processing times. A case study on GM Corporation verifies that the NSGA-II used in this paper is effective and has advantages in solving the proposed model compared with HPSO and PSO+SA. The idea and method of the paper can be generalized widely in the manufacturing industry, because they can reduce the energy consumption of energy-intensive manufacturing enterprises with little investment when the new approach is applied to existing systems.

  7. Application of coupled symbolic and numeric processing to an advanced scheduling system for plant construction

    International Nuclear Information System (INIS)

    Kobayashi, Yasuhiro; Takamoto, Masanori; Nonaka, Hisanori; Yamada, Naoyuki

    1994-01-01

A scheduling system has been developed by integrating symbolic processing functions for constraint handling and modification guidance with numeric processing functions for schedule optimization and evaluation. The system is composed of an automatic schedule generation module, an interactive schedule revision module and a schedule evaluation module. The goal of the problem solving is the flattening of the daily resource requirements throughout the scheduling period. The automatic schedule generation module optimizes the initial schedule according to the formulatable portion of the requirement description specified in a predicate-like language. A planning engineer refines the near-goal schedule through a knowledge-based interactive optimization process to obtain the goal schedule which fully covers the requirement description, using the interactive schedule revision module and the schedule evaluation module. A scheduling system has been implemented on the basis of the proposed problem-solving framework and experimentally applied to real-world sized scheduling problems for plant construction. Using a result of the overall plant construction scheduling, a section-schedule optimization process is described with emphasis on the symbolic processing functions. (author)

  8. Scheduling algorithms for automatic control systems for technological processes

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

The wide use of automatic process control systems and of high-performance systems containing a number of computers (processors) creates opportunities for high-quality, fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation, and the processing of big data arrays all require a high level of productivity and, at the same time, minimal time for data handling and result delivery. In order to achieve the best times, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. Some basic task scheduling methods for multi-machine process control systems are considered in this paper, their advantages and disadvantages are highlighted, and some considerations for their use in developing software for automatic process control systems are made.
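
    One classic member of the scheduling family surveyed above is longest-processing-time (LPT) list scheduling of independent tasks onto identical processors. A minimal sketch with hypothetical task durations:

```python
import heapq

def lpt_schedule(durations, n_proc):
    """Assign each task (longest first) to the currently least-loaded CPU."""
    heap = [(0.0, p) for p in range(n_proc)]   # (load, processor id)
    heapq.heapify(heap)
    assignment = {}
    for task, d in sorted(enumerate(durations), key=lambda x: -x[1]):
        load, p = heapq.heappop(heap)          # least-loaded processor
        assignment[task] = p
        heapq.heappush(heap, (load + d, p))
    return assignment, max(l for l, _ in heap) # makespan = heaviest load

assignment, makespan = lpt_schedule([7, 4, 6, 3, 5, 2], n_proc=3)
print(assignment, makespan)                    # -> makespan 9.0
```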

  9. Economic Benefit from Progressive Integration of Scheduling and Control for Continuous Chemical Processes

    Directory of Open Access Journals (Sweden)

    Logan D. R. Beal

    2017-12-01

Full Text Available Performance of integrated production scheduling and advanced process control with disturbances is summarized and reviewed with four progressive stages of scheduling and control integration and responsiveness to disturbances: open-loop segregated scheduling and control, closed-loop segregated scheduling and control, open-loop scheduling with consideration of process dynamics, and closed-loop integrated scheduling and control responsive to process disturbances and market fluctuations. Progressive economic benefit from dynamic rescheduling and integrating scheduling and control is shown on a continuously stirred tank reactor (CSTR) benchmark application in closed-loop simulations over 24 h. A fixed-horizon integrated scheduling and control formulation for multi-product, continuous chemical processes is utilized, in which nonlinear model predictive control (NMPC) and continuous-time scheduling are combined.

  10. Synthesis of zero effluent multipurpose batch processes using effective scheduling

    CSIR Research Space (South Africa)

    Gouws, JF

    2008-06-01

Full Text Available as follows. Given: (i) required production over a given time horizon, (ii) product recipe and production times, (iii) the maximum number of processing vessels and storage vessels, and (iv) the maximum and minimum capacity of processing vessels and storage vessels... the cleaning operation, due to the three different products mixed. Each type of wastewater has the possibility of being stored in a distinct storage vessel. The minimum and maximum capacity of each storage vessel are 500 kg and 1500 kg, respectively...

  11. Fractional Programming for Communication Systems—Part II: Uplink Scheduling via Matching

    Science.gov (United States)

    Shen, Kaiming; Yu, Wei

    2018-05-01

This two-part paper develops novel methodologies for using fractional programming (FP) techniques to design and optimize communication systems. Part I of this paper proposes a new quadratic transform for FP and treats its application to continuous optimization problems. In this Part II of the paper, we study discrete problems, such as those involving user scheduling, which are considerably more difficult to solve. Unlike the continuous problems, discrete or mixed discrete-continuous problems normally cannot be recast as convex problems. In contrast to the common heuristic of relaxing the discrete variables, this work reformulates the original problem in an FP form amenable to distributed combinatorial optimization. The paper illustrates this methodology by tackling the important and challenging problem of uplink coordinated multi-cell user scheduling in wireless cellular systems. Uplink scheduling is more challenging than downlink scheduling, because uplink user scheduling decisions significantly affect the interference pattern in nearby cells. Further, the discrete scheduling variable needs to be optimized jointly with continuous variables such as transmit power levels and beamformers. The main idea of the proposed FP approach is to decouple the interaction among the interfering links, thereby permitting a distributed and joint optimization of the discrete and continuous variables with provable convergence. The paper shows that the well-known weighted minimum mean-square-error (WMMSE) algorithm can also be derived from a particular use of FP, but our proposed FP-based method significantly outperforms WMMSE when discrete user scheduling variables are involved, both in terms of run-time efficiency and optimization results.
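
    For context, the quadratic transform from Part I (restated here under the usual assumptions that each numerator A_n(x) is nonnegative and each denominator B_n(x) is positive) decouples a sum-of-ratios objective via auxiliary variables y_n:

```latex
\max_{x}\ \sum_{n}\frac{A_n(x)}{B_n(x)}
\quad\longrightarrow\quad
\max_{x,\,y}\ \sum_{n}\Bigl(2\,y_n\sqrt{A_n(x)}-y_n^{2}\,B_n(x)\Bigr),
\qquad
y_n^{\star}=\frac{\sqrt{A_n(x)}}{B_n(x)}.
```

    Alternating between the closed-form update of y and the optimization over x is the mechanism that the distributed scheduling method above builds on.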

  12. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities and participation in society). The main purpose of this paper is the evaluation of the psychometric properties of each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.

  13. Adaptive Dynamic Process Scheduling on Distributed Memory Parallel Computers

    Directory of Open Access Journals (Sweden)

    Wei Shu

    1994-01-01

Full Text Available One of the challenges in programming distributed memory parallel machines is deciding how to allocate work to processors. This problem is particularly important for computations with unpredictable dynamic behaviors or irregular structures. We present a scheme for dynamic scheduling of medium-grained processes that is useful in this context. The adaptive contracting within neighborhood (ACWN) is a dynamic, distributed, load-dependent, and scalable scheme. It deals with dynamic and unpredictable creation of processes and adapts to different systems. The scheme is described and contrasted with two other schemes that have been proposed in this context, namely the randomized allocation and the gradient model. The performance of the three schemes on an Intel iPSC/2 hypercube is presented and analyzed. The experimental results show that even though the ACWN algorithm incurs somewhat larger overhead than the randomized allocation, it achieves better performance in most cases due to its adaptiveness. Its feature of quickly spreading the work helps it outperform the gradient model in performance and scalability.

  14. Job schedulers for Big data processing in Hadoop environment: testing real-life schedulers using benchmark programs

    Directory of Open Access Journals (Sweden)

    Mohd Usama

    2017-11-01

Full Text Available At present, big data is very popular, because it has proved to be very successful in many fields such as social media, E-commerce transactions, etc. Big data describes the tools and technologies needed to capture, manage, store, distribute, and analyze petabyte or larger-sized datasets having different structures at high speed. Big data can be structured, unstructured, or semi-structured. Hadoop is an open source framework that is used to process large amounts of data in an inexpensive and efficient way, and job scheduling is a key factor for achieving high performance in big data processing. This paper gives an overview of big data and highlights the problems and challenges in big data. It then highlights the Hadoop Distributed File System (HDFS), Hadoop MapReduce, and various parameters that affect the performance of job scheduling algorithms in big data such as the JobTracker, TaskTracker, NameNode, DataNode, etc. The primary purpose of this paper is to present a comparative study of job scheduling algorithms along with their experimental results in the Hadoop environment. In addition, this paper describes the advantages, disadvantages, and distinctive features of various Hadoop job schedulers such as FIFO, Fair, Capacity, Deadline Constraints, Delay, LATE, Resource Aware, etc., and provides a comparative study among these schedulers.
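
    As a toy illustration of why the scheduler choice matters, the sketch below (Python) contrasts FIFO with an idealized fair share on a two-slot cluster. It is a deliberate simplification, not Hadoop's actual slot/container accounting:

```python
def fifo(jobs, capacity=2):
    """Run jobs in arrival order; each task takes one time unit on one slot."""
    t, finish = 0, {}
    for name, n_tasks in jobs:
        t += -(-n_tasks // capacity)       # ceil: time to drain this job
        finish[name] = t
    return finish

def fair(jobs, capacity=2):
    """Each time unit, hand out `capacity` task slots round-robin across
    all jobs that still have work (equal sharing, extras wrap around)."""
    remaining = dict(jobs)
    t, finish = 0, {}
    while remaining:
        t += 1
        waiting = list(remaining)
        for i in range(capacity):
            name = waiting[i % len(waiting)]
            if remaining.get(name, 0) > 0:
                remaining[name] -= 1
                if remaining[name] == 0:
                    del remaining[name]
                    finish[name] = t
    return finish

jobs = [("big", 8), ("small", 1)]          # hypothetical task counts
print("FIFO:", fifo(jobs))   # small waits behind big: finishes at t=5
print("Fair:", fair(jobs))   # small finishes at t=1, big at t=5
```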

  15. A Scheduling Model for the Re-entrant Manufacturing System and Its Optimization by NSGA-II

    Directory of Open Access Journals (Sweden)

    Masoud Rabbani

    2016-11-01

Full Text Available In this study, a two-objective mixed-integer linear programming (MILP) model for the multi-product re-entrant flow shop scheduling problem has been designed. Two objectives are considered: one is the maximization of the production rate, and the other is the minimization of processing time. The system has m stations and can process several products at a time. The re-entrant flow shop scheduling problem is well known as an NP-hard problem and its complexity has been discussed by several researchers. Given that the NSGA-II algorithm is one of the strongest and most widely applicable algorithms for solving multi-objective optimization problems, it is used to solve this problem. To increase algorithm performance, the Taguchi technique is used to design experiments for the algorithm’s parameters. Numerical experiments are proposed to show the efficiency and effectiveness of the model. Finally, the results of NSGA-II are compared with the SPEA2 algorithm (Strength Pareto Evolutionary Algorithm 2). The experimental results show that the proposed algorithm performs significantly better than the SPEA2.

  16. Designing scheduling concept and computer support in the food processing industries

    NARCIS (Netherlands)

    van Donk, DP; van Wezel, W; Gaalman, G; Bititci, US; Carrie, AS

    1998-01-01

    Food processing industries cope with a specific production process and a dynamic market. Scheduling the production process is thus important in being competitive. This paper proposes a hierarchical concept for structuring the scheduling and describes the (computer) support needed for this concept.

  17. The 12-item World Health Organization Disability Assessment Schedule II (WHO-DAS II: a nonparametric item response analysis

    Directory of Open Access Journals (Sweden)

    Fernandez Ana

    2010-05-01

Full Text Available Abstract Background: Previous studies have analyzed the psychometric properties of the World Health Organization Disability Assessment Schedule II (WHO-DAS II) using classical omnibus measures of scale quality. These analyses are sample dependent and do not model item responses as a function of the underlying trait level. The main objective of this study was to examine the effectiveness of the WHO-DAS II items and their options in discriminating between changes in the underlying disability level by means of item response analyses. We also explored differential item functioning (DIF) in men and women. Methods: The participants were 3615 adult general practice patients from 17 regions of Spain, with a first diagnosed major depressive episode. The 12-item WHO-DAS II was administered by the general practitioners during the consultation. We used a non-parametric item response method (kernel smoothing), implemented with the TestGraf software, to examine the effectiveness of each item (item characteristic curves) and of its options (option characteristic curves) in discriminating between changes in the underlying disability level. We examined composite DIF to know whether women had a higher probability than men of endorsing each item. Results: Item response analyses indicated that the twelve items forming the WHO-DAS II perform very well. All items were determined to provide good discrimination across varying standardized levels of the trait. The items also had option characteristic curves that showed good discrimination, given that each increasing option became more likely than the previous as a function of increasing trait level. No gender-related DIF was found on any of the items. Conclusions: All WHO-DAS II items were very good at assessing overall disability. Our results supported the appropriateness of the weights assigned to response option categories and showed an absence of gender differences in item functioning.
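
    A minimal sketch (Python) of the kernel-smoothing idea behind the TestGraf analysis: estimate an item characteristic curve as a kernel-weighted average of 0/1 responses along the trait level. The data are simulated, not the WHO-DAS II sample:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.normal(size=2000)                    # latent trait levels
p_true = 1 / (1 + np.exp(-(theta - 0.5)))        # true ICC (logistic)
y = rng.binomial(1, p_true)                      # observed 0/1 item scores

def icc_kernel(theta, y, grid, h=0.3):
    """Kernel-smoothed P(item endorsed | trait) with a Gaussian kernel
    (Nadaraya-Watson estimator; bandwidth h is an assumption)."""
    g = np.asarray(grid)[:, None]
    w = np.exp(-0.5 * ((g - theta) / h) ** 2)    # kernel weights
    return (w * y).sum(axis=1) / w.sum(axis=1)

grid = np.linspace(-2, 2, 9)
for t, p in zip(grid, icc_kernel(theta, y, grid)):
    print(f"theta={t:+.1f}  P(endorse)={p:.2f}")  # rises with trait level
```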

  18. ICPP calcined solids storage facility closure study. Volume II: Cost estimates, planning schedules, yearly cost flowcharts, and life-cycle cost estimates

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-01

This document contains Volume II of the Closure Study for the Idaho Chemical Processing Plant Calcined Solids Storage Facility. This volume contains draft information on cost estimates, planning schedules, yearly cost flowcharts, and life-cycle costs for the four options described in Volume I: (1) Risk-Based Clean Closure; NRC Class C fill, (2) Risk-Based Clean Closure; Clean fill, (3) Closure to Landfill Standards; NRC Class C fill, and (4) Closure to Landfill Standards; Clean fill.

  19. ICPP calcined solids storage facility closure study. Volume II: Cost estimates, planning schedules, yearly cost flowcharts, and life-cycle cost estimates

    International Nuclear Information System (INIS)

    1998-02-01

This document contains Volume II of the Closure Study for the Idaho Chemical Processing Plant Calcined Solids Storage Facility. This volume contains draft information on cost estimates, planning schedules, yearly cost flowcharts, and life-cycle costs for the four options described in Volume I: (1) Risk-Based Clean Closure; NRC Class C fill, (2) Risk-Based Clean Closure; Clean fill, (3) Closure to Landfill Standards; NRC Class C fill, and (4) Closure to Landfill Standards; Clean fill.

  20. The development of stochastic process modeling through risk analysis derived from scheduling of NPP project

    International Nuclear Information System (INIS)

    Lee, Kwang Ho; Roh, Myung Sub

    2013-01-01

There are so many different factors to consider when constructing a nuclear power plant successfully, from planning to decommissioning. According to the PMBOK, all projects have nine domains from a holistic project management perspective. They are equally important to all projects; however, this study focuses mostly on the processes required to manage timely completion of the project and to conduct risk management. The overall objective of this study is to explain what risk analysis derived from NPP project scheduling is, and to show how to implement stochastic process modeling through risk management. Building a nuclear power plant requires a great deal of time and fundamental knowledge spanning all fields of engineering, which means that integrated management of project scheduling with so many activities is necessary and very important. Simulation techniques for NPP project scheduling using the Open Plan, Crystal Ball, and Minitab programs can be useful tools for designing optimal schedule planning. Thus far, the Open Plan and Monte Carlo programs have been used to calculate the critical path for scheduling network analysis, and the Minitab program has been applied to monitor the scheduling risk. This approach to stochastic modeling through risk analysis of project activities is very useful for optimizing activity schedules using the Critical Path Method and managing the scheduling control of an NPP project. This study has shown a new approach to optimal scheduling of an NPP project; however, it does not consider the characteristics of activities according to NPP site conditions. Hence, further research considering those factors is needed.

  1. The development of stochastic process modeling through risk analysis derived from scheduling of NPP project

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kwang Ho; Roh, Myung Sub [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-10-15

There are so many different factors to consider when constructing a nuclear power plant successfully, from planning to decommissioning. According to the PMBOK, all projects have nine domains from a holistic project management perspective. They are equally important to all projects; however, this study focuses mostly on the processes required to manage timely completion of the project and to conduct risk management. The overall objective of this study is to explain what risk analysis derived from NPP project scheduling is, and to show how to implement stochastic process modeling through risk management. Building a nuclear power plant requires a great deal of time and fundamental knowledge spanning all fields of engineering, which means that integrated management of project scheduling with so many activities is necessary and very important. Simulation techniques for NPP project scheduling using the Open Plan, Crystal Ball, and Minitab programs can be useful tools for designing optimal schedule planning. Thus far, the Open Plan and Monte Carlo programs have been used to calculate the critical path for scheduling network analysis, and the Minitab program has been applied to monitor the scheduling risk. This approach to stochastic modeling through risk analysis of project activities is very useful for optimizing activity schedules using the Critical Path Method and managing the scheduling control of an NPP project. This study has shown a new approach to optimal scheduling of an NPP project; however, it does not consider the characteristics of activities according to NPP site conditions. Hence, further research considering those factors is needed.
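
    A minimal sketch (Python) of the Monte Carlo scheduling-risk idea both records describe: sample activity durations, run a CPM-style forward pass, and read schedule percentiles off the makespan distribution. The activity network and triangular durations are hypothetical stand-ins for the Open Plan / Crystal Ball workflow:

```python
import random

# activity -> (predecessors, (low, mode, high) duration in months); made up
network = {
    "design":  ([],          (10, 12, 18)),
    "license": (["design"],  (12, 18, 30)),
    "civil":   (["license"], (20, 24, 36)),
    "install": (["civil"],   (18, 20, 28)),
    "testing": (["install"], (6, 8, 14)),
}

def sample_makespan():
    """One Monte Carlo trial: CPM forward pass with sampled durations."""
    finish = {}
    for act, (preds, (lo, mode, hi)) in network.items():  # topological order
        start = max((finish[p] for p in preds), default=0.0)
        finish[act] = start + random.triangular(lo, hi, mode)
    return max(finish.values())

samples = sorted(sample_makespan() for _ in range(10_000))
print("P50 makespan:", round(samples[5_000], 1), "months")
print("P90 makespan:", round(samples[9_000], 1), "months")  # risk margin
```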

  2. Scheduling of Conditional Process Graphs for the Synthesis of Embedded Systems

    DEFF Research Database (Denmark)

    Eles, Petru; Kuchcinski, Krzysztof; Peng, Zebo

    1998-01-01

We present an approach to process scheduling based on an abstract graph representation which captures both dataflow and the flow of control. Target architectures consist of several processors, ASICs and shared busses. We have developed a heuristic which generates a schedule table so that the worst-case delay is minimized. Several experiments demonstrate the efficiency of the approach.

  3. Accuracy improvement of dataflow analysis for cyclic stream processing applications scheduled by static priority preemptive schedulers

    NARCIS (Netherlands)

    Kurtin, Philip Sebastian; Hausmans, J.P.H.M.; Geuns, S.J.; Bekooij, Marco Jan Gerrit

    2014-01-01

    Stream processing applications executed on embedded multiprocessor systems regularly contain cyclic data dependencies due to the presence of feedback loops and bounded FIFO buffers. Dataflow modeling is suitable for the temporal analysis of such applications. However, the accuracy can be

  4. Process simulations for the LCLS-II cryogenic systems

    Science.gov (United States)

    Ravindranath, V.; Bai, H.; Heloin, V.; Fauve, E.; Pflueckhahn, D.; Peterson, T.; Arenius, D.; Bevins, M.; Scanlon, C.; Than, R.; Hays, G.; Ross, M.

    2017-12-01

Linac Coherent Light Source II (LCLS-II), a 4 GeV continuous-wave (CW) superconducting electron linear accelerator, is to be constructed in the existing two-mile Linac facility at the SLAC National Accelerator Laboratory. First light from the new facility is scheduled for 2020. The LCLS-II Linac consists of thirty-five 1.3 GHz and two 3.9 GHz superconducting cryomodules. The Linac cryomodules require cryogenic cooling for the superconducting niobium cavities at 2.0 K, low-temperature thermal intercepts at 5.5-7.5 K, and a thermal shield at 35-55 K. The equivalent 4.5 K refrigeration capacity needed for Linac operations ranges from a minimum of 11 kW to a maximum of 24 kW. Two cryogenic plants with 18 kW of equivalent 4.5 K refrigeration capacity will be used to support the Linac cryogenic cooling requirements. The cryogenic plants are based on Jefferson Lab’s CHL-II cryogenic plant, which uses the “Floating Pressure” design to support a wide variation in the cooling load. In this paper, the cryogenic process for the integrated LCLS-II cryogenic system and the process simulations for a 4.5 K cryoplant in combination with a 2 K cold compressor box and the Linac cryomodules are described.

  5. Extraterrestrial Metals Processing, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The Extraterrestrial Metals Processing (EMP) system produces iron, silicon, and light metals from Mars, Moon, or asteroid resources in support of advanced human...

  6. Silicon processing for photovoltaics II

    CERN Document Server

    Khattak, CP

    2012-01-01

The processing of semiconductor silicon for manufacturing low cost photovoltaic products has been a field of increasing activity over the past decade and a number of papers have been published in the technical literature. This volume presents comprehensive, in-depth reviews on some of the key technologies developed for processing silicon for photovoltaic applications. It is complementary to Volume 5 in this series and together they provide the only collection of reviews in silicon photovoltaics available. The volume contains papers on: the effect of introducing grain boundaries in silicon; the...

  7. Ground Processing Optimization Using Artificial Intelligence Techniques, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The ultimate goal is the automation of a large amount of KSC's planning, scheduling, and execution decision making. Phase II will result in a complete full-scale...

  8. Improved Low Power FPGA Binding of Datapaths from Data Flow Graphs with NSGA II -based Schedule Selection

    Directory of Open Access Journals (Sweden)

    BHUVANESWARI, M. C.

    2013-11-01

Full Text Available FPGAs are increasingly being used to implement datapath-intensive algorithms for signal processing and image processing applications. In High Level Synthesis of Data Flow Graphs targeted at FPGAs, the effect of interconnect resources such as multiplexers must be considered, since they contribute significantly to the area and switching power. We propose a binding framework for behavioral synthesis of Data Flow Graphs (DFGs) onto FPGA targets with power reduction as the main criterion. The technique uses a multi-objective GA, NSGA II, for design space exploration to identify schedules that have the potential to yield low-power bindings from a population of non-dominated solutions. A greedy constructive binding technique reported in the literature is adapted for interconnect minimization. The binding is further subjected to a perturbation process by altering the register and multiplexer assignments. Results obtained on standard DFG benchmarks indicate that our technique yields better power-aware bindings than the constructive binding approach with little or no area overhead.

  9. Susceptibility of optimal train schedules to stochastic disturbances of process times

    DEFF Research Database (Denmark)

    Larsen, Rune; Pranzo, Marco; D’Ariano, Andrea

    2013-01-01

... and dwell times). In fact, the objective of railway traffic management is to reduce delay propagation and to increase disturbance robustness of train schedules at a network scale. We present a quantitative study of traffic disturbances and their effects on the schedules computed by simple and advanced... In this study, an advanced branch and bound algorithm, on average, outperforms a First In First Out scheduling rule both in deterministic and stochastic traffic scenarios. However, the characteristics of the stochastic processes and the way a stochastic instance is handled turn out to have a serious impact...

  10. Multi-core processing and scheduling performance in CMS

    CERN Multimedia

    CERN. Geneva

    2012-01-01

Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to effectively utilize the multi-core architecture. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in a much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model of computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs to have control over a larger quantum of resource since multi-...

  11. An Extended Genetic Algorithm for Distributed Integration of Fuzzy Process Planning and Scheduling

    Directory of Open Access Journals (Sweden)

    Shuai Zhang

    2016-01-01

Full Text Available The distributed integration of process planning and scheduling (DIPPS) aims to simultaneously arrange the two most important manufacturing stages, process planning and scheduling, in a distributed manufacturing environment. Meanwhile, considering its advantage in representing actual situations, the triangular fuzzy number (TFN) is adopted in DIPPS to represent machine processing and transportation times. In order to solve this problem and obtain the optimal or near-optimal solution, an extended genetic algorithm (EGA) with an innovative three-class encoding method and improved crossover and mutation strategies is proposed. Furthermore, a local enhancement strategy featuring machine replacement and order exchange is added to strengthen the local search capability of the basic genetic algorithm process. Experimental verification shows that the EGA achieves satisfactory results in a very short period of time and demonstrates powerful performance in dealing with the distributed integration of fuzzy process planning and scheduling (DIFPPS).

  12. Water-integrated scheduling of batch process plants

    NARCIS (Netherlands)

    Pulluru, Sai Jishna; Akkerman, Renzo

    2017-01-01

    Efficient water management is becoming increasingly important in production systems, but companies often do not have any concrete strategies to implement. While there are numerous technological options for improving water efficiency in process plants, there is a lack of effective decision support to

  14. Uncertainty management by relaxation of conflicting constraints in production process scheduling

    Science.gov (United States)

    Dorn, Juergen; Slany, Wolfgang; Stary, Christian

    1992-01-01

    Mathematical-analytical methods as used in Operations Research approaches are often insufficient for scheduling problems. This is due to three reasons: the combinatorial complexity of the search space, conflicting objectives for production optimization, and the uncertainty in the production process. Knowledge-based techniques, especially approximate reasoning and constraint relaxation, are promising ways to overcome these problems. A case study from an industrial CIM environment, namely high-grade steel production, is presented to demonstrate how knowledge-based scheduling with the desired capabilities could work. By using fuzzy set theory, the applied knowledge representation technique covers the uncertainty inherent in the problem domain. Based on this knowledge representation, a classification of jobs according to their importance is defined which is then used for the straightforward generation of a schedule. A control strategy which comprises organizational, spatial, temporal, and chemical constraints is introduced. The strategy supports the dynamic relaxation of conflicting constraints in order to improve tentative schedules.
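
    The sketch below illustrates one way such an importance classification could drive relaxation: each job gets a fuzzy importance degree combined by a max (fuzzy OR), and the constraints of the least important jobs are relaxed first. The membership functions and thresholds are invented for illustration, not taken from the paper.

```python
# Sketch: fuzzy classification of jobs by importance, as a basis for
# deciding which conflicting constraint to relax first. A real system
# would encode domain knowledge about the steel production process.

def importance(due_slack_hours, order_priority):
    """Fuzzy 'importance' in [0, 1]: tight slack and high priority -> important."""
    tight = max(0.0, min(1.0, (24 - due_slack_hours) / 24))  # 1 when slack <= 0
    prio = max(0.0, min(1.0, order_priority / 10))           # priority on a 0-10 scale
    return max(tight, prio)  # fuzzy OR (max-combination)

# Hypothetical jobs: (hours of slack until due date, commercial priority)
jobs = {"J1": (30, 9), "J2": (4, 3), "J3": (48, 2)}
ranked = sorted(jobs, key=lambda j: importance(*jobs[j]), reverse=True)
print(ranked)  # constraints of the jobs at the end of this list are relaxed first
```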

  15. Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes

    Science.gov (United States)

    Cropper, A. E.; Wang, Z.

    1995-08-01

    Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed, with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance, the last being a result of the traceability made possible through a process involving mixing and splitting of material.

  16. Optimal methodology for a machining process scheduling in spot electricity markets

    International Nuclear Information System (INIS)

    Yusta, J.M.; Torres, F.; Khodr, H.M.

    2010-01-01

    Electricity spot markets have introduced hourly variations in the price of electricity. These variations make it possible to increase energy efficiency through appropriate scheduling and adaptation of industrial production to the hourly cost of electricity, in order to obtain the maximum profit for the industry. In this article a mathematical optimization model simulates the costs and the electricity demand of a machining process. The resulting problem is solved using the generalized reduced gradient approach to find the optimum production schedule that maximizes the industry profit considering the hourly variations of the price of electricity in the spot market. Different price scenarios are studied to analyze the impact of spot market electricity prices on the optimal scheduling of the machining process and on the industry profit. The benefit of applying the proposed model is shown to be greatest in cases of very high electricity prices.
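
    A minimal sketch of the price-following idea: given hourly spot prices and a required number of machining hours, run in the cheapest hours. The paper's model is a nonlinear optimization solved by the generalized reduced gradient method; this greedy version, with hypothetical prices, only illustrates the behaviour for a single machine with hour-independent output.

```python
# Sketch: shift production into the cheapest electricity hours. Assumes one
# machine, identical output per hour, and no sequencing constraints -- a
# deliberate simplification of the scheduling model described above.

def cheapest_hours(prices_per_hour, hours_needed):
    """Pick the indices of the cheapest hours to run in."""
    ranked = sorted(range(len(prices_per_hour)), key=lambda h: prices_per_hour[h])
    return sorted(ranked[:hours_needed])

spot = [62, 55, 48, 47, 51, 70, 95, 110, 120, 105, 90, 76,
        74, 72, 70, 80, 98, 125, 130, 112, 96, 84, 75, 68]  # EUR/MWh, hypothetical
schedule = cheapest_hours(spot, hours_needed=8)
print(schedule)                         # run during the 8 cheapest hours
print(sum(spot[h] for h in schedule))   # electricity cost proxy for the schedule
```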

  17. 9 CFR 381.303 - Critical factors and the application of the process schedule.

    Science.gov (United States)

    2010-01-01

    ... PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION POULTRY PRODUCTS INSPECTION REGULATIONS... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Critical factors and the application of the process schedule. 381.303 Section 381.303 Animals and Animal Products FOOD SAFETY AND...

  18. 9 CFR 318.303 - Critical factors and the application of the process schedule.

    Science.gov (United States)

    2010-01-01

    ... of the process schedule. 318.303 Section 318.303 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY...; REINSPECTION AND PREPARATION OF PRODUCTS Canning and Canned Products § 318.303 Critical factors and the...

  19. Design and development of cell queuing, processing, and scheduling modules for the iPOINT input-buffered ATM testbed

    Science.gov (United States)

    Duan, Haoran

    1997-12-01

    This dissertation presents the concepts, principles, performance, and implementation of input queuing and cell-scheduling modules for the Illinois Pulsar-based Optical INTerconnect (iPOINT) input-buffered Asynchronous Transfer Mode (ATM) testbed. Input queuing (IQ) ATM switches are well suited to meet the requirements of current and future ultra-broadband ATM networks. The IQ structure imposes minimum memory bandwidth requirements for cell buffering, tolerates bursty traffic, and utilizes memory efficiently for multicast traffic. The lack of efficient cell queuing and scheduling solutions has been a major barrier to building high-performance, scalable IQ-based ATM switches. This dissertation proposes a new Three-Dimensional Queue (3DQ) and a novel Matrix Unit Cell Scheduler (MUCS) to remove this barrier. 3DQ uses a linked-list architecture based on Synchronous Random Access Memory (SRAM) to combine the individual advantages of per-virtual-circuit (per-VC) queuing, priority queuing, and N-destination queuing. It avoids Head of Line (HOL) blocking and provides per-VC Quality of Service (QoS) enforcement mechanisms. Computer simulation results verify the QoS capabilities of 3DQ. For multicast traffic, 3DQ provides efficient usage of cell buffering memory by storing multicast cells only once. Further, the multicast mechanism of 3DQ prevents a congested destination port from blocking other less-loaded ports. The 3DQ principle has been prototyped in the Illinois Input Queue (iiQueue) module. Using Field Programmable Gate Array (FPGA) devices and SRAM modules, integrated on a Printed Circuit Board (PCB), iiQueue can process incoming traffic at 800 Mb/s. Using faster circuit technology, the same design is expected to operate at the OC-48 rate (2.5 Gb/s). MUCS resolves the output contention by evaluating the weight index of each candidate and selecting the heaviest. It achieves near-optimal scheduling and has a very short response time. The algorithm originates from a

  20. Proposal of Heuristic Algorithm for Scheduling of Print Process in Auto Parts Supplier

    Science.gov (United States)

    Matsumoto, Shimpei; Okuhara, Koji; Ueno, Nobuyuki; Ishii, Hiroaki

    We are interested in the print process within the manufacturing operations of an auto parts supplier as a practical problem. The purpose of this research is to apply our scheduling technique, developed at the university, to an actual print process in a mass customization environment. Rationalization of the print process depends on the lot sizing. The manufacturing lead time of the print process is long, and in the present method production is planned based on workers' experience and intuition. The construction of an efficient production system is therefore an urgent problem. In this paper, in order to shorten the overall manufacturing lead time and reduce stock, we reexamine the usual heuristic lot sizing rule and propose an improved method that can produce a more efficient schedule.

  1. New heating schedule in hydrogen annealing furnace based on process simulation for less energy consumption

    International Nuclear Information System (INIS)

    Saboonchi, Ahmad; Hassanpour, Saeid; Abbasi, Shahram

    2008-01-01

    Cold rolled steel coils are annealed in batch furnaces to obtain desirable mechanical properties. Annealing operations involve heating and cooling cycles which take a long time due to the high weight of the coils being annealed. To reduce annealing time, a simulation code was developed that is capable of evaluating more effective schedules for annealing coils during the heating process. This code is additionally capable of accurately determining the furnace turn-off time for different coil weights and charge dimensions. After studying many heating schedules and considering the heat transfer mechanism in the annealing furnace, a new schedule with the most advantages was selected as the new operating condition in the hydrogen annealing plant. The performance of all the furnaces was adjusted to the new heating schedule after experiments had been carried out to ensure the accuracy of the code and the fitness of the new operating condition. Comparison of similar yields of cold rolled coils over two months revealed that specific energy consumption of furnaces under the new heating schedule decreased by 11%, heating cycle time by 16%, and hydrogen consumption by 14%

  2. New heating schedule in hydrogen annealing furnace based on process simulation for less energy consumption

    Energy Technology Data Exchange (ETDEWEB)

    Saboonchi, Ahmad [Department of Mechanical Engineering, Isfahan University of Technology, Isfahan 84154 (Iran); Hassanpour, Saeid [Rayan Tahlil Sepahan Co., Isfahan Science and Technology Town, Isfahan 84155 (Iran); Abbasi, Shahram [R and D Department, Mobarakeh Steel Complex, Isfahan (Iran)

    2008-11-15

    Cold rolled steel coils are annealed in batch furnaces to obtain desirable mechanical properties. Annealing operations involve heating and cooling cycles which take a long time due to the high weight of the coils being annealed. To reduce annealing time, a simulation code was developed that is capable of evaluating more effective schedules for annealing coils during the heating process. This code is additionally capable of accurately determining the furnace turn-off time for different coil weights and charge dimensions. After studying many heating schedules and considering the heat transfer mechanism in the annealing furnace, a new schedule with the most advantages was selected as the new operating condition in the hydrogen annealing plant. The performance of all the furnaces was adjusted to the new heating schedule after experiments had been carried out to ensure the accuracy of the code and the fitness of the new operating condition. Comparison of similar yields of cold rolled coils over two months revealed that specific energy consumption of furnaces under the new heating schedule decreased by 11%, heating cycle time by 16%, and hydrogen consumption by 14%. (author)

  3. Plant process computer replacements - techniques to limit installation schedules and costs

    International Nuclear Information System (INIS)

    Baker, M.D.; Olson, J.L.

    1992-01-01

    Plant process computer systems, a standard fixture in all nuclear power plants, are used to monitor and display important plant process parameters. Scanning thousands of field sensors and alarming out-of-limit values, these computer systems are heavily relied on by control room operators. The original nuclear steam supply system (NSSS) vendor for the power plant often supplied the plant process computer. Designed using sixties and seventies technology, a plant's original process computer has been obsolete for some time. Driven by increased maintenance costs and new US Nuclear Regulatory Commission regulations such as NUREG-0737, Suppl. 1, many utilities have replaced their process computers with more modern computer systems. Given that computer systems are by their nature prone to rapid obsolescence, this replacement cycle will likely repeat. A process computer replacement project can be a significant capital expenditure and must be performed during a scheduled refueling outage. The object of the installation process is to install a working system on schedule. Experience gained by supervising several computer replacement installations has taught lessons that, if applied, will shorten the schedule and limit the risk of costly delays. Examples illustrating this technique are given. This paper and these examples deal only with the installation process and assume that the replacement computer system has been adequately designed, and development and factory tested

  4. A priority-based heuristic algorithm (PBHA) for optimizing the integrated process planning and scheduling problem

    Directory of Open Access Journals (Sweden)

    Muhammad Farhan Ausaf

    2015-12-01

    Full Text Available Process planning and scheduling are two important components of a manufacturing setup, and it is important to integrate them to achieve better global optimality and improved system performance. To find optimal solutions for the integrated process planning and scheduling (IPPS) problem, numerous algorithm-based approaches exist. Most of these approaches try to use existing meta-heuristic algorithms for solving the IPPS problem. Although these approaches have been shown to be effective in optimizing the IPPS problem, there is still room for improvement in terms of solution quality and algorithm efficiency, especially for more complicated problems. Dispatching rules have been successfully utilized for solving complicated scheduling problems, but have not been considered extensively for the IPPS problem. This approach incorporates dispatching rules with the concept of prioritizing jobs in an algorithm called the priority-based heuristic algorithm (PBHA). PBHA tries to establish job and machine priorities for selecting operations. Priority assignment and a set of dispatching rules are used simultaneously to generate both the process plans and schedules for all jobs and machines. The algorithm was tested for a series of benchmark problems. The proposed algorithm was able to achieve superior results for most complex problems presented in recent literature while utilizing fewer computational resources.
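
    A minimal sketch of the kind of priority-plus-dispatching-rule selection PBHA builds on: jobs with higher priority are served first and ties are broken by shortest processing time (SPT). The real algorithm also chooses among alternative machines and process plans, which is omitted here; all values are hypothetical.

```python
# Sketch: pick the next operation by job priority, breaking ties with the
# SPT dispatching rule. This isolates one idea (priority + dispatching
# rule) from the full PBHA, which also handles machine/plan selection.

ready_ops = [
    # (job, operation, priority, processing_time)
    ("J1", "op2", 2, 5.0),
    ("J2", "op1", 1, 3.0),
    ("J3", "op1", 2, 2.5),
]

def pick_next(ops):
    # Highest priority first; among equals, shortest processing time.
    return min(ops, key=lambda o: (-o[2], o[3]))

print(pick_next(ready_ops))  # ('J3', 'op1', 2, 2.5)
```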

  5. Energy Efficient Scheduling of Real Time Signal Processing Applications through Combined DVFS and DPM

    OpenAIRE

    Nogues , Erwan; Pelcat , Maxime; Menard , Daniel; Mercat , Alexandre

    2016-01-01

    This paper proposes a framework to design energy-efficient signal processing systems. The energy efficiency is provided by combining Dynamic Voltage and Frequency Scaling (DVFS) and Dynamic Power Management (DPM). The framework is based on Synchronous Dataflow (SDF) modeling of signal processing applications. A transformation to a single-rate form is performed to expose the application parallelism. An automated scheduling is then performed, minimizing the constraint of...
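
    To illustrate the DVFS half of the combination, the sketch below picks the lowest frequency that still meets a deadline, since dynamic power grows superlinearly with frequency. The cubic power model and all constants are textbook simplifications, not the paper's model.

```python
# Sketch: the DVFS decision in isolation -- choose the lowest frequency
# that still meets the deadline. Power model: dynamic part ~ k * f^3 plus
# a constant static power; both constants are illustrative only.

def choose_frequency(cycles, deadline_s, freqs_hz):
    """Lowest frequency meeting the deadline; None if infeasible."""
    for f in sorted(freqs_hz):
        if cycles / f <= deadline_s:
            return f
    return None

def energy(cycles, f, k=1e-27, p_static=0.05):
    t = cycles / f
    return (k * f**3 + p_static) * t  # dynamic + static energy over the runtime

freqs = [0.4e9, 0.8e9, 1.2e9, 1.6e9]              # hypothetical operating points
f = choose_frequency(cycles=3e8, deadline_s=0.5, freqs_hz=freqs)
print(f, energy(3e8, f))  # 0.8 GHz meets the 0.5 s deadline at lower energy
```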

  6. Iterative Relay Scheduling with Hybrid ARQ under Multiple User Equipment (Type II) Relay Environments

    KAUST Repository

    Nam, Sung Sik; Alouini, Mohamed-Slim; Choi, Seyeong

    2018-01-01

    -generation cellular systems (e.g., LTE-Advanced and beyond). The proposed IRS-HARQ aims to increase the achievable data rate by iteratively scheduling a relatively better UE relay closer to the end user in a probabilistic sense, provided that the relay-to-end user

  7. Refinery scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Magalhaes, Marcus V.; Fraga, Eder T. [PETROBRAS, Rio de Janeiro, RJ (Brazil); Shah, Nilay [Imperial College, London (United Kingdom)

    2004-07-01

    This work addresses the refinery scheduling problem using mathematical programming techniques. The solution adopted was to decompose the entire refinery model into a crude oil scheduling problem and a product scheduling problem. The envelope for the crude oil scheduling problem is composed of a terminal, a pipeline, and the crude area of a refinery, including the crude distillation units. The solution method adopted includes a decomposition technique based on the topology of the system. The envelope for the product scheduling comprises all tanks, process units, and products found in a refinery. Once crude oil scheduling decisions are available, the product scheduling problem is solved using a rolling horizon algorithm. All models were tested with real data from PETROBRAS' REFAP refinery, located in Canoas, Southern Brazil. (author)

  8. An Artificial Bee Colony Algorithm for the Job Shop Scheduling Problem with Random Processing Times

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2011-09-01

    Full Text Available Due to the influence of unpredictable random events, the processing time of each operation should be treated as a random variable if we aim at a robust production schedule. However, compared with the extensive research on the deterministic model, the stochastic job shop scheduling problem (SJSSP) has not received sufficient attention. In this paper, we propose an artificial bee colony (ABC) algorithm for the SJSSP with the objective of minimizing the maximum lateness (an index of service quality). First, we propose a performance estimate for preliminary screening of the candidate solutions. Then, the K-armed bandit model is utilized to reduce the computational burden in the exact evaluation process (through Monte Carlo simulation). Finally, the computational results on different-scale test problems validate the effectiveness and efficiency of the proposed approach.
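
    The Monte Carlo evaluation step can be sketched as follows. For simplicity this evaluates a fixed job sequence on a single machine rather than a full job shop: processing times are sampled and the expected maximum lateness is estimated by averaging over replications; distributions and numbers are hypothetical.

```python
# Sketch: Monte Carlo estimation of expected maximum lateness for one
# candidate sequence, with uniformly distributed processing times as a
# stand-in for whatever stochastic model the application dictates.

import random

def max_lateness(sequence, due, sampled_pt):
    t, worst = 0.0, float("-inf")
    for j in sequence:
        t += sampled_pt[j]
        worst = max(worst, t - due[j])
    return worst

def estimate(sequence, due, pt_mean, pt_spread, reps=2000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        sample = {j: rng.uniform(pt_mean[j] - pt_spread, pt_mean[j] + pt_spread)
                  for j in sequence}
        total += max_lateness(sequence, due, sample)
    return total / reps

jobs = ["J1", "J2", "J3"]
due = {"J1": 4, "J2": 9, "J3": 12}
mean = {"J1": 3, "J2": 5, "J3": 4}
print(estimate(jobs, due, mean, pt_spread=1.0))  # expected maximum lateness
```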

  9. Hypergraph+: An Improved Hypergraph-Based Task-Scheduling Algorithm for Massive Spatial Data Processing on Master-Slave Platforms

    Directory of Open Access Journals (Sweden)

    Bo Cheng

    2016-08-01

    Full Text Available Spatial data processing often requires massive datasets, and the task/data scheduling efficiency of these applications has an impact on the overall processing performance. Among the existing scheduling strategies, hypergraph-based algorithms capture the data sharing pattern in a global way and significantly reduce total communication volume. Due to heterogeneous processing platforms, however, a single hypergraph partitioning for later scheduling may not be optimal. Moreover, these scheduling algorithms neglect the overlap between task execution and data transfer that could further decrease execution time. In order to address these problems, an extended hypergraph-based task-scheduling algorithm, named Hypergraph+, is proposed for massive spatial data processing. Hypergraph+ improves upon current hypergraph scheduling algorithms in two ways: (1) it takes platform heterogeneity into consideration, offering a metric function to evaluate the partitioning quality in order to derive the best task/file schedule; and (2) it can maximize the overlap between communication and computation. The GridSim toolkit was used to evaluate Hypergraph+ in an IDW spatial interpolation application on heterogeneous master-slave platforms. Experiments illustrate that the proposed Hypergraph+ algorithm achieves on average a 43% smaller makespan than the original hypergraph scheduling algorithm but still preserves high scheduling efficiency.

  10. Iterative Relay Scheduling with Hybrid ARQ under Multiple User Equipment (Type II) Relay Environments

    KAUST Repository

    Nam, Sung Sik

    2018-01-09

    In this work, we propose an iterative relay scheduling with hybrid ARQ (IRS-HARQ) scheme which realizes fast jump-in/successive relaying and subframe-based decoding under multiple user equipment (UE) relay environments applicable to next-generation cellular systems (e.g., LTE-Advanced and beyond). The proposed IRS-HARQ aims to increase the achievable data rate by iteratively scheduling a relatively better UE relay closer to the end user in a probabilistic sense, provided that the relay-to-end user link is operated in an open-loop and transparent mode. The latter is due to the fact that there are no dedicated control channels between the UE relay and the end user and no new cell is created. Under this open-loop and transparent mode, our proposed protocol is implemented by partially exploiting the channel state information based on the overhearing mechanism of ACK/NACK for HARQ. Further, the iterative scheduling enables UE-to-UE direct communication with proximity that offers spatial frequency reuse and energy saving.

  11. An Improved Hierarchical Genetic Algorithm for Sheet Cutting Scheduling with Process Constraints

    OpenAIRE

    Yunqing Rao; Dezhong Qi; Jinling Li

    2013-01-01

    For the first time, an improved hierarchical genetic algorithm for sheet cutting problem which involves n cutting patterns for m non-identical parallel machines with process constraints has been proposed in the integrated cutting stock model. The objective of the cutting scheduling problem is minimizing the weighted completed time. A mathematical model for this problem is presented, an improved hierarchical genetic algorithm (ant colony—hierarchical genetic algorithm) is developed for better ...

  12. Decoupling algorithms from schedules for easy optimization of image processing pipelines

    OpenAIRE

    Adams, Andrew; Paris, Sylvain; Levoy, Marc; Ragan-Kelley, Jonathan Millar; Amarasinghe, Saman P.; Durand, Fredo

    2012-01-01

    Using existing programming tools, writing high-performance image processing code requires sacrificing readability, portability, and modularity. We argue that this is a consequence of conflating the computations that define the algorithm with decisions about storage and the order of computation. We refer to these latter two concerns as the schedule, including choices of tiling, fusion, recomputation vs. storage, vectorization, and parallelism. We propose a representation for feed-forward imagi...

  13. Concurrent processes scheduling with scarce resources in small and medium enterprises

    Institute of Scientific and Technical Information of China (English)

    马嵩华

    2016-01-01

    Scarce resources, precedence, and non-determined time-lag are three constraints commonly found in small and medium manufacturing enterprises (SMEs), and they are deemed to block the application of workflow management systems (WfMS). To tackle this problem, a workflow scheduling approach is proposed based on timing workflow net (TWF-net) and genetic algorithm (GA). The workflow is modelled in the form of a TWF-net in favour of process simulation and resource conflict checking. After simplifying and reconstructing the set of workflow instances, the conflict resolution problem is transformed into a resource-constrained project scheduling problem (RCPSP), which can be efficiently solved by a heuristic method such as GA. Finally, problems of various sizes are utilized to test the performance of the proposed algorithm and to compare it with a first-come-first-served (FCFS) strategy. The evaluation demonstrates that the proposed method is an effective approach for scheduling concurrent processes with precedence and resource constraints.

  14. Schedules of Controlled Substances: Placement of FDA-Approved Products of Oral Solutions Containing Dronabinol [(-)-delta-9-transtetrahydrocannabinol (delta-9-THC)] in Schedule II. Interim final rule, with request for comments.

    Science.gov (United States)

    2017-03-23

    On July 1, 2016, the U.S. Food and Drug Administration (FDA) approved a new drug application for Syndros, a drug product consisting of dronabinol [(-)-delta-9-trans-tetrahydrocannabinol (delta-9-THC)] oral solution. Thereafter, the Department of Health and Human Services (HHS) provided the Drug Enforcement Administration (DEA) with a scheduling recommendation that would result in Syndros (and other oral solutions containing dronabinol) being placed in schedule II of the Controlled Substances Act (CSA). In accordance with the CSA, as revised by the Improving Regulatory Transparency for New Medical Therapies Act, DEA is hereby issuing an interim final rule placing FDA-approved products of oral solutions containing dronabinol in schedule II of the CSA.

  15. Analyzing the nursing organizational structure and process from a scheduling perspective.

    Science.gov (United States)

    Maenhout, Broos; Vanhoucke, Mario

    2013-09-01

    The efficient and effective management of nursing personnel is of critical importance in a hospital's environment, as nursing accounts for approximately 25% of a hospital's operational costs. The nurse organizational structure and the organizational processes highly affect the nurses' working conditions and the quality of care provided. In this paper, we investigate the impact of different nurse organization structures and different organizational processes for a real-life situation in a Belgian university hospital. In order to make accurate nurse staffing decisions, the employed solution methodology incorporates shift scheduling characteristics to overcome the deficiencies of the many phase-specific methodologies proposed in the academic literature.

  16. 21 CFR 1301.34 - Application for importation of Schedule I and II substances.

    Science.gov (United States)

    2010-04-01

    ... light of changes in: (i) raw materials and other costs and (ii) conditions of supply and demand; (2) The... controlled substances to a number of establishments which can produce an adequate and uninterrupted supply of..., and industrial purposes; (2) Compliance with applicable State and local law; (3) Promotion of...

  17. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    Science.gov (United States)

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system based on intuitionistic fuzzy sets for the process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of all the processes in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
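
    A minimal sketch of the overall flow, using plain fuzzy memberships: fuzzify the nice value and burst time, combine them into a dynamic priority (dp), and sort the ready queue by dp. The membership functions and weights are invented placeholders, and the intuitionistic non-membership degree used in the paper is omitted.

```python
# Sketch: fuzzify inputs, compute a dynamic priority (dp), sort the ready
# queue by dp, run the head of the queue. Plain-fuzzy stand-in for the
# intuitionistic fuzzy inference engine described above.

def low(x, lo, hi):
    """Degree in [0, 1] to which x is 'low' on the interval [lo, hi]."""
    return max(0.0, min(1.0, (hi - x) / (hi - lo)))

def dynamic_priority(nice, burst):
    # Rule family: short bursts and low (polite) nice values both raise dp.
    short_burst = low(burst, 1, 50)     # burst time in ticks, hypothetical range
    polite_nice = low(nice, -20, 19)    # POSIX-style nice range
    return 0.6 * short_burst + 0.4 * polite_nice  # illustrative weights

ready = {"P1": (0, 12), "P2": (10, 4), "P3": (-5, 30)}  # name: (nice, burst)
order = sorted(ready, key=lambda p: dynamic_priority(*ready[p]), reverse=True)
print(order)  # the process with the highest dp runs first
```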

  18. Effects of practice schedule and task specificity on the adaptive process of motor learning.

    Science.gov (United States)

    Barros, João Augusto de Camargo; Tani, Go; Corrêa, Umberto Cesar

    2017-10-01

    This study investigated the effects of practice schedule and task specificity from the perspective of the adaptive process of motor learning. For this purpose, tasks with temporal and force control learning requirements were manipulated in experiments 1 and 2, respectively. Specifically, the task consisted of touching three sequential targets with the dominant hand with a specific movement time or force for each touch. Participants were children (N=120), both boys and girls, with an average age of 11.2 years (SD=1.0). The design in both experiments involved four practice groups (constant, random, constant-random, and random-constant) and two phases (stabilisation and adaptation). The dependent variables included measures related to the task goal (accuracy and variability of error of the overall movement and force patterns) and to the movement pattern (macro- and microstructures). Results revealed a similar error of the overall patterns for all groups in both experiments, and the groups adapted themselves differently in terms of the macro- and microstructures of movement patterns. The study concludes that the effects of practice schedules on the adaptive process of motor learning were both general and specific to the task: general with respect to task goal performance and specific with respect to the movement pattern.

  19. A Flexible Job Shop Scheduling Problem with Controllable Processing Times to Optimize Total Cost of Delay and Processing

    Directory of Open Access Journals (Sweden)

    Hadi Mokhtari

    2015-11-01

    Full Text Available In this paper, the flexible job shop scheduling problem with machine flexibility and controllable processing times is studied. The main idea is that the processing times of operations may be controlled by consumption of additional resources. The purpose of this paper is to find the best trade-off between processing cost and delay cost in order to minimize the total costs. The proposed model, flexible job shop scheduling with controllable processing times (FJCPT), is formulated as an integer non-linear programming (INLP) model and then converted into an integer linear programming (ILP) model. Due to the NP-hardness of FJCPT, conventional analytic optimization methods are not efficient. Hence, in order to solve the problem, a Scatter Search (SS), an efficient metaheuristic method, is developed. To show the effectiveness of the proposed method, numerical experiments are conducted. The efficiency of the proposed algorithm is compared with that of a genetic algorithm (GA) available in the literature for solving the FJSP problem. The results show that the proposed SS provides better solutions than the existing GA.

  20. Biogenesis and proteolytic processing of lysosomal DNase II.

    Directory of Open Access Journals (Sweden)

    Susumu Ohkouchi

    Full Text Available Deoxyribonuclease II (DNase II) is a key enzyme in the phagocytic digestion of DNA from apoptotic nuclei. To understand the molecular properties of DNase II, particularly its processing, we prepared a polyclonal antibody against carboxyl-terminal sequences of mouse DNase II. In the present study, partial purification of DNase II using Con A Sepharose enabled the detection of endogenous DNase II by Western blotting. Interestingly, two forms of endogenous DNase II were detected: a 30 kDa form and a 23 kDa form. Neither form carried the expected molecular weight of 45 kDa. Subcellular fractionation showed that the 23 kDa and 30 kDa proteins were localized in lysosomes. The processing of DNase II in vivo was also greatly altered in the liver of mice lacking cathepsin L. DNase II extracellularly secreted from cells overexpressing DNase II was detected as a pro-form, which was activated under acidic conditions. These results indicate that DNase II is processed and activated in lysosomes, and that cathepsin L is involved in the processing of the enzyme.

  1. An Efficient Randomized Algorithm for Real-Time Process Scheduling in PicOS Operating System

    Science.gov (United States)

    Helmy, Tarek; Fatai, Anifowose; Sallam, El-Sayed

    PicOS is an event-driven operating environment designed for use with embedded networked sensors. More specifically, it is designed to support the concurrency-intensive operations required by networked sensors with minimal hardware requirements. The existing process scheduling algorithms of PicOS, a commercial tiny, low-footprint, real-time operating system, have their associated drawbacks. An efficient alternative algorithm, based on a randomized selection policy, has been proposed, demonstrated, confirmed to be efficient and fair on average, and recommended for implementation in PicOS. Simulations were carried out, and performance measures such as Average Waiting Time (AWT) and Average Turn-around Time (ATT) were used to assess the efficiency of the proposed randomized version over the existing ones. The results show that the randomized algorithm is the most attractive for implementation in PicOS, since it is the fairest and has the least AWT and ATT on average among the non-preemptive scheduling algorithms implemented in this paper.
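
    The comparison can be sketched with a small simulation: a non-preemptive scheduler that picks the next process uniformly at random versus FCFS, with AWT and ATT computed in the usual way. Burst times are hypothetical and all processes are assumed to arrive at time zero.

```python
# Sketch: non-preemptive scheduling under a randomized selection policy
# versus FCFS, reporting average waiting time (AWT) and average
# turn-around time (ATT) for a batch of processes arriving at t=0.

import random

def run(bursts, policy_random=True, seed=7):
    rng = random.Random(seed)
    pending = list(range(len(bursts)))
    clock, wait, turnaround = 0.0, [], []
    while pending:
        i = pending.pop(rng.randrange(len(pending))) if policy_random else pending.pop(0)
        wait.append(clock)        # time spent waiting before this burst starts
        clock += bursts[i]
        turnaround.append(clock)  # completion time = turn-around time at t=0 arrival
    n = len(bursts)
    return sum(wait) / n, sum(turnaround) / n  # (AWT, ATT)

bursts = [8, 3, 12, 5]  # hypothetical burst times
print("random:", run(bursts, True))
print("fcfs:  ", run(bursts, False))
```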

  2. Space network scheduling benchmark: A proof-of-concept process for technology transfer

    Science.gov (United States)

    Moe, Karen; Happell, Nadine; Hayden, B. J.; Barclay, Cathy

    1993-01-01

    This paper describes a detailed proof-of-concept activity to evaluate flexible scheduling technology as implemented in the Request Oriented Scheduling Engine (ROSE) and applied to Space Network (SN) scheduling. The criteria developed for an operational evaluation of a reusable scheduling system is addressed including a methodology to prove that the proposed system performs at least as well as the current system in function and performance. The improvement of the new technology must be demonstrated and evaluated against the cost of making changes. Finally, there is a need to show significant improvement in SN operational procedures. Successful completion of a proof-of-concept would eventually lead to an operational concept and implementation transition plan, which is outside the scope of this paper. However, a high-fidelity benchmark using actual SN scheduling requests has been designed to test the ROSE scheduling tool. The benchmark evaluation methodology, scheduling data, and preliminary results are described.

  3. Hybrid Metaheuristics for Solving a Fuzzy Single Batch-Processing Machine Scheduling Problem

    Directory of Open Access Journals (Sweden)

    S. Molla-Alizadeh-Zavardehi

    2014-01-01

    Full Text Available This paper deals with the problem of minimizing the total weighted tardiness of jobs in a real-world single batch-processing machine (SBPM) scheduling in the presence of fuzzy due dates. In this paper, first a fuzzy mixed integer linear programming model is developed. Then, due to the complexity of the problem, which is NP-hard, we design two hybrid metaheuristics called GA-VNS and VNS-SA applying the advantages of genetic algorithm (GA), variable neighborhood search (VNS), and simulated annealing (SA) frameworks. In addition, we propose three fuzzy earliest due date heuristics to solve the given problem. Through computational experiments with several random test problems, a robust calibration is applied to the parameters. Finally, computational results on different-scale test problems are presented to compare the proposed algorithms.

  4. An Improved Hierarchical Genetic Algorithm for Sheet Cutting Scheduling with Process Constraints

    Directory of Open Access Journals (Sweden)

    Yunqing Rao

    2013-01-01

    Full Text Available For the first time, an improved hierarchical genetic algorithm for the sheet cutting problem which involves n cutting patterns for m non-identical parallel machines with process constraints has been proposed in the integrated cutting stock model. The objective of the cutting scheduling problem is minimizing the weighted completion time. A mathematical model for this problem is presented, an improved hierarchical genetic algorithm (ant colony—hierarchical genetic algorithm) is developed for better solutions, and a hierarchical coding method is used based on the characteristics of the problem. Furthermore, to speed up convergence rates and resolve local convergence issues, a kind of adaptive crossover probability and mutation probability is used in this algorithm. The computational results and comparison prove that the presented approach is quite effective for the considered problem.

  5. An improved hierarchical genetic algorithm for sheet cutting scheduling with process constraints.

    Science.gov (United States)

    Rao, Yunqing; Qi, Dezhong; Li, Jinling

    2013-01-01

    For the first time, an improved hierarchical genetic algorithm for the sheet cutting problem which involves n cutting patterns for m non-identical parallel machines with process constraints has been proposed in the integrated cutting stock model. The objective of the cutting scheduling problem is minimizing the weighted completion time. A mathematical model for this problem is presented, an improved hierarchical genetic algorithm (ant colony--hierarchical genetic algorithm) is developed for better solutions, and a hierarchical coding method is used based on the characteristics of the problem. Furthermore, to speed up convergence rates and resolve local convergence issues, a kind of adaptive crossover probability and mutation probability is used in this algorithm. The computational results and comparison prove that the presented approach is quite effective for the considered problem.

  6. Job schedulers for Big data processing in Hadoop environment: testing real-life schedulers using benchmark programs

    OpenAIRE

    Mohd Usama; Mengchen Liu; Min Chen

    2017-01-01

    At present, big data is very popular, because it has proved to be very successful in many fields such as social media and E-commerce transactions. Big data describes the tools and technologies needed to capture, manage, store, distribute, and analyze petabyte or larger-sized datasets having different structures at high speed. Big data can be structured, unstructured, or semi-structured. Hadoop is an open source framework that is used to process large amounts of data in an inexpensive and ...

  7. Two-Agent Single-Machine Scheduling of Jobs with Time-Dependent Processing Times and Ready Times

    Directory of Open Access Journals (Sweden)

    Jan-Yee Kung

    2013-01-01

    Full Text Available Scheduling involving jobs with time-dependent processing times has recently attracted much research attention. However, multiagent scheduling with simultaneous consideration of jobs with time-dependent processing times and ready times is relatively unexplored. Inspired by this observation, we study a two-agent single-machine scheduling problem in which the jobs have both time-dependent processing times and ready times. We consider the model in which the actual processing time of a job of the first agent is a decreasing function of its scheduled position, while the actual processing time of a job of the second agent is an increasing function of its scheduled position. In addition, each job has a different ready time. The objective is to minimize the total completion time of the jobs of the first agent with the restriction that no tardy job is allowed for the second agent. We propose a branch-and-bound algorithm and several genetic algorithms to obtain optimal and near-optimal solutions for the problem, respectively. We also present extensive computational results to test the proposed algorithms and examine the impacts of different problem parameters on their performance.
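
    A minimal sketch of evaluating one candidate sequence under this model: agent-1 jobs follow a position-dependent decreasing (learning) function, agent-2 jobs an increasing (deterioration) function, and ready times are respected. The exponents and data are illustrative, not the paper's values.

```python
# Sketch: completion times for one sequence under position-dependent
# actual processing times p * r^a (a < 0 for learning, a > 0 for
# deterioration) with per-job ready times.

def evaluate(sequence, base_pt, ready, agent, a1=-0.2, a2=0.15):
    clock, completion = 0.0, {}
    for r, job in enumerate(sequence, start=1):
        exp = a1 if agent[job] == 1 else a2
        actual = base_pt[job] * (r ** exp)        # position-dependent time
        clock = max(clock, ready[job]) + actual   # wait for the ready time
        completion[job] = clock
    return completion

seq = ["A1", "B1", "A2"]                               # hypothetical sequence
base = {"A1": 4.0, "B1": 6.0, "A2": 5.0}
ready = {"A1": 0.0, "B1": 1.0, "A2": 3.0}
agent = {"A1": 1, "B1": 2, "A2": 1}
done = evaluate(seq, base, ready, agent)
print(done, sum(v for k, v in done.items() if agent[k] == 1))  # agent-1 total completion time
```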

  8. Process optimization and mechanistic studies of lead (II): Aspergillus ...

    African Journals Online (AJOL)

    The lead (II) accumulation potential of various biosorbent had been widely studied in the last few years, but an outstanding Pb(II) accumulating biomass still seems crucial for bringing the process to a successful application stage. This investigation describes the use of non-living biomass of Aspergillus caespitosus for ...

  9. Processing time tolerance-based ACO algorithm for solving job-shop scheduling problem

    Science.gov (United States)

    Luo, Yabo; Waden, Yongo P.

    2017-06-01

    Ordinarily, the Job Shop Scheduling Problem (JSSP) is known as an NP-hard problem whose uncertainty and complexity cannot be handled by a linear method. Thus, current studies on the JSSP concentrate mainly on applying different methods of improving heuristics for optimizing the JSSP. However, there still exist many obstacles to efficient optimization in the JSSP, namely low efficiency and poor reliability, which can easily trap the optimization process of the JSSP into local optima. Therefore, to solve this problem, a study on an Ant Colony Optimization (ACO) algorithm combined with constraint handling tactics is carried out in this paper. The problem is subdivided into three parts: (1) analysis of processing time tolerance-based constraint features in the JSSP, performed with the constraint satisfaction model; (2) satisfying the constraints by considering consistency technology and the constraint spreading algorithm in order to improve the performance of the ACO algorithm, from which the JSSP model based on the improved ACO algorithm is constructed; and (3) demonstrating the effectiveness of the proposed method, in terms of reliability and efficiency, through comparative experiments performed on benchmark problems. The results obtained by the proposed method are better, and the applied technique can be used in optimizing the JSSP.

  10. Mars Aqueous Processing System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The Mars Aqueous Processing System (MAPS) is a novel technology for recovering oxygen, iron, and other constituents from lunar and Mars soils. The closed-loop...

  11. Task Balanced Workflow Scheduling Technique considering Task Processing Rate in Spot Market

    Directory of Open Access Journals (Sweden)

    Daeyong Jung

    2014-01-01

    Full Text Available Recently, cloud computing has become a popular computing paradigm, constituting an advanced computing environment that evolved from distributed computing. Cloud computing provides computing resources in a pay-as-you-go manner. For example, Amazon EC2 offers Infrastructure-as-a-Service (IaaS) instances in three different ways, with different prices, reliability, and performance. Our study is based on an environment using spot instances. Spot instances can significantly decrease costs compared to reserved and on-demand instances. However, spot instances provide a less reliable environment than other instances. In this paper, we propose a workflow scheduling scheme that reduces the out-of-bid situation and consequently decreases the total task completion time. The simulation results reveal that, compared to various instance types, our scheme achieves performance improvements in terms of an average combined metric of 12.76% over a workflow scheme that does not consider the processing rate. However, the cost of our scheme is higher than that of a low-performance instance and lower than that of a high-performance instance.

  12. Magnetite Dissolution Performance of HYBRID-II Decontamination Process

    International Nuclear Information System (INIS)

    Kim, Seonbyeong; Lee, Woosung; Won, Huijun; Moon, Jeikwon; Choi, Wangkyu

    2014-01-01

    In this study, we conducted a magnetite dissolution performance test of HYBRID-II (Hydrazine Based Reductive metal Ion Decontamination with sulfuric acid) as part of decontamination process development. The decontamination performance of the HYBRID process was successfully tested, with results showing an acceptable decontamination factor (DF), in a previous study. While follow-up studies such as the decomposition of the post-decontamination HYBRID solution and corrosion compatibility with the substrate metals of the target reactor coolant system have continued, we also sought an alternative version of the HYBRID process suitable especially for decommissioning. Inspired by the relationship between the radius of a reacting ion and its reactivity, we replaced the nitrate ion in HYBRID with the bigger sulfate ion to accommodate the dissolution reaction, and named the result the HYBRID-II process. As a preliminary step toward assessing decontamination performance, we tested the magnetite dissolution performance of the developing HYBRID-II process and compared the results with those of the HYBRID process. The previously developed HYBRID process is known to have acceptable decontamination performance, but the relatively large volume of secondary waste induced by the anion exchange resin needed to treat nitrate ion is one of the problems in making the HYBRID process applicable. We therefore devised the HYBRID-II process using sulfuric acid and tested its dissolution of magnetite under numerous conditions. From the results shown in this study, we can conclude that HYBRID-II improves the decontamination performance and potentially reduces the volume of secondary waste. Rigorous tests with metal oxide coupons obtained from a reactor coolant system will follow to prove the robustness of the HYBRID-II process

  13. Optimization of multi-objective integrated process planning and scheduling problem using a priority based optimization algorithm

    Science.gov (United States)

    Ausaf, Muhammad Farhan; Gao, Liang; Li, Xinyu

    2015-12-01

    For increasing the overall performance of modern manufacturing systems, effective integration of process planning and scheduling functions has been an important area of consideration among researchers. Owing to the complexity of handling process planning and scheduling simultaneously, most of the research work has been limited to solving the integrated process planning and scheduling (IPPS) problem for a single objective function. As there are many conflicting objectives when dealing with process planning and scheduling, real-world problems cannot be fully captured considering only a single objective for optimization. Therefore, considering the multi-objective IPPS (MOIPPS) problem is inevitable. Unfortunately, only a handful of research papers are available on solving the MOIPPS problem. In this paper, an optimization algorithm for solving the MOIPPS problem is presented. The proposed algorithm uses a set of dispatching rules coupled with priority assignment to optimize the IPPS problem for various objectives like makespan, total machine load, and total tardiness. A fixed-size external archive coupled with a crowding distance mechanism is used to store and maintain the non-dominated solutions. To compare the results with other algorithms, a C-metric-based method has been used. Instances from four recent papers have been solved to demonstrate the effectiveness of the proposed algorithm. The experimental results show that the proposed method is an efficient approach for solving the MOIPPS problem.

  14. A comparison of mixed-integer linear programming models for workforce scheduling with position-dependent processing times

    Science.gov (United States)

    Moreno-Camacho, Carlos A.; Montoya-Torres, Jairo R.; Vélez-Gallego, Mario C.

    2018-06-01

    Only a few studies in the available scientific literature address the problem of having a group of workers that do not share identical levels of productivity during the planning horizon. This study considers a workforce scheduling problem in which the actual processing time is a function of the scheduling sequence to represent the decline in workers' performance, evaluating two classical performance measures separately: makespan and maximum tardiness. Several mathematical models are compared with each other to highlight the advantages of each approach. The mathematical models are tested with randomly generated instances available from a public e-library.

  15. PROCESS OF CHANGES OF MAINTENANCE-FREE ONBOARD SYSTEM OPERATIONAL STATUS BETWEEN SCHEDULED MAINTENANCES

    Directory of Open Access Journals (Sweden)

    Andrey Mikhaylovich Bronnikov

    2017-01-01

    Full Text Available In this article the authors consider the problem of simulating the operational status changes, between scheduled maintenances, of a maintenance-free onboard aircraft system whose failure during flight leads to disaster. Onboard equipment that automatically self-repairs between routine maintenance when components fail is called maintenance-free. During operation, such onboard equipment accumulates failures while maintaining its functions with a safety level not lower than the required minimum. Troubleshooting is carried out either at the end of the between-maintenance period (as a rule), or after a failure that has led to the loss of function or to a decrease below the target level of flight safety (as an exception). The system contains both redundant and non-redundant units and elements with known failure rates. The system can be in one of three states: operable, extreme, or failed. The excess redundant elements allow the system to accumulate failures, which are repaired during routine maintenance. The process of system operational status changes is described with a discrete-continuous model in flight time. Based on information about the probabilities of the onboard equipment being in an operable, extreme, or failed state, it is possible to calculate such complex efficiency indicators as the average loss of sorties, the average operating costs, the expected number of emergency recovery operations, and others. Numerical studies have been conducted to validate the proposed model. It is assumed that maintenance work completely renews the system. The analysis of these indicators will allow evaluation of the maintenance-free aircraft equipment operation efficiency, as well as effectiveness comparisons with other methods of technical operation. The model can also be used to assess the performance of technical operation systems and to optimize the period between maintenances.

  16. Comparing Binaural Pre-processing Strategies II

    Directory of Open Access Journals (Sweden)

    Regina M. Baumgärtel

    2015-12-01

    Full Text Available Several binaural audio signal enhancement algorithms were evaluated with respect to their potential to improve speech intelligibility in noise for users of bilateral cochlear implants (CIs). 50% speech reception thresholds (SRT50) were assessed using an adaptive procedure in three distinct, realistic noise scenarios. All scenarios were highly nonstationary, complex, and included a significant amount of reverberation. Other aspects, such as the perfectly frontal target position, were idealized laboratory settings, allowing the algorithms to perform better than in corresponding real-world conditions. Eight bilaterally implanted CI users, wearing devices from three manufacturers, participated in the study. In all noise conditions, a substantial improvement in SRT50 compared to the unprocessed signal was observed for most of the algorithms tested, with the largest improvements generally provided by binaural minimum variance distortionless response (MVDR) beamforming algorithms. The largest overall improvement in speech intelligibility was achieved by an adaptive binaural MVDR in a spatially separated, single competing talker noise scenario. A no-pre-processing condition and adaptive differential microphones without a binaural link served as the two baseline conditions. SRT50 improvements provided by the binaural MVDR beamformers surpassed the performance of the adaptive differential microphones in most cases. Speech intelligibility improvements predicted by instrumental measures were shown to account for some but not all aspects of the perceptually obtained SRT50 improvements measured in bilaterally implanted CI users.

  17. Process chemistry of neptunium. Part II

    Energy Technology Data Exchange (ETDEWEB)

    Srinivasan, N.; Ramaniah, M. V.; Patil, S. K.; Ramakrishna, V. V.; Swarup, R.; Chadha, A.; Avadhany, G. V.N.

    1974-07-01

    The oxidation state analysis of neptunium in the aqueous feed solution from the Plutonium Plant at Trombay was carried out and it was found that neptunium existed mainly as Np(V) in the feed solution. Batch extraction data for Np(IV) and Np(VI) into 30% TBP/Shell Sol T at different aqueous nitric acid concentrations and uranium saturation of the organic phase were obtained at 45 deg C and 60 deg C and the results are summarized. The distribution coefficients of Np(IV) and Np(VI) were obtained as a function of TBP concentration and the data are reported. The effect of nitrous acid on the extraction of neptunium, present in the aqueous phase as Np(IV) and Np(V), by 30% TBP was studied and the data obtained are given. The data on the rate of reduction of Np(VI) and Np(V) to Np(IV) by U(IV) were obtained for different U(IV) and nitric acid concentrations. Some redox reactions involving Np(IV), Pu(IV) and V(V) were investigated and their possible applications in the purex process for neptunium recovery were explored. (auth)

  18. Maximum Lateness Scheduling on Two-Person Cooperative Games with Variable Processing Times and Common Due Date

    OpenAIRE

    Liu, Peng; Wang, Xiaoli

    2017-01-01

    A new maximum lateness scheduling model in which both cooperative games and variable processing times exist simultaneously is considered in this paper. The job variable processing time is described by an increasing or a decreasing function dependent on the position of a job in the sequence. Two persons have to cooperate in order to process a set of jobs. Each of them has a single machine and their processing cost is defined as the minimum value of maximum lateness. All jobs have a common due ...

  19. 76 FR 60359 - Phytosanitary Treatments; Location of and Process for Updating Treatment Schedules; Technical...

    Science.gov (United States)

    2011-09-29

    ... supporting information and data, to the Animal and Plant Health Inspection Service, Plant Protection and... supporting information and data, to the Animal and Plant Health Inspection Service, Plant Protection and... that approved treatment schedules will instead be found in the Plant Protection and Quarantine...

  20. Phase II study of a 3-day schedule with topotecan and cisplatin in patients with previously untreated small cell lung cancer and extensive disease

    DEFF Research Database (Denmark)

    Sorensen, M.; Lassen, Ulrik Niels; Jensen, Peter Buhl

    2008-01-01

    INTRODUCTION: Treatment with a topoisomerase I inhibitor in combination with a platinum results in superior or equal survival compared with etoposide-based treatment in extensive disease small cell lung cancer (SCLC). Five-day topotecan is inconvenient and therefore shorter schedules of topotecan...... and cisplatin are needed. The aim of this phase II study was to establish the response rate and response duration in chemo-naive patients with SCLC receiving a 3-day topotecan and cisplatin schedule. METHODS: Simon's optimal two-stage design was used. Patients with previously untreated extensive disease SCLC...... age was 59 (range 44-74), 79% had performance status 0 or 1. Thirty-one patients completed all six cycles. Grade 3/4 anemia, neutrocytopenia, and thrombocytopenia were recorded in 9.5%, 66.7%, and 21.4% of patients, respectively. Fourteen percent of patients experienced neutropenic fever. No episodes

  1. Value of flexible resources, virtual bidding, and self-scheduling in two-settlement electricity markets with wind generation - Part II: ISO Models and Application

    DEFF Research Database (Denmark)

    Kazempour, Jalal; Hobbs, Benjamin F.

    2017-01-01

    In Part II of this paper, we present formulations for three two-settlement market models: baseline cost-minimization (Stoch-Opt); and two sequential market models in which an independent system operator (ISO) runs real-time (RT) balancing markets after making day-ahead (DA) generating unit commitment decisions based upon deterministic wind forecasts, while virtual bidders arbitrage the two markets (Seq and SeqSS). The latter two models differ in terms of whether some slow-start generators can self-schedule in the DA market while anticipating probabilities of RT prices. Models in Seq and SeqSS build on components of the two-settlement equilibrium model (Stoch-MP) defined in Part I of this paper [1]. We then provide numerical results for all four models. A simple single-node case illustrates the economic impacts of flexibility, virtual bidding, and self-schedules, and is followed by a larger

  2. Sustainable Scheduling of Cloth Production Processes by Multi-Objective Genetic Algorithm with Tabu-Enhanced Local Search

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2017-09-01

    Full Text Available The dyeing of textile materials is the most critical process in cloth production because of the strict technological requirements. In addition to the technical aspect, there have been increasing concerns over how to minimize the negative environmental impact of the dyeing industry. The emissions of pollutants are mainly caused by frequent cleaning operations, which are necessary for initializing the dyeing equipment, as well as idled production capacity, which leads to discharge of unconsumed chemicals. Motivated by these facts, we propose a methodology to reduce pollutant emissions by means of systematic production scheduling. First, we build a three-objective scheduling model that incorporates both the traditional tardiness objective and the environmentally related objectives. A mixed-integer programming formulation is also provided to accurately define the problem. Then, we present a novel solution method for the sustainable scheduling problem, namely a multi-objective genetic algorithm with a tabu-enhanced iterated greedy local search strategy (MOGA-TIG). Finally, we conduct extensive computational experiments to investigate the actual performance of the MOGA-TIG. Based on a fair comparison with two state-of-the-art multi-objective optimizers, it is concluded that the MOGA-TIG is able to achieve satisfactory solution quality within a tight computational time budget for the studied scheduling problem.

  3. Optimal production scheduling for energy efficiency improvement in biofuel feedstock preprocessing considering work-in-process particle separation

    International Nuclear Information System (INIS)

    Li, Lin; Sun, Zeyi; Yao, Xufeng; Wang, Donghai

    2016-01-01

    Biofuel is considered a promising alternative to traditional liquid transportation fuels. The large-scale substitution of biofuel can greatly enhance global energy security and mitigate greenhouse gas emissions. One major concern with the broad adoption of biofuel is the intensive energy consumption of biofuel manufacturing. This paper focuses on the energy efficiency improvement of biofuel feedstock preprocessing, a major process of cellulosic biofuel manufacturing. An improved scheme of feedstock preprocessing considering work-in-process particle separation is introduced to reduce energy waste and improve energy efficiency. A scheduling model based on the improved scheme is also developed to identify an optimal production schedule that minimizes the energy consumption of the feedstock preprocessing under a production target constraint. A numerical case study is used to illustrate the effectiveness of the proposed method. The research outcome is expected to improve the energy efficiency and enhance the environmental sustainability of biomass feedstock preprocessing. - Highlights: • A novel method to schedule production in the biofuel feedstock preprocessing process. • A systems modeling approach is used. • Optimizes preprocessing to reduce energy waste and improve energy efficiency. • A numerical case is used to illustrate the effectiveness of the method. • Energy consumption per unit production can be significantly reduced.
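    The core optimization described here, minimizing preprocessing energy subject to a production target, can be sketched as a small linear program. The coefficients below are invented, and the paper's actual model (with work-in-process particle separation) is far richer; this shows only the skeleton:

```python
from scipy.optimize import linprog

# Hypothetical toy: 4 time periods, energy cost e[t] per tonne processed,
# a per-period capacity, and a total production target.
e = [3.0, 2.0, 4.0, 2.5]   # energy cost per tonne in each period (invented)
cap = 30.0                 # tonnes per period (invented)
target = 80.0              # total tonnes required (invented)

res = linprog(c=e,
              A_ub=[[-1.0, -1.0, -1.0, -1.0]], b_ub=[-target],  # sum x >= target
              bounds=[(0.0, cap)] * 4)
print(res.x, res.fun)      # the cheapest periods are loaded first
```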

  4. A Procedure for scheduling and setting processing priority of MC requests

    CERN Document Server

    Balcar, Stepan

    2013-01-01

    My project involves designing and programming the base of an open system intended to help with scheduling the Monte Carlo production requests needed by CMS physicists for data analysis within the CMS collaboration. A primary requirement was to create a web interface that would be portable and independent of the control logic of the system. Another point of the project was to make a scheduler for Monte Carlo production planning and to design and program interfaces between the various logical blocks of the system. Introduction Many research groups at CERN that specialize in different areas of particle physics work with CMS. They are mostly scientists working at universities or research institutes in their countries. Their research consists of constructing models of elementary particles and the subsequent experimental verification of the behavior of these models. All these groups create MC production requests which are to be executed using computing resources located at CERN and other institutes. T...

  5. A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.

    Science.gov (United States)

    Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary

    2017-12-01

    Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that increases programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈ 43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both HTGS-based implementations show good performance. In image stitching the HTGS implementation achieves performance similar to the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k matrices, respectively.
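    HTGS itself is a C++ framework, but the core abstraction (a graph of tasks executed as their inputs become available) can be illustrated with a small language-agnostic toy. The task names and bodies below are invented; a real HTGS graph would additionally overlap disk I/O, memory transfers, and GPU kernels rather than run sequentially:

```python
from graphlib import TopologicalSorter

# Toy dataflow graph in the spirit of HTGS (HTGS itself is a C++ API; these
# task names and bodies are invented). Each task consumes its predecessor's
# output.
def read(_):    return list(range(4))       # e.g. load image tiles
def square(xs): return [x * x for x in xs]  # e.g. per-tile compute kernel
def total(xs):  return sum(xs)              # e.g. stitch/reduce step

deps  = {"read": set(), "square": {"read"}, "total": {"square"}}
funcs = {"read": read, "square": square, "total": total}

results = {}
for node in TopologicalSorter(deps).static_order():
    upstream = next(iter(deps[node]), None)  # single-input toy graph
    results[node] = funcs[node](results.get(upstream))
print(results["total"])  # 0 + 1 + 4 + 9 = 14
```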

  6. A Framework for Process Reengineering in Higher Education: A case study of distance learning exam scheduling and distribution

    Directory of Open Access Journals (Sweden)

    M'hammed Abdous

    2008-10-01

    Full Text Available In this paper, we propose a conceptual and operational framework for process reengineering (PR) in higher education (HE) institutions. Using a case study aimed at streamlining exam scheduling and distribution in a distance learning (DL) unit, we outline a sequential and non-linear four-step framework designed to reengineer processes. The first two steps of this framework – initiating and analyzing – are used to initiate, document, and flowchart the process targeted for reengineering, and the last two steps – reengineering/implementing and evaluating – are intended to prototype, implement, and evaluate the reengineered process. Our early involvement of all stakeholders, and our in-depth analysis and documentation of the existing process, allowed us to avoid the traditional pitfalls associated with business process reengineering (BPR). Consequently, the outcome of our case study indicates a streamlined and efficient process with higher faculty satisfaction and a substantial cost reduction.

  7. Long term scheduling technique for wastewater minimisation in multipurpose batch processes

    CSIR Research Space (South Africa)

    Nonyane, DR

    2012-05-01

    Full Text Available The effect of industrial activities on freshwater resources has become more apparent in the past few decades. This has led...

  8. Evaluation of Selected Resource Allocation and Scheduling Methods in Heterogeneous Many-Core Processors and Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Ciznicki Milosz

    2014-12-01

    Full Text Available Heterogeneous many-core computing resources are increasingly popular among users due to their improved performance over homogeneous systems. Many developers have realized that heterogeneous systems, e.g. a combination of a shared-memory multi-core CPU machine with massively parallel Graphics Processing Units (GPUs), can provide significant performance opportunities to a wide range of applications. However, the best overall performance can only be achieved if application tasks are efficiently assigned to different types of processor units in time, taking into account their specific resource requirements. Additionally, one should note that available heterogeneous resources have been designed as general-purpose units, however with many built-in features accelerating specific application operations. In other words, the same algorithm or application functionality can be implemented as a different task for a CPU or a GPU. Nevertheless, from the perspective of various evaluation criteria, e.g. total execution time or energy consumption, we may observe completely different results. Therefore, as tasks can be scheduled and managed in many alternative ways on both many-core CPUs and GPUs, and consequently have a huge impact on overall computing resource performance, there is a need for new and improved resource management techniques. In this paper we discuss results achieved during experimental performance studies of selected task scheduling methods in heterogeneous computing systems. Additionally, we present a new architecture for a resource allocation and task scheduling library which provides a generic application programming interface at the operating system level for improving scheduling policies, taking into account the diversity of tasks and the characteristics of heterogeneous computing resources.

  9. Maximum Lateness Scheduling on Two-Person Cooperative Games with Variable Processing Times and Common Due Date

    Directory of Open Access Journals (Sweden)

    Peng Liu

    2017-01-01

    Full Text Available A new maximum lateness scheduling model in which both cooperative games and variable processing times exist simultaneously is considered in this paper. The job's variable processing time is described by an increasing or a decreasing function dependent on the position of the job in the sequence. Two persons have to cooperate in order to process a set of jobs. Each of them has a single machine, and their processing cost is defined as the minimum value of maximum lateness. All jobs have a common due date. The objective is to maximize the product of their rational positive cooperative profits. A division of the jobs should be negotiated to yield a reasonable cooperative profit allocation scheme acceptable to both. We propose sufficient and necessary conditions for the problems to have a positive integer solution.
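    A brute-force version of the negotiation described here can be sketched in a few lines. The positional processing-time model and the profit definition below (baseline lateness minus own lateness) are stand-ins chosen for illustration, not the paper's exact definitions:

```python
from itertools import combinations

# Toy negotiation: all numbers invented. A job in position r takes p * r**a
# time (positional effect) and all jobs share the common due date d.
jobs, a, d = [3.0, 2.0, 4.0, 1.0], 0.1, 6.0

def max_lateness(subset):
    """Lateness of the last job: total (position-adjusted) time minus d."""
    t = 0.0
    for r, p in enumerate(sorted(subset), start=1):  # SPT order heuristic
        t += p * r ** a
    return t - d

baseline = max_lateness(jobs)  # cost of one person processing everything
best, best_split = float("-inf"), None
for k in range(1, len(jobs)):
    for mine in combinations(jobs, k):
        theirs = list(jobs)
        for p in mine:
            theirs.remove(p)
        prof1 = baseline - max_lateness(mine)    # stand-in profit definition
        prof2 = baseline - max_lateness(theirs)
        if prof1 > 0 and prof2 > 0 and prof1 * prof2 > best:
            best, best_split = prof1 * prof2, (mine, tuple(theirs))
print(best_split, best)
```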

  10. Tuning COCOMO-II for Software Process Improvement: A Tool Based Approach

    Directory of Open Access Journals (Sweden)

    SYEDA UMEMA HANI

    2016-10-01

    Full Text Available In order to compete in the international software development market, software organizations have to adopt internationally accepted software practices, i.e. standards like ISO (International Standard Organization) or CMMI (Capability Maturity Model Integration), in spite of having scarce resources and tools. The aim of this study is to develop a tool which can be used to present an actual picture of the benefits of Software Process Improvement to software development companies. Although a few tools are available to assist in making predictions, they are too expensive and do not cover datasets that reflect the cultural behavior of software development organizations in developing countries. In extension to our previous research, reported elsewhere, which quantified the benefits of SDPI (Software Development Process Improvement) for Pakistani software development organizations, this research used sixty-two datasets from three different software development organizations against the set of metrics used in COCOMO-II (Constructive Cost Model 2000). It derived a verifiable equation for calculating ISF (Ideal Scale Factor) and tuned the COCOMO-II model to provide prediction capability for SDPI benefit measurement classes such as ESCP (Effort, Schedule, Cost, and Productivity). This research contributes to the software industry by giving a reliable and low-cost mechanism for generating prediction models with high prediction accuracy. Hopefully, this study will help software organizations to use this tool not only to predict ESCP but also to predict the exact impact of SDPI.
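    For context, the COCOMO II post-architecture effort equation that such tuning re-calibrates is public (Boehm et al., 2000). A minimal sketch follows; the project numbers at the end are hypothetical:

```python
# COCOMO II post-architecture effort equation (Boehm et al., 2000):
#   PM = A * Size^E * prod(EM_i),  with  E = B + 0.01 * sum(SF_j)
# Nominal calibration: A = 2.94, B = 0.91. Tuning of the kind described in
# the paper amounts to re-fitting these constants (and, via the ISF idea,
# the scale-factor contribution) against local project data.
A, B = 2.94, 0.91

def cocomo_ii_effort(ksloc, scale_factors, effort_multipliers):
    E = B + 0.01 * sum(scale_factors)
    pm = A * ksloc ** E
    for em in effort_multipliers:
        pm *= em
    return pm  # person-months

# Hypothetical 40-KSLOC project with nominal scale factors and three
# illustrative effort multipliers:
print(round(cocomo_ii_effort(40, [3.72, 3.04, 4.24, 3.29, 4.68],
                             [1.0, 1.1, 0.9]), 1))
```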

  11. A continuous time model for a short-term multiproduct batch process scheduling

    Directory of Open Access Journals (Sweden)

    Jenny Díaz Ramírez

    2018-01-01

    Full Text Available In the chemical industry, it is common to find production systems characterized by having a single stage or a previously identified bottleneck stage, with multiple non-identical parallel stations and with setup costs that depend on the production sequence. This paper proposes a mixed-integer production-scheduling model that identifies the lot sizes and product sequence that maximize profit. It considers multiple typical industry conditions, such as penalties for noncompliance or out-of-service periods of the productive units (or stations) for preventive maintenance activities. The model was validated with real data from an oil chemical company. To analyze its performance, we applied the model to 155 production instances, which were obtained by applying the Monte Carlo technique to the historical production data of the same company. We obtained an average 12% reduction in the total cost of production and a 19% increase in the estimated profit.

  12. CO2 laser free-form processing of hard tissue

    Science.gov (United States)

    Werner, Martin; Klasing, Manfred; Ivanenko, Mikhail; Harbecke, Daniela; Steigerwald, Hendrik; Hering, Peter

    2007-07-01

    Drilling and surface processing of bone and tooth tissue belong to standard medical procedures (bores and embeddings for implants, trepanation, etc.). Small circular bores can generally be produced quickly with mechanical drills. However, problems arise with angled drilling, with the need to execute drilling procedures without damaging sensitive soft tissue structures underneath the bone, or with the attempt to mill small non-circular cavities in hard tissue with high precision. We present investigations on laser hard tissue "milling", which can be advantageous for solving these problems. The processing of bone is done with a CO2 laser (10.6 μm) with pulse durations of 50 - 100 μs, combined with a PC-controlled fast galvanic laser beam scanner and a fine water spray, which helps keep the ablation process effective and free of thermal side-effects. Laser "milling" of non-circular cavities with 1 - 4 mm width and about 10 mm depth can be especially interesting for dental implantology. In ex-vivo investigations we found conditions for fast laser processing of these cavities without thermal damage and with minimised tapering. This included the exploration of different filling patterns (concentric rings, crosshatch, parallel lines, etc.), definition of maximal pulse duration, repetition rate and laser power, and the optimal water spray position. The optimised results give evidence for the applicability of pulsed CO2 lasers for biologically tolerable and effective processing of deep cavities in hard tissue.

  13. Technology Estimating 2: A Process to Determine the Cost and Schedule of Space Technology Research and Development

    Science.gov (United States)

    Cole, Stuart K.; Wallace, Jon; Schaffer, Mark; May, M. Scott; Greenberg, Marc W.

    2014-01-01

    As a leader in space technology research and development, NASA is continuing the development of the Technology Estimating process, initiated in 2012, for estimating the cost and schedule of low-maturity technology research and development, where the Technology Readiness Level (TRL) is less than TRL 6. NASA's Technology Roadmap comprises 14 technology areas. The focus of this continuing Technology Estimating effort included four Technology Areas (TA): TA3 Space Power and Energy Storage, TA4 Robotics, TA8 Instruments, and TA12 Materials, to confine the research to the most abundant data pool. This report continues the technology estimating efforts completed during 2013-2014 and addresses the refinement of the parameters selected and recommended for use in the estimating process, where the parameters developed are applicable to Cost Estimating Relationships (CERs) used in parametric cost estimating analysis. This research addresses the architecture for administration of the Technology Cost and Schedule Estimating tool, the parameters suggested for computer software adjunct to any technology area, and the identification of gaps in the Technology Estimating process.

  14. An Improved Version of Discrete Particle Swarm Optimization for Flexible Job Shop Scheduling Problem with Fuzzy Processing Time

    Directory of Open Access Journals (Sweden)

    Song Huang

    2016-01-01

    Full Text Available Fuzzy processing times occasionally exist in the job shop scheduling problems of flexible manufacturing systems. To deal with fuzzy processing times, the fuzzy flexible job shop model has been established in several papers and has attracted numerous researchers' attention recently. In our research, an improved version of discrete particle swarm optimization (IDPSO) is designed to solve the flexible job shop scheduling problem with fuzzy processing time (FJSPF). In IDPSO, heuristic initialization methods based on triangular fuzzy numbers are developed: a combination of six initialization methods is applied to initialize the machine assignment, and a random method is used to initialize the operation sequence. Then, some simple and effective discrete operators are employed to update each particle's position and generate new particles. In order to guide the particles effectively, we extend the global best position to a set of several global best positions. Finally, experiments are designed to investigate the impact of four parameters in IDPSO by the Taguchi method, and IDPSO is tested on five instances and compared with some state-of-the-art algorithms. The experimental results show that the proposed algorithm can obtain better solutions for FJSPF and is more competitive than the compared algorithms.
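    Fuzzy job shop models like FJSPF typically represent each processing time as a triangular fuzzy number. A minimal sketch of the arithmetic involved is shown below; the componentwise max approximation and the (a + 2b + c)/4 ranking score are common conventions in this literature, not necessarily the exact ones used by IDPSO:

```python
# Triangular fuzzy number (TFN) arithmetic commonly used in fuzzy job shop
# models: a fuzzy time is (a, b, c) with a <= b <= c. Addition is
# componentwise; the max is often approximated componentwise; ranking can
# use the centroid-style score (a + 2b + c) / 4.
def f_add(x, y):
    return tuple(xi + yi for xi, yi in zip(x, y))

def f_max(x, y):
    return tuple(max(xi, yi) for xi, yi in zip(x, y))

def rank(x):
    a, b, c = x
    return (a + 2 * b + c) / 4

# Completion time of an operation needing machine release time m and
# predecessor completion p, with fuzzy processing time pt:
m, p, pt = (2, 3, 4), (1, 2, 5), (3, 4, 6)
start = f_max(m, p)
completion = f_add(start, pt)
print(completion, rank(completion))   # (5, 7, 11) 7.5
```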

  15. Evaluation criteria for dialogue processes: key findings from RISCOM II

    International Nuclear Information System (INIS)

    Atherton, Elizabeth

    2003-01-01

    As part of Work Package 4 (undertaken by a consortium of partners from the United Kingdom) in the joint European project RISCOM II, work was undertaken on evaluation criteria for determining the success of dialogue processes. This note outlines its key findings: in order to continue the development of dialogue processes, it is important to evaluate and learn from the experience of engaging with stakeholders. Criteria can be developed to evaluate how successful a process has been; these can range from very practical criteria relating to how well the process worked to more subjective criteria derived from the aims of the dialogue process itself. Some criteria are particularly relevant to dialogue processes that aim to encourage deliberation and the development of stakeholders' views through participation: transparency, legitimacy, equality of access, 'being able to speak', a deliberative environment, openness of framing, developing insight into the range of issues (new meanings are generated), inclusive and 'best' knowledge elicited, producing acceptable/tolerable and usable outcomes/decisions, improvement of trust and understanding between participants, and developing a sense of shared responsibility and common good. Evaluation will incur a cost in terms of time and money, but will help practitioners to develop processes that meet the needs of those who participate and improve the way that we try to engage people in the debate

  16. VA Construction: Improved Processes Needed to Monitor Contract Modifications, Develop Schedules, and Estimate Costs

    Science.gov (United States)

    2017-03-01

    the Handbook. VA headquarters officials told us that regional CFM offices monitor change-order-processing time frames for projects in their... visited collected different types of data on change orders. Because VA lacks the data on the change-order-processing time frames required by the Handbook... goals of processing change orders in a timelier manner, especially given our previous findings that change-order-processing time frames caused

  17. Fabrication process for the PEP II RF cavities

    Energy Technology Data Exchange (ETDEWEB)

    Franks, R.M.; Rimmer, R.A. [Lawrence Berkeley National Lab., CA (United States); Schwarz, H. [Stanford Linear Accelerator Center, Menlo Park, CA (United States)

    1997-06-05

    This paper presents the major steps used in the fabrication of the 26 RF Cavities required for the PEP-II B-factory. Several unique applications of conventional processes have been developed and successfully implemented: electron beam welding (EBW), with minimal porosity, of .75 inch (19 mm) copper cross-sections; extensive 5-axis milling of water channels; electroplating of .37 inch (10 mm) thick OFE copper; tuning of the cavity by profiling beam noses prior to final joining with the cavity body; and machining of the cavity interior, are described here.

  18. Efficient Buffer Capacity and Scheduler Setting Computation for Soft Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Bekooij, Marco; Bekooij, Marco Jan Gerrit; Wiggers, M.H.; van Meerbergen, Jef

    2007-01-01

    Soft real-time applications that process data streams can often be intuitively described as dataflow process networks. In this paper we present a novel analysis technique to compute conservative estimates of the required buffer capacities in such process networks. With the same analysis technique

  19. Abstractions for aperiodic multiprocessor scheduling of real-time stream processing applications

    NARCIS (Netherlands)

    Hausmans, J.P.H.M.

    2015-01-01

    Embedded multiprocessor systems are often used in the domain of real-time stream processing applications to keep up with increasing power and performance requirements. Examples of such real-time stream processing applications are digital radio baseband processing and WLAN transceivers. These stream

  20. Single Machine Scheduling and Due Date Assignment with Past-Sequence-Dependent Setup Time and Position-Dependent Processing Time

    Directory of Open Access Journals (Sweden)

    Chuan-Li Zhao

    2014-01-01

    Full Text Available This paper considers single machine scheduling and due date assignment with setup time. The setup time is proportional to the length of the already processed jobs; that is, the setup time is past-sequence-dependent (p-s-d). It is assumed that a job's processing time depends on its position in a sequence. The objective functions include total earliness, the weighted number of tardy jobs, and the cost of due date assignment. We analyze these problems with two different due date assignment methods. We first consider the model with job-dependent position effects. For each case, by converting the problem to a series of assignment problems, we prove that the problems can be solved in O(n⁴) time. For the model with job-independent position effects, we prove that the problems can be solved in O(n³) time by providing a dynamic programming algorithm.
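    A short worked toy makes the p-s-d setup concrete: each job's setup grows with the work already completed, while a positional exponent models the ageing effect. The numbers and the exact positional model are illustrative only, not the paper's:

```python
# Worked toy for p-s-d setups with positional processing times. Job j in
# position r takes p_j * r**a; its setup is b times the processing already
# performed.
a, b = 0.1, 0.05
p = [4.0, 2.0, 3.0]                # base processing times, in sequence order

t = done = 0.0
for r, pj in enumerate(p, start=1):
    setup = b * done               # past-sequence-dependent setup
    actual = pj * r ** a           # position (ageing) effect
    t += setup + actual
    done += actual
    print(f"position {r}: setup={setup:.3f} proc={actual:.3f} C={t:.3f}")
```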

  1. Effects of the amount and schedule of varied practice after constant practice on the adaptive process of motor learning

    Directory of Open Access Journals (Sweden)

    Umberto Cesar Corrêa

    2014-12-01

    Full Text Available This study investigated the effects of different amounts and schedules of varied practice, after constant practice, on the adaptive process of motor learning. Participants were one hundred and seven children with a mean age of 11.1 ± 0.9 years. Three experiments were carried out using a complex anticipatory timing task manipulating the following components in the varied practice: visual stimulus speed (experiment 1); sequential response pattern (experiment 2); and visual stimulus speed plus sequential response pattern (experiment 3). In all experiments the design involved three amounts (18, 36, and 63 trials) and two schedules (random and blocked) of varied practice. The experiments also involved two learning phases: stabilization and adaptation. The dependent variables were the absolute, variable, and constant errors related to the task goal, and the relative timing of the sequential response. Results showed that all groups worsened their performance in the adaptation phase, and no difference was observed between them. Altogether, the results of the three experiments allow the conclusion that the amounts of trials manipulated in the random and blocked practices did not promote diversification of the skill, since no adaptation was observed.

  2. Resources allocation and scheduling approaches for business process applications in Cloud contexts

    OpenAIRE

    Bessai , Kahina; Youcef , Samir; Oulamara , Ammar; Godart , Claude; Nurcan , Selmin

    2012-01-01

    In recent years, the Cloud computing environment has emerged as a new execution support for business processes. However, despite the proven benefits of using the Cloud to run business processes, users lack guidance for choosing between multiple offerings while taking into account several objectives, which are often conflicting. On the other side, elastic computing, such as Amazon EC2, allows users to allocate and release compute resources (virtual machines) on demand and pay on...

  3. Schedule-selective biochemical modulation of 5-fluorouracil in advanced colorectal cancer – a phase II study

    Directory of Open Access Journals (Sweden)

    Savage Paul

    2002-05-01

    Full Text Available Abstract Background 5-fluorouracil remains the standard therapy for patients with advanced/metastatic colorectal cancer. Pre-clinical studies have demonstrated the biological modulation of 5-fluorouracil by methotrexate and leucovorin. This phase II study was initiated to determine the activity and toxicity of sequential methotrexate – leucovorin and 5-fluorouracil chemotherapy in patients with advanced colorectal cancer. Methods Ninety-seven patients with metastatic colorectal cancer were enrolled onto the study. Methotrexate – 30 mg/m2 – was administered every 6 hours for 6 doses, followed by a 2-hour infusion of LV – 500 mg/m2. Midway through the leucovorin infusion, patients received 5-fluorouracil – 600 mg/m2. This constituted a cycle of therapy and was repeated every 2 weeks until progression. Results The median age was 64 yrs (34–84) and the Eastern Cooperative Oncology Group performance score was 0 in 37%, 1 in 55% and 2 in 8% of patients. Partial and complete responses were seen in 31% of patients with a median duration of response of 6.4 months. The overall median survival was 13.0 months. The estimated 1-year survival was 53.7%. Grade III and IV toxic effects were modest and included mucositis, nausea and vomiting. Conclusions This phase II study supports previously reported data demonstrating the modest clinical benefit of 5-FU modulation utilizing methotrexate and leucovorin in patients with metastatic colorectal cancer. Ongoing studies evaluating 5-fluorouracil modulation with more novel agents (irinotecan and/or oxaliplatin) are in progress and may prove encouraging.

  4. Technology Estimating: A Process to Determine the Cost and Schedule of Space Technology Research and Development

    Science.gov (United States)

    Cole, Stuart K.; Reeves, John D.; Williams-Byrd, Julie A.; Greenberg, Marc; Comstock, Doug; Olds, John R.; Wallace, Jon; DePasquale, Dominic; Schaffer, Mark

    2013-01-01

    NASA is investing in new technologies across 14 primary technology roadmap areas as well as aeronautics. Understanding the cost of research and development of these technologies, and the time it takes to increase their maturity, is important to the support of ongoing and future NASA missions. Overall, technology estimating may help guide technology investment strategies, improve the evaluation of technology affordability, and aid decision support. This research provides a summary of the framework development of a Technology Estimating process in which four technology roadmap areas were selected for study. The framework includes definitions of terms, a discussion of narrowing the focus from 14 NASA Technology Roadmap areas to four, and further refinement to include technologies in the TRL range of 2 to 6. Also included in this paper is a discussion of the evaluation of 20 unique technology parameters that were initially identified, evaluated, and subsequently reduced for use in characterizing these technologies. A discussion of the data acquisition effort and the criteria established for data quality is provided. The findings obtained during the research include the gaps identified and a description of a spreadsheet-based estimating tool initiated as part of the Technology Estimating process.

  5. Mercury Phase II Study - Mercury Behavior in Salt Processing Flowsheet

    International Nuclear Information System (INIS)

    Jain, V.; Shah, H.; Wilmarth, W. R.

    2016-01-01

    Mercury (Hg) in the Savannah River Site Liquid Waste System (LWS) originated from decades of canyon processing, where it was used as a catalyst for dissolving the aluminum cladding of reactor fuel. Approximately 60 metric tons of mercury are currently present throughout the LWS. Mercury has long been a consideration in the LWS, from both hazard and processing perspectives. In February 2015, a Mercury Program Team was established at the request of the Department of Energy to develop a comprehensive action plan for the long-term management and removal of mercury. The evaluation was focused in two phases. Phase I activities assessed the Liquid Waste inventory and chemical processing behavior using a system-by-system review methodology, and determined the speciation of the different mercury forms (Hg+, Hg++, elemental Hg, organomercury, and soluble versus insoluble mercury) within the LWS. Phase II activities build on the Phase I activities, and the results of the LWS flowsheet evaluations will be summarized in three reports: Mercury Behavior in the Salt Processing Flowsheet (i.e. this report); Mercury Behavior in the Defense Waste Processing Facility (DWPF) Flowsheet; and Mercury Behavior in the Tank Farm Flowsheet (Evaporator Operations). The evaluation of the mercury behavior in the salt processing flowsheet indicates, inter alia, the following: (1) In the assembled Salt Batches 7, 8 and 9 in Tank 21, the total mercury is mostly soluble, with methylmercury (MHg) contributing over 50% of the total mercury. Based on the analyses of samples from the 2H Evaporator feed and drop tanks (Tanks 38/43), the source of MHg in Salt Batches 7, 8 and 9 can be attributed to the 2H Evaporator concentrate used in assembling the salt batches. The 2H Evaporator is used to evaporate DWPF recycle water. (2) Comparison of data between Tank 21/49, the Salt Solution Feed Tank (SSFT), the Decontaminated Salt Solution Hold Tank (DSSHT), and Tank 50 samples suggests that the total mercury as well as speciated

  6. Nonlinear model-based control of the Czochralski process III: Proper choice of manipulated variables and controller parameter scheduling

    Science.gov (United States)

    Neubert, M.; Winkler, J.

    2012-12-01

    This contribution continues an article series [1,2] about the nonlinear model-based control of the Czochralski crystal growth process. The key idea of the presented approach is to use a sophisticated combination of nonlinear model-based and conventional (linear) PI controllers for tracking of both crystal radius and growth rate. Using heater power and pulling speed as manipulated variables, several controller structures are possible. The present part tries to systematize the properties of the materials to be grown in order to obtain unambiguous decision criteria for the most profitable choice of controller structure. For this purpose a material-specific constant M, called interface mobility, and a more process-specific constant S, called system response number, are introduced. While the first summarizes important material properties like thermal conductivity and latent heat, the latter characterizes the process by evaluating the average axial thermal gradients at the phase boundary and the actual growth rate at which the crystal is grown. Furthermore, these characteristic numbers are useful for establishing a scheduling strategy for the PI controller parameters in order to improve controller performance. Finally, both numbers give a better understanding of the general thermal system dynamics of the Czochralski technique.

  7. Kinetic Simulations of Type II Radio Burst Emission Processes

    Science.gov (United States)

    Ganse, U.; Spanier, F. A.; Vainio, R. O.

    2011-12-01

    The fundamental emission process of Type II radio bursts has been under discussion for many decades. While analytic deliberations point to three-wave interaction as the source of fundamental and harmonic radio emissions, sparse in-situ observational data and the high computational demands of kinetic simulations have not allowed a definite conclusion to be reached. A popular model puts the radio emission in the foreshock region of a coronal mass ejection's shock front, where shock drift acceleration can create electron beam populations in the otherwise quiescent foreshock plasma. Beam-driven instabilities are then assumed to create waves, forming the starting point of three-wave interaction processes. Using our kinetic particle-in-cell code, we have studied a number of emission scenarios based on electron beam populations in a CME foreshock, with a focus on wave-interaction microphysics on kinetic scales. The self-consistent, fully kinetic simulations with a fully physical mass ratio show fundamental and harmonic emission of transverse electromagnetic waves and allow for detailed statistical analysis of all contributing wave modes and their couplings.

  8. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    Warfare, Naval Sea Systems Command. Acquisition Cycle Time: Defining the Problem, David Tate, Institute for Defense Analyses. Schedule Analytics, Jennifer... The research was comprised of the following high-level steps: identify and review primary data sources... research. However, detailed reviews of the OMB IT Dashboard data revealed that schedule data is highly aggregated. Program start date and program end date

  9. Post Process Characterization of Friction Stir Welded Components, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Luna Innovations Incorporated proposes in this STTR Phase II project to continue development and validation of Luna's amplitude-dependent, nonlinear ultrasonic...

  10. Cartilage turnover reflected by metabolic processing of type II collagen

    DEFF Research Database (Denmark)

    Gudmann, Karoline Natasja Stæhr; Wang, Jianxia; Hoielt, Sabine

    2014-01-01

    The aim of this study was to enable measurement of cartilage formation by a novel biomarker of type II collagen formation. The competitive enzyme-linked immunosorbent assay (ELISA) Pro-C2 was developed and characterized for assessment of the beta splice variant of type II procollagen (PIIBNP). Th...

  11. Preliminary evaluation of alternative waste form solidification processes. Volume II. Evaluation of the processes

    International Nuclear Information System (INIS)

    1980-08-01

    This Volume II presents engineering feasibility evaluations of the eleven processes for solidification of nuclear high-level liquid wastes (HHLW) described in Volume I of this report. Each evaluation was based on a systematic assessment of the process with respect to six principal evaluation criteria: complexity of process; state of development; safety; process requirements; development work required; and facility requirements. The principal criteria were further subdivided into a total of 22 subcriteria, each of which was assigned a weight. Each process was then assigned a figure of merit, on a scale of 1 to 10, for each of the subcriteria. A total rating was obtained for each process by summing the products of the subcriteria ratings and the subcriteria weights. The evaluations were based on the process descriptions presented in Volume I of this report, supplemented by information obtained from the literature, including publications by the originators of the various processes. Waste form properties were, in general, not evaluated. This document describes the approach which was taken, the development and application of the rating criteria and subcriteria, and the evaluation results. A series of appendices sets forth summary descriptions of the processes and the ratings, together with the complete numerical ratings assigned; two appendices present further technical details on the rating process

  12. SPANR planning and scheduling

    Science.gov (United States)

    Freund, Richard F.; Braun, Tracy D.; Kussow, Matthew; Godfrey, Michael; Koyama, Terry

    2001-07-01

    SPANR (Schedule, Plan, Assess Networked Resources) is (i) a pre-run, off-line planning and (ii) a runtime, just-in-time scheduling mechanism. It is designed to support primarily commercial applications in that it optimizes throughput rather than individual jobs (unless they have highest priority). Thus it is a tool for a commercial production manager to maximize total work. First the SPANR Planner is presented showing the ability to do predictive 'what-if' planning. It can answer such questions as, (i) what is the overall effect of acquiring new hardware or (ii) what would be the effect of a different scheduler. The ability of the SPANR Planner to formulate in advance tree-trimming strategies is useful in several commercial applications, such as electronic design or pharmaceutical simulations. The SPANR Planner is demonstrated using a variety of benchmarks. The SPANR Runtime Scheduler (RS) is briefly presented. The SPANR RS can provide benefit for several commercial applications, such as airframe design and financial applications. Finally a design is shown whereby SPANR can provide scheduling advice to most resource management systems.

  13. NASA Schedule Management Handbook

    Science.gov (United States)

    2011-01-01

    The purpose of schedule management is to provide the framework for time-phasing, resource planning, coordination, and communication of the necessary tasks within a work effort. The intent is to improve schedule management by providing recommended concepts, processes, and techniques used within the Agency and private industry. The intended function of this handbook is two-fold: first, to provide guidance for meeting the scheduling requirements contained in NPR 7120.5, NASA Space Flight Program and Project Management Requirements, NPR 7120.7, NASA Information Technology and Institutional Infrastructure Program and Project Requirements, NPR 7120.8, NASA Research and Technology Program and Project Management Requirements, and NPD 1000.5, Policy for NASA Acquisition. The second function is to describe the schedule management approach and the recommended best practices for carrying out this project control function. With regard to the above project management requirements documents, it should be noted that those space flight projects previously established and approved under the guidance of prior versions of NPR 7120.5 will continue to comply with those requirements until project completion has been achieved. This handbook will be updated as needed, to enhance efficient and effective schedule management across the Agency. It is acknowledged that most, if not all, external organizations participating in NASA programs/projects will have their own internal schedule management documents. Issues that arise from conflicting schedule guidance will be resolved on a case by case basis as contracts and partnering relationships are established. It is also acknowledged and understood that all projects are not the same and may require different levels of schedule visibility, scrutiny and control. Project type, value, and complexity are factors that typically dictate which schedule management practices should be employed.

  14. Simultaneous decomplexation in blended Cu(II)/Ni(II)-EDTA systems by electro-Fenton process using iron sacrificing electrodes.

    Science.gov (United States)

    Zhao, Zilong; Dong, Wenyi; Wang, Hongjie; Chen, Guanhan; Tang, Junyi; Wu, Yang

    2018-05-15

    This research explored the application of the electro-Fenton (E-Fenton) technique for simultaneous decomplexation in blended Cu(II)/Ni(II)-EDTA systems using iron sacrificing electrodes. Standard discharge limits (0.3 mg L−1 for Cu and 0.1 mg L−1 for Ni in China) could be achieved after 30 min of reaction under the optimum conditions (i.e., initial solution pH of 2.0, H2O2 dosage of 6 mL L−1 h−1, current density of 20 mA/cm², inter-electrode distance of 2 cm, and sulfate electrolyte concentration of 2000 mg L−1). The distinct differences in apparent kinetic rate constants (k_app) and intermediate removal efficiencies between the single and blended systems indicated a mutual promotion effect on the decomplexation of Cu(II) and Ni(II). Massive accumulation of Fe(III) favored the further removal of Cu(II) and Ni(II) by metal ion substitution. Species distribution results demonstrated that the decomplexation of metal-EDTA in the E-Fenton process was mainly attributable to a combination of reactions, including the Fenton reaction together with anodic oxidation, electro-coagulation (E-coagulation) and electrodeposition. Unlike hypophosphite and citrate, the presence of chloride ion displayed favorable effects on the removal efficiencies of Cu(II) and Ni(II) at low dosage, but facilitated ammonia nitrogen (NH4+-N) removal only at high dosage. Copyright © 2018 Elsevier B.V. All rights reserved.
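    The apparent rate constants (k_app) mentioned here are conventionally obtained by fitting a pseudo-first-order law, ln(C0/C) = k_app·t, to concentration-time data. A minimal sketch with invented data, not values from the study:

```python
import numpy as np

# Fit the pseudo-first-order law ln(C0/C) = k_app * t. The data below are
# invented for illustration, not taken from the study.
t = np.array([0.0, 5.0, 10.0, 20.0, 30.0])   # time, min
C = np.array([50.0, 30.0, 18.5, 7.0, 2.6])   # residual complexed metal, mg/L
k_app = np.polyfit(t, np.log(C[0] / C), 1)[0]
print(f"k_app = {k_app:.3f} per min")
```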

  15. Scheduling with Learning Effects and/or Time-Dependent Processing Times to Minimize the Weighted Number of Tardy Jobs on a Single Machine

    Directory of Open Access Journals (Sweden)

    Jianbo Qian

    2013-01-01

    Full Text Available We consider single machine scheduling problems with learning/deterioration effects and time-dependent processing times, with due date assignment consideration, and our objective is to minimize the weighted number of tardy jobs. By reducing all versions of the problem to an assignment problem, we solve them in O(n⁴) time. For some important special cases, the time complexity can be improved to O(n²) using dynamic programming techniques.
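    The reduction the authors use can be pictured as building a cost matrix C[j][r] for placing job j in sequence position r and solving the resulting assignment problem. A sketch with a toy cost model follows (the real entries would encode weighted tardiness under the learning/deterioration effects); scipy's Hungarian-method solver handles one such matrix:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Sketch of the reduction: C[j, r] = cost of placing job j in position r,
# then solve the assignment problem. The toy cost below stands in for the
# paper's weighted-tardy-job costs under learning effects.
p = np.array([4.0, 2.0, 3.0, 5.0])           # base processing times (invented)
a = -0.2                                     # learning exponent (invented)
r = np.arange(1, len(p) + 1)                 # sequence positions
C = np.outer(p, r ** a) * r                  # toy positional cost matrix

rows, cols = linear_sum_assignment(C)
print(list(zip(rows.tolist(), cols.tolist())), C[rows, cols].sum())
```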

  16. Scheduling theory, algorithms, and systems

    CERN Document Server

    Pinedo, Michael L

    2016-01-01

    This new edition of the well-established text Scheduling: Theory, Algorithms, and Systems provides an up-to-date coverage of important theoretical models in the scheduling literature as well as important scheduling problems that appear in the real world. The accompanying website includes supplementary material in the form of slide-shows from industry as well as movies that show actual implementations of scheduling systems. The main structure of the book, as per previous editions, consists of three parts. The first part focuses on deterministic scheduling and the related combinatorial problems. The second part covers probabilistic scheduling models; in this part it is assumed that processing times and other problem data are random and not known in advance. The third part deals with scheduling in practice; it covers heuristics that are popular with practitioners and discusses system design and implementation issues. All three parts of this new edition have been revamped, streamlined, and extended. The reference...

  17. ATLAS construction schedule

    CERN Multimedia

    Kotamaki, M

    The goal during the last few months has been to freeze and baseline as much as possible the schedules of the various ATLAS systems and activities. The main motivations for the re-baselining of the schedules have been the new LHC schedule, aiming at first collisions in early 2006, and the delays encountered in civil engineering as well as in the production of some of the detectors. The process was started by first preparing a new installation schedule that takes into account all the new external constraints and the new ATLAS staging scenario. Installation schedule version 3 was approved at the March EB and provides the Ready For Installation (RFI) milestones for each system, i.e. the date when the system should be available for the start of the installation. TCn is now interacting with the systems, aiming at a more realistic and resource-loaded version 4 before the end of the year. Using the new RFI milestones as driving dates, a new summary schedule has been prepared, or is under preparation, for each system....

  18. Software Defined Common Processing System (SDCPS), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Coherent Logix, Incorporated (CLX) proposes the development of a Software Defined Common Processing System (SDCPS) that leverages the inherent advantages of an...

  19. Friction Stir Processing of Cast Superalloys, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR effort examines the feasibility of an innovative fabrication technology incorporating sand casting and friction stir processing (FSP) for producing...

  20. Accelerated Numerical Processing API Based on GPU Technology, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The recent performance increases in graphics processing units (GPUs) have made graphics cards an attractive platform for implementing computationally intense...

  1. Tin(II) ketoacidoximates: synthesis, X-ray structures and processing to tin(II) oxide

    KAUST Repository

    Khanderi, Jayaprakash

    2015-10-21

    Tin(II) ketoacidoximates of the type [HONCRCOO]Sn (R = Me 1, CH2Ph 2) and [(MeONCMeCOO)Sn]·NH3·2H2O 3 were synthesized by reacting pyruvate and hydroxyl- or methoxylamine RONH2 (R = H, Me) with tin(II) chloride dihydrate SnCl2·2H2O. The single crystal X-ray structure reveals that the geometry at the Sn atom is trigonal bipyramidal in 1 and 2 and trigonal pyramidal in 3. Inter- or intramolecular hydrogen bonding is observed in 1-3. Thermogravimetric (TG) analysis shows that the decomposition of 1-3 to SnO occurs at ca. 160 °C. The evolved gas analysis during TG indicates complete loss of the oximato ligand in one step for 1, whereas a small organic residue is additionally removed at temperatures >400 °C for 2. Above 140 °C, [HONC(Me)COO]Sn (1) decomposes in air to spherical SnO particles of size 10-500 nm. Spin coating of 1 on Si or a glass substrate followed by heating at 200 °C results in a uniform film of SnO. The band gap of the produced SnO film and nanomaterial was determined by diffuse reflectance spectroscopy to be in the range of 3.0-3.3 eV. X-ray photoelectron spectroscopy indicates surface oxidation of the SnO film to SnO2 in ambient atmosphere.

  2. Review process and quality assurance in the EBR-II probabilistic risk assessment

    International Nuclear Information System (INIS)

    Roglans, J.; Hill, D.J.; Ragland, W.A.

    1992-01-01

    A Probabilistic Risk Assessment (PRA) of the Experimental Breeder Reactor II (EBR-II), a Department of Energy (DOE) Category A reactor, has recently been completed at Argonne National Laboratory (ANL). Within the scope of the ANL QA Programs, a QA Plan specifically for the EBR-II PRA was developed. The QA Plan covered all aspects of the PRA development, with emphasis on the procedures for document and software control and on the internal and external review process. The effort spent on the quality assurance tasks for the EBR-II PRA has paid off by providing acceptance of the work and confidence in the quality of the results

  3. Integrated batch production and maintenance scheduling for multiple items processed on a deteriorating machine to minimize total production and maintenance costs with due date constraint

    Directory of Open Access Journals (Sweden)

    Zahedi Zahedi

    2016-04-01

    Full Text Available This paper discusses an integrated model of batch production and maintenance scheduling on a deteriorating machine producing multiple items to be delivered at a common due date. The model describes the trade-off between total inventory cost and maintenance cost as the production run length increases. The production run length is the time bucket between two consecutive preventive maintenance activities. The objective function of the model is to minimize total cost, consisting of in-process and completed-part inventory costs, setup cost, preventive and corrective maintenance costs, and rework cost. The problem is to determine the optimal production run length and to schedule the resulting batches so as to minimize total cost.
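    The trade-off the model captures can be seen in a one-line cost-rate function of the production run length T: preventive-maintenance cost is amortized over T while holding and deterioration-related costs grow with it. All coefficients below are invented for illustration, not the paper's model:

```python
import numpy as np

# Illustrative cost-rate curve behind the run-length trade-off: preventive
# maintenance is amortized over the run length T, while holding and
# deterioration-related costs grow with T.
c_pm, c_hold, c_fail = 50.0, 2.0, 0.8

def cost_rate(T):
    return c_pm / T + c_hold * T / 2.0 + c_fail * T ** 2

T = np.linspace(0.5, 10.0, 200)
print(f"best run length = {T[np.argmin(cost_rate(T))]:.2f}")
```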

  4. Scheduling the scheduling task : a time management perspective on scheduling

    NARCIS (Netherlands)

    Larco Martinelli, J.A.; Wiers, V.C.S.; Fransoo, J.C.

    2013-01-01

    Time is the most critical resource at the disposal of schedulers. Hence, adequate time management by schedulers may have a positive impact on their productivity and responsiveness in uncertain scheduling environments. This paper presents a field study of how schedulers make use of

  5. An Opponent-Process Theory of Motivation: II. Cigarette Addiction

    Science.gov (United States)

    Solomon, Richard L.; Corbit, John D.

    1973-01-01

    Methods suggested by opponent-process theory of acquired motivation in helping smokers to quit the habit include use of antagonistic drugs, total cessation from tobacco, and decrease in intensity and frequency of tobacco use. (DS)

  6. Phase I/II Study Evaluating Early Tolerance in Breast Cancer Patients Undergoing Accelerated Partial Breast Irradiation Treated With the MammoSite Balloon Breast Brachytherapy Catheter Using a 2-Day Dose Schedule

    International Nuclear Information System (INIS)

    Wallace, Michelle; Martinez, Alvaro; Mitchell, Christina; Chen, Peter Y.; Ghilezan, Mihai; Benitez, Pamela; Brown, Eric; Vicini, Frank

    2010-01-01

    Purpose: Initial Phase I/II results using balloon brachytherapy to deliver accelerated partial breast irradiation (APBI) in 2 days in patients with early-stage breast cancer are presented. Materials and Methods: Between March 2004 and August 2007, 45 patients received adjuvant radiation therapy after lumpectomy with balloon brachytherapy in a Phase I/II trial delivering 2800 cGy in four fractions of 700 cGy. Toxicities were evaluated using the National Cancer Institute Common Toxicity Criteria for Adverse Events v3.0 scale, and cosmesis was documented at ≥6 months. Results: The median age was 66 years (range, 48-83) and the median skin spacing was 12 mm (range, 8-24). The median follow-up was 11.4 months (5.4-48 months), with 21 patients (47%) followed ≥1 year, 11 (24%) ≥2 years, and 7 (16%) ≥3 years. At <6 months (n = 45), Grade II toxicity rates were 9% radiation dermatitis, 13% breast pain, 2% edema, and 2% hyperpigmentation. Grade III breast pain was reported in 13% (n = 6). At ≥6 months (n = 43), Grade II toxicity rates were 2% radiation dermatitis, 2% induration, and 2% hypopigmentation. Grade III breast pain was reported in 2%. Infection occurred in 13% (n = 6) at <6 months and 5% (n = 2) at ≥6 months. Persistent seroma ≥6 months was seen in 30% (n = 13). Fat necrosis developed in 4 cases (2 symptomatic). Rib fractures were seen in 4% (n = 2). Cosmesis was good/excellent in 96% of cases. Conclusions: Treatment with balloon brachytherapy using a 2-day dose schedule resulted in acceptable rates of Grade II/III chronic toxicity and cosmetic results similar to those observed with a standard 5-day accelerated partial breast irradiation schedule.

  7. Planck 2013 results. II. The Low Frequency Instrument data processing

    DEFF Research Database (Denmark)

    Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.

    2013-01-01

    We describe the data processing pipeline of the Planck Low Frequency Instrument (LFI) data processing centre (DPC) used to create and characterize full-sky maps based on the first 15.5 months of operations at 30, 44, and 70 GHz. In particular, we discuss the various steps involved in reducing the data......) is employed to combine radiometric data and pointing information into sky maps, minimizing the variance of correlated noise. Noise covariance matrices, required to compute statistical uncertainties on LFI and Planck products, are also produced. Main beams are estimated down to the ≈ −20 dB level

  8. Heavy Ion Reactions: The Elementary Processes, Parts I and II

    CERN Document Server

    Broglia, Ricardo A

    2004-01-01

    Combining elastic and inelastic processes with transfer reactions, this two-part volume explores how these events affect heavy ion collisions. Special attention is given to processes involving the transfer of two nucleons, which are specific for probing pairing correlations in nuclei. This novel treatment provides, together with the description of surface vibration and rotations, a unified picture of heavy ion reactions in terms of the elementary modes of nuclear excitation. Heavy Ion Reactions is essential reading for beginning graduate students as well as experienced researchers.

  9. Processes on Uncontrolled Aerodromes and Safety Indicators - Part II

    Directory of Open Access Journals (Sweden)

    Vladimír Plos

    2014-01-01

    Full Text Available This article follows on from Part I, where the basic processes at uncontrolled aerodromes were introduced. Uncontrolled aerodromes face growing traffic, which results in a higher workload for the AFIS officer. This means a higher potential for dangerous situations. The article describes models of several sub-processes and defines safety indicators related to operations at uncontrolled aerodromes. Through the monitoring and evaluation of these safety indicators, targeted safety measures can be adopted, thereby increasing safety at small uncontrolled aerodromes.

  10. Planck 2015 results. II. Low Frequency Instrument data processing

    CERN Document Server

    Ade, P.A.R.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Ballardini, M.; Banday, A.J.; Barreiro, R.B.; Bartolo, N.; Basak, S.; Battaglia, P.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Cardoso, J.-F.; Castex, G.; Catalano, A.; Chamballu, A.; Christensen, P.R.; Colombi, S.; Colombo, L.P.L.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T.A.; Eriksen, H.K.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Franceschet, C.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F.K.; Hanson, D.; Harrison, D.L.; Henrot-Versillé, S.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, A.H.; Jaffe, T.R.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T.S.; Knoche, J.; Krachmalnicoff, N.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P.B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P.M.; Macías-Pérez, J.F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P.G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Mazzotta, P.; McGehee, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Montier, L.; Morgante, G.; Morisset, N.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J.A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Nørgaard-Nielsen, H.U.; Novikov, D.; Novikov, I.; Oppermann, N.; Paci, F.; Pagano, L.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T.J.; Peel, M.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Pierpaoli, E.; Pietrobon, D.; Pointecouteau, E.; Polenta, G.; Pratt, G.W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J.P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renzi, A.; Rocha, G.; Romelli, E.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Stolyarov, V.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vassallo, T.; Vielva, P.; Villa, F.; Wade, L.A.; Wandelt, B.D.; Watson, R.; Wehus, I.K.; Wilkinson, A.; Yvon, D.; Zacchei, A.

    2016-01-01

    We present an updated description of the Planck Low Frequency Instrument (LFI) data processing pipeline, associated with the 2015 data release. We point out the places in which our results and methods have remained unchanged since the 2013 paper and we highlight the changes made for the 2015 release, describing the products (especially timelines) and the ways in which they were obtained. We demonstrate that the pipeline is self-consistent (principally on the basis of simulations) and report all null tests. We refer to other related papers where more detailed descriptions of the LFI data processing pipeline may be found if needed.

  11. Planck 2015 results: II. Low Frequency Instrument data processing

    DEFF Research Database (Denmark)

    Ade, P. A R; Aghanim, N.; Ashdown, M.

    2016-01-01

    We present an updated description of the Planck Low Frequency Instrument (LFI) data processing pipeline, associated with the 2015 data release. We point out the places where our results and methods have remained unchanged since the 2013 paper and we highlight the changes made for the 2015 release...

  12. Zinc electrode shape change II. Process and mechanism

    NARCIS (Netherlands)

    Einerhand, R.E.F.; Visscher, W.; de Goeij, J.J.M.; Barendrecht, E.

    1991-01-01

    The process and mechanism of zinc electrode shape change is investigated with the radiotracer technique. It is shown that during repeated cycling of the nickel oxide/zinc battery, zinc material is transported over the zinc electrode via the battery electrolyte. During charge as well as during...

  13. Writing for publication Part II--The writing process.

    Science.gov (United States)

    Clarke, L K

    1999-01-01

    You have selected a topic, gathered resources, and identified your target audience. The next step is to begin to write and organize your ideas. Initiating the actual writing process can be intimidating, especially for a novice author. This portion of the writing for publication series focuses on helping the writer to organize ideas and get started.

  14. Identification of new fluorescence processes in the UV spectra of cool stars from new energy levels of Fe II and Cr II

    Science.gov (United States)

    Johansson, Sveneric; Carpenter, Kenneth G.

    1988-01-01

    Two fluorescence processes operating in atmospheres of cool stars, symbiotic stars, and the Sun are presented. Two emission lines, at 1347.03 and 1360.17 A, are identified as fluorescence lines of Cr II and Fe II. The lines are due to transitions from highly excited levels, which are populated radiatively by the hydrogen Lyman alpha line due to accidental wavelength coincidences. Three energy levels, one in Cr II and two in Fe II, are reported.

  15. Constraint-based scheduling

    Science.gov (United States)

    Zweben, Monte

    1993-01-01

    The GERRY scheduling system developed by NASA Ames with assistance from the Lockheed Space Operations Company, and the Lockheed Artificial Intelligence Center, uses a method called constraint-based iterative repair. Using this technique, one encodes both hard rules and preference criteria into data structures called constraints. GERRY repeatedly attempts to improve schedules by seeking repairs for violated constraints. The system provides a general scheduling framework which is being tested on two NASA applications. The larger of the two is the Space Shuttle Ground Processing problem which entails the scheduling of all the inspection, repair, and maintenance tasks required to prepare the orbiter for flight. The other application involves power allocation for the NASA Ames wind tunnels. Here the system will be used to schedule wind tunnel tests with the goal of minimizing power costs. In this paper, we describe the GERRY system and its application to the Space Shuttle problem. We also speculate as to how the system would be used for manufacturing, transportation, and military problems.
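
    A minimal sketch of the constraint-based iterative repair loop described above, assuming hypothetical task and constraint structures rather than GERRY's actual data model:

```python
import random

# Minimal sketch of constraint-based iterative repair (hypothetical data
# model, not GERRY's actual structures). A schedule maps tasks to start
# times; each constraint scores its violation and proposes a local repair.

def total_penalty(schedule, constraints):
    """Sum the penalty of every violated constraint."""
    return sum(c.penalty(schedule) for c in constraints)

def iterative_repair(schedule, constraints, max_iters=1000):
    best = dict(schedule)
    best_cost = total_penalty(best, constraints)
    for _ in range(max_iters):
        violated = [c for c in constraints if c.penalty(schedule) > 0]
        if not violated:
            break                      # all hard rules satisfied
        c = random.choice(violated)    # pick a violated constraint
        schedule = c.repair(schedule)  # move a task to reduce the violation
        cost = total_penalty(schedule, constraints)
        if cost < best_cost:           # keep the best schedule seen so far
            best, best_cost = dict(schedule), cost
    return best

class Precedence:
    """Hard rule: task a must finish before task b starts."""
    def __init__(self, a, b, dur_a):
        self.a, self.b, self.dur_a = a, b, dur_a
    def penalty(self, s):
        return max(0, s[self.a] + self.dur_a - s[self.b])
    def repair(self, s):
        s = dict(s)
        s[self.b] = s[self.a] + self.dur_a   # push b after a finishes
        return s

schedule = iterative_repair({"inspect": 5, "repair": 0},
                            [Precedence("inspect", "repair", dur_a=3)])
print(schedule)   # {'inspect': 5, 'repair': 8}
```

    Preference criteria fit the same mold: they contribute a soft penalty rather than a hard violation, so the loop trades them off against the hard rules when choosing which repair to keep.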

  16. The consideration and practice of data processing of WBS-II portal β monitor

    International Nuclear Information System (INIS)

    Du Xiangyang; Dong Qiangmin; Zhang Yong; Han Shuping; Wang Xiaodong; Fan Liya; Rao Xianming

    2001-01-01

    The main aspects of background and human-body measurement data processing in the WBS-II Portal β Monitor are discussed. A theoretical analysis of setting the high and low background-warning thresholds in data processing is given, and the relevant reference values are provided in part for local operators. The measurement 'blind zone' and the overall warning function of the data processing are discussed, and the structure, monitoring procedure and microcomputer hardware of the WBS-II Portal β Monitor are briefly introduced.

  17. Planck 2013 results. II. Low Frequency Instrument data processing

    CERN Document Server

    Aghanim, N; Arnaud, M; Ashdown, M; Atrio-Barandela, F; Aumont, J; Baccigalupi, C; Banday, A J; Barreiro, R B; Battaner, E; Benabed, K; Benoît, A; Benoit-Lévy, A; Bernard, J -P; Bersanelli, M; Bielewicz, P; Bobin, J; Bock, J J; Bonaldi, A; Bonavera, L; Bond, J R; Borrill, J; Bouchet, F R; Bridges, M; Bucher, M; Burigana, C; Butler, R C; Cappellini, B; Cardoso, J -F; Catalano, A; Chamballu, A; Chen, X; Chiang, L -Y; Christensen, P R; Church, S; Colombi, S; Colombo, L P L; Crill, B P; Cruz, M; Curto, A; Cuttaia, F; Danese, L; Davies, R D; Davis, R J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Dickinson, C; Diego, J M; Dole, H; Donzelli, S; Doré, O; Douspis, M; Dupac, X; Efstathiou, G; Enßlin, T A; Eriksen, H K; Falvella, M C; Finelli, F; Forni, O; Frailis, M; Franceschi, E; Gaier, T C; Galeotta, S; Ganga, K; Giard, M; Giardino, G; Giraud-Héraud, Y; Gjerløw, E; González-Nuevo, J; Górski, K M; Gratton, S; Gregorio, A; Gruppuso, A; Hansen, F K; Hanson, D; Harrison, D; Henrot-Versillé, S; Hernández-Monteagudo, C; Herranz, D; Hildebrandt, S R; Hivon, E; Hobson, M; Holmes, W A; Hornstrup, A; Hovest, W; Huffenberger, K M; Jaffe, T R; Jaffe, A H; Jewell, J; Jones, W C; Juvela, M; Kangaslahti, P; Keihänen, E; Keskitalo, R; Kiiveri, K; Kisner, T S; Knoche, J; Knox, L; Kunz, M; Kurki-Suonio, H; Lagache, G; Lähteenmäki, A; Lamarre, J -M; Lasenby, A; Lattanzi, M; Laureijs, R J; Lawrence, C R; Leach, S; Leahy, J P; Leonardi, R; Lesgourgues, J; Liguori, M; Lilje, P B; Lindholm, V; Linden-Vørnle, M; López-Caniego, M; Lubin, P M; Macías-Pérez, J F; Maggio, G; Maino, D; Mandolesi, N; Maris, M; Marshall, D J; Martin, P G; Martínez-González, E; Masi, S; Matarrese, S; Matthai, F; Mazzotta, P; Meinhold, P R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Naselsky, P; Natoli, P; Netterfield, C B; Nørgaard-Nielsen, H U; Novikov, D; Novikov, I; O'Dwyer, I J; Osborne, S; Paci, F; Pagano, L; Paladini, R; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Peel, M; Perdereau, O; Perotto, L; Perrotta, F; Pierpaoli, E; Pietrobon, D; Plaszczynski, S; Platania, P; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Poutanen, T; Pratt, G W; Prézeau, G; Prunet, S; Puget, J -L; Rachen, J P; Reach, W T; Rebolo, R; Reinecke, M; Remazeilles, M; Ricciardi, S; Riller, T; Rocha, G; Rosset, C; Rossetti, M; Roudier, G; Rubiño-Martín, J A; Rusholme, B; Salerno, E; Sandri, M; Santos, D; Scott, D; Seiffert, M D; Shellard, E P S; Spencer, L D; Starck, J -L; Stolyarov, V; Stompor, R; Sureau, F; Sutton, D; Suur-Uski, A -S; Sygnet, J -F; Tauber, J A; Tavagnacco, D; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Tuovinen, J; Türler, M; Umana, G; Valenziano, L; Valiviita, J; Van Tent, B; Varis, J; Vielva, P; Villa, F; Vittorio, N; Wade, L A; Wandelt, B D; Watson, R; Wehus, I K; White, S D M; Wilkinson, A; Yvon, D; Zacchei, A; Zonca, A

    2014-01-01

    We describe the data processing pipeline of the Planck Low Frequency Instrument (LFI) data processing centre (DPC) to create and characterize full-sky maps based on the first 15.5 months of operations at 30, 44 and 70 GHz. In particular, we discuss the various steps involved in reducing the data, starting from telemetry packets through to the production of cleaned, calibrated timelines and calibrated frequency maps. Data are continuously calibrated using the modulation induced on the mean temperature of the cosmic microwave background radiation by the proper motion of the spacecraft. Sky signals other than the dipole are removed by an iterative procedure based on simultaneous fitting of calibration parameters and sky maps. Noise properties are estimated from time-ordered data after the sky signal has been removed, using a generalized least square map-making algorithm. A destriping code (Madam) is employed to combine radiometric data and pointing information into sky maps, minimizing the variance of correlated...
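
    As a toy illustration of the binned map-making step mentioned above: with white noise, the least-squares map estimate reduces to m = (AᵀA)⁻¹Aᵀd, a noise-weighted average per pixel. The sketch below shows only this binning step; the real pipeline's destriper (Madam) additionally fits correlated-noise baselines, and the numbers here are synthetic.

```python
import numpy as np

# Toy binned map-making sketch (not the Madam destriper itself): given a
# time-ordered data vector d and a pointing that assigns each sample to a
# sky pixel, the simple least-squares estimate with white noise is
# m = (A^T A)^{-1} A^T d, i.e. the average of the samples hitting each pixel.

npix, nsamp = 4, 12
rng = np.random.default_rng(0)
pix = rng.integers(0, npix, nsamp)               # pointing: pixel hit per sample
sky = np.array([1.0, -2.0, 0.5, 3.0])            # "true" sky map
d = sky[pix] + 0.1 * rng.standard_normal(nsamp)  # timeline = signal + noise

hits = np.bincount(pix, minlength=npix)          # A^T A is diagonal here
m = np.bincount(pix, weights=d, minlength=npix) / np.maximum(hits, 1)
print(m)   # recovered map, close to `sky` wherever pixels were hit
```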

  18. Study on multi-objective flexible job-shop scheduling problem considering energy consumption

    Directory of Open Access Journals (Sweden)

    Zengqiang Jiang

    2014-06-01

    Full Text Available Purpose: Build a multi-objective Flexible Job-shop Scheduling Problem (FJSP) optimization model in which makespan, processing cost, energy consumption and cost-weighted processing quality are considered, and design a modified Non-dominated Sorting Genetic Algorithm (NSGA-II) based on blood variation for this scheduling model. Design/methodology/approach: Multi-objective optimization theory based on the Pareto optimal method is used to carry out the optimization model, and NSGA-II is used to solve it. Findings: Analysis of the research status and shortcomings of multi-objective FJSP shows that differences in scheduling also affect energy consumption in the machining process and environmental emissions. Job-shop scheduling therefore requires not only guaranteeing processing quality, time and cost, but also optimizing the operation plan of machines and minimizing energy consumption. Originality/value: A multi-objective FJSP optimization model is put forward in which makespan, processing cost, energy consumption and cost-weighted processing quality are considered. Based on this model, Blood-Variation-based NSGA-II (BVNSGA-II) is designed, in which the chromosome mutation rate is determined after calculating the blood relationship between two crossed chromosomes, the crossover and mutation strategy of NSGA-II is optimized, and premature convergence of the population is avoided. Finally, the performance of the proposed model and algorithm is evaluated through a case study, and the results prove the efficiency and feasibility of the proposed model and algorithm.
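
    The ranking step at the heart of any NSGA-II variant, including the BVNSGA-II described above, is fast non-dominated sorting; a compact illustrative sketch for minimization objectives (not the authors' implementation):

```python
# Compact fast non-dominated sorting, the core ranking step of NSGA-II
# (illustrative only, not the authors' BVNSGA-II code). Each solution is
# a tuple of objectives to minimize, e.g. (makespan, cost, energy, quality).

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(pop):
    fronts, dominated_by, count = [[]], [[] for _ in pop], [0] * len(pop)
    for i, p in enumerate(pop):
        for j, q in enumerate(pop):
            if dominates(p, q):
                dominated_by[i].append(j)   # i dominates j
            elif dominates(q, p):
                count[i] += 1               # i is dominated by j
        if count[i] == 0:
            fronts[0].append(i)             # Pareto-optimal front
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                count[j] -= 1
                if count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

pop = [(10, 5), (8, 7), (9, 6), (12, 4), (11, 8)]
print(non_dominated_sort(pop))   # [[0, 1, 2, 3], [4]]
```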

  19. A multiprocessor computer simulation model employing a feedback scheduler/allocator for memory space and bandwidth matching and TMR processing

    Science.gov (United States)

    Bradley, D. B.; Irwin, J. D.

    1974-01-01

    A computer simulation model for a multiprocessor computer is developed that is useful for studying the problem of matching a multiprocessor's memory space, memory bandwidth, and numbers and speeds of processors with aggregate job set characteristics. The model assumes an input workload of a set of recurrent jobs. The model includes a feedback scheduler/allocator which attempts to improve system performance through higher memory bandwidth utilization by matching individual job requirements for space and bandwidth with space availability and estimates of bandwidth availability at the times of memory allocation. The simulation model includes provisions for specifying precedence relations among the jobs in a job set, and provisions for specifying execution of TMR (Triple Modular Redundant) and SIMPLEX (non-redundant) jobs.

  20. Dust in Supernovae and Supernova Remnants II: Processing and Survival

    Science.gov (United States)

    Micelotta, E. R.; Matsuura, M.; Sarangi, A.

    2018-03-01

    Observations have recently shown that supernovae are efficient dust factories, as predicted for a long time by theoretical models. The rapid evolution of their stellar progenitors combined with their efficiency in precipitating refractory elements from the gas phase into dust grains make supernovae the major potential suppliers of dust in the early Universe, where more conventional sources like Asymptotic Giant Branch (AGB) stars did not have time to evolve. However, dust yields inferred from observations of young supernovae or derived from models do not reflect the net amount of supernova-condensed dust able to be expelled from the remnants and reach the interstellar medium. The cavity where the dust is formed and initially resides is crossed by the high velocity reverse shock which is generated by the pressure of the circumstellar material shocked by the expanding supernova blast wave. Depending on grain composition and initial size, processing by the reverse shock may lead to substantial dust erosion and even complete destruction. The goal of this review is to present the state of the art about processing and survival of dust inside supernova remnants, in terms of theoretical modelling and comparison to observations.

  1. 2007 Wholesale Power Rate Schedules : 2007 General Rate Schedule Provisions.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    2006-11-01

    This schedule is available for the contract purchase of Firm Power to be used within the Pacific Northwest (PNW). Priority Firm (PF) Power may be purchased by public bodies, cooperatives, and Federal agencies for resale to ultimate consumers, for direct consumption, and for Construction, Test and Start-Up, and Station Service. Rates in this schedule are in effect beginning October 1, 2006, and apply to purchases under requirements Firm Power sales contracts for a three-year period. The Slice Product is only available for public bodies and cooperatives who have signed Slice contracts for the FY 2002-2011 period. Utilities participating in the Residential Exchange Program (REP) under Section 5(c) of the Northwest Power Act may purchase Priority Firm Power pursuant to the Residential Exchange Program. Rates under contracts that contain charges that escalate based on BPA's Priority Firm Power rates shall be based on the three-year rates listed in this rate schedule in addition to applicable transmission charges. This rate schedule supersedes the PF-02 rate schedule, which went into effect October 1, 2001. Sales under the PF-07 rate schedule are subject to BPA's 2007 General Rate Schedule Provisions (2007 GRSPs). Products available under this rate schedule are defined in the 2007 GRSPs. For sales under this rate schedule, bills shall be rendered and payments due pursuant to BPA's 2007 GRSPs and billing process.

  2. Process maps for plasma spray. Part II: Deposition and properties

    International Nuclear Information System (INIS)

    XIANGYANG, JIANG; MATEJICEK, JIRI; KULKARNI, ANAND; HERMAN, HERBERT; SAMPATH, SANJAY; GILMORE, DELWYN L.; NEISER A, RICHARD Jr.

    2000-01-01

    This is the second paper of a two-part series based on an integrated study carried out at the State University of New York at Stony Brook and Sandia National Laboratories. The goal of the study is a fundamental understanding of the plasma-particle interaction, droplet/substrate interaction, deposit formation dynamics and microstructure development, as well as deposit properties. The outcome is science-based relationships which can be used to link processing to performance. Molybdenum splats and coatings produced at three plasma conditions and three substrate temperatures were characterized. It was found that there is a strong mechanical/thermal interaction between droplet and substrate, which builds up the coating/substrate adhesion. Hardness, thermal conductivity, and modulus increase, while oxygen content and porosity decrease, with increasing particle velocity. Increasing deposition temperature resulted in dramatic improvement in coating thermal conductivity and hardness, as well as an increase in coating oxygen content. Indentation reveals improved fracture resistance for the coatings prepared at higher deposition temperature. Residual stress was significantly affected by deposition temperature, although not significantly by particle energy within the investigated parameter range. Coatings prepared at high deposition temperature with high-energy particles suffered considerably less damage in wear tests. Possible mechanisms behind these changes are discussed within the context of relational maps which are under development.

  3. A customizable system for real-time image processing using the Blackfin DSProcessor and the MicroC/OS-II real-time kernel

    Science.gov (United States)

    Coffey, Stephen; Connell, Joseph

    2005-06-01

    This paper presents a development platform for real-time image processing based on the ADSP-BF533 Blackfin processor and the MicroC/OS-II real-time operating system (RTOS). MicroC/OS-II is a completely portable, ROMable, pre-emptive, real-time kernel. The Blackfin Digital Signal Processors (DSPs), incorporating the Analog Devices/Intel Micro Signal Architecture (MSA), are a broad family of 16-bit fixed-point products with a dual Multiply Accumulate (MAC) core. In addition, they have a rich instruction set with variable instruction length and both DSP and MCU functionality thus making them ideal for media based applications. Using the MicroC/OS-II for task scheduling and management, the proposed system can capture and process raw RGB data from any standard 8-bit greyscale image sensor in soft real-time and then display the processed result using a simple PC graphical user interface (GUI). Additionally, the GUI allows configuration of the image capture rate and the system and core DSP clock rates thereby allowing connectivity to a selection of image sensors and memory devices. The GUI also allows selection from a set of image processing algorithms based in the embedded operating system.

  4. Complete Element Abundances of Nine Stars in the r-process Galaxy Reticulum II

    Science.gov (United States)

    Ji, Alexander P.; Frebel, Anna; Simon, Joshua D.; Chiti, Anirudh

    2016-10-01

    We present chemical abundances derived from high-resolution Magellan/Magellan Inamori Kyocera Echelle spectra of the nine brightest known red giant members of the ultra-faint dwarf galaxy Reticulum II (Ret II). These stars span the full metallicity range of Ret II ([Fe/H] down to -3.5) ... the known r-process pattern. The abundances of lighter elements up to the iron peak are otherwise similar to abundances of stars in the halo and in other ultra-faint dwarf galaxies. However, the scatter in abundance ratios is large enough to suggest that inhomogeneous metal mixing is required to explain the chemical evolution of this galaxy. The presence of low amounts of neutron-capture elements in other ultra-faint dwarf galaxies may imply the existence of additional r-process sites besides the source of r-process elements in Ret II. Galaxies like Ret II may be the original birth sites of r-process enhanced stars now found in the halo. This paper includes data gathered with the 6.5 m Magellan Telescopes located at Las Campanas Observatory, Chile.

  5. Application of Zr/Ti-Pic in the adsorption process of Cu(II), Co(II) and Ni(II) using adsorption physico-chemical models and thermodynamics of the process; Aplicacao de Zr/Ti-PILC no processo de adsorcao de Cu(II), Co(II) e Ni(II) utilizando modelos fisico-quimicos de adsorcao e termodinamica do processo

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, Denis Lima; Airoldi, Claudio [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Dept. de Quimica Inorganica]. E-mail: dlguerra@iqm.unicamp.br; Lemos, Vanda Porpino; Angelica, Romulo Simoes [Universidade Federal do Para (UFPa), Belem (Brazil); Viana, Rubia Ribeiro [Universidade Federal do Mato Grosso (UFMT), Cuiaba (Brazil). Inst. de Ciencias Exatas e da Terra. Dept. de Recursos Minerais

    2008-07-01

    The aim of this investigation is to study how Zr/Ti-PILC adsorbs metals. The physico-chemical properties of Zr/Ti-PILC have been optimized through the pillarization process, and Cu(II), Ni(II) and Co(II) adsorption from aqueous solution has been carried out, with maximum adsorption values of 8.85, 8.30 and 7.78 × 10⁻¹ mmol g⁻¹, respectively. The Langmuir, Freundlich and Temkin adsorption isotherm models have been applied to fit the experimental data with a linear regression process. The energetic effect caused by metal interaction was determined through calorimetric titration at the solid-liquid interface and gave a net thermal effect that enabled the calculation of the exothermic values and the equilibrium constant. (author)
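
    The linearized Langmuir fit mentioned above takes only a few lines; the concentrations and loadings below are made-up illustrative numbers, not the paper's measurements:

```python
import numpy as np

# Linearized Langmuir fit: q = q_max*K*C / (1 + K*C) rearranges to
# C/q = 1/(q_max*K) + C/q_max, which is linear in C and can be fitted
# by ordinary least squares. Data below are illustrative only.

C = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # equilibrium conc. (mmol/L)
q = np.array([0.30, 0.48, 0.65, 0.78, 0.85])   # adsorbed amount (mmol/g)

slope, intercept = np.polyfit(C, C / q, 1)     # C/q vs C is a straight line
q_max = 1.0 / slope                            # monolayer capacity
K = slope / intercept                          # Langmuir constant
r2 = np.corrcoef(C, C / q)[0, 1] ** 2
print(f"q_max={q_max:.2f} mmol/g, K={K:.2f} L/mmol, R^2={r2:.3f}")
```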

  6. LEARNING SCHEDULER PARAMETERS FOR ADAPTIVE PREEMPTION

    OpenAIRE

    Prakhar Ojha; Siddhartha R Thota; Vani M; Mohit P Tahilianni

    2015-01-01

    An operating system scheduler is expected not to let the processor stay idle while any process is ready or waiting for execution. This problem gains importance as the number of processes always outnumbers the processors by large margins. It is in this regard that schedulers are provided with the ability to preempt a running process, following some scheduling algorithm, and thus give the illusion of several processes running simultaneously. A process which is allowed t...

  7. CMS multicore scheduling strategy

    International Nuclear Information System (INIS)

    Yzquierdo, Antonio Pérez-Calero; Hernández, Jose; Holzman, Burt; Majewski, Krista; McCrea, Alison

    2014-01-01

    In the coming years, processor architectures based on much larger numbers of cores will most likely be the model for continuing 'Moore's Law'-style throughput gains. This not only results in many more jobs running the LHC Run 1 era monolithic applications in parallel, but the memory requirements of these processes also push worker-node architectures to the limit. One solution is parallelizing the application itself, through forking and memory sharing or through threaded frameworks. CMS is following all of these approaches and has a comprehensive strategy to schedule multicore jobs on the GRID based on the glideinWMS submission infrastructure. The main component of the scheduling strategy, a pilot-based model with dynamic partitioning of resources that allows the transition to multicore or whole-node scheduling without disallowing the use of single-core jobs, is described. This contribution also presents the experience gained with the proposed multicore scheduling schema and gives an outlook of further developments working towards the restart of the LHC in 2015.

  8. Lot-Order Assignment Applying Priority Rules for the Single-Machine Total Tardiness Scheduling with Nonnegative Time-Dependent Processing Times

    Directory of Open Access Journals (Sweden)

    Jae-Gon Kim

    2015-01-01

    Full Text Available Lot-order assignment assigns items in lots being processed to orders so as to fulfill the orders. It is usually performed periodically to meet the due dates of orders, especially in a manufacturing industry with a long production cycle time such as the semiconductor manufacturing industry. In this paper, we consider the lot-order assignment problem (LOAP) with the objective of minimizing the total tardiness of the orders with distinct due dates. We show that we can solve the LOAP optimally by finding an optimal sequence for the single-machine total tardiness scheduling problem with nonnegative time-dependent processing times (SMTTSP-NNTDPT). Also, we address how the priority rules for the SMTTSP can be modified to those for the SMTTSP-NNTDPT to solve the LOAP. In computational experiments, we discuss the performance of the suggested priority rules and show that the proposed approach outperforms a commercial optimization software package.
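
    A sketch of such a priority-rule dispatch, using the classical Modified Due Date (MDD) rule adapted to start-time-dependent processing times (an illustrative adaptation, not the authors' exact modified rules):

```python
# List scheduling with the Modified Due Date (MDD) priority rule, adapted
# to time-dependent processing times p_j(t) in the spirit of the
# SMTTSP-NNTDPT (illustrative adaptation, not the authors' exact rules).

def mdd_schedule(due, proc_time):
    """due: dict job -> due date; proc_time(job, t) -> processing time if
    the job starts at time t. Returns (sequence, total tardiness)."""
    t, seq, tardiness = 0.0, [], 0.0
    remaining = dict(due)
    while remaining:
        # MDD priority: pick the job minimizing max(d_j, t + p_j(t))
        nxt = min(remaining,
                  key=lambda j: max(remaining[j], t + proc_time(j, t)))
        t += proc_time(nxt, t)
        tardiness += max(0.0, t - remaining.pop(nxt))
        seq.append(nxt)
    return seq, tardiness

# Example: processing times grow linearly with start time (deterioration).
base = {"A": 3.0, "B": 2.0, "C": 4.0}
due = {"A": 5.0, "B": 4.0, "C": 12.0}
seq, tt = mdd_schedule(due, lambda j, t: base[j] + 0.1 * t)
print(seq, round(tt, 3))   # ['B', 'A', 'C'] 0.2
```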

  9. Analysis on the nitrogen drilling accident of Well Qionglai 1 (II: Restoration of the accident process and lessons learned

    Directory of Open Access Journals (Sweden)

    Yingfeng Meng

    2015-12-01

    Full Text Available All the important events of the nitrogen drilling accident of Well Qionglai 1 were reconstructed and analyzed in Paper I. In this Paper II, based on the investigation information, the well log data and calculation and simulation results, and following the fault tree analysis method of safety engineering, every possible composition, probability and time sequence of the events of the accident of Well Qionglai 1 is analyzed, the implications of the logging data are revealed, and the process of the accident of Well Qionglai 1 is restored. Some important conclusions are obtained: the objective causes of the accident are the rock burst and the events induced by it, while the subjective cause is that the blooie pipe could not bear the flow burden of the clasts from the rock burst and was blocked by them. The blocking of the blooie pipe caused high pressure at the wellhead, the high pressure made the blooie pipe burst, and natural gas escaped and caught fire. This paper also argues that rock burst in gas drilling in fractured tight-sandstone gas zones is objective and unavoidable, but the accidents induced by rock burst can be avoided by improving the performance of the blooie pipe, wellhead assemblies and drilling tool accessories with respect to downhole rock burst.

  10. A Fe(II)/citrate/UV/PMS process for carbamazepine degradation at a very low Fe(II)/PMS ratio and neutral pH: The mechanisms.

    Science.gov (United States)

    Ling, Li; Zhang, Dapeng; Fan, Chihhao; Shang, Chii

    2017-11-01

    A novel Fe(II)/citrate/UV/PMS process for degrading a model micropollutant, carbamazepine (CBZ), at a low Fe(II)/PMS ratio and neutral pH is proposed in this study, and the mechanisms of radical generation in the system are explored. With a UV dose of 302.4 mJ/cm², an initial pH of 7, and CBZ, PMS, Fe(II) and citrate at initial concentrations of 10, 100, 12 and 26 μM, respectively, the CBZ degradation efficiency reached 71% in 20 min in the Fe(II)/citrate/UV/PMS process, 4.7 times higher than in either the citrate/UV/PMS or the Fe(II)/citrate/PMS process. The enhanced CBZ degradation in the Fe(II)/citrate/UV/PMS process was mainly attributed to the continuous activation of PMS by UV-catalyzed regeneration of Fe(II) from a Fe(III)-citrate complex, [Fe₃O(cit)₃H₃]²⁻, which not only kept Fe(III) soluble at neutral pH but also increased its molar absorbance and quantum yield by factors of 6.6 and 2.6, respectively, compared with ionic Fe(III). In the Fe(II)/citrate/UV/PMS process, the SO₄•⁻ produced from the fast reaction between PMS and the initially added Fe(II) contributed 11% of CBZ degradation; PMS activation by the UV radiation and by the regenerated Fe(II) contributed a further 14% and 46% of CBZ removal, respectively. The low iron and citrate doses and the fast radical generation at neutral pH make the Fe(II)/citrate/UV/PMS process suitable for degrading recalcitrant organic compounds in potable water. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. The New Alvin and the Scheduling/Planning Processes for the National Deep Submergence Facility

    Science.gov (United States)

    Alberts, J.; Walden, B.

    2003-12-01

    ... Operation of the NDSF remotely operated vehicle (ROV) assets can be arranged in a fly-away mode on appropriate vessels within the UNOLS fleet or on commercial vessels or foreign research vessels, provided they are suitably equipped. Scheduling of the R/V ATLANTIS is arranged through UNOLS, as is the use of the ROVs on UNOLS ships. Coordination between funding agencies and the UNOLS scheduling process strives to provide users with the optimal scheduling of the assets in a given year. Requests for at-sea use of these assets remain strong for the foreseeable future.

  12. Downlink scheduling using non-orthogonal uplink beams

    KAUST Repository

    Eltayeb, Mohammed E.

    2014-04-01

    Opportunistic schedulers rely on the feedback of the channel state information of users in order to perform user selection and downlink scheduling. This feedback increases with the number of users, and can lead to inefficient use of network resources and scheduling delays. We tackle the problem of feedback design, and propose a novel class of nonorthogonal codes to feed back channel state information. Users with favorable channel conditions simultaneously transmit their channel state information via non-orthogonal beams to the base station. The proposed formulation allows the base station to identify the strong users via a simple correlation process. After deriving the minimum required code length and closed-form expressions for the feedback load and downlink capacity, we show that i) the proposed algorithm reduces the feedback load while matching the achievable rate of full feedback algorithms operating over a noiseless feedback channel, and ii) the proposed codes are superior to the Gaussian codes.

  13. Downlink scheduling using non-orthogonal uplink beams

    KAUST Repository

    Eltayeb, Mohammed E.; Al-Naffouri, Tareq Y.; Bahrami, Hamid Reza Talesh

    2014-01-01

    Opportunistic schedulers rely on the feedback of the channel state information of users in order to perform user selection and downlink scheduling. This feedback increases with the number of users, and can lead to inefficient use of network resources and scheduling delays. We tackle the problem of feedback design, and propose a novel class of nonorthogonal codes to feed back channel state information. Users with favorable channel conditions simultaneously transmit their channel state information via non-orthogonal beams to the base station. The proposed formulation allows the base station to identify the strong users via a simple correlation process. After deriving the minimum required code length and closed-form expressions for the feedback load and downlink capacity, we show that i) the proposed algorithm reduces the feedback load while matching the achievable rate of full feedback algorithms operating over a noiseless feedback channel, and ii) the proposed codes are superior to the Gaussian codes.
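
    The correlation step at the base station can be sketched with random signature codes; this is an illustrative construction only, since the paper proposes a specific non-orthogonal code design:

```python
import numpy as np

# Sketch of correlation-based detection at the base station (illustrative,
# not the paper's exact code construction): users whose channel gain
# exceeds a threshold transmit their signature code simultaneously; the BS
# correlates the superimposed received signal with each known signature.

rng = np.random.default_rng(1)
n_users, code_len = 8, 32
codes = rng.choice([-1.0, 1.0], size=(n_users, code_len))  # non-orthogonal codes
codes /= np.sqrt(code_len)                                 # unit-norm signatures

strong = {1, 4, 6}                                         # users above the SNR threshold
rx = sum(codes[u] for u in strong) + 0.05 * rng.standard_normal(code_len)

corr = codes @ rx                                          # correlate with every signature
detected = set(np.flatnonzero(corr > 0.5))                 # simple threshold test
print(sorted(detected))   # expected: [1, 4, 6] with high probability
```

    With unit-norm signatures, each strong user contributes roughly 1 to its own correlation, while cross-correlation leakage from other users stays near 1/sqrt(code_len), which is what makes the simple threshold test work.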

  14. Robust and Flexible Scheduling with Evolutionary Computation

    DEFF Research Database (Denmark)

    Jensen, Mikkel T.

    Over the last ten years, there have been numerous applications of evolutionary algorithms to a variety of scheduling problems. Like most other research on heuristic scheduling, the primary aim of the research has been on deterministic formulations of the problems. This is in contrast to real-world scheduling problems, which are usually not deterministic. Usually, at the time the schedule is made some information about the problem and processing environment is available, but this information is uncertain and likely to change during schedule execution. Changes frequently encountered in scheduling environments include machine breakdowns, uncertain processing times, workers getting sick, materials being delayed and the appearance of new jobs. These possible environmental changes mean that a schedule which was optimal for the information available at the time of scheduling can end up being highly...

  15. An improved scheduling algorithm for linear networks

    KAUST Repository

    Bader, Ahmed; Alouini, Mohamed-Slim; Ayadi, Yassin

    2017-01-01

    In accordance with the present disclosure, embodiments of an exemplary scheduling controller module or device implement an improved scheduling process such that the targeted reduction in schedule length can be achieved while incurring a minimal energy penalty, by allowing for a large rate (or duration) selection alphabet.

  16. An improved scheduling algorithm for linear networks

    KAUST Repository

    Bader, Ahmed

    2017-02-09

    In accordance with the present disclosure, embodiments of an exemplary scheduling controller module or device implement an improved scheduling process such that the targeted reduction in schedule length can be achieved while incurring a minimal energy penalty, by allowing for a large rate (or duration) selection alphabet.

  17. A Photo Storm Report Mobile Application, Processing/Distribution System, and AWIPS-II Display Concept

    Science.gov (United States)

    Longmore, S. P.; Bikos, D.; Szoke, E.; Miller, S. D.; Brummer, R.; Lindsey, D. T.; Hillger, D.

    2014-12-01

    The increasing use of mobile phones equipped with digital cameras and the ability to post images and information to the Internet in real time has significantly improved the ability to report events almost instantaneously. In the context of severe weather reports, a representative digital image conveys significantly more information than a simple text or phone-relayed report to a weather forecaster issuing severe weather warnings. It also allows the forecaster to reasonably discern the validity and quality of a storm report. Posting geo-located, time-stamped storm report photographs to NWS social media weather forecast office pages via a mobile phone application has generated recent positive feedback from forecasters. Building upon this feedback, this discussion advances the concept, development, and implementation of a formalized Photo Storm Report (PSR) mobile application, processing and distribution system, and Advanced Weather Interactive Processing System II (AWIPS-II) plug-in display software. The PSR system would be composed of three core components: i) a mobile phone application, ii) processing and distribution software and hardware, and iii) AWIPS-II data, exchange and visualization plug-in software. i) The mobile phone application would allow web-registered users to send geo-location, view direction, and time-stamped PSRs along with severe weather type and comments to the processing and distribution servers. ii) The servers would receive PSRs, convert images and information to NWS network bandwidth manageable sizes in an AWIPS-II data format, distribute them on the NWS data communications network, and archive the original PSRs for possible future research datasets. iii) The AWIPS-II data and exchange plug-ins would archive PSRs, and the visualization plug-in would display PSR locations, times and directions by hour, similar to surface observations. Hovering on individual PSRs would reveal photo thumbnails and clicking on them would display the...

  18. Planning and scheduling - A schedule's performance

    International Nuclear Information System (INIS)

    Whitman, N.M.

    1993-01-01

    Planning and scheduling is a process whose time has come to PSI Energy. With an awareness of the challenges ahead, individuals must look for ways to enhance corporate competitiveness. Working toward this goal means that each individual has to dedicate themselves to this more competitive corporate environment. Being competitive may be defined as the ability of each employee to add value to the corporation's economic well-being. The timely and successful implementation of projects greatly enhances competitiveness. Those projects that do not do well often suffer from lack of proper execution, not from lack of talent or strategic vision. Projects are consumers of resources such as cash and people. They produce a return when completed and will generate a better return when properly completed utilizing proven project management techniques. Completing projects on time, within budget, and meeting customer expectations is the way a corporation builds its future. This paper offers suggestions on implementing planning and scheduling and provides a review of results in the form of management reports.

  19. High power CO2 lasers and their material processing applications at Centre for Advanced Technology, India

    Science.gov (United States)

    Nath, A. K.; Paul, C. P.; Rao, B. T.; Kau, R.; Raghu, T.; Mazumdar, J. Dutta; Dayal, R. K.; Mudali, U. Kamachi; Sastikumar, D.; Gandhi, B. K.

    2006-01-01

    We have developed high power transverse flow (TF) CW CO2 lasers up to 15 kW, a high repetition rate TEA CO2 laser of 500 Hz, 500 W average power, and an RF excited fast axial flow CO2 laser at the Centre for Advanced Technology, and have carried out various material processing applications with these lasers. We observed very little variation of discharge voltage with electrode gap in TF CO2 lasers. With an optimally modulated laser beam we obtained better results in laser piercing and cutting of titanium and in resolidification of 316L stainless steel weld metal for improving intergranular corrosion resistance. We carried out microstructure and phase analysis of laser-bent 304 stainless steel sheet, and optimum process zones were obtained. We carried out laser cladding of 316L stainless steel and Al-alloy substrates with Mo, WC, and Cr3C2 powder to improve their wear characteristics. We developed a laser rapid manufacturing facility and fabricated components of various geometries with a minimum surface roughness of 5-7 microns Ra and surface waviness of 45 microns between overlapped layers, using Colmonoy-6, 316L stainless steel and Inconel powders. Cutting of thick concrete blocks by repeated laser glazing followed by mechanical scrubbing, and drilling holes in vertical concrete with the laser beam incident at an optimum angle allowing molten material to flow out under gravity, were also demonstrated. Some of these studies are briefly presented here.

  20. Comparative assessment of TRU waste forms and processes. Volume II. Waste form data, process descriptions, and costs

    International Nuclear Information System (INIS)

    Ross, W.A.; Lokken, R.O.; May, R.P.; Roberts, F.P.; Thornhill, R.E.; Timmerman, C.L.; Treat, R.L.; Westsik, J.H. Jr.

    1982-09-01

    This volume contains supporting information for the comparative assessment of the transuranic waste forms and processes summarized in Volume I. Detailed data on the characterization of the waste forms selected for the assessment, process descriptions, and cost information are provided. The purpose of this volume is to provide additional information that may be useful when using the data in Volume I and to provide greater detail on particular waste forms and processes. Volume II is divided into two sections and two appendixes. The first section provides information on the preparation of the waste form specimens used in this study and additional characterization data in support of that in Volume I. The second section includes detailed process descriptions for the eight processes evaluated. Appendix A lists the results of MCC-1 leach test and Appendix B lists additional cost data. 56 figures, 12 tables

  1. Tank waste processing analysis: Database development, tank-by-tank processing requirements, and examples of pretreatment sequences and schedules as applied to Hanford Double-Shell Tank Supernatant Waste - FY 1993

    International Nuclear Information System (INIS)

    Colton, N.G.; Orth, R.J.; Aitken, E.A.

    1994-09-01

    This report gives the results of work conducted in FY 1993 by the Tank Waste Processing Analysis Task for the Underground Storage Tank Integrated Demonstration. The main purpose of this task, led by Pacific Northwest Laboratory, is to demonstrate a methodology to identify processing sequences, i.e., the order in which a tank should be processed. In turn, these sequences may be used to assist in the development of time-phased deployment schedules. Time-phased deployment is implementation of pretreatment technologies over a period of time as technologies are required and/or developed. The work discussed here illustrates how tank-by-tank databases and processing requirements have been used to generate processing sequences and time-phased deployment schedules. The processing sequences take into account requirements such as the amount and types of data available for the tanks, tank waste form and composition, required decontamination factors, the types of compact processing units (CPUs) required, and technology availability. These sequences were developed from processing requirements for the tanks, which were determined from spreadsheet analyses. The spreadsheet analysis program was generated by this task in FY 1993. Efforts conducted for this task have focused on the processing requirements for Hanford double-shell tank (DST) supernatant wastes (pumpable liquid) because this waste type is easier to retrieve than the other types (saltcake and sludge), and more tank space would become available for future processing needs. The processing requirements were based on Class A criteria set by the U.S. Nuclear Regulatory Commission and Clean Option goals provided by Pacific Northwest Laboratory.

  2. Artificial neural network (ANN) approach for modeling Zn(II) adsorption in batch process

    Energy Technology Data Exchange (ETDEWEB)

    Yildiz, Sayiter [Engineering Faculty, Cumhuriyet University, Sivas (Turkey)

    2017-09-15

    Artificial neural networks (ANN) were applied to predict adsorption efficiency of peanut shells for the removal of Zn(II) ions from aqueous solutions. Effects of initial pH, Zn(II) concentrations, temperature, contact duration and adsorbent dosage were determined in batch experiments. The sorption capacities of the sorbents were predicted with the aid of equilibrium and kinetic models. The Zn(II) ions adsorption onto peanut shell was better defined by the pseudo-second-order kinetic model, for both initial pH and temperature. The highest R² value in isotherm studies was obtained from the Freundlich isotherm for the inlet concentration and from the Temkin isotherm for the sorbent amount. The high R² values prove that modeling the adsorption process with ANN is a satisfactory approach. The experimental results and the predicted results by the model with the ANN were found to be highly compatible with each other.
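
    A minimal ANN regression sketch in the spirit of this abstract, using scikit-learn with synthetic stand-in data; the feature ranges and target function below are assumptions for illustration, not the paper's experiments:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Minimal ANN sketch of a batch-adsorption model: inputs are the five
# operating variables named in the abstract; the target (removal
# efficiency, %) is synthetic, standing in for experimental data.

rng = np.random.default_rng(42)
X = rng.uniform([2, 10, 288, 5, 0.1], [8, 100, 318, 120, 2.0], size=(200, 5))
# columns: pH, Zn(II) conc. (mg/L), T (K), contact time (min), dose (g/L)
y = 90 / (1 + np.exp(-(X[:, 0] - 5))) * (X[:, 4] / 2.0) ** 0.3  # synthetic %

Xs = StandardScaler().fit_transform(X)          # scale features for the MLP
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
ann.fit(Xs[:150], y[:150])                      # train on 150 samples
print("held-out R^2:", ann.score(Xs[150:], y[150:]))
```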

  3. Laser Welding Process Parameters Optimization Using Variable-Fidelity Metamodel and NSGA-II

    Directory of Open Access Journals (Sweden)

    Wang Chaochao

    2017-01-01

    Full Text Available An optimization methodology based on variable-fidelity (VF) metamodels and the non-dominated sorting genetic algorithm II (NSGA-II) for laser bead-on-plate welding of stainless steel 316L is presented. The relationships between input process parameters (laser power, welding speed and laser focal position) and output responses (weld width and weld depth) are constructed by VF metamodels. In the VF metamodels, information from models of two fidelity levels is integrated: the low-fidelity (LF) model is a finite element simulation model used to capture the general trend of the metamodels, while the high-fidelity (HF) model, built from physical experiments, is used to ensure the accuracy of the metamodels. The accuracy of the VF metamodel is verified by actual experiments. To solve the optimization problem, NSGA-II is used to search for multi-objective Pareto optimal solutions. The results of verification experiments show that the obtained optimal parameters are effective and reliable.

  4. Artificial neural network (ANN) approach for modeling Zn(II) adsorption in batch process

    International Nuclear Information System (INIS)

    Yildiz, Sayiter

    2017-01-01

    Artificial neural networks (ANN) were applied to predict adsorption efficiency of peanut shells for the removal of Zn(II) ions from aqueous solutions. Effects of initial pH, Zn(II) concentrations, temperature, contact duration and adsorbent dosage were determined in batch experiments. The sorption capacities of the sorbents were predicted with the aid of equilibrium and kinetic models. The Zn(II) ions adsorption onto peanut shell was better defined by the pseudo-second-order kinetic model, for both initial pH and temperature. The highest R² value in isotherm studies was obtained from the Freundlich isotherm for the inlet concentration and from the Temkin isotherm for the sorbent amount. The high R² values prove that modeling the adsorption process with ANN is a satisfactory approach. The experimental results and the predicted results by the model with the ANN were found to be highly compatible with each other.

  5. Degradation of a xanthene dye by Fe(II)-mediated activation of Oxone process.

    Science.gov (United States)

    Wang, Y R; Chu, W

    2011-02-28

    A powerful oxidation process using sulfate radicals activated by a transition-metal-mediated Oxone process has been evaluated in depth by monitoring the degradation of a xanthene dye, Rhodamine B (RhB), in aqueous solution. Ferrous ion was chosen as the transition metal due to its potential catalytic effect and wide availability in dyeing industrial effluent. The effects of parameters including reactant dosing sequence, Fe(II)/Oxone molar ratio and concentration, solution pH, and inorganic salts on the process performance have been investigated. Total RhB removal was obtained within 90 min under an optimal Fe(II)/Oxone molar ratio of 1:1. The RhB degradation was found to follow two-stage kinetics, consisting of a rapid initial decay followed by a retarded stage. Additionally, experimental results indicated that the presence of certain anions had either a positive or negative effect on the process. The inhibitory effect in the presence of SO₄²⁻ was elucidated by a proposed formula using the Nernst equation. Furthermore, dye mineralization in terms of TOC removal indicates that stepwise addition of Fe(II) and Oxone can significantly improve the process performance by about 20%, and the retention time required can be greatly reduced compared with the conventional one-off dosing method. Copyright © 2010 Elsevier B.V. All rights reserved.
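
    The two-stage kinetics noted above can be captured by a biexponential fit; a sketch with synthetic data follows, where the rate constants are stand-ins, not the paper's values:

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-stage first-order decay sketch for a dye concentration profile:
# a fast initial stage plus a retarded stage, modeled as a biexponential.
# The data below are synthetic; k1, k2 are not the paper's values.

def two_stage(t, f, k1, k2):
    return f * np.exp(-k1 * t) + (1 - f) * np.exp(-k2 * t)

t = np.linspace(0, 90, 19)                         # minutes
c_obs = two_stage(t, 0.6, 0.30, 0.01) \
        + 0.01 * np.random.default_rng(3).standard_normal(t.size)

(f, k1, k2), _ = curve_fit(two_stage, t, c_obs, p0=(0.5, 0.1, 0.005))
print(f"fast fraction={f:.2f}, k1={k1:.3f} 1/min, k2={k2:.4f} 1/min")
```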

  6. Implementation of parallel processing in the basf2 framework for Belle II

    International Nuclear Information System (INIS)

    Itoh, Ryosuke; Lee, Soohyung; Katayama, N; Mineo, S; Moll, A; Kuhr, T; Heck, M

    2012-01-01

    Recent PC servers are equipped with multi-core CPUs, and it is desirable to utilize their full processing power for the data analysis in large scale HEP experiments. The software framework basf2 is being developed for use in the Belle II experiment, a new generation B-factory experiment at KEK, and parallel event processing to utilize multi-core CPUs is part of its design for use in massive data production. The details of the implementation of parallel event processing in the basf2 framework are discussed, together with a preliminary performance study in realistic use on a 32-core PC server.
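
    The general input-process-output scheme of event-parallel frameworks can be illustrated with Python's multiprocessing module; this is a generic sketch of the scheme, not basf2's actual API, and the event structure is invented for illustration:

```python
import multiprocessing as mp

# Generic sketch of an event-parallel scheme (reader -> worker pool ->
# collector), not basf2's actual API: one loop feeds events to N workers
# over a queue; results are collected as the workers finish.

def worker(in_q, out_q):
    for event in iter(in_q.get, None):                    # None = stop sentinel
        out_q.put((event["id"], sum(event["hits"])))      # stand-in reconstruction

def main(n_proc=4, n_events=100):
    in_q, out_q = mp.Queue(), mp.Queue()
    procs = [mp.Process(target=worker, args=(in_q, out_q)) for _ in range(n_proc)]
    for p in procs:
        p.start()
    for i in range(n_events):                 # "input path": read/deserialize
        in_q.put({"id": i, "hits": [i, i + 1]})
    for _ in procs:                           # one sentinel per worker
        in_q.put(None)
    results = dict(out_q.get() for _ in range(n_events))  # "output path"
    for p in procs:
        p.join()
    print(len(results), "events processed")

if __name__ == "__main__":
    main()
```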

  7. Elementary sulfur in effluent from denitrifying sulfide removal process as adsorbent for zinc(II).

    Science.gov (United States)

    Chen, Chuan; Zhou, Xu; Wang, Aijie; Wu, Dong-hai; Liu, Li-hong; Ren, Nanqi; Lee, Duu-Jong

    2012-10-01

    The denitrifying sulfide removal (DSR) process can simultaneously convert sulfide, nitrate and organic compounds into elementary sulfur (S(0)), di-nitrogen gas and carbon dioxide, respectively. However, the S(0) formed in the DSR process consists of micro-sized colloids with negatively charged surfaces, making isolation of the S(0) colloids from biological cells and metabolites difficult. This study proposes the use of S(0) in DSR effluent as a novel adsorbent for zinc removal from wastewaters. Batch and continuous tests were conducted for efficient zinc removal with S(0)-containing DSR effluent. Zinc(II) removal rates increased with increasing pH. The formed S(0) colloids carried negative charge onto which zinc(II) ions could be adsorbed via electrostatic interactions. The zinc(II)-adsorbed S(0) colloids further enhanced the coagulation-sedimentation efficiency of suspended solids in DSR effluents. The DSR effluent thus presents a promising coagulant for zinc(II)-containing wastewaters. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Cavity Processing and Preparation of 650 MHz Elliptical Cell Cavities for PIP-II

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, Allan [Fermilab; Chandrasekaran, Saravan Kumar [Fermilab; Grassellino, Anna [Fermilab; Melnychuk, Oleksandr [Fermilab; Merio, Margherita [Fermilab; Reid, Thomas [Argonne (main); Sergatskov, Dmitri [Fermilab

    2017-05-01

    The PIP-II project at Fermilab requires fifteen 650 MHz SRF cryomodules as part of the 800 MeV LINAC that will provide a high intensity proton beam to the Fermilab neutrino program. A total of fifty-seven high-performance SRF cavities will populate the cryomodules and will operate in both pulsed and continuous wave modes. These cavities will be processed and prepared for performance testing utilizing adapted cavity processing infrastructure already in place at Fermilab and Argonne. The processing recipes implemented for these structures will incorporate state-of-the-art processing and cleaning techniques developed for 1.3 GHz SRF cavities for the ILC, XFEL, and LCLS-II projects. This paper describes the details of the processing recipes and associated chemistry, heat treatment, and cleanroom processes at the Fermilab and Argonne cavity processing facilities. This paper also presents single and multi-cell cavity test results with quality factors above 5·10¹⁰ and accelerating gradients above 30 MV/m.

  9. NRC comprehensive records disposition schedule

    International Nuclear Information System (INIS)

    1983-05-01

    Effective January 1, 1982, NRC will institute records retention and disposal practices in accordance with the approved Comprehensive Records Disposition Schedule (CRDS). CRDS is comprised of NRC Schedules (NRCS) 1 to 4, which apply to the agency's program or substantive records, and General Records Schedules (GRS) 1 to 24, which apply to housekeeping or facilitative records. NRCS-I applies to records common to all or most NRC offices; NRCS-II applies to program records as found in the various offices of the Commission, Atomic Safety and Licensing Board Panel, and the Atomic Safety and Licensing Appeal Panel; NRCS-III applies to records accumulated by the Advisory Committee on Reactor Safeguards; and NRCS-IV applies to records accumulated in the various NRC offices under the Executive Director for Operations. The schedules are assembled functionally/organizationally to facilitate their use. Preceding the records descriptions and disposition instructions for both NRCS and GRS, there are brief statements on the organizational units which accumulate the records in each functional area, and other information regarding the schedules' applicability.

  10. 78 FR 21818 - Schedules of Controlled Substances: Placement of Methylone Into Schedule I

    Science.gov (United States)

    2013-04-12

    ..., methamphetamine, and MDMA, Schedule I and II substances. These effects included elevated body temperature... of reuptake of monoamines, and in vivo studies (microdialysis, locomotor activity, body temperature.... Yet another commenter claimed that Schedule I placement would ``cripple efforts at learning,'' make it...

  11. High-Temperature Structural Analysis Model of the Process Heat Exchanger for Helium Gas Loop (II)

    International Nuclear Information System (INIS)

    Song, Kee Nam; Lee, Heong Yeon; Kim, Chan Soo; Hong, Seong Duk; Park, Hong Yoon

    2010-01-01

    PHE (Process Heat Exchanger) is a key component required to transfer the 950 °C heat generated in a VHTR (Very High Temperature Reactor) to the chemical reaction that yields a large quantity of hydrogen. The Korea Atomic Energy Research Institute established a helium gas loop for performance testing of components used in the VHTR and manufactured a PHE prototype to be tested in the loop. In this study, as part of the high-temperature structural-integrity evaluation of the PHE prototype, which is scheduled to be tested in the helium gas loop, we carried out high-temperature structural-analysis modeling, thermal analysis, and thermal expansion analysis of the PHE prototype. The results obtained in this study will be used to design the performance test setup for the PHE prototype.

  12. NMR investigation of dynamic processes in complexes of nickel(II) and zinc(II) with iminodiacetate, n-methyliminodiacetate and n-ethyliminodiacetate

    International Nuclear Information System (INIS)

    Wagner, M.R.

    1985-11-01

    Analysis of oxygen-17 bulk water relaxation rates in an aqueous solution of 1:1 Ni(II):ida reveals that two rate-limiting processes are involved in solvent exchange. Analysis of carbon-13 longitudinal relaxation rates of the bis-ligand complexes with zinc(II) is used to determine molecular tumbling rates and methyl rotation rates. The carbon-13 transverse relaxation rates for the carbons in the bis-ligand complex with Ni(II) are adequately fitted by the Solomon-Bloembergen equation. Three carboxylate carbon peaks are seen in the ¹³C spectrum of the 1:2 Ni(II):ida complex, which coalesce into a single peak above about 360 K. The mechanism and rate of ligand exchange are determined for the complexes Zn(II)L₂²⁻ (L = mida, eida) in aqueous solution by total lineshape analysis of the proton spectrum at 500 MHz.

  13. Solvent-refined-coal (SRC) process. Volume II. Sections V-XIV. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1982-05-01

    This report documents the completion of development work on the Solvent Refined Coal Process by The Pittsburgh and Midway Coal Mining Co. The work was initiated in 1966 under Office of Coal Research, US Department of Interior, Contract No. 14-01-0001-496 and completed under US Department of Energy Contract No. DE-AC05-79ET10104. This report discusses work leading to the development of the SRC-I and SRC-II processes, construction of the Fort Lewis Pilot Plant for the successful development of these processes, and results from the operation of this pilot plant. Process design data generated on a 1 ton-per-day Process Development Unit, bench-scale units and through numerous research projects in support of the design of major demonstration plants are also discussed in summary form and fully referenced in this report.

  14. How useful are preemptive schedules?

    NARCIS (Netherlands)

    Brucker, P.; Heitmann, S.; Hurink, J.L.

    2001-01-01

    Machine scheduling admits two options for processing jobs. In a preemptive mode, processing may be interrupted and resumed later, even on a different machine. In a nonpreemptive mode, interruptions are not allowed. Usually, the possibility to preempt jobs leads to better performance values. However, also...

  15. QoS Differentiated and Fair Packet Scheduling in Broadband Wireless Access Networks

    Directory of Open Access Journals (Sweden)

    Zhang Yan

    2009-01-01

    Full Text Available This paper studies the packet scheduling problem in Broadband Wireless Access (BWA) networks. The key difficulties of the BWA scheduling problem lie in the high variability of wireless channel capacity and the unknown model of the packet arrival process. It is difficult for traditional heuristic scheduling algorithms to handle this situation and guarantee satisfying performance in BWA networks. In this paper, we introduce a learning-based approach for a better solution. Specifically, we formulate the packet scheduling problem as an average-cost Semi-Markov Decision Process (SMDP). Then, we solve the SMDP by using reinforcement learning. A feature-based linear approximation and the Temporal-Difference learning technique are employed to produce a near-optimal solution of the corresponding SMDP problem. The proposed algorithm, called Reinforcement Learning Scheduling (RLS), has an in-built capability of self-training. It is able to adaptively and promptly regulate its scheduling policy according to the instantaneous network conditions. Simulation results indicate that RLS outperforms two classical scheduling algorithms and simultaneously considers: (i) effective QoS differentiation, (ii) high bandwidth utilization, and (iii) both short-term and long-term fairness.
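
    The learning machinery behind such a scheduler can be sketched as a Temporal-Difference update with linear features. The paper formulates an average-cost SMDP, so the discounted TD(0) variant below, with stand-in features and costs, is illustrative only:

```python
import numpy as np

# Minimal TD(0) update with a linear value approximation, the kind of
# learning machinery behind RLS-style schedulers (illustrative; the
# features and cost below are stand-ins, not the paper's formulation).

rng = np.random.default_rng(7)
n_features = 4
w = np.zeros(n_features)          # linear value weights: V(s) ~ w . phi(s)
alpha, gamma = 0.05, 0.95         # learning rate, discount factor

def phi(state):
    """Feature vector, e.g. queue lengths, head-of-line delay, channel rate."""
    return np.asarray(state, dtype=float)

state = rng.random(n_features)
for step in range(10_000):
    cost = state[1] / (state[3] + 0.1)     # stand-in: delay vs channel quality
    next_state = np.clip(state + 0.1 * rng.standard_normal(n_features), 0, 1)
    # TD(0): move w . phi(s) toward cost + gamma * w . phi(s')
    td_error = cost + gamma * w @ phi(next_state) - w @ phi(state)
    w += alpha * td_error * phi(state)
    state = next_state
print("learned weights:", np.round(w, 2))
```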

  16. Treatment of plutonium contaminated ashes by electrogenerated Ag(II): a new, simple and efficient process

    International Nuclear Information System (INIS)

    Madic, C.; Saulze, J.L.; Bourges, J.; Lecomte, M.; Koehly, G.

    1990-01-01

    Incineration is a very attractive technique for managing plutonium-contaminated solid wastes, allowing for large volume and mass reduction factors. After waste incineration, the plutonium is concentrated in the ashes and an efficient method must be designed for its recovery. To achieve this goal, a process based on the dissolution of plutonium in nitric solution under the aggressive action of electrogenerated Ag(II) was developed. This process is very simple, requiring very few steps. Plutonium recovery yields of up to 98% can be obtained and, in addition, the plutonium-bearing solutions generated by the treatment can be processed by the PUREX technique for plutonium recovery. This process constitutes the basis for the development of industrial facilities: 1) a pilot facility is being built at MARCOULE (COGEMA, UP1 plant) to treat active ash in 1990; 2) an industrial facility will be built in the MELOX plant under construction at MARCOULE (COGEMA plant).

  17. Effects of annealing schedule on orientation of Bi3.2Nd0.8Ti3O12 ferroelectric film prepared by chemical solution deposition process

    International Nuclear Information System (INIS)

    He, H.Y.; Huang, J.F.; Cao, L.Y.; Wang, L.S.

    2006-01-01

    Fatigue-free Bi3.2Nd0.8Ti3O12 ferroelectric thin films were successfully prepared on a p-Si(111) substrate using a chemical solution deposition process. The orientation and formation of the thin films under different annealing schedules were studied. XRD analysis indicated that (200)-oriented films with degrees of orientation of I(200)/I(117) = 2.097 and 0.466 were obtained by preannealing for 10 min at 400 °C followed by rapid thermal annealing for 3, 10 and 20 min at 700 °C, respectively. (008)-oriented films with a degree of orientation of I(008)/I(117) = 1.706 were obtained by rapid thermal annealing for 3 min at 700 °C without preannealing, and (008)-oriented films with a degree of orientation of I(008)/I(117) = 0.719 were obtained by preheating the film from room temperature at 20 °C/min followed by annealing for 10 min at 700 °C. The a-axis and c-axis orientation decreased with increasing annealing time due to effects of the (111)-oriented substrate. AFM analysis further indicated that preannealing at 400 °C for 10 min followed by rapid thermal annealing for 3 min at 700 °C resulted in the formation of plate-like crystallites parallel to the substrate surface, whereas rapid thermal annealing for 3 min at 700 °C without preannealing resulted in columnar crystallites perpendicular to the substrate surface.

  18. Distributed real time data processing architecture for the TJ-II data acquisition system

    International Nuclear Information System (INIS)

    Ruiz, M.; Barrera, E.; Lopez, S.; Machon, D.; Vega, J.; Sanchez, E.

    2004-01-01

    This article describes the performance of a new architecture model developed for the TJ-II data acquisition system in order to increase its real-time data processing capabilities. The current model consists of several CompactPCI-based PXI (PCI eXtensions for Instrumentation) standard chassis, each with various digitizers. In that architecture, the data processing capability is restricted to the PXI controller's own performance, and the controller must share its CPU resources between data processing and data acquisition tasks. In the new model, a distributed data processing architecture has been developed. The solution adds one or more processing cards to each PXI chassis. This way it is possible to plan how to distribute the processing of all acquired signals among the processing cards and the available resources of the PXI controller. This model allows scalability of the system: more or fewer processing cards can be added based on the requirements of the system. The processing algorithms are implemented in LabVIEW (from National Instruments), providing efficiency and time-saving application development when compared with other efficient solutions.

  19. Immunization Schedules for Adults

    Science.gov (United States)

    CDC resource presenting the 2018 recommended immunization schedule for adults 19 years of age and older, covering recommended vaccinations by age and the diseases that can be prevented by vaccines.

  20. Instant Childhood Immunization Schedule

    Science.gov (United States)

    CDC interactive tool that generates an instant childhood immunization schedule, based on the recommended immunization schedule for children 0 through 6 years of age.

  1. Multi-Objective Scheduling Optimization Based on a Modified Non-Dominated Sorting Genetic Algorithm-II in Voltage Source Converter−Multi-Terminal High Voltage DC Grid-Connected Offshore Wind Farms with Battery Energy Storage Systems

    Directory of Open Access Journals (Sweden)

    Ho-Young Kim

    2017-07-01

    Improving the performance of power systems has become a challenging task for system operators in an open access environment. This paper presents an optimization approach for solving the multi-objective scheduling problem using a modified non-dominated sorting genetic algorithm in a hybrid network of meshed alternating current (AC)/wind-farm grids. This approach considers voltage and power control modes based on multi-terminal voltage source converter high-voltage direct current (MTDC) and battery energy storage systems (BESS). To enhance the hybrid network station performance, we implement an optimal process based on the battery energy storage system operational strategy for multi-objective scheduling over a 24 h demand profile. Furthermore, the proposed approach is formulated as a master problem and a set of sub-problems associated with the hybrid network station to improve the overall computational efficiency using Benders' decomposition. Based on the results of simulations conducted on modified IEEE 14-bus and IEEE 118-bus test systems, we demonstrate and confirm the applicability, effectiveness and validity of the proposed approach.
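    At the heart of any NSGA-II-style optimizer is non-dominated sorting of candidate solutions into Pareto fronts. A minimal sketch (the simple O(n²)-per-front version rather than Deb's fast variant), with toy objective pairs standing in for the paper's scheduling objectives:

```python
# Sketch of non-dominated sorting, the ranking step of the NSGA-II family
# the abstract builds on. The toy objectives (cost, loss) are assumptions.

def dominates(a, b):
    """True if a is at least as good as b in every objective and strictly
    better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(objectives):
    """Return solution indices grouped into Pareto fronts (front 0 = best)."""
    fronts, remaining = [], list(range(len(objectives)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objectives[j], objectives[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Example: each tuple is (operating cost, power loss) for one schedule.
pop = [(10, 4), (8, 6), (12, 3), (9, 5), (11, 7)]
print(non_dominated_sort(pop))   # -> [[0, 1, 2, 3], [4]]
```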

  2. Direct demonstration of rapid insulin-like growth factor II receptor internalization and recycling in rat adipocytes. Insulin stimulates 125I-insulin-like growth factor II degradation by modulating the IGF-II receptor recycling process

    International Nuclear Information System (INIS)

    Oka, Y.; Rozek, L.M.; Czech, M.P.

    1985-01-01

    The photoactive insulin-like growth factor (IGF)-II analogue 4-azidobenzoyl-125I-IGF-II was synthesized and used to label specifically and covalently the Mr = 250,000 Type II IGF receptor. When rat adipocytes are irradiated after a 10-min incubation with 4-azidobenzoyl-125I-IGF-II at 10 degrees C and immediately homogenized, most of the labeled IGF-II receptors are associated with the plasma membrane fraction, indicating that receptors accessible to the labeling reagent at low temperature are on the cell surface. However, when the photolabeled cells are incubated at 37 degrees C for various times before homogenization, labeled IGF-II receptors are rapidly internalized with a half-time of 3.5 min, as evidenced by a loss from the plasma membrane fraction and a concomitant appearance in the low density microsome fraction. The steady state level of cell surface IGF-II receptors in the presence or absence of IGF-II remains constant under these conditions, demonstrating that IGF-II receptors rapidly recycle back to the cell surface at the same rate as receptor internalization. Using the above methodology, it is shown that acute insulin action: (1) increases the steady state number of cell surface IGF-II receptors; (2) increases the number of ligand-bound IGF-II receptors that are internalized per unit of time; and (3) increases the rate of cellular 125I-IGF-II degradation by a process that is blocked by anti-IGF-II receptor antibody.

  3. Web Publishing Schedule

    Science.gov (United States)

    Section 207(f)(2) of the E-Gov Act requires federal agencies to develop an inventory of information to be published on their Web sites, establish a schedule for publishing that information, make those schedules available for public comment, and post the schedules on their Web sites.

  4. Preemptive scheduling with rejection

    NARCIS (Netherlands)

    Hoogeveen, H.; Skutella, M.; Woeginger, Gerhard

    2003-01-01

    We consider the problem of preemptively scheduling a set of n jobs on m (identical, uniformly related, or unrelated) parallel machines. The scheduler may reject a subset of the jobs and thereby incur job-dependent penalties for each rejected job, and he must construct a schedule for the remaining jobs.

  5. Preemptive scheduling with rejection

    NARCIS (Netherlands)

    Hoogeveen, J.A.; Skutella, M.; Woeginger, G.J.; Paterson, M.

    2000-01-01

    We consider the problem of preemptively scheduling a set of n jobs on m (identical, uniformly related, or unrelated) parallel machines. The scheduler may reject a subset of the jobs and thereby incur job-dependent penalties for each rejected job, and he must construct a schedule for the remaining jobs.

  6. Outage scheduling and implementation

    International Nuclear Information System (INIS)

    Allison, J.E.; Segall, P.; Smith, R.R.

    1986-01-01

    Successful preparation and implementation of an outage schedule, and completion of scheduled and emergent work within an identified critical-path time frame, are the result of careful coordination by Operations, Work Control, Maintenance, Engineering, Planning and Administration, and others. At the Fast Flux Test Facility (FFTF), careful planning has been responsible for meeting all scheduled outage critical paths.

  7. Scheduling with Time Lags

    NARCIS (Netherlands)

    X. Zhang (Xiandong)

    2010-01-01

    Scheduling is essential when activities need to be allocated to scarce resources over time. Motivated by the problem of scheduling barges along container terminals in the Port of Rotterdam, this thesis designs and analyzes algorithms for various on-line and off-line scheduling problems

  8. Biosorption of aqueous lead (II) on rice straws (oryza sativa) by flash column process

    International Nuclear Information System (INIS)

    Khalid, H.N.; Hassan, M.U.; Jamil, N.; Ahmad, D.; Bushra, H.; Khatoon, S.

    2010-01-01

    Biosorption of Pb(II) on rice straws has been studied with variation of the process parameters, and on modified rice straws, by a flash column process. Different parameters such as adsorbent particle size, initial metal-ion concentration, and column length and width were studied. A comparative study of adsorbent modification was also carried out, in which rice straws were modified with EDTA, acids, bases, and volatile organic solvents. Base-modified adsorbents showed an increase in adsorption capacity, while acid-modified adsorbents proved to be poor adsorbents for metal ions; rice-straw ash used as adsorbent gave higher adsorption, and EDTA-modified adsorbents showed the least adsorption of metal ions. Adsorbents modified with polar volatile organic solvents gave lower adsorption efficiency, and nonpolar-modified adsorbents showed no influence on the Pb(II) uptake capacity of rice straws. Rice straws proved to be an excellent biosorbent for Pb(II) in aqueous solution. The biosorption characteristics fit well with the Langmuir and Freundlich isotherms. (author)
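    The Langmuir fit reported here can be reproduced in outline with a standard least-squares routine. The equilibrium data points and starting parameters below are invented for illustration; only the isotherm form q = q_max·K·C/(1 + K·C) is standard:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_eq, q_max, k):
    """Langmuir isotherm: uptake q as a function of equilibrium conc. c_eq."""
    return q_max * k * c_eq / (1.0 + k * c_eq)

# Hypothetical equilibrium concentrations (mg/L) and uptakes (mg/g).
c_eq = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
q = np.array([8.2, 13.5, 21.0, 26.4, 30.1])

(q_max, k), _ = curve_fit(langmuir, c_eq, q, p0=(30.0, 0.05))
print(f"q_max = {q_max:.1f} mg/g, K = {k:.3f} L/mg")
```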

  9. A hybrid job-shop scheduling system

    Science.gov (United States)

    Hellingrath, Bernd; Robbach, Peter; Bayat-Sarmadi, Fahid; Marx, Andreas

    1992-01-01

    The intention of the scheduling system developed at the Fraunhofer Institute for Material Flow and Logistics is to support a scheduler working in a job shop. The existing requirements for a job-shop scheduling system make the use of flexible knowledge representation and processing techniques necessary. Within this system, an attempt was made to combine the advantages of symbolic AI techniques with those of neural networks.

  10. NASA scheduling technologies

    Science.gov (United States)

    Adair, Jerry R.

    1994-01-01

    This paper is a consolidated report on ten major planning and scheduling systems that have been developed by the National Aeronautics and Space Administration (NASA). A description of each system, its components, and how it could be potentially used in private industry is provided in this paper. The planning and scheduling technology represented by the systems ranges from activity based scheduling employing artificial intelligence (AI) techniques to constraint based, iterative repair scheduling. The space related application domains in which the systems have been deployed vary from Space Shuttle monitoring during launch countdown to long term Hubble Space Telescope (HST) scheduling. This paper also describes any correlation that may exist between the work done on different planning and scheduling systems. Finally, this paper documents the lessons learned from the work and research performed in planning and scheduling technology and describes the areas where future work will be conducted.

  11. Caltrans WeatherShare Phase II System: An Application of Systems and Software Engineering Process to Project Development

    Science.gov (United States)

    2009-08-25

    In cooperation with the California Department of Transportation, Montana State University's Western Transportation Institute has developed the WeatherShare Phase II system by applying Systems Engineering and Software Engineering processes. The system...

  12. A Study of the Operating Room Scheduling System at Tripler Army Medical Center, Hawaii

    Science.gov (United States)

    1981-08-01

    I. INTRODUCTION: Development of the Problem. Convinced that ... of the most difficult administrative tasks that a modern hospital must face, and proposed using a combination of a master posting sheet and a ... deal with scheduling problems. This particular process also incorporates the two-room system described earlier, and the author admits that this

  13. Stochastic scheduling on unrelated machines

    NARCIS (Netherlands)

    Skutella, Martin; Sviridenko, Maxim; Uetz, Marc Jochen

    2013-01-01

    Two important characteristics encountered in many real-world scheduling problems are heterogeneous machines/processors and a certain degree of uncertainty about the actual sizes of jobs. The first characteristic entails machine dependent processing times of jobs and is captured by the classical

  14. Multi-objective Optimization of Pulsed Gas Metal Arc Welding Process Using Neuro NSGA-II

    Science.gov (United States)

    Pal, Kamal; Pal, Surjya K.

    2018-05-01

    Weld quality is a critical issue in fabrication industries where products are custom-designed. Multi-objective optimization results in a number of solutions on the Pareto-optimal front. Optimization methods based on mathematical regression models are often found to be inadequate for highly non-linear arc welding processes; thus, various global evolutionary approaches such as artificial neural networks and genetic algorithms (GA) have been developed. The present work applies the elitist non-dominated sorting GA (NSGA-II) to the optimization of the pulsed gas metal arc welding process, using back-propagation neural network (BPNN) based weld quality feature models. The primary objective, to maintain butt-joint weld quality, is the maximization of tensile strength with minimum plate distortion. The BPNN has been used to compute the fitness of each solution after adequate training, whereas the NSGA-II algorithm generates the optimum solutions for the two conflicting objectives. Welding experiments have been conducted on low-carbon steel using response surface methodology. The Pareto-optimal front with three ranked solutions after 20 generations was considered the best without further improvement. The joint strength as well as the transverse shrinkage was found to be drastically improved over the design-of-experiments results, as per the validated Pareto-optimal solutions obtained.

  15. Image processing methods for noise reduction in the TJ-II Thomson Scattering diagnostic

    Energy Technology Data Exchange (ETDEWEB)

    Dormido-Canto, S., E-mail: sebas@dia.uned.es [Departamento de Informatica y Automatica, UNED, Madrid 28040 (Spain); Farias, G. [Pontificia Universidad Catolica de Valparaiso, Valparaiso (Chile); Vega, J.; Pastor, I. [Asociacion EURATOM/CIEMAT para Fusion, Madrid 28040 (Spain)

    2012-12-15

    Highlights: ▶ We describe an approach to reduce or mitigate the stray-light on the images and show the exceptional results. ▶ We analyze the parameters to take into account in the proposed process. ▶ We report a simplified example in order to explain the proposed process. - Abstract: The Thomson Scattering diagnostic of the TJ-II stellarator provides temperature and density profiles. The CCD camera acquires images corrupted with noise that, in some cases, can produce unreliable profiles. The main source of noise is the so-called stray-light. In this paper we describe an approach that allows mitigation of the effects that stray-light has on the images: extraction of regions with connected components. In addition, the robustness and effectiveness of the noise reduction technique is validated in two ways: (1) supervised classification and (2) comparison of electron temperature profiles.

  16. Sport Tournament Automated Scheduling System

    Directory of Open Access Journals (Sweden)

    Raof R. A. A

    2018-01-01

    The organizers of sport events often face problems such as wrong calculation of marks and scores, as well as difficulty in creating a good and reliable schedule. Most of the time, issues about the level of integrity of committee members and also about errors made by humans come into the picture. Therefore, the development of a sport tournament automated scheduling system is proposed. The system is able to automatically generate the tournament schedule as well as automatically calculate the scores of each tournament. The problems of scheduling the matches of a round-robin and a knock-out phase in a sport league are given focus. The problem is defined formally and its computational complexity is noted. A solution algorithm is presented using a two-step approach. The first step is the creation of a tournament pattern and is based on a known graph-theoretic method. The second one is an assignment problem, and it is solved using a constraint-based depth-first branch-and-bound procedure that assigns actual teams to numbers in the pattern. As a result, the scheduling process and knock-out phase become easy for the tournament organizer while at the same time increasing the level of reliability.
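    The tournament-pattern step can be illustrated with the classical circle (polygon) method, one well-known graph-theoretic construction for round-robin schedules; whether the authors use exactly this variant is not stated, so treat it as a sketch:

```python
# Circle-method round-robin pattern generation. Team names are placeholders.

def round_robin(teams):
    """Yield one list of pairings per round; every pair meets exactly once."""
    ts = list(teams)
    if len(ts) % 2:
        ts.append(None)                 # bye marker for odd team counts
    n = len(ts)
    for _ in range(n - 1):
        pairs = [(ts[i], ts[n - 1 - i]) for i in range(n // 2)]
        yield [(a, b) for a, b in pairs if a is not None and b is not None]
        ts.insert(1, ts.pop())          # rotate all but the first element

for rnd, games in enumerate(round_robin(["A", "B", "C", "D", "E"]), 1):
    print(f"round {rnd}: {games}")
```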

  17. Rostering and Task Scheduling

    DEFF Research Database (Denmark)

    Dohn, Anders Høeg

    The rostering process is non-trivial, and especially when service is required around the clock, rostering may involve considerable effort from a designated planner. Therefore, in order to minimize costs and overstaffing, to maximize the utilization of available staff, and to ensure a high level of satisfaction ... as possible to the available staff, while respecting various requirements and rules and while including possible transportation time between tasks. This thesis presents a number of industrial applications in rostering and task scheduling. The applications exist within various contexts in health care. ... Mathematical and logic-based models are presented for the problems considered. Novel components are added to existing models and the modeling decisions are justified. In one case, the model is solved by a simple but efficient greedy construction heuristic. In the remaining cases, column generation is applied ...

  18. Cloud Service Scheduling Algorithm Research and Optimization

    Directory of Open Access Journals (Sweden)

    Hongyan Cui

    2017-01-01

    We propose a cloud service scheduling model that is referred to as the Task Scheduling System (TSS). In the user module, the processing time of each task follows a general distribution. In the task scheduling module, we take a weighted sum of makespan and flowtime as the objective function and use Ant Colony Optimization (ACO) and a Genetic Algorithm (GA) to solve the cloud task scheduling problem. Simulation results show that the convergence speed and output performance of our Genetic Algorithm-Chaos Ant Colony Optimization (GA-CACO) are optimal.
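    The objective function named here, a weighted sum of makespan and flowtime, is easy to state concretely. The sketch below evaluates one candidate task-to-machine assignment (the kind of chromosome a GA or ACO would evolve); the weight and task times are assumptions:

```python
# Hedged sketch of the weighted makespan/flowtime objective from the
# abstract, evaluated for one candidate assignment. Data are illustrative.

def weighted_objective(assignment, task_times, n_machines, w_makespan=0.7):
    """assignment[i] = machine index for task i (a GA/ACO chromosome)."""
    finish = [0.0] * n_machines
    flowtime = 0.0
    for task, machine in enumerate(assignment):
        finish[machine] += task_times[task]   # task completes when machine frees
        flowtime += finish[machine]           # sum of completion times
    makespan = max(finish)
    return w_makespan * makespan + (1.0 - w_makespan) * flowtime

times = [4.0, 2.0, 7.0, 1.0, 3.0]
print(weighted_objective([0, 1, 0, 1, 1], times, n_machines=2))  # -> 15.5
```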

  19. BIM-BASED SCHEDULING OF CONSTRUCTION

    DEFF Research Database (Denmark)

    Andersson, Niclas; Büchmann-Slorup, Rolf

    2010-01-01

    The potential of BIM is generally recognized in the construction industry, but the practical application of BIM for management purposes is still limited among contractors. The objective of this study is to review the current scheduling process of construction in light of BIM ... and communicate. Scheduling on the detailed level, on the other hand, follows a stipulated approach to scheduling, i.e. the Last Planner System (LPS), which is characterized by involvement of all actors in the construction phase. Thus, the major challenge when implementing BIM-based scheduling is to improve

  20. Optimized breeding strategies for multiple trait integration: II. Process efficiency in event pyramiding and trait fixation.

    Science.gov (United States)

    Peng, Ting; Sun, Xiaochun; Mumm, Rita H

    2014-01-01

    Multiple trait integration (MTI) is a multi-step process of converting an elite variety/hybrid for value-added traits (e.g. transgenic events) through backcross breeding. From a breeding standpoint, MTI involves four steps: single event introgression, event pyramiding, trait fixation, and version testing. This study explores the feasibility of marker-aided backcross conversion of a target maize hybrid for 15 transgenic events in light of the overall goal of MTI: recovering equivalent performance in the finished hybrid conversion along with reliable expression of the value-added traits. Building on the results of optimized single event introgression (Peng et al. Optimized breeding strategies for multiple trait integration: I. Minimizing linkage drag in single event introgression. Mol Breed, 2013), which produced single event conversions of recurrent parents (RPs) with ≤8 cM of residual non-recurrent parent (NRP) germplasm and ~1 cM of NRP germplasm in the 20 cM regions flanking the event, this study focused on optimizing process efficiency in the second and third steps of MTI: event pyramiding and trait fixation. Using computer simulation and probability theory, we aimed to (1) fit an optimal breeding strategy for pyramiding eight events into the female RP and seven into the male RP, and (2) identify optimal breeding strategies for trait fixation to create a 'finished' conversion of each RP homozygous for all events. In addition, next-generation seed needs were taken into account for a practical approach to process efficiency. Building on work by Ishii and Yonezawa (Optimization of the marker-based procedures for pyramiding genes from multiple donor lines: I. Schedule of crossing between the donor lines. Crop Sci 47:537-546, 2007a), a symmetric crossing schedule for event pyramiding was devised for stacking eight (seven) events in a given RP. Options for trait fixation breeding strategies considered selfing and doubled haploid approaches to achieve homozygosity.

  1. The facile synthesis of a chitosan Cu(II) complex by solution plasma process and evaluation of their antioxidant activities.

    Science.gov (United States)

    Ma, Fengming; Li, Pu; Zhang, Baiqing; Wang, Zhenyu

    2017-10-01

    Synthesis of a chitosan-Cu(II) complex by solution plasma process (SPP) irradiation was investigated. The effects of the distance between the electrodes, the initial Cu(II) concentration, and the initial pH on the Cu(II) adsorption capacity were evaluated. The results showed that a narrower distance between the electrodes, a higher initial Cu(II) concentration and a higher initial pH (at pH < ...) enhanced the Cu(II) adsorption capacity. Characterization of the chitosan-Cu(II) complex by ultraviolet-visible (UV-vis), Fourier transform infrared (FT-IR) and electron spin resonance (ESR) spectroscopy revealed that the main structure of chitosan was not changed after irradiation. Thermogravimetry (TG) analysis indicated that Cu(II) ions were well incorporated into the chitosan. The antioxidant activity of the chitosan-Cu(II) complex was evaluated by DPPH, ABTS, and reducing power assays. The chitosan-Cu(II) complex exhibited greater antioxidant activity than the original chitosan. Thus, SPP could be used for the preparation of chitosan-Cu(II) complexes. Copyright © 2017. Published by Elsevier B.V.

  2. Scheduling for decommissioning projects

    International Nuclear Information System (INIS)

    Podmajersky, O.E.

    1987-01-01

    This paper describes the Project Scheduling system being employed by the Decommissioning Operations Contractor at the Shippingport Station Decommissioning Project (SSDP). Results from the planning system show that the project continues to achieve its cost and schedule goals. An integrated cost and schedule control system (C/SCS) which uses the concept of earned value for measurement of performance was instituted in accordance with DOE orders. The schedule and cost variances generated by the C/SCS system are used to confirm management's assessment of project status. This paper describes the types of schedules and tools used on the SSDP project to plan and monitor the work, and identifies factors that are unique to a decommissioning project that make scheduling critical to the achievement of the project's goals. 1 fig

  3. Program reference schedule baseline

    International Nuclear Information System (INIS)

    1986-07-01

    This Program Reference Schedule Baseline (PRSB) provides the baseline Program-level milestones and associated schedules for the Civilian Radioactive Waste Management Program. It integrates all Program-level schedule-related activities. This schedule baseline will be used by the Director, Office of Civilian Radioactive Waste Management (OCRWM), and his staff to monitor compliance with Program objectives. Chapter 1 includes brief discussions concerning the relationship of the PRSB to the Program Reference Cost Baseline (PRCB), the Mission Plan, the Project Decision Schedule, the Total System Life Cycle Cost report, the Program Management Information System report, the Program Milestone Review, annual budget preparation, and system element plans. Chapter 2 includes the identification of all Level 0, or Program-level, milestones, while Chapter 3 presents and discusses the critical path schedules that correspond to those Level 0 milestones

  4. Approximating Preemptive Stochastic Scheduling

    OpenAIRE

    Megow Nicole; Vredeveld Tjark

    2009-01-01

    We present constant-factor approximation policies for preemptive stochastic scheduling. We derive policies with a guaranteed performance ratio of 2 for scheduling jobs with release dates on identical parallel machines subject to minimizing the sum of weighted completion times. Our policies, as well as their analysis, also apply to the recently introduced more general model of stochastic online scheduling. The performance guarantee we give matches the best result known for the corresponding deterministic ...

  5. Revisiting Symbiotic Job Scheduling

    OpenAIRE

    Eyerman , Stijn; Michaud , Pierre; Rogiest , Wouter

    2015-01-01

    Symbiotic job scheduling exploits the fact that in a system with shared resources, the performance of jobs is impacted by the behavior of other co-running jobs. By co-scheduling combinations of jobs that have low interference, the performance of a system can be increased. In this paper, we investigate the impact of using symbiotic job scheduling for increasing throughput. We find that even for a theoretically optimal scheduler, this impact is very low, despite the subs...

  6. Start of operation of the barrel measuring facility II-01. Implementation into operational processes

    International Nuclear Information System (INIS)

    Buesing, B.; Escher, M.

    2013-01-01

    For the operation of the barrel measuring facility (FAME) II-01, a variety of requirements for the measuring techniques were defined and tested during start-up. The mechanical engineering and measuring techniques used comply with the state of the art. Using the barrel measuring facility, quality-assured determinations of the dose rate and the nuclide-specific activity inventory were performed. Appropriately qualified personnel are available for the evaluation of the gamma spectrometric measurements of FAME II-01. The implementation of the facility, in combination with the connection to the database systems PIK-AS and AVK, guarantees that important data are available in real time for the measuring process and the subsequent work steps. Besides this, it is guaranteed that, using the import/export functions, relevant data are reviewed, supplemented, and exchanged between the systems without transfer errors. The determined data of the dose rate and gamma spectrometric measurements allow a quality-assured and realistic activity determination of the waste package. Conservative assumptions in activity calculations for the later final disposal can be reduced. The automated operation of FAME also allows a reduction of the radiation exposure of the personnel.

  7. NONLINEAR WAVE INTERACTIONS AS EMISSION PROCESS OF TYPE II RADIO BURSTS

    Energy Technology Data Exchange (ETDEWEB)

    Ganse, Urs; Kilian, Patrick; Spanier, Felix [Lehrstuhl fuer Astronomie, Universitaet Wuerzburg, Wuerzburg (Germany); Vainio, Rami, E-mail: uganse@astro.uni-wuerzburg.de [Department of Physics, University of Helsinki, Helsinki (Finland)

    2012-06-01

    The emission of fundamental and harmonic frequency radio waves of type II radio bursts are assumed to be products of three-wave interaction processes of beam-excited Langmuir waves. Using a particle-in-cell code, we have performed simulations of the assumed emission region, a coronal mass ejection foreshock with two counterstreaming electron beams. Analysis of wavemodes within the simulation shows self-consistent excitation of beam-driven modes, which yield interaction products at both fundamental and harmonic emission frequencies. Through variation of the beam strength, we have investigated the dependence of energy transfer into electrostatic and electromagnetic modes, confirming the quadratic dependence of electromagnetic emission on electron beam strength.

  8. NONLINEAR WAVE INTERACTIONS AS EMISSION PROCESS OF TYPE II RADIO BURSTS

    International Nuclear Information System (INIS)

    Ganse, Urs; Kilian, Patrick; Spanier, Felix; Vainio, Rami

    2012-01-01

    The emission of fundamental and harmonic frequency radio waves of type II radio bursts are assumed to be products of three-wave interaction processes of beam-excited Langmuir waves. Using a particle-in-cell code, we have performed simulations of the assumed emission region, a coronal mass ejection foreshock with two counterstreaming electron beams. Analysis of wavemodes within the simulation shows self-consistent excitation of beam-driven modes, which yield interaction products at both fundamental and harmonic emission frequencies. Through variation of the beam strength, we have investigated the dependence of energy transfer into electrostatic and electromagnetic modes, confirming the quadratic dependence of electromagnetic emission on electron beam strength.

  9. A Full Mesh ATCA-based General Purpose Data Processing Board: Pulsar II

    CERN Document Server

    Olsen, J; Okumura, Y

    2014-01-01

    High luminosity conditions at the LHC pose many unique challenges for potential silicon-based track trigger systems. Among those challenges is data formatting, where hits from thousands of silicon modules must first be shared and organized into overlapping trigger towers. Other challenges exist for Level-1 track triggers, where many parallel data paths may be used for high speed time multiplexed data transfers. Communication between processing nodes requires high bandwidth, low latency, and flexible real time data sharing, for which a full mesh backplane is a natural fit. A custom full mesh enabled ATCA board called the Pulsar II has been designed with the goal of creating a scalable architecture abundant in flexible, non-blocking, high bandwidth board-to-board communication channels while keeping the design as simple as possible.

  10. Forecasting and prevention of water inrush during the excavation process of a diversion tunnel at the Jinping II Hydropower Station, China.

    Science.gov (United States)

    Hou, Tian-Xing; Yang, Xing-Guo; Xing, Hui-Ge; Huang, Kang-Xin; Zhou, Jia-Wen

    2016-01-01

    Estimating groundwater inflow into a tunnel before and during excavation is an important task to ensure safety and schedule during underground construction. Here we report a case of the forecasting and prevention of water inrush at the Jinping II Hydropower Station diversion tunnel groups during the excavation process. The diversion tunnel groups are located in mountains and valleys and are subject to a high water pressure head. Three forecasting methods are used to predict the total water inflow of the #2 diversion tunnel. Furthermore, based on an accurate estimation of the water inrush around the tunnel working area, a theoretical method is presented to forecast the water inflow at the working area during excavation. The simulated results show that the total water inflow is 1586.9, 1309.4 and 2070.2 m³/h using the Qshima method, the Kostyakov method and the Ochiai method, respectively. The Qshima method is the best one because it most closely matches the monitoring result. Given the huge water inflow into the #2 diversion tunnel, reasonable drainage measures are arranged to prevent a potential water inrush disaster. The groundwater pressure head can be determined using the water flow velocity from the advancing holes; the groundwater pressure head can then be used to predict the possible water inflow. The simulated results show that the groundwater pressure head and water inflow are stable and relatively small in regions of intact rock mass, but there is a sudden change around the fault region, with a large water inflow and groundwater pressure head. Different countermeasures are adopted to prevent water inrush disasters during tunnel excavation. Reasonable forecasting of the characteristic parameters of water inrush is very useful for the formulation of prevention and mitigation schemes during the tunnel excavation process.

  11. Feasibility of closed Fe(II)/Fe(III) system for product-reflux in Nitrox process

    International Nuclear Information System (INIS)

    Adachi, M.; Ishida, T.

    1981-01-01

    A concept of a closed reflux system for stable isotope fractionation by the chemical exchange method has been introduced. In a closed system, a chemical agent used to convert one chemical species of an isotopic exchange reaction into the other at the product end is regenerated on site by means of an electrochemical or thermal process. It offers the convenience of eliminating the need to transport chemicals to and from the site and the advantage of allowing leniency in the degree of completeness of the reflux reaction. The feasibility of using Fe(II) salt solutions in a closed reflux system for the Nitrox process for 15N fractionation has been studied. Two such systems, FeSO4 in H2SO4 and Fe(ClO4)2 in HClO4, are adopted for packed column operation. For both systems, the rate of reduction of nitric acid increases with increasing acid concentration, the solubility of the salts decreases with increasing acid concentration, and the reflux reaction can be made to go to completion. Evaluation of such a closed reflux system will have to include the performance of the regenerative process.

  12. Scheduling job shop - A case study

    Science.gov (United States)

    Abas, M.; Abbas, A.; Khan, W. A.

    2016-08-01

    Scheduling in a job shop is important for the efficient utilization of machines in the manufacturing industry. There are a number of algorithms available for the scheduling of jobs, which depend on the machine tools, indirect consumables, and the jobs to be processed. In this paper, a case study is presented for the scheduling of jobs when parts are processed on available machines. Through time and motion study, setup time and operation time are measured as total processing time for a variety of products having different manufacturing processes. Based on due dates, different levels of priority are assigned to the jobs, and the jobs are scheduled on the basis of priority. In view of the measured processing times, the processing times for some new jobs are estimated, and an algorithm for efficient utilization of the available machines is proposed and validated.
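    A due-date-driven dispatching rule of the kind described can be sketched with the classical earliest-due-date (EDD) ordering; the job data are invented, and EDD is a stand-in for the authors' exact priority scheme:

```python
from dataclasses import dataclass

# Hedged sketch: jobs get priority from due dates and are dispatched in
# earliest-due-date order on a single machine. Data are illustrative.

@dataclass
class Job:
    name: str
    processing_time: float  # measured via time-and-motion study
    due_date: float

jobs = [Job("J1", 4, 10), Job("J2", 2, 6), Job("J3", 7, 25), Job("J4", 3, 9)]

t = 0.0
for job in sorted(jobs, key=lambda j: j.due_date):  # closer due date first
    t += job.processing_time
    lateness = max(0.0, t - job.due_date)
    print(f"{job.name}: finishes at {t}, lateness {lateness}")
```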

  13. Fuel Quality/Processing Study. Volume II. Appendix, Task I, literature survey

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J B; Bela, A; Jentz, N E; Klumpe, H W; Kessler, R E; Kotzot, H T; Loran, B I

    1981-04-01

    This activity was begun with the assembly of information from Parsons' files and from contacts in the development and commercial fields. A further, more extensive literature search was carried out using the Energy Data Base and the American Petroleum Institute Data Base, which are part of the DOE/RECON system. Approximately 6000 references and abstracts were obtained from the EDB search. These were reviewed, and the especially pertinent documents, approximately 300, were acquired in the form of paper copy or microfiche. A Fuel Properties form was developed for listing information pertinent to gas turbine liquid fuel property specifications. Fuel properties data for liquid fuels from selected synfuel processes, deemed to be successful candidates for near-future commercial plants, were tabulated on the forms. The processes selected consisted of the H-Coal, SRC-II and Exxon Donor Solvent (EDS) coal liquefaction processes plus the Paraho and Tosco shale oil processes. Fuel properties analyses for crude and distillate syncrude process products are contained in Section 2. Analyses representing synthetic fuels given refinery treatments, mostly bench-scale hydrotreating, are contained in Section 3. Section 4 discusses gas turbine fuel specifications based on petroleum-source fuels as developed by the major gas turbine manufacturers. Section 5 presents the on-site gas turbine fuel treatments applicable to the impurity content of petroleum-based fuels in order to prevent adverse contaminant effects. Section 7 relates the environmental aspects of gas turbine fuel usage and combustion performance. It appears that the near-future stationary industrial gas turbine fuel market will require that some of the synthetic fuels be refined to the point that they resemble petroleum-based fuels.

  14. Alternative Work Schedules: Definitions

    Science.gov (United States)

    Journal of the College and University Personnel Association, 1977

    1977-01-01

    The term "alternative work schedules" encompasses any variation of the requirement that all permanent employees in an organization or one shift of employees adhere to the same five-day, seven-to-eight-hour schedule. This article defines staggered hours, flexible working hours (flexitour and gliding time), compressed work week, the task system, and…

  15. Range Scheduling Aid (RSA)

    Science.gov (United States)

    Logan, J. R.; Pulvermacher, M. K.

    1991-01-01

    Range Scheduling Aid (RSA) is presented in the form of viewgraphs. The following subject areas are covered: satellite control network; current and new approaches to range scheduling; MITRE tasking; RSA features; RSA display; constraint-based analytic capability; RSA architecture; and RSA benefits.

  16. The triangle scheduling problem

    NARCIS (Netherlands)

    Dürr, Christoph; Hanzálek, Zdeněk; Konrad, Christian; Seddik, Yasmina; Sitters, R.A.; Vásquez, Óscar C.; Woeginger, Gerhard

    2017-01-01

    This paper introduces a novel scheduling problem, where jobs occupy a triangular shape on the time line. This problem is motivated by the scheduling of jobs with different criticality levels. A measure is introduced, namely the binary tree ratio. It is shown that the Greedy algorithm solves the problem to optimality ...

  17. Genetic algorithm to solve the problems of lectures and practicums scheduling

    Science.gov (United States)

    Syahputra, M. F.; Apriani, R.; Sawaluddin; Abdullah, D.; Albra, W.; Heikal, M.; Abdurrahman, A.; Khaddafi, M.

    2018-02-01

    Generally, the scheduling process is done manually. However, this method has a low accuracy level, along with the possibility that one scheduled process collides with another. When scheduling theory class and practicum timetables, there are numerous problems, such as collisions in lecturer teaching schedules, collisions between schedules, practicum lesson schedules that collide with theory classes, and the limited number of classrooms available. In this research, a genetic algorithm is implemented to perform the theory class and practicum timetable scheduling process. The algorithm is used to process data containing lists of lecturers, courses, and classrooms, obtained from the information technology department at University of Sumatera Utara. The result of the scheduling process using the genetic algorithm is the most optimal timetable that conforms to the available time slots, classrooms, courses, and lecturer schedules.
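    A GA for this kind of timetabling needs, at minimum, a chromosome encoding and a fitness function that penalizes collisions. A hedged sketch, with the encoding and all names assumed for illustration:

```python
import random

# Hedged sketch of GA ingredients for timetabling: a chromosome assigning
# each course a (lecturer, room, slot) gene, and a fitness that penalizes
# double-bookings. The encoding is an assumption, not the paper's design.

def random_timetable(courses, lecturers, rooms, slots):
    """One GA individual: a gene per course."""
    return {c: (random.choice(lecturers), random.choice(rooms),
                random.choice(slots)) for c in courses}

def collisions(timetable):
    """Count lecturer and room double-bookings within the same time slot."""
    seen_lect, seen_room, clashes = set(), set(), 0
    for lecturer, room, slot in timetable.values():
        clashes += (lecturer, slot) in seen_lect
        clashes += (room, slot) in seen_room
        seen_lect.add((lecturer, slot))
        seen_room.add((room, slot))
    return clashes

def fitness(timetable):
    return 1.0 / (1.0 + collisions(timetable))   # 1.0 means collision-free

tt = random_timetable(["Algo", "DB", "OS", "AI"],
                      ["L1", "L2"], ["R1", "R2"], ["Mon9", "Mon11"])
print(collisions(tt), fitness(tt))
```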

  18. Short term economic emission power scheduling of hydrothermal energy systems using improved water cycle algorithm

    International Nuclear Information System (INIS)

    Haroon, S.S.; Malik, T.N.

    2017-01-01

    Due to increasing environmental concerns, the demand for clean and green energy is growing, as is concern over atmospheric pollution. Hence, power utilities are forced to keep their emissions within the prescribed limits. Therefore, the minimization of fuel cost as well as exhaust gas emissions is becoming an important and challenging task in the short-term scheduling of hydrothermal energy systems. This paper proposes a novel algorithm known as WCA-ER (Water Cycle Algorithm with Evaporation Rate) to inspect the short-term EEPSHES (Economic Emission Power Scheduling of Hydrothermal Energy Systems). WCA has its ancestry in the natural hydrologic cycle: the raining process forms streams, these streams flow towards rivers, and the rivers finally flow towards the sea. The worth of WCA-ER has been tested on the standard economic emission power scheduling hydrothermal energy test system consisting of four hydropower and three thermal plants. The problem has been investigated for three case studies: (i) ECS (Economic Cost Scheduling), (ii) ES (Economic Emission Scheduling), and (iii) ECES (Economic Cost and Emission Scheduling). The results obtained show that WCA-ER is superior to many other methods in the literature in achieving lower fuel costs and emissions. (author)
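    The economic-emission trade-off being minimized can be made concrete with textbook quadratic cost and emission curves; the coefficients and weight below are illustrative assumptions, not the paper's test-system data:

```python
# Hedged sketch of a combined economic-emission objective for thermal units
# in short-term hydrothermal scheduling. Coefficients are invented.

def fuel_cost(p, a, b, c):
    """Quadratic thermal fuel cost ($/h) at output p (MW)."""
    return a + b * p + c * p * p

def emission(p, d, e, f):
    """Quadratic emission rate (kg/h) at output p (MW)."""
    return d + e * p + f * p * p

def combined_objective(outputs, cost_coef, emis_coef, weight=0.5):
    """Weighted sum trading off cost vs. emissions (weight in 0..1)."""
    total_cost = sum(fuel_cost(p, *cost_coef[i]) for i, p in enumerate(outputs))
    total_emis = sum(emission(p, *emis_coef[i]) for i, p in enumerate(outputs))
    return weight * total_cost + (1.0 - weight) * total_emis

cost_coef = [(100, 2.0, 0.01), (120, 1.8, 0.012), (90, 2.2, 0.009)]
emis_coef = [(10, 0.3, 0.002), (12, 0.25, 0.003), (9, 0.35, 0.0015)]
print(combined_objective([100.0, 150.0, 80.0], cost_coef, emis_coef))
```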

  19. Advanced oxidation removal of hypophosphite by O3/H2O2 combined with sequential Fe(II) catalytic process.

    Science.gov (United States)

    Zhao, Zilong; Dong, Wenyi; Wang, Hongjie; Chen, Guanhan; Wang, Wei; Liu, Zekun; Gao, Yaguang; Zhou, Beili

    2017-08-01

    Elimination of hypophosphite (HP) was studied as an example of nickel plating effluent treatment by an O3/H2O2 and sequential Fe(II) catalytic oxidation process. Performance assessment with artificial HP solutions, varying the initial pH and employing various oxidation processes, clearly showed that the O3/H2O2─Fe(II) two-step oxidation process possessed the highest removal efficiency when operating under the same conditions. The effects of O3 dosage, H2O2 concentration, Fe(II) addition and Fe(II) feeding time on the removal efficiency of HP were further evaluated in terms of the apparent kinetic rate constant. Under improved conditions (initial HP concentration of 50 mg/L, 75 mg/L O3, 1 mL/L H2O2, 150 mg/L Fe(II) and pH 7.0), standard discharge (<0.5 mg/L in China) could be achieved, and the Fe(II) feeding time was found to be the limiting factor for the evolution of the apparent kinetic rate constant in the second stage. Characterization studies showed that a neutralization process after the oxidation treatment favored the improvement of phosphorus removal due to the formation of more metal hydroxides. Moreover, in comparison with a lab-scale Fenton approach, the O3/H2O2─Fe(II) oxidation process had more competitive advantages with respect to applicable pH range, removal efficiency, sludge production and economic costs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Chemical and biological effects of heavy distillate recycle in the SRC-II process

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, B.W.; Pelroy, R.A.; Anderson, R.P.; Freel, J.

    1983-12-01

    Recent work from the Merriam Laboratory continuous coal liquefaction units shows that heavy distillate from the SRC-II process can be recycled to extinction, and hence a distillate product boiling entirely below 310°C (590°F) (or other selected boiling points) is feasible. In these runs distillate yield was not reduced; gas make was unaffected; and hydrogen consumption was increased only slightly, in keeping with the generally higher hydrogen content of lighter end products. Total distillate yield (C5-590°F) was 56 wt % MAF coal in runs with subbituminous coal from the Amax Belle Ayr mine. The product endpoint is well below 371°C (700°F), the temperature above which coal distillates appear to become genotoxic, and the product was shown to be free of mutagenic activity in the Ames test. Chemical analyses showed both the <270°C (<518°F) and the <310°C (<590°F) distillates to be essentially devoid of several reference polycyclic compounds known to be carcinogenic in laboratory animals. Tests for tumorigenic or carcinogenic activity were not carried out on these materials. However, a comparison of chemical data from the Merriam heavy distillate samples with data on the other SRC-II distillates where carcinogenesis or tumorigenesis data are available leads to the expectation that <371°C (<700°F) materials from the Merriam Laboratory will have greatly reduced tumorigenic and carcinogenic activity in skin painting tests. Other studies suggest the product should be more readily upgraded than full-range (C5-900°F) distillate.

  1. Membrane/distillation hybrid process research and development. Final report, phase II

    Energy Technology Data Exchange (ETDEWEB)

    Mazanec, T.J.

    1997-07-01

    This report covers work conducted under the grant awarded to BP by DOE in late 1991 entitled "Membrane/Distillation Hybrid Process Research and Development." The program was directed towards development and commercialization of the BP process for separation of vapor-phase olefins from non-olefins via facilitated transport using an aqueous facilitator. The program has come to a very successful conclusion, with the formation of a partnership between BP and Stone and Webster Engineering Corporation (SWEC) to market and commercialize the technology. The focus of this report is the final portion of the program, during which engineering re-design, facilitator optimization, economic analysis, and marketing have been the primary activities. At the end of Phase II, BP was looking to partner with an engineering firm to advance the selective olefin recovery (SOR) technology from the lab/demo stage to full commercialization. In August 1995, BP and SWEC reached an agreement to advance the technology by completing additional Phase III work with DOE and beginning marketing activities.

  2. Online Scheduling in Manufacturing A Cumulative Delay Approach

    CERN Document Server

    Suwa, Haruhiko

    2013-01-01

    Online scheduling is recognized as a crucial decision-making process of production control in the phase of "being in production", according to the released shop floor schedule. Online scheduling can also be considered one of the key enablers of prompt capable-to-promise, as well as available-to-promise, to customers, along with reducing production lead times in recent globalized competitive markets. Online Scheduling in Manufacturing introduces new approaches to online scheduling based on the concept of cumulative delay. The cumulative delay is regarded as consolidated information about uncertainties in a dynamic manufacturing environment and can be collected constantly, without much effort, at any point in time during schedule execution. In this approach, the cumulative delay of the schedule has the important role of a criterion for deciding whether or not a schedule revision is carried out. The cumulative delay approach to triggering schedule revisions has the following capabilities for the ...
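    The book's central idea, accumulating observed delays and revising the schedule only when the total crosses a threshold, can be sketched as a simple monitor; the threshold and the delay stream are invented for illustration:

```python
# Hedged sketch of a cumulative-delay trigger for schedule revision.

def monitor(delays, threshold=30.0):
    """Accumulate per-operation delays (minutes); yield revision events."""
    cumulative = 0.0
    for step, delay in enumerate(delays):
        cumulative += delay
        if cumulative >= threshold:
            yield step, cumulative      # revise the schedule, reset counter
            cumulative = 0.0

observed = [5.0, 0.0, 12.0, 8.0, 9.0, 2.0, 25.0, 4.0]
for step, total in monitor(observed):
    print(f"revision triggered at operation {step} (cumulative delay {total})")
```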

  3. Effects of nickel(II) addition on the activity of activated sludge microorganisms and activated sludge process

    International Nuclear Information System (INIS)

    Ong, Soon-An; Toorisaka, Eiichi; Hirata, Makoto; Hano, Tadashi

    2004-01-01

    The effects of Ni(II) in a synthetic wastewater on the activity of activated sludge microorganisms and the sequencing batch reactor (SBR) treatment process were investigated. Two parallel lab-scale SBR systems were operated: one was used as a control unit, while the other received Ni(II) at concentrations equal to 5 and 10 mg/l. The SBR systems were operated with FILL, REACT, SETTLE, DRAW and IDLE modes in the time ratio of 0.5:3.5:1.0:0.75:0.25 for a cycle time of 6 h. The addition of Ni(II) to the SBR system caused a drastic drop in the TOC removal rate (k) and the specific oxygen uptake rate (SOUR) of the activated sludge microorganisms, due to the inhibitory effects of Ni(II) on their bioactivity. The addition of 5 mg/l Ni(II) caused a slight reduction in TOC removal efficiency, whereas the addition of 10 mg/l Ni(II) significantly affected the SBR performance in terms of suspended solids and TOC removal efficiency. Termination of Ni(II) addition led to almost full recovery of the bioactivity of the microorganisms, as shown by the increase in the specific oxygen uptake rate (SOUR) and the SBR treatment performance.

  4. Algorithms for classical and modern scheduling problems

    OpenAIRE

    Ott, Sebastian

    2016-01-01

    The subject of this thesis is the design and analysis of algorithms for scheduling problems. In the first part, we focus on energy-efficient scheduling, where one seeks to minimize the energy needed for processing certain jobs via dynamic adjustments of the processing speed (speed scaling). We consider variations and extensions of the standard model introduced by Yao, Demers, and Shenker in 1995 [79], including the addition of a sleep state, the avoidance of preemption, and variable speed limits ...

  5. Development of industry processes simulators. Part II (continuous casting); Desarrollo de simuladores para procesos industriales. Parte II (Colada continua)

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez, A.; Mosqueda, A.; Sauce, V.; Morales, R.; Ramos, A.; Solario, G.

    2006-07-01

    Understanding the thermal behavior of steel is very important for ensuring the quality of products such as billets and slabs. This work presents the coupling of a subroutine, which simulates the heat transfer conditions during the continuous casting process, to the model for simulating the process described by the present authors in a previous work. The result is temperature profiles and surface temperature graphs of the steel, which are then compared with data obtained under real operating conditions. (Author). 15 refs.

  6. NASA Instrument Cost/Schedule Model

    Science.gov (United States)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters for over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, the data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves, and a demonstration of the NICM tool suite.
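    The joint confidence level (JCL) idea, the probability that cost and schedule targets are met simultaneously, can be sketched with a small Monte Carlo; the distributions, correlation, and targets below are invented and do not reflect NICM's calibrated models:

```python
import numpy as np

# Hedged sketch of a JCL computation: sample correlated (cost, schedule)
# outcomes and report the fraction meeting both targets. All numbers are
# illustrative assumptions.

rng = np.random.default_rng(42)
n = 100_000

# Correlated lognormal cost ($M) and schedule (months) via a Gaussian copula.
mean, cov = [0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]]
z = rng.multivariate_normal(mean, cov, size=n)
cost = np.exp(np.log(50) + 0.25 * z[:, 0])      # median 50 $M
months = np.exp(np.log(36) + 0.15 * z[:, 1])    # median 36 months

target_cost, target_months = 60.0, 40.0
jcl = np.mean((cost <= target_cost) & (months <= target_months))
print(f"JCL at (${target_cost}M, {target_months} mo): {jcl:.2%}")
```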

  7. Integration of scheduling and discrete event simulation systems to improve production flow planning

    Science.gov (United States)

    Krenczyk, D.; Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.

    2016-08-01

    The increased availability of data and of computer-aided technologies such as MRP I/II, ERP and MES systems allows producers to be more adaptive to market dynamics and to improve production scheduling. Integration of production scheduling with computer modelling, simulation and visualization systems can be useful in the analysis of production system constraints related to the efficiency of manufacturing systems. An integration methodology is proposed, based on a semi-automatic model generation method, for eliminating the problems associated with model complexity and with the labour-intensive and time-consuming process of simulation model creation. Data mapping and data transformation techniques for the proposed method have been applied. This approach is illustrated through examples of practical implementation using the KbRS scheduling system and the Enterprise Dynamics simulation system.

  8. Optimal Algorithms and a PTAS for Cost-Aware Scheduling

    NARCIS (Netherlands)

    L. Chen; N. Megow; R. Rischke; L. Stougie (Leen); J. Verschae

    2015-01-01

    We consider a natural generalization of classical scheduling problems in which using a time unit for processing a job causes some time-dependent cost which must be paid in addition to the standard scheduling cost. We study the scheduling objectives of minimizing the makespan and the

  9. Some extensions of the discrete lotsizing and scheduling problem

    NARCIS (Netherlands)

    M. Salomon (Marc); L.G. Kroon (Leo); R. Kuik (Roelof); L.N. van Wassenhove (Luk)

    1991-01-01

    In this paper the Discrete Lotsizing and Scheduling Problem (DLSP) is considered. DLSP relates to capacitated lotsizing as well as to job scheduling problems and is concerned with determining a feasible production schedule with minimal total costs in a single-stage manufacturing process.

  10. AP1000 construction schedule

    International Nuclear Information System (INIS)

    Winters, J.W.

    2001-01-01

    Westinghouse performed this study as part of EPRI interest in advancing the use of computer aided processes to reduce the cost of nuclear power plants. EPRI believed that if one could relate appropriate portions of an advanced light water reactor plant model to activities in its construction sequence, and this relationship could be portrayed visually, then optimization of the construction sequence could be developed as never before. By seeing a 3-D representation of the plant at any point in its construction sequence, more informed decisions can be made on the feasibility or attractiveness of follow on or parallel steps in the sequence. The 3-D representation of construction as a function of time (4-D) could also increase the confidence of potential investors concerning the viability of the schedule and the plant ultimate cost. This study performed by Westinghouse confirmed that it is useful to be able to visualize a plant construction in 3-D as a function of time in order to optimize the sequence of construction activities. (author)

  11. Efficient Load Scheduling Method For Power Management

    Directory of Open Access Journals (Sweden)

    Vijo M Joy

    2015-08-01

    An efficient load scheduling method to meet varying power supply needs is presented in this paper. At peak load times, the power generation system fails due to its instability. Traditionally, a load-shedding process is used, in which unnecessary and extra loads are disconnected. The proposed method overcomes this problem by scheduling the load based on the requirement. Artificial neural networks are used for this optimal load scheduling process. An artificial neural network is used to generate the economic schedule, because power generation from each source has a different cost. The total load required is the input of the network, and the power generation from each source and the power losses at the time of transmission are the outputs. Training and programming of the artificial neural networks are done using MATLAB.
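    The mapping described, total demand in, per-source generation out, can be sketched as a small feed-forward network; the weights below are untrained placeholders (the paper trains its networks in MATLAB), so the output is a demonstration only:

```python
import numpy as np

# Hedged sketch of the demand-to-dispatch mapping the abstract describes.
# Weights are random placeholders standing in for trained parameters.

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(8, 1)), np.zeros((8, 1))   # hidden layer
W2, b2 = rng.normal(size=(3, 8)), np.zeros((3, 1))   # 3 generation sources

def dispatch(total_load_mw):
    h = np.tanh(W1 @ np.array([[total_load_mw / 1000.0]]) + b1)
    raw = W2 @ h + b2
    share = np.exp(raw) / np.exp(raw).sum()           # softmax split
    return share.ravel() * total_load_mw              # MW per source

print(dispatch(750.0))   # untrained demo: a 3-way split of 750 MW
```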

  12. Integrated multi-resource planning and scheduling in engineering project

    Directory of Open Access Journals (Sweden)

    Samer Ben Issa

    2017-01-01

    Planning and scheduling processes in project management are carried out sequentially in practice, i.e., project activities are planned first without visibility of resource limitations, and the project is then scheduled according to these pre-planned activities. There is a need to integrate these two processes. In this paper, we use a Branch and Bound approach to generate all the feasible and non-feasible project schedules with/without activity splitting and, with a new criterion called "the Minimum Moments of Resources Required around the X-Y axes (MMORR)", we select the best feasible project schedule to integrate plan processing and schedule processing for engineering projects. The results illustrate that this integrated approach can effectively select the best feasible project schedule among alternatives, improves resource utilization, and shortens the project lead time.
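
    The record does not define how the resource moments are computed, so the following sketch adopts one plausible reading, in the spirit of the classical minimum-moment resource-leveling heuristic: the moment about the X axis penalizes peaks in the resource histogram, and the moment about the Y axis penalizes work scheduled late. Both formulas here are assumptions, as are the candidate profiles.

      # Sketch: rank candidate schedules by moments of their resource histograms.
      # A schedule is represented as a list of per-period resource requirements.

      def moments(histogram):
          """Moment about X (sum of squared daily usage, penalizing peaks) and
          moment about Y (usage-weighted time, penalizing late work)."""
          mx = sum(r * r for r in histogram)
          my = sum(t * r for t, r in enumerate(histogram, start=1))
          return mx, my

      candidates = {
          "A": [4, 4, 4, 4, 4],     # level profile
          "B": [8, 6, 4, 1, 1],     # front-loaded peak
          "C": [1, 1, 4, 6, 8],     # back-loaded peak
      }

      best = min(candidates, key=lambda k: sum(moments(candidates[k])))
      for name, hist in candidates.items():
          print(name, moments(hist))
      print("selected:", best)      # the level profile A wins on the combined moments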

  13. Physician Fee Schedule Search

    Data.gov (United States)

    U.S. Department of Health & Human Services — This website is designed to provide information on services covered by the Medicare Physician Fee Schedule (MPFS). It provides more than 10,000 physician services,...

  14. Clinical Laboratory Fee Schedule

    Data.gov (United States)

    U.S. Department of Health & Human Services — Outpatient clinical laboratory services are paid based on a fee schedule in accordance with Section 1833(h) of the Social Security Act. The clinical laboratory fee...

  15. CERN confirms LHC schedule

    CERN Document Server

    2003-01-01

    The CERN Council held its 125th session on 20 June. Highlights of the meeting included confirmation that the LHC is on schedule for a 2007 start-up, and the announcement of a new organizational structure in 2004.

  16. DMEPOS Fee Schedule

    Data.gov (United States)

    U.S. Department of Health & Human Services — The list contains the fee schedule amounts, floors, and ceilings for all procedure codes and payment category, jurisdication, and short description assigned to each...

  17. Project Schedule Simulation

    DEFF Research Database (Denmark)

    Mizouni, Rabeb; Lazarova-Molnar, Sanja

    2015-01-01

    overrun both their budget and time. To improve the quality of initial project plans, we show in this paper the importance of (1) reflecting features' priorities/risks in task schedules and (2) considering uncertainties related to human factors in plan schedules. To make simulation tasks reflect features' priority as well as multimodal team allocation, enhanced project schedules (EPS), where remedial action scenarios (RAS) are added, were introduced. They reflect potential schedule modifications in case of uncertainties and promote a dynamic sequencing of involved tasks rather than the static conventional...

  18. Decentralized Ground Staff Scheduling

    DEFF Research Database (Denmark)

    Sørensen, M. D.; Clausen, Jens

    2002-01-01

    scheduling is investigated. The airport terminal is divided into zones, where each zone consists of a set of stands geographically next to each other. Staff are assigned to work in only one zone and the staff scheduling is planned decentralized for each zone. The advantage of this approach is that the staff ... work in a smaller area of the terminal and thus spend less time walking between stands. When planning decentralized, the allocation of stands to flights influences the staff scheduling, since the workload in a zone depends on which flights are allocated to stands in the zone. Hence solving the problem ... depends on the actual stand allocation but also on the number of zones and the layout of these. A mathematical model of the problem is proposed, which integrates the stand allocation and the staff scheduling. A heuristic solution method is developed and applied on a real case from British Airways, London...

  19. Reactor outage schedule (tentative)

    Energy Technology Data Exchange (ETDEWEB)

    Walton, R.P.

    1969-11-01

    This single page document is the November 1, 1969 reactor refueling outage schedule for the Hanford Production Reactor. It also contains data on the amounts and types of fuels to be loaded and relocated in the production reactor.

  20. Reactor outage schedule (tentative)

    Energy Technology Data Exchange (ETDEWEB)

    Walton, R.P.

    1969-10-01

    This single page document is the October 1, 1969 reactor refueling outage schedule for the Hanford Production Reactor. It also contains data on the amounts and types of fuels to be loaded and relocated in the Production Reactor.

  1. Reactor outage schedule (tentative)

    Energy Technology Data Exchange (ETDEWEB)

    Walton, R.P.

    1969-10-15

    This single page document is the October 15, 1969 reactor refueling outage schedule for the Hanford Production Reactor. It also contains data on the amounts and types of fuels to be loaded and relocated in the Production Reactor.

  2. Reactor outage schedule (tentative)

    Energy Technology Data Exchange (ETDEWEB)

    Walton, R.P.

    1969-09-15

    This single page document is the September 15, 1969 reactor refueling outage schedule for the Hanford Production Reactor. It also contains data on the amounts and types of fuels to be loaded and relocated in the Production Reactor.

  3. Reactor outage schedule (tentative)

    Energy Technology Data Exchange (ETDEWEB)

    Walton, R.P.

    1969-12-15

    This single page document is the December 16, 1969 reactor refueling outage schedule for the Hanford Production Reactor. It also contains data on the amounts and types of fuels to be loaded and relocated in the Production Reactor.

  4. Reactor outage schedule (tentative)

    Energy Technology Data Exchange (ETDEWEB)

    Walton, R.P.

    1969-12-01

    This single page document is the December 1, 1969 reactor refueling outage schedule for the Hanford Production Reactor. It also contains data on the amounts and types of fuels to be loaded and relocated in the Production Reactor.

  5. Fee Schedules - General Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — A fee schedule is a complete listing of fees used by Medicare to pay doctors or other providers-suppliers. This comprehensive listing of fee maximums is used to...

  6. CMS Records Schedule

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Records Schedule provides disposition authorizations approved by the National Archives and Records Administration (NARA) for CMS program-related records...

  7. On the Processing of Spalling Experiments. Part II: Identification of Concrete Fracture Energy in Dynamic Tension

    Science.gov (United States)

    Lukić, Bratislav B.; Saletti, Dominique; Forquin, Pascal

    2017-12-01

    This paper presents the second part of a study aimed at investigating the fracture behavior of concrete under high strain rate tensile loading. The experimental method, together with the identified stress-strain response of three tests conducted on ordinary concrete, was presented in the paper entitled Part I (Forquin and Lukić in Journal of Dynamic Behavior of Materials, 2017. https://doi.org/10.1007/s40870-017-0135-1). In the present paper, Part II, the investigation is extended towards directly determining the specific fracture energy of each observed fracture zone by visualizing the dynamic cracking process with a temporal resolution of 1 µs. Having access to temporal displacement fields of the sample surface, it is possible to identify the fracture opening displacement (FOD) and the fracture opening velocity of any principal (open) and secondary (closed) fracture at each measurement instant, whether or not it leads to complete physical failure of the sample. Finally, the local Stress-FOD curves were obtained for each observed fracture zone, as opposed to previous works where indirect measurements were used. The obtained results indicated a much lower specific fracture energy compared to the results often found in the literature. Furthermore, numerical simulations were performed with a damage law to evaluate the validity of the proposed experimental data processing and compare it to the one most often used in previous works. The results showed that the present method can reliably predict the specific fracture energy needed to open one macro-fracture, and suggested that indirect measurement techniques can lead to an overestimate of specific fracture energy due to the stringent assumption of linear elasticity up to the peak and the inability to access the real post-peak change of axial stress.
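
    For reference, in the standard cohesive-fracture sense the specific fracture energy identified from a local Stress-FOD curve is the area under the stress versus opening curve; the notation below is ours, not the paper's:

      G_f = \int_0^{\delta_c} \sigma(\delta) \, d\delta

    where \sigma(\delta) is the stress transmitted across the fracture at opening \delta and \delta_c is the opening beyond which the fracture is stress-free. An indirect method that overestimates the post-peak stresses \sigma(\delta) correspondingly overestimates G_f, which is the bias the authors point to.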

  8. Generation of Look-Up Tables for Dynamic Job Shop Scheduling Decision Support Tool

    Science.gov (United States)

    Oktaviandri, Muchamad; Hassan, Adnan; Mohd Shaharoun, Awaluddin

    2016-02-01

    The majority of existing scheduling techniques are based on static demand and deterministic processing times, while most job shop scheduling problems are concerned with dynamic demand and stochastic processing times. As a consequence, the solutions obtained from traditional scheduling techniques become ineffective whenever changes occur in the system. Therefore, this research intends to develop a decision support tool (DST) based on promising artificial intelligence techniques that is able to accommodate the dynamics that regularly occur in job shop scheduling problems. The DST was designed through three phases, i.e. (i) look-up table generation, (ii) inverse model development and (iii) integration of the DST components. This paper reports the generation of look-up tables for various scenarios as a part of the development of the DST. A discrete event simulation model was used to compare the performance of the SPT, EDD, FCFS, S/OPN and Slack rules; the best performance measures (mean flow time, mean tardiness and mean lateness) and the job order requirements (inter-arrival time, due date tightness and setup time ratio) were compiled into look-up tables. The well-known 6/6/J/Cmax problem from Muth and Thompson (1963) was used as a case study. In the future, the performance measures of the various scheduling scenarios and the job order requirements will be mapped using an ANN inverse model.
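
    The look-up-table idea (simulate each dispatching rule per scenario and record which rule wins on a chosen performance measure) can be sketched compactly. The single-machine simulator, the job generator and the two scenarios below are all invented for illustration; the paper's model is a full job shop.

      # Sketch: compare dispatching rules on a single machine and record the winner
      # per scenario, mimicking the look-up-table idea (all parameters invented).
      import random

      def simulate(jobs, rule):
          """jobs: list of (arrival, proc_time, due). Returns mean flow time, mean tardiness."""
          pending, t, flows, tards = [], 0.0, [], []
          jobs = sorted(jobs)                          # by arrival time
          i = 0
          while i < len(jobs) or pending:
              while i < len(jobs) and jobs[i][0] <= t:
                  pending.append(jobs[i]); i += 1
              if not pending:                          # machine idle: jump to next arrival
                  t = jobs[i][0]; continue
              key = {"FCFS": lambda j: j[0],
                     "SPT":  lambda j: j[1],
                     "EDD":  lambda j: j[2]}[rule]
              job = min(pending, key=key)
              pending.remove(job)
              t += job[1]
              flows.append(t - job[0])
              tards.append(max(0.0, t - job[2]))
          n = len(jobs)
          return sum(flows) / n, sum(tards) / n

      random.seed(1)
      lookup = {}
      for mean_gap in (2.0, 5.0):                      # scenario: inter-arrival tightness
          arr, jobs = 0.0, []
          for _ in range(200):
              arr += random.expovariate(1.0 / mean_gap)
              p = random.uniform(1, 8)
              jobs.append((arr, p, arr + p * random.uniform(1.5, 3.0)))
          results = {r: simulate(jobs, r) for r in ("FCFS", "SPT", "EDD")}
          lookup[mean_gap] = min(results, key=lambda r: results[r][1])  # best mean tardiness
      print(lookup)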

  9. I. Advances in NMR Signal Processing. II. Spin Dynamics in Quantum Dissipative Systems

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Yung-Ya [Univ. of California, Berkeley, CA (United States)

    1998-11-01

    Part I. Advances in NMR Signal Processing. Improvements of sensitivity and resolution are two major objectives in the development of NMR/MRI. A signal enhancement method is first presented which recovers signal from noise by a judicious combination of a priori knowledge to define the desired feasible solutions and a set theoretic estimation for restoring signal properties that have been lost due to noise contamination. The effect of noise can be significantly mitigated through the process of iteratively modifying the noisy data set to the smallest degree necessary so that it possesses a collection of prescribed properties and also lies closest to the original data set. A novel detection-estimation scheme is then introduced to analyze noisy and/or strongly damped or truncated FIDs. Based on exponential modeling, the number of signals is detected based on information theory and the spectral parameters are estimated using the matrix pencil method. Part II. Spin Dynamics in Quantum Dissipative Systems. Spin dynamics in many-body dipole-coupled systems constitutes one of the most fundamental problems in magnetic resonance and condensed-matter physics. Its many-spin nature precludes any rigorous treatment. Therefore, the spin-boson model is adopted to describe in the rotating frame the influence of the dipolar local fields on a tagged spin. Based on the polaronic transform and a perturbation treatment, an analytical solution is derived, suggesting the existence of self-trapped states in the strong coupling limit, i.e., when transverse local field >> longitudinal local field. Such nonlinear phenomena originate from the joint action of the lattice fluctuations and the reaction field. Under semiclassical approximation, it is found that the main effect of the reaction field is the renormalization of the Hamiltonian of interest. Its direct consequence is the two-step relaxation process: the spin is initially localized in a quasiequilibrium state, which is later detrapped by

  10. CMS Planning and Scheduling System

    CERN Document Server

    Kotamaki, M

    1998-01-01

    The paper describes the procedures and the system to build and maintain the schedules needed to manage time, resources, and progress of the CMS project. The system is based on the decomposition of the project into work packages, which can be each considered as a complete project with its own structure. The system promotes the distribution of the decision making and responsibilities to lower levels in the organisation by providing a state-of-the-art system to formalise the external commitments of the work packages without limiting their ability to modify their internal schedules to best meet their commitments. The system lets the project management focus on the interfaces between the work packages and alerts the management immediately if a conflict arises. The proposed system simplifies the planning and management process and eliminates the need for a large, centralised project management system.

  11. 78 FR 9987 - Social Security Ruling, SSR 13-1p; Titles II and XVI: Agency Processes for Addressing Allegations...

    Science.gov (United States)

    2013-02-12

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2012-0071] Social Security Ruling, SSR 13-1p; Titles II and XVI: Agency Processes for Addressing Allegations of Unfairness, Prejudice, Partiality, Bias, Misconduct, or Discrimination by Administrative Law Judges (ALJs); Correction AGENCY: Social Security...

  12. 78 FR 22361 - Social Security Ruling, SSR 13-1p; Titles II and XVI: Agency Processes for Addressing Allegations...

    Science.gov (United States)

    2013-04-15

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2012-0071] Social Security Ruling, SSR 13-1p; Titles II and XVI: Agency Processes for Addressing Allegations of Unfairness, Prejudice, Partiality, Bias, Misconduct, or Discrimination by Administrative Law Judges (ALJs); Correction AGENCY: Social Security...

  13. 78 FR 8217 - Social Security Ruling, SSR 13-1p; Titles II and XVI: Agency Processes for Addressing Allegations...

    Science.gov (United States)

    2013-02-05

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2012-0071] Social Security Ruling, SSR 13-1p; Titles II and XVI: Agency Processes for Addressing Allegations of Unfairness, Prejudice, Partiality, Bias... the third column, the fourth line under the ``Summary'' heading, change ``SSR-13-Xp'' to ``SSR-13-1p...

  14. The Composition of the Master Schedule

    Science.gov (United States)

    Thomas, Cynthia C.; Behrend, Dirk; MacMillan, Daniel S.

    2010-01-01

    Over a period of about four months, the IVS Coordinating Center (IVSCC) each year composes the Master Schedule for the IVS observing program of the next calendar year. The process begins in early July when the IVSCC contacts the IVS Network Stations to request information about available station time as well as holiday and maintenance schedules for the upcoming year. Going through various planning stages and a review process with the IVS Observing Program Committee (OPC), the final version of the Master Schedule is posted by early November. We describe the general steps of the composition and illustrate them with the example of the planning for the Master Schedule of the 2010 observing year.

  15. Constraint-based job shop scheduling with ILOG SCHEDULER

    NARCIS (Netherlands)

    Nuijten, W.P.M.; Le Pape, C.

    1998-01-01

    We introduce constraint-based scheduling and discuss its main principles. An approximation algorithm based on tree search is developed for the job shop scheduling problem using ILOG SCHEDULER. A new way of calculating lower bounds on the makespan of the job shop scheduling problem is presented and

  16. A Mathematical Model for Scheduling a Batch Processing Machine with Multiple Incompatible Job Families, Non-identical Job dimensions, Non-identical Job sizes, Non-agreeable release times and due dates

    International Nuclear Information System (INIS)

    Ramasubramaniam, M; Mathirajan, M

    2013-01-01

    The paper addresses the problem of scheduling a batch processing machine with multiple incompatible job families, non-identical job dimensions, non-identical job sizes and non-agreeable release dates to minimize makespan. The research problem is solved by proposing a mixed integer programming model that appropriately takes into account the parameters considered in the problem. The proposed model is validated using a numerical example. The experiments conducted show that the model can pose significant difficulties in solving large-scale instances. The paper concludes by giving the scope for future work and some alternative approaches one can use for solving this class of problems.

  17. Finite Element Models for Electron Beam Freeform Fabrication Process, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This Small Business Innovation Research Phase II proposal offers to develop a comprehensive computer simulation methodology based on the finite element method for...

  18. Split Scheduling with Uniform Setup Times

    NARCIS (Netherlands)

    Schalekamp, F.; Sitters, R.A.; van der Ster, S.L.; Stougie, L.; Verdugo, V.; van Zuylen, A.

    2015-01-01

    We study a scheduling problem in which jobs may be split into parts, where the parts of a split job may be processed simultaneously on more than one machine. Each part of a job requires a setup time, however, on the machine where the job part is processed. During setup, a machine cannot process or

  19. An Improved Multiobjective PSO for the Scheduling Problem of Panel Block Construction

    Directory of Open Access Journals (Sweden)

    Zhi Yang

    2016-01-01

    Uncertainty is common in ship construction. However, few studies have focused on scheduling problems under uncertainty in shipbuilding. This paper formulates the scheduling problem of panel block construction as a multiobjective fuzzy flow shop scheduling problem (FSSP) with a fuzzy processing time, a fuzzy due date, and the just-in-time (JIT) concept. An improved multiobjective particle swarm optimization called MOPSO-M is developed to solve the scheduling problem. MOPSO-M utilizes a ranked-order-value rule to convert the continuous positions of particles into discrete permutations of jobs, and an available mapping is employed to obtain the precedence-based permutation of the jobs. In addition, to improve the performance of MOPSO-M, archive maintenance is combined with global best position selection, and mutation and a velocity constriction mechanism are introduced into the algorithm. The feasibility and effectiveness of MOPSO-M are assessed in comparison with general MOPSO and the nondominated sorting genetic algorithm-II (NSGA-II).
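
    The ranked-order-value idea is simple enough to show in a few lines: each particle keeps a continuous position vector, and the job permutation is obtained by ranking its components. The sketch below shows just that conversion step, not the full MOPSO-M algorithm; the position values are invented.

      # Ranked-order-value (ROV): turn a continuous particle position into a job permutation.
      def rov(position):
          """Jobs are numbered 0..n-1; the job with the smallest position value comes first."""
          return sorted(range(len(position)), key=lambda j: position[j])

      print(rov([0.73, 0.12, 1.94, 0.55]))   # -> [1, 3, 0, 2]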

  20. Guidelines of Decommissioning Schedule Establishment

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jae Yong; Yun, Taesik; Kim, Younggook; Kim, Hee-Geun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    Decommissioning has recently become a highlighted issue in Korea due to the Permanent Shutdown (PS) of the Kori-1 plant. Since Korea Hydro and Nuclear Power (KHNP) Company decided on the PS of Kori-1 instead of further continued operation, Kori-1 will be the first commercial reactor to be decommissioned in Korea. The Korean regulatory authority demands an Initial Decommissioning Plan (IDP) for all plants in operation and under construction. In addition, decommissioning should be considered for the completion of the life cycle of NPPs. To date, Korea has no experience with decommissioning a commercial reactor, and many uncertainties are expected due to site-specific factors. However, an optimized decommissioning process schedule is indispensable to the safety and economic efficiency of the project. Unlike the USA, Korea has no operating experience or site-management know-how for decommissioning. Hence, in Korea, the establishment of a decommissioning schedule has to give more weight to safety than precedent cases did. A more economical and rational schedule will be composed by collecting and analyzing experience data and site-specific data and information as the decommissioning progresses. In a long-range outlook, KHNP, having the capability of NPP decommissioning, will pursue decommissioning business in Korea and foreign countries.

  1. 1170-MW(t) HTGR-PS/C plant application study report: SRC-II process application

    International Nuclear Information System (INIS)

    Rao, R.; McMain, A.T. Jr.

    1981-05-01

    The solvent refined coal (SRC-II) process is an advanced process being developed by Gulf Mineral Resources Ltd. (a Gulf Oil Corporation subsidiary) to produce a clean, non-polluting liquid fuel from high-sulfur bituminous coals. The SRC-II commercial plant will process about 24,300 tonnes (26,800 tons) of feed coal per stream day, producing primarily fuel oil plus secondary fuel gases. This summary report describes the integration of a high-temperature gas-cooled reactor operating in a process steam/cogeneration mode (HTGR-PS/C) to provide the energy requirements for the SRC-II process. The HTGR-PS/C plant was developed by General Atomic Company (GA) specifically for industries which require energy in the form of both steam and electricity. General Atomic has developed an 1170-MW(t) HTGR-PS/C design which is particularly well suited to industrial applications and is expected to have excellent cost benefits over other sources of energy

  2. Luminescence and photothermally stimulated defects creation processes in PbWO4:La3+, Y3+ (PWO II) crystals

    International Nuclear Information System (INIS)

    Auffray, E.; Korjik, M.; Zazubovich, S.

    2015-01-01

    Photoluminescence and thermally stimulated luminescence (TSL) are studied for a PbWO4 crystal grown by the Czochralski method at Bogoroditsk Technical Chemical Plant, Russia, from the melt with a precise tuning of the stoichiometry and co-doped with La3+ and Y3+ ions (the PWO II crystal). Photothermally stimulated processes of electron and hole center creation under selective UV irradiation of this crystal in the 3.5–5.0 eV energy range and the 85–205 K temperature range are clarified and the optically created electron and hole centers are identified. The electrons in PWO II are mainly trapped at the (WO4)2- groups located close to single La3+ and Y3+ ions, producing the electron {(WO4)3- - La3+} and {(WO4)3- - Y3+} centers. The holes are mainly trapped at the regular oxygen ions O2- located close to La3+ and Y3+ ions associated with lead vacancies, producing the hole O-(I)-type centers. No evidence of single-vacancy-related centers has been observed in PWO II. The data obtained indicate that the excellent scintillation characteristics of the PWO II crystal can be explained by a negligible concentration of single (non-compensated) oxygen and lead vacancies as traps for electrons and holes, respectively. - Highlights: • Photoluminescence of the PbWO4:La3+,Y3+ (PWO II) crystal is investigated. • Creation of defects under UV irradiation of PWO II is studied by TSL. • Origin of dominating electron and hole centers is ascertained. • Concentration of single-vacancy-related centers is found to be negligible. • Excellent scintillation characteristics of the PWO II crystal are explained.

  3. Estimating exponential scheduling preferences

    DEFF Research Database (Denmark)

    Hjorth, Katrine; Börjesson, Maria; Engelson, Leonid

    2015-01-01

    Different assumptions about travelers' scheduling preferences yield different measures of the cost of travel time variability. Only few forms of scheduling preferences provide non-trivial measures which are additive over links in transport networks where link travel times are arbitrarily ... of car drivers' route and mode choice under uncertain travel times. Our analysis exposes some important methodological issues related to complex non-linear scheduling models: One issue is identifying the point in time where the marginal utility of being at the destination becomes larger than the marginal utility of being at the origin. Another issue is that models with the exponential marginal utility formulation suffer from empirical identification problems. Though our results are not decisive, they partly support the constant-affine specification, in which the value of travel time variability

  4. Post LS1 schedule

    CERN Document Server

    Lamont, M

    2014-01-01

    The scheduling limits for a typical long year, taking into account technical stops, machine development, and special physics runs, are presented. An attempt is then made to outline a ten-year post-LS1 schedule taking into account the disparate requirements outlined in the previous talks in this session. The demands on the planned long shutdowns and the impact of these demands on their proposed length will be discussed. The option of using ion running as a pre-shutdown cool-down period will be addressed.

  5. Automated Scheduling Via Artificial Intelligence

    Science.gov (United States)

    Biefeld, Eric W.; Cooper, Lynne P.

    1991-01-01

    Artificial-intelligence software that automates scheduling developed in Operations Mission Planner (OMP) research project. Software used in both generation of new schedules and modification of existing schedules in view of changes in tasks and/or available resources. Approach based on iterative refinement. Although project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, also applicable to such terrestrial problems as scheduling production in factory.

  6. Minimizing tardiness for job shop scheduling under uncertainties

    OpenAIRE

    Yahouni , Zakaria; Mebarki , Nasser; Sari , Zaki

    2016-01-01

    Many disturbances can occur during the execution of a manufacturing scheduling process. To cope with this drawback, flexible solutions are proposed based on the offline and online phases of the schedule. Groups of permutable operations is one of the most studied flexible scheduling methods, bringing flexibility as well as quality to a schedule. The online phase of this method is based on a human-machine system allowing the choice, in real time, of one schedule from a set...

  7. Multiple-Machine Scheduling with Learning Effects and Cooperative Games

    Directory of Open Access Journals (Sweden)

    Yiyuan Zhou

    2015-01-01

    Multiple-machine scheduling problems with position-based learning effects are studied in this paper. There is an initial schedule in this scheduling problem. The optimal schedule minimizes the sum of the weighted completion times; the difference between the initial total weighted completion time and the minimal total weighted completion time is the cost savings. A multiple-machine sequencing game is introduced to allocate the cost savings. The game is balanced if the normal processing times of jobs that are on the same machine are equal and an equal number of jobs is scheduled on each machine initially.

  8. Topology-based hierarchical scheduling using deficit round robin

    DEFF Research Database (Denmark)

    Yu, Hao; Yan, Ying; Berger, Michael Stubert

    2009-01-01

    according to the topology. The mapping process could be completed through the network management plane or by manual configuration. Based on the knowledge of the network, the scheduler can manage the traffic on behalf of other less advanced nodes, avoid potential traffic congestion, and provide flow...... protection and isolation. Comparisons between hierarchical scheduling, flow-based scheduling, and class-based scheduling schemes have been carried out under a symmetric tree topology. Results have shown that the hierarchical scheduling scheme provides better flow protection and isolation from attack...
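
    Deficit round robin itself, which underlies the scheduler, is a standard algorithm: each queue accumulates a quantum of byte credit per round and may transmit packets as long as its deficit counter covers them. A minimal sketch, with queue contents and quantum invented:

      # Minimal deficit round robin (DRR): each queue gets `quantum` bytes of credit
      # per round; a packet is sent only if the queue's deficit counter covers it.
      from collections import deque

      def drr(queues, quantum, rounds):
          deficits = [0] * len(queues)
          sent = []
          for _ in range(rounds):
              for i, q in enumerate(queues):
                  if not q:
                      deficits[i] = 0            # idle queues keep no credit
                      continue
                  deficits[i] += quantum
                  while q and q[0] <= deficits[i]:
                      pkt = q.popleft()
                      deficits[i] -= pkt
                      sent.append((i, pkt))
          return sent

      queues = [deque([300, 300, 300]), deque([900]), deque([100, 1500])]
      print(drr(queues, quantum=500, rounds=3))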

  9. Mathematical Modeling and a Hybrid NSGA-II Algorithm for Process Planning Problem Considering Machining Cost and Carbon Emission

    Directory of Open Access Journals (Sweden)

    Jin Huang

    2017-09-01

    Process planning is an important function in a manufacturing system; it specifies the manufacturing requirements and details for the shop floor to convert a part from raw material to the finished form. However, considering only an economic criterion with technological constraints is not enough in sustainable manufacturing practice; formerly, criteria concerning low-carbon-emission awareness have seldom been taken into account in process planning optimization. In this paper, a mathematical model that considers both machining cost reduction and carbon emission reduction is established for the process planning problem. However, due to various flexibilities together with complex precedence constraints between operations, the process planning problem is a non-deterministic polynomial-time (NP) hard problem. Aiming at the distinctive features of multi-objective process planning optimization, we then developed a hybrid non-dominated sorting genetic algorithm (NSGA-II) to tackle this problem. A local search method that considers both the total cost criterion and the carbon emission criterion is introduced into the proposed algorithm to avoid being trapped in local optima. Moreover, the technique for order preference by similarity to an ideal solution (TOPSIS) is also adopted to determine the best solution from the Pareto front. Experiments have been conducted using Kim's benchmark. Computational results show that process plan schemes with low carbon emission can be captured, and, more importantly, the proposed hybrid NSGA-II algorithm can obtain a more promising optimal Pareto front than the plain NSGA-II algorithm. Meanwhile, according to the computational results on Kim's benchmark, we find that both the total machining cost and carbon emission are roughly proportional to the number of operations, and a process plan with fewer operations may be more satisfactory. This study will draw references for the further research on green
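
    TOPSIS, used above to pick a single plan from the Pareto front, ranks alternatives by closeness to an ideal point. A minimal sketch for two minimized criteria (cost and carbon emission); the candidate plans and the equal weights are invented:

      # TOPSIS over a Pareto front with two criteria to minimize: cost and emission.
      import numpy as np

      front = np.array([[100.0, 9.0],    # candidate plans: [machining cost, carbon emission]
                        [120.0, 6.0],
                        [150.0, 4.0]])
      weights = np.array([0.5, 0.5])

      norm = front / np.linalg.norm(front, axis=0)       # vector-normalize each criterion
      v = norm * weights
      ideal, anti = v.min(axis=0), v.max(axis=0)         # both criteria are minimized
      d_pos = np.linalg.norm(v - ideal, axis=1)          # distance to ideal solution
      d_neg = np.linalg.norm(v - anti, axis=1)           # distance to anti-ideal solution
      closeness = d_neg / (d_pos + d_neg)
      print("closeness:", closeness.round(3), "-> best plan:", int(np.argmax(closeness)))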

  10. Nonblocking Scheduling for Web Service Transactions

    DEFF Research Database (Denmark)

    Alrifai, Mohammad; Balke, Wolf-Tilo; Dolog, Peter

    2007-01-01

    . In this paper, we propose a novel nonblocking scheduling mechanism that is used prior to the actual service invocations. Its aim is to reach an agreement between the client and all participating providers on what transaction processing times have to be expected, accepted, and guaranteed. This enables service......For improved flexibility and concurrent usage existing transaction management models for Web services relax the isolation property of Web service-based transactions. Correctness of the concurrent execution then has to be ensured by commit order-preserving transaction schedulers. However, local...... schedulers of service providers typically do take into account neither time constraints for committing the whole transaction, nor the individual services' constraints when scheduling decisions are made. This often leads to an unnecessary blocking of transactions by (possibly long-running) others...

  11. EBR-II Cover Gas Cleanup System upgrade process control system structure

    International Nuclear Information System (INIS)

    Carlson, R.B.; Staffon, J.D.

    1992-01-01

    The Experimental Breeder Reactor II (EBR-II) Cover Gas Cleanup System (CGCS) control system was upgraded in 1991 to improve control and provide a graphical operator interface. The upgrade consisted of a main control computer, a distributed control computer, a front end input/output computer, a main graphics interface terminal, and a remote graphics interface terminal. This paper briefly describes the Cover Gas Cleanup System and the overall control system; describes the main control computer hardware and system software features in more detail; and, then, describes the real-time control tasks, and how they interact with each other, and how they interact with the operator interface task

  12. A Full Mesh ATCA-based General Purpose Data Processing Board (Pulsar II)

    Energy Technology Data Exchange (ETDEWEB)

    Ajuha, S. [Univ. of Sao Paulo (Brazil); et al.

    2017-06-29

    The Pulsar II is a custom ATCA full mesh enabled FPGA-based processor board which has been designed with the goal of creating a scalable architecture abundant in flexible, non-blocking, high bandwidth interconnections. The design has been motivated by silicon-based tracking trigger needs for LHC experiments. In this technical memo we describe the Pulsar II hardware and its performance, such as the performance test results with full mesh backplanes from different vendors, how the backplane is used for the development of low-latency time-multiplexed data transfer schemes and how the inter-shelf and intra-shelf synchronization works.

  13. A Full Mesh ATCA-based General Purpose Data Processing Board (Pulsar II)

    CERN Document Server

    Ajuha, S; Costa de Paiva, Thiago; Das, Souvik; Eusebi, Ricardo; Finotti Ferreira, Vitor; Hahn, Kristian; Hu, Zhen; Jindariani, Sergo; Konigsberg, Jacobo; Liu, Tiehui Ted; Low, Jia Fu; Okumura, Yasuyuki; Olsen, Jamieson; Arruda Ramalho, Lucas; Rossin, Roberto; Ristori, Luciano; Akira Shinoda, Ailton; Tran, Nhan; Trovato, Marco; Ulmer, Keith; Vaz, Mario; Wen, Xianshan; Wu, Jin-Yuan; Xu, Zijun; Yin, Han; Zorzetti, Silvia

    2017-01-01

    The Pulsar II is a custom ATCA full mesh enabled FPGA-based processor board which has been designed with the goal of creating a scalable architecture abundant in flexible, non-blocking, high bandwidth interconnections. The design has been motivated by silicon-based tracking trigger needs for LHC experiments. In this technical memo we describe the Pulsar II hardware and its performance, such as the performance test results with full mesh backplanes from different vendors, how the backplane is used for the development of low-latency time-multiplexed data transfer schemes and how the inter-shelf and intra-shelf synchronization works.

  14. Closed-Loop Control of the Thermal Stir Welding Process to Enable Rapid Process/Part Qualification, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Thermal Stir Welding (TSW) provides advancement over the more conventional Friction Stir Welding (C-FSW) process because it separates the primary processes variables...

  15. A Gas Scheduling Optimization Model for Steel Enterprises

    Directory of Open Access Journals (Sweden)

    Niu Honghai

    2017-01-01

    Regarding the scheduling problems of steel enterprises, this research designs a gas scheduling optimization model according to rules and priorities. Considering the different features and process changes of the gas units in actual production, a calculation model of process state and gas consumption soft measurement, together with rules for scheduling optimization, is proposed to provide dispatchers with the real-time gas usage status of each process, helping them schedule in a timely manner and reduce gas volume fluctuations. In the meantime, operation forewarning and alarm functions are provided to avoid abnormal situations in the scheduling. This has worked well in actual scheduling and ensures the safety of the gas pipe network system and the stability of production.

  16. Technology for planning and scheduling under complex constraints

    Science.gov (United States)

    Alguire, Karen M.; Pedro Gomes, Carla O.

    1997-02-01

    Within the context of law enforcement, several problems fall into the category of planning and scheduling under constraints. Examples include resource and personnel scheduling, and court scheduling. In the case of court scheduling, a schedule must be generated considering available resources, e.g., court rooms and personnel. Additionally, there are constraints on individual court cases, e.g., temporal and spatial, and between different cases, e.g., precedence. Finally, there are overall objectives that the schedule should satisfy, such as timely processing of cases and optimal use of court facilities. Manually generating a schedule that satisfies all of the constraints is a very time-consuming task. As the number of court cases and constraints increases, this becomes increasingly harder to handle without the assistance of automatic scheduling techniques. This paper describes artificial intelligence (AI) technology that has been used to develop several high-performance scheduling applications including a military transportation scheduler, a military in-theater airlift scheduler, and a nuclear power plant outage scheduler. We discuss possible law enforcement applications where we feel the same technology could provide long-term benefits to law enforcement agencies and their operations personnel.
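
    As a toy illustration of scheduling under such constraints (not the AI technology the record refers to), here is a small backtracking search that assigns court cases to room/slot pairs subject to no double-booking and a precedence constraint; all case data are invented.

      # Toy constraint-based scheduler: assign cases to (room, slot) pairs.
      import itertools

      cases = ["A", "B", "C", "D"]
      rooms, slots = ["R1", "R2"], [1, 2, 3]
      precedence = [("A", "C")]          # case A must be heard before case C

      def consistent(assign):
          used = list(assign.values())
          if len(used) != len(set(used)):                     # no double-booked room/slot
              return False
          for before, after in precedence:
              if before in assign and after in assign and \
                 assign[before][1] >= assign[after][1]:       # compare slot numbers
                  return False
          return True

      def backtrack(assign):
          if len(assign) == len(cases):
              return assign
          case = cases[len(assign)]
          for rs in itertools.product(rooms, slots):
              assign[case] = rs
              if consistent(assign):
                  result = backtrack(assign)
                  if result:
                      return result
              del assign[case]
          return None

      print(backtrack({}))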

  17. Harmonious personnel scheduling

    NARCIS (Netherlands)

    Fijn van Draat, Laurens; Post, Gerhard F.; Veltman, Bart; Winkelhuijzen, Wessel

    2006-01-01

    The area of personnel scheduling is very broad. Here we focus on the ‘shift assignment problem’. Our aim is to discuss how ORTEC HARMONY handles this planning problem. In particular we go into the structure of the optimization engine in ORTEC HARMONY, which uses techniques from genetic algorithms,

  18. Hybrid job shop scheduling

    NARCIS (Netherlands)

    Schutten, Johannes M.J.

    1995-01-01

    We consider the problem of scheduling jobs in a hybrid job shop. We use the term 'hybrid' to indicate that we consider a lot of extensions of the classic job shop, such as transportation times, multiple resources, and setup times. The Shifting Bottleneck procedure can be generalized to deal with

  19. Practical job shop scheduling

    NARCIS (Netherlands)

    Schutten, Johannes M.J.

    1998-01-01

    The Shifting Bottleneck procedure is an intuitive and reasonably good approximation algorithm for the notoriously difficult classical job shop scheduling problem. The principle of decomposing a classical job shop problem into a series of single-machine problems can also easily be applied to job shop

  20. "Creative" Work Schedules.

    Science.gov (United States)

    Blai, Boris

    Many creative or flexible work scheduling options are becoming available to the many working parents, students, handicapped persons, elderly individuals, and others who are either unable or unwilling to work a customary 40-hour work week. These options may be broadly categorized as either restructured or reduced work time options. The three main…

  1. Antigen processing of glycoconjugate vaccines; the polysaccharide portion of the pneumococcal CRM(197) conjugate vaccine co-localizes with MHC II on the antigen processing cell surface.

    Science.gov (United States)

    Lai, Zengzu; Schreiber, John R

    2009-05-21

    Pneumococcal (Pn) polysaccharides (PS) are T-independent (TI) antigens and do not induce immunological memory or antibodies in infants. Conjugation of PnPS to the carrier protein CRM(197) induces PS-specific antibody in infants, and memory similar to T-dependent (Td) antigens. Conjugates have improved immunogenicity via antigen processing and presentation of carrier protein with MHC II and recruitment of T cell help, but the fate of the PS attached to the carrier is unknown. To determine the location of the PS component of PnPS-CRM(197) in the APC, we separately labeled PS and protein and tracked their location. The PS of types 14-CRM(197) and 19F-CRM(197) was specifically labeled by Alexa Fluor 594 hydrazide (red). The CRM(197) was separately labeled red in a reaction that did not label PS. Labeled antigens were incubated with APC which were fixed, permeabilized and incubated with anti-MHC II antibody labeled green by Alexa Fluor 488, followed by confocal microscopy. Labeled CRM(197) was presented on APC surface and co-localized with MHC II (yellow). Labeled unconjugated 14 or 19F PS did not go to the APC surface, but PS labeled 14-CRM(197) and 19F-CRM(197) was internalized and co-localized with MHC II. Monoclonal antibody to type 14 PS bound to intracellular type 14 PS and PS-CRM(197). Brefeldin A and chloroquine blocked both CRM(197) and PS labeled 14-CRM(197) and 19F-CRM(197) from co-localizing with MHC II. These data suggest that the PS component of the CRM(197) glycoconjugate enters the endosome, travels with CRM(197) peptides to the APC surface and co-localizes with MHC II.

  2. Gemini NIFS survey of feeding and feedback processes in nearby active galaxies - II. The sample and surface mass density profiles

    Science.gov (United States)

    Riffel, R. A.; Storchi-Bergmann, T.; Riffel, R.; Davies, R.; Bianchin, M.; Diniz, M. R.; Schönell, A. J.; Burtscher, L.; Crenshaw, M.; Fischer, T. C.; Dahmer-Hahn, L. G.; Dametto, N. Z.; Rosario, D.

    2018-02-01

    We present and characterize a sample of 20 nearby Seyfert galaxies selected for having BAT 14-195 keV luminosities L_X ≥ 10^41.5 erg s^-1, redshift z ≤ 0.015, being accessible for observations with the Gemini Near-Infrared Field Spectrograph (NIFS) and showing extended [O III]λ5007 emission. Our goal is to study Active Galactic Nucleus (AGN) feeding and feedback processes from near-infrared integral-field spectra, which include both ionized (H II) and hot molecular (H2) emission. This sample is complemented by another nine Seyfert galaxies previously observed with NIFS. We show that the host galaxy properties (absolute magnitudes M_B, M_H, central stellar velocity dispersion and axial ratio) show a similar distribution to those of the 69 BAT AGN. For the 20 galaxies already observed, we present surface mass density (Σ) profiles for H II and H2 in their inner ~500 pc, showing that H II emission presents a steeper radial gradient than H2. This can be attributed to the different excitation mechanisms: ionization by AGN radiation for H II and heating by X-rays for H2. The mean surface mass densities are in the range (0.2 ≤ Σ_HII ≤ 35.9) M⊙ pc^-2 and (0.2 ≤ Σ_H2 ≤ 13.9) × 10^-3 M⊙ pc^-2, while the ratios between the H II and H2 masses range between ~200 and 8000. The sample presented here will be used in future papers to map AGN gas excitation and kinematics, providing a census of the mass inflow and outflow rates and power as well as their relation with the AGN luminosity.

  3. Performance assessment, participative processes and value judgements. Report from the first RISCOM II workshop

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Kjell [Karinta-Konsult, Taeby (Sweden); Lilja, Christina [Swedish Nuclear Power Inspectorate, Stockholm (Sweden)] (eds.)

    2001-12-01

    This workshop was the first in a series of three workshops within the RISCOM-II project. The aim was to gather the status of the project as a starting point to enhance discussions between project participants and with a number of invited participants. The seminar also included two presentations from the OECD/NEA on NEA work related to stakeholder participation, as well as the EC Concerted Action COWAM. Discussions were held in direct connection to the talks and in special sessions. The first day of the workshop, entitled Value judgements, risk communication and performance assessment, was moderated by Magnus Westerlind (SKI), the RISCOM-II coordinator. The second day was entitled Case studies exploring implications for the practical development of risk communication and was moderated by Anna Littleboy, UK Nirex Ltd. The workshop was opened by Thierry Devries, EDF. He welcomed the participants to Paris, gave some remarks about the French nuclear waste management situation and highlighted the significant French and EDF participation in RISCOM-II. He noted that the project should have possibilities to enhance transparency in nuclear waste programmes and that the new concept of stretching, introduced by RISCOM, is already in use. In the following, the talks given at the workshop and the discussions that took place are summarized. Appendix 3 gives a brief overview of the RISCOM-II project.

  4. Performance assessment, participative processes and value judgements. Report from the first RISCOM II workshop

    International Nuclear Information System (INIS)

    Andersson, Kjell; Lilja, Christina

    2001-12-01

    This workshop was the first in a series of three workshops within the RISCOM-II project. The aim was to gather the status of the project as a starting point to enhance discussions between project participants and with a number of invited participants. The seminar also included two presentations from the OECD/NEA on NEA work related to stakeholder participation, as well as the EC Concerted Action COWAM. Discussions were held in direct connection to the talks and in special sessions. The first day of the workshop, entitled Value judgements, risk communication and performance assessment, was moderated by Magnus Westerlind (SKI), the RISCOM-II coordinator. The second day was entitled Case studies exploring implications for the practical development of risk communication and was moderated by Anna Littleboy, UK Nirex Ltd. The workshop was opened by Thierry Devries, EDF. He welcomed the participants to Paris, gave some remarks about the French nuclear waste management situation and highlighted the significant French and EDF participation in RISCOM-II. He noted that the project should have possibilities to enhance transparency in nuclear waste programmes and that the new concept of stretching, introduced by RISCOM, is already in use. In the following, the talks given at the workshop and the discussions that took place are summarized. Appendix 3 gives a brief overview of the RISCOM-II project

  5. The power of reordering for online minimum makespan scheduling

    OpenAIRE

    Englert, Matthias; Özmen, Deniz; Westermann, Matthias

    2014-01-01

    In the classic minimum makespan scheduling problem, we are given an input sequence of jobs with processing times. A scheduling algorithm has to assign the jobs to m parallel machines. The objective is to minimize the makespan, which is the time it takes until all jobs are processed. In this paper, we consider online scheduling algorithms without preemption. However, we do not require that each arriving job has to be assigned immediately to one of the machines. A reordering buffer with limited...
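
    To make the role of the reordering buffer concrete, here is a small sketch of one natural strategy (not necessarily the paper's algorithm): keep up to k jobs in a buffer and repeatedly commit the largest buffered job to the currently least-loaded machine. The job sizes are invented.

      # Online makespan scheduling with a reordering buffer of size k (illustrative
      # strategy: always commit the largest buffered job to the least-loaded machine).
      import heapq

      def schedule(jobs, m, k):
          loads = [0.0] * m            # heap of machine loads
          heapq.heapify(loads)
          buffer = []
          def commit():
              job = max(buffer)
              buffer.remove(job)
              least = heapq.heappop(loads)
              heapq.heappush(loads, least + job)
          for job in jobs:
              buffer.append(job)
              if len(buffer) > k:      # buffer full: one job must be scheduled now
                  commit()
          while buffer:                # end of input: flush the buffer
              commit()
          return max(loads)

      print(schedule([2, 9, 3, 7, 1, 8, 4], m=2, k=3))   # makespan of the produced schedule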

  6. Effects of parasitic beam-beam interaction during the injection process at the PEP-II B Factory

    International Nuclear Information System (INIS)

    Chin, Y.H.

    1992-06-01

    This paper is concerned with beam-beam effects during the injection process at the proposed asymmetric SLAC/LBL/LLNL B-Factory, PEP-II. It is shown that the parasitic beam-beam interaction can lead to a significant blowup in the vertical size of the injected beam. Simulation results for the horizontal and the vertical injection schemes are presented, and their performances are studied

  7. Integrated Sensing and Processing (ISP) Phase II: Demonstration and Evaluation for Distributed Sensor Netowrks and Missile Seeker Systems

    Science.gov (United States)

    2007-02-28


  8. Mapping and analysis of the assignment concepts process at academic secretaries of Colegio Pedro II: reflections and proposals for improvement

    Directory of Open Access Journals (Sweden)

    Patrícia Bitencourt de Carvalho Athaydes

    2016-12-01

    The Colegio Pedro II was given a status equivalent to the Federal Institutes of Education, Science and Technology by Law 12677 of June 25, 2012. If, on the one hand, this equalization resulted in a significant organizational restructuring, with growth in the number of educational units and the incorporation of new educational levels, on the other hand, this institutional growth was dissociated from efforts to standardize administrative processes, notably in the academic secretaries of the different units, where the process of recording grades/concepts varies. In order to contribute improvements to the operation of the institution, the present article aims to map and comparatively analyse the grade/concept recording process in three campuses of the Colegio Pedro II. Methodologically, in-person interviews are held with the professionals responsible for the academic secretaries of the following units: Engenho Novo I, Humaita I and Realengo I, in order to obtain the subsidies necessary to support the mapping of the processes performed by these units' academic secretaries. As a result, it can be verified that the processes of the academic secretaries are not aligned to any system of performance indicators, which motivated the proposal of a standard process for recording grades/concepts, as well as a panel of key performance indicators (KPIs).

  9. Split scheduling with uniform setup times.

    NARCIS (Netherlands)

    F. Schalekamp; R.A. Sitters (René); S.L. van der Ster; L. Stougie (Leen); V. Verdugo; A. van Zuylen

    2015-01-01

    We study a scheduling problem in which jobs may be split into parts, where the parts of a split job may be processed simultaneously on more than one machine. Each part of a job requires a setup time, however, on the machine where the job part is processed. During setup, a

  10. Developing optimal nurses work schedule using integer programming

    Science.gov (United States)

    Shahidin, Ainon Mardhiyah; Said, Mohd Syazwan Md; Said, Noor Hizwan Mohamad; Sazali, Noor Izatie Amaliena

    2017-08-01

    Time management is the art of arranging, organizing and scheduling one's time for the purpose of generating more effective work and productivity. Scheduling is the process of deciding how to commit resources between a variety of possible tasks. Thus, it is crucial for every organization to have a good work schedule for its staff. Ward nurses at hospitals work around the clock, so nurses are assigned to shifts. This study is aimed at solving the nurse scheduling problem at an emergency ward of a private hospital. A 7-day work schedule for 7 consecutive weeks satisfying all the constraints set by the hospital is developed using Integer Programming. The work schedule obtained for the nurses gives an optimal solution in which all the constraints are successfully satisfied.
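
    The record does not list the hospital's constraints, so the sketch below is a minimal integer program in the same spirit, using the PuLP modeller: binary variables assign nurses to shifts, each shift needs minimum coverage, and each nurse works at most one shift per day. The staffing numbers are invented.

      # Minimal nurse scheduling integer program with PuLP (pip install pulp).
      from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpStatus

      nurses, days, shifts = range(6), range(7), ["AM", "PM", "Night"]
      demand = {"AM": 2, "PM": 2, "Night": 1}          # minimum nurses per shift (assumed)

      prob = LpProblem("nurse_scheduling", LpMinimize)
      x = LpVariable.dicts("x", (nurses, days, shifts), cat="Binary")

      prob += lpSum(x[n][d][s] for n in nurses for d in days for s in shifts)  # total shifts worked
      for d in days:
          for s in shifts:
              prob += lpSum(x[n][d][s] for n in nurses) >= demand[s]   # coverage per shift
          for n in nurses:
              prob += lpSum(x[n][d][s] for s in shifts) <= 1           # one shift per day

      prob.solve()
      print(LpStatus[prob.status])
      for n in nurses:
          print([s for d in days for s in shifts if x[n][d][s].value() == 1])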

  11. Prescribed Travel Schedules for Fatigue Management

    Science.gov (United States)

    Whitmire, Alexandra; Johnston, Smith; Lockley, Steven

    2011-01-01

    The NASA Fatigue Management Team is developing recommendations for managing fatigue during travel and for shift work operations, as Clinical Practice Guidelines for the Management of Circadian Desynchrony in ISS Operations. The Guidelines provide the International Space Station (ISS) flight surgeons and other operational clinicians with evidence-based recommendations for mitigating fatigue and other factors related to sleep loss and circadian desynchronization. As much international travel is involved both before and after flight, the guidelines provide recommendations for pre-flight training, in-flight operations, and post-flight rehabilitation. The objective is to standardize the process by which care is provided to crewmembers, ground controllers, and other support personnel such as trainers, when overseas travel or schedule shifting is required. Proper scheduling of countermeasures - light, darkness, melatonin, diet, exercise, and medications - is the cornerstone for facilitating circadian adaptation, improving sleep, enhancing alertness, and optimizing performance. The Guidelines provide, among other things, prescribed travel schedules that outline the specific implementation of these mitigation strategies. Each travel schedule offers evidence-based protocols for properly using the NASA-identified countermeasures for fatigue. This presentation will describe the travel implementation schedules and how they can be used to alleviate the effects of jet lag and/or schedule shifts.

  12. Interactive Dynamic Mission Scheduling for ASCA

    Science.gov (United States)

    Antunes, A.; Nagase, F.; Isobe, T.

    The Japanese X-ray astronomy satellite ASCA (Advanced Satellite for Cosmology and Astrophysics) mission requires scheduling for each 6-month observation phase, further broken down into weekly schedules at a resolution of a few minutes. Two tools, SPIKE and NEEDLE, written in Lisp and C, use artificial intelligence (AI) techniques combined with a graphical user interface for fast creation and alteration of mission schedules. These programs consider viewing and satellite attitude constraints as well as observer-requested criteria and present an optimized set of solutions for review by the planner. Six-month schedules at 1-day resolution are created for an oversubscribed set of targets by the SPIKE software, originally written for HST and presently being adapted for EUVE, XTE and AXAF. The NEEDLE code creates weekly schedules at 1-min resolution using in-house orbital routines and creates output for processing by the command generation software. Schedule creation on both the long- and short-term scale is rapid: less than 1 day for long-term, and one hour for short-term.

  13. Self-scheduling with Microsoft Excel.

    Science.gov (United States)

    Irvin, S A; Brown, H N

    1999-01-01

    Excessive time was being spent by the emergency department (ED) staff, head nurse, and unit secretary on a complex 6-week manual self-scheduling system. This issue, plus inevitable errors and staff dissatisfaction, resulted in a manager-led initiative to automate elements of the scheduling process using Microsoft Excel. The implementation of this initiative included: common coding of all 8-hour and 12-hour shifts, with each 4-hour period represented by a cell; the creation of a 6-week master schedule using the "count-if" function of Excel based on current staffing guidelines; entry of staff time-off requests by the department secretary; and fine-tuning of the schedule by the head nurse, with staff input, to provide even unit coverage. Outcomes of these changes included an increase in staff satisfaction, time saved by the head nurse, and staff work time saved because there was less arguing about the schedule. Ultimately, the automated self-scheduling method was expanded to the entire 700-bed hospital.
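
    The count-if idea is simply to count, for each 4-hour period, how many staff cells are marked as working and compare that count against the staffing guideline. A few lines of Python reproduce the same check (the grid contents and the required minimum are invented):

      # COUNTIF-style coverage check: rows are staff, columns are 4-hour periods,
      # "X" marks a period worked. Flag any period below the required minimum.
      schedule = {
          "Alice": ["X", "X", "", "", "X", "X"],
          "Bala":  ["", "X", "X", "X", "", ""],
          "Chen":  ["X", "", "X", "X", "X", ""],
      }
      REQUIRED = 2   # minimum staff per period (assumed guideline)

      for period in range(6):
          on_duty = sum(1 for row in schedule.values() if row[period] == "X")
          status = "OK" if on_duty >= REQUIRED else "UNDERSTAFFED"
          print(f"period {period}: {on_duty} on duty -> {status}")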

  14. Scheduling techniques in the Request Oriented Scheduling Engine (ROSE)

    Science.gov (United States)

    Zoch, David R.

    1991-01-01

    Scheduling techniques in the ROSE are presented in the form of viewgraphs. The following subject areas are covered: agenda; ROSE summary and history; NCC-ROSE task goals; accomplishments; ROSE timeline manager; scheduling concerns; current and ROSE approaches; initial scheduling; BFSSE overview and example; and summary.

  15. Location-based Scheduling

    DEFF Research Database (Denmark)

    Andersson, Niclas; Christensen, Knud

    on the market. However, CPM is primarily an activity-based method that takes the activity as the unit of focus, and criticism has been raised, specifically in the case of construction projects, that the method manages construction work and the continuous flow of resources deficiently. To seek solutions to the identified limitations of the CPM method, an alternative planning and scheduling methodology that includes locations is tested. Location-based Scheduling (LBS) implies a shift in focus, from primarily the activities to the flow of work through the various locations of the project, i.e. the building. LBS uses the graphical presentation technique of Line-of-balance, which is adapted for planning and management of work-flows and enables resources to perform their work without interruptions caused by other resources working on other activities in the same location. As such, LBS and Lean Construction share

  16. Practice schedule and ecological validity in the adaptive process of motor learning

    Directory of Open Access Journals (Sweden)

    Marcela Massigli

    2011-03-01

    Full Text Available The aim of this study was to investigate the effect of different practice schedules on the adaptive process of motor learning as a function of the ecological validity of the experimental situation. Participants were 104 children distributed into eight experimental groups (2 levels of ecological validity x 4 practice schedules). The task was to hit a table tennis ball, thrown either by equipment or by the experimenter, at a target located on the opposite side of the table. The study was carried out in two phases: stabilization and adaptation. Performance was analyzed through the sum of the points achieved in blocks of ten trials. Results showed that the effects of constant, random, constant-random, and random-constant practice on the adaptive process of motor learning were similar at both levels of ecological validity; constant practice was the least effective schedule for the adaptive process of motor learning in both experimental situations.

  17. WE-H-BRA-03: Development of a Model to Include the Evolution of Resistant Tumor Subpopulations Into the Treatment Optimization Process for Schedules Involving Targeted Agents in Chemoradiation Therapy

    International Nuclear Information System (INIS)

    Grassberger, C; Paganetti, H

    2016-01-01

    Purpose: To develop a model that includes the process of resistance development into the treatment optimization process for schedules that include targeted therapies. Further, to validate the approach using clinical data and to apply the model to assess the optimal induction period with targeted agents before curative treatment with chemo-radiation in stage III lung cancer. Methods: Growth of the tumor and its subpopulations is modeled by Gompertzian growth dynamics, resistance induction as a stochastic process. Chemotherapy induced cell kill is modeled by log-cell kill dynamics, targeted agents similarly but restricted to the sensitive population. Radiation induced cell kill is assumed to follow the linear-quadratic model. The validation patient data consist of a cohort of lung cancer patients treated with tyrosine kinase inhibitors that had longitudinal imaging data available. Results: The resistance induction model was successfully validated using clinical trial data from 49 patients treated with targeted agents. The observed recurrence kinetics, with tumors progressing from 1.4–63 months, result in tumor growth equaling a median volume doubling time of 92 days [34–248] and a median fraction of pre-existing resistance of 0.035 [0–0.22], in agreement with previous clinical studies. The model revealed widely varying optimal time points for the use of curative therapy, reaching from ∼1m to >6m depending on the patient’s growth rate and amount of pre-existing resistance. This demonstrates the importance of patient-specific treatment schedules when targeted agents are incorporated into the treatment. Conclusion: We developed a model including evolutionary dynamics of resistant sub-populations with traditional chemotherapy and radiation cell kill models. Fitting to clinical data yielded patient specific growth rates and resistant fraction in agreement with previous studies. Further application of the model demonstrated how proper timing of chemo

  18. WE-H-BRA-03: Development of a Model to Include the Evolution of Resistant Tumor Subpopulations Into the Treatment Optimization Process for Schedules Involving Targeted Agents in Chemoradiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Grassberger, C; Paganetti, H [Massachusetts General Hospital, Boston, MA (United States)

    2016-06-15

    Purpose: To develop a model that includes the process of resistance development into the treatment optimization process for schedules that include targeted therapies. Further, to validate the approach using clinical data and to apply the model to assess the optimal induction period with targeted agents before curative treatment with chemo-radiation in stage III lung cancer. Methods: Growth of the tumor and its subpopulations is modeled by Gompertzian growth dynamics, resistance induction as a stochastic process. Chemotherapy induced cell kill is modeled by log-cell kill dynamics, targeted agents similarly but restricted to the sensitive population. Radiation induced cell kill is assumed to follow the linear-quadratic model. The validation patient data consist of a cohort of lung cancer patients treated with tyrosine kinase inhibitors that had longitudinal imaging data available. Results: The resistance induction model was successfully validated using clinical trial data from 49 patients treated with targeted agents. The observed recurrence kinetics, with tumors progressing from 1.4–63 months, result in tumor growth equaling a median volume doubling time of 92 days [34–248] and a median fraction of pre-existing resistance of 0.035 [0–0.22], in agreement with previous clinical studies. The model revealed widely varying optimal time points for the use of curative therapy, reaching from ∼1m to >6m depending on the patient’s growth rate and amount of pre-existing resistance. This demonstrates the importance of patient-specific treatment schedules when targeted agents are incorporated into the treatment. Conclusion: We developed a model including evolutionary dynamics of resistant sub-populations with traditional chemotherapy and radiation cell kill models. Fitting to clinical data yielded patient specific growth rates and resistant fraction in agreement with previous studies. Further application of the model demonstrated how proper timing of chemo
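
    The model ingredients named in the abstract can be combined into a toy simulation. The sketch below is only a schematic reading of the abstract -- Gompertzian growth, log-cell kill restricted to the sensitive clone, linear-quadratic (LQ) radiation kill; apart from the quoted 92-day median doubling time and 3.5% median pre-existing resistant fraction, every parameter value is invented for illustration.

        import numpy as np

        K = 1e12           # Gompertz carrying capacity, cells (assumed)
        b = 0.693 / 92.0   # growth rate from the quoted 92-day doubling time
        mu = 1e-6          # resistance induction probability (assumed)
        dt = 1.0           # time step, days

        def step(S, R, drug_kill=0.0):
            """One Euler step: Gompertz growth; targeted agent kills S only."""
            g = b * np.log(K / max(S + R, 1.0))   # Gompertzian specific rate
            dS = g * S * (1 - mu) - drug_kill * S
            dR = g * R + g * S * mu               # resistant cells accumulate
            return S + dt * dS, R + dt * dR

        def lq_sf(d, alpha=0.35, beta=0.035):
            """LQ surviving fraction for one radiation fraction of d Gy."""
            return np.exp(-alpha * d - beta * d ** 2)

        S, R = 1e9 * (1 - 0.035), 1e9 * 0.035    # 3.5% pre-existing resistance
        for day in range(90):                     # induction with targeted agent
            S, R = step(S, R, drug_kill=0.05)
        print(f"S={S:.2e}  R={R:.2e}  resistant fraction={R / (S + R):.2f}")
        S, R = S * lq_sf(2.0), R * lq_sf(2.0)     # a 2 Gy fraction hits both clones

    Scanning the induction length in such a loop is, in caricature, how a patient-specific switch-over time from the targeted agent to chemoradiation would be read off.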

  19. Online scheduling of 2-re-entrant flexible manufacturing systems

    NARCIS (Netherlands)

    Pinxten, J. van; Waqas, U.; Geilen, M.; Basten, T.; Somers, L.

    2017-01-01

    Online scheduling of operations is essential to optimize productivity of flexible manufacturing systems (FMSs) where manufacturing requests arrive on the fly. An FMS processes products according to a particular flow through processing stations. This work focusses on online scheduling of re-entrant

  20. Modelling altered fractionation schedules

    International Nuclear Information System (INIS)

    Fowler, J.F.

    1993-01-01

    The author discusses the conflicting requirements of hyperfractionation and accelerated fractionation used in radiotherapy, and the development of computer modelling to predict how to obtain an optimum of tumour cell kill without exceeding normal-tissue tolerance. The present trend is to shorten hyperfractionated schedules from 6 or 7 weeks to give overall times of 4 or 5 weeks, as in new schedules by Herskovic et al (1992) and Harari (1992). Very high doses are given, much higher than can be given when ultrashort schedules such as CHART (12 days) are used. Computer modelling has suggested that the optimum overall times to yield maximum cell kill in tumours (α/β = 10 Gy), for a constant level of late complications (α/β = 3 Gy), would be X or X-1 weeks, where X is the doubling time of the tumour cells in days (Fowler 1990). For median doubling times of about 5 days, overall times of 4 or 5 weeks should be ideal. (U.K.)
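
    The trade-off the author models is conventionally book-kept with the linear-quadratic biologically effective dose, BED = nd(1 + d/(α/β)), using exactly the α/β values quoted above. A minimal sketch follows; the two schedules compared are illustrative, not those of the cited trials, and a fuller version would also subtract a repopulation term growing with overall time, which is what drives the optimum overall-time result quoted above.

        def bed(n, d, alpha_beta):
            """Biologically effective dose: BED = n*d*(1 + d/(alpha/beta))."""
            return n * d * (1 + d / alpha_beta)

        schedules = {                      # (fractions, dose per fraction in Gy)
            "conventional 30 x 2.0 Gy": (30, 2.0),
            "hyperfractionated 68 x 1.2 Gy": (68, 1.2),
        }
        for name, (n, d) in schedules.items():
            print(f"{name}: tumour BED = {bed(n, d, 10.0):.1f} Gy10, "
                  f"late-effects BED = {bed(n, d, 3.0):.1f} Gy3")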

  1. Estimating exponential scheduling preferences

    DEFF Research Database (Denmark)

    Hjorth, Katrine; Börjesson, Maria; Engelson, Leonid

    Extended abstract: Choice of departure time is a travel choice dimension that transportation planners often need to forecast in appraisal. A traveller may shift departure time in response to changes in expected travel time or travel time variability (TTV), or in response to time-differentiated congestion charges … from the underlying scheduling preferences (Noland and Small, 1995; Bates et al., 2001; Fosgerau and Karlström, 2010). The scheduling preferences can be formally represented as time-dependent rates of utility derived at different locations. Assuming that travellers are rational and choose departure … The assumption underlying the scheduling approach is that the traveller rationally maximises her total utility obtained during a period of time. The total utility depends on the time of departure from the origin and the time of arrival at the destination, and is usually assumed…
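
    A minimal sketch of the scheduling formulation the abstract alludes to, with notation assumed here (in the spirit of Fosgerau and Karlström, 2010): let h(t) and w(t) be the time-dependent utility rates at the origin and the destination. A traveller who departs at t_d and arrives at t_a within a period [0, T] obtains total utility

        U(t_d, t_a) = \int_{0}^{t_d} h(t)\,\mathrm{d}t + \int_{t_a}^{T} w(t)\,\mathrm{d}t

    and chooses t_d to maximise the expectation of U over the random travel time t_a - t_d. "Exponential scheduling preferences" then correspond to specifying h and w as exponential functions of time.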

  2. A methodology for fault diagnosis in large chemical processes and an application to a multistage flash desalination process: Part II

    International Nuclear Information System (INIS)

    Tarifa, Enrique E.; Scenna, Nicolas J.

    1998-01-01

    In Part I, an efficient method for identifying faults in large processes was presented. The whole plant is divided into sectors by using structural, functional, or causal decomposition. A signed directed graph (SDG) is the model used for each sector. The SDG represents interactions among process variables. This qualitative model is used to carry out qualitative simulation for all possible faults. The output of this step is information about the process behaviour. This information is used to build rules. When a symptom is detected in one sector, its rules are evaluated using on-line data and fuzzy logic to yield the diagnosis. In this paper the proposed methodology is applied to a multiple stage flash (MSF) desalination process. This process is composed of sequential flash chambers. It was designed for a pilot plant that produces drinkable water for a community in Argentina; that is, it is a real case. Due to the large number of variables, recycles, phase changes, etc., this process is a good challenge for the proposed diagnosis method
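
    The two-step scheme -- off-line qualitative simulation on the SDG, on-line fuzzy rule evaluation -- can be caricatured in a few lines of Python. The graph, fault, and scaling below are invented toys, orders of magnitude smaller than the MSF plant model of the paper.

        # (1) Off-line: propagate a fault's sign through a tiny SDG to predict
        #     the deviation pattern it causes; (2) on-line: match measured
        #     deviations to that pattern with fuzzy truth values.
        EDGES = {"valve_leak": [("level", -1)], "level": [("outflow", +1)]}

        def predict(node, sign=+1, pattern=None):
            pattern = {} if pattern is None else pattern
            for var, edge_sign in EDGES.get(node, []):
                if var not in pattern:
                    pattern[var] = sign * edge_sign
                    predict(var, pattern[var], pattern)
            return pattern

        def fuzzy_truth(measured, predicted_sign, scale=1.0):
            """Membership in 'deviates in the predicted direction'."""
            return max(0.0, min(1.0, measured * predicted_sign / scale))

        pattern = predict("valve_leak")            # {'level': -1, 'outflow': -1}
        measured = {"level": -0.7, "outflow": -0.4}
        confidence = min(fuzzy_truth(measured[v], s) for v, s in pattern.items())
        print(pattern, "diagnosis confidence:", confidence)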

  3. VPipe: Virtual Pipelining for Scheduling of DAG Stream Query Plans

    Science.gov (United States)

    Wang, Song; Gupta, Chetan; Mehta, Abhay

    There are data streams all around us that can be harnessed for tremendous business and personal advantage. For an enterprise-level stream processing system such as CHAOS [1] (Continuous, Heterogeneous Analytic Over Streams), handling of complex query plans with resource constraints is challenging. While several scheduling strategies exist for stream processing, efficient scheduling of complex DAG query plans is still largely unsolved. In this paper, we propose a novel execution scheme for scheduling complex directed acyclic graph (DAG) query plans with meta-data enriched stream tuples. Our solution, called Virtual Pipelined Chain (or VPipe Chain for short), effectively extends the "Chain" pipelining scheduling approach to complex DAG query plans.
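
    The general idea of chain-style pipelining -- grouping operators that form linear runs of the DAG into units that can be scheduled together -- can be sketched as follows. This is only an illustration of that idea on a toy query graph, not the published VPipe algorithm.

        from collections import defaultdict

        # toy DAG of stream operators: (upstream, downstream) edges
        edges = [("src", "filter"), ("filter", "join"),
                 ("agg", "join"), ("join", "sink")]
        succ, pred = defaultdict(list), defaultdict(list)
        for u, v in edges:
            succ[u].append(v)
            pred[v].append(u)
        nodes = {n for e in edges for n in e}

        chains, seen = [], set()
        for n in sorted(nodes):
            # skip nodes that sit in the middle of a chain
            if n in seen or (len(pred[n]) == 1 and len(succ[pred[n][0]]) == 1):
                continue
            chain = [n]
            seen.add(n)
            while len(succ[chain[-1]]) == 1:
                nxt = succ[chain[-1]][0]
                if len(pred[nxt]) != 1:     # a merge point starts a new chain
                    break
                chain.append(nxt)
                seen.add(nxt)
            chains.append(chain)
        print(chains)   # [['agg'], ['join', 'sink'], ['src', 'filter']]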

  4. Assessment of very high-temperature reactors in process applications. Appendix II. VHTR process heat application studies

    International Nuclear Information System (INIS)

    Jones, J.E.; Gambill, W.R.; Cooper, R.H.; Fox, E.C.; Fuller, L.C.; Littlefield, C.C.; Silverman, M.D.

    1977-06-01

    A critical review is presented of the technology and economics for coupling a very high-temperature gas-cooled reactor to a variety of process applications. It is concluded that nuclear steam reforming of light hydrocarbons for coal conversion could be a near-term alternative and that direct nuclear coal gasification could be a future consideration. Thermochemical water splitting appears to be more costly and its availability farther in the future than the coal-conversion systems. Nuclear steelmaking is competitive with the direct reduction of iron ore from conventional coal-conversion processes but not competitive with the reforming of natural gas at present gas prices. Nuclear process heat for petroleum refining, even with the necessary backup systems, is competitive with fossil energy sources. The processing with nuclear heat of oil shale and tar sands is of marginal economic importance. An analysis of peaking power applications using nuclear heat was also made. It is concluded that steam reforming methane for energy storage and production of peaking power is not a viable economic alternative, but that energy storage with a high-temperature heat transfer salt (HTS) is competitive with conventional peaking systems. An examination of the materials required in process heat exchangers is made

  5. High-speed vector-processing system of the MELCOM-COSMO 900II

    Energy Technology Data Exchange (ETDEWEB)

    Masuda, K; Mori, H; Fujikake, J; Sasaki, Y

    1983-01-01

    Progress in scientific and technical calculations has led to a growing demand for high-speed vector calculations. Mitsubishi Electric has developed an integrated array processor and an automatically vectorizing Fortran compiler as an option for the MELCOM-COSMO 900II computer system. This facilitates the performance of vector and matrix calculations, achieving significant gains in cost-effectiveness. The article outlines the high-speed vector system, includes discussion of compiler structuring, and cites examples of effective system application. 1 reference.

  6. The ASDEX Upgrade discharge schedule

    International Nuclear Information System (INIS)

    Neu, G.; Engelhardt, K.; Raupp, G.; Treutterer, W.; Zasche, D.; Zehetbauer, T.

    2007-01-01

    ASDEX Upgrade's recently commissioned discharge control system (DCS) marks the transition from a traditionally programmed system to a highly flexible 'data-driven' one. The allocation of application processes (APs) to controllers, the interconnection of APs through uniquely named signals, and AP control parameter values are all defined as data, and can easily be adapted to the requirements of a particular discharge. The data are laid down in a set of XML documents which APs request via HTTP from a configuration server before a discharge. The use of XML allows for easy parsing, and structural validation through (XSD) schemas. The central input to the configuration process is the discharge schedule (DS), which embodies the dynamic behaviour of a planned discharge as reference trajectories grouped in segments, concatenated through transition conditions. Editing, generation and validation tools, and version control through CVS, allow for efficient management of DSs
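
    A sketch of how such a data-driven schedule might be read: the XML element and attribute names below are invented for illustration (the abstract does not expose the actual schema), and only Python's standard xml.etree is used, so the XSD validation step mentioned above is omitted.

        import xml.etree.ElementTree as ET

        DS_XML = """
        <dischargeSchedule shot="12345">
          <segment id="rampup" next="flattop" transition="Ip &gt;= 0.8e6">
            <trajectory signal="Ip" times="0.0 1.0" values="0.0 0.8e6"/>
          </segment>
          <segment id="flattop" next="" transition="t &gt;= 5.0">
            <trajectory signal="Ip" times="1.0 5.0" values="0.8e6 0.8e6"/>
          </segment>
        </dischargeSchedule>
        """

        root = ET.fromstring(DS_XML)
        for seg in root.iter("segment"):
            refs = {t.get("signal"): (t.get("times"), t.get("values"))
                    for t in seg.iter("trajectory")}
            print(seg.get("id"), "->", seg.get("next") or "END",
                  "| transition:", seg.get("transition"), "| refs:", refs)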

  7. Flow-shop scheduling problem under uncertainties: Review and trends

    OpenAIRE

    Eliana María González-Neira; Jairo R. Montoya-Torres; David Barrera

    2017-01-01

    Among the different tasks in production logistics, job scheduling is one of the most important at the operational decision-making level to enable organizations to achieve competitiveness. Scheduling consists in the allocation of limited resources to activities over time in order to achieve one or more optimization objectives. Flow-shop (FS) scheduling problems encompass the sequencing processes in environments in which the activities or operations are performed in a serial flow. This type of co...

  8. Schedulability-Driven Communication Synthesis for Time Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    1999-01-01

    We present an approach to static priority preemptive process scheduling for the synthesis of hard real-time distributed embedded systems where communication plays an important role. The communication model is based on a time-triggered protocol. We have developed an analysis for the communication...... delays proposing four different message scheduling policies over a time-triggered communication channel. Optimization strategies for the synthesis of communication are developed, and the four approaches to message scheduling are compared using extensive experiments....

  9. Schedulability-Driven Communication Synthesis for Time Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2004-01-01

    We present an approach to static priority preemptive process scheduling for the synthesis of hard real-time distributed embedded systems where communication plays an important role. The communication model is based on a time-triggered protocol. We have developed an analysis for the communication...... delays with four different message scheduling policies over a time-triggered communication channel. Optimization strategies for the synthesis of communication are developed, and the four approaches to message scheduling are compared using extensive experiments....

  10. Schedulability-Driven Communication Synthesis for Time Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    We present an approach to static priority preemptive process scheduling for the synthesis of hard real-time distributed embedded systems where communication plays an important role. The communication model is based on a time-triggered protocol. We have developed an analysis for the communication...... delays proposing four different message scheduling policies over a time-triggered communication channel. Optimization strategies for the synthesis of communication are developed, and the four approaches to message scheduling are compared using extensive experiments...

  11. Schedule optimization study implementation plan

    International Nuclear Information System (INIS)

    1993-11-01

    This Implementation Plan is intended to provide a basis for improvements in the conduct of the Environmental Restoration (ER) Program at Hanford. The Plan is based on the findings of the Schedule Optimization Study (SOS) team, which was convened for two weeks in September 1992 at the request of the U.S. Department of Energy (DOE) Richland Operations Office (RL). The need for the study arose out of a schedule dispute regarding the submission of the 1100-EM-1 Operable Unit (OU) Remedial Investigation/Feasibility Study (RI/FS) Work Plan. The SOS team comprised independent professionals from other federal agencies and the private sector experienced in environmental restoration within the federal system. The objective of the team was to examine reasons for the lengthy RI/FS process and recommend ways to expedite it. The SOS team issued its Final Report in December 1992. The report found that the most serious impediments to cleanup relate to a series of management and policy issues which are within the control of the three parties managing and monitoring Hanford -- the DOE, the U.S. Environmental Protection Agency (EPA), and the State of Washington Department of Ecology (Ecology). The SOS Report identified the following eight cross-cutting issues as the root of major impediments to the Hanford Site cleanup. Each of these eight issues is quoted from the SOS Report, followed by a brief, general description of the proposed approach being developed

  12. Converting Eucalyptus biomass into ethanol: Financial and sensitivity analysis in a co-current dilute acid process. Part II

    International Nuclear Information System (INIS)

    Gonzalez, R.; Treasure, T.; Phillips, R.; Jameel, H.; Saloni, D.; Wright, J.; Abt, R.

    2011-01-01

    The technical and financial performance of high-yield Eucalyptus biomass in a co-current dilute acid pretreatment followed by enzymatic hydrolysis process was simulated using WinGEMS® and Excel®. Average ethanol yield per dry Mg of Eucalyptus biomass was approximately 347.6 L of ethanol (with average carbohydrate content in the biomass around 66.1%), at a cost of 0.49 L⁻¹ of ethanol, a cash cost of ≈0.46 L⁻¹, and a CAPEX of 1.03 L⁻¹ of ethanol. The main cost drivers are: biomass, enzyme, tax, fuel (gasoline), depreciation and labor. Profitability of the process is very sensitive to biomass cost, carbohydrate content (%) of the biomass, and enzyme cost. Biomass delivered cost was simulated and financially evaluated in Part I; here in Part II the conversion of this raw material into cellulosic ethanol using the dilute acid process is evaluated. (author)

  13. NRC comprehensive records disposition schedule

    International Nuclear Information System (INIS)

    1982-07-01

    Effective January 1, 1982, NRC will institute records retention and disposal practices in accordance with the approved Comprehensive Records Disposition Schedule (CRDS). CRDS is comprised of NRC Schedules (NRCS) 1 to 4 which apply to the agency's program or substantive records and General Records Schedules (GRS) 1 to 22 which apply to housekeeping or facilitative records. The schedules are assembled functionally/organizationally to facilitate their use. Preceding the records descriptions and disposition instructions for both NRCS and GRS, there are brief statements on the organizational units which accumulate the records in each functional area, and other information regarding the schedules' applicability

  14. HTGR high temperature process heat design and cost status report. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    None

    1981-12-01

    Information is presented concerning the 850°C IDC reactor vessel; primary cooling system; secondary helium system; steam generator; heat cycle evaluations for the 850°C IDC plant; 950°C DC reactor vessel; 950°C DC steam generator; direct and indirect cycle reformers; methanation plant; thermochemical pipeline; methodology for screening candidate synfuel processes; ECCG process; project technical requirements; process gas explosion assessment; HTGR program economic guidelines; and vendor responses.

  15. HTGR high temperature process heat design and cost status report. Volume II. Appendices

    International Nuclear Information System (INIS)

    1981-12-01

    Information is presented concerning the 850°C IDC reactor vessel; primary cooling system; secondary helium system; steam generator; heat cycle evaluations for the 850°C IDC plant; 950°C DC reactor vessel; 950°C DC steam generator; direct and indirect cycle reformers; methanation plant; thermochemical pipeline; methodology for screening candidate synfuel processes; ECCG process; project technical requirements; process gas explosion assessment; HTGR program economic guidelines; and vendor responses

  16. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  17. A new manufacturing process to remove thrombogenic factors (II, VII, IX, X, and XI) from intravenous immunoglobulin gamma preparations.

    Science.gov (United States)

    Park, Dong Hwarn; Kang, Gil Bu; Kang, Dae Eun; Hong, Jeung Woon; Lee, Min Gyu; Kim, Ki Yong; Han, Jeung Whan

    2017-01-01

    Coagulation factors (II, VII, IX, X, and particularly XIa) remaining in high concentrations in intravenous immunoglobulin (IVIG) preparations can form thrombi, causing thromboembolic events and, in serious cases, death. Therefore, manufacturers of biological products must investigate the ability of their production processes to remove procoagulant activities. Previously, we were able to remove coagulation factors II, VII, IX, and X from our IVIG preparation through ethanol precipitation, but factor XIa, which plays an important role in thrombosis, remained in the intermediate products. Here, we used a chromatographic process with a new resin that binds IgG with high capacity and removes procoagulant activities. The procoagulant activities were reduced to low levels, as determined by thrombin generation assay (250 s) and FXI/FXIa ELISA (<0.31 ng/mL). Even after spiking with FXIa at a concentration 32.5 times higher than that in normal specimens, the procoagulant activities were below the detection limit (<0.31 ng/mL). These results demonstrate the ability of our manufacturing process to remove procoagulant activities to below the detection limit (except by NaPTT), suggesting a reduced risk of thromboembolic events potentially caused by our IVIG preparation. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  18. Practical quantum appointment scheduling

    Science.gov (United States)

    Touchette, Dave; Lovitz, Benjamin; Lütkenhaus, Norbert

    2018-04-01

    We propose a protocol based on coherent states and linear optics operations for solving the appointment-scheduling problem. Our main protocol leaks strictly less information about each party's input than the optimal classical protocol, even when considering experimental errors. Along with the ability to generate constant-amplitude coherent states over two modes, this protocol requires the ability to transfer these modes back-and-forth between the two parties multiple times with very low losses. The implementation requirements are thus still challenging. Along the way, we develop tools to study quantum information cost of interactive protocols in the finite regime.

  19. Real-time data acquisition and parallel data processing solution for TJ-II Bolometer arrays diagnostic

    Energy Technology Data Exchange (ETDEWEB)

    Barrera, E. [Departamento de Sistemas Electronicos y de Control, Universidad Politecnica de Madrid, Crta. Valencia Km. 7, 28031 Madrid (Spain)]. E-mail: eduardo.barrera@upm.es; Ruiz, M. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid, Crta. Valencia Km. 7, 28031 Madrid (Spain); Lopez, S. [Departamento de Sistemas Electronicos y de Control, Universidad Politecnica de Madrid, Crta. Valencia Km. 7, 28031 Madrid (Spain); Machon, D. [Departamento de Sistemas Electronicos y de Control, Universidad Politecnica de Madrid, Crta. Valencia Km. 7, 28031 Madrid (Spain); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, 28040 Madrid (Spain); Ochando, M. [Asociacion EURATOM/CIEMAT para Fusion, 28040 Madrid (Spain)

    2006-07-15

    Maps of local plasma emissivity of TJ-II plasmas are determined using three-array cameras of silicon photodiodes (AXUV type from IRD). They are assigned to the top and side ports of the same sector of the vacuum vessel. Each array consists of 20 unfiltered detectors. The signals from each of these detectors are the inputs to an iterative algorithm of tomographic reconstruction. Currently, these signals are acquired by a PXI standard system at approximately 50 kS/s, with 12 bits of resolution, and are stored for off-line processing. A 0.5 s discharge generates 3 Mbytes of raw data. The algorithm's load exceeds the CPU capacity of the PXI system's controller in continuous mode, making it unfeasible to process the samples in parallel with their acquisition in a standard PXI system. A new architecture model has been developed, making it possible to add one or several processing cards to a standard PXI system. With this model, it is possible to define how to distribute, in real time, the data from all acquired signals among the processing cards and the PXI controller. This way, by distributing the data processing among the system controller and two processing cards, the data processing can be done in parallel with the acquisition. Hence, this system configuration would be able to measure even in long-pulse devices.

  20. Spin-dependent recombination processes in wide band gap II-Mn-VI compounds

    International Nuclear Information System (INIS)

    Godlewski, M.; Yatsunenko, S.; Khachapuridze, A.; Ivanov, V.Yu.

    2004-01-01

    Mechanisms of optical detection of magnetic resonance in wide band gap II-Mn-VI diluted magnetic semiconductors (DMS) are discussed based on the results of photoluminescence (PL), PL kinetics, electron spin resonance (ESR), optically detected magnetic resonance (ODMR) and optically detected cyclotron resonance (ODCR) investigations. Spin-dependent interactions between localized spins of Mn²⁺ ions and the spins/magnetic moments of free, localized or bound carriers are responsible for the observed ODMR signals. We conclude that these interactions are responsible for the observed rapid shortening of the PL decay time of the ⁴T₁ → ⁶A₁ intra-shell emission of Mn²⁺ ions, and also for the observed delocalization of excitons in low-dimensional structures

  1. Intracellular insulin processing is altered in monocytes from patients with type II diabetes mellitus

    International Nuclear Information System (INIS)

    Trischitta, V.; Benzi, L.; Brunetti, A.; Cecchetti, P.; Marchetti, P.; Vigneri, R.; Navalesi, R.

    1987-01-01

    We studied total cell-associated A14-[¹²⁵I]insulin radioactivity (including surface-bound and internalized radioactivity), insulin internalization, and its intracellular degradation at 37 °C in monocytes from nonobese type II untreated diabetic patients (n = 9) and normal subjects (n = 7). Total cell-associated radioactivity was decreased in diabetic patients [2.65 ± 1.21% (±SD) vs. 4.47 ± 1.04% of total radioactivity]. Insulin internalization was also reduced in diabetic patients (34.0 ± 6.8% vs. 59.0 ± 11.3% of cell-associated radioactivity). Using high performance liquid chromatography, six intracellular forms of radioactivity derived from A14-[¹²⁵I]insulin were identified; 10-20% of intracellular radioactivity had approximately 300,000 mol wt and was identified as radioactivity bound to the insulin receptor, and the remaining intracellular radioactivity included intact A14-[¹²⁵I]insulin, [¹²⁵I]iodide, or [¹²⁵I]tyrosine, and three intermediate compounds. A progressive reduction of intact insulin and a corresponding increase in iodide were found when the incubation time was prolonged. Intracellular insulin degradation was reduced in monocytes from diabetic patients; intracellular intact insulin was 65.6 ± 18.1% vs. 37.4 ± 18.0% of intracellular radioactivity after 2 min and 23.6 ± 22.3% vs. 3.9 ± 2.3% after 60 min in diabetic patients vs. normal subjects, respectively. In conclusion, 1) human monocytes internalize and degrade insulin in the intracellular compartment in a stepwise, time-dependent manner; and 2) in monocytes from type II diabetic patients, total cell-associated radioactivity, insulin internalization, and insulin degradation are significantly reduced. These defects may be related to the cellular insulin resistance present in these patients

  2. Impact of the industrial freezing process on selected vegetables -Part II. Colour and bioactive compounds

    NARCIS (Netherlands)

    Mazzeo, Teresa; Paciulli, Maria; Chiavaro, Emma; Visconti, Attilio; Fogliano, Vincenzo; Ganino, Tommaso; Pellegrini, Nicoletta

    2015-01-01

    In the present study, the impact of the different steps (i.e. blanching, freezing, storage following the industrial freezing process and the final cooking prior to consumption) of the industrial freezing process was evaluated on colour, chlorophylls, lutein, polyphenols and ascorbic acid content

  3. Effects of straw processing and pen stocking density on holstein dairy heifers: ii) behavior and hygiene

    Science.gov (United States)

    The effects of pen-stocking density and straw processing on the daily behavior traits and hygiene of Holstein dairy heifers housed in a freestall system are not understood. Our objective was to evaluate these factors in a trial with a 2 × 3 factorial arrangement of straw-processing (GOOD or POOR) an...

  4. Metastability in reversible diffusion processes II. Precise asymptotics for small eigenvalues

    CERN Document Server

    Bovier, A; Klein, M

    2002-01-01

    We continue the analysis of the problem of metastability for reversible diffusion processes, initiated in \cite{BEGK3}, with a precise analysis of the low-lying spectrum of the generator. Recall that we are considering processes with generators of the form $-\epsilon \Delta + \nabla F \cdot \nabla$ …

  5. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.
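
    At its core, the extraction pipeline the tutorial describes reduces to a few array operations: cutting epochs around event markers, baseline-correcting each epoch, and averaging across trials. A toy sketch with synthetic data follows; the sampling rate and window lengths are arbitrary choices, not the tutorial's.

        import numpy as np

        fs = 500                                   # samples per second
        eeg = np.random.randn(60 * fs)             # one channel, toy data
        events = np.arange(fs, 55 * fs, 2 * fs)    # stimulus onset indices
        pre, post = int(0.2 * fs), int(0.8 * fs)   # 200 ms baseline, 800 ms window

        epochs = np.stack([eeg[e - pre:e + post] for e in events])
        epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)  # baseline correction
        erp = epochs.mean(axis=0)                  # trial average -> ERP waveform
        print(epochs.shape, erp.shape)             # (27, 500) (500,)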

  6. Crane Scheduling on a Plate Storage

    DEFF Research Database (Denmark)

    Hansen, Jesper

    2002-01-01

    OSS produces the world's largest container ships. The first process of producing the steel ships is handling the arrival and storage of steel plates until they are needed in production. Two gantry cranes carry out this task. The planning task is now to create a schedule of movements for the two cranes…

  7. FLOWSHOP SCHEDULING USING A NETWORK APPROACH ...

    African Journals Online (AJOL)

    eobe

    time when the last job completes on the last machine. The objective ... more jobs in a permutation flow shop scheduling problem ... processing time of a job on a machine is zero, it ..... hybrid flow shops with sequence dependent setup times ...

  8. Crane Scheduling for a Plate Storage

    DEFF Research Database (Denmark)

    Hansen, Jesper; Clausen, Jens

    2002-01-01

    Odense Steel Shipyard produces the world's largest container ships. The first process of producing the steel ships is handling the arrival and storage of steel plates until they are needed in production. This paper considers the problem of scheduling two cranes that carry out the movements of plates into, around, and out of the storage. The system is required to create a daily schedule for the cranes, but must also handle possible disruptions during the execution of the plan. The problem is solved with a Simulated Annealing algorithm.
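
    The abstract does not spell out the neighbourhood or the objective, so the sketch below is only the generic Simulated Annealing skeleton such a solver would be built around; the toy cost (total crane travel between consecutive moves) and the swap move are placeholders, not the paper's actual model.

        import math
        import random

        def simulated_annealing(schedule, cost, neighbour, t0=100.0,
                                alpha=0.995, t_min=1e-3, rng=random.Random(0)):
            best = current = schedule
            t = t0
            while t > t_min:
                cand = neighbour(current, rng)
                delta = cost(cand) - cost(current)
                if delta < 0 or rng.random() < math.exp(-delta / t):
                    current = cand                    # accept improving/uphill move
                    if cost(current) < cost(best):
                        best = current
                t *= alpha                            # geometric cooling
            return best

        def neighbour(seq, rng):
            s = list(seq)
            i, j = rng.sample(range(len(s)), 2)       # swap two plate moves
            s[i], s[j] = s[j], s[i]
            return s

        moves = list(range(8))                        # toy plate-move sequence
        cost = lambda seq: sum(abs(a - b) for a, b in zip(seq, seq[1:]))
        print(simulated_annealing(moves, cost, neighbour))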

  9. Autonomous scheduling technology for Earth orbital missions

    Science.gov (United States)

    Srivastava, S.

    1982-01-01

    The development of a dynamic autonomous system (DYASS) of resources for the mission support of near-Earth NASA spacecraft is discussed, and the current NASA space data system is described from a functional perspective. The future (late 1980s and early 1990s) NASA space data system is also discussed. The DYASS concept, autonomous process control, and the NASA space data system are introduced. Scheduling and related disciplines are surveyed, and DYASS is discussed as a scheduling problem. Artificial intelligence and knowledge representation are considered, as well as the NUDGE system and the I-Space system.

  10. An improved sheep flock heredity algorithm for job shop scheduling and flow shop scheduling problems

    Directory of Open Access Journals (Sweden)

    Chandramouli Anandaraman

    2011-10-01

    Full Text Available The Job Shop Scheduling Problem (JSSP) and the Flow Shop Scheduling Problem (FSSP) are strongly NP-complete combinatorial optimization problems among the class of typical production scheduling problems. An improved Sheep Flock Heredity Algorithm (ISFHA) is proposed in this paper to find a schedule of operations that minimizes makespan. In ISFHA, the pairwise mutation operation is replaced by a single-point mutation process with a probabilistic property that guarantees the feasibility of the solutions in the local search domain. A Robust-Replace (R-R) heuristic is introduced in place of chromosomal crossover to enhance the global search and improve convergence. The R-R heuristic is found to enhance the exploring potential of the algorithm and enrich the diversity of neighborhoods. Experimental results reveal the effectiveness of the proposed algorithm, whose optimization performance is markedly superior to that of genetic algorithms and comparable to the best results reported in the literature.

  11. Conception of Self-Construction Production Scheduling System

    Science.gov (United States)

    Xue, Hai; Zhang, Xuerui; Shimizu, Yasuhiro; Fujimura, Shigeru

    With the rapid innovation of information technology, many production scheduling systems have been developed. However, they require extensive customization to each individual production environment, making large investments in development and maintenance indispensable. The approach to constructing scheduling systems should therefore change. The final objective of this research is a system that builds itself by extracting scheduling techniques automatically from daily production scheduling work, so that the required investment is reduced. For interoperability, this extraction mechanism should be applicable to various production processes. Using the master information extracted by the system, production scheduling operators can work faster and more accurately, without any restriction on their scheduling operations. With this extraction mechanism, a scheduling system can be introduced without great customization expense. This paper first proposes a model for expressing a scheduling problem, then presents guidelines for extracting the scheduling information and using the extracted information, and proposes some applied functions based on it.

  12. River water quality model no. 1 (RWQM1): II. Biochemical process equations

    DEFF Research Database (Denmark)

    Reichert, P.; Borchardt, D.; Henze, Mogens

    2001-01-01

    In this paper, biochemical process equations are presented as a basis for water quality modelling in rivers under aerobic and anoxic conditions. These equations are not new, but they summarise parts of the development over the past 75 years. The primary goals of the presentation are to stimulate … transformation processes. This paper is part of a series of three papers. In the first paper, the general modelling approach is described; in the present paper, the biochemical process equations of a complex model are presented; and in the third paper, recommendations are given for the selection of a reasonable…

  13. A trajectory description of quantum processes. II. Applications. A Bohmian perspective

    Energy Technology Data Exchange (ETDEWEB)

    Sanz, Angel S.; Miret-Artes, Salvador [CSIC, Madrid (Spain). Inst. de Fisica Fundamental (IFF-CSIC)

    2014-07-01

    Presents a thorough introduction to, and treatment of, trajectory-based quantum-mechanical calculations, useful for a wide range of scattering problems. Trajectory-based formalisms are an intuitively appealing way of describing quantum processes because they allow the use of 'classical' concepts. Beginning at an introductory level suitable for students, this two-volume monograph presents (1) the fundamentals and (2) the applications of the trajectory description of basic quantum processes. This second volume is focussed on simple and basic applications of quantum processes such as interference and diffraction of wave packets, tunneling, diffusion, and bound-state and scattering problems. The corresponding analysis is carried out within the Bohmian framework. By stressing its interpretational aspects, the book leads the reader to an alternative and complementary way to better understand the underlying quantum dynamics.

  14. State of the art review of degradation processes in LMFBR materials. Volume II. Corrosion behavior

    International Nuclear Information System (INIS)

    Dillon, R.D.

    1975-01-01

    Degradation of materials exposed to Na in LMFBR service is reviewed. The degradation processes are discussed in sections on corrosion and mass transfer, erosion, wear and self welding, sodium--water reactions, and external corrosion. (JRD)

  15. Proceedings of the Malaysian Science and Technology Congress '94: Vol. II - new products and processes

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    New processes and products in Malaysian technology research were presented at the Science and Technology Congress '94. Composite materials, semiconductor fabrication, optical fibers, zeolite properties, etc. were discussed in 35 contributions.

  16. Proceedings of the Malaysian Science and Technology Congress '94: Vol. II - new products and processes

    International Nuclear Information System (INIS)

    1994-01-01

    New processes and products in Malaysian technology research were presented at the Science and Technology Congress '94. Composite materials, semiconductor fabrication, optical fibers, zeolite properties, etc. were discussed in 35 contributions

  17. Underwater Nuclear Fuel Disassembly and Rod Storage Process and Equipment Description. Volume II

    International Nuclear Information System (INIS)

    Viebrock, J.M.

    1981-09-01

    The process, equipment, and the demonstration of the Underwater Nuclear Fuel Disassembly and Rod Storage System are presented. The process was shown to be a viable means of increasing spent fuel pool storage density by taking apart fuel assemblies and storing the fuel rods in a denser fashion than in the original storage racks. The assembly's nonfuel-bearing waste is compacted and containerized. The report documents design criteria and analysis, fabrication, demonstration program results, and proposed enhancements to the system

  18. Processing of copper converter slag for metals reclamation: Part II: mineralogical study.

    Science.gov (United States)

    Deng, Tong; Ling, Yunhan

    2004-10-01

    Chemical and mineralogical characterizations of a copper converter slag, and of its products obtained by curing with strong sulphuric acid and leaching with hot water, were carried out using ore microscopy, scanning electron microscopy with energy dispersive spectrometry, wavelength dispersive X-ray fluorescence spectrometry, X-ray diffractometry, and chemical phase analysis. These provided the information necessary to develop a new process for treating such slag and a further understanding of the chemical and mineralogical changes occurring in the process.

  19. Bomb reduction of uranium tetrafluoride. Part II: Influence of the addition elements in the reduction process

    International Nuclear Information System (INIS)

    Anca Abati, R.; Lopez Rodriguez, M.

    1962-01-01

    This work shows the influence of uranium oxide and uranyl fluoride on the reduction of uranium tetrafluoride with Ca and Mg. These additions are more harmful when smaller bombs are used. The uranyl fluoride influences the reduction process; the yield-concentration curves show two regions depending on the salt concentration. The behaviour of this addition in these regions can be explained by the different decompositions that can take place during the reduction. (Author) 9 refs.

  20. Field theoretical approach to proton-nucleus reactions: II-Multiple-step excitation process

    International Nuclear Information System (INIS)

    Eiras, A.; Kodama, T.; Nemes, M.

    1989-01-01

    A field theoretical formulation of the multiple-step excitation process in proton-nucleus collisions, within the context of a relativistic eikonal approach, is presented. A closed-form expression for the double differential cross section is obtained; its structure is very simple and makes the physics transparent. Glauber's formulation of the same process is obtained as a limit of ours, and the necessary approximations are studied and discussed. (author) [pt]

  1. APGEN Scheduling: 15 Years of Experience in Planning Automation

    Science.gov (United States)

    Maldague, Pierre F.; Wissler, Steve; Lenda, Matthew; Finnerty, Daniel

    2014-01-01

    In this paper, we discuss the scheduling capability of APGEN (Activity Plan Generator), a multi-mission planning application that is part of the NASA AMMOS (Advanced Multi-Mission Operations System), and how APGEN scheduling evolved over its applications to specific space missions. Our analysis identifies two major reasons for the successful application of APGEN scheduling to real problems: an expressive DSL (Domain-Specific Language) for formulating scheduling algorithms, and a well-defined process for enlisting the help of auxiliary modeling tools in providing high-fidelity, system-level simulations of the combined spacecraft and ground support system.

  2. Perceptions of randomized security schedules.

    Science.gov (United States)

    Scurich, Nicholas; John, Richard S

    2014-04-01

    Security of infrastructure is a major concern. Traditional security schedules are unable to provide omnipresent coverage; consequently, adversaries can exploit predictable vulnerabilities to their advantage. Randomized security schedules, which randomly deploy security measures, overcome these limitations, but public perceptions of such schedules have not been examined. In this experiment, participants were asked to make a choice between attending a venue that employed a traditional (i.e., search everyone) or a random (i.e., a probability of being searched) security schedule. The absolute probability of detecting contraband was manipulated (i.e., 1/10, 1/4, 1/2) but equivalent between the two schedule types. In general, participants were indifferent to either security schedule, regardless of the probability of detection. The randomized schedule was deemed more convenient, but the traditional schedule was considered fairer and safer. There were no differences between traditional and random schedule in terms of perceived effectiveness or deterrence. Policy implications for the implementation and utilization of randomized schedules are discussed. © 2013 Society for Risk Analysis.

  3. Information Management of a Structured Admissions Interview Process in a Medical College with an Apple II System

    Science.gov (United States)

    O'Reilly, Robert; Fedorko, Steve; Nicholson, Nigel

    1983-01-01

    This paper describes a structured interview process for medical school admissions supported by an Apple II computer system which provides feedback to interviewers and the College admissions committee. Presented are the rationale for the system, the preliminary results of analysis of some of the interview data, and a brief description of the computer program and output. The present data show that the structured interview yields very high interrater reliability coefficients, is acceptable to the medical school faculty, and results in quantitative data useful in the admission process. The system continues in development at this time, a second year of data will be shortly available, and further refinements are being made to the computer program to enhance its utilization and exportability.

  4. A review of breast tomosynthesis. Part II. Image reconstruction, processing and analysis, and advanced applications

    Science.gov (United States)

    Sechopoulos, Ioannis

    2013-01-01

    Many important post-acquisition aspects of breast tomosynthesis imaging can impact its clinical performance. Chief among them is the reconstruction algorithm that generates the representation of the three-dimensional breast volume from the acquired projections. But even after reconstruction, additional processes, such as artifact reduction algorithms, computer aided detection and diagnosis, among others, can also impact the performance of breast tomosynthesis in the clinical realm. In this two part paper, a review of breast tomosynthesis research is performed, with an emphasis on its medical physics aspects. In the companion paper, the first part of this review, the research performed relevant to the image acquisition process is examined. This second part will review the research on the post-acquisition aspects, including reconstruction, image processing, and analysis, as well as the advanced applications being investigated for breast tomosynthesis. PMID:23298127

  5. Supercritical Production of Nanoparticles - Part I: The SSEC Process - Part II: Characterization of Nanopartic

    DEFF Research Database (Denmark)

    Jensen, Henrik

    … with the crystallite size. Therefore, special interest is being devoted to investigating these changes by developing new synthesis and characterization methods. Wet chemical and gas-phase syntheses are among the synthesis techniques that have been developed for nanoparticle formation. The sol-gel technique is the most broadly applied wet chemical process, and it can be used for the production of nanosized materials in the form of particles or coatings for a wide range of materials. However, conventional sol-gel techniques have a number of drawbacks: the process involves long reaction times and requires post-… The work presented in this thesis addresses the problems related to conventional sol-gel techniques by using supercritical CO2 as the reaction medium. Supercritical fluids exhibit gas-like mass transfer properties and liquid-like densities, both of which are particularly attractive for the sol-gel process…

  6. 36 CFR 1258.12 - NARA reproduction fee schedule.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false NARA reproduction fee... ADMINISTRATION PUBLIC AVAILABILITY AND USE FEES § 1258.12 NARA reproduction fee schedule. (a) Certification: $15...) Unlisted processes: For reproductions not covered by this fee schedule, see also § 1258.4. Fees for other...

  7. Duality-based algorithms for scheduling on unrelated parallel machines

    NARCIS (Netherlands)

    van de Velde, S.L.; van de Velde, S.L.

    1993-01-01

    We consider the following parallel machine scheduling problem. Each of n independent jobs has to be scheduled on one of m unrelated parallel machines. The processing of job J_l on machine M_i requires an uninterrupted period of positive length p_li. The objective is to find an assignment of
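
    For concreteness, here is a minimal greedy baseline for this problem class -- assign each job to the machine whose completion time it increases least. It only illustrates the problem data; it is not the paper's duality-based algorithm, which works with a Lagrangian relaxation of the assignment constraints.

        p = [          # p[l][i] = processing time of job J_l on machine M_i
            [3, 5, 9],
            [4, 2, 6],
            [8, 7, 3],
            [5, 5, 4],
        ]
        m = len(p[0])
        load = [0] * m
        assignment = []
        for times in p:
            i = min(range(m), key=lambda k: load[k] + times[k])
            load[i] += times[i]
            assignment.append(i)
        print(assignment, "makespan:", max(load))   # [0, 1, 2, 1] makespan: 7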

  8. Recovery scheduling for industrial processes using graph constraints

    NARCIS (Netherlands)

    Saltik, M.B.; van Gameren, S.; Özkan, L.; Weiland, S.

    2017-01-01

    This paper considers a class of scheduling problems cast for processes that consist of several interconnected subprocesses. We model the temporal constraints (On-Off status) on each subprocess using labeled directed graphs to form the admissible set of schedules. Furthermore, we consider physical

  9. Microcomputer-based image processing system for CT/MRI scans II

    International Nuclear Information System (INIS)

    Kwok, J.C.K.; Yu, P.K.N.; Cheng, A.Y.S.; Ho, W.C.

    1991-01-01

    This paper reports on a microcomputer-based image processing system used to digitize and process serial sections of CT/MRI scans and reconstruct three-dimensional images of brain structures and brain lesions. The grabbed images also serve as templates, and different vital regions with different risk values are traced out for 3D reconstruction. A knowledge-based system employing rule-based programming has been built to help identify brain lesions and to help plan trajectories for operations. The volumes of the lesions are also determined automatically. Such a system is very useful for archiving medical skills, monitoring tumor size, forecasting survival and outcome, and consistent neurosurgical planning

  10. Numerical evaluation of path-integral solutions to Fokker-Planck equations. II. Restricted stochastic processes

    International Nuclear Information System (INIS)

    Wehner, M.F.

    1983-01-01

    A path-integral solution is derived for processes described by nonlinear Fokker-Planck equations together with externally imposed boundary conditions. This path-integral solution is written in the form of a path sum for small time steps and contains, in addition to the conventional volume integral, a surface integral which incorporates the boundary conditions. A previously developed numerical method, based on a histogram representation of the probability distribution, is extended to a trapezoidal representation. This improved numerical approach is combined with the present path-integral formalism for restricted processes and is shown to give accurate results. 35 refs., 5 figs.
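
    The histogram flavour of the method can be caricatured in a few lines: propagate a discretised density with a short-time Gaussian drift-diffusion kernel, conserving probability on the grid. This toy (Ornstein-Uhlenbeck drift, all parameters arbitrary) only gestures at the paper's formalism, whose surface integral treats the boundary conditions properly.

        import numpy as np

        x = np.linspace(-2, 2, 201)
        dx = x[1] - x[0]
        p = np.exp(-(x - 1.0) ** 2 / 0.02)
        p /= p.sum() * dx                    # initial density, mass 1
        D, dt = 0.1, 5e-3                    # diffusion coefficient, time step
        drift = -x                           # OU drift b(x) = -x on the grid

        # short-time propagator K[i, j] ~ P(x_j -> x_i) over one step dt
        K = np.exp(-((x[:, None] - x[None, :] - drift[None, :] * dt) ** 2)
                   / (4 * D * dt))
        K /= K.sum(axis=0, keepdims=True)    # column-normalise: conserve mass
                                             # (crude stand-in for boundary term)
        for _ in range(400):                 # evolve to t = 2
            p = K @ p
        print("mass:", p.sum() * dx, " mean:", (x * p).sum() * dx)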

  11. Thermal-capillary analysis of Czochralski and liquid encapsulated Czochralski crystal growth. II - Processing strategies

    Science.gov (United States)

    Derby, J. J.; Brown, R. A.

    1986-01-01

    The pseudosteady-state heat transfer model developed in a previous paper is augmented with constraints for constant crystal radius and melt/solid interface deflection. Combinations of growth rate and of crucible and bottom-heater temperatures are tested as processing parameters for satisfying the constrained thermal-capillary problem over a range of melt volumes corresponding to the sequence occurring during the batchwise Czochralski growth of a small-diameter silicon crystal. The applicability of each processing strategy is judged by the range of existence of the solution, in terms of melt volume and the values of the axial and radial temperature gradients in the crystal.

  12. A radiometric method for the characterization of particulate processes in colloidal suspensions. II

    International Nuclear Information System (INIS)

    Subotic, B.

    1979-01-01

    A radiometric method for the characterization of particulate processes is verified using stable hydrosols of silver iodide. Silver iodide hydrosols satisfy the conditions required for the application of the proposed method. Comparison shows that the values for the change of particle size measured in silver iodide hydrosols by the proposed method are in excellent agreement with the values obtained by other methods on the same systems (electron microscopy, sedimentation analysis, light scattering). This shows that the proposed method is suitable for the characterization of particulate processes in colloidal suspensions. (Auth.)

  13. The Application of Virtex-II Pro FPGA in High-Speed Image Processing Technology of Robot Vision Sensor

    International Nuclear Information System (INIS)

    Ren, Y J; Zhu, J G; Yang, X Y; Ye, S H

    2006-01-01

    The Virtex-II Pro FPGA is applied to the vision sensor tracking system of the IRB2400 robot. The hardware platform, which undertakes the tasks of improving SNR and compressing data, is built around the high-speed image processing capability of the FPGA. The lower-level image-processing algorithms are realized by combining the FPGA fabric with the embedded CPU, and image processing is accelerated by this FPGA/CPU combination. The embedded CPU also makes it easy to realize the interface logic design. Some key techniques are presented in the text, such as the read-write process, template matching, and convolution, and some modules are simulated as well. Finally, a comparison is carried out between implementations of the modules using this design, a PC, and a DSP. Because the core of the high-speed image processing system is an FPGA chip whose function can be conveniently renewed, the measurement system is, to a degree, intelligent

  15. Scheduling the powering tests

    CERN Document Server

    Barbero-Soto, E; Casas-Lino, M P; Fernandez-Robles, C; Foraz, K; Pojer, M; Saban, R; Schmidt, R; Solfaroli-Camillocci, M; Vergara-Fernandez, A

    2008-01-01

    The Large Hadron Collider is now entering its final phase before receiving beam, and activities at CERN between 2007 and 2008 have shifted from installation work to the commissioning of the technical systems ("hardware commissioning"). Due to the unprecedented complexity of this machine, all systems are or will be tested as far as possible before the cool-down starts; systems are first tested individually before being tested together globally. The architecture of the LHC, which is partitioned into eight cryogenically and electrically independent sectors, allows commissioning on a sector-by-sector basis. When a sector reaches nominal cryogenic conditions, commissioning of the magnet powering system to nominal current for all magnets can be performed. This paper briefly describes the different activities to be performed during the powering tests of the superconducting magnet system and presents the scheduling issues raised by co-activities, as well as the management of resources.

  16. High solid fed-batch butanol fermentation with simultaneous product recovery: part II - process integration.

    Science.gov (United States)

    In these studies, liquid hot water (LHW) pretreated and enzymatically hydrolyzed Sweet Sorghum Bagasse (SSB) hydrolyzates were fermented in a fed-batch reactor. As reported in the preceding paper, the culture was not able to ferment hydrolyzate I in a batch process due to the presence of a high level o...

  17. Silicon Carbide (SiC) Power Processing Unit (PPU) for Hall Effect Thrusters, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR project, APEI, Inc. is proposing to develop a high efficiency, rad-hard 3.8 kW silicon carbide (SiC) power supply for the Power Processing Unit (PPU) of...

  18. MATLAB-based Applications for Image Processing and Image Quality Assessment – Part II: Experimental Results

    Directory of Open Access Journals (Sweden)

    L. Krasula

    2012-04-01

    The paper provides an overview of possible uses of the software described in Part I. It contains real examples of image quality improvement, distortion simulation, objective and subjective quality assessment, and other kinds of image processing that can be carried out with the individual applications.

  19. Chromic acid recovery by electro-electrodialysis. II. Pilot scale process, development, and optimization

    NARCIS (Netherlands)

    Frenzel, I.; Frenzel, I.; Holdik, H.; Stamatialis, Dimitrios; Pourcelly, G.; Wessling, Matthias

    2005-01-01

    Electro-electrodialysis is a promising technology for chromic acid recovery and static rinse water purification. It combines the recovery of the plating chemicals from rinse water, the elimination of metallic impurities from the process and rinse water treatment in one step. Previous industrial use

  20. Computational models of music perception and cognition II: Domain-specific music processing

    Science.gov (United States)

    Purwins, Hendrik; Grachten, Maarten; Herrera, Perfecto; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    In Part I [Purwins H, Herrera P, Grachten M, Hazan A, Marxer R, Serra X. Computational models of music perception and cognition I: The perceptual and cognitive processing chain. Physics of Life Reviews 2008, in press, doi:10.1016/j.plrev.2008.03.004], we addressed the study of cognitive processes that underlie auditory perception of music, and their neural correlates. The aim of the present paper is to summarize empirical findings from music cognition research that are relevant to three prominent music theoretic domains: rhythm, melody, and tonality. Attention is paid to how cognitive processes like category formation, stimulus grouping, and expectation can account for the music theoretic key concepts in these domains, such as beat, meter, voice, consonance. We give an overview of computational models that have been proposed in the literature for a variety of music processing tasks related to rhythm, melody, and tonality. Although the present state-of-the-art in computational modeling of music cognition definitely provides valuable resources for testing specific hypotheses and theories, we observe the need for models that integrate the various aspects of music perception and cognition into a single framework. Such models should be able to account for aspects that until now have only rarely been addressed in computational models of music cognition, like the active nature of perception and the development of cognitive capacities from infancy to adulthood.

  1. Sport Tournament Automated Scheduling System

    OpenAIRE

    Raof R. A. A; Sudin S.; Mahrom N.; Rosli A. N. C

    2018-01-01

    Organizers of sport events often face problems such as miscalculation of marks and scores, as well as difficulty in creating a good and reliable schedule. Most of the time, issues about the integrity of committee members and about human error also come into the picture. Therefore, the development of a sport tournament automated scheduling system is proposed. The system will be able to automatically generate the tournament schedule as well as automatically calc...

  2. Planning and Scheduling for Environmental Sensor Networks

    Science.gov (United States)

    Frank, J. D.

    2005-12-01

    Environmental Sensor Networks are a new way of monitoring the environment. They comprise autonomous sensor nodes in the environment that record real-time data, which are retrieved, analyzed, integrated with other data sets (e.g. satellite images, GIS, process models), and ultimately lead to scientific discoveries. Sensor networks must operate within time and resource constraints. Sensors have limited onboard memory, energy, computational power, communications windows and communications bandwidth. The value of data will depend on when, where and how it was collected, how detailed the data is, how long it takes to integrate the data, and how important the data was to the original scientific question. Planning and scheduling of sensor networks is necessary for effective, safe operation in the face of these constraints. For example, power bus limitations may preclude sensors from simultaneously collecting data and communicating without damaging the sensor; planners and schedulers can ensure these operations are ordered so that they do not happen simultaneously. Planning and scheduling can also ensure the best use of the sensor network to maximize the value of collected science data. For example, if data is best recorded using a particular camera angle but it is costly in time and energy to achieve this, planners and schedulers can search for times when time and energy are available to achieve the optimal camera angle. Planning and scheduling can handle uncertainty in the problem specification; planners can be re-run when new information is made available, or can generate plans that include contingencies. For example, if bad weather may prevent the collection of data, a contingent plan can check lighting conditions and turn off data collection to save resources if lighting is not ideal. Both mobile and immobile sensors can benefit from planning and scheduling. For example, data collection on otherwise passive sensors can be halted to preserve limited power and memory.
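
    A hedged sketch of the ordering constraint described above: data collection and communication never share a time slot (the power-bus limit), and the costly optimal camera angle is only scheduled when the energy forecast allows it. Slot counts, energy numbers and action names are invented for illustration.

        # Toy slot-based scheduler: collection and communication are kept in
        # separate slots by construction, satisfying the power-bus constraint.
        SLOTS = range(10)
        energy = {t: 9 if t % 3 == 0 else 5 for t in SLOTS}   # toy energy forecast

        schedule = {}
        for t in SLOTS:
            if t % 2 == 0:                       # even slots: collect data
                schedule[t] = ("collect_optimal_angle" if energy[t] >= 8
                               else "collect_nominal")
            else:                                # odd slots: communicate only
                schedule[t] = "communicate"

        for t in SLOTS:
            print(t, schedule[t])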

  3. Data co-processing for extreme scale analysis level II ASC milestone (4745).

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, David; Moreland, Kenneth D.; Oldfield, Ron A.; Fabian, Nathan D.

    2013-03-01

    Exascale supercomputing will embody many revolutionary changes in the hardware and software of high-performance computing. A particularly pressing issue is gaining insight into the science behind the exascale computations. Power and I/O speed constraints will fundamentally change current visualization and analysis workflows. A traditional post-processing workflow involves storing simulation results to disk and later retrieving them for visualization and data analysis. However, at exascale, scientists and analysts will need a range of options for moving data to persistent storage, as the current offline or post-processing pipelines will not be able to capture the data necessary for data analysis of these extreme scale simulations. This Milestone explores two alternate workflows, characterized as in situ and in transit, and compares them. We find each to have its own merits and faults, and we provide information to help pick the best option for a particular use.

  4. The acid digestion process for radioactive waste: The radioactive waste management series. Volume II

    International Nuclear Information System (INIS)

    Cecille, L.; Simon, R.

    1983-01-01

    This volume focuses on the acid digestion process for the treatment of alpha-contaminated combustible solid waste by presenting detailed performance figures for the principal sub-assemblies of the Alona pilot plant, Belgium. Experience gained from the operation of the US RADTU plant, the only other acid digestion pilot plant, is also summarized, and the performance of the two plants compared. In addition, the research and development programmes carried out or supported by the Commission of the European Communities are reviewed, and details of an alternative to acid digestion for waste decontamination are described. Topics considered include a review of the treatment of actinide-bearing radioactive wastes; alpha waste arisings in fuel fabrication; the Alona demonstration facility for the acid digestion process at Eurochemic Mol (Belgium); the treatment of alpha waste at Eurochemic by acid digestion, feed pretreatment and plutonium recovery; US experience with acid digestion of combustible transuranic waste; and the European Communities' R and D actions on alpha waste.

  5. Implementation of the DYMAC system at the new Los Alamos Plutonium Processing Facility. Phase II report

    Energy Technology Data Exchange (ETDEWEB)

    Malanify, J.J.; Amsden, D.C.

    1982-08-01

    The DYnamic Materials ACcountability system (DYMAC) performs accountability functions at the new Los Alamos Plutonium Processing Facility, where it began operation when the facility opened in January 1978. A demonstration program, DYMAC was designed to collect and assess inventory information for safeguards purposes; it has accomplished 75% of its design goals. DYMAC collects information about the physical inventory through nondestructive assay instrumentation and video terminals deployed throughout the facility. The information resides in a minicomputer, where it can be immediately sorted and displayed on the video terminals or produced in printed form. Although the capability now exists to assess the collected data, this portion of the program is not yet implemented. DYMAC in its present form is an excellent tool for process and quality control. The facility operator relies on it exclusively for keeping track of the inventory and for complying with accountability requirements of the US Department of Energy.

  6. Collision processes of hydrocarbon species in hydrogen plasmas. II The ethane and propane families

    CERN Document Server

    Janev, R K

    2002-01-01

    Cross sections and rate coefficients are provided for collision processes of electrons and protons with CxHy and CxHy+ (x = 2, 3; 1 <= y <= 2x + 2) hydrocarbon species over a wide range of collision energies and plasma (gas) temperatures. The considered processes include electron-impact ionization and dissociation of CxHy; dissociative excitation, ionization and recombination of CxHy+ with electrons; and charge transfer and atom exchange in the proton channels, which are considered separately. Information is also provided on the energies of each individual reaction channel. The cross sections and rate coefficients are presented in compact analytic forms.
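
    As an illustration of how such compact analytic forms are typically used, the sketch below evaluates a rate coefficient from a polynomial fit of ln k in ln T, a representation common in data compilations of this kind; the coefficients are invented for demonstration and are not the published CxHy fit parameters.

        import math

        # Hypothetical fit coefficients for ln k = sum_n b_n * (ln T)^n.
        b = [-18.0, 0.8, -0.05, 0.002]

        def rate_coefficient(T_eV):
            """Return an illustrative k(T) in cm^3/s from the polynomial fit."""
            lnT = math.log(T_eV)
            return math.exp(sum(bn * lnT**n for n, bn in enumerate(b)))

        for T in (1.0, 10.0, 100.0):
            print(f"T = {T:6.1f} eV  ->  k = {rate_coefficient(T):.3e} cm^3/s")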

  7. Data acquisition and processing in the ATLAS tile calorimeter phase-II upgrade demonstrator

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00306349; The ATLAS collaboration

    2017-01-01

    The LHC has planned a series of upgrades culminating in the High Luminosity LHC (HL-LHC), which will have an average luminosity 5-7 times larger than the nominal Run 2 value. The ATLAS Tile Calorimeter (TileCal) will undergo an upgrade to accommodate the HL-LHC parameters, and its readout electronics will be redesigned, introducing a new readout strategy. A Demonstrator program has been developed to evaluate the proposed readout architecture and prototypes of all the components. In the Demonstrator, the detector data received in the Tile PreProcessors (PPr) are stored in pipeline buffers, and upon the reception of an external trigger signal the data events are processed, packed and read out in parallel through the legacy ROD system, the new Front-End Link eXchange (FELIX) system and an Ethernet connection for monitoring purposes. This contribution describes in detail the data processing and the hardware, firmware and software components of the TileCal Demonstrator readout system.

  8. Evaluation of a computer aided X-ray fluorographic system: Part II - image processing

    International Nuclear Information System (INIS)

    Burch, S.F.; Cocking, S.J.

    1981-12-01

    The TV imagery from a computer aided X-ray fluorographic system has been digitally processed with an I2S Model 70E image processor, controlled by a PDP-11/60 minicomputer. The image processor allowed valuable processing for the detection of defects in cast components to be carried out at television frame rates. Summation of TV frames was used to reduce noise, and hence improve the thickness sensitivity of the system. A displaced differencing technique and interactive contrast enhancement were then used to improve the reliability of inspection by removing spurious blemishes and interference lines, while simultaneously enhancing the visibility of real defects. The times required for these operations are given, and the benefits provided for X-ray fluorography are illustrated by the results from inspection of aero engine castings. (author)

  10. NRC comprehensive records disposition schedule

    International Nuclear Information System (INIS)

    1992-03-01

    Title 44, United States Code, ''Public Printing and Documents,'' the regulations cited in the General Services Administration's (GSA) ''Federal Information Resources Management Regulations'' (FIRMR), Part 201-9, ''Creation, Maintenance, and Use of Records,'' and the regulations issued by the National Archives and Records Administration (NARA) in 36 CFR Chapter XII, Subchapter B, ''Records Management,'' require each agency to prepare and issue a comprehensive records disposition schedule that contains the NARA-approved records disposition schedules for records unique to the agency and the NARA General Records Schedules for records common to several or all agencies. The approved records disposition schedules specify the appropriate duration of retention and the final disposition for records created or maintained by the NRC. NUREG-0910, Rev. 2, contains the NRC's comprehensive records disposition schedule and the original authorized citation numbers issued by NARA. Rev. 2 totally reorganizes the records schedules from a functional arrangement to an arrangement by host office. A subject index and a conversion table have also been developed for the NRC schedules to allow staff to identify the new schedule numbers easily and to locate applicable schedules.

  11. Effect of the mechanical processing on the mechanical properties of MA956 alloy. II. Mechanical characterization

    International Nuclear Information System (INIS)

    Chao, J.; Gonzalez-Doncel, G.

    1998-01-01

    The mechanical properties at room and low temperature of MA 956 alloy at several stages of its processing route are evaluated. The influence of crystallographic orientation on plastic deformation and brittle fracture, strongly anisotropic phenomena, is also considered. It is concluded that even though MA 956 alloy was designed for high-temperature applications, it could also be used at cryogenic temperatures. (Author) 8 refs.

  12. PWR steam generator chemical cleaning. Phase I: solvent and process development. Volume II

    International Nuclear Information System (INIS)

    Larrick, A.P.; Paasch, R.A.; Hall, T.M.; Schneidmiller, D.

    1979-01-01

    A program to demonstrate chemical cleaning methods for removing magnetite corrosion products from the annuli between steam generator tubes and the tube support plates in vertical U-tube steam generators is described. These corrosion products have caused steam generator tube ''denting'' and in some cases have caused tube failures and support plate cracking in several PWR generating plants. Laboratory studies were performed to develop a chemical cleaning solvent and application process for demonstration cleaning of the Indian Point Unit 2 steam generators. The chemical cleaning solvent and application process were successfully pilot-tested by cleaning the secondary side of one of the Indian Point Unit 1 steam generators. Although the Indian Point Unit 1 steam generators do not have a tube denting problem, the pilot test provided for testing of the solvent and process using much of the same equipment and facilities that would be used for the Indian Point Unit 2 demonstration cleaning. The chemical solvent selected for the pilot test was an inhibited 3% citric acid-3% ascorbic acid solution. The application process, injection into the steam generator through the boiler blowdown system and agitation by nitrogen sparging, was tested in a nuclear environment and with corrosion products formed during years of steam generator operation at power. The test demonstrated that the magnetite corrosion products in simulated tube-to-tube support plate annuli can be removed by chemical cleaning; that corrosion resulting from the cleaning is not excessive; and that steam generator cleaning can be accomplished with acceptable levels of radiation exposure to personnel

  13. Conversion of paper sludge to ethanol, II: process design and economic analysis.

    Science.gov (United States)

    Fan, Zhiliang; Lynd, Lee R

    2007-01-01

    Process design and economics are considered for conversion of paper sludge to ethanol. A particular site, a bleached kraft mill operated in Gorham, NH by Fraser Papers (15 tons of dry sludge processed per day), is considered. In addition, profitability is examined for a larger plant (50 dry tons per day), and sensitivity analysis is carried out with respect to capacity, tipping fee, and ethanol price. Conversion based on simultaneous saccharification and fermentation with intermittent feeding is examined, with ethanol recovery provided by distillation and molecular sieve adsorption. It was found that the Fraser plant achieves positive cash flow with or without xylose conversion and mineral recovery. Sensitivity analysis indicates the economics are very sensitive to ethanol selling price and scale, significantly but less sensitive to the tipping fee, and rather insensitive to the prices of cellulase and power. Internal rates of return exceeding 15% are projected for larger plants at most combinations of scale, tipping fee, and ethanol price. Our analysis lends support to the proposition that paper sludge is a leading point-of-entry and proving ground for emergent industrial processes featuring enzymatic hydrolysis of cellulosic biomass.
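
    To make the kind of sensitivity analysis described above concrete, here is a toy cash-flow and internal-rate-of-return sketch; all numbers (capacity, yield, prices, tipping fee, capital cost, plant life) are invented for illustration and are not the paper's figures.

        # Toy IRR sensitivity to ethanol price; every parameter is fabricated.
        def annual_cash_flow(tons_per_day, ethanol_price, tipping_fee,
                             yield_gal_per_ton=70.0, opex_per_ton=40.0):
            tons = tons_per_day * 350                  # assumed operating days/year
            revenue = tons * yield_gal_per_ton * ethanol_price + tons * tipping_fee
            return revenue - tons * opex_per_ton

        def irr(capital, cash_flow, years=15, lo=0.0, hi=1.0):
            """Bisection on NPV(r) = -capital + sum cf/(1+r)^t."""
            npv = lambda r: -capital + sum(cash_flow / (1 + r)**t
                                           for t in range(1, years + 1))
            for _ in range(60):
                mid = (lo + hi) / 2
                lo, hi = (mid, hi) if npv(mid) > 0 else (lo, mid)
            return lo

        for price in (1.2, 1.5, 1.8):                  # $/gal ethanol
            cf = annual_cash_flow(50, price, tipping_fee=20.0)
            print(f"ethanol ${price}/gal -> IRR {irr(8e6, cf):.1%}")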

  14. Modelling dewatering behaviour through an understanding of solids formation processes. Part II--solids separation considerations.

    Science.gov (United States)

    Dustan, A C; Cohen, B; Petrie, J G

    2005-05-30

    An understanding of the mechanisms which control solids formation can provide information on the characteristics of the solids which are formed. The nature of the solids formed in turn impacts on dewatering behaviour. The 'upstream' solids formation determines a set of suspension characteristics: solids concentration, particle size distribution, solution ionic strength and electrostatic surface potential. These characteristics together define the suspension's rheological properties. However, the complicated interdependence of these characteristics has precluded the prediction of suspension rheology from such a fundamental description. Recent shear yield stress models, applied in this study to compressive yield, significantly reduce the empiricism required for the description of compressive rheology. Suspension compressibility and permeability uniquely define the dewatering behaviour, described in terms of settling, filtration and mechanical expression. These modes of dewatering may be described in terms of the same fundamental suspension mechanics model. In this way, it is possible to link dynamically the processes of solids formation and dewatering of the resultant suspension. This, ultimately, opens the door to improved operability of these processes. In Part I of this paper we introduced an integrated system model for solids formation and dewatering, demonstrated for the upstream processes using experimental data. In the present paper, models of colloidal interactions and dewatering are presented and compared to experimental results from batch filtration tests. A novel approach to predicting suspension compressibility and permeability using a single test configuration is presented and tested.

  15. On the Processing of Martensitic Steels in Continuous Galvanizing Lines: Part II

    Science.gov (United States)

    Song, Taejin; Kwak, Jaihyun; de Cooman, B. C.

    2012-01-01

    The conventional continuous hot-dip galvanizing (GI) and galvannealing (GA) processes can be applied to untransformed austenite to produce Zn and Zn-alloy coated low-carbon ultra-high-strength martensitic steel provided specific alloying additions are made. The most suitable austenite decomposition behavior results from the combined addition of boron, Cr, and Mo, which results in a pronounced transformation bay during isothermal transformation. The occurrence of this transformation bay implies a considerable retardation of the austenite decomposition in the temperature range below the bay, which is close to the stages in the continuous galvanizing line (CGL) thermal cycle related to the GI and GA processes. After the GI and GA processes, a small amount of granular bainite, which consists of bainitic ferrite and discrete islands of martensite/austenite (M/A) constituents embedded in martensite matrix, is present in the microstructure. The ultimate tensile strength (UTS) of the steel after the GI and GA cycle was over 1300 MPa, and the stress-strain curve was continuous without any yielding phenomena.

  16. Pretreatment of furfural industrial wastewater by Fenton, electro-Fenton and Fe(II)-activated peroxydisulfate processes: a comparative study.

    Science.gov (United States)

    Yang, C W; Wang, D; Tang, Q

    2014-01-01

    The Fenton, electro-Fenton and Fe(II)-activated peroxydisulfate (PDS) processes were applied to the treatment of actual furfural industrial wastewater. Through a comparative study of the three processes, a suitable pretreatment technology for actual furfural wastewater was identified, and its mechanism and kinetics are discussed. The experimental results show that the Fenton process performs well and is stable without adjusting the pH of the furfural wastewater. Under the optimal conditions of 40 mmol/L initial H₂O₂ and 10 mmol/L initial Fe²⁺, the chemical oxygen demand (COD) removal rate reached 81.2% after 90 min of reaction at 80 °C. The PDS process also performed well: the COD removal rate reached 80.3% with 4.2 mmol/L initial Na₂S₂O₈, 0.1 mol/L initial Fe²⁺, a temperature of 70 °C, and a pH of 2.0. The electro-Fenton process could not cope with the high-temperature furfural industrial wastewater; only 10.2% of the COD was degraded at 80 °C under its optimal conditions (2.25 mA/cm² current density, 4 mg/L Na₂SO₄, 0.3 m³/h aeration rate). For all three processes, the kinetics of furfural wastewater pretreatment follow the pseudo-first-order law. The degradation pathways are also investigated: furfural and furan formic acid in the wastewater are preferentially degraded by the Fenton process, which can break furfural down into low-toxicity or nontoxic compounds, making the wastewater harmless and even reusable.
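
    The pseudo-first-order law cited above implies a simple consistency check: from the reported 81.2% COD removal in 90 min, an apparent rate constant follows directly from COD(t) = COD0 * exp(-kt). A small sketch (the fit form is the standard pseudo-first-order law; the interpolated removals are illustrative):

        import math

        # k from the reported Fenton result: 81.2% COD removal in 90 min.
        removal, t_min = 0.812, 90.0
        k = -math.log(1.0 - removal) / t_min
        print(f"apparent rate constant k = {k:.4f} 1/min")

        # Predicted removal at other times under the same law (illustrative only):
        for t in (30, 60, 90):
            print(f"t = {t:3d} min -> removal = {1 - math.exp(-k * t):.1%}")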

  17. Multi-processor network implementations in Multibus II and VME

    International Nuclear Information System (INIS)

    Briegel, C.

    1992-01-01

    ACNET (Fermilab Accelerator Controls Network), a proprietary network protocol, is implemented in multi-processor configurations for both Multibus II and VME. The implementations are contrasted in terms of bus protocol and software design goals. The Multibus II implementation provides for multiple processors running a duplicate set of tasks on each processor; for a network-connected task, messages are distributed by a network round-robin scheduler, and messages can be stopped, continued, or re-routed for each task by user-callable commands. The VME implementation provides for multiple processors running one task across all processors; the process can either be fixed to a particular processor or dynamically allocated to an available processor, depending on the scheduling algorithm of the multi-processing operating system. (author)
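
    A minimal sketch of the round-robin distribution described above, with duplicate task inboxes and incoming messages handed out in rotation; the class and names are illustrative, not the actual ACNET implementation.

        from collections import deque

        # One inbox per duplicate task instance; dispatch rotates through them.
        class RoundRobinScheduler:
            def __init__(self, task_queues):
                self.queues = deque(task_queues)

            def dispatch(self, message):
                q = self.queues[0]
                self.queues.rotate(-1)       # next message goes to the next copy
                q.append(message)

        inboxes = [deque() for _ in range(3)]    # three processors, duplicate tasks
        sched = RoundRobinScheduler(inboxes)
        for i in range(7):
            sched.dispatch(f"msg-{i}")
        print([list(q) for q in inboxes])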

  18. Analysis of the permitting processes associated with exploration of Federal OCS leases. Final report. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    1980-11-01

    Under contract to the Office of Leasing Policy Development (LPDO), Jack Faucett Associates is currently undertaking the description and analysis of the Outer Continental Shelf (OCS) regulatory process to determine the nature of time delays that affect OCS production of oil and gas. This report presents the results of the first phase of research under this contract, the description and analysis of regulatory activity associated with exploration activities on the Federal OCS. Volume 1 contains the following three sections: (1) study results; (2) Federal regulatory activities during exploration of Federal OCS leases, which involved the US Geological Survey, Environmental Protection Agency, US Coast Guard, Corps of Engineers, and National Oceanic and Atmospheric Administration; and (3) state regulatory activities during exploration of Federal OCS leases in Alaska, California, Louisiana, Massachusetts, New Jersey, North Carolina and Texas. Volume II contains appendices on the US Geological Survey, Environmental Protection Agency, Coast Guard, Corps of Engineers, the Coastal Zone Management Act, and Alaska. The major causes of delay in the regulatory process governing exploration fall into four broad categories: (1) the long and tedious process associated with the Environmental Protection Agency's implementation of the National Pollutant Discharge Elimination System permit; (2) the lack of mandated time periods for the completion of individual activities in the permitting process; (3) the lack of overall coordination of OCS exploratory regulation; and (4) the inexperience of states, the Federal government and industry with the appropriate level of regulation for first-time lease sale areas.

  19. Self-similarity of hard cumulative processes in fixed target experiment for BES-II at STAR

    International Nuclear Information System (INIS)

    Tokarev, M.V.; Aparin, A.A.; Zborovsky, I.

    2014-01-01

    The search for signatures of a phase transition in Au + Au collisions is at the heart of the heavy-ion program at RHIC. Systematic study of particle production over a wide range of collision energies has revealed new phenomena such as the nuclear suppression effect expressed by the nuclear modification factor, the constituent quark number scaling of elliptic flow, and the 'ridge effect' in two-particle correlations. To determine the phase boundaries and the location of the critical point of nuclear matter, the Beam Energy Scan (BES-I) program at RHIC was proposed and performed by the STAR and PHENIX Collaborations. The results obtained have shown that the program should be continued (BES-II). In this paper a proposal to use hard cumulative processes in the BES Phase-II program is outlined. Selection of cumulative events is expected to enrich the data sample with a new type of collisions characterized by higher energy density and more compressed matter. This would allow finding clearer signatures of the phase transition, locating the critical point, and studying extreme conditions in heavy-ion collisions.

  20. System for verifiable CT radiation dose optimization based on image quality. part II. process control system.

    Science.gov (United States)

    Larson, David B; Malarik, Remo J; Hall, Seth M; Podberesky, Daniel J

    2013-10-01

    To evaluate the effect of an automated computed tomography (CT) radiation dose optimization and process control system on the consistency of estimated image noise and size-specific dose estimates (SSDEs) of radiation in CT examinations of the chest, abdomen, and pelvis. This quality improvement project was determined not to constitute human subject research. An automated system was developed to analyze each examination immediately after completion, and to report individual axial-image-level and study-level summary data for patient size, image noise, and SSDE. The system acquired data for 4 months beginning October 1, 2011. Protocol changes were made by using parameters recommended by the prediction application, and 3 months of additional data were acquired. Preimplementation and postimplementation mean image noise and SSDE were compared by using unpaired t tests and F tests. Common-cause variation was differentiated from special-cause variation by using a statistical process control individuals chart. A total of 817 CT examinations, 490 acquired before and 327 acquired after the initial protocol changes, were included in the study. Mean patient age and water-equivalent diameter were 12.0 years and 23.0 cm, respectively. The difference between actual and target noise increased from -1.4 to 0.3 HU, and the process control chart identified several special causes of variation. Implementation of an automated CT radiation dose optimization system led to a verifiable simultaneous decrease in image noise variation and SSDE. The automated nature of the system provides the opportunity for consistent CT radiation dose optimization on a broad scale. © RSNA, 2013.
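
    The individuals ("X-mR") control chart mentioned above has a simple core: flag points outside mean +/- 2.66 times the mean moving range (2.66 is the standard constant for individuals charts). A sketch with fabricated noise-difference data, not the study's measurements:

        import statistics

        # Fabricated actual-minus-target noise differences, in HU.
        noise_diff = [0.4, -0.2, 0.1, 0.8, -0.5, 0.3, 4.5, 0.0, -0.1, 0.2]

        mean = statistics.fmean(noise_diff)
        moving_ranges = [abs(b - a) for a, b in zip(noise_diff, noise_diff[1:])]
        mr_bar = statistics.fmean(moving_ranges)
        ucl, lcl = mean + 2.66 * mr_bar, mean - 2.66 * mr_bar

        for i, v in enumerate(noise_diff):
            flag = "special cause" if not lcl <= v <= ucl else ""
            print(f"exam {i:2d}: {v:+.1f} HU {flag}")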

  1. Beam-beam dynamics during the injection process at the PEP-II B-Factory

    International Nuclear Information System (INIS)

    Chin, Yong Ho.

    1991-10-01

    This paper is concerned with beam-beam effects during the injection process at the proposed asymmetric SLAC/LBL/LLNL B-Factory based on PEP (PEP-II). For symmetric colliders, the primary source of the beam-beam effect is the head-on collision at the interaction point (IP), and this effect can be mitigated by separating the beams during the injection process. For an asymmetric collider, which intrinsically consists of two separate rings, the bunches not only collide at the IP but experience a long-range beam-beam force on the way into and out of the IP region. These collisions are called ''parasitic crossings'' (PCs). The parasitic crossings emerge as a potential source of far stronger beam-beam impact during the injection process for the following reason. In the proposed injection scheme of the APIARY-6.3d design, the bunches are injected horizontally into the two rings with a large horizontal offset of 8σ_x^(sptm), where σ_x^(sptm) is the nominal horizontal storage-ring beam size at the end of the septum magnet. The injected beam then travels around the ring oscillating horizontally. For the sake of discussion, let us assume that the beam in the other ring has already been fully stored. When the injected beam arrives at the first PC, where the two nominal orbits are separated horizontally by about 7.6 times the nominal horizontal beam size of the low energy ring, it may pass through the other beam far more closely than at the nominal separation distance, or it may even strike the other beam head-on.

  2. Innovation through developing consumers communities. Part II: Digitalizing the innovation processes

    Science.gov (United States)

    Avasilcai, S.; Galateanu (Avram), E.

    2015-11-01

    The current research recognizes innovation as the main driver of organizational growth and profitability. Companies seek new ways to engage consumers and customers in value co-creation through product design, development and distribution processes, with the main concern being new and creative ways of customizing products based on consumers' requirements and needs. The need for innovative virtual instruments arose as the demand from social communities for personalized products or services increased. Companies should therefore develop their own innovation platforms, where consumers can participate with ideas, concepts or other relevant contributions and interact with designers or engineers during product development. This paper aims to present the most important features of platform development within the BMW Group, both as a concept and as an innovative instrument. From this point of view it is important to draw on the company's past experience with co-creation projects. The dual character of consumers as co-creators and co-evaluators is highlighted, based on their involvement in the proposed and developed projects and in the platform structure. The diversity of the company's Research & Development and innovation concerns has a significant impact on how the platform functions; from this point of view the platform structure, the main proposed themes and the evaluation process are assessed. The main outcome is to highlight the significance of platform development as an innovative tool for strengthening consumer communities. Based on the analysis of the "BMW Co-Creation Lab", the main consumer concerns in terms of safety, comfort and appearance of the products are revealed, and it is important to understand the evaluation process for gathered ideas and the intellectual property policy. The importance of platform development and implementation will be highlighted by the company.

  3. Schedules of Controlled Substances: Temporary Placement of 4-Fluoroisobutyryl Fentanyl into Schedule I. Temporary scheduling order.

    Science.gov (United States)

    2017-05-03

    The Administrator of the Drug Enforcement Administration is issuing this temporary scheduling order to schedule the synthetic opioid, N-(4-fluorophenyl)-N-(1-phenethylpiperidin-4-yl)isobutyramide (4-fluoroisobutyryl fentanyl or para-fluoroisobutyryl fentanyl), and its isomers, esters, ethers, salts and salts of isomers, esters, and ethers, into schedule I pursuant to the temporary scheduling provisions of the Controlled Substances Act. This action is based on a finding by the Administrator that the placement of 4-fluoroisobutyryl fentanyl into schedule I of the Controlled Substances Act is necessary to avoid an imminent hazard to the public safety. As a result of this order, the regulatory controls and administrative, civil, and criminal sanctions applicable to schedule I controlled substances will be imposed on persons who handle (manufacture, distribute, reverse distribute, import, export, engage in research, conduct instructional activities or chemical analysis, or possess), or propose to handle, 4-fluoroisobutyryl fentanyl.

  4. Experimental investigation and numerical modeling of carbonation process in reinforced concrete structures Part II. Practical applications

    International Nuclear Information System (INIS)

    Saetta, Anna V.; Vitaliani, Renato V.

    2005-01-01

    The mathematical-numerical method developed by the authors to predict the corrosion initiation time of reinforced concrete structures subject to the carbonation process, recalled in Part I of this work, is here applied to some real cases. The final aim is to develop and test a practical method for determining the durability characteristics of existing buildings liable to carbonation, as well as for estimating the corrosion initiation time of a building at the design stage. Two industrial sheds of different ages and located in different areas have been analyzed by performing both experimental tests and numerical analyses. Finally, a case of carbonation-induced failure in a prestressed r.c. beam is presented.

  5. Journal of Environmental Radioactivity special issue: II International Conference on Radioecological Concentration Processes. (50 years later).

    Science.gov (United States)

    Garcia-Tenorio, Rafael; Holm, Elis

    2018-06-01

    An international conference on Radioecological Concentration Processes was held in Seville, Spain, 6-9 November 2016 at the Centro Nacional de Aceleradores. It was attended by 160 participants from 35 different countries and was the second conference on this topic since the first in 1966, 50 years earlier. The conference covered radiologically important radionuclides in terrestrial, marine and freshwater environments and provided a clear picture of the status of radioecology as a consolidated discipline in the 21st century. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Structure of multiphoton quantum optics. II. Bipartite systems, physical processes, and heterodyne squeezed states

    Science.gov (United States)

    dell'Anno, Fabio; de Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    Extending the scheme developed for a single mode of the electromagnetic field in the preceding paper [F. Dell’Anno, S. De Siena, and F. Illuminati, Phys. Rev. A 69, 033812 (2004)], we introduce two-mode nonlinear canonical transformations depending on two heterodyne mixing angles. They are defined in terms of Hermitian nonlinear functions that realize heterodyne superpositions of conjugate quadratures of bipartite systems. The canonical transformations diagonalize a class of Hamiltonians describing nondegenerate and degenerate multiphoton processes. We determine the coherent states associated with the canonical transformations, which generalize the nondegenerate two-photon squeezed states. Such heterodyne multiphoton squeezed states are defined as the simultaneous eigenstates of the transformed, coupled annihilation operators. They are generated by nonlinear unitary evolutions acting on two-mode squeezed states. They are non-Gaussian, highly nonclassical, entangled states. For a quadratic nonlinearity the heterodyne multiphoton squeezed states define two-mode cubic phase states. The statistical properties of these states can be widely adjusted by tuning the heterodyne mixing angles, the phases of the nonlinear couplings, as well as the strength of the nonlinearity. For quadratic nonlinearity, we study the higher-order contributions to the susceptibility in nonlinear media and we suggest possible experimental realizations of multiphoton conversion processes generating the cubic-phase heterodyne squeezed states.

  8. Treatment of waste salt from the advanced spent fuel conditioning process (II) : optimum immobilization condition

    International Nuclear Information System (INIS)

    Kim, Jeong Guk; Lee, Jae Hee; Yoo, Jae Hyung; Kim, Joon Hyung

    2004-01-01

    Since zeolite is known to be stable at high temperature, it has been reported as a promising immobilization matrix for waste salt. The crystal structure of dehydrated zeolite A breaks down above 1060 K, resulting in the formation of an amorphous solid and re-crystallization to beta-cristobalite. This structural degradation depends on the presence of chlorides: when contacted with HCl, zeolite 4A is not stable even at 473 K. The optimum consolidation condition for LiCl salt waste from the oxide fuel reduction process based on the electrochemical method (Advanced spent fuel Conditioning Process, ACP) has been studied using zeolite A since 2001. The constituents of the waste salt are water-soluble, and alkali halides are known to be readily radiolyzed to yield interstitial halogens and metal colloids. For disposal in a geological repository, the waste salt must meet the acceptance criteria; for a waste form containing chloride salt, two of the more important criteria are leach resistance and waste form durability. In this work, we prepared samples with different mixing ratios of LiCl salt to zeolite A and then compared characteristics such as thermal stability, salt occlusion, free chloride content, leach resistance, and mixing effect.

  9. Visual information processing II; Proceedings of the Meeting, Orlando, FL, Apr. 14-16, 1993

    Science.gov (United States)

    Huck, Friedrich O. (Editor); Juday, Richard D. (Editor)

    1993-01-01

    Various papers on visual information processing are presented. Individual topics addressed include: aliasing as noise, satellite image processing using a Hamming neural network, an edge-detection method using visual perception, adaptive vector median filters, design of a reading test for low vision, image warping, spatial transformation architectures, an automatic image-enhancement method, redundancy reduction in image coding, lossless gray-scale image compression by predictive GDF, information efficiency in visual communication, optimizing JPEG quantization matrices for different applications, use of forward error correction to maintain image fidelity, and the effect of Peano scanning on image compression. Also discussed are: computer vision for autonomous robotics in space, an optical processor for zero-crossing edge detection, fractal-based image edge detection, simulation of the neon spreading effect by bandpass filtering, the wavelet transform (WT) on parallel SIMD architectures, nonseparable 2D wavelet image representation, adaptive image halftoning based on the WT, wavelet analysis of global warming, use of the WT for signal detection, perfect-reconstruction two-channel rational filter banks, N-wavelet coding for pattern classification, simulation of images of natural objects, and number-theoretic coding for iconic systems.

  10. Diverse task scheduling for individualized requirements in cloud manufacturing

    Science.gov (United States)

    Zhou, Longfei; Zhang, Lin; Zhao, Chun; Laili, Yuanjun; Xu, Lida

    2018-03-01

    Cloud manufacturing (CMfg) has emerged as a new manufacturing paradigm that provides ubiquitous, on-demand manufacturing services to customers through networks and CMfg platforms. In a CMfg system, task scheduling, as an important means of finding suitable services for specific manufacturing tasks, plays a key role in enhancing system performance. Customers' requirements in CMfg are highly individualized, which leads to diverse manufacturing tasks in terms of execution flows and users' preferences. We focus on diverse manufacturing tasks and aim to address their scheduling in CMfg. First, a mathematical model of task scheduling is built based on an analysis of the scheduling process in CMfg. To solve this scheduling problem, we propose a scheduling method aimed at diverse tasks, which enables each service demander to obtain the desired manufacturing services. Candidate service sets are generated according to subtask directed graphs, and an improved genetic algorithm is applied to search for optimal task scheduling solutions. The effectiveness of the proposed method is verified by a case study with individualized customer requirements. The results indicate that the proposed task scheduling method achieves better performance than common algorithms such as simulated annealing and pattern search.
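
    A toy sketch of the genetic-algorithm step described above: each gene selects one service from a subtask's candidate set, and fitness is a total cost to be minimized. Candidate sets, costs and GA parameters are all invented; the paper's improved algorithm adds further operators.

        import random

        random.seed(1)
        # Candidate service costs per subtask (three subtasks, invented values).
        candidates = [[3.0, 2.5, 4.0], [1.0, 1.5], [2.0, 2.2, 1.8, 2.6]]

        def fitness(ind):                # lower total cost is better
            return sum(candidates[i][g] for i, g in enumerate(ind))

        def random_ind():
            return [random.randrange(len(c)) for c in candidates]

        pop = [random_ind() for _ in range(20)]
        for _ in range(50):
            pop.sort(key=fitness)
            survivors = pop[:10]
            children = []
            for _ in range(10):
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, len(candidates))   # one-point crossover
                child = a[:cut] + b[cut:]
                if random.random() < 0.2:                    # mutation
                    i = random.randrange(len(candidates))
                    child[i] = random.randrange(len(candidates[i]))
                children.append(child)
            pop = survivors + children

        best = min(pop, key=fitness)
        print("best assignment:", best, "cost:", fitness(best))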

  11. PLACEMENT APPLICATIONS SCHEDULING LECTURE IN INTERNATIONAL PROGRAM UNIKOM BASED ANDROID

    Directory of Open Access Journals (Sweden)

    Andri Sahata Sitanggang

    2017-12-01

    Course scheduling is one of the factors that determine the life of a classroom, especially at a college. The scheduling process covers the available class times, the available rooms, the lectures to be scheduled, and the lecturers' teaching schedules. Such scheduling is expected to make it easier for students and lecturers to obtain lecture schedule information. With the emergence of Android applications embedded in mobile phones, the public can now access the internet quickly, so the researchers offer a technology-based solution by building an Android application, since this technology provides functions that simplify access to information for students and lecturers. The application was built using the prototype method and provides two access levels, user and admin: the user side consists of register, login and scheduling modules, while the admin side provides login, register and course schedule management modules for both administrative staff and lecturers. The application is integrated with the internet, making it a real-time application.

  12. Brucella abortus Inhibits Major Histocompatibility Complex Class II Expression and Antigen Processing through Interleukin-6 Secretion via Toll-Like Receptor 2▿

    Science.gov (United States)

    Barrionuevo, Paula; Cassataro, Juliana; Delpino, M. Victoria; Zwerdling, Astrid; Pasquevich, Karina A.; Samartino, Clara García; Wallach, Jorge C.; Fossati, Carlos A.; Giambartolomei, Guillermo H.

    2008-01-01

    The strategies that allow Brucella abortus to survive inside macrophages for prolonged periods and to avoid the immunological surveillance of major histocompatibility complex class II (MHC-II)-restricted gamma interferon (IFN-γ)-producing CD4+ T lymphocytes are poorly understood. We report here that infection of THP-1 cells with B. abortus inhibited expression of MHC-II molecules and antigen (Ag) processing. Heat-killed B. abortus (HKBA) also induced both these phenomena, indicating the independence of bacterial viability and involvement of a structural component of the bacterium. Accordingly, outer membrane protein 19 (Omp19), a prototypical B. abortus lipoprotein, inhibited both MHC-II expression and Ag processing to the same extent as HKBA. Moreover, a synthetic lipohexapeptide that mimics the structure of the protein lipid moiety also inhibited MHC-II expression, indicating that any Brucella lipoprotein could down-modulate MHC-II expression and Ag processing. Inhibition of MHC-II expression and Ag processing by either HKBA or lipidated Omp19 (L-Omp19) depended on Toll-like receptor 2 and was mediated by interleukin-6. HKBA or L-Omp19 also inhibited MHC-II expression and Ag processing of human monocytes. In addition, exposure to the synthetic lipohexapeptide inhibited Ag-specific T-cell proliferation and IFN-γ production of peripheral blood mononuclear cells from Brucella-infected patients. Together, these results indicate that there is a mechanism by which B. abortus may prevent recognition by T cells to evade host immunity and establish a chronic infection. PMID:17984211

  13. Counter-current extraction studies for the recovery of neptunium by the Purex process. Part II

    Energy Technology Data Exchange (ETDEWEB)

    Srinivasan, N.; Nadkarni, M. N.; Kumar, S. V.; Kartha, P. K.S.; Sonavane, R. R.; Ramaniah, M. V.; Patil, S. K.

    1974-07-01

    Counter-current extraction experiments were carried out under conditions relevant to the partitioning column (IBX) in the Purex process to determine the path of neptunium, present as Np(VI) in the organic phase, during the partitioning step. The results show that when ferrous sulphamate is used as the reducing agent, most of the neptunium remains with the uranium in the organic stream, while with hydrazine-stabilized uranous nitrate as the reducing agent, a major fraction of the neptunium follows the aqueous stream. Mixer-settler experiments were also carried out under conditions relevant to the uranium purification cycle (2D) to establish the conditions for forcing neptunium into the aqueous raffinate, or for partitioning it from uranium if both neptunium and uranium are co-extracted in this cycle; the results obtained are reported here. (auth)

  14. D-Zero run II data management and access

    International Nuclear Information System (INIS)

    Lueking, L.

    1997-03-01

    During the Run II data-taking period at Fermilab, scheduled to begin in 1999, D0 plans to accumulate at least 200 TB of raw and reconstructed data per year. Data access patterns observed during Run I have been examined in an attempt to establish an efficient data access environment. The needs and models for storing and processing the upcoming data are discussed.

  15. Energy-Efficient Scheduling Problem Using an Effective Hybrid Multi-Objective Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Lvjiang Yin

    2016-12-01

    Nowadays, manufacturing enterprises face the challenges of just-in-time (JIT) production and energy saving; the study of JIT production and energy consumption is therefore necessary and important in manufacturing sectors. Energy saving can be attained by operational methods and by turning idle machines off and on, which also increases the complexity of problem solving; thus, most research still focuses on small-scale, single-objective problems in a single-machine environment. In real applications, however, the scheduling problem is a multi-objective optimization problem. In this paper, a single-machine scheduling model with controllable processing and sequence-dependent setup times is developed to minimize the total earliness/tardiness (E/T), cost, and energy consumption simultaneously. An effective multi-objective evolutionary algorithm called the local multi-objective evolutionary algorithm (LMOEA) is presented to tackle this multi-objective scheduling problem. To accommodate the characteristics of the problem, a new solution representation is proposed, which can convert discrete combinatorial problems into continuous ones. Additionally, a multiple local search strategy with a self-adaptive mechanism is introduced into the proposed algorithm to enhance its exploitation ability. The performance of the proposed algorithm is evaluated on problem instances in comparison with other multi-objective meta-heuristics such as the Nondominated Sorting Genetic Algorithm II (NSGA-II), Strength Pareto Evolutionary Algorithm 2 (SPEA2), Multiobjective Particle Swarm Optimization (OMOPSO), and Multiobjective Evolutionary Algorithm Based on Decomposition (MOEA/D). Experimental results demonstrate that the proposed LMOEA outperforms its counterparts on this kind of scheduling problem.
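
    To make the E/T part of the objective concrete, here is a single-machine sketch that scores a job sequence by weighted earliness and tardiness and brute-forces the best order; processing times, due dates and weights are invented, and the paper's full model additionally prices energy and sequence-dependent setups.

        from itertools import permutations

        # Each job: (processing_time, due_date, w_early, w_tardy). Invented data.
        jobs = [
            (4, 10, 1.0, 2.0),
            (3,  6, 1.0, 3.0),
            (5, 18, 0.5, 1.5),
        ]

        def total_et(sequence):
            t, penalty = 0, 0.0
            for j in sequence:
                p, d, we, wt = jobs[j]
                t += p                    # completion time on the single machine
                penalty += we * max(0, d - t) + wt * max(0, t - d)
            return penalty

        best = min(permutations(range(len(jobs))), key=total_et)
        print("best sequence:", best, "E/T penalty:", total_et(best))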

  16. Practical principles in appointment scheduling

    NARCIS (Netherlands)

    Kuiper, A.; Mandjes, M.

    2015-01-01

    Appointment schedules aim at achieving a proper balance between the conflicting interests of the service provider and her clients: a primary objective of the service provider is to fully utilize her available time, whereas clients want to avoid excessive waiting times. Setting up schedules that

  17. Nontraditional work schedules for pharmacists.

    Science.gov (United States)

    Mahaney, Lynnae; Sanborn, Michael; Alexander, Emily

    2008-11-15

    Nontraditional work schedules for pharmacists at three institutions are described. The demand for pharmacists and health care in general continues to increase, yet significant material changes are occurring in the pharmacy work force. These changing demographics, coupled with historical vacancy rates and turnover trends for pharmacy staff, require an increased emphasis on workplace changes that can improve staff recruitment and retention. At William S. Middleton Memorial Veterans Affairs Hospital in Madison, Wisconsin, creative pharmacist work schedules and roles are now mainstays to the recruitment and retention of staff. The major challenge that such scheduling presents is the 8 hours needed to prepare a six-week schedule. Baylor Medical Center at Grapevine in Dallas, Texas, has a total of 45 pharmacy employees, and slightly less than half of the 24.5 full-time-equivalent staff work full-time, with most preferring to work one, two, or three days per week. As long as the coverage needs of the facility are met, Envision Telepharmacy in Alpine, Texas, allows almost any scheduling arrangement preferred by individual pharmacists or the pharmacist group covering the facility. Staffing involves a great variety of shift lengths and intervals, with shifts ranging from 2 to 10 hours. Pharmacy leaders must be increasingly aware of opportunities to provide staff with unique scheduling and operational enhancements that can provide for a better work-life balance. Compressed workweeks, job-sharing, and team scheduling were the most common types of alternative work schedules implemented at three different institutions.

  18. Modeling the Cray memory scheduler

    Energy Technology Data Exchange (ETDEWEB)

    Wickham, K.L.; Litteer, G.L.

    1992-04-01

    This report documents the results of a project to evaluate low cost modeling and simulation tools when applied to modeling the Cray memory scheduler. The specific tool used is described and the basics of the memory scheduler are covered. Results of simulations using the model are discussed and a favorable recommendation is made to make more use of this inexpensive technology.

  19. Flexible Work Schedules. ERIC Digest.

    Science.gov (United States)

    Kerka, Sandra

    Flexible work schedules are one response to changes in the composition of the work force, new life-styles, and changes in work attitudes. Types of alternative work schedules are part-time and temporary employment, job sharing, and flextime. Part-time workers are a diverse group--women, the very young, and older near-retirees. Although part-time…

  20. TECHNICAL COORDINATION SCHEDULE & INTEGRATION

    CERN Multimedia

    W. Zeuner

    Introduction The endgame of CMS installation in the underground cavern is in full swing, with several major milestones having been passed since the last CMS week. The Tracker was installed inside the Vactank just before the CERN end-of-year shutdown. Shortly after the reopening in 2008, the two remaining endcap disks, YE-2 and YE-1, were lowered, marking the completion of eight years of assembly in the surface building SX5. The remaining tasks, before the detector can be closed for the Cosmic Run At Four Tesla (CRAFT), are the installation of the thermal shields, the cabling of the negative endcap, the cabling of the tracker and the beam pipe installation. In addition to these installation tasks, a test closure of the positive endcap is planned just before the installation of the central beam pipe. The schedule is tight and complicated but the goal to close CMS by the end of May for a cosmic test with magnetic field remains feasible. Safety With all large components now being underground, the shortage...

  1. Scanning ARM Cloud Radars. Part II: Data Quality Control and Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kollias, Pavlos; Jo, Ieng; Borque, Paloma; Tatarevic, Aleksandra; Lamer, Katia; Bharadwaj, Nitin; Widener, Kevin B.; Johnson, Karen L.; Clothiaux, Eugene E.

    2014-03-01

    The Scanning ARM Cloud Radars (SACRs) are the primary instruments for documenting the four-dimensional structure and evolution of clouds within a 20-30 km radius of the ARM fixed and mobile sites. Here, the post-processing of the calibrated SACR measurements is discussed. First, a feature mask algorithm that objectively determines the presence of significant radar returns is described. The feature mask algorithm is based on the statistical properties of radar receiver noise. It accounts for atmospheric emission and is applicable even for SACR profiles with few or no signal-free range gates. Using the nearest-in-time atmospheric sounding, the SACR radar reflectivities are corrected for gaseous attenuation (water vapor and oxygen) using a line-by-line absorption model. Despite having a high pulse repetition frequency, the SACR has a narrow Nyquist velocity limit and thus Doppler velocity folding is commonly observed. An unfolding algorithm that makes use of a first guess for the true Doppler velocity using horizontal wind measurements from the nearest sounding is described. The retrieval of the horizontal wind profile from the HS-RHI (hemispherical sky range-height indicator) SACR scan observations and/or the nearest sounding is also described. The retrieved horizontal wind profile can be used to adaptively configure SACR scan strategies that depend on wind direction. Several remaining challenges are discussed, including the removal of insect and second-trip echoes. The described algorithms significantly enhance SACR data quality and constitute an important step towards the utilization of SACR measurements for cloud research.
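    The unfolding step admits a compact illustration. Below is a minimal Python sketch, assuming the usual aliasing relation v_true = v_meas + 2*n*v_nyq and a sounding-derived first guess; the function and variable names are illustrative and not taken from the SACR processing code.

        import numpy as np

        def unfold_velocity(v_meas, v_guess, v_nyq):
            """Unfold aliased Doppler velocities: choose the integer number
            of Nyquist intervals n that brings v_meas + 2*n*v_nyq closest
            to the first-guess velocity (e.g., projected sounding winds)."""
            n = np.round((v_guess - v_meas) / (2.0 * v_nyq))
            return v_meas + 2.0 * n * v_nyq

        # Usage: Nyquist limit 8 m/s; a gate measured at -6 m/s while the
        # sounding suggests about +10 m/s unfolds to +10 m/s.
        print(unfold_velocity(np.array([-6.0]), np.array([10.0]), 8.0))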

  2. Data acquisition and processing in the ATLAS Tile Calorimeter Phase-II Upgrade Demonstrator

    CERN Document Server

    Valero, Alberto; The ATLAS collaboration

    2016-01-01

    The LHC has planned a series of upgrades culminating in the High Luminosity LHC (HL-LHC), which will have an average luminosity 5-7 times larger than the nominal Run-2 value. The ATLAS Tile Calorimeter (TileCal) will undergo an upgrade to accommodate the HL-LHC parameters. The TileCal read-out electronics will be redesigned, introducing a new read-out strategy. The photomultiplier signals will be digitized and transferred to the TileCal PreProcessors (TilePPr) located off-detector for every bunch crossing, requiring a data bandwidth of 80 Tbps. The TilePPr will provide preprocessed information to the first level of trigger and, in parallel, will store the samples in pipeline memories. The data of the events selected by the trigger system will be transferred to the ATLAS global Data AcQuisition (DAQ) system for further processing. A demonstrator drawer has been built to evaluate the new proposed readout architecture and prototypes of all the components. In the demonstrator, the detector data received in the Til...

  3. Process analysis transit of municipal waste. Part II - Domestic provisions of law

    Directory of Open Access Journals (Sweden)

    Starkowski Dariusz

    2017-06-01

    In 2013, the Polish legal system governing municipal waste management was restructured in a revolutionary way. The new provisions of law analyzed in this article require particular attention, taking into account their place in the entire system of dealing with waste and their connections with the remaining elements of that system. At present, Polish regulations lay down the rules of conduct for all types of waste, differentiating the areas of responsibility by subject. These assumptions are determined by the provisions of law in force in the Republic of Poland. The system of legal provisions is currently quite complex, with the provisions of EU law constituting its base (discussed in the first article of this series). At the level of Polish law, the goals and tasks concerned with dealing with waste were set forth, which leads to a tightening of the system. All actions in this respect - from promoting the selective accumulation and collection of municipal waste, through keeping the established levels of recycling of packaging waste, to limiting the mass of biodegradable waste sent to storage - are only the beginning of the road to reducing environmental risks. Permanent monitoring of proper waste handling in the commune, the province and the entire country is therefore essential. The third part of the article will present the characterization, division, classification and identification of waste, together with the logistic process of municipal waste collection and transport.

  4. Solar desalination using humidification-dehumidification processes. Part II. An experimental investigation

    International Nuclear Information System (INIS)

    Nafey, A.S.; Fath, H.E.S.; El-Helaby, S.O.; Soliman, A.

    2004-01-01

    An experimental investigation of a humidification-dehumidification desalination (HDD) process using solar energy under the weather conditions of Suez City, Egypt, is presented. A test rig was designed and constructed to conduct this investigation under different environmental and operating conditions. The test rig consists of a solar water heater (concentrator solar collector type), a solar air heater (flat plate solar collector type), a humidifier tower and a dehumidifier exchanger. Different variables are examined, including the feed water flow rate, the air flow rate, the cooling water flow rate in the dehumidifier and the weather conditions. Comparisons between the experimental results and other published results are presented. The results of the mathematical model developed earlier by the same authors are found to be in good agreement with the experimental results. The tests show that the productivity of the system is strongly affected by the saline water temperature at the inlet to the humidifier, the dehumidifier cooling water flow rate, the air flow rate and the solar intensity. Variations in wind speed and ambient temperature were found to have a very small effect on system productivity. A general correlation is developed to predict the unit productivity under different operating conditions. The results of this correlation have a reasonable confidence level (maximum error ±6%).

  5. Comparing Binaural Pre-processing Strategies II: Speech Intelligibility of Bilateral Cochlear Implant Users.

    Science.gov (United States)

    Baumgärtel, Regina M; Hu, Hongmei; Krawczyk-Becker, Martin; Marquardt, Daniel; Herzke, Tobias; Coleman, Graham; Adiloğlu, Kamil; Bomke, Katrin; Plotz, Karsten; Gerkmann, Timo; Doclo, Simon; Kollmeier, Birger; Hohmann, Volker; Dietz, Mathias

    2015-12-30

    Several binaural audio signal enhancement algorithms were evaluated with respect to their potential to improve speech intelligibility in noise for users of bilateral cochlear implants (CIs). 50% speech reception thresholds (SRT50) were assessed using an adaptive procedure in three distinct, realistic noise scenarios. All scenarios were highly nonstationary, complex, and included a significant amount of reverberation. Other aspects, such as the perfectly frontal target position, were idealized laboratory settings, allowing the algorithms to perform better than in corresponding real-world conditions. Eight bilaterally implanted CI users, wearing devices from three manufacturers, participated in the study. In all noise conditions, a substantial improvement in SRT50 compared to the unprocessed signal was observed for most of the algorithms tested, with the largest improvements generally provided by binaural minimum variance distortionless response (MVDR) beamforming algorithms. The largest overall improvement in speech intelligibility was achieved by an adaptive binaural MVDR in a spatially separated, single competing talker noise scenario. A no-pre-processing condition and adaptive differential microphones without a binaural link served as the two baseline conditions. SRT50 improvements provided by the binaural MVDR beamformers surpassed the performance of the adaptive differential microphones in most cases. Speech intelligibility improvements predicted by instrumental measures were shown to account for some but not all aspects of the perceptually obtained SRT50 improvements measured in bilaterally implanted CI users. © The Author(s) 2015.

  6. Using Integer Programming for Airport Service Planning in Staff Scheduling

    Directory of Open Access Journals (Sweden)

    W.H. Ip

    2010-09-01

    Reliability and safety in flight are extremely important, and they depend on the adoption of a proper maintenance system. It is therefore essential for aircraft maintenance companies to perform manpower scheduling efficiently. One objective of this paper is to provide an Integer Programming approach to determine optimal solutions to aircraft maintenance planning and scheduling, so that the planning and scheduling processes become more efficient and effective. Another objective is to develop a set of computational schedules for maintenance manpower to cover all scheduled flights. A sequential methodology consisting of three stages is proposed: the initial maintenance demand schedule, the maintenance pairing, and the maintenance group(s) assignment. Since scheduling is split into different stages, different mathematical techniques have been adopted to cater to their respective problem characteristics. Results from the first and second stages are input into an integer programming model using Microsoft Excel Solver to find the optimal solution, and Microsoft Excel VBA is used to devise a scheduling system that reduces manual processing and provides a user-friendly interface. In all cases an optimal solution is obtained, and the computation time is reasonable and acceptable. A comparison of peak time and non-peak time is also discussed.
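    The record does not reproduce the paper's Excel Solver model; the sketch below shows, with assumed illustrative data, how a minimal shift-covering integer program of this flavor can be written in Python with the open-source PuLP library instead.

        from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary

        # Illustrative data: which maintenance groups can cover which flights.
        flights = ["F1", "F2", "F3"]
        groups = ["G1", "G2"]
        can_cover = {("G1", "F1"): 1, ("G1", "F2"): 1, ("G1", "F3"): 0,
                     ("G2", "F1"): 0, ("G2", "F2"): 1, ("G2", "F3"): 1}
        cost = {"G1": 3, "G2": 2}            # cost of activating a group

        prob = LpProblem("maintenance_cover", LpMinimize)
        use = {g: LpVariable("use_" + g, cat=LpBinary) for g in groups}

        prob += lpSum(cost[g] * use[g] for g in groups)   # minimize staffing cost
        for f in flights:                    # every scheduled flight must be covered
            prob += lpSum(can_cover[(g, f)] * use[g] for g in groups) >= 1

        prob.solve()
        print({g: int(use[g].value()) for g in groups})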

  7. SR 2603. Support of the BMU in the process of decommissioning of the explorative mine Asse II. Final report for the 31.12.2008

    International Nuclear Information System (INIS)

    Kallenbach-Herbert, Beate; Ustohalova, Veronika

    2009-01-01

    The final report on the BMU support in the process of decommissioning of Asse II covers the following topics: an overview of the boards involved; current developments in the process (inventory, contaminated brines, shut-down concept and structural safety, exchange of the operating company, establishment of an information center, and financial compensation for the region); the work of the Asse II support group and of the support group on the comparison of options; the survey of the support group; and an evaluation of the participation process as of 31 December 2008.

  8. Multi-objective group scheduling with learning effect in the cellular manufacturing system

    Directory of Open Access Journals (Sweden)

    Mohammad Taghi Taghavi-fard

    2011-01-01

    Group scheduling problems in cellular manufacturing systems consist of two major steps: sequencing the parts within each part-family, and sequencing the part-families entering the cell to be processed. This paper presents a new method for group scheduling problems in flow shop systems that minimizes both the makespan (Cmax) and the total tardiness. A position-based learning model in the cellular manufacturing system is utilized, where the processing time of each part-family depends on the entrance sequence of its parts. The group scheduling problem is modeled with these two objectives under a position-based learning effect and the assumption of sequence-dependent family setup times. Since the proposed problem is NP-hard, two meta-heuristic algorithms based on genetic algorithms are presented: the non-dominated sorting genetic algorithm (NSGA-II) and the non-dominated rank genetic algorithm (NRGA). The algorithms are tested using randomly generated problems. The results comprise a set of Pareto solutions, and three different evaluation criteria are used to compare them. The results indicate that the proposed algorithms solve the problem quite efficiently within a short computational time.
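    Both NSGA-II and NRGA rank candidate schedules by Pareto dominance. A minimal Python sketch of non-dominated sorting for two minimization objectives is given below (illustrative data; not the authors' implementation).

        def dominates(a, b):
            """a dominates b if it is no worse in every objective and
            strictly better in at least one (minimization)."""
            return (all(x <= y for x, y in zip(a, b))
                    and any(x < y for x, y in zip(a, b)))

        def non_dominated_sort(points):
            """Split objective vectors into successive Pareto fronts."""
            remaining = list(range(len(points)))
            fronts = []
            while remaining:
                front = [i for i in remaining
                         if not any(dominates(points[j], points[i])
                                    for j in remaining if j != i)]
                fronts.append(front)
                remaining = [i for i in remaining if i not in front]
            return fronts

        # Usage: (makespan, total tardiness) of five candidate schedules.
        objs = [(10, 7), (9, 9), (11, 5), (12, 8), (9, 6)]
        print(non_dominated_sort(objs))      # first list is the Pareto front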

  9. Randomized phase II study of a bendamustine monotherapy schedule for relapsed or refractory low-grade B-cell non-Hodgkin lymphoma or mantle cell lymphoma (RABBIT-14).

    Science.gov (United States)

    Itoh, Kuniaki; Igarashi, Tadahiko; Irisawa, Hiroyuki; Aotsuka, Nobuyuki; Masuda, Shinichi; Utsu, Yoshikazu; Tsujimura, Hideki; Tsukasaki, Kunihiro; Wakita, Hisashi

    2017-10-30

    The aim of this randomized phase II study was to reduce the treatment delays and discontinuations associated with bendamustine use by comparing the effects of Benda-14 (intravenous bendamustine, 120 mg/m² on days 1 and 15, repeated every 4 weeks for a total of 6 cycles) with those of the standard treatment in relapsed indolent lymphoma and/or mantle cell lymphoma. Forty-six patients were randomly assigned to the treatments from September 2012 to February 2016. The treatment accomplishment rate and median relative dose intensity were similar in both arms: 38% and 63.4% in the Benda-14 arm and 41% and 66.3% in the standard treatment arm, respectively. The overall response rate and median progression-free survival, respectively, were 83% and 21.0 months for Benda-14, and 77% and 15.5 months for the standard treatment. Benda-14 induced favorable responses with less frequent hematological toxicities.

  10. Spectrophotometric Analysis of the Kinetics of Pd(II) Chloride Complex Ions Sorption Process from Diluted Aqua Solutions Using Commercially Available Activated Carbon

    Directory of Open Access Journals (Sweden)

    Wojnicki M.

    2017-12-01

    In this paper, results of adsorption kinetics studies of Pd(II) chloride complex ions on the activated carbon Organosorb 10 CO are presented. A spectrophotometric method was applied to investigate the process. A kinetic model is proposed, and fundamental thermodynamic parameters are determined. The proposed kinetic model describes the observed phenomena well over the studied range of concentrations of Pd(II) chloride complex ions as well as of activated carbon.
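    The record does not name the fitted kinetic model; for orientation only, a form commonly assumed for such sorption data is the pseudo-first-order rate law (an assumption here, not a claim about this paper), which in LaTeX reads:

        \frac{\mathrm{d}q_t}{\mathrm{d}t} = k_1\,(q_e - q_t)
        \qquad\Longrightarrow\qquad
        q_t = q_e \left(1 - e^{-k_1 t}\right),

    where q_t is the amount adsorbed at time t, q_e the equilibrium uptake, and k_1 the rate constant.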

  11. Scanning ARM Cloud Radars Part II. Data Quality Control and Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kollias, Pavlos [McGill Univ., Montreal, QC (Canada); Jo, Ieng [McGill Univ., Montreal, QC (Canada); Borque, Paloma [McGill Univ., Montreal, QC (Canada); Tatarevic, Aleksandra [McGill Univ., Montreal, QC (Canada); Lamer, Katia [McGill Univ., Montreal, QC (Canada); Bharadwaj, Nitin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widener, Kevin B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Karen [Brookhaven National Lab. (BNL), Upton, NY (United States); Clothiaux, Eugene E. [Pennsylvania State Univ., State College, PA (United States)

    2013-10-04

    The Scanning ARM Cloud Radars (SACRs) are the primary instruments for documenting the four-dimensional structure and evolution of clouds within a 20-30 km radius of the ARM fixed and mobile sites. Here, the post-processing of the calibrated SACR measurements is discussed. First, a feature mask algorithm that objectively determines the presence of significant radar returns is described. The feature mask algorithm is based on the statistical properties of radar receiver noise. It accounts for atmospheric emission and is applicable even for SACR profiles with few or no signal-free range gates. Using the nearest-in-time atmospheric sounding, the SACR radar reflectivities are corrected for gaseous attenuation (water vapor and oxygen) using a line-by-line absorption model. Despite having a high pulse repetition frequency, the SACR has a narrow Nyquist velocity limit and thus Doppler velocity folding is commonly observed. An unfolding algorithm that makes use of a first guess for the true Doppler velocity using horizontal wind measurements from the nearest sounding is described. The retrieval of the horizontal wind profile from the Hemispherical Sky - Range Height Indicator SACR scan observations and/or the nearest sounding is also described. The retrieved horizontal wind profile can be used to adaptively configure SACR scan strategies that depend on wind direction. Several remaining challenges are discussed, including the removal of insect and second-trip echoes. The described algorithms significantly enhance SACR data quality and constitute an important step towards the utilization of SACR measurements for cloud research.

  12. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    Science.gov (United States)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

    Context. Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order of 10^12 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines. Sophisticated geometries should be investigatable by making their simulation possible for the first time. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to more simple, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online.
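    The Green's-function approach described above admits a compact numerical illustration: once the response G(E_out, E_in) to monoenergetic injections has been tabulated, a synthetic profile for an arbitrary continuum follows from a matrix-vector product. A minimal Python sketch under an assumed discretization (all names and the placeholder response are illustrative):

        import numpy as np

        # Assume G[i, j] = probability that a photon injected at energy
        # E_in[j] emerges at energy E_out[i] (tabulated from monoenergetic
        # Monte Carlo runs); here a placeholder identity response is used.
        E_in = np.linspace(10.0, 60.0, 200)      # keV, injection grid
        G = np.eye(len(E_in))

        def synthesize(G, continuum):
            """Fold an arbitrary continuum through the tabulated Green's
            functions to obtain the emergent spectrum."""
            return G @ continuum

        continuum = E_in ** -1.5                 # assumed power-law continuum
        spectrum = synthesize(G, continuum)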

  13. ATD-2 Surface Scheduling and Metering Concept

    Science.gov (United States)

    Coppenbarger, Richard A.; Jung, Yoon Chul; Capps, Richard Alan; Engelland, Shawn A.

    2017-01-01

    This presentation describes the concept of ATD-2 tactical surface scheduling and metering. The concept is composed of several elements, including data exchange and integration, surface modeling, surface scheduling, and surface metering, and the presentation explains each of them. Surface metering is implemented to balance demand and capacity: when surface metering is on, target times from the surface scheduler are converted to advisories for throttling demand. Through the scheduling process, flights with CTOTs will not receive added metering delay (avoiding the potential for "double delay"), and carriers can designate certain flights as exempt from metering holds. The demand throttle in Phase 1 at CLT is through advisories sent to ramp controllers for pushback instructions to the flight deck: push now, or hold for an advised period of time (in minutes). The principles of surface metering can be applied more generally to other airports in the NAS to throttle demand via spot-release times (TMATs). The concept places a strong focus on optimal use of airport resources; its flexibility enables stakeholders to vary the amount of delay they would like transferred to the gate, and it addresses practical aspects of executing surface metering in a turbulent real-world environment. The algorithms are designed both for short-term demand/capacity imbalances (banks) and for sustained metering situations, and they leverage automation to enable a surface metering capability without requiring additional positions. The concept represents a first step in tactical/strategic fusion, providing longer look-ahead calculations to enable analysis of potential strategic surface metering usage.

  14. Advance Resource Provisioning in Bulk Data Scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Balman, Mehmet

    2012-10-01

    Today's scientific and business applications generate massive data sets that need to be transferred to remote sites for sharing, processing, and long term storage. Because of increasing data volumes and enhancement in current network technology that provide on-demand high-speed data access between collaborating institutions, data handling and scheduling problems have reached a new scale. In this paper, we present a new data scheduling model with advance resource provisioning, in which data movement operations are defined with earliest start and latest completion times. We analyze the time-dependent resource assignment problem, and propose a new methodology to improve the current systems by allowing researchers and higher-level meta-schedulers to use data-placement as-a-service, so they can plan ahead and submit transfer requests in advance. In general, scheduling with time and resource conflicts is NP-hard. We introduce an efficient algorithm to organize multiple requests on the fly, while satisfying users' time and resource constraints. We successfully tested our algorithm in a simple benchmark simulator that we have developed, and demonstrated its performance with initial test results.
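    As a toy illustration of requests carrying earliest start and latest completion times (not the paper's algorithm), a greedy earliest-deadline-first feasibility check on a single dedicated link can be sketched in Python as follows; all data are illustrative.

        def schedule_transfers(requests, bandwidth=1.0):
            """Greedy sketch: sort transfer requests by latest completion
            time and place each at the earliest feasible start on a single
            link. Each request: (name, earliest, latest, volume)."""
            schedule, t = [], 0.0
            for name, earliest, latest, volume in sorted(requests,
                                                         key=lambda r: r[2]):
                start = max(t, earliest)
                finish = start + volume / bandwidth
                if finish <= latest:         # fits inside its time window
                    schedule.append((name, start, finish))
                    t = finish
                else:                        # reject: deadline would be missed
                    schedule.append((name, None, None))
            return schedule

        reqs = [("A", 0, 10, 4), ("B", 2, 8, 3), ("C", 0, 20, 6)]
        print(schedule_transfers(reqs))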

  15. 75 FR 34219 - Revision of Fee Schedules; Fee Recovery for FY 2010

    Science.gov (United States)

    2010-06-16

    ... Part II Nuclear Regulatory Commission 10 CFR Parts 170 and 171 Revision of Fee Schedules; Fee...-2009-0333 RIN 3150-AI70 Revision of Fee Schedules; Fee Recovery for FY 2010 AGENCY: Nuclear Regulatory..., inspection, and annual fees charged to its applicants and licensees. The amendments are necessary to...

  16. Practice schedule and age on the adaptive process of the coincident timing task learning

    Directory of Open Access Journals (Sweden)

    Lucia Afonso Gonçalves

    2010-12-01

    The objective of this study was to investigate the effects of different practice schedules on the adaptive process of learning a coincident timing task as a function of age. Children (n = 40), adults (n = 47) and elderly participants (n = 57) were distributed into constant, random, constant-random and random-constant practice groups. The task consisted of touching response keys sequentially in conjunction with a visual stimulus. The experimental design consisted of two learning phases: stabilization and adaptation. The data were analyzed in terms of absolute, variable, constant and execution errors. The results showed that constant-random practice was the most beneficial for the adaptive process of learning a coincident timing task in children, adults and the elderly alike.

  17. Pre-treatment processes of Azolla filiculoides to remove Pb(II), Cd(II), Ni(II) and Zn(II) from aqueous solution in the batch and fixed-bed reactors.

    Science.gov (United States)

    Khosravi, Morteza; Rakhshaee, Roohan; Ganji, Masuod Taghi

    2005-12-09

    Intact and treated biomass can remove heavy metals from water and wastewater. This study examined the ability of activated, semi-intact and inactivated Azolla filiculoides (a small water fern) to remove Pb(2+), Cd(2+), Ni(2+) and Zn(2+) from aqueous solution. The maximum uptake capacities for these metal ions using Azolla filiculoides activated by NaOH at pH 10.5 +/- 0.2 and then by CaCl(2)/MgCl(2)/NaCl at a total concentration of 2 M (2:1:1 mole ratio) in separate batch reactors were about 271, 111, 71 and 60 mg/g (dry Azolla), respectively. The maximum adsorption capacities obtained for these kinds of pre-treated Azolla in fixed-bed reactors (N(o)) were also very close to the values obtained for the batch reactors (Q(max)). On the other hand, it was shown that HCl, CH(3)OH, C(2)H(5)OH, FeCl(2), SrCl(2), BaCl(2) and AlCl(3) in the pre-treatment processes considerably decreased the ability of Azolla to remove the heavy metals in comparison to the semi-intact Azolla. Kinetic studies showed that heavy metal uptake by the activated Azolla proceeded more rapidly than by the semi-intact Azolla.

  18. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Science.gov (United States)

    2010-04-01

    ... schedules of rates for emergency energy, spinning reserve or economy energy or in cases of coordination and...? (ii) A summary statement of all cost (whether fully distributed, incremental or other) computations...

  19. Solvent refined coal (SRC) process. Flashing of SRC-II slurry in the vacuum column on Process Development Unit P-99. Interim report, February-June 1980

    Energy Technology Data Exchange (ETDEWEB)

    Gray, J. A.; Mathias, S. T.

    1980-10-01

    This report presents the results of 73 tests on the vacuum flash system of Process Development Unit P-99 performed during processing of three different coals: the second batch, fourth shipment (low ash batch) of Powhatan No. 5 Mine (LR-27383), Powhatan No. 6 Mine (LR-27596) and Ireland Mine (LR-27987). The objective of this work was to obtain experimental data for use in confirming and improving the design of the vacuum distillation column for the 6000 ton/day SRC-II Demonstration Plant. The 900°F distillate content of the bottoms and the percent of feed flashed overhead were correlated with flash zone operating conditions for each coal, and the observed differences in performance were attributed to differences in the feed compositions. Retrogressive reactions appeared to be occurring in the 900°F+ pyridine soluble material, leading to an increase in the quantity of pyridine insoluble organic matter. Stream physical properties determined include specific gravity, viscosity and melting point. Elemental, distillation and solvent analyses were used to calculate component material balances. The Technology and Materials Department has used these results in a separate study comparing experimental K-values and vapor/liquid split with CHAMP computer program design predictions.

  20. Plan and schedule for disposition and regulatory compliance for miscellaneous streams. Revision 1

    International Nuclear Information System (INIS)

    1994-12-01

    On December 23, 1991, the U.S. Department of Energy, Richland Operations Office (RL) and the Washington State Department of Ecology (Ecology) agreed to adhere to the provisions of Department of Ecology Consent Order No. DE 91NM-177 (Consent Order). The Consent Order lists regulatory milestones for liquid effluent streams at the Hanford Site to comply with the permitting requirements of Washington Administrative Code (WAC) 173-216 (State Waste Discharge Permit Program) or WAC 173-218 (Washington Underground Injection Control Program) where applicable. Hanford Site liquid effluent streams discharging to the soil column have been categorized in the Consent Order as Phase I Streams, Phase II Streams, and Miscellaneous Streams. Phase I and Phase II Streams are addressed in two RL reports: "Plan and Schedule to Discontinue Disposal of Contaminated Liquids into the Soil Column at the Hanford Site" (DOE-RL 1987) and the "Annual Status Report of the Plan and Schedule to Discontinue Disposal of Contaminated Liquids into the Soil Column at the Hanford Site". Miscellaneous Streams are those liquid effluent streams discharged to the ground that are not categorized as Phase I or Phase II Streams. Miscellaneous Streams discharging to the soil column at the Hanford Site are subject to the requirements of several milestones identified in the Consent Order. This document provides a plan and schedule for the disposition of Miscellaneous Streams. The disposition process for the Miscellaneous Streams is facilitated using a decision tree format. The decision tree and corresponding analysis for determining the appropriate disposition of these streams are presented in this document.

  1. Gain scheduling using the Youla parameterization

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    1999-01-01

    Gain scheduling controllers are considered in this paper. The gain scheduling problem where the scheduling parameter vector cannot be measured directly, but needs to be estimated, is considered. An estimate of the scheduling vector has been derived by using the Youla parameterization. The use ... in connection with H_inf gain scheduling controllers.

  2. 78 FR 26701 - Schedules of Controlled Substances: Placement of Lorcaserin Into Schedule IV

    Science.gov (United States)

    2013-05-08

    .... Phentermine Being Combined With Lorcaserin Eight commenters expressed concern about the probability that... follows: Several commenters were critical of DEA's handling of the scheduling process. The commenters did..., 1301.74, 1301.75(b) and (c), 1301.76, and 1301.77 on or after June 7, 2013. Labeling and Packaging. All...

  3. A microencapsulation process of liquid mercury by sulfur polymer stabilization/solidification technology. Part II: Durability of materials

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Delgado, A.; Guerrero, A.; Lopez, F. A.; Perez, C.; Alguacil, F. J.

    2012-11-01

    Under the European LIFE Program a microencapsulation process was developed for liquid mercury using Sulfur Polymer Stabilization/Solidification (SPSS) technology, obtaining a stable concrete-like sulfur matrix that allows the immobilization of mercury for long-term storage. The process description and characterization of the materials obtained were detailed in Part I. The present document, Part II, reports the results of different tests carried out to determine the durability of Hg-S concrete samples with very high mercury content (up to 30 % w/w). Different UNE and RILEM standard test methods were applied, such as capillary water absorption, low pressure water permeability, alkali/acid resistance, salt mist aging, freeze-thaw resistance and fire performance. The samples exhibited no capillarity and their resistance in both alkaline and acid media was very high. They also showed good resistance to very aggressive environments such as spray salt mist, freeze-thaw and dry-wet. The fire hazard of samples at low heat output was negligible. (Author)

  4. Chitosan microparticles: influence of the gelation process on the release profile and oral bioavailability of albendazole, a class II compound.

    Science.gov (United States)

    Piccirilli, Gisela N; García, Agustina; Leonardi, Darío; Mamprin, María E; Bolmaro, Raúl E; Salomón, Claudio J; Lamas, María C

    2014-11-01

    Encapsulation of albendazole, a class II compound, into polymeric microparticles based on chitosan-sodium lauryl sulfate was investigated as a strategy to improve drug dissolution and oral bioavailability. The microparticles were prepared by a spray drying technique and further characterized by means of X-ray powder diffractometry, infrared spectroscopy and scanning electron microscopy. The formation of a novel polymeric structure between chitosan and sodium lauryl sulfate, after the internal or external gelation process, was observed by infrared spectroscopy. The encapsulation efficiency was found to be between 60 and 85%, depending on whether the internal or external gelation process was used. Almost spherical spray-dried microparticles were observed using scanning electron microscopy. In vitro dissolution results indicated that the microparticles prepared by internal gelation released 8% of the drug within 30 min, while the microparticles prepared by external gelation released 67% within 30 min. The AUC and Cmax values of ABZ from the microparticles were greatly improved in comparison with the non-encapsulated drug. In conclusion, the release properties and oral bioavailability of albendazole were greatly improved by using spray-dried chitosan-sodium lauryl sulfate microparticles.

  5. Constraint-based scheduling applying constraint programming to scheduling problems

    CERN Document Server

    Baptiste, Philippe; Nuijten, Wim

    2001-01-01

    Constraint Programming is a problem-solving paradigm that establishes a clear distinction between two pivotal aspects of a problem: (1) a precise definition of the constraints that define the problem to be solved and (2) the algorithms and heuristics enabling the selection of decisions to solve the problem. It is because of these capabilities that Constraint Programming is increasingly being employed as a problem-solving tool to solve scheduling problems. Hence the development of Constraint-Based Scheduling as a field of study. The aim of this book is to provide an overview of the most widely used Constraint-Based Scheduling techniques. Following the principles of Constraint Programming, the book consists of three distinct parts: The first chapter introduces the basic principles of Constraint Programming and provides a model of the constraints that are the most often encountered in scheduling problems. Chapters 2, 3, 4, and 5 are focused on the propagation of resource constraints, which usually are responsibl...

  6. Comparing Mixed & Integer Programming vs. Constraint Programming by solving Job-Shop Scheduling Problems

    Directory of Open Access Journals (Sweden)

    Renata Melo e Silva de Oliveira

    2015-03-01

    Scheduling is a key factor for operations management as well as for business success. Industrial job-shop scheduling problems (JSSP) have raised many optimization challenges since the 1960s, when improvements began to be continuously required in areas such as bottleneck allocation, lead-time reduction and response time to requests. With this in perspective, this work discusses three different optimization models for minimizing makespan. The three models were applied to 17 classical JSSP instances and produced different outputs. The first model rests on Mixed and Integer Programming (MIP) and optimized 60% of the studied problems. The other models were based on Constraint Programming (CP) and approached the problem in two different ways: (a) model CP-1 is a standard IBM algorithm whose constraints have an interval structure, and it failed to solve 53% of the proposed instances; (b) model CP-2 approaches the problem with disjunctive constraints and optimized 88% of the instances. In this work, each model is analyzed individually and then compared considering: (i) optimization success performance, (ii) computational processing time, (iii) greatest resource utilization and (iv) minimum work-in-process inventory. Results demonstrate that CP-2 presented the best results on criteria (i) and (ii), but MIP was superior on criteria (iii) and (iv); these findings are discussed in the final section of this work.
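    For readers unfamiliar with the disjunctive formulation used by CP-2, the sketch below expresses a two-job, two-machine job-shop with no-overlap (disjunctive) constraints in Python using Google OR-Tools CP-SAT rather than the IBM solver used in the paper; the data are illustrative.

        from ortools.sat.python import cp_model

        # jobs[j] = list of (machine, duration) operations in technological order.
        jobs = [[(0, 3), (1, 2)], [(1, 2), (0, 4)]]
        horizon = sum(d for job in jobs for _, d in job)

        model = cp_model.CpModel()
        machine_intervals = {0: [], 1: []}
        job_ends = []
        for j, job in enumerate(jobs):
            prev_end = None
            for k, (m, d) in enumerate(job):
                start = model.NewIntVar(0, horizon, "s_%d_%d" % (j, k))
                end = model.NewIntVar(0, horizon, "e_%d_%d" % (j, k))
                iv = model.NewIntervalVar(start, d, end, "iv_%d_%d" % (j, k))
                machine_intervals[m].append(iv)
                if prev_end is not None:     # precedence within the job
                    model.Add(start >= prev_end)
                prev_end = end
            job_ends.append(prev_end)

        for ivs in machine_intervals.values():
            model.AddNoOverlap(ivs)          # disjunctive machine constraint

        makespan = model.NewIntVar(0, horizon, "makespan")
        model.AddMaxEquality(makespan, job_ends)
        model.Minimize(makespan)

        solver = cp_model.CpSolver()
        solver.Solve(model)
        print("makespan =", solver.Value(makespan))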

  7. Scheduling Broadcasts in a Network of Timelines

    KAUST Repository

    Manzoor, Emaad A.

    2015-05-12

    Broadcasts and timelines are the primary mechanism of information exchange in online social platforms today. Services like Facebook, Twitter and Instagram have enabled ordinary people to reach large audiences spanning cultures and countries, while their massive popularity has created increasingly competitive marketplaces of attention. Timing broadcasts to capture the attention of such geographically diverse audiences has sparked interest from many startups and social marketing gurus. However, formal study is lacking on both the timing and frequency problems. In this thesis, we introduce, motivate and solve the broadcast scheduling problem of specifying the timing and frequency of publishing content to maximise the attention received. We validate and quantify three interacting behavioural phenomena to parametrise social platform users: information overload, bursty circadian rhythms and monotony aversion, which is defined here for the first time. Our analysis of the influence of monotony refutes the common assumption that posts on social network timelines are consumed piecemeal independently. Instead, we reveal that posts are consumed in chunks, which has important consequences for any future work considering human behaviour over social network timelines. Our quantification of monotony aversion is also novel, and has applications to problems in various domains such as recommender list diversification, user satiation and variety-seeking consumer behaviour. Having studied the underlying behavioural phenomena, we link schedules, timelines, attention and behaviour by formalising a timeline information exchange process. Our formulation gives rise to a natural objective function that quantifies the expected collective attention an arrangement of posts on a timeline will receive. We apply this formulation as a case-study on real-data from Twitter, where we estimate behavioural parameters, calculate the attention potential for different scheduling strategies and, using the

  8. Artificial intelligence for the CTA Observatory scheduler

    Science.gov (United States)

    Colomé, Josep; Colomer, Pau; Campreciós, Jordi; Coiffard, Thierry; de Oña, Emma; Pedaletti, Giovanna; Torres, Diego F.; Garcia-Piquer, Alvaro

    2014-08-01

    The Cherenkov Telescope Array (CTA) project will be the next generation ground-based very high energy gamma-ray instrument. The success of the precursor projects (i.e., HESS, MAGIC, VERITAS) motivated the construction of this large infrastructure that is included in the roadmap of the ESFRI projects since 2008. CTA is planned to start the construction phase in 2015 and will consist of two arrays of Cherenkov telescopes operated as a proposal-driven open observatory. Two sites are foreseen at the southern and northern hemispheres. The CTA observatory will handle several observation modes and will have to operate tens of telescopes with a highly efficient and reliable control. Thus, the CTA planning tool is a key element in the control layer for the optimization of the observatory time. The main purpose of the scheduler for CTA is the allocation of multiple tasks to one single array or to multiple sub-arrays of telescopes, while maximizing the scientific return of the facility and minimizing the operational costs. The scheduler considers long- and short-term varying conditions to optimize the prioritization of tasks. A short-term scheduler provides the system with the capability to adapt, in almost real-time, the selected task to the varying execution constraints (i.e., Targets of Opportunity, health or status of the system components, environment conditions). The scheduling procedure ensures that long-term planning decisions are correctly transferred to the short-term prioritization process for a suitable selection of the next task to execute on the array. In this contribution we present the constraints to CTA task scheduling that helped classifying it as a Flexible Job-Shop Problem case and finding its optimal solution based on Artificial Intelligence techniques. We describe the scheduler prototype that uses a Guarded Discrete Stochastic Neural Network (GDSN), for an easy representation of the possible long- and short-term planning solutions, and Constraint

  9. The local–global conjecture for scheduling with non-linear cost

    NARCIS (Netherlands)

    Bansal, N.; Dürr, C.; Thang, N.K.K.; Vásquez, Ó.C.

    2017-01-01

    We consider the classical scheduling problem on a single machine, on which we need to schedule sequentially n given jobs. Every job j has a processing time p_j and a priority weight w_j, and for a given schedule a completion time C_j. In this paper, we consider the problem of minimizing the objective
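    The abstract is cut off mid-sentence; for orientation, in this line of work the objective is typically of the form below (stated as background, not quoted from the paper):

        \min_{\sigma} \sum_{j=1}^{n} w_j \, f(C_j),
        \qquad \text{e.g. } f(C) = C^{\beta} \ \text{for some fixed } \beta > 0,

    where f is a given non-decreasing cost function and the minimization is over schedules \sigma.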

  10. "What Do I Teach for 90 Minutes?" Creating a Successful Block-Scheduled English Classroom.

    Science.gov (United States)

    Porter, Carol

    The story of the process that Mundelein High School (located in a northwest suburb of Chicago, Illinois) went through as it moved from a traditional schedule to a block schedule is told throughout this book as a way to blend theory with practice. The book addresses types of block schedules; key issues for effective preparation; professional development…

  11. A microencapsulation process of liquid mercury by sulfur polymer stabilization/solidification technology. Part II: Durability of materials

    Directory of Open Access Journals (Sweden)

    López-Delgado, A.

    2012-02-01

    Under the European LIFE Program a microencapsulation process was developed for liquid mercury using Sulfur Polymer Stabilization/Solidification (SPSS) technology, obtaining a stable concrete-like sulfur matrix that allows the immobilization of mercury for long-term storage. The process description and characterization of the materials obtained were detailed in Part I. The present document, Part II, reports the results of different tests carried out to determine the durability of Hg-S concrete samples with very high mercury content (up to 30 % w/w). Different UNE and RILEM standard test methods were applied, such as capillary water absorption, low pressure water permeability, alkali/acid resistance, salt mist aging, freeze-thaw resistance and fire performance. The samples exhibited no capillarity and their resistance in both alkaline and acid media was very high. They also showed good resistance to very aggressive environments such as spray salt mist, freeze-thaw and dry-wet. The fire hazard of samples at low heat output was negligible.


  12. Immunization Schedules for Infants and Children

    Science.gov (United States)

    Recommended immunization schedule for infants and children (birth through 6 years).

  13. Operating Theatre Planning and Scheduling.

    NARCIS (Netherlands)

    Hans, Elias W.; Vanberkel, P.T.; Hall, R.

    2012-01-01

    In this chapter we present a number of approaches to operating theatre planning and scheduling. We organize these approaches hierarchically, which serves to illustrate the breadth of problems confronted by researchers. At each hierarchical planning level we describe common problems, solution

  14. Schedule Sales Query Raw Data

    Data.gov (United States)

    General Services Administration — Schedule Sales Query presents sales volume figures as reported to GSA by contractors. The reports are generated as quarterly reports for the current year and the...

  15. Multiagent scheduling models and algorithms

    CERN Document Server

    Agnetis, Alessandro; Gawiejnowicz, Stanisław; Pacciarelli, Dario; Soukhal, Ameur

    2014-01-01

    This book presents multi-agent scheduling models in which subsets of jobs sharing the same resources are evaluated by different criteria. It discusses complexity results, approximation schemes, heuristics and exact algorithms.

  16. Executive Schedule C System (ESCS)

    Data.gov (United States)

    Office of Personnel Management — Used to store information on Federal employees in the Senior Executive Service (SES) and appointed employees in the Schedule C System. Every four years, just after...

  17. Future aircraft networks and schedules

    Science.gov (United States)

    Shu, Yan

    2011-07-01

    Because of the importance of air transportation scheduling, the emergence of small aircraft and the vision of future fuel-efficient aircraft, this thesis has focused on the study of aircraft scheduling and network design involving multiple types of aircraft and flight services. It develops models and solution algorithms for the schedule design problem and analyzes the computational results. First, based on the current development of small aircraft and on-demand flight services, this thesis expands a business model for integrating on-demand flight services with the traditional scheduled flight services. This thesis proposes a three-step approach to the design of aircraft schedules and networks from scratch under the model. In the first step, both a frequency assignment model for scheduled flights that incorporates a passenger path choice model and a frequency assignment model for on-demand flights that incorporates a passenger mode choice model are created. In the second step, a rough fleet assignment model that determines a set of flight legs, each of which is assigned an aircraft type and a rough departure time is constructed. In the third step, a timetable model that determines an exact departure time for each flight leg is developed. Based on the models proposed in the three steps, this thesis creates schedule design instances that involve almost all the major airports and markets in the United States. The instances of the frequency assignment model created in this thesis are large-scale non-convex mixed-integer programming problems, and this dissertation develops an overall network structure and proposes iterative algorithms for solving these instances. The instances of both the rough fleet assignment model and the timetable model created in this thesis are large-scale mixed-integer programming problems, and this dissertation develops subproblem schemes for solving these instances. Based on these solution algorithms, this dissertation also presents

  18. Construction schedules slack time minimizing

    Science.gov (United States)

    Krzemiński, Michał

    2017-07-01

    The article presents two of the author's models for minimizing the downtime of working brigades. The models have been developed for construction schedules executed using the uniform work method. The application of flow-shop models is possible and useful for the implementation of large objects that can be divided into plots. The article also presents a condition determining which model should be used, as well as a brief example of schedule optimization. The optimization results confirm the value of the work on the newly developed models.

  19. Biosorption of Cd(II), Ni(II) and Pb(II) from aqueous solution by dried biomass of aspergillus niger: application of response surface methodology to the optimization of process parameters

    Energy Technology Data Exchange (ETDEWEB)

    Amini, Malihe; Younesi, Habibollah [Department of Environmental Science, Faculty of Natural Resources and Marine Sciences, Tarbiat Modares University, Noor (Iran)

    2009-10-15

    In this study, the biosorption of Cd(II), Ni(II) and Pb(II) on Aspergillus niger in a batch system was investigated, and the optimal conditions were determined by means of central composite design (CCD) under response surface methodology (RSM). Biomass inactivated by heat and pretreated with alkali solution was used in the determination of the optimal conditions. The effects of initial solution pH, biomass dose and initial ion concentration on the removal efficiency of metal ions by A. niger were optimized using a design of experiments (DOE) method. Experimental results indicated that the optimal conditions for biosorption were a biomass dose of 5.22 g/L, an initial ion concentration of 89.93 mg/L and a solution pH of 6.01. Enhancement of the metal biosorption capacity of the dried biomass by pretreatment with sodium hydroxide was observed. Maximal removal efficiencies for Cd(II), Ni(II) and Pb(II) ions of 98, 80 and 99% were achieved, respectively. The biosorption capacities of the A. niger biomass for Cd(II), Ni(II) and Pb(II) ions were 2.2, 1.6 and 4.7 mg/g, respectively. According to these observations, the fungal biomass of A. niger is a suitable biosorbent for the removal of heavy metals from aqueous solutions. Multiple response optimization was applied to the experimental data to discover the optimal conditions for a set of responses simultaneously by using a desirability function. (Abstract Copyright [2009], Wiley Periodicals, Inc.)
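    As a schematic of the RSM step (illustrative factor values and responses, not the study's dataset), fitting a second-order response surface for removal efficiency can be done in Python by least squares:

        import numpy as np

        # Illustrative CCD-style data: columns = biomass dose (g/L),
        # initial ion concentration (mg/L), pH; y = removal efficiency (%).
        X = np.array([[4.0, 80.0, 5.5], [5.0, 90.0, 6.0], [6.0, 100.0, 6.5],
                      [5.0, 80.0, 6.5], [4.0, 100.0, 6.0], [6.0, 90.0, 5.5],
                      [4.0, 90.0, 6.5], [6.0, 80.0, 6.0], [5.0, 100.0, 5.5],
                      [4.0, 85.0, 6.0], [6.0, 95.0, 6.2], [5.0, 90.0, 5.8]])
        y = np.array([88.0, 97.0, 90.0, 92.0, 85.0, 89.0,
                      91.0, 86.0, 84.0, 90.0, 93.0, 95.0])

        def quadratic_design(X):
            """Second-order model matrix: intercept, linear terms,
            pure quadratic terms, and pairwise interactions."""
            n, k = X.shape
            cols = [np.ones(n)] + [X[:, i] for i in range(k)]
            cols += [X[:, i] ** 2 for i in range(k)]
            cols += [X[:, i] * X[:, j]
                     for i in range(k) for j in range(i + 1, k)]
            return np.column_stack(cols)

        A = quadratic_design(X)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # fitted coefficients
        print(coef.round(3))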

  20. Hybrid Pareto artificial bee colony algorithm for multi-objective single machine group scheduling problem with sequence-dependent setup times and learning effects.

    Science.gov (United States)

    Yue, Lei; Guan, Zailin; Saif, Ullah; Zhang, Fei; Wang, Hao

    2016-01-01

    Group scheduling is significant for efficient and cost-effective production systems. However, setup times exist between groups, and they should be reduced by sequencing the groups efficiently. The current research focuses on a sequence-dependent group scheduling problem with the aim of minimizing the makespan and the total weighted tardiness simultaneously. In most production scheduling problems the processing time of jobs is assumed to be fixed; however, the actual processing time of jobs may be reduced due to the "learning effect". The integration of sequence-dependent group scheduling with learning effects has rarely been considered in the literature. Therefore, the current research considers a single machine group scheduling problem with sequence-dependent setup times and learning effects simultaneously. A novel hybrid Pareto artificial bee colony algorithm (HPABC) incorporating some steps of the genetic algorithm is proposed to obtain Pareto solutions for this problem. Furthermore, five different sizes of test problems (small, small-medium, medium, large-medium, large) are tested using the proposed HPABC. The Taguchi method is used to tune the effective parameters of the proposed HPABC for each problem category. The performance of HPABC is compared with three well-known multi-objective optimization algorithms: the improved strength Pareto evolutionary algorithm (SPEA2), the non-dominated sorting genetic algorithm II (NSGAII) and the particle swarm optimization algorithm (PSO). Results indicate that HPABC outperforms SPEA2, NSGAII and PSO and gives better Pareto optimal solutions in terms of diversity and quality for almost all instances of the different problem sizes.
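    The record does not give the exact learning-effect model; a widely used position-based form (Biskup's model, stated here as background on the general setting, not as this paper's formulation) lets the actual processing time of a job depend on its position r in the sequence:

        p_{jr} = p_j \, r^{a}, \qquad a < 0,

    where p_j is the normal processing time of job j and a is the learning index, so jobs placed later in the sequence take less actual processing time.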

  1. Multireversible redox processes in pentanuclear bis(triple-helical) manganese complexes featuring an oxo-centered triangular {Mn(II)2Mn(III)(μ3-O)}5+ or {Mn(II)Mn(III)2(μ3-O)}6+ core wrapped by two {Mn(II)2(bpp)3}-.

    Science.gov (United States)

    Romain, Sophie; Rich, Jordi; Sens, Cristina; Stoll, Thibaut; Benet-Buchholz, Jordi; Llobet, Antoni; Rodriguez, Montserrat; Romero, Isabel; Clérac, Rodolphe; Mathonière, Corine; Duboc, Carole; Deronzier, Alain; Collomb, Marie-Noëlle

    2011-09-05

    A new pentanuclear bis(triple-helical) manganese complex has been isolated and characterized by X-ray diffraction in two oxidation states: [{Mn(II)(μ-bpp)(3)}(2)Mn(II)(2)Mn(III)(μ-O)](3+) (1(3+)) and [{Mn(II)(μ-bpp)(3)}(2)Mn(II)Mn(III)(2)(μ-O)](4+) (1(4+)). The structure consists of a central {Mn(3)(μ(3)-O)} core of Mn(II)(2)Mn(III) (1(3+)) or Mn(II)Mn(III)(2) ions (1(4+)) which is connected to two apical Mn(II) ions through six bpp(-) ligands. Both cations have a triple-stranded helicate configuration, and a pair of enantiomers is present in each crystal. The redox properties of 1(3+) have been investigated in CH(3)CN. A series of five distinct and reversible one-electron waves is observed in the -1.0 and +1.50 V potential range, assigned to the Mn(II)(4)Mn(III)/Mn(II)(5), Mn(II)(3)Mn(III)(2)/Mn(II)(4)Mn(III), Mn(II)(2)Mn(III)(3)/Mn(II)(3)Mn(III)(2), Mn(II)Mn(III)(4)/Mn(II)(2)Mn(III)(3), and Mn(III)(5)/Mn(II)Mn(III)(4) redox couples. The two first oxidation processes leading to Mn(II)(3)Mn(III)(2) (1(4+)) and Mn(II)(2)Mn(III)(3) (1(5+)) are related to the oxidation of the Mn(II) ions of the central core and the two higher oxidation waves, close in potential, are thus assigned to the oxidation of the two apical Mn(II) ions. The 1(4+) and 1(5+) oxidized species and the reduced Mn(4)(II) (1(2+)) species are quantitatively generated by bulk electrolyses demonstrating the high stability of the pentanuclear structure in four oxidation states (1(2+) to 1(5+)). The spectroscopic characteristics (X-band electron paramagnetic resonance, EPR, and UV-visible) of these species are also described as well as the magnetic properties of 1(3+) and 1(4+) in solid state. The powder X- and Q-band EPR signature of 1(3+) corresponds to an S = 5/2 spin state characterized by a small zero-field splitting parameter (|D| = 0.071 cm(-1)) attributed to the two apical Mn(II) ions. At 40 K, the magnetic behavior is consistent for 1(3+) with two apical S = 5/2 {Mn(II)(bpp)(3)}(-) and one S

  2. Flow-shop scheduling problem under uncertainties: Review and trends

    Directory of Open Access Journals (Sweden)

    Eliana María González-Neira

    2017-03-01

    Full Text Available Among the different tasks in production logistics, job scheduling is one of the most important at the operational decision-making level if organizations are to achieve competitiveness. Scheduling consists of the allocation of limited resources to activities over time in order to achieve one or more optimization objectives. Flow-shop (FS) scheduling problems encompass the sequencing processes in environments in which the activities or operations are performed in a serial flow. This type of configuration includes assembly lines and the chemical, electronic, food, and metallurgical industries, among others. Scheduling has mostly been investigated for the deterministic case, in which all parameters are known in advance and do not vary over time. Nevertheless, in real-world situations, events are frequently subject to uncertainties that can affect the decision-making process. Thus, it is important to study scheduling and sequencing activities under uncertainty, since uncertainties can cause infeasibilities and disturbances. The purpose of this paper is to provide a general overview of the FS scheduling problem under uncertainties, to describe its role in production logistics, and to draw up opportunities for further research. To this end, 100 papers about FS and flexible flow-shop scheduling problems published from 2001 to October 2016 were analyzed and classified. Trends in the reviewed literature are presented, and finally some research opportunities in the field are proposed.

  3. Understanding the costs and schedule of hydroelectric projects

    International Nuclear Information System (INIS)

    Merrow, E.W.; Schroeder, B.R.

    1991-01-01

    This paper is based on a study conducted for the World Bank which evaluated the feasibility of developing an empirically based ex ante project analysis system for hydroelectric projects. The system would be used to assess: the reasonableness of engineering-based cost and schedule estimates used for project appraisal and preliminary estimates used to select projects for appraisal; and the potential for cost growth and schedule slip. The system would help identify projects early in the project appraisal process that harbor significantly higher than normal risks of overrunning cost and schedule estimates

  4. Flexiyear Schedules in Germany.

    Science.gov (United States)

    Teriet, Bernhard

    1982-01-01

    Describes a German experiment whereby full-time employees can work fewer hours without losing status and part-time employees have more options in allocating working hours. The process ensures that management can count on enough staff for peak periods and can more easily plan ahead. (JOW)

  5. Roles of molecular layer interneurons in sensory information processing in mouse cerebellar cortex Crus II in vivo.

    Directory of Open Access Journals (Sweden)

    Chun-Ping Chu

    Full Text Available Cerebellar cortical molecular layer interneurons (MLIs) play essential roles in sensory information processing by the cerebellar cortex. However, recent experimental and modeling results are questioning traditional roles for molecular layer inhibition in the cerebellum. Synaptic responses of MLIs and Purkinje cells (PCs), evoked by air-puff stimulation of the ipsilateral whisker pad, were recorded from cerebellar cortex Crus II in urethane-anesthetized ICR mice by in vivo whole-cell patch-clamp recording techniques. Under current-clamp (I = 0), air-puff stimuli were found to primarily produce inhibition in PCs. In MLIs, this stimulus evoked spike firing regardless of whether they made basket-type synaptic connections or not. However, MLIs not making basket-type synaptic connections had higher rates of background activity and also generated spontaneous spikelets. Under voltage-clamp conditions, excitatory postsynaptic currents (EPSCs) were recorded in MLIs, although the predominant response of recorded PCs was an inhibitory postsynaptic potential (IPSP). The latencies of EPSCs were similar for all MLIs, but the time course and amplitude of EPSCs varied with depth in the molecular layer. The highest-amplitude, shortest-duration EPSCs were recorded from MLIs deep in the molecular layer, which also made basket-type synaptic connections. Comparing MLI to PC responses, the time to peak of the PC IPSP was significantly slower than that of MLI-recorded EPSCs. Blocking GABA(A) receptors uncovered larger EPSCs in PCs whose time to peak, half-width, and 10-90% rise time were also significantly slower than in MLIs. Biocytin labeling indicated that the MLIs (but not PCs) are dye-coupled. These findings indicate that tactile face stimulation evokes rapid excitation in MLIs and inhibition occurring at later latencies in PCs in mouse cerebellar cortex Crus II. These results support previous suggestions that the lack of parallel fiber driven PC activity is due to the effect

  6. IFR fuel cycle demonstration in the EBR-II Fuel Cycle Facility

    International Nuclear Information System (INIS)

    Lineberry, M.J.; Phipps, R.D.; Rigg, R.H.; Benedict, R.W.; Carnes, M.D.; Herceg, J.E.; Holtz, R.E.

    1991-01-01

    The next major milestone of the IFR (Integral Fast Reactor) program is engineering-scale demonstration of the pyroprocess fuel cycle. The EBR-II Fuel Cycle Facility has just entered a startup phase which includes completion of facility modifications, and installation and cold checkout of process equipment. This paper reviews the design and construction of the facility, the design and fabrication of the process equipment, and the schedule and initial plan for its operation. (author)

  7. IFR fuel cycle demonstration in the EBR-II Fuel Cycle Facility

    International Nuclear Information System (INIS)

    Lineberry, M.J.; Phipps, R.D.; Rigg, R.H.; Benedict, R.W.; Carnes, M.D.; Herceg, J.E.; Holtz, R.E.

    1991-01-01

    The next major milestone of the IFR program is engineering-scale demonstration of the pyroprocess fuel cycle. The EBR-II Fuel Cycle Facility has just entered a startup phase which includes completion of facility modifications, and installation and cold checkout of process equipment. This paper reviews the design and construction of the facility, the design and fabrication of the process equipment, and the schedule and initial plan for its operation. 5 refs., 4 figs

  8. Feasibility of processing the experimental breeder reactor-II driver fuel from the Idaho National Laboratory through Savannah River Site's H-Canyon facility

    Energy Technology Data Exchange (ETDEWEB)

    Magoulas, V. E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-07-28

    Savannah River National Laboratory (SRNL) was requested to evaluate the potential to receive and process the Idaho National Laboratory (INL) uranium (U) recovered from the Experimental Breeder Reactor II (EBR-II) driver fuel through the Savannah River Site’s (SRS) H-Canyon as a way to disposition the material. INL recovers the uranium from the sodium-bonded metallic fuel irradiated in the EBR-II reactor using an electrorefining process. There were two compositions of EBR-II driver fuel. The early-generation fuel was U-5Fs, which consisted of 95% U metal alloyed with 5% noble-metal elements, “fissium” (2.5% molybdenum, 2.0% ruthenium, 0.3% rhodium, 0.1% palladium, and 0.1% zirconium), while the later generation was U-10Zr, which was 90% U metal alloyed with 10% zirconium. A potential concern during the H-Canyon nitric acid dissolution of U metal containing zirconium (Zr) is the explosive behavior that has been reported for alloys of these materials. For this reason, this evaluation focused on the ability to process the lower-Zr-content material, the U-5Fs fuel.

  9. The applicability of knowledge-based scheduling to the utilities industry

    International Nuclear Information System (INIS)

    Yoshimoto, G.; Gargan, R. Jr.; Duggan, P.

    1992-01-01

    The Electric Power Research Institute (EPRI), Nuclear Power Division, has identified three major goals for high-technology applications in nuclear power plants. These goals are to enhance power production by increasing power generation efficiency, to increase the productivity of operations, and to reduce threats to the safety of the plant. Our project responds to the second goal by demonstrating that significant productivity increases can be achieved for outage maintenance operations with existing knowledge-based scheduling technology. Its use can also mitigate potential safety problems through the integration of risk-assessment features into the scheduler. The scheduling approach uses advanced techniques that automate the routine scheduling decisions previously handled by people. The process of removing conflicts in scheduling is automated. This is achieved by providing activity representations that allow schedulers to express a variety of scheduling constraints, and by implementing scheduling mechanisms that simulate the kinds of processes humans use to find better solutions among a large number of possible ones. This approach allows schedulers to express detailed constraints between activities and other activities, resources (material and personnel), and requirements that certain states exist for their execution. Our scheduler has already demonstrated its benefit by improving the shuttle processing flow management at Kennedy Space Center. Knowledge-based scheduling techniques should be examined by utilities-industry researchers, developers, operators, and management for application to utilities planning problems because of their great cost-benefit potential. 4 refs., 4 figs
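
    The conflict-removal step described above can be pictured with a deliberately small sketch: activities declare the resource they need, and the scheduler delays whichever activity would collide with one already holding that resource. This is only an illustration of the general idea, not the EPRI scheduler itself; all names and data are invented.

        from dataclasses import dataclass

        @dataclass
        class Activity:
            name: str
            start: int       # planned start (hours into the outage)
            duration: int
            resource: str    # single required resource, e.g., a crane crew

        def remove_conflicts(activities):
            """Greedily delay activities that contend for the same resource."""
            by_resource = {}
            for act in sorted(activities, key=lambda a: a.start):
                free_at = by_resource.get(act.resource, 0)
                if act.start < free_at:
                    act.start = free_at          # push the activity past the conflict
                by_resource[act.resource] = act.start + act.duration
            return activities

        plan = [Activity("inspect pump", 0, 4, "crane"),
                Activity("move vessel head", 2, 3, "crane"),
                Activity("test valve", 1, 2, "instrument team")]
        for a in remove_conflicts(plan):
            print(a)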

  10. Multiuser switched diversity scheduling schemes

    KAUST Repository

    Shaqfeh, Mohammad; Alnuweiri, Hussein M.; Alouini, Mohamed-Slim

    2012-01-01

    Multiuser switched-diversity scheduling schemes were recently proposed in order to overcome the heavy feedback requirements of conventional opportunistic scheduling schemes by applying a threshold-based, distributed, and ordered scheduling mechanism. The main idea behind these schemes is that slight reduction in the prospected multiuser diversity gains is an acceptable trade-off for great savings in terms of required channel-state-information feedback messages. In this work, we characterize the achievable rate region of multiuser switched diversity systems and compare it with the rate region of full feedback multiuser diversity systems. We propose also a novel proportional fair multiuser switched-based scheduling scheme and we demonstrate that it can be optimized using a practical and distributed method to obtain the feedback thresholds. We finally demonstrate by numerical examples that switched-diversity scheduling schemes operate within 0.3 bits/sec/Hz from the ultimate network capacity of full feedback systems in Rayleigh fading conditions. © 2012 IEEE.
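
    A small Monte Carlo sketch can make the threshold mechanism concrete: users are probed one at a time, the first one whose SNR clears the threshold is scheduled, and the number of probes stands in for the feedback load. The threshold value and the i.i.d. Rayleigh setup are illustrative assumptions, not the optimized design from the paper.

        import math
        import random

        def switched_scheduler(snrs, threshold):
            """Probe users in order; schedule the first acceptable one.

            Returns the scheduled user's SNR and the number of feedback
            messages consumed. If nobody passes, the last probed user is kept.
            """
            for probes, snr in enumerate(snrs, start=1):
                if snr >= threshold:
                    return snr, probes
            return snrs[-1], len(snrs)

        random.seed(0)
        users, threshold, trials = 8, 1.2, 20000
        rate = feedback = 0.0
        for _ in range(trials):
            # Rayleigh fading gives exponentially distributed SNR
            snrs = [random.expovariate(1.0) for _ in range(users)]
            snr, probes = switched_scheduler(snrs, threshold)
            rate += math.log2(1 + snr)
            feedback += probes
        print(f"avg rate {rate / trials:.2f} bit/s/Hz, avg feedback {feedback / trials:.1f} messages")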

  11. Multiuser switched diversity scheduling schemes

    KAUST Repository

    Shaqfeh, Mohammad

    2012-09-01

    Multiuser switched-diversity scheduling schemes were recently proposed in order to overcome the heavy feedback requirements of conventional opportunistic scheduling schemes by applying a threshold-based, distributed, and ordered scheduling mechanism. The main idea behind these schemes is that slight reduction in the prospected multiuser diversity gains is an acceptable trade-off for great savings in terms of required channel-state-information feedback messages. In this work, we characterize the achievable rate region of multiuser switched diversity systems and compare it with the rate region of full feedback multiuser diversity systems. We propose also a novel proportional fair multiuser switched-based scheduling scheme and we demonstrate that it can be optimized using a practical and distributed method to obtain the feedback thresholds. We finally demonstrate by numerical examples that switched-diversity scheduling schemes operate within 0.3 bits/sec/Hz from the ultimate network capacity of full feedback systems in Rayleigh fading conditions. © 2012 IEEE.

  12. Environmental surveillance master sampling schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1991-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for the Surface Environmental Surveillance Project (SESP) and Ground-Water Monitoring Project. The routine sampling plan for the SESP has been revised this year to reflect changing site operations and priorities. Some sampling previously performed at least annually has been reduced in frequency, and some new sampling to be performed at a less than annual frequency has been added. Therefore, the SESP schedule reflects sampling to be conducted in calendar year 1991 as well as future years. The ground-water sampling schedule is for 1991. This schedule is subject to modification during the year in response to changes in Site operation, program requirements, and the nature of the observed results. Operational limitations such as weather, mechanical failures, sample availability, etc., may also require schedule modifications. Changes will be documented in the respective project files, but this plan will not be reissued. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs

  13. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L.E.

    1991-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for the Surface Environmental Surveillance Project (SESP) and Ground-Water Monitoring Project. The routine sampling plan for the SESP has been revised this year to reflect changing site operations and priorities. Some sampling previously performed at least annually has been reduced in frequency, and some new sampling to be performed at a less than annual frequency has been added. Therefore, the SESP schedule reflects sampling to be conducted in calendar year 1991 as well as future years. The ground-water sampling schedule is for 1991. This schedule is subject to modification during the year in response to changes in Site operation, program requirements, and the nature of the observed results. Operational limitations such as weather, mechanical failures, sample availability, etc., may also require schedule modifications. Changes will be documented in the respective project files, but this plan will not be reissued. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs.

  14. CBS (Constraint-Based Scheduling) as a Determining Factor in the Success of a Printing Company

    Directory of Open Access Journals (Sweden)

    Hendra Achmadi

    2010-06-01

    Full Text Available The printing industry today is highly competitive, ranging from small home-based shops to plants whose offset presses can print a hundred thousand copies per hour. Increasing competition demands shorter production times, from order entry and print proof through the production process to delivery to customers. When concurrent orders arrive, the PPIC department often struggles to set production schedules for orders with overlapping delivery times, and such orders frequently end up being refused because of scheduling difficulties, especially when they require the same offset machine and the same cylinder length while the number of cylinders is limited. A printing company should therefore be able to simulate production timing easily and implement the resulting schedule on the shop floor. CBS (Constraint-Based Scheduling) is a technique for scheduling production so that it can be carried out smoothly and quickly, fulfilling the promises made to customers. Several sequencing rules can be used in such scheduling: FCFS (First Come, First Served), EDD (Earliest Due Date), and LCLS (Last Come, Last Served). Better scheduling methods are thus required to obtain results quickly when schedules change fast.
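
    Two of the sequencing rules named above are easy to contrast on a toy single-press example: FCFS orders jobs by arrival, EDD by due date, and total tardiness measures how well each rule keeps delivery promises. The order data below are invented for illustration.

        orders = [  # (name, arrival_order, processing_hours, due_hour) - invented
            ("catalog", 0, 5, 9),
            ("flyer",   1, 2, 4),
            ("poster",  2, 3, 6),
        ]

        def total_tardiness(sequence):
            """Sum of lateness beyond each order's due time on one press."""
            clock, tardy = 0, 0
            for _name, _arrival, proc, due in sequence:
                clock += proc
                tardy += max(0, clock - due)
            return tardy

        fcfs = sorted(orders, key=lambda o: o[1])  # First Come, First Served
        edd = sorted(orders, key=lambda o: o[3])   # Earliest Due Date
        print("FCFS total tardiness:", total_tardiness(fcfs))
        print("EDD total tardiness:", total_tardiness(edd))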

  15. The combined removal of methyl mercaptan and hydrogen sulfide via an electro-reactor process using a low concentration of continuously regenerable Ag(II) active catalyst

    International Nuclear Information System (INIS)

    Muthuraman, Govindan; Chung, Sang Joon; Moon, Il Shik

    2011-01-01

    Highlights: → Simultaneous removal of H₂S and CH₃SH was achieved in an electro-reactor. → The active catalyst Ag(II) is perpetually regenerated in HNO₃ medium by an electrochemical cell. → CH₃SH destruction follows two reaction pathways. → An H₂S-induced destruction of CH₃SH has been identified. → A low concentration of active Ag(II) (12.5 × 10⁻⁴ mol L⁻¹) is enough for complete destruction. - Abstract: In this study, an electrocatalytic wet scrubbing process was developed for the simultaneous removal of synthetic odorous gases, namely methyl mercaptan (CH₃SH) and hydrogen sulfide (H₂S). The initial process consists of the absorption of CH₃SH and H₂S gases by an absorbing solution, followed by their mediated electrochemical oxidation using a low concentration of active Ag(II) in 6 M HNO₃. Experiments were conducted under different reaction conditions, such as CH₃SH and H₂S loadings, active Ag(II) concentrations, and molar flow rates. The cyclic voltammetry for the oxidation of CH₃SH corroborated the electro-reactor results, in that the silver in the 6 M HNO₃ reaction solution significantly influences the oxidation of CH₃SH. At a low active Ag(II) concentration of 0.0012 M, the CH₃SH removal experiments demonstrated that the CH₃SH degradation was steady, with 100% removal at a CH₃SH loading of 5 g m⁻³ h⁻¹. The electro-reactor and cyclic voltammetry results indicated that the removal of H₂S (100%) follows a mediated electrocatalytic oxidation reaction. The simultaneous removal of 100% of the CH₃SH and H₂S was achieved, even with a very low active Ag(II) concentration (0.0012 M), as a result of the high efficiency of the Ag(II). The parallel cyclic voltammetry results demonstrated that the process of simultaneous destruction of both CH₃SH and H₂S follows an H₂S-influenced mediated electrocatalytic oxidation. The use of a very low concentration of the Ag(II) mediator during the electro-reactor process is promising for the complete

  16. Process-Hardened, Multi-Analyte Sensor for Characterizing Rocket Plum Constituents Under Test Environment, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the Phase II STTR project is to develop a prototype multi-analyte sensor system to detect gaseous analytes present in the test stands during...

  17. Trivalent Chromium Process (TCP) as a Sealer for MIL-A-8625F Type II, IIB, and IC Anodic Coatings

    National Research Council Canada - National Science Library

    Matzdorf, Craig; Beck, Erin; Hilgeman, Amy; Prado, Ruben

    2008-01-01

    This report documents evaluations of trivalent chromium compositions (TCP) as sealers for MIL-A-8625F Type II, IIB, and IC anodic coatings conducted from March 2001 through December 2007 by Materials Engineering...

  18. Schedulability Analysis for Java Finalizers

    DEFF Research Database (Denmark)

    Bøgholm, Thomas; Hansen, Rene Rydhof; Søndergaard, Hans

    2010-01-01

    Java finalizers perform clean-up and finalisation of objects at garbage collection time. In real-time Java profiles the use of finalizers is either discouraged (RTSJ, Ravenscar Java) or even disallowed (JSR-302), mainly because of the unpredictability of finalizers, in particular their impact on the schedulability analysis. In this paper we show that a controlled scoped memory model results in a structured and predictable execution of finalizers, more reminiscent of C++ destructors than Java finalizers. Furthermore, we incorporate finalizers into a (conservative) schedulability analysis for Predictable Java programs. Finally, we extend the SARTS tool for automated schedulability analysis of Java bytecode programs to handle finalizers in a fully automated way.

  19. Environmental surveillance master sampling schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1997-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest National Laboratory (PNNL) for the US Department of Energy (DOE). This document contains the planned 1997 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP) and Drinking Water Monitoring Project. In addition, Section 3.0, Biota, also reflects a rotating collection schedule identifying the year a specific sample is scheduled for collection. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs, as required in DOE Order 5400.1, General Environmental Protection Program, and DOE Order 5400.5, Radiation Protection of the Public and the Environment. The sampling methods will be the same as those described in the Environmental Monitoring Plan, US Department of Energy, Richland Operations Office, DOE/RL91-50, Rev. 1, US Department of Energy, Richland, Washington

  20. cobalt (ii), nickel (ii)

    African Journals Online (AJOL)

    DR. AMINU

    Department of Chemistry Bayero University, P. M. B. 3011, Kano, Nigeria. E-mail: hnuhu2000@yahoo.com. ABSTRACT. The manganese (II), cobalt (II), nickel (II) and .... water and common organic solvents, but are readily soluble in acetone. The molar conductance measurement [Table 3] of the complex compounds in.

  1. Endogenous scheduling preferences and congestion

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Small, Kenneth

    2010-01-01

    and leisure, but agglomeration economies at home and at work lead to scheduling preferences forming endogenously. Using bottleneck congestion technology, we obtain an equilibrium queuing pattern consistent with a general version of the Vickrey bottleneck model. However, the policy implications are different. Compared to the predictions of an analyst observing untolled equilibrium and taking scheduling preferences as exogenous, we find that both the optimal capacity and the marginal external cost of congestion have changed. The benefits of tolling are greater, and the optimal time-varying toll is different.

  2. Adapting planning and scheduling concepts to an engineering perspective: Key issues and successful techniques

    International Nuclear Information System (INIS)

    Finnegan, J.M.

    1986-01-01

    Traditional approaches to engineering planning are slanted toward the formats and interests of downstream implementation and do not always consider the form and criticality of the front-end engineering development process. These processes and scopes are less defined and more subjective than most construction and operations tasks, and they require flexible scheduling methods. This paper discusses the characteristics and requirements of engineering schedules, presents concepts for approaching planning in this field, and illustrates simple methods for developing and analyzing engineering plans and for evaluating schedule performance. Engineering plans are structured into a schedule hierarchy that delineates appropriate control and responsibilities and is governed by key evaluation and decision milestones. Schedule risk analysis considers the uncertainty of engineering tasks and critical resource constraints. Methods to evaluate schedule performance recognize that engineers and managers are responsible for adequate planning and forecasting, and for quality decisions, even if they cannot control all factors influencing schedule results

  3. Chemical Processing Department monthly report for April 1958

    Energy Technology Data Exchange (ETDEWEB)

    Warren, J.H.

    1958-05-21

    The separations plants operated on schedule, and Pu production exceeded commitment. UO{sub 3} production and shipments were also ahead of schedule. Purex operation under pseudo two-cycle conditions (elimination of HS and 1A columns, co-decontamination cycle concentrator HCP) was successful. Final U stream was 3{times} lower in Pu than ever before; {gamma} activity in recovered HNO{sub 3} was also low. Four of 6 special E metal batches were processed through Redox and analyzed. Boric acid is removed from solvent extraction process via aq waste. The filter in Task II hydrofluorinator was changed from carbon to Poroloy. Various modifications to equipment were made.

  4. Artificial intelligence approaches to astronomical observation scheduling

    Science.gov (United States)

    Johnston, Mark D.; Miller, Glenn

    1988-01-01

    Automated scheduling will play an increasing role in future ground- and space-based observatory operations. Due to the complexity of the problem, artificial intelligence technology currently offers the greatest potential for the development of scheduling tools with sufficient power and flexibility to handle realistic scheduling situations. Summarized here are the main features of the observatory scheduling problem, how artificial intelligence (AI) techniques can be applied, and recent progress in AI scheduling for Hubble Space Telescope.

  5. Luminescence and photothermally stimulated defects creation processes in PbWO{sub 4}:La{sup 3+}, Y{sup 3+} (PWO II) crystals

    Energy Technology Data Exchange (ETDEWEB)

    Auffray, E. [CERN, Geneva 23, Geneva (Switzerland); Korjik, M. [Institute for Nuclear Problems, 11 Bobruiskaya, 220020 Minsk (Belarus); Zazubovich, S., E-mail: svetlana.zazubovits@ut.ee [Institute of Physics, University of Tartu, Ravila 14 c, 50411 Tartu (Estonia)

    2015-12-15

    Photoluminescence and thermally stimulated luminescence (TSL) are studied for a PbWO{sub 4} crystal grown by the Czochralski method at Bogoroditsk Technical Chemical Plant, Russia, from a melt with precise tuning of the stoichiometry and co-doped with La{sup 3+} and Y{sup 3+} ions (the PWO II crystal). Photothermally stimulated processes of electron and hole center creation under selective UV irradiation of this crystal in the 3.5–5.0 eV energy range and the 85–205 K temperature range are clarified, and the optically created electron and hole centers are identified. The electrons in PWO II are mainly trapped at the (WO{sub 4}){sup 2−} groups located close to single La{sup 3+} and Y{sup 3+} ions, producing the electron {(WO{sub 4}){sup 3−}–La{sup 3+}} and {(WO{sub 4}){sup 3−}–Y{sup 3+}} centers. The holes are mainly trapped at the regular oxygen ions O{sup 2−} located close to La{sup 3+} and Y{sup 3+} ions associated with lead vacancies, producing the hole O{sup −}(I)-type centers. No evidence of single-vacancy-related centers has been observed in PWO II. The data obtained indicate that the excellent scintillation characteristics of the PWO II crystal can be explained by a negligible concentration of single (non-compensated) oxygen and lead vacancies as traps for electrons and holes, respectively. - Highlights: • Photoluminescence of the PbWO{sub 4}:La{sup 3+}, Y{sup 3+} (PWO II) crystal is investigated. • Creation of defects under UV irradiation of PWO II is studied by TSL. • The origin of the dominating electron and hole centers is ascertained. • The concentration of single-vacancy-related centers is found to be negligible. • The excellent scintillation characteristics of the PWO II crystal are explained.

  6. Residency Applicants Prefer Online System for Scheduling Interviews

    Directory of Open Access Journals (Sweden)

    Wills, Charlotte

    2015-03-01

    Full Text Available Introduction: Residency coordinators may be overwhelmed when scheduling residency interviews. Applicants often have to coordinate interviews with multiple programs at once, and relying on verbal or email confirmation may delay the process. Our objective was to determine applicant mean time to schedule and satisfaction using online scheduling. Methods: This pilot study is a retrospective analysis performed on a sample of applicants offered interviews at an urban county emergency medicine residency. Applicants were asked their estimated time to schedule with the online system compared to their average time using other methods. In addition, they were asked on a five-point anchored scale to rate their satisfaction. Results: Of 171 applicants, 121 completed the survey (70.8%). Applicants were scheduling an average of 13.3 interviews. Applicants reported scheduling interviews using the online system in a mean of 46.2 minutes (median 10, range 1-1800) from the interview offer, as compared with a mean of 320.2 minutes (median 60, range 3-2880) for other programs not using this system. This difference was statistically significant. In addition, applicants were more likely to rate their satisfaction using the online system as “satisfied” (83.5% vs 16.5%). Applicants were also more likely to state that they preferred scheduling their interviews using the online system rather than the way other programs scheduled interviews (74.2% vs 4.1%) and that the online system aided them in coordinating travel arrangements (52.1% vs 4.1%). Conclusion: An online interview scheduling system is associated with higher satisfaction among applicants, both in coordinating travel arrangements and in overall satisfaction. [West J Emerg Med. 2015;16(2):352-354.]

  7. Schedules for Regulatory Regimes

    International Nuclear Information System (INIS)

    Austvik, Ole Gunnar

    2003-01-01

    The idea of regulating transporters' terms of operations is that if the market itself does not produce optimal outcomes, then it can be mimicked to do so through regulatory and other public instruments. The first-best solution could be a subsidized (publicly owned) enterprise that sets tariffs according to marginal costs. This has been the tradition in many European countries in the aftermath of WW2. Due to a lack of innovative pressure on, and x-inefficiency in, these companies, this solution is today viewed as inferior to the system of regulating independent (privately owned) firms. When the European gas market becomes liberalized, part of the process in many countries is to (partially) privatise the transport utilities. Privatised or not, in a liberalized market the transport utilities should face an independent authority that oversees their operations not only in technical but also in economic terms. Under regulation, a ''visible hand'' is introduced to correct the imperfect market's ''invisible hand''. By regulating the framework and conditions for how firms may operate, public authorities seek to achieve what is considered optimal for society. The incentives and disincentives given for pricing and production should create mechanisms leading to an efficient allocation of resources and an ''acceptable'' distribution of income. As part of intervening in firms' behavior, regulation may be introduced to direct the firm to behave in certain ways. The framework and regulatory mechanisms for the market must then be constructed in a way that companies voluntarily produce an amount at a price that gives maximal profits and simultaneously satisfies social goals. The regulations should lead to consistency between the company's desire to maximize profits and society's desire to maximize welfare, as in a perfectly competitive market. This is the core of regulatory economics

  8. Incorporation of Tropical Cyclone Avoidance Into Automated Ship Scheduling

    Science.gov (United States)

    2014-06-01

    illustrated during World War II (Drury & Clavin, 2007), when Admiral Frederick “Bull” Halsey was maneuvering his Third Fleet and trying to refuel, while...or damaged beyond repair (Drury & Clavin, 2007). While this is an extreme historical case, it illustrates the dangers of not considering TC tracks...replenishment schedule that takes into account the supply levels of all the ships and maintains the supplies above required levels. With the proper inputs

  9. An Automatic Course Scheduling Approach Using Instructors' Preferences

    Directory of Open Access Journals (Sweden)

    Hossam Faris

    2012-03-01

    Full Text Available The university course timetabling (UCT) problem has been extensively researched in the last decade, and numerous approaches have been proposed to solve it. This paper proposes a new approach that assigns a sequence of meetings between instructors, rooms, and students to predefined periods of time while satisfying a set of constraints of various types. In addition, the paper proposes a new representation for course timetables that keeps each time slot conflict-free and mines instructor preferences from previous schedules in order to avoid times that instructors find undesirable. Experiments on real data showed that the approach increases the satisfaction degree of each instructor and yields feasible schedules that satisfy all hard constraints during construction. The generated schedules have higher satisfaction degrees than schedules created manually. The experiments were conducted on data gathered from the computer science department and other related departments at Jordan University of Science and Technology, Jordan.
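
    The preference-mining idea can be sketched as a greedy assignment: each course is placed in the feasible time slot with the lowest penalty, where penalties stand in for undesirability scores mined from past schedules. Everything below (slots, instructors, penalty values, the single-room assumption) is hypothetical and only illustrates the flavor of the approach.

        # Greedy, preference-aware slot assignment; one room is assumed, so at
        # most one course per slot. All data below are hypothetical.
        SLOTS = ["Mon9", "Mon11", "Tue9", "Tue11"]
        courses = {"algorithms": "alice", "databases": "bob", "networks": "alice"}
        penalty = {  # lower is better; high values mark mined undesirable times
            ("alice", "Mon9"): 0, ("alice", "Mon11"): 5,
            ("alice", "Tue9"): 1, ("alice", "Tue11"): 5,
            ("bob", "Mon9"): 2, ("bob", "Mon11"): 0,
            ("bob", "Tue9"): 4, ("bob", "Tue11"): 1,
        }

        def build_timetable():
            taken = {}    # slot -> course (keeps each slot conflict-free)
            busy = set()  # (instructor, slot) pairs already committed
            for course, teacher in courses.items():
                options = [s for s in SLOTS
                           if s not in taken and (teacher, s) not in busy]
                slot = min(options, key=lambda s: penalty[(teacher, s)])
                taken[slot] = course
                busy.add((teacher, slot))
            return taken

        print(build_timetable())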

  10. Microcomputer-based workforce scheduling for hospital porters.

    Science.gov (United States)

    Lin, C K

    1999-01-01

    This paper focuses on labour scheduling for hospital porters, who are the major workforce providing routine cleansing of wards, transportation, and messenger services. Generating an equitable monthly roster for porters while meeting the daily minimum demand is a tedious task when scheduled manually by a supervisor. Considering the variety of constraints and goals, a manual schedule was usually produced in seven to ten days. In line with the strategic goal of scientific management at an acute care regional hospital in Hong Kong, a microcomputer-based algorithm was developed to schedule the monthly roster. The algorithm, coded in Digital Visual Fortran 5.0 Professional, could generate a monthly roster in seconds. Implementation has been carried out since September 1998, and the results proved useful to hospital administrators and porters. This paper discusses both the technical and human issues involved during the computerization process.
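
    A minimal sketch of the rostering idea, assuming a fixed porter pool and a daily minimum staffing level: rest days are rotated so they are spread equitably while every day stays at or above the minimum. The parameters are invented, and the real algorithm handles many more constraints.

        # Rotate rest days over a fixed porter pool while keeping every day at
        # or above a minimum staffing level. Parameters are invented.
        PORTERS = [f"porter_{i}" for i in range(10)]
        DAYS, MIN_ON_DUTY = 28, 8

        def monthly_roster():
            roster = {}
            off_per_day = len(PORTERS) - MIN_ON_DUTY
            for day in range(DAYS):
                resting = {PORTERS[(day + k) % len(PORTERS)]
                           for k in range(off_per_day)}  # equitable rotation
                roster[day] = [p for p in PORTERS if p not in resting]
            return roster

        roster = monthly_roster()
        assert all(len(on_duty) >= MIN_ON_DUTY for on_duty in roster.values())
        print("day 0 on duty:", roster[0])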

  11. SIMULTANEOUS SCHEDULING AND OPERATIONAL OPTIMIZATION OF MULTIPRODUCT, CYCLIC CONTINUOUS PLANTS

    Directory of Open Access Journals (Sweden)

    A. Alle

    2002-03-01

    Full Text Available The problems of scheduling and optimization of operational conditions in multistage, multiproduct continuous plants with intermediate storage are addressed simultaneously. An MINLP model, called TSPFLOW, which is based on the TSP formulation for product sequencing, is proposed to schedule the operation of such plants. TSPFLOW yields a one-order-of-magnitude reduction in CPU time as well as the solution of instances larger than those formerly reported (Pinto and Grossmann, 1994). Secondly, processing rates and yields are introduced as additional optimization variables in order to pose the simultaneous problem of scheduling with operational optimization. Results show that the trade-offs are very complex and that developing a straightforward (rule-of-thumb) method to optimally schedule the operation is less effective than the proposed approach.
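
    The TSP view of product sequencing can be shown on a toy instance: given a sequence-dependent changeover matrix, the best cyclic production order is the one minimizing total changeover time. The brute-force search below only illustrates the formulation's core idea, not the TSPFLOW MINLP model; all numbers are invented.

        from itertools import permutations

        # Pick the cyclic product order minimizing total changeover time.
        # The changeover matrix (hours) is invented.
        products = ("A", "B", "C", "D")
        changeover = {
            ("A", "B"): 1, ("A", "C"): 4, ("A", "D"): 2,
            ("B", "A"): 2, ("B", "C"): 1, ("B", "D"): 3,
            ("C", "A"): 3, ("C", "B"): 2, ("C", "D"): 1,
            ("D", "A"): 1, ("D", "B"): 4, ("D", "C"): 2,
        }

        def cycle_cost(order):
            # cyclic schedule: the last product changes over back to the first
            return sum(changeover[pair]
                       for pair in zip(order, order[1:] + order[:1]))

        best = min(permutations(products), key=cycle_cost)
        print(best, cycle_cost(best))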

  12. SIMULTANEOUS SCHEDULING AND OPERATIONAL OPTIMIZATION OF MULTIPRODUCT, CYCLIC CONTINUOUS PLANTS

    Directory of Open Access Journals (Sweden)

    Alle A.

    2002-01-01

    Full Text Available The problems of scheduling and optimization of operational conditions in multistage, multiproduct continuous plants with intermediate storage are addressed simultaneously. An MINLP model, called TSPFLOW, which is based on the TSP formulation for product sequencing, is proposed to schedule the operation of such plants. TSPFLOW yields a one-order-of-magnitude reduction in CPU time as well as the solution of instances larger than those formerly reported (Pinto and Grossmann, 1994). Secondly, processing rates and yields are introduced as additional optimization variables in order to pose the simultaneous problem of scheduling with operational optimization. Results show that the trade-offs are very complex and that developing a straightforward (rule-of-thumb) method to optimally schedule the operation is less effective than the proposed approach.

  13. Distributed continuous energy scheduling for dynamic virtual power plants

    International Nuclear Information System (INIS)

    Niesse, Astrid

    2015-01-01

    This thesis presents DynaSCOPE, a distributed control method for continuous energy scheduling in dynamic virtual power plants (DVPPs). DVPPs aggregate the flexibility of distributed energy units to address current energy markets. As an extension of the virtual power plant concept, they show high dynamics in the aggregation and operation of energy units. Whereas operation schedules are set up for all energy units in a day-ahead planning procedure, incidents such as deviations from forecasts or outages may render these schedules infeasible during execution. Thus, a continuous scheduling process is needed to ensure product fulfillment. With DynaSCOPE, software agents representing single energy units solve this problem in a completely distributed heuristic approach. Using a stepped concept, several damping mechanisms are applied to allow minimum disturbance while continuously trying to fulfill the product as contracted at the market.
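
    The continuous-scheduling idea can be caricatured in a few lines: when one unit deviates from its day-ahead schedule, the remaining units absorb the shortfall within their headroom so the contracted product stays on target. This sketch is centralized and greedy for brevity, unlike DynaSCOPE's distributed agent approach, and all unit data are invented.

        # When one unit underperforms, spread the shortfall over the other
        # units' headroom to keep the contracted product on target.
        # All unit data are invented.
        units = {"chp_1": 40.0, "wind_1": 35.0, "battery_1": 25.0}   # planned MW
        limits = {"chp_1": 50.0, "wind_1": 35.0, "battery_1": 40.0}  # max MW

        def rebalance(schedule, failed_unit, actual_output):
            shortfall = schedule[failed_unit] - actual_output
            schedule[failed_unit] = actual_output
            for name in schedule:
                if name == failed_unit or shortfall <= 0:
                    continue
                take = min(limits[name] - schedule[name], shortfall)
                schedule[name] += take
                shortfall -= take
            return schedule, shortfall  # >0 means the product is still at risk

        print(rebalance(dict(units), "wind_1", 20.0))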

  14. Adaptive scheduling with postexamining user selection under nonidentical fading

    KAUST Repository

    Gaaloul, Fakhreddine

    2012-11-01

    This paper investigates an adaptive scheduling algorithm for multiuser environments with statistically independent but nonidentically distributed (i.n.d.) channel conditions. The algorithm aims to reduce feedback load by sequentially and arbitrarily examining the user channels. It also provides improved performance by realizing postexamining best user selection. The first part of the paper presents new formulations for the statistics of the signal-to-noise ratio (SNR) of the scheduled user under i.n.d. channel conditions. The second part capitalizes on the findings in the first part and presents various performance and processing complexity measures for adaptive discrete-time transmission. The results are then extended to investigate the effect of outdated channel estimates on the statistics of the scheduled user SNR, as well as some performance measures. Numerical results are provided to clarify the usefulness of the scheduling algorithm under perfect or outdated channel estimates. © 1967-2012 IEEE.
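
    The post-examining selection rule can be illustrated with a short simulation: users with nonidentical average SNRs are probed in order, the first user above its threshold is scheduled immediately, and only if nobody qualifies is the best of the examined users chosen. The mean SNRs and thresholds below are illustrative assumptions, not values from the paper.

        import math
        import random

        def postexamining_select(snrs, thresholds):
            """Probe users in order; stop early at the first acceptable one,
            otherwise fall back to the best user seen while probing."""
            best = 0
            for i, (snr, th) in enumerate(zip(snrs, thresholds)):
                if snr >= th:
                    return i                 # early stop: acceptable channel
                if snr > snrs[best]:
                    best = i
            return best                      # post-examining best selection

        random.seed(1)
        means = [0.5, 1.0, 1.5, 2.0]         # nonidentical average SNRs (i.n.d.)
        thresholds = [1.0] * len(means)
        total = 0.0
        for _ in range(20000):
            snrs = [random.expovariate(1.0 / m) for m in means]
            total += math.log2(1 + snrs[postexamining_select(snrs, thresholds)])
        print(f"average scheduled rate: {total / 20000:.2f} bit/s/Hz")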

  15. A Review Of Fault Tolerant Scheduling In Multicore Systems

    Directory of Open Access Journals (Sweden)

    Shefali Malhotra

    2015-05-01

    Full Text Available Abstract In this paper we discuss various fault-tolerant task scheduling algorithms for multi-core systems, based on hardware and software. The hardware-based algorithm is a blend of Triple Modular Redundancy and Double Modular Redundancy, in which the Architectural Vulnerability Factor is considered in the scheduling decisions in addition to the EDF and LLF scheduling algorithms. In most real-time systems the dominant part is shared memory. A low-overhead software-based fault-tolerance approach can be implemented at user-space level so that it does not require any changes at the application level. Here, redundant multi-threaded processes are used; with these processes we can detect soft errors and recover from them. This method gives a low-overhead, fast error detection and recovery mechanism. The overhead incurred by this method ranges from 0 to 18 for the selected benchmarks. The hybrid scheduling method is another scheduling approach for real-time systems. Dynamic fault-tolerant scheduling gives a high feasibility rate, whereas task criticality is used to select the type of fault-recovery method in order to tolerate the maximum number of faults.
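
    As a toy illustration of two ingredients mentioned above, the sketch below dispatches tasks in EDF (Earliest Deadline First) order while running each task twice, a stand-in for DMR-style duplicate execution used for error detection. Task parameters are invented and the policy is deliberately naive.

        import heapq

        tasks = [  # (release, deadline, execution_time, name) - invented
            (0, 10, 3, "sensor"),
            (1, 5, 2, "control"),
            (2, 8, 2, "logger"),
        ]

        def edf_with_replicas(tasks, replicas=2):
            """Dispatch by earliest deadline; run every task `replicas` times
            so the copies' results could be compared for soft-error detection."""
            ready = []
            for release, deadline, wcet, name in tasks:
                for r in range(replicas):
                    heapq.heappush(ready, (deadline, release, wcet, f"{name}#{r}"))
            clock, trace = 0, []
            while ready:
                deadline, release, wcet, name = heapq.heappop(ready)
                clock = max(clock, release) + wcet
                trace.append((name, clock, clock <= deadline))
            return trace

        for name, finish, met in edf_with_replicas(tasks):
            print(f"{name}: finish={finish}, deadline met={met}")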

  16. Lifetime Improvement by Battery Scheduling

    NARCIS (Netherlands)

    Jongerden, M.R.; Schmitt, Jens B.; Haverkort, Boudewijn R.H.M.

    The use of mobile devices is often limited by the lifetime of their batteries. For devices that have multiple batteries or that have the option to connect an extra battery, battery scheduling, thereby exploiting the recovery properties of the batteries, can help to extend the system lifetime. Due to

  17. Lifetime improvement by battery scheduling

    NARCIS (Netherlands)

    Jongerden, M.R.; Haverkort, Boudewijn R.H.M.

    The use of mobile devices is often limited by the lifetime of its battery. For devices that have multiple batteries or that have the option to connect an extra battery, battery scheduling, thereby exploiting the recovery properties of the batteries, can help to extend the system lifetime. Due to the

  18. Interaction in activity location scheduling

    NARCIS (Netherlands)

    Tabak, V.; Vries, de B.; Dijkstra, J.; Jessurun, A.J.

    2006-01-01

    In this paper we discuss the interaction in activity location scheduling which is the main subject of an ongoing research project called "User Simulation of Space Utilization". The aim of this research project is to develop an overall model for the simulation of human movement and utilization of

  19. Scheduling with target start times

    NARCIS (Netherlands)

    Hoogeveen, J.A.; Velde, van de S.L.; Klein Haneveld, W.K.; Vrieze, O.J.; Kallenberg, L.C.M.

    1997-01-01

    We address the single-machine problem of scheduling n independent jobs subject to target start times. Target start times are essentially release times that may be violated at a certain cost. The goal is to minimize an objective function that is composed of total completion time and maximum

  20. Flexible Schedules and Shift Work.

    Science.gov (United States)

    Beers, Thomas M.

    2000-01-01

    Flexible work hours have gained prominence, as more than 25 million workers (27.6% of all full-time workers) can now vary their schedules. However, there has been little change since the mid-1980s in the proportion who work a shift other than a regular daytime shift. (JOW)