WorldWideScience

Sample records for management optimization code

  1. Code Optimization in FORM

    CERN Document Server

    Kuipers, J; Vermaseren, J A M

    2013-01-01

    We describe the implementation of output code optimization in the open source computer algebra system FORM. This implementation is based on recently discovered techniques of Monte Carlo tree search to find efficient multivariate Horner schemes, in combination with other optimization algorithms, such as common subexpression elimination. For systems for which no specific knowledge is provided it performs significantly better than other methods we could compare with. Because the method has a number of free parameters, we also show some methods by which to tune them to different types of problems.
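The two techniques named in this abstract can be illustrated with a minimal sketch (this is not FORM's actual implementation, just the underlying idea):

```python
# Illustrative sketch only (not FORM's implementation): a Horner scheme
# cuts the multiplication count of polynomial evaluation, which is one of
# the effects the FORM output optimizer aims for.

def eval_naive(coeffs, x):
    """Evaluate sum(c_i * x**i) term by term (many repeated multiplications)."""
    return sum(c * x**i for i, c in enumerate(coeffs))

def eval_horner(coeffs, x):
    """Horner scheme: n multiplications and n additions for degree n."""
    acc = 0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

# Common subexpression elimination is complementary: in
# f = (x*y + 1) * (x*y - 1), computing t = x*y once removes a duplicated
# multiplication that Horner rewriting alone would not catch.

coeffs = [5, -3, 0, 2, 7]  # 5 - 3x + 2x^3 + 7x^4
assert eval_naive(coeffs, 2) == eval_horner(coeffs, 2) == 127
```

In the multivariate case the gain depends on the order in which variables are pulled out, which is the search space the paper's Monte Carlo tree search explores.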

  2. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

In this article we study a class of graph codes with cyclic component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe... the codes succinctly using Gröbner bases....

  3. [Non elective cesarean section: use of a color code to optimize management of obstetric emergencies].

    Science.gov (United States)

    Rudigoz, René-Charles; Huissoud, Cyril; Delecour, Lisa; Thevenet, Simone; Dupont, Corinne

    2014-06-01

The medical team of the Croix Rousse teaching hospital maternity unit has developed, over the last ten years, a set of procedures designed to respond to various emergency situations necessitating Caesarean section. Using the Lucas classification, we have defined as precisely as possible the degree of urgency of Caesarean sections. We have established specific protocols for the implementation of urgent and very urgent Caesarean sections and have chosen a simple means to convey the degree of urgency to all team members, namely a color code system (red, orange and green). We have set time goals from decision to delivery: 15 minutes for the red code and 30 minutes for the orange code. The results seem very positive: the frequency of urgent and very urgent Caesareans has fallen over time, from 6.1% to 1.6% in 2013. The average time from decision to delivery is 11 minutes for code red Caesareans and 21 minutes for code orange Caesareans. These time goals are now achieved in 95% of cases. Organizational and anesthetic difficulties are the main causes of delays. The indications for red and orange code Caesareans are appropriate more than two times out of three. Perinatal outcomes are generally favorable, code red Caesareans being life-saving in 15% of cases. No increase in maternal complications has been observed. In sum, each obstetric department should have its own protocols for handling urgent and very urgent Caesarean sections. Continuous monitoring of their implementation, relevance and results should be conducted. Management of extreme urgency must be integrated into the management of patients with identified risks (scarred uterus and twin pregnancies, for example), and also in structures without medical facilities (birthing centers). Obstetric teams must keep in mind that implementation of these protocols in no way dispenses with close monitoring of labour.

  4. Modular optimization code package: MOZAIK

    Science.gov (United States)

    Bekar, Kursat B.

    This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence including three primary independent modules: the initializer, the physics and the optimizer, each having a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements of accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. A distributed memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the

  5. Optimal Organizational Hierarchies: Source Coding: Disaster Relief

    CERN Document Server

    Murthy, G Rama

    2011-01-01

Multicasting is an important communication paradigm for enabling the selective dissemination of information. This paper considers the problem of optimal secure multicasting in a communication network captured through a graph (optimal in an interesting sense) and provides a doubly optimal solution using results from source coding. It is realized that the solution leads to optimal design (in a well-defined optimality sense) of organizational hierarchies captured through a graph. In this effort two novel concepts are introduced: the prefix-free path and graph entropy. Some results on graph entropy are provided, and some results on the Kraft inequality are discussed. As an application, the Hierarchical Hybrid Communication Network is utilized as a model of a structured mobile ad hoc network for utility in disaster management. Several new research problems that naturally emanate from this research are summarized.

  6. Manual and Fast C Code Optimization

    Directory of Open Access Journals (Sweden)

    Mohammed Fadle Abdulla

    2010-01-01

Full Text Available Developing a high-performance application through code optimization places a great responsibility on the programmer. While most existing compilers attempt to optimize program code automatically, manual techniques remain the predominant method for performing optimization. Deciding where to optimize code is difficult, especially for large, complex applications. For manual optimization, programmers can draw on their experience in writing the code and then use a software profiler to collect and analyze performance data from it. In this work, we have gathered the practices most useful for improving the style of programs written in the C language, and we present an implementation of manual code optimization using the Intel VTune profiler. The paper includes two case studies to illustrate our optimization of the Heap Sort and Factorial functions.

  7. Optimal probabilistic dense coding schemes

    Science.gov (United States)

    Kögler, Roger A.; Neves, Leonardo

    2017-04-01

Dense coding with non-maximally entangled states has been investigated in many different scenarios. We revisit this problem for protocols adopting the standard encoding scheme. In this case, the set of possible classical messages cannot be perfectly distinguished due to the non-orthogonality of the quantum states carrying them. So far, the decoding process has been approached in two ways: (i) The message is always inferred, but with an associated (minimum) error; (ii) the message is inferred without error, but only sometimes; in case of failure, nothing else is done. Here, we generalize on these approaches and propose novel optimal probabilistic decoding schemes. The first uses quantum-state separation to increase the distinguishability of the messages with an optimal success probability. This scheme is shown to include (i) and (ii) as special cases and continuously interpolate between them, which enables the decoder to trade-off between the level of confidence desired to identify the received messages and the success probability for doing so. The second scheme, called multistage decoding, applies only for qudits (d-level quantum systems with d > 2) and consists of further attempts in the state identification process in case of failure in the first one. We show that this scheme is advantageous over (ii) as it increases the mutual information between the sender and receiver.

  8. Zip Code Manager

    Data.gov (United States)

    Office of Personnel Management — The system used to associate what Federal Employees Health Benefits Program (FEHBP) and Federal Employees Dental/Vision Program (FEDVIP) health, dental, and vision...

  9. Optimal Codes for the Burst Erasure Channel

    Science.gov (United States)

    Hamkins, Jon

    2010-01-01

    Deep space communications over noisy channels lead to certain packets that are not decodable. These packets leave gaps, or bursts of erasures, in the data stream. Burst erasure correcting codes overcome this problem. These are forward erasure correcting codes that allow one to recover the missing gaps of data. Much of the recent work on this topic concentrated on Low-Density Parity-Check (LDPC) codes. These are more complicated to encode and decode than Single Parity Check (SPC) codes or Reed-Solomon (RS) codes, and so far have not been able to achieve the theoretical limit for burst erasure protection. A block interleaved maximum distance separable (MDS) code (e.g., an SPC or RS code) offers near-optimal burst erasure protection, in the sense that no other scheme of equal total transmission length and code rate could improve the guaranteed correctible burst erasure length by more than one symbol. The optimality does not depend on the length of the code, i.e., a short MDS code block interleaved to a given length would perform as well as a longer MDS code interleaved to the same overall length. As a result, this approach offers lower decoding complexity with better burst erasure protection compared to other recent designs for the burst erasure channel (e.g., LDPC codes). A limitation of the design is its lack of robustness to channels that have impairments other than burst erasures (e.g., additive white Gaussian noise), making its application best suited for correcting data erasures in layers above the physical layer. The efficiency of a burst erasure code is the length of its burst erasure correction capability divided by the theoretical upper limit on this length. The inefficiency is one minus the efficiency. 
The illustration compares the inefficiency of interleaved RS codes to Quasi-Cyclic (QC) LDPC codes, Euclidean Geometry (EG) LDPC codes, extended Irregular Repeat Accumulate (eIRA) codes, array codes, and random LDPC codes previously proposed for burst erasure
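The block-interleaving idea described above can be sketched as follows. This is a toy illustration, not the design from the record: it uses single-parity-check rows over XOR and a hand-picked burst, but it shows why a burst no longer than the interleaving depth leaves each row with at most one erasure, which the row parity can then repair.

```python
# Toy sketch of block interleaving for burst erasures (illustrative, not
# the cited design). Rows of a single-parity-check (SPC) code are sent
# column by column; a burst of up to nrows consecutive erasures then hits
# each row at most once, so each row repairs its one missing symbol.

from functools import reduce

def spc_encode(row):
    """Append an XOR parity symbol so any single erasure is recoverable."""
    return row + [reduce(lambda a, b: a ^ b, row)]

def interleave(rows):
    """Read the code array column-wise into one transmission stream."""
    return [rows[r][c] for c in range(len(rows[0])) for r in range(len(rows))]

def deinterleave(stream, nrows):
    ncols = len(stream) // nrows
    return [[stream[c * nrows + r] for c in range(ncols)] for r in range(nrows)]

def repair(row):
    """Fill in the single erased symbol (None) using the parity constraint."""
    i = row.index(None)
    row[i] = reduce(lambda a, b: a ^ b,
                    [s for j, s in enumerate(row) if j != i])
    return row

rows = [spc_encode(r) for r in ([1, 0, 1], [0, 1, 1], [1, 1, 0], [0, 0, 1])]
stream = interleave(rows)
for i in range(3, 7):          # burst of 4 consecutive erasures (= nrows)
    stream[i] = None
recovered = [repair(r) for r in deinterleave(stream, 4)]
assert [r[:3] for r in recovered] == [[1, 0, 1], [0, 1, 1], [1, 1, 0], [0, 0, 1]]
```

With a stronger MDS row code (e.g., Reed-Solomon) in place of SPC, the same interleaving tolerates proportionally longer bursts, which is the near-optimality argument made above.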

  10. Optimal interference code based on machine learning

    Science.gov (United States)

    Qian, Ye; Chen, Qian; Hu, Xiaobo; Cao, Ercong; Qian, Weixian; Gu, Guohua

    2016-10-01

In this paper, we analyze the characteristics of pseudo-random codes, taking the m-sequence as a case study. Drawing on coding theory, we introduce jamming methods and simulate the interference effect and its probability model in MATLAB. Based on the length of decoding time the adversary spends, we find the optimal formula and optimal coefficients by machine learning, and thus obtain a new optimal interference code. First, in the recognition phase, this study judges the effect of interference by simulating the decoding time of the laser seeker. Then, we use laser active deception jamming to simulate the interference process in the tracking phase. To improve the performance of the interference, this paper simulates the model in MATLAB. We find the least number of pulse intervals that must be received, from which we can determine the precise interval number of the laser pointer for m-sequence encoding. To find the shortest interval, we use the greatest-common-divisor method. Then, combining this with the coding regularity found earlier, we restore the pulse intervals of the pseudo-random code already received. Finally, we can control the time period of laser interference, obtain the optimal interference code, and also increase the probability of interference.

  11. Building Reusable Software Component For Optimization Check in ABAP Coding

    CERN Document Server

    Shireesha, P; 10.5121/ijsea.2010.1303

    2010-01-01

Software component reuse is the software engineering practice of developing new software products from existing components. A reuse library or component reuse repository organizes, stores, and manages reusable components. This paper describes how a reusable component is created, how it reuses functions, and how to check whether optimized code is being used in building programs and applications. Finally, it provides coding guidelines, standards, and best practices for creating reusable components, and guidelines and best practices for making them configurable and easy to use.

  12. Building Reusable Software Component For Optimization Check in ABAP Coding

    OpenAIRE

    Shireesha, P.; S.S.V.N.Sharma

    2010-01-01

Software component reuse is the software engineering practice of developing new software products from existing components. A reuse library or component reuse repository organizes, stores, and manages reusable components. This paper describes how a reusable component is created, how it reuses functions, and how to check whether optimized code is being used in building programs and applications. Finally, it provides coding guidelines, standards, and best practices used for creating reusable components and ...

  13. Optimal patch code design via device characterization

    Science.gov (United States)

    Wu, Wencheng; Dalal, Edul N.

    2012-01-01

    In many color measurement applications, such as those for color calibration and profiling, "patch code" has been used successfully for job identification and automation to reduce operator errors. A patch code is similar to a barcode, but is intended primarily for use in measurement devices that cannot read barcodes due to limited spatial resolution, such as spectrophotometers. There is an inherent tradeoff between decoding robustness and the number of code levels available for encoding. Previous methods have attempted to address this tradeoff, but those solutions have been sub-optimal. In this paper, we propose a method to design optimal patch codes via device characterization. The tradeoff between decoding robustness and the number of available code levels is optimized in terms of printing and measurement efforts, and decoding robustness against noises from the printing and measurement devices. Effort is drastically reduced relative to previous methods because print-and-measure is minimized through modeling and the use of existing printer profiles. Decoding robustness is improved by distributing the code levels in CIE Lab space rather than in CMYK space.

  14. MMSE Optimal Algebraic Space-Time Codes

    CERN Document Server

    Rajan, G Susinder

    2007-01-01

Design of Space-Time Block Codes (STBCs) for Maximum Likelihood (ML) reception has been the predominant focus of researchers. However, the ML decoding complexity of STBCs becomes prohibitively large as the numbers of transmit and receive antennas increase. Hence it is natural to resort to a suboptimal reception technique such as the linear Minimum Mean Squared Error (MMSE) receiver. Barbarossa et al. and Liu et al. have independently derived necessary and sufficient conditions for a full-rate linear STBC to be MMSE optimal, i.e., to achieve the least Symbol Error Rate (SER). Motivated by this problem, certain existing high-rate STBC constructions from crossed-product algebras are identified to be MMSE optimal. Also, it is shown that a certain class of codes from cyclic division algebras, which are special cases of crossed-product algebras, are MMSE optimal. Hence, these STBCs achieve the least SER when MMSE reception is employed and are fully diverse when ML reception is employed.

  15. Repair Optimal Erasure Codes through Hadamard Designs

    CERN Document Server

    Papailiopoulos, Dimitris S; Cadambe, Viveck R

    2011-01-01

In distributed storage systems that employ erasure coding, the issue of minimizing the total communication required to exactly rebuild a storage node after a failure arises. This repair bandwidth depends on the structure of the storage code and the repair strategies used to restore the lost data. Designing high-rate maximum-distance separable (MDS) codes that achieve the optimum repair communication has been a well-known open problem. In this work, we use Hadamard matrices to construct the first explicit 2-parity MDS storage code with optimal repair properties for all single node failures, including the parities. Our construction relies on a novel method of achieving perfect interference alignment over finite fields with a finite file size, or number of extensions. We generalize this construction to design m-parity MDS codes that achieve the optimum repair communication for single systematic node failures and show that there is an interesting connection between our m-parity codes and the systematic-...

  16. On Codes for Optimal Rebuilding Access

    CERN Document Server

    Wang, Zhiying; Bruck, Jehoshua

    2011-01-01

    MDS (maximum distance separable) array codes are widely used in storage systems due to their computationally efficient encoding and decoding procedures. An MDS code with r redundancy nodes can correct any r erasures by accessing (reading) all the remaining information in both the systematic nodes and the parity (redundancy) nodes. However, in practice, a single erasure is the most likely failure event; hence, a natural question is how much information do we need to access in order to rebuild a single storage node? We define the rebuilding ratio as the fraction of remaining information accessed during the rebuilding of a single erasure. In our previous work we showed that the optimal rebuilding ratio of 1/r is achievable (using our newly constructed array codes) for the rebuilding of any systematic node, however, all the information needs to be accessed for the rebuilding of the parity nodes. Namely, constructing array codes with a rebuilding ratio of 1/r was left as an open problem. In this paper, we solve th...

  17. New code match strategy for wideband code division multiple access code tree management

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

Orthogonal variable spreading factor (OVSF) channelization codes are widely used to provide variable data rates for supporting different bandwidth requirements in wideband code division multiple access (WCDMA) systems. A new code match scheme for WCDMA code tree management was proposed. The code match scheme is similar to the existing crowded-first scheme. When choosing a code for a user, the code match scheme compares only the layer immediately above the allocated codes, whereas the crowded-first scheme may compare all higher layers. The operation of the code match scheme is therefore simple, and the average time delay is decreased by 5.1%. The simulation results also show that the code match strategy can decrease the average code blocking probability by 8.4%.
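The constraint that all such allocation schemes must respect can be sketched briefly. This is background to the record, not the proposed scheme itself: in an OVSF code tree, a code is usable only if no ancestor or descendant is already allocated, since those codes are not mutually orthogonal. Codes are labeled here by the usual (spreading factor, index) pairs, an assumed convention.

```python
# Sketch of the OVSF allocation constraint (background, not the paper's
# algorithm): a code is free only if neither an ancestor nor a descendant
# in the binary code tree is already in use.

def ancestors(sf, idx):
    """Codes on the path from (sf, idx) up to the root of the tree."""
    out = []
    while sf > 1:
        sf, idx = sf // 2, idx // 2
        out.append((sf, idx))
    return out

def is_free(code, used):
    sf, idx = code
    if code in used or any(a in used for a in ancestors(sf, idx)):
        return False
    # any used code whose ancestor chain contains `code` is a descendant
    return not any(code in ancestors(s, i) for (s, i) in used)

used = {(4, 1)}                   # one SF-4 code already allocated
assert not is_free((2, 0), used)  # parent of (4, 1): blocked
assert not is_free((8, 2), used)  # child of (4, 1): blocked
assert is_free((4, 2), used)      # sibling subtree: available
```

Allocation strategies such as crowded-first differ only in which of the free codes they pick, and hence in how much of the tree they must inspect per request.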

  18. Evaluation of Fugen core management code (POLESTAR)

    Energy Technology Data Exchange (ETDEWEB)

    Shiratori, Yoshitake (Power Reactor and Nuclear Fuel Development Corp., Tsuruga, Fukui (Japan). Fugen Nuclear Power Station); Matsumoto, Mitsuo; Deshimaru, Takehide; Saito, Kuniyoshi

    1991-06-01

The core management code POLESTAR has been developed by PNC and has sufficient functions for core management. The code has been successfully used to carry out core management of Fugen, such as making a long-term or next-cycle fuel loading plan, predicting detailed characteristics of the next-cycle core, planning a control rod pattern, and evaluating the core lifetime after reactor start-up. This code has contributed to the reliable and economical operation of Fugen, since its accuracy has been checked and the code has been tuned by comparing its calculation results with various measured data. (author).

  19. Optimality properties of a proposed precursor to the genetic code.

    Science.gov (United States)

    Butler, Thomas; Goldenfeld, Nigel

    2009-09-01

    We calculate the optimality score of a doublet precursor to the canonical genetic code with respect to mitigating the effects of point mutations and compare our results to corresponding ones for the canonical genetic code. We find that the proposed precursor is much less optimal than that of the canonical code. Our results render unlikely the notion that the doublet precursor was an intermediate state in the evolution of the canonical genetic code. These findings support the notion that code optimality reflects evolutionary dynamics, and that if such a doublet code originally had a biochemical significance, it arose before the emergence of translation.

  20. Statistical physics, optimization and source coding

    Indian Academy of Sciences (India)

    Riccardo Zecchina

    2005-06-01

    The combinatorial problem of satisfying a given set of constraints that depend on N discrete variables is a fundamental one in optimization and coding theory. Even for instances of randomly generated problems, the question ``does there exist an assignment to the variables that satisfies all constraints?" may become extraordinarily difficult to solve in some range of parameters where a glass phase sets in. We shall provide a brief review of the recent advances in the statistical mechanics approach to these satisfiability problems and show how the analytic results have helped to design a new class of message-passing algorithms – the survey propagation (SP) algorithms – that can efficiently solve some combinatorial problems considered intractable. As an application, we discuss how the packing properties of clusters of solutions in randomly generated satisfiability problems can be exploited in the design of simple lossy data compression algorithms.

  1. Non-binary Hybrid LDPC Codes: Structure, Decoding and Optimization

    CERN Document Server

    Sassatelli, Lucile

    2007-01-01

    In this paper, we propose to study and optimize a very general class of LDPC codes whose variable nodes belong to finite sets with different orders. We named this class of codes Hybrid LDPC codes. Although efficient optimization techniques exist for binary LDPC codes and more recently for non-binary LDPC codes, they both exhibit drawbacks due to different reasons. Our goal is to capitalize on the advantages of both families by building codes with binary (or small finite set order) and non-binary parts in their factor graph representation. The class of Hybrid LDPC codes is obviously larger than existing types of codes, which gives more degrees of freedom to find good codes where the existing codes show their limits. We give two examples where hybrid LDPC codes show their interest.

  2. EAF Management Optimization

    Science.gov (United States)

    Costoiu, M.; Ioana, A.; Semenescu, A.; Marcu, D.

    2016-11-01

The article presents the main advantages of the electric arc furnace (EAF): it makes a great contribution to reintroducing significant quantities of reusable metallic materials into the economic circuit, it constitutes an important part of Primary Materials and Energy Recovery (PMER), and it offers good productivity, a good quality/price ratio, and the possibility of producing a wide variety of classes and types of steels, including special and high-alloy steels. The paper also presents some important developments of the electric arc furnace: the vacuum electric arc furnace, and artificial-intelligence expert systems for pollution control in steelworks. Another important aspect presented in the article is an original block diagram for optimizing the EAF management system. This scheme is based on an original objective function (criterion function) represented by the price/quality ratio. The article also presents an original block diagram for optimizing the control system of the EAF. Many principles were used in designing this concept of an EAF management system.

  3. Optimal source codes for geometrically distributed integer alphabets

    Science.gov (United States)

    Gallager, R. G.; Van Voorhis, D. C.

    1975-01-01

    An approach is shown for using the Huffman algorithm indirectly to prove the optimality of a code for an infinite alphabet if an estimate concerning the nature of the code can be made. Attention is given to nonnegative integers with a geometric probability assignment. The particular distribution considered arises in run-length coding and in encoding protocol information in data networks. Questions of redundancy of the optimal code are also investigated.
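The optimal codes in question are the Golomb codes, and their structure is simple enough to sketch. The encoder below is an illustrative sketch, not the paper's construction: each integer is split into a unary quotient and a truncated-binary remainder for a parameter m matched to the geometric distribution.

```python
# Sketch of a Golomb encoder for nonnegative integers (illustrative).
# For a geometric source, choosing m from the distribution's ratio makes
# this family of prefix codes optimal.

def golomb_encode(n, m):
    """Encode n >= 0 as unary(quotient) + truncated-binary(remainder)."""
    q, r = divmod(n, m)
    code = "1" * q + "0"              # unary part: q ones, terminating zero
    b = m.bit_length()
    if m & (m - 1) == 0:              # m a power of two: plain binary (Rice code)
        if m > 1:
            code += format(r, f"0{b - 1}b")
    else:                             # truncated binary for general m
        cutoff = (1 << b) - m         # this many short (b-1 bit) remainders
        if r < cutoff:
            code += format(r, f"0{b - 1}b")
        else:
            code += format(r + cutoff, f"0{b}b")
    return code

assert golomb_encode(9, 4) == "11001"   # q=2 -> "110", r=1 -> "01"
assert golomb_encode(2, 3) == "011"     # truncated-binary remainder
```

The m = 1 case degenerates to pure unary coding, the optimal choice for very sharply decaying geometric distributions.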

  4. Performing aggressive code optimization with an ability to rollback changes made by the aggressive optimizations

    Science.gov (United States)

    Gschwind, Michael K

    2013-07-23

    Mechanisms for aggressively optimizing computer code are provided. With these mechanisms, a compiler determines an optimization to apply to a portion of source code and determines if the optimization as applied to the portion of source code will result in unsafe optimized code that introduces a new source of exceptions being generated by the optimized code. In response to a determination that the optimization is an unsafe optimization, the compiler generates an aggressively compiled code version, in which the unsafe optimization is applied, and a conservatively compiled code version in which the unsafe optimization is not applied. The compiler stores both versions and provides them for execution. Mechanisms are provided for switching between these versions during execution in the event of a failure of the aggressively compiled code version. Moreover, predictive mechanisms are provided for predicting whether such a failure is likely.

  5. Optimal coding schemes for conflict-free channel access

    Science.gov (United States)

    Browning, Douglas W.; Thomas, John B.

    1989-10-01

    A method is proposed for conflict-free access of a broadcast channel. The method uses a variable-length coding scheme to determine which user gains access to the channel. For an idle channel, an equation for optimal expected overhead is derived and a coding scheme that produces optimal codes is presented. Algorithms for generating optimal codes for access on a busy channel are discussed. Suboptimal schemes are found that perform in a nearly optimal fashion. The method is shown to be superior in performance to previously developed conflict-free channel access schemes.

  6. Management of Statically Modifiable Prolog Code

    Institute of Scientific and Technical Information of China (English)

    张晨曦; 慈云桂

    1989-01-01

The Warren Abstract Machine (WAM) is an efficient execution model for Prolog, which has become the basis of many high-performance Prolog systems. However, little support for the implementation of the non-logical components of Prolog is provided in the WAM. The original Warren code is not modifiable. In this paper, we show how static modifications of Warren code can be achieved by adding a few instructions and a little extra information to the code. The implementation of the code manager is discussed. Algorithms for some basic operations are given.

  7. On Construction of Optimal A2-Codes

    Institute of Scientific and Technical Information of China (English)

    HU Lei

    2001-01-01

Two authentication codes with arbitration (A2-codes) are constructed from finite affine spaces to illustrate for the first time that the information-theoretic lower bounds for A2-codes can be strictly tighter than the combinatorial ones. The codes also illustrate that the conditional combinatorial lower bounds on the numbers of encoding/decoding rules are not genuine ones. As an analogue of the 3-dimensional case, an A2-code from 4-dimensional finite projective spaces is constructed, which meets both the information-theoretic and combinatorial lower bounds.

  8. Zigzag Codes: MDS Array Codes with Optimal Rebuilding

    CERN Document Server

    Tamo, Itzhak; Bruck, Jehoshua

    2011-01-01

MDS array codes are widely used in storage systems to protect data against erasures. We address the rebuilding ratio problem, namely, in the case of erasures, what is the fraction of the remaining information that needs to be accessed in order to rebuild exactly the lost information? It is clear that when the number of erasures equals the maximum number of erasures that an MDS code can correct, the rebuilding ratio is 1 (access all the remaining information). However, the interesting and more practical case is when the number of erasures is smaller than the erasure correcting capability of the code. For example, consider an MDS code that can correct two erasures: What is the smallest amount of information that one needs to access in order to correct a single erasure? Previous work showed that the rebuilding ratio is bounded between 1/2 and 3/4; however, the exact value was left as an open problem. In this paper, we solve this open problem and prove that for the case of a single erasure with ...

  9. Efficient topology optimization in MATLAB using 88 lines of code

    DEFF Research Database (Denmark)

    Andreassen, Erik; Clausen, Anders; Schevenels, Mattias

    2011-01-01

The paper presents an efficient 88 line MATLAB code for topology optimization. It has been developed using the 99 line code presented by Sigmund (Struct Multidisc Optim 21(2):120–127, 2001) as a starting point. The original code has been extended by a density filter, and a considerable improvement... of the basic code to include recent PDE-based and black-and-white projection filtering methods. The complete 88 line code is included as an appendix and can be downloaded from the web site www.topopt.dtu.dk....

  10. Optimal Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kroon, I. B.; Faber, M. H.

    1994-01-01

    Calibration of partial safety factors is considered in general, including classes of structures where no code exists beforehand. The partial safety factors are determined such that the difference between the reliability for the different structures in the class considered and a target reliability...... level is minimized. Code calibration on a decision theoretical basis is also considered and it is shown how target reliability indices can be calibrated. Results from code calibration for rubble mound breakwater designs are shown....

  11. Optimal superdense coding over memory channels

    Energy Technology Data Exchange (ETDEWEB)

    Shadman, Z.; Kampermann, H.; Bruss, D.; Macchiavello, C. [Institut fuer Theoretische Physik III, Heinrich-Heine-Universitaet Duesseldorf, DE-40225 Duesseldorf (Germany); Dipartimento di Fisica "A. Volta" and INFM-Unita di Pavia, Via Bassi 6, IT-27100 Pavia (Italy)]

    2011-10-15

    We study the superdense coding capacity in the presence of quantum channels with correlated noise. We investigate both the cases of unitary and nonunitary encoding. Pauli channels for arbitrary dimensions are treated explicitly. The superdense coding capacity for some special channels and resource states is derived for unitary encoding. We also provide an example of a memory channel where nonunitary encoding leads to an improvement in the superdense coding capacity.

  12. Optimality Of Variable-Length Codes

    Science.gov (United States)

    Yeh, Pen-Shu; Miller, Warner H.; Rice, Robert F.

    1994-01-01

    Report presents analysis of performances of conceptual Rice universal noiseless coders designed to provide efficient compression of data over wide range of source-data entropies. Includes predictive preprocessor that maps source data into sequence of nonnegative integers and variable-length-coding processor, which adapts to varying entropy of source data by selecting whichever one of number of optional codes yields shortest codeword.
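    The adaptive selection among optional codes can be illustrated with Rice codes, the family underlying the Rice coder: for each block of nonnegative integers the encoder simply picks the code parameter yielding the shortest output. A minimal sketch (the block interface is assumed; this is not the flight coder itself):

```python
def rice_encode(n, k):
    """Rice code of nonnegative n: unary quotient, '0' terminator,
    then k remainder bits."""
    q = n >> k
    r = n & ((1 << k) - 1)
    bits = "1" * q + "0"
    if k:
        bits += format(r, "0{}b".format(k))
    return bits

def best_k(block, kmax=8):
    """Pick the Rice parameter minimizing the encoded length of the block."""
    return min(range(kmax + 1),
               key=lambda k: sum(len(rice_encode(n, k)) for n in block))

# Low-entropy blocks favor small k; blocks of large values favor large k.
low = best_k([0, 1, 1, 2])
high = best_k([200, 180, 220, 210])
```

    A real adaptive coder transmits the chosen option per block as side information, exactly as the abstract describes.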

  13. Code Organization and Configuration Management

    Institute of Scientific and Technical Information of China (English)

    J. P. Wellisch; I. Osborne; et al.

    2001-01-01

    Industry experts increasingly focus on team productivity as the key to success. The basis of the team effort is the four-fold structure of software in terms of logical organisation, physical organisation, managerial organisation, and dynamical structure. We describe the ideas put into action within the CMS software for organising software into sub-systems and packages, and for establishing configuration management in a multi-project environment. We use a structure that maximises the independence of software development in individual areas, while at the same time emphasising the overwhelming importance of the interdependencies between the packages and components in the system. We comment on release procedures, and describe the inter-relationship between release, development, integration, and testing.

  14. Analysis of the optimality of the standard genetic code.

    Science.gov (United States)

    Kumar, Balaji; Saini, Supreet

    2016-07-19

    Many theories have been proposed attempting to explain the origin of the genetic code. While strong reasons remain to believe that the genetic code evolved as a frozen accident, at least for the first few amino acids, other theories remain viable. In this work, we test the optimality of the standard genetic code against approximately 17 million genetic codes, and locate 29 which outperform the standard genetic code at the following three criteria: (a) robustness to point mutation; (b) robustness to frameshift mutation; and (c) ability to encode additional information in the coding region. We use a genetic algorithm to generate and score codes from different parts of the associated landscape, which are, as a result, presumably more representative of the entire landscape. Our results show that while the genetic code is sub-optimal for robustness to frameshift mutation and the ability to encode additional information in the coding region, it is very strongly selected for robustness to point mutation. This coupled with the observation that the different performance indicator scores for a particular genetic code are negatively correlated makes the standard genetic code nearly optimal for the three criteria tested in this work.

  15. An engineering code to analyze hypersonic thermal management systems

    Science.gov (United States)

    Vangriethuysen, Valerie J.; Wallace, Clark E.

    1993-01-01

    Thermal loads on current and future aircraft are increasing and as a result are stressing the energy collection, control, and dissipation capabilities of current thermal management systems and technology. The thermal loads for hypersonic vehicles will be no exception. In fact, with their projected high heat loads and fluxes, hypersonic vehicles are a prime example of systems that will require thermal management systems (TMS) that have been optimized and integrated with the entire vehicle to the maximum extent possible during the initial design stages. This will be necessary not only to meet operational requirements, but also to fulfill weight and performance constraints in order for the vehicle to take off and complete its mission successfully. To meet this challenge, the TMS can no longer be two or more entirely independent systems, nor can thermal management be an afterthought in the design process, the typical pervasive approach in the past. Instead, a TMS that is integrated throughout the entire vehicle and subsequently optimized will be required. To accomplish this, a method that iteratively optimizes the TMS throughout the vehicle will not only be highly desirable, but advantageous in order to reduce the man-hours normally required to conduct the necessary tradeoff studies and comparisons. A thermal management engineering computer code that is under development and being managed at Wright Laboratory, Wright-Patterson AFB, is discussed. The primary goal of the code is to aid in the development of a hypersonic vehicle TMS that has been optimized and integrated on a total vehicle basis.

  16. MDS Array Codes with Optimal Rebuilding

    CERN Document Server

    Tamo, Itzhak; Bruck, Jehoshua

    2011-01-01

    MDS array codes are widely used in storage systems to protect data against erasures. We address the \emph{rebuilding ratio} problem, namely, in the case of erasures, what is the fraction of the remaining information that needs to be accessed in order to rebuild \emph{exactly} the lost information? It is clear that when the number of erasures equals the maximum number of erasures that an MDS code can correct then the rebuilding ratio is 1 (access all the remaining information). However, the interesting (and more practical) case is when the number of erasures is smaller than the erasure correcting capability of the code. For example, consider an MDS code that can correct two erasures: What is the smallest amount of information that one needs to access in order to correct a single erasure? Previous work showed that the rebuilding ratio is bounded between 1/2 and 3/4, however, the exact value was left as an open problem. In this paper, we solve this open problem and prove that for the case of a single erasure...

  17. Optimization of KINETICS Chemical Computation Code

    Science.gov (United States)

    Donastorg, Cristina

    2012-01-01

    NASA JPL has been creating a code in FORTRAN called KINETICS to model the chemistry of planetary atmospheres. Recently there has been an effort to introduce the Message Passing Interface (MPI) into the code so as to cut down the run time of the program. There has been some implementation of MPI in KINETICS; however, the code could still be made more efficient. One way to increase efficiency is to send only the necessary variables to all the processes when an MPI subroutine is called, and to gather only the necessary variables when the subroutine is finished. Therefore, all the variables used in three of the main subroutines needed to be investigated. Because of the sheer amount of code to comb through, this task was given as a ten-week project. I have been able to create flowcharts outlining the subroutines, common blocks, and functions used within the three main subroutines. From these flowcharts I created tables outlining the variables used in each block and important information about each. All this information will be used to determine how to run MPI in KINETICS in the most efficient way possible.

  18. Optimization Specifications for CUDA Code Restructuring Tool

    KAUST Repository

    Khan, Ayaz

    2017-03-13

    In this work we have developed a restructuring software tool (RT-CUDA) following the proposed optimization specifications to bridge the gap between high-level languages and the machine-dependent CUDA environment. RT-CUDA takes a C program and converts it into an optimized CUDA kernel, with user directives in a configuration file guiding the compiler. RT-CUDA also allows transparent invocation of the most optimized external math libraries, such as cuSparse and cuBLAS, enabling efficient design of linear algebra solvers. We expect RT-CUDA to be needed by many KSA industries dealing with science and engineering simulation on massively parallel computers such as NVIDIA GPUs.

  19. ARC Code TI: Optimal Alarm System Design and Implementation

    Data.gov (United States)

    National Aeronautics and Space Administration — An optimal alarm system can robustly predict a level-crossing event that is specified over a fixed prediction horizon. The code contained in this packages provides...

  20. Optimizing management of glycaemia.

    Science.gov (United States)

    Chatterjee, Sudesna; Khunti, Kamlesh; Davies, Melanie J

    2016-06-01

    The global epidemic of type 2 diabetes (T2DM) continues largely unabated due to an increasingly sedentary lifestyle and obesogenic environment. A cost-effective patient-centred approach, incorporating glucose-lowering therapy and modification of cardiovascular risk factors, could help prevent the otherwise inevitable development and progression of macrovascular and microvascular complications. Glycaemic optimization requires structured patient education, self-management and empowerment, and psychological support, along with early and proactive use of glucose-lowering therapies, all delivered in a system of care as shown by the Chronic Care Model. From diagnosis, intensive glycaemic control and individualised care are aimed at reducing complications. In older people, the goal is maintaining quality of life and minimizing morbidity, especially as overtreatment increases hypoglycaemia risk. Maintaining durable glycaemic control is challenging and complex to achieve without hypoglycaemia, weight gain and other significant adverse effects. Overcoming patient and physician barriers can help ensure adequate treatment initiation and intensification. Cardiovascular safety studies with newer glucose-lowering agents are now mandatory, with a sodium-glucose co-transporter-2 inhibitor (empagliflozin) and two glucagon-like peptide-1 receptor agonists (liraglutide and semaglutide) being the first to demonstrate superior cardiovascular outcomes compared with placebo.

  1. The information capacity of the genetic code: Is the natural code optimal?

    Science.gov (United States)

    Kuruoglu, Ercan E; Arndt, Peter F

    2017-04-21

    We envision the molecular evolution process as an information transfer process and provide a quantitative measure for information preservation in terms of the channel capacity, according to the channel coding theorem of Shannon. We calculate information capacities of DNA at the nucleotide level (for non-coding DNA) and the amino acid level (for coding DNA) using various substitution models. We extend our results on coding DNA to a discussion of the optimality of the natural codon-amino acid code. We provide the results of an adaptive search algorithm in the code domain and demonstrate the existence of a large number of genetic codes with higher information capacity. Our results support the hypothesis of an ancient extension from a 2-nucleotide codon to the current 3-nucleotide codon code to encode the various amino acids. Copyright © 2017 Elsevier Ltd. All rights reserved.
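    Channel capacity in the Shannon sense can be computed numerically for any discrete memoryless channel. The sketch below is a generic Blahut-Arimoto iteration (a standard algorithm, not the authors' substitution-model computation), demonstrated on a binary symmetric channel whose capacity is known in closed form:

```python
import numpy as np

def blahut_arimoto(P, iters=200):
    """Capacity in bits of a discrete memoryless channel with transition
    matrix P[x, y] = p(y|x), via the Blahut-Arimoto iteration."""
    nx = P.shape[0]
    p = np.full(nx, 1.0 / nx)                  # input distribution, start uniform
    for _ in range(iters):
        q = p @ P                              # induced output distribution
        with np.errstate(divide="ignore", invalid="ignore"):
            kl = np.where(P > 0, P * np.log(P / q), 0.0)
        c = np.exp(kl.sum(axis=1))             # c[x] = exp D(p(.|x) || q)
        p = p * c / (p * c).sum()              # multiplicative update
    return float(np.log((p * c).sum()) / np.log(2))

# Binary symmetric channel with crossover 0.1: capacity = 1 - H(0.1) ≈ 0.531
bsc = np.array([[0.9, 0.1], [0.1, 0.9]])
cap = blahut_arimoto(bsc)
```

    The same routine applies to a nucleotide substitution matrix by replacing `bsc` with the 4x4 (or 20x20, for amino acids) transition matrix of the chosen model.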

  2. Optimal Subband Coding of Cyclostationary Signals

    Science.gov (United States)

    2007-11-02

    Pandharipande, Ashish; Dasgupta, Soura (Electrical and Computer Engineering, The University of Iowa, Iowa City, IA 52242, USA). Only fragments of the abstract for "Optimal Filters for Subband Coding of Wide-Sense Cyclostationary Signals" survive in this record; the recoverable text notes that the optimum compaction filters are nonunique, which is consistent with the fact that LTI optimum compaction filters for WSS processes are also nonunique.

  3. Managed Code Rootkits Hooking into Runtime Environments

    CERN Document Server

    Metula, Erez

    2010-01-01

    Imagine being able to change the languages for the applications that a computer is running and taking control over it. That is exactly what managed code rootkits can do when they are placed within a computer. This new type of rootkit hides in a place that had previously been safe from this type of attack: the application level. Code reviews do not currently look for back doors in the virtual machine (VM) where this new rootkit would be injected. An invasion of this magnitude allows an attacker to steal information on the infected computer, provide false information, and disable security...

  4. Optimized reversible binary-coded decimal adders

    DEFF Research Database (Denmark)

    Thomsen, Michael Kirkedal; Glück, Robert

    2008-01-01

    their design. The optimized 1-decimal BCD full-adder, a 13 × 13 reversible logic circuit, is faster, and has lower circuit cost and less garbage bits. It can be used to build a fast reversible m-decimal BCD full-adder that has a delay of only m + 17 low-power reversible CMOS gates. For a 32-decimal (128-bit...... in reversible logic design by drastically reducing the number of garbage bits. Specialized designs benefit from support by reversible logic synthesis. All circuit components required for optimizing the original design could also be synthesized successfully by an implementation of an existing synthesis algorithm...

  5. A Rate-Distortion Optimized Coding Method for Region of Interest in Scalable Video Coding

    Directory of Open Access Journals (Sweden)

    Hongtao Wang

    2015-01-01

    original ones is also considered during rate-distortion optimization so that a reasonable trade-off between coding efficiency and decoding drift can be made. Besides, a new Lagrange multiplier derivation method is developed for further coding performance improvement. Experimental results demonstrate that the proposed method achieves significant bitrate saving compared to existing methods.

  6. MAPCLASS a code to optimize high order aberrations

    CERN Document Server

    Tomás, R

    2006-01-01

    MAPCLASS is a code written in PYTHON conceived to optimize the non-linear aberrations of the Final Focus System of CLIC. MAPCLASS calls MADX-PTC to obtain the map coefficients and uses optimization algorithms like the Simplex to compensate the high order aberrations.

  7. Optimized puncturing distributions for irregular non-binary LDPC codes

    CERN Document Server

    Gorgoglione, Matteo; Declercq, David

    2010-01-01

    In this paper we design non-uniform bit-wise puncturing distributions for irregular non-binary LDPC (NB-LDPC) codes. The puncturing distributions are optimized by minimizing the decoding threshold of the punctured LDPC code, the threshold being computed with a Monte-Carlo implementation of Density Evolution. First, we show that Density Evolution computed with Monte-Carlo simulations provides accurate (very close) and precise (small variance) estimates of NB-LDPC code ensemble thresholds. Based on the proposed method, we analyze several puncturing distributions for regular and semi-regular codes, obtained either by clustering punctured bits, or spreading them over the symbol-nodes of the Tanner graph. Finally, optimized puncturing distributions for non-binary LDPC codes with small maximum degree are presented, which exhibit a gap between 0.2 and 0.5 dB to the channel capacity, for punctured rates varying from 0.5 to 0.9.

  8. Optimal neural population coding of an auditory spatial cue.

    Science.gov (United States)

    Harper, Nicol S; McAlpine, David

    2004-08-05

    A sound, depending on the position of its source, can take more time to reach one ear than the other. This interaural (between the ears) time difference (ITD) provides a major cue for determining the source location. Many auditory neurons are sensitive to ITDs, but the means by which such neurons represent ITD is a contentious issue. Recent studies question whether the classical general model (the Jeffress model) applies across species. Here we show that ITD coding strategies of different species can be explained by a unifying principle: that the ITDs an animal naturally encounters should be coded with maximal accuracy. Using statistical techniques and a stochastic neural model, we demonstrate that the optimal coding strategy for ITD depends critically on head size and sound frequency. For small head sizes and/or low-frequency sounds, the optimal coding strategy tends towards two distinct sub-populations tuned to ITDs outside the range created by the head. This is consistent with recent observations in small mammals. For large head sizes and/or high frequencies, the optimal strategy is a homogeneous distribution of ITD tunings within the range created by the head. This is consistent with observations in the barn owl. For humans, the optimal strategy to code ITDs from an acoustically measured distribution depends on frequency; above 400 Hz a homogeneous distribution is optimal, and below 400 Hz distinct sub-populations are optimal.

  9. Optimal Grouping and Matching for Network-Coded Cooperative Communications

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, S; Shi, Y; Hou, Y T; Kompella, S; Midkiff, S F

    2011-11-01

    Network-coded cooperative communications (NC-CC) is a new advance in wireless networking that exploits network coding (NC) to improve the performance of cooperative communications (CC). However, there remains very limited understanding of this new hybrid technology, particularly at the link layer and above. This paper fills in this gap by studying a network optimization problem that requires joint optimization of session grouping, relay node grouping, and matching of session/relay groups. After showing that this problem is NP-hard, we present a polynomial time heuristic algorithm to this problem. Using simulation results, we show that our algorithm is highly competitive and can produce near-optimal results.

  10. Optimism in Enrollment Management

    Science.gov (United States)

    Buster-Williams, Kimberley

    2016-01-01

    Enrollment managers, like most managers, have goals that must be focused on with precision, excitement, and vigor. Enrollment managers must excel at enrollment planning. Typically, enrollment planning unites undergraduate and graduate recruitment plans, out-of-state recruitment plans, marketing plans, retention plans, international enrollment…

  11. Optimal Testing of Reed-Muller Codes

    CERN Document Server

    Bhattacharya, Arnab; Schoenebeck, Grant; Sudan, Madhu; Zuckerman, David

    2009-01-01

    We consider the problem of testing if a given function $f : \\F_2^n \\to \\F_2$ is close to any degree $d$ polynomial in $n$ variables, also known as the Reed-Muller testing problem. Alon et al. \\cite{AKKLR} proposed and analyzed a natural $2^{d+1}$-query test for this property and showed that it accepts every degree $d$ polynomial with probability 1, while rejecting functions that are $\\Omega(1)$-far with probability $\\Omega(1/(d 2^{d}))$. We give an asymptotically optimal analysis of their test showing that it rejects functions that are (even only) $\\Omega(2^{-d})$-far with $\\Omega(1)$-probability (so the rejection probability is a universal constant independent of $d$ and $n$). Our proof works by induction on $n$, and yields a new analysis of even the classical Blum-Luby-Rubinfeld \\cite{BLR} linearity test, for the setting of functions mapping $\\F_2^n$ to $\\F_2$. The optimality follows from a tighter analysis of counterexamples to the "inverse conjecture for the Gowers norm" constructed by \\cite{GT,LMS}. Our ...
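    The $2^{d+1}$-query test itself is simple to state and sketch: query $f$ on all subset-sums of $d+1$ random directions shifted by a random point, and accept iff the XOR vanishes, since the $(d+1)$-th discrete derivative of a degree-$\le d$ polynomial over $\F_2$ is identically zero. A minimal illustration (not the paper's analysis):

```python
import itertools, random

def rm_test(f, n, d, trials=50, seed=0):
    """Accept iff f passes 'trials' rounds of the 2^(d+1)-query test.
    Degree-<=d polynomials over F_2 pass every round; far functions
    are rejected with constant probability per round."""
    rng = random.Random(seed)
    for _ in range(trials):
        x = [rng.randrange(2) for _ in range(n)]
        dirs = [[rng.randrange(2) for _ in range(n)] for _ in range(d + 1)]
        s = 0
        for subset in itertools.product([0, 1], repeat=d + 1):
            pt = list(x)
            for pick, y in zip(subset, dirs):
                if pick:
                    pt = [a ^ b for a, b in zip(pt, y)]
            s ^= f(tuple(pt))
        if s:                      # (d+1)-th derivative nonzero: reject
            return False
    return True

deg2 = lambda x: (x[0] & x[1]) ^ x[2]   # degree-2 polynomial: always accepted
deg3 = lambda x: x[0] & x[1] & x[2]     # degree-3 monomial: rejected w.h.p.
```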

  12. TRO-2D - A code for rational transonic aerodynamic optimization

    Science.gov (United States)

    Davis, W. H., Jr.

    1985-01-01

    Features and sample applications of the transonic rational optimization (TRO-2D) code are outlined. TRO-2D includes the airfoil analysis code FLO-36, the CONMIN optimization code, and a rational approach to defining aero-function shapes for geometry modification. The program is part of an effort to develop an aerodynamically smart optimizer that will simplify and shorten the design process. The user has a selection of drag minimization with associated constraints on minimum lift, moment, and pressure distribution, a choice among 14 resident aero-function shapes, and options on aerodynamic and geometric constraints. Design variables such as the angle of attack, leading-edge radius and camber, shock strength and movement, supersonic pressure plateau control, etc., are discussed. Results are provided for a reduced-leading-edge-camber transonic airfoil and a natural-laminar-flow airfoil, showing that only four design variables need be specified to obtain satisfactory results.

  13. Recent developments in KTF. Code optimization and improved numerics

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Javier; Avramova, Maria; Sanchez, Victor Hugo; Ivanov, Kostadin [Karlsruhe Institute of Technology (KIT) (Germany). Inst. for Neutron Physics and Reactor Technology (INR)

    2012-11-01

    The rapid increase of computer power in the last decade has facilitated the development of high-fidelity simulations in nuclear engineering, allowing a more realistic and accurate optimization as well as safety assessment of reactor cores and power plants compared to the legacy codes. Thermal-hydraulic subchannel codes together with time-dependent neutron transport codes are the options of choice for an accurate prediction of local safety parameters. Moreover, fast-running codes with the best physical models are needed for high-fidelity coupled thermal-hydraulic / neutron-kinetic solutions. Hence at KIT, different subchannel codes such as SUBCHANFLOW and KTF are being improved, validated and coupled with different neutron kinetics solutions. KTF is a subchannel code developed for best-estimate analysis of both Pressurized Water Reactors (PWR) and Boiling Water Reactors (BWR). It is based on the Pennsylvania State University (PSU) version of COBRA-TF (Coolant Boiling in Rod Arrays Two Fluids), named CTF. In this paper, the investigations devoted to the enhancement of the code's numerics and software structure are presented and discussed. The gain in code speed is demonstrated with several examples, and finally an outlook on further activities concentrating on code improvements is given. (orig.)

  14. Software exorcism a handbook for debugging and optimizing legacy code

    CERN Document Server

    Blunden, Bill

    2013-01-01

    Software Exorcism: A Handbook for Debugging and Optimizing Legacy Code takes an unflinching, no-nonsense look at behavioral problems in the software engineering industry, shedding much-needed light on the social forces that make it difficult for programmers to do their job. Do you have a co-worker who perpetually writes bad code that you are forced to clean up? This is your book. While there are plenty of books on the market that cover debugging and short-term workarounds for bad code, Reverend Bill Blunden takes a revolutionary step beyond them by bringing our attention...

  15. Optimization of well field management

    DEFF Research Database (Denmark)

    Hansen, Annette Kirstine

    Groundwater is a limited but important resource for fresh water supply. Different conflicting objectives are important when operating a well field. This study investigates how the management of a well field can be improved with respect to different objectives simultaneously. A framework...... objectives. The sequential scheduling optimizes the management stepwise for daily time steps, and allows the final management to vary in time. The research shows that this method performs better than the constant scheduling when large variations in the hydrological conditions occur. This novel approach can...... multi-objective optimization framework has been shown to be useful in optimizing the management of well fields, and it has successfully been applied to the two case studies, the Hardhof and Søndersø waterworks. If the method is applied to all Danish waterworks it is estimated that 20-32 GWh/year could be saved...

  16. A realistic model under which the genetic code is optimal.

    Science.gov (United States)

    Buhrman, Harry; van der Gulik, Peter T S; Klau, Gunnar W; Schaffner, Christian; Speijer, Dave; Stougie, Leen

    2013-10-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By comparing this value with a distribution of values belonging to codes generated by random permutations of amino acid assignments, the level of error robustness of a genetic code can be quantified. We present a calculation in which the standard genetic code is shown to be optimal. We obtain this result by (1) using recently updated values of polar requirement as input; (2) fixing seven assignments (Ile, Trp, His, Phe, Tyr, Arg, and Leu) based on aptamer considerations; and (3) using known biosynthetic relations of the 20 amino acids. This last point is reflected in an approach of subdivision (restricting the random reallocation of assignments to amino acid subgroups, the set of 20 being divided in four such subgroups). The three approaches to explain robustness of the code (specific selection for robustness, amino acid-RNA interactions leading to assignments, or a slow growth process of assignment patterns) are reexamined in light of our findings. We offer a comprehensive hypothesis, stressing the importance of biosynthetic relations, with the code evolving from an early stage with just glycine and alanine, via intermediate stages, towards 64 codons carrying today's meaning.
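    The permutation comparison at the heart of such robustness arguments fits in a short script. The toy below is purely illustrative — a hypothetical 16-codon code over 2-letter codons with a deliberately smooth value assignment, not the real 64-codon table or real polar-requirement data — and computes the mean-square value change over all single-point mutations, then compares it against random reassignments:

```python
import itertools, random

BASES = "ACGU"
codons = ["".join(c) for c in itertools.product(BASES, repeat=2)]  # toy 16-codon code

# Hypothetical smooth assignment: similar codons get similar values
value = {c: 4.0 + 0.5 * (BASES.index(c[0]) * 4 + BASES.index(c[1]))
         for c in codons}

def ms_error(vals):
    """Mean square value change over all single-point mutations."""
    total = count = 0
    for c in codons:
        for pos in range(2):
            for b in BASES:
                if b != c[pos]:
                    m = c[:pos] + b + c[pos + 1:]
                    total += (vals[c] - vals[m]) ** 2
                    count += 1
    return total / count

rng = random.Random(0)
base = ms_error(value)
vals = list(value.values())
samples = []
for _ in range(200):                 # random reassignments of values to codons
    rng.shuffle(vals)
    samples.append(ms_error(dict(zip(codons, vals))))
mean_random = sum(samples) / len(samples)
```

    Here `base` falls below `mean_random`: the smooth toy code is more error-robust than a typical random permutation. The paper's calculation is the 64-codon analogue, with polar-requirement data and permutations restricted to biosynthetic subgroups.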

  17. The optimal code searching method with an improved criterion of coded exposure for remote sensing image restoration

    Science.gov (United States)

    He, Lirong; Cui, Guangmang; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

    2015-03-01

    Coded exposure photography makes motion deblurring a well-posed problem. The integration pattern of light is modulated by opening and closing the shutter within the exposure time, changing the traditional shutter frequency spectrum into a wider frequency band in order to preserve more image information in the frequency domain. The method used to search for the optimal code is significant for coded exposure. In this paper, an improved criterion for optimal code searching is proposed by analyzing the relationship between code length and the number of ones in the code, and by considering the noise effect on code selection with an affine noise model. The optimal code is then obtained using a genetic search algorithm based on the proposed selection criterion. Experimental results show that the time consumed in searching for the optimal code decreases with the presented method. The restored image shows better subjective quality and superior objective evaluation values.
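    The spectral reasoning behind coded exposure is easy to demonstrate: a plain (box) exposure has exact zeros in its frequency response, while a good code keeps the minimum DFT magnitude away from zero, making deblurring well-posed. The sketch below uses plain random search as a stand-in for the paper's genetic algorithm and noise-aware criterion:

```python
import cmath, random

def min_dft_mag(code):
    """Minimum DFT magnitude of a binary 0/1 shutter sequence
    (larger is better for invertible motion blur)."""
    N = len(code)
    return min(
        abs(sum(code[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N)))
        for k in range(N)
    )

def search_code(length=16, ones=8, iters=2000, seed=0):
    """Random search over codes with a fixed number of open-shutter slots."""
    rng = random.Random(seed)
    best, best_score = None, -1.0
    for _ in range(iters):
        code = [1] * ones + [0] * (length - ones)
        rng.shuffle(code)
        s = min_dft_mag(code)
        if s > best_score:
            best, best_score = code, s
    return best, best_score

best, score = search_code()
```

    A box exposure of the same length and duty cycle, `[1]*8 + [0]*8`, has spectral zeros, so its minimum DFT magnitude is 0, whereas the searched code's is strictly positive.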

  18. Optimal coding for qualitative sources on noiseless channels

    Directory of Open Access Journals (Sweden)

    Valeriu MUNTEANU

    2006-12-01

    In this paper we perform encoding for sources that are only qualitatively characterized; that is, each message the source delivers possesses a certain quality, expressed as cost, importance or utility. The proposed encoding procedure is optimal because it leads to maximum information per code word and assures a minimum time for the transmission of the source information.

  19. Multiview coding mode decision with hybrid optimal stopping model.

    Science.gov (United States)

    Zhao, Tiesong; Kwong, Sam; Wang, Hanli; Wang, Zhou; Pan, Zhaoqing; Kuo, C-C Jay

    2013-04-01

    In a generic decision process, optimal stopping theory aims to achieve a good tradeoff between decision performance and time consumed, with the advantages of theoretical decision-making and predictable decision performance. In this paper, optimal stopping theory is employed to develop an effective hybrid model for the mode decision problem, which aims to theoretically achieve a good tradeoff between the two interrelated measurements in mode decision, as computational complexity reduction and rate-distortion degradation. The proposed hybrid model is implemented and examined with a multiview encoder. To support the model and further promote coding performance, the multiview coding mode characteristics, including predicted mode probability and estimated coding time, are jointly investigated with inter-view correlations. Exhaustive experimental results with a wide range of video resolutions reveal the efficiency and robustness of our method, with high decision accuracy, negligible computational overhead, and almost intact rate-distortion performance compared to the original encoder.

  20. Optimize Internal Workflow Management

    Directory of Open Access Journals (Sweden)

    Lucia RUSU

    2010-01-01

    Workflow management has the role of creating and maintaining an efficient flow of information and tasks inside an organization. The major benefit of workflows is that they provide solutions to the growing needs of organizations. The external and the internal processes associated with a business need to be carefully organized in order to provide a strong foundation for the daily work. This paper focuses on internal workflow within a company, attempts to provide some basic principles related to workflows, and presents a workflow solution for modeling and deployment using Visual Studio and SharePoint Server.

  1. Optimal Management of Power Systems

    OpenAIRE

    Andreassi, Luca; Ubertini, Stefano

    2010-01-01

    The present chapter discusses the importance of proper management of energy systems to reduce energy costs and environmental impact. A numerical model for the optimal management of a power plant in buildings and industrial plants is presented. The model allows evaluating different operating strategies for the power plant components. The different strategies are defined on the basis of a pure economic optimisation (minimisation of total cost) and/or of an energetic optimisation (minimisation of f...

  2. Design of Optimal Quincunx Filter Banks for Image Coding

    Directory of Open Access Journals (Sweden)

    Wu-Sheng Lu

    2007-01-01

    Two new optimization-based methods are proposed for the design of high-performance quincunx filter banks for the application of image coding. These new techniques are used to build linear-phase finite-length-impulse-response (FIR) perfect-reconstruction (PR) systems with high coding gain, good frequency selectivity, and certain prescribed vanishing-moment properties. A parametrization of quincunx filter banks based on the lifting framework is employed to structurally impose the PR and linear-phase conditions. Then, the coding gain is maximized subject to a set of constraints on vanishing moments and frequency selectivity. Examples of filter banks designed using the newly proposed methods are presented and shown to be highly effective for image coding. In particular, our new optimal designs are shown to outperform three previously proposed quincunx filter banks in 72% to 95% of our experimental test cases. Moreover, in some limited cases, our optimal designs are even able to outperform the well-known (separable) 9/7 filter bank (from the JPEG-2000 standard).

  3. Lossless coding using predictors and VLCs optimized for each image

    Science.gov (United States)

    Matsuda, Ichiro; Shirai, Noriyuki; Itoh, Susumu

    2003-06-01

    This paper proposes an efficient lossless coding scheme for still images. The scheme utilizes an adaptive prediction technique where a set of linear predictors are designed for a given image and an appropriate predictor is selected from the set block-by-block. The resulting prediction errors are encoded using context-adaptive variable-length codes (VLCs). Context modeling, or adaptive selection of VLCs, is carried out pel-by-pel and the VLC assigned to each context is designed on a probability distribution model of the prediction errors. In order to improve coding efficiency, a generalized Gaussian function is used as the model for each context. Moreover, not only the predictors but also the parameters of the probability distribution models are iteratively optimized for each image so that the coding rate of the prediction errors is minimized. Experimental results show that the proposed coding scheme attains comparable coding performance to the state-of-the-art TMW scheme with much lower complexity in the decoding process.
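    The probability-model step described above can be sketched as follows: given a generalized Gaussian model of the prediction errors, the ideal VLC length for each error value is -log2 of its modeled probability. This is a minimal illustration, and the parameters alpha and beta below are illustrative, not the paper's fitted values.

```python
import math

def gen_gaussian_pmf(values, alpha=1.0, beta=0.8):
    """Generalized Gaussian model p(e) proportional to exp(-|e/alpha|^beta),
    normalized over the integer error values considered."""
    w = [math.exp(-abs(e / alpha) ** beta) for e in values]
    s = sum(w)
    return [x / s for x in w]

def ideal_code_lengths(pmf):
    """Ideal (fractional) VLC lengths -log2(p); a real coder rounds these
    to integer codeword lengths."""
    return [-math.log2(p) for p in pmf]

errors = list(range(-8, 9))
pmf = gen_gaussian_pmf(errors)
lengths = ideal_code_lengths(pmf)
# the model assigns the shortest codeword to the most probable error (0)
print(min(lengths) == lengths[errors.index(0)])
```

In the paper's scheme, alpha and beta would be re-optimized per image and per context; here they are fixed constants.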

  4. FREQUENCY-CODED OPTIMIZATION OF HOPPED-FREQUENCY PULSE SIGNAL BASED ON GENETIC ALGORITHM

    Institute of Scientific and Technical Information of China (English)

    Liu Zheng; Mu Xuehua

    2005-01-01

    The Frequency-Coded Pulse (FCP) signal has good performance of range and Doppler resolution. This paper first gives the mathematical expression of the ambiguity function for FCP signals, and then presents a coding rule for optimizing FCP signal. The genetic algorithm is presented to solve this kind of problem for optimizing codes. Finally, an example for optimizing calculation is illustrated and the optimized frequency coding results are given with the code length N=64 and N=128 respectively.
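    As a sketch of the idea (not the paper's exact coding rule, ambiguity function, or GA operators), one can evolve a permutation frequency code whose difference triangle has few repeated entries, a standard proxy for low ambiguity-function sidelobes:

```python
import random

def difference_collisions(code):
    """Cost: repeated differences at each shift (a Costas-style criterion).
    Fewer collisions suggest lower ambiguity sidelobes for the hopped pulse."""
    n = len(code)
    cost = 0
    for shift in range(1, n):
        diffs = [code[i + shift] - code[i] for i in range(n - shift)]
        cost += len(diffs) - len(set(diffs))
    return cost

def mutate(code, rng):
    """Swap two frequency slots (keeps the code a permutation)."""
    a, b = rng.sample(range(len(code)), 2)
    child = code[:]
    child[a], child[b] = child[b], child[a]
    return child

def genetic_search(n=16, pop_size=30, generations=300, seed=1):
    """Elitist evolutionary search over permutation frequency codes."""
    rng = random.Random(seed)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=difference_collisions)
        elite = pop[: pop_size // 2]
        pop = elite + [mutate(rng.choice(elite), rng) for _ in range(pop_size - len(elite))]
    best = min(pop, key=difference_collisions)
    return best, difference_collisions(best)

best, cost = genetic_search()
print(cost)
```

A linear frequency ramp has maximal collisions, so even this simple search improves on it substantially; the paper works with N=64 and N=128 and a full ambiguity-function criterion.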

  5. OPTIMIZATION BASED ON IMPROVED REAL-CODED GENETIC ALGORITHM

    Institute of Scientific and Technical Information of China (English)

    Shi Yu; Yu Shenglin

    2002-01-01

    An improved real-coded genetic algorithm is proposed for the global optimization of functions. The new algorithm is based on an assessment of the searching performance of the basic real-coded genetic algorithm. The operations of the basic real-coded genetic algorithm are briefly discussed and selected. A kind of chaos sequence is described in detail and added to the new algorithm as a disturbance factor. The strategy of field partition is also used to improve the structure of the new algorithm. Numerical experiments show that the new genetic algorithm can find the global optimum of complex functions with satisfying precision.
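    A minimal sketch of such an algorithm, using a logistic-map chaos sequence as the disturbance factor (the operators, constants, and test function are illustrative; the paper's field-partition strategy is omitted):

```python
import random

def logistic_chaos(x=0.3456, r=4.0):
    """Chaos sequence generator (logistic map) used as a disturbance factor."""
    while True:
        x = r * x * (1.0 - x)
        yield x

def sphere(v):
    """Test function: global minimum 0 at the origin."""
    return sum(t * t for t in v)

def real_coded_ga(f, dim=4, lo=-5.0, hi=5.0, pop_size=40, generations=200, seed=7):
    """Elitist real-coded GA: blend crossover plus a shrinking chaotic disturbance."""
    rng = random.Random(seed)
    chaos = logistic_chaos()
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for g in range(generations):
        pop.sort(key=f)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            w = rng.random()  # arithmetic (blend) crossover
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]
            # chaotic disturbance, shrinking as the search converges
            scale = (hi - lo) * 0.1 * (1 - g / generations)
            child = [min(hi, max(lo, c + scale * (2 * next(chaos) - 1))) for c in child]
            children.append(child)
        pop = parents + children
    return min(pop, key=f)

best = real_coded_ga(sphere)
print(sphere(best))
```

The chaotic sequence plays the role a random mutation usually plays, but is deterministic and covers the unit interval densely, which is the motivation the abstract gives for adding it.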

  6. Fundamentals of an Optimal Multirate Subband Coding of Cyclostationary Signals

    Directory of Open Access Journals (Sweden)

    D. Kula

    2000-06-01

    Full Text Available A consistent theory of optimal subband coding of zero-mean wide-sense cyclostationary signals, with N-periodic statistics, is presented in this article. An M-channel orthonormal uniform filter bank, employing N-periodic analysis and synthesis filters, is used while an average variance condition is applied to evaluate the output distortion. In three lemmas and a final theorem, the necessity of decorrelation of blocked subband signals and the requirement of a specific ordering of power spectral densities are proven.

  7. Optimal Merging Algorithms for Lossless Codes with Generalized Criteria

    CERN Document Server

    Charalambous, Themistoklis; Rezaei, Farzad

    2011-01-01

    This paper presents lossless prefix codes optimized with respect to a pay-off criterion consisting of a convex combination of maximum codeword length and average codeword length. The optimal codeword lengths obtained are based on a new coding algorithm which transforms the initial source probability vector into a new probability vector according to a merging rule. The coding algorithm is equivalent to a partition of the source alphabet into disjoint sets on which a new transformed probability vector is defined as a function of the initial source probability vector and a scalar parameter. The pay-off criterion considered encompasses a trade-off between maximum and average codeword length; it is related to a pay-off criterion consisting of a convex combination of average codeword length and average of an exponential function of the codeword length, and to an average codeword length pay-off criterion subject to a limited length constraint. A special case of the first related pay-off is connected to coding proble...

  8. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    Science.gov (United States)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of the spectral amplitude coding-optical code division multiple access (SAC-OCDMA) system. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties, between intended and interfering subscribers, significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). Performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis by referring to bit error rate (BER), signal to noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code, while using the SDD technique, provides high transmission capacity, reduces the receiver complexity, and provides better performance as compared to the complementary subtraction detection (CSD) technique. Furthermore, analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both up- and down-link transmission.

  9. Constellation labeling optimization for bit-interleaved coded APSK

    Science.gov (United States)

    Xiang, Xingyu; Mo, Zijian; Wang, Zhonghai; Pham, Khanh; Blasch, Erik; Chen, Genshe

    2016-05-01

    This paper investigates the constellation and mapping optimization for amplitude phase shift keying (APSK) modulation, which is deployed in Digital Video Broadcasting Satellite - Second Generation (DVB-S2) and Digital Video Broadcasting - Satellite services to Handhelds (DVB-SH) broadcasting standards due to its merits of power and spectral efficiency together with robustness against nonlinear distortion. The mapping optimization is performed for 32-APSK according to combined cost functions related to Euclidean distance and mutual information. A binary switching algorithm and its modified version are used to minimize the cost function and the estimated error between the original and received data. The optimized constellation mapping is tested by combining DVB-S2 standard Low-Density Parity-Check (LDPC) codes in both Bit-Interleaved Coded Modulation (BICM) and BICM with iterative decoding (BICM-ID) systems. The simulated results validate the proposed constellation labeling optimization scheme, which yields better performance than the conventional 32-APSK constellation defined in the DVB-S2 standard.
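    The binary switching idea can be sketched on a toy 8-PSK constellation (the paper uses 32-APSK and mutual-information-based costs; the distance-weighted Hamming cost below is a simplified stand-in): repeatedly swap pairs of labels whenever the swap lowers the cost, until no single swap helps.

```python
import cmath, itertools, math

# toy constellation: 8-PSK (the paper's method applies equally to 32-APSK)
points = [cmath.exp(2j * math.pi * k / 8) for k in range(8)]

def hamming(a, b):
    return bin(a ^ b).count("1")

def cost(labels):
    """Distance-weighted bit-error cost: close symbol pairs whose labels
    differ in many bits are penalized most (a stand-in for the paper's
    Euclidean-distance / mutual-information criteria)."""
    c = 0.0
    for i, j in itertools.combinations(range(len(points)), 2):
        d2 = abs(points[i] - points[j]) ** 2
        c += hamming(labels[i], labels[j]) * math.exp(-d2)
    return c

def binary_switching(labels):
    """Greedily apply improving label-pair swaps until none remains."""
    labels = labels[:]
    improved = True
    while improved:
        improved = False
        best = cost(labels)
        for i, j in itertools.combinations(range(len(labels)), 2):
            labels[i], labels[j] = labels[j], labels[i]
            c = cost(labels)
            if c < best - 1e-12:
                best, improved = c, True
            else:
                labels[i], labels[j] = labels[j], labels[i]  # undo
    return labels

natural = list(range(8))
optimized = binary_switching(natural)
print(cost(natural), cost(optimized))
```

Starting from the natural binary labeling, the search converges toward a Gray-like mapping, since adjacent constellation points dominate the cost.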

  10. Building Reusable Software Component For Optimization Check in ABAP Coding

    Directory of Open Access Journals (Sweden)

    P.Shireesha

    2010-07-01

    Full Text Available Software component reuse is the software engineering practice of developing new software products from existing components. A reuse library or component reuse repository organizes, stores, and manages reusable components. This paper describes how a reusable component is created, how it reuses the function, and how it checks whether optimized code is being used in building programs and applications. Finally, it provides coding guidelines, standards, and best practices used for creating reusable components, and guidelines and best practices for making them configurable and easy to use.

  11. Codes for Computationally Simple Channels: Explicit Constructions with Optimal Rate

    CERN Document Server

    Guruswami, Venkatesan

    2010-01-01

    In this paper, we consider coding schemes for computationally bounded channels, which can introduce an arbitrary set of errors as long as (a) the fraction of errors is bounded with high probability by a parameter p and (b) the process which adds the errors can be described by a sufficiently "simple" circuit. For three classes of channels, we provide explicit, efficiently encodable/decodable codes of optimal rate where only inefficiently decodable codes were previously known. In each case, we provide one encoder/decoder that works for every channel in the class. (1) Unique decoding for additive errors: We give the first construction of poly-time encodable/decodable codes for additive (a.k.a. oblivious) channels that achieve the Shannon capacity 1-H(p). Such channels capture binary symmetric errors and burst errors as special cases. (2) List-decoding for log-space channels: A space-S(n) channel reads and modifies the transmitted codeword as a stream, using at most S(n) bits of workspace on transmissions of n bi...

  12. A simple model of optimal population coding for sensory systems.

    Science.gov (United States)

    Doi, Eizaburo; Lewicki, Michael S

    2014-08-01

    A fundamental task of a sensory system is to infer information about the environment. It has long been suggested that an important goal of the first stage of this process is to encode the raw sensory signal efficiently by reducing its redundancy in the neural representation. Some redundancy, however, would be expected because it can provide robustness to noise inherent in the system. Encoding the raw sensory signal itself is also problematic, because it contains distortion and noise. The optimal solution would be constrained further by limited biological resources. Here, we analyze a simple theoretical model that incorporates these key aspects of sensory coding, and apply it to conditions in the retina. The model specifies the optimal way to incorporate redundancy in a population of noisy neurons, while also optimally compensating for sensory distortion and noise. Importantly, it allows an arbitrary input-to-output cell ratio between sensory units (photoreceptors) and encoding units (retinal ganglion cells), providing predictions of retinal codes at different eccentricities. Compared to earlier models based on redundancy reduction, the proposed model conveys more information about the original signal. Interestingly, redundancy reduction can be near-optimal when the number of encoding units is limited, such as in the peripheral retina. We show that there exist multiple, equally-optimal solutions whose receptive field structure and organization vary significantly. Among these, the one which maximizes the spatial locality of the computation, but not the sparsity of either synaptic weights or neural responses, is consistent with known basic properties of retinal receptive fields. The model further predicts that receptive field structure changes less with light adaptation at higher input-to-output cell ratios, such as in the periphery.

  13. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  14. Highly Optimized Code Generation for Stencil Codes with Computation Reuse for GPUs

    Institute of Scientific and Technical Information of China (English)

    Wen-Jing Ma; Kan Gao; Guo-Ping Long

    2016-01-01

    Computation reuse is known as an effective optimization technique. However, due to the complexity of modern GPU architectures, there is yet not enough understanding regarding the intriguing implications of the interplay of computation reuse and hardware specifics on application performance. In this paper, we propose an automatic code generator for a class of stencil codes with inherent computation reuse on GPUs. For such applications, the proper reuse of intermediate results, combined with careful register and on-chip local memory usage, has profound implications on performance. Current state of the art does not address this problem in depth, partially due to the lack of a good program representation that can expose all potential computation reuse. In this paper, we leverage the computation overlap graph (COG), a simple representation of data dependence and data reuse with an “element view”, to expose potential reuse opportunities. Using COG, we propose a portable code generation and tuning framework for GPUs. Compared with current state-of-the-art code generators, our experimental results show up to 56.7% performance improvement on modern GPUs such as NVIDIA C2050.
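    The effect of computation reuse can be illustrated in miniature (pure Python rather than generated GPU code): composing a 3-point stencil twice, either recomputing the intermediate stencil for every output element, or computing it once and sharing it, which is the role COG-based code generation plays on the GPU via registers and on-chip memory.

```python
def stencil3(x):
    """3-point sum stencil with clamped boundaries."""
    n = len(x)
    return [x[max(i - 1, 0)] + x[i] + x[min(i + 1, n - 1)] for i in range(n)]

def two_pass_naive(x):
    """Composed stencil with no reuse: the inner stencil is recomputed
    for every output point, so each input element is touched many times."""
    n = len(x)
    out = []
    for i in range(n):
        window = [stencil3(x)[j] for j in (max(i - 1, 0), i, min(i + 1, n - 1))]
        out.append(sum(window))
    return out

def two_pass_reuse(x):
    """Same result with computation reuse: the intermediate stencil is
    computed once and shared by all output points."""
    return stencil3(stencil3(x))

x = [float(i % 5) for i in range(12)]
print(two_pass_reuse(x) == two_pass_naive(x))
```

Both variants produce identical results; only the amount of redundant computation differs, which is exactly the trade-off the generated GPU code must balance against register and local-memory pressure.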

  15. Investigation of Navier-Stokes Code Verification and Design Optimization

    Science.gov (United States)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization

  16. Optimal Partitioned Cyclic Difference Packings for Frequency Hopping and Code Synchronization

    CERN Document Server

    Chee, Yeow Meng; Yin, Jianxing

    2010-01-01

    Optimal partitioned cyclic difference packings (PCDPs) are shown to give rise to optimal frequency-hopping sequences and optimal comma-free codes. New constructions for PCDPs, based on almost difference sets and cyclic difference matrices, are given. These produce new infinite families of optimal PCDPs (and hence optimal frequency-hopping sequences and optimal comma-free codes). The existence problem for optimal PCDPs in ${\\mathbb Z}_{3m}$, with $m$ base blocks of size three, is also solved for all $m\

  17. EHR's effect on the revenue cycle management Coding function.

    Science.gov (United States)

    Giannangelo, Kathy; Fenton, Susan

    2008-01-01

    Without administrative terminologies there is no revenue to manage. The use of healthcare IT to capture the codes for administrative and financial support functions will impact the revenue cycle and the management of it. This is presumed to occur because clinical data coded at the point of care becomes the source for claims data. Thus, as electronic health record system applications utilizing terminologies are implemented, healthcare providers need to systematically consider the effect on the coding function and management of the revenue cycle. A key factor is that the sequence of events changes, i.e., instead of a health information management professional selecting billing codes at the conclusion of an encounter based on the review of the record, clinical data generates the claims data via mapping. Efficiencies and management challenges result.

  18. Iterative Phase Optimization of Elementary Quantum Error Correcting Codes

    Science.gov (United States)

    Müller, M.; Rivas, A.; Martínez, E. A.; Nigg, D.; Schindler, P.; Monz, T.; Blatt, R.; Martin-Delgado, M. A.

    2016-07-01

    Performing experiments on small-scale quantum computers is certainly a challenging endeavor. Many parameters need to be optimized to achieve high-fidelity operations. This can be done efficiently for operations acting on single qubits, as errors can be fully characterized. For multiqubit operations, though, this is no longer the case, as in the most general case, analyzing the effect of the operation on the system requires a full state tomography for which resources scale exponentially with the system size. Furthermore, in recent experiments, additional electronic levels beyond the two-level system encoding the qubit have been used to enhance the capabilities of quantum-information processors, which additionally increases the number of parameters that need to be controlled. For the optimization of the experimental system for a given task (e.g., a quantum algorithm), one has to find a satisfactory error model and also efficient observables to estimate the parameters of the model. In this manuscript, we demonstrate a method to optimize the encoding procedure for a small quantum error correction code in the presence of unknown but constant phase shifts. The method, which we implement here on a small-scale linear ion-trap quantum computer, is readily applicable to other AMO platforms for quantum-information processing.

  19. Joint research project WASA-BOSS: Further development and application of severe accident codes. Assessment and optimization of accident management measures. Project B: Accident analyses for pressurized water reactors with the application of the ATHLET-CD code; Verbundprojekt WASA-BOSS: Weiterentwicklung und Anwendung von Severe Accident Codes. Bewertung und Optimierung von Stoerfallmassnahmen. Teilprojekt B: Druckwasserreaktor-Stoerfallanalysen unter Verwendung des Severe-Accident-Codes ATHLET-CD

    Energy Technology Data Exchange (ETDEWEB)

    Jobst, Matthias; Kliem, Soeren; Kozmenkov, Yaroslav; Wilhelm, Polina

    2017-02-15

    Within the framework of the project an ATHLET-CD input deck for a generic German PWR of type KONVOI has been created. This input deck was applied to the simulation of severe accidents from the accident categories station blackout (SBO) and small-break loss-of-coolant accidents (SBLOCA). The complete accident transient from initial event at full power until the damage of reactor pressure vessel (RPV) is covered and all relevant severe accident phenomena are modelled: start of core heat up, fission product release, melting of fuel and absorber material, oxidation and release of hydrogen, relocation of molten material inside the core, relocation to the lower plenum, damage and failure of the RPV. The model has been applied to the analysis of preventive and mitigative accident management measures for SBO and SBLOCA transients. Therefore, the measures primary side depressurization (PSD), injection to the primary circuit by mobile pumps and for SBLOCA the delayed injection by the cold leg hydro-accumulators have been investigated and the assumptions and start criteria of these measures have been varied. The time evolutions of the transients and time margins for the initiation of additional measures have been assessed. An uncertainty and sensitivity study has been performed for the early phase of one SBO scenario with PSD (until the start of core melt). In addition to that, a code-to-code comparison between ATHLET-CD and the severe accident code MELCOR has been carried out.

  20. Ethics and code of conduct in zoo management

    OpenAIRE

    Bahne, Rita

    2015-01-01

    There are around 1 million wild animals living in the 10,000-12,000 zoos worldwide. They include zoological parks, biological parks, safari parks, public aquariums, bird parks, reptile parks and insectariums. Zoo tourism is both domestic and international. The purpose of this research thesis is to clarify the ethics and codes of conduct in present day zoo management by presenting the current nature of zoos, ethics behind zoo business and the current codes of conduct. Zoo management i...

  1. Optimal management of perimenopausal depression

    Directory of Open Access Journals (Sweden)

    Barbara L Parry

    2010-06-01

    Full Text Available Barbara L Parry, Department of Psychiatry, University of California, San Diego, USA. Abstract: Only recently has the perimenopause become recognized as a time when women are at risk for new onset and recurrence of major depression. Untreated depression at this time not only exacerbates the course of a depressive illness, but also puts women at increased risk for sleep disorders, cardiovascular disease, diabetes, and osteoporosis. Although antidepressant medication is the mainstay of treatment, adjunctive therapy, especially with estrogen replacement, may be indicated in refractory cases, and may speed the onset of antidepressant action. Many, but not all, studies report that progesterone antagonizes the beneficial effects of estrogen. Although some antidepressants improve vasomotor symptoms, in general they are not as effective as estrogen alone for relieving these symptoms. Estrogen alone, however, does not generally result in remission of major depression in most (but not all) studies, but may provide benefit to some women with less severe symptoms if administered in therapeutic ranges. The selective serotonin reuptake inhibitors (SSRIs) in addition to estrogen are usually more beneficial in improving mood than SSRIs or estrogen treatment alone for major depression, whereas the selective norepinephrine and serotonin reuptake inhibitors do not require the addition of estrogen to exert their antidepressant effects in menopausal depression. In addition to attention to general health, hormonal status, and antidepressant treatment, the optimal management of perimenopausal depression also requires attention to the individual woman’s psychosocial and spiritual well being. Keywords: menopause, depression, management

  2. Optimal Linear Joint Source-Channel Coding with Delay Constraint

    CERN Document Server

    Johannesson, Erik; Bernhardsson, Bo; Ghulchak, Andrey

    2012-01-01

    The problem of joint source-channel coding is considered for a stationary remote (noisy) Gaussian source and a Gaussian channel. The encoder and decoder are assumed to be causal and their combined operations are subject to a delay constraint. It is shown that, under the mean-square error distortion metric, an optimal encoder-decoder pair from the linear and time-invariant (LTI) class can be found by minimization of a convex functional and a spectral factorization. The functional to be minimized is the sum of the well-known cost in a corresponding Wiener filter problem and a new term, which is induced by the channel noise and whose coefficient is the inverse of the channel's signal-to-noise ratio. This result is shown to also hold in the case of vector-valued signals, assuming parallel additive white Gaussian noise channels. It is also shown that optimal LTI encoders and decoders generally require infinite memory, which implies that approximations are necessary. A numerical example is provided, which compares ...

  3. Optimization of Coding of AR Sources for Transmission Across Channels with Loss

    DEFF Research Database (Denmark)

    Arildsen, Thomas

    Channel coding is usually applied in combination with source coding to ensure reliable transmission of the (source coded) information at the maximal rate across a channel, given the properties of this channel. In this thesis, we consider the coding of auto-regressive (AR) sources, which are sources that can be modeled as auto-regressive processes. The coding of AR sources lends itself to linear predictive coding. We address the problem of joint source/channel coding in the setting of linear predictive coding of AR sources, and consider channels in which individual source coded signal samples can be lost during transmission. On this background we propose a new algorithm for optimization of predictive coding of AR sources for transmission across channels with loss. The optimization algorithm takes as its starting point a re-thinking of the source coding operation as an operation producing linear measurements.
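    Linear predictive coding of an AR source can be sketched as closed-loop DPCM, where the predictor runs on reconstructed samples so that encoder and decoder stay synchronized (the AR coefficient and quantizer step below are illustrative; the thesis's channel-loss optimization is not modeled here):

```python
import random

def generate_ar1(n, a=0.9, seed=3):
    """AR(1) source: x[k] = a*x[k-1] + w[k], w white Gaussian."""
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(n):
        x = a * x + rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

def quantize(v, step=0.25):
    """Uniform scalar quantizer."""
    return step * round(v / step)

def dpcm_encode(xs, a=0.9, step=0.25):
    """Linear predictive (DPCM) coding: quantize the prediction error,
    predicting from the *reconstructed* past so encoder and decoder agree."""
    pred, out = 0.0, []
    for x in xs:
        e = quantize(x - a * pred, step)
        out.append(e)
        pred = a * pred + e  # reconstructed sample
    return out

def dpcm_decode(errors, a=0.9):
    pred, xs = 0.0, []
    for e in errors:
        pred = a * pred + e
        xs.append(pred)
    return xs

xs = generate_ar1(500)
rec = dpcm_decode(dpcm_encode(xs))
mse = sum((x - y) ** 2 for x, y in zip(xs, rec)) / len(xs)
print(mse)
```

In the closed loop, the reconstruction error per sample equals the quantization error of the prediction residual, so it stays bounded by half the quantizer step; the thesis's contribution concerns what happens when some of those residual samples are lost on the channel.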

  4. A novel neutron energy spectrum unfolding code using particle swarm optimization

    Science.gov (United States)

    Shahabinejad, H.; Sohrabpour, M.

    2017-07-01

    A novel neutron Spectrum Deconvolution using Particle Swarm Optimization (SDPSO) code has been developed to unfold the neutron spectrum from a pulse height distribution and a response matrix. The Particle Swarm Optimization (PSO) imitates the social behavior of bird flocks to solve complex optimization problems. The results of the SDPSO code have been compared with those of the standard spectra and the recently published Two-steps Genetic Algorithm Spectrum Unfolding (TGASU) code. The TGASU code had previously been compared with other codes such as MAXED, GRAVEL, FERDOR and GAMCD and shown to be more accurate than those codes. The results of the SDPSO code have been demonstrated to match well with those of the TGASU code for both under-determined and over-determined problems. In addition, the SDPSO has been shown to be nearly two times faster than the TGASU code.
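    The core of such an unfolding can be sketched with a textbook PSO on a toy two-group spectrum and a 3-bin response matrix (all numbers below are illustrative; the real code works with measured pulse-height distributions and detector response functions):

```python
import random

def unfold_pso(response, measured, n_particles=30, iters=400, seed=5):
    """Particle swarm search for a non-negative spectrum phi minimizing
    the misfit || response @ phi - measured ||^2 (the SDPSO idea in miniature)."""
    rng = random.Random(seed)
    nb, ne = len(response), len(response[0])

    def misfit(phi):
        s = 0.0
        for b in range(nb):
            r = sum(response[b][e] * phi[e] for e in range(ne)) - measured[b]
            s += r * r
        return s

    pos = [[rng.uniform(0, 2) for _ in range(ne)] for _ in range(n_particles)]
    vel = [[0.0] * ne for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [misfit(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    w, c1, c2 = 0.72, 1.49, 1.49  # standard constriction-style coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(ne):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = max(0.0, pos[i][d] + vel[i][d])  # keep fluence non-negative
            f = misfit(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# toy 3-bin detector response and a known 2-group spectrum
R = [[1.0, 0.2], [0.5, 0.8], [0.1, 1.0]]
true_phi = [1.0, 0.5]
m = [sum(R[b][e] * true_phi[e] for e in range(2)) for b in range(3)]
phi, err = unfold_pso(R, m)
print(err)
```

Because the toy problem is over-determined and noise-free, the swarm recovers the generating spectrum; real unfolding problems add noise and many more energy groups.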

  5. On the algebraic representation of certain optimal non-linear binary codes

    CERN Document Server

    Greferath, Marcus

    2011-01-01

    This paper investigates some optimal non-linear codes, in particular cyclic codes, by considering them as (non-linear) codes over Z_4. We use the Fourier transform as well as subgroups of the unit group of a group ring to analyse these codes. In particular we find a presentation of Best's (10, 40, 4) code as a coset of a subgroup in the unit group of a ring, and derive a simple decoding algorithm from this presentation. We also apply this technique to analyse Julin's (12, 144, 4) code and the (12, 24, 12) Hadamard code, as well as to construct a (14, 56, 6) binary code.

  6. Optimizing management of pancreaticopleural fistulas

    Institute of Scientific and Technical Information of China (English)

    Marek Wronski; Maciej Slodkowski; Wlodzimierz Cebulski; Daniel Moronczyk; Ireneusz W Krasnodebski

    2011-01-01

    fluid collections. Four out of 8 patients in our series required subsequent surgery due to a failed non-operative treatment. Distal pancreatectomy with splenectomy was performed in 3 cases. In one case, only external drainage of the pancreatic pseudocyst was done because of diffuse peripancreatic inflammatory infiltration precluding safe dissection. There were no perioperative mortalities. There was no recurrence of a pancreaticopleural fistula in any of the patients. CONCLUSION: Optimal management of pancreaticopleural fistulas requires appropriate patient selection that should be based on the underlying pancreatic duct abnormalities.

  7. Asynchronous Code-Division Random Access Using Convex Optimization

    CERN Document Server

    Applebaum, Lorne; Duarte, Marco F; Calderbank, Robert

    2011-01-01

    Many applications in cellular systems and sensor networks involve a random subset of a large number of users asynchronously reporting activity to a base station. This paper examines the problem of multiuser detection (MUD) in random access channels for such applications. Traditional orthogonal signaling ignores the random nature of user activity in this problem and limits the total number of users to be on the order of the number of signal space dimensions. Contention-based schemes, on the other hand, suffer from delays caused by colliding transmissions and the hidden node problem. In contrast, this paper presents a novel asynchronous (non-orthogonal) code-division random access scheme along with a convex optimization-based MUD algorithm that overcomes the issues associated with orthogonal signaling and contention-based methods. Two key distinguishing features of the proposed algorithm are that it does not require knowledge of the delay or channel state information of every user and it has polynomial-time com...

  8. Code of ethics and ethical dilemmas' management in health professions

    Directory of Open Access Journals (Sweden)

    Sofia Triadafyllidou

    2011-10-01

    Full Text Available As the main interest of health professionals is the well-being of patients/clients, ethical decision making is one of the prominent elements of his/her professionalism. Aim: The present study aims to illustrate the role of ethical judgment and the so-called "moral imagination" in health professions. Method and material: Review of theoretical and research literature, including both classic and recent sources about ethical dilemmas that health professionals may anticipate, as well as the suggested ways to manage these dilemmas. Results: Health professionals often have to act in complicated situations. Review of relevant literature indicates that the professionals' ethical decisions are structured not only through the codes of ethics, but also through other collective practices, such as organizational culture and cultural schemas about the role of the health professional. Resorting to schematic thinking may temporarily release the professional from his/her concerns, but in the long run, it may deprive him/her of the sense of satisfaction from work and of the ability to offer clients the optimal care. The development of the so-called "moral imagination" permits the professional to advance from the typical application of the rules to actual ethical judgment. Conclusions: Ethical decision making presupposes not only a thorough knowledge of ethical guidelines, but also the development of the ability to openly reflect upon the ethical dimensions of an issue (moral imagination) that allows health professionals to overcome schematic thinking and investigate comprehensive solutions to ethical dilemmas.

  9. Technical Review of peephole Technique in compiler to optimize intermediate code

    Directory of Open Access Journals (Sweden)

    Vaishali Sanghvi

    2013-01-01

    Full Text Available Peephole optimization is an efficient and easy optimization technique used by compilers. A peephole, sometimes called a window, is a small set of instructions within which one sequence of instructions is replaced by another equivalent, but shorter or faster, sequence. Peephole optimization is traditionally done through string pattern matching, that is, using regular expressions. Some of the techniques of peephole optimization are constant folding, strength reduction, null sequences, combining operations, algebraic laws, special case instructions, and address mode operations. Peephole optimization is applied to several parts or sections of a program or code, so the main question is where to apply it: before compilation, on the intermediate code, or after compilation of the code. The aim of this dissertation is to show the current state of peephole optimization and how to apply it to the IR (Intermediate Representation) code that is generated by any programming language.
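    A minimal sketch of the window-based pattern matching described above, over a made-up register ISA (the three rules shown are the classic algebraic-law, combine-operations, and strength-reduction patterns from the abstract's list):

```python
def peephole(code):
    """Apply window-2 peephole rules to a list of (opcode, operand) tuples,
    repeating until a fixpoint is reached. The ISA is illustrative."""
    changed = True
    while changed:
        changed, out, i = False, [], 0
        while i < len(code):
            a = code[i]
            b = code[i + 1] if i + 1 < len(code) else None
            if a == ("ADDI", 0):                           # algebraic law: x + 0 = x
                i, changed = i + 1, True
            elif b and a[0] == "ADDI" and b[0] == "ADDI":  # combine operations
                out.append(("ADDI", a[1] + b[1]))
                i, changed = i + 2, True
            elif a == ("MULI", 2):                         # strength reduction: *2 -> shift
                out.append(("SHLI", 1))
                i, changed = i + 1, True
            else:
                out.append(a)
                i += 1
        code = out
    return code

prog = [("ADDI", 3), ("ADDI", 4), ("ADDI", 0), ("MULI", 2)]
print(peephole(prog))
```

A production peephole pass would also handle null sequences such as a store immediately followed by a redundant load, and would typically be driven by declarative pattern rules rather than hand-written branches.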

  10. Codes of practice and related issues in biomedical waste management

    Energy Technology Data Exchange (ETDEWEB)

    Moy, D.; Watt, C. [Griffith Univ. (Australia)

    1996-12-31

    This paper outlines the development of a National Code of Practice for biomedical waste management in Australia. The 10 key areas addressed by the code are industry mission statement; uniform terms and definitions; community relations - public perceptions and right to know; generation, source separation, and handling; storage requirements; transportation; treatment and disposal; disposal of solid and liquid residues and air emissions; occupational health and safety; staff awareness and education. A comparison with other industry codes in Australia is made. A list of outstanding issues is also provided; these include the development of standard containers, treatment effectiveness, and reusable sharps containers.

  11. ASPECTS OF OPTIMIZATION OF WATER MANAGEMENT SYSTEMS

    Directory of Open Access Journals (Sweden)

    E. BEILICCI

    2013-03-01

    Full Text Available Water management systems include all activities and works that provide the administration of the public domain of water, of local or national interest, and the qualitative, quantitative and sustainable management of water resources. Hydrotechnical arrangements, consisting of a set of hydraulic structures, have both favorable and unfavorable influences on the environment. Their different constructive and exploitation solutions exercise a significant impact on the environment. Therefore the advantages and disadvantages of each solution must be weighed, and the choice to realize one or the other of them must be seriously argued. The optimization of water management systems is needed to meet current and future requirements for rational water management in the context of integrated water resources management. The optimization process of complex water management systems includes several components related to environmental protection, the technical side and the business side. This paper summarizes the main aspects and possibilities of optimization of existing water management systems and of those that are yet to be achieved.

  12. Optimal Prefix Free Code: word-RAM Linear and Algebraic Instance Optimal

    CERN Document Server

    Barbay, Jérémy

    2012-01-01

    We describe a new technique to compute an optimal prefix-free code over n symbols from their frequencies {f_1, ..., f_n}. This technique yields an algorithm running in linear time in the Ω(lg n)-word RAM model when each frequency holds in O(1) words, hence improving on the O(n lg lg n) solution based on sorting in the word RAM model. In a more restricted model, this yields also an algorithm performing O(n(1+H(…
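
    For context, the sorting-based word-RAM baseline that this record improves on is the classical O(n lg n) Huffman construction; a minimal sketch of that baseline (not the paper's linear-time algorithm) using Python's heap:

    ```python
    import heapq

    def huffman_lengths(freqs):
        """Classical Huffman construction: returns optimal prefix-free
        codeword lengths for the given symbol frequencies."""
        if len(freqs) == 1:
            return [1]
        # heap entries: (weight, tiebreak counter, node);
        # a node is either a symbol index or a pair of child nodes
        heap = [(f, i, i) for i, f in enumerate(freqs)]
        heapq.heapify(heap)
        counter = len(freqs)
        while len(heap) > 1:
            f1, _, a = heapq.heappop(heap)   # two lightest subtrees
            f2, _, b = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, counter, (a, b)))
            counter += 1
        lengths = [0] * len(freqs)
        def walk(node, depth):
            if isinstance(node, int):
                lengths[node] = depth        # leaf: record codeword length
            else:
                walk(node[0], depth + 1)
                walk(node[1], depth + 1)
        walk(heap[0][2], 0)
        return lengths
    ```

    For frequencies (1, 1, 2, 4) this yields lengths (3, 3, 2, 1), i.e. a weighted code length of 14; the paper's contribution is reaching the same optimum in linear time when frequencies fit in O(1) machine words.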

  13. Hydrodynamic Optimization Method and Design Code for Stall-Regulated Hydrokinetic Turbine Rotors

    Energy Technology Data Exchange (ETDEWEB)

    Sale, D.; Jonkman, J.; Musial, W.

    2009-08-01

    This report describes the adaptation of a wind turbine performance code for use in the development of a general use design code and optimization method for stall-regulated horizontal-axis hydrokinetic turbine rotors. This rotor optimization code couples a modern genetic algorithm and blade-element momentum performance code in a user-friendly graphical user interface (GUI) that allows for rapid and intuitive design of optimal stall-regulated rotors. This optimization method calculates the optimal chord, twist, and hydrofoil distributions which maximize the hydrodynamic efficiency and ensure that the rotor produces an ideal power curve and avoids cavitation. Optimizing a rotor for maximum efficiency does not necessarily create a turbine with the lowest cost of energy, but maximizing the efficiency is an excellent criterion to use as a first pass in the design process. To test the capabilities of this optimization method, two conceptual rotors were designed which successfully met the design objectives.
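
    The GA-plus-performance-code loop described above can be sketched as follows; the fitness function here is an invented stand-in for the blade-element momentum code, and the (chord, twist) ranges and GA parameters are illustrative only:

    ```python
    import random

    def fitness(chord, twist):
        # Invented surrogate for the hydrodynamic performance code,
        # with a known optimum at chord = 0.5, twist = 10 degrees.
        return -((chord - 0.5) ** 2 + ((twist - 10.0) / 20.0) ** 2)

    def evolve(pop_size=30, generations=60, rng=random.Random(1)):
        """Toy genetic algorithm over (chord, twist) design pairs."""
        pop = [(rng.uniform(0, 1), rng.uniform(0, 30)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda ind: fitness(*ind), reverse=True)
            parents = pop[:pop_size // 2]          # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.sample(parents, 2)
                # blend crossover plus Gaussian mutation
                children.append(((a[0] + b[0]) / 2 + rng.gauss(0, 0.02),
                                 (a[1] + b[1]) / 2 + rng.gauss(0, 0.5)))
            pop = parents + children
        return max(pop, key=lambda ind: fitness(*ind))
    ```

    In the real design code the fitness evaluation would call the blade-element momentum model (with the cavitation and power-curve constraints folded into the score), and the chromosome would carry full chord, twist, and hydrofoil distributions rather than a single pair.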

  14. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, Janus; Zhang, Qi; Fitzek, Frank

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant...

  15. Greedy vs. L1 Convex Optimization in Sparse Coding

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    Sparse representation has been applied successfully in many image analysis applications, including abnormal event detection, in which a baseline is to learn a dictionary from the training data and detect anomalies from its sparse codes. During this procedure, sparse codes which can be achieved...... and action recognition, a comparative study of codes in abnormal event detection is less studied and hence no conclusion is gained on the effect of codes in detecting abnormalities. We restrict our comparison to two types of the above L0-norm solutions: greedy algorithms and convex L1-norm solutions....... Considering the property of abnormal event detection, i.e., only normal videos are used as training data due to practical reasons, effective codes in classification applications may not perform well in abnormality detection. Therefore, we compare the sparse codes and comprehensively evaluate their performance...

  16. Novel Area Optimization in FPGA Implementation Using Efficient VHDL Code

    Directory of Open Access Journals (Sweden)

    Zulfikar

    2012-10-01

    Full Text Available A novel method for area efficiency in FPGA implementation is presented. The method is realized through the flexibility and wide capability of VHDL coding, and addresses arithmetic operations such as addition, subtraction and others. The design technique aims to reduce the occupied area of multi-stage circuits by selecting a suitable range for every value involved in each step of the calculations. Conventional and efficient VHDL coding methods are presented and their synthesis results are compared. The VHDL code that limits the range of integer values occupies less area than the one that does not. This VHDL coding method is suitable for multi-stage circuits.

  17. Novel Area Optimization in FPGA Implementation Using Efficient VHDL Code

    Directory of Open Access Journals (Sweden)

    Zulfikar Zulfikar

    2015-05-01

    Full Text Available A novel method for area efficiency in FPGA implementation is presented. The method is realized through the flexibility and wide capability of VHDL coding, and addresses arithmetic operations such as addition, subtraction and others. The design technique aims to reduce the occupied area of multi-stage circuits by selecting a suitable range for every value involved in each step of the calculations. Conventional and efficient VHDL coding methods are presented and their synthesis results are compared. The VHDL code that limits the range of integer values occupies less area than the one that does not. This VHDL coding method is suitable for multi-stage circuits.

  18. A New Method Of Gene Coding For A Genetic Algorithm Designed For Parametric Optimization

    Directory of Open Access Journals (Sweden)

    Radu BELEA

    2003-12-01

    Full Text Available In a parametric optimization problem the genes encode the real parameters of the fitness function. Two coding techniques are known, under the names of binary-coded genes and real-coded genes. The comparison between these two has been a controversial subject since the first papers on parametric optimization appeared. An objective analysis of the advantages and disadvantages of the two coding techniques is difficult while information in different formats is being compared. The present paper suggests a gene coding technique that uses the same format for both binary-coded genes and real-coded genes. After unifying the representation of the real parameters, the following criterion is applied: the differences between the two techniques are measured statistically by the effect of the genetic operators on some randomly generated individuals.
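
    A minimal sketch of the fixed-format idea behind binary-coded genes: a real parameter in a known interval is quantized onto a fixed-width integer gene and recovered on decoding. The interval and bit width below are illustrative, not from the paper:

    ```python
    def encode(value, lo, hi, bits):
        """Map a real parameter in [lo, hi] onto a 'bits'-wide integer gene."""
        step = (hi - lo) / (2 ** bits - 1)      # quantization step
        return round((value - lo) / step)

    def decode(gene, lo, hi, bits):
        """Recover the real parameter represented by an integer gene."""
        step = (hi - lo) / (2 ** bits - 1)
        return lo + gene * step
    ```

    The round trip loses at most half a quantization step, which is the classic trade-off between binary-coded genes (fixed precision, simple bitwise operators) and real-coded genes (full precision, arithmetic operators) that the record's unified format is designed to make comparable.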

  19. Managing purchasing and inventory with bar codes. Part II.

    Science.gov (United States)

    Gandy, J

    1986-04-01

    Automated identification systems (bar coding) have proven their worth in a number of diverse manufacturing and materials handling environments. Whether this technology can be broadly applied with equal effectiveness in the health care setting still remains largely unproven. One thing is certain, however. Before bar code technology can be effectively applied in any setting, the materials manager must understand several basic concepts: How materials flow through his physical plant; How, where and in what amounts they are used; and How and when they are expensed. With this information, it is possible to create a systematized approach to materials cost containment, of which bar coding is one element. In the following article, the author illustrates how purchasing and inventory control can be made more time and labor efficient through the use of bar code technology.

  20. Greedy vs. L1 convex optimization in sparse coding

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor;

    2015-01-01

    , such as face and action recognition, a comparative study of codes in abnormal event detection is less studied and hence no conclusion is gained on the effect of codes in detecting abnormalities. We constrict our comparison in two types of the above L0-norm solutions: greedy algorithms and convex L1-norm...

  1. Greenhouse climate management: an optimal control approach.

    NARCIS (Netherlands)

    Henten, van E.J.

    1994-01-01

    In this thesis a methodology is developed for the construction and analysis of an optimal greenhouse climate control system.In chapter 1, the results of a literature survey are presented and the research objectives are defined. In the literature, optimal greenhouse climate management systems have be

  2. Extreme genetic code optimality from a molecular dynamics calculation of amino acid polar requirement

    Science.gov (United States)

    Butler, Thomas; Goldenfeld, Nigel; Mathew, Damien; Luthey-Schulten, Zaida

    2009-06-01

    A molecular dynamics calculation of the amino acid polar requirement is used to score the canonical genetic code. Monte Carlo simulation shows that this computational polar requirement has been optimized by the canonical genetic code, an order of magnitude more than any previously known measure, effectively ruling out a vertical evolution dynamics. The sensitivity of the optimization to the precise metric used in code scoring is consistent with code evolution having proceeded through the communal dynamics of statistical proteins using horizontal gene transfer, as recently proposed. The extreme optimization of the genetic code therefore strongly supports the idea that the genetic code evolved from a communal state of life prior to the last universal common ancestor.

  3. Extreme genetic code optimality from a molecular dynamics calculation of amino acid polar requirement.

    Science.gov (United States)

    Butler, Thomas; Goldenfeld, Nigel; Mathew, Damien; Luthey-Schulten, Zaida

    2009-06-01

    A molecular dynamics calculation of the amino acid polar requirement is used to score the canonical genetic code. Monte Carlo simulation shows that this computational polar requirement has been optimized by the canonical genetic code, an order of magnitude more than any previously known measure, effectively ruling out a vertical evolution dynamics. The sensitivity of the optimization to the precise metric used in code scoring is consistent with code evolution having proceeded through the communal dynamics of statistical proteins using horizontal gene transfer, as recently proposed. The extreme optimization of the genetic code therefore strongly supports the idea that the genetic code evolved from a communal state of life prior to the last universal common ancestor.

  4. An optimized framework for degree distribution in LT codes based on power law

    Institute of Scientific and Technical Information of China (English)

    Asim; Muhammad; Choi; GoangSeog

    2013-01-01

    Full Text Available LT codes are a practical realization of digital fountain codes, which provide the concept of rateless coding. In this scheme, encoded symbols are generated without limit from k information symbols, and the decoder uses only (1+α)k encoded symbols to recover the original information. The degree distribution function in LT codes helps to generate a random graph, also referred to as a Tanner graph. The structure of the Tanner graph is responsible for the computational complexity and overhead of LT codes. Intuitively, a well-designed degree distribution can be used for an efficient implementation of LT codes. The degree distribution function is studied as a function of a power law, and LT codes are classified into two categories: SFLT and RLT codes. Also, two different degree distributions are proposed and analyzed for SFLT codes which guarantee optimal performance in terms of computational complexity and overhead.
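
    For reference, the degree distribution most commonly associated with LT codes is Luby's ideal soliton; a small sampling sketch (the SFLT/RLT power-law distributions proposed in this record are not reproduced here):

    ```python
    import random

    def ideal_soliton(k):
        """Ideal soliton distribution over degrees 1..k:
        rho(1) = 1/k, rho(d) = 1/(d(d-1)) for d = 2..k."""
        return [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

    def sample_degree(rho, rng=random):
        """Inverse-CDF sampling of an encoding-symbol degree."""
        u, acc = rng.random(), 0.0
        for d in range(1, len(rho)):
            acc += rho[d]
            if u < acc:
                return d
        return len(rho) - 1
    ```

    Each encoded symbol XORs `sample_degree` randomly chosen information symbols; the telescoping sum 1/k + Σ 1/(d(d-1)) = 1 makes the distribution proper, and its heavy weight on degree 2 is what drives the iterative decoder.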

  5. Nexus: A modular workflow management system for quantum simulation codes

    Science.gov (United States)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  6. Performance of GTX Titan X GPUs and Code Optimization

    CERN Document Server

    Jeong, Hwancheol; Lee, Weonjong; Pak, Jeonghwan; Kim, Jangho; Chung, Juhyun

    2015-01-01

    Recently Nvidia has released a new GPU model, the GTX Titan X (TX), in the lineage of the Maxwell architecture. We use our conjugate gradient code and non-perturbative renormalization code to measure the performance of TX. The results are compared with those of the GTX Titan Black (TB), in the lineage of the Kepler architecture. We observe a significant gain in single and double precision calculations, much greater than the theoretical expectation.

  7. LDPC code optimization techniques to improve the error correction threshold

    Directory of Open Access Journals (Sweden)

    Роман Сергійович Новиков

    2015-11-01

    Full Text Available Non-empty stopping sets, which are the main reason for reaching the error threshold in data transmission channels, are studied. A new algorithm for deriving the smallest stopping sets and the stop distance of any LDPC code is proposed. A more functional and flexible splitting-and-filling technique is also proposed. The time in which the smallest stopping sets can be derived and the stop distance of any LDPC code found is calculated.
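
    The stop distance mentioned here can be computed by brute force on small codes: a stopping set is a non-empty set of variable nodes such that no check node touches it exactly once. A sketch (exponential in block length, for illustration only; the record's algorithm is precisely about avoiding this cost):

    ```python
    from itertools import combinations

    def stop_distance(H):
        """Smallest non-empty stopping set of parity-check matrix H
        (a list of 0/1 rows). Returns (size, set) or None."""
        n = len(H[0])
        for size in range(1, n + 1):
            for S in combinations(range(n), size):
                # S is a stopping set iff no check touches it exactly once
                if all(sum(row[j] for j in S) != 1 for row in H):
                    return size, set(S)
        return None
    ```

    For the length-3 repetition code with checks (v0+v1, v1+v2) the smallest stopping set is all three variable nodes, matching the intuition that stop distance bounds erasure-decoding performance the way minimum distance bounds error correction.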

  8. Optimizing Performance Management by Management by Objectives

    OpenAIRE

    Tong, Xi

    2011-01-01

    This study presents the critical problem of the development of Chinese small and medium-sized enterprises, where unstructured management hinders the healthy growth of the enterprise. Changchun Chengshi Ltd. Co China is a small enterprise troubled by an inefficient work force. In order to increase productivity the owner of Chengshi requested a solution to extricate the company from this difficult position. This thesis was conducted to fulfill the owner’s request. The initial objective of...

  9. Optimization and resilience in natural resources management

    Science.gov (United States)

    Williams, Byron K.; Johnson, Fred A.

    2015-01-01

    We consider the putative tradeoff between optimization and resilience in the management of natural resources, using a framework that incorporates different sources of uncertainty that are common in natural resources management. We address one-time decisions, and then expand the decision context to the more complex problem of iterative decision making. For both cases we focus on two key sources of uncertainty: partial observability of system state and uncertainty as to system dynamics. Optimal management strategies will vary considerably depending on the timeframe being considered and the amount and quality of information that is available to characterize system features and project the consequences of potential decisions. But in all cases an optimal decision making framework, if properly identified and focused, can be useful in recognizing sound decisions. We argue that under the conditions of deep uncertainty that characterize many resource systems, an optimal decision process that focuses on robustness does not automatically induce a loss of resilience.

  10. Variation in coding influence across the USA. Risk and reward in reimbursement optimization.

    Science.gov (United States)

    Lorence, Daniel P; Richards, Michael

    2002-01-01

    Recent anti-fraud enforcement policies across the US health-care system have led to widespread speculation about the effectiveness of increased penalties for overcharging practices adopted by health-care service organizations. Severe penalties, including imprisonment, suggest that fraudulent billing, and related misclassification of services provided to patients, would be greatly reduced or eliminated as a result of increased government investigation and reprisal. This study sought to measure the extent to which health information managers reported being influenced by superiors to manipulate coding and classification of patient data. Findings from a nationwide survey of managers suggest that such practices are still pervasive, despite recent counter-fraud legislation and highly visible prosecution of fraudulent behaviors. Examining variation in influences exerted from both within and external to specific service delivery settings, results suggest that pressure to alter classification codes occurred both within and external to the provider setting. We also examine how optimization influences vary across demographic, practice setting, and market characteristics, and find significant variation in influence across practice settings and market types. Implications for reimbursement programs and evidence-based health care are discussed.

  11. Simulated evolution applied to study the genetic code optimality using a model of codon reassignments

    Directory of Open Access Journals (Sweden)

    Monteagudo Ángel

    2011-02-01

    Full Text Available Abstract Background As the canonical code is not universal, different theories about its origin and organization have appeared. The optimization or level of adaptation of the canonical genetic code was measured taking into account the harmful consequences resulting from point mutations leading to the replacement of one amino acid for another. There are two basic theories to measure the level of optimization: the statistical approach, which compares the canonical genetic code with many randomly generated alternative ones, and the engineering approach, which compares the canonical code with the best possible alternative. Results Here we used a genetic algorithm to search for better adapted hypothetical codes and as a method to guess the difficulty in finding such alternative codes, allowing to clearly situate the canonical code in the fitness landscape. This novel proposal of the use of evolutionary computing provides a new perspective in the open debate between the use of the statistical approach, which postulates that the genetic code conserves amino acid properties far better than expected from a random code, and the engineering approach, which tends to indicate that the canonical genetic code is still far from optimal. We used two models of hypothetical codes: one that reflects the known examples of codon reassignment and the model most used in the two approaches which reflects the current genetic code translation table. Although the standard code is far from a possible optimum considering both models, when the more realistic model of the codon reassignments was used, the evolutionary algorithm had more difficulty to overcome the efficiency of the canonical genetic code. Conclusions Simulated evolution clearly reveals that the canonical genetic code is far from optimal regarding its optimization. Nevertheless, the efficiency of the canonical code increases when mistranslations are taken into account with the two models, as indicated by the
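
    The statistical approach discussed above can be illustrated at toy scale: score a code by the mean squared change of an amino-acid property under all single-point codon mutations, then ask what fraction of randomly shuffled codes score better. The 8-codon binary alphabet and property values below are invented for illustration; real studies use the 64-codon table and measured polar requirements:

    ```python
    import itertools
    import random

    CODONS = [''.join(c) for c in itertools.product('01', repeat=3)]
    PROPERTY = {'K': 1.0, 'L': 2.0, 'M': 4.0, 'N': 8.0}   # toy amino acids

    def score(code):
        """Mean squared property change over all single-point mutations;
        lower means the code buffers mutations better."""
        total, count = 0.0, 0
        for codon in CODONS:
            for pos in range(3):
                flipped = '1' if codon[pos] == '0' else '0'
                mutant = codon[:pos] + flipped + codon[pos + 1:]
                d = PROPERTY[code[codon]] - PROPERTY[code[mutant]]
                total += d * d
                count += 1
        return total / count

    def fraction_better(code, trials=200, seed=0):
        """Statistical approach: fraction of shuffled codes beating 'code'."""
        rng = random.Random(seed)
        aas = [code[c] for c in CODONS]
        base = score(code)
        wins = 0
        for _ in range(trials):
            shuffled = aas[:]
            rng.shuffle(shuffled)
            if score(dict(zip(CODONS, shuffled))) < base:
                wins += 1
        return wins / trials
    ```

    A "block" code that makes the last codon position silent (adjacent codons share an amino acid) scores well below the shuffled average, which is the toy analogue of the canonical code's reported optimality.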

  12. How to optimally tune sparse network coding over wireless links

    DEFF Research Database (Denmark)

    Garrido, Pablo; Roetter, Daniel Enrique Lucani; Aguero, Ramon

    2017-01-01

    Despite their high computational complexity, Random Linear Network Coding (RLNC) techniques have been shown to offer a good robustness against packet erasure wireless channels. Some approaches have been recently proposed to reduce such computational burden, for both encoder and decoder elements. One of those are the so-called Tunable Sparse Network Coding (TSNC) techniques, which advocate limiting the number of packets that are combined to build a coded packet. They also propose dynamically adapting the corresponding sparsity level as the transmission evolves, although an optimum tuning ... the proposed scheme offers a better trade-off between computational complexity and network performance. Furthermore, we broaden the analysis of TSNC techniques by thoroughly assessing their behavior over wireless networks using the ns-3 platform. The results yield a remarkable complexity reduction (approx. 3...

  13. Code Optimization on Kepler GPUs and Xeon Phi

    CERN Document Server

    Jang, Yong-Chull; Kim, Jangho; Lee, Weonjong; Pak, Jeonghwan; Chung, Yuree

    2014-01-01

    Kepler GTX Titan Black and Kepler Tesla K40 are still the best GPUs for high performance computing, although Maxwell GPUs such as the GTX 980 are available on the market. Hence, we measure the performance of our lattice QCD codes using the Kepler GPUs. We also upgrade our code to use the latest CPS (Columbia Physics System) library along with the most recent QUDA (QCD CUDA) library for lattice QCD. These new libraries improve the performance of our conjugate gradient (CG) inverter so that it runs twice as fast as before. We also investigate the performance of the Xeon Phi 7120P coprocessor. It has computing power similar to the Kepler GPUs in principle. However, its performance for our CG code is significantly inferior to that of the GTX Titan Black GPUs at present.

  14. Source-channel optimized trellis codes for bitonal image transmission over AWGN channels.

    Science.gov (United States)

    Kroll, J M; Phamdo, N

    1999-01-01

    We consider the design of trellis codes for transmission of binary images over additive white Gaussian noise (AWGN) channels. We first model the image as a binary asymmetric Markov source (BAMS) and then design source-channel optimized (SCO) trellis codes for the BAMS and AWGN channel. The SCO codes are shown to be superior to Ungerboeck's codes by approximately 1.1 dB (64-state code, 10^-5 bit error probability). We also show that a simple "mapping conversion" method can be used to improve the performance of Ungerboeck's codes by approximately 0.4 dB (also 64-state code and 10^-5 bit error probability). We compare the proposed SCO system with a traditional tandem system consisting of a Huffman code, a convolutional code, an interleaver, and an Ungerboeck trellis code. The SCO system significantly outperforms the tandem system. Finally, using a facsimile image, we compare the image quality of an SCO code, an Ungerboeck code, and the tandem code. The SCO code yields the best reconstructed image quality at 4-5 dB channel SNR.

  15. Optimization of Mapping Rule of Bit-Interleaved Turbo Coded Modulation with 16QAM

    Institute of Scientific and Technical Information of China (English)

    FEI Ze-song; YANG Yu; LIU Lin-nan; KUANG Jing-ming

    2005-01-01

    Optimization of the mapping rule of bit-interleaved Turbo coded modulation with 16 quadrature amplitude modulation (QAM) is investigated, based on the different impacts of the various encoded bit sequences on Turbo decoding performance. Furthermore, a bit-interleaved in-phase and quadrature-phase (I-Q) Turbo coded modulation scheme is designed, similarly to I-Q trellis coded modulation (TCM). Through performance evaluation and analysis, it can be seen that the novel mapping rule outperforms the traditional one, and that I-Q Turbo coded modulation cannot achieve as good a performance as expected. Therefore, there is no obvious advantage in using the I-Q method in bit-interleaved Turbo coded modulation.

  16. Two-Layer Coding Rate Optimization in Relay-Aided Systems

    DEFF Research Database (Denmark)

    Sun, Fan

    2011-01-01

    We consider a three-node transmission system, where a source node conveys a data block to a destination node with the help of a half-duplex decode-and-forward (DF) relay node. The whole data block is transmitted as a sequence of packets. For reliable transmission in the three-node system, a two-layer coding scheme is proposed, where physical-layer channel coding is utilized within each packet for error correction and random network coding is applied on top of channel coding for network error control. There is a natural tradeoff between the physical-layer coding rate and the network coding rate given ... requirement. Numerical results are also provided to show the optimized physical-layer coding and network coding rate pairs in different system scenarios.

  17. Perceptual Zero-Tree Coding with Efficient Optimization for Embedded Platforms

    Directory of Open Access Journals (Sweden)

    B. F. Wu

    2013-08-01

    Full Text Available This study proposes a block-edge-based perceptual zero-tree coding (PZTC) method, which is implemented with efficient optimization on an embedded platform. PZTC combines two novel compression concepts for coding efficiency and quality: block-edge detection (BED) and the low-complexity and low-memory entropy coder (LLEC). The proposed PZTC was implemented as a fixed-point version and optimized on the DSP-based platform based on both the presented platform-independent and platform-dependent optimization technologies. For platform-dependent optimization, this study examines the fixed-point PZTC and analyzes the complexity to optimize PZTC toward achieving an optimal coding efficiency. Furthermore, hardware-based platform-dependent optimizations are presented to reduce the memory size. The performance, such as compression quality and efficiency, is validated by experimental results.

  18. Simulation and Optimization of VHDL code for FPGA-Based Design using Simulink

    Directory of Open Access Journals (Sweden)

    Naresh Grover

    2014-06-01

    Full Text Available Simulation and prototyping have been a very important part of the electronics industry for a very long time. In recent years, FPGAs have become increasingly important and have found their way into all kinds of digital system design. This paper presents a novel, easy and efficient approach to the implementation and verification of VHDL code using Simulink, and to then regenerating optimized VHDL code, again using Simulink. The VHDL code written for the complicated digital design of a 32-bit floating-point arithmetic unit has been synthesized on Xilinx, and verified and simulated in Simulink. The same VHDL code in Modelsim was optimized using this approach, and the optimized code generated by Simulink has also been synthesized to compare the results. Power dissipation figures for both synthesized designs were also extracted with the Xilinx Power Estimator for comparison.

  19. On Optimal Policies for Network-Coded Cooperation

    DEFF Research Database (Denmark)

    Khamfroush, Hana; Roetter, Daniel Enrique Lucani; Pahlevani, Peyman

    2015-01-01

    's Raspberry Pi testbed and compared with random linear network coding (RLNC) broadcast in terms of completion time, total number of required transmissions, and percentage of delivered generations. Our measurements show that enabling cooperation only among pairs of devices can decrease the completion time...

  20. Greedy vs. L1 convex optimization in sparse coding

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor;

    2015-01-01

    through finding the L0-norm solution of the problem min ||Y - D*alpha||_2^2 + ||alpha||_0, is crucial. Note that D refers to the dictionary and alpha refers to the sparse codes. This L0-norm solution, however, is known to be an NP-hard problem. Despite the research achievements in some classification fields...
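
    A greedy L0 solver of the kind compared in this record can be sketched as plain Matching Pursuit (a simplified relative of OMP), assuming unit-norm dictionary atoms; the dictionary and signal below are illustrative:

    ```python
    def matching_pursuit(D, y, k):
        """Greedy sparse coding sketch: Matching Pursuit with k iterations.
        D is a list of unit-norm atoms (columns), y is the signal."""
        def dot(u, v):
            return sum(a * b for a, b in zip(u, v))

        residual = list(y)
        coeffs = [0.0] * len(D)
        for _ in range(k):
            # pick the atom most correlated with the current residual
            j = max(range(len(D)), key=lambda i: abs(dot(D[i], residual)))
            c = dot(D[j], residual)          # projection onto the atom
            coeffs[j] += c
            residual = [r - c * d for r, d in zip(residual, D[j])]
        return coeffs, residual
    ```

    Each iteration grows the support by (at most) one atom, which is the greedy counterpart to relaxing ||alpha||_0 into the convex ||alpha||_1 penalty that L1 solvers minimize.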

  1. On the Existence of Optimal Exact-Repair MDS Codes for Distributed Storage

    CERN Document Server

    Suh, Changho

    2010-01-01

    The high repair cost of (n,k) Maximum Distance Separable (MDS) erasure codes has recently motivated a new class of codes, called Regenerating Codes, that optimally trade off storage cost for repair bandwidth. In this paper, we address bandwidth-optimal (n,k,d) Exact-Repair MDS codes, which allow for any failed node to be repaired exactly with access to arbitrary d survivor nodes, where k<=d<=n-1. We show the existence of Exact-Repair MDS codes that achieve minimum repair bandwidth (matching the cutset lower bound) for arbitrary admissible (n,k,d), i.e., k<=d<=n-1, using codes which allow to split symbols into arbitrarily small subsymbols.

  2. Reed-Solomon Turbo Product Codes for Optical Communications: From Code Optimization to Decoder Design

    Directory of Open Access Journals (Sweden)

    Le Bidan Raphaël

    2008-01-01

    Full Text Available Abstract Turbo product codes (TPCs are an attractive solution to improve link budgets and reduce systems costs by relaxing the requirements on expensive optical devices in high capacity optical transport systems. In this paper, we investigate the use of Reed-Solomon (RS turbo product codes for 40 Gbps transmission over optical transport networks and 10 Gbps transmission over passive optical networks. An algorithmic study is first performed in order to design RS TPCs that are compatible with the performance requirements imposed by the two applications. Then, a novel ultrahigh-speed parallel architecture for turbo decoding of product codes is described. A comparison with binary Bose-Chaudhuri-Hocquenghem (BCH TPCs is performed. The results show that high-rate RS TPCs offer a better complexity/performance tradeoff than BCH TPCs for low-cost Gbps fiber optic communications.

  3. Reed-Solomon Turbo Product Codes for Optical Communications: From Code Optimization to Decoder Design

    Directory of Open Access Journals (Sweden)

    Ramesh Pyndiah

    2008-05-01

    Full Text Available Turbo product codes (TPCs are an attractive solution to improve link budgets and reduce systems costs by relaxing the requirements on expensive optical devices in high capacity optical transport systems. In this paper, we investigate the use of Reed-Solomon (RS turbo product codes for 40 Gbps transmission over optical transport networks and 10 Gbps transmission over passive optical networks. An algorithmic study is first performed in order to design RS TPCs that are compatible with the performance requirements imposed by the two applications. Then, a novel ultrahigh-speed parallel architecture for turbo decoding of product codes is described. A comparison with binary Bose-Chaudhuri-Hocquenghem (BCH TPCs is performed. The results show that high-rate RS TPCs offer a better complexity/performance tradeoff than BCH TPCs for low-cost Gbps fiber optic communications.

  4. The Optimal Fix-Free Code for Anti-Uniform Sources

    Directory of Open Access Journals (Sweden)

    Ali Zaghian

    2015-03-01

    Full Text Available An \\(n\\ symbol source which has a Huffman code with codelength vector \\(L_{n}=(1,2,3,\\cdots,n-2,n-1,n-1\\ is called an anti-uniform source. In this paper, it is shown that for this class of sources, the optimal fix-free code and symmetric fix-free code is \\( C_{n}^{*}=(0,11,101,1001,\\cdots,1\\overbrace{0\\cdots0}^{n-2}1.

  5. Supply chain management and optimization in manufacturing

    CERN Document Server

    Pirim, Harun; Yilbas, Bekir Sami

    2014-01-01

    This book introduces general supply chain terminology, particularly for novice readers, along with state-of-the-art supply chain management and optimization issues and problems in manufacturing. The book provides insights for making supply chain decisions, planning, and scheduling through the supply chain network. It introduces optimization problems faced throughout the supply chain network, e.g., transportation of raw materials and products, and the location and inventory of plants, warehouses, and retailers.

  6. Optimal Reserve Management Pattern for Turkish Banks

    OpenAIRE

    Anil Talasli; Suheyla Ozyildirim

    2013-01-01

    In this paper we model a representative bank’s optimal reserve management pattern by adopting institutional aspects of the Turkish required reserve regime. The cost minimization problem, in general, takes into account the expected opportunity cost of holding reserves and penalty costs for not fulfilling the liability of reserve requirement. We extend this problem for reserve carry-over provision and use of late liquidity window facility and compute the optimal reserve pattern by dynamic progr...

  7. A Tool for Optimizing the Build Performance of Large Software Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Kontogiannis, K; Tjortjis, C; Winter, A

    2008-01-01

    We present Build Analyzer, a tool that helps developers optimize the build performance of huge systems written in C. Due to complex C header dependencies, even small code changes can cause extremely long rebuilds, which are problematic when code is shared and modified by teams of hundreds of individuals.

  9. Optimal management of equine keratomycosis

    Directory of Open Access Journals (Sweden)

    Brooks DE

    2012-03-01

    Full Text Available Paula D Galera (College of Veterinary Medicine, University of Brasilia, DF, Brazil) and Dennis E Brooks (Departments of Large and Small Animal Clinical Sciences, College of Veterinary Medicine, University of Florida, Gainesville, FL, USA). Abstract: Keratomycosis in the horse exists in several unique clinical forms. This paper discusses the diagnosis and clinical management of keratomycosis in the horse associated with tear film instability, epithelial keratopathy, subepithelial infiltrates, superficial and deep ulcers, plaques, melting ulcers, descemetoceles, iris prolapse, and stromal abscesses. Prompt diagnosis and aggressive treatment of equine keratomycosis can make a major difference in the maintenance of a cosmetic and visual eye. Keywords: fungal keratitis, keratomycosis, horse, cornea, melting, keratoplasty

  10. Integration of QR codes into an anesthesia information management system for resident case log management.

    Science.gov (United States)

    Avidan, Alexander; Weissman, Charles; Levin, Phillip D

    2015-04-01

    Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and was available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability/user-friendliness of such a system. Resident case logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction amongst residents were reassessed at three and six months. Before QR code introduction, only 12/23 (52.2%) residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months, 17/26 (65.4%) and 15/25 (60.0%) residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced in an anesthesia information management system and used by most residents. QR codes can be successfully implemented into medical practice to support data transfer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Optimization and analysis of code-division multiplexed TES microcalorimeters

    CERN Document Server

    Fowler, J W; Hilton, G C; Irwin, K D; Schmidt, D R; Stiehl, G M; Swetz, D S; Ullom, J N; Vale, L R

    2011-01-01

    We are developing code-division multiplexing (CDM) systems for transition-edge sensor arrays with the goal of reaching multiplexing factors in the hundreds. We report on x-ray measurements made with a four-channel prototype CDM system that employs a flux-summing architecture, emphasizing data-analysis issues. We describe an empirical method to determine the demodulation matrix that minimizes cross-talk. This CDM system achieves energy resolutions of between 2.3 eV and 3.0 eV FWHM at 5.9 keV.
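A toy flux-summing model can illustrate the empirical-calibration idea (the 4-channel size matches the prototype; the gain-mismatch level and the pulse-each-channel calibration procedure are our assumptions, not the authors' method): measure each channel's modulated response, then invert the measured response matrix to obtain a demodulation matrix that removes cross-talk.

```python
import numpy as np

# 4-channel Walsh (Hadamard) modulation matrix for the flux-summing CDM
H2 = np.array([[1, 1], [1, -1]])
W = np.kron(H2, H2)

rng = np.random.default_rng(0)
W_actual = W * (1 + 0.02 * rng.standard_normal(W.shape))  # per-element gain mismatch

# Calibrate: unit input on each detector in turn gives the columns of the
# actual response; inverting it yields the cross-talk-minimizing demodulator.
D = np.linalg.pinv(W_actual)

s = np.array([0.0, 1.0, 0.0, 0.0])   # one detector absorbs an x-ray
recovered = D @ (W_actual @ s)       # demodulated signals, cross-talk removed
```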

  12. Anytime coding on the infinite bandwidth AWGN channel: A sequential semi-orthogonal optimal code

    OpenAIRE

    Sahai, Anant

    2006-01-01

    It is well known that orthogonal coding can be used to approach the Shannon capacity of the power-constrained AWGN channel without a bandwidth constraint. This correspondence describes a semi-orthogonal variation of pulse position modulation that is sequential in nature -- bits can be "streamed across" without having to buffer up blocks of bits at the transmitter. ML decoding results in an exponentially small probability of error as a function of tolerated receiver delay and thus eventually...

  13. A new algorithm for optimizing the wavelength coverage for spectroscopic studies: Spectral Wavelength Optimization Code (SWOC)

    Science.gov (United States)

    Ruchti, G. R.; Feltzing, S.; Lind, K.; Caffau, E.; Korn, A. J.; Schnurr, O.; Hansen, C. J.; Koch, A.; Sbordone, L.; de Jong, R. S.

    2016-09-01

    The past decade and a half has seen the design and execution of several ground-based spectroscopic surveys, both Galactic and Extragalactic. Additionally, new surveys are being designed that extend the boundaries of current surveys. In this context, many important considerations must be made when designing a spectrograph for the future. Among these is the determination of the optimum wavelength coverage. In this work, we present a new code for determining the wavelength ranges that provide the optimal amount of information to achieve the required science goals for a given survey. In its first mode, it utilizes a user-defined list of spectral features to compute a figure-of-merit for different spectral configurations. The second mode utilizes a set of flux-calibrated spectra, determining the spectral regions that show the largest differences among the spectra. Our algorithm is easily adaptable for any set of science requirements and any spectrograph design. We apply the algorithm to several examples, including 4MOST, showing that the method yields important design constraints on the wavelength regions.
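The first mode can be caricatured in a few lines; the line list, weights, and window width below are invented for illustration (the real code also handles multi-arm configurations and instrument throughput):

```python
# Hypothetical line list: (wavelength in nm, science weight)
lines = [(393.4, 3.0), (396.8, 3.0), (422.7, 1.5), (516.7, 2.0),
         (518.4, 2.0), (656.3, 1.0), (849.8, 2.5), (854.2, 2.5), (866.2, 2.5)]

def best_window(lines, width):
    """Slide a wavelength window over candidate starts and return the
    (window start, figure-of-merit) maximizing the summed feature weight."""
    best = (None, -1.0)
    for start, _ in lines:  # each feature is a candidate window start
        fom = sum(w for wl, w in lines if start <= wl < start + width)
        if fom > best[1]:
            best = (start, fom)
    return best

start, fom = best_window(lines, 30.0)  # e.g. a 30 nm spectrograph arm
```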

  14. A new algorithm for optimizing the wavelength coverage for spectroscopic studies: Spectral Wavelength Optimization Code (SWOC)

    CERN Document Server

    Ruchti, G R; Lind, K; Caffau, E; Korn, A J; Schnurr, O; Hansen, C J; Koch, A; Sbordone, L; de Jong, R S

    2016-01-01

    The past decade and a half has seen the design and execution of several ground-based spectroscopic surveys, both Galactic and Extra-galactic. Additionally, new surveys are being designed that extend the boundaries of current surveys. In this context, many important considerations must be made when designing a spectrograph for the future. Among these is the determination of the optimum wavelength coverage. In this work, we present a new code for determining the wavelength ranges that provide the optimal amount of information to achieve the required science goals for a given survey. In its first mode, it utilizes a user-defined list of spectral features to compute a figure-of-merit for different spectral configurations. The second mode utilizes a set of flux-calibrated spectra, determining the spectral regions that show the largest differences among the spectra. Our algorithm is easily adaptable for any set of science requirements and any spectrograph design. We apply the algorithm to several examples, includin...

  15. Characterization and Optimization of LDPC Codes for the 2-User Gaussian Multiple Access Channel

    Directory of Open Access Journals (Sweden)

    Declercq David

    2007-01-01

    Full Text Available We address the problem of designing good LDPC codes for the Gaussian multiple access channel (MAC). The framework we choose is to design multiuser LDPC codes with joint belief propagation decoding on the joint graph of the 2-user case. Our main result compared to existing work is to express analytically EXIT functions of the multiuser decoder with two different approximations of the density evolution. This allows us to propose a very simple linear programming optimization for the complicated problem of LDPC code design with joint multiuser decoding. The stability condition for our case is derived and used in the optimization constraints. The codes that we obtain for the 2-user case are quite good for various rates, especially if we consider the very simple optimization procedure.

  16. Characterization and Optimization of LDPC Codes for the 2-User Gaussian Multiple Access Channel

    Directory of Open Access Journals (Sweden)

    Aline Roumy

    2007-06-01

    Full Text Available We address the problem of designing good LDPC codes for the Gaussian multiple access channel (MAC). The framework we choose is to design multiuser LDPC codes with joint belief propagation decoding on the joint graph of the 2-user case. Our main result compared to existing work is to express analytically EXIT functions of the multiuser decoder with two different approximations of the density evolution. This allows us to propose a very simple linear programming optimization for the complicated problem of LDPC code design with joint multiuser decoding. The stability condition for our case is derived and used in the optimization constraints. The codes that we obtain for the 2-user case are quite good for various rates, especially if we consider the very simple optimization procedure.

  17. Portfolio Optimization under Entropic Risk Management

    Institute of Scientific and Technical Information of China (English)

    Wei ZHONG

    2009-01-01

    In this paper, properties of the entropic risk measure are examined rigorously in a general framework. This risk measure is then applied in a dynamic portfolio optimization problem, appearing in the risk management constraint. By considering the dual problem, we prove the existence and uniqueness of the solution and obtain an analytic expression for the solution.

  18. Context-based lossless image compression with optimal codes for discretized Laplacian distributions

    Science.gov (United States)

    Giurcaneanu, Ciprian Doru; Tabus, Ioan; Stanciu, Cosmin

    2003-05-01

    Lossless image compression has become an important research topic, especially in relation with the JPEG-LS standard. Recently, the techniques known for designing optimal codes for sources with infinite alphabets have been applied for the quantized Laplacian sources which have probability mass functions with two geometrically decaying tails. Due to the simple parametric model of the source distribution the Huffman iterations are possible to be carried out analytically, using the concept of reduced source, and the final codes are obtained as a sequence of very simple arithmetic operations, avoiding the need to store coding tables. We propose the use of these (optimal) codes in conjunction with context-based prediction, for noiseless compression of images. To reduce further the average code length, we design Escape sequences to be employed when the estimation of the distribution parameter is unreliable. Results on standard test files show improvements in compression ratio when comparing with JPEG-LS.
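For a one-sided geometric tail, the "sequence of very simple arithmetic operations" amounts to a Golomb-style code; a generic Golomb encoder (our sketch as an illustration, not the paper's exact two-sided construction) looks like:

```python
def golomb_encode(n, m):
    """Golomb code for a nonnegative integer n with parameter m:
    unary-coded quotient followed by a truncated-binary remainder."""
    q, r = divmod(n, m)
    out = "1" * q + "0"        # unary part: q ones and a terminating zero
    if m == 1:
        return out             # no remainder bits needed
    b = m.bit_length()
    k = (1 << b) - m           # truncated-binary threshold
    if r < k:
        return out + format(r, f"0{b - 1}b")   # short (b-1)-bit remainder
    return out + format(r + k, f"0{b}b")       # long b-bit remainder
```

When m is a power of two the remainder is plain binary, giving the Rice special case, so no coding tables are ever stored.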

  19. Optimal pain management for radical prostatectomy surgery

    DEFF Research Database (Denmark)

    Joshi, Grish P; Jaschinski, Thomas; Bonnet, Francis;

    2015-01-01

    BACKGROUND: Increase in the diagnosis of prostate cancer has increased the incidence of radical prostatectomy. However, the literature assessing pain therapy for this procedure has not been systematically evaluated. Thus, optimal pain therapy for patients undergoing radical prostatectomy remains … of evidence to develop an optimal pain management protocol in patients undergoing radical prostatectomy. Most studies assessed unimodal analgesic approaches rather than a multimodal technique. There is a need for more procedure-specific studies comparing pain and analgesic requirements for open and minimally invasive surgical procedures. Finally, while we wait for appropriate procedure-specific evidence from publication of adequate studies assessing optimal pain management after radical prostatectomy, we propose a basic analgesic guideline.

  20. Aircraft Course Optimization Tool Using GPOPS MATLAB Code

    Science.gov (United States)

    2012-03-01

    …experiences when the problem becomes too complex. …Florida, and Stanford University's Sparse Nonlinear OPTimizer (SNOPT) solver. The addition of several ACOT-specific scripts frames the problem for the GPOPS… experiences with the two-lobe radar cross section is the discontinuity where the RCS is 1 m2; however, it is thought this is ignored due to the discrete…

  1. Optimal Decoding Algorithm for Asynchronous Physical-Layer Network Coding

    CERN Document Server

    Lu, Lu; Zhang, Shengli

    2011-01-01

    A key issue in physical-layer network coding (PNC) is how to deal with the asynchrony between signals transmitted by multiple transmitters. That is, symbols transmitted by different transmitters could arrive at the receiver with symbol misalignment as well as relative carrier-phase offset. In this paper, 1) we propose and investigate a general framework based on belief propagation (BP) that can effectively deal with symbol and phase asynchronies; 2) we show that for BPSK and QPSK modulations, our BP method can significantly reduce the SNR penalty due to asynchrony compared with prior methods; 3) we find that symbol misalignment makes the system performance less sensitive and more robust against carrier-phase offset. Observation 3) has the following practical implication. It is relatively easier to control symbol timing than carrier-phase offset. Our results indicate that if we could control the symbol offset in PNC, it would actually be advantageous to deliberately introduce symbol misalignment to desensitize...

  2. Signal-to-noise-optimal scaling of heterogenous population codes.

    Science.gov (United States)

    Leibold, Christian

    2013-01-01

    Similarity measures for neuronal population responses that are based on scalar products can be little informative if the neurons have different firing statistics. Based on signal-to-noise optimality, this paper derives positive weighting factors for the individual neurons' response rates in a heterogeneous neuronal population. The weights only depend on empirical statistics. If firing follows Poisson statistics, the weights can be interpreted as mutual information per spike. The scaling is shown to improve linear separability and clustering as compared to unscaled inputs.
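A toy demonstration of the idea (the firing rates, sample sizes, and exact weighting formula below are our assumptions; the paper derives its weights from signal-to-noise optimality): weighting each neuron's rate by its mean response difference over its variance improves a d'-style separability measure relative to the unweighted population sum.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two stimuli, three neurons with heterogeneous Poisson firing rates
rates_a = np.array([2.0, 10.0, 40.0])
rates_b = np.array([3.0, 12.0, 41.0])
A = rng.poisson(rates_a, size=(500, 3))  # trials x neurons, stimulus A
B = rng.poisson(rates_b, size=(500, 3))  # trials x neurons, stimulus B

# Signal-to-noise weighting: mean rate difference over pooled variance
w = np.abs(A.mean(0) - B.mean(0)) / (0.5 * (A.var(0) + B.var(0)))

def dprime(x, y):
    """Separability of two 1-D response distributions."""
    return abs(x.mean() - y.mean()) / np.sqrt(0.5 * (x.var() + y.var()))

raw = dprime(A.sum(1), B.sum(1))  # unweighted population readout
scaled = dprime(A @ w, B @ w)     # weighted readout
```

The high-rate neuron dominates the raw sum despite carrying little signal; the weights correct for that.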

  3. Optimal Demand Response with Energy Storage Management

    OpenAIRE

    Huang, Longbo; Walrand, Jean; Ramchandran, Kannan

    2012-01-01

    In this paper, we consider the problem of optimal demand response and energy storage management for a power consuming entity. The entity's objective is to find an optimal control policy for deciding how much load to consume, how much power to purchase from/sell to the power grid, and how to use the finite capacity energy storage device and renewable energy, to minimize his average cost, being the disutility due to load-shedding and cost for purchasing power. Due to the coupling effect of the...

  4. Microgrid management architecture considering optimal battery dispatch

    Science.gov (United States)

    Paul, Tim George

    Energy management and economic operation of microgrids with energy storage systems at the distribution level have attracted significant research interest in recent years. One of the challenges in this area has been the coordination of energy management functions with decentralized and centralized dispatch. In this thesis, a distributed dispatch algorithm is proposed for a microgrid consisting of a photovoltaic source with energy storage, able to work alongside a centralized dispatch algorithm that ensures stability of the microgrid. To this end, first a rule-based dispatch algorithm is formulated, based on maximum resource utilization, that can work in both off-grid and grid-connected modes. Then a fixed-horizon optimization algorithm that minimizes the cost of power taken from the grid is developed. In order to schedule the battery based on changes in the PV farm, a predictive-horizon optimization methodology is designed. Further, the rule-based and optimization-based dispatch methodologies are linked to optimize the voltage deviations at the microgrid Point of Common Coupling (PCC). The main advantage of the proposed method is that an optimal active power dispatch considering the nominal voltage bandwidth can be initiated for the microgrid in both grid-connected and off-grid modes of operation. The method also allows the grid operator to consider cost-based optimal renewable generation scheduling and/or maximum-power-extraction modes of operation, simultaneously or separately, based on grid operating conditions and topologies. Further, the method maintains the PCC voltage within limits during these modes of operation while at the same time ensuring that the battery dispatch is optimal.
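A rule-based dispatch step of the kind described can be sketched as follows (the sign conventions, limits, and rule ordering are our simplifications, not the thesis's algorithm):

```python
def dispatch(pv_kw, load_kw, soc_kwh, cap_kwh, max_chg_kw, grid_connected):
    """One step of rule-based dispatch maximizing local resource use.

    Returns (battery_kw, grid_kw): battery > 0 means charging,
    grid > 0 means importing from the grid.
    """
    surplus = pv_kw - load_kw
    if surplus >= 0:
        # Charge with excess PV, limited by charge rate and free capacity
        batt = min(surplus, max_chg_kw, cap_kwh - soc_kwh)
        grid = -(surplus - batt) if grid_connected else 0.0  # export or curtail
    else:
        # Discharge to cover the deficit, limited by rate and stored energy
        batt = -min(-surplus, max_chg_kw, soc_kwh)
        deficit = -surplus + batt        # unmet load after the battery helps
        grid = deficit if grid_connected else 0.0  # import or shed load
    return batt, grid
```

In grid-connected mode the grid absorbs any residual imbalance; off-grid, the residual must be curtailed or shed, which is why the centralized layer is needed for stability.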

  5. Hardware Abstraction and Protocol Optimization for Coded Sensor Networks

    DEFF Research Database (Denmark)

    Nistor, Maricica; Roetter, Daniel Enrique Lucani; Barros, João

    2015-01-01

    The design of the communication protocols in wireless sensor networks (WSNs) often neglects several key characteristics of the sensor's hardware, while assuming that the number of transmitted bits is the dominating factor behind the system's energy consumption. A closer look at the hardware …-efficient protocols that use such an abstraction, as well as mechanisms to optimize a communication protocol in terms of energy consumption. The problem is modeled for different feedback-based techniques, where sensors are connected to a base station, either directly or through relays. We show that for four example platforms, the use of relays may decrease up to 4.5 times the total energy consumption when the protocol and the hardware are carefully matched. We conclude that: 1) the energy budget for a communication protocol varies significantly on different sensor platforms; and 2) the protocols can be judiciously…

  6. The genetic code and its optimization for kinetic energy conservation in polypeptide chains.

    Science.gov (United States)

    Guilloux, Antonin; Jestin, Jean-Luc

    2012-08-01

    Why is the genetic code the way it is? Concepts from fields as diverse as molecular evolution, classical chemistry, biochemistry and metabolism have been used to define selection pressures most likely to be involved in the shaping of the genetic code. Here minimization of kinetic energy disturbances during protein evolution by mutation allows an optimization of the genetic code to be highlighted. The quadratic forms corresponding to the kinetic energy term are considered over the field of rational numbers. Arguments are given to support the introduction of notions from basic number theory within this context. The observations found to be consistent with this minimization are statistically significant. The genetic code may well have been optimized according to energetic criteria so as to improve folding and dynamic properties of polypeptide chains. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  7. From Physics Model to Results: An Optimizing Framework for Cross-Architecture Code Generation

    CERN Document Server

    Blazewicz, Marek; Koppelman, David M; Brandt, Steven R; Ciznicki, Milosz; Kierzynka, Michal; Löffler, Frank; Tao, Jian

    2013-01-01

    Starting from a high-level problem description in terms of partial differential equations using abstract tensor notation, the Chemora framework discretizes, optimizes, and generates complete high performance codes for a wide range of compute architectures. Chemora extends the capabilities of Cactus, facilitating the usage of large-scale CPU/GPU systems in an efficient manner for complex applications, without low-level code tuning. Chemora achieves parallelism through MPI and multi-threading, combining OpenMP and CUDA. Optimizations include high-level code transformations, efficient loop traversal strategies, dynamically selected data and instruction cache usage strategies, and JIT compilation of GPU code tailored to the problem characteristics. The discretization is based on higher-order finite differences on multi-block domains. Chemora's capabilities are demonstrated by simulations of black hole collisions. This problem provides an acid test of the framework, as the Einstein equations contain hundreds of va...

  8. From Physics Model to Results: An Optimizing Framework for Cross-Architecture Code Generation

    Directory of Open Access Journals (Sweden)

    Marek Blazewicz

    2013-01-01

    Full Text Available Starting from a high-level problem description in terms of partial differential equations using abstract tensor notation, the Chemora framework discretizes, optimizes, and generates complete high performance codes for a wide range of compute architectures. Chemora extends the capabilities of Cactus, facilitating the usage of large-scale CPU/GPU systems in an efficient manner for complex applications, without low-level code tuning. Chemora achieves parallelism through MPI and multi-threading, combining OpenMP and CUDA. Optimizations include high-level code transformations, efficient loop traversal strategies, dynamically selected data and instruction cache usage strategies, and JIT compilation of GPU code tailored to the problem characteristics. The discretization is based on higher-order finite differences on multi-block domains. Chemora's capabilities are demonstrated by simulations of black hole collisions. This problem provides an acid test of the framework, as the Einstein equations contain hundreds of variables and thousands of terms.

  9. Buffer management optimization strategy for satellite ATM

    Institute of Scientific and Technical Information of China (English)

    Lu Rong; Cao Zhigang

    2006-01-01

    ECTD (erroneous cell tail drop), a buffer management optimization strategy, is suggested which can improve the utilization of buffer resources in satellite ATM (asynchronous transfer mode) networks. The strategy, in which erroneous cells caused by the satellite channel and the following cells that belong to the same PDU (protocol data unit) are discarded, concerns non-real-time data services that use a higher-layer protocol for retransmission. Based on EPD (early packet drop) policy, mathematical models are established with and without ECTD. The numerical results show that ECTD would optimize buffer management and improve effective throughput (goodput), and the increment of goodput is relative to the CER (cell error ratio) and the PDU length. The higher their values are, the greater the increment. For example, when the average PDU length values are 30 and 90, the improvements of goodput are respectively about 4% and 10%.
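The tail-drop rule itself is easy to sketch (a toy model of the buffer decision only; the paper's mathematical models of goodput are not reproduced here): once a cell of a PDU arrives in error, that cell and every later cell of the same PDU are not buffered, since the higher layer will retransmit the whole PDU anyway.

```python
def ectd_filter(cells):
    """Apply erroneous-cell tail drop to an arrival sequence.

    `cells` is a list of (pdu_id, is_errored) tuples in arrival order.
    Returns the pdu_ids of the cells actually kept in the buffer.
    """
    dropped_pdus = set()
    kept = []
    for pdu_id, errored in cells:
        if errored:
            dropped_pdus.add(pdu_id)   # poison this PDU from now on
        if pdu_id not in dropped_pdus:
            kept.append(pdu_id)        # healthy cell of a healthy PDU
    return kept
```

Cells of the damaged PDU that arrived before the error stay buffered; only the useless tail is freed, which is where the goodput gain comes from.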

  10. Online network coding for optimal throughput and delay -- the two-receiver case

    CERN Document Server

    Sundararajan, Jay Kumar; Médard, Muriel

    2008-01-01

    For a packet erasure broadcast channel with two receivers, a new coding algorithm is proposed that makes use of feedback to achieve asymptotically optimal queue size at the sender and decoding delay at the receivers, without compromising on throughput. Our coding module is compatible with the drop-when-seen queuing algorithm proposed in earlier work -- Sundararajan et al. (ISIT 2008). Hence, it guarantees that the physical queue size at the sender tracks the backlog in degrees of freedom. In addition, the coding module is throughput optimal and at the same time, also achieves an asymptotically optimal decoding delay of O(1/(1-ρ)), where ρ is the load factor. We consider the asymptotics when ρ tends to 1 from below, with either the arrival rate (λ) or the channel parameter (μ) being fixed at a number less than 1.

  11. New developments of the CARTE thermochemical code: I-parameter optimization

    Science.gov (United States)

    Desbiens, N.; Dubois, V.

    We present the calibration of the CARTE thermochemical code, which allows the properties of a wide variety of CHON explosives to be computed. We have developed an optimization procedure to obtain an accurate multicomponent EOS (fluid phase and condensed phase of carbon). We show here that the results of the CARTE code are in good agreement with the specific data of molecular systems, and we extensively compare our calculations with measured detonation properties for several explosives.

  12. New developments of the CARTE thermochemical code: I-parameter optimization

    Directory of Open Access Journals (Sweden)

    Dubois V.

    2011-01-01

    Full Text Available We present the calibration of the CARTE thermochemical code, which allows the properties of a wide variety of CHON explosives to be computed. We have developed an optimization procedure to obtain an accurate multicomponent EOS (fluid phase and condensed phase of carbon). We show here that the results of the CARTE code are in good agreement with the specific data of molecular systems, and we extensively compare our calculations with measured detonation properties for several explosives.

  13. Cat codes with optimal decoherence suppression for a lossy bosonic channel

    OpenAIRE

    Li, Linshu; Zou, Chang-Ling; Albert, Victor V.; Muralidharan, Sreraman; Girvin, S. M.; Jiang, Liang

    2016-01-01

    We investigate cat codes that can correct multiple excitation losses and identify two types of logical errors: bit-flip errors due to excessive excitation loss and dephasing errors due to quantum back-action from the environment. We show that selected choices of logical subspace and coherent amplitude can efficiently reduce dephasing errors. The trade-off between the two major errors enables optimized performance of cat codes in terms of minimized decoherence. With high coupling efficiency, w...

  14. Database Design and Management in Engineering Optimization.

    Science.gov (United States)

    1988-02-01

    … Sreekanta Murthy, T., Shyy, Y.-K. and Arora, J. S., MIDAS: Management of Information for … for educational and research purposes. It has considerably … an education in the particular field of expertise. … The types of information to be retained and presented depend on the user of the system. … Though the design of MIDAS is directly influenced by the current structural optimization applications, it possesses

  15. Fuel management optimization based on power profile by Cellular Automata

    Energy Technology Data Exchange (ETDEWEB)

    Fadaei, Amir Hosein, E-mail: Fadaei_amir@aut.ac.i [Faculty of Nuclear Engineering and Physics, Amirkabir University of Technology (Tehran Polytechnique), Hafez Street, Tehran (Iran, Islamic Republic of); Moghaddam, Nader Maleki [Faculty of Nuclear Engineering and Physics, Amirkabir University of Technology (Tehran Polytechnique), Hafez Street, Tehran (Iran, Islamic Republic of); Zahedinejad, Ehsan [Department of Energy Engineering, Sharif University of Technology, Azadi Str., Tehran (Iran, Islamic Republic of); Fadaei, Mohammad Mehdi [Department of Electrical Engineering, Faculty of Engineering, Central Tehran Branch, Islamic Azad University, Punak Square, Tehran (Iran, Islamic Republic of); Kia, Shabnam [Faculty of Engineering, Islamic Azad University, Science and Research Branch, Punak Square, Tehran (Iran, Islamic Republic of)

    2010-12-15

    Fuel management in PWR nuclear reactors comprises a collection of principles and practices required for the planning, scheduling, refueling, and safe operation of nuclear power plants to minimize the total plant and system energy costs to the extent possible. Despite remarkable advancements in optimization procedures, inherent complexities in nuclear reactor structure and strong inter-dependency among the fundamental parameters of the core make it necessary to evaluate the most efficient arrangement of the core. Several patterns have been presented so far to determine the best configuration of fuels in the reactor core, with emphasis on minimizing the local power peaking factor (P{sub q}). In this research, a new strategy for optimizing the fuel arrangements in a VVER-1000 reactor core is developed, with lowering the P{sub q} considered as the main target. For this purpose, a Fuel Quality Factor, Z(r), serves to depict the reactor core pattern. Mapping to the ideal pattern is tracked over the optimization procedure, in which the ideal pattern is prepared by considering the Z(r) constraints and their effects on flux and P{sub q} uniformity. For finding the best configuration corresponding to the desired pattern, Cellular Automata (CA) is applied as a powerful and reliable tool in the optimization procedure. To obtain the Z(r) constraints, the MCNP code was used, and core calculations were performed with the WIMS and CITATION codes. The results are compared with the predictions of a Neural Network as a smart optimization method, and with the Final Safety Analysis Report (FSAR) as a reference proposed by the designer.
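The CA idea can be caricatured with a toy local-swap rule (the reactivity grid, the neighbor-coupling power proxy, and the acceptance rule below are our simplifications, not the paper's VVER-1000 model): each cell may swap with a neighbor, and a swap survives only if it lowers the peaking factor proxy.

```python
def local_power(grid, i, j):
    """Crude power proxy: assembly reactivity plus mean neighbor coupling."""
    n = len(grid)
    nbrs = [grid[a][b] for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
            if 0 <= a < n and 0 <= b < n]
    return grid[i][j] + 0.5 * sum(nbrs) / len(nbrs)

def peaking(grid):
    """Proxy for the power peaking factor P_q: max local power over the mean."""
    n = len(grid)
    p = [local_power(grid, i, j) for i in range(n) for j in range(n)]
    return max(p) / (sum(p) / len(p))

def ca_step(grid):
    """One CA pass: keep a neighbor swap only if it lowers the peaking."""
    n = len(grid)
    for i in range(n):
        for j in range(n):
            for a, b in ((i, j + 1), (i + 1, j)):  # right and down neighbors
                if a < n and b < n:
                    before = peaking(grid)
                    grid[i][j], grid[a][b] = grid[a][b], grid[i][j]
                    if peaking(grid) >= before:    # undo non-improving swaps
                        grid[i][j], grid[a][b] = grid[a][b], grid[i][j]

# Fresh (1.4), once-burnt (1.0), twice-burnt (0.8) assemblies, fresh fuel clustered
core = [[1.4, 1.4, 1.0], [1.4, 1.0, 0.8], [1.0, 0.8, 0.8]]
p0 = peaking(core)
for _ in range(5):
    ca_step(core)
```

Because only improving swaps are kept, the peaking proxy is non-increasing; the real work in the paper is mapping to an ideal Z(r)-based pattern rather than greedy descent.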

  16. Game-Theoretic Rate-Distortion-Complexity Optimization of High Efficiency Video Coding

    DEFF Research Database (Denmark)

    Ukhanova, Ann; Milani, Simone; Forchhammer, Søren

    2013-01-01

    This paper presents an algorithm for rate-distortion-complexity optimization for the emerging High Efficiency Video Coding (HEVC) standard, whose high computational requirements urge the need for low-complexity optimization algorithms. Optimization approaches need to specify different complexity profiles in order to tailor the computational load to the different hardware and power-supply resources of devices. In this work, we focus on optimizing the quantization parameter and partition depth in HEVC via a game-theoretic approach. The proposed rate control strategy alone provides 0.2 dB improvement

  17. Explanation of how to run the global local optimization code (GLO) to find surface heat flux

    Energy Technology Data Exchange (ETDEWEB)

    Aceves, S; Sahai, V; Stein, W

    1999-03-01

    From the evaluation[1] of the inverse techniques available, it was determined that the Global Local Optimization Code[2] can determine the surface heat flux using known experimental data at various points in the geometry. This code uses a whole domain approach in which an analysis code (such as TOPAZ2D or ABAQUS) can be run to get the appropriate data needed to minimize the heat flux function. This document is a compilation of our notes on how to run this code to find the surface heat flux. First, the code is described and the overall set-up procedure is reviewed. Then, creation of the configuration file is described. A specific configuration file is given with appropriate explanation. Using this information, the reader should be able to run GLO to find the surface heat flux.

  18. On the optimality of code options for a universal noiseless coder

    Science.gov (United States)

    Yeh, Pen-Shu; Rice, Robert F.; Miller, Warner

    1991-01-01

    A universal noiseless coding structure was developed that provides efficient performance over an extremely broad range of source entropy. This is accomplished by adaptively selecting the best of several easily implemented variable-length coding algorithms. Custom VLSI coder and decoder modules capable of processing over 20 million samples per second are currently under development. The first of the code options used in this module development is shown to be equivalent to a class of Huffman codes under the Humblet condition; the other options are shown to be equivalent to the Huffman codes of a modified Laplacian symbol set at specified symbol entropy values. Simulation results are obtained on actual aerial imagery and confirm the optimality of the scheme. For sources having Gaussian or Poisson distributions, coder performance is also projected through analysis and simulation.
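The adaptive selection among easily implemented variable-length code options described above can be illustrated with a small sketch, assuming Rice/Golomb coding of non-negative residuals; the parameter set and block handling are assumptions for illustration, not the actual VLSI module's design:

```python
def rice_encode(values, k):
    """Rice-code each non-negative value with parameter k: unary quotient, then k-bit remainder."""
    bits = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        bits.append("1" * q + "0" + format(r, f"0{k}b") if k else "1" * q + "0")
    return "".join(bits)

def best_option(block, ks=(0, 1, 2, 3)):
    """Adaptively pick the code option (k) that yields the shortest output for this block."""
    return min(ks, key=lambda k: len(rice_encode(block, k)))
```

Per block, the coder simply keeps whichever option compresses best, which is what lets a scheme like this track a broad range of source entropies.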

  19. Optimal Algorithms for Near-Hitless Network Restoration via Diversity Coding

    CERN Document Server

    Avci, Serhat Nazim

    2012-01-01

    Diversity coding is a network restoration technique which offers near-hitless restoration, while other state-of-the-art techniques are significantly slower. Furthermore, the extra spare capacity requirement of diversity coding is competitive with the others. Previously, we developed heuristic algorithms to employ diversity coding structures in networks with arbitrary topology. This paper presents two algorithms to solve the network design problems using diversity coding in an optimal manner. The first technique pre-provisions static traffic, whereas the second technique carries out dynamic provisioning of traffic on demand. In both cases, diversity coding results in smaller restoration time, simpler synchronization, and much reduced signaling complexity compared to the existing techniques in the literature. A Mixed Integer Programming (MIP) formulation and an algorithm based on Integer Linear Programming (ILP) are developed for pre-provisioning and dynamic provisioning, respectively. Simulation results indicat...

  20. Blind Decorrelating Detection Based on Particle Swarm Optimization under Spreading Code Mismatch

    Institute of Scientific and Technical Information of China (English)

    Jhih-Chung Chang; Chih-Chang Shen

    2014-01-01

    A way of resolving spreading code mismatches in blind multiuser detection with a particle swarm optimization (PSO) approach is proposed. It has been shown that the PSO algorithm incorporating the linear system of the decorrelating detector, which is termed as decorrelating PSO (DPSO), can significantly improve the bit error rate (BER) and the system capacity. As the code mismatch occurs, the output BER performance is vulnerable to degradation for DPSO. With a blind decorrelating scheme, the proposed blind DPSO (BDPSO) offers more robust capabilities over existing DPSO under code mismatch scenarios.
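The abstract does not give the DPSO update rules; as background, the canonical inertia-weight particle swarm optimizer that such detectors build on can be sketched as follows (coefficient values and bounds are assumed defaults, not the paper's settings):

```python
import random

def pso(cost, dim, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal PSO: each velocity blends inertia, pull toward the personal best,
    and pull toward the global best; returns the best position and its cost."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pcost = [cost(p) for p in pos]
    g = min(range(n), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest, gcost
```

In the detection setting, `cost` would score a candidate detector (e.g., via the decorrelating linear system); here any objective function of a real vector works.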

  1. The combinatorial construction for a class of optimal optical orthogonal codes

    Institute of Scientific and Technical Information of China (English)

    唐煜; 殷剑兴

    2002-01-01

    Optical orthogonal codes (OOCs) have good correlation properties and many important applications in fiber-optic code-division multiple access channels. In this paper, a combinatorial construction for optimal (15p, 5, 1) optical orthogonal codes with p congruent to 1 modulo 4 and greater than 5 is given by applying Weil's Theorem. From this, when v is a product of primes congruent to 1 modulo 4 and greater than 5, an optimal (15v, 5, 1)-OOC can be obtained by applying a known recursive construction.

  2. Optimization technique of wavefront coding system based on ZEMAX externally compiled programs

    Science.gov (United States)

    Han, Libo; Dong, Liquan; Liu, Ming; Zhao, Yuejin; Liu, Xiaohua

    2016-10-01

    Wavefront coding is a means of athermalization for infrared imaging systems, and the design of the phase plate is the key to system performance. This paper applies the externally compiled programs of ZEMAX to the optimization of the phase mask within the normal optical design process: an evaluation function for the wavefront coding system is defined based on the consistency of the modulation transfer function (MTF), and the speed of optimization is improved by introducing mathematical software. The user writes an external program that computes the evaluation function, exploiting the powerful computing features of the mathematical software to find the optimal parameters of the phase mask; convergence is accelerated through a genetic algorithm (GA), and the dynamic data exchange (DDE) interface between ZEMAX and the mathematical software provides high-speed data exchange. The rotationally symmetric phase mask and the cubic phase mask were both optimized by this method: the depth of focus increases nearly 3 times with the rotationally symmetric phase mask and up to 10 times with the cubic phase mask, the MTF becomes markedly more consistent, and the operating temperature of the optimized systems ranges between -40°C and 60°C. The results show that, owing to its externally compiled function and DDE, this optimization method makes it convenient to define unconventional optimization goals and to quickly optimize optical systems with special properties.

  3. Improved Data Transmission Scheme of Network Coding Based on Access Point Optimization in VANET

    Directory of Open Access Journals (Sweden)

    Zhe Yang

    2014-01-01

    Full Text Available VANET is a hot topic in intelligent transportation research. For vehicle users, file sharing and content distribution through roadside access points (APs) as well as vehicular ad hoc networks (VANETs) have become an important complement to the cellular network, so AP deployment is one of the key issues for improving the communication performance of VANET. In this paper, an access point optimization method based on the particle swarm optimization algorithm is proposed. The transmission performance of the routing protocol with random linear network coding is analyzed before and after the access point optimization. The simulation results show that the optimization model greatly affects the VANET transmission performance based on network coding: it can enhance the delivery rate by 25% and 14% and reduce the average transmission delay by 38% and 33%.

  4. Optimal Demand Response with Energy Storage Management

    CERN Document Server

    Huang, Longbo; Ramchandran, Kannan

    2012-01-01

    In this paper, we consider the problem of optimal demand response and energy storage management for a power consuming entity. The entity's objective is to find an optimal control policy for deciding how much load to consume, how much power to purchase from/sell to the power grid, and how to use the finite-capacity energy storage device and renewable energy, so as to minimize its average cost, namely the disutility due to load-shedding plus the cost of purchasing power. Due to the coupling effect of the finite-size energy storage, such problems are challenging and are typically tackled using dynamic programming, which is often computationally complex and requires substantial statistical information about the system dynamics. We instead develop a low-complexity algorithm called Demand Response with Energy Storage Management (DR-ESM). DR-ESM does not require any statistical knowledge of the system dynamics, including the renewable energy and the power prices. It only requires the entity to solve a small convex optimization pr...

  5. Maximally Distant Codes Allocation Using Chemical Reaction Optimization with Enhanced Exploration

    Directory of Open Access Journals (Sweden)

    Taisir Eldos

    2016-01-01

    Full Text Available Error correcting codes, also known as error controlling codes, are sets of codes with redundancy that provides for error detection and correction, for fault-tolerant operations like data transmission over noisy channels or data retention on storage media with possible physical defects. The challenge is to find a set of m codes out of the 2^n available n-bit combinations, such that the aggregate Hamming distance among those codewords and/or the minimum distance is maximized. Due to the prohibitively large solution spaces of practically sized problems, greedy algorithms are used to generate quick and dirty solutions. However, modern evolutionary search techniques like genetic algorithms, particle swarms, gravitational search, and others offer more feasible solutions, yielding near-optimal solutions in exchange for some computational time. Chemical Reaction Optimization (CRO), which is inspired by molecular reactions towards a minimal energy state, emerged recently as an efficient optimization technique. However, like the other techniques, its internal dynamics are hard to control towards convergence, yielding poor performance in many situations. In this research, we proposed an enhanced exploration strategy to overcome this problem, and compared it with the standard threshold-based exploration strategy in solving the maximally distant codes allocation problem. Test results showed that the enhancement provided better performance on most metrics.
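The greedy baseline that metaheuristics like CRO are measured against can be sketched as a farthest-point selection over the 2^n combinations (illustrative only, and practical only for small n; the tie-breaking rule is an assumption):

```python
def hamming(a, b):
    """Hamming distance between two codewords given as integers."""
    return bin(a ^ b).count("1")

def greedy_codes(n_bits, m):
    """Greedily pick m n-bit codewords, each maximizing its minimum
    Hamming distance to the codewords already chosen."""
    chosen = [0]
    universe = range(1, 1 << n_bits)
    while len(chosen) < m:
        best = max(universe, key=lambda w: min(hamming(w, c) for c in chosen))
        chosen.append(best)
    return chosen
```

This maximizes the minimum-distance criterion one codeword at a time; evolutionary methods instead search over whole codeword sets, which is what makes them competitive on the aggregate-distance objective.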

  6. Optimal control theory for sustainable environmental management.

    Science.gov (United States)

    Shastri, Yogendra; Diwekar, Urmila; Cabezas, Heriberto

    2008-07-15

    Sustainable ecosystem management aims to promote the structure and operation of the human components of the system while simultaneously ensuring the persistence of the structures and operation of the natural component. Given the complexity of this task owing to the diverse temporal and spatial scales and multidisciplinary interactions, a systems theory approach based on sound mathematical techniques is essential. Two important aspects of this approach are formulation of sustainability-based objectives and development of the management strategies. Fisher information can be used as the basis of a sustainability hypothesis to formulate relevant mathematical objectives for disparate systems, and optimal control theory provides the means to derive time-dependent management strategies. Partial correlation coefficient analysis is an efficient technique to identify the appropriate control variables for policy development. This paper represents a proof of concept for this approach using a model system that includes an ecosystem, humans, a very rudimentary industrial process, and a very simple agricultural system. Formulation and solution of the control problems help in identifying the effective management options which offer guidelines for policies in real systems. The results also emphasize that management using multiple parameters of different nature can be distinctly effective.

  7. The Effect of Slot-Code Optimization in Warehouse Order Picking

    Directory of Open Access Journals (Sweden)

    Andrea Fumi

    2013-07-01

    most appropriate material handling resource configuration. Building on previous work on the effect of slot-code optimization on travel times in single/dual command cycles, the authors broaden the scope to include the most general picking case, thus widening the range of applicability and realising former suggestions for future research.

  8. SPATIALLY SCALABLE RESOLUTION IMAGE CODING METHOD WITH MEMORY OPTIMIZATION BASED ON WAVELET TRANSFORM

    Institute of Scientific and Technical Information of China (English)

    Wang Na; Zhang Li; Zhou Xiao'an; Jia Chuanying; Li Xia

    2005-01-01

    This letter exploits fundamental characteristics of a wavelet transform image to form a progressive octave-based spatial resolution. Each wavelet subband is coded based on a zero-block and quadtree partitioning ordering scheme with a memory optimization technique. The method proposed in this letter is of low complexity and efficient for Internet plug-in software.

  9. RAID-6 reed-solomon codes with asymptotically optimal arithmetic complexities

    KAUST Repository

    Lin, Sian-Jheng

    2016-12-24

    In computer storage, RAID 6 is a level of RAID that can tolerate two failed drives. When RAID-6 is implemented with Reed-Solomon (RS) codes, the write-performance penalty lies in the field multiplications for the second parity. In this paper, we present a configuration of the factors of the second-parity formula such that the arithmetic complexity reaches the optimal complexity bound as the code length approaches infinity. In the proposed approach, the intermediate data used for the first parity is also utilized to calculate the second parity. To the best of our knowledge, this is the first approach that allows RAID-6 RS codes to approach the optimal arithmetic complexity.
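For context, the textbook RAID-6 P/Q parity computation whose second-parity multiplications the paper targets can be sketched as follows (generator g = 2 over GF(2^8) with the conventional 0x11D polynomial; a reference sketch, not the paper's optimized factorization):

```python
def gf_mul(a, b):
    """Multiply in GF(2^8) modulo x^8 + x^4 + x^3 + x^2 + 1 (0x11D)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= 0x11D
    return r

def g_pow(i):
    """g^i for the generator g = 2."""
    r = 1
    for _ in range(i):
        r = gf_mul(r, 2)
    return r

def raid6_parity(drives):
    """P is the plain XOR across data drives; Q weights drive i by g^i before XORing,
    so the two parities can recover any two failed drives."""
    p = [0] * len(drives[0])
    q = [0] * len(drives[0])
    for i, data in enumerate(drives):
        gi = g_pow(i)
        for j, byte in enumerate(data):
            p[j] ^= byte
            q[j] ^= gf_mul(gi, byte)
    return p, q
```

Every byte of Q costs a field multiplication in this naive form, which is exactly the term the paper's factor configuration drives toward the optimal complexity bound.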

  10. An approach to implement PSO to optimize outage probability of coded cooperative communication with multiple relays

    Directory of Open Access Journals (Sweden)

    Sindhu Hak Gupta

    2016-09-01

    Full Text Available Coded cooperative communication is a novel concept and a solution for realizing the benefits of MIMO (Multiple Input Multiple Output) gains on a distributed scale. In this paper the outage behavior of coded cooperative communication with multiple relays is examined, and a numerical expression for the outage probability is derived. Nakagami-m fading statistics are considered. The outage probability is observed to be a function of various free and constrained parameters. An approach is presented to implement PSO and optimize the free parameters on which the outage probability of coded cooperative communication with multiple relays depends. Analytical and Matlab simulation results reveal that the proposed technique outperforms the non-optimized technique and exhibits promising performance.

  11. Determination of optimal period of absolute encoders with single track cyclic gray code

    Institute of Scientific and Technical Information of China (English)

    张帆; 朱衡君

    2008-01-01

    Low-cost and miniaturized rotary encoders are important in automatic and precise production. Presented here is a code called Single Track Cyclic Gray Code (STCGC), an image etched on a single circular track of a rotary encoder disk and read by a group of evenly spread reading heads, which provides a unique codeword for every angular position such that every two adjacent words differ in exactly one component, thus avoiding gross errors. The existing construction and combination methods are helpful but not sufficient for determining the period of STCGC of large word length, and the theoretical approach needs further development to extend the word length. Three principles, namely seed combination, short-code removal, and ergodicity examination, were put forward that suffice to determine the optimal period for such absolute rotary encoders using STCGC with evenly spread heads. The optimal periods of STCGC for word lengths of 3 through 29 bits were determined and listed.
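The defining single-change property of such codewords can be checked mechanically; a sketch using the ordinary reflected Gray code (the single-track construction with evenly spread heads is more involved and is not reproduced here):

```python
def gray(n_bits):
    """Standard reflected Gray code sequence of length 2**n_bits."""
    return [i ^ (i >> 1) for i in range(1 << n_bits)]

def is_cyclic_single_change(seq):
    """Verify the cyclic Gray property: every pair of cyclically adjacent
    words differs in exactly one bit, which is what prevents gross readout errors."""
    return all(bin(a ^ b).count("1") == 1 for a, b in zip(seq, seq[1:] + seq[:1]))
```

A single-track code must satisfy this same adjacency test while also being readable as rotations of one track, which is the extra constraint the three principles above address.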

  12. Profitability and optimization of data management

    Energy Technology Data Exchange (ETDEWEB)

    Boussa, M. [Sonatrach, Alger (Algeria). Petroleum Engineering and Development

    2008-07-01

    Information systems and technologies for the oil and gas industry were discussed with particular reference to the use of data analysis in dynamic planning processes. This paper outlined the risks and challenges associated with reorganizing data systems and the costs associated with equipment and software purchases. Issues related to Intranet encryption and electronic commerce systems were also reviewed along with the impact of the Internet on the oil and gas industry. New methods for using real time data systems for updating well data were outlined together with recent developments in Intranet and Extranet technologies and services. Other topics of discussion included new software applications for network optimization and nodal analyses; industry-specific software developed for well testing and reservoir engineering; and simulation and production management software. Data management solutions for storing, retrieving and analyzing data streams were presented. It was concluded that successful organizations must develop accurate data systems in order to ensure continuing success. 4 refs., 8 figs.

  13. Parameter optimization of pulse compression in ultrasound imaging systems with coded excitation.

    Science.gov (United States)

    Behar, Vera; Adam, Dan

    2004-08-01

    A linear array imaging system with coded excitation is considered, where the proposed excitation/compression scheme maximizes the signal-to-noise ratio (SNR) and minimizes sidelobes at the output of the compression filter. A pulse with linear frequency modulation (LFM) is used for coded excitation. The excitation/compression scheme is based on the fast digital mismatched filtering. The parameter optimization of the excitation/compression scheme includes (i) choice of an optimal filtering function for the mismatched filtering; (ii) choice of an optimal window function for tapering of the chirp amplitude; (iii) optimization of a chirp-to-transducer bandwidth ratio; (iv) choice of an appropriate n-bit quantizer. The simulation results show that the excitation/compression scheme can be implemented as a Dolph-Chebyshev filter including amplitude tapering of the chirp with a Lanczos window. An example of such an optimized system is given where the chirp bandwidth is chosen to be 2.5 times the transducer bandwidth and equals 6 MHz: The sidelobes are suppressed to -80 dB, for a central frequency of 4 MHz, and to -94 dB, for a central frequency of 8 MHz. The corresponding improvement of the SNR is 18 and 21 dB, respectively, when compared to a conventional short pulse imaging system. Simulation of B-mode images demonstrates the advantage of coded excitation systems of detecting regions with low contrast.
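A stripped-down sketch of the chirp excitation and windowed mismatched compression described above, in plain Python with toy parameters rather than the 4-8 MHz ultrasound settings (the Hamming taper stands in for the Lanczos/Dolph-Chebyshev choices studied in the paper):

```python
import math

def lfm_chirp(fc, bw, dur, fs):
    """Sampled linear-FM pulse sweeping bw (Hz) centered on fc over dur seconds."""
    n = int(dur * fs)
    k = bw / dur  # chirp rate, Hz per second
    return [math.cos(2 * math.pi * ((fc - bw / 2) * t + 0.5 * k * t * t))
            for t in (i / fs for i in range(n))]

def compress(echo, chirp):
    """Compression by correlation with a Hamming-tapered replica of the chirp;
    the taper lowers range sidelobes at a small cost in SNR (a mismatched filter)."""
    n = len(chirp)
    ref = [c * (0.54 - 0.46 * math.cos(2 * math.pi * i / (n - 1)))
           for i, c in enumerate(chirp)]
    return [sum(echo[i + j] * ref[j] for j in range(n))
            for i in range(len(echo) - n + 1)]
```

Correlating a delayed copy of the chirp against the tapered replica produces a sharp peak at the true delay, which is the pulse-compression gain the abstract quantifies as an SNR improvement.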

  14. On Optimal Causal Coding of Partially Observed Markov Sources in Single and Multi-Terminal Settings

    CERN Document Server

    Yüksel, Serdar

    2010-01-01

    The optimal causal coding of a partially observed Markov process is studied, where the cost to be minimized is a bounded, non-negative, additive, measurable single-letter function of the source and the receiver output. A structural result is obtained extending Witsenhausen's and Walrand-Varaiya's structural results on the optimal real-time coders to a partially observed setting. The decentralized (multi-terminal) setup is also considered. For the case where the source is an i.i.d. process, it is shown that the design of optimal decentralized causal coding of correlated observations admits a separation. For Markov sources, a counterexample to a natural separation conjecture is presented. Applications in estimation and networked control problems are discussed, in the context of a linear, Gaussian setup.

  15. Codes cross-correlation analysis and data/pilot code pairs optimization for Galileo E1 OS and GPS L1C

    Institute of Scientific and Technical Information of China (English)

    Yang Zaixiu; Huang Zhigang; Geng Shengqun

    2013-01-01

    The Galileo E1 open service (OS) and the global positioning system (GPS) L1C intend to use the multiplexed binary offset carrier (MBOC) modulation in the E1/L1 band, including both pilot and data components. The impact of data and pilot codes cross-correlation on the distortion of the discriminator function (i.e., the S-curve) is investigated when only the pilot (or data) components of MBOC signals are tracked. It is shown that the modulation schemes and the receiver configuration (e.g., the correlator spacing) strongly affect the S-curve bias. In this paper, two methods are proposed to optimize the data/pilot code pairs of Galileo E1 OS and GPS L1C. The optimization goal is to obtain the minimum average S-curve bias when tracking only the pilot components at a specific correlator spacing. Figures of merit, such as S-curve bias, correlation loss and code tracking variance, have been adopted for analyzing and comparing the un-optimized and optimized code pairs. Simulation results show that the optimized data/pilot code pairs could significantly mitigate the intra-channel codes cross-correlation, and then improve the code tracking performance of MBOC signals.

  16. Methods for Distributed Optimal Energy Management

    DEFF Research Database (Denmark)

    Brehm, Robert

    The presented research deals with the fundamental underlying methods and concepts of how the growing number of distributed generation units based on renewable energy resources and distributed storage devices can be most efficiently integrated into the existing utility grid. In contrast...... to conventional centralised optimal energy flow management systems, here-in, focus is set on how optimal energy management can be achieved in a decentralised distributed architecture such as a multi-agent system. Distributed optimisation methods are introduced, targeting optimisation of energy flow in virtual...... micro-grids by prevention of meteorologic power flows into high voltage grids. A method, based on mathematical optimisation and a consensus algorithm, is introduced and evaluated to coordinate charge/discharge scheduling for batteries between a number of buildings in order to improve self......

  17. Optimal Management of Geothermal Heat Extraction

    Science.gov (United States)

    Patel, I. H.; Bielicki, J. M.; Buscheck, T. A.

    2015-12-01

    Geothermal energy technologies use the constant heat flux from the subsurface in order to produce heat or electricity for societal use. As such, a geothermal energy system is not inherently variable, like systems based on wind and solar resources, and an operator can conceivably control the rate at which heat is extracted and used directly, or converted into a commodity that is used. Although geothermal heat is a renewable resource, this heat can be depleted over time if the rate of heat extraction exceeds the natural rate of renewal (Rybach, 2003). For heat extraction used for commodities that are sold on the market, sustainability entails balancing the rate at which the reservoir renews with the rate at which heat is extracted and converted into profit, on a net present value basis. We present a model that couples natural resource economic approaches for managing renewable resources with simulations of geothermal reservoir performance in order to develop an optimal heat mining strategy that balances economic gain with the performance and renewability of the reservoir. Similar optimal control approaches have been extensively studied for renewable natural resource management of fisheries and forests (Bonfil, 2005; Gordon, 1954; Weitzman, 2003). Those models determine an optimal path of extraction of fish or timber by balancing the regeneration of stocks of fish or timber that are not harvested with the profit from the sale of the fish or timber that is harvested. Our model balances the regeneration of reservoir temperature with the net proceeds from extracting heat and converting it to electricity that is sold to consumers. We used the Non-isothermal Unconfined-confined Flow and Transport (NUFT) model (Hao, Sun, & Nitao, 2011) to simulate the performance of a sedimentary geothermal reservoir under a variety of geologic and operational situations. The results of NUFT are incorporated into the natural resource economics model to determine production strategies that

  18. Group Search Optimizer for the Mobile Location Management Problem

    Directory of Open Access Journals (Sweden)

    Dan Wang

    2014-01-01

    Full Text Available We propose a diversity-guided group search optimizer-based approach for solving the location management problem in mobile computing. The location management problem, which is to find the optimal network configurations of management under the mobile computing environment, is considered here as an optimization problem. The proposed diversity-guided group search optimizer algorithm is realized with the aid of diversity operator, which helps alleviate the premature convergence problem of group search optimizer algorithm, a successful optimization algorithm inspired by the animal behavior. To address the location management problem, diversity-guided group search optimizer algorithm is exploited to optimize network configurations of management by minimizing the sum of location update cost and location paging cost. Experimental results illustrate the effectiveness of the proposed approach.
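The objective being minimized (location update cost plus paging cost) can be illustrated with a toy 1-D reporting-cells model; the cost weights and the vicinity rule below are assumptions for illustration, not the paper's exact formulation:

```python
from itertools import product

def location_cost(reporting, moves, calls, beta=10.0):
    """Hypothetical reporting-cells cost: updates happen when users move into a
    reporting cell (weighted by beta); each incoming call pages the segment of
    cells bounded by the nearest reporting cells on either side."""
    n = len(reporting)
    update = sum(moves[i] for i in range(n) if reporting[i])
    paging = 0
    for i in range(n):
        lo = i
        while lo > 0 and not reporting[lo]:
            lo -= 1
        hi = i
        while hi < n - 1 and not reporting[hi]:
            hi += 1
        paging += calls[i] * (hi - lo + 1)
    return beta * update + paging

def best_config(moves, calls, beta=10.0):
    """Exhaustive search over reporting-cell configurations (feasible only for tiny n);
    a metaheuristic such as the diversity-guided GSO would search this space instead."""
    n = len(moves)
    return min(product([0, 1], repeat=n),
               key=lambda r: location_cost(list(r), moves, calls, beta))
```

The combinatorial blow-up of `best_config` (2^n configurations) is precisely why population-based optimizers are applied to this problem.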

  19. Group Search Optimizer for the Mobile Location Management Problem

    Science.gov (United States)

    Wang, Dan; Xiong, Congcong; Huang, Wei

    2014-01-01

    We propose a diversity-guided group search optimizer-based approach for solving the location management problem in mobile computing. The location management problem, which is to find the optimal network configurations of management under the mobile computing environment, is considered here as an optimization problem. The proposed diversity-guided group search optimizer algorithm is realized with the aid of diversity operator, which helps alleviate the premature convergence problem of group search optimizer algorithm, a successful optimization algorithm inspired by the animal behavior. To address the location management problem, diversity-guided group search optimizer algorithm is exploited to optimize network configurations of management by minimizing the sum of location update cost and location paging cost. Experimental results illustrate the effectiveness of the proposed approach. PMID:25180199

  20. Approaches in highly parameterized inversion - PEST++, a Parameter ESTimation code optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.

    2012-01-01

    An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.

  1. RO-75: a FORTRAN code for calculation and design optimization of reverse osmosis seawater desalination plants

    Energy Technology Data Exchange (ETDEWEB)

    Glueckstern, P.; Reed, S.A.; Wilson, J.V.

    1976-11-01

    The reverse osmosis process has been used extensively for the conversion of brackish waters to potable water. The process is now nearing commercialization as a means for the conversion of seawater. The computer program (RO-75) is a Fortran code for the optimization of the design and economics of seawater reverse osmosis plants. The examples described are based on currently available commercial membrane modules and prevailing prices. However, the code is very flexible and can be used to optimize plants utilizing future technological improvements and different economic parameters.

  2. PlayNCool: Opportunistic Network Coding for Local Optimization of Routing in Wireless Mesh Networks

    DEFF Research Database (Denmark)

    Pahlevani, Peyman; Roetter, Daniel Enrique Lucani; Pedersen, Morten Videbæk;

    2013-01-01

    This paper introduces PlayNCool, an opportunistic protocol with local optimization based on network coding to increase the throughput of a wireless mesh network (WMN). PlayNCool aims to enhance current routing protocols by (i) allowing random linear network coding transmissions end-to-end, (ii...... in large scale mesh networks. We show that PlayNCool can provide gains of more than 3x in individual links, which translates into a large end-to-end throughput improvement, and that it provides higher gains when more nodes in the network contend for the channel at the MAC layer, making it particularly...... relevant for dense mesh networks....
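The end-to-end random linear network coding that PlayNCool builds on can be sketched over GF(2); real deployments typically code over GF(2^8), and the helper below is an illustrative assumption, not PlayNCool's implementation:

```python
import random

def rlnc_encode(packets, n_coded, rng=random):
    """Emit n_coded packets, each a random GF(2) linear combination of the
    originals; the coefficient vector travels with the XORed payload so a
    receiver can decode once it has collected enough independent combinations."""
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in packets]
        if not any(coeffs):  # skip the useless all-zero combination
            coeffs[rng.randrange(len(coeffs))] = 1
        payload = [0] * len(packets[0])
        for c, p in zip(coeffs, packets):
            if c:
                payload = [a ^ b for a, b in zip(payload, p)]
        coded.append((coeffs, payload))
    return coded
```

Because any sufficiently large set of independent combinations decodes the generation, a helper node can recode and forward opportunistically without per-packet coordination, which is what enables the local optimization PlayNCool performs.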

  3. Autonomous tools for Grid management, monitoring and optimization

    CERN Document Server

    Wislicki, Wojciech

    2007-01-01

    We outline design and lines of development of autonomous tools for the computing Grid management, monitoring and optimization. The management is proposed to be based on the notion of utility. Grid optimization is considered to be application-oriented. A generic Grid simulator is proposed as an optimization tool for Grid structure and functionality.

  4. Development of Advanced In core Management Codes for Ulchin Unit 1, 2

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Hee; Park, Moon Gyu [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1996-12-31

    As the first case among FRAMATOME's plants, the Ulchin Unit 1 Cycle 8 core was loaded with V5H fuel. Because of the heterogeneity in the axial enrichment of the V5H fuel, FRAMATOME's 2-D in-core management codes, CEDRIC-CARIN-ESTHER, are no longer valid for in-core management. In order to analyze the Ulchin Unit 1 and 2 cores loaded with V5H fuel exactly, this study replaced them with WH's INCORE-3D and Tote codes. The previous INCORE-Tote codes have been run on HP workstations or the IBM mainframe, which are not easily accessible to site engineers and require complicated manipulation of the computer network system. This study developed a PC version of the INCORE-3D and Tote codes available in plants, including an interface code linking the data measured by the in-core instrument system (RIC-KIT system) to the INCORE code. These codes reduce the time needed to manage the core and increase the economic benefits. We installed the developed codes in Ulchin Units 1 and 2 and applied them to the core power distribution measurements performed during the Cycle 8 power escalation tests. The results satisfied all limits of the Technical Specification very well. The major contents of this study can be categorized as follows. 1. Analysis of the in-core management codes: (a) analysis of the flux mapping system and measurement reduction algorithm; (b) analysis of the methodology of the in-core management codes. 2. Development and verification of the PC-version in-core management codes: (a) development of the measured-data processing code (C2I); (b) development of the PC-version INCORE code; (c) development of the PC-version Tote code; (d) verification of the developed codes. 3. Application to the core physics tests of Ulchin Unit 1 Cycle 8: (a) power distribution measurement at 75% and 100%. (author). 14 refs., figs., tabs.

  5. Final Report A Multi-Language Environment For Programmable Code Optimization and Empirical Tuning

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Qing [Univ. of Colorado, Colorado Springs, CO (United States); Whaley, Richard Clint [Univ. of Texas, San Antonio, TX (United States); Qasem, Apan [Texas State Univ., San Marcos, TX (United States); Quinlan, Daniel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-11-23

This report summarizes our effort and results of building an integrated optimization environment to effectively combine the programmable control and the empirical tuning of source-to-source compiler optimizations within the framework of multiple existing languages, specifically C, C++, and Fortran. The environment contains two main components: the ROSE analysis engine, which is based on the ROSE C/C++/Fortran2003 source-to-source compiler developed by Co-PI Dr. Quinlan et al. at DOE/LLNL, and the POET transformation engine, which is based on an interpreted program transformation language developed by Dr. Yi at the University of Texas at San Antonio (UTSA). The ROSE analysis engine performs advanced compiler analysis, identifies profitable code transformations, and then produces output in POET, a language designed to provide programmable control of compiler optimizations to application developers and to support the parameterization of architecture-sensitive optimizations so that their configurations can be empirically tuned later. This POET output can then be ported to different machines together with the user application, where a POET-based search engine empirically reconfigures the parameterized optimizations until satisfactory performance is found. Computational specialists can write POET scripts to directly control the optimization of their code. Application developers can interact with ROSE to obtain optimization feedback as well as provide domain-specific knowledge and high-level optimization strategies. The optimization environment is expected to support different levels of automation and programmer intervention, from fully-automated tuning to semi-automated development and to manual programmable control.

  6. Optimal performance of networked control systems with bandwidth and coding constraints.

    Science.gov (United States)

    Zhan, Xi-Sheng; Sun, Xin-xiang; Li, Tao; Wu, Jie; Jiang, Xiao-Wei

    2015-11-01

The optimal tracking performance of multiple-input multiple-output (MIMO) discrete-time networked control systems with bandwidth and coding constraints is studied in this paper. The optimal tracking performance of the networked control system is obtained by using the spectral factorization technique and partial-fraction decomposition. The obtained results demonstrate that the optimal performance is influenced by the directions and locations of the nonminimum-phase zeros and unstable poles of the given plant. In addition, the optimal tracking performance is also closely influenced by the characteristics of the reference signal, the encoding, the bandwidth, and the additive white Gaussian noise (AWGN) of the communication channel. Some typical examples are given to illustrate the theoretical results.

  7. Optimization of energy saving device combined with a propeller using real-coded genetic algorithm

    Directory of Open Access Journals (Sweden)

    Ryu Tomohiro

    2014-06-01

Full Text Available This paper presents a numerical optimization method to improve the performance of a propeller with Turbo-Ring using a real-coded genetic algorithm. In the presented method, Unimodal Normal Distribution Crossover (UNDX) and the Minimal Generation Gap (MGG) model are used as the crossover operator and generation-alternation model, respectively. Propeller characteristics are evaluated by a simple surface panel method, "SQCM", in the optimization process. Blade sections of the original Turbo-Ring and propeller are replaced by the NACA66 a = 0.8 section. However, the original chord, skew, rake and maximum blade thickness distributions in the radial direction are unchanged. Pitch and maximum camber distributions in the radial direction are selected as the design variables. Optimization is conducted to maximize the efficiency of the propeller with Turbo-Ring. The experimental result shows that the efficiency of the optimized propeller with Turbo-Ring is higher than that of the original propeller with Turbo-Ring.
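
The UNDX operator named in this record can be sketched in a few lines: a child is drawn around the midpoint of two parents, with spread along the parent axis and orthogonal spread scaled by a third parent's distance from that axis. The following is a simplified illustration; the parameter values and the exact orthogonal scaling are common defaults, not details taken from the paper.

```python
import numpy as np

def undx(p1, p2, p3, rng, sigma_xi=0.5, sigma_eta=0.35):
    """Simplified Unimodal Normal Distribution Crossover (UNDX).

    Generates a child around the midpoint of parents p1 and p2, with
    spread along the p1-p2 axis and orthogonal spread scaled by the
    distance of a third parent p3 from that axis.
    """
    m = (p1 + p2) / 2.0            # midpoint of the two main parents
    d = p2 - p1                    # primary search direction
    n = len(p1)
    if np.linalg.norm(d) > 0:
        d_unit = d / np.linalg.norm(d)
        # component of p3 perpendicular to the line through p1 and p2
        perp = (p3 - p1) - np.dot(p3 - p1, d_unit) * d_unit
    else:
        d_unit = None
        perp = p3 - p1
    D = np.linalg.norm(perp)
    child = m + rng.normal(0.0, sigma_xi) * d
    # orthogonal perturbations, scaled by D / sqrt(n)
    e = rng.normal(0.0, sigma_eta * D / np.sqrt(n), size=n)
    if d_unit is not None:
        e = e - np.dot(e, d_unit) * d_unit   # keep e orthogonal to d
    return child + e
```

When p3 lies on the line through p1 and p2 (D = 0), the orthogonal term vanishes and the child falls exactly on that line, which is the behavior the operator's geometry implies.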

  8. Mean-Adaptive Real-Coding Genetic Algorithm and its Applications to Electromagnetic Optimization (Part One

    Directory of Open Access Journals (Sweden)

    Z. Raida

    2007-09-01

Full Text Available In this paper, a novel instance of the real-coding steady-state genetic algorithm, called the Mean-adaptive real-coding genetic algorithm, is put forward. In this instance, three novel implementations of evolution operators are incorporated: a recombination operator and two mutation operators. All of the evolution operators are designed with the aim of possessing strong explorative power. Moreover, one of the mutation operators exhibits self-adaptive behavior and the other exhibits adaptive behavior, thereby allowing the algorithm to self-control its own mutability as the search advances. The algorithm also takes advantage of population-elitist selection, acting as a replacement policy, adopted from evolution strategies. The purpose of this paper (i.e., the first part) is to provide the theoretical foundations of a robust and advanced instance of the real-coding genetic algorithm with strong potential to be successfully applied to electromagnetic optimization.

  9. Two-Layer Coding Rate Optimization in Relay-Aided Systems

    DEFF Research Database (Denmark)

    Sun, Fan

    2011-01-01

    We consider a three-node transmission system, where a source node conveys a data block to a destination node with the help of a half-duplex decode and-forward (DF) relay node. The whole data block is transmitted as a sequence of packets. For reliable transmission in the three-node system, a two...... different system performance requirements. For different objectives, two optimization problems are formulated and solutions are presented. One is to minimize the outage probability given the efficiency requirement, while the other one is to maximize the transmission efficiency given the outage probability...... requirement. Numerical results are also provided to show the optimized physical layer coding and network coding rate pairs in different system scenarios....

  10. Tunable wavefront coded imaging system based on detachable phase mask: Mathematical analysis, optimization and underlying applications

    Science.gov (United States)

    Zhao, Hui; Wei, Jingxuan

    2014-09-01

The key to the concept of tunable wavefront coding lies in detachable phase masks. Ojeda-Castaneda et al. (Progress in Electronics Research Symposium Proceedings, Cambridge, USA, July 5-8, 2010) described a typical design in which two components with cosinusoidal phase variation operate together to make defocus sensitivity tunable. The present study proposes an improved design and makes three contributions: (1) A mathematical derivation based on the stationary phase method explains why the detachable phase mask of Ojeda-Castaneda et al. tunes the defocus sensitivity. (2) The mathematical derivations show that the effective bandwidth of the wavefront coded imaging system is also tunable by making each component of the detachable phase mask move asymmetrically. An improved Fisher information-based optimization procedure was also designed to ascertain the optimal mask parameters corresponding to a specific bandwidth. (3) Possible applications of the tunable bandwidth are demonstrated by simulated imaging.

  11. Model-Based Speech Signal Coding Using Optimized Temporal Decomposition for Storage and Broadcasting Applications

    Directory of Open Access Journals (Sweden)

    Chandranath R. N. Athaudage

    2003-09-01

    Full Text Available A dynamic programming-based optimization strategy for a temporal decomposition (TD model of speech and its application to low-rate speech coding in storage and broadcasting is presented. In previous work with the spectral stability-based event localizing (SBEL TD algorithm, the event localization was performed based on a spectral stability criterion. Although this approach gave reasonably good results, there was no assurance on the optimality of the event locations. In the present work, we have optimized the event localizing task using a dynamic programming-based optimization strategy. Simulation results show that an improved TD model accuracy can be achieved. A methodology of incorporating the optimized TD algorithm within the standard MELP speech coder for the efficient compression of speech spectral information is also presented. The performance evaluation results revealed that the proposed speech coding scheme achieves 50%–60% compression of speech spectral information with negligible degradation in the decoded speech quality.
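
The dynamic-programming event localization described above can be illustrated with a generic optimal-segmentation sketch. Here the within-segment squared error stands in for the spectral-distortion criterion of the TD model; this cost function is an assumption for illustration, not the one used by the authors.

```python
import numpy as np

def dp_segment(x, k):
    """Optimal partition of sequence x into k contiguous segments,
    minimizing total within-segment squared error (a simple stand-in
    for the spectral criterion used in temporal decomposition).
    Returns (total_cost, segment_start_indices)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    prefix = np.concatenate([[0.0], np.cumsum(x)])
    prefix2 = np.concatenate([[0.0], np.cumsum(x ** 2)])

    def seg_cost(i, j):            # SSE of fitting x[i:j] with its mean
        s, s2, m = prefix[j] - prefix[i], prefix2[j] - prefix2[i], j - i
        return s2 - s * s / m

    INF = float("inf")
    cost = [[INF] * (n + 1) for _ in range(k + 1)]
    back = [[0] * (n + 1) for _ in range(k + 1)]
    cost[0][0] = 0.0
    for seg in range(1, k + 1):
        for j in range(seg, n + 1):
            for i in range(seg - 1, j):
                c = cost[seg - 1][i] + seg_cost(i, j)
                if c < cost[seg][j]:
                    cost[seg][j], back[seg][j] = c, i
    # backtrack the optimal segment boundaries
    bounds, j = [], n
    for seg in range(k, 0, -1):
        j = back[seg][j]
        bounds.append(j)
    return cost[k][n], bounds[::-1]
```

Unlike a greedy stability criterion, this exhaustive recurrence guarantees globally optimal event locations for the chosen cost, which is the point the abstract makes about replacing SBEL's heuristic localization.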

  12. Contrast and Comparison Between the Old and New Bar Code for Commodity Management Measures

    Institute of Scientific and Technical Information of China (English)

    Huang Xiaolin; Ma Jing

    2005-01-01

With the development of the socialist market economy, the former Bar Code for Commodity Management Measures (hereafter called the Old Measures) issued by the National Bureau of Quality and Technical Supervision can no longer meet the requirements of commodity bar code management.

  13. Techniques and Tools for Optimizing Codes on Modern Architectures: : A Low-Level Approach

    OpenAIRE

    2009-01-01

This thesis describes novel techniques and test implementations for optimizing numerically intensive codes. Our main focus is on how given algorithms can be adapted to run efficiently on modern microprocessors, exploring several architectural features including instruction selection and access patterns related to having several levels of cache. Our approach is also shown to be relevant for multicore architectures. Our primary target applications are linear algebra routines in the form of ma...

  14. Analysis and Optimization of Sparse Random Linear Network Coding for Reliable Multicast Services

    DEFF Research Database (Denmark)

    Tassi, Andrea; Chatzigeorgiou, Ioannis; Roetter, Daniel Enrique Lucani

    2016-01-01

    Point-to-multipoint communications are expected to play a pivotal role in next-generation networks. This paper refers to a cellular system transmitting layered multicast services to a multicast group of users. Reliability of communications is ensured via different random linear network coding (RL...... guarantees to predetermined fractions of users. The performance of the proposed optimization framework is then investigated in a LTE-A eMBMS network multicasting H.264/SVC video services....
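
The reliability that such sparse RLNC schemes trade off against decoding cost can be estimated with a small Monte Carlo sketch: a receiver decodes when the random coefficient matrix of the combinations it collects has full rank. The i.i.d.-coefficient GF(2) model below is an illustrative simplification, not the optimization framework of the paper.

```python
import random

def decodable_prob(k, p, n_rx, trials=2000, seed=1):
    """Monte Carlo estimate of the probability that n_rx random linear
    combinations over GF(2) of k source packets have full rank k.
    Each coefficient is 1 with probability p (the code's sparsity)."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        rows = []
        for _ in range(n_rx):
            r = 0
            for b in range(k):          # draw one sparse coding vector
                if rng.random() < p:
                    r |= 1 << b
            rows.append(r)
        # rank over GF(2) via an XOR basis on integer bitmasks
        basis, rank = [0] * k, 0
        for r in rows:
            cur = r
            for b in range(k - 1, -1, -1):
                if not (cur >> b) & 1:
                    continue
                if basis[b] == 0:
                    basis[b] = cur
                    rank += 1
                    break
                cur ^= basis[b]
        if rank == k:
            ok += 1
    return ok / trials
```

Sweeping `p` and `n_rx` with a sketch like this reproduces the qualitative trade-off the paper optimizes: sparser coding vectors lower decoding complexity but require more received combinations to reach full rank.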

  15. Remarks on the Criteria of Constructing MIMO-MAC DMT Optimal Codes

    CERN Document Server

    Hsiao-feng,; Lahtonen, Jyrki; Vehkalahti, Roope; Hollanti, Camilla

    2009-01-01

In this paper we investigate the criteria proposed by Coronel et al. for constructing MIMO MAC-DMT optimal codes over several classes of fading channels. We first give a counterexample showing their DMT result is not correct when the channel is frequency-selective. For the case of symmetric MIMO-MAC flat fading channels, their DMT result reduces to exactly the same as that derived by Tse et al., and we therefore focus on their criteria for constructing MAC-DMT optimal codes, especially when the number of receive antennas is sufficiently large. In such a case, we show their criterion is equivalent to requiring the codes of any subset of users to satisfy a joint non-vanishing determinant criterion when the system operates in the antenna pooling regime. Finally, an upper bound on the product of minimum eigenvalues of the difference matrices is provided, and is used to show that any MIMO-MAC codes satisfying their criterion can exist only when the target multiplexing gain is small.

  16. Optimal management strategies for placenta accreta.

    Science.gov (United States)

    Eller, A G; Porter, T F; Soisson, P; Silver, R M

    2009-04-01

To determine which interventions for managing placenta accreta were associated with reduced maternal morbidity. Retrospective cohort study. Two tertiary care teaching hospitals in Utah. All identified cases of placenta accreta from 1996 to 2008. Cases of placenta accreta were identified using standard ICD-9 codes for placenta accreta, placenta praevia, and caesarean hysterectomy. Medical records were then abstracted for maternal medical history, hospital course, and maternal and neonatal outcomes. Maternal and neonatal complications were compared according to antenatal suspicion of accreta, indications for delivery, preoperative preparation, attempts at placental removal before hysterectomy, and hypogastric artery ligation. Early morbidity (prolonged maternal intensive care unit admission, large volume of blood transfusion, coagulopathy, ureteral injury, or early re-operation) and late morbidity (intra-abdominal infection, hospital re-admission, or need for delayed re-operation) were assessed. Seventy-six cases of placenta accreta were identified. When accreta was suspected, cases in which placental removal was attempted had a significantly higher rate of early morbidity than scheduled caesarean hysterectomy without attempted placental removal (67 versus 36%, P=0.038). Women with preoperative bilateral ureteric stents had a lower incidence of early morbidity compared with women without stents (18 versus 55%, P=0.018). Hypogastric artery ligation did not reduce maternal morbidity. Scheduled caesarean hysterectomy with preoperative ureteric stent placement and avoidance of attempted placental removal are associated with reduced maternal morbidity in women with suspected placenta accreta.

  17. Service Operations Optimization: Recent Development in Supply Chain Management

    OpenAIRE

    Bin Shen

    2015-01-01

Services are the key to success in operations management. Designing effective strategies via optimization techniques is a fundamental and important condition for performance improvement in service operations (SOs) management. In this paper, we focus on investigating SOs optimization in the area of supply chain management, which creates the greatest business value. Specifically, we study the recent development of SOs optimization associated with supply chains by categorizing them into ...

  18. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiology, Osaka University Hospital, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, St. Jude Children’s Research Hospital, Memphis, TN 38105 (United States)

    2016-01-15

Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA, established for uniform scanning proton beams, needs to be re-evaluated; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes, using data from FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm{sup 3}, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm{sup 3} voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently since the impact of the parameter settings depends on the proton irradiation
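
Range comparisons of the kind described here, matching proton ranges between MC codes, are typically made by interpolating the distal falloff of the PDD curve. A minimal sketch follows; the 80% level (R80) and linear interpolation are common conventions, not details taken from this record.

```python
import numpy as np

def distal_range(depths, dose, frac=0.8):
    """Depth at which the dose falls to `frac` of its maximum on the
    distal side of the Bragg peak, found by linear interpolation
    between the bracketing samples (frac=0.8 gives the R80 range)."""
    depths = np.asarray(depths, dtype=float)
    dose = np.asarray(dose, dtype=float)
    i_max = int(np.argmax(dose))
    target = frac * dose[i_max]
    # walk distally from the peak to the first crossing below target
    for i in range(i_max, len(dose) - 1):
        if dose[i] >= target >= dose[i + 1]:
            t = (dose[i] - target) / (dose[i] - dose[i + 1])
            return depths[i] + t * (depths[i + 1] - depths[i])
    return depths[-1]
```

Comparing `distal_range` values from two codes' PDD arrays gives a single scalar per configuration, which is how "accuracy of the proton range" can be reduced to a number when tuning simulation parameters.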

  19. Optimization of Channel Coding for Transmitted Image Using Quincunx Wavelets Transforms Compression

    Directory of Open Access Journals (Sweden)

    Mustapha Khelifi

    2016-05-01

Full Text Available Many images you see on the Internet today have undergone compression for various reasons. Image compression can benefit users by having pictures load faster and webpages use up less space on a Web host. Image compression does not reduce the physical size of an image but instead compresses the data that makes up the image into a smaller size. In image transmission, channel noise degrades the quality of the received image, which obliges us to use channel coding techniques to protect our data against the channel noise. The Reed-Solomon code is one of the most popular channel coding techniques used to correct errors in many systems (wireless or mobile communications, satellite communications, digital television/DVB, high-speed modems such as ADSL, xDSL, etc.). Since there are many possibilities for selecting the input parameters of an RS code, we are concerned with finding the optimum input that can protect our data with the minimum number of redundant bits. In this paper we use a genetic algorithm to optimize the selection of the input parameters of the RS code according to the channel conditions, which reduces the number of bits needed to protect our data while achieving high quality of the received image.
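
The trade-off being optimized, fewer redundant symbols versus protection against channel errors, can be sketched directly: an RS(n, k) code corrects up to t = (n − k)/2 symbol errors, so for a given symbol-error rate one can search for the highest-rate code that still meets a reliability target. A brute-force search stands in here for the paper's genetic algorithm, and the i.i.d. symbol-error channel and the 0.999 target are illustrative assumptions.

```python
from math import comb

def rs_success_prob(n, k, p_sym):
    """P(an RS(n, k) block decodes) when symbol errors are i.i.d. with
    probability p_sym: the decoder corrects up to t = (n - k) // 2
    symbol errors, so sum the binomial tail up to t."""
    t = (n - k) // 2
    return sum(comb(n, i) * p_sym**i * (1 - p_sym)**(n - i)
               for i in range(t + 1))

def best_k(n, p_sym, target=0.999):
    """Largest k (highest code rate, fewest redundant symbols) whose
    block success probability still meets `target`. A brute-force
    stand-in for a GA search over the code parameters."""
    for k in range(n - 1, 0, -1):
        if rs_success_prob(n, k, p_sym) >= target:
            return k
    return None
```

Because success probability is monotone in the number of parity symbols, the returned k is the boundary point: it meets the target while k + 1 does not. A GA becomes attractive when the search space grows (e.g., jointly choosing n, k, and interleaving under varying channel conditions).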

  20. Dynamic optimization the calculus of variations and optimal control in economics and management

    CERN Document Server

    Kamien, Morton I

    2012-01-01

    Since its initial publication, this text has defined courses in dynamic optimization taught to economics and management science students. The two-part treatment covers the calculus of variations and optimal control. 1998 edition.

  1. Optimization of Mutation Pressure in Relation to Properties of Protein-Coding Sequences in Bacterial Genomes.

    Science.gov (United States)

    Błażej, Paweł; Miasojedow, Błażej; Grabińska, Małgorzata; Mackiewicz, Paweł

    2015-01-01

    Most mutations are deleterious and require energetically costly repairs. Therefore, it seems that any minimization of mutation rate is beneficial. On the other hand, mutations generate genetic diversity indispensable for evolution and adaptation of organisms to changing environmental conditions. Thus, it is expected that a spontaneous mutational pressure should be an optimal compromise between these two extremes. In order to study the optimization of the pressure, we compared mutational transition probability matrices from bacterial genomes with artificial matrices fulfilling the same general features as the real ones, e.g., the stationary distribution and the speed of convergence to the stationarity. The artificial matrices were optimized on real protein-coding sequences based on Evolutionary Strategies approach to minimize or maximize the probability of non-synonymous substitutions and costs of amino acid replacements depending on their physicochemical properties. The results show that the empirical matrices have a tendency to minimize the effects of mutations rather than maximize their costs on the amino acid level. They were also similar to the optimized artificial matrices in the nucleotide substitution pattern, especially the high transitions/transversions ratio. We observed no substantial differences between the effects of mutational matrices on protein-coding sequences in genomes under study in respect of differently replicated DNA strands, mutational cost types and properties of the referenced artificial matrices. The findings indicate that the empirical mutational matrices are rather adapted to minimize mutational costs in the studied organisms in comparison to other matrices with similar mathematical constraints.

  2. Optimization of Mutation Pressure in Relation to Properties of Protein-Coding Sequences in Bacterial Genomes.

    Directory of Open Access Journals (Sweden)

    Paweł Błażej

    Full Text Available Most mutations are deleterious and require energetically costly repairs. Therefore, it seems that any minimization of mutation rate is beneficial. On the other hand, mutations generate genetic diversity indispensable for evolution and adaptation of organisms to changing environmental conditions. Thus, it is expected that a spontaneous mutational pressure should be an optimal compromise between these two extremes. In order to study the optimization of the pressure, we compared mutational transition probability matrices from bacterial genomes with artificial matrices fulfilling the same general features as the real ones, e.g., the stationary distribution and the speed of convergence to the stationarity. The artificial matrices were optimized on real protein-coding sequences based on Evolutionary Strategies approach to minimize or maximize the probability of non-synonymous substitutions and costs of amino acid replacements depending on their physicochemical properties. The results show that the empirical matrices have a tendency to minimize the effects of mutations rather than maximize their costs on the amino acid level. They were also similar to the optimized artificial matrices in the nucleotide substitution pattern, especially the high transitions/transversions ratio. We observed no substantial differences between the effects of mutational matrices on protein-coding sequences in genomes under study in respect of differently replicated DNA strands, mutational cost types and properties of the referenced artificial matrices. The findings indicate that the empirical mutational matrices are rather adapted to minimize mutational costs in the studied organisms in comparison to other matrices with similar mathematical constraints.

  3. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    Science.gov (United States)

    VanderWijngaart, Rob F.; Saphir, William C.; Saini, Subhash (Technical Monitor)

    1998-01-01

Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates, as reported by a cache simulation tool and confirmed by hardware counters, only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.

  4. MPEG-2/4 Low-Complexity Advanced Audio Coding Optimization and Implementation on DSP

    Science.gov (United States)

    Wu, Bing-Fei; Huang, Hao-Yu; Chen, Yen-Lin; Peng, Hsin-Yuan; Huang, Jia-Hsiung

This study presents several optimization approaches for the MPEG-2/4 Advanced Audio Coding (AAC) Low Complexity (LC) encoding and decoding processes. Considering the power consumption and the peripherals required for consumer electronics, this study adopts the TI OMAP5912 platform for portable devices. An important optimization issue for implementing AAC codec on embedded and mobile devices is to reduce computational complexity and memory consumption. Due to power saving issues, most embedded and mobile systems can only provide very limited computational power and memory resources for the coding process. As a result, modifying and simplifying only one or two blocks is insufficient for optimizing the AAC encoder and enabling it to work well on embedded systems. It is therefore necessary to enhance the computational efficiency of other important modules in the encoding algorithm. This study focuses on optimizing the Temporal Noise Shaping (TNS), Mid/Side (M/S) Stereo, Modified Discrete Cosine Transform (MDCT) and Inverse Quantization (IQ) modules in the encoder and decoder. Furthermore, we also propose an efficient memory reduction approach that provides a satisfactory balance between the reduction of memory usage and the expansion of the encoded files. In the proposed design, both the AAC encoder and decoder are built with fixed-point arithmetic operations and implemented on a DSP processor combined with an ARM-core for peripheral controlling. Experimental results demonstrate that the proposed AAC codec is computationally effective, has low memory consumption, and is suitable for low-cost embedded and mobile applications.

  5. Finite population analysis of the effect of horizontal gene transfer on the origin of an universal and optimal genetic code

    Science.gov (United States)

    Aggarwal, Neha; Vishwa Bandhu, Ashutosh; Sengupta, Supratim

    2016-06-01

    The origin of a universal and optimal genetic code remains a compelling mystery in molecular biology and marks an essential step in the origin of DNA and protein based life. We examine a collective evolution model of genetic code origin that allows for unconstrained horizontal transfer of genetic elements within a finite population of sequences each of which is associated with a genetic code selected from a pool of primordial codes. We find that when horizontal transfer of genetic elements is incorporated in this more realistic model of code-sequence coevolution in a finite population, it can increase the likelihood of emergence of a more optimal code eventually leading to its universality through fixation in the population. The establishment of such an optimal code depends on the probability of HGT events. Only when the probability of HGT events is above a critical threshold, we find that the ten amino acid code having a structure that is most consistent with the standard genetic code (SGC) often gets fixed in the population with the highest probability. We examine how the threshold is determined by factors like the population size, length of the sequences and selection coefficient. Our simulation results reveal the conditions under which sharing of coding innovations through horizontal transfer of genetic elements may have facilitated the emergence of a universal code having a structure similar to that of the SGC.

  6. Finite population analysis of the effect of horizontal gene transfer on the origin of an universal and optimal genetic code.

    Science.gov (United States)

    Aggarwal, Neha; Bandhu, Ashutosh Vishwa; Sengupta, Supratim

    2016-05-27

    The origin of a universal and optimal genetic code remains a compelling mystery in molecular biology and marks an essential step in the origin of DNA and protein based life. We examine a collective evolution model of genetic code origin that allows for unconstrained horizontal transfer of genetic elements within a finite population of sequences each of which is associated with a genetic code selected from a pool of primordial codes. We find that when horizontal transfer of genetic elements is incorporated in this more realistic model of code-sequence coevolution in a finite population, it can increase the likelihood of emergence of a more optimal code eventually leading to its universality through fixation in the population. The establishment of such an optimal code depends on the probability of HGT events. Only when the probability of HGT events is above a critical threshold, we find that the ten amino acid code having a structure that is most consistent with the standard genetic code (SGC) often gets fixed in the population with the highest probability. We examine how the threshold is determined by factors like the population size, length of the sequences and selection coefficient. Our simulation results reveal the conditions under which sharing of coding innovations through horizontal transfer of genetic elements may have facilitated the emergence of a universal code having a structure similar to that of the SGC.

  7. Optimized Irregular Low-Density Parity-Check Codes for Multicarrier Modulations over Frequency-Selective Channels

    Directory of Open Access Journals (Sweden)

    Gelle Guillaume

    2004-01-01

    Full Text Available This paper deals with optimized channel coding for OFDM transmissions (COFDM over frequency-selective channels using irregular low-density parity-check (LDPC codes. Firstly, we introduce a new characterization of the LDPC code irregularity called “irregularity profile.” Then, using this parameterization, we derive a new criterion based on the minimization of the transmission bit error probability to design an irregular LDPC code suited to the frequency selectivity of the channel. The optimization of this criterion is done using the Gaussian approximation technique. Simulations illustrate the good performance of our approach for different transmission channels.

  8. Simple PSF based method for pupil phase mask's optimization in wavefront coding system

    Institute of Scientific and Technical Information of China (English)

    ZHANG Wen-zi; CHEN Yan-ping; ZHAO Ting-yu; YE Zi; YU Fei-hong

    2007-01-01

By applying the wavefront coding technique to an optical system, the depth of focus can be greatly increased. Several complicated methods, such as the Fisher-information-based method, have already been used to optimize the pupil phase mask under ideal conditions. Here, a simple point spread function (PSF) based method, in which only the standard deviation is used to evaluate PSF stability over the depth of focus, is employed to optimize the coefficients of the pupil phase mask in practical optical systems. Results of imaging simulations for optical systems with and without the pupil phase mask are presented, and image sharpness is calculated for comparison. The optimized results showed better and much more stable imaging quality than the original system, without changing the position of the image plane.
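
The standard-deviation criterion can be sketched in one dimension: compute the PSF at several defocus values and measure how much it varies pixel-wise. A cubic phase mask, the classic wavefront-coding choice, is assumed here purely for illustration; it should score as far more defocus-stable than an unmasked pupil.

```python
import numpy as np

def psf_1d(alpha, w20, n=128):
    """1-D PSF of a pupil with a cubic phase mask alpha*x^3 and a
    defocus term w20*x^2 (both in waves), via FFT of the pupil."""
    x = np.linspace(-1, 1, n)
    pupil = np.exp(1j * 2 * np.pi * (alpha * x**3 + w20 * x**2))
    psf = np.abs(np.fft.fftshift(np.fft.fft(pupil, 4 * n))) ** 2
    return psf / psf.sum()          # normalize to unit energy

def defocus_stability(alpha, w20_range):
    """Standard-deviation metric from the abstract: pixel-wise std of
    the PSF across the defocus range, summed. Smaller means a more
    defocus-invariant PSF."""
    psfs = np.stack([psf_1d(alpha, w) for w in w20_range])
    return psfs.std(axis=0).sum()
```

An optimizer would then minimize `defocus_stability` over the mask coefficient (here `alpha`), possibly with a sharpness constraint so the trivial "very blurred everywhere" solution is excluded.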

  9. Simple Strehl ratio based method for pupil phase mask's optimization in wavefront coding system

    Institute of Scientific and Technical Information of China (English)

    Wenzi Zhang; Yanping Chen; Tingyu Zhao; Zi Ye; Feihong Yu

    2006-01-01

    By applying the wavefront coding technique to an optical system, the depth of focus can be greatly increased. Several complicated methods have already been used to optimize the pupil phase mask under ideal conditions. Here, a simple Strehl-ratio-based method, using only the standard deviation to evaluate the Strehl ratio stability over the depth of focus, is applied to optimize the coefficients of the pupil phase mask in practical optical systems. Results of imaging simulations for optical systems with and without a pupil phase mask are presented, and image sharpness is calculated for comparison. The optimized pupil phase mask shows good results in extending the depth of focus.

  10. OPTIMAL ANTENNA SUBSET SELECTION AND BLIND DETECTION APPROACH APPLIED TO ORTHOGONAL SPACE-TIME BLOCK CODING

    Institute of Scientific and Technical Information of China (English)

    Xu Hongji; Liu Ju; Gu Bo

    2007-01-01

    An approach combining optimal antenna subset selection with a blind detection scheme for Orthogonal Space-Time Block Coding (OSTBC) is proposed in this paper. Optimal antenna subset selection is applied at the transmitter and/or receiver side, choosing the optimal antennas to increase the diversity order of OSTBC and further improve its performance. To enhance the robustness of the detection used in the conventional OSTBC scheme, a blind detection scheme based on Independent Component Analysis (ICA) is exploited, which can directly extract the transmitted signals without channel estimation. Performance analysis shows that the proposed approach achieves full diversity and flexibility of system design by using the antenna selection and the ICA-based blind detection schemes.

  11. An optimized context-based adaptive binary arithmetic coding algorithm in progressive H.264 encoder

    Science.gov (United States)

    Xiao, Guang; Shi, Xu-li; An, Ping; Zhang, Zhao-yang; Gao, Ge; Teng, Guo-wei

    2006-05-01

    Context-based Adaptive Binary Arithmetic Coding (CABAC) is a new entropy coding method introduced in H.264/AVC that is highly efficient for video coding. In this method, the probability of the current symbol is estimated using carefully designed context models, which are adaptive and approach the statistical characteristics of the data; an arithmetic coding mechanism then largely removes inter-symbol redundancy. Compared with the UVLC method of the prior standard, CABAC is more complicated but reduces the bit rate more efficiently. Based on a thorough analysis of the CABAC encoding and decoding methods, this paper proposes two methods, a sub-table method and a stream-reuse method, to improve the encoding efficiency as implemented in the H.264 JM code. In JM, the CABAC function produces the bits of every syntactic element one by one, and the repeated multiplication operations in the CABAC function make it inefficient; the proposed algorithm creates tables beforehand and then produces all bits of a syntactic element at once. Also in JM, the intra-prediction and inter-prediction mode selection algorithm is based on a rate-distortion optimization (RDO) model, one parameter of which is the bit rate produced by the CABAC operator; after mode selection, the CABAC stream is discarded and recalculated for the output stream. The proposed stream-reuse algorithm stores the stream created during mode selection in memory and reuses it in the encoding function. Experimental results show that our proposed algorithms achieve average speedups of 17 to 78 MSEL for QCIF and CIF sequences, respectively, compared with the original JM algorithm, at the cost of only a little memory space. The CABAC was realized in our progressive H.264 encoder.

  12. Optimizing the Search for High-z GRBs: The JANUS X-ray Coded Aperture Telescope

    CERN Document Server

    Burrows, D N; Palmer, D; Romano, P; Mangano, V; La Parola, V; Falcone, A D; Roming, P W A

    2011-01-01

    We discuss the optimization of gamma-ray burst (GRB) detectors with a goal of maximizing the detected number of bright high-redshift GRBs, in the context of design studies conducted for the X-ray transient detector on the JANUS mission. We conclude that the optimal energy band for detection of high-z GRBs is below about 30 keV. We considered both lobster-eye and coded aperture designs operating in this energy band. Within the available mass and power constraints, we found that the coded aperture mask was preferred for the detection of high-z bursts with bright enough afterglows to probe galaxies in the era of the Cosmic Dawn. This initial conclusion was confirmed through detailed mission simulations that found that the selected design (an X-ray Coded Aperture Telescope) would detect four times as many bright, high-z GRBs as the lobster-eye design we considered. The JANUS XCAT instrument will detect 48 GRBs with z > 5 and fluence S_x > 3 × 10^-7 erg cm^-2 in a two-year mission.

  13. Multiple Description Coding Based on Optimized Redundancy Removal for 3D Depth Map

    Directory of Open Access Journals (Sweden)

    Sen Han

    2016-06-01

    Full Text Available Multiple description (MD) coding is a promising alternative for the robust transmission of information over error-prone channels. In 3D image technology, the depth map represents the distance between the camera and objects in the scene. Using the depth map combined with the existing multiview image, images of any virtual viewpoint position can be efficiently synthesized, which can display more realistic 3D scenes. Unlike the conventional 2D texture image, the depth map contains a lot of spatially redundant information that is not necessary for view synthesis but may waste compressed bits, especially when MD coding is used for robust transmission. In this paper, we focus on redundancy removal for MD coding in the DCT (discrete cosine transform) domain. In view of the characteristics of DCT coefficients, at the encoder a Lagrangian optimization approach is designed to determine the number of high-frequency coefficients in the DCT domain to be removed. To keep the computational complexity low, the entropy is adopted to estimate the bit rate in the optimization. Furthermore, at the decoder, adaptive zero-padding is applied to reconstruct the depth map when some information is lost. The experimental results show that, compared to the corresponding reference scheme, the proposed method achieves better central and side rate-distortion performance.
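A rough sketch of the encoder-side decision described above, with made-up block data, quantization step, and λ (the paper's actual coefficient ordering and rate model may differ): zero out the highest-frequency DCT coefficients, estimate the rate by the entropy of the quantized coefficients, and keep the truncation point minimizing the Lagrangian cost D + λR.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    C[0] /= np.sqrt(2.0)
    return C

def entropy_bits(q):
    """Empirical entropy (bits/coefficient) of quantized coefficients."""
    vals, counts = np.unique(q, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def best_removal(block, lam, step=8.0):
    """Choose how many DCT coefficients to keep (ordered roughly from
    low to high frequency by i+j), minimizing D + lam*R, with the rate
    R estimated by the entropy of the quantized coefficients."""
    n = block.shape[0]
    C = dct_matrix(n)
    coef = C @ block @ C.T
    order = np.argsort(np.add.outer(np.arange(n), np.arange(n)).ravel())
    best = None
    for keep in range(1, n * n + 1):
        c = coef.ravel().copy()
        c[order[keep:]] = 0.0            # drop the highest-frequency part
        q = np.round(c / step)
        rec = C.T @ (q.reshape(n, n) * step) @ C
        D = float(((block - rec) ** 2).mean())
        R = entropy_bits(q)
        cost = D + lam * R
        if best is None or cost < best[0]:
            best = (cost, keep, D, R)
    return best

rng = np.random.default_rng(0)
# smooth toy "depth map" block: low-frequency ramp plus mild noise
x = np.linspace(0, 1, 8)
block = 100 * np.add.outer(x, x) + rng.normal(0, 1, (8, 8))
cost, keep, D, R = best_removal(block, lam=50.0)
```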

  14. A Hierarchical Joint Optimized Bit—allocation Strategy for HDTV Encoder with Parallel Coding Architecture

    Institute of Scientific and Technical Information of China (English)

    XIONG Hongkai; YU Songyu; YE Wei

    2003-01-01

    Because real-time compression and high-speed digital processing circuitry are crucial for digital high definition television (HDTV) coding, parallel processing has become a feasible scheme in most applications. This paper presents a novel bit-allocation strategy for an HDTV encoder system with parallel architecture, in which the original HDTV picture is divided into six horizontal sub-pictures. It is shown that the MPEG-2 Test Model 5 (TM5) rate control scheme would not only give rise to inconsistent visual quality across sub-pictures in a composite HDTV frame, but also make the coding quality degrade abruptly and the buffer underflow at scene changes. How to allocate bit rates among sub-pictures has become a great challenge in the literature. The proposed strategy performs a hierarchical joint optimized bit allocation based on the sub-pictures' average complexity and average bits measures, and is moreover capable of alleviating serious inconsistencies in picture quality at scene changes. The optimized bit allocation and its complementary rate-adaptive procedures are formulated and described. In the paper, the proposed strategy is compared with independent coding, in which each sub-picture sequence is assigned the same proportion of the channel bandwidth. Experimental results demonstrate that the proposed scheme not only alleviates the boundary effect but also ensures consistent quality across sub-pictures.
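The contrast with independent coding can be sketched as a complexity-proportional split of the frame budget. This is a deliberate simplification of the paper's hierarchical strategy, and the complexity values below are invented for illustration.

```python
def allocate_bits(total_bits, complexities):
    """Joint bit allocation across sub-pictures: each horizontal
    sub-picture gets a share of the frame budget proportional to its
    average coding complexity, instead of the fixed 1/6 of the channel
    bandwidth used by the independent-coding baseline."""
    total_c = sum(complexities)
    raw = [total_bits * c / total_c for c in complexities]
    bits = [int(b) for b in raw]
    bits[raw.index(max(raw))] += total_bits - sum(bits)  # absorb rounding
    return bits

# six horizontal HDTV sub-pictures with unequal activity (made-up values)
shares = allocate_bits(600_000, [1.0, 2.0, 4.0, 4.0, 2.0, 1.0])
```

The busier central sub-pictures receive more bits, which is the mechanism by which the joint strategy evens out visual quality across sub-picture boundaries.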

  15. Optimal management of genital herpes: current perspectives

    Directory of Open Access Journals (Sweden)

    Sauerbrei A

    2016-06-01

    Full Text Available Andreas Sauerbrei Institute of Virology and Antiviral Therapy, German Consulting Laboratory for Herpes Simplex Virus and Varicella-Zoster Virus, Jena University Hospital, Friedrich-Schiller University of Jena, Jena, Germany Abstract: As one of the most common sexually transmitted diseases, genital herpes is a global medical problem with significant physical and psychological morbidity. Genital herpes is caused by herpes simplex virus type 1 or type 2 and can manifest as primary and/or recurrent infection. This manuscript provides an overview about the fundamental knowledge on the virus, its epidemiology, and infection. Furthermore, the current possibilities of antiviral therapeutic interventions and laboratory diagnosis of genital herpes as well as the present situation and perspectives for the treatment by novel antivirals and prevention of disease by vaccination are presented. Since the medical management of patients with genital herpes simplex virus infection is often unsatisfactory, this review aims at all physicians and health professionals who are involved in the care of patients with genital herpes. The information provided would help to improve the counseling of affected patients and to optimize the diagnosis, treatment, and prevention of this particular disease. Keywords: herpes simplex virus, epidemiology, infection, antiviral therapy, laboratory diagnosis, prevention

  16. Optimal management of idiopathic scoliosis in adolescence

    Directory of Open Access Journals (Sweden)

    Kotwicki T

    2013-07-01

    the need for surgical treatment. Surgery is the treatment of choice for severe idiopathic scoliosis which is rapidly progressive, with early onset, late diagnosis, and neglected or failed conservative treatment. The psychologic impact of idiopathic scoliosis, a chronic disease occurring in the psychologically fragile period of adolescence, is important because of its body distorting character and the onerous treatment required, either conservative or surgical. Optimal management of idiopathic scoliosis requires cooperation within a professional team which includes the entire therapeutic spectrum, extending from simple watchful observation of nonprogressive mild deformities through to early surgery for rapidly deteriorating curvature. Probably most demanding is adequate management with regard to the individual course of the disease in a given patient, while avoiding overtreatment or undertreatment. Keywords: management, idiopathic scoliosis, adolescence

  17. Dynamic systems of regional economy management optimization

    Science.gov (United States)

    Trofimov, S.; Kudzh, S.

    directions of the region's industrial policy. The situational-analytical centers (SAC) of regional administration: the major components of an SAC are dynamic modeling, analysis, forecasting, and optimization systems based on modern intelligent information technologies. The scope of an SAC covers not only the management of financial flows and the optimization of investments, but also strategic forecasting functions, which provide an optimal choice, "targeting", and the search for optimal paths of regional development and the corresponding investments. It is worth considering the formation of a unified organizational and methodological center for the region's industrial policy. Such an organization could be directly connected to the planning and analytical services of the largest economic structures, local authorities, ministries, and departments. Such a "direct connection" can provide effective strategic management of regional development. In any case, entering foreign markets demands a concentration of resources and the support of the authorities. The proposed measures can provide the necessary coordination of efforts among economic structures at various levels. Maintaining a regional industrial policy requires drawing on the newest methods of strategic planning and management, built on modern approaches to the management of economic systems, since an industrial policy ultimately comes down to forming effective control centers for regional and corporate economic activity. Opportunities for optimal planning and management of the regional economy as a unified system: approaches to planning regional economic systems can differ. We consider some of the most effective methods of planning and controlling the condition of regional facilities. All of them are compact and clear, which places them in the group of medium-complexity technologies. For solving problems of regional resource management, a promising approach is the so

  18. A wavelet packet based block-partitioning image coding algorithm with rate-distortion optimization

    Institute of Scientific and Technical Information of China (English)

    YANG YongMing; XU Chao

    2008-01-01

    As an elegant generalization of wavelet transform, wavelet packet (WP) provides an effective representation tool for adaptive waveform analysis. Recent work shows that image-coding methods based on WP decomposition can achieve significant gain over those based on a usual wavelet transform. However, most of the work adopts a tree-structured quantization scheme, which is a successful technique for wavelet image coding, but not appropriate for WP subbands. This paper presents an image-coding algorithm based on a rate-distortion optimized wavelet packet decomposition and on an intraband block-partitioning scheme. By encoding each WP subband separately with the block-partitioning algorithm and the JPEG2000 context modeling, the proposed algorithm naturally avoids the difficulty in defining parent-offspring relationships for the WP coefficients, which has to be faced when adopting the tree-structured quantization scheme. The experimental results show that the proposed algorithm significantly outperforms SPIHT and JPEG2000 schemes and also surpasses state-of-the-art WP image coding algorithms, in terms of both PSNR and visual quality.

  19. Jar Decoding: Non-Asymptotic Converse Coding Theorems, Taylor-Type Expansion, and Optimality

    CERN Document Server

    Yang, En-Hui

    2012-01-01

    Recently, a new decoding rule called jar decoding was proposed; under jar decoding, a non-asymptotic achievable tradeoff between the coding rate and word error probability was also established for any discrete input memoryless channel with discrete or continuous output (DIMC). Along the path of non-asymptotic analysis, in this paper, it is further shown that jar decoding is actually optimal up to the second order coding performance by establishing new non-asymptotic converse coding theorems, and determining the Taylor expansion of the (best) coding rate $R_n (\\epsilon)$ of finite block length for any block length $n$ and word error probability $\\epsilon$ up to the second order. Finally, based on the Taylor-type expansion and the new converses, two approximation formulas for $R_n (\\epsilon)$ (dubbed "SO" and "NEP") are provided; they are further evaluated and compared against some of the best bounds known so far, as well as the normal approximation of $R_n (\\epsilon)$ revisited recently in the literature. It t...
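For context, the normal approximation of R_n(ε) mentioned above can be evaluated for a concrete channel. The sketch below computes it for a binary symmetric channel with crossover probability p, a standard special case (the paper's DIMC setting is more general, and its SO/NEP formulas are refinements of this baseline):

```python
import math
from statistics import NormalDist

def h2(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def normal_approx_rate(p, n, eps):
    """Normal approximation to the best finite-blocklength rate for a
    BSC(p):  R_n(eps) ~ C - sqrt(V/n) * Q^{-1}(eps) + log2(n)/(2n),
    where C is capacity and V the channel dispersion."""
    C = 1 - h2(p)
    V = p * (1 - p) * (math.log2((1 - p) / p)) ** 2   # channel dispersion
    qinv = NormalDist().inv_cdf(1 - eps)              # Q^{-1}(eps)
    return C - math.sqrt(V / n) * qinv + math.log2(n) / (2 * n)

r = normal_approx_rate(p=0.11, n=1000, eps=1e-3)
cap = 1 - h2(0.11)
```

At blocklength 1000 and error probability 10^-3 the approximation sits noticeably below capacity, which is exactly the finite-blocklength backoff the paper quantifies.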

  20. The Optimal Management of Informational Servicing Logistic Systems

    Directory of Open Access Journals (Sweden)

    Safwan Al SALAIMEH

    2007-01-01

    Full Text Available This paper reviews optimization problems in the management of informational servicing logistic systems (ISLS), within the class of problems that can be solved by queuing system (QS) theory. Examples of building mathematical models and developing effective algorithms for the quasi-optimal management of informational servicing logistic systems are presented.
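As an illustration of the QS-theory flavor of such problems (a textbook example, not a model taken from the paper): for an M/M/1 service node with arrival rate λ, a linear cost on service capacity, and a holding cost on the mean number in the system, the cost-minimizing service rate has a closed form.

```python
import math

def optimal_service_rate(lam, c_service, c_wait):
    """Choose the service rate mu minimizing
        cost(mu) = c_service * mu + c_wait * L(mu),
    where L = lam / (mu - lam) is the M/M/1 mean number in system.
    Setting the derivative to zero gives
        mu* = lam + sqrt(c_wait * lam / c_service)."""
    return lam + math.sqrt(c_wait * lam / c_service)

def cost(mu, lam, c_service, c_wait):
    """Total cost rate of an M/M/1 node at service rate mu > lam."""
    return c_service * mu + c_wait * lam / (mu - lam)

lam, cs, cw = 4.0, 1.0, 9.0           # made-up rates and cost weights
mu_star = optimal_service_rate(lam, cs, cw)   # 4 + sqrt(36) = 10
```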

  1. Optimizing performance of superscalar codes for a single Cray X1MSP processor

    Energy Technology Data Exchange (ETDEWEB)

    Shan, Hongzhang; Strohmaier, Erich; Oliker, Leonid

    2004-06-08

    The growing gap between sustained and peak performance for full-scale complex scientific applications on conventional supercomputers is a major concern in high performance computing. The recently-released vector-based Cray X1 offers to bridge this gap for many demanding scientific applications. However, this unique architecture contains both data caches and multi-streaming processing units, and the optimal programming methodology is still under investigation. In this paper we investigate Cray X1 code optimization for a suite of computational kernels originally designed for superscalar processors. For our study, we select four applications from the SPLASH2 application suite (1-D FFT, Radix, Ocean, and Nbody), two kernels from the NAS benchmark suite (3-D FFT and CG), and a matrix-matrix multiplication kernel. Results show that in many cases the addition of vectorization compiler directives results in faster runtimes. However, to achieve a significant performance improvement via increased vector length, it is often necessary to restructure the program at the source level, sometimes leading to algorithmic-level transformations. Additionally, memory bank conflicts may result in substantial performance losses. These conflicts can often be exacerbated when optimizing code for increased vector lengths, and must be explicitly minimized. Finally, we investigate the effect of the X1 data caches on overall performance.

  2. A treatment planning code for inverse planning and 3D optimization in hadrontherapy.

    Science.gov (United States)

    Bourhaleb, F; Marchetto, F; Attili, A; Pittà, G; Cirio, R; Donetti, M; Giordanengo, S; Givehchi, N; Iliescu, S; Krengli, M; La Rosa, A; Massai, D; Pecka, A; Pardo, J; Peroni, C

    2008-09-01

    The therapeutic use of protons and ions, especially carbon ions, is a new technique, and conforming the dose to the target is a challenge due to the energy deposition characteristics of hadron beams. An appropriate treatment planning system (TPS) is strictly necessary to take full advantage of this technique. We developed a TPS software, ANCOD++, for the evaluation of the optimal conformal dose. ANCOD++ is an analytical code, using the voxel-scan technique as an active method to deliver the dose to the patient, and provides treatment plans with both proton and carbon-ion beams. The iterative algorithm, coded in C++ and running on Unix/Linux platforms, determines the best fluences of the individual beams to obtain an optimal physical dose distribution, delivering a maximum dose to the target volume and a minimum dose to critical structures. The TPS is supported by Monte Carlo simulations with the package GEANT3, which provide the necessary physical lookup tables and verify the optimized treatment plans. Dose verifications by means of full Monte Carlo simulations show overall good agreement with the treatment planning calculations. We stress that the purpose of this work is the verification of the physical dose; future work will be dedicated to the radiobiological evaluation of the equivalent biological dose.

  3. Bandwidth optimization of a Planar Inverted-F Antenna using binary and real coded genetic algorithms

    Institute of Scientific and Technical Information of China (English)

    AMEERUDDEN Mohammad Riyad; RUGHOOPUTH Harry C S

    2009-01-01

    With the exponential development of mobile communications and the miniaturization of radio frequency transceivers, the need for small and low-profile antennas at mobile frequencies is constantly growing. Therefore, new antennas should be developed to provide larger bandwidth and at the same time small dimensions. Although the bandwidth performance of an antenna is directly related to its dimensions relative to the wavelength, the aim is to keep the overall size of the antenna constant and, from there, find the geometry and structure that give the best performance. The design and bandwidth optimization of a Planar Inverted-F Antenna (PIFA) were carried out in order to achieve a larger bandwidth in the 2 GHz band, using two optimization techniques based upon genetic algorithms (GA), namely the Binary-Coded GA (BCGA) and the Real-Coded GA (RCGA). During the optimization process, the different PIFA models were evaluated using the finite-difference time-domain (FDTD) method, a technique belonging to the general class of differential time-domain numerical modeling methods.
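A minimal real-coded GA can be sketched as follows. In the paper each candidate PIFA geometry is scored with an FDTD solver; here that expensive evaluation is replaced by a toy quadratic objective, and the selection, crossover, and mutation settings are arbitrary illustrative choices.

```python
import random

def real_coded_ga(fitness, bounds, pop_size=30, gens=60, seed=1):
    """Minimal RCGA sketch: tournament selection, blend (BLX-style)
    crossover, Gaussian mutation over real-valued parameters, and
    elitism. `fitness` is maximized."""
    rng = random.Random(seed)
    def rand_ind():
        return [rng.uniform(lo, hi) for lo, hi in bounds]
    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        new_pop = scored[:2]                        # keep the two elites
        while len(new_pop) < pop_size:
            a, b = (max(rng.sample(pop, 3), key=fitness) for _ in range(2))
            child = []
            for (x, y), (lo, hi) in zip(zip(a, b), bounds):
                g = rng.uniform(min(x, y), max(x, y))   # blend crossover
                if rng.random() < 0.1:                  # mutation
                    g += rng.gauss(0, 0.1 * (hi - lo))
                child.append(min(hi, max(lo, g)))
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# toy stand-in for the FDTD bandwidth score (peak at h=8 mm, w=40 mm)
toy = lambda p: -((p[0] - 8.0) ** 2 + (p[1] - 40.0) ** 2)
best = real_coded_ga(toy, bounds=[(1.0, 15.0), (20.0, 60.0)])
```

A BCGA would differ only in representation: each parameter is encoded as a bit string and crossover/mutation act on bits rather than on real values.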

  4. An Optimization Model for Design of Asphalt Pavements Based on IHAP Code Number 234

    Directory of Open Access Journals (Sweden)

    Ali Reza Ghanizadeh

    2016-01-01

    Full Text Available Pavement construction is one of the most costly parts of transportation infrastructures. Incommensurate design and construction of pavements, in addition to the loss of the initial investment, would impose indirect costs to the road users and reduce road safety. This paper aims to propose an optimization model to determine the optimal configuration as well as the optimum thickness of different pavement layers based on the Iran Highway Asphalt Paving Code Number 234 (IHAP Code 234. After developing the optimization model, the optimum thickness of pavement layers for secondary rural roads, major rural roads, and freeways was determined based on the recommended prices in “Basic Price List for Road, Runway and Railway” of Iran in 2015 and several charts were developed to determine the optimum thickness of pavement layers including asphalt concrete, granular base, and granular subbase with respect to road classification, design traffic, and resilient modulus of subgrade. Design charts confirm that in the current situation (material prices in 2015, application of asphalt treated layer in pavement structure is not cost effective. Also it was shown that, with increasing the strength of subgrade soil, the subbase layer may be removed from the optimum structure of pavement.
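The structure of such a model can be sketched as a search over discrete layer thicknesses for the cheapest configuration meeting a required structural number SN = Σ aᵢdᵢ. The layer coefficients, prices, and SN target below are made-up illustrative numbers, not the actual IHAP Code 234 parameters or 2015 price-list values.

```python
from itertools import product

def optimize_pavement(sn_required, prices, coeffs, thicknesses):
    """Exhaustively search layer-thickness combinations (cm) for the
    cheapest one whose structural number sum(a_i * d_i) meets the
    requirement. All numeric inputs here are hypothetical."""
    best = None
    for d in product(*thicknesses):
        sn = sum(a * di for a, di in zip(coeffs, d))
        if sn < sn_required:
            continue                       # structurally inadequate
        c = sum(p * di for p, di in zip(prices, d))
        if best is None or c < best[0]:
            best = (c, d)
    return best

# layers: asphalt concrete, granular base, granular subbase
coeffs = (0.40, 0.14, 0.11)     # structural coefficients per cm (invented)
prices = (9.0, 2.0, 1.2)        # cost units per cm of thickness (invented)
choices = (range(5, 21), range(15, 41, 5), range(0, 41, 5))
cost, (d_ac, d_base, d_subbase) = optimize_pavement(12.0, prices, coeffs, choices)
```

Because the cheaper unbound layers contribute structural number at lower cost per unit, the optimum pushes base and subbase to their maxima and keeps the asphalt layer minimal, mirroring the cost-effectiveness observation in the abstract.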

  5. Optimum Power and Rate Allocation for Coded V-BLAST: Average Optimization

    CERN Document Server

    Kostina, Victoria

    2010-01-01

    An analytical framework for performance analysis and optimization of coded V-BLAST is developed. Average power and/or rate allocations to minimize the outage probability as well as their robustness and dual problems are investigated. Compact, closed-form expressions for the optimum allocations and corresponding system performance are given. The uniform power allocation is shown to be near optimum in the low outage regime in combination with the optimum rate allocation. The average rate allocation provides the largest performance improvement (extra diversity gain), and the average power allocation offers a modest SNR gain limited by the number of transmit antennas but does not increase the diversity gain. The dual problems are shown to have the same solutions as the primal ones. All these allocation strategies are shown to be robust. The reported results also apply to coded multiuser detection and channel equalization systems relying on successive interference cancelation.

  6. Three-dimensional polarization marked multiple-QR code encryption by optimizing a single vectorial beam

    Science.gov (United States)

    Lin, Chao; Shen, Xueju; Hua, Binbin; Wang, Zhisong

    2015-10-01

    We demonstrate the feasibility of three-dimensional (3D) polarization multiplexing by optimizing a single vectorial beam using a multiple-signal-window multiple-plane (MSW-MP) phase retrieval algorithm. Original messages represented with multiple quick response (QR) codes are first partitioned into a series of subblocks. Then, each subblock is marked with a specific polarization state and randomly distributed in 3D space with both longitudinal and transversal adjustable freedoms. A generalized 3D polarization mapping protocol is established to generate a 3D polarization key. Finally, the multiple-QR code is encrypted into one phase-only mask and one polarization-only mask based on the modified Gerchberg-Saxton (GS) algorithm. We take the polarization mask as the ciphertext and the phase-only mask as an additional dimension of the key. Only when both the phase key and the 3D polarization key are correct can the original messages be recovered. We verify our proposal with both simulation and experimental evidence.
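A minimal single-plane sketch of the GS idea (the paper's modified, multi-signal-window multi-plane version is considerably more involved; the grid size, target pattern, and iteration count here are arbitrary): alternate between the mask plane, where the field is constrained to be phase-only, and the Fourier plane, where the amplitude is forced to the target pattern.

```python
import numpy as np

def gerchberg_saxton(target_amp, iters=200, seed=0):
    """Retrieve a phase-only mask whose far field (FFT) approximates a
    target amplitude, by alternating projections between the planes."""
    rng = np.random.default_rng(seed)
    field = np.exp(1j * rng.uniform(0, 2 * np.pi, target_amp.shape))
    errs = []
    for _ in range(iters):
        far = np.fft.fft2(field)
        errs.append(float(np.linalg.norm(np.abs(far) - target_amp)
                          / np.linalg.norm(target_amp)))
        far = target_amp * np.exp(1j * np.angle(far))     # impose target amplitude
        field = np.exp(1j * np.angle(np.fft.ifft2(far)))  # keep phase only
    return np.angle(field), errs

# toy binary pattern standing in for a QR-code block
t = np.zeros((32, 32))
t[8:24:2, 8:24:2] = 1.0
t *= np.sqrt(float(t.size) ** 2 / np.sum(t ** 2))  # match far-field energy
mask_phase, errs = gerchberg_saxton(t)
```

By the error-reduction property of alternating magnitude projections, the Fourier-plane error is non-increasing over iterations, which is what makes this simple loop usable as an encryption-mask optimizer.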

  7. Lossless image compression based on optimal prediction, adaptive lifting, and conditional arithmetic coding.

    Science.gov (United States)

    Boulgouris, N V; Tzovaras, D; Strintzis, M G

    2001-01-01

    The optimal predictors of a lifting scheme in the general n-dimensional case are obtained and applied for the lossless compression of still images using first quincunx sampling and then simple row-column sampling. In each case, the efficiency of the linear predictors is enhanced nonlinearly. Directional postprocessing is used in the quincunx case, and adaptive-length postprocessing in the row-column case. Both methods are seen to perform well. The resulting nonlinear interpolation schemes achieve extremely efficient image decorrelation. We further investigate context modeling and adaptive arithmetic coding of wavelet coefficients in a lossless compression framework. Special attention is given to the modeling contexts and the adaptation of the arithmetic coder to the actual data. Experimental evaluation shows that the best of the resulting coders produces better results than other known algorithms for multiresolution-based lossless image coding.
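The predict/update structure underlying the lifting scheme can be sketched in one dimension. Fixed 5/3-style weights stand in for the optimized predictors of the paper, and periodic boundary handling via `np.roll` is an arbitrary choice for the sketch.

```python
import numpy as np

def lifting_forward(x):
    """One 1-D lifting level: split into even/odd samples, predict the
    odds from neighboring evens (linear predictor), then update the
    evens so the approximation preserves the running average."""
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    pred = 0.5 * (even + np.roll(even, -1))        # predict step
    detail = odd - pred                            # prediction residual
    approx = even + 0.25 * (detail + np.roll(detail, 1))  # update step
    return approx, detail

def lifting_inverse(approx, detail):
    """Exact inverse: undo the update, then the predict step."""
    even = approx - 0.25 * (detail + np.roll(detail, 1))
    odd = detail + 0.5 * (even + np.roll(even, -1))
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

sig = np.sin(np.linspace(0, 3, 64)) * 100
a, d = lifting_forward(sig)
rec = lifting_inverse(a, d)
```

The lifting structure guarantees perfect reconstruction by construction, while a good predictor drives the detail coefficients toward zero on smooth data, which is what makes them cheap to code losslessly.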

  8. EELQMS - the European quality management system for engine lubricants - the ATC Code of Practice; EELQMS - Das Europaeische Qualitaets-Management System fuer Motorenoele - der ATC Code of Practice

    Energy Technology Data Exchange (ETDEWEB)

    Raddatz, J.H.; Eberan-Eberhorst, C.G.A. von

    1998-01-01

    In 1995 the ATC developed a Code of Practice which, in conjunction with the ATIEL Code of Practice, represents the basis for the European Engine Lubricant Quality Management System (EELQMS). Compliance with the requirements of this system is a prerequisite for performance claims made by engine oil marketers regarding the European ACEA Engine Oil Sequences. TAD, the German section of the Technical Committee of Petroleum Additive Manufacturers in Europe (ATC), has prepared this presentation in order to promote the dialogue between the industries concerned and to provide information on EELQMS and the ATC Code of Practice to a broader audience. Key elements of the paper are: - What is EELQMS? - How does EELQMS work? - What is the role of the ATC Code of Practice in EELQMS? - What are the most important rules of the ATC Code of Practice? - What benefits do EELQMS and the ATC Code of Practice offer to the end-user? - What is the current status of EELQMS? We hope that this presentation will help to promote a better understanding and acceptance of EELQMS on a broad basis. (orig.)

  9. An effective coded excitation scheme based on a predistorted FM signal and an optimized digital filter

    DEFF Research Database (Denmark)

    Misaridis, Thanasis; Jensen, Jørgen Arendt

    1999-01-01

    This paper presents a coded excitation imaging system based on a predistorted FM excitation and a digital compression filter designed for medical ultrasonic applications, in order to preserve both axial resolution and contrast. In radars, optimal Chebyshev windows efficiently weight a nearly... Simulations were performed with the program Field II. A commercial scanner (B-K Medical 3535) was modified and interfaced to an arbitrary function generator along with an RF power amplifier (Ritec). Hydrophone measurements in water were done to establish excitation voltage and corresponding intensity levels (I-sptp and I...

  10. User's manual for the BNW-II optimization code for dry/wet-cooled power plants

    Energy Technology Data Exchange (ETDEWEB)

    Braun, D.J.; Bamberger, J.A.; Braun, D.J.; Faletti, D.W.; Wiles, L.E.

    1978-05-01

    This volume provides a listing of the BNW-II dry/wet ammonia heat rejection optimization code and is an appendix to Volume I which gives a narrative description of the code's algorithms as well as logic, input and output information.

  11. An Efficient Group Key Management Using Code for Key Calculation for Simultaneous Join/Leave: CKCS

    OpenAIRE

    Melisa Hajyvahabzadeh; Elina Eidkhani; S. Anahita Mortazavi; Alireza Nemaney Pour

    2012-01-01

    This paper presents an efficient group key management protocol, CKCS (Code for Key Calculation in Simultaneous join/leave), for simultaneous join/leave in secure multicast. This protocol is based on a logical key hierarchy. In this protocol, when new members join the group simultaneously, the server sends only the group key to those new members. Then, current members and new members calculate the necessary keys using node codes and a one-way hash function. A node code is a random number which is assigned...
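The key-calculation idea can be sketched as follows. The derivation rule below (hashing the fresh group key together with each node's public code) is an illustrative assumption, not the exact CKCS message format; the node-code values are invented.

```python
import hashlib

def node_key(group_key, node_code):
    """Derive the key of one logical-key-hierarchy node by one-way
    hashing the new group key with that node's public random code, so
    the server only has to multicast the fresh group key."""
    data = group_key + node_code.to_bytes(4, "big")
    return hashlib.sha256(data).digest()

def path_keys(group_key, node_codes):
    """Each member locally recomputes the keys of every node on its
    leaf-to-root path; no per-node key messages are needed."""
    return {code: node_key(group_key, code) for code in node_codes}

gk = b"\x01" * 32                      # fresh group key after a join
member_a = path_keys(gk, [7, 42, 99])  # node codes on A's path (invented)
member_b = path_keys(gk, [7, 42, 13])  # B shares two ancestors with A
```

Members sharing an ancestor node derive identical keys for it independently, which is what removes the per-node rekey messages of plain logical key hierarchy.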

  12. Optimal management of idiopathic scoliosis in adolescence.

    Science.gov (United States)

    Kotwicki, Tomasz; Chowanska, Joanna; Kinel, Edyta; Czaprowski, Dariusz; Tomaszewski, Marek; Janusz, Piotr

    2013-01-01

    . Optimal management of idiopathic scoliosis requires cooperation within a professional team which includes the entire therapeutic spectrum, extending from simple watchful observation of nonprogressive mild deformities through to early surgery for rapidly deteriorating curvature. Probably most demanding is adequate management with regard to the individual course of the disease in a given patient, while avoiding overtreatment or undertreatment.

  13. Optimal management of idiopathic scoliosis in adolescence

    Science.gov (United States)

    Kotwicki, Tomasz; Chowanska, Joanna; Kinel, Edyta; Czaprowski, Dariusz; Tomaszewski, Marek; Janusz, Piotr

    2013-01-01

    . Optimal management of idiopathic scoliosis requires cooperation within a professional team which includes the entire therapeutic spectrum, extending from simple watchful observation of nonprogressive mild deformities through to early surgery for rapidly deteriorating curvature. Probably most demanding is adequate management with regard to the individual course of the disease in a given patient, while avoiding overtreatment or undertreatment. PMID:24600296

  14. Terminated and Tailbiting Spatially Coupled Codes with Optimized Bit Mappings for Spectrally Efficient Fiber-Optical Systems

    CERN Document Server

    Häger, Christian; Brännström, Fredrik; Alvarado, Alex; Agrell, Erik

    2014-01-01

    We study the design of spectrally efficient fiber-optical communication systems based on different spatially coupled (SC) forward error correction (FEC) schemes. In particular, we optimize the allocation of the coded bits from the FEC encoder to the modulation bits of the signal constellation. Two SC code classes are considered. The codes in the first class are protograph-based low-density parity-check (LDPC) codes which are decoded using iterative soft-decision decoding. The codes in the second class are generalized LDPC codes which are decoded using iterative hard-decision decoding. For both code classes, the bit allocation is optimized for the terminated and tailbiting SC cases based on a density evolution analysis. An optimized bit allocation can significantly improve the performance of tailbiting SC codes over the baseline sequential allocation, up to the point where they have a comparable gap to capacity as their terminated counterparts, at a lower FEC overhead. For the considered terminated SC co...

  15. A Review of Deterministic Optimization Methods in Engineering and Management

    Directory of Open Access Journals (Sweden)

    Ming-Hua Lin

    2012-01-01

    Full Text Available With the increasing reliance on modeling optimization problems in practical applications, a number of theoretical and algorithmic contributions of optimization have been proposed. The approaches developed for treating optimization problems can be classified into deterministic and heuristic. This paper aims to introduce recent advances in deterministic methods for solving signomial programming problems and mixed-integer nonlinear programming problems. A number of important applications in engineering and management are also reviewed to reveal the usefulness of the optimization methods.

  16. Minimum-Energy Wireless Real-Time Multicast by Joint Network Coding and Scheduling Optimization

    Directory of Open Access Journals (Sweden)

    Guoping Tan

    2015-01-01

    Full Text Available For real-time multicast services over wireless multihop networks, to minimize the transmission energy while satisfying the requirements of a fixed data rate and high reliability, we construct a conflict-graph-based framework that jointly optimizes network coding and scheduling. We then propose a primal-dual subgradient optimization algorithm that randomly samples K maximal stable sets in a given conflict graph. This method transforms the NP-hard scheduling subproblem into a normal linear programming problem to obtain an approximate solution. The proposed algorithm only needs a centralized technique for solving the linear programming problem, while all of the other computations can be distributed. The simulation results show that, compared with the existing algorithm, this algorithm not only achieves about 20% performance gain, but also performs better in terms of convergence and robustness.
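
    The random sampling of maximal stable sets that underpins the scheduling relaxation can be illustrated with a greedy sketch (the conflict graph and sampling routine below are illustrative assumptions, not the paper's implementation):

```python
import random

def random_maximal_stable_set(conflicts, rng):
    """Greedily grow a maximal stable (independent) set of a conflict graph.

    Links in the set can be scheduled simultaneously; sampling K such sets
    restricts the NP-hard scheduling subproblem to an LP over K columns.
    """
    nodes = list(conflicts)
    rng.shuffle(nodes)            # random visiting order gives different samples
    chosen = set()
    for v in nodes:
        if conflicts[v].isdisjoint(chosen):
            chosen.add(v)
    return chosen

# Hypothetical conflict graph: an edge means two links interfere.
conflicts = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
rng = random.Random(0)
sets = [random_maximal_stable_set(conflicts, rng) for _ in range(4)]
print(sets)
```

    Each sampled set is stable (no two members conflict) and maximal (every excluded link conflicts with a member), which is what the LP column generation needs.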

  17. Optimization of prostate cancer treatment plans using the adjoint transport method and discrete ordinates codes

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, S.; Henderson, D.L. [Dept. of Medical Physics, Madison, WI (United States); Thomadsen, B.R. [Dept. of Medical Physics and Dept. of Human Oncology, Madison (United States)

    2001-07-01

    Interstitial brachytherapy is a type of radiation therapy in which radioactive sources are implanted directly into cancerous tissue. Determination of the dose delivered to tissue by photons emitted from the implanted seeds is an important step in the treatment planning process. In this paper we investigate the use of the discrete ordinates method and the adjoint method to calculate absorbed dose in the regions of interest. MIP (mixed-integer programming) is used to determine the optimal seed distribution that conforms the prescribed dose to the tumor and delivers minimal dose to the sensitive structures. The patient treatment procedure consists of three steps: (1) image acquisition with transrectal ultrasound (TRUS) and assessment of the region of interest, (2) adjoint flux computation with a discrete ordinates code for inverse dose calculation, and (3) optimization with the MIP branch-and-bound method.
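
    The MIP seed-selection step can be illustrated with a toy exhaustive search (the dose numbers and the single organ-at-risk constraint below are illustrative assumptions; the paper uses branch-and-bound on a full dose matrix):

```python
from itertools import combinations

def choose_seeds(dose, oar_limit, n_seeds):
    """Toy stand-in for the MIP seed-placement step (illustrative numbers).

    dose[i] = (tumor_dose_i, oar_dose_i): dose contributions of a unit-strength
    seed at candidate position i, as the adjoint transport calculation would
    supply.  Maximize tumor dose subject to a limit on the dose to a
    sensitive structure (organ at risk, OAR).
    """
    best, best_tumor = None, -1.0
    for combo in combinations(range(len(dose)), n_seeds):
        tumor = sum(dose[i][0] for i in combo)
        oar = sum(dose[i][1] for i in combo)
        if oar <= oar_limit and tumor > best_tumor:
            best, best_tumor = combo, tumor
    return best, best_tumor

dose = [(5.0, 2.0), (4.0, 0.5), (3.0, 0.2), (6.0, 3.0)]
print(choose_seeds(dose, oar_limit=2.5, n_seeds=2))   # → ((0, 1), 9.0)
```

    Branch-and-bound solves the same 0/1 selection problem without enumerating every combination.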

  18. Color Coded Cards for Student Behavior Management in Higher Education Environments

    Science.gov (United States)

    Alhalabi, Wadee; Alhalabi, Mobeen

    2017-01-01

    The Color Coded Cards system as a possibly effective class management tool is the focus of this research. The Color Coded Cards system involves each student being given a card with a specific color based on his or her behavior. The main objective of the research is to find out whether this system effectively improves students' behavior, thus…

  19. Optimizing antibiotic usage in hospitals: a qualitative study of the perspectives of hospital managers.

    Science.gov (United States)

    Broom, A; Gibson, A F; Broom, J; Kirby, E; Yarwood, T; Post, J J

    2016-11-01

    Antibiotic optimization in hospitals is an increasingly critical priority in the context of proliferating resistance. Despite the emphasis on doctors, optimizing antibiotic use within hospitals requires an understanding of how different stakeholders, including non-prescribers, influence practice and practice change. This study was designed to understand Australian hospital managers' perspectives on antimicrobial resistance, managing antibiotic governance, and negotiating clinical vis-à-vis managerial priorities. Twenty-three managers in three hospitals participated in qualitative semi-structured interviews in Australia in 2014 and 2015. Data were systematically coded and thematically analysed. The findings demonstrate, from a managerial perspective: (1) competing demands that can hinder the prioritization of antibiotic governance; (2) ineffectiveness of audit and monitoring methods that limit rationalization for change; (3) limited clinical education and feedback to doctors; and (4) management-directed change processes are constrained by the perceived absence of a 'culture of accountability' for antimicrobial use amongst doctors. Hospital managers report considerable structural and interprofessional challenges to actualizing antibiotic optimization and governance. These challenges place optimization as a lower priority vis-à-vis other issues that management are confronted with in hospital settings, and emphasize the importance of antimicrobial stewardship (AMS) programmes that engage management in understanding and addressing the barriers to change. Copyright © 2016 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  20. MOSEG code for safety-oriented maintenance management; Codigo MOSEG para la gestion de mantenimiento orientada a la seguridad

    Energy Technology Data Exchange (ETDEWEB)

    Torres Valle, Antonio [Instituto Superior de Tecnologias y Ciencias Aplicadas, La Habana (Cuba). Dept. Ingenieria Nuclear]. E-mail: atorres@fctn.isctn.edu.cu; Rivero Oliva, Jose de Jesus [Centro de Gestion de la Informacion y Desarrollo de la Energia (CUBAENERGIA) (Cuba)]. E-mail: jose@cubaenergia.cu

    2005-07-01

    Full text: One of the main reasons why maintenance contributes so strongly to safety problems and to facility availability is the lack of maintenance management systems that address both fields in a balanced way. Their main setbacks are shown in this paper. It briefly describes the development of an integrating algorithm for safety- and availability-oriented maintenance management, implemented in the MOSEG Win 1.0 code. (author)

  1. Service Operations Optimization: Recent Development in Supply Chain Management

    Directory of Open Access Journals (Sweden)

    Bin Shen

    2015-01-01

    Full Text Available Services are the key to success in operations management. Designing effective strategies with optimization techniques is a fundamental condition for performance improvement in service operations (SOs) management. In this paper, we focus on SOs optimization in supply chain management, where it creates the greatest business value. Specifically, we study the recent development of SOs optimization for supply chains by categorizing the work into four industries (i.e., the e-commerce industry, consumer service industry, public sector, and fashion industry) and four SOs features (i.e., advertising, channel coordination, pricing, and inventory). Moreover, we conduct a technical review of these industries/topics and typical optimization models. The classical optimization approaches for SOs management in supply chains are presented, and the managerial implications of SOs in supply chains are discussed.

  2. A Generalization Belief Propagation Decoding Algorithm for Polar Codes Based on Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Yingxian Zhang

    2014-01-01

    Full Text Available We propose a generalization belief propagation (BP) decoding algorithm based on particle swarm optimization (PSO) to improve the performance of polar codes. Through analysis of the existing BP decoding algorithm, we first introduce a probability modifying factor at each node of the BP decoder, so as to enhance the error-correcting capacity of the decoding. Then, we generalize the BP decoding algorithm based on these modifying factors and derive the probability update equations for the proposed decoding. Based on the new probability update equations, we show the intrinsic relationship of the existing decoding algorithms. Finally, in order to achieve the best performance, we formulate an optimization problem to find the optimal probability modifying factors for the proposed decoding algorithm. Furthermore, a method based on a modified PSO algorithm is introduced to solve that optimization problem. Numerical results show that the proposed generalization BP decoding algorithm achieves better performance than the existing BP decoding, which suggests the effectiveness of the proposed decoding algorithm.
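
    A minimal particle swarm optimizer of the kind used to tune the modifying factors can be sketched as follows (a generic PSO on a toy objective, not the paper's decoder-specific formulation):

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=200, lo=-5.0, hi=5.0,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (illustrative, not the paper's decoder)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + cognitive pull (own best) + social pull (global best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective standing in for the decoder's error-rate surface:
best, val = pso_minimize(lambda x: sum(xi * xi for xi in x), dim=3)
print(best, val)
```

    In the paper's setting the objective would instead be a measured decoding-error metric as a function of the probability modifying factors.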

  3. Watershed Management Optimization Support Tool (WMOST) v2: Theoretical Documentation

    Science.gov (United States)

    The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...

  4. Acceleration of the Geostatistical Software Library (GSLIB) by code optimization and hybrid parallel programming

    Science.gov (United States)

    Peredo, Oscar; Ortiz, Julián M.; Herrero, José R.

    2015-12-01

    The Geostatistical Software Library (GSLIB) has been used in the geostatistical community for more than thirty years. It was designed as a bundle of sequential Fortran codes, and today it is still in use by many practitioners and researchers. Despite its widespread use, few attempts have been reported to bring this package into the multi-core era. Using all CPU resources, GSLIB algorithms can handle large datasets and grids, where tasks are compute- and memory-intensive. In this work, a methodology is presented to accelerate GSLIB applications using code optimization and hybrid parallel processing, specifically for compute-intensive applications. Minimal code modifications are added, decreasing as much as possible the elapsed execution time of the studied routines. If multi-core processing is available, the user can activate OpenMP directives to speed up execution using all resources of the CPU. If multi-node processing is available, the execution is enhanced using MPI messages between the compute nodes. Four case studies are presented: experimental variogram calculation, kriging estimation, and sequential Gaussian and indicator simulation. For each application, three scenarios (small, large and extra large) are tested using a desktop environment with 4 CPU cores and a multi-node server with 128 CPU nodes. Elapsed times, speedup and efficiency results are shown.
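
    The experimental variogram routine mentioned among the case studies reduces to independent pair sums over lags, which is what makes it amenable to OpenMP/MPI parallelization; a serial 1-D sketch (illustrative, not GSLIB's Fortran code):

```python
def experimental_variogram(values, max_lag):
    """Experimental semivariogram for a regular 1-D transect (illustrative).

    gamma(h) = mean of 0.5 * (z(x) - z(x + h))^2 over all pairs at lag h.
    """
    gamma = {}
    n = len(values)
    for h in range(1, max_lag + 1):
        # every pair sum below is independent, so lags (or pair ranges)
        # can be distributed across threads or MPI ranks
        sq = [0.5 * (values[i + h] - values[i]) ** 2 for i in range(n - h)]
        gamma[h] = sum(sq) / len(sq)
    return gamma

z = [0.0, 1.0, 0.5, 1.5, 1.0, 2.0, 1.5, 2.5]
print(experimental_variogram(z, 3))
```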

  5. Optimal-Rate Coding Theorem For Adversarial Networks in the Public-Key Setting

    CERN Document Server

    Amir, Yair; Ostrovsky, Rafail

    2008-01-01

    In this paper, we establish an optimal-rate (interactive) coding theorem in the public-key setting for synchronous networks in the presence of a malicious poly-time adversary for dynamically changing networks. Namely, even if the majority of the nodes are controlled by a malicious adversary and the topology of the network is changing at each round, then as long as there is some path of non-corrupted nodes connecting the sender and receiver at each round (though this path may change at every round) we construct a protocol with bounded memory per processor that achieves optimal transfer rate and negligible decoding error. This protocol will transmit polynomially many messages of polynomial size with constant overhead per bit. We stress that our protocol assumes no knowledge of which nodes are corrupted nor which path is reliable at any round. Our interactive coding theorem states that our protocol cannot be affected in a meaningful way by any polynomial-time malicious adversary whose goal is to disrupt and dest...

  6. Greenhouse climate management : an optimal control approach

    NARCIS (Netherlands)

    Henten, van E.J.

    1994-01-01

    In this thesis a methodology is developed for the construction and analysis of an optimal greenhouse climate control system.

    In chapter 1, the results of a literature survey are presented and the research objectives are defined. In the literature, optimal greenhouse climate

  7. A Low-Jitter Wireless Transmission Based on Buffer Management in Coding-Aware Routing

    Directory of Open Access Journals (Sweden)

    Cunbo Lu

    2015-08-01

    Full Text Available It is significant to reduce packet jitter for real-time applications in a wireless network. Existing coding-aware routing algorithms use the opportunistic network coding (ONC) scheme in the packet coding algorithm. The ONC scheme never delays packets to wait for the arrival of a future coding opportunity, and the loss of some potential coding opportunities may limit the contribution of network coding to jitter performance. In addition, most existing coding-aware routing algorithms assume that all flows participating in the network have equal rates, which is unrealistic, since multi-rate environments often appear. To overcome these problems and extend coding-aware routing to multi-rate scenarios, from the viewpoint of data transmission we present a low-jitter wireless transmission algorithm based on buffer management (BLJCAR), which schedules packets at the coding node according to a queue-length-based threshold policy instead of the regular ONC policy used in existing coding-aware routing algorithms. BLJCAR is a unified framework that covers both the single-rate and the multi-rate case. Simulation results show that the BLJCAR algorithm embedded in coding-aware routing outperforms the traditional ONC policy in terms of jitter, packet delivery delay, packet loss ratio and network throughput under network congestion at any traffic rate.
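
    The queue-length-based threshold policy can be sketched as a per-slot decision at the relay (the function and threshold below are an illustrative reading of the idea, not BLJCAR itself):

```python
from collections import deque

def coding_node_step(queue_a, queue_b, threshold):
    """Queue-length-based coding decision at a relay (illustrative sketch).

    Unlike pure opportunistic coding (send immediately, code only if both
    queues happen to be non-empty), a threshold policy may hold short queues
    to wait for a coding opportunity.
    """
    if queue_a and queue_b:                      # coding opportunity: XOR two packets
        return ("coded", queue_a.popleft(), queue_b.popleft())
    longest = queue_a if len(queue_a) >= len(queue_b) else queue_b
    if len(longest) >= threshold:                # too long to keep waiting
        return ("native", longest.popleft(), None)
    return ("wait", None, None)                  # hold packets for a future opportunity

a, b = deque(["a1"]), deque([])
print(coding_node_step(a, b, threshold=3))       # → ('wait', None, None)
```

    Raising the threshold trades extra queuing delay for more coded (and thus fewer total) transmissions.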

  8. Residue management and Canada's environmental codes of practice for steam electric power generation

    Energy Technology Data Exchange (ETDEWEB)

    Finlay, P.G.; Stobbs, R.A.; Ross, G.C.; Pinault, P.H.; Doiron, C.C. (Environment Canada, Ottawa, ON (Canada))

    1993-01-01

    Canada's 'Environmental Codes of Practice for Steam Electric Power Generation' are comprehensive environmental protection standards for the various phases of the life cycle of power plants. Codes published include the Siting, Design, Construction, Operation and Decommissioning Phase Codes. Various practices are recommended for the management of air emissions, water, fuels, chemicals, wastewater, and liquid and solid residues. Topics discussed include the production, utilization and disposal of residues, including combustion ashes and desulphurization by-products. An example of application of the Codes at the 'zero discharge' Shand Power Station is discussed. 6 refs., 1 fig., 1 tab.

  9. Code orange: Towards transformational leadership of emergency management systems.

    Science.gov (United States)

    Caro, Denis H J

    2015-09-01

    The 21st century calls upon health leaders to recognize and respond to emerging threats and systemic emergency management challenges through transformative processes inherent in the LEADS in a caring environment framework. Using a grounded theory approach, this qualitative study explores the perspectives of key informant leaders in emergency management across Canada on pressing needs for relevant systemic transformation. The emerging model points to eight specific attributes of transformational leadership central to emergency management and suggests that contextualization of health leadership is of particular import.

  10. Optimal Bipartite Ramanujan Graphs from Balanced Incomplete Block Designs: Their Characterization and Applications to Expander/LDPC Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Janwa, Heeralal

    2009-01-01

    We characterize optimal bipartite expander graphs and give necessary and sufficient conditions for optimality. We determine the expansion parameters of the BIBD graphs and show that they yield optimal expander graphs and also bipartite Ramanujan graphs. In particular, we show that the bipartite graphs derived from finite projective and affine geometries yield optimal Ramanujan graphs. This in turn leads to a theoretical explanation of the good performance of a class of LDPC codes....

  11. ROLE OF MANAGEMENT ACCOUNTING IN TAX OPTIMIZATION OF ECONOMIC SUBJECTS

    Directory of Open Access Journals (Sweden)

    Bashkatov V. V.

    2016-03-01

    Full Text Available The relevance of the study is due to the fact that, in modern conditions, questions of the effectiveness of tax accounting, of the optimal model of its interaction with management accounting, of enhancing the role of tax administration, of tax optimization at the level of the economic entity, and of constructing management accounting for tax purposes are particularly acute. Consequently, there is a need to investigate the harmonization of bookkeeping and fiscal accounting data for the purposes of taxation control, analysis, management and optimization. This problem is solved through the organization and conduct of management accounting, which plays the key role in consolidating data in a unified information system. The paper presents the theoretical and methodological aspects of management accounting aimed at tax optimization. The scientific and methodical approaches and recommendations presented in the article extend the theoretical understanding of the tax and management accounting systems and broaden the range of management accounting tasks related to tax optimization and the effective management of the organization's tax liabilities. All this will enhance the analytical value of information and the effectiveness of management decisions in the field of taxation. The provisions of the article can be used in the practice of the accounting and economic services of organizations, and by audit and consulting organizations.

  12. Optimal dynamic management of groundwater pollutant sources.

    Science.gov (United States)

    Gorelick, S.M.; Remson, I.

    1982-01-01

    The linear programming-superposition method is presented for managing multiple sources of groundwater pollution over time. The method uses any linear solute-transport simulation model to generate a unit source-concentration response matrix that is incorporated into a management model. -from Authors
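
    The superposition step can be sketched directly: one transport-model run per unit-strength source yields a response matrix, and candidate source schedules are then evaluated linearly against a water-quality standard (the numbers below are illustrative):

```python
def concentrations(response, strengths):
    """Superpose unit source responses: c_j = sum_i R[i][j] * q_i (linear transport)."""
    n_obs = len(response[0])
    return [sum(response[i][j] * strengths[i] for i in range(len(strengths)))
            for j in range(n_obs)]

# Unit-strength concentration responses of 2 sources at 3 observation wells,
# e.g. produced by one transport-model run per source (illustrative numbers).
R = [[0.8, 0.3, 0.1],
     [0.2, 0.6, 0.4]]
limit = 5.0                                  # water-quality standard at every well
q = [4.0, 5.0]                               # candidate source strengths
c = concentrations(R, q)
feasible = all(cj <= limit for cj in c)
print(c, feasible)
```

    In the full method these linear constraints (one per observation point and time step) feed a linear program over the source strengths.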

  13. Synergy optimization and operation management on syndicate complementary knowledge cooperation

    Science.gov (United States)

    Tu, Kai-Jan

    2014-10-01

    The number of multi-enterprise knowledge cooperations has grown steadily as a result of global innovation competition. Based on optimization and operations studies, this article concludes that synergy management is an effective means to break through various management barriers and to resolve the chaos of cooperative systems. Enterprises must communicate a shared system vision and access complementary knowledge. These are crucial considerations for enterprises seeking to exert the synergy of optimized knowledge cooperation and meet global marketing challenges.

  14. Solid Warehouse Material Management System Based on ERP and Bar Code Technology

    Institute of Scientific and Technical Information of China (English)

    ZHANG Cheng; WANG Jie; YUAN Bing; WU Chao; HU Qiao-dan

    2004-01-01

    This paper presents a manufacturing material management system based on ERP, combined with industrial bar-code information collection and material management, and carries out extensive research on the system structure and function model, together with a detailed application scheme.

  15. Review of dynamic optimization methods in renewable natural resource management

    Science.gov (United States)

    Williams, B.K.

    1989-01-01

    In recent years, applications of dynamic optimization procedures in natural resource management have proliferated. A systematic review of these applications is given in terms of a number of optimization methodologies and natural resource systems. The applicability of the methods to renewable natural resource systems is compared in terms of system complexity, system size, and precision of the optimal solutions. Recommendations are made concerning the appropriate methods for certain kinds of biological resource problems.

  16. Topology Optimization for Energy Management in Underwater Sensor Networks

    Science.gov (United States)

    2015-02-01

    To appear in International Journal of Control as a regular paper. The paper seeks the sensor network topology that maximizes the probability of successful search (of a target) over a surveillance region. In a two-stage optimization, a genetic algorithm (GA)... Adaptation to energy variations across the network is shown to be manifested as a change in the optimal network topology by using sensing and...

  17. Spectral decomposition of optimal asset-liability management

    NARCIS (Netherlands)

    Decamps, M.; de Schepper, A.; Goovaerts, M.

    2009-01-01

    This paper concerns optimal asset-liability management when the assets and the liabilities are modeled by means of correlated geometric Brownian motions as suggested in Gerber and Shiu [2003. Geometric Brownian motion models for assets and liabilities: from pension funding to optimal dividends.

  18. Reliability-Based Optimization for Maintenance Management in Bridge Networks

    OpenAIRE

    Hu, Xiaofei

    2014-01-01

    This dissertation addresses the problem of optimizing maintenance, repair and reconstruction decisions for bridge networks. Incorporating network topologies into bridge management problems is computationally difficult. Because of the interdependencies among networked bridges, they have to be analyzed together. Simulation-based numerical optimization techniques adopted in past research are limited to networks of moderate sizes. In this dissertation, novel approaches are developed to dete...

  20. Principles for a Code of Conduct for the Management and Sustainable Use of Mangrove Ecosystems

    DEFF Research Database (Denmark)

    Macintosh, Donald; Nielsen, Thomas; Zweig, Ronald

    mangrove forest ecosystems worldwide, the World Bank commissioned a study with the title "Mainstreaming conservation of coastal biodiversity through formulation of a generic Code of Conduct for Sustainable Management of Mangrove Forest Ecosystems". Formulation of these Principles for a Code of Conduct......, Africa, and Central and South America. These workshops provided an opportunity to seek expert advice regarding practical examples of sound mangrove management, or problems for management, from each region, and to illustrate them in the working document. A peer review workshop was held in Washington...

  1. Well Field Management Using Multi-Objective Optimization

    DEFF Research Database (Denmark)

    Hansen, Annette Kirstine; Hendricks Franssen, H. J.; Bauer-Gottwein, Peter

    2013-01-01

    Efficient management of groundwater resources is important because groundwater availability is limited and, locally, groundwater quality has been impaired because of contamination. Here we present a multi-objective optimization framework for improving the management of a water works that operates...... with infiltration basins, injection wells and abstraction wells. The two management objectives are to minimize the amount of water needed for infiltration and to minimize the risk of getting contaminated water into the drinking water wells. The management is subject to a daily demand fulfilment constraint. Two...... optimization results are presented for the Hardhof water works in Zurich, Switzerland. It is found that both methods perform better than the historical management. The constant scheduling performs best in fairly stable conditions, whereas the sequential optimization performs best in extreme situations...

  2. The Optimization of Dispersion Properties of Photonic Crystal Fibers Using a Real-Coded Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    YIN Guo-Bing; LI Shu-Guang; LIU Shuo; WANG Xiao-Yan

    2011-01-01

    A real-coded genetic algorithm (GA) combined with a fully vectorial effective index method (FVEIM) is employed to theoretically design structures of photonic crystal fibers (PCFs) with user-defined dispersion properties. Structures of PCFs whose solid cores are doped with GeO2, with zero dispersion at 0.7-3.9 μm, are optimized; the flat dispersion range spans the R+L+C band, and the negative dispersion is -1576.26 ps/(nm·km) at 1.55 μm. Analyses show that the zero-dispersion wavelength (ZDW) can be one of many ZDWs for the same fiber structure; that PCFs can have their dispersion flattened through the R+L+C band with a single air-hole diameter; and that negative dispersion requires a high air-filling rate at 1.55 μm. The method proves elegant for solving this inverse problem.

  3. Scalable coding of depth maps with R-D optimized embedding.

    Science.gov (United States)

    Mathew, Reji; Taubman, David; Zanuttigh, Pietro

    2013-05-01

    Recent work on depth map compression has revealed the importance of incorporating a description of discontinuity boundary geometry into the compression scheme. We propose a novel compression strategy for depth maps that incorporates geometry information while achieving the goals of scalability and embedded representation. Our scheme involves two separate image pyramid structures, one for breakpoints and the other for sub-band samples produced by a breakpoint-adaptive transform. Breakpoints capture geometric attributes and are amenable to scalable coding. We develop a rate-distortion optimization framework for determining the presence and precision of breakpoints in the pyramid representation. We employ a variation of the EBCOT scheme to produce embedded bit-streams for both the breakpoint and sub-band data. Compared to JPEG 2000, our proposed scheme enables the same scalability features while achieving substantially improved rate-distortion performance at the higher bit-rate range and comparable performance at the lower rates.
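
    The rate-distortion decision for each breakpoint can be sketched as a Lagrangian choice among candidate precisions (the options and costs below are illustrative, not the paper's actual codec):

```python
def rd_select(options, lam):
    """Pick the option minimizing the Lagrangian cost D + lambda * R (illustrative)."""
    return min(options, key=lambda o: o["D"] + lam * o["R"])

# Candidate encodings of one breakpoint: skip it, coarse precision, fine precision.
# R is the bit cost, D the resulting distortion (hypothetical numbers).
options = [
    {"name": "skip",   "R": 0.0, "D": 9.0},
    {"name": "coarse", "R": 2.0, "D": 3.0},
    {"name": "fine",   "R": 6.0, "D": 1.0},
]
print(rd_select(options, lam=1.0)["name"])   # → 'coarse'
```

    Sweeping the multiplier lam traces out the operational rate-distortion curve: small lam favors fine precision, large lam favors skipping the breakpoint.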

  4. Optimization and implementation of the integer wavelet transform for image coding.

    Science.gov (United States)

    Grangetto, Marco; Magli, Enrico; Martina, Maurizio; Olmo, Gabriella

    2002-01-01

    This paper deals with the design and implementation of an image transform coding algorithm based on the integer wavelet transform (IWT). First, criteria are proposed for the selection of optimal factorizations of the wavelet filter polyphase matrix to be employed within the lifting scheme. The results lead to IWT implementations with very satisfactory lossless and lossy compression performance. Then, the effects of finite-precision representation of the lifting coefficients on the compression performance are analyzed, showing that, in most cases, a very small number of bits can be employed for the mantissa while keeping the performance degradation very limited. Stemming from these results, a VLSI architecture is proposed for the IWT implementation, capable of achieving very high frame rates with moderate gate complexity.
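
    A concrete instance of such a lifting factorization is the integer 5/3 transform used for lossless coding in JPEG 2000; the sketch below (with a simplified boundary extension) illustrates why lifting gives perfect integer reconstruction:

```python
def lift_53_forward(x):
    """One level of the integer 5/3 lifting transform (JPEG 2000 lossless style).

    Even-length input; returns (approximation, detail) integer sub-bands.
    """
    s, d = x[0::2], x[1::2]
    # predict step: detail = odd sample minus average of neighboring evens
    d = [d[i] - ((s[i] + s[min(i + 1, len(s) - 1)]) >> 1) for i in range(len(d))]
    # update step: smooth the even samples using the new details
    s = [s[i] + ((d[max(i - 1, 0)] + d[i] + 2) >> 2) for i in range(len(s))]
    return s, d

def lift_53_inverse(s, d):
    # undo the lifting steps in reverse order with the exact same expressions
    s = [s[i] - ((d[max(i - 1, 0)] + d[i] + 2) >> 2) for i in range(len(s))]
    d = [d[i] + ((s[i] + s[min(i + 1, len(s) - 1)]) >> 1) for i in range(len(d))]
    x = [0] * (len(s) + len(d))
    x[0::2], x[1::2] = s, d
    return x

x = [10, 12, 14, 13, 9, 8, 7, 11]
s, d = lift_53_forward(x)
assert lift_53_inverse(s, d) == x   # lifting guarantees perfect integer inversion
```

    Because the inverse subtracts exactly what the forward step added, the rounding inside each lifting step cancels, which is what makes integer-to-integer lossless coding possible.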

  5. Optimal choice of Reed-Solomon codes to protect against queuing losses in wireless networks

    Institute of Scientific and Technical Information of China (English)

    Claus Bauer; JIANG Wen-yu

    2009-01-01

    This article proposes algorithms to determine an optimal choice of the Reed-Solomon forward error correction (FEC) code parameters (n,k) to mitigate the effects of packet loss on multimedia traffic caused by buffer overflow at a wireless base station. A network model is developed that takes into account traffic arrival rates, channel loss characteristics, the capacity of the buffer at the base station, and FEC parameters. For Poisson distributed traffic, the theory of recurrent linear equations is applied to develop a new closed-form solution of low complexity of the Markov model for the buffer occupancy. For constant bit rate (CBR) traffic, an iterative procedure is developed to compute the packet loss probabilities after FEC recovery.
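
    Under a simple independent-erasure model (an illustrative simplification of the article's queuing analysis), the RS(n, k) parameter choice reduces to a binomial tail computation:

```python
from math import comb

def block_failure_prob(n, k, p):
    """P(more than n-k of n packets are lost), i.e. RS(n, k) cannot recover.

    Assumes independent packet erasures with probability p (illustrative model).
    """
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n - k + 1, n + 1))

def best_k(n, p, target):
    """Largest k (highest code rate k/n) whose failure probability meets the target."""
    for k in range(n, 0, -1):
        if block_failure_prob(n, k, p) <= target:
            return k
    return None

print(best_k(255, 0.01, 1e-6))
```

    The article's contribution is precisely the harder part this sketch assumes away: computing the effective loss probability from the buffer-occupancy Markov model rather than treating losses as independent.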

  6. The role of stochasticity in an information-optimal neural population code

    Science.gov (United States)

    Stocks, N. G.; Nikitin, A. P.; McDonnell, M. D.; Morse, R. P.

    2009-12-01

    In this paper we consider the optimisation of Shannon mutual information (MI) in the context of two model neural systems. The first is a stochastic pooling network (population) of McCulloch-Pitts (MP) type neurons (logical threshold units) subject to stochastic forcing; the second is (in a rate coding paradigm) a population of neurons that each display Poisson statistics (the so-called 'Poisson neuron'). The mutual information is optimised as a function of a parameter that characterises the 'noise level': in the MP array this parameter is the standard deviation of the noise; in the population of Poisson neurons it is the window length used to determine the spike count. In both systems we find that the emergent neural architecture and, hence, the code that maximises the MI is strongly influenced by the noise level. Low noise levels lead to a heterogeneous distribution of neural parameters (diversity), whereas medium to high noise levels result in the clustering of neural parameters into distinct groups that can be interpreted as subpopulations. In both cases the number of subpopulations increases with a decrease in noise level. Our results suggest that subpopulations are a generic feature of an information-optimal neural population.

  7. Optimization of Parallel Legendre Transform using Graphics Processing Unit (GPU) for a Geodynamo Code

    Science.gov (United States)

    Lokavarapu, H. V.; Matsui, H.

    2015-12-01

    Convection and magnetic field of the Earth's outer core are expected to have vast length scales. Resolving these flows in geodynamo simulations requires high performance computing; within the spherical harmonics transform (SHT), a significant portion of the execution time is spent on the Legendre transform. Calypso is a geodynamo code designed to model magnetohydrodynamics of a Boussinesq fluid in a rotating spherical shell, such as the outer core of the Earth. The code has been shown to scale well on computer clusters capable of computing at the order of 10⁵ cores using Message Passing Interface (MPI) and Open Multi-Processing (OpenMP) parallelization for CPUs. To optimize further, we investigate three different algorithms for the SHT using GPUs. The first is to preemptively compute the Legendre polynomials on the CPU before executing the SHT on the GPU within the time integration loop. In the second approach, both the Legendre polynomials and the SHT are computed on the GPU. In the third approach, we initially partition the radial grid for the forward transform and the harmonic order for the backward transform between the CPU and GPU; thereafter, the partitioned work is computed simultaneously in the time integration loop. We examine the trade-offs between space and time, memory bandwidth, and GPU computation on Maverick, a Texas Advanced Computing Center (TACC) supercomputer. We have observed improved performance using a GPU-enabled Legendre transform, and we will compare and contrast the different algorithms in the context of GPUs.
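
    The precomputation strategy in the first algorithm rests on the classical three-term recurrence for Legendre polynomials. A minimal sketch (ordinary Legendre polynomials P_l only; Calypso's SHT actually needs the associated functions P_l^m, which satisfy analogous recurrences):

```python
def legendre_table(l_max, x):
    """Values P_0(x) .. P_{l_max}(x) via the three-term recurrence
    (l + 1) P_{l+1}(x) = (2l + 1) x P_l(x) - l P_{l-1}(x),
    seeded with P_0 = 1 and P_1 = x."""
    P = [1.0, x]
    for l in range(1, l_max):
        P.append(((2 * l + 1) * x * P[l] - l * P[l - 1]) / (l + 1))
    return P[: l_max + 1]
```

    Tabulating these values once per grid point, rather than recomputing them every time step, is the space-for-time trade the record describes.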

  8. Optimization Problems in Supply Chain Management

    NARCIS (Netherlands)

    D. Romero Morales (Dolores)

    2000-01-01

    Maria Dolores Romero Morales was born on August 5, 1971, in Sevilla (Spain). She studied Mathematics at the University of Sevilla from 1989 to 1994 and specialized in Statistics and Operations Research. She wrote her Master's thesis on Global Optimization in Location Theory under the sup

  9. Cover crop-based ecological weed management: exploration and optimization

    NARCIS (Netherlands)

    Kruidhof, H.M.

    2008-01-01

    Keywords: organic farming, ecologically-based weed management, cover crops, green manure, allelopathy, Secale cereale, Brassica napus, Medicago sativa Cover crop-based ecological weed management: exploration and optimization. In organic farming systems, weed control is recognized as one of the mai

  11. Iterative optimization of performance libraries by hierarchical division of codes; Optimisation iterative de bibliotheques de calculs par division hierarchique de codes

    Energy Technology Data Exchange (ETDEWEB)

    Donadio, S

    2007-09-15

    The increasing complexity of hardware features incorporated in modern processors makes high performance code generation very challenging. Library generators such as ATLAS, FFTW and SPIRAL overcome this issue by empirically searching the space of possible program versions for the one that performs best. This thesis explores a fully automatic solution to adapt a compute-intensive application to the target architecture. By mimicking complex sequences of transformations useful for optimizing real codes, we show that generative programming is a practical tool for implementing a new hierarchical compilation approach to the generation of high performance code, relying on the use of state-of-the-art compilers. As opposed to ATLAS, this approach is not application-dependent but can be applied to fairly generic loop structures. Our approach relies on the decomposition of the original loop nest into simpler kernels. These kernels are much simpler to optimize, and using such codes makes the performance trade-off problem much simpler to express and to solve. Finally, we propose a new approach for the generation of performance libraries based on this decomposition method. We show that our method generates high-performance libraries, in particular for BLAS. (author)

  12. Identifying Cost-Effective Water Resources Management Strategies: Watershed Management Optimization Support Tool (WMOST)

    Science.gov (United States)

    The Watershed Management Optimization Support Tool (WMOST) is a public-domain software application designed to aid decision makers with integrated water resources management. The tool allows water resource managers and planners to screen a wide-range of management practices for c...

  13. Uncertainty, learning, and the optimal management of wildlife

    Science.gov (United States)

    Williams, B.K.

    2001-01-01

    Wildlife management is limited by uncontrolled and often unrecognized environmental variation, by limited capabilities to observe and control animal populations, and by a lack of understanding about the biological processes driving population dynamics. In this paper I describe a comprehensive framework for management that includes multiple models and likelihood values to account for structural uncertainty, along with stochastic factors to account for environmental variation, random sampling, and partial controllability. Adaptive optimization is developed in terms of the optimal control of incompletely understood populations, with the expected value of perfect information measuring the potential for improving control through learning. The framework for optimal adaptive control is generalized by including partial observability and non-adaptive, sample-based updating of model likelihoods. Passive adaptive management is derived as a special case of constrained adaptive optimization, representing a potentially efficient suboptimal alternative that nonetheless accounts for structural uncertainty.

  14. A model for the optimal risk management of farm firms

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    2012-01-01

    Risk management is an integrated part of business or firm management and deals with the problem of how to avoid the risk of economic losses when the objective is to maximize expected profit. This paper focuses on the identification, assessment, and prioritization of risks in agriculture. Risk management is typically based on numerical analysis and the concept of efficiency, yet none of the methods developed so far actually solves the basic question of how the individual manager should behave so as to optimise the balance between expected profit/income and risk. In the paper, we derive a criterion for optimal risk management in the sense that we derive the optimal combination of expected income and variance on return to capital on the efficient frontier.

  15. Software Project Scheduling Management by Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Dinesh B. Hanchate

    2014-12-01

    Full Text Available PSO (Particle Swarm Optimization) is, like GA, a heuristic global optimization method based on swarm intelligence. In this paper, we present a particle swarm optimization algorithm to solve the software project scheduling problem. PSO incorporates an efficient local search method to find near-optimal and best-known solutions for all instances given as inputs required for SPSM (Software Project Scheduling Management). Finally, the paper relates PSO to the current research situation in SPSM. The effect of PSO parameters on project cost and time is studied, and better results in terms of minimum SCE (Software Cost Estimation) and time as compared to GA and ACO are obtained.
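
    A minimal global-best PSO can illustrate the technique named in the record. The inertia and acceleration constants below (w = 0.7, c1 = c2 = 1.5) are common textbook defaults, not values from the paper, and the toy objective stands in for the SPSM cost function, which would score a task-to-staff assignment encoded in the particle position:

```python
import random

def pso(f, dim, n_particles=30, iters=200, seed=1,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal particle swarm optimizer (global-best topology) that
    minimizes f over [lo, hi]^dim. Returns (best position, best value)."""
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]          # per-particle best positions
    pval = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + cognitive pull (own best) + social pull (swarm best)
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            val = f(xs[i])
            if val < pval[i]:
                pbest[i], pval[i] = xs[i][:], val
                if val < gval:
                    gbest, gval = xs[i][:], val
    return gbest, gval
```

    On a smooth toy objective such as the sphere function, this configuration converges to near zero within a few hundred iterations.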

  16. Selecting a proper design period for heliostat field layout optimization using Campo code

    Science.gov (United States)

    Saghafifar, Mohammad; Gadalla, Mohamed

    2016-09-01

    In this paper, different approaches are considered for calculating the cosine factor, which is utilized in the Campo code to expand the heliostat field layout and maximize its annual thermal output. Furthermore, three heliostat fields containing different numbers of mirrors are taken into consideration. The cosine factor is determined using instantaneous and time-averaged approaches. For the instantaneous method, different design days and design hours are selected. For the time-averaged method, daily, monthly, seasonal, and yearly time-averaged cosine factor determinations are considered. Results indicate that instantaneous methods are more appropriate for small-scale heliostat field optimization. Consequently, it is proposed to treat the design period as a second design variable to ensure the best outcome. For medium- and large-scale heliostat fields, selecting an appropriate design period is more important; it is therefore more reliable to select one of the recommended time-averaged methods to optimize the field layout. Optimum annual weighted efficiencies for heliostat fields (small, medium, and large) containing 350, 1460, and 3450 mirrors are 66.14%, 60.87%, and 54.04%, respectively.

  17. Simulated data and code for analysis of herpetofauna response to forest management in the Missouri Ozarks.

    Science.gov (United States)

    Rota, Christopher T; Wolf, Alexander J; Renken, Rochelle B; Gitzen, Robert A; Fantz, Debby K; Montgomery, Robert A; Olson, Matthew G; Vangilder, Larry D; Millspaugh, Joshua J

    2016-12-01

    We present predictor variables and R and Stan code for simulating and analyzing counts of Missouri Ozark herpetofauna in response to three forest management strategies. Our code performs four primary purposes: import predictor variables from spreadsheets; simulate synthetic response variables based on imported predictor variables and user-supplied values for data-generating parameters; format synthetic data for export to Stan; and analyze synthetic data.
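
    The simulate-then-analyze workflow the record describes can be sketched outside R/Stan. Below is an illustrative log-linear Poisson simulator in Python; the coefficient names and values are invented for the sketch and are not the study's data-generating parameters:

```python
import math
import random

def simulate_counts(predictors, beta0, beta1, seed=42):
    """Draw synthetic animal counts from a log-linear Poisson model:
    y_i ~ Poisson(exp(beta0 + beta1 * x_i))."""
    rng = random.Random(seed)
    counts = []
    for x in predictors:
        lam = math.exp(beta0 + beta1 * x)
        # inverse-transform sampling of one Poisson draw:
        # walk the CDF until it exceeds a uniform variate
        y, p = 0, math.exp(-lam)
        c, u = p, rng.random()
        while u > c:
            y += 1
            p *= lam / y
            c += p
        counts.append(y)
    return counts
```

    Refitting a model to such synthetic counts and checking that the known parameters are recovered is the standard way to validate the analysis code, which is what the published R/Stan scripts automate.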

  18. Simulated data and code for analysis of herpetofauna response to forest management in the Missouri Ozarks

    Directory of Open Access Journals (Sweden)

    Christopher T. Rota

    2016-12-01

    Full Text Available We present predictor variables and R and Stan code for simulating and analyzing counts of Missouri Ozark herpetofauna in response to three forest management strategies. Our code performs four primary purposes: import predictor variables from spreadsheets; simulate synthetic response variables based on imported predictor variables and user-supplied values for data-generating parameters; format synthetic data for export to Stan; and analyze synthetic data.

  19. OPIUM : optimal package install/ uninstall manager

    OpenAIRE

    Tucker, Christopher James

    2008-01-01

    Linux distributions often include package management tools such as apt-get in Debian or yum in RedHat. Using information about package dependencies and conflicts, such tools can determine how to install a new package (and its dependencies) on a system of already installed packages. Using off-the-shelf SAT solvers, pseudo-boolean solvers, and Integer Linear Programming solvers, we have developed a new package-management tool, called Opium, that improves on current tools in two ways: (1) Opium i...

  20. Optimization of high-definition video coding and hybrid fiber-wireless transmission in the 60 GHz band

    DEFF Research Database (Denmark)

    Lebedev, Alexander; Pham, Tien Thang; Beltrán, Marta;

    2011-01-01

    We demonstrate that, by jointly optimizing video coding and radio-over-fibre transmission, we extend the reach of 60-GHz wireless distribution of high-quality high-definition video satisfying low complexity and low delay constraints, while preserving superb video quality.

  1. Complex energy system management using optimization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bridgeman, Stuart; Hurdowar-Castro, Diana; Allen, Rick; Olason, Tryggvi; Welt, Francois

    2010-09-15

    Modern energy systems are often very complex with respect to the mix of generation sources, energy storage, transmission, and avenues to market. Historically, power was provided by government organizations to load centers, and pricing was set in a regulatory manner. In recent years, this process has been displaced by the independent system operator (ISO). This complexity makes the operation of these systems very difficult, since the components of the system are interdependent. Consequently, computer-based large-scale simulation and optimization methods such as decision support systems (DSS) are now being used. This paper discusses the application of a DSS to operations and planning systems.

  2. Optimization of landscape services under uncoordinated management by multiple landowners.

    Directory of Open Access Journals (Sweden)

    Miguel Porto

    Full Text Available Landscapes are often patchworks of private properties, where composition and configuration patterns result from the cumulative effects of the actions of multiple landowners. Securing the delivery of services in such multi-ownership landscapes is challenging, because it is difficult to assure tight compliance with spatially explicit management rules at the level of individual properties, which may hinder the conservation of critical landscape features. To deal with these constraints, a multi-objective simulation-optimization procedure was developed to select non-spatial management regimes that best meet landscape-level objectives, while accounting for uncoordinated and uncertain responses of individual landowners to management rules. Optimization approximates the non-dominated Pareto frontier, combining a multi-objective genetic algorithm and a simulator that forecasts trends in landscape pattern as a function of management rules implemented annually by individual landowners. The procedure was demonstrated with a case study on the optimum scheduling of fuel treatments in cork oak forest landscapes, involving six objectives related to reducing management costs (1), reducing fire risk (3), and protecting biodiversity associated with mid- and late-successional understories (2). There was a trade-off between cost, fire risk and biodiversity objectives that could be minimized by selecting management regimes in which ca. 60% of landowners clear the understory at short intervals (around 5 years) and the remainder manage at long intervals (ca. 75 years) or not at all. The optimal management regimes produce a mosaic landscape dominated by stands with herbaceous and low shrub understories, but also with a satisfactory representation of old understories, which is favorable in terms of both fire risk and biodiversity. The simulation-optimization procedure presented can be extended to incorporate a wide range of landscape dynamic processes, management rules

  3. Charge Management Optimization for Future TOU Rates

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jiucai; Markel, Tony

    2016-06-22

    The effectiveness of future time of use (TOU) rates to enable managed charging for providing demand response depends on the vehicle's flexibility and the benefits to owners. This paper adopts opportunity, delayed, and smart charging methods to quantify these impacts, flexibilities, and benefits. Simulation results show that delayed and smart charging methods can shift most charging events to lower TOU rate periods without compromising the charged energy and individual driver mobility needs.

  4. Optimal Investment by Financially Xenophobic Managers

    OpenAIRE

    Cummins, Jason G; Ingmar Nyman

    2000-01-01

    Case studies show that corporate managers seek financial independence to avoid interference by outside financiers. We incorporate this financial xenophobia as a fixed cost in a simple dynamic model of financing and investment. To avoid refinancing in the future, the firm alters its behavior depending on the extent of its financial xenophobia and the realization of a revenue shock. With a sufficiently adverse shock, the firm holds no liquidity. Otherwise, the firm precautionarily saves and hol...

  5. Managing Retention: Use of Simulation and Optimization

    Science.gov (United States)

    2010-01-01

    (No abstract available; the record consists of briefing-slide fragments listing military force-shaping tools, e.g., early retirement, separation policy, advancement planning, promotion plan by community, compensation policy, pay and incentives, high year tenure (E4-E6), Temporary Early Retirement Authority, Perform to Serve, Selective Reenlistment Bonus, and Selective Early Retirement, plotted against length of service.)

  6. Stand management optimization – the role of simplifications

    Directory of Open Access Journals (Sweden)

    Timo Pukkala

    2014-02-01

    Full Text Available Background: Studies on optimal stand management often make simplifications or restrict the choice of treatments. Examples of simplifications are neglecting natural regeneration that appears on a plantation site, omitting advance regeneration in simulations, or restricting thinning treatments to low thinning (thinning from below). Methods: This study analyzed the impacts of such simplifications on optimization results for Fennoscandian boreal forests. Management of pine and spruce plantations was optimized while gradually reducing the number of simplifying assumptions. Results: Forced low thinning, cleaning the plantation of natural regeneration of mixed species, and ignoring advance regeneration all had a major impact on optimization results. High thinning (thinning from above) resulted in higher NPV and longer rotation length than thinning from below. It was profitable to leave a mixed stand in the tending treatment of a young plantation. When advance regeneration was taken into account, it was profitable to increase the number of thinnings and postpone final felling. In the optimal management, both pine and spruce plantations were gradually converted into an uneven-aged mixture of spruce and birch. Conclusions: The results suggest that, with current management costs and timber price levels, it may be profitable to switch to continuous cover management on medium growing sites of Fennoscandian boreal forests.
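
    The NPV-versus-rotation-length trade-off in this record can be illustrated with the classical Faustmann land expectation value, which discounts an infinite series of identical rotations. The sigmoid yield curve, timber price, and planting cost below are invented for the sketch and are not the study's data:

```python
import math

def land_expectation_value(T, r, price=50.0, plant_cost=1000.0):
    """Faustmann LEV: discounted net revenue of an infinite series of
    identical rotations of length T years at interest rate r."""
    # assumed sigmoid stand-volume curve (m3/ha), saturating near 500
    volume = 500.0 / (1.0 + math.exp(-0.1 * (T - 40.0)))
    d = (1.0 + r) ** -T  # discount factor for one rotation
    return (price * volume * d - plant_cost) / (1.0 - d)

def optimal_rotation(r, grid=range(20, 65, 5)):
    """Grid search for the rotation length maximizing the LEV."""
    return max(grid, key=lambda T: land_expectation_value(T, r))
```

    A higher interest rate shortens the optimal rotation in this model, mirroring the paper's point that management costs and timber prices drive rotation length and regime choice.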

  7. Program MAMO: Models for avian management optimization-user guide

    Science.gov (United States)

    Guillaumet, Alban; Paxton, Eben

    2017-01-01

    The following chapters describe the structure and code of MAMO, and walk the reader through running the different components of the program with sample data. This manual should be used alongside a computer running R, so that the reader can copy and paste code into R, observe the output, and follow along interactively. Taken together, chapters 2–4 will allow the user to replicate a simulation study investigating the consequences of climate change and two potential management actions on the population dynamics of a vulnerable and iconic Hawaiian forest bird, the ‘I‘iwi (Drepanis coccinea; hereafter IIWI).

  8. The exploration of relationship between the surgery grading management and the ICD-9-CM-3 coding

    Directory of Open Access Journals (Sweden)

    Xin-yu FAN

    2014-09-01

    Full Text Available Surgery grading management is one of the key aspects of medical technology access and medical quality management. Surgical procedures are graded by risk level, degree of technical difficulty, and complexity of the process, whereas surgery coding is classified mainly by surgical site; assigning grades to ICD-coded surgical names therefore encounters some difficulties. Our hospital uses as its basis the "Jiangsu Province set of medical information classification and coding standards" and the "Jiangsu Province surgery hierarchical directory", issued by the Jiangsu Provincial Commission of Health and Family Planning. We describe our practice and exploration of the correspondence between graded surgeries and ICD-9-CM-3 codes, covering six categories of problems, such as different types of the same operation and different surgeries on the same site. We have incorporated intelligent surgical grading and coding management into hospital management, which regulates the management of hospital surgery more effectively, ensures the safety of medical quality, and reduces the incidence of adverse events in health care.

  9. An Optimal Pull-Push Scheduling Algorithm Based on Network Coding for Mesh Peer-to-Peer Live Streaming

    Science.gov (United States)

    Cui, Laizhong; Jiang, Yong; Wu, Jianping; Xia, Shutao

    Most large-scale Peer-to-Peer (P2P) live streaming systems are constructed as a mesh structure, which can provide robustness in the dynamic P2P environment. The pull scheduling algorithm is widely used in this mesh structure, but it degrades the performance of the entire system. Recently, network coding was introduced in mesh P2P streaming systems to improve performance, making the push strategy feasible. One of the best-known scheduling algorithms based on network coding is R2, with a random push strategy. Although R2 has achieved some success, the push scheduling strategy still lacks a theoretical model and optimal solution. In this paper, we propose a novel optimal pull-push scheduling algorithm based on network coding, which consists of two stages: the initial pull stage and the push stage. The main contributions of this paper are: 1) we put forward a theoretical analysis model that considers the scarcity and timeliness of segments; 2) we formulate the push scheduling problem as a global optimization problem and decompose it into local optimization problems on individual peers; 3) we introduce some rules to transform the local optimization problem into a classical min-cost optimization problem for solving it; 4) we combine the pull strategy with the push strategy and systematically realize our scheduling algorithm. Simulation results demonstrate that the decode delay, decode ratio, and redundant fraction of the P2P streaming system with our algorithm are significantly improved, without losing throughput or increasing overhead.

  10. An Enhanced System Architecture for Optimized Demand Side Management in Smart Grid

    Directory of Open Access Journals (Sweden)

    Anzar Mahmood

    2016-04-01

    Full Text Available Demand Side Management (DSM) through optimization of home energy consumption in the smart grid environment is now one of the well-known research areas. Appliance scheduling has been done through many different algorithms to reduce peak load and, consequently, the Peak to Average Ratio (PAR). This paper presents a Comprehensive Home Energy Management Architecture (CHEMA) with integration of multiple appliance scheduling options and enhanced load categorization in a smart grid environment. The CHEMA model consists of six layers and has been modeled in Simulink with embedded MATLAB code. A single knapsack optimization technique is used for scheduling, and four different cases of cost reduction are modeled at the second layer of CHEMA. Fault identification and electricity theft control have also been added in CHEMA. Furthermore, carbon footprint calculations have been incorporated in order to make users aware of environmental concerns. Simulation results prove the effectiveness of the proposed model.
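
    The single-knapsack scheduling step can be sketched as a standard 0/1 knapsack: choose the subset of deferrable appliances that maximizes user utility subject to a peak-period power cap. The appliance names, wattages, and utility scores below are illustrative, not taken from CHEMA:

```python
def knapsack_schedule(appliances, power_cap):
    """0/1 knapsack DP. appliances is a list of (name, watts, utility);
    returns (max total utility, names of the chosen appliances) such
    that total wattage does not exceed power_cap."""
    n = len(appliances)
    best = [[0] * (power_cap + 1) for _ in range(n + 1)]
    for i, (_, w, u) in enumerate(appliances, start=1):
        for cap in range(power_cap + 1):
            best[i][cap] = best[i - 1][cap]        # skip appliance i
            if w <= cap:                           # or run it, if it fits
                best[i][cap] = max(best[i][cap], best[i - 1][cap - w] + u)
    # trace back the chosen subset
    chosen, cap = [], power_cap
    for i in range(n, 0, -1):
        if best[i][cap] != best[i - 1][cap]:
            name, w, _ = appliances[i - 1]
            chosen.append(name)
            cap -= w
    return best[n][power_cap], chosen
```

    Appliances left out of the peak-period knapsack would be shifted to off-peak slots, which is how such scheduling lowers the PAR.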

  11. Optimal management of chronic osteomyelitis: current perspectives

    Directory of Open Access Journals (Sweden)

    Pande KC

    2015-08-01

    Full Text Available Ketan C Pande, Raja Isteri Pengiran Anak Saleha Hospital, Bandar Seri Begawan, Brunei. Abstract: Chronic osteomyelitis is a challenging condition to treat. It is seen mostly after open fractures or in implant-related infections following treatment of fractures and prosthetic joint replacements. Recurrence of infection is well known, and successful treatment requires a multidisciplinary team approach with surgical debridement and appropriate antimicrobial therapy as the cornerstone of treatment. Staging of the disease and identification of the causative microorganism are essential before initiation of treatment. Important surgical steps include radical debridement of necrotic and devitalized tissue, removal of implants, management of resultant dead space, soft-tissue coverage, and skeletal stabilization or management of skeletal defects. The route of administration and duration of antimicrobial therapy continue to be debated. The role of biofilm is now clearly established in the chronicity of bone infection, and newer modalities are being developed to address various issues related to biofilm formation. The present review addresses various aspects of chronic osteomyelitis of long bones seen in adults, with a review of recent developments. Keywords: osteomyelitis, infection, biofilm, bone, therapy, treatment

  12. Optimal management of hemophilic arthropathy and hematomas

    Directory of Open Access Journals (Sweden)

    Lobet S

    2014-10-01

    Full Text Available Sébastien Lobet,1,2 Cedric Hermans,1 Catherine Lambert1 1Hemostasis-Thrombosis Unit, Division of Hematology, 2Division of Physical Medicine and Rehabilitation, Cliniques Universitaires Saint-Luc, Brussels, Belgium Abstract: Hemophilia is a hematological disorder characterized by a partial or complete deficiency of clotting factor VIII or IX. Its bleeding complications primarily affect the musculoskeletal system. Hemarthrosis is a major hemophilia-related complication, responsible for a particularly debilitating chronic arthropathy, in the long term. In addition to clotting factor concentrates, usually prescribed by the hematologist, managing acute hemarthrosis and chronic arthropathy requires a close collaboration between the orthopedic surgeon and physiotherapist. This collaboration, comprising a coagulation and musculoskeletal specialist, is key to effectively preventing hemarthrosis, managing acute joint bleeding episodes, assessing joint function, and actively treating chronic arthropathy. This paper reviews, from a practical point of view, the pathophysiology, clinical manifestations, and treatment of hemarthrosis and chronic hemophilia-induced arthropathy for hematologists, orthopedic surgeons, and physiotherapists. Keywords: hemophilia, arthropathy, hemarthrosis, hematoma, physiotherapy, target joint

  13. Neural network river forecasting through baseflow separation and binary-coded swarm optimization

    Science.gov (United States)

    Taormina, Riccardo; Chau, Kwok-Wing; Sivakumar, Bellie

    2015-10-01

    The inclusion of expert knowledge in data-driven streamflow modeling is expected to yield more accurate estimates of river quantities. Modular models (MMs) designed to work on different parts of the hydrograph are preferred ways to implement such an approach. Previous studies have suggested that better predictions of total streamflow could be obtained via modular Artificial Neural Networks (ANNs) trained to perform an implicit baseflow separation. These MMs fit separately the baseflow and excess flow components as produced by a digital filter, and reconstruct the total flow by adding these two signals at the output. The optimization of the filter parameters and ANN architectures is carried out through global search techniques. Despite the favorable premises, the real effectiveness of such MMs has been tested only on a few case studies, and the quality of the baseflow separation they perform has never been thoroughly assessed. In this work, we compare the performance of MMs against global models (GMs) for nine different gaging stations in the northern United States. Binary-coded swarm optimization is employed for the identification of filter parameters and model structure, while Extreme Learning Machines, instead of ANNs, are used to drastically reduce the large computational times required to perform the experiments. The results show that there is no evidence that MMs outperform GMs for predicting the total flow. In addition, the baseflow produced by the MMs largely underestimates the actual baseflow component expected for most of the considered gages. This occurs because the values of the filter parameters maximizing overall accuracy do not reflect the geological characteristics of the river basins. The results indeed show that setting the filter parameters according to expert knowledge results in accurate baseflow separation but lower accuracy of total flow predictions, suggesting that these two objectives are intrinsically conflicting rather than compatible.
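
    The digital filter behind such modular models is commonly the one-parameter Lyne-Hollick recursive filter (an assumption here; the abstract does not name the filter). A minimal single-forward-pass sketch, with alpha = 0.925, a conventional default; practical applications usually run several forward/backward passes:

```python
def baseflow_separation(flow, alpha=0.925):
    """One forward pass of the Lyne-Hollick recursive digital filter.
    Quickflow: f_k = alpha * f_{k-1} + (1 + alpha) / 2 * (y_k - y_{k-1}),
    constrained to 0 <= f_k <= y_k; baseflow is the remainder y_k - f_k."""
    quick = 0.0  # start with no storm flow (an initialization choice)
    base = []
    prev = flow[0]
    for y in flow:
        quick = alpha * quick + 0.5 * (1.0 + alpha) * (y - prev)
        quick = min(max(quick, 0.0), y)  # physical bounds on quickflow
        base.append(y - quick)
        prev = y
    return base
```

    The paper's point can be seen directly in alpha: tuning it for total-flow accuracy need not produce a hydrologically meaningful split, whereas fixing it from expert knowledge constrains the separation at some cost in overall fit.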

  14. The role of crossover operator in evolutionary-based approach to the problem of genetic code optimization.

    Science.gov (United States)

    Błażej, Paweł; Wnȩtrzak, Małgorzata; Mackiewicz, Paweł

    2016-12-01

    One of the theories explaining the present structure of the canonical genetic code assumes that it was optimized to minimize harmful effects of amino acid replacements resulting from nucleotide substitutions and translational errors. A way to test this concept is to find the optimal code under given criteria and compare it with the canonical genetic code. Unfortunately, the huge number of possible alternatives makes it impossible to find the optimal code using exhaustive methods in sensible time. Therefore, heuristic methods should be applied to search the space of possible solutions. Evolutionary algorithms (EA) seem to be one such promising approach. This class of methods is founded on both mutation and crossover operators, which are responsible for creating and maintaining the diversity of candidate solutions. These operators possess dissimilar characteristics and consequently play different roles in the process of finding the best solutions under given criteria. Therefore, the effective searching for potential solutions can be improved by applying both of them, especially when these operators are devised specifically for a given problem. To study this subject, we analyze the effectiveness of algorithms for various combinations of mutation and crossover probabilities under three models of the genetic code assuming different restrictions on its structure. To achieve that, we adapt the position-based crossover operator for the most restricted model and develop a new type of crossover operator for the more general models. The applied fitness function describes costs of amino acid replacement regarding their polarity. Our results indicate that the usage of crossover operators can significantly improve the quality of the solutions. Moreover, the simulations with the crossover operator optimize the fitness function in a smaller number of generations than simulations without this operator. The optimal genetic codes without restrictions on their structure
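
    The position-based crossover the record mentions for the most restricted model can be sketched for a generic permutation encoding (toy permutations of integers here; in the genetic-code setting the permutation would map codon blocks to amino acids):

```python
import random

def position_based_crossover(p1, p2, rng):
    """Position-based crossover (POS) for permutation encodings:
    a random subset of positions is inherited from parent p1; the
    remaining elements are filled in, in the relative order in which
    they appear in parent p2. The child is always a valid permutation."""
    n = len(p1)
    keep = {i for i in range(n) if rng.random() < 0.5}
    child = [p1[i] if i in keep else None for i in range(n)]
    inherited = {p1[i] for i in keep}
    fill = (g for g in p2 if g not in inherited)
    return [g if g is not None else next(fill) for g in child]
```

    Because the operator never duplicates or drops an element, it preserves the structural constraint of the restricted model while still mixing material from both parents, which is the diversity-maintaining role the paper attributes to crossover.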

  15. Enhanced Protein Production in Escherichia coli by Optimization of Cloning Scars at the Vector-Coding Sequence Junction

    DEFF Research Database (Denmark)

    Mirzadeh, Kiavash; Martinez, Virginia; Toddo, Stephen

    2015-01-01

    Protein production in Escherichia coli is a fundamental activity for a large fraction of academic, pharmaceutical, and industrial research laboratories. Maximum production is usually sought, as this reduces costs and facilitates downstream purification steps. Frustratingly, many coding sequences are poorly expressed even when they are codon-optimized and expressed from vectors with powerful genetic elements. In this study, we show that poor expression can be caused by certain nucleotide sequences (e.g., cloning scars) at the junction between the vector and the coding sequence. Since these sequences lie between the Shine-Dalgarno sequence and the start codon, they are an integral part of the translation initiation region. To identify the most optimal sequences, we devised a simple and inexpensive PCR-based step that generates sequence variants at the vector-coding sequence junction...

  16. Embedding QR codes in tumor board presentations, enhancing educational content for oncology information management.

    Science.gov (United States)

    Siderits, Richard; Yates, Stacy; Rodriguez, Arelis; Lee, Tina; Rimmer, Cheryl; Roche, Mark

    2011-01-01

    Quick Response (QR) codes are standard in supply management and appear with increasing frequency in advertisements; they are now regularly present in healthcare informatics and education. These 2-dimensional square bar codes, originally designed by Denso Wave, a Toyota subsidiary, are free of license and have a published international standard. The codes can be generated by free online software and the resulting images incorporated into presentations. The images can be scanned by smartphones and tablets on either the iOS or Android platform, which links the device with the information represented by the QR code (a uniform resource locator or URL, online video, text, v-calendar entries, short message service [SMS] messages, or formatted text). Once linked to the device, the information can be viewed at any time after the original presentation, saved on the device or to a Web-based "cloud" repository, printed, or shared with others via email or Bluetooth file transfer. This paper describes how we use QR codes in our tumor board presentations, discusses their benefits, how QR codes differ from simple Web links, and how they facilitate the distribution of educational content.

  17. An Optimization of the Risk Management using Derivatives

    Directory of Open Access Journals (Sweden)

    Ovidiu ŞONTEA

    2011-07-01

    Full Text Available This article aims to provide a process that can be used in financial risk management to minimize a risk measure (VaR) using derivative products, namely bonds and options. The optimization problem is formulated for the hedging of a portfolio consisting of an asset and a put option on that asset, and of a bond and an option on that bond. In the first optimization problem we obtain the hedge ratio at the optimal exercise price of the option, which is in fact the relative cost of the option's value. In the second optimization problem we obtain the optimal exercise price for a put option written on a bond.

  18. Review: Optimization methods for groundwater modeling and management

    Science.gov (United States)

    Yeh, William W.-G.

    2015-09-01

    Optimization methods have been used in groundwater modeling as well as for the planning and management of groundwater systems. This paper reviews and evaluates the various optimization methods that have been used for solving the inverse problem of parameter identification (estimation), experimental design, and groundwater planning and management. Various model selection criteria are discussed, as well as criteria used for model discrimination. The inverse problem of parameter identification concerns the optimal determination of model parameters using water-level observations. In general, the optimal experimental design seeks to find sampling strategies for the purpose of estimating the unknown model parameters. A typical objective of optimal conjunctive-use planning of surface water and groundwater is to minimize the operational costs of meeting water demand. The optimization methods include mathematical programming techniques such as linear programming, quadratic programming, dynamic programming, stochastic programming, nonlinear programming, and the global search algorithms such as genetic algorithms, simulated annealing, and tabu search. Emphasis is placed on groundwater flow problems as opposed to contaminant transport problems. A typical two-dimensional groundwater flow problem is used to explain the basic formulations and algorithms that have been used to solve the formulated optimization problems.
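
Among the global search algorithms listed above, simulated annealing is the easiest to sketch. The example below is entirely hypothetical: it allocates pumping among three wells with invented quadratic cost coefficients and a soft demand-balance penalty, purely to illustrate the accept/reject mechanics and cooling schedule; real groundwater management formulations are far richer.

```python
import math
import random

random.seed(7)

COSTS = [1.0, 2.0, 4.0]     # hypothetical unit pumping-cost coefficients per well
DEMAND = 12.0               # total abstraction that must be met

def objective(q):
    # quadratic pumping cost plus a soft penalty for missing the demand
    penalty = 100.0 * abs(sum(q) - DEMAND)
    return sum(c * x * x for c, x in zip(COSTS, q)) + penalty

def anneal(steps=20000, t0=50.0):
    q = [DEMAND / 3.0] * 3                     # start from equal pumping
    f = objective(q)
    best, best_f = q[:], f
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-3      # linear cooling schedule
        cand = [max(0.0, x + random.gauss(0.0, 0.1)) for x in q]
        fc = objective(cand)
        # always accept improvements; accept worsenings with Boltzmann probability
        if fc < f or random.random() < math.exp((f - fc) / t):
            q, f = cand, fc
            if f < best_f:
                best, best_f = q[:], f
    return best, best_f
```

For this convex toy problem the analytic optimum (pumping inversely proportional to the cost coefficient) has cost about 82.3, which the annealer approaches closely.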

  19. Using the Electronic Industry Code of Conduct to Evaluate Green Supply Chain Management: An Empirical Study of Taiwan’s Computer Industry

    Directory of Open Access Journals (Sweden)

    Ching-Ching Liu

    2015-03-01

    Full Text Available Electronics companies throughout Asia recognize the benefits of Green Supply Chain Management (GSCM) for gaining competitive advantage. A large majority of electronics companies in Taiwan have recently adopted the Electronic Industry Citizenship Coalition (EICC) Code of Conduct for defining and managing their social and environmental responsibilities throughout their supply chains. We surveyed 106 Tier 1 suppliers to the Taiwanese computer industry to determine their environmental performance using the EICC Code of Conduct (EICC Code) and performed Analysis of Variance (ANOVA) on the 63 of 106 questionnaire responses collected. We test the results to determine whether differences in product type, geographic area, and supplier size correlate with different levels of environmental performance. To our knowledge, this is the first study to analyze questionnaire data on supplier adoption of the EICC Code to optimize the implementation of GSCM. The results suggest that characteristic classification of suppliers could be employed to enhance the efficiency of GSCM.
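
The one-way ANOVA used in the study can be illustrated with a minimal from-scratch F statistic. The group data below are invented for illustration; the actual survey analysis is of course much richer.

```python
def one_way_anova_F(groups):
    """One-way ANOVA F statistic for a list of samples (lists of numbers)."""
    k = len(groups)                         # number of groups
    n = sum(len(g) for g in groups)         # total observations
    grand = sum(sum(g) for g in groups) / n
    # between-group sum of squares (weighted squared deviation of group means)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # within-group sum of squares (deviations from each group's own mean)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical environmental-performance scores for three supplier groupings
F = one_way_anova_F([[1, 2, 3], [2, 3, 4], [3, 4, 5]])
```

A large F relative to the F distribution's critical value indicates that group means differ more than within-group noise would explain.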

  20. Ethical problems in nursing management: the role of codes of ethics.

    Science.gov (United States)

    Aitamaa, Elina; Leino-Kilpi, Helena; Puukka, Pauli; Suhonen, Riitta

    2010-07-01

    The aim of this study was to identify the ethical problems that nurse managers encounter in their work and the role of codes of ethics in the solutions to these difficulties. The data were collected using a structured questionnaire and analysed statistically. The target sample included all nurse managers in 21 specialized health care or primary health care organizations in two hospital districts in Finland (N = 501; response rate 41%). The most common ethical problems concerned resource allocation as well as providing and developing high quality care. This was the case across different managerial positions and types of organization. Professional codes of ethics were used more often for problems related to patients' care compared with issues of resource allocation. Nurse managers at middle or strategic management levels used codes of ethics more often than those in charge of a ward. More research is required to investigate ethical decision making in nursing management, especially with regard to problem solving. In addition, new guidelines and continuing education in ethics are important for management personnel.

  1. Governing Bodies and Learner Discipline: Managing Rural Schools in South Africa through a Code of Conduct

    Science.gov (United States)

    Mestry, Raj; Khumalo, Jan

    2012-01-01

    The South African Schools Act of 1996 provides that school governing bodies (SGBs) should adopt and assist in the enforcement of a learner code of conduct to maintain discipline effectively. This study focuses on the perceptions and experiences of SGBs in managing discipline in rural secondary schools through the design and enforcement of learner…

  2. Delegated Portfolio Management and Optimal Allocation of Portfolio Managers

    DEFF Research Database (Denmark)

    Christensen, Michael; Vangsgaard Christensen, Michael; Gamskjaer, Ken

    2015-01-01

    In this article, we investigate whether the application of the mean-variance framework on portfolio manager allocation offers any out-of-sample benefits compared to a naïve strategy of equal weighting. Based on an exclusive data-set of high-net-worth (HNW) investors, we utilize a wide variety...
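
The mean-variance framework referenced above can be illustrated with its simplest instance: two managers and the global minimum-variance allocation. The volatilities and correlation below are illustrative, not the article's (exclusive) data set.

```python
def min_variance_weights(s1, s2, rho):
    """Global minimum-variance weights for two managers with return
    volatilities s1, s2 and correlation rho (weights sum to 1)."""
    cov = rho * s1 * s2
    w1 = (s2 ** 2 - cov) / (s1 ** 2 + s2 ** 2 - 2.0 * cov)
    return w1, 1.0 - w1

def portfolio_vol(w1, s1, s2, rho):
    # volatility of the two-manager portfolio with weight w1 on manager 1
    w2 = 1.0 - w1
    var = (w1 * s1) ** 2 + (w2 * s2) ** 2 + 2.0 * w1 * w2 * rho * s1 * s2
    return var ** 0.5
```

With uncorrelated managers at 10% and 20% volatility, the minimum-variance weights are 80/20, and the resulting portfolio volatility is lower than that of the naive equal weighting the article uses as a benchmark.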

  3. Cost Evaluation and Portfolio Management Optimization for Biopharmaceutical Product Development

    OpenAIRE

    Nie, W.

    2015-01-01

    The pharmaceutical industry is suffering from declining R&D productivity and yet biopharmaceutical firms have been attracting increasing venture capital investment. Effective R&D portfolio management can deliver above average returns under increasing costs of drug development and the high risk of clinical trial failure. This points to the need for advanced decisional tools that facilitate decision-making in R&D portfolio management by efficiently identifying optimal solutions while accounting...

  4. Optimal Maintenance Management of Offshore Wind Farms

    Directory of Open Access Journals (Sweden)

    Alberto Pliego Marugán

    2016-01-01

    Full Text Available Nowadays, offshore wind energy is the renewable energy source with the highest growth. Offshore wind farms are composed of large and complex wind turbines that require a high level of reliability, availability, maintainability and safety (RAMS). Firms are employing robust remote condition monitoring systems in order to improve RAMS, given the difficulty of accessing the wind farm. The main objective of this research work is to optimise the maintenance management of wind farms through the fault probability of each wind turbine. The probability has been calculated by Fault Tree Analysis (FTA) employing Binary Decision Diagrams (BDD) in order to reduce the computational cost. The fault tree presented in this paper has been designed and validated based on qualitative data from the literature and experts from important European collaborative research projects. The basic events of the fault tree have been prioritized by the criticality method in order to use resources efficiently. Exogenous variables, e.g., weather conditions, have also been considered in this research work. The results provided by the dynamic probability of failure and the importance measures have been employed to develop a scheduled maintenance plan that improves decision making and, consequently, reduces maintenance costs.
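
The top-event probability and importance-measure calculations can be sketched on a miniature, hypothetical tree. The event names and probabilities below are invented for illustration and are far simpler than the validated tree in the paper; the importance measure shown is the standard Birnbaum measure, one common choice for prioritizing basic events.

```python
def or_gate(ps):
    # P(at least one of several independent events occurs)
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

def and_gate(ps):
    # P(all independent events occur)
    out = 1.0
    for p in ps:
        out *= p
    return out

# Hypothetical miniature turbine fault tree:
# TOP = OR(gearbox failure, AND(main sensor failure, backup sensor failure))
def top_probability(p):
    return or_gate([p["gearbox"], and_gate([p["sensor"], p["backup"]])])

def birnbaum_importance(p, event):
    # Birnbaum importance: sensitivity of the top event to one basic event
    hi = dict(p); hi[event] = 1.0
    lo = dict(p); lo[event] = 0.0
    return top_probability(hi) - top_probability(lo)

probs = {"gearbox": 0.01, "sensor": 0.05, "backup": 0.05}
```

Here the gearbox event dominates the importance ranking because it reaches the top event directly, while either sensor failure is mitigated by its redundant partner.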

  5. The Optimization of Radioactive Waste Management in the Nuclear Installation Decommissioning Process

    Energy Technology Data Exchange (ETDEWEB)

    Zachar, Matej; Necas, Vladimir [Slovak University of Technology in Bratislava, Faculty of Electrical Engineering and Information Technology, Department of Nuclear Physics and Technology, Ilkovicova 3, 812 19 Bratislava (Slovakia)

    2008-07-01

    The paper presents a basic characterization of the nuclear installation decommissioning process, especially in terms of radioactive materials management. The large amounts of solid materials and secondary waste created during decommissioning activities have to be managed with regard to their physical, chemical, toxic and radiological characteristics. Radioactive materials should, after fulfilling all the conditions defined by the authorities, be released to the environment for further use. Non-releasable materials are considered radioactive waste. Their management includes various procedures, starting with pre-treatment activities and continuing with storage, treatment and conditioning; finally, the waste is disposed of in near-surface or deep geological repositories. Considering the advantages and disadvantages of all possible ways of releasing material from the nuclear installation area, the material management process should be optimized. Emphasis is placed on the radiological parameters of the materials, the availability of waste management technologies and waste repositories, and the radiological limits and conditions for material release or waste disposal. Appropriate optimization of the material flow should lead to significant savings in money, disposal capacity and raw material resources. Using a suitable calculation code, e.g. OMEGA, the various material management scenarios can be evaluated and the best one selected on the basis of multi-criterion analysis. (authors)
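
A multi-criterion comparison of material-management scenarios, of the kind performed with a code such as OMEGA, can be illustrated with a toy weighted-sum scoring. The criteria, weights, scenario names and scores below are all invented for illustration; OMEGA itself is far more detailed.

```python
# Illustrative criteria weights (lower score = better on every criterion)
WEIGHTS = {"cost": 0.40, "disposal_volume": 0.35, "dose": 0.25}

# Hypothetical normalized scores for three management scenarios
SCENARIOS = {
    "release_after_decontamination": {"cost": 0.6, "disposal_volume": 0.2, "dose": 0.5},
    "direct_disposal":               {"cost": 0.9, "disposal_volume": 0.9, "dose": 0.2},
    "melt_and_recycle":              {"cost": 0.5, "disposal_volume": 0.3, "dose": 0.4},
}

def score(scenario):
    # weighted-sum aggregation over all criteria
    return sum(WEIGHTS[c] * v for c, v in scenario.items())

best = min(SCENARIOS, key=lambda name: score(SCENARIOS[name]))
```

Weighted sums are only the simplest aggregation; real multi-criterion analyses also consider normalization, criteria interdependence, and sensitivity of the ranking to the weights.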

  6. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, K [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Takashina, M; Koizumi, M [Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Das, I; Moskvin, V [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)

    2014-06-01

    Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters of the GATE and PHITS codes and the resulting percentage depth dose (PDD) have not been reported; here they are studied for PDD and proton range in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physics and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters, using the whole computational model of the treatment nozzle, and defined the optimal parameters for the simulation by referring to the calculation results. The physics model, particle transport mechanics and the different geometry-based descriptions need accurate customization in all three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health
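
The quoted R90 values are the depths at which the dose falls to 90% of its maximum on the distal edge of the depth-dose curve. A minimal extraction routine (toy data, linear interpolation between sampled depths) might look like this; it is a generic sketch, not code from any of the cited Monte Carlo packages.

```python
def r90(depths, doses):
    """Distal R90: the depth at which the dose falls to 90% of its maximum,
    located beyond the peak and found by linear interpolation."""
    dmax = max(doses)
    target = 0.9 * dmax
    i_peak = doses.index(dmax)
    # walk distally from the peak until the curve crosses the 90% level
    for i in range(i_peak, len(doses) - 1):
        if doses[i] >= target >= doses[i + 1] and doses[i] != doses[i + 1]:
            frac = (doses[i] - target) / (doses[i] - doses[i + 1])
            return depths[i] + frac * (depths[i + 1] - depths[i])
    raise ValueError("dose never falls to 90% of maximum beyond the peak")

# toy depth-dose curve (depths in mm, doses in arbitrary units)
range_mm = r90([0.0, 10.0, 20.0, 30.0], [50.0, 80.0, 100.0, 40.0])
```

With finely sampled Monte Carlo PDDs, sub-millimetre differences such as those between FLUKA, GATE and PHITS above are resolvable by exactly this kind of interpolation.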

  7. Lymphangioleiomyomatosis: differential diagnosis and optimal management

    Directory of Open Access Journals (Sweden)

    Xu KF

    2014-08-01

    been confirmed in clinical trials. Research in other molecular-targeted therapies is under investigation. A previously little-known rare disease with no cure is now better understood with regards to its pathogenesis, diagnosis, and management. In this review, current knowledge in diagnosis and differential diagnosis of LAM will be discussed, followed by the discussion of therapy with mTOR inhibitors. Keywords: lymphangioleiomyomatosis, diffuse cystic lung diseases, tuberous sclerosis complex, vascular endothelial growth factor-D, sirolimus

  8. Optimal management of idiopathic macular holes

    Directory of Open Access Journals (Sweden)

    Madi HA

    2016-01-01

    Full Text Available Haifa A Madi,1,* Ibrahim Masri,1,* David H Steel1,2 1Sunderland Eye Infirmary, Sunderland, 2Institute of Genetic Medicine, Newcastle University, International Centre for Life, Newcastle, UK *These authors contributed equally to this work Abstract: This review evaluates the current surgical options for the management of idiopathic macular holes (IMHs), including vitrectomy, ocriplasmin (OCP), and expansile gas use, and discusses key background information to inform the choice of treatment. An evidence-based approach to selecting the best treatment option for the individual patient based on IMH characteristics and patient-specific factors is suggested. For holes without vitreomacular attachment (VMA), vitrectomy is the only option, with three key surgical variables: whether to peel the inner limiting membrane (ILM), the type of tamponade agent to be used, and the requirement for postoperative face-down posturing. There is a general consensus that ILM peeling improves the primary anatomical hole closure rate; however, in small holes (<250 µm), it is uncertain whether peeling is always required. It has been increasingly recognized that long-acting gas and face-down positioning are not always necessary in patients with small- and medium-sized holes, but large (>400 µm) and chronic holes (>1-year history) are usually treated with long-acting gas and posturing. Several studies on posturing and gas choice were carried out in combination with ILM peeling, which may also influence the gas and posturing requirement. Combined phacovitrectomy appears to offer more rapid visual recovery without affecting the long-term outcomes of vitrectomy for IMH. OCP is licensed for use in patients with small- or medium-sized holes and VMA. A greater success rate in using OCP has been reported in smaller holes, but further predictive factors for its success are needed to refine its use. It is important to counsel patients realistically regarding the rates of success with

  9. Nickel-Cadmium Battery Operation Management Optimization Using Robust Design

    Science.gov (United States)

    Blosiu, Julian O.; Deligiannis, Frank; DiStefano, Salvador

    1996-01-01

    In recent years, following several spacecraft battery anomalies, it was determined that managing the operational factors of NASA flight NiCd rechargeable batteries was very important in order to maintain nominal space flight battery performance. The optimization of existing flight battery operational performance was a new application of the Taguchi Methods.

  10. Analytical models integrated with satellite images for optimized pest management

    Science.gov (United States)

    The global field protection (GFP) was developed to protect and optimize pest management resources integrating satellite images for precise field demarcation with physical models of controlled release devices of pesticides to protect large fields. The GFP was implemented using a graphical user interf...

  11. Optimizing Resource and Energy Recovery for Municipal Solid Waste Management

    Science.gov (United States)

    Significant reductions of carbon emissions and air quality impacts can be achieved by optimizing municipal solid waste (MSW) as a resource. Materials and discards management were found to contribute ~40% of overall U.S. GHG emissions as a result of materials extraction, transpo...

  13. Hydroeconomic optimization of reservoir management under downstream water quality constraints

    DEFF Research Database (Denmark)

    Davidsen, Claus; Liu, Suxia; Mo, Xingguo

    2015-01-01

    A hydroeconomic optimization approach is used to guide water management in a Chinese river basin with the objectives of meeting water quantity and water quality constraints, in line with the China 2011 No. 1 Policy Document and 2015 Ten-point Water Plan. The proposed modeling framework couples...... water quantity and water quality management and minimizes the total costs over a planning period assuming stochastic future runoff. The outcome includes cost-optimal reservoir releases, groundwater pumping, water allocation, wastewater treatments and water curtailments. The optimization model uses...... a variant of stochastic dynamic programming known as the water value method. Nonlinearity arising from the water quality constraints is handled with an effective hybrid method combining genetic algorithms and linear programming. Untreated pollutant loads are represented by biochemical oxygen demand (BOD...
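
The water value method mentioned above can be illustrated with a miniature backward stochastic dynamic program. Everything below is illustrative: a handful of discrete storage states, equiprobable inflow scenarios, and a single curtailment cost, far simpler than the paper's coupled water quantity-quality model; only the backward-induction structure is the point.

```python
LEVELS = range(5)       # feasible reservoir storage states (volume units)
INFLOWS = (0, 1, 2)     # equiprobable inflow scenarios
DEMAND = 2              # per-stage water demand
CURTAIL_COST = 10.0     # cost per unit of curtailed demand
CAP = 4                 # storage capacity; water above this spills

def solve(horizon=4):
    value = {s: 0.0 for s in LEVELS}           # terminal water values
    for _ in range(horizon):
        new = {}
        for s in LEVELS:
            expected = 0.0
            for q in INFLOWS:
                # choose the release r minimizing curtailment cost plus
                # the expected future cost of the carried-over storage
                expected += min(
                    CURTAIL_COST * max(0, DEMAND - r)
                    + value[min(s + q - r, CAP)]
                    for r in range(s + q + 1)) / len(INFLOWS)
            new[s] = expected
        value = new
    return value

values = solve()
# values[s] - values[s + 1] approximates the marginal water value at storage s
```

The resulting table is non-increasing in storage: a fuller reservoir means lower expected future cost, and the differences between adjacent states are the marginal water values that guide release decisions.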

  14. Optimal management of urosepsis from the urological perspective.

    Science.gov (United States)

    Wagenlehner, Florian M E; Weidner, Wolfgang; Naber, Kurt G

    2007-11-01

    Urosepsis in adults comprises approximately 25% of all sepsis cases and in most cases is due to complicated urinary tract infections (UTIs). In this paper we review the optimal management of urosepsis from the urological point of view. Urosepsis is often due to obstructive uropathy of the upper or lower urinary tract. The treatment of urosepsis comprises four major aspects: 1. Early goal-directed therapy; 2. Optimal pharmacodynamic exposure to antimicrobials both in blood and in the urinary tract; 3. Control of complicating factors in the urinary tract; 4. Specific sepsis therapy. Early tissue oxygenation, appropriate initial antibiotic therapy and rapid identification and control of the septic focus in the urinary tract are critical steps in the successful management of a patient with severe urosepsis. To achieve this goal an optimal interdisciplinary approach encompassing the emergency unit, urological specialties and intensive-care medicine is necessary.

  15. The neural code for auditory space depends on sound frequency and head size in an optimal manner.

    Directory of Open Access Journals (Sweden)

    Nicol S Harper

    Full Text Available A major cue to the location of a sound source is the interaural time difference (ITD) - the difference in sound arrival time at the two ears. The neural representation of this auditory cue is unresolved. The classic model of ITD coding, dominant for a half-century, posits that the distribution of best ITDs (the ITD evoking a neuron's maximal response) is unimodal and largely within the range of ITDs permitted by head size. This is often interpreted as a place code for source location. An alternative model, based on neurophysiology in small mammals, posits a bimodal distribution of best ITDs with exquisite sensitivity to ITDs generated by means of relative firing rates between the distributions. Recently, an optimal-coding model was proposed, unifying the disparate features of these two models under the framework of efficient coding by neural populations. The optimal-coding model predicts that distributions of best ITDs depend on head size and sound frequency: for high frequencies and large heads it resembles the classic model; for low frequencies and small head sizes it resembles the bimodal model. The optimal-coding model makes key, yet unobserved, predictions: for many species, including humans, both forms of neural representation are employed, depending on sound frequency. Furthermore, novel representations are predicted for intermediate frequencies. Here, we examine these predictions in neurophysiological data from five mammalian species: macaque, guinea pig, cat, gerbil and kangaroo rat. We present the first evidence supporting these untested predictions, and demonstrate that different representations appear to be employed at different sound frequencies in the same species.

  16. The neural code for auditory space depends on sound frequency and head size in an optimal manner.

    Science.gov (United States)

    Harper, Nicol S; Scott, Brian H; Semple, Malcolm N; McAlpine, David

    2014-01-01

    A major cue to the location of a sound source is the interaural time difference (ITD)-the difference in sound arrival time at the two ears. The neural representation of this auditory cue is unresolved. The classic model of ITD coding, dominant for a half-century, posits that the distribution of best ITDs (the ITD evoking a neuron's maximal response) is unimodal and largely within the range of ITDs permitted by head-size. This is often interpreted as a place code for source location. An alternative model, based on neurophysiology in small mammals, posits a bimodal distribution of best ITDs with exquisite sensitivity to ITDs generated by means of relative firing rates between the distributions. Recently, an optimal-coding model was proposed, unifying the disparate features of these two models under the framework of efficient coding by neural populations. The optimal-coding model predicts that distributions of best ITDs depend on head size and sound frequency: for high frequencies and large heads it resembles the classic model, for low frequencies and small head sizes it resembles the bimodal model. The optimal-coding model makes key, yet unobserved, predictions: for many species, including humans, both forms of neural representation are employed, depending on sound frequency. Furthermore, novel representations are predicted for intermediate frequencies. Here, we examine these predictions in neurophysiological data from five mammalian species: macaque, guinea pig, cat, gerbil and kangaroo rat. We present the first evidence supporting these untested predictions, and demonstrate that different representations appear to be employed at different sound frequencies in the same species.

  17. Development of core fuel management code system for WWER-type reactors

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In this article, a core fuel management program for hexagonal pressurized water WWER-type reactors (CFMHEX) has been developed, based on an advanced three-dimensional nodal method and integrated with a thermal-hydraulic code to realize the coupling of neutronics and thermal-hydraulics. In CFMHEX, feedback effects such as burnup, power distribution, moderator density, and control rod insertion are considered. The verification and validation of the code system have been carried out on the IAEA WWER-1000-type Kalinin NPP benchmark problem. The numerical results are in good agreement with measurements and are close to those of other international institutes.

  18. Housing Development Building Management System (HDBMS) for Optimized Electricity Bills

    Directory of Open Access Journals (Sweden)

    Weixian Li

    2017-08-01

    Full Text Available A smart building allows residents to enjoy sustainable comfort with high efficiency of electricity usage. These objectives can be achieved by applying appropriate, capable optimization algorithms and techniques. This paper presents a Housing Development Building Management System (HDBMS) strategy inspired by the Building Energy Management System (BEMS) concept, which integrates with smart buildings using Supply Side Management (SSM) and Demand Side Management (DSM) systems. HDBMS is a Multi-Agent System (MAS) based decentralized decision-making system proposed by various authors. The MAS-based HDBMS was created in JAVA on an IEEE FIPA-compliant multi-agent platform named JADE. It allows agents to communicate, interact and negotiate the energy supply and demand of smart buildings to provide optimal energy usage and minimal electricity costs. This reduces the load on the power distribution system in smart buildings; simulation studies have shown the potential of the proposed HDBMS strategy to provide an optimal solution for smart building energy management.

  19. Optimization decoding algorithm for LDPC codes

    Institute of Scientific and Technical Information of China (English)

    林志国; 彭卫东; 林晋福; 檀蕊莲; 宋晓鸥

    2014-01-01

    An improved optimization decoding algorithm is proposed for binary low-density parity-check (LDPC) codes under discrete Gaussian channels. First, through theoretical analysis and mathematical derivation, a mathematical model of the decoding problem is constructed. Then, an optimization decoding algorithm for the model is demonstrated. Finally, simulations of decoding performance and efficiency were carried out on the VC6.0 platform and compared with other algorithms. The results show that the new algorithm outperforms its predecessor in both bit-error rate and decoding efficiency, and also outperforms the common min-sum decoding algorithm in bit-error rate. The simulation results are in good agreement with the theoretical analysis.
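
The min-sum algorithm used as the comparison baseline above is a standard message-passing decoder. Below is a compact, self-contained sketch; for brevity it uses the (7,4) Hamming parity-check matrix, which is not literally low-density, but the variable/check update rules are identical to those used for LDPC codes.

```python
def min_sum_decode(H, llr, iters=20):
    """Min-sum decoding for a binary linear code with parity-check matrix H
    (list of 0/1 rows). llr: channel log-likelihood ratios, positive values
    favour bit 0. Returns the hard-decision bits."""
    m, n = len(H), len(llr)
    rows = [[j for j in range(n) if H[i][j]] for i in range(m)]
    msg = [[0.0] * n for _ in range(m)]        # check -> variable messages
    bits = [0 if l >= 0 else 1 for l in llr]
    for _ in range(iters):
        total = [llr[j] + sum(msg[i][j] for i in range(m) if H[i][j])
                 for j in range(n)]
        new = [[0.0] * n for _ in range(m)]
        for i in range(m):
            for j in rows[i]:
                # extrinsic messages from the other variables in this check:
                # min of magnitudes, product of signs
                others = [total[k] - msg[i][k] for k in rows[i] if k != j]
                sign = 1.0
                for x in others:
                    if x < 0:
                        sign = -sign
                new[i][j] = sign * min(abs(x) for x in others)
        msg = new
        total = [llr[j] + sum(msg[i][j] for i in range(m) if H[i][j])
                 for j in range(n)]
        bits = [0 if t >= 0 else 1 for t in total]
        if all(sum(bits[j] for j in rows[i]) % 2 == 0 for i in range(m)):
            break                              # all parity checks satisfied
    return bits

# (7,4) Hamming parity-check matrix, used here purely as a small test code
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]
```

For the all-zero codeword with one bit received unreliably (a negative LLR), the decoder corrects the error in a single iteration and stops when the syndrome is zero.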

  20. User's manual for the BNW-I optimization code for dry-cooled power plants. Volume III. [PLCIRI

    Energy Technology Data Exchange (ETDEWEB)

    Braun, D.J.; Daniel, D.J.; De Mier, W.V.; Faletti, D.W.; Wiles, L.E.

    1977-01-01

    This appendix to the User's Manual for the BNW-1 Optimization Code for Dry-Cooled Power Plants provides a listing of the BNW-I optimization code for determining, for a power plant of a particular size, the optimum dry cooling tower design using a plastic tube cooling surface and a circular tower arrangement of the tube bundles. (LCL)

  1. Insertion of operation-and-indicate instructions for optimized SIMD code

    Science.gov (United States)

    Eichenberger, Alexander E; Gara, Alan; Gschwind, Michael K

    2013-06-04

    Mechanisms are provided for inserting indicated instructions for tracking and indicating exceptions in the execution of vectorized code. A portion of first code is received for compilation. The portion of first code is analyzed to identify non-speculative instructions performing designated non-speculative operations in the first code that are candidates for replacement by replacement operation-and-indicate instructions that perform the designated non-speculative operations and further perform an indication operation for indicating any exception conditions corresponding to special exception values present in vector register inputs to the replacement operation-and-indicate instructions. The replacement is performed and second code is generated based on the replacement of the at least one non-speculative instruction. The data processing system executing the compiled code is configured to store special exception values in vector output registers, in response to a speculative instruction generating an exception condition, without initiating exception handling.

  2. A Distributed Flow Rate Control Algorithm for Networked Agent System with Multiple Coding Rates to Optimize Multimedia Data Transmission

    Directory of Open Access Journals (Sweden)

    Shuai Zeng

    2013-01-01

    Full Text Available With the development of wireless technologies, mobile communication is applied ever more extensively in all walks of life. The social network of both fixed and mobile users can be seen as a networked agent system. At present, many kinds of devices and access network technologies are in wide use, and different users in this networked agent system may need multimedia data at different coding rates due to their heterogeneous demands. This paper proposes a distributed flow rate control algorithm to optimize multimedia data transmission in a networked agent system in which various coding rates coexist. In the proposed algorithm, the transmission paths and upload bandwidth for data of different coding rates between the source node and the fixed and mobile nodes are appropriately arranged and controlled. On the one hand, the algorithm provides user nodes with data at differentiated coding rates and the corresponding flow rates. On the other hand, it networks the different coding rate data and user nodes, realizing the sharing of upload bandwidth among user nodes that require data at different coding rates. The study conducts mathematical modeling of the proposed algorithm and compares the system that adopts it with the existing system through simulation experiments and mathematical analysis. The results show that the system adopting the proposed algorithm achieves higher upload bandwidth utilization at user nodes and lower upload bandwidth consumption at the source node.

  3. Optimal management of large scale aquifers under uncertainty

    Science.gov (United States)

    Ghorbanidehno, H.; Kokkinaki, A.; Kitanidis, P. K.; Darve, E. F.

    2016-12-01

    Water resources systems, and especially groundwater reservoirs, are a valuable resource that is often being endangered by contamination and over-exploitation. Optimal control techniques can be applied for groundwater management to ensure the long-term sustainability of this vulnerable resource. Linear Quadratic Gaussian (LQG) control is an optimal control method that combines a Kalman filter for real time estimation with a linear quadratic regulator for dynamic optimization. The LQG controller can be used to determine the optimal controls (e.g. pumping schedule) upon receiving feedback about the system from incomplete noisy measurements. However, applying LQG control for systems of large dimension is computationally expensive. This work presents the Spectral Linear Quadratic Gaussian (SpecLQG) control, a new fast LQG controller that can be used for large scale problems. SpecLQG control combines the Spectral Kalman filter, which is a fast Kalman filter algorithm, with an efficient low rank LQR, and provides a practical approach for combined monitoring, parameter estimation, uncertainty quantification and optimal control for linear and weakly non-linear systems. The computational cost of SpecLQG controller scales linearly with the number of unknowns, a great improvement compared to the quadratic cost of basic LQG. We demonstrate the accuracy and computational efficiency of SpecLQG control using two applications: first, a linear validation case for pumping schedule management in a small homogeneous confined aquifer; and second, a larger scale nonlinear case with unknown heterogeneities in aquifer properties and boundary conditions.
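The LQG structure described above (a Kalman filter feeding a linear quadratic regulator) can be sketched for a toy scalar "aquifer" model. The dynamics, weights and noise levels below are illustrative assumptions, not the paper's SpecLQG implementation, which targets large-scale systems:

```python
import numpy as np

# Hypothetical scalar model: head deviation x evolves as x' = a*x + b*u + w,
# observed as y = x + v, where u is a pumping adjustment.
a, b = 0.95, 0.1           # assumed dynamics
q, r = 1.0, 0.5            # LQR state/control weights
w_var, v_var = 0.01, 0.04  # process / measurement noise variances

# LQR gain from the discrete algebraic Riccati equation (fixed-point iteration)
P = q
for _ in range(500):
    P = q + a * P * a - (a * P * b) ** 2 / (r + b * P * b)
K = (a * P * b) / (r + b * P * b)    # control law u = -K * x_hat

# Steady-state Kalman gain: the same Riccati structure on the estimation side
S = w_var
for _ in range(500):
    S = a * S * a + w_var - (a * S) ** 2 / (S + v_var)
L = S / (S + v_var)

rng = np.random.default_rng(0)
x, x_hat = 5.0, 0.0                  # true state starts far from target
for _ in range(200):
    u = -K * x_hat                   # certainty-equivalent control
    x = a * x + b * u + rng.normal(0, np.sqrt(w_var))
    y = x + rng.normal(0, np.sqrt(v_var))
    x_pred = a * x_hat + b * u       # predict
    x_hat = x_pred + L * (y - x_pred)  # correct with the noisy measurement

print("final head deviation:", round(x, 3))
```

The basic LQG cost of solving these Riccati equations grows quadratically (and worse) with the state dimension, which is the bottleneck the SpecLQG controller's linear scaling addresses.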

  4. Genetic Searching Algorithm for Optimal Runlength-Limited Codes with Error Control

    Institute of Scientific and Technical Information of China (English)

Ren Qingsheng; Ye Zhongxing

    1997-01-01

A genetic searching algorithm is presented to construct arbitrarily concatenatable block codes with runlength (d,k) constraints. The code also has the ability to correct errors during decoding. A similar eliminating operator and an anti-symbiotic operator are suggested to improve the efficiency of the algorithm.
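The runlength constraint such a search must enforce can be sketched as follows; the (d,k) values, word length, and the greedy selection standing in for the genetic search are all illustrative assumptions:

```python
from itertools import product

def zero_runs(bits):
    """Lengths of maximal zero runs: before the first 1, between 1s, after the last 1."""
    runs, n = [], 0
    for b in bits:
        if b == 0:
            n += 1
        else:
            runs.append(n)
            n = 0
    runs.append(n)
    return runs

def dk_valid(bits, d, k):
    """Interior zero runs must lie in [d, k]; end runs are only bounded by k,
    since they merge with a neighbour's end run on concatenation."""
    runs = zero_runs(bits)
    inner, head, tail = runs[1:-1], runs[0], runs[-1]
    return all(d <= r <= k for r in inner) and head <= k and tail <= k

def concatenatable(codebook, d, k):
    """Arbitrary concatenatability: every pairwise concatenation (including
    a word with itself) must remain (d,k)-valid."""
    return all(dk_valid(u + v, d, k) for u in codebook for v in codebook)

# Enumerate length-6 words for (d,k) = (1,3); a GA would search the same
# space with selection, crossover and mutation instead of enumeration.
d, k, L = 1, 3, 6
words = [w for w in product([0, 1], repeat=L) if dk_valid(w, d, k)]
codebook = []
for w in words:                      # greedy stand-in for the genetic search
    if concatenatable(codebook + [w], d, k):
        codebook.append(w)
print(len(codebook), codebook[:3])
```

A GA fitness function for this problem would typically count constraint violations (and decoding errors) so that valid, concatenatable codebooks score highest.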

  5. Code-Switching and the Optimal Grammar of Bilingual Language Use

    Science.gov (United States)

    Bhatt, Rakesh M.; Bolonyai, Agnes

    2011-01-01

    In this article, we provide a framework of bilingual grammar that offers a theoretical understanding of the socio-cognitive bases of code-switching in terms of five general principles that, individually or through interaction with each other, explain how and why specific instances of code-switching arise. We provide cross-linguistic empirical…

  6. Optimal control theory applications to management science and economics

    CERN Document Server

    Sethi, Suresh P

    2006-01-01

    Optimal control methods are used to determine the best ways to control a dynamic system. This book applies theoretical work to business management problems developed from the authors' research and classroom instruction. The thoroughly revised new edition has been refined with careful attention to the text and graphic material presentation. Chapters cover a range of topics including finance, production and inventory problems, marketing problems, machine maintenance and replacement, problems of optimal consumption of natural resources, and applications of control theory to economics. The book in

  7. Driving external chemistry optimization via operations management principles.

    Science.gov (United States)

    Bi, F Christopher; Frost, Heather N; Ling, Xiaolan; Perry, David A; Sakata, Sylvie K; Bailey, Simon; Fobian, Yvette M; Sloan, Leslie; Wood, Anthony

    2014-03-01

Confronted with the need to significantly raise the productivity of remotely located chemistry CROs, Pfizer embraced a commitment to continuous improvement that leveraged tools from both Lean Six Sigma and queue management theory to deliver positive, measurable outcomes. During 2012, cycle times were reduced by 48% by optimizing the work in progress and conducting a detailed workflow analysis to identify and address pinch points. Compound flow was increased by 29% by optimizing the request process and de-risking the chemistry. Underpinning both achievements was the development of close working relationships and productive communication between Pfizer and CRO chemists.

  8. Optimal savings management for individuals with defined contribution pension plans

    DEFF Research Database (Denmark)

    Konicz, Agnieszka Karolina; Mulvey, John M.

    2015-01-01

    The paper provides some guidelines to individuals with defined contribution (DC) pension plans on how to manage pension savings both before and after retirement. We argue that decisions regarding investment, annuity payments, and the size of death sum should not only depend on the individual’s age...... characterizing the individual. The problem is solved via a model that combines two optimization approaches: stochastic optimal control and multi-stage stochastic programming. The first method is common in financial and actuarial literature, but produces theoretical results. However, the latter, which...

  9. The Helioseismic and Magnetic Imager (HMI) Vector Magnetic Field Pipeline: Optimization of the Spectral Line Inversion Code

    CERN Document Server

    Centeno, R; Hayashi, K; Norton, A; Hoeksema, J T; Liu, Y; Leka, K D; Barnes, G

    2014-01-01

The Very Fast Inversion of the Stokes Vector (VFISV) is a Milne-Eddington spectral line inversion code used to determine the magnetic and thermodynamic parameters of the solar photosphere from observations of the Stokes vector in the 6173 Å Fe I line by the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO). We report on the modifications made to the original VFISV inversion code in order to optimize its operation within the HMI data pipeline and provide the smoothest solution in active regions. The changes either sped up the computation or reduced the frequency with which the algorithm failed to converge to a satisfactory solution. Additionally, coding bugs that were detected and fixed in the original VFISV release are reported here.

  10. User's manual for the BNW-II optimization code for dry/wet-cooled power plants

    Energy Technology Data Exchange (ETDEWEB)

    Braun, D.J.; Bamberger, J.A.; Braun, D.J.; Faletti, D.W.; Wiles, L.E.

    1978-05-01

The User's Manual describes how to operate BNW-II, a computer code developed by the Pacific Northwest Laboratory (PNL) as a part of its activities under the Department of Energy (DOE) Dry Cooling Enhancement Program. The computer program offers a comprehensive method of evaluating the cost savings potential of dry/wet-cooled heat rejection systems. Going beyond simple "figure-of-merit" cooling tower optimization, this method includes such items as the cost of annual replacement capacity, and the optimum split between plant scale-up and replacement capacity, as well as the purchase and operating costs of all major heat rejection components. Hence the BNW-II code is a useful tool for determining potential cost savings of new dry/wet surfaces, new piping, or other components as part of an optimized system for a dry/wet-cooled plant.

  11. Optimization of an Electromagnetics Code with Multicore Wavefront Diamond Blocking and Multi-dimensional Intra-Tile Parallelization

    KAUST Repository

    Malas, Tareq M.

    2016-07-21

Understanding and optimizing the properties of solar cells is becoming a key issue in the search for alternatives to nuclear and fossil energy sources. A theoretical analysis via numerical simulations involves solving Maxwell's Equations in discretized form and typically requires substantial computing effort. We start from a hybrid-parallel (MPI+OpenMP) production code that implements the Time Harmonic Inverse Iteration Method (THIIM) with Finite-Difference Frequency Domain (FDFD) discretization. Although this algorithm has the characteristics of a strongly bandwidth-bound stencil update scheme, it is significantly different from the popular stencil types that have been exhaustively studied in the high performance computing literature to date. We apply a recently developed stencil optimization technique, multicore wavefront diamond tiling with multi-dimensional cache block sharing, and describe in detail the peculiarities that need to be considered due to the special stencil structure. Concurrency in updating the components of the electric and magnetic fields provides an additional level of parallelism. The dependence of the cache size requirement of the optimized code on the blocking parameters is modeled accurately, and an auto-tuner searches for optimal configurations in the remaining parameter space. We were able to completely decouple the execution from the memory bandwidth bottleneck, accelerating the implementation by a factor of three to four compared to an optimal implementation with pure spatial blocking on an 18-core Intel Haswell CPU.
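The effect of spatial cache blocking (the baseline the paper improves on with wavefront diamond tiling) can be illustrated on a simple 5-point Jacobi stencil. The grid and block sizes below are arbitrary; the point is that the tiled sweep visits the interior tile by tile while producing identical results:

```python
import numpy as np

def jacobi_naive(a):
    """One 5-point Jacobi sweep over the interior (reference version)."""
    out = a.copy()
    out[1:-1, 1:-1] = 0.25 * (a[:-2, 1:-1] + a[2:, 1:-1] +
                              a[1:-1, :-2] + a[1:-1, 2:])
    return out

def jacobi_blocked(a, bs=32):
    """Same sweep with spatial blocking: the interior is updated tile by
    tile so each tile's working set stays cache resident. Wavefront diamond
    tiling additionally blocks the time dimension across sweeps."""
    out = a.copy()
    n, m = a.shape
    for i0 in range(1, n - 1, bs):
        for j0 in range(1, m - 1, bs):
            i1, j1 = min(i0 + bs, n - 1), min(j0 + bs, m - 1)
            out[i0:i1, j0:j1] = 0.25 * (a[i0-1:i1-1, j0:j1] + a[i0+1:i1+1, j0:j1] +
                                        a[i0:i1, j0-1:j1-1] + a[i0:i1, j0+1:j1+1])
    return out

rng = np.random.default_rng(1)
grid = rng.random((130, 130))
print(np.allclose(jacobi_naive(grid), jacobi_blocked(grid)))  # True
```

Because Jacobi reads only from the old array, the tile traversal order does not affect the result; the THIIM/FDFD stencil in the paper has extra coupled field components, which is what makes its tiling non-standard.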

  12. Data Mining for Secure Software Engineering – Source Code Management Tool Case Study

    Directory of Open Access Journals (Sweden)

    A.V.Krishna Prasad,

    2010-07-01

Full Text Available As data mining for secure software engineering improves software productivity and quality, software engineers are increasingly applying data mining algorithms to various software engineering tasks. However, mining software engineering data poses several challenges, requiring various algorithms to effectively mine the sequences, graphs, and text found in such data. Software engineering data includes code bases, execution traces, historical code changes, mailing lists, and bug databases, which contain a wealth of information about a project's status, progress, and evolution. Using well-established data mining techniques, practitioners and researchers can explore the potential of this valuable data in order to better manage their projects and produce higher-quality software systems that are delivered on time and within budget. Data mining can be used for gathering and extracting latent security requirements, extracting algorithms and business rules from code, mining legacy applications for requirements and business rules for new projects, and so on. Mining algorithms for software engineering fall into four main categories: frequent pattern mining (finding commonly occurring patterns); pattern matching (finding data instances for given patterns); clustering (grouping data into clusters); and classification (predicting labels of data based on already labeled data). In this paper, we discuss an overview of strategies for data mining for secure software engineering, together with the implementation of a case study of text mining for a source code management tool.
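The first of the four categories, frequent pattern mining, can be sketched on a toy commit history (the file names are hypothetical): itemsets of files that change together at least `min_support` times are reported.

```python
from collections import Counter
from itertools import combinations

# Hypothetical commit history: each commit is the set of files changed together.
commits = [
    {"auth.c", "auth.h", "session.c"},
    {"auth.c", "auth.h"},
    {"parser.c", "lexer.c"},
    {"auth.c", "auth.h", "crypto.c"},
    {"parser.c", "lexer.c", "ast.c"},
]

def frequent_pairs(commits, min_support=2):
    """Count co-changed file pairs and keep those meeting the support threshold."""
    counts = Counter()
    for files in commits:
        for pair in combinations(sorted(files), 2):
            counts[pair] += 1
    return {p: c for p, c in counts.items() if c >= min_support}

print(frequent_pairs(commits))
# {('auth.c', 'auth.h'): 3, ('lexer.c', 'parser.c'): 2}
```

Pairs like these are what a source code management mining tool surfaces as evolutionary coupling: files that historically change together and so should be reviewed together.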

  13. CPT coding patterns at nurse-managed health centers: data from a national survey.

    Science.gov (United States)

    Vonderheid, Susan C; Pohl, Joanne M; Tanner, Clare; Newland, Jamesetta A; Gans, Dave N

    2009-01-01

Nurse-managed health centers (NMHCs) play an important role in delivering health care services to a wide range of communities and often serve as our nation's safety net providers. Unfortunately, NMHCs struggle to remain in business for a variety of reasons, including underdeveloped business practices. Until now, NMHCs had only data from the Centers for Medicare and Medicaid Services and the Medical Group Management Association for comparison with coding patterns in individual centers. This article is the first published report of national data for NMHCs that is available for comparison. Providers need to possess financial acumen to remain open for business. Assessment of CPT coding patterns is a key strategy to support long-term sustainability.

  14. Folded Codes from Function Field Towers and Improved Optimal Rate List Decoding

    CERN Document Server

    Guruswami, Venkatesan

    2012-01-01

We give a new construction of algebraic codes which are efficiently list decodable from a fraction $1-R-\eps$ of adversarial errors where $R$ is the rate of the code, for any desired positive constant $\eps$. The worst-case list size output by the algorithm is $O(1/\eps)$, matching the existential bound for random codes up to constant factors. Further, the alphabet size of the codes is a constant depending only on $\eps$ - it can be made $\exp(\tilde{O}(1/\eps^2))$ which is not much worse than the lower bound of $\exp(\Omega(1/\eps))$. The parameters we achieve are thus quite close to the existential bounds in all three aspects - error-correction radius, alphabet size, and list-size - simultaneously. Our code construction is Monte Carlo and has the claimed list decoding property with high probability. Once the code is (efficiently) sampled, the encoding/decoding algorithms are deterministic with a running time $O_\eps(N^c)$ for an absolute constant $c$, where $N$ is the code's block length. Our construction i...

  15. Portable parallel portfolio optimization in the Aurora Financial Management System

    Science.gov (United States)

    Laure, Erwin; Moritsch, Hans

    2001-07-01

Financial planning problems are formulated as large-scale, stochastic, multiperiod, tree-structured optimization problems. An efficient technique for solving this kind of problem is the nested Benders decomposition method. In this paper we present a parallel, portable, asynchronous implementation of this technique. To achieve our portability goals we chose the programming language Java for our implementation and used a high-level Java-based framework, called OpusJava, for expressing the parallelism potential as well as synchronization constraints. Our implementation is embedded within a modular decision support tool for portfolio and asset liability management, the Aurora Financial Management System.

  16. The Value of Methodical Management: Optimizing Science Results

    Science.gov (United States)

    Saby, Linnea

    2016-01-01

    As science progresses, making new discoveries in radio astronomy becomes increasingly complex. Instrumentation must be incredibly fine-tuned and well-understood, scientists must consider the skills and schedules of large research teams, and inter-organizational projects sometimes require coordination between observatories around the globe. Structured and methodical management allows scientists to work more effectively in this environment and leads to optimal science output. This report outlines the principles of methodical project management in general, and describes how those principles are applied at the National Radio Astronomy Observatory (NRAO) in Charlottesville, Virginia.

  17. Optimization of the customer capital management system of the enterprise

    Directory of Open Access Journals (Sweden)

    Ie.O. Golysheva

    2013-12-01

Full Text Available The aim of the article. The aim of the article is to develop a methodology for optimizing the customer capital management system of enterprises. The results of the analysis. The article presents an optimization of the customer capital management system based on the author's methodology for evaluating customer capital, which defines integral indexes for the resource and potential components of customer capital. The author assigns the system of business relationships with economic contractors, information about economic contractors and the history of relations with them, and the company's trademarks to the resource component, and the distribution system, communication system and image of the company to the potential component of customer capital. Accordingly, the state of customer capital management is improved through a complex of strategic actions to switch positions on the «resource-potential» matrix. Increasing the resource base and potential of customer capital strengthens the competitive position of the company and increases the efficiency of its activities. However, raising the level of customer capital requires spending on management actions, so it is necessary to determine the optimal value of the result and the costs needed to produce it. The article presents a graphical interpretation of how the costs of increasing customer capital and the results of enterprise activity depend on the state of its customer capital. Movement to the next quadrant of the «resource-potential» matrix is made in the direction of «up and right»; it is linked to the achievement of results and additional costs, a tentative list of which is given in the article. Each transition is associated with a certain level of results and costs, and the variant with the maximum ratio between them should be chosen. Thus, in the paper an algorithm has been developed that takes into account all possible transitions and

  18. Efficient and effective compound management to support lead optimization.

    Science.gov (United States)

    Johnson, Chad W; Chatterjee, Moneesh; Kubala, Steve; Helm, David; Houston, John; Banks, Martyn

    2009-06-01

    The introduction of lean thinking and Six Sigma methodologies into the drug discovery process has become an important approach for ensuring efficient workflows while containing costs. For the compound management department at Bristol-Myers Squibb, this has resulted in a partnership with the research community to evaluate and streamline processes to enable cost-disciplined science. The authors describe the results of Lean Six Sigma approaches in the automation and informatics environment that have been optimized to support parallel processing of compounds. This new platform facilitates the rapid and simultaneous data generation from structure activity and structure liability assays. As a result of these compound management improvements, reduction of timelines and quicker decision making has been achieved in the lead optimization process.

  19. Brain natriuretic peptide and optimal management of heart failure

    Institute of Scientific and Technical Information of China (English)

    LI Nan; WANG Jian-an

    2005-01-01

Aside from the important role of brain natriuretic peptide (BNP) in the diagnosis and differential diagnosis of heart failure, this biological peptide has proved to be an independent surrogate marker of rehospitalization and death from this fatal disease. Several randomized clinical trials demonstrated that drugs such as beta blockers, angiotensin-converting enzyme inhibitors, spironolactone and amiodarone have beneficial effects in decreasing the circulating BNP level during the management of chronic heart failure. Optimizing clinical decision-making calls for a representative surrogate marker of heart failure prognosis. Serial point-of-care assessments of BNP concentration provide a therapeutic goal for clinical multi-therapy and objective guidance for the optimal treatment of heart failure. Nevertheless, new questions and problems in this area remain to be clarified. On the basis of current research advances, this article gives an overview of the BNP peptide, its properties, and its role in the management of heart failure.

  20. Utility Optimal Coding for Packet Transmission over Wireless Networks - Part II: Networks of Packet Erasure Channels

    CERN Document Server

    Karumbu, Premkumar; Leith, Douglas J

    2011-01-01

We define a class of multi-hop erasure networks that approximates a wireless multi-hop network. The network carries unicast flows for multiple users, and each information packet within a flow is required to be decoded at the flow destination within a specified delay deadline. The allocation of coding rates amongst flows/users is constrained by network capacity. We propose a proportional fair transmission scheme that maximises the sum utility of flow throughputs. This is achieved by jointly optimising the packet coding rates and the allocation of bits of coded packets across transmission slots.
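A minimal sketch of the proportional fair objective itself (maximising the sum of log throughputs), assuming a single shared capacity and per-flow caps; the paper's scheme additionally optimises packet coding rates and per-slot bit allocation, which this sketch omits:

```python
def proportional_fair(capacity, caps):
    """Maximise sum(log r_i) s.t. sum(r_i) <= capacity and 0 < r_i <= caps[i].
    By the KKT conditions, every user not at its cap receives the same rate,
    so we peel off capped users until the equal share fits everyone left."""
    alloc = {}
    remaining = dict(enumerate(caps))
    budget = capacity
    while remaining:
        share = budget / len(remaining)
        capped = {i: c for i, c in remaining.items() if c <= share}
        if not capped:                 # equal share fits all remaining users
            alloc.update((i, share) for i in remaining)
            break
        alloc.update(capped)           # capped users take their maximum
        budget -= sum(capped.values())
        for i in capped:
            del remaining[i]
    return [alloc[i] for i in range(len(caps))]

print(proportional_fair(10.0, [1.0, 2.0, 8.0, 8.0]))  # [1.0, 2.0, 3.5, 3.5]
```

The log utility is what makes the scheme "proportional fair": no user's rate can be raised without lowering others' rates by a proportionally larger total amount.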

  1. Identifying factors affecting optimal management of agricultural water

    Directory of Open Access Journals (Sweden)

    Masoud Samian

    2015-01-01

In addition to quantitative methodologies such as descriptive statistics and factor analysis, a qualitative methodology was employed for dynamic simulation among variables through Vensim software. In this study, the factor analysis technique was applied using the Kaiser-Meyer-Olkin (KMO) and Bartlett tests. From the results, four key elements were identified as factors affecting the optimal management of agricultural water in the Hamedan area: institutional and legal factors, technical and knowledge factors, economic factors, and social factors.

  2. The Importance of Supply Chain Management on Financial Optimization

    Directory of Open Access Journals (Sweden)

    Arawati Agus

    2013-01-01

Full Text Available Many manufacturing companies are facing uncertainties and stiff competition both locally and globally, intensified by increasing needs for sophisticated and high-value products from demanding customers. These companies are forced to improve the quality of their supply chain management decisions and products and to reduce their manufacturing costs. With today's volatile and very challenging global market, many manufacturing companies have started to realize the importance of properly managing their supply chains. Supply chain management (SCM) involves practices such as strategic supplier partnership, customer focus, lean production, the postponement concept, and technology and innovation. This study investigates the importance of SCM for financial optimization. The study measures production and SCM managers' perceptions regarding SCM and the level of performance in their companies. The paper also specifically investigates whether supply chain performance acts as a mediating variable in the relationship between SCM and financial optimization. These associations were analyzed through statistical methods such as Pearson's correlation and a regression-based mediation analysis. The findings suggest that SCM has significant correlations with supply chain performance and financial optimization. In addition, the result of the regression-based mediation analysis demonstrates that supply chain performance mediates the linkage between SCM and financial optimization. The findings provide a striking demonstration of the importance of SCM in enhancing the performance of Malaysian manufacturing companies, and indicate that manufacturing companies should emphasize greater management support for SCM implementation and greater attention to production integration and information flow integration in the manufacturing system in order to maximize profit and minimize cost.
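A regression-based mediation analysis of the kind described can be sketched on synthetic data (the variable names and effect sizes below are made up, not the study's estimates). For OLS, the total effect decomposes exactly into the direct effect plus the indirect effect a·b through the mediator:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
scm  = rng.normal(size=n)                            # SCM practices score (synthetic)
perf = 0.7 * scm + rng.normal(scale=0.5, size=n)     # supply chain performance (mediator)
fin  = 0.4 * perf + 0.1 * scm + rng.normal(scale=0.5, size=n)  # financial outcome

def ols(y, *xs):
    """OLS slopes via least squares; the intercept is fitted then dropped."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

(c,)   = ols(fin, scm)         # total effect      X -> Y
(a,)   = ols(perf, scm)        # path a            X -> M
c_p, b = ols(fin, scm, perf)   # direct effect c' and path b, controlling for M

indirect = a * b               # effect transmitted through the mediator
print(f"total={c:.3f} direct={c_p:.3f} indirect={indirect:.3f}")
# For OLS on the same sample, total = direct + indirect holds exactly.
```

A substantial indirect effect alongside a shrunken direct effect is the regression signature of mediation that the study reports for supply chain performance.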

  3. TECHNIQUE OF OPTIMAL AUDIT PLANNING FOR INFORMATION SECURITY MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    F. N. Shago

    2014-03-01

Full Text Available The growing complexity of information security management systems (ISMS) makes it necessary to improve the scientific and methodological apparatus for auditing these systems. Planning is an important and determining part of an ISMS audit, and the efficiency of an audit is defined by the relation of the achieved quality indicators to the resources spent. Thus, developing methods and techniques for optimizing audit planning, making it possible to increase its effectiveness, is an important and urgent task. The proposed technique makes it possible to distribute planning time and material resources optimally over the audit stages on the basis of a dynamics model for ISMS quality. A special feature of the proposed approach is the use of both a priori and a posteriori data for initial audit planning, as well as adjustment of the plan after each audit event. This makes it possible to optimize the use of audit resources in accordance with the selected criteria. Application examples of the technique are given for planning an audit of an organization's information security management system. The results of a computational experiment based on the proposed technique showed that audit time (cost) can be reduced by 10-15% and, consequently, the quality assessments obtained through audit resource allocation can be improved with respect to well-known audit planning methods.

  4. CONGESTION MANAGEMENT IN DEREGULATED POWER SYSTEMS USING REAL CODED GENETIC ALGORITHM

    Directory of Open Access Journals (Sweden)

    Sujatha Balaraman

    2010-11-01

Full Text Available In this paper, an efficient method is proposed for transmission line overload alleviation in deregulated power systems using a real-coded genetic algorithm (RCGA). For secure operation of the power system, the network loading has to be maintained within specified limits. Transmission line congestion initiates cascading outages which force the system to collapse. Accurate prediction and alleviation of line overloads is the appropriate corrective action to avoid network collapse. In this paper, an attempt is made to explore the use of a real-coded genetic algorithm to find the optimal generation rescheduling for relieving congestion. The effectiveness of the proposed algorithm has been analyzed on the IEEE 30-bus test system. The results obtained by the proposed method are found to be quite encouraging when compared with simulated annealing (SA), and hence the method will be useful in electrical restructuring.

  5. Optimal Repair of MDS Codes in Distributed Storage via Subspace Interference Alignment

    CERN Document Server

    Cadambe, Viveck R; Jafar, Syed A; Li, Jin

    2011-01-01

It is well known that an (n,k) code can be used to store 'k' units of information in 'n' unit-capacity disks of a distributed data storage system. If the code used is maximum distance separable (MDS), then the system can tolerate any (n-k) disk failures, since the original information can be recovered from any k surviving disks. The focus of this paper is the design of a systematic MDS code with the additional property that a single disk failure can be repaired with minimum repair bandwidth, i.e., with the minimum possible amount of data to be downloaded for recovery of the failed disk. Previously, a lower bound of (n-1)/(n-k) units on the repair bandwidth for a single disk failure in an (n,k) MDS code has been established by Dimakis et al. Recently, the existence of asymptotic codes achieving this lower bound for arbitrary (n,k) has been established by drawing connections to interference alignment. While the existence of asymptotic constructions achieving this lower bound has been shown, finite code cons...
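The "any k surviving disks" property of an MDS code can be illustrated with a toy (n=4, k=2) polynomial-evaluation code over a small prime field (a stand-in for intuition, not the paper's construction), together with the quoted repair bandwidth lower bound:

```python
# Toy (n=4, k=2) MDS code over GF(7): store p(x) = m0 + m1*x at x = 0..3.
# Any k = 2 surviving "disks" determine the line, hence the data.
P = 7  # prime field modulus

def encode(m0, m1):
    return [(m0 + m1 * x) % P for x in range(4)]

def recover(shares):
    """Lagrange interpolation mod P from any two (x, y) shares."""
    (x0, y0), (x1, y1) = shares
    inv = pow((x0 - x1) % P, P - 2, P)   # modular inverse via Fermat's little theorem
    m1 = (y0 - y1) * inv % P             # slope
    m0 = (y0 - m1 * x0) % P              # intercept
    return m0, m1

disks = encode(3, 5)
for fail in range(4):                    # lose any single disk
    survivors = [(x, y) for x, y in enumerate(disks) if x != fail]
    assert recover(survivors[:2]) == (3, 5)  # any 2 survivors suffice

n, k = 4, 2
print("repair bandwidth lower bound:", (n - 1) / (n - k), "units")  # 1.5
```

Note that naive repair as above downloads k = 2 full units to rebuild one disk; the point of the repair bandwidth bound is that cleverer codes can download only 1.5 units here.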

  6. A binary mixed integer coded genetic algorithm for multi-objective optimization of nuclear research reactor fuel reloading

    Energy Technology Data Exchange (ETDEWEB)

    Binh, Do Quang [University of Technical Education Ho Chi Minh City (Viet Nam); Huy, Ngo Quang [University of Industry Ho Chi Minh City (Viet Nam); Hai, Nguyen Hoang [Centre for Research and Development of Radiation Technology, Ho Chi Minh City (Viet Nam)

    2014-12-15

    This paper presents a new approach based on a binary mixed integer coded genetic algorithm in conjunction with the weighted sum method for multi-objective optimization of fuel loading patterns for nuclear research reactors. The proposed genetic algorithm works with two types of chromosomes: binary and integer chromosomes, and consists of two types of genetic operators: one working on binary chromosomes and the other working on integer chromosomes. The algorithm automatically searches for the most suitable weighting factors of the weighting function and the optimal fuel loading patterns in the search process. Illustrative calculations are implemented for a research reactor type TRIGA MARK II loaded with the Russian VVR-M2 fuels. Results show that the proposed genetic algorithm can successfully search for both the best weighting factors and a set of approximate optimal loading patterns that maximize the effective multiplication factor and minimize the power peaking factor while satisfying operational and safety constraints for the research reactor.

  7. Optimal design of FIR high pass filter based on L1 error approximation using real coded genetic algorithm

    Directory of Open Access Journals (Sweden)

    Apoorva Aggarwal

    2015-12-01

Full Text Available In this paper, an optimal design of a linear-phase digital finite impulse response (FIR) highpass (HP) filter using the L1-norm-based real-coded genetic algorithm (RCGA) is investigated. A novel fitness function based on the L1 norm is adopted to enhance the design accuracy. Optimized filter coefficients are obtained by defining the filter objective function in the L1 sense using the RCGA. Simulation analysis reveals that the RCGA adopting this fitness function performs better in terms of the signal attenuation ability of the filter, a flatter passband, and the convergence rate, and shows substantial percentage improvements over the gradient-based L1 optimization approach on various factors. It is concluded that the RCGA leads to the best solution under the specified parameters for the FIR filter design, at the cost of a slightly (and barely noticeable) higher transition width.
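A minimal real-coded GA for this design problem might look as follows; the tap count, band edges, population sizes and mutation scale are illustrative assumptions, not the paper's settings:

```python
import numpy as np

# Chromosome = the 6 free coefficients [h[5], h[4], ..., h[0]] of an 11-tap
# symmetric (type-I linear-phase) impulse response; fitness = L1 error
# against an ideal highpass response on a frequency grid.
rng = np.random.default_rng(7)
M = 5
w = np.linspace(0, np.pi, 256)
stop, pas = w <= 0.4 * np.pi, w >= 0.6 * np.pi   # transition band excluded
ideal = np.where(pas, 1.0, 0.0)
band = stop | pas

def amplitude(c):
    # A(w) = h[M] + 2 * sum_n h[M-n] * cos(n*w), with c = [h[M], ..., h[0]]
    return c[0] + 2 * sum(c[n] * np.cos(n * w) for n in range(1, M + 1))

def l1_error(c):
    return np.abs(amplitude(c) - ideal)[band].sum()

pop = rng.uniform(-0.5, 0.5, size=(60, M + 1))
for _ in range(150):
    err = np.array([l1_error(c) for c in pop])
    elite = pop[np.argsort(err)[:20]]            # truncation selection
    # blend crossover between random elite parents + Gaussian mutation
    pa = elite[rng.integers(20, size=40)]
    pb = elite[rng.integers(20, size=40)]
    t = rng.uniform(size=(40, 1))
    children = t * pa + (1 - t) * pb + rng.normal(0, 0.02, size=(40, M + 1))
    pop = np.vstack([elite, children])           # elitism keeps the best so far

best = pop[np.argmin([l1_error(c) for c in pop])]
print("final L1 error:", round(l1_error(best), 3))
```

Because the amplitude response is linear in the coefficients, the L1 objective is convex, so even this simple evolutionary loop descends reliably; the paper's contribution is the L1 fitness itself versus conventional L2-based designs.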

  8. Optimal allocation of watershed management cost among different water users

    Institute of Scientific and Technical Information of China (English)

    Wang Zanxin; Margaret M.Calderon

    2006-01-01

The issue of water scarcity highlights the importance of watershed management. Sound watershed management should make all water users share the incurred cost. This study analyzes the optimal allocation of watershed management cost among different water users. As a consumable, water should be allocated to different users in the amounts at which their marginal utilities (MUs) or marginal products (MPs) of water are equal; this common value of the MUs or MPs equals the water price that the watershed manager charges. When water is used both as a consumable and as a non-consumable, the watershed manager produces the quantity of water at which the sum of the MUs and/or MPs for the two types of uses equals the marginal cost of water production. Each water user should then bear the portion of the watershed management cost given by the percentage that his MU or MP contributes to the sum of the MUs and/or MPs. Thus, the price of consumable water does not equal the marginal cost of water production even if there is no public good.
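The equal-marginal-utility rule can be made concrete with assumed log utilities (the record does not specify functional forms): setting all MUs equal to a common price and exhausting the supply gives allocations proportional to the utility weights.

```python
# Assumed log utilities: user i has U_i(w) = a_i * ln(w), so MU_i(w) = a_i / w.
# Equal MUs at a common price p with total supply W gives w_i = a_i * W / sum(a).
a = [2.0, 1.0, 1.0]      # hypothetical utility weights for three users
W = 8.0                  # total water available

total = sum(a)
alloc = [ai * W / total for ai in a]   # allocations equalising marginal utility
price = a[0] / alloc[0]                # common MU = the water price charged
print(alloc, price)                    # [4.0, 2.0, 2.0] 0.5
```

Under this rule each user's MU equals the price 0.5, and cost shares follow each user's contribution to the sum of MUs, as the record describes.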

  9. Heap leach cyanide irrigation and risk to wildlife: Ramifications for the international cyanide management code.

    Science.gov (United States)

    Donato, D B; Madden-Hallett, D M; Smith, G B; Gursansky, W

    2017-06-01

Exposed cyanide-bearing solutions associated with gold and silver recovery processes in the mining industry pose a risk to wildlife that interact with these solutions. This has been documented for cyanide-bearing tailings storage facilities; however, the risks associated with heap leach facilities are poorly documented, monitored and audited. Gold and silver heap leach facilities use cyanide, pH-stabilised, at concentrations deemed toxic to wildlife. Their design and management are known to result in exposed cyanide-bearing solutions that are accessible to and present a risk to wildlife. Monitoring of the presence of exposed solutions, wildlife interaction, interpretation of risks and associated wildlife deaths is poorly documented. This paper provides a list of critical monitoring criteria and attempts to predict the wildlife guilds most at risk. Understanding the significance of risks to wildlife from exposed cyanide solutions is complex, involving seasonality, the relative position of ponding, the temporal nature of ponding, solution palatability, environmental conditions, the in situ wildlife species inventory and the provision of alternative drinking sources for wildlife. Although a number of heap leach operations are certified as compliant with the International Cyanide Management Code (Cyanide Code), these criteria are not considered by auditors, nor have data from systematic monitoring regimes been published. Without systematic monitoring and further knowledge, wildlife deaths at heap leach facilities are likely to remain largely unrecorded. This has ramifications for those operations certified as compliant with the Cyanide Code. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Swiss Foundation Code 2009 principles and recommendations for the establishment and management of grant-making foundations

    CERN Document Server

    Sprecher, Thomas; Janssen, Martin

    2011-01-01

    The «Swiss Foundation Code 2009» takes up and completes the first European Good Governance Code for grant-making foundations, published in 2005. It contains practical governance guidelines regarding the establishment, organization, management and monitoring of grant-making foundations, as well as making due reference to support activities and financial and investment policies. The abridged English version of the «Swiss Foundation Code 2009» contains 3 principles and 26 recommendations, but not the extensive commentary parts.

  11. Development of accident management technology and computer codes -A study for nuclear safety improvement-

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Kyu; Jae, Moo Sung; Jo, Young Gyun; Park, Rae Jun; Kim, Jae Hwan; Ha, Jae Ju; Kang, Dae Il; Choi, Sun Young; Kim, Si Hwan [Korea Atomic Energy Res. Inst., Taejon (Korea, Republic of)

    1994-07-01

    We have surveyed new technologies and research results for the accident management of nuclear power plants. Based on the concept of using existing plant capabilities for accident management, both in-vessel and ex-vessel strategies were identified and analyzed. When assessing accident management strategies, their effectiveness, adverse effects, and feasibility must be considered. We have developed a framework for assessing the strategies with these factors in mind, and have applied it to assessing the strategies, including the likelihood that the operator correctly diagnoses the situation and successfully implements the strategies. Finally, the cavity flooding strategy was assessed by applying it to the station blackout sequence, which has been identified as one of the major contributors to risk at the reference plant. Thermohydraulic analyses with sensitivity calculations were performed using the MAAP4 computer code. (Author).

  12. Lifting scheme-based method for joint coding of 3D stereo digital cinema with luminance correction and optimized prediction

    Science.gov (United States)

    Darazi, R.; Gouze, A.; Macq, B.

    2009-01-01

    Reproducing a natural, real scene as we see it in the everyday world is becoming more and more popular. Stereoscopic and multi-view techniques are used to this end. However, because more information must be displayed, supporting technologies such as digital compression are required to ensure the storage and transmission of the sequences. In this paper, a new scheme for stereo image coding is proposed. The original left and right images are jointly coded. The main idea is to optimally exploit the correlation between the two images. This is done by designing an efficient transform that reduces the redundancy in the stereo image pair. The approach is inspired by the Lifting Scheme (LS). The novelty of our work is that the prediction step is replaced by a hybrid step consisting of disparity compensation followed by luminance correction and an optimized prediction step. The proposed scheme can be used for lossless and lossy coding. Experimental results show improvements in performance and complexity compared to recently proposed methods.

  13. Hydroeconomic optimization of reservoir management under downstream water quality constraints

    Science.gov (United States)

    Davidsen, Claus; Liu, Suxia; Mo, Xingguo; Holm, Peter E.; Trapp, Stefan; Rosbjerg, Dan; Bauer-Gottwein, Peter

    2015-10-01

    A hydroeconomic optimization approach is used to guide water management in a Chinese river basin with the objectives of meeting water quantity and water quality constraints, in line with the China 2011 No. 1 Policy Document and 2015 Ten-point Water Plan. The proposed modeling framework couples water quantity and water quality management and minimizes the total costs over a planning period assuming stochastic future runoff. The outcome includes cost-optimal reservoir releases, groundwater pumping, water allocation, wastewater treatment and water curtailments. The optimization model uses a variant of stochastic dynamic programming known as the water value method. Nonlinearity arising from the water quality constraints is handled with an effective hybrid method combining genetic algorithms and linear programming. Untreated pollutant loads are represented by biochemical oxygen demand (BOD), and the resulting minimum dissolved oxygen (DO) concentration is computed with the Streeter-Phelps equation and constrained to match Chinese water quality targets. The baseline water scarcity and operational costs are estimated at 15.6 billion CNY/year. Compliance with water quality grade III causes a relatively small increase to 16.4 billion CNY/year. Dilution plays an important role and increases the share of surface water allocated to users situated furthest downstream in the system. The modeling framework generates decision rules that yield the economically efficient strategy for complying with both water quantity and water quality constraints.
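
    The Streeter-Phelps step in the framework above can be sketched as follows. This is the textbook oxygen-sag formula, not the authors' implementation, and the parameter values in the usage note are invented for illustration:

    ```python
    import math

    def min_dissolved_oxygen(L0, D0, kd, ka, do_sat):
        """Minimum downstream DO from the classical Streeter-Phelps sag curve.
        L0: initial BOD (mg/L), D0: initial DO deficit (mg/L),
        kd: deoxygenation rate (1/day), ka: reaeration rate (1/day),
        do_sat: DO saturation concentration (mg/L).
        Assumes ka != kd and a deficit that actually peaks (small enough D0)."""
        # Critical travel time at which the oxygen deficit is largest.
        tc = (1.0 / (ka - kd)) * math.log(
            (ka / kd) * (1.0 - D0 * (ka - kd) / (kd * L0)))
        # Deficit at the critical time.
        Dc = (kd * L0 / (ka - kd)) * (math.exp(-kd * tc) - math.exp(-ka * tc)) \
             + D0 * math.exp(-ka * tc)
        return do_sat - Dc
    ```

    For instance, with L0 = 20 mg/L, D0 = 1 mg/L, kd = 0.3/day, ka = 0.7/day and a saturation of 9 mg/L, the minimum DO works out to roughly 4.2 mg/L, which could then be compared against a grade-III target.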

  14. Principles for a Code of Conduct for the Management and Sustainable Use of Mangrove Ecosystems

    DEFF Research Database (Denmark)

    Macintosh, Donald; Nielsen, Thomas; Zweig, Ronald

    Mangrove ecosystems provide important environmental, economic and social functions throughout tropical and subtropical latitudes, but they have been heavily exploited for their valuable wood and fisheries resources, or converted for development purposes. Recognising the importance of conserving...... mangrove forest ecosystems worldwide, the World Bank commissioned a study with the title "Mainstreaming conservation of coastal biodiversity through formulation of a generic Code of Conduct for Sustainable Management of Mangrove Forest Ecosystems". Formulation of these Principles for a Code of Conduct...... is based on existing knowledge, experience and needs. The articles presented identify key linkages and coordination needs among government departments, NGOs, coastal communities, researchers or research institutions and entrepreneurs who have an interest in the conservation of mangrove ecosystems...

  15. Architecture proposal for the use of QR code in supply chain management

    Directory of Open Access Journals (Sweden)

    Dalton Matsuo Tavares

    2012-01-01

    Full Text Available Supply chain traceability and visibility are key concerns for many companies. Radio-Frequency Identification (RFID) is an enabling technology that allows identification of objects in a fully automated manner via radio waves. Nevertheless, this technology has limited acceptance and high costs. This paper presents a research effort undertaken to design a track-and-trace solution for supply chains, using the quick response code (or QR Code for short) as a less complex and cost-effective alternative to RFID in supply chain management (SCM). A first architecture proposal using open source software is presented as a proof of concept. The system architecture is presented covering tag generation, image acquisition and pre-processing, product inventory and tracking. A prototype system for tag identification is developed and discussed at the end of the paper to demonstrate its feasibility.

  16. Optimized Energy Management for Mixed Uplink Traffic in LTE UE

    Directory of Open Access Journals (Sweden)

    Vinod Mirchandani

    2013-03-01

    Full Text Available Battery life is a major issue for any mobile equipment, and reducing energy consumption via energy management in 3G LTE user equipment (UE) will be essential for the delivery of a variety of services. Discontinuous transmission (DTX) and reception (DRX) have been designed to facilitate power management, but they can provide energy savings only via proper tuning. Relevant work in the literature mainly pertains to discontinuous reception (DRX) for downlink data. However, today’s increasingly powerful UEs can generate and upload significant amounts of data. This paper proposes an energy management framework applicable to both DTX and DRX power-saving modes. In particular, in DTX mode it can reduce UE energy consumption for uplink-intensive applications like telemedicine or social networking. The proposed energy management framework is based on jointly using an a priori analytical evaluation of an M/G/1/K finite uplink queue for mixed traffic with an optimized DTX/DRX algorithm. DTX mode is modeled by an expression through which the impact of quality of service (QoS) parameters on the UE’s mean energy consumption for uplink transfer is determined. The model extracts and operates on the values computed for the M/G/1/K queue. Finally, a dynamic energy management algorithm for DRX/DTX modes is proposed for energy consumption optimization based on an integrated Analytic Hierarchy Process (AHP) and Grey Relational Analysis (GRA). Analytical evaluation has shown that using our algorithm to tune DTX can achieve 49-73% energy savings over not using DTX.
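
    As a rough illustration of the queueing side of this framework: the paper analyzes an M/G/1/K queue, whose metrics depend on the service distribution; the sketch below substitutes the simpler M/M/1/K special case (exponential service), which has closed-form steady-state probabilities and exposes the same quantities an energy model would consume (blocking probability and transmitter busy fraction).

    ```python
    def mm1k_metrics(lam, mu, K):
        """Steady-state metrics of an M/M/1/K queue -- a simplified,
        exponential-service stand-in for the paper's M/G/1/K uplink buffer.
        lam: packet arrival rate, mu: service rate, K: buffer size."""
        rho = lam / mu
        if rho == 1.0:
            probs = [1.0 / (K + 1)] * (K + 1)
        else:
            norm = (1 - rho) / (1 - rho ** (K + 1))
            probs = [norm * rho ** n for n in range(K + 1)]
        blocking = probs[K]          # probability an arriving packet is dropped
        busy = 1.0 - probs[0]        # fraction of time the transmitter is active
        mean_queue = sum(n * p for n, p in enumerate(probs))
        return blocking, busy, mean_queue
    ```

    A DTX tuner could, for example, trade the `busy` fraction (which drives energy) against the `blocking` probability (a QoS constraint) as the DTX cycle changes the effective service rate.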

  17. Evolving strategies for optimal care management and plan benefit designs.

    Science.gov (United States)

    Cruickshank, John M

    2012-11-01

    As a prevalent, complex disease, diabetes presents a challenge to managed care. Strategies to optimize type 2 diabetes care management and treatment outcomes have been evolving over the past several years. Novel economic incentive programs (eg, those outlined in the Patient Protection and Affordable Care Act of 2010 that tie revenue from Medicare Advantage plans to the quality of healthcare delivered) are being implemented, as are evidence-based interventions designed to optimize treatment, reduce clinical complications, and lower the total financial burden of the disease. Another step that can improve outcomes is to align managed care diabetes treatment algorithms with national treatment guidelines. In addition, designing the pharmacy benefit to emphasize the overall value of treatment and minimize out-of-pocket expenses for patients can be an effective approach to reducing prescription abandonment. The implementation of emerging models of care that encourage collaboration between providers, support lifestyle changes, and engage patients to become partners in their own treatment also appears to be effective.

  18. Multiple Objective Optimizations for Energy Management System under Uncertainties

    Directory of Open Access Journals (Sweden)

    Mian Xing

    2013-07-01

    Full Text Available Recently, the micro-grid has attracted more and more attention because it is flexible and environmentally friendly. Optimizing the operation of distributed generators in a micro-grid is a complicated and challenging task, so a multi-objective optimization model was designed to cut operation cost, improve economic benefits and reduce emissions. However, the randomness of renewable energy generation and load demand makes the decision process much more complicated. Chance constrained programming (CCP) was employed to deal with these uncertainties. In addition, the satisfaction degree of the decision was taken into consideration to coordinate the conflicts among the different targets. Through the weighted satisfaction degree and coordination degree, the multi-objective program can be transformed into a single-objective program. To solve the optimization problem, a genetic algorithm was utilized to search for the optimal strategy. To verify the validity of the proposed model, an energy management system of a micro-grid with five types of distributed generators was taken as the case study. The results indicate the effectiveness of the proposed method.
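
    The transformation of the multi-objective program into a single objective via weighted satisfaction degrees can be sketched as below. The linear satisfaction (membership) function and the cost/emission bounds are illustrative assumptions, not the paper's exact formulation:

    ```python
    def satisfaction(value, best, worst):
        """Linear satisfaction degree: 1 at the best value, 0 at the worst,
        clipped to [0, 1] outside that range (an assumed membership shape)."""
        if best == worst:
            return 1.0
        s = (worst - value) / (worst - best)
        return max(0.0, min(1.0, s))

    def weighted_objective(cost, emission, weights, bounds):
        """Collapse (cost, emission) objectives into one score to maximize.
        bounds maps each objective to its (best, worst) attainable values."""
        s_cost = satisfaction(cost, *bounds["cost"])
        s_emis = satisfaction(emission, *bounds["emission"])
        w_cost, w_emis = weights
        return w_cost * s_cost + w_emis * s_emis
    ```

    A genetic algorithm can then maximize this single scalar score directly, which is the sense in which the weighted satisfaction degree turns the multi-objective program into a single-objective one.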

  19. Ambulatory anesthesia: optimal perioperative management of the diabetic patient

    Directory of Open Access Journals (Sweden)

    Polderman JAW

    2016-05-01

    Full Text Available Jorinde AW Polderman, Robert van Wilpe, Jan H Eshuis, Benedikt Preckel, Jeroen Hermanides Department of Anaesthesiology, Academic Medical Centre, University of Amsterdam, Amsterdam, the Netherlands Abstract: Given the growing number of patients with diabetes mellitus (DM) and the growing number of surgical procedures performed in an ambulatory setting, DM is one of the most encountered comorbidities in patients undergoing ambulatory surgery. Perioperative management of ambulatory patients with DM requires a different approach than in patients undergoing major surgery, as procedures are shorter and the stress response caused by surgery is minimal. However, DM is a risk factor for postoperative complications in ambulatory surgery, so it should be managed carefully. Given the limited time ambulatory patients spend in the hospital, improvement in management has to be gained from the preanesthetic assessment. The purpose of this review is to summarize current literature regarding the anesthesiologic management of patients with DM in the ambulatory setting. We will discuss the risks of perioperative hyperglycemia together with the pre-, intra-, and postoperative considerations for these patients when encountered in an ambulatory setting. Furthermore, we provide recommendations for the optimal perioperative management of the diabetic patient undergoing ambulatory surgery. Keywords: diabetes mellitus, perioperative period, ambulatory surgery, insulin, complications, GLP-1 agonist, DPP-4 inhibitor

  20. Power Allocation Optimization: Linear Precoding Adapted to NB-LDPC Coded MIMO Transmission

    Directory of Open Access Journals (Sweden)

    Tarek Chehade

    2015-01-01

    Full Text Available In multiple-input multiple-output (MIMO) transmission systems, the channel state information (CSI) at the transmitter can be used to add linear precoding to the transmitted signals in order to improve the performance and the reliability of the transmission system. This paper investigates how to properly join precoded closed-loop MIMO systems and nonbinary low density parity check (NB-LDPC) codes. The q elements in the Galois field, GF(q), are directly mapped to q transmit symbol vectors. This allows NB-LDPC codes to perfectly fit with a MIMO precoding scheme, unlike binary LDPC codes. The new transmission model is detailed and studied for several linear precoders and various designed LDPC codes. We show that NB-LDPC codes are particularly well suited to be jointly used with precoding schemes based on the maximization of the minimum Euclidean distance (max-dmin) criterion. These results are theoretically supported by extrinsic information transfer (EXIT) analysis and are confirmed by numerical simulations.

  1. Optimized Reactive Power Flow of DFIG Power Converters for Better Reliability Performance Considering Grid Codes

    DEFF Research Database (Denmark)

    Zhou, Dao; Blaabjerg, Frede; Lau, Mogens

    2015-01-01

    In order to fulfill the modern grid codes, over-excited reactive power injection will further reduce the lifetime of the rotor-side converter. In this paper, the additional stress of the power semiconductor due to the reactive power injection is firstly evaluated in terms of modulation index...

  2. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework.

    Science.gov (United States)

    Yang, Guoxiang; Best, Elly P H

    2015-09-15

    Best management practices (BMPs) can be used effectively to reduce nutrient loads transported from non-point sources to receiving water bodies. However, methodologies of BMP selection and placement in a cost-effective way are needed to assist watershed management planners and stakeholders. We developed a novel modeling-optimization framework that can be used to find cost-effective solutions of BMP placement to attain nutrient load reduction targets. This was accomplished by integrating a GIS-based BMP siting method, a WQM-TMDL-N modeling approach to estimate total nitrogen (TN) loading, and a multi-objective optimization algorithm. Wetland restoration and buffer strip implementation were the two BMP categories used to explore the performance of this framework, both differing greatly in complexity of spatial analysis for site identification. Minimizing TN load and BMP cost were the two objective functions for the optimization process. The performance of this framework was demonstrated in the Tippecanoe River watershed, Indiana, USA. Optimized scenario-based load reduction indicated that the wetland subset selected by the minimum scenario had the greatest N removal efficiency. Buffer strips were more effective for load removal than wetlands. The optimized solutions provided a range of trade-offs between the two objective functions for both BMPs. This framework can be expanded conveniently to a regional scale because the NHDPlus catchment serves as its spatial computational unit. The present study demonstrated the potential of this framework to find cost-effective solutions to meet a water quality target, such as a 20% TN load reduction, under different conditions. Copyright © 2015 Elsevier Ltd. All rights reserved.
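
    The trade-off between the two objective functions can be illustrated with a brute-force sketch. The paper uses a multi-objective optimization algorithm; exhaustive enumeration is only feasible for a handful of candidate sites, and the cost/removal numbers in the usage note are invented:

    ```python
    from itertools import combinations

    def pareto_bmp_portfolios(sites):
        """Enumerate every BMP subset (feasible only for small site lists)
        and keep the non-dominated (cost, TN-removal) portfolios.
        sites: list of (cost, tn_removed) tuples."""
        portfolios = []
        for r in range(len(sites) + 1):
            for combo in combinations(sites, r):
                cost = sum(c for c, _ in combo)
                removal = sum(t for _, t in combo)
                portfolios.append((cost, removal))
        # A portfolio is dominated if another is strictly better on one
        # objective and at least as good on the other.
        frontier = [p for p in portfolios
                    if not any((q[0] < p[0] and q[1] >= p[1]) or
                               (q[0] <= p[0] and q[1] > p[1])
                               for q in portfolios)]
        return sorted(set(frontier))
    ```

    The returned frontier is the same kind of cost-versus-load-reduction trade-off curve the optimized solutions in the study provide, from which a planner can pick the cheapest portfolio meeting, say, a 20% TN reduction target.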

  3. Managing and Coding References for Systematic Reviews and Scoping Reviews in EndNote.

    Science.gov (United States)

    Peters, Micah D J

    2017-01-01

    This article describes a novel approach for using EndNote to manage and code references in the conduct and reporting of systematic reviews and scoping reviews. The process is simple and easy for reviewers new to both EndNote and systematic reviews. This process allows reviewers to easily conduct and report systematic reviews in line with the internationally recognized PRISMA reporting guidelines and also facilitates the overall task of systematic or scoping review conduct and reporting from the initial search through to structuring the results, discussion, and conclusions in a rigorous, reproducible, and user-friendly manner.

  4. An Optimal Method for Developing Global Supply Chain Management System

    Directory of Open Access Journals (Sweden)

    Hao-Chun Lu

    2013-01-01

    Full Text Available With increasing transparency in supply chains, enhancing the competitiveness of industries has become a vital concern, and many developing countries are looking for ways to save costs. From this point of view, this study addresses the complicated liberalization policies in the global supply chain management system and proposes a mathematical model with flow-control constraints, which are used to handle bonded warehouses for obtaining maximal profits. Numerical experiments illustrate that the proposed model can be solved effectively to obtain the optimal profits in the global supply chain environment.

  5. An Optimal Energy Management System for Electric Vehicles using Firefly Optimization Algorithm based Dynamic EDF Scheduling

    Directory of Open Access Journals (Sweden)

    E.Kayalvizhi

    2015-08-01

    Full Text Available Mitigation of global-warming gases from burning gasoline for transportation is one of the biggest and most complex issues the world has ever faced. In an effort to ease the environmental crisis caused by global warming, electric vehicles were introduced, powered by an electric motor that runs on the energy stored in a battery pack. Inspired by research on power management in electric vehicles, this paper focuses on the development of an energy management system for electric vehicles (EMSEV) to optimally balance the energy drawn from the battery pack. The proposed methodology uses the firefly optimization algorithm to optimize the power consumption of devices such as the electric motor, power steering, air conditioner, power windows, automatic door locks, radio, speakers, horn, wipers, GPS, and internal and external lights from the battery in electric vehicles. Depending upon the distance to cover and the battery availability, devices are switched down automatically through dynamic EDF scheduling. The CAN protocol is used for effective communication between the devices and the controller. Simulation results are obtained using MATLAB.
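
    A minimal sketch of the firefly algorithm the paper builds on, shown here minimizing a generic cost function rather than the EMSEV power-consumption objective (all parameter values are illustrative defaults, not the paper's settings):

    ```python
    import math
    import random

    def firefly_minimize(f, dim, n=15, iters=60, seed=1,
                         beta0=1.0, gamma=0.01, alpha=0.2):
        """Firefly algorithm sketch: brighter (lower-cost) fireflies attract
        dimmer ones, with attractiveness decaying in squared distance, plus
        a small random walk.  Search box is assumed to be [-5, 5]^dim."""
        rng = random.Random(seed)
        pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
        for _ in range(iters):
            cost = [f(x) for x in pop]
            for i in range(n):
                for j in range(n):
                    if cost[j] < cost[i]:            # firefly j is brighter
                        r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                        beta = beta0 * math.exp(-gamma * r2)
                        pop[i] = [a + beta * (b - a)
                                  + alpha * (rng.random() - 0.5)
                                  for a, b in zip(pop[i], pop[j])]
                        cost[i] = f(pop[i])
            # The brightest firefly never moves, so the best-so-far is kept.
        best = min(pop, key=f)
        return best, f(best)
    ```

    In the EMSEV setting, `f` would score a candidate on/off power allocation for the vehicle's devices against battery availability and trip distance.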

  6. Application of Artificial Intelligence for Optimization in Pavement Management

    Directory of Open Access Journals (Sweden)

    Reus Salini

    2015-07-01

    Full Text Available Artificial intelligence (AI) is a group of techniques with considerable potential for application to pavement engineering and management. In this study, we developed a practical, flexible and out-of-the-box approach that applies genetic algorithms to optimizing budget allocation and road maintenance strategy selection for a road network. The aim is to provide an alternative to existing software that better fits the requirements of a significant number of pavement managers. To meet these objectives, a new indicator, named the Road Global Value Index (RGVI), was created to account for pavement condition, traffic, and the economic and political importance of each road section. This paper describes the approach and its components through an example confirming that genetic algorithms are very effective for the intended purpose.
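
    The budget-allocation idea can be sketched as a toy genetic algorithm over fund/don't-fund decisions per road section. The per-section "value" score stands in for the paper's RGVI, and all numbers in the usage note are invented:

    ```python
    import random

    def ga_budget_allocation(sections, budget, pop_size=40, gens=80, seed=7):
        """Toy GA for budget allocation: each gene funds (1) or skips (0) a
        road section.  sections: list of (cost, value) pairs, where 'value'
        plays the role of an RGVI-like priority score."""
        rng = random.Random(seed)
        n = len(sections)

        def fitness(genome):
            cost = sum(c for g, (c, _) in zip(genome, sections) if g)
            value = sum(v for g, (_, v) in zip(genome, sections) if g)
            return value if cost <= budget else 0.0   # infeasible -> worthless

        pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            elite = pop[: pop_size // 2]              # truncation selection
            children = []
            while len(children) < pop_size - len(elite):
                p1, p2 = rng.sample(elite, 2)
                cut = rng.randrange(1, n)             # one-point crossover
                child = p1[:cut] + p2[cut:]
                if rng.random() < 0.1:                # bit-flip mutation
                    k = rng.randrange(n)
                    child[k] ^= 1
                children.append(child)
            pop = elite + children
        best = max(pop, key=fitness)
        return best, fitness(best)
    ```

    With four sections costing (4, 3, 2, 5) and valued (10, 7, 4, 11) under a budget of 7, the GA converges on funding the first two sections for a total value of 17.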

  7. Disturbance, life history, and optimal management for biodiversity

    Science.gov (United States)

    Guo, Q.

    2003-01-01

    Both the frequency and intensity of disturbances in many ecosystems have been greatly enhanced by increasing human activities. As a consequence, short-lived plant species, including many exotics, may have increased dramatically in both richness and abundance on our planet, while many long-lived species may have been lost. Such conclusions can be drawn from broadly observed successional cycles in both theoretical and empirical studies. This article discusses two major issues that have been largely overlooked in current ecosystem management policies and conservation efforts, i.e., life history constraints and future global warming trends. It also addresses the importance of these two factors in balancing disturbance frequency and intensity for optimal biodiversity maintenance and ecosystem management.

  8. Application of Hybrid Optimization-Expert System for Optimal Power Management on Board Space Power Station

    Science.gov (United States)

    Momoh, James; Chattopadhyay, Deb; Basheer, Omar Ali AL

    1996-01-01

    The space power system has two sources of energy: photovoltaic blankets and batteries. The on-board optimal power management problem involves two broad operations. The first, off-line power scheduling, determines the load allocation schedule for the next several hours based on forecasts of load and solar power availability; this stage puts less emphasis on computation speed and more on the optimality of the solution. The second, on-line power rescheduling, is needed when a contingency occurs, to optimally reschedule the loads so as to minimize the 'unused' or 'wasted' energy while keeping the priority of certain types of load and minimizing disturbance to the original optimal schedule determined in the first-stage off-line study. The computational performance of the on-line 'rescheduler' is an important criterion and plays a critical role in the selection of the appropriate tool. The Howard University Center for Energy Systems and Control has developed a hybrid optimization-expert-system based power management program. The pre-scheduler has been developed using a non-linear multi-objective optimization technique called the Outer Approximation method and implemented using the General Algebraic Modeling System (GAMS). The optimization model can deal with multiple conflicting objectives, viz. maximizing energy utilization, minimizing the variation of load over a day, etc., and incorporates several complex interactions between the loads in a space system. The rescheduling is performed using an expert system developed in PROLOG, which utilizes a rule base for reallocating the loads in an emergency condition, viz. shortage of power due to solar array failure, increase of base load, addition of a new activity, repetition of an old activity, etc. Both modules handle decision making on battery charging and discharging and the allocation of loads over a time horizon of a day divided into intervals of 10
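
    The on-line rescheduler's priority-driven load shedding can be sketched greedily. This illustrates the idea only, not the PROLOG rule base; the load names and numbers are invented:

    ```python
    def reschedule_loads(loads, available_power):
        """Greedy rescheduling sketch: keep loads in descending priority
        order until the reduced power budget is exhausted, shedding the rest.
        loads: list of (name, power_kw, priority); higher priority kept longer."""
        kept, shed = [], []
        used = 0.0
        for name, power, _ in sorted(loads, key=lambda l: -l[2]):
            if used + power <= available_power:
                kept.append(name)
                used += power
            else:
                shed.append(name)
        return kept, shed
    ```

    For example, after a solar array failure drops the budget to 4 kW, a 10-priority 3 kW load and a 3-priority 1 kW load fit, while a mid-priority 2 kW load is shed.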

  9. Toolkit of Available EPA Green Infrastructure Modeling Software: Watershed Management Optimization Support Tool (WMOST)

    Science.gov (United States)

    Watershed Management Optimization Support Tool (WMOST) is a software application designed to facilitate integrated water resources management across wet and dry climate regions. It allows water resources managers and planners to screen a wide range of practices across their watersh...

  11. Decoding and optimized implementation of SECDED codes over GF(q)

    Energy Technology Data Exchange (ETDEWEB)

    Ward, H Lee; Ganti, Anand; Resnick, David R

    2014-11-18

    A plurality of columns for a check matrix that implements a distance d linear error correcting code are populated by providing a set of vectors from which to populate the columns, and applying to the set of vectors a filter operation that reduces the set by eliminating therefrom all vectors that would, if used to populate the columns, prevent the check matrix from satisfying a column-wise linear independence requirement associated with check matrices of distance d linear codes. One of the vectors from the reduced set may then be selected to populate one of the columns. The filtering and selecting repeats iteratively until either all of the columns are populated or the number of currently unpopulated columns exceeds the number of vectors in the reduced set. Columns for the check matrix may be processed to reduce the amount of logic needed to implement the check matrix in circuit logic.
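
    The column-filtering idea in the abstract can be illustrated for the SECDED (distance-4) case. The sketch below uses the classic Hsiao-style filter (distinct, nonzero, odd-weight columns), which is one well-known way to satisfy the requirement that any three columns be linearly independent over GF(2); it is an illustration, not the patent's procedure:

    ```python
    from itertools import product

    def secded_columns(r, k):
        """Return k data-bit columns (as r-bit tuples) for a distance-4
        binary check matrix with r check bits.  Filtering to distinct,
        odd-weight columns of weight >= 3 guarantees any 3 columns are
        linearly independent over GF(2): two distinct columns never sum to
        zero, and the sum of two odd-weight columns has even weight, so it
        can never equal a third odd-weight column.  Weight-1 columns are
        left out because they serve as the identity columns for the check
        bits themselves."""
        candidates = [v for v in product((0, 1), repeat=r)
                      if sum(v) % 2 == 1 and sum(v) >= 3]
        if len(candidates) < k:
            raise ValueError("not enough columns: increase r")
        return candidates[:k]
    ```

    For the common (72, 64) SECDED configuration, r = 8 check bits leave 120 odd-weight candidates of weight at least 3, comfortably more than the 64 data columns needed.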

  12. Design, decoding and optimized implementation of SECDED codes over GF(q)

    Energy Technology Data Exchange (ETDEWEB)

    Ward, H Lee; Ganti, Anand; Resnick, David R

    2014-06-17

    A plurality of columns for a check matrix that implements a distance d linear error correcting code are populated by providing a set of vectors from which to populate the columns, and applying to the set of vectors a filter operation that reduces the set by eliminating therefrom all vectors that would, if used to populate the columns, prevent the check matrix from satisfying a column-wise linear independence requirement associated with check matrices of distance d linear codes. One of the vectors from the reduced set may then be selected to populate one of the columns. The filtering and selecting repeats iteratively until either all of the columns are populated or the number of currently unpopulated columns exceeds the number of vectors in the reduced set. Columns for the check matrix may be processed to reduce the amount of logic needed to implement the check matrix in circuit logic.

  13. Decoding and optimized implementation of SECDED codes over GF(q)

    Energy Technology Data Exchange (ETDEWEB)

    Ward, H. Lee; Ganti, Anand; Resnick, David R

    2013-10-22

    A plurality of columns for a check matrix that implements a distance d linear error correcting code are populated by providing a set of vectors from which to populate the columns, and applying to the set of vectors a filter operation that reduces the set by eliminating therefrom all vectors that would, if used to populate the columns, prevent the check matrix from satisfying a column-wise linear independence requirement associated with check matrices of distance d linear codes. One of the vectors from the reduced set may then be selected to populate one of the columns. The filtering and selecting repeats iteratively until either all of the columns are populated or the number of currently unpopulated columns exceeds the number of vectors in the reduced set. Columns for the check matrix may be processed to reduce the amount of logic needed to implement the check matrix in circuit logic.

  14. Optimized Generation of Data-Path from C Codes for FPGAs

    CERN Document Server

    Guo, Zhi; Najjar, Walid; Vissers, Kees

    2011-01-01

    FPGAs, as computing devices, offer significant speedup over microprocessors. Furthermore, their configurability offers an advantage over traditional ASICs. However, they do not yet enjoy the high-level language programmability that microprocessors do. This has become the main obstacle to their wider acceptance by application designers. ROCCC is a compiler designed to generate circuits from C source code to execute on FPGAs, more specifically on CSoCs. It generates RTL-level HDLs from frequently executing kernels in an application. In this paper, we describe ROCCC's system overview and focus on its data path generation. We compare the performance of ROCCC-generated VHDL code with that of Xilinx IPs. The synthesis results show that the ROCCC-generated circuit takes around 2x to 3x the area and runs at a comparable clock rate.

  15. Decoding and optimized implementation of SECDED codes over GF(q)

    Science.gov (United States)

    Ward, H. Lee; Ganti, Anand; Resnick, David R

    2013-10-22

    A plurality of columns for a check matrix that implements a distance d linear error correcting code are populated by providing a set of vectors from which to populate the columns, and applying to the set of vectors a filter operation that reduces the set by eliminating therefrom all vectors that would, if used to populate the columns, prevent the check matrix from satisfying a column-wise linear independence requirement associated with check matrices of distance d linear codes. One of the vectors from the reduced set may then be selected to populate one of the columns. The filtering and selecting repeats iteratively until either all of the columns are populated or the number of currently unpopulated columns exceeds the number of vectors in the reduced set. Columns for the check matrix may be processed to reduce the amount of logic needed to implement the check matrix in circuit logic.

  16. Utility Optimal Coding for Packet Transmission over Wireless Networks - Part I: Networks of Binary Symmetric Channels

    CERN Document Server

    Karumbu, Premkumar; Leith, Douglas J

    2011-01-01

    We consider multi--hop networks comprising Binary Symmetric Channels ($\\mathsf{BSC}$s). The network carries unicast flows for multiple users. The utility of the network is the sum of the utilities of the flows, where the utility of each flow is a concave function of its throughput. Given that the network capacity is shared by the flows, there is a contention for network resources like coding rate (at the physical layer), scheduling time (at the MAC layer), etc., among the flows. We propose a proportional fair transmission scheme that maximises the sum utility of flow throughputs subject to the rate and the scheduling constraints. This is achieved by {\\em jointly optimising the packet coding rates of all the flows through the network}.
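
    For intuition on proportional fairness, the single-link special case has a closed form: maximizing the weighted log-utility sum Σ w_i log x_i subject to Σ x_i = C gives each flow a share proportional to its weight (setting w_i/x_i equal to a common Lagrange multiplier). This is a simplified stand-in for the paper's joint network-wide optimization, not its algorithm:

    ```python
    def proportional_fair_shares(weights, capacity):
        """Closed-form solution of max sum_i w_i * log(x_i)
        subject to sum_i x_i = capacity, x_i > 0.
        Stationarity gives w_i / x_i = lambda for all i, hence
        x_i = w_i * capacity / sum(w)."""
        total = sum(weights)
        return [w * capacity / total for w in weights]
    ```

    For instance, three flows with weights (1, 2, 1) sharing a capacity of 8 receive throughputs (2, 4, 2): the heavier flow gets more, but no flow is starved, which is the defining property of proportional fairness.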

  17. ON THE OPTIMAL MULTI-RATE THROUGHPUT FOR MULTICAST WITH NETWORK CODING

    Institute of Scientific and Technical Information of China (English)

    Zhang Mu; Zhang Shunyi

    2006-01-01

    This paper investigates the maximal achievable multi-rate throughput of a multicast session in the presence of network coding. Deviating from previous works, which focus on single-rate network coding, our work takes the heterogeneity of the sinks into account and provides multiple data layers to address the problem. We first formulate the maximal achievable throughput problem under the assumption that the data layers are independent and the layer rates are static. It is proved that the problem in this case is, unfortunately, Non-deterministic Polynomial-time (NP)-hard. In addition, our formulation is extended to the problems with dependent layers and dynamic layers. Furthermore, an approximation algorithm which satisfies certain fairness is proposed.

  18. Optimal Bidding Strategy in Power Market before and after Congestion Management Using Invasive Weed Optimization

    Directory of Open Access Journals (Sweden)

    Mohsen Khalilpour

    2013-02-01

    Full Text Available Power companies world-wide have been restructuring their electric power systems from a vertically integrated entity to a deregulated, open-market environment. Previously, electric utilities usually sought to maximize the social welfare of the system with distributional equity as its main operational criterion. The operating paradigm was based on achieving the least-cost system solution while meeting reliability and security margins. This often resulted in investments in generating capacity operating at very low capacity factors. Decommissioning of this type of generating capacity was a natural outcome when the vertically integrated utilities moved over to deregulated market operations. This study proposes an optimal bidding strategy, based on base and load demand, for pricing the power generated by the different units in the investigated system. The effect of congestion on this bidding strategy is then investigated. The analysis is implemented on 5-bus and 9-bus test systems, with the Invasive Weed Optimization algorithm as the optimization technique; the results are then compared with those of a genetic algorithm (GA). Finally, the examined systems are simulated using the Power World software; experimental results show that the proposed Invasive Weed Optimization technique outperforms the GA for congestion management purposes.
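
A minimal sketch of the Invasive Weed Optimization metaheuristic used in the study, applied here to a made-up one-dimensional cost curve rather than the paper's bidding problem; all parameter values are illustrative:

```python
import random

def iwo_minimize(f, bounds, pop=10, pop_max=25, iters=60,
                 smin=1, smax=5, sigma0=1.0, sigma_f=0.01):
    """Invasive Weed Optimization sketch: weeds seed in proportion to
    relative fitness, seeds disperse with a shrinking Gaussian, and
    competitive exclusion trims the colony to pop_max members."""
    lo, hi = bounds
    colony = [random.uniform(lo, hi) for _ in range(pop)]
    for it in range(iters):
        costs = [f(x) for x in colony]
        best, worst = min(costs), max(costs)
        # dispersal radius shrinks nonlinearly over the iterations
        sigma = sigma_f + (sigma0 - sigma_f) * ((iters - it) / iters) ** 2
        seeds = []
        for x, c in zip(colony, costs):
            # fitter weeds (lower cost) produce more seeds
            ratio = (worst - c) / (worst - best + 1e-12)
            n_seeds = smin + int(round(ratio * (smax - smin)))
            for _ in range(n_seeds):
                seeds.append(min(hi, max(lo, random.gauss(x, sigma))))
        colony += seeds
        colony = sorted(colony, key=f)[:pop_max]   # competitive exclusion
    return colony[0]

# Toy quadratic generation-cost curve with its minimum at x = 3.
xbest = iwo_minimize(lambda x: (x - 3.0) ** 2 + 2.0, bounds=(0.0, 10.0))
```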

  19. Optimal space communication techniques. [a discussion of delta modulation, pulse code modulation, and phase locked systems

    Science.gov (United States)

    Schilling, D. L.

    1975-01-01

    Encoding of video signals using adaptive delta modulation (DM) was investigated, along with the error correction of DM-encoded signals corrupted by thermal noise. Conversion from pulse code modulation to delta modulation was studied; an expression for the signal-to-noise ratio of the DM signal was derived by employing linear, two-sample interpolation between sample points. A phase locked loop using a nonlinear processor in lieu of a loop filter is discussed.

  20. Optimal Rate Control in H.264 Video Coding Based on Video Quality Metric

    Directory of Open Access Journals (Sweden)

    R. Karthikeyan

    2014-05-01

    Full Text Available The aim of this research is to find a method for providing better visual quality across the complete video sequence in the H.264 video coding standard. The H.264 video coding standard, with its significantly improved coding efficiency, finds important applications in digital video streaming, storage, and broadcast. To achieve comparable quality across the complete video sequence with constraints on bandwidth availability and buffer fullness, it is important to allocate more bits to frames with high complexity or a scene change and fewer bits to other, less complex frames. A frame-layer bit allocation scheme is proposed based on a perceptual quality metric as the indicator of frame complexity. The proposed model computes the Quality Index ratio (QIr), the ratio of the predicted quality index of the current frame to the average quality index of all the previous frames in the group of pictures, which is used for bit allocation to the current frame along with bits computed based on buffer availability. The standard deviation of the perceptual quality indicator MOS computed for the proposed model is significantly lower, which means the quality is consistent throughout the video sequence. The experimental results thus show that the proposed model effectively handles scene changes and scenes with high motion for better visual quality.
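
The QIr-based allocation idea can be sketched as follows. The definition of QIr follows the abstract, while the scaling and buffer-clamping rule is an assumption made purely for illustration:

```python
def allocate_bits(qi_pred, qi_prev, base_bits, buffer_bits):
    """Frame-layer bit allocation sketch: scale the buffer-derived budget
    base_bits by QIr (predicted quality index of the current frame over
    the average quality index of previous frames in the GOP), clamped by
    the bits the buffer can supply.  The blending rule is illustrative."""
    qir = qi_pred / (sum(qi_prev) / len(qi_prev))
    target = base_bits * qir
    return min(target, buffer_bits)

# A frame predicted to need 1.5x the average quality gets 1.5x the budget.
bits = allocate_bits(qi_pred=0.9, qi_prev=[0.6, 0.6, 0.6],
                     base_bits=1000, buffer_bits=2000)
```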

  1. Depletion mapping and constrained optimization to support managing groundwater extraction

    Science.gov (United States)

    Fienen, Michael N.; Bradbury, Kenneth R.; Kniffin, Maribeth; Barlow, Paul M.

    2017-01-01

    Groundwater models often serve as management tools to evaluate competing water uses including ecosystems, irrigated agriculture, industry, municipal supply, and others. Depletion potential mapping—showing the model-calculated potential impacts that wells have on stream baseflow—can form the basis for multiple potential management approaches in an oversubscribed basin. Specific management approaches can include scenarios proposed by stakeholders, systematic changes in well pumping based on depletion potential, and formal constrained optimization, which can be used to quantify the tradeoff between water use and stream baseflow. We consider variables such as the maximum amount of reduction allowed in each well, and various groupings of wells using, for example, K-means clustering on spatial proximity and depletion potential. These approaches provide a potential starting point and guidance for resource managers and stakeholders to make decisions about groundwater management in a basin, spreading responsibility in different ways. We illustrate these approaches in the Little Plover River basin in central Wisconsin, United States—home to a rich agricultural tradition, with farmland and urban areas both in close proximity to a groundwater-dependent trout stream. Groundwater withdrawals have reduced baseflow supplying the Little Plover River below a legally established minimum. The techniques in this work were developed in response to engaged stakeholders with various interests and goals for the basin. They sought to develop a collaborative management plan at a watershed scale that restores the flow rate in the river in a manner that incorporates principles of shared governance and results in effective and minimally disruptive changes in groundwater extraction practices.
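
A plain k-means sketch of the well-grouping step mentioned above, clustering wells on (x, y, depletion potential); the coordinates and the deterministic initialisation are illustrative assumptions:

```python
def kmeans(points, k, iters=50):
    """Plain k-means: assign each point to its nearest center (squared
    Euclidean distance), then recompute centers as cluster means."""
    centers = list(points[:k])                 # deterministic init for the demo
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # recompute centers; keep the old center if a cluster emptied out
        centers = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl
                   else centers[j] for j, cl in enumerate(clusters)]
    return centers, clusters

# Wells as (x, y, depletion potential) -- the coordinates are made up.
wells = [(0.0, 0.0, 0.9), (0.1, 0.2, 0.8), (5.0, 5.0, 0.1), (5.2, 4.9, 0.2)]
centers, groups = kmeans(wells, k=2)
```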

  2. Construction of optimal codes with homogeneous distance

    Institute of Scientific and Technical Information of China (English)

    丁健; 李红菊

    2015-01-01

    Based on the torsion codes of a (1 + λu)-constacyclic code of arbitrary length over R(p^m, k) = F_{p^m}[u]/⟨u^k⟩, a bound for the homogeneous distance of a (1 + λu)-constacyclic code of arbitrary length over R(p^m, k) is obtained, and the exact homogeneous distances of some (1 + λu)-constacyclic codes over R(p^m, k) are determined, where λ is a unit of R(p^m, k). Furthermore, a new distance-preserving Gray map from R(p^m, k)^N (homogeneous distance) to F_{p^m}^{p^{m(k-1)}N} (Hamming distance) is defined. It is proved that the Gray image of a linear (1 + λu)-constacyclic code of arbitrary length over R(p^m, k) is a linear code over F_{p^m}, and some optimal linear codes over F2, F3, and F4 are constructed under this Gray map.
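
The paper's Gray map generalizes a classical construction. For the smallest ring in the family, R(2, 2) = F2[u]/⟨u²⟩ = F2 + uF2, the standard distance-preserving map is φ(a + ub) = (b, a + b), which doubles the length, matching the image length p^{m(k-1)}N = 2N; the pair encoding of ring elements below is an implementation choice:

```python
def gray_map(codeword):
    """Classical Gray map on F2 + uF2, elements encoded as pairs (a, b)
    standing for a + ub: phi(a + ub) = (b, a + b) over F2."""
    image = []
    for a, b in codeword:
        image += [b, (a + b) % 2]
    return image

def hom_weight(codeword):
    """Homogeneous weight on F2 + uF2: wt(0)=0, wt(u)=2, wt(1)=wt(1+u)=1,
    which the Gray map carries to Hamming weight."""
    table = {(0, 0): 0, (0, 1): 2, (1, 0): 1, (1, 1): 1}
    return sum(table[x] for x in codeword)

# Distance preservation on a sample word: 1+u followed by u.
w = [(1, 1), (0, 1)]
```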

  3. An intelligent agent for optimal river-reservoir system management

    Science.gov (United States)

    Rieker, Jeffrey D.; Labadie, John W.

    2012-09-01

    A generalized software package is presented for developing an intelligent agent for stochastic optimization of complex river-reservoir system management and operations. Reinforcement learning is an approach to artificial intelligence for developing a decision-making agent that learns the best operational policies without the need for explicit probabilistic models of hydrologic system behavior. The agent learns these strategies experientially in a Markov decision process through observational interaction with the environment and simulation of the river-reservoir system using well-calibrated models. The graphical user interface for the reinforcement learning process controller includes numerous learning method options and dynamic displays for visualizing the adaptive behavior of the agent. As a case study, the generalized reinforcement learning software is applied to developing an intelligent agent for optimal management of water stored in the Truckee river-reservoir system of California and Nevada for the purpose of streamflow augmentation for water quality enhancement. The intelligent agent successfully learns long-term reservoir operational policies that specifically focus on mitigating water temperature extremes during persistent drought periods that jeopardize the survival of threatened and endangered fish species.
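
The reinforcement-learning core of such an agent can be illustrated with tabular Q-learning on a toy reservoir MDP; the actual package couples the agent to calibrated river-reservoir simulation models, and all states, actions, and rewards below are invented for illustration:

```python
import random

def q_learn_reservoir(steps=3000, alpha=0.1, gamma=0.9, eps=0.1):
    """Toy MDP: storage level 0..4, inflow of 1 unit per step, action =
    release 0 or 1 unit.  Reward +1 for augmenting streamflow (a release),
    -2 for spilling over capacity.  Epsilon-greedy tabular Q-learning."""
    Q = {(s, a): 0.0 for s in range(5) for a in (0, 1)}
    s = 2
    for _ in range(steps):
        acts = [0, 1] if s > 0 else [0]        # cannot release an empty reservoir
        if random.random() < eps:
            a = random.choice(acts)
        else:
            a = max(acts, key=lambda x: Q[(s, x)])
        spill = 1 if s - a + 1 > 4 else 0
        s2 = min(4, s - a + 1)
        r = a - 2 * spill
        # standard Q-learning temporal-difference update
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in (0, 1))
                              - Q[(s, a)])
        s = s2
    return Q
```

At a full reservoir the agent should learn that releasing (augmenting streamflow) beats spilling.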

  4. Optimal management of nausea and vomiting of pregnancy

    Directory of Open Access Journals (Sweden)

    Neda Ebrahimi

    2010-08-01

    Full Text Available Neda Ebrahimi (1,2), Caroline Maltepe (2), Adrienne Einarson (2). 1: Pharmaceutical Sciences, University of Toronto, Toronto, Ontario, Canada; 2: Motherisk Program, The Hospital for Sick Children, Toronto, Ontario, Canada. Abstract: Nausea and vomiting of pregnancy (NVP) is a common medical condition in pregnancy with significant physical and psychological morbidity. Up to 90% of women will suffer from NVP symptoms in the first trimester of pregnancy, with up to 2% developing hyperemesis gravidarum, which is NVP at its worst, leading to hospitalization and even death in extreme cases. Optimal management of NVP begins with nonpharmacological approaches: use of ginger, acupressure, vitamin B6, and dietary adjustments. The positive impact of these noninvasive, inexpensive, and safe methods has been demonstrated. Pharmacological treatments are available with varying effectiveness; however, the only drug marketed specifically for the treatment of NVP in pregnancy is Diclectin® (vitamin B6 and doxylamine). In addition, the Motherisk algorithm provides a guideline for the use of safe and effective drugs for the treatment of NVP. Optimal medical management of symptoms will ensure the mental and physical wellbeing of expecting mothers and their developing babies during this often stressful and difficult time period. Dismissing NVP as an inconsequential part of pregnancy can have serious ramifications for both mother and baby. Keywords: pharmacological/nonpharmacological treatments, NVP

  5. Information-Theoretic Viewpoints on Optimal Causal Coding-Decoding Problems

    CERN Document Server

    Gorantla, Siva

    2011-01-01

    In this paper we consider an interacting two-agent sequential decision-making problem consisting of a Markov source process, a causal encoder with feedback, and a causal decoder. Motivated by a desire to foster links between control and information theory, we augment the standard formulation by considering general alphabets and a cost function operating on current and previous symbols. Using dynamic programming, we provide a structural result whereby an optimal scheme exists that operates on appropriate sufficient statistics. We emphasize an example where the decoder alphabet lies in a space of beliefs on the source alphabet, and the additive cost function is a log likelihood ratio pertaining to sequential information gain. We also consider the inverse optimal control problem, where a fixed encoder/decoder pair satisfying statistical conditions is shown to be optimal for some cost function, using probabilistic matching. We provide examples of the applicability of this framework to communication with feedback,...

  6. NOVEL BIPHASE CODE -INTEGRATED SIDELOBE SUPPRESSION CODE

    Institute of Scientific and Technical Information of China (English)

    Wang Feixue; Ou Gang; Zhuang Zhaowen

    2004-01-01

    A kind of novel binary phase code, named the sidelobe suppression code, is proposed in this paper. It is defined as the code whose corresponding optimal sidelobe suppression filter outputs the minimum sidelobes. It is shown that there do exist sidelobe suppression codes better than the conventional optimal codes, the Barker codes. For example, the sidelobe suppression code of length 11 with a filter of length 39 achieves a sidelobe level up to 17 dB better than that of the Barker code with the same code length and filter length.
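
The Barker benchmark mentioned above is easy to reproduce: the length-11 Barker code has a matched-filter mainlobe of 11 and aperiodic autocorrelation sidelobes of magnitude at most 1. A quick check in pure Python (this says nothing about the proposed code or its length-39 filter):

```python
def aperiodic_autocorr(code):
    """Aperiodic autocorrelation of a binary phase code at lags 0..n-1."""
    n = len(code)
    return [sum(code[i] * code[i + k] for i in range(n - k))
            for k in range(n)]

# The length-11 Barker sequence: + + + - - - + - - + -
barker11 = [1, 1, 1, -1, -1, -1, 1, -1, -1, 1, -1]
acf = aperiodic_autocorr(barker11)
# acf[0] is the mainlobe; acf[1:] are the sidelobes (the Barker property
# is that every sidelobe has magnitude at most 1).
```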

  7. Knowledge Management for Topological Optimization Integration in Additive Manufacturing

    Directory of Open Access Journals (Sweden)

    Nicolas Gardan

    2014-01-01

    Full Text Available Engineering design optimization of mechanical structures is nowadays essential in the mechanical industry (automotive, aeronautics, etc.). To remain competitive in the globalized world, it is necessary to create and design structures that, in addition to meeting specific mechanical performance requirements, should be less expensive. Engineers must then design parts or assemblies that are a better compromise between mechanical and functional performance, weight, manufacturing costs, and so forth. In this context the Additive Manufacturing (AM) process offers the possibility to avoid tooling and manufacture the part directly. There are numerous technologies using different kinds of materials. For each of these, there are at least two materials: the production material and the support one. Support material is, in most cases, cleaned away and becomes a manufacturing residue. Optimizing the material volume and the global mass of the product is an essential aim surrounding the integration of simulation in the additive manufacturing process. Moreover, the layer-by-layer technology of additive manufacturing allows the design of innovative objects, and the use of topological optimization in this context can create a very interesting combination. The purpose of our paper is to present the knowledge management of an AM trade-oriented tool which integrates the topological optimization of parts and internal patterns.

  8. Dynamic stochastic optimization models for air traffic flow management

    Science.gov (United States)

    Mukherjee, Avijit

    This dissertation presents dynamic stochastic optimization models for Air Traffic Flow Management (ATFM) that enable decisions to adapt to new information on evolving capacities of National Airspace System (NAS) resources. Uncertainty is represented by a set of capacity scenarios, each depicting a particular time-varying capacity profile of NAS resources. We use the concept of a scenario tree, in which multiple scenarios are possible initially. Scenarios are eliminated as possibilities in a succession of branching points, until the specific scenario that will be realized on a particular day is known. Thus the scenario tree branching provides updated information on evolving scenarios, and allows ATFM decisions to be re-addressed and revised. First, we propose a dynamic stochastic model for a single airport ground holding problem (SAGHP) that can be used for planning Ground Delay Programs (GDPs) when there is uncertainty about future airport arrival capacities. Ground delays of non-departed flights can be revised based on updated information from scenario tree branching. The problem is formulated so that a wide range of objective functions, including non-linear delay cost functions and functions that reflect equity concerns, can be optimized. Furthermore, the model improves on existing practice by ensuring efficient use of available capacity without necessarily exempting long-haul flights. Following this, we present a methodology and optimization models that can be used for decentralized decision making by individual airlines in the GDP planning process, using the solutions from the stochastic dynamic SAGHP. Airlines are allowed to perform cancellations and re-allocate slots to remaining flights by substitutions. We also present an optimization model that can be used by the FAA, after the airlines perform cancellations and substitutions, to re-utilize vacant arrival slots that are created due to cancellations. Finally, we present three stochastic integer programming

  9. Integer-Linear-Programing Optimization in Scalable Video Multicast with Adaptive Modulation and Coding in Wireless Networks

    Directory of Open Access Journals (Sweden)

    Dongyul Lee

    2014-01-01

    Full Text Available The advancement in wideband wireless network supports real time services such as IPTV and live video streaming. However, because of the sharing nature of the wireless medium, efficient resource allocation has been studied to achieve a high level of acceptability and proliferation of wireless multimedia. Scalable video coding (SVC) with adaptive modulation and coding (AMC) provides an excellent solution for wireless video streaming. By assigning different modulation and coding schemes (MCSs) to video layers, SVC can provide good video quality to users in good channel conditions and also basic video quality to users in bad channel conditions. For optimal resource allocation, a key issue in applying SVC in the wireless multicast service is how to assign MCSs and the time resources to each SVC layer in the heterogeneous channel condition. We formulate this problem with integer linear programming (ILP) and provide numerical results to show the performance under 802.16 m environment. The result shows that our methodology enhances the overall system throughput compared to an existing algorithm.
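
The structure of the ILP can be illustrated with a tiny exhaustive search: choose one MCS per SVC layer so the total airtime fits one frame and the number of decodable layers summed over users is maximised. The rate set, airtime model, and user capabilities below are illustrative assumptions, not the paper's 802.16m formulation:

```python
from itertools import product

def assign_mcs(layers, mcs_set, users):
    """Pick one MCS rate per SVC layer.  Each layer carries unit payload,
    so its airtime is 1/rate and the total must fit one frame (1.0).  A
    user with capability cap decodes a layer iff its rate <= cap, and SVC
    layers are only useful in order (base first).  Exhaustive search
    stands in for the paper's integer linear program."""
    best = (None, -1)
    for choice in product(mcs_set, repeat=layers):
        airtime = sum(1.0 / r for r in choice)
        if airtime > 1.0:
            continue                      # does not fit in the frame
        util = 0
        for cap in users:
            for layer_rate in choice:     # layers must be received in order
                if layer_rate <= cap:
                    util += 1
                else:
                    break
        if util > best[1]:
            best = (choice, util)
    return best

# Two layers, MCS rates {1, 2, 4}, four users with decoding capabilities.
choice, util = assign_mcs(layers=2, mcs_set=(1, 2, 4), users=(1, 2, 4, 4))
```

Choosing rate 2 for both layers serves the three capable users with both layers (utility 6), beating the faster but less decodable assignments.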

  10. Integer-linear-programing optimization in scalable video multicast with adaptive modulation and coding in wireless networks.

    Science.gov (United States)

    Lee, Dongyul; Lee, Chaewoo

    2014-01-01

    The advancement in wideband wireless network supports real time services such as IPTV and live video streaming. However, because of the sharing nature of the wireless medium, efficient resource allocation has been studied to achieve a high level of acceptability and proliferation of wireless multimedia. Scalable video coding (SVC) with adaptive modulation and coding (AMC) provides an excellent solution for wireless video streaming. By assigning different modulation and coding schemes (MCSs) to video layers, SVC can provide good video quality to users in good channel conditions and also basic video quality to users in bad channel conditions. For optimal resource allocation, a key issue in applying SVC in the wireless multicast service is how to assign MCSs and the time resources to each SVC layer in the heterogeneous channel condition. We formulate this problem with integer linear programming (ILP) and provide numerical results to show the performance under 802.16 m environment. The result shows that our methodology enhances the overall system throughput compared to an existing algorithm.

  11. Integer-Linear-Programing Optimization in Scalable Video Multicast with Adaptive Modulation and Coding in Wireless Networks

    Science.gov (United States)

    Lee, Chaewoo

    2014-01-01

    The advancement in wideband wireless network supports real time services such as IPTV and live video streaming. However, because of the sharing nature of the wireless medium, efficient resource allocation has been studied to achieve a high level of acceptability and proliferation of wireless multimedia. Scalable video coding (SVC) with adaptive modulation and coding (AMC) provides an excellent solution for wireless video streaming. By assigning different modulation and coding schemes (MCSs) to video layers, SVC can provide good video quality to users in good channel conditions and also basic video quality to users in bad channel conditions. For optimal resource allocation, a key issue in applying SVC in the wireless multicast service is how to assign MCSs and the time resources to each SVC layer in the heterogeneous channel condition. We formulate this problem with integer linear programming (ILP) and provide numerical results to show the performance under 802.16 m environment. The result shows that our methodology enhances the overall system throughput compared to an existing algorithm. PMID:25276862

  12. Optimized Parameter Design for Turbo Code Block Synchronizer

    Institute of Scientific and Technical Information of China (English)

    吴岭; 张金荣; 刘胜利

    2011-01-01

    Based on research on channel code block synchronization, this paper designs the synchronizer parameters for the Turbo codes specified in the CCSDS (Consultative Committee for Space Data Systems) recommendation. A set of parameters is acquired for each code block length and code rate to optimize the average time to lock and the data availability, which are calculated with an analytically derived formula. The optimum synchronizer parameters are selected based on these computations, and their characteristics are summarized. The synchronizer parameters obtained here can serve as a reference for practical engineering.

  13. User's manual for the BNW-I optimization code for dry-cooled power plants. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    Braun, D.J.; Daniel, D.J.; De Mier, W.V.; Faletti, D.W.; Wiles, L.E.

    1977-01-01

    This User's Manual provides information on the use and operation of three versions of BNW-I, a computer code developed by Battelle, Pacific Northwest Laboratory (PNL) as a part of its activities under the ERDA Dry Cooling Tower Program. These three versions of BNW-I were used as reported elsewhere to obtain comparative incremental costs of electrical power production by two advanced concepts (one using plastic heat exchangers and one using ammonia as an intermediate heat transfer fluid) and a state-of-the-art system. The computer program offers a comprehensive method of evaluating the cost savings potential of dry-cooled heat rejection systems and components for power plants. This method goes beyond simple "figure-of-merit" optimization of the cooling tower and includes such items as the cost of replacement capacity needed on an annual basis and the optimum split between plant scale-up and replacement capacity, as well as the purchase and operating costs of all major heat rejection components. Hence, the BNW-I code is a useful tool for determining potential cost savings of new heat transfer surfaces, new piping or other components as part of an optimized system for a dry-cooled power plant.

  14. On the Optimality of Repetition Coding among Rate-1 DC-offset STBCs for MIMO Optical Wireless Communications

    KAUST Repository

    Sapenov, Yerzhan

    2017-07-06

    In this paper, an optical wireless multiple-input multiple-output communication system employing intensity-modulation direct-detection is considered. The performance of direct current offset space-time block codes (DC-STBC) is studied in terms of pairwise error probability (PEP). It is shown that among the class of DC-STBCs, the worst case PEP corresponding to the minimum distance between two codewords is minimized by repetition coding (RC), under both electrical and optical individual power constraints. It follows that among all DC-STBCs, RC is optimal in terms of worst-case PEP for static channels and also for varying channels under any turbulence statistics. This result agrees with previously published numerical results showing the superiority of RC in such systems. It also agrees with previously published analytic results on this topic under log-normal turbulence and further extends it to arbitrary turbulence statistics. This shows the redundancy of the time-dimension of the DC-STBC in this system. This result is further extended to sum power constraints with static and turbulent channels, where it is also shown that the time dimension is redundant, and the optimal DC-STBC has a spatial beamforming structure. Numerical results are provided to demonstrate the difference in performance for systems with different numbers of receiving apertures and different throughput.

  15. A wavelet based neural model to optimize and read out a temporal population code

    Directory of Open Access Journals (Sweden)

    Andre eLuvizotto

    2012-05-01

    Full Text Available It has been proposed that the dense excitatory local connectivity of the neo-cortex plays a specific role in the transformation of spatial stimulus information into a temporal representation or a temporal population code (TPC). TPC provides for a rapid, robust and high-capacity encoding of salient stimulus features with respect to position, rotation and distortion. The TPC hypothesis gives a functional interpretation to a core feature of the cortical anatomy: its dense local and sparse long-range connectivity. Thus far, the question of how the TPC encoding can be decoded in downstream areas has not been addressed. Here, we present a neural circuit that decodes the spectral properties of the TPC using a biologically plausible implementation of a Haar transform. We perform a systematic investigation of our model in a recognition task using a standardized stimulus set. We consider alternative implementations using either regular spiking or bursting neurons and a range of spectral bands. Our results show that our wavelet readout circuit provides for robust decoding of the TPC and further compresses the code without losing speed or quality of decoding. We show that in the TPC signal the relevant stimulus information is present in the frequencies around 100 Hz. Our results show that the TPC is constructed around a small number of coding components that can be well decoded by wavelet coefficients in a neuronal implementation. This solution to the TPC decoding problem suggests that cortical processing streams might well consist of sequential operations, where spatio-temporal transformations at lower levels form a compact stimulus encoding using TPC that is subsequently decoded back to a spatial representation using wavelet transforms. In addition, the results presented here show that different properties of the stimulus might be transmitted to further processing stages using different frequency components that are captured by appropriately tuned
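
The spectral readout is built on the Haar transform; one analysis/synthesis level can be written in a few lines (the averaging normalisation is a convention choice, not necessarily the one used in the neuronal implementation):

```python
def haar_transform(signal):
    """One level of the Haar transform: pairwise averages (approximation)
    and pairwise half-differences (detail).  Signal length must be even."""
    approx = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def inverse_haar(approx, detail):
    """Exact reconstruction from the averages and differences."""
    out = []
    for s, d in zip(approx, detail):
        out += [s + d, s - d]
    return out

a, d = haar_transform([4, 2, 5, 5])
```

Recursing on the approximation coefficients yields the full multi-level decomposition whose band energies the readout circuit uses.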

  16. PlayNCool: Opportunistic Network Coding for Local Optimization of Routing in Wireless Mesh Networks

    DEFF Research Database (Denmark)

    Pahlevani, Peyman; Roetter, Daniel Enrique Lucani; Pedersen, Morten Videbæk

    2013-01-01

    to determine how much a helper should wait before springing into action based on channel conditions for the optimization of a single link, i.e., the helper will play it cool by only speaking after it has heard enough to be truly useful. These techniques constitute a key feature of PlayNCool and are applicable...

  17. Real-coded genetic algorithm for optimal vibration control of flexible structure

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper studies the optimum location of actuators/sensors for active vibration control in flexible aerospace structures. The performance function is first built by maximizing the dissipation energy due to the control action, and a real-coded genetic algorithm is then proposed to produce a globally optimal solution. The feasibility and advantages of this algorithm are demonstrated with a standard test function and a cantilever with two collocated actuators/sensors, and the results are compared with those given in the literature.
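
A minimal real-coded GA of the kind the paper proposes, here with blend (midpoint) crossover, Gaussian mutation, and truncation selection applied to a one-dimensional test function; the operators and parameters are illustrative assumptions rather than the paper's exact configuration:

```python
import random

def real_ga(f, bounds, pop_size=30, gens=80, pc=0.9, pm=0.1):
    """Real-coded GA maximizing f: individuals are real numbers (no binary
    encoding), crossover blends two elite parents, mutation adds Gaussian
    noise scaled to the search interval."""
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=f, reverse=True)
        elite = scored[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = random.sample(elite, 2)
            c = 0.5 * (p1 + p2) if random.random() < pc else p1  # blend
            if random.random() < pm:
                c += random.gauss(0, 0.1 * (hi - lo))            # mutate
            children.append(min(hi, max(lo, c)))
        pop = elite + children                     # elitism keeps the best
    return max(pop, key=f)

# One-dimensional test function with its maximum at x = 1.5.
best = real_ga(lambda x: -(x - 1.5) ** 2, bounds=(-5.0, 5.0))
```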

  18. GRA prospectus: optimizing design and management of protected areas

    Science.gov (United States)

    Bernknopf, Richard; Halsing, David

    2001-01-01

    Protected areas comprise one major type of global conservation effort that has been in the form of parks, easements, or conservation concessions. Though protected areas are increasing in number and size throughout tropical ecosystems, there is no systematic method for optimally targeting specific local areas for protection, designing the protected area, and monitoring it, or for guiding follow-up actions to manage it or its surroundings over the long run. Without such a system, conservation projects often cost more than necessary and/or risk protecting ecosystems and biodiversity less efficiently than desired. Correcting these failures requires tools and strategies for improving the placement, design, and long-term management of protected areas. The objective of this project is to develop a set of spatially based analytical tools to improve the selection, design, and management of protected areas. In this project, several conservation concessions will be compared using an economic optimization technique. The forest land use portfolio model is an integrated assessment that measures investment in different land uses in a forest. The case studies of individual tropical ecosystems are developed as forest (land) use and preservation portfolios in a geographic information system (GIS). Conservation concessions involve a private organization purchasing development and resource access rights in a certain area and retiring them. Forests are put into conservation, and those people who would otherwise have benefited from extracting resources or selling the right to do so are compensated. Concessions are legal agreements wherein the exact amount and nature of the compensation result from a negotiated agreement between an agent of the conservation community and the local community. Funds are placed in a trust fund, and annual payments are made to local communities and regional/national governments. The payments are made pending third-party verification that the forest expanse

  19. Performance of an Optimized Eta Model Code on the Cray T3E and a Network of PCs

    Science.gov (United States)

    Kouatchou, Jules; Rancic, Miodrag; Geiger, Jim

    2000-01-01

    In the year 2001, NASA will launch the satellite TRIANA that will be the first Earth observing mission to provide a continuous, full disk view of the sunlit Earth. As a part of the HPCC Program at NASA GSFC, we have started a project whose objectives are to develop and implement a 3D cloud data assimilation system, by combining TRIANA measurements with model simulation, and to produce accurate statistics of global cloud coverage as an important element of the Earth's climate. For simulation of the atmosphere within this project we are using the NCEP/NOAA operational Eta model. In order to compare TRIANA and the Eta model data on approximately the same grid without significant downscaling, the Eta model will be integrated at a resolution of about 15 km. The integration domain (from -70 to +70 deg in latitude and 150 deg in longitude) will cover most of the sunlit Earth disc and will continuously rotate around the globe following TRIANA. The cloud data assimilation is supposed to run and produce 3D clouds on a near real-time basis. Such a numerical setup and integration design is very ambitious and computationally demanding. Thus, though the Eta model code has been very carefully developed and its computational efficiency has been systematically polished during the years of operational implementation at NCEP, the current MPI version may still have problems with memory and efficiency for the TRIANA simulations. Within this work, we optimize a parallel version of the Eta model code on a Cray T3E and a network of PCs (the HIVE) in order to improve its overall efficiency. Our optimization procedure consists of introducing dynamically allocated arrays to reduce the size of static memory, and optimizing on a single processor by splitting loops to limit the number of streams. All the presented results are derived using an integration domain centered at the equator, with a size of 60 x 60 deg, and with horizontal resolutions of 1/2 and 1/3 deg, respectively. In accompanying

  20. Redefining Secondary Forests in the Mexican Forest Code: Implications for Management, Restoration, and Conservation

    Directory of Open Access Journals (Sweden)

    Francisco J. Román-Dañobeytia

    2014-05-01

    Full Text Available The Mexican Forest Code establishes structural reference values to differentiate between secondary and old-growth forests and requires a management plan when secondary forests become old-growth and potentially harvestable forests. The implications of this regulation for forest management, restoration, and conservation were assessed in the context of the Calakmul Biosphere Reserve, which is located in the Yucatan Peninsula. The basal area and stem density thresholds currently used by the legislation to differentiate old-growth from secondary forests are 4 m2/ha and 15 trees/ha (trees with a diameter at breast height of >25 cm); however, our research indicates that these values should be increased to 20 m2/ha and 100 trees/ha, respectively. Given that a management plan is required when secondary forests become old-growth forests, many landowners avoid forest-stand development by engaging in slash-and-burn agriculture or cattle grazing. We present evidence that deforestation and land degradation may prevent the natural regeneration of late-successional tree species of high ecological and economic importance. Moreover, we discuss the results of this study in the light of an ongoing debate in the Yucatan Peninsula between policy makers, non-governmental organizations (NGOs), landowners, and researchers regarding the modification of this regulation to redefine the concept of acahual (secondary forest) and to facilitate forest management and restoration with valuable timber tree species.

  1. Swiss Foundation Code 2015 principles and recommendations for the establishment and management of grant-making foundations

    CERN Document Server

    Sprecher, Thomas; Schnurbein, Georg von

    2015-01-01

    The publication 'Swiss Foundation Code' contains practical governance guidelines on the topics of the establishment of foundations, their organisation, management and supervision, their charitable work and also on finance and investment policy for the contemporary and professional management of charitable foundations.

  2. SEJITS: embedded specializers to turn patterns-based designs into optimized parallel code

    CERN Document Server

    CERN. Geneva

    2012-01-01

    All software should be parallel software. This is a natural result of the transition to a many-core world. For a small fraction of the world's programmers (efficiency programmers), this is not a problem. They enjoy mapping algorithms onto the details of a particular system and are well served by low-level languages and OpenMP, MPI, or OpenCL. Most programmers, however, are "domain specialists" who write code. They are too busy working in their domain of choice (such as physics) to master the intricacies of each computer they use. How do we make these programmers productive without giving up performance? We have been working with a team at UC Berkeley's ParLab to address this problem. The key is a clear software architecture expressed in terms of design patterns that exposes the concurrency in a problem. The resulting code is written using a patterns-based framework within a high-level, productivity language (such as Python). Then a separate system is used by a small group o...
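    The idea sketched in the (truncated) abstract — a specializer that takes a pattern expressed in a productivity language and generates efficient code for it — can be caricatured in a few lines of Python. This is an invented toy, not the ParLab/SEJITS implementation: the "specializer" here merely unrolls an elementwise-map pattern into straight-line generated code at call time.

```python
def specialize_map(fn, n):
    """Toy 'embedded specializer': given an elementwise-map pattern and a
    known length n, generate straight-line code for it at call time. (An
    invented caricature of the idea, not the ParLab implementation.)"""
    lines = ["def _specialized(xs, out):"]
    for i in range(n):
        lines.append(f"    out[{i}] = fn(xs[{i}])")   # fully unrolled
    lines.append("    return out")
    namespace = {"fn": fn}
    exec("\n".join(lines), namespace)
    return namespace["_specialized"]

square3 = specialize_map(lambda x: x * x, 3)
print(square3([1, 2, 3], [0, 0, 0]))  # [1, 4, 9]
```

    A real specializer would emit and compile C, CUDA, or OpenCL instead of Python, but the division of labor is the same: the domain specialist writes the pattern, the specializer owns the lowering.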

  3. Architectural framework for resource management optimization over heterogeneous wireless networks

    Science.gov (United States)

    Tselikas, Nikos; Kapellaki, Sofia; Koutsoloukas, Eleftherios; Venieris, Iakovos S.

    2003-11-01

    The main goal of the wireless telecommunication world can be briefly summarized as: "communication anywhere, anytime, any-media and principally at high data rates." On the other hand, this goal is in conflict with the co-existence of the many different current and emerging wireless systems covering almost the whole world, since each one follows its own architecture and is built on its own particular foundations. This results in a heterogeneous picture of the hyper-set of wireless communication systems. The scope of this paper is to present a highly innovative and scalable architectural framework, which allows different wireless systems to be interconnected in a common way, able to achieve resource management optimization, augmentation of network performance and maximum utilization of the networks. It describes a hierarchical management system covering GSM, GPRS, UMTS and WLAN networks, each one individually, as well as a unified, wide wireless telecommunication system including all of the latter, in order to provide enhanced capacity and quality via the accomplished network interworking. The main idea is to monitor all the resources using distributed monitoring components, with the intention of feeding an additional centralized system with alarms, so that a set of management techniques can be selected and applied where needed. In parallel, the centralized system is able to combine the aforementioned alarms with business models for the efficient use of the available networks according to the type of user, the type of application, as well as the user's location.

  4. Optimal pricing and lot sizing vendor managed inventory

    Directory of Open Access Journals (Sweden)

    Mohsen Ziaee

    2010-07-01

    Full Text Available Vendor Managed Inventory (VMI) is one of the effective techniques for managing the inventory in a supply chain. VMI models have been proven to reduce the cost of inventory compared with the traditional economic order quantity method under some conditions such as constant demand and production expenditure. However, the modeling of the VMI problem has never been studied under some realistic assumptions such as price-dependent demand. In this paper, three problem formulations are proposed. In the first problem formulation, we study a VMI problem with one buyer and one supplier when demand is considered to be a function of price and price elasticity of demand, and production cost is also a function of demand. The proposed model is formulated and solved in the form of geometric programming. For the second and the third models, we consider a VMI problem with two buyers and two suppliers, assuming that each buyer centre is relatively close to the other buyer centre. Each supplier has only one product, which is different from the product of the other supplier. The two suppliers cooperate in customer relationship management and the two buyers cooperate in supplier relationship management as well, so the suppliers send the orders of the two buyers by one vehicle, simultaneously. For the third model, an additional assumption which is practically applicable and reasonable is considered. For all the proposed models, the optimal solution is compared with the traditional one. We demonstrate the implementation of our proposed models using some numerical examples.

  5. Optimal Control for Bufferbloat Queue Management Using Indirect Method with Parametric Optimization

    Directory of Open Access Journals (Sweden)

    Amr Radwan

    2016-01-01

    Full Text Available Because memory buffers have become larger and cheaper, they have been put into network devices to reduce the number of lost packets and improve network performance. However, the consequences of large buffers are long queues at network bottlenecks and throughput saturation, which has recently been noticed in the research community as the bufferbloat phenomenon. To address such issues, in this article, we design a forward-backward optimal control queue algorithm based on an indirect approach with parametric optimization. The cost function which we want to minimize represents a trade-off between queue length and packet loss rate performance. Through the integration of an indirect approach with parametric optimization, our proposal has advantages of scalability and accuracy compared to direct approaches, while still maintaining good throughput and a shorter queue length than several existing queue management algorithms. Numerical analysis, simulation in ns-2, and experimental results are all provided to substantiate the efficiency of our proposal. In detailed comparisons to other conventional algorithms, the proposed procedure runs much faster than direct collocation methods while maintaining a desired short queue (≈40 packets in simulation and ≈80 ms in the experimental test).
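    The trade-off the cost function encodes — shorter queues versus fewer dropped packets — can be illustrated with a deliberately simple sketch (all dynamics and weights invented, far simpler than the paper's forward-backward control): a fluid queue at a bottleneck with a constant drop probability p, where p is found by direct parametric search.

```python
def simulate(p, arrivals=5, service=4, steps=200):
    """Fluid bottleneck queue: each step `arrivals` packets are offered,
    a fraction p is dropped by the AQM, up to `service` packets depart.
    Returns (average queue length, loss rate)."""
    q = qsum = lost = offered = 0.0
    for _ in range(steps):
        lost += arrivals * p
        offered += arrivals
        q = max(q + arrivals * (1 - p) - service, 0.0)
        qsum += q
    return qsum / steps, lost / offered

def best_drop_probability(w_queue=1.0, w_loss=100.0):
    """Parametric optimization: search the single parameter p directly,
    minimizing a weighted queue-length/loss trade-off."""
    cost = lambda p: w_queue * simulate(p)[0] + w_loss * simulate(p)[1]
    return min((i / 100 for i in range(101)), key=cost)
```

    With these made-up weights the search settles on the smallest drop rate that keeps the bottleneck queue from growing, which is exactly the kind of operating point an AQM is after.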

  6. CONGESTION MANAGEMENT BY OPTIMAL ALLOCATION OF FACTS CONTROLLERS USING HYBRID FISH BEE OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    S. Thangalakshmi

    2014-01-01

    Full Text Available The role of the Independent System Operator (ISO) in the restructured power industry includes system control, capacity planning, transmission tariffs and congestion management, the most challenging task being minimizing the congestion. One of the popular techniques used to alleviate congestion is Flexible AC Transmission Systems (FACTS) devices. The power system generally operates near its rated capacity in a deregulated market because of intensive usage of transmission grids. So, the major issues that need to be addressed are improving the voltage profile and reducing the power loss in the electrical network. Motivation: The placement of FACTS devices can improve the power flow in the line, maintain the bus profile and reduce the losses. However, finding the ideal location is an NP-hard problem. This study presents a novel heuristic method to determine the types of FACTS devices and their optimal locations in a power system without violating the thermal and voltage limits. A power flow sensitivity index to find the optimal location of the UPFC is suggested in this study. A hybrid fish-bee swarm optimization is proposed, based on the Artificial Bee Colony (ABC) and Fish School Search (FSS) methods. The proposed algorithm is tested on the IEEE 30-bus system and line performances are studied.
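    For readers unfamiliar with the building blocks of the proposed hybrid, here is a minimal sketch of the Artificial Bee Colony half on a standard test function (the FSS half and the power-system encoding are omitted; the function and parameter names are invented):

```python
import random

def abc_minimize(f, dim, bounds, colony=20, limit=10, iters=200, seed=1):
    """Minimal Artificial Bee Colony sketch: 'food sources' are candidate
    solutions; bees perturb a source toward a random partner and keep the
    move only if it improves; sources that stagnate past `limit` trials
    are abandoned and re-seeded by a scout."""
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(colony)]
    trials = [0] * colony
    best = min(foods, key=f)
    for _ in range(iters):
        for i in range(colony):
            k, d = rng.randrange(colony), rng.randrange(dim)
            cand = foods[i][:]
            cand[d] += rng.uniform(-1, 1) * (foods[i][d] - foods[k][d])
            cand[d] = min(max(cand[d], lo), hi)
            if f(cand) < f(foods[i]):
                foods[i], trials[i] = cand, 0
            else:
                trials[i] += 1
            if trials[i] > limit:      # scout: abandon a stagnant source
                foods[i], trials[i] = [rng.uniform(lo, hi) for _ in range(dim)], 0
        best = min([best] + foods, key=f)
    return best

sphere = lambda x: sum(v * v for v in x)   # standard test function
```

    In the paper's setting the "position" would encode candidate FACTS types and bus locations, and f would be the congestion/loss objective under the thermal and voltage constraints.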

  7. Performance Analysis of Code Optimization Based on TMS320C6678 Multi-core DSP

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    In the development of modern DSPs, the use of C/C++ as a development language has become a trend. Optimization of C/C++ programs has therefore become an important link in DSP software development. This article describes the structural features of the TMS320C6678 processor, illustrates the principles of efficient optimization methods for C/C++, and analyzes the results.

  8. Code Optimization for the Choi-Williams Distribution for ELINT Applications

    Science.gov (United States)

    2009-12-01

    PAGES: 98. SUBJECT TERMS: Choi-Williams Distribution, Signal Processing, Algorithm Optimization, C programming, Low Probability of Intercept (LPI). "…not be useful for the C programming language, but was included in this thesis because it could be useful when developing a hardware solution to these…" "…representation. Since there is no such single operation in the C programming language, to utilize this method would involve masking out the eight-bit…"

  9. Optimal management of bone metastases in breast cancer patients

    Directory of Open Access Journals (Sweden)

    Wong MH

    2011-05-01

    Full Text Available MH Wong, N Pavlakis; Department of Medical Oncology, Royal North Shore Hospital, Sydney, NSW, Australia. Abstract: Bone metastasis in breast cancer is a significant clinical problem. It not only indicates incurable disease with a guarded prognosis, but is also associated with skeletal-related morbidities including bone pain, pathological fractures, spinal cord compression, and hypercalcemia. In recent years, the mechanism of bone metastasis has been further elucidated. Bone metastasis involves a vicious cycle of close interaction between the tumor and the bone microenvironment. In patients with bone metastases, the goal of management is to prevent further skeletal-related events, manage complications, reduce bone pain, and improve quality of life. Bisphosphonates are a proven therapy for the above indications. Recently, a drug of a different class, the RANK ligand antibody denosumab, has been shown to reduce skeletal-related events more than the bisphosphonate zoledronic acid. Other strategies of clinical value may include surgery, radiotherapy, radiopharmaceuticals, and, of course, effective systemic therapy. In early breast cancer, bisphosphonates may have an antitumor effect and prevent both bone and non-bone metastases. Whilst two important Phase III trials with conflicting results have led to controversy on this topic, final results from these and other key Phase III trials must still be awaited before a firm conclusion can be drawn about the use of bisphosphonates in this setting. Advances in bone markers, predictive biomarkers, multi-imaging modalities, and the introduction of novel agents have ushered in a new era of proactive management for bone metastases in breast cancer. Keywords: breast cancer, bone metastases, bisphosphonates, denosumab, biomarkers, optimal management

  10. Optimization Under Uncertainty for Management of Renewables in Electricity Markets

    DEFF Research Database (Denmark)

    Zugno, Marco

    This thesis deals with the development and application of models for decision-making under uncertainty to support the participation of renewables in electricity markets. The output of most renewable sources, e.g., wind, is intermittent and, furthermore, it can only be predicted with a limited accuracy. As a result of their non-dispatchable and stochastic nature, the management of renewables poses new challenges as compared to conventional sources of electricity. Focusing in particular on short-term electricity markets, both the trading activities of market participants (producers, retailers…) …-by-price. In a similar setup, the optimal trading (and pricing) problem for a retailer connected to flexible consumers is considered. Finally, market and system operators are challenged by the increasing penetration of renewables, which put stress on markets that were designed to accommodate a generation mix largely…

  11. Optimizing pyramided transgenic Bt crops for sustainable pest management.

    Science.gov (United States)

    Carrière, Yves; Crickmore, Neil; Tabashnik, Bruce E

    2015-02-01

    Transgenic crop pyramids producing two or more Bacillus thuringiensis (Bt) toxins that kill the same insect pest have been widely used to delay evolution of pest resistance. To assess the potential of pyramids to achieve this goal, we analyze data from 38 studies that report effects of ten Bt toxins used in transgenic crops against 15 insect pests. We find that compared with optimal low levels of insect survival, survival on currently used pyramids is often higher for both susceptible insects and insects resistant to one of the toxins in the pyramid. Furthermore, we find that cross-resistance and antagonism between toxins used in pyramids are common, and that these problems are associated with the similarity of the amino acid sequences of domains II and III of the toxins, respectively. This analysis should assist in future pyramid design and the development of sustainable resistance management strategies.

  12. Comparison of risk-based optimization models for reservoir management

    Energy Technology Data Exchange (ETDEWEB)

    Mahootchi, M. [Amirkabir Univ. of Technology, Tehran (Iran, Islamic Republic of). Dept. of Industrial Engineering; Ponnambalam, K.; Tizhoosh, H.R. [Waterloo Univ., ON (Canada). Dept. of Systems Design Engineering

    2010-01-15

    The stochastic nature of input variables in water resource management problems must be carefully considered during decision-making processes. This paper used a single-reservoir optimization problem in which two-stage stochastic programming (TSP) and Fletcher-Ponnambalam (FP) were used to generate open-loop policies where inflows and water prices were uncertain. A simulation was performed to measure the performance of the FP and TSP techniques in embedding risk and producing comparable policies. A simulation-based Q-learning algorithm based on reinforcement learning (RL) was used to determine the squared action-value function for each action-state pair produced by the algorithm. The methods were used to produce the trade-off curve between expected benefits and standard deviations of benefits. The study showed that the FP method does not require simulation, while the TSP and Q-learning methods both require simulations. 39 refs., 9 tabs., 4 figs.
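    As a rough illustration of the simulation-based Q-learning component (with entirely made-up reservoir dynamics, prices, and penalties — not the paper's model), a tabular agent can learn release decisions for a tiny discretized reservoir:

```python
import random

def q_learning_reservoir(episodes=2000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Toy simulation-based Q-learning for one reservoir. States are
    discrete storage levels 0..4, actions are release amounts 0..2,
    reward = revenue from release minus a penalty for spilling."""
    rng = random.Random(seed)
    levels, releases, cap = range(5), range(3), 4
    Q = {(s, a): 0.0 for s in levels for a in releases}
    for _ in range(episodes):
        s = rng.choice(levels)
        for _ in range(24):                       # one day of hourly decisions
            if rng.random() < eps:                # explore
                a = rng.choice(releases)
            else:                                 # exploit current estimates
                a = max(releases, key=lambda x: Q[(s, x)])
            release = min(a, s)                   # cannot release more than stored
            inflow = rng.choice([0, 1])
            nxt = s - release + inflow
            spill = max(nxt - cap, 0)
            nxt = min(nxt, cap)
            reward = 1.0 * release - 2.0 * spill
            target = reward + gamma * max(Q[(nxt, b)] for b in releases)
            Q[(s, a)] += alpha * (target - Q[(s, a)])
            s = nxt
    return Q
```

    With these invented numbers, a full reservoir should learn that releasing beats holding water and risking a spill penalty.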

  13. Optimized QKD BB84 protocol using quantum dense coding and CNOT gates: feasibility based on probabilistic optical devices

    Science.gov (United States)

    Gueddana, Amor; Attia, Moez; Chatta, Rihab

    2014-05-01

    In this work, we simulate a fiber-based Quantum Key Distribution Protocol (QKDP) BB84 working at the telecom wavelength 1550 nm, taking into consideration an optimized attack strategy. We consider a quantum channel composed of a probabilistic Single Photon Source (SPS), single-mode optical fiber and a quantum detector with high efficiency. First, we show the advantages of using Quantum Dots (QD) embedded in micro-cavities compared to Heralded Single Photon Sources (HSPS). Second, we show that Eve always gains some information depending on the mean photon number per pulse of the SPS used, and therefore we propose an optimized version of the QKDP BB84 based on Quantum Dense Coding (QDC) that could be implemented by quantum CNOT gates. We evaluate the success probability of implementing the optimized QKDP BB84 when using today's probabilistic quantum optical devices for circuit realization. For our modeling we use an abstract probabilistic model of a CNOT gate based on linear optical components and having a success probability of sqrt(4/27), and we take into consideration the best SPS realizations, namely the QD and the HSPS, generating a single photon per pulse with a success probability of 0.73 and 0.37, respectively. We show that the protocol is totally secure against attacks but could be correctly implemented only with a success probability of a few percent.
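    A back-of-the-envelope check of the "few percent" figure, assuming a circuit composition that is our assumption, not stated in the abstract: two successful single-photon emissions and two successful CNOT gates, all independent, so the probabilities simply multiply.

```python
from math import sqrt

cnot = sqrt(4 / 27)      # success probability of the linear-optics CNOT gate
qd, hsps = 0.73, 0.37    # single-photon emission probability per pulse

# Assumed composition (not stated in the abstract): two successful photon
# emissions and two successful CNOT gates, independent, so probabilities
# multiply.
p_qd = qd**2 * cnot**2
p_hsps = hsps**2 * cnot**2
print(f"CNOT: {cnot:.3f}  QD circuit: {p_qd:.3f}  HSPS circuit: {p_hsps:.3f}")
# CNOT: 0.385  QD circuit: 0.079  HSPS circuit: 0.020
```

    Under this assumed composition the QD-based circuit lands at roughly 8% and the HSPS-based one at roughly 2% — consistent with the abstract's "few percent" order of magnitude.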

  14. Optimal modulation and coding scheme allocation of scalable video multicast over IEEE 802.16e networks

    Directory of Open Access Journals (Sweden)

    Tsai Chia-Tai

    2011-01-01

    Full Text Available With the rapid development of wireless communication technology and the rapid increase in demand for network bandwidth, IEEE 802.16e is an emerging network technique that has been deployed in many metropolises. In addition to the features of high data rate and large coverage, it also enables scalable video multicasting, which is a potentially promising application, over an IEEE 802.16e network. How to optimally assign the modulation and coding scheme (MCS) of the scalable video stream for the mobile subscriber stations to improve spectral efficiency and maximize utility is a crucial task. We formulate this MCS assignment problem as an optimization problem, called the total utility maximization problem (TUMP). This article transforms the TUMP into a precedence-constraint knapsack problem, which is an NP-complete problem. Then, a branch and bound method, which is based on two dominance rules and a lower bound, is presented to solve the TUMP. The simulation results show that the proposed branch and bound method can find the optimal solution efficiently.
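    The branch-and-bound ingredients named above — pruning rules and a bound — are easiest to see on the plain 0/1 knapsack problem; this sketch omits the precedence constraints that distinguish the TUMP:

```python
def knapsack_bb(values, weights, capacity):
    """Minimal 0/1 knapsack branch-and-bound: depth-first search over
    include/exclude decisions, pruned by a fractional-relaxation upper
    bound. (The paper's TUMP adds precedence constraints, omitted here.)"""
    order = sorted(range(len(values)), key=lambda i: values[i] / weights[i],
                   reverse=True)
    best = 0

    def bound(idx, value, room):
        # LP relaxation: fill greedily, allowing one fractional item.
        for i in order[idx:]:
            if weights[i] <= room:
                room -= weights[i]
                value += values[i]
            else:
                return value + values[i] * room / weights[i]
        return value

    def dfs(idx, value, room):
        nonlocal best
        if idx == len(order):
            best = max(best, value)
            return
        if bound(idx, value, room) <= best:
            return                    # prune: relaxation cannot beat incumbent
        i = order[idx]
        if weights[i] <= room:
            dfs(idx + 1, value + values[i], room - weights[i])
        dfs(idx + 1, value, room)

    dfs(0, 0, capacity)
    return best
```

    In the paper's setting the "items" are MCS choices per video layer, and the dominance rules tighten the pruning further.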

  15. MagRad: A code to optimize the operation of superconducting magnets in a radiation environment

    Energy Technology Data Exchange (ETDEWEB)

    Yeaw, Christopher T. [Univ. of Wisconsin, Madison, WI (United States)

    1995-01-01

    A powerful computational tool, called MagRad, has been developed which optimizes magnet design for operation in radiation fields. Specifically, MagRad has been used for the analysis and design modification of the cable-in-conduit conductors of the TF magnet systems in fusion reactor designs. Since the TF magnets must operate in a radiation environment which damages the material components of the conductor and degrades their performance, the optimization of conductor design must account not only for start-up magnet performance, but also shut-down performance. The degradation in performance consists primarily of three effects: reduced stability margin of the conductor; a transition out of the well-cooled operating regime; and an increased maximum quench temperature attained in the conductor. Full analysis of the magnet performance over the lifetime of the reactor includes: radiation damage to the conductor, stability, protection, steady state heat removal, shielding effectiveness, optimal annealing schedules, and finally costing of the magnet and reactor. Free variables include primary and secondary conductor geometric and compositional parameters, as well as fusion reactor parameters. A means of dealing with the radiation damage to the conductor, namely high temperature superconductor anneals, is proposed, examined, and demonstrated to be both technically feasible and cost effective. Additionally, two relevant reactor designs (ITER CDA and ARIES-II/IV) have been analyzed. Upon addition of pure copper strands to the cable, the ITER CDA TF magnet design was found to be marginally acceptable, although much room for both performance improvement and cost reduction exists. A cost reduction of 10-15% of the capital cost of the reactor can be achieved by adopting a suitable superconductor annealing schedule. In both of these reactor analyses, the performance predictive capability of MagRad and its associated costing techniques have been demonstrated.

  16. Multipurpose Water Reservoir Management: An Evolutionary Multiobjective Optimization Approach

    Directory of Open Access Journals (Sweden)

    Luís A. Scola

    2014-01-01

    Full Text Available The reservoirs that feed large hydropower plants should be managed so as to provide other uses for the water resources. Those uses include, for instance, flood control and avoidance, irrigation, and navigability of the rivers, among others. This work presents an evolutionary multiobjective optimization approach for the study of multiple water usages in multiple interlinked reservoirs, including both power generation objectives and objectives not related to energy generation. The classical evolutionary algorithm NSGA-II is employed as the basic multiobjective optimization machinery, modified in order to cope with specific problem features. The case studies, which include the analysis of a problem involving an objective of navigability on the river, are tailored to illustrate the usefulness of the data generated by the proposed methodology for decision-making on the problem of operation planning of multiple reservoirs with multiple usages. It is shown that it is even possible to use the generated data to determine the cost of any new usage of the water, in terms of the opportunity cost measured on the revenues related to electric energy sales.
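    The core mechanism NSGA-II adds over a single-objective GA is sorting a population into successive Pareto fronts. A compact, illustrative (O(n²)-per-front) version:

```python
def non_dominated_fronts(points):
    """Partition objective vectors (all minimized) into successive Pareto
    fronts — the sorting step at the heart of NSGA-II, shown here in a
    simple O(n^2)-per-front form."""
    def dominates(p, q):
        return (all(a <= b for a, b in zip(p, q))
                and any(a < b for a, b in zip(p, q)))
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts
```

    For two minimized objectives — say, lost energy revenue and navigability violation — the first front contains exactly the non-dominated trade-off solutions handed to the decision maker.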

  17. Multiobjective Optimization Methods for Congestion Management in Deregulated Power Systems

    Directory of Open Access Journals (Sweden)

    K. Vijayakumar

    2012-01-01

    Full Text Available Congestion management is one of the important functions performed by the system operator in a deregulated electricity market to ensure secure operation of the transmission system. This paper proposes two effective methods for transmission congestion alleviation in deregulated power systems. Congestion or overload in transmission networks is alleviated by rescheduling of generators and/or load shedding. Two objectives, conflicting in nature, (1) transmission line overload and (2) congestion cost, are optimized in this paper. The multiobjective fuzzy evolutionary programming (FEP) and nondominated sorting genetic algorithm II methods are used to solve this problem. FEP uses the combined advantages of fuzzy and evolutionary programming (EP) techniques and gives a better unique solution satisfying both objectives, whereas the nondominated sorting genetic algorithm (NSGA-II) gives a set of Pareto-optimal solutions. The methods provide an efficient and reliable algorithm for line overload alleviation due to critical line outages in deregulated power markets. The quality and usefulness of the algorithm are tested on the IEEE 30-bus system.

  18. Optimal Bidding Strategy for Renewable Microgrid with Active Network Management

    Directory of Open Access Journals (Sweden)

    Seung Wan Kim

    2016-01-01

    Full Text Available Active Network Management (ANM) enables a microgrid to optimally dispatch the active/reactive power of its Renewable Distributed Generation (RDG) and Battery Energy Storage System (BESS) units in real time. Thus, a microgrid with high penetration of RDGs can handle their uncertainties and variabilities to achieve stable operation using ANM. However, the actual power flow in the line connecting the main grid and microgrid may deviate significantly from the day-ahead bids if the bids are determined without consideration of the real-time adjustment through ANM, which will lead to a substantial imbalance cost. Therefore, this study proposes a formulation for obtaining an optimal bid which reflects the change of power flow in the connecting line under real-time adjustment using ANM. The proposed formulation maximizes the expected profit of the microgrid considering various network and physical constraints. The effectiveness of the proposed bidding strategy is verified through simulations with a 33-bus test microgrid. The simulation results show that the proposed bidding strategy improves the expected operating profit by reducing the imbalance cost to a greater degree than a basic bidding strategy that does not consider ANM.

  19. Integrated Research on the Organization Code Management System and the Organization Code E-records Management System

    Institute of Scientific and Technical Information of China (English)

    赵继业

    2011-01-01

    The organization code management system and the organization code e-records management system are the service-processing systems widely used by organization code administration offices at all levels. Because the two systems run independently, system maintenance costs increase and system resources are wasted, and the lack of synchronization between them in processing services also affects data checking. We therefore consider integrating the two into a single system, so that the whole range of code services can be handled in one place. This not only makes the code services systematic and complete, but is also a sound approach to optimizing the code service processes and promoting the healthy development of the code services.

  20. Equipment Inventory Management and Transaction Recording Using Bar Coding Scheme via VB6

    Directory of Open Access Journals (Sweden)

    Geoffrey T. Salvador, PECE

    2016-06-01

    Full Text Available The aim of the study is to implement a bar coding system, developed with VB6 and Microsoft Access, as a mechanism for recording and monitoring PUP ECE Laboratory transactions. The study was concerned with properly documenting and managing the daily transactions of the ECE Laboratory with the AutoLab System. Results showed that the AutoLab System effectively automated the recording of transactions, merging the existing manual methods into one recording mechanism. The Automated Laboratory system, coined AutoLab, merged the ECE Room Utilization Log Book, the ECE Borrower's Slip and the ECE Transaction Log Book into one complete package for transaction recording and equipment inventory monitoring.

  1. An HDFS optimization scheme based on GE coding

    Institute of Scientific and Technical Information of China (English)

    朱媛媛; 王晓京

    2013-01-01

    Concerning Hadoop Distributed File System (HDFS) data disaster recovery efficiency and the small-files problem, this paper presents an improved solution based on coding, which introduces an erasure-coding module (GE code) into HDFS. Instead of the multiple-replication strategy adopted by the original system, the module encodes HDFS files into a large number of slices and saves them dispersedly across the clusters of the storage system in a distributed fashion. The approach introduces the new concept of a slice: slices are classified and merged for storage in blocks, and a secondary index over slices is established to address the small-files issue. In the case of cluster failure, the original data can be recovered via decoding by collecting any roughly 70% of the slices. The method also introduces dynamic replication strategies, dynamically creating and deleting replicas to keep the whole cluster well load-balanced and to settle hotspot issues. Experiments on comparable storage clusters show the feasibility and advantages of the proposed solution in disaster recovery efficiency, the small-files problem, storage cost, and security.
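    The recover-from-a-subset property that the GE-code module provides can be demonstrated in miniature with the simplest possible erasure code: k data slices plus one XOR parity slice, which survives the loss of any single slice (the real GE code tolerates losing roughly 30% of the slices):

```python
from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(chunks):
    """k equal-length data slices plus one XOR parity slice."""
    return list(chunks) + [reduce(xor_bytes, chunks)]

def recover(slices, lost_index):
    """Rebuild any single missing slice by XOR-ing all surviving slices."""
    survivors = [s for i, s in enumerate(slices) if i != lost_index]
    return reduce(xor_bytes, survivors)

coded = encode([b"abcd", b"efgh", b"ijkl"])
print(recover(coded, 1))  # b'efgh'
```

    A single parity slice already beats plain replication on storage overhead (k+1 slices instead of 3k blocks); generalized codes like the paper's GE code extend the same idea to tolerate many simultaneous losses.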

  2. A systematic approach: optimization of healthcare operations with knowledge management.

    Science.gov (United States)

    Wickramasinghe, Nilmini; Bali, Rajeev K; Gibbons, M Chris; Choi, J H James; Schaffer, Jonathan L

    2009-01-01

    Effective decision making is vital in all healthcare activities. While this decision making is typically complex and unstructured, it requires the decision maker to gather multispectral data and information in order to make an effective choice when faced with numerous options. Unstructured decision making in dynamic and complex environments is challenging, and in almost every situation the decision maker is undoubtedly faced with information inferiority. The need for germane knowledge, pertinent information and relevant data is critical, and hence the value of harnessing knowledge and embracing the tools, techniques, technologies and tactics of knowledge management is essential to ensuring efficiency and efficacy in the decision-making process. The systematic approach and application of knowledge management (KM) principles and tools can provide the necessary foundation for improving decision-making processes in healthcare. A combination of Boyd's OODA Loop (Observe, Orient, Decide, Act) and the Intelligence Continuum provides an integrated, systematic and dynamic model for ensuring that the healthcare decision maker is always provided with the appropriate and necessary knowledge elements to help ensure that healthcare decision-making outcomes are optimized for maximal patient benefit. The example of orthopaedic operating room processes illustrates the application of the integrated model to support effective decision making in the clinical environment.

  3. Optimal management of night eating syndrome: challenges and solutions

    Directory of Open Access Journals (Sweden)

    Kucukgoncu S

    2015-03-01

    Full Text Available Suat Kucukgoncu, Margaretta Midura, Cenk Tek; Department of Psychiatry, Yale University, New Haven, CT, USA. Abstract: Night Eating Syndrome (NES) is a unique disorder characterized by a delayed pattern of food intake in which recurrent episodes of nocturnal eating and/or excessive food consumption occur after the evening meal. NES is a clinically important disorder due to its relationship to obesity, its association with other psychiatric disorders, and problems concerning sleep. However, NES often goes unrecognized by both health professionals and patients. The lack of knowledge regarding NES in clinical settings may lead to inadequate diagnoses and inappropriate treatment approaches. Therefore, the proper diagnosis of NES is the most important issue when identifying NES and providing treatment for this disorder. Clinical assessment tools such as the Night Eating Questionnaire may help health professionals working with populations vulnerable to NES. Although NES treatment studies are still in their infancy, antidepressant treatments and psychological therapies can be used for optimal management of patients with NES. Other treatment options such as melatonergic medications, light therapy, and the anticonvulsant topiramate also hold promise as future treatment options. The purpose of this review is to provide a summary of NES, including its diagnosis, comorbidities, and treatment approaches. Possible challenges in addressing patients with NES and management options are also discussed. Keywords: night eating, obesity, psychiatric disorders, weight, depression

  4. What is the optimal management option for occupational asthma?

    Directory of Open Access Journals (Sweden)

    O. Vandenplas

    2012-06-01

    Full Text Available The optimal management of occupational asthma remains uncertain in clinical practice. The aim of this review was to analyse the published information pertaining to the management of occupational asthma in order to produce evidence-based statements and recommendations. A systematic literature search was conducted up to March 2010 to identify original studies addressing the following different treatment options: (1) persistence of exposure; (2) pharmacological treatment; (3) complete avoidance of exposure; (4) reduction of exposure; and (5) the use of personal protective equipment. After full-text evaluation of 83 potentially relevant articles, 52 studies were retained for analysis. The conclusions from this systematic review are limited by the methodological weaknesses of most published studies. Critical analysis of available evidence indicates that: (1) persistent exposure to the causal agent is more likely to result in asthma worsening than complete avoidance; (2) there is insufficient evidence to determine whether pharmacological treatment can alter the course of asthma in subjects who remain exposed; (3) avoidance of exposure leads to recovery of asthma in less than one-third of affected workers; (4) reduction of exposure seems to be less beneficial than complete avoidance of exposure; and (5) personal respiratory equipment does not provide complete protection.

  5. An enhancement of selection and crossover operations in real-coded genetic algorithm for large-dimensionality optimization

    Energy Technology Data Exchange (ETDEWEB)

    Kwak, Noh Sung; Lee, Jongsoo [Yonsei University, Seoul (Korea, Republic of)

    2016-01-15

    The present study aims to implement a new selection method and a novel crossover operation in a real-coded genetic algorithm. The proposed selection method facilitates the establishment of a successively evolved population by combining several subpopulations: an elitist subpopulation, an offspring subpopulation and a mutated subpopulation. A probabilistic crossover is performed based on the measure of probabilistic distance between the individuals. The concept of ‘allowance’ is suggested to describe the level of variance in the crossover operation. A number of nonlinear/non-convex functions and engineering optimization problems are explored to verify the capacities of the proposed strategies. The results are compared with those obtained from other genetic and nature-inspired algorithms.
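
    The combination of elitist, offspring and mutated subpopulations described above can be sketched in a few lines. The subpopulation sizes, the blend-style crossover and the sphere objective below are illustrative choices, not the authors' exact operators:

```python
import random

def sphere(x):
    """Benchmark objective: sum of squares, minimized at the origin."""
    return sum(v * v for v in x)

def next_generation(pop, n_elite=2, n_offspring=6, n_mutant=2, sigma=0.1):
    """Build the successor population from three combined subpopulations."""
    ranked = sorted(pop, key=sphere)
    elites = ranked[:n_elite]                      # elitist subpopulation
    offspring = []
    for _ in range(n_offspring):                   # blend-style real crossover
        a, b = random.sample(ranked[:len(pop) // 2], 2)
        w = random.random()
        offspring.append([w * u + (1 - w) * v for u, v in zip(a, b)])
    mutants = [[v + random.gauss(0, sigma) for v in random.choice(elites)]
               for _ in range(n_mutant)]           # mutated subpopulation
    return elites + offspring + mutants

random.seed(0)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
start = min(sphere(x) for x in pop)
for _ in range(50):
    pop = next_generation(pop)
best = min(sphere(x) for x in pop)
print(best <= start)  # elitism guarantees the best individual never worsens
```

    Because the elites are carried over unchanged, the best objective value is monotonically non-increasing across generations regardless of how the other two subpopulations behave.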

  6. BMI optimization by using parallel UNDX real-coded genetic algorithm with Beowulf cluster

    Science.gov (United States)

    Handa, Masaya; Kawanishi, Michihiro; Kanki, Hiroshi

    2007-12-01

    This paper deals with a global optimization algorithm for Bilinear Matrix Inequalities (BMIs) based on the Unimodal Normal Distribution Crossover (UNDX) genetic algorithm (GA). First, by analyzing the structure of BMIs, the existence of typical difficult structures is confirmed. Then, to improve performance, and drawing on both this structural analysis and the characteristic properties of BMIs, we propose an algorithm that uses a primary search direction obtained from a relaxed Linear Matrix Inequality (LMI) convex estimation. Within this algorithm, we further propose two types of evaluation methods for GA individuals, based on LMI calculations that exploit the characteristic properties of BMIs. Finally, to reduce computation time, we propose a parallelization of the real-coded GA (RCGA) using a Master-Worker paradigm on a Beowulf cluster.

  7. Optimization of automatically generated multi-core code for the LTE RACH-PD algorithm

    CERN Document Server

    Pelcat, Maxime; Nezan, Jean François

    2008-01-01

    Embedded real-time applications in communication systems require high processing power. Manual scheduling developed for single-processor applications is not suited to multi-core architectures. The Algorithm Architecture Matching (AAM) methodology optimizes static application implementation on multi-core architectures. The Random Access Channel Preamble Detection (RACH-PD) is an algorithm for non-synchronized access of Long Term Evolution (LTE) wireless networks. LTE aims to improve the spectral efficiency of the next generation cellular system. This paper describes a complete methodology for implementing the RACH-PD. AAM prototyping is applied to the RACH-PD which is modelled as a Synchronous DataFlow graph (SDF). An efficient implementation of the algorithm onto a multi-core DSP, the TI C6487, is then explained. Benchmarks for the solution are given.

  8. Ant Colony Optimization Algorithm For PAPR Reduction In Multicarrier Code Division Multiple Access System

    Directory of Open Access Journals (Sweden)

    Kanchan Singla

    2014-06-01

    Full Text Available MC CDMA is a rising candidate for future-generation broadband wireless communication and has gained great attention from researchers. It provides the benefits of both OFDM and CDMA. The main challenge in MC CDMA is its high PAPR, which arises in the high-power amplifier (HPA) and reduces system efficiency. Many PAPR reduction techniques exist for MC CDMA. In this paper we propose an Ant Colony Optimization (ACO) algorithm to reduce PAPR for different numbers of users using BPSK and QPSK modulation. ACO is a metaheuristic technique based on the foraging behavior of real ants, and it provides solutions to many complex problems. Simulation results show that ACO with BPSK modulation is effective in reducing PAPR in MC CDMA.
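
    The quantity being minimized is easy to make concrete. The sketch below computes the PAPR of a 64-subcarrier BPSK multicarrier symbol via a direct IDFT (pure standard library, no FFT package); the all-ones sequence is the classic worst case, where every subcarrier adds coherently. The ACO search itself is omitted; this only illustrates the objective it would optimize:

```python
import cmath, math, random

def papr_db(symbols):
    """PAPR (dB) of the time-domain signal obtained by an IDFT of `symbols`."""
    n = len(symbols)
    time = [sum(s * cmath.exp(2j * math.pi * k * t / n)
                for k, s in enumerate(symbols)) / n
            for t in range(n)]
    powers = [abs(x) ** 2 for x in time]
    return 10 * math.log10(max(powers) / (sum(powers) / n))

random.seed(1)
bpsk = [random.choice([-1, 1]) for _ in range(64)]  # one random BPSK vector
worst = [1] * 64                                    # coherent worst case
print(round(papr_db(worst), 1))  # -> 18.1, i.e. 10*log10(64)
print(round(papr_db(bpsk), 1))   # a typical vector sits below that bound
```

    A search technique such as ACO would explore phase or scrambling choices so that transmitted vectors stay far from the coherent worst case.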

  9. ActiWiz – optimizing your nuclide inventory at proton accelerators with a computer code

    CERN Document Server

    Vincke, Helmut

    2014-01-01

    When operating an accelerator one always faces unwanted, but inevitable beam losses. These result in activation of adjacent material, which in turn has an obvious impact on safety and handling constraints. One of the key parameters responsible for activation is the chemical composition of the material which often can be optimized in that respect. In order to facilitate this task also for non-expert users the ActiWiz software has been developed at CERN. Based on a large amount of generic FLUKA Monte Carlo simulations the software applies a specifically developed risk assessment model to provide support to decision makers especially during the design phase as well as common operational work in the domain of radiation protection.

  10. Steps towards verification and validation of the Fetch code for Level 2 analysis, design, and optimization of aqueous homogeneous reactors

    Energy Technology Data Exchange (ETDEWEB)

    Nygaard, E. T. [Babcock and Wilcox Technical Services Group, 800 Main Street, Lynchburg, VA 24504 (United States); Pain, C. C.; Eaton, M. D.; Gomes, J. L. M. A.; Goddard, A. J. H.; Gorman, G.; Tollit, B.; Buchan, A. G.; Cooling, C. M. [Applied Modelling and Computation Group, Dept. of Earth Science and Engineering, Imperial College London, SW7 2AZ (United Kingdom); Angelo, P. L. [Y-12 National Security Complex, Oak Ridge, TN 37831 (United States)

    2012-07-01

    Babcock and Wilcox Technical Services Group (B and W) has identified aqueous homogeneous reactors (AHRs) as a technology well suited to produce the medical isotope molybdenum 99 (Mo-99). AHRs have never been specifically designed or built for this specialized purpose. However, AHRs have a proven history of being safe research reactors. In fact, in 1958, AHRs had 'a longer history of operation than any other type of research reactor using enriched fuel' and had 'experimentally demonstrated to be among the safest of all various type of research reactor now in use [1].' While AHRs have been modeled effectively using simplified 'Level 1' tools, the complex interactions between fluids, neutronics, and solid structures are important (but not necessarily safety significant). These interactions require a 'Level 2' modeling tool. Imperial College London (ICL) has developed such a tool: Finite Element Transient Criticality (FETCH). FETCH couples the radiation transport code EVENT with the computational fluid dynamics code Fluidity; the result is a code capable of modeling sub-critical, critical, and super-critical solutions in both two and three dimensions. Using FETCH, ICL researchers and B and W engineers have studied many fissioning solution systems, including the Tokaimura criticality accident, the Y12 accident, SILENE, TRACY, and SUPO. These modeling efforts will ultimately be incorporated into FETCH's extensive automated verification and validation (V and V) test suite, expanding FETCH's area of applicability to include all relevant physics associated with AHRs. These efforts parallel B and W's engineering effort to design and optimize an AHR to produce Mo-99. (authors)

  11. Addressing medical coding and billing part II: a strategy for achieving compliance. A risk management approach for reducing coding and billing errors.

    Science.gov (United States)

    Adams, Diane L; Norman, Helen; Burroughs, Valentine J

    2002-06-01

    Medical practice today, more than ever before, places greater demands on physicians to see more patients, provide more complex medical services and adhere to stricter regulatory rules, leaving little time for coding and billing. Yet, the need to adequately document medical records, appropriately apply billing codes and accurately charge insurers for medical services is essential to the medical practice's financial condition. Many physicians rely on office staff and billing companies to process their medical bills without ever reviewing the bills before they are submitted for payment. Some physicians may not be receiving the payment they deserve when they do not sufficiently oversee the medical practice's coding and billing patterns. This article emphasizes the importance of monitoring and auditing medical record documentation and coding application as a strategy for achieving compliance and reducing billing errors. When medical bills are submitted with missing and incorrect information, they may result in unpaid claims and loss of revenue to physicians. Addressing Medical Audits, Part I--A Strategy for Achieving Compliance--CMS, JCAHO, NCQA, published January 2002 in the Journal of the National Medical Association, stressed the importance of preparing the medical practice for audits. The article highlighted steps the medical practice can take to prepare for audits and presented examples of guidelines used by regulatory agencies to conduct both medical and financial audits. The Medicare Integrity Program was cited as an example of guidelines used by regulators to identify coding errors during an audit and deny payment to providers when improper billing occurs. For each denied claim, payments owed to the medical practice are also denied. Health care is, no doubt, a costly endeavor for health care providers, consumers and insurers. The potential risk to physicians for improper billing may include loss of revenue, fraud investigations, financial sanction

  12. Optimal management of children on antiretroviral therapy (ART) in primary care: a quality improvement project

    National Research Council Canada - National Science Library

    van Deventer, Claire; Golden, Lauren; du Plessis, Erica; Lion-Cachet, Carien

    2017-01-01

    .... The referral pattern is true for paediatric patients as well. With the added complexity of managing children, there was a concern in the research district that children were not being optimally managed at PHC level. Method...

  13. First-Order Model Management With Variable-Fidelity Physics Applied to Multi-Element Airfoil Optimization

    Science.gov (United States)

    Alexandrov, N. M.; Nielsen, E. J.; Lewis, R. M.; Anderson, W. K.

    2000-01-01

    First-order approximation and model management is a methodology for the systematic use of variable-fidelity models or approximations in optimization. The intent of model management is to attain convergence to high-fidelity solutions with minimal expense in high-fidelity computations. The savings in terms of computationally intensive evaluations depend on the ability of the available lower-fidelity model or a suite of models to predict the improvement trends for the high-fidelity problem. Variable-fidelity models can be represented by data-fitting approximations, variable-resolution models, variable-convergence models, or variable-physical-fidelity models. The present work considers the use of variable-fidelity physics models. We demonstrate the performance of model management on an aerodynamic optimization of a multi-element airfoil designed to operate in the transonic regime. Reynolds-averaged Navier-Stokes equations represent the high-fidelity model, while the Euler equations represent the low-fidelity model. An unstructured mesh-based analysis code, FUN2D, evaluates functions and sensitivity derivatives for both models. Model management for the present demonstration problem yields fivefold savings in terms of high-fidelity evaluations compared to optimization done with high-fidelity computations alone.
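
    The key mechanism, first-order consistency, can be shown in one dimension. In this sketch (with invented functions, not the Navier-Stokes/Euler pair evaluated by FUN2D) the cheap model is given an additive correction so that its value and slope match the expensive model at the current iterate, and a trust-region step is accepted only if the expensive model actually improves:

```python
import math

def f_hi(x):   # stand-in "high-fidelity" model (expensive in practice)
    return (x - 1.0) ** 2 + 0.3 * math.sin(5 * x)

def df_hi(x):
    return 2 * (x - 1.0) + 1.5 * math.cos(5 * x)

def f_lo(x):   # cheap low-fidelity model capturing only the broad trend
    return (x - 1.0) ** 2

def df_lo(x):
    return 2 * (x - 1.0)

def corrected(x, x0):
    """Additive correction enforcing first-order consistency at x0:
    the corrected model matches f_hi's value and slope there."""
    a = f_hi(x0) - f_lo(x0)
    b = df_hi(x0) - df_lo(x0)
    return f_lo(x) + a + b * (x - x0)

x0, radius = -2.0, 0.5
for _ in range(20):
    # minimize the corrected surrogate inside the trust region (dense sampling)
    cands = [x0 + radius * (i / 50 - 1) for i in range(101)]
    x1 = min(cands, key=lambda x: corrected(x, x0))
    if f_hi(x1) < f_hi(x0):      # accept only if high fidelity improves
        x0 = x1
    else:
        radius *= 0.5            # otherwise shrink the trust region
print(f_hi(x0) < f_hi(-2.0))  # -> True: only improving steps are accepted
```

    Convergence theory for such schemes rests exactly on the value-and-slope matching that `corrected` enforces at each iterate.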

  14. Optimized Data Transfers Based on the OpenCL Event Management Mechanism

    Directory of Open Access Journals (Sweden)

    Hiroyuki Takizawa

    2015-01-01

    Full Text Available In standard OpenCL programming, hosts are supposed to control their compute devices. Since compute devices are dedicated to kernel computation, only hosts can execute several kinds of data transfers such as internode communication and file access. These data transfers require one host to simultaneously play two or more roles due to the need for collaboration between the host and devices. The codes for such data transfers are likely to be system-specific, resulting in low portability. This paper proposes an OpenCL extension that incorporates such data transfers into the OpenCL event management mechanism. Unlike the current OpenCL standard, the main thread running on the host is not blocked to serialize dependent operations. Hence, an application can easily use the opportunities to overlap parallel activities of hosts and compute devices. In addition, the implementation details of data transfers are hidden behind the extension, and application programmers can use the optimized data transfers without any tricky programming techniques. The evaluation results show that the proposed extension can use the optimized data transfer implementation and thereby increase the sustained data transfer performance by about 18% for a real application accessing a big data file.
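
    The core idea, serializing dependent operations through events rather than by blocking the main thread, can be illustrated outside OpenCL. The Python sketch below (a conceptual analogy, not the proposed extension's API) chains a stand-in "buffer upload" and "kernel" through futures, leaving the main thread free, just as the extension keeps the host thread unblocked:

```python
# Conceptual sketch: dependent "transfer" and "kernel" steps are chained
# through futures, so the main thread never blocks to serialize them.
from concurrent.futures import ThreadPoolExecutor
import time

pool = ThreadPoolExecutor(max_workers=2)

def host_to_device(data):           # stand-in for a buffer write
    time.sleep(0.01)
    return list(data)

def kernel(buf):                    # stand-in for device computation
    return [2 * v for v in buf]

# Each step waits only on its predecessor's result, not on the main thread.
f_upload = pool.submit(host_to_device, [1, 2, 3])
f_compute = pool.submit(lambda: kernel(f_upload.result()))
# ... the main thread is free to queue unrelated work here ...
print(f_compute.result())  # -> [2, 4, 6]
```

    In OpenCL terms, `f_upload` plays the role of the event a subsequent `clEnqueue*` call would list in its event wait list.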

  15. A Linked Simulation-Optimization (LSO) Model for Conjunctive Irrigation Management using Clonal Selection Algorithm

    Science.gov (United States)

    Islam, Sirajul; Talukdar, Bipul

    2016-09-01

    A Linked Simulation-Optimization (LSO) model based on a Clonal Selection Algorithm (CSA) was formulated for application in conjunctive irrigation management. A series of measures were considered for reducing the computational burden associated with the LSO approach. Certain modifications were made to the formulated CSA so as to decrease the number of function evaluations. In addition, a simple problem-specific code for a two-dimensional groundwater flow simulation model was developed. The flow model was further simplified by a novel approach of area reduction, in order to save computational time in simulation. The LSO model was applied in the irrigation command of the Pagladiya Dam Project in Assam, India. With a view to evaluate the performance of the CSA, a Genetic Algorithm (GA) was used as a comparison base. The results from the CSA compared well with those from the GA. In fact, the CSA was found to consume less computational time than the GA while converging to the optimal solution, due to the modifications made to it.
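
    The clonal-selection loop at the heart of such a model is compact. The sketch below is a generic CSA on a toy objective, not the authors' groundwater-coupled formulation: it clones the highest-affinity antibodies and hypermutates the clones at a rate that shrinks with affinity, which is the feature distinguishing CSA from a plain GA:

```python
import random

def affinity(x):
    """Higher is better: negated sphere objective on a 2-D antibody."""
    return -sum(v * v for v in x)

def csa_step(pop, n_select=4, clones_per=3):
    """One clonal-selection generation: clone the best antibodies and
    hypermutate clones at a rate that grows as affinity rank worsens."""
    ranked = sorted(pop, key=affinity, reverse=True)
    selected = ranked[:n_select]
    clones = []
    for rank, ab in enumerate(selected):
        rate = 0.5 * (rank + 1) / n_select   # better rank -> smaller mutation
        for _ in range(clones_per):
            clones.append([v + random.gauss(0, rate) for v in ab])
    merged = sorted(pop + clones, key=affinity, reverse=True)
    return merged[:len(pop)]                 # keep the population size fixed

random.seed(0)
pop = [[random.uniform(-4, 4) for _ in range(2)] for _ in range(8)]
before = max(affinity(x) for x in pop)
for _ in range(40):
    pop = csa_step(pop)
after = max(affinity(x) for x in pop)
print(after >= before)  # the elitist merge never discards the best antibody
```

    In an LSO setting, `affinity` would wrap a call to the groundwater flow simulator, which is why reducing the number of such evaluations matters.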

  16. Extending DIRAC File Management with Erasure-Coding for efficient storage

    CERN Document Server

    Skipsey, Samuel Cadellin; Britton, David; Crooks, David; Roy, Gareth

    2015-01-01

    The state of the art in Grid style data management is to achieve increased resilience of data via multiple complete replicas of data files across multiple storage endpoints. While this is effective, it is not the most space-efficient approach to resilience, especially when the reliability of individual storage endpoints is sufficiently high that only a few will be inactive at any point in time. We report on work performed as part of GridPP\\cite{GridPP}, extending the Dirac File Catalogue and file management interface to allow the placement of erasure-coded files: each file distributed as N identically-sized chunks of data striped across a vector of storage endpoints, encoded such that any M chunks can be lost and the original file can be reconstructed. The tools developed are transparent to the user, and, as well as allowing up and downloading of data to Grid storage, also provide the possibility of parallelising access across all of the distributed chunks at once, improving data transfer and IO performance. ...
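
    The principle can be demonstrated with the simplest erasure code: striping plus a single XOR parity chunk, which tolerates the loss of any one chunk. Production systems like the one described use Reed-Solomon-style codes tolerating M losses; the payload and chunk count here are arbitrary:

```python
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(data, n_chunks):
    """Stripe data into n_chunks equal chunks plus one XOR parity chunk."""
    size = -(-len(data) // n_chunks)            # ceiling division
    data = data.ljust(size * n_chunks, b"\0")   # pad to whole chunks
    chunks = [data[i * size:(i + 1) * size] for i in range(n_chunks)]
    parity = chunks[0]
    for c in chunks[1:]:
        parity = xor_bytes(parity, c)
    return chunks + [parity]

def reconstruct(chunks, lost):
    """Rebuild the one missing chunk as the XOR of all surviving chunks."""
    survivors = [c for i, c in enumerate(chunks) if i != lost]
    out = survivors[0]
    for c in survivors[1:]:
        out = xor_bytes(out, c)
    return out

stored = encode(b"grid file payload", 4)   # 4 data chunks + 1 parity chunk
missing = 2                                # pretend endpoint 2 is offline
rebuilt = reconstruct(stored, missing)
print(rebuilt == stored[missing])  # -> True
```

    The storage saving over replication is what motivates the approach: five chunks here hold what two full replicas would, while still surviving a single endpoint outage.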

  17. Multi-hop Cooperative Wireless Networks: Diversity Multiplexing Tradeoff and Optimal Code Design

    CERN Document Server

    Sreeram, K; Kumar, P Vijay

    2008-01-01

    We consider single-source single-sink (ss-ss) multi-hop networks, with slow-fading links and single-antenna half-duplex relays. We identify two families of networks that are multi-hop generalizations of the well-studied two-hop network: K-Parallel-Path (KPP) networks and layered networks. KPP networks can be viewed as the union of K node-disjoint parallel relaying paths, each of length greater than one. KPP networks are then generalized to KPP(I) networks, which permit interference between paths and to KPP(D) networks, which possess a direct link from source to sink. We characterize the DMT of these families of networks completely for K > 3. Layered networks are networks comprising of relaying layers with edges existing only within the same layer or between adjacent layers. We prove that a linear DMT between the maximum diversity d_{max} and the maximum multiplexing gain of 1 is achievable for fully-connected layered networks. This is shown to be equal to the optimal DMT if the number of layers is less than 4...

  18. Optimal Management and Design of Energy Systems under Atmospheric Uncertainty

    Science.gov (United States)

    Anitescu, M.; Constantinescu, E. M.; Zavala, V.

    2010-12-01

    optimal management and design of energy systems, such as the power grid or building systems, under uncertainty in atmospheric conditions. The framework is defined in terms of a mathematical paradigm called stochastic programming: minimization of the expected value of the decision-maker's objective function subject to physical and operational constraints, such as low blackout probability, that are enforced on each scenario. We report results on testing the framework on the optimal management of power grid systems under high wind penetration scenarios, a problem whose time horizon is on the order of days. We discuss the computational effort of scenario generation, which involves running WRF at the high spatio-temporal resolution dictated by the operational constraints, as well as solving the optimal dispatch problem. We demonstrate that accounting for uncertainty in atmospheric conditions results in blackout prevention, whereas decisions using only the mean forecast do not. We discuss issues in using the framework for planning problems, whose time horizon spans several decades, and what requirements such problems would impose on climate simulation systems.
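
    The contrast between planning against scenarios and planning against the mean forecast reduces to a toy two-stage problem (all numbers below are invented): choose day-ahead generation so that demand is met in every wind scenario, then compare with the mean-forecast plan:

```python
demand = 100.0
scenarios = [(0.3, 10.0), (0.4, 30.0), (0.3, 50.0)]  # (probability, wind MW)
mean_wind = sum(p * w for p, w in scenarios)          # 30.0 MW

def feasible(g):
    """Blackout-avoidance constraint enforced on every scenario."""
    return all(g + w >= demand for _, w in scenarios)

cost = lambda g: 1.0 * g                              # linear generation cost
candidates = [g / 10 for g in range(0, 2001)]         # 0.0 .. 200.0 MW grid
g_stoch = min((g for g in candidates if feasible(g)), key=cost)
g_mean = demand - mean_wind                           # plan against the mean

print(g_stoch)           # -> 90.0: covers even the 10 MW low-wind scenario
print(feasible(g_mean))  # -> False: the mean-forecast plan risks a blackout
```

    This mirrors the abstract's finding at miniature scale: enforcing constraints per scenario yields a conservative but safe commitment, while the mean forecast alone does not.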

  19. A four-column theory for the origin of the genetic code: tracing the evolutionary pathways that gave rise to an optimized code

    Directory of Open Access Journals (Sweden)

    Higgs Paul G

    2009-04-01

    Full Text Available Abstract Background The arrangement of the amino acids in the genetic code is such that neighbouring codons are assigned to amino acids with similar physical properties. Hence, the effects of translational error are minimized with respect to randomly reshuffled codes. Further inspection reveals that it is amino acids in the same column of the code (i.e., the same second base) that are similar, whereas those in the same row show no particular similarity. We propose a 'four-column' theory for the origin of the code that explains how the action of selection during the build-up of the code leads to a final code that has the observed properties. Results The theory makes the following propositions. (i) The earliest amino acids in the code were those that are easiest to synthesize non-biologically, namely Gly, Ala, Asp, Glu and Val. (ii) These amino acids are assigned to codons with G at first position. Therefore the first code may have used only these codons. (iii) The code rapidly developed into a four-column code where all codons in the same column coded for the same amino acid: NUN = Val, NCN = Ala, NAN = Asp and/or Glu, and NGN = Gly. (iv) Later amino acids were added sequentially to the code by a process of subdivision of codon blocks in which a subset of the codons assigned to an early amino acid were reassigned to a later amino acid. (v) Later amino acids were added into positions formerly occupied by amino acids with similar properties because this can occur with minimal disruption to the proteins already encoded by the earlier code. As a result, the properties of the amino acids in the final code retain a four-column pattern that is a relic of the earliest stages of code evolution. Conclusion The driving force during this process is not the minimization of translational error, but positive selection for the increased diversity and functionality of the proteins that can be made with a larger amino acid alphabet. Nevertheless, the code that results is one

  20. Ethical guidance in the era of managed care: an analysis of the American College of Healthcare Executives' Code of Ethics.

    Science.gov (United States)

    Higgins, W

    2000-01-01

    Market competition and the rise of managed care are transforming the healthcare system from a physician-dominated cottage industry into a manager-dominated corporate enterprise. The managed care revolution is also undermining the safeguards offered by medical ethics and raising serious public concerns. These trends highlight the growing importance of ethical standards for managers. The most comprehensive ethical guidance for health service managers is contained in the American College of Healthcare Executives' (ACHE) Code of Ethics. An analysis of the ACHE Code suggests that it does not adequately address several ethical concerns associated with managed care. The ACHE may wish to develop a supplemental statement regarding ethical issues in managed care. A supplemental statement that provides more specific guidance in the areas of financial incentives to reduce utilization, social mission, consumer/patient information, and the health service manager's responsibility to patients could be extremely valuable in today's complex and rapidly changing environment. More specific ethical guidelines would not ensure individual or organizational compliance. However, they would provide professional standards that could guide decision making and help managers evaluate performance in managed care settings.

  1. Managers, Teachers, Students, and Parents' Opinions Concerning Changes on Dress Code Practices as an Educational Policy

    Science.gov (United States)

    Birel, Firat Kiyas

    2016-01-01

    Problem Statement: Dressing for school has been intensely disputed and has led to periodic changes in dress codes since the foundation of the Turkish republic. Practitioners have tried to put some new practices related to school dress codes into practice for redressing former dress code issues involving mandatory dress standards for both students…

  2. Optimal Power Management Strategy for Energy Storage with Stochastic Loads

    Directory of Open Access Journals (Sweden)

    Stefano Pietrosanti

    2016-03-01

    Full Text Available In this paper, a power management strategy (PMS) has been developed for the control of energy storage in a system subjected to loads of random duration. The PMS minimises the costs associated with the energy consumption of specific systems powered by a primary energy source and equipped with energy storage, under the assumption that the statistical distribution of load durations is known. By including the variability of the load in the cost function, it was possible to define the optimality criteria for the power flow of the storage. Numerical calculations have been performed obtaining the control strategies associated with the global minimum in energy costs, for a wide range of initial conditions of the system. The results of the calculations have been tested on a MATLAB/Simulink model of a rubber tyre gantry (RTG) crane equipped with a flywheel energy storage system (FESS) and subjected to a test cycle, which corresponds to the real operation of a crane in the Port of Felixstowe. The results of the model show increased energy savings and reduced peak power demand with respect to existing control strategies, indicating considerable potential savings for port operators in terms of energy and maintenance costs.
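
    A minimal flavour of such a policy is threshold peak-shaving: the storage discharges whenever the load exceeds a target grid power and recharges (including from regeneration) when it is below. The numbers below are invented and the policy ignores the statistical load model the paper optimizes over; it only shows the mechanism:

```python
def step(p, soc, target=80.0, capacity=60.0):
    """Greedy threshold policy: shave load above `target`, refill below it."""
    if p > target:
        d = min(p - target, soc)          # discharge, limited by stored energy
        return p - d, soc - d
    c = min(target - p, capacity - soc)   # charge, limited by free capacity
    return p + c, soc + c

load = [20, 150, -40, 30, 160, -30, 25, 140]  # kW; negatives = regeneration
soc, grid = 30.0, []
for p in load:
    g, soc = step(p, soc)
    grid.append(g)
print(max(load), max(grid))  # peak demand drops from 160 to 100 kW here
```

    Capturing the regenerative (negative) load intervals to refill the storage is exactly what makes a flywheel attractive on an RTG crane duty cycle.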

  3. Configuration Management of an Optimization Application in a Research Environment

    Science.gov (United States)

    Townsend, James C.; Salas, Andrea O.; Schuler, M. Patricia

    1999-01-01

    Multidisciplinary design optimization (MDO) research aims to increase interdisciplinary communication and reduce design cycle time by combining system analyses (simulations) with design space search and decision making. The High Performance Computing and Communication Program's current High Speed Civil Transport application, HSCT4.0, at NASA Langley Research Center involves a highly complex analysis process with high-fidelity analyses that are more realistic than previous efforts at the Center. The multidisciplinary processes have been integrated to form a distributed application by using the Java language and Common Object Request Broker Architecture (CORBA) software techniques. HSCT4.0 is a research project in which both the application problem and the implementation strategy have evolved as the MDO and integration issues became better understood. Whereas earlier versions of the application and integrated system were developed with a simple, manual software configuration management (SCM) process, it was evident that this larger project required a more formal SCM procedure. This report briefly describes the HSCT4.0 analysis and its CORBA implementation and then discusses some SCM concepts and their application to this project. In anticipation that SCM will prove beneficial for other large research projects, the report concludes with some lessons learned in overcoming SCM implementation problems for HSCT4.0.

  4. Optimal management of giant cell arteritis and polymyalgia rheumatica

    Directory of Open Access Journals (Sweden)

    Charlton R

    2012-04-01

    Full Text Available Rodger Charlton, College of Medicine, Swansea University, Wales, UK. Abstract: Giant cell arteritis (GCA) and polymyalgia rheumatica (PMR) are clinical diagnoses without "gold standard" serological or histological tests, excluding temporal artery biopsy for GCA. Further, other conditions may mimic GCA and PMR. Treatment with 10–20 mg of prednisolone daily is suggested for PMR or 40–60 mg daily for GCA when temporal arteritis is suspected. Ocular involvement of GCA should be treated as a medical emergency to prevent possible blindness, and steroids should be commenced immediately. There are no absolute guidelines as to the dose or duration of administration; the therapeutics of treating this condition and the rate of reduction of prednisolone should be adjusted depending on the individual's response and with consideration of the multiple risks of high-dose and long-term glucocorticoids. Optimal management may need to consider the role of low-dose aspirin in reducing complications. Clinicians should also be aware of studies that indicate an increased incidence of large-artery complications with GCA. This clinical area requires further research through future development of radiological imaging to aid the diagnosis and produce a clearer consensus relating to diagnosis and treatment. Keywords: arteritis, visual loss, blindness, erythrocyte sedimentation rate, stiffness, pain, aspirin, disability, glucocorticoids

  5. Optimized Waterspace Management and Scheduling Using Mixed-Integer Linear Programming

    Science.gov (United States)

    2016-01-01

    Technical Report NSWC PCD TR 2015-003, Optimized Waterspace Management and Scheduling Using Mixed-Integer Linear Programming. The report develops the constraints required for the mathematical formulation of the MCM scheduling problem pertaining to the survey constraints and logistics management. Cited references include Floudas, Nonlinear and Mixed-Integer Optimization: Fundamentals and Applications, Oxford University Press, 1995, and M. J. Bays, A. Shende, D. J
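
    The kind of decision problem such a MILP formalizes can be shown at toy scale: binary variables assign survey tasks to one of two assets, and the objective is the minimum makespan. A real formulation would hand these 0/1 variables and constraints to a MILP solver rather than enumerate them; the task durations below are invented:

```python
from itertools import product

# Toy stand-in for the MILP: assign survey tasks (durations in hours) to
# two assets so the schedule finishes as early as possible (min makespan).
tasks = [3, 5, 2, 7, 4]

best = None
for assign in product([0, 1], repeat=len(tasks)):   # 0/1 decision variables
    finish = [sum(d for d, a in zip(tasks, assign) if a == v) for v in (0, 1)]
    makespan = max(finish)
    if best is None or makespan < best[0]:
        best = (makespan, assign)
print(best[0])  # -> 11: total work is 21 h, so 11 h is the best split
```

    Enumeration is exponential in the number of tasks, which is precisely why the report's mixed-integer linear programming formulation, with branch-and-bound behind it, is the practical route.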

  6. Energy Management Policies for Energy-Neutral Source-Channel Coding

    CERN Document Server

    Castiglione, Paolo; Erkip, Elza; Zemen, Thomas

    2011-01-01

    In cyber-physical systems where sensors measure the temporal evolution of a given phenomenon of interest and radio communication takes place over short distances, the energy spent for source acquisition and compression may be comparable with that used for transmission. Additionally, in order to avoid limited lifetime issues, sensors may be powered via energy harvesting and thus collect all the energy they need from the environment. This work addresses the problem of energy allocation over source acquisition/compression and transmission for energy-harvesting sensors. At first, focusing on a single-sensor, energy management policies are identified that guarantee a maximal average distortion while at the same time ensuring the stability of the queue connecting source and channel encoders. It is shown that the identified class of policies is optimal in the sense that it stabilizes the queue whenever this is feasible by any other technique that satisfies the same average distortion constraint. Moreover, this class...

  7. Code Switching in the Classroom: A Case Study of Economics and Management Students at the University of Sfax, Tunisia

    Science.gov (United States)

    Bach Baoueb, Sallouha Lamia; Toumi, Naouel

    2012-01-01

    This case study explores the motivations for code switching (CS) in the interactions of Tunisian students at the faculty of Economics and Management in Sfax, Tunisia. The study focuses on students' (EMSs) classroom conversations and out-of-classroom peer interactions. The analysis of the social motivations of EMSs' CS behaviour shows that…

  9. Integrating digital image management software for improved patient care and optimal practice management.

    Science.gov (United States)

    Starr, Jon C

    2006-06-01

    Photographic images provide vital documentation of preoperative, intraoperative, and postoperative results in the clinical dermatologic surgery practice and can document histologic findings from skin biopsies, thereby enhancing patient care. Images may be printed as part of text documents, transmitted via electronic mail, or included in electronic medical records. To describe existing computer software that integrates digital photography and the medical record to improve patient care and practice management. A variety of computer applications are available to optimize the use of digital images in the dermatologic practice.

  10. Optimal Privacy-Cost Trade-off in Demand-Side Management with Storage

    OpenAIRE

    Tan, Onur; Gündüz, Deniz; Gómez-Vilardebó, Jesús

    2015-01-01

    Demand-side energy storage management is studied from a joint privacy-energy cost optimization perspective. Assuming that the user's power demand profile as well as the electricity prices are known non-causally, the optimal energy management (EM) policy that jointly increases the privacy of the user and reduces his energy cost is characterized. The backward water-filling interpretation is provided for the optimal EM policy. While the energy cost is reduced by requesting more energy when the p...

  11. Optimal management of nail disease in patients with psoriasis

    Directory of Open Access Journals (Sweden)

    Piraccini BM

    2015-01-01

    psoriasis and the optimal management of nail disease in patients with psoriasis. Keywords: biologics, nail psoriasis, topical therapy, systemic therapy

  12. Principles for a code of conduct for the sustainable management of mangrove ecosystems: a work in progress for public discussion

    DEFF Research Database (Denmark)

    Nielsen, Thomas

    The Principles for a Code of Conduct for Sustainable Management of Mangrove Ecosystems is a guide to assist states, local and national non-governmental organizations and other stakeholders to develop cooperatively local codes, laws and/or regulations to protect mangroves and the critical functions...... they serve with regard to contributions to local livelihood, biodiversity conservation and coastal protection through sustainable management. The objective is to help bring attention to the importance of mangrove ecosystems, particularly to policy makers, to help arrest and reverse their loss. The Principles...... mangrove management experience, about fifteen country case studies from all regions where mangroves exist, and seven regional workshops to date. The purpose of the presentation at this ReNED forum is to gain additional feedback from researchers, in particular, to provide input on the content of the Principles...

  14. Principles for a Code of Conduct for the Sustainable Management of Mangrove Ecosystems: A Work in Progress for Public Discussion

    DEFF Research Database (Denmark)

    Nielsen, Thomas

    The Principles for a Code of Conduct for Sustainable Management of Mangrove Ecosystems is a guide to assist states, local and national non-governmental organizations and other stakeholders to develop cooperatively local codes, laws and/or regulations to protect mangroves and the critical functions......, grassroots organizations and other interested individuals and groups. The Principles were formulated based on a review of global mangrove management experience, about fifteen country case studies, including Australia, from all regions where mangroves exist, and seven regional workshops to date. The purpose...... of making a presentation on the Mangrove Principles at IMPAC is to gain additional feedback from interested stakeholders and experts, in particular, to provide inputs to the content of the Principles and recommendations for activities to promote their use as a management tool. The Principles and many...

  15. Revenue enhancement through total quality management/continuous quality improvement (TQM/CQI) in outpatient coding and billing.

    Science.gov (United States)

    Dwore, R B; Murray, B P; Parsons, R J; Smith, P M; Vorderer, L H

    1995-01-01

    To survive and thrive, rural hospitals are seeking enhanced revenues. This study focuses on outpatient laboratory and radiology coding and billing accuracy in a nonrandom sample of seven rural hospitals in a Western state. Information was gathered on (1) procedures incorrectly coded, (2) potential revenue increases from correct coding and billing, (3) barriers to implementing changes, and (4) perceived audit value. The identified major source of potential revenue enhancement was increased fees from private payers. Correct coding and billing to Medicare and Medicaid offered the potential of additional revenue. Participating administrators appreciated the validation of coding and billing practices and the identification of potential enhanced revenues. Five of seven hospitals (71.4%) selectively implemented recommended changes. Complete compliance with recommended changes was limited by barriers of tradition, competition, and reimbursement, which must be overcome to realize successful implementation. The Joint Commission on Accreditation of Healthcare Organizations' (JCAHO) new Total Quality Management/Continuous Quality Improvement (TQM/CQI) emphasis provides an opportunity for revenue enhancement through coding/billing assessments and interdepartmental focus and coordination.

  16. 3rd International Conference on Modelling, Computation and Optimization in Information Systems and Management Sciences

    CERN Document Server

    Dinh, Tao; Nguyen, Ngoc

    2015-01-01

    This proceedings set contains 85 selected full papers presented at the 3rd International Conference on Modelling, Computation and Optimization in Information Systems and Management Sciences - MCO 2015, held on May 11–13, 2015 at Lorraine University, France. The present part I of the 2 volume set includes articles devoted to Combinatorial optimization and applications, DC programming and DCA: thirty years of Developments, Dynamic Optimization, Modelling and Optimization in financial engineering, Multiobjective programming, Numerical Optimization, Spline Approximation and Optimization, as well as Variational Principles and Applications

  17. D-MG Tradeoff and Optimal Codes for a Class of AF and DF Cooperative Communication Protocols

    CERN Document Server

    Elia, Petros; Anand, M; Kumar, P Vijay

    2007-01-01

    We consider cooperative relay communication in a fading channel environment under the Orthogonal Amplify and Forward (OAF) and Orthogonal and Non-Orthogonal Selection Decode and Forward (OSDF and NSDF) protocols. For all these protocols, we compute the Diversity-Multiplexing Gain Tradeoff (DMT). We construct DMT optimal codes for the protocols which are sphere decodable and, in certain cases, incur minimum possible delay. Our results establish that the DMT of the OAF protocol is identical to the DMT of the Non-Orthogonal Amplify and Forward (NAF) protocol. Two variants of the NSDF protocol are considered: fixed-NSDF and variable-NSDF protocol. In the variable-NSDF protocol, the fraction of time duration for which the source alone transmits is allowed to vary with the rate of communication. Among the class of static amplify-and-forward and decode-and-forward protocols, the variable-NSDF protocol is shown to have the best known DMT for any number of relays apart from the two-relay case. When there are two relay...

  18. Real-time photoacoustic and ultrasound dual-modality imaging system facilitated with graphics processing unit and code parallel optimization.

    Science.gov (United States)

    Yuan, Jie; Xu, Guan; Yu, Yao; Zhou, Yu; Carson, Paul L; Wang, Xueding; Liu, Xiaojun

    2013-08-01

    Photoacoustic tomography (PAT) offers structural and functional imaging of living biological tissue with highly sensitive optical absorption contrast and excellent spatial resolution comparable to medical ultrasound (US) imaging. We report the development of a fully integrated PAT and US dual-modality imaging system, which performs signal scanning, image reconstruction, and display for both photoacoustic (PA) and US imaging all in a truly real-time manner. The back-projection (BP) algorithm for PA image reconstruction is optimized to reduce the computational cost and facilitate parallel computation on a state of the art graphics processing unit (GPU) card. For the first time, PAT and US imaging of the same object can be conducted simultaneously and continuously, at a real-time frame rate, presently limited by the laser repetition rate of 10 Hz. Noninvasive PAT and US imaging of human peripheral joints in vivo were achieved, demonstrating the satisfactory image quality realized with this system. Another experiment, simultaneous PAT and US imaging of contrast agent flowing through an artificial vessel, was conducted to verify the performance of this system for imaging fast biological events. The GPU-based image reconstruction software code for this dual-modality system is open source and available for download from http://sourceforge.net/projects/patrealtime.
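The back-projection (BP) step the authors accelerate on a GPU can be sketched serially; below is a minimal NumPy delay-and-sum version, assuming a linear sensor array at z = 0, a homogeneous sound speed and idealized delta-pulse signals. The geometry and all parameter values are illustrative, not the authors' code.

```python
import numpy as np

def backproject(signals, sensor_x, fs, c, grid_x, grid_z):
    """Naive delay-and-sum photoacoustic back-projection.

    signals : (n_sensors, n_samples) RF data
    sensor_x: x-positions of a linear array located at z = 0
    fs      : sampling rate [Hz];  c: sound speed [m/s]
    """
    n_sensors, n_samples = signals.shape
    image = np.zeros((len(grid_z), len(grid_x)))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            # time of flight from pixel (x, z) to every sensor
            tof = np.sqrt((sensor_x - x) ** 2 + z ** 2) / c
            idx = np.round(tof * fs).astype(int)
            valid = idx < n_samples
            image[iz, ix] = signals[np.arange(n_sensors)[valid], idx[valid]].sum()
    return image

# Tiny synthetic example: one point absorber at x = 0, z = 6 mm
fs, c = 40e6, 1540.0
sensor_x = np.linspace(-5e-3, 5e-3, 16)
src = (0.0, 6e-3)
signals = np.zeros((16, 1024))
for i, sx in enumerate(sensor_x):
    d = np.hypot(sx - src[0], src[1])
    signals[i, int(round(d / c * fs))] = 1.0   # idealized delta pulse
img = backproject(signals, sensor_x, fs, c,
                  np.linspace(-5e-3, 5e-3, 21), np.linspace(1e-3, 11e-3, 21))
peak = np.unravel_index(np.argmax(img), img.shape)   # reconstructed absorber
```

The doubly nested pixel loop is exactly what GPU parallelization maps across threads, since each pixel's sum is independent.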

  19. Onboard near-optimal climb-dash energy management

    Science.gov (United States)

    Weston, A.; Cliff, G.; Kelley, H.

    1985-01-01

    This paper studies optimal and near-optimal trajectories of high-performance aircraft in symmetric flight. Onboard, real-time, near-optimal guidance is considered for the climb-dash mission, using some of the boundary-layer structure and hierarchical ideas from singular perturbations. In the case of symmetric flight, this resembles neighborhood-optimal guidance using energy-to-go as the running variable. However, extension to three-dimensional flight is proposed, using families of nominal paths with heading-to-go as the additional running variable. Some computational results are presented for the symmetric case.
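The "energy" in climb-dash energy management is specific energy (energy height), E = h + V²/(2g), which collapses altitude and speed into one running variable. A tiny sketch with hypothetical flight conditions:

```python
import math

G = 9.80665  # standard gravity [m/s^2]

def specific_energy(h_m, v_ms):
    """Energy height: potential plus kinetic energy per unit weight."""
    return h_m + v_ms ** 2 / (2.0 * G)

# Two flight conditions with the same energy height are interchangeable
# (in this approximation) via a zoom climb or dive.
e1 = specific_energy(5000.0, 250.0)   # 5 km altitude at 250 m/s
e2 = specific_energy(0.0, math.sqrt(2 * G * 5000.0 + 250.0 ** 2))
```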

  20. Design of investment management optimization system for power grid companies under new electricity reform

    Science.gov (United States)

    Yang, Chunhui; Su, Zhixiong; Wang, Xin; Liu, Yang; Qi, Yongwei

    2017-03-01

    The new normal of the economic situation and the implementation of a new round of electric power system reform put forward higher requirements for the daily operation of power grid companies. As an important day-to-day activity of power grid companies, investment management is directly related to improving the company's operating efficiency and management level. In this context, establishing a power grid company investment management optimization system will help to improve the company's investment management and control, which is of great significance for power grid companies to adapt to the changing market environment as soon as possible and to meet policy requirements. Therefore, the purpose of this paper is to construct the investment management optimization system of power grid companies, which includes an investment management system, an investment process control system, an investment structure optimization system, an investment project evaluation system and an investment management information platform support system.

  1. Optimal Energy Management of Multi-Microgrids with Sequentially Coordinated Operations

    Directory of Open Access Journals (Sweden)

    Nah-Oak Song

    2015-08-01

    We propose an optimal electric energy management of a cooperative multi-microgrid community with sequentially coordinated operations. The sequentially coordinated operations are suggested to distribute the computational burden and yet make the optimal 24-hour energy management of multi-microgrids possible. The sequential operations are mathematically modeled to find the optimal operation conditions and illustrated with a physical interpretation of how to achieve optimal energy management in the cooperative multi-microgrid community. This global electric energy optimization of the cooperative community is realized by ancillary internal trading between the microgrids, which reduces the extra cost of unnecessary external trading by adjusting the electric energy production amounts of combined heat and power (CHP) generators and the amounts of both internal and external electric energy trading of the cooperative community. A simulation study is also conducted to validate the proposed mathematical energy management models.
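The benefit of ancillary internal trading can be illustrated with toy numbers: surpluses first cover deficits inside the community, so only the community's net position is traded externally at unfavorable prices. All prices and net positions below are hypothetical, not from the paper.

```python
# Each microgrid's net position [kWh] for one period: + surplus, - deficit.
net = {"mg1": 40.0, "mg2": -25.0, "mg3": -5.0}
BUY, SELL = 0.20, 0.08   # external grid prices [$ / kWh]; buying costs more

def external_cost(positions):
    """Cost if each position is settled with the external grid alone."""
    return sum(-p * BUY if p < 0 else -p * SELL for p in positions.values())

# Without coordination: every deficit buys at 0.20, every surplus sells at 0.08.
solo = external_cost(net)

# With internal trading, only the community's net position touches the grid.
community_net = sum(net.values())
coop = external_cost({"community": community_net})
```

Here `coop < solo`: the spread between buy and sell prices is the extra cost that internal trading avoids.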

  2. Inventory Management and the Impact of Anticipation in Evolutionary Stochastic Online Dynamic Optimization

    NARCIS (Netherlands)

    Bosman, P.A.N.; La Poutré, J.A.

    2007-01-01

    Inventory management (IM) is an important area in logistics. The goal is to manage the inventory of a vendor as efficiently as possible. Its practical relevance also makes it an important real-world application for research in optimization. Because inventory must be managed over time, IM optimization…

  3. Watershed Management Optimization Support Tool (WMOST) v2: User Manual and Case Studies

    Science.gov (United States)

    The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...

  4. Optimizing Excited-State Electronic-Structure Codes for Intel Knights Landing: A Case Study on the BerkeleyGW Software

    Energy Technology Data Exchange (ETDEWEB)

    Deslippe, Jack; da Jornada, Felipe H.; Vigil-Fowler, Derek; Barnes, Taylor; Wichmann, Nathan; Raman, Karthik; Sasanka, Ruchira; Louie, Steven G.

    2016-10-06

    We profile and optimize calculations performed with the BerkeleyGW code on the Xeon-Phi architecture. BerkeleyGW depends both on hand-tuned critical kernels as well as on BLAS and FFT libraries. We describe the optimization process and performance improvements achieved. We discuss a layered parallelization strategy to take advantage of vector, thread and node-level parallelism. We discuss locality changes (including the consequence of the lack of L3 cache) and effective use of the on-package high-bandwidth memory. We show preliminary results on Knights-Landing including a roofline study of code performance before and after a number of optimizations. We find that the GW method is particularly well-suited for many-core architectures due to the ability to exploit a large amount of parallelism over plane-wave components, band-pairs, and frequencies.
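The roofline study mentioned above rests on a simple model: attainable performance is the lesser of the compute ceiling and arithmetic intensity times memory bandwidth. A sketch with assumed (not measured) Knights-Landing-like numbers, contrasting the on-package high-bandwidth memory with DDR:

```python
def roofline_gflops(ai, peak_gflops, bw_gbs):
    """Attainable performance under the roofline model:
    min(compute ceiling, arithmetic intensity * memory bandwidth)."""
    return min(peak_gflops, ai * bw_gbs)

PEAK = 3000.0   # hypothetical peak GFLOP/s
perf = {(mem, ai): roofline_gflops(ai, PEAK, bw)
        for mem, bw in [("MCDRAM", 400.0), ("DDR4", 90.0)]   # GB/s (assumed)
        for ai in (1.0, 10.0)}                               # FLOP/byte
```

At low arithmetic intensity the kernel is bandwidth-bound (so MCDRAM placement matters); at high intensity it hits the compute ceiling regardless of memory.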

  5. Optimal Management of Two-Species Bioeconomics Systems

    Institute of Scientific and Technical Information of China (English)

    Shou Jilin; Tarafdar E U; Li Yanmei

    1998-01-01

    In this paper, the Optimal Control Theory is applied to two-species bioeconomic systems. The optimal harvest policies are presented when one fleet makes natural mixed harvest and when one fleet is divided into two groups to harvest two independent species or two competitive species, respectively. The connections and differences between them are also considered.

  6. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Takashina, Masaaki; Koizumi, Masahiko [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P., E-mail: vadim.p.moskvin@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States)

    2014-10-01

    Although three general-purpose Monte Carlo (MC) simulation tools, Geant4, FLUKA and PHITS, have been used extensively, differences in calculation results have been reported. The major causes are the implementation of the physical model, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed such optimized lists, but those studies were performed with a simple system, such as a water phantom alone. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a proton treatment nozzle computational model. The simulation was performed with a broad scanning proton beam. The influences of the customizing parameters on the percentage depth dose (PDD) profile and the proton range were investigated by comparison with the results of FLUKA, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter list showed different characteristics from the results obtained with the simple system. This leads to the conclusion that the physical model, particle transport mechanics and different geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation.
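Proton range comparisons of this kind are commonly quantified as the distal depth where the dose falls to a fixed fraction of the Bragg-peak maximum. A small sketch of that extraction; the 80% level and the depth-dose samples are illustrative assumptions, not data from the study:

```python
def distal_range(depths, dose, level=0.8):
    """Depth (same units as `depths`) where the dose falls to `level`
    of its maximum on the distal side of the Bragg peak."""
    peak = max(range(len(dose)), key=dose.__getitem__)
    target = level * dose[peak]
    for i in range(peak, len(dose) - 1):
        if dose[i] >= target > dose[i + 1]:
            # linear interpolation between samples i and i+1
            frac = (dose[i] - target) / (dose[i] - dose[i + 1])
            return depths[i] + frac * (depths[i + 1] - depths[i])
    raise ValueError("dose never falls below the target level")

# Idealized depth-dose samples (hypothetical numbers, 1 cm spacing)
depths = [0, 1, 2, 3, 4, 5, 6]
dose   = [30, 32, 35, 45, 100, 40, 5]   # Bragg peak at 4 cm
r80 = distal_range(depths, dose)        # distal 80% depth
```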

  7. Application of two-dimensional QR codes in the management of precious instruments in the operating room

    Institute of Scientific and Technical Information of China (English)

    蒋佳男; 王必超; 冯善武; 王化宇; 付东英

    2015-01-01

    Objective To manage the precious instruments in an operating room with self-made two-dimensional codes based on the QR two-dimensional code information-storage principle. Methods The relevant information of the instruments to be tracked was edited with office software, two-dimensional QR codes were generated from it with QR-code software, and the codes were pasted on the instruments for equipment management. Results The efficiency of equipment management using QR two-dimensional code technology was markedly higher than that of the traditional method. Conclusion Two-dimensional QR codes provide valuable benefits for optimizing the management process, compiling statistics and tracing instruments, offering great convenience for equipment management in the operating room.
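The information-storage step can be sketched independently of any QR library: the instrument record is serialized to a compact text payload, which a generator such as the `qrcode` package would then render as an image; a scanner app reverses it. All field names below are hypothetical:

```python
import json

def make_payload(instrument):
    """Serialize an instrument record into the compact text payload
    that a QR generator would encode into the printed code."""
    return json.dumps(instrument, ensure_ascii=False, separators=(",", ":"))

def read_payload(text):
    """Inverse step: what a scanner app does after decoding the code."""
    return json.loads(text)

record = {
    "id": "OR-0042",               # hypothetical asset tag
    "name": "flexible endoscope",
    "location": "operating room 3",
    "last_service": "2015-01-10",
}
payload = make_payload(record)     # round-trips losslessly
```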

  8. Optimal reconfiguration-based dynamic tariff for congestion management and line loss reduction in distribution networks

    DEFF Research Database (Denmark)

    Huang, Shaojun; Wu, Qiuwei; Cheng, Lin

    2016-01-01

    This paper presents an optimal reconfiguration-based dynamic tariff (DT) method for congestion management and line loss reduction in distribution networks with high penetration of electric vehicles. In the proposed DT concept, feeder reconfiguration (FR) is employed through mixed integer programming...... manner through the DT framework. Three case studies were conducted to validate the optimal reconfiguration-based DT method for congestion management and line loss reduction in distribution networks.
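The feeder-reconfiguration idea can be illustrated without a MIP solver by exhaustively searching switch states on a toy network. The `evaluate` function below is a stand-in for a power-flow run (radiality and feasibility are folded into it), and all coefficients and the congestion limit are invented:

```python
from itertools import product

def evaluate(switches):
    """Stand-in for a power-flow run on a 4-switch toy network:
    returns (line_loss_kw, max_feeder_loading_pct) for a switch vector."""
    loss = 50.0 - 6.0 * switches[0] - 4.0 * switches[1] \
           + 3.0 * switches[2] - 1.0 * switches[3]
    loading = 80.0 + 15.0 * switches[2] - 10.0 * switches[0]
    return loss, loading

LOAD_LIMIT = 90.0   # congestion constraint [% of thermal rating]
best_cfg, best_loss = None, float("inf")
for cfg in product((0, 1), repeat=4):       # 2^4 switch configurations
    loss, loading = evaluate(cfg)
    if loading <= LOAD_LIMIT and loss < best_loss:
        best_cfg, best_loss = cfg, loss
```

For realistic feeder counts the configuration space explodes, which is why the paper resorts to mixed integer programming instead of enumeration.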

  9. THE ANALYSIS AND ASSESSMENT OF AN ETHICS MANAGEMENT TOOL - CANADIAN MARKETING ASSOCIATION’S CODE OF ETHICS

    Directory of Open Access Journals (Sweden)

    IBRIAN CĂRĂMIDARU

    2007-01-01

    The object of this study is the Code of Ethics and Standards of Practice of the Canadian Marketing Association (CMA). The focus is on the structure, contents and role of this tool. The development, upgrading and implementation of this code are compared to the standards proposed by the Institute of Business Ethics (London). Because the CMA is an organization built upon professional criteria, all its ethical rules fall within professional business ethics; but as an association that prescribes behavioural frames for marketers, it involves the behaviour of firms and individuals that may or may not be members of the CMA. With this type of code, the CMA's contents stand in for other business ethics tools (credo, policies and procedures, conduct codes, etc.). In fact, the Code of Ethics and Standards of Practice presents both the overall strategies that the CMA uses concerning ethics management and the specific policies and procedures for situations that are controversial from a moral point of view.

  10. A simulation-optimization model for effective water resources management in the coastal zone

    Science.gov (United States)

    Spanoudaki, Katerina; Kampanis, Nikolaos

    2015-04-01

    -diffusion equation describing the fate and transport of contaminants introduced in a 3D turbulent flow field to the partial differential equation describing the fate and transport of contaminants in 3D transient groundwater flow systems. The model has been further developed to include the effects of density variations on surface water and groundwater flow, while the already built-in solute transport capabilities are used to simulate salinity interactions. The refined model is based on the finite volume method using a cell-centred structured grid, thus providing flexibility and accuracy in simulating irregular boundary geometries. For addressing water resources management problems, simulation models are usually externally coupled with optimisation-based management models. However, this usually requires a very large number of iterations between the optimisation and simulation models in order to obtain the optimal management solution. As an alternative approach, for improved computational efficiency, an Artificial Neural Network (ANN) is trained as an approximate simulator of IRENE. The trained ANN is then linked to a Genetic Algorithm (GA) based optimisation model for managing salinisation problems in the coastal zone. The linked simulation-optimisation model is applied to a hypothetical study area for performance evaluation. Acknowledgement The work presented in this paper has been funded by the Greek State Scholarships Foundation (IKY), Fellowships of Excellence for Postdoctoral Studies (Siemens Program), 'A simulation-optimization model for assessing the best practices for the protection of surface water and groundwater in the coastal zone', (2013 - 2015). References Spanoudaki, K., Stamou, A.I. and Nanou-Giannarou, A. (2009). Development and verification of a 3-D integrated surface water-groundwater model. Journal of Hydrology, 375 (3-4), 410-427. Spanoudaki, K. (2010). Integrated numerical modelling of surface water groundwater systems (in Greek). Ph.D. Thesis, National Technical

  11. Principles for a Code of Conduct for the Sustainable Management of Mangrove Ecosystems: A Work in Progress for Public Discussion

    DEFF Research Database (Denmark)

    Nielsen, Thomas

    The Principles for a Code of Conduct for Sustainable Management of Mangrove Ecosystems is a guide to assist states, local and national non-governmental organizations and other stakeholders to develop cooperatively local codes, laws and/or regulations to protect mangroves and the critical functions...... they serve. Mangroves contribute significantly to local livelihoods, biodiversity conservation, fisheries productivity and coastal protection. But these economic and environmental benefits are only achievable through greater awareness that mangroves must be managed sustainably. The objective is to help bring...... attention to the importance of mangrove ecosystems, particularly to policy makers, to help arrest and reverse their loss. The Principles, being a work in progress, are being discussed in a range of forums that involve representatives from governments, NGOs, multilateral organizations, research institutions...

  12. Application of ant colony optimization approach to severe accident management measures of Maanshan nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, C.-M.; Wang, S.-J. [Inst. of Nuclear Energy Research, Taiwan (China)

    2011-07-01

    The first three guidelines in the Maanshan SAMG were evaluated for their respective effects in the station blackout (SBO) incident. The MAAP5 code was used to simulate the sequence of events and physical phenomena in the plant. The results show that the priority optimization should be carried out for two separate scenarios, i.e., power recovered before or after hot-leg creep rupture. The vessel life and the hydrogen generation from the core could serve as the performance indices for the ant colony optimization. (author)
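A sketch of the underlying priority-optimization idea: score every ordering of a few mitigation actions against the two performance indices named above (vessel life gained, hydrogen generated). With three actions the permutation space can simply be enumerated; ant colony optimization searches the same space when enumeration is infeasible. Action names, effects, decay factor and weights are all invented for illustration:

```python
from itertools import permutations

# Hypothetical effect of each guideline: (vessel life gained [min],
# hydrogen generated [kg]); higher life and lower hydrogen are better.
actions = {"inject_RCS": (30.0, 12.0),
           "depressurize": (18.0, 5.0),
           "flood_cavity": (25.0, 9.0)}
DECAY = 0.8   # assumption: actions applied later are less effective

def score(order, w_h2=2.0):
    """Weighted performance index of an ordering of the actions."""
    life = h2 = 0.0
    for rank, a in enumerate(order):
        l, h = actions[a]
        life += (DECAY ** rank) * l
        h2 += (DECAY ** rank) * h
    return life - w_h2 * h2

# Exhaustive search over all 3! = 6 orderings.
best = max(permutations(actions), key=score)
```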

  13. Optimal trading quantity integration as a basis for optimal portfolio management

    Directory of Open Access Journals (Sweden)

    Saša Žiković

    2005-06-01

    The author in this paper points out the reasons behind calculating and using optimal trading quantity in conjunction with Markowitz's Modern portfolio theory. In the opening part the author presents an example of calculating optimal weights using Markowitz's Mean-Variance approach, followed by an explanation of the basic logic behind optimal trading quantity. The use of optimal trading quantity is not limited to systems with Bernoulli outcomes, but can also be applied when trading shares, futures, options, etc. Optimal trading quantity points out two often-overlooked axioms: (1) a system with negative mathematical expectancy can never be transformed into a system with positive mathematical expectancy; (2) by missing the optimal trading quantity an investor can turn a system with positive expectancy into a negative one. Optimal trading quantity is the quantity which maximizes the geometric mean (growth function) of a particular system. To determine the optimal trading quantity for simpler systems, with a very limited number of outcomes, a set of Kelly's formulas is appropriate. In the conclusion the summary of the paper is presented.
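For the Bernoulli case the optimal trading quantity is given by Kelly's formula, and the sketch below also demonstrates the two axioms: a negative-expectancy system yields no positive bet fraction, and overbetting turns a positive-expectancy system into a losing one. The win probability and odds are illustrative:

```python
import math

def kelly_fraction(p, b):
    """Optimal bet fraction for a Bernoulli system: win probability p,
    net odds b (win b per unit staked, otherwise lose the unit)."""
    return p - (1.0 - p) / b

def growth(f, p, b):
    """Expected log-growth (geometric-mean exponent) per trial at fraction f."""
    return p * math.log(1 + f * b) + (1 - p) * math.log(1 - f)

f_star = kelly_fraction(0.6, 1.0)   # 60% win rate at even odds -> 0.2
```

`growth(0.2, 0.6, 1.0)` is positive, while betting far past the optimum, e.g. `growth(0.9, 0.6, 1.0)`, is negative: axiom (2) in one line.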

  14. A model for the optimal risk management of farm firms

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    2012-01-01

    Risk management is an integrated part of business or firm management and deals with the problem of how to avoid the risk of economic losses when the objective is to maximize expected profit. This paper will focus on the identification, assessment, and prioritization of risks in agriculture, followed by a description of procedures for coordinated and economical application of resources to control the probability and/or impact of unfortunate events. Besides identifying the major risk factors and tools for risk management in agricultural production, the paper will look critically into the current methods for risk management. Risk management is typically based on numerical analysis and the concept of efficiency. None of the methods developed so far actually solve the basic question of how the individual manager should behave so as to optimise the balance between expected profit/income and risk. In the paper......

  15. Selection of the optimal model of integrated sustainable management system in the mining companies

    OpenAIRE

    Miletić, Slavica; Bogdanović, Dejan; Paunković, Jane

    2015-01-01

    The multi-criteria analysis for the selection of the optimal model of integrated management system was conducted in this paper in order to improve the performance of mining companies. An integrated management system is the process of integrating different management systems in contemporary business, a requirement for each company in order to survive in the market. Modern companies that have implemented an integrated management system work better, have well-arranged processes, and improve their structur...

  16. Optimal power flow for technically feasible Energy Management systems in Islanded Microgrids

    DEFF Research Database (Denmark)

    Sanseverino, Eleonora Riva; T. T. Quynh, T.; Di Silvestre, Maria Luisa

    2016-01-01

    This paper presents a combined optimal energy and power flow management for islanded microgrids. The highest control level in this case will provide a feasible and optimized operating point around the economic optimum. In order to account for both unbalanced and balanced loads, the optimal power flow is carried out using a Glow-worm Swarm Optimizer. The control level is organized into two different sub-levels, the highest of which accounts for minimum cost operation and the lowest one solving the optimal power flow and devising the set points of inverter interfaced generation units......

  17. Assets and Liabilities Management – Concept and Optimal Organization

    OpenAIRE

    Ciobotea Adina; Oaca Sorina Cristina

    2011-01-01

    Asset-liability management (ALM) is a term whose meaning has evolved. It is used in slightly different ways in different contexts. ALM was pioneered by financial institutions, but corporations now also apply ALM techniques. In banking, asset and liability management is the practice of managing risks that arise due to mismatches between the assets and liabilities (debts and assets) of the bank. This can also be seen in insurance. Banks face several risks such as the liquidity risk, interest ra...

  18. A convex programming framework for optimal and bounded suboptimal well field management

    DEFF Research Database (Denmark)

    Dorini, Gianluca Fabio; Thordarson, Fannar Ørn; Bauer-Gottwein, Peter

    2012-01-01

    The objective of the management is to minimize the total cost of pump operations over a multistep time horizon, while fulfilling a set of time-varying management constraints. Optimization in groundwater management and pressurized WDNs has been widely investigated in the literature. Problem formulations are often convex, hence global optimality can be attained by a wealth of algorithms. Among these, the Interior Point methods are extensively employed for practical applications, as they are capable of efficiently solving large-scale problems. Despite this, management models explicitly embedding both systems......
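A minimal convex instance of the pump-operation problem can even be solved in closed form via the KKT conditions (equal marginal cost across wells), which is the same structure Interior Point solvers exploit at scale. Costs and demand below are hypothetical:

```python
def optimal_pumping(costs, demand):
    """Minimize sum(c_i * q_i**2) subject to sum(q_i) = demand, q_i >= 0.
    With c_i > 0, the KKT conditions give equal marginal cost
    2 * c_i * q_i = lam for every well, so q_i = lam / (2 * c_i)."""
    lam = demand / sum(1.0 / (2.0 * c) for c in costs)
    return [lam / (2.0 * c) for c in costs]

# Three wells with quadratic pumping costs; demand must be met exactly.
q = optimal_pumping([1.0, 2.0, 4.0], demand=7.0)
```

Cheaper wells carry proportionally more of the load, and every well ends up with the same marginal cost, the fingerprint of an optimal convex allocation.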

  19. Final Cost and Performance Report Application of Flow and Transport Optimization Codes to Groundwater Pump and Treat Systems

    Science.gov (United States)

    2004-01-01

    Keywords: pump and treat; Remediation System Evaluation (RSE); Remedial Process Optimization (RPO); soil vapor extraction (SVE); trichloroethylene (TCE). Remediation System Evaluation (RSE) or Remedial Process Optimization (RPO) provides a broad assessment of optimization...

  20. Optimized and secure technique for multiplexing QR code images of single characters: application to noiseless messages retrieval

    Science.gov (United States)

    Trejos, Sorayda; Fredy Barrera, John; Torroba, Roberto

    2015-08-01

    We present for the first time an optical encrypting-decrypting protocol for recovering messages without speckle noise. This is a digital holographic technique using a 2f scheme to process QR code entries. In the procedure, letters used to compose eventual messages are individually converted into a QR code, and then each QR code is divided into portions. Through a holographic technique, we store each processed portion. After filtering and repositioning, we add all processed data to create a single pack, thus simplifying the handling and recovery of multiple QR code images and representing the first multiplexing procedure applied to processed QR codes. All QR codes are recovered in a single step and in the same plane, showing neither cross-talk nor noise problems as in other methods. Experiments have been conducted using an interferometric configuration, and comparisons between unprocessed and recovered QR codes have been performed, showing differences between them due to the involved processing. Recovered QR codes can be successfully scanned thanks to their noise tolerance. Finally, the appropriate sequence in the scanning of the recovered QR codes brings a noiseless retrieved message. Additionally, to procure maximum security, the multiplexed pack can be multiplied by a digital diffuser so as to encrypt it. The encrypted pack is easily decoded by multiplying it by the complex conjugate of the diffuser. As this is a digital operation, no noise is added. Therefore, this technique is threefold robust, involving multiplexing, encryption, and the need of a sequence to retrieve the outcome.
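The diffuser step described at the end is easy to sketch numerically: multiply the complex field by a random-phase mask to encrypt, then multiply by the mask's complex conjugate to decrypt, with no added noise. This toy NumPy reconstruction is an illustration of the principle, not the authors' holographic pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for the multiplexed pack: a complex field (values arbitrary).
pack = rng.random((64, 64)) + 1j * rng.random((64, 64))

# Encrypt: multiply by a random-phase digital diffuser (unit modulus).
phase = rng.uniform(0.0, 2.0 * np.pi, pack.shape)
diffuser = np.exp(1j * phase)
encrypted = pack * diffuser

# Decrypt: multiply by the complex conjugate of the same diffuser.
recovered = encrypted * np.conj(diffuser)
```

Because `diffuser * conj(diffuser) == 1` everywhere, `recovered` matches `pack` to floating-point precision, while `encrypted` is unintelligible without the key.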

  1. Performance Optimization of Dispersion-Managed WDM Systems Based on Four-Wave Mixing

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    We systematically investigate interchannel four-wave mixing (FWM) in dispersion-managed WDM systems with arbitrary launch position, and optimize the number of fiber sections and the dispersion ratio for system performance.

  2. Optimization Alternatives of Information Systems for Risk Management

    Directory of Open Access Journals (Sweden)

    Iulia APOSTOL-MAURER

    2008-01-01

    Full Text Available The focus of this article lies on synthesizing the requirements a risk management information system should meet in order to ensure efficient risk management within a company, and on presenting the architecture of the information system proposed by the author, both from a structural point of view and in light of the available data integration alternatives.

  3. Optimization Alternatives of Information Systems for Risk Management

    OpenAIRE

    Iulia APOSTOL-MAURER

    2008-01-01

    The focus of this article lies on synthesizing the requirements a risk management information system should meet in order to ensure efficient risk management within a company, and on presenting the architecture of the information system proposed by the author, both from a structural point of view and in light of the available data integration alternatives.

  4. Cover crop residue management for optimizing weed control

    NARCIS (Netherlands)

    Kruidhof, H.M.; Bastiaans, L.; Kropff, M.J.

    2009-01-01

    Although residue management seems a key factor in residue-mediated weed suppression, very few studies have systematically compared the influence of different residue management strategies on the establishment of crop and weed species. We evaluated the effect of several methods of pre-treatment and p…

  5. Management of Automotive Engine Based on Stable Fuzzy Technique with Parallel Sliding Mode Optimization

    Directory of Open Access Journals (Sweden)

    Mansour Bazregar

    2013-12-01

    Full Text Available Both fuzzy logic and sliding mode can compensate for the steady-state error of the proportional-derivative (PD) method. This paper presents parallel sliding mode optimization for fuzzy PD management. The asymptotic stability of fuzzy PD management with first-order sliding mode optimization in the parallel structure is proven. For the parallel structure, finite-time convergence with a super-twisting second-order sliding mode is guaranteed.
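
The idea that a sliding-mode term removes the steady-state error a plain PD controller leaves under a constant disturbance can be illustrated on a toy double-integrator plant. The plant model and all gains below are our own illustrative choices, not the paper's:

```python
import numpy as np

# Toy second-order plant x'' = u + d with a constant disturbance d.
# PD alone would settle with a nonzero offset (kp*e = -d); the first-order
# sliding-mode switching term drives that residual error to zero.
dt, d = 0.001, 0.5
kp, kd = 20.0, 8.0          # PD gains (illustrative values)
lam, eta = 5.0, 1.0         # sliding-surface slope and switching gain

x, v, target = 0.0, 0.0, 1.0
for _ in range(20000):      # 20 s of simulated time
    e, edot = target - x, -v
    s = edot + lam * e                          # sliding surface s = e' + lam*e
    u = kp * e + kd * edot + eta * np.sign(s)   # PD + first-order sliding mode
    a = u + d
    v += a * dt
    x += v * dt

print(abs(target - x))      # residual tracking error (small)
```

With `eta` larger than the disturbance bound, the reaching condition holds and the state chatters on the surface `s = 0`, so `e` decays to zero at rate `lam` instead of parking at the PD offset `-d/kp`.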

  6. Multiobjective waste management optimization strategy coupling life cycle assessment and genetic algorithms: application to PET bottles

    OpenAIRE

    Komly, Claude-Emma; Azzaro-Pantel, Catherine; Hubert, Antoine; Pibouleau, Luc; Archambault, Valérie

    2012-01-01

    International audience; A mathematical model based on life-cycle assessment (LCA) results is developed to assess the environmental efficiency of the end-of-life management of polyethylene terephthalate (PET) bottles. For this purpose, multiobjective optimization and decision support tools are used to define optimal targets for efficient waste management. The global environmental impacts associated with the treatment of PET bottles from their cradle to their ultimate graves (incineration, land...
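
As a loose illustration of coupling LCA-style impact scores with a genetic algorithm, the sketch below evolves a split of PET bottles across three end-of-life routes under a weighted-sum objective. The impact and cost numbers are invented stand-ins for real LCA results, and this minimal weighted-sum GA is not the authors' multiobjective method:

```python
import numpy as np

rng = np.random.default_rng(4)

# Fractions of PET sent to (recycling, incineration, landfill).
# Per-tonne impact scores and costs are made-up stand-ins for LCA results.
impact = np.array([1.0, 4.0, 6.0])    # environmental impact per tonne
cost = np.array([5.0, 2.0, 1.0])      # treatment cost per tonne

def fitness(x, w=0.7):
    """Weighted sum of the two objectives (lower is better)."""
    return w * impact @ x + (1 - w) * cost @ x

def project(x):
    """Keep fractions nonnegative and summing to one."""
    x = np.clip(x, 0, None)
    return x / x.sum()

pop = np.array([project(rng.random(3)) for _ in range(40)])
for _ in range(100):
    scores = np.array([fitness(x) for x in pop])
    parents = pop[np.argsort(scores)[:20]]             # elitist selection
    kids = [project((parents[rng.integers(20)] + parents[rng.integers(20)]) / 2
                    + rng.normal(0, 0.05, 3))          # crossover + mutation
            for _ in range(20)]
    pop = np.vstack([parents, kids])

best = pop[np.argmin([fitness(x) for x in pop])]
print(best.round(2))
```

With these stand-in numbers the weighted objective favors recycling, so the population converges toward sending nearly everything to the first route; a genuine multiobjective run would instead trace out a Pareto front over the weight choices.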

  7. Patients optimizing epilepsy management via an online community

    Science.gov (United States)

    Barnes, Deborah; Parko, Karen; Durgin, Tracy; Van Bebber, Stephanie; Graham, Arianne; Wicks, Paul

    2015-01-01

    Objective: The study objective was to test whether engaging in an online patient community improves self-management and self-efficacy in veterans with epilepsy. Methods: The study's primary outcomes were validated questionnaires for self-management (Epilepsy Self-Management Scale [ESMS]) and self-efficacy (Epilepsy Self-Efficacy Scale [ESES]). Results were based on within-subject comparisons of pre- and postintervention survey responses of veterans with epilepsy engaging with the PatientsLikeMe platform for a period of at least 6 weeks. Analyses were based on both completer and intention-to-treat scenarios. Results: Of 249 eligible participants enrolled, 92 individuals completed both surveys. Over 6 weeks, completers improved their epilepsy self-management (ESMS total score from 139.7 to 142.7, p = 0.02) and epilepsy self-efficacy (ESES total score from 244.2 to 254.4, p = 0.02) scores, with the greatest impact on an information management subscale (ESMS–information management total score from 20.3 to 22.4, p < 0.001). Results were similar in intention-to-treat analyses. Logins, forum postings, profile comments, and private messages were all more common among completers than noncompleters. Conclusions: An internet-based psychosocial intervention was feasible to implement in the US veteran population and increased epilepsy self-management and self-efficacy scores. The greatest improvement was noted for information management behaviors. Patients with chronic conditions are increasingly encouraged to self-manage their condition, and digital communities have potential advantages, such as convenience, scalability to large populations, and building a community support network. Classification of evidence: This study provides Class IV evidence that for patients with epilepsy, engaging in an online patient community improves self-management and self-efficacy. PMID:26085605

  8. Managing XML Data to optimize Performance into Object-Relational Databases

    Directory of Open Access Journals (Sweden)

    Iuliana BOTHA

    2011-06-01

    Full Text Available This paper proposes some possibilities for managing XML data in order to optimize performance in object-relational databases. It details the possibility of storing XML data in such databases, using an Oracle database for exemplification, and tests several techniques for optimizing queries over XMLType tables, such as indexing and table partitioning.

  9. Optimal search: a practical interpretation of information-driven sensor management

    NARCIS (Netherlands)

    Katsilieris, F.; Boers, Y.

    We consider the problem of scheduling an agile sensor to perform optimal search for a target. A probability density function is created to represent our knowledge about where the target might be, and it is utilized by the proposed sensor management criteria for finding optimal search…
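
A minimal version of such probability-driven search: maintain a belief grid over target locations, greedily search the most probable cell, and apply a Bayes update after each miss. The grid size, detection probability, and "hot spot" prior are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Prior belief over a 10x10 search grid, with a hot spot from prior intelligence.
belief = np.full((10, 10), 1.0 / 100)
belief[3, 7] = 0.05
belief /= belief.sum()
p_detect = 0.8   # prob. the sensor detects the target when looking at its cell

searched = []
for _ in range(5):
    # Greedy rule: search the currently most probable cell next.
    cell = tuple(int(i) for i in np.unravel_index(np.argmax(belief), belief.shape))
    searched.append(cell)
    # Bayes update after a miss: down-weight the searched cell, renormalize.
    belief[cell] *= (1 - p_detect)
    belief /= belief.sum()

print(searched[0])   # (3, 7): the hot spot is searched first
```

Greedy maximum-probability search is a common baseline; information-driven criteria (e.g., maximizing expected entropy reduction) generalize this rule but reduce to it in simple detection models like this one.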

  10. Portfolio management using value at risk: A comparison between genetic algorithms and particle swarm optimization

    NARCIS (Netherlands)

    V.A.F. Dallagnol (V. A F); J.H. van den Berg (Jan); L. Mous (Lonneke)

    2009-01-01

    In this paper, we compare the application of particle swarm optimization and genetic algorithms to portfolio management, in a constrained portfolio optimization problem where no short sales are allowed. The objective function to be minimized is the value at risk calculated…
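
A bare-bones particle swarm for this kind of constrained problem can be sketched as follows: each particle is a portfolio, the no-short-sales and full-investment constraints are enforced by projection, and the objective is a historical 95% VaR. The simulated returns, swarm parameters, and projection step are our own illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated daily returns for 4 assets (rows: days, columns: assets).
returns = rng.normal(0.0005, 0.01, size=(500, 4))

def var_95(weights):
    """Historical 95% value at risk of the portfolio (positive = loss)."""
    return -np.percentile(returns @ weights, 5)

def project(w):
    """Enforce no short sales and full investment: w >= 0, sum(w) == 1."""
    w = np.clip(w, 0, None)
    return w / w.sum() if w.sum() > 0 else np.full_like(w, 1 / len(w))

# Minimal particle swarm: positions are portfolios, velocities perturb them.
n_particles, n_iter = 30, 200
pos = np.array([project(rng.random(4)) for _ in range(n_particles)])
vel = rng.normal(0, 0.05, size=pos.shape)
pbest, pbest_val = pos.copy(), np.array([var_95(w) for w in pos])
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.array([project(p) for p in pos + vel])
    vals = np.array([var_95(w) for w in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(gbest.round(3), var_95(gbest))
```

A GA variant would replace the velocity update with selection, crossover, and mutation over the same projected representation, which is what makes the two metaheuristics directly comparable on this problem.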

  11. Distribution Locational Marginal Pricing for Optimal Electric Vehicle Charging Management

    DEFF Research Database (Denmark)

    Li, Ruoyang; Wu, Qiuwei; Oren, Shmuel S.

    2013-01-01

    This paper presents an integrated distribution locational marginal pricing (DLMP) method designed to alleviate congestion induced by electric vehicle (EV) loads in future power systems. In the proposed approach, the distribution system operator (DSO) determines distribution locational marginal prices (DLMPs) by solving the social welfare optimization of the electric distribution system, which considers EV aggregators as price takers in the local DSO market and accounts for demand price elasticity. Nonlinear optimization is used to solve the social welfare optimization problem in order to obtain the DLMPs. The efficacy of the proposed approach was demonstrated using the bus 4 distribution system of the Roy Billinton Test System (RBTS) and Danish driving data. The case study results show that the integrated DLMP methodology can successfully alleviate the congestion caused by EV loads. It is also...
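
The core idea, prices emerging as duals of a congested welfare optimization, can be illustrated with a two-aggregator toy LP (not the paper's nonlinear model). The utilities and feeder capacity are made up, and we read the DLMP off the shadow price that SciPy's HiGHS backend reports:

```python
from scipy.optimize import linprog

# Two EV aggregators sharing one congested feeder (capacity 10 kW).
# Marginal utilities (willingness to pay, $/kWh) and demand caps are assumptions.
u = [0.40, 0.25]
cap = 10.0

res = linprog(
    c=[-u[0], -u[1]],                 # maximize welfare = minimize -welfare
    A_ub=[[1.0, 1.0]], b_ub=[cap],    # feeder congestion constraint
    bounds=[(0, 8), (0, 8)],          # each aggregator wants up to 8 kW
    method="highs",
)
x = res.x
dlmp = -res.ineqlin.marginals[0]      # dual (shadow price) of the capacity limit
print(x, dlmp)
```

The high-value aggregator is served in full (8 kW), the other gets the remaining 2 kW, and the DLMP settles at the marginal aggregator's utility, $0.25/kWh, so a price-taking aggregator facing that DLMP would voluntarily stay within the feeder limit.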

  12. Optimal Power Cost Management Using Stored Energy in Data Centers

    CERN Document Server

    Urgaonkar, Rahul; Neely, Michael J; Sivasubramaniam, Anand

    2011-01-01

    Since the electricity bill of a data center constitutes a significant portion of its overall operational costs, reducing it has become important. We investigate cost reduction opportunities that arise from the use of uninterrupted power supply (UPS) units as energy storage devices. This represents a deviation from the usual use of these devices as mere transitional fail-over mechanisms between utility and captive sources such as diesel generators. We consider the problem of opportunistically using these devices to reduce the time average electric utility bill in a data center. Using the technique of Lyapunov optimization, we develop an online control algorithm that can optimally exploit these devices to minimize the time average cost. This algorithm operates without any knowledge of the statistics of the workload or electricity cost processes, making it attractive in the presence of workload and pricing uncertainties. An interesting feature of our algorithm is that its deviation from optimality reduces as the...
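
A drastically simplified stand-in for such an online storage policy is a price-threshold rule: charge the UPS when electricity is cheap and discharge it when it is expensive. The price series, battery parameters, and threshold below are invented for illustration; the paper's Lyapunov-based algorithm is more sophisticated, but it shares the key property of needing no price statistics:

```python
import numpy as np

rng = np.random.default_rng(2)

# Time-varying electricity price ($/unit) and a fixed data-center load.
T = 200
price = 40 + 15 * np.sin(np.arange(T) / 8) + rng.normal(0, 2, T)
load = 5.0                     # kW drawn every slot
cap, rate = 50.0, 5.0          # battery capacity and charge/discharge limit

def run(threshold):
    """Price-threshold policy over the horizon; returns the total bill."""
    soc, cost = cap / 2, 0.0
    for p in price:
        if p < threshold and soc < cap:        # cheap: buy extra, charge
            charge = min(rate, cap - soc)
            cost += p * (load + charge)
            soc += charge
        elif p > threshold and soc > 0:        # expensive: serve load from battery
            discharge = min(rate, soc, load)
            cost += p * (load - discharge)
            soc -= discharge
        else:
            cost += p * load
    return cost

baseline = float(np.sum(price * load))         # no battery at all
with_storage = run(threshold=40.0)
print(baseline, with_storage)                  # storage lowers the total bill
```

Buying energy below the threshold and displacing purchases above it performs temporal arbitrage; the Lyapunov approach in the paper effectively computes such thresholds adaptively from the battery's state of charge, with a provable bound on the gap to the offline optimum.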

  13. User's guide for the BNW-III optimization code for modular dry/wet-cooled power plants

    Energy Technology Data Exchange (ETDEWEB)

    Braun, D.J.; Faletti, D.W.

    1984-09-01

    This user's guide describes BNW-III, a computer code developed by the Pacific Northwest Laboratory (PNL) as part of the Dry Cooling Enhancement Program sponsored by the US Department of Energy (DOE). The BNW-III code models a modular dry/wet cooling system for a nuclear or fossil fuel power plant. The purpose of this guide is to give the code user a brief description of what the BNW-III code is and how to use it. It describes the cooling system being modeled and the various models used. A detailed description of code input and code output is also included. The BNW-III code was developed to analyze a specific cooling system layout. However, there is a large degree of freedom in the type of cooling modules that can be selected and in the performance of those modules. The costs of the modules are input to the code, giving the user a great deal of flexibility.

  14. User's manual for DELSOL2: a computer code for calculating the optical performance and optimal system design for solar-thermal central-receiver plants

    Energy Technology Data Exchange (ETDEWEB)

    Dellin, T.A.; Fish, M.J.; Yang, C.L.

    1981-08-01

    DELSOL2 is a revised and substantially extended version of the DELSOL computer program for calculating collector field performance and layout, and optimal system design for solar thermal central receiver plants. The code consists of a detailed model of the optical performance, a simpler model of the non-optical performance, an algorithm for field layout, and a searching algorithm to find the best system design. The latter two features are coupled to a cost model of central receiver components and an economic model for calculating energy costs. The code can handle flat, focused and/or canted heliostats, and external cylindrical, multi-aperture cavity, and flat plate receivers. The program optimizes the tower height, receiver size, field layout, heliostat spacings, and tower position at user specified power levels subject to flux limits on the receiver and land constraints for field layout. The advantages of speed and accuracy characteristic of Version I are maintained in DELSOL2.

  15. ON CHANNEL ESTIMATION USING OPTIMAL TRAINING SEQUENCES IN CYCLIC-PREFIX-BASED SINGLE-CARRIER SYSTEMS WITH SPACE-TIME BLOCK-CODING

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this paper, a new scheme is proposed that combines Space-Time Block-Coding (STBC) based on an Alamouti-like scheme with Least Squares (LS) channel estimation using optimal training sequences in Cyclic-Prefix-based (CP) Single-Carrier (SC) systems. With two transmit antennas, based on the Cramer-Rao lower bound for channel estimation, it is shown that the Periodic Complementary Set (PCS) is optimal over frequency-selective fading channels. Compared with the normal scheme without STBC, 3 dB Mean Square Error (MSE) performance gains and fewer restrictions on the length of the channel impulse response are demonstrated.
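
The frequency-domain least-squares channel estimate that CP-based training enables can be sketched as follows. We use a Zadoff-Chu sequence as a generic constant-amplitude training signal; the paper's optimal sequences are periodic complementary sets, which we do not reproduce here:

```python
import numpy as np

rng = np.random.default_rng(3)
N, L = 64, 4                       # training block length, channel taps

# Zadoff-Chu (CAZAC) training: its flat spectrum keeps the LS estimate
# well conditioned in every frequency bin.
n = np.arange(N)
train = np.exp(-1j * np.pi * n**2 / N)

# Random frequency-selective channel.
h = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2 * L)

# The cyclic prefix turns linear convolution into circular convolution,
# so the received block is IFFT(FFT(train) * FFT(h)) plus noise.
rx = np.fft.ifft(np.fft.fft(train) * np.fft.fft(h, N))
rx += 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

# Least-squares channel estimate in the frequency domain.
H_ls = np.fft.fft(rx) / np.fft.fft(train)
h_est = np.fft.ifft(H_ls)[:L]      # keep only the known channel length

print(np.abs(h - h_est).max())     # small estimation error
```

Truncating the estimate to the known channel length averages the per-bin noise across all N bins, which is the same mechanism by which better-designed training (such as the PCS of the paper) buys MSE gains.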

  16. An optimization-based approach for facility energy management with uncertainties, and, Power portfolio optimization in deregulated electricity markets with risk management

    Science.gov (United States)

    Xu, Jun

    Topic 1. An Optimization-Based Approach for Facility Energy Management with Uncertainties. Effective energy management for facilities is becoming increasingly important in view of the rising energy costs, the government mandate on the reduction of energy consumption, and the human comfort requirements. This part of the dissertation presents a daily energy management formulation and the corresponding solution methodology for HVAC systems. The problem is to minimize the energy and demand costs through the control of HVAC units while satisfying human comfort, system dynamics, load limit constraints, and other requirements. The problem is difficult in view of the fact that the system is nonlinear, time-varying, building-dependent, and uncertain, and that the direct control of a large number of HVAC components is difficult. In this work, HVAC setpoints are the control variables developed on top of a Direct Digital Control (DDC) system. A method that combines Lagrangian relaxation, neural networks, stochastic dynamic programming, and heuristics is developed to predict the system dynamics and uncontrollable load, and to optimize the setpoints. Numerical testing and prototype implementation results show that our method can effectively reduce total costs, manage uncertainties, and shed load, and is computationally efficient. Furthermore, it is significantly better than existing methods. Topic 2. Power Portfolio Optimization in Deregulated Electricity Markets with Risk Management. In a deregulated electric power system, multiple markets of different time scales exist with various power supply instruments. A load serving entity (LSE) has multiple choices from these instruments to meet its load obligations. 
    In view of the large amount of power involved, the complex market structure, the risks in such volatile markets, the stringent constraints to be satisfied, and the long time horizon, a power portfolio optimization problem is of critical importance, but also of considerable difficulty, for an LSE to serve the…
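
The dynamic-programming backbone underlying such HVAC setpoint optimization can be sketched as a backward recursion over a discretized indoor-temperature grid. The thermal model, comfort band, and time-of-use prices below are toy assumptions, and the dissertation's additions (Lagrangian relaxation, neural load prediction, stochastics) are omitted:

```python
import numpy as np

# Toy daily HVAC problem: choose cooling power u_t each hour to keep the
# indoor temperature in a comfort band while minimizing time-of-use cost.
T = 24
price = np.where((np.arange(T) >= 12) & (np.arange(T) < 18), 0.30, 0.10)
outside = 30.0
temps = np.arange(20.0, 27.0, 0.5)        # discretized indoor temperatures
actions = np.arange(0.0, 3.1, 0.5)        # cooling power levels, kW
alpha, beta = 0.2, 0.8                    # heat-gain and cooling coefficients

def step(temp, u):
    """One-hour thermal update, snapped back onto the temperature grid."""
    nxt = temp + alpha * (outside - temp) - beta * u
    return temps[np.abs(temps - nxt).argmin()]

INF = 1e9
cost = {t: 0.0 for t in temps}            # terminal cost-to-go
for h in reversed(range(T)):              # backward DP recursion
    new = {}
    for temp in temps:
        best = INF
        for u in actions:
            nxt = step(temp, u)
            if 20.0 <= nxt <= 24.0:       # comfort constraint on the next state
                best = min(best, price[h] * u + cost[nxt])
        new[temp] = best
    cost = new

print(cost[22.0])   # minimal cost-to-go from a 22 degree start
```

The curse of dimensionality of this enumeration over states and setpoints is exactly what motivates the dissertation's combination of Lagrangian relaxation and heuristics for realistic building models.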

  17. OPTIMAL SOLUTIONS FOR IMPLEMENTING THE SUPPLY-SALES CHAIN MANAGEMENT

    National Research Council Canada - National Science Library

    Elena COFAS

    2014-01-01

    ...: obsolete inventory devaluation, impairment, etc. Since the 1980s, several companies have gathered into a single department all the functions dealing with the logistic flow from supply to distribution, through production management and resource planning...

  18. Optimizing resource and energy recovery for materials and waste management

    Science.gov (United States)

    Decisions affecting materials management today are generally based on cost and a presumption of favorable outcomes without an understanding of the environmental tradeoffs. However, there is a growing demand to better understand and quantify the net environmental and energy trade-offs…

  19. Optimizing the Banking Activity Using Assets & Liabilities Management

    Directory of Open Access Journals (Sweden)

    Vasile Dedu

    2008-10-01

    Full Text Available In the present study, starting from international experience, we reveal the role that should be taken by the Assets and Liabilities Committee (ALCO) within Romanian commercial banks. ALCO has become one of the tools used by the executive management of banks to take decisions regarding the future policy of assets and liabilities management, relying on synthesized information prepared by well-trained technicians without voting rights (usually middle-management staff). We consider that an assets and liabilities management strategy cannot be implemented without an appropriate corporate governance structure, even when the bank has highly specialized staff. Models of some western banking institutions may be considered benchmarks by Romanian banks.

  20. Optimizing resource and energy recovery for materials and waste management

    Science.gov (United States)

    Decisions affecting materials management today are generally based on cost and a presumption of favorable outcomes without an understanding of the environmental tradeoffs. However, there is a growing demand to better understand and quantify the net environmental and energy trade-offs…