WorldWideScience

Sample records for management optimization code

  1. Engineering application of in-core fuel management optimization code with CSA algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhihong; Hu, Yongming [INET, Tsinghua University, Beijing 100084 (China)]

    2009-06-15

    PWR in-core loading (reloading) pattern optimization is a complex combinatorial problem. An effective fuel management optimization code can greatly improve the efficiency of core reloading design and bring economic and safety benefits. Many optimization codes based on engineering experience or search algorithms (such as SA, GA, ANN and ACO) have been developed, but improving their search efficiency and engineering usability still requires further research. CSA (Characteristic Statistic Algorithm) is a highly efficient global optimization algorithm developed by our team; its performance has been demonstrated on many problems (such as the Traveling Salesman Problem). The idea of CSA is to guide the search direction using the statistical distribution of characteristic values, which makes the algorithm well suited to fuel management optimization. An optimization code based on CSA has been developed and applied to many core models. The work reported here improves the engineering usability of the CSA code in line with actual engineering requirements. Many improvements have been made in this code: 1. To account for the asymmetry of burn-up within an assembly, the rotation of each assembly is introduced as a new optimization variable. 2. The worth of the control rods must satisfy a given constraint, so corresponding modifications have been added to the optimization code. 3. To handle combinations of alternate cycles, multi-cycle optimization is supported. 4. To confirm the accuracy of the optimization results, extensive verification of the physics calculation module has been carried out, and the parameters of the optimized schemes are checked with the SCIENCE code. The improved CSA optimization code has been applied to the Qinshan nuclear power plant in China: the reloadings of cycles 7, 8 and 9 (12-month cycles, no burnable poisons) and of the 18-month equilibrium cycle (with burnable poisons) were optimized. Many optimized schemes were found by the CSA code
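
    The abstract's idea of inducing the search direction from the statistical distribution of characteristic values can be illustrated, very loosely, by a cross-entropy-style sketch. This is not the authors' CSA: the objective `toy_cost`, the bit-string encoding, and all parameters are hypothetical stand-ins for a real loading-pattern evaluation.

```python
import random

def toy_cost(x):
    # Hypothetical stand-in for a core-physics evaluation:
    # count mismatches against an arbitrary "good" pattern.
    target = [1, 0, 1, 1, 0, 0, 1, 0]
    return sum(a != b for a, b in zip(x, target))

def statistics_guided_search(n=8, pop=50, elite=10, iters=30, seed=1):
    """Cross-entropy-style sketch: bias sampling toward values that
    occur frequently among the best ("elite") candidates."""
    rng = random.Random(seed)
    p = [0.5] * n                       # per-position sampling distribution
    best, best_cost = None, float("inf")
    for _ in range(iters):
        cand = [[1 if rng.random() < p[i] else 0 for i in range(n)]
                for _ in range(pop)]
        cand.sort(key=toy_cost)
        if toy_cost(cand[0]) < best_cost:
            best, best_cost = cand[0], toy_cost(cand[0])
        for i in range(n):              # update statistics with smoothing
            freq = sum(x[i] for x in cand[:elite]) / elite
            p[i] = 0.7 * p[i] + 0.3 * freq
    return best, best_cost
```

    The statistics of the elite candidates play the role the characteristic-value distribution plays in CSA: they concentrate sampling on promising regions without enumerating the combinatorial space.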

  2. Two-dimensional core calculation research for fuel management optimization based on CPACT code

    International Nuclear Information System (INIS)

    Chen Xiaosong; Peng Lianghui; Gang Zhi

    2013-01-01

    The fuel management optimization process requires rapid assessment of candidate core loading patterns; commonly used methods include the two-dimensional diffusion nodal method, the perturbation method, neural network methods, etc. A two-dimensional loading-pattern evaluation code was developed based on the three-dimensional LWR diffusion calculation program CPACT. An axial buckling term, introduced to simulate axial leakage, is searched for within burnup sub-intervals to correct the two-dimensional core diffusion results. In addition, to obtain better accuracy, the equivalent volume weighting method for the control rod assembly cross-sections was improved. (authors)

  3. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  4. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs

  5. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of the network coding approach, which focuses on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  6. Progress on DART code optimization

    International Nuclear Information System (INIS)

    Taboada, Horacio; Solis, Diego; Rest, Jeffrey

    1999-01-01

    This work describes the progress made on the design and development of a new optimized version of the DART code (DART-P), a mechanistic computer model for the performance calculation and assessment of aluminum dispersion fuel. It is part of a collaboration agreement between CNEA and ANL in the area of Low Enriched Uranium Advanced Fuels, conducted under the Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy, signed on October 16, 1997 between the US DOE and the National Atomic Energy Commission of the Argentine Republic. DART optimization is a biannual program, operative since February 8, 1999, with the following goals: 1. Design and develop a new DART calculation kernel for implementation within a parallel processing architecture. 2. Design and develop new user-friendly I/O routines to be resident on Personal Computer (PC)/Workstation (WS) platforms. 2.1. The new input interface will be designed and developed as a visual interface able to guide the user in the construction of the problem to be analyzed, with the aid of a new database (described in item 3 below). The new input interface will include input data checks in order to avoid corrupted input data. 2.2. The new output interface will be designed and developed using graphical tools able to translate numeric output into on-line graphic information. 3. Design and develop a new irradiated-materials database, resident on the PC/WS platform, to facilitate the analysis of the behavior of different fuel and meat compositions with DART-P. Currently, a different version of DART is used for oxide, silicide, and advanced alloy fuels. 4. Develop rigorous general inspection algorithms in order to provide valuable DART-P benchmarks. 5. Design and develop new models, such as superplasticity, elastoplastic feedback, improved models for the calculation of fuel deformation and the evolution of the fuel microstructure for

  7. Some optimizations of the animal code

    International Nuclear Information System (INIS)

    Fletcher, W.T.

    1975-01-01

    Optimizing techniques were performed on a version of the ANIMAL code (MALAD1B) at the source-code (FORTRAN) level. Sample optimizing techniques and operations used in MALADOP--the optimized version of the code--are presented, along with a critique of some standard CDC 7600 optimizing techniques. The statistical analysis of total CPU time required for MALADOP and MALAD1B shows a run-time saving of 174 msec (almost 3 percent) in the code MALADOP during one time step

  8. Optimizing Extender Code for NCSX Analyses

    International Nuclear Information System (INIS)

    Richman, M.; Ethier, S.; Pomphrey, N.

    2008-01-01

    Extender is a parallel C++ code for calculating the magnetic field in the vacuum region of a stellarator. The code was optimized for speed and augmented with tools to maintain a specialized NetCDF database. Two parallel algorithms were examined. An even-block work-distribution scheme was comparable in performance to a master-slave scheme. Large speedup factors were achieved by representing the plasma surface with a spline rather than a Fourier series. The accuracy of this representation and the resulting calculations relied on the density of the spline mesh. The Fortran 90 module db_access was written to make it easy to store Extender output in a manageable database. New or updated data can be added to existing databases. A generalized PBS job script handles the generation of a database from scratch

  9. Iterative optimization of quantum error correcting codes

    International Nuclear Information System (INIS)

    Reimpell, M.; Werner, R.F.

    2005-01-01

    We introduce a convergent iterative algorithm for finding the optimal coding and decoding operations for an arbitrary noisy quantum channel. This algorithm does not require any error syndrome to be corrected completely, and hence also finds codes outside the usual Knill-Laflamme definition of error correcting codes. The iteration is shown to improve the figure of merit 'channel fidelity' in every step

  10. Coding for effective denial management.

    Science.gov (United States)

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of

  11. UNIX code management and distribution

    International Nuclear Information System (INIS)

    Hung, T.; Kunz, P.F.

    1992-09-01

    We describe a code management and distribution system based on tools freely available for the UNIX systems. At the master site, version control is managed with CVS, which is a layer on top of RCS, and distribution is done via NFS mounted file systems. At remote sites, small modifications to CVS provide for interactive transactions with the CVS system at the master site such that remote developers are true peers in the code development process

  12. Optimal Codes for the Burst Erasure Channel

    Science.gov (United States)

    Hamkins, Jon

    2010-01-01

    Deep space communications over noisy channels lead to certain packets that are not decodable. These packets leave gaps, or bursts of erasures, in the data stream. Burst erasure correcting codes overcome this problem. These are forward erasure correcting codes that allow one to recover the missing gaps of data. Much of the recent work on this topic concentrated on Low-Density Parity-Check (LDPC) codes. These are more complicated to encode and decode than Single Parity Check (SPC) codes or Reed-Solomon (RS) codes, and so far have not been able to achieve the theoretical limit for burst erasure protection. A block interleaved maximum distance separable (MDS) code (e.g., an SPC or RS code) offers near-optimal burst erasure protection, in the sense that no other scheme of equal total transmission length and code rate could improve the guaranteed correctible burst erasure length by more than one symbol. The optimality does not depend on the length of the code, i.e., a short MDS code block interleaved to a given length would perform as well as a longer MDS code interleaved to the same overall length. As a result, this approach offers lower decoding complexity with better burst erasure protection compared to other recent designs for the burst erasure channel (e.g., LDPC codes). A limitation of the design is its lack of robustness to channels that have impairments other than burst erasures (e.g., additive white Gaussian noise), making its application best suited for correcting data erasures in layers above the physical layer. The efficiency of a burst erasure code is the length of its burst erasure correction capability divided by the theoretical upper limit on this length. The inefficiency is one minus the efficiency. 
The illustration compares the inefficiency of interleaved RS codes to Quasi-Cyclic (QC) LDPC codes, Euclidean Geometry (EG) LDPC codes, extended Irregular Repeat Accumulate (eIRA) codes, array codes, and random LDPC codes previously proposed for burst erasure
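
    The block-interleaving idea can be made concrete with the simplest MDS code, a single parity check (SPC): write d codewords as rows and transmit column-wise, so a burst of up to d erased symbols hits each codeword at most once, and the parity symbol repairs that single erasure. The sketch below uses integer symbols under bitwise XOR; the dimensions and function names are illustrative, not taken from the report.

```python
def spc_encode(data, k=4):
    """Single Parity Check codeword: k data symbols + XOR parity."""
    assert len(data) == k
    parity = 0
    for s in data:
        parity ^= s
    return data + [parity]

def interleave(codewords):
    """Transmit column-wise: row i, column j maps to position j*rows + i."""
    rows, cols = len(codewords), len(codewords[0])
    return [codewords[i][j] for j in range(cols) for i in range(rows)]

def recover_burst(stream, rows, cols, burst_start, burst_len):
    """Erase a burst, de-interleave, and repair each codeword with its
    parity (each codeword sees at most one erasure if burst_len <= rows)."""
    assert burst_len <= rows
    erased = list(stream)
    for t in range(burst_start, burst_start + burst_len):
        erased[t] = None
    cw = [[erased[j * rows + i] for j in range(cols)] for i in range(rows)]
    for row in cw:
        if None in row:
            idx = row.index(None)
            row[idx] = 0
            x = 0
            for s in row:
                x ^= s
            row[idx] = x            # XOR of the survivors restores the symbol
    return [row[:cols - 1] for row in cw]   # drop parity
```

    Two positions in the stream belong to the same codeword only if they are a multiple of `rows` apart, so any burst no longer than the interleaving depth is guaranteed correctable, matching the near-optimality argument in the abstract.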

  13. Zip Code Manager

    Data.gov (United States)

    Office of Personnel Management — The system used to associate what Federal Employees Health Benefits Program (FEHBP) and Federal Employees Dental/Vision Program (FEDVIP) health, dental, and vision...

  14. Optimal interference code based on machine learning

    Science.gov (United States)

    Qian, Ye; Chen, Qian; Hu, Xiaobo; Cao, Ercong; Qian, Weixian; Gu, Guohua

    2016-10-01

    In this paper, we analyze the characteristics of pseudo-random codes, using the m-sequence as a case study. Starting from coding theory, we introduce jamming methods and simulate the interference effect and its probability model in MATLAB. Based on the length of the decoding time the adversary needs, we derive an optimal formula and optimal coefficients using machine learning, and thereby obtain a new optimal interference code. First, in the recognition phase, the study judges the effect of interference by simulating the decoding time of the laser seeker. Next, laser active deception jamming is used to simulate the interference process in the tracking phase. To improve the performance of the interference, the model is simulated in MATLAB. We determine the least number of pulse intervals that must be received, from which the precise interval number of the laser pointer for m-sequence encoding can be concluded. To find the shortest spacing, the greatest-common-divisor method is chosen. Then, combining this with the coding regularity found earlier, we restore the pulse intervals of the pseudo-random code that has been received. Finally, we can control the time period of the laser interference, obtain the optimal interference code, and increase the probability of successful interference.
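
    An m-sequence such as the one analyzed above is produced by a maximal-length linear feedback shift register (LFSR). The sketch below uses the primitive recurrence b(t+1) = b(t) + b(t-3) over GF(2) (degree 4, so the period is 2^4 - 1 = 15 and one period contains exactly 8 ones); the tap convention is one of several in common use, and is an illustration rather than the paper's specific code.

```python
def lfsr(seed, taps, length):
    """Fibonacci LFSR over GF(2). `seed` is the register as a bit list;
    `taps` are register indices XORed together to form the feedback bit."""
    reg = list(seed)
    out = []
    for _ in range(length):
        out.append(reg[-1])                 # output the last stage
        fb = 0
        for t in taps:
            fb ^= reg[t]
        reg = [fb] + reg[:-1]               # shift, insert feedback
    return out

def period(seq):
    """Smallest p such that seq is periodic with period p."""
    n = len(seq)
    for p in range(1, n):
        if all(seq[i] == seq[i % p] for i in range(n)):
            return p
    return n
```

    The maximal period and balanced ones/zeros count are exactly the statistical properties that make m-sequences attractive for pseudo-random pulse coding.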

  15. Optimal patch code design via device characterization

    Science.gov (United States)

    Wu, Wencheng; Dalal, Edul N.

    2012-01-01

    In many color measurement applications, such as those for color calibration and profiling, "patch code" has been used successfully for job identification and automation to reduce operator errors. A patch code is similar to a barcode, but is intended primarily for use in measurement devices that cannot read barcodes due to limited spatial resolution, such as spectrophotometers. There is an inherent tradeoff between decoding robustness and the number of code levels available for encoding. Previous methods have attempted to address this tradeoff, but those solutions have been sub-optimal. In this paper, we propose a method to design optimal patch codes via device characterization. The tradeoff between decoding robustness and the number of available code levels is optimized in terms of printing and measurement effort, and decoding robustness against noise from the printing and measurement devices. Effort is drastically reduced relative to previous methods because print-and-measure is minimized through modeling and the use of existing printer profiles. Decoding robustness is improved by distributing the code levels in CIE Lab space rather than in CMYK space.

  16. WASA-BOSS. Development and application of Severe Accident Codes. Evaluation and optimization of accident management measures. Subproject F. Contributions to code validation using BWR data and to evaluation and optimization of accident management measures. Final report

    International Nuclear Information System (INIS)

    Di Marcello, Valentino; Imke, Uwe; Sanchez Espinoza, Victor

    2016-09-01

    Exact knowledge of the transient course of events and of the dominant processes during a severe accident in a nuclear power station is a mandatory prerequisite for elaborating strategies and measures to minimize the radiological consequences of core melt. Two typical experiments using boiling water reactor assemblies were modelled and simulated with the severe accident simulation code ATHLET-CD. The experiments relate to the early phase of core degradation in a boiling water reactor. The results reproduce the thermal behavior and the hydrogen production due to oxidation inside the bundle up to the relocation of material by melting. During flooding of the overheated assembly, temperatures and hydrogen oxidation are underestimated. The deviations from the experimental results can be explained by the missing model for the oxidation of the boron carbide in the control rods. On the basis of a hypothetical loss-of-coolant accident in a typical German boiling water reactor, the effectiveness of flooding the partially degraded core is investigated. This mitigation measure is effective and prevents failure of the reactor pressure vessel if it starts before molten material is relocated into the lower plenum. A considerable amount of hydrogen is produced by oxidation of the metallic components.

  17. Optimized reversible binary-coded decimal adders

    DEFF Research Database (Denmark)

    Thomsen, Michael Kirkedal; Glück, Robert

    2008-01-01

    Babu and Chowdhury [H.M.H. Babu, A.R. Chowdhury, Design of a compact reversible binary coded decimal adder circuit, Journal of Systems Architecture 52 (5) (2006) 272-282] recently proposed, in this journal, a reversible adder for binary-coded decimals. This paper corrects and optimizes...... their design. The optimized 1-decimal BCD full-adder, a 13 × 13 reversible logic circuit, is faster, and has lower circuit cost and fewer garbage bits. It can be used to build a fast reversible m-decimal BCD full-adder that has a delay of only m + 17 low-power reversible CMOS gates. For a 32-decimal (128-bit....... Keywords: Reversible logic circuit; Full-adder; Half-adder; Parallel adder; Binary-coded decimal; Application of reversible logic synthesis...
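
    The reversibility the optimized adder relies on can be illustrated with the standard CNOT and Toffoli gates: every gate is its own inverse, so applying the gate list in reverse order restores the inputs exactly. The sketch below builds a reversible half adder (a tiny building block, not the 13 × 13 circuit of the paper) on a classical bit register.

```python
def cnot(bits, c, t):
    """Controlled NOT: flip target t if control c is 1."""
    bits[t] ^= bits[c]

def toffoli(bits, c1, c2, t):
    """Toffoli (CCNOT): flip target t if both controls are 1."""
    bits[t] ^= bits[c1] & bits[c2]

def half_adder(bits, a, b, anc):
    """Reversible half adder: afterwards, bit `b` holds the sum a XOR b
    and `anc` holds the carry a AND b (anc must start at 0)."""
    toffoli(bits, a, b, anc)   # compute carry from the original inputs
    cnot(bits, a, b)           # compute sum in place
```

    Because CNOT and Toffoli are involutions, running `cnot` then `toffoli` undoes the circuit, which is what "no information loss" means for reversible logic.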

  18. Optimizing Plutonium stock management

    International Nuclear Information System (INIS)

    Niquil, Y.; Guillot, J.

    1997-01-01

    Plutonium from spent fuel reprocessing is reused in new MOX assemblies. Since plutonium isotopic composition deteriorates with time, it is necessary to optimize plutonium stock management over a long period, to guarantee safe procurement, and contribute to a nuclear fuel cycle policy at the lowest cost. This optimization is provided by the prototype software POMAR

  19. Scaling Optimization of the SIESTA MHD Code

    Science.gov (United States)

    Seal, Sudip; Hirshman, Steven; Perumalla, Kalyan

    2013-10-01

    SIESTA is a parallel three-dimensional plasma equilibrium code capable of resolving magnetic islands at high spatial resolutions for toroidal plasmas. Originally designed to exploit small-scale parallelism, SIESTA has now been scaled to execute efficiently over several thousands of processors P. This scaling improvement was accomplished with minimal intrusion to the execution flow of the original version. First, the efficiency of the iterative solutions was improved by integrating the parallel tridiagonal block solver code BCYCLIC. Krylov-space generation in GMRES was then accelerated using a customized parallel matrix-vector multiplication algorithm. Novel parallel Hessian generation algorithms were integrated and memory access latencies were dramatically reduced through loop nest optimizations and data layout rearrangement. These optimizations sped up equilibria calculations by factors of 30-50. It is possible to compute solutions with granularity N/P near unity on extremely fine radial meshes (N > 1024 points). Grid separation in SIESTA, which manifests itself primarily in the resonant components of the pressure far from rational surfaces, is strongly suppressed by finer meshes. Large problem sizes of up to 300 K simultaneous non-linear coupled equations have been solved on the NERSC supercomputers. Work supported by U.S. DOE under Contract DE-AC05-00OR22725 with UT-Battelle, LLC.

  20. Code Differentiation for Hydrodynamic Model Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Henninger, R.J.; Maudlin, P.J.

    1999-06-27

    Use of a hydrodynamics code for experimental data fitting purposes (an optimization problem) requires information about how a computed result changes when the model parameters change. These so-called sensitivities provide the gradient that determines the search direction for modifying the parameters to find an optimal result. Here, the authors apply code-based automatic differentiation (AD) techniques in the forward and adjoint modes to two problems with 12 parameters to obtain these gradients, and compare the computational efficiency and accuracy of the various methods. They fit the pressure trace from a one-dimensional flyer-plate experiment and examine the accuracy for a two-dimensional jet-formation problem. For the flyer-plate experiment, the adjoint mode requires similar or less computer time than the forward methods. Additional parameters will not change the adjoint-mode run time appreciably, which is a distinct advantage for this method. Obtaining "accurate" sensitivities for the jet problem parameters remains problematic.
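
    The forward mode mentioned above can be sketched with dual numbers: each value carries its derivative along with it, so a 12-parameter model needs 12 forward passes (one per parameter), whereas the adjoint mode needs a single backward pass regardless of parameter count. A minimal, hypothetical illustration, not the authors' code:

```python
class Dual:
    """Forward-mode AD via dual numbers: (value, derivative) pairs
    propagated through arithmetic by the chain rule."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)  # product rule
    __rmul__ = __mul__

def sensitivity(f, x):
    """d f / d x at x: seed the derivative component with 1."""
    return f(Dual(x, 1.0)).dot
```

    For f(p) = p^3 + 2p this yields 3p^2 + 2 to machine precision, which is the kind of exact (non-finite-difference) gradient the paper's search directions depend on.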

  1. Optimal, Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2002-01-01

    Reliability based code calibration is considered in this paper. It is described how the results of FORM based reliability analysis may be related to the partial safety factors and characteristic values. The code calibration problem is presented in a decision theoretical form and it is discussed how...... of reliability based code calibration of LRFD based design codes....

  2. DIRAC optimized workload management

    CERN Document Server

    Paterson, S K

    2008-01-01

    The LHCb DIRAC Workload and Data Management System employs advanced optimization techniques in order to dynamically allocate resources. The paradigms realized by DIRAC, such as late binding through the Pilot Agent approach, have proven to be highly successful. For example, this has allowed the principles of workload management to be applied not only at the time of user job submission to the Grid but also to optimize the use of computing resources once jobs have been acquired. Along with the central application of job priorities, DIRAC minimizes the system response time for high priority tasks. This paper will describe the recent developments to support Monte Carlo simulation, data processing and distributed user analysis in a consistent way across disparate compute resources including individual PCs, local batch systems, and the Worldwide LHC Computing Grid. The Grid environment is inherently unpredictable and whilst short-term studies have proven to deliver high job efficiencies, the system performance over ...

  3. Non-binary Hybrid LDPC Codes: Structure, Decoding and Optimization

    OpenAIRE

    Sassatelli, Lucile; Declercq, David

    2007-01-01

    In this paper, we propose to study and optimize a very general class of LDPC codes whose variable nodes belong to finite sets of different orders. We name this class of codes Hybrid LDPC codes. Although efficient optimization techniques exist for binary LDPC codes and, more recently, for non-binary LDPC codes, both exhibit drawbacks, for different reasons. Our goal is to capitalize on the advantages of both families by building codes with binary (or small finite set order) and non-bin...

  4. GESDATA: A failure-data management code

    International Nuclear Information System (INIS)

    Garcia Gay, J.; Francia Gonzalez, L.; Ortega Prieto, P.; Mira McWilliams, J.; Aguinaga Zapata, M.

    1987-01-01

    GESDATA is a failure data management code for both qualitative and quantitative fault-tree evaluation. Data management using the code should provide the analyst, in the quickest and easiest way, with the reliability data which constitute the input values for fault-tree evaluation programs. (orig./HSCH)

  5. Optimal Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kroon, I. B.; Faber, Michael Havbro

    1994-01-01

    Calibration of partial safety factors is considered in general, including classes of structures where no code exists beforehand. The partial safety factors are determined such that the difference between the reliability for the different structures in the class considered and a target reliability...... level is minimized. Code calibration on a decision theoretical basis is also considered and it is shown how target reliability indices can be calibrated. Results from code calibration for rubble mound breakwater designs are shown....

  6. Optimal super dense coding over memory channels

    OpenAIRE

    Shadman, Zahra; Kampermann, Hermann; Macchiavello, Chiara; Bruß, Dagmar

    2011-01-01

    We study the super dense coding capacity in the presence of quantum channels with correlated noise. We investigate both the cases of unitary and non-unitary encoding. Pauli channels for arbitrary dimensions are treated explicitly. The super dense coding capacity for some special channels and resource states is derived for unitary encoding. We also provide an example of a memory channel where non-unitary encoding leads to an improvement in the super dense coding capacity.

  7. Efficient topology optimization in MATLAB using 88 lines of code

    DEFF Research Database (Denmark)

    Andreassen, Erik; Clausen, Anders; Schevenels, Mattias

    2011-01-01

    The paper presents an efficient 88 line MATLAB code for topology optimization. It has been developed using the 99 line code presented by Sigmund (Struct Multidisc Optim 21(2):120–127, 2001) as a starting point. The original code has been extended by a density filter, and a considerable improvemen...... of the basic code to include recent PDE-based and black-and-white projection filtering methods. The complete 88 line code is included as an appendix and can be downloaded from the web site www.topopt.dtu.dk....
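
    The density filter the abstract mentions replaces each element density with a distance-weighted average of its neighbourhood, which suppresses checkerboard patterns and mesh dependence. A pure-Python sketch of that filter alone (the 88-line code itself is vectorized MATLAB; the grid size and radius here are illustrative):

```python
def density_filter(x, rmin):
    """Density filter from topology optimization: each element density
    becomes a distance-weighted average of neighbours within radius rmin.
    `x` is a 2-D grid (list of lists) of densities in [0, 1]."""
    ny, nx = len(x), len(x[0])
    r = int(rmin)                           # search box half-width
    out = [[0.0] * nx for _ in range(ny)]
    for i in range(ny):
        for j in range(nx):
            num = den = 0.0
            for ii in range(max(0, i - r), min(ny, i + r + 1)):
                for jj in range(max(0, j - r), min(nx, j + r + 1)):
                    d = ((i - ii) ** 2 + (j - jj) ** 2) ** 0.5
                    w = max(0.0, rmin - d)  # linearly decaying weight
                    num += w * x[ii][jj]
                    den += w
            out[i][j] = num / den
    return out
```

    The filter preserves a uniform density field exactly and smears out single-element features, which is precisely why it enforces a minimum length scale in the optimized design.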

  8. Joint research project WASA-BOSS: Further development and application of severe accident codes. Assessment and optimization of accident management measures. Project B: Accident analyses for pressurized water reactors with the application of the ATHLET-CD code

    International Nuclear Information System (INIS)

    Jobst, Matthias; Kliem, Soeren; Kozmenkov, Yaroslav; Wilhelm, Polina

    2017-02-01

    Within the framework of the project an ATHLET-CD input deck for a generic German PWR of type KONVOI has been created. This input deck was applied to the simulation of severe accidents from the accident categories station blackout (SBO) and small-break loss-of-coolant accidents (SBLOCA). The complete accident transient from the initial event at full power until the damage of the reactor pressure vessel (RPV) is covered and all relevant severe accident phenomena are modelled: start of core heat-up, fission product release, melting of fuel and absorber material, oxidation and release of hydrogen, relocation of molten material inside the core, relocation to the lower plenum, damage and failure of the RPV. The model has been applied to the analysis of preventive and mitigative accident management measures for SBO and SBLOCA transients. To that end, the measures primary side depressurization (PSD), injection into the primary circuit by mobile pumps and, for SBLOCA, delayed injection by the cold leg hydro-accumulators have been investigated, and the assumptions and start criteria of these measures have been varied. The time evolutions of the transients and the time margins for the initiation of additional measures have been assessed. An uncertainty and sensitivity study has been performed for the early phase of one SBO scenario with PSD (until the start of core melt). In addition, a code-to-code comparison between ATHLET-CD and the severe accident code MELCOR has been carried out.

  9. Using Peephole Optimization on Intermediate Code

    NARCIS (Netherlands)

    Tanenbaum, A.S.; van Staveren, H.; Stevenson, J.W.

    1982-01-01

    Many portable compilers generate an intermediate code that is subsequently translated into the target machine's assembly language. In this paper a stack-machine-based intermediate code suitable for algebraic languages (e.g., PASCAL, C, FORTRAN) and most byte-addressed mini- and microcomputers is
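
    A peephole optimizer slides a small window over the instruction stream and replaces known wasteful patterns, iterating until no pattern matches. A toy sketch over a hypothetical stack-machine IR (the opcode names are invented for illustration; they are not the opcodes of the paper's intermediate code):

```python
def peephole(code):
    """One pass of pattern replacement over a toy stack-machine IR.
    Patterns: a push immediately undone by a pop is dead code;
    NEG NEG cancels; PUSH 0 followed by ADD is a no-op."""
    out = []
    i = 0
    while i < len(code):
        window = code[i:i + 2]
        if window == ["NEG", "NEG"]:
            i += 2                          # double negation cancels
        elif window == ["PUSH 0", "ADD"]:
            i += 2                          # adding zero is a no-op
        elif len(window) == 2 and window[0].startswith("PUSH") \
                and window[1] == "POP":
            i += 2                          # value pushed then discarded
        else:
            out.append(code[i])
            i += 1
    return out

def optimize(code):
    """Iterate to a fixed point, since one rewrite can expose another."""
    prev = None
    while code != prev:
        prev, code = code, peephole(code)
    return code
```

    Running to a fixed point matters: removing one pattern can bring two previously separated instructions into the same window.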

  10. Should managers have a code of conduct?

    Science.gov (United States)

    Bayliss, P

    1994-02-01

    Much attention is currently being given to values and ethics in the NHS. Issues of accountability are being explored as a consequence of the Cadbury report. The Institute of Health Services Management (IHSM) is considering whether managers should have a code of ethics. Central to this issue is what managers themselves think; the application of such a code may well stand or fall by whether managers are prepared to have ownership of it, and are prepared to make it work. Paul Bayliss reports on a survey of managers' views.

  11. Coded aperture optimization using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Martineau, A.; Rocchisani, J.M.; Moretti, J.L.

    2010-01-01

    Coded apertures using Uniformly Redundant Arrays (URA) have been unsuccessfully evaluated for two-dimensional and three-dimensional imaging in Nuclear Medicine. The images reconstructed from coded projections contain artifacts and suffer from poor spatial resolution in the longitudinal direction. We introduce a Maximum-Likelihood Expectation-Maximization (MLEM) algorithm for three-dimensional coded aperture imaging which uses a projection matrix calculated by Monte Carlo simulations. The aim of the algorithm is to reduce artifacts and improve the three-dimensional spatial resolution in the reconstructed images. Firstly, we present the validation of GATE (Geant4 Application for Emission Tomography) for Monte Carlo simulations of a coded mask installed on a clinical gamma camera. The coded mask modelling was validated by comparison between experimental and simulated data in terms of energy spectra, sensitivity and spatial resolution. In the second part of the study, we use the validated model to calculate the projection matrix with Monte Carlo simulations. A three-dimensional thyroid phantom study was performed to compare the performance of the three-dimensional MLEM reconstruction with conventional correlation method. The results indicate that the artifacts are reduced and three-dimensional spatial resolution is improved with the Monte Carlo-based MLEM reconstruction.
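
    The MLEM iteration described above multiplicatively updates each voxel by the back-projected ratio of measured to predicted counts, normalized by the sensitivity A^T 1. A small self-contained sketch with a hypothetical 2 × 2 system matrix (in the paper the projection matrix comes from the Monte Carlo simulations, not from a toy matrix like this):

```python
def mlem(A, y, n_iter=50):
    """Maximum-Likelihood Expectation-Maximization reconstruction.
    A[i][j]: probability that an emission in voxel j is detected in
    bin i; y[i]: measured counts. Returns the voxel activity estimate."""
    m, n = len(A), len(A[0])
    sens = [sum(A[i][j] for i in range(m)) for j in range(n)]   # A^T 1
    x = [1.0] * n                                               # flat start
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        ratio = [y[i] / proj[i] if proj[i] > 0 else 0.0 for i in range(m)]
        back = [sum(A[i][j] * ratio[i] for i in range(m)) for j in range(n)]
        x = [x[j] * back[j] / sens[j] for j in range(n)]
    return x
```

    The multiplicative form keeps the estimate non-negative, and for consistent data the iteration converges to the maximum-likelihood solution; with a realistic Monte Carlo projection matrix the same loop performs the three-dimensional reconstruction described in the abstract.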

  13. Optimization Specifications for CUDA Code Restructuring Tool

    KAUST Repository

    Khan, Ayaz

    2017-03-13

    In this work we have developed a restructuring software tool (RT-CUDA) following the proposed optimization specifications to bridge the gap between high-level languages and the machine-dependent CUDA environment. RT-CUDA takes a C program and converts it into an optimized CUDA kernel, guided by user directives in a configuration file. RT-CUDA also allows transparent invocation of highly optimized external math libraries such as cuSPARSE and cuBLAS, enabling efficient design of linear algebra solvers. We expect RT-CUDA to be needed by many KSA industries dealing with science and engineering simulation on massively parallel computers such as NVIDIA GPUs.

  14. ARC Code TI: Optimal Alarm System Design and Implementation

    Data.gov (United States)

    National Aeronautics and Space Administration — An optimal alarm system can robustly predict a level-crossing event that is specified over a fixed prediction horizon. The code contained in this package provides...

  15. VVER-440 loading patterns optimization using ATHENA code

    International Nuclear Information System (INIS)

    Katovsky, K.; Sustek, J.; Bajgl, J.; Cada, R.

    2009-01-01

    In this paper the Czech optimization state-of-the-art, the goals of new code system development, and the OPAL optimization system are briefly mentioned. The algorithms, mathematics, present status and future developments of the ATHENA code are described. A calculation exercise using ATHENA for the Dukovany NPP cycles at increased power is presented, starting with the upcoming 24th cycle (303 FPD) and continuing with the 25th (322 FPD) and 26th (336 FPD); for all cycles K_R ≤ 1.54.

  16. Optimizing the ATLAS code with different profilers

    CERN Document Server

    Kama, S; The ATLAS collaboration

    2013-01-01

    After the current maintenance period, the LHC will provide higher energy collisions with increased luminosity. In order to keep up with these higher rates, ATLAS software needs to speed up substantially. However, ATLAS code is composed of approximately 4M lines, written by many different programmers with different backgrounds, which makes code optimisation a challenge. To help with this effort different profiling tools and techniques are being used. These include well known tools, such as the Valgrind suite and Intel Amplifier; less common tools like PIN, PAPI, and GOODA; as well as techniques such as library interposing. In this talk we will mainly focus on PIN tools and GOODA. PIN is a dynamic binary instrumentation tool which can obtain statistics such as call counts, instruction counts and interrogate functions' arguments. It has been used to obtain CLHEP Matrix profiles, operations and vector sizes for linear algebra calculations which has provided the insight necessary to achieve significant performance...

  17. Optimization of Particle-in-Cell Codes on RISC Processors

    Science.gov (United States)

    Decyk, Viktor K.; Karmesin, Steve Roy; Boer, Aeint de; Liewer, Paulette C.

    1996-01-01

    General strategies are developed to optimize particle-in-cell codes written in Fortran for RISC processors, which are commonly used on massively parallel computers. These strategies include data reorganization to improve cache utilization and code reorganization to improve the efficiency of arithmetic pipelines.

  18. Adaptive RD Optimized Hybrid Sound Coding

    NARCIS (Netherlands)

    Schijndel, N.H. van; Bensa, J.; Christensen, M.G.; Colomes, C.; Edler, B.; Heusdens, R.; Jensen, J.; Jensen, S.H.; Kleijn, W.B.; Kot, V.; Kövesi, B.; Lindblom, J.; Massaloux, D.; Niamut, O.A.; Nordén, F.; Plasberg, J.H.; Vafin, R.; Virette, D.; Wübbolt, O.

    2008-01-01

    Traditionally, sound codecs have been developed with a particular application in mind, their performance being optimized for specific types of input signals, such as speech or audio (music), and application constraints, such as low bit rate, high quality, or low delay. There is, however, an

  19. Fuel management codes for fast reactors

    International Nuclear Information System (INIS)

    Sicard, B.; Coulon, P.; Mougniot, J.C.; Gouriou, A.; Pontier, M.; Skok, J.; Carnoy, M.; Martin, J.

    The CAPHE code is used for managing and following up fuel subassemblies in the Phenix fast neutron reactor; the principal experimental results obtained since this reactor was commissioned are analyzed with this code. They are mainly concerned with following up fuel subassembly powers and core reactivity variations observed up to the beginning of the fifth Phenix working cycle (3/75). Characteristics of Phenix irradiated fuel subassemblies calculated by the CAPHE code are detailed as of April 1, 1975 (burn-up, steel damage)

  20. Code organization and configuration management

    International Nuclear Information System (INIS)

    Wellisch, J.P.; Ashby, S.; Williams, C.; Osborne, I.

    2001-01-01

    Industry experts are increasingly focusing on team productivity as the key to success. The base of the team effort is the four-fold structure of software in terms of logical organisation, physical organisation, managerial organisation, and dynamical structure. The authors describe the ideas put into action within the CMS software for organising software into sub-systems and packages, and for establishing configuration management in a multi-project environment. The authors use a structure that makes it possible to maximise the independence of software development in individual areas, while at the same time emphasising the overwhelming importance of the interdependencies between the packages and components in the system. The authors comment on release procedures, and describe the inter-relationship between release, development, integration, and testing

  1. Italian electricity supply contracts optimization: ECO computer code

    International Nuclear Information System (INIS)

    Napoli, G.; Savelli, D.

    1993-01-01

    The ECO (Electrical Contract Optimization) code, written for the Microsoft Windows 3.1 environment, can be run on a 286 PC with a minimum of RAM. It consists of four modules: one for the calculation of ENEL (Italian National Electricity Board) tariffs, one for contractual time-of-use tariff optimization, a table of tariff coefficients, and a module for monthly power consumption calculations based on annual load diagrams. The optimization code was developed by ENEA (Italian Agency for New Technology, Energy and the Environment) to help Italian industrial firms comply with new and complex national electricity supply contractual regulations and tariffs. In addition to helping industrial firms determine optimum contractual arrangements, the code also assists them in optimizing their choice of equipment and production cycles

  2. An engineering code to analyze hypersonic thermal management systems

    Science.gov (United States)

    Vangriethuysen, Valerie J.; Wallace, Clark E.

    1993-01-01

    Thermal loads on current and future aircraft are increasing and as a result are stressing the energy collection, control, and dissipation capabilities of current thermal management systems and technology. The thermal loads for hypersonic vehicles will be no exception. In fact, with their projected high heat loads and fluxes, hypersonic vehicles are a prime example of systems that will require thermal management systems (TMS) that have been optimized and integrated with the entire vehicle to the maximum extent possible during the initial design stages. This will not only be to meet operational requirements, but also to fulfill weight and performance constraints in order for the vehicle to take off and complete its mission successfully. To meet this challenge, the TMS can no longer be two or more entirely independent systems, nor can thermal management be an afterthought in the design process, as has typically been the pervasive approach in the past. Instead, a TMS that is integrated throughout the entire vehicle and subsequently optimized will be required. To accomplish this, a method that iteratively optimizes the TMS throughout the vehicle will not only be highly desirable, but advantageous in order to reduce the man-hours normally required to conduct the necessary tradeoff studies and comparisons. A thermal management engineering computer code that is under development and being managed at Wright Laboratory, Wright-Patterson AFB, is discussed. The primary goal of the code is to aid in the development of a hypersonic vehicle TMS that has been optimized and integrated on a total vehicle basis.

  3. Optimism in Enrollment Management

    Science.gov (United States)

    Buster-Williams, Kimberley

    2016-01-01

    Enrollment managers, like most managers, have goals that must be focused on with precision, excitement, and vigor. Enrollment managers must excel at enrollment planning. Typically, enrollment planning unites undergraduate and graduate recruitment plans, out-of-state recruitment plans, marketing plans, retention plans, international enrollment…

  4. Development of a graphical interface computer code for reactor fuel reloading optimization

    International Nuclear Information System (INIS)

    Do Quang Binh; Nguyen Phuoc Lan; Bui Xuan Huy

    2007-01-01

    This report presents the results of the project performed in 2007. The aim of this project is to develop a graphical interface computer code that allows refueling engineers to design fuel reloading patterns for a research reactor using a simulated graphical model of the reactor core. In addition, this code can perform refueling optimization calculations based on genetic algorithms as well as simulated annealing. The computer code was verified on a sample problem that relies on operational and experimental data of the Dalat research reactor. This code can play a significant role in in-core fuel management practice at nuclear research reactor centers and in training. (author)

  5. Recent developments in KTF. Code optimization and improved numerics

    International Nuclear Information System (INIS)

    Jimenez, Javier; Avramova, Maria; Sanchez, Victor Hugo; Ivanov, Kostadin

    2012-01-01

    The rapid increase of computer power in the last decade facilitated the development of high fidelity simulations in nuclear engineering, allowing a more realistic and accurate optimization as well as safety assessment of reactor cores and power plants compared to the legacy codes. Thermal hydraulic subchannel codes together with time dependent neutron transport codes are the options of choice for an accurate prediction of local safety parameters. Moreover, fast running codes with the best physical models are needed for high fidelity coupled thermal hydraulic / neutron kinetic solutions. Hence at KIT, different subchannel codes such as SUBCHANFLOW and KTF are being improved, validated and coupled with different neutron kinetics solutions. KTF is a subchannel code developed for best-estimate analysis of both Pressurized Water Reactors (PWR) and Boiling Water Reactors (BWR). It is based on the Pennsylvania State University (PSU) version of COBRA-TF (Coolant Boiling in Rod Arrays Two Fluids) named CTF. In this paper, the investigations devoted to the enhancement of the code's numerics and informatics structure are presented and discussed. The gain in code speed-up will be demonstrated with some examples, and finally an outlook of further activities concentrating on code improvements will be given. (orig.)

  6. Recent developments in KTF. Code optimization and improved numerics

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Javier; Avramova, Maria; Sanchez, Victor Hugo; Ivanov, Kostadin [Karlsruhe Institute of Technology (KIT) (Germany). Inst. for Neutron Physics and Reactor Technology (INR)

    2012-11-01

    The rapid increase of computer power in the last decade facilitated the development of high fidelity simulations in nuclear engineering, allowing a more realistic and accurate optimization as well as safety assessment of reactor cores and power plants compared to the legacy codes. Thermal hydraulic subchannel codes together with time dependent neutron transport codes are the options of choice for an accurate prediction of local safety parameters. Moreover, fast running codes with the best physical models are needed for high fidelity coupled thermal hydraulic / neutron kinetic solutions. Hence at KIT, different subchannel codes such as SUBCHANFLOW and KTF are being improved, validated and coupled with different neutron kinetics solutions. KTF is a subchannel code developed for best-estimate analysis of both Pressurized Water Reactors (PWR) and Boiling Water Reactors (BWR). It is based on the Pennsylvania State University (PSU) version of COBRA-TF (Coolant Boiling in Rod Arrays Two Fluids) named CTF. In this paper, the investigations devoted to the enhancement of the code's numerics and informatics structure are presented and discussed. The gain in code speed-up will be demonstrated with some examples, and finally an outlook of further activities concentrating on code improvements will be given. (orig.)

  7. Optimized iterative decoding method for TPC coded CPM

    Science.gov (United States)

    Ma, Yanmin; Lai, Penghui; Wang, Shilian; Xie, Shunqin; Zhang, Wei

    2018-05-01

    The Turbo Product Code (TPC) coded Continuous Phase Modulation (CPM) system (TPC-CPM) has been widely used in aeronautical telemetry and satellite communication. This paper mainly investigates the improvement and optimization of the TPC-CPM system. We first add an interleaver and deinterleaver to the TPC-CPM system, and then establish an iterative decoding scheme. However, the improved system has poor convergence ability. To overcome this issue, we use Extrinsic Information Transfer (EXIT) analysis to find the optimal factors for the system. The experiments show that our method efficiently improves the convergence performance.

  8. Software exorcism a handbook for debugging and optimizing legacy code

    CERN Document Server

    Blunden, Bill

    2013-01-01

    Software Exorcism: A Handbook for Debugging and Optimizing Legacy Code takes an unflinching, no-nonsense look at behavioral problems in the software engineering industry, shedding much-needed light on the social forces that make it difficult for programmers to do their job. Do you have a co-worker who perpetually writes bad code that you are forced to clean up? This is your book. While there are plenty of books on the market that cover debugging and short-term workarounds for bad code, Reverend Bill Blunden takes a revolutionary step beyond them by bringing our atten

  9. Optimal shutdown management

    International Nuclear Information System (INIS)

    Bottasso, C L; Croce, A; Riboldi, C E D

    2014-01-01

    The paper presents a novel approach for the synthesis of the open-loop pitch profile during emergency shutdowns. The problem is of interest in the design of wind turbines, as such maneuvers often generate design driving loads on some of the machine components. The pitch profile synthesis is formulated as a constrained optimal control problem, solved numerically using a direct single shooting approach. A cost function expressing a compromise between load reduction and rotor overspeed is minimized with respect to the unknown blade pitch profile. Constraints may include a load reduction not-to-exceed the next dominating loads, a not-to-be-exceeded maximum rotor speed, and a maximum achievable blade pitch rate. Cost function and constraints are computed over a possibly large number of operating conditions, defined so as to cover as well as possible the operating situations encountered in the lifetime of the machine. All such conditions are simulated by using a high-fidelity aeroservoelastic model of the wind turbine, ensuring the accuracy of the evaluation of all relevant parameters. The paper demonstrates the capabilities of the novel proposed formulation, by optimizing the pitch profile of a multi-MW wind turbine. Results show that the procedure can reliably identify optimal pitch profiles that reduce design-driving loads, in a fully automated way

  10. Optimal shutdown management

    Science.gov (United States)

    Bottasso, C. L.; Croce, A.; Riboldi, C. E. D.

    2014-06-01

    The paper presents a novel approach for the synthesis of the open-loop pitch profile during emergency shutdowns. The problem is of interest in the design of wind turbines, as such maneuvers often generate design driving loads on some of the machine components. The pitch profile synthesis is formulated as a constrained optimal control problem, solved numerically using a direct single shooting approach. A cost function expressing a compromise between load reduction and rotor overspeed is minimized with respect to the unknown blade pitch profile. Constraints may include a load reduction not-to-exceed the next dominating loads, a not-to-be-exceeded maximum rotor speed, and a maximum achievable blade pitch rate. Cost function and constraints are computed over a possibly large number of operating conditions, defined so as to cover as well as possible the operating situations encountered in the lifetime of the machine. All such conditions are simulated by using a high-fidelity aeroservoelastic model of the wind turbine, ensuring the accuracy of the evaluation of all relevant parameters. The paper demonstrates the capabilities of the novel proposed formulation, by optimizing the pitch profile of a multi-MW wind turbine. Results show that the procedure can reliably identify optimal pitch profiles that reduce design-driving loads, in a fully automated way.
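A direct single-shooting formulation of this kind can be sketched with a deliberately crude rotor model: the unknowns are the open-loop pitch rates over the horizon, the cost trades overspeed against a pitch-activity load proxy, and the actuator limit enters as simple bounds. The inertia, torque, and weighting numbers below are hypothetical, not the paper's aeroservoelastic model:

```python
import numpy as np
from scipy.optimize import minimize

# Toy rotor model (assumption, not the paper's high-fidelity simulation):
# J * dOmega/dt = Q_aero, with aero torque dropping as the blades pitch out
# and generator torque lost at the fault.
J, DT, N = 4.0e6, 0.1, 50            # inertia [kg m^2], step [s], horizon steps
OMEGA0, Q_AERO0, K_PITCH = 1.26, 4.0e6, 2.0e5
MAX_PITCH_RATE = 8.0                 # deg/s actuator limit (bound constraint)

def simulate(pitch_rates):
    """Integrate rotor speed for a given open-loop pitch-rate profile."""
    omega, pitch = OMEGA0, 0.0
    omegas, loads = [], []
    for r in pitch_rates:
        pitch += r * DT
        q_aero = max(Q_AERO0 - K_PITCH * pitch, 0.0)
        omega += (q_aero / J) * DT
        omegas.append(omega)
        loads.append(abs(r))         # crude load proxy: pitch activity
    return np.array(omegas), np.array(loads)

def cost(pitch_rates):
    """Compromise between rotor overspeed and loads, as in the abstract."""
    omegas, loads = simulate(pitch_rates)
    overspeed = np.maximum(omegas - OMEGA0, 0.0)
    return 10.0 * (overspeed ** 2).sum() * DT + 1e-2 * loads.mean()

x0 = np.full(N, 2.0)                 # initial guess: slow pitching
res = minimize(cost, x0, method="SLSQP",
               bounds=[(0.0, MAX_PITCH_RATE)] * N)
```

In the paper the inner simulation is the high-fidelity aeroservoelastic model evaluated over many operating conditions; the shooting structure is the same.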

  11. PWR fuel management optimization

    International Nuclear Information System (INIS)

    Dumas, Michel.

    1981-10-01

    This report is aimed at the optimization of the refueling pattern of a nuclear reactor. At the beginning of a reactor cycle a batch of fuel assemblies is available and their physical properties are known; the mathematical problem is to determine the refueling pattern which maximizes the reactivity or which provides the flattest possible power distribution. The state of the core is mathematically characterized by a system of partial derivative equations, its smallest eigenvalue and the associated eigenvector. After a study of the convexity properties of the problem, two algorithms are proposed. The first one exchanges assemblies to improve the starting configurations; the enumeration of the exchanges is limited to 2-by-2, 3-by-3 and 4-by-4 permutations. The second one builds a solution in two steps: in the first step the discrete variables are replaced by continuous variables, and the resulting nonlinear optimization problem is solved by the Method of Approximation Programming; in the second step, the refueling pattern which provides the best approximation of the optimal power distribution is found by a Branch and Bound method [fr

  12. Optimization of well field management

    DEFF Research Database (Denmark)

    Hansen, Annette Kirstine

    Groundwater is a limited but important resource for fresh water supply. Different conflicting objectives are important when operating a well field. This study investigates how the management of a well field can be improved with respect to different objectives simultaneously. A framework...... for optimizing well field management using multi-objective optimization is developed. The optimization uses the Strength Pareto Evolutionary Algorithm 2 (SPEA2) to find the Pareto front between the conflicting objectives. The Pareto front is a set of non-inferior optimal points and provides an important tool...... for the decision-makers. The optimization framework is tested on two case studies. Both abstract around 20,000 cubic meters of water per day, but are otherwise rather different. The first case study concerns the management of the Hardhof waterworks, Switzerland, where artificial infiltration of river water...
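The core of such a multi-objective search is the Pareto-dominance test that SPEA2 uses to maintain its archive of non-inferior points. A minimal sketch with hypothetical (energy cost, drawdown) objective pairs for candidate pumping schedules:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset, like the archive SPEA2 maintains."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical well-field candidates: (energy cost, drawdown) per schedule
candidates = [(3.0, 5.0), (2.0, 6.0), (4.0, 4.0), (2.5, 5.5), (3.5, 6.5)]
front = pareto_front(candidates)
```

The decision-makers then pick a compromise point from `front` rather than receiving a single "optimal" schedule.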

  13. Optimizing risk management

    International Nuclear Information System (INIS)

    Kindred, G.W.

    2000-01-01

    Commercial nuclear power plant management is focused on the safe, efficient, economical production of electricity. To accomplish the safe aspect of the equation, risk must be determined for the operation and maintenance of the facility. To accomplish the efficient aspect of the equation, management must understand those risks and factor risk insights into their decision process. The final piece of the equation is economy, which is accomplished by minimizing plant outage durations and proper utilization of resources. Probabilistic Risk Assessment can provide the risk insights to accomplish all three: safety, efficiency, and economy. How? Safe production of electricity can be quantified by use of PRA modeling and other risk insights that can determine the core damage frequency. Efficient production of electricity can be influenced by providing management with quantified risk insights for use in decision making. One example of economical production of electricity is avoiding over-conservative, deterministic defense-in-depth approaches to system maintenance and availability. By using risk-informed insights, nuclear safety can be quantified and risk can be managed. Confidence in this approach can be achieved by ensuring the content and quality of the PRA is standardized throughout the industry. The time has arrived for Probabilistic Risk Assessment to take an active position as a major role player in the safe, efficient, and economical operation of commercial nuclear power plants. (author)

  14. A Fast Optimization Method for General Binary Code Learning.

    Science.gov (United States)

    Shen, Fumin; Zhou, Xiang; Yang, Yang; Song, Jingkuan; Shen, Heng; Tao, Dacheng

    2016-09-22

    Hashing or binary code learning has been recognized to accomplish efficient near neighbor search, and has thus attracted broad interests in recent retrieval, vision and learning studies. One main challenge of learning to hash arises from the involvement of discrete variables in binary code optimization. While the widely-used continuous relaxation may achieve high learning efficiency, the pursued codes are typically less effective due to accumulated quantization error. In this work, we propose a novel binary code optimization method, dubbed Discrete Proximal Linearized Minimization (DPLM), which directly handles the discrete constraints during the learning process. Specifically, the discrete (thus nonsmooth nonconvex) problem is reformulated as minimizing the sum of a smooth loss term with a nonsmooth indicator function. The obtained problem is then efficiently solved by an iterative procedure with each iteration admitting an analytical discrete solution, which is thus shown to converge very fast. In addition, the proposed method supports a large family of empirical loss functions, which is particularly instantiated in this work by both supervised and unsupervised hashing losses, together with the bit uncorrelation and balance constraints. In particular, the proposed DPLM with a supervised ℓ2 loss encodes the whole NUS-WIDE database into 64-bit binary codes within 10 seconds on a standard desktop computer. The proposed approach is extensively evaluated on several large-scale datasets and the generated binary codes are shown to achieve very promising results on both retrieval and classification tasks.
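The essence of such a proximal-linearized discrete update can be illustrated on a toy least-squares loss (an assumption for illustration, not the paper's hashing objective): take a gradient step and snap back to the binary set, which is the analytical solution of each discrete subproblem:

```python
import numpy as np

def dplm(A, y, n_bits, n_iter=100, L=None):
    """Minimize ||A b - y||^2 over b in {-1,+1}^n by proximal-linearized
    steps: a gradient step followed by the sign function, which is the
    closed-form solution of each discrete proximal subproblem."""
    rng = np.random.default_rng(1)
    b = np.sign(rng.standard_normal(n_bits))       # random binary start
    if L is None:
        L = 2.0 * np.linalg.norm(A, 2) ** 2        # Lipschitz const. of gradient
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ b - y)
        z = b - grad / L                            # linearized (gradient) step
        b = np.where(z >= 0, 1.0, -1.0)             # analytical discrete solution
    return b

rng = np.random.default_rng(0)
A = rng.standard_normal((32, 8))
b_true = np.sign(rng.standard_normal(8))
y = A @ b_true
b_hat = dplm(A, y, n_bits=8)
```

Because the step size 1/L majorizes the smooth loss, the objective is non-increasing at every iteration even though the iterates stay strictly binary.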

  15. Optimal and efficient decoding of concatenated quantum block codes

    International Nuclear Information System (INIS)

    Poulin, David

    2006-01-01

    We consider the problem of optimally decoding a quantum error correction code, that is, finding the optimal recovery procedure given the outcomes of partial ''check'' measurements on the system. In general, this problem is NP-hard. However, we demonstrate that for concatenated block codes, the optimal decoding can be efficiently computed using a message-passing algorithm. We compare the performance of the message-passing algorithm to that of the widespread blockwise hard decoding technique. Our Monte Carlo results using the five-qubit and Steane's code on a depolarizing channel demonstrate significant advantages of the message-passing algorithms in two respects: (i) Optimal decoding increases by as much as 94% the error threshold below which the error correction procedure can be used to reliably send information over a noisy channel; and (ii) for noise levels below these thresholds, the probability of error after optimal decoding is suppressed at a significantly higher rate, leading to a substantial reduction of the error correction overhead

  16. Optimized Method for Generating and Acquiring GPS Gold Codes

    Directory of Open Access Journals (Sweden)

    Khaled Rouabah

    2015-01-01

    We propose a simpler and faster Gold code generator, which can be efficiently initialized to any desired code with minimum delay. Its principle consists of generating only one sequence (code number 1) from which all the other signal codes can be produced. This is realized by simply shifting this sequence by different delays that are judiciously determined using the characteristics of the bicorrelation function. This is in contrast to the classical Linear Feedback Shift Register (LFSR) based Gold code generator, which requires, in addition to the shift process, a significant number of XOR logic gates and a phase selector to change the code. The presence of all these XOR gates in the classical LFSR-based Gold code generator adds time to the generation and acquisition processes. In addition to its simplicity and rapidity, the proposed architecture, due to the total absence of XOR gates, uses fewer resources than the conventional Gold generator and can thus be produced at lower cost. Digital Signal Processing (DSP) implementations have shown that the proposed architecture presents a solution for acquiring Global Positioning System (GPS) satellite signals optimally and in a parallel way.
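For reference, the classical dual-LFSR generator that the proposed shift-based architecture is compared against can be sketched as follows. This is the standard GPS C/A construction (G1 and G2 registers combined through PRN-specific taps), not the authors' optimized design:

```python
def ca_code(prn_taps, length=1023):
    """Classical GPS C/A Gold code generator (dual LFSR form).
    G1: 1 + x^3 + x^10, G2: 1 + x^2 + x^3 + x^6 + x^8 + x^9 + x^10,
    both seeded with all ones; the PRN is selected by the two G2 taps.
    This tap selection is the 'phase selector' the abstract refers to."""
    g1 = [1] * 10                      # index 0 = stage 1, index 9 = stage 10
    g2 = [1] * 10
    t1, t2 = prn_taps
    chips = []
    for _ in range(length):
        chips.append(g1[9] ^ g2[t1 - 1] ^ g2[t2 - 1])
        g1 = [g1[2] ^ g1[9]] + g1[:9]                          # stages 3, 10
        g2 = [g2[1] ^ g2[2] ^ g2[5] ^ g2[7] ^ g2[8] ^ g2[9]] + g2[:9]
    return chips

prn1 = ca_code((2, 6))                 # PRN 1 uses G2 taps 2 and 6
```

Every chip here costs several XOR operations, which is exactly the overhead the proposed shift-only generator avoids.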

  17. Optimization of the particle pusher in a diode simulation code

    International Nuclear Information System (INIS)

    Theimer, M.M.; Quintenz, J.P.

    1979-09-01

    The particle pusher in Sandia's particle-in-cell diode simulation code has been rewritten to reduce the required run time of a typical simulation. The resulting new version of the code has been found to run up to three times as fast as the original with comparable accuracy. The cost of this optimization was an increase in storage requirements of about 15%. The new version has also been written to run efficiently on a CRAY-1 computing system. The steps taken to effect this reduced run time are described, and various test cases are detailed

  18. Optimal energy management strategy for self-reconfigurable batteries

    International Nuclear Information System (INIS)

    Bouchhima, Nejmeddine; Schnierle, Marc; Schulte, Sascha; Birke, Kai Peter

    2017-01-01

    This paper proposes a novel energy management strategy for multi-cell high-voltage batteries in which the current through each cell can be controlled, called self-reconfigurable batteries. An optimized control strategy further enhances the energy efficiency gained by the hardware architecture of these batteries. Currently, achieving cell equalization with active balancing circuits is considered the best way to optimize the energy efficiency of the battery pack. This study demonstrates that optimizing the energy efficiency of self-reconfigurable batteries is no longer strongly correlated with cell balancing. Reflecting the features of this novel battery architecture, the energy management strategy is formulated as a nonlinear dynamic optimization problem. To solve this optimal control problem, an optimization algorithm that generates the optimal discharge policy for a given driving cycle is developed based on dynamic programming and code vectorization. The simulation results show that the designed energy management strategy maximizes system efficiency across the battery lifetime compared with conventional approaches. Furthermore, the present energy management strategy can be implemented online due to the reduced complexity of the optimization algorithm. - Highlights: • The energy efficiency of self-reconfigurable batteries is maximized. • The energy management strategy for the battery is formulated as an optimal control problem. • An optimization algorithm is developed using dynamic programming techniques and code vectorization. • Simulation studies are conducted to validate the proposed optimal strategy.
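Discharge-policy optimization by dynamic programming can be illustrated with a much-reduced toy: split a current demand among cells with differing internal resistances so that the total resistive loss is minimal. The resistances, demand, and quadratic loss model below are hypothetical stand-ins for the paper's formulation:

```python
def optimal_allocation(resistances, demand, max_per_cell):
    """Dynamic program: split `demand` integer units of current among cells
    so the total resistive loss sum(r_i * x_i^2) is minimal; a toy stand-in
    for the per-cell discharge-policy optimization."""
    INF = float("inf")
    dp = [0.0] + [INF] * demand        # dp[d] = min loss delivering d units
    choice = []
    for r in resistances:
        new, pick = [INF] * (demand + 1), [0] * (demand + 1)
        for d in range(demand + 1):
            for x in range(min(d, max_per_cell) + 1):
                cost = dp[d - x] + r * x * x
                if cost < new[d]:
                    new[d], pick[d] = cost, x
        dp = new
        choice.append(pick)
    alloc, d = [], demand              # backtrack the optimal per-cell shares
    for pick in reversed(choice):
        alloc.append(pick[d])
        d -= pick[d]
    return dp[demand], list(reversed(alloc))

loss, alloc = optimal_allocation([1.0, 1.0, 2.0], demand=4, max_per_cell=4)
```

In the paper, the state would additionally carry per-cell state of charge over the driving cycle, and the inner loops are vectorized; the value-iteration structure is the same.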

  19. Numerical computation of molecular integrals via optimized (vectorized) FORTRAN code

    International Nuclear Information System (INIS)

    Scott, T.C.; Grant, I.P.; Saunders, V.R.

    1997-01-01

    The calculation of molecular properties based on quantum mechanics is an area of fundamental research whose horizons have always been determined by the power of state-of-the-art computers. A computational bottleneck is the numerical calculation of the required molecular integrals to sufficient precision. Herein, we present a method for the rapid numerical evaluation of molecular integrals using optimized FORTRAN code generated by Maple. The method is based on the exploitation of common intermediates and the optimization can be adjusted to both serial and vectorized computations. (orig.)

  20. 41 CFR 101-30.403-2 - Management codes.

    Science.gov (United States)

    2010-07-01

    ....4-Use of the Federal Catalog System § 101-30.403-2 Management codes. For internal use within an... codes shall not be affixed immediately adjacent to or as a part of the national stock number, nor shall... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Management codes. 101-30...

  1. Cooperative optimization and their application in LDPC codes

    Science.gov (United States)

    Chen, Ke; Rong, Jian; Zhong, Xiaochun

    2008-10-01

    Cooperative optimization is a new way of finding global optima of complicated functions of many variables. The proposed algorithm is a class of message passing algorithms and has solid theoretical foundations. It can achieve good coding gains over the sum-product algorithm for LDPC codes. For (6561, 4096) LDPC codes, the proposed algorithm achieves 2.0 dB gains over the sum-product algorithm at a BER of 4×10^-7. The decoding complexity of the proposed algorithm is lower than that of the sum-product algorithm; furthermore, it achieves a much lower error floor once Eb/N0 is higher than 1.8 dB.

  2. Fundamentals of an Optimal Multirate Subband Coding of Cyclostationary Signals

    Directory of Open Access Journals (Sweden)

    D. Kula

    2000-06-01

    A consistent theory of optimal subband coding of zero-mean wide-sense cyclostationary signals, with N-periodic statistics, is presented in this article. An M-channel orthonormal uniform filter bank, employing N-periodic analysis and synthesis filters, is used, while an average variance condition is applied to evaluate the output distortion. In three lemmas and a final theorem, the necessity of decorrelation of blocked subband signals and the requirement of specific ordering of power spectral densities are proven.

  3. Perturbation theory in nuclear fuel management optimization

    International Nuclear Information System (INIS)

    Ho, L.W.; Rohach, A.F.

    1982-01-01

    Perturbation theory, along with a binary fuel-shuffling technique, is applied to predict the effects of various core configurations and, hence, to optimize in-core fuel management. The computer code FULMNT has been developed to shuffle the fuel assemblies in search of the lowest possible power peaking factor. An iterative approach is used in the search routine. A two-group diffusion theory method is used to obtain the power distribution for the iterations. A comparison of the results of this method with other methods shows that this approach can save computer time and obtain better power peaking factors. The code also has a burnup capability that can be used to check power peaking throughout the core life.
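
    The binary shuffling idea above can be sketched as a greedy pairwise-swap search. The peaking-factor evaluator below is a toy stand-in (FULMNT uses a two-group diffusion solve); `toy_ppf` and all names are illustrative assumptions, not the actual code.

```python
import random

def shuffle_search(pattern, peaking_factor, max_iters=1000, seed=0):
    """Greedy binary-shuffle search: swap two assemblies and keep the
    swap only if it lowers the power peaking factor."""
    rng = random.Random(seed)
    best = list(pattern)
    best_ppf = peaking_factor(best)
    for _ in range(max_iters):
        i, j = rng.sample(range(len(best)), 2)
        trial = list(best)
        trial[i], trial[j] = trial[j], trial[i]
        ppf = peaking_factor(trial)
        if ppf < best_ppf:          # keep only improving swaps
            best, best_ppf = trial, ppf
    return best, best_ppf

def toy_ppf(pattern):
    """Toy stand-in for a diffusion solve: peaking is worst when
    high-reactivity (large-valued) assemblies sit near the core centre."""
    centre = (len(pattern) - 1) / 2
    return max(v / (1 + abs(k - centre)) for k, v in enumerate(pattern))
```

    With `toy_ppf`, the search drives high-value assemblies toward the periphery, loosely mimicking the flattening of the power shape that the real search performs.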

  4. Iterative optimization of performance libraries by hierarchical division of codes

    International Nuclear Information System (INIS)

    Donadio, S.

    2007-09-01

    The increasing complexity of hardware features incorporated in modern processors makes high-performance code generation very challenging. Library generators such as ATLAS, FFTW and SPIRAL overcome this issue by empirically searching the space of possible program versions for the one that performs best. This thesis explores a fully automatic solution to adapt a compute-intensive application to the target architecture. By mimicking complex sequences of transformations useful for optimizing real codes, we show that generative programming is a practical tool to implement a new hierarchical compilation approach for the generation of high-performance code relying on the use of state-of-the-art compilers. As opposed to ATLAS, this approach is not application-dependent but can be applied to fairly generic loop structures. Our approach relies on the decomposition of the original loop nest into simpler kernels. These kernels are much simpler to optimize, and furthermore, using such codes makes the performance trade-off problem much simpler to express and to solve. Finally, we propose a new approach for the generation of performance libraries based on this decomposition method. We show that our method generates high-performance libraries, in particular for BLAS. (author)

  5. Thermally Optimized Paradigm of Thermal Management (TOP-M)

    Science.gov (United States)

    2017-07-18

    Final Technical Report, Jul 2015 - Jul 2017 (NICOP) - Thermally Optimized Paradigm of Thermal Management. The main goal of this research was to present a new thermal management approach, which combines thermally aware Very/Ultra Large Scale Integration... Single-photon avalanche diode (SPAD) image sensors were used to demonstrate the new thermal management approach. Keywords: thermal management, integrated temperature sensors, Vt extractor

  6. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    Science.gov (United States)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi-diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). Performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis by referring to bit error rate (BER), signal-to-noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and performs better than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10⁻⁹, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both up- and downlink transmission.

  7. Optimization of administrative management costs

    OpenAIRE

    Podolchak, N.; Chepil, B.

    2015-01-01

    It is important to determine the optimal level of administrative costs in order to achieve the main targets of any enterprise, to perform definite tasks, and to implement these tasks without worsening the condition and motivation of the workers. It is also essential to keep strategic HR goals in mind over the long run. Therefore, the main idea in using an optimization model for assessing the effectiveness of management costs will be to find the minimum level of expenses within the given l...

  8. Constellation labeling optimization for bit-interleaved coded APSK

    Science.gov (United States)

    Xiang, Xingyu; Mo, Zijian; Wang, Zhonghai; Pham, Khanh; Blasch, Erik; Chen, Genshe

    2016-05-01

    This paper investigates the constellation and mapping optimization for amplitude phase shift keying (APSK) modulation, which is deployed in the Digital Video Broadcasting - Satellite - Second Generation (DVB-S2) and Digital Video Broadcasting - Satellite services to Handhelds (DVB-SH) broadcasting standards due to its merits of power and spectral efficiency together with robustness against nonlinear distortion. The mapping optimization is performed for 32-APSK according to combined cost functions related to Euclidean distance and mutual information. A binary switching algorithm and its modified version are used to minimize the cost function and the estimated error between the original and received data. The optimized constellation mapping is tested by combining DVB-S2 standard Low-Density Parity-Check (LDPC) codes in both Bit-Interleaved Coded Modulation (BICM) and BICM with iterative decoding (BICM-ID) systems. The simulation results validate the proposed constellation labeling optimization scheme, which yields better performance than the conventional 32-APSK constellation defined in the DVB-S2 standard.
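
    The binary switching idea can be sketched as follows: repeatedly swap two symbols' bit labels and keep any swap that lowers a cost function. The inverse-distance-weighted Hamming cost and the small 8-PSK demo below are illustrative assumptions; the paper's actual cost combines Euclidean distance and mutual information for 32-APSK.

```python
import cmath, itertools, math

def hamming(a, b):
    return bin(a ^ b).count("1")

def labeling_cost(points, labels):
    """Surrogate cost: close constellation points with many differing
    label bits dominate the bit error rate, so weight Hamming distance
    by inverse squared Euclidean distance (an assumed surrogate)."""
    c = 0.0
    for i, j in itertools.combinations(range(len(points)), 2):
        d2 = abs(points[i] - points[j]) ** 2
        c += hamming(labels[i], labels[j]) / d2
    return c

def binary_switching(points, labels):
    """Repeatedly take any label swap that reduces the cost, until no
    swap improves it (a local search over label permutations)."""
    labels = list(labels)
    best = labeling_cost(points, labels)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(labels)), 2):
            labels[i], labels[j] = labels[j], labels[i]
            c = labeling_cost(points, labels)
            if c < best - 1e-12:
                best, improved = c, True
            else:
                labels[i], labels[j] = labels[j], labels[i]  # undo swap
    return labels, best

# Small 8-PSK demo starting from a deliberately poor identity labeling.
psk8 = [cmath.exp(2j * math.pi * k / 8) for k in range(8)]
```

    On this toy ring, the search tends to recover a Gray-like labeling in which neighbouring points differ in few bits.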

  9. On Optimal Policies for Network-Coded Cooperation

    DEFF Research Database (Denmark)

    Khamfroush, Hana; Roetter, Daniel Enrique Lucani; Pahlevani, Peyman

    2015-01-01

    Network-coded cooperative communication (NC-CC) has been proposed and evaluated as a powerful technology that can provide a better quality of service in the next-generation wireless systems, e.g., D2D communications. Previous contributions have focused on performance evaluation of NC-CC scenarios...... rather than searching for optimal policies that can minimize the total cost of reliable packet transmission. We break from this trend by initially analyzing the optimal design of NC-CC for a wireless network with one source, two receivers, and half-duplex erasure channels. The problem is modeled...... as a special case of Markov decision process (MDP), which is called stochastic shortest path (SSP), and is solved for any field size, arbitrary number of packets, and arbitrary erasure probabilities of the channels. The proposed MDP solution results in an optimal transmission policy per time slot, and we use...
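
    The stochastic shortest path formulation can be illustrated with generic value iteration; the toy chain below (each transmission succeeds with probability 0.8 at unit cost) is an assumption for illustration only, not the paper's NC-CC state space of received-packet sets.

```python
def ssp_value_iteration(states, actions, cost, trans, goal, tol=1e-9):
    """Generic stochastic-shortest-path value iteration.
    trans(s, a) returns a list of (probability, next_state) pairs;
    the goal state is absorbing with zero cost-to-go."""
    V = {s: 0.0 if s == goal else 1e9 for s in states}
    while True:
        delta = 0.0
        for s in states:
            if s == goal:
                continue
            # Bellman backup: pick the action with least expected cost
            best = min(cost(s, a) + sum(p * V[t] for p, t in trans(s, a))
                       for a in actions(s))
            delta = max(delta, abs(V[s] - best))
            V[s] = best
        if delta < tol:
            return V
```

    For the two-hop chain 0 → 1 → 2 with success probability 0.8, the expected cost per hop is 1/0.8 = 1.25, so V[1] = 1.25 and V[0] = 2.5.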

  10. Random mask optimization for fast neutron coded aperture imaging

    Energy Technology Data Exchange (ETDEWEB)

    McMillan, Kyle [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Univ. of California, Los Angeles, CA (United States); Marleau, Peter [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Brubaker, Erik [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-05-01

    In coded aperture imaging, one of the most important factors determining the quality of reconstructed images is the choice of the mask/aperture pattern. In many applications, uniformly redundant arrays (URAs) are widely accepted as the optimal mask pattern. Under ideal conditions (thin and highly opaque masks), URA patterns are mathematically constructed to provide artifact-free reconstruction; however, the number of URAs for a chosen number of mask elements is limited, and when highly penetrating particles such as fast neutrons and high-energy gamma rays are being imaged, the optimum is seldom achieved. In this case, more robust mask patterns that provide better reconstructed image quality may exist. Through the use of heuristic optimization methods and maximum-likelihood expectation maximization (MLEM) image reconstruction, we show that for both point and extended neutron sources a random mask pattern can be optimized to provide better image quality than that of a URA.
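
    A minimal version of such a heuristic mask search is hill climbing over bit moves, scored here by a simple 1D autocorrelation-flatness surrogate rather than the paper's MLEM-based image-quality metric; all names and parameters below are assumptions.

```python
import random

def sidelobe_metric(mask):
    """Spread of the periodic autocorrelation sidelobes: a URA-like
    mask has perfectly flat sidelobes, so lower spread stands in for
    fewer reconstruction artifacts."""
    n = len(mask)
    side = [sum(mask[i] * mask[(i + s) % n] for i in range(n))
            for s in range(1, n)]
    mean = sum(side) / len(side)
    return sum((v - mean) ** 2 for v in side)

def optimize_mask(n=31, open_fraction=0.5, iters=2000, seed=1):
    """Hill climbing over moves of one open element, preserving the
    overall open fraction of the mask."""
    rng = random.Random(seed)
    mask = [1] * int(n * open_fraction) + [0] * (n - int(n * open_fraction))
    rng.shuffle(mask)
    best = sidelobe_metric(mask)
    for _ in range(iters):
        i = rng.choice([k for k in range(n) if mask[k] == 1])
        j = rng.choice([k for k in range(n) if mask[k] == 0])
        mask[i], mask[j] = 0, 1           # move one open element
        m = sidelobe_metric(mask)
        if m < best:
            best = m
        else:
            mask[i], mask[j] = 1, 0       # revert the move
    return mask, best
```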

  11. Perturbation theory in nuclear fuel management optimization

    International Nuclear Information System (INIS)

    Ho, L.W.

    1981-01-01

    Nuclear in-core fuel management involves all the physical aspects which allow optimal operation of the nuclear fuel within the reactor core. In most nuclear power reactors, fuel loading patterns which have a minimum power peak are economically desirable to allow the reactors to operate at the highest power density and to minimize the possibility of fuel failure. In this study, perturbation theory along with a binary fuel shuffling technique is applied to predict the effects of various core configurations and, hence, to optimize in-core fuel management. The computer code FULMNT has been developed to shuffle the fuel assemblies in search of the lowest possible power peaking factor. An iterative approach is used in the search routine. A two-group diffusion theory method is used to obtain the power distribution for the iterations. A comparison of the results of this method with other methods shows that this approach can save computer time. The code also has a burnup capability which can be used to check power peaking throughout the core life.

  12. MOSEG code for safety-oriented maintenance management

    International Nuclear Information System (INIS)

    Torres Valle, Antonio

    2005-01-01

    One of the main reasons maintenance weighs so heavily in the safety and availability problems of facilities is the lack of maintenance management systems that address both fields in a balanced way. Their main shortcomings are shown in this paper. It briefly describes the development of an integrating algorithm for safety- and availability-oriented maintenance management implemented in the MOSEG Win 1.0 code. (author)

  13. Comparative evaluation of various optimization methods and the development of an optimization code system SCOOP

    International Nuclear Information System (INIS)

    Suzuki, Tadakazu

    1979-11-01

    Thirty-two programs for linear and nonlinear optimization problems, with or without constraints, have been developed or incorporated, and their stability, convergence and efficiency have been examined. On the basis of these evaluations, the first version of the optimization code system SCOOP-I has been completed. SCOOP-I is designed to be an efficient, reliable, useful and flexible system for general applications. The system enables one to find the global optimum for a wide class of problems by selecting the most appropriate optimization method built into it. (author)

  14. BWROPT: A multi-cycle BWR fuel cycle optimization code

    Energy Technology Data Exchange (ETDEWEB)

    Ottinger, Keith E.; Maldonado, G. Ivan, E-mail: Ivan.Maldonado@utk.edu

    2015-09-15

    Highlights: • A multi-cycle BWR fuel cycle optimization algorithm is presented. • New fuel inventory and core loading pattern determination. • The parallel simulated annealing algorithm was used for the optimization. • Variable sampling probabilities were compared to constant sampling probabilities. - Abstract: A new computer code for performing BWR in-core and out-of-core fuel cycle optimization for multiple cycles simultaneously has been developed. Parallel simulated annealing (PSA) is used to optimize the new fuel inventory and the placement of new and reload fuel for each cycle considered. Several algorithm improvements were implemented and evaluated, the most significant of which are variable sampling probabilities and sampling new fuel types from an ordered array. A heuristic control rod pattern (CRP) search algorithm was also implemented, which is useful for single-CRP determinations; however, this feature requires significant computational resources and is currently not practical for use in a full multi-cycle optimization. The PSA algorithm was demonstrated to be capable of significant objective function reduction and of finding candidate loading patterns without constraint violations. The use of variable sampling probabilities was shown to reduce runtime while producing better results than constant sampling probabilities. Sampling new fuel types from an ordered array had a mixed effect compared to random sampling: combining random and ordered sampling produced better results but required longer runtimes.

  15. Optimization of inventory management in furniture manufacturing

    OpenAIRE

    Karkauskas, Justinas

    2017-01-01

    Aim of research - to present inventory management optimization guidelines for a furniture manufacturing company, based on an analysis of the scientific literature and empirical research. Tasks of the research: • Disclose problems of inventory management in the furniture manufacturing sector; • Analyze theoretical inventory management decisions; • Develop a theoretical inventory management optimization model; • Carry out empirical research on inventory management and present offers for optimizatio...

  16. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  17. Nuclear fuel management optimization for LWRs

    International Nuclear Information System (INIS)

    Turinsky, Paul J.

    1997-01-01

    LWR in-core nuclear fuel management involves the placement of fuel and control materials so that a specified objective is achieved within constraints. Specifically, one is interested in determining the core loading pattern (LP) of fuel assemblies and burnable poisons and, for BWRs, also control rod insertion versus cycle exposure. Possible objectives include minimization of feed enrichment and maximization of cycle energy production, discharge burnup or thermal margin. Constraints imposed relate to physical constraints, e.g. no discrete burnable poisons in control rod locations, and operational and safety constraints, e.g. a maximum power peaking limit. The LP optimization problem is a large-scale, nonlinear, mixed-integer decision-variable problem with active constraints. Even with quarter-core symmetry imposed, there are more than 10¹⁰⁰ possible LPs. The implication is that deterministic optimization methods are not suitable, so in this work we have pursued the stochastic simulated annealing optimization method. Adaptive penalty functions are used to impose certain constraints, allowing unfeasible regions of the search space to be traversed. Since tens of thousands of LPs must be examined, to achieve high computational efficiency higher-order generalized perturbation theory is utilized to solve the nodal expansion method form of the two-group neutron diffusion equation. These methods have been incorporated into the FORMOSA series of codes and used to optimize PWR and BWR reload cores. (author). 9 refs., 3 tabs
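
    The adaptive-penalty idea, letting the search traverse infeasible regions while a growing penalty steers it back toward feasibility, can be sketched as follows; the update rule, constants, and the toy constrained problem are our assumptions, not FORMOSA's actual formulation.

```python
import random

def adaptive_penalty_search(objective, violation, neighbor, x0,
                            iters=3000, seed=0):
    """Random local search with an adaptive penalty: the multiplier w
    grows while the incumbent is infeasible and relaxes once it is
    feasible, so infeasible regions can be crossed early in the search."""
    rng = random.Random(seed)
    x, w = x0, 1.0
    score = lambda s: objective(s) + w * violation(s)  # penalized objective
    cur = score(x)
    best_feasible = None
    for _ in range(iters):
        cand = neighbor(x, rng)
        if score(cand) <= cur:
            x = cand
        if violation(x) > 0:
            w *= 1.05          # tighten penalty while infeasible
        else:
            w = max(1.0, w * 0.99)  # relax penalty while feasible
            if best_feasible is None or objective(x) < objective(best_feasible):
                best_feasible = x
        cur = score(x)         # re-score under the updated weight
    return best_feasible
```

    On a toy problem such as minimizing x² subject to x ≥ 3, the search can wander below the constraint boundary early on, then settles near the constrained optimum at x = 3 as the penalty tightens.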

  18. Code portability and data management considerations in the SAS3D LMFBR accident-analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.

    1981-01-01

    The SAS3D code was produced from a predecessor in order to reduce or eliminate interrelated problems in the areas of code portability, the large size of the code, inflexibility in the use of memory and the size of cases that can be run, code maintenance, and running speed. Many conventional solutions, such as variable dimensioning, disk storage, virtual memory, and existing code-maintenance utilities were not feasible or did not help in this case. A new data management scheme was developed, coding standards and procedures were adopted, special machine-dependent routines were written, and a portable source code processing code was written. The resulting code is quite portable, quite flexible in the use of memory and the size of cases that can be run, much easier to maintain, and faster running. SAS3D is still a large, long running code that only runs well if sufficient main memory is available

  19. Investigation of Navier-Stokes Code Verification and Design Optimization

    Science.gov (United States)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate-model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification, a least-squares extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single-element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite-rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design, whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature).
A preliminary multi-objective optimization

  20. Optimization and Validation of the Developed Uranium Isotopic Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Kang, M. Y.; Kim, Jinhyeong; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    γ-ray spectroscopy is a representative non-destructive assay for nuclear material, and is less time-consuming and less expensive than destructive analysis methods. The destructive technique is more precise than the NDA technique; however, correction algorithms can improve the performance of γ-spectroscopy. For this reason, an analysis code for uranium isotopic analysis was developed by the Applied Nuclear Physics Group at Seoul National University. Overlapped γ- and X-ray peaks in the 89-101 keV Xα region are fitted with Gaussian and Lorentzian peak functions plus tail and background functions. In this study, optimizations of the full-energy peak efficiency calibration and of the fitting parameters for the peak tail and background are performed, and validated with 24-hour acquisitions of CRM uranium samples. The analysis performance is improved for HEU samples, but further optimization of the fitting parameters is required for LEU sample analysis. In the future, optimization of the fitting parameters for various types of uranium samples will be performed, and ²³⁴U isotopic analysis algorithms and correction algorithms (coincidence effect, self-attenuation effect) will be developed.

  1. Turbine Airfoil Optimization Using Quasi-3D Analysis Codes

    Directory of Open Access Journals (Sweden)

    Sanjay Goel

    2009-01-01

    A new approach to optimizing the geometry of a turbine airfoil by simultaneously designing multiple 2D sections of the airfoil is presented in this paper. The complexity of 3D geometry modeling is circumvented by generating multiple 2D airfoil sections and constraining their geometry in the radial direction using first- and second-order polynomials that ensure smoothness in the radial direction. The flow fields of candidate geometries obtained during optimization are evaluated using a quasi-3D, inviscid CFD analysis code. An inviscid flow solver is used to reduce the execution time of the analysis. Multiple evaluation criteria based on the Mach number profile obtained from the analysis of each airfoil cross-section are used to compute a quality metric. A key contribution of the paper is the development of metrics that emulate the perception of the human designer in visually evaluating the Mach number distribution. A mathematical representation of the evaluation criteria coupled with a parametric geometry generator enables the use of formal optimization techniques in the design. The proposed approach is implemented in the optimal design of a low-pressure turbine nozzle.

  2. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    Science.gov (United States)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package RotorCraft Optimization Tools (RCOTOOLS) is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multidisciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute the codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including graphical user interface (GUI) tools for browsing input and output files in order to identify the text strings that mark specific variables as optimization inputs and response variables. This paper provides an overview of RCOTOOLS and its use.
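
    A minimal file-based wrapper of the kind described can be sketched as follows; the template placeholders, the `RESULT = ...` output convention, and all names are hypothetical, not the actual NDARC or CAMRAD II file formats.

```python
import os
import re
import subprocess
import tempfile

def run_file_based_tool(template, values, exe=None,
                        result_pattern=r"RESULT\s*=\s*([-\d.eE+]+)"):
    """Substitute design-variable placeholders like {radius} into an
    input template, run the tool on the file, and regex out a single
    response value from its output."""
    text = template.format(**values)
    with tempfile.NamedTemporaryFile("w", suffix=".inp", delete=False) as f:
        f.write(text)
        path = f.name
    try:
        if exe:   # a real tool would read the file and print results
            out = subprocess.run([exe, path], capture_output=True,
                                 text=True, check=True).stdout
        else:     # demo mode: pretend the tool echoes its input back
            out = text
        m = re.search(result_pattern, out)
        return float(m.group(1)) if m else None
    finally:
        os.unlink(path)
```

    An optimization framework can then treat this function as a black-box evaluation: write design variables in, read one response out.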

  3. The efficiency and fidelity of the in-core nuclear fuel management code FORMOSA-P

    International Nuclear Information System (INIS)

    Kropaczek, D.J.; Turinsky, P.J.

    1994-01-01

    The second-order generalized perturbation theory (GPT) nodal neutronic model utilized within the nuclear fuel management optimization code FORMOSA-P is presented in the context of prediction fidelity and computational efficiency versus a forward solution. Key features of the GPT neutronics model as implemented within the simulated annealing optimization adaptive control algorithm are discussed. Supporting results are then presented demonstrating the superior consistency of adaptive control for both global and local optimization searches. (authors). 15 refs., 1 fig., 4 tabs

  4. An Improved Real-Coded Population-Based Extremal Optimization Method for Continuous Unconstrained Optimization Problems

    Directory of Open Access Journals (Sweden)

    Guo-Qiang Zeng

    2014-01-01

    As a novel evolutionary optimization method, extremal optimization (EO) has been successfully applied to a variety of combinatorial optimization problems. However, applications of EO to continuous optimization problems are relatively rare. This paper proposes an improved real-coded population-based EO method (IRPEO) for continuous unconstrained optimization problems. The key operations of IRPEO include generation of a real-coded random initial population, evaluation of individual and population fitness, selection of bad elements according to a power-law probability distribution, generation of a new population based on uniform random mutation, and updating the population by accepting the new population unconditionally. Experimental results on 10 benchmark test functions with dimension N=30 show that IRPEO is competitive with, or even better than, recently reported genetic algorithm (GA) versions with different mutation operations in terms of simplicity, effectiveness, and efficiency. Furthermore, the superiority of IRPEO over other evolutionary algorithms such as the original population-based EO, particle swarm optimization (PSO), and the hybrid PSO-EO is also demonstrated by the experimental results on some benchmark functions.
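
    The operations listed in the abstract can be sketched directly; the power-law exponent, mutation width, and bounds below are our assumptions, and the paper's exact settings may differ.

```python
import random

def irpeo_sketch(f, dim=5, bounds=(-5.0, 5.0), pop=20, iters=3000,
                 tau=1.5, seed=0):
    """Sketch of a real-coded population-based extremal optimization
    loop: rank individuals, pick a bad one with power-law probability,
    replace it by uniform random mutation, accept unconditionally."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best = min(xs, key=f)
    best_v = f(best)
    # Power-law rank weights: index 0 (the worst) gets the largest weight.
    weights = [r ** -tau for r in range(1, pop + 1)]
    for _ in range(iters):
        xs.sort(key=f, reverse=True)              # index 0 = worst
        k = rng.choices(range(pop), weights)[0]
        xs[k] = [min(hi, max(lo, v + rng.uniform(-1.0, 1.0)))
                 for v in xs[k]]                  # uniform random mutation
        # New population accepted unconditionally; elite tracked separately.
        cand = min(xs, key=f)
        if f(cand) < best_v:
            best, best_v = cand[:], f(cand)
    return best, best_v
```

    Accepting the mutated population unconditionally is what distinguishes EO-style search from greedy acceptance; only the separately tracked elite is monotone.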

  5. Optimal management of perimenopausal depression

    Directory of Open Access Journals (Sweden)

    Barbara L Parry

    2010-06-01

    Barbara L Parry, Department of Psychiatry, University of California, San Diego, USA. Abstract: Only recently has the perimenopause become recognized as a time when women are at risk for new onset and recurrence of major depression. Untreated depression at this time not only exacerbates the course of a depressive illness, but also puts women at increased risk for sleep disorders, cardiovascular disease, diabetes, and osteoporosis. Although antidepressant medication is the mainstay of treatment, adjunctive therapy, especially with estrogen replacement, may be indicated in refractory cases, and may speed the onset of antidepressant action. Many, but not all, studies report that progesterone antagonizes the beneficial effects of estrogen. Although some antidepressants improve vasomotor symptoms, in general they are not as effective as estrogen alone for relieving these symptoms. Estrogen alone, however, does not generally result in remission of major depression in most (but not all) studies, though it may provide benefit to some women with less severe symptoms if administered in therapeutic ranges. The selective serotonin reuptake inhibitors (SSRIs) in addition to estrogen are usually more beneficial in improving mood than SSRIs or estrogen treatment alone for major depression, whereas the selective norepinephrine and serotonin reuptake inhibitors do not require the addition of estrogen to exert their antidepressant effects in menopausal depression. In addition to attention to general health, hormonal status, and antidepressant treatment, the optimal management of perimenopausal depression also requires attention to the individual woman’s psychosocial and spiritual well-being. Keywords: menopause, depression, management

  6. A novel neutron energy spectrum unfolding code using particle swarm optimization

    International Nuclear Information System (INIS)

    Shahabinejad, H.; Sohrabpour, M.

    2017-01-01

    A novel neutron Spectrum Deconvolution using Particle Swarm Optimization (SDPSO) code has been developed to unfold the neutron spectrum from a pulse-height distribution and a response matrix. Particle swarm optimization (PSO) imitates the social behavior of bird flocks to solve complex optimization problems. The results of the SDPSO code have been compared with standard spectra and with the recently published Two-step Genetic Algorithm Spectrum Unfolding (TGASU) code. The TGASU code has previously been compared with other codes such as MAXED, GRAVEL, FERDOR and GAMCD and shown to be more accurate. The results of the SDPSO code match well with those of the TGASU code for both under-determined and over-determined problems. In addition, the SDPSO code is nearly two times faster than the TGASU code. - Highlights: • Introducing a novel method for neutron spectrum unfolding. • Implementation of a particle swarm optimization code for neutron unfolding. • Comparison of the results of the PSO code with those of the recently published TGASU code. • The results of the PSO code match those of the TGASU code. • Greater convergence rate of the implemented PSO code than the TGASU code.
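
    The unfolding problem can be sketched as a PSO search for a non-negative spectrum that minimizes the residual between the response matrix applied to the spectrum and the measured pulse-height distribution; the inertia and acceleration constants below are common textbook values, not those of the SDPSO code.

```python
import random

def pso_unfold(response, measured, dim, n_particles=30, iters=400,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    """PSO sketch: each particle is a candidate spectrum phi >= 0;
    minimize || R·phi - measured ||^2 via personal/global best updates."""
    rng = random.Random(seed)

    def residual(phi):
        return sum((sum(row[j] * phi[j] for j in range(dim)) - m) ** 2
                   for row, m in zip(response, measured))

    xs = [[rng.uniform(0, 1) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pb = [x[:] for x in xs]                 # personal bests
    pbv = [residual(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbv[i])
    gb, gbv = pb[g][:], pbv[g]              # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pb[i][d] - xs[i][d])
                            + c2 * rng.random() * (gb[d] - xs[i][d]))
                xs[i][d] = max(0.0, xs[i][d] + vs[i][d])  # keep phi >= 0
            v = residual(xs[i])
            if v < pbv[i]:
                pb[i], pbv[i] = xs[i][:], v
                if v < gbv:
                    gb, gbv = xs[i][:], v
    return gb, gbv
```

    On a well-conditioned toy response matrix this converges quickly; real unfolding problems are under- or over-determined, which is where the stochastic search earns its keep.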

  7. Joint research project WASA-BOSS: Further development and application of severe accident codes. Assessment and optimization of accident management measures. Project B: Accident analyses for pressurized water reactors with the application of the ATHLET-CD code; Verbundprojekt WASA-BOSS: Weiterentwicklung und Anwendung von Severe Accident Codes. Bewertung und Optimierung von Stoerfallmassnahmen. Teilprojekt B: Druckwasserreaktor-Stoerfallanalysen unter Verwendung des Severe-Accident-Codes ATHLET-CD

    Energy Technology Data Exchange (ETDEWEB)

    Jobst, Matthias; Kliem, Soeren; Kozmenkov, Yaroslav; Wilhelm, Polina

    2017-02-15

    Within the framework of the project, an ATHLET-CD input deck for a generic German PWR of the KONVOI type has been created. This input deck was applied to the simulation of severe accidents from the accident categories station blackout (SBO) and small-break loss-of-coolant accident (SBLOCA). The complete accident transient, from the initiating event at full power until damage of the reactor pressure vessel (RPV), is covered, and all relevant severe accident phenomena are modelled: start of core heat-up, fission product release, melting of fuel and absorber material, oxidation and release of hydrogen, relocation of molten material inside the core, relocation to the lower plenum, and damage and failure of the RPV. The model has been applied to the analysis of preventive and mitigative accident management measures for SBO and SBLOCA transients. The measures of primary-side depressurization (PSD), injection into the primary circuit by mobile pumps and, for SBLOCA, delayed injection by the cold-leg hydro-accumulators have been investigated, and the assumptions and start criteria of these measures have been varied. The time evolutions of the transients and the time margins for the initiation of additional measures have been assessed. An uncertainty and sensitivity study has been performed for the early phase of one SBO scenario with PSD (until the start of core melt). In addition, a code-to-code comparison between ATHLET-CD and the severe accident code MELCOR has been carried out.

  8. Optimization and standardization of pavement management processes.

    Science.gov (United States)

    2004-08-01

    This report addresses issues related to optimization and standardization of current pavement management processes in Kentucky. Historical pavement management records were analyzed, which indicates that standardization is necessary in future pavement ...

  9. Optimization of the FAST ICRF antenna using TOPICA code

    International Nuclear Information System (INIS)

    Sorba, M.; Milanesio, D.; Maggiora, R.; Tuccillo, A.

    2010-01-01

    Ion Cyclotron Resonance Heating is one of the most important auxiliary heating systems in most plasma confinement experiments. Because of this, the need for very accurate design of ion cyclotron (IC) launchers has grown dramatically in recent years. Furthermore, a reliable simulation tool is crucial to the successful design of these antennas, since full testing is impossible outside experiments. One of the most advanced and validated simulation codes is TOPICA, which can handle the geometrical level of detail of a real antenna in front of an accurately described plasma scenario. Adopting this essential tool made it possible to reach a refined design of the ion cyclotron radio frequency antenna for the FAST (Fusion Advanced Studies Torus) experiment. Starting from a streamlined antenna model and then following well-defined refinement procedures, an optimized launcher design in terms of power delivered to the plasma was finally achieved. The computer-assisted geometry refinements increased the performance of the antenna, notably its power handling: improvements of this extent had not been achieved in the past, essentially due to the absence of predictive tools capable of analyzing the detailed effects of antenna geometry in plasma-facing conditions. Thus, with the help of the TOPICA code, it has been possible to comply with the FAST experiment requirements in terms of vacuum chamber constraints and power delivered to the plasma. Once the antenna geometry was optimized for a reference plasma profile, the analysis of the launcher's performance was extended to two plasma scenarios. Exploiting all TOPICA features, it has been possible to predict the behavior of the launcher in real operating conditions, for instance when varying the position of the separatrix surface.
In order to fulfil the analysis of the FAST IC antenna, the study of the RF potentials, which depend on the parallel electric field computation

  10. Human resources managers as custodians of the King III code

    Directory of Open Access Journals (Sweden)

    Frank de Beer

    2015-05-01

    The objective of this research was to perform an exploratory study on the knowledge and understanding of the King III code among Human Resources (HR) managers in South African companies. The King III code is a comprehensive international corporate governance regime which addresses the financial, social, ethical and environmental practices of organisations. HR management plays a role in managing corporate governance by using the King III code as a guideline. The main research questions were: Does HR management know, understand, apply, and have the ability to use the King III code in terms of ethical decision-making? What role does HR management play in corporate governance? A random sample of available HR managers, senior HR consultants and HR directors was taken, and semi-structured interviews were conducted. The results indicated that the respondents had no in-depth knowledge of the King III code. They did not fully understand the King III code and its implications, nor did they use it to ensure ethical management. The themes most emphasised by the participants were: culture, reward and remuneration, policies and procedures, and performance management. The participants emphasised the importance of these items and HR's role in managing them.

  11. Optimized Min-Sum Decoding Algorithm for Low Density Parity Check Codes

    OpenAIRE

    Mohammad Rakibul Islam; Dewan Siam Shafiullah; Muhammad Mostafa Amir Faisal; Imran Rahman

    2011-01-01

    Low Density Parity Check (LDPC) codes approach Shannon-limit performance for the binary field and long code lengths. However, the performance of binary LDPC codes degrades when the code word length is small. An optimized min-sum algorithm for LDPC codes is proposed in this paper. In this algorithm, unlike in other decoding methods, an optimization factor is introduced at both the check nodes and the bit nodes of the min-sum algorithm. The optimization factor is obtained before decoding, and the sam...
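
The check-node half of such a modified min-sum decoder can be sketched in a few lines. This is a generic scaled ("normalized") min-sum update rather than the paper's exact algorithm; the factor alpha = 0.8 and the example LLR values are illustrative assumptions.

```python
# Sketch of a scaled min-sum check-node update for LDPC decoding. The
# scaling factor alpha applied at the check node (a similar factor can
# be applied at the bit node) is the kind of "optimization factor" the
# abstract describes; alpha = 0.8 is an illustrative value only.

def check_node_update(llrs, alpha=0.8):
    """Return the min-sum messages from a check node to each neighbor.

    llrs: incoming log-likelihood ratios from the connected bit nodes.
    The message to bit i uses the sign product and minimum magnitude
    of all *other* incoming LLRs, scaled by alpha.
    """
    out = []
    for i in range(len(llrs)):
        others = llrs[:i] + llrs[i + 1:]
        sign = 1
        for v in others:
            if v < 0:
                sign = -sign
        mag = min(abs(v) for v in others)
        out.append(alpha * sign * mag)
    return out

msgs = check_node_update([1.5, -2.0, 0.5])
print(msgs)  # messages to each of the three connected bit nodes
```

Choosing alpha < 1 damps the systematic overestimation of min-sum relative to full sum-product decoding, which is why such factors improve performance at short code lengths.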

  12. Phenotypic Graphs and Evolution Unfold the Standard Genetic Code as the Optimal

    Science.gov (United States)

    Zamudio, Gabriel S.; José, Marco V.

    2018-03-01

    In this work, we explicitly consider the evolution of the Standard Genetic Code (SGC) by assuming two evolutionary stages, to wit, the primeval RNY code and two intermediate codes in between. We used network theory and graph theory to measure the connectivity of each phenotypic graph. The connectivity values are compared to the values of the codes under different randomization scenarios. An error-correcting optimal code is one in which the algebraic connectivity is minimized. We show that the SGC is optimal in regard to its robustness and error-tolerance when compared to all random codes under different assumptions.
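
The connectivity measure used above can be illustrated directly. The sketch below computes the algebraic connectivity (the second-smallest eigenvalue of the graph Laplacian) for a toy 4-node path graph; the graph is a hypothetical stand-in for a phenotypic graph, not data from the paper.

```python
# Algebraic connectivity = second-smallest eigenvalue of the Laplacian
# L = D - A, the quantity the abstract minimizes for an error-correcting
# optimal code. The 4-node path graph is a toy example.
import numpy as np

def algebraic_connectivity(adj):
    """Second-smallest eigenvalue of the Laplacian L = D - A."""
    adj = np.asarray(adj, dtype=float)
    degree = np.diag(adj.sum(axis=1))
    laplacian = degree - adj
    eigvals = np.linalg.eigvalsh(laplacian)  # sorted ascending
    return eigvals[1]

# Path graph on 4 nodes: 0-1-2-3
path = [[0, 1, 0, 0],
        [1, 0, 1, 0],
        [0, 1, 0, 1],
        [0, 0, 1, 0]]
print(round(algebraic_connectivity(path), 4))  # → 0.5858 (= 2 - sqrt(2))
```

Lower values correspond to more loosely connected phenotypic graphs, i.e. point mutations that jump between dissimilar phenotypes less often.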

  13. Study of nuclear computer code maintenance and management system

    International Nuclear Information System (INIS)

    Ryu, Chang Mo; Kim, Yeon Seung; Eom, Heung Seop; Lee, Jong Bok; Kim, Ho Joon; Choi, Young Gil; Kim, Ko Ryeo

    1989-01-01

    Software maintenance has been one of the most important problems since the late 1970s. We wish to develop a nuclear computer code system to maintain and manage KAERI's nuclear software. As part of this system, we have developed three code management programs for use on CYBER and PC systems. They are used for the systematic management of computer codes at KAERI. The first program runs on the CYBER system to rapidly provide information on nuclear codes to users. The second and third programs were implemented on the PC system for the code manager and for the management of data in the Korean language, respectively. In the requirement analysis, we defined each code, magnetic tape, manual, and abstract information data. In the conceptual design, we designed retrieval, update, and output functions. In the implementation design, we described the technical considerations for the database programs, utilities, and directions for the use of the databases. As a result of this research, we compiled the status of the nuclear computer codes which belonged to KAERI as of September 1988. Thus, by using these three database programs, we could provide nuclear computer code information to users more rapidly. (Author)

  14. A Realistic Model under which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, H.; van der Gulik, P.T.S.; Klau, G.W.; Schaffner, C.; Speijer, D.; Stougie, L.

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By

  15. Is the international safety management code an organisational tool ...

    African Journals Online (AJOL)

    The birth of the International Management Code for the Safe Operation of Ships and for Pollution Prevention (hereinafter ISM Code) is said to be a reaction to the sinking of the Herald of Free Enterprise on 6th March 1987. The human element is said to be a generic term used to describe what makes humans behave the way ...

  16. IM (Integrity Management) software must show flexibility to local codes

    Energy Technology Data Exchange (ETDEWEB)

    Brors, Markus [ROSEN Technology and Research Center GmbH (Germany); Diggory, Ian [Macaw Engineering Ltd., Northumberland (United Kingdom)

    2009-07-01

    There are many internationally recognized codes and standards, such as API 1160 and ASME B31.8S, which help pipeline operators to manage and maintain the integrity of their pipeline networks. However, operators in many countries still use local codes that often reflect the history of pipeline developments in their region and are based on direct experience and research on their pipelines. As pipeline companies come under increasing regulatory and financial pressures to maintain the integrity of their networks, it is important that operators using regional codes are able to benchmark their integrity management schemes against these international standards. Any comprehensive Pipeline Integrity Management System (PIMS) software package should therefore not only incorporate industry standards for pipeline integrity assessment but also be capable of implementing regional codes for comparison purposes. This paper describes the challenges and benefits of incorporating one such set of regional pipeline standards into ROSEN Asset Integrity Management Software (ROAIMS). (author)

  17. Optimal Cash Management Under Uncertainty

    OpenAIRE

    Bensoussan, Alain; Chutani, Anshuman; Sethi, Suresh

    2009-01-01

    We solve an agent's optimization problem of meeting demands for cash over time with cash deposited in bank or invested in stock. The stock pays dividends and uncertain capital gains, and a commission is incurred in buying and selling of stock. We use a stochastic maximum principle to obtain explicitly the optimal transaction policy.

  18. WASA-BOSS. Development and application of Severe Accident Codes. Evaluation and optimization of accident management measures. Subproject F. Contributions to code validation using BWR data and to evaluation and optimization of accident management measures. Final report; WASA-BOSS. Weiterentwicklung und Anwendung von Severe Accident Codes. Bewertung und Optimierung von Stoerfallmassnahmen. Teilprojekt F. Beitraege zur Codevalidierung anhand von SWR-Daten und zur Bewertung und Optimierung von Stoerfallmassnahmen. Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Di Marcello, Valentino; Imke, Uwe; Sanchez Espinoza, Victor

    2016-09-15

    Exact knowledge of the transient course of events and of the dominating processes during a severe accident in a nuclear power station is a mandatory requirement for elaborating strategies and measures to minimize the radiological consequences of core melt. Two typical experiments using boiling water reactor assemblies were modelled and simulated with the severe accident simulation code ATHLET-CD. The experiments relate to the early phase of core degradation in a boiling water reactor. The results reproduce the thermal behavior and the hydrogen production due to oxidation inside the bundle until relocation of material by melting. During flooding of the overheated assembly, temperatures and hydrogen oxidation are underestimated. The deviations from the experimental results can be explained by the missing model for boron carbide oxidation of the control rods. On the basis of a hypothetical loss-of-coolant accident in a typical German boiling water reactor, the effectiveness of flooding the partially degraded core is investigated. This mitigation measure is efficient and prevents failure of the reactor pressure vessel if it starts before molten material is relocated into the lower plenum. A considerable amount of hydrogen is produced by oxidation of the metallic components.

  19. Recovering management information from source code

    NARCIS (Netherlands)

    Kwiatkowski, L.; Verhoef, C.

    2013-01-01

    IT has become a production means for many organizations and an important element of business strategy. Even though its effective management is a must, reality shows that this area still remains in its infancy. IT management relies profoundly on relevant information which enables risk mitigation or

  20. Application bar-code system for solid radioactive waste management

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Y. H.; Kim, T. K.; Kang, I. S.; Cho, H. S.; Son, J. S. [KAERI, Taejon (Korea, Republic of)

    2004-07-01

    Solid radioactive wastes are generated from the post-irradiation fuel examination facility, the irradiated material examination facility, the research reactor, and the laboratories at KAERI. A bar-code system for solid radioactive waste management became necessary while developing RAWMIS (Radioactive Waste Management Integration System), which can generate history records, documents, and various statistics for the efficient management of waste. This paper introduces the design of an input and output application program that analyzes the waste occurrence status and stores the data collected by the bar-code system in a database.

  1. CRACKLE: a computer code for CFR fuel management calculations

    International Nuclear Information System (INIS)

    Burstall, R.F.; Ball, M.A.; Thornton, D.E.J.

    1975-12-01

    The CRACKLE computer code is designed to perform rapid fuel management surveys of CFR systems. The code calculates overall quantities such as reactivity, power distributions and breeding gain, and also calculates the plutonium content and power output of each sub-assembly. A number of alternative options are built into the code to permit different fuel management strategies to be calculated, and to perform more detailed calculations when necessary. A brief description is given of the methods of calculation and the input facilities of CRACKLE, with examples. (author)

  2. Greedy vs. L1 convex optimization in sparse coding

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    2015-01-01

    Sparse representation has been applied successfully in many image analysis applications, including abnormal event detection, in which a baseline is to learn a dictionary from the training data and detect anomalies from its sparse codes. During this procedure, sparse codes which can be achieved...... solutions. Considering the property of abnormal event detection, i.e., only normal videos are used as training data due to practical reasons, effective codes in classification application may not perform well in abnormality detection. Therefore, we compare the sparse codes and comprehensively evaluate...... their performance from various aspects to better understand their applicability, including computation time, reconstruction error, sparsity, detection...

  3. Novel Area Optimization in FPGA Implementation Using Efficient VHDL Code

    Directory of Open Access Journals (Sweden)

    Zulfikar

    2012-10-01

    A novel method for area-efficient FPGA implementation is presented. The method exploits the flexibility and wide capability of VHDL coding. It targets arithmetic operations such as addition, subtraction and others. The design technique aims to reduce the occupied area of multi-stage circuits by selecting a suitable range for every value involved in each step of the calculations. Conventional and efficient VHDL coding methods are presented and the synthesis results are compared. VHDL code which limits the range of integer values occupies less area than code which does not. This VHDL coding method is suitable for multi-stage circuits.

  4. Novel Area Optimization in FPGA Implementation Using Efficient VHDL Code

    Directory of Open Access Journals (Sweden)

    Zulfikar

    2015-05-01

    A novel method for area-efficient FPGA implementation is presented. The method exploits the flexibility and wide capability of VHDL coding. It targets arithmetic operations such as addition, subtraction and others. The design technique aims to reduce the occupied area of multi-stage circuits by selecting a suitable range for every value involved in each step of the calculations. Conventional and efficient VHDL coding methods are presented and the synthesis results are compared. VHDL code which limits the range of integer values occupies less area than code which does not. This VHDL coding method is suitable for multi-stage circuits.

  5. Generalized rank weights of reducible codes, optimal cases and related properties

    DEFF Research Database (Denmark)

    Martinez Peñas, Umberto

    2018-01-01

    in network coding. In this paper, we study their security behavior against information leakage on networks when applied as coset coding schemes, giving the following main results: 1) we give lower and upper bounds on their generalized rank weights (GRWs), which measure worst case information leakage...... to the wire tapper; 2) we find new parameters for which these codes are MRD (meaning that their first GRW is optimal) and use the previous bounds to estimate their higher GRWs; 3) we show that all linear (over the extension field) codes, whose GRWs are all optimal for fixed packet and code sizes but varying...... length are reducible codes up to rank equivalence; and 4) we show that the information leaked to a wire tapper when using reducible codes is often much less than the worst case given by their (optimal in some cases) GRWs. We conclude with some secondary related properties: conditions to be rank...

  6. Optimal Near-Hitless Network Failure Recovery Using Diversity Coding

    Science.gov (United States)

    Avci, Serhat Nazim

    2013-01-01

    Link failures in wide area networks are common and cause significant data losses. Mesh-based protection schemes offer high capacity efficiency but they are slow, require complex signaling, and are unstable. Diversity coding is a proactive coding-based recovery technique which offers near-hitless (sub-ms) restoration with a competitive spare capacity…

  7. Differentially Encoded LDPC Codes—Part II: General Case and Code Optimization

    Directory of Open Access Journals (Sweden)

    Jing Li (Tiffany)

    2008-04-01

    This two-part series of papers studies the theory and practice of differentially encoded low-density parity-check (DE-LDPC) codes, especially in the context of noncoherent detection. Part I showed that a special class of DE-LDPC codes, product accumulate codes, performs very well with both coherent and noncoherent detection. The analysis here reveals that a conventional LDPC code, however, is not well suited to differential coding and does not, in general, deliver a desirable performance when detected noncoherently. Through extrinsic information transfer (EXIT) analysis and a modified “convergence-constraint” density evolution (DE) method developed here, we characterize the type of LDPC degree profiles that work in harmony with differential detection (or a recursive inner code in general), and demonstrate how to optimize these LDPC codes. The convergence-constraint method provides a useful extension to the conventional “threshold-constraint” method, and can match an outer LDPC code to any given inner code with the imperfectness of the inner decoder taken into consideration.

  8. Optimal management strategies in variable environments: Stochastic optimal control methods

    Science.gov (United States)

    Williams, B.K.

    1985-01-01

    Dynamic optimization was used to investigate the optimal defoliation of salt desert shrubs in north-western Utah. Management was formulated in the context of optimal stochastic control theory, with objective functions composed of discounted or time-averaged biomass yields. Climatic variability and community patterns of salt desert shrublands make the application of stochastic optimal control both feasible and necessary. A primary production model was used to simulate shrub responses and harvest yields under a variety of climatic regimes and defoliation patterns. The simulation results then were used in an optimization model to determine optimal defoliation strategies. The latter model encodes an algorithm for finite state, finite action, infinite discrete time horizon Markov decision processes. Three questions were addressed: (i) What effect do changes in weather patterns have on optimal management strategies? (ii) What effect does the discounting of future returns have? (iii) How do the optimal strategies perform relative to certain fixed defoliation strategies? An analysis was performed for the three shrub species, winterfat (Ceratoides lanata), shadscale (Atriplex confertifolia) and big sagebrush (Artemisia tridentata). In general, the results indicate substantial differences among species in optimal control strategies, which are associated with differences in physiological and morphological characteristics. Optimal policies for big sagebrush varied less with variation in climate, reserve levels and discount rates than did either shadscale or winterfat. This was attributed primarily to the overwintering of photosynthetically active tissue and to metabolic activity early in the growing season. Optimal defoliation of shadscale and winterfat generally was more responsive to differences in plant vigor and climate, reflecting the sensitivity of these species to utilization and replenishment of carbohydrate reserves. 
Similarities could be seen in the influence of both
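
The decision model described above, a finite-state, finite-action, discounted infinite-horizon Markov decision process, can be sketched with standard value iteration. The two-state "vigor" model, transition probabilities, and rewards below are invented toy numbers, not the paper's shrubland data.

```python
# Minimal value-iteration sketch for a discounted, infinite-horizon MDP
# of the kind the optimization model encodes. All numbers are toy values.
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """P[a][s][s'] transition probs, R[a][s] expected reward.

    Returns the optimal value function and a greedy policy.
    """
    n_actions, n_states = len(P), len(P[0])
    V = np.zeros(n_states)
    while True:
        Q = np.array([R[a] + gamma * P[a] @ V for a in range(n_actions)])
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new

# 2 states (low/high plant vigor), 2 actions (0 = rest, 1 = defoliate)
P = np.array([[[0.4, 0.6], [0.1, 0.9]],   # rest: vigor tends to recover
              [[0.9, 0.1], [0.7, 0.3]]])  # defoliate: vigor degrades
R = np.array([[0.0, 0.0],                 # resting yields no biomass
              [1.0, 3.0]])                # defoliation yields biomass
V, policy = value_iteration(P, R)
print(policy)  # for these toy numbers: rest at low vigor, harvest at high
```

The discount factor gamma plays the role of the discounting of future returns discussed in the abstract; setting gamma near 1 approximates the time-averaged-yield objective.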

  9. Efficacy of Code Optimization on Cache-based Processors

    Science.gov (United States)

    VanderWijngaart, Rob F.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    The current common wisdom in the U.S. is that the powerful, cost-effective supercomputers of tomorrow will be based on commodity (RISC) micro-processors with cache memories. Already, most distributed systems in the world use such hardware as building blocks. This shift away from vector supercomputers and towards cache-based systems has brought about a change in programming paradigm, even when ignoring issues of parallelism. Vector machines require inner-loop independence and regular, non-pathological memory strides (usually this means: non-power-of-two strides) to allow efficient vectorization of array operations. Cache-based systems require spatial and temporal locality of data, so that data once read from main memory and stored in high-speed cache memory is used optimally before being written back to main memory. This means that the most cache-friendly array operations are those that feature zero or unit stride, so that each unit of data read from main memory (a cache line) contains information for the next iteration in the loop. Moreover, loops ought to be 'fat', meaning that as many operations as possible are performed on cache data, provided instruction caches do not overflow and enough registers are available. If unit stride is not possible, for example because of some data dependency, then care must be taken to avoid pathological strides, just as on vector computers. For cache-based systems the issues are more complex, due to the effects of associativity and of non-unit block (cache line) size. But there is more to the story. Most modern micro-processors are superscalar, which means that they can issue several (arithmetic) instructions per clock cycle, provided that there are enough independent instructions in the loop body. This is another argument for providing fat loop bodies. With these restrictions, it appears fairly straightforward to produce code that will run efficiently on any cache-based system.
It can be argued that although some of the important
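
The unit-stride point can be made concrete with NumPy's stride bookkeeping; this is a generic illustration, and the array shape and dtype are arbitrary example choices, not the paper's benchmarks.

```python
# Iterating a C-ordered (row-major) 2D array along a row touches
# consecutive memory (stride = one element), while walking down a
# column jumps a whole row per step - the cache-unfriendly pattern
# the text warns about.
import numpy as np

a = np.zeros((1024, 1024), dtype=np.float64)  # C order: rows contiguous
row_stride, col_stride = a.strides
print(col_stride)  # 8 bytes: next element within a row is adjacent
print(row_stride)  # 8192 bytes: next element within a column is a full row away

# Cache-friendly (unit-stride) traversal: the inner loop walks rows.
row_major_sum = a.sum(axis=1)
```

In C-like nested loops the same rule means making the last array index the innermost loop variable, so each loaded cache line is fully consumed before eviction.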

  10. Application of fuel management calculation codes for CANDU reactor

    International Nuclear Information System (INIS)

    Ju Haitao; Wu Hongchun

    2003-01-01

    Qinshan Phase III Nuclear Power Plant adopts CANDU-6 reactors. This is the first time China has introduced this heavy-water pressure-tube reactor type. To meet the demands of fuel management calculations, the DRAGON/DONJON code is developed in this paper. Some initial fuel management calculations for the CANDU-6 reactors of Qinshan Phase III are carried out using the DRAGON/DONJON code. The results indicate that DRAGON/DONJON can be used for the fuel management calculations of Qinshan Phase III

  11. In-core fuel management code package validation for BWRs

    International Nuclear Information System (INIS)

    1995-12-01

    The main goal of the present CRP (Coordinated Research Programme) was to develop benchmarks appropriate for checking and improving fuel management computer code packages and their procedures. Therefore, benchmark specifications were established which included a set of realistic data for running in-core fuel management codes. Secondly, the results of measurements and/or operating data were also provided, to be verified and compared with these parameters as calculated by the in-core fuel management codes or code packages. For the BWR it was established that the Mexican Laguna Verde 1 BWR would serve as the model for providing data on the benchmark specifications. It was decided to provide results for the first 2 cycles of Unit 1 of the Laguna Verde reactor. The analyses of the above benchmarks are performed in two stages. In the first stage, the lattice parameters are generated as a function of burnup at different voids, with and without control rod. These lattice parameters form the input for 3-dimensional diffusion theory codes for overall reactor analysis. The lattice calculations were performed using different methods, such as Monte Carlo, 2-D integral transport theory methods, the supercell model, and a transport-diffusion model with proper correction for burnable absorber. Thus the variety of results should provide adequate information for any institute or organization to develop competence to analyze in-core fuel management codes. 15 refs, figs and tabs

  12. Watershed Management Optimization Support Tool (WMOST) ...

    Science.gov (United States)

    EPA's Watershed Management Optimization Support Tool (WMOST) version 2 is a decision support tool designed to facilitate integrated water management by communities at the small watershed scale. WMOST allows users to look across management options in stormwater (including green infrastructure), wastewater, drinking water, and land conservation programs to find the least-cost solutions. The pdf version of these presentations accompanies the recorded webinar with closed captions, to be posted on the WMOST web page. The webinar was recorded at the time a training workshop took place for EPA's Watershed Management Optimization Support Tool (WMOST, v2).

  13. Optimizing decommissioning and waste management

    International Nuclear Information System (INIS)

    McKeown, J.

    2000-01-01

    UKAEA has clarified its future purpose. It is a nuclear environmental restoration business. Its proud history of being at the forefront of nuclear research now provides decommissioning challenges of unique breadth. The methods employed, and in some cases developed, by UKAEA to assist in the optimization of its overall work programme are identified. (author)

  14. Novel Area Optimization in FPGA Implementation Using Efficient VHDL Code

    OpenAIRE

    Zulfikar, Z

    2012-01-01

    A novel method for area-efficient FPGA implementation is presented. The method exploits the flexibility and wide capability of VHDL coding. It targets arithmetic operations such as addition, subtraction and others. The design technique aims to reduce the occupied area of multi-stage circuits by selecting a suitable range for every value involved in each step of the calculations. Conventional and efficient VHDL coding methods are presented and the synthesis results are compared....

  15. The in-core fuel management code system for VVER reactors

    International Nuclear Information System (INIS)

    Cada, R.; Krysl, V.; Mikolas, P.; Sustek, J.; Svarny, J.

    2004-01-01

    The structure and methodology of a fuel management system for NPP VVER-1000 (NPP Temelin) and VVER-440 (NPP Dukovany) are described. The system is under development at SKODA JS a.s. and is accompanied by practical applications. The general objectives of the system are maximization of end-of-cycle reactivity, minimization of the fresh fuel inventory through minimization of feed enrichment, and minimization of the burnable poisons (BPs) inventory. There are also safety-related constraints, in which minimization of power peaking plays a dominant role. The general structure of the system consists of the preparation of input data for macrocode calculation, and algorithms (codes) for the optimization of fuel loading, the calculation of fuel enrichment, and BP assignment. At present, core loading can be optimized by a tabu search algorithm (code ATHENA), a genetic algorithm (code Gen1), or a hybrid algorithm, a simplex procedure with tabu search applied to binary shuffling (code OPAL-B). The enrichment search is realized by the simplex algorithm (OPAL-B code), and BP assignment by the module BPASS and the simplex algorithm in the OPAL-B code. Calculations of real core loadings are presented and a comparison of different optimization methods is provided. (author)
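
As an illustration of the tabu-search approach used by loading-pattern optimizers such as ATHENA, the sketch below rearranges a toy "loading pattern" by swapping positions while forbidding recently used swaps. The cost function is a made-up stand-in for a real power-peaking calculation, and all parameter values are illustrative.

```python
# Minimal tabu search over permutations: swap two positions per move,
# keep a short-term memory (tabu list) of recent swaps to escape local
# minima. The cost function is a toy proxy, not core physics.
import itertools

def cost(pattern):
    # Toy objective: penalize high "reactivity" values near the center.
    center = len(pattern) // 2
    return sum(v / (1 + abs(i - center)) for i, v in enumerate(pattern))

def tabu_search(pattern, iterations=50, tenure=5):
    pattern = list(pattern)
    best, best_cost = pattern[:], cost(pattern)
    tabu = []  # recently used swaps
    for _ in range(iterations):
        moves = [m for m in itertools.combinations(range(len(pattern)), 2)
                 if m not in tabu]
        def after(m):
            p = pattern[:]
            p[m[0]], p[m[1]] = p[m[1]], p[m[0]]
            return p
        move = min(moves, key=lambda m: cost(after(m)))  # best non-tabu swap
        pattern = after(move)
        tabu.append(move)
        if len(tabu) > tenure:
            tabu.pop(0)  # oldest move leaves the tabu list
        if cost(pattern) < best_cost:
            best, best_cost = pattern[:], cost(pattern)
    return best

loading = tabu_search([5, 3, 8, 1, 9, 2, 4])
print(loading)  # improved arrangement under the toy cost
```

The tabu tenure controls the trade-off between diversification (longer memory) and intensification (shorter memory), the same tuning question faced by the real loading-pattern codes.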

  16. Codes of practice and related issues in biomedical waste management

    Energy Technology Data Exchange (ETDEWEB)

    Moy, D.; Watt, C. [Griffith Univ. (Australia)

    1996-12-31

    This paper outlines the development of a National Code of Practice for biomedical waste management in Australia. The 10 key areas addressed by the code are industry mission statement; uniform terms and definitions; community relations - public perceptions and right to know; generation, source separation, and handling; storage requirements; transportation; treatment and disposal; disposal of solid and liquid residues and air emissions; occupational health and safety; staff awareness and education. A comparison with other industry codes in Australia is made. A list of outstanding issues is also provided; these include the development of standard containers, treatment effectiveness, and reusable sharps containers.

  17. MISER-I: a computer code for JOYO fuel management

    International Nuclear Information System (INIS)

    Yamashita, Yoshioki

    1976-06-01

    The computer code ''MISER-I'' supports the nuclear fuel management of the Japan Experimental Fast Breeder Reactor JOYO. Fuel management in JOYO can be regarded as fuel assembly management, because the handling unit of fuel in the JOYO plant is a fuel subassembly (core or blanket subassembly), and so the material balance is recorded per subassembly. The input information is given per subassembly for transfer operations, or per reactor cycle and per month for burn-up in the reactor core. The output of the MISER-I code comprises the fuel assembly storage record, the fuel storage weight record in each material balance subarea on any specified day, and the fuel subassembly transfer history record. Changes in nuclear fuel composition and weight due to burn-up are calculated with the JOYO Monitoring Code on an off-line computation system. MISER-I is written in FORTRAN-IV for the FACOM 230-48 computer. (auth.)

  18. Simulated evolution applied to study the genetic code optimality using a model of codon reassignments.

    Science.gov (United States)

    Santos, José; Monteagudo, Angel

    2011-02-21

    As the canonical code is not universal, different theories about its origin and organization have appeared. The optimization, or level of adaptation, of the canonical genetic code was measured taking into account the harmful consequences of point mutations leading to the replacement of one amino acid by another. There are two basic approaches to measuring the level of optimization: the statistical approach, which compares the canonical genetic code with many randomly generated alternatives, and the engineering approach, which compares the canonical code with the best possible alternative. Here we used a genetic algorithm to search for better adapted hypothetical codes and as a way to gauge the difficulty of finding such alternative codes, allowing the canonical code to be situated clearly in the fitness landscape. This novel use of evolutionary computing provides a new perspective in the open debate between the statistical approach, which postulates that the genetic code conserves amino acid properties far better than expected from a random code, and the engineering approach, which tends to indicate that the canonical genetic code is still far from optimal. We used two models of hypothetical codes: one that reflects the known examples of codon reassignment, and the model most used in the two approaches, which reflects the current genetic code translation table. Although the standard code is far from a possible optimum under both models, when the more realistic model of codon reassignments was used, the evolutionary algorithm had more difficulty overcoming the efficiency of the canonical genetic code. Simulated evolution clearly reveals that the canonical genetic code is far from optimal. Nevertheless, the efficiency of the canonical code increases when mistranslations are taken into account in the two models, as indicated by the fact that the best possible codes show the patterns of the
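
The genetic-algorithm idea can be illustrated on a drastically reduced toy problem: assign hydrophobicity-like values to 3-bit "codons" and evolve assignments that minimize the mean squared property change over all single-point mutations. The 8-codon alphabet, property values, and GA settings are all illustrative inventions, far smaller than the real 64-codon table studied in the paper.

```python
# Toy GA for code-optimality search: a "code" is an assignment of
# amino-acid-like property values to 3-bit codons; fitness is the mean
# squared property change over all single-bit (point) mutations.
import random

PROPS = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]  # toy hydrophobicities

def error_cost(code):
    """Mean squared property change over all single-bit codon mutations."""
    total, count = 0.0, 0
    for codon in range(8):
        for bit in (1, 2, 4):
            neighbor = codon ^ bit  # one point mutation away
            total += (code[codon] - code[neighbor]) ** 2
            count += 1
    return total / count

def genetic_search(generations=200, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [rng.sample(PROPS, len(PROPS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error_cost)
        survivors = pop[:pop_size // 2]      # elitist selection
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(child)), 2)  # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=error_cost)

best = genetic_search()
print(error_cost(best))
```

How quickly the GA closes the gap to low-cost assignments is the kind of "difficulty of finding alternative codes" signal the paper uses to situate the canonical code in the fitness landscape.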

  19. Simulated evolution applied to study the genetic code optimality using a model of codon reassignments

    Directory of Open Access Journals (Sweden)

    Monteagudo Ángel

    2011-02-01

    Full Text Available Abstract Background As the canonical code is not universal, different theories about its origin and organization have appeared. The optimization or level of adaptation of the canonical genetic code was measured taking into account the harmful consequences resulting from point mutations leading to the replacement of one amino acid by another. There are two basic theories to measure the level of optimization: the statistical approach, which compares the canonical genetic code with many randomly generated alternative ones, and the engineering approach, which compares the canonical code with the best possible alternative. Results Here we used a genetic algorithm to search for better-adapted hypothetical codes and as a method to gauge the difficulty of finding such alternative codes, allowing us to clearly situate the canonical code in the fitness landscape. This novel use of evolutionary computing provides a new perspective in the open debate between the statistical approach, which postulates that the genetic code conserves amino acid properties far better than expected from a random code, and the engineering approach, which tends to indicate that the canonical genetic code is still far from optimal. We used two models of hypothetical codes: one that reflects the known examples of codon reassignment, and the model most used in the two approaches, which reflects the current genetic code translation table. Although the standard code is far from a possible optimum considering both models, when the more realistic model of codon reassignments was used, the evolutionary algorithm had more difficulty overcoming the efficiency of the canonical genetic code. Conclusions Simulated evolution clearly reveals that the canonical genetic code is far from optimal in this respect. Nevertheless, the efficiency of the canonical code increases when mistranslations are taken into account with the two models, as indicated by the
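
    As an editor's illustration of the two approaches, the sketch below scores a code by the mean squared change of an amino-acid property over all single-nucleotide substitutions (Kyte-Doolittle hydropathy stands in for the property actually used), and runs a small elitist genetic algorithm over amino-acid permutations of the canonical codon blocks. It is a toy reconstruction under stated assumptions, not the authors' implementation.

```python
import random

BASES = "TCAG"
# Standard translation table, codons ordered T,C,A,G per position.
AAS = ("FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRR"
       "IIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG")
CODONS = [a + b + c for a in BASES for b in BASES for c in BASES]
CANONICAL = dict(zip(CODONS, AAS))

# Kyte-Doolittle hydropathy as an illustrative amino-acid property.
HYDRO = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5,
         "Q": -3.5, "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5,
         "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8, "P": -1.6,
         "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2}

def cost(code):
    """Mean squared hydropathy change over all 1-nt substitutions."""
    total, count = 0.0, 0
    for codon in CODONS:
        for pos in range(3):
            for b in BASES:
                if b == codon[pos]:
                    continue
                mut = codon[:pos] + b + codon[pos + 1:]
                a1, a2 = code[codon], code[mut]
                if a1 != "*" and a2 != "*":      # ignore stop codons
                    total += (HYDRO[a1] - HYDRO[a2]) ** 2
                    count += 1
    return total / count

def apply_perm(perm):
    """Reassign amino acids among the canonical codon blocks."""
    mapping = dict(zip(sorted(HYDRO), perm))
    mapping["*"] = "*"
    return {c: mapping[a] for c, a in CANONICAL.items()}

def evolve(pop_size=30, gens=40, seed=7):
    rng = random.Random(seed)
    aas = sorted(HYDRO)
    # Seed the canonical assignment into the population (identity perm).
    pop = [aas[:]] + [rng.sample(aas, 20) for _ in range(pop_size - 1)]
    best = min(pop, key=lambda p: cost(apply_perm(p)))
    for _ in range(gens):
        nxt = [best]                              # elitism
        while len(nxt) < pop_size:
            parent = min(rng.sample(pop, 3),      # tournament selection
                         key=lambda p: cost(apply_perm(p)))
            child = parent[:]
            i, j = rng.sample(range(20), 2)       # swap mutation
            child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
        best = min(pop, key=lambda p: cost(apply_perm(p)))
    return cost(apply_perm(best)), cost(CANONICAL)
```

    Because the canonical assignment is seeded into the population and elitism is used, the returned best cost never exceeds the canonical cost; how far it drops below it is the "engineering" question discussed above.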

  20. Optimizing diabetes management: managed care strategies.

    Science.gov (United States)

    Tzeel, E Albert

    2013-06-01

    Both the prevalence of type 2 diabetes mellitus (DM) and its associated costs have been rising over time and are projected to continue to escalate. Therefore, type 2 DM (T2DM) management costs represent a potentially untenable strain on the healthcare system unless substantial, systemic changes are made. Managed care organizations (MCOs) are uniquely positioned to attempt to make the changes necessary to reduce the burdens associated with T2DM by developing policies that align with evidence-based DM management guidelines and other resources. For example, MCOs can encourage members to implement healthy lifestyle choices, which have been shown to reduce DM-associated mortality and delay comorbidities. In addition, MCOs are exploring the strengths and weaknesses of several different benefit plan designs. Value-based insurance designs, sometimes referred to as value-based benefit designs, use both direct and indirect data to invest in incentives that change behaviors through health information technologies, communications, and services to improve health, productivity, quality, and financial trends. Provider incentive programs, sometimes referred to as "pay for performance," represent a payment/delivery paradigm that places emphasis on rewarding value instead of volume to align financial incentives and quality of care. Accountable care organizations emphasize an alignment between reimbursement and implementation of best practices through the use of disease management and/or clinical pathways and health information technologies. Consumer-directed health plans, or high-deductible health plans, combine lower premiums with high annual deductibles to encourage members to seek better value for health expenditures. Studies conducted to date on these different designs have produced mixed results.

  1. ASME nuclear codes and standards risk management strategic planning

    International Nuclear Information System (INIS)

    Hill, Ralph S. III; Balkey, Kenneth R.; Erler, Bryan A.; Wesley Rowley, C.

    2007-01-01

    This paper is prepared in honor and in memory of the late Professor Emeritus Yasuhide Asada to recognize his contributions to ASME Nuclear Codes and Standards initiatives, particularly those related to risk-informed technology and System Based Code developments. For nearly two decades, numerous risk-informed initiatives have been completed or are under development within the ASME Nuclear Codes and Standards organization. In order to properly manage the numerous initiatives currently underway or planned for the future, the ASME Board on Nuclear Codes and Standards (BNCS) has an established Risk Management Strategic Plan (Plan) that is maintained and updated by the ASME BNCS Risk Management Task Group. This paper presents the latest approved version of the plan beginning with a background of applications completed to date, including the recent probabilistic risk assessment (PRA) standards developments for nuclear power plant applications. The paper discusses planned applications within ASME Nuclear Codes and Standards that will require expansion of the ASME PRA Standard to support new advanced light water reactor and next generation reactor developments, such as for high temperature gas-cooled reactors. Emerging regulatory developments related to risk-informed, performance-based approaches are summarized. A long-term vision for the potential development and evolution to a nuclear systems code that adopts a risk-informed approach across a facility life-cycle (design, construction, operation, maintenance, and closure) is also summarized. Finally, near term and long term actions are defined across the ASME Nuclear Codes and Standards organizations related to risk management, including related U.S. regulatory activities. (author)

  2. RAID-6 reed-solomon codes with asymptotically optimal arithmetic complexities

    KAUST Repository

    Lin, Sian-Jheng; Alloum, Amira; Al-Naffouri, Tareq Y.

    2016-01-01

    We present a configuration of the factors of the second-parity formula, such that the arithmetic complexity can reach the optimal complexity bound when the code length approaches infinity. In the proposed approach, the intermediate data used for the first
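
    The second parity referred to here is the RAID-6 "Q" syndrome: P is a plain XOR of the data blocks, while Q weights block i by g^i in GF(2^8). The sketch below is the minimal textbook construction (generator g = 2, polynomial 0x11d), not the paper's optimized factorization:

```python
def gf_mul(a, b, poly=0x11d):
    """Carry-less 'Russian peasant' multiply in GF(2^8)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:           # reduce modulo the field polynomial
            a ^= poly
        b >>= 1
    return r

def gf_pow(a, n):
    r = 1
    for _ in range(n):
        r = gf_mul(r, a)
    return r

def gf_inv(a):
    """a^254 = a^-1 in GF(2^8) (multiplicative group has order 255)."""
    return gf_pow(a, 254)

def pq_syndromes(data):
    """P = XOR of blocks, Q = XOR of g^i * D_i with generator g = 2."""
    p = q = 0
    for i, d in enumerate(data):
        p ^= d
        q ^= gf_mul(gf_pow(2, i), d)
    return p, q

def recover_with_q(data, lost, q):
    """Rebuild data[lost] when both that block and P are unavailable."""
    partial = 0
    for i, d in enumerate(data):
        if i != lost:
            partial ^= gf_mul(gf_pow(2, i), d)
    # q ^ partial == g^lost * D_lost, so divide by g^lost.
    return gf_mul(q ^ partial, gf_inv(gf_pow(2, lost)))
```

    With data blocks [0x12, 0x34, 0x56, 0x78], recover_with_q(data, 2, q) rebuilds 0x56 even when both that block and P are lost; in a real array the same arithmetic is applied byte-by-byte across each stripe.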

  3. ASME nuclear codes and standards risk management strategic plan

    International Nuclear Information System (INIS)

    Balkey, Kenneth R.

    2003-01-01

    Over the past 15 years, several risk-informed initiatives have been completed or are under development within the ASME Nuclear Codes and Standards organization. In order to better manage the numerous initiatives in the future, the ASME Board on Nuclear Codes and Standards has recently developed and approved a Risk Management Strategic Plan. This paper presents the latest approved version of the plan beginning with a background of applications completed to date, including the recent issuance of the ASME Standard for Probabilistic Risk Assessment (PRA) for Nuclear Power Plant Applications. The paper discusses potential applications within ASME Nuclear Codes and Standards that may require expansion of the PRA Standard, such as for new generation reactors, or the development of new PRA Standards. A long-term vision for the potential development and evolution to a nuclear systems code that adopts a risk-informed approach across a facility life-cycle (design, construction, operation, maintenance, and closure) is summarized. Finally, near term and long term actions are defined across the ASME Nuclear Codes and Standards organizations related to risk management, and related U.S. regulatory activities are also summarized. (author)

  4. PACC information management code for common cause failures analysis

    International Nuclear Information System (INIS)

    Ortega Prieto, P.; Garcia Gay, J.; Mira McWilliams, J.

    1987-01-01

    The purpose of this paper is to present the PACC code, which, through adequate data management, makes the task of computerized common-mode failure analysis easier. PACC processes and generates information in order to carry out the corresponding qualitative analysis, by means of the Boolean technique of transformation of variables, and the quantitative analysis, either using one of several parametric methods or a direct database. As far as the qualitative analysis is concerned, the code creates several functional forms for the transformation equations according to the user's choice. These equations are subsequently processed by Boolean manipulation codes, such as SETS. The quantitative calculations of the code can be carried out in two different ways: either starting from a common cause database, or through parametric methods, such as the Binomial Failure Rate Method, the Basic Parameters Method or the Multiple Greek Letter Method, among others. (orig.)
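
    Of the parametric methods named above, the beta-factor model (the simplest relative of the Multiple Greek Letter method) is easy to sketch: each component's failure probability is split into an independent part and a common-cause part. The numbers below are purely illustrative:

```python
def beta_factor_split(q_total, beta):
    """Split a component failure probability into independent and CCF parts."""
    q_indep = (1.0 - beta) * q_total   # component fails on its own
    q_ccf = beta * q_total             # fails together with its redundant twins
    return q_indep, q_ccf

def two_train_failure(q_total, beta):
    """P(both redundant trains fail) under the beta-factor model."""
    q_i, q_c = beta_factor_split(q_total, beta)
    return q_i ** 2 + q_c              # independent coincidence + common cause

no_ccf = two_train_failure(1e-3, 0.0)    # pure redundancy benefit
with_ccf = two_train_failure(1e-3, 0.1)  # common cause dominates
```

    Even a modest beta of 0.1 raises the two-train failure probability by roughly two orders of magnitude over the independent case, which is why codes like PACC make these parametric estimates explicit.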

  5. Optimal management of orthodontic pain.

    Science.gov (United States)

    Topolski, Francielle; Moro, Alexandre; Correr, Gisele Maria; Schimim, Sasha Cristina

    2018-01-01

    Pain is an undesirable side effect of orthodontic tooth movement, which causes many patients to give up orthodontic treatment or avoid it altogether. The aim of this study was to investigate, through an analysis of the scientific literature, the best method for managing orthodontic pain. The methodological aspects involved careful definition of keywords and a diligent search in databases of scientific articles published in the English language, without any restriction of publication date. We recovered 1281 articles. After the filtering and classification of these articles, 56 randomized clinical trials were selected. Of these, 19 evaluated the effects of different types of drugs for the control of orthodontic pain, 16 evaluated the effects of low-level laser therapy on orthodontic pain, and 21 evaluated other methods of pain control. Drugs reported as effective in orthodontic pain control included ibuprofen, paracetamol, naproxen sodium, aspirin, etoricoxib, meloxicam, piroxicam, and tenoxicam. Most studies report favorable outcomes in terms of alleviation of orthodontic pain with the use of low-level laser therapy. Nevertheless, we noticed that there is no consensus, for either drugs or laser therapy, on the doses and clinical protocols most appropriate for orthodontic pain management. Alternative methods for orthodontic pain control can also broaden the clinician's range of options in the search for better patient care.

  6. Optimization of multi-phase compressible lattice Boltzmann codes on massively parallel multi-core systems

    NARCIS (Netherlands)

    Biferale, L.; Mantovani, F.; Pivanti, M.; Pozzati, F.; Sbragaglia, M.; Schifano, S.F.; Toschi, F.; Tripiccione, R.

    2011-01-01

    We develop a Lattice Boltzmann code for computational fluid-dynamics and optimize it for massively parallel systems based on multi-core processors. Our code describes 2D multi-phase compressible flows. We analyze the performance bottlenecks that we find as we gradually expose a larger fraction of

  7. PlayNCool: Opportunistic Network Coding for Local Optimization of Routing in Wireless Mesh Networks

    DEFF Research Database (Denmark)

    Pahlevani, Peyman; Roetter, Daniel Enrique Lucani; Pedersen, Morten Videbæk

    2013-01-01

    This paper introduces PlayNCool, an opportunistic protocol with local optimization based on network coding to increase the throughput of a wireless mesh network (WMN). PlayNCool aims to enhance current routing protocols by (i) allowing random linear network coding transmissions end-to-end, (ii) r...

  8. Supply chain management and optimization in manufacturing

    CERN Document Server

    Pirim, Harun; Yilbas, Bekir Sami

    2014-01-01

    This book introduces general supply chain terminology, particularly for novice readers, along with state-of-the-art supply chain management and optimization issues and problems in manufacturing. The book provides insights for making supply chain decisions, planning, and scheduling through the supply chain network. It introduces optimization problems faced throughout the supply chain network, e.g., transportation of raw materials and products, facility location, and inventory of plants, warehouses, and retailers.

  9. Optimal management of orthodontic pain

    Directory of Open Access Journals (Sweden)

    Topolski F

    2018-03-01

    Full Text Available Francielle Topolski,1 Alexandre Moro,1,2 Gisele Maria Correr,3 Sasha Cristina Schimim1 1Department of Orthodontics, Positivo University, Curitiba, Paraná, Brazil; 2Department of Orthodontics, Federal University of Paraná, Curitiba, Paraná, Brazil; 3Department of Restorative Dentistry, Positivo University, Curitiba, Paraná, Brazil Abstract: Pain is an undesirable side effect of orthodontic tooth movement, which causes many patients to give up orthodontic treatment or avoid it altogether. The aim of this study was to investigate, through an analysis of the scientific literature, the best method for managing orthodontic pain. The methodological aspects involved careful definition of keywords and a diligent search in databases of scientific articles published in the English language, without any restriction of publication date. We recovered 1281 articles. After the filtering and classification of these articles, 56 randomized clinical trials were selected. Of these, 19 evaluated the effects of different types of drugs for the control of orthodontic pain, 16 evaluated the effects of low-level laser therapy on orthodontic pain, and 21 evaluated other methods of pain control. Drugs reported as effective in orthodontic pain control included ibuprofen, paracetamol, naproxen sodium, aspirin, etoricoxib, meloxicam, piroxicam, and tenoxicam. Most studies report favorable outcomes in terms of alleviation of orthodontic pain with the use of low-level laser therapy. Nevertheless, we noticed that there is no consensus, for either drugs or laser therapy, on the doses and clinical protocols most appropriate for orthodontic pain management. Alternative methods for orthodontic pain control can also broaden the clinician’s range of options in the search for better patient care. Keywords: tooth movement, pain control, drug therapy, laser therapy

  10. Optimal Management of Hydropower Systems

    Science.gov (United States)

    Bensalem, A.; Cherif, F.; Bennagoune, S.; Benbouza, M. S.; El-Maouhab, A.

    In this study we propose a new model for the short-term management of water reservoirs with variable head. The water stored in these reservoirs is used to produce electrical energy. The proposed model enhances the value of water by taking into account its location in each reservoir and its head. Water released from an upper reservoir to produce electrical energy is reused in the lower reservoirs to produce electrical energy as well. On the other hand, the amount of water flow necessary to produce a given amount of electrical energy decreases as the head increases. Thus, the objective function is expressed in terms of the potential energy of the water stored in all reservoirs. To analyze this model, we developed an algorithm based on the discrete maximum principle. To solve the resulting equations, an iterative method based on the gradient method is used, and the constraints are satisfied using the augmented Lagrangian method.
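
    The gradient-plus-augmented-Lagrangian recipe can be miniaturized to a single reservoir: maximize total energy (release times head) subject to a terminal-storage constraint, with finite-difference gradients standing in for the adjoint equations of the maximum principle. All quantities and numbers below are illustrative, not from the paper:

```python
def simulate(q, s0=10.0, inflow=(2.0, 2.0, 2.0, 2.0)):
    """Roll storage forward; return total energy and terminal storage."""
    s, energy = s0, 0.0
    for qt, wt in zip(q, inflow):
        energy += qt * (1.0 + 0.1 * s)    # head grows with stored water
        s += wt - qt                       # water balance
    return energy, s

def augmented_lagrangian(q, lam, mu, s_target=8.0):
    energy, s_end = simulate(q)
    c = s_end - s_target                   # equality constraint c = 0
    return energy - lam * c - 0.5 * mu * c * c

def optimize(T=4, q_max=5.0, s_target=8.0):
    q = [1.0] * T
    lam, mu = 0.0, 1.0
    for _ in range(6):                     # outer multiplier updates
        for _ in range(300):               # inner projected gradient ascent
            grad = []
            for t in range(T):             # finite-difference gradient
                qp = q[:]
                qp[t] += 1e-6
                grad.append((augmented_lagrangian(qp, lam, mu)
                             - augmented_lagrangian(q, lam, mu)) / 1e-6)
            q = [min(q_max, max(0.0, qt + 0.01 * g))   # project onto bounds
                 for qt, g in zip(q, grad)]
        _, s_end = simulate(q)
        lam += mu * (s_end - s_target)     # Lagrange multiplier update
        mu *= 2.0                          # tighten the penalty
    return q, simulate(q)
```

    The outer loop is the augmented Lagrangian iteration; the inner loop is the projected gradient method the abstract describes, here over four release periods.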

  11. Optimal quantum error correcting codes from absolutely maximally entangled states

    Science.gov (United States)

    Raissi, Zahra; Gogolin, Christian; Riera, Arnau; Acín, Antonio

    2018-02-01

    Absolutely maximally entangled (AME) states are pure multipartite generalizations of the bipartite maximally entangled states with the property that all reduced states of at most half the system size are in the maximally mixed state. AME states are of interest for multipartite teleportation and quantum secret sharing and have recently found new applications in the context of high-energy physics in toy models realizing the AdS/CFT-correspondence. We work out in detail the connection between AME states of minimal support and classical maximum distance separable (MDS) error correcting codes and, in particular, provide explicit closed-form expressions for AME states of n parties with local dimension
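
    The classical MDS codes invoked here are exactly the codes meeting the Singleton bound d = n - k + 1. A brute-force check on a toy Reed-Solomon code over GF(7) (parameters chosen purely for illustration) makes the connection concrete:

```python
from itertools import product

P = 7                              # prime field GF(7)
N, K = 6, 3                        # code length and dimension
POINTS = list(range(1, N + 1))     # distinct evaluation points in GF(7)

def encode(msg):
    """Evaluate the degree-<K message polynomial at the N points."""
    return [sum(m * pow(x, i, P) for i, m in enumerate(msg)) % P
            for x in POINTS]

def minimum_distance():
    """Brute-force minimum Hamming weight over all nonzero codewords."""
    d = N
    for msg in product(range(P), repeat=K):
        if any(msg):
            weight = sum(1 for c in encode(msg) if c != 0)
            d = min(d, weight)
    return d
```

    For these parameters the brute-force minimum distance comes out to 4 = n - k + 1, i.e. the code is MDS: a nonzero polynomial of degree less than k can vanish at no more than k - 1 of the n points.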

  12. Transoptr-a second order beam transport design code with automatic internal optimization and general constraints

    International Nuclear Information System (INIS)

    Heighway, E.A.

    1980-07-01

    A second-order beam transport design code with parametric optimization is described. The code analyzes the transport of charged particle beams through a user-defined magnet system. The magnet system parameters are varied (within user-defined limits) until the properties of the transported beam and/or the system transport matrix match those properties requested by the user. The code uses matrix formalism to represent the transport elements, and optimization is achieved using the variable metric method. Any constraints that can be expressed algebraically may be included by the user as part of the design. Instruction in the use of the program is given. (auth)
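
    The code's working principle, varying element parameters until the transport matrix satisfies a user constraint, can be sketched to first order with 2x2 matrices. TRANSOPTR itself works to second order and uses the variable metric method; the bisection and thin-lens system below are purely illustrative:

```python
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def drift(length):
    """Field-free drift of the given length."""
    return [[1.0, length], [0.0, 1.0]]

def thin_lens(f):
    """Thin-lens (quadrupole) kick with focal length f."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def r12(f, d1=2.0, d2=3.0):
    """R12 of drift-lens-drift; zero means point-to-point imaging."""
    m = matmul(drift(d2), matmul(thin_lens(f), drift(d1)))
    return m[0][1]

def tune_focal_length(lo=0.5, hi=2.0, tol=1e-10):
    """Bisect on f until the imaging constraint R12(f) = 0 is met."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if r12(lo) * r12(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

    For d1 = 2 and d2 = 3 the search converges to the analytic thin-lens value f = d1*d2/(d1 + d2) = 1.2, the same "vary parameters until the matrix matches" loop the abstract describes, in miniature.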

  13. Nexus: A modular workflow management system for quantum simulation codes

    Science.gov (United States)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
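
    The dependency-ordered execution at the heart of such a workflow manager reduces to a topological sort over the task graph. The sketch below is a hypothetical miniature (none of Nexus's actual API) using Kahn's algorithm, with each task receiving the results of its upstream dependencies:

```python
from collections import deque

def run_workflow(tasks, deps):
    """tasks: name -> callable(results); deps: name -> list of dependencies."""
    indegree = {t: len(deps.get(t, ())) for t in tasks}
    dependents = {t: [] for t in tasks}
    for t, ds in deps.items():
        for d in ds:
            dependents[d].append(t)
    ready = deque(t for t, n in indegree.items() if n == 0)
    results, order = {}, []
    while ready:
        t = ready.popleft()
        results[t] = tasks[t](results)     # run with upstream outputs available
        order.append(t)
        for u in dependents[t]:
            indegree[u] -= 1
            if indegree[u] == 0:
                ready.append(u)
    if len(order) != len(tasks):
        raise ValueError("cycle in workflow dependencies")
    return results, order
```

    A relax -> scf -> dos chain, for instance, is expressed as deps = {"scf": ["relax"], "dos": ["scf"]}; a production system like Nexus adds job submission, file staging, and restart detection on top of exactly this ordering.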

  14. Characterization and Optimization of LDPC Codes for the 2-User Gaussian Multiple Access Channel

    Directory of Open Access Journals (Sweden)

    Declercq David

    2007-01-01

    Full Text Available We address the problem of designing good LDPC codes for the Gaussian multiple access channel (MAC. The framework we choose is to design multiuser LDPC codes with joint belief propagation decoding on the joint graph of the 2-user case. Our main result compared to existing work is to express analytically EXIT functions of the multiuser decoder with two different approximations of the density evolution. This allows us to propose a very simple linear programming optimization for the complicated problem of LDPC code design with joint multiuser decoding. The stability condition for our case is derived and used in the optimization constraints. The codes that we obtain for the 2-user case are quite good for various rates, especially if we consider the very simple optimization procedure.

  15. Optimizing fusion PIC code performance at scale on Cori Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    Koskela, T. S.; Deslippe, J.

    2017-07-23

    In this paper we present the results of optimizing the performance of the gyrokinetic full-f fusion PIC code XGC1 on the Cori Phase Two Knights Landing system. The code has undergone substantial development to enable the use of vector instructions in its most expensive kernels within the NERSC Exascale Science Applications Program. We study the single-node performance of the code on an absolute scale using the roofline methodology to guide optimization efforts. We have obtained 2x speedups in single node performance due to enabling vectorization and performing memory layout optimizations. On multiple nodes, the code is shown to scale well up to 4000 nodes, near half the size of the machine. We discuss some communication bottlenecks that were identified and resolved during the work.

  16. Product code optimization for determinate state LDPC decoding in robust image transmission.

    Science.gov (United States)

    Thomos, Nikolaos; Boulgouris, Nikolaos V; Strintzis, Michael G

    2006-08-01

    We propose a novel scheme for error-resilient image transmission. The proposed scheme employs a product coder consisting of low-density parity check (LDPC) codes and Reed-Solomon codes in order to deal effectively with bit errors. The efficiency of the proposed scheme is based on the exploitation of determinate symbols in Tanner graph decoding of LDPC codes and a novel product code optimization technique based on error estimation. Experimental evaluation demonstrates the superiority of the proposed system in comparison to recent state-of-the-art techniques for image transmission.

  17. Quo vadis code optimization in high energy physics

    International Nuclear Information System (INIS)

    Jarp, S.

    1994-01-01

    Although performance tuning and optimization can be considered less critical than in the past, there are still many High Energy Physics (HEP) applications and application domains that can profit from such an undertaking. In CERN's CORE (Centrally Operated RISC Environment) where all major RISC vendors are present, this implies an understanding of the various computer architectures, instruction sets and performance analysis tools from each of these vendors. This paper discusses some initial observations after having evaluated the situation and makes some recommendations for further progress

  18. Small cell networks deployment, management, and optimization

    CERN Document Server

    Claussen, Holger; Ho, Lester; Razavi, Rouzbeh; Kucera, Stepan

    2018-01-01

    Small Cell Networks: Deployment, Management, and Optimization addresses key problems of the cellular network evolution towards HetNets. It focuses on the latest developments in heterogeneous and small cell networks, as well as their deployment, operation, and maintenance. It also covers the full spectrum of the topic, from academic, research, and business to the practice of HetNets in a coherent manner. Additionally, it provides complete and practical guidelines to vendors and operators interested in deploying small cells. The first comprehensive book written by well-known researchers and engineers from Nokia Bell Labs, Small Cell Networks begins with an introduction to the subject--offering chapters on capacity scaling and key requirements of future networks. It then moves on to sections on coverage and capacity optimization, and interference management. From there, the book covers mobility management, energy efficiency, and small cell deployment, ending with a section devoted to future trends and applicat...

  19. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, J; Zhang, Qi; Fitzek, F H P

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant reduction in the number of transmitted packets can be achieved. However, NC introduces additional computations and potentially a non-negligible transmission overhead, both of which depend on the chosen coding parameters. Therefore it is necessary to consider the trade-off that these coding parameters present in order to obtain the lowest energy consumption per transmitted bit. This problem is analyzed and suitable coding parameters are determined for the popular Tmote Sky platform. Compared to the use of traditional RLNC, these parameters enable a reduction in the energy spent per bit which grows...
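
    The trade-off analyzed here, coding parameters versus computation and transmission overhead, can be exercised with a toy RLNC round trip over GF(2), where the generation size g is exactly one of the parameters being tuned. This is an illustrative sketch, not the Rateless Deluge implementation:

```python
import random

def rlnc_roundtrip(g=8, payload_bits=32, seed=3):
    """Encode g packets as random GF(2) combinations; decode by elimination."""
    rng = random.Random(seed)
    packets = [rng.getrandbits(payload_bits) for _ in range(g)]
    rows = {}                                # pivot bit -> (coeffs, payload)
    received = 0
    while len(rows) < g:
        coeffs = rng.getrandbits(g)          # random coding vector
        payload = 0
        for i in range(g):                   # XOR of the selected packets
            if (coeffs >> i) & 1:
                payload ^= packets[i]
        received += 1
        while coeffs:                        # forward elimination
            pivot = coeffs.bit_length() - 1
            if pivot not in rows:
                rows[pivot] = (coeffs, payload)
                break
            coeffs ^= rows[pivot][0]
            payload ^= rows[pivot][1]
    for p in sorted(rows):                   # back-substitution
        cp, pp = rows[p]
        for q in rows:
            if q > p and (rows[q][0] >> p) & 1:
                rows[q] = (rows[q][0] ^ cp, rows[q][1] ^ pp)
    decoded = [rows[i][1] for i in range(g)]
    overhead = received - g                  # linearly dependent receptions
    return packets, decoded, overhead
```

    Over GF(2) some received combinations are linearly dependent, so `overhead` is typically a packet or two; a larger field shrinks that overhead but raises the per-symbol computation cost, which is precisely the energy trade-off the paper quantifies for the Tmote Sky.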

  20. Demonstration of Automatically-Generated Adjoint Code for Use in Aerodynamic Shape Optimization

    Science.gov (United States)

    Green, Lawrence; Carle, Alan; Fagan, Mike

    1999-01-01

    Gradient-based optimization requires accurate derivatives of the objective function and constraints. These gradients may have previously been obtained by manual differentiation of analysis codes, symbolic manipulators, finite-difference approximations, or existing automatic differentiation (AD) tools such as ADIFOR (Automatic Differentiation in FORTRAN). Each of these methods has certain deficiencies, particularly when applied to complex, coupled analyses with many design variables. Recently, a new AD tool called ADJIFOR (Automatic Adjoint Generation in FORTRAN), based upon ADIFOR, was developed and demonstrated. Whereas ADIFOR implements forward-mode (direct) differentiation throughout an analysis program to obtain exact derivatives via the chain rule of calculus, ADJIFOR implements the reverse-mode counterpart of the chain rule to obtain exact adjoint form derivatives from FORTRAN code. Automatically-generated adjoint versions of the widely-used CFL3D computational fluid dynamics (CFD) code and an algebraic wing grid generation code were obtained with just a few hours processing time using the ADJIFOR tool. The codes were verified for accuracy and were shown to compute the exact gradient of the wing lift-to-drag ratio, with respect to any number of shape parameters, in about the time required for 7 to 20 function evaluations. The codes have now been executed on various computers with typical memory and disk space for problems with up to 129 x 65 x 33 grid points, and for hundreds to thousands of independent variables. These adjoint codes are now used in a gradient-based aerodynamic shape optimization problem for a swept, tapered wing. For each design iteration, the optimization package constructs an approximate, linear optimization problem, based upon the current objective function, constraints, and gradient values. The optimizer subroutines are called within a design loop employing the approximate linear problem until an optimum shape is found, the design loop
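
    The reverse-mode chain rule that ADJIFOR applies to FORTRAN analysis codes can be seen in miniature in a tape-based scalar autodiff. The sketch below is illustrative only and unrelated to ADIFOR's generated code: each operation records its local partial derivatives on the way forward, and a single reverse sweep accumulates the adjoints:

```python
import math

class Var:
    """A scalar that records its local derivatives for the reverse sweep."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents    # pairs of (parent_var, d(self)/d(parent))
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

def sin(x):
    return Var(math.sin(x.value), ((x, math.cos(x.value)),))

def backward(out):
    """Reverse sweep: accumulate adjoints in reverse topological order."""
    order, seen = [], set()
    def visit(v):
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                visit(p)
            order.append(v)
    visit(out)
    out.grad = 1.0
    for v in reversed(order):
        for p, d in v.parents:
            p.grad += v.grad * d

x, y = Var(0.5), Var(2.0)
f = x * y + sin(x)        # adjoints: df/dx = y + cos(x), df/dy = x
backward(f)
```

    As with ADJIFOR, one reverse sweep yields the gradient with respect to every input at once, which is why the adjoint CFL3D could deliver derivatives for thousands of shape parameters at the cost of a handful of function evaluations.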

  1. Improvement of JRR-4 core management code system

    International Nuclear Information System (INIS)

    Izumo, H.; Watanabe, S.; Nagatomi, H.; Hori, N.

    2000-01-01

    In the modification of JRR-4, the fuel was changed from 93% high-enrichment uranium aluminized fuel to 20% low-enriched uranium silicide fuel, in conformity with the framework of the reduced enrichment program for JAERI research reactors. Along with this change, the JRR-4 core management code system, which estimates the excess reactivity of the core, fuel burn-up, and so on, was also improved. It had been difficult for users to operate the former code system because of its text-based input-output. In the new code system (COMMAS-JRR), users can operate the system without composing difficult text-form input. The estimated excess reactivity of the JRR-4 LEU fuel core showed very good agreement with the measured values. The strong points of the new code system are that it can be operated simply through window-form screens on a personal workstation equipped with a graphical user interface (GUI), and that it accurately estimates the specific characteristics of the LEU core. (author)

  2. Efficacy of Code Optimization on Cache-Based Processors

    Science.gov (United States)

    VanderWijngaart, Rob F.; Saphir, William C.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    In this paper a number of techniques for improving the cache performance of a representative piece of numerical software are presented. Target machines are popular processors from several vendors: MIPS R5000 (SGI Indy), MIPS R8000 (SGI PowerChallenge), MIPS R10000 (SGI Origin), DEC Alpha EV4 + EV5 (Cray T3D & T3E), IBM RS6000 (SP Wide-node), Intel PentiumPro (Ames' Whitney), Sun UltraSparc (NERSC's NOW). The optimizations all attempt to increase the locality of memory accesses. But they meet with rather varied and often counterintuitive success on the different computing platforms. We conclude that it may be genuinely impossible to obtain portable performance on the current generation of cache-based machines. At the least, it appears that the performance of modern commodity processors cannot be described with parameters defining the cache alone.

  3. Model-Based Speech Signal Coding Using Optimized Temporal Decomposition for Storage and Broadcasting Applications

    Science.gov (United States)

    Athaudage, Chandranath R. N.; Bradley, Alan B.; Lech, Margaret

    2003-12-01

    A dynamic programming-based optimization strategy for a temporal decomposition (TD) model of speech and its application to low-rate speech coding in storage and broadcasting is presented. In previous work with the spectral stability-based event localizing (SBEL) TD algorithm, the event localization was performed based on a spectral stability criterion. Although this approach gave reasonably good results, there was no assurance on the optimality of the event locations. In the present work, we have optimized the event localizing task using a dynamic programming-based optimization strategy. Simulation results show that an improved TD model accuracy can be achieved. A methodology of incorporating the optimized TD algorithm within the standard MELP speech coder for the efficient compression of speech spectral information is also presented. The performance evaluation results revealed that the proposed speech coding scheme achieves 50%-60% compression of speech spectral information with negligible degradation in the decoded speech quality.
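
    The dynamic-programming event localization described here is, in miniature, optimal 1-D segmentation: place K - 1 boundaries so as to minimize the total within-segment squared error. The snippet below is an illustrative stand-in for the TD event optimizer (segments are modeled by their means rather than spectral targets):

```python
def segment(signal, k):
    """Split signal into k pieces minimizing within-segment squared error."""
    n = len(signal)
    s1 = [0.0] * (n + 1)            # prefix sums of values
    s2 = [0.0] * (n + 1)            # prefix sums of squares
    for i, v in enumerate(signal):
        s1[i + 1] = s1[i] + v
        s2[i + 1] = s2[i] + v * v

    def cost(i, j):                 # squared error of signal[i:j] vs its mean
        total = s1[j] - s1[i]
        return (s2[j] - s2[i]) - total * total / (j - i)

    INF = float("inf")
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    cut = [[0] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for seg in range(1, k + 1):
        for j in range(seg, n + 1):
            for i in range(seg - 1, j):
                c = dp[seg - 1][i] + cost(i, j)
                if c < dp[seg][j]:
                    dp[seg][j] = c
                    cut[seg][j] = i
    bounds, j = [], n               # backtrack the segment boundaries
    for seg in range(k, 0, -1):
        bounds.append(j)
        j = cut[seg][j]
    return dp[k][n], sorted(bounds)
```

    Because the DP examines every admissible boundary placement, it returns the globally optimal event locations, which is precisely the guarantee the spectral-stability heuristic mentioned above lacks.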

  4. Game-Theoretic Rate-Distortion-Complexity Optimization of High Efficiency Video Coding

    DEFF Research Database (Denmark)

    Ukhanova, Ann; Milani, Simone; Forchhammer, Søren

    2013-01-01

    This paper presents an algorithm for rate-distortion-complexity optimization for the emerging High Efficiency Video Coding (HEVC) standard, whose high computational requirements urge the need for low-complexity optimization algorithms. Optimization approaches need to specify different complexity profiles in order to tailor the computational load to the different hardware and power-supply resources of devices. In this work, we focus on optimizing the quantization parameter and partition depth in HEVC via a game-theoretic approach. The proposed rate control strategy alone provides 0.2 dB improvement...

  5. THE OPTIMAL CONTROL IN THE MODEL OF NETWORK SECURITY FROM MALICIOUS CODE

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available The paper deals with a mathematical model of network security. The model is described in terms of nonlinear optimal control. As the quality criterion of the control problem, the cost of the total damage inflicted by the malicious code is chosen, under an additional restriction: the number of recovered nodes is maximized. The Pontryagin maximum principle for constructing the optimal decisions is formulated. The number of switching points of the optimal control is found. The explicit form of the optimal control is given using the Lagrange multiplier method.

  6. Power Optimization of Wireless Media Systems With Space-Time Block Codes

    OpenAIRE

    Yousefi'zadeh, Homayoun; Jafarkhani, Hamid; Moshfeghi, Mehran

    2004-01-01

    We present analytical and numerical solutions to the problem of power control in wireless media systems with multiple antennas. We formulate a set of optimization problems aimed at minimizing the total power consumption of wireless media systems subject to a given level of QoS and an available bit rate. Our formulation takes into consideration the power consumption related to source coding, channel coding, and transmission over multiple transmit antennas. In our study, we consider Gauss-Markov and...

  7. Differentially Encoded LDPC Codes—Part II: General Case and Code Optimization

    Directory of Open Access Journals (Sweden)

    Li, Jing (Tiffany)

    2008-01-01

    Full Text Available This two-part series of papers studies the theory and practice of differentially encoded low-density parity-check (DE-LDPC) codes, especially in the context of noncoherent detection. Part I showed that a special class of DE-LDPC codes, product accumulate codes, perform very well with both coherent and noncoherent detection. The analysis here reveals that a conventional LDPC code, however, is not well suited to differential coding and does not, in general, deliver a desirable performance when detected noncoherently. Through extrinsic information transfer (EXIT) analysis and a modified "convergence-constraint" density evolution (DE) method developed here, we provide a characterization of the type of LDPC degree profiles that work in harmony with differential detection (or a recursive inner code in general), and demonstrate how to optimize these LDPC codes. The convergence-constraint method provides a useful extension to the conventional "threshold-constraint" method, and can match an outer LDPC code to any given inner code with the imperfectness of the inner decoder taken into consideration.
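
The density-evolution style of analysis is easiest to see on the binary erasure channel, where the recursion has a closed form. The sketch below computes the decoding threshold of a regular (3,6) LDPC ensemble by bisection; this is the textbook threshold computation, shown only to illustrate the flavor of DE, not the paper's convergence-constraint method for differential detection.

```python
# Density evolution for a regular (dv=3, dc=6) LDPC code on the binary
# erasure channel: x_{l+1} = eps * (1 - (1 - x_l)^(dc-1))^(dv-1).
# The threshold eps* is the largest erasure probability for which the
# erasure fraction x_l converges to 0.

def converges(eps, dv=3, dc=6, iters=20000, tol=1e-10):
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

lo, hi = 0.0, 1.0
for _ in range(30):               # bisect for the decoding threshold
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if converges(mid) else (lo, mid)
print(round(lo, 4))               # known (3,6) BEC threshold is about 0.4294
```

Degree-profile optimization, as in the paper, then searches over the polynomials lambda(x) and rho(x) to push this threshold toward capacity under additional convergence constraints.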

  8. Spectral-Amplitude-Coded OCDMA Optimized for a Realistic FBG Frequency Response

    Science.gov (United States)

    Penon, Julien; El-Sahn, Ziad A.; Rusch, Leslie A.; Larochelle, Sophie

    2007-05-01

    We develop a methodology for numerical optimization of the fiber Bragg grating frequency response to maximize the achievable capacity of a spectral-amplitude-coded optical code-division multiple-access (SAC-OCDMA) system. The optimal encoders are realized, and we experimentally demonstrate an incoherent SAC-OCDMA system with seven simultaneous users. We report a bit error rate (BER) of 2.7 x 10^-8 at 622 Mb/s for a fully loaded network (seven users) using a 9.6-nm optical band. We achieve error-free transmission (BER < 1 x 10^-9) for up to five simultaneous users.

  9. Methods for Distributed Optimal Energy Management

    DEFF Research Database (Denmark)

    Brehm, Robert

    The presented research deals with the fundamental underlying methods and concepts of how the growing number of distributed generation units based on renewable energy resources and distributed storage devices can be most efficiently integrated into the existing utility grid. In contrast to conventional centralised optimal energy flow management systems, the focus herein is set on how optimal energy management can be achieved in a decentralised, distributed architecture such as a multi-agent system. Distributed optimisation methods are introduced, targeting optimisation of energy flow in virtual ... -consumption of renewable energy resources in low voltage grids. It can be shown that this method prevents mutual discharging of batteries and prevents peak loads; a supervisory control instance can dictate the level of autarchy from the utility grid. Further it is shown that the problem of optimal energy flow management...
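
One building block of such energy management schemes, battery-based peak shaving, can be sketched with a deliberately simple greedy dispatch. The threshold, battery parameters, and load profile below are invented for illustration; the thesis's distributed multi-agent optimisation is not reproduced here.

```python
# Toy peak-shaving dispatch: discharge the battery when household demand
# exceeds a threshold, recharge when demand is below it. A greedy
# heuristic for illustration only -- not the distributed optimisation
# method of the thesis.

def shave_peaks(load, capacity=4.0, max_rate=1.5, threshold=2.0):
    """Return the grid load profile after battery dispatch."""
    soc = capacity                # state of charge, start full
    shaved = []
    for demand in load:
        if demand > threshold:    # discharge to cap the grid draw
            d = min(demand - threshold, max_rate, soc)
            soc -= d
            shaved.append(demand - d)
        else:                     # recharge from the grid, up to threshold
            c = min(threshold - demand, max_rate, capacity - soc)
            soc += c
            shaved.append(demand + c)
    return shaved

load = [1.0, 1.2, 2.8, 3.4, 3.0, 1.1, 0.8]   # assumed demand profile [kW]
flat = shave_peaks(load)
print(max(load), max(flat))
```

In a multi-agent setting, the coordination problem is precisely to schedule many such batteries so that they do not discharge into one another, which is the mutual-discharging effect the thesis addresses.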

  10. Genetic algorithm for the optimization of the loading pattern for reactor core fuel management

    International Nuclear Information System (INIS)

    Zhou Sheng; Hu Yongming; Zheng Wenxiang

    2000-01-01

    The paper discusses the application of a genetic algorithm to the optimization of the loading pattern for in-core fuel management with NP characteristics. The algorithm develops a matrix model for the fuel assembly loading pattern. The burnable poisons matrix was assigned randomly, considering the distributed nature of the poisons. A method based on the traveling salesman problem was used to solve the problem. An integrated code for in-core fuel management was formed by combining this code with a reactor physics code.
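
The representational point shared by such TSP-style approaches, that a loading pattern can be encoded as a permutation of assembly positions and recombined without ever producing an invalid core, can be sketched as follows. The fitness function is an invented stand-in (it rewards placing high-reactivity assemblies away from the core centre); a real code would score each pattern with a reactor physics solver.

```python
import random

# GA over a fuel loading pattern encoded as a permutation of assembly
# IDs. Order crossover (OX) guarantees every child is a valid
# permutation, i.e. a valid loading pattern. Fitness is a toy surrogate.

random.seed(0)
N = 12
reactivity = [random.random() for _ in range(N)]        # assumed data
weight = [abs(p - (N - 1) / 2) for p in range(N)]       # distance from centre

def fitness(pattern):   # higher is better: push reactivity outward
    return sum(reactivity[a] * weight[p] for p, a in enumerate(pattern))

def ox(p1, p2):         # order crossover: child is always a permutation
    i, j = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[i:j] = p1[i:j]
    rest = [a for a in p2 if a not in child]
    for k in range(N):
        if child[k] is None:
            child[k] = rest.pop(0)
    return child

pop = [random.sample(range(N), N) for _ in range(30)]
best0 = max(map(fitness, pop))
for _ in range(60):     # elitist generational loop
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [ox(random.choice(parents), random.choice(parents))
                     for _ in range(20)]
best = max(map(fitness, pop))
print(best0, best)
```

Because elitism carries the best individual forward, the best fitness is non-decreasing over generations, and OX guarantees no repair step is ever needed.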

  11. Integration of QR codes into an anesthesia information management system for resident case log management.

    Science.gov (United States)

    Avidan, Alexander; Weissman, Charles; Levin, Phillip D

    2015-04-01

    Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and was available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability/user-friendliness of such a system. Resident case logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction among residents were reassessed at three and six months. Before QR code introduction only 12/23 (52.2%) residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months, 17/26 (65.4%) and 15/25 (60.0%) residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced in an anesthesia information management system and used by most residents. QR codes can be successfully implemented into medical practice to support data transfer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. Nuclear-thermal-coupled optimization code for the fusion breeding blanket conceptual design

    International Nuclear Information System (INIS)

    Li, Jia; Jiang, Kecheng; Zhang, Xiaokang; Nie, Xingchen; Zhu, Qinjun; Liu, Songlin

    2016-01-01

    Highlights: • A nuclear-thermal-coupled predesign code has been developed for optimizing the radial build arrangement of the fusion breeding blanket. • The coupling module aims at speeding up the design process by coupling the neutronics calculation code with the thermal-hydraulic analysis code. • The radial build optimization algorithm aims at an optimal arrangement of the breeding blanket considering one or multiple specified objectives subject to design criteria such as material temperature limits and available TBR. - Abstract: The fusion breeding blanket, as one of the key in-vessel components, performs the functions of breeding tritium, removing the nuclear heat and the heat flux from the plasma chamber, and acting as part of the shielding system. The radial build design, which determines the arrangement of the functional zones and material properties in the radial direction, is the basis of the detailed design of the fusion breeding blanket. To facilitate the radial build design, this study develops a pre-design code that optimizes the radial build of the blanket while considering nuclear and thermal-hydraulic performance simultaneously. Two main features of this code are: (1) coupling of the neutronics analysis with the thermal-hydraulic analysis to speed up the analysis process; (2) a preliminary optimization algorithm using one or multiple specified objectives subject to design criteria, in the form of constraints imposed on design variables and performance parameters within feasible engineering ranges. This pre-design code has been applied to the conceptual design of the water-cooled ceramic breeding blanket in the China Fusion Engineering Testing Reactor (CFETR) project.
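
The shape of such a constrained radial-build search can be sketched with a toy one-variable version: choose the breeder-zone thickness that maximizes tritium breeding ratio (TBR) subject to a temperature limit. The TBR and temperature formulas below are invented surrogates; the actual code obtains these quantities from coupled neutronics and thermal-hydraulic solvers.

```python
import math

# Toy radial-build optimisation: maximise TBR over breeder thickness
# subject to a peak-temperature constraint. Both response surfaces are
# invented surrogates for illustration only.

T_LIMIT = 900.0   # assumed material temperature limit [degC]
TBR_MIN = 1.1     # assumed minimum acceptable TBR

def tbr(thickness_cm):               # saturating surrogate: thicker breeds more
    return 1.4 * (1.0 - math.exp(-thickness_cm / 20.0))

def peak_temperature(thickness_cm):  # thicker zone is harder to cool
    return 500.0 + 6.0 * thickness_cm

feasible = [t for t in range(5, 81)
            if tbr(t) >= TBR_MIN and peak_temperature(t) <= T_LIMIT]
best = max(feasible, key=tbr)
print(best, round(tbr(best), 3), peak_temperature(best))
```

With real solvers in the loop, each feasibility check is expensive, which is why the paper couples the two analyses and uses a dedicated optimization algorithm instead of exhaustive search.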

  13. Nuclear-thermal-coupled optimization code for the fusion breeding blanket conceptual design

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jia, E-mail: lijia@ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230027, Anhui (China); Jiang, Kecheng; Zhang, Xiaokang [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031, Anhui (China); Nie, Xingchen [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230027, Anhui (China); Zhu, Qinjun; Liu, Songlin [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031, Anhui (China)

    2016-12-15

    Highlights: • A nuclear-thermal-coupled predesign code has been developed for optimizing the radial build arrangement of the fusion breeding blanket. • The coupling module aims at speeding up the design process by coupling the neutronics calculation code with the thermal-hydraulic analysis code. • The radial build optimization algorithm aims at an optimal arrangement of the breeding blanket considering one or multiple specified objectives subject to design criteria such as material temperature limits and available TBR. - Abstract: The fusion breeding blanket, as one of the key in-vessel components, performs the functions of breeding tritium, removing the nuclear heat and the heat flux from the plasma chamber, and acting as part of the shielding system. The radial build design, which determines the arrangement of the functional zones and material properties in the radial direction, is the basis of the detailed design of the fusion breeding blanket. To facilitate the radial build design, this study develops a pre-design code that optimizes the radial build of the blanket while considering nuclear and thermal-hydraulic performance simultaneously. Two main features of this code are: (1) coupling of the neutronics analysis with the thermal-hydraulic analysis to speed up the analysis process; (2) a preliminary optimization algorithm using one or multiple specified objectives subject to design criteria, in the form of constraints imposed on design variables and performance parameters within feasible engineering ranges. This pre-design code has been applied to the conceptual design of the water-cooled ceramic breeding blanket in the China Fusion Engineering Testing Reactor (CFETR) project.

  14. IBFAN Africa training initiatives: code implementation and lactation management.

    Science.gov (United States)

    Mbuli, A

    1994-01-01

    As part of an ongoing effort to halt the decline of breast feeding rates in Africa, 35 representatives of 12 different African countries met in Mangochi, Malawi, in February 1994. The Code of Marketing of Breastmilk Substitutes was scrutinized. National codes were drafted based on the "Model Law" of the IBFAN Code Documentation Centre (ICDC), Penang. Mechanisms of implementation, specific to each country, were developed. Strategies for the promotion, protection, and support of breast feeding, which is very important to child survival in Africa, were discussed. The training course was organized by ICDC, in conjunction with IBFAN Africa, and with the support of the United Nations Children's Fund (UNICEF) and the World Health Organization (WHO). Countries in eastern, central, and southern Africa were invited to send participants, who included professors, pediatricians, nutritionists, MCH personnel, nurses, and lawyers. IBFAN Africa has also been conducting lactation management workshops for a number of years in African countries. 26 health personnel (pediatricians, nutritionists, senior nursing personnel, and MCH workers), representing 7 countries in the southern African region, attended a training of trainers lactation management workshop in Swaziland in August, 1993 with the support of their UNICEF country offices. The workshop included lectures, working sessions, discussions, and slide and video presentations. Topics covered included national nutrition statuses, the importance of breast feeding, the anatomy and physiology of breast feeding, breast feeding problems, the International Code of Marketing, counseling skills, and training methods. The field trip to a training course covering primary health care that was run by the Traditional Healers Organization (THO) in Swaziland was of particular interest because of the strong traditional medicine sector in many African countries. IBFAN Africa encourages use of community workers (traditional healers, Rural Health

  15. [Quality management and strategic consequences of assessing documentation and coding under the German Diagnostic Related Groups system].

    Science.gov (United States)

    Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M

    2004-10-01

    The introduction of the German Diagnostic Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Wrong coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors of economic success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and to find operative strategies to improve efficiency and strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16%. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow on medical documentation, coding, and data control was developed. A workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.

  16. Status Report on Hydrogen Management and Related Computer Codes

    International Nuclear Information System (INIS)

    Liang, Z.; Chan, C.K.; Sonnenkalb, M.; Bentaib, A.; Malet, J.; Sangiorgi, M.; Gryffroy, D.; Gyepi-Garbrah, S.; Duspiva, J.; Sevon, T.; Kelm, S.; Reinecke, E.A.; Xu, Z.J.; Cervone, A.; Utsuno, H.; Hotta, A.; Hong, S.W.; Kim, J.T.; Visser, D.C.; Stempniewicz, M.M.; Kuriene, L.; Prusinski, P.; Martin-Valdepenas, J.M.; Frid, W.; Isaksson, P.; Dreier, J.; Paladino, D.; Algama, D.; Notafrancesco, A.; Amri, A.; Kissane, M.; )

    2014-01-01

    In follow-up to the Fukushima Daiichi NPP accident, the Committee on the Safety of Nuclear Installations (CSNI) decided to launch several high priority activities. At the 14th plenary meeting of the Working Group on Analysis and Management of Accidents (WGAMA), a proposal for a status paper on hydrogen generation, transport and mitigation under severe accident conditions was approved. The proposed activity is in line with the WGAMA mandate and it was considered necessary to revisit the hydrogen issue. The report is broken down into five chapters and two appendices. Chapter 1 provides background information for this activity and expected topics defined by the WGAMA members. A general understanding of hydrogen behavior and control in severe accidents is discussed. A brief literature review is included in this chapter to summarize the progress obtained from the early US NRC sponsored research on hydrogen and recent international OECD or EC sponsored projects on hydrogen related topics (generation, distribution, combustion and mitigation). Chapter 2 provides a general overview of the various reactor designs of Western PWRs, BWRs, Eastern European VVERs and PHWRs (CANDUs). The purpose is to understand the containment design features in relation to hydrogen management measures. Chapter 3 provides a detailed description of national requirements on hydrogen management and hydrogen mitigation measures inside the containment and other places (e.g., annulus space, secondary buildings, spent fuel pool, etc.). This is followed by discussions of hydrogen analysis approaches, application of safety systems (e.g., spray, containment ventilation, local air cooler, suppression pool, and latch systems), hydrogen measurement strategies as well as lessons learnt from the Fukushima Daiichi nuclear power accident. Chapter 4 provides an overview of various codes that are being used for hydrogen risk assessment, and the codes' capabilities and validation status in terms of hydrogen related

  17. The Effect of Slot-Code Optimization in Warehouse Order Picking

    Directory of Open Access Journals (Sweden)

    Andrea Fumi

    2013-07-01

    most appropriate material handling resource configuration. Building on previous work on the effect of slot-code optimization on travel times in single/dual command cycles, the authors broaden the scope to include the most general picking case, thus widening the range of applicability and realising former suggestions for future research.

  18. RAID-6 reed-solomon codes with asymptotically optimal arithmetic complexities

    KAUST Repository

    Lin, Sian-Jheng

    2016-12-24

    In computer storage, RAID 6 is a level of RAID that can tolerate two failed drives. When RAID-6 is implemented by Reed-Solomon (RS) codes, the write-performance penalty lies in the field multiplications for the second parity. In this paper, we present a configuration of the factors of the second-parity formula such that the arithmetic complexity can reach the optimal complexity bound as the code length approaches infinity. In the proposed approach, the intermediate data used for the first parity is also utilized to calculate the second parity. To the best of our knowledge, this is the first approach that enables RAID-6 RS codes to approach the optimal arithmetic complexity.
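
The two parities in question can be sketched directly: P is the XOR of the data blocks, and Q accumulates each block multiplied by a power of the generator in GF(256). This shows the standard RAID-6 arithmetic only; the paper's contribution (restructuring Q's factors so that P's intermediate sums are reused) is not reproduced here.

```python
# Standard RAID-6 parities over GF(256): P = XOR of data, Q = sum of
# g^i * d_i with generator g = 2 and the usual polynomial 0x11D.

def gf_mul(a, b, poly=0x11D):     # carry-less multiply mod the field poly
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= poly
        b >>= 1
    return r

def raid6_parity(data):           # data: one byte per drive
    p, q, g_i = 0, 0, 1           # g_i tracks g^i, starting at g^0
    for d in data:
        p ^= d
        q ^= gf_mul(g_i, d)
        g_i = gf_mul(g_i, 2)
    return p, q

data = [0x12, 0xA5, 0x3C, 0x77]
p, q = raid6_parity(data)

# A single lost data byte is recovered from P alone (XOR of survivors);
# recovering two lost drives additionally needs Q.
lost = 2
recovered = p
for i, d in enumerate(data):
    if i != lost:
        recovered ^= d
print(hex(recovered))  # equals data[2]
```

The field multiplications appear only in Q, which is exactly why optimizing the second-parity computation dominates the write cost.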

  19. Optimization and management in manufacturing engineering resource collaborative optimization and management through the Internet of Things

    CERN Document Server

    Liu, Xinbao; Liu, Lin; Cheng, Hao; Zhou, Mi; Pardalos, Panos M

    2017-01-01

    Problems facing manufacturing clusters that intersect information technology, process management, and optimization within the Internet of Things (IoT) are examined in this book. Recent advances in information technology have transformed the use of resources and data exchange, often leading to management and optimization problems attributable to technology limitations and strong market competition. This book discusses several problems and concepts which make significant connections in the areas of information sharing, organization management, resource operations, and performance assessment. Geared toward practitioners and researchers, this treatment deepens the understanding of the relationship between resource collaborative management and advanced information technology. Those in manufacturing will utilize the numerous mathematical models and methods offered to solve practical problems related to cutting stock, supply chain scheduling, and inventory management. Academics and students with a basic knowledge of manufacturing, c...

  20. WASA-BOSS. Development and application of Severe Accident Codes. Evaluation and optimization of accident management measures. Subproject D. Study on water film cooling for PWR's passive containment cooling system. Final report

    International Nuclear Information System (INIS)

    Huang, Xi

    2016-07-01

    In the present study, a new phenomenological model was developed to describe the water film flow under the conditions of a passive containment cooling system (PCCS). The new model takes two different flow regimes into consideration, i.e. continuous water film and rivulets. For water film flow, the traditional Nusselt film model was modified to account for the orientation angle and surface shear stress. The transition from water film to rivulet, as well as the structure of the stable rivulet at its onset point, was modeled by using the minimum energy principle (MEP) combined with conservation equations. In addition, two different contact angles, i.e. the advancing angle and the retreating angle, were applied to take the hysteresis effect into consideration. The models of individual processes were validated as far as possible against experimental data selected from the open literature and from collaboration partners. With the models a new program module was developed and implemented into the COCOSYS program. The extended COCOSYS program was applied to analyze the containment behavior of the European generic containment and the performance of the passive containment cooling system of the AP1000. The results clearly indicate the importance of the new model and provide information for the optimization of the PCCS of the AP1000.
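
The classical Nusselt relation that serves as the starting point for such film models can be written down directly: for a laminar falling film, delta = (3 mu Gamma / (rho^2 g sin(theta)))^(1/3), where Gamma is the mass flow rate per unit film width. The sketch below evaluates this unmodified textbook relation with assumed water properties; the report's modifications for interfacial shear stress and rivulet transition are not included.

```python
import math

# Classical Nusselt laminar falling-film thickness on an inclined wall.
# Property values are assumed (water near saturation); this is the
# baseline relation the report modifies, not the modified model itself.

def nusselt_film_thickness(gamma, theta_deg, mu=2.8e-4, rho=958.0, g=9.81):
    """Film thickness [m]; gamma is mass flow per unit width [kg/(m*s)]."""
    theta = math.radians(theta_deg)
    return (3.0 * mu * gamma / (rho ** 2 * g * math.sin(theta))) ** (1.0 / 3.0)

d_vertical = nusselt_film_thickness(gamma=0.05, theta_deg=90.0)
d_inclined = nusselt_film_thickness(gamma=0.05, theta_deg=30.0)
print(d_vertical, d_inclined)
```

Tilting the wall away from vertical reduces the gravity component driving the film, so the film thickens, one reason the orientation angle enters the modified model.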

  1. WASA-BOSS. Development and application of Severe Accident Codes. Evaluation and optimization of accident management measures. Subproject D. Study on water film cooling for PWR's passive containment cooling system. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Xi

    2016-07-15

    In the present study, a new phenomenological model was developed to describe the water film flow under the conditions of a passive containment cooling system (PCCS). The new model takes two different flow regimes into consideration, i.e. continuous water film and rivulets. For water film flow, the traditional Nusselt film model was modified to account for the orientation angle and surface shear stress. The transition from water film to rivulet, as well as the structure of the stable rivulet at its onset point, was modeled by using the minimum energy principle (MEP) combined with conservation equations. In addition, two different contact angles, i.e. the advancing angle and the retreating angle, were applied to take the hysteresis effect into consideration. The models of individual processes were validated as far as possible against experimental data selected from the open literature and from collaboration partners. With the models a new program module was developed and implemented into the COCOSYS program. The extended COCOSYS program was applied to analyze the containment behavior of the European generic containment and the performance of the passive containment cooling system of the AP1000. The results clearly indicate the importance of the new model and provide information for the optimization of the PCCS of the AP1000.

  2. Optimal glucose management in the perioperative period.

    Science.gov (United States)

    Evans, Charity H; Lee, Jane; Ruhlman, Melissa K

    2015-04-01

    Hyperglycemia is a common finding in surgical patients during the perioperative period. Factors contributing to poor glycemic control include counterregulatory hormones, hepatic insulin resistance, decreased insulin-stimulated glucose uptake, use of dextrose-containing intravenous fluids, and enteral and parenteral nutrition. Hyperglycemia in the perioperative period is associated with increased morbidity, decreased survival, and increased resource utilization. Optimal glucose management in the perioperative period contributes to reduced morbidity and mortality. To readily identify hyperglycemia, blood glucose monitoring should be instituted for all hospitalized patients. Published by Elsevier Inc.

  3. Profitability and optimization of data management

    Energy Technology Data Exchange (ETDEWEB)

    Boussa, M. [Sonatrach, Alger (Algeria). Petroleum Engineering and Development

    2008-07-01

    Information systems and technologies for the oil and gas industry were discussed with particular reference to the use of data analysis in dynamic planning processes. This paper outlined the risks and challenges associated with reorganizing data systems and the costs associated with equipment and software purchases. Issues related to Intranet encryption and electronic commerce systems were also reviewed along with the impact of the Internet on the oil and gas industry. New methods for using real time data systems for updating well data were outlined together with recent developments in Intranet and Extranet technologies and services. Other topics of discussion included new software applications for network optimization and nodal analyses; industry-specific software developed for well testing and reservoir engineering; and simulation and management production software. Data management solutions for storing, retrieving and analyzing data streams were presented. It was concluded that successful organizations must develop accurate data systems in order to ensure continuing success. 4 refs., 8 figs.

  4. A study on the nuclear computer code maintenance and management system

    International Nuclear Information System (INIS)

    Kim, Yeon Seung; Huh, Young Hwan; Lee, Jong Bok; Choi, Young Gil; Suh, Soong Hyok; Kang, Byong Heon; Kim, Hee Kyung; Kim, Ko Ryeo; Park, Soo Jin

    1990-12-01

    According to current software development and quality assurance trends, it is necessary to develop a computer code management system for nuclear programs. For this reason, the project started in 1987. The main objectives of the project are to establish a nuclear computer code management system, to secure software reliability, and to develop nuclear computer code packages. The work performed on the project this year comprised operating and maintaining the computer code information system for KAERI computer codes, developing the application tool AUTO-i for computing the 1st and 2nd moments of inertia of a polygon or circle, and researching nuclear computer code conversion between different machines. To better support code availability and reliability, assistance from the users of the codes is required. Lastly, for easy reference to the code information, we present a list of code names and information on the codes which were introduced or developed during this year. (Author)

  5. Optimal Management of Geothermal Heat Extraction

    Science.gov (United States)

    Patel, I. H.; Bielicki, J. M.; Buscheck, T. A.

    2015-12-01

    Geothermal energy technologies use the constant heat flux from the subsurface in order to produce heat or electricity for societal use. As such, a geothermal energy system is not inherently variable, like systems based on wind and solar resources, and an operator can conceivably control the rate at which heat is extracted and used directly, or converted into a commodity that is used. Although geothermal heat is a renewable resource, this heat can be depleted over time if the rate of heat extraction exceeds the natural rate of renewal (Rybach, 2003). For heat extraction used for commodities that are sold on the market, sustainability entails balancing the rate at which the reservoir renews with the rate at which heat is extracted and converted into profit, on a net present value basis. We present a model that couples natural resource economic approaches for managing renewable resources with simulations of geothermal reservoir performance in order to develop an optimal heat mining strategy that balances economic gain with the performance and renewability of the reservoir. Similar optimal control approaches have been extensively studied for renewable natural resource management of fisheries and forests (Bonfil, 2005; Gordon, 1954; Weitzman, 2003). Those models determine an optimal path of extraction of fish or timber by balancing the regeneration of stocks that are not harvested with the profit from the sale of those that are. Our model balances the regeneration of reservoir temperature with the net proceeds from extracting heat and converting it to electricity that is sold to consumers. We used the Non-isothermal Unconfined-confined Flow and Transport (NUFT) model (Hao, Sun, & Nitao, 2011) to simulate the performance of a sedimentary geothermal reservoir under a variety of geologic and operational situations. The results of NUFT are incorporated into the natural resource economics model to determine production strategies that
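
The heat-mining trade-off described above can be caricatured in a few lines: pick a constant extraction rate that maximises discounted profit when the reservoir temperature renews slowly and is drawn down by extraction. All dynamics, prices, and costs below are invented surrogates, not NUFT results or the authors' economics model.

```python
# Toy heat-mining NPV: revenue scales with extraction rate and the
# temperature margin above ambient; pumping cost grows quadratically in
# rate; temperature relaxes toward its undisturbed value and is drawn
# down by extraction. All constants are assumed for illustration.

def npv(rate, years=30, T0=200.0, T_amb=80.0, renew=0.05,
        draw=2.0, cost=30.0, discount=0.08):
    T, total = T0, 0.0
    for year in range(years):
        profit = rate * max(T - T_amb, 0.0) - cost * rate ** 2
        total += profit / (1.0 + discount) ** year
        T += renew * (T0 - T) - draw * rate   # renewal vs. drawdown
        T = max(T, T_amb)
    return total

rates = [i / 10.0 for i in range(1, 31)]          # candidate rates 0.1 .. 3.0
best_rate = max(rates, key=npv)
print(best_rate, round(npv(best_rate), 1))
```

Extracting too fast depletes the temperature margin and destroys later revenue, while extracting too slowly forgoes early, lightly discounted profit, so the NPV-optimal rate is interior, which is the qualitative result the coupled model formalizes.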

  6. Optimal power allocation and joint source-channel coding for wireless DS-CDMA visual sensor networks

    Science.gov (United States)

    Pandremmenou, Katerina; Kondi, Lisimachos P.; Parsopoulos, Konstantinos E.

    2011-01-01

    In this paper, we propose a scheme for the optimal allocation of power, source coding rate, and channel coding rate for each of the nodes of a wireless Direct Sequence Code Division Multiple Access (DS-CDMA) visual sensor network. The optimization is quality-driven, i.e. the received quality of the video that is transmitted by the nodes is optimized. The scheme takes into account the fact that the sensor nodes may be imaging scenes with varying levels of motion. Nodes that image low-motion scenes will require a lower source coding rate, so they will be able to allocate a greater portion of the total available bit rate to channel coding. Stronger channel coding will mean that such nodes will be able to transmit at lower power. This will both increase battery life and reduce interference to other nodes. Two optimization criteria are considered. One that minimizes the average video distortion of the nodes and one that minimizes the maximum distortion among the nodes. The transmission powers are allowed to take continuous values, whereas the source and channel coding rates can assume only discrete values. Thus, the resulting optimization problem lies in the field of mixed-integer optimization tasks and is solved using Particle Swarm Optimization. Our experimental results show the importance of considering the characteristics of the video sequences when determining the transmission power, source coding rate and channel coding rate for the nodes of the visual sensor network.
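
The Particle Swarm Optimization machinery referred to above can be sketched on a toy two-variable power-allocation objective. The quadratic "distortion" function and all swarm constants are invented for illustration; the paper's actual problem is mixed-integer, with discrete source and channel coding rates alongside the continuous powers.

```python
import random

# Bare-bones PSO minimising a toy "distortion" as a function of two
# transmit powers. Stand-in for the paper's quality-driven objective.

random.seed(1)

def distortion(p):  # toy convex surrogate, minimum at powers (1.0, 2.0)
    return (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2

DIM, SWARM, STEPS = 2, 20, 100
pos = [[random.uniform(0, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]                    # personal bests
gbest = min(pbest, key=distortion)             # global best

for _ in range(STEPS):
    for i in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d]                       # inertia
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
            pos[i][d] += vel[i][d]
        if distortion(pos[i]) < distortion(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=distortion)

print([round(x, 2) for x in gbest])
```

Because PSO only needs objective evaluations, it accommodates the discrete rate variables of the real problem by evaluating candidate (power, rate) combinations directly, without gradients.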

  7. Genetic algorithms used for PWRs refuel management automatic optimization: a new modelling

    International Nuclear Information System (INIS)

    Chapot, Jorge Luiz C.; Schirru, Roberto; Silva, Fernando Carvalho da

    1996-01-01

    A Genetic Algorithms-based system, linking the computer codes GENESIS 5.0 and ANC through the interface ALGER, has been developed aiming at PWR fuel management optimization. An innovative codification, the Lists Model, has been incorporated into the genetic system; it avoids the use of variants of the standard crossover operator and generates only valid loading patterns in the core. The GENESIS/ALGER/ANC system has been successfully tested in an optimization study for the second cycle of Angra-1. (author)

  8. Application of genetic algorithm in the fuel management optimization for the high flux engineering test reactor

    International Nuclear Information System (INIS)

    Shi Xueming; Wu Hongchun; Sun Shouhua; Liu Shuiqing

    2003-01-01

    The in-core fuel management optimization model based on the genetic algorithm has been established. An encode/decode technique based on the assemblies position is presented according to the characteristics of HFETR. Different reproduction strategies have been studied. The expert knowledge and the adaptive genetic algorithms are incorporated into the code to get the optimized loading patterns that can be used in HFETR

  9. Watershed Management Optimization Support Tool (WMOST) v3: Theoretical Documentation

    Science.gov (United States)

    The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context, accounting fo...

  10. Final Report A Multi-Language Environment For Programmable Code Optimization and Empirical Tuning

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Qing [Univ. of Colorado, Colorado Springs, CO (United States); Whaley, Richard Clint [Univ. of Texas, San Antonio, TX (United States); Qasem, Apan [Texas State Univ., San Marcos, TX (United States); Quinlan, Daniel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-11-23

    This report summarizes our effort and the results of building an integrated optimization environment to effectively combine the programmable control and the empirical tuning of source-to-source compiler optimizations within the framework of multiple existing languages, specifically C, C++, and Fortran. The environment contains two main components: the ROSE analysis engine, which is based on the ROSE C/C++/Fortran2003 source-to-source compiler developed by Co-PI Dr. Quinlan et al. at DOE/LLNL, and the POET transformation engine, which is based on an interpreted program transformation language developed by Dr. Yi at the University of Texas at San Antonio (UTSA). The ROSE analysis engine performs advanced compiler analysis, identifies profitable code transformations, and then produces output in POET, a language designed to provide programmable control of compiler optimizations to application developers and to support the parameterization of architecture-sensitive optimizations so that their configurations can be empirically tuned later. This POET output can then be ported to different machines together with the user application, where a POET-based search engine empirically reconfigures the parameterized optimizations until satisfactory performance is found. Computational specialists can write POET scripts to directly control the optimization of their code. Application developers can interact with ROSE to obtain optimization feedback as well as provide domain-specific knowledge and high-level optimization strategies. The optimization environment is expected to support different levels of automation and programmer intervention, from fully automated tuning to semi-automated development and manual programmable control.

  11. Authorship attribution of source code by using back propagation neural network based on particle swarm optimization.

    Science.gov (United States)

    Yang, Xinyu; Xu, Guoai; Li, Qi; Guo, Yanhui; Zhang, Miao

    2017-01-01

    Authorship attribution is to identify the most likely author of a given sample among a set of candidate known authors. It can be applied not only to discover the original author of plain text, such as novels, blogs, emails, and posts, but also to identify source code programmers. Authorship attribution of source code is required in diverse applications, ranging from malicious code tracking to resolving authorship disputes and detecting software plagiarism. This paper aims to propose a new method to identify the programmer of Java source code samples with higher accuracy. To this end, it first introduces a back propagation (BP) neural network based on particle swarm optimization (PSO) into authorship attribution of source code. It begins by computing a set of defined feature metrics, including lexical and layout metrics and structure and syntax metrics, 19 dimensions in total. These metrics are then input to the neural network for supervised learning, whose weights are produced by the hybrid PSO and BP algorithm. The effectiveness of the proposed method is evaluated on a collected dataset of 3,022 Java files belonging to 40 authors. Experiment results show that the proposed method achieves 91.060% accuracy, and a comparison with previous work on authorship attribution of source code for the Java language illustrates that the proposed method outperforms the others overall, with an acceptable overhead.
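The weight-training idea can be sketched compactly. The toy below uses pure PSO (without the BP refinement described in the paper) to train a tiny 2-2-1 network on XOR data standing in for the 19 code metrics and author labels; all dimensions and hyperparameters are illustrative assumptions:

```python
import math
import random

# Toy stand-in task: 2 binary inputs -> XOR label, in place of the paper's
# 19 lexical/layout/structure/syntax metrics and author labels.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, x):
    # 2-2-1 network; w packs both hidden neurons and the output neuron.
    h1 = math.tanh(w[0]*x[0] + w[1]*x[1] + w[2])
    h2 = math.tanh(w[3]*x[0] + w[4]*x[1] + w[5])
    return sigmoid(w[6]*h1 + w[7]*h2 + w[8])

def loss(w):
    return sum((forward(w, x) - y) ** 2 for x, y in DATA)

def pso(dim=9, swarm=30, iters=200, seed=1):
    rnd = random.Random(seed)
    pos = [[rnd.uniform(-2, 2) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pcost = [loss(p) for p in pos]
    gbest = min(pbest, key=loss)[:]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                v = (0.7 * vel[i][d]
                     + 1.5 * rnd.random() * (pbest[i][d] - pos[i][d])
                     + 1.5 * rnd.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-4.0, min(4.0, v))  # velocity clamp
                pos[i][d] += vel[i][d]
            c = loss(pos[i])
            if c < pcost[i]:            # update personal best
                pcost[i], pbest[i] = c, pos[i][:]
                if c < loss(gbest):     # update global best
                    gbest = pos[i][:]
    return gbest

w = pso()
```

In the paper's hybrid, a BP gradient step would further refine the PSO-found weights; that stage is omitted here.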

  12. Approaches in highly parameterized inversion - PEST++, a Parameter ESTimation code optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.

    2012-01-01

    An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.

  13. The SWAN/NPSOL code system for multivariable multiconstraint shield optimization

    International Nuclear Information System (INIS)

    Watkins, E.F.; Greenspan, E.

    1995-01-01

    SWAN is a useful code for the optimization of source-driven systems, i.e., systems for which the neutron and photon distribution is the solution of the inhomogeneous transport equation. Over the years, SWAN has been applied to the optimization of a variety of nuclear systems, such as minimizing the thickness of fusion reactor blankets and shields, the weight of space reactor shields, the cost of an ICF target chamber shield, and the background radiation for explosive detection systems, and maximizing the beam quality for boron neutron capture therapy applications. However, SWAN's optimization module could handle only a single constraint and was inefficient in handling problems with many variables. The purpose of this work is to upgrade SWAN's optimization capability

  14. Tunable wavefront coded imaging system based on detachable phase mask: Mathematical analysis, optimization and underlying applications

    Science.gov (United States)

    Zhao, Hui; Wei, Jingxuan

    2014-09-01

    The key to the concept of tunable wavefront coding lies in detachable phase masks. Ojeda-Castaneda et al. (Progress in Electronics Research Symposium Proceedings, Cambridge, USA, July 5-8, 2010) described a typical design in which two components with cosinusoidal phase variation operate together to make the defocus sensitivity tunable. The present study proposes an improved design and makes three contributions: (1) A mathematical derivation based on the stationary phase method explains why the detachable phase mask of Ojeda-Castaneda et al. tunes the defocus sensitivity. (2) The mathematical derivations show that the effective bandwidth of the wavefront coded imaging system is also tunable by making each component of the detachable phase mask move asymmetrically. An improved Fisher information-based optimization procedure was also designed to ascertain the optimal mask parameters corresponding to a specific bandwidth. (3) Possible applications of the tunable bandwidth are demonstrated by simulated imaging.

  15. User's manual for the BNW-II optimization code for dry/wet-cooled power plants

    International Nuclear Information System (INIS)

    Braun, D.J.; Bamberger, J.A.; Braun, D.J.; Faletti, D.W.; Wiles, L.E.

    1978-05-01

    The User's Manual describes how to operate BNW-II, a computer code developed by the Pacific Northwest Laboratory (PNL) as a part of its activities under the Department of Energy (DOE) Dry Cooling Enhancement Program. The computer program offers a comprehensive method of evaluating the cost savings potential of dry/wet-cooled heat rejection systems. Going beyond simple "figure-of-merit" cooling tower optimization, this method includes such items as the cost of annual replacement capacity and the optimum split between plant scale-up and replacement capacity, as well as the purchase and operating costs of all major heat rejection components. Hence the BNW-II code is a useful tool for determining the potential cost savings of new dry/wet surfaces, new piping, or other components as part of an optimized system for a dry/wet-cooled plant.

  16. ℓ2-optimized predictive image coding with ℓ∞ bound.

    Science.gov (United States)

    Chuah, Sceuchin; Dumitrescu, Sorina; Wu, Xiaolin

    2013-12-01

    In many scientific, medical, and defense applications of image/video compression, an ℓ∞ error bound is required. However, pure ℓ∞-optimized image coding, colloquially known as near-lossless image coding, is prone to structured errors such as contours and speckles if the bit rate is not sufficiently high; moreover, most previous ℓ∞-based image coding methods suffer from poor rate control. In contrast, the ℓ2 error metric aims for average fidelity and hence preserves the subtlety of smooth waveforms better than the ℓ∞ error metric, and it offers fine granularity in rate control; but pure ℓ2-based image coding methods (e.g., JPEG 2000) cannot bound individual errors as the ℓ∞-based methods can. This paper presents a new compression approach to retain the benefits and circumvent the pitfalls of the two error metrics. A common approach of near-lossless image coding is to embed a uniform scalar quantizer of residual errors into a DPCM prediction loop. In the proposed new approach, the uniform scalar quantizer is replaced by a set of context-based ℓ2-optimized quantizers. The optimization criterion is to minimize a weighted sum of the ℓ2 distortion and the entropy while maintaining a strict ℓ∞ error bound. The resulting method obtains good rate-distortion performance in both the ℓ2 and ℓ∞ metrics and also increases the rate granularity. Compared with JPEG 2000, the new method not only guarantees a lower ℓ∞ error for all bit rates, but also achieves a higher PSNR for relatively high bit rates.

  17. Optimal management of genital herpes: current perspectives.

    Science.gov (United States)

    Sauerbrei, Andreas

    2016-01-01

    As one of the most common sexually transmitted diseases, genital herpes is a global medical problem with significant physical and psychological morbidity. Genital herpes is caused by herpes simplex virus type 1 or type 2 and can manifest as primary and/or recurrent infection. This manuscript provides an overview of fundamental knowledge about the virus, its epidemiology, and the infection. Furthermore, the current possibilities for antiviral therapeutic intervention and laboratory diagnosis of genital herpes, as well as the present situation and perspectives for treatment with novel antivirals and prevention of disease by vaccination, are presented. Since the medical management of patients with genital herpes simplex virus infection is often unsatisfactory, this review is aimed at all physicians and health professionals involved in the care of patients with genital herpes. The information provided should help to improve the counseling of affected patients and to optimize the diagnosis, treatment, and prevention of this particular disease.

  18. OPAL- the in-core fuel management code system for WWER reactors

    International Nuclear Information System (INIS)

    Krysl, V.; Mikolas, P.; Sustek, J.; Svarny, J.; Vlachovsky, K.

    2002-01-01

    Fuel management optimization is a complex problem, namely for WWER reactors, which at present utilize burnable poisons (BP) to a great extent. In this paper, first the concept and methodologies of a fuel management system for WWER 440 (NPP Dukovany) and WWER 1000 (NPP Temelin), under development in Skoda JS a.s., are described, followed by some practical applications. The objective of this advanced system is to minimize fuel cost while preserving all safety constraints and margins. Future enhancements of the system will allow it to perform fuel management optimization in the multi-cycle mode. The general objective functions of the system are the maximization of EOC reactivity, the maximization of discharge burnup, the minimization of fresh fuel inventory and/or feed enrichment, and the minimization of the BP inventory. There are also safety-related constraints, in which the minimization of power peaking plays a dominant role. The core part of the system addresses the major objective, maximizing the EOC Keff for a given fuel cycle length, and consists of four coupled calculation steps. The first is the calculation of a Loading Priority Scheme (LPS), which is used to rank the core positions in terms of assembly Kinf values. In the second step the Haling power distribution is calculated, and by using fuel shuffle and/or enrichment splitting algorithms and heuristic rules the core pattern is modified to meet core constraints. In this second step a directive/evolutionary algorithm with an expert-rules-based optimization code is used. The optimal BP assignment is alternatively considered to be a separate third step of the procedure. In the fourth step the core is depleted, normally up to the 3D pin-wise level, using the BP distribution developed in step three, and meeting all constraints is checked. One of the options of this optimization system is an expert-friendly interactive mode (Authors)
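The Loading Priority Scheme step can be caricatured as a ranking-and-matching problem. The sketch below (with hypothetical position weights and k-inf values, not OPAL's actual data structures) simply places the most reactive assemblies in the highest-priority positions:

```python
def assign_by_priority(lps_rank, assembly_kinf):
    """Place the most reactive assemblies in the highest-priority positions.
    lps_rank: position -> priority weight (higher = load more reactive fuel).
    assembly_kinf: assembly id -> k-inf of that assembly."""
    positions = sorted(lps_rank, key=lps_rank.get, reverse=True)
    assemblies = sorted(assembly_kinf, key=assembly_kinf.get, reverse=True)
    return dict(zip(positions, assemblies))
```

In the real procedure this initial pattern would then be reshuffled by the heuristic rules and checked against the power-peaking constraints described above.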

  19. Dynamic optimization the calculus of variations and optimal control in economics and management

    CERN Document Server

    Kamien, Morton I

    2012-01-01

    Since its initial publication, this text has defined courses in dynamic optimization taught to economics and management science students. The two-part treatment covers the calculus of variations and optimal control. 1998 edition.

  20. A NEM diffusion code for fuel management and time average core calculation

    International Nuclear Information System (INIS)

    Mishra, Surendra; Ray, Sherly; Kumar, A.N.

    2005-01-01

    A computer code based on the nodal expansion method has been developed for solving the two-group, three-dimensional diffusion equation. This code can be used for fuel management and time-average core calculations. Explicit xenon and fuel temperature estimation are also incorporated in this code. TAPP-4 phase-B physics experimental results were analyzed using this code and a code based on the finite difference (FD) method. This paper gives a comparison of the observed data and the results obtained with this code and the FD code. (author)

  1. Development of free-piston Stirling engine performance and optimization codes based on Martini simulation technique

    Science.gov (United States)

    Martini, William R.

    1989-01-01

    A FORTRAN computer code is described that could be used to design and optimize a free-displacer, free-piston Stirling engine similar to the RE-1000 engine made by Sunpower. The code contains options for specifying displacer and power piston motion or for allowing these motions to be calculated by a force balance. The engine load may be a dashpot, inertial compressor, hydraulic pump, or linear alternator. Cycle analysis may be done by isothermal analysis or adiabatic analysis. Adiabatic analysis may be done using the Martini moving gas node analysis or the Rios second-order Runge-Kutta analysis. Flow loss and heat loss equations are included. Graphical display of engine motions and of pressures and temperatures is included. Programming for optimizing up to 15 independent dimensions is included. Sample performance results are shown for both specified and unconstrained piston motions; these results are shown as generated by each of the two Martini analyses. Two sample optimization searches are shown using specified piston motion isothermal analysis. One is for three adjustable inputs and one is for four. Also, two optimization searches for calculated piston motion are presented, for three and for four adjustable inputs. The effect of leakage is evaluated. Suggestions for further work are given.

  2. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiology, Osaka University Hospital, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, St. Jude Children’s Research Hospital, Memphis, TN 38105 (United States)

    2016-01-15

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the inherent choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA that is used for the uniform scanning proton beam needs to be evaluated; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes by using data from FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm³, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently, since the impact of the parameter settings depends on the proton irradiation technique

  3. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    International Nuclear Information System (INIS)

    Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.

    2016-01-01

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the inherent choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA that is used for the uniform scanning proton beam needs to be evaluated; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes by using data from FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm³, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently, since the impact of the parameter settings depends on the proton irradiation technique

  4. Code of accounts. Management overview volume: Environmental restoration

    International Nuclear Information System (INIS)

    Fox, M.B.; Birkholz, H.L.

    1997-10-01

    The purpose of this procedure is to provide the requirement for assigning cost collection codes and the structure of these codes for all costs incurred for the Environmental Restoration Contract. The coding structure will be used in the budgeting and control of project costs

  5. GEMSFITS: Code package for optimization of geochemical model parameters and inverse modeling

    International Nuclear Information System (INIS)

    Miron, George D.; Kulik, Dmitrii A.; Dmytrieva, Svitlana V.; Wagner, Thomas

    2015-01-01

    Highlights: • Tool for generating consistent parameters against various types of experiments. • Handles a large number of experimental data and parameters (is parallelized). • Has a graphical interface and can perform statistical analysis on the parameters. • Tested on fitting the standard-state Gibbs free energies of aqueous Al species. • Examples of fitting interaction parameters of mixing models and thermobarometry. - Abstract: GEMSFITS is a new code package for fitting internally consistent input parameters of GEM (Gibbs Energy Minimization) geochemical-thermodynamic models against various types of experimental or geochemical data, and for performing inverse modeling tasks. It consists of the gemsfit2 (parameter optimizer) and gfshell2 (graphical user interface) programs, both accessing a NoSQL database, all developed with flexibility, generality, efficiency, and user friendliness in mind. The parameter optimizer gemsfit2 includes the GEMS3K chemical speciation solver (http://gems.web.psi.ch/GEMS3K), which features a comprehensive suite of non-ideal activity and equation-of-state models of solution phases (aqueous electrolyte, gas and fluid mixtures, solid solutions, (ad)sorption). The gemsfit2 code uses the robust open-source NLopt library for parameter fitting, which provides a selection between several nonlinear optimization algorithms (global, local, gradient-based) and supports large-scale parallelization. The gemsfit2 code can also perform comprehensive statistical analysis of the fitted parameters (basic statistics, sensitivity, Monte Carlo confidence intervals), thus supporting the user with powerful tools for evaluating the quality of the fits and the physical significance of the model parameters. The gfshell2 code provides menu-driven setup of optimization options (data selection, properties to fit and their constraints, measured properties to compare with computed counterparts, and statistics). The practical utility, efficiency, and
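gemsfit2 delegates the actual minimization to NLopt's algorithms; purely to illustrate the shape of such a fitting loop, here is a naive derivative-free coordinate search applied to a toy two-parameter model with synthetic "measurements" (all names and data are hypothetical and this is not the GEMSFITS API):

```python
def coordinate_search(params, objective, step=1.0, shrink=0.5, rounds=40):
    """Derivative-free coordinate search: try +/- step on each parameter,
    keep improvements, and halve the step whenever a full sweep stalls."""
    p = list(params)
    best = objective(p)
    for _ in range(rounds):
        improved = False
        for i in range(len(p)):
            for d in (step, -step):
                cand = p[:]
                cand[i] += d
                c = objective(cand)
                if c < best:
                    best, p, improved = c, cand, True
        if not improved:
            step *= shrink
    return p, best

# Synthetic "measurements" of a hypothetical two-parameter model y = a*x + b.
xs = [-1.5, -0.5, 0.5, 1.5]
ys = [2 * x + 1 for x in xs]          # true parameters: a = 2, b = 1
sse = lambda p: sum((y - (p[0] * x + p[1])) ** 2 for x, y in zip(xs, ys))
```

A real inverse-modeling run would replace `sse` with a weighted sum of squared residuals between measured and GEM-computed properties, and the search with an NLopt global or gradient-based algorithm.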

  6. A study on the nuclear computer codes installation and management system

    International Nuclear Information System (INIS)

    Kim, Yeon Seung; Huh, Young Hwan; Kim, Hee Kyung; Kang, Byung Heon; Kim, Ko Ryeo; Suh, Soong Hyok; Choi, Young Gil; Lee, Jong Bok

    1990-12-01

    From 1987, a number of technology transfers related to nuclear power plants were performed from C-E for the YGN 3 and 4 construction. Among them, installation and management of the computer codes for the YGN 3 and 4 fuel and nuclear steam supply system was one of the most important projects. The main objectives of this project are to establish the nuclear computer code management system, to develop QA procedures for nuclear codes, to secure nuclear code reliability, and to extend technical applicability, including user-oriented utility programs for the nuclear codes. The work performed this year produced 215 transmittal packages of nuclear code installations, including backup magnetic tapes and microfiche for software quality assurance. Lastly, for easy reference to nuclear code information, we present a list of code names and information on the codes that were introduced from C-E. (Author)

  7. A nuclear reload optimization approach using a real coded genetic algorithm with random keys

    International Nuclear Information System (INIS)

    Lima, Alan M.M. de; Schirru, Roberto; Medeiros, Jose A.C.C.

    2009-01-01

    The fuel reload of a pressurized water reactor is performed whenever the burnup of the fuel assemblies in the core of the reactor reaches a value such that it is no longer possible to maintain a critical reactor producing energy at nominal power. The fuel reload optimization problem consists of determining the positioning of the fuel assemblies within the reactor core in a way that minimizes the cost-benefit relationship of fuel assembly cost per maximum burnup, while also satisfying symmetry and safety restrictions. The difficulty of the fuel reload optimization problem grows exponentially with the number of fuel assemblies in the core. For decades the fuel reload optimization problem was solved manually by experts who used their knowledge and experience to build configurations of the reactor core and tested them to verify whether the safety restrictions of the plant were satisfied. To reduce this burden, several optimization techniques have been used, including the binary-coded genetic algorithm. In this work we show the use of a real-valued coded approach of the genetic algorithm, with different recombination methods, together with a transformation mechanism called random keys, which transforms the real values of the genes of each chromosome into a combination of discrete fuel assemblies for evaluation of the reload optimization. Four different recombination methods were tested: discrete recombination, intermediate recombination, linear recombination, and extended linear recombination. For each of the 4 recombination methods, 10 different tests using different seeds for the random number generator were conducted, totaling 40 tests. The results of the application of the genetic algorithm are shown with the real-number formulation for the fuel reload problem of the Angra 1 PWR plant. Since the best results in the literature for this problem were found by the parallel PSO, we use it for comparison
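The random-keys mechanism itself is simple to sketch: genes are real numbers, and sorting them yields the discrete loading order, so any real-valued recombination still decodes to a valid pattern. A minimal illustration (not the authors' code; gene values are arbitrary):

```python
def decode_random_keys(keys):
    """Sort positions by their real-valued gene; the resulting index order
    is the fuel-assembly loading sequence, so every real vector decodes to
    a valid permutation (no repeated assemblies)."""
    return sorted(range(len(keys)), key=lambda i: keys[i])

def intermediate_recombination(p1, p2, alpha=0.5):
    """Real-valued blend of two parents, one of the four recombination
    methods tested; offspring always remain decodable."""
    return [a + alpha * (b - a) for a, b in zip(p1, p2)]
```

This is exactly why random keys suit real-coded operators: blending two valid chromosomes with ordinary arithmetic can never produce a duplicated assembly, which a direct permutation encoding cannot guarantee.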

  8. Optimized Irregular Low-Density Parity-Check Codes for Multicarrier Modulations over Frequency-Selective Channels

    Directory of Open Access Journals (Sweden)

    Valérian Mannoni

    2004-09-01

    This paper deals with optimized channel coding for OFDM transmissions (COFDM) over frequency-selective channels using irregular low-density parity-check (LDPC) codes. First, we introduce a new characterization of LDPC code irregularity called the "irregularity profile." Then, using this parameterization, we derive a new criterion, based on minimizing the transmission bit error probability, for designing an irregular LDPC code suited to the frequency selectivity of the channel. The optimization of this criterion is done using the Gaussian approximation technique. Simulations illustrate the good performance of our approach for different transmission channels.

  9. Optimization of the Penelope code in F language for the simulation of the X-ray spectrum in radiodiagnosis

    International Nuclear Information System (INIS)

    Ballon P, C. I.; Quispe V, N. Y.; Vega R, J. L. J.

    2017-10-01

    Computational simulation of the X-ray spectrum in the radiodiagnostic range allows the study of, and advances knowledge about, the transport of X-rays and their interaction with matter using the Monte Carlo method. From the X-ray spectra we can know the dose the patient receives when undergoing a radiographic or CT study, improving the quality of the obtained image. The objective of the present work was to implement and optimize the open-source code Penelope (a Monte Carlo code for the simulation of the transport of electrons and photons in matter, 2008 version) by programming extra code in the functional language F, managing to double the processing speed and thus reducing the simulation time and the errors of the software initially programmed in Fortran 77. The results were compared with those of Penelope, obtaining good agreement. We also simulated a Pdd (depth dose profile) curve for a Theratron Equinox cobalt-60 teletherapy unit, also validating the implemented software for high energies. (Author)

  10. Multiple Description Coding Based on Optimized Redundancy Removal for 3D Depth Map

    Directory of Open Access Journals (Sweden)

    Sen Han

    2016-06-01

    Multiple description (MD) coding is a promising alternative for the robust transmission of information over error-prone channels. In 3D image technology, the depth map represents the distance between the camera and the objects in the scene. Using the depth map combined with the existing multiview image, images of any virtual viewpoint position can be synthesized efficiently, which can display more realistic 3D scenes. Unlike the conventional 2D texture image, the depth map contains a lot of spatially redundant information that is not necessary for view synthesis but may waste compressed bits, especially when using MD coding for robust transmission. In this paper, we focus on redundancy removal for MD coding based on the DCT (discrete cosine transform) domain. In view of the characteristics of DCT coefficients, at the encoder a Lagrangian optimization approach is designed to determine the amount of high-frequency coefficients in the DCT domain to be removed. For low computational complexity, the entropy is adopted to estimate the bit rate in the optimization. Furthermore, at the decoder, adaptive zero-padding is applied to reconstruct the depth map when some information is lost. The experimental results show that, compared to the corresponding scheme, the proposed method demonstrates better central and side rate-distortion performance.
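A toy version of the encoder-side Lagrangian choice: zero the last k high-frequency DCT coefficients of a block and score each k by the energy lost (Parseval) plus an entropy-based bit estimate. The coefficient values, quantizer step qstep, and weight lam below are illustrative assumptions, not the paper's configuration:

```python
import math
from collections import Counter

def entropy_bits(symbols):
    """Total bits to code `symbols` at their empirical zeroth-order entropy."""
    n = len(symbols)
    if n == 0:
        return 0.0
    cnt = Counter(symbols)
    return n * -sum(c / n * math.log2(c / n) for c in cnt.values())

def best_cutoff(coeffs, lam=50.0, qstep=8):
    """coeffs are DCT coefficients ordered low -> high frequency.
    Zeroing the last k coefficients costs their energy (distortion, by
    Parseval) but saves the bits needed to code them; return the k that
    minimizes the Lagrangian cost D + lam * R."""
    best = None
    for k in range(len(coeffs) + 1):
        kept = coeffs[:len(coeffs) - k]
        dist = sum(c * c for c in coeffs[len(coeffs) - k:])   # energy removed
        rate = entropy_bits([round(c / qstep) for c in kept]) # bit estimate
        cost = dist + lam * rate
        if best is None or cost < best[0]:
            best = (cost, k)
    return best[1]
```

With lam = 0 nothing is removed (any removal only adds distortion), while a very large lam keeps only coefficients that cost no bits, which is the expected limiting behavior of the Lagrangian trade-off.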

  11. The SWAN-SCALE code for the optimization of critical systems

    International Nuclear Information System (INIS)

    Greenspan, E.; Karni, Y.; Regev, D.; Petrie, L.M.

    1999-01-01

    The SWAN optimization code was recently developed to identify the maximum value of k-eff for a given mass of fissile material when in combination with other specified materials. The optimization process is iterative; in each iteration, SWAN varies the zone-dependent concentrations of the system constituents. This change is guided by the equal-volume replacement effectiveness functions (EVREF) that SWAN generates using first-order perturbation theory. Previously, SWAN did not have provisions to account for the effect of the composition changes on neutron cross-section resonance self-shielding; it used the cross sections corresponding to the initial system composition. In support of the US Department of Energy Nuclear Criticality Safety Program, the authors recently removed the limitation on resonance self-shielding by coupling SWAN with the SCALE code package. The purpose of this paper is to briefly describe the resulting SWAN-SCALE code and to illustrate the effect that neutron cross-section self-shielding can have on the maximum k-eff and on the corresponding system composition
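The EVREF-guided iteration can be caricatured as follows: within each zone, volume fraction is shifted from the least effective constituent to the most effective one. This is a hypothetical sketch of the idea only, not SWAN's actual update rule (zone names, materials, and effectiveness values are invented):

```python
def swan_step(fractions, evref, delta=0.05):
    """One caricatured iteration: in every zone, shift a small volume
    fraction from the least to the most effective material, as ranked by
    the equal-volume replacement effectiveness functions (EVREF)."""
    new = {}
    for zone, mats in fractions.items():
        eff = evref[zone]
        lo = min(mats, key=eff.get)   # least effective material in this zone
        hi = max(mats, key=eff.get)   # most effective material in this zone
        step = min(delta, mats[lo])   # cannot remove more than is present
        updated = dict(mats)
        updated[lo] -= step
        updated[hi] += step
        new[zone] = updated
    return new
```

In SWAN proper, the EVREF values would be regenerated by perturbation theory after each composition change (and, with the SCALE coupling, the self-shielded cross sections as well) before the next step.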

  12. Optimal Replacement and Management Policies for Beef Cows

    OpenAIRE

    W. Marshall Frasier; George H. Pfeiffer

    1994-01-01

    Beef cow replacement studies have not reflected the interaction between herd management and the culling decision. We demonstrate techniques for modeling optimal beef cow replacement intervals and discrete management policies by incorporating the dynamic effects of management on future productivity when biological response is uncertain. Markovian decision analysis is used to identify optimal beef cow management on a ranch typical of the Sandhills region of Nebraska. Issues of breeding season l...

  13. Optimal management of idiopathic scoliosis in adolescence

    Directory of Open Access Journals (Sweden)

    Kotwicki T

    2013-07-01

    the need for surgical treatment. Surgery is the treatment of choice for severe idiopathic scoliosis which is rapidly progressive, with early onset, late diagnosis, and neglected or failed conservative treatment. The psychologic impact of idiopathic scoliosis, a chronic disease occurring in the psychologically fragile period of adolescence, is important because of its body distorting character and the onerous treatment required, either conservative or surgical. Optimal management of idiopathic scoliosis requires cooperation within a professional team which includes the entire therapeutic spectrum, extending from simple watchful observation of nonprogressive mild deformities through to early surgery for rapidly deteriorating curvature. Probably most demanding is adequate management with regard to the individual course of the disease in a given patient, while avoiding overtreatment or undertreatment. Keywords: management, idiopathic scoliosis, adolescence

  14. Three-dimensional polarization marked multiple-QR code encryption by optimizing a single vectorial beam

    Science.gov (United States)

    Lin, Chao; Shen, Xueju; Hua, Binbin; Wang, Zhisong

    2015-10-01

    We demonstrate the feasibility of three-dimensional (3D) polarization multiplexing by optimizing a single vectorial beam using a multiple-signal-window multiple-plane (MSW-MP) phase retrieval algorithm. Original messages represented with multiple quick response (QR) codes are first partitioned into a series of subblocks. Then, each subblock is marked with a specific polarization state and randomly distributed in 3D space with both longitudinal and transversal adjustable freedoms. A generalized 3D polarization mapping protocol is established to generate a 3D polarization key. Finally, the multiple-QR code is encrypted into one phase-only mask and one polarization-only mask based on the modified Gerchberg-Saxton (GS) algorithm. We take the polarization mask as the ciphertext and the phase-only mask as an additional dimension of the key. Only when both the phase key and the 3D polarization key are correct can the original messages be recovered. We verify our proposal with both simulation and experimental evidence.
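
The modified GS algorithm itself is not spelled out in the abstract; the classical Gerchberg-Saxton iteration it builds on can be sketched as follows (a minimal illustration assuming a flat-amplitude source and a toy binary target, with numpy's FFT standing in for free-space propagation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary target amplitude (QR-like) and a flat-amplitude source beam.
target = (rng.random((32, 32)) > 0.5).astype(float)
source_amp = np.ones((32, 32))

phase = rng.uniform(0.0, 2.0 * np.pi, (32, 32))    # random starting phase
for _ in range(200):
    field = source_amp * np.exp(1j * phase)
    far = np.fft.fft2(field)                       # propagate to target plane
    far = target * np.exp(1j * np.angle(far))      # impose target amplitude
    back = np.fft.ifft2(far)                       # propagate back
    phase = np.angle(back)                         # keep phase, reset amplitude

recon = np.abs(np.fft.fft2(source_amp * np.exp(1j * phase)))
corr = np.corrcoef(recon.ravel(), target.ravel())[0, 1]
```

Each iteration enforces the known amplitude in one plane and keeps only the phase from the other, so the retrieved phase-only mask approximately reproduces the target intensity pattern.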

  15. Dynamic systems of regional economy management optimization

    Science.gov (United States)

    Trofimov, S.; Kudzh, S.

    directions of regional industrial policy. Situational-analytical centers (SACs) of regional administration: the major components of an SAC are systems for dynamic modeling, analysis, forecasting, and optimization, based on modern intelligent information technologies. The scope of an SAC covers not only the management of financial flows and the optimization of investments, but also strategic forecasting functions that support an optimal choice and a targeted search for optimal paths of regional development and the corresponding investments. It is worth considering the formation of a unified organizational and methodological center for regional industrial policy. Such an organization could be connected directly to the planning and analytical services of the largest economic structures, local authorities, ministries, and departments. Such a direct link can provide effective strategic management of regional development. In any case, entering foreign markets demands a concentration of resources and the support of the authorities. The proposed measures can provide the necessary coordination among economic structures at various levels. Supporting a regional industrial policy requires the newest methods of strategic planning and management, built on modern approaches to managing economic systems, because the essence of an industrial policy ultimately reduces to forming effective control centers for regional and corporate economic activity. On the possibilities of optimal planning and management of the regional economy as a unified system: approaches to planning regional economic systems can differ. We consider some of the most effective methods for planning and monitoring the condition of regional facilities. All of them are compact and transparent, which places them in the group of medium-complexity technologies. For regional resource management problems, a rather promising approach is the so

  16. Optimal energy management of a hybrid electric powertrain system using improved particle swarm optimization

    International Nuclear Information System (INIS)

    Chen, Syuan-Yi; Hung, Yi-Hsuan; Wu, Chien-Hsun; Huang, Siang-Ting

    2015-01-01

    Highlights: • Online sub-optimal energy management using IPSO. • A second-order HEV model with 5 major segments was built. • IPSO with an equivalent-fuel fitness function using 5 particles. • Engine, rule-based control, PSO, IPSO and ECMS are compared. • Max. 31+% fuel economy and 56+% energy consumption improved. - Abstract: This study developed an online suboptimal energy management system using improved particle swarm optimization (IPSO) for engine/motor hybrid electric vehicles. The vehicle was modeled on the basis of second-order dynamics and featured five major segments: a motor, a spark ignition engine, a lithium battery, transmission and vehicle dynamics, and a driver model. To manage the power distribution of the dual power sources, the IPSO was equipped with three inputs (rotational speed, battery state-of-charge, and demanded torque) and one output (power split ratio). Five steps were developed for IPSO: (1) initialization; (2) determination of the fitness function; (3) selection and memorization; (4) modification of position and velocity; and (5) a stopping rule. Equivalent fuel consumption by the engine and motor was used as the fitness function with five particles, and the IPSO-based vehicle control unit was completed and integrated with the vehicle simulator. To quantify the energy improvement of IPSO, a four-mode rule-based control (system ready, motor only, engine only, and hybrid modes) was designed according to the engine efficiency and rotational speed. A three-loop Equivalent Consumption Minimization Strategy (ECMS) was coded as the best case. The simulation results revealed that IPSO searches for the optimal solution more efficiently than conventional PSO does. In two standard driving cycles, ECE and FTP, the improvements in equivalent fuel consumption and energy consumption over the baseline were (24.25%, 45.27%) and (31.85%, 56.41%), respectively, for the IPSO. The CO_2 emission for all five cases (pure engine, rule-based, PSO
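
The five IPSO steps listed above follow the generic PSO template, which can be sketched in a few lines (a toy 1D illustration: the fitness below is a hypothetical stand-in for the equivalent-fuel model, not the paper's):

```python
import random

random.seed(1)

def fitness(u):
    """Hypothetical equivalent-fuel cost of a power-split ratio u in [0, 1]
    (a stand-in for the engine/motor fuel model, minimum at u = 0.3)."""
    return (u - 0.3) ** 2 + 0.1 * abs(u - 0.3)

N, W, C1, C2 = 5, 0.7, 1.5, 1.5                     # five particles, as above
pos = [random.random() for _ in range(N)]           # (1) initialization
vel = [0.0] * N
pbest = pos[:]                                      # personal bests
gbest = min(pos, key=fitness)                       # global best

for _ in range(60):                                 # (5) fixed stopping rule
    for i in range(N):
        f = fitness(pos[i])                         # (2) fitness evaluation
        if f < fitness(pbest[i]):                   # (3) selection/memorization
            pbest[i] = pos[i]
        if f < fitness(gbest):
            gbest = pos[i]
    for i in range(N):                              # (4) velocity and position
        vel[i] = (W * vel[i]
                  + C1 * random.random() * (pbest[i] - pos[i])
                  + C2 * random.random() * (gbest - pos[i]))
        pos[i] = min(1.0, max(0.0, pos[i] + vel[i]))
```

With a convex one-dimensional fitness, even five particles converge close to the minimizer within a few dozen iterations.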

  17. Optimal management with hybrid dynamics : The shallow lake problem

    NARCIS (Netherlands)

    Reddy, P.V.; Schumacher, Hans; Engwerda, Jacob; Camlibel, M.K.; Julius, A.A.; Pasumarthy, R.

    2015-01-01

    In this article we analyze an optimal management problem that arises in ecological economics using hybrid systems modeling. First, we introduce a discounted autonomous infinite-horizon hybrid optimal control problem and develop a few tools to analyze the necessary conditions for optimality. Next,

  18. Optimal management of non-Markovian biological populations

    Science.gov (United States)

    Williams, B.K.

    2007-01-01

    Wildlife populations typically are described by Markovian models, with population dynamics influenced at each point in time by current but not previous population levels. Considerable work has been done on identifying optimal management strategies under the Markovian assumption. In this paper we generalize this work to non-Markovian systems, for which population responses to management are influenced by lagged as well as current status and/or controls. We use the maximum principle of optimal control theory to derive conditions for the optimal management of such a system, and illustrate the effects of lags on the structure of optimal habitat strategies for a predator-prey system.
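
The Markovian baseline that this paper generalizes can be illustrated with a small harvest MDP solved by value iteration (all numbers hypothetical; growth here is independent of stock, so immediate harvest turns out optimal, whereas stock-dependent growth would make the optimal escapement nontrivial):

```python
# Toy renewable-population MDP: states are population levels 0..4,
# actions harvest 0..state, and growth of +1 occurs with probability 0.5
# (independent of stock, capped at 4). All numbers are hypothetical.
GAMMA = 0.95
STATES = range(5)

def next_states(s, a):
    """(no-growth, growth) successor states after harvesting a from stock s."""
    return min(4, s - a), min(4, s - a + 1)

V = [0.0] * 5
for _ in range(500):                      # value iteration to a fixed point
    V = [max(a + GAMMA * 0.5 * sum(V[t] for t in next_states(s, a))
             for a in range(s + 1))
         for s in STATES]

policy = []                               # greedy policy w.r.t. converged V
for s in STATES:
    qs = [(a + GAMMA * 0.5 * sum(V[t] for t in next_states(s, a)), a)
          for a in range(s + 1)]
    policy.append(max(qs)[1])
```

Because the reward is linear and growth does not depend on the remaining stock, the Bellman recursion yields "harvest everything" as the optimal policy in this toy.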

  19. A strategy for optimizing item-pool management

    NARCIS (Netherlands)

    Ariel, A.; van der Linden, Willem J.; Veldkamp, Bernard P.

    2006-01-01

    Item-pool management requires a balancing act between the input of new items into the pool and the output of tests assembled from it. A strategy for optimizing item-pool management is presented that is based on the idea of a periodic update of an optimal blueprint for the item pool to tune item

  20. User's manual for the BNW-II optimization code for dry/wet-cooled power plants

    Energy Technology Data Exchange (ETDEWEB)

    Braun, D.J.; Bamberger, J.A.; Braun, D.J.; Faletti, D.W.; Wiles, L.E.

    1978-05-01

    This volume provides a listing of the BNW-II dry/wet ammonia heat rejection optimization code and is an appendix to Volume I which gives a narrative description of the code's algorithms as well as logic, input and output information.

  1. An effective coded excitation scheme based on a predistorted FM signal and an optimized digital filter

    DEFF Research Database (Denmark)

    Misaridis, Thanasis; Jensen, Jørgen Arendt

    1999-01-01

    This paper presents a coded excitation imaging system based on a predistorted FM excitation and a digital compression filter designed for medical ultrasonic applications, in order to preserve both axial resolution and contrast. In radars, optimal Chebyshev windows efficiently weight a nearly...... as with pulse excitation (about 1.5 lambda), depending on the filter design criteria. The axial sidelobes are below -40 dB, which is the noise level of the measuring imaging system. The proposed excitation/compression scheme shows good overall performance and stability to the frequency shift due to attenuation...... be removed by weighting. We show that by using a predistorted chirp with amplitude or phase shaping for amplitude ripple reduction and a correlation filter that accounts for the transducer's natural frequency weighting, output sidelobe levels of -35 to -40 dB are directly obtained. When an optimized filter...

  2. An Order Coding Genetic Algorithm to Optimize Fuel Reloads in a Nuclear Boiling Water Reactor

    International Nuclear Information System (INIS)

    Ortiz, Juan Jose; Requena, Ignacio

    2004-01-01

    A genetic algorithm is used to optimize the nuclear fuel reload for a boiling water reactor; an order coding is proposed for the chromosomes, together with appropriate crossover and mutation operators. The fitness function was designed so that the genetic algorithm creates fuel reloads that satisfy the constraints on the radial power peaking factor, the minimum critical power ratio, and the maximum linear heat generation rate while optimizing the effective multiplication factor at the beginning and end of the cycle. To find the values of these variables, a neural network trained on the behavior of a reactor simulator was used to predict them, greatly decreasing the computation time in the search process. We validated this method with data from five cycles of the Laguna Verde Nuclear Power Plant in Mexico
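
An order coding treats each chromosome as a permutation of core positions, so crossover must not duplicate entries. A common choice is order crossover with swap mutation, sketched below (a hedged illustration; the paper's exact operators may differ):

```python
import random

random.seed(2)

def order_crossover(p1, p2):
    """OX operator: copy a random slice from p1, then fill the remaining
    positions with the missing genes in p2's order, so the child is always
    a valid permutation (no fuel assembly appears twice)."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in p1[a:b]]
    for i in list(range(a)) + list(range(b, n)):
        child[i] = fill.pop(0)
    return child

def swap_mutation(perm, rate=0.1):
    """Swap two random positions with a small probability."""
    perm = perm[:]
    if random.random() < rate:
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

parent1 = list(range(10))                 # ten hypothetical core positions
parent2 = random.sample(range(10), 10)
child = swap_mutation(order_crossover(parent1, parent2))
```

Both operators preserve permutation validity, which is exactly what an order coding guarantees and a naive one-point crossover would break.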

  3. Optimal management of idiopathic scoliosis in adolescence

    Science.gov (United States)

    Kotwicki, Tomasz; Chowanska, Joanna; Kinel, Edyta; Czaprowski, Dariusz; Tomaszewski, Marek; Janusz, Piotr

    2013-01-01

    Optimal management of idiopathic scoliosis requires cooperation within a professional team which includes the entire therapeutic spectrum, extending from simple watchful observation of nonprogressive mild deformities through to early surgery for rapidly deteriorating curvature. Probably most demanding is adequate management with regard to the individual course of the disease in a given patient, while avoiding overtreatment or undertreatment. PMID:24600296

  4. Outage Analysis and Optimization of SWIPT in Network-Coded Two-Way Relay Networks

    Directory of Open Access Journals (Sweden)

    Ruihong Jiang

    2017-01-01

    Full Text Available This paper investigates the outage performance of simultaneous wireless information and power transfer (SWIPT) in network-coded two-way relay systems, where a relay first harvests energy from the signals transmitted by two sources and then uses the harvested energy to forward the received information to the two sources. We consider two transmission protocols: the power splitting two-way relay (PS-TWR) and time switching two-way relay (TS-TWR) protocols. We present two explicit expressions for the system outage probability of the two protocols and further derive approximate expressions for the high- and low-SNR cases. To explore the system performance limits, two optimization problems are formulated to minimize the system outage probability. Since the problems are nonconvex and have no known solution methods, a genetic algorithm (GA) based algorithm is designed. Numerical and simulation results validate our theoretical analysis. It is shown that, by jointly optimizing the time assignment and SWIPT receiver parameters, a great performance gain can be achieved for both PS-TWR and TS-TWR. Moreover, the optimized PS-TWR always outperforms the optimized TS-TWR in terms of outage performance. Additionally, the effects of parameters including the relay location and transmit powers are discussed, which provides some insights for SWIPT-enabled two-way relay networks.

  5. Numerical optimization of the ramp-down phase with the RAPTOR code

    Science.gov (United States)

    Teplukhina, Anna; Sauter, Olivier; Felici, Federico; The Tcv Team; The ASDEX-Upgrade Team; The Eurofusion Mst1 Team

    2017-10-01

    The ramp-down optimization goal in this work is defined as the fastest possible decrease of the plasma current while avoiding any disruptions caused by reaching physical or technical limits. Numerical simulations and preliminary experiments on TCV and AUG have shown that a fast decrease of plasma elongation and an adequate timing of the H-L transition during the current ramp-down can help avoid reaching high values of the plasma internal inductance. The RAPTOR code (F. Felici et al., 2012 PPCF 54; F. Felici, 2011 EPFL PhD thesis), developed for real-time plasma control, has been used to solve the optimization problem. Recently the transport model has been extended to include the ion temperature and electron density transport equations in addition to the electron temperature and current density transport equations, widening the physical applications of the code. Gradient-based models for the transport coefficients (O. Sauter et al., 2014 PPCF 21; D. Kim et al., 2016 PPCF 58) have been implemented in RAPTOR and tested during this work. Simulations of entire AUG and TCV plasma discharges will be presented. See the author list of S. Coda et al., Nucl. Fusion 57 2017 102011.

  6. SPEXTRA: Optimal extraction code for long-slit spectra in crowded fields

    Science.gov (United States)

    Sarkisyan, A. N.; Vinokurov, A. S.; Solovieva, Yu. N.; Sholukhova, O. N.; Kostenkov, A. E.; Fabrika, S. N.

    2017-10-01

    We present a code for the optimal extraction of long-slit 2D spectra in crowded stellar fields. Its main advantages over existing spectrum extraction codes are a graphical user interface (GUI) and a convenient system for visualizing the data and extraction parameters. On the whole, the package is designed to study stars in crowded fields of nearby galaxies and star clusters in galaxies. Apart from extracting the spectra of several stars that are closely located or superimposed, it allows object spectra to be extracted with subtraction of superimposed nebulae of different shapes and different degrees of ionization. The package can also be used to study single stars against a strong background. The current version offers optimal extraction of 2D spectra using an aperture and a Gaussian function as the PSF (point spread function). In the future, the package will be extended with the option of building a PSF based on a Moffat function. We present the details of the GUI, illustrate the main features of the package, and show extraction results for several interesting objects observed with different telescopes.
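
Optimal extraction with a Gaussian PSF typically means profile-weighted summation across the slit, in the style of Horne's estimator; a minimal sketch on simulated data (an assumed illustration, not the SPEXTRA source):

```python
import math
import random

random.seed(4)

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Simulated cross-dispersion cut: a star of flux 1000 at pixel 10 with a
# Gaussian PSF (sigma = 1.5 px), flat background 5, and Gaussian read noise.
true_flux, mu, sigma, bg, noise = 1000.0, 10.0, 1.5, 5.0, 2.0
pixels = range(21)
data = [true_flux * gaussian(x, mu, sigma) + bg + random.gauss(0.0, noise)
        for x in pixels]

# Profile-weighted estimate: flux = sum(P * (D - bg) / V) / sum(P^2 / V),
# which down-weights pixels where the profile carries little signal.
P = [gaussian(x, mu, sigma) for x in pixels]
V = [noise ** 2] * len(data)                     # constant variance here
num = sum(P[i] * (data[i] - bg) / V[i] for i in pixels)
den = sum(P[i] ** 2 / V[i] for i in pixels)
flux_opt = num / den
```

Compared with a plain aperture sum, the profile weighting minimizes the variance of the flux estimate when the PSF model and pixel variances are accurate.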

  7. Optimization and OpenMP Parallelization of a Discrete Element Code for Convex Polyhedra on Multi-Core Machines

    Science.gov (United States)

    Chen, Jian; Matuttis, Hans-Georg

    2013-02-01

    We report our experiences with the optimization and OpenMP parallelization of a discrete element code for convex polyhedra on multi-core machines and introduce a novel variant of the sort-and-sweep neighborhood algorithm. While in theory the code parallelizes ideally, in practice the results on different architectures, with different compilers and performance measurement tools, depend very much on the particle number and the optimization of the code. Once difficulties with interpreting the speedup and efficiency data were overcome, respectable parallel speedups were obtained.

  8. Service Operations Optimization: Recent Development in Supply Chain Management

    Directory of Open Access Journals (Sweden)

    Bin Shen

    2015-01-01

    Full Text Available Services are key to success in operations management. Designing effective strategies with optimization techniques is a fundamental condition for performance improvement in service operations (SOs) management. In this paper, we focus on SOs optimization in the area of supply chain management, which creates the greatest business value. Specifically, we study the recent development of SOs optimization in supply chains by categorizing the work into four industries (i.e., the e-commerce industry, consumer service industry, public sector, and fashion industry) and four SOs features (i.e., advertising, channel coordination, pricing, and inventory). Moreover, we review these industries/topics and the typical optimization models. The classical optimization approaches for SOs management in supply chains are presented, and the managerial implications of SOs in supply chains are discussed.

  9. OPTIMAL CONTROL THEORY FOR SUSTAINABLE ENVIRONMENTAL MANAGEMENT

    Science.gov (United States)

    With growing world population, diminishing resources, and realization of the harmful effects of various pollutants, research focus in environmental management has shifted towards sustainability. The goal of a sustainable management strategy is to promote the structure and operati...

  10. Optimization of reload of nuclear power plants using ACO together with the GENES reactor physics code

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Alan M.M. de; Freire, Fernando S.; Nicolau, Andressa S.; Schirru, Roberto, E-mail: alan@lmp.ufrj.br, E-mail: andressa@lmp.ufrj.br, E-mail: schirru@lmp.ufrj.br, E-mail: ffreire@eletronuclear.gov.br [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil); Eletrobras Termonuclear S.A. (ELETRONUCLEAR), Rio de Janeiro, RJ (Brazil)

    2017-11-01

    The nuclear reload of a Pressurized Water Reactor (PWR) occurs whenever the burnup of the fuel elements can no longer maintain the criticality of the reactor, that is, can no longer keep the nuclear power plant operating at its nominal power. The nuclear reactor reload optimization problem consists of finding a loading pattern of fuel assemblies in the reactor core that minimizes the cost/benefit ratio, seeking maximum power generation at minimum cost, since at each reload an average of one third of the fuel elements are newly purchased. This loading pattern must also satisfy symmetry and safety constraints. In practice, for the Angra 1 Brazilian Nuclear Power Plant (NPP), it consists of placing 121 fuel elements in 121 core positions such that the new arrangement provides the best cost/benefit ratio. It is an extremely complex problem: a core of 121 fuel elements has approximately 10^13 combinations and about 10^11 local optima (roughly 1% of the search space). With this number of possible combinations it is impossible to test them all in order to choose the best. In this work a system called ACO-GENES is proposed to optimize the nuclear reactor reload problem. ACO has been used successfully on combinatorial problems, and ACO-GENES is expected to be a robust optimization system, since in addition to ACO's optimization it allows important prior knowledge such as k-infinity, burnup, etc. After optimization by ACO-GENES, the best results will be validated by a licensed reactor physics code and compared with the actual results of the cycle. (author)
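
The ACO side of ACO-GENES is not detailed here; the generic ant colony loop for a permutation problem can be sketched as follows (a toy assignment problem with a hypothetical cost matrix, not the actual reload physics):

```python
import random

random.seed(3)
n = 6
# Hypothetical assignment cost of placing element e at position p
# (minimum when e == p; a stand-in, not reload physics).
cost = [[abs(e - p) + 1 for p in range(n)] for e in range(n)]

def total_cost(perm):
    return sum(cost[e][p] for p, e in enumerate(perm))

tau = [[1.0] * n for _ in range(n)]        # pheromone for (element, position)
best, best_cost = None, float("inf")

for _ in range(40):                        # colony iterations
    for _ in range(8):                     # ants per iteration
        free, perm = list(range(n)), []
        for p in range(n):
            # pick an element for position p, weighted by pheromone / cost
            weights = [tau[e][p] / cost[e][p] for e in free]
            e = random.choices(free, weights=weights)[0]
            free.remove(e)
            perm.append(e)
        c = total_cost(perm)
        if c < best_cost:
            best, best_cost = perm, c
    for e in range(n):                     # pheromone evaporation
        for p in range(n):
            tau[e][p] *= 0.9
    for p, e in enumerate(best):           # reinforce the best-so-far tour
        tau[e][p] += 1.0 / best_cost
```

The pheromone trail biases later ants toward components of good permutations, which is what makes ACO suitable for loading-pattern-style combinatorial searches.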

  11. Optimization of reload of nuclear power plants using ACO together with the GENES reactor physics code

    International Nuclear Information System (INIS)

    Lima, Alan M.M. de; Freire, Fernando S.; Nicolau, Andressa S.; Schirru, Roberto

    2017-01-01

    The nuclear reload of a Pressurized Water Reactor (PWR) occurs whenever the burnup of the fuel elements can no longer maintain the criticality of the reactor, that is, can no longer keep the nuclear power plant operating at its nominal power. The nuclear reactor reload optimization problem consists of finding a loading pattern of fuel assemblies in the reactor core that minimizes the cost/benefit ratio, seeking maximum power generation at minimum cost, since at each reload an average of one third of the fuel elements are newly purchased. This loading pattern must also satisfy symmetry and safety constraints. In practice, for the Angra 1 Brazilian Nuclear Power Plant (NPP), it consists of placing 121 fuel elements in 121 core positions such that the new arrangement provides the best cost/benefit ratio. It is an extremely complex problem: a core of 121 fuel elements has approximately 10^13 combinations and about 10^11 local optima (roughly 1% of the search space). With this number of possible combinations it is impossible to test them all in order to choose the best. In this work a system called ACO-GENES is proposed to optimize the nuclear reactor reload problem. ACO has been used successfully on combinatorial problems, and ACO-GENES is expected to be a robust optimization system, since in addition to ACO's optimization it allows important prior knowledge such as k-infinity, burnup, etc. After optimization by ACO-GENES, the best results will be validated by a licensed reactor physics code and compared with the actual results of the cycle. (author)

  12. Optimizing antibiotic usage in hospitals: a qualitative study of the perspectives of hospital managers.

    Science.gov (United States)

    Broom, A; Gibson, A F; Broom, J; Kirby, E; Yarwood, T; Post, J J

    2016-11-01

    Antibiotic optimization in hospitals is an increasingly critical priority in the context of proliferating resistance. Despite the emphasis on doctors, optimizing antibiotic use within hospitals requires an understanding of how different stakeholders, including non-prescribers, influence practice and practice change. This study was designed to understand Australian hospital managers' perspectives on antimicrobial resistance, managing antibiotic governance, and negotiating clinical vis-à-vis managerial priorities. Twenty-three managers in three hospitals participated in qualitative semi-structured interviews in Australia in 2014 and 2015. Data were systematically coded and thematically analysed. The findings demonstrate, from a managerial perspective: (1) competing demands that can hinder the prioritization of antibiotic governance; (2) ineffectiveness of audit and monitoring methods that limit rationalization for change; (3) limited clinical education and feedback to doctors; and (4) management-directed change processes are constrained by the perceived absence of a 'culture of accountability' for antimicrobial use amongst doctors. Hospital managers report considerable structural and interprofessional challenges to actualizing antibiotic optimization and governance. These challenges place optimization as a lower priority vis-à-vis other issues that management are confronted with in hospital settings, and emphasize the importance of antimicrobial stewardship (AMS) programmes that engage management in understanding and addressing the barriers to change. Copyright © 2016 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  13. Application of startup/core management code system to YGN 3 startup testing

    International Nuclear Information System (INIS)

    Chi, Sung Goo; Hah, Yung Joon; Doo, Jin Yong; Kim, Dae Kyum

    1995-01-01

    YGN 3 is the first nuclear power plant in Korea to use a fixed incore detector system for startup testing and core management. The startup/core management code system was developed from existing ABB-C-E codes and applied to YGN 3 startup testing, especially physics and CPC (Core Protection Calculator)/COLSS (Core Operating Limit Supervisory System) related testing. The startup/core management code system consists of startup codes, which include CEBASE, CECOR, CEFAST and CEDOPS, and startup data reduction codes, which include FLOWRATE, COREPERF, CALMET, and VARTAV. These codes were implemented on an HP/Apollo model 9000 series 400 workstation at the YGN 3 site and successfully applied to startup testing and core management. The startup codes contributed greatly to improving the reliability of the test results and shortening the test period by acquiring and analyzing core data automatically. The data reduction codes saved manpower and time for test data reduction and decreased the chance of error in the analysis. It is expected that this code system will make similar contributions to reducing the startup testing duration of YGN 4 and UCN 3,4

  14. Development of Monte Carlo-based pebble bed reactor fuel management code

    International Nuclear Information System (INIS)

    Setiadipura, Topan; Obara, Toru

    2014-01-01

    Highlights: • A new Monte Carlo-based fuel management code for OTTO cycle pebble bed reactors was developed. • The double heterogeneity was modeled using a statistical method in the MVP-BURN code. • The code can perform analysis of the equilibrium and non-equilibrium phases. • Code-to-code comparisons for a Once-Through-Then-Out case were investigated. • The ability of the code to accommodate the void cavity was confirmed. - Abstract: A fuel management code for pebble bed reactors (PBRs) based on the Monte Carlo method has been developed in this study. The code, named Monte Carlo burnup analysis code for PBR (MCPBR), enables simulation of the Once-Through-Then-Out (OTTO) cycle of a PBR from the running-in phase to the equilibrium condition. In MCPBR, a burnup calculation based on a continuous-energy Monte Carlo code, MVP-BURN, is coupled with an additional utility code to simulate the OTTO cycle of a PBR. MCPBR has several advantages in modeling PBRs, namely its Monte Carlo neutron transport modeling, its capability of explicitly modeling the double heterogeneity of the PBR core, and its ability to model different axial fuel speeds in the PBR core. Analysis at the equilibrium condition of a simplified PBR was used as the validation test of MCPBR. The calculation results of the code were compared with the results of diffusion-based PBR fuel management codes, namely the VSOP and PEBBED codes. Using the JENDL-4.0 nuclide library, MCPBR gave 4.15% and 3.32% lower k_eff values than VSOP and PEBBED, respectively, while using JENDL-3.3, MCPBR gave 2.22% and 3.11% higher k_eff values than VSOP and PEBBED, respectively. The ability of MCPBR to analyze neutron transport in the top void of the PBR core and its effects was also confirmed

  15. Optimization of Candu fuel management with gradient methods using generalized perturbation theory

    International Nuclear Information System (INIS)

    Chambon, R.; Varin, E.; Rozon, D.

    2005-01-01

    CANDU fuel management problems are solved using a time-average representation of the core. Optimization problems based on this representation were defined in the early nineties. The mathematical programming approach using generalized perturbation theory (GPT) developed then has been implemented in the reactor code DONJON. The use of the augmented Lagrangian (AL) method is presented and evaluated in this paper. This approach is mandatory for problems with new constraints. Combined with the classical Lemke method, it proves very efficient, reaching the optimal solution in a very limited number of iterations. (authors)
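
The augmented Lagrangian method mentioned above replaces a constrained problem with a sequence of unconstrained ones; a minimal sketch for min x1² + x2² subject to x1 + x2 = 1 (a textbook toy, not the DONJON implementation):

```python
def solve_inner(x, lam, mu, lr=0.05, steps=300):
    """Gradient descent on the augmented Lagrangian
    L(x) = x1^2 + x2^2 + lam*h(x) + (mu/2)*h(x)^2,  h(x) = x1 + x2 - 1."""
    for _ in range(steps):
        h = x[0] + x[1] - 1.0
        g0 = 2.0 * x[0] + lam + mu * h
        g1 = 2.0 * x[1] + lam + mu * h
        x = [x[0] - lr * g0, x[1] - lr * g1]
    return x

x, lam, mu = [0.0, 0.0], 0.0, 10.0
for _ in range(20):                      # outer loop: multiplier updates
    x = solve_inner(x, lam, mu)
    lam += mu * (x[0] + x[1] - 1.0)      # standard multiplier update
# x converges to the analytic optimum (0.5, 0.5) and lam to -1
```

Each outer iteration shrinks the constraint violation, so the penalty weight mu need not grow without bound, which is the practical advantage of AL over a pure penalty method.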

  16. Optimal inventory management and order book modeling

    KAUST Repository

    Baradel, Nicolas

    2018-02-16

    We model the behavior of three agent classes acting dynamically in a limit order book of a financial asset. Namely, we consider market makers (MM), high-frequency trading (HFT) firms, and institutional brokers (IB). Given a prior dynamic of the order book, similar to the one considered in the Queue-Reactive models [14, 20, 21], the MM and the HFT define their trading strategy by optimizing the expected utility of terminal wealth, while the IB has a prescheduled task to sell or buy many shares of the considered asset. We derive the variational partial differential equations that characterize the value functions of the MM and HFT and explain how almost optimal control can be deduced from them. We then provide a first illustration of the interactions that can take place between these different market participants by simulating the dynamic of an order book in which each of them plays his own (optimal) strategy.

  17. Heuristic rules embedded genetic algorithm for in-core fuel management optimization

    Science.gov (United States)

    Alim, Fatih

    The objective of this study was to develop a unique methodology and a practical tool for designing the loading pattern (LP) and burnable poison (BP) pattern for a given Pressurized Water Reactor (PWR) core. Because of the large number of possible combinations for the fuel assembly (FA) loading in the core, the design of the core configuration is a complex optimization problem. It requires finding an optimal FA arrangement and BP placement in order to achieve maximum cycle length while satisfying the safety constraints. Genetic Algorithms (GA) have already been used to solve this problem for LP optimization for both PWR and Boiling Water Reactor (BWR) cores. The GA, a stochastic method, works with a group of solutions and uses random variables to make decisions. Based on the theory of evolution, the GA involves natural selection and reproduction of the individuals in the population for the next generation. The GA works by creating an initial population, evaluating it, and then improving the population by using evolutionary operators. To solve this optimization problem, an LP optimization package, the GARCO (Genetic Algorithm Reactor Code Optimization) code, was developed in the framework of this thesis. This code is applicable to all types of PWR cores having different geometries and structures, with an unlimited number of FA types in the inventory. To reach this goal, an innovative GA is developed by modifying the classical representation of the genotype. To obtain the best result in a shorter time, not only the representation but also the algorithm is changed to use in-core fuel management heuristic rules. The improved GA code was tested to demonstrate and verify the advantages of the new enhancements. The developed methodology is explained in this thesis, and preliminary results are shown for the VVER-1000 reactor hexagonal geometry core and the TMI-1 PWR.
The core physics code
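
    The permutation-encoded GA described above can be sketched in a few lines. This is an illustration only: the truncation selection, swap mutation, and the toy fitness surrogate are assumptions, not GARCO's actual operators, and a real tool would call a core physics code instead of the surrogate.

```python
import random

REACTIVITY = [0.5, 0.8, 1.1, 1.4, 1.7, 2.0]  # hypothetical assembly worths

def toy_fitness(pattern):
    # Toy surrogate for cycle length: reward placing high-reactivity
    # assemblies at the periphery (high index) and low ones near the
    # centre (index 0). A real tool would run a core simulator here.
    n = len(pattern)
    return sum(REACTIVITY[a] * i / (n - 1) for i, a in enumerate(pattern))

def evolve(pop_size=30, generations=200, seed=1):
    rng = random.Random(seed)
    n = len(REACTIVITY)
    # Genotype: a permutation of assembly IDs over core positions.
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=toy_fitness, reverse=True)
        parents = pop[: pop_size // 2]        # elitist truncation selection
        pop = parents[:]
        while len(pop) < pop_size:            # offspring by swap mutation,
            child = rng.choice(parents)[:]    # which preserves the permutation
            i, j = rng.sample(range(n), 2)
            child[i], child[j] = child[j], child[i]
            pop.append(child)
    return max(pop, key=toy_fitness)

best = evolve()
```

    The swap mutation is the key design choice: it keeps every individual a valid loading pattern, so no repair step is needed.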

  18. OPT13B and OPTIM4 - computer codes for optical model calculations

    International Nuclear Information System (INIS)

    Pal, S.; Srivastava, D.K.; Mukhopadhyay, S.; Ganguly, N.K.

    1975-01-01

    OPT13B is a computer code in FORTRAN for optical model calculations with automatic search. A summary of the different formulae used for computation is given. Numerical methods are discussed. The 'search' technique, followed to obtain the set of optical model parameters which produces the best fit to experimental data in a least-squares sense, is also discussed. The different subroutines of the program are briefly described. Input-output specifications are given in detail. A modified version of OPT13B is OPTIM4. It can be used for optical model calculations where the form factors of different parts of the optical potential are known point by point. A brief description of the modifications is given. (author)

  19. Review of dynamic optimization methods in renewable natural resource management

    Science.gov (United States)

    Williams, B.K.

    1989-01-01

    In recent years, applications of dynamic optimization procedures in natural resource management have proliferated. A systematic review of these applications is given in terms of a number of optimization methodologies and natural resource systems. The applicability of the methods to renewable natural resource systems is compared in terms of system complexity, system size, and precision of the optimal solutions. Recommendations are made concerning the appropriate methods for certain kinds of biological resource problems.

  20. Asset Allocation and Optimal Contract for Delegated Portfolio Management

    Science.gov (United States)

    Liu, Jingjun; Liang, Jianfeng

    This article studies the portfolio selection and contracting problems between an individual investor and a professional portfolio manager in a discrete-time principal-agent framework. Portfolio selection and optimal contracts are obtained in closed form. The optimal contract is composed of a fixed fee, the cost, and a fraction of the excess expected return. The optimal portfolio is similar to the classical two-fund separation theorem.

  1. IAEA provisional code of practice on management of radioactive waste from nuclear power plants

    International Nuclear Information System (INIS)

    1982-10-01

    This Code of Practice defines the minimum requirements for operations and design of structures, systems and components important for management of wastes from thermal nuclear power plants. It emphasizes what safety requirements shall be met rather than specifies how these requirements can be met; the latter aspect is covered in Safety Guides. The Code defines the need for a Government to assume responsibility for regulating waste management practices in conjunction with the regulation of a nuclear power plant. The Code does not prejudge the organization of the regulatory authority, which may differ from one Member State to another, and may involve more than one body. Similarly, the Code does not deal specifically with the functions of a regulatory authority responsible for such matters, although it may be of value to Member States in providing a basis for consideration of such functions. The Code deals with the entire management system for all wastes from nuclear power plants embodying thermal reactors including PWR, BWR, HWR and HTGR technologies. Topics included are: design, normal and abnormal operation, and regulation of management systems for gaseous, liquid and solid wastes, including decommissioning wastes. The Code includes measures to be taken with regard to the wastes arising from spent fuel management at nuclear power plants. However, the options for further management of spent fuel are only outlined since it is the subject of decisions by individual Member States. The Code does not require that an option(s) be decided upon prior to construction or operation of a nuclear power plant

  2. Adaptive optimization for active queue management supporting TCP flows

    NARCIS (Netherlands)

    Baldi, S.; Kosmatopoulos, Elias B.; Pitsillides, Andreas; Lestas, Marios; Ioannou, Petros A.; Wan, Y.; Chiu, George; Johnson, Katie; Abramovitch, Danny

    2016-01-01

    An adaptive decentralized strategy for active queue management of TCP flows over communication networks is presented. The proposed strategy solves locally, at each link, an optimal control problem, minimizing a cost composed of residual capacity and buffer queue size. The solution of the optimal

  3. Spectral decomposition of optimal asset-liability management

    NARCIS (Netherlands)

    Decamps, M.; de Schepper, A.; Goovaerts, M.

    2009-01-01

    This paper concerns optimal asset-liability management when the assets and the liabilities are modeled by means of correlated geometric Brownian motions as suggested in Gerber and Shiu [2003. Geometric Brownian motion models for assets and liabilities: from pension funding to optimal dividends.

  4. Coding hazardous tree failures for a data management system

    Science.gov (United States)

    Lee A. Paine

    1978-01-01

    Codes for automatic data processing (ADP) are provided for hazardous tree failure data submitted on Report of Tree Failure forms. Definitions of data items and suggestions for interpreting ambiguously worded reports are also included. The manual is intended to insure the production of accurate and consistent punched ADP cards which are used in transfer of the data to...

  5. Synergy optimization and operation management on syndicate complementary knowledge cooperation

    Science.gov (United States)

    Tu, Kai-Jan

    2014-10-01

    The number of multi-enterprise knowledge cooperations has grown steadily as a result of global innovation competition. Based on optimization and operation studies, this article concludes that synergy management is an effective means of breaking through various management barriers and resolving the chaotic dynamics of cooperation. Enterprises must communicate a shared system vision and access complementary knowledge. These are crucial considerations for enterprises seeking to exert the synergy of their optimization and operation knowledge cooperation to meet global marketing challenges.

  6. Optimization Problems in Supply Chain Management

    NARCIS (Netherlands)

    D. Romero Morales (Dolores)

    2000-01-01

    textabstractMaria Dolores Romero Morales was born on Augustus 5th, 1971, in Sevilla (Spain). She studied Mathematics at University of Sevilla from 1989 to 1994 and specialized in Statistics and Operations Research. She wrote her Master's thesis on Global Optimization in Location Theory under the

  7. Hybrid vehicle energy management: singular optimal control

    NARCIS (Netherlands)

    Delprat, S.; Hofman, T.; Paganelli, S.

    2017-01-01

    Hybrid vehicle energy management is often studied in simulation as an optimal control problem. Under strict convexity assumptions, a solution can be developed using Pontryagin's minimum principle. In practice, however, many engineers do not formally check these assumptions resulting in the possible

  8. Optimal inventory management with supply backordering

    NARCIS (Netherlands)

    Jaksic, M.; Fransoo, J.C.

    2015-01-01

    We study the inventory control problem of a retailer working under stochastic demand and stochastic limited supply. We assume that the unfilled part of the retailer's order is fully backordered at the supplier and replenished with certainty in the following period. As it may not always be optimal

  9. Solution of optimization problems by means of the CASTEM 2000 computer code

    International Nuclear Information System (INIS)

    Charras, Th.; Millard, A.; Verpeaux, P.

    1991-01-01

    In the nuclear industry, it can be necessary to use robots for operation in contaminated environments. Most of the time, the positioning of some parts of the robot must be very accurate, which highly depends on the structural (mass and stiffness) properties of its various components. Therefore, there is a need for a 'best' design, which is a compromise between technical (mechanical properties) and economical (material quantities, design and manufacturing cost) matters. This is precisely the aim of optimization techniques in the framework of structural analysis. A general statement of this problem could be as follows: find the set of parameters which leads to the minimum of a given function and satisfies some constraints. For example, in the case of a robot component, the parameters can be some geometrical data (plate thickness, ...), the function can be the weight, and the constraints can consist of design criteria like a given stiffness and of some manufacturing technological constraints (minimum available thickness, etc). For nuclear industry purposes, a robust method was chosen and implemented in the new generation computer code CASTEM 2000. The solution of the optimum design problem is obtained by solving a sequence of convex subproblems, in which the various functions (the function to minimize and the constraints) are transformed by convex linearization. The method has been programmed for continuous as well as discrete variables. According to the highly modular architecture of the CASTEM 2000 code, only one new operation had to be introduced: the solution of a subproblem with convex linearized functions, which is achieved by means of a conjugate gradient technique. All other operations were already available in the code, and the overall optimum design is realized by means of the Gibiane language. An example of application is presented to illustrate the possibilities of the method. (author)
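
    The sequence-of-convex-subproblems idea can be shown on a toy sizing problem. This is an illustration under assumptions (two design variables, a compliance-type constraint, made-up numbers), not CASTEM 2000's implementation:

```python
import math

def conlin_sizing(c, C, iters=20):
    """Minimize total material sum(t) subject to a compliance-type
    constraint sum(c_i / t_i) <= C, via reciprocal (convex) linearization:
    the constraint is convex in 1/t_i, and each convex subproblem has the
    closed-form KKT solution t_i = sqrt(lambda * c_i)."""
    t = [1.0] * len(c)
    for _ in range(iters):
        # Subproblem: min sum(t) s.t. sum(c_i/t_i) = C (active at optimum).
        # KKT gives t_i = sqrt(lambda*c_i); pick lambda so the constraint holds.
        s = sum(math.sqrt(ci) for ci in c)
        t = [math.sqrt(ci) * s / C for ci in c]
    return t

t = conlin_sizing([4.0, 1.0], C=3.0)
```

    For this particular constraint the reciprocal linearization is exact, so the iteration converges immediately; for general structural responses each cycle would re-linearize around the current design.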

  10. Core design optimization by integration of a fast 3-D nodal code in a heuristic search procedure

    Energy Technology Data Exchange (ETDEWEB)

    Geemert, R. van; Leege, P.F.A. de; Hoogenboom, J.E.; Quist, A.J. [Delft University of Technology, NL-2629 JB Delft (Netherlands)

    1998-07-01

    An automated design tool is being developed for the Hoger Onderwijs Reactor (HOR) in Delft, the Netherlands, which is a 2 MWth swimming-pool type research reactor. As a black-box evaluator, the 3-D nodal code SILWER, which up to now has been used only for evaluation of predetermined core designs, is integrated in the core optimization procedure. SILWER is a part of PSI's ELCOS package and features optional additional thermal-hydraulic, control rod and xenon poisoning calculations. This allows for fast and accurate evaluation of different core designs during the optimization search. Special attention is paid to handling the input and output files for SILWER such that no adjustment of the code itself is required for its integration in the optimization programme. The optimization objective, the safety and operation constraints, as well as the optimization procedure, are discussed. (author)

  11. Core design optimization by integration of a fast 3-D nodal code in a heuristic search procedure

    International Nuclear Information System (INIS)

    Geemert, R. van; Leege, P.F.A. de; Hoogenboom, J.E.; Quist, A.J.

    1998-01-01

    An automated design tool is being developed for the Hoger Onderwijs Reactor (HOR) in Delft, the Netherlands, which is a 2 MWth swimming-pool type research reactor. As a black-box evaluator, the 3-D nodal code SILWER, which up to now has been used only for evaluation of predetermined core designs, is integrated in the core optimization procedure. SILWER is a part of PSI's ELCOS package and features optional additional thermal-hydraulic, control rod and xenon poisoning calculations. This allows for fast and accurate evaluation of different core designs during the optimization search. Special attention is paid to handling the input and output files for SILWER such that no adjustment of the code itself is required for its integration in the optimization programme. The optimization objective, the safety and operation constraints, as well as the optimization procedure, are discussed. (author)

  12. Compliance Behavior Analysis of the Ship Crew to the International Safety Management (ISM) Code in Indonesia

    OpenAIRE

    Desi Albert Mamahit; Heny K Daryanto; Ujang Sumarwan; Eva Zhoriva Yusuf

    2013-01-01

    The purpose of this code is to provide international standards for the management and safe operation of ships and for pollution prevention. Furthermore, this study aims to identify the role of the ISM Code in maritime activities in Indonesia and to determine the perceptions, attitudes and compliance behavior of ship crews regarding the ISM Code. The research was conducted on crews at the Port of Tanjung Priok in Jakarta. Data collection and processing took 3 months. The study was...

  13. Development of hydraulic analysis code for optimizing thermo-chemical is process reactors

    International Nuclear Information System (INIS)

    Terada, Atsuhiko; Hino, Ryutaro; Hirayama, Toshio; Nakajima, Norihiro; Sugiyama, Hitoshi

    2007-01-01

    The Japan Atomic Energy Agency has been conducting studies on the thermochemical IS process for water-splitting hydrogen production. Based on the test results and know-how obtained through bench-scale tests, a pilot test plant with a hydrogen production capacity of 30 Nm³/h is being designed conceptually as the next step of the IS process development. In the design of the IS pilot plant, it is important to make the chemical reactors compact and high-performing from the viewpoint of plant cost reduction. A new hydraulic analysis code has been developed for optimizing the mixing performance of multi-phase flows involving chemical reactions, especially in the Bunsen reactor, where a complex flow pattern with gas-liquid chemical interaction involving flow instability is expected. Preliminary analytical results obtained with this code, especially flow patterns induced by swirling flow, agreed well with those measured in water experiments, which showed a vortex breakdown pattern in a simplified Bunsen reactor. (author)

  14. The role of stochasticity in an information-optimal neural population code

    International Nuclear Information System (INIS)

    Stocks, N G; Nikitin, A P; McDonnell, M D; Morse, R P

    2009-01-01

    In this paper we consider the optimisation of Shannon mutual information (MI) in the context of two model neural systems. The first is a stochastic pooling network (population) of McCulloch-Pitts (MP) type neurons (logical threshold units) subject to stochastic forcing; the second is (in a rate coding paradigm) a population of neurons that each display Poisson statistics (the so-called 'Poisson neuron'). The mutual information is optimised as a function of a parameter that characterises the 'noise level': in the MP array this parameter is the standard deviation of the noise; in the population of Poisson neurons it is the window length used to determine the spike count. In both systems we find that the emergent neural architecture and, hence, the code that maximises the MI is strongly influenced by the noise level. Low noise levels lead to a heterogeneous distribution of neural parameters (diversity), whereas medium to high noise levels result in the clustering of neural parameters into distinct groups that can be interpreted as subpopulations. In both cases the number of subpopulations increases with a decrease in noise level. Our results suggest that subpopulations are a generic feature of an information-optimal neural population.

  15. The role of stochasticity in an information-optimal neural population code

    Energy Technology Data Exchange (ETDEWEB)

    Stocks, N G; Nikitin, A P [School of Engineering, University of Warwick, Coventry CV4 7AL (United Kingdom); McDonnell, M D [Institute for Telecommunications Research, University of South Australia, SA 5095 (Australia); Morse, R P, E-mail: n.g.stocks@warwick.ac.u [School of Life and Health Sciences, Aston University, Birmingham B4 7ET (United Kingdom)

    2009-12-01

    In this paper we consider the optimisation of Shannon mutual information (MI) in the context of two model neural systems. The first is a stochastic pooling network (population) of McCulloch-Pitts (MP) type neurons (logical threshold units) subject to stochastic forcing; the second is (in a rate coding paradigm) a population of neurons that each display Poisson statistics (the so-called 'Poisson neuron'). The mutual information is optimised as a function of a parameter that characterises the 'noise level': in the MP array this parameter is the standard deviation of the noise; in the population of Poisson neurons it is the window length used to determine the spike count. In both systems we find that the emergent neural architecture and, hence, the code that maximises the MI is strongly influenced by the noise level. Low noise levels lead to a heterogeneous distribution of neural parameters (diversity), whereas medium to high noise levels result in the clustering of neural parameters into distinct groups that can be interpreted as subpopulations. In both cases the number of subpopulations increases with a decrease in noise level. Our results suggest that subpopulations are a generic feature of an information-optimal neural population.
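
    The noise dependence of the MI in a pooling network of threshold units can be reproduced with a short exact calculation. This is a sketch under assumed parameters (15 identical McCulloch-Pitts units, a threshold above both levels of an equiprobable binary input), not the paper's exact configuration:

```python
import math

def q(z):  # Gaussian tail probability P(Z > z)
    return 0.5 * math.erfc(z / math.sqrt(2))

def binom_pmf(n, k, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def mutual_info(n_neurons, theta, sigma):
    """I(X;K) in bits for an equiprobable binary input x in {0,1} driving
    n_neurons identical threshold units with i.i.d. Gaussian noise of
    standard deviation sigma; the population output is the count K of
    units whose noisy input exceeds the threshold theta."""
    ps = [q((theta - x) / sigma) for x in (0.0, 1.0)]
    mi = 0.0
    for k in range(n_neurons + 1):
        pk_given = [binom_pmf(n_neurons, k, p) for p in ps]
        pk = 0.5 * sum(pk_given)
        for pkg in pk_given:
            if pkg > 0 and pk > 0:
                mi += 0.5 * pkg * math.log2(pkg / pk)
    return mi

# With the threshold above both signal levels, near-zero noise transmits
# almost nothing, while moderate noise recovers information.
low, mid_, high = (mutual_info(15, 1.5, s) for s in (0.05, 0.5, 5.0))
```

    The non-monotone dependence on sigma (mid_ exceeding both low and high) is the stochastic-facilitation effect the abstract describes.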

  16. A nodal Green's function method of reactor core fuel management code, NGCFM2D

    International Nuclear Information System (INIS)

    Li Dongsheng; Yao Dong.

    1987-01-01

    This paper presents the mathematical model and program structure of the nodal Green's function method reactor core fuel management code NGCFM2D. Computing results for some reactor cores obtained with NGCFM2D are analysed and compared with other codes

  17. Management-retrieval code system of fission barrier parameter sub-library

    International Nuclear Information System (INIS)

    Zhang Limin; Su Zongdi; Ge Zhigang

    1995-01-01

    The fission barrier parameter (FBP) library, a sub-library of the Chinese Evaluated Nuclear Parameter Library (CENPL), stores various widely used fission barrier parameters from different historical periods; the required fission barrier parameters can be retrieved using the management-retrieval code system of the FBP sub-library. The function, features and operating instructions of the code system are described briefly

  18. Radiation protection optimization and work management

    International Nuclear Information System (INIS)

    Schieber, C.

    1994-09-01

    Quantifying the influence of factors related to work management, together with the results obtained from applying a dosimetric-economic evaluation model to radiation protection practice, shows that the application of the ALARA principle must not be limited to actions on the radiation sources: a wide field of action also exists in the management of the volume of work performed under irradiation. 53 refs., 5 tabs., 10 figs., 4 appendixes

  19. Fuel management optimization for a PWR

    International Nuclear Information System (INIS)

    Dumas, M.; Robeau, D.

    1981-04-01

    This study aims to optimize the refueling pattern of a PWR. Two methods are developed, both based on a linearized form of the optimization problem. The first method determines a feasible solution in two steps: in the first, the original problem is replaced by a relaxed one, which is solved by the Method of Approximation Programming; the second step uses the Branch and Bound method to find the feasible solution closest to the solution obtained in the first step. The second method starts from a given refueling pattern and tries to improve it by calculating the effects of 2-by-2, 3-by-3 and 4-by-4 permutations on the objective function. Numerical results are given for a typical PWR refueling using the two methods
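
    The second method's pairwise-permutation improvement can be sketched as a greedy 2-swap local search. The objective below is a made-up surrogate for illustration (it stands in for the paper's linearized objective, which would come from a core model):

```python
def local_search_2swap(pattern, objective):
    """Greedy improvement by exhaustive 2-by-2 assembly permutations:
    repeatedly apply any pairwise swap that lowers the objective, until
    no swap helps (a local optimum of the refueling pattern)."""
    best = objective(pattern)
    improved = True
    while improved:
        improved = False
        for i in range(len(pattern)):
            for j in range(i + 1, len(pattern)):
                pattern[i], pattern[j] = pattern[j], pattern[i]
                val = objective(pattern)
                if val < best:
                    best, improved = val, True
                else:  # revert a non-improving swap
                    pattern[i], pattern[j] = pattern[j], pattern[i]
    return pattern, best

# Toy objective (an assumption, standing in for a core physics evaluation):
# keep high-reactivity assemblies away from the centre (index 0).
react = {0: 2.0, 1: 1.5, 2: 1.0, 3: 0.5}
obj = lambda p: sum(react[a] / (i + 1) for i, a in enumerate(p))
pat, score = local_search_2swap([0, 1, 2, 3], obj)
```

    Extending the move set to 3-by-3 and 4-by-4 permutations, as in the paper, enlarges the neighbourhood and escapes some 2-swap local optima at higher cost per iteration.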

  20. An optimized cosine-modulated nonuniform filter bank design for subband coding of ECG signal

    Directory of Open Access Journals (Sweden)

    A. Kumar

    2015-07-01

    A simple iterative technique for the design of nonuniform cosine-modulated filter banks (CMFB) is presented in this paper. The proposed technique employs a single parameter for optimization. The nonuniform cosine-modulated filter banks are derived by merging the adjacent filters of uniform cosine-modulated filter banks. The prototype filter is designed with the aid of different adjustable window functions, such as Kaiser, Cosh and Exponential, using the constrained equiripple finite impulse response (FIR) digital filter design technique. In this method, either the cutoff frequency or the passband edge frequency is varied in order to adjust the filter coefficients so that the reconstruction error is minimized toward zero. The performance and effectiveness of the proposed method in terms of peak reconstruction error (PRE), aliasing distortion (AD), computational (CPU) time, and number of iterations (NOI) are shown through numerical examples and comparative studies. Finally, the technique is exploited for subband coding of electrocardiogram (ECG) and speech signals.
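
    The single-parameter idea (sweep the cutoff of a windowed prototype until a reconstruction condition is met) can be sketched for the simplest 2-band case. This is an assumption-laden illustration, not the paper's algorithm: a Kaiser-windowed sinc prototype, tuned by bisection so its response at the band edge pi/2 equals 1/sqrt(2), the classic condition that approximately minimizes amplitude distortion:

```python
import math

def prototype(n_taps, fc, beta=8.0):
    """Kaiser-windowed sinc lowpass prototype with cutoff fc (x pi rad/sample)."""
    m = (n_taps - 1) / 2
    def i0(x):  # modified Bessel function I0 via its power series
        s, t = 1.0, 1.0
        for k in range(1, 25):
            t *= (x / (2 * k)) ** 2
            s += t
        return s
    h = []
    for n in range(n_taps):
        x = n - m
        sinc = fc if x == 0 else math.sin(math.pi * fc * x) / (math.pi * x)
        w = i0(beta * math.sqrt(1 - (x / m) ** 2)) / i0(beta)
        h.append(sinc * w)
    return h

def amp_at(h, omega):
    # Magnitude of the frequency response at omega
    re = sum(c * math.cos(omega * n) for n, c in enumerate(h))
    im = sum(-c * math.sin(omega * n) for n, c in enumerate(h))
    return math.hypot(re, im)

def tune_cutoff(n_taps=63, lo=0.3, hi=0.7, iters=40):
    """Bisect on the cutoff (the single design parameter) until the
    prototype's amplitude at pi/2 hits 1/sqrt(2)."""
    for _ in range(iters):
        fc = 0.5 * (lo + hi)
        if amp_at(prototype(n_taps, fc), math.pi / 2) < math.sqrt(0.5):
            lo = fc  # response too low at the band edge: raise the cutoff
        else:
            hi = fc
    return 0.5 * (lo + hi)

fc = tune_cutoff()
```

    The paper instead iterates the cutoff/passband edge to drive the peak reconstruction error of the full cosine-modulated bank toward zero; the bisection above shows the same one-dimensional tuning loop in miniature.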

  1. Stereoscopic Visual Attention-Based Regional Bit Allocation Optimization for Multiview Video Coding

    Directory of Open Access Journals (Sweden)

    Dai Qionghai

    2010-01-01

    We propose a Stereoscopic Visual Attention (SVA) based regional bit allocation optimization for Multiview Video Coding (MVC) by exploiting visual redundancies from human perception. We propose a novel SVA model, where multiple perceptual stimuli including depth, motion, intensity, color, and orientation contrast are utilized, to simulate the visual attention mechanisms of the human visual system with stereoscopic perception. Then, a semantic region-of-interest (ROI) is extracted based on the saliency maps of SVA. Both objective and subjective evaluations of extracted ROIs indicated that the proposed SVA-based ROI extraction scheme outperforms schemes using only spatial or/and temporal visual attention clues. Finally, using the extracted SVA-based ROIs, a regional bit allocation optimization scheme is presented to allocate more bits to SVA-based ROIs for high image quality and fewer bits to background regions for efficient compression. Experimental results on MVC show that the proposed regional bit allocation algorithm can achieve over % bit-rate saving while maintaining the subjective image quality. Meanwhile, the image quality of ROIs is improved by  dB at the cost of insensitive image quality degradation of the background image.

  2. Efficient Coding and Statistically Optimal Weighting of Covariance among Acoustic Attributes in Novel Sounds

    Science.gov (United States)

    Stilp, Christian E.; Kluender, Keith R.

    2012-01-01

    To the extent that sensorineural systems are efficient, redundancy should be extracted to optimize transmission of information, but perceptual evidence for this has been limited. Stilp and colleagues recently reported efficient coding of robust correlation (r = .97) among complex acoustic attributes (attack/decay, spectral shape) in novel sounds. Discrimination of sounds orthogonal to the correlation was initially inferior but later comparable to that of sounds obeying the correlation. These effects were attenuated for less-correlated stimuli (r = .54) for reasons that are unclear. Here, statistical properties of correlation among acoustic attributes essential for perceptual organization are investigated. Overall, simple strength of the principal correlation is inadequate to predict listener performance. Initial superiority of discrimination for statistically consistent sound pairs was relatively insensitive to decreased physical acoustic/psychoacoustic range of evidence supporting the correlation, and to more frequent presentations of the same orthogonal test pairs. However, increased range supporting an orthogonal dimension has substantial effects upon perceptual organization. Connectionist simulations and Eigenvalues from closed-form calculations of principal components analysis (PCA) reveal that perceptual organization is near-optimally weighted to shared versus unshared covariance in experienced sound distributions. Implications of reduced perceptual dimensionality for speech perception and plausible neural substrates are discussed. PMID:22292057
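
    The PCA logic invoked above is easy to reproduce: when two acoustic attributes share most of their variance, the leading eigenvalue of their covariance matrix dwarfs the minor one, quantifying the redundancy an efficient code can strip out. A sketch, where the simulated "attack/decay" and "spectral shape" values and their correlation level are assumptions for illustration:

```python
import math, random

def covariance_eigen(points):
    """Eigenvalues of the 2x2 sample covariance matrix: the major one
    captures the shared (correlated) variance across the two attributes,
    the minor one the unshared variance orthogonal to it."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    d = math.sqrt(tr * tr / 4 - det)
    return tr / 2 + d, tr / 2 - d  # (major, minor)

rng = random.Random(0)
# Two attributes driven by one shared source plus small independent noise,
# giving a strong principal correlation (roughly r ~ .95 here):
shared = [rng.gauss(0, 1) for _ in range(2000)]
pts = [(s + rng.gauss(0, 0.25), s + rng.gauss(0, 0.25)) for s in shared]
major, minor = covariance_eigen(pts)
```

    The large major/minor ratio is what makes discrimination along the correlated dimension cheap and discrimination orthogonal to it initially hard, as the abstract reports.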

  3. Development of Learning Management in Moral Ethics and Code of Ethics of the Teaching Profession Course

    Science.gov (United States)

    Boonsong, S.; Siharak, S.; Srikanok, V.

    2018-02-01

    The purposes of this research were to develop learning management aimed at enhancing students' moral ethics and code of ethics of the teaching profession at Rajamangala University of Technology Thanyaburi (RMUTT). The contextual study and the ideas for developing the learning management were gathered through document study, the focus group method, and content analysis of documents about moral ethics and the code of ethics of the teaching profession concerning the Graduate Diploma for Teaching Profession Program. The main tools of this research were summary and analysis papers. The results showed that the learning management could promote the desired moral ethics and code of ethics of the teaching profession through integrated learning techniques consisting of Service Learning, Contract System, Value Clarification, Role Playing, and Concept Mapping. The learning management was presented in 3 steps.

  4. Well Field Management Using Multi-Objective Optimization

    DEFF Research Database (Denmark)

    Hansen, Annette Kirstine; Hendricks Franssen, H. J.; Bauer-Gottwein, Peter

    2013-01-01

    with infiltration basins, injection wells and abstraction wells. The two management objectives are to minimize the amount of water needed for infiltration and to minimize the risk of getting contaminated water into the drinking water wells. The management is subject to a daily demand fulfilment constraint. Two...... different optimization methods are tested. Constant scheduling where decision variables are held constant during the time of optimization, and sequential scheduling where the optimization is performed stepwise for daily time steps. The latter is developed to work in a real-time situation. Case study...

  5. A discrete optimization method for nuclear fuel management

    International Nuclear Information System (INIS)

    Argaud, J.P.

    1993-01-01

    Nuclear fuel management can be seen as a large discrete optimization problem under constraints, and optimization methods for such problems are numerically costly. After an introduction to the main aspects of nuclear fuel management, this paper presents a new way to treat the combinatorial problem by using information contained in the gradient of the optimized cost function. The idea of the new search process is to choose, by direct observation of the gradient, the most promising changes in fuel loading patterns. An example is then developed to illustrate an operating mode of the method. Finally, connections with classical simulated annealing and genetic algorithms are described as an attempt to improve search processes. 16 refs., 2 figs
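
    The gradient-guided idea can be sketched generically: rank the candidate discrete moves by their first-order predicted gain from the gradient of the relaxed cost, and only evaluate the most promising ones exactly. The toy integer quadratic below is an assumption for illustration, not the paper's core model:

```python
def gradient_guided_search(x, f, grad, moves, iters=50):
    """Discrete descent guided by the gradient of the relaxed cost: each
    candidate move (a unit change of one variable) is ranked by its
    first-order predicted gain grad(x)[i] * delta; the most promising
    move that actually improves f is taken, and the search stops at a
    local optimum where no candidate improves."""
    fx = f(x)
    for _ in range(iters):
        g = grad(x)
        ranked = sorted(moves, key=lambda m: g[m[0]] * m[1])  # most negative first
        for i, delta in ranked:
            y = x[:]
            y[i] += delta
            fy = f(y)
            if fy < fx:
                x, fx = y, fy
                break
        else:
            return x, fx  # no candidate move improves the cost
    return x, fx

# Toy separable integer quadratic: minimize sum((x_i - t_i)^2) by unit steps.
target = [3, -2, 5]
f = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))
grad = lambda x: [2 * (xi - ti) for xi, ti in zip(x, target)]
moves = [(i, d) for i in range(3) for d in (-1, 1)]
x, fx = gradient_guided_search([0, 0, 0], f, grad, moves)
```

    The gradient only orders the candidates; every accepted move is still validated by an exact cost evaluation, which is what keeps the method honest on a discrete domain.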

  6. MOSEG code for safety-oriented maintenance management; Codigo MOSEG para la gestion de mantenimiento orientada a la seguridad

    Energy Technology Data Exchange (ETDEWEB)

    Torres Valle, Antonio [Instituto Superior de Tecnologias y Ciencias Aplicadas, La Habana (Cuba). Dept. Ingenieria Nuclear]. E-mail: atorres@fctn.isctn.edu.cu; Rivero Oliva, Jose de Jesus [Centro de Gestion de la Informacion y Desarrollo de la Energia (CUBAENERGIA) (Cuba)]. E-mail: jose@cubaenergia.cu

    2005-07-01

    Full text: One of the main reasons why maintenance contributes so strongly to safety problems and to facility availability is the lack of maintenance management systems that address both fields in a balanced way. Their main shortcomings are shown in this paper. It briefly describes the development of an integrating algorithm for safety- and availability-oriented maintenance management, implemented in the MOSEG Win 1.0 code. (author)

  7. Optimal natural resources management under uncertainty with catastrophic risk

    Energy Technology Data Exchange (ETDEWEB)

    Motoh, Tsujimura [Graduate School of Economics, Kyoto University, Yoshida-honmochi, Sakyo-ku, Kyoto 606-8501 (Japan)

    2004-05-01

    We examine an optimal natural resources management problem under uncertainty with catastrophic risk and investigate the optimal rate of use of a natural resource. For this purpose, we use stochastic control theory. We assume that, until a catastrophic event occurs, the stock of the natural resource is governed by a stochastic differential equation. We describe the catastrophic phenomenon as a Poisson process. From this analysis, we show the optimal rate of use of the natural resource in explicit form. Furthermore, we present comparative static results for the optimal rate of use of the natural resource.

  8. Optimal natural resources management under uncertainty with catastrophic risk

    International Nuclear Information System (INIS)

    Motoh, Tsujimura

    2004-01-01

    We examine an optimal natural resources management problem under uncertainty with catastrophic risk and investigate the optimal rate of use of a natural resource. For this purpose, we use stochastic control theory. We assume that, until a catastrophic event occurs, the stock of the natural resource is governed by a stochastic differential equation. We describe the catastrophic phenomenon as a Poisson process. From this analysis, we show the optimal rate of use of the natural resource in explicit form. Furthermore, we present comparative static results for the optimal rate of use of the natural resource
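
    The model described above admits a quick Monte Carlo check. This is a sketch with assumed parameters, not the paper's solution: the stock follows geometric Brownian motion with drift (growth - harvest), a Poisson event with intensity lam ends the path, and in this linear model the infinite-horizon expected discounted harvest of a constant rate h has the closed form h / (rho + lam + h - g):

```python
import math, random

def simulate(harvest, growth=0.08, sigma=0.2, lam=0.05, discount=0.05,
             dt=0.1, horizon=200.0, x0=1.0, rng=None):
    """Discounted harvest along one sample path: GBM stock dynamics with
    an exponentially distributed catastrophe time (Poisson intensity lam)
    that destroys the resource and ends the path."""
    rng = rng or random.Random()
    x, t, value = x0, 0.0, 0.0
    t_cat = rng.expovariate(lam)  # catastrophe arrival time
    while t < min(horizon, t_cat):
        value += math.exp(-discount * t) * harvest * x * dt
        x *= math.exp((growth - harvest - 0.5 * sigma ** 2) * dt
                      + sigma * math.sqrt(dt) * rng.gauss(0, 1))
        t += dt
    return value

def expected_value(harvest, n=2000, seed=7):
    rng = random.Random(seed)
    return sum(simulate(harvest, rng=rng) for _ in range(n)) / n

# Compare a moderate and an aggressive constant extraction rate.
v_mod, v_agg = expected_value(0.05), expected_value(0.50)
```

    Comparing `v_mod` against the analytic benchmark 0.05 / (0.05 + 0.05 + 0.05 - 0.08) is a useful sanity check on the discretization; the optimal rate in the paper comes from stochastic control, not from such a grid of constant policies.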

  9. A Low-Jitter Wireless Transmission Based on Buffer Management in Coding-Aware Routing

    Directory of Open Access Journals (Sweden)

    Cunbo Lu

    2015-08-01

    Reducing packet jitter is important for real-time applications in a wireless network. Existing coding-aware routing algorithms use the opportunistic network coding (ONC) scheme in their packet coding algorithm. The ONC scheme never delays packets to wait for the arrival of a future coding opportunity, and the loss of some potential coding opportunities may degrade the contribution of network coding to jitter performance. In addition, most existing coding-aware routing algorithms assume that all flows participating in the network have equal rates, which is unrealistic, since multi-rate environments are common. To overcome these problems and extend coding-aware routing to multi-rate scenarios, from the view of data transmission, we present a low-jitter wireless transmission algorithm based on buffer management (BLJCAR), which schedules packets at coding nodes according to a queue-length-based threshold policy instead of the regular ONC policy used in existing coding-aware routing algorithms. BLJCAR is a unified framework covering both the single-rate and the multi-rate case. Simulation results show that the BLJCAR algorithm embedded in coding-aware routing outperforms the traditional ONC policy in terms of jitter, packet delivery delay, packet loss ratio and network throughput under network congestion at any traffic rate.
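
    The queue-length-based threshold idea can be sketched at a single relay with two opposing flows. This is a simplified illustration of the policy family, with an assumed threshold rule, not the actual BLJCAR algorithm:

```python
from collections import deque

def threshold_coding_decision(queues, threshold):
    """Queue-length-based policy sketch for a relay node: XOR-code a pair
    whenever both flows have a packet waiting; if only one flow has
    packets, wait for a partner only while the backlog is below the
    threshold, otherwise send plain to bound the added delay/jitter."""
    a, b = queues  # per-flow FIFO queues at the coding node
    sent = []
    while a or b:
        if a and b:                          # coding opportunity: XOR a pair
            sent.append(("coded", a.popleft(), b.popleft()))
        elif len(a) + len(b) >= threshold:   # backlog too long: don't wait
            q = a or b
            sent.append(("plain", q.popleft()))
        else:
            break                            # short queue: hold for a partner
    return sent

a = deque(["a1", "a2", "a3"])
b = deque(["b1"])
log = threshold_coding_decision((a, b), threshold=2)
```

    Pure ONC corresponds to threshold 0 (never wait); raising the threshold trades a little delay on lightly loaded links for more coding opportunities, which is the jitter/throughput balance the paper optimizes.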

  10. Nuclear fuel management optimization using adaptive evolutionary algorithms with heuristics

    International Nuclear Information System (INIS)

    Axmann, J.K.; Van de Velde, A.

    1996-01-01

    Adaptive evolutionary algorithms in combination with expert knowledge encoded in heuristics have proved to be a robust and powerful optimization method for the design of optimized PWR fuel loading patterns. Simple parallel algorithmic structures, coupled with a low amount of communication between the computer processor units in use, make it possible to employ workstation clusters efficiently. The extension of classic evolution strategies, not only by new and alternative methods but also by the inclusion of heuristics that affect the exchange probabilities of the fuel assemblies at specific core positions, led to the RELOPAT optimization code of the Technical University of Braunschweig. In combination with the new neutron-physical 3D nodal core simulator PRISM, developed by SIEMENS, the PRIMO loading pattern optimization system has been designed. Highly promising results in the recalculation of known reload plans for German PWRs now point to a commercially usable program. (author)

  11. Managing supply chains : transport optimization and chain synchronization

    NARCIS (Netherlands)

    van Woensel, T.; Dabia, S.; de Kok, A.G.; van Nunen, J.A.E.E.; Huijbregts, P.; Rietveld, P.

    2011-01-01

    Transport optimization is part of the broad area of physical distribution and logistics management. Physical distribution involves the handling, movement, and storage of goods from the point of origin to their point of consumption or use, via various channels of distribution. Logistics management

  12. Cover crop-based ecological weed management: exploration and optimization

    NARCIS (Netherlands)

    Kruidhof, H.M.

    2008-01-01

    Keywords: organic farming, ecologically-based weed management, cover crops, green manure, allelopathy, Secale cereale, Brassica napus, Medicago sativa

    Cover crop-based ecological weed management: exploration and optimization. In organic farming systems, weed control is recognized as one

  13. Optimization on the financial management of the bank with goal ...

    African Journals Online (AJOL)

    Financial management is crucial for planning a bank's assets and liabilities while taking multiple objectives into consideration. The objective of this study is to develop a Goal Programming (GP) model to optimize the financial management of Public Bank Berhad in Malaysia. Six goals from the financial statements, namely total ...
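
    A goal-programming model of this general shape can be sketched with SciPy by minimizing deviation variables around goal targets. The goals, figures, and variable names below are invented for illustration; they are not the bank's actual data.

```python
# Two-variable goal program: hit a total-assets goal and a profit goal as
# closely as possible by minimizing the deviation variables.
from scipy.optimize import linprog

# Variables: x1 = loans, x2 = securities (millions), then deviations
# (d1m, d1p) around goal 1 and (d2m, d2p) around goal 2:
#   Goal 1: x1 + x2           = 100  (total assets)
#   Goal 2: 0.08 x1 + 0.05 x2 = 7    (profit)
c = [0, 0, 1, 1, 1, 1]  # minimize the sum of all deviations

A_eq = [
    [1.00, 1.00, 1.0, -1.0, 0.0, 0.0],  # x1 + x2   + d1m - d1p = 100
    [0.08, 0.05, 0.0, 0.0, 1.0, -1.0],  # profit    + d2m - d2p = 7
]
b_eq = [100.0, 7.0]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
x1, x2 = res.x[:2]
print(round(x1), round(x2))  # -> 67 33 (both goals met exactly)
```

    Real GP models add priority levels or weights to the deviations when the goals conflict.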

  14. Fuel management and core design code systems for pressurized water reactor neutronic calculations

    International Nuclear Information System (INIS)

    Ahnert, C.; Arayones, J.M.

    1985-01-01

    A package of connected code systems for the neutronic calculations relevant to fuel management and core design has been developed and applied, for validation, to the startup tests and first operating cycle of a 900 MW (electric) PWR. The package includes the MARIA code system for modeling the different types of PWR fuel assemblies, the CARMEN code system for detailed few-group diffusion calculations for PWR cores at operating and burnup conditions, and the LOLA code system for core simulation using one-group nodal theory parameters explicitly calculated from the detailed solutions

  15. Optimal energy management of HEVs with hybrid storage system

    International Nuclear Information System (INIS)

    Vinot, E.; Trigui, R.

    2013-01-01

    Highlights: • A battery and ultra-capacitor system for a parallel hybrid vehicle is considered. • Optimal management using Pontryagin’s minimum principle is developed. • Battery stress limitation is taken into account by means of RMS current. • A rule-based management approaching the optimal control is proposed. • Comparisons between rule-based and optimal management are made using a Pareto front. - Abstract: Energy storage systems are a key point in the design and development of electric and hybrid vehicles. In order to reduce the battery size and its current stress, a hybrid storage system, in which a battery is coupled with an electrical double-layer capacitor (EDLC), is considered in this paper. The energy management of such a configuration is not obvious, and the optimal operation with respect to energy consumption and battery RMS current has to be identified. Most past work on the optimal energy management of HEVs considered only one additional power source. In this paper, the control of a hybrid vehicle with a hybrid storage system (HSS), where two additional power sources are used, is presented. Applying Pontryagin’s minimum principle, an optimal energy management strategy is found and compared to a rule-based parameterized control strategy. Simulation results are shown and discussed. Applied to a small compact car, the optimal and rule-based methods show that gains in fuel consumption and/or battery RMS current of more than 15% may be obtained. The paper also shows that a well-tuned rule-based algorithm performs rather well when compared to the optimal strategy and remains relevant for different driving cycles. This rule-based algorithm may easily be implemented in a vehicle prototype or on an HIL test bench
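
    The rule-based side of such a strategy can be sketched as a simple power split in which the EDLC absorbs demand peaks to limit battery stress. The thresholds and names here are hypothetical, not the parameterization tuned in the paper.

```python
# Toy rule-based power split for a battery + EDLC hybrid storage system.
def split_power(p_demand: float, soc_edlc: float,
                p_batt_max: float = 20.0, soc_min: float = 0.1):
    """Return (battery_kW, edlc_kW). The battery supplies up to p_batt_max;
    the EDLC covers the excess while its state of charge stays above
    soc_min, which keeps the battery RMS current down."""
    if p_demand > p_batt_max and soc_edlc > soc_min:
        return p_batt_max, p_demand - p_batt_max
    return p_demand, 0.0  # battery alone (EDLC empty or no peak to shave)

print(split_power(35.0, 0.8))   # -> (20.0, 15.0): EDLC shaves the peak
print(split_power(35.0, 0.05))  # -> (35.0, 0.0): EDLC depleted
```

    The optimal strategy replaces these fixed thresholds with a co-state tuned by Pontryagin’s minimum principle.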

  16. Complex energy system management using optimization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bridgeman, Stuart; Hurdowar-Castro, Diana; Allen, Rick; Olason, Tryggvi; Welt, Francois

    2010-09-15

    Modern energy systems are often very complex with respect to the mix of generation sources, energy storage, transmission, and avenues to market. Historically, power was provided by government organizations to load centers, and pricing was set in a regulated manner. In recent years, this arrangement has been displaced by the independent system operator (ISO). The complexity makes these systems very difficult to operate, since their components are interdependent. Consequently, computer-based large-scale simulation and optimization methods such as decision support systems (DSS) are now being used. This paper discusses the application of a DSS to operations and planning systems.

  17. Optimal Stand Management: Traditional and Neotraditional Solutions

    Science.gov (United States)

    Karen Lee Abt; Jeffrey P. Prestemon

    2003-01-01

    The traditional Faustmann (1849) model has served as the foundation of the economic theory of the firm for the forestry production process. Since its introduction over 150 years ago, many variations of the Faustmann model have been developed which relax certain assumptions of the traditional model, including constant prices, risk neutrality, and zero production and management costs...
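
    For readers unfamiliar with the baseline, the classical Faustmann land expectation value discounts an infinite series of identical rotations of length T (a standard textbook formulation, not an equation reproduced from this record):

```latex
\mathrm{LEV}(T) \;=\; \frac{p\,V(T)\,e^{-rT} - c}{1 - e^{-rT}}
```

    where V(T) is the stand volume at rotation age T, p the (constant) stumpage price, c the regeneration cost, and r the discount rate; the optimal rotation maximizes LEV(T). The variations surveyed above relax exactly the constant-price, risk-neutrality, and zero-cost assumptions built into this expression.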

  18. Optimal grazing management strategies: evaluating key concepts ...

    African Journals Online (AJOL)

    Finally, overstocking will override key management initiatives, such as effective recovery periods, leading to rangeland degradation. Thus, in variable climates, stocking rate should be set conservatively to allow easier adaptation of animal numbers to rainfall variability from year to year. We suggest several key concepts that ...

  19. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework

    Science.gov (United States)

    Best management practices (BMPs) are perceived as being effective in reducing nutrient loads transported from non-point sources (NPS) to receiving water bodies. The objective of this study was to develop a modeling-optimization framework that can be used by watershed management p...

  20. FREQUENCY ANALYSIS OF RLE-BLOCKS REPETITIONS IN THE SERIES OF BINARY CODES WITH OPTIMAL MINIMAX CRITERION OF AUTOCORRELATION FUNCTION

    Directory of Open Access Journals (Sweden)

    A. A. Kovylin

    2013-01-01

    The article describes the problem of searching for binary pseudo-random sequences with a quasi-ideal autocorrelation function, to be used in contemporary communication systems, including mobile and wireless data-transfer interfaces. In synthesizing sets of binary sequences, the target is to form them based on the minimax criterion, by which a sequence is considered optimal for the intended application. In the course of the research, optimal sequences of order up to 52 were obtained and a run-length-encoding (RLE) analysis was carried out. The analysis showed regularities in the distribution of runs of different lengths in the codes that are optimal under the chosen criterion, which should make it possible to optimize the search for such codes in the future.
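
    Both quantities studied, the minimax (peak sidelobe) value of the aperiodic autocorrelation function and the run-length (RLE) structure, are easy to compute for any ±1 sequence. The Barker-13 code below is only a convenient illustration, not one of the sequences obtained in the study.

```python
# Peak autocorrelation sidelobe (minimax criterion) and run lengths of a
# binary +/-1 sequence.
def sidelobes(seq):
    """Aperiodic autocorrelation values for all nonzero shifts."""
    n = len(seq)
    return [sum(seq[i] * seq[i + k] for i in range(n - k)) for k in range(1, n)]

def run_lengths(seq):
    """Lengths of maximal runs of equal symbols (RLE block lengths)."""
    runs, count = [], 1
    for a, b in zip(seq, seq[1:]):
        if a == b:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return runs

barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]
print(max(abs(s) for s in sidelobes(barker13)))  # -> 1 (quasi-ideal ACF)
print(run_lengths(barker13))                     # -> [5, 2, 2, 1, 1, 1, 1]
```

    The regularities reported in the article concern how often runs of each length appear across the set of minimax-optimal codes.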

  1. Asset management -- Integrated software optimizes production performance

    International Nuclear Information System (INIS)

    Polczer, S.

    1998-01-01

    Developments in data collection and retrieval systems to allow timely cost analysis, financial reporting and production management are discussed. One of the most important new OLAP (on-line analytical processing) products is Energy Warehouse which gathers field information from various sources, allows advanced searches, and generates reports previously unavailable in other conventional financial accounting systems. Another OLAP-based system, the Canadian Upstream Energy System (CUES), was developed by the Oracle Corporation and the Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server software development tools with ATS's upstream financial, land, geotechnical and production applications. ATS also developed a product called IDPMARS (Integrated Daily Production Management Accounting Reporting System). It interfaces with CUES to link working interests, government royalties, administration, facility charges, lifting costs, transportation tooling, and customers by integrating field data collection systems with financial accounting

  2. Asset management -- Integrated software optimizes production performance

    Energy Technology Data Exchange (ETDEWEB)

    Polczer, S.

    1998-10-01

    Developments in data collection and retrieval systems to allow timely cost analysis, financial reporting and production management are discussed. One of the most important new OLAP (on-line analytical processing) products is Energy Warehouse which gathers field information from various sources, allows advanced searches, and generates reports previously unavailable in other conventional financial accounting systems. Another OLAP-based system, the Canadian Upstream Energy System (CUES), was developed by the Oracle Corporation and the Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server software development tools with ATS's upstream financial, land, geotechnical and production applications. ATS also developed a product called IDPMARS (Integrated Daily Production Management Accounting Reporting System). It interfaces with CUES to link working interests, government royalties, administration, facility charges, lifting costs, transportation tooling, and customers by integrating field data collection systems with financial accounting.

  3. Economic optimization of nuclear waste management

    International Nuclear Information System (INIS)

    DeWames, R.E.; Grantham, L.F.; Guon, J.; McKisson, R.L.

    1984-01-01

    The paper presented here addresses the impact of waste management system operating parameters on overall system economics. The conclusion reached by this study is that currently available technology and proposed operating conditions do not lead to optimum economics. The decision to utilize the current reference waste package and non-optimum operating conditions will cause added expenditures of 7 billion dollars over the next several decades. Further, this paper points out that optimum economics is not necessarily incompatible with improved system safety

  4. Optimal Investment by Financially Xenophobic Managers

    OpenAIRE

    Jason G. Cummins; Ingmar Nyman

    2000-01-01

    Case studies show that corporate managers seek financial independence to avoid interference by outside financiers. We incorporate this financial xenophobia as a fixed cost in a simple dynamic model of financing and investment. To avoid refinancing in the future, the firm alters its behavior depending on the extent of its financial xenophobia and the realization of a revenue shock. With a sufficiently adverse shock, the firm holds no liquidity. Otherwise, the firm precautionarily saves and hol...

  5. Optimal pain management for radical prostatectomy surgery

    DEFF Research Database (Denmark)

    Joshi, Grish P; Jaschinski, Thomas; Bonnet, Francis

    2015-01-01

    BACKGROUND: An increase in the diagnosis of prostate cancer has increased the incidence of radical prostatectomy. However, the literature assessing pain therapy for this procedure has not been systematically evaluated. Thus, optimal pain therapy for patients undergoing radical prostatectomy remains ... controversial. METHODS: Medline, Embase, and the Cochrane Central Register of Controlled Trials were searched for studies assessing the effects of analgesic and anesthetic interventions on pain after radical prostatectomy. All searches were conducted in October 2012 and updated in June 2015. RESULTS: Most ... treatments studied improved pain relief and/or reduced opioid requirements. However, there were significant differences in the study designs and the variables evaluated, precluding quantitative analysis and consensus recommendations. CONCLUSIONS: This systematic review reveals that there is a lack...

  6. Optimization of landscape services under uncoordinated management by multiple landowners.

    Science.gov (United States)

    Porto, Miguel; Correia, Otília; Beja, Pedro

    2014-01-01

    Landscapes are often patchworks of private properties, where composition and configuration patterns result from cumulative effects of the actions of multiple landowners. Securing the delivery of services in such multi-ownership landscapes is challenging, because it is difficult to assure tight compliance to spatially explicit management rules at the level of individual properties, which may hinder the conservation of critical landscape features. To deal with these constraints, a multi-objective simulation-optimization procedure was developed to select non-spatial management regimes that best meet landscape-level objectives, while accounting for uncoordinated and uncertain response of individual landowners to management rules. Optimization approximates the non-dominated Pareto frontier, combining a multi-objective genetic algorithm and a simulator that forecasts trends in landscape pattern as a function of management rules implemented annually by individual landowners. The procedure was demonstrated with a case study for the optimum scheduling of fuel treatments in cork oak forest landscapes, involving six objectives related to reducing management costs (1), reducing fire risk (3), and protecting biodiversity associated with mid- and late-successional understories (2). There was a trade-off between the cost, fire risk, and biodiversity objectives that could be minimized by selecting management regimes involving ca. 60% of landowners clearing the understory at short intervals (around 5 years), and the remaining managing at long intervals (ca. 75 years) or not managing. The optimal management regimes produce a mosaic landscape dominated by stands with herbaceous and low shrub understories, but also with a satisfactory representation of old understories, which was favorable in terms of both fire risk and biodiversity. 
The simulation-optimization procedure presented can be extended to incorporate a wide range of landscape dynamic processes, management rules and quantifiable
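
    The non-dominated (Pareto) filtering at the heart of such a multi-objective search can be sketched as follows; all objectives are treated as minimized, and the candidate objective vectors are invented for illustration.

```python
# Keep only candidates not dominated by any other candidate (minimization).
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# (management cost, fire risk, biodiversity loss) for four candidate regimes
candidates = [(3, 2, 5), (2, 4, 4), (4, 1, 6), (3, 3, 5)]
print(pareto_front(candidates))  # -> [(3, 2, 5), (2, 4, 4), (4, 1, 6)]
```

    A multi-objective genetic algorithm repeatedly applies such a filter to its population while the simulator supplies the objective values.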

  7. Optimization of landscape services under uncoordinated management by multiple landowners.

    Directory of Open Access Journals (Sweden)

    Miguel Porto

    Landscapes are often patchworks of private properties, where composition and configuration patterns result from cumulative effects of the actions of multiple landowners. Securing the delivery of services in such multi-ownership landscapes is challenging, because it is difficult to assure tight compliance to spatially explicit management rules at the level of individual properties, which may hinder the conservation of critical landscape features. To deal with these constraints, a multi-objective simulation-optimization procedure was developed to select non-spatial management regimes that best meet landscape-level objectives, while accounting for uncoordinated and uncertain response of individual landowners to management rules. Optimization approximates the non-dominated Pareto frontier, combining a multi-objective genetic algorithm and a simulator that forecasts trends in landscape pattern as a function of management rules implemented annually by individual landowners. The procedure was demonstrated with a case study for the optimum scheduling of fuel treatments in cork oak forest landscapes, involving six objectives related to reducing management costs (1), reducing fire risk (3), and protecting biodiversity associated with mid- and late-successional understories (2). There was a trade-off between the cost, fire risk, and biodiversity objectives that could be minimized by selecting management regimes involving ca. 60% of landowners clearing the understory at short intervals (around 5 years), and the remaining managing at long intervals (ca. 75 years) or not managing. The optimal management regimes produce a mosaic landscape dominated by stands with herbaceous and low shrub understories, but also with a satisfactory representation of old understories, which was favorable in terms of both fire risk and biodiversity. 
The simulation-optimization procedure presented can be extended to incorporate a wide range of landscape dynamic processes, management rules

  8. Optimal management of hemophilic arthropathy and hematomas

    Directory of Open Access Journals (Sweden)

    Lobet S

    2014-10-01

    Sébastien Lobet,¹,² Cedric Hermans,¹ Catherine Lambert¹ (¹Hemostasis-Thrombosis Unit, Division of Hematology; ²Division of Physical Medicine and Rehabilitation, Cliniques Universitaires Saint-Luc, Brussels, Belgium) Abstract: Hemophilia is a hematological disorder characterized by a partial or complete deficiency of clotting factor VIII or IX. Its bleeding complications primarily affect the musculoskeletal system. Hemarthrosis is a major hemophilia-related complication, responsible for a particularly debilitating chronic arthropathy in the long term. In addition to clotting factor concentrates, usually prescribed by the hematologist, managing acute hemarthrosis and chronic arthropathy requires close collaboration between the orthopedic surgeon and the physiotherapist. This collaboration, bringing together coagulation and musculoskeletal specialists, is key to effectively preventing hemarthrosis, managing acute joint bleeding episodes, assessing joint function, and actively treating chronic arthropathy. This paper reviews, from a practical point of view, the pathophysiology, clinical manifestations, and treatment of hemarthrosis and chronic hemophilia-induced arthropathy for hematologists, orthopedic surgeons, and physiotherapists. Keywords: hemophilia, arthropathy, hemarthrosis, hematoma, physiotherapy, target joint

  9. Optimal management of chronic osteomyelitis: current perspectives

    Directory of Open Access Journals (Sweden)

    Pande KC

    2015-08-01

    Ketan C Pande (Raja Isteri Pengiran Anak Saleha Hospital, Bandar Seri Begawan, Brunei) Abstract: Chronic osteomyelitis is a challenging condition to treat. It is seen mostly after open fractures or in implant-related infections following the treatment of fractures and prosthetic joint replacements. Recurrence of infection is well known, and successful treatment requires a multidisciplinary team approach, with surgical debridement and appropriate antimicrobial therapy as the cornerstones of treatment. Staging of the disease and identification of the causative microorganism are essential before initiation of treatment. Important surgical steps include radical debridement of necrotic and devitalized tissue, removal of implants, management of the resultant dead space, soft-tissue coverage, and skeletal stabilization or management of skeletal defects. The route of administration and duration of antimicrobial therapy continue to be debated. The role of biofilm is now clearly established in the chronicity of bone infection, and newer modalities are being developed to address various issues related to biofilm formation. The present review addresses various aspects of chronic osteomyelitis of long bones seen in adults, with a review of recent developments. Keywords: osteomyelitis, infection, biofilm, bone, therapy, treatment

  10. Asset management: integrated software optimizes production performance

    Energy Technology Data Exchange (ETDEWEB)

    Polczer, S.

    1998-06-01

    Two new multi-dimensional databases, which expand the 'row and column' concept of spreadsheets into multiple categories of data called dimensions, are described. These integrated software packages provide the foundation for industry players such as Poco Petroleum Ltd and Numac Energy Inc to gain a competitive advantage, by overhauling their respective data collection and retrieval systems to allow for timely cost analysis and financial reporting. Energy Warehouse, an on-line analytical processing product marketed by SysGold Ltd, is one of the software products described. It gathers various sources of information, allows advanced searches and generates reports previously unavailable in other conventional financial accounting systems. The second product discussed - the Canadian Upstream Energy System (CUES) - is an on-line analytical processing system developed by Oracle Corporation and Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server and software development tools with ATS's upstream financial, land, geotechnical and production applications. The software also allows for optimization of facilities, analysis of production efficiencies and comparison of performance against industry standards.

  11. Enhanced Protein Production in Escherichia coli by Optimization of Cloning Scars at the Vector-Coding Sequence Junction

    DEFF Research Database (Denmark)

    Mirzadeh, Kiavash; Martinez, Virginia; Toddo, Stephen

    2015-01-01

    are poorly expressed even when they are codon-optimized and expressed from vectors with powerful genetic elements. In this study, we show that poor expression can be caused by certain nucleotide sequences (e.g., cloning scars) at the junction between the vector and the coding sequence. Since these sequences...

  12. Management of manufacture and installation of plant pipings by bar code system

    International Nuclear Information System (INIS)

    Suwa, Minoru

    1995-01-01

    In the piping systems of nuclear power plants, the number of parts is very large and a mill sheet is attached to each part, so the parts must be managed individually, which requires considerable manpower. To resolve the delay of mechanization in the factory, a bar code system was adopted on a full scale. When parts are taken out of the store, bar code labels are attached to all piping parts. In this way, all processes of manufacture and inspection are managed with a computer, which helps save labor and prevent input errors. The system centers on the progress-management system for piping manufacture and is operated in conjunction with the systems for production design, ordering and inventory, mill sheet management, and installation management. The management of production design, manufacture, inspection, and installation is explained. One remaining problem is that bar code labels must be reapplied when they become dirty or when parts pass through coating and pickling processes. Direct marking of bar codes on parts with a laser marker was tried; it was successful for stainless steel, but the codes were hard to read on carbon steel pipes. It is desirable to develop bar codes that endure until the end of plant life. (K.I.)

  13. The role of crossover operator in evolutionary-based approach to the problem of genetic code optimization.

    Science.gov (United States)

    Błażej, Paweł; Wnȩtrzak, Małgorzata; Mackiewicz, Paweł

    2016-12-01

    One of the theories explaining the present structure of the canonical genetic code assumes that it was optimized to minimize the harmful effects of amino acid replacements resulting from nucleotide substitutions and translational errors. A way to test this concept is to find the optimal code under given criteria and compare it with the canonical genetic code. Unfortunately, the huge number of possible alternatives makes it impossible to find the optimal code using exhaustive methods in a sensible time. Therefore, heuristic methods should be applied to search the space of possible solutions. Evolutionary algorithms (EAs) are one such promising approach. This class of methods relies on both mutation and crossover operators, which are responsible for creating and maintaining the diversity of candidate solutions. These operators have dissimilar characteristics and consequently play different roles in the process of finding the best solutions under given criteria. Therefore, the effectiveness of the search for potential solutions can be improved by applying both of them, especially when these operators are devised specifically for a given problem. To study this subject, we analyze the effectiveness of algorithms for various combinations of mutation and crossover probabilities under three models of the genetic code, each assuming different restrictions on its structure. To achieve that, we adapt the position-based crossover operator for the most restricted model and develop a new type of crossover operator for the more general models. The applied fitness function describes the costs of amino acid replacement with regard to polarity. Our results indicate that the use of crossover operators can significantly improve the quality of the solutions. Moreover, simulations with the crossover operator optimize the fitness function in a smaller number of generations than simulations without this operator. The optimal genetic codes without restrictions on their structure
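
    A position-based crossover of the kind adapted here can be sketched for a generic permutation encoding; the toy parents below are ordinary permutations, not actual codon-to-amino-acid assignments.

```python
# Position-based crossover: the child inherits parent 1's genes at chosen
# positions and fills the remaining slots with the missing genes in the
# order they appear in parent 2, so the result is again a permutation.
def position_based_crossover(p1, p2, positions):
    child = [None] * len(p1)
    for i in positions:
        child[i] = p1[i]
    fixed = set(child) - {None}
    fill = iter(g for g in p2 if g not in fixed)
    return [c if c is not None else next(fill) for c in child]

p1 = [0, 1, 2, 3, 4, 5]
p2 = [5, 4, 3, 2, 1, 0]
print(position_based_crossover(p1, p2, positions=[1, 3]))  # -> [5, 1, 4, 3, 2, 0]
```

    Preserving permutation validity is the point of such operators: a naive one-point crossover would duplicate and lose genes.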

  14. Minimizing human error in radiopharmaceutical preparation and administration via a bar code-enhanced nuclear pharmacy management system.

    Science.gov (United States)

    Hakala, John L; Hung, Joseph C; Mosman, Elton A

    2012-09-01

    The objective of this project was to ensure correct radiopharmaceutical administration through the use of a bar code system that links patient and drug profiles with on-site information management systems. This new combined system would minimize the amount of manual human manipulation, which has proven to be a primary source of error. The most common reason for dosing errors is improper patient identification when a dose is obtained from the nuclear pharmacy or when a dose is administered. A standardized electronic transfer of information from radiopharmaceutical preparation to injection will further reduce the risk of misadministration. Value stream maps showing the flow of the patient dose information, as well as potential points of human error, were developed. Next, a future-state map was created that included proposed corrections for the most common critical sites of error. Transitioning the current process to the future state will require solutions that address these sites. To optimize the future-state process, a bar code system that links the on-site radiology management system with the nuclear pharmacy management system was proposed. A bar-coded wristband connects the patient directly to the electronic information systems. The bar code-enhanced process linking the patient dose with the electronic information reduces the number of crucial points for human error and provides a framework to ensure that the prepared dose reaches the correct patient. Although the proposed flowchart is designed for a site with an in-house central nuclear pharmacy, much of the framework could be applied by nuclear medicine facilities using unit doses. An electronic connection between information management systems to allow the tracking of a radiopharmaceutical from preparation to administration can be a useful tool in preventing the mistakes that are an unfortunate reality for any facility.

  15. Severe accident management. Optimized guidelines and strategies

    International Nuclear Information System (INIS)

    Braun, Matthias; Löffler, Micha; Plank, Hermann; Asse, Dietmar; Dimmelmeier, Harald

    2014-01-01

    The highest priority for mitigating the consequences of a severe accident with core melt lies in securing containment integrity, as this represents the last barrier against fission product release to the environment. Containment integrity is endangered by several physical phenomena, especially highly transient phenomena following high-pressure reactor pressure vessel failure (like direct containment heating or steam explosions which can lead to early containment failure), hydrogen combustion, quasi-static over-pressure, temperature failure of penetrations, and basemat penetration by core melt. Each of these challenges can be counteracted by dedicated severe accident mitigation hardware, like dedicated primary circuit depressurization valves, hydrogen recombiners or igniters, filtered containment venting, containment cooling systems, and core melt stabilization systems (if available). However, besides their main safety function these systems often have also secondary effects that need to be considered. Filtered containment venting causes (though limited) fission product release into the environment, primary circuit depressurization leads to loss of coolant, and an ex-vessel core melt stabilization system as well as hydrogen igniters can generate high pressure and temperature loads on the containment. To ensure that during a severe accident any available systems are used to their full beneficial extent while minimizing their potential negative impact, AREVA has implemented a severe accident management for German nuclear power plants. This concept makes use of extensive numerical simulations of the entire plant, quantifying the impact of system activations (operational systems, safety systems, as well as dedicated severe accident systems) on the accident progression for various scenarios. 
Based on the knowledge gained, a handbook has been developed, allowing the plant operators to understand the current state of the plant (supported by computational aids), to predict

  16. Optimal Resource Management in a Stochastic Schaefer Model

    OpenAIRE

    Richard Hartman

    2008-01-01

    This paper incorporates uncertainty into the growth function of the Schaefer model for the optimal management of a biological resource. There is a critical value for the biological stock, and it is optimal to do no harvesting if the biological stock is below that critical value and to exert whatever harvesting effort is necessary to prevent the stock from rising above that critical value. The introduction of uncertainty increases the critical value of the stock.
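
    In standard notation (the usual symbols, not reproduced from the paper), stochastic Schaefer dynamics with harvesting can be written as

```latex
dX_t \;=\; \Bigl[\, r X_t \Bigl(1 - \tfrac{X_t}{K}\Bigr) - h_t \,\Bigr]\,dt \;+\; \sigma X_t\, dW_t
```

    with intrinsic growth rate r, carrying capacity K, harvest rate h_t, and volatility σ. The barrier policy described in the abstract then reads: set h_t = 0 while the stock X_t is below the critical value X*, and harvest just enough to keep the stock from rising above X* once it is reached; the result cited above is that introducing uncertainty raises X*.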

  17. Commands for financial data management and portfolio optimization

    OpenAIRE

    C. Alberto Dorantes

    2013-01-01

    Several econometric software packages offer portfolio-management tools for practitioners and researchers. For example, MATLAB and R offer a great variety of tools for the simulation, optimization, and analysis of financial time series. Stata, together with Mata, offers powerful programming tools for the simulation, optimization, and analysis of financial data. However, related user commands are scarce. In this presentation, commands for online market data collection, data manipulation, and financial a...

  18. A Simulation-Optimization Model for Seawater Intrusion Management at Pingtung Coastal Area, Taiwan

    Directory of Open Access Journals (Sweden)

    Po-Syun Huang

    2018-02-01

    Full Text Available The coastal regions of Pingtung Plain in southern Taiwan rely on groundwater as their main source of fresh water for the aquaculture, agriculture, domestic, and industrial sectors. The availability of fresh groundwater is threatened by unsustainable groundwater extraction, and the over-pumping leads to a serious seawater intrusion problem. Appropriate management strategies are needed to control groundwater salinity and mitigate seawater intrusion. In this study, a simulation-optimization model is presented to address seawater intrusion along the coastal aquifers of Pingtung Plain; the objective is to minimize the total injection rate of injection well barriers placed at pre-determined locations. The SEAWAT code is used to simulate the process of seawater intrusion, and a surrogate model based on artificial neural networks (ANNs) approximates the seawater intrusion (SWI) numerical model to increase computational efficiency during the optimization process. The heuristic differential evolution (DE) algorithm is selected to identify the globally optimal management solution. Two management scenarios are considered, with injection barriers located either along the coast or inland. The optimized results show that deploying injection barriers inland is more effective at reducing total dissolved solids (TDS) concentrations and mitigating seawater intrusion than deploying them along the coast. The computational time is reduced by more than 98% when the ANNs replace the numerical model, and the DE algorithm is confirmed as a robust optimization scheme for groundwater management problems. The proposed framework can identify the most reliable management strategies and provide a reference tool for decision making with regard to seawater intrusion remediation.
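The optimization loop described in this abstract, a differential evolution search over a cheap surrogate with the salinity limit handled as a penalty, can be sketched as follows. The `surrogate_tds` function is a hypothetical stand-in for the trained ANN, and all rates, bounds, and limits are illustrative:

```python
import random

def surrogate_tds(rates):
    # Hypothetical stand-in for the trained ANN surrogate: in the paper a
    # network approximates the SEAWAT seawater-intrusion model.  Here the
    # TDS level simply falls as total injection rises, for illustration.
    return 35.0 / (1.0 + sum(rates))

def penalized_cost(rates, tds_limit=5.0, penalty=1000.0):
    # Objective: minimize the total injection rate of the barrier wells,
    # with the surrogate-predicted salinity enforced via a penalty term.
    violation = max(0.0, surrogate_tds(rates) - tds_limit)
    return sum(rates) + penalty * violation

def differential_evolution(cost, dim=3, bounds=(0.0, 20.0), pop_size=20,
                           f_weight=0.7, cross_rate=0.9, generations=200,
                           seed=1):
    # Classic DE/rand/1/bin scheme.
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [min(hi, max(lo, a[k] + f_weight * (b[k] - c[k])))
                     if rng.random() < cross_rate else pop[i][k]
                     for k in range(dim)]
            if cost(trial) <= cost(pop[i]):
                pop[i] = trial
    return min(pop, key=cost)

best = differential_evolution(penalized_cost)  # one rate per injection well
```

Replacing `surrogate_tds` with a trained network is what makes this loop affordable; evaluating the full SWI model inside DE would be far too slow, which is the paper's point.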

  19. Performance Modeling and Optimization of a High Energy CollidingBeam Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-06-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher, at 29%, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.
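A minimal version of the latency-bandwidth style of model used for AAPC can be written down directly; the contention factor below is a simplified stand-in for the node-adapter and link contention effects the paper models in detail:

```python
def aapc_time(p, msg_bytes, latency, byte_time, contention=1.0):
    """Latency-bandwidth model of all-to-all personalized communication:
    each of p processes sends a distinct message to the other p - 1.  The
    contention factor (>= 1) inflates the bandwidth term, standing in for
    node-adapter contention on SMP systems or link contention on a 3D
    torus; the paper's fitted models are more detailed than this sketch."""
    return (p - 1) * (latency + contention * msg_bytes * byte_time)
```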

  20. Performance Modeling and Optimization of a High Energy Colliding Beam Simulation Code

    International Nuclear Information System (INIS)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-01-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher, at 29%, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms

  1. Optimization of the Monte Carlo code for modeling of photon migration in tissue.

    Science.gov (United States)

    Zołek, Norbert S; Liebert, Adam; Maniewski, Roman

    2006-10-01

    The Monte Carlo method is frequently used to simulate light transport in turbid media because of its simplicity and flexibility, allowing complicated geometrical structures to be analyzed. Monte Carlo simulations are, however, time consuming because of the necessity to track the paths of individual photons. The time-consuming computation is mainly associated with the calculation of the logarithmic and trigonometric functions as well as the generation of pseudo-random numbers. In this paper, the Monte Carlo algorithm was developed and optimized by approximation of the logarithmic and trigonometric functions. The approximations were based on polynomial and rational functions, and the errors of these approximations are less than 1% of the values of the original functions. The proposed algorithm was verified by simulations of the time-resolved reflectance at several source-detector separations. The results of the calculation using the approximated algorithm were compared with those of the Monte Carlo simulations obtained with an exact computation of the logarithmic and trigonometric functions as well as with the solution of the diffusion equation. The errors of the moments of the simulated distributions of times of flight of photons (total number of photons, mean time of flight, and variance) are less than 2% for a range of optical properties typical of living tissues. The proposed approximated algorithm speeds up the Monte Carlo simulations by a factor of 4. The developed code can be used on parallel machines, allowing for further acceleration.
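The flavor of this optimization can be illustrated with a logarithm approximation built from range reduction and a short series. The paper's actual polynomial and rational fits are not reproduced here, so the formula below is a generic sketch whose error is comfortably under the 1% figure quoted:

```python
import math

LN2 = 0.6931471805599453

def fast_log(x):
    """Approximate natural logarithm, in the spirit of the paper's
    optimized photon-transport code (the authors' own fits differ).
    Range reduction: x = m * 2**e with m in [0.5, 1), then
    ln m = 2*artanh(u) with u = (m-1)/(m+1), truncated after u**5.
    The truncation error is bounded by about 1.3e-4 in absolute terms."""
    m, e = math.frexp(x)
    u = (m - 1.0) / (m + 1.0)
    u2 = u * u
    return 2.0 * u * (1.0 + u2 / 3.0 + u2 * u2 / 5.0) + e * LN2
```

In the actual Monte Carlo code such an approximation replaces `log` in the photon step-length sampling, where it is called once per scattering event.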

  2. Economically oriented process optimization in waste management.

    Science.gov (United States)

    Maroušek, Josef

    2014-06-01

    A brief report on the development of a novel apparatus is presented. It was verified at commercial scale that a new concept of anaerobic fermentation followed by continuous pyrolysis is technically and economically feasible for managing previously enzymatically hydrolyzed waste haylage in huge volumes. The design of the concept is thoroughly described, documented in figures, and biochemically analyzed in detail. Assessment of the concept shows that subsequent pyrolysis of the anaerobically fermented residue yields not only biogas but also high-quality biochar. This significantly improves the overall economy. In addition, it may be assumed that this applied research is consistent with previous theoretical assumptions stating that any kind of aerobic or anaerobic fermentation increases the microporosity of the biochar obtained.

  3. Optimal management of complications associated with achondroplasia

    Directory of Open Access Journals (Sweden)

    Ireland, Penny J

    2014-06-01

    Full Text Available Penny J Ireland,1 Verity Pacey,2,3 Andreas Zankl,4 Priya Edwards,1 Leanne M Johnston,5 Ravi Savarirayan6 1Queensland Paediatric Rehabilitation Service, Royal Children’s Hospital, Herston, Brisbane, Queensland, 2Physiotherapy Department, The Children’s Hospital at Westmead, Sydney, New South Wales, 3Department of Health Professions, Macquarie University, Sydney, New South Wales, 4Genetic Medicine, Children’s Hospital, Westmead, Sydney, New South Wales, 5School of Health and Rehabilitation Sciences, University of Queensland, Brisbane, Queensland, 6Victorian Clinical Genetics Service, Royal Children’s Hospital, Melbourne, Victoria, Australia Abstract: Achondroplasia is the most common form of skeletal dysplasia, resulting in disproportionate short stature, and affects over 250,000 people worldwide. Individuals with achondroplasia demonstrate a number of well-recognized anatomical features that impact on growth and development, with a complex array of medical issues that are best managed through a multidisciplinary team approach. The complexity of this presentation, whereby individual impairments may impact upon multiple activity and participation areas, requires consideration and discussion under a broad framework to gain a more thorough understanding of the experience of this condition for individuals with achondroplasia. This paper examines the general literature and research evidence on the medical and health aspects of individuals with achondroplasia and presents a pictorial model of achondroplasia based on The International Classification of Functioning, Disability, and Health (ICF). An expanded model of the ICF will be used to review and present the current literature pertaining to the musculoskeletal, neurological, cardiorespiratory, and ear, nose, and throat impairments and complications across the lifespan, with discussion on the impact of these impairments upon activity and participation performance. Further research is required to

  4. An Optimization of the Risk Management using Derivatives

    Directory of Open Access Journals (Sweden)

    Ovidiu ŞONTEA

    2011-07-01

    Full Text Available This article aims to provide a process that can be used in financial risk management by solving problems of minimizing the risk measure (VaR) using derivative products, bonds and options. The optimization problem was formulated for the hedging of a portfolio formed by an asset and a put option on this asset, and by a bond and an option on this bond, respectively. In the first optimization problem we obtain the coverage ratio and the optimal exercise price of the option, which is in fact the relative cost of the option’s value. In the second optimization problem we obtain the optimal exercise price for a put option written on a bond.
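The effect the article exploits, namely that a put option caps the portfolio loss and therefore its VaR, can be demonstrated with a small Monte Carlo sketch. The lognormal return model and every number below are assumptions of the illustration, not values from the article:

```python
import math
import random

def var_quantile(losses, alpha=0.95):
    """Value at Risk as the empirical alpha-quantile of the losses."""
    return sorted(losses)[int(alpha * len(losses)) - 1]

def protective_put_var(s0=100.0, strike=95.0, premium=2.0, alpha=0.95,
                       n=20000, seed=3):
    """Monte Carlo illustration of the hedging setting: holding a put on
    the asset caps the loss at s0 - strike + premium.  One-period
    lognormal returns are an assumption of this sketch."""
    rng = random.Random(seed)
    unhedged, hedged = [], []
    for _ in range(n):
        s1 = s0 * math.exp(rng.gauss(-0.02, 0.2))
        put_payoff = max(strike - s1, 0.0)
        unhedged.append(s0 - s1)                       # asset alone
        hedged.append(s0 + premium - s1 - put_payoff)  # asset + put
    return var_quantile(unhedged, alpha), var_quantile(hedged, alpha)
```

The hedged VaR equals the cap `s0 - strike + premium` whenever enough probability mass sits below the strike, which is the trade-off the article's optimization balances against the option's cost.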

  5. An Enhanced System Architecture for Optimized Demand Side Management in Smart Grid

    Directory of Open Access Journals (Sweden)

    Anzar Mahmood

    2016-04-01

    Full Text Available Demand Side Management (DSM through optimization of home energy consumption in the smart grid environment is now one of the well-known research areas. Appliance scheduling has been done through many different algorithms to reduce peak load and, consequently, the Peak to Average Ratio (PAR. This paper presents a Comprehensive Home Energy Management Architecture (CHEMA with integration of multiple appliance scheduling options and enhanced load categorization in a smart grid environment. The CHEMA model consists of six layers and has been modeled in Simulink with an embedded MATLAB code. A single Knapsack optimization technique is used for scheduling and four different cases of cost reduction are modeled at the second layer of CHEMA. Fault identification and electricity theft control have also been added in CHEMA. Furthermore, carbon footprint calculations have been incorporated in order to make the users aware of environmental concerns. Simulation results prove the effectiveness of the proposed model.
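A single-knapsack formulation of the kind CHEMA uses for one scheduling decision can be sketched as a standard 0/1 dynamic program; the appliance list, power ratings, and priorities below are hypothetical:

```python
def schedule_appliances(appliances, power_cap):
    """Single-knapsack scheduling for one peak time slot: maximize total
    user priority subject to an aggregate power cap, one way a knapsack
    formulation can shave peak load and thus PAR.  Appliances are
    (name, kW, priority) tuples; power is discretized to 0.1 kW steps."""
    scale = 10  # 0.1 kW resolution
    cap = int(round(power_cap * scale))
    best = [(0.0, ())] * (cap + 1)  # best[w] = (priority, chosen names)
    for name, power_kw, priority in appliances:
        w = int(round(power_kw * scale))
        for c in range(cap, w - 1, -1):  # classic 0/1 knapsack recurrence
            cand = (best[c - w][0] + priority, best[c - w][1] + (name,))
            if cand[0] > best[c][0]:
                best[c] = cand
    return max(best)[1]

# Hypothetical appliances: (name, power in kW, user priority)
chosen = schedule_appliances(
    [("washer", 2.0, 3.0), ("dryer", 3.0, 2.0),
     ("dishwasher", 1.5, 2.5), ("ev_charger", 7.0, 4.0)],
    power_cap=7.0)
```

With a 7 kW cap the high-power EV charger is deferred out of the peak slot even though it has the highest individual priority, which is exactly the peak-shaving behavior the scheduler is meant to produce.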

  6. Evaluation of the need for stochastic optimization of out-of-core nuclear fuel management decisions

    International Nuclear Information System (INIS)

    Thomas, R.L. Jr.

    1989-01-01

    Work has been completed on utilizing mathematical optimization techniques to optimize out-of-core nuclear fuel management decisions. The objective of such optimization is to minimize the levelized fuel cycle cost over some planning horizon. Typical decision variables include feed enrichments and number of assemblies, burnable poison requirements, and burned fuel to reinsert for every cycle in the planning horizon. Engineering constraints imposed consist of such items as discharge burnup limits, maximum enrichment limit, and target cycle energy productions. Earlier the authors reported on the development of the OCEON code, which employs the integer Monte Carlo Programming method as the mathematical optimization method. The discharge burnups, feed enrichment, and burnable poison requirements are evaluated, initially employing a linear reactivity core physics model and refined using a coarse mesh nodal model. The economic evaluation is completed using a modification of the CINCAS methodology. Interest now is to assess the need for stochastic optimization, which would account for uncertainties in cost components and cycle energy production. The implication of the present studies is that stochastic optimization with regard to cost component uncertainties need not be completed, since deterministic optimization will identify nearly the same family of near-optimum cycling schemes

  7. Review: Optimization methods for groundwater modeling and management

    Science.gov (United States)

    Yeh, William W.-G.

    2015-09-01

    Optimization methods have been used in groundwater modeling as well as for the planning and management of groundwater systems. This paper reviews and evaluates the various optimization methods that have been used for solving the inverse problem of parameter identification (estimation), experimental design, and groundwater planning and management. Various model selection criteria are discussed, as well as criteria used for model discrimination. The inverse problem of parameter identification concerns the optimal determination of model parameters using water-level observations. In general, the optimal experimental design seeks to find sampling strategies for the purpose of estimating the unknown model parameters. A typical objective of optimal conjunctive-use planning of surface water and groundwater is to minimize the operational costs of meeting water demand. The optimization methods include mathematical programming techniques such as linear programming, quadratic programming, dynamic programming, stochastic programming, nonlinear programming, and the global search algorithms such as genetic algorithms, simulated annealing, and tabu search. Emphasis is placed on groundwater flow problems as opposed to contaminant transport problems. A typical two-dimensional groundwater flow problem is used to explain the basic formulations and algorithms that have been used to solve the formulated optimization problems.
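The basic linear-programming formulation reviewed above can be made concrete with a deliberately tiny example. The cost and response-matrix coefficients below are invented for illustration, and because the problem has only two variables the solver simply enumerates constraint-intersection vertices rather than calling a full simplex implementation:

```python
import itertools

def solve_two_well_lp():
    """Tiny conjunctive-use LP of the kind reviewed here: minimize the
    pumping cost 2*q1 + 3*q2 subject to meeting demand q1 + q2 >= 10 and
    drawdown limits built from (made-up) unit response coefficients:
    0.4*q1 + 0.1*q2 <= 3 and 0.1*q1 + 0.3*q2 <= 3, with q1, q2 >= 0.
    With two variables the optimum lies at a vertex of the feasible
    region, so intersections of constraint pairs are enumerated."""
    # each constraint as a*q1 + b*q2 <= rhs (demand rewritten as <=)
    cons = [(-1.0, -1.0, -10.0), (0.4, 0.1, 3.0), (0.1, 0.3, 3.0),
            (-1.0, 0.0, 0.0), (0.0, -1.0, 0.0)]

    def feasible(q1, q2):
        return all(a * q1 + b * q2 <= r + 1e-9 for a, b, r in cons)

    best = None
    for (a1, b1, r1), (a2, b2, r2) in itertools.combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel constraints, no vertex
        q1 = (r1 * b2 - r2 * b1) / det  # Cramer's rule
        q2 = (a1 * r2 - a2 * r1) / det
        if feasible(q1, q2):
            cost = 2.0 * q1 + 3.0 * q2
            if best is None or cost < best[0]:
                best = (cost, q1, q2)
    return best
```

Real management models of the kind the review covers have many wells, time steps, and head constraints, and are passed to an LP or NLP solver; the structure, however, is the same.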

  8. Self-adaptive global best harmony search algorithm applied to reactor core fuel management optimization

    International Nuclear Information System (INIS)

    Poursalehi, N.; Zolfaghari, A.; Minuchehr, A.; Valavi, K.

    2013-01-01

    Highlights: • SGHS enhanced the convergence rate of LPO using some improvements in comparison to basic HS and GHS. • The SGHS optimization algorithm obtained better fitness on average than the basic HS and GHS algorithms. • The upshot of the SGHS implementation in LPO reveals its flexibility, efficiency and reliability. - Abstract: The aim of this work is to apply a newly developed optimization algorithm, Self-adaptive Global best Harmony Search (SGHS), to PWR fuel management optimization. The SGHS algorithm has some modifications in comparison with the basic Harmony Search (HS) and Global-best Harmony Search (GHS) algorithms, such as dynamic adjustment of parameters. To demonstrate the ability of SGHS to find an optimal configuration of fuel assemblies, the basic Harmony Search (HS) and Global-best Harmony Search (GHS) algorithms have also been developed and investigated. For this purpose, the Self-adaptive Global best Harmony Search Nodal Expansion package (SGHSNE) has been developed, implementing the HS, GHS and SGHS optimization algorithms for the fuel management operation of nuclear reactor cores. This package uses a developed average-current nodal expansion code which solves the multi-group diffusion equation by employing the first and second orders of the Nodal Expansion Method (NEM) for two-dimensional hexagonal and rectangular geometries, respectively, with one node per FA. Loading pattern optimization was performed using the SGHSNE package for some test cases to demonstrate the capability of the SGHS algorithm to converge to a near-optimal loading pattern. Results indicate that the convergence rate and reliability of the SGHS method are quite promising and that, practically, SGHS improves the quality of loading pattern optimization results relative to the HS and GHS algorithms. As a result, it has the potential to be used in other nuclear engineering optimization problems
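A basic harmony search, the baseline that SGHS modifies, can be sketched as follows. A continuous toy objective stands in for the discrete loading-pattern problem, and the shrinking bandwidth is only a hint of the dynamic parameter control SGHS adds:

```python
import random

def harmony_search(cost, dim, bounds, hms=10, hmcr=0.9, par=0.3,
                   iterations=2000, seed=7):
    """Basic harmony search, the baseline that SGHS builds on (SGHS
    additionally adapts HMCR, PAR and the bandwidth during the run).
    Minimizes 'cost' over a box; hms is the harmony memory size."""
    rng = random.Random(seed)
    lo, hi = bounds
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for it in range(iterations):
        bw = (hi - lo) * 0.1 * (1.0 - it / iterations)  # shrinking bandwidth
        new = []
        for d in range(dim):
            if rng.random() < hmcr:               # memory consideration
                x = rng.choice(memory)[d]
                if rng.random() < par:            # pitch adjustment
                    x = min(hi, max(lo, x + rng.uniform(-bw, bw)))
            else:                                 # random selection
                x = rng.uniform(lo, hi)
            new.append(x)
        worst = max(memory, key=cost)
        if cost(new) < cost(worst):               # replace worst harmony
            memory[memory.index(worst)] = new
    return min(memory, key=cost)

best_harmony = harmony_search(lambda v: sum(x * x for x in v),
                              dim=4, bounds=(-5.0, 5.0))
```

For loading-pattern optimization the continuous vector would be replaced by a permutation of fuel assemblies and the cost by a core-physics evaluation, as SGHSNE does with its nodal expansion solver.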

  9. Spent fuel management fee methodology and computer code user's manual

    International Nuclear Information System (INIS)

    Engel, R.L.; White, M.K.

    1982-01-01

    The methodology and computer model described here were developed to analyze the cash flows for the federal government taking title to and managing spent nuclear fuel. The methodology has been used by the US Department of Energy (DOE) to estimate the spent fuel disposal fee that will provide full cost recovery. Although the methodology was designed to analyze interim storage followed by spent fuel disposal, it could be used to calculate a fee for reprocessing spent fuel and disposing of the waste. The methodology consists of two phases. The first phase estimates government expenditures for spent fuel management. The second phase determines the fees that will result in revenues such that the government attains full cost recovery, assuming various revenue collection philosophies. These two phases are discussed in detail in subsequent sections of this report. Each of the two phases constitutes a computer module, called SPADE (SPent fuel Analysis and Disposal Economics) and FEAN (FEe ANalysis), respectively
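The fee logic split between the two modules can be reduced to one formula: the fee that equates the present value of revenues with the present value of costs. The function below is a generic sketch of that calculation, not the SPADE/FEAN implementation:

```python
def levelized_fee(costs, fuel_kg, discount_rate):
    """Full-cost-recovery fee in the spirit of the SPADE/FEAN split: the
    first phase supplies yearly expenditures and spent-fuel receipts, and
    the second finds the constant per-kg fee whose discounted revenues
    equal discounted costs.  Year 0 is the reference year; the real model
    additionally supports several revenue collection philosophies."""
    pv_costs = sum(c / (1.0 + discount_rate) ** t for t, c in enumerate(costs))
    pv_kg = sum(q / (1.0 + discount_rate) ** t for t, q in enumerate(fuel_kg))
    return pv_costs / pv_kg
```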

  10. Optimal management of complications associated with achondroplasia

    Science.gov (United States)

    Ireland, Penny J; Pacey, Verity; Zankl, Andreas; Edwards, Priya; Johnston, Leanne M; Savarirayan, Ravi

    2014-01-01

    Achondroplasia is the most common form of skeletal dysplasia, resulting in disproportionate short stature, and affects over 250,000 people worldwide. Individuals with achondroplasia demonstrate a number of well-recognized anatomical features that impact on growth and development, with a complex array of medical issues that are best managed through a multidisciplinary team approach. The complexity of this presentation, whereby individual impairments may impact upon multiple activity and participation areas, requires consideration and discussion under a broad framework to gain a more thorough understanding of the experience of this condition for individuals with achondroplasia. This paper examines the general literature and research evidence on the medical and health aspects of individuals with achondroplasia and presents a pictorial model of achondroplasia based on The International Classification of Functioning, Disability, and Health (ICF). An expanded model of the ICF will be used to review and present the current literature pertaining to the musculoskeletal, neurological, cardiorespiratory, and ear, nose, and throat impairments and complications across the lifespan, with discussion on the impact of these impairments upon activity and participation performance. Further research is required to fully identify factors influencing participation and to help develop strategies to address these factors. PMID:25053890

  11. Optimal management of idiopathic macular holes.

    Science.gov (United States)

    Madi, Haifa A; Masri, Ibrahim; Steel, David H

    2016-01-01

    This review evaluates the current surgical options for the management of idiopathic macular holes (IMHs), including vitrectomy, ocriplasmin (OCP), and expansile gas use, and discusses key background information to inform the choice of treatment. An evidence-based approach to selecting the best treatment option for the individual patient based on IMH characteristics and patient-specific factors is suggested. For holes without vitreomacular attachment (VMA), vitrectomy is the only option with three key surgical variables: whether to peel the inner limiting membrane (ILM), the type of tamponade agent to be used, and the requirement for postoperative face-down posturing. There is a general consensus that ILM peeling improves primary anatomical hole closure rate; however, in small holes (<250 µm), it is uncertain whether peeling is always required. It has been increasingly recognized that long-acting gas and face-down positioning are not always necessary in patients with small- and medium-sized holes, but large (>400 µm) and chronic holes (>1-year history) are usually treated with long-acting gas and posturing. Several studies on posturing and gas choice were carried out in combination with ILM peeling, which may also influence the gas and posturing requirement. Combined phacovitrectomy appears to offer more rapid visual recovery without affecting the long-term outcomes of vitrectomy for IMH. OCP is licensed for use in patients with small- or medium-sized holes and VMA. A greater success rate in using OCP has been reported in smaller holes, but further predictive factors for its success are needed to refine its use. It is important to counsel patients realistically regarding the rates of success with intravitreal OCP and its potential complications. Expansile gas can be considered as a further option in small holes with VMA; however, larger studies are required to provide guidance on its use.

  12. Purchasing and inventory management techniques for optimizing inventory investment

    International Nuclear Information System (INIS)

    McFarlane, I.; Gehshan, T.

    1993-01-01

    In an effort to reduce operations and maintenance costs among nuclear plants, many utilities are taking a closer look at their inventory investment. Various approaches for inventory reduction have been used and discussed, but these approaches are often limited to an inventory management perspective. Interaction with purchasing and planning personnel to reduce inventory investment is a necessity in utility efforts to become more cost competitive. This paper addresses the activities that purchasing and inventory management personnel should conduct in an effort to optimize inventory investment while maintaining service-level goals. Other functions within a materials management organization, such as the warehousing and investment recovery functions, can contribute to optimizing inventory investment. However, these are not addressed in this paper because their contributions often come after inventory management and purchasing decisions have been made

  13. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, K [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Takashina, M; Koizumi, M [Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Das, I; Moskvin, V [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)

    2014-06-01

    Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) of the GATE and PHITS codes have not been reported; they are studied here for PDD and proton range in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physical and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show a good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to the calculated range of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the results for PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physical model, particle transport mechanics and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health

  14. Optimization of Coding of AR Sources for Transmission Across Channels with Loss

    DEFF Research Database (Denmark)

    Arildsen, Thomas

    Source coding concerns the representation of information in a source signal using as few bits as possible. In the case of lossy source coding, it is the encoding of a source signal using the fewest possible bits at a given distortion or, at the lowest possible distortion given a specified bit rate. ... Channel coding is usually applied in combination with source coding to ensure reliable transmission of the (source coded) information at the maximal rate across a channel given the properties of this channel. In this thesis, we consider the coding of auto-regressive (AR) sources which are sources that can ... compared to the case where the encoder is unaware of channel loss. We finally provide an extensive overview of cross-layer communication issues which are important to consider due to the fact that the proposed algorithm interacts with the source coding and exploits channel-related information typically ...

  15. Combining independent de novo assemblies optimizes the coding transcriptome for nonconventional model eukaryotic organisms.

    Science.gov (United States)

    Cerveau, Nicolas; Jackson, Daniel J

    2016-12-09

    Next-generation sequencing (NGS) technologies are arguably the most revolutionary technical development to join the list of tools available to molecular biologists since PCR. For researchers working with nonconventional model organisms one major problem with the currently dominant NGS platform (Illumina) stems from the obligatory fragmentation of nucleic acid material that occurs prior to sequencing during library preparation. This step creates a significant bioinformatic challenge for accurate de novo assembly of novel transcriptome data. This challenge becomes apparent when a variety of modern assembly tools (of which there is no shortage) are applied to the same raw NGS dataset. With the same assembly parameters these tools can generate markedly different assembly outputs. In this study we present an approach that generates an optimized consensus de novo assembly of eukaryotic coding transcriptomes. This approach does not represent a new assembler, rather it combines the outputs of a variety of established assembly packages, and removes redundancy via a series of clustering steps. We test and validate our approach using Illumina datasets from six phylogenetically diverse eukaryotes (three metazoans, two plants and a yeast) and two simulated datasets derived from metazoan reference genome annotations. All of these datasets were assembled using three currently popular assembly packages (CLC, Trinity and IDBA-tran). In addition, we experimentally demonstrate that transcripts unique to one particular assembly package are likely to be bioinformatic artefacts. For all eight datasets our pipeline generates more concise transcriptomes that in fact possess more unique annotatable protein domains than any of the three individual assemblers we employed. Another measure of assembly completeness (using the purpose built BUSCO databases) also confirmed that our approach yields more information. Our approach yields coding transcriptome assemblies that are more likely to be
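The redundancy-removal idea at the heart of the pipeline can be illustrated with a greedy containment filter; the actual pipeline relies on established clustering tools and sequence-identity thresholds, so this is only a toy version:

```python
def remove_redundancy(transcripts):
    """Greedy redundancy removal in the spirit of the pipeline's
    clustering steps: collapse exact duplicates and sequences fully
    contained in a longer transcript, keeping the longest transcript of
    each cluster as its representative.  Real tools additionally cluster
    near-identical sequences below a similarity threshold."""
    kept = []
    for seq in sorted(set(transcripts), key=len, reverse=True):
        if not any(seq in representative for representative in kept):
            kept.append(seq)
    return kept
```

Applied to the merged output of several assemblers, such a filter shrinks the transcript set while retaining every sequence that carries unique information, which is the effect the study measures with annotatable protein domains and BUSCO completeness.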

  16. Integrating the nursing management minimum data set into the logical observation identifier names and codes system.

    Science.gov (United States)

    Subramanian, Amarnath; Westra, Bonnie; Matney, Susan; Wilson, Patricia S; Delaney, Connie W; Huff, Stan; Huff, Stanley M; Huber, Diane

    2008-11-06

    This poster describes the process used to integrate the Nursing Management Minimum Data Set (NMMDS), an instrument to measure the nursing context of care, into the Logical Observation Identifier Names and Codes (LOINC) system to facilitate contextualization of quality measures. Integration of the first three of 18 elements resulted in 48 new codes including five panels. The LOINC Clinical Committee has approved the presented mapping for their next release.

  17. The management-retrieval code of nuclear level density sub-library (CENPL-NLD)

    International Nuclear Information System (INIS)

    Ge Zhigang; Su Zongdi; Huang Zhongfu; Dong Liaoyuan

    1995-01-01

    The management-retrieval code of the Nuclear Level Density (NLD) sub-library is presented. It offers two retrieval modes: single nucleus (SN) and neutron reaction (NR); the latter supports four retrieval types. This code can not only retrieve level density parameters and the data related to the level density, but can also calculate the relevant data using different level density parameters and compare the calculated results with related data, in order to help the user select level density parameters

  18. Aging plant life management - the requirements defined to date by the KTA nuclear engineering codes

    International Nuclear Information System (INIS)

    Kalinowski, I.

    1996-01-01

    German nuclear engineering codes so far do not include a specific aging plant life management programme. However, the existing codes and standards do contain a number of applicable requirements and principles of relevance to the objectives and principles of such programmes, as they also cover aging-induced effects on power plants. The major principles relating to preventive safety engineering and quality assurance are laid down in the publications KTA 1401, 1404, 1201, 1202, and KTA 3211. (DG) [de

  19. Optimal management of idiopathic macular holes

    Directory of Open Access Journals (Sweden)

    Madi HA

    2016-01-01

    Full Text Available Haifa A Madi,1,* Ibrahim Masri,1,* David H Steel1,2 1Sunderland Eye Infirmary, Sunderland, 2Institute of Genetic Medicine, Newcastle University, International Centre for Life, Newcastle, UK *These authors contributed equally to this work Abstract: This review evaluates the current surgical options for the management of idiopathic macular holes (IMHs), including vitrectomy, ocriplasmin (OCP), and expansile gas use, and discusses key background information to inform the choice of treatment. An evidence-based approach to selecting the best treatment option for the individual patient based on IMH characteristics and patient-specific factors is suggested. For holes without vitreomacular attachment (VMA), vitrectomy is the only option with three key surgical variables: whether to peel the inner limiting membrane (ILM), the type of tamponade agent to be used, and the requirement for postoperative face-down posturing. There is a general consensus that ILM peeling improves primary anatomical hole closure rate; however, in small holes (<250 µm), it is uncertain whether peeling is always required. It has been increasingly recognized that long-acting gas and face-down positioning are not always necessary in patients with small- and medium-sized holes, but large (>400 µm) and chronic holes (>1-year history) are usually treated with long-acting gas and posturing. Several studies on posturing and gas choice were carried out in combination with ILM peeling, which may also influence the gas and posturing requirement. Combined phacovitrectomy appears to offer more rapid visual recovery without affecting the long-term outcomes of vitrectomy for IMH. OCP is licensed for use in patients with small- or medium-sized holes and VMA. A greater success rate in using OCP has been reported in smaller holes, but further predictive factors for its success are needed to refine its use. It is important to counsel patients realistically regarding the rates of success with

  20. Code Optimization, Frozen Glassy Phase and Improved Decoding Algorithms for Low-Density Parity-Check Codes

    International Nuclear Information System (INIS)

    Huang Hai-Ping

    2015-01-01

    The statistical physics properties of low-density parity-check codes for the binary symmetric channel are investigated as a spin glass problem with multi-spin interactions and quenched random fields by the cavity method. By evaluating the entropy function at the Nishimori temperature, we find that irregular constructions with a heterogeneous degree distribution of check (bit) nodes have higher decoding thresholds than regular counterparts with a homogeneous degree distribution. We also show that the instability of the mean-field calculation takes place only after the entropy crisis, suggesting the presence of a frozen glassy phase at low temperatures. When no prior knowledge of channel noise is assumed (searching for the ground state), we find that a reinforced strategy on normal belief propagation will boost the decoding threshold to a higher value than normal belief propagation alone. This value is close to the dynamical transition, where all local search heuristics fail to identify the true message (codeword, or the ferromagnetic state). After the dynamical transition, the number of metastable states with larger energy density (than the ferromagnetic state) becomes exponentially large. When the noise level of the transmission channel approaches the static transition point, exponentially many codewords sharing the identical ferromagnetic energy start to exist. (condensed matter: electronic structure, electrical, magnetic, and optical properties)
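    As a concrete toy for the decoding problem discussed above, the sketch below implements Gallager's bit-flipping decoder, a simpler relative of the belief propagation studied in the abstract. The parity-check matrix and the single-bit channel error are invented for illustration, not taken from the paper.

```python
# Toy LDPC decoding over a binary symmetric channel via bit flipping.
# H is a small hand-made parity-check matrix (hypothetical example).

H = [  # each row is one parity check over the 6 code bits
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
    [0, 0, 1, 1, 0, 1],
]

def syndrome(H, word):
    """Parity of each check; an all-zero syndrome means a valid codeword."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

def bit_flip_decode(H, received, max_iters=20):
    word = list(received)
    for _ in range(max_iters):
        s = syndrome(H, word)
        if not any(s):
            return word  # all checks satisfied
        # count unsatisfied checks touching each bit
        votes = [sum(s[i] for i, row in enumerate(H) if row[j])
                 for j in range(len(word))]
        # flip the bit involved in the most unsatisfied checks
        worst = max(range(len(word)), key=lambda j: votes[j])
        word[worst] ^= 1
    return word

codeword = [0, 0, 0, 0, 0, 0]          # the all-zero word is always valid
noisy = list(codeword); noisy[2] ^= 1  # one channel flip
decoded = bit_flip_decode(H, noisy)    # recovers the all-zero codeword
```

    Near the decoding threshold discussed in the abstract, such local heuristics get trapped in the metastable states; on this toy instance the single error is corrected in one iteration.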

  1. Delegated Portfolio Management and Optimal Allocation of Portfolio Managers

    DEFF Research Database (Denmark)

    Christensen, Michael; Vangsgaard Christensen, Michael; Gamskjaer, Ken

    2015-01-01

    In this article, we investigate whether the application of the mean-variance framework on portfolio manager allocation offers any out-of-sample benefits compared to a naïve strategy of equal weighting. Based on an exclusive data-set of high-net-worth (HNW) investors, we utilize a wide variety of ...
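    The mean-variance versus naïve 1/N comparison can be sketched in a few lines. The expected returns and covariances below are invented numbers, and the closed-form weights w ∝ Σ⁻¹μ are the textbook unconstrained solution, not the paper's empirical procedure.

```python
# Sketch (hypothetical numbers): mean-variance weights for two portfolio
# managers versus the naive 1/N rule discussed in the abstract.
mu = [0.08, 0.05]             # expected excess returns (assumed)
cov = [[0.04, 0.01],          # covariance matrix (assumed)
       [0.01, 0.02]]

def mv_weights(mu, cov):
    """Unconstrained mean-variance weights w ∝ Σ⁻¹μ, normalised to sum to 1."""
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    inv = [[ cov[1][1] / det, -cov[0][1] / det],
           [-cov[1][0] / det,  cov[0][0] / det]]
    raw = [inv[0][0] * mu[0] + inv[0][1] * mu[1],
           inv[1][0] * mu[0] + inv[1][1] * mu[1]]
    total = raw[0] + raw[1]
    return [w / total for w in raw]

w_mv = mv_weights(mu, cov)    # tilts toward the low-variance manager
w_naive = [0.5, 0.5]          # the 1/N benchmark
```

    The article's out-of-sample question is precisely whether estimation error in mu and cov erodes the theoretical advantage of w_mv over w_naive.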

  2. Reactor physics computer code development for neutronic design, fuel-management, reactor operation and safety analysis of PHWRs

    International Nuclear Information System (INIS)

    Rastogi, B.P.

    1989-01-01

    This report discusses various reactor physics codes developed for neutronic design, fuel-management, reactor operation and safety analysis of PHWRs. These code packages have been utilized for nuclear design of 500 MWe and new 235 MWe PHWRs. (author)

  3. Content Adaptive Lagrange Multiplier Selection for Rate-Distortion Optimization in 3-D Wavelet-Based Scalable Video Coding

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2018-03-01

    Full Text Available Rate-distortion optimization (RDO) plays an essential role in substantially enhancing coding efficiency. Currently, rate-distortion optimized mode decision is widely used in scalable video coding (SVC). Among all possible coding modes, it aims to select the one with the best trade-off between bitrate and compression distortion. Specifically, this trade-off is tuned through the choice of the Lagrange multiplier. Despite the prevalence of the conventional method for Lagrange multiplier selection in hybrid video coding, the underlying formulation is not applicable to 3-D wavelet-based SVC, where explicit values of the quantization step are not available and the content features of the input signal are not considered. In this paper, an efficient content adaptive Lagrange multiplier selection algorithm is proposed in the context of RDO for 3-D wavelet-based SVC targeting quality scalability. Our contributions are two-fold. First, we introduce a novel weighting method, which takes account of the mutual information, gradient per pixel, and texture homogeneity to measure the temporal subband characteristics after applying the motion-compensated temporal filtering (MCTF) technique. Second, based on the proposed subband weighting factor model, we derive the optimal Lagrange multiplier. Experimental results demonstrate that the proposed algorithm enables more satisfactory video quality with negligible additional computational complexity.
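    At the mode-decision level, the Lagrange-multiplier trade-off described above reduces to minimising the cost J = D + λR over the candidate modes. A minimal sketch with invented mode names and (rate, distortion) statistics:

```python
# Lagrangian rate-distortion-optimised mode decision: among candidate
# coding modes, pick the one minimising J = D + lambda * R.
# Mode names and numbers are made up for illustration.

modes = {
    "intra": {"rate": 120.0, "distortion": 10.0},
    "inter": {"rate":  40.0, "distortion": 25.0},
    "skip":  {"rate":   2.0, "distortion": 60.0},
}

def best_mode(modes, lam):
    """Return the mode with minimal Lagrangian cost J = D + lam * R."""
    return min(modes,
               key=lambda m: modes[m]["distortion"] + lam * modes[m]["rate"])

# A large lambda penalises rate; a small lambda weights distortion heavily.
low_rate_priority = best_mode(modes, lam=1.0)    # -> "skip"
quality_priority = best_mode(modes, lam=0.01)    # -> "intra"
```

    The paper's contribution is how to pick λ per temporal subband when the quantization-step formula of hybrid codecs is unavailable.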

  4. A model for the optimal risk management of (farm) firms

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    Current methods of risk management focus on efficiency and do not provide operational answers to the basic question of how to optimise and balance the two objectives, maximisation of expected income and minimisation of risk. This paper uses the Capital Asset Pricing Model (CAPM) to derive...... an operational criterion for the optimal risk management of firms. The criterion assumes that the objective of the firm manager is to maximise the market value of the firm and is based on the condition that the application of risk management tools has a symmetric effect on the variability of income around...... the mean. The criterion is based on the expected consequences of risk management on relative changes in the variance of return on equity and expected income. The paper demonstrates how the criterion may be used to evaluate and compare the effect of different risk management tools, and it illustrates how...

  5. Development of Geometry Optimization Methodology with In-house CFD code, and Challenge in Applying to Fuel Assembly

    International Nuclear Information System (INIS)

    Jeong, J. H.; Lee, K. L.

    2016-01-01

    The wire spacer has important roles: to avoid collisions between adjacent rods, to mitigate vortex-induced vibration, and to enhance convective heat transfer through the secondary flow it induces. Many experimental and numerical works have been conducted to understand the thermal-hydraulics of wire-wrapped fuel bundles, and the recent huge increase in computer power has made three-dimensional simulation of their thermal-hydraulics feasible. In this study, a geometry optimization methodology based on a RANS in-house CFD (Computational Fluid Dynamics) code has been successfully developed under air conditions. In order to apply the developed methodology to fuel assemblies, a GGI (General Grid Interface) function was developed for the in-house CFD code, as in CFX. Furthermore, three-dimensional flow fields calculated with the in-house code were compared with those calculated with the general-purpose commercial CFD solver CFX. Even though both analyses were conducted with the same computational meshes, numerical error due to the GGI function occurred locally only in the CFX solver, around the rod surface and in the boundary region between the inner and outer fluid regions.

  6. Principles for a Code of Conduct for the Management and Sustainable Use of Mangrove Ecosystems

    DEFF Research Database (Denmark)

    Macintosh, Donald; Nielsen, Thomas; Zweig, Ronald

    mangrove forest ecosystems worldwide, the World Bank commissioned a study with the title "Mainstreaming conservation of coastal biodiversity through formulation of a generic Code of Conduct for Sustainable Management of Mangrove Forest Ecosystems". Formulation of these Principles for a Code of Conduct...... and the sustainable use of mangrove resources. It recommends key legislation and enforcement mechanisms (e.g. governmental and/or community based) considered necessary to ensure the effective conservation, protection and sustainable use of mangroves. The Principles for a Code of Conduct for mangroves was prepared......, Africa, and Central and South America. These workshops provided an opportunity to seek expert advice regarding practical examples of sound mangrove management, or problems for management, from each region, and to illustrate them in the working document. A peer review workshop was held in Washington...

  7. Views of Health Information Management Staff on the Medical Coding Software in Mashhad, Iran.

    Science.gov (United States)

    Kimiafar, Khalil; Hemmati, Fatemeh; Banaye Yazdipour, Alireza; Sarbaz, Masoumeh

    2018-01-01

    Systematic evaluation of Health Information Technology (HIT) and users' views leads to the modification and development of these technologies in accordance with users' needs. The purpose of this study was to investigate the views of Health Information Management (HIM) staff on the quality of medical coding software. A descriptive cross-sectional study was conducted between May and July 2016 in 26 hospitals (academic and non-academic) in Mashhad, north-eastern Iran. The study population consisted of the chairs of HIM departments and medical coders (58 staff). Data were collected through a valid and reliable questionnaire and analyzed using SPSS version 16.0. In the views of staff, among the advantages of coding software, reducing coding time had the highest average score (Mean=3.82) and cost reduction the lowest (Mean=3.20). Meanwhile, concern about losing job opportunities was the least important disadvantage (15.5%) of using coding software. In general, the results of this study showed that coding software has deficiencies in some cases. Designers and developers of health information coding software should pay more attention to technical aspects, in-work reminders, support for selecting proper codes through access to coding rules, maintenance services, links to other relevant databases, and the possibility of providing brief and detailed reports in different formats.

  8. Optimal energy management for a flywheel-based hybrid vehicle

    NARCIS (Netherlands)

    Berkel, van K.; Hofman, T.; Vroemen, B.G.; Steinbuch, M.

    2011-01-01

    This paper presents the modeling and design of an optimal Energy Management Strategy (EMS) for a flywheel-based hybrid vehicle, that does not use any electrical motor/generator, or a battery, for its hybrid functionalities. The hybrid drive train consists of only low-cost components, such as a

  9. Electricity portfolio management : optimal peak/off-peak allocations

    NARCIS (Netherlands)

    Huisman, R.; Mahieu, R.J.; Schlichter, F.

    2009-01-01

    Electricity purchasers manage a portfolio of contracts in order to purchase the expected future electricity consumption profile of a company or a pool of clients. This paper proposes a mean-variance framework to address the concept of structuring the portfolio and focuses on how to optimally

  10. Nickel-Cadmium Battery Operation Management Optimization Using Robust Design

    Science.gov (United States)

    Blosiu, Julian O.; Deligiannis, Frank; DiStefano, Salvador

    1996-01-01

    In recent years, following several spacecraft battery anomalies, it was determined that managing the operational factors of NASA flight NiCd rechargeable batteries was very important in order to maintain nominal space flight battery performance. The optimization of existing flight battery operational performance was viewed as a novel application of Taguchi Methods.

  11. Optimal detection and control strategies for invasive species management

    Science.gov (United States)

    Shefali V. Mehta; Robert G. Haight; Frances R. Homans; Stephen Polasky; Robert C. Venette

    2007-01-01

    The increasing economic and environmental losses caused by non-native invasive species amplify the value of identifying and implementing optimal management options to prevent, detect, and control invasive species. Previous literature has focused largely on preventing introductions of invasive species and post-detection control activities; few have addressed the role of...

  12. Optimizing Resource and Energy Recovery for Municipal Solid Waste Management

    Science.gov (United States)

    Significant reductions of carbon emissions and air quality impacts can be achieved by optimizing municipal solid waste (MSW) as a resource. Materials and discards management were found to contribute ~40% of overall U.S. GHG emissions as a result of materials extraction, transpo...

  13. Optimization of tritium management within the ITER project

    International Nuclear Information System (INIS)

    Cortes, P.; Elbez-Uzan, J.; Glugla, M.; Rosanvallon, S.; Ciattaglia, S.; Iseli, M.; Rodriguez-Rodrigo, L.

    2009-01-01

    The authors describe the tritium cycle within the ITER project, which has been considered since its beginning. They indicate how confinement systems ensure tritium confinement and how tritium is recovered and processed. They also indicate the different ways of optimizing tritium management that have been identified and integrated into the ITER design.

  14. Modeling the optimal management of spent nuclear fuel

    International Nuclear Information System (INIS)

    Nachlas, J.A.; Kurstedt, H.A. Jr.; Swindle, D.W. Jr.; Korcz, K.O.

    1977-01-01

    Recent governmental policy decisions dictate that strategies for managing spent nuclear fuel be developed. Two models are constructed to investigate the optimum residence time and the optimal inventory withdrawal policy for fuel material that presently must be stored. The mutual utility of the models is demonstrated through reference case application

  15. Real-Time Demand Side Management Algorithm Using Stochastic Optimization

    Directory of Open Access Journals (Sweden)

    Moses Amoasi Acquah

    2018-05-01

    Full Text Available A demand side management technique is deployed along with battery energy-storage systems (BESS to lower the electricity cost by mitigating the peak load of a building. Most of the existing methods rely on manual operation of the BESS, or even an elaborate building energy-management system resorting to a deterministic method that is susceptible to unforeseen growth in demand. In this study, we propose a real-time optimal operating strategy for BESS based on density demand forecast and stochastic optimization. This method takes into consideration uncertainties in demand when accounting for an optimal BESS schedule, making it robust compared to the deterministic case. The proposed method is verified and tested against existing algorithms. Data obtained from a real site in South Korea is used for verification and testing. The results show that the proposed method is effective, even for the cases where the forecasted demand deviates from the observed demand.
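    A minimal sketch of scenario-aware peak shaving (a crude greedy stand-in, not the paper's stochastic optimization): average the demand scenarios and discharge the battery against the highest expected hours first. All demand figures and the battery size are invented.

```python
# Scenario-based peak shaving with a battery (hypothetical numbers).
demand_scenarios = [                 # kW per hour, three forecast scenarios
    [30, 50, 80, 60],
    [35, 55, 90, 65],
    [25, 45, 85, 55],
]
battery_energy = 40.0                # kWh available for discharge (assumed)

def shave_schedule(scenarios, energy):
    """Discharge against the expected demand profile, highest hours first."""
    hours = len(scenarios[0])
    expected = [sum(s[h] for s in scenarios) / len(scenarios)
                for h in range(hours)]
    discharge = [0.0] * hours
    remaining = energy
    for h in sorted(range(hours), key=lambda k: expected[k], reverse=True):
        if remaining <= 0:
            break
        # bring this hour down toward the next-highest shaved load
        others = [expected[k] - discharge[k] for k in range(hours) if k != h]
        cut = min(remaining, max(0.0, expected[h] - discharge[h] - max(others)))
        discharge[h] += cut
        remaining -= cut
    return expected, discharge

expected, discharge = shave_schedule(demand_scenarios, battery_energy)
shaved = [e - d for e, d in zip(expected, discharge)]   # peak 85 -> 60 kW
```

    The greedy rule stops at ties between hours and ignores demand uncertainty beyond the mean, which is exactly the gap the paper's density-forecast-based stochastic method addresses.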

  16. The Optimization of Radioactive Waste Management in the Nuclear Installation Decommissioning Process

    International Nuclear Information System (INIS)

    Zachar, Matej; Necas, Vladimir

    2008-01-01

    The paper presents a basic characterization of the nuclear installation decommissioning process, especially in terms of radioactive materials management. A large amount of solid material and secondary waste created by decommissioning activities has to be managed with consideration of its physical, chemical, toxic and radiological characteristics. Radioactive materials should, after fulfilling all the conditions defined by the authorities, be released to the environment for further use. Non-releasable materials are considered to be radioactive waste. Their management includes various procedures, starting with pre-treatment activities and continuing with storage, treatment and conditioning; finally, they are disposed of in near-surface or deep geological repositories. Considering the advantages and disadvantages of all possible ways of releasing material from a nuclear installation area, the material management process should be optimized. Emphasis is placed on the radiological parameters of the materials, the availability of waste management technologies and waste repositories, and the radiological limits and conditions for material release or waste disposal. Appropriate optimization of the material flow should lead to significant savings in money, disposal capacity and raw material resources. Using a suitable calculation code, e.g. OMEGA, the various material management scenarios can be evaluated and the best one selected on the basis of multi-criterion analysis. (authors)
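    The multi-criterion comparison of material-management scenarios that the abstract attributes to codes like OMEGA can be illustrated with a simple weighted-sum scoring. The scenarios, criterion values and weights below are all invented.

```python
# Weighted-sum multi-criterion scoring of decommissioning material-management
# scenarios (all names and numbers are hypothetical illustrations).

scenarios = {
    "release_after_decontamination": {"cost": 40, "disposal_volume": 10, "dose": 3},
    "direct_disposal":               {"cost": 70, "disposal_volume": 90, "dose": 1},
    "melt_and_recycle":              {"cost": 55, "disposal_volume": 20, "dose": 2},
}
weights = {"cost": 0.5, "disposal_volume": 0.3, "dose": 0.2}  # assumed priorities

def best_scenario(scenarios, weights):
    """Lower is better for every criterion; normalise by the criterion maximum."""
    maxima = {c: max(s[c] for s in scenarios.values()) for c in weights}
    def score(name):
        return sum(weights[c] * scenarios[name][c] / maxima[c] for c in weights)
    return min(scenarios, key=score)

chosen = best_scenario(scenarios, weights)
```

    Real analyses weigh many more criteria (repository capacity, clearance limits, worker dose) and would normalise and weight them with stakeholder input rather than assumed constants.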

  17. Optimal sequence of landfills in solid waste management

    Energy Technology Data Exchange (ETDEWEB)

    Andre, F.J. [Universidad Pablo de Olavide (Spain); Cerda, E. [Universidad Complutense de Madrid (Spain)

    2001-07-01

    Given that landfills are depletable and replaceable resources, the right approach to landfill management is to design an optimal sequence of landfills rather than to design every single landfill separately. In this paper, we use Optimal Control models, with mixed elements of both continuous- and discrete-time problems, to determine an optimal sequence of landfills with regard to their capacity and lifetime. The resulting optimization problems involve splitting a planning horizon into several subintervals, the length of which has to be decided. In each of the subintervals some costs, the amount of which depends on the value of the decision variables, have to be borne. The results obtained may be applied to other economic problems such as private and public investments, consumption decisions on durable goods, etc. (Author)

  18. AN OPTIMAL MAINTENANCE MANAGEMENT MODEL FOR AIRPORT CONCRETE PAVEMENT

    Science.gov (United States)

    Shimomura, Taizo; Fujimori, Yuji; Kaito, Kiyoyuki; Obama, Kengo; Kobayashi, Kiyoshi

    In this paper, an optimal management model is formulated for the performance-based rehabilitation/maintenance contract for airport concrete pavement, whereby two types of life-cycle cost risk, i.e., ground consolidation risk and concrete deterioration risk, are explicitly considered. A non-homogeneous Markov chain model is formulated to represent the deterioration processes of concrete pavement, which are conditional upon the ground consolidation processes. An optimal non-homogeneous Markov decision model with multiple types of risk is presented to design the optimal rehabilitation/maintenance plans, together with a methodology to revise those plans based upon monitoring data by Bayesian updating rules. The validity of the methodology presented in this paper is examined through case studies carried out for the H airport.
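    The non-homogeneous Markov deterioration idea can be sketched as a state distribution pushed through year-dependent transition matrices. The three condition states and the made-up consolidation-driven probabilities below are illustrative only, not the paper's calibrated model.

```python
# Non-homogeneous Markov deterioration sketch: transition probabilities
# between pavement condition states worsen over time (invented numbers).

def transition_matrix(year):
    """3 condition states (good, fair, poor); deterioration accelerates."""
    p = min(0.1 + 0.05 * year, 0.9)     # yearly chance of dropping one state
    return [
        [1 - p,     p,   0.0],
        [0.0,   1 - p,     p],
        [0.0,     0.0,   1.0],          # 'poor' is absorbing until repair
    ]

def propagate(initial, years):
    """Push the state distribution through the year-dependent matrices."""
    dist = list(initial)
    for year in range(years):
        m = transition_matrix(year)
        dist = [sum(dist[i] * m[i][j] for i in range(3)) for j in range(3)]
    return dist

dist = propagate([1.0, 0.0, 0.0], 5)    # start in 'good', evolve 5 years
```

    A Markov decision model like the paper's would add a repair action at each year that resets states at a cost, and would update the p(year) curve from monitoring data via Bayes' rule.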

  19. Housing Development Building Management System (HDBMS) For Optimized Electricity Bills

    Directory of Open Access Journals (Sweden)

    Weixian Li

    2017-08-01

    Full Text Available A smart building is a modern building that offers residents sustainable comfort with highly efficient electricity usage. These objectives can be achieved by applying appropriate, capable optimization algorithms and techniques. This paper presents a Housing Development Building Management System (HDBMS) strategy inspired by the Building Energy Management System (BEMS) concept that integrates with smart buildings using a Supply Side Management (SSM) and Demand Side Management (DSM) system. HDBMS is a Multi-Agent System (MAS) based decentralized decision-making system of the kind proposed by various authors. The MAS-based HDBMS was created in JAVA on an IEEE FIPA-compliant multi-agent platform named JADE. It allows agents to communicate, interact and negotiate over the energy supply and demand of smart buildings to provide optimal energy usage at minimal electricity cost. This reduces the load on the power distribution system of smart buildings; simulation studies have shown the potential of the proposed HDBMS strategy to provide an optimal solution for smart building energy management.

  20. A Distributed Flow Rate Control Algorithm for Networked Agent System with Multiple Coding Rates to Optimize Multimedia Data Transmission

    Directory of Open Access Journals (Sweden)

    Shuai Zeng

    2013-01-01

    Full Text Available With the development of wireless technologies, mobile communication is applied ever more widely in all walks of life. The social network of both fixed and mobile users can be seen as a networked agent system. At present, many kinds of devices and access network technologies are in use, so different users in this networked agent system may need multimedia data at different coding rates to match their heterogeneous demands. This paper proposes a distributed flow rate control algorithm to optimize multimedia data transmission in a networked agent system in which various coding rates coexist. In the proposed algorithm, the transmission paths and upload bandwidth for data of different coding rates between the source node and the fixed and mobile nodes are appropriately arranged and controlled. On the one hand, the algorithm provides user nodes with data at differentiated coding rates and the corresponding flow rates. On the other hand, it networks the data of different coding rates and the user nodes, enabling the sharing of upload bandwidth among user nodes that require data at different coding rates. The study builds a mathematical model of the proposed algorithm and compares a system that adopts it with an existing system through simulation experiments and mathematical analysis. The results show that the system adopting the proposed algorithm achieves higher upload-bandwidth utilization at the user nodes and lower upload-bandwidth consumption at the source node.

  1. How do primary care doctors in England and Wales code and manage people with chronic kidney disease? Results from the National Chronic Kidney Disease Audit.

    Science.gov (United States)

    Kim, Lois G; Cleary, Faye; Wheeler, David C; Caplin, Ben; Nitsch, Dorothea; Hull, Sally A

    2017-10-16

    In the UK, primary care records are electronic and require doctors to ascribe disease codes to direct care plans and facilitate safe prescribing. We investigated factors associated with coding of chronic kidney disease (CKD) in patients with reduced kidney function and the impact this has on patient management. We identified patients meeting biochemical criteria for CKD (two estimated glomerular filtration rates 90 days apart) from 1039 general practitioner (GP) practices in a UK audit. Clustered logistic regression was used to identify factors associated with coding for CKD and improvement in coding as a result of the audit process. We investigated the relationship between coding and five interventions recommended for CKD: achieving blood pressure targets, proteinuria testing, statin prescription and flu and pneumococcal vaccination. Of 256 000 patients with biochemical CKD, 30% did not have a GP CKD code. Males, older patients, those with more severe CKD, diabetes or hypertension or those prescribed statins were more likely to have a CKD code. Among those with continued biochemical CKD following audit, these same characteristics increased the odds of improved coding. Patients without any kidney diagnosis were less likely to receive optimal care than those coded for CKD [e.g. odds ratio for meeting blood pressure target 0.78 (95% confidence interval 0.76-0.79)]. Older age, male sex, diabetes and hypertension are associated with coding for those with biochemical CKD. CKD coding is associated with receiving key primary care interventions recommended for CKD. Increased efforts to incentivize CKD coding may improve outcomes for CKD patients. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA.

  2. Optimization of an Electromagnetics Code with Multicore Wavefront Diamond Blocking and Multi-dimensional Intra-Tile Parallelization

    KAUST Repository

    Malas, Tareq M.

    2016-07-21

    Understanding and optimizing the properties of solar cells is becoming a key issue in the search for alternatives to nuclear and fossil energy sources. A theoretical analysis via numerical simulations involves solving Maxwell's Equations in discretized form and typically requires substantial computing effort. We start from a hybrid-parallel (MPI+OpenMP) production code that implements the Time Harmonic Inverse Iteration Method (THIIM) with Finite-Difference Frequency Domain (FDFD) discretization. Although this algorithm has the characteristics of a strongly bandwidth-bound stencil update scheme, it is significantly different from the popular stencil types that have been exhaustively studied in the high performance computing literature to date. We apply a recently developed stencil optimization technique, multicore wavefront diamond tiling with multi-dimensional cache block sharing, and describe in detail the peculiarities that need to be considered due to the special stencil structure. Concurrency in updating the components of the electric and magnetic fields provides an additional level of parallelism. The dependence of the cache size requirement of the optimized code on the blocking parameters is modeled accurately, and an auto-tuner searches for optimal configurations in the remaining parameter space. We were able to completely decouple the execution from the memory bandwidth bottleneck, accelerating the implementation by a factor of three to four compared to an optimal implementation with pure spatial blocking on an 18-core Intel Haswell CPU.
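    The cache-blocking family of optimizations the abstract builds on can be illustrated with plain spatial blocking (the paper's wavefront diamond tiling is a more elaborate, time-blocked variant): a tiled 2-D Jacobi sweep reproduces the naive sweep exactly while touching memory tile by tile. Grid and tile sizes below are toy values, and the 5-point Jacobi stencil stands in for the paper's more complex THIIM kernel.

```python
# Spatial cache blocking for a stencil: blockwise traversal gives
# bit-identical results to the naive sweep, with better locality at scale.

N, B = 16, 4   # grid size and block (tile) size, toy values

def jacobi_naive(grid):
    out = [row[:] for row in grid]
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            out[i][j] = 0.25 * (grid[i-1][j] + grid[i+1][j]
                                + grid[i][j-1] + grid[i][j+1])
    return out

def jacobi_blocked(grid):
    out = [row[:] for row in grid]
    for bi in range(1, N - 1, B):          # iterate over cache-sized tiles
        for bj in range(1, N - 1, B):
            for i in range(bi, min(bi + B, N - 1)):
                for j in range(bj, min(bj + B, N - 1)):
                    out[i][j] = 0.25 * (grid[i-1][j] + grid[i+1][j]
                                        + grid[i][j-1] + grid[i][j+1])
    return out

grid = [[(i * N + j) % 7 for j in range(N)] for i in range(N)]
assert jacobi_blocked(grid) == jacobi_naive(grid)  # same arithmetic, reordered
```

    Diamond tiling additionally blocks across sweeps (the time dimension), which is what decouples the kernel from memory bandwidth in the paper.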

  3. User's manual for the BNW-II optimization code for dry/wet-cooled power plants

    Energy Technology Data Exchange (ETDEWEB)

    Braun, D.J.; Bamberger, J.A.; Braun, D.J.; Faletti, D.W.; Wiles, L.E.

    1978-05-01

    The User's Manual describes how to operate BNW-II, a computer code developed by the Pacific Northwest Laboratory (PNL) as a part of its activities under the Department of Energy (DOE) Dry Cooling Enhancement Program. The computer program offers a comprehensive method of evaluating the cost savings potential of dry/wet-cooled heat rejection systems. Going beyond simple "figure-of-merit" cooling tower optimization, this method includes such items as the cost of annual replacement capacity, and the optimum split between plant scale-up and replacement capacity, as well as the purchase and operating costs of all major heat rejection components. Hence the BNW-II code is a useful tool for determining potential cost savings of new dry/wet surfaces, new piping, or other components as part of an optimized system for a dry/wet-cooled plant.

  4. Embedding QR codes in tumor board presentations, enhancing educational content for oncology information management.

    Science.gov (United States)

    Siderits, Richard; Yates, Stacy; Rodriguez, Arelis; Lee, Tina; Rimmer, Cheryl; Roche, Mark

    2011-01-01

    Quick Response (QR) Codes are standard in supply management and seen with increasing frequency in advertisements. They are now present regularly in healthcare informatics and education. These 2-dimensional square bar codes, originally designed by the Toyota car company, are free of license and have a published international standard. The codes can be generated by free online software and the resulting images incorporated into presentations. The images can be scanned by "smart" phones and tablets using either the iOS or Android platforms, which link the device with the information represented by the QR code (uniform resource locator or URL, online video, text, v-calendar entries, short message service [SMS] and formatted text). Once linked to the device, the information can be viewed at any time after the original presentation, saved in the device or to a Web-based "cloud" repository, printed, or shared with others via email or Bluetooth file transfer. This paper describes how we use QR codes in our tumor board presentations, discusses the benefits, the different QR codes from Web links and how QR codes facilitate the distribution of educational content.

  5. Theory of the space-dependent fuel management computer code ''UAFCC''

    International Nuclear Information System (INIS)

    El-Meshad, Y.; Morsy, S.; El-Osery, I.A.

    1981-01-01

    This report presents the theory of the spatial burnup computer code UAFCC, which has been constructed as part of an integrated reactor calculation scheme proposed at the Reactors Department of the ARE Atomic Energy Authority. UAFCC is a single-energy, one-dimensional diffusion burnup FORTRAN computer code for well-moderated, multiregion, cylindrical thermal reactors. The effect of reactivity variation with burnup is introduced into the steady-state diffusion equation by a fictitious neutron source. The infinite multiplication factor, the total migration area, and the power density per unit thermal flux are calculated by the point-model burnup code UABUC, fitted to polynomials of suitable degree in the flux-time, and then used as input data to the UAFCC code. The proposed spatial burnup model has been used to study different strategies for in-core fuel management schemes. The conclusions of this study will be presented in a future publication. (author)
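    The kind of single-energy, one-dimensional diffusion solve that such a code performs can be sketched with finite differences and the Thomas algorithm. The slab geometry, cross sections and fixed source below are invented stand-ins (the actual code uses cylindrical geometry and a burnup-dependent fictitious source).

```python
# Toy single-energy 1-D diffusion solve: -D*phi'' + Sig_a*phi = S on a slab
# with zero-flux boundaries, discretised by finite differences and solved
# with the O(n) Thomas algorithm. All material data are hypothetical.

D, sig_a, S = 1.2, 0.05, 1.0      # diffusion coeff, absorption, source
L_slab, n = 100.0, 50             # slab width (cm) and interior points
h = L_slab / (n + 1)

# Tridiagonal system: (2D/h^2 + Sig_a) phi_i - (D/h^2)(phi_{i-1} + phi_{i+1}) = S
a = [-D / h**2] * n               # sub-diagonal
b = [2 * D / h**2 + sig_a] * n    # main diagonal
c = [-D / h**2] * n               # super-diagonal
d = [S] * n                       # right-hand side

def thomas(a, b, c, d):
    """Solve a tridiagonal linear system by forward elimination/back-substitution."""
    n = len(d)
    bp, cp, dp = b[:], c[:], d[:]
    for i in range(1, n):
        m = a[i] / bp[i - 1]
        bp[i] -= m * cp[i - 1]
        dp[i] -= m * dp[i - 1]
    x = [0.0] * n
    x[-1] = dp[-1] / bp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (dp[i] - cp[i] * x[i + 1]) / bp[i]
    return x

phi = thomas(a, b, c, d)
# Far from the boundaries the flux approaches the infinite-medium value S/Sig_a.
```

    A burnup code wraps such a solve in an outer loop that updates the multiplication factor and source terms region by region as the flux-time accumulates.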

  6. Optimal control theory applications to management science and economics

    CERN Document Server

    Sethi, Suresh P

    2006-01-01

    Optimal control methods are used to determine the best ways to control a dynamic system. This book applies theoretical work to business management problems developed from the authors' research and classroom instruction. The thoroughly revised new edition has been refined with careful attention to the text and graphic material presentation. Chapters cover a range of topics including finance, production and inventory problems, marketing problems, machine maintenance and replacement, problems of optimal consumption of natural resources, and applications of control theory to economics. The book in

  7. Driving external chemistry optimization via operations management principles.

    Science.gov (United States)

    Bi, F Christopher; Frost, Heather N; Ling, Xiaolan; Perry, David A; Sakata, Sylvie K; Bailey, Simon; Fobian, Yvette M; Sloan, Leslie; Wood, Anthony

    2014-03-01

    Confronted with the need to significantly raise the productivity of remotely located chemistry CROs, Pfizer embraced a commitment to continuous improvement that leveraged tools from both Lean Six Sigma and queue management theory to deliver positive, measurable outcomes. During 2012, cycle times were reduced by 48% by optimizing the work in progress and conducting a detailed workflow analysis to identify and address pinch points. Compound flow was increased by 29% by optimizing the request process and de-risking the chemistry. Underpinning both achievements was the development of close working relationships and productive communication between Pfizer and CRO chemists. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Electricity Portfolio Management: Optimal Peak / Off-Peak Allocations

    OpenAIRE

    Huisman, Ronald; Mahieu, Ronald; Schlichter, Felix

    2007-01-01

    textabstractElectricity purchasers manage a portfolio of contracts in order to purchase the expected future electricity consumption profile of a company or a pool of clients. This paper proposes a mean-variance framework to address the concept of structuring the portfolio and focuses on how to allocate optimal positions in peak and off-peak forward contracts. It is shown that the optimal allocations are based on the difference in risk premiums per unit of day-ahead risk as a measure of relati...

  9. Hydroeconomic optimization of reservoir management under downstream water quality constraints

    DEFF Research Database (Denmark)

    Davidsen, Claus; Liu, Suxia; Mo, Xingguo

    2015-01-01

    water quantity and water quality management and minimizes the total costs over a planning period assuming stochastic future runoff. The outcome includes cost-optimal reservoir releases, groundwater pumping, water allocation, wastewater treatments and water curtailments. The optimization model uses......), and the resulting minimum dissolved oxygen (DO) concentration is computed with the Streeter-Phelps equation and constrained to match Chinese water quality targets. The baseline water scarcity and operational costs are estimated to 15.6. billion. CNY/year. Compliance to water quality grade III causes a relatively...

  10. Adaptive surrogate model based multiobjective optimization for coastal aquifer management

    Science.gov (United States)

    Song, Jian; Yang, Yun; Wu, Jianfeng; Wu, Jichun; Sun, Xiaomin; Lin, Jin

    2018-06-01

    In this study, a novel surrogate model assisted multiobjective memetic algorithm (SMOMA) is developed for optimal pumping strategies of large-scale coastal groundwater problems. The proposed SMOMA integrates an efficient data-driven surrogate model with an improved non-dominated sorting genetic algorithm-II (NSGAII) that employs a local search operator to accelerate its convergence in optimization. The surrogate model, based on a Kernel Extreme Learning Machine (KELM), is developed and evaluated as an approximate simulator to generate the patterns of regional groundwater flow and salinity levels in coastal aquifers, reducing the huge computational burden. The KELM model is adaptively trained during the evolutionary search to satisfy the desired fidelity level of the surrogate, which inhibits error accumulation in forecasting and ensures correct convergence to the true Pareto-optimal front. The proposed methodology is then applied to large-scale coastal aquifer management in Baldwin County, Alabama. The objectives of minimizing the saltwater mass increase and maximizing the total pumping rate in the coastal aquifers are considered. The optimal solutions achieved by the proposed adaptive surrogate model are compared against those obtained from a one-shot surrogate model and from the original simulation model. The adaptive surrogate model not only improves the prediction accuracy of Pareto-optimal solutions compared with the one-shot surrogate model, but also maintains the quality of Pareto-optimal solutions obtained by NSGAII coupled with the original simulation model, while retaining the advantage of surrogate models in reducing the computational burden, with time savings of up to 94%. This study shows that the proposed methodology is a computationally efficient and promising tool for multiobjective optimization of coastal aquifer management.

  11. The Application of Social Characteristic and L1 Optimization in the Error Correction for Network Coding in Wireless Sensor Networks.

    Science.gov (United States)

    Zhang, Guangzhi; Cai, Shaobin; Xiong, Naixue

    2018-02-03

    One of the remarkable challenges in Wireless Sensor Networks (WSN) is how to transfer the collected data efficiently given the energy limitation of sensor nodes. Network coding can increase the network throughput of a WSN dramatically due to the broadcast nature of WSN. However, network coding usually propagates a single original error over the whole network. Due to this error-propagation property, most error correction methods cannot correct more than C/2 corrupted errors, where C is the max-flow min-cut of the network. To maximize the effectiveness of network coding applied in WSN, a new error-correcting mechanism to confront the propagated errors is urgently needed. Based on the social network characteristic inherent in WSN and L1 optimization, we propose a novel scheme which successfully corrects more than C/2 corrupted errors. Moreover, even if errors occur on all the links of the network, our scheme can still correct them successfully. By introducing a secret channel and a specially designed matrix which can trap some errors, we improve John and Yi's model so that it can correct the propagated errors in network coding, which usually pollute exactly 100% of the received messages. Taking advantage of the social characteristic inherent in WSN, we propose a new distributed approach that establishes reputation-based trust among sensor nodes in order to identify the informative upstream sensor nodes. Drawing on social network theory, the informative relay nodes are selected and marked with high trust values. The two methods, L1 optimization and the use of the social characteristic, coordinate with each other and can correct propagated errors whose fraction is exactly 100% in a WSN where network coding is performed. The effectiveness of the error correction scheme is validated through simulation experiments.
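    The paper's L1-optimization decoder is not reproduced here, but its core idea, picking the sparsest error pattern consistent with the observed syndrome, can be illustrated with a much-simplified combinatorial analogue over GF(2). The Hamming(7,4) parity-check matrix below is a stand-in, not the network-coding construction of the paper:

```python
from itertools import combinations

# Parity-check matrix of the Hamming(7,4) code (rows are parity constraints).
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(vec):
    return tuple(sum(h * v for h, v in zip(row, vec)) % 2 for row in H)

def min_weight_error(received):
    """Smallest-support error e with H*e == H*received (brute force).

    The exhaustive search plays the role that L1 minimization plays in
    the paper: among all explanations of the syndrome, pick the sparsest.
    """
    target = syndrome(received)
    n = len(received)
    for weight in range(n + 1):
        for support in combinations(range(n), weight):
            e = [1 if i in support else 0 for i in range(n)]
            if syndrome(e) == target:
                return e
    return None

# Corrupt one bit of the all-zeros codeword and recover the error.
received = [0, 0, 0, 0, 1, 0, 0]
error = min_weight_error(received)
corrected = [(r + e) % 2 for r, e in zip(received, error)]
```

    Real L1-minimization decoders replace this exponential search with a linear-program relaxation, which is what makes the approach tractable at network scale.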

  12. Generating optimized stochastic power management strategies for electric car components

    Energy Technology Data Exchange (ETDEWEB)

    Fruth, Matthias [TraceTronic GmbH, Dresden (Germany)]; Bastian, Steve [Technische Univ. Dresden (Germany)]

    2012-11-01

    With the increasing prevalence of electric vehicles, reducing the power consumption of car components becomes a necessity. For the example of a novel traffic-light assistance system, which makes speed recommendations based on the expected length of red-light phases, power-management strategies are used to control under which conditions radio communication, positioning systems and other components are switched to low-power (e.g. sleep) or high-power (e.g. idle/busy) states. We apply dynamic power management, an optimization technique well known from other domains, in order to compute energy-optimal power-management strategies, which sometimes turn out to be stochastic. For the example of the traffic-light assistant, we present a MATLAB/Simulink-implemented framework for the generation, simulation and formal analysis of optimized power-management strategies based on this technique. We study the capabilities and limitations of this approach and sketch further applications in the automotive domain. (orig.)

  13. Using the Electronic Industry Code of Conduct to Evaluate Green Supply Chain Management: An Empirical Study of Taiwan’s Computer Industry

    Directory of Open Access Journals (Sweden)

    Ching-Ching Liu

    2015-03-01

    Full Text Available Electronics companies throughout Asia recognize the benefits of Green Supply Chain Management (GSCM) for gaining competitive advantage. A large majority of electronics companies in Taiwan have recently adopted the Electronic Industry Citizenship Coalition (EICC) Code of Conduct for defining and managing their social and environmental responsibilities throughout their supply chains. We surveyed 106 Tier 1 suppliers to the Taiwanese computer industry to determine their environmental performance using the EICC Code of Conduct (EICC Code) and performed Analysis of Variance (ANOVA) on the 63 of 106 questionnaire responses collected. We test the results to determine whether differences in product type, geographic area, and supplier size correlate with different levels of environmental performance. To our knowledge, this is the first study to analyze questionnaire data on supplier adoption to optimize the implementation of GSCM. The results suggest that characteristic classification of suppliers could be employed to enhance the efficiency of GSCM.

  14. Monte Carlo simulation for treatment planning optimization of the COMS and USC eye plaques using the MCNP4C code

    International Nuclear Information System (INIS)

    Jannati Isfahani, A.; Shokrani, P.; Raisali, Gh.

    2010-01-01

    Ophthalmic plaque radiotherapy using I-125 radioactive seeds in removable episcleral plaques is often used in management of ophthalmic tumors. Radioactive seeds are fixed in a gold bowl-shaped plaque and the plaque is sutured to the scleral surface corresponding to the base of the intraocular tumor. This treatment allows for a localized radiation dose delivery to the tumor with a minimum target dose of 85 Gy. The goal of this study was to develop a Monte Carlo simulation method for treatment planning optimization of the COMS and USC eye plaques. Material and Methods: The MCNP4C code was used to simulate three plaques: COMS-12mm, COMS-20mm, and USC #9 with I-125 seeds. Calculation of dose was performed in a spherical water phantom (radius 12 mm) using a 3D matrix with a size of 12 voxels in each dimension. Each voxel contained a sphere of radius 1 mm. Results: Dose profiles were calculated for each plaque. Isodose lines were created in 2 planes normal to the axis of the plaque, at the base of the tumor and at the level of the 85 Gy isodose in a 7 day treatment. Discussion and Conclusion: This study shows that it is necessary to consider the following tumor properties in design or selection of an eye plaque: the diameter of the tumor base, its thickness and geometric shape, and the tumor location with respect to normal critical structures. The plaque diameter is selected by considering the tumor diameter. Tumor thickness is considered when selecting the seed parameters such as their number, activity and distribution. Finally, tumor shape and its location control the design of the following parameters: the shape and material of the plaque and the need for collimation.

  15. Analysis and optimization of hybrid electric vehicle thermal management systems

    Science.gov (United States)

    Hamut, H. S.; Dincer, I.; Naterer, G. F.

    2014-02-01

    In this study, the thermal management system of a hybrid electric vehicle is optimized using single and multi-objective evolutionary algorithms in order to maximize the exergy efficiency and minimize the cost and environmental impact of the system. The objective functions are defined and decision variables, along with their respective system constraints, are selected for the analysis. In the multi-objective optimization, a Pareto frontier is obtained and a single desirable optimal solution is selected based on the LINMAP decision-making process. The corresponding solutions are compared against the exergetic, exergoeconomic and exergoenvironmental single-objective optimization results. The results show that the exergy efficiency, total cost rate and environmental impact rate for the baseline system are 0.29, 28 ¢/h and 77.3 mPts/h, respectively. Moreover, based on the exergoeconomic optimization, 14% higher exergy efficiency and 5% lower cost can be achieved, compared to baseline parameters, at the expense of a 14% increase in the environmental impact. Based on the exergoenvironmental optimization, a 13% higher exergy efficiency and 5% lower environmental impact can be achieved at the expense of a 27% increase in the total cost.
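    LINMAP-style selection, as used above to pick a single solution from the Pareto frontier, reduces to choosing the candidate closest to the ideal point in min-max-normalized objective space. A minimal sketch with made-up candidate solutions, not the study's actual front:

```python
import math

def linmap_select(points, senses):
    """Pick the candidate closest to the ideal point after min-max scaling.

    points: list of objective tuples; senses: 'max' or 'min' per objective.
    Returns the index of the selected candidate.
    """
    n_obj = len(senses)
    cols = list(zip(*points))
    normed = []
    for pt in points:
        row = []
        for j in range(n_obj):
            lo, hi = min(cols[j]), max(cols[j])
            x = (pt[j] - lo) / (hi - lo) if hi > lo else 0.0
            row.append(x)
        normed.append(row)
    # After normalization the ideal value is 1 for 'max', 0 for 'min'.
    ideal = [1.0 if s == 'max' else 0.0 for s in senses]
    dists = [math.dist(row, ideal) for row in normed]
    return dists.index(min(dists))

# Toy front: (exergy efficiency to maximize, cost rate to minimize).
candidates = [(0.30, 30.0), (0.33, 27.0), (0.28, 40.0)]
best = linmap_select(candidates, senses=('max', 'min'))
```

    With more objectives (e.g. adding environmental impact) only the `senses` tuple grows; the distance-to-ideal rule is unchanged.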

  16. Accuracy improvement of SPACE code using the optimization for CHF subroutine

    International Nuclear Information System (INIS)

    Yang, Chang Keun; Kim, Yo Han; Park, Jong Eun; Ha, Sang Jun

    2010-01-01

    Typically, a subroutine to calculate the CHF (Critical Heat Flux) is included in codes for the safety analysis of nuclear power plants. The CHF subroutine calculates the CHF for arbitrary conditions (temperature, pressure, flow rate, power, etc.). The CHF is one of the most important parameters in nuclear power plant safety analysis. However, the subroutines used in most codes, such as the Biasi method, estimate values that differ from experimental data, and most CHF subroutines can predict reliably only within their range of applicability in pressure, mass flow, void fraction, etc. Even if the most accurate CHF subroutine is used in a high-quality nuclear safety analysis code, it is not assured that the values predicted by the subroutine are acceptable outside its application range. To overcome this limitation, various approaches to estimating the CHF were examined during the development of the SPACE code, and the Six Sigma technique was adopted for the examination, as described in this study. The objective of this study is to improve the CHF prediction accuracy of a nuclear power plant safety analysis code using a CHF database and the Six Sigma technique. Through the study, it was concluded that the Six Sigma technique is useful for quantifying the deviation of predicted values from experimental data, and that the CHF prediction method implemented in the SPACE code predicts well compared with other methods

  17. Multi-Objective Optimization of a Hybrid ESS Based on Optimal Energy Management Strategy for LHDs

    Directory of Open Access Journals (Sweden)

    Jiajun Liu

    2017-10-01

    Full Text Available Energy storage systems (ESS) play an important role in the performance of mining vehicles. A hybrid ESS combining both batteries (BTs) and supercapacitors (SCs) is one of the most promising solutions. As a case study, this paper discusses the optimal hybrid ESS sizing and energy management strategy (EMS) of 14-ton underground load-haul-dump vehicles (LHDs). Three novel contributions are added to the relevant literature. First, a multi-objective optimization is formulated regarding energy consumption and the total cost of a hybrid ESS, which are the key factors of LHDs, and a battery capacity degradation model is used. During the process, dynamic programming (DP)-based EMS is employed to obtain the optimal energy consumption and hybrid ESS power profiles. Second, a 10-year life cycle cost model of a hybrid ESS for LHDs is established to calculate the total cost, including capital cost, operating cost, and replacement cost. According to the optimization results, three solutions chosen from the Pareto front are compared comprehensively, and the optimal one is selected. Finally, the optimal and battery-only options are compared quantitatively using the same objectives, and the hybrid ESS is found to be a more economical and efficient option.
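    The DP-based EMS idea can be sketched as a toy dynamic program that splits a short power-demand profile between the battery and the supercapacitor (SC), using a quadratic battery-power penalty as a crude degradation proxy. All quantities are illustrative and the model is far simpler than the paper's:

```python
def dp_power_split(demand, sc_capacity=10.0, sc_init=5.0, levels=21, alpha=1.0):
    """Forward DP over a discretized supercapacitor energy state.

    At each step the SC moves between grid levels; the battery covers the
    remainder of the demand. Cost alpha*p_batt**2 is a simple degradation
    proxy. Returns the minimum total cost over the horizon.
    """
    grid = [sc_capacity * i / (levels - 1) for i in range(levels)]
    start = min(grid, key=lambda g: abs(g - sc_init))
    # value[s] = best cost to reach SC energy state s after processed steps
    value = {start: 0.0}
    for d in demand:
        nxt = {}
        for s, cost in value.items():
            for s2 in grid:
                p_sc = s - s2          # SC energy drop over one unit step
                p_batt = d - p_sc      # battery covers the rest
                c = cost + alpha * p_batt ** 2
                if c < nxt.get(s2, float('inf')):
                    nxt[s2] = c
        value = nxt
    return min(value.values())

demand = [5.0, -3.0, 6.0]
hybrid_cost = dp_power_split(demand)
battery_only = sum(d ** 2 for d in demand)   # SC held idle
```

    Because holding the SC idle is one feasible DP path, the hybrid cost can never exceed the battery-only cost; the gap is the (toy) benefit of hybridization.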

  18. The Impact of Diagnostic Code Misclassification on Optimizing the Experimental Design of Genetic Association Studies

    Directory of Open Access Journals (Sweden)

    Steven J. Schrodi

    2017-01-01

    Full Text Available Diagnostic codes within electronic health record systems can vary widely in accuracy. It has been noted that the number of instances of a particular diagnostic code monotonically increases with the accuracy of disease phenotype classification. As a growing number of health system databases become linked with genomic data, it is critically important to understand the effect of this misclassification on the power of genetic association studies. Here, I investigate the impact of this diagnostic code misclassification on the power of genetic association studies with the aim to better inform experimental designs using health informatics data. The trade-off between (i) reduced misclassification rates from utilizing additional instances of a diagnostic code per individual and (ii) the resulting smaller sample size is explored, and general rules are presented to improve experimental designs.
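    The trade-off described above can be sketched numerically: requiring more instances k of a code raises the positive predictive value (PPV) of the case definition but shrinks the available sample, so an intermediate k can maximize approximate power. All PPV, sample-size and allele-frequency figures below are invented for illustration:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def assoc_power(n_cases, ppv, p_case=0.55, p_ctrl=0.50, alpha_z=1.96):
    """Approximate power of a two-proportion z-test when only a fraction
    ppv of the labelled cases are true cases (the rest look like controls)."""
    p_eff = ppv * p_case + (1.0 - ppv) * p_ctrl   # attenuated case frequency
    se = math.sqrt(p_eff * (1 - p_eff) / n_cases
                   + p_ctrl * (1 - p_ctrl) / n_cases)
    return phi((p_eff - p_ctrl) / se - alpha_z)

# Requiring k code instances raises PPV but shrinks n (all values invented).
ppv_by_k = {1: 0.60, 2: 0.80, 3: 0.90, 4: 0.95, 5: 0.97}
n_by_k = {1: 10000, 2: 6000, 3: 4000, 4: 2500, 5: 1500}
power_by_k = {k: assoc_power(n_by_k[k], ppv_by_k[k]) for k in ppv_by_k}
best_k = max(power_by_k, key=power_by_k.get)
```

    Under these assumed curves the power peaks at an interior k, which is the qualitative behavior the abstract's "general rules" formalize.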

  19. Optimal Energy Management for Microgrid with Stationary and Mobile Storage

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yubo; Wang, Bin; Zhang, Tianyang; Nazaripouya, Hamidreza; Chu, Chi-Cheng; Gadh, Rajit

    2016-05-02

    This paper studies energy management in a Microgrid (MG) with solar generation, a Battery Energy Storage System (BESS) and gridable (V2G) EVs. A two-stage stochastic optimization method is proposed to capture the intermittent solar generation and random EV user behaviors. It is subsequently formulated as a Mixed Integer Linear Programming (MILP) problem. To evaluate the proposed method, real solar generation, load, BESS and EV data are used in Sample Average Approximation (SAA). Computational results show the correctness of the proposed method as well as a steady and tightly bounded optimality gap. Comparisons demonstrate that the proposed stochastic method outperforms its deterministic counterpart at the expense of higher computational cost. It is also observed that a moderate number of EVs helps to reduce the overall operational cost of the MG, which sheds light on future EV integration into the smart grid.
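    A toy version of the SAA comparison between a stochastic decision and its deterministic (mean-scenario) counterpart can be sketched with a single load, uniformly distributed solar scenarios, and invented prices; it is not the paper's MILP formulation:

```python
import random

random.seed(7)

LOAD, DAY_AHEAD, REALTIME, FEED_IN = 10.0, 1.0, 3.0, 0.2

def scenario_cost(x, solar):
    """Recourse cost for day-ahead purchase x under one realized solar value."""
    shortfall = max(0.0, LOAD - solar - x)   # bought at the real-time price
    surplus = max(0.0, x + solar - LOAD)     # sold back at the feed-in rate
    return DAY_AHEAD * x + REALTIME * shortfall - FEED_IN * surplus

# Sample Average Approximation over random solar scenarios.
scenarios = [random.uniform(0.0, 8.0) for _ in range(500)]
candidates = [i * 0.5 for i in range(25)]            # x in 0 .. 12 kWh

def saa_cost(x):
    return sum(scenario_cost(x, s) for s in scenarios) / len(scenarios)

x_stochastic = min(candidates, key=saa_cost)
# Deterministic counterpart: plan against the mean scenario only.
mean_solar = sum(scenarios) / len(scenarios)
x_deterministic = min(candidates, key=lambda x: scenario_cost(x, mean_solar))
```

    By construction the stochastic choice is at least as good as the deterministic one in SAA expected cost; the gap is a toy analogue of the value of the stochastic solution.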

  20. Iterative Phase Optimization of Elementary Quantum Error Correcting Codes (Open Access, Publisher’s Version)

    Science.gov (United States)

    2016-08-24

    …applies to the seven-qubit Steane code [29] and also represents the smallest instance of a 2D topological color code [30].

  1. Performance Evaluation of a Novel Optimization Sequential Algorithm (SeQ) Code for FTTH Network

    Directory of Open Access Journals (Sweden)

    Fazlina C.A.S.

    2017-01-01

    Full Text Available The SeQ code has advantages such as a variable cross-correlation property at any given number of users and weights, and it effectively suppresses the impact of phase induced intensity noise (PIIN) while providing a multiple access interference (MAI) cancellation property. The results revealed that, at a system performance of BER = 10⁻⁹, the SeQ code is capable of achieving 1 Gbps over distances up to 60 km.

  2. Optimal energy management strategy for battery powered electric vehicles

    International Nuclear Information System (INIS)

    Xi, Jiaqi; Li, Mian; Xu, Min

    2014-01-01

    Highlights: • The power usage of battery-powered electric vehicles with in-wheel motors is maximized. • The battery and motor dynamics are examined with emphasis on power conversion and utilization. • The optimal control strategy is derived and verified by simulations. • An analytic expression of the optimal operating point is obtained. - Abstract: Due to the limited energy density of batteries, energy management has always played a critical role in improving the overall energy efficiency of electric vehicles. In this paper, a key issue within the energy management problem will be carefully tackled, i.e., maximizing the power usage of batteries for battery-powered electric vehicles with in-wheel motors. To this end, the battery and motor dynamics will be thoroughly examined with particular emphasis on power conversion and power utilization. The optimal control strategy will then be derived based on the analysis. One significant contribution of this work is that an analytic expression for the optimal operating point in terms of the component and environment parameters can be obtained. Owing to this finding, the derived control strategy also has a simple structure suitable for real-time implementation. Simulation results demonstrate that the proposed strategy works both adaptively and robustly under different driving scenarios
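    The paper's analytic optimal operating point is specific to its battery and motor models; a much-simplified textbook analogue is the current that maximizes the power delivered past a battery's internal resistance, P(I) = V_oc·I − r·I², whose closed-form maximizer is I* = V_oc/(2r). The pack parameters below are illustrative:

```python
def delivered_power(current, v_oc, r_int):
    """Power delivered past the battery's internal resistance."""
    return v_oc * current - r_int * current ** 2

def optimal_current(v_oc, r_int):
    """Closed-form maximizer of delivered_power: dP/dI = V - 2rI = 0."""
    return v_oc / (2.0 * r_int)

V_OC, R_INT = 48.0, 0.1        # illustrative pack parameters
i_star = optimal_current(V_OC, R_INT)
p_max = delivered_power(i_star, V_OC, R_INT)

# Cross-check the closed form against a coarse grid search.
grid = [float(i) for i in range(0, 481)]
i_grid = max(grid, key=lambda i: delivered_power(i, V_OC, R_INT))
```

    Having the optimum in closed form, rather than from a numerical search, is what makes such a strategy cheap enough for real-time implementation.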

  3. The Importance of Supply Chain Management on Financial Optimization

    Directory of Open Access Journals (Sweden)

    Arawati Agus

    2013-01-01

    Full Text Available Many manufacturing companies are facing uncertainties and stiff competition both locally and globally, intensified by increasing needs for sophisticated and high-value products from demanding customers. These companies are forced to improve the quality of their supply chain management decisions and products and to reduce their manufacturing costs. With today’s volatile and very challenging global market, many manufacturing companies have started to realize the importance of properly managing their supply chains. Supply chain management (SCM) involves practices such as strategic supplier partnership, customer focus, lean production, the postponement concept and technology & innovation. This study investigates the importance of SCM for financial optimization. The study measures production or SCM managers’ perceptions regarding SCM and the level of performance in their companies. The paper also specifically investigates whether supply chain performance acts as a mediating variable in the relationship between SCM and financial optimization. These associations were analyzed through statistical methods such as Pearson’s correlation and a regression-based mediated analysis. The findings suggest that SCM has significant correlations with supply chain performance and financial optimization. In addition, the result of the regression-based mediated analysis demonstrates that supply chain performance mediates the linkage between SCM and financial optimization. The findings of the study provide a striking demonstration of the importance of SCM in enhancing the performance of Malaysian manufacturing companies. The result indicates that manufacturing companies should emphasize greater management support for SCM implementation and a greater degree of attention to production integration and information flow integration in the manufacturing system in order to maximize profit and minimize cost.

  4. Identifying factors affecting optimal management of agricultural water

    Directory of Open Access Journals (Sweden)

    Masoud Samian

    2015-01-01

    In addition to quantitative methodologies such as descriptive statistics and factor analysis, a qualitative methodology was employed for dynamic simulation among variables through Vensim software. In this study, the factor analysis technique was used with the Kaiser-Meyer-Olkin (KMO) and Bartlett tests. From the results, four key elements were identified as factors affecting the optimal management of agricultural water in the Hamedan area. These factors were institutional and legal factors, technical and knowledge factors, economic factors and social factors.

  5. Uncertainty in BMP evaluation and optimization for watershed management

    Science.gov (United States)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially in making watershed management decisions and developing strategies for water quality improvement. These models are often used to evaluate the potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing the selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One limitation of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used a watershed model (Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT

  6. Preliminary study for unified management of CANDU safety codes and construction of database system

    International Nuclear Information System (INIS)

    Min, Byung Joo; Kim, Hyoung Tae

    2003-03-01

    It is necessary to develop a Graphical User Interface (GUI) for the unified management of CANDU safety codes and to construct a database system for the validation of the safety codes, for which a preliminary study is done in the first stage of the present work. The input and output structures and data flow of CATHENA and PRESCON2 are investigated, and the interaction of the variables between CATHENA and PRESCON2 is identified. Furthermore, PC versions of the CATHENA and PRESCON2 codes are developed for the interaction of these codes with the GUI. The PC versions are assessed by comparing their calculation results with those from an HP workstation or from the FSAR (Final Safety Analysis Report). A preliminary study on the GUI for the safety codes in the unified management system is done, and a sample of GUI programming is demonstrated. Visual C++ is selected as the programming language for the development of the GUI system. Data for the Wolsong plants, the reactor core, and thermal-hydraulic experiments executed inside and outside the country are collected and classified following the structure of the database system, of which two types are considered for the final web-based database system. Preliminary GUI programming for the database system is demonstrated, which will be updated in future work

  7. TECHNIQUE OF OPTIMAL AUDIT PLANNING FOR INFORMATION SECURITY MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    F. N. Shago

    2014-03-01

    Full Text Available The growing complexity of information security management systems (ISMS) leads to the necessity of improving the scientific and methodological apparatus for auditing these systems. Planning is an important and determining part of ISMS auditing. Audit efficiency is defined by the ratio of the achieved quality indicators to the resources spent. Thus, there is an important and urgent task of developing methods and techniques for optimizing audit planning, making it possible to increase its effectiveness. The proposed technique makes it possible to distribute planning time and material resources optimally across audit stages on the basis of a dynamics model of ISMS quality. A special feature of the proposed approach is the usage of a priori as well as a posteriori data for the initial audit planning, and also the adjustment of the plan after each audit event. This makes it possible to optimize the usage of audit resources in accordance with the selected criteria. Application examples of the technique are given for planning the audit of an organization's information security management system. The results of a computational experiment based on the proposed technique showed that audit time (cost) can be reduced by 10-15% and, consequently, quality assessments obtained through audit resource allocation can be improved with respect to well-known methods of audit planning.

  8. Probabilistic framework for product design optimization and risk management

    Science.gov (United States)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is currently still the industry standard to use deterministic safety-margin approaches for dimensioning components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on optimizing reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
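    The recommended Monte Carlo load-resistance approach can be sketched for two independent normal random variables, where a closed-form failure probability is available as a cross-check. The distribution parameters are illustrative:

```python
import math
import random

random.seed(42)

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Illustrative normal resistance (R) and load (L) models.
MU_R, SD_R = 100.0, 10.0
MU_L, SD_L = 70.0, 15.0

# Monte Carlo estimate of the failure probability P(R < L).
N = 200_000
failures = sum(1 for _ in range(N)
               if random.gauss(MU_R, SD_R) < random.gauss(MU_L, SD_L))
pf_mc = failures / N

# Closed form for two independent normals: P(R - L < 0).
beta = (MU_R - MU_L) / math.sqrt(SD_R ** 2 + SD_L ** 2)   # reliability index
pf_exact = phi(-beta)
```

    For non-normal or correlated load and resistance models no closed form exists in general, which is exactly when the Monte Carlo estimate earns its keep.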

  9. Robust optimization-based DC optimal power flow for managing wind generation uncertainty

    Science.gov (United States)

    Boonchuay, Chanwit; Tomsovic, Kevin; Li, Fangxing; Ongsakul, Weerakorn

    2012-11-01

    Integrating wind generation into the wider grid poses a number of challenges to traditional power system operation. Given the relatively large wind forecast errors, congestion management tools based on optimal power flow (OPF) need to be improved. In this paper, a robust optimization (RO)-based DCOPF is proposed to determine the optimal generation dispatch and locational marginal prices (LMPs) for a day-ahead competitive electricity market considering the risk of dispatch cost variation. The basic concept is to use the dispatch to hedge against the possibility of reduced or increased wind generation. The proposed RO-based DCOPF is compared with a stochastic non-linear programming (SNP) approach on a modified PJM 5-bus system. Preliminary test results show that the proposed DCOPF model can provide a lower dispatch cost than the SNP approach.

  10. A binary mixed integer coded genetic algorithm for multi-objective optimization of nuclear research reactor fuel reloading

    International Nuclear Information System (INIS)

    Binh, Do Quang; Huy, Ngo Quang; Hai, Nguyen Hoang

    2014-01-01

    This paper presents a new approach based on a binary mixed integer coded genetic algorithm in conjunction with the weighted sum method for multi-objective optimization of fuel loading patterns for nuclear research reactors. The proposed genetic algorithm works with two types of chromosomes: binary and integer chromosomes, and consists of two types of genetic operators: one working on binary chromosomes and the other working on integer chromosomes. The algorithm automatically searches for the most suitable weighting factors of the weighting function and the optimal fuel loading patterns in the search process. Illustrative calculations are implemented for a research reactor type TRIGA MARK II loaded with the Russian VVR-M2 fuels. Results show that the proposed genetic algorithm can successfully search for both the best weighting factors and a set of approximate optimal loading patterns that maximize the effective multiplication factor and minimize the power peaking factor while satisfying operational and safety constraints for the research reactor.

  11. A binary mixed integer coded genetic algorithm for multi-objective optimization of nuclear research reactor fuel reloading

    Energy Technology Data Exchange (ETDEWEB)

    Binh, Do Quang [University of Technical Education Ho Chi Minh City (Viet Nam); Huy, Ngo Quang [University of Industry Ho Chi Minh City (Viet Nam); Hai, Nguyen Hoang [Centre for Research and Development of Radiation Technology, Ho Chi Minh City (Viet Nam)

    2014-12-15

    This paper presents a new approach based on a binary mixed integer coded genetic algorithm in conjunction with the weighted sum method for multi-objective optimization of fuel loading patterns for nuclear research reactors. The proposed genetic algorithm works with two types of chromosomes: binary and integer chromosomes, and consists of two types of genetic operators: one working on binary chromosomes and the other working on integer chromosomes. The algorithm automatically searches for the most suitable weighting factors of the weighting function and the optimal fuel loading patterns in the search process. Illustrative calculations are implemented for a research reactor type TRIGA MARK II loaded with the Russian VVR-M2 fuels. Results show that the proposed genetic algorithm can successfully search for both the best weighting factors and a set of approximate optimal loading patterns that maximize the effective multiplication factor and minimize the power peaking factor while satisfying operational and safety constraints for the research reactor.
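    A toy weighted-sum genetic algorithm in the spirit of the abstract above, with a permutation chromosome standing in for the paper's mixed binary/integer encoding, and invented proxy objectives in place of the neutronics calculations of k_eff and the power peaking factor:

```python
import random

random.seed(1)

# Toy assembly "reactivities"; position weights mimic core importance.
REACTIVITY = [1.0, 1.2, 0.8, 1.5, 0.9, 1.1, 0.7, 1.3]
POSITION_WEIGHT = [1.4, 1.2, 1.0, 0.8, 0.8, 1.0, 1.2, 1.4]

def k_eff_proxy(pattern):
    """Stand-in for the effective multiplication factor (to maximize)."""
    return sum(REACTIVITY[a] * POSITION_WEIGHT[i] for i, a in enumerate(pattern))

def peaking_proxy(pattern):
    """Stand-in for the power peaking factor (to minimize)."""
    local = [REACTIVITY[a] * POSITION_WEIGHT[i] for i, a in enumerate(pattern)]
    return max(local) / (sum(local) / len(local))

def fitness(pattern, w1=1.0, w2=5.0):
    """Weighted-sum scalarization of the two objectives."""
    return w1 * k_eff_proxy(pattern) - w2 * peaking_proxy(pattern)

def evolve(pop_size=30, generations=40):
    pop = [random.sample(range(8), 8) for _ in range(pop_size)]
    initial_best = max(fitness(p) for p in pop)
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = random.sample(range(8), 2)     # swap mutation keeps a permutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return initial_best, max(fitness(p) for p in pop)

initial_best, final_best = evolve()
```

    In the paper the weights w1, w2 are themselves searched over rather than fixed; here they are fixed purely to keep the sketch short.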

  12. Multi-Objective Optimization of Managed Aquifer Recharge.

    Science.gov (United States)

    Fatkhutdinov, Aybulat; Stefan, Catalin

    2018-04-27

    This study demonstrates the utilization of a multi-objective hybrid global/local optimization algorithm for solving managed aquifer recharge (MAR) design problems, in which the decision variables included spatial arrangement of water injection and abstraction wells and time-variant rates of pumping and injection. The objective of the optimization was to maximize the efficiency of the MAR scheme, which includes both quantitative and qualitative aspects. The case study used to demonstrate the capabilities of the proposed approach is based on a published report on designing a real MAR site with defined aquifer properties, chemical groundwater characteristics as well as quality and volumes of injected water. The demonstration problems include steady-state and transient scenarios. The steady-state scenario demonstrates optimization of spatial arrangement of multiple injection and recovery wells, whereas the transient scenario was developed with the purpose of finding optimal regimes of water injection and recovery at a single location. Both problems were defined as multi-objective problems. The scenarios were simulated by applying coupled numerical groundwater flow and solute transport models: MODFLOW-2005 and MT3D-USGS. The applied optimization method was a combination of global - the Non-Dominated Sorting Genetic Algorithm (NSGA-2), and local - the Nelder-Mead Downhill Simplex search algorithms. The analysis of the resulting Pareto optimal solutions led to the discovery of valuable patterns and dependencies between the decision variables, model properties and problem objectives. Additionally, the performance of the traditional global and the hybrid optimization schemes were compared. This article is protected by copyright. All rights reserved.
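The NSGA-II half of the hybrid scheme rests on Pareto dominance. As a minimal illustration (toy objective vectors, not the MAR model), extracting the first non-dominated front can be written as:

```python
def dominates(a, b):
    # Minimization in all objectives: a dominates b if it is no worse
    # everywhere and strictly better somewhere.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep every point not dominated by any other point.
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Invented objective pairs, e.g. (recovery deficit, solute breakthrough).
pts = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
front = pareto_front(pts)  # → [(1, 5), (2, 3), (4, 1)]
```

NSGA-II applies this sorting repeatedly (peeling off successive fronts) and adds a crowding-distance tie-break; the local Nelder-Mead step then polishes individual solutions.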

  13. Management of nuclear PRs activity with optimal conditions

    International Nuclear Information System (INIS)

    Ohnishi, Teruaki

    1997-01-01

    A methodology is proposed to derive optimal conditions for nuclear public relations (PRs) activity. Using currently available databases, expressions were derived that connect the budget allocated to the PRs activity with the intensity of stimulus for four types of activity: advertisements in the press, exclusive publicity, pamphlets, and television advertisements. Optimal conditions for the activity were determined by introducing a model describing the relation between the intensity of stimulus and the extent of change in the public's attitude toward nuclear energy, i.e. the effect of the PRs activity, and by giving the optimal ratio of budget allocation among the four types of activity as a function of the cost-effectiveness of each type. These optimal conditions, covering the budget allocation ratio and the execution time and intensity of each type of activity, vary depending on the number of households in the target region, the target demographic class, the duration of the activity, and the budget available for it. Numerical calculation makes clear that the optimal conditions and the effect of the activity show quite strong non-linearity with respect to these variables, and that the effect of the PRs activity, averaged over the whole public in the target region, reaches its maximum in Japan when the activity is executed under the optimal conditions determined for the target class of middle-aged and older women. Management of nuclear PRs activity becomes possible by introducing such a method of fixing optimal conditions for the activity as described here. (author)

  14. Optimization of control poison management by dynamic programming

    International Nuclear Information System (INIS)

    Ponzoni Filho, P.

    1974-01-01

    A dynamic programming approach was used to optimize the poison distribution in the core of a nuclear power plant between reloadings. The method was applied to a 500 MWe PWR subject to two different fuel management policies. The beginning of a stage is marked by a fuel management decision. The state vector of the system is defined by the burnups in the three fuel zones of the core. The change of the state vector is computed in several time steps. A criticality-conserving poison management pattern is chosen at the beginning of each step. The burnups at the end of a step are obtained by means of depletion calculations, assuming a constant neutron distribution during the step. Violation of burnup or power peaking constraints during a step eliminates the corresponding end states. Among identical end states, all except the one that produced the largest amount of energy are eliminated. From the several end states, one is selected for the subsequent stage, where it is subjected to a fuel management decision. This selection is based on a previously chosen optimality criterion, such as maximization of discharged fuel burnup or minimization of energy generation cost. (author)
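The stage-wise procedure described above (prune infeasible end states, keep only the best path into each identical end state, then choose by the optimality criterion) is a classic DP recursion. A hedged Python toy with invented burnup states and energies, standing in for the depletion calculations:

```python
def dp(stages, start, transitions):
    # transitions(state) -> list of (next_state, energy) pairs,
    # assumed already filtered for burnup/peaking feasibility.
    best = {start: 0.0}
    for _ in range(stages):
        nxt = {}
        for state, energy in best.items():
            for new_state, gained in transitions(state):
                # Among identical end states keep only the max-energy path.
                if nxt.get(new_state, -1.0) < energy + gained:
                    nxt[new_state] = energy + gained
        best = nxt
    # Optimality criterion here: maximize generated energy.
    return max(best.items(), key=lambda kv: kv[1])

def toy_transitions(state):
    # Two invented poison patterns; burnup capped at 3 units per zone.
    return [(tuple(min(s + d, 3) for s in state), 1.0 + 0.1 * d)
            for d in (1, 2)]

state, energy = dp(3, (0, 0, 0), toy_transitions)
```

A cost-minimization criterion would simply replace `max` with `min` over a cost accumulator.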

  15. Development and application of methods and computer codes of fuel management and nuclear design of reload cycles in PWR

    International Nuclear Information System (INIS)

    Ahnert, C.; Aragones, J.M.; Corella, M.R.; Esteban, A.; Martinez-Val, J.M.; Minguez, E.; Perlado, J.M.; Pena, J.; Matias, E. de; Llorente, A.; Navascues, J.; Serrano, J.

    1976-01-01

    Description of methods and computer codes for fuel management and nuclear design of reload cycles in PWRs, developed at JEN by adapting previous codes (LEOPARD, NUTRIX, CITATION, FUELCOST) and implementing original codes (TEMP, SOTHIS, CICLON, NUDO, MELON, ROLLO, LIBRA, PENELOPE), and their application to the project of management and design of reload cycles of a 510 MWt PWR, including comparison with results of experimental operation and other calculations for validation of the methods. (author)

  16. General productivity code: productivity optimization of gaseous diffusion cascades. The programmer's guide

    International Nuclear Information System (INIS)

    Tunstall, J.N.

    1975-05-01

    The General Productivity Code is a FORTRAN IV computer program for the IBM System 360. With its model of the productivity of gaseous diffusion cascades, the program is used to determine optimum cascade performance based on specified operating conditions and to aid in the calculation of optimum operating conditions for a complex of diffusion cascades. This documentation of the program is directed primarily to programmers who will be responsible for updating the code as requested by the users. It is also intended to be an aid in training new Productivity Code users and to serve as a general reference manual. Elements of the mathematical model, the input data requirements, the definitions of the various tasks (Instructions) that can be performed, and a detailed description of most FORTRAN variables and program subroutines are presented. A sample problem is also included. (auth)

  17. Lifting scheme-based method for joint coding 3D stereo digital cinema with luminance correction and optimized prediction

    Science.gov (United States)

    Darazi, R.; Gouze, A.; Macq, B.

    2009-01-01

    Reproducing a natural, real scene as we see it in the everyday world is becoming more and more popular, and stereoscopic and multi-view techniques are used to this end. However, because more information is displayed, supporting technologies such as digital compression are required to ensure the storage and transmission of the sequences. In this paper, a new scheme for stereo image coding is proposed in which the original left and right images are jointly coded. The main idea is to optimally exploit the correlation between the two images. This is done by designing an efficient transform that reduces the redundancy in the stereo image pair, an approach inspired by the lifting scheme (LS). The novelty of our work is that the prediction step has been replaced by a hybrid step consisting of disparity compensation followed by luminance correction and an optimized prediction step. The proposed scheme can be used for both lossless and lossy coding. Experimental results show improvement in terms of performance and complexity compared to recently proposed methods.

  18. Recommendations for Optimizing Internal Management Mechanism of Farmers’ Specialized Cooperatives

    Institute of Scientific and Technical Information of China (English)

    Jingxiao; CHEN

    2016-01-01

    Based on a survey of 38 farmers' specialized cooperatives in Hubei Province, this paper analyzed existing problems in the internal management mechanism of cooperatives, including the widespread problem of centralized control, imperfect supervision mechanisms, lack of an effective incentive mechanism, insufficient specialized personnel, and limited participation of cooperative members in management. It elaborated the causes of these problems from the perspective of practice. Finally, it came up with recommendations for optimizing farmers' specialized cooperatives: building a democratic decision-making mechanism with coordination between cooperative members and capable personnel, establishing a supervision mechanism suited to the cooperative's own needs, improving the internal incentive mechanism, establishing a talent introduction and cultivation mechanism in due time, and strengthening internal member management of cooperatives.

  19. Knowledge Management Society to Optimize Teaching Academic Performance

    Directory of Open Access Journals (Sweden)

    María Mercedes Carrillo

    2016-08-01

    Full Text Available Society is undergoing rapid change as a result of successive waves of change over the years, and each day brings new challenges to the managers of organizations. Hence, this paper aims to identify the importance of management and the knowledge society in optimizing teaching academic performance. Methodologically, the article is based on a documentary-descriptive investigation grounded in the reading of recognized authors and in bibliographic texts that support the theoretical literature review. The conclusions are that the new management faces a change in learning; that it reflects the information and knowledge society; that we face a landscape of challenges, such as the creation of knowledge; and that education is a crucial factor in this social transformation. Finally, analysis of the results showed that, for the subject treated, the texts consulted and the contributions of the theorists investigated gave support and scientific relevance to the article presented.

  20. Present status of reactor physics in the United States and Japan-III. 2. Nuclear Fuel Management Optimization Capabilities

    International Nuclear Information System (INIS)

    Karve, Atul A.; Keller, Paul M.; Turinsky, Paul J.; Maldonado, G. Ivan

    2001-01-01

    Nuclear fuel management is a very difficult design optimization problem in that decisions ranging from the microscopic level, e.g., pin enrichment, to the macroscopic level, e.g., core flow rate, and spanning time horizons of several reload cycles are strongly coupled. Added to these attributes are the highly constrained design, disjointed decision space, multimodal objective function, mixed integer type decision variables, highly nonlinear objective and constraint functions, and computationally demanding evaluation of the objective and constraint functions. Not surprisingly, after years of research on nuclear fuel management optimization, only limited progress has been made. The traditional approach to partially overcome these difficulties involves constraining the search space via heuristic rules, decomposing the problem into sub-optimization problems, and utilizing simplified core physics models. These approaches have sometimes proven effective, but to claim that the design decisions are global optimum decisions would not be appropriate. Given the increasingly tight constraints and design complexities of nuclear cores, and stronger desire to reduce generating costs, the nuclear fuel management design optimization problem has grown more challenging and important with the passage of time. In this paper, we summarize our research on this design optimization problem. A suite of computer codes that aid in making nuclear fuel management decisions has been developed. From Table I, it is obvious that decomposition of the global optimization problem into suboptimum problems has been employed. All of these computer codes utilize stochastic optimization techniques to search the decision space for determining the family of near-optimum decisions in the sub-optimization problem being solved. A stochastic optimization approach has been selected since it is well suited to address the problems' attributes noted earlier. The drawback of employing a stochastic optimization

  1. Decision Support Model for Optimal Management of Coastal Gate

    Science.gov (United States)

    Ditthakit, Pakorn; Chittaladakorn, Suwatana

    2010-05-01

    The coastal areas are densely settled by human beings owing to the fertility of their natural resources. However, at present those areas face water scarcity problems: inadequate water and poor water quality as a result of saltwater intrusion and inappropriate land-use management. To solve these problems, several measures have been exploited. Coastal gate construction is a structural measure widely applied in several countries, and it requires a plan for suitably operating the gates. Coastal gate operation is a complicated task that usually involves managing multiple purposes, which generally conflict with one another. This paper delineates the methodology and underlying theories for developing a decision support model for coastal gate operation scheduling. The developed model couples a simulation model with an optimization model. A weighting optimization technique based on Differential Evolution (DE) was selected for solving the multiple-objective problem. The hydrodynamic and water quality models were repeatedly invoked while searching for the optimal gate operations. In addition, two forecasting models, an autoregressive (AR) model and a harmonic analysis (HA) model, were applied to forecast water levels and tide levels, respectively. To demonstrate the applicability of the developed model, it was applied to plan the operations of a hypothetical system based on the Pak Phanang coastal gate system, located in Nakhon Si Thammarat province in southern Thailand. It was found that the proposed model could satisfactorily assist decision-makers in operating coastal gates under various environmental, ecological and hydraulic conditions.
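The DE-based weighted optimization mentioned above can be sketched as a generic DE/rand/1/bin loop. In the real model the objective would invoke the hydrodynamic and water quality simulators; here the weighted objective, its two surrogate terms, and the variable bounds are all invented for illustration.

```python
import random

def objective(x, w=(0.6, 0.4)):
    # Invented weighted sum of two surrogate terms standing in for
    # water-level deviation and salinity; NOT the coupled models.
    level_dev = sum((xi - 0.5) ** 2 for xi in x)
    salinity = sum(abs(xi - 0.2) for xi in x)
    return w[0] * level_dev + w[1] * salinity

def differential_evolution(dim=4, pop=20, gens=60, f=0.8, cr=0.9, seed=3):
    rng = random.Random(seed)
    # Decision variables (e.g. normalized gate openings) in [0, 1].
    xs = [[rng.random() for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        for i in range(pop):
            a, b, c = rng.sample([x for j, x in enumerate(xs) if j != i], 3)
            # Mutation a + F*(b - c) with binomial crossover, clamped to bounds.
            trial = [min(1.0, max(0.0, a[k] + f * (b[k] - c[k])))
                     if rng.random() < cr else xs[i][k] for k in range(dim)]
            if objective(trial) < objective(xs[i]):
                xs[i] = trial  # greedy one-to-one selection
    return min(xs, key=objective)

best = differential_evolution()
```

Changing the weights `w` traces out different compromises between the conflicting purposes, which is the essence of the weighting technique the paper applies.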

  2. Smart Microgrid Energy Management Using a Novel Artificial Shark Optimization

    Directory of Open Access Journals (Sweden)

    Pawan Singh

    2017-01-01

    Full Text Available At present, the integration of renewable energy sources (RESs) using microgrid (MG) technology is of great importance for demand side management. Optimization of an MG provides enhanced generation from RESs at minimum operating cost. The microgrid optimization problem involves a large number of variables and constraints; therefore, it is complex in nature, and various existing algorithms are unable to handle it efficiently. This paper proposes an artificial shark optimization (ASO) method to overcome the limitations of existing algorithms in solving the economical operation problem of an MG. The ASO algorithm is motivated by the sound-sensing capability that sharks use for hunting. Further, the intermittent nature of renewable energy sources is managed by utilizing battery energy storage (BES). BES has several benefits; however, all these benefits are confined to a certain fixed area due to the stationary nature of the BES system. The latest technologies, such as electric vehicle technologies (EVTs), provide all the benefits of BES along with the mobility to support variable system demands. Therefore, in this work, an EVT-incorporated, grid-connected smart microgrid (SMG) system is introduced. Additionally, a comparative study is provided, which shows that the ASO performs relatively better than existing techniques.

  3. Optimized Reactive Power Flow of DFIG Power Converters for Better Reliability Performance Considering Grid Codes

    DEFF Research Database (Denmark)

    Zhou, Dao; Blaabjerg, Frede; Lau, Mogens

    2015-01-01

    . In order to fulfill the modern grid codes, over-excited reactive power injection will further reduce the lifetime of the rotor-side converter. In this paper, the additional stress of the power semiconductor due to the reactive power injection is firstly evaluated in terms of modulation index...

  4. Power Allocation Optimization: Linear Precoding Adapted to NB-LDPC Coded MIMO Transmission

    Directory of Open Access Journals (Sweden)

    Tarek Chehade

    2015-01-01

    Full Text Available In multiple-input multiple-output (MIMO) transmission systems, the channel state information (CSI) at the transmitter can be used to add linear precoding to the transmitted signals in order to improve the performance and reliability of the transmission system. This paper investigates how to properly combine precoded closed-loop MIMO systems with nonbinary low-density parity-check (NB-LDPC) codes. The q elements of the Galois field GF(q) are directly mapped to q transmit symbol vectors, which allows NB-LDPC codes to fit perfectly with a MIMO precoding scheme, unlike binary LDPC codes. The new transmission model is detailed and studied for several linear precoders and various designed LDPC codes. We show that NB-LDPC codes are particularly well suited to joint use with precoding schemes based on maximization of the minimum Euclidean distance (max-dmin) criterion. These results are theoretically supported by extrinsic information transfer (EXIT) analysis and are confirmed by numerical simulations.

  5. Optimal power flow management for distributed energy resources with batteries

    International Nuclear Information System (INIS)

    Tazvinga, Henerica; Zhu, Bing; Xia, Xiaohua

    2015-01-01

    Highlights: • A PV-diesel-battery hybrid system is proposed. • Model minimizes fuel and battery wear costs. • Power flows are analysed in a 24-h period. • Results provide a practical platform for decision making. - Abstract: This paper presents an optimal energy management model of a solar photovoltaic-diesel-battery hybrid power supply system for off-grid applications. The aim is to meet the load demand completely while satisfying the system constraints. The proposed model minimizes fuel and battery wear costs and finds the optimal power flow, taking into account photovoltaic power availability, battery bank state of charge and load power demand. The optimal solutions are compared for cases when the objectives are weighted equally and when a larger weight is assigned to battery wear. A considerable increase in system operational cost is observed in the latter case owing to the increased usage of the diesel generator. The results are important for decision makers, as they depict the optimal decisions considered in the presence of trade-offs between conflicting objectives

  6. Application of Flow and Transport Optimization Codes to Groundwater Pump and Treat Systems- VOLUME 2

    National Research Council Canada - National Science Library

    Minsker, Barbara

    2004-01-01

    .... Recent studies completed by the EPA and the Navy indicate that the majority of pump and treat systems are not operating as designed, have unachievable or undefined goals, and have not been optimized since installation...

  7. Fluid management in the optimization of space construction

    Science.gov (United States)

    Snyder, Howard

    1990-01-01

    Fluid management impacts strongly on the optimization of space construction. Large quantities of liquids are needed for propellants and life support; the mass of propellant liquids is comparable to that required for the structures. There may be a strong dynamic interaction between the stored liquids and the space structure unless the design minimizes it. The constraints of cost and time require optimization of the supply/resupply strategy. Proper selection and design of the fluid management methods for slosh control, stratification control, acquisition, transfer, gauging, venting, dumping, contamination control, selection of tank configuration and size, the storage state, and the control system can improve the performance of the entire system substantially. Our effort consists of building mathematical/computer models of the various fluid management methods and testing them against the available experimental data. The results of the models are used as inputs to the system operations studies. During the past year, the emphasis has been on modeling the transfer of cryogens, sloshing, and the storage configuration. The work has been intermeshed with ongoing NASA design and development studies to leverage the funds provided by the Center.

  8. MACRO1: a code to test a methodology for analyzing nuclear-waste management systems

    International Nuclear Information System (INIS)

    Edwards, L.L.

    1979-01-01

    The code is primarily a manager of probabilistic data and deterministic mathematical models. The user determines the desired aggregation of the available models into a composite model of a physical system. MACRO1 then propagates the finite probability distributions of the inputs to the model to finite probability distributions over the outputs. MACRO1 has been applied to a sample analysis of a nuclear-waste repository, and its results compared satisfactorily with previously obtained Monte Carlo statistics

  9. Ambulatory anesthesia: optimal perioperative management of the diabetic patient

    Directory of Open Access Journals (Sweden)

    Polderman JAW

    2016-05-01

    Full Text Available Jorinde AW Polderman, Robert van Wilpe, Jan H Eshuis, Benedikt Preckel, Jeroen Hermanides Department of Anaesthesiology, Academic Medical Centre, University of Amsterdam, Amsterdam, the Netherlands Abstract: Given the growing number of patients with diabetes mellitus (DM) and the growing number of surgical procedures performed in an ambulatory setting, DM is one of the most frequently encountered comorbidities in patients undergoing ambulatory surgery. Perioperative management of ambulatory patients with DM requires a different approach than for patients undergoing major surgery, as procedures are shorter and the stress response caused by surgery is minimal. However, DM is a risk factor for postoperative complications in ambulatory surgery, so it should be managed carefully. Given the limited time ambulatory patients spend in the hospital, improvement in management has to be gained from the preanesthetic assessment. The purpose of this review is to summarize the current literature regarding the anesthesiologic management of patients with DM in the ambulatory setting. We discuss the risks of perioperative hyperglycemia together with the pre-, intra-, and postoperative considerations for these patients when encountered in an ambulatory setting. Furthermore, we provide recommendations for the optimal perioperative management of the diabetic patient undergoing ambulatory surgery. Keywords: diabetes mellitus, perioperative period, ambulatory surgery, insulin, complications, GLP-1 agonist, DPP-4 inhibitor

  10. In-Core Fuel Management with Biased Multiobjective Function Optimization

    International Nuclear Information System (INIS)

    Shatilla, Youssef A.; Little, David C.; Penkrot, Jack A.; Holland, Richard Andrew

    2000-01-01

    The capability of biased multiobjective function optimization has been added to the Westinghouse Electric Company's (Westinghouse's) Advanced Loading Pattern Search code (ALPS). The search process, given a user-defined set of design constraints, proceeds to minimize a global parameter called the total value associated with constraints compliance (VACC), an importance-weighted measure of the deviation from limit and/or margin target. The search process takes into consideration two equally important user-defined factors while minimizing the VACC, namely, the relative importance of each constraint with respect to the others and the optimization of each constraint according to its own objective function. Hence, trading off margin-to-design limits from where it is abundantly available to where it is badly needed can now be accomplished. Two practical methods are provided to the user for input of constraints and associated objective functions. One consists of establishing design limits based on traditional core design parameters such as assembly/pin burnup, power, or reactivity. The second method allows the user to write a program, or script, to define a logic not possible through ordinary means. This method of script writing was made possible through the application resident compiler feature of the technical user language integration processor (tulip), developed at Westinghouse. For the optimization problems studied, ALPS not only produced candidate loading patterns (LPs) that met all of the conflicting design constraints, but in cases where the design appeared to be over constrained gave a wide range of LPs that came very close to meeting all the constraints based on the associated objective functions
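The VACC idea, an importance-weighted sum of deviations beyond limit or margin targets, can be illustrated with a small sketch. The constraint names, limits and weights below are invented for illustration, not Westinghouse's:

```python
def vacc(values, constraints):
    # constraints: name -> (limit, importance weight); only violations
    # (values beyond the limit) contribute to the total.
    total = 0.0
    for name, (limit, weight) in constraints.items():
        deviation = values[name] - limit
        if deviation > 0.0:
            total += weight * deviation
    return total

# Hypothetical limits: pin burnup (GWd/tU) and power peaking factor.
limits = {"pin_burnup": (62.0, 3.0), "peaking": (1.55, 10.0)}
score = vacc({"pin_burnup": 60.0, "peaking": 1.60}, limits)  # only peaking violates
```

Minimizing such a total lets the search trade margin between constraints: a pattern with abundant burnup margin can accept slightly worse burnup in exchange for reduced peaking, exactly the trade-off the abstract describes.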

  11. Analysis and Optimization of Sparse Random Linear Network Coding for Reliable Multicast Services

    DEFF Research Database (Denmark)

    Tassi, Andrea; Chatzigeorgiou, Ioannis; Roetter, Daniel Enrique Lucani

    2016-01-01

    Point-to-multipoint communications are expected to play a pivotal role in next-generation networks. This paper refers to a cellular system transmitting layered multicast services to a multicast group of users. Reliability of communications is ensured via different random linear network coding (RLNC) techniques. We deal with a fundamental problem: the computational complexity of the RLNC decoder. The higher the number of decoding operations is, the more the user's computational overhead grows and, consequently, the faster the battery of mobile devices drains. By referring to several sparse RLNC techniques, and without any assumption on the implementation of the RLNC decoder in use, we provide an efficient way to characterize the performance of users targeted by ultra-reliable layered multicast services. The proposed modeling allows to efficiently derive the average number of coded packet...

  12. Using combinatorial problem decomposition for optimizing plutonium inventory management

    International Nuclear Information System (INIS)

    Niquil, Y.; Gondran, M.; Voskanian, A.; Paris-11 Univ., 91 - Orsay

    1997-03-01

    Plutonium inventory management optimization can be modeled as a very large 0-1 linear program. To solve it, problem decomposition is necessary, since other classic techniques are not efficient at such a size. The first decomposition consists in favoring the constraints that are the most difficult to satisfy and the variables that have the highest influence on the cost: fortunately, both correspond to stock output decisions. The second decomposition consists in mixing continuous linear program solving with integer linear program solving. Besides, the first decisions to be taken are systematically favored, for they are based on data considered to be sure, whereas the data supporting later decisions is known with less accuracy and confidence. (author)

  13. A discrete optimization method for nuclear fuel management

    International Nuclear Information System (INIS)

    Argaud, J.P.

    1993-04-01

    Nuclear loading pattern elaboration can be seen as a combinatorial optimization problem of tremendous size with non-linear cost functions, for which searches are always numerically expensive. After a brief introduction to the main aspects of nuclear fuel management, this paper presents a new idea for treating the combinatorial problem by using the information contained in the gradient of a cost function. The method is to choose, by direct observation of the gradient, the most interesting changes in fuel loading patterns. An example is then developed to illustrate an operating mode of the method, and finally, connections with simulated annealing and genetic algorithms are described as an attempt to improve the search process.

  14. Optimization of Concrete Composition in Radioactive Waste Management

    International Nuclear Information System (INIS)

    IIija, P.

    1999-01-01

    Low- and intermediate-level radioactive waste represents 95% of the total waste that is conditioned into special concrete containers. Since these containers have to protect radioactive waste safely for about 300 years, the selection and precise control of the physical and mechanical characteristics of the materials is very important. After volume reduction and recovery of valuable components, waste materials have to be conditioned for transport, storage and disposal. Conditioning is the waste management step in which radioactive wastes are immobilized and packed. In this paper, methods for and optimization of the composition of the concrete containers used for storing radioactive waste are presented.

  15. An Optimal Method for Developing Global Supply Chain Management System

    Directory of Open Access Journals (Sweden)

    Hao-Chun Lu

    2013-01-01

    Full Text Available Owing to the transparency of supply chains, enhancing the competitiveness of industries has become a vital factor, and many developing countries are looking for possible methods to save costs. From this point of view, this study deals with the complicated liberalization policies in the global supply chain management system and proposes a mathematical model, via flow-control constraints, that exploits bonded warehouses to obtain maximal profits. Numerical experiments illustrate that the proposed model can be solved effectively to obtain the optimal profits in the global supply chain environment.

  16. Optimal energy management in pulp and paper mills

    International Nuclear Information System (INIS)

    Sarimveis, H.K.; Angelou, A.S.; Retsina, T.R.; Rutherford, S.R.; Bafas, G.V.

    2003-01-01

    In this paper, we examine the utilization of mathematical programming tools for optimum energy management of the power plant in pulp and paper mills. The objective is the fulfillment of the total plant requirements in energy and steam with the minimum possible cost. The proposed methodology is based on the development of a detailed model of the power plant using mass and energy balances and a mathematical formulation of the electrical purchase contract, which can be translated into a rigorous mixed integer linear programming optimization problem. The results show that the method can be a very useful tool for the reduction of production cost due to minimization of the fuel and electricity costs

  17. Stochastic optimization of GeantV code by use of genetic algorithms

    Science.gov (United States)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Behera, S. P.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Hariri, F.; Jun, S. Y.; Konstantinov, D.; Kumawat, H.; Ivantchenko, V.; Lima, G.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2017-10-01

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. The goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.
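The evolution-strategy tuning loop described above can be sketched as a (1+λ) search over a black-box objective. Here the "simulation throughput" is a toy quadratic surrogate rather than a GeantV run, and all parameter names and values are illustrative assumptions:

```python
import random

def throughput(params):
    # Toy black-box surrogate with a peak at (4, 8, 2); higher is better.
    # A real tuner would launch a simulation and measure its throughput.
    target = (4.0, 8.0, 2.0)
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def one_plus_lambda(generations=200, lam=8, sigma=0.5, seed=7):
    rng = random.Random(seed)
    parent = [rng.uniform(0, 10) for _ in range(3)]  # random initial parameters
    for _ in range(generations):
        # Generate lam mutated children by gaussian perturbation.
        children = [[p + rng.gauss(0, sigma) for p in parent]
                    for _ in range(lam)]
        best_child = max(children, key=throughput)
        if throughput(best_child) > throughput(parent):
            parent = best_child  # elitist replacement: keep the better point
    return parent

tuned = one_plus_lambda()
```

Each fitness evaluation here is cheap; when evaluations are expensive, as in massive parallel simulations, a surrogate built by multivariate analysis can pre-screen candidates, which is the speed-up operator the abstract alludes to.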

  18. Cancer related fatigue: implementing guidelines for optimal management.

    Science.gov (United States)

    Pearson, Elizabeth J M; Morris, Meg E; McKinstry, Carol E

    2017-07-18

    Cancer-related fatigue (CRF) is a key concern for people living with cancer and can impair physical functioning and activities of daily living. Evidence-based guidelines for CRF are available, yet inconsistently implemented globally. This study aimed to identify barriers and enablers to applying a cancer fatigue guideline and to derive implementation strategies. A mixed-method study explored the feasibility of implementing the CRF guideline developed by the Canadian Association for Psychosocial Oncology (CAPO). Health professionals, managers and consumers from different practice settings participated in a modified Delphi study with two survey rounds. A reference group informed the design of the study including the surveys. The first round focused on guideline characteristics, compatibility with current practice and experience, and behaviour change. The second survey built upon and triangulated the first round. Forty-five health practitioners and managers, and 68 cancer survivors completed the surveys. More than 75% of participants endorsed the CAPO cancer related fatigue guidelines. Some respondents perceived a lack of resources for accessible and expert fatigue management services. Further barriers to guideline implementation included complexity, limited practical details for some elements, and lack of clinical tools such as assessment tools or patient education materials. Recommendations to enhance guideline applicability centred around four main themes: (1) balancing the level of detail in the CAPO guideline with ease of use, (2) defining roles of different professional disciplines in CRF management, (3) how best to integrate CRF management into policy and practice, (4) how best to ensure a consumer-focused approach to CRF management. Translating current knowledge on optimal management of CRF into clinical practice can be enhanced by the adoption of valid guidelines. This study indicates that it is feasible to adopt the CAPO guidelines. 
Clinical application may

  19. Application of Artificial Intelligence for Optimization in Pavement Management

    Directory of Open Access Journals (Sweden)

    Reus Salini

    2015-07-01

    Full Text Available Artificial intelligence (AI) is a group of techniques with considerable potential for application to pavement engineering and management. In this study, we developed a practical, flexible and out-of-the-box approach to applying genetic algorithms to optimize budget allocation and road maintenance strategy selection for a road network. The aim is to provide an alternative to existing software that better fits the requirements of a significant number of pavement managers. To meet these objectives, a new indicator, named the Road Global Value Index (RGVI), was created to account for the pavement condition, the traffic, and the economic and political importance of each road section. This paper describes the approach and its components through an example, confirming that genetic algorithms are very effective for the intended purpose.
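
    The underlying decision problem, choosing one maintenance strategy per road section to maximize an RGVI-style gain under a budget, can be sketched with a tiny exhaustive search; the genetic algorithm in the paper addresses the same formulation at network scale. Section names, gains and costs below are invented.

```python
from itertools import product

# Hypothetical road sections; per section, (RGVI gain, cost) for each
# maintenance strategy: 0 = do nothing, 1 = resurface, 2 = rebuild.
sections = {
    "A1": [(0, 0), (8, 40), (14, 90)],
    "B2": [(0, 0), (5, 30), (11, 80)],
    "C3": [(0, 0), (9, 50), (12, 70)],
}
BUDGET = 120

def best_plan(sections, budget):
    """Exhaustive stand-in for the paper's genetic-algorithm search:
    pick one strategy per section, maximizing total RGVI gain under
    the budget (tractable only for a handful of sections)."""
    names = list(sections)
    best_gain, best_choice = -1, None
    for choice in product(*(range(len(sections[n])) for n in names)):
        gain = sum(sections[n][c][0] for n, c in zip(names, choice))
        cost = sum(sections[n][c][1] for n, c in zip(names, choice))
        if cost <= budget and gain > best_gain:
            best_gain, best_choice = gain, dict(zip(names, choice))
    return best_gain, best_choice

gain, plan = best_plan(sections, BUDGET)
```

    With these numbers, resurfacing all three sections exactly exhausts the budget and beats any plan containing a rebuild.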

  20. Equilibrium optimization code OPEQ and results of applying it to HT-7U

    International Nuclear Information System (INIS)

    Zha Xuejun; Zhu Sizheng; Yu Qingquan

    2003-01-01

    The plasma equilibrium configuration has a strong impact on confinement and MHD stability in tokamaks. In designing a tokamak device, it is an important issue to determine the positions and currents of the poloidal coils, which are subject to physics and engineering constraints, for a prescribed equilibrium shape of the plasma. In this paper, an effective method based on multi-variable equilibrium optimization is given. The method can optimize the poloidal coils when the previously prescribed plasma parameters are treated as an objective function. We apply it to the HT-7U equilibrium calculation and obtain good results

  1. A proposal for accident management optimization based on the study of accident sequence analysis for a BWR

    International Nuclear Information System (INIS)

    Sobajima, M.

    1998-01-01

    The paper describes a proposal for accident management optimization based on the study of accident sequence and source term analyses for a BWR. In Japan, accident management measures are to be implemented in all LWRs by the year 2000 in accordance with the recommendation of the regulatory organization and based on the PSAs carried out by the utilities. Source terms were evaluated by the Japan Atomic Energy Research Institute (JAERI) with the THALES code for all BWR sequences in which loss of decay heat removal resulted in the largest release. Identification of the priority and importance of accident management measures was carried out for the sequences with larger risk contributions. Considerations for optimizing emergency operation guides are believed to be essential for risk reduction. (author)

  2. Innovation of genetic algorithm code GenA for WWER fuel loading optimization

    International Nuclear Information System (INIS)

    Sustek, J.

    2005-01-01

    Genetic algorithms, one of the stochastic search techniques, were recently used to optimize the arrangement of fuel assemblies (FA) in the cores of WWER-440 and WWER-1000 reactors. The basic algorithm was modified by incorporating the SPEA (Strength Pareto Evolutionary Algorithm) scheme. Both were enhanced and some results are presented. (Authors)

  3. Portfolio Implementation Risk Management Using Evolutionary Multiobjective Optimization

    Directory of Open Access Journals (Sweden)

    David Quintana

    2017-10-01

    Full Text Available Portfolio management based on mean-variance portfolio optimization is subject to different sources of uncertainty. In addition to those related to the quality of parameter estimates used in the optimization process, investors face a portfolio implementation risk. The potential temporary discrepancy between target and present portfolios, caused by trading strategies, may expose investors to undesired risks. This study proposes an evolutionary multiobjective optimization algorithm aiming at regions with solutions more tolerant to these deviations and, therefore, more reliable. The proposed approach incorporates a user’s preference and seeks a fine-grained approximation of the most relevant efficient region. The computational experiments performed in this study are based on a cardinality-constrained problem with investment limits for eight broad-category indexes and 15 years of data. The obtained results show the ability of the proposed approach to address the robustness issue and to support decision making by providing a preferred part of the efficient set. The results reveal that the obtained solutions also exhibit a higher tolerance to prediction errors in asset returns and variance–covariance matrix.
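
    The filtering step at the heart of any evolutionary multiobjective optimizer keeps only non-dominated portfolios. A minimal two-objective (risk, return) sketch, with invented candidate portfolios:

```python
def dominates(a, b):
    """a dominates b if it is no worse in both objectives
    (minimize risk, maximize return) and strictly better in one."""
    risk_a, ret_a = a
    risk_b, ret_b = b
    return (risk_a <= risk_b and ret_a >= ret_b) and \
           (risk_a < risk_b or ret_a > ret_b)

def pareto_front(points):
    """Keep the non-dominated (efficient) portfolios."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical candidate portfolios as (risk, expected return) pairs.
candidates = [(0.10, 0.04), (0.12, 0.07), (0.15, 0.06),
              (0.20, 0.09), (0.25, 0.09)]
front = pareto_front(candidates)
```

    An evolutionary algorithm such as the one proposed in the paper repeatedly applies this filter to steer the population toward (a preferred region of) the efficient set.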

  4. Optimization of concrete composition in radioactive waste management

    International Nuclear Information System (INIS)

    Plecas, I.; Peric, A.

    1995-01-01

    Low- and intermediate-level waste represents 95% of the total waste that is conditioned into special concrete containers. Since these containers must protect radioactive waste safely for about 300 years, the selection and precise control of the physical and mechanical characteristics of the materials are very important. After volume reduction and recovery of valuable components, waste materials have to be conditioned for transport, storage and disposal. Conditioning is the waste management step in which radioactive wastes are immobilized and packed. The immobilization processes involve conversion of the wastes to solid forms that reduce the potential for migration or dispersion of radionuclides from the wastes by natural processes during storage, transport and disposal. The immobilization processes involve the use of various matrices of nonradioactive materials, such as concrete, to fix the wastes as monoliths, usually directly in the waste containers used for subsequent handling. In this paper an optimization of the composition of the concrete containers used for storing radioactive waste from nuclear power plants is presented. The optimization was performed on the composition of the concrete used in container production. In the experiments, the authors tried to obtain the best mechanical characteristics of the concrete by varying the weight percentage of the granulate according to its diameter, the water-to-cement ratio and the type of cement used in preparing the concrete container formulation. Concrete containers optimized in the manner described in this paper will be used for the final disposal of radioactive waste materials, using the concept of engineered trench system facilities

  5. Optimization of waste management by actions taken at source

    International Nuclear Information System (INIS)

    Strachan, N.R.; Wakerley, M.W.

    1988-01-01

    The purpose of this document is to collate and review information on those measures and practices adopted within nuclear facilities to optimize the management of radioactive waste at the point at which it arises. The information on which it has been based has been obtained by a review of the literature and by visits to a number of different types of facilities within the European Community. The search revealed mainly references to waste-management optimization at US LWRs, whereas the visits have tried to cover as wide a range of European facilities as practicable. There are a number of different respects in which radioactive waste can be minimized: - minimizing the amount of activity appearing in the wastes. This is best achieved by the design of process equipment; - minimizing the volume of waste with which radioactivity is associated. This is best achieved by a combination of administrative controls and equipment design; - minimizing the amount of material which for administrative or measurement reasons is considered to be radioactive. Many examples of minimization at source by means of developments in equipment and administrative controls that were encountered during our visits, or identified in the literature search, are described

  7. An intelligent agent for optimal river-reservoir system management

    Science.gov (United States)

    Rieker, Jeffrey D.; Labadie, John W.

    2012-09-01

    A generalized software package is presented for developing an intelligent agent for stochastic optimization of complex river-reservoir system management and operations. Reinforcement learning is an approach to artificial intelligence for developing a decision-making agent that learns the best operational policies without the need for explicit probabilistic models of hydrologic system behavior. The agent learns these strategies experientially in a Markov decision process through observational interaction with the environment and simulation of the river-reservoir system using well-calibrated models. The graphical user interface for the reinforcement learning process controller includes numerous learning method options and dynamic displays for visualizing the adaptive behavior of the agent. As a case study, the generalized reinforcement learning software is applied to developing an intelligent agent for optimal management of water stored in the Truckee river-reservoir system of California and Nevada for the purpose of streamflow augmentation for water quality enhancement. The intelligent agent successfully learns long-term reservoir operational policies that specifically focus on mitigating water temperature extremes during persistent drought periods that jeopardize the survival of threatened and endangered fish species.
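
    The learning mechanism described above can be illustrated with tabular Q-learning on a toy reservoir process; the states, actions and rewards below are invented for illustration and bear no relation to the Truckee system.

```python
import random

# Toy reservoir MDP (invented): storage levels 0..4; each step one
# unit of inflow arrives and the operator releases 0, 1 or 2 units.
# Rewards favor holding the mid-range level 2, penalizing spills and
# an empty reservoir.
N_STATES, ACTIONS = 5, (0, 1, 2)

def step(state, action):
    level = state - action + 1           # release, then inflow
    reward = 0.0
    if level > 4:
        level, reward = 4, -10.0         # spill over the dam
    elif level <= 0:
        level, reward = 0, -5.0          # drawn down to empty
    elif level == 2:
        reward = 1.0                     # preferred storage band
    return level, reward

def q_learning(episodes=400, horizon=50, alpha=0.2, gamma=0.9,
               eps=0.1, seed=7):
    rng = random.Random(seed)
    q = [[0.0] * len(ACTIONS) for _ in range(N_STATES)]
    for _ in range(episodes):
        s = rng.randrange(N_STATES)
        for _ in range(horizon):
            if rng.random() < eps:                      # explore
                a = rng.choice(ACTIONS)
            else:                                       # exploit
                a = max(ACTIONS, key=lambda x: q[s][x])
            s2, r = step(s, a)
            # Standard Q-learning temporal-difference update.
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = q_learning()
policy = [max(ACTIONS, key=lambda a: q[s][a]) for s in range(N_STATES)]
```

    The learned policy releases just enough water to return to the preferred storage band, mirroring how the agent in the paper learns operating rules purely from simulated experience rather than from an explicit probabilistic model.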

  8. On the Optimality of Repetition Coding among Rate-1 DC-offset STBCs for MIMO Optical Wireless Communications

    KAUST Repository

    Sapenov, Yerzhan

    2017-07-06

    In this paper, an optical wireless multiple-input multiple-output communication system employing intensity-modulation direct-detection is considered. The performance of direct current offset space-time block codes (DC-STBC) is studied in terms of pairwise error probability (PEP). It is shown that among the class of DC-STBCs, the worst case PEP corresponding to the minimum distance between two codewords is minimized by repetition coding (RC), under both electrical and optical individual power constraints. It follows that among all DC-STBCs, RC is optimal in terms of worst-case PEP for static channels and also for varying channels under any turbulence statistics. This result agrees with previously published numerical results showing the superiority of RC in such systems. It also agrees with previously published analytic results on this topic under log-normal turbulence and further extends it to arbitrary turbulence statistics. This shows the redundancy of the time-dimension of the DC-STBC in this system. This result is further extended to sum power constraints with static and turbulent channels, where it is also shown that the time dimension is redundant, and the optimal DC-STBC has a spatial beamforming structure. Numerical results are provided to demonstrate the difference in performance for systems with different numbers of receiving apertures and different throughput.

  9. Depletion mapping and constrained optimization to support managing groundwater extraction

    Science.gov (United States)

    Fienen, Michael N.; Bradbury, Kenneth R.; Kniffin, Maribeth; Barlow, Paul M.

    2018-01-01

    Groundwater models often serve as management tools to evaluate competing water uses including ecosystems, irrigated agriculture, industry, municipal supply, and others. Depletion potential mapping—showing the model-calculated potential impacts that wells have on stream baseflow—can form the basis for multiple potential management approaches in an oversubscribed basin. Specific management approaches can include scenarios proposed by stakeholders, systematic changes in well pumping based on depletion potential, and formal constrained optimization, which can be used to quantify the tradeoff between water use and stream baseflow. Variables considered include the maximum pumping reduction allowed at each well and various groupings of wells, obtained, for example, by K-means clustering on spatial proximity and depletion potential. These approaches provide a potential starting point and guidance for resource managers and stakeholders to make decisions about groundwater management in a basin, spreading responsibility in different ways. We illustrate these approaches in the Little Plover River basin in central Wisconsin, United States—home to a rich agricultural tradition, with farmland and urban areas both in close proximity to a groundwater-dependent trout stream. Groundwater withdrawals have reduced baseflow supplying the Little Plover River below a legally established minimum. The techniques in this work were developed in response to engaged stakeholders with various interests and goals for the basin. They sought to develop a collaborative management plan at a watershed scale that restores the flow rate in the river in a manner that incorporates principles of shared governance and results in effective and minimally disruptive changes in groundwater extraction practices.
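
    The well-grouping step mentioned above can be sketched with a plain K-means (Lloyd's) clustering over well coordinates and depletion potential; the well data below are invented.

```python
import math
import random

def kmeans(points, k, iters=20, seed=3):
    """Plain Lloyd's algorithm; features are assumed pre-scaled."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[i].append(p)
        # Move each center to the mean of its group (keep it if empty).
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g
                   else centers[i] for i, g in enumerate(groups)]
    return centers, groups

# Hypothetical wells: (x km, y km, depletion potential), invented.
wells = [(0.1, 0.2, 0.9), (0.2, 0.1, 0.8), (0.15, 0.25, 0.85),  # near stream
         (5.0, 5.2, 0.2), (5.1, 4.9, 0.1), (4.9, 5.0, 0.15)]    # far field
centers, groups = kmeans(wells, k=2)
```

    Groups produced this way can then be assigned group-wise pumping reductions, which is one of the responsibility-spreading schemes the paper compares.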

  10. Integer-linear-programing optimization in scalable video multicast with adaptive modulation and coding in wireless networks.

    Science.gov (United States)

    Lee, Dongyul; Lee, Chaewoo

    2014-01-01

    The advancement in wideband wireless network supports real time services such as IPTV and live video streaming. However, because of the sharing nature of the wireless medium, efficient resource allocation has been studied to achieve a high level of acceptability and proliferation of wireless multimedia. Scalable video coding (SVC) with adaptive modulation and coding (AMC) provides an excellent solution for wireless video streaming. By assigning different modulation and coding schemes (MCSs) to video layers, SVC can provide good video quality to users in good channel conditions and also basic video quality to users in bad channel conditions. For optimal resource allocation, a key issue in applying SVC in the wireless multicast service is how to assign MCSs and the time resources to each SVC layer in the heterogeneous channel condition. We formulate this problem with integer linear programming (ILP) and provide numerical results to show the performance under 802.16 m environment. The result shows that our methodology enhances the overall system throughput compared to an existing algorithm.
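
    The allocation problem, assigning an MCS to each SVC layer under an airtime budget, can be illustrated with a brute-force stand-in for the paper's ILP; for realistic numbers of layers and MCSs an ILP solver replaces the enumeration. All rates, user counts and layer sizes below are invented.

```python
from itertools import product

# Invented example: 3 SVC layers (base + 2 enhancements), an airtime
# budget of 4 slots, and 3 MCSs; a more robust MCS reaches more of
# the 10 users but carries fewer bits per slot.
layer_bits = [300, 200, 200]           # bits per GOP for each layer
mcs_rate = {0: 100, 1: 200, 2: 400}    # bits per slot
mcs_users = {0: 10, 1: 7, 2: 3}        # users able to decode each MCS
SLOTS = 4

def best_assignment():
    """Exhaustive stand-in for the paper's ILP: choose an MCS per
    layer to maximize total layers received over all users, subject
    to the airtime budget. A user gets layer l only if it can decode
    layers 0..l (SVC layers are cumulative)."""
    best_util, best = -1, None
    for assign in product(mcs_rate, repeat=len(layer_bits)):
        slots = sum(-(-layer_bits[l] // mcs_rate[m])     # ceil division
                    for l, m in enumerate(assign))
        if slots > SLOTS:
            continue
        util = sum(min(mcs_users[assign[k]] for k in range(l + 1))
                   for l in range(len(assign)))
        if util > best_util:
            best_util, best = util, assign
    return best_util, best

util, assign = best_assignment()
```

    With these numbers the middle MCS for every layer wins: the most robust MCS cannot fit all layers into the budget, and the fastest MCS reaches too few users.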

  12. Low-complexity BCH codes with optimized interleavers for DQPSK systems with laser phase noise

    DEFF Research Database (Denmark)

    Leong, Miu Yoong; Larsen, Knud J.; Jacobsen, Gunnar

    2017-01-01

    The presence of high phase noise in addition to additive white Gaussian noise in coherent optical systems affects the performance of forward error correction (FEC) schemes. In this paper, we propose a simple scheme for such systems, using block interleavers and binary Bose-Chaudhuri-Hocquenghem (BCH) codes... simulations. For a target post-FEC BER of 10^-6, codes selected using our method result in BERs around 3× the target and achieve the target with around 0.2 dB extra signal-to-noise ratio.
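
    The block-interleaver half of the proposed scheme is simple to sketch: symbols are written row by row and read column by column, so a burst of channel errors (for example, during a laser phase excursion) is spread across several codewords. A minimal sketch, independent of the BCH component:

```python
def interleave(bits, rows, cols):
    """Block interleaver: write row by row, read column by column,
    so consecutive channel errors land in different codewords (rows)."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows, cols):
    # Reading the columns of an r x c block equals writing the rows
    # of a c x r block, so deinterleaving is interleaving with the
    # dimensions swapped.
    return interleave(bits, cols, rows)

# 12 symbols arranged as 3 codewords (rows) of 4 symbols each.
data = list(range(12))
tx = interleave(data, rows=3, cols=4)

# A 3-symbol channel burst hits tx positions 3..5; since data is the
# identity, the corrupted symbols name their own original positions.
burst_syms = [tx[i] for i in (3, 4, 5)]
hit_rows = {s // 4 for s in burst_syms}   # codeword (row) of each hit
```

    Here the burst of three consecutive channel symbols is spread over three different codewords, leaving each BCH codeword with a single error to correct.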

  13. Code of practice on the management of radioactive wastes from the mining and milling of radioactive ores 1982

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    This Code, issued by the Department of Home Affairs and Environment, was formulated under the provisions of the Environment Protection (Nuclear Codes) Act 1978. The Code provides for prior development, approval and subsequent updating of a waste management programme for each mining or milling operation to which it applies, for the purpose of ensuring an approach to waste management best suited to the particular circumstances of each operation. It also prescribes the duties of the owners, operators and managers of mines and mills. (NEA)

  14. Analytical methodology for optimization of waste management scenarios in nuclear installation decommissioning process - 16148

    International Nuclear Information System (INIS)

    Zachar, Matej; Necas, Vladimir; Daniska, Vladimir; Rehak, Ivan; Vasko, Marek

    2009-01-01

    The nuclear installation decommissioning process is characterized by the production of a large amount of various radioactive and non-radioactive waste that has to be managed, taking into account its physical, chemical, toxic and radiological properties. Waste management is considered to be one of the key issues within the frame of the decommissioning process. During the decommissioning planning period, the scenarios covering possible routes of materials release into the environment and radioactive waste disposal should be discussed and evaluated. Unconditional and conditional release to the environment, long-term storage at the nuclear site, near-surface or deep geological disposal and the relevant material management techniques for achieving the final status should be taken into account in the analysed scenarios. At the level of the final decommissioning plan, it is desirable to have the waste management scenario optimized for local specific facility conditions, taking into account the national decommissioning background. The analytical methodology for the evaluation of decommissioning waste management scenarios, presented in the paper, is based on modelling the flow of materials and radioactivity, starting from waste generation activities such as pre-dismantling decontamination and the selected methods of dismantling, through waste treatment and conditioning, up to materials release or conditioned radioactive waste disposal. The necessary input data for the scenarios, e.g. the nuclear installation inventory database (physical and radiological data), waste processing technology parameters or material release and waste disposal limits, have to be considered. The analytical methodology principles are implemented in the standardised decommissioning parameter calculation code OMEGA, developed by the DECOM company. In the paper, examples of the methodology's implementation for scenario optimization are presented and discussed. (authors)

  15. Optimal Control for Bufferbloat Queue Management Using Indirect Method with Parametric Optimization

    Directory of Open Access Journals (Sweden)

    Amr Radwan

    2016-01-01

    Full Text Available Because memory buffers have become larger and cheaper, they have been put into network devices to reduce the number of lost packets and improve network performance. However, the consequences of large buffers are long queues at network bottlenecks and throughput saturation, which has recently been noticed in the research community as the bufferbloat phenomenon. To address such issues, in this article, we design a forward-backward optimal control queue algorithm based on an indirect approach with parametric optimization. The cost function we want to minimize represents a trade-off between queue length and packet loss rate performance. Through the integration of an indirect approach with parametric optimization, our proposal has advantages of scalability and accuracy compared to direct approaches, while still maintaining good throughput and a shorter queue length than several existing queue management algorithms. Numerical analysis, simulation in ns-2, and experimental results are all provided to solidify the efficiency of our proposal. In detailed comparisons with other conventional algorithms, the proposed procedure runs much faster than direct collocation methods while maintaining a desired short queue (≈40 packets in simulation and ≈80 ms in experiments).
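
    As a simplified illustration of steering a bufferbloated queue toward a short target length (not the paper's indirect optimal-control method), here is a PI drop controller on a toy fluid queue, with invented gains and traffic rates:

```python
def simulate_pi_aqm(target=40.0, steps=2000, arrivals=12.0, service=10.0,
                    kp=0.01, ki=0.0005):
    """Toy fluid queue with a PI drop controller steering the queue
    toward `target` packets. A stand-in illustration of the
    control-theoretic view of queue management, not the paper's
    forward-backward indirect method."""
    q = p = integ = 0.0
    history = []
    for _ in range(steps):
        accepted = arrivals * (1.0 - p)        # early drops thin the input
        q = max(0.0, q + accepted - service)   # serve `service` pkts/tick
        err = q - target
        integ += err                           # integral term removes
        p = min(1.0, max(0.0, kp * err + ki * integ))  # steady-state error
        history.append(q)
    return history

hist = simulate_pi_aqm()
```

    The integral term drives the steady-state queue to the target even though the offered load exceeds the service rate, which is the basic trade-off (short queue versus packet loss) that the paper's cost function formalizes.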

  16. Compendium of technical computer codes used in support of the DOE Office of Civilian Radioactive Waste Management

    International Nuclear Information System (INIS)

    McBride, A.F.; Austin, P.N.; Ward, W.M.; McCarn, L.B.; Roddy, J.W.; Ludwig, S.B.; Reich, W.J.; Roussin, R.W.

    1989-04-01

    A compilation of technical computer codes related to ongoing work under the cognizance of the US Department of Energy's Office of Civilian Radioactive Waste Management (DOE/OCRWM) is presented. Much of the information was obtained from responses to a questionnaire distributed by DOE/OCRWM to all DOE offices associated with the radioactive waste management program. The codes are arranged alphabetically by name. In addition to the code description, each sheet includes other data such as computer hardware and software requirements, document references, name of respondent, and code variants. The codes are categorized into seventeen subject areas plus a miscellaneous category. Some of the subject areas covered are atmospheric dispersion, biosphere transport, geochemistry, nuclear radiation transport, nuclide inventory, and risk assessment. Three appendixes are included which list the names of the contributors, a list of the literature reviewed, and a glossary of computer code terminology and definitions. 50 refs., 3 tabs

  17. SEJITS: embedded specializers to turn patterns-based designs into optimized parallel code

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    All software should be parallel software. This is a natural result of the transition to a many-core world. For a small fraction of the world's programmers (efficiency programmers), this is not a problem. They enjoy mapping algorithms onto the details of a particular system and are well served by low-level languages and OpenMP, MPI, or OpenCL. Most programmers, however, are "domain specialists" who write code. They are too busy working in their domain of choice (such as physics) to master the intricacies of each computer they use. How do we make these programmers productive without giving up performance? We have been working with a team at UC Berkeley's ParLab to address this problem. The key is a clear software architecture expressed in terms of design patterns that exposes the concurrency in a problem. The resulting code is written using a patterns-based framework within a high-level, productivity language (such as Python). Then a separate system is used by a small group o...

  18. Heap leach cyanide irrigation and risk to wildlife: Ramifications for the international cyanide management code.

    Science.gov (United States)

    Donato, D B; Madden-Hallett, D M; Smith, G B; Gursansky, W

    2017-06-01

    Exposed cyanide-bearing solutions associated with gold and silver recovery processes in the mining industry pose a risk to wildlife that interact with these solutions. This has been documented for cyanide-bearing tailings storage facilities; however, the risks associated with heap leach facilities are poorly documented, monitored and audited. Gold and silver heap leach facilities use cyanide, pH-stabilised, at concentrations deemed toxic to wildlife. Their design and management are known to result in exposed cyanide-bearing solutions that are accessible to and present a risk to wildlife. Monitoring of the presence of exposed solutions, wildlife interaction, interpretation of risks and associated wildlife deaths is poorly documented. This paper provides a list of critical monitoring criteria and attempts to predict the wildlife guilds most at risk. Understanding the significance of risks to wildlife from exposed cyanide solutions is complex, involving seasonality, the relative position of ponding, the temporal nature of ponding, solution palatability, environmental conditions, the in situ wildlife species inventory and the provision of alternative drinking sources for wildlife. Although a number of heap leach operations are certified as compliant with the International Cyanide Management Code (Cyanide Code), these criteria are not considered by auditors, nor have systematic monitoring data been published. Without systematic monitoring and further knowledge, wildlife deaths at heap leach facilities are likely to remain largely unrecorded. This has ramifications for those operations certified as compliant with the Cyanide Code.

  19. Development of code system for management of reactor decommissioning (COSMARD), 1

    International Nuclear Information System (INIS)

    Yanagihara, Satoshi; Ogihara, Hirohito

    1994-02-01

    The Code System for Management of Reactor Decommissioning (COSMARD) was developed for use in the effective planning and management of reactor decommissioning. The decommissioning management data evaluation facility (DMAF), which is the main part of COSMARD, has functions to evaluate various project management data, such as manpower needs, radiation exposure of workers and the amount of waste arising, for each activity in a project, using input data and calculation models consisting of simple arithmetic formulas and unit factors in the database. Using a set of command descriptors developed in COSMARD, the work conditions and procedures for decommissioning a nuclear facility are described as input data. The management data are evaluated by applying the calculation models, which are placed in the activities at the lowest level of the work breakdown structure (WBS). The management data evaluated by the models are summed in the ascending direction of the WBS to obtain the necessary data for the activities at any level of the WBS. In addition, scheduling calculations are conducted to obtain a scheduling bar chart and histograms of the management data, on the basis of the work precedence conditions attached to certain activities. This report describes the outline of DMAF and serves as a user's manual for the sets of command descriptors. (author)
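
    The roll-up behaviour of a WBS like COSMARD's can be sketched as a recursive sum of leaf-level management data computed from unit factors; the activity tree and all numbers below are invented for illustration.

```python
def rollup(node):
    """Return {'manpower_h', 'dose_mSv'} totals for a WBS subtree by
    summing leaf-activity data in the ascending direction of the WBS."""
    if "children" not in node:
        # Leaf activity: unit factor x amount of work, in the spirit
        # of COSMARD's simple arithmetic calculation models.
        manpower = node["unit_factor"] * node["amount"]
        return {"manpower_h": manpower,
                "dose_mSv": node["dose_rate"] * manpower}
    totals = {"manpower_h": 0.0, "dose_mSv": 0.0}
    for child in node["children"]:
        sub = rollup(child)
        for key in totals:
            totals[key] += sub[key]
    return totals

# Invented two-activity subtree of a decommissioning project.
wbs = {"name": "dismantle heat exchanger", "children": [
    {"name": "cut piping", "unit_factor": 2.0, "amount": 30, "dose_rate": 0.01},
    {"name": "remove insulation", "unit_factor": 0.5, "amount": 40, "dose_rate": 0.002},
]}
totals = rollup(wbs)
```

    Because each parent only sums its children, the same recursion yields totals at any WBS level, which is what makes project-wide manpower and dose estimates fall out of leaf-level unit factors.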

  20. GRA prospectus: optimizing design and management of protected areas

    Science.gov (United States)

    Bernknopf, Richard; Halsing, David

    2001-01-01

    Protected areas comprise one major type of global conservation effort that has been in the form of parks, easements, or conservation concessions. Though protected areas are increasing in number and size throughout tropical ecosystems, there is no systematic method for optimally targeting specific local areas for protection, designing the protected area, and monitoring it, or for guiding follow-up actions to manage it or its surroundings over the long run. Without such a system, conservation projects often cost more than necessary and/or risk protecting ecosystems and biodiversity less efficiently than desired. Correcting these failures requires tools and strategies for improving the placement, design, and long-term management of protected areas. The objective of this project is to develop a set of spatially based analytical tools to improve the selection, design, and management of protected areas. In this project, several conservation concessions will be compared using an economic optimization technique. The forest land use portfolio model is an integrated assessment that measures investment in different land uses in a forest. The case studies of individual tropical ecosystems are developed as forest (land) use and preservation portfolios in a geographic information system (GIS). Conservation concessions involve a private organization purchasing development and resource access rights in a certain area and retiring them. Forests are put into conservation, and those people who would otherwise have benefited from extracting resources or selling the right to do so are compensated. Concessions are legal agreements wherein the exact amount and nature of the compensation result from a negotiated agreement between an agent of the conservation community and the local community. Funds are placed in a trust fund, and annual payments are made to local communities and regional/national governments. 
The payments are made pending third-party verification that the forest expanse

  1. The verification of PWR-fuel code for PWR in-core fuel management

    International Nuclear Information System (INIS)

    Surian Pinem; Tagor M Sembiring; Tukiran

    2015-01-01

    In-core fuel management for a PWR is difficult because the core contains as many as 192 fuel assemblies, so there are many possible placements of the fuel in the core. The configuration of fuel assemblies in the core must be precise and accurate so that the reactor operates safely and economically. Verification of the PWR-FUEL code, which will be used for PWR in-core fuel management, is therefore necessary. PWR-FUEL is based on neutron transport theory, solved with a multi-dimensional, multi-group nodal diffusion method and a finite difference diffusion method (FDM). The goal is to check whether the program works correctly, especially for PWR core design and in-core fuel management. Verification was done with an equilibrium core search model at three conditions: boron-free, 1000 ppm boron concentration, and critical boron concentration. The distributions of assembly-averaged burn-up and power at BOC and EOC showed a consistent trend: fuel with high power at BOC produces a high burn-up at EOC. For the boron-free core, a high multiplication factor is obtained because of the absence of boron, and the effect of fission products on the core is around 3.8 %. The reactivity effect of a 1000 ppm boron solution at BOC and EOC is 6.44 % and 1.703 %, respectively. The neutron flux and power density distributions obtained with the NODAL and FDM methods give the same result. The results show that the PWR-FUEL code works properly, especially for PWR core design and in-core fuel management. (author)
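
Boron reactivity effects such as those quoted above are conventionally derived from core multiplication factors. As a minimal illustration, with made-up k-eff values that are not from the paper, the worth in percent follows from rho = (k - 1)/k:

```python
def reactivity(k_eff):
    """Static reactivity rho = (k_eff - 1) / k_eff, expressed in percent."""
    return (k_eff - 1.0) / k_eff * 100.0

def boron_worth(k_without, k_with):
    """Reactivity worth of soluble boron, in percent:
    rho(no boron) - rho(with boron)."""
    return reactivity(k_without) - reactivity(k_with)

# Hypothetical multiplication factors (NOT the values from the paper):
k_no_boron, k_1000ppm = 1.0400, 0.9750
print(round(boron_worth(k_no_boron, k_1000ppm), 2))  # prints 6.41
```

The same two-state difference of static reactivities is what an abstract like this one reports as the "reactivity effect" of a given boron concentration.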

  2. Performance of an improved logarithmic phase mask with optimized parameters in a wavefront-coding system.

    Science.gov (United States)

    Zhao, Hui; Li, Yingcai

    2010-01-10

    In two papers [Proc. SPIE 4471, 272-280 (2001) and Appl. Opt. 43, 2709-2721 (2004)], a logarithmic phase mask was proposed and proved to be effective in extending the depth of field; however, according to our research, this mask is not ideal, because the corresponding defocused modulation transfer function has large oscillations in the low-frequency region, even when the mask is optimized. So, in a previously published paper [Opt. Lett. 33, 1171-1173 (2008)], we proposed an improved logarithmic phase mask by making a small modification. The new mask not only eliminates these drawbacks to a certain extent but is also even less sensitive to focus errors according to the Fisher information criterion. However, the performance comparison was carried out without the modified mask being optimized, which was not reasonable. In this manuscript, we first optimize the modified logarithmic phase mask before analyzing its performance, and more convincing results are obtained based on the analysis of several frequently used metrics.

  3. Swiss Foundation Code 2009 principles and recommendations for the establishment and management of grant-making foundations

    CERN Document Server

    Sprecher, Thomas; Janssen, Martin

    2011-01-01

    The "Swiss Foundation Code 2009" takes up and extends the first European good-governance code for grant-making foundations, published in 2005. It contains practical governance guidelines on the establishment, organization, management and monitoring of grant-making foundations, with due reference to support activities and to financial and investment policies. The abridged English version of the "Swiss Foundation Code 2009" contains 3 principles and 26 recommendations, but not the extensive commentary.

  4. Architecture proposal for the use of QR code in supply chain management

    Directory of Open Access Journals (Sweden)

    Dalton Matsuo Tavares

    2012-01-01

    Supply chain traceability and visibility are key concerns for many companies. Radio-Frequency Identification (RFID) is an enabling technology that allows identification of objects in a fully automated manner via radio waves. Nevertheless, this technology has limited acceptance and high costs. This paper presents a research effort undertaken to design a track-and-trace solution for supply chains, using the quick response code (QR code for short) as a less complex and cost-effective alternative to RFID in supply chain management (SCM). A first architecture proposal using open source software is presented as a proof of concept. The system architecture covers tag generation, image acquisition and pre-processing, product inventory, and tracking. A prototype system for tag identification is developed and discussed at the end of the paper to demonstrate its feasibility.

  5. Optimal pricing and lot sizing vendor managed inventory

    Directory of Open Access Journals (Sweden)

    Mohsen Ziaee

    2010-07-01

    Vendor Managed Inventory (VMI) is one of the effective techniques for managing the inventory in a supply chain. VMI models have been proven to reduce the cost of inventory compared with the traditional economic order quantity method under some conditions, such as constant demand and production expenditure. However, the VMI problem has never been modeled under some realistic assumptions such as price-dependent demand. In this paper, three problem formulations are proposed. In the first, we study a VMI problem with one buyer and one supplier where demand is a function of price and the price elasticity of demand, and production cost is also a function of demand. The proposed model is formulated and solved as a geometric program. For the second and third models, we consider a VMI problem with two buyers and two suppliers, assuming that each buyer centre is relatively close to the other. Each supplier has only one product, which is different from the product of the other supplier. The two suppliers cooperate in customer relationship management and the two buyers cooperate in supplier relationship management, so the suppliers send the orders of the two buyers by one vehicle simultaneously. For the third model, an additional assumption which is practically applicable and reasonable is considered. For all the proposed models, the optimal solution is compared with the traditional one. We demonstrate the implementation of our proposed models using some numerical examples.
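
The price-dependent-demand setup of the first model can be made concrete with a toy numerical sketch. All parameters and the iso-elastic demand form below are illustrative assumptions, not values from the paper, and a coarse grid search stands in for the geometric-programming solution:

```python
# Toy pricing/lot-sizing model: iso-elastic demand D(p) = a * p**(-e),
# fixed ordering cost A, holding cost h, unit cost c (all invented).
a, e, A, h, c = 50_000.0, 1.8, 120.0, 2.0, 10.0

def demand(p):
    return a * p ** (-e)

def profit(p, q):
    d = demand(p)
    # margin on sales - ordering cost per cycle - average holding cost
    return (p - c) * d - A * d / q - h * q / 2.0

# Coarse joint grid search over price and lot size.
best = max(
    ((profit(p / 10.0, q), p / 10.0, q)
     for p in range(110, 600)       # prices 11.0 .. 59.9
     for q in range(10, 500, 5)),   # lot sizes 10, 15, ..., 495
    key=lambda t: t[0],
)
print(best)  # (best profit, best price, best lot size)
```

For any fixed price, the profit-maximizing lot size reduces to the classical EOQ sqrt(2*A*D(p)/h), so the grid result can be sanity-checked against that closed form.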

  6. Management and optimization of the CPCU network working

    Energy Technology Data Exchange (ETDEWEB)

    Silvain, D. (Compagnie Parisienne de Chauffage Urbain, 75 - Paris (FR))

    1991-10-01

    The CPCU steam distribution network is supplemented by a return network for the condensed water. The data system installed in 1988 provides real-time management of the operation of the two networks and a reduction in production costs. For the steam, data acquired in the network, in the boiler houses and from external sources are processed by a local network of five microprocessors and permit: - with time delay: technical and economic production-optimizing calculations, or forecasts, for the following day, of the total required output and the procedure necessary for supplying it at the lowest cost; - in real time: on the basis of the previous day's forecasts, generation of the production instructions for the boiler houses and of the instructions for the network remote-control elements; - in case of an unexpected occurrence: immediate generation of new operating forecasts for the boiler houses, establishing management data in real time. For the water, the system forecasts the volume to be returned to the boilers depending on the quantity of steam to be produced. Subsequently, the pressures and outputs measured in the network are analysed in real time to derive the valve movements and the pump start/stop procedure that guarantee the return of the water. The architecture, basic principles and software developed for this application can be used in other steam or water networks and, in general, are adaptable to the management of any complex multi-supplier or multi-customer system.

  7. [Optimal use of peritoneal dialysis with multi-disciplinary management].

    Science.gov (United States)

    Elios Russo, Gaspare; Martinez, A; Mazzaferro, S; Nunzi, A; Testorio, M; Rocca, A R; Lai, S; Morgia, A; Borzacca, B; Gnerre Musto, T

    2013-01-01

    Considering the increasing incidence of chronic kidney disease and the increased use of peritoneal dialysis, we wanted to assess whether multidisciplinary management of patients on peritoneal dialysis might improve the quality of patients' lives when compared to management by a routine team of operators. Our study observed 40 patients on peritoneal dialysis in our Department between 2010 and 2012. They were randomly assigned to either group A, managed by a routine team consisting of a nephrologist and a nurse, or group B, managed by a multidisciplinary team comprising several medical specialists, a nurse, a psychologist and a social worker. Two tests, the KDQOL-SF and the MMPI-2, were administered to both groups. In group B, the numbers of hospitalization and day-hospital days were more than 88% lower than in group A. The multidisciplinary team achieved better results on the KDQOL-SF test with regard to both emotional and objective dimensions. The Pearson coefficient between the results of the two questionnaires shows how multidisciplinary management can positively influence the perceived well-being of the patient and his or her adherence to treatment. In a multidisciplinary team, each operator, in addition to his or her specific role, also contributes to the achievement of the overall objective, namely ensuring an optimal quality of life for the patient on peritoneal dialysis, thereby allowing these patients to continue their professional and social lives.

  8. TRU Waste Management Program. Cost/schedule optimization analysis

    International Nuclear Information System (INIS)

    Detamore, J.A.; Raudenbush, M.H.; Wolaver, R.W.; Hastings, G.A.

    1985-10-01

    This Current Year Work Plan presents in detail a description of the activities to be performed by the Joint Integration Office Rockwell International (JIO/RI) during FY86. It breaks down the activities into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO/RI by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO/RI tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance development, task monitoring, task progress information gathering and reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of reports for DOE detailing program status. Program Analysis is performed by the JIO/RI to support identification and assessment of alternatives, and development of long-term TRU waste program capabilities. These analyses include short-term analyses in response to DOE information requests, along with performing an RH Cost/Schedule Optimization report. Systems models will be developed, updated, and upgraded as needed to enhance JIO/RI's capability to evaluate the adequacy of program efforts in various fields. A TRU program data base will be maintained and updated to provide DOE with timely responses to inventory related questions

  9. SU-F-T-193: Evaluation of a GPU-Based Fast Monte Carlo Code for Proton Therapy Biological Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Taleei, R; Qin, N; Jiang, S [UT Southwestern Medical Center, Dallas, TX (United States); Peeler, C [UT MD Anderson Cancer Center, Houston, TX (United States); Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2016-06-15

    Purpose: Biological treatment plan optimization is of great interest for proton therapy. It requires extensive Monte Carlo (MC) simulations to compute physical dose and biological quantities. Recently, the gPMC package was developed for rapid MC dose calculations on a GPU platform. This work investigated its suitability for proton therapy biological optimization in terms of accuracy and efficiency. Methods: We performed simulations of a proton pencil beam with energies of 75, 150 and 225 MeV in a homogeneous water phantom using gPMC and FLUKA. Physical dose and energy spectra for each ion type on the central beam axis were scored. Relative Biological Effectiveness (RBE) was calculated using the repair-misrepair-fixation model. Microdosimetry calculations were performed using the Monte Carlo Damage Simulation (MCDS) code. Results: Ranges computed by the two codes agreed within 1 mm. The physical dose difference was less than 2.5% at the Bragg peak, and the RBE-weighted dose agreed within 5% at the Bragg peak. Differences in microdosimetric quantities such as dose-average lineal energy and specific energy were < 10%. The simulation time per source particle with FLUKA was 0.0018 s, while gPMC was ∼600 times faster. Conclusion: The physical dose computed by FLUKA and gPMC was in good agreement. The RBE differences along the central axis were small, and the RBE-weighted dose difference was found to be acceptable. The combined accuracy and efficiency make gPMC suitable for proton therapy biological optimization.

  10. MagRad: A code to optimize the operation of superconducting magnets in a radiation environment

    International Nuclear Information System (INIS)

    Yeaw, C.T.

    1995-01-01

    A powerful computational tool, called MagRad, has been developed which optimizes magnet design for operation in radiation fields. Specifically, MagRad has been used for the analysis and design modification of the cable-in-conduit conductors of the TF magnet systems in fusion reactor designs. Since the TF magnets must operate in a radiation environment which damages the material components of the conductor and degrades their performance, the optimization of conductor design must account not only for start-up magnet performance, but also shut-down performance. The degradation in performance consists primarily of three effects: reduced stability margin of the conductor; a transition out of the well-cooled operating regime; and an increased maximum quench temperature attained in the conductor. Full analysis of the magnet performance over the lifetime of the reactor includes: radiation damage to the conductor, stability, protection, steady state heat removal, shielding effectiveness, optimal annealing schedules, and finally costing of the magnet and reactor. Free variables include primary and secondary conductor geometric and compositional parameters, as well as fusion reactor parameters. A means of dealing with the radiation damage to the conductor, namely high temperature superconductor anneals, is proposed, examined, and demonstrated to be both technically feasible and cost effective. Additionally, two relevant reactor designs (ITER CDA and ARIES-II/IV) have been analyzed. Upon addition of pure copper strands to the cable, the ITER CDA TF magnet design was found to be marginally acceptable, although much room for both performance improvement and cost reduction exists. A cost reduction of 10-15% of the capital cost of the reactor can be achieved by adopting a suitable superconductor annealing schedule. In both of these reactor analyses, the performance predictive capability of MagRad and its associated costing techniques have been demonstrated

  11. Optimization and parallelization of the thermal–hydraulic subchannel code CTF for high-fidelity multi-physics applications

    International Nuclear Information System (INIS)

    Salko, Robert K.; Schmidt, Rodney C.; Avramova, Maria N.

    2015-01-01

    Highlights: • COBRA-TF was adopted by the Consortium for Advanced Simulation of LWRs. • We have improved code performance to support running large-scale LWR simulations. • Code optimization has led to reductions in execution time and memory usage. • An MPI parallelization has reduced full-core simulation time from days to minutes. - Abstract: This paper describes major improvements to the computational infrastructure of the CTF subchannel code so that full-core, pincell-resolved (i.e., one computational subchannel per real bundle flow channel) simulations can now be performed in much shorter run-times, either in stand-alone mode or as part of coupled-code multi-physics calculations. These improvements support the goals of the Department Of Energy Consortium for Advanced Simulation of Light Water Reactors (CASL) Energy Innovation Hub to develop high fidelity multi-physics simulation tools for nuclear energy design and analysis. A set of serial code optimizations—including fixing computational inefficiencies, optimizing the numerical approach, and making smarter data storage choices—are first described and shown to reduce both execution time and memory usage by about a factor of ten. Next, a “single program multiple data” parallelization strategy targeting distributed memory “multiple instruction multiple data” platforms utilizing domain decomposition is presented. In this approach, data communication between processors is accomplished by inserting standard Message-Passing Interface (MPI) calls at strategic points in the code. The domain decomposition approach implemented assigns one MPI process to each fuel assembly, with each domain being represented by its own CTF input file. The creation of CTF input files, both for serial and parallel runs, is also fully automated through use of a pressurized water reactor (PWR) pre-processor utility that uses a greatly simplified set of user input compared with the traditional CTF input. To run CTF in

  12. Development of Evaluation Technology for Hydrogen Combustion in containment and Accident Management Code for CANDU

    International Nuclear Information System (INIS)

    Kim, S. B.; Kim, D. H.; Song, Y. M.

    2011-08-01

    For licensing of nuclear power plant (NPP) construction and operation, hydrogen combustion and the hydrogen mitigation system in the containment are among the important safety issues. Hydrogen safety and its control for the new NPPs (Shin-Wolsong 1 and 2, Shin-Ulchin 1 and 2) have been evaluated in detail using the 3-dimensional analysis code GASFLOW. Experimental and computational studies on hydrogen combustion, together with participation in OECD/NEA programs such as THAI and ISP-49, secure the capability to resolve hydrogen safety and control issues for the domestic nuclear power plants. ISAAC 4.0, which has been developed for the assessment of severe accident management at CANDU plants, was delivered to the regulatory body (KINS) for the assessment of the severe accident management guidelines (SAMG) for Wolsong units 1 to 4, which are scheduled to be submitted to KINS. Models for the severe accident management strategy were newly added and the graphic simulator, CAVIAR, was coupled to the code. In addition, the ISAAC computer code is anticipated to serve as a platform for the development and maintenance of the Wolsong plant risk monitor and the Wolsong-specific SAMG

  13. Enhancing Chemical Inventory Management in Laboratory through a Mobile-Based QR Code Tag

    Science.gov (United States)

    Shukran, M. A. M.; Ishak, M. S.; Abdullah, M. N.

    2017-08-01

    The demand for an inventory management system that can provide much useful information from a single scan has made laboratory inventory management using barcode technology more difficult. Since barcode technology lacks the ability to overcome this problem and cannot provide the information needed to manage the chemicals in the laboratory, employing QR code technology is the best solution. In this research, the main idea is to develop a standalone application running with its own database that is periodically synchronized with the inventory software hosted on a computer connected to a specialized network. The first step in establishing this centralized system was to determine all inventory available in the chemical laboratory by referring to the documented data in order to develop the database. Several customizations and enhancements were made to the open source QR code technology to ensure the developed application is dedicated to its main purposes. At the end of the research, it was proven that the system is able to track the position of all inventory items and show real-time information about the scanned chemical labels. This paper gives an overview of the QR tag inventory system that was developed and its implementation at the National Defence University of Malaysia's (NDUM) chemical laboratory.
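
The advantage of QR over barcodes described above is payload capacity: a QR tag can carry a whole structured record rather than a bare ID. The paper does not publish its label schema, so the JSON payload below is a hypothetical stand-in showing how a scanned tag could update an in-memory inventory:

```python
import json

# Hypothetical QR payload schema (invented for illustration): a JSON
# object carrying the chemical's id, name, storage location and quantity.
inventory = {}

def register_scan(qr_text):
    """Parse a scanned QR payload and upsert the inventory record."""
    record = json.loads(qr_text)
    inventory[record["id"]] = record
    return record

scan = '{"id": "CHM-0042", "name": "acetone", "location": "cab-3/shelf-2", "qty_ml": 500}'
register_scan(scan)
print(inventory["CHM-0042"]["location"])  # prints cab-3/shelf-2
```

A one-dimensional barcode would only carry the `id` field, forcing a database lookup for everything else; the QR payload makes the scan self-describing, which is the property the abstract relies on.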

  14. Optimal management of primary retroperitoneal sarcoma: an update.

    Science.gov (United States)

    Miah, Aisha B; Hannay, Jonathan; Benson, Charlotte; Thway, Khin; Messiou, Christina; Hayes, Andrew J; Strauss, Dirk C

    2014-05-01

    Soft tissue sarcomas are a group of heterogeneous neoplasms with more than 50 histological subtypes exhibiting major differences in terms of pathogenesis, genetic alterations and clinical behavior. Sarcomas represent approximately 1% of malignancies, with retroperitoneal sarcomas representing 10-15% of all soft tissue sarcomas. Surgery is currently the only modality which offers the chance of cure. Surgery for retroperitoneal sarcomas presents specific challenges due to their location in a complex space surrounded by vital structures and visceral organs, often prohibiting resection with wide margins. Furthermore, even after complete resection, local recurrence is common and the leading cause of death. In this article the authors describe the initial investigations, prognostic factors and optimal surgical management. The evidence and current research regarding the role of multimodality treatment are reviewed and discussed.

  15. Using combinatorial problem decomposition for optimizing plutonium inventory management

    Energy Technology Data Exchange (ETDEWEB)

    Niquil, Y.; Gondran, M. [Electricite de France (EDF), 92 - Clamart (France). Direction des Etudes et Recherches; Voskanian, A. [Electricite de France (EDF), 92 - Clamart (France). Direction des Etudes et Recherches]|[Paris-11 Univ., 91 - Orsay (France). Lab. de Recherche en Informatique

    1997-03-01

    Plutonium inventory management optimization can be modeled as a very large 0-1 linear program. To solve it, problem decomposition is necessary, since other classic techniques are not efficient at such a size. The first decomposition consists in favoring the constraints that are the most difficult to satisfy and the variables that have the highest influence on the cost: fortunately, both correspond to stock output decisions. The second decomposition consists in mixing continuous linear program solving and integer linear program solving. Besides, the first decisions to be taken are systematically favored, for they are based on data considered to be reliable, whereas the data supporting later decisions are known with less accuracy and confidence. (author) 7 refs.
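
To see what a 0-1 program of this shape looks like, here is a toy five-variable instance, with invented masses and costs that have nothing to do with the EDF model, solved by exhaustive enumeration. The decomposition described in the abstract exists precisely because the real instance is far too large for this:

```python
from itertools import product

# Toy 0-1 program: choose which batches to ship (x_i in {0, 1}) so the
# shipped mass meets a demand of at least `need`, minimizing handling cost.
mass = [4, 3, 5, 2, 6]   # batch masses (invented)
cost = [7, 5, 9, 3, 11]  # handling cost per batch (invented)
need = 10

best = min(
    (sum(c * x for c, x in zip(cost, xs)), xs)
    for xs in product((0, 1), repeat=len(mass))  # all 2**5 assignments
    if sum(m * x for m, x in zip(mass, xs)) >= need
)
print(best)  # (minimum cost, chosen batches)
```

Exhaustion scales as 2^n, which is why, at the sizes mentioned in the abstract, one instead favors the binding constraints and high-cost variables first and mixes continuous (LP-relaxation) and integer solves.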

  16. Optimization Under Uncertainty for Management of Renewables in Electricity Markets

    DEFF Research Database (Denmark)

    Zugno, Marco

    This thesis deals with the development and application of models for decision-making under uncertainty to support the participation of renewables in electricity markets. The output of most renewable sources, e.g., wind, is intermittent and, furthermore, can only be predicted with limited accuracy. As a result of their non-dispatchable and stochastic nature, the management of renewables poses new challenges compared to conventional sources of electricity. Focusing in particular on short-term electricity markets, both the trading activities of market participants (producers, retailers) … In a similar setup, the optimal trading (and pricing) problem for a retailer connected to flexible consumers is considered. Finally, market and system operators are challenged by the increasing penetration of renewables, which puts stress on markets that were designed to accommodate a generation mix largely …

  17. Opportunities of Optimization in Administrative Structures for Efficient Management

    Directory of Open Access Journals (Sweden)

    Venelin Terziev

    2017-12-01

    This paper presents a study of administrative structures aimed at optimizing activities and overall management, using the example of the Bulgarian Commission for Protection against Discrimination. It aims at identifying duplicated functions in the organization under study. The main tasks of the analysis are to present the basic findings and conclusions on the strongest sides and the fields for improvement regarding the relevance, effectiveness and efficiency of the administration of the Commission for Protection against Discrimination in Bulgaria. The following areas are thoroughly and critically analyzed: relevance of the functions and efficiency of the activity. As a result of the study, a Strategy for Organizational Development and a Training Plan have been drafted.

  18. A discrete optimization method for nuclear fuel management

    International Nuclear Information System (INIS)

    Argaud, J.P.

    1993-04-01

    Nuclear loading pattern elaboration can be seen as a combinatorial optimization problem of tremendous size and with non-linear cost functions, for which searches are always numerically expensive. After a brief introduction to the main aspects of nuclear fuel management, this note presents a new idea for treating the combinatorial problem by using information included in the gradient of a cost function. The method is to choose, by direct observation of the gradient, the most interesting changes in fuel loading patterns. An example is then developed to illustrate an operating mode of the method, and finally, connections with simulated annealing and genetic algorithms are described as an attempt to improve search processes. (author). 1 fig., 16 refs
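
The central idea, ranking discrete pattern changes by gradient information instead of evaluating each one, can be sketched on a toy one-dimensional "loading pattern". The peaking-penalty cost and all numbers below are invented for illustration and are not the note's actual cost function:

```python
# Toy gradient-guided discrete search: rank single-swap moves of a
# loading pattern by the first-order cost estimate
#   dJ ~ (g_i - g_j) * (v_j - v_i)
# and apply the most promising swap, instead of evaluating every swap.

def cost(pattern):
    # Stand-in cost: penalize "power peaking", modeled as the squared
    # deviation of each value from the mean of its two (circular) neighbors.
    n = len(pattern)
    return sum((pattern[i] - (pattern[i - 1] + pattern[(i + 1) % n]) / 2) ** 2
               for i in range(n))

def gradient(pattern, eps=1e-6):
    # One-sided finite-difference gradient of the cost.
    g = []
    for i in range(len(pattern)):
        p = list(pattern)
        p[i] += eps
        g.append((cost(p) - cost(pattern)) / eps)
    return g

def best_swap(pattern):
    g = gradient(pattern)
    # First-order change of the cost if the values at i and j are swapped.
    est = lambda i, j: (g[i] - g[j]) * (pattern[j] - pattern[i])
    return min(((i, j) for i in range(len(pattern))
                for j in range(i + 1, len(pattern))), key=lambda t: est(*t))

pat = [3.1, 1.8, 4.2, 2.4, 3.7, 2.0]   # hypothetical assembly "enrichments"
i, j = best_swap(pat)
pat[i], pat[j] = pat[j], pat[i]
print(i, j, round(cost(pat), 3))
```

One gradient evaluation (n cost calls) ranks all n(n-1)/2 candidate swaps at once; evaluating each swap directly would cost a full core calculation per candidate, which is the expense the note's method avoids.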

  19. Beyond bureaucracy and entrepreneurialism:examining the multiple discursive codes informing the work, careers and subjectivities of management graduates

    OpenAIRE

    Loacker, Bernadette Isabel; Sliwa, Martyna

    2016-01-01

    This paper examines how discursive codes and demands associated with ‘bureaucratic and entrepreneurial regimes’ of work and career organization shape the work, careers and subjectivities of management graduates. The study is based on an analysis of 30 narratives of management professionals who graduated from an Austrian business school in the early 1970s or 2000s. Its insights suggest that variegated discursive codes manifest in the graduates’ articulated professional practices and subjectivi...

  20. Optimal Bidding Strategy for Renewable Microgrid with Active Network Management

    Directory of Open Access Journals (Sweden)

    Seung Wan Kim

    2016-01-01

    Active Network Management (ANM) enables a microgrid to optimally dispatch the active/reactive power of its Renewable Distributed Generation (RDG) and Battery Energy Storage System (BESS) units in real time. Thus, a microgrid with high penetration of RDGs can handle their uncertainty and variability and achieve stable operation using ANM. However, the actual power flow in the line connecting the main grid and the microgrid may deviate significantly from the day-ahead bids if the bids are determined without consideration of the real-time adjustment through ANM, which leads to a substantial imbalance cost. Therefore, this study proposes a formulation for obtaining an optimal bid which reflects the change of power flow in the connecting line due to real-time adjustment using ANM. The proposed formulation maximizes the expected profit of the microgrid considering various network and physical constraints. The effectiveness of the proposed bidding strategy is verified through simulations with a 33-bus test microgrid. The simulation results show that the proposed strategy improves the expected operating profit by reducing the imbalance cost to a greater degree than a basic bidding strategy that does not consider ANM.
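
The trade-off between day-ahead revenue and imbalance cost can be illustrated with a scenario-based toy model. This newsvendor-style sketch uses invented prices and scenarios and omits the paper's network constraints and ANM redispatch entirely:

```python
# Toy bidding problem: choose a day-ahead bid (MWh) for a renewable
# microgrid to maximize expected profit when shortfalls are bought back
# at a premium and surpluses are sold at a discount. All numbers invented.
scenarios = [(0.2, 12.0), (0.5, 20.0), (0.3, 28.0)]  # (probability, delivered MWh)
price_da, price_short, price_long = 50.0, 65.0, 35.0  # EUR/MWh

def expected_profit(bid):
    total = 0.0
    for prob, energy in scenarios:
        revenue = price_da * bid
        if energy < bid:          # shortfall bought back at a premium
            revenue -= price_short * (bid - energy)
        else:                     # surplus sold at a discount
            revenue += price_long * (energy - bid)
        total += prob * revenue
    return total

best_bid = max((b / 10.0 for b in range(0, 301)), key=expected_profit)
print(best_bid, expected_profit(best_bid))
```

With these numbers, the optimal bid sits at the delivery quantile where the cumulative probability crosses the critical ratio (price_da - price_long) / (price_short - price_long), the standard newsvendor condition; the grid search recovers the same point.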

  1. Technical and economic optimization study for HLW waste management

    International Nuclear Information System (INIS)

    Deffes, A.

    1989-01-01

    This study was conducted to assess the technical and economic aspects of high level waste (HLW) management with the objective of optimizing the interim storage duration and the dimensions of the underground repository site. The procedure consisted in optimizing the economic criterion under specified constraints. The results are intended to identify trends and guide the choice from among available options; simple and highly flexible models were therefore used in this study, and only nearfield thermal constraints were taken into consideration. Because of the present uncertainty on the physicochemical properties of the repository environment and on the unit cost figures, this study focused on developing a suitable method rather than on obtaining definitive results. With the physical and economic data bases used for the two media investigated (granite and salt) the optimum values found show that it is advisable to minimize the interim storage time, and that the geological repository should feature a high degree of spatial dilution. These results depend to a considerable extent on the assumption of high interim storage costs

  2. Multipurpose Water Reservoir Management: An Evolutionary Multiobjective Optimization Approach

    Directory of Open Access Journals (Sweden)

    Luís A. Scola

    2014-01-01

    The reservoirs that feed large hydropower plants should be managed so as to provide other uses for the water resources. Those uses include, for instance, flood control and avoidance, irrigation, and navigability of the rivers. This work presents an evolutionary multiobjective optimization approach to the study of multiple water usages in multiple interlinked reservoirs, including both power generation objectives and objectives not related to energy generation. The classical evolutionary algorithm NSGA-II is employed as the basic multiobjective optimization machinery, modified in order to cope with specific problem features. The case studies, which include the analysis of a problem involving a navigability objective on the river, are tailored to illustrate the usefulness of the data generated by the proposed methodology for decision-making on the problem of operation planning of multiple reservoirs with multiple usages. It is shown that it is even possible to use the generated data to determine the cost of any new usage of the water, in terms of the opportunity cost measured on the revenues related to electric energy sales.
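
At the core of NSGA-II is the Pareto-dominance test, which is what lets the algorithm return a set of trade-off solutions rather than a single optimum. A minimal sketch, with invented objective pairs (e.g. energy shortfall vs. flood risk for candidate release plans) and none of NSGA-II's crowding distance or genetic operators:

```python
# Minimal Pareto-front extraction; both objectives are to be minimized.

def dominates(a, b):
    """True if solution a is at least as good as b in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (energy shortfall, flood risk) values for five release plans:
plans = [(3.0, 9.0), (4.0, 4.0), (6.0, 2.0), (5.0, 5.0), (8.0, 1.0)]
print(pareto_front(plans))  # prints [(3.0, 9.0), (4.0, 4.0), (6.0, 2.0), (8.0, 1.0)]
```

Plan (5.0, 5.0) is dropped because (4.0, 4.0) beats it on both objectives; the surviving front is exactly the kind of trade-off data the abstract proposes handing to decision-makers.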

  3. Optimizing Patient Management and Adherence for Children Receiving Growth Hormone.

    Science.gov (United States)

    Acerini, Carlo L; Wac, Katarzyna; Bang, Peter; Lehwalder, Dagmar

    2017-01-01

    Poor adherence with growth hormone (GH) therapy has been associated with worse clinical outcomes, which in children relates specifically to their linear growth and loss of quality of life. The "360° GH in Europe" meeting, held in Lisbon, Portugal, in June 2016 and funded by Merck KGaA (Germany), examined many aspects of GH diseases. The three sessions, entitled "Short Stature Diagnosis and Referral," "Optimizing Patient Management," and "Managing Transition," each benefited from three guest speaker presentations, followed by an open discussion, and are reported as a manuscript authored by the speakers. Reported here is a summary of the proceedings of the second session, which reviewed the determinants of GH therapy response, factors affecting GH therapy adherence and the development of innovative technologies to improve GH treatment in children. Response to GH therapy varies widely, particularly in regard to the underlying diagnosis, although there is little consensus on the definition of a poor response. If the growth response is seen to be less than expected, the possible reasons should be discussed with patients and their parents, including compliance with the therapy regimen. Understanding and addressing the multiple factors that influence adherence, in order to optimize GH therapy, requires a multi-disciplinary approach. Because therapy continues over many years, various healthcare professionals will be involved at different periods of the patient's journey. The role of the injection device for GH therapy, frequent monitoring of response, and patient support are all important for maintaining adherence. New injection devices are incorporating electronic technologies for automated monitoring and recording of clinically relevant information on injections. Study results are indicating that such devices can at least maintain GH adherence; however, acceptance of novel devices needs to be assessed and there remains an on-going need for innovations.

  4. Optimizing Patient Management and Adherence for Children Receiving Growth Hormone

    Directory of Open Access Journals (Sweden)

    Carlo L. Acerini

    2017-11-01

    Full Text Available Poor adherence with growth hormone (GH) therapy has been associated with worse clinical outcomes, which in children relates specifically to their linear growth and loss of quality of life. The “360° GH in Europe” meeting, held in Lisbon, Portugal, in June 2016 and funded by Merck KGaA (Germany), examined many aspects of GH diseases. The three sessions, entitled “Short Stature Diagnosis and Referral,” “Optimizing Patient Management,” and “Managing Transition,” each benefited from three guest speaker presentations, followed by an open discussion, and are reported as a manuscript authored by the speakers. Reported here is a summary of the proceedings of the second session, which reviewed the determinants of GH therapy response, factors affecting GH therapy adherence and the development of innovative technologies to improve GH treatment in children. Response to GH therapy varies widely, particularly in regard to the underlying diagnosis, although there is little consensus on the definition of a poor response. If the growth response is seen to be less than expected, the possible reasons should be discussed with patients and their parents, including compliance with the therapy regimen. Understanding and addressing the multiple factors that influence adherence, in order to optimize GH therapy, requires a multi-disciplinary approach. Because therapy continues over many years, various healthcare professionals will be involved at different periods of the patient’s journey. The role of the injection device for GH therapy, frequent monitoring of response, and patient support are all important for maintaining adherence. New injection devices are incorporating electronic technologies for automated monitoring and recording of clinically relevant information on injections. Study results are indicating that such devices can at least maintain GH adherence; however, acceptance of novel devices needs to be assessed and there remains an on-going need for innovations.

  5. Burn-up function of fuel management code for aqueous homogeneous reactors and its validation

    International Nuclear Information System (INIS)

    Wang Liangzi; Yao Dong; Wang Kan

    2011-01-01

    Fuel Management Code for Aqueous Homogeneous Reactors (FMCAHR) is developed based on the Monte Carlo transport method to analyze the physics characteristics of aqueous homogeneous reactors. FMCAHR is capable of resonance treatment, critical rod height searches, thermal-hydraulic parameter calculation, radiolytic-gas bubble calculation and burn-up calculation. This paper introduces the theoretical model and scheme of its burn-up function, and then compares its calculation results with benchmarks and with DRAGON's burn-up results, which confirms its burn-up computing precision and its applicability to burn-up calculation and analysis for aqueous solution reactors. (authors)
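The record does not reproduce FMCAHR's equations, but the point-depletion balance that any burn-up module of this kind integrates can be sketched in a few lines. This is a minimal one-nuclide illustration, not the code's actual solver; the cross-section, flux, and time values are illustrative only:

```python
import math

def deplete(n0, sigma_f, phi, t):
    """Analytic one-group point depletion: dN/dt = -sigma_f * phi * N.

    n0      : initial nuclide density [atoms/cm^3]
    sigma_f : one-group fission cross section [cm^2]
    phi     : scalar flux [n/cm^2/s]
    t       : irradiation time [s]
    """
    return n0 * math.exp(-sigma_f * phi * t)

def deplete_euler(n0, lam_eff, t, steps=1000):
    """Explicit-Euler integration of the same equation, to cross-check
    the analytic solution (lam_eff = sigma_f * phi)."""
    n, dt = n0, t / steps
    for _ in range(steps):
        n -= lam_eff * n * dt
    return n
```

A real burn-up solver couples many such equations (a Bateman chain) to the Monte Carlo flux solution; the sketch only shows the time-integration idea.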

  6. POLA PENGELOLAAN SANITASI DI PERKAMPUNGAN BANTARAN SUNGAI CODE, YOGYAKARTA (Pattern of Sanitation Management in Code Riverside Settlements, Yogyakarta

    Directory of Open Access Journals (Sweden)

    Atyanto Dharoko

    2005-11-01

    Full Text Available ABSTRAK (translated from Indonesian): The Code riverside is part of the central area of Yogyakarta and is filled with densely populated kampungs. The way of life of the kampung communities along the Code River has become integrated with the social and economic life of Yogyakarta's urban community. The problem that arises is the low quality of infrastructure, especially sanitation facilities, owing to the limited economic capacity of the community and the steep topography. As a result, the river serves as the final destination for sanitation waste, without any prior treatment. This study concludes that a communal sanitation pattern is more acceptable to the community on social and economic grounds and given the steep terrain. In the future, this system should form the basis for the technical development of riverside sanitation systems in order to achieve high sustainability.   ABSTRACT: Code riverside is part of the central business district in Yogyakarta, composed of densely populated kampungs. The community way of life in the kampungs has been successfully integrated with the social-economic life of the urban community. The crucial problem faced by the community is the lack of infrastructure facilities, especially sanitation. This situation is very much related to the social-economic constraints of the community and to the topographical situation as a physical constraint. Finally, sanitation disposals have to be discharged into the Code River without pre-processing. The study concludes that a communal sanitation system is the most acceptable system based on socio-economic and topographical constraints. In the future, the communal sanitation system may become a basic technical consideration in developing sanitation systems in the riverside settlements and achieving sustainability.

  7. Flow analysis and port optimization of geRotor pump using commercial CFD code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byung Jo; Seong, Seung Hak; Yoon, Soon Hyun [Pusan National Univ., Pusan (Korea, Republic of)

    2005-07-01

    The geRotor pump is widely used in the automotive industry for fuel lift, injection, engine oil lubrication, and also in transmission systems. A CFD study of the pump, which is characterized by transient flow with moving rotor boundaries, has been performed to obtain the optimum shape of the inlet/outlet port of the pump. Various port shapes have been tested to investigate how they affect flow rate and fluctuations. Based on the parametric study, an optimum shape has been determined for maximum flow rate and minimum fluctuations, and the result has been confirmed by experiments. For the optimization, the Taguchi method has been adopted. The groove shape has been found to be the most important of the several parameters selected in relation to flow rate and fluctuations.
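The abstract names the Taguchi method without detail. Its core computation is the signal-to-noise (S/N) ratio used to rank parameter levels, which is simple to sketch; the function names and data below are illustrative, not from the paper:

```python
import math

def sn_larger_is_better(ys):
    """Taguchi S/N ratio when a larger response is better
    (e.g. flow rate): -10*log10(mean(1/y^2))."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in ys) / len(ys))

def sn_smaller_is_better(ys):
    """Taguchi S/N ratio when a smaller response is better
    (e.g. flow fluctuation): -10*log10(mean(y^2))."""
    return -10.0 * math.log10(sum(y**2 for y in ys) / len(ys))

def main_effect(levels, sns):
    """Average S/N ratio per factor level over an orthogonal-array run;
    the level with the highest mean is taken as the optimum setting."""
    out = {}
    for lv, sn in zip(levels, sns):
        out.setdefault(lv, []).append(sn)
    return {lv: sum(v) / len(v) for lv, v in out.items()}
```

In a study like the one described, each orthogonal-array row would be one CFD run; `main_effect` then picks, per geometric factor (e.g. groove shape), the level with the best mean S/N.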

  8. Optimal coding-decoding for systems controlled via a communication channel

    Science.gov (United States)

    Yi-wei, Feng; Guo, Ge

    2013-12-01

    In this article, we study the problem of controlling plants over a signal-to-noise ratio (SNR) constrained communication channel. Unlike previous research, this article emphasises the importance of the actual channel model and coder/decoder in the study of networked control performance. Our major objectives include coder/decoder design for an additive white Gaussian noise (AWGN) channel with both the standard network configuration and the Youla-parameter network architecture. We find that the optimal coder and decoder can be realised for the different network configurations. The results are useful in determining the minimum channel capacity needed to stabilise plants over communication channels. The coder/decoder obtained can be used to analyse the effect of uncertainty on the channel capacity. An illustrative example is provided to show the effectiveness of the results.
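The "minimum channel capacity" mentioned above connects to the well-known data-rate theorem (a standard result, not a claim of this article): stabilising a discrete-time plant requires capacity exceeding the sum of log2|p| over its unstable poles. A sketch under that assumption, with the AWGN conversion taken from C = 0.5*log2(1+SNR):

```python
import math

def min_capacity_bits(poles):
    """Data-rate-theorem lower bound (bits per sample) for stabilising a
    discrete-time plant: sum of log2|p| over poles with |p| > 1."""
    return sum(math.log2(abs(p)) for p in poles if abs(p) > 1.0)

def min_snr_awgn(poles):
    """Equivalent minimum SNR on a memoryless AWGN channel, inverting
    C = 0.5 * log2(1 + SNR) per sample."""
    return 2.0 ** (2.0 * min_capacity_bits(poles)) - 1.0
```

For example, a single unstable pole at z = 2 needs at least 1 bit per sample, i.e. an SNR of at least 3 on the AWGN channel under these assumptions.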

  9. Optimized logarithmic phase masks used to generate defocus invariant modulation transfer function for wavefront coding system.

    Science.gov (United States)

    Zhao, Hui; Li, Yingcai

    2010-08-01

    In a previous Letter [Opt. Lett. 33, 1171 (2008)], we proposed an improved logarithmic phase mask by making modifications to the original one designed by Sherif. However, further studies in another paper [Appl. Opt. 49, 229 (2010)] show that even when the Sherif mask and the improved one are optimized, their corresponding defocused modulation transfer functions (MTFs) are still not stable with respect to focus errors. By further modifying their phase profiles, we therefore design another two logarithmic phase masks that exhibit a more stable defocused MTF. However, with the defocus-induced phase effect considered, we find that the performance of the two masks proposed in this Letter is better than that of the Sherif mask, but worse than that of our previously proposed phase mask, according to the Hilbert-space angle.
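For readers new to wavefront coding, the defocused MTF these masks are designed to stabilise can be computed from a pupil autocorrelation. The 1-D sketch below accepts any mask profile and a quadratic defocus term; it is a generic illustration and does not reproduce the Letter's specific logarithmic profiles:

```python
import cmath

def defocused_mtf(phase, psi, n=256):
    """1-D defocused MTF from a generalized pupil function.

    phase : callable, mask phase phi(x) over the normalised pupil x in [-1, 1]
    psi   : defocus parameter (radians of quadratic phase at the pupil edge)
    Returns |OTF| (normalised to 1 at zero frequency), computed as the
    autocorrelation of the pupil function P(x) = exp(j*(phi(x) + psi*x^2)).
    """
    xs = [(2.0 * i / (n - 1)) - 1.0 for i in range(n)]
    p = [cmath.exp(1j * (phase(x) + psi * x * x)) for x in xs]
    otf = []
    for s in range(n):  # shift index = spatial frequency sample
        acc = sum(p[i] * p[i - s].conjugate() for i in range(s, n))
        otf.append(abs(acc) / n)
    return [v / otf[0] for v in otf]
```

With a zero mask and zero defocus this reduces to the classic triangular diffraction-limited MTF; sweeping `psi` for a candidate mask shows how stable (or not) the defocused MTF is, which is the comparison the Letter makes.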

  10. ActiWiz – optimizing your nuclide inventory at proton accelerators with a computer code

    CERN Document Server

    Vincke, Helmut

    2014-01-01

    When operating an accelerator one always faces unwanted, but inevitable, beam losses. These result in activation of adjacent material, which in turn has an obvious impact on safety and handling constraints. One of the key parameters governing activation is the chemical composition of the material, which can often be optimized in this respect. To make this task accessible to non-expert users as well, the ActiWiz software has been developed at CERN. Based on a large number of generic FLUKA Monte Carlo simulations, the software applies a specifically developed risk assessment model to support decision makers, especially during the design phase as well as in routine operational work in the domain of radiation protection.

  11. The design of optimal electric power demand management contracts

    Science.gov (United States)

    Fahrioglu, Murat

    1999-11-01

    Our society derives a quantifiable benefit from electric power. In particular, forced outages or blackouts have enormous consequences on society, one of which is loss of economic surplus. Electric utilities try to provide a reliable supply of electric power to their customers. Maximum customer benefit derives from minimum cost and sufficient supply availability. Customers willing to share in "availability risk" can derive further benefit by participating in controlled outage programs. Specifically, whenever utilities foresee dangerous loading patterns, there is a need for a rapid reduction in demand either system-wide or at specific locations. The utility needs to get relief in order to solve its problems quickly and efficiently. This relief can come from customers who agree to curtail their loads upon request in exchange for an incentive fee. This thesis shows how utilities can get efficient load relief while maximizing their economic benefit. This work also shows how estimated customer cost functions can be calibrated, using existing utility data, to help in designing efficient demand management contracts. In order to design such contracts, optimal mechanism design is adopted from "Game Theory" and applied to the interaction between a utility and its customers. The idea behind mechanism design is to design an incentive structure that encourages customers to sign up for the right contract and reveal their true value of power. If a utility has demand management contracts with customers at critical locations, most operational problems can be solved efficiently. This thesis illustrates how locational attributes of customers incorporated into demand management contract design can have a significant impact in solving system problems. This kind of demand management contract can also be used by an Independent System Operator (ISO). During times of congestion a loss of economic surplus occurs. When the market is too slow or cannot help relieve congestion, demand management contracts can provide the needed relief.
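The mechanism-design idea described above rests on incentive compatibility: each customer type must prefer the contract intended for it, so signing up reveals its true value of power. A minimal, hypothetical check of that condition (a contract here is a `(curtailment_kW, payment)` pair and a type is its per-kW interruption cost; this is not the thesis' actual model):

```python
def is_incentive_compatible(contracts, costs):
    """Return True if every customer type prefers its own contract.

    contracts : list of (curtailment_kw, payment) pairs, one per type
    costs     : per-kW interruption cost of each type (same ordering)
    Type t's payoff from contract s is payment_s - cost_t * curtailment_s;
    incentive compatibility requires this to be maximised at s == t.
    """
    for t, c in enumerate(costs):
        own = contracts[t][1] - c * contracts[t][0]
        if any(contracts[s][1] - c * contracts[s][0] > own + 1e-12
               for s in range(len(contracts))):
            return False
    return True
```

An optimal mechanism would choose the payments to satisfy this check (plus a participation constraint) at minimum cost to the utility; the sketch only verifies the constraint, it does not solve for the optimum.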

  12. Adaptive multi-objective Optimization scheme for cognitive radio resource management

    KAUST Repository

    Alqerm, Ismail; Shihada, Basem

    2014-01-01

    configuration by exploiting optimization and machine learning techniques. In this paper, we propose an Adaptive Multi-objective Optimization Scheme (AMOS) for cognitive radio resource management to improve spectrum operation and network performance

  13. An enhancement of selection and crossover operations in real-coded genetic algorithm for large-dimensionality optimization

    Energy Technology Data Exchange (ETDEWEB)

    Kwak, Noh Sung; Lee, Jongsoo [Yonsei University, Seoul (Korea, Republic of)

    2016-01-15

    The present study aims to implement a new selection method and a novel crossover operation in a real-coded genetic algorithm. The proposed selection method facilitates the establishment of a successively evolved population by combining several subpopulations: an elitist subpopulation, an off-spring subpopulation and a mutated subpopulation. A probabilistic crossover is performed based on the measure of probabilistic distance between the individuals. The concept of ‘allowance’ is suggested to describe the level of variance in the crossover operation. A number of nonlinear/non-convex functions and engineering optimization problems are explored to verify the capacities of the proposed strategies. The results are compared with those obtained from other genetic and nature-inspired algorithms.
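The population structure the abstract describes (elitist, offspring, and mutated subpopulations, with a crossover spread governed by an 'allowance' parameter) can be sketched as one generation of a real-coded GA. The operators below are illustrative stand-ins, not the authors' exact selection and crossover definitions:

```python
import random

def evolve(fitness, pop, elite_k=2, allowance=0.5, mut_rate=0.1, span=1.0):
    """One generation of a real-coded GA (minimisation).

    The next population combines three subpopulations, echoing the paper:
    an elitist subpopulation (best individuals kept unchanged), an
    offspring subpopulation (blend-type crossover whose spread is set by
    'allowance'), and a mutated subpopulation (Gaussian perturbations of
    elite individuals, scaled by mut_rate * span).
    """
    ranked = sorted(pop, key=fitness)
    elite = ranked[:elite_k]
    offspring = []
    while len(offspring) < len(pop) - 2 * elite_k:
        a, b = random.sample(ranked[: len(pop) // 2], 2)  # mate from top half
        child = [x + allowance * random.uniform(-1.0, 1.0) * (y - x)
                 for x, y in zip(a, b)]
        offspring.append(child)
    mutated = [[x + random.gauss(0.0, mut_rate * span)
                for x in random.choice(elite)]
               for _ in range(elite_k)]
    return elite + offspring + mutated
```

Because the elite are carried over unchanged, the best fitness is non-increasing from generation to generation, which makes the scheme easy to test on the nonlinear benchmark functions the paper mentions.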

  14. Optimal diagnosis, prevention, and management of periprosthetic joint infection

    Directory of Open Access Journals (Sweden)

    Tafer N

    2015-01-01

    Full Text Available Nathalie Tafer,1 Wilson Belaieff,1 Céline Cuérel,1 Matthieu Zingg,1 Pierre Hoffmeyer,1 Ilker Uçkay1,2 1Orthopedic Surgery Department, 2Division of Infectious Diseases, University of Geneva Hospitals and Medical School, Geneva, Switzerland Abstract: The pace of the aging population is steadily rising worldwide with a parallel increase in the demand for joint replacement procedures. With the increasing number of patients undergoing arthroplasty, there is also an increased risk for arthroplasty infection that may lead to severe complications, poorer outcome, and substantial extra costs for health care systems. Current rates of prosthetic joint infection are not dramatically different from the 1960s or 1970s, but some general principles are now better defined, and their management has been studied extensively during the past decades, thus resulting in a change in clinical practice. The purpose of this review is to summarize important principles of prosthetic joint infection to guide the clinician and to contribute to the optimal diagnosis, prevention, and management of periprosthetic joint infections. Keywords: arthroplasty infection, antibiotic therapy, biofilm, surgery, prevention

  15. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Aguado, M.T.; Marcelles, I.

    1993-01-01

    Preservation of the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining lifetime management from the very beginning of plant operation. The methodology used in plant remaining lifetime management is essentially based on the use of standard inspections, surveillance and monitoring programs, and calculations such as thermal-stress and fracture mechanics analysis. The inspection techniques should be continuously optimized in order to be able to detect and dimension existing defects with the highest possible degree of accuracy. The information obtained during the inspection is combined with the historical data of the components: design, quality, operation, maintenance, and transients, and with the results of destructive testing, fracture mechanics and thermal fatigue analysis. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the highest possible degree of accuracy. The use of this methodology allows component repairs and replacements to be reduced or avoided and increases the safety levels and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period

  16. Chemical cleaning - essential for optimal steam generator asset management

    International Nuclear Information System (INIS)

    Ammann, Franz

    2009-01-01

    Accumulation of deposits in steam generators is intrinsic to the operation of pressurized water reactors. Such deposition leads to reduction of thermal performance, loss of component integrity and, in some cases, to power restrictions. Accordingly, removal of such deposits is an essential part of the asset management program for steam generators. Every plant has specific conditions, history and constraints which must be considered when planning and performing a chemical cleaning. Typical points are:
    - Constitution of the deposits or sludge
    - Sludge load
    - Sludge distribution in the steam generator
    - Existing or expected corrosion problems
    - Amount and tendency of fouling for waste treatment
    The strategy for chemical cleaning is developed from these points. The range of chemical cleaning treatments starts with very soft cleanings, which can remove approximately 100 kg per steam generator, and ends with full-scale, i.e. hard, cleanings, which can remove several thousand kilograms of deposits from a steam generator. Depending upon the desired goal for the operating plant and the steam generator material condition, the correct cleaning method can be selected. This requires flexible cleaning methods that can be adapted to the individual needs of a plant. Such customizing of chemical cleaning methods is a crucial factor for an optimized asset management program for steam generators in a nuclear power plant

  17. Insights for caribou/reindeer management using optimal foraging theory

    Directory of Open Access Journals (Sweden)

    Gary E. Belovsky

    1991-10-01

    Full Text Available Optimal foraging theory is useful to wildlife managers, because it helps explain the nutritional value of different habitats for wildlife species. Based upon nutritional value, the use of different habitats can be predicted, including how factors such as insect harassment, predation and migration might modify habitat selection. If habitat value and use can be understood, then changes in habitat availability which are of concern to wildlife managers can be assessed. The theory is used to address diet choice and habitat use of caribou/reindeer. Diet choice is examined in terms of lichen composition of the diet and is demonstrated to be a function of daily feeding time, food abundance and digestive capacity. The diet choice model is then used to assess the nutritional profitability of different habitats and which habitat should be preferred based upon nutritional profitability. Caribou/reindeer use of habitats is demonstrated to be easily modified by insect harassment and predation which change the nutritional profitability of habitats differentially. The same type of approach could be used to explain migratory behaviour; however, the needed parameter values are unavailable. The results of this analysis lead one to question some common conceptions about caribou/reindeer ecology.

  18. Advanced chemistry management system to optimize BWR chemistry control

    International Nuclear Information System (INIS)

    Maeda, K.; Nagasawa, K.

    2002-01-01

    BWR plant chemistry control has close relationships with nuclear safety, component reliability, radiation field management and fuel integrity, and advanced technology is required to improve chemistry control [1,3,6,7,10,11]. Toshiba has developed TACMAN (Toshiba Advanced Chemistry Management system) to support BWR chemistry control. TACMAN has been developed in response to utilities' long-standing requirements to maintain plant operation safety, reliability and cost benefit. The advanced technology built into TACMAN allows utilities to perform efficient chemistry control while preserving cost benefit. TACMAN is currently being used in response to the need for tools that plant chemists and engineers can use to optimize and monitor plant chemistry conditions continuously. If an incipient condition or anomaly is detected at an early stage, root-cause evaluation and immediate countermeasures can be provided. In particular, the expert system brings numerous competitive advantages, not only improving plant chemistry reliability but also standardizing and systematizing know-how, empirical knowledge and technologies in BWR chemistry. This paper presents the detailed functions of TACMAN and practical results from evaluating an actual plant. (authors)

  19. Optimal management of night eating syndrome: challenges and solutions

    Directory of Open Access Journals (Sweden)

    Kucukgoncu S

    2015-03-01

    Full Text Available Suat Kucukgoncu, Margaretta Midura, Cenk Tek Department of Psychiatry, Yale University, New Haven, CT, USA Abstract: Night Eating Syndrome (NES is a unique disorder characterized by a delayed pattern of food intake in which recurrent episodes of nocturnal eating and/or excessive food consumption occur after the evening meal. NES is a clinically important disorder due to its relationship to obesity, its association with other psychiatric disorders, and problems concerning sleep. However, NES often goes unrecognized by both health professionals and patients. The lack of knowledge regarding NES in clinical settings may lead to inadequate diagnoses and inappropriate treatment approaches. Therefore, the proper diagnosis of NES is the most important issue when identifying NES and providing treatment for this disorder. Clinical assessment tools such as the Night Eating Questionnaire may help health professionals working with populations vulnerable to NES. Although NES treatment studies are still in their infancy, antidepressant treatments and psychological therapies can be used for optimal management of patients with NES. Other treatment options such as melatonergic medications, light therapy, and the anticonvulsant topiramate also hold promise as future treatment options. The purpose of this review is to provide a summary of NES, including its diagnosis, comorbidities, and treatment approaches. Possible challenges addressing patients with NES and management options are also discussed. Keywords: night eating, obesity, psychiatric disorders, weight, depression

  20. A k-distribution-based radiation code and its computational optimization for an atmospheric general circulation model

    International Nuclear Information System (INIS)

    Sekiguchi, Miho; Nakajima, Teruyuki

    2008-01-01

    The gas absorption process scheme in the broadband radiative transfer code 'mstrn8', which is used to calculate atmospheric radiative transfer efficiently in a general circulation model, is improved. Three major improvements are made. The first is an update of the database of line absorption parameters and of the continuum absorption model. The second is a change to the definition of the selection rule for gas absorption used to choose which absorption bands to include. The last is an upgrade of the optimization method used to decrease the number of quadrature points used for numerical integration in the correlated k-distribution approach, thereby realizing higher computational efficiency without losing accuracy. The new radiation package, termed 'mstrnX', computes radiation fluxes and heating rates with errors less than 0.6 W/m^2 and 0.3 K/day, respectively, through the troposphere and the lower stratosphere for all standard AFGL atmospheres. A serious cold-bias problem of an atmospheric general circulation model using the ancestor code 'mstrn8' is almost resolved by the upgrade to 'mstrnX'
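The quadrature-point trade-off the abstract optimizes can be illustrated with a toy correlated k-distribution: sort the spectral absorption coefficients, bin the cumulative distribution into equal-weight quadrature intervals, and integrate the transmittance with one representative k per bin. This is a didactic sketch, not the mstrnX scheme (which optimizes point placement, not just point count):

```python
import math

def k_distribution_transmittance(k_values, absorber_amount, n_quad):
    """Band-mean transmittance via a k-distribution.

    k_values        : spectral absorption coefficients across the band
    absorber_amount : absorber path amount u, so T = mean(exp(-k*u))
    n_quad          : number of equal-weight quadrature bins in g-space
    (len(k_values) should be divisible by n_quad for this simple binning).
    Fewer quadrature points means cheaper but less accurate integration,
    which is the trade-off the paper's optimization addresses.
    """
    ks = sorted(k_values)  # sorting defines the cumulative g-coordinate
    n = len(ks)
    t, w = 0.0, 1.0 / n_quad
    for q in range(n_quad):
        seg = ks[q * n // n_quad:(q + 1) * n // n_quad]
        k_rep = sum(seg) / len(seg)  # bin-mean k as representative point
        t += w * math.exp(-k_rep * absorber_amount)
    return t
```

With one bin per spectral point the result is exact; with a tenth of the points the error stays small because exp(-k*u) varies smoothly in g-space, which is exactly why the k-distribution approach tolerates aggressive quadrature reduction.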

  1. WASA-BOSS. Development and application of Severe Accident Codes. Evaluation and optimization of accident management measures. Subproject E. Improvement of the lower head model in MELCOR and calculations in connection with the FUKUSHIMA accident. Final report; WASA-BOSS. Weiterentwicklung und Anwendung von Severe Accident Codes. Bewertung und Optimierung von Stoerfallmassnahmen. Teilprojekt E. Verbesserung des Lower Head-Modelles fuer MELCOR und MELCOR-Rechnungen zu Fukushima. Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Kretzschmar, Frank; Dietrich, Philipp; Gabriel, Stephan; Miassoedov, Alexei

    2016-12-15

    The knowledge of the key phenomena which govern the chronological sequence of a core melt accident is a crucial precondition for the development of SAMGs (Severe Accident Management Guidelines) to avoid and mitigate the radiological consequences for the population and the environment. In the frame of a dissertation, a new model has been coupled with MELCOR which describes the thermal interaction of a core melt with the RPV (reactor pressure vessel) wall in the lower plenum, allowing a better description of this phenomenon. The method for coupling external programs with MELCOR had already been developed and used in a former dissertation at KIT-IKET. The model has been validated by recalculating corresponding experiments in the LIVE facility. Afterwards, a defined accident scenario has been calculated for a German generic KONVOI power plant. Twelve months after the start of the project, a MELCOR input was developed using data delivered by the Ruhr University of Bochum (subproject ''Simulation des Unfalls in Fukushima-Daichi zur Bewertung des Stoerfall-Analysecodes ATHLET-CD''). The results of this simulation have contributed to the review of the current understanding of the Fukushima accident sequence. HZDR and KIT-IKET agreed in the course of the project that KIT-IKET would develop a MELCOR input for a German generic KONVOI power plant following an ATHLET-CD input of HZDR. Using this MELCOR input, a comparative analysis has been performed.

  2. Plant life management optimized utilization of existing nuclear power plants

    International Nuclear Information System (INIS)

    Watzinger, H.; Erve, M.

    1999-01-01

    For safe, reliable and economical nuclear power generation it is of central importance to understand, analyze and manage aging-related phenomena and to apply this information in the systematic utilization and, as necessary, extension of the service life of components and systems. An operator's overall approach to aging and plant life management which also improves performance characteristics can help to optimize plant operating economy. In view of the deregulation of the power generation industry, with its increased competition, nuclear power plants must today increasingly provide for or maintain a high level of plant availability and low power generating costs. This is a difficult challenge even for the newest, most modern plants, and as plants age they can only remain competitive if the operator adopts a strategic approach which takes the various aging-related effects into account on a plant-wide basis. The significance of aging and plant life management for nuclear power plants becomes apparent when looking at their age: by the year 2000 roughly fifty of the world's 434 commercial nuclear power plants will have been in operation for thirty years or more. According to the International Atomic Energy Agency, as many as 110 plants will have reached the thirty-year service mark by the year 2005. In many countries public opinion does not favor the construction of new nuclear power plants, and this is unlikely to change within the next ten years. New construction licenses cannot be expected, so that for economic and ecological reasons existing plants have to be operated to an unimpeachable standard. On the other hand, the deregulation of the power production market calls right now for analyses of plant lifetime, so that plants can be operated at a high technical and economical level until new nuclear power plants can be licensed and constructed. (author)

  3. Stochastic algorithm for channel optimized vector quantization: application to robust narrow-band speech coding

    International Nuclear Information System (INIS)

    Bouzid, M.; Benkherouf, H.; Benzadi, K.

    2011-01-01

    In this paper, we propose a stochastic joint source-channel scheme developed for efficient and robust encoding of spectral speech LSF parameters. The encoding system, named LSF-SSCOVQ-RC, is an LSF encoding scheme based on a reduced-complexity stochastic split vector quantizer optimized for a noisy channel. For transmissions over a noisy channel, we first show that our LSF-SSCOVQ-RC encoder outperforms the conventional LSF encoder designed with the split vector quantizer. After that, we applied the LSF-SSCOVQ-RC encoder (with weighted distance) to the robust encoding of the LSF parameters of the 2.4 kbit/s MELP speech coder operating over a noisy/noiseless channel. The simulation results show that the proposed LSF encoder, incorporated in the MELP coder, ensures better performance than the original MELP MSVQ of 25 bits/frame, especially when the transmission channel is highly disturbed. Indeed, we show that the LSF-SSCOVQ-RC yields a significant improvement in LSF encoding performance by ensuring reliable transmissions over a noisy channel.
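The defining idea of a channel-optimized VQ, which the scheme above builds on, is that the encoder minimises the channel-averaged distortion rather than the plain nearest-neighbour distortion, so that likely index errors still land on nearby codewords. A generic sketch of that encoding rule (not the paper's stochastic design procedure):

```python
def covq_encode(x, codebook, p_trans):
    """Channel-optimized VQ encoder.

    x        : input vector (list of floats)
    codebook : list of codeword vectors c_j
    p_trans  : p_trans[i][j] = probability index i is received as index j
    Chooses the index i minimising sum_j P(j|i) * ||x - c_j||^2,
    i.e. the expected end-to-end distortion over the noisy channel.
    """
    def expected_dist(i):
        return sum(p_trans[i][j] * sum((a - b) ** 2 for a, b in zip(x, c))
                   for j, c in enumerate(codebook))
    return min(range(len(codebook)), key=expected_dist)
```

On a noiseless channel (identity transition matrix) this reduces to ordinary nearest-neighbour quantization; on a noisy channel it may deliberately pick a "safer" index whose likely corruptions are less damaging, which is the robustness the paper exploits.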

  4. Virtual machine provisioning, code management, and data movement design for the Fermilab HEPCloud Facility

    Science.gov (United States)

    Timm, S.; Cooper, G.; Fuess, S.; Garzoglio, G.; Holzman, B.; Kennedy, R.; Grassano, D.; Tiradani, A.; Krishnamurthy, R.; Vinayagam, S.; Raicu, I.; Wu, H.; Ren, S.; Noh, S.-Y.

    2017-10-01

    The Fermilab HEPCloud Facility Project has as its goal to extend the current Fermilab facility interface to provide transparent access to disparate resources including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity to respond to peaks of demand without overprovisioning local resources. Full scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics Experiments, CMS and NOνA, at the scale of 58000 simultaneous cores. This paper describes the significant improvements that were made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions, and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontiersquid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various services provided by AWS to build the infrastructure (stack). We discuss the architecture and load testing benchmarks on the squid servers. We also describe various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions that were used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test, and our future plans to expand and improve the Fermilab HEP Cloud Facility.

  5. Virtual Machine Provisioning, Code Management, and Data Movement Design for the Fermilab HEPCloud Facility

    Energy Technology Data Exchange (ETDEWEB)

    Timm, S. [Fermilab; Cooper, G. [Fermilab; Fuess, S. [Fermilab; Garzoglio, G. [Fermilab; Holzman, B. [Fermilab; Kennedy, R. [Fermilab; Grassano, D. [Fermilab; Tiradani, A. [Fermilab; Krishnamurthy, R. [IIT, Chicago; Vinayagam, S. [IIT, Chicago; Raicu, I. [IIT, Chicago; Wu, H. [IIT, Chicago; Ren, S. [IIT, Chicago; Noh, S. Y. [KISTI, Daejeon

    2017-11-22

    The Fermilab HEPCloud Facility Project has as its goal to extend the current Fermilab facility interface to provide transparent access to disparate resources including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity to respond to peaks of demand without overprovisioning local resources. Full scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics Experiments, CMS and NOνA, at the scale of 58000 simultaneous cores. This paper describes the significant improvements that were made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions, and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontiersquid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various services provided by AWS to build the infrastructure (stack). We discuss the architecture and load testing benchmarks on the squid servers. We also describe various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions that were used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test, and our future plans to expand and improve the Fermilab HEP Cloud Facility.

  6. Developing Optimal Procedure of Emergency Outside Cooling Water Injection for APR1400 Extended SBO Scenario Using MARS Code

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Jong Rok; Oh, Seung Jong [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-10-15

    In this study, we examined optimal operator actions to mitigate an extended station blackout (SBO) using the MARS code. In particular, this paper focuses on analyzing the outside core cooling water injection scenario, with the aim of developing an optimal extended SBO procedure. Supplying outside emergency cooling water is the key feature of the flexible coping strategy in an extended SBO situation. An optimum strategy to maintain core cooling is developed for a typical extended SBO. The MARS APR1400 best-estimate model was used to find the optimal procedure, and the effect of RCP seal leakage was given particular consideration. The recent Fukushima accident shows the importance of mitigation capability against extended SBO scenarios. In Korea, all nuclear power plants have incorporated various measures against Fukushima-like events. For the APR1400 NPP, outside connectors are installed to inject cooling water using fire trucks or portable pumps. Using these connectors, outside cooling water can be provided to the reactor, the steam generators (SGs), the containment spray system, and the spent fuel pool. In the U.S., a similar approach was chosen to provide a diverse and flexible means to prevent fuel damage (core and SFP) in external-event conditions resulting in an extended loss of AC power and loss of the ultimate heat sink. Hence, the hardware necessary to cope with an extended SBO is already available for the APR1400. However, considering the complex and stressful conditions encountered by operators during an extended SBO, it is important to develop guidelines/procedures to best cope with the event.

  7. Redefining Secondary Forests in the Mexican Forest Code: Implications for Management, Restoration, and Conservation

    Directory of Open Access Journals (Sweden)

    Francisco J. Román-Dañobeytia

    2014-05-01

    Full Text Available The Mexican Forest Code establishes structural reference values to differentiate between secondary and old-growth forests and requires a management plan when secondary forests become old-growth and potentially harvestable forests. The implications of this regulation for forest management, restoration, and conservation were assessed in the context of the Calakmul Biosphere Reserve, which is located in the Yucatan Peninsula. The basal area and stem density thresholds currently used by the legislation to differentiate old-growth from secondary forests are 4 m2/ha and 15 trees/ha (trees with a diameter at breast height of >25 cm); however, our research indicates that these values should be increased to 20 m2/ha and 100 trees/ha, respectively. Given that a management plan is required when secondary forests become old-growth forests, many landowners avoid forest-stand development by engaging in slash-and-burn agriculture or cattle grazing. We present evidence that deforestation and land degradation may prevent the natural regeneration of late-successional tree species of high ecological and economic importance. Moreover, we discuss the results of this study in the light of an ongoing debate in the Yucatan Peninsula between policy makers, non-governmental organizations (NGOs), landowners and researchers regarding the modification of this regulation to redefine the concept of acahual (secondary forest) and to facilitate forest management and restoration with valuable timber tree species.

  8. The optimally sampled galaxy-wide stellar initial mass function. Observational tests and the publicly available GalIMF code

    Science.gov (United States)

    Yan, Zhiqiang; Jerabkova, Tereza; Kroupa, Pavel

    2017-11-01

    Here we present a full description of the integrated galaxy-wide initial mass function (IGIMF) theory in terms of optimal sampling and compare it with available observations. Optimal sampling is the method we use to discretize the IMF deterministically into stellar masses. Evidence indicates that nature may be closer to deterministic sampling, as observations suggest a smaller scatter of various relevant observables than random sampling would give, which may result from a high level of self-regulation during the star formation process. We document the variation of IGIMFs under various assumptions. The results of the IGIMF theory are consistent with the empirical relation between the total mass of a star cluster and the mass of its most massive star, and with the empirical relation between the star formation rate (SFR) of a galaxy and the mass of its most massive cluster. In particular, we note a natural agreement with the empirical relation between the IMF power-law index and the SFR of a galaxy. The IGIMF also results in a relation between the SFR of a galaxy and the mass of its most massive star such that, if there were no binaries, galaxies with sufficiently low SFRs should host no very massive stars. For the first time, we show optimally sampled galaxy-wide IMFs (OSGIMF) that mimic the IGIMF with an additional serrated feature. Finally, a Python module, GalIMF, is provided, allowing the calculation of the IGIMF and OSGIMF as functions of the galaxy-wide SFR and metallicity. A copy of the Python code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/607/A126

  9. Optimized Data Transfers Based on the OpenCL Event Management Mechanism

    Directory of Open Access Journals (Sweden)

    Hiroyuki Takizawa

    2015-01-01

    Full Text Available In standard OpenCL programming, hosts are supposed to control their compute devices. Since compute devices are dedicated to kernel computation, only hosts can execute several kinds of data transfers such as internode communication and file access. These data transfers require one host to simultaneously play two or more roles due to the need for collaboration between the host and devices. The codes for such data transfers are likely to be system-specific, resulting in low portability. This paper proposes an OpenCL extension that incorporates such data transfers into the OpenCL event management mechanism. Unlike the current OpenCL standard, the main thread running on the host is not blocked to serialize dependent operations. Hence, an application can easily use the opportunities to overlap parallel activities of hosts and compute devices. In addition, the implementation details of data transfers are hidden behind the extension, and application programmers can use the optimized data transfers without any tricky programming techniques. The evaluation results show that the proposed extension can use the optimized data transfer implementation and thereby increase the sustained data transfer performance by about 18% for a real application accessing a big data file.
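The core idea of this record, expressing dependent data transfers as events so the host's main thread never blocks to serialize them, can be illustrated outside the OpenCL C API. The following toy Python sketch (an illustration of the event-dependency pattern only, not the proposed extension; all names are hypothetical) resolves dependencies inside worker threads via completion futures:

```python
from concurrent.futures import ThreadPoolExecutor, wait

log = []  # records completion order of the "commands"

def enqueue(pool, name, wait_for=()):
    """Submit operation `name`; it runs only after every future in
    `wait_for` has completed. The dependency wait happens inside the
    worker, so the main thread is never blocked (cf. OpenCL events)."""
    def task():
        wait(wait_for)      # event-style dependency resolution
        log.append(name)
        return name
    return pool.submit(task)

pool = ThreadPoolExecutor(max_workers=4)
# A typical chain from the paper's setting: stage a file, move it to
# the device, compute, move results back -- all enqueued at once.
read = enqueue(pool, "file-read")
h2d  = enqueue(pool, "host-to-device", (read,))
kern = enqueue(pool, "kernel", (h2d,))
d2h  = enqueue(pool, "device-to-host", (kern,))
d2h.result()                # only the final sync touches the main thread
pool.shutdown()
print(log)
```

All four operations are enqueued up front and the ordering is enforced entirely by the completion events, which is what lets hosts overlap their own work with device activity.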

  10. Optimization of core reload design for low leakage fuel management in pressurized water reactors

    International Nuclear Information System (INIS)

    Kim, Y.J.

    1986-01-01

    A new method was developed to optimize pressurized water reactor core reload design for low leakage fuel management, a strategy recently adopted by most utilities to extend cycle length and mitigate pressurized thermal shock concerns. The method consists of a two-stage optimization process which provides the maximum cycle length for a given fresh fuel loading subject to power peaking constraints. In the first stage, a best fuel arrangement is determined at the end of cycle in the absence of burnable poisons. A direct search method is employed in conjunction with a constant power, Haling depletion. In the second stage, the core control poison requirements are determined using a linear programming technique. The solution provides the fresh fuel burnable poison loading required to meet core power peaking constraints. An accurate method of explicitly modeling burnable absorbers was developed for this purpose. The design method developed here was implemented in a currently recognized fuel licensing code, SIMULATE, that was adapted to the CYBER-205 computer. This methodology was applied to core reload design of cycles 9 and 10 for the Commonwealth Edison Zion, Unit-1 Reactor. The results showed that the optimum loading pattern for cycle 9 yielded almost a 9% increase in the cycle length while reducing core vessel fluence by 30% compared with the reference design used by Commonwealth Edison

  11. Implementation of data management and effect on chronic disease coding in a primary care organisation: A parallel cohort observational study.

    Science.gov (United States)

    Greiver, Michelle; Wintemute, Kimberly; Aliarzadeh, Babak; Martin, Ken; Khan, Shahriar; Jackson, Dave; Leggett, Jannet; Lambert-Lanning, Anita; Siu, Maggie

    2016-10-12

    Consistent and standardized coding for chronic conditions is associated with better care; however, coding may currently be limited in electronic medical records (EMRs) used in Canadian primary care. Objectives: To implement data management activities in a community-based primary care organisation and to evaluate the effects on coding for chronic conditions. Fifty-nine family physicians in Toronto, Ontario, belonging to a single primary care organisation, participated in the study. The organisation implemented a central analytical data repository containing their EMR data, extracted, cleaned, standardized and returned by the Canadian Primary Care Sentinel Surveillance Network (CPCSSN), a large validated primary care EMR-based database. They used reporting software provided by CPCSSN to identify selected chronic conditions, and the standardized codes were then added back to the EMR. We studied four chronic conditions (diabetes, hypertension, chronic obstructive pulmonary disease and dementia). We compared changes in coding over six months for physicians in the organisation with changes for 315 primary care physicians participating in CPCSSN across Canada. Chronic disease coding within the organisation increased significantly more than in other primary care sites; the adjusted difference in the increase in coding was 7.7% (95% confidence interval 7.1%-8.2%). Data management activities were associated with an increase in standardized coding for chronic conditions. Exploring requirements to scale and spread this approach in Canadian primary care organisations may be worthwhile.

  12. Swiss Foundation Code 2015 principles and recommendations for the establishment and management of grant-making foundations

    CERN Document Server

    Sprecher, Thomas; Schnurbein, Georg von

    2015-01-01

    The publication 'Swiss Foundation Code' contains practical governance guidelines on the topics of the establishment of foundations, their organisation, management and supervision, their charitable work and also on finance and investment policy for the contemporary and professional management of charitable foundations.

  13. Investigating the Optimal Management Strategy for a Healthcare Facility Maintenance Program

    National Research Council Canada - National Science Library

    Gaillard, Daria

    2004-01-01

    ...: strategic partnering with an equipment management firm. The objective of this study is to create a decision-model for selecting the optimal management strategy for a healthcare organization's facility maintenance program...

  14. Quantum behaved Particle Swarm Optimization with Differential Mutation operator applied to WWER-1000 in-core fuel management optimization

    International Nuclear Information System (INIS)

    Jamalipour, Mostafa; Sayareh, Reza; Gharib, Morteza; Khoshahval, Farrokh; Karimi, Mahmood Reza

    2013-01-01

    Highlights: ► A new method called QPSO-DM is applied to BNPP in-core fuel management optimization. ► It is found that QPSO-DM performs better than PSO and QPSO. ► This method provides a permissible arrangement for the optimum loading pattern. - Abstract: This paper presents a new method using Quantum Particle Swarm Optimization with a Differential Mutation operator (QPSO-DM) for optimizing WWER-1000 core fuel management. Genetic Algorithms (GA) and Particle Swarm Optimization (PSO) have shown good performance on in-core fuel management optimization (ICFMO). The objective of this paper is to show that QPSO-DM performs very well and is comparable to PSO and Quantum Particle Swarm Optimization (QPSO). Most strategies for ICFMO are based on maximizing the multiplication factor (k_eff) to increase cycle length and minimizing the power peaking factor (P_q) in order to improve fuel integrity. PSO, QPSO and QPSO-DM have been implemented to fulfill these requirements for the first operating cycle of the WWER-1000 Bushehr Nuclear Power Plant (BNPP). The results show that QPSO-DM performs better than the others. A program has been written in MATLAB to map PSO, QPSO and QPSO-DM for loading pattern optimization. WIMS and CITATION have been used to simulate the reactor core for neutronic calculations
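The PSO variants compared in this record share the same underlying swarm update; they differ only in how new positions are sampled. As a hedged illustration of that baseline, here is a minimal plain-PSO sketch in Python; the sphere function merely stands in for the real WIMS/CITATION core evaluation (which would score k_eff and P_q), and every parameter value is illustrative:

```python
import random

def pso(objective, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best particle swarm optimizer (minimization)."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(1)
# Stand-in objective: in the paper this would be a core-simulator score.
sphere = lambda x: sum(xi * xi for xi in x)
best, val = pso(sphere, dim=4, bounds=(-5.0, 5.0))
print(round(val, 4))
```

QPSO replaces the velocity update with sampling around an attractor, and the DM operator adds differential-evolution-style mutation on top; both slot into the same loop.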

  15. [Optimized logistics in the prehospital management of acute stroke].

    Science.gov (United States)

    Luiz, T; Moosmann, A; Koch, C; Behrens, S; Daffertshofer, M; Ellinger, K

    2001-12-01

    Current management of acute stroke is characterised by an aggressive approach including specific therapy, i.e. reperfusion therapy. However, stroke patients currently often arrive too late at hospitals offering adequate treatment. Optimized logistics therefore play a predominant role in modern stroke management. Two questions were studied: 1. Does teaching of EMS staff and the public result in reduced prehospital latencies? 2. Will EMS personnel be able to effectively screen patients potentially suitable for thrombolysis? During a six-week period, all EMS patients presenting with possible signs of an acute stroke were prospectively registered (period 1). Data of interest were age, mode of primary contact, prehospital latencies, mode of transportation, destination and final diagnosis. Next, an algorithm was established allowing EMS personnel to transfer patients with an assumed stroke to the best suitable hospital. Teaching comprised clinical signs, indication for CT scanning, pathophysiology, specific therapeutic options (thrombolysis), and criteria to identify patients suitable for thrombolysis. In a second step, the public was continuously taught about stroke symptoms and the necessity to instantly seek EMS assistance. After 12 months, data were compared to baseline (period 2 vs. period 1): Rate of patients transferred to a stroke center: 60 % vs. 54 %; rate of those transported to hospitals not offering CT scans: 17 % vs. 26 % (p < 0.05). Percentage of patients primarily contacting the EMS system: 33 % vs. 24 %. Median interval between onset of symptoms and emergency call: 54 vs. 263 minutes. Median interval between the emergency call and arrival at the emergency department: 44 vs. 58 minutes (p < 0.01). Rate of patients admitted with a diagnosis other than stroke: 18 % vs. 25 % (n. s.). Median interval between onset of symptoms and hospital admission: 140 vs. 368 minutes (p < 0.001). Median age: 69 vs. 75 years (p < 0.01). This study demonstrates the efficacy of educational efforts in

  16. Optimal screening and donor management in a public stool bank.

    Science.gov (United States)

    Kazerouni, Abbas; Burgess, James; Burns, Laura J; Wein, Lawrence M

    2015-12-17

    Fecal microbiota transplantation is an effective treatment for recurrent Clostridium difficile infection and is being investigated as a treatment for other microbiota-associated diseases. To facilitate these activities, an international public stool bank has been created, which screens donors and processes stools in a standardized manner. The goal of this research is to use mathematical modeling and analysis to optimize screening and donor management at the stool bank. Compared to the current policy of screening active donors every 60 days before releasing their quarantined stools for sale, costs can be reduced by 10.3 % by increasing the screening frequency to every 36 days. In addition, the stool production rate varies widely across donors, and using donor-specific screening, where higher producers are screened more frequently, also reduces costs, as does introducing an interim (i.e., between consecutive regular tests) stool test for just rotavirus and C. difficile. We also derive a donor release (i.e., into the system) policy that allows the supply to approximately match an exponentially increasing deterministic demand. More frequent screening, interim screening for rotavirus and C. difficile, and donor-specific screening, where higher stool producers are screened more frequently, are all cost-reducing measures. If screening costs decrease in the future (e.g., as a result of bringing screening in house), a bottleneck for implementing some of these recommendations may be the reluctance of donors to undergo serum screening more frequently than monthly.
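The screening-frequency result in this record reflects a generic tradeoff: screening more often costs more per year, but shortens the window of quarantined production that must be discarded if a donor turns positive. A toy version of that tradeoff can be written down directly. Every number and functional form below is hypothetical (NOT from the paper; the parameters are chosen purely so the toy optimum lands near the paper's 36-day figure):

```python
import math

def annual_cost(T, c_screen, fail_rate, r, c_stool):
    """Toy annual cost for one donor screened every T days.
    First term: screening cost, falling with T. Second term: expected
    loss from discarding quarantined stools when a donor turns positive;
    on average T/2 days of production (rate r stools/day) is lost."""
    return (365.0 / T) * c_screen + fail_rate * (r * T / 2.0) * c_stool

def optimal_interval(c_screen, fail_rate, r, c_stool):
    """Closed-form minimizer: set d(annual_cost)/dT = 0 and solve for T."""
    return math.sqrt(2.0 * 365.0 * c_screen / (fail_rate * r * c_stool))

# Hypothetical parameters: $200 per screening panel, 0.75 expected
# disqualifications per donor-year, 3 stools/day, $50 lost per discard.
params = dict(c_screen=200.0, fail_rate=0.75, r=3.0, c_stool=50.0)
T_star = optimal_interval(**params)
print(round(T_star, 1),
      round(annual_cost(60, **params), 1),
      round(annual_cost(T_star, **params), 1))
```

With these made-up inputs the convex cost curve bottoms out near a 36-day interval and beats the 60-day policy by roughly 10%, qualitatively mirroring (but in no way reproducing) the paper's queueing-model analysis.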

  17. Optimal management of dactylitis in patients with psoriatic arthritis

    Directory of Open Access Journals (Sweden)

    Yamamoto T

    2015-09-01

    Full Text Available Toshiyuki Yamamoto, Department of Dermatology, Fukushima Medical University, Fukushima, Japan. Abstract: Psoriatic arthritis (PsA) is an inflammatory arthropathy associated with cutaneous psoriasis, which is currently classified as a seronegative spondyloarthropathy. The presence of cutaneous psoriasis is important for the correct and early diagnosis of PsA, because the onset of cutaneous lesions usually precedes the appearance of joint manifestations. Thus, dermatologists are able to detect the condition at its inception. PsA has several unique characteristics such as enthesopathy, dactylitis, and abnormal bone remodeling. In particular, dactylitis occurs on easily observed sites such as the digits, and is thus a significant indicator of PsA. It is important to observe not only the fingers but also the toes, because dactylitis involves the digits of both the hands and feet. Recently, new ideas regarding the involvement of the interleukin (IL)-23/Th17 axis have emerged, and the dramatic effects of targeted therapies have highlighted the physiological roles of key cytokines such as tumor necrosis factor-α, IL-17A, and IL-23 in psoriasis. As recent insights shed light on the pathogenesis of PsA, understanding of the pathogenesis of dactylitis and enthesitis is also progressing. In this article, current views on the optimal management of dactylitis are discussed. Keywords: pathogenesis, therapy, enthesitis, tenosynovitis

  18. OSSA - An optimized approach to severe accident management: EPR application

    International Nuclear Information System (INIS)

    Sauvage, E. C.; Prior, R.; Coffey, K.; Mazurkiewicz, S. M.

    2006-01-01

    There is a recognized need to provide nuclear power plant technical staff with structured guidance for response to a potential severe accident condition involving core damage and the potential release of fission products to the environment. Over the past ten years, many plants worldwide have implemented such guidance for their emergency technical support center teams, either by following one of the generic approaches or by developing fully independent approaches. There are many lessons to be learned from the experience of the past decade in developing, implementing, and validating severe accident management guidance. Also, though numerous basic approaches exist which share common principles, there are differences in the methodology and application of the guidelines. AREVA/Framatome-ANP is developing an optimized approach to severe accident management guidance in a project called OSSA ('Operating Strategies for Severe Accidents'). There are still numerous operating power plants which have yet to implement severe accident management programs. For these, the option to use an updated approach which makes full use of lessons learned and experience is seen as a major advantage. Very few of the current approaches cover all operating plant states, including shutdown states with the primary system closed and open. Although it is not necessary to develop an entirely new approach in order to add this capability, the opportunity has been taken to develop revised full-scope guidance covering all plant states, in addition to the fuel in the fuel building. The EPR includes at the design phase systems and measures to minimize the risk of a severe accident and to mitigate such potential scenarios. This is a difference in comparison with existing plants, for which severe accidents were not considered in the design. Though developed for all types of plants, OSSA will also be applied to the EPR, with adaptations designed to take into account its favourable situation in that field

  19. Root Exploit Detection and Features Optimization: Mobile Device and Blockchain Based Medical Data Management.

    Science.gov (United States)

    Firdaus, Ahmad; Anuar, Nor Badrul; Razak, Mohd Faizal Ab; Hashem, Ibrahim Abaker Targio; Bachok, Syafiq; Sangaiah, Arun Kumar

    2018-05-04

    The increasing demand for Android mobile devices and blockchain has motivated malware creators to develop mobile malware to compromise the blockchain. Although the blockchain is secure, attackers have managed to gain access to the blockchain as legal users, thereby compromising important and crucial information. Examples of mobile malware include root exploits, botnets, and Trojans, and the root exploit is one of the most dangerous types of malware. It compromises the operating system kernel in order to gain root privileges, which are then used by attackers to bypass security mechanisms, gain complete control of the operating system, install other possible types of malware on the devices, and finally steal victims' private keys linked to the blockchain. For the purpose of maximizing the security of blockchain-based medical data management (BMDM), it is crucial to investigate the novel features and approaches contained in root exploit malware. This study proposes to use the bio-inspired method of particle swarm optimization (PSO), which automatically selects the exclusive features that contain the novel android debug bridge (ADB). This study also adopts boosting (AdaBoost, RealAdaBoost, LogitBoost, and MultiBoost) to enhance the machine learning prediction that detects unknown root exploits, and scrutinizes three categories of features: (1) system commands, (2) directory paths, and (3) code-based features. The evaluation gathered from this study suggests a marked accuracy value of 93% with LogitBoost in the simulation. LogitBoost also predicted all the root exploit samples in our developed system, the root exploit detection system (RODS).

  20. Optimizing chronic disease management mega-analysis: economic evaluation.

    Science.gov (United States)

    2013-01-01

    As Ontario's population ages, chronic diseases are becoming increasingly common. There is growing interest in services and care models designed to optimize the management of chronic disease. To evaluate the cost-effectiveness and expected budget impact of interventions in chronic disease cohorts evaluated as part of the Optimizing Chronic Disease Management mega-analysis. Sector-specific costs, disease incidence, and mortality were calculated for each condition using administrative databases from the Institute for Clinical Evaluative Sciences. Intervention outcomes were based on literature identified in the evidence-based analyses. Quality-of-life and disease prevalence data were obtained from the literature. Analyses were restricted to interventions that showed significant benefit for resource use or mortality from the evidence-based analyses. An Ontario cohort of patients with each chronic disease was constructed and followed over 5 years (2006-2011). A phase-based approach was used to estimate costs across all sectors of the health care system. Utility values identified in the literature and effect estimates for resource use and mortality obtained from the evidence-based analyses were applied to calculate incremental costs and quality-adjusted life-years (QALYs). Given uncertainty about how many patients would benefit from each intervention, a system-wide budget impact was not determined. Instead, the difference in lifetime cost between an individual-administered intervention and no intervention was presented. Of 70 potential cost-effectiveness analyses, 8 met our inclusion criteria. All were found to result in QALY gains and cost savings compared with usual care. The models were robust to the majority of sensitivity analyses undertaken, but due to structural limitations and time constraints, few sensitivity analyses were conducted. 
Incremental cost savings per patient who received intervention ranged between $15 per diabetic patient with specialized nursing to

  1. Fuel Management Study for a CANDU reactor Using New Physics Codes Suite

    International Nuclear Information System (INIS)

    Kim, Won Young; Kim, Bong Ghi; Park, Joo Hwan

    2008-01-01

    A CANDU reactor is a heavy-water-moderated, natural-uranium-fuelled pressure tube reactor. The reactor contains a horizontal cylindrical vessel (calandria), and each pressure tube is isolated from the heavy-water moderator in the calandria. This allows the low-pressure, low-temperature moderator system to be operated separately from the high-pressure, high-temperature coolant in the pressure tubes. The primary reactivity control in a CANDU reactor is on-power refueling on a daily basis, and additional reactivity control is provided through individual reactivity device movement, which includes 21 adjusters, 6 liquid zone controllers, 4 mechanical control absorbers and 2 shutdown systems. Refueling in CANDU is carried out on power, and this makes in-core fuel management different from that in a reactor refueled during shutdowns. The objective of fuel management is to determine a fuel loading and fuel replacement procedure which will result in a minimum total unit energy cost while maintaining safe and reliable operation. In this article, in-core fuel management for the CANDU reactor was studied using the new physics code suite WIMS-IST/DRAGON-IST/RFSP-IST with a model of the Wolsong-1 NPP

  2. Ethical Management in Companies in the Czech Republic and Ukraine - Comparison of the Presence of a Code of Ethics

    Directory of Open Access Journals (Sweden)

    Caha Zdeněk

    2017-01-01

    Full Text Available The aim of the study is to compare the prevalence of a code of ethics, the most important ethical management tool, in the business sector in two post-communist countries, the Czech Republic and Ukraine. The hypothesis that a code of ethics is much more widespread in the economically more developed country, the Czech Republic, together with the assumption that the occurrence of a code of ethics is related to company size, was examined on the basis of a questionnaire survey. The results clearly confirmed that a code of ethics is much more widespread in the Czech Republic than in Ukraine. The survey results also confirmed that the prevalence of a code of ethics grows with company size, although this was not confirmed for micro and small companies in Ukraine.

  3. Implementation of an optimal control energy management strategy in a hybrid truck

    NARCIS (Netherlands)

    Mullem, D. van; Keulen, T. van; Kessels, J.T.B.A.; Jager, B. de; Steinbuch, M.

    2010-01-01

    Energy Management Strategies for hybrid powertrains control the power split, between the engine and electric motor, of a hybrid vehicle, with fuel consumption or emission minimization as objective. Optimal control theory can be applied to rewrite the optimization problem to an optimization

  4. Application of static fuel management codes for determination of the neutron noise using the adiabatic approximation

    International Nuclear Information System (INIS)

    Garis, N.S.; Karlsson, J.K.H.; Pazsit, I.

    2000-01-01

    The neutron noise, induced by a rod manoeuvring experiment in a pressurized water reactor, has been calculated by the incore fuel management code SIMULATE. The space- and frequency-dependent noise in the thermal group was calculated through the adiabatic approximation in three dimensions and two-group theory, with the spatial resolution of the nodal model underlying the SIMULATE algorithm. The calculated spatial noise profiles were interpreted on physical terms. They were also compared with model calculations in a 2-D one-group model, where various approximations as well as the full space-dependent response could be calculated. The adiabatic results obtained with SIMULATE can be regarded as reliable for sub-plateau frequencies (below 0.1 Hz). (orig.) [de

  5. Application of genetic algorithms to in-core nuclear fuel management optimization

    International Nuclear Information System (INIS)

    Poon, P.W.; Parks, G.T.

    1993-01-01

    The search for an optimal arrangement of fresh and burnt fuel and control material within the core of a PWR represents a formidable optimization problem. The approach of combining the robust optimization capabilities of the Simulated Annealing (SA) algorithm with the computational speed of a Generalized Perturbation Theory (GPT) based evaluation methodology in the code FORMOSA has proved to be very effective. In this paper, we show that the incorporation of another stochastic search technique, a Genetic Algorithm, results in comparable optimization performance on serial computers and offers substantially superior performance on parallel machines. (orig.)
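The genetic operators behind a loading-pattern search of this kind are compact even though the real fitness evaluation (GPT in FORMOSA) is not. The following self-contained sketch is an illustration only: a crude surrogate fitness stands in for the core evaluation, a permutation of assembly indices encodes the loading pattern, and order crossover plus swap mutation keep children valid. All names and parameter values are hypothetical:

```python
import random

def fitness(perm, reactivity, importance):
    """Toy surrogate for a core evaluation: peaking is worst when
    high-reactivity assemblies sit in high-importance (central)
    positions, so we minimize sum(reactivity * positional weight)."""
    return sum(reactivity[a] * importance[p] for p, a in enumerate(perm))

def order_crossover(p1, p2):
    """OX: copy a slice from parent 1, fill the rest in parent-2 order."""
    n = len(p1)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = p1[i:j]
    fill = [a for a in p2 if a not in child]
    for k in range(n):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def ga(reactivity, importance, pop_size=40, gens=150, pmut=0.3):
    n = len(reactivity)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: fitness(p, reactivity, importance))
        survivors = pop[:pop_size // 2]          # elitist truncation
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            c = order_crossover(a, b)
            if random.random() < pmut:           # swap mutation
                i, j = random.sample(range(n), 2)
                c[i], c[j] = c[j], c[i]
            children.append(c)
        pop = survivors + children
    return min(pop, key=lambda p: fitness(p, reactivity, importance))

random.seed(7)
n = 10
reactivity = [random.uniform(0.9, 1.3) for _ in range(n)]   # fresh vs burnt
importance = [random.uniform(0.5, 1.5) for _ in range(n)]   # position weight
best = ga(reactivity, importance)
# Rearrangement-inequality optimum for comparison: most reactive
# assemblies into the least important positions.
order_pos = sorted(range(n), key=lambda p: importance[p])
order_asm = sorted(range(n), key=lambda a: reactivity[a], reverse=True)
opt = [0] * n
for p, a in zip(order_pos, order_asm):
    opt[p] = a
ideal = fitness(opt, reactivity, importance)
print(round(fitness(best, reactivity, importance), 3), round(ideal, 3))
```

On this toy problem the known optimum is reachable analytically, which makes it easy to check that the GA converges; in the real problem the fitness call is a full core calculation, which is exactly why GPT-accelerated evaluation matters.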

  6. Optimized Management of Groundwater Resources in Kish Island: A Sensitivity Analysis of Optimal Strategies in Response to Environmental Changes

    Directory of Open Access Journals (Sweden)

    Davood Mahmoodzadeh

    2016-05-01

    Full Text Available Groundwater in coastal areas is an essential source of freshwater that warrants priority protection from seawater intrusion based on an optimal management plan. Proper optimal management strategies can be developed using a variety of decision-making models. The present study aims to investigate the impacts of environmental changes on groundwater resources. For this purpose, a combined simulation-optimization model is employed that incorporates the SUTRA numerical model and the evolutionary method of ant colony optimization. The fresh groundwater lens in Kish Island is used as a case study, and different scenarios are considered for the likely environmental changes. Results indicate that while variations in recharge rate are an important factor for the fresh groundwater lens, land-surface inundation due to rises in seawater level, especially in low-lying lands, is the major factor affecting the lens. Furthermore, incorporating the impacts of environmental changes into the Kish Island aquifer optimization management plan led to a reduction of more than 20% in the allowable water extraction, indicating the high sensitivity of groundwater resources management plans for small islands to such variations.

  7. Progress in the Legitimacy of Business and Management Education Research: Rejoinder to "Identifying Research Topic Development in Business and Management Education Research Using Legitimation Code Theory"

    Science.gov (United States)

    Bacon, Donald R.

    2016-01-01

    In this rejoinder to "Identifying Research Topic Development in Business and Management Education Research Using Legitimation Code Theory," published in the "Journal of Management Education," Dec 2016 (see EJ1118407), Donald R. Bacon discusses the similarities between Arbaugh et al.'s (2016) findings and the scholarship…

  8. Optimal management of nail disease in patients with psoriasis

    Directory of Open Access Journals (Sweden)

    Piraccini BM

    2015-01-01

    psoriasis and the optimal management of nail disease in patients with psoriasis. Keywords: biologics, nail psoriasis, topical therapy, systemic therapy

  9. Optimal source coding, removable noise elimination, and natural coordinate system construction for general vector sources using replicator neural networks

    Science.gov (United States)

    Hecht-Nielsen, Robert

    1997-04-01

    A new universal one-chart smooth manifold model for vector information sources is introduced. Natural coordinates (a particular type of chart) for such data manifolds are then defined. Uniformly quantized natural coordinates form an optimal vector quantization code for a general vector source. Replicator neural networks (a specialized type of multilayer perceptron with three hidden layers) are then introduced. As properly configured examples of replicator networks approach minimum mean squared error (e.g., via training and architecture adjustment using randomly chosen vectors from the source), these networks automatically develop a mapping which, in the limit, produces natural coordinates for arbitrary source vectors. The new concept of removable noise (a noise model applicable to a wide variety of real-world noise processes) is then discussed. Replicator neural networks, when configured to approach minimum mean squared reconstruction error (e.g., via training and architecture adjustment on randomly chosen examples from a vector source, each with randomly chosen additive removable noise contamination), in the limit eliminate removable noise and produce natural coordinates for the data vector portions of the noise-corrupted source vectors. Considerations regarding selection of the dimension of a data manifold source model and the training/configuration of replicator neural networks are discussed.
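
A minimal numerical sketch of the replicator-network idea, assuming the standard three-hidden-layer autoencoder reading of the architecture (mapping layer, bottleneck, demapping layer). Data on a one-dimensional manifold (an arc in R^2) are squeezed through a one-unit bottleneck, whose activation plays the role of a learned coordinate; all layer sizes, the data set, and the plain gradient-descent training are illustrative choices, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Data on a one-dimensional manifold (an arc) embedded in R^2; the natural
# coordinate of each point is its angle along the arc.
theta = rng.uniform(0.0, np.pi, size=(256, 1))
X = np.hstack([np.cos(theta), np.sin(theta)])          # shape (256, 2)

def init(n_in, n_out):
    return rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_out)), np.zeros(n_out)

# Three hidden layers: mapping (8 units, tanh), bottleneck (1 linear unit,
# the learned coordinate), demapping (8 units, tanh); linear output layer
# must reproduce the input.
W1, b1 = init(2, 8)
W2, b2 = init(8, 1)
W3, b3 = init(1, 8)
W4, b4 = init(8, 2)

losses, lr, n = [], 0.05, len(X)
for _ in range(3000):                                  # plain gradient descent
    h1 = np.tanh(X @ W1 + b1)
    z = h1 @ W2 + b2                                   # 1-D coordinate
    h3 = np.tanh(z @ W3 + b3)
    out = h3 @ W4 + b4
    err = out - X
    losses.append(float(np.mean(err ** 2)))
    d_out = 2.0 * err / n                              # backpropagation
    d_h3 = (d_out @ W4.T) * (1.0 - h3 ** 2)
    d_z = d_h3 @ W3.T
    d_h1 = (d_z @ W2.T) * (1.0 - h1 ** 2)
    for p, g in ((W4, h3.T @ d_out), (b4, d_out.sum(0)),
                 (W3, z.T @ d_h3), (b3, d_h3.sum(0)),
                 (W2, h1.T @ d_z), (b2, d_z.sum(0)),
                 (W1, X.T @ d_h1), (b1, d_h1.sum(0))):
        p -= lr * g
```

As the reconstruction error falls, the bottleneck activation `z` serves as a one-dimensional chart of the arc, which is the paper's "natural coordinate" in the limit of minimum mean squared error.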

  10. PWR in-core nuclear fuel management optimization utilizing nodal (non-linear NEM) generalized perturbation theory

    International Nuclear Information System (INIS)

    Maldonado, G.I.; Turinsky, P.J.; Kropaczek, D.J.

    1993-01-01

    A computational capability to efficiently and accurately evaluate reactor core attributes (i.e., k eff and power distributions as a function of cycle burnup) utilizing a second-order accurate advanced nodal Generalized Perturbation Theory (GPT) model has been developed. The GPT model is derived from the forward non-linear iterative Nodal Expansion Method (NEM) strategy, thereby extending its inherent savings in memory storage and high computational efficiency to also encompass GPT via the preservation of the finite-difference matrix structure. The above development was easily implemented into the existing coarse-mesh finite-difference GPT-based in-core fuel management optimization code FORMOSA-P, thus combining the proven robustness of its adaptive Simulated Annealing (SA) multiple-objective optimization algorithm with a high-fidelity NEM GPT neutronics model to produce a powerful computational tool used to generate families of near-optimum loading patterns for PWRs. (orig.)

  11. 3rd International Conference on Modelling, Computation and Optimization in Information Systems and Management Sciences

    CERN Document Server

    Dinh, Tao; Nguyen, Ngoc

    2015-01-01

    This proceedings set contains 85 selected full papers presented at the 3rd International Conference on Modelling, Computation and Optimization in Information Systems and Management Sciences - MCO 2015, held on May 11–13, 2015 at Lorraine University, France. The present part I of the 2 volume set includes articles devoted to Combinatorial optimization and applications, DC programming and DCA: thirty years of Developments, Dynamic Optimization, Modelling and Optimization in financial engineering, Multiobjective programming, Numerical Optimization, Spline Approximation and Optimization, as well as Variational Principles and Applications

  12. DART code optimization works

    International Nuclear Information System (INIS)

    Taboada, Horacio; Solis, Diego

    1999-01-01

    DART (Dispersion Analysis Research Tool) is a thermomechanical calculation and assessment program developed by Dr. J. Rest of Argonne National Laboratory, USA. This program is the only mechanistic model available to assess the performance of low-enriched oxide-based dispersion fuels, and of dispersions of silicides and uranium intermetallics in an aluminum matrix, for research reactors. The program predicts fission-product-induced swelling (especially from gases), the closing of as-fabricated porosity, macroscopic changes in the diameter of rods or the width of plates and tubes produced by fuel deformation, degradation of the thermal conductivity of the fuel dispersion owing to irradiation, and fuel restructuring due to Al-fuel reaction, amorphization and recrystallization. (author)

  13. Extending DIRAC File Management with Erasure-Coding for efficient storage

    CERN Document Server

    Skipsey, Samuel Cadellin; Britton, David; Crooks, David; Roy, Gareth

    2015-01-01

    The state of the art in Grid style data management is to achieve increased resilience of data via multiple complete replicas of data files across multiple storage endpoints. While this is effective, it is not the most space-efficient approach to resilience, especially when the reliability of individual storage endpoints is sufficiently high that only a few will be inactive at any point in time. We report on work performed as part of GridPP, extending the Dirac File Catalogue and file management interface to allow the placement of erasure-coded files: each file distributed as N identically-sized chunks of data striped across a vector of storage endpoints, encoded such that any M chunks can be lost and the original file can be reconstructed. The tools developed are transparent to the user, and, as well as allowing uploading and downloading of data to Grid storage, also provide the possibility of parallelising access across all of the distributed chunks at once, improving data transfer and IO performance. ...

  14. Extending DIRAC File Management with Erasure-Coding for efficient storage.

    Science.gov (United States)

    Cadellin Skipsey, Samuel; Todev, Paulin; Britton, David; Crooks, David; Roy, Gareth

    2015-12-01

    The state of the art in Grid style data management is to achieve increased resilience of data via multiple complete replicas of data files across multiple storage endpoints. While this is effective, it is not the most space-efficient approach to resilience, especially when the reliability of individual storage endpoints is sufficiently high that only a few will be inactive at any point in time. We report on work performed as part of GridPP[1], extending the Dirac File Catalogue and file management interface to allow the placement of erasure-coded files: each file distributed as N identically-sized chunks of data striped across a vector of storage endpoints, encoded such that any M chunks can be lost and the original file can be reconstructed. The tools developed are transparent to the user, and, as well as allowing uploading and downloading of data to Grid storage, also provide the possibility of parallelising access across all of the distributed chunks at once, improving data transfer and IO performance. We expect this approach to be of most interest to smaller VOs, who have tighter bounds on the storage available to them, but larger (WLCG) VOs may be interested as their total data increases during Run 2. We provide an analysis of the costs and benefits of the approach, along with future development and implementation plans in this area. In general, overheads for multiple file transfers provide the largest issue for competitiveness of this approach at present.
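
The chunking idea above can be illustrated with the simplest erasure code, a single XOR parity chunk (the M = 1 case; the DIRAC work targets general codes, typically Reed-Solomon style, where any M of the N chunks may be lost). The function names here are illustrative and are not part of the DIRAC interface:

```python
def xor_chunks(chunks):
    """XOR equal-length byte chunks together."""
    out = bytearray(len(chunks[0]))
    for chunk in chunks:
        for i, byte in enumerate(chunk):
            out[i] ^= byte
    return bytes(out)

def encode(data, n):
    """Stripe `data` into n equal chunks plus one XOR parity chunk.
    Any single chunk may then be lost and recovered (M = 1)."""
    size = -(-len(data) // n)                            # ceil(len/n)
    padded = data.ljust(size * n, b"\0")
    chunks = [padded[i * size:(i + 1) * size] for i in range(n)]
    return chunks + [xor_chunks(chunks)], len(data)

def reconstruct(chunks, lost_index, orig_len, n):
    """Rebuild chunk `lost_index` (treated as unavailable) by XOR-ing the
    n surviving chunks, then reassemble the original file."""
    survivors = [c for i, c in enumerate(chunks) if i != lost_index]
    rebuilt = xor_chunks(survivors)
    full = list(chunks)
    full[lost_index] = rebuilt
    return b"".join(full[:n])[:orig_len]
```

Because any chunk equals the XOR of all the others, a client can also fetch the chunks from different endpoints in parallel, which is the IO-performance benefit the abstract describes.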

  15. Optimizing Excited-State Electronic-Structure Codes for Intel Knights Landing: A Case Study on the BerkeleyGW Software

    Energy Technology Data Exchange (ETDEWEB)

    Deslippe, Jack; da Jornada, Felipe H.; Vigil-Fowler, Derek; Barnes, Taylor; Wichmann, Nathan; Raman, Karthik; Sasanka, Ruchira; Louie, Steven G.

    2016-10-06

    We profile and optimize calculations performed with the BerkeleyGW code on the Xeon-Phi architecture. BerkeleyGW depends both on hand-tuned critical kernels as well as on BLAS and FFT libraries. We describe the optimization process and performance improvements achieved. We discuss a layered parallelization strategy to take advantage of vector, thread and node-level parallelism. We discuss locality changes (including the consequence of the lack of L3 cache) and effective use of the on-package high-bandwidth memory. We show preliminary results on Knights-Landing including a roofline study of code performance before and after a number of optimizations. We find that the GW method is particularly well-suited for many-core architectures due to the ability to exploit a large amount of parallelism over plane-wave components, band-pairs, and frequencies.

  16. A method to optimize the shield compact and lightweight combining the structure with components together by genetic algorithm and MCNP code.

    Science.gov (United States)

    Cai, Yao; Hu, Huasi; Pan, Ziheng; Hu, Guang; Zhang, Tao

    2018-05-17

    To obtain a shield against neutrons and gamma rays that is compact and lightweight, a method that optimizes the structure and the material components together was established, employing genetic algorithms and the MCNP code. As a typical case, the fission energy spectrum of 235 U, which contains both neutrons and gamma rays, was adopted in this study. Six types of materials were presented and optimized by the method. Spherical geometry was adopted in the optimization after checking the geometry effect. Simulations were made to verify the reliability of the optimization method and the efficiency of the optimized materials. To compare the materials visually and conveniently, the volume and weight needed to build a shield are employed. The results showed that the composite multilayer material has the best performance. Copyright © 2018 Elsevier Ltd. All rights reserved.
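
The coupling of a genetic algorithm to a transport code can be sketched as follows. A toy exponential-attenuation model stands in for the MCNP run, and the removal coefficients, densities, leakage limit, and GA settings are illustrative assumptions, not the paper's data:

```python
import math
import random

# Toy stand-in for the MCNP transport run: exponential attenuation with
# per-material removal coefficients (cm^-1) and densities (g/cm^3).
# The numbers are illustrative, not evaluated nuclear data.
MATERIALS = {
    "poly":  (0.10, 0.02, 0.95),   # effective for neutrons
    "lead":  (0.01, 0.50, 11.3),   # effective for gammas
    "steel": (0.05, 0.20, 7.9),    # middle ground
}

def transmitted(layers):
    """Fractions of neutrons and gammas leaking through the layer stack."""
    tn = sum(MATERIALS[m][0] * t for m, t in layers)
    tg = sum(MATERIALS[m][1] * t for m, t in layers)
    return math.exp(-tn), math.exp(-tg)

def areal_weight(layers):
    return sum(MATERIALS[m][2] * t for m, t in layers)   # g/cm^2

def fitness(layers, limit=0.01):
    """Minimize weight; a large penalty enforces the leakage limit."""
    n_frac, g_frac = transmitted(layers)
    penalty = 1e6 * (max(0.0, n_frac - limit) + max(0.0, g_frac - limit))
    return areal_weight(layers) + penalty

def optimize_shield(n_layers=3, pop_size=60, generations=200, seed=7):
    rng = random.Random(seed)
    mats = list(MATERIALS)
    rand_layer = lambda: (rng.choice(mats), rng.uniform(1.0, 40.0))
    pop = [[rand_layer() for _ in range(n_layers)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 4]          # elitist truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_layers)
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.4:             # mutation: redraw one layer
                child[rng.randrange(n_layers)] = rand_layer()
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)
```

The structure (layer order and thickness) and the components (materials) are encoded in one chromosome, which is the "combining the structure with components together" idea of the title; in the actual study each fitness evaluation would be an MCNP calculation.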

  17. Final technical position on documentation of computer codes for high-level waste management

    International Nuclear Information System (INIS)

    Silling, S.A.

    1983-06-01

    Guidance is given for the content of documentation of computer codes which are used in support of a license application for high-level waste disposal. The guidelines cover theoretical basis, programming, and instructions for use of the code

  18. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Takashina, Masaaki; Koizumi, Masahiko [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P., E-mail: vadim.p.moskvin@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States)

    2014-10-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with a simple system, such as a water phantom only. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influences of the customizing parameters on the percentage depth dose (PDD) profile and the proton range were investigated by comparison with the results of FLUKA, and the optimal parameters were then determined. The PDD profile and the proton range obtained from our optimized parameter lists showed different characteristics from the results obtained with the simple system. This led to the conclusion that the physical model, particle transport mechanics and different geometry-based descriptions need accurate customization in planning computational experiments for artifact-free MC simulation.

  19. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    International Nuclear Information System (INIS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-01-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with a simple system, such as a water phantom only. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influences of the customizing parameters on the percentage depth dose (PDD) profile and the proton range were investigated by comparison with the results of FLUKA, and the optimal parameters were then determined. The PDD profile and the proton range obtained from our optimized parameter lists showed different characteristics from the results obtained with the simple system. This led to the conclusion that the physical model, particle transport mechanics and different geometry-based descriptions need accurate customization in planning computational experiments for artifact-free MC simulation.

  20. Best practice in the management of clinical coding services: Insights from a project in the Republic of Ireland, Part 2.

    Science.gov (United States)

    Reid, Beth A; Ridoutt, Lee; O'Connor, Paul; Murphy, Deirdre

    2017-09-01

    This is the second of two articles about best practice in the management of coding services. The best practice project was part of a year-long project conducted in the Republic of Ireland to review the quality of the Hospital Inpatient Enquiry data for its use in activity-based funding. The four methods used to address the best practice aspect of the project were described in detail in Part 1. The results included in this article are those relating to the coding manager's background, preparation and style, clinical coder (CC) workforce adequacy, the CC workforce structure and career pathway, and the physical and psychological work environment for the clinical coding service. Examples of best practice were found in the study hospitals but there were also areas for improvement. Coding managers would benefit from greater support in the form of increased opportunities for management training and a better method for calculating CC workforce numbers. A career pathway is needed for CCs to progress from entry to expert CC, mentor, manager and quality controller. Most hospitals could benefit from investment in infrastructure that places CCs in a physical environment that tells them they are an important part of the hospital and their work is valued.

  1. Preliminary analysis on in-core fuel management optimization of molten salt pebble-bed reactor

    International Nuclear Information System (INIS)

    Xia Bing; Jing Xingqing; Xu Xiaolin; Lv Yingzhong

    2013-01-01

    The Nuclear Hot Spring (NHS) is a molten salt pebble-bed reactor featured by full-power natural circulation. The unique horizontal coolant flow of the NHS demands fuel recycling schemes based on radial zoning refueling and a corresponding method of fuel management optimization. The local searching algorithm (LSA) and the simulated annealing algorithm (SAA), stochastic optimization methods widely used in refueling optimization problems in LWRs, were applied to the analysis of refueling optimization of the NHS. The analysis results indicate that, compared with the LSA, the SAA can escape the traps of locally optimized solutions and reach the globally optimized solution, and that the quality of optimization of the SAA is independent of the choice of the initial solution. The optimization result gives excellent effects on the in-core power flattening and the suppression of fuel center temperature. For the one-dimensional zoning refueling schemes of the NHS, the SAA is an appropriate optimization method. (authors)
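
The simulated-annealing search used for such zoning refueling problems can be sketched on a toy one-dimensional analogue: swap assemblies between zones, always accept improving moves, and accept worsening moves with the Boltzmann probability so the search can escape local optima. The k-infinity values, zone importance weights, and cooling schedule below are illustrative, not NHS data:

```python
import math
import random

def peaking(pattern, weights):
    """Toy power-peaking factor: assembly k-inf times a fixed zone
    importance weight, peak over average. Flatter power = value nearer 1."""
    power = [k * w for k, w in zip(pattern, weights)]
    return max(power) / (sum(power) / len(power))

def anneal(kinf, weights, t0=1.0, cooling=0.995, steps=4000, seed=3):
    """Simulated annealing over assembly-to-zone assignments."""
    rng = random.Random(seed)
    current = kinf[:]
    rng.shuffle(current)
    cur_f = peaking(current, weights)
    best, best_f = current[:], cur_f
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(len(current)), 2)
        current[i], current[j] = current[j], current[i]   # swap two assemblies
        f = peaking(current, weights)
        # Accept downhill always; accept uphill with Boltzmann probability.
        if f <= cur_f or rng.random() < math.exp((cur_f - f) / t):
            cur_f = f
            if f < best_f:
                best, best_f = current[:], f
        else:
            current[i], current[j] = current[j], current[i]  # undo swap
        t *= cooling
    return best, best_f

# Eight assemblies (k-inf) distributed over eight radial zones (weights):
pattern, flatness = anneal(
    [1.30, 1.25, 1.20, 1.15, 1.10, 1.05, 1.00, 0.95],
    [1.40, 1.30, 1.20, 1.10, 1.00, 0.90, 0.80, 0.70])
```

A pure local search (the LSA of the abstract) is this same loop with the Boltzmann acceptance removed, which is why it can stall in a local optimum while the SAA does not.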

  2. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...

  3. Optimal Energy Management of Multi-Microgrids with Sequentially Coordinated Operations

    Directory of Open Access Journals (Sweden)

    Nah-Oak Song

    2015-08-01

    Full Text Available We propose an optimal electric energy management of a cooperative multi-microgrid community with sequentially coordinated operations. The sequentially coordinated operations are suggested to distribute the computational burden and yet make the optimal 24 h energy management of multi-microgrids possible. The sequential operations are mathematically modeled to find the optimal operation conditions and illustrated with a physical interpretation of how to achieve optimal energy management in the cooperative multi-microgrid community. This global electric energy optimization of the cooperative community is realized by ancillary internal trading between the microgrids in the cooperative community, which reduces the extra cost of unnecessary external trading by adjusting the electric energy production amounts of combined heat and power (CHP) generators and the amounts of both internal and external electric energy trading of the cooperative community. A simulation study is also conducted to validate the proposed mathematical energy management models.

  4. Optimal trading quantity integration as a basis for optimal portfolio management

    Directory of Open Access Journals (Sweden)

    Saša Žiković

    2005-06-01

    Full Text Available The author in this paper points out the reasons behind calculating and using the optimal trading quantity in conjunction with Markowitz’s modern portfolio theory. In the opening part the author presents an example of calculating optimal weights using Markowitz’s mean-variance approach, followed by an explanation of the basic logic behind the optimal trading quantity. The use of the optimal trading quantity is not limited to systems with Bernoulli outcomes, but can also be used when trading shares, futures, options, etc. The optimal trading quantity points out two often-overlooked axioms: (1) a system with negative mathematical expectancy can never be transformed into a system with positive mathematical expectancy; (2) by missing the optimal trading quantity an investor can turn a system with positive expectancy into a negative one. The optimal trading quantity is that quantity which maximizes the geometric mean (growth function) of a particular system. To determine the optimal trading quantity for simpler systems, with a very limited number of outcomes, a set of Kelly’s formulas is appropriate. The conclusion presents a summary of the paper.
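
The two axioms can be checked directly with Kelly's formula for the Bernoulli case the abstract mentions. The even-odds bet and the probabilities below are illustrative numbers:

```python
import math

def kelly_fraction(p, b):
    """Kelly's optimal fraction for a Bernoulli bet: win probability p at
    b-to-1 odds, entire stake lost otherwise. f* = p - (1 - p)/b maximizes
    the expected log (geometric mean) growth of capital."""
    return p - (1 - p) / b

def growth_rate(f, p, b):
    """Expected log-growth per bet when staking a fraction f of capital."""
    return p * math.log(1 + f * b) + (1 - p) * math.log(1 - f)

# Axiom (1): quantity cannot rescue a negative-expectancy system.
# With p = 0.45 at even odds, the optimal fraction is negative: do not bet.
print(round(kelly_fraction(0.45, 1.0), 10))   # -> -0.1

# Axiom (2): over-betting turns a positive-expectancy system negative.
# With p = 0.6 at even odds, f* = 0.2 grows capital, but f = 0.9 shrinks it.
print(growth_rate(0.2, 0.6, 1.0) > 0)          # True
print(growth_rate(0.9, 0.6, 1.0) < 0)          # True
```

The growth function is concave in f, so staking anything other than f* (in either direction) lowers the geometric mean, which is the sense in which the quantity is "optimal".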

  5. Design of investment management optimization system for power grid companies under new electricity reform

    Science.gov (United States)

    Yang, Chunhui; Su, Zhixiong; Wang, Xin; Liu, Yang; Qi, Yongwei

    2017-03-01

    The new normalization of the economic situation and the implementation of a new round of electric power system reform place higher requirements on the daily operations of power grid companies. As an important day-to-day operation of power grid companies, investment management is directly related to the promotion of a company's operating efficiency and management level. In this context, the establishment of a power grid company investment management optimization system will help to improve the company's level of investment management and control, which is of great significance for power grid companies in adapting to the changing market environment as soon as possible and meeting policy requirements. Therefore, the purpose of this paper is to construct an investment management optimization system for power grid companies, which includes an investment management system, an investment process control system, an investment structure optimization system, an investment project evaluation system and an investment management information platform support system.

  6. A Study of Clinical Coding Accuracy in Surgery: Implications for the Use of Administrative Big Data for Outcomes Management.

    Science.gov (United States)

    Nouraei, S A R; Hudovsky, A; Frampton, A E; Mufti, U; White, N B; Wathen, C G; Sandhu, G S; Darzi, A

    2015-06-01

    Clinical coding is the translation of clinical activity into a coded language. Coded data drive hospital reimbursement and are used for audit and research, benchmarking, and outcomes management purposes. We undertook a 2-center audit of coding accuracy across surgery. Clinician-auditor multidisciplinary teams reviewed the coding of 30,127 patients and assessed accuracy at primary and secondary diagnosis and procedure levels, morbidity level, complications assignment, and financial variance. Postaudit data of a randomly selected sample of 400 cases were reaudited by an independent team. At least 1 coding change occurred in 15,402 patients (51%). There were 3911 (13%) and 3620 (12%) changes to primary diagnoses and procedures, respectively. In 5183 (17%) patients, the Health Resource Grouping changed, resulting in income variance of £3,974,544 (+6.2%). The morbidity level changed in 2116 (7%) patients. Administrative data are a key engine for knowledge-driven health care provision. They are used, increasingly at individual surgeon level, to benchmark performance. Surgical clinical coding is prone to subjectivity, variability, and error (SVE). Having a specialty-by-specialty understanding of the nature and clinical significance of informatics variability, and adopting strategies to reduce it, are necessary to allow accurate assumptions and informed decisions to be made concerning the scope and clinical applicability of administrative data in surgical outcomes improvement.

  7. An optimal control model for load shifting - With application in the energy management of a colliery

    International Nuclear Information System (INIS)

    Middelberg, Arno; Zhang Jiangfeng; Xia Xiaohua

    2009-01-01

    This paper presents an optimal control model for the load shifting problem in energy management and its application in a South African colliery. The colliery scenario illustrates how the optimal control model can be applied to optimize load shifting and improve energy efficiency through the control of conveyor belts. The time-of-use electricity tariff is used as an input to the objective function in order to obtain a solution that minimizes electricity costs and thus maximizes load shifting. The case study yields promising results that show the potential of applying this optimal control model to other industrial Demand Side Management initiatives.

  8. Handling Uncertain Gross Margin and Water Demand in Agricultural Water Resources Management using Robust Optimization

    Science.gov (United States)

    Chaerani, D.; Lesmana, E.; Tressiana, N.

    2018-03-01

    In this paper, an application of robust optimization to an agricultural water resource management problem under gross margin and water demand uncertainty is presented. Water resource management is a series of activities that includes planning, developing, distributing and managing the use of water resources optimally. Water resource management for agriculture can be one of the efforts to optimize the benefits of agricultural output. The objective of the agricultural water resource management problem is to maximize total benefits from water allocation to agricultural areas covered by the irrigation network over the planning horizon. Due to gross margin and water demand uncertainty, we assume that the uncertain data lie within an ellipsoidal uncertainty set. We employ the robust counterpart methodology to obtain the robust optimal solution.
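
For a single linear allocation constraint with uncertain coefficients, the robust counterpart under an ellipsoidal uncertainty set has a well-known closed form; the sketch below is the generic construction, not the paper's specific model:

```latex
% Uncertain constraint a^{\top} x \le b, with a in an ellipsoid
% centered at the nominal vector \bar{a}:
%   \mathcal{U} = \{\, \bar{a} + P u : \|u\|_{2} \le 1 \,\}
% Enforcing the constraint for the worst case over \mathcal{U},
%   \max_{\|u\|_{2} \le 1} (\bar{a} + P u)^{\top} x \le b,
% yields the deterministic robust counterpart:
\bar{a}^{\top} x + \left\| P^{\top} x \right\|_{2} \le b
```

Each uncertain linear constraint thus becomes a second-order cone constraint, which is what keeps the robust allocation problem computationally tractable.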

  9. Inventory Management and the Impact of Anticipation in Evolutionary Stochastic Online Dynamic Optimization

    NARCIS (Netherlands)

    P.A.N. Bosman (Peter); J.A. La Poutré (Han)

    2007-01-01

    Inventory management (IM) is an important area in logistics. The goal is to manage the inventory of a vendor as efficiently as possible. Its practical relevance also makes it an important real-world application for research in optimization. Because inventory must be managed over time, IM

  10. Watershed Management Optimization Support Tool (WMOST) v2: User Manual and Case Studies

    Science.gov (United States)

    The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...

  11. A loading pattern optimization method for nuclear fuel management

    International Nuclear Information System (INIS)

    Argaud, J.P.

    1997-01-01

    Nuclear fuel reloading of a PWR core leads to the search for an optimal distribution of nuclear fuel assemblies, namely a loading pattern. This large discrete optimization problem is here expressed as a cost function minimization. To deal with this problem, an approach based on gradient information is used to direct the search in the discrete space of patterns. A method using an adjoint state formulation is then developed, and final results of complete pattern searches by this method are presented. (author)

  12. Code Switching in the Classroom: A Case Study of Economics and Management Students at the University of Sfax, Tunisia

    Science.gov (United States)

    Bach Baoueb, Sallouha Lamia; Toumi, Naouel

    2012-01-01

    This case study explores the motivations for code switching (CS) in the interactions of Tunisian students at the faculty of Economics and Management in Sfax, Tunisia. The study focuses on students' (EMSs) classroom conversations and out-of-classroom peer interactions. The analysis of the social motivations of EMSs' CS behaviour shows that…

  13. Development of a computer code system for selecting off-site protective action in radiological accidents based on the multiobjective optimization method

    International Nuclear Information System (INIS)

    Ishigami, Tsutomu; Oyama, Kazuo

    1989-09-01

    This report presents a new method to support the selection of off-site protective actions in nuclear reactor accidents, and provides a user's manual for a computer code system, PRASMA, developed using the method. The PRASMA code system gives several candidate protective action zones of evacuation, sheltering and no action based on the multiobjective optimization method, which requires objective functions and decision variables. We have assigned the population risks of fatality and injury, and cost, as the objective functions, and the distances from a nuclear power plant characterizing the above three protective action zones as the decision variables. (author)

  14. Performance optimization of PM-16QAM transmission system enabled by real-time self-adaptive coding.

    Science.gov (United States)

    Qu, Zhen; Li, Yao; Mo, Weiyang; Yang, Mingwei; Zhu, Shengxiang; Kilper, Daniel C; Djordjevic, Ivan B

    2017-10-15

    We experimentally demonstrate self-adaptive coded 5×100 Gb/s WDM polarization-multiplexed 16 quadrature amplitude modulation transmission over a 100 km fiber link, enabled by a real-time control plane. The real-time optical signal-to-noise ratio (OSNR) is measured using an optical performance monitoring device. The OSNR measurement is processed and fed back using control plane logic and messaging to the transmitter side for code adaptation, where the binary data are adaptively encoded with three types of large-girth low-density parity-check (LDPC) codes with code rates of 0.8, 0.75, and 0.7. The total code-adaptation latency is measured to be 2273 ms. Compared with transmission without adaptation, average net capacity improvements of 102%, 36%, and 7.5% are obtained, respectively, by adaptive LDPC coding.

  15. Optimized and secure technique for multiplexing QR code images of single characters: application to noiseless messages retrieval

    International Nuclear Information System (INIS)

    Trejos, Sorayda; Barrera, John Fredy; Torroba, Roberto

    2015-01-01

    We present for the first time an optical encrypting–decrypting protocol for recovering messages without speckle noise. This is a digital holographic technique using a 2f scheme to process QR code entries. In the procedure, the letters used to compose eventual messages are individually converted into QR codes, and then each QR code is divided into portions. Through a holographic technique, we store each processed portion. After filtering and repositioning, we add all processed data to create a single pack, thus simplifying the handling and recovery of multiple QR code images and representing the first multiplexing procedure applied to processed QR codes. All QR codes are recovered in a single step and in the same plane, showing neither cross-talk nor noise problems as in other methods. Experiments have been conducted using an interferometric configuration, and comparisons between unprocessed and recovered QR codes have been performed, showing differences between them due to the involved processing. Recovered QR codes can be successfully scanned, thanks to their noise tolerance. Finally, the appropriate sequence in the scanning of the recovered QR codes brings a noiseless retrieved message. Additionally, to procure maximum security, the multiplexed pack could be multiplied by a digital diffuser so as to encrypt it. The encrypted pack is easily decoded by multiplying it by the complex conjugate of the diffuser. As this is a digital operation, no noise is added. Therefore, this technique is threefold robust, involving multiplexing, encryption, and the need of a sequence to retrieve the outcome. (paper)

  17. BASHAN: A few-group three-dimensional diffusion code with burnup and fuel management features

    International Nuclear Information System (INIS)

    Pearce, D.F.

    1970-12-01

    The diffusion equation for a two- or three-dimensional, two-group or multi-group downscatter problem is solved by conventional finite difference techniques. An x-y-z geometry is assumed with an 'in-channel' mesh point representation. Options are available which allow representation of a soluble poison dispersed throughout the reactor, and also of absorber rods in specified channels. The power distribution and multiplication factor k-eff are calculated, and a point rating map is used to advance the irradiation at each mesh point by a specified time-step so that burnup is followed. Fuel changes may be made so that radial-shuffling and axial-shuffling fuel management schemes can be studied. The code has been written in FORTRAN S2 for an IBM 7030 (STRETCH) computer which, with a fast store of 80,000 locations, allows problems of up to 15,000 mesh points to be dealt with. Conversion to FORTRAN IV for IBM 360 has now been completed. (author)
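
    The k-eff calculation mentioned above is conventionally done by power iteration on the finite-difference operators. A hedged one-group, one-dimensional sketch of that scheme (all cross sections and mesh data are illustrative, not BASHAN's):

    ```python
    import numpy as np

    # One-group slab problem: A*phi = (1/k) * F*phi, with
    # A = -D d2/dx2 + sigma_a (zero-flux boundaries) and F = nu*sigma_f.
    n, h = 50, 1.0                        # mesh points, mesh spacing (cm)
    D, sig_a, nu_sig_f = 1.0, 0.07, 0.08  # diffusion coeff., absorption, nu*fission

    # Tridiagonal finite-difference loss operator.
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 2.0 * D / h**2 + sig_a
        if i > 0:
            A[i, i - 1] = -D / h**2
        if i < n - 1:
            A[i, i + 1] = -D / h**2

    phi = np.ones(n)                      # initial flux guess
    k = 1.0
    for _ in range(200):                  # outer (power) iterations
        psi = np.linalg.solve(A, nu_sig_f * phi)  # solve A*psi = F*phi
        k = psi.sum() / phi.sum()         # eigenvalue estimate -> k-eff
        phi = psi / np.linalg.norm(psi)   # normalise flux for the next pass
    ```

    A production code solves the same eigenproblem in two or three groups over tens of thousands of mesh points, but the outer-iteration structure is this one.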

  18. ASTEC code development, validation and applications for severe accident management within the CESAM European project - 15392

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Chatelard, P.; Chevalier-Jabet, K.; Nowack, H.; Herranz, L.E.; Pascal, G.; Sanchez-Espinoza, V.H.

    2015-01-01

    ASTEC, jointly developed by IRSN and GRS, is considered the European reference code since it capitalizes knowledge from European research in the domain. The CESAM project aims at its enhancement and extension for use in severe accident management (SAM) analysis of Generation II-III nuclear power plants (NPP) currently in operation or foreseen in the near future in Europe, spent fuel pools included. Within the CESAM project, 3 main types of research activities are performed: -) further validation of ASTEC models important for SAM, in particular for phenomena of importance in the Fukushima-Daiichi accidents, such as reflooding of degraded cores, pool scrubbing, hydrogen combustion, or spent fuel pool behaviour; -) modelling improvements, especially for BWR or based on the feedback of validation tasks; and -) ASTEC applications to severe accident scenarios in European NPPs in order to assess prevention and mitigation measures. An important step will be reached with the next major ASTEC V2.1 version, planned to be delivered in the first part of 2015. Its main improvements will concern the possibility to simulate in detail the core degradation of BWR and PHWR, and a model of reflooding of severely degraded cores. A new user-friendly Graphical User Interface will be available for plant analyses.

  19. A model for the optimal risk management of farm firms

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    2012-01-01

    Risk management is an integral part of business or firm management and deals with the problem of how to avoid the risk of economic losses when the objective is to maximize expected profit. This paper will focus on the identification, assessment, and prioritization of risks in agriculture followed...... by a description of procedures for coordinated and economical application of resources to control the probability and/or impact of unfortunate events. Besides identifying the major risk factors and tools for risk management in agricultural production, the paper will look critically into the current methods...... for risk management. Risk management is typically based on numerical analysis and the concept of efficiency. None of the methods developed so far actually solve the basic question of how the individual manager should behave so as to optimise the balance between expected profit/income and risk. In the paper...
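
    The profit/risk balance the abstract refers to is often formalised as a mean-variance criterion: maximise E[profit] minus a risk-aversion penalty on its variance. A small illustrative sketch over two farm activities (all figures are invented for illustration, not from the paper):

    ```python
    import numpy as np

    # Mean-variance utility: U(x) = mu'x - (lambda/2) * x' Cov x,
    # over land shares x allocated to two activities.
    mu = np.array([120.0, 80.0])           # expected gross margin per hectare
    cov = np.array([[900.0, -150.0],
                    [-150.0, 100.0]])      # covariance of gross margins
    lam = 0.05                             # risk-aversion coefficient (assumed)

    best_w, best_u = 0.0, -np.inf
    for w in np.linspace(0.0, 1.0, 101):   # share of land in activity 1
        x = np.array([w, 1.0 - w])
        utility = mu @ x - 0.5 * lam * (x @ cov @ x)
        if utility > best_u:
            best_w, best_u = w, utility
    ```

    With these numbers, the riskier but more profitable activity gets most of the land, yet not all of it: the negative covariance makes a mixed allocation worth keeping, which is exactly the trade-off a purely profit-maximising plan would miss.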

  20. ECOLOGICAL AND ECONOMICALLY OPTIMAL MANAGEMENT OF WASTE FROM HEALTHCARE FACILITIES

    OpenAIRE

    Halina Marczak

    2013-01-01

    Modern healthcare facilities generate more and more waste, and its management constitutes a significant cost of their operation. Undertakings aimed at lowering waste management expenses may have a positive influence on the budgets of institutions rendering healthcare services. Using the example of a hospital in Lublin, the costs of waste management and the possibilities of lowering these costs by intensifying segregation procedures are presented....