WorldWideScience

Sample records for bioprocess development optimisation

  1. Hybrid elementary flux analysis/nonparametric modeling: application for bioprocess control

    Directory of Open Access Journals (Sweden)

    Alves Paula M

    2007-01-01

    Full Text Available Abstract Background The progress in the "-omic" sciences has allowed deeper knowledge of many biological systems with industrial interest. This knowledge is still rarely used for advanced bioprocess monitoring and control at the bioreactor level. In this work, a bioprocess control method is presented, which is designed on the basis of the metabolic network of the organism under consideration. The bioprocess dynamics are formulated using hybrid rigorous/data-driven systems whose inherent structure is defined by the elementary modes of the metabolism. Results The metabolic network of the system under study is decomposed into elementary modes (EMs), which are the simplest paths able to operate coherently in steady-state. A reduced reaction mechanism in the form of simplified reactions connecting substrates with end-products is obtained. A dynamical hybrid system integrating material balance equations, EM reaction stoichiometry and kinetics was formulated. EM kinetics were defined as the product of two terms: a mechanistic/empirical known term and an unknown term that must be identified from data, from a process optimisation perspective. This approach allows the quantification of the fluxes carried by individual elementary modes, which is of great help in identifying dominant pathways as a function of environmental conditions. The methodology was employed to analyse experimental data of recombinant Baby Hamster Kidney (BHK-21A) cultures producing a recombinant fusion glycoprotein. The identified EM kinetics demonstrated typical glucose and glutamine metabolic responses during cell growth and IgG1-IL2 synthesis. Finally, an online optimisation study was conducted in which the optimal feeding strategies of glucose and glutamine were calculated after re-estimation of model parameters at each sampling time. An improvement in the final product concentration was obtained as a result of this online optimisation. Conclusion The main contribution of this work is a
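
The hybrid formulation described in this record can be sketched in a few lines: a material balance dc/dt = K r(c), where each elementary-mode flux is the product of a known saturation-type term and an unknown gain to be identified from data. The two-mode network, stoichiometry, and all parameter values below are invented for illustration, not the BHK-21 network identified in the paper.

```python
# Hypothetical two-EM network: each elementary mode consumes glucose and
# yields either biomass or product. Rows of K: glucose, biomass, product;
# columns: EM1, EM2. All numbers are illustrative assumptions.
K = [[-1.0, -1.0],
     [ 0.4,  0.0],
     [ 0.0,  0.6]]

def em_rates(c, theta):
    """EM flux = known Monod-type term x unknown gain theta_i
    (the gains are what would be identified from process data)."""
    glucose = c[0]
    monod = glucose / (0.5 + glucose)      # shared saturation kinetics
    return [t * monod for t in theta]

def simulate(c0, theta, dt=0.01, steps=1000):
    """Explicit-Euler integration of the material balance dc/dt = K r(c)."""
    c = list(c0)
    for _ in range(steps):
        r = em_rates(c, theta)
        c = [max(ci + dt * sum(K[i][j] * r[j] for j in range(len(r))), 0.0)
             for i, ci in enumerate(c)]    # concentrations stay non-negative
    return c

# glucose is consumed while biomass and product accumulate
final = simulate([10.0, 0.1, 0.0], theta=[1.0, 0.5])
```

Fitting the gains `theta` against measured concentration profiles is the data-driven half of the hybrid model; the stoichiometric matrix `K` is the rigorous half.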

  2. Developing a Continuous Bioprocessing Approach to Stromal Cell Manufacture.

    Science.gov (United States)

    Miotto, Martina; Gouveia, Ricardo; Abidin, Fadhilah Zainal; Figueiredo, Francisco; Connon, Che J

    2017-11-29

    To this day, the concept of continuous bioprocessing has been applied mostly to the manufacture of molecular biologics such as proteins, growth factors, and secondary metabolites with biopharmaceutical uses. The present work now sets out to explore the potential application of continuous bioprocess methods to source large numbers of human adherent cells with potential therapeutic value. For this purpose, we developed a smart multifunctional surface coating capable of controlling the attachment, proliferation, and subsequent self-detachment of human corneal stromal cells. This system allowed the maintenance of cell cultures under steady-state growth conditions, where self-detaching cells were continuously replenished by the proliferation of those remaining attached. This facilitated a closed, continuous bioprocessing platform with recovery of approximately 1% of the total adherent cells per hour, a yield rate that was maintained for 1 month. Moreover, both attached and self-detached cells were shown to retain their original phenotype. Together, these results represent the proof-of-concept for a new high-throughput, high-standard, and low-cost biomanufacturing strategy with multiple potentials and important downstream applications.
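
A back-of-envelope check puts the reported yield in perspective: a steady-state culture shedding about 1% of its adherent population per hour, continuously replenished by growth, harvests several resident populations per month. The resident cell count below is an invented placeholder.

```python
# Monthly harvest of a steady-state self-detaching culture, using the
# ~1%/hour recovery rate reported in the abstract. The resident cell
# count is an assumed illustrative value.
resident_cells = 1e7                      # assumed steady-state adherent count
harvest_fraction_per_h = 0.01             # ~1% of adherent cells per hour
hours = 30 * 24                           # one month of continuous operation

total_harvested = resident_cells * harvest_fraction_per_h * hours
fold = total_harvested / resident_cells   # harvested populations per month
```

At this rate the platform delivers roughly 7 resident populations' worth of cells per month without passaging, which is the economic argument for the continuous approach.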

  3. [Progress in industrial bioprocess engineering in China].

    Science.gov (United States)

    Zhuang, Yingping; Chen, Hongzhang; Xia, Jianye; Tang, Wenjun; Zhao, Zhimin

    2015-06-01

    Advances in industrial biotechnology depend heavily on the development of industrial bioprocess research. In China, we are facing several challenges because of a huge national industrial fermentation capacity. Industrial bioprocess development has passed through several main stages. This work mainly reviews the development of industrial bioprocesses in China during the past 30 to 40 years, including early-stage kinetic model studies derived from classical chemical engineering, research methods based on control theory, multi-parameter analysis using on-line measuring instruments and techniques, multi-scale analysis theory, and solid-state fermentation techniques and fermenters. In addition, the cutting edge of bioprocess engineering is also addressed.

  4. Bioprocessing research for energy applications

    Energy Technology Data Exchange (ETDEWEB)

    Scott, C.D.; Gaden, E.L. Jr.; Humphrey, A.E.; Carta, G.; Kirwan, D.J.

    1989-04-01

    The new biotechnology that is emerging could have a major impact on many of the industries important to our country, especially those associated with energy production and conservation. Advances in bioprocessing systems will provide important alternatives for the future utilization of various energy resources and for the control of environmental hazards that can result from energy generation. Although research in the fundamental biological sciences has helped set the scene for a "new biotechnology," the major impediment to rapid commercialization for energy applications is the lack of a firm understanding of the necessary engineering concepts. Engineering research is now the essential "bridge" that will allow the development of a wide range of energy-related bioprocessing systems. A workshop entitled "Bioprocessing Research for Energy Applications" was held to address this technological area, to define the engineering research needs, and to identify those opportunities which would encourage rapid implementation of advanced bioprocessing concepts.

  5. Systematic Development of Miniaturized (Bio)Processes using Process Systems Engineering (PSE) Methods and Tools

    DEFF Research Database (Denmark)

    Krühne, Ulrich; Larsson, Hilde; Heintz, Søren

    2014-01-01

    The focus of this work is on process systems engineering (PSE) methods and tools, and especially on how such PSE methods and tools can be used to accelerate and support systematic bioprocess development at a miniature scale. After a short presentation of the PSE methods and the bioprocess...... development drivers, three case studies are presented. In the first example it is demonstrated how experimental investigations of the bi-enzymatic production of lactobionic acid can be modeled with the help of a new mechanistic mathematical model. The reaction was performed at lab scale and the prediction quality...

  6. On-line bioprocess monitoring - an academic discipline or an industrial tool?

    DEFF Research Database (Denmark)

    Olsson, Lisbeth; Schulze, Ulrik; Nielsen, Jens Bredal

    1998-01-01

    Bioprocess monitoring capabilities are gaining increasing importance both in physiological studies and in bioprocess development. The present article focuses on on-line analytical systems since these represent the backbone of most bioprocess monitoring systems, both in academia and in industry. W...

  7. Nanobiocatalyst advancements and bioprocessing applications.

    Science.gov (United States)

    Misson, Mailin; Zhang, Hu; Jin, Bo

    2015-01-06

    The nanobiocatalyst (NBC) is an emerging innovation that synergistically integrates advanced nanotechnology with biotechnology and promises exciting advantages for improving enzyme activity, stability, capability and engineering performances in bioprocessing applications. NBCs are fabricated by immobilizing enzymes with functional nanomaterials as enzyme carriers or containers. In this paper, we review the recent developments of novel nanocarriers/nanocontainers with advanced hierarchical porous structures for retaining enzymes, such as nanofibres (NFs), mesoporous nanocarriers and nanocages. Strategies for immobilizing enzymes onto nanocarriers made from polymers, silicas, carbons and metals by physical adsorption, covalent binding, cross-linking or specific ligand spacers are discussed. The resulting NBCs are critically evaluated in terms of their bioprocessing performances. Excellent performances are demonstrated through enhanced NBC catalytic activity and stability due to conformational changes upon immobilization and localized nanoenvironments, and NBC reutilization by assembling magnetic nanoparticles into NBCs to defray the high operational costs associated with enzyme production and nanocarrier synthesis. We also highlight several challenges associated with the NBC-driven bioprocess applications, including the maturation of large-scale nanocarrier synthesis, design and development of bioreactors to accommodate NBCs, and long-term operations of NBCs. We suggest these challenges are to be addressed through joint collaboration of chemists, engineers and material scientists. Finally, we have demonstrated the great potential of NBCs in manufacturing bioprocesses in the near future through successful laboratory trials of NBCs in carbohydrate hydrolysis, biofuel production and biotransformation. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  8. Modeling and simulation of the bioprocess with recirculation

    Directory of Open Access Journals (Sweden)

    Žerajić Stanko

    2007-01-01

    Full Text Available Bioprocess models with recirculation integrate a model of the continuous bioreaction system with a model of the separation system. The reaction bioprocess is integrated with separation of the biomass, the formed product, unconsumed substrate or inhibitory substances. In this paper a simulation model of the recirculation bioprocess was developed, which may be applied to increase biomass productivity and product biosynthesis by increasing the conversion of substrate to product, mixing efficiency and secondary CO2 separation. The goal of the work is an optimal bioprocess configuration, which is determined by simulation optimization. The optimal chemostat state was used as reference. A step-by-step simulation method is necessary because the initial bioprocess state changes with recirculation in each step. The simulation experiment confirms that, at a recirculation ratio α = 0.275 and a concentration factor C = 4, the maximum glucose-to-ethanol conversion is obtained at a dilution rate ten times larger.
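
The benefit of recirculation can be illustrated with the textbook steady-state biomass balance for a chemostat with cell recycle, mu = D(1 + a - aC), where a is the recycle ratio and C the concentrator factor. This is a simplified relation, not the paper's full simulation model, and with the abstract's parameter values it gives a gain of roughly 5.7, so the reported tenfold figure evidently reflects the richer model.

```python
# Gain in allowable dilution rate from cell recycle, based on the
# textbook steady-state balance mu = D * (1 + a - a*C). Simplified
# sketch only; the paper's step-by-step simulation model is richer.
def critical_dilution_gain(a, C):
    """Factor by which recycle raises the allowable dilution rate
    relative to a simple chemostat (where mu = D at steady state)."""
    retention = 1.0 + a - a * C
    if retention <= 0:
        raise ValueError("complete cell retention: no finite steady state")
    return 1.0 / retention

# parameter values from the abstract: recycle ratio 0.275, factor 4
gain = critical_dilution_gain(a=0.275, C=4.0)
```

Because recycled cells are concentrated before re-entering the reactor, washout is postponed and the reactor can be run at dilution rates well above the maximum specific growth rate.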

  9. Design-for-Six-Sigma To Develop a Bioprocess Knowledge Management Framework.

    Science.gov (United States)

    Junker, Beth; Maheshwari, Gargi; Ranheim, Todd; Altaras, Nedim; Stankevicz, Michael; Harmon, Lori; Rios, Sandra; D'anjou, Marc

    2011-01-01

    Owing to the high costs associated with biopharmaceutical development, considerable pressure has developed for the biopharmaceutical industry to increase productivity by becoming more lean and flexible. The ability to reuse knowledge was identified as one key advantage to streamline productivity, efficiently use resources, and ultimately perform better than the competition. A knowledge management (KM) strategy was assembled for bioprocess-related information using the technique of Design-for-Six-Sigma (DFSS). This strategy supported quality-by-design and process validation efforts for pipeline as well as licensed products. The DFSS technique was selected because it was both streamlined and efficient. These characteristics permitted development of a KM strategy with minimized team leader and team member resources. DFSS also placed a high emphasis on the voice of the customer, information considered crucial to the selection of solutions most appropriate for the current knowledge-based challenges of the organization. The KM strategy developed comprised nine workstreams, constructed from related solution buckets, which in turn were assembled from the individual solution tasks identified. Each workstream's detailed design was evaluated against published and established best practices, as well as the KM strategy project charter and design inputs. Gaps and risks were identified and mitigated as necessary to improve the robustness of the proposed strategy. Aggregated resources (specifically expense/capital funds and staff) and timing were estimated to obtain vital management sponsorship for implementation. Where possible, existing governance and divisional/corporate information technology efforts were leveraged to minimize the additional bioprocess resources required for implementation. Finally, leading and lagging indicator metrics were selected to track the success of pilots and eventual implementation.
A knowledge management framework was assembled for

  10. Bioprocess systems engineering: transferring traditional process engineering principles to industrial biotechnology.

    Science.gov (United States)

    Koutinas, Michalis; Kiparissides, Alexandros; Pistikopoulos, Efstratios N; Mantalaris, Athanasios

    2012-01-01

    The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance in developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals.
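
The "empirical expressions traditionally used to describe micro-organism growth kinetics" that this review seeks to advance are typified by the Monod model, sketched below as a simple batch balance. Parameter values are illustrative, not taken from the paper.

```python
# Monod kinetics: the classical empirical growth expression, integrated
# here as a batch mass balance with explicit Euler. Parameters invented.
def monod_step(X, S, mu_max=0.4, Ks=0.1, Yxs=0.5, dt=0.01):
    """One Euler step: biomass X grows at the Monod rate; substrate S
    is consumed at a fixed biomass yield Yxs (g biomass / g substrate)."""
    mu = mu_max * S / (Ks + S)        # specific growth rate, 1/h
    dX = mu * X * dt                  # biomass formed this step
    dS = -(mu / Yxs) * X * dt         # substrate consumed this step
    return X + dX, max(S + dS, 0.0)   # substrate cannot go negative

X, S = 0.05, 10.0                     # g/L biomass and substrate at inoculation
for _ in range(3000):                 # 30 h of simulated batch culture
    X, S = monod_step(X, S)
```

Mechanistic models of the kind the review advocates would replace the constant `mu_max` and `Ks` with terms that respond to gene regulation and intracellular state.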

  11. Bioprocess systems engineering: transferring traditional process engineering principles to industrial biotechnology

    Directory of Open Access Journals (Sweden)

    Michalis Koutinas

    2012-10-01

    Full Text Available The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance in developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control & optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals.

  12. BIOPROCESS SYSTEMS ENGINEERING: TRANSFERRING TRADITIONAL PROCESS ENGINEERING PRINCIPLES TO INDUSTRIAL BIOTECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Michalis Koutinas

    2012-10-01

    Full Text Available The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance in developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals.

  13. Guiding bioprocess design by microbial ecology.

    Science.gov (United States)

    Volmer, Jan; Schmid, Andreas; Bühler, Bruno

    2015-06-01

    Industrial bioprocess development is driven by profitability and eco-efficiency. It profits from an early-stage definition of process and biocatalyst design objectives. Microbial bioprocess environments can be considered as synthetic technical microbial ecosystems. Natural systems follow Darwinian evolution principles aiming at survival and reproduction. The objectives of technical systems are eco-efficiency, productivity, and profitable production. Deciphering technical microbial ecology reveals differences and similarities between the objectives of natural and technical systems, which are discussed in this review in view of biocatalyst and process design and engineering strategies. Strategies for handling opposing objectives of natural and technical systems, and for exploiting and engineering natural properties of microorganisms for technical systems, are reviewed based on examples. This illustrates the relevance of considering microbial ecology for bioprocess design and the potential for exploitation by synthetic biology strategies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Nano-tubular cellulose for bioprocess technology development.

    Science.gov (United States)

    Koutinas, Athanasios A; Sypsas, Vasilios; Kandylis, Panagiotis; Michelis, Andreas; Bekatorou, Argyro; Kourkoutas, Yiannis; Kordulis, Christos; Lycourghiotis, Alexis; Banat, Ibrahim M; Nigam, Poonam; Marchant, Roger; Giannouli, Myrsini; Yianoulis, Panagiotis

    2012-01-01

    Delignified cellulosic material has shown a significant promotional effect on alcoholic fermentation as a yeast immobilization support. However, its potential for further biotechnological development remains unexploited. This study reports the characterization of this tubular/porous cellulosic material, which was done by SEM, porosimetry and X-ray powder diffractometry. The results showed that the structure of nano-tubular cellulose (NC) justifies its suitability for use in "cold pasteurization" processes and its promoting activity in bioprocessing (fermentation). The latter was explained by a glucose pump theory. Also, it was demonstrated that crystallization of viscous invert sugar solutions during freeze drying could not be otherwise achieved unless NC was present. This effect as well as the feasibility of extremely low temperature fermentation are due to reduction of the activation energy, and have facilitated the development of technologies such as wine fermentations at home scale (in a domestic refrigerator). Moreover, NC may lead to new perspectives in research such as the development of new composites, templates for cylindrical nano-particles, etc.

  15. BIOPROCESS DEVELOPMENTS FOR CELLULASE PRODUCTION BY Aspergillus oryzae CULTIVATED UNDER SOLID-STATE FERMENTATION

    Directory of Open Access Journals (Sweden)

    R. D. P. B. Pirota

    Full Text Available Abstract Bioprocess development studies concerning the production of cellulases are of crucial importance due to the significant impact of these enzymes on the economics of biomass conversion into fuels and chemicals. This work evaluates the effects of solid-state fermentation (SSF) operational conditions on cellulase production by a novel strain of Aspergillus oryzae using an instrumented lab-scale bioreactor equipped with an on-line automated monitoring and control system. The use of SSF cultivation under controlled conditions substantially improved cellulase production. Highest production of FPase (0.40 IU g-1), endoglucanase (123.64 IU g-1), and β-glucosidase (18.32 IU g-1) was achieved at 28 °C, using an initial substrate moisture content of 70%, with an inlet air humidity of 80% and an airflow rate of 20 mL min-1. Further studies of kinetic profiles and respirometric analyses were performed. The results showed that these data could be very useful for bioprocess development of cellulase production and scale-up.

  16. Establishing new microbial cell factories for sustainable bioprocesses

    DEFF Research Database (Denmark)

    Workman, Mhairi; Holt, Philippe; Liu, Xiaoying

    2012-01-01

    The demands of modern society are increasing pressure on natural resources while creating the need for a wider range of products. There is an interest in developing bioprocesses to meet these demands, with conversion of a variety of waste materials providing the basis for a sustainable society...... The application of biological catalysts which can convert a variety of substrates to an array of desirable products has been demonstrated in both ancient bioprocesses and modern industrial biotechnology. In recent times, focus has been on a limited number of “model” organisms which have been extensively exploited...... of products, it may be interesting to look to less domesticated strains and towards more non-conventional hosts in the development of new bioprocesses. This approach demands thorough physiological characterization as well as establishment of tools for genetic engineering if new cell factories......

  17. Incorporating unnatural amino acids to engineer biocatalysts for industrial bioprocess applications.

    Science.gov (United States)

    Ravikumar, Yuvaraj; Nadarajan, Saravanan Prabhu; Hyeon Yoo, Tae; Lee, Chong-Soon; Yun, Hyungdon

    2015-12-01

    Bioprocess engineering with biocatalysts broadly spans the development and the actual application of enzymes in an industrial context. Recently, both the use of bioprocess engineering and the development and employment of enzyme engineering techniques have been increasing rapidly. Importantly, engineering techniques that incorporate unnatural amino acids (UAAs) in vivo have begun to produce enzymes with greater stability and altered catalytic properties. Despite the growth of this technique, its potential value in bioprocess applications remains to be fully exploited. In this review, we explore the methodologies involved in UAA incorporation as well as ways to synthesize these UAAs. In addition, we summarize recent efforts to increase the yield of UAA-engineered proteins in Escherichia coli and also the application of this tool in enzyme engineering. Furthermore, this protein engineering tool based on the incorporation of UAAs can be used to develop immobilized enzymes that are ideal for bioprocess applications. Considering the potential of this tool and by exploiting these engineered enzymes, we expect the field of bioprocess engineering to open up new opportunities for biocatalysis in the near future. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Nano-tubular cellulose for bioprocess technology development.

    Directory of Open Access Journals (Sweden)

    Athanasios A Koutinas

    Full Text Available Delignified cellulosic material has shown a significant promotional effect on alcoholic fermentation as a yeast immobilization support. However, its potential for further biotechnological development remains unexploited. This study reports the characterization of this tubular/porous cellulosic material, which was done by SEM, porosimetry and X-ray powder diffractometry. The results showed that the structure of nano-tubular cellulose (NC) justifies its suitability for use in "cold pasteurization" processes and its promoting activity in bioprocessing (fermentation). The latter was explained by a glucose pump theory. Also, it was demonstrated that crystallization of viscous invert sugar solutions during freeze drying could not be otherwise achieved unless NC was present. This effect as well as the feasibility of extremely low temperature fermentation are due to reduction of the activation energy, and have facilitated the development of technologies such as wine fermentations at home scale (in a domestic refrigerator). Moreover, NC may lead to new perspectives in research such as the development of new composites, templates for cylindrical nano-particles, etc.

  19. On-line soft sensing in upstream bioprocessing.

    Science.gov (United States)

    Randek, Judit; Mandenius, Carl-Fredrik

    2018-02-01

    This review provides an overview and a critical discussion of novel possibilities of applying soft sensors for on-line monitoring and control of industrial bioprocesses. Focus is on bio-product formation in the upstream process, but integration with other parts of the process is also addressed. The term soft sensor is used for the combination of analytical hardware data (from sensors, analytical devices, instruments and actuators) with mathematical models that create new real-time information about the process. In particular, the review assesses these possibilities from an industrial perspective, including sensor performance, information value and production economy. The capabilities of existing analytical on-line techniques are scrutinized in view of their usefulness in soft sensor setups and in relation to typical needs in bioprocessing in general. The review concludes with specific recommendations for further development of soft sensors for the monitoring and control of upstream bioprocessing.
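
A soft sensor in the sense defined above pairs hardware data with a mathematical model to create new real-time information. A minimal sketch: biomass is inferred on-line from the oxygen uptake rate (OUR) via an assumed stoichiometric yield. The yield value and the measurement series below are invented for illustration.

```python
# Cumulative-OUR soft sensor: biomass formed is taken as proportional
# to the oxygen consumed so far (a static gas-balance model). The yield
# y_xo and the synthetic OUR data are illustrative assumptions.
def soft_sensor_biomass(our_series, dt, x0, y_xo=0.04):
    """Return an on-line biomass estimate after each OUR measurement.

    our_series -- OUR readings in mmol O2 / (L h)
    dt         -- sampling interval in h
    x0         -- biomass at the start of monitoring, g/L
    y_xo       -- assumed g biomass formed per mmol O2 consumed
    """
    x = x0
    estimates = []
    for our in our_series:
        x += y_xo * our * dt        # integrate oxygen consumption
        estimates.append(x)
    return estimates

# ten off-gas readings at constant OUR = 10 mmol/(L h), 0.1 h apart
est = soft_sensor_biomass([10.0] * 10, dt=0.1, x0=1.0)
```

Industrial soft sensors layer filtering and model adaptation on top of such a core, but the principle is the same: a measurable signal plus a model yields an unmeasurable state in real time.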

  20. Disposable bioprocessing: the future has arrived.

    Science.gov (United States)

    Rao, Govind; Moreira, Antonio; Brorson, Kurt

    2009-02-01

    Increasing cost pressures are driving the rapid adoption of disposables in bioprocessing. While disposables are well ensconced in lab-scale operations, lower operating/validation costs at larger scale and relative ease of use are leading these systems to enter all stages and operations of a typical biopharmaceutical manufacturing process. Here, we focus on progress made in the incorporation of disposable equipment with sensor technology in bioprocessing throughout the development cycle. We note that sensor patch technology is mostly being adapted to disposable cell culture devices, but future adaptation to downstream steps is conceivable. Lastly, regulatory requirements are also briefly assessed in the context of disposables and the Process Analytical Technology (PAT) and Quality by Design (QbD) initiatives.

  1. Optimisation: how to develop stakeholder involvement

    International Nuclear Information System (INIS)

    Weiss, W.

    2003-01-01

    The Precautionary Principle is an internationally recognised approach for dealing with risk situations characterised by uncertainties and potentially irreversible damages. Since the late fifties, ICRP has adopted this prudent attitude because of the lack of scientific evidence concerning the existence of a threshold at low doses for stochastic effects. The 'linear, no-threshold' model and the 'optimisation of protection' principle have been developed as a pragmatic response for the management of the risk. Progress in epidemiology and radiobiology over the last decades has affirmed the initial assumption, and optimisation remains the appropriate response for the application of the precautionary principle in the context of radiological protection. The basic objective of optimisation is, for any source within the system of radiological protection, to maintain the level of exposure as low as reasonably achievable, taking into account social and economic factors. Methods, tools and procedures have been developed over the last two decades to put the optimisation principle into practice, with a central role given to cost-benefit analysis as a means to determine the optimised level of protection. However, as implementation of the principle advanced, more emphasis was progressively given to good practice, as well as to the importance of controlling individual levels of exposure through the optimisation process. In the context of the revision of its present recommendations, the Commission is reinforcing the emphasis on protection of the individual with the adoption of an equity-based system that recognizes individual rights and a basic level of health protection. Another advancement is the role now recognised for 'stakeholder involvement' in the optimisation process as a means to improve the quality of the decision-aiding process for identifying and selecting protection actions considered as being accepted by all those involved. The paper
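
The cost-benefit analysis given a central role above can be sketched as choosing, among candidate protection options, the one that minimises protection cost plus the monetary value assigned to the residual collective dose. All option names, costs and dose figures below are invented for illustration, as is the monetary value per person-sievert.

```python
# ALARA-style cost-benefit sketch: pick the protection option minimising
# protection cost + alpha * residual collective dose. All numbers invented.
ALPHA = 100_000.0   # assumed monetary value per person-sievert

options = [
    {"name": "no action",  "cost": 0.0,       "dose_person_sv": 2.0},
    {"name": "shielding",  "cost": 100_000.0, "dose_person_sv": 0.8},
    {"name": "remote ops", "cost": 300_000.0, "dose_person_sv": 0.5},
]

def total_detriment(opt, alpha=ALPHA):
    # protection cost plus monetised residual collective dose
    return opt["cost"] + alpha * opt["dose_person_sv"]

best = min(options, key=total_detriment)   # the optimised option
```

The shift described in the abstract is precisely away from letting such a single-number criterion decide alone, toward good practice, individual dose control and stakeholder involvement in weighing the options.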

  2. Stem cell bioprocessing: fundamentals and principles.

    Science.gov (United States)

    Placzek, Mark R; Chung, I-Ming; Macedo, Hugo M; Ismail, Siti; Mortera Blanco, Teresa; Lim, Mayasari; Cha, Jae Min; Fauzi, Iliana; Kang, Yunyi; Yeo, David C L; Ma, Chi Yip Joan; Polak, Julia M; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2009-03-06

    In recent years, the potential of stem cell research for tissue engineering-based therapies and regenerative medicine clinical applications has become well established. In 2006, Chung pioneered the first entire organ transplant using adult stem cells and a scaffold for clinical evaluation. With this, a new milestone was achieved, with seven patients with myelomeningocele receiving stem cell-derived bladder transplants resulting in substantial improvements in their quality of life. While a bladder is a relatively simple organ, the breakthrough highlights the incredible benefits that can be gained from the cross-disciplinary nature of tissue engineering and regenerative medicine (TERM) that encompasses stem cell research and stem cell bioprocessing. Unquestionably, the development of bioprocess technologies for the transfer of the current laboratory-based practice of stem cell tissue culture to the clinic as therapeutics necessitates the application of engineering principles and practices to achieve control, reproducibility, automation, validation and safety of the process and the product. The successful translation will require contributions from fundamental research (from developmental biology to the 'omics' technologies and advances in immunology) and from existing industrial practice (biologics), especially on automation, quality assurance and regulation. The timely development, integration and execution of various components will be critical; failures of the past (such as in the commercialization of skin equivalents) in marketing, pricing, production and advertising should not be repeated. This review aims to address the principles required for successful stem cell bioprocessing so that they can be applied deftly to clinical applications.

  3. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy.

  4. Miniature Bioprocess Array: A Platform for Quantitative Physiology and Bioprocess Optimization

    National Research Council Canada - National Science Library

    Keasling, Jay

    2002-01-01

    .... The miniature bioprocess array is based on an array of 150-microliter wells, each one of which incorporates MEMS for the closed-loop control of cell culture parameters such as temperature, pH, and dissolved oxygen...

  5. Control of Bioprocesses

    DEFF Research Database (Denmark)

    Huusom, Jakob Kjøbsted

    2015-01-01

    The purpose of bioprocess control is to ensure that the plant operates as designed. This chapter presents the fundamental principles for control of biochemical processes. Through examples, the selection of manipulated and controlled variables in the classical reactor configurations is discussed, so...... are control objectives and the challenges in obtaining good control of the bioreactor. The objective of this chapter is to discuss the bioreactor control problems and to highlight some general traits that distinguish operation of bioprocesses from operation of processes in the conventional chemical process...... industries. It also provides a number of typical control loops for different objectives. A brief introduction to the general principles of process control, the PID control algorithm is discussed, and the design and effect of tuning are shown in an example. Finally, a discussion of novel, model-free control...
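    The PID algorithm mentioned in the chapter can be made concrete with a minimal discrete-time sketch. The toy plant model (a first-order dissolved-oxygen response to stirrer speed), the tuning values and the variable names below are illustrative assumptions, not taken from the chapter.

```python
# Minimal discrete-time PID controller sketch for a bioreactor loop
# (e.g. dissolved-oxygen control via stirrer speed). All tuning values
# and the plant model are illustrative assumptions, not recommendations.

class PID:
    def __init__(self, kp, ki, kd, dt, u_min=0.0, u_max=100.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * deriv
        # Clamp to actuator limits (no anti-windup in this sketch).
        return min(self.u_max, max(self.u_min, u))

# Toy first-order DO process responding to stirrer speed.
pid = PID(kp=2.0, ki=0.5, kd=0.0, dt=1.0)
do_level = 10.0  # % air saturation
for _ in range(200):
    u = pid.step(setpoint=30.0, measurement=do_level)
    do_level += (0.5 * u - do_level) * 0.1  # gain 0.5, time constant ~10 steps
print(round(do_level, 1))
```

    The integral term drives the steady-state error to zero; tuning (as the chapter notes) determines how quickly, and with how much oscillation, the loop settles.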

  6. Bioprocessing of a stored mixed liquid waste

    Energy Technology Data Exchange (ETDEWEB)

    Wolfram, J.H.; Rogers, R.D. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]; Finney, R. [Mound Applied Technologies, Miamisburg, OH (United States)] [and others]

    1995-12-31

    This paper describes the development and results of a demonstration of a continuous bioprocess for mixed waste treatment. A key element of the process is a unique microbial strain which tolerates high levels of aromatic solvents and surfactants. This microorganism is the biocatalyst of the continuous flow system designed for the processing of stored liquid scintillation wastes. During the past year a process demonstration has been conducted on a commercial formulation of liquid scintillation cocktails (LSC). Based on data obtained from this demonstration, the Ohio EPA granted the Mound Applied Technologies Lab a treatability permit allowing the limited processing of actual mixed waste. Since August 1994, the system has been successfully processing stored "hot" LSC waste. The initial LSC waste fed into the system contained 11% pseudocumene and detectable quantities of plutonium. Another treated waste stream contained pseudocumene and tritium. Data from this initial work show that the hazardous organic solvents, including pseudocumene, have been removed by processing, leaving an aqueous low-level radioactive waste. Results to date have shown that living cells are not affected by the dissolved plutonium and that 95% of the plutonium was sorbed to the biomass. This paper discusses the bioprocess, rates of processing, effluent, and the implications of bioprocessing for mixed waste management.

  7. Virtual parameter-estimation experiments in Bioprocess-Engineering education

    NARCIS (Netherlands)

    Sessink, O.D.T.; Beeftink, H.H.; Hartog, R.J.M.; Tramper, J.

    2006-01-01

    Cell growth kinetics and reactor concepts constitute essential knowledge for Bioprocess-Engineering students. Traditional learning of these concepts is supported by lectures, tutorials, and practicals: ICT offers opportunities for improvement. A virtual-experiment environment was developed that

  8. Soft sensors in bioprocessing: A status report and recommendations

    DEFF Research Database (Denmark)

    Luttmann, Reiner; Bracewell, Daniel G.; Cornelissen, Gesine

    2012-01-01

    The following report with recommendations is the result of an expert panel meeting on soft sensor applications in bioprocess engineering that was organized by the Measurement, Monitoring, Modelling and Control (M3C) Working Group of the European Federation of Biotechnology - Section of Biochemical...... Engineering Science (ESBES). The aim of the panel was to provide an update on the present status of the subject and to identify critical needs and issues for the furthering of the successful development of soft sensor methods in bioprocess engineering research and for industrial applications, in particular...... with focus on biopharmaceutical applications. It concludes with a set of recommendations, which highlight current prospects for the extended use of soft sensors and those areas requiring development....

  9. Bioprocesses: Modelling needs for process evaluation and sustainability assessment

    DEFF Research Database (Denmark)

    Jiménez-Gonzaléz, Concepcion; Woodley, John

    2010-01-01

    development such that they can also be used to evaluate processes against sustainability metrics, as well as economics as an integral part of assessments. Finally, property models will also be required based on compounds not currently present in existing databases. It is clear that many new opportunities......The next generation of process engineers will face a new set of challenges, with the need to devise new bioprocesses, with high selectivity for pharmaceutical manufacture, and for lower value chemicals manufacture based on renewable feedstocks. In this paper the current and predicted future roles...... of process system engineering and life cycle inventory and assessment in the design, development and improvement of sustainable bioprocesses are explored. The existing process systems engineering software tools will prove essential to assist this work. However, the existing tools will also require further...

  10. Raman spectroscopy as a process analytical technology for pharmaceutical manufacturing and bioprocessing.

    Science.gov (United States)

    Esmonde-White, Karen A; Cuellar, Maryann; Uerpmann, Carsten; Lenain, Bruno; Lewis, Ian R

    2017-01-01

    Adoption of Quality by Design (QbD) principles, regulatory support of QbD, process analytical technology (PAT), and continuous manufacturing are major factors effecting new approaches to pharmaceutical manufacturing and bioprocessing. In this review, we highlight new technology developments, data analysis models, and applications of Raman spectroscopy, which have expanded the scope of Raman spectroscopy as a process analytical technology. Emerging technologies such as transmission and enhanced reflection Raman, and new approaches to using available technologies, expand the scope of Raman spectroscopy in pharmaceutical manufacturing, and now Raman spectroscopy is successfully integrated into real-time release testing, continuous manufacturing, and statistical process control. Since the last major review of Raman as a pharmaceutical PAT in 2010, many new Raman applications in bioprocessing have emerged. Exciting reports of in situ Raman spectroscopy in bioprocesses complement a growing scientific field of biological and biomedical Raman spectroscopy. Raman spectroscopy has made a positive impact as a process analytical and control tool for pharmaceutical manufacturing and bioprocessing, with demonstrated scientific and financial benefits throughout a product's lifecycle.

  11. On optimal development and becoming an optimiser

    NARCIS (Netherlands)

    de Ruyter, D.J.

    2012-01-01

    The article aims to provide a justification for the claim that optimal development and becoming an optimiser are educational ideals that parents should pursue in raising their children. Optimal development is conceptualised as enabling children to grow into flourishing persons, that is, persons who

  12. Sense and sensitivity in bioprocessing-detecting cellular metabolites with biosensors.

    Science.gov (United States)

    Dekker, Linda; Polizzi, Karen M

    2017-10-01

    Biosensors use biological elements to detect or quantify an analyte of interest. In bioprocessing, biosensors are employed to monitor key metabolites. There are two main types: fully biological systems or biological recognition coupled with physical/chemical detection. New developments in chemical biosensors include multiplexed detection using microfluidics. Synthetic biology can be used to engineer new biological biosensors with improved characteristics. Although few biosensors have been developed for bioprocessing thus far, emerging trends can be applied in the future. A range of new platform technologies will enable rapid engineering of new biosensors based on transcriptional activation, riboswitches, and Förster Resonance Energy Transfer. However, translation to industry remains a challenge and more research into the robustness of biosensors at scale is needed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Development of bioprocess for high density cultivation yield of the probiotic Bacillus coagulans and its spores

    Directory of Open Access Journals (Sweden)

    Kavita R. Pandey

    2016-09-01

    Full Text Available Bacillus coagulans is a spore-forming lactic acid bacterium. Spore-forming bacteria have been extensively studied and commercialized as probiotics. Probiotics are produced by fermentation technology, but there is a limit to the biomass produced by conventional modes of fermentation. With the great demand generated by the range of probiotic products, biomass is becoming very valuable for several pharmaceutical, dairy and probiotic companies. Thus, there is a need to develop high cell density cultivation processes for enhanced biomass accumulation. The bioprocess development was carried out in a 6.6 L bench-top lab-scale fermentor. Four different cultivation strategies were employed to develop a bioprocess for higher growth and sporulation efficiencies of probiotic B. coagulans. Batch fermentation of B. coagulans yielded 18 g L-1 biomass (against 8.0 g L-1 in shake flasks) with 60% spore efficiency. Fed-batch cultivation with glucose feeding yielded 25 g L-1 of biomass. The C/N ratio was crucial in achieving higher spore titres. The maximum biomass yield recorded was 30 g L-1, corresponding to 3.8 × 10(11) cells mL-1, with 81% of cells in the sporulated stage. This represents 85 times the productivity and 158 times the spore titres of the highest previously reported values for high density cultivation of B. coagulans.

  14. Design of digital learning material for bioprocess-engineering-education

    NARCIS (Netherlands)

    Schaaf, van der H.

    2007-01-01

    With the advance of computers and the internet, new types of learning material can be developed: web-based digital learning material. Because many complex learning objectives in the food- and bioprocess technology domain are difficult to achieve in a traditional learning environment, a project was

  15. Therapeutic antibodies: market considerations, disease targets and bioprocessing.

    Science.gov (United States)

    Elvin, John G; Couston, Ruairidh G; van der Walle, Christopher F

    2013-01-02

    Antibodies are well established in mainstream clinical practice and present an exciting area for collaborative research and development in industry and academia alike. In this review, we will provide an overview of the current market and an outlook to 2015, focussing on whole antibody molecules while acknowledging the next generation scaffolds containing variable fragments. The market will be discussed in the context of disease targets, particularly in the areas of oncology and immune disorders which generate the greatest revenue by a wide margin. Emerging targets include central nervous system disorders which will also stimulate new delivery strategies. It is becoming increasingly apparent that a better understanding of bioprocessing is required in order to optimize the steps involved in the preparation of a protein prior to formulation. The latter is outside the scope of this review and nor is it our intention to discuss protein delivery and pharmacokinetics. The challenges that lie ahead include the discovery of new disease targets and the development of robust bioprocessing operations. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  16. Bioprocess intensification for the effective production of chemical products

    DEFF Research Database (Denmark)

    Woodley, John

    2017-01-01

    The further implementation of new bioprocesses, using biocatalysts in various formats, for the synthesis of chemicals is highly dependent upon effective process intensification. The need for process intensification reflects the fact that the conditions under which a biocatalyst carries out...... a reaction in nature are far from those which are optimal for industrial processes. In this paper the rationale for intensification will be discussed, as well as the four complementary approaches used today to achieve bioprocess intensification. Two of these four approaches are based on alteration...... of the biocatalyst (either by protein engineering or metabolic engineering), resulting in an extra degree of freedom in the process design. To date, biocatalyst engineering has been developed independently from the conventional process engineering methodology to intensification. Although the integration of these two...

  17. White paper on continuous bioprocessing. May 20-21, 2014 Continuous Manufacturing Symposium.

    Science.gov (United States)

    Konstantinov, Konstantin B; Cooney, Charles L

    2015-03-01

    There is a growing interest in realizing the benefits of continuous processing in biologics manufacturing, which is reflected by the significant number of industrial and academic researchers who are actively involved in the development of continuous bioprocessing systems. These efforts are further encouraged by guidance expressed in recent US FDA conference presentations. The advantages of continuous manufacturing include sustained operation with consistent product quality, reduced equipment size, high-volumetric productivity, streamlined process flow, low-process cycle times, and reduced capital and operating cost. This technology, however, poses challenges, which need to be addressed before routine implementation is considered. This paper, which is based on the available literature and input from a large number of reviewers, is intended to provide a consensus of the opportunities, technical needs, and strategic directions for continuous bioprocessing. The discussion is supported by several examples illustrating various architectures of continuous bioprocessing systems. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  18. Application of agent-based system for bioprocess description and process improvement.

    Science.gov (United States)

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which

  19. Cell bioprocessing in space - Applications of analytical cytology

    Science.gov (United States)

    Todd, P.; Hymer, W. C.; Goolsby, C. L.; Hatfield, J. M.; Morrison, D. R.

    1988-01-01

    Cell bioprocessing experiments in space are reviewed and the development of on-board cell analytical cytology techniques that can serve such experiments is discussed. Methods and results of experiments involving the cultivation and separation of eukaryotic cells in space are presented. It is suggested that an advanced cytometer should be developed for the quantitative analysis of large numbers of specimens of suspended eukaryotic cells and bioparticles in experiments on the Space Station.

  20. Bioprocessing of sewage sludge for safe recycling on agricultural land - BIOWASTE

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, Jens Ejbye; Angelidaki, Irini; Christensen, Nina; Batstone, Damien John; Lyberatos, Gerasimos; Stamatelatou, Katerina; Lichtfouse, Eric; Elbisser, Brigitte; Rogers, Kayne; Sappin-Didier, Valerie; Dernaix, Laurence; Caria, Giovanni; Metzger, Laure; Borghi, Veronica; Montcada, Eloi

    2003-07-01

    Disposal and handling of sewage sludge are increasing problems in Europe due to the increasing quantities of sewage sludge produced. A large amount of the sewage sludge contains small fractions of toxic chemicals, which results in problems with safe use of the sewage sludge on agricultural land. From an ecological and economical point of view, it would be essential to establish methodologies which could allow sewage sludge to be reused as fertilizer on agricultural land. Energy-efficient biotreatment processes for organic waste are, therefore, of crucial importance. BIOWASTE will offer an integrated study of this area. The typical composition of sewage sludge will be characterized with regard to key contaminating compounds. The following compounds will be in focus: emulsifying agents such as nonylphenols and nonylphenol ethoxylates (NPE), polycyclic aromatic hydrocarbons (PAHs) derived from incomplete combustion processes, phthalates, which are used as additives in plastics, and surfactants such as linear alkyl benzene sulfonate (LAS). Analytical techniques suitable for qualitative and quantitative evaluation of the chemical species involved in the processes under investigation will be determined. Bacteria that are able to degrade selected contaminating compounds under anaerobic and aerobic conditions will be isolated, characterized and bioaugmented for decontamination of sewage sludge through bioprocessing. Aerobic, anaerobic and combined aerobic/anaerobic bioprocessing of sewage sludge will be applied. A mathematical model will be developed to describe the biodegradation processes of the contaminating compounds after establishing the kinetic parameters for degradation of contaminating compounds. The bioprocessed sewage sludge will be used in eco- and plant-toxicology tests to evaluate the impact of the xenobiotics on the environment. Methodologies will be developed and applied to assess the cleanliness of the bioprocessing as a safe method for waste

  1. Scale-up bioprocess development for production of the antibiotic valinomycin in Escherichia coli based on consistent fed-batch cultivations.

    Science.gov (United States)

    Li, Jian; Jaitzig, Jennifer; Lu, Ping; Süssmuth, Roderich D; Neubauer, Peter

    2015-06-12

    Heterologous production of natural products in Escherichia coli has emerged as an attractive strategy to obtain molecules of interest. Although technically feasible, most of them are still constrained to laboratory-scale production. Therefore, it is necessary to develop reasonable scale-up strategies for bioprocesses aiming at the overproduction of targeted natural products under industrial-scale conditions. To this end, we used the production of the antibiotic valinomycin in E. coli as a model system for scalable bioprocess development based on consistent fed-batch cultivations. In this work, the glucose-limited fed-batch strategy based on pure mineral salt medium was used throughout all scales for valinomycin production. The optimal glucose feed rate was initially detected by the use of a biocatalytically controlled glucose release (EnBase® technology) in parallel cultivations in 24-well plates with continuous monitoring of pH and dissolved oxygen. These results were confirmed in shake flasks, where the accumulation of valinomycin was highest when the specific growth rate decreased below 0.1 h(-1). This correlation was also observed for high cell density fed-batch cultivations in a lab-scale bioreactor. The bioreactor fermentation produced valinomycin with titers of more than 2 mg L(-1) based on the feeding of a concentrated glucose solution. Valinomycin production was not affected by oscillating conditions (i.e. glucose and oxygen) in a scale-down two-compartment reactor, which could mimic similar situations in industrial bioreactors, suggesting that the process is very robust and that scaling to a larger industrial scale is a realistic scenario. Valinomycin production was scaled up from mL volumes to 10 L with consistent use of the fed-batch technology. This work presents a robust and reliable approach for scalable bioprocess development and represents an example for the consistent development of a process for a heterologously expressed natural
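    The glucose-limited fed-batch strategy described above is commonly implemented with an exponential feed law that holds the specific growth rate at a setpoint. The sketch below evaluates the standard relation F(t) = (µ_set/Y_XS + m_s) · X0·V0 · exp(µ_set·t) / S_F; all parameter values are illustrative assumptions, not those of the study.

```python
import math

def exp_feed_rate(t_h, mu_set=0.1, Yxs=0.5, ms=0.02, X0=10.0, V0=5.0, Sf=400.0):
    """Glucose feed rate (L/h) holding the specific growth rate at mu_set.

    mu_set : target specific growth rate (1/h)
    Yxs    : biomass yield on glucose (g/g)
    ms     : maintenance coefficient (g glucose / g biomass / h)
    X0, V0 : biomass concentration (g/L) and volume (L) at feed start
    Sf     : glucose concentration in the feed (g/L)
    All numeric defaults are illustrative assumptions.
    """
    return (mu_set / Yxs + ms) * X0 * V0 * math.exp(mu_set * t_h) / Sf

# The feed rate doubles every ln(2)/mu_set ≈ 6.9 h at mu_set = 0.1 1/h.
r0 = exp_feed_rate(0.0)
r_double = exp_feed_rate(math.log(2) / 0.1)
print(round(r_double / r0, 3))
```

    Lowering `mu_set` below 0.1 1/h, where the study reports the highest valinomycin accumulation, simply flattens this exponential profile.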

  2. Advanced optimisation - coal fired power plant operations

    Energy Technology Data Exchange (ETDEWEB)

    Turney, D.M.; Mayes, I. [E.ON UK, Nottingham (United Kingdom)

    2005-03-01

    The purpose of this unit optimisation project is to develop an integrated approach to unit optimisation and an overall optimiser that is able to resolve any conflicts between the individual optimisers. The individual optimisers considered during this project are: the on-line thermal efficiency package, the GNOCIS boiler optimiser, the GNOCIS steam-side optimiser, ESP optimisation, and the intelligent sootblowing system. 6 refs., 7 figs., 3 tabs.

  3. Quantitative feature extraction from the Chinese hamster ovary bioprocess bibliome using a novel meta-analysis workflow

    DEFF Research Database (Denmark)

    Golabgir, Aydin; Gutierrez, Jahir M.; Hefzi, Hooman

    2016-01-01

    compilation covers all published CHO cell studies from 1995 to 2015, and each study is classified by the types of phenotypic and bioprocess data contained therein. Using data from selected studies, we also present a quantitative meta-analysis of bioprocess characteristics across diverse culture conditions...... practices can limit research re-use in this field, we show that the statistical analysis of diverse legacy bioprocess data can provide insight into bioprocessing capabilities of CHO cell lines used in industry. The CHO bibliome can be accessed at http://lewislab.ucsd.edu/cho-bibliome/....

  4. Beam position optimisation for IMRT

    International Nuclear Information System (INIS)

    Holloway, L.; Hoban, P.

    2001-01-01

    Full text: The introduction of IMRT has not generally resulted in the use of optimised beam positions, because finding the global solution requires a time-consuming stochastic optimisation method. Although a deterministic method may not reach the global minimum, it should still achieve a superior dose distribution compared with no optimisation. This study aimed to develop and test such a method. The beam optimisation method developed relies on an iterative process to reduce a large initial number of beams to the desired number. Beams are removed in a 'weeding-out' process based on the total fluence which each beam delivers. The process is gradual, with only three beams removed each time (following a small number of iterations), ensuring that the reduction in beams does not dramatically affect the fluence maps of those remaining. A comparison was made between the dose distributions achieved when the beam positions were optimised in this fashion and when the beam positions were evenly distributed. The method has been shown to work effectively and efficiently. The Figure shows a comparison of dose distributions with optimised and non-optimised beam positions for 5 beams; there is a clear improvement in the dose distribution delivered to the tumour and a reduction in the dose to the critical structure with beam position optimisation. A method for beam position optimisation for use in IMRT has been developed. Although not guaranteed to reach the global minimum, this method still achieves a dramatic improvement compared with no beam position optimisation, and does so efficiently. Copyright (2001) Australasian College of Physical Scientists and Engineers in Medicine
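    The 'weeding-out' step described above can be sketched as follows. The beam angles and fluence totals are toy values, and a real IMRT optimiser would re-optimise the fluence maps between removal rounds; this sketch only shows the selection logic.

```python
def weed_beams(fluence_totals, target_count, per_round=3):
    """Iteratively drop the beams delivering the least total fluence.

    fluence_totals : dict mapping beam angle -> total delivered fluence.
    In the study, three beams are removed per round after a few fluence
    iterations; here the fluence values are fixed toy numbers.
    """
    beams = dict(fluence_totals)
    while len(beams) > target_count:
        n_drop = min(per_round, len(beams) - target_count)
        for angle in sorted(beams, key=beams.get)[:n_drop]:
            del beams[angle]
        # (re-optimisation of the remaining fluence maps would happen here)
    return sorted(beams)

# 18 evenly spaced candidate beams with made-up fluence totals:
angles = {a: f for a, f in zip(range(0, 360, 20),
                               [5, 9, 2, 8, 1, 7, 3, 9, 4, 6, 2, 8, 5, 7, 1, 6, 3, 9])}
print(weed_beams(angles, target_count=5))
```

    Removing only a few beams per round keeps each re-optimisation close to the previous solution, which is the gradualness the abstract emphasises.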

  5. Potentials and limitations of miniaturized calorimeters for bioprocess monitoring.

    Science.gov (United States)

    Maskow, Thomas; Schubert, Torsten; Wolf, Antje; Buchholz, Friederike; Regestein, Lars; Buechs, Jochen; Mertens, Florian; Harms, Hauke; Lerchner, Johannes

    2011-10-01

    In theory, heat production rates are very well suited for analysing and controlling bioprocesses on different scales, from a few nanolitres up to many cubic metres. Any bioconversion is accompanied by a production (exothermic) or consumption (endothermic) of heat. The heat is tightly connected with the stoichiometry of the bioprocess via the law of Hess, and its rate is connected to the kinetics of the process. Heat signals provide real-time information on bioprocesses. The combination of heat measurements with respirometry is theoretically suited for quantifying the coupling between catabolic and anabolic reactions. Heat measurements also have practical advantages. Unlike most other biochemical sensors, thermal transducers can be mounted in a protected way that prevents fouling, thereby minimizing response drifts. Finally, calorimetry works in optically opaque solutions and does not require labelling of reactants. It is surprising that, despite all these advantages, calorimetry has rarely been applied to monitor and control bioprocesses with intact cells in the laboratory, industrial bioreactors or ecosystems. This review article analyses the reasons for this omission, discusses the additional information calorimetry can provide in comparison with respirometry, and presents miniaturization as a potential way to overcome some inherent weaknesses of conventional calorimetry. It is discussed for which sample types and scientific questions miniaturized calorimeters can be advantageously applied. A few examples from different fields of microbiological and biotechnological research illustrate the potentials and limitations of chip calorimetry. Finally, the future of chip calorimetry is addressed in an outlook.
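    One concrete link between calorimetry and respirometry mentioned above is the oxycaloric equivalent (Thornton's rule: roughly 460 kJ of heat released per mol O2 consumed in aerobic catabolism). A hedged sketch of that cross-check, with made-up culture numbers:

```python
def heat_rate_from_our(our_mol_per_l_h, volume_l, oxycaloric_kj_per_mol=460.0):
    """Estimate the aerobic heat production rate (W) from the oxygen uptake rate.

    Uses the oxycaloric equivalent (~460 kJ heat per mol O2 consumed,
    Thornton's rule); the exact value varies somewhat with the substrate,
    and deviations from it flag anaerobic or uncoupled metabolism.
    """
    kj_per_h = our_mol_per_l_h * volume_l * oxycaloric_kj_per_mol
    return kj_per_h * 1000.0 / 3600.0  # kJ/h -> J/s = W

# A 10 L culture taking up 0.05 mol O2 per litre per hour:
print(round(heat_rate_from_our(0.05, 10.0), 1))
```

    Comparing this respirometry-derived estimate with the directly measured heat rate is the kind of catabolic/anabolic coupling check the abstract refers to.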

  6. Development of a new bioprocess scheme using frozen seed train intermediates to initiate CHO cell culture manufacturing campaigns.

    Science.gov (United States)

    Seth, Gargi; Hamilton, Robert W; Stapp, Thomas R; Zheng, Lisa; Meier, Angela; Petty, Krista; Leung, Stephenie; Chary, Srikanth

    2013-05-01

    Agility to schedule and execute cell culture manufacturing campaigns quickly in a multi-product facility will play a key role in meeting the growing demand for therapeutic proteins. In an effort to shorten campaign timelines and maximize plant flexibility and resource utilization, we investigated the initiation of cell culture manufacturing campaigns using CHO cells cryopreserved in large-volume bags in place of the seed train process flows that are conventionally used in cell culture manufacturing. This approach, termed FASTEC (Frozen Accelerated Seed Train for Execution of a Campaign), involves cultivating cells to high density in a perfusion bioreactor and cryopreserving cells in multiple disposable bags. Each run of a manufacturing campaign would then come from a thaw of one or more of these cryopreserved bags. This article reviews the development and optimization of individual steps of the FASTEC bioprocess scheme: scaling up cells to greater than 70 × 10(6) cells/mL and freezing in bags with an optimized controlled-rate freezing protocol and a customized rack configuration. Flow cytometry analysis was also employed to understand the recovery of CHO cells following cryopreservation. Extensive development data were gathered to ensure that the quantity and quality of the drug manufactured using the FASTEC bioprocess scheme were acceptable compared to the conventional seed train process flow. Offering comparable manufacturing options adds flexibility to the cell culture manufacturing network. Copyright © 2012 Wiley Periodicals, Inc.
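    The optimized controlled-rate freezing protocol itself is not given in the abstract; a generic piecewise-linear ramp of the kind such protocols use can be sketched as below. All hold times, ramp rates and temperatures here are hypothetical illustrations, not the protocol developed in the study.

```python
def chamber_setpoint(t_min, t0_c=4.0, segments=((0.0, 5.0), (-1.0, 44.0), (-10.0, 4.0))):
    """Chamber temperature setpoint (degC) at time t_min for a piecewise-linear
    controlled-rate freezing ramp.

    segments: (rate degC/min, duration min) tuples. This example holds at
    4 degC for 5 min, cools at -1 degC/min to -40 degC, then at -10 degC/min
    to -80 degC -- generic illustrative values only.
    """
    temp, t = t0_c, t_min
    for rate, dur in segments:
        if t <= dur:
            return temp + rate * t
        temp += rate * dur
        t -= dur
    return temp  # hold at the final temperature

print(chamber_setpoint(0), chamber_setpoint(30), chamber_setpoint(60))
```

    Real controlled-rate freezers add a nucleation-compensation step around the ice-formation temperature; that refinement is omitted from this sketch.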

  7. Simplex-based optimization of numerical and categorical inputs in early bioprocess development: Case studies in HT chromatography.

    Science.gov (United States)

    Konstantinidis, Spyridon; Titchener-Hooker, Nigel; Velayudhan, Ajoy

    2017-08-01

    Bioprocess development studies often involve the investigation of numerical and categorical inputs via the adoption of Design of Experiments (DoE) techniques. An attractive alternative is the deployment of a grid compatible Simplex variant which has been shown to yield optima rapidly and consistently. In this work, the method is combined with dummy variables and it is deployed in three case studies wherein spaces are comprised of both categorical and numerical inputs, a situation intractable by traditional Simplex methods. The first study employs in silico data and lays out the dummy variable methodology. The latter two employ experimental data from chromatography based studies performed with the filter-plate and miniature column High Throughput (HT) techniques. The solute of interest in the former case study was a monoclonal antibody whereas the latter dealt with the separation of a binary system of model proteins. The implemented approach prevented the stranding of the Simplex method at local optima, due to the arbitrary handling of the categorical inputs, and allowed for the concurrent optimization of numerical and categorical, multilevel and/or dichotomous, inputs. The deployment of the Simplex method, combined with dummy variables, was therefore entirely successful in identifying and characterizing global optima in all three case studies. The Simplex-based method was further shown to be of equivalent efficiency to a DoE-based approach, represented here by D-Optimal designs. Such an approach failed, however, to both capture trends and identify optima, and led to poor operating conditions. It is suggested that the Simplex-variant is suited to development activities involving numerical and categorical inputs in early bioprocess development. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
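    The grid-compatible Simplex variant with dummy variables is specific to the study above; as a simplified stand-in, the sketch below uses a greedy grid (pattern) search over one numerical input and one categorical input handled as an index, on an entirely made-up chromatography response surface. It illustrates the mixed numerical/categorical search idea, not the authors' algorithm.

```python
from itertools import product

# Toy chromatography objective: yield as a function of salt concentration
# (numerical, on a grid) and resin type (categorical). Made-up response.
RESINS = ["resinA", "resinB", "resinC"]

def toy_yield(salt_mM, resin):
    peak = {"resinA": 150.0, "resinB": 250.0, "resinC": 350.0}[resin]
    bonus = {"resinA": 0.0, "resinB": 8.0, "resinC": 3.0}[resin]
    return 90.0 - ((salt_mM - peak) / 25.0) ** 2 + bonus

def pattern_search(salt0, resin_idx0, step=50, min_step=25):
    """Greedy grid search over (salt, resin index) -- a simplified stand-in
    for a grid-compatible Simplex with dummy-coded categorical inputs."""
    best = (salt0, resin_idx0)
    best_val = toy_yield(salt0, RESINS[resin_idx0])
    while True:
        improved = False
        for ds, di in product((-step, 0, step), (-1, 0, 1)):
            s, i = best[0] + ds, best[1] + di
            if 0 <= i < len(RESINS) and 50 <= s <= 500:
                v = toy_yield(s, RESINS[i])
                if v > best_val:
                    best, best_val, improved = (s, i), v, True
        if not improved:
            if step <= min_step:
                return best[0], RESINS[best[1]], best_val
            step //= 2  # refine the numerical grid and continue

salt, resin, val = pattern_search(salt0=100, resin_idx0=0)
print(salt, resin)
```

    Because the categorical index moves alongside the numerical input at every step, the search cannot strand itself on an arbitrarily fixed resin, which mirrors the motivation for the dummy-variable treatment in the study.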

  8. ADAPTIVE HIGH GAIN OBSERVER EXTENSION AND ITS APPLICATION TO BIOPROCESS MONITORING

    Czech Academy of Sciences Publication Activity Database

    Čelikovský, Sergej; Torres-Munoz, J. A.; Dominguez-Bocanegra, A. R.

    2018-01-01

    Roč. 54, č. 1 (2018), s. 155-174 ISSN 0023-5954 Institutional support: RVO:67985556 Keywords: Adaptive observers * nonlinear systems * bioprocess Subject RIV: BC - Control Systems Theory OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 0.379, year: 2016 http://doi.org/10.14736/kyb-2018-1-0155

  9. Design and analysis of heat recovery system in bioprocess plant

    International Nuclear Information System (INIS)

    Anastasovski, Aleksandar; Rašković, Predrag; Guzović, Zvonimir

    2015-01-01

    Highlights: • Heat integration of a bioprocess plant is studied. • Bioprocess plant produces yeast and ethyl-alcohol. • The design of a heat recovery system is performed by batch pinch analysis. • Direct and indirect heat integration approaches are used in process design. • The heat recovery system without a heat storage opportunity is more profitable. - Abstract: The paper deals with the heat integration of a bioprocess plant which produces yeast and ethyl-alcohol. The referent plant is considered to be a multiproduct batch plant which operates in a semi-continuous mode. The design of a heat recovery system is performed by batch pinch analysis and by the use of the Time slice model. The results obtained by direct and indirect heat integration approaches are presented in the form of cost-optimal heat exchanger networks and evaluated by different thermodynamic and economic indicators. They signify that the heat recovery system without a heat storage opportunity can be considered to be a more profitable solution for the energy efficiency increase in a plant

  10. Bioprocessing of wheat bran improves in vitro bioaccessibility and colonic metabolism of phenolic compounds

    NARCIS (Netherlands)

    Mateo Anson, N.; Selinheimo, E.; Havenaar, R.; Aura, A.-M.; Mattila, I.; Lehtinen, P.; Bast, A.; Poutanen, K.; Haenen, G.R.M.M.

    2009-01-01

    Ferulic acid (FA) is the most abundant phenolic compound in wheat grain, mainly located in the bran. However, its bioaccessibility from the bran matrix is extremely low. Different bioprocessing techniques involving fermentation or enzymatic and fermentation treatments of wheat bran were developed

  11. Capacity Planning for Batch and Perfusion Bioprocesses Across Multiple Biopharmaceutical Facilities

    OpenAIRE

    Siganporia, Cyrus C; Ghosh, Soumitra; Daszkowski, Thomas; Papageorgiou, Lazaros G; Farid, Suzanne S

    2014-01-01

    Production planning for biopharmaceutical portfolios becomes more complex when products switch between fed-batch and continuous perfusion culture processes. This article describes the development of a discrete-time mixed integer linear programming (MILP) model to optimize capacity plans for multiple biopharmaceutical products, with either batch or perfusion bioprocesses, across multiple facilities to meet quarterly demands. The model comprised specific features to account for products with fe...
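The discrete-time capacity-planning idea can be sketched in miniature. In the toy below, exhaustive search stands in for a MILP solver, and every number (products, demands, capacities, costs) is invented for illustration; the paper's actual model covers far more features (campaign changeovers, perfusion run lengths, inventories).

```python
# Minimal capacity-planning sketch (hypothetical data; exhaustive search in
# place of a MILP solver): assign two products, one batch and one perfusion,
# to two facilities over two quarters, meeting demand at minimum cost.
import itertools

products = ["mAb_batch", "enzyme_perfusion"]        # hypothetical products
facilities = ["site_1", "site_2"]
quarters = [0, 1]
demand = {"mAb_batch": [2, 1], "enzyme_perfusion": [1, 2]}   # batches/quarter
capacity = 3                                         # batches per site/quarter
cost = {("mAb_batch", "site_1"): 1.0, ("mAb_batch", "site_2"): 1.5,
        ("enzyme_perfusion", "site_1"): 2.0, ("enzyme_perfusion", "site_2"): 1.2}

# Decision: how many batches of each product each site runs in each quarter.
slots = [(p, f, q) for p in products for f in facilities for q in quarters]
best_cost, best_plan = float("inf"), None
for levels in itertools.product(range(capacity + 1), repeat=len(slots)):
    plan = dict(zip(slots, levels))
    # feasibility: meet quarterly demand and respect site capacity
    ok = all(sum(plan[(p, f, q)] for f in facilities) >= demand[p][q]
             for p in products for q in quarters)
    ok &= all(sum(plan[(p, f, q)] for p in products) <= capacity
              for f in facilities for q in quarters)
    if ok:
        total = sum(n * cost[(p, f)] for (p, f, q), n in plan.items())
        if total < best_cost:
            best_cost, best_plan = total, plan

print(best_cost)
```

In this toy instance the optimum routes each product to its cheaper site (total cost 6.6); a real MILP formulation expresses the same feasibility constraints and objective to a solver instead of enumerating plans.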

  12. Multi-Optimisation Consensus Clustering

    Science.gov (United States)

    Li, Jian; Swift, Stephen; Liu, Xiaohui

    Ensemble Clustering has been developed to provide an alternative way of obtaining more stable and accurate clustering results. It aims to avoid the biases of individual clustering algorithms. However, it is still a challenge to develop an efficient and robust method for Ensemble Clustering. Based on an existing ensemble clustering method, Consensus Clustering (CC), this paper introduces an advanced Consensus Clustering algorithm called Multi-Optimisation Consensus Clustering (MOCC), which utilises an optimised Agreement Separation criterion and a Multi-Optimisation framework to improve the performance of CC. Fifteen different data sets are used for evaluating the performance of MOCC. The results reveal that MOCC can generate more accurate clustering results than the original CC algorithm.
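The consensus step common to CC-style methods can be sketched with a co-association matrix: items that many base clusterings place together end up in the same consensus cluster. The base labelings below are made up, and majority-vote single-linkage is a simplified stand-in for the optimised agreement-separation criterion of MOCC.

```python
# Sketch of the co-association idea underlying consensus clustering
# (simplified stand-in for CC/MOCC, with invented base labelings).

items = ["a", "b", "c", "d", "e", "f"]
# Three hypothetical base clusterings (one label per item).
base = [
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 1, 1, 1],
]

n = len(items)
# Co-association: fraction of base clusterings placing i and j together.
co = [[sum(lab[i] == lab[j] for lab in base) / len(base)
       for j in range(n)] for i in range(n)]

# Consensus by single-link over pairs agreeing in a majority of clusterings.
parent = list(range(n))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i
for i in range(n):
    for j in range(i + 1, n):
        if co[i][j] > 0.5:
            parent[find(i)] = find(j)

clusters = {}
for i in range(n):
    clusters.setdefault(find(i), []).append(items[i])
print(sorted(sorted(c) for c in clusters.values()))
```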

  13. Development of microorganisms for cellulose-biofuel consolidated bioprocessings: metabolic engineers’ tricks

    Directory of Open Access Journals (Sweden)

    Roberto Mazzoli

    2012-10-01

    Full Text Available Cellulose waste biomass is the most abundant and attractive substrate for "biorefinery strategies" that are aimed to produce high-value products (e.g. solvents, fuels, building blocks by economically and environmentally sustainable fermentation processes. However, cellulose is highly recalcitrant to biodegradation and its conversion by biotechnological strategies currently requires economically inefficient multistep industrial processes. The need for dedicated cellulase production continues to be a major constraint to cost-effective processing of cellulosic biomass.Research efforts have been aimed at developing recombinant microorganisms with suitable characteristics for single step biomass fermentation (consolidated bioprocessing, CBP. Two paradigms have been applied for such, so far unsuccessful, attempts: a “native cellulolytic strategies”, aimed at conferring high-value product properties to natural cellulolytic microorganisms; b “recombinant cellulolytic strategies”, aimed to confer cellulolytic ability to microorganisms exhibiting high product yields and titers.By starting from the description of natural enzyme systems for plant biomass degradation and natural metabolic pathways for some of the most valuable product (i.e. butanol, ethanol, and hydrogen biosynthesis, this review describes state-of-the-art bottlenecks and solutions for the development of recombinant microbial strains for cellulosic biofuel CBP by metabolic engineering. Complexed cellulases (i.e. cellulosomes benefit from stronger proximity effects and show enhanced synergy on insoluble substrates (i.e. crystalline cellulose with respect to free enzymes. For this reason, special attention was held on strategies involving cellulosome/designer cellulosome-bearing recombinant microorganisms.

  14. Application of Hydrodynamic Cavitation for Food and Bioprocessing

    Science.gov (United States)

    Gogate, Parag R.

    Hydrodynamic cavitation can be simply generated by the alterations in the flow field in high speed/high pressure devices and also by passage of the liquid through a constriction such as orifice plate, venturi, or throttling valve. Hydrodynamic cavitation results in the formation of local hot spots, release of highly reactive free radicals, and enhanced mass transfer rates due to turbulence generated as a result of liquid circulation currents. These conditions can be suitably applied for intensification of different bioprocessing applications in an energy-efficient manner as compared to conventionally used ultrasound-based reactors. The current chapter aims at highlighting different aspects related to hydrodynamic cavitation, including the theoretical aspects for optimization of operating parameters, reactor designs, and overview of applications relevant to food and bioprocessing. Some case studies highlighting the comparison of hydrodynamic cavitation and acoustic cavitation reactors will also be discussed.
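Cavitating flow through such a constriction is commonly characterised by the dimensionless cavitation number. As a quick illustration (with standard water properties and a hypothetical orifice throat velocity, not figures from the chapter):

```python
# Cavitation number sigma = (p_recovered - p_vapour) / (0.5 * rho * v^2);
# values near or below 1 indicate likely cavitation inception. The numbers
# below are assumptions: water at 25 C, atmospheric recovery pressure,
# and a throat velocity of 15 m/s.
rho = 998.0          # kg/m^3, water density
p2 = 101325.0        # Pa, fully recovered downstream pressure
p_v = 3170.0         # Pa, vapour pressure of water at 25 C
v = 15.0             # m/s, velocity at the orifice throat

sigma = (p2 - p_v) / (0.5 * rho * v * v)
print(round(sigma, 3))
```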

  15. Development and application of an excitation ratiometric optical pH sensor for bioprocess monitoring.

    Science.gov (United States)

    Badugu, Ramachandram; Kostov, Yordan; Rao, Govind; Tolosa, Leah

    2008-01-01

    The development of a fluorescent excitation ratiometric pH sensor (AHQ-PEG) using a novel allylhydroxyquinolinium (AHQ) derivative copolymerized with polyethylene glycol dimethacrylate (PEG) is described. The AHQ-PEG sensor film is shown to be suitable for real-time, noninvasive, continuous, online pH monitoring of bioprocesses. Optical ratiometric measurements are generally more reliable, robust, inexpensive, and insensitive to experimental errors such as fluctuations in the source intensity and fluorophore photobleaching. The sensor AHQ-PEG in deionized water was shown to exhibit two excitation maxima at 375 and 425 nm with a single emission peak at 520 nm. Excitation spectra of AHQ-PEG show a decrease in emission at the 360 nm excitation and an increase at the 420 nm excitation with increasing pH. Accordingly, the ratio of emission at 420:360 nm excitation showed a maximum change between pH 5 and 8 with an apparent pK(a) of 6.40. The low pK(a) value is suitable for monitoring the fermentation of most industrially important microorganisms. Additionally, the AHQ-PEG sensor was shown to have minimal sensitivity to ionic strength and temperature. Because AHQ is covalently attached to PEG, the film shows no probe leaching and is sterilizable by steam and alcohol. It shows rapid (approximately 2 min) and reversible response to pH over many cycles without any photobleaching. Subsequently, the AHQ-PEG sensor film was tested for its suitability in monitoring the pH of S. cerevisiae (yeast) fermentation. The observed pH using AHQ-PEG film is in agreement with a conventional glass pH electrode. However, unlike the glass electrode, the present sensor is easily adaptable to noninvasive monitoring of sterilized, closed bioprocess environments without the awkward wire connections that electrodes require. In addition, the AHQ-PEG sensor is easily miniaturized to fit in microwell plates and microbioreactors for high-throughput cell culture applications.
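Ratiometric probes of this kind are typically calibrated with a sigmoidal, Henderson-Hasselbalch-type curve. A hedged sketch of converting a measured excitation ratio back to pH, using the apparent pKa of 6.40 reported above but invented plateau-ratio constants:

```python
import math

# Henderson-Hasselbalch-type calibration for a ratiometric probe:
#   pH = pKa + log10((R - R_min) / (R_max - R))
# R_min/R_max are the ratios at the acidic/basic plateaus (invented values);
# pKa = 6.40 follows the apparent pKa reported for the AHQ-PEG film.
PKA, R_MIN, R_MAX = 6.40, 0.20, 3.00

def ratio_to_ph(r):
    return PKA + math.log10((r - R_MIN) / (R_MAX - r))

mid = (R_MIN + R_MAX) / 2          # at the midpoint ratio, pH == pKa
print(round(ratio_to_ph(mid), 2))
```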

  16. A fast and systematic procedure to develop dynamic models of bioprocesses: application to microalgae cultures

    Directory of Open Access Journals (Sweden)

    J. Mailier

    2010-09-01

    Full Text Available The purpose of this paper is to report on the development of a procedure for inferring black-box, yet biologically interpretable, dynamic models of bioprocesses based on sets of measurements of a few external components (biomass, substrates, and products of interest). The procedure has three main steps: (a) the determination of the number of macroscopic biological reactions linking the measured components; (b) the estimation of a first reaction scheme, which has interesting mathematical properties but might lack a biological interpretation; and (c) the "projection" (or transformation) of this reaction scheme onto a biologically consistent scheme. The advantage of the method is that it allows the fast prototyping of models for the culture of microorganisms that are not well documented. The good performance of the third step of the method is demonstrated by application to an example of microalgal culture.
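Step (a) is commonly done by checking the numerical rank of the matrix of measured concentration increments: with r independent macroscopic reactions, that matrix has rank r. A toy sketch in pure Python with synthetic stoichiometries (Gaussian elimination stands in for the noise-aware SVD one would use on real data):

```python
# Four measured components whose increments are generated by TWO hidden
# macroscopic reactions (synthetic stoichiometries); the rank of the
# increment matrix recovers the number of reactions.

# stoichiometric vectors of the two hidden reactions (biomass, S1, S2, P)
r1 = [1.0, -2.0, 0.0, 0.5]
r2 = [0.0, -1.0, -1.5, 2.0]
rates = [(0.3, 0.1), (0.2, 0.4), (0.1, 0.5), (0.4, 0.2), (0.25, 0.25)]

# each row = concentration increments over one sampling interval
data = [[a * x + b * y for x, y in zip(r1, r2)] for a, b in rates]

def matrix_rank(m, tol=1e-9):
    """Rank via Gaussian elimination with partial pivoting."""
    m = [row[:] for row in m]
    rank, rows, cols = 0, len(m), len(m[0])
    for c in range(cols):
        pivot = max(range(rank, rows), key=lambda i: abs(m[i][c]), default=None)
        if pivot is None or abs(m[pivot][c]) < tol:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for i in range(rank + 1, rows):
            f = m[i][c] / m[rank][c]
            for j in range(c, cols):
                m[i][j] -= f * m[rank][j]
        rank += 1
    return rank

print(matrix_rank(data))   # estimated number of macroscopic reactions
```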

  17. Upgrading protein products using bioprocessing on agricultural crops

    DEFF Research Database (Denmark)

    Sulewska, Anna Maria; Sørensen, Jens Christian; Markedal, Keld Ejdrup

    Due to increasing world population, higher average income, and changes in food preferences, there is a growing demand for proteins, especially novel plant-based protein sources that can substitute animal proteins and supplement currently used soya proteins. Increased customer awareness of sustainability leads to a demand for plant protein products made from locally grown crops. Novel bioprocessing methods have been developed to generate protein products which are nutritious, readily available and do not generate hazardous waste. The processing focus has therefore been on developing protein-enriched products with minimized content of antinutritional compounds. For every crop it is a challenge to obtain protein fractions with sufficient added value to make processing economically feasible. In this work we present the characterization of protein products developed in pilot scale using the novel …

  18. Efficient and reproducible mammalian cell bioprocesses without probes and controllers?

    Science.gov (United States)

    Tissot, Stéphanie; Oberbek, Agata; Reclari, Martino; Dreyer, Matthieu; Hacker, David L; Baldi, Lucia; Farhat, Mohamed; Wurm, Florian M

    2011-07-01

    Bioprocesses for recombinant protein production with mammalian cells are typically controlled for several physicochemical parameters including the pH and dissolved oxygen concentration (DO) of the culture medium. Here we studied whether these controls are necessary for efficient and reproducible bioprocesses in an orbitally shaken bioreactor (OSR). Mixing, gas transfer, and volumetric power consumption (P(V)) were determined in both a 5-L OSR and a 3-L stirred-tank bioreactor (STR). The two cultivation systems had a similar mixing intensity, but the STR had a lower volumetric mass transfer coefficient of oxygen (k(L)a) and a higher P(V) than the OSR. Recombinant CHO cell lines expressing either tumor necrosis factor receptor as an Fc fusion protein (TNFR:Fc) or an anti-RhesusD monoclonal antibody were cultivated in the two systems. The 5-L OSR was operated in an incubator shaker with 5% CO(2) in the gas environment but without pH and DO control whereas the STR was operated with or without pH and DO control. Higher cell densities and recombinant protein titers were obtained in the OSR as compared to both the controlled and the non-controlled STRs. To test the reproducibility of a bioprocess in a non-controlled OSR, the two CHO cell lines were each cultivated in parallel in six 5-L OSRs. Similar cell densities, cell viabilities, and recombinant protein titers along with similar pH and DO profiles were achieved in each group of replicates. Our study demonstrated that bioprocesses can be performed in OSRs without pH or DO control in a highly reproducible manner, at least at the scale of operation studied here. Copyright © 2011 Elsevier B.V. All rights reserved.
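The kLa comparison above maps directly onto the oxygen transfer rate, the quantity a higher kLa buys. A quick hedged calculation with assumed round numbers (the paper's actual kLa and DO figures are not reproduced here):

```python
# Oxygen transfer rate OTR = kLa * (C_star - C_L). All numbers below are
# illustrative assumptions, not data from the study.
k_l_a = 10.0        # 1/h, volumetric mass transfer coefficient of oxygen
c_star = 7.0        # mg/L, dissolved-oxygen saturation concentration
c_liquid = 2.0      # mg/L, actual dissolved-oxygen concentration

otr = k_l_a * (c_star - c_liquid)    # mg O2 per litre per hour
print(otr)
```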

  19. A framework for model-based optimization of bioprocesses under uncertainty: Identifying critical parameters and operating variables

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development and application of a systematic model-based framework for bioprocess optimization, evaluated on a cellulosic ethanol production case study. The implementation of the framework involves the use of dynamic simulations, sophisticated uncertainty analysis (Monte...
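The uncertainty-analysis step of such a framework can be sketched in miniature: sample the uncertain kinetic parameters, simulate, and read off output percentiles. The toy below uses a Monod batch model with invented parameter ranges, not the cellulosic-ethanol model of the study:

```python
import random

# Toy Monte Carlo uncertainty propagation: sample uncertain Monod
# parameters, integrate a batch fermentation with Euler steps, and report
# the 5th/95th percentiles of final biomass. All numbers are invented.
random.seed(1)

def simulate(mu_max, ks, x0=0.1, s0=10.0, yxs=0.5, dt=0.05, t_end=12.0):
    x, s = x0, s0
    for _ in range(int(t_end / dt)):
        mu = mu_max * s / (ks + s)          # Monod specific growth rate
        dx = mu * x * dt
        x, s = x + dx, max(s - dx / yxs, 0.0)
    return x

finals = sorted(simulate(random.uniform(0.3, 0.5), random.uniform(0.5, 1.5))
                for _ in range(500))
p5, p95 = finals[int(0.05 * 500)], finals[int(0.95 * 500)]
print(round(p5, 2), round(p95, 2))
```

The spread between the percentiles is what identifies which parameters are "critical": wide output bands trace back, via sensitivity measures, to the inputs most worth measuring or controlling.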

  20. Developing a mesophilic co-culture for direct conversion of cellulose to butanol in consolidated bioprocess.

    Science.gov (United States)

    Wang, Zhenyu; Cao, Guangli; Zheng, Ju; Fu, Defeng; Song, Jinzhu; Zhang, Junzheng; Zhao, Lei; Yang, Qian

    2015-01-01

    Consolidated bioprocessing (CBP) of butanol production from cellulosic biomass is a promising strategy for cost saving compared to other processes featuring dedicated cellulase production. CBP requires microbial strains capable of hydrolyzing biomass with enzymes produced on their own, at high rate and high conversion, while simultaneously producing a desired product at high yield. However, currently reported butanol-producing candidates are unable to utilize cellulose as a sole carbon and energy source. Consequently, developing a co-culture system using different microorganisms, taking advantage of their specific metabolic capacities to produce butanol directly from cellulose in a consolidated bioprocess, is of great interest. This study was mainly undertaken to find organisms complementary to the butanol producer that allow simultaneous saccharification and fermentation of cellulose to butanol in co-culture under mesophilic conditions. Accordingly, a highly efficient and stable cellulose-degrading consortium, N3, was first developed by multiple subcultures. Subsequently, functional microorganisms with 16S rRNA sequences identical to the denaturing gradient gel electrophoresis (DGGE) profile were isolated from consortium N3. The isolate Clostridium celerecrescens N3-2, which exhibited the highest cellulose-degrading capability, was chosen as the partner strain for butanol production with Clostridium acetobutylicum ATCC824. The established stable consortium N3 was also investigated for butanol production by co-culturing with C. acetobutylicum ATCC824. Butanol was produced from cellulose when C. acetobutylicum ATCC824 was co-cultured with either consortium N3 or C. celerecrescens N3-2. Co-culturing C. acetobutylicum ATCC824 with the stable consortium N3 resulted in a relatively higher butanol concentration, 3.73 g/L, and a higher production yield, 0.145 g/g of glucose equivalent. The newly isolated microbial consortium N3 and strain C. celerecrescens N3

  1. Simulation optimisation

    International Nuclear Information System (INIS)

    Anon

    2010-01-01

    Over the past decade there has been a significant advance in flotation circuit optimisation through performance benchmarking using metallurgical modelling and steady-state computer simulation. This benchmarking includes traditional measures, such as grade and recovery, as well as new flotation measures, such as ore floatability, bubble surface area flux and froth recovery. To further this optimisation, Outotec has released its HSC Chemistry software with simulation modules. The flotation model developed by the AMIRA P9 Project, of which Outotec is a sponsor, is regarded by industry as the most suitable flotation model to use for circuit optimisation. This model incorporates ore floatability with flotation cell pulp and froth parameters, residence time, entrainment and water recovery. Outotec's HSC Sim enables you to simulate mineral processes at different levels, from comminution circuits with sizes and no composition, through to flotation processes with minerals by size by floatability components, to full processes with true particles with MLA data.
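Floatability-component models of the kind referred to here are typically first-order in time. A minimal hedged sketch (illustrative rate constant, maximum recovery, and froth-recovery factor, not AMIRA P9 parameter values):

```python
import math

# Classical first-order flotation recovery: R(t) = R_max * (1 - exp(-k t)),
# with the effective rate scaled by froth recovery. k, R_max, and the
# froth-recovery factor below are illustrative assumptions.
def recovery(t, k=0.8, r_max=0.92, froth_recovery=0.7):
    k_eff = k * froth_recovery          # froth phase slows the effective rate
    return r_max * (1.0 - math.exp(-k_eff * t))

print(round(recovery(5.0), 3))   # cumulative recovery after 5 min of flotation
```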

  2. Membrane Bioprocesses for Pharmaceutical Micropollutant Removal from Waters

    Directory of Open Access Journals (Sweden)

    Matthias de Cazes

    2014-10-01

    Full Text Available The purpose of this review is to give an overview of the research reported on bioprocesses for the treatment of domestic or industrial wastewaters (WW) containing pharmaceuticals. Conventional WW treatment technologies are not efficient enough to completely remove all pharmaceuticals from water. Indeed, these compounds are becoming a real public health problem, because they are increasingly present in groundwater and even in potable water. Different types of bioprocesses are described in this work: from classical activated sludge systems, which allow the depletion of pharmaceuticals by biodegradation and adsorption, to enzymatic reactions, which are more focused on the treatment of WW containing a relatively high content of pharmaceuticals and less organic carbon pollution than classical WW. Different aspects concerning the advantages of membrane bioreactors for pharmaceutical removal are discussed, as well as the more recent studies on enzymatic membrane reactors for the depletion of these recalcitrant compounds.

  3. Cleaner bioprocesses for promoting zero-emission biofuels production in Vojvodina

    Energy Technology Data Exchange (ETDEWEB)

    Dodic, Sinisa N.; Vucurovic, Damjan G.; Popov, Stevan D.; Dodic, Jelena M.; Rankovic, Jovana A. [Department of Biotechnology and Pharmaceutical Engineering, Faculty of Technology, University of Novi Sad, Bul. cara Lazara 1, Novi Sad 21000, Vojvodina (RS)

    2010-12-15

    In this study, the policy, market conditions and food security of biomass energy sources are assessed for supplying the future needs of Vojvodina. The Autonomous Province of Vojvodina is an autonomous province in Serbia, containing about 27% of its total population according to the 2002 Census. It is located in the northern part of the country, in the Pannonian plain, in southeastern Europe. Vojvodina is an energy-deficient province. The incentives to invest human and financial resources in the research and development of cleaner bioprocesses are high, considering the benefits which might be achieved in terms of environmental protection and manufacturing costs. In the near and medium term, the development of bioprocesses for waste recycling and resource recovery might be one of the most viable options, considering that much research work has already been done. In Vojvodina, there are technological solutions in which biofuels are produced in a closed cycle, so that the quantity of waste is reduced to a minimum. These solutions include using the stillage (the remainder after distillation) for fattening cattle, and cattle excrement for producing biogas and manure as fertilizer. The energy required for the production of bioethanol is obtained by combustion of lignocellulosic residual waste from the production of the basic raw material, starch, or from biogas. Ash from the burned biomass is returned to the soil as a source of minerals for plants and as a replacement for mineral fertilizer. Such a closed cycle is economical for small farms in Vojvodina. (author)

  4. Cleaner bioprocesses for promoting zero-emission biofuels production in Vojvodina

    International Nuclear Information System (INIS)

    Dodic, Sinisa N.; Vucurovic, Damjan G.; Popov, Stevan D.; Dodic, Jelena M.; Rankovic, Jovana A.

    2010-01-01

    In this study, the policy, market conditions and food security of biomass energy sources are assessed for supplying the future needs of Vojvodina. The Autonomous Province of Vojvodina is an autonomous province in Serbia, containing about 27% of its total population according to the 2002 Census. It is located in the northern part of the country, in the Pannonian plain, in southeastern Europe. Vojvodina is an energy-deficient province. The incentives to invest human and financial resources in the research and development of cleaner bioprocesses are high, considering the benefits which might be achieved in terms of environmental protection and manufacturing costs. In the near and medium term, the development of bioprocesses for waste recycling and resource recovery might be one of the most viable options, considering that much research work has already been done. In Vojvodina, there are technological solutions in which biofuels are produced in a closed cycle, so that the quantity of waste is reduced to a minimum. These solutions include using the stillage (the remainder after distillation) for fattening cattle, and cattle excrement for producing biogas and manure as fertilizer. The energy required for the production of bioethanol is obtained by combustion of lignocellulosic residual waste from the production of the basic raw material, starch, or from biogas. Ash from the burned biomass is returned to the soil as a source of minerals for plants and as a replacement for mineral fertilizer. Such a closed cycle is economical for small farms in Vojvodina. (author)

  5. Optimising Impact in Astronomy for Development Projects

    Science.gov (United States)

    Grant, Eli

    2015-08-01

    Positive outcomes in the fields of science education and international development are notoriously difficult to achieve. Among the challenges facing projects that use astronomy to improve education and socio-economic development is how to optimise project design in order to achieve the greatest possible benefits. Over the past century, medical scientists along with statisticians and economists have progressed an increasingly sophisticated and scientific approach to designing, testing and improving social intervention and public health education strategies. This talk offers a brief review of the history and current state of `intervention science'. A similar framework is then proposed for astronomy outreach and education projects, with applied examples given of how existing evidence can be used to inform project design, predict and estimate cost-effectiveness, minimise the risk of unintended negative consequences and increase the likelihood of target outcomes being achieved.

  6. Optimising Magnetostatic Assemblies

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Smith, Anders

    theorem. This theorem formulates an energy equivalence principle with several implications concerning the optimisation of objective functionals that are linear with respect to the magnetic field. Linear functionals represent different optimisation goals, e.g. maximising a certain component of the field...... approached employing a heuristic algorithm, which led to new design concepts. Some of the procedures developed for linear objective functionals have been extended to non-linear objectives, by employing iterative techniques. Even though most of the optimality results discussed in this work have been derived

  7. To Stretch the Boundary of Secondary Metabolite Production in Plant Cell-Based Bioprocessing: Anthocyanin as a Case Study

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2004-01-01

    Full Text Available Plant cells and tissue cultures hold great promise for controlled production of a myriad of useful secondary metabolites on demand. The current yield and productivity cannot fulfill the commercial goal of a plant cell-based bioprocess for the production of most secondary metabolites. In order to stretch the boundary, recent advances, new directions and opportunities in plant cell-based bioprocessing have been critically examined over the 10 years from 1992 to 2002. A review of the literature indicated that most of the R&D work was devoted predominantly to studies at an empirical level. A rational approach to molecular plant cell bioprocessing, based on a fundamental understanding of metabolic pathways and their regulation, is urgently required to stimulate further advances; however, the strategies and technical framework are still being developed. It is the aim of this review to take a step forward in framing workable strategies and technologies for molecular plant cell-based bioprocessing. Using anthocyanin biosynthesis as a case study, an integrated postgenomic approach has been proposed. This combines the functional analysis of metabolic pathways for biosynthesis of a particular metabolite, from profiling of gene expression and protein expression to metabolic profiling. A global correlation can thus be established across the three molecular levels, with emphasis placed on the interactions between primary metabolism and secondary metabolism; between competing and/or complementary pathways; and between biosynthetic and post-biosynthetic events.

  8. Production of polyol oils from soybean oil by bioprocess and Philippines edible medicinal wild mushrooms

    Science.gov (United States)

    We have been trying to develop a bioprocess for the production of polyol oils directly from soybean oil. We reported earlier the polyol products produced from soybean oil by Acinetobacter haemolyticus A01-35 (NRRL B-59985) (Hou and Lin, 2013). The objective of this study is to identify the chemical ...

  9. Synthesis and characterization of robust magnetic carriers for bioprocess applications

    Energy Technology Data Exchange (ETDEWEB)

    Kopp, Willian, E-mail: willkopp@gmail.com [Federal University of São Carlos-UFSCar, Graduate Program in Chemical Engineering, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); Silva, Felipe A., E-mail: eq.felipe.silva@gmail.com [Federal University of São Carlos-UFSCar, Graduate Program in Chemical Engineering, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); Lima, Lionete N., E-mail: lionetenunes@yahoo.com.br [Federal University of São Carlos-UFSCar, Graduate Program in Chemical Engineering, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); Masunaga, Sueli H., E-mail: sueli.masunaga@gmail.com [Department of Physics, Montana State University-MSU, 173840, Bozeman, MT 59717-3840 (United States); Tardioli, Paulo W., E-mail: pwtardioli@ufscar.br [Department of Chemical Engineering, Federal University of São Carlos-UFSCar, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); Giordano, Roberto C., E-mail: roberto@ufscar.br [Department of Chemical Engineering, Federal University of São Carlos-UFSCar, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); Araújo-Moreira, Fernando M., E-mail: faraujo@df.ufscar.br [Department of Physics, Federal University of São Carlos-UFSCar, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); and others

    2015-03-15

    Highlights: • Silica magnetic microparticles were synthesized for applications in bioprocesses. • The process to produce magnetic microparticles is inexpensive and easily scalable. • Microparticles with very high saturation magnetization were obtained. • The structure of the silica magnetic microparticles could be controlled. - Abstract: Magnetic carriers are an effective option to withdraw selected target molecules from complex mixtures or to immobilize enzymes. This paper describes the synthesis of robust silica magnetic microparticles (SMMps), particularly designed for applications in bioprocesses. SMMps were synthesized in a micro-emulsion, using sodium silicate as the silica source and superparamagnetic iron oxide nanoparticles as the magnetic core. Thermally resistant particles with high and accessible surface area, narrow particle size distribution, high saturation magnetization, and superparamagnetic properties were obtained. Several reaction conditions were tested, yielding materials with saturation magnetization between 45 and 63 emu g{sup −1}, particle size between 2 and 200 μm and average diameter between 11.2 and 15.9 μm, surface area between 49 and 103 m{sup 2} g{sup −1} and pore diameter between 2 and 60 nm. The performance of SMMps in a bioprocess was evaluated by the immobilization of Pseudomonas fluorescens lipase onto octyl-modified SMMp; the resulting biocatalyst was used in the production of butyl butyrate with good results.

  10. Teaching bioprocess engineering to undergraduates: Multidisciplinary hands-on training in a one-week practical course.

    Science.gov (United States)

    Henkel, Marius; Zwick, Michaela; Beuker, Janina; Willenbacher, Judit; Baumann, Sandra; Oswald, Florian; Neumann, Anke; Siemann-Herzberg, Martin; Syldatk, Christoph; Hausmann, Rudolf

    2015-01-01

    Bioprocess engineering is a highly interdisciplinary field of study which benefits strongly from practical courses where students can actively experience the interconnection between biology, engineering, and physical sciences. This work describes a lab course developed for 2nd year undergraduate students of bioprocess engineering and related disciplines, where students are challenged with a real-life bioprocess-engineering application, the production of recombinant protein in a fed-batch process. The lab course was designed to introduce students to the subject of operating and supervising an experiment in a bioreactor, along with the analysis of collected data and a final critical evaluation of the experiment. To provide visual feedback of the experimental outcome, the organism used during class was Escherichia coli which carried a plasmid to recombinantly produce enhanced green fluorescent protein (eGFP) upon induction. This can easily be visualized in both the bioreactor and samples by using ultraviolet light. The lab course is performed with bioreactors of the simplest design, and is therefore highly flexible, robust and easy to reproduce. As part of this work the implementation and framework, the results, and the evaluation and assessment of student learning combined with opinion surveys are presented, which provides a basis for instructors intending to implement a similar lab course at their respective institution. © 2015 by the International Union of Biochemistry and Molecular Biology.

  11. Bioprocessing of concentrated mixed hazardous industrial waste

    International Nuclear Information System (INIS)

    Wolfram, J.H.; Rogers, R.D.; Silver, G.; Attalla, A.; Prisc, M.

    1994-01-01

The use of selected microorganisms for the degradation and/or detoxification of hazardous organic compounds is gaining wide acceptance as an alternative waste treatment technology. This work describes the unique capabilities of an isolated strain of Pseudomonas for metabolizing methylated aromatic compounds. This strain, Pseudomonas putida Idaho, is unique in that it can tolerate and grow under a layer of neat p-xylene. A bioprocess has been developed to degrade low-level waste (LLW) and mixed wastes containing methylated aromatic compounds, i.e., pseudocumene, toluene and p-xylene. The process is now in the demonstration phase at a DOE facility and has been running for one year. The toxic organic substrate has been fed to the bioreactor at concentrations of 21200 ppm. This report describes the results obtained thus far

  12. Optimisation of technical specifications using probabilistic methods

    International Nuclear Information System (INIS)

    Ericsson, G.; Knochenhauer, M.; Hultqvist, G.

    1986-01-01

During the last few years the development of methods for modifying and optimising nuclear power plant Technical Specifications (TS) for plant operations has received increased attention. Probabilistic methods in general, and the plant and system models of probabilistic safety assessment (PSA) in particular, seem to provide the most powerful tools for optimisation. This paper first gives some general comments on optimisation, identifying important parameters, and then describes recent Swedish experience from the use of nuclear power plant PSA models and results for TS optimisation

  13. Evolutionary programming for neutron instrument optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Bentley, Phillip M. [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany)]. E-mail: phillip.bentley@hmi.de; Pappas, Catherine [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Habicht, Klaus [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Lelievre-Berna, Eddy [Institut Laue-Langevin, 6 rue Jules Horowitz, BP 156, 38042 Grenoble Cedex 9 (France)

    2006-11-15

Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development, and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN at the HMI. We discuss the potential of the GA which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.
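The canonical genetic algorithm named in this record follows a standard select-crossover-mutate loop. A minimal Python sketch of that loop is given below; the toy fitness function, population size and mutation rate are illustrative placeholders, not the SPAN field-profile objective or the parameters used in the paper:

```python
import random

def genetic_algorithm(fitness, n_genes, pop_size=30, generations=100,
                      mutation_rate=0.1, seed=0):
    """Minimal canonical GA: truncation selection, one-point crossover,
    per-gene Gaussian mutation. Maximises `fitness` over real-valued genes."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = rng.sample(elite, 2)
            cut = rng.randrange(1, n_genes)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(n_genes):             # per-gene mutation
                if rng.random() < mutation_rate:
                    child[i] += rng.gauss(0, 0.1)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# Toy objective: maximise -sum(x_i^2), optimum at the origin.
best = genetic_algorithm(lambda x: -sum(v * v for v in x), n_genes=3)
```

Because the elite half is carried over unchanged, the best individual never regresses between generations, which is the property that makes such a simple scheme usable for global instrument optimisation.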

  14. Evolutionary programming for neutron instrument optimisation

    International Nuclear Information System (INIS)

    Bentley, Phillip M.; Pappas, Catherine; Habicht, Klaus; Lelievre-Berna, Eddy

    2006-01-01

Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development, and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN at the HMI. We discuss the potential of the GA which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations

  15. Optimisation in radiotherapy II: Programmed and inversion optimisation algorithms

    International Nuclear Information System (INIS)

    Ebert, M.

    1997-01-01

This is the second article in a three-part examination of optimisation in radiotherapy. The previous article established the bases of optimisation in radiotherapy and the formulation of the optimisation problem. This paper outlines several algorithms that have been used in radiotherapy to search for the best irradiation strategy within the full set of possible strategies. Two principal classes of algorithm are considered: those associated with mathematical programming, which employ specific search techniques, linear-programming-type searches or artificial intelligence; and those which seek to perform a numerical inversion of the optimisation problem, finishing with deterministic iterative inversion. (author)
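A deterministic iterative inversion of the kind this abstract ends on can be illustrated as projected gradient descent on a least-squares dose objective: find nonnegative beam weights w such that Dw matches a prescribed dose d. The dose matrix and prescription below are toy numbers for a 3-voxel, 2-beam geometry, not a clinical case or the paper's formulation:

```python
def invert_beam_weights(D, d_target, iters=500, step=0.1):
    """Iterative inversion: gradient descent on ||D w - d||^2 with
    projection onto w >= 0 (beam weights cannot be negative).
    D[i][j] is the dose delivered to voxel i by unit weight of beam j."""
    n_beams = len(D[0])
    w = [0.0] * n_beams
    for _ in range(iters):
        # residual of the current dose distribution vs. the prescription
        residual = [sum(D[i][j] * w[j] for j in range(n_beams)) - d_target[i]
                    for i in range(len(D))]
        for j in range(n_beams):
            grad = 2 * sum(D[i][j] * residual[i] for i in range(len(D)))
            w[j] = max(0.0, w[j] - step * grad)   # project onto w >= 0
    return w

# Toy geometry with a known consistent solution w = [2, 1].
D = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
d = [2.0, 1.0, 1.5]
w = invert_beam_weights(D, d)
```

For a consistent system like this the iteration converges to the exact weights; with a real (inconsistent) prescription it converges to the constrained least-squares compromise, which is the sense in which such inversions "optimise" the plan.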

  16. Intelligent control of mixed-culture bioprocesses

    International Nuclear Information System (INIS)

    Stoner, D.L.; Larsen, E.D.; Miller, K.S.

    1995-01-01

A hierarchical control system is being developed and applied to a mixed-culture bioprocess in a continuous stirred tank reactor. A bioreactor, with its inherent complexity and non-linear behavior, was an interesting yet difficult application for control theory. The bottom level of the hierarchy was implemented as a number of integrated set-point controls and data acquisition modules. Within the second level was a diagnostic system that used expert knowledge to determine the operational status of the sensors, actuators, and control modules. A diagnostic program was successfully implemented for the detection of stirrer malfunctions, and to monitor liquid delivery rates and recalibrate the pumps when deviations from desired flow rates occurred. The highest control level was a supervisory shell that was developed using expert knowledge and the history of the reactor operation to determine the set points required to meet a set of production criteria. At this stage the supervisory shell analyzed the data to determine the state of the system. In future implementations, this shell will determine the set points required to optimize a cost function using expert knowledge and adaptive learning techniques
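The pump-recalibration diagnostic described here can be sketched as a small rule-based layer on top of a set-point module. The class name, tolerance and calibration logic below are illustrative assumptions, not taken from the paper:

```python
class PumpController:
    """Sketch of the bottom two hierarchy levels: a set-point module
    driving a pump, plus a diagnostic layer that recalibrates the pump
    when the measured flow drifts from the set point."""

    def __init__(self, setpoint_ml_min, tolerance=0.05):
        self.setpoint = setpoint_ml_min
        self.tolerance = tolerance      # fractional deviation that triggers action
        self.calibration = 1.0          # command units per mL/min

    def command(self):
        """Low-level set-point control: command sent to the pump."""
        return self.setpoint * self.calibration

    def diagnose(self, measured_flow):
        """Second-level diagnostic: return 'ok' or 'recalibrated'."""
        deviation = (measured_flow - self.setpoint) / self.setpoint
        if abs(deviation) > self.tolerance:
            # Assume the pump's gain drifted; rescale the calibration factor.
            self.calibration *= self.setpoint / measured_flow
            return "recalibrated"
        return "ok"

ctrl = PumpController(setpoint_ml_min=10.0)
status = ctrl.diagnose(measured_flow=12.0)   # 20% high -> triggers recalibration
```

A supervisory shell, as in the abstract, would sit above many such modules and adjust their set points against production criteria.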

  17. Intelligent control of mixed-culture bioprocesses

    Energy Technology Data Exchange (ETDEWEB)

    Stoner, D.L.; Larsen, E.D.; Miller, K.S. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others

    1995-12-31

A hierarchical control system is being developed and applied to a mixed-culture bioprocess in a continuous stirred tank reactor. A bioreactor, with its inherent complexity and non-linear behavior, was an interesting yet difficult application for control theory. The bottom level of the hierarchy was implemented as a number of integrated set-point controls and data acquisition modules. Within the second level was a diagnostic system that used expert knowledge to determine the operational status of the sensors, actuators, and control modules. A diagnostic program was successfully implemented for the detection of stirrer malfunctions, and to monitor liquid delivery rates and recalibrate the pumps when deviations from desired flow rates occurred. The highest control level was a supervisory shell that was developed using expert knowledge and the history of the reactor operation to determine the set points required to meet a set of production criteria. At this stage the supervisory shell analyzed the data to determine the state of the system. In future implementations, this shell will determine the set points required to optimize a cost function using expert knowledge and adaptive learning techniques.

  18. Bioprocessing strategies for the large-scale production of human mesenchymal stem cells: a review.

    Science.gov (United States)

    Panchalingam, Krishna M; Jung, Sunghoon; Rosenberg, Lawrence; Behie, Leo A

    2015-11-23

    Human mesenchymal stem cells (hMSCs), also called mesenchymal stromal cells, have been of great interest in regenerative medicine applications because of not only their differentiation potential but also their ability to secrete bioactive factors that can modulate the immune system and promote tissue repair. This potential has initiated many early-phase clinical studies for the treatment of various diseases, disorders, and injuries by using either hMSCs themselves or their secreted products. Currently, hMSCs for clinical use are generated through conventional static adherent cultures in the presence of fetal bovine serum or human-sourced supplements. However, these methods suffer from variable culture conditions (i.e., ill-defined medium components and heterogeneous culture environment) and thus are not ideal procedures to meet the expected future demand of quality-assured hMSCs for human therapeutic use. Optimizing a bioprocess to generate hMSCs or their secreted products (or both) promises to improve the efficacy as well as safety of this stem cell therapy. In this review, current media and methods for hMSC culture are outlined and bioprocess development strategies discussed.

  19. A supportive architecture for CFD-based design optimisation

    Science.gov (United States)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for implementing enterprise systems (ES). MDO that requires the analysis of fluid dynamics poses a special challenge due to its extremely intensive computation. The rapid development of the computational fluid dynamics (CFD) technique has led to a rise in its applications across various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, alongside analytical approaches and wind-tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. It is therefore desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing works has found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In the paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing techniques and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or the data level to fully utilise the capabilities of different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided, and the result has shown that the proposed architecture

  20. Optimisation of Investment Resources at Small Enterprises

    Directory of Open Access Journals (Sweden)

    Shvets Iryna B.

    2014-03-01

Full Text Available The goal of the article is to study the process of optimising the structure of investment resources and to develop criteria and stages for optimising the volumes of investment resources at small enterprises by type of economic activity. The article characterises the transformation of investment resources into assets and liabilities on the balance sheets of small enterprises, and calculates the structure of sources of investment resources at small enterprises in Ukraine by type of economic activity in 2011. On the basis of this analysis of the structure of investment resources of small enterprises, the article forms the main groups of optimisation criteria for individual small enterprises by type of economic activity. It offers an algorithm and a step-by-step scheme for optimising investment resources at small enterprises, framed as a multi-stage process of managing investment resources that increases their mobility and the rate at which existing resources are transformed into investments. A prospect for further study in this direction is the development of a structural and logic scheme for optimising the volumes of investment resources at small enterprises.

  1. Consolidated bioprocessing for butyric acid production from rice straw with undefined mixed culture

    Directory of Open Access Journals (Sweden)

    Binling Ai

    2016-10-01

Full Text Available Lignocellulosic biomass is a renewable source with great potential for biofuels and bioproducts. However, the cost of cellulolytic enzymes limits the utilization of this low-cost bioresource. This study aimed to develop a consolidated bioprocess that requires no supplementary cellulase for butyric acid production from lignocellulosic biomass. A stirred-tank reactor with a working volume of 21 L was constructed and operated in batch and semi-continuous fermentation modes with a cellulolytic butyrate-producing microbial community. The semi-continuous fermentation, with intermittent discharging of the culture broth and replenishment with fresh medium, achieved the highest butyric acid productivity of 2.69 g/(L·d). In semi-continuous operation mode, butyric acid and total carboxylic acid concentrations of 16.2 and 28.9 g/L, respectively, were achieved. Over the 21-day fermentation period, their cumulative yields reached 1189 and 2048 g, respectively, corresponding to 41% and 74% of the maximum theoretical yields based on the amount of NaOH-pretreated rice straw fed in. This study demonstrated that an undefined mixed culture-based consolidated bioprocess for butyric acid production can completely eliminate the cost of supplementary cellulolytic enzymes.

  2. Consolidated Bioprocessing for Butyric Acid Production from Rice Straw with Undefined Mixed Culture.

    Science.gov (United States)

    Ai, Binling; Chi, Xue; Meng, Jia; Sheng, Zhanwu; Zheng, Lili; Zheng, Xiaoyan; Li, Jianzheng

    2016-01-01

Lignocellulosic biomass is a renewable source with great potential for biofuels and bioproducts. However, the cost of cellulolytic enzymes limits the utilization of this low-cost bioresource. This study aimed to develop a consolidated bioprocess that requires no supplementary cellulase for butyric acid production from lignocellulosic biomass. A stirred-tank reactor with a working volume of 21 L was constructed and operated in batch and semi-continuous fermentation modes with a cellulolytic butyrate-producing microbial community. The semi-continuous fermentation, with intermittent discharging of the culture broth and replenishment with fresh medium, achieved the highest butyric acid productivity of 2.69 g/(L·d). In semi-continuous operation mode, butyric acid and total carboxylic acid concentrations of 16.2 and 28.9 g/L, respectively, were achieved. Over the 21-day fermentation period, their cumulative yields reached 1189 and 2048 g, respectively, corresponding to 41% and 74% of the maximum theoretical yields based on the amount of NaOH-pretreated rice straw fed in. This study demonstrated that an undefined mixed culture-based consolidated bioprocess for butyric acid production can completely eliminate the cost of supplementary cellulolytic enzymes.

  3. Monoliths in Bioprocess Technology

    Directory of Open Access Journals (Sweden)

    Vignesh Rajamanickam

    2015-04-01

    Full Text Available Monolithic columns are a special type of chromatography column, which can be used for the purification of different biomolecules. They have become popular due to their high mass transfer properties and short purification times. Several articles have already discussed monolith manufacturing, as well as monolith characteristics. In contrast, this review focuses on the applied aspect of monoliths and discusses the most relevant biomolecules that can be successfully purified by them. We describe success stories for viruses, nucleic acids and proteins and compare them to conventional purification methods. Furthermore, the advantages of monolithic columns over particle-based resins, as well as the limitations of monoliths are discussed. With a compilation of commercially available monolithic columns, this review aims at serving as a ‘yellow pages’ for bioprocess engineers who face the challenge of purifying a certain biomolecule using monoliths.

  4. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    Science.gov (United States)

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with highest product purity requiring suboptimal values for other criteria, but also allowed for AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018. © 2017 American Institute of Chemical Engineers.

  5. Bioprocessing applications in the management of nuclear and chemical wastes

    International Nuclear Information System (INIS)

    Genung, R.K.

    1988-01-01

    The projected requirements for waste management and environmental restoration activities within the United States will probably cost tens of billions of dollars annually during the next two decades. Expenditures of this magnitude clearly have the potential to affect the international competitiveness of many US industries and the continued operation of many federal facilities. It is argued that the costs of implementing current technology will be too high unless the standards and schedules for compliance are relaxed. Since this is socially unacceptable, efforts to improve the efficiency of existing technologies and to develop new technologies should be pursued. A sizable research, development, and demonstration effort can be easily justified if the potential for reducing costs can be shown. Bioprocessing systems for the treatment of nuclear and chemically hazardous wastes offer such promise. 11 refs

  6. Production-process optimization algorithm: Application to fed-batch bioprocess

    Czech Academy of Sciences Publication Activity Database

    Pčolka, M.; Čelikovský, Sergej

    2017-01-01

    Roč. 354, č. 18 (2017), s. 8529-8551 ISSN 0016-0032 R&D Projects: GA ČR(CZ) GA17-04682S Institutional support: RVO:67985556 Keywords : Optimal control * Bioprocess * Optimization Subject RIV: BC - Control Systems Theory OBOR OECD: Automation and control systems Impact factor: 3.139, year: 2016 https://doi.org/10.1016/j.jfranklin.2017.10.012

  7. Self-optimisation and model-based design of experiments for developing a C–H activation flow process

    Directory of Open Access Journals (Sweden)

    Alexander Echtermeyer

    2017-01-01

Full Text Available A recently described C(sp3)–H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the fewest experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled the rapid generation of a process model.
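The self-optimisation approach described above closes a loop in which an optimiser proposes reaction conditions, the flow reactor "runs" them, and the measured objective feeds the next proposal. A minimal sketch of that loop follows; the synthetic yield surface stands in for the real C–H activation chemistry, and the shrinking random search stands in for the simplex/SNOBFIT-style optimisers typically used, so all numbers and names are illustrative:

```python
import random

def self_optimise(run_experiment, bounds, n_iters=40, seed=1):
    """Closed-loop self-optimisation sketch: propose conditions, run the
    'experiment', keep the best, and search near it with shrinking steps."""
    rng = random.Random(seed)
    best_x, best_y = None, float("-inf")
    x = [(lo + hi) / 2 for lo, hi in bounds]        # start at the midpoint
    step = [(hi - lo) / 4 for lo, hi in bounds]
    for _ in range(n_iters):
        y = run_experiment(x)                        # one flow experiment
        if y > best_y:
            best_x, best_y = list(x), y
        # propose new conditions near the current best, within bounds
        x = [min(hi, max(lo, bx + rng.uniform(-s, s)))
             for (lo, hi), bx, s in zip(bounds, best_x, step)]
        step = [s * 0.95 for s in step]              # shrink the search
    return best_x, best_y

# Stand-in yield surface peaking at 140 degC and 5 min residence time.
toy_yield = lambda x: 100 - 0.01 * (x[0] - 140) ** 2 - 2 * (x[1] - 5) ** 2
conditions, yield_pct = self_optimise(toy_yield, bounds=[(100, 180), (1, 10)])
```

The experimental-efficiency comparison in the abstract is about exactly this loop: each call to `run_experiment` is a real reactor run, so the number of iterations is the cost that self-optimisation minimises.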

  8. Development of an Optimised Losartan Potassium Press-Coated ...

    African Journals Online (AJOL)

The optimised formulation was further characterized with Fourier-transform infrared spectroscopy (FTIR) and powder X-ray diffractometry (PXRD) to investigate any drug/excipient modifications/interactions. Results: The tensile strength values of all the PCT were between 1.12 and 1.23 MN m-2 and friability was < 0.36%.

  9. Optimising agile development practices for the maintenance operation: nine heuristics

    DEFF Research Database (Denmark)

    Heeager, Lise Tordrup; Rose, Jeremy

    2014-01-01

Agile methods are widely used and successful in many development situations and beginning to attract attention amongst the software maintenance community – both researchers and practitioners. However, it should not be assumed that implementing a well-known agile method for a maintenance department is therefore a trivial endeavour – the maintenance operation differs in some important respects from development work. Classical accounts of software maintenance emphasise more traditional software engineering processes, whereas recent research accounts of agile maintenance efforts uncritically focus on benefits. In an action research project at Aveva in Denmark we assisted with the optimisation of SCRUM, tailoring the standard process to the immediate needs of the developers. We draw on both theoretical and empirical learning to formulate nine heuristics for maintenance practitioners wishing to go agile.

  10. Share-of-Surplus Product Line Optimisation with Price Levels

    Directory of Open Access Journals (Sweden)

    X. G. Luo

    2014-01-01

Full Text Available Kraus and Yano (2003) established the share-of-surplus product line optimisation model and developed a heuristic procedure for this nonlinear mixed-integer optimisation model. In their model, the price of a product is defined as a continuous decision variable. However, because product line optimisation is a planning process in the early stage of product development, pricing decisions usually are not very precise. In this research, a nonlinear integer programming share-of-surplus product line optimisation model that allows the selection of candidate price levels for products is established. The model is further transformed into an equivalent linear mixed-integer optimisation model by applying linearisation techniques. Experimental results in different market scenarios show that the computation time of the transformed model is much less than that of the original model.
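Discretising price into candidate levels is what makes the problem an integer program; for a tiny instance the same selection can be done by plain enumeration. The sketch below assumes a drastically simplified objective (revenue from products a single customer segment would buy, i.e. those with nonnegative surplus) as a toy stand-in for the actual share-of-surplus formulation, and all utilities and price levels are invented:

```python
from itertools import product as cartesian

def best_price_levels(utilities, price_levels):
    """Enumerate one candidate price level per product and pick the
    combination maximising revenue from products whose surplus
    (utility - price) is nonnegative for the segment."""
    best_combo, best_value = None, float("-inf")
    for combo in cartesian(*price_levels):
        value = sum(p for u, p in zip(utilities, combo) if u - p >= 0)
        if value > best_value:
            best_combo, best_value = combo, value
    return best_combo, best_value

utilities = [10.0, 6.0]                          # segment utility per product
price_levels = [[4.0, 8.0, 12.0], [3.0, 7.0]]    # candidate levels per product
combo, revenue = best_price_levels(utilities, price_levels)
```

Enumeration is exponential in the number of products, which is precisely why the paper's linearised mixed-integer formulation matters for realistic product lines.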

  11. Characterization of simultaneous heat and mass transfer phenomena for water vapour condensation on a solid surface in an abiotic environment--application to bioprocesses.

    Science.gov (United States)

    Tiwari, Akhilesh; Kondjoyan, Alain; Fontaine, Jean-Pierre

    2012-07-01

The phenomenon of heat and mass transfer by condensation of water vapour from humid air involves several key concepts in aerobic bioreactors. The high performance of bioreactors results from optimised interactions between biological processes and multiphase heat and mass transfer. Indeed, in various processes such as submerged fermenters and solid-state fermenters, gas/liquid transfer needs to be well controlled, as it is involved at the microorganism interface and in the control of the global process. For the theoretical prediction of such phenomena, mathematical models require heat and mass transfer coefficients. To date, very few data have been validated concerning mass transfer coefficients from humid air inflows relevant to those bioprocesses. Our study focussed on the condensation process of water vapour; we developed an experimental set-up and protocol to study the velocity profiles and the mass flux on a small horizontal flat plate under controlled environmental conditions. A closed-circuit wind tunnel facility was used to control the temperature, hygrometry and hydrodynamics of the flow. The temperature of the active surface was controlled thermoelectrically and kept isothermal below the dew point to induce condensation. The experiments were performed at ambient temperature, for a relative humidity between 35% and 65%, and for a velocity of 1.0 m s⁻¹. The obtained data are analysed and compared to available theoretical calculations on condensation mass flux.
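The condensation mass flux measured in such a set-up is commonly modelled as a mass-transfer coefficient times a vapour-density difference between the bulk air and the cold surface. A hedged sketch of that textbook relation follows; the coefficient `h_m` is an illustrative placeholder (not a value from this study), and the saturation pressure uses a standard Magnus-type approximation:

```python
import math

def saturation_pressure_pa(T_celsius):
    """Magnus-type approximation for water saturation vapour pressure (Pa)."""
    return 610.94 * math.exp(17.625 * T_celsius / (T_celsius + 243.04))

def condensation_flux(T_air, rh, T_surface, h_m=0.01):
    """Condensation mass flux, kg/(m^2 s):
        m'' = h_m * (rho_v_air - rho_v_surface)
    with vapour densities from the ideal-gas law; h_m (m/s) is an
    assumed mass-transfer coefficient."""
    R_v = 461.5  # J/(kg K), specific gas constant of water vapour
    rho_air = rh * saturation_pressure_pa(T_air) / (R_v * (T_air + 273.15))
    rho_surf = saturation_pressure_pa(T_surface) / (R_v * (T_surface + 273.15))
    return h_m * (rho_air - rho_surf)

# 22 degC air at 60% RH over a 5 degC plate: positive flux -> condensation.
flux = condensation_flux(T_air=22.0, rh=0.60, T_surface=5.0)
```

The sign of the density difference encodes the dew-point condition the abstract describes: when the surface is warmer than the dew point of the incoming air, the flux goes negative and no condensate forms.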

  12. Agent-Based Decision Control—How to Appreciate Multivariate Optimisation in Architecture

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer; Perkov, Thomas Holmer; Kolarik, Jakub

    2015-01-01

The main focus is to demonstrate the optimisation method, which is done in two ways. Firstly, the newly developed agent-based optimisation algorithm named Moth is tested on three different single-objective search spaces. Here Moth is compared to two evolutionary algorithms. Secondly, the method is applied to a multivariate optimisation problem. The aim is specifically to demonstrate optimisation for entire building energy consumption, daylight distribution and capital cost. Based on the demonstrations, Moth's ability to find local minima is discussed. It is concluded that agent-based optimisation algorithms like Moth open up for new uses of optimisation in the early design stage. With Moth the final outcome is less dependent on pre- and post-processing, and Moth allows user intervention during optimisation. Therefore, agent-based models for optimisation such as Moth can be a powerful...

  13. Application of Raman Spectroscopy and Univariate Modelling As a Process Analytical Technology for Cell Therapy Bioprocessing

    Science.gov (United States)

    Baradez, Marc-Olivier; Biziato, Daniela; Hassan, Enas; Marshall, Damian

    2018-01-01

Cell therapies offer unquestionable promises for the treatment, and in some cases even the cure, of complex diseases. As we start to see more of these therapies gaining market authorization, attention is turning to the bioprocesses used for their manufacture, in particular the challenge of gaining higher levels of process control to help regulate cell behavior, manage process variability, and deliver product of a consistent quality. Many processes already incorporate the measurement of key markers such as nutrient consumption, metabolite production, and cell concentration, but these are often performed off-line and only at set time points in the process. Having the ability to monitor these markers in real-time using in-line sensors would offer significant advantages, allowing faster decision-making and a finer level of process control. In this study, we use Raman spectroscopy as an in-line optical sensor for bioprocess monitoring of an autologous T-cell immunotherapy model produced in a stirred tank bioreactor system. Using reference datasets generated on a standard bioanalyzer, we develop chemometric models from the Raman spectra for glucose, glutamine, lactate, and ammonia. These chemometric models can accurately monitor donor-specific increases in nutrient consumption and metabolite production as the primary T-cells transition from a recovery phase and begin proliferating. Using a univariate modeling approach, we then show how changes in peak intensity within the Raman spectra can be correlated with cell concentration and viability. These models, which act as surrogate markers, can be used to monitor cell behavior including cell proliferation rates, proliferative capacity, and transition of the cells to a quiescent phenotype. Finally, using the univariate models, we demonstrate how Raman spectroscopy can be applied for real-time monitoring.
The ability to measure these key parameters using an in-line Raman optical sensor makes it possible to have immediate
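The univariate modelling step described above amounts to fitting a calibration line relating a single Raman peak intensity to an off-line reference measurement such as viable cell concentration. A minimal ordinary-least-squares sketch follows; the intensities and concentrations are synthetic numbers invented for illustration, where a real model would be calibrated against bioanalyzer reference data:

```python
def fit_univariate(intensities, concentrations):
    """Ordinary least-squares line y = slope * x + intercept relating a
    single Raman peak intensity (x) to a reference measurement (y)."""
    n = len(intensities)
    mx = sum(intensities) / n
    my = sum(concentrations) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(intensities, concentrations))
             / sum((x - mx) ** 2 for x in intensities))
    intercept = my - slope * mx
    return slope, intercept

# Synthetic calibration set: peak intensity (a.u.) vs cells/mL (x 1e6).
peaks = [0.10, 0.21, 0.29, 0.42, 0.50]
cells = [0.5, 1.0, 1.5, 2.0, 2.5]
slope, intercept = fit_univariate(peaks, cells)
predict = lambda x: slope * x + intercept   # surrogate marker, usable in real time
```

Once fitted, `predict` turns every in-line spectrum into an immediate concentration estimate, which is what makes the peak intensity a surrogate marker for real-time monitoring.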

  14. Application of Raman Spectroscopy and Univariate Modelling As a Process Analytical Technology for Cell Therapy Bioprocessing.

    Science.gov (United States)

    Baradez, Marc-Olivier; Biziato, Daniela; Hassan, Enas; Marshall, Damian

    2018-01-01

Cell therapies offer unquestionable promises for the treatment, and in some cases even the cure, of complex diseases. As we start to see more of these therapies gaining market authorization, attention is turning to the bioprocesses used for their manufacture, in particular the challenge of gaining higher levels of process control to help regulate cell behavior, manage process variability, and deliver product of a consistent quality. Many processes already incorporate the measurement of key markers such as nutrient consumption, metabolite production, and cell concentration, but these are often performed off-line and only at set time points in the process. Having the ability to monitor these markers in real-time using in-line sensors would offer significant advantages, allowing faster decision-making and a finer level of process control. In this study, we use Raman spectroscopy as an in-line optical sensor for bioprocess monitoring of an autologous T-cell immunotherapy model produced in a stirred tank bioreactor system. Using reference datasets generated on a standard bioanalyzer, we develop chemometric models from the Raman spectra for glucose, glutamine, lactate, and ammonia. These chemometric models can accurately monitor donor-specific increases in nutrient consumption and metabolite production as the primary T-cells transition from a recovery phase and begin proliferating. Using a univariate modeling approach, we then show how changes in peak intensity within the Raman spectra can be correlated with cell concentration and viability. These models, which act as surrogate markers, can be used to monitor cell behavior including cell proliferation rates, proliferative capacity, and transition of the cells to a quiescent phenotype. Finally, using the univariate models, we demonstrate how Raman spectroscopy can be applied for real-time monitoring.
The ability to measure these key parameters using an in-line Raman optical sensor makes it possible to have immediate

  15. Application of Raman Spectroscopy and Univariate Modelling As a Process Analytical Technology for Cell Therapy Bioprocessing

    Directory of Open Access Journals (Sweden)

    Marc-Olivier Baradez

    2018-03-01

Full Text Available Cell therapies offer unquestionable promises for the treatment, and in some cases even the cure, of complex diseases. As we start to see more of these therapies gaining market authorization, attention is turning to the bioprocesses used for their manufacture, in particular the challenge of gaining higher levels of process control to help regulate cell behavior, manage process variability, and deliver product of a consistent quality. Many processes already incorporate the measurement of key markers such as nutrient consumption, metabolite production, and cell concentration, but these are often performed off-line and only at set time points in the process. Having the ability to monitor these markers in real-time using in-line sensors would offer significant advantages, allowing faster decision-making and a finer level of process control. In this study, we use Raman spectroscopy as an in-line optical sensor for bioprocess monitoring of an autologous T-cell immunotherapy model produced in a stirred tank bioreactor system. Using reference datasets generated on a standard bioanalyzer, we develop chemometric models from the Raman spectra for glucose, glutamine, lactate, and ammonia. These chemometric models can accurately monitor donor-specific increases in nutrient consumption and metabolite production as the primary T-cells transition from a recovery phase and begin proliferating. Using a univariate modeling approach, we then show how changes in peak intensity within the Raman spectra can be correlated with cell concentration and viability. These models, which act as surrogate markers, can be used to monitor cell behavior including cell proliferation rates, proliferative capacity, and transition of the cells to a quiescent phenotype. Finally, using the univariate models, we demonstrate how Raman spectroscopy can be applied for real-time monitoring.
The ability to measure these key parameters using an in-line Raman optical sensor makes it possible
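
The univariate approach described above, correlating a single Raman peak intensity with an off-line reference measurement, amounts in its simplest form to a least-squares calibration line. A minimal sketch, with all numbers illustrative rather than taken from the study:

```python
def fit_univariate(intensities, concentrations):
    """Least-squares calibration line c = a*I + b mapping a Raman peak
    intensity I to an analyte concentration c. Reference concentrations
    would come from an off-line bioanalyzer, as in the study above."""
    n = len(intensities)
    mean_i = sum(intensities) / n
    mean_c = sum(concentrations) / n
    cov = sum((i - mean_i) * (c - mean_c)
              for i, c in zip(intensities, concentrations))
    var = sum((i - mean_i) ** 2 for i in intensities)
    a = cov / var
    return a, mean_c - a * mean_i

# Hypothetical calibration set: glucose peak intensity vs. bioanalyzer g/L
peaks = [0.10, 0.20, 0.30, 0.40]
glucose = [1.0, 2.1, 2.9, 4.0]
a, b = fit_univariate(peaks, glucose)
print(round(a * 0.25 + b, 2))  # concentration estimate for a new in-line spectrum
```

Once fitted, such a line converts every new in-line spectrum into a concentration estimate without further off-line sampling.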

  16. Bioprocessing of ores: Application to space resources

    Science.gov (United States)

    Johansson, Karl R.

    1992-01-01

    The role of microorganisms in the oxidation and leaching of various ores (especially those of copper, iron, and uranium) is well known. This role is increasingly being applied by the mining, metallurgy, and sewage industries in the bioconcentration of metal ions from natural receiving waters and from waste waters. It is concluded that bioprocessing using bacteria in closed reactors may be a viable option for the recovery of metals from the lunar regolith. Obviously, considerable research must be done to define the process, specify the appropriate bacteria, determine the necessary conditions and limitations, and evaluate the overall feasibility.

  17. Bioprocessing of wheat bran in whole wheat bread increases the bioavailability of phenolic acids in men and exerts antiinflammatory effects ex vivo.

    Science.gov (United States)

    Mateo Anson, Nuria; Aura, Anna-Marja; Selinheimo, Emilia; Mattila, Ismo; Poutanen, Kaisa; van den Berg, Robin; Havenaar, Robert; Bast, Aalt; Haenen, Guido R M M

    2011-01-01

    Whole grain consumption has been linked to a lower risk of metabolic syndrome, which is normally associated with a low-grade chronic inflammation. The benefits of whole grain are in part related to the inclusion of the bran, rich in phenolic acids and fiber. However, the phenols are poorly bioaccessible from the cereal matrix. The aim of the present study was to investigate the effect of bioprocessing of the bran in whole wheat bread on the bioavailability of phenolic acids, the postprandial plasma antioxidant capacity, and ex vivo antiinflammatory properties. After consumption of a low phenolic acid diet for 3 d and overnight fasting, 8 healthy men consumed 300 g of whole wheat bread containing native bran (control bread) or bioprocessed bran (bioprocessed bread) in a cross-over design. Urine and blood samples were collected for 24 h to analyze the phenolic acids and metabolites. Trolox equivalent antioxidant capacity was measured in plasma. Cytokines were measured in blood after ex vivo stimulation with LPS. The bioavailabilities of ferulic acid, vanillic acid, sinapic acid, and 3,4-dimethoxybenzoic acid from the bioprocessed bread were 2- to 3-fold those from the control bread. Phenylpropionic acid and 3-hydroxyphenylpropionic acid were the main colonic metabolites of the nonbioaccessible phenols. The ratios of pro-:antiinflammatory cytokines were significantly lower in LPS-stimulated blood after the consumption of the bioprocessed bread. In conclusion, bioprocessing can remarkably increase the bioavailability of phenolic acids and their circulating metabolites, compounds which have immunomodulatory effects ex vivo.

  18. Human pluripotent stem cell-derived products: advances towards robust, scalable and cost-effective manufacturing strategies.

    Science.gov (United States)

    Jenkins, Michael J; Farid, Suzanne S

    2015-01-01

    The ability to develop cost-effective, scalable and robust bioprocesses for human pluripotent stem cells (hPSCs) will be key to their commercial success as cell therapies and tools for use in drug screening and disease modelling studies. This review outlines key process economic drivers for hPSCs and progress made on improving the economic and operational feasibility of hPSC bioprocesses. Factors influencing key cost metrics, namely capital investment and cost of goods, for hPSCs are discussed. Step efficiencies particularly for differentiation, media requirements and technology choice are amongst the key process economic drivers identified for hPSCs. Progress made to address these cost drivers in hPSC bioprocessing strategies is discussed. These include improving expansion and differentiation yields in planar and bioreactor technologies, the development of xeno-free media and microcarrier coatings, identification of optimal bioprocess operating conditions to control cell fate and the development of directed differentiation protocols that reduce reliance on expensive morphogens such as growth factors and small molecules. These approaches offer methods to further optimise hPSC bioprocessing in terms of its commercial feasibility. © 2014 The Authors. Biotechnology Journal published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.

  19. A one-step bioprocess for production of high-content fructo-oligosaccharides from inulin by yeast.

    Science.gov (United States)

    Wang, Da; Li, Fu-Li; Wang, Shi-An

    2016-10-20

    Commercial fructo-oligosaccharides (FOS) are predominantly produced from sucrose by a transfructosylation process that presents a maximum theoretical yield below 0.60 g FOS per g sucrose. To obtain high-content FOS, costly purification is generally employed. Additionally, high-content FOS can be produced from inulin by using endo-inulinases. However, commercial endo-inulinases have not been extensively used in scale-up production of FOS. In the present study, a one-step bioprocess that integrated endo-inulinase production, FOS fermentation, and non-FOS sugar removal into one reactor was proposed to produce high-content FOS from inulin. The bioprocess was implemented by a recombinant yeast strain JZHΔS-TSC, in which a heterologous endo-inulinase gene was expressed and the inherent invertase gene SUC2 was disrupted. FOS fermentation at 40 °C from 200 g/L chicory inulin presented a maximum titer, yield, and productivity of 180.2 ± 0.8 g/L, 0.9 g FOS per g inulin, and 7.51 ± 0.03 g/L/h, respectively. This study demonstrated that the one-step bioprocess was simple and highly efficient. Copyright © 2016 Elsevier Ltd. All rights reserved.
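
The quoted performance figures can be cross-checked with simple arithmetic, using only the numbers reported in the abstract:

```python
titer = 180.2        # g/L final FOS concentration
substrate = 200.0    # g/L chicory inulin loaded
productivity = 7.51  # g/L/h volumetric productivity

yield_g_per_g = titer / substrate           # g FOS per g inulin
fermentation_time_h = titer / productivity  # duration implied by titer/productivity

print(round(yield_g_per_g, 2))        # → 0.9
print(round(fermentation_time_h, 1))  # → 24.0
```

The implied fermentation time of about 24 h is internally consistent with the quoted titer and productivity, and the 0.9 g/g yield is well above the 0.60 g/g theoretical ceiling of the sucrose route.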

  20. Bioprocessing applications in the management of nuclear and chemical wastes

    International Nuclear Information System (INIS)

    Genung, R.K.

    1989-01-01

    The US Department of Energy (DOE), the US Department of Defense (DOD), and other federal agencies already face profound challenges in finding strategies that manage budgets and priorities while bringing their sites and facilities into compliance with current statutes and regulations and with agency policies and orders. While it is often agreed that current technology can be used to address most waste management and environmental restoration needs, it is also argued by many that the costs of implementing current technology will be too high unless the standards and schedules for compliance are relaxed. Since this is socially unacceptable, efforts to improve the efficiency of existing technologies and to develop new technologies should be pursued. A sizable research, development, and demonstration effort can be easily justified if the potential for reducing costs can be shown. Bioprocessing systems for the treatment of nuclear and chemically hazardous wastes offer such promise.

  1. Aspects of modelling and control of bioprocesses

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiachang

    1995-12-31

    The modelling and control of bioprocesses are the main subjects of this thesis. Different modelling approaches are proposed for different purposes in various bioprocesses. A conventional global model was constructed for a very complex mammalian cell culture process. A new concept of functional state and a multiple-model (local models) approach were used for modelling the fed-batch baker's yeast process for monitoring and control purposes. Finally, a combination of conventional electrical and biological models was used to simulate and to control a microbial fuel cell process. In the thesis, a yeast growth process was taken as an example to demonstrate the usefulness of the functional state concept and local models. The functional states were first defined according to the yeast metabolism. The process was then described by a set of simple local models. In different functional states, different local models were used. On the other hand, the on-line estimation of the functional state and biomass of the process was discussed for process control purposes. As a consequence, both the functional state concept and the multiple-model approach were applied for fuzzy logic control of the yeast growth process. A fuzzy factor was calculated on the basis of a knowledge-based expert system and fuzzy logic rules. The factor was used to correct an ideal substrate feed rate. In the last part of the thesis, microbial fuel cell processes were studied. A microbial fuel cell is a device for the direct conversion of chemical energy to electrical energy using micro-organisms as catalysts. A combined model including conventional electrical and biological models was constructed for the process based on the biological and electrochemical phenomena.
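
The fuzzy correction of an ideal substrate feed rate described above can be sketched with two simple rules. The rule base, membership breakpoints and feed values below are illustrative assumptions for a fed-batch baker's yeast culture, not the thesis's actual expert system:

```python
def clamp01(x):
    """Clamp a membership grade to the unit interval."""
    return max(0.0, min(1.0, x))

def fuzzy_feed_factor(ethanol_rate):
    """Fuzzy correction factor for an ideal substrate feed rate.
    Two illustrative rules (NOT the thesis's rule base):
      - ethanol being produced  -> overflow metabolism  -> lower the feed
      - ethanol being consumed  -> spare oxidative capacity -> raise the feed
    Membership grades ramp linearly over +/-0.5 g/L/h (assumed breakpoints)."""
    mu_producing = clamp01(ethanol_rate / 0.5)    # rate > 0: ethanol formed
    mu_consuming = clamp01(-ethanol_rate / 0.5)   # rate < 0: ethanol consumed
    return 1.0 + 0.5 * mu_consuming - 0.5 * mu_producing

ideal_feed = 2.0  # L/h, hypothetical ideal feed rate
print(fuzzy_feed_factor(0.0) * ideal_feed)    # balanced metabolism -> 2.0
print(fuzzy_feed_factor(0.5) * ideal_feed)    # strong overflow -> 1.0
print(fuzzy_feed_factor(-0.25) * ideal_feed)  # ethanol consumed -> 2.5
```

The corrected feed rate is simply the ideal rate multiplied by the fuzzy factor, so the controller throttles feeding during overflow metabolism and accelerates it when residual ethanol is being oxidised.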

  3. Combining simulation and multi-objective optimisation for equipment quantity optimisation in container terminals

    OpenAIRE

    Lin, Zhougeng

    2013-01-01

    This thesis proposes a combination framework to integrate simulation and multi-objective optimisation (MOO) for container terminal equipment optimisation. It addresses how the strengths of simulation and multi-objective optimisation can be integrated to find high quality solutions for multiple objectives with low computational cost. Three structures for the combination framework are proposed respectively: pre-MOO structure, integrated MOO structure and post-MOO structure. The applications of ...

  4. Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    The problem in optimising the laser cutting process is outlined. Basic optimisation criteria and principles for adapting an optimisation method, the simplex method, are presented. The results of implementing a response function in the optimisation are discussed with respect to the quality as well...

  5. Bioprocessing of lignite coals using reductive microorganisms

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D.L.

    1992-03-29

    In order to convert lignite coals into liquid fuels, gases or chemical feedstocks, the macromolecular structure of the coal must be broken down into low molecular weight fractions prior to further modification. Our research focused on this aspect of coal bioprocessing. We isolated, characterized and studied the lignite coal-depolymerizing organisms Streptomyces viridosporus T7A, Pseudomonas sp. DLC-62, unidentified bacterial strain DLC-BB2 and Gram-positive Bacillus megaterium strain DLC-21. In this research we showed that these bacteria are able to solubilize and depolymerize lignite coals using a combination of biological mechanisms, including the excretion of coal-solubilizing basic chemical metabolites and extracellular coal-depolymerizing enzymes.

  6. A methodological approach to the design of optimising control strategies for sewer systems

    DEFF Research Database (Denmark)

    Mollerup, Ane Loft; Mikkelsen, Peter Steen; Sin, Gürkan

    2016-01-01

    This study focuses on designing an optimisation-based control for a sewer system in a methodological way and linking it to a regulatory control. Optimisation-based design is found to depend on proper choice of a model, formulation of the objective function and tuning of optimisation parameters. Accordingly, two novel optimisation configurations are developed, where the optimisation either acts on the actuators or acts on the regulatory control layer. These two optimisation designs are evaluated on a sub-catchment of the sewer system in Copenhagen, and found to perform better than the existing...

  7. Modified cuckoo search: A new gradient free optimisation algorithm

    International Nuclear Information System (INIS)

    Walton, S.; Hassan, O.; Morgan, K.; Brown, M.R.

    2011-01-01

    Highlights: → Modified cuckoo search (MCS) is a new gradient free optimisation algorithm. → MCS shows a high convergence rate, able to outperform other optimisers. → MCS is particularly strong at high dimension objective functions. → MCS performs well when applied to engineering problems. - Abstract: A new robust optimisation algorithm, which can be regarded as a modification of the recently developed cuckoo search, is presented. The modification involves the addition of information exchange between the top eggs, or the best solutions. Standard optimisation benchmarking functions are used to test the effects of these modifications and it is demonstrated that, in most cases, the modified cuckoo search performs as well as, or better than, the standard cuckoo search, a particle swarm optimiser, and a differential evolution strategy. In particular the modified cuckoo search shows a high convergence rate to the true global minimum even at high numbers of dimensions.
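
A gradient-free optimiser of this family can be sketched compactly. The code below is a simplified illustration of the two ingredients named above, random-walk search plus information exchange between the best nests; Gaussian steps stand in for the Lévy flights of the actual algorithm, and all parameters are illustrative, not the authors' settings:

```python
import random

def mcs_minimise(f, dim, bounds, n_nests=20, n_gen=200, frac_drop=0.5, seed=0):
    """Gradient-free minimisation in the spirit of modified cuckoo search.
    Simplified sketch: Gaussian random walks replace Levy flights, and the
    'information exchange' blends two of the best nests each generation."""
    rng = random.Random(seed)
    lo, hi = bounds
    nests = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    fit = [f(x) for x in nests]
    for gen in range(1, n_gen + 1):
        sigma = (hi - lo) / (10.0 * gen ** 0.5)   # shrinking step size
        order = sorted(range(n_nests), key=lambda i: fit[i])
        top = order[: max(2, n_nests // 4)]       # the "top eggs"
        # random-walk move from every nest, kept only if it improves
        for i in range(n_nests):
            cand = [min(hi, max(lo, x + rng.gauss(0.0, sigma))) for x in nests[i]]
            fc = f(cand)
            if fc < fit[i]:
                nests[i], fit[i] = cand, fc
        # information exchange: blend two top eggs, replace the current worst
        a, b = rng.sample(top, 2)
        mid = [(xa + xb) / 2.0 for xa, xb in zip(nests[a], nests[b])]
        fm = f(mid)
        worst = max(range(n_nests), key=lambda i: fit[i])
        if fm < fit[worst]:
            nests[worst], fit[worst] = mid, fm
        # abandon the worst fraction of nests, as in standard cuckoo search
        order = sorted(range(n_nests), key=lambda i: fit[i])
        for i in order[int(n_nests * (1.0 - frac_drop)):]:
            nests[i] = [rng.uniform(lo, hi) for _ in range(dim)]
            fit[i] = f(nests[i])
    best = min(range(n_nests), key=lambda i: fit[i])
    return nests[best], fit[best]

# demo on the 2-D sphere function, whose global minimum is 0 at the origin
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = mcs_minimise(sphere, dim=2, bounds=(-5.0, 5.0))
print("best f:", f_best)
```

Because no derivatives of `f` are ever taken, the same loop applies unchanged to noisy or black-box objective functions.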

  8. Multi-objective evolutionary optimisation for product design and manufacturing

    CERN Document Server

    2011-01-01

    Presents state-of-the-art research in the area of multi-objective evolutionary optimisation for integrated product design and manufacturing. Provides a comprehensive review of the literature. Gives in-depth descriptions of recently developed innovative and novel methodologies, algorithms and systems in the area of modelling, simulation and optimisation.

  9. Risk based test interval and maintenance optimisation - Application and uses

    International Nuclear Information System (INIS)

    Sparre, E.

    1999-10-01

    The project is part of an IAEA Co-ordinated Research Project (CRP) on 'Development of Methodologies for Optimisation of Surveillance Testing and Maintenance of Safety Related Equipment at NPPs'. The purpose of the project is to investigate the sensitivity of the results obtained when performing risk-based optimisation of the technical specifications. Previous projects have shown that complete LPSA models can be created and that these models allow optimisation of technical specifications. However, these optimisations did not include any in-depth check of the result's sensitivity with regard to methods, model completeness, etc. Four different test intervals have been investigated in this study. Aside from an original, nominal optimisation, a set of sensitivity analyses has been performed and the results from these analyses have been compared to the original optimisation. The analyses indicate that the result of an optimisation is rather stable. However, it is not possible to draw any certain conclusions without performing a number of sensitivity analyses. Significant differences in the optimisation result were discovered when analysing an alternative configuration. Deterministic uncertainties also seem to affect the result of an optimisation considerably. The sensitivity to failure data uncertainties is important to investigate in detail, since the methodology is based on the assumption that the unavailability of a component depends on the length of the test interval.

  10. Mannheimia haemolytica growth and leukotoxin production for vaccine manufacturing — A bioprocess review

    Directory of Open Access Journals (Sweden)

    Tobias Oppermann

    2017-07-01

    Full Text Available Mannheimia haemolytica leukotoxin (LKT) is a known cause of bovine respiratory disease (BRD), which results in severe economic losses in the cattle industry (up to USD 1 billion per year in the USA). Vaccines based on LKT offer the most promising measure to contain BRD outbreaks and are already commercially available. However, insufficient LKT yields, predominantly reflecting a lack of knowledge about the LKT expression process, remain a significant engineering problem, and further bioprocess optimization is required to increase process efficiency. Most previous investigations have focused on LKT activity and cell growth, but neither of these parameters defines reliable criteria for the improvement of LKT yields. In this article, we review the most important process conditions and operational parameters (temperature, pH, substrate concentration, dissolved oxygen level, medium composition and the presence of metabolites) from a bioprocess engineering perspective, in order to maximize LKT yields.

  11. Effect of Bioprocessing on the In Vitro Colonic Microbial Metabolism of Phenolic Acids from Rye Bran Fortified Breads

    DEFF Research Database (Denmark)

    Koistinen, Ville M; Nordlund, Emilia; Katina, Kati

    2017-01-01

    Cereal bran is an important source of dietary fiber and bioactive compounds, such as phenolic acids. We aimed to study the phenolic acid metabolism of native and bioprocessed rye bran fortified refined wheat bread and to elucidate the microbial metabolic route of phenolic acids. After incubation in an in vitro colon model, the metabolites were analyzed using two different methods applying mass spectrometry. While phenolic acids were released more extensively from the bioprocessed bran bread and ferulic acid had consistently higher concentrations in the bread type during fermentation, there were only...

  12. Advances in consolidated bioprocessing systems for bioethanol and butanol production from biomass: a comprehensive review

    Directory of Open Access Journals (Sweden)

    Gholamreza Salehi Jouzani

    2015-03-01

    Full Text Available Recently, lignocellulosic biomass, the most abundant renewable resource, has been widely considered for bioalcohol production. However, the complex structure of lignocelluloses requires a multi-step process which is costly and time consuming. Although several bioprocessing approaches have been developed for pretreatment, saccharification and fermentation, bioalcohol production from lignocelluloses is still limited because of the economic infeasibility of these technologies. This cost constraint could be overcome by designing and constructing robust cellulolytic and bioalcohol-producing microbes and by using them in a consolidated bioprocessing (CBP) system. This paper comprehensively reviews potentials, recent advances and challenges faced in CBP systems for efficient bioalcohol (ethanol and butanol) production from lignocellulosic and starchy biomass. The CBP strategies include using native single strains with cellulolytic and alcohol-production activities, microbial co-cultures containing both cellulolytic and ethanologenic microorganisms, and genetic engineering of cellulolytic microorganisms to become alcohol-producing, or alcohol-producing microorganisms to become cellulolytic. Moreover, high-throughput techniques, such as metagenomics, metatranscriptomics, next-generation sequencing and synthetic biology, developed to explore novel microorganisms and powerful enzymes with high activity, thermostability and pH stability, are also discussed. Currently, the CBP technology is in its infancy, and ideal microorganisms and/or conditions at industrial scale are yet to be introduced. So, it is essential to bring to attention all barriers faced and take advantage of all the experience gained to achieve a high-yield and low-cost CBP process.

  13. Operational Radiological Protection and Aspects of Optimisation

    International Nuclear Information System (INIS)

    Lazo, E.; Lindvall, C.G.

    2005-01-01

    Since 1992, the Nuclear Energy Agency (NEA), along with the International Atomic Energy Agency (IAEA), has sponsored the Information System on Occupational Exposure (ISOE). ISOE collects and analyses occupational exposure data and experience from over 400 nuclear power plants around the world and is a forum for radiological protection experts from both nuclear power plants and regulatory authorities to share lessons learned and best practices in the management of worker radiation exposures. In connection to the ongoing work of the International Commission on Radiological Protection (ICRP) to develop new recommendations, the ISOE programme has been interested in how the new recommendations would affect operational radiological protection application at nuclear power plants. Bearing in mind that the ICRP is developing, in addition to new general recommendations, a new recommendation specifically on optimisation, the ISOE programme created a working group to study the operational aspects of optimisation, and to identify the key factors in optimisation that could usefully be reflected in ICRP recommendations. In addition, the Group identified areas where further ICRP clarification and guidance would be of assistance to practitioners, both at the plant and the regulatory authority. The specific objective of this ISOE work was to provide operational radiological protection input, based on practical experience, to the development of new ICRP recommendations, particularly in the area of optimisation. This will help assure that new recommendations will best serve the needs of those implementing radiation protection standards, for the public and for workers, at both national and international levels. (author)

  14. Fluorescence Spectroscopy and Chemometric Modeling for Bioprocess Monitoring

    Directory of Open Access Journals (Sweden)

    Saskia M. Faassen

    2015-04-01

    Full Text Available On-line sensors for the detection of crucial process parameters are desirable for the monitoring, control and automation of processes in the biotechnology, food and pharma industries. Fluorescence spectroscopy is a highly developed and non-invasive technique that enables on-line measurement of substrate and product concentrations or the identification of characteristic process states. During a cultivation process, significant changes occur in the fluorescence spectra. By means of chemometric modeling, prediction models can be calculated and applied for process supervision and control, increasing the quality and productivity of bioprocesses. A range of applications for different microorganisms and analytes has been proposed in recent years. This contribution provides an overview of the different analysis methods for the measured fluorescence spectra and the model-building chemometric methods used for various microbial cultivations. Most of these processes are observed using the BioView® Sensor, thanks to its robustness and insensitivity to adverse process conditions. Beyond that, the PLS method is the most frequently used chemometric method for the calculation of process models and the prediction of process variables.
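
A single-latent-variable PLS1 calibration, the simplest case of the PLS method mentioned above, can be sketched in a few lines. The "spectra" below are synthetic rank-one data for illustration, not real BioView® measurements:

```python
def pls1_one_component(X, y):
    """Single-latent-variable PLS1 regression (one NIPALS step), pure Python.
    Calibrates spectra X (rows = samples, columns = wavelengths) against a
    response y, e.g. an analyte concentration."""
    n, m = len(X), len(X[0])
    xbar = [sum(row[j] for row in X) / n for j in range(m)]
    ybar = sum(y) / n
    Xc = [[row[j] - xbar[j] for j in range(m)] for row in X]
    yc = [v - ybar for v in y]
    # loading weights w are proportional to X^T y (covariance direction)
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(m)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # scores t and inner regression coefficient b
    t = [sum(Xc[i][j] * w[j] for j in range(m)) for i in range(n)]
    b = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    def predict(x):
        t_new = sum((x[j] - xbar[j]) * w[j] for j in range(m))
        return ybar + b * t_new
    return predict

# Synthetic rank-one "spectra": concentration times a pure-component spectrum
s = [0.5, 1.0, 0.2]                        # assumed pure-component spectrum
conc = [1.0, 2.0, 3.0, 4.0]                # calibration concentrations
X = [[c * sj for sj in s] for c in conc]
model = pls1_one_component(X, conc)
print(round(model([3.5 * sj for sj in s]), 3))  # unseen sample -> 3.5
```

Real fluorescence spectra are noisy and multi-component, so practical models use several latent variables, but the projection-then-regression structure is the same.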

  16. Energy Savings from Optimised In-Field Route Planning for Agricultural Machinery

    Directory of Open Access Journals (Sweden)

    Efthymios Rodias

    2017-10-01

    Full Text Available Various types of sensor technologies, such as machine vision and the global positioning system (GPS), have been implemented in the navigation of agricultural vehicles. Automated navigation systems have demonstrated the potential for the execution of optimised route plans for field area coverage. This paper presents an assessment of the reduction in energy requirements derived from the implementation of optimised field area coverage planning. The assessment concerns the analysis of the energy requirements and the comparison between non-optimised and optimised plans for field area coverage over the whole sequence of operations required in two different cropping systems: Miscanthus and Switchgrass production. An algorithmic approach for the simulation of the executed field operations following both non-optimised and optimised field-work patterns was developed. As a result, the corresponding time requirements were estimated as the basis of the subsequent energy cost analysis. Based on the results, the optimised routes reduce the fuel energy consumption by up to 8%, the embodied energy consumption by up to 7%, and the total energy consumption by 3% up to 8%.

  17. Optimisation of efficiency of axial fans

    NARCIS (Netherlands)

    Kruyt, Nicolaas P.; Pennings, P.C.; Faasen, R.

    2014-01-01

    A three-stage research project has been executed to develop ducted axial fans with increased efficiency. In the first stage a design method has been developed in which various conflicting design criteria can be incorporated. Based on this design method, an optimised design has been determined.

  18. Interpretation of optimisation in the context of a disposal facility for long-lived radioactive waste

    International Nuclear Information System (INIS)

    1999-01-01

    Full text: Guidance on the Requirements for Authorisation (the GRA) issued by the Environment Agency for England and Wales requires that all disposals of radioactive waste are undertaken in a manner consistent with four principles for the protection of the public. Among these is a principle of Optimisation, that: 'The radiological detriment to members of the public that may result from the disposal of radioactive waste shall be as low as reasonably achievable, economic and social factors being taken into account'. The principle of optimisation is widely accepted and has been discussed in both UK national policy and guidance and in documents from international organisations. The practical interpretation of optimisation in the context of post-closure safety of radioactive waste repositories is, however, still open to question. In particular, the strategies and procedures that a developer might employ to implement optimisation in the siting and development of a repository, and demonstrate optimisation in a safety case, are not defined. In preparation for its role of regulatory review, the Agency has undertaken a pilot study to explore the possible interpretations of optimisation stemming from the GRA, and to identify possible strategies and procedures that a developer might follow. A review has been undertaken of UK regulatory guidance and related documents, and also international guidance, referring to optimisation in relation to radioactive waste disposal facilities. In addition, diverse examples of the application of optimisation have been identified in the international and UK performance assessment literature. A one-day meeting was organised bringing together Agency staff and technical experts with different experiences and perspectives on the subject of optimisation in the context of disposal facilities for radioactive waste. This meeting identified and discussed key issues and possible approaches to optimisation, and specifically: (1) The meaning of

  19. Optimised Renormalisation Group Flows

    CERN Document Server

    Litim, Daniel F

    2001-01-01

    Exact renormalisation group (ERG) flows interpolate between a microscopic or classical theory and the corresponding macroscopic or quantum effective theory. For most problems of physical interest, the efficiency of the ERG is constrained due to unavoidable approximations. Approximate solutions of ERG flows depend spuriously on the regularisation scheme which is determined by a regulator function. This is similar to the spurious dependence on the ultraviolet regularisation known from perturbative QCD. Providing good control over approximated ERG flows is at the root of reliable physical predictions. We explain why the convergence of approximate solutions towards the physical theory is optimised by appropriate choices of the regulator. We study specific optimised regulators for bosonic and fermionic fields and compare the optimised ERG flows with generic ones. This is done up to second order in the derivative expansion at both vanishing and non-vanishing temperature. An optimised flow for a ``proper-time ren...

  20. Computer Based Optimisation Routines

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    In this paper the need for optimisation methods for the laser cutting process has been identified as three different situations. Demands on the optimisation methods for these situations are presented, and one method for each situation is suggested. The adaptation and implementation of the methods...

  1. Optimal Optimisation in Chemometrics

    NARCIS (Netherlands)

    Hageman, J.A.

    2004-01-01

    The use of global optimisation methods is not straightforward, especially for the more difficult optimisation problems. Solutions have to be found for items such as the evaluation function, representation, step function and meta-parameters, before any useful results can be obtained. This thesis aims

  2. An Intelligent Automation Platform for Rapid Bioprocess Design.

    Science.gov (United States)

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.

  3. An Intelligent Automation Platform for Rapid Bioprocess Design

    Science.gov (United States)

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  4. Automated measurement and monitoring of bioprocesses: key elements of the M(3)C strategy.

    Science.gov (United States)

    Sonnleitner, Bernhard

    2013-01-01

    The state-of-routine monitoring items established in the bioprocess industry as well as some important state-of-the-art methods are briefly described and the potential pitfalls discussed. Among those are physical and chemical variables such as temperature, pressure, weight, volume, mass and volumetric flow rates, pH, redox potential, gas partial pressures in the liquid and molar fractions in the gas phase, infrared spectral analysis of the liquid phase, and calorimetry over an entire reactor. Classical as well as new optical versions are addressed. Biomass and bio-activity monitoring (as opposed to "measurement") via turbidity, permittivity, in situ microscopy, and fluorescence are critically analyzed. Some new(er) instrumental analytical tools, interfaced to bioprocesses, are explained. Among those are chromatographic methods, mass spectrometry, flow and sequential injection analyses, field flow fractionation, capillary electrophoresis, and flow cytometry. This chapter surveys the principles of monitoring rather than compiling instruments.

  5. CLIC crab cavity design optimisation for maximum luminosity

    Energy Technology Data Exchange (ETDEWEB)

    Dexter, A.C., E-mail: a.dexter@lancaster.ac.uk [Lancaster University, Lancaster, LA1 4YR (United Kingdom); Cockcroft Institute, Daresbury, Warrington, WA4 4AD (United Kingdom); Burt, G.; Ambattu, P.K. [Lancaster University, Lancaster, LA1 4YR (United Kingdom); Cockcroft Institute, Daresbury, Warrington, WA4 4AD (United Kingdom); Dolgashev, V. [SLAC, Menlo Park, CA 94025 (United States); Jones, R. [University of Manchester, Manchester, M13 9PL (United Kingdom)

    2011-11-21

    The bunch size and crossing angle planned for CERN's compact linear collider CLIC dictate that crab cavities on opposing linacs will be needed to rotate bunches of particles into alignment at the interaction point if the desired luminosity is to be achieved. Wakefield effects, RF phase errors between crab cavities on opposing linacs and unpredictable beam loading can each act to reduce luminosity below that anticipated for bunches colliding in perfect alignment. Unlike acceleration cavities, which are normally optimised for gradient, crab cavities must be optimised primarily for luminosity. Accepting the crab cavity technology choice of a 12 GHz, normal conducting, travelling wave structure as explained in the text, this paper develops an analytical approach to optimise cell number and iris diameter.

  6. Optimisation and symmetry in experimental radiation physics

    International Nuclear Information System (INIS)

    Ghose, A.

    1988-01-01

    The present monograph is concerned with the optimisation of geometric factors in radiation physics experiments. The discussions are essentially confined to those systems in which optimisation is equivalent to symmetrical configurations of the measurement systems. They include measurements of interaction cross sections of diverse types, determination of polarisations, development of detectors with almost ideal characteristics, production of radiations with continuously variable energies, and development of high-efficiency spectrometers. The monograph is intended for use by experimental physicists investigating primary interactions of radiations with matter and associated technologies. We have illustrated the various optimisation procedures by considering the cases of the so-called ''14 MeV'' or d-t neutrons and gamma rays with energies less than 3 MeV. Developments in fusion technology are critically dependent on the availability of accurate cross sections of nuclei for fast neutrons of energies at least as high as d-t neutrons. In this monograph we have discussed various techniques which can be used to improve the accuracy of such measurements and have also presented a method for generating almost monoenergetic neutrons in the 8 MeV to 13 MeV energy range which can be used to measure cross sections in this sparingly investigated region.

  7. Integrated continuous bioprocessing: Economic, operational, and environmental feasibility for clinical and commercial antibody manufacture

    Science.gov (United States)

    Pollock, James; Coffman, Jon; Ho, Sa V.

    2017-01-01

    This paper presents a systems approach to evaluating the potential of integrated continuous bioprocessing for monoclonal antibody (mAb) manufacture across a product's lifecycle from preclinical to commercial manufacture. The economic, operational, and environmental feasibility of alternative continuous manufacturing strategies were evaluated holistically using a prototype UCL decisional tool that integrated process economics, discrete‐event simulation, environmental impact analysis, operational risk analysis, and multiattribute decision‐making. The case study focused on comparing whole bioprocesses that used either batch, continuous or a hybrid combination of batch and continuous technologies for cell culture, capture chromatography, and polishing chromatography steps. The cost of goods per gram (COG/g), E‐factor, and operational risk scores of each strategy were established across a matrix of scenarios with differing combinations of clinical development phase and company portfolio size. The tool outputs predict that the optimal strategy for early phase production and small/medium‐sized companies is the integrated continuous strategy (alternating tangential flow filtration (ATF) perfusion, continuous capture, continuous polishing). However, the top ranking strategy changes for commercial production and companies with large portfolios to the hybrid strategy with fed‐batch culture, continuous capture and batch polishing from a COG/g perspective. The multiattribute decision‐making analysis highlighted that if the operational feasibility was considered more important than the economic benefits, the hybrid strategy would be preferred for all company scales. Further considerations outside the scope of this work include the process development costs required to adopt continuous processing. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:854–866, 2017

  8. Risk-informed optimisation of railway tracks inspection and maintenance procedures

    International Nuclear Information System (INIS)

    Podofillini, Luca; Zio, Enrico; Vatn, Jorn

    2006-01-01

    Nowadays, efforts are being made by the railway industry for the application of reliability-based and risk-informed approaches to maintenance optimisation of railway infrastructures, with the aim of reducing the operation and maintenance expenditures while still assuring high safety standards. In particular, in this paper, we address the use of ultrasonic inspection cars and develop a methodology for the determination of an optimal strategy for their use. A model is developed to calculate the risks and costs associated with an inspection strategy, giving credit to the realistic issues of the rail failure process and including the actual inspection and maintenance procedures followed by the railway company. A multi-objective optimisation viewpoint is adopted in an effort to optimise inspection and maintenance procedures with respect to both economical and safety-related aspects. More precisely, the objective functions here considered are such as to drive the search towards solutions characterized by low expenditures and low derailment probability. The optimisation is performed by means of a genetic algorithm. The work has been carried out within a study of the Norwegian National Rail Administration (Jernbaneverket)
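    The multi-objective search described above can be illustrated with a toy model. In the sketch below (Python, illustrative numbers only), a Pareto-dominance filter over randomly sampled inspection intervals stands in for the paper's genetic algorithm, showing how low-expenditure and low-derailment-probability solutions trade off:

```python
import math
import random

# Toy two-objective model (illustrative, not the paper's): a longer inspection
# interval x (months) lowers inspection cost but raises derailment probability.
def objectives(x):
    cost = 100.0 / x + 2.0 * x              # inspection + maintenance expenditure
    risk = 1.0 - math.exp(-0.05 * x)        # derailment probability grows with x
    return (cost, risk)

def dominates(a, b):
    """True if objective vector a is at least as good as b and strictly better once."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

def pareto_front(xs):
    objs = {x: objectives(x) for x in xs}
    return sorted(x for x in xs
                  if not any(dominates(objs[y], objs[x]) for y in xs if y != x))

random.seed(0)
candidates = [random.uniform(1.0, 24.0) for _ in range(200)]
front = pareto_front(candidates)
print(len(front), "non-dominated inspection intervals")
```

    A real GA would evolve the candidate set between dominance filters; the filter itself is what turns the two objective functions into a set of trade-off solutions rather than a single optimum.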

  9. Exploration of automatic optimisation for CUDA programming

    KAUST Repository

    Al-Mouhamed, Mayez; Khan, Ayaz ul Hassan

    2014-01-01

    © 2014 Taylor & Francis. Writing an optimised compute unified device architecture (CUDA) program for graphics processing units (GPUs) is complex even for experts. We present a design methodology for a restructuring tool that converts C-loops into optimised CUDA kernels based on a three-step algorithm: loop tiling, coalesced memory access, and resource optimisation. A method for finding possible loop tiling solutions with coalesced memory access is developed, and a simplified algorithm for restructuring C-loops into an efficient CUDA kernel is presented. In the evaluation, we implement matrix multiply (MM), matrix transpose (M-transpose), matrix scaling (M-scaling) and matrix vector multiply (MV) using the proposed algorithm. We present the analysis of the execution time and GPU throughput for the above applications, which compare favourably to other proposals. Evaluation is carried out while scaling the problem size and running under a variety of kernel configurations. The obtained speedup is about 28-35% for M-transpose compared to the NVIDIA Software Development Kit, 33% for MV compared to a general-purpose computation on graphics processing unit compiler, and more than 80% for MM and M-scaling compared to CUDA-lite.

  10. Exploration of automatic optimisation for CUDA programming

    KAUST Repository

    Al-Mouhamed, Mayez

    2014-09-16

    © 2014 Taylor & Francis. Writing an optimised compute unified device architecture (CUDA) program for graphics processing units (GPUs) is complex even for experts. We present a design methodology for a restructuring tool that converts C-loops into optimised CUDA kernels based on a three-step algorithm: loop tiling, coalesced memory access, and resource optimisation. A method for finding possible loop tiling solutions with coalesced memory access is developed, and a simplified algorithm for restructuring C-loops into an efficient CUDA kernel is presented. In the evaluation, we implement matrix multiply (MM), matrix transpose (M-transpose), matrix scaling (M-scaling) and matrix vector multiply (MV) using the proposed algorithm. We present the analysis of the execution time and GPU throughput for the above applications, which compare favourably to other proposals. Evaluation is carried out while scaling the problem size and running under a variety of kernel configurations. The obtained speedup is about 28-35% for M-transpose compared to the NVIDIA Software Development Kit, 33% for MV compared to a general-purpose computation on graphics processing unit compiler, and more than 80% for MM and M-scaling compared to CUDA-lite.

  11. Results of the 2010 IGSC Topical Session on Optimisation

    International Nuclear Information System (INIS)

    Bailey, Lucy

    2014-01-01

    The topical session reflected the diversity of optimisation goals that may be pursued in the framework of a geological disposal programme. While optimisation of protection, as defined by ICRP, is regarded as a process to keep the magnitude of individual doses, the number of people exposed, and the likelihood of potential exposure as low as reasonably achievable with economic and social factors being taken into account, optimisation can also be seen as a way of increasing the technical quality and robustness of the whole waste management process. An optimal solution means addressing safety requirements whilst balancing other factors such as the need to use resources efficiently, political and acceptance issues and any other boundary conditions imposed by society. It was noted that optimisation variables are not well defined and could be quite programme-specific. However, the discussion showed a lot of agreement and consensus of views. In particular, the summary noted general agreement on the following points: - Optimisation is a process that can be checked and reviewed and needs to be transparent. Optimisation is therefore a learning process, and as such can contribute to building confidence in the safety case by the demonstration of ongoing learning across the organisation. - Optimisation occurs at each stage of the disposal facility development programme, and is therefore forward looking rather than focussed on re-examining past decisions. Optimisation should be about the right way forward at each stage, making the best decisions to move forward from the present situation based on current knowledge and understanding. - Regulators need to be clear about their requirements and these requirements become constraints on the optimisation process, together with any societal constraints that may be applied in certain programmes. Optimisation therefore requires a permanent dialogue between regulator and implementer. - Once the safety objectives (dose/risk targets and other

  12. A Bayesian Approach for Sensor Optimisation in Impact Identification

    Directory of Open Access Journals (Sweden)

    Vincenzo Mallardo

    2016-11-01

    This paper presents a Bayesian approach for optimising the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination aimed at locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both uniform and non-uniform probabilities of impact occurrence.
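    A hedged sketch of the sensor-placement idea follows: the geometry and numbers are hypothetical (a 1-D panel rather than the paper's composite structure), exhaustive search over combinations replaces the genetic algorithm, and the Bayesian meta-model objective is reduced to an expected localisation error weighted by single-sensor failure probabilities:

```python
import itertools

# Hypothetical 1-D panel: sensors at fixed positions localise an impact by
# nearest-sensor distance; each sensor may fail with probability p_fail.
positions = [0.1, 0.3, 0.5, 0.7, 0.9]
impacts = [i / 20 for i in range(21)]   # candidate impact sites on [0, 1]
p_fail = 0.1

def expected_error(combo):
    """Mean localisation error, averaged over impacts and over the no-failure
    plus each single-sensor-failure scenario (higher-order failures ignored)."""
    def err(active):
        if not active:
            return 1.0  # no sensor left: worst-case error
        return sum(min(abs(x - s) for s in active) for x in impacts) / len(impacts)
    total = (1 - p_fail) ** len(combo) * err(combo)
    for s in combo:
        others = tuple(t for t in combo if t != s)
        total += p_fail * (1 - p_fail) ** (len(combo) - 1) * err(others)
    return total

best = min(itertools.combinations(positions, 3), key=expected_error)
print("best 3-sensor combination:", best)
```

    Including the failure scenarios in the objective is what pushes the search away from layouts that localise well only while every sensor works.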

  13. Particle swarm optimisation classical and quantum perspectives

    CERN Document Server

    Sun, Jun; Wu, Xiao-Jun

    2016-01-01

    Introduction: Optimisation Problems and Optimisation Methods; Random Search Techniques; Metaheuristic Methods; Swarm Intelligence. Particle Swarm Optimisation: Overview; Motivations; PSO Algorithm: Basic Concepts and the Procedure; Paradigm: How to Use PSO to Solve Optimisation Problems; Some Harder Examples. Some Variants of Particle Swarm Optimisation: Why Does the PSO Algorithm Need to Be Improved?; Inertia and Constriction-Acceleration Techniques for PSO; Local Best Model; Probabilistic Algorithms; Other Variants of PSO. Quantum-Behaved Particle Swarm Optimisation: Overview; Motivation: From Classical Dynamics to Quantum Mechanics; Quantum Model: Fundamentals of QPSO; QPSO Algorithm; Some Essential Applications; Some Variants of QPSO; Summary. Advanced Topics: Behaviour Analysis of Individual Particles; Convergence Analysis of the Algorithm; Time Complexity and Rate of Convergence; Parameter Selection and Performance; Summary. Industrial Applications: Inverse Problems for Partial Differential Equations; Inverse Problems for Non-Linear Dynamical Systems; Optimal De...
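    The basic global-best PSO procedure covered by the book can be sketched in a few lines. This is a generic textbook-style implementation with inertia weight w and acceleration coefficients c1, c2 (parameter values are assumptions, and the code is not taken from the book):

```python
import random

# Minimal global-best PSO: minimise f on [-5, 5]^dim.
def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]              # each particle's best-so-far position
    gbest = min(pbest, key=f)[:]            # swarm's best-so-far position
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                # velocity: inertia + cognitive pull (pbest) + social pull (gbest)
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - x[d])
                            + c2 * rng.random() * (gbest[d] - x[d]))
                x[d] += vs[i][d]
            if f(x) < f(pbest[i]):
                pbest[i] = x[:]
                if f(x) < f(gbest):
                    gbest = x[:]
    return gbest

sphere = lambda x: sum(xi * xi for xi in x)
best = pso(sphere)
print(best, sphere(best))
```

    The quantum-behaved variant (QPSO) discussed in the book replaces the velocity update with a position sampled from a quantum-potential-well distribution, but the pbest/gbest bookkeeping is the same.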

  14. Topology optimised wavelength dependent splitters

    DEFF Research Database (Denmark)

    Hede, K. K.; Burgos Leon, J.; Frandsen, Lars Hagedorn

    A photonic crystal wavelength dependent splitter has been constructed by utilising topology optimisation [1]. The splitter has been fabricated in a silicon-on-insulator material (Fig. 1). The topology optimised wavelength dependent splitter demonstrates promising 3D FDTD simulation results. This complex photonic crystal structure is very sensitive to small fabrication variations from the expected topology optimised design. A wavelength dependent splitter is an important basic building block for high-performance nanophotonic circuits. [1] J. S. Jensen and O. Sigmund, Appl. Phys. Lett. 84, 2022...

  15. Selection of bioprocess simulation software for industrial applications.

    Science.gov (United States)

    Shanklin, T; Roper, K; Yegneswaran, P K; Marten, M R

    2001-02-20

    Two commercially available process-simulation software packages (Aspen Batch Plus v1.2, Aspen Technology, Inc., Cambridge, Massachusetts, and Intelligen SuperPro v3.0, Intelligen, Inc., Scotch Plains, New Jersey) are evaluated for use in modeling industrial biotechnology processes. Software is quantitatively evaluated by Kepner-Tregoe Decision Analysis (Kepner and Tregoe, 1981). This evaluation shows that Aspen Batch Plus v1.2 (ABP) and Intelligen SuperPro v3.0 (ISP) can successfully perform specific simulation tasks but do not provide a complete model of all phenomena occurring within a biotechnology process. Software is best suited to provide a format for process management, using material and energy balances to answer scheduling questions, explore equipment change-outs, and calculate cost data. The ability of simulation software to accurately predict unit operation scale-up and optimize bioprocesses is limited. To realistically evaluate the software, a vaccine manufacturing process under development at Merck & Company is simulated. Case studies from the vaccine process are presented as examples of how ABP and ISP can be used to shed light on real-world processing issues. Copyright 2001 John Wiley & Sons, Inc.
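    At its core, Kepner-Tregoe Decision Analysis reduces to a weighted decision matrix: each alternative's criterion scores are multiplied by criterion weights and summed. A minimal sketch with hypothetical weights and scores (not the paper's actual evaluation data):

```python
# Kepner-Tregoe-style weighted decision matrix. All weights and scores below
# are hypothetical illustrations, not the paper's evaluation.
criteria = {                       # criterion: weight (1-10 scale)
    "material/energy balances": 10,
    "scheduling support": 8,
    "cost estimation": 7,
    "scale-up prediction": 5,
}
scores = {                         # alternative: satisfaction score per criterion (1-10)
    "Aspen Batch Plus": {"material/energy balances": 9, "scheduling support": 8,
                         "cost estimation": 7, "scale-up prediction": 3},
    "Intelligen SuperPro": {"material/energy balances": 8, "scheduling support": 7,
                            "cost estimation": 8, "scale-up prediction": 4},
}

def weighted_total(alt):
    # sum over criteria of weight * score
    return sum(criteria[c] * scores[alt][c] for c in criteria)

for alt in scores:
    print(alt, weighted_total(alt))
```

    The method's value is less the arithmetic than the forced step of making weights and scores explicit, so the ranking can be audited and the sensitivity to any one weight checked.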

  16. Optimisation of BPMN Business Models via Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for the optimisation of business processes modelled in the business process modelling language BPMN, which builds upon earlier work, where we developed a model checking based method for the analysis of BPMN models. We define a structure for expressing optimisation goals for synthesized BPMN components, based on probabilistic computation tree logic and real-valued reward structures of the BPMN model, allowing for the specification of complex quantitative goals. We here present a simple algorithm, inspired by concepts from evolutionary algorithms, which iteratively generates...

  17. Optimisation models for decision support in the development of biomass-based industrial district-heating networks in Italy

    International Nuclear Information System (INIS)

    Chinese, Damiana; Meneghetti, Antonella

    2005-01-01

    A system optimisation approach is proposed to design biomass-based district-heating networks in the context of industrial districts, which are one of the main successful productive aspects of Italian industry. Two different perspectives are taken into account, that of utilities and of policy makers, leading to two optimisation models to be further integrated. A mixed integer linear-programming model is developed for a utility company's profit maximisation, while a linear-programming model aims at minimising the balance of greenhouse-gas emissions related to the proposed energy system and the avoided emissions due to the substitution of current fossil-fuel boilers with district-heating connections. To systematically compare their results, a sensitivity analysis is performed with respect to network size in order to identify how the optimal system configuration, in terms of selected boilers to be connected to a multiple energy-source network, may vary in the two cases and to detect possible optimal sizes. Then a factorial analysis is adopted to rank desirable client types under the two perspectives and identify proper marketing strategies. The proposed optimisation approach was applied to the design of a new district-heating network in the chair-manufacturing district of North-Eastern Italy. (Author)
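    The utility-side model above is a mixed integer linear programme: a binary connect/don't-connect decision per boiler under a heat-supply constraint, maximising profit. A toy stand-in follows (exhaustive search instead of a MILP solver; all demand and profit figures are hypothetical):

```python
from itertools import combinations

# Toy stand-in for the utility's MILP: choose which clients' boilers to connect
# to the biomass district-heating network to maximise profit under a heat cap.
boilers = {            # name: (annual heat demand MWh, annual profit kEUR)
    "chair_factory_A": (400, 30),
    "chair_factory_B": (650, 45),
    "warehouse":       (200, 12),
    "office_block":    (300, 25),
    "paint_shop":      (550, 33),
}
capacity = 1200        # MWh of biomass heat available per year

def best_selection():
    best, best_profit = (), 0
    names = list(boilers)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            demand = sum(boilers[n][0] for n in combo)
            profit = sum(boilers[n][1] for n in combo)
            if demand <= capacity and profit > best_profit:
                best, best_profit = combo, profit
    return best, best_profit

selection, profit = best_selection()
print(sorted(selection), profit)
```

    The paper's second (policy-maker) model would swap the objective for a greenhouse-gas emission balance over the same binary connection variables, which is why the two models can share a structure and be compared on identical network sizes.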

  18. Multiobjective optimisation of bogie suspension to boost speed on curves

    Science.gov (United States)

    Milad Mousavi-Bideleh, Seyed; Berbyuk, Viktor

    2016-01-01

    To improve safety and maximum admissible speed on different operational scenarios, multiobjective optimisation of bogie suspension components of a one-car railway vehicle model is considered. The vehicle model has 50 degrees of freedom and is developed in the multibody dynamics software SIMPACK. Track shift force, running stability, and risk of derailment are selected as safety objective functions. The improved maximum admissible speeds of the vehicle on curves are determined based on track plane accelerations up to 1.5 m/s². To reduce the number of design parameters for optimisation and improve the computational efficiency, a global sensitivity analysis is accomplished using the multiplicative dimensional reduction method (M-DRM). A multistep optimisation routine based on a genetic algorithm (GA) and MATLAB/SIMPACK co-simulation is executed at three levels. The bogie conventional secondary and primary suspension components are chosen as the design parameters in the first two steps, respectively. In the last step, semi-active suspension is in focus. The input electrical current to magnetorheological yaw dampers is optimised to guarantee an appropriate safety level. Semi-active controllers are also applied and the respective effects on bogie dynamics are explored. The safety Pareto optimised results are compared with those associated with in-service values. The global sensitivity analysis and multistep approach significantly reduced the number of design parameters and improved the computational efficiency of the optimisation. Furthermore, using the optimised values of the design parameters makes it possible to run the vehicle up to 13% faster on curves while a satisfactory safety level is guaranteed. The results obtained can be used in Pareto optimisation and active bogie suspension design problems.

  19. Integrated continuous bioprocessing: Economic, operational, and environmental feasibility for clinical and commercial antibody manufacture.

    Science.gov (United States)

    Pollock, James; Coffman, Jon; Ho, Sa V; Farid, Suzanne S

    2017-07-01

    This paper presents a systems approach to evaluating the potential of integrated continuous bioprocessing for monoclonal antibody (mAb) manufacture across a product's lifecycle from preclinical to commercial manufacture. The economic, operational, and environmental feasibility of alternative continuous manufacturing strategies were evaluated holistically using a prototype UCL decisional tool that integrated process economics, discrete-event simulation, environmental impact analysis, operational risk analysis, and multiattribute decision-making. The case study focused on comparing whole bioprocesses that used either batch, continuous or a hybrid combination of batch and continuous technologies for cell culture, capture chromatography, and polishing chromatography steps. The cost of goods per gram (COG/g), E-factor, and operational risk scores of each strategy were established across a matrix of scenarios with differing combinations of clinical development phase and company portfolio size. The tool outputs predict that the optimal strategy for early phase production and small/medium-sized companies is the integrated continuous strategy (alternating tangential flow filtration (ATF) perfusion, continuous capture, continuous polishing). However, the top ranking strategy changes for commercial production and companies with large portfolios to the hybrid strategy with fed-batch culture, continuous capture and batch polishing from a COG/g perspective. The multiattribute decision-making analysis highlighted that if the operational feasibility was considered more important than the economic benefits, the hybrid strategy would be preferred for all company scales. Further considerations outside the scope of this work include the process development costs required to adopt continuous processing. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:854-866, 2017.

  20. Optimising neutron polarizers--measuring the flipping ratio and related quantities

    CERN Document Server

    Goossens, D J

    2002-01-01

    The continuing development of gaseous spin-polarized ³He transmission filters for use as neutron polarizers makes the choice of optimum thickness for these filters an important consideration. The 'quality factors' derived for the optimisation of transmission filters for particular measurements are general to all neutron polarizers. In this work optimisation conditions for neutron polarizers are derived and discussed for the family of studies related to measuring the flipping ratio from samples. The application of the optimisation conditions to ³He transmission filters and other types of neutron polarizers is discussed. Absolute comparisons are made between the effectiveness of different types of polarizers for this sort of work.
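    For ³He transmission filters specifically, the thickness trade-off can be made concrete with the standard spin-filter relations (assumed here as background, not taken from this paper): neutron polarisation P_n = tanh(O·P_He) and transmission T_n ∝ e^(−O)·cosh(O·P_He) for cell opacity O and helium polarisation P_He, so a figure of merit such as P_n²·T_n has an optimum opacity that can be located numerically:

```python
import math

# Standard 3He neutron spin-filter relations (an assumption of this sketch):
# P_n = tanh(O * P_He), T_n = exp(-O) * cosh(O * P_He) for cell opacity O.
# Figure of merit Q = P_n**2 * T_n trades polarisation against transmission.
def filter_fom(O, P_He=0.7):
    P_n = math.tanh(O * P_He)
    T_n = math.exp(-O) * math.cosh(O * P_He)
    return P_n * P_n * T_n

# Coarse scan of opacity to locate the optimum filter thickness.
best_O = max((o / 100 for o in range(1, 601)), key=filter_fom)
print(round(best_O, 2), round(filter_fom(best_O), 3))
```

    A thin cell transmits well but barely polarises the beam; a thick cell polarises strongly but absorbs most neutrons, which is why the figure of merit peaks at an intermediate opacity.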

  1. Origins of Cell-to-Cell Bioprocessing Diversity and Implications of the Extracellular Environment Revealed at the Single-Cell Level.

    Science.gov (United States)

    Vasdekis, A E; Silverman, A M; Stephanopoulos, G

    2015-12-14

    Bioprocess limitations imposed by microbial cell-to-cell phenotypic diversity remain poorly understood. To address this, we investigated the origins of such culture diversity during lipid production and assessed the impact of the fermentation microenvironment. We measured the single-cell lipid production dynamics in a time-invariant microfluidic environment and discovered that production is not monotonic, but rather sporadic with time. To characterize this, we introduce bioprocessing noise and identify its epigenetic origins. We linked such intracellular production fluctuations with cell-to-cell productivity diversity in culture. This unmasked the phenotypic diversity amplification by the culture microenvironment, a critical parameter in strain engineering as well as metabolic disease treatment.

  2. Turmeric Bioprocessed with Mycelia from the Shiitake Culinary-Medicinal Mushroom Lentinus edodes (Agaricomycetes) Protects Mice Against Salmonellosis.

    Science.gov (United States)

    Kim, Sung Phil; Lee, Sang Jong; Nam, Seok Hyun; Friedman, Mendel

    2017-01-01

    This study investigated the suppressive mechanisms of an extract from bioprocessed Lentinus edodes mycelial liquid culture supplemented with turmeric (bioprocessed Curcuma longa extract [BPCLE]) against murine salmonellosis. The BPCLE extract from the bioprocessed mycelia led to suppression of the invasion of Salmonella Typhimurium into murine RAW 264.7 macrophage cells, elimination of intracellular bacteria, and elevation of inducible nitric oxide synthase expression. Dietary administration of BPCLE activated leukocytes from the mice infected with Salmonella through the intraperitoneal route. The enzyme-linked immunosorbent assay of the cytokines produced by splenocytes from infected mice showed significant increases in the levels of Th1 cytokines, including interleukin (IL)-1β, IL-2, IL-6, and IL-12. Histology showed that dietary administration of BPCLE protected against necrosis of the liver resulting from a sublethal dose of Salmonella. In addition, the treatment (1) extended the lifespan of lethally infected mice, (2) suppressed the invasion of Salmonella into human Caco-2 colorectal adenocarcinoma cells, (3) increased excretion of the bacterium in the feces, (4) suppressed the translocation of Salmonella to internal organs, and (5) increased total immunoglobulin A in both serum and intestinal fluids. BPCLE protected the mice against salmonellosis via cooperative effects that include upregulation of the Th1 immune reaction, prevention of translocation of bacteria across the intestinal epithelial cells, and increased immunoglobulin A production in serum and intestinal fluids.

  3. Implementation and use of cloud-based electronic lab notebook in a bioprocess engineering teaching laboratory.

    Science.gov (United States)

    Riley, Erin M; Hattaway, Holly Z; Felse, P Arthur

    2017-01-01

    Electronic lab notebooks (ELNs) are better equipped than paper lab notebooks (PLNs) to handle present-day life science and engineering experiments that generate large data sets and require high levels of data integrity. But limited training and a lack of workforce with ELN knowledge have restricted the use of ELNs in academic and industry research laboratories, which still rely on cumbersome PLNs for recordkeeping. We used LabArchives, a cloud-based ELN, in our bioprocess engineering lab course to train students in electronic record keeping, good documentation practices (GDPs), and data integrity. Implementation of ELN in the bioprocess engineering lab course, an analysis of user experiences, and our development actions to improve ELN training are presented here. ELN improved pedagogy and learning outcomes of the lab course through streamlined workflow, quick data recording and archiving, and enhanced data sharing and collaboration. It also enabled superior data integrity, simplified information exchange, and allowed real-time and remote monitoring of experiments. Several attributes related to positive user experiences of ELN improved between the two subsequent years in which ELN was offered. Student responses also indicate that ELN is better than PLN for compliance. We demonstrated that ELN can be successfully implemented in a lab course with significant benefits to pedagogy, GDP training, and data integrity. The methods and processes presented here for ELN implementation can be adapted to many types of laboratory experiments.

  4. Advances in Consolidated Bioprocessing Using Clostridium thermocellum and Thermoanaerobacter saccharolyticum

    Energy Technology Data Exchange (ETDEWEB)

    Lynd, Lee R. [Dartmouth College, Thayer School of Engineering; Guss, Adam M. [ORNL; Himmel, Mike [National Renewable Energy Laboratory (NREL); Beri, Dhananjay [Dartmouth College, Thayer School of Engineering; Herring, Christopher [Mascoma Corporation; Holwerda, Evert [Dartmouth College, Thayer School of Engineering; Murphy, Sean J. [Dartmouth College, Thayer School of Engineering; Olson, Daniel G. [Dartmouth College, Thayer School of Engineering; Paye, Julie [Dartmouth College, Thayer School of Engineering; Rydzak, Thomas [ORNL; Shao, Xiongjun [Dartmouth College, Thayer School of Engineering; Tian, Liang [Dartmouth College, Thayer School of Engineering; Worthen, Robert [Dartmouth College, Thayer School of Engineering

    2016-11-01

    Recent advances are addressed pertaining to consolidated bioprocessing (CBP) of plant cell walls to ethanol using two thermophilic, saccharolytic bacteria: the cellulose-fermenting Clostridium thermocellum and the hemicellulose-fermenting Thermoanaerobacterium saccharolyticum. On the basis of the largest comparative dataset assembled to date, it appears that C. thermocellum is substantially more effective at solubilizing unpretreated plant cell walls than industry-standard fungal cellulase, and that this is particularly the case for more recalcitrant feedstocks. The distinctive central metabolism of C. thermocellum appears to involve more extensive energy coupling (e.g., on the order of 5 ATP per glucosyl moiety) than most fermentative anaerobes. Ethanol yields and titers realized by engineered strains of T. saccharolyticum meet standards for industrial feasibility and provide an important proof of concept as well as a model that may be emulated in other organisms. Progress has also been made with C. thermocellum, although not yet to this extent. The current state of strain development is summarized and outstanding challenges for commercial application are discussed. We speculate that CBP organism development is more promising starting with naturally occurring cellulolytic microbes as compared to starting with noncellulolytic hosts.

  5. Optimising Shovel-Truck Fuel Consumption using Stochastic ...

    African Journals Online (AJOL)

    Optimising the fuel consumption and truck waiting time can result in significant fuel savings. The paper demonstrates that stochastic simulation is an effective tool for optimising the utilisation of fossil-based fuels in mining and related industries. Keywords: Stochastic, Simulation Modelling, Mining, Optimisation, Shovel-Truck ...
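
The stochastic simulation approach this record advocates can be sketched as a toy Monte Carlo model of trucks queueing at a single shovel, from which waiting time and idling fuel are estimated. All parameter names and values below are illustrative assumptions, not data from the paper:

```python
import random

def simulate_shift(n_trucks=8, load_time=3.0, haul_time=12.0,
                   idle_fuel_lph=20.0, shift_minutes=480, seed=1):
    """Toy shovel-truck simulation: trucks cycle between hauling and a
    single shovel with exponential load/haul times.
    Returns (mean wait per cycle [min], fuel burned while idling [L])."""
    rng = random.Random(seed)
    arrivals = [0.0] * n_trucks   # next time each truck reaches the shovel
    shovel_free = 0.0
    total_wait, cycles = 0.0, 0
    while min(arrivals) < shift_minutes:
        i = min(range(n_trucks), key=lambda k: arrivals[k])
        t = arrivals[i]
        start = max(t, shovel_free)          # truck waits if shovel is busy
        total_wait += start - t
        shovel_free = start + rng.expovariate(1.0 / load_time)
        arrivals[i] = shovel_free + rng.expovariate(1.0 / haul_time)
        cycles += 1
    return total_wait / cycles, total_wait / 60.0 * idle_fuel_lph
```

Running such a model for different fleet sizes is one way to trade truck waiting time against shovel idle time, in the spirit of the record.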

  6. Development of the hard and soft constraints based optimisation model for unit sizing of the hybrid renewable energy system designed for microgrid applications

    Science.gov (United States)

    Sundaramoorthy, Kumaravel

    2017-02-01

    Electricity generation based on hybrid energy systems (HESs) has become an attractive solution for rural electrification. Economically feasible and technically reliable HESs rest on an optimisation stage. This article presents an optimal unit sizing model whose objective function minimises the total cost of the HES. Three typical rural sites from the southern part of India have been selected for the application of the developed optimisation methodology. Feasibility studies and sensitivity analysis on the optimal HES are discussed elaborately in this article. A comparison has been carried out with the Hybrid Optimization Model for Electric Renewable (HOMER) optimisation model for the three sites. The optimal HES is found with lower total net present cost and cost of energy compared with the existing method.
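
A minimal sketch of the kind of unit-sizing search described in this record: enumerate unit counts and pick the cheapest mix that covers a load, with diesel making up any shortfall. All costs and capacities are made-up illustrative numbers; a real model, as in the article, would simulate hourly load and resource profiles rather than a single load figure:

```python
from itertools import product

def size_hes(load_kw=50.0, pv_kw=5.0, wind_kw=10.0,
             pv_cost=3000.0, wind_cost=8000.0, diesel_cost_per_kw=1000.0):
    """Enumerate PV/wind unit counts; diesel covers any residual load.
    Returns (n_pv, n_wind, total_cost) with minimum total cost."""
    best = None
    for n_pv, n_wind in product(range(11), range(6)):
        renewable = n_pv * pv_kw + n_wind * wind_kw
        residual = max(0.0, load_kw - renewable)      # met by diesel
        cost = (n_pv * pv_cost + n_wind * wind_cost
                + residual * diesel_cost_per_kw)
        if best is None or cost < best[2]:
            best = (n_pv, n_wind, cost)
    return best
```

With these toy per-kW costs (PV cheapest), the search naturally selects an all-PV mix; sensitivity analysis amounts to re-running the search over perturbed cost inputs.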

  7. Optimisation in X-ray and Molecular Imaging 2015

    International Nuclear Information System (INIS)

    Baath, Magnus; Hoeschen, Christoph; Mattsson, Soeren; Mansson, Lars Gunnar

    2016-01-01

    This issue of Radiation Protection Dosimetry is based on contributions to Optimisation in X-ray and Molecular Imaging 2015 - the 4. Malmoe Conference on Medical Imaging (OXMI 2015). The conference was jointly organised by members of former and current research projects supported by the European Commission EURATOM Radiation Protection Research Programme, in cooperation with the Swedish Society for Radiation Physics. The conference brought together over 150 researchers and other professionals from hospitals, universities and industries with interests in different aspects of the optimisation of medical imaging. More than 100 presentations were given at this international gathering of medical physicists, radiologists, engineers, technicians, nurses and educational researchers. Additionally, invited talks were offered by world-renowned experts on radiation protection, spectral imaging and medical image perception, thus covering several important aspects of the generation and interpretation of medical images. The conference consisted of 13 oral sessions and a poster session which, as reflected by the conference title, were connected by their focus on the optimisation of the use of ionising radiation in medical imaging. The conference included technology-specific topics such as computed tomography and tomosynthesis, but also generic issues of interest for the optimisation of all medical imaging, such as image perception and quality assurance. Radiation protection was covered by, e.g., sessions on patient dose benchmarking and occupational exposure. Technically-advanced topics such as modelling, Monte Carlo simulation, reconstruction, classification, and segmentation were seen taking advantage of recent developments of hardware and software, showing that the optimisation community is at the forefront of technology and adapts well to new requirements. These peer-reviewed proceedings, representing a continuation of a series of selected reports from meetings in the field of medical imaging

  8. Turbulence optimisation in stellarator experiments

    Energy Technology Data Exchange (ETDEWEB)

    Proll, Josefine H.E. [Max-Planck/Princeton Center for Plasma Physics (Germany); Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstr. 1, 17491 Greifswald (Germany); Faber, Benjamin J. [HSX Plasma Laboratory, University of Wisconsin-Madison, Madison, WI 53706 (United States); Helander, Per; Xanthopoulos, Pavlos [Max-Planck/Princeton Center for Plasma Physics (Germany); Lazerson, Samuel A.; Mynick, Harry E. [Plasma Physics Laboratory, Princeton University, P.O. Box 451 Princeton, New Jersey 08543-0451 (United States)

    2015-05-01

    Stellarators, the twisted siblings of the axisymmetric fusion experiments called tokamaks, have historically suffered from insufficient confinement of the plasma heat compared with tokamaks and were therefore considered to be less promising candidates for a fusion reactor. This has changed, however, with the advent of stellarators in which the neoclassical transport is reduced to levels below that of tokamaks by shaping the magnetic field accordingly. As in tokamaks, the turbulent transport then remains as the dominant transport channel. Recent analytical theory suggests that the large configuration space of stellarators allows for an additional optimisation of the magnetic field to also reduce the turbulent transport. In this talk, the idea behind the turbulence optimisation is explained. We also present how an optimised equilibrium is obtained and how it might differ from the equilibrium field of an already existing device, and we compare experimental turbulence measurements in different configurations of the HSX stellarator in order to test the optimisation procedure.

  9. OPTIMISATION OF COMPRESSIVE STRENGTH OF PERIWINKLE ...

    African Journals Online (AJOL)

    In this paper, a regression model is developed to predict and optimise the compressive strength of periwinkle shell aggregate concrete using Scheffe's regression theory. The results obtained from the derived regression model agreed favourably with the experimental data. The model was tested for adequacy using a student ...

  10. Layout Optimisation of Wave Energy Converter Arrays

    DEFF Research Database (Denmark)

    Ruiz, Pau Mercadé; Nava, Vincenzo; Topper, Mathew B. R.

    2017-01-01

    This paper proposes an optimisation strategy for the layout design of wave energy converter (WEC) arrays. Optimal layouts are sought so as to maximise the absorbed power given a minimum q-factor, the minimum distance between WECs, and an area of deployment. To guarantee an efficient optimisation......, a four-parameter layout description is proposed. Three different optimisation algorithms are further compared in terms of performance and computational cost. These are the covariance matrix adaptation evolution strategy (CMA), a genetic algorithm (GA) and the glowworm swarm optimisation (GSO) algorithm...

  11. Topology optimisation of natural convection problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Aage, Niels; Andreasen, Casper Schousboe

    2014-01-01

    This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equations...... coupled to the convection-diffusion equation through the Boussinesq approximation. In order to facilitate topology optimisation, the Brinkman approach is taken to penalise velocities inside the solid domain and the effective thermal conductivity is interpolated in order to accommodate differences...... in thermal conductivity of the solid and fluid phases. The governing equations are discretised using stabilised finite elements and topology optimisation is performed for two different problems using discrete adjoint sensitivity analysis. The study shows that topology optimisation is a viable approach...

  12. Optimisation of decision making under uncertainty throughout field lifetime: A fractured reservoir example

    Science.gov (United States)

    Arnold, Dan; Demyanov, Vasily; Christie, Mike; Bakay, Alexander; Gopa, Konstantin

    2016-10-01

    Assessing the change in uncertainty in reservoir production forecasts over field lifetime is rarely undertaken because of the complexity of joining together the individual workflows. This becomes particularly important in complex fields such as naturally fractured reservoirs. The impact of this problem has been identified in previous studies, and many solutions have been proposed but never implemented on complex reservoir problems due to the computational cost of quantifying uncertainty and optimising the reservoir development, specifically knowing how many and what kind of simulations to run. This paper demonstrates a workflow that propagates uncertainty throughout field lifetime, and into the decision making process, by a combination of a metric-based approach, multi-objective optimisation and Bayesian estimation of uncertainty. The workflow propagates uncertainty estimates from appraisal into initial development optimisation, then updates uncertainty through history matching and finally propagates it into late-life optimisation. The combination of techniques applied, namely the metric approach and multi-objective optimisation, helps evaluate development options under uncertainty. This was achieved with a significantly reduced number of flow simulations, such that the combined workflow is computationally feasible to run for a real-field problem. This workflow is applied to two synthetic naturally fractured reservoir (NFR) case studies in appraisal, field development, history matching and mid-life EOR stages. The first is a simple sector model, while the second is a more complex full-field example based on a real-life analogue. This study infers geological uncertainty from an ensemble of models based on a Brazilian carbonate outcrop, which are propagated through the field lifetime, before and after the start of production, with the inclusion of production data significantly collapsing the spread of P10-P90 in reservoir forecasts. The workflow links uncertainty
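
The P10-P90 spread used in this record to summarise forecast uncertainty can be computed directly from an ensemble of simulated forecasts. A small stdlib sketch (note that percentile interpolation conventions, and whether P10 denotes the high or low case, vary between tools):

```python
def percentile(samples, p):
    """Linear-interpolated percentile (p in [0, 100]) of a list of numbers."""
    s = sorted(samples)
    if len(s) == 1:
        return s[0]
    k = (len(s) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def p10_p90_spread(forecasts):
    """10th and 90th percentiles of an ensemble of production forecasts,
    plus their spread - the band that history matching should collapse."""
    p10 = percentile(forecasts, 10)
    p90 = percentile(forecasts, 90)
    return p10, p90, p90 - p10
```

Re-evaluating the spread before and after conditioning the ensemble to production data quantifies how much the forecast uncertainty has collapsed.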

  13. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised within several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT command optimisation in general, which made the commands approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CernVM-FS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of
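
The package-level build parallelism described in this record can be sketched as grouping packages into dependency levels and compiling each level concurrently. The dependency graph below is a hypothetical example, not ATLAS's real package structure (CMT itself derives the graph from the requirements files):

```python
from concurrent.futures import ThreadPoolExecutor

def build_levels(deps):
    """Group packages into levels buildable in parallel.
    deps maps package -> set of packages it depends on."""
    remaining = dict(deps)
    levels, built = [], set()
    while remaining:
        # a package is ready once all its dependencies are built
        ready = sorted(p for p, d in remaining.items() if d <= built)
        if not ready:
            raise ValueError("dependency cycle")
        levels.append(ready)
        built |= set(ready)
        for p in ready:
            del remaining[p]
    return levels

def parallel_build(deps, compile_one):
    """Build each level's packages concurrently, levels in order."""
    for level in build_levels(deps):
        with ThreadPoolExecutor() as pool:
            list(pool.map(compile_one, level))
```

Independent packages within a level never wait on each other, which is where the build-time reduction comes from.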

  14. Dose optimisation in single plane interstitial brachytherapy

    DEFF Research Database (Denmark)

    Tanderup, Kari; Hellebust, Taran Paulsen; Honoré, Henriette Benedicte

    2006-01-01

    BACKGROUND AND PURPOSE: Brachytherapy dose distributions can be optimised by modulation of source dwell times. In this study dose optimisation in single planar interstitial implants was evaluated in order to quantify the potential benefit in patients. MATERIAL AND METHODS: In 14 patients, treated for recurrent rectal and cervical cancer, flexible catheters were sutured intra-operatively to the tumour bed in areas with compromised surgical margin. Both non-optimised, geometrically and graphically optimised CT-based dose plans were made. The overdose index... on the regularity of the implant, such that the benefit of optimisation was larger for irregular implants. OI and HI correlated strongly with target volume, limiting the usability of these parameters for comparison of dose plans between patients. CONCLUSIONS: Dwell time optimisation significantly...

  15. An efficient optimisation method in groundwater resource ...

    African Journals Online (AJOL)

    DRINIE

    2003-10-04

    Oct 4, 2003 ... theories developed in the field of stochastic subsurface hydrology. In reality, many ... Recently, some researchers have applied the multi-stage ... Then a robust solution of the optimisation problem given by Eqs. (1) to (3) is as ...

  16. Topology Optimisation for Coupled Convection Problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    This thesis deals with topology optimisation for coupled convection problems. The aim is to extend and apply topology optimisation to steady-state conjugate heat transfer problems, where the heat conduction equation governs the heat transfer in a solid and is coupled to thermal transport...... in a surrounding uid, governed by a convection-diffusion equation, where the convective velocity field is found from solving the isothermal incompressible steady-state Navier-Stokes equations. Topology optimisation is also applied to steady-state natural convection problems. The modelling is done using stabilised...... finite elements, the formulation and implementation of which was done partly during a special course as prepatory work for this thesis. The formulation is extended with a Brinkman friction term in order to facilitate the topology optimisation of fluid flow and convective cooling problems. The derived...

  17. Trends in Process Analytical Technology: Present State in Bioprocessing.

    Science.gov (United States)

    Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian

    2017-08-04

    Process analytical technology (PAT), the regulatory initiative for incorporating quality in pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, it can increase process understanding and control, and mitigate the risk from substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical Abstract Hierarchy of QbD components.

  18. The Solubility of Cr-Organic Produced by Hydrolysis, Bioprocess and Bioremediation and its Effect on Fermented Rate, Digestibility and Rumen Microbe Population (in vitro

    Directory of Open Access Journals (Sweden)

    UH Tanuwiria

    2010-09-01

    The research was conducted to study the production of organic chromium from leather tanning waste and its effect on in vitro rumen fermentation activities. The research was divided into two phases. The first phase was the production of organic chromium by alkali hydrolysis, S. cerevisiae bioprocess, and duckweed bioremediation, assessing solubility in neutral and acid solutions. The second phase was the supplementation of organic Cr in the ration, evaluated by in vitro fermentation rate, digestibility and rumen microbe population. The research was conducted experimentally using a 4x4 factorial pattern on the basis of a Completely Randomized Design (CRD) with three replications in each experimental unit. The first factor was the type of organic Cr and the second factor was the supplement level in the ration, at 1, 2, 3 and 4 ppm. The results of this research indicated that organic chromium can be synthesized by alkali hydrolysis, S. cerevisiae bioprocess and the activity of duckweed bioremediation. Among the three processes, the highest level of Cr was obtained from the S. cerevisiae bioprocess originating from leather-tanning waste. The levels of organic Cr resulting from alkali hydrolysis, bioprocess from Cl3Cr.6H2O, bioprocess from Cr leather-tanning waste, and duckweed bioremediation were 354, 1011, 3833 and 310 mg/kg, respectively. The organic-Cr products were relatively similar in fermentability, dry matter and organic matter digestibility, and rumen ecosystem. There is an indication that dry matter and organic matter digestibility and rumen microbe population in rations supplemented with organic Cr from alkali hydrolysis were higher than with other supplements. (Animal Production 12(3): 175-183 (2010)) Key Words: organic-Cr, rumen fermentation activities, rumen microbe population

  19. Real-time optimisation of the Hoa Binh reservoir, Vietnam

    DEFF Research Database (Denmark)

    Richaud, Bertrand; Madsen, Henrik; Rosbjerg, Dan

    2011-01-01

    Multi-purpose reservoirs often have to be managed according to conflicting objectives, which requires efficient tools for trading-off the objectives. This paper proposes a multi-objective simulation-optimisation approach that couples off-line rule curve optimisation with on-line real-time optimisation. First, the simulation-optimisation framework is applied for optimising reservoir operating rules. Secondly, real-time and forecast information is used for on-line optimisation that focuses on short-term goals, such as flood control or hydropower generation, without compromising the deviation... in the downstream part of the Red River, and at the same time to increase hydropower generation and to save water for the dry season. The real-time optimisation procedure further improves the efficiency of the reservoir operation and enhances the flexibility for the decision-making. Finally, the quality...

  20. Turmeric bioprocessed with mycelia from the shiitake culinary-medicinal mushroom lentinus edodes (agaricomycetes) protects mice against salmonellosis

    Science.gov (United States)

    Extracts of the shiitake mushroom Lentinus edodes and the spice turmeric (Curcuma longa) have both been reported to have health-promoting properties. The present study investigated the suppressive mechanisms of a bioprocessed Lentinus edodes liquid mushroom mycelia culture supplemented with turmeric ...

  1. Efficient topology optimisation of multiscale and multiphysics problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    The aim of this Thesis is to present efficient methods for optimising high-resolution problems of a multiscale and multiphysics nature. The Thesis consists of two parts: one treating topology optimisation of microstructural details and the other treating topology optimisation of conjugate heat...

  2. MANAGEMENT OPTIMISATION OF MASS CUSTOMISATION MANUFACTURING USING COMPUTATIONAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    Louwrens Butler

    2018-05-01

    Computational intelligence paradigms can be used for advanced manufacturing system optimisation. A static simulation model of an advanced manufacturing system was developed in order to simulate the system's behaviour. The purpose of this advanced manufacturing system was to mass-produce a customisable product range at a competitive cost. The aim of this study was to determine whether this new algorithm could produce a better performance than traditional optimisation methods. The algorithm produced a lower-cost plan than a simulated annealing algorithm, and had a lower impact on the workforce.

  3. Methods for Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte

    This thesis deals with the adaptation and implementation of various optimisation methods, in the field of experimental design, for the laser cutting process. The problem of optimising the laser cutting process has been defined and a structure for a Decision Support System (DSS...) for the optimisation of the laser cutting process has been suggested. The DSS consists of a database with the currently used and old parameter settings. One of the optimisation methods has also been implemented in the DSS in order to facilitate the optimisation procedure for the laser operator. The Simplex Method has... been adapted in two versions: a qualitative one, which optimises the process by comparing the laser-cut items, and a quantitative one, which uses a weighted quality response in order to achieve a satisfactory quality and then maximises the cutting speed, thus increasing the productivity of the process...
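
The quantitative variant described in this record (combine quality responses into a weighted score, then take the highest cutting speed that remains acceptable) can be sketched as follows. The quality model, weights and threshold below are hypothetical stand-ins for measured cut quality, not values from the thesis:

```python
def weighted_quality(responses, weights):
    """Combine quality responses (each normalised to 0-1, higher is
    better) into a single weighted score."""
    return sum(w * r for w, r in zip(weights, responses)) / sum(weights)

def best_cutting_speed(speeds, quality_model, weights, threshold=0.8):
    """Highest candidate speed whose weighted quality stays acceptable.
    quality_model(speed) returns a tuple of quality responses."""
    feasible = [v for v in speeds
                if weighted_quality(quality_model(v), weights) >= threshold]
    return max(feasible) if feasible else None
```

In practice the quality model would come from cut trials recorded in the DSS database rather than a closed-form function.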

  4. Design of optimised backstepping controller for the synchronisation ...

    Indian Academy of Sciences (India)

    Ehsan Fouladi

    2017-12-18

    Dec 18, 2017 ... for the proposed optimised method compared to PSO optimised controller or any non-optimised backstepping controller. Keywords. Colpitts oscillator; backstepping controller; chaos synchronisation; shark smell algorithm; particle .... The velocity model is based on the gradient of the objective function, tilting ...

  5. Development and optimisation of generic decommissioning strategies for civil Magnox reactors

    International Nuclear Information System (INIS)

    Carpenter, G.; Hebditch, D.; Meek, N.; Patel, A.; Reeve, P.

    2004-01-01

    BNFL Environmental Services has formulated updated proposals for the use of decision analysis in the development of decommissioning strategy. The proposals are based on the Department for Transport, Local Government and the Regions manual for practitioners on multi-criteria analysis, specifically multi-criteria decision analysis, as suited to complex problems with a mixture of monetary and non-monetary objectives. They take account of up-to-date academic methodology, the newly issued BNFL decision analysis framework for environmental decisions and a wide variety of other engineering, optioneering and optimisation processes. The paper also summarises legislative and company policy areas of importance to decommissioning strategy development. Higher-level generic reactor and site remediation strategies already exist. At the lower level, various generic decommissioning reference processes and project options need development. For the past year, Environmental Services has held responsibility to respond to the Nuclear Installations Inspectorate's quinquennial review, develop and maintain up-to-date strategies, institute the review of a selected number of key strategies, and respond to changing circumstances including stakeholder views. Environmental Services is performing a range of generic studies for selection of strategies and end-points as used for a variety of waste management and site care and maintenance preparations. (author)
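
The multi-criteria decision analysis referred to in this record, in its simplest linear-additive form, scores each option as a weighted sum of normalised criterion values. The option names and numbers below are illustrative only, not from the BNFL studies:

```python
def mcda_score(options, weights):
    """Linear-additive MCDA: options maps option name -> criterion values
    normalised to 0-1 (higher preferred); weights reflect importance."""
    return {name: sum(w * v for w, v in zip(weights, vals))
            for name, vals in options.items()}

def rank_options(options, weights):
    """Option names sorted best-first by weighted score."""
    scores = mcda_score(options, weights)
    return sorted(scores, key=scores.get, reverse=True)
```

Mixing monetary and non-monetary objectives then reduces to choosing the normalisation and the weights, which is where stakeholder input enters.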

  6. Studies on generalized kinetic model and Pareto optimization of a product-driven self-cycling bioprocess.

    Science.gov (United States)

    Sun, Kaibiao; Kasperski, Andrzej; Tian, Yuan

    2014-10-01

    The aim of this study is the optimization of a product-driven self-cycling bioprocess and the presentation of a way to determine the best possible decision variables out of a set of alternatives based on the designed model. Initially, a product-driven generalized kinetic model, which allows a flexible choice of the most appropriate kinetics, is designed and analysed. The optimization problem is formulated as a bi-objective one, where maximization of biomass productivity and minimization of unproductive loss of substrate are the objective functions. Then, the Pareto fronts are calculated for exemplary kinetics. It is found that in the designed bioprocess, a decrease of the emptying/refilling fraction and an increase of the substrate feeding concentration cause an increase of the biomass productivity. An increase of the emptying/refilling fraction and a decrease of the substrate feeding concentration cause a decrease of the unproductive loss of substrate. The preferred solutions are calculated using the minimum distance from an ideal solution method, while proposals for their modification are derived from a decision maker's reactions to the generated solutions.
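
The bi-objective set-up in this record (maximise biomass productivity, minimise unproductive substrate loss) and the minimum-distance-from-ideal selection can be sketched for a finite set of candidate operating points. The sample points in the usage test are arbitrary illustrations, not model outputs:

```python
def pareto_front(points):
    """Non-dominated (productivity, loss) points: productivity is
    maximised, loss is minimised."""
    return [p for p in points
            if not any(q[0] >= p[0] and q[1] <= p[1] and q != p
                       for q in points)]

def closest_to_ideal(front):
    """Preferred solution: minimum Euclidean distance to the ideal point
    (best productivity, best loss) - one simple scalarisation of many."""
    ideal = (max(p[0] for p in front), min(p[1] for p in front))
    return min(front, key=lambda p: ((p[0] - ideal[0]) ** 2
                                     + (p[1] - ideal[1]) ** 2) ** 0.5)
```

The ideal point is generally infeasible; the chosen trade-off is the Pareto-optimal point nearest to it, which a decision maker can then adjust.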

  7. Extending Particle Swarm Optimisers with Self-Organized Criticality

    DEFF Research Database (Denmark)

    Løvbjerg, Morten; Krink, Thiemo

    2002-01-01

    Particle swarm optimisers (PSOs) show potential in function optimisation, but still have room for improvement. Self-organized criticality (SOC) can help control the PSO and add diversity. Extending the PSO with SOC seems promising, reaching faster convergence and better solutions.
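
For reference, a minimal global-best particle swarm optimiser without the SOC extension (the paper adds criticality-driven diversity on top of a scheme like this); the parameter values are common textbook choices, not the authors':

```python
import random

def pso(f, dim=2, n_particles=20, iters=100, seed=0,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimise f over [lo, hi]^dim with a global-best PSO.
    Returns (best position, best value)."""
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]          # personal bests
    pval = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = list(pbest[g]), pval[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            v = f(xs[i])
            if v < pval[i]:
                pbest[i], pval[i] = list(xs[i]), v
                if v < gval:
                    gbest, gval = list(xs[i]), v
    return gbest, gval
```

The SOC extension in the record would perturb particles when a criticality measure builds up, countering the premature convergence this plain scheme can exhibit.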

  8. Technical Aspects of Use of Ultrasound for Intensification of Enzymatic Bio-Processing: New Path to "Green Chemistry"

    Science.gov (United States)

    Use of enzymatic processing in the food, textile, and bio-fuel applications is becoming increasingly popular, primarily because of rapid introduction of a new variety of highly efficient enzymes. In general, an enzymatic bio-processing generates less toxic and readily biodegradable wastewater efflue...

  9. Comparative analysis of solid-state bioprocessing and enzymatic treatment of finger millet for mobilization of bound phenolics.

    Science.gov (United States)

    Yadav, Geetanjali; Singh, Anshu; Bhattacharya, Patrali; Yuvraj, Jude; Banerjee, Rintu

    2013-11-01

    The present work investigates a suitable bioprocessing technique to mobilize the bound phenolics naturally found in the finger millet cell wall, enriching the grain with dietary antioxidants. A comparative study was performed between exogenous enzymatic treatment and solid-state fermentation (SSF) of the grain with a food-grade organism, Rhizopus oryzae. SSF results indicated that at the 6th day of incubation, the total phenolic content (18.64 mg gallic acid equivalent/gds) and antioxidant properties (DPPH radical scavenging activity of 39.03%, metal chelating ability of 54% and better reducing power) of finger millet were drastically enhanced when fermented with the GRAS filamentous fungus. During the enzymatic bioprocessing, most of the phenolics released during hydrolysis leached out into the liquid portion rather than being retained within the millet grain, resulting in an overall loss of dietary antioxidant. The present study establishes the most effective strategy to enrich finger millet with phenolic antioxidants.

  10. Improving Vector Evaluated Particle Swarm Optimisation by incorporating nondominated solutions.

    Science.gov (United States)

    Lim, Kian Sheng; Ibrahim, Zuwairie; Buyamin, Salinda; Ahmad, Anita; Naim, Faradila; Ghazali, Kamarul Hawari; Mokhtar, Norrima

    2013-01-01

    The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles where their movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the best solution at the objective function optimised by that swarm, yielding poor solutions for the multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced by incorporating the nondominated solutions as the guidance for a swarm rather than using the best solution from another swarm. In this paper, the performance of improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using performance measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved Vector Evaluated Particle Swarm Optimisation algorithm has impressive performance compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm.
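
One of the performance measures named in this record, the generational distance, averages each found solution's distance to the nearest point of a reference Pareto front (smaller is better). A small sketch:

```python
def generational_distance(found, reference):
    """Mean Euclidean distance from each found objective vector to its
    nearest point on the reference Pareto front."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return sum(min(dist(p, r) for r in reference) for p in found) / len(found)
```

A found set lying exactly on the reference front scores zero; solutions far from the front inflate the average, which is how convergence of the two VEPSO variants can be compared.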

  11. An industrial perspective on bioreactor scale-down: what we can learn from combined large-scale bioprocess and model fluid studies.

    Science.gov (United States)

    Noorman, Henk

    2011-08-01

    For industrial bioreactor design, operation, control and optimization, the scale-down approach is often advocated to efficiently generate data on a small scale, and effectively apply suggested improvements to the industrial scale. In all cases it is important to ensure that the scale-down conditions are representative of the real large-scale bioprocess. Progress is hampered by limited detailed and local information from large-scale bioprocesses. Complementary to real fermentation studies, physical aspects of model fluids such as air-water in large bioreactors provide useful information with limited effort and cost. Still, in industrial practice, investments of time, capital and resources often prohibit systematic work, although, in the end, savings obtained in this way are trivial compared to the expenses that result from real process disturbances, batch failures, and non-flyers with loss of business opportunity. Here we try to highlight what can be learned from a real large-scale bioprocess in combination with model fluid studies, and to provide suitable computation tools to overcome data restrictions. Focus is on a specific well-documented case for a 30-m(3) bioreactor. Areas for further research from an industrial perspective are also indicated. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Layout Optimisation of Wave Energy Converter Arrays

    Directory of Open Access Journals (Sweden)

    Pau Mercadé Ruiz

    2017-08-01

    Full Text Available This paper proposes an optimisation strategy for the layout design of wave energy converter (WEC) arrays. Optimal layouts are sought so as to maximise the absorbed power given a minimum q-factor, the minimum distance between WECs, and an area of deployment. To guarantee an efficient optimisation, a four-parameter layout description is proposed. Three different optimisation algorithms are further compared in terms of performance and computational cost. These are the covariance matrix adaptation evolution strategy (CMA), a genetic algorithm (GA) and the glowworm swarm optimisation (GSO) algorithm. The results show slightly higher performances for the latter two algorithms; however, the first turns out to be significantly less computationally demanding.
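
    The abstract's ingredients (a compact layout parameterisation, a minimum-distance constraint and a deployment area) can be illustrated with a small sketch. The four parameters below (rows, columns and two spacings of a regular grid) are a hypothetical stand-in for the paper's actual parameterisation; the constraint values are likewise invented.

```python
import itertools
import math

# Hypothetical four-parameter layout description in the spirit of the paper:
# a regular grid of WECs given by (rows, cols, spacing_x, spacing_y). The
# paper's actual parameterisation and constraint values may differ.
def layout(rows, cols, dx, dy):
    return [(i * dx, j * dy) for i in range(rows) for j in range(cols)]

def min_distance(points):
    return min(math.dist(a, b) for a, b in itertools.combinations(points, 2))

def feasible(rows, cols, dx, dy, d_min=50.0, area=(500.0, 500.0)):
    """Check the minimum-distance and deployment-area constraints."""
    pts = layout(rows, cols, dx, dy)
    return (min_distance(pts) >= d_min
            and (rows - 1) * dx <= area[0]
            and (cols - 1) * dy <= area[1])

print(feasible(3, 3, 100.0, 100.0), feasible(3, 3, 40.0, 40.0))
```

    An optimiser (CMA, GA or GSO) would then search over these four parameters, evaluating absorbed power only for layouts that pass such a feasibility check.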

  13. Optimising the design and operation of semi-continuous affinity chromatography for clinical and commercial manufacture.

    Science.gov (United States)

    Pollock, James; Bolton, Glen; Coffman, Jon; Ho, Sa V; Bracewell, Daniel G; Farid, Suzanne S

    2013-04-05

    This paper presents an integrated experimental and modelling approach to evaluate the potential of semi-continuous chromatography for the capture of monoclonal antibodies (mAb) in clinical and commercial manufacture. Small-scale single-column experimental breakthrough studies were used to derive design equations for the semi-continuous affinity chromatography system. Verification runs with the semi-continuous 3-column and 4-column periodic counter current (PCC) chromatography system indicated the robustness of the design approach. The product quality profiles and step yields (after wash step optimisation) achieved were comparable to the standard batch process. The experimentally-derived design equations were incorporated into a decisional tool comprising dynamic simulation, process economics and sizing optimisation. The decisional tool was used to evaluate the economic and operational feasibility of whole mAb bioprocesses employing PCC affinity capture chromatography versus standard batch chromatography across a product's lifecycle from clinical to commercial manufacture. The tool predicted that PCC capture chromatography would offer more significant savings in direct costs for early-stage clinical manufacture (proof-of-concept) (∼30%) than for late-stage clinical (∼10-15%) or commercial (∼5%) manufacture. The evaluation also highlighted the potential facility fit issues that could arise with a capture resin (MabSelect) that experiences losses in binding capacity when operated in continuous mode over lengthy commercial campaigns. Consequently, the analysis explored the scenario of adopting the PCC system for clinical manufacture and switching to the standard batch process following product launch. The tool determined the PCC system design required to operate at commercial scale without facility fit issues and with similar costs to the standard batch process whilst pursuing a process change application. A retrofitting analysis established that the direct cost

  14. Mesoderm Lineage 3D Tissue Constructs Are Produced at Large-Scale in a 3D Stem Cell Bioprocess.

    Science.gov (United States)

    Cha, Jae Min; Mantalaris, Athanasios; Jung, Sunyoung; Ji, Yurim; Bang, Oh Young; Bae, Hojae

    2017-09-01

    Various studies have presented different approaches to direct pluripotent stem cell differentiation, such as applying defined sets of exogenous biochemical signals and genetic/epigenetic modifications. Although differentiation to target lineages can be successfully regulated, such conventional methods are often complicated, laborious, and not cost-effective enough to be employed for the large-scale production of 3D stem cell-based tissue constructs. A 3D-culture platform that could realize the large-scale production of mesoderm lineage tissue constructs from embryonic stem cells (ESCs) is developed. ESCs are cultured using our previously established 3D-bioprocess platform which is amenable to mass-production of 3D ESC-based tissue constructs. Hepatocarcinoma cell line conditioned medium is introduced to the large-scale 3D culture to provide a specific biomolecular microenvironment to mimic the in vivo mesoderm formation process. After a 5-day spontaneous differentiation period, the resulting 3D tissue constructs are composed of multipotent mesodermal progenitor cells, as verified by gene and molecular expression profiles. Subsequently, the optimal time points to trigger terminal differentiation towards cardiomyogenesis or osteogenesis from the mesodermal tissue constructs are found. A simple and affordable 3D ESC-bioprocess that can reach the scalable production of mesoderm origin tissues with significantly improved corresponding tissue properties is demonstrated. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Optimisation-based worst-case analysis and anti-windup synthesis for uncertain nonlinear systems

    Science.gov (United States)

    Menon, Prathyush Purushothama

    This thesis describes the development and application of optimisation-based methods for worst-case analysis and anti-windup synthesis for uncertain nonlinear systems. The worst-case analysis methods developed in the thesis are applied to the problem of nonlinear flight control law clearance for highly augmented aircraft. Local, global and hybrid optimisation algorithms are employed to evaluate worst-case violations of a nonlinear response clearance criterion, for a highly realistic aircraft simulation model and flight control law. The reliability and computational overheads associated with different optimisation algorithms are compared, and the capability of optimisation-based approaches to clear flight control laws over continuous regions of the flight envelope is demonstrated. An optimisation-based method for computing worst-case pilot inputs is also developed, and compared with current industrial approaches for this problem. The importance of explicitly considering uncertainty in aircraft parameters when computing worst-case pilot demands is clearly demonstrated. Preliminary results on extending the proposed framework to the problems of limit-cycle analysis and robustness analysis in the presence of time-varying uncertainties are also included. A new method for the design of anti-windup compensators for nonlinear constrained systems controlled using nonlinear dynamics inversion control schemes is presented and successfully applied to some simple examples. An algorithm based on the use of global optimisation is proposed to design the anti-windup compensator. Some conclusions are drawn from the results of the research presented in the thesis, and directions for future work are identified.

  16. A comparison of forward planning and optimised inverse planning

    International Nuclear Information System (INIS)

    Oldham, Mark; Neal, Anthony; Webb, Steve

    1995-01-01

    A radiotherapy treatment plan optimisation algorithm has been applied to 48 prostate plans and the results compared with those of an experienced human planner. Twelve patients were used in the study, and 3-, 4-, 6- and 8-field plans (with standard coplanar beam angles for each plan type) were optimised by both the human planner and the optimisation algorithm. The human planner 'optimised' the plans by conventional forward planning techniques. The optimisation algorithm was based on fast simulated annealing. 'Importance factors' assigned to different regions of the patient provide a method for controlling the algorithm, and it was found that the same values gave good results for almost all plans. The plans were compared on the basis of dose statistics, normal-tissue-complication-probability (NTCP) and tumour-control-probability (TCP). The results show that the optimisation algorithm yielded results that were at least as good as the human planner's for all plan types, and on the whole slightly better. A study of the beam weights chosen by the optimisation algorithm and the planner is presented. The optimisation algorithm showed greater variation in response to individual patient geometry. For simple (e.g. 3-field) plans it was found to consistently achieve slightly higher TCP and lower NTCP values. For more complicated (e.g. 8-field) plans the optimisation also achieved slightly better results, generally with fewer beams. The optimisation time was always ≤5 minutes, up to 20 times faster than the human planner.
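
    As a rough illustration of the approach described, the following sketch minimises a weighted quadratic dose objective over beam weights with simulated annealing. The three-beam dose matrix, prescriptions and importance factors are invented for illustration and bear no relation to the clinical data in the paper.

```python
import math
import random

# Toy dose model: dose[i] = sum_j A[i][j] * w[j] for beam weights w.
# Voxels 0-1 are "tumour" (prescription 1.0), voxel 2 is "normal tissue"
# (prescription 0.0); importance factors weight the penalty per region.
A = [[1.0, 0.2, 0.1],
     [0.2, 1.0, 0.1],
     [0.5, 0.5, 0.8]]
prescription = [1.0, 1.0, 0.0]
importance = [10.0, 10.0, 1.0]

def cost(w):
    c = 0.0
    for i, row in enumerate(A):
        d = sum(a * x for a, x in zip(row, w))
        c += importance[i] * (d - prescription[i]) ** 2
    return c

def anneal(iters=5000, t0=1.0, seed=0):
    rng = random.Random(seed)
    w = [0.5, 0.5, 0.5]
    best, best_cost = list(w), cost(w)
    cur_cost = best_cost
    for k in range(iters):
        t = t0 * (1.0 - k / iters) + 1e-6                       # cooling schedule
        cand = [max(0.0, x + rng.gauss(0.0, 0.05)) for x in w]  # weights >= 0
        cc = cost(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if cc < cur_cost or rng.random() < math.exp((cur_cost - cc) / t):
            w, cur_cost = cand, cc
            if cc < best_cost:
                best, best_cost = list(cand), cc
    return best, best_cost

w, c = anneal()
print("beam weights:", w, "cost:", c)
```

    The importance factors play the same role as in the abstract: raising the tumour weights forces dose conformity there at the expense of the normal-tissue term.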

  17. The current regulatory requirements on optimisation and BAT in Sweden in the context of geological disposal

    International Nuclear Information System (INIS)

    Dverstorp, B.

    2010-01-01

    Bjorn Dverstorp, Swedish Radiation Safety Authority (SSM), presented 'The current regulatory requirements on optimisation and BAT in Sweden in the context of geological disposal'. In Sweden, a nuclear waste repository will be evaluated both according to general environmental legislation (the Environmental Code, SFS, 1998:808) and according to more specific requirements in the Act on Nuclear Activities (SFS, 1984:3) and the Radiation Protection Act (SFS, 1988:220). The evaluations according to these laws will be carried out in two separate, but coordinated, legal-review and decision-making processes. This will be a basis for the siting process. Although the requirements on BAT and siting in the Environmental Code apply to radiological protection, they aim at a broader system optimisation. The more specific requirements on optimisation and BAT of radiological protection of geological disposal systems are given in the regulations associated with the Radiation Protection Act. The Swedish radiation protection regulations (SSM, 2009) comprise three cornerstones: a risk target, environmental protection goals and the use of optimisation and BAT. In SSM's guidance, optimisation is defined as a means to reduce risk, guided by the results of risk calculations. In case of a conflict between BAT and optimisation, measures satisfying BAT should have priority. The application of optimisation and BAT on different timescales is described, as well as for human intrusion scenarios. B. Dverstorp explained that, because of uncertainties in the long term, there is a need for additional arguments in the safety case in support of decision making. It is in this context that the requirements on optimisation and BAT should be seen as supplementary to the risk target, in providing evidence that the developer has taken into consideration, as far as reasonably possible, measures and options for reducing future doses and risks.
Both principles focus on the proponent's work on developing

  18. Improving Vector Evaluated Particle Swarm Optimisation by Incorporating Nondominated Solutions

    Directory of Open Access Journals (Sweden)

    Kian Sheng Lim

    2013-01-01

    Full Text Available The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles whose movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the current best at the objective function optimised by that swarm, which can yield poor solutions to multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced, which uses the nondominated solutions as the guidance for a swarm rather than the best solution from another swarm. In this paper, the performance of the improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved algorithm performs impressively compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm.
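
    The modification described in this abstract can be sketched in a few lines of Python. The bi-objective test function (Schaffer's problem) and the PSO constants are illustrative assumptions; the key point is that each particle's social guide is drawn from an archive of nondominated solutions rather than from the other swarm's best.

```python
import random

# Bi-objective test problem (Schaffer): the Pareto-optimal set is x in [0, 2].
def objectives(x):
    return (x * x, (x - 2.0) ** 2)

def dominates(fa, fb):
    """fa dominates fb: no worse in every objective, strictly better in one."""
    return all(a <= b for a, b in zip(fa, fb)) and any(a < b for a, b in zip(fa, fb))

def update_archive(archive, x, fx):
    """Keep only mutually nondominated (x, f(x)) pairs."""
    if any(dominates(f, fx) for _, f in archive):
        return
    archive[:] = [(y, f) for y, f in archive if not dominates(fx, f)]
    archive.append((x, fx))

def improved_vepso(n_particles=10, iters=150, seed=42):
    rng = random.Random(seed)
    # Two swarms, one per objective, as in VEPSO.
    pos = [[rng.uniform(-4.0, 6.0) for _ in range(n_particles)] for _ in range(2)]
    vel = [[0.0] * n_particles for _ in range(2)]
    pbest = [[(x, objectives(x)) for x in swarm] for swarm in pos]
    archive = []
    for swarm in pos:
        for x in swarm:
            update_archive(archive, x, objectives(x))
    w, c1, c2 = 0.6, 1.5, 1.5           # illustrative PSO constants
    for _ in range(iters):
        for k in range(2):              # swarm k minimises objective k
            for i in range(n_particles):
                # Modification: a random nondominated archive member guides
                # the particle, instead of the other swarm's best solution.
                guide = rng.choice(archive)[0]
                bx = pbest[k][i][0]
                vel[k][i] = (w * vel[k][i]
                             + c1 * rng.random() * (bx - pos[k][i])
                             + c2 * rng.random() * (guide - pos[k][i]))
                pos[k][i] += vel[k][i]
                fx = objectives(pos[k][i])
                if fx[k] < pbest[k][i][1][k]:
                    pbest[k][i] = (pos[k][i], fx)
                update_archive(archive, pos[k][i], fx)
    return archive

front = improved_vepso()
print(len(front), "nondominated solutions found")
```

    The archive itself is what the paper's quality measures (generational distance, spread, hypervolume) would be computed on.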

  19. Optimisation of electrical system for offshore wind farms via genetic algorithm

    DEFF Research Database (Denmark)

    Chen, Zhe; Zhao, Menghua; Blaabjerg, Frede

    2009-01-01

    An optimisation platform based on genetic algorithm (GA) is presented, where the main components of a wind farm and key technical specifications are used as input parameters and the electrical system design of the wind farm is optimised in terms of both production cost and system reliability. The power losses, wind power production, initial investment and maintenance costs are considered in the production cost. The availability of components and network redundancy are included in the reliability evaluation. The method of coding an electrical system to a binary string, which is processed by GA, is developed. Different GA techniques are investigated based on a real example offshore wind farm. This optimisation platform has been demonstrated as a powerful tool for offshore wind farm design and evaluation.
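
    The core idea of coding a design to a binary string and evolving it with a GA can be sketched as follows. The six-component cost model (investment vs. expected outage cost) is a toy stand-in for the wind-farm electrical system model in the paper; the GA operators (elitism, truncation selection, one-point crossover, bit-flip mutation) are standard textbook choices.

```python
import random

# Toy stand-in for the electrical design problem: each bit switches a
# redundant component on (1) or off (0). Redundancy raises the investment
# cost but avoids the (larger) expected cost of lost production.
INVEST = [5.0, 4.0, 6.0, 3.0, 7.0, 2.0]
OUTAGE = [9.0, 6.0, 11.0, 4.0, 12.0, 3.0]

def total_cost(bits):
    return sum(INVEST[i] if b else OUTAGE[i] for i, b in enumerate(bits))

def ga(pop_size=20, gens=40, p_mut=0.1, seed=3):
    rng = random.Random(seed)
    n = len(INVEST)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=total_cost)
        next_pop = pop[:2]                      # elitism: keep the two best
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:10], 2)      # truncation selection
            cut = rng.randrange(1, n)           # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]
            next_pop.append(child)
        pop = next_pop
    best = min(pop, key=total_cost)
    return best, total_cost(best)

best, c = ga()
print("best binary string:", best, "total cost:", c)
```

    In the real platform the decoded string would feed a power-flow and reliability evaluation instead of this lookup table, but the encode-evaluate-evolve loop is the same.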

  20. An Optimisation Approach for Room Acoustics Design

    DEFF Research Database (Denmark)

    Holm-Jørgensen, Kristian; Kirkegaard, Poul Henning; Andersen, Lars

    2005-01-01

    This paper discusses, on a conceptual level, the value of optimisation techniques in architectural acoustics room design from a practical point of view. It is chosen to optimise one objective room acoustics design criterion estimated from the sound field inside the room. The sound field is modeled using the boundary element method, where absorption is incorporated. An example is given where the geometry of a room is defined by four design modes. The room geometry is optimised to get a uniform sound pressure.

  1. Optimisation on processing parameters for minimising warpage on side arm using response surface methodology (RSM) and particle swarm optimisation (PSO)

    Science.gov (United States)

    Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Sazli, M.; Yahya, Z. R.

    2017-09-01

    This study presents the application of an optimisation method to reduce the warpage of a side arm part. Autodesk Moldflow Insight software was integrated into this study to analyse the warpage. A Design of Experiments (DOE) for Response Surface Methodology (RSM) was constructed, and using the equation from RSM, Particle Swarm Optimisation (PSO) was applied. The optimisation method results in optimised processing parameters with minimum warpage. Mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as the variable parameters. Parameter selection was based on the most significant factors affecting warpage as stated by previous researchers. The results show that warpage was improved by 28.16% for RSM and 28.17% for PSO; the difference between the two is only 0.01%. Thus, optimisation using RSM is already sufficient to give the best combination of parameters and an optimum warpage value for the side arm part. The most significant parameter affecting warpage is packing pressure.
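
    The RSM-then-PSO workflow amounts to fitting a second-order response surface to the DOE results and then minimising that surface with a particle swarm. A minimal sketch with an invented quadratic warpage model in two coded variables (the paper's model has five process parameters and different coefficients):

```python
import random

# Hypothetical second-order (RSM-style) warpage model in two coded process
# variables x1, x2 in [-1, 1]; the coefficients are invented for illustration.
def warpage(x):
    x1, x2 = x
    return (0.9 - 0.30 * x1 - 0.10 * x2
            + 0.25 * x1 * x1 + 0.15 * x2 * x2 + 0.05 * x1 * x2)

def pso(n=15, iters=100, seed=7):
    rng = random.Random(seed)
    lo, hi = -1.0, 1.0                  # coded variable bounds
    pos = [[rng.uniform(lo, hi) for _ in range(2)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [list(p) for p in pos]
    gbest = list(min(pos, key=warpage))
    w, c1, c2 = 0.7, 1.4, 1.4
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if warpage(pos[i]) < warpage(pbest[i]):
                pbest[i] = list(pos[i])
            if warpage(pos[i]) < warpage(gbest):
                gbest = list(pos[i])
    return gbest, warpage(gbest)

best, val = pso()
print("optimised coded parameters:", best, "predicted warpage:", val)
```

    Because the surrogate is a smooth quadratic, PSO adds little over a direct stationary-point calculation here, which matches the study's finding that RSM alone was already sufficient.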

  2. Biomass supply chain optimisation for Organosolv-based biorefineries.

    Science.gov (United States)

    Giarola, Sara; Patel, Mayank; Shah, Nilay

    2014-05-01

    This work aims at providing a Mixed Integer Linear Programming modelling framework to help define planning strategies for the development of sustainable biorefineries. The up-scaling of an Organosolv biorefinery was addressed via optimisation of the whole system economics. Three real world case studies were addressed to show the high-level flexibility and wide applicability of the tool to model different biomass typologies (i.e. forest fellings, cereal residues and energy crops) and supply strategies. Model outcomes have revealed how supply chain optimisation techniques could help shed light on the development of sustainable biorefineries. Feedstock quality, quantity, temporal and geographical availability are crucial to determine biorefinery location and the cost-efficient way to supply the feedstock to the plant. Storage costs are relevant for biorefineries based on cereal stubble, while wood supply chains present dominant pretreatment operations costs. Copyright © 2014 Elsevier Ltd. All rights reserved.
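
    At its smallest, the siting question the model answers has a facility-location flavour: trade fixed biorefinery costs against feedstock transport costs. A toy sketch with invented numbers (the actual model is a much richer MILP covering time, storage and pretreatment):

```python
# Toy facility-location flavour of the supply-chain siting decision: choose
# the biorefinery site minimising fixed cost plus feedstock transport cost.
# All sites, regions, distances and costs are invented for illustration.
SITES = {"A": 100.0, "B": 120.0}                 # annualised fixed site cost
REGIONS = {"r1": 40.0, "r2": 25.0, "r3": 60.0}   # biomass available (kt/yr)
DIST = {("A", "r1"): 10, ("A", "r2"): 50, ("A", "r3"): 30,
        ("B", "r1"): 40, ("B", "r2"): 15, ("B", "r3"): 20}
TRANSPORT = 0.05                                 # cost per kt per km

def site_cost(site):
    """Fixed cost plus the cost of hauling every region's biomass to the site."""
    return SITES[site] + sum(TRANSPORT * DIST[(site, r)] * q
                             for r, q in REGIONS.items())

def best_site():
    return min(SITES, key=site_cost)

print({s: site_cost(s) for s in SITES}, "->", best_site())
```

    The abstract's conclusion that feedstock quantity and geographical availability drive location shows up even at this scale: the cheapest site is the one closest to the heaviest supply regions.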

  3. Optimisation of patient and staff exposure in interventional cardiology

    International Nuclear Information System (INIS)

    Padovani, R.; Malisan, M.R.; Bernardi, G.; Vano, E.; Neofotistou, V.

    2001-01-01

    The Council Directive of the European Community 97/43/Euratom (MED) deals with the health protection of individuals against the dangers of ionising radiation in relation to medical exposure, and also focuses attention on some special practices (Art. 9), including interventional radiology, a technique involving high doses to the patient. The paper presents the European approach to optimisation of exposure in interventional cardiology. The DIMOND research consortium (DIMOND: Digital Imaging: Measures for Optimising Radiological Information Content and Dose) is working to develop quality criteria for cineangiographic images, to develop procedures for the classification of the complexity of therapeutic and diagnostic procedures, and to derive reference levels, related also to procedure complexity. The DIMOND project also includes aspects of equipment characteristics and performance and the content of training in radiation protection for personnel working in the interventional radiology field. (author)

  4. Energy thermal management in commercial bread-baking using a multi-objective optimisation framework

    International Nuclear Information System (INIS)

    Khatir, Zinedine; Taherkhani, A.R.; Paton, Joe; Thompson, Harvey; Kapur, Nik; Toropov, Vassili

    2015-01-01

    In response to increasing energy costs and legislative requirements, energy-efficient high-speed air impingement jet baking systems are now being developed. In this paper, a multi-objective optimisation framework for oven designs is presented which uses experimentally verified heat transfer correlations and high-fidelity Computational Fluid Dynamics (CFD) analyses to identify optimal combinations of design features which maximise desirable characteristics such as temperature uniformity in the oven and overall energy efficiency of baking. A surrogate-assisted multi-objective optimisation framework is proposed and used to explore a range of practical oven designs, providing information on overall temperature uniformity within the oven together with ensuing energy usage and potential savings. - Highlights: • A multi-objective optimisation framework to design commercial ovens is presented. • High-fidelity CFD embeds experimentally calibrated heat transfer inputs. • The optimum oven design minimises specific energy and bake time. • The Pareto front of the surrogate-assisted optimisation framework is built. • Optimisation of industrial bread-baking ovens reveals an energy saving of 637.6 GWh

  5. (MBO) algorithm in multi-reservoir system optimisation

    African Journals Online (AJOL)

    A comparative study of marriage in honey bees optimisation (MBO) algorithm in ... A practical application of the marriage in honey bees optimisation (MBO) ... to those of other evolutionary algorithms, such as the genetic algorithm (GA), ant ...

  6. HEK293 cell culture media study towards bioprocess optimization: Animal derived component free and animal derived component containing platforms.

    Science.gov (United States)

    Liste-Calleja, Leticia; Lecina, Martí; Cairó, Jordi Joan

    2014-04-01

    The increasing demand for biopharmaceuticals produced in mammalian cells has led industries to enhance bioprocess volumetric productivity through different strategies. Among those strategies, cell culture media development is of major interest. In the present work, several commercially available culture media for Human Embryonic Kidney cells (HEK293) were evaluated in terms of maximal specific growth rate and maximal viable cell concentration supported. The main objective was to provide different cell culture platforms suitable for a wide range of applications depending on the type and final use of the product obtained. By performing simple media supplementations with and without animal-derived components, an enhancement of cell concentration from 2 × 10⁶ cells/mL to 17 × 10⁶ cells/mL was achieved in batch mode operation. Additionally, the media were evaluated for adenovirus production as a specific application of HEK293 cells. None of the supplements interfered significantly with adenovirus infection, although some differences were encountered in viral productivity. To the best of our knowledge, the high cell density achieved in this work has never been reported before in HEK293 batch cell cultures, and thus our results are highly promising for the further study of cell culture strategies in bioreactors towards bioprocess optimization. Copyright © 2013 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  7. Mechatronic System Design Based On An Optimisation Approach

    DEFF Research Database (Denmark)

    Andersen, Torben Ole; Pedersen, Henrik Clemmensen; Hansen, Michael Rygaard

    The objective of this project is to extend the current state of the art regarding the design of complex mechatronic systems utilizing an optimisation approach. We propose to investigate a novel framework for mechatronic system design, the novelty and originality being the use of optimisation techniques. The methods used to optimise/design within the classical disciplines will be identified and extended to mechatronic system design.

  8. The principle of optimisation: reasons for success and legal criticism

    International Nuclear Information System (INIS)

    Fernandez Regalado, Luis

    2008-01-01

    The International Commission on Radiological Protection (ICRP) adopted new recommendations in 2007. In broad outline, they fundamentally continue the recommendations already approved in 1990 and later on. The principle of optimisation of protection, together with the principles of justification and dose limits, continues to play a key role in the ICRP recommendations, as it has for the last few years. This principle, somewhat reinforced in the 2007 ICRP recommendations, has been incorporated into norms and legislation which have peacefully been in force in many countries all over the world. There are three main reasons for the success of the application of the principle of optimisation in radiological protection. First, the subjectivity of the sentence that embodies the principle of optimisation, 'as low as reasonably achievable' (ALARA), which allows different valid interpretations under different circumstances. Second, the pragmatism and adaptability of ALARA to all exposure situations. And third, the scientific humility behind the principle of optimisation, which makes a clear contrast with the old-fashioned scientific positivism that enshrined scientists' opinions. Nevertheless, from a legal point of view, there is some criticism cast over the principle of optimisation in radiological protection where it has been transformed into a compulsory norm. This criticism is based on two arguments: the lack of democratic participation in the process of elaboration of the norm, and the legal uncertainty associated with its application. Both arguments are somehow known by the ICRP which, on the one hand, has broadened the participation of experts, associations and the professional radiological protection community, increasing the transparency of how decisions on recommendations have been taken, and on the other hand, has warned about the need for authorities to specify general criteria to develop the principle of optimisation in national

  9. Warpage optimisation on the moulded part with straight-drilled and conformal cooling channels using response surface methodology (RSM) and glowworm swarm optimisation (GSO)

    Science.gov (United States)

    Hazwan, M. H. M.; Shayfull, Z.; Sharif, S.; Nasir, S. M.; Zainal, N.

    2017-09-01

    In the injection moulding process, quality and productivity are notably important and must be controlled for each product type produced. Quality is measured as the extent of warpage of moulded parts, while productivity is measured as the duration of the moulding cycle time. To control quality, many researchers have introduced various optimisation approaches, which have been shown to enhance the quality of the moulded parts produced. To improve the productivity of the injection moulding process, some researchers have proposed the application of conformal cooling channels, which have been shown to reduce the moulding cycle time. Therefore, this paper presents the application of an alternative optimisation approach, Response Surface Methodology (RSM) with Glowworm Swarm Optimisation (GSO), on a moulded part with straight-drilled and conformal cooling channel moulds. This study examined the warpage of the moulded parts before and after the optimisation work for both cooling channel designs. A front panel housing was selected as the specimen, and the performance of the proposed optimisation approach was analysed on conventional straight-drilled cooling channels and on Milled Groove Square Shape (MGSS) conformal cooling channels by simulation analysis using Autodesk Moldflow Insight (AMI) 2013. Based on the results, melt temperature is the most significant factor contributing to warpage for the straight-drilled cooling channels, with warpage improved by 39.1% after optimisation, while cooling time is the most significant factor for the MGSS conformal cooling channels, with warpage improved by 38.7% after optimisation. In addition, the findings show that applying the optimisation work to the conformal cooling channels offers better quality and productivity of the moulded part produced.

  10. Energy efficiency optimisation for distillation column using artificial neural network models

    International Nuclear Information System (INIS)

    Osuolale, Funmilayo N.; Zhang, Jie

    2016-01-01

    This paper presents a neural network based strategy for the modelling and optimisation of energy efficiency in distillation columns incorporating the second law of thermodynamics. Real-time optimisation of distillation columns based on mechanistic models is often infeasible due to the effort in model development and the large computational effort associated with mechanistic model computation. This issue can be addressed by using neural network models which can be quickly developed from process operation data. The computation time in neural network model evaluation is very short, making them ideal for real-time optimisation. Bootstrap aggregated neural networks are used in this study for enhanced model accuracy and reliability. Aspen HYSYS is used for the simulation of the distillation systems. Neural network models for exergy efficiency and product compositions are developed from simulated process operation data and are used to maximise exergy efficiency while satisfying product quality constraints. Applications to binary systems of methanol-water and benzene-toluene separations culminate in reductions of utility consumption of 8.2% and 28.2% respectively. Application to multi-component separation columns also demonstrates the effectiveness of the proposed method, with a 32.4% improvement in exergy efficiency. - Highlights: • Neural networks can accurately model exergy efficiency in distillation columns. • Bootstrap aggregated neural networks offer improved model prediction accuracy. • Improved exergy efficiency is obtained through model based optimisation. • Reductions of utility consumption by 8.2% and 28.2% were achieved for binary systems. • The exergy efficiency for multi-component distillation is increased by 32.4%.

  11. Contribution à l'optimisation de la conduite des procédés alimentaires

    OpenAIRE

    Olmos-Perez , Alejandra

    2003-01-01

    The main objective of this work is the development of a methodology to calculate the optimal operating conditions applicable in food processes control. In the first part of this work, we developed an optimization strategy in two stages. Firstly, the optimisation problem is constructed. Afterwards, a feasible optimisation method is chosen to solve the problem. This choice is made through a decisional diagram, which proposes a deterministic (sequential quadratic programming, SQP), a stochastic ...

  12. Optimisation of logistics processes of energy grass collection

    Science.gov (United States)

    Bányai, Tamás.

    2010-05-01

    The collection of energy grass is a logistics-intensive process [1]. The optimal design and control of transportation and collection subprocesses is a critical point of the supply chain. To avoid decisions made solely on the basis of experience and intuition, the optimisation and analysis of collection processes using mathematical models and methods is the scientifically sound way forward. Within the frame of this work, the author focuses on the optimisation possibilities of the collection processes, especially from the point of view of transportation and related warehousing operations. The optimisation methods developed in the literature [2] take into account the harvesting processes, county-specific yields, transportation distances, erosion constraints, machinery specifications, and other key variables, but the possibility of multiple collection points and multi-level collection has not been taken into consideration. The possible areas of use of energy grass are very wide (energy use, biogas and bio-alcohol production, paper and textile industry, industrial fibre material, foddering purposes, biological soil protection [3], etc.), so not only a single-level but also a multi-level collection system with several collection and production facilities has to be taken into consideration. The input parameters of the optimisation problem are the following: total amount of energy grass to be harvested in each region; specific facility costs of collection, warehousing and production units; specific costs of transportation resources; pre-scheduling of the harvesting process; specific transportation and warehousing costs; pre-scheduling of the processing of energy grass at each facility (exclusive warehousing). The model takes into consideration the following assumptions: (1) cooperative relations among processing and production facilities; (2) capacity constraints are not ignored; (3) the cost function of transportation is non-linear; (4) the drivers' conditions are ignored.

  13. ICT for whole life optimisation of residential buildings

    Energy Technology Data Exchange (ETDEWEB)

    Haekkinen, T.; Vares, S.; Huovila, P.; Vesikari, E.; Porkka, J. (VTT Technical Research Centre of Finland, Espoo (FI)); Nilsson, L.-O.; Togeroe, AA. (Lund University (SE)); Jonsson, C.; Suber, K. (Skanska Sverige AB (SE)); Andersson, R.; Larsson, R. (Cementa, Malmoe (SE)); Nuorkivi, I. (Skanska Oyj, Helsinki (FI))

    2007-08-15

    The research project 'ICT for whole life optimisation of residential buildings' (ICTWLORB) developed and tested whole-life design and optimisation methods for residential buildings. The objective of the ICTWLORB project was to develop and implement an ICT-based tool box for integrated life cycle design of residential buildings. ICTWLORB was performed in cooperation with Swedish and Finnish partners. The ICTWLORB project took as a premise that an industrialised building process is characterised by two main elements: a building-concept-based approach and efficient information management. A building-concept-based approach enables (1) product development of the end product, (2) repetition of the basic elements of the building from one project to another and (3) customisation of the end product considering the specific needs of the case and the client. Information management enables (1) the consideration of a wide spectrum of aspects, including building performance, environmental aspects, life cycle costs and service life, and (2) rapid adaptation of the design to the specific requirements of the case. (orig.)

  14. Statistical Optimisation of Fermentation Conditions for Citric Acid ...

    African Journals Online (AJOL)

    This study investigated the optimisation of fermentation conditions during citric acid production via solid state fermentation (SSF) of pineapple peels using Aspergillus niger. A three-variable, three-level Box-Behnken design (BBD) comprising 17 experimental runs was used to develop a statistical model for the fermentation ...
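
A three-variable, three-level Box-Behnken design of the kind described can be written down directly in coded units. The sketch below is illustrative only: the 17-run count in the abstract matches the standard twelve edge points plus five centre-point replicates, and the five-replicate split is our assumption, not stated in the source.

```python
from itertools import combinations

def box_behnken_3(n_center=5):
    """Generate a three-factor Box-Behnken design in coded units (-1, 0, +1).

    Edge points: all (+/-1, +/-1) combinations for each pair of factors,
    with the remaining factor held at its centre level, plus replicated
    centre points. For 3 factors this gives 12 + n_center runs.
    """
    runs = []
    for i, j in combinations(range(3), 2):   # factor pairs (0,1), (0,2), (1,2)
        for a in (-1, 1):
            for b in (-1, 1):
                point = [0, 0, 0]
                point[i], point[j] = a, b
                runs.append(tuple(point))
    runs.extend([(0, 0, 0)] * n_center)      # centre-point replicates
    return runs

design = box_behnken_3()
print(len(design))  # 17 runs, matching the abstract's design size
```

Each coded level would then be mapped to an actual fermentation condition (e.g. moisture, pH, inoculum size) before running the experiments.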

  15. Optimisation of compressive strength of periwinkle shell aggregate

    African Journals Online (AJOL)


    2017-01-01

    In this paper, a regression model is developed to predict and optimise the compressive strength of periwinkle shell aggregate concrete using Scheffe's regression theory. The results obtained from the derived regression model agreed favourably with the experimental data. The model was tested for ...

  16. Stochastic Optimisation of Battery System Operation Strategy under different Utility Tariff Structures

    OpenAIRE

    Erdal, Jørgen Sørgård

    2017-01-01

    This master thesis develops stochastic optimisation software for household grid-connected batteries combined with PV systems. The objective of the optimisation is to operate the battery system so as to minimise the costs of the consumer; it was implemented in MATLAB using a self-written stochastic dynamic programming algorithm. Load was considered a stochastic variable and modelled as a Markov chain. Transition probabilities between time steps were calculated using historic load p...
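
The backward-recursion idea behind such a scheme can be sketched in a few lines. This is a toy version under invented numbers (tariff, load states, Markov transition matrix, battery sizes are all assumptions, not the thesis's data), and in Python rather than the thesis's MATLAB:

```python
import itertools

# Toy stochastic dynamic programming for battery scheduling.
LOADS = [0.0, 1.0, 2.0]                  # kW, discretised load states
P = [[0.6, 0.3, 0.1],                    # Markov chain transition probabilities
     [0.2, 0.6, 0.2],
     [0.1, 0.3, 0.6]]
TARIFF = [0.10, 0.30, 0.30, 0.10]        # price per kWh in each time step
SOC_LEVELS = [0.0, 1.0, 2.0]             # kWh, discretised battery state
ACTIONS = [-1.0, 0.0, 1.0]               # kWh charged (+) or discharged (-)

def solve():
    T = len(TARIFF)
    # V[t][(soc, load_state)] = minimal expected cost from t onward
    V = [{} for _ in range(T + 1)]
    for soc in SOC_LEVELS:
        for s in range(len(LOADS)):
            V[T][(soc, s)] = 0.0
    policy = [{} for _ in range(T)]
    for t in reversed(range(T)):         # backward recursion over time
        for soc, s in itertools.product(SOC_LEVELS, range(len(LOADS))):
            best, best_a = float("inf"), None
            for a in ACTIONS:
                new_soc = soc + a
                if new_soc not in SOC_LEVELS:   # battery capacity constraint
                    continue
                grid = max(LOADS[s] + a, 0.0)   # energy bought from the grid
                cost = TARIFF[t] * grid
                expect = sum(P[s][s2] * V[t + 1][(new_soc, s2)]
                             for s2 in range(len(LOADS)))
                if cost + expect < best:
                    best, best_a = cost + expect, a
            V[t][(soc, s)] = best
            policy[t][(soc, s)] = best_a
    return V, policy

V, policy = solve()
```

The resulting policy charges in cheap periods and discharges when the tariff is high, which is the behaviour the thesis's optimiser exploits at household scale.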

  17. Economic and Mathematical Modelling of Optimisation of Transaction Expenses of Engineering Enterprises

    OpenAIRE

    Makaliuk Iryna V.

    2014-01-01

    The article identifies the stages of the process of optimising transaction expenses. It develops an economic-mathematical model for optimising the transaction expenses of engineering enterprises by the criterion of maximising income from the sale of products, with a system of restrictions that requires the income growth rate to exceed the expense growth rate. The article proposes using expense types from accounting accounts as indicators of transaction expenses. In the result o...

  18. A knowledge representation model for the optimisation of electricity generation mixes

    International Nuclear Information System (INIS)

    Chee Tahir, Aidid; Bañares-Alcántara, René

    2012-01-01

    Highlights: ► Prototype energy model which uses semantic representation (ontologies). ► Model accepts both quantitative and qualitative based energy policy goals. ► Uses logic inference to formulate equations for linear optimisation. ► Proposes electricity generation mix based on energy policy goals. -- Abstract: Energy models such as MARKAL, MESSAGE and DNE-21 are optimisation tools which aid in the formulation of energy policies. The strength of these models lies in their solid theoretical foundations built on rigorous mathematical equations designed to process numerical (quantitative) data related to economics and the environment. Nevertheless, a complete consideration of energy policy issues also requires the consideration of the political and social aspects of energy. These political and social issues are often associated with non-numerical (qualitative) information. To enable the evaluation of these aspects in a computer model, we hypothesise that a different approach to energy model optimisation design is required. A prototype energy model that is based on a semantic representation using ontologies and is integrated with engineering models implemented in Java has been developed. The model provides both quantitative and qualitative evaluation capabilities through the use of logical inference. The semantic representation of energy policy goals is used (i) to translate a set of energy policy goals into a set of logic queries which is then used to determine the preferred electricity generation mix and (ii) to assist in the formulation of a set of equations which is then solved in order to obtain a proposed electricity generation mix. Scenario case studies have been developed and tested on the prototype energy model to determine its capabilities. Knowledge queries were made on the semantic representation to determine an electricity generation mix which fulfilled a set of energy policy goals (e.g. CO2 emissions reduction, water conservation, energy supply

  19. Optimisation of X-ray examinations: General principles and an Irish perspective

    International Nuclear Information System (INIS)

    Matthews, Kate; Brennan, Patrick C.

    2009-01-01

    In Ireland, the European Medical Exposures Directive [Council Directive 97/43] was enacted into national law in Statutory Instrument 478 of 2002. This series of three review articles discusses the status of justification and optimisation of X-ray examinations nationally, and progress with the establishment of Irish diagnostic reference levels. In this second article, literature relating to optimisation issues arising in SI 478 of 2002 is reviewed. Optimisation associated with X-ray equipment and optimisation during day-to-day practice are considered. Optimisation proposals found in published research are summarised, and indicate the complex nature of optimisation. A paucity of current, research-based guidance documentation is identified. This is needed in order to support a range of professional staff in their practical implementation of optimisation.

  20. Coil optimisation for transcranial magnetic stimulation in realistic head geometry.

    Science.gov (United States)

    Koponen, Lari M; Nieminen, Jaakko O; Mutanen, Tuomas P; Stenroos, Matti; Ilmoniemi, Risto J

    Transcranial magnetic stimulation (TMS) allows focal, non-invasive stimulation of the cortex. A TMS pulse is inherently weakly coupled to the cortex; thus, magnetic stimulation requires both high current and high voltage to reach sufficient intensity. These requirements limit, for example, the maximum repetition rate and the maximum number of consecutive pulses with the same coil due to the rise of its temperature. To develop methods to optimise, design, and manufacture energy-efficient TMS coils in realistic head geometry with an arbitrary overall coil shape. We derive a semi-analytical integration scheme for computing the magnetic field energy of an arbitrary surface current distribution, compute the electric field induced by this distribution with a boundary element method, and optimise a TMS coil for focal stimulation. Additionally, we introduce a method for manufacturing such a coil by using Litz wire and a coil former machined from polyvinyl chloride. We designed, manufactured, and validated an optimised TMS coil and applied it to brain stimulation. Our simulations indicate that this coil requires less than half the power of a commercial figure-of-eight coil, with a 41% reduction due to the optimised winding geometry and a partial contribution due to our thinner coil former and reduced conductor height. With the optimised coil, the resting motor threshold of abductor pollicis brevis was reached with the capacitor voltage below 600 V and peak current below 3000 A. The described method allows designing practical TMS coils that have considerably higher efficiency than conventional figure-of-eight coils. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Development and optimisation by means of sensory analysis of new beverages based on different fruit juices and sherry wine vinegar.

    Science.gov (United States)

    Cejudo-Bastante, María Jesús; Rodríguez Dodero, M Carmen; Durán Guerrero, Enrique; Castro Mejías, Remedios; Natera Marín, Ramón; García Barroso, Carmelo

    2013-03-15

    Despite the long history of sherry wine vinegar, new alternatives of consumption are being developed, with the aim of diversifying its market. Several new acetic-based fruit juices have been developed by optimising the amount of sherry wine vinegar added to different fruit juices: apple, peach, orange and pineapple. Once the concentrations of wine vinegar were optimised by an expert panel, the aforementioned new acetic fruit juices were tasted by 86 consumers. Three different aspects were taken into account: habits of consumption of vinegar and fruit juices, gender and age. Based on the sensory analysis, 50 g kg(-1) of wine vinegar was the optimal and preferred amount of wine vinegar added to the apple, orange and peach juices, whereas 10 g kg(-1) was the favourite for the pineapple fruit. Based on the olfactory and gustatory impression, and 'purchase intent', the acetic beverages made from peach and pineapple juices were the most appreciated, followed by apple juice, while those obtained from orange juice were the least preferred by consumers. New opportunities for diversification of the oenological market could be possible as a result of the development of this type of new product which can be easily developed by any vinegar or fruit juice maker company. © 2012 Society of Chemical Industry.

  2. An integrated framework for the optimisation of sport and athlete development: a practitioner approach.

    Science.gov (United States)

    Gulbin, Jason P; Croser, Morag J; Morley, Elissa J; Weissensteiner, Juanita R

    2013-01-01

    This paper introduces a new sport and athlete development framework that has been generated by multidisciplinary sport practitioners. By combining current theoretical research perspectives with extensive empirical observations from one of the world's leading sport agencies, the proposed FTEM (Foundations, Talent, Elite, Mastery) framework offers broad utility to researchers and sporting stakeholders alike. FTEM is unique in comparison with alternative models and frameworks, because it: integrates general and specialised phases of development for participants within the active lifestyle, sport participation and sport excellence pathways; typically doubles the number of developmental phases (n = 10) in order to better understand athlete transition; avoids chronological and training prescriptions; more optimally establishes a continuum between participation and elite; and allows full inclusion of many developmental support drivers at the sport and system levels. The FTEM framework offers a viable and more flexible alternative for those sporting stakeholders interested in managing, optimising, and researching sport and athlete development pathways.

  3. Selection of controlled variables in bioprocesses. Application to a SHARON-Anammox process for autotrophic nitrogen removal

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Valverde Perez, Borja; Sin, Gürkan

    Selecting the right controlled variables in a bioprocess is challenging since the objectives of the process (yields, product or substrate concentration) are difficult to relate to a given actuator. We apply here process control tools that can be used to assist in the selection of controlled variables to the case of the SHARON-Anammox process for autotrophic nitrogen removal.

  4. Genetic algorithms and artificial neural networks for loading pattern optimisation of advanced gas-cooled reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ziver, A.K. E-mail: a.k.ziver@imperial.ac.uk; Pain, C.C; Carter, J.N.; Oliveira, C.R.E. de; Goddard, A.J.H.; Overton, R.S

    2004-03-01

    A non-generational genetic algorithm (GA) has been developed for fuel management optimisation of Advanced Gas-Cooled Reactors, which are operated by British Energy and produce around 20% of the UK's electricity requirements. An evolutionary search is coded using the genetic operators, namely tournament selection, two-point crossover, mutation and random assessment of the population, for multi-cycle loading pattern (LP) optimisation. A detailed description of the chromosomes in the genetic algorithm code is presented. Artificial Neural Networks (ANNs) have been constructed and trained to accelerate the GA-based search during the optimisation process. The whole package, called GAOPT, is linked to the reactor analysis code PANTHER, which performs fresh fuel loading, burn-up and power shaping calculations for each reactor cycle by imposing station-specific safety and operational constraints. GAOPT has been verified by performing a number of tests, which are applied to the Hinkley Point B and Hartlepool reactors. The test results giving loading pattern (LP) scenarios obtained from single and multi-cycle optimisation calculations applied to realistic reactor states of the Hartlepool and Hinkley Point B reactors are discussed. The results have shown that the GA/ANN algorithms developed can help the fuel engineer to optimise loading patterns in an efficient and more profitable way than currently available for multi-cycle refuelling of AGRs. Research leading to parallel GAs applied to LP optimisation is outlined, which can be adapted to present-day LWR fuel management problems.
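
The genetic operators named in the abstract (tournament selection, two-point crossover, mutation) can be illustrated on a toy problem. This sketch is a simple generational GA, whereas the paper's scheme is non-generational, and the "loading pattern" here is just a binary string with a made-up fitness standing in for PANTHER core calculations:

```python
import random

random.seed(0)

N, POP, GENS = 20, 30, 60

def fitness(ch):
    # reward alternating bits: a toy surrogate for a flat power shape
    return sum(1 for i in range(N - 1) if ch[i] != ch[i + 1])

def tournament(pop, k=3):
    # selection by tournament: best of k random individuals
    return max(random.sample(pop, k), key=fitness)

def two_point_crossover(a, b):
    i, j = sorted(random.sample(range(N), 2))
    return a[:i] + b[i:j] + a[j:]

def mutate(ch, rate=0.05):
    return [bit ^ 1 if random.random() < rate else bit for bit in ch]

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(two_point_crossover(tournament(pop), tournament(pop)))
           for _ in range(POP)]

best = max(pop, key=fitness)
print(fitness(best))  # fitness of the best individual; the global maximum is 19
```

In the paper's setting, each fitness evaluation is a full PANTHER reactor calculation, which is why the ANN surrogate is needed to make the search tractable.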

  5. Methodology implementation for multi objective optimisation for nuclear fleet evolution scenarios

    International Nuclear Information System (INIS)

    Freynet, David

    2016-01-01

    The issue of the evolution of the French nuclear fleet can be considered through the study of nuclear transition scenarios. These studies are of paramount importance as their results can greatly affect the decision-making process, given that they take into account industrial concerns, investments, time and nuclear system complexity. Such studies can be performed with the COSI code (developed at the CEA/DEN), which enables the calculation of matter inventories and fluxes across the fuel cycle (nuclear reactors and associated facilities), especially when coupled with the CESAR depletion code. The studies performed today with COSI require the definition of the various scenarios' input parameters in order to fulfil different objectives such as minimising natural uranium consumption, waste production and so on. These parameters concern the quantities and scheduling of spent fuel destined for reprocessing, and the number, type and commissioning dates of the deployed reactors. This work aims to develop, validate and apply a multi-objective optimisation methodology coupled with COSI, in order to determine optimal nuclear transition scenarios. Firstly, this methodology is based on the acceleration of scenario evaluation, enabling the use of optimisation methods in a reasonable time-frame. With this goal in mind, artificial neural network irradiation surrogate models are created with the URANIE platform (developed at the CEA/DEN) and implemented within COSI. The next step in this work is to use, adapt and compare different optimisation methods, such as URANIE's genetic algorithm and particle swarm methods, in order to define a methodology suited to this type of study. This methodology development is based on an incremental approach which progressively adds objectives, constraints and decision variables to the optimisation problem definition. The variables added, which are related to reactor deployment and spent fuel reprocessing strategies, are chosen ...

  6. Mutual information-based LPI optimisation for radar network

    Science.gov (United States)

    Shi, Chenguang; Zhou, Jianjiang; Wang, Fei; Chen, Jun

    2015-07-01

    Radar networks can offer significant performance improvement for target detection and information extraction by employing spatial diversity. For a fixed number of radars, the achievable mutual information (MI) for estimating the target parameters may extend beyond a predefined threshold with full power transmission. In this paper, an effective low probability of intercept (LPI) optimisation algorithm is presented to improve the LPI performance of a radar network. Based on the radar network system model, we first provide the Schleher intercept factor for the radar network as an optimisation metric for LPI performance. Then, a novel LPI optimisation algorithm is presented where, for a predefined MI threshold, the Schleher intercept factor for the radar network is minimised by optimising the transmission power allocation among the radars in the network, such that enhanced LPI performance can be achieved. A genetic algorithm based on nonlinear programming (GA-NP) is employed to solve the resulting nonconvex and nonlinear optimisation problem. Simulations demonstrate that the proposed algorithm is valuable and effective in improving the LPI performance of the radar network.
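
The structure of the problem, minimising an intercept-related cost subject to an MI floor, can be shown on a simplified stand-in. Here the "intercept factor" is reduced to a weighted total transmit power, the gains and weights are invented, and instead of the paper's GA-NP we exploit the fact that this toy version is convex and solve its KKT conditions by bisection on the Lagrange multiplier:

```python
import math

g = [1.0, 0.5, 0.25]      # per-radar channel gains (assumed)
c = [1.0, 1.2, 0.8]       # per-radar intercept-cost weights (assumed)
R = 3.0                   # required total mutual information, bits

def alloc(lam):
    # KKT stationarity for min sum(c_i p_i) s.t. sum log2(1 + g_i p_i) >= R:
    # p_i = max(0, lam / (c_i ln 2) - 1 / g_i)
    return [max(0.0, lam / (ci * math.log(2)) - 1.0 / gi)
            for gi, ci in zip(g, c)]

def mi(p):
    return sum(math.log2(1.0 + gi * pi) for gi, pi in zip(g, p))

lo, hi = 0.0, 100.0
for _ in range(80):        # bisection until the MI constraint is tight
    mid = 0.5 * (lo + hi)
    if mi(alloc(mid)) < R:
        lo = mid
    else:
        hi = mid
p = alloc(hi)              # minimal-cost power allocation meeting the MI floor
```

The full problem in the paper is nonconvex (hence GA-NP); this convex reduction only illustrates why the optimal allocation concentrates power on radars with favourable gain-to-intercept-cost ratios.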

  7. Multicriteria Optimisation in Logistics Forwarder Activities

    Directory of Open Access Journals (Sweden)

    Tanja Poletan Jugović

    2007-05-01

    The logistics forwarder, as organiser and planner of the coordination and integration of all the elements of transport and logistics chains, uses adequate ways and methods in the process of planning and decision-making. One of these methods, analysed in this paper, which can be used in the optimisation of the transport and logistics processes and activities of the logistics forwarder, is the multicriteria optimisation method. Using that method, this paper suggests a model of multicriteria optimisation of logistics forwarder activities. The suggested optimisation model is justified in keeping with the principles of multicriteria optimisation, which belongs to the operations research methods and represents a process of multicriteria ranking of variants. Among the many different multicriteria optimisation procedures, PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluations) and Promcalc & Gaia V. 3.2, a computer program of multicriteria programming based on the mentioned procedure, were used.
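
A miniature PROMETHEE II ranking shows the mechanics. The alternatives, criterion values and weights below are invented for illustration (the paper applies Promcalc & Gaia V. 3.2 to real forwarder data), and the "usual" preference function is used for every criterion:

```python
alternatives = ["road", "rail", "intermodal"]
# criteria: (cost [minimise], time [minimise], reliability [maximise])
scores = {"road":       [100, 20, 0.90],
          "rail":       [ 80, 30, 0.85],
          "intermodal": [ 90, 26, 0.95]}
weights = [0.5, 0.3, 0.2]
maximise = [False, False, True]

def pref(d):
    return 1.0 if d > 0 else 0.0     # "usual" preference function

def pairwise_pi(a, b):
    # aggregated preference of alternative a over b
    total = 0.0
    for k, w in enumerate(weights):
        d = scores[a][k] - scores[b][k]
        if not maximise[k]:
            d = -d                   # flip sign for minimised criteria
        total += w * pref(d)
    return total

n = len(alternatives)
net_flow = {a: sum(pairwise_pi(a, b) - pairwise_pi(b, a)
                   for b in alternatives if b != a) / (n - 1)
            for a in alternatives}
ranking = sorted(alternatives, key=net_flow.get, reverse=True)
print(ranking)  # alternatives ordered by net outranking flow
```

PROMETHEE I would compare the positive and negative flows separately; PROMETHEE II, as here, collapses them into a complete ranking by net flow.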

  8. An effective approach to reducing strategy space for maintenance optimisation of multistate series–parallel systems

    International Nuclear Information System (INIS)

    Zhou, Yifan; Lin, Tian Ran; Sun, Yong; Bian, Yangqing; Ma, Lin

    2015-01-01

    Maintenance optimisation of series–parallel systems is a research topic of practical significance. Nevertheless, a cost-effective maintenance strategy is difficult to obtain due to the large strategy space for maintenance optimisation of such systems. The heuristic algorithm is often employed to deal with this problem. However, the solution obtained by the heuristic algorithm is not always the global optimum and the algorithm itself can be very time consuming. An alternative method based on linear programming is thus developed in this paper to overcome such difficulties by reducing strategy space of maintenance optimisation. A theoretical proof is provided in the paper to verify that the proposed method is at least as effective as the existing methods for strategy space reduction. Numerical examples for maintenance optimisation of series–parallel systems having multistate components and considering both economic dependence among components and multiple-level imperfect maintenance are also presented. The simulation results confirm that the proposed method is more effective than the existing methods in removing inappropriate maintenance strategies of multistate series–parallel systems. - Highlights: • A new method using linear programming is developed to reduce the strategy space. • The effectiveness of the new method for strategy reduction is theoretically proved. • Imperfect maintenance and economic dependence are considered during optimisation

  9. Optimisation by hierarchical search

    Science.gov (United States)

    Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias

    2015-03-01

    Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general purpose approach to potentially improve the performance of conventional algorithms or special purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.
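
The core idea, re-optimising small groups of variables with the rest held fixed rather than flipping variables one at a time, can be sketched on a toy instance. The random Ising-like chain and the group size below are invented for illustration:

```python
import random

random.seed(1)

N = 12
J = [random.choice([-1, 1]) for _ in range(N - 1)]   # couplings

def energy(s):
    return sum(J[i] * s[i] * s[i + 1] for i in range(N - 1))

def optimise_group(s, idx):
    # brute-force the best joint assignment of the variables in `idx`,
    # keeping every other variable fixed
    best, best_bits = energy(s), [s[i] for i in idx]
    for mask in range(1 << len(idx)):
        for b, i in enumerate(idx):
            s[i] = 1 if mask >> b & 1 else -1
        e = energy(s)
        if e < best:
            best, best_bits = e, [s[i] for i in idx]
    for b, i in zip(best_bits, idx):
        s[i] = b
    return s

s = [random.choice([-1, 1]) for _ in range(N)]
for sweep in range(5):
    for start in range(0, N, 3):                 # groups of 3 spins
        s = optimise_group(s, list(range(start, min(start + 3, N))))
```

Because each group update is exact and never increases the energy, the sweep escapes local minima that defeat single-variable moves; the hierarchical approach in the abstract layers such group optimisations at multiple scales.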

  10. Bio-processing of solid wastes and secondary resources for metal extraction – A review

    International Nuclear Information System (INIS)

    Lee, Jae-chun; Pandey, Banshi Dhar

    2012-01-01

    Highlights: ► Review focuses on bio-extraction of metals from solid wastes of industries and consumer goods. ► Bio-processing of certain effluents/wastewaters with metals is also included in brief. ► Quantity/composition of wastes are assessed, and microbes used and leaching conditions included. ► Bio-recovery using bacteria, fungi and archaea is highlighted for resource recycling. ► Process methodology/mechanism, R and D direction and scope of large-scale use are briefly included. - Abstract: Metal-containing wastes/byproducts of various industries, used consumer goods, and municipal waste are potential pollutants if not treated properly. They may also be important secondary resources if processed in an eco-friendly manner for the secured supply of the contained metals/materials. Bio-extraction of metals from such resources with microbes such as bacteria, fungi and archaea is being increasingly explored to meet the twin objectives of resource recycling and pollution mitigation. This review focuses on the bio-processing of solid wastes/byproducts of metallurgical and manufacturing industries, chemical/petrochemical plants, electroplating and tanning units, besides sewage sludge and fly ash of municipal incinerators, electronic wastes (e-wastes/PCBs), used batteries, etc. An assessment has been made to quantify the wastes generated and their compositions, the microbes used, metal leaching efficiency, etc. Processing of certain effluents and wastewaters containing metals is also included in brief. Future directions of research are highlighted.

  11. Optimised dipper fine tunes shovel performance

    Energy Technology Data Exchange (ETDEWEB)

    Fiscor, S.

    2005-06-01

    Joint efforts between mine operators, OEMs and researchers yield unexpected benefits: dippers for shovels in coal, oil or hardrock mining can now be tailored to meet site-specific conditions. The article outlines a process being developed by CRCMining and P&H Mining Equipment to optimise the dipper, involving rapid prototyping and scale modelling of the dipper and the mine conditions. Scale models have been successfully field tested. 2 photos.

  12. Current NRPB recommendations on optimisation of protection of workers

    International Nuclear Information System (INIS)

    Wrixon, A.D.

    1994-01-01

    The National Radiological Protection Board is required by Ministerial Direction to provide advice on the relevance of the recommendations of the International Commission on Radiological Protection to the UK. Its advice was published in the Spring of 1993 after a period of consultation. In this article, which formed the basis of a presentation at an SRP Meeting on 29 April 1994, the Board's advice on the optimisation of protection of workers is explored and presented in the context of the developments in the understanding of the principle that have taken place in recent years. The most significant developments are the realisation that implementation of the principle is an essential function of good management and the recognition that the interests of the individual are not sufficiently taken into account by the dose limits alone but doses to individuals should be both constrained and optimised. (author)

  13. Energy balance of the optimised CVT-hybrid-driveline

    Energy Technology Data Exchange (ETDEWEB)

    Hoehn, Bernd-Robert; Pflaum, Hermann; Lechner, Claus [Forschungsstelle fuer Zahnraeder und Getriebebau, Technische Univ. Muenchen, Garching (Germany)

    2009-07-01

    Funded by the DFG (German Research Foundation) and some industry partners like GM Powertrain Europe, ZF and EPCOS, the Optimised CVT-Hybrid was developed at Technische Universitaet Muenchen in close collaboration with the industry and is currently under scientific investigation. Designed as a parallel hybrid vehicle, the Optimised CVT-Hybrid combines a series-production diesel engine with a small electric motor. The core element of the driveline is a two-range continuously variable transmission (i√i-transmission), which is based on a chain variator. By a special shifting process without interruption of traction force, the ratio range of the chain variator is used twice; thereby a wide transmission-ratio spread is achieved with low complexity. Thus the transmission provides a large pull-away ratio for the small electric motor and a fuel-efficient overdrive ratio for the IC engine. Instead of heavy and space-consuming accumulators, a small efficient package of double-layer capacitors (UltraCaps) is used for electric energy and power storage. The driveline management is done by an optimised vehicle controller. Within the scope of the research project, two prototype drivelines were manufactured. One driveline is integrated into an Opel Vectra Caravan and is available for investigations on the roller dynamometer and in actual road traffic. The second hybrid driveline is assembled at the powertrain test rig of the FZG for detailed analysis of system behaviour and fuel consumption. Based on measurements of standardised driving cycles, the system behaviour, fuel consumption and a detailed energy balance of the Optimised CVT-Hybrid are presented. The fuel savings are shown in comparison to the series-production vehicle. (orig.)

  14. Public transport optimisation emphasising passengers’ travel behaviour.

    OpenAIRE

    Jensen, Jens Parbo; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    Passengers in public transport complaining about their travel experiences are not uncommon. This might seem counterintuitive since several operators worldwide are presenting better key performance indicators year by year. The present PhD study focuses on developing optimisation algorithms to enhance the operations of public transport while explicitly emphasising passengers’ travel behaviour and preferences. Similar to economic theory, interactions between supply and demand are omnipresent in ...

  15. Optimised Design and Analysis of All-Optical Networks

    DEFF Research Database (Denmark)

    Glenstrup, Arne John

    2002-01-01

    This PhD thesis presents a suite of methods for optimising design and for analysing blocking probabilities of all-optical networks. It thus contributes methodical knowledge to the field of computer assisted planning of optical networks. A two-stage greenfield optical network design optimiser is developed, based on shortest-path algorithms and a comparatively new metaheuristic called simulated allocation. It is able to handle design of all-optical mesh networks with optical cross-connects, considers duct as well as fibre and node costs, and can also design protected networks. The method is assessed through various experiments and is shown to produce good results and to be able to scale up to networks of realistic sizes. A novel method, subpath wavelength grouping, is presented for routing connections in a multigranular all-optical network where several wavelengths can be grouped and switched at band and fibre ...

  16. Smart border initiative: a Franco-German cross-border energy optimisation project

    International Nuclear Information System (INIS)

    2017-01-01

    Integrated and optimised local energy systems will play a key role in achieving the energy transition objectives set by France and Germany, in line with the Energy Union's goals, and contribute to ensuring a secure, affordable and climate-friendly energy supply in the EU. In order to capitalise on the French and German expertise and experiences in developing such systems and to continue strengthening the cross-border cooperation towards a fully integrated European energy market, both Governments have decided to launch a common initiative to identify and structure a cross-border energy optimisation project. Tilia and Dena have undertaken this mission to jointly develop the Smart Border Initiative (SBI). The SBI will, on the one hand, connect policies designed by France and Germany in order to support their cities and territories in their energy transition strategies and European market integration. It is currently a paradox that, though more balanced and resilient energy systems build up, bottom-up, at the local level, borders remain an obstacle to this local integration, in spite of the numerous complementarities observed in cross-border regions, and of their specific needs, in terms of smart mobility for example. The SBI project aims at enabling European neighbouring regions separated by a border to jointly build up optimised local energy systems, and jointly develop their local economies following an integrated, sustainable and low-carbon model. On the other hand, this showcase project will initiate a new stage in the EU electricity market integration, by completing high voltage interconnections with local, low voltage integration at DSO level, opening new optimisation possibilities in managing the electricity balance, and enabling DSOs to jointly overcome some of the current challenges, notably the increased share of renewable energy (RE) and ensuring Europe's security of supply

  17. Optimisation of the Management of Higher Activity Waste in the UK - 13537

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, Ciara; Buckley, Matthew [Nuclear Decommissioning Authority, Building 587, Curie Avenue, Harwell Oxford, Didcot, Oxfordshire, OX11 0RH (United Kingdom)

    2013-07-01

    The Upstream Optioneering project was created in the Nuclear Decommissioning Authority (UK) to support the development and implementation of significant opportunities to optimise activities across all the phases of the Higher Activity Waste management life cycle (i.e. retrieval, characterisation, conditioning, packaging, storage, transport and disposal). The objective of the Upstream Optioneering project is to work in conjunction with other functions within the NDA and the waste producers to identify and deliver solutions to optimise the management of higher activity waste. Historically, optimisation may have occurred on individual aspects of the waste life cycle (considered here to include retrieval, conditioning, treatment, packaging, interim storage, and transport to the final end state, which may be geological disposal). By considering the waste life cycle as a whole, critical analysis of assumed constraints may lead to cost savings for the UK taxpayer. For example, it may be possible to challenge the requirements for packaging wastes for disposal to deliver an optimised waste life cycle. It is likely that the challenges faced in the UK are shared in other countries. It is therefore likely that the opportunities identified may also apply elsewhere, with the potential for sharing information to enable value to be shared. (authors)

  18. Automatic optimisation of gamma dose rate sensor networks: The DETECT Optimisation Tool

    DEFF Research Database (Denmark)

    Helle, K.B.; Müller, T.O.; Astrup, Poul

    2014-01-01

    Fast delivery of comprehensive information on the radiological situation is essential for decision-making in nuclear emergencies. Most national radiological agencies in Europe employ gamma dose rate sensor networks to monitor radioactive pollution of the atmosphere. Sensor locations were often ... of the EU FP 7 project DETECT. It evaluates the gamma dose rates that a proposed set of sensors might measure in an emergency and uses this information to optimise the sensor locations. The gamma dose rates are taken from a comprehensive library of simulations of atmospheric radioactive plumes from 64 source locations. These simulations cover the whole European Union, so the DOT allows evaluation and optimisation of sensor networks for all EU countries, as well as evaluation of fencing sensors around possible sources. Users can choose from seven cost functions to evaluate the capability of a given ...

  19. Multiobjective optimisation of energy systems and building envelope retrofit in a residential community

    International Nuclear Information System (INIS)

    Wu, Raphael; Mavromatidis, Georgios; Orehounig, Kristina; Carmeliet, Jan

    2017-01-01

    Highlights: • Simultaneous optimisation of building envelope retrofit and energy systems. • Retrofit and energy system changes interact and should be considered simultaneously. • Case study quantifies cost-GHG emission tradeoffs for different retrofit options. - Abstract: In this paper, a method for a multi-objective and simultaneous optimisation of building energy systems and retrofit is presented. Tailored to be suitable for the diverse range of existing buildings in terms of age, size, and use, it combines dynamic energy demand simulation to explore individual retrofit scenarios with an energy hub optimisation. Implemented as an epsilon-constrained mixed integer linear program (MILP), the optimisation matches envelope retrofit with renewable and high efficiency energy supply technologies such as biomass boilers, heat pumps, photovoltaic and solar thermal panels to minimise life cycle cost and greenhouse gas (GHG) emissions. Due to its multi-objective, integrated assessment of building transformation options and its ability to capture both individual building characteristics and trends within a neighbourhood, this method aims to provide developers and neighbourhood and town policy makers with the information needed to make sound decisions. Our method is deployed in a case study of typical residential buildings in the Swiss village of Zernez, simulating energy demands in EnergyPlus and solving the optimisation problem with CPLEX. Although common trade-offs in energy system and retrofit choice can be observed, optimisation results suggest that the diversity in building age and size leads to optimal retrofit and building-system strategies that are specific to the different building categories. With this method, GHG emissions of the entire community can be reduced by up to 76% at a cost increase of 3% compared to the current emission levels, if an optimised solution is selected for each building category.
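
    The epsilon-constraint scheme described above can be sketched independently of the full MILP. The following illustration (not from the paper; option names, costs and emissions are invented) minimises cost subject to a swept greenhouse-gas cap, which is how a cost-emission Pareto front is traced:

```python
# Minimal epsilon-constraint sketch: minimise life-cycle cost subject to a
# GHG-emission cap, sweeping the cap to trace a cost/emission Pareto front.
# Option data are illustrative, not from the paper.
OPTIONS = [
    # (name, annualised cost [kCHF/a], GHG emissions [tCO2/a])
    ("no retrofit + gas boiler", 10.0, 30.0),
    ("envelope retrofit + gas boiler", 14.0, 18.0),
    ("envelope retrofit + heat pump", 17.0, 8.0),
    ("deep retrofit + heat pump + PV", 22.0, 4.0),
]

def epsilon_constraint(options, eps_ghg):
    """Cheapest option whose emissions do not exceed the cap eps_ghg."""
    feasible = [o for o in options if o[2] <= eps_ghg]
    return min(feasible, key=lambda o: o[1]) if feasible else None

def pareto_front(options, caps):
    """Sweep emission caps and collect the distinct optimal options."""
    front = []
    for eps in caps:
        best = epsilon_constraint(options, eps)
        if best is not None and best not in front:
            front.append(best)
    return front

for name, cost, ghg in pareto_front(OPTIONS, caps=[30, 18, 8, 4]):
    print(f"{name}: {cost} kCHF/a, {ghg} tCO2/a")
```

    Each tightening of the cap forces a more expensive, lower-emission retrofit and system combination, mirroring the cost-GHG trade-offs reported in the case study.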

  20. Design and optimisation of dual-mode heat pump systems using natural fluids

    International Nuclear Information System (INIS)

    Zhang Wenling; Klemeš, Jiří Jaromír; Kim, Jin-Kuk

    2012-01-01

    The paper introduces a new multi-period modelling and design methodology for dual-mode heat pumps using natural fluids. First, a mathematical model is developed to capture the thermodynamic and operating characteristics of dual-mode heat pump systems, subject to different ambient temperatures. A multi-period optimisation framework has been developed to reflect different ambient conditions and their influence on heat pump performance, as well as to determine a heat pump capacity that allows systematic economic trade-offs between supplementary heating (or cooling) and the operating cost of the heat pump. A case study considering three geographical locations with different heating and cooling demands is presented to illustrate the importance of using multi-period optimisation for the design of heat pump systems.
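
    The multi-period trade-off the abstract describes, between installed heat-pump capacity, supplementary heating and operating cost under different ambient conditions, can be sketched with invented numbers (demands, COPs and prices are illustrative assumptions, not the paper's data):

```python
# Illustrative multi-period sizing: pick a heat-pump capacity that trades
# capital cost against supplementary heating across ambient-temperature
# periods. All demands, COPs and prices are invented for the sketch.
PERIODS = [
    # (hours per year, heat demand [kW], COP at that ambient temperature)
    (2000, 80.0, 2.5),   # cold period
    (4000, 40.0, 3.5),   # mild period
    (2760, 10.0, 4.5),   # warm period
]
CAPEX_PER_KW = 60.0      # annualised cost per kW of heat-pump capacity
ELEC_PRICE = 0.12        # per kWh of electricity
BACKUP_PRICE = 0.10      # per kWh of supplementary heat

def annual_cost(capacity_kw):
    cost = CAPEX_PER_KW * capacity_kw
    for hours, demand, cop in PERIODS:
        hp_heat = min(capacity_kw, demand)          # heat pump covers what it can
        cost += hours * hp_heat / cop * ELEC_PRICE  # electricity for the heat pump
        cost += hours * (demand - hp_heat) * BACKUP_PRICE  # supplementary heating
    return cost

best = min(range(0, 101, 5), key=annual_cost)
print(best)
```

    A full multi-period formulation would optimise capacity and operation jointly; here a brute-force capacity sweep stands in for the optimisation framework, but it still shows the economic trade-off: sizing for peak demand is not automatically optimal once supplementary heating is priced in.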

  1. Spatial-structural interaction and strain energy structural optimisation

    NARCIS (Netherlands)

    Hofmeyer, H.; Davila Delgado, J.M.; Borrmann, A.; Geyer, P.; Rafiq, Y.; Wilde, de P.

    2012-01-01

    A research engine iteratively transforms spatial designs into structural designs and vice versa. Furthermore, spatial and structural designs are optimised. It is suggested to optimise a structural design by evaluating the strain energy of its elements and by then removing, adding, or changing the
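
    The strain-energy-driven step described above can be sketched simply; the element energies below are illustrative stand-ins for a finite-element analysis:

```python
# Sketch of strain-energy-driven evolutionary structural optimisation:
# elements carrying the least strain energy are candidates for removal.
# Element names and energies are invented stand-ins for FE results.
def evolve(elements, removal_fraction=0.25):
    """Remove the lowest-strain-energy fraction of elements."""
    ranked = sorted(elements, key=lambda e: e[1])
    n_remove = int(len(ranked) * removal_fraction)
    return ranked[n_remove:]

elements = [("e1", 0.2), ("e2", 5.1), ("e3", 0.05), ("e4", 3.3),
            ("e5", 1.8), ("e6", 0.6), ("e7", 4.0), ("e8", 0.1)]
kept = evolve(elements)
print(sorted(name for name, _ in kept))
```

    In the actual research engine this removal/addition step alternates with re-analysis of the structure and with the spatial-design transformation.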

  2. Profile control studies for JET optimised shear regime

    Energy Technology Data Exchange (ETDEWEB)

    Litaudon, X.; Becoulet, A.; Eriksson, L.G.; Fuchs, V.; Huysmans, G.; How, J.; Moreau, D.; Rochard, F.; Tresset, G.; Zwingmann, W. [Association Euratom-CEA, CEA/Cadarache, Dept. de Recherches sur la Fusion Controlee, DRFC, 13 - Saint-Paul-lez-Durance (France); Bayetti, P.; Joffrin, E.; Maget, P.; Mayorat, M.L.; Mazon, D.; Sarazin, Y. [JET Abingdon, Oxfordshire (United Kingdom); Voitsekhovitch, I. [Universite de Provence, LPIIM, Aix-Marseille 1, 13 (France)

    2000-03-01

    This report summarises the profile control studies, i.e. preparation and analysis of JET Optimised Shear plasmas, carried out during the year 1999 within the framework of the Task-Agreement (RF/CEA/02) between JET and the Association Euratom-CEA/Cadarache. We report on our participation in the preparation of the JET Optimised Shear experiments together with their comprehensive analyses and the modelling. Emphasis is put on the various aspects of pressure profile control (core and edge pressure) together with detailed studies of current profile control by non-inductive means, in the prospects of achieving steady, high performance, Optimised Shear plasmas. (authors)

  3. Optimisation of radiation protection

    International Nuclear Information System (INIS)

    1988-01-01

    Optimisation of radiation protection is one of the key elements in the current radiation protection philosophy. The present system of dose limitation was issued in 1977 by the International Commission on Radiological Protection (ICRP) and includes, in addition to the requirements of justification of practices and limitation of individual doses, the requirement that all exposures be kept as low as is reasonably achievable, taking social and economic factors into account. This last principle is usually referred to as optimisation of radiation protection, or the ALARA principle. The NEA Committee on Radiation Protection and Public Health (CRPPH) organised an ad hoc meeting, in liaison with the NEA committees on the safety of nuclear installations and radioactive waste management. Separate abstracts were prepared for individual papers presented at the meeting

  4. Bioprocessing of low-level radioactive and mixed hazard wastes

    International Nuclear Information System (INIS)

    Stoner, D.L.

    1990-01-01

    Biologically-based treatment technologies are currently being developed at the Idaho National Engineering Laboratory (INEL) to aid in volume reduction and/or reclassification of low-level radioactive and mixed hazardous wastes prior to processing for disposal. The approaches taken to treat low-level radioactive and mixed wastes will reflect the physical (e.g., liquid, solid, slurry) and chemical (inorganic and/or organic) nature of the waste material being processed. Bioprocessing utilizes the diverse metabolic and biochemical characteristics of microorganisms. The application of bioadsorption and bioflocculation to reduce the volume of low-level radioactive waste is a strategy comparable to the use of ion-exchange resins and coagulants in current waste reduction processes. Mixed hazardous waste would require organic as well as radionuclide treatment processes. Biodegradation of organic wastes or bioemulsification could be used in conjunction with radioisotope bioadsorption methods to treat mixed hazardous radioactive wastes. The degradation of the organic constituents of mixed wastes can be considered an alternative to incineration, while bioemulsification may simply be used as a means to separate inorganics and organics to enable reclassification of wastes. The proposed technology base for the biological treatment of low-level radioactive and mixed hazardous waste has been established. Biodegradation of a variety of organic compounds typically found in mixed hazardous wastes has been demonstrated, degradative pathways have been determined, and the nutritional requirements of the microorganisms are understood. Accumulation, adsorption and concentration of heavy and transition metal species and transuranics by microorganisms are widely recognized. Work at the INEL focuses on the application of demonstrated microbial transformations to process development.

  5. Analysis of optimisation method for a two-stroke piston ring using the Finite Element Method and the Simulated Annealing Method

    Science.gov (United States)

    Kaliszewski, M.; Mazuro, P.

    2016-09-01

    The Simulated Annealing Method is tested as a means of optimising the sealing piston ring geometry. The aim of the optimisation is to develop a ring geometry that exerts the demanded pressure on the cylinder while being bent to fit it. A method for FEM analysis of an arbitrary piston ring geometry is implemented in ANSYS. The demanded pressure function (based on formulae presented by A. Iskra) as well as the objective function are introduced. A geometry definition constructed from polynomials in a radial coordinate system is presented and discussed. A possible application of the Simulated Annealing Method to the piston ring optimisation task is proposed and visualised. Difficulties that may lead to a lack of convergence of the optimisation are presented. An example of an unsuccessful optimisation performed in APDL is discussed. A possible line of further improvement of the optimisation is proposed.
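
    A simulated-annealing loop of the kind tested in the paper can be sketched on a toy objective. The polynomial "pressure" profile, the constant demanded pressure and all parameters below are illustrative assumptions, not the actual FEM-based objective:

```python
import math, random

# Generic simulated-annealing loop, shown on a toy objective: fit polynomial
# coefficients so that an assumed pressure profile matches a demanded
# constant pressure around the ring. Objective and settings are illustrative.
random.seed(1)

TARGET = 1.0  # demanded (normalised) contact pressure
ANGLES = [i * math.pi / 18 for i in range(19)]  # 0..pi

def pressure(coeffs, theta):
    # Toy profile: p(theta) = c0 + c1*cos(theta) + c2*cos(2*theta)
    return coeffs[0] + coeffs[1] * math.cos(theta) + coeffs[2] * math.cos(2 * theta)

def objective(coeffs):
    # Sum of squared deviations from the demanded pressure
    return sum((pressure(coeffs, t) - TARGET) ** 2 for t in ANGLES)

def anneal(x, temp=1.0, cooling=0.995, steps=4000):
    best, fbest = list(x), objective(x)
    f = fbest
    for _ in range(steps):
        cand = [c + random.gauss(0, 0.1) for c in x]  # random perturbation
        fc = objective(cand)
        # Accept improvements always, worsenings with Boltzmann probability
        if fc < f or random.random() < math.exp(-(fc - f) / temp):
            x, f = cand, fc
            if f < fbest:
                best, fbest = list(x), f
        temp *= cooling  # geometric cooling schedule
    return best, fbest

best, fbest = anneal([0.0, 0.5, 0.5])
print([round(c, 2) for c in best], round(fbest, 4))
```

    The acceptance of occasional uphill moves at high temperature is what distinguishes the method from plain descent; the convergence difficulties mentioned in the abstract typically arise from the interaction of the cooling schedule with the step size.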

  6. TEM turbulence optimisation in stellarators

    Science.gov (United States)

    Proll, J. H. E.; Mynick, H. E.; Xanthopoulos, P.; Lazerson, S. A.; Faber, B. J.

    2016-01-01

    With the advent of neoclassically optimised stellarators, optimising stellarators for turbulent transport is an important next step. The reduction of ion-temperature-gradient-driven turbulence has been achieved via shaping of the magnetic field, and the reduction of trapped-electron mode (TEM) turbulence is addressed in the present paper. Recent analytical and numerical findings suggest TEMs are stabilised when a large fraction of trapped particles experiences favourable bounce-averaged curvature. This is the case for example in Wendelstein 7-X (Beidler et al 1990 Fusion Technol. 17 148) and other Helias-type stellarators. Using this knowledge, a proxy function was designed to estimate the TEM dynamics, allowing optimal configurations for TEM stability to be determined with the STELLOPT (Spong et al 2001 Nucl. Fusion 41 711) code without extensive turbulence simulations. A first proof-of-principle optimised equilibrium stemming from the TEM-dominated stellarator experiment HSX (Anderson et al 1995 Fusion Technol. 27 273) is presented for which a reduction of the linear growth rates is achieved over a broad range of the operational parameter space. As an important consequence of this property, the turbulent heat flux levels are reduced compared with the initial configuration.

  7. Analysis and optimisation of heterogeneous real-time embedded systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2005-01-01

    . The success of such new design methods depends on the availability of analysis and optimisation techniques. Analysis and optimisation techniques for heterogeneous real-time embedded systems are presented in the paper. The authors address in more detail a particular class of such systems called multi...... of application messages to frames. Optimisation heuristics for frame packing aimed at producing a schedulable system are presented. Extensive experiments and a real-life example show the efficiency of the frame-packing approach....

  8. Analysis and optimisation of heterogeneous real-time embedded systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    . The success of such new design methods depends on the availability of analysis and optimisation techniques. Analysis and optimisation techniques for heterogeneous real-time embedded systems are presented in the paper. The authors address in more detail a particular class of such systems called multi...... of application messages to frames. Optimisation heuristics for frame packing aimed at producing a schedulable system are presented. Extensive experiments and a real-life example show the efficiency of the frame-packing approach....

  9. Isogeometric Analysis and Shape Optimisation

    DEFF Research Database (Denmark)

    Gravesen, Jens; Evgrafov, Anton; Gersborg, Allan Roulund

    of the whole domain. So in every optimisation cycle we need to extend a parametrisation of the boundary of a domain to the whole domain. It has to be fast in order not to slow the optimisation down, but it also has to be robust and give a parametrisation of high quality. These are conflicting requirements, so we...... will explain how the validity of a parametrisation can be checked and we will describe various ways to parametrise a domain. We will in particular study the Winslow functional, which turns out to have some desirable properties. Other problems we touch upon include clustering of boundary control points (design...

  10. Optimisation of occupational exposure

    International Nuclear Information System (INIS)

    Webb, G.A.M.; Fleishman, A.B.

    1982-01-01

    The general concept of the optimisation of protection of the public is briefly described. Some ideas being developed for extending the cost benefit framework to include radiation workers with full implementation of the ALARA criterion are described. The role of cost benefit analysis in radiological protection and the valuation of health detriment including the derivation of monetary values and practical implications are discussed. Cost benefit analysis can lay out for inspection the doses, the associated health detriment costs and the costs of protection for alternative courses of action. However it is emphasised that the cost benefit process is an input to decisions on what is 'as low as reasonably achievable' and not a prescription for making them. (U.K.)
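
    The cost-benefit input to an ALARA decision can be illustrated with a minimal sketch: each protection option's total cost is its protection cost plus the monetary value of its residual collective dose. The alpha value and the options below are invented for illustration, not derived values:

```python
# Cost-benefit comparison in the ALARA spirit: total cost of each protection
# option = cost of protection + monetary value of the residual collective
# dose. All figures are illustrative, not regulatory values.
ALPHA = 20000.0  # assumed monetary value per man-sievert

OPTIONS = [
    # (name, protection cost, residual collective dose [man Sv])
    ("no extra shielding", 0.0, 2.0),
    ("local shielding", 15000.0, 0.8),
    ("full enclosure", 60000.0, 0.5),
]

def total_cost(option, alpha=ALPHA):
    name, protection_cost, dose = option
    return protection_cost + alpha * dose

best = min(OPTIONS, key=total_cost)
print(best[0])
```

    As the abstract stresses, such a ranking lays out doses, detriment costs and protection costs for inspection; it is an input to the decision on what is 'as low as reasonably achievable', not a prescription for making it.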

  11. Time varying acceleration coefficients particle swarm optimisation (TVACPSO): A new optimisation algorithm for estimating parameters of PV cells and modules

    International Nuclear Information System (INIS)

    Jordehi, Ahmad Rezaee

    2016-01-01

    Highlights: • A modified PSO has been proposed for parameter estimation of PV cells and modules. • In the proposed modified PSO, acceleration coefficients are changed during the run. • The proposed modified PSO mitigates the premature convergence problem. • The parameter estimation problem has been solved for both PV cells and PV modules. • The results show that the proposed PSO outperforms other state-of-the-art algorithms. - Abstract: Estimating circuit model parameters of PV cells/modules represents a challenging problem. The PV cell/module parameter estimation problem is typically translated into an optimisation problem and solved by metaheuristic optimisation algorithms. Particle swarm optimisation (PSO) is a popular and well-established optimisation algorithm. Despite all its advantages, PSO suffers from a premature convergence problem, meaning that it may get trapped in local optima. Personal and social acceleration coefficients are two control parameters that, due to their effect on explorative and exploitative capabilities, play important roles in the computational behaviour of PSO. In this paper, in an attempt toward premature convergence mitigation in PSO, its personal acceleration coefficient is decreased during the course of the run, while its social acceleration coefficient is increased. In this way, an appropriate trade-off between the explorative and exploitative capabilities of PSO is established during the course of the run and the premature convergence problem is significantly mitigated. The results vividly show that in parameter estimation of PV cells and modules, the proposed time varying acceleration coefficients PSO (TVACPSO) offers more accurate parameters than conventional PSO, the teaching-learning-based optimisation (TLBO) algorithm, the imperialistic competitive algorithm (ICA), grey wolf optimisation (GWO), the water cycle algorithm (WCA), pattern search (PS) and the Newton algorithm. For validation of the proposed methodology, parameter estimation has been done both for
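
    The time-varying acceleration coefficients are the core of the proposal: the personal coefficient c1 is ramped down while the social coefficient c2 is ramped up over the run. A minimal sketch on a generic test function follows; the objective, swarm settings and coefficient endpoints are illustrative, not the paper's PV model:

```python
import random

# Sketch of PSO with time-varying acceleration coefficients: c1 shrinks and
# c2 grows linearly over the run, shifting the swarm from exploration to
# exploitation. Shown on the sphere function with illustrative settings.
random.seed(0)

def sphere(x):
    return sum(v * v for v in x)

def tvac_pso(f, dim=5, n=20, iters=200,
             c1_i=2.5, c1_f=0.5, c2_i=0.5, c2_f=2.5, w=0.7):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for t in range(iters):
        frac = t / (iters - 1)
        c1 = c1_i + (c1_f - c1_i) * frac  # personal term shrinks
        c2 = c2_i + (c2_f - c2_i) * frac  # social term grows
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

best, fbest = tvac_pso(sphere)
print(round(fbest, 6))
```

    Early in the run the large c1 keeps particles exploring around their own best points; late in the run the large c2 pulls the swarm toward the global best, which is the mechanism the abstract credits with mitigating premature convergence.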

  12. Treatment of supermarket vegetable wastes to be used as alternative substrates in bioprocesses.

    Science.gov (United States)

    Díaz, Ana Isabel; Laca, Amanda; Laca, Adriana; Díaz, Mario

    2017-09-01

    Fruits and vegetables have the highest wastage rates at retail and consumer levels. These wastes have promising potential for use as substrates in bioprocesses. However, an effective hydrolysis of the carbohydrates that form these residues has to be developed before the biotransformation. In this work, vegetable wastes from a supermarket (tomatoes, green peppers and potatoes) have been separately treated by acid, thermal and enzymatic hydrolysis processes in order to maximise the concentration of fermentable sugars in the final broth. For all substrates, the thermal and enzymatic processes proved to be the most effective. A new combined hydrolysis procedure including both of these treatments was also assayed, and the enzymatic step was successfully modelled. With this combined hydrolysis, the percentage of reducing sugars extracted was increased, in comparison with the amount extracted from non-hydrolysed samples, by approximately 30% in the case of tomato and green pepper wastes. For potato wastes this percentage increased from values lower than 1% to 77%. In addition, very low values of fermentation inhibitors were found in the final broth. Copyright © 2017. Published by Elsevier Ltd.
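
    The abstract does not give the form of the enzymatic-hydrolysis model, so the sketch below assumes simple first-order release of reducing sugars, S(t) = S_max(1 - e^(-kt)); the parameters are invented, with the 77% plateau echoing the potato result only as an illustration:

```python
import math

# Assumed first-order kinetics for enzymatic release of reducing sugars:
# S(t) = S_max * (1 - exp(-k t)). The model form and parameters are
# illustrative assumptions, not the model fitted in the paper.
def reducing_sugars(t_hours, s_max=77.0, k=0.15):
    """Percent of sugars extracted after t hours of enzymatic hydrolysis."""
    return s_max * (1.0 - math.exp(-k * t_hours))

for t in (0, 8, 24, 48):
    print(t, round(reducing_sugars(t), 1))
```

    A fitted model of this kind lets the plateau S_max and rate k be compared across substrates, which is how a modelled enzymatic step supports choosing hydrolysis times.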

  13. Establishing Local Reference Dose Values and Optimisation Strategies

    International Nuclear Information System (INIS)

    Connolly, P.; Moores, B.M.

    2000-01-01

    The revised EC Patient Directive 97/43 EURATOM introduces the concepts of clinical audit, diagnostic reference levels and optimisation of radiation protection in diagnostic radiology. The application of reference dose levels in practice involves the establishment of reference dose values as actual measurable operational quantities. These values should then form part of an ongoing optimisation and audit programme against which routine performance can be compared. The CEC Quality Criteria for Radiographic Images provide guidance reference dose values against which local performance can be compared. In many cases these values can be improved upon quite considerably. This paper presents the results of a local initiative in the North West of the UK aimed at establishing local reference dose values for a number of major hospital sites. The purpose of this initiative is to establish a foundation for both optimisation strategies and clinical audit as an ongoing and routine practice. The paper presents results from an ongoing trial involving patient dose measurements for several radiological examinations at these sites. The results of an attempt to establish local reference dose values from measured dose values and to employ them in optimisation strategies are presented. In particular, emphasis is placed on the routine quality control programmes necessary to underpin this strategy, including the effective data management of results from such programmes and how they can be employed in optimisation practice. (author)

  14. For Time-Continuous Optimisation

    DEFF Research Database (Denmark)

    Heinrich, Mary Katherine; Ayres, Phil

    2016-01-01

    Strategies for optimisation in design normatively assume an artefact end-point, disallowing continuous architecture that engages living systems, dynamic behaviour, and complex systems. In our Flora Robotica investigations of symbiotic plant-robot bio-hybrids, we require computational tools...

  15. Utility systems operation: Optimisation-based decision making

    International Nuclear Information System (INIS)

    Velasco-Garcia, Patricia; Varbanov, Petar Sabev; Arellano-Garcia, Harvey; Wozny, Guenter

    2011-01-01

    Utility systems provide heat and power to industrial sites. The importance of operating these systems optimally has increased significantly owing to the unstable and, in the long term, rising prices of fossil fuels, as well as the need to reduce greenhouse gas emissions. This paper presents an analysis of the problem of supporting operator decision making under conditions of variable steam demands from the production processes on an industrial site. An optimisation model has been developed in which, besides the cost of running the utility system, the costs associated with starting up the operating units are also modelled. The illustrative case study shows that accounting for the shut-downs and start-ups of utility operating units can bring significant cost savings. - Highlights: → Optimisation methodology for decision making on running utility systems. → Accounting for varying steam demands. → Optimal operating specifications when a demand change occurs. → Operating costs include start-up costs of boilers and other units. → Validated on a real-life case study. Up to 20% cost savings are possible.
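
    The key point, that start-up costs change the optimal response to a demand drop, reduces to a simple break-even test in its most stripped-down form. The figures below are invented:

```python
# Toy version of the decision the record describes: when steam demand drops
# for a few hours, is it cheaper to shut a boiler down and restart it later,
# or to keep it on hot standby? All cost figures are illustrative.
STARTUP_COST = 500.0     # cost of one boiler start-up
STANDBY_COST = 80.0      # cost per hour of keeping the boiler hot but unloaded

def cheaper_to_shut_down(idle_hours):
    """Shut down only if the avoided standby cost exceeds the restart cost."""
    return STANDBY_COST * idle_hours > STARTUP_COST

for hours in (2, 10):
    action = "shut down" if cheaper_to_shut_down(hours) else "keep on standby"
    print(f"{hours} h idle -> {action}")
```

    The full optimisation model extends this trade-off across many units and time-varying demands, which is where the reported savings of up to 20% arise.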

  16. Development of an optimised 1:1 physiotherapy intervention post first-time lumbar discectomy: a mixed-methods study

    Science.gov (United States)

    Rushton, A; White, L; Heap, A; Heneghan, N; Goodwin, P

    2016-01-01

    Objectives To develop an optimised 1:1 physiotherapy intervention that reflects best practice, with flexibility to tailor management to individual patients, thereby ensuring patient-centred practice. Design Mixed-methods combining evidence synthesis, expert review and focus groups. Setting Secondary care involving 5 UK specialist spinal centres. Participants A purposive panel of clinical experts from the 5 spinal centres, comprising spinal surgeons, inpatient and outpatient physiotherapists, provided expert review of the draft intervention. Purposive samples of patients (n=10) and physiotherapists (n=10) (inpatient/outpatient physiotherapists managing patients with lumbar discectomy) were invited to participate in the focus groups at 1 spinal centre. Methods A draft intervention developed from 2 systematic reviews; a survey of current practice and research related to stratified care was circulated to the panel of clinical experts. Lead physiotherapists collaborated with physiotherapy and surgeon colleagues to provide feedback that informed the intervention presented at 2 focus groups investigating acceptability to patients and physiotherapists. The focus groups were facilitated by an experienced facilitator, recorded in written and tape-recorded forms by an observer. Tape recordings were transcribed verbatim. Data analysis, conducted by 2 independent researchers, employed an iterative and constant comparative process of (1) initial descriptive coding to identify categories and subsequent themes, and (2) deeper, interpretive coding and thematic analysis enabling concepts to emerge and overarching pattern codes to be identified. Results The intervention reflected best available evidence and provided flexibility to ensure patient-centred care. The intervention comprised up to 8 sessions of 1:1 physiotherapy over 8 weeks, starting 4 weeks postsurgery. The intervention was acceptable to patients and physiotherapists. Conclusions A rigorous process informed an

  17. Design and manufacture of Portland cement - application of sensitivity analysis in exploration and optimisation Part II. Optimisation

    DEFF Research Database (Denmark)

    Svinning, K.; Høskuldsson, Agnar

    2006-01-01

    A program for a model-based optimisation has been developed. The program contains two subprograms. The first one performs minimisation or maximisation constrained by one original PLS-component, or by one equal to a combination of several. The second one searches for the optimal combination of PLS-components which gives the maximum or minimum y. The program has proved to be applicable for achieving realistic results for implementation in the design of Portland cement with respect to performance and in the quality control during production.

  18. Process and Economic Optimisation of a Milk Processing Plant with Solar Thermal Energy

    DEFF Research Database (Denmark)

    Bühler, Fabian; Nguyen, Tuong-Van; Elmegaard, Brian

    2016-01-01

    This work investigates the integration of solar thermal systems for process energy use. A shift from fossil fuels to renewable energy could be beneficial both from environmental and economic perspectives, after the process itself has been optimised and efficiency measures have been implemented. Based on the case study of a dairy factory, where first a heat integration is performed to optimise the system, a model for solar thermal process integration is developed. The detailed model is based on annual hourly global direct and diffuse solar radiation, from which the radiation on a defined surface is calculated. Based on hourly process stream data from the dairy factory, the optimal streams for solar thermal process integration are found, with an optimal thermal storage-tank volume. The last step consists of an economic optimisation of the problem to determine the optimal size...

  19. Validation of a large-scale audit technique for CT dose optimisation

    International Nuclear Information System (INIS)

    Wood, T. J.; Davis, A. W.; Moore, C. S.; Beavis, A. W.; Saunderson, J. R.

    2008-01-01

    The expansion and increasing availability of computed tomography (CT) imaging means that there is a greater need for the development of efficient optimisation strategies that are able to inform clinical practice, without placing a significant burden on limited departmental resources. One of the most fundamental aspects of any optimisation programme is the collection of patient dose information, which can be compared with appropriate diagnostic reference levels. This study has investigated the implementation of a large-scale audit technique, which utilises data that already exist in the radiology information system, to determine typical doses for a range of examinations on four CT scanners. This method has been validated against what is considered the 'gold standard' technique for patient dose audits, and it has been demonstrated that results equivalent to the 'standard-sized patient' can be inferred from this much larger data set. This is particularly valuable where CT optimisation is concerned, as CT is considered a 'high dose' technique, and hence close monitoring of patient dose is particularly important. (authors)
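
    The audit idea, deriving typical doses from data already held in the radiology information system and comparing them with reference levels, can be sketched as follows; the records and reference values are invented for illustration:

```python
import statistics

# Sketch of a large-scale dose audit: group per-examination dose records
# (as already held in a radiology information system) by examination type
# and compare each median against a diagnostic reference level. The records
# and DRL figures below are invented.
records = [
    ("CT head", 890.0), ("CT head", 930.0), ("CT head", 1010.0),
    ("CT head", 870.0), ("CT chest", 420.0), ("CT chest", 610.0),
    ("CT chest", 380.0), ("CT chest", 450.0),
]
DRL = {"CT head": 970.0, "CT chest": 610.0}  # DLP in mGy*cm, illustrative

def audit(records, drl):
    by_exam = {}
    for exam, dlp in records:
        by_exam.setdefault(exam, []).append(dlp)
    report = {}
    for exam, dlps in by_exam.items():
        median = statistics.median(dlps)
        report[exam] = (median, median <= drl[exam])
    return report

for exam, (median, ok) in audit(records, DRL).items():
    print(exam, median, "within DRL" if ok else "review protocol")
```

    Using the median over a large routinely collected sample is what lets results equivalent to the 'standard-sized patient' be inferred without dedicated measurement sessions.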

  20. Optimisation of Transmission Systems by use of Phase Shifting Transformers

    Energy Technology Data Exchange (ETDEWEB)

    Verboomen, J

    2008-10-13

    In this thesis, transmission grids with PSTs (Phase Shifting Transformers) are investigated. In particular, the following goals are put forward: (a) The analysis and quantification of the impact of a PST on a meshed grid. This includes the development of models for the device; (b) The development of methods to obtain optimal coordination of several PSTs in a meshed grid. An objective function should be formulated, and an optimisation method must be adopted to solve the problem; and (c) The investigation of different strategies to use a PST. Chapter 2 gives a short overview of active power flow controlling devices. In chapter 3, a first step towards optimal PST coordination is taken. In chapter 4, metaheuristic optimisation methods are discussed. Chapter 5 introduces DC load flow approximations, leading to analytically closed equations that describe the relation between PST settings and active power flows. In chapter 6, some applications of the methods that are developed in earlier chapters are presented. Chapter 7 contains the conclusions of this thesis, as well as recommendations for future work.
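
    The DC load-flow approximations of chapter 5 yield analytically closed relations between PST settings and active power flows. A minimal two-parallel-line sketch (per-unit values invented) shows the mechanism:

```python
# DC load-flow sketch of a PST on one of two parallel lines: the phase
# shift alpha redistributes active power between the paths while the total
# transfer stays fixed. Reactances and the transfer are illustrative (p.u.).
def parallel_flows(p_total, x1, x2, alpha):
    """Flows on two parallel lines A->B, with a PST (angle alpha) in line 2."""
    # Solve p_total = theta/x1 + (theta + alpha)/x2 for the angle difference
    theta = (p_total - alpha / x2) / (1.0 / x1 + 1.0 / x2)
    f1 = theta / x1
    f2 = (theta + alpha) / x2
    return f1, f2

f1, f2 = parallel_flows(1.0, 0.2, 0.2, 0.0)
print(f1, f2)                        # equal split without phase shift
f1s, f2s = parallel_flows(1.0, 0.2, 0.2, 0.04)
print(round(f1s, 2), round(f2s, 2))  # PST pushes flow onto line 2
```

    The closed form is what makes coordinated optimisation of several PST settings tractable before resorting to the metaheuristic methods discussed in chapter 4.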

  1. Effectiveness of an implementation optimisation intervention aimed at increasing parent engagement in HENRY, a childhood obesity prevention programme - the Optimising Family Engagement in HENRY (OFTEN) trial: study protocol for a randomised controlled trial.

    Science.gov (United States)

    Bryant, Maria; Burton, Wendy; Cundill, Bonnie; Farrin, Amanda J; Nixon, Jane; Stevens, June; Roberts, Kim; Foy, Robbie; Rutter, Harry; Hartley, Suzanne; Tubeuf, Sandy; Collinson, Michelle; Brown, Julia

    2017-01-24

    Family-based interventions to prevent childhood obesity depend upon parents' taking action to improve diet and other lifestyle behaviours in their families. Programmes that attract and retain high numbers of parents provide an enhanced opportunity to improve public health and are also likely to be more cost-effective than those that do not. We have developed a theory-informed optimisation intervention to promote parent engagement within an existing childhood obesity prevention group programme, HENRY (Health Exercise Nutrition for the Really Young). Here, we describe a proposal to evaluate the effectiveness of this optimisation intervention with regard to parent engagement and cost-effectiveness. The Optimising Family Engagement in HENRY (OFTEN) trial is a cluster randomised controlled trial being conducted across 24 local authorities (approximately 144 children's centres) which currently deliver HENRY programmes. The primary outcome will be parental enrolment and attendance at the HENRY programme, assessed using routinely collected process data. Cost-effectiveness will be presented in terms of primary outcomes using acceptability curves and through eliciting the willingness to pay for the optimisation from HENRY commissioners. Secondary outcomes include the longitudinal impact of the optimisation, parent-reported infant intake of fruits and vegetables (as a proxy to compliance) and other parent-reported family habits and lifestyle. This innovative trial will provide evidence on the implementation of a theory-informed optimisation intervention to promote parent engagement in HENRY, a community-based childhood obesity prevention programme. The findings will be generalisable to other interventions delivered to parents in other community-based environments. This research meets the expressed needs of commissioners, children's centres and parents to optimise the potential impact that HENRY has on obesity prevention.
A subsequent cluster randomised controlled pilot

  2. Estimators for initial conditions for optimisation in learning hydraulic systems

    NARCIS (Netherlands)

    Post, W.J.A.E.M.; Burrows, C.R.; Edge, K.A.

    1998-01-01

    In Learning Hydraulic Systems (LHS), developed at the Eindhoven University of Technology, a specialised optimisation routine is employed in order to reduce energy losses in hydraulic systems. Typical load situations which can be managed by LHS are variable cyclic loads, as can be observed in many

  3. Structural-electrical coupling optimisation for radiating and scattering performances of active phased array antenna

    Science.gov (United States)

    Wang, Congsi; Wang, Yan; Wang, Zhihai; Wang, Meng; Yuan, Shuai; Wang, Weifeng

    2018-04-01

    It is well known that calculating and reducing the radar cross section (RCS) of an active phased array antenna (APAA) is both difficult and complicated. Balancing radiating and scattering performance while the RCS is reduced remains an unresolved problem. Therefore, this paper develops a structure and scattering array factor coupling model of the APAA based on the phase errors of the radiating elements generated by structural distortion and installation error of the array. To obtain the optimal radiating and scattering performance, an integrated optimisation model is built to optimise the installation height of all the radiating elements in the normal direction of the array, in which the particle swarm optimisation method is adopted and the gain loss and scattering array factor are selected as the fitness function. The simulation indicates that the proposed coupling model and integrated optimisation method can effectively decrease the RCS while the necessary radiating performance is simultaneously guaranteed, which demonstrates an important application value in the engineering design and structural evaluation of APAAs.
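
    The coupling described, structural displacement of radiating elements producing phase errors that degrade the array factor, can be sketched as follows. The round-trip (2·dz) phase model, the frequency and the displacement profile are assumptions for illustration, not the paper's model:

```python
import cmath, math

# Sketch of the structural-electrical coupling: element displacements dz
# become phase errors, and the broadside array factor (hence gain) drops.
# The 2*dz path-length model and all values are illustrative assumptions.
WAVELENGTH = 0.03  # 10 GHz, illustrative
K = 2 * math.pi / WAVELENGTH

def array_factor_gain(dz, n=16):
    """Broadside array-factor magnitude, normalised to the undistorted array."""
    phases = [2 * K * d for d in dz]          # assumed round-trip phase error
    af = sum(cmath.exp(1j * p) for p in phases)
    return abs(af) / n

flat = array_factor_gain([0.0] * 16)
warped = array_factor_gain([0.001 * math.sin(i / 5) for i in range(16)])
print(round(flat, 3), round(warped, 3))
```

    An optimiser such as the particle swarm method in the paper would adjust element installation heights to restore this gain while shaping the scattering array factor.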

  4. Some Current Problems in Optimisation of Radiation Protection System

    International Nuclear Information System (INIS)

    Franic, Z.; Prlic, I.

    2001-01-01

    Full text: The current system of radiation protection is generally based on recommendations promulgated in International Commission on Radiological Protection (ICRP) Publication 60. These principles and recommendations were subsequently adopted by the International Atomic Energy Agency (IAEA) in the International Basic Safety Standards for Protection against Ionising Radiation and for the Safety of Radiation Sources (BSS). However, in recent years certain problems have arisen, such as the application of risk factors at low doses, the use and interpretation of collective dose, the concept of dose commitment, the optimisation of all types of occupational exposure and practices, and the implementation of the ALARA approach in common occupational as well as in quite complex situations. This paper presents some of the issues that have to be addressed in the development of the new ICRP Recommendations, which are planned to be developed over the next four or five years. As the new radiation protection philosophy shifts from society-based control of stochastic risks to an individual-based policy, it will require the introduction of a modified approach to the optimisation process and probably of some new dosimetric quantities. (author)

  5. Topology optimisation of micro fluidic mixers considering fluid-structure interactions with a coupled Lattice Boltzmann algorithm

    Science.gov (United States)

    Munk, David J.; Kipouros, Timoleon; Vio, Gareth A.; Steven, Grant P.; Parks, Geoffrey T.

    2017-11-01

    Recently, the study of micro fluidic devices has gained much interest in various fields from biology to engineering. In the constant development cycle, the need to optimise the topology of the interior of these devices, where there are two or more optimality criteria, is always present. In this work, twin physical situations, whereby optimal fluid mixing in the form of vorticity maximisation is accompanied by the requirement that the casing in which the mixing takes place has the best structural performance in terms of the greatest specific stiffness, are considered. In the steady state of mixing this also means that the stresses in the casing are as uniform as possible, thus giving a desired operating life with minimum weight. The ultimate aim of this research is to couple two key disciplines, fluids and structures, into a topology optimisation framework, which shows fast convergence for multidisciplinary optimisation problems. This is achieved by developing a bi-directional evolutionary structural optimisation algorithm that is directly coupled to the Lattice Boltzmann method, used for simulating the flow in the micro fluidic device, for the objectives of minimum compliance and maximum vorticity. The needs for the exploration of larger design spaces and to produce innovative designs make meta-heuristic algorithms, such as genetic algorithms, particle swarms and Tabu Searches, less efficient for this task. The multidisciplinary topology optimisation framework presented in this article is shown to increase the stiffness of the structure from the datum case and produce physically acceptable designs. Furthermore, the topology optimisation method outperforms a Tabu Search algorithm in designing the baffle to maximise the mixing of the two fluids.

  6. Multifunctional optimised scope simulators in Central and Eastern Europe

    International Nuclear Information System (INIS)

    Bartak, J.; Hauesberger, P.; Dalleur, J.P.; Houard, J.

    1999-01-01

    In the field of operator training, multiple functions have to be covered, such as basic principles training, training on specific systems, operations training addressing operating procedures in normal, incidental and accidental situations, and analysis of plant physical phenomena. Training simulators are appropriate tools to meet these needs. Optimisation of the scope of simulation is required to meet specific training objectives and produce cost-effective solutions that allow for possible future extensions. Training needs and training programs have to be identified with the participation of the final users, leading to the development of appropriate training materials: 'multifunctional' (also called analytical) optimised scope simulators are a concrete solution to meeting this challenge. For these simulators, the quality of the physical models used is equivalent to that used in full-scope replica-type simulators. Moreover, all state-of-the-art technical requirements for the development of training simulators must be satisfied: realism of modelling, tolerances, simulated incidents and accidents. Examples of this concept are illustrated in the paper through the presentation of recent developments of simulators in Central and Eastern European NPPs (VVER-1000, VVER-440, RBMK, BN600, PWR 600). A brief presentation of the software workshop used to develop these simulators concludes the paper. (author)

  7. User perspectives in public transport timetable optimisation

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    The present paper deals with timetable optimisation from the perspective of minimising the waiting time experienced by passengers when transferring either to or from a bus. Due to its inherent complexity, this bi-level minimisation problem is extremely difficult to solve mathematically, since tim...... on the large-scale public transport network in Denmark. The timetable optimisation approach yielded a yearly reduction in weighted waiting time equivalent to approximately 45 million Danish kroner (9 million USD)....

  8. Optimising of Steel Fiber Reinforced Concrete Mix Design | Beddar ...

    African Journals Online (AJOL)

    Optimising of Steel Fiber Reinforced Concrete Mix Design. ... as a result of the loss of mixture workability that will be translated into a difficult concrete casting in site. ... An experimental study of an optimisation method of fibres in reinforced ...

  9. Module detection in complex networks using integer optimisation

    Directory of Open Access Journals (Sweden)

    Tsoka Sophia

    2010-11-01

    Full Text Available Abstract Background The detection of modules or community structure is widely used to reveal the underlying properties of complex networks in biology, as well as in the physical and social sciences. Since the adoption of modularity as a measure of network topological properties, several methodologies for the discovery of community structure based on modularity maximisation have been developed. However, satisfactory partitions of large graphs with modest computational resources are particularly challenging due to the NP-hard nature of the related optimisation problem. Furthermore, it has been suggested that optimising the modularity metric can reach a resolution limit whereby the algorithm fails to detect communities smaller than a specific size in large networks. Results We present a novel solution approach to identify community structure in large complex networks and address resolution limitations in module detection. The proposed algorithm employs modularity to express network community structure and is based on mixed integer optimisation models. The solution procedure is extended iteratively to diminish effects that tend to agglomerate smaller modules (resolution limitations). Conclusions A comprehensive comparative analysis of methodologies for module detection based on modularity maximisation shows that our approach outperforms previously reported methods. Furthermore, in contrast to previous reports, we propose a strategy to handle resolution limitations in modularity maximisation. Overall, we illustrate ways to improve existing methodologies for community structure identification so as to increase their efficiency and applicability.
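
    The quantity being maximised above is Newman modularity. As an illustration of that measure only (not the paper's mixed-integer formulation), here is a minimal Python sketch on a hypothetical six-node graph of two triangles joined by a bridge edge:

```python
def modularity(adj, communities):
    """Newman modularity Q for an undirected graph.

    adj: dict node -> set of neighbours; communities: dict node -> label.
    Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * delta(c_i, c_j).
    """
    m2 = sum(len(nbrs) for nbrs in adj.values())  # 2m: each edge counted twice
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    q = 0.0
    for i in adj:
        for j in adj:
            if communities[i] != communities[j]:
                continue
            a_ij = 1.0 if j in adj[i] else 0.0
            q += a_ij - deg[i] * deg[j] / m2
    return q / m2

# Two triangles joined by one bridge edge; the natural partition puts
# each triangle in its own module.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
adj = {v: set() for v in range(6)}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

split = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}
merged = {v: "A" for v in range(6)}
print(round(modularity(adj, split), 4))   # the two-module split scores higher...
print(round(modularity(adj, merged), 4))  # ...than lumping everything together
```

    Maximising Q over all partitions is what makes the problem NP-hard; the resolution limit arises because the k_i*k_j/(2m) null-model term shrinks as the network grows.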

  10. Discussion on Implementation of ICRP Recommendations Concerning Reference Levels and Optimisation

    International Nuclear Information System (INIS)

    2013-02-01

    national plans. The International Atomic Energy Agency (IAEA) has adopted the ICRP recommendations into its revised Basic Safety Standards (BSSs). In practice, the full implementation will take some time. The EGIRES undertook this work to produce a report that would hopefully contribute to the common understanding of these important concepts and that would therefore assist in the use of reference levels and optimisation in the development of protection strategies in national plans. This report provides emergency management authorities in the NEA's member countries with clear and concise information and recommendations on key issues, possible approaches, and a summary of experience for implementing the new ICRP recommendations and revised BSSs for emergency exposure situations and for resultant existing exposure situations. (authors)

  11. Optimisation of milling parameters using neural network

    Directory of Open Access Journals (Sweden)

    Lipski Jerzy

    2017-01-01

    Full Text Available The purpose of this study was to design and test intelligent computer software developed to increase the average productivity of milling without compromising the design features of the final product. The developed system generates optimal milling parameters based on the extent of tool wear. The introduced optimisation algorithm employs a multilayer model of the milling process developed in an artificial neural network. The input parameters for model training are the following: cutting speed vc, feed per tooth fz and the degree of tool wear measured by means of localised flank wear (VB). The output parameter is the roughness Ra of the machined surface. Since the model in the neural network exhibits good approximation of functional relationships, it was applied to determine optimal milling parameters under changing tool wear conditions (VB) and to stabilise the surface roughness parameter Ra. Our solution enables constant control over surface roughness parameters and the productivity of the milling process after each assessment of tool condition. The recommended parameters, i.e. those which, when applied in milling, ensure the desired surface roughness and maximal productivity, are selected from all the parameters generated by the model. The developed software may constitute an expert system supporting a milling machine operator. In addition, the application may be installed on a mobile device (smartphone), connected to a tool wear diagnostics instrument and the machine tool controller in order to supply updated optimal milling parameters. The presented solution facilitates tool life optimisation and reduces tool change costs, particularly during prolonged operation.

  12. Topology Optimisation for Coupled Convection Problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Andreasen, Casper Schousboe; Aage, Niels

    stabilised finite elements implemented in a parallel multiphysics analysis and optimisation framework DFEM [1], developed and maintained in house. Focus is put on control of the temperature field within the solid structure and the problems can therefore be seen as conjugate heat transfer problems, where heat...... conduction governs in the solid parts of the design domain and couples to convection-dominated heat transfer to a surrounding fluid. Both loosely coupled and tightly coupled problems are considered. The loosely coupled problems are convection-diffusion problems, based on an advective velocity field from...

  13. 1st oPAC Topical Workshop: Grand Challenges in Accelerator Optimisation

    CERN Document Server

    2013-01-01

    Accelerators are key instruments for fundamental research, health and industry applications. International collaboration is very important for their continued optimisation. To address this, oPAC is organising this two-day international workshop on Grand Challenges in Accelerator Optimisation. The workshop will provide an overview of the current state of the art in beam physics, numerical simulations and beam instrumentation and highlight existing limitations. It will discuss research and development being undertaken and ambitions to further improve the performance of existing and future facilities. In addition to invited talks, there will be industry displays and a special seminar covering recent LHC discoveries. All participants will have an opportunity to contribute a poster.

  14. Methodological principles for optimising functional MRI experiments

    International Nuclear Information System (INIS)

    Wuestenberg, T.; Giesel, F.L.; Strasburger, H.

    2005-01-01

    Functional magnetic resonance imaging (fMRI) is one of the most common methods for localising neuronal activity in the brain. Even though the sensitivity of fMRI is comparatively low, the optimisation of certain experimental parameters allows reliable results to be obtained. In this article, approaches for optimising the experimental design, imaging parameters and analytic strategies are discussed. Clinical neuroscientists and interested physicians will receive practical rules of thumb for improving the efficiency of brain imaging experiments. (orig.)

  15. Noise aspects at aerodynamic blade optimisation projects

    International Nuclear Information System (INIS)

    Schepers, J.G.

    1997-06-01

    The Netherlands Energy Research Foundation (ECN) has often been involved in industrial projects in which blade geometries are created automatically by means of numerical optimisation. Usually, these projects aim at determining the aerodynamically optimal wind turbine blade, i.e. the goal is to design a blade which is optimal with regard to energy yield. In other cases, blades have been designed which are optimal with regard to the cost of generated energy. However, it is obvious that the wind turbine blade designs which result from these optimisations are not necessarily optimal with regard to noise emission. In this paper an example is shown of an aerodynamic blade optimisation using the ECN program PVOPT. PVOPT calculates the optimal wind turbine blade geometry such that the maximum energy yield is obtained. Using the aerodynamically optimal blade design as a basis, the possibilities of noise reduction are investigated. 11 figs., 8 refs

  16. Achieving a Sustainable Urban Form through Land Use Optimisation: Insights from Bekasi City’s Land-Use Plan (2010–2030)

    Directory of Open Access Journals (Sweden)

    Rahmadya Trias Handayanto

    2017-02-01

    Full Text Available Cities worldwide have been trying to achieve a sustainable urban form to handle their rapid urban growth. Many sustainable urban forms have been studied, and two of them, the compact city and the eco city, were chosen in this study as urban form foundations. Based on these forms, four sustainable city criteria (compactness, compatibility, dependency, and suitability) were considered as necessary functions for land use optimisation. This study presents land use optimisation as a method for achieving a sustainable urban form. Three optimisation methods (particle swarm optimisation, genetic algorithms, and a local search method) were combined into a single hybrid optimisation method for land use in Bekasi city, Indonesia. It was also used for examining Bekasi city’s land-use plan (2010–2030) after optimising current (2015) and future land use (2030). After current land use optimisation, the score of the sustainable city criteria increased significantly. Three important centres of land use (commercial, industrial, and residential) were also created through clustering the results. These centres were slightly different from the centres of the city plan zones. Additional land uses in 2030 were predicted using a nonlinear autoregressive neural network with external input. Three scenarios were used for allocating these additional land uses: sustainable development, government policy, and business-as-usual. The future land use allocation for 2030 showed that the sustainable development scenario performed better than the government policy and business-as-usual scenarios.

  17. Optimal design and operation of a photovoltaic-electrolyser system using particle swarm optimisation

    Science.gov (United States)

    Sayedin, Farid; Maroufmashat, Azadeh; Roshandel, Ramin; Khavas, Sourena Sattari

    2016-07-01

    In this study, hydrogen generation is maximised by optimising the size and the operating conditions of an electrolyser (EL) directly connected to a photovoltaic (PV) module at different irradiance levels. Due to the variations of the maximum power points of the PV module during a year and the complexity of the system, a nonlinear approach is considered. A mathematical model has been developed to determine the performance of the PV/EL system. The optimisation methodology presented here is based on the particle swarm optimisation algorithm. By this method, for a given number of PV modules, the optimal size and operating conditions of a PV/EL system are achieved. The approach can be applied to different sizes of PV systems, various ambient temperatures and different locations with various climatic conditions. The results show that for the given location and PV system, the energy transfer efficiency of the PV/EL system can reach up to 97.83%.
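
    The abstract does not detail the algorithm, so as an illustration of the basic global-best particle swarm optimisation scheme it builds on, here is a minimal Python sketch. The objective is a hypothetical stand-in (a squared mismatch with a known minimum), not the paper's PV/EL model:

```python
import random

def pso(objective, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimise `objective` over the box `bounds` with global-best PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # personal best positions
    pbest_f = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clip to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Hypothetical stand-in objective: squared mismatch between a PV
# maximum-power point and an electrolyser operating point, as a function
# of (cell area, operating voltage); its minimum is at (2.0, 1.8).
mismatch = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.8) ** 2
best, f = pso(mismatch, bounds=[(0.5, 5.0), (1.4, 2.2)])
print(best, f)  # converges near [2.0, 1.8]
```

    In the paper's setting the objective would instead be evaluated through the nonlinear PV/EL performance model at each candidate design.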

  18. DACIA LOGAN LIVE AXLE OPTIMISATION USING COMPUTER GRAPHICS

    Directory of Open Access Journals (Sweden)

    KIRALY Andrei

    2017-05-01

    Full Text Available The paper presents some contributions to the calculation and optimisation of a live axle used on the Dacia Logan, using computer graphics software to create the model and FEA evaluation to determine the effectiveness of the optimisation. Thus, using specialised computer software, a simulation was performed and the results were compared with measurements of the real prototype.

  19. Mesh dependence in PDE-constrained optimisation an application in tidal turbine array layouts

    CERN Document Server

    Schwedes, Tobias; Funke, Simon W; Piggott, Matthew D

    2017-01-01

    This book provides an introduction to PDE-constrained optimisation using finite elements and the adjoint approach. The practical impact of the mathematical insights presented here are demonstrated using the realistic scenario of the optimal placement of marine power turbines, thereby illustrating the real-world relevance of best-practice Hilbert space aware approaches to PDE-constrained optimisation problems. Many optimisation problems that arise in a real-world context are constrained by partial differential equations (PDEs). That is, the system whose configuration is to be optimised follows physical laws given by PDEs. This book describes general Hilbert space formulations of optimisation algorithms, thereby facilitating optimisations whose controls are functions of space. It demonstrates the importance of methods that respect the Hilbert space structure of the problem by analysing the mathematical drawbacks of failing to do so. The approaches considered are illustrated using the optimisation problem arisin...

  20. A conceptual optimisation strategy for radiography in a digital environment

    International Nuclear Information System (INIS)

    Baath, M.; Haakansson, M.; Hansson, J.; Maansson, L. G.

    2005-01-01

    Using a completely digital environment for the entire imaging process leads to new possibilities for the optimisation of radiography, since many restrictions of screen/film systems, such as the small dynamic range and the lack of possibilities for image processing, no longer apply. However, at the same time these new possibilities lead to a more complicated optimisation process, since more freedom is given to alter parameters. This paper focuses on describing an optimisation strategy that concentrates on taking advantage of the conceptual differences between digital systems and screen/film systems. The strategy can be summarised as: (a) always include the anatomical background during the optimisation, (b) perform all comparisons at a constant effective dose and (c) separate the image display stage from the image collection stage. A three-step process is proposed in which the optimal setting of the technique parameters is determined first, followed by an optimisation of the image processing. In the final step the optimal dose level - given the optimal settings of the image collection and image display stages - is determined. (authors)

  1. Reliability analysis and optimisation of subsea compression system facing operational covariate stresses

    International Nuclear Information System (INIS)

    Okaro, Ikenna Anthony; Tao, Longbin

    2016-01-01

    This paper proposes an enhanced Weibull-Corrosion Covariate model for reliability assessment of a system facing operational stresses. The newly developed model is applied to a Subsea Gas Compression System planned for offshore West Africa to predict its reliability index. System technical failure was modelled by developing a Weibull failure model incorporating a physically tested corrosion profile as a stress, in order to quantify the survival rate of the system under additional operational covariates including marine pH, temperature and pressure. Using Reliability Block Diagrams and enhanced Fussell-Vesely formulations, the whole system was systematically decomposed into sub-systems to analyse the criticality of each component and optimise them. Human reliability was addressed using an enhanced barrier weighting method. A rapid degradation curve is obtained for the subsea system, relative to the base case, when it is subjected to a time-dependent corrosion stress factor. It reveals that subsea system components failed faster than their mean-time-to-failure specifications from the Offshore Reliability Database as a result of the cumulative exertion of marine stresses. The case study demonstrated that the reliability of a subsea system can be systematically optimised by modelling the system under higher technical and organisational stresses, prioritising the critical sub-systems and making befitting provisions for redundancy and tolerances. - Highlights: • Novel Weibull Corrosion-Covariate model for reliability analysis of subsea assets. • Predicts the accelerated degradation profile of a subsea gas compression system. • An enhanced optimisation method based on the Fussell-Vesely decomposition process. • New optimisation approach for smoothing over- and under-designed components. • Demonstrated a significant improvement in producing more realistic failure rates.
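
    A Weibull survival model with a log-linear covariate (proportional-hazards) acceleration term, of the general kind the abstract describes, can be sketched as follows. All parameter values here are hypothetical illustrations, not the paper's fitted values:

```python
import math

def weibull_cov_reliability(t, beta, eta, gamma, stresses):
    """Survival probability R(t) for a Weibull baseline hazard with a
    log-linear covariate term: R(t) = exp(-(t/eta)^beta * exp(gamma . z)).

    beta: shape, eta: scale, gamma: covariate coefficients, stresses: z.
    """
    accel = math.exp(sum(g * z for g, z in zip(gamma, stresses)))
    return math.exp(-((t / eta) ** beta) * accel)

# Hypothetical parameters: shape 1.8, scale 12 years; two covariates
# (normalised corrosion rate, normalised pressure ratio).
beta, eta = 1.8, 12.0
gamma = (0.9, 0.4)

benign = weibull_cov_reliability(8.0, beta, eta, gamma, (0.1, 0.2))
harsh = weibull_cov_reliability(8.0, beta, eta, gamma, (1.0, 1.0))
print(f"R(8y) benign={benign:.3f}  harsh={harsh:.3f}")  # harsh < benign
```

    The covariate term multiplies the cumulative hazard, which is how higher marine stresses shorten the effective life relative to the baseline Weibull curve.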

  2. Optimising metadata workflows in a distributed information environment

    OpenAIRE

    Robertson, R. John; Barton, Jane

    2005-01-01

    The different purposes present within a distributed information environment create the potential for repositories to enhance their metadata by capitalising on the diversity of metadata available for any given object. This paper presents three conceptual reference models required to achieve this optimisation of metadata workflow: the ecology of repositories, the object lifecycle model, and the metadata lifecycle model. It suggests a methodology for developing the metadata lifecycle model, and ...

  3. Dynamics of complex interconnected systems: Networks and bioprocesses [A NATO study seminar]

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, Line K

    2005-07-01

    Rapid detection of chemical and biological agents and weapons, and rapid diagnosis of their effects on people, will require molecular recognition as well as signal discrimination, i.e. avoiding false positives and negatives, and signal transduction. It will be important to have reagentless, cheap, easily manufactured sensors that can be field deployed in large numbers. While this problem is urgent, it is not yet solved. This ASI brought together researchers with various interests and backgrounds, including theoretical physicists, soft condensed matter experimentalists, biological physicists, and molecular biologists, to identify and discuss areas where synergism between modern physics and biology may be most fruitfully applied to the study of bioprocesses for molecular recognition and of networks for converting molecular reactions into usable signals and appropriate responses. (Author)

  4. Optimisation of a novel trailing edge concept for a high lift device

    CSIR Research Space (South Africa)

    Botha, JDM

    2014-09-01

    Full Text Available A novel concept (referred to as the flap extension) is implemented on the leading edge of the flap of a three element high lift device. The concept is optimised using two optimisation approaches based on Genetic Algorithm optimisations. A zero order...

  5. Distributed optimisation problem with communication delay and external disturbance

    Science.gov (United States)

    Tran, Ngoc-Tu; Xiao, Jiang-Wen; Wang, Yan-Wu; Yang, Wu

    2017-12-01

    This paper investigates the distributed optimisation problem for multi-agent systems (MASs) with the simultaneous presence of external disturbance and communication delay. To solve this problem, a two-step design scheme is introduced. In the first step, based on the internal model principle, an internal model term is constructed to compensate the disturbance asymptotically. In the second step, a distributed optimisation algorithm is designed to solve the distributed optimisation problem for MASs with the simultaneous presence of disturbance and communication delay. Moreover, in the proposed algorithm, each agent interacts with its neighbours through the connected topology and the delay occurs during the information exchange. By utilising a Lyapunov-Krasovskii functional, delay-dependent conditions are derived for both slowly and fast time-varying delays to ensure the convergence of the algorithm to the optimal solution of the optimisation problem. Several numerical simulation examples are provided to illustrate the effectiveness of the theoretical results.
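
    Setting aside the paper's internal-model disturbance compensation and delay analysis, the underlying consensus-plus-gradient idea can be sketched as follows. The quadratic private objectives and ring topology are assumptions of this illustration only:

```python
# Four agents on a ring; agent i privately holds f_i(x) = (x - a_i)^2,
# so the team optimum of sum_i f_i(x) is the mean of the a_i.
a = [1.0, 4.0, 2.0, 7.0]
n = len(a)
x = [0.0] * n   # each agent's local estimate
step = 0.02     # fixed gradient step size

for _ in range(2000):
    # consensus step: average with the two ring neighbours
    x = [(x[(i - 1) % n] + x[i] + x[(i + 1) % n]) / 3.0 for i in range(n)]
    # local gradient step on the private objective (grad f_i = 2(x - a_i))
    x = [x[i] - step * 2.0 * (x[i] - a[i]) for i in range(n)]

print([round(v, 3) for v in x])  # all close to mean(a) = 3.5
```

    With a fixed step size the agents settle near, not exactly at, consensus on the optimum; a diminishing step size (or the gradient-tracking corrections studied in this literature) removes that residual disagreement, and the paper adds disturbance and delay handling on top.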

  6. Development of a model for optimisation of a power plant mix by means of evolution strategy; Modellentwicklung zur Kraftwerksparkoptimierung mit Hilfe von Evolutionsstrategien

    Energy Technology Data Exchange (ETDEWEB)

    Roth, Hans

    2008-09-17

    Within the scope of this thesis a model based on evolution strategy is presented, which optimises the upgrade of an existing power plant mix. The optimisation problem is divided into two parts, covering the building of new power plants as well as their ideal usage within the existing power plant mix. The building of new power plants is optimised by means of mutations, while their ideal usage is specified by a heuristic classification according to the merit order of the power plant mix. By applying a residual yearly load curve, the consumer load can be modelled, incorporating the impact of fluctuating power generation and its probability of occurrence. Power plant failures and the duration of revisions are adequately considered by means of a power reduction factor. The optimisation furthermore accommodates a limiting threshold for yearly carbon dioxide emissions as well as the premature decommissioning of power plants. (orig.)
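
    A minimal (mu + lambda) evolution strategy of the general kind described, applied to a toy capacity-expansion problem, can be sketched as follows. The costs, build limits, demand figure and penalty are invented for illustration and are not the thesis model:

```python
import random

rng = random.Random(0)
cost = (20.0, 35.0, 50.0)     # hypothetical per-MW cost of three plant types
cap_max = (40.0, 80.0, 80.0)  # hypothetical build limit per type, MW
demand = 100.0                # MW that must be covered

def fitness(x):
    """Total cost plus a heavy penalty for unserved demand (lower is better)."""
    shortfall = max(0.0, demand - sum(x))
    return sum(c * xi for c, xi in zip(cost, x)) + 1e3 * shortfall

def mutate(x, sigma=5.0):
    """Gaussian mutation, clipped to the feasible capacity box."""
    return tuple(min(max(xi + rng.gauss(0.0, sigma), 0.0), m)
                 for xi, m in zip(x, cap_max))

# (mu + lambda) selection: keep the mu best of parents plus offspring.
mu, lam = 10, 40
pop = [tuple(rng.uniform(0.0, m) for m in cap_max) for _ in range(mu)]
for _ in range(300):
    offspring = [mutate(rng.choice(pop)) for _ in range(lam)]
    pop = sorted(pop + offspring, key=fitness)[:mu]

best = pop[0]
print([round(v, 1) for v in best], round(fitness(best), 1))
```

    The cheapest feasible mix here is to exhaust the 20/MW type and cover the rest with the 35/MW type (cost 2900), which the strategy approaches; the thesis replaces this toy fitness with the merit-order dispatch against the residual load curve.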

  7. SINGLE FIXED CRANE OPTIMISATION WITHIN A DISTRIBUTION CENTRE

    Directory of Open Access Journals (Sweden)

    J. Matthews

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: This paper considers the optimisation of the movement of a fixed crane operating in a single aisle of a distribution centre. The crane must move pallets in inventory between docking bays, storage locations, and picking lines. Both a static and a dynamic approach to the problem are presented. The optimisation is performed by means of tabu search, ant colony metaheuristics, and hybrids of these two methods. All these solution approaches were tested on real life data obtained from an operational distribution centre. Results indicate that the hybrid methods outperform the other approaches.

    AFRIKAANSE OPSOMMING (translated): The optimisation of the movement of a fixed crane in a single aisle of a distribution centre is considered in this article. The crane must transport pallets between docking bays, storage locations, and picking lines. Both a static and a dynamic approach to the problem are presented. The optimisation is performed by means of tabu searches, ant colony optimisation, and hybrids of these two methods. All the solution approaches were tested with real data obtained from an operational distribution centre. The results show that the hybrid methods yield the best solutions.

  8. Optimisation of the performance of a novel rotationally asymmetrical optical concentrator design for building integrated photovoltaic system

    International Nuclear Information System (INIS)

    Abu-Bakar, Siti Hawa; Muhammad-Sukki, Firdaus; Freier, Daria; Ramirez-Iniguez, Roberto; Mallick, Tapas Kumar; Munir, Abu Bakar; Mohd Yasin, Siti Hajar; Abubakar Mas'ud, Abdullahi; Md Yunus, Norhidayah

    2015-01-01

    Solar energy is one of the renewable energy sources that has shown promising potential in addressing the world's energy needs, particularly via solar PV (photovoltaic) technology. However, the high cost of installation is still considered the main obstacle to the widespread adoption of solar PV systems. The use of solar concentrators is one of the solutions that could help to produce lower cost solar PV systems. One of the existing concentrator designs is known as the RADTIRC (rotationally asymmetrical dielectric totally internally reflecting concentrator), developed at GCU (Glasgow Caledonian University) since 2010. This paper aims at optimising the existing RADTIRC prototype by increasing its electrical output whilst keeping the cost of the system to a minimum. This is achieved by adopting a better material and a different technique to fabricate the concentrator. The optimised RADTIRC prototype was fabricated from PMMA (polymethyl methacrylate) using injection moulding. It was found that the optimised RADTIRC-PV prototype generated an opto-electronic gain of 4.48 when compared with the bare cell under STC (standard test conditions). A comparison with the old prototype showed that the optimised RADTIRC-PV prototype increased the short circuit current by 13.57% under STC. - Highlights: • An optimisation of the performance of the RADTIRC was presented. • The optimised prototype was fabricated from PMMA using injection moulding. • The electrical and optical performances were investigated. • The optimised prototype generated an opto-electronic gain of 4.48.

  9. Optimising resource management in neurorehabilitation.

    Science.gov (United States)

    Wood, Richard M; Griffiths, Jeff D; Williams, Janet E; Brouwers, Jakko

    2014-01-01

    To date, little research has been published regarding the effective and efficient management of resources (beds and staff) in neurorehabilitation, despite it being an expensive service in limited supply. To demonstrate how mathematical modelling can be used to optimise service delivery, by way of a case study at a major 21-bed neurorehabilitation unit in the UK. An automated computer program for assigning weekly treatment sessions is developed. Queue modelling is used to construct a mathematical model of the hospital in terms of referral submissions to a waiting list, admission and treatment, and ultimately discharge. This is used to analyse the impact of hypothetical strategic decisions on a variety of performance measures and costs. The project culminates in a hybridised model of these two approaches, since a relationship is found between the number of therapy hours received each week (scheduling output) and length of stay (queuing model input). The introduction of the treatment scheduling program has substantially improved timetable quality (meaning a better and fairer service to patients) and has reduced employee time expended in its creation by approximately six hours each week (freeing up time for clinical work). The queuing model has been used to assess the effect of potential strategies, such as increasing the number of beds or employing more therapists. The use of mathematical modelling has not only optimised resources in the short term, but has allowed the optimality of longer term strategic decisions to be assessed.
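
    The abstract does not specify the queue model, but as an illustration of the kind of calculation such a model supports, the Erlang-B loss formula estimates the probability that a referral arrives to find every bed occupied. The arrival rate and length of stay below are hypothetical figures, not the unit's data:

```python
def erlang_b(servers, offered_load):
    """Blocking probability in an M/M/c/c loss system, via the standard
    Erlang-B recursion B(c) = a*B(c-1) / (c + a*B(c-1)), B(0) = 1."""
    b = 1.0
    for c in range(1, servers + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

# Hypothetical figures: 1.4 referrals/week, mean stay 12 weeks
# -> offered load a = 1.4 * 12 = 16.8 simultaneous beds demanded on average.
load = 1.4 * 12
for beds in (21, 23, 25):
    print(beds, "beds -> blocking probability", round(erlang_b(beds, load), 4))
```

    Repeating the calculation for different bed counts or lengths of stay is exactly the kind of "what if" question the paper answers with its (richer, waiting-list-based) queuing model.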

  10. Natural Erosion of Sandstone as Shape Optimisation.

    Science.gov (United States)

    Ostanin, Igor; Safonov, Alexander; Oseledets, Ivan

    2017-12-11

    Natural arches, pillars and other exotic sandstone formations have always attracted attention for their unusual shapes and amazing mechanical balance that leave a strong impression of intelligent design rather than the result of a stochastic process. It has recently been demonstrated that these shapes could have been the result of the negative feedback between stress and erosion that originates in fundamental laws of friction between the rock's constituent particles. Here we present a deeper analysis of this idea and bridge it with the approaches utilised in shape and topology optimisation. It appears that the processes of natural erosion, driven by stochastic surface forces and the Mohr-Coulomb law of dry friction, can be viewed within the framework of local optimisation for minimum elastic strain energy. Our hypothesis is confirmed by numerical simulations of the erosion using the topological-shape optimisation model. Our work contributes to a better understanding of stochastic erosion and feasible landscape formations that could be found on Earth and beyond.

  11. Optimisation of timetable-based, stochastic transit assignment models based on MSA

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker; Frederiksen, Rasmus Dyhr

    2006-01-01

    (CRM), such a large-scale transit assignment model was developed and estimated. The Stochastic User Equilibrium problem was solved by the Method of Successive Averages (MSA). However, the model suffered from very large calculation times. The paper focuses on how to optimise transit assignment models...

  12. Techno-economic optimisation of energy systems; Contribution a l'optimisation technico-economique de systemes energetiques

    Energy Technology Data Exchange (ETDEWEB)

    Mansilla Pellen, Ch

    2006-07-15

    The traditional approach currently used to assess the economic interest of energy systems is based on a defined flow-sheet. Some studies have shown that the flow-sheets corresponding to the best thermodynamic efficiencies do not necessarily lead to the best production costs. A method called techno-economic optimisation was proposed. This method aims at minimising the production cost of a given energy system, including both investment and operating costs. It was implemented using genetic algorithms. This approach was compared to the heat integration method on two different examples, thus validating its interest. Techno-economic optimisation was then applied to different energy systems dealing with hydrogen as well as electricity production. (author)

  13. A national optimisation model for energy wood streams; Energiapuuvirtojen valtakunnallinen optimointimalli

    Energy Technology Data Exchange (ETDEWEB)

    Iikkanen, P.; Keskinen, S.; Korpilahti, A.; Raesaenen, T.; Sirkiae, A.

    2011-07-01

    In 2010 a total of 12.5 terawatt hours of forest energy was used in Finland's heat and power plants. According to studies by Metsaeteho and Poeyry, the use of energy wood will nearly double to 21.6 terawatt hours by 2020. There are also plans to use energy wood as a raw material for biofuel plants. The techno-ecological supply potential of energy wood in 2020 is estimated at 42.9 terawatt hours. Energy wood has so far been transported almost entirely by road. The situation is changing, however, because growing demand for energy wood will expand raw wood procurement areas and lengthen transport distances. A cost-effective transport system therefore also requires the use of rail and waterway transport. In Finland, however, there is an almost complete absence of the terminals required for rail and waterway transport, where energy wood is chipped, temporarily stored and loaded onto railway wagons and vessels for further transport. A national optimisation model for energy wood has been developed to serve transport system planning in particular. The linear optimisation model optimises, on a national level, goods streams between supply points and usage points based on forest energy procurement costs. The model simultaneously covers deliveries of forest chips, stumps and small-sized thinning wood. The procurement costs used in the optimisation include the energy wood's roadside price and the costs of chipping, transport and terminal handling. The transport system described in the optimisation model consists of wood supply points (2007 municipality precision), wood usage points, railway terminals and the connections between them along the main road and rail network. Elements required for the examination of waterway transport can also be easily added to the model. The optimisation model can be used to examine, for example, the effects of changes in energy wood demand and supply, as well as transport costs, on energy wood goods streams, the relative use of different
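
    The kind of goods-stream optimisation such a model performs can be sketched as a balanced transportation problem: allocate energy wood from supply points to usage points at minimum procurement cost. All quantities and unit costs below are invented for illustration, and the exhaustive search only works because the instance is tiny; a national model would use linear programming at far larger scale:

```python
# Minimal sketch: 2 supply points, 2 usage points, balanced supply/demand.
supply = [30, 20]          # energy wood available at each supply point
demand = [25, 25]          # energy wood required at each usage point
cost = [[4.0, 6.0],        # cost per unit from supply i to usage point j
        [5.0, 3.0]]        # (roadside price + chipping + transport)

best_cost, best_plan = float("inf"), None
# In a balanced 2x2 problem, choosing x00 fixes the other three flows.
for x00 in range(min(supply[0], demand[0]) + 1):
    x01 = supply[0] - x00            # remainder of supply point 0
    x10 = demand[0] - x00            # remainder of demand at usage point 0
    x11 = supply[1] - x10            # remainder of supply point 1
    if min(x01, x10, x11) < 0:       # infeasible allocation
        continue
    total = (cost[0][0] * x00 + cost[0][1] * x01 +
             cost[1][0] * x10 + cost[1][1] * x11)
    if total < best_cost:
        best_cost, best_plan = total, (x00, x01, x10, x11)

print(best_cost, best_plan)  # -> 190.0 (25, 5, 0, 20)
```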

  14. Application of Surpac and Whittle Software in Open Pit Optimisation ...

    African Journals Online (AJOL)

    Application of Surpac and Whittle Software in Open Pit Optimisation and Design. ... This paper studies the Surpac and Whittle software and their application in designing an optimised pit.

  15. Topology Optimisation of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Thike Aye Min

    2016-01-01

    Full Text Available Wireless sensor networks are widely used in a variety of fields, including industrial environments. In the case of a clustered network, the location of the cluster head affects the reliability of the network operation. Finding the optimum location of the cluster head is therefore critical for the design of a network. This paper discusses an optimisation approach, based on the brute force algorithm, in the context of topology optimisation of a cluster-structure centralised wireless sensor network. Two examples are given to verify the approach, demonstrating the implementation of the brute force algorithm to find an optimum location of the cluster head.
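
    A minimal sketch of such a brute-force search for a cluster-head site follows. The node coordinates, the candidate grid and the sum-of-distances objective are assumptions made for illustration, not details from the paper:

```python
import itertools
import math

# Hypothetical sensor node positions on a small coordinate grid.
sensors = [(0, 0), (4, 0), (0, 3), (5, 4)]

def total_distance(ch):
    """Sum of Euclidean distances from a candidate cluster-head site to
    every sensor (a simple proxy for communication energy cost)."""
    return sum(math.dist(ch, s) for s in sensors)

# Brute force: evaluate every candidate site on a 6 x 5 grid, keep the best.
candidates = itertools.product(range(6), range(5))
best = min(candidates, key=total_distance)
print(best)  # -> (2, 2)
```

    Exhaustive evaluation is only viable for small candidate sets, which is exactly the setting of a single-cluster network; larger topologies would call for heuristics.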

  16. Application of ant colony optimisation in distribution transformer sizing

    African Journals Online (AJOL)

    This study proposes an optimisation method for transformer sizing in power systems using ant colony optimisation, with verification of the process in MATLAB. The aim is to address the issue of transformer sizing, which is a major challenge affecting its effective performance, longevity, huge capital cost and power ...

  17. A reliability-based maintenance technicians' workloads optimisation model with stochastic consideration

    Science.gov (United States)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2016-06-01

    The growing interest in technicians' workloads research is probably associated with the recent surge in competition. This was prompted by unprecedented technological development that triggered changes in customer tastes and preferences for industrial goods. In a quest for business improvement, this intense worldwide competition has stimulated theories and practical frameworks that seek to optimise performance in workplaces. In line with this drive, the present paper proposes an optimisation model which considers technicians' reliability alongside the factory information obtained. The information used comprised technicians' productivity and earned values within a multi-objective modelling approach. Since technicians are expected to carry out routine and stochastic maintenance work, we consider these workloads as constraints. The influence of training, fatigue and experiential knowledge of technicians on workload management was considered. These workloads were combined with maintenance policy in optimising reliability, productivity and earned values using the goal programming approach. Practical datasets were used to study the applicability of the proposed model in practice. It was observed that our model was able to generate information that practising maintenance engineers can apply in making more informed decisions on technicians' management.

  18. 3D printed fluidics with embedded analytic functionality for automated reaction optimisation.

    Science.gov (United States)

    Capel, Andrew J; Wright, Andrew; Harding, Matthew J; Weaver, George W; Li, Yuqi; Harris, Russell A; Edmondson, Steve; Goodridge, Ruth D; Christie, Steven D R

    2017-01-01

    Additive manufacturing or '3D printing' is being developed as a novel manufacturing process for the production of bespoke micro- and milliscale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multifunctional fluidic devices with embedded reaction monitoring capability. The selectively laser melted parts are the first published examples of multifunctional 3D printed metal fluidic devices. These devices allow high temperature and pressure chemistry to be performed in solvent systems destructive to the majority of devices manufactured via stereolithography, polymer jetting and fused deposition modelling processes previously utilised for this application. These devices were integrated with commercially available flow chemistry, chromatographic and spectroscopic analysis equipment, allowing automated online and inline optimisation of the reaction medium. This set-up allowed the optimisation of two reactions, a ketone functional group interconversion and a fused polycyclic heterocycle formation, via spectroscopic and chromatographic analysis.

  19. 3D printed fluidics with embedded analytic functionality for automated reaction optimisation

    Directory of Open Access Journals (Sweden)

    Andrew J. Capel

    2017-01-01

    Full Text Available Additive manufacturing or ‘3D printing’ is being developed as a novel manufacturing process for the production of bespoke micro- and milliscale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multifunctional fluidic devices with embedded reaction monitoring capability. The selectively laser melted parts are the first published examples of multifunctional 3D printed metal fluidic devices. These devices allow high temperature and pressure chemistry to be performed in solvent systems destructive to the majority of devices manufactured via stereolithography, polymer jetting and fused deposition modelling processes previously utilised for this application. These devices were integrated with commercially available flow chemistry, chromatographic and spectroscopic analysis equipment, allowing automated online and inline optimisation of the reaction medium. This set-up allowed the optimisation of two reactions, a ketone functional group interconversion and a fused polycyclic heterocycle formation, via spectroscopic and chromatographic analysis.

  20. 3D printed fluidics with embedded analytic functionality for automated reaction optimisation

    Science.gov (United States)

    Capel, Andrew J; Wright, Andrew; Harding, Matthew J; Weaver, George W; Li, Yuqi; Harris, Russell A; Edmondson, Steve; Goodridge, Ruth D

    2017-01-01

    Additive manufacturing or ‘3D printing’ is being developed as a novel manufacturing process for the production of bespoke micro- and milliscale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multifunctional fluidic devices with embedded reaction monitoring capability. The selectively laser melted parts are the first published examples of multifunctional 3D printed metal fluidic devices. These devices allow high temperature and pressure chemistry to be performed in solvent systems destructive to the majority of devices manufactured via stereolithography, polymer jetting and fused deposition modelling processes previously utilised for this application. These devices were integrated with commercially available flow chemistry, chromatographic and spectroscopic analysis equipment, allowing automated online and inline optimisation of the reaction medium. This set-up allowed the optimisation of two reactions, a ketone functional group interconversion and a fused polycyclic heterocycle formation, via spectroscopic and chromatographic analysis. PMID:28228852

  1. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  2. Optimisation of Software-Defined Networks Performance Using a Hybrid Intelligent System

    Directory of Open Access Journals (Sweden)

    Ann Sabih

    2017-06-01

    Full Text Available This paper proposes a novel intelligent technique designed to optimise the performance of Software Defined Networks (SDN). The proposed hybrid intelligent system integrates intelligence-based optimisation approaches with an artificial neural network. These heuristic optimisation methods include Genetic Algorithms (GA) and Particle Swarm Optimisation (PSO). These methods were utilised separately in order to select the best inputs to maximise SDN performance. In order to identify SDN behaviour, the neural network model is trained and applied. The best optimisation approach was identified analytically, considering SDN performance and computational time as objective functions. Initially, the general model of the neural network was tested with unseen data before implementing the model using GA and PSO to determine the optimal performance of SDN. The results showed that the SDN, represented by an artificial neural network (ANN) and optimised by PSO, generated a better configuration with regard to computational efficiency and the performance index.
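
    The PSO stage can be illustrated with a compact, self-contained implementation minimising a toy objective that stands in for the SDN performance index; the swarm parameters are generic textbook values, not those used in the paper:

```python
import random

random.seed(1)

def sphere(x):
    """Toy objective standing in for the (negated) SDN performance index."""
    return sum(v * v for v in x)

def pso(f, dim=2, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Basic global-best particle swarm optimisation."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                 # personal best positions
    gbest = min(pbest, key=f)[:]                # global best position
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso(sphere)
```

    In the paper's setting, the objective would be a trained ANN surrogate of SDN behaviour rather than this analytic test function.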

  3. Optimisation of energy supply at off-grid healthcare facilities using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Dufo-López, Rodolfo; Pérez-Cebollada, Eduardo; Bernal-Agustín, José L.; Martínez-Ruiz, Ignacio

    2016-01-01

    Highlights: • We study the application of renewable energies in a hospital located in Kalonge. • A stochastic approach is developed by means of Monte Carlo simulation. • We propose adding PV panels to improve the supply of electrical energy. • The results show that optimal design could achieve 28% reduction in the LCE. • Furthermore, we discuss possible improvements to the telecommunications of the hospital. - Abstract: In this paper, we present a methodology for the optimisation of off-grid hybrid systems (photovoltaic–diesel–battery systems). A stochastic approach is developed by means of Monte Carlo simulation to consider the uncertainties of irradiation and load. The optimisation is economic; that is, we look for a system with a lower net present cost including installation, replacement of the components, operation and maintenance, etc. The most important variable that must be estimated is the batteries lifespan, which depends on the operating conditions (charge/discharge cycles, corrosion, state of charge, etc.). Previous works used classical methods for the estimation of batteries lifespan, which can be too optimistic in many cases, obtaining a net present cost of the system much lower than in reality. In this work, we include an advanced weighted Ah-throughput model for the lead-acid batteries, which is much more realistic. The optimisation methodology presented in this paper is applied in the optimisation of the electrical supply for an off-grid hospital located in Kalonge (Democratic Republic of the Congo). At the moment, the power supply relies on a diesel generator; batteries are used in order to ensure the basic supply of energy when the generator is unavailable (night hours). The optimisation includes the possibility of adding solar photovoltaic (PV) panels to improve the supply of electrical energy. The results show that optimal design could achieve a 28% reduction in the levelised cost of energy and a 54% reduction in the diesel fuel
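
    The Monte Carlo treatment of uncertain irradiation and load can be sketched as follows. The distributions, system sizes and generator efficiency are invented for illustration and are not the Kalonge figures:

```python
import random
import statistics

random.seed(42)

def simulate_day():
    """One Monte Carlo draw of daily diesel consumption (litres).
    PV yield and hospital load are sampled from assumed distributions;
    the diesel generator covers any shortfall."""
    pv = max(0.0, random.gauss(60.0, 15.0))     # kWh/day from the PV array
    load = max(0.0, random.gauss(80.0, 10.0))   # kWh/day hospital demand
    shortfall = max(0.0, load - pv)
    return shortfall / 3.5                       # assumed 3.5 kWh per litre

fuel = [simulate_day() for _ in range(10_000)]
mean_fuel = statistics.mean(fuel)
```

    A full sizing study would wrap such draws around each candidate PV/battery configuration and fold the resulting fuel and replacement costs into the net present cost, as the paper describes.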

  4. Microbial ecology of fermentative hydrogen producing bioprocesses: useful insights for driving the ecosystem function.

    Science.gov (United States)

    Cabrol, Lea; Marone, Antonella; Tapia-Venegas, Estela; Steyer, Jean-Philippe; Ruiz-Filippi, Gonzalo; Trably, Eric

    2017-03-01

    One of the most important biotechnological challenges is to develop environmentally friendly technologies to produce new sources of energy. Microbial production of biohydrogen through dark fermentation, by conversion of residual biomass, is an attractive solution for short-term development of bioH2 producing processes. Efficient biohydrogen production relies on complex mixed communities working in tight interaction. Species composition and functional traits are of crucial importance to maintain the ecosystem service. The analysis of the microbial community revealed a wide phylogenetic diversity that contributes in different, and still mostly unclear, ways to hydrogen production. Bridging this knowledge gap between microbial ecology features and ecosystem functionality is essential to optimize the bioprocess and develop strategies toward a maximization of the efficiency and stability of substrate conversion. The aim of this review is to provide a comprehensive overview of the most up-to-date biodata available and discuss the main microbial community features of biohydrogen engineered ecosystems, with a special emphasis on the crucial role of interactions and the relationships between species composition and ecosystem service. The elucidation of intricate relationships between community structure and ecosystem function would make it possible to drive ecosystems toward an improved functionality on the basis of microbial ecology principles. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Harvesting and transport operations to optimise biomass supply chain and industrial biorefinery processes

    Directory of Open Access Journals (Sweden)

    Robert Matindi

    2018-10-01

    Full Text Available In Australia, bioenergy plays an important role in modern power systems, where many biomass resources provide greenhouse-gas-neutral electricity at a variety of scales. By 2050, biomass energy is projected to have a 40-50% share as an alternative source of energy. Beyond the conversion of biomass, barriers and uncertainties in production and supply may hinder biomass energy development. Sugarcane is an essential ingredient in the production of bioenergy across the whole spectrum from the first generation to the second, e.g., production of energy from the lignocellulosic component of the sugarcane initially regarded as waste (bagasse and cane residue). Sustainable recovery of the lignocellulosic component of sugarcane from the field through a structured process is largely unknown and associated with a high capital outlay, which has stifled the growth of the bioenergy sector. In this context, this paper develops a new scheduler to optimise the recovery of the lignocellulosic component of sugarcane and the associated cane transport and harvest systems, reducing costs and operational time. An optimisation algorithm called Limited Discrepancy Search has been adapted and integrated with the developed transport scheduling algorithms. The developed algorithms are formulated and coded in the Optimization Programming Language (OPL) to obtain optimised cane and cane-residue transport schedules. Computational experiments demonstrate that high-quality solutions are obtainable for industry-scale instances. To provide insightful decisions, sensitivity analysis is conducted in terms of different scenarios and criteria.

  6. The new ICRP recommendations' project: A broader approach of the optimisation of radiation protection

    International Nuclear Information System (INIS)

    Lochard, J.

    2005-01-01

    In the framework of the preparation of its new recommendations, ICRP has developed a new text on the optimisation of radiological protection. This text extends the previous publications on the principle (Publications 37 and 45), recalling the need to adopt a pragmatic approach that combines quantitative techniques, where relevant, with know-how and past experience, which are often sufficient to ensure good protection. Moreover, it aims to adapt the optimisation process to recent evolutions in risk management, with the increasing role of stakeholder involvement in decision framing. (author)

  7. Specification, Verification and Optimisation of Business Processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas

    is extended with stochastic branching, message passing and reward annotations which allow for the modelling of resources consumed during the execution of a business process. Further, it is shown how this structure can be used to formalise the established business process modelling language Business Process...... fault tree analysis and the automated optimisation of business processes by means of an evolutionary algorithm. This work is motivated by problems that stem from the healthcare sector, and examples encountered in this field are used to illustrate these developments....

  8. Ants Colony Optimisation of a Measuring Path of Prismatic Parts on a CMM

    Directory of Open Access Journals (Sweden)

    Stojadinovic Slavenko M.

    2016-03-01

    Full Text Available This paper presents optimisation of a measuring probe path in inspecting the prismatic parts on a CMM. The optimisation model is based on: (i) the mathematical model that establishes an initial collision-free path presented by a set of points, and (ii) the solution of the Travelling Salesman Problem (TSP) obtained with Ant Colony Optimisation (ACO). In order to solve the TSP, an ACO algorithm that aims to find the shortest path of ant colony movement (i.e. the optimised path) is applied. Then, the optimised path is compared with the measuring path obtained with online programming on the CMM ZEISS UMM500 and with the measuring path obtained in the CMM inspection module of Pro/ENGINEER® software. The results of comparing the optimised path with the other two generated paths show that the optimised path is at least 20% shorter than the path obtained by on-line programming on the CMM ZEISS UMM500, and at least 10% shorter than the path obtained by using the CMM module in Pro/ENGINEER®.
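
    The ACO solution of the TSP over the inspection points can be sketched as below. The point set and the algorithm parameters are hypothetical, and this sketch omits the paper's collision-free path construction:

```python
import math
import random

random.seed(0)

# Hypothetical inspection points the probe must visit on the part.
pts = [(0, 0), (3, 1), (1, 4), (5, 3), (2, 2), (4, 5)]
n = len(pts)
dist = [[math.dist(a, b) for b in pts] for a in pts]

def tour_length(tour):
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def aco(iters=100, ants=10, alpha=1.0, beta=3.0, rho=0.5, q=1.0):
    """Basic ant colony optimisation for the TSP over `pts`."""
    tau = [[1.0] * n for _ in range(n)]            # pheromone per edge
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(ants):
            tour = [random.randrange(n)]
            while len(tour) < n:                   # build tour node by node
                i = tour[-1]
                unvisited = [j for j in range(n) if j not in tour]
                weights = [tau[i][j] ** alpha / dist[i][j] ** beta
                           for j in unvisited]
                tour.append(random.choices(unvisited, weights)[0])
            tours.append(tour)
        for row in tau:                            # pheromone evaporation
            for j in range(n):
                row[j] *= 1.0 - rho
        for tour in tours:                         # pheromone deposit
            length = tour_length(tour)
            if length < best_len:
                best_tour, best_len = tour, length
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += q / length
                tau[j][i] += q / length
    return best_tour, best_len

best_tour, best_len = aco()
```

    On an instance this small the result can be checked against exhaustive enumeration; the value of ACO appears on the much larger point sets produced by real inspection plans.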

  9. PHYSICAL-MATHEMATICAL SCIENCE MECHANICS SIMULATION CHALLENGES IN OPTIMISING THEORETICAL METAL CUTTING TASKS

    Directory of Open Access Journals (Sweden)

    Rasul V. Guseynov

    2017-01-01

    Full Text Available Abstract. Objectives In the article, problems in optimising machining operations, which must provide end-unit production of the required quality at minimum processing cost, are addressed. Methods Increasing the effectiveness of experimental research was achieved through the use of mathematical methods for planning experiments for optimising metal cutting tasks. The minimal processing cost model, in which the objective function is polynomial, is adopted as a criterion for the selection of optimal parameters. Results Polynomial models of the influence of angles φ, α, γ on the torque applied when cutting threads in various steels are constructed. Optimum values of the geometrical tool parameters were obtained using the criterion of minimum cutting forces during processing. Tools with optimal geometric parameters are found to have high stability. It is shown that the use of experimental planning methods allows the optimisation of cutting parameters. In optimising solutions to metal cutting problems, it is found to be expedient to use multifactor experimental planning methods and to select the cutting force as the optimisation parameter when determining tool geometry. Conclusion The joint use of geometric programming and experiment planning methods to optimise cutting parameters significantly increases the efficiency of technological metal processing approaches.

  10. Solar photo-Fenton optimisation in treating carbofuran-contaminated water

    Directory of Open Access Journals (Sweden)

    Manuel Alejandro Hernández-Shek

    2012-01-01

    Full Text Available A Box-Behnken design response-surface methodology was developed to optimise photo-Fenton degradation of carbofuran (C12H15NO3) by using a compound parabolic collector pilot plant. The four variables considered in the Box-Behnken design model included carbofuran degradation percentage, initial carbofuran concentration, hydrogen peroxide [H2O2] concentration and iron [Fe2+] concentration. Degradation was monitored by using total organic carbon concentration and high-performance liquid chromatography. A 93.2 mg l-1 carbofuran concentration was completely degraded in t30W = 15 min with 17.1 mg l-1 Fe2+ and 121.6 mg l-1 H2O2. Photo-Fenton degradation led to 76.7% mineralisation. Biodegradability during optimisation was evaluated by using the BOD5/COD ratio; this value increased from 0.04 at the beginning of the process to 0.52 in t30W = 20 min, thereby showing the effectiveness of using biological treatments.
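
    A Box-Behnken design places each pair of factors at the four corners of a square while every remaining factor is held at its centre level. The sketch below generates the coded design points for a three-factor case (e.g. initial concentration, H2O2 dose and Fe2+ dose); the factor choice is illustrative, not the paper's exact design matrix:

```python
import itertools

def box_behnken(k):
    """Coded (-1/0/+1) Box-Behnken design points for k factors: every
    pair of factors takes the four corner combinations while all other
    factors stay at the centre level, plus one centre point."""
    points = set()
    for i, j in itertools.combinations(range(k), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            run = [0] * k
            run[i], run[j] = a, b
            points.add(tuple(run))
    points.add((0,) * k)   # centre point (replicated in practice)
    return sorted(points)

design = box_behnken(3)
print(len(design))  # -> 13 distinct runs for 3 factors
```

    The coded levels are then mapped to physical ranges for each factor, and a quadratic response surface is fitted to the measured degradation percentages.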

  11. Water distribution systems design optimisation using metaheuristics and hyperheuristics

    Directory of Open Access Journals (Sweden)

    DN Raad

    2011-06-01

    Full Text Available The topic of multi-objective water distribution systems (WDS) design optimisation using metaheuristics is investigated, comparing numerous modern metaheuristics, including several multi-objective evolutionary algorithms, an estimation of distribution algorithm and a recent hyperheuristic named AMALGAM (an evolutionary framework for the simultaneous incorporation of multiple metaheuristics), in order to determine which approach is most capable with respect to WDS design optimisation. Novel metaheuristics and variants of existing algorithms are developed, for a total of twenty-three algorithms examined. Testing with respect to eight small-to-large-sized WDS benchmarks from the literature reveals that the four top-performing algorithms are mutually non-dominated with respect to the various performance metrics used. These algorithms are NSGA-II, TAMALGAMJndu, TAMALGAMndu and AMALGAMSndp (the last three being novel variants of AMALGAM). However, when these four algorithms are applied to the design of a very large real-world benchmark, the AMALGAM paradigm outperforms NSGA-II convincingly, with AMALGAMSndp exhibiting the best performance overall.

  12. Computed tomography dose optimisation in cystic fibrosis: A review.

    LENUS (Irish Health Repository)

    Ferris, Helena

    2016-04-28

    Cystic fibrosis (CF) is the most common autosomal recessive disease of the Caucasian population worldwide, with respiratory disease remaining the most relevant source of morbidity and mortality. Computed tomography (CT) is frequently used for monitoring disease complications and progression. Over the last fifteen years there has been a six-fold increase in the use of CT, which has led to growing concern about cumulative radiation exposure. The challenge to the medical profession is to identify dose reduction strategies that deliver acceptable image quality and fulfil the requirements of a diagnostic-quality CT. Dose optimisation, particularly in CT, is essential as it reduces the chances of patients receiving cumulative radiation doses in excess of 100 mSv, a dose deemed significant by the United Nations Scientific Committee on the Effects of Atomic Radiation. This review article explores the current trends in imaging in CF with particular emphasis on new developments in dose optimisation.

  13. Optimisation of rocker sole footwear for prevention of first plantar ulcer: comparison of group-optimised and individually-selected footwear designs.

    Science.gov (United States)

    Preece, Stephen J; Chapman, Jonathan D; Braunstein, Bjoern; Brüggemann, Gert-Peter; Nester, Christopher J

    2017-01-01

    Appropriate footwear for individuals with diabetes but no ulceration history could reduce the risk of first ulceration. However, individuals who deem themselves at low risk are unlikely to seek out bespoke footwear which is personalised. Therefore, our primary aim was to investigate whether group-optimised footwear designs, which could be prefabricated and delivered in a retail setting, could achieve appropriate pressure reduction, or whether footwear selection must be on a patient-by-patient basis. A second aim was to compare responses to footwear design between healthy participants and people with diabetes in order to understand the transferability of previous footwear research, performed in healthy populations. Plantar pressures were recorded from 102 individuals with diabetes, considered at low risk of ulceration. This cohort included 17 individuals with peripheral neuropathy. We also collected data from 66 healthy controls. Each participant walked in 8 rocker shoe designs (4 apex positions × 2 rocker angles). ANOVA analysis was then used to understand the effect of two design features and descriptive statistics used to identify the group-optimised design. Using 200 kPa as a target, this group-optimised design was then compared to the design identified as the best for each participant (using plantar pressure data). Peak plantar pressure increased significantly as apex position was moved distally and rocker angle reduced ( p  footwear which was individually selected. In terms of optimised footwear designs, healthy participants demonstrated the same response as participants with diabetes, despite having lower plantar pressures. This is the first study demonstrating that a group-optimised, generic rocker shoe might perform almost as well as footwear selected on a patient by patient basis in a low risk patient group. This work provides a starting point for clinical evaluation of generic versus personalised pressure reducing footwear.

  14. Optimisation of searches for Supersymmetry with the ATLAS detector

    Energy Technology Data Exchange (ETDEWEB)

    Zvolsky, Milan

    2012-01-15

    The ATLAS experiment is one of the four large experiments at the Large Hadron Collider, specifically designed to search for the Higgs boson and physics beyond the Standard Model. The aim of this thesis is the optimisation of searches for Supersymmetry in decays with two leptons and missing transverse energy in the final state. Two different optimisation studies have been performed for two important analysis aspects: the final signal region selection and the choice of the trigger selection. In the first part of the analysis, a cut-based optimisation of signal regions is performed, maximising the signal for a minimal background contamination. In this way, the signal yield can in some cases be more than doubled. The second approach is to introduce di-lepton triggers which allow the lepton transverse momentum threshold to be lowered, thus significantly increasing the number of selected signal events. The signal region optimisation was considered for the choice of the final event selection in the ATLAS di-lepton analyses. The trigger study contributed to the incorporation of di-lepton triggers into the ATLAS trigger menu. (orig.)

  15. Data for TROTS – The Radiotherapy Optimisation Test Set

    Directory of Open Access Journals (Sweden)

    Sebastiaan Breedveld

    2017-06-01

    Full Text Available The Radiotherapy Optimisation Test Set (TROTS) is an extensive set of problems originating from radiotherapy (radiation therapy) treatment planning. This dataset was created for two purposes: (1) to supply a large-scale dense dataset to measure performance and quality of mathematical solvers, and (2) to supply a dataset to investigate the multi-criteria optimisation and decision-making nature of the radiotherapy problem. The dataset contains 120 problems (patients), divided over 6 different treatment protocols/tumour types. Each problem contains numerical data, a configuration for the optimisation problem, and data required to visualise and interpret the results. The data is stored as HDF5-compatible Matlab files, and includes scripts to work with the dataset.

  16. Identification of the mechanical behaviour of biopolymer composites using multistart optimisation technique

    KAUST Repository

    Brahim, Elhacen

    2013-10-01

    This paper aims at identifying the mechanical behaviour of starch-zein composites as a function of zein content using a novel optimisation technique. Starting from bending experiments, force-deflection response is used to derive adequate mechanical parameters representing the elastic-plastic behaviour of the studied material. For such a purpose, a finite element model is developed accounting for a simple hardening rule, namely isotropic hardening model. A deterministic optimisation strategy is implemented to provide rapid matching between parameters of the constitutive law and the observed behaviour. Results are discussed based on the robustness of the numerical approach and predicted tendencies with regards to the role of zein content. © 2013 Elsevier Ltd.
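    The identification loop described above, matching a constitutive law to a measured force-deflection response, can be sketched with a toy bilinear elastic-plastic model and a multistart random-restart search. The model form, parameter values, and search settings are all illustrative assumptions, not the paper's finite element model:

```python
import random

def force(d, k, dy, h):
    """Bilinear elastic-plastic force-deflection law (illustrative)."""
    return k * d if d <= dy else k * dy + h * (d - dy)

def sse(params, data):
    """Sum of squared errors between model and 'experiment'."""
    k, dy, h = params
    return sum((force(d, k, dy, h) - f) ** 2 for d, f in data)

def multistart(data, n_starts=20, n_iters=400, seed=1):
    """Random-restart local search: crude but robust to local minima."""
    rng = random.Random(seed)
    best_p, best_e = None, float("inf")
    for _ in range(n_starts):
        p = [rng.uniform(0.5, 5.0), rng.uniform(0.1, 2.0), rng.uniform(0.1, 3.0)]
        e = sse(p, data)
        step = 0.5
        for _ in range(n_iters):
            q = [max(1e-6, x + rng.gauss(0, step)) for x in p]
            eq = sse(q, data)
            if eq < e:
                p, e = q, eq
            else:
                step *= 0.995  # shrink the search radius on failure
        if e < best_e:
            best_p, best_e = p, e
    return best_p, best_e

# Synthetic "experiment" generated from known parameters k=2, dy=1, h=0.5.
true = (2.0, 1.0, 0.5)
data = [(d / 10, force(d / 10, *true)) for d in range(0, 31)]
fit, err = multistart(data)
print([round(x, 2) for x in fit], round(err, 4))
```

In the paper the forward model is a finite element simulation rather than a closed-form law, so each cost evaluation is expensive and the multistart strategy matters even more.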

  17. More of the same? Comment on "An integrated framework for the optimisation of sport and athlete development: a practitioner approach".

    Science.gov (United States)

    MacNamara, Aine; Collins, Dave

    2014-01-01

    Gulbin and colleagues (Gulbin, J. P., Croser, M. J., Morley, E. J., & Weissensteiner, J. R. (2013). An integrated framework for the optimisation of sport and athlete development: A practitioner approach. Journal of Sports Sciences) present a new sport and athlete development framework that evolved from empirical observations from working with the Australian Institute of Sport. The FTEM (Foundations, Talent, Elite, Mastery) framework is proposed to integrate general and specialised phases of development for participants within the active lifestyle, sport participation and sport excellence pathways. A number of issues concerning the FTEM framework are presented. We also propose the need to move beyond prescriptive models of talent identification and development towards a consideration of features of best practice and process markers of development together with robust guidelines about the implementation of these in applied practice.

  18. The optimisation of wedge filters in radiotherapy of the prostate

    International Nuclear Information System (INIS)

    Oldham, Mark; Neal, Anthony J.; Webb, Steve

    1995-01-01

    A treatment plan optimisation algorithm has been applied to 12 patients with early prostate cancer in order to determine the optimum beam-weights and wedge angles for a standard conformal three-field treatment technique. The optimisation algorithm was based on fast simulated annealing using a cost function designed to achieve a uniform dose in the planning-target-volume (PTV) and to minimise the integral doses to the organs-at-risk. The algorithm has been applied to standard conformal three-field plans created by an experienced human planner, and run in three PLAN MODES: (1) where the wedge angles were fixed by the human planner and only the beam-weights were optimised; (2) where both the wedge angles and beam-weights were optimised; and (3) where both the wedge angles and beam-weights were optimised and a non-uniform dose was prescribed to the PTV. In the latter PLAN MODE, a uniform 100% dose was prescribed to all of the PTV except for that region that overlaps with the rectum, where a lower (e.g., 90%) dose was prescribed. The resulting optimised plans have been compared with those of the human planner, who found beam-weights by conventional forward planning techniques. Plans were compared on the basis of dose statistics, normal-tissue-complication-probability (NTCP) and tumour-control-probability (TCP). The results of the comparison showed that all three PLAN MODES produced plans with slightly higher TCP for the same rectal NTCP than the human planner. The best results were observed for PLAN MODE 3, where an average increase in TCP of 0.73% (± 0.20, 95% confidence interval) was predicted by the biological models. This increase arises from a beneficial dose gradient which is produced across the tumour. Although the TCP gain is small it comes with no increase in treatment complexity, and could translate into increased cures given the large numbers of patients being referred.
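    The cost-function-driven annealing described above can be illustrated with a toy dose-influence matrix. The beam/voxel numbers, penalty weights, and annealing schedule below are invented for illustration and are far simpler than the study's clinical model:

```python
import math, random

# Toy dose-influence matrix: dose per unit beam weight delivered by 3
# beams to 4 PTV voxels and 2 organ-at-risk (OAR) voxels (invented numbers).
PTV = [[1.0, 0.2, 0.3], [0.8, 0.4, 0.2], [0.9, 0.3, 0.3], [0.7, 0.5, 0.2]]
OAR = [[0.1, 0.6, 0.1], [0.2, 0.5, 0.2]]

def dose(voxel, w):
    return sum(a * b for a, b in zip(voxel, w))

def cost(w):
    """Penalise deviation from a uniform dose of 1.0 in the PTV plus
    the integral dose to the organs at risk."""
    ptv = sum((dose(v, w) - 1.0) ** 2 for v in PTV)
    oar = sum(dose(v, w) for v in OAR)
    return ptv + 0.1 * oar

def anneal(steps=8000, t0=0.5, seed=0):
    rng = random.Random(seed)
    w = [0.5, 0.5, 0.5]
    c = cost(w)
    best_w, best_c = w[:], c
    for i in range(steps):
        t = t0 * (1 - i / steps) + 1e-4  # linear cooling schedule
        cand = [max(0.0, x + rng.gauss(0, 0.05)) for x in w]
        cc = cost(cand)
        # Accept improvements always, worse moves with Boltzmann probability.
        if cc < c or rng.random() < math.exp((c - cc) / t):
            w, c = cand, cc
            if c < best_c:
                best_w, best_c = w[:], c
    return best_w, best_c

weights, final_cost = anneal()
print([round(x, 2) for x in weights], round(final_cost, 3))
```

Wedge angles would enter as additional optimisation variables alongside the beam weights; the annealing loop is unchanged.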
A study of the beam-weights and wedge angles chosen by the optimisation algorithm revealed

  19. GAOS: Spatial optimisation of crop and nature within agricultural fields

    NARCIS (Netherlands)

    Bruin, de S.; Janssen, H.; Klompe, A.; Lerink, P.; Vanmeulebrouk, B.

    2010-01-01

    This paper proposes and demonstrates a spatial optimiser that allocates areas of inefficient machine manoeuvring to field margins thus improving the use of available space and supporting map-based Controlled Traffic Farming. A prototype web service (GAOS) allows farmers to optimise tracks within

  20. Techno-economic optimisation of energy systems

    International Nuclear Information System (INIS)

    Mansilla Pellen, Ch.

    2006-07-01

    The traditional approach to assessing the economic merit of energy systems is based on a fixed flow-sheet. Some studies have shown that the flow-sheets corresponding to the best thermodynamic efficiencies do not necessarily lead to the best production costs. A method called techno-economic optimisation was therefore proposed. This method aims at minimising the production cost of a given energy system, including both investment and operating costs, and was implemented using genetic algorithms. The approach was compared to the heat integration method on two different examples, thus demonstrating its value. Techno-economic optimisation was then applied to different energy systems dealing with hydrogen as well as electricity production. (author)
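    A genetic-algorithm sketch of the idea: minimise a production cost that trades annualised investment against operating cost over a design variable. The cost model, its constants, and the GA settings are invented for illustration, not taken from the thesis:

```python
import random

def production_cost(x):
    """Toy techno-economic model: investment rises with component size x,
    operating cost falls as efficiency improves (illustrative)."""
    capex = 50 + 3 * x * x          # annualised investment cost
    opex = 400 / (1 + x)            # operating cost
    return capex + opex

def ga(fitness, lo, hi, pop_size=30, gens=60, seed=3):
    """Minimal genetic algorithm: tournament selection + Gaussian mutation."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)           # tournament of two
            parent = a if fitness(a) < fitness(b) else b
            child = parent + rng.gauss(0, 0.3)  # mutation
            nxt.append(min(hi, max(lo, child)))
        pop = nxt
    return min(pop, key=fitness)

best = ga(production_cost, 0.0, 10.0)
print(round(best, 2), round(production_cost(best), 1))
```

Real techno-economic models have many coupled design variables and simulation-based cost evaluations, which is exactly where population-based methods like this earn their keep.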

  1. Feasibility of the use of optimisation techniques to calibrate the models used in a post-closure radiological assessment

    International Nuclear Information System (INIS)

    Laundy, R.S.

    1991-01-01

    This report addresses the feasibility of the use of optimisation techniques to calibrate the models developed for the impact assessment of a radioactive waste repository. The maximum likelihood method for improving parameter estimates is considered in detail, and non-linear optimisation techniques for finding solutions are reviewed. Applications are described for the calibration of groundwater flow, radionuclide transport and biosphere models. (author)
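    Maximum-likelihood calibration of a model parameter amounts to minimising a negative log-likelihood with a non-linear search. The decay model, data, noise level, and the choice of a golden-section search are illustrative assumptions, not the report's groundwater or transport models:

```python
import math

# Synthetic observations of a tracer concentration c(t) = c0 * exp(-lam * t)
# with known c0 = 10 and Gaussian measurement noise (illustrative data).
obs = [(0, 10.1), (1, 7.2), (2, 5.6), (3, 4.0), (4, 3.1), (5, 2.2)]
C0, SIGMA = 10.0, 0.3

def neg_log_likelihood(lam):
    """For Gaussian errors, maximising likelihood minimises squared residuals."""
    nll = 0.0
    for t, c in obs:
        r = c - C0 * math.exp(-lam * t)
        nll += 0.5 * (r / SIGMA) ** 2 + math.log(SIGMA * math.sqrt(2 * math.pi))
    return nll

def golden_section(f, a, b, tol=1e-6):
    """Derivative-free 1D minimiser (a standard non-linear search)."""
    g = (math.sqrt(5) - 1) / 2
    c, d = b - g * (b - a), a + g * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return (a + b) / 2

lam_hat = golden_section(neg_log_likelihood, 0.01, 2.0)
print(round(lam_hat, 3))
```

With several parameters, the same likelihood would be minimised with a multi-dimensional non-linear optimiser, which is the class of techniques the report reviews.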

  2. New perspectives for the petroleum industry. Bioprocesses for the selective removal of sulphur, nitrogen and metals

    International Nuclear Information System (INIS)

    Zerlia, T.

    2000-01-01

    Fuel biocatalytic conversion is a process that removes sulphur, nitrogen and metals through selective enzyme-catalysed reactions. The mild operating conditions, the specificity of the reactions and the quality of the co-products (particularly the organosulphur compounds, a source for the petrochemical industry) are just a few of the attractive aspects of this new technology, which could open a new world of possibilities in fuel technology and in the environmental impact of fuels. The paper reviews the state of the art of research and applications of bioprocesses in the petroleum field. [it]

  3. Robustness analysis of bogie suspension components Pareto optimised values

    Science.gov (United States)

    Mousavi Bideleh, Seyed Milad

    2017-08-01

    The bogie suspension system of high speed trains can significantly affect vehicle performance. Multiobjective optimisation problems are often formulated and solved to find the Pareto optimised values of the suspension components and improve cost efficiency in railway operations from different perspectives. Uncertainties in the design parameters of the suspension system can negatively influence the dynamic behaviour of railway vehicles. In this regard, robustness analysis of the bogie dynamic response with respect to uncertainties in the suspension design parameters is considered. A one-car railway vehicle model with 50 degrees of freedom and wear/comfort Pareto optimised values of bogie suspension components is chosen for the analysis. Longitudinal and lateral primary stiffnesses, longitudinal and vertical secondary stiffnesses, as well as yaw damping are considered as five design parameters. The effects of parameter uncertainties on wear, ride comfort, track shift force, stability, and risk of derailment are studied by varying the design parameters around their respective Pareto optimised values according to a lognormal distribution with different coefficients of variation (COVs). The robustness analysis is carried out based on the maximum entropy concept. The multiplicative dimensional reduction method is utilised to simplify the calculation of fractional moments and improve the computational efficiency. The results showed that the dynamic response of the vehicle with wear/comfort Pareto optimised values of bogie suspension is robust against uncertainties in the design parameters and that the probability of failure is small for parameter uncertainties with COV up to 0.1.
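    The perturbation scheme, sampling design parameters from a lognormal distribution with a given COV around their optimised values, can be sketched by brute-force Monte Carlo. The response surrogate, the failure limit, and the substitution of plain sampling for the paper's maximum-entropy/dimensional-reduction machinery are all simplifying assumptions:

```python
import math, random

def lognormal_params(mean, cov):
    """Convert a mean and coefficient of variation to lognormal mu/sigma."""
    sigma2 = math.log(1 + cov ** 2)
    return math.log(mean) - sigma2 / 2, math.sqrt(sigma2)

def response(k_primary, c_yaw):
    """Toy wear index of the vehicle model (illustrative surrogate)."""
    return 1.0 / k_primary + 0.2 * c_yaw

def failure_probability(cov, n=20000, limit=1.35, seed=7):
    """Probability that the response exceeds the limit when two normalised
    design parameters scatter around their Pareto-optimised value of 1.0."""
    rng = random.Random(seed)
    mu_k, s_k = lognormal_params(1.0, cov)
    mu_c, s_c = lognormal_params(1.0, cov)
    fails = 0
    for _ in range(n):
        k = rng.lognormvariate(mu_k, s_k)
        c = rng.lognormvariate(mu_c, s_c)
        if response(k, c) > limit:
            fails += 1
    return fails / n

for cov in (0.05, 0.1):
    print(cov, failure_probability(cov))
```

The multiplicative dimensional reduction method replaces this expensive sampling with a small number of one-dimensional evaluations per parameter, but the quantity being estimated is the same.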

  4. Stress-optimised shape memory devices for the use in microvalves

    International Nuclear Information System (INIS)

    Skrobanek, K.D.; Kohl, M.; Miyazaki, S.

    1997-01-01

    A gas valve of 6 × 6 × 2 mm³ size has been developed for high pressure applications. Stress-optimised shape memory microbeams of 100 μm thickness are used to control the deflection of a membrane above a valve chamber. The shape memory thin sheets have been fabricated by melting and rolling, which creates specific textures. Investigations by X-ray diffraction revealed major orientations of [111] and [011] in the rolling direction. The corresponding maximum anisotropy of transformation strain was 20%. The microbeams have been fabricated by laser cutting. For stress optimisation, the lateral widths of the beams are designed for homogeneous stress distributions along the beam surfaces, allowing an optimised use of the shape memory effect and a minimisation of fatigue effects. For actuation, a rhombohedral phase transformation is used. This allows operation below pressure differences of 1200 hPa in designs with one valve chamber and below 4500 hPa in pressure-compensated designs with a second valve chamber above the membrane. Maximum gas flows of 1600 sccm (standard cm³ per minute) and work outputs of 35 μNm are achieved for a driving power of 210 mW. The response times for closing the valves vary between 0.5 and 1.2 s, and for opening between 1 and 2 s, depending on the applied pressure difference. (orig.)

  5. Assessment of grid optimisation measures for the German transmission grid using open source grid data

    Science.gov (United States)

    Böing, F.; Murmann, A.; Pellinger, C.; Bruckmeier, A.; Kern, T.; Mongin, T.

    2018-02-01

    The expansion of capacities in the German transmission grid is a necessity for further integration of renewable energy sources into the electricity sector. In this paper, the grid optimisation measures ‘Overhead Line Monitoring’, ‘Power-to-Heat’ and ‘Demand Response in the Industry’ are evaluated and compared against conventional grid expansion for the year 2030. Initially, the methodical approach of the simulation model is presented and detailed descriptions of the grid model and the used grid data, which partly originates from open-source platforms, are provided. Further, this paper explains how ‘Curtailment’ and ‘Redispatch’ can be reduced by implementing grid optimisation measures and how the depreciation of economic costs can be determined considering construction costs. The developed simulations show that the conventional grid expansion is more efficient and implies more grid relieving effects than the evaluated grid optimisation measures.

  6. Optimised performance of industrial high resolution computerised tomography

    International Nuclear Information System (INIS)

    Maangaard, M.

    2000-01-01

    The purpose of non-destructive evaluation (NDE) is to acquire knowledge of the investigated sample. Digital X-ray imaging techniques such as radiography or computerised tomography (CT) produce images of the interior of a sample. The obtained image quality determines the possibility of detecting sample-related features, e.g. details and flaws. This thesis presents a method of optimising the performance of industrial X-ray equipment for the imaging task at hand in order to obtain images of high quality. CT produces maps of the X-ray linear attenuation of the sample's interior. CT can produce two-dimensional cross-section images or three-dimensional images with volumetric information on the investigated sample. The image contrast and noise depend on both the investigated sample and the equipment and settings used (X-ray tube potential, X-ray filtration, exposure time, etc.). Hence, it is vital to find the optimal equipment settings in order to obtain images of high quality. To be able to mathematically optimise the image quality, it is necessary to have a model of the X-ray imaging system together with an appropriate measure of image quality. The optimisation is performed with a developed model for an X-ray image-intensifier-based radiography system. The model predicts the mean value and variance of the measured signal level in the collected radiographic images. The traditionally used measure of physical image quality is the signal-to-noise ratio (SNR). To calculate the signal-to-noise ratio, a well-defined detail (flaw) is required. It was found that maximising the SNR leads to ambiguities: the optimised settings found by maximising the SNR were dependent on the material in the detail. When CT is performed on irregularly shaped samples containing density and compositional variations, it is difficult to define which SNR to use for optimisation. This difficulty is solved by the measures of physical image quality proposed here, the ratios geometry

  7. Optimising polarised neutron scattering measurements--XYZ and polarimetry analysis

    International Nuclear Information System (INIS)

    Cussen, L.D.; Goossens, D.J.

    2002-01-01

    The analytic optimisation of neutron scattering measurements made using XYZ polarisation analysis and neutron polarimetry techniques is discussed. Expressions for the 'quality factor' and the optimum division of counting time for the XYZ technique are presented. For neutron polarimetry the optimisation is identified as analogous to that for measuring the flipping ratio and reference is made to the results already in the literature.

  8. Optimising polarised neutron scattering measurements--XYZ and polarimetry analysis

    CERN Document Server

    Cussen, L D

    2002-01-01

    The analytic optimisation of neutron scattering measurements made using XYZ polarisation analysis and neutron polarimetry techniques is discussed. Expressions for the 'quality factor' and the optimum division of counting time for the XYZ technique are presented. For neutron polarimetry the optimisation is identified as analogous to that for measuring the flipping ratio and reference is made to the results already in the literature.

  9. Design Optimisation and Control of a Pilot Operated Seat Valve

    DEFF Research Database (Denmark)

    Nielsen, Brian; Andersen, Torben Ole; Hansen, Michael Rygaard

    2004-01-01

    The paper gives an approach for optimisation of the bandwidth of a pilot operated seat valve for mobile applications. Physical dimensions as well as parameters of the implemented control loop are optimised simultaneously. The frequency response of the valve varies as a function of the pressure drop...

  10. Adjoint Optimisation of the Turbulent Flow in an Annular Diffuser

    DEFF Research Database (Denmark)

    Gotfredsen, Erik; Agular Knudsen, Christian; Kunoy, Jens Dahl

    2017-01-01

    In the present study, a numerical optimisation of guide vanes in an annular diffuser is performed. The optimisation is performed for the purpose of improving the following two parameters simultaneously: the first parameter is the uniformity perpendicular to the flow direction, a 1/3 diameter do...

  11. Multi-objective optimisation for spacecraft design for demise and survivability

    OpenAIRE

    Trisolini, Mirko; Colombo, Camilla; Lewis, Hugh

    2017-01-01

    The paper presents the development of a multi-objective optimisation framework to study the effects that preliminary design choices have on the demisability and the survivability of a spacecraft. Building a spacecraft such that most of it will demise during re-entry through design-for-demise strategies may lead to designs that are more vulnerable to space debris impacts, thus compromising the reliability of the mission. The two models developed to analyse the demisability and the survivabi...

  12. Optimisation of Protection as applicable to geological disposal: the ICRP view

    International Nuclear Information System (INIS)

    Weiss, W.

    2010-01-01

    Wolfgang Weiss (BfS), vice-chair of ICRP Committee 4, recalled that the role of optimisation is to select the best protection options under the prevailing circumstances based on scientific considerations, societal concerns and ethical aspects as well as considerations of transparency. An important role of the concept of optimisation of protection is to foster a 'safety culture' and thereby to engender a state of thinking in everyone responsible for control of radiation exposures, such that they are continuously asking themselves the question, 'Have I done all that I reasonably can to avoid or reduce these doses?' Clearly, the answer to this question is a matter of judgement and necessitates co-operation between all parties involved and, as a minimum, the operating management and the regulatory agencies, but the dialogue would be more complete if other stakeholders were also involved. What kinds of checks and balances or factors would need to be considered for an 'optimal' system? Can indicators be identified? Quantitative methods may provide input to this dialogue but they should never be the sole input. The ICRP considers that the parameters to take into account also include social considerations and values, environmental considerations, as well as technical and economic considerations. Wolfgang Weiss approached the question of the distinction to be made between system optimisation (in the sense of taking account of social and economic factors as well as of all types of hazards) and optimisation of radiological protection. The position of the ICRP is that the system of protection that it proposes is based on both science (quantification of the health risk) and value judgement (what is an acceptable risk?) and optimisation is the recommended process to integrate both aspects. Indeed, there has been evolution since the old system of intervention levels to the new system, whereby, even if the level of the dose or risk (which is called a constraint in ICRP-81) is met

  13. Study on the evolutionary optimisation of the topology of network control systems

    Science.gov (United States)

    Zhou, Zude; Chen, Benyuan; Wang, Hong; Fan, Zhun

    2010-08-01

    Computer networks have become very popular in enterprise applications. However, optimisation of network designs that allows networks to be used more efficiently in industrial environments and enterprise applications remains an interesting research topic. This article mainly discusses the topology optimisation theory and methods of network control systems based on switched Ethernet in an industrial context. Factors that affect the real-time performance of the industrial control network are presented in detail, and optimisation criteria with their internal relations are analysed. After the definition of performance parameters, normalised indices for the evaluation of the topology optimisation are proposed. The topology optimisation problem is formulated as a multi-objective optimisation problem and an evolutionary algorithm is applied to solve it. Special communication characteristics of the industrial control network are considered in the optimisation process. With respect to the evolutionary algorithm design, an improved arena algorithm is proposed for the construction of the non-dominated set of the population. In addition, for the evaluation of individuals, the integrated use of the dominative relation method and the objective function combination method is described, reducing the computational cost of the algorithm. Simulation tests show that the performance of the proposed algorithm is preferable and superior compared to other algorithms. The final solution greatly improves the following indices: traffic localisation, traffic balance and utilisation rate balance of switches. In addition, a new performance index with its estimation process is proposed.
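    The core operation here, constructing the non-dominated set of a population, can be sketched in a few lines. A real arena algorithm reduces the number of comparisons by letting the current winner eliminate challengers, but it produces the same front; the objective values below are invented:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points):
    """Keep every point that no other point dominates."""
    front = []
    for p in points:
        if any(dominates(q, p) for q in points if q != p):
            continue
        front.append(p)
    return front

# Toy population: (traffic localisation cost, switch load imbalance).
pop = [(1, 5), (2, 2), (3, 4), (4, 1), (5, 5), (2, 3)]
print(non_dominated(pop))  # → [(1, 5), (2, 2), (4, 1)]
```

This naive filter is O(n²) in population size; the arena-style improvement the article proposes targets exactly that cost.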

  14. Cultural-based particle swarm for dynamic optimisation problems

    Science.gov (United States)

    Daneshyari, Moayed; Yen, Gary G.

    2012-07-01

    Many practical optimisation problems involve uncertainties, and a significant number belong to the dynamic optimisation problem (DOP) category, in which the fitness function changes through time. In this study, we propose cultural-based particle swarm optimisation (PSO) to solve DOPs. A cultural framework is adopted incorporating the required information from the PSO into five sections of the belief space, namely situational, temporal, domain, normative and spatial knowledge. The stored information is used to detect changes in the environment, assists the response to change through diversity-based repulsion among particles and migration among swarms in the population space, and also helps in selecting the leading particles at three different levels: personal, swarm and global. Comparison of the proposed heuristic over several difficult dynamic benchmark problems demonstrates better or equal performance with respect to most other selected state-of-the-art dynamic PSO heuristics.
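    For reference, a canonical global-best PSO on a static test function; the cultural variant layers the belief space, change detection, and diversity-based repulsion on top of this skeleton. The coefficients are common constriction-style defaults, not the paper's settings:

```python
import random

def pso(f, dim=2, n=15, iters=100, seed=5):
    """Minimal global-best particle swarm optimiser (minimisation)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]            # personal best positions
    gbest = min(pbest, key=f)[:]           # global best position
    w, c1, c2 = 0.72, 1.49, 1.49           # inertia and acceleration weights
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest

sphere = lambda x: sum(v * v for v in x)
best = pso(sphere)
print([round(v, 3) for v in best])
```

In the dynamic setting, the belief space would trigger re-evaluation of pbest/gbest when the fitness landscape changes and repel particles that have crowded a stale optimum.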

  15. A framework for inverse planning of beam-on times for 3D small animal radiotherapy using interactive multi-objective optimisation

    International Nuclear Information System (INIS)

    Balvert, Marleen; Den Hertog, Dick; Van Hoof, Stefan J; Granton, Patrick V; Trani, Daniela; Hoffmann, Aswin L; Verhaegen, Frank

    2015-01-01

    Advances in precision small animal radiotherapy hardware enable the delivery of increasingly complicated dose distributions on the millimeter scale. Manual creation and evaluation of treatment plans becomes difficult or even infeasible with an increasing number of degrees of freedom for dose delivery and available image data. The goal of this work is to develop an optimisation model that determines beam-on times for a given beam configuration, and to assess the feasibility and benefits of an automated treatment planning system for small animal radiotherapy.The developed model determines a Pareto optimal solution using operator-defined weights for a multiple-objective treatment planning problem. An interactive approach allows the planner to navigate towards, and to select the Pareto optimal treatment plan that yields the most preferred trade-off of the conflicting objectives. This model was evaluated using four small animal cases based on cone-beam computed tomography images. Resulting treatment plan quality was compared to the quality of manually optimised treatment plans using dose-volume histograms and metrics.Results show that the developed framework is well capable of optimising beam-on times for 3D dose distributions and offers several advantages over manual treatment plan optimisation. For all cases but the simple flank tumour case, a similar amount of time was needed for manual and automated beam-on time optimisation. In this time frame, manual optimisation generates a single treatment plan, while the inverse planning system yields a set of Pareto optimal solutions which provides quantitative insight on the sensitivity of conflicting objectives. Treatment planning automation decreases the dependence on operator experience and allows for the use of class solutions for similar treatment scenarios. This can shorten the time required for treatment planning and therefore increase animal throughput. In addition, this can improve treatment standardisation and

  16. Nuclear power plant maintenance optimisation SENUF network activity

    International Nuclear Information System (INIS)

    Ahlstrand, R.; Bieth, M.; Pla, P.; Rieg, C.; Trampus, P.

    2004-01-01

    While providing scientific and technical support to the TACIS and PHARE nuclear safety programmes, a large amount of knowledge related to Russian-design reactor systems has accumulated, leading to the creation of a new network concerning nuclear safety in Central and Eastern Europe called ''Safety of Eastern European type Nuclear Facilities'' (SENUF). SENUF helps bring together all stakeholders of TACIS and PHARE: beneficiaries, end users, and the Eastern and Western nuclear industries, and thus favours fruitful technical exchanges and feedback of experience. At present the main focus of SENUF is nuclear power plant maintenance as a substantial element of plant operational safety as well as life management. A Working Group has been established on plant maintenance. One of its major tasks in 2004 is to prepare a status report on advanced strategies to optimise maintenance. Optimisation projects have an interface with the plant's overall life management programme. Today, almost all plants involved in the SENUF network have an explicit policy to extend their service life; thus, component ageing management, modernisation and refurbishment actions have become much more important. A database is also under development, which is intended to help share the available knowledge and specific equipment and tools. (orig.)

  17. Infrastructure optimisation via MBR retrofit: a design guide.

    Science.gov (United States)

    Bagg, W K

    2009-01-01

    Wastewater management is continually evolving with the development and implementation of new, more efficient technologies. One of these is the Membrane Bioreactor (MBR). Although a relatively new technology in Australia, MBR wastewater treatment has been widely used elsewhere for over 20 years, with thousands of MBRs now in operation worldwide. Over the past 5 years, MBR technology has been enthusiastically embraced in Australia as a potential treatment upgrade option, and via retrofit typically offers two major benefits: (1) more capacity using mostly existing facilities, and (2) very high quality treated effluent. However, infrastructure optimisation via MBR retrofit is not a simple or low-cost solution and there are many factors which should be carefully evaluated before deciding on this method of plant upgrade. The paper reviews a range of design parameters which should be carefully evaluated when considering an MBR retrofit solution. Several actual and conceptual case studies are considered to demonstrate both advantages and disadvantages. Whilst optimising existing facilities and production of high quality water for reuse are powerful drivers, it is suggested that MBRs are perhaps not always the most sustainable Whole-of-Life solution for a wastewater treatment plant upgrade, especially by way of a retrofit.

  18. Consolidated bioprocessing for production of polyhydroxyalkanoates from red algae Gelidium amansii.

    Science.gov (United States)

    Sawant, Shailesh S; Salunke, Bipinchandra K; Kim, Beom Soo

    2018-04-01

    Noncompetitive carbon sources such as algae are unconventional and promising raw materials for sustainable biofuel production. The capability of the marine bacterium Saccharophagus degradans 2-40 to degrade the red seaweed Gelidium amansii for production of polyhydroxyalkanoates (PHA) was evaluated in this study. S. degradans can readily attach to algae, degrade algal carbohydrates, and utilize that material as its main carbon source. Minimal media containing 8 g/L G. amansii were used for the growth of S. degradans. The PHA content obtained was 17-27% of dry cell weight by pure culture of S. degradans and by co-culture of S. degradans and Bacillus cereus, a contaminant found with S. degradans cultures. The PHA type was found to be poly(3-hydroxybutyrate) by gas chromatography and Fourier transform-infrared spectroscopy. This work demonstrates PHA production through consolidated bioprocessing of insoluble, untreated red algae by bacterial pure culture and co-culture. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Solving dynamic multi-objective problems with vector evaluated particle swarm optimisation

    CSIR Research Space (South Africa)

    Greeff, M

    2008-06-01

    Full Text Available Many optimisation problems are multi-objective and change dynamically. Many methods use a weighted average approach to the multiple objectives. This paper introduces the usage of the vector evaluated particle swarm optimiser (VEPSO) to solve dynamic...

  20. Postprandial glucose metabolism and SCFA after consuming wholegrain rye bread and wheat bread enriched with bioprocessed rye bran in individuals with mild gastrointestinal symptoms

    DEFF Research Database (Denmark)

    Lappi, J; Mykkänen, H; Knudsen, Knud Erik Bach

    2014-01-01

    BackgroundRye bread benefits glucose metabolism. It is unknown whether the same effect is achieved by rye bran-enriched wheat bread. We tested whether white wheat bread enriched with bioprocessed rye bran (BRB + WW) and sourdough wholegrain rye bread (WGR) have similar effects on glucose metabolism...... and plasma level of short chain fatty acids (SCFAs).  MethodsTwenty-one (12 women) of 23 recruited subjects completed an intervention with a four-week run-in and two four-week test periods in cross-over design. White wheat bread (WW; 3% fibre) was consumed during the run-in, and WGR and BRB + WW (10% fibre.......05) and propionate (p = 0.009) at 30 min increased during both rye bread periods.ConclusionsBeneficial effects of WGR over white wheat bread on glucose and SCFA production were confirmed. The enrichment of the white wheat bread with bioprocessed rye bran (BRB + WW) yielded similar but not as pronounced effects than...

  1. Intelligent Support for a Computer Aided Design Optimisation Cycle

    OpenAIRE

    B. Dolšak; M. Novak; J. Kaljun

    2006-01-01

    It is becoming more and more evident that adding intelligence to existing computer aids, such as computer aided design systems, can lead to significant improvements in the effective and reliable performance of various engineering tasks, including design optimisation. This paper presents three different intelligent modules to be applied within a computer aided design optimisation cycle to enable more intelligent and less experience-dependent design performance.

  2. Optimised performance of a plug-in electric vehicle aggregator in energy and reserve markets

    International Nuclear Information System (INIS)

    Shafie-khah, M.; Moghaddam, M.P.; Sheikh-El-Eslami, M.K.; Catalão, J.P.S.

    2015-01-01

    Highlights: • A new model is developed to optimise the performance of a PEV aggregator in the power market. • The PEV aggregator can combine the PEVs and manage the charge/discharge of their batteries. • A new approach to calculate the satisfaction/motivation of PEV owners is proposed. • Several uncertainties are taken into account using a two-stage stochastic programming approach. • The proposed model is proficient in significantly improving the short- and long-term behaviour. - Abstract: In this paper, a new model is developed to optimise the performance of a plug-in Electric Vehicle (EV) aggregator in electricity markets, considering both short- and long-term horizons. The EV aggregator, as a new player in the power market, can aggregate the EVs and manage the charge/discharge of their batteries. The aggregator maximises its profit and optimises EV owners’ revenue by applying changes in tariffs to compete with other market players for retaining current customers and acquiring new owners. On this basis, a new approach to calculate the satisfaction/motivation of EV owners and their market participation is proposed in this paper. Moreover, the behaviour of owners in selecting their supplying company is considered. The aggregator optimises the self-scheduling programme and submits the best bidding/offering strategies to the day-ahead and real-time markets. To achieve this purpose, the day-ahead and real-time energy and reserve markets are modelled as oligopoly markets, in contrast with previous works that utilised perfectly competitive ones. Furthermore, several uncertainties and constraints are taken into account using a two-stage stochastic programming approach, which have not been addressed in previous works. The numerical studies show the effectiveness of the proposed model.
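The two-stage stochastic structure described in the abstract (commit to a market position first, then take recourse once uncertainty resolves) can be illustrated with a deliberately tiny sketch. All prices, scenarios and probabilities below are invented for illustration and are not from the paper; the real model also covers reserve markets, tariffs and owner behaviour, which are omitted here.

```python
# Toy two-stage stochastic decision: choose a day-ahead energy purchase q
# (MWh) before real-time prices are known; recourse buys/sells the
# imbalance at the realised real-time price. All numbers are illustrative.

SCENARIOS = [  # (probability, real-time price in $/MWh)
    (0.3, 20.0),
    (0.5, 50.0),
    (0.2, 90.0),
]
DA_PRICE = 45.0   # day-ahead price, $/MWh
DEMAND = 10.0     # energy the EV fleet must end up with, MWh

def expected_cost(q):
    """First-stage cost plus expected recourse cost of covering DEMAND."""
    first_stage = DA_PRICE * q
    recourse = sum(p * rt_price * (DEMAND - q) for p, rt_price in SCENARIOS)
    return first_stage + recourse

# Search a grid of first-stage decisions and keep the cheapest.
best_q = min((q * 0.5 for q in range(0, 21)), key=expected_cost)
```

Here the expected real-time price ($49/MWh) exceeds the day-ahead price ($45/MWh), so the optimal first-stage decision is to buy the full demand day-ahead (`best_q == 10.0`).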

  3. Simulation and optimisation modelling approach for operation of the Hoa Binh Reservoir, Vietnam

    DEFF Research Database (Denmark)

    Ngo, Long le; Madsen, Henrik; Rosbjerg, Dan

    2007-01-01

    Hoa Binh, the largest reservoir in Vietnam, plays an important role in flood control for the Red River delta and hydropower generation. Due to its multi-purpose character, conflicts and disputes in operating the reservoir have been ongoing since its construction, particularly in the flood season....... This paper proposes to optimise the control strategies for the Hoa Binh reservoir operation by applying a combination of simulation and optimisation models. The control strategies are set up in the MIKE 11 simulation model to guide the releases of the reservoir system according to the current storage level......, the hydro-meteorological conditions, and the time of the year. A heuristic global optimisation tool, the shuffled complex evolution (SCE) algorithm, is adopted for optimising the reservoir operation. The optimisation puts focus on the trade-off between flood control and hydropower generation for the Hoa...

  4. Advanced manufacturing: optimising the factories of tomorrow

    International Nuclear Information System (INIS)

    Philippon, Patrick

    2013-01-01

    Patrick Philippon - Les Defis du CEA no.179 - April 2013. Faced with competition from the emerging countries, the competitiveness of the industrialised nations depends on the ability of their industries to innovate. This strategy necessarily entails the reorganisation and optimisation of the production systems. This is the whole challenge for 'advanced manufacturing', which relies on the new information and communication technologies. Interactive robotics, virtual reality and non-destructive testing are all technological building blocks developed by CEA, now approved within a cross-cutting programme, to meet the needs of industry and together build the factories of tomorrow. (author)

  5. MIMO-Radar Waveform Design for Beampattern Using Particle-Swarm-Optimisation

    KAUST Repository

    Ahmed, Sajid

    2012-07-31

    Multiple input multiple output (MIMO) radars have many advantages over their phased-array counterparts: improved spatial resolution, better parametric identifiability and greater flexibility to achieve the desired transmit beampattern. Desired transmit beampatterns using MIMO radar require the waveforms to have arbitrary auto- and cross-correlations. To design such waveforms, generally a waveform covariance matrix, R, is synthesised first and then the actual waveforms are designed. Synthesis of the covariance matrix, R, is a constrained optimisation problem, which requires R to be positive semidefinite and all of its diagonal elements to be equal. To satisfy the first constraint the covariance matrix is synthesised indirectly from its square-root matrix U, while for the second constraint the elements of the m-th column of U are parameterised using the coordinates of a hypersphere. This implicitly fulfils both constraints and enables us to write the cost-function in closed form. The cost-function is then optimised using a simple particle-swarm-optimisation (PSO) technique, which requires only the cost-function and can optimise any choice of norm cost-function. © 2012 IEEE.
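The square-root/hypersphere trick described above can be sketched in a few lines: parameterising each column of U by angles on a hypersphere guarantees R = UᵀU is positive semidefinite with a constant diagonal, so an unconstrained PSO can search the angle space. This is an illustrative reconstruction, not the authors' code; the target covariance, dimensions and PSO settings below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 4  # number of transmit antennas (illustrative)

def col_from_angles(phi, r=1.0):
    """Map len(phi) angles to a point on a sphere of radius r, so every
    column of U has the same norm and diag(R) is constant."""
    v = np.empty(len(phi) + 1)
    s = r
    for k, a in enumerate(phi):
        v[k] = s * np.cos(a)
        s *= np.sin(a)
    v[-1] = s
    return v

def covariance(x):
    """x holds (M-1) angles per column of U; R = Uᵀ U is PSD with a
    constant (unit) diagonal by construction."""
    U = np.column_stack([col_from_angles(x[m*(M-1):(m+1)*(M-1)])
                         for m in range(M)])
    return U.T @ U

# Target covariance: here simply the identity (orthogonal waveforms),
# used only to give the PSO a concrete norm cost-function.
R_des = np.eye(M)

def cost(x):
    return np.linalg.norm(covariance(x) - R_des)

def pso(cost, dim, n=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best particle swarm optimiser."""
    pos = rng.uniform(0.0, np.pi, (n, dim))
    vel = np.zeros((n, dim))
    pbest, pbest_f = pos.copy(), np.array([cost(p) for p in pos])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        f = np.array([cost(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

x_opt, f_opt = pso(cost, dim=M * (M - 1))
R_opt = covariance(x_opt)
```

Whatever angles the swarm returns, `R_opt` satisfies both constraints exactly; only the distance to the target pattern is left to the optimiser.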

  6. A study of certain Monte Carlo search and optimisation methods

    International Nuclear Information System (INIS)

    Budd, C.

    1984-11-01

    Studies are described which might lead to the development of a search and optimisation facility for the Monte Carlo criticality code MONK. The facility envisaged could be used to maximise a function of k-effective with respect to certain parameters of the system or, alternatively, to find the system (in a given range of systems) for which that function takes a given value. (UK)

  7. Work management to optimise occupational radiological protection

    International Nuclear Information System (INIS)

    Ahier, B.

    2009-01-01

    Although work management is no longer a new concept, continued efforts are still needed to ensure that good performance, outcomes and trends are maintained in the face of current and future challenges. The ISOE programme thus created an Expert Group on Work Management in 2007 to develop an updated report reflecting the current state of knowledge, technology and experience in the occupational radiological protection of workers at nuclear power plants. Published in 2009, the new ISOE report on Work Management to Optimise Occupational Radiological Protection in the Nuclear Power Industry provides up-to-date practical guidance on the application of work management principles. Work management measures aim at optimising occupational radiological protection in the context of the economic viability of the installation. Important factors in this respect are measures and techniques influencing i) dose and dose rate, including source-term reduction; ii) exposure, including amount of time spent in controlled areas for operations; and iii) efficiency in short- and long-term planning, worker involvement, coordination and training. Equally important due to their broad, cross-cutting nature are the motivational and organisational arrangements adopted. The responsibility for these aspects may reside in various parts of an installation's organisational structure, and thus, a multi-disciplinary approach must be recognised, accounted for and well-integrated in any work. Based on the operational experience within the ISOE programme, the following key areas of work management have been identified: - regulatory aspects; - ALARA management policy; - worker involvement and performance; - work planning and scheduling; - work preparation; - work implementation; - work assessment and feedback; - ensuring continuous improvement. The details of each of these areas are elaborated and illustrated in the report through examples and case studies arising from ISOE experience.

  8. Design of optimised backstepping controller for the synchronisation of chaotic Colpitts oscillator using shark smell algorithm

    Science.gov (United States)

    Fouladi, Ehsan; Mojallali, Hamed

    2018-01-01

    In this paper, an adaptive backstepping controller has been tuned to synchronise two chaotic Colpitts oscillators in a master-slave configuration. The parameters of the controller are determined using the shark smell optimisation (SSO) algorithm. Numerical results are presented and compared with those of the particle swarm optimisation (PSO) algorithm. Simulation results show better performance in terms of accuracy and convergence for the proposed optimised method than for a PSO-optimised or non-optimised backstepping controller.

  9. Public transport optimisation emphasising passengers’ travel behaviour

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo

    Passengers in public transport complaining about their travel experiences are not uncommon. This might seem counterintuitive, since several operators worldwide are presenting better key performance indicators year by year. The present PhD study focuses on developing optimisation algorithms...... to enhance the operations of public transport while explicitly emphasising passengers’ travel behaviour and preferences. Similar to economic theory, interactions between supply and demand are omnipresent in the context of public transport operations. In public transport, the demand is represented...... The PhD study develops a metaheuristic algorithm to adapt the line plan configuration in order to better match passengers’ travel demand in terms of transfers as well as...... compared to the case where the two problems are solved sequentially without taking into account interdependencies.

  10. Separative power of an optimised concurrent gas centrifuge

    Energy Technology Data Exchange (ETDEWEB)

    Bogovalov, Sergey; Boman, Vladimir [National Research Nuclear University (MEPHI), Moscow (Russian Federation)

    2016-06-15

    The problem of separation of isotopes in a concurrent gas centrifuge is solved analytically for an arbitrary binary mixture of isotopes. The separative power of the optimised concurrent gas centrifuge for the uranium isotopes equals δU = 12.7 (V/700 m/s)²(300 K/T)(L/1 m) kg·SWU/yr, where L and V are the length and linear velocity of the rotor of the gas centrifuge and T is the temperature. This equation agrees well with the empirically determined separative power of optimised counter-current gas centrifuges.
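The quoted scaling law is straightforward to evaluate; the helper below merely restates it (it contains no physics beyond the equation itself):

```python
def separative_power(V, T, L):
    """delta-U in kg*SWU/yr for rotor speed V (m/s), gas temperature T (K)
    and rotor length L (m), per the scaling law quoted above."""
    return 12.7 * (V / 700.0) ** 2 * (300.0 / T) * (L / 1.0)

# Reference point: V = 700 m/s, T = 300 K, L = 1 m gives 12.7 kg*SWU/yr;
# halving the rotor speed quarters the separative power.
```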

  11. ExRET-Opt: An automated exergy/exergoeconomic simulation framework for building energy retrofit analysis and design optimisation

    International Nuclear Information System (INIS)

    García Kerdan, Iván; Raslan, Rokia; Ruyssevelt, Paul; Morillón Gálvez, David

    2017-01-01

    Highlights: • Development of a building retrofit-oriented exergoeconomic-based optimisation tool. • A new exergoeconomic cost-benefit indicator is developed for design comparison. • Thermodynamic and thermal comfort variables used as constraints and/or objectives. • Irreversibilities and exergetic cost for end-use processes are substantially reduced. • Robust methodology that should be pursued in everyday building retrofit practice. - Abstract: Energy simulation tools have a major role in the assessment of building energy retrofit (BER) measures. Exergoeconomic analysis and optimisation is a common practice in sectors such as power generation and chemical processes, aiding engineers to obtain more energy-efficient and cost-effective energy system designs. ExRET-Opt, a retrofit-oriented modular-based dynamic simulation framework, has been developed by embedding a comprehensive exergy/exergoeconomic calculation method into a typical open-source building energy simulation tool (EnergyPlus). The aim of this paper is to show the decomposition of ExRET-Opt by presenting the modules, submodules and subroutines used for the framework’s development, as well as to verify the outputs with existing research data. In addition, the possibility to perform multi-objective optimisation analysis based on genetic algorithms combined with multi-criteria decision making methods was included within the simulation framework. This addition could enable BER design teams to perform quick exergy/exergoeconomic optimisation, in order to find opportunities for thermodynamic improvements along the building’s active and passive energy systems. The enhanced simulation framework is tested using a primary school building as a case study. Results demonstrate that the proposed simulation framework provides users with thermodynamically efficient and cost-effective designs, even under tight thermodynamic and economic constraints, suggesting its use in everyday BER practice.

  12. An exergy-based multi-objective optimisation model for energy retrofit strategies in non-domestic buildings

    International Nuclear Information System (INIS)

    García Kerdan, Iván; Raslan, Rokia; Ruyssevelt, Paul

    2016-01-01

    While the building sector has a significant thermodynamic improvement potential, exergy analysis has been shown to provide new insight for the optimisation of building energy systems. This paper presents an exergy-based multi-objective optimisation tool that aims to assess the impact of a diverse range of retrofit measures with a focus on non-domestic buildings. EnergyPlus was used as a dynamic calculation engine for first law analysis, while a Python add-on was developed to link dynamic exergy analysis and a Genetic Algorithm optimisation process with the aforementioned software. Two UK archetype case studies (an office and a primary school) were used to test the feasibility of the proposed framework. Different measures combinations based on retrofitting the envelope insulation levels and the application of different HVAC configurations were assessed. The objective functions in this study are annual energy use, occupants' thermal comfort, and total building exergy destructions. A large range of optimal solutions was achieved highlighting the framework capabilities. The model achieved improvements of 53% in annual energy use, 51% of exergy destructions and 66% of thermal comfort for the school building, and 50%, 33%, and 80% for the office building. This approach can be extended by using exergoeconomic optimisation. - Highlights: • Integration of dynamic exergy analysis into a retrofit-oriented simulation tool. • Two UK non-domestic building archetypes are used as case studies. • The model delivers non-dominated solutions based on energy, exergy and comfort. • Exergy destructions of ERMs are optimised using GA algorithms. • Strengths and limitations of the proposed exergy-based framework are discussed.

  13. Auto-optimisation for three-dimensional conformal radiotherapy of nasopharyngeal carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Wu, V.W.C. E-mail: orvinwu@polyu.edu.hk; Kwong, D.W.L.; Sham, J.S.T.; Mui, A.W.L

    2003-08-01

    Purpose: The purpose of this study was to evaluate the application of auto-optimisation in the treatment planning of three-dimensional conformal radiotherapy (3DCRT) of nasopharyngeal carcinoma (NPC). Methods: Twenty-nine NPC patients were planned by both forward planning and auto-optimisation methods. The forward plans, which consisted of three coplanar facial fields, were produced according to the routine planning criteria. The auto-optimised plans, which consisted of 5-15 (median 9) fields, were generated by the planning system after prescribing the dose requirements and the importance weightings of the planning target volume and organs at risk. Plans produced by the two planning methods were compared by the dose volume histogram, tumour control probability (TCP), conformity index and normal tissue complication probability (NTCP). Results: The auto-optimised plans reduced the average planner's time by over 35 min. They demonstrated better TCP and conformity index than the forward plans (P=0.03 and 0.04, respectively). In addition, the parotid gland and temporo-mandibular (TM) joint were better spared, with mean dose reductions of 31.8% and 17.7%, respectively. The slight trade-off was a mild dose increase in the spinal cord and brain stem, with their maximum doses remaining within the tolerance limits. Conclusions: The findings demonstrated the potential of auto-optimisation for improving target dose and parotid sparing in the 3DCRT of NPC while saving the planner's time.


  15. Techno-Economic Models for Optimised Utilisation of Jatropha curcas Linnaeus under an Out-Grower Farming Scheme in Ghana

    Directory of Open Access Journals (Sweden)

    Isaac Osei

    2016-11-01

    Full Text Available Techno-economic models for optimised utilisation of jatropha oil under an out-grower farming scheme were developed based on different considerations for oil and by-product utilisation. Model 1: an out-grower scheme where oil is exported and press cake is utilised for compost. Model 2: an out-grower scheme with six scenarios considered for the utilisation of oil and by-products. Linear programming models were developed based on the outcomes of the models to optimise the use of the oil through profit maximisation. The findings revealed that Model 1 was financially viable from the processors’ perspective but not for the farmer at a seed price of $0.07/kg. All scenarios considered under Model 2 were financially viable from the processors’ perspective but not for the farmer at a seed price of $0.07/kg; however, at a seed price of $0.085/kg, financial viability was achieved for both parties. Optimising the utilisation of the oil resulted in an annual maximum profit of $123,300.
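With one shared resource (the oil) and independent per-use demand caps, a profit-maximising linear programme of this kind reduces to allocating oil to uses in decreasing order of profit per litre, which the sketch below exploits. All prices, capacities and margins are invented for illustration and bear no relation to the paper's data.

```python
# Single-resource allocation LP solved greedily: with one shared capacity
# and per-use demand caps, allocating to uses in order of profit density
# is optimal. All figures below are invented for illustration.

OIL_AVAILABLE = 100_000  # litres of jatropha oil per year

USES = [  # (name, profit $/litre, maximum saleable litres)
    ("export",    0.30, 60_000),
    ("biodiesel", 0.45, 40_000),
    ("soap",      0.60, 20_000),
]

def optimise(oil, uses):
    """Allocate oil to the most profitable uses first, up to each cap."""
    plan, profit = {}, 0.0
    for name, margin, cap in sorted(uses, key=lambda u: -u[1]):
        qty = min(cap, oil)
        plan[name], oil = qty, oil - qty
        profit += margin * qty
    return plan, profit

plan, profit = optimise(OIL_AVAILABLE, USES)
```

With these invented figures, soap and biodiesel are filled to their caps and the remaining 40,000 litres go to export, for a maximum profit of $42,000.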

  16. Application of Three Existing Stope Boundary Optimisation Methods in an Operating Underground Mine

    Science.gov (United States)

    Erdogan, Gamze; Yavuz, Mahmut

    2017-12-01

    The underground mine planning and design optimisation process has received little attention because of the complexity and variability of problems in underground mines. Although a number of optimisation studies and software tools are available, and some of them in particular have been implemented effectively to determine the ultimate-pit limits in open-pit mines, there is still a lack of studies on the optimisation of ultimate stope boundaries in underground mines. The proposed approaches for this purpose aim at maximising the economic profit by selecting the best possible layout under operational, technical and physical constraints. In this paper, three existing heuristic techniques, the Floating Stope Algorithm, the Maximum Value Algorithm and the Mineable Shape Optimiser (MSO), are examined for optimisation of the stope layout in a case study. Each technique is assessed in terms of applicability, algorithm capabilities and limitations, considering the underground mine planning challenges. Finally, the results are evaluated and compared.

  17. Benchmarks for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...

  18. The development and optimisation of 3D black-blood R2* mapping of the carotid artery wall.

    Science.gov (United States)

    Yuan, Jianmin; Graves, Martin J; Patterson, Andrew J; Priest, Andrew N; Ruetten, Pascal P R; Usman, Ammara; Gillard, Jonathan H

    2017-12-01

    To develop and optimise a 3D black-blood R2* mapping sequence for imaging the carotid artery wall, using optimal blood suppression and k-space view ordering. Two blood suppression preparation methods were used: Delay Alternating with Nutation for Tailored Excitation (DANTE) and improved Motion Sensitive Driven Equilibrium (iMSDE), each combined with a three-dimensional (3D) multi-echo Fast Spoiled GRadient echo (ME-FSPGR) readout. Three k-space view-order designs were investigated: Radial Fan-beam Encoding Ordering (RFEO), Distance-Determined Encoding Ordering (DDEO) and Centric Phase Encoding Order (CPEO). The sequences were evaluated through Bloch simulation and in a cohort of twenty volunteers. The vessel wall Signal-to-Noise Ratio (SNR), Contrast-to-Noise Ratio (CNR) and R2*, and the sternocleidomastoid muscle R2*, were measured and compared. Different numbers of acquisitions-per-shot (APS) were evaluated to further optimise the effectiveness of blood suppression. All sequences yielded R2* measurements in the sternocleidomastoid muscle comparable to a conventional, i.e. non-blood-suppressed, sequence. Both Bloch simulations and volunteer data showed that DANTE has a higher signal intensity and results in a higher image SNR than iMSDE. Blood suppression efficiency was not significantly different between k-space view orders. Smaller APS achieved better blood suppression. The use of blood-suppression preparation methods does not affect the measurement of R2*. A DANTE-prepared ME-FSPGR sequence with a small number of acquisitions-per-shot can provide high-quality black-blood R2* measurements of the carotid vessel wall. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Production of a generic microbial feedstock for lignocellulose biorefineries through sequential bioprocessing.

    Science.gov (United States)

    Chang, Chen-Wei; Webb, Colin

    2017-03-01

    Lignocellulosic materials, mostly from agricultural and forestry residues, provide a potential renewable resource for sustainable biorefineries. Reducing sugars can be produced only after a pre-treatment stage, which normally involves chemicals but can be biological. In this case, two steps are usually necessary: solid-state cultivation of fungi for deconstruction, followed by enzymatic hydrolysis using cellulolytic enzymes. In this research, solid-state bioprocessing using the fungus Trichoderma longibrachiatum was implemented as a simultaneous microbial pretreatment and in-situ enzyme production method for fungal autolysis and further enzymatic hydrolysis of the fermented solids. Suspending the fermented solids in water at 50°C led to the highest hydrolysis yields of 226 mg/g reducing sugar and 7.7 mg/g free amino nitrogen (FAN). The resultant feedstock was shown to be suitable for the production of various products including ethanol. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Optimisation of electron beam characteristics by simulated annealing

    International Nuclear Information System (INIS)

    Ebert, M.A.; University of Adelaide, SA; Hoban, P.W.

    1996-01-01

    Full text: With the development of technology in the field of treatment beam delivery, the possibility of tailoring radiation beams (via manipulation of the beam's phase space) is foreseeable. This investigation involved evaluating a method for determining the characteristics of pure electron beams which provided dose distributions that best approximated desired distributions. The aim is to determine which degrees of freedom are advantageous and worth pursuing in a clinical setting. A simulated annealing routine was developed to determine optimum electron beam characteristics. A set of beam elements is defined at the surface of a homogeneous water-equivalent phantom, defining discrete positions and angles of incidence, and electron energies. The optimal weighting of these elements is determined by the (generally approximate) solution to the linear equation Dw = d, where d represents the dose distribution calculated over the phantom, w the vector of (50–2×10⁴) beam element relative weights, and D a normalised matrix of dose deposition kernels. In the iterative annealing procedure, beam elements are randomly selected, and beam weighting distributions are sampled and used to perturb the selected elements. Perturbations are accepted or rejected according to standard simulated annealing criteria. The result (after the algorithm has terminated on meeting an iteration or optimisation specification) is an approximate solution for the beam weight vector w specified by the above equation. This technique has been applied to several sample dose distributions and phase space restrictions. An example is given of the phase space obtained when endeavouring to conform to a rectangular 100% dose region with polyenergetic though normally incident electrons. For regular distributions, intuitive conclusions regarding the benefits of energy/angular manipulation may be made, whereas for complex distributions, variations in intensity over beam elements of varying energy and
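The annealing loop described (random selection of a beam element, perturbation of its weight, Metropolis acceptance, cooling) can be sketched on a toy instance of Dw = d. The matrix sizes, step size and cooling schedule below are arbitrary illustrative choices, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny invented instance of D w ≈ d: 5 dose points, 3 beam elements.
D = rng.uniform(0.0, 1.0, (5, 3))    # dose-deposition kernels
w_true = np.array([0.2, 0.5, 0.3])   # hidden weights used to build d
d = D @ w_true                        # target dose distribution

def cost(w):
    """Residual norm of the linear system Dw = d."""
    return np.linalg.norm(D @ w - d)

w = rng.uniform(0.0, 1.0, 3)          # initial beam-element weights
T, alpha = 1.0, 0.995                 # temperature and cooling rate
best_w, best_f = w.copy(), cost(w)

for _ in range(4000):
    trial = w.copy()
    j = rng.integers(3)               # pick a random beam element
    trial[j] = max(0.0, trial[j] + rng.normal(0.0, 0.1))
    df = cost(trial) - cost(w)
    # Metropolis criterion: always accept improvements; accept worsening
    # moves with a probability that decays as the temperature cools.
    if df < 0 or rng.random() < np.exp(-df / T):
        w = trial
        if cost(w) < best_f:
            best_w, best_f = w.copy(), cost(w)
    T *= alpha
```

The max(0, ...) clip keeps beam weights non-negative, mirroring the physical constraint that element weights cannot be negative.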

  1. HERCA Position Paper. The process of CT dose optimisation through education and training and role of CT Manufacturers - October 2014. Addendum to HERCA CT Position paper: The process of CT dose optimisation through education and training and the role of the manufacturers - November 2015

    International Nuclear Information System (INIS)

    2014-10-01

    CT is the most important source of exposures to radiation in most developed countries today. For this reason CT dose optimisation is of great importance. In this position paper four main stakeholders who are involved in CT dose optimisation are identified. These are the CT manufacturers, the medical doctors, the CT technologists and the medical physicists. HERCA has been working together with the CT manufacturers and COCIR since 2010 following a self-commitment provided by COCIR in 2011. A number of dose optimisation and management tools have been developed by the CT manufacturers and are now available on modern CT scanners. These are presented in this paper. The process of CT dose optimisation can only be achieved if all the stakeholders involved work together as a team and are educated and trained in the use of CT dose optimisation and management tools. The CT manufacturers have an important role in this process. They need to ensure that their staff is properly trained, they need to provide proper education and training to the other three stakeholders involved and these three stakeholders need to find the time and be willing to be trained. This is clearly stated in this position paper with the aim of ensuring appropriate and effective use of CT imaging equipment. On 1 April 2015, HERCA organised a multi-stakeholder meeting kindly hosted by the French Nuclear Safety Authority (ASN) in its premises in Paris. The stakeholders included: - COCIR, supported by the main manufacturers of CT equipment (GE, Philips, Siemens and Toshiba), - The professional organisations: ESR, ESPR, EFRS, EANM, ESTRO and EFOMP, - The international organisations IAEA, EC, and the US FDA (present as observers). The objective of the meeting was to exchange views with a variety of key stakeholders on issues with regard to the optimised use of computed tomography (CT) scanners. The ultimate goal of this focus on dose optimisation is to ensure the best patient care by providing an optimised

  2. Visual grading characteristics and ordinal regression analysis during optimisation of CT head examinations.

    Science.gov (United States)

    Zarb, Francis; McEntee, Mark F; Rainford, Louise

    2015-06-01

    To evaluate visual grading characteristics (VGC) and ordinal regression analysis during head CT optimisation as a potential alternative to visual grading assessment (VGA), traditionally employed to score anatomical visualisation. Patient images (n = 66) were obtained using current and optimised imaging protocols from two CT suites: a 16-slice scanner at the national Maltese centre for trauma and a 64-slice scanner in a private centre. Local resident radiologists (n = 6) performed VGA followed by VGC and ordinal regression analysis. VGC alone indicated that optimised protocols had image quality similar to that of current protocols. Ordinal logistic regression analysis provided an in-depth evaluation, criterion by criterion, allowing the selective implementation of the protocols. The local radiology review panel supported the implementation of optimised protocols for brain CT examinations (including trauma) in one centre, achieving radiation dose reductions ranging from 24% to 36%. In the second centre a 29% reduction in radiation dose was achieved for follow-up cases. The combined use of VGC and ordinal logistic regression analysis led to clinical decisions being taken on the implementation of the optimised protocols. This improved method of image quality analysis provided the evidence to support imaging protocol optimisation, resulting in significant radiation dose savings. • There is a need for scientifically based image quality evaluation during CT optimisation. • VGC and ordinal regression analysis in combination led to better informed clinical decisions. • VGC and ordinal regression analysis led to dose reductions without compromising diagnostic efficacy.

  3. Optimising Service Delivery of AAC AT Devices and Compensating AT for Dyslexia.

    Science.gov (United States)

    Roentgen, Uta R; Hagedoren, Edith A V; Horions, Katrien D L; Dalemans, Ruth J P

    2017-01-01

    To promote successful use of Assistive Technology (AT) supporting Augmentative and Alternative Communication (AAC) and compensating for dyslexia, the last steps of their provision (delivery and instruction, use, maintenance and evaluation) were optimised. In co-creation with all stakeholders, an integral method and tools were developed based on a list of requirements.

  4. Credit price optimisation within retail banking

    African Journals Online (AJOL)

    2014-02-14

    Feb 14, 2014 ... cost-based pricing, where the price of a product or service is based on the .... function obtained from fitting a logistic regression model .... Note that the proposed optimisation approach below will allow us to also incorporate.

  5. Optimisation of Oil Production in Two – Phase Flow Reservoir Using Simultaneous Method and Interior Point Optimiser

    DEFF Research Database (Denmark)

    Lerch, Dariusz Michal; Völcker, Carsten; Capolei, Andrea

    2012-01-01

    in the reservoir. A promising decrease of these remained resources can be provided by smart wells applying water injections to sustain satisfactory pressure level in the reservoir throughout the whole process of oil production. Basically to enhance secondary recovery of the remaining oil after drilling, water...... is injected at the injection wells of the down-hole pipes. This sustains the pressure in the reservoir and drives oil towards production wells. There are however, many factors contributing to the poor conventional secondary recovery methods e.g. strong surface tension, heterogeneity of the porous rock...... fields, or closed loop optimisation, can be used for optimising the reservoir performance in terms of net present value of oil recovery or another economic objective. In order to solve an optimal control problem we use a direct collocation method where we translate a continuous problem into a discrete...

  6. Alternatives for optimisation of rumen fermentation in ruminants

    Directory of Open Access Journals (Sweden)

    T. Slavov

    2017-06-01

    Full Text Available Abstract. Proper knowledge of the variety of events occurring in the rumen makes possible their optimisation with respect to complete feed conversion and increasing the productive performance of ruminants. The inclusion of various dietary additives (supplements, biologically active substances, nutritional antibiotics, probiotics, enzymatic preparations, plant extracts etc.) has an effect on the intensity and specific pathway of fermentation, and thus on overall digestion and systemic metabolism. The optimisation of rumen digestion is a method with substantial potential for improving the efficiency of ruminant husbandry, increasing the quality of their produce and maintaining their health.

  7. Optimising a shaft's geometry by applying genetic algorithms

    Directory of Open Access Journals (Sweden)

    María Alejandra Guzmán

    2005-05-01

    Full Text Available Many engineering design tasks involve optimising several conflicting goals; these types of problem are known as Multiobjective Optimisation Problems (MOPs). Evolutionary techniques have proved to be an effective tool for finding solutions to these MOPs during the last decade. Variations on the basic genetic algorithm have been proposed by different researchers for rapidly finding optimal solutions to MOPs. The NSGA (Non-dominated Sorting Genetic Algorithm) has been implemented in this paper for finding an optimal design for a shaft subjected to cyclic loads, the conflicting goals being minimum weight and minimum lateral deflection.
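
    The non-dominated sorting step at the heart of the NSGA can be sketched compactly. The (weight, deflection) values below are hypothetical, chosen only to illustrate the Pareto-front grouping; this is not the paper's shaft model.

```python
# Minimal sketch of non-dominated sorting, the core ranking step of the NSGA.

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Group points into Pareto fronts (front 0 = the non-dominated set)."""
    fronts, remaining = [], list(points)
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# (weight, deflection) candidates for a shaft design, both to be minimised
designs = [(2.0, 5.0), (3.0, 3.0), (4.0, 4.0), (5.0, 1.0), (6.0, 6.0)]
fronts = non_dominated_sort(designs)
print(fronts[0])  # the Pareto-optimal trade-offs
```

    In the full algorithm this ranking drives selection: individuals in earlier fronts are preferred, so the population is pushed towards the trade-off surface between the conflicting goals.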

  8. Reduction of environmental effects of civil aircraft through multi-objective flight plan optimisation

    International Nuclear Information System (INIS)

    Lee, D S; Gonzalez, L F; Walker, R; Periaux, J; Onate, E

    2010-01-01

    With rising environmental concern, the reduction of critical aircraft emissions, including carbon dioxide (CO2) and nitrogen oxides (NOx), is one of the most important aeronautical problems. There are many possible ways to attack the problem, such as designing new wing/aircraft shapes or more efficient engines. This paper instead provides a set of acceptable flight plans as a first step, without replacing current aircraft. The paper investigates green aircraft design optimisation in terms of aircraft range, mission fuel weight (CO2) and NOx using advanced Evolutionary Algorithms coupled to flight optimisation system software. Two multi-objective design optimisations are conducted to find the best set of flight plans for current aircraft, considering discretised altitudes and Mach numbers, without redesigning the aircraft shape or engine type. The objectives of the first optimisation are to maximise aircraft range while minimising NOx at constant mission fuel weight. The second optimisation minimises mission fuel weight and NOx at fixed aircraft range. Numerical results show that the method is able to capture a set of useful trade-offs that reduce NOx and CO2 (minimum mission fuel weight).

  9. Topology optimisation of passive coolers for light-emitting diode lamps

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    2015-01-01

    This work applies topology optimisation to the design of passive coolers for light-emitting diode (LED) lamps. The heat sinks are cooled by the natural convection currents arising from the temperature difference between the LED lamp and the surrounding air. A large scale parallel computational....... The optimisation results show interesting features that are currently being incorporated into industrial designs for enhanced passive cooling abilities....

  10. Nuclear power plant maintenance optimisation SENUF network activity

    Energy Technology Data Exchange (ETDEWEB)

    Ahlstrand, R.; Bieth, M.; Pla, P.; Rieg, C.; Trampus, P. [Inst. for Energy, EC DG Joint Research Centre, Petten (Netherlands)

    2004-07-01

    While providing scientific and technical support to the TACIS and PHARE nuclear safety programmes, a large amount of knowledge related to Russian-designed reactor systems has accumulated, leading to the creation of a new network on nuclear safety in Central and Eastern Europe called ''Safety of Eastern European type Nuclear Facilities'' (SENUF). SENUF brings together all stakeholders of TACIS and PHARE: beneficiaries, end users, and Eastern and Western nuclear industries, thus favouring fruitful technical exchanges and feedback of experience. At present the main focus of SENUF is nuclear power plant maintenance, as a substantial element of plant operational safety as well as of life management. A Working Group has been established on plant maintenance. One of its major tasks in 2004 is to prepare a status report on advanced strategies to optimise maintenance. Optimisation projects have an interface with the plant's overall life management programme. Today, almost all plants involved in the SENUF network have an explicit policy to extend their service life; thus, component ageing management, modernisation and refurbishment actions have become much more important. A database is also under development, intended to help share the available knowledge and specific equipment and tools. (orig.)

  11. Optimising parallel R correlation matrix calculations on gene expression data using MapReduce.

    Science.gov (United States)

    Wang, Shicai; Pandis, Ioannis; Johnson, David; Emam, Ibrahim; Guitton, Florian; Oehmichen, Axel; Guo, Yike

    2014-11-05

    High-throughput molecular profiling data has been used to improve clinical decision making by stratifying subjects based on their molecular profiles. Unsupervised clustering algorithms can be used for stratification purposes. However, the current speed of the clustering algorithms cannot meet the requirement of large-scale molecular data due to poor performance of the correlation matrix calculation. With high-throughput sequencing technologies promising to produce even larger datasets per subject, we expect the performance of the state-of-the-art statistical algorithms to be further impacted unless efforts towards optimisation are carried out. MapReduce is a widely used high performance parallel framework that can solve the problem. In this paper, we evaluate the current parallel modes for correlation calculation methods and introduce an efficient data distribution and parallel calculation algorithm based on MapReduce to optimise the correlation calculation. We studied the performance of our algorithm using two gene expression benchmarks. In the micro-benchmark, our implementation using MapReduce, based on the R package RHIPE, demonstrates a 3.26-5.83 fold increase compared to the default Snowfall and a 1.56-1.64 fold increase compared to the basic RHIPE in the Euclidean, Pearson and Spearman correlations. Though vanilla R and the optimised Snowfall outperform our optimised RHIPE in the micro-benchmark, they do not scale well with the macro-benchmark. In the macro-benchmark the optimised RHIPE performs 2.03-16.56 times faster than vanilla R. Benefiting from the 3.30-5.13 times faster data preparation, the optimised RHIPE performs 1.22-1.71 times faster than the optimised Snowfall. Both the optimised RHIPE and the optimised Snowfall successfully perform the Kendall correlation with the TCGA dataset within 7 hours. Both run more than 30 times faster than the estimated vanilla R. The performance evaluation found that the new MapReduce algorithm and its
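
    As a rough illustration of the map/reduce split described above, the sketch below computes gene-pair Pearson correlations as independent map tasks followed by a reduce that collects the results. It is a pure-Python, single-process stand-in with invented gene names and data, not the authors' RHIPE/Hadoop implementation, where each map task would run on a separate cluster node.

```python
# MapReduce-style pairwise Pearson correlation over gene expression rows.
from itertools import combinations

import numpy as np

def map_phase(pair):
    """Map task: compute the correlation for one gene pair."""
    (gi, xi), (gj, xj) = pair
    return (gi, gj), float(np.corrcoef(xi, xj)[0, 1])

def reduce_phase(mapped):
    """Reduce task: collect (key, value) results into one correlation table."""
    return dict(mapped)

rng = np.random.default_rng(1)
genes = {f"gene{i}": rng.normal(size=50) for i in range(6)}
pairs = combinations(genes.items(), 2)   # each pair is one independent map task
corr = reduce_phase(map(map_phase, pairs))
print(len(corr), "gene-pair correlations")
```

    Because every pair is independent, the map phase parallelises trivially, which is what makes the correlation matrix a good fit for MapReduce-style distribution.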

  12. Bioprocessing of lignite coals using reductive microorganisms. Final technical report, September 30, 1988--March 29, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D.L.

    1992-03-29

    In order to convert lignite coals into liquid fuels, gases or chemical feedstock, the macromolecular structure of the coal must be broken down into low molecular weight fractions prior to further modification. Our research focused on this aspect of coal bioprocessing. We isolated, characterized and studied the lignite coal-depolymerizing organisms Streptomyces viridosporus T7A, Pseudomonas sp. DLC-62, unidentified bacterial strain DLC-BB2 and Gram-positive Bacillus megaterium strain DLC-21. In this research we showed that these bacteria are able to solubilize and depolymerize lignite coals using a combination of biological mechanisms, including the excretion of coal-solubilizing basic chemical metabolites and extracellular coal-depolymerizing enzymes.

  13. Crystal structure optimisation using an auxiliary equation of state

    Science.gov (United States)

    Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T.; Walsh, Aron

    2015-11-01

    Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy-volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other "beyond" density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.
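
    The idea of predicting the equilibrium volume from an equation of state fitted to a handful of energy calculations can be sketched as follows. The energy-volume data are synthetic, and the third-order Birch-Murnaghan form is one common choice of auxiliary equation of state, used here for illustration rather than as the paper's exact parameterisation.

```python
# Fit a Birch-Murnaghan equation of state to a few E(V) points and read off
# the equilibrium volume (synthetic data, not the paper's DFT results).
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(v, e0, v0, b0, bp):
    """Third-order Birch-Murnaghan energy-volume relation."""
    eta = (v0 / v) ** (2.0 / 3.0)
    return e0 + 9.0 * v0 * b0 / 16.0 * (
        (eta - 1.0) ** 3 * bp + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
    )

# synthetic E(V) curve generated from known parameters
true = dict(e0=-10.0, v0=40.0, b0=0.6, bp=4.5)
v = np.linspace(34, 46, 9)
e = birch_murnaghan(v, **true)

popt, _ = curve_fit(birch_murnaghan, v, e, p0=[-9.0, 38.0, 0.5, 4.0])
e0, v0, b0, bp = popt
print(f"equilibrium volume ~ {v0:.2f}")
```

    In the workflow described above, each E(V) point would come from a single-point electronic-structure calculation, so a reliable fit from few points directly reduces the number of expensive evaluations.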

  14. Crystal structure optimisation using an auxiliary equation of state

    International Nuclear Information System (INIS)

    Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T.; Walsh, Aron

    2015-01-01

    Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy–volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other “beyond” density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.

  15. Crystal structure optimisation using an auxiliary equation of state

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T. [Centre for Sustainable Chemical Technologies and Department of Chemistry, University of Bath, Claverton Down, Bath BA2 7AY (United Kingdom); Walsh, Aron, E-mail: a.walsh@bath.ac.uk [Centre for Sustainable Chemical Technologies and Department of Chemistry, University of Bath, Claverton Down, Bath BA2 7AY (United Kingdom); Global E3 Institute and Department of Materials Science and Engineering, Yonsei University, Seoul 120-749 (Korea, Republic of)

    2015-11-14

    Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy–volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other “beyond” density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.

  16. Cogeneration technologies, optimisation and implementation

    CERN Document Server

    Frangopoulos, Christos A

    2017-01-01

    Cogeneration refers to the use of a power station to deliver two or more useful forms of energy, for example, to generate electricity and heat at the same time. This book provides an integrated treatment of cogeneration, including a tour of the available technologies and their features, and how these systems can be analysed and optimised.

  17. Optimisation of ultrafiltration of a highly viscous protein solution using spiral-wound modules

    DEFF Research Database (Denmark)

    Lipnizki, Jens; Casani, S.; Jonsson, Gunnar Eigil

    2005-01-01

    The ultrafiltration process of highly viscous protein process water with spiral-wound modules was optimised by analysing the fouling and developing a strategy to reduce it. It was shown that the flux reduction during filtration is mainly caused by the adsorption of proteins on the membrane and no...

  18. A novel low-power fluxgate sensor using a macroscale optimisation technique for space physics instrumentation

    Science.gov (United States)

    Dekoulis, G.; Honary, F.

    2007-05-01

    This paper describes the design of a novel low-power single-axis fluxgate sensor. Several soft magnetic alloy materials have been considered, and the choice was based on the balance between maximum permeability and minimum saturation flux density values. The sensor has been modelled using the Finite Integration Theory (FIT) method. The sensor was subjected to a custom macroscale optimisation technique that significantly reduced the power consumption, by a factor of 16. The results of the sensor's optimisation technique will subsequently be used in the development of a cutting-edge ground-based magnetometer for the study of the complex solar wind-magnetospheric-ionospheric system.

  19. Dipeptidyl Peptidase-4 Inhibitor Development and Post-authorisation Programme for Vildagliptin - Clinical Evidence for Optimised Management of Chronic Diseases Beyond Type 2 Diabetes.

    Science.gov (United States)

    Strain, William David; Paldánius, Päivi M

    2017-08-01

    The last decade has witnessed the role of dipeptidyl peptidase-4 (DPP-4) inhibitors in producing a conceptual change in early management of type 2 diabetes mellitus (T2DM) by shifting emphasis from a gluco-centric approach to holistically treating underlying pathophysiological processes. DPP-4 inhibitors highlighted the importance of acknowledging hypoglycaemia and weight gain as barriers to optimised care in T2DM. These complications were an integral part of diabetes management before the introduction of DPP-4 inhibitors. During the development of DPP-4 inhibitors, regulatory requirements for introducing new agents underwent substantial changes, with increased emphasis on safety. This led to the systematic collection of adjudicated cardiovascular (CV) safety data and, where a lack of harm could not be demonstrated with 95% confidence, to standardised CV safety studies. Furthermore, the growing awareness of the worldwide extent of T2DM demanded a more diverse approach to recruitment and participation in clinical trials. Finally, the global financial crisis brought a new awareness of the health economics of diabetes, which rapidly became the most expensive disease in the world. This review encompasses unique developments in the global landscape, and the role DPP-4 inhibitors, specifically vildagliptin, have played in research advancement and optimisation of diabetes care in a diverse population with T2DM worldwide.

  20. Control strategies to optimise power output in heave buoy energy convertors

    International Nuclear Information System (INIS)

    Abu Zarim, M A U A; Sharip, R M

    2013-01-01

    Wave energy converter (WEC) designs are continually discussed with the aim of obtaining an optimum design for generating power from waves. The output power of a wave energy converter can be improved by controlling the oscillation so as to shape the interaction between the WEC and the incident wave. The purpose of this research is to study heave buoys with the aim of generating optimum power output by optimising phase and amplitude control in order to maximise the active power. In line with the aims of this study, which investigates the theory and function of heave buoys as renewable energy sources and hence optimises their power generation, the conditions that influence the heave buoy must be understood so that control strategies can be proposed for tuning parameters to obtain optimum power output. This research is, however, at an early stage, and further analysis and technical development are required

  1. Flow optimisation of a biomass mixer; Stroemungstechnische Optimierung eines Biomasse-Ruehrwerks

    Energy Technology Data Exchange (ETDEWEB)

    Casartelli, E.; Waser, R. [Hochschule fuer Technik und Architektur Luzern (HTA), Horw (Switzerland); Fankhauser, H. [Fankhauser Maschinenfabrik, Malters (Switzerland)

    2007-03-15

    This illustrated final report for the Swiss Federal Office of Energy (SFOE) describes the optimisation of a mixing system used in biomass reactors. The aim of this work was to improve the fluid-dynamic qualities of the mixer in order to increase its efficiency while, at the same time, maintaining robustness and low price. Investigative work performed with CFD (Computational Fluid Dynamics) is reported; the authors note that CFD is very effective in solving such optimisation problems, as it is suited to flows that are not easily accessible for measurement. Experiments were performed on a fermenter/mixer model in order to confirm the computational findings. The results obtained with two- and three-dimensional simulations are presented and discussed, as are those from tests with a 1:10 scale model of a digester. Initial tests with the newly developed mixer propellers in a real-life biogas installation are reported, and further tests to be made are listed.

  2. Process integration in bioprocess industry: waste heat recovery in yeast and ethyl alcohol plant

    International Nuclear Information System (INIS)

    Raskovic, P.; Anastasovski, A.; Markovska, Lj.; Mesko, V.

    2010-01-01

    The process integration of a bioprocess plant for the production of yeast and alcohol was studied. A preliminary energy audit of the plant identified a huge amount of thermal losses, caused by waste heat in exhausted process streams, and revealed great potential for energy efficiency improvement through a heat recovery system. The research roadmap, based on a process integration approach, is divided into six phases, and the primary tool used for the design of the heat recovery network was Pinch Analysis. The performance of the preliminary design was obtained by a targeting procedure for three process stream sets and evaluated against economic criteria. The results of the process integration study are presented in the form of heat exchanger networks which utilise the waste heat and enable considerable energy savings with a short payback period.
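
    The targeting step of Pinch Analysis mentioned above can be illustrated with the classic problem-table cascade: shift stream temperatures by ΔTmin/2, balance heat over each temperature interval, and cascade the surpluses to read off the minimum hot and cold utility. The stream data below are invented for illustration, not the plant's actual yeast/alcohol process streams.

```python
# Problem-table targeting sketch for Pinch Analysis.

def pinch_targets(hot, cold, dtmin):
    """hot/cold: lists of (T_supply, T_target, CP). Returns (Qh_min, Qc_min)."""
    shifted = [(ts - dtmin / 2, tt - dtmin / 2, cp, "hot") for ts, tt, cp in hot]
    shifted += [(ts + dtmin / 2, tt + dtmin / 2, cp, "cold") for ts, tt, cp in cold]
    bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0
        for ts, tt, cp, kind in shifted:
            top, bot = max(ts, tt), min(ts, tt)
            if top >= hi and bot <= lo:      # stream fully spans this interval
                net += cp * (hi - lo) if kind == "hot" else -cp * (hi - lo)
        heat += net
        cascade.append(heat)
    qh = -min(cascade)        # lift the cascade so no interval goes negative
    qc = cascade[-1] + qh
    return qh, qc

# one hot stream (200 -> 100 C, CP = 1 kW/K), one cold stream (80 -> 180 C, CP = 1.5)
qh, qc = pinch_targets([(200, 100, 1.0)], [(80, 180, 1.5)], dtmin=10)
print(qh, qc)  # minimum hot and cold utility in kW
```

    Because the interval boundaries include every shifted stream endpoint, each stream either fully spans an interval or is absent from it, which keeps the interval balance exact.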

  3. Décomposition-coordination en optimisation déterministe et stochastique

    CERN Document Server

    Carpentier, Pierre

    2017-01-01

    This book considers the treatment of large-scale optimisation problems. The idea is to split the global optimisation problem into smaller sub-problems that are easier to solve, each involving one of the subsystems (decomposition), but without giving up the global optimum, which requires an iterative procedure (coordination). This subject was covered in several books published in the 1970s in the context of deterministic optimisation. Here we present the essential principles and methods of decomposition-coordination through typical situations, then propose a general framework for constructing correct algorithms and studying their convergence. This theory is presented in the context of both deterministic and stochastic optimisation. This material has been taught by the authors in various graduate courses and has also been applied in numerous industrial applications. Exerc...

  4. Optimisation of trawl energy efficiency under fishing effort constraint

    OpenAIRE

    Priour, Daniel; Khaled, Ramez

    2009-01-01

    Trawl energy efficiency is greatly affected by the drag, as well as by the swept area. The drag results in increased energy consumption, while the swept area influences the catch. Methods for optimising the trawl design have been developed in order to reduce the volume of fuel per kg of caught fish, and consequently the drag per swept area of the trawl. Based on a finite element method model for flexible netting structures, the tool modifies a reference design step by step. For e...

  5. Thickness Optimisation of Textiles Subjected to Heat and Mass Transport during Ironing

    Directory of Open Access Journals (Sweden)

    Korycki Ryszard

    2016-09-01

    Full Text Available We analyse the coupled problem arising during the ironing of textiles, in which heat is transported with mass, whereas mass transport with heat is negligible. It is necessary to define both the physical and the mathematical model. Introducing a two-phase system of mass sorption by fibres, the transport equations are formulated and accompanied by a set of boundary and initial conditions. The optimisation of material thickness during ironing is gradient-oriented. The first-order sensitivity of an arbitrary objective functional is analysed and included in the optimisation procedure. The numerical example is the thickness optimisation of different textile materials in an ironing device.

  6. Optimisation of NMR dynamic models II. A new methodology for the dual optimisation of the model-free parameters and the Brownian rotational diffusion tensor

    International Nuclear Information System (INIS)

    D'Auvergne, Edward J.; Gooley, Paul R.

    2008-01-01

    Finding the dynamics of an entire macromolecule is a complex problem as the model-free parameter values are intricately linked to the Brownian rotational diffusion of the molecule, mathematically through the autocorrelation function of the motion and statistically through model selection. The solution to this problem was formulated using set theory as an element of the universal set U, the union of all model-free spaces (d'Auvergne EJ and Gooley PR (2007) Mol BioSyst 3(7), 483-494). The current procedure commonly used to find the universal solution is to initially estimate the diffusion tensor parameters, to optimise the model-free parameters of numerous models, and then to choose the best model via model selection. The global model is then optimised and the procedure repeated until convergence. In this paper a new methodology is presented which takes a different approach to this diffusion-seeded model-free paradigm. Rather than starting with the diffusion tensor, this iterative protocol begins by optimising the model-free parameters in the absence of any global model parameters, selecting between all the model-free models, and finally optimising the diffusion tensor. The new model-free optimisation protocol will be validated using synthetic data from Schurr JM et al. (1994) J Magn Reson B 105(3), 211-224 and the relaxation data of the bacteriorhodopsin (1-36)BR fragment from Orekhov VY (1999) J Biomol NMR 14(4), 345-356. To demonstrate the importance of this new procedure the NMR relaxation data of the Olfactory Marker Protein (OMP) of Gitti R et al. (2005) Biochem 44(28), 9673-9679 are reanalysed. The result is that the dynamics of certain secondary structural elements are very different from those originally reported.

  7. Value Chain Optimisation of Biogas Production

    DEFF Research Database (Denmark)

    Jensen, Ida Græsted

    economically feasible. In this PhD thesis, the focus is to create models for investigating the profitability of biogas projects by: 1) including the whole value chain in a mathematical model and considering mass and energy changes on the upstream part of the chain; and 2) including profit allocation in a value......, the costs on the biogas plant has been included in the model using economy of scale. For the second point, a mathematical model considering profit allocation was developed applying three allocation mechanisms. This mathematical model can be applied as a second step after the value chain optimisation. After...... in the energy systems model to find the optimal end use of each type of gas and fuel. The main contributions of this thesis are the methods developed on plant level. Both the mathematical model for the value chain and the profit allocation model can be generalised and used in other industries where mass...

  8. Optimising Job-Shop Functions Utilising the Score-Function Method

    DEFF Research Database (Denmark)

    Nielsen, Erland Hejn

    2000-01-01

    During the last 1-2 decades, simulation optimisation of discrete event dynamic systems (DEDS) has made considerable theoretical progress with respect to computational efficiency. The score-function (SF) method and the infinitesimal perturbation analysis (IPA) are two candidates belonging to this...... of a Job-Shop can be handled by the SF method.

  9. HVAC system optimisation-in-building section

    Energy Technology Data Exchange (ETDEWEB)

    Lu, L.; Cai, W.; Xie, L.; Li, S.; Soh, Y.C. [School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore (Singapore)

    2004-07-01

    This paper presents a practical method to optimise the in-building section of centralised Heating, Ventilation and Air-Conditioning (HVAC) systems, which consists of indoor air loops and chilled water loops. First, through component characteristic analysis, mathematical models associated with cooling loads and energy consumption of heat exchangers and energy-consuming devices are established. By considering the variation of the cooling load of each end user, an adaptive neuro-fuzzy inference system (ANFIS) is employed to model the duct and pipe networks and obtain optimal differential pressure (DP) set points based on limited sensor information. A mixed-integer nonlinear constrained optimisation of system energy is formulated and solved by a modified genetic algorithm. The main feature of the paper is a systematic approach to optimising the overall system energy consumption rather than that of individual components. A simulation study for a typical centralised HVAC system is provided to compare the proposed optimisation method with traditional ones. The results show that the proposed method indeed improves the system performance significantly. (author)
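
    A genetic algorithm over a mixed-integer search space of the kind described can be sketched as follows. The objective and encoding are toy stand-ins (a continuous set point and an integer device count with a made-up quadratic "energy" surrogate), not the paper's HVAC model or its modified GA.

```python
# Toy genetic algorithm minimising a mixed-integer energy-style objective.
import random

random.seed(0)

def energy(x):                      # surrogate "system energy" to minimise
    setpoint, n_units = x           # continuous DP set point, integer device count
    return (setpoint - 2.5) ** 2 + 0.3 * (n_units - 3) ** 2

def make_individual():
    return [random.uniform(0, 5), random.randint(1, 6)]

def mutate(x):
    # Gaussian step for the continuous gene, +/-1 step for the integer gene,
    # both clipped to their feasible ranges.
    return [min(5.0, max(0.0, x[0] + random.gauss(0, 0.3))),
            min(6, max(1, x[1] + random.choice([-1, 0, 1])))]

pop = [make_individual() for _ in range(30)]
for _ in range(60):                 # keep the best half, refill by mutation
    pop.sort(key=energy)
    pop = pop[:15] + [mutate(random.choice(pop[:15])) for _ in range(15)]

best = min(pop, key=energy)
print(best, energy(best))
```

    The selection-plus-mutation loop handles the integer variable without any relaxation, which is one reason evolutionary methods are popular for mixed-integer nonlinear problems like the one formulated above.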

  10. Pre-operative optimisation of lung function

    Directory of Open Access Journals (Sweden)

    Naheed Azhar

    2015-01-01

    Full Text Available The anaesthetic management of patients with pre-existing pulmonary disease is a challenging task. It is associated with increased morbidity in the form of post-operative pulmonary complications. Pre-operative optimisation of lung function helps in reducing these complications. Patients are advised to stop smoking for a period of 4–6 weeks. This reduces airway reactivity, improves mucociliary function and decreases carboxy-haemoglobin. The widely used incentive spirometry may be useful only when combined with other respiratory muscle exercises. Volume-based inspiratory devices have the best results. Pharmacotherapy of asthma and chronic obstructive pulmonary disease must be optimised before considering the patient for elective surgery. Beta 2 agonists, inhaled corticosteroids and systemic corticosteroids, are the main drugs used for this and several drugs play an adjunctive role in medical therapy. A graded approach has been suggested to manage these patients for elective surgery with an aim to achieve optimal pulmonary function.

  11. Acoustic Resonator Optimisation for Airborne Particle Manipulation

    Science.gov (United States)

    Devendran, Citsabehsan; Billson, Duncan R.; Hutchins, David A.; Alan, Tuncay; Neild, Adrian

    Advances in micro-electromechanical systems (MEMS) technology and biomedical research necessitate micro-machined manipulators to capture, handle and position delicate micron-sized particles. To this end, a parallel plate acoustic resonator system has been investigated for the purposes of manipulation and entrapment of micron-sized particles in air. Numerical and finite element modelling was performed to optimise the design of the layered acoustic resonator. Obtaining an optimised resonator design requires careful consideration of the effects of layer thickness and material properties. Furthermore, the effect of acoustic attenuation, which is frequency dependent, is also considered within this study, leading to an optimum operational frequency range. Finally, experimental results demonstrated good levitation and capture of particles of various properties and sizes, down to 14.8 μm.

  12. Corrosion performance of optimised and advanced fuel rod cladding in PWRs at high burnups

    International Nuclear Information System (INIS)

    Jourdain, P.; Hallstadius, L.; Pati, S.R.; Smith, G.P.; Garde, A.M.

    1997-01-01

    The corrosion behaviour both in-pile and out-of-pile for a number of cladding alloys developed by ABB to meet the current and future needs for fuel rod cladding with improved corrosion resistance is presented. The cladding materials include: 1) Zircaloy-4 (OPTIN) with optimised composition and processing and Zircaloy-2 optimised for Pressurised Water Reactors (PWR), (Zircaloy-2P), and 2) several alternative zirconium-based alloys with compositions outside the composition range for Zircaloys. The data presented originate from fuel rods irradiated in six PWRs to burnups up to about 66 MWd/kgU and from tests conducted in a 360 °C water autoclave. Also included are in-pile fuel rod growth measurements on some of the alloys. (UK)

  13. Validation, optimisation, and application data in support of the development of a targeted selected ion monitoring assay for degraded cardiac troponin T

    Directory of Open Access Journals (Sweden)

    Alexander S. Streng

    2016-06-01

    Full Text Available Cardiac troponin T (cTnT) fragmentation in human serum was investigated using a newly developed targeted selected ion monitoring assay, as described in the accompanying article: “Development of a targeted selected ion monitoring assay for the elucidation of protease induced structural changes in cardiac troponin T” [1]. This article presents data describing aspects of the validation and optimisation of this assay. The data consists of several figures, an Excel file containing the results of a sequence identity search, and a description of the raw mass spectrometry (MS) data files, deposited in the ProteomeXchange repository (PRIDE identifier PXD003187: http://www.ebi.ac.uk/pride/archive/projects/PXD003187).

  14. Expert systems and optimisation in process control

    Energy Technology Data Exchange (ETDEWEB)

    Mamdani, A.; Efstathiou, J. (eds.)

    1986-01-01

    This report brings together recent developments both in expert systems and in optimisation, and deals with current applications in industry. Part One is concerned with Artificial Intelligence in planning and scheduling and with rule-based control implementation. The tasks of control maintenance, rescheduling and planning are each discussed in relation to new theoretical developments, techniques available, and sample applications. Part Two covers model based control techniques in which the control decisions are used in a computer model of the process. Fault diagnosis, maintenance and trouble-shooting are just some of the activities covered. Part Three contains case studies of projects currently in progress, giving details of the software available and the likely future trends. One of these, on qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations is indexed separately.

  15. Expert systems and optimisation in process control

    International Nuclear Information System (INIS)

    Mamdani, A.; Efstathiou, J.

    1986-01-01

    This report brings together recent developments both in expert systems and in optimisation, and deals with current applications in industry. Part One is concerned with Artificial Intelligence in planning and scheduling and with rule-based control implementation. The tasks of control maintenance, rescheduling and planning are each discussed in relation to new theoretical developments, techniques available, and sample applications. Part Two covers model based control techniques in which the control decisions are used in a computer model of the process. Fault diagnosis, maintenance and trouble-shooting are just some of the activities covered. Part Three contains case studies of projects currently in progress, giving details of the software available and the likely future trends. One of these, on qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations is indexed separately. (author)

  16. Induction of fungal laccase production under solid state bioprocessing of new agroindustrial waste and its application on dye decolorization.

    Science.gov (United States)

    Akpinar, Merve; Ozturk Urek, Raziye

    2017-06-01

    Lignocellulosic wastes are generally produced in huge amounts worldwide. Among these, peach waste from the fruit juice industry was utilized as the substrate for laccase production by Pleurotus eryngii under solid state bioprocessing (SSB). Its chemical composition was determined and the bioprocess was carried out under stationary conditions at 28 °C. The effects of different compounds (copper, iron, Tween 80, ammonium nitrate and manganese) and their concentrations on laccase production were investigated in detail. The optimum laccase production (43,761.33 ± 3845 U L-1) was achieved on day 20 by employing 5.0 g peach waste with 70 µM Cu2+, 18 µM Fe2+, 0.025% (v/v) Tween 80, 4.0 g L-1 ammonium nitrate and 750 µM Mn2+ as the inducers. Dye decolorization was also investigated to determine the degrading capability of the laccase produced from the peach culture under the above-mentioned conditions. Within the scope of the study, methyl orange, tartrazine, reactive red 2 and reactive black dyes were treated with this enzyme. The highest decolorization, 43 ± 2.8% after 5 min of treatment, was achieved with methyl orange when compared to the other dyes. To date, this is the first report on the induction of laccase production by P. eryngii under SSB using peach waste as the substrate.

  17. Self-optimising control of sewer systems

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Montero-Castro, Ignacio; Mollerup, Ane Loft

    2013-01-01

    … The definition of optimal performance was carried out through a two-stage optimisation (stochastic and deterministic) to take into account both the overflow during the current rain event and the expected overflow, given the probability of a future rain event. The methodology is successfully applied...

  18. Recent perspectives on optimisation of radiological protection

    International Nuclear Information System (INIS)

    Robb, J.D.; Croft, J.R.

    1992-01-01

    The ALARA principle as a requirement in radiological protection has evolved from its theoretical roots. Based on several years work, this paper provides a backdrop to practical approaches to ALARA for the 1990s. The key step, developing ALARA thinking so that it becomes an integral part of radiological protection programmes, is discussed using examples from the UK and France, as is the role of tools to help standardise judgements for decision-making. In its latest recommendations, ICRP have suggested that the optimisation of protection should be constrained by restrictions on the doses to individuals. This paper also considers the function of such restrictions for occupational, public and medical exposure, and in the design process. (author)

  19. Optimising investment in asset management using the multivariate asset management assessment topography

    Directory of Open Access Journals (Sweden)

    Bam, Wouter Gideon

    2014-08-01

    Full Text Available The multivariate asset management assessment topography (MAMAT was developed to quantify, and represent graphically, development, adoption, and performance of a business’ asset management (AM systems, as described by standards such as PAS 55. The MAMAT provides a way to visualise clearly the strengths and weaknesses of a business’ asset management system. Building on MAMAT, a model describing the relationship between the commitment of resources and the corresponding improvement in the MAMAT assessment outcome is proposed. The goal is to develop an optimisation model that will maximise financial benefits by improving the MAMAT assessment score achieved by a business, while minimising the investment required to attain this improvement. This is achieved by determining the optimal allocation of resources to the different subcategories of the MAMAT assessment framework. The multi-objective cross-entropy method (MOO CEM is used to find the Pareto set of solutions for this problem. In order to showcase the intended industry application and use of the optimisation model, a hypothetical case study is executed and described in this paper. From this application, it was found that the MOO CEM finds useful solutions that can support the implementation of standards such as PAS 55 by prioritising and assigning resources to implementation activities.
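
    The core of any multi-objective search such as the MOO CEM is identifying the non-dominated (Pareto) set of candidate solutions. The sketch below shows that filtering step for a two-objective cost-versus-score trade-off; the candidate values are hypothetical, not the MAMAT case study's data.

```python
def pareto_front(points):
    """Return the non-dominated subset for two minimised objectives.

    Each point is (investment_cost, negated_score), so both entries
    are minimised; a point is dropped if another point is at least as
    good in both objectives.
    """
    front = []
    for p in points:
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points):
            front.append(p)
    return front

# Hypothetical (cost, -assessment score) candidates
candidates = [(10, -0.4), (20, -0.7), (15, -0.5), (20, -0.6), (30, -0.9)]
print(sorted(pareto_front(candidates)))
```

    The dominated candidate (20, -0.6) is removed because (20, -0.7) matches its cost with a better score; the remaining points form the trade-off curve a decision-maker would choose from.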

  20. Optimised low-dose multidetector CT protocol for children with cranial deformity

    Energy Technology Data Exchange (ETDEWEB)

    Vazquez, Jose Luis [Complejo Hospitalario Universitario de Vigo, Department of Radiology, Vigo, Pontevedra (Spain); Pombar, Miguel Angel [Complejo Hospitalario Universitario de Santiago, Department of Radiophysics, Santiago de Compostela, La Coruna (Spain); Pumar, Jose Manuel [Complejo Hospitalario Universitario de Santiago, Department of Radiology, Santiago de Compostela, La Coruna (Spain); Campo, Victor Miguel del [Complejo Hospitalario Universitario de Vigo, Department of Public Health, Vigo, Pontevedra (Spain)

    2013-08-15

    To present an optimised low-dose multidetector computed tomography (MDCT) protocol for the study of children with cranial deformity. Ninety-one consecutive MDCT studies were performed in 80 children. Studies were performed with either our standard head CT protocol (group 1, n = 20) or a low-dose cranial deformity protocol (groups 2 and 3). Group 2 (n = 38) used the initial protocol and group 3 (n = 33) the final, more optimised one. All studies were performed on the same 64-MDCT equipment. The cranial deformity protocol was gradually optimised by decreasing kVp, limiting the mA range, using automatic exposure control (AEC) and increasing the noise index (NI). Image quality was assessed. Dose indicators such as CT dose index volume (CTDIvol), dose-length product (DLP) and effective dose (E) were used. The optimised low-dose protocol reached the following values: 80 kVp, mA range 50-150 and NI = 23. We achieved a maximum dose reduction of 10-22 times in the 1- to 12-month-old cranium relative to the 2004 European guidelines for MDCT. The result is a low-dose MDCT protocol that may be used as the first diagnostic imaging option in clinically selected patients with skull abnormalities. (orig.)
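
    The dose indicators named above are related by a standard conversion: effective dose is estimated as E ≈ k × DLP, where k is an age- and body-region-dependent coefficient in mSv per mGy·cm. A minimal sketch, with an illustrative coefficient value rather than one taken from the study:

```python
# Effective dose from dose-length product: E ≈ k × DLP.
# k is an age- and region-dependent conversion coefficient
# (mSv per mGy·cm); the value below is illustrative only.
def effective_dose(dlp_mgy_cm, k):
    return k * dlp_mgy_cm

k_head_infant = 0.011   # illustrative coefficient for an infant head scan
print(round(effective_dose(150.0, k_head_infant), 2))  # → 1.65 (mSv)
```

    Lowering kVp and mA reduces CTDIvol, which reduces DLP for a fixed scan length, and hence E, which is why the protocol optimisation above targets those acquisition parameters.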

  1. Optimisation of the PCR-invA primers for the detection of Salmonella ...

    African Journals Online (AJOL)

    A polymerase chain reaction (PCR)-based method for the detection of Salmonella species in water samples was optimised and evaluated for speed, specificity and sensitivity. Optimisation of Mg2+ and primer concentrations and cycling parameters increased the sensitivity and limit of detection of the PCR to 2.6 x 10(4) cfu/ml.

  2. Algorithme intelligent d'optimisation d'un design structurel de grande envergure

    Science.gov (United States)

    Dominique, Stephane

    The implementation of an automated decision support system in the field of design and structural optimisation can give a significant advantage to any industry working on mechanical designs. Indeed, by providing solution ideas to a designer, or by upgrading existing design solutions while the designer is not at work, the system may reduce the project cycle time or allow more time to produce a better design. This thesis presents a new approach to automating a design process based on Case-Based Reasoning (CBR), in combination with a new genetic algorithm named Genetic Algorithm with Territorial core Evolution (GATE). This approach was developed in order to reduce the operating cost of the process. However, as the implementation cost is quite high, the approach is better suited to large-scale design problems, and particularly to design problems that the designer plans to solve for many different specification sets. First, the CBR process uses a databank filled with every known solution to similar design problems. Then, the solutions closest to the current problem in terms of specifications are selected. After this, during the adaptation phase, an artificial neural network (ANN) interpolates amongst known solutions to produce an additional solution to the current problem, using the current specifications as inputs. Each solution produced and selected by the CBR is then used to initialise the population of an island of the genetic algorithm. The algorithm optimises the solution further during the refinement phase. Using progressive refinement, the algorithm starts with only the most important variables for the problem. Then, as the optimisation progresses, the remaining variables are gradually introduced, layer by layer. The genetic algorithm used is a new algorithm specifically created during this thesis to solve optimisation problems in the field of mechanical device structural design. The algorithm is named GATE, and is essentially a real number

  3. Thermodynamic optimisation of a heat exchanger

    NARCIS (Netherlands)

    Cornelissen, Rene; Hirs, Gerard

    1999-01-01

    The objective of this paper is to show that for the optimal design of an energy system, where there is a trade-off between exergy saving during operation and exergy use during construction of the energy system, exergy analysis and life cycle analysis should be combined. An exergy optimisation of a

  4. Allogeneic cell therapy bioprocess economics and optimization: single-use cell expansion technologies.

    Science.gov (United States)

    Simaria, Ana S; Hassan, Sally; Varadaraju, Hemanthram; Rowley, Jon; Warren, Kim; Vanek, Philip; Farid, Suzanne S

    2014-01-01

    For allogeneic cell therapies to reach their therapeutic potential, challenges related to achieving scalable and robust manufacturing processes will need to be addressed. A particular challenge is producing lot-sizes capable of meeting commercial demands of up to 10(9) cells/dose for large patient numbers due to the current limitations of expansion technologies. This article describes the application of a decisional tool to identify the most cost-effective expansion technologies for different scales of production as well as current gaps in the technology capabilities for allogeneic cell therapy manufacture. The tool integrates bioprocess economics with optimization to assess the economic competitiveness of planar and microcarrier-based cell expansion technologies. Visualization methods were used to identify the production scales where planar technologies will cease to be cost-effective and where microcarrier-based bioreactors become the only option. The tool outputs also predict that for the industry to be sustainable for high demand scenarios, significant increases will likely be needed in the performance capabilities of microcarrier-based systems. These data are presented using a technology S-curve as well as windows of operation to identify the combination of cell productivities and scale of single-use bioreactors required to meet future lot sizes. The modeling insights can be used to identify where future R&D investment should be focused to improve the performance of the most promising technologies so that they become a robust and scalable option that enables the cell therapy industry to reach commercially relevant lot sizes. The tool outputs can facilitate decision-making very early on in development and be used to predict, and better manage, the risk of process changes needed as products proceed through the development pathway. © 2013 Wiley Periodicals, Inc.
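
    The kind of crossover analysis described, where planar technologies stop being cost-effective at some lot size, can be sketched as a toy cost-of-goods comparison. All capacities and costs below are hypothetical placeholders, not the tool's data.

```python
import math

def lot_cost(cells_per_lot, cells_per_unit, cost_per_unit, fixed_cost):
    """Cost of one lot given a technology's per-unit capacity and unit cost."""
    units = math.ceil(cells_per_lot / cells_per_unit)
    return fixed_cost + units * cost_per_unit

def cheaper_technology(cells_per_lot):
    # Hypothetical parameters: planar scales out in many small stacks;
    # microcarrier bioreactors carry a higher fixed cost but far larger capacity.
    planar = lot_cost(cells_per_lot, cells_per_unit=5e9,   cost_per_unit=8_000,  fixed_cost=10_000)
    micro  = lot_cost(cells_per_lot, cells_per_unit=200e9, cost_per_unit=60_000, fixed_cost=50_000)
    return "planar" if planar <= micro else "microcarrier"

for lot in (1e10, 1e11, 1e12):
    print(f"{lot:.0e} cells/lot -> {cheaper_technology(lot)}")
```

    With these numbers the decision flips between 10^10 and 10^11 cells per lot, which is the qualitative behaviour the article's technology S-curve captures.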

  5. Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics.

    Science.gov (United States)

    Trianni, Vito; López-Ibáñez, Manuel

    2015-01-01

    The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics.

  6. Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics.

    Directory of Open Access Journals (Sweden)

    Vito Trianni

    Full Text Available The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics.

  7. SLA-based optimisation of virtualised resource for multi-tier web applications in cloud data centres

    Science.gov (United States)

    Bi, Jing; Yuan, Haitao; Tie, Ming; Tan, Wei

    2015-10-01

    Dynamic virtualised resource allocation is the key to quality of service assurance for multi-tier web application services in cloud data centre. In this paper, we develop a self-management architecture of cloud data centres with virtualisation mechanism for multi-tier web application services. Based on this architecture, we establish a flexible hybrid queueing model to determine the amount of virtual machines for each tier of virtualised application service environments. Besides, we propose a non-linear constrained optimisation problem with restrictions defined in service level agreement. Furthermore, we develop a heuristic mixed optimisation algorithm to maximise the profit of cloud infrastructure providers, and to meet performance requirements from different clients as well. Finally, we compare the effectiveness of our dynamic allocation strategy with two other allocation strategies. The simulation results show that the proposed resource allocation method is efficient in improving the overall performance and reducing the resource energy cost.
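
    The sizing decision at the heart of such a model, how many virtual machines each tier needs to meet its service level agreement, can be illustrated with parallel M/M/1 queues as a simple stand-in for the paper's hybrid queueing model. The tier parameters below are hypothetical.

```python
def vms_needed(arrival_rate, service_rate, t_sla, max_vms=100):
    """Smallest number of VMs so that a tier's mean response time,
    modelled as parallel M/M/1 queues with load split evenly,
    stays within its SLA target. Illustrative sizing only."""
    for m in range(1, max_vms + 1):
        per_vm = arrival_rate / m
        # M/M/1 mean response time: 1 / (mu - lambda), valid only when stable
        if per_vm < service_rate and 1.0 / (service_rate - per_vm) <= t_sla:
            return m
    raise ValueError("SLA not attainable within max_vms")

# Hypothetical three-tier service: (arrivals/s, service rate per VM, SLA seconds)
tiers = [(300, 80, 0.05), (300, 120, 0.03), (150, 50, 0.10)]
print([vms_needed(lam, mu, t) for lam, mu, t in tiers])
```

    A full profit-maximising allocation would then weigh these minimum VM counts against resource and energy costs across all hosted applications, which is the constrained optimisation the paper formulates.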

  8. Shape optimisation and performance analysis of flapping wings

    KAUST Repository

    Ghommem, Mehdi

    2012-09-04

    In this paper, shape optimisation of flapping wings in forward flight is considered. This analysis is performed by combining a local gradient-based optimizer with the unsteady vortex lattice method (UVLM). Although the UVLM applies only to incompressible, inviscid flows where the separation lines are known a priori, Persson et al. [1] showed through a detailed comparison between UVLM and higher-fidelity computational fluid dynamics methods for flapping flight that the UVLM schemes produce accurate results for attached flow cases and even remain trend-relevant in the presence of flow separation. As such, they recommended the use of an aerodynamic model based on UVLM to perform preliminary design studies of flapping wing vehicles. Unlike standard computational fluid dynamics schemes, this method requires meshing of the wing surface only and not of the whole flow domain [2]. From the design or optimisation perspective taken in our work, it is fairly common (and sometimes entirely necessary, as a result of the excessive computational cost of the highest fidelity tools such as Navier-Stokes solvers) to rely upon such a moderate level of modelling fidelity to traverse the design space in an economical manner. The objective of the work described in this paper is to identify a set of optimised shapes that maximise the propulsive efficiency, defined as the ratio of the propulsive power over the aerodynamic power, under lift, thrust, and area constraints. The shape of the wings is modelled using B-splines, a technology used in the computer-aided design (CAD) field for decades. This basis can be used to smoothly discretize wing shapes with few degrees of freedom, referred to as control points. The locations of the control points constitute the design variables. The results suggest that changing the shape yields significant improvement in the performance of the flapping wings. The optimisation pushes the design to "bird-like" shapes with substantial increase in the time

  9. Capacity planning for batch and perfusion bioprocesses across multiple biopharmaceutical facilities.

    Science.gov (United States)

    Siganporia, Cyrus C; Ghosh, Soumitra; Daszkowski, Thomas; Papageorgiou, Lazaros G; Farid, Suzanne S

    2014-01-01

    Production planning for biopharmaceutical portfolios becomes more complex when products switch between fed-batch and continuous perfusion culture processes. This article describes the development of a discrete-time mixed integer linear programming (MILP) model to optimize capacity plans for multiple biopharmaceutical products, with either batch or perfusion bioprocesses, across multiple facilities to meet quarterly demands. The model comprised specific features to account for products with fed-batch or perfusion culture processes such as sequence-dependent changeover times, continuous culture constraints, and decoupled upstream and downstream operations that permit independent scheduling of each. Strategic inventory levels were accounted for by applying cost penalties when they were not met. A rolling time horizon methodology was utilized in conjunction with the MILP model and was shown to obtain solutions with greater optimality in less computational time than the full-scale model. The model was applied to an industrial case study to illustrate how the framework aids decisions regarding outsourcing capacity to third party manufacturers or building new facilities. The impact of variations on key parameters such as demand or titres on the optimal production plans and costs was captured. The analysis identified the critical ratio of in-house to contract manufacturing organization (CMO) manufacturing costs that led the optimization results to favor building a future facility over using a CMO. The tool predicted that if titres were higher than expected then the optimal solution would allocate more production to in-house facilities, where manufacturing costs were lower. Utilization graphs indicated when capacity expansion should be considered. © 2014 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
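
    The build-versus-outsource comparison the tool performs can be sketched as a toy cost model: total cost of meeting demand through a contract manufacturing organisation (CMO) versus the capital cost of a new facility plus cheaper in-house production. All figures below are hypothetical, not the case study's data.

```python
# Toy version of the build-vs-outsource decision. The decision flips at a
# critical ratio of CMO to in-house manufacturing cost, as in the case study.
def best_plan(annual_demand_g, years, cmo_cost_per_g, inhouse_cost_per_g, build_cost):
    cmo_total = annual_demand_g * years * cmo_cost_per_g
    build_total = build_cost + annual_demand_g * years * inhouse_cost_per_g
    return ("build", build_total) if build_total < cmo_total else ("cmo", cmo_total)

# High CMO cost ratio: building the facility wins.
print(best_plan(50_000, 10, cmo_cost_per_g=120, inhouse_cost_per_g=70, build_cost=20_000_000))
# Lower CMO cost ratio: outsourcing wins.
print(best_plan(50_000, 10, cmo_cost_per_g=90,  inhouse_cost_per_g=70, build_cost=20_000_000))
```

    The full MILP additionally schedules which product runs in which facility in each quarter, with changeover and inventory constraints, but the critical-ratio behaviour shown here is the headline result of the analysis.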

  10. OPTIMISATION OF A DRIVE SYSTEM AND ITS EPICYCLIC GEAR SET

    OpenAIRE

    Bellegarde , Nicolas; Dessante , Philippe; Vidal , Pierre; Vannier , Jean-Claude

    2007-01-01

    International audience; This paper describes the design of a drive consisting of a DC motor, a speed reducer, a lead screw transformation system, a power converter and its associated DC source. The objective is to reduce the mass of the system. Indeed, the volume and weight optimisation of an electrical drive is an important issue for embedded applications. Here, we present an analytical model of the system in a specific application and afterwards an optimisation of the motor and speed reduce...

  11. Optimisation of gas-cooled reactors with the aid of mathematical computers

    Energy Technology Data Exchange (ETDEWEB)

    Margen, P H

    1959-04-15

    Reactor optimisation is the task of finding the combination of values of the independent variables in a reactor design producing the lowest cost of electricity. In a gas-cooled reactor the number of independent variables is particularly large and the optimisation process is, therefore, laborious. The present note describes a procedure for performing the entire optimisation procedure with the aid of a mathematical computer in a single operation, thus saving time for the design staff. Detailed equations and numerical constants are proposed for the thermal and cost relations involved. The reactor physics equations, on the other hand are merely stated as general functions of the relevant variables. The task of expressing these functions as detailed equations will be covered by separate documents prepared by the reactor physics department.

  12. Optimisation of gas-cooled reactors with the aid of mathematical computers

    International Nuclear Information System (INIS)

    Margen, P.H.

    1959-04-01

    Reactor optimisation is the task of finding the combination of values of the independent variables in a reactor design producing the lowest cost of electricity. In a gas-cooled reactor the number of independent variables is particularly large and the optimisation process is, therefore, laborious. The present note describes a procedure for performing the entire optimisation procedure with the aid of a mathematical computer in a single operation, thus saving time for the design staff. Detailed equations and numerical constants are proposed for the thermal and cost relations involved. The reactor physics equations, on the other hand are merely stated as general functions of the relevant variables. The task of expressing these functions as detailed equations will be covered by separate documents prepared by the reactor physics department

  13. Water distribution systems design optimisation using metaheuristics ...

    African Journals Online (AJOL)

    The topic of multi-objective water distribution systems (WDS) design optimisation using metaheuristics is investigated, comparing numerous modern metaheuristics, including several multi-objective evolutionary algorithms, an estimation of distribution algorithm and a recent hyperheuristic named AMALGAM (an evolutionary ...

  14. Power and delay optimisation in multi-hop wireless networks

    KAUST Repository

    Xia, Li

    2014-02-05

    In this paper, we study the optimisation problem of transmission power and delay in a multi-hop wireless network consisting of multiple nodes. The goal is to determine the optimal policy of transmission rates at various buffer and channel states in order to minimise the power consumption and the queueing delay of the whole network. With the assumptions of interference-free links and independently and identically distributed (i.i.d.) channel states, we formulate this problem using a semi-open Jackson network model for data transmission and a Markov model for channel states transition. We derive a difference equation of the system performance under any two different policies. The necessary and sufficient condition of optimal policy is obtained. We also prove that the system performance is monotonic with respect to (w.r.t.) the transmission rate and the optimal transmission rate can be either maximal or minimal. That is, the ‘bang-bang’ control is an optimal control. This optimality structure greatly reduces the problem complexity. Furthermore, we develop an iterative algorithm to find the optimal solution. Finally, we conduct simulation experiments to demonstrate the effectiveness of our approach. We hope our work can shed some insight on solving this complicated optimisation problem.
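
    The structure of the problem, choosing a transmission rate per buffer state to trade power against queueing delay, can be illustrated on a toy single-queue discrete-time model. This is not the paper's semi-open Jackson network or its iterative algorithm; it simply enumerates all stationary policies on a tiny state space and evaluates each by its long-run average cost. All parameters are illustrative.

```python
import itertools

P_ARRIVAL, B = 0.4, 3          # arrival probability per slot, buffer size
RATES = [0, 1, 2]              # allowed transmission rates (actions)
POWER, HOLD = 1.0, 2.0         # power cost per rate unit, holding cost per packet

def step_distribution(state, rate):
    """Next-state distribution: one arrival attempt, then one service attempt
    with success probability rate/2 (a toy channel model)."""
    out = {}
    for arrived, pa in ((1, P_ARRIVAL), (0, 1 - P_ARRIVAL)):
        q = min(state + arrived, B)
        ps = rate / 2.0 if q > 0 else 0.0
        served = max(q - 1, 0)
        out[served] = out.get(served, 0.0) + pa * ps
        out[q] = out.get(q, 0.0) + pa * (1 - ps)
    return out

def average_cost(policy):
    """Long-run average power + holding cost of a stationary policy,
    found by iterating the state distribution to its stationary point.
    (Toy simplification: power is charged for the configured rate even
    when the buffer is empty.)"""
    dist = [1.0] + [0.0] * B
    for _ in range(500):
        new = [0.0] * (B + 1)
        for s, mass in enumerate(dist):
            for ns, p in step_distribution(s, policy[s]).items():
                new[ns] += mass * p
        dist = new
    return sum(dist[s] * (POWER * policy[s] + HOLD * s) for s in range(B + 1))

best = min(itertools.product(RATES, repeat=B + 1), key=average_cost)
print("optimal policy per buffer state:", best)
```

    Exhaustive search is only viable because the toy state space has 3^4 policies; the paper's contribution is precisely that the bang-bang optimality structure removes the need for such enumeration on realistic networks.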

  15. Optimisation of a machine learning algorithm in human locomotion using principal component and discriminant function analyses.

    Science.gov (United States)

    Bisele, Maria; Bencsik, Martin; Lewis, Martin G C; Barnett, Cleveland T

    2017-01-01

    Assessment methods in human locomotion often involve the description of normalised graphical profiles and/or the extraction of discrete variables. Whilst useful, these approaches may not represent the full complexity of gait data. Multivariate statistical methods, such as Principal Component Analysis (PCA) and Discriminant Function Analysis (DFA), have been adopted since they have the potential to overcome these data handling issues. The aim of the current study was to develop and optimise a specific machine learning algorithm for processing human locomotion data. Twenty participants ran at a self-selected speed across a 15 m runway in barefoot and shod conditions. Ground reaction forces (BW) and kinematics were measured at 1000 Hz and 100 Hz, respectively, from which joint angles (°), joint moments (N.m.kg-1) and joint powers (W.kg-1) for the hip, knee and ankle joints were calculated in all three anatomical planes. Using PCA and DFA, power spectra of the kinematic and kinetic variables were used as a training database for the development of a machine learning algorithm. All possible combinations of 10 out of 20 participants were explored to find the iteration of individuals that would optimise the machine learning algorithm. The results showed that the algorithm was able to successfully predict whether a participant ran shod or barefoot in 93.5% of cases. To the authors' knowledge, this is the first study to optimise the development of a machine learning algorithm.
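
    The PCA-then-discriminant pipeline can be sketched with numpy alone: project the feature matrix onto its leading principal components, then classify in that reduced space. The data here are synthetic stand-ins for gait power spectra, and a nearest-centroid rule is used as a simple stand-in for DFA, so the numbers are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for gait power spectra: 20 "barefoot" and 20 "shod"
# trials with 6 features each, separated by a class offset.
X = np.vstack([rng.normal(0.0, 1.0, (20, 6)), rng.normal(5.0, 1.0, (20, 6))])
y = np.array([0] * 20 + [1] * 20)

# PCA via SVD on mean-centred data; keep the first two components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Nearest-centroid rule in PC space (a simple stand-in for DFA).
centroids = np.array([scores[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((scores[:, None, :] - centroids) ** 2).sum(-1), axis=1)
print("training accuracy:", (pred == y).mean())
```

    The study's 93.5% figure comes from a proper train/test split over participant subsets; classifying the training data, as done here for brevity, overstates accuracy and is shown only to illustrate the pipeline's shape.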

  16. Optimisation of hybrid high-modulus/high-strength carbon fiber reinforced plastic composite drive

    OpenAIRE

    Montagnier, Olivier; Hochard, Christian

    2011-01-01

    International audience; This study deals with the optimisation of hybrid composite drive shafts operating at subcritical or supercritical speeds, using a genetic algorithm. A formulation for the flexural vibrations of a composite drive shaft mounted on viscoelastic supports including shear effects is developed. In particular, an analytic stability criterion is developed to ensure the integrity of the system in the supercritical regime. Then it is shown that the torsional strength can be compu...

  17. Optimisation of window settings for traditional and noise-optimised virtual monoenergetic imaging in dual-energy computed tomography pulmonary angiography

    International Nuclear Information System (INIS)

    D'Angelo, Tommaso; ''G. Martino'' University Hospital, Messina; Bucher, Andreas M.; Lenga, Lukas; Arendt, Christophe T.; Peterke, Julia L.; Martin, Simon S.; Leithner, Doris; Vogl, Thomas J.; Wichmann, Julian L.; Caruso, Damiano; University Hospital, Latina; Mazziotti, Silvio; Blandino, Alfredo; Ascenti, Giorgio; University Hospital, Messina; Othman, Ahmed E.

    2018-01-01

    To define optimal window settings for displaying virtual monoenergetic images (VMI) of dual-energy CT pulmonary angiography (DE-CTPA). Forty-five patients who underwent clinically-indicated third-generation dual-source DE-CTPA were retrospectively evaluated. Standard linearly-blended (M0.6), 70-keV traditional VMI (M70), and 40-keV noise-optimised VMI (M40+) reconstructions were analysed. For M70 and M40+ datasets, the subjectively best window setting (width and level, B-W/L) was independently determined by two observers and subsequently related to pulmonary artery attenuation to calculate separate optimised values (O-W/L) using linear regression. Subjective evaluation of image quality (IQ) between W/L settings was assessed by two additional readers. Repeated-measures analysis of variance was performed to compare W/L settings and IQ indices between M0.6, M70, and M40+. B-W/L and O-W/L for M70 were 460/140 and 450/140, and were 1100/380 and 1070/380 for M40+, respectively, differing from standard DE-CTPA W/L settings (450/100). The highest subjective scores were observed for M40+ regarding vascular contrast, embolism demarcation, and overall IQ (all p<0.001). Application of O-W/L settings is beneficial to optimise subjective IQ of VMI reconstructions of DE-CTPA. A width slightly less than two times the pulmonary trunk attenuation and a level approximately equal to the overall pulmonary vessel attenuation are recommended. (orig.)
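
    The regression step described above, relating the reader-preferred window width to pulmonary artery attenuation, can be sketched as follows. All numbers here are synthetic placeholders, not the study's data; the slope of roughly 2x attenuation mirrors the paper's recommendation only qualitatively.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-patient data: pulmonary trunk attenuation (HU) and the
# window width a reader judged best for the 40-keV images (synthetic).
attenuation = rng.uniform(400, 800, 45)                 # 45 patients
best_width = 1.9 * attenuation + rng.normal(0, 30, 45)  # readers track ~2x HU

# Least-squares fit: width = a * attenuation + b
A = np.vstack([attenuation, np.ones_like(attenuation)]).T
(a, b), *_ = np.linalg.lstsq(A, best_width, rcond=None)

# Optimised width for the cohort-average attenuation, in the spirit of O-W/L
mean_hu = attenuation.mean()
optimised_width = a * mean_hu + b
print(f"slope ~= {a:.2f}, optimised width at mean HU ~= {optimised_width:.0f}")
```

    The same fit would be repeated for the window level against mean vessel attenuation.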

  18. Optimising the Target and Capture Sections of the Neutrino Factory

    CERN Document Server

    Hansen, Ole Martin; Stapnes, Steinar

    The Neutrino Factory is designed to produce an intense high energy neutrino beam from stored muons. The majority of the muons are obtained from the decay of pions, produced by a proton beam impinging on a free-flowing mercury-jet target and captured by a high magnetic field. It is important to capture a large fraction of the produced pions to maximise the intensity of the neutrino beam. Various optimisation studies have been performed with the aim of maximising the muon influx to the accelerator and thus the neutrino beam intensity. The optimisation studies were performed with the use of Monte Carlo simulation tools. The production of secondary particles, by interactions between the incoming proton beam and the mercury target, was optimised by varying the proton beam impact position and impact angles on the target. The proton beam and target interaction region was studied and shown to be off the central axis of the capture section in the baseline configuration. The off-centred interaction region resulted in ...

  19. Optimisation of surgical care for rectal cancer

    NARCIS (Netherlands)

    Borstlap, W.A.A.

    2017-01-01

    Optimisation of surgical care means weighing the risk of treatment related morbidity against the patients’ potential benefits of a surgical intervention. The first part of this thesis focusses on the anaemic patient undergoing colorectal surgery. Hypothesizing that a more profound haemoglobin

  20. Thermodynamic optimisation and analysis of four Kalina cycle layouts for high temperature applications

    International Nuclear Information System (INIS)

    Modi, Anish; Haglind, Fredrik

    2015-01-01

    The Kalina cycle has seen increased interest in the last few years as an efficient alternative to the conventional steam Rankine cycle. However, the available literature gives little information on the algorithms to solve or optimise this inherently complex cycle. This paper presents a detailed approach to solve and optimise a Kalina cycle for high temperature (a turbine inlet temperature of 500 °C) and high pressure (over 100 bar) applications using a computationally efficient solution algorithm. A central receiver solar thermal power plant with direct steam generation was considered as a case study. Four different layouts for the Kalina cycle based on the number and/or placement of the recuperators in the cycle were optimised and compared based on performance parameters such as the cycle efficiency and the cooling water requirement. The cycles were modelled in steady state and optimised with the maximisation of the cycle efficiency as the objective function. It is observed that the different cycle layouts result in different regions for the optimal value of the turbine inlet ammonia mass fraction. Out of the four compared layouts, the most complex layout KC1234 gives the highest efficiency. The cooling water requirement is closely related to the cycle efficiency, i.e., the better the efficiency, the lower is the cooling water requirement. - Highlights: • Detailed methodology for solving and optimising Kalina cycle for high temperature applications. • A central receiver solar thermal power plant with direct steam generation considered as a case study. • Four Kalina cycle layouts based on the placement of recuperators optimised and compared
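
    The optimisation described above searches for the turbine-inlet ammonia mass fraction that maximises cycle efficiency. The paper's own solver is considerably more elaborate; the sketch below only illustrates the single-variable idea with a golden-section search over an invented concave efficiency curve (the function `efficiency` and its optimum near 0.7 are purely hypothetical).

```python
import math

def efficiency(x):
    # Toy stand-in for cycle efficiency vs turbine-inlet ammonia mass
    # fraction: concave with a maximum near x = 0.7 (purely illustrative).
    return 0.30 - 2.0 * (x - 0.7) ** 2

def golden_section_max(f, lo, hi, tol=1e-6):
    """Maximise a unimodal function on [lo, hi] by golden-section search."""
    invphi = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) > f(d):          # maximum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                    # maximum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

x_opt = golden_section_max(efficiency, 0.3, 0.95)
print(f"optimal ammonia mass fraction ~= {x_opt:.3f}")
```

    In a real cycle model the objective would be evaluated by solving the full thermodynamic state equations at each candidate mass fraction, but the search structure is the same.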

  1. Optimisation of a parallel ocean general circulation model

    Science.gov (United States)

    Beare, M. I.; Stevens, D. P.

    1997-10-01

    This paper presents the development of a general-purpose parallel ocean circulation model, for use on a wide range of computer platforms, from traditional scalar machines to workstation clusters and massively parallel processors. Parallelism is provided, as a modular option, via high-level message-passing routines, thus hiding the technical intricacies from the user. An initial implementation highlights that the parallel efficiency of the model is adversely affected by a number of factors, for which optimisations are discussed and implemented. The resulting ocean code is portable and, in particular, allows science to be achieved on local workstations that could otherwise only be undertaken on state-of-the-art supercomputers.

  2. Aspects of approximate optimisation: overcoming the curse of dimensionality and design of experiments

    NARCIS (Netherlands)

    Trichon, Sophie; Bonte, M.H.A.; Ponthot, Jean-Philippe; van den Boogaard, Antonius H.

    2007-01-01

    Coupling optimisation algorithms to Finite Element Methods (FEM) is a very promising way to achieve optimal metal forming processes. However, many optimisation algorithms exist and it is not clear which of these algorithms to use. This paper investigates the sensitivity of a Sequential Approximate

  3. Bio-processing of solid wastes and secondary resources for metal extraction - A review.

    Science.gov (United States)

    Lee, Jae-Chun; Pandey, Banshi Dhar

    2012-01-01

    Metal-containing wastes/byproducts of various industries, used consumer goods, and municipal waste are potential pollutants if not treated properly. They may also be important secondary resources if processed in an eco-friendly manner for a secured supply of the contained metals/materials. Bio-extraction of metals from such resources with microbes such as bacteria, fungi and archaea is being increasingly explored to meet the twin objectives of resource recycling and pollution mitigation. This review focuses on the bio-processing of solid wastes/byproducts of metallurgical and manufacturing industries, chemical/petrochemical plants, electroplating and tanning units, besides sewage sludge and fly ash of municipal incinerators, electronic wastes (e-wastes/PCBs), used batteries, etc. An assessment has been made to quantify the wastes generated and their compositions, the microbes used, metal leaching efficiencies, etc. Processing of certain effluents and wastewaters containing metals is also included in brief. Future directions of research are highlighted. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. In Vitro Study on the Fluid from Banana Stem Bioprocess as Direct-Fed Microbial

    Science.gov (United States)

    Mutaqin, B. K.; Tanuwiria, U. H.; Hernawan, E.

    2018-02-01

    The purpose of this research was to study, in vitro, the liquid produced by the bioprocess of banana stem as a direct-fed microbial (DFM) in order to enhance local sheep productivity. The use of DFM was studied in two in vitro feeds, with fermentability and digestibility as the observed responses. The method was experimental, using a two-factor factorial design. The first factor was DFM level (0, 0.2, 0.4 and 0.6%); the second was feed type (complete feed and Pennisetum purpureum only), with threefold repetition per treatment. The research showed that fermentability and digestibility values were influenced by DFM in the in vitro complete feed. The results were analysed using MANOVA, with Duncan's test for further comparisons. In conclusion, DFM in the complete feed improved fermentability and digestibility values, and 0.6% DFM showed the highest values.

  5. Multi-objective optimisation of wastewater treatment plant control to reduce greenhouse gas emissions.

    Science.gov (United States)

    Sweetapple, Christine; Fu, Guangtao; Butler, David

    2014-05-15

    This study investigates the potential of control strategy optimisation for the reduction of operational greenhouse gas emissions from wastewater treatment in a cost-effective manner, and demonstrates that significant improvements can be realised. A multi-objective evolutionary algorithm, NSGA-II, is used to derive sets of Pareto optimal operational and control parameter values for an activated sludge wastewater treatment plant, with objectives including minimisation of greenhouse gas emissions, operational costs and effluent pollutant concentrations, subject to legislative compliance. Different problem formulations are explored, to identify the most effective approach to emissions reduction, and the sets of optimal solutions enable identification of trade-offs between conflicting objectives. It is found that multi-objective optimisation can facilitate a significant reduction in greenhouse gas emissions without the need for plant redesign or modification of the control strategy layout, but there are trade-offs to consider: most importantly, if operational costs are not to be increased, reduction of greenhouse gas emissions is likely to incur an increase in effluent ammonia and total nitrogen concentrations. Design of control strategies for a high effluent quality and low costs alone is likely to result in an inadvertent increase in greenhouse gas emissions, so it is of key importance that effects on emissions are considered in control strategy development and optimisation. Copyright © 2014 Elsevier Ltd. All rights reserved.
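
    The core operation behind the NSGA-II approach mentioned above is non-dominated sorting: extracting the set of solutions that no other solution beats on every objective. A minimal sketch for two minimised objectives follows; the (emissions, cost) pairs are hypothetical, and a full NSGA-II additionally uses crowding distance, selection and variation operators.

```python
def dominates(p, q):
    """True if solution p dominates q (all objectives minimised)."""
    return (all(a <= b for a, b in zip(p, q))
            and any(a < b for a, b in zip(p, q)))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (greenhouse-gas emissions, operational cost) pairs
solutions = [(5.0, 9.0), (3.0, 7.0), (4.0, 4.0),
             (6.0, 2.0), (7.0, 8.0), (3.5, 6.0)]
front = pareto_front(solutions)
print(sorted(front))
```

    The surviving points form the trade-off curve the study uses: moving along the front reduces one objective only at the expense of the other.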

  6. Status of achievements reached in applying optimisation of protection in prevention and mitigation of accidents in nuclear facilities

    International Nuclear Information System (INIS)

    Bengtsson, G.; Hoegberg, L.

    1988-01-01

    Optimisation of protection in a broad sense is basically a political undertaking, where the resources put into protection are balanced against other factors - quantifiable and non-quantifiable - to obtain the best protection that can be achieved under the circumstances. In a narrower sense, optimisation can be evaluated in procedures allowing for a few quantifiable factors, such as cost/effectiveness analysis. These procedures are used as inputs to the broader optimisation. The paper discusses several examples from Sweden concerning evaluations and decisions relating to prevention of accidents and mitigation of their consequences. Comparison is made with typical optimisation criteria proposed for radiation protection work and for cost/effectiveness analysis in the USA, notably NUREG-1150 (draft). The examples show that optimisation procedures in the narrower sense have not been decisive. Individual dose limits seem to be increasingly important as compared to collective dose optimisation, and political, commercial or engineering judgements may lead to decisions far away from those suggested by simple optimisation considerations

  7. Simultaneous production of lipases and biosurfactants by submerged and solid-state bioprocesses.

    Science.gov (United States)

    Colla, Luciane Maria; Rizzardi, Juliana; Pinto, Marta Heidtmann; Reinehr, Christian Oliveira; Bertolin, Telma Elita; Costa, Jorge Alberto Vieira

    2010-11-01

    Lipases and biosurfactants are compounds produced by microorganisms generally involved in the metabolization of oil substrates. However, the relationship between the production of lipases and biosurfactants has not been established yet. Therefore, this study aimed to evaluate the correlation between production of lipases and biosurfactants by submerged (SmgB) and solid-state bioprocess (SSB) using Aspergillus spp., which were isolated from a soil contaminated by diesel oil. SSB had the highest production of lipases, with lipolytic activities of 25.22 U, while SmgB had 4.52 U. The production of biosurfactants was not observed in the SSB. In the SmgB, correlation coefficients of 91% and 87% were obtained between lipolytic activity and oil-in-water and water-in-oil emulsifying activities, respectively. A correlation of 84% was obtained between lipolytic activity and reduction of surface tension in the culture medium. The surface tension decreased from 50 to 28 mN m⁻¹, indicating that biosurfactants were produced in the culture medium. Copyright 2010 Elsevier Ltd. All rights reserved.

  8. Bioprocess iterative batch-to-batch optimization based on hybrid parametric/nonparametric models.

    Science.gov (United States)

    Teixeira, Ana P; Clemente, João J; Cunha, António E; Carrondo, Manuel J T; Oliveira, Rui

    2006-01-01

    This paper presents a novel method for iterative batch-to-batch dynamic optimization of bioprocesses. The relationship between process performance and control inputs is established by means of hybrid grey-box models combining parametric and nonparametric structures. The bioreactor dynamics are defined by material balance equations, whereas the cell population subsystem is represented by an adjustable mixture of nonparametric and parametric models. Thus optimizations are possible without detailed mechanistic knowledge concerning the biological system. A clustering technique is used to supervise the reliability of the nonparametric subsystem during the optimization. Whenever the nonparametric outputs are unreliable, the objective function is penalized. The technique was evaluated with three simulation case studies. The overall results suggest that the convergence to the optimal process performance may be achieved after a small number of batches. The model unreliability risk constraint along with sampling scheduling are crucial to minimize the experimental effort required to attain a given process performance. In general terms, it may be concluded that the proposed method broadens the application of the hybrid parametric/nonparametric modeling technique to "newer" processes with higher potential for optimization.
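
    The hybrid parametric/nonparametric idea above can be illustrated on a single kinetic term: a mechanistic Monod expression supplies the known structure, and an unknown multiplicative correction is identified from batch data. Everything here is a synthetic sketch under assumed constants, not the paper's model; the correction is a simple linear function rather than the paper's nonparametric subsystem.

```python
import numpy as np

rng = np.random.default_rng(2)

# Parametric part: Monod-type kinetic term mu(S) with assumed constants.
mu_max, Ks = 0.5, 1.0
def monod(S):
    return mu_max * S / (Ks + S)

# The "true" process has an extra substrate-dependent effect the mechanistic
# term misses; the hybrid model captures it with a fitted correction factor.
S_obs = np.linspace(0.1, 10.0, 40)
rate_obs = monod(S_obs) * (1.0 + 0.05 * S_obs) + rng.normal(0, 0.002, 40)

# Data-driven correction g(S) = c0 + c1*S, fitted so that
# rate = monod(S) * g(S) matches the observations (linear least squares).
A = np.vstack([monod(S_obs), monod(S_obs) * S_obs]).T
coef, *_ = np.linalg.lstsq(A, rate_obs, rcond=None)
c0, c1 = coef
print(f"fitted correction g(S) = {c0:.3f} + {c1:.3f}*S")
```

    In the batch-to-batch setting, such a correction would be re-estimated after each batch and the feeding profile re-optimised against the updated hybrid model.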

  9. Approaches and challenges to optimising primary care teams’ electronic health record usage

    Directory of Open Access Journals (Sweden)

    Nancy Pandhi

    2014-07-01

    Background: Although the presence of an electronic health record (EHR) alone does not ensure high-quality, efficient care, few studies have focused on the work of those charged with optimising use of existing EHR functionality. Objective: To examine the approaches used and challenges perceived by analysts supporting the optimisation of primary care teams' EHR use at a large U.S. academic health care system. Methods: A qualitative study was conducted. Optimisation analysts and their supervisor were interviewed and data were analysed for themes. Results: Analysts needed to reconcile the tension created by organisational mandates focused on the standardisation of EHR processes with the primary care teams' demand for EHR customisation. They gained an understanding of health information technology (HIT) leadership's and primary care teams' goals through attending meetings, reading meeting minutes and visiting with clinical teams. Within what was organisationally possible, EHR education could then be tailored to fit team needs. Major challenges were related to organisational attempts to standardise EHR use despite varied clinic contexts, personnel readiness and technical issues with the EHR platform. Forcing standardisation upon clinical needs that current EHR functionality could not satisfy was difficult. Conclusions: Dedicated optimisation analysts can add value to health systems through playing a mediating role between HIT leadership and care teams. Our findings imply that EHR optimisation should be performed with an in-depth understanding of the workflow, cognitive and interactional activities in primary care.

  10. Reflections on the juridical roots of the principle of optimisation

    International Nuclear Information System (INIS)

    Lochard, J.; Boehler, M.C.

    1992-01-01

    The disciplines of jurisprudence tend in general towards a rationalisation and stabilisation of social or economic practice and are oriented towards concepts or practices which belong to the field of the determinate. When it comes to the principle of optimising radiological protection, however, the classical juridical technique of administrative law does not exactly answer the problems of implementing this. From the obligations of performance traditionally imposed by the government, a transition to obligation of behaviour by those involved seems to be called for, and this is what makes the optimisation principle difficult to qualify juridically. Instead of a law of command, exemption and control the government must essentially put its trust in the operators of nuclear installations by issuing a standard which sets an objective rather than a standard with the force of a regulation as in the past. Does the future of the juridical sciences in fact lie in the development of an administrative law of the indeterminate which would oblige the government to recognise that, even in the field of the determinate, it is not always government which knows best? While our classical administrative law is a law of command and control, the administrative law of the indeterminate will be that of the law of common effort, framed in collective acts and based on trust, consultation, and obligations of behaviour, all under the control of a judge who intervenes when there is a manifest contradiction between the acts and the promised behaviour. In French law optimisation has remained a general principle unaccompanied by specific provisions for its implementation. The object of our paper is to examine on what juridical foundations it would be possible to apply this principle at practical level without betraying its spirit. (author)

  11. Pre-segmented 2-Step IMRT with subsequent direct machine parameter optimisation – a planning study

    International Nuclear Information System (INIS)

    Bratengeier, Klaus; Meyer, Jürgen; Flentje, Michael

    2008-01-01

    Modern intensity modulated radiotherapy (IMRT) mostly uses iterative optimisation methods. The integration of machine parameters into the optimisation process of step-and-shoot leaf positions has been shown to be successful. For IMRT segmentation algorithms based on the analysis of the geometrical structure of the planning target volumes (PTV) and the organs at risk (OAR), the potential of such procedures has not yet been fully explored. In this work, 2-Step IMRT was combined with subsequent direct machine parameter optimisation (DMPO; RaySearch Laboratories, Sweden) to investigate this potential. In a planning study, DMPO on a commercial planning system was compared with manual primary 2-Step IMRT segment generation followed by DMPO optimisation. 15 clinical cases and the ESTRO Quasimodo phantom were employed. Both the same number of optimisation steps and the same set of objective values were used. The plans were compared with a clinical DMPO reference plan and a traditional IMRT plan based on fluence optimisation and subsequent segmentation. The composite objective value (the weighted sum of quadratic deviations of the objective values and the related points in the dose volume histogram) was used as a measure of plan quality. Additionally, a more extended set of parameters was used for the breast cases to compare the plans. The plans with segments pre-defined with 2-Step IMRT were slightly superior to DMPO alone in the majority of cases. The composite objective value tended to be even lower for a smaller number of segments. The total number of monitor units was slightly higher than for the DMPO plans. Traditional IMRT fluence optimisation with subsequent segmentation could not compete. 2-Step IMRT segmentation is suitable as a starting point for further DMPO optimisation and, in general, results in less complex plans which are equal or superior to plans generated by DMPO alone

  12. A method to derive fixed budget results from expected optimisation times

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Jansen, Thomas; Witt, Carsten

    2013-01-01

    At last year's GECCO a novel perspective for theoretical performance analysis of evolutionary algorithms and other randomised search heuristics was introduced that concentrates on the expected function value after a pre-defined number of steps, called budget. This is significantly different from the common perspective where the expected optimisation time is analysed. While there is a huge body of work and a large collection of tools for the analysis of the expected optimisation time, the new fixed budget perspective introduces new analytical challenges. Here it is shown how results on the expected optimisation time that are strengthened by deviation bounds can be systematically turned into fixed budget results. We demonstrate our approach by considering the (1+1) EA on LeadingOnes and significantly improving previous results. We prove that deviating from the expected time by an additive term of ω(n³...
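
    The algorithm and benchmark named above are simple to state. The following is a minimal, standard implementation of the (1+1) EA on LeadingOnes (standard bit mutation with rate 1/n, offspring accepted if not worse); it is an illustrative sketch, not the paper's analysis code. Its expected optimisation time on LeadingOnes is Θ(n²), so a budget of a few times n² almost always reaches the optimum.

```python
import random

def leading_ones(bits):
    """Fitness: number of consecutive ones from the left."""
    count = 0
    for b in bits:
        if b == 0:
            break
        count += 1
    return count

def one_plus_one_ea(n, budget, seed=0):
    """(1+1) EA: flip each bit independently with prob 1/n, keep the
    offspring if it is not worse. Returns the best fitness reached
    within the given budget of evaluations."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = leading_ones(x)
    for _ in range(budget):
        y = [b ^ 1 if rng.random() < 1 / n else b for b in x]
        fy = leading_ones(y)
        if fy >= fx:
            x, fx = y, fy
        if fx == n:
            break
    return fx

n = 20
best = one_plus_one_ea(n, budget=10 * n * n, seed=1)
print(best)
```

    The fixed-budget perspective asks instead: for a budget well below the expected optimisation time, what fitness value should one expect `one_plus_one_ea` to return?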

  13. Design optimisation of a flywheel hybrid vehicle

    Energy Technology Data Exchange (ETDEWEB)

    Kok, D.B.

    1999-11-04

    during the engine start-up and shutdown periods. Correct throttle valve control ensures that hydrocarbon emissions are not critical for legislative emission limits, but the engine's standard lambda control cannot prevent an increase of nitric oxides. In order to improve tailpipe emissions, the thermo-chemical behaviour of the catalytic converter is investigated and adapted for hybrid vehicle application. In cold-start situations, the fuel consumption and exhaust gas emissions of a mechanical driveline with internal combustion engine increase. A detailed numerical investigation of the thermal behaviour of the hybrid driveline showed that the energy-efficient operation of the engine decreases thermal waste energy that is available to warm up driveline components. Therefore, a redesign of the cooling circuitry and thermal management of the driveline was required to improve system warm-up. A computer model has been developed that combines the functional description of the flywheel hybrid vehicle with the calculation of energy losses. Apart from standardised European drive cycles, velocity profiles that represent more realistic vehicle utilisation are used to assess and optimise the hybrid vehicle's fuel economy, exhaust gas emission and acceleration performance. Subdivision of energy consumption enabled the classification of those systems and components that have a major effect on fuel consumption. Of these, the optimised flywheel system, the hydraulic system, and the transmission consume energy of comparable magnitude in city driving. It is shown that the system's fuel economy is mainly a result of the improved engine operation. Regenerative braking has only limited effect on vehicle fuel consumption. Experiments with an early prototype of the hybrid driveline yielded no gains in fuel consumption when compared to a conventional CVT reference vehicle due to high storage losses in the flywheel system. However, the improved prototype of the flywheel hybrid

  14. Shape optimisation and performance analysis of flapping wings

    KAUST Repository

    Ghommem, Mehdi; Collier, Nathan; Niemi, Antti; Calo, Victor M.

    2012-01-01

    optimised shapes produce efficient flapping flights, the wake pattern and its vorticity strength are examined. The work described in this paper should facilitate better guidance for shape design of engineered flying systems.

  15. Numerical optimisation of friction stir welding: review of future challenges

    DEFF Research Database (Denmark)

    Tutum, Cem Celal; Hattel, Jesper Henri

    2011-01-01

    During the last decade, the combination of increasingly more advanced numerical simulation software with high computational power has resulted in models for friction stir welding (FSW), which have improved the understanding of the determining physical phenomena behind the process substantially. This has made optimisation of certain process parameters possible and has in turn led to better performing friction stir welded products, thus contributing to a general increase in the popularity of the process and its applications. However, most of these optimisation studies do not go well beyond manual...

  16. Optimising Comprehensibility in Interlingual Translation

    DEFF Research Database (Denmark)

    Nisbeth Jensen, Matilde

    2015-01-01

    The increasing demand for citizen engagement in areas traditionally belonging exclusively to experts, such as health, law and technology, has given rise to the necessity of making expert knowledge available to the general public through genres such as instruction manuals for consumer goods, patien… the functional text type of Patient Information Leaflet. Finally, the usefulness of applying the principles of Plain Language and intralingual translation for optimising comprehensibility in interlingual translation is discussed.

  17. Day-ahead economic optimisation of energy storage

    NARCIS (Netherlands)

    Lampropoulos, I.; Garoufalis, P.; Bosch, van den P.P.J.; Groot, de R.J.W.; Kling, W.L.

    2014-01-01

    This article addresses the day-ahead economic optimisation of energy storage systems within the setting of electricity spot markets. The case study is about a lithium-ion battery system integrated in a low voltage distribution grid with residential customers and photovoltaic generation in the

  18. Optimisation of the formulation of a bubble bath by a chemometric approach: market segmentation and optimisation.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gennaro, Maria Carla; Bertetto, Mariella

    2003-03-01

    The optimisation of the formulation of a commercial bubble bath was performed by chemometric analysis of Panel Test results. A first Panel Test was performed to choose the best essence among four proposed to the consumers; the best essence chosen was used in the revised commercial bubble bath. Afterwards, the effect of changing the amount of four components of the bubble bath (the primary surfactant, the essence, the hydratant and the colouring agent) was studied by a fractional factorial design. The segmentation of the bubble bath market was performed by a second Panel Test, in which the consumers were requested to evaluate the samples coming from the experimental design. The results were then treated by Principal Component Analysis. The market had two segments: people preferring a product with a rich formulation and people preferring a poor product. The final target, i.e. the optimisation of the formulation for each segment, was achieved by calculating regression models relating the subjective evaluations given by the Panel to the compositions of the samples. The regression models allowed the identification of the best formulations for the two segments of the market.
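
    The design-and-regression step above can be sketched as follows. For simplicity this uses a full two-level factorial in the four formulation factors rather than the study's fractional design, and the panel scores and effect sizes are entirely synthetic.

```python
import itertools
import numpy as np

# Two-level design for four formulation factors (coded -1/+1):
# primary surfactant, essence, hydratant, colouring agent.
design = np.array(list(itertools.product([-1, 1], repeat=4)), dtype=float)

# Hypothetical mean panel scores for each formulation (synthetic): here the
# surfactant and essence matter, the other two factors barely do.
rng = np.random.default_rng(3)
true_effects = np.array([1.5, 0.8, 0.1, -0.05])
scores = 5.0 + design @ true_effects + rng.normal(0, 0.1, len(design))

# Main-effects model: score = b0 + sum(bi * xi), fitted by least squares.
X = np.hstack([np.ones((len(design), 1)), design])
coef, *_ = np.linalg.lstsq(X, scores, rcond=None)
print("intercept:", round(coef[0], 2))
print("main effects:", np.round(coef[1:], 2))
```

    With a separate model per market segment, the best formulation for a segment is read off by pushing each influential factor towards the level with the favourable sign.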

  19. Systematic development and implementation of interventions to OPtimise Health Literacy and Access (Ophelia

    Directory of Open Access Journals (Sweden)

    Alison Beauchamp

    2017-03-01

    Background: The need for healthcare strengthening to enhance equity is critical, requiring systematic approaches that focus on those experiencing lesser access and outcomes. This project developed and tested the Ophelia (OPtimising HEalth LIteracy and Access) approach for co-design of interventions to improve health literacy and equity of access. Eight principles guided this development: Outcomes focused; Equity driven; Needs diagnosis; Co-design; Driven by local wisdom; Sustainable; Responsive; and Systematically applied. We report the application of the Ophelia process, where proof-of-concept was defined as successful application of the principles. Methods: Nine sites were briefed on the aims of the project around health literacy, co-design and quality improvement. The sites were rural/metropolitan, small/large hospitals, community health centres or municipalities. Each site identified its own priorities for improvement; collected health literacy data using the Health Literacy Questionnaire (HLQ) within the identified priority groups; engaged staff in co-design workshops to generate ideas for improvement; developed program-logic models; and implemented their projects using Plan-Do-Study-Act (PDSA) cycles. Evaluation included assessment of impacts on organisations, practitioners and service users, and whether the principles were applied. Results: Sites undertook co-design workshops involving discussion of service user needs informed by HLQ (n = 813) and interview data. Sites generated between 21 and 78 intervention ideas and then planned their selected interventions through program-logic models. Sites successfully implemented interventions and refined them progressively with PDSA cycles. Interventions generally involved one of four pathways: development of clinician skills and resources for health literacy, engagement of community volunteers to disseminate health promotion messages, direct impact on consumers' health literacy, and

  20. Optimisation of FAME production from waste cooking oil for biodiesel use

    OpenAIRE

    Bautista, Luis Fernando; Vicente, Gemma; Rodríguez, Rosalía; Pacheco, María

    2009-01-01

    This study consists of the development and optimisation of the potassium hydroxide-catalysed synthesis of fatty acid methyl esters (FAME) from waste cooking oil. A factorial design of experiments and a central composite design have been used. The variables chosen were fatty acid concentration in the waste cooking oil, temperature and initial catalyst concentration by weight of waste cooking oil, while the responses were FAME purity and yield. The initial catalyst concentration is the most imp...

  1. Future aircraft cabins and design thinking: optimisation vs. win-win scenarios

    Directory of Open Access Journals (Sweden)

    A. Hall

    2013-06-01

    With projections indicating an increase in mobility over the next few decades and annual flight departures expected to rise to over 16 billion by 2050, there is a demand for the aviation industry and associated stakeholders to consider new forms of aircraft and technology. Customer requirements are recognised as a key driver in business. The airline is the principal customer for the aircraft manufacturer. The passenger is, in turn, the airline's principal customer, but passengers are just one of several stakeholders that include aviation authorities, airport operators, air-traffic control and security agencies. The passenger experience is a key differentiator used by airlines to attract and retain custom; the fuselage defines the cabin envelope for the in-flight passenger experience, and cabin design therefore receives significant attention for new aircraft, service updates and refurbishments. Decision making in design is crucial to arriving at viable and worthwhile cabin formats. Too little innovation will result in an aircraft manufacturer, and the airlines using its products, falling behind competitors. Too much may result in an over-extension with, for example, the use of immature technologies that do not have the necessary reliability for a safety-critical industry or sufficient value to justify the development effort. The multiple requirements associated with cabin design can be viewed as an area for optimisation, accepting trade-offs between the various parameters. Good design, however, is often defined as developing a concept that resolves the contradictions and takes the solution towards a win-win scenario. Indeed, our understanding and practice of design allows for behaviours that enhance design thinking through divergence and convergence, the use of abductive reasoning, experimentation and systems thinking. This paper explores and defines the challenges of designing the aircraft cabin of the future that will deliver on the multiple

  2. Development, optimisation and characterisation of a radiation hard mixed-signal readout chip for LHCb

    Energy Technology Data Exchange (ETDEWEB)

    Loechner, S.

    2006-07-26

    The Beetle chip is a radiation hard, 128-channel pipelined readout chip for silicon strip detectors. The front-end consists of a charge-sensitive preamplifier followed by a CR-RC pulse shaper. The analogue pipeline memory is implemented as a switched capacitor array with a maximum latency of 4 µs. The 128 analogue channels are multiplexed and transmitted off chip in 900 ns via four current output drivers. Besides the pipelined readout path, the Beetle provides a fast discrimination of the front-end pulse. Within this doctoral thesis, parts of the radiation hard Beetle readout chip for the LHCb experiment have been developed. The overall chip performance characteristics, such as noise, power consumption and input charge rates, have been optimised, and failures have been eliminated, so that the Beetle fulfils the requirements of the experiment. Furthermore, the characterisation of the chip was a major part of this thesis. Besides the detailed measurement of the chip performance, several irradiation tests and a Single Event Upset (SEU) test were performed. A long-time measurement with a silicon strip detector was also part of this work, as was the development and test of a first mass-production test setup. The Beetle chip showed no functional failure and only slight degradation in analogue performance under irradiation of up to 130 Mrad total dose. The Beetle chip fulfils all requirements of the vertex detector (VELO), the trigger tracker (TT) and the inner tracker (IT) and is ready for the start of LHCb at the end of 2007. (orig.)

  3. Development, optimisation and characterisation of a radiation hard mixed-signal readout chip for LHCb

    International Nuclear Information System (INIS)

    Loechner, S.

    2006-01-01

    The Beetle chip is a radiation hard, 128-channel pipelined readout chip for silicon strip detectors. The front-end consists of a charge-sensitive preamplifier followed by a CR-RC pulse shaper. The analogue pipeline memory is implemented as a switched capacitor array with a maximum latency of 4 µs. The 128 analogue channels are multiplexed and transmitted off chip in 900 ns via four current output drivers. Besides the pipelined readout path, the Beetle provides a fast discrimination of the front-end pulse. Within this doctoral thesis, parts of the radiation hard Beetle readout chip for the LHCb experiment have been developed. The overall chip performance characteristics, such as noise, power consumption and input charge rates, have been optimised, and failures have been eliminated, so that the Beetle fulfils the requirements of the experiment. Furthermore, the characterisation of the chip was a major part of this thesis. Besides the detailed measurement of the chip performance, several irradiation tests and a Single Event Upset (SEU) test were performed. A long-time measurement with a silicon strip detector was also part of this work, as was the development and test of a first mass-production test setup. The Beetle chip showed no functional failure and only slight degradation in analogue performance under irradiation of up to 130 Mrad total dose. The Beetle chip fulfils all requirements of the vertex detector (VELO), the trigger tracker (TT) and the inner tracker (IT) and is ready for the start of LHCb at the end of 2007. (orig.)

  4. 3D printed fluidics with embedded analytic functionality for automated reaction optimisation

    OpenAIRE

    Andrew J. Capel; Andrew Wright; Matthew J. Harding; George W. Weaver; Yuqi Li; Russell A. Harris; Steve Edmondson; Ruth D. Goodridge; Steven D. R. Christie

    2017-01-01

    Additive manufacturing or ‘3D printing’ is being developed as a novel manufacturing process for the production of bespoke micro and milli-scale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multi-functional fluidic devices with embedded reaction moni...

  5. Influence of multiple bioprocess parameters on production of lipase from Pseudomonas sp. BWS-5

    Directory of Open Access Journals (Sweden)

    Balwinder Singh Sooch

    2013-10-01

    Full Text Available The aim of the present work was to study the influence of multiple bioprocess parameters on the maximum production of lipase from Pseudomonas sp. BWS-5. The culture reached the stationary phase of growth after 36 h of incubation, when the maximum lipase production was obtained at flask level. The different media components, such as carbon sources, nitrogen sources and trace elements, and process parameters, such as the pH of the medium, temperature and time of incubation, and agitation/stationary conditions, were optimized at flask level and at bioreactor level. The maximum enzyme production of 298 IU/mL was obtained with a simple medium at pH 6.5 containing glucose (1%, w/v), peptone (3%, w/v) and KCl (0.05%, w/v) after 30 h of incubation at 37°C under agitation (200 rpm) with 0.75 vvm of air supply.

  6. The optimization of treatment and management of schizophrenia in Europe (OPTiMiSE) trial

    DEFF Research Database (Denmark)

    Leucht, Stefan; Winter-van Rossum, Inge; Heres, Stephan

    2015-01-01

    Commission sponsored "Optimization of Treatment and Management of Schizophrenia in Europe" (OPTiMiSE) trial which aims to provide a treatment algorithm for patients with a first episode of schizophrenia. METHODS: We searched Pubmed (October 29, 2014) for randomized controlled trials (RCTs) that examined...... switching the drug in nonresponders to another antipsychotic. We described important methodological choices of the OPTiMiSE trial. RESULTS: We found 10 RCTs on switching antipsychotic drugs. No trial was conclusive and none was concerned with first-episode schizophrenia. In OPTiMiSE, 500 first episode...

  7. Convex optimisation approach to constrained fuel optimal control of spacecraft in close relative motion

    Science.gov (United States)

    Massioni, Paolo; Massari, Mauro

    2018-05-01

    This paper describes an interesting and powerful approach to the constrained fuel-optimal control of spacecraft in close relative motion. The proposed approach is well suited to problems with linear dynamic equations, and therefore fits the case of spacecraft flying in close relative motion perfectly. If the solution of the optimisation is approximated as a polynomial with respect to the time variable, then the problem can be approached with a technique developed in the control engineering community, known as "Sum Of Squares" (SOS), and the constraints can be reduced to bounds on the polynomials. Such a technique allows polynomial bounding problems to be rewritten in the form of convex optimisation problems, at the cost of a certain amount of conservatism. The principles of the technique are explained and some applications related to spacecraft flying in close relative motion are shown.

  8. Optimisation of the level-1 calorimeter trigger at ATLAS for Run II

    Energy Technology Data Exchange (ETDEWEB)

    Suchek, Stanislav [Kirchhoff-Institute for Physics, Im Neuenheimer Feld 227, 69120 Heidelberg (Germany); Collaboration: ATLAS-Collaboration

    2015-07-01

    The Level-1 Calorimeter Trigger (L1Calo) is a central part of the ATLAS Level-1 Trigger system, designed to identify jet, electron, photon, and hadronic tau candidates, and to measure their transverse energies, as well as the total transverse energy and missing transverse energy. The optimisation of the jet energy resolution is an important part of the L1Calo upgrade for Run II. A Look-Up Table (LUT) is used to translate the electronic signal from each trigger tower into its transverse energy. By optimising the LUT calibration we can achieve better jet energy resolution and better performance of the jet transverse energy triggers, which are vital for many physics analyses. In addition, the improved energy calibration leads to significant improvements in the missing transverse energy resolution. A new Multi-Chip Module (MCM), part of the L1Calo upgrade, provides two separate LUTs for jets and for electrons/photons/taus, allowing the jet transverse energy and missing transverse energy to be optimised separately from the electromagnetic objects. The optimisation is validated using jet transverse energy and missing transverse energy trigger turn-on curves and rates.
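
    The LUT idea above can be sketched in a few lines of code. The following is an illustrative toy (the pedestal, noise-cut and gain values are invented, not the actual L1Calo calibration): each 8-bit tower ADC count is mapped to a transverse-energy code, and two tables with different noise cuts mimic the separate jet and electron/photon/tau LUTs of the new MCM.

```python
def build_lut(pedestal, noise_cut, gain, saturation=255):
    """Toy look-up table: map an 8-bit tower ADC count to transverse energy
    by subtracting the pedestal, zero-suppressing below the noise cut and
    applying a gain, then clamping to the saturation code."""
    lut = []
    for adc in range(256):
        signal = adc - pedestal
        et = 0 if signal < noise_cut else signal * gain
        lut.append(min(max(int(round(et)), 0), saturation))
    return lut

# Two tables with different (hypothetical) noise cuts, mimicking the
# separate jet and e/gamma/tau LUTs.
jet_lut = build_lut(pedestal=32, noise_cut=4, gain=1.0)
em_lut = build_lut(pedestal=32, noise_cut=6, gain=1.0)
```

    Tightening the noise cut for the electromagnetic table while leaving the jet table looser is one way such a two-LUT scheme could decouple the jet and missing-energy calibration from the electromagnetic objects.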

  9. Balanced the Trade-offs problem of ANFIS Using Particle Swarm Optimisation

    Directory of Open Access Journals (Sweden)

    Dian Palupi Rini

    2013-11-01

    Full Text Available Improving the approximation accuracy and interpretability of fuzzy systems is an important issue both in fuzzy systems theory and in its applications. Optimising both issues simultaneously is known to be a trade-off problem, but doing so improves the performance of the system and avoids overtraining of the data. Particle swarm optimisation (PSO) is an evolutionary algorithm that is a good candidate for finding multiple optimal solutions and searching the global space. This paper introduces an integration of PSO and ANFIS to optimise the latter's learning, in particular for tuning membership function parameters and finding the optimal rules for better classification. The proposed method has been tested on four standard datasets from the UCI machine learning repository: Iris Flower, Haberman's Survival, Balloon and Thyroid. The results show better classification using the proposed PSO-ANFIS, with the time complexity reduced accordingly.
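
    A minimal global-best PSO can be written in plain Python. This sketch is illustrative only: it minimises a sphere function as a stand-in for the ANFIS parameter-tuning loss, and the inertia and acceleration constants are common textbook values, not those of the paper.

```python
import random

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Global-best particle swarm optimisation of f over R^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimise a sphere function as a stand-in for the ANFIS training loss.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```

    In a PSO-ANFIS hybrid, each particle would instead encode the membership-function parameters, and f would be the classification error on the training set.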

  10. Portfolio optimisation for hydropower producers that balances riverine ecosystem protection and producer needs

    Science.gov (United States)

    Yin, X. A.; Yang, Z. F.; Liu, C. L.

    2014-04-01

    In deregulated electricity markets, hydropower portfolio design has become an essential task for producers. The previous research on hydropower portfolio optimisation focused mainly on the maximisation of profits but did not take into account riverine ecosystem protection. Although profit maximisation is the major objective for producers in deregulated markets, protection of riverine ecosystems must be incorporated into the process of hydropower portfolio optimisation, especially against a background of increasing attention to environmental protection and stronger opposition to hydropower generation. This research seeks mainly to remind hydropower producers of the requirement of river protection when they design portfolios and help shift portfolio optimisation from economically oriented to ecologically friendly. We establish a framework to determine the optimal portfolio for a hydropower reservoir, accounting for both economic benefits and ecological needs. In this framework, the degree of natural flow regime alteration is adopted as a constraint on hydropower generation to protect riverine ecosystems, and the maximisation of mean annual revenue is set as the optimisation objective. The electricity volumes assigned in different electricity submarkets are optimised by the noisy genetic algorithm. The proposed framework is applied to China's Wangkuai Reservoir to test its effectiveness. The results show that the new framework could help to design eco-friendly portfolios that can ensure a planned profit and reduce alteration of the natural flow regime.
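
    The portfolio idea above can be illustrated with a deliberately tiny genetic algorithm. Everything here is invented for illustration: three submarket prices, a fixed total volume, and a crude penalty (zero revenue if any single submarket takes more than 60% of the volume) standing in for the flow-regime-alteration constraint; it is not the noisy GA or the Wangkuai Reservoir data.

```python
import random

PRICES = [42.0, 55.0, 48.0]   # hypothetical prices per unit in three submarkets
TOTAL = 100.0                 # total volume to allocate

def revenue(alloc):
    """Revenue with a toy ecological penalty: infeasible if any submarket
    takes more than 60% of the total volume."""
    if any(a > 0.6 * TOTAL for a in alloc):
        return 0.0
    return sum(p * a for p, a in zip(PRICES, alloc))

def random_alloc(rng):
    cuts = sorted(rng.uniform(0, TOTAL) for _ in range(2))
    return [cuts[0], cuts[1] - cuts[0], TOTAL - cuts[1]]

def ga(pop_size=40, gens=60, seed=2):
    rng = random.Random(seed)
    pop = [random_alloc(rng) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=revenue, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            i = rng.randrange(3)                          # mutate one gene
            child[i] = max(0.0, child[i] + rng.gauss(0, 5))
            s = sum(child)
            child = [TOTAL * c / s for c in child]        # repair to total volume
            children.append(child)
        pop = survivors + children
    return max(pop, key=revenue)

best = ga()
```

    With these numbers the GA pushes volume towards the most lucrative submarket only up to the 60% cap, which is the qualitative behaviour of a profit objective constrained by ecological limits.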

  11. Optimised cantilever biosensor with piezoresistive read-out

    DEFF Research Database (Denmark)

    Rasmussen, Peter; Thaysen, J.; Hansen, Ole

    2003-01-01

    We present a cantilever-based biochemical sensor with piezoresistive read-out which has been optimised for measuring surface stress. The resistors and the electrical wiring on the chip are encapsulated in low-pressure chemical vapor deposition (LPCVD) silicon nitride, so that the chip is well sui...

  12. Optimisation of the energy efficiency of bread-baking ovens using a combined experimental and computational approach

    International Nuclear Information System (INIS)

    Khatir, Zinedine; Paton, Joe; Thompson, Harvey; Kapur, Nik; Toropov, Vassili

    2013-01-01

    Highlights: ► A scientific framework for optimising oven operating conditions is presented. ► Experiments measuring local convective heat transfer coefficient are undertaken. ► An energy efficiency model is developed with experimentally calibrated CFD analysis. ► Designing ovens with optimum heat transfer coefficients reduces energy use. ► Results demonstrate a strong case to design and manufacture energy optimised ovens. - Abstract: Changing legislation and rising energy costs are bringing the need for efficient baking processes into much sharper focus. High-speed air impingement bread-baking ovens are complex systems using air flow to transfer heat to the product. In this paper, computational fluid dynamics (CFD) is combined with experimental analysis to develop a rigorous scientific framework for the rapid generation of forced convection oven designs. A design parameterisation of a three-dimensional generic oven model is carried out for a wide range of oven sizes and flow conditions to optimise desirable features such as temperature uniformity throughout the oven, energy efficiency and manufacturability. Coupled with the computational model, a series of experiments measuring the local convective heat transfer coefficient (h c ) are undertaken. The facility used for the heat transfer experiments is representative of a scaled-down production oven where the air temperature and velocity as well as important physical constraints such as nozzle dimensions and nozzle-to-surface distance can be varied. An efficient energy model is developed using a CFD analysis calibrated using experimentally determined inputs. Results from a range of oven designs are presented together with ensuing energy usage and savings

  13. The Draft Genome Sequence of Clostridium sp. Strain NJ4, a Bacterium Capable of Producing Butanol from Inulin Through Consolidated Bioprocessing.

    Science.gov (United States)

    Jiang, Yujia; Lu, Jiasheng; Chen, Tianpeng; Yan, Wei; Dong, Weiliang; Zhou, Jie; Zhang, Wenming; Ma, Jiangfeng; Jiang, Min; Xin, Fengxue

    2018-05-23

    A novel butanogenic Clostridium sp. NJ4 was successfully isolated and characterized, which could directly produce a relatively high titer of butanol from inulin through consolidated bioprocessing (CBP). The assembled draft genome of strain NJ4 is 4.09 Mb, containing 3891 encoded protein sequences with a G+C content of 30.73%. Among these annotated genes, a levanase, a hypothetical inulinase, and two bifunctional alcohol/aldehyde dehydrogenases (AdhE) were found to play key roles in achieving ABE production from inulin through CBP.
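
    A G+C figure like the 30.73% reported above is simply the fraction of G and C bases in the assembled sequence. A minimal sketch (the sequence below is a toy string, not strain NJ4 data):

```python
def gc_content(seq):
    """Percentage of G and C bases in a DNA sequence (case-insensitive)."""
    seq = seq.upper()
    return 100.0 * (seq.count("G") + seq.count("C")) / len(seq)

# Toy sequences for illustration only.
low_gc = gc_content("ATGCAATTTA")   # AT-rich, like many Clostridia
half_gc = gc_content("atgc")
```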

  14. Capacity Planning for Batch and Perfusion Bioprocesses Across Multiple Biopharmaceutical Facilities

    Science.gov (United States)

    Siganporia, Cyrus C; Ghosh, Soumitra; Daszkowski, Thomas; Papageorgiou, Lazaros G; Farid, Suzanne S

    2014-01-01

    Production planning for biopharmaceutical portfolios becomes more complex when products switch between fed-batch and continuous perfusion culture processes. This article describes the development of a discrete-time mixed integer linear programming (MILP) model to optimize capacity plans for multiple biopharmaceutical products, with either batch or perfusion bioprocesses, across multiple facilities to meet quarterly demands. The model comprised specific features to account for products with fed-batch or perfusion culture processes such as sequence-dependent changeover times, continuous culture constraints, and decoupled upstream and downstream operations that permit independent scheduling of each. Strategic inventory levels were accounted for by applying cost penalties when they were not met. A rolling time horizon methodology was utilized in conjunction with the MILP model and was shown to obtain solutions with greater optimality in less computational time than the full-scale model. The model was applied to an industrial case study to illustrate how the framework aids decisions regarding outsourcing capacity to third party manufacturers or building new facilities. The impact of variations on key parameters such as demand or titres on the optimal production plans and costs was captured. The analysis identified the critical ratio of in-house to contract manufacturing organization (CMO) manufacturing costs that led the optimization results to favor building a future facility over using a CMO. The tool predicted that if titres were higher than expected then the optimal solution would allocate more production to in-house facilities, where manufacturing costs were lower. Utilization graphs indicated when capacity expansion should be considered. © 2013 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 30:594–606, 2014 PMID:24376262
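
    The MILP above needs a dedicated solver, but the core trade-off (cheap in-house capacity versus a pricier CMO, subject to facility limits) can be shown with brute-force enumeration on a toy instance. All numbers below are invented: two products, two in-house facilities and one CMO, with per-batch costs and quarterly capacities.

```python
from itertools import product as cartesian

# Hypothetical per-batch costs, quarterly demand (batches) and capacities.
COST = {"mab1": {"A": 1.0, "B": 1.2, "CMO": 1.8},
        "mab2": {"A": 1.1, "B": 0.9, "CMO": 1.6}}
DEMAND = {"mab1": 3, "mab2": 2}
CAPACITY = {"A": 3, "B": 2, "CMO": 5}

def best_plan():
    """Exhaustive stand-in for the MILP: assign each required batch to a
    facility, reject plans that exceed capacity, keep the cheapest."""
    batches = [p for p, d in DEMAND.items() for _ in range(d)]
    best, best_cost = None, float("inf")
    for assign in cartesian(CAPACITY, repeat=len(batches)):
        load = {f: 0 for f in CAPACITY}
        for f in assign:
            load[f] += 1
        if any(load[f] > CAPACITY[f] for f in CAPACITY):
            continue
        cost = sum(COST[p][f] for p, f in zip(batches, assign))
        if cost < best_cost:
            best, best_cost = list(zip(batches, assign)), cost
    return best, best_cost

plan, cost = best_plan()
```

    In this toy instance the in-house sites absorb all demand; raising demand beyond their combined capacity would force batches onto the CMO, which is the regime where the critical in-house/CMO cost ratio in the article starts to matter.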

  15. Optimised operation of an off-grid hybrid wind-diesel-battery system using genetic algorithm

    International Nuclear Information System (INIS)

    Gan, Leong Kit; Shek, Jonathan K.H.; Mueller, Markus A.

    2016-01-01

    Highlights: • Diesel generator’s operation is optimised in a hybrid wind-diesel-battery system. • Optimisation is performed using wind speed and load demand forecasts. • The objective is to maximise wind energy utilisation with limited battery storage. • Physical modelling approach (Simscape) is used to verify mathematical model. • Sensitivity analyses are performed with synthesised wind and load forecast errors. - Abstract: In an off-grid hybrid wind-diesel-battery system, the diesel generator is often not utilised efficiently, thereby compromising its lifetime. In particular, the general rule of thumb of running the diesel generator at more than 40% of its rated capacity is often unmet, owing to the variation in the power demand and wind speed that the diesel generator must cover. In addition, the frequent start-stop of the diesel generator leads to additional mechanical wear and fuel wastage. This research paper proposes a novel control algorithm which optimises the operation of a diesel generator using a genetic algorithm. With a day-ahead forecast of the local renewable energy resource and load demand, it is possible to optimise the operation of the diesel generator subject to other pre-defined constraints, so that the utilisation of renewable energy sources to supply electricity can be maximised. Optimisation studies of hybrid systems are usually conducted with simple analytical models, coupled with a selected optimisation algorithm to seek the optimised solution. The obtained solution is not verified against a more realistic system model, for instance one built with a physical modelling approach, which often raises the question of whether such optimised operation is applicable in reality. To take a step further, model-based design using Simulink is employed in this research to perform a comparison through a physical modelling approach. The Simulink model has the capability to incorporate the electrical

  16. Optimisation of a parallel ocean general circulation model

    Directory of Open Access Journals (Sweden)

    M. I. Beare

    1997-10-01

    Full Text Available This paper presents the development of a general-purpose parallel ocean circulation model, for use on a wide range of computer platforms, from traditional scalar machines to workstation clusters and massively parallel processors. Parallelism is provided, as a modular option, via high-level message-passing routines, thus hiding the technical intricacies from the user. An initial implementation highlights that the parallel efficiency of the model is adversely affected by a number of factors, for which optimisations are discussed and implemented. The resulting ocean code is portable and, in particular, allows science to be achieved on local workstations that could otherwise only be undertaken on state-of-the-art supercomputers.

  17. Optimisation of a parallel ocean general circulation model

    Directory of Open Access Journals (Sweden)

    M. I. Beare

    Full Text Available This paper presents the development of a general-purpose parallel ocean circulation model, for use on a wide range of computer platforms, from traditional scalar machines to workstation clusters and massively parallel processors. Parallelism is provided, as a modular option, via high-level message-passing routines, thus hiding the technical intricacies from the user. An initial implementation highlights that the parallel efficiency of the model is adversely affected by a number of factors, for which optimisations are discussed and implemented. The resulting ocean code is portable and, in particular, allows science to be achieved on local workstations that could otherwise only be undertaken on state-of-the-art supercomputers.

  18. Optimising radiation outcomes, scheduling patient waiting lists for maximum population tumour control

    International Nuclear Information System (INIS)

    Ebert, M.A.; Jennings, L.; Kearvell, R.; Bydder, S.

    2011-01-01

    Full text: Delays in the commencement of radiotherapy, possibly due to resource constraints, are known to impact on control-related outcomes. We sought an objective solution for patient prioritisation based on tumour control probability (TCP). With a utilitarian objective of maximising TCP in a population of M patients, with patient i waiting a time Ti between diagnosis and treatment and a mean wait time of TMean, the optimisation problem is as shown. A linear-quadratic/Poissonian model for cell survival/TCP was considered, including cell doubling during the wait time. Solutions for several distributions of patient population characteristics were examined, together with the expected change in TCP for the population and for individuals. An analytical solution to the optimisation problem was found which gives the optimal wait time for each patient as a function of the distribution of radiobiological characteristics in the population. This solution does not accommodate a non-negativity constraint on an individual's optimised waiting time, so a waiting-list simulation was developed to enforce one. Optimal wait time distributions were calculated for situations where patients are allocated to distinct diagnostic groups (sharing radiobiological parameters) and for a (log-normal) distribution of doubling times in the population. To meet the utilitarian objective, the optimal solutions require patients with rapid cell doubling times to be accelerated up the waiting list at the expense of those with slowly proliferating tumours. The net population benefit, however, is comparable to or greater than the expected benefit from beam intensity modulation or dose escalation.
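
    The linear-quadratic/Poissonian model with repopulation during the wait can be sketched directly. The parameter values below (clonogen number, alpha, beta, doubling time) are generic illustrative radiobiology numbers, not those fitted in the abstract: cell kill follows the LQ survival curve, the clonogen count doubles exponentially while the patient waits, and TCP is the Poisson probability of zero surviving clonogens.

```python
import math

def tcp(dose, n_fractions, wait_days,
        n0=1e9, alpha=0.3, beta=0.03, t_double=30.0):
    """Poisson TCP with linear-quadratic cell kill and exponential
    repopulation during the waiting time (illustrative parameters)."""
    d = dose / n_fractions
    # LQ survival after the full fractionated course: exp(-(alpha*D + beta*D*d))
    sf = math.exp(-(alpha * dose + beta * dose * d))
    # Clonogens double while the patient waits for treatment to start.
    n = n0 * 2.0 ** (wait_days / t_double)
    return math.exp(-n * sf)

# Longer waits lower the control probability.
p0 = tcp(70.0, 35, wait_days=0)
p60 = tcp(70.0, 35, wait_days=60)
```

    The prioritisation result in the abstract follows from this shape: for a short doubling time t_double, TCP falls off quickly with wait_days, so such patients gain the most from being moved up the list.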

  19. Optimisation Study on the Production of Anaerobic Digestate ...

    African Journals Online (AJOL)

    DR. AMIN

    optimise the production of ADC from organic fractions of domestic wastes and the effects of ADC amendments on soil .... (22%), cooked meat (9%), lettuce (11%), carrots (3%), potato (44%) ... seed was obtained from a mesophilic anaerobic.

  20. Automation of route identification and optimisation based on data-mining and chemical intuition.

    Science.gov (United States)

    Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G

    2017-09-21

    Data-mining of Reaxys and network analysis of the combined literature and in-house reactions set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of the continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and generation of environmental and other performance indicators, such as cost indicators. However, the identified further challenge is to automate model generation to evolve optimal multi-step chemical routes and optimal process configurations.
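
    Once the literature reactions are assembled into a network of compounds connected by reactions, route identification reduces to path search. The sketch below uses breadth-first search on a toy network with hypothetical intermediates (the actual Reaxys-mined limonene-to-paracetamol chemistry is not reproduced here); BFS returns a route with the fewest reaction steps.

```python
from collections import deque

# Toy reaction network: keys are compounds, values are products reachable
# in one literature reaction. The intermediates are invented placeholders.
NETWORK = {
    "limonene": ["intermediate_A", "intermediate_B"],
    "intermediate_A": ["intermediate_C"],
    "intermediate_B": ["intermediate_C", "paracetamol"],
    "intermediate_C": ["paracetamol"],
}

def shortest_route(network, start, target):
    """Breadth-first search: returns a route with the fewest reaction steps."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in network.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

route = shortest_route(NETWORK, "limonene", "paracetamol")
```

    In practice each edge would carry yield, cost and environmental data, turning the problem into weighted shortest-path or multi-objective search rather than plain step counting.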

  1. Geometrical exploration of a flux-optimised sodium receiver through multi-objective optimisation

    Science.gov (United States)

    Asselineau, Charles-Alexis; Corsi, Clothilde; Coventry, Joe; Pye, John

    2017-06-01

    A stochastic multi-objective optimisation method is used to determine receiver geometries with maximum second law efficiency, minimal average temperature and minimal surface area. The method is able to identify a set of Pareto optimal candidates that show advantageous geometrical features, mainly in being able to maximise the intercepted flux within the geometrical boundaries set. Receivers with first law thermal efficiencies ranging from 87% to 91% are also evaluated using the second law of thermodynamics and found to have similar efficiencies of over 60%, highlighting the influence that the geometry can play in the maximisation of the work output of receivers by influencing the distribution of the flux from the concentrator.
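
    The core of any such multi-objective search is the Pareto-dominance test used to keep the non-dominated candidates. A minimal sketch, with invented objective triples standing in for (1 - second-law efficiency, average temperature, surface area), all to be minimised:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep the candidates not dominated by any other candidate."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical receiver candidates: (1 - efficiency, avg temperature, area).
candidates = [(0.38, 900.0, 1.2),
              (0.40, 850.0, 1.0),
              (0.39, 950.0, 0.9),
              (0.41, 960.0, 1.3)]   # dominated by the first candidate
front = pareto_front(candidates)
```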

  2. Particle Swarm Optimisation with Spatial Particle Extension

    DEFF Research Database (Denmark)

    Krink, Thiemo; Vesterstrøm, Jakob Svaneborg; Riget, Jacques

    2002-01-01

    In this paper, we introduce spatial extension to particles in the PSO model in order to overcome premature convergence in iterative optimisation. The standard PSO and the new model (SEPSO) are compared w.r.t. performance on well-studied benchmark problems. We show that the SEPSO indeed managed...

  3. Bioprocess design guided by in situ substrate supply and product removal: process intensification for synthesis of (S)-1-(2-chlorophenyl)ethanol.

    Science.gov (United States)

    Schmölzer, Katharina; Mädje, Katharina; Nidetzky, Bernd; Kratzer, Regina

    2012-03-01

    We report herein on bioprocess development guided by the hydrophobicities of substrate and product. Bioreductions of o-chloroacetophenone are severely limited by instability of the catalyst in the presence of aromatic substrate and (S)-1-(2-chlorophenyl)ethanol. In situ substrate supply and product removal was used to protect the utilized Escherichia coli whole cell catalyst based on Candida tenuis xylose reductase during the reaction. Further engineering at the levels of the catalyst and the reaction media was matched to low substrate concentrations in the aqueous phase. Productivities obtained in aqueous batch reductions were 21-fold improved by addition of 20% (v/v) hexane, NAD(+), expression engineering, cell permeabilization and pH optimization. Reduction of 300 mM substrate was accomplished in 97% yield and use of the co-solvent hexane in subsequent extraction steps led to 88% recovery. Product loss due to high catalyst loading was minimized by using the same extractant in bioreduction and product isolation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Contributions of depth filter components to protein adsorption in bioprocessing.

    Science.gov (United States)

    Khanal, Ohnmar; Singh, Nripen; Traylor, Steven J; Xu, Xuankuo; Ghose, Sanchayita; Li, Zheng J; Lenhoff, Abraham M

    2018-04-16

    Depth filtration is widely used in downstream bioprocessing to remove particulate contaminants via depth straining and is therefore applied to harvest clarification and other processing steps. However, depth filtration also removes proteins via adsorption, which can contribute variously to impurity clearance and to reduction in product yield. The adsorption may occur on the different components of the depth filter, that is, the filter aid, binder, and cellulose filter. We measured adsorption of several model proteins and therapeutic proteins onto filter aids, cellulose, and commercial depth filters at pH 5-8 and a range of ionic strengths, and examined the role of each filter component in the adsorption of proteins with different net charges using confocal microscopy. Our findings show that a complete depth filter's maximum adsorptive capacity for proteins can be estimated from its protein monolayer coverage values, which are of order mg/m^2, depending on the protein size. Furthermore, the extent of adsorption of different proteins appears to depend on the nature of the resin binder and its extent of coating over the depth filter surface, particularly in masking the cation-exchanger-like capacity of the siliceous filter aids. In addition to guiding improved depth filter selection, the findings can inspire a more intentional selection of components and design of depth filter construction for particular impurity removal targets. © 2018 Wiley Periodicals, Inc.

  5. Protection against natural radiation: Optimisation and decision exercises

    International Nuclear Information System (INIS)

    O'Riordan, M.C.

    1984-02-01

    Six easy exercises are presented in which cost-benefit analysis is used to optimise protection against natural radiation or to decide whether protection is appropriate. The exercises are illustrative only and do not commit the Board. (author)

  6. Optimising Signalised Intersection Using Wireless Vehicle Detectors

    DEFF Research Database (Denmark)

    Adjin, Daniel Michael Okwabi; Torkudzor, Moses; Asare, Jack

    Traffic congestion on roads wastes travel time. In this paper, we developed a vehicular traffic model to optimise a signalised intersection in Accra using wireless vehicle detectors. The traffic volume gathered was extrapolated to cover 2011 and 2016 and was analysed to obtain the peak hour traffic...... volume causing congestion. The intersection was modelled and simulated in Synchro7 as an actuated signalised model using results from the analysed data. The model for the morning peak periods gave optimal cycle lengths of 100s and 150s with corresponding intersection delays of 48.9s and 90.6s in 2011 and 2016...... respectively, while that for the evening was 55s, giving delays of 14.2s and 16.3s respectively. It is shown that the model will improve traffic flow at the intersection....
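
    Optimal cycle lengths of the kind quoted above are classically estimated with Webster's formula, which tools such as Synchro build upon. A minimal sketch with hypothetical inputs (a four-phase intersection with 16 s total lost time; the flow ratios are invented, not the Accra data):

```python
def webster_cycle(lost_time_s, critical_flow_ratios):
    """Webster's optimal cycle length C0 = (1.5*L + 5) / (1 - Y), where L is
    the total lost time per cycle (s) and Y the sum of critical flow ratios."""
    y = sum(critical_flow_ratios)
    if y >= 1.0:
        raise ValueError("intersection is oversaturated (Y >= 1)")
    return (1.5 * lost_time_s + 5.0) / (1.0 - y)

# Hypothetical four-phase intersection: L = 16 s, Y = 0.68.
c0 = webster_cycle(16.0, [0.25, 0.18, 0.15, 0.10])
```

    As demand grows and Y approaches 1, the formula drives the cycle length up sharply, matching the jump from 100s to 150s between the 2011 and 2016 morning-peak scenarios.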

  7. Modulation aware cluster size optimisation in wireless sensor networks

    Science.gov (United States)

    Sriram Naik, M.; Kumar, Vinay

    2017-07-01

    Wireless sensor networks (WSNs) play a great role because of their numerous advantages to mankind. The main challenge with WSNs is energy efficiency. In this paper, we have focused on energy minimisation with the help of cluster size optimisation, taking the effect of modulation into account when the nodes are not able to communicate using a baseband communication technique. Cluster size optimisation is an important technique for improving the performance of WSNs: it provides improvements in energy efficiency, network scalability, network lifetime and latency. We have proposed an analytical expression for cluster size optimisation using the traditional sensing model of nodes for a square sensing field, with consideration of modulation effects. Energy minimisation can be achieved by changing the modulation scheme (BPSK, QPSK, 16-QAM, 64-QAM, etc.), so we consider the effect of different modulation techniques on cluster formation. The nodes in the sensing field are deployed randomly and uniformly. It is also observed that placing the base station at the centre of the scenario enables only a small number of modulation schemes to work in an energy-efficient manner, but when the base station is placed at the corner of the sensing field, a large number of modulation schemes can work in an energy-efficient manner.

  8. Optimisation of wire-cut EDM process parameter by Grey-based response surface methodology

    Science.gov (United States)

    Kumar, Amit; Soota, Tarun; Kumar, Jitendra

    2018-03-01

    Wire electric discharge machining (WEDM) is one of the advanced machining processes. Response surface methodology coupled with Grey relational analysis has been proposed and used to optimise the machining parameters of WEDM. A face-centred cubic design is used for conducting experiments on high speed steel (HSS) M2 grade workpiece material. The regression model of significant factors such as pulse-on time, pulse-off time, peak current and wire feed is considered for optimising the response variables: material removal rate (MRR), surface roughness and kerf width. The optimal machining parameters were obtained using the Grey relational grade. ANOVA is applied to determine the significance of the input parameters for optimising the Grey relational grade.
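    The Grey relational grade used to rank experimental runs can be sketched as follows. The three response vectors are hypothetical, and ζ = 0.5 is the customary distinguishing coefficient:

```python
def normalise(col, kind):
    """Grey relational normalisation; assumes the column is not constant."""
    lo, hi = min(col), max(col)
    if kind == "max":                                # larger-the-better (MRR)
        return [(x - lo) / (hi - lo) for x in col]
    return [(hi - x) / (hi - lo) for x in col]       # smaller-the-better (Ra, kerf)

def grey_relational_grades(rows, kinds, zeta=0.5):
    """Grade per run: mean Grey relational coefficient over all responses."""
    cols = list(zip(*rows))
    norm = [normalise(c, k) for c, k in zip(cols, kinds)]
    grades = []
    for i in range(len(rows)):
        # After normalisation the ideal value is 1, so the deviation is 1 - x;
        # delta_min = 0 and delta_max = 1 by construction.
        coeffs = [zeta / ((1.0 - col[i]) + zeta) for col in norm]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# Hypothetical responses per run: (MRR, surface roughness, kerf width)
runs = [(5.2, 2.9, 0.31), (6.8, 3.4, 0.29), (4.1, 2.2, 0.33)]
g = grey_relational_grades(runs, ("max", "min", "min"))
best = g.index(max(g))   # run with the highest grade wins
```

    The run with the largest grade is the multi-response optimum; ANOVA is then applied to the grades rather than to the raw responses.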

  9. Optimisation and decisions in radiological protection - A report of the work of an ICRP task group

    International Nuclear Information System (INIS)

    Webb, G.A.M.

    1988-01-01

    In 1984 the International Commission on Radiological Protection (ICRP) established a Task Group of Committee 4 to produce a report on methods for optimisation of protection other than cost-benefit analysis. As the work of the task group progressed it became clear that it would be more useful to produce a report on the entire field of application of optimisation, mainly to show how the various techniques, including cost-benefit analysis, could be applied appropriately to problems at different levels of complexity. This paper reports on the main ideas that have been developed by the task group. It must be emphasised that these ideas have not been endorsed by Committee 4 nor approved by the Commission, so they cannot yet be considered recommendations.

  10. A constriction factor based particle swarm optimisation algorithm to solve the economic dispatch problem including losses

    Energy Technology Data Exchange (ETDEWEB)

    Young, Steven; Montakhab, Mohammad; Nouri, Hassan

    2011-07-15

    Economic dispatch (ED) is one of the most important problems to be solved in power generation, as fractional percentage fuel reductions represent significant cost savings. ED seeks to optimise the power generated by each generating unit in a system in order to find the minimum operating cost at a required load demand, whilst ensuring both equality and inequality constraints are met. For the optimisation process, a model must be created for each generating unit. Particle swarm optimisation is an evolutionary computation technique and one of the most powerful methods for solving global optimisation problems. The aim of this paper is to add a constriction factor to the particle swarm optimisation algorithm (CFBPSO). Results show that the algorithm is very good at solving the ED problem; since CFBPSO must be able to work in a practical environment, valve point effects and transmission losses should be included in future work.
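    A minimal sketch of constriction-factor PSO on a toy dispatch-style cost: quadratic fuel curves plus a heavy penalty on the demand equality constraint. The unit coefficients are invented, and losses and valve-point effects are omitted:

```python
import math
import random

def constriction(c1=2.05, c2=2.05):
    """Clerc's constriction factor; for c1 + c2 = 4.1 it is ~0.7298."""
    phi = c1 + c2
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

def cfbpso(cost, bounds, n=20, iters=200, seed=1):
    """Constriction-factor-based PSO over box constraints."""
    random.seed(seed)
    c1 = c2 = 2.05
    chi = constriction(c1, c2)
    dim = len(bounds)
    xs = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest, pcost = [x[:] for x in xs], [cost(x) for x in xs]
    gbest = pbest[pcost.index(min(pcost))][:]
    for _ in range(iters):
        for i in range(n):
            for d, (lo, hi) in enumerate(bounds):
                r1, r2 = random.random(), random.random()
                vs[i][d] = chi * (vs[i][d]
                                  + c1 * r1 * (pbest[i][d] - xs[i][d])
                                  + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] = min(max(xs[i][d] + vs[i][d], lo), hi)
            c = cost(xs[i])
            if c < pcost[i]:
                pcost[i], pbest[i] = c, xs[i][:]
                if c < cost(gbest):
                    gbest = xs[i][:]
    return gbest, cost(gbest)

# Toy stand-in for unit fuel curves: a*P^2 + b*P + c (invented coefficients),
# with a heavy penalty enforcing the demand equality constraint.
UNITS = [(0.010, 2.0, 10.0), (0.012, 1.8, 12.0), (0.008, 2.2, 8.0)]

def dispatch_cost(p, demand=300.0):
    fuel = sum(a * x * x + b * x + c for (a, b, c), x in zip(UNITS, p))
    return fuel + 1000.0 * abs(sum(p) - demand)

schedule, best = cfbpso(dispatch_cost, bounds=[(50.0, 200.0)] * 3)
```

    The constriction factor damps the velocity update without an explicit inertia weight, which is what distinguishes CFBPSO from the canonical PSO.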

  11. Navigating catastrophes: Local but not global optimisation allows for macro-economic navigation of crises

    Science.gov (United States)

    Harré, Michael S.

    2013-02-01

    Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: Crashes in financial markets and whether or not traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, businesses collapsing and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium where growth dominates the relatively minor fluctuations in prices. Recent work from within economics as well as by physicists, psychologists and computational scientists has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that under very specific conditions local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required then under very mild assumptions market catastrophes are an unavoidable consequence. Third, if only local optimisation and economic growth are required then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.

  12. Optimising culture medium for producing the yeast Pichia onychis (Lv027

    Directory of Open Access Journals (Sweden)

    Andrés Díaz

    2005-01-01

    Full Text Available Optimising Pichia onychis yeast biomass production was evaluated using different substrates and different physicochemical conditions for liquid fermentation. The Plackett-Burman statistical design was initially applied for screening the most important nutritional variables (three carbon sources and eight nitrogen sources) affecting yeast biomass production. Four nutritional sources and two physicochemical variables were subsequently evaluated using a fractional factorial design as the starting point for optimising the process by applying a central composite rotational design. The results obtained from fitting a polynomial regression model to the experimental data showed that biomass production was strongly affected by nutritional and physicochemical conditions. The highest yield was obtained under the following conditions: 43.42 g/L carbon source, 0.261 g/L organic nitrogen source, shaking at 110 rpm, pH 6.0 and 48 h total fermentation time, during which 8.95×10⁹ cells/mL were obtained, equivalent to 6.30 g/L dry biomass. Key words: Pichia onychis, optimisation, liquid fermentation.

  13. Design of passive coolers for light-emitting diode lamps using topology optimisation

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Sigmund, Ole; Meyer, Knud Erik

    2018-01-01

    Topology optimised designs for passive cooling of light-emitting diode (LED) lamps are investigated through extensive numerical parameter studies. The designs are optimised for either horizontal or vertical orientations and are compared to a lattice-fin design as well as a simple parameter ..., while maintaining low sensitivity to orientation. Furthermore, they exhibit several defining features and provide insight and general guidelines for the design of passive coolers for LED lamps.

  14. A joint spare part and maintenance inspection optimisation model using the Delay-Time concept

    International Nuclear Information System (INIS)

    Wang Wenbin

    2011-01-01

    Spare parts and maintenance are closely related logistics activities, where maintenance generates the need for spare parts. When preventive maintenance is present, more spare parts may be needed at one time because of the planned preventive maintenance activities. This paper considers the joint optimisation of three decision variables, i.e., the ordering quantity, ordering interval and inspection interval. The model is constructed using the well-known Delay-Time concept, where the failure process is divided into a two-stage process. The objective function is the long-run expected cost per unit time in terms of the three decision variables to be optimised. Here we use a block-based inspection policy where all components are inspected at the same time regardless of their ages. This creates a situation in which the time to failure since the immediately preceding inspection is random and has to be modelled by a distribution. This time is called the forward time, and a limiting but closed form of its distribution is obtained. We develop an algorithm for the optimal solution of the decision process using a combination of analytical and enumeration approaches. The model is demonstrated by a numerical example. - Highlights: → Joint optimisation of maintenance and spare part inventory. → The use of the Delay-Time concept. → Block-based inspection. → Fixed order interval but variable order quantity.
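    The delay-time structure of such a cost function can be illustrated for the simplest single-variable case: Poisson defect arrivals with exponential delay times and a single inspection interval to choose. The rates and cost figures are invented, and the paper's joint spare-part decision variables are omitted:

```python
import math

def expected_failures(T, lam, mu):
    """E[failures in an inspection interval of length T] under the
    delay-time model: defects arrive as a Poisson process with rate lam
    and become failures after an Exp(mean mu) delay unless an inspection
    catches them first."""
    return lam * (T - mu * (1.0 - math.exp(-T / mu)))

def cost_rate(T, lam=0.1, mu=20.0, c_insp=50.0, c_fail=500.0, c_rep=100.0):
    """Long-run expected cost per unit time for inspection interval T
    (invented cost figures)."""
    n_fail = expected_failures(T, lam, mu)
    n_found = lam * T - n_fail        # defects caught and repaired at inspection
    return (c_insp + c_fail * n_fail + c_rep * n_found) / T

# Enumerate candidate intervals, mirroring the paper's mix of analytical
# expressions and enumeration over the decision space.
best_T = min(range(1, 200), key=cost_rate)
```

    Short intervals waste inspection cost; long intervals let too many defects mature into failures, so the cost rate has an interior minimum.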

  15. Statistical optimisation techniques in fatigue signal editing problem

    International Nuclear Information System (INIS)

    Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.

    2015-01-01

    Success in fatigue signal editing is determined by the level of length reduction achieved without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. A long recorded signal sometimes renders the cycle-to-cycle editing process daunting, which has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single-constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments, whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single-objective optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.

  16. Statistical optimisation techniques in fatigue signal editing problem

    Energy Technology Data Exchange (ETDEWEB)

    Nopiah, Z. M.; Osman, M. H. [Fundamental Engineering Studies Unit Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia); Baharin, N.; Abdullah, S. [Department of Mechanical and Materials Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia)

    2015-02-03

    Success in fatigue signal editing is determined by the level of length reduction achieved without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. A long recorded signal sometimes renders the cycle-to-cycle editing process daunting, which has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single-constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments, whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single-objective optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.

  17. FISHRENT; Bio-economic simulation and optimisation model

    NARCIS (Netherlands)

    Salz, P.; Buisman, F.C.; Soma, K.; Frost, H.; Accadia, P.; Prellezo, R.

    2011-01-01

    Key findings: The FISHRENT model is a major step forward in bio-economic modelling, combining features that have not been fully integrated in earlier models: 1- Incorporation of any number of species (or stocks) and/or fleets 2- Integration of simulation and optimisation over a period of 25 years 3-

  18. Thermal-economic optimisation of a CHP gas turbine system by applying a fit-problem genetic algorithm

    Science.gov (United States)

    Ferreira, Ana C. M.; Teixeira, Senhorinha F. C. F.; Silva, Rui G.; Silva, Ângela M.

    2018-04-01

    Cogeneration allows the optimal use of primary energy sources and significant reductions in carbon emissions. Its use has great potential for applications in the residential sector. This study aims to develop a methodology for the thermal-economic optimisation of a small-scale micro-gas turbine for cogeneration purposes, able to fulfil domestic energy needs with a thermal power output of 125 kW. A constrained non-linear optimisation model was built. The objective function is the maximisation of the annual worth of the combined heat and power production, representing the balance between the annual incomes and the expenditures, subject to physical and economic constraints. A genetic algorithm coded in the Java programming language was developed. An optimal micro-gas turbine able to produce 103.5 kW of electrical power with a positive annual profit (11,925 €/year) was identified. The investment can be recovered in 4 years and 9 months, which is less than half of the system's expected lifetime.

  19. Design of Circularly-Polarised, Crossed Drooping Dipole, Phased Array Antenna Using Genetic Algorithm Optimisation

    DEFF Research Database (Denmark)

    Larsen, Niels Vesterdal

    2007-01-01

    A printed drooping dipole array is designed and constructed. The design is based on a genetic algorithm optimisation procedure used in conjunction with the software programme AWAS. By optimising the array G/T for specific combinations of scan angles and frequencies an optimum design is obtained...

  20. Bio-processing of Agro-industrial Wastes for Production of Food-grade Enzymes: Progress and Prospects

    Directory of Open Access Journals (Sweden)

    Parmjit S Panesar

    2016-10-01

    Full Text Available Background and Objectives: In the era of global industrialization, enzymes are being used extensively in various sectors including food processing. Owing to the high price of enzymes, various initiatives have been undertaken by the R&D sector for the development of new processes or the improvement of existing processes for the production of cost-effective enzymes. With the advancement of the field of biotechnology, different bioprocesses are being used to utilize different agro-industrial residues for the production of various enzymes. This review focuses on different types of agro-industrial wastes and their utilization in the production of enzymes. The present scenario as well as the future scope of utilization of enzymes in the food industry is also discussed. Results and Conclusion: The regulations from various governmental as well as environmental agencies demanding a cleaner environment have led to the advancement of various technologies for the utilization of wastes in the production of value-added products such as enzymes. Among the different types of fermentation, the most work has been carried out under solid-state conditions by batch fermentation. The research has indicated the significant potential of agro-industrial wastes for the production of food-grade enzymes in order to improve the economics of the process. Conflict of interests: The authors declare no conflict of interest.

  1. An investigation into the application of modern heuristic optimisation techniques to problems in power and processing utilities

    International Nuclear Information System (INIS)

    Dahal, Keshav Prasad

    2000-01-01

    The work contained in this thesis demonstrates that there is a significant requirement for the development and application of new optimisation techniques for solving industrial scheduling problems, in order to achieve better schedules with significant economic and operational impact. It investigates how modern heuristic approaches, such as genetic algorithms (GA), simulated annealing (SA), fuzzy logic and hybrids of these techniques, may be developed, designed and implemented appropriately for solving the short-term and long-term NP-hard scheduling problems that exist in electric power utilities and process facilities. GA- and SA-based methods are developed for generator maintenance scheduling using a novel integer encoding and appropriate GA and SA operators. Three hybrid approaches (an inoculated GA, a GA/SA hybrid and a GA with fuzzy logic) are proposed in order to improve solution performance and to take advantage of any flexibilities inherent in the problem. Five different GA-based approaches are investigated for solving the generation scheduling problem. Of these, a knowledge-based hybrid GA approach achieves better solutions in a shorter computational time. This approach integrates problem-specific knowledge, heuristic dispatch calculation and linear programming within the GA framework. The application of a GA-based methodology is proposed for the scheduling of storage tanks of a water treatment facility. The proposed approach is an integration of a GA and a heuristic rule base: the GA string addresses the tank allocation problem, and the heuristic approach solves the rate determination problems within the framework of the GA. For optimising the schedule of operations of a bulk handling port facility, a generic modelling tool is developed characterising the operational and maintenance activities of the facility. A GA-based approach is integrated with the simulation software for optimising the scheduling of operations of the facility. 
Each of these approaches is

  2. Sizing Combined Heat and Power Units and Domestic Building Energy Cost Optimisation

    Directory of Open Access Journals (Sweden)

    Dongmin Yu

    2017-06-01

    Full Text Available Many combined heat and power (CHP) units have been installed in domestic buildings to increase energy efficiency and reduce energy costs. However, inappropriate sizing of a CHP unit may actually increase energy costs and reduce energy efficiency. Moreover, the high manufacturing cost of batteries makes them less affordable. Therefore, this paper attempts to size the capacity of the CHP unit and optimise daily energy costs for a domestic building with only CHP installed. Electricity and heat loads are first used separately as sizing criteria in finding the best capacities of different types of CHP with the help of the maximum rectangle (MR) method. Subsequently, a genetic algorithm (GA) is used to optimise the daily energy costs of the different cases. Then, heat and electricity loads are jointly considered for sizing different types of CHP and for optimising the daily energy costs through the GA method. The optimisation results show that the GA sizing method gives a higher average daily energy cost saving, a 13% reduction compared to a building without CHP. However, to achieve this, there is about a 3% reduction in energy efficiency and a 7% reduction in the input-power-to-rated-power ratio compared to using the MR method with heat demand for sizing the CHP.
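    The maximum rectangle (MR) sizing step can be sketched as follows: sort the hourly heat demand into a load duration curve, then choose the capacity whose rectangle (capacity × hours of full-load operation) under that curve has the largest area. The daily demand profile below is hypothetical:

```python
def maximum_rectangle_size(hourly_demand):
    """Return (capacity, rectangle area) maximising capacity x running hours,
    i.e. the largest rectangle under the load duration curve."""
    duration = sorted(hourly_demand, reverse=True)
    # A rectangle of height duration[i] fits under the first i + 1 hours.
    areas = [d * (i + 1) for i, d in enumerate(duration)]
    i = areas.index(max(areas))
    return duration[i], areas[i]

# Hypothetical daily heat demand profile (kW per hour), not measured data
demand = [5, 5, 6, 8, 14, 20, 22, 18, 12, 10, 9, 9,
          8, 8, 9, 10, 14, 20, 24, 22, 16, 10, 7, 6]
cap, energy = maximum_rectangle_size(demand)
```

    The rectangle area is the energy the CHP unit can deliver at full load, so the MR capacity maximises full-load utilisation; the GA method instead searches over capacities against the full daily cost model.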

  3. Formulation of self-compacting concretes: Optimisation of the granular skeleton ...

    African Journals Online (AJOL)

    Formulation of self-compacting concretes: Optimisation of the granular skeleton by the Dreux-Gorisse graphical method. Fatiha Boumaza-Zeraoulia* & Mourad Behim. Laboratoire Matériaux, Géo-Matériaux et Environnement - Département de Génie Civil, Université Badji Mokhtar Annaba - BP 12, 23000 Annaba - ...

  4. Manufacturing footprint optimisation: a necessity for manufacturing network in changing business environment

    DEFF Research Database (Denmark)

    Yang, Cheng; Farooq, Sami; Johansen, John

    2010-01-01

    Facing an unpredictable financial crisis, optimising the footprint can be the biggest and most important transformation a manufacturer can undertake. In order to realise this optimisation, a fundamental understanding of the manufacturing footprint is required. Different elements of the manufacturing footprint have been investigated independently in the existing literature. In this paper, for the purpose of exploring the relationships between different elements, the manufacturing footprints of three industrial companies are traced historically. Based on them, four reasons for the transformation...

  5. Research on Duct Flow Field Optimisation of a Robot Vacuum Cleaner

    Directory of Open Access Journals (Sweden)

    Xiao-bo Lai

    2011-11-01

    Full Text Available The duct of a robot vacuum cleaner is the flow channel between the inlet of the rolling brush blower and the outlet of the vacuum blower. To cope with the pressure drop problem of the duct flow field in a robot vacuum cleaner, a method based on the Pressure-Implicit with Splitting of Operators (PISO) algorithm is introduced and the optimisation design of the duct flow field is implemented. Firstly, the duct structure in a robot vacuum cleaner is taken as the research object and, with computational fluid dynamics (CFD) theories adopted, a three-dimensional fluid model of the duct is established by means of the FLUENT solver of the CFD software. Secondly, with the k-ε turbulence model for three-dimensional incompressible fluid considered and the PISO pressure correction algorithm employed, numerical simulations of the flow field inside the duct of the robot vacuum cleaner are carried out. Then, velocity vector plots on arbitrary planes of the duct flow field are obtained. Finally, the dynamic characteristics of the duct flow field are investigated, defects of the original duct flow field are analysed, and the original flow field is optimised. Experimental results show that the duct flow field after optimisation can effectively reduce the pressure drop; the feasibility and correctness of the theoretical modelling and optimisation approaches are validated.

  7. A New Plant Intelligent Behaviour Optimisation Algorithm for Solving Vehicle Routing Problem

    OpenAIRE

    Chagwiza, Godfrey

    2018-01-01

    A new plant intelligent behaviour optimisation algorithm is developed. The algorithm is motivated by the intelligent behaviour of plants and is implemented to solve benchmark vehicle routing problems of all sizes, and the results were compared to those in the literature. The results show that the new algorithm outperforms most of the algorithms it was compared to for very large and large vehicle routing problem instances. This is attributed to the ability of the plant to use previously stored memory to respo...

  8. Multigrid Implementation of Cellular Automata for Topology Optimisation of Continuum Structures with Design Dependent loads

    NARCIS (Netherlands)

    Zakhama, R.

    2009-01-01

    Topology optimisation of continuum structures has become mature enough to be often applied in industry and continues to attract the attention of researchers and software companies in various engineering fields. Traditionally, most available algorithms for solving topology optimisation problems are

  9. The optimisation study of the TBP synthesis process by phosphoric acid

    International Nuclear Information System (INIS)

    Amedjkouh, A.; Attou, M.; Azzouz, A.; Zaoui, B.

    1995-07-01

    The present work deals with the optimisation of the TBP synthesis process using phosphoric acid. This synthesis route is more advantageous than using POCl3 or P2O5 as phosphating agents, as the latter are toxic and dangerous for the environment. The optimisation study is based on a series of 16 experiments taking into account the range of variation of the following parameters: temperature, pressure, reagent mole ratio and promoter content. The yield calculation is based on the randomisation of an equation including all parameters. The resolution of this equation gave a 30% TBP molar ratio. This value is in agreement with the experimental data.

  10. Flotation process control optimisation at Prominent Hill

    International Nuclear Information System (INIS)

    Lombardi, Josephine; Muhamad, Nur; Weidenbach, M.

    2012-01-01

    OZ Minerals' Prominent Hill copper-gold concentrator is located 130 km south-east of the town of Coober Pedy in the Gawler Craton of South Australia. The concentrator was built in 2008 and commenced commercial production in early 2009. The Prominent Hill concentrator comprises a conventional grinding and flotation processing plant with a 9.6 Mtpa ore throughput capacity. The flotation circuit includes six rougher cells, an IsaMill for regrinding the rougher concentrate and a Jameson cell heading up the three-stage conventional cell cleaner circuit. In total there are four level controllers in the rougher train and ten level controllers in the cleaning circuit for 18 cells. Generic proportional-integral-derivative (PID) control used on the level controllers alone propagated any disturbances generated by the grinding circuit, hoppers, individual cells and interconnected banks of cells downstream in the circuit, having a negative impact on plant performance. To better control such disturbances, the FloatStar level stabiliser was selected for installation on the flotation circuit to account for the interaction between the cells. Multivariable control was also installed on the five concentrate hoppers to maintain consistent feed to the cells and to the IsaMill. An additional area identified for optimisation in the flotation circuit was the mass pull rate from the rougher cells. The FloatStar flow optimiser was selected to be installed subsequent to the FloatStar level stabiliser. This allowed for a unified, consistent and optimal approach to running the rougher circuit. This paper describes the improvement in circuit stabilisation achieved by the FloatStar level stabiliser using the interaction matrix between cell level controllers, and the results and benefits of implementing the FloatStar flow optimiser on the rougher train.

  11. Optimisation of monochrome images

    International Nuclear Information System (INIS)

    Potter, R.

    1983-01-01

    Gamma cameras with modern imaging systems usually digitize the signals to allow storage and processing of the image in a computer. Although such computer systems are widely used for the extraction of quantitative uptake estimates and the analysis of time-variant data, the vast majority of nuclear medicine images are still interpreted on the basis of an observer's visual assessment of a photographic hardcopy image. The optimisation of hardcopy devices is therefore vital, and factors such as resolution, uniformity, noise, grey scales and display matrices are discussed. Once optimum display parameters have been determined, routine procedures for quality control need to be established; suitable procedures are discussed. (U.K.)

  12. Use of artificial intelligence techniques for optimisation of co-combustion of coal with biomass

    Energy Technology Data Exchange (ETDEWEB)

    Tan, C.K.; Wilcox, S.J.; Ward, J. [University of Glamorgan, Pontypridd (United Kingdom). Division of Mechanical Engineering

    2006-03-15

    The optimisation of burner operation in conventional pulverised-coal-fired boilers for co-combustion applications represents a significant challenge. This paper describes a strategic framework in which Artificial Intelligence (AI) techniques can be applied to solve such an optimisation problem. The effectiveness of the proposed system is demonstrated by a case study that simulates the co-combustion of coal with sewage sludge in a 500-kW pilot-scale combustion rig equipped with a swirl-stabilised low-NOx burner. A series of Computational Fluid Dynamics (CFD) simulations were performed to generate data for different operating conditions, which were then used to train several Artificial Neural Networks (ANNs) to predict the co-combustion performance. Once trained, the ANNs were able to make estimations for unseen situations in a fraction of the time taken by the CFD simulation. Consequently, the networks were capable of representing the underlying physics of the CFD models and could be executed efficiently for the large number of iterations required by optimisation techniques based on Evolutionary Algorithms (EAs). Four operating parameters of the burner, namely the swirl angles and flow rates of the secondary and tertiary combustion air, were optimised with the objective of minimising the NOx and CO emissions as well as the unburned carbon at the furnace exit. The results suggest that ANNs combined with EAs provide a useful tool for optimising co-combustion processes.
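    The surrogate-then-optimise loop can be sketched with stdlib stand-ins: an inverse-distance-weighted interpolator replaces the trained ANN, a toy one-variable function replaces the CFD model, and a grid search replaces the evolutionary algorithm (all assumptions for illustration):

```python
def cfd_stand_in(x):
    """Toy stand-in for an expensive CFD emission prediction (invented)."""
    return (x - 0.3) ** 2 + 0.02

# A handful of "CFD runs" at sampled operating conditions in [0, 1]
samples = [(i / 6.0, cfd_stand_in(i / 6.0)) for i in range(7)]

def surrogate(x, p=2.0):
    """Inverse-distance-weighted interpolation of the sampled runs --
    a cheap stdlib substitute for the trained neural network."""
    num = den = 0.0
    for xi, yi in samples:
        d = abs(x - xi)
        if d < 1e-12:
            return yi            # exact at the sampled points
        w = 1.0 / d ** p
        num += w * yi
        den += w
    return num / den

# Cheap search over the surrogate (an EA in the paper's framework):
grid = [i / 1000.0 for i in range(1001)]
x_best = min(grid, key=surrogate)
```

    The point is that each surrogate evaluation costs microseconds rather than a full CFD run, so the outer optimiser can afford the many thousands of evaluations an EA requires.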

  13. Optimisation of active suspension control inputs for improved performance of active safety systems

    Science.gov (United States)

    Čorić, Mirko; Deur, Joško; Xu, Li; Tseng, H. Eric; Hrovat, Davor

    2018-01-01

    A collocation-type control variable optimisation method is used to investigate the extent to which the fully active suspension (FAS) can be applied to improve the vehicle electronic stability control (ESC) performance and reduce the braking distance. First, the optimisation approach is applied to the scenario of vehicle stabilisation during the sine-with-dwell manoeuvre. The results are used to provide insights into different FAS control mechanisms for vehicle performance improvements related to responsiveness and yaw rate error reduction indices. The FAS control performance is compared to performances of the standard ESC system, optimal active brake system and combined FAS and ESC configuration. Second, the optimisation approach is employed to the task of FAS-based braking distance reduction for straight-line vehicle motion. Here, the scenarios of uniform and longitudinally or laterally non-uniform tyre-road friction coefficient are considered. The influences of limited anti-lock braking system (ABS) actuator bandwidth and limit-cycle ABS behaviour are also analysed. The optimisation results indicate that the FAS can provide competitive stabilisation performance and improved agility when compared to the ESC system, and that it can reduce the braking distance by up to 5% for distinctively non-uniform friction conditions.

  14. Elm Tree (Ulmus parvifolia) Bark Bioprocessed with Mycelia of Shiitake (Lentinus edodes) Mushrooms in Liquid Culture: Composition and Mechanism of Protection against Allergic Asthma in Mice.

    Science.gov (United States)

    Kim, Sung Phil; Lee, Sang Jong; Nam, Seok Hyun; Friedman, Mendel

    2016-02-03

    Mushrooms can break down complex plant materials into smaller, more digestible and bioactive compounds. The present study investigated the antiasthma effect of an Ulmus parvifolia bark extract bioprocessed in Lentinus edodes liquid mycelium culture (BPUBE) against allergic asthma in chicken egg ovalbumin (OVA)-sensitized/challenged mice. BPUBE suppressed total IgE release from U266B1 cells in a dose-dependent manner without cytotoxicity. Inhibitory activity of BPUBE against OVA-specific IgE secretion in bronchoalveolar lavage fluid (BALF) was observed in OVA-sensitized/challenged asthmatic mice. BPUBE also inhibited OVA-specific IgG and IgG1 secretion into serum from the allergic mice, suggesting the restoration of a Th2-biased immune reaction to a Th1/Th2-balanced status, as indicated by the Th1/Th2 as well as regulatory T cell (Treg) cytokine profile changes caused by BPUBE in serum or BALF. Inflammatory cell counts in BALF and lung histology showed that leukocytosis and eosinophilia induced by OVA-sensitization/challenge were inhibited by the oral administration of BPUBE. Amelioration of eosinophil infiltration near the trachea was associated with reduced eotaxin and vascular cell adhesion molecule-1 (VCAM-1) levels. Changes in proinflammatory mediator levels in BALF suggest that BPUBE decreased OVA-sensitization-induced elevation of leukotriene C4 (LTC4) and prostaglandin D2 (PGD2). The finding that asthma-associated biomarker levels of OVA-sensitized/challenged mice were much more inhibited with BPUBE treatment than NPUBE (not-bioprocessed Ulmus parvifolia extract) treatment suggested the production of new bioactive compounds by the mushroom mycelia that may be involved in enhancing the observed antiasthmatic properties. The possible relation of the composition determined by proximate analysis and GC/MS to observed bioactivity is discussed. The results suggest that the elm tree (Ulmus parvifolia) bark bioprocessed with mycelia of shiitake (Lentinus edodes

  15. Microbial biocatalyst developments to upgrade fossil fuels.

    Science.gov (United States)

    Kilbane, John J

    2006-06-01

    Steady increases in the average sulfur content of petroleum and stricter environmental regulations concerning the sulfur content have promoted studies of bioprocessing to upgrade fossil fuels. Bioprocesses can potentially provide a solution to the need for improved and expanded fuel upgrading worldwide, because bioprocesses for fuel upgrading do not require hydrogen and produce far less carbon dioxide than thermochemical processes. Recent advances have demonstrated that biodesulfurization is capable of removing sulfur from hydrotreated diesel to yield a product with an ultra-low sulfur concentration that meets current environmental regulations. However, the technology has not yet progressed beyond laboratory-scale testing, as more efficient biocatalysts are needed. Genetic studies to obtain improved biocatalysts for the selective removal of sulfur and nitrogen from petroleum provide the focus of current research efforts.

  16. AllAboard: Visual Exploration of Cellphone Mobility Data to Optimise Public Transport.

    Science.gov (United States)

    Di Lorenzo, G; Sbodio, M; Calabrese, F; Berlingerio, M; Pinelli, F; Nair, R

    2016-02-01

    The deep penetration of mobile phones offers cities the ability to opportunistically monitor citizens' mobility and use data-driven insights to better plan and manage services. With large-scale data on mobility patterns, operators can move away from the costly, mostly survey-based, transportation planning processes to a more data-centric view that places the instrumented user at the center of development. In this framework, using mobile phone data to perform transit analysis and optimization represents a new frontier with significant societal impact, especially in developing countries. In this paper we present AllAboard, an intelligent tool that analyses cellphone data to help city authorities in visually exploring urban mobility and optimizing public transport. This is performed within a self-contained tool, as opposed to the current solutions, which rely on a combination of several distinct tools for analysis, reporting, optimisation and planning. An interactive user interface allows transit operators to visually explore the travel demand in both space and time, correlate it with the transit network, and evaluate the quality of service that a transit network provides to the citizens at a very fine grain. Operators can visually test scenarios for transit network improvements, and compare the expected impact on the travellers' experience. The system has been tested using real telecommunication data for the city of Abidjan, Ivory Coast, and evaluated from a data mining, optimisation and user perspective.

  17. Optimisation of the alcoholic fermentation of aqueous jerivá pulp extract

    Directory of Open Access Journals (Sweden)

    Guilherme Arielo Rodrigues Maia

    2014-05-01

    Full Text Available The objective of this research is to determine the optimum conditions for the alcoholic fermentation process of aqueous jerivá pulp extract using the response surface methodology and simplex optimisation technique. The incomplete factorial design 3³ was applied with the yeast extract, NH4H2PO4 and yeast as the independent variables and the alcohol production yield as the response. The regression analysis indicated that the model is predictive, and the simplex optimisation generated a formulation containing 0.35 g L-1 yeast extract, 6.33 g L-1 yeast and 0.30 g L-1 NH4H2PO4 for an optimum yield of 85.40% ethanol. To validate the predictive equation, the experiment was carried out in triplicate under optimum conditions, and an average yield of 87.15% was obtained. According to a t-test, no significant difference (at the 5% level) was observed between the average value obtained and the value indicated by the simplex optimisation technique.
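    The response-surface step behind this record, fitting a low-order polynomial to designed-experiment data and then optimising the fitted model, can be illustrated with a one-factor Python sketch. The dose/yield numbers below are invented for illustration; the actual study fitted three factors and refined the optimum with a simplex search.

```python
# Fit y = a + b*x + c*x^2 by least squares, then take the stationary point of
# the fitted parabola as the predicted optimum dose.
def fit_quadratic(xs, ys):
    """Solve the 3x3 normal equations for a quadratic least-squares fit."""
    s = lambda p: sum(x ** p for x in xs)
    sy = lambda p: sum(y * x ** p for x, y in zip(xs, ys))
    A = [[s(i + j) for j in range(3)] for i in range(3)]
    rhs = [sy(i) for i in range(3)]
    for col in range(3):                  # Gaussian elimination with pivoting
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                   # back substitution
        coef[r] = (rhs[r] - sum(A[r][c] * coef[c]
                                for c in range(r + 1, 3))) / A[r][r]
    return coef  # [a, b, c]

doses = [2, 4, 6, 8, 10]                  # invented yeast doses, g/L
yields = [70.1, 80.3, 85.2, 81.0, 69.8]   # invented ethanol yields, %
a, b, c = fit_quadratic(doses, yields)
optimum_dose = -b / (2 * c)               # maximum of the fitted parabola
```

    The same idea extends to three factors by adding cross-terms to the model; the validation step in the record then re-runs the experiment at the predicted optimum.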

  18. Optimisation of propagation models from ... data

    African Journals Online (AJOL)

    radio optimisation in mobile networks. This article presents a comparative study of five propagation-model optimisation methods, namely: linear regression, the second-order Newton method, annealing ...

  19. Parallel unstructured mesh optimisation for 3D radiation transport and fluids modelling

    International Nuclear Information System (INIS)

    Gorman, G.J.; Pain, Ch. C.; Oliveira, C.R.E. de; Umpleby, A.P.; Goddard, A.J.H.

    2003-01-01

    In this paper we describe the theory and application of a parallel mesh optimisation procedure to obtain self-adapting finite element solutions on unstructured tetrahedral grids. The optimisation procedure adapts the tetrahedral mesh to the solution of a radiation transport or fluid flow problem without sacrificing the integrity of the boundary (geometry), or internal boundaries (regions) of the domain. The objective is to obtain a mesh in which the interpolation error is uniform in all directions and the element shapes are of good quality. This is accomplished using a non-Euclidean (anisotropic) metric which is related to the Hessian of the solution field. Appropriate scaling of the metric enables the resolution of multi-scale phenomena as encountered in transient incompressible fluids and multigroup transport calculations. The resulting metric is used to calculate element size and shape quality. The mesh optimisation method is based on a series of mesh connectivity and node position searches of the landscape defining mesh quality, which is gauged by a functional. The mesh modification thus fits the solution field(s) in an optimal manner. The parallel mesh optimisation/adaptivity procedure presented in this paper is of general applicability. We illustrate this by applying it to a transient CFD (computational fluid dynamics) problem. Incompressible flow past a cylinder at moderate Reynolds numbers is modelled to demonstrate that the mesh can follow transient flow features. (authors)
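    A minimal sketch of the Hessian-based anisotropic metric this record refers to, assuming the common construction M = R|Λ|Rᵀ/ε, where H = RΛRᵀ is the Hessian of the solution field and ε is a target interpolation error (the paper's exact scaling may differ). Edge lengths in the adapted mesh satisfy eᵀMe ≈ 1, so elements shrink along directions of high curvature.

```python
import math

# 2x2 symmetric case with a closed-form eigendecomposition; function names
# and the eps scaling are illustrative assumptions.
def hessian_metric(h11, h12, h22, eps=0.01):
    tr, det = h11 + h22, h11 * h22 - h12 * h12
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    lam1, lam2 = tr / 2 + disc, tr / 2 - disc        # eigenvalues of H
    if abs(h12) > 1e-12:
        v1 = (lam1 - h22, h12)                       # eigenvector for lam1
    else:
        v1 = (1.0, 0.0)
    n = math.hypot(*v1)
    c, s = v1[0] / n, v1[1] / n                      # rotation R
    a1, a2 = abs(lam1) / eps, abs(lam2) / eps        # |Lambda| / eps
    m11 = a1 * c * c + a2 * s * s                    # M = R |Lambda|/eps R^T
    m12 = (a1 - a2) * c * s
    m22 = a1 * s * s + a2 * c * c
    return m11, m12, m22

def metric_length(e, M):
    """Edge length measured in the metric: sqrt(e^T M e)."""
    m11, m12, m22 = M
    return math.sqrt(m11 * e[0] ** 2 + 2 * m12 * e[0] * e[1] + m22 * e[1] ** 2)
```

    A mesh optimiser of the kind described would then swap connectivity and move nodes until every edge has metric length close to one.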

  20. Optimising resolution for a preparative separation of Chinese herbal medicine using a surrogate model sample system.

    Science.gov (United States)

    Ye, Haoyu; Ignatova, Svetlana; Peng, Aihua; Chen, Lijuan; Sutherland, Ian

    2009-06-26

    This paper builds on previous modelling research with short single layer columns to develop rapid methods for optimising high-performance counter-current chromatography at constant stationary phase retention. Benzyl alcohol and p-cresol are used as model compounds to rapidly optimise first flow and then rotational speed operating conditions at a preparative scale with long columns for a given phase system using a Dynamic Extractions Midi-DE centrifuge. The transfer to a high value extract such as the crude ethanol extract of Chinese herbal medicine Millettia pachycarpa Benth. is then demonstrated and validated using the same phase system. The results show that constant stationary phase modelling of flow and speed with long multilayer columns works well as a cheap, quick and effective method of optimising operating conditions for the chosen phase system, hexane-ethyl acetate-methanol-water (1:0.8:1:0.6, v/v). Optimum conditions for resolution were a flow of 20 ml/min and speed of 1200 rpm, but for throughput were 80 ml/min at the same speed. The results show that 80 ml/min gave the best throughputs for tephrosin (518 mg/h), pyranoisoflavone (47.2 mg/h) and dehydrodeguelin (10.4 mg/h), whereas for deguelin (100.5 mg/h), the best flow rate was 40 ml/min.

  1. Joint optimisation of spare part inventory, maintenance frequency and repair capacity for k-out-of-N systems

    NARCIS (Netherlands)

    de Smidt-Destombes, Karin S.; van der Heijden, Matthijs C.; van Harten, Aart

    2009-01-01

    To achieve a high system availability at minimal costs, relevant decisions include the choice of preventive maintenance frequency, spare part inventory levels and spare part repair capacity. We develop heuristics for the joint optimisation of these variables for (a) a single k-out-of-N system under

  2. Optimisation of the LHCb detector

    CERN Document Server

    Hierck, R H

    2003-01-01

    This thesis describes a comparison of the LHCb classic and LHCb light concept from a tracking perspective. The comparison includes the detector occupancies, the various pattern recognition algorithms and the reconstruction performance. The final optimised LHCb setup is used to study the physics performance of LHCb for the Bs->DsK and Bs->DsPi decay channels. This includes both the event selection and a study of the sensitivity for the Bs oscillation frequency, delta m_s, the Bs lifetime difference, DGamma_s, and the CP parameter gamma-2delta gamma.

  3. Design of optimised backstepping controller for the synchronisation

    Indian Academy of Sciences (India)

    In this paper, an adaptive backstepping controller has been tuned to synchronise two chaotic Colpitts oscillators in a master–slave configuration. The parameters of the controller are determined using shark smell optimisation (SSO) algorithm. Numerical results are presented and compared with those of particle swarm ...

  4. The optimisation of a water distribution system using Bentley WaterGEMS software

    Directory of Open Access Journals (Sweden)

    Świtnicka Karolina

    2017-01-01

    Full Text Available The proper maintenance of water distribution systems (WDSs) requires operators to take multiple actions to ensure optimal functioning, and usually all requirements must be balanced simultaneously. Therefore, the decision-making process is often supported by multi-criteria optimisation methods. Significant improvements in the operating conditions of WDSs can be achieved by connecting small water supply networks into group systems. Among the many tools supporting advanced maintenance and management of WDSs, particularly useful are those that can find optimal solutions through metaheuristic methods, such as the genetic algorithm. In this paper, an exemplary optimisation of WDS functioning is presented for a group water supply system. The optimised parameters included: maximisation of water flow velocity, regulation of pressure head, minimisation of water retention time in the network (water age) and minimisation of pump energy consumption. All simulations were performed in Bentley WaterGEMS software.

  5. Optimisation of wheat-sprouted soybean flour bread using response ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-11-16

    Nov 16, 2009 ... Full Length Research Paper. Optimisation of ... Victoria A. Jideani1* and Felix C. Onwubali2. 1Department of Food Technology, Cape Peninsula University of Technology, P. O. Box 652, Cape Town 8000, South. Africa.

  6. Issues with performance measures for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available Symposium on Computational Intelligence in Dynamic and Uncertain Environments (CIDUE), Mexico, 20-23 June 2013 Issues with Performance Measures for Dynamic Multi-objective Optimisation Mard´e Helbig CSIR: Meraka Institute Brummeria, South Africa...

  7. Dynamic optimisation of an industrial web process

    Directory of Open Access Journals (Sweden)

    M Soufian

    2008-09-01

    Full Text Available An industrial web process has been studied and it is shown that the underlying physics of such processes is governed by the Navier-Stokes partial differential equations with moving boundary conditions, which in turn have to be determined by the solution of the thermodynamics equations. The development of a two-dimensional continuous-discrete model structure based on this study is presented. Other models are constructed based on this model for better identification and optimisation purposes. The parameters of the proposed models are then estimated using real data obtained from the identification experiments with the process plant. Various simulation tests for validation accompany the design, development and real-time industrial implementation of an optimal controller for dynamic optimisation of this web process. It is shown that, in comparison with the traditional controller, the new controller resulted in better performance, an improvement in film quality and savings in raw materials. This demonstrates the efficiency and validity of the developed models.

  8. Optimisation of VSC-HVDC Transmission for Wind Power Plants

    DEFF Research Database (Denmark)

    Silva, Rodrigo Da

    Connection of Wind Power Plants (WPP), typically offshore, using VSC-HVDC transmission is an emerging solution with many benefits compared to the traditional AC solution, especially concerning the impact on the control architecture of the wind farms and the grid. The VSC-HVDC solution is likely to meet...... more stringent grid codes than a conventional AC transmission connection. The purpose of this project is to analyse how an HVDC solution, considering the voltage-source converter based technology, for grid connection of large wind power plants can be designed and optimised. By optimisation, the project...... the robust control technique is applied is compared with the classical proportional-integral (PI) performance, by means of time domain simulation in a point-to-point HVDC connection. The three main parameters in the discussion are the wind power delivered from the offshore wind power plant, the variation...

  9. Optimising the remediation of sites contaminated by the Wismut uranium mining operations using performance and risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pelz, F.; Jakubick, A.Th.; Kahnt, R. [Wismut GmbH, Chemnitz (Germany)

    2003-07-01

    The cost and risk assessment at Wismut GmbH is performed for optimising the remediation of sites contaminated by uranium mining and milling. An iterative 'top-down' model of the remediation project as an integrated system, either probabilistic or deterministic, is used. Initially, all relevant processes are captured in a rather abstract and simplistic way. In the course of the model development, those variables and processes to which the results have been shown to be sensitive are described in more detail. This approach is useful for identifying any gaps in the knowledge base that have to be filled in the course of the multi-attributive decision making. The requirement for optimisation, also with respect to socio-economic impacts, is met by including other variables in addition to costs and health risks. (authors)

  10. Optimising the remediation of sites contaminated by the Wismut uranium mining operations using performance and risk assessment

    International Nuclear Information System (INIS)

    Pelz, F.; Jakubick, A.Th.; Kahnt, R.

    2003-01-01

    The cost and risk assessment at Wismut GmbH is performed for optimising the remediation of sites contaminated by uranium mining and milling. An iterative 'top-down' model of the remediation project as an integrated system, either probabilistic or deterministic, is used. Initially, all relevant processes are captured in a rather abstract and simplistic way. In the course of the model development, those variables and processes to which the results have been shown to be sensitive are described in more detail. This approach is useful for identifying any gaps in the knowledge base that have to be filled in the course of the multi-attributive decision making. The requirement for optimisation, also with respect to socio-economic impacts, is met by including other variables in addition to costs and health risks. (authors)

  11. Numerical Analysis and Geometry Optimisation of Vertical Vane of Room Air-conditioner

    Directory of Open Access Journals (Sweden)

    Al-Obaidi Abdulkareem Sh. Mahdi

    2018-01-01

    Full Text Available Vertical vanes of room air-conditioners are used to control and direct cold air. This paper studies the vertical vane as one of the parameters that affect the efficiency of dissipating cold air into a given space. The vertical vane geometry is analysed and optimised for lower production cost using CFD. The optimised geometry should have the same or better efficiency of dissipating cold air and a smaller mass than the existing original design. The existing original design of the vertical vane is simplified and analysed using ANSYS Fluent. The efficiency of wind direction is defined as how accurately the airflow leaving the vertical vane follows the intended direction. To calculate this efficiency, 15° and 30° rotations of the vertical vane inside the room air-conditioner are simulated. The efficiency of wind direction is 57.81% for the 15° rotation and 47.54% for the 30° rotation. These results are used as the base reference for a parametric study. The parameters investigated for optimisation of the vertical vane are the lengths of the long span, tip chord and short span. The design with a 15% reduction in vane surface area at the tip chord is the best optimised design, giving the highest efficiency of wind direction at 60.32%.

  12. Influence of design improvements in optimising staffing of NPPs - an Indian experience

    International Nuclear Information System (INIS)

    Bhattacharya, A.S.

    2001-01-01

    Three decades of operating experience in India have led to sustained high performance of NPPs. The staffing modules and policies are standardised. The basic functions of operation, maintenance, technical support and quality assurance are carried out by a team of 727 in-plant persons (for a 2 x 220 MW PHWR station) organised at five levels, in fifty positions across ten job families. The organisational factors that led to the optimisation of staff are described in the companion paper. This optimisation of manpower is a result of continuous learning, aimed at (i) optimising the quantum of workload and (ii) improving productivity. For the first category, design improvements over older Indian NPPs have increased reliability, operability, maintainability and human factors. A few examples: (i) improved man-machine interface in plant controls and an on-power refuelling system with operator guidance, logging as well as diagnostic/health monitoring features; (ii) a spread-out layout for better access and ease of maintenance, separation of plant services for unit-1 from unit-2 and removal of reactor auxiliaries to separate buildings; (iii) reduction of maintenance tasks through redesigned equipment and improved condition monitoring means. However, design and procedural improvements also include additional equipment for the upgradation of safety measures, e.g. a larger number of safety-related pumps, a separate switchyard control room and increased service system equipment. This paper outlines the experience of design improvements in optimising staffing and uses a specific case illustration to establish the findings for better use of staff. (author)

  13. Weight Optimisation of Steel Monopile Foundations for Offshore Windfarms

    DEFF Research Database (Denmark)

    Fog Gjersøe, Nils; Bouvin Pedersen, Erik; Kristensen, Brian

    2015-01-01

    The potential for mass reduction of monopiles in offshore windfarms using current design practice is investigated. Optimisation by sensitivity analysis is carried out for the following important parameters: wall thickness distribution between tower and monopile, soil stiffness, damping ratio...

  14. Active vibration reduction of a flexible structure bonded with optimised piezoelectric pairs using half and quarter chromosomes in genetic algorithms

    International Nuclear Information System (INIS)

    Daraji, A H; Hale, J M

    2012-01-01

    The optimal placement of sensors and actuators in active vibration control is limited by the number of candidates in the search space. The search space of a small structure discretised into one hundred elements, for optimising the location of ten actuators, contains 1.73 × 10^13 possible solutions, one of which is the global optimum. In this work, a new quarter- and half-chromosome technique based on symmetry is developed, by which the search space for optimisation of sensor/actuator locations in active vibration control of flexible structures may be greatly reduced. The technique is applied to the optimisation of eight and ten actuators located on a 500 × 500 mm square plate, for which the search space is reduced by up to 99.99%. The technique also allows the genetic algorithm program to update natural frequencies and mode shapes in each generation, so that the global optimal solution is found in a greatly reduced number of generations. An isotropic plate with piezoelectric sensor/actuator pairs bonded to its surface was investigated using the finite element method and Hamilton's principle based on first-order shear deformation theory. The placement and feedback gains of ten and eight sensor/actuator pairs were optimised for cantilever and clamped-clamped plates to attenuate the first six modes of vibration, using minimisation of a linear quadratic index as the objective function.
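    The search-space figure quoted in this record can be reproduced directly, and the effect of a symmetry-based encoding illustrated. The quarter-space scheme below (three free positions in one quadrant, the rest implied by mirroring) is a hypothetical example, not the paper's exact encoding.

```python
from math import comb

# Choosing 10 actuator locations among 100 candidate elements:
full_space = comb(100, 10)       # 17,310,309,456,440 ~ 1.73e13, as quoted

# Hypothetical fourfold-symmetric layout encoded by free positions in a
# single 25-element quadrant only (assumed scheme, for illustration):
quarter_space = comb(25, 3)      # 2,300 candidates
reduction = 1.0 - quarter_space / full_space
```

    Any symmetric encoding of this kind shrinks the candidate count combinatorially, which is why the record reports a reduction of up to 99.99%.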

  15. Larval feeding inhibition assay – need for optimisation

    DEFF Research Database (Denmark)

    Azuhnwi, Blasius; Desrues, O.; Hoste, H.

    2013-01-01

    for this observed variation in results include: parasite (species/strain); material tested; or season. There is thus a need to optimise the LFIA to permit intra- and inter-laboratory comparison of results. We investigate here whether changes in EC50 values occur over the patency phase of a nematode species using two test...

  16. Optimisation Study on the Production of Anaerobic Digestate ...

    African Journals Online (AJOL)

    Organic fraction of municipal solid waste (OFMSW) is a rich substrate for biogas and compost production. Anaerobic Digestate compost (ADC) is an organic fertilizer produced from stabilized residuals of anaerobic digestion of OFMSW. This paper reports the result of studies carried out to optimise the production of ADC from ...

  17. Design optimisation of powers-of-two FIR filter using self-organising random immigrants GA

    Science.gov (United States)

    Chandra, Abhijit; Chattopadhyay, Sudipta

    2015-01-01

    In this communication, we propose a novel design strategy of multiplier-less low-pass finite impulse response (FIR) filter with the aid of a recent evolutionary optimisation technique, known as the self-organising random immigrants genetic algorithm. Individual impulse response coefficients of the proposed filter have been encoded as sum of signed powers-of-two. During the formulation of the cost function for the optimisation algorithm, both the frequency response characteristic and the hardware cost of the discrete coefficient FIR filter have been considered. The role of crossover probability of the optimisation technique has been evaluated on the overall performance of the proposed strategy. For this purpose, the convergence characteristic of the optimisation technique has been included in the simulation results. In our analysis, two design examples of different specifications have been taken into account. In order to substantiate the efficiency of our proposed structure, a number of state-of-the-art design strategies of multiplier-less FIR filter have also been included in this article for the purpose of comparison. Critical analysis of the result unambiguously establishes the usefulness of our proposed approach for the hardware efficient design of digital filter.
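    The "sum of signed powers-of-two" coefficient encoding underlying this design can be sketched as follows. The greedy routine below is an illustrative stand-in (the paper's genetic algorithm searches the discrete coefficient space directly), and the function names and term/shift limits are assumptions.

```python
# Approximate a filter coefficient by at most `terms` signed powers of two,
# so that multiplication by the coefficient reduces to shifts and adds.
def to_spt(value, terms=4, max_shift=8):
    """Greedy signed-powers-of-two approximation of `value`."""
    rep, residual = [], value
    for _ in range(terms):
        if residual == 0:
            break
        # Pick the signed power of two closest to the current residual.
        s, k = min(
            ((s, k) for s in (+1, -1) for k in range(max_shift + 1)),
            key=lambda sk: abs(residual - sk[0] * 2.0 ** -sk[1]),
        )
        rep.append((s, k))
        residual -= s * 2.0 ** -k
    return rep, residual            # remaining residual = quantisation error

def eval_spt(rep):
    """Value represented by a list of (sign, shift) terms."""
    return sum(s * 2.0 ** -k for s, k in rep)

rep, err = to_spt(0.34375)          # 0.34375 is exactly representable
```

    In hardware, each (sign, shift) pair becomes one shifted add or subtract, which is what makes the resulting FIR filter multiplier-less; the optimisation in the record trades frequency-response accuracy against the number of such terms.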

  18. Group search optimiser-based optimal bidding strategies with no Karush-Kuhn-Tucker optimality conditions

    Science.gov (United States)

    Yadav, Naresh Kumar; Kumar, Mukesh; Gupta, S. K.

    2017-03-01

    General strategic bidding procedure has been formulated in the literature as a bi-level searching problem, in which the offer curve tends to minimise the market clearing function and to maximise the profit. Computationally, this is complex and hence, the researchers have adopted Karush-Kuhn-Tucker (KKT) optimality conditions to transform the model into a single-level maximisation problem. However, the profit maximisation problem with KKT optimality conditions poses a great challenge to the classical optimisation algorithms. The problem has become more complex after the inclusion of transmission constraints. This paper simplifies the profit maximisation problem as a minimisation function, in which the transmission constraints, the operating limits and the ISO market clearing functions are considered with no KKT optimality conditions. The derived function is solved using the group search optimiser (GSO), a robust population-based optimisation algorithm. Experimental investigation is carried out on IEEE 14-bus as well as IEEE 30-bus systems and the performance is compared against differential evolution-based strategic bidding, genetic algorithm-based strategic bidding and particle swarm optimisation-based strategic bidding methods. The simulation results demonstrate that the profit obtained through GSO-based bidding strategies is higher than with the other three methods.

  19. An empirical study on website usability elements and how they affect search engine optimisation

    Directory of Open Access Journals (Sweden)

    Eugene B. Visser

    2011-03-01

    Full Text Available The primary objective of this research project was to identify and investigate the website usability attributes which are in contradiction with search engine optimisation elements. The secondary objective was to determine if these usability attributes affect conversion. Although the literature review identifies the contradictions, experts disagree about their existence. An experiment was conducted whereby the conversion and/or traffic ratio results of an existing control website were compared to a usability-designed version of the control website, namely the experimental website. All optimisation elements were ignored, thus implementing only usability. The results clearly show that inclusion of the usability attributes positively affects conversion, indicating that usability is a prerequisite for effective website design. Search engine optimisation is also a prerequisite, for the very reason that if a website does not rank on the first page of the search engine result page for a given keyword, then that website might as well not exist. According to this empirical work, usability is in contradiction with search engine optimisation best practices. Therefore the two need to be weighed up in terms of their importance to search engines and visitors.

  20. Optimisation of optical receiver for 10 Gbit/s optical duobinary transmission system

    DEFF Research Database (Denmark)

    Zheng, Xueyan; Liu, Fenghai; Jeppesen, Palle

    2001-01-01

    Optimisation of a receiver for an optical duobinary signal is studied numerically. It is shown that a conventional receiver is not optimal, whether or not a DCF is used before the receiver. The optimum receiver for an optical duobinary system is identified.