WorldWideScience

Sample records for gene optimised adenovirus-based

  1. Improving Adenovirus Based Gene Transfer: Strategies to Accomplish Immune Evasion

    Directory of Open Access Journals (Sweden)

    Andrea Amalfitano

    2010-09-01

    Full Text Available Adenovirus (Ad)-based gene transfer vectors continue to be the platform of choice for an increasing number of clinical trials worldwide. In fact, within the last five years, the number of clinical trials that utilize Ad-based vectors has doubled, indicating growing enthusiasm for the numerous positive characteristics of this gene transfer platform. For example, Ad vectors can be easily and relatively inexpensively produced to high titers in a cGMP compliant manner, can be stably stored and transported, and have a broad applicability for a wide range of clinical conditions, including both gene therapy and vaccine applications. Ad vector-based gene transfer will become more useful as strategies to counteract innate and/or pre-existing adaptive immune responses to Ads are developed and confirmed to be efficacious. The approaches attempting to overcome these limitations can be divided into two broad categories: pre-emptive immune modulation of the host, and selective modification of the Ad vector itself. The first category of methods includes the use of immunosuppressive drugs or specific compounds to block important immune pathways, which are known to be induced by Ads. The second category comprises several innovative strategies inclusive of: (1) Ad-capsid-display of specific inhibitors or ligands; (2) covalent modifications of the entire Ad vector capsid moiety; (3) the use of tissue specific promoters and local administration routes; (4) the use of genome modified Ads; and (5) the development of chimeric or alternative serotype Ads. This review article will focus on both the promise and the limitations of each of these immune evasion strategies, and in the process delineate future directions in developing safer and more efficacious Ad-based gene transfer strategies.

  2. Adenovirus-based p53 gene therapy in ovarian cancer.

    Science.gov (United States)

    Santoso, J T; Tang, D C; Lane, S B; Hung, J; Reed, D J; Muller, C Y; Carbone, D P; Lucci, J A; Miller, D S; Mathis, J M

    1995-11-01

    Mutations of the p53 tumor suppressor gene are the most common molecular genetic abnormality to be described in ovarian cancer. To determine the feasibility of mutant p53 as a molecular target for gene therapy in ovarian cancer, we constructed an adenovirus vector containing the wild-type p53 gene. The ability of this adenovirus construct (Ad-CMV-p53) to express p53 protein was examined by Western blot analysis in the H358 lung cancer cell line, which has a homozygous deletion of the p53 gene. The ability of the adenovirus vector system to infect ovarian cancer cells was tested using an adenovirus containing the beta-galactosidase reporter gene under the control of the CMV promoter (Ad-CMV-beta gal). The ovarian cancer cell line 2774, which contains an Arg273His p53 mutation, was infected with Ad-CMV-beta gal, and the infected cells were assayed for beta-galactosidase activity after 24 hr. To test the ability of wild-type p53 to inhibit cell growth, the 2774 cell line was infected with Ad-CMV-p53 or Ad-CMV-beta gal, and the effect of these agents on the growth of 2774 cells was determined using an in vitro growth inhibition assay. Western blot analysis of lysates from H358 cells infected with Ad-CMV-p53 showed expression of wild-type p53 protein. When 2774 cells were infected with Ad-CMV-beta gal at a multiplicity of infection (m.o.i.) of 10 PFU/cell, > 90% of cells showed beta-galactosidase activity, demonstrating that these cells are capable of efficient infection by the adenovirus vector. Growth of 2774 cells infected with Ad-CMV-p53 was inhibited by > 90% compared to noninfected cells. The ability of the adenovirus vector to mediate high-level expression of the transferred genes and the inhibitory effect of Ad-CMV-p53 on the 2774 cell line suggest that Ad-CMV-p53 could be further developed into a therapeutic agent for ovarian cancer.

  3. A broadly applicable method to characterize large DNA viruses and adenoviruses based on the DNA polymerase gene

    Directory of Open Access Journals (Sweden)

    Montgomery Roy D

    2006-04-01

    Full Text Available Abstract Background Many viral pathogens are poorly characterized, are difficult to culture or reagents are lacking for confirmatory diagnoses. We have developed and tested a robust assay for detecting and characterizing large DNA viruses and adenoviruses. The assay is based on the use of degenerate PCR to target a gene common to these viruses, the DNA polymerase, and sequencing the products. Results We evaluated our method by applying it to fowl adenovirus isolates, catfish herpesvirus isolates, and largemouth bass ranavirus (iridovirus) from cell culture, and lymphocystis disease virus (iridovirus) and avian poxvirus from tissue. All viruses with the exception of avian poxvirus produced the expected product. After optimization of extraction procedures, and after designing and applying an additional primer, we were able to produce polymerase gene product from the avian poxvirus genome. The sequence data that we obtained demonstrated the simplicity and potential of the method for routine use in characterizing large DNA viruses. The adenovirus samples were demonstrated to represent 2 types of fowl adenovirus, fowl adenovirus 1 and an uncharacterized avian adenovirus most similar to fowl adenovirus 9. The herpesvirus isolate from blue catfish was shown to be similar to channel catfish virus (Ictalurid herpesvirus 1). The case isolate of largemouth bass ranavirus was shown to exactly match the type specimen and both were similar to tiger frog virus and frog virus 3. The lymphocystis disease virus isolate from largemouth bass was shown to be related to, but distinct from, the two previously characterized lymphocystis disease virus isolates, suggesting that it may represent a distinct lymphocystis disease virus species. Conclusion The method developed is rapid and broadly applicable to cell culture isolates and infected tissues. Targeting a specific gene in the large DNA viruses and adenoviruses provides a common reference for grouping the newly identified…

  4. Gene order computation using Alzheimer's DNA microarray gene expression data and the Ant Colony Optimisation algorithm.

    Science.gov (United States)

    Pang, Chaoyang; Jiang, Gang; Wang, Shipeng; Hu, Benqiong; Liu, Qingzhong; Deng, Youping; Huang, Xudong

    2012-01-01

    As Alzheimer's Disease (AD) is the most common form of dementia, the study of AD-related genes via biocomputation is an important research topic. One method of studying AD-related genes is to cluster similar genes together into a gene order. Gene order is a good clustering method, as its results can be globally optimal, while other clustering methods are only locally optimal. Herein we use the Ant Colony Optimisation (ACO)-based algorithm to calculate the gene order from an Alzheimer's DNA microarray dataset. We test it with four distance measurements: Pearson distance, Spearman distance, Euclidean distance, and squared Euclidean distance. Our computational results indicate that different distance formulas generate gene orders of different quality, and that the squared Euclidean distance approach produced the optimal AD-related gene order.
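
    The record itself provides no code. Purely as an illustration of the four distance measurements being compared, the sketch below (plain Python with NumPy/SciPy on a toy genes-by-samples matrix; it is not the authors' ACO implementation, and the ant-colony ordering step is omitted) computes the Pearson, Spearman, Euclidean and squared Euclidean distances between two expression profiles.

```python
# Illustrative sketch (not from the paper): the four inter-gene distance
# measures compared in the study, applied to a toy genes-by-samples matrix.
import numpy as np
from scipy.stats import pearsonr, spearmanr

def gene_distances(x, y):
    """Return Pearson, Spearman, Euclidean and squared Euclidean distances."""
    pearson_d = 1.0 - pearsonr(x, y)[0]      # 1 - correlation used as a distance
    spearman_d = 1.0 - spearmanr(x, y)[0]
    euclid_d = float(np.linalg.norm(x - y))
    return pearson_d, spearman_d, euclid_d, euclid_d ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    expr = rng.normal(size=(5, 20))          # 5 genes measured on 20 arrays
    print(gene_distances(expr[0], expr[1]))
```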

  5. Optimising parallel R correlation matrix calculations on gene expression data using MapReduce.

    Science.gov (United States)

    Wang, Shicai; Pandis, Ioannis; Johnson, David; Emam, Ibrahim; Guitton, Florian; Oehmichen, Axel; Guo, Yike

    2014-11-05

    High-throughput molecular profiling data has been used to improve clinical decision making by stratifying subjects based on their molecular profiles. Unsupervised clustering algorithms can be used for stratification purposes. However, the current speed of the clustering algorithms cannot meet the requirement of large-scale molecular data due to poor performance of the correlation matrix calculation. With high-throughput sequencing technologies promising to produce even larger datasets per subject, we expect the performance of the state-of-the-art statistical algorithms to be further impacted unless efforts towards optimisation are carried out. MapReduce is a widely used high performance parallel framework that can solve the problem. In this paper, we evaluate the current parallel modes for correlation calculation methods and introduce an efficient data distribution and parallel calculation algorithm based on MapReduce to optimise the correlation calculation. We studied the performance of our algorithm using two gene expression benchmarks. In the micro-benchmark, our implementation using MapReduce, based on the R package RHIPE, demonstrates a 3.26-5.83-fold increase compared to the default Snowfall and a 1.56-1.64-fold increase compared to the basic RHIPE in the Euclidean, Pearson and Spearman correlations. Though vanilla R and the optimised Snowfall outperform our optimised RHIPE in the micro-benchmark, they do not scale well with the macro-benchmark. In the macro-benchmark the optimised RHIPE performs 2.03-16.56 times faster than vanilla R. Benefiting from the 3.30-5.13 times faster data preparation, the optimised RHIPE performs 1.22-1.71 times faster than the optimised Snowfall. Both the optimised RHIPE and the optimised Snowfall successfully perform the Kendall correlation on the TCGA dataset within 7 hours, and both run more than 30 times faster than the estimated vanilla R runtime. The performance evaluation found that the new MapReduce algorithm and its…
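
    The authors' implementation is R code running on Hadoop through RHIPE and Snowfall; none of it is reproduced here. The sketch below is only a language-neutral illustration of the underlying idea, computing a gene-by-gene Pearson correlation matrix in independent row blocks, with Python's multiprocessing standing in for the MapReduce layer.

```python
# Illustrative sketch only: block-wise computation of a Pearson correlation
# matrix, the kind of work the paper distributes with MapReduce/RHIPE.
# Python multiprocessing is used here as a toy stand-in for Hadoop.
import numpy as np
from multiprocessing import Pool

def _row_block(args):
    rows, z = args                       # z is the z-scored genes-by-samples matrix
    return z[rows] @ z.T / z.shape[1]    # correlations of these rows vs all genes

def parallel_correlation(expr, n_workers=4):
    """expr: genes x samples matrix; returns the genes x genes Pearson matrix."""
    z = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)
    blocks = np.array_split(np.arange(expr.shape[0]), n_workers)
    with Pool(n_workers) as pool:
        parts = pool.map(_row_block, [(b, z) for b in blocks])
    return np.vstack(parts)

if __name__ == "__main__":
    data = np.random.default_rng(1).normal(size=(200, 50))
    corr = parallel_correlation(data)
    print(corr.shape, bool(np.allclose(np.diag(corr), 1.0)))
```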

  6. Adenovirus-based vaccine against Listeria monocytogenes

    DEFF Research Database (Denmark)

    Jensen, Søren; Steffensen, Maria Abildgaard; Jensen, Benjamin Anderschou Holbech

    2013-01-01

    The use of replication-deficient adenoviruses as vehicles for transfer of foreign genes offers many advantages in a vaccine setting, eliciting strong cellular immune responses involving both CD8(+) and CD4(+) T cells. Further improving the immunogenicity, tethering of the inserted target Ag to MHC...... class II-associated invariant chain (Ii) greatly enhances both the presentation of most target Ags, as well as overall protection against viral infection, such as lymphocytic choriomeningitis virus (LCMV). The present study extends this vaccination concept to include protection against intracellular...... bacteria, using Listeria monocytogenes as a model organism. Protection in C57BL/6 mice against recombinant L. monocytogenes expressing an immunodominant epitope of the LCMV glycoprotein (GP33) was greatly accelerated, augmented, and prolonged following vaccination with an adenoviral vaccine encoding GP...

  7. Optimising gene therapy of hypoparathyroidism with hematopoietic stem cells

    Institute of Scientific and Technical Information of China (English)

    ZHOU Yi; LÜ Bing-jie; XU Ping; SONG Chun-fang

    2005-01-01

    Background The treatment of hypoparathyroidism (HPT) is still a difficult clinical problem, which necessitates a new therapy. Gene therapy of HPT has been valuable, but how to improve the gene transfer efficiency and expression stability is a problem. This study was designed to optimize the gene therapy of HPT with hematopoietic stem cells (HSCs) recombined with the parathyroid hormone (PTH) gene. Methods The human PTH gene was amplified by polymerase chain reaction (PCR) from pcDNA3.1-PTH vectors and inserted into murine stem cell virus (MSCV) vectors with double enzyme digestion (EcoRI and XhoI). The recombinant vectors were transfected into PA317 packaging cell lines by the lipofectin method and screened by G418 selective medium. The condensed recombinant retroviruses were extracted and used to infect HSCs, which were injected into mice suffering from HPT. The change of symptoms and serum levels of PTH and calcium in each group of mice were investigated. Results The human PTH gene was inserted into MSCV vectors successfully and the titres were up to 2×10⁷ colony forming units (CFU)/ml in condensed retroviral solution. The secretion of PTH reached 15 ng per 10⁶ cells per 48 hours. The wild type viruses were not detected via PCR amplification, so they were safe for use. The mice suffering from HPT recovered quickly and the serum levels of calcium and PTH remained normal for about three months after the HSCs recombined with PTH were injected into them. The therapeutic effect of this method was better than simple recombinant retrovirus injection. Conclusions The recombinant retroviral vectors MSCV-PTH and the high-titre condensed retroviral solution recombined with the PTH gene are obtained. The recombinant retroviral solution could infect HSCs at a high rate of efficiency. The infected HSCs could cure HPT in mice. This method has provided theoretical evidence for the clinical gene therapy of HPT.

  8. Large-scale codon de-optimisation of the p29 replicase gene by synonymous substitutions causes a loss of infectivity of melon necrotic spot virus.

    Science.gov (United States)

    Usami, Atsushi; Mochizuki, Tomofumi; Tsuda, Shinya; Ohki, Satoshi T

    2013-09-01

    The effect of synonymous substitutions in the melon necrotic spot virus p29 replicase gene on viral pathogenicity was investigated. The codons in the p29 gene were replaced by the least frequently used synonymous codons in Arabidopsis thaliana or melons. Mechanical inoculation of melon with p29 variants resulted in a loss of viral infectivity when all, one-half, or one-quarter of the gene was de-optimised. The effect of de-optimisation in one-sixth of the gene differed depending on the de-optimised region. These results demonstrate that large-scale codon bias de-optimisation of the p29 gene without amino acid substitutions alters viral infectivity.
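
    As an illustration of the kind of manipulation described (not the authors' actual pipeline), the sketch below swaps each codon in a chosen region for the least-used synonymous codon from a host codon-usage table. The usage values, sequence and region are made-up placeholders, not real Arabidopsis thaliana or melon data.

```python
# Illustrative sketch (not the authors' pipeline): replace codons in a chosen
# region with the least-used synonymous codon from a host codon-usage table.
# The table values and sequences below are placeholders only.
CODON_TABLE = {  # amino acid -> {codon: relative usage frequency}
    "M": {"ATG": 1.00},
    "F": {"TTT": 0.45, "TTC": 0.55},
    "L": {"TTA": 0.08, "TTG": 0.21, "CTT": 0.26,
          "CTC": 0.17, "CTA": 0.11, "CTG": 0.17},
}
CODON_TO_AA = {c: aa for aa, codons in CODON_TABLE.items() for c in codons}

def deoptimise(cds, region=slice(None)):
    """Swap each codon in `region` for its rarest synonym (protein unchanged)."""
    codons = [cds[i:i + 3] for i in range(0, len(cds), 3)]
    for i in range(*region.indices(len(codons))):
        aa = CODON_TO_AA[codons[i]]
        codons[i] = min(CODON_TABLE[aa], key=CODON_TABLE[aa].get)
    return "".join(codons)

print(deoptimise("ATGTTCCTG"))               # full-length de-optimisation: ATGTTTTTA
print(deoptimise("ATGTTCCTG", slice(1, 2)))  # only the second codon: ATGTTTCTG
```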

  9. Adenovirus-based vaccines against avian-origin H5N1 influenza viruses.

    Science.gov (United States)

    He, Biao; Zheng, Bo-jian; Wang, Qian; Du, Lanying; Jiang, Shibo; Lu, Lu

    2015-02-01

    Since 1997, human infection with avian H5N1, which has a mortality rate of about 60%, has posed a threat to public health. In this review, we describe the epidemiology of H5N1 transmission, the advantages and disadvantages of different influenza vaccine types, and the characteristics of adenovirus, finally summarizing advances in adenovirus-based H5N1 systemic and mucosal vaccines.

  10. p53 Gene repair with zinc finger nucleases optimised by yeast 1-hybrid and validated by Solexa sequencing.

    Directory of Open Access Journals (Sweden)

    Frank Herrmann

    Full Text Available The tumor suppressor gene p53 is mutated or deleted in over 50% of human tumors. As functional p53 plays a pivotal role in protecting against cancer development, several strategies for restoring wild-type (wt) p53 function have been investigated. In this study, we applied an approach using gene repair with zinc finger nucleases (ZFNs). We adapted a commercially-available yeast one-hybrid (Y1H) selection kit to allow rapid building and optimization of 4-finger constructs from randomized PCR libraries. We thus generated novel functional zinc finger nucleases against two DNA sites in the human p53 gene, near cancer mutation 'hotspots'. The ZFNs were first validated using in vitro cleavage assays and in vivo episomal gene repair assays in HEK293T cells. Subsequently, the ZFNs were used to restore wt-p53 status in the SF268 human cancer cell line, via ZFN-induced homologous recombination. The frequency of gene repair and mutation by non-homologous end-joining was then ascertained in several cancer cell lines, using a deep sequencing strategy. Our Y1H system facilitates the generation and optimisation of novel, sequence-specific four- to six-finger peptides, and the p53-specific ZFN described here can be used to mutate or repair p53 in genomic loci.

  11. p53 Gene Repair with Zinc Finger Nucleases Optimised by Yeast 1-Hybrid and Validated by Solexa Sequencing

    Science.gov (United States)

    Herrmann, Frank; Garriga-Canut, Mireia; Baumstark, Rebecca; Fajardo-Sanchez, Emmanuel; Cotterell, James; Minoche, André; Himmelbauer, Heinz; Isalan, Mark

    2011-01-01

    The tumor suppressor gene p53 is mutated or deleted in over 50% of human tumors. As functional p53 plays a pivotal role in protecting against cancer development, several strategies for restoring wild-type (wt) p53 function have been investigated. In this study, we applied an approach using gene repair with zinc finger nucleases (ZFNs). We adapted a commercially-available yeast one-hybrid (Y1H) selection kit to allow rapid building and optimization of 4-finger constructs from randomized PCR libraries. We thus generated novel functional zinc finger nucleases against two DNA sites in the human p53 gene, near cancer mutation ‘hotspots’. The ZFNs were first validated using in vitro cleavage assays and in vivo episomal gene repair assays in HEK293T cells. Subsequently, the ZFNs were used to restore wt-p53 status in the SF268 human cancer cell line, via ZFN-induced homologous recombination. The frequency of gene repair and mutation by non-homologous end-joining was then ascertained in several cancer cell lines, using a deep sequencing strategy. Our Y1H system facilitates the generation and optimisation of novel, sequence-specific four- to six-finger peptides, and the p53-specific ZFN described here can be used to mutate or repair p53 in genomic loci. PMID:21695267

  12. Enhanced protection against Ebola virus mediated by an improved adenovirus-based vaccine.

    Directory of Open Access Journals (Sweden)

    Jason S Richardson

    Full Text Available BACKGROUND: The Ebola virus is transmitted by direct contact with bodily fluids of infected individuals, eliciting death rates as high as 90% among infected humans. Currently, a replication-defective adenovirus-based Ebola vaccine is being studied in a phase I clinical trial. Another Ebola vaccine, based on an attenuated vesicular stomatitis virus, has shown efficacy in post-exposure treatment of nonhuman primates against Ebola infection. In this report, we modified the common recombinant adenovirus serotype 5-based Ebola vaccine expressing the wild-type ZEBOV glycoprotein sequence from a CMV promoter (Ad-CMVZGP). The immune response elicited by this improved expression cassette vector (Ad-CAGoptZGP) and its ability to afford protection against lethal ZEBOV challenge in mice were compared to those of the standard Ad-CMVZGP vector. METHODOLOGY/PRINCIPAL FINDINGS: Ad-CMVZGP was previously shown to protect mice, guinea pigs and nonhuman primates from an otherwise lethal challenge of Zaire ebolavirus. The antigenic expression cassette of this vector was improved through codon optimization, inclusion of a consensus Kozak sequence and reconfiguration of a CAG promoter (Ad-CAGoptZGP). Expression of GP from Ad-CAGoptZGP was substantially higher than from Ad-CMVZGP. Ad-CAGoptZGP significantly improved T and B cell responses at doses 10 to 100-fold lower than that needed with Ad-CMVZGP. Additionally, Ad-CAGoptZGP afforded full protection in mice against lethal challenge at a dose 100 times lower than the dose required for Ad-CMVZGP. Finally, Ad-CAGoptZGP induced full protection in mice when given 30 minutes post-challenge. CONCLUSIONS/SIGNIFICANCE: We describe an improved adenovirus-based Ebola vaccine capable of affording post-exposure protection against lethal challenge in mice. The molecular modifications of the new improved vaccine also translated into the induction of significantly enhanced immune responses and complete protection at a dose 100 times lower than with the…

  13. Optimising homing endonuclease gene drive performance in a semi-refractory species: the Drosophila melanogaster experience.

    Directory of Open Access Journals (Sweden)

    Yuk-Sang Chan

    Full Text Available Homing endonuclease gene (HEG) drive is a promising insect population control technique that employs meganucleases to impair the fitness of pest populations. Our previous studies showed that HEG drive was more difficult to achieve in Drosophila melanogaster than in Anopheles gambiae, and we therefore investigated ways of improving homing performance in Drosophila. We show that homing in Drosophila responds to increased expression of HEGs specifically during the spermatogonia stage and that this could be achieved through improved construct design. We found that 3'-UTR choice was important to maximise expression levels, with HEG activity increasing as we employed Hsp70, SV40, vasa and βTub56D derived UTRs. We also searched for spermatogonium-specific promoters and found that the Rcd-1r promoter was able to drive specific expression at this stage. Since Rcd-1 is a regulator of differentiation in other species, this suggests that Rcd-1r may serve a similar role during spermatogonial differentiation in Drosophila. Contrary to expectations, a fragment containing the entire region between the TBPH gene and the bgcn translational start drove strong HEG expression only during late spermatogenesis rather than in the germline stem cells and spermatogonia as expected. We also observed that the fraction of targets undergoing homing was temperature-sensitive, falling nearly four-fold when the temperature was lowered to 18°C. Taken together, this study demonstrates how a few simple measures can lead to substantial improvements in the HEG-based gene drive strategy and reinforces the idea that the HEG approach may be widely applicable to a variety of insect control programs.

  14. An optimised direct lysis method for gene expression studies on low cell numbers.

    Science.gov (United States)

    Le, Anh Viet-Phuong; Huang, Dexing; Blick, Tony; Thompson, Erik W; Dobrovic, Alexander

    2015-08-05

    There is increasing interest in gene expression analysis of either single cells or limited numbers of cells. One such application is the analysis of harvested circulating tumour cells (CTCs), which are often present in very low numbers. A highly efficient protocol for RNA extraction, which involves a minimal number of steps to avoid RNA loss, is essential for low input cell numbers. We compared several lysis solutions that enable reverse transcription (RT) to be performed directly on the cell lysate, offering a simple rapid approach to minimise RNA loss for RT. The lysis solutions were assessed by reverse transcription quantitative polymerase chain reaction (RT-qPCR) in low cell numbers isolated from four breast cancer cell lines. We found that a lysis solution containing both the non-ionic detergent (IGEPAL CA-630, chemically equivalent to Nonidet P-40 or NP-40) and bovine serum albumin (BSA) gave the best RT-qPCR yield. This direct lysis to reverse transcription protocol outperformed a column-based extraction method using a commercial kit. This study demonstrates a simple, reliable, time- and cost-effective method that can be widely used in any situation where RNA needs to be prepared from low to very low cell numbers.

  15. Optimisation of reference genes for gene-expression analysis in a rabbit model of left ventricular diastolic dysfunction.

    Directory of Open Access Journals (Sweden)

    Walid Nachar

    Full Text Available Left ventricular diastolic dysfunction (LVDD) is characterized by the disturbance of the ventricle's performance due to its abnormal relaxation or to its increased stiffness during the diastolic phase. The molecular mechanisms underlying LVDD remain unknown. We aimed to identify normalization genes for accurate gene-expression analysis of LVDD using quantitative real-time PCR (RT-PCR) in a new rabbit model of LVDD. Eighteen rabbits were fed with a normal diet (n = 7) or a 0.5% cholesterol-enriched diet supplemented with vitamin D2 (n = 11) for an average of 14.5 weeks. We validated the presence of LVDD in this model using echocardiography for diastolic function assessment. RT-PCR was performed using cDNA derived from left ventricle samples to measure the stability of 10 genes as candidate reference genes (Gapdh, Hprt1, Ppia, Sdha, Rpl5, Actb, Eef1e1, Ywhaz, Pgk1, and G6pd). Using geNorm analysis, we report that the Sdha, Gapdh and Hprt1 genes had the highest stability (M < 0.2). By contrast, the Hprt1 and Rpl5 genes were found to represent the best combination for normalization when using the Normfinder algorithm (stability value of 0.042). Comparison of both normalization strategies highlighted an increase in natriuretic peptides (Bnp and Anp), monocyte chemotactic protein-1 (Mcp-1) and NADPH oxidase subunit (Nox-2) mRNA expression in ventricle samples of the hypercholesterolemic rabbits compared to controls (P < 0.05). This increase correlates with LVDD echocardiographic parameters and, most importantly, it molecularly validates the presence of the disease in our model. This is the first study emphasizing the selection of stable reference genes for RT-PCR normalization in a rabbit model of LVDD.
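
    For readers unfamiliar with the geNorm stability value M quoted above, the toy sketch below computes a simplified version of it: the average standard deviation of the pairwise log2 expression ratios of one candidate reference gene against all the others (lower M means a more stable reference). It is an illustrative approximation, not the geNorm or NormFinder software, and the data are random placeholders.

```python
# Illustrative sketch only: a simplified geNorm-style stability value M for a
# panel of candidate reference genes (genes x samples matrix of log2 values).
import numpy as np

def genorm_m(expr_log2):
    """Average standard deviation of log2 ratios of each gene vs the others."""
    n_genes = expr_log2.shape[0]
    m_values = []
    for i in range(n_genes):
        ratios = expr_log2[i] - np.delete(expr_log2, i, axis=0)  # log ratios vs others
        m_values.append(np.mean(np.std(ratios, axis=1, ddof=1)))
    return np.array(m_values)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    candidates = rng.normal(loc=10, scale=0.3, size=(10, 18))  # 10 candidates, 18 samples
    print(np.round(genorm_m(candidates), 3))                   # lower M = more stable
```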

  16. Optimisation of tomato Micro-tom regeneration and selection on glufosinate/Basta and dependency of gene silencing on transgene copy number.

    Science.gov (United States)

    Khuong, Thi Thu Huong; Crété, Patrice; Robaglia, Christophe; Caffarri, Stefano

    2013-09-01

    An efficient protocol for the transformation and selection of transgenic lines of Micro-tom, a widespread model cultivar for tomato, is reported. RNA interference silencing efficiency and stability have been investigated and correlated with the number of insertions. Given its small size and ease of cultivation, the tomato (Solanum lycopersicon) cultivar Micro-tom is of widespread use as a model tomato plant. To create and screen transgenic plants, different selectable markers are commonly used. The bar marker, which confers resistance to the herbicide glufosinate/Basta, has many advantages, but it has been little used, and with low efficiency, for the identification of transgenic tomato plants. Here we describe a procedure for accurate selection of transgenic Micro-tom both in vitro and in soil. Immunoblot, Southern blot and phenotypic analyses showed that 100% of herbicide-resistant plants were transgenic. In addition, improved regeneration was obtained by using 2 mg/l gibberellic acid in the shoot elongation medium; rooting optimisation on medium containing 1 mg/l IAA allowed up to 97% of shoots to develop strong and very healthy roots after only 10 days. Stable transformation frequency by infection of leaf explants with Agrobacterium reached 12%. Shoots were induced by a combination of 1 mg/l trans-zeatin and 0.1 mg/l IAA. Somatic embryogenesis of cotyledons on medium containing 1 mg/l zeatin + 2 mg/l IAA is also described in Micro-tom. The photosynthetic psbS gene has been used as a reporter gene for RNA silencing studies. The efficiency of gene silencing was found to be equivalent using three different target gene fragments of 519, 398 and 328 bp. Interestingly, silencing efficiency decreased from T0 to the T3 generation in plants containing multiple copies of the inserted T-DNA, while it was stable in plants containing a single insertion.

  17. Optimisation of a 96-well electroporation assay for postnatal rat CNS neurons suitable for cost-effective medium-throughput screening of genes that promote neurite outgrowth

    Directory of Open Access Journals (Sweden)

    Thomas Hutson

    2011-12-01

    Full Text Available Following an injury, central nervous system (CNS) neurons show a very limited regenerative response, which results in their failure to successfully form functional connections with their original target. This is due in part to the reduced intrinsic growth state of CNS neurons, which is characterised by their failure to express key regeneration-associated genes (RAGs), and by the presence of growth inhibitory molecules in the CNS environment that form a molecular and physical barrier to regeneration. Here we have optimised a 96-well electroporation and neurite outgrowth assay for postnatal rat cerebellar granule neurons cultured upon an inhibitory cellular substrate expressing myelin-associated glycoprotein or a mixture of growth-inhibitory chondroitin sulphate proteoglycans. Optimal electroporation parameters resulted in 25% transfection efficiency and 50% viability for postnatal rat cerebellar granule neurons (CGNs). The neurite outgrowth of transduced neurons was quantitatively measured using a semi-automated image capture and analysis system. Neurite outgrowth was significantly reduced by the inhibitory substrates, an effect that we demonstrated could be partially reversed using a Rho kinase inhibitor. We are now using this assay to screen large sets of RAGs for their ability to increase neurite outgrowth on a variety of growth inhibitory and permissive substrates.

  18. Computer Based Optimisation Rutines

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    In this paper the need for optimisation methods for the laser cutting process has been identified as three different situations. Demands on the optimisation methods for these situations are presented, and one method for each situation is suggested. The adaptation and implementation of the methods...

  20. Optimal optimisation in chemometrics

    NARCIS (Netherlands)

    Hageman, Joseph Albert

    2004-01-01

    The use of global optimisation methods is not straightforward, especially for the more difficult optimisation problems. Solutions have to be found for items such as the evaluation function, representation, step function and meta-parameters, before any useful results can be obtained. This thesis aims

  1. Agrobacterium-mediated transformation in chickpea (Cicer arietinum L.) with an insecticidal protein gene: optimisation of different factors.

    Science.gov (United States)

    Indurker, Shivani; Misra, Hari S; Eapen, Susan

    2010-07-01

    Agrobacterium-mediated transformation in chickpea was developed using strain LBA4404 carrying nptII, uidA and cryIAc genes and transformants selected on Murashige and Skoog's basal medium supplemented with benzyladenine, kinetin and kanamycin. Integration of transgenes was demonstrated using polymerase chain reaction and Southern blot hybridization of T0 plants. The expression of CryIAc delta endotoxin and GUS enzyme was shown by enzyme linked immunosorbent assay and histochemical assay respectively. The transgenic plants (T0) showed more tolerance to infection by Helicoverpa armigera compared to control plants. Various factors such as explant source, cultivar type, different preculture treatment period of explants, co-cultivation period, acetosyringone supplementation, Agrobacterium harboring different plasmids, vacuum infiltration and sonication treatment were tested to study the influence on transformation frequency. The results indicated that use of epicotyl as explant, cultivar ICCC37, Agrobacterium harboring plasmid pHS102 as vector, preculture of explant for 48 h, co-cultivation period of 2 days at 25°C and vacuum infiltration for 15 min produced the best transformation results. Sonication treatment of explants with Agrobacteria for 80 s was found to increase the frequency of transformation.

  2. Isogeometric design optimisation

    NARCIS (Netherlands)

    Nagy, A.P.

    2011-01-01

    Design optimisation is of paramount importance in most engineering, e.g. aeronautical, automotive, or naval, disciplines. Its interdisciplinary character is manifested in the synthesis of geometric modelling, numerical analysis, mathematical programming, and computer sciences. The evolution of the f

  3. Enriching regulatory networks by bootstrap learning using optimised GO-based gene similarity and gene links mined from PubMed abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.; Baddeley, Robert L.; Riensche, Roderick M.; Jensen, Russell S.; Verhagen, Marc; Pustejovsky, James

    2011-02-18

    Transcriptional regulatory networks are being determined using “reverse engineering” methods that infer connections based on correlations in gene state. Corroboration of such networks through independent means such as evidence from the biomedical literature is desirable. Here, we explore a novel approach, a bootstrapping version of our previous Cross-Ontological Analytic method (XOA) that can be used for semi-automated annotation and verification of inferred regulatory connections, as well as for discovery of additional functional relationships between the genes. First, we use our annotation and network expansion method on a biological network learned entirely from the literature. We show how new relevant links between genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. Second, we apply our method to annotation, verification, and expansion of a set of regulatory connections found by the Context Likelihood of Relatedness algorithm.
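
    The XOA measure used in the paper is Gene Ontology based but far more elaborate than what is shown here. Purely to illustrate the flavour of proposing candidate gene-gene links from GO annotation overlap, the sketch below scores pairs by Jaccard similarity of their GO term sets and keeps pairs above a tunable threshold; the gene names and GO identifiers are invented.

```python
# Illustrative sketch only: a very simple GO-annotation-based gene similarity
# (Jaccard over GO term sets) used to propose candidate network links above a
# tuned threshold.  Gene names and annotations are made up for the example.
from itertools import combinations

go_annotations = {
    "geneA": {"GO:0006355", "GO:0003700", "GO:0005634"},
    "geneB": {"GO:0006355", "GO:0003700"},
    "geneC": {"GO:0008152"},
}

def go_similarity(g1, g2):
    a, b = go_annotations[g1], go_annotations[g2]
    return len(a & b) / len(a | b) if a | b else 0.0

def propose_links(threshold=0.5):
    return [(g1, g2, round(go_similarity(g1, g2), 2))
            for g1, g2 in combinations(go_annotations, 2)
            if go_similarity(g1, g2) >= threshold]

print(propose_links())  # [('geneA', 'geneB', 0.67)]
```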

  4. Simulation versus Optimisation

    DEFF Research Database (Denmark)

    Lund, Henrik; Arler, Finn; Østergaard, Poul Alberg

    2017-01-01

    investment optimisation or optimal solutions approach. On the other hand, the analytical simulation or alternatives assessment approach. Awareness of the dissimilar theoretical assumptions behind the models clarifies differences between the models, explains dissimilarities in results, and provides...... a theoretical and methodological foundation for understanding and interpreting results from the two archetypes. Keywords: energy system analysis; investment optimisation models; simulation models; modelling theory; renewable energy...... In recent years, several tools and models have been developed and used for the design and analysis of future national energy systems. Many of these models focus on the integration of various renewable energy resources and the transformation of existing fossil-based energy systems into future...

  5. Optimising AspectJ

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    AspectJ, an aspect-oriented extension of Java, is becoming increasingly popular. However, not much work has been directed at optimising compilers for AspectJ. Optimising AOP languages provides many new and interesting challenges for compiler writers, and this paper identifies and addresses three...... all of the techniques in this paper in abc, our AspectBench Compiler for AspectJ, and we demonstrate significant speedups with empirical results. Some of our techniques have already been integrated into the production AspectJ compiler, ajc 1.2.1....

  6. Optimising Magnetostatic Assemblies

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Smith, Anders

    the optimal remanence distribution with respect to a linear objective functional. Additionally, it is shown here that the same formalism can be applied to the optimisation of the geometry of magnetic systems. Specifically, the border separating the permanent magnet from regions occupied by air or soft...

  7. Optimisation by hierarchical search

    Science.gov (United States)

    Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias

    2015-03-01

    Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general purpose approach to potentially improve the performance of conventional algorithms or special purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.
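
    As a concrete but purely illustrative rendering of "optimising groups of variables in a hierarchical way", the sketch below exhaustively optimises small blocks of variables of a random Ising-like cost with the remaining variables frozen, then merges blocks into larger groups and repeats. It is a toy stand-in under our own assumptions, not the authors' algorithm or any special-purpose hardware mapping.

```python
# Illustrative sketch only: hierarchical group optimisation of a random
# Ising-like cost.  Small groups are optimised exhaustively with the rest
# frozen, then groups are merged and the coarser level is optimised again.
import itertools
import numpy as np

rng = np.random.default_rng(3)
N = 12
J = rng.normal(size=(N, N))
J = (J + J.T) / 2
np.fill_diagonal(J, 0)

def energy(s):
    return -0.5 * s @ J @ s

def optimise_group(s, group):
    """Exhaustively search the spins in `group`, keeping all other spins fixed."""
    best, best_e = s.copy(), energy(s)
    for assignment in itertools.product([-1, 1], repeat=len(group)):
        trial = s.copy()
        trial[group] = assignment
        e = energy(trial)
        if e < best_e:
            best, best_e = trial, e
    return best

s = rng.choice([-1, 1], size=N)
groups = [list(range(i, i + 3)) for i in range(0, N, 3)]   # level 1: size-3 groups
while groups and len(groups[0]) <= 6:                      # toy hierarchy: two levels
    for g in groups:
        s = optimise_group(s, g)
    # merge pairs of groups for the next, coarser level (even group count assumed)
    groups = [groups[i] + groups[i + 1] for i in range(0, len(groups) - 1, 2)]
print("final energy:", energy(s))
```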

  8. Optimisation of Microstrip Antenna

    Directory of Open Access Journals (Sweden)

    H. El Hamchary

    1996-04-01

    Full Text Available When choosing the most appropriate microstrip antenna configuration for particular applications, the kind of excitation of the radiating element is an essential factor that requires careful consideration. For controlling the distribution of energy of the linear or planar array of elements and for coupling energy to the individual elements, a wide variety of feed mechanisms are available. In this paper, coaxial antenna feeding is assumed and the best (optimised) feeding is found. Then, antenna characteristics such as radiation pattern, return loss, input impedance, and VSWR are obtained.

  9. A Global Optimisation Toolbox for Massively Parallel Engineering Optimisation

    CERN Document Server

    Biscani, Francesco; Yam, Chit Hong

    2010-01-01

    A software platform for global optimisation, called PaGMO, has been developed within the Advanced Concepts Team (ACT) at the European Space Agency, and was recently released as an open-source project. PaGMO is built to tackle high-dimensional global optimisation problems, and it has been successfully used to find solutions to real-life engineering problems among which the preliminary design of interplanetary spacecraft trajectories - both chemical (including multiple flybys and deep-space maneuvers) and low-thrust (limited, at the moment, to single phase trajectories), the inverse design of nano-structured radiators and the design of non-reactive controllers for planetary rovers. Featuring an arsenal of global and local optimisation algorithms (including genetic algorithms, differential evolution, simulated annealing, particle swarm optimisation, compass search, improved harmony search, and various interfaces to libraries for local optimisation such as SNOPT, IPOPT, GSL and NLopt), PaGMO is at its core a C++ ...

  10. Simple Combinatorial Optimisation Cost Games

    NARCIS (Netherlands)

    van Velzen, S.

    2005-01-01

    In this paper we introduce the class of simple combinatorial optimisation cost games, which are games associated to {0, 1}-matrices. A coalitional value of a combinatorial optimisation game is determined by solving an integer program associated with this matrix and the characteristic vector of the…

  11. Topology optimised wavelength dependent splitters

    DEFF Research Database (Denmark)

    Hede, K. K.; Burgos Leon, J.; Frandsen, Lars Hagedorn

    A photonic crystal wavelength dependent splitter has been constructed by utilising topology optimisation [1]. The splitter has been fabricated in a silicon-on-insulator material (Fig. 1). The topology optimised wavelength dependent splitter demonstrates promising 3D FDTD simulation results...... This complex photonic crystal structure is very sensitive to small fabrication variations from the expected topology optimised design. A wavelength dependent splitter is an important basic building block for high-performance nanophotonic circuits. [1] J. S. Jensen and O. Sigmund, Appl. Phys. Lett. 84, 2022...

  12. Engineering Optimisation by Cuckoo Search

    CERN Document Server

    Yang, Xin-She

    2010-01-01

    A new metaheuristic optimisation algorithm, called Cuckoo Search (CS), was developed recently by Yang and Deb (2009). This paper presents a more extensive comparison study using some standard test functions and newly designed stochastic test functions. We then apply the CS algorithm to solve engineering design optimisation problems, including the design of springs and welded beam structures. The optimal solutions obtained by CS are far better than the best solutions obtained by an efficient particle swarm optimiser. We will discuss the unique search features used in CS and the implications for further research.
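
    To give a feel for the algorithm being benchmarked, here is a deliberately minimal Cuckoo Search with Lévy flights (Mantegna-style step lengths) minimising the sphere test function. The population size, step scaling and abandonment fraction are arbitrary illustrative choices, and the code is not taken from Yang and Deb's work.

```python
# Illustrative sketch only: a minimal Cuckoo Search with Levy flights
# minimising the sphere function; all parameter choices are arbitrary.
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, beta=1.5, rng=None):
    # Mantegna's algorithm for Levy-stable step lengths
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(f, dim=5, n_nests=15, pa=0.25, iters=300, seed=4):
    rng = np.random.default_rng(seed)
    nests = rng.uniform(-5, 5, size=(n_nests, dim))
    fitness = np.apply_along_axis(f, 1, nests)
    for _ in range(iters):
        best = nests[np.argmin(fitness)]
        # generate a new solution by a Levy flight around a random nest
        i = rng.integers(n_nests)
        new = nests[i] + 0.01 * levy_step(dim, rng=rng) * (nests[i] - best)
        if f(new) < fitness[i]:
            nests[i], fitness[i] = new, f(new)
        # abandon a fraction pa of the worst nests and build new ones
        n_abandon = int(pa * n_nests)
        worst = np.argsort(fitness)[-n_abandon:]
        nests[worst] = rng.uniform(-5, 5, size=(n_abandon, dim))
        fitness[worst] = np.apply_along_axis(f, 1, nests[worst])
    return nests[np.argmin(fitness)], fitness.min()

sphere = lambda x: float(np.sum(x ** 2))
best_x, best_f = cuckoo_search(sphere)
print(round(best_f, 6))
```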

  14. optPBN: An Optimisation Toolbox for Probabilistic Boolean Networks

    Science.gov (United States)

    Trairatphisan, Panuwat; Mizera, Andrzej; Pang, Jun; Tantar, Alexandru Adrian; Sauter, Thomas

    2014-01-01

    Background There exist several computational tools which allow for the optimisation and inference of biological networks using a Boolean formalism. Nevertheless, the results from such tools yield only limited quantitative insights into the complexity of biological systems because of the inherently qualitative nature of Boolean networks. Results We introduce optPBN, a Matlab-based toolbox for the optimisation of probabilistic Boolean networks (PBN) which operates under the framework of the BN/PBN toolbox. optPBN offers an easy generation of probabilistic Boolean networks from rule-based Boolean model specification and it allows for flexible measurement data integration from multiple experiments. Subsequently, optPBN generates integrated optimisation problems which can be solved by various optimisers. In terms of functionality, optPBN allows for the construction of a probabilistic Boolean network from a given set of potential constitutive Boolean networks by optimising the selection probabilities for these networks so that the resulting PBN fits experimental data. Furthermore, the optPBN pipeline can also be operated on large-scale computational platforms to solve complex optimisation problems. Apart from exemplary case studies in which we correctly inferred the original network, we also successfully applied optPBN to study a large-scale Boolean model of apoptosis, where it allowed us to quantitatively identify the inverse correlation between UVB irradiation, NFκB and Caspase 3 activation, and apoptosis in primary hepatocytes. Also, the results from optPBN help elucidate the relevance of crosstalk interactions in the apoptotic network. Summary The optPBN toolbox provides a simple yet comprehensive pipeline for integrated optimisation problem generation in the PBN formalism that can readily be solved by various optimisers on local or grid-based computational platforms. optPBN can be further applied to various biological studies such as the inference of gene regulatory…
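
    optPBN itself is a MATLAB toolbox and is not reproduced here. Purely to illustrate the probabilistic Boolean network formalism it optimises, the sketch below simulates a three-node toy PBN in which each node picks one of its candidate Boolean update rules according to a selection probability at every synchronous step; fitting those selection probabilities to measurement data is the optimisation problem the toolbox generates. The network, rules and probabilities are invented for the example.

```python
# Illustrative sketch only: simulating a tiny probabilistic Boolean network.
# Each node owns candidate Boolean update rules with selection probabilities.
import numpy as np

rng = np.random.default_rng(5)

PBN = {  # node -> list of (Boolean rule, selection probability); values invented
    "A": [(lambda s: s["C"],            0.7),
          (lambda s: not s["B"],        0.3)],
    "B": [(lambda s: s["A"] and s["C"], 1.0)],
    "C": [(lambda s: not s["A"],        0.6),
          (lambda s: s["B"],            0.4)],
}

def step(state):
    new = {}
    for node, funcs in PBN.items():
        probs = [p for _, p in funcs]
        f = funcs[rng.choice(len(funcs), p=probs)][0]   # pick one rule per node
        new[node] = bool(f(state))
    return new

state = {"A": True, "B": False, "C": True}
for _ in range(5):
    state = step(state)
    print(state)
```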

  15. Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    The problem in optimising the laser cutting process is outlined. Basic optimisation criteria and principles for adapting an optimisation method, the simplex method, are presented. The results of implementing a response function in the optimisation are discussed with respect to the quality as well...

  16. Isogeometric Analysis and Shape Optimisation

    DEFF Research Database (Denmark)

    Gravesen, Jens; Evgrafov, Anton; Gersborg, Allan Roulund

    obtained and also some of the problems we have encountered. One of these problems is that the geometry of the shape is given by the boundary alone. And it is the parametrisation of the boundary which is changed by the optimisation procedure. But isogeometric analysis requires a parametrisation...... One of the attractive features of isogeometric analysis is the exact representation of the geometry. The geometry is furthermore given by a relatively low number of control points and this makes isogeometric analysis an ideal basis for shape optimisation. I will describe some of the results we have...... of the whole domain. So in every optimisation cycle we need to extend a parametrisation of the boundary of a domain to the whole domain. It has to be fast in order not to slow the optimisation down, but it also has to be robust and give a parametrisation of high quality. These are conflicting requirements, so we...

  17. Turbulence optimisation in stellarator experiments

    Energy Technology Data Exchange (ETDEWEB)

    Proll, Josefine H.E. [Max-Planck/Princeton Center for Plasma Physics (Germany); Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstr. 1, 17491 Greifswald (Germany); Faber, Benjamin J. [HSX Plasma Laboratory, University of Wisconsin-Madison, Madison, WI 53706 (United States); Helander, Per; Xanthopoulos, Pavlos [Max-Planck/Princeton Center for Plasma Physics (Germany); Lazerson, Samuel A.; Mynick, Harry E. [Plasma Physics Laboratory, Princeton University, P.O. Box 451 Princeton, New Jersey 08543-0451 (United States)

    2015-05-01

    Stellarators, the twisted siblings of the axisymmetric fusion experiments called tokamaks, have historically suffered from confining the heat of the plasma insufficiently compared with tokamaks and were therefore considered to be less promising candidates for a fusion reactor. This has changed, however, with the advent of stellarators in which the laminar transport is reduced to levels below that of tokamaks by shaping the magnetic field accordingly. As in tokamaks, the turbulent transport remains as the now dominant transport channel. Recent analytical theory suggests that the large configuration space of stellarators allows for an additional optimisation of the magnetic field to also reduce the turbulent transport. In this talk, the idea behind the turbulence optimisation is explained. We also present how an optimised equilibrium is obtained and how it might differ from the equilibrium field of an already existing device, and we compare experimental turbulence measurements in different configurations of the HSX stellarator in order to test the optimisation procedure.

  18. Optimisation of load control

    Energy Technology Data Exchange (ETDEWEB)

    Koponen, P. [VTT Energy, Espoo (Finland)

    1998-08-01

    Electricity cannot be stored in large quantities. That is why the electricity supply and consumption are always almost equal in large power supply systems. If this balance were disturbed beyond stability, the system or a part of it would collapse until a new stable equilibrium is reached. The balance between supply and consumption is mainly maintained by controlling the power production, but the electricity consumption, or in other words the load, is also controlled. Controlling the load of the power supply system is important if easily controllable power production capacity is limited. A temporary shortage of capacity causes high peaks in the energy price in the electricity market. Load control either reduces the electricity consumption during peak consumption and peak price or moves electricity consumption to some other time. The project Optimisation of Load Control is a part of the EDISON research program for distribution automation. The following areas were studied: optimization of space heating and ventilation when the electricity price is time-variable; a load control model in power purchase optimization; optimization of direct load control sequences; the interaction between load control optimization and power purchase optimization; literature on load control and optimization methods; field tests and response models of direct load control; and the effects of electricity market deregulation on load control. An overview of the main results is given in this chapter.

  19. Synergistic effects in gene delivery-a structure-activity approach to the optimisation of hybrid dendritic-lipidic transfection agents.

    Science.gov (United States)

    Jones, Simon P; Gabrielson, Nathan P; Pack, Daniel W; Smith, David K

    2008-10-21

    Novel gene delivery agents based on combining cholesterol units with spermine-functionalised dendrons exhibit enhanced transfection ability; we report significant synergistic effects in mixed (hybrid) systems which combine aspects of both main classes of synthetic vectors, i.e., cationic polymers and lipids.

  20. Computational optimisation of targeted DNA sequencing for cancer detection

    DEFF Research Database (Denmark)

    Martinez, Pierre; McGranahan, Nicholas; Birkbak, Nicolai Juul

    2013-01-01

    circulating tumour DNA (ctDNA) might represent a non-invasive method to detect mutations in patients, facilitating early detection. In this article, we define reduced gene panels from publicly available datasets as a first step to assess and optimise the potential of targeted ctDNA scans for early tumour...
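
    The article's own panel-definition procedure is not reproduced here. As one plausible, hedged illustration of how a reduced panel could be assembled from a public mutation dataset, the sketch below greedily adds the gene that newly covers the most tumours until a fixed panel size is reached, using a synthetic patient-by-gene mutation matrix rather than the datasets the authors analysed.

```python
# Illustrative sketch only: greedy selection of a reduced gene panel so that as
# many tumours as possible carry at least one mutation in the panel.
# The mutation matrix is synthetic, not the public datasets used in the paper.
import numpy as np

rng = np.random.default_rng(6)
n_patients, n_genes = 200, 50
mutations = rng.random((n_patients, n_genes)) < rng.uniform(0.01, 0.2, n_genes)

def greedy_panel(mut, panel_size=10):
    covered = np.zeros(mut.shape[0], dtype=bool)
    panel = []
    for _ in range(panel_size):
        gain = (mut & ~covered[:, None]).sum(axis=0)   # newly covered patients per gene
        best = int(np.argmax(gain))
        panel.append(best)
        covered |= mut[:, best]
    return panel, covered.mean()

panel, coverage = greedy_panel(mutations)
print(panel, f"{coverage:.1%} of patients covered")
```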

  1. TEM turbulence optimisation in stellarators

    CERN Document Server

    Proll, J H E; Xanthopoulos, P; Lazerson, S A; Faber, B J

    2015-01-01

    With the advent of neoclassically optimised stellarators, optimising stellarators for turbulent transport is an important next step. The reduction of ion-temperature-gradient-driven turbulence has been achieved via shaping of the magnetic field, and the reduction of trapped-electron mode (TEM) turbulence is addressed in the present paper. Recent analytical and numerical findings suggest TEMs are stabilised when a large fraction of trapped particles experiences favourable bounce-averaged curvature. This is the case, for example, in Wendelstein 7-X [C.D. Beidler et al., Fusion Technology 17, 148 (1990)] and other Helias-type stellarators. Using this knowledge, a proxy function was designed to estimate the TEM dynamics, allowing optimal configurations for TEM stability to be determined with the STELLOPT [D.A. Spong et al., Nucl. Fusion 41, 711 (2001)] code without extensive turbulence simulations. A first proof-of-principle optimised equilibrium stemming from the TEM-dominated stella...

  2. Adenovirus-based strategies overcome temozolomide resistance by silencing the O6-methylguanine-DNA methyltransferase promoter.

    Science.gov (United States)

    Alonso, Marta M; Gomez-Manzano, Candelaria; Bekele, B Nebiyou; Yung, W K Alfred; Fueyo, Juan

    2007-12-15

    Currently, the most efficacious treatment for malignant gliomas is temozolomide; however, gliomas expressing the DNA repair enzyme O(6)-methylguanine-DNA methyltransferase (MGMT) are resistant to this drug. Strong clinical evidence shows that gliomas with methylation and subsequent silencing of the MGMT promoter are sensitive to temozolomide. Based on the fact that adenoviral proteins directly target and inactivate key DNA repair genes, we hypothesized that the oncolytic adenovirus Delta-24-RGD could be successfully combined with temozolomide to overcome the reported MGMT-mediated resistance. Our studies showed that the combination of Delta-24-RGD and temozolomide induces a profound therapeutic synergy in glioma cells. We observed that Delta-24-RGD treatment overrides the temozolomide-mediated G(2)-M arrest. Furthermore, Delta-24-RGD infection was followed by down-modulation of the RNA levels of MGMT. Chromatin immunoprecipitation assays showed that Delta-24-RGD prevented the recruitment of p300 to the MGMT promoter. Importantly, using mutant adenoviruses and wild-type and dominant-negative forms of the p300 protein, we showed that Delta-24-RGD interaction with p300 was required to induce silencing of the MGMT gene. Of further clinical relevance, the combination of Delta-24-RGD and temozolomide significantly improved the survival of glioma-bearing mice. Collectively, our data provide a strong mechanistic rationale for the combination of oncolytic adenoviruses and temozolomide, and should propel the clinical testing of this therapy approach in patients with malignant gliomas.

  3. A new adenovirus based vaccine vector expressing an Eimeria tenella derived TLR agonist improves cellular immune responses to an antigenic target.

    Directory of Open Access Journals (Sweden)

    Daniel M Appledorn

    Full Text Available BACKGROUND: Adenoviral based vectors remain promising vaccine platforms for use against numerous pathogens, including HIV. Recent vaccine trials utilizing Adenovirus based vaccines expressing HIV antigens confirmed induction of cellular immune responses, but these responses failed to prevent HIV infections in vaccinees. This illustrates the need to develop vaccine formulations capable of generating more potent T-cell responses to HIV antigens, such as HIV-Gag, since robust immune responses to this antigen correlate with improved outcomes in long-term non-progressor HIV infected individuals. METHODOLOGY/PRINCIPAL FINDINGS: In this study we designed a novel vaccine strategy utilizing an Ad-based vector expressing a potent TLR agonist derived from Eimeria tenella as an adjuvant to improve immune responses from a [E1-]Ad-based HIV-Gag vaccine. Our results confirm that expression of rEA elicits significantly increased TLR mediated innate immune responses as measured by the influx of plasma cytokines and chemokines, and activation of innate immune responding cells. Furthermore, our data show that the quantity and quality of HIV-Gag specific CD8(+) and CD8(-) T-cell responses were significantly improved when coupled with rEA expression. These responses also correlated with a significantly increased number of HIV-Gag derived epitopes being recognized by host T cells. Finally, functional assays confirmed that rEA expression significantly improved antigen specific CTL responses in vivo. Moreover, we show that these improved responses were dependent upon improved TLR pathway interactions. CONCLUSION/SIGNIFICANCE: The data presented in this study illustrate the potential utility of Ad-based vectors expressing TLR agonists to improve clinical outcomes dependent upon induction of robust, antigen specific immune responses.

  4. Cogeneration technologies, optimisation and implementation

    CERN Document Server

    Frangopoulos, Christos A

    2017-01-01

    Cogeneration refers to the use of a power station to deliver two or more useful forms of energy, for example, to generate electricity and heat at the same time. This book provides an integrated treatment of cogeneration, including a tour of the available technologies and their features, and how these systems can be analysed and optimised.

  5. Optimisation of solar synoptic observations

    Science.gov (United States)

    Klvaña, Miroslav; Sobotka, Michal; Švanda, Michal

    2012-09-01

    The development of instrumental and computer technologies is connected with steadily increasing needs for archiving of large data volumes. The current trend to meet this requirement includes data compression and the growth of storage capacities. This approach, however, has technical and practical limits. A further reduction of the archived data volume can be achieved by means of an optimisation of the archiving that consists of data selection without losing useful information. We describe a method of optimised archiving of solar images, based on the selection of images that contain new information. The new information content is evaluated by means of the analysis of changes detected in the images. We present characteristics of different kinds of image changes and divide them into fictitious changes with a disturbing effect and real changes that provide new information. In block diagrams describing the selection and archiving, we demonstrate the influence of clouds, the recording of images during an active event on the Sun, including a period before the event onset, and the archiving of the long-term history of solar activity. The described optimisation technique is not suitable for helioseismology, because it does not conserve the uniform time step in the archived sequence and removes the information about solar oscillations. In the case of long-term synoptic observations, optimised archiving can save a large amount of storage capacity. The actual capacity saving will depend on the setting of the change-detection sensitivity and on the capability to exclude the fictitious changes.

  6. For Time-Continuous Optimisation

    DEFF Research Database (Denmark)

    Heinrich, Mary Katherine; Ayres, Phil

    2016-01-01

    Strategies for optimisation in design normatively assume an artefact end-point, disallowing continuous architecture that engages living systems, dynamic behaviour, and complex systems. In our Flora Robotica investigations of symbiotic plant-robot bio-hybrids, we require computational tools...

  8. Optimising Code Generation with haggies

    OpenAIRE

    Reiter, Thomas

    2009-01-01

    This article describes haggies, a program for the generation of optimised programs for the efficient numerical evaluation of mathematical expressions. It uses a multivariate Horner scheme and Common Subexpression Elimination to reduce the overall number of operations. The package can serve as a back-end for virtually any general-purpose computer algebra program. Built-in type inference that allows one to deal with non-standard data types in strongly typed languages and a very flexible, pattern-ba...

  9. Robust optimisation of railway crossing geometry

    Science.gov (United States)

    Wan, Chang; Markine, Valeri; Dollevoet, Rolf

    2016-05-01

    This paper presents a methodology for improving the crossing (frog) geometry through the robust optimisation approach, wherein the variability of the design parameters within a prescribed tolerance is included in the optimisation problem. Here, the crossing geometry is defined by parameterising the B-spline-represented cross-sectional shape and the longitudinal height profile of the nose rail. The dynamic performance of the crossing is evaluated considering the variation of wheel profiles and track alignment. A multipoint approximation method (MAM) is applied in solving the optimisation problem of minimising the contact pressure during wheel-rail contact and constraining the location of wheel transition at the crossing. To clarify the difference between the robust optimisation and the normal deterministic optimisation approaches, the optimisation problem is solved with both approaches. The results show that the deterministic optimum fails under slight changes of the design variables; the robust optimum, however, has improved and robust performance.
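    In generic terms, the difference between the two approaches compared above can be written as follows; this is an illustrative worst-case formulation only, not the paper's exact MAM-based problem statement.

```latex
% Generic contrast between deterministic and robust optimisation
% (illustrative form only; the exact formulation used in the paper is not reproduced)
\begin{align}
  \text{deterministic:} \quad & \min_{\mathbf{x}} \; f(\mathbf{x})
      \quad \text{s.t. } g_j(\mathbf{x}) \le 0, \\
  \text{robust:} \quad & \min_{\mathbf{x}} \; \max_{\boldsymbol{\delta} \in \Delta} f(\mathbf{x} + \boldsymbol{\delta})
      \quad \text{s.t. } g_j(\mathbf{x} + \boldsymbol{\delta}) \le 0 \;\; \forall \boldsymbol{\delta} \in \Delta,
\end{align}
where $\Delta$ is the prescribed tolerance box around the design variables $\mathbf{x}$.
```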

  10. An Optimisation Approach for Room Acoustics Design

    DEFF Research Database (Denmark)

    Holm-Jørgensen, Kristian; Kirkegaard, Poul Henning; Andersen, Lars

    2005-01-01

    This paper discusses, on a conceptual level, the value of optimisation techniques in architectural acoustics room design from a practical point of view. It is chosen to optimise one objective room acoustics design criterion estimated from the sound field inside the room. The sound field is modeled...... using the boundary element method where absorption is incorporated. An example is given where the geometry of a room is defined by four design modes. The room geometry is optimised to get a uniform sound pressure....

  11. Optimisation combinatoire Theorie et algorithmes

    CERN Document Server

    Korte, Bernhard; Fonlupt, Jean

    2010-01-01

    This book is the French translation of the fourth and final edition of Combinatorial Optimization: Theory and Algorithms, written by two eminent specialists in the field, Bernhard Korte and Jens Vygen of the University of Bonn in Germany. It emphasises the theoretical aspects of combinatorial optimisation as well as efficient and exact problem-solving algorithms, and in this it differs from the simpler heuristic approaches often described elsewhere. The book contains numerous concise and elegant proofs of difficult results. Intended for students...

  12. Optimisation of the LHCb detector

    CERN Document Server

    Hierck, R

    2003-01-01

    This thesis describes a comparison of the LHCb classic and LHCb light concepts from a tracking perspective. The comparison includes the detector occupancies, the various pattern recognition algorithms and the reconstruction performance. The final optimised LHCb setup is used to study the physics performance of LHCb for the Bs->DsK and Bs->DsPi decay channels. This includes both the event selection and a study of the sensitivity for the Bs oscillation frequency Δm_s, the Bs lifetime difference ΔΓ_s, and the CP parameter γ−2δγ.

  13. Optimisation of the LHCb detector

    CERN Document Server

    Hierck, R H

    2003-01-01

    This thesis describes a comparison of the LHCb classic and LHCb light concepts from a tracking perspective. The comparison includes the detector occupancies, the various pattern recognition algorithms and the reconstruction performance. The final optimised LHCb setup is used to study the physics performance of LHCb for the Bs->DsK and Bs->DsPi decay channels. This includes both the event selection and a study of the sensitivity for the Bs oscillation frequency Δm_s, the Bs lifetime difference ΔΓ_s, and the CP parameter γ−2δγ.

  14. Optimising resource management in neurorehabilitation.

    Science.gov (United States)

    Wood, Richard M; Griffiths, Jeff D; Williams, Janet E; Brouwers, Jakko

    2014-01-01

    To date, little research has been published regarding the effective and efficient management of resources (beds and staff) in neurorehabilitation, despite being an expensive service in limited supply. To demonstrate how mathematical modelling can be used to optimise service delivery, by way of a case study at a major 21 bed neurorehabilitation unit in the UK. An automated computer program for assigning weekly treatment sessions is developed. Queue modelling is used to construct a mathematical model of the hospital in terms of referral submissions to a waiting list, admission and treatment, and ultimately discharge. This is used to analyse the impact of hypothetical strategic decisions on a variety of performance measures and costs. The project culminates in a hybridised model of these two approaches, since a relationship is found between the number of therapy hours received each week (scheduling output) and length of stay (queuing model input). The introduction of the treatment scheduling program has substantially improved timetable quality (meaning a better and fairer service to patients) and has reduced employee time expended in its creation by approximately six hours each week (freeing up time for clinical work). The queuing model has been used to assess the effect of potential strategies, such as increasing the number of beds or employing more therapists. The use of mathematical modelling has not only optimised resources in the short term, but has allowed the optimality of longer term strategic decisions to be assessed.

  15. Dose optimisation in single plane interstitial brachytherapy

    DEFF Research Database (Denmark)

    Tanderup, Kari; Hellebust, Taran Paulsen; Honoré, Henriette Benedicte;

    2006-01-01

    BACKGROUND AND PURPOSE: Brachytherapy dose distributions can be optimised by modulation of source dwell times. In this study dose optimisation in single planar interstitial implants was evaluated in order to quantify the potential benefit in patients. MATERIAL AND METHODS: In 14...

  16. Self-optimising control of sewer systems

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Montero-Castro, Ignacio; Mollerup, Ane Loft;

    The design of sewer system control is a complex task given the large size of the sewer networks, the transient dynamics of the water flows and the stochastic nature of rainfall. This contribution presents a generic methodology for the design of a self-optimising controller in sewer systems...... to design an optimising control strategy for a subcatchment area in Copenhagen....

  17. An Optimisation Approach for Room Acoustics Design

    DEFF Research Database (Denmark)

    Holm-Jørgensen, Kristian; Kirkegaard, Poul Henning; Andersen, Lars

    2005-01-01

    This paper discusses, on a conceptual level, the value of optimisation techniques in architectural acoustics room design from a practical point of view. It is chosen to optimise one objective room acoustics design criterion estimated from the sound field inside the room. The sound field is modeled...

  18. Haemodynamic optimisation in lower limb arterial surgery

    DEFF Research Database (Denmark)

    Bisgaard, J; Gilsaa, T; Rønholm, E;

    2012-01-01

    index was optimised by administering 250 ml aliquots of colloid intraoperatively and during the first 6 h post-operatively. Following surgery, fluid optimisation was supplemented with dobutamine, if necessary, targeting an oxygen delivery index level ≥ 600 ml/min/m(2) in the intervention group...

  19. Evolutionary programming for neutron instrument optimisation

    Science.gov (United States)

    Bentley, Phillip M.; Pappas, Catherine; Habicht, Klaus; Lelièvre-Berna, Eddy

    2006-11-01

    Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN, at the HMI. We discuss the potential of the GA which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.
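    The canonical GA mentioned above follows a standard selection-crossover-mutation loop. The sketch below is a generic, hedged illustration in Python: the SPAN field-profile encoding and the Monte-Carlo (Vitess/McSTAS) fitness evaluation are not reproduced, and a toy quadratic fitness stands in for the instrument figure of merit.

```python
import random

def canonical_ga(fitness, n_genes, pop_size=40, generations=100,
                 p_cross=0.9, p_mut=0.05, bounds=(-1.0, 1.0)):
    """Generic canonical GA: tournament selection, one-point crossover,
    uniform mutation. In real use `fitness` would wrap a Monte-Carlo
    instrument simulation; here it is any callable scoring a gene list."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(fitness(ind), ind) for ind in pop]
        def tournament():
            a, b = random.sample(scored, 2)
            return (a if a[0] > b[0] else b)[1]
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            if random.random() < p_cross:
                cut = random.randrange(1, n_genes)   # one-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            child = [random.uniform(lo, hi) if random.random() < p_mut else g
                     for g in child]                 # uniform mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# Toy stand-in fitness: optimum at 0.3 in every dimension
best = canonical_ga(lambda x: -sum((xi - 0.3) ** 2 for xi in x), n_genes=5)
print([round(g, 2) for g in best])
```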

  20. Evolutionary programming for neutron instrument optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Bentley, Phillip M. [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany)]. E-mail: phillip.bentley@hmi.de; Pappas, Catherine [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Habicht, Klaus [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Lelievre-Berna, Eddy [Institut Laue-Langevin, 6 rue Jules Horowitz, BP 156, 38042 Grenoble Cedex 9 (France)

    2006-11-15

    Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN, at the HMI. We discuss the potential of the GA which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.

  1. Optimising costs in WLCG operations

    CERN Document Server

    Pradillo, Mar; Flix, Josep; Forti, Alessandra; Sciabà, Andrea

    2015-01-01

    The Worldwide LHC Computing Grid project (WLCG) provides the computing and storage resources required by the LHC collaborations to store, process and analyse the 50 Petabytes of data annually generated by the LHC. The WLCG operations are coordinated by a distributed team of managers and experts and performed by people at all participating sites and from all the experiments. Several improvements in the WLCG infrastructure have been implemented during the first long LHC shutdown to prepare for the increasing needs of the experiments during Run2 and beyond. However, constraints in funding will affect not only the computing resources but also the available effort for operations. This paper presents the results of a detailed investigation on the allocation of the effort in the different areas of WLCG operations, identifies the most important sources of inefficiency and proposes viable strategies for optimising the operational cost, taking into account the current trends in the evolution of the computing infrastruc...

  2. Optimising Comprehensibility in Interlingual Translation

    DEFF Research Database (Denmark)

    Nisbeth Jensen, Matilde

    2015-01-01

    ...... information on medication and tax information. Such texts are often written by experts and received by lay people, and, in today’s globalised world, they are often translated as well. In these functional texts, the receiver is not a mere recipient of information, but s/he needs to be able to act upon it...... It is argued that Plain Language writing is a type of intralingual translation as it involves rewriting or translating a complex monolingual text into comprehensible language. Based on Plain Language literature, a comprehensibility framework is elaborated, which is subsequently exemplified through the functional text type of Patient Information Leaflet. Finally, the usefulness of applying the principles of Plain Language and intralingual translation for optimising comprehensibility in interlingual translation is discussed....

  3. Systematic delay-driven power optimisation and power-driven delay optimisation of combinational circuits

    OpenAIRE

    Mehrotra, Rashmi

    2013-01-01

    With the proliferation of mobile wireless communication and embedded systems, energy efficiency becomes a major design constraint. The dissipated energy is often referred to as the product of power dissipation and the input-output delay. Most electronic design automation techniques focus on optimising only one of these parameters, either power or delay. Industry standard design flows integrate systematic methods of optimising either area or timing while for power consumption optimisation o...

  4. Benchmarks for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...

  5. Topology optimisation for natural convection problems

    CERN Document Server

    Alexandersen, Joe; Andreasen, Casper Schousboe; Sigmund, Ole

    2014-01-01

    This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equations coupled to the convection-diffusion equation through the Boussinesq approximation. In order to facilitate topology optimisation, the Brinkman approach is taken to penalise velocities inside the solid domain and the effective thermal conductivity is interpolated in order to accommodate differences in thermal conductivity of the solid and fluid phases. The governing equations are discretised using stabilised finite elements and topology optimisation is performed for two different problems using discrete adjoint sensitivity analysis. The study shows that topology optimisation is a viable approach for designing heat sink geometries cooled by natural convection and micropumps powered by natural convection.
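    For reference, a schematic statement of the governing equations described above (steady laminar incompressible flow, Boussinesq buoyancy, Brinkman penalisation); the specific design-variable interpolation functions used in the paper are not reproduced here.

```latex
% Schematic governing equations for density-based topology optimisation of
% natural convection; alpha(gamma) and k(gamma) interpolations are assumptions,
% shown only in generic form.
\begin{align}
  \nabla \cdot \mathbf{u} &= 0, \\
  \rho_0 (\mathbf{u} \cdot \nabla)\mathbf{u} &= -\nabla p + \mu \nabla^2 \mathbf{u}
      - \alpha(\gamma)\,\mathbf{u} - \rho_0 \beta \left(T - T_{\mathrm{ref}}\right)\mathbf{g}, \\
  \rho_0 c_p\, \mathbf{u} \cdot \nabla T &= \nabla \cdot \bigl(k(\gamma)\, \nabla T\bigr),
\end{align}
where $\gamma$ is the design field, $\alpha(\gamma)$ the Brinkman inverse permeability
penalising velocities in solid regions, and $k(\gamma)$ the interpolated thermal conductivity.
```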

  6. Topology Optimisation for Coupled Convection Problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    This thesis deals with topology optimisation for coupled convection problems. The aim is to extend and apply topology optimisation to steady-state conjugate heat transfer problems, where the heat conduction equation governs the heat transfer in a solid and is coupled to thermal transport...... in a surrounding fluid, governed by a convection-diffusion equation, where the convective velocity field is found from solving the isothermal incompressible steady-state Navier-Stokes equations. Topology optimisation is also applied to steady-state natural convection problems. The modelling is done using stabilised...... finite elements, the formulation and implementation of which was done partly during a special course as preparatory work for this thesis. The formulation is extended with a Brinkman friction term in order to facilitate the topology optimisation of fluid flow and convective cooling problems. The derived...

  7. An optimisation method for complex product design

    Science.gov (United States)

    Li, Ni; Yi, Wenqing; Bi, Zhuming; Kong, Haipeng; Gong, Guanghong

    2013-11-01

    Designing a complex product such as an aircraft usually requires both qualitative and quantitative data and reasoning. To assist the design process, a critical issue is how to represent qualitative data and utilise it in the optimisation. In this study, a new method is proposed for the optimal design of complex products: to make the full use of available data, information and knowledge, qualitative reasoning is integrated into the optimisation process. The transformation and fusion of qualitative and quantitative data are achieved via the fuzzy sets theory and a cloud model. To shorten the design process, parallel computing is implemented to solve the formulated optimisation problems. A parallel adaptive hybrid algorithm (PAHA) has been proposed. The performance of the new algorithm has been verified by a comparison with the results from PAHA and two other existing algorithms. Further, PAHA has been applied to determine the shape parameters of an aircraft model for aerodynamic optimisation purposes.

  8. User perspectives in public transport timetable optimisation

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    The present paper deals with timetable optimisation from the perspective of minimising the waiting time experienced by passengers when transferring either to or from a bus. Due to its inherent complexity, this bi-level minimisation problem is extremely difficult to solve mathematically, since...... on the large-scale public transport network in Denmark. The timetable optimisation approach yielded a yearly reduction in weighted waiting time equivalent to approximately 45 million Danish kroner (9 million USD)....

  9. Analysing bone regeneration using topological optimisation

    Directory of Open Access Journals (Sweden)

    Diego Alexander Garzón Alvarado

    2010-04-01

    Full Text Available The aim of the present article is to present the mathematical foundations of topological optimisation aimed at carrying out a study of bone regeneration. Bone structure can be economically adapted to different mechanical demands, responding to topological optimisation models (having "minimum" mass and "high" resistance). Such analysis is essential for formulating physical therapy in patients needing partial or total strengthening of a particular bone's tissue structure. A mathematical model is formulated, as are the methods for resolving it.

  10. Optimising costs in WLCG operations

    Science.gov (United States)

    Alandes Pradillo, María; Dimou, Maria; Flix, Josep; Forti, Alessandra; Sciabà, Andrea

    2015-12-01

    The Worldwide LHC Computing Grid project (WLCG) provides the computing and storage resources required by the LHC collaborations to store, process and analyse the 50 Petabytes of data annually generated by the LHC. The WLCG operations are coordinated by a distributed team of managers and experts and performed by people at all participating sites and from all the experiments. Several improvements in the WLCG infrastructure have been implemented during the first long LHC shutdown to prepare for the increasing needs of the experiments during Run2 and beyond. However, constraints in funding will affect not only the computing resources but also the available effort for operations. This paper presents the results of a detailed investigation on the allocation of the effort in the different areas of WLCG operations, identifies the most important sources of inefficiency and proposes viable strategies for optimising the operational cost, taking into account the current trends in the evolution of the computing infrastructure and the computing models of the experiments.

  11. Optimisation Modelling of Efficiency of Enterprise Restructuring

    Directory of Open Access Journals (Sweden)

    Yefimova Hanna V.

    2014-03-01

    Full Text Available The article considers the optimisation of the use of resources directed at the restructuring of a shipbuilding enterprise, which is the main prerequisite of its efficiency. Restructuring is considered as a process of complex and interconnected change in the structure of assets, liabilities and enterprise functions, initiated by a dynamic environment, based on the strategic concept of the enterprise's development and directed at increasing the efficiency of its activity, which is expressed in the growth of its value. The decision to restructure a shipbuilding enterprise and the selection of a specific restructuring project belong to the optimisation tasks of prospective planning. The enterprise resources allocated for restructuring serve as constraints of the mathematical model. The main optimisation criteria are the maximisation of net discounted income or the minimisation of expenditure on restructuring measures. The optimisation model is designed for assessing the volumes of own and borrowed funds to be attracted for restructuring, while a simulation model generates the cash flows. The task is solved on the basis of a set of interrelated optimisation and simulation models and procedures for the formation, selection and co-ordination of managerial decisions.

  12. Optimisation of Investment Resources at Small Enterprises

    Directory of Open Access Journals (Sweden)

    Shvets Iryna B.

    2014-03-01

    Full Text Available The goal of the article lies in studying the process of optimising the structure of investment resources and developing criteria and stages for optimising the volumes of investment resources at small enterprises by type of economic activity. The article characterises the process of transformation of investment resources into the assets and liabilities of the balance sheets of small enterprises and calculates the structure of the sources of formation of investment resources at small enterprises in Ukraine by type of economic activity in 2011. On the basis of the conducted analysis of the structure of investment resources of small enterprises, the article forms the main groups of optimisation criteria in the context of individual small enterprises by type of economic activity. The article offers an algorithm and a step-by-step scheme for optimising investment resources at small enterprises in the form of a multi-stage process of managing investment resources so as to increase their mobility and the rate of transformation of existing resources into investments. The prospect of further studies in this direction is the development of a structural and logical scheme for optimising the volumes of investment resources at small enterprises.

  13. Multicriteria Optimisation in Logistics Forwarder Activities

    Directory of Open Access Journals (Sweden)

    Tanja Poletan Jugović

    2007-05-01

    Full Text Available A logistics forwarder, as the organiser and planner of the coordination and integration of all elements of transport and logistics chains, uses adequate methods in the process of planning and decision-making. One of these methods, analysed in this paper, which can be used in the optimisation of the transport and logistics processes and activities of a logistics forwarder, is the multicriteria optimisation method. Using that method, this paper suggests a model of multicriteria optimisation of logistics forwarder activities. The suggested optimisation model is justified in keeping with the principles of multicriteria optimisation, which belongs to operations research methods and represents the process of multicriteria optimisation of variants. Among the many different multicriteria optimisation procedures, PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluations) and Promcalc & Gaia V. 3.2, a computer program for multicriteria programming based on the mentioned procedure, were used.

  14. Topology Optimisation for Coupled Convection Problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Andreasen, Casper Schousboe; Aage, Niels

    The work focuses on applying topology optimisation to forced and natural convection problems in fluid dynamics and conjugate (fluid-structure) heat transfer. To the authors' knowledge, topology optimisation has not yet been applied to natural convection flow problems in the published literature...... and the current work is thus seen as contributing new results to the field. In the literature, most works on the topology optimisation of weakly coupled convection-diffusion problems focus on the temperature distribution of the fluid, but a selection of notable exceptions also focusing on the temperature...... conduction governs in the solid parts of the design domain and couples to convection-dominated heat transfer to a surrounding fluid. Both loosely coupled and tightly coupled problems are considered. The loosely coupled problems are convection-diffusion problems, based on an advective velocity field from......

  15. [Process optimisation: from theory to practical implementation].

    Science.gov (United States)

    Töpfer, Armin

    2010-01-01

    Today, process optimisation is an indispensable approach to mastering the current challenges of modern health care management. The objective is to design business processes free of defects and free of waste, as well as to monitor and control them with meaningful test statistics. Based on the identification of essential key performance indicators, key success factors and value cash generators, two basic approaches to process optimisation, which are well established and widely used in industry, are now being implemented in the health care sector as well: Lean Management and Six Sigma.

  16. Simulating stem growth using topological optimisation

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Narváez

    2010-04-01

    Full Text Available Engineers are currently resorting to observations of nature for making new designs. Studying the functioning of the bodies of plants and animals has required them to be modelled and simulated; however, some models born from engineering problems could be used for such purposes. This article shows how topological optimisation (a mathematical model for optimising the design of structural elements) can be used for modelling and simulating the way a stem grows in terms of carrying out its function of providing support for the leaves and a plant's other upper organs.

  17. Bat Algorithm for Multi-objective Optimisation

    CERN Document Server

    Yang, Xin-She

    2012-01-01

    Engineering optimization is typically multiobjective and multidisciplinary with complex constraints, and the solution of such complex problems requires efficient optimization algorithms. Recently, Xin-She Yang proposed a bat-inspired algorithm for solving nonlinear, global optimisation problems. In this paper, we extend this algorithm to solve multiobjective optimisation problems. The proposed multiobjective bat algorithm (MOBA) is first validated against a subset of test functions, and then applied to solve multiobjective design problems such as welded beam design. Simulation results suggest that the proposed algorithm works efficiently.
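    For context, the standard single-objective bat algorithm update rules (Yang, 2010) that MOBA builds on are shown below; the paper's specific multi-objective handling, such as how the objectives are aggregated at each iteration, is not reproduced here.

```latex
% Standard (single-objective) bat algorithm update rules; the multi-objective
% extension described above builds on these, but its objective aggregation is not shown.
\begin{align}
  f_i &= f_{\min} + (f_{\max} - f_{\min})\,\beta, \qquad \beta \sim U(0,1), \\
  \mathbf{v}_i^{t} &= \mathbf{v}_i^{t-1} + \left(\mathbf{x}_i^{t-1} - \mathbf{x}_*\right) f_i, \\
  \mathbf{x}_i^{t} &= \mathbf{x}_i^{t-1} + \mathbf{v}_i^{t},
\end{align}
with loudness $A_i^{t+1} = \alpha A_i^{t}$ and pulse-emission rate
$r_i^{t+1} = r_i^{0}\bigl[1 - \exp(-\gamma t)\bigr]$, where $\mathbf{x}_*$ is the current best solution.
```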

  18. Topology Optimisation of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Thike Aye Min

    2016-01-01

    Full Text Available Wireless sensor networks are widely used in a variety of fields including industrial environments. In the case of a clustered network, the location of the cluster head affects the reliability of the network operation. Finding the optimum location of the cluster head is therefore critical for the design of a network. This paper discusses an optimisation approach, based on the brute force algorithm, in the context of topology optimisation of a cluster-structured centralised wireless sensor network. Two examples are given to verify the approach; they demonstrate the implementation of the brute force algorithm to find an optimum location of the cluster head.

  19. Optimisation of interventional cardiology procedures; Optimisation des procedures en cardiologie interventionnelle

    Energy Technology Data Exchange (ETDEWEB)

    Bar, Olivier [SELARL, Cardiologie Interventionnelle Imagerie Cardiaque - CIIC, 8, place de la Cathedrale - 37042 Tours (France)

    2011-07-15

    Radiation-guided procedures in interventional cardiology include diagnostic and/or therapeutic procedures, primarily coronary catheterization and coronary angioplasty. Application of the principles of radiation protection and the use of optimised procedures are contributing to dose reduction while maintaining the radiological image quality necessary for performance of the procedures. The mandatory training in patient radiation protection and technical training in the use of radiology devices mean that implementing continuous optimisation of procedures is possible in practice. This optimisation approach is the basis of patient radiation protection; when associated with the wearing of protective equipment it also contributes to the radiation protection of the cardiologists. (author)

  20. Extending Particle Swarm Optimisers with Self-Organized Criticality

    DEFF Research Database (Denmark)

    Løvbjerg, Morten; Krink, Thiemo

    2002-01-01

    Particle swarm optimisers (PSOs) show potential in function optimisation, but still have room for improvement. Self-organized criticality (SOC) can help control the PSO and add diversity. Extending the PSO with SOC seems promising, reaching faster convergence and better solutions.
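    For context, the standard PSO velocity and position update that the SOC extension builds on is shown below; the SOC mechanism itself (criticality bookkeeping and particle relocation) is not detailed in the abstract and is therefore not shown.

```latex
% Standard PSO update equations (the baseline extended by SOC in the work above).
\begin{align}
  \mathbf{v}_i^{t+1} &= \omega\,\mathbf{v}_i^{t}
      + c_1 r_1 \left(\mathbf{p}_i - \mathbf{x}_i^{t}\right)
      + c_2 r_2 \left(\mathbf{g} - \mathbf{x}_i^{t}\right), \\
  \mathbf{x}_i^{t+1} &= \mathbf{x}_i^{t} + \mathbf{v}_i^{t+1},
\end{align}
where $\mathbf{p}_i$ is particle $i$'s personal best, $\mathbf{g}$ the swarm's global best,
$r_1, r_2 \sim U(0,1)$, and $\omega, c_1, c_2$ are the usual inertia and acceleration coefficients.
```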

  1. Particle Swarm Optimisation with Spatial Particle Extension

    DEFF Research Database (Denmark)

    Krink, Thiemo; Vesterstrøm, Jakob Svaneborg; Riget, Jacques

    2002-01-01

    In this paper, we introduce spatial extension to particles in the PSO model in order to overcome premature convergence in iterative optimisation. The standard PSO and the new model (SEPSO) are compared w.r.t. performance on well-studied benchmark problems. We show that the SEPSO indeed managed...

  2. Topology optimisation of natural convection problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Aage, Niels; Andreasen, Casper Schousboe

    2014-01-01

    This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equation...

  3. Thermodynamic optimisation of a heat exchanger

    NARCIS (Netherlands)

    Cornelissen, R.L.; Hirs, G.G.

    1999-01-01

    The objective of this paper is to show that for the optimal design of an energy system, where there is a trade-off between exergy saving during operation and exergy use during construction of the energy system, exergy analysis and life cycle analysis should be combined. An exergy optimisation of a h

  4. Optimised Design of Transparent Optical Domains

    DEFF Research Database (Denmark)

    Hanik, N.; Caspar, C.; Schmidt, F.;

    2000-01-01

    Three different design concepts for transparent, dispersion-compensated, optical WDM transmission links are optimised numerically and experimentally for a 10 Gbit/s data rate per channel. It is shown that robust transparent domains of 1,500 km in diameter can be realised using simple design rules....

  5. Public transport optimisation emphasising passengers’ travel behaviour

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo

    to enhance the operations of public transport while explicitly emphasising passengers’ travel behaviour and preferences. Similar to economic theory, interactions between supply and demand are omnipresent in the context of public transport operations. In public transport, the demand is represented...... the published performance measures and what passengers actually experience, a large academic contribution of the current PhD study is the explicit consideration of passengers’ travel behaviour in optimisation studies and in the performance assessment. Besides the explicit passenger focus in transit planning...... at as the motivator for delay-robust railway timetables. Interestingly, passenger oriented optimisation studies considering robustness in railway planning typically limit their emphasis on passengers to the consideration of transfer maintenance. Clearly, passengers’ travel behaviour is more complex and multifaceted...

  6. Topology optimised planar photonic crystal building blocks

    DEFF Research Database (Denmark)

    Frandsen, Lars Hagedorn; Hede, K. K.; Borel, Peter Ingo

    A photonic crystal waveguide (PhCW) 1x4 splitter has been constructed from PhCW 60° bends [1] and Y-splitters [2] that have been designed individually by utilising topology optimisation [3]. The splitter has been fabricated in a silicon-on-insulator material (Fig. 1) and exhibits a broadband splitting...... for the TE-polarisation with an average excess loss of 1.55±0.54 dB for a 110 nm bandwidth. The 1x4 splitter demonstrates that individual topology-optimised parts can be used as building blocks to realise high-performance nanophotonic circuits. [1] L. H. Frandsen et al., Opt. Express 12, 5916-5921 (2004); [2] P. I...

  7. Self-optimising control of sewer systems

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Montero-Castro, I.; Mollerup, Ane Loft

    2013-01-01

    Self-optimising control is a useful concept to select optimal controlled variables from a set of candidate measurements in a systematic manner. In this study, we use self-optimising control tools and apply them to the specific features of sewer systems, e.g. the continuously transient dynamics......, the availability of a large number of measurements, and the stochastic and unforeseeable character of the disturbances (rainfall). Using a subcatchment area in the Copenhagen sewer system as a case study we demonstrate, step by step, the formulation of the self-optimising control problem. The final result...... is an improved control structure aimed at optimizing the losses for a given control objective, here the minimization of the combined sewer overflows despite rainfall variations....

  8. Improved Squeaky Wheel Optimisation for Driver Scheduling

    CERN Document Server

    Aickelin, Uwe; Li, Jingpeng

    2008-01-01

    This paper presents a technique called Improved Squeaky Wheel Optimisation for driver scheduling problems. It improves the original Squeaky Wheel Optimisation's effectiveness and execution speed by incorporating two additional steps of Selection and Mutation which implement evolution within a single solution. In the ISWO, a cycle of Analysis-Selection-Mutation-Prioritization-Construction continues until stopping conditions are reached. The Analysis step first computes the fitness of a current solution to identify troublesome components. The Selection step then discards these troublesome components probabilistically by using the fitness measure, and the Mutation step follows to further discard a small number of components at random. After the above steps, an input solution becomes partial and thus the resulting partial solution needs to be repaired. The repair is carried out by using the Prioritization step to first produce priorities that determine an order by which the following Construction step then schedul...
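    A schematic outline of the Analysis-Selection-Mutation-Prioritisation-Construction cycle described above, with hypothetical user-supplied callables (fitness_of, prioritise, construct) standing in for the problem-specific driver-scheduling logic; this is an illustrative skeleton, not the authors' code.

```python
import random

def iswo(initial_solution, fitness_of, prioritise, construct,
         iterations=50, mutation_rate=0.02):
    """Schematic ISWO-style cycle. `fitness_of`, `prioritise` and `construct`
    are hypothetical problem-specific callables supplied by the user."""
    solution = list(initial_solution)
    best, best_score = list(solution), sum(fitness_of(c) for c in solution)
    for _ in range(iterations):
        # Analysis: score each component of the current solution
        scores = {c: fitness_of(c) for c in solution}
        top = max(scores.values()) or 1
        # Selection: probabilistically discard troublesome (low-fitness) components
        kept = [c for c in solution if random.random() < scores[c] / top]
        # Mutation: additionally discard a small random fraction
        kept = [c for c in kept if random.random() > mutation_rate]
        # Prioritisation: order the surviving components for rebuilding
        order = prioritise(kept, scores)
        # Construction: repair the partial solution into a full one
        solution = construct(order)
        score = sum(fitness_of(c) for c in solution)
        if score > best_score:
            best, best_score = list(solution), score
    return best

# Toy usage: components are integers, even numbers are "good", solutions hold 6 items.
all_components = list(range(10))
toy_fitness = lambda c: 1.0 if c % 2 == 0 else 0.1
toy_prioritise = lambda kept, scores: sorted(kept, key=scores.get, reverse=True)
def toy_construct(order):
    missing = [c for c in all_components if c not in order]
    random.shuffle(missing)                     # greedy-random completion
    return (order + missing)[:6]
print(iswo(toy_construct([]), toy_fitness, toy_prioritise, toy_construct))
```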

  9. Buckling optimisation of sandwich cylindrical panels

    Science.gov (United States)

    Abouhamzeh, M.; Sadighi, M.

    2016-06-01

    In this paper, the buckling load optimisation is performed on sandwich cylindrical panels. A finite element program is developed in MATLAB to solve the governing differential equations of the global buckling of the structure. In order to find the optimal solution, the genetic algorithm Toolbox in MATLAB is implemented. Verifications are made for both the buckling finite element code and also the results from the genetic algorithm by comparisons to the results available in literature. Sandwich cylindrical panels are optimised for the buckling strength with isotropic or orthotropic cores with different boundary conditions. Results are presented in terms of stacking sequence of fibers in the face sheets and core to face sheet thickness ratio.

  10. Applying the Theory of Optimising Professional Life

    Directory of Open Access Journals (Sweden)

    Lesley Margaret Piko

    2014-12-01

    Full Text Available Glaser (2014) wrote that “the application of grounded theory (GT) is a relatively neglected topic” (p. 1) in the literature. Applying GT to purposely intervene and improve a situation is an important adjunct to our knowledge and understanding of GT. A recent workshop of family doctors and general practitioners provides a useful example. The theory of optimising professional life explains that doctors are concerned about sustainment in their career and, to resolve this concern, they implement solutions to optimise their personal situation. Sustainment is a new, overarching concept of three needs: the need for self-care to sustain well-being, the need for work interest to sustain motivation, and the need for income to sustain lifestyle. The objective of the workshop was to empower doctors to reinvent their careers using this theory. Working individually and in small groups, participants were able to analyse a problem and to identify potential solutions.

  11. Fermionic orbital optimisation in tensor network states

    CERN Document Server

    Krumnow, C; Eisert, J

    2015-01-01

    Tensor network states and specifically matrix-product states have proven to be a powerful tool for simulating ground states of strongly correlated spin models. Recently, they have also been applied to interacting fermionic problems, specifically in the context of quantum chemistry. A new freedom arising in such non-local fermionic systems is the choice of orbitals, it being far from clear what choice of fermionic orbitals to make. In this work, we propose a way to overcome this challenge. We suggest a method intertwining the optimisation over matrix product states with suitable fermionic Gaussian mode transformations, hence bringing the advantages of both approaches together. The described algorithm generalises basis changes in the spirit of the Hartree-Fock methods to matrix-product states, and provides a black box tool for basis optimisations in tensor network methods.

  12. Adaptive Java Optimisation using machine learning techniques

    OpenAIRE

    Long, Shun

    2004-01-01

    There is a continuing demand for higher performance, particularly in the area of scientific and engineering computation. In order to achieve high performance in the context of frequent hardware upgrading, software must be adaptable for portable performance. What is required is an optimising compiler that evolves and adapts itself to environmental change without sacrificing performance. Java has emerged as a dominant programming language widely used in a variety of application areas. Howeve...

  13. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base is over 6 million lines, organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of reads of requirements files that are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; introduction of package level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on the optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of
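    As an illustration of the package-level build parallelism described above (not CMT's actual implementation), the following Python sketch layers a small, hypothetical package dependency graph and builds each layer with a thread pool, analogous to launching one build command per processor.

```python
from concurrent.futures import ThreadPoolExecutor
import os

def build_levels(deps):
    """Group packages into levels so that every package's dependencies
    sit in earlier levels (Kahn-style layering of the dependency DAG)."""
    remaining = {pkg: set(d) for pkg, d in deps.items()}
    levels = []
    while remaining:
        ready = [pkg for pkg, d in remaining.items() if not d]
        if not ready:
            raise ValueError("dependency cycle detected")
        levels.append(ready)
        for pkg in ready:
            del remaining[pkg]
        for d in remaining.values():
            d.difference_update(ready)
    return levels

def build_package(pkg):
    # Placeholder for the real per-package build command
    print(f"building {pkg}")
    return pkg

# Hypothetical miniature dependency graph (package -> prerequisites)
deps = {"Core": set(), "Utils": set(), "Tracking": {"Core", "Utils"},
        "Calo": {"Core"}, "Reco": {"Tracking", "Calo"}}
workers = os.cpu_count() or 1        # analogous to one build job per processor
with ThreadPoolExecutor(max_workers=workers) as pool:
    for level in build_levels(deps):
        list(pool.map(build_package, level))   # packages in a level build in parallel
```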

  14. Optimised polarisation measurements on Bragg peaks

    Energy Technology Data Exchange (ETDEWEB)

    Lelievre-Berna, E. [Institut Laue Langevin, 6 rue Jules Horowitz, 38042 Grenoble Cedex 9 (France)]. E-mail: lelievre@ill.fr; Brown, P.J. [Institut Laue Langevin, 6 rue Jules Horowitz, 38042 Grenoble Cedex 9 (France); Tasset, F. [Institut Laue Langevin, 6 rue Jules Horowitz, 38042 Grenoble Cedex 9 (France)

    2007-07-15

    Experimentally, the asymmetry A (or the flipping ratio R) is deduced from the two count rates observed for the |+> and |-> neutron spin states. Since the count rates for the two spin states may be quite different and both need to be corrected for background, the optimum strategy for the measurement is important. We present here the theory for optimising the accuracy of the measurement of A (or R) within the constraint of a fixed total measuring time.
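    For reference, the standard definitions linking the asymmetry and the flipping ratio to the two spin-state count rates are shown below; the paper's actual optimisation of the counting-time allocation is not reproduced here.

```latex
% Standard definitions of the asymmetry A and flipping ratio R in terms of the
% background-corrected count rates for the two neutron spin states.
\begin{equation}
  A = \frac{I^{+} - I^{-}}{I^{+} + I^{-}}, \qquad
  R = \frac{I^{+}}{I^{-}}, \qquad
  A = \frac{R - 1}{R + 1},
\end{equation}
where $I^{+}$ and $I^{-}$ are the count rates observed for the $|+\rangle$ and
$|-\rangle$ spin states.
```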

  15. Cuckoo search optimisation for feature selection in cancer classification: a new approach.

    Science.gov (United States)

    Gunavathi, C; Premalatha, K

    2015-01-01

    The Cuckoo Search (CS) optimisation algorithm is used for feature selection in cancer classification using microarray gene expression data. Since gene expression data has thousands of genes and a small number of samples, feature selection methods can be used for the selection of informative genes to improve the classification accuracy. Initially, the genes are ranked based on T-statistics, Signal-to-Noise Ratio (SNR) and F-statistics values. The CS is used to find the informative genes from the top-m ranked genes. The classification accuracy of the k-Nearest Neighbour (kNN) technique is used as the fitness function for CS. The proposed method is tested and analysed with ten different cancer gene expression datasets. The results show that the CS gives 100% average accuracy for the DLBCL Harvard, Lung Michigan, Ovarian Cancer, AML-ALL and Lung Harvard2 datasets and that it outperforms the existing techniques on the DLBCL outcome and prostate datasets.
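    A simplified sketch of the filter-plus-wrapper idea above, assuming scikit-learn and SciPy are available: only the t-statistic ranking is shown, and a plain random-subset search over the top-m genes stands in for the Lévy-flight moves of Cuckoo Search, with cross-validated kNN accuracy as the fitness. It is illustrative only, not the authors' implementation.

```python
import numpy as np
from scipy.stats import ttest_ind
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def rank_genes_by_t(X, y, m=50):
    """Indices of the m genes with the largest |t| between the two classes."""
    t, _ = ttest_ind(X[y == 0], X[y == 1], axis=0, equal_var=False)
    return np.argsort(-np.abs(t))[:m]

def knn_fitness(X, y, genes, k=3):
    """Cross-validated kNN accuracy on the chosen gene subset (the CS fitness)."""
    return cross_val_score(KNeighborsClassifier(n_neighbors=k),
                           X[:, genes], y, cv=5).mean()

def select_genes(X, y, m=50, subset_size=10, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    top = rank_genes_by_t(X, y, m)
    best, best_fit = None, -np.inf
    for _ in range(iters):                      # stand-in for cuckoo/Lévy moves
        cand = rng.choice(top, size=subset_size, replace=False)
        fit = knn_fitness(X, y, cand)
        if fit > best_fit:
            best, best_fit = cand, fit
    return best, best_fit

# Toy data: 60 samples x 500 genes, the first 5 genes carry the class signal
rng = np.random.default_rng(1)
y = np.repeat([0, 1], 30)
X = rng.normal(size=(60, 500))
X[y == 1, :5] += 1.5
genes, acc = select_genes(X, y)
print(sorted(genes.tolist()), round(acc, 3))
```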

  16. Exploration of automatic optimisation for CUDA programming

    KAUST Repository

    Al-Mouhamed, Mayez

    2014-09-16

    Writing optimised compute unified device architecture (CUDA) programs for graphic processing units (GPUs) is complex even for experts. We present a design methodology for a restructuring tool that converts C-loops into optimised CUDA kernels based on a three-step algorithm: loop tiling, coalesced memory access and resource optimisation. A method for finding possible loop tiling solutions with coalesced memory access is developed and a simplified algorithm for restructuring C-loops into an efficient CUDA kernel is presented. In the evaluation, we implement matrix multiply (MM), matrix transpose (M-transpose), matrix scaling (M-scaling) and matrix vector multiply (MV) using the proposed algorithm. We present the analysis of the execution time and GPU throughput for the above applications, which compare favourably to other proposals. Evaluation is carried out while scaling the problem size and running under a variety of kernel configurations. The obtained speedup is about 28-35% for M-transpose compared to the NVIDIA Software Development Kit, 33% speedup for MV compared to the general purpose computation on graphics processing unit compiler, and more than 80% speedup for MM and M-scaling compared to CUDA-lite.

  17. Designing Lead Optimisation of MMP-12 Inhibitors

    Directory of Open Access Journals (Sweden)

    Matteo Borrotti

    2014-01-01

    Full Text Available The design of new molecules with desired properties is in general a very difficult problem, involving heavy experimentation with high investment of resources and possible negative impact on the environment. The standard approach consists of iteration among formulation, synthesis, and testing cycles, which is a very long and laborious process. In this paper we address the so-called lead optimisation process by developing a new strategy for designing experiments and modelling data, namely, the evolutionary model-based design for optimisation (EDO). This approach is developed on a very small set of experimental points, which change in relation to the response of the experimentation according to the principle of evolution and insights gained through statistical models. This new procedure is validated on a data set provided as a test environment by Pickett et al. (2011), and the results are analysed and compared to the genetic algorithm optimisation (GAO) as a benchmark. The very good performance of the EDO approach is shown in its capacity to uncover the optimum value using a very limited set of experimental points, avoiding unnecessary experimentation.

  18. Procedure for Application-Oriented Optimisation of Marine Propellers

    Directory of Open Access Journals (Sweden)

    Florian Vesting

    2016-11-01

    Full Text Available The use of automated optimisation in engineering applications is emerging. In particular, nature-inspired algorithms are frequently used because of their variability and robust application to constrained and multi-objective optimisation problems. The purpose of this paper is the comparison of four different algorithms and several optimisation strategies on a set of seven test propellers in a realistic industrial design setting. The propellers are picked from real commercial projects and the manual final designs were delivered to customers. The different approaches are evaluated and the final results of the automated optimisation toolbox are compared with designs generated in a manual design process. We identify a two-stage optimisation for marine propellers, where the geometry is first modified by parametrised geometry distribution curves to gather knowledge of the test case. Here we vary the optimisation strategy in terms of applied algorithms, constraints and objectives. A second supporting optimisation aims to improve the design by locally changing the geometry, based on the results of the first optimisation. The optimisation algorithms and strategies yield propeller designs that are comparable to the manually designed propeller blade geometries, thus being suitable as robust and advanced design support tools. The supporting optimisation, with local modification of the blade geometry and the proposed cavity shape constraints, features particularly good performance in modifying cavitation on the blade and is, with the AS NSGA-II (adaptive surrogate-assisted NSGA-II), superior in lead time.

  19. Railway vehicle performance optimisation using virtual homologation

    Science.gov (United States)

    Magalhães, H.; Madeira, J. F. A.; Ambrósio, J.; Pombo, J.

    2016-09-01

    Unlike regular automotive vehicles, which are designed to travel on different types of roads, railway vehicles travel mostly on the same route during their life cycle. To accept the operation of a railway vehicle in a particular network, a homologation process is required according to local standard regulations. In Europe, the standards EN 14363 and UIC 518, which are used for railway vehicle acceptance, require on-track tests and/or numerical simulations. An important advantage of using virtual homologation is the reduction of the high costs associated with on-track tests by studying the railway vehicle performance in different operating conditions. This work proposes a methodology for the improvement of railway vehicle design with the objective of its operation on selected railway tracks by using optimisation. The analyses required for the vehicle improvement are performed under the control of the optimisation method global and local optimisation using direct search. To quantify the performance of the vehicle, a new objective function is proposed, which includes: a Dynamic Performance Index, defined as a weighted sum of the indices obtained from the virtual homologation process; the non-compensated acceleration, which is related to the operational velocity; and a penalty associated with cases where the vehicle presents an unacceptable dynamic behaviour according to the standards. Thus, the optimisation process intends not only to improve the quality of the vehicle in terms of running safety and ride quality, but also to increase the vehicle availability via the reduction of the time for a journey while ensuring its operational acceptance under the standards. The design variables include the suspension characteristics and the operational velocity of the vehicle, which are allowed to vary within an acceptable range of variation. The results of the optimisation lead to a global minimum of the objective function in which the suspension characteristics of the vehicle are

  20. Automatic optimisation of gamma dose rate sensor networks: The DETECT Optimisation Tool

    DEFF Research Database (Denmark)

    Helle, K.B.; Müller, T.O.; Astrup, Poul;

    2014-01-01

    chosen using regular grids or according to administrative constraints. Nowadays, however, the choice can be based on more realistic risk assessment, as it is possible to simulate potential radioactive plumes. To support sensor planning, we developed the DETECT Optimisation Tool (DOT) within the scope...... monitoring network for early detection of radioactive plumes or for the creation of dose maps. The DOT is implemented as a stand-alone easy-to-use JAVA-based application with a graphical user interface and an R backend. Users can run evaluations and optimisations, and display, store and download the results...

  1. Real-time optimisation of the Hoa Binh reservoir, Vietnam

    DEFF Research Database (Denmark)

    Richaud, Bertrand; Madsen, Henrik; Rosbjerg, Dan

    2011-01-01

    Multi-purpose reservoirs often have to be managed according to conflicting objectives, which requires efficient tools for trading-off the objectives. This paper proposes a multi-objective simulation-optimisation approach that couples off-line rule curve optimisation with on-line real-time optimisation. First, the simulation-optimisation framework is applied for optimising reservoir operating rules. Secondly, real-time and forecast information is used for on-line optimisation that focuses on short-term goals, such as flood control or hydropower generation, without compromising the deviation...... of the forecast is addressed. The results illustrate the importance of a sufficient forecast lead time to start pre-releasing water in flood situations.

  2. Methods for Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte

    This thesis deals with the adaptation and implementation of various optimisation methods, in the field of experimental design, for the laser cutting process. The problem in optimising the laser cutting process has been defined and a structure for a Decision Support System (DSS...... for the optimisation of the laser cutting process has been suggested. The DSS consists of a database with the currently used and old parameter settings. Also one of the optimisation methods has been implemented in the DSS in order to facilitate the optimisation procedure for the laser operator. The Simplex Method has...... been adapted in two versions: a qualitative one that, by comparing the laser-cut items, optimises the process, and a quantitative one that uses a weighted quality response in order to achieve a satisfactory quality and after that maximises the cutting speed, thus increasing the productivity of the process...

  3. Design Optimisation and Control of a Pilot Operated Seat Valve

    DEFF Research Database (Denmark)

    Nielsen, Brian; Andersen, Torben Ole; Hansen, Michael Rygaard

    2004-01-01

    The paper gives an approach for optimisation of the bandwidth of a pilot operated seat valve for mobile applications. Physical dimensions as well as parameters of the implemented control loop are optimised simultaneously. The frequency response of the valve varies as a function of the pressure drop...... across the valve, and it is found to be necessary to scale the controller parameters in the optimised design as a function of pressure drop....

  4. Mechatronic System Design Based On An Optimisation Approach

    DEFF Research Database (Denmark)

    Andersen, Torben Ole; Pedersen, Henrik Clemmensen; Hansen, Michael Rygaard

    The envisaged objective of this project is to extend the current state of the art regarding the design of complex mechatronic systems utilizing an optimisation approach. We propose to investigate a novel framework for mechatronic system design, the novelty and originality being the use...... of optimisation techniques. The methods used to optimise/design within the classical disciplines will be identified and extended to mechatronic system design.

  5. Improving Vector Evaluated Particle Swarm Optimisation by incorporating nondominated solutions.

    Science.gov (United States)

    Lim, Kian Sheng; Ibrahim, Zuwairie; Buyamin, Salinda; Ahmad, Anita; Naim, Faradila; Ghazali, Kamarul Hawari; Mokhtar, Norrima

    2013-01-01

    The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles where their movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the best solution at the objective function optimised by that swarm, yielding poor solutions for the multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced by incorporating the nondominated solutions as the guidance for a swarm rather than using the best solution from another swarm. In this paper, the performance of the improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using performance measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved Vector Evaluated Particle Swarm Optimisation algorithm has impressive performance compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm.

  6. Robust Optimisation Approach for Vehicle Routing Problems with Uncertainty

    Directory of Open Access Journals (Sweden)

    Liang Sun

    2015-01-01

    Full Text Available We formulated a solution procedure for vehicle routing problems with uncertainty (VRPU for short with regard to future demand and transportation cost. Unlike E-SDROA (expectation semideviation robust optimisation approach for solving the proposed problem, the formulation focuses on robust optimisation considering situations possibly related to bidding and capital budgets. Besides, numerical experiments showed significant increments in the robustness of the solutions without much loss in solution quality. The differences and similarities of the robust optimisation model and existing robust optimisation approaches were also compared.

  7. Establishment and Optimisation of a Virus-Induced Gene Silencing System in Hydroponic Cotton

    Institute of Scientific and Technical Information of China (English)

    穆春; 周琳; 李茂营; 杜明伟; 张明才; 田晓莉; 李召虎

    2016-01-01

    Using GhCLA1 as a marker gene and plants of the cotton variety Guoxinmian 3 as material, we explored the effects of temperature, syringe-infiltration concentration and timing, cultivation pattern, and cotton variety on the efficiency of tobacco rattle virus (TRV)-induced gene silencing (VIGS) under hydroponic conditions. The results showed that higher silencing efficiency was obtained when infiltration was performed 3 to 5 days after emergence at an optimum growth temperature of 24℃, whereas the infiltration concentration did not affect silencing efficiency. Moreover, infiltrating pTRV-GFP as an empty-vector control eliminated the adverse effect of the inserted fragment on plant growth and reduced the control error. The silencing phenotype became visible earlier in hydroponic culture than in soil culture, significantly shortening the experimental period, and GhCLA1 could be silenced in all tested varieties (lines) under hydroponic conditions. In addition, using the hydroponic TRV-VIGS system, expression of GhCTR1 was successfully suppressed; the silenced cotton plants were severely dwarfed compared with controls, indicating that the TRV-VIGS system in hydroponic cotton can be applied widely in cotton research.

  8. Biorefinery plant design, engineering and process optimisation

    DEFF Research Database (Denmark)

    Holm-Nielsen, Jens Bo; Ehimen, Ehiazesebhor Augustine

    2014-01-01

    Before new biorefinery systems can be implemented, or the modification of existing single-product biomass processing units into biorefineries can be carried out, proper planning of the intended biorefinery scheme must be performed initially. This chapter outlines design and synthesis approaches...... applicable for the planning and upgrading of intended biorefinery systems, and includes discussions on the operation of an existing lignocellulosic-based biorefinery platform. Furthermore, technical considerations and tools (i.e., process analytical tools) which could be applied to optimise the operations......

  9. SIROCCO. Silent rotors by acoustic optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G.; Curvers, A. [ECN Wind Energy, Petten (Netherlands); Oerlemans, S. [National Aerospace Laboratory NLR, Amsterdam (Netherlands); Braun, K.; Lutz, T.; Herrig, A.; Wuerz, W. [University of Stuttgart, Stuttgart (Germany); Matesanz, A.; Garcillan, L. [Gamesa Eolica, Madrid (Spain); Fisher, M.; Koegler, K.; Maeder, T. [GE Wind Energy/GE Global Research (United States)

    2007-07-15

    In this paper the results from the European 5th Framework project 'SIROCCO' are described. The project started in January 2003 and will end in August 2007. The main aim of the SIROCCO project is to reduce wind-turbine aerodynamic noise significantly while maintaining the aerodynamic performance. This is achieved by designing new acoustically and aerodynamically optimised airfoils for the outer part of the blade. The project focussed primarily on reducing trailing edge noise, which was broadly believed to be the dominant noise mechanism of modern wind turbines.

  10. Optimising Antibiotic Usage to Treat Bacterial Infections

    Science.gov (United States)

    Paterson, Iona K.; Hoyle, Andy; Ochoa, Gabriela; Baker-Austin, Craig; Taylor, Nick G. H.

    2016-11-01

    The increase in antibiotic resistant bacteria poses a threat to the continued use of antibiotics to treat bacterial infections. The overuse and misuse of antibiotics has been identified as a significant driver in the emergence of resistance. Finding optimal treatment regimens is therefore critical in ensuring the prolonged effectiveness of these antibiotics. This study uses mathematical modelling to analyse the effect traditional treatment regimens have on the dynamics of a bacterial infection. Using a novel approach, a genetic algorithm, the study then identifies improved treatment regimens. Using a single antibiotic the genetic algorithm identifies regimens which minimise the amount of antibiotic used while maximising bacterial eradication. Although exact treatments are highly dependent on parameter values and initial bacterial load, a significant common trend is identified throughout the results. A treatment regimen consisting of a high initial dose followed by an extended tapering of doses is found to optimise the use of antibiotics. This consistently improves the success of eradicating infections, uses less antibiotic than traditional regimens and reduces the time to eradication. The use of genetic algorithms to optimise treatment regimens enables an extensive search of possible regimens, with previous regimens directing the search into regions of better performance.
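
    As a hedged illustration of the approach, the sketch below evolves a daily dose schedule against a toy logistic growth model with dose-proportional kill; the within-host model, parameter values and fitness weighting are invented stand-ins for the paper's mathematical model, and the genetic operators are generic.

        import numpy as np

        rng = np.random.default_rng(1)
        days, max_dose = 10, 4.0

        def bacterial_load(doses, b0=1e6, r=0.8, cap=1e9, kill=1.2):
            """Toy within-host model: logistic growth minus dose-proportional kill."""
            b = b0
            for d in doses:
                b = max(b + r * b * (1 - b / cap) - kill * d * b, 0.0)
            return b

        def fitness(doses):
            eradicated = bacterial_load(doses) < 1.0
            return -np.sum(doses) - (0.0 if eradicated else 1e3)   # least drug, must eradicate

        pop = rng.uniform(0, max_dose, (60, days))
        for gen in range(200):
            scores = np.array([fitness(ind) for ind in pop])
            # tournament selection of parents
            parents = pop[[max(rng.integers(0, len(pop), 3), key=lambda i: scores[i])
                           for _ in range(len(pop))]]
            # uniform crossover with the neighbouring parent, then Gaussian mutation
            mask = rng.random(pop.shape) < 0.5
            children = np.where(mask, parents, np.roll(parents, 1, axis=0))
            pop = np.clip(children + rng.normal(0, 0.2, pop.shape), 0, max_dose)

        best = max(pop, key=fitness)
        print("doses:", np.round(best, 2), " total antibiotic:", round(float(np.sum(best)), 2))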

  11. Ant Colony Optimisation for Backward Production Scheduling

    Directory of Open Access Journals (Sweden)

    Leandro Pereira dos Santos

    2012-01-01

    Full Text Available The main objective of a production scheduling system is to assign tasks (orders or jobs) to resources and sequence them as efficiently and economically (i.e., in an optimised manner) as possible. Achieving this goal is a difficult task in complex environments where capacity is usually limited. In these scenarios, finding an optimal solution, if possible at all, demands a large amount of computer time. For this reason, in many cases a good solution that is quickly found is preferred. In such situations, the use of metaheuristics is an appropriate strategy. Over the last two decades, some off-the-shelf systems have been developed using such techniques. This paper presents and analyses the development of a shop-floor scheduling system that uses ant colony optimisation (ACO) in a backward scheduling problem in a manufacturing scenario with single-stage processing, parallel resources, and flexible routings. This scenario was found in a large food-industry company where the corresponding author worked as a consultant for more than a year. This work demonstrates the applicability of this artificial intelligence technique. In fact, ACO proved to be as efficient as branch-and-bound while executing much faster.
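
    To make the technique concrete, here is a generic ant colony optimisation sketch for sequencing jobs against due dates (pheromone on job/position pairs, evaporation, best-so-far reinforcement). It is not the consultancy system described above; the job data and ACO constants are invented, and the heuristic visibility term is omitted for brevity.

        import numpy as np

        rng = np.random.default_rng(2)
        n_jobs = 8
        proc = rng.uniform(1, 5, n_jobs)            # processing times (hypothetical)
        due = np.sort(rng.uniform(5, 25, n_jobs))   # due dates (hypothetical)

        def total_tardiness(seq):
            t, tard = 0.0, 0.0
            for j in seq:
                t += proc[j]
                tard += max(0.0, t - due[j])
            return tard

        tau = np.ones((n_jobs, n_jobs))             # pheromone: job j placed at position p
        best_seq, best_cost = None, np.inf
        for it in range(200):
            for ant in range(20):
                remaining, seq = list(range(n_jobs)), []
                for p in range(n_jobs):
                    weights = np.array([tau[j, p] for j in remaining])
                    j = remaining.pop(rng.choice(len(remaining), p=weights / weights.sum()))
                    seq.append(j)
                cost = total_tardiness(seq)
                if cost < best_cost:
                    best_seq, best_cost = seq, cost
            tau *= 0.9                              # evaporation
            for p, j in enumerate(best_seq):        # reinforce the best-so-far sequence
                tau[j, p] += 1.0 / (1.0 + best_cost)

        print("sequence:", best_seq, " total tardiness:", round(best_cost, 2))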

  12. Noise aspects at aerodynamic blade optimisation projects

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G. [Netherlands Energy Research Foundation, Petten (Netherlands)

    1997-12-31

    This paper shows an example of an aerodynamic blade optimisation using the program PVOPT. PVOPT calculates the optimal wind turbine blade geometry such that the maximum energy yield is obtained. Using the aerodynamically optimal blade design as a basis, the possibilities for noise reduction are investigated. The aerodynamically optimised geometry from PVOPT is the `real` optimum (up to the latest decimal). The most important conclusion from this study is that it is worthwhile to investigate the behaviour of the objective function (in the present case the energy yield) around the optimum: if the optimum is flat, there is a possibility to apply modifications to the optimum configuration with only a limited loss in energy yield. It is obvious that the modified configurations emit a different (and possibly lower) noise level. In the BLADOPT program (the successor of PVOPT) it will be possible to quantify the noise level and hence to assess the reduced noise emission more thoroughly. At present the most promising approaches for noise reduction are believed to be a reduction of the rotor speed (if at all possible), and a reduction of the tip angle by means of low-lift profiles or decreased twist at the outboard stations. These modifications were possible without a significant loss in energy yield. (LN)

  13. Multi-objective evolutionary optimisation for product design and manufacturing

    CERN Document Server

    2011-01-01

    Presents state-of-the-art research in the area of multi-objective evolutionary optimisation for integrated product design and manufacturing Provides a comprehensive review of the literature Gives in-depth descriptions of recently developed innovative and novel methodologies, algorithms and systems in the area of modelling, simulation and optimisation

  14. Optimisation of GnRH antagonist use in ART

    NARCIS (Netherlands)

    Hamdine, O.

    2014-01-01

    This thesis focuses on the optimisation of controlled ovarian stimulation for IVF using exogenous FSH and GnRH antagonist co-treatment, by studying the timing of the initiation of GnRH antagonist co-medication and the role of ovarian reserve markers in optimising ovarian response and reproductive outcome

  15. Aerodynamic shape parameterisation and optimisation of novel configurations

    NARCIS (Netherlands)

    Straathof, M.H.; Van Tooren, M.J.L.; Voskuijl, M.; Koren, B.

    2008-01-01

    The Multi-Disciplinary Design Optimisation (MDO) process can be supported by partial automation of analysis and optimisation steps. Design and Engineering Engines (DEE) are useful concepts to structure this type of automation. Within the DEE, a product can be parametrically defined using Knowledge

  16. DACIA LOGAN LIVE AXLE OPTIMISATION USING COMPUTER GRAPHICS

    Directory of Open Access Journals (Sweden)

    KIRALY Andrei

    2017-05-01

    Full Text Available The paper presents some contributions to the calculation and optimisation of a live axle used on the Dacia Logan, using computer graphics software to create the model and afterwards FEA evaluation to determine the effectiveness of the optimisation. Thus, using specialised computer software, a simulation was made and the results were compared to measurements on the real prototype.

  17. There, and Back Again Quantum Theory and Global Optimisation

    CERN Document Server

    Audenaert, K M R

    2004-01-01

    We consider a problem in quantum theory that can be formulated as an optimisation problem and present a global optimisation algorithm for solving it, the foundation of which relies in turn on a theorem from quantum theory. To wit, we consider the maximal output purity...

  18. Kriging based robust optimisation algorithm for minimax problems in electromagnetics

    Directory of Open Access Journals (Sweden)

    Li Yinjiang

    2016-12-01

    Full Text Available The paper discusses some of the recent advances in kriging based worst-case design optimisation and proposes a new two-stage approach to solve practical problems. The efficiency of the infill points allocation is improved significantly by adding an extra layer of optimisation enhanced by a validation process.

  19. GAOS: Spatial optimisation of crop and nature within agricultural fields

    NARCIS (Netherlands)

    Bruin, de S.; Janssen, H.; Klompe, A.; Lerink, P.; Vanmeulebrouk, B.

    2010-01-01

    This paper proposes and demonstrates a spatial optimiser that allocates areas of inefficient machine manoeuvring to field margins thus improving the use of available space and supporting map-based Controlled Traffic Farming. A prototype web service (GAOS) allows farmers to optimise tracks within the

  20. Parameter Optimisation for the Behaviour of Elastic Models over Time

    DEFF Research Database (Denmark)

    Mosegaard, Jesper

    2004-01-01

    Optimisation of parameters for elastic models is essential for comparison or finding equivalent behaviour of elastic models when parameters cannot simply be transferred or converted. This is the case with a large range of commonly used elastic models. In this paper we present a general method...... that will optimise parameters based on the behaviour of the elastic models over time....

  1. Multi-criterion scantling optimisation of cruise ships

    OpenAIRE

    2010-01-01

    A numerical tool for the optimisation of the scantlings of a ship is extended by considering production cost, weight and moment of iner tia in the objective function. A multi-criteria optimisation of a passenger ship is conducted to illustrate the analysis process. Pareto frontiers are obtained and results are verified with Bureau Veritas rules.

  2. Modified cuckoo search: A new gradient free optimisation algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Walton, S., E-mail: 512465@swansea.ac.uk [College of Engineering, Swansea University, Swansea SA2 8PP, Wales (United Kingdom); Hassan, O.; Morgan, K.; Brown, M.R. [College of Engineering, Swansea University, Swansea SA2 8PP, Wales (United Kingdom)

    2011-09-15

    Highlights: > Modified cuckoo search (MCS) is a new gradient free optimisation algorithm. > MCS shows a high convergence rate, able to outperform other optimisers. > MCS is particularly strong at high dimension objective functions. > MCS performs well when applied to engineering problems. - Abstract: A new robust optimisation algorithm, which can be regarded as a modification of the recently developed cuckoo search, is presented. The modification involves the addition of information exchange between the top eggs, or the best solutions. Standard optimisation benchmarking functions are used to test the effects of these modifications and it is demonstrated that, in most cases, the modified cuckoo search performs as well as, or better than, the standard cuckoo search, a particle swarm optimiser, and a differential evolution strategy. In particular the modified cuckoo search shows a high convergence rate to the true global minimum even at high numbers of dimensions.
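
    The two modifications are easy to show in a sketch: the worst nests are replaced by Lévy flights whose step shrinks with the generation count, and pairs of top eggs exchange information by stepping from the worse towards the better by a golden-ratio fraction. The benchmark function, population sizes and constants are illustrative assumptions, not the exact scheme benchmarked in the paper.

        import numpy as np
        from math import gamma, sin, pi

        rng = np.random.default_rng(3)
        phi = (1 + np.sqrt(5)) / 2                 # golden ratio used for the exchange step

        def sphere(x):                             # illustrative benchmark objective
            return float(np.sum(x ** 2))

        def levy_step(dim, beta=1.5):
            """Levy-distributed step via Mantegna's algorithm."""
            sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
                     (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
            return rng.normal(0, sigma, dim) / np.abs(rng.normal(0, 1, dim)) ** (1 / beta)

        dim, n_nests, top_frac = 10, 25, 0.25
        nests = rng.uniform(-5, 5, (n_nests, dim))
        for gen in range(1, 301):
            nests = nests[np.argsort([sphere(x) for x in nests])]
            n_top = max(2, int(top_frac * n_nests))
            # (1) replace the worst nests by Levy flights around random top nests,
            #     with a step size that shrinks as the search progresses
            for i in range(n_top, n_nests):
                nests[i] = nests[rng.integers(n_top)] + levy_step(dim) / np.sqrt(gen)
            # (2) information exchange between top eggs: move from the worse of a
            #     random top pair towards the better one by a golden-ratio fraction
            for _ in range(n_top):
                a, b = rng.choice(n_top, 2, replace=False)
                better, worse = (a, b) if sphere(nests[a]) < sphere(nests[b]) else (b, a)
                candidate = nests[worse] + (nests[better] - nests[worse]) / phi
                i = rng.integers(n_nests)
                if sphere(candidate) < sphere(nests[i]):
                    nests[i] = candidate

        print("best objective value:", min(sphere(x) for x in nests))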

  3. Optimisation of a Crossdocking Distribution Centre Simulation Model

    CERN Document Server

    Adewunmi, Adrian

    2010-01-01

    This paper reports on continuing research into the modelling of an order picking process within a Crossdocking distribution centre using Simulation Optimisation. The aim of this project is to optimise a discrete event simulation model and to understand factors that affect finding its optimal performance. Our initial investigation revealed that the precision of the selected simulation output performance measure and the number of replications required for the evaluation of the optimisation objective function through simulation influences the ability of the optimisation technique. We experimented with Common Random Numbers, in order to improve the precision of our simulation output performance measure, and intended to use the number of replications utilised for this purpose as the initial number of replications for the optimisation of our Crossdocking distribution centre simulation model. Our results demonstrate that we can improve the precision of our selected simulation output performance measure value using C...

  4. Open Cut resource Optimisation as applied to coal

    Institute of Scientific and Technical Information of China (English)

    Martin L.Smith

    2007-01-01

    Pit optimisation is the earliest and most established application of its kind in the minerals industry, but this has been primarily driven by metal, not coal. Coal has the same financial drivers for resource optimisation as does the metalliferous industry, yet pit optimisation is not common practice. Why? The following discussion presents the basics of pit optimisation as they relate to coal and illustrates how a technology developed for massive deposits is not suitable for thin, multi-seam deposits where mine planning is often driven more by product quality than by value drivers such as Net Present Value. An alternative methodology is presented that takes advantage of the data structure of bedded deposits to optimise resource recovery in terms of a production schedule that meets constraints on coal quality.

  5. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2014-01-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of re...

  6. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2013-01-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of re...

  7. Specification, Verification and Optimisation of Business Processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas

    This thesis develops a unified framework wherein to specify, verify and optimise stochastic business processes. This framework provides for the modelling of business processes via a mathematical structure which captures business processes as a series of connected activities. This structure...... Model and Notation (BPMN). The automated analysis of business processes is done by means of quantitative probabilistic model checking, which allows verification of validation and performance properties through use of an algorithm for the translation of business process models into a format amenable...... to model checking. This allows for a rich set of both qualitative and quantitative properties of a business process to be precisely determined in an automated fashion directly from the model of the business process. A number of advanced applications of this framework are presented which allow for automated......

  8. FEM Optimisation of Spark Plasma Sintering Furnace

    CERN Document Server

    Kellari, Demetrios Vasili

    2013-01-01

    Coupled electro-thermal FEM analysis has been carried out on a sintering furnace used to produce new materials for LHC collimators. The analysis showed there exist margins for improvement of the current process and equipment through minor changes. To optimise the design of the furnace several design changes have been proposed including: optimization of material selection using copper cooling plates, control of convection in cooling plates by lowering the water flow rate, modifying the electrode shape using unsymmetrical electrodes and upgrading the thermal shielding to make use of multilayer graphite shields. The results show that we have a significant improvement in temperature gradient on the plate, from 453 to 258 °C and a reduction in power requirement from 62 to 44 kW.

  9. Niobium Cavity Electropolishing Modelling and Optimisation

    CERN Document Server

    Ferreira, L M A; Forel, S; Shirra, J A

    2013-01-01

    It’s widely accepted that electropolishing (EP) is the most suitable surface finishing process to achieve high performance bulk Nb accelerating cavities. At CERN and in preparation for the processing of the 704 MHz high-beta Superconducting Proton Linac (SPL) cavities a new vertical electropolishing facility has been assembled and a study is on-going for the modelling of electropolishing on cavities with COMSOL® software. In a first phase, the electrochemical parameters were taken into account for a fixed process temperature and flow rate, and are presented in this poster as well as the results obtained on a real SPL single cell cavity. The procedure to acquire the data used as input for the simulation is presented. The modelling procedure adopted to optimise the cathode geometry, aimed at a uniform current density distribution in the cavity cell for the minimum working potential and total current, is explained. Some preliminary results on fluid dynamics are also briefly described.

  10. Improving and optimising road pricing in Copenhagen

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker; Larsen, Marie Karen

    2008-01-01

    The question whether to introduce toll rings or road pricing in Copenhagen has been discussed intensively during the last 10 years. The main results of previous analyses are that none of the systems would make a positive contribution at present, when considered from a socio-economic view. Even...... though quite a number of proposed charging systems have been examined, only a few pricing strategies have been investigated. This paper deals with the optimisation of different designs for a road pricing system in the Greater Copenhagen area with respect to temporal and spatial differentiation...... of the pricing levels. A detailed transport model was used to describe the demand effects. The model was based on data from a real test of road pricing on 500 car drivers. The paper compares the price systems with regard to traffic effects and generalised costs for users and society. It is shown how important...

  11. A code for optimising triplet layout

    CERN Document Server

    Van Riesen-Haupt, Leon; Abelleira, Jose; Cruz Alaniz, Emilia

    2017-01-01

    One of the main challenges when designing final focus systems of particle accelerators is maximising the beam stay-clear in the strong quadrupole magnets of the inner triplet. Moreover, it is desirable to keep the quadrupoles in the inner triplet as short as possible, for space and cost reasons but also to reduce chromaticity and simplify correction schemes. An algorithm that explores the triplet parameter space to optimise both these aspects was written. It uses thin lenses as a first approximation for a broad parameter scan and MADX for more precise calculations. The thin-lens algorithm is significantly faster than a full scan using MADX and relatively precise at indicating the approximate area where the optimum solution lies.

  12. Optimising Signalised Intersection Using Wireless Vehicle Detectors

    DEFF Research Database (Denmark)

    Adjin, Daniel Michael Okwabi; Torkudzor, Moses; Asare, Jack

    Traffic congestion on roads wastes travel time. In this paper, we developed a vehicular traffic model to optimise a signalised intersection in Accra using wireless vehicle detectors. The traffic volumes gathered were extrapolated to cover 2011 and 2016 and were analysed to obtain the peak-hour traffic...... volume causing congestion. The intersection was modelled and simulated in Synchro7 as an actuated signalised model using results from the analysed data. The model for the morning peak periods gave optimal cycle lengths of 100 s and 150 s with corresponding intersection delays of 48.9 s and 90.6 s in 2011 and 2016...... respectively, while that for the evening was 55 s, giving delays of 14.2 s and 16.3 s respectively. It is shown that the model will improve traffic flow at the intersection....

  13. Simulation and optimisation of turbulence in stellarators

    Energy Technology Data Exchange (ETDEWEB)

    Xanthopoulos, Pavlos; Helander, Per; Turkin, Yuriy; Plunk, Gabriel G.; Bird, Thomas; Proll, Josefine H.E. [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Wendelsteinstr. 1, 17491 Greifswald (Germany); Mynick, Harry [Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08543 (United States); Jenko, Frank; Goerler, Tobias; Told, Daniel [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstr. 2, 85748 Garching (Germany)

    2014-07-01

    In tokamaks and stellarators - two leading types of devices used in fusion research - magnetic field lines trace out toroidal surfaces on which the plasma density and temperature are constant, but turbulent fluctuations carry energy across these surfaces to the wall, thus degrading the plasma confinement. Using petaflop-scale simulations, we calculate for the first time the pattern of turbulent structures forming on stellarator magnetic surfaces, and find striking differences relative to tokamaks. The observed sensitivity of the turbulence to the magnetic geometry suggests that there is room for further confinement improvement, in addition to measures already taken to minimise the laminar transport. With an eye towards fully optimised stellarators, we present a proof-of-principle configuration with substantially reduced turbulence compared to an existing design.

  14. Optimisation of Multilayer Insulation an Engineering Approach

    CERN Document Server

    Chorowski, M; Parente, C; Riddone, G

    2001-01-01

    A mathematical model has been developed to describe the heat flux through multilayer insulation (MLI). The total heat flux between the layers is the result of three distinct heat transfer modes: radiation, residual gas conduction and solid spacer conduction. The model describes the MLI behaviour using a layer-to-layer approach and is based on an electrical analogy, in which the three heat transfer modes are treated as parallel thermal impedances. The value of each transfer mode varies from layer to layer, although the total heat flux remains constant across the whole MLI blanket. The model enables the optimisation of the insulation with regard to different MLI parameters, such as residual gas pressure, number of layers and boundary temperatures. The model has been tested against experimental measurements carried out at CERN and the results proved to be in good agreement, especially for insulation vacuum between 10^-5 Pa and 10^-3 Pa.
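
    The electrical analogy translates directly into a small calculation: for each pair of neighbouring layers, radiation, residual-gas conduction and spacer conduction act as parallel heat-flow paths and their contributions are summed; the layer temperatures then settle so that the total flux is the same across every gap. The correlations and coefficients below are simplified textbook-style placeholders, not the calibrated CERN model.

        import numpy as np

        SIGMA = 5.670e-8   # Stefan-Boltzmann constant [W m-2 K-4]

        def layer_heat_flux(t_hot, t_cold, emissivity=0.04, p_gas=1e-4, k_spacer=2e-4):
            """Heat flux [W/m2] between two neighbouring MLI layers, modelled as
            three parallel paths (simplified placeholder correlations)."""
            e_eff = emissivity / (2 - emissivity)                  # two grey parallel surfaces
            q_rad = e_eff * SIGMA * (t_hot ** 4 - t_cold ** 4)     # radiation
            q_gas = 1.2 * p_gas * (t_hot - t_cold)                 # free-molecular gas conduction
            q_spacer = k_spacer * (t_hot - t_cold)                 # spacer conduction
            return q_rad + q_gas + q_spacer

        def mli_heat_flux(n_layers=30, t_warm=300.0, t_cold=77.0, iters=2000):
            """Relax the interior layer temperatures until the flux is uniform."""
            temps = np.linspace(t_warm, t_cold, n_layers + 2)
            for _ in range(iters):
                q = np.array([layer_heat_flux(temps[i], temps[i + 1])
                              for i in range(n_layers + 1)])
                temps[1:-1] += 0.05 * (q[:-1] - q[1:])   # warm up layers receiving excess flux
            return q.mean()

        for n in (10, 30, 60):                            # effect of the number of layers
            print(n, "layers:", round(mli_heat_flux(n_layers=n), 3), "W/m2")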

  15. Common mode chokes and optimisation aspects

    Science.gov (United States)

    Kut, T.; Lücken, A.; Dickmann, S.; Schulz, D.

    2014-11-01

    Due to the increasing electrification of modern aircraft, as a result of the More Electric Aircraft concept, new strategies and approaches are required to fulfil the strict EMC aircraft standards (DO-160/ED-14-Sec. 20). Common mode chokes are a key component of electromagnetic filters and are often oversized because of the unknown impedance of the surrounding power electronic system. This oversizing results in an increase in weight and volume, which has to be avoided as far as possible for mobile applications. In this context, an advanced method is presented to measure these impedances under operating conditions. Furthermore, the different parameters of the inductance design are explained and an optimisation for weight and volume is introduced.

  16. Profile control studies for JET optimised shear regime

    Energy Technology Data Exchange (ETDEWEB)

    Litaudon, X.; Becoulet, A.; Eriksson, L.G.; Fuchs, V.; Huysmans, G.; How, J.; Moreau, D.; Rochard, F.; Tresset, G.; Zwingmann, W. [Association Euratom-CEA, CEA/Cadarache, Dept. de Recherches sur la Fusion Controlee, DRFC, 13 - Saint-Paul-lez-Durance (France); Bayetti, P.; Joffrin, E.; Maget, P.; Mayorat, M.L.; Mazon, D.; Sarazin, Y. [JET Abingdon, Oxfordshire (United Kingdom); Voitsekhovitch, I. [Universite de Provence, LPIIM, Aix-Marseille 1, 13 (France)

    2000-03-01

    This report summarises the profile control studies, i.e. preparation and analysis of JET Optimised Shear plasmas, carried out during the year 1999 within the framework of the Task-Agreement (RF/CEA/02) between JET and the Association Euratom-CEA/Cadarache. We report on our participation in the preparation of the JET Optimised Shear experiments together with their comprehensive analyses and the modelling. Emphasis is put on the various aspects of pressure profile control (core and edge pressure) together with detailed studies of current profile control by non-inductive means, in the prospects of achieving steady, high performance, Optimised Shear plasmas. (authors)

  17. Application of optimisation techniques in groundwater quantity and quality management

    Indian Academy of Sciences (India)

    Amlan Das; Bithin Datta

    2001-08-01

    This paper presents the state-of-the-art on application of optimisation techniques in groundwater quality and quantity management. In order to solve optimisation-based groundwater management models, researchers have used various mathematical programming techniques such as linear programming (LP), nonlinear programming (NLP), mixed-integer programming (MIP), optimal control theory-based mathematical programming, differential dynamic programming (DDP), stochastic programming (SP), combinatorial optimisation (CO), and multiple objective programming for multipurpose management. Studies reported in the literature on the application of these methods are reviewed in this paper.
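
    As a minimal illustration of the LP formulations reviewed here, the sketch below minimises the pumping cost of three wells subject to a demand target and drawdown limits expressed through a response matrix; every coefficient is invented for illustration.

        import numpy as np
        from scipy.optimize import linprog

        # decision variables: pumping rates q1..q3 [m3/day] from three wells
        cost = np.array([1.0, 1.2, 0.8])             # relative pumping cost per m3 (hypothetical)

        # demand: q1 + q2 + q3 >= 5000  ->  -(q1 + q2 + q3) <= -5000
        A_ub = [[-1.0, -1.0, -1.0]]
        b_ub = [-5000.0]

        # drawdown limits at two observation points, via hypothetical response
        # coefficients [m of drawdown per (m3/day) pumped]
        response = np.array([[0.0010, 0.0004, 0.0002],
                             [0.0003, 0.0009, 0.0005]])
        A_ub += response.tolist()
        b_ub += [3.0, 2.5]

        bounds = [(0, 2500)] * 3                      # capacity of each well

        res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        print("pumping rates:", np.round(res.x, 1), " cost:", round(res.fun, 1))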

  18. Genetic Algorithm Optimisation of a Ship Navigation System

    Directory of Open Access Journals (Sweden)

    E. Alfaro-Cid

    2001-01-01

    Full Text Available The optimisation of the PID controllers' gains for separate propulsion and heading control systems of CyberShip I, a scale model of an oil platform supply ship, using Genetic Algorithms is considered. During the initial design process both PID controllers have been manually tuned to improve their performance. However this tuning approach is a tedious and time consuming process. A solution to this problem is the use of optimisation techniques based on Genetic Algorithms to optimise the controllers' gain values. This investigation has been carried out through computer-generated simulations based on a non-linear hydrodynamic model of CyberShip I.
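
    The idea is easy to sketch: a genetic algorithm searches the PID gain space, and each candidate is scored by simulating a step response and accumulating a time-weighted error. The crude first-order yaw model, gain ranges and GA operators below are illustrative stand-ins for the non-linear CyberShip I model used in the paper.

        import numpy as np

        rng = np.random.default_rng(4)

        def heading_itae(gains, dt=0.1, t_end=60.0, setpoint=np.deg2rad(20)):
            """Integral of time-weighted absolute heading error for a PID controller
            acting on a crude first-order yaw model (placeholder plant)."""
            kp, ki, kd = gains
            psi = r = integ = itae = 0.0
            prev_err = setpoint
            for k in range(int(t_end / dt)):
                err = setpoint - psi
                integ += err * dt
                deriv = (err - prev_err) / dt
                prev_err = err
                rudder = np.clip(kp * err + ki * integ + kd * deriv, -0.6, 0.6)
                r += dt * (-r + 0.2 * rudder) / 8.0       # yaw-rate dynamics (placeholder)
                psi += dt * r
                itae += (k * dt) * abs(err) * dt
            return itae

        pop = rng.uniform(0, 1, (40, 3)) * np.array([20.0, 2.0, 40.0])    # Kp, Ki, Kd ranges
        for gen in range(60):
            cost = np.array([heading_itae(g) for g in pop])
            elite = pop[np.argsort(cost)[:10]]                            # keep the 10 best
            children = elite[rng.integers(0, 10, (30, 3)), np.arange(3)]  # gene-wise recombination
            children *= rng.normal(1.0, 0.1, children.shape)              # multiplicative mutation
            pop = np.vstack([elite, np.abs(children)])

        best = pop[np.argmin([heading_itae(g) for g in pop])]
        print("Kp, Ki, Kd =", np.round(best, 3))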

  19. Design optimisation of a flywheel hybrid vehicle

    Energy Technology Data Exchange (ETDEWEB)

    Kok, D.B.

    1999-11-04

    This thesis describes the design optimisation of a flywheel hybrid vehicle with respect to fuel consumption and exhaust gas emissions. The driveline of this passenger car uses two power sources: a small spark ignition internal combustion engine with three-way catalyst, and a highspeed flywheel system for kinetic energy storage. A custom-made continuously variable transmission (CVT) with so-called i{sup 2} control transports energy between these power sources and the vehicle wheels. The driveline includes auxiliary systems for hydraulic, vacuum and electric purposes. In this fully mechanical driveline, parasitic energy losses determine the vehicle's fuel saving potential to a large extent. Practicable energy loss models have been derived to quantify friction losses in bearings, gearwheels, the CVT, clutches and dynamic seals. In addition, the aerodynamic drag in the flywheel system and power consumption of auxiliaries are charted. With the energy loss models available, a calculation procedure is introduced to optimise the flywheel as a subsystem in which the rotor geometry, the safety containment, and the vacuum system are designed for minimum energy use within the context of automotive applications. A first prototype of the flywheel system was tested experimentally and subsequently redesigned to improve rotordynamics and safety aspects. Coast-down experiments with the improved version show that the energy losses have been lowered significantly. The use of a kinetic energy storage device enables the uncoupling of vehicle wheel power and engine power. Therefore, the engine can be smaller and it can be chosen to operate in its region of best efficiency in start-stop mode. On a test-rig, the measured engine fuel consumption was reduced with more than 30 percent when the engine is intermittently restarted with the aid of the flywheel system. Although the start-stop mode proves to be advantageous for fuel consumption, exhaust gas emissions increase temporarily

  20. Operational optimisation of water supply networks using a fuzzy ...

    African Journals Online (AJOL)

    Operational optimisation of water supply networks using a fuzzy system. ... This paper presents a fuzzy system to control the pressure in a water distribution network, by using valves and controlling the rotor speed of the ... Article Metrics.

  1. Exploring RSSI Dependency on Height in UHF for throughput optimisation

    CSIR Research Space (South Africa)

    Maliwatu, R

    2016-11-01

    Full Text Available International Conference on Advances in Computing & Communication Engineering (ICACCE), 28-29 November 2016, Durban, South Africa.

  2. Efficient topology optimisation of multiscale and multiphysics problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    The aim of this Thesis is to present efficient methods for optimising high-resolution problems of a multiscale and multiphysics nature. The Thesis consists of two parts: one treating topology optimisation of microstructural details and the other treating topology optimisation of conjugate heat...... transfer problems. Part I begins with an introduction to the concept of microstructural details in the context of topology optimisation. Relevant literature is briefly reviewed and problems with existing methodologies are identified. The proposed methodology and its strengths are summarised. Details...... the computational cost of treating structures with fully-resolved microstructural details. The methodology is further applied to examples, where it is shown that it ensures connectivity of the microstructural details and that forced periodicity of the microstructural details can yield an implicit robustness to load...

  3. Share-of-Surplus Product Line Optimisation with Price Levels

    Directory of Open Access Journals (Sweden)

    X. G. Luo

    2014-01-01

    Full Text Available Kraus and Yano (2003) established the share-of-surplus product line optimisation model and developed a heuristic procedure for this nonlinear mixed-integer optimisation model. In their model, the price of a product is defined as a continuous decision variable. However, because product line optimisation is a planning process in the early stage of product development, pricing decisions usually are not very precise. In this research, a nonlinear integer programming share-of-surplus product line optimisation model that allows the selection of candidate price levels for products is established. The model is further transformed into an equivalent linear mixed-integer optimisation model by applying linearisation techniques. Experimental results in different market scenarios show that the computation time of the transformed model is much less than that of the original model.

  4. Open Pit Optimisation and Design: A Stepwise Approach*

    African Journals Online (AJOL)

    Michael

    2015-12-02

    Dec 2, 2015 ... holes were used for the analysis. ... retrieval and analysis, using Surpac software. .... economic and technical parameters were used to produce a set of nested pits. Fig. 4 depicts a summarised flow chart for the pit optimisation.

  5. Optimisation of patient protection and image quality in diagnostic ...

    African Journals Online (AJOL)

    Optimisation of patient protection and image quality in diagnostic radiology. ... The study leads to the introduction of the concept of plan- do-check-act on QC results ... (QA) programme and continues to collect data for establishment of DRL's.

  6. A Comparison of Existing Optimisation Techniques with the ...

    African Journals Online (AJOL)

    ... Existing Optimisation Techniques with the Univariate Marginal Distribution Algorithm ... graph colouring, neural networks, genetic algorithms and tabu search. ... optimization techniques and we show how the proposed algorithm performs in ...

  7. Elliptical Antenna Array Synthesis Using Backtracking Search Optimisation Algorithm

    Directory of Open Access Journals (Sweden)

    Kerim Guney

    2016-04-01

    Full Text Available The design of elliptical antenna arrays is a relatively new research area in the antenna array community. The backtracking search optimisation algorithm (BSA) is employed for the synthesis of elliptical antenna arrays having different numbers of array elements. For this aim, BSA is used to calculate the optimum angular position and amplitude values of the array elements. BSA is a population-based iterative evolutionary algorithm. The remarkable properties of BSA are that it has good optimisation performance, a simple implementation structure, and few control parameters. The results of BSA are compared with those of the self-adaptive differential evolution algorithm, firefly algorithm, biogeography-based optimisation algorithm, and genetic algorithm. The results show that BSA can reach better solutions than the compared optimisation algorithms. The iterative performance of BSA is also compared with that of the bacterial foraging algorithm and the differential search algorithm.
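
    The core loop of BSA is compact enough to sketch: a shuffled historical population supplies the search direction, a random subset of dimensions is crossed over, and a greedy selection keeps improvements. The sketch below applies a simplified version to a generic objective standing in for the array-factor cost of the element positions and amplitudes; the population size, mix rate and bounds are illustrative.

        import numpy as np

        rng = np.random.default_rng(5)

        def objective(x):                 # stand-in for the elliptical-array cost function
            return float(np.sum(x ** 2))

        dim, pop_size, iters, mixrate = 8, 30, 300, 1.0
        low, high = -1.0, 1.0
        P = rng.uniform(low, high, (pop_size, dim))
        oldP = rng.uniform(low, high, (pop_size, dim))
        fit = np.array([objective(x) for x in P])

        for _ in range(iters):
            # Selection-I: occasionally refresh the historical population, then shuffle it
            if rng.random() < rng.random():
                oldP = P.copy()
            oldP = oldP[rng.permutation(pop_size)]
            # Mutation: the search direction comes from the historical population
            F = 3.0 * rng.normal()
            mutant = P + F * (oldP - P)
            # Crossover: take only a random subset of dimensions from the mutant
            mask = rng.random((pop_size, dim)) < mixrate * rng.random()
            trial = np.clip(np.where(mask, mutant, P), low, high)
            # Selection-II: greedy replacement
            trial_fit = np.array([objective(x) for x in trial])
            better = trial_fit < fit
            P[better], fit[better] = trial[better], trial_fit[better]

        print("best cost:", fit.min())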

  8. Construction and optimisation of a cartridge filter for removing ...

    African Journals Online (AJOL)

    Construction and optimisation of a cartridge filter for removing fluoride in drinking water. ... It was found that the optimal conditions for the F- filter that gave the best results in removing of F- from water with minimum ... Article Metrics.

  9. Weight Optimisation of Steel Monopile Foundations for Offshore Windfarms

    DEFF Research Database (Denmark)

    Fog Gjersøe, Nils; Bouvin Pedersen, Erik; Kristensen, Brian;

    2015-01-01

    The potential for mass reduction of monopiles in offshore windfarms using current design practice is investigated. Optimisation by sensitivity analysis is carried out for the following important parameters: wall thickness distribution between tower and monopile, soil stiffness, damping ratio and ...

  10. A supportive architecture for CFD-based design optimisation

    Science.gov (United States)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

    Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has caused a rise in their applications in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. It is therefore desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing works has found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing techniques and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or the data level to fully utilise the capabilities of the different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided, and the result has shown that the proposed architecture

  11. Multiobjective optimisation of bogie suspension to boost speed on curves

    Science.gov (United States)

    Milad Mousavi-Bideleh, Seyed; Berbyuk, Viktor

    2016-01-01

    To improve safety and maximum admissible speed on different operational scenarios, multiobjective optimisation of bogie suspension components of a one-car railway vehicle model is considered. The vehicle model has 50 degrees of freedom and is developed in multibody dynamics software SIMPACK. Track shift force, running stability, and risk of derailment are selected as safety objective functions. The improved maximum admissible speeds of the vehicle on curves are determined based on the track plane accelerations up to 1.5 m/s2. To attenuate the number of design parameters for optimisation and improve the computational efficiency, a global sensitivity analysis is accomplished using the multiplicative dimensional reduction method (M-DRM). A multistep optimisation routine based on genetic algorithm (GA) and MATLAB/SIMPACK co-simulation is executed at three levels. The bogie conventional secondary and primary suspension components are chosen as the design parameters in the first two steps, respectively. In the last step semi-active suspension is in focus. The input electrical current to magnetorheological yaw dampers is optimised to guarantee an appropriate safety level. Semi-active controllers are also applied and the respective effects on bogie dynamics are explored. The safety Pareto optimised results are compared with those associated with in-service values. The global sensitivity analysis and multistep approach significantly reduced the number of design parameters and improved the computational efficiency of the optimisation. Furthermore, using the optimised values of design parameters give the possibility to run the vehicle up to 13% faster on curves while a satisfactory safety level is guaranteed. The results obtained can be used in Pareto optimisation and active bogie suspension design problems.

  12. Optimisation as a process for understanding and managing river ecosystems

    OpenAIRE

    Barbour, EJ; Holz, L; G. Kuczera; Pollino, CA; Jakeman, AJ; Loucks, DP

    2016-01-01

    Optimisation can assist in the management of riverine ecosystems through the exploration of multiple alternative management strategies, and the evaluation of trade-offs between conflicting objectives. In addition, it can facilitate communication and learning about the system. However, the effectiveness of optimisation in aiding decision making for ecological management is currently limited by four major challenges: identification and quantification of ecosystem objectives; representation of e...

  13. Optimisation of the Sekwa blended-wing-Body research UAV

    CSIR Research Space (South Africa)

    Broughton, BA

    2008-10-01

    Full Text Available candidate design to bring the total mass up to the target total mass of 3.2 kg. The location of the ballast mass could be adjusted by the design code, which allowed the static margin to be used as a design variable. Finally, a series of checks were.... [Figure: overview of the optimisation process, showing an optimiser (genetic algorithms plus gradient-based methods) searching the design parameters under natural FQ, geometric, control system and stall-behaviour constraints to generate the design with the best cruise performance.]

  14. Aerodynamic optimisation of an industrial axial fan blade

    OpenAIRE

    2006-01-01

    Numerical optimisation methods have been used successfully for a variety of aerodynamic design problems for quite a few years. However, the application of these methods to the aerodynamic blade-shape optimisation of industrial axial fans has received much less attention in the literature, probably because the majority of the resources available to develop such automated design approaches are to be found in the aerospace field. This work presents the develo...

  15. Spreadsheets In Function Of Optimisation Of Logistics Network

    OpenAIRE

    Drago Pupavac; Mimo Draskovic

    2007-01-01

    This scientific paper discusses how estimated spreadsheets function in the optimisation of logistics networks. The suggested working hypothesis on the efficacy of estimated spreadsheets in designing logistics networks is proved with a practical example. In this way the given model can be applied to all logistics networks of a similar problem size. The logistics network model built with estimated spreadsheets represents the real world at the level needed for understanding the problem of optimisation of logistic...

  16. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    Science.gov (United States)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. The automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and regarding the environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, this way facilitating the automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system level of failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. This process is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).

  17. Optimised control of coal-fired boilers

    Energy Technology Data Exchange (ETDEWEB)

    Owens, D.H.; MacConnell, P.F.A.; Neuffer, D.; Dando, R. [University of Exeter, Exeter (United Kingdom). Centre for System and Control Engineering

    1997-07-01

    The objective of the project is to develop and specify a control methodology that will enable existing coal combustion plant to take maximum advantage of modern control techniques. The research is specifically aimed at chain-grate stoker plant (such as the test facility at the Coal Research Establishment, Cheltenham), on which little work has been done for thirty years yet which still represents a large proportion of the industrial coal-fired plant in operation worldwide. In detail, the project: reviewed existing control strategies for moving grate stokers, highlighting their limitations and areas for improvement; carried out plant trials to identify system characteristics such as response time and input/output behaviour; developed a theoretical model of the process based on physical and chemical laws and backed up by trial data; specified control strategies for a single boiler; simulated and evaluated the control strategies using model simulations; developed an optimised control strategy for a single boiler; and assessed the applicability and effects of this control strategy on multiple boiler installations. 67 refs., 34 figs.

  18. Semantic Query Optimisation with Ontology Simulation

    CERN Document Server

    Gupta, Siddharth

    2010-01-01

    Semantic Web is, without a doubt, gaining momentum in both industry and academia. The word "Semantic" refers to "meaning" - a semantic web is a web of meaning. In this fast changing and result oriented practical world, gone are the days where an individual had to struggle for finding information on the Internet where knowledge management was the major issue. The semantic web has a vision of linking, integrating and analysing data from various data sources and forming a new information stream, hence a web of databases connected with each other and machines interacting with other machines to yield results which are user oriented and accurate. With the emergence of the Semantic Web framework the naïve approach of searching information on the syntactic web is cliché. This paper proposes an optimised semantic searching of keywords, exemplified by simulating an ontology of Indian universities with a proposed algorithm which supports the effective semantic retrieval of information which is easy to access and time sav...

  19. ENERGY OPTIMISATION SCHEMES FOR WIRELESS SENSOR NETWORK

    Directory of Open Access Journals (Sweden)

    Vivekanand Jha

    2012-05-01

    Full Text Available A sensor network is composed of a large number of sensor nodes, which are densely deployed either inside the phenomenon or very close to it. Sensor nodes have sensing, processing and transmitting capability. They, however, have limited energy, and measures need to be taken to make optimum usage of that energy and to relieve the nodes of the task of only receiving and transmitting data without processing. Various techniques for optimising energy utilisation have been proposed; the major approaches, however, are clustering and relay node placement. In the research related to relay node placement, it has been proposed to deploy some relay nodes such that the sensors can transmit the sensed data to a nearby relay node, which in turn delivers the data to the base stations. In general, the relay node placement problems aim to meet certain connectivity and/or survivability requirements of the network by deploying a minimum number of relay nodes. The other approach is grouping sensor nodes into clusters, with each cluster having a cluster head (CH). The CH nodes aggregate the data and transmit them to the base station (BS). These two approaches have been widely adopted by the research community to satisfy the scalability objective; they generally achieve high energy efficiency and prolong network lifetime in large-scale WSN environments, and hence are discussed here along with the single-hop and multi-hop characteristics of sensor nodes.
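
    As a concrete example of the clustering approach, the sketch below uses the rotating cluster-head election rule popularised by LEACH; the protocol is not named in the abstract, so treat it as one illustrative choice, with simplified eligibility bookkeeping and an invented network size.

        import random

        def elect_cluster_heads(nodes, round_no, p=0.1):
            """LEACH-style threshold election. 'nodes' maps a node id to the round in
            which it last served as cluster head (None if it has not yet served).
            Returns the set of cluster heads for this round."""
            heads, epoch = set(), int(1 / p)
            for node, last_served in nodes.items():
                # a node is eligible if it has not been CH during the current epoch
                if last_served is not None and round_no - last_served < epoch:
                    continue
                threshold = p / (1 - p * (round_no % epoch))
                if random.random() < threshold:
                    heads.add(node)
                    nodes[node] = round_no
            return heads

        nodes = {f"n{i}": None for i in range(50)}
        for r in range(20):
            chs = elect_cluster_heads(nodes, r)
            # non-CH nodes would now join the nearest CH (single hop); the CH aggregates
            # the data and forwards it to the BS, possibly over multiple hops
            print("round", r, "-", len(chs), "cluster heads")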

  20. Topological optimisation of rod-stirring devices

    CERN Document Server

    Finn, Matthew D

    2011-01-01

    There are many industrial situations where rods are used to stir a fluid, or where rods repeatedly stretch a material such as bread dough or taffy. The goal in these applications is to stretch either material lines (in a fluid) or the material itself (for dough or taffy) as rapidly as possible. The growth rate of material lines is conveniently given by the topological entropy of the rod motion. We discuss the problem of optimising such rod devices from a topological viewpoint. We express rod motions in terms of generators of the braid group, and assign a cost based on the minimum number of generators needed to write the braid. We show that for one cost function -- the topological entropy per generator -- the optimal growth rate is the logarithm of the golden ratio. For a more realistic cost function, involving the topological entropy per operation where rods are allowed to move together, the optimal growth rate is the logarithm of the silver ratio, $1+\sqrt{2}$. We show how to construct devices that realise th...

  1. Optimising preterm nutrition: present and future

    LENUS (Irish Health Repository)

    Brennan, Ann-Marie

    2016-04-01

    The goal of preterm nutrition in achieving growth and body composition approximating that of the fetus of the same postmenstrual age is difficult to achieve. Current nutrition recommendations depend largely on expert opinion, due to lack of evidence, and are primarily birth weight based, with no consideration given to gestational age and/or need for catch-up growth. Assessment of growth is based predominately on anthropometry, which gives insufficient attention to the quality of growth. The present paper provides a review of the current literature on the nutritional management and assessment of growth in preterm infants. It explores several approaches that may be required to optimise nutrient intakes in preterm infants, such as personalising nutritional support, collection of nutrient intake data in real-time, and measurement of body composition. In clinical practice, the response to inappropriate nutrient intakes is delayed as the effects of under- or overnutrition are not immediate, and there is limited nutritional feedback at the cot-side. The accurate and non-invasive measurement of infant body composition, assessed by means of air displacement plethysmography, has been shown to be useful in assessing quality of growth. The development and implementation of personalised, responsive nutritional management of preterm infants, utilising real-time nutrient intake data collection, with ongoing nutritional assessments that include measurement of body composition is required to help meet the individual needs of preterm infants.

  2. Optimisation of the formulation of a bubble bath by a chemometric approach: market segmentation and optimisation.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gennaro, Maria Carla; Bertetto, Mariella

    2003-03-01

    The optimisation of the formulation of a commercial bubble bath was performed by chemometric analysis of panel-test results. A first panel test was performed to choose the best essence among four proposed to the consumers; the essence chosen was used in the revised commercial bubble bath. Afterwards, the effect of changing the amounts of four components of the bubble bath (the primary surfactant, the essence, the hydratant and the colouring agent) was studied by a fractional factorial design. The segmentation of the bubble bath market was performed by a second panel test, in which the consumers were requested to evaluate the samples coming from the experimental design. The results were then treated by Principal Component Analysis. The market had two segments: people preferring a product with a rich formulation and people preferring a poor one. The final target, i.e. the optimisation of the formulation for each segment, was achieved by calculating regression models relating the subjective evaluations given by the panel to the compositions of the samples. The regression models allowed the identification of the best formulations for the two segments of the market.
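
    The last step above, relating panel scores to composition, reduces to an ordinary least-squares fit on the coded factors of the factorial design. The sketch below uses a 2^(4-1) design with the fourth factor aliased to the three-factor interaction; the design choice and the panel scores are invented for illustration.

        import numpy as np

        # coded levels (-1 / +1) for surfactant, essence, hydratant and colouring agent,
        # arranged as a 2^(4-1) fractional factorial design (fourth column = product of
        # the first three, i.e. the defining relation D = ABC)
        base = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])
        X = np.column_stack([base, base[:, 0] * base[:, 1] * base[:, 2]])

        # hypothetical mean panel scores, one per design point, for one market segment
        y = np.array([4.1, 5.0, 4.6, 6.2, 3.9, 5.4, 4.8, 6.9])

        # least-squares fit of a main-effects model with an intercept
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        for name, c in zip(["intercept", "surfactant", "essence", "hydratant", "colouring"], coef):
            print(f"{name:>10}: {c:+.2f}")

        # the preferred formulation for this segment is the design point with the
        # highest predicted score
        print("best design point:", X[np.argmax(A @ coef)])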

  3. A hybrid of bees algorithm and flux balance analysis (BAFBA) for the optimisation of microbial strains.

    Science.gov (United States)

    Choon, Yee Wen; Mohamad, Mohd Saberi; Deris, Safaai; Illias, Rosli Md

    2014-01-01

    The development of microbial production systems has become popular in recent years, as microbial hosts offer a number of unique advantages for producing both native and heterologous small molecules. The main drawback, however, is the low yield or productivity of the desired products. Optimisation algorithms have been implemented in previous works to identify the effects of gene knockouts, but these works faced performance issues. Thus, a hybrid of the Bees Algorithm and Flux Balance Analysis (BAFBA) is proposed in this paper to improve the performance in predicting optimal sets of gene deletions for maximising the growth rate and production yield of a certain metabolite. This paper involves two datasets, for E. coli and S. cerevisiae. The results from the experiments are the list of knockout genes and the growth rate and production yield after the deletions. BAFBA presents better results compared with the other methods, and the identified lists may be useful in solving genetic engineering problems.
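
    A skeletal illustration of the hybrid: a Bees Algorithm searches over gene-knockout bitstrings and each candidate is scored through a flux balance analysis of the reduced model. Here evaluate_fba is a synthetic placeholder (in practice it would wrap an FBA solver run on the E. coli or S. cerevisiae model), and the bee counts, knockout limit and fitness rule are illustrative only.

        import numpy as np

        rng = np.random.default_rng(6)
        n_genes, max_knockouts = 40, 5

        def evaluate_fba(mask):
            """Placeholder for flux balance analysis of a knockout strain.
            Returns (growth_rate, product_yield); here a synthetic surrogate."""
            k = int(mask.sum())
            growth = max(0.0, 1.0 - 0.1 * k)                    # more knockouts, slower growth
            product = 0.2 + 0.15 * mask[:10].sum()              # some knockouts boost the yield
            return growth, product

        def fitness(mask):
            growth, product = evaluate_fba(mask)
            if growth < 0.1 or mask.sum() > max_knockouts:      # the strain must stay viable
                return -1.0
            return growth * product                             # growth-coupled production score

        def random_mask():
            m = np.zeros(n_genes, dtype=bool)
            m[rng.choice(n_genes, rng.integers(1, max_knockouts + 1), replace=False)] = True
            return m

        def neighbour(mask):
            m = mask.copy()
            m[rng.integers(n_genes)] ^= True                    # flip a single gene
            return m

        # Bees Algorithm: scouts explore, more bees are recruited around the elite sites
        scouts = [random_mask() for _ in range(30)]
        for _ in range(100):
            scouts.sort(key=fitness, reverse=True)
            sites = [(s, 10) for s in scouts[:5]] + [(s, 4) for s in scouts[5:15]]
            new_sites = [max([neighbour(s) for _ in range(n)] + [s], key=fitness)
                         for s, n in sites]
            scouts = new_sites + [random_mask() for _ in range(30 - len(new_sites))]

        best = max(scouts, key=fitness)
        print("knockouts:", np.flatnonzero(best), " fitness:", round(fitness(best), 3))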

  4. A conceptual optimisation strategy for radiography in a digital environment.

    Science.gov (United States)

    Båth, Magnus; Håkansson, Markus; Hansson, Jonny; Månsson, Lars Gunnar

    2005-01-01

    Using a completely digital environment for the entire imaging process leads to new possibilities for optimisation of radiography since many restrictions of screen/film systems, such as the small dynamic range and the lack of possibilities for image processing, do not apply any longer. However, at the same time these new possibilities lead to a more complicated optimisation process, since more freedom is given to alter parameters. This paper focuses on describing an optimisation strategy that concentrates on taking advantage of the conceptual differences between digital systems and screen/film systems. The strategy can be summarised as: (a) always include the anatomical background during the optimisation, (b) perform all comparisons at a constant effective dose and (c) separate the image display stage from the image collection stage. A three-step process is proposed where the optimal setting of the technique parameters is determined first, followed by an optimisation of the image processing. In the final step, the optimal dose level, given the optimal settings of the image collection and image display stages, is determined.

  5. CT dose optimisation and reduction in osteoarticular disease.

    Science.gov (United States)

    Gervaise, A; Teixeira, P; Villani, N; Lecocq, S; Louis, M; Blum, A

    2013-04-01

    With improvements in temporal and spatial resolution, computed tomography (CT) is indicated in the evaluation of a great many osteoarticular diseases. New exploration techniques such as dynamic CT and CT bone perfusion also provide new indications. However, CT is still an irradiating imaging technique, and dose optimisation and reduction remain essential. In this paper, the authors first present the typical doses delivered during CT in osteoarticular disease. They then discuss the different ways to optimise and reduce these doses by distinguishing behavioural factors from technical factors. Among the latter, the optimisation of the milliamperage and kilovoltage is indispensable and should be adapted to the type of examination and the morphotype of each individual. These technical factors also benefit from recent technological developments such as the spread of iterative reconstruction. In this way, the dose may be halved while providing an image of equal quality. With these dose optimisation and reduction techniques, it is now possible, while maintaining excellent image quality, to obtain low-dose or even very low-dose acquisitions with a dose sometimes similar to that of a standard X-ray assessment. Nevertheless, although these technical factors provide a major reduction in the dose delivered, behavioural factors, such as compliance with the indications, remain fundamental. Finally, the authors describe how to optimise and reduce the dose in specific musculoskeletal imaging applications such as dynamic CT, CT bone perfusion and dual-energy CT.

  6. Hybrid Genetic Algorithm with PSO Effect for Combinatorial Optimisation Problems

    Directory of Open Access Journals (Sweden)

    M. H. Mehta

    2012-12-01

    Full Text Available In the engineering field, many problems are hard to solve within a definite interval of time. These problems, known as “combinatorial optimisation problems”, belong to the class NP. They are easy to solve in polynomial time when the input size is small, but as the input size grows they become extremely hard to solve within a definite interval of time. Long-known conventional methods are not able to solve these problems, and thus proper heuristics are necessary. Evolutionary algorithms based on the behaviours of different animals and species have been invented and studied for this purpose. The Genetic Algorithm is considered a powerful algorithm for solving combinatorial optimisation problems. Genetic algorithms work on these problems by mimicking natural genetics and follow a “survival of the fittest” strategy. Particle swarm optimisation is a newer evolutionary approach that copies the behaviour of swarms in nature. However, neither traditional genetic algorithms nor particle swarm optimisation alone has been completely successful in solving combinatorial optimisation problems. Here a hybrid algorithm is proposed in which the strengths of both algorithms are merged, and the performance of the proposed algorithm is compared with a simple genetic algorithm. Results show that the proposed algorithm performs considerably better than the simple genetic algorithm.
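
    The hybridisation idea (GA variation operators combined with a PSO-style pull towards the best solution found so far) can be illustrated on a toy continuous problem, as in the sketch below. This is not the authors' algorithm; the test function, operators and parameter values are arbitrary choices for illustration.

        # Toy hybrid of GA operators (tournament selection, crossover, mutation) with a
        # PSO-style velocity update pulling individuals towards the global best.
        # Minimising the sphere function; all parameters are arbitrary illustrations.
        import random

        DIM, POP, GENS, W, C = 5, 30, 100, 0.5, 1.5

        def fitness(x):                       # sphere function: minimum at the origin
            return sum(v * v for v in x)

        pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
        vel = [[0.0] * DIM for _ in range(POP)]
        gbest = min(pop, key=fitness)

        for _ in range(GENS):
            new_pop = []
            for i in range(POP):
                # GA part: tournament selection of two parents, uniform crossover, mutation
                p1 = min(random.sample(pop, 3), key=fitness)
                p2 = min(random.sample(pop, 3), key=fitness)
                child = [a if random.random() < 0.5 else b for a, b in zip(p1, p2)]
                child = [c + random.gauss(0, 0.1) if random.random() < 0.1 else c for c in child]
                # PSO part: nudge the child along a velocity biased towards the global best
                vel[i] = [W * v + C * random.random() * (g - c)
                          for v, g, c in zip(vel[i], gbest, child)]
                child = [c + v for c, v in zip(child, vel[i])]
                new_pop.append(child)
            pop = new_pop
            gbest = min(pop + [gbest], key=fitness)

        print("best fitness found:", round(fitness(gbest), 6))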

  7. Transmit Power Optimisation in Wireless Network

    Directory of Open Access Journals (Sweden)

    Besnik Terziu

    2011-09-01

    Full Text Available Transmit power optimisation in wireless networks based on beamforming has emerged as a promising technique to enhance the spectrum efficiency of present and future wireless communication systems. The aim of this study is to minimise the access point power consumption in cellular networks while maintaining a targeted quality of service (QoS) for the mobile terminals. In this study, the targeted quality of service is delivered to a mobile station by providing a desired level of Signal to Interference and Noise Ratio (SINR). Base-stations are coordinated across multiple cells in a multi-antenna beamforming system. This study focuses on a multi-cell multi-antenna downlink scenario where each mobile user is equipped with a single antenna, but where multiple mobile users may be active simultaneously in each cell and are separated via spatial multiplexing using beamforming. The design criterion is to minimise the total weighted transmitted power across the base-stations subject to SINR constraints at the mobile users. The main contribution of this study is to define an iterative algorithm that is capable of finding the joint optimal beamformers for all base-stations, based on a correlation-based channel model, the full-correlation model. Among all correlated channel models, the correlated channel model used in this study is the most accurate, giving the best performance in terms of power consumption. The environment in this study is chosen to be a Non-Line-of-Sight (NLOS) condition, where a signal from a wireless transmitter passes several obstructions before arriving at a wireless receiver. Moreover there are many scatterers local to the mobile, and multiple reflections can occur among them before energy arrives at the mobile. The proposed algorithm is based on uplink-downlink duality using the Lagrangian duality theory. Time-Division Duplex (TDD) is chosen as the platform for this study since it has been adopted in the latest technologies in Fourth
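
    A heavily simplified illustration of the underlying goal, meeting per-user SINR targets with minimum transmit power, is sketched below using a classical fixed-point power-control iteration for single-antenna links. It is a named substitute for, not an implementation of, the paper's joint beamforming algorithm based on uplink-downlink duality; the channel gains, noise level and SINR targets are invented.

        # Much-simplified illustration of meeting SINR targets with minimum transmit power:
        # a fixed-point power-control iteration for single-antenna links, NOT the paper's
        # joint beamforming algorithm. All channel gains, noise and targets are invented.
        import numpy as np

        G = np.array([[1.00, 0.10, 0.05],        # G[i, j]: gain from transmitter j to user i
                      [0.08, 0.90, 0.07],
                      [0.06, 0.12, 1.10]])
        noise = 1e-3
        target_sinr = np.array([2.0, 1.5, 3.0])  # required SINR per user (linear scale)

        p = np.ones(3) * 0.01                    # initial transmit powers (W)
        for _ in range(200):
            interference = G @ p - np.diag(G) * p          # received interference per user
            sinr = np.diag(G) * p / (interference + noise)
            p = p * target_sinr / sinr                     # scale powers towards the targets

        sinr = np.diag(G) * p / (G @ p - np.diag(G) * p + noise)   # SINR at the final powers
        print("powers (W):", p.round(4), " achieved SINR:", sinr.round(2))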

  8. Multiobjective design optimisation of coronary stents.

    Science.gov (United States)

    Pant, Sanjay; Limbert, Georges; Curzen, Nick P; Bressloff, Neil W

    2011-11-01

    representative CYPHER stent are shown. The methodology and the results of this work could potentially be useful in further optimisation studies and development of a family of stents with increased resistance to in-stent restenosis and thrombosis.

  9. Topology Optimisation of Wideband Coaxial-to-Waveguide Transitions

    Science.gov (United States)

    Hassan, Emadeldeen; Noreland, Daniel; Wadbro, Eddie; Berggren, Martin

    2017-03-01

    To maximize the matching between a coaxial cable and rectangular waveguides, we present a computational topology optimisation approach that decides for each point in a given domain whether it should hold a good conductor or a good dielectric. The conductivity is determined by a gradient-based optimisation method that relies on finite-difference time-domain solutions to the 3D Maxwell’s equations. Unlike previously reported results in the literature for this kind of problem, our design algorithm can efficiently handle tens of thousands of design variables, which allows novel conceptual waveguide designs. We demonstrate the effectiveness of the approach by presenting optimised transitions with reflection coefficients lower than -15 dB over more than a 60% bandwidth, both for right-angle and end-launcher configurations. The performance of the proposed transitions is cross-verified with commercial software, and one design case is validated experimentally.

  10. Fabrication optimisation of carbon fiber electrode with Taguchi method.

    Science.gov (United States)

    Cheng, Ching-Ching; Young, Ming-Shing; Chuang, Chang-Lin; Chang, Ching-Chang

    2003-07-01

    In this study, we describe an optimised procedure for fabricating carbon fiber electrodes using the Taguchi quality engineering method (TQEM). The preliminary results show an S/N ratio improvement from 22 to 30 dB. The optimised parameters were tested by using a glass micropipette (0.3 mm outer/2.5 mm inner length of carbon fiber) dipped into PBS solution under 2.9 V triangle-wave electrochemical processing for 15 s, followed by a coating treatment of the micropipette at 2.6 V DC for 45 s in 5% Nafion solution. It is thus shown that Taguchi process optimisation can improve the cost, manufacturing time and quality of carbon fiber electrodes.
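
    For reference, the larger-the-better signal-to-noise ratio conventionally used in Taguchi analysis can be computed as below; the electrode response values are invented and only illustrate how an improvement from roughly 22 to 30 dB might be checked.

        # Larger-the-better signal-to-noise ratio commonly used in Taguchi analysis:
        #   SN = -10 * log10( (1/n) * sum(1 / y_i^2) ).  Response values are invented.
        import math

        def sn_larger_is_better(y):
            return -10.0 * math.log10(sum(1.0 / (v * v) for v in y) / len(y))

        before = [12.5, 13.1, 11.8]   # e.g. electrode responses before optimisation
        after = [31.0, 29.4, 30.2]    # and after tuning the fabrication parameters
        print(f"S/N before: {sn_larger_is_better(before):.1f} dB, "
              f"after: {sn_larger_is_better(after):.1f} dB")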

  11. Quasi-combinatorial energy landscapes for nanoalloy structure optimisation.

    Science.gov (United States)

    Schebarchov, D; Wales, D J

    2015-11-14

    We formulate nanoalloy structure prediction as a mixed-variable optimisation problem, where the homotops can be associated with an effective, quasi-combinatorial energy landscape in permutation space. We survey this effective landscape for a representative set of binary systems modelled by the Gupta potential. In segregating systems with small lattice mismatch, we find that homotops have a relatively straightforward landscape with few local optima - a scenario well-suited for local (combinatorial) optimisation techniques that scale quadratically with system size. Combining these techniques with multiple local-neighbourhood structures yields a search for multiminima, and we demonstrate that generalised basin-hopping with a metropolis acceptance criterion in the space of multiminima can then be effective for global optimisation of binary and ternary nanoalloys.
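
    A continuous toy analogue of the basin-hopping step with Metropolis acceptance is sketched below using SciPy's basinhopping routine. It is not the permutation-space homotop search described above; the test landscape, starting point and parameters are arbitrary.

        # Continuous toy analogue of basin-hopping with Metropolis acceptance:
        # scipy.optimize.basinhopping alternates random perturbations with local minimisation.
        # This is NOT the permutation-space homotop search of the paper, just the core idea.
        import numpy as np
        from scipy.optimize import basinhopping

        def rugged(x):                       # a simple multi-minimum test landscape
            return np.sum(x**2 + 2.0 * np.sin(5.0 * x))

        x0 = np.array([2.5, -1.5])
        result = basinhopping(rugged, x0, niter=200, T=1.0, stepsize=0.7,
                              minimizer_kwargs={"method": "L-BFGS-B"}, seed=1)
        print("lowest minimum found:", result.x.round(3), "energy:", round(result.fun, 4))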

  12. Microfluidic converging/diverging channels optimised for homogeneous extensional deformation

    Science.gov (United States)

    Zografos, K.; Oliveira, M. S. N.

    2016-01-01

    In this work, we optimise microfluidic converging/diverging geometries in order to produce constant strain-rates along the centreline of the flow, for performing studies under homogeneous extension. The design is examined for both two-dimensional and three-dimensional flows where the effects of aspect ratio and dimensionless contraction length are investigated. Initially, pressure driven flows of Newtonian fluids under creeping flow conditions are considered, which is a reasonable approximation in microfluidics, and the limits of the applicability of the design in terms of Reynolds numbers are investigated. The optimised geometry is then used for studying the flow of viscoelastic fluids and the practical limitations in terms of Weissenberg number are reported. Furthermore, the optimisation strategy is also applied for electro-osmotic driven flows, where the development of a plug-like velocity profile allows for a wider region of homogeneous extensional deformation in the flow field. PMID:27478523

  13. Analysis and optimisation of heterogeneous real-time embedded systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2005-01-01

    An increasing number of real-time applications are today implemented using distributed heterogeneous architectures composed of interconnected networks of processors. The systems are heterogeneous, not only in terms of hardware components, but also in terms of communication protocols and scheduling ... The success of such new design methods depends on the availability of analysis and optimisation techniques. Analysis and optimisation techniques for heterogeneous real-time embedded systems are presented in the paper. The authors address in more detail a particular class of such systems called multi-clusters, composed of several networks interconnected via gateways. They present a schedulability analysis for safety-critical applications distributed on multi-cluster systems and briefly highlight characteristic design optimisation problems: the partitioning and mapping of functionality, and the packing ...

  14. Analysis and optimisation of heterogeneous real-time embedded systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    An increasing number of real-time applications are today implemented using distributed heterogeneous architectures composed of interconnected networks of processors. The systems are heterogeneous, not only in terms of hardware components, but also in terms of communication protocols and scheduling ... The success of such new design methods depends on the availability of analysis and optimisation techniques. Analysis and optimisation techniques for heterogeneous real-time embedded systems are presented in the paper. The authors address in more detail a particular class of such systems called multi-clusters, composed of several networks interconnected via gateways. They present a schedulability analysis for safety-critical applications distributed on multi-cluster systems and briefly highlight characteristic design optimisation problems: the partitioning and mapping of functionality, and the packing ...

  15. Numerical optimisation of friction stir welding: review of future challenges

    DEFF Research Database (Denmark)

    Tutum, Cem Celal; Hattel, Jesper Henri

    2011-01-01

    During the last decade, the combination of increasingly more advanced numerical simulation software with high computational power has resulted in models for friction stir welding (FSW), which have improved the understanding of the determining physical phenomena behind the process substantially. This has made optimisation of certain process parameters possible and has in turn led to better performing friction stir welded products, thus contributing to a general increase in the popularity of the process and its applications. However, most of these optimisation studies do not go well beyond manual ...

  16. Alternatives for optimisation of rumen fermentation in ruminants

    Directory of Open Access Journals (Sweden)

    T. Slavov

    2017-06-01

    Full Text Available Abstract. Proper knowledge of the variety of events occurring in the rumen makes it possible to optimise them with respect to complete feed conversion and to increase the productive performance of ruminants. The inclusion of various dietary additives (supplements, biologically active substances, nutritional antibiotics, probiotics, enzymatic preparations, plant extracts, etc.) has an effect on the intensity and specific pathway of fermentation, and thus on overall digestion and systemic metabolism. The optimisation of rumen digestion is an approach with substantial potential for improving the efficiency of ruminant husbandry, increasing the quality of their produce and maintaining their health.

  17. Separative power of an optimised concurrent gas centrifuge

    Energy Technology Data Exchange (ETDEWEB)

    Bogovalov, Sergey; Boman, Vladimir [National Research Nuclear University (MEPHI), Moscow (Russian Federation)

    2016-06-15

    The problem of separation of isotopes in a concurrent gas centrifuge is solved analytically for an arbitrary binary mixture of isotopes. The separative power of the optimised concurrent gas centrifuges for the uranium isotopes equals δU = 12.7 (V/700 m/s)²(300 K/T)(L/1 m) kg·SWU/yr, where L and V are the length and linear velocity of the rotor of the gas centrifuge and T is the temperature. This equation agrees well with the empirically determined separative power of optimised counter-current gas centrifuges.
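
    The quoted scaling law is easy to evaluate directly; the sketch below does so for the reference point and for one illustrative (invented) combination of rotor speed, temperature and length.

        # Worked example of the quoted scaling law for separative power:
        #   dU = 12.7 * (V / 700 m/s)^2 * (300 K / T) * (L / 1 m)   [kg SWU / yr]
        # The rotor speed, temperature and length values below are illustrative only.
        def separative_power(v_mps, temp_k, length_m):
            return 12.7 * (v_mps / 700.0) ** 2 * (300.0 / temp_k) * (length_m / 1.0)

        print(separative_power(700.0, 300.0, 1.0))  # reference point: 12.7 kg SWU/yr
        print(separative_power(600.0, 320.0, 0.5))  # slower, warmer, shorter rotor -> smaller dU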

  18. Separative Power of an Optimised Concurrent Gas Centrifuge

    Directory of Open Access Journals (Sweden)

    Sergey Bogovalov

    2016-06-01

    Full Text Available The problem of separation of isotopes in a concurrent gas centrifuge is solved analytically for an arbitrary binary mixture of isotopes. The separative power of the optimised concurrent gas centrifuges for the uranium isotopes equals δU = 12.7 (V/700 m/s)²(300 K/T)(L/1 m) kg·SWU/yr, where L and V are the length and linear velocity of the rotor of the gas centrifuge and T is the temperature. This equation agrees well with the empirically determined separative power of optimised counter-current gas centrifuges.

  19. Multiscale Analysis and Optimisation of Photosynthetic Solar Energy Systems

    CERN Document Server

    Ringsmuth, Andrew K

    2014-01-01

    This work asks how light harvesting in photosynthetic systems can be optimised for economically scalable, sustainable energy production. Hierarchy theory is introduced as a system-analysis and optimisation tool better able to handle multiscale, multiprocess complexities in photosynthetic energetics compared with standard linear-process analysis. Within this framework, new insights are given into relationships between composition, structure and energetics at the scale of the thylakoid membrane, and also into how components at different scales cooperate under functional objectives of the whole photosynthetic system. Combining these reductionistic and holistic analyses creates a platform for modelling multiscale-optimal, idealised photosynthetic systems in silico.

  20. Optimisation of BPMN Business Models via Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for the optimisation of business processes modelled in the business process modelling language BPMN, which builds upon earlier work, where we developed a model checking based method for the analysis of BPMN models. We define a structure for expressing optimisation goals ... candidate improved processes based on the fittest of the previous generation. The evaluation of the fitness of each candidate in a generation is performed via model checking, detailed in previous work. At each iteration, this allows the determination of the precise numerical evaluation of the performance ...

  1. Optimisation of beam-orientations in conformal radiotherapy treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Rowbottom, C.G

    1999-07-01

    Many synergistic advances have led to the beginnings of the routine use of conformal radiotherapy. These include advances in diagnostic imaging, in 3D treatment planning, in the technology for complex treatment delivery and in computer assessment of rival treatment plans. A conformal radiotherapy treatment plan more closely conforms the high-dose volume to the target volume, reducing the dose to normal healthy tissue. Traditionally, human planners have devised the treatment parameters used in radiotherapy treatment plans via a manually iterative process. Computer 'optimisation' algorithms have been shown to improve treatment plans as they can explore much more of the search space in a relatively short time. This thesis examines beam-orientation computer 'optimisation' in radiotherapy treatment planning and several new techniques were developed. Using these techniques a comparison was performed between treatment plans with 'standard', fixed beam-orientations and treatment plans with 'optimised' beam-orientations for patients with cancer of the prostate, oesophagus and brain. Plans were compared on the basis of dose-distributions and in some cases biological models for the probability of damage to the target volume and the major organs-at-risk (OARs) in each patient group. A cohort of patients was considered in each group to avoid bias from a specific patient geometry. In the case of the patient cohort with cancer of the prostate, a coplanar beam-orientation 'optimisation' scheme led to an average increase in the TCP of (5.7±1.4)% compared to the standard plans after the dose to the isocentre had been scaled to produce a rectal NTCP of 1%. For the patient cohort with cancer of the oesophagus, the beam-orientation 'optimisation' scheme reduced the average lung NTCP by (0.7±0.2)% at the expense of a modest increase in the average spinal cord NTCP of (0.1±0.2)%. A non-coplanar beam-orientation 'optimisation

  2. Plant-wide performance optimisation – The refrigeration system case

    DEFF Research Database (Denmark)

    Izadi-Zamanabadi, Roozbeh; Green, Torben; Razavi-Far, Roozbeh

    2012-01-01

    This paper investigates the problem of plant-wide performance optimisation seen from an industrial perspective. The refrigeration system is used as a case study, because it has a distributed control architecture and operates in steady state conditions, which is common for many industrial applications.

  3. Optimising a shaft's geometry by applying genetic algorithms

    Directory of Open Access Journals (Sweden)

    María Alejandra Guzmán

    2010-04-01

    Full Text Available Many engineering design tasks involve optimising several conflicting goals; these types of problem are known as Multiobjective Optimisation Problems (MOPs). Evolutionary techniques have proved to be an effective tool for finding solutions to these MOPs during the last decade. In particular, variations on the basic genetic algorithm have been proposed by different researchers for rapidly finding optimal solutions to MOPs. The NSGA (Non-dominated Sorting Genetic Algorithm) has been implemented in this paper to find an optimal design for a shaft subjected to cyclic loads, the conflicting goals being minimum weight and minimum lateral deflection.
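
    The core of NSGA-style multiobjective selection is non-dominated sorting; a minimal sketch of extracting the Pareto front for the two stated objectives (weight and lateral deflection, both to be minimised) is given below. The candidate designs and their objective values are invented placeholders, not results from the paper.

        # Minimal sketch of the non-dominated sorting idea behind NSGA: extract the Pareto
        # front for two objectives to be minimised (shaft weight, lateral deflection).
        # The candidate designs and their objective values are invented placeholders.

        def dominates(a, b):
            """a dominates b if it is no worse in every objective and better in at least one."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def pareto_front(candidates):
            return [c for c in candidates
                    if not any(dominates(other, c) for other in candidates if other is not c)]

        # (weight [kg], lateral deflection [mm]) for a handful of candidate shaft designs
        designs = [(10.2, 0.80), (9.1, 1.10), (12.5, 0.55), (9.0, 1.40), (11.0, 0.60), (12.0, 0.90)]
        print("non-dominated designs:", pareto_front(designs))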

  4. An Improved Lower Bound Limit State Optimisation Algorithm

    DEFF Research Database (Denmark)

    Frier, Christian; Damkilde, Lars

    2010-01-01

    Limit State analysis has been used in engineering practice for many years, e.g. the yield-line method for concrete slabs and slip-line solutions in geotechnics. In recent years there has been an increased interest in numerical Limit State analysis, and today's algorithms take into account non-linear yield criteria. The aim of the paper is to refine an earlier presented effective method, which reduces the number of optimisation variables considerably by eliminating the equilibrium equations a priori, and improvements are made to the interior point optimisation algorithm.

  5. Robust eigenstructure clustering by non-smooth optimisation

    Science.gov (United States)

    Dao, Minh Ngoc; Noll, Dominikus; Apkarian, Pierre

    2015-08-01

    We extend classical eigenstructure assignment to more realistic problems, where additional performance and robustness specifications arise. Our aim is to combine time-domain constraints, as reflected by pole location and eigenvector structure, with frequency-domain objectives such as the H2, H∞ or Hankel norms. Using pole clustering, we allow poles to move in polydisks of prescribed size around their nominal values, driven by optimisation. Eigenelements, that is poles and eigenvectors, are allowed to move simultaneously and serve as decision variables in a specialised non-smooth optimisation technique. Two aerospace applications illustrate the power of the new method.

  6. Optimisation of Oil Production in Two-Phase Flow Reservoir Using Simultaneous Method and Interior Point Optimiser

    DEFF Research Database (Denmark)

    2012-01-01

    ... in the reservoir. A promising decrease of these remaining resources can be provided by smart wells applying water injections to sustain a satisfactory pressure level in the reservoir throughout the whole process of oil production. Basically, to enhance secondary recovery of the remaining oil after drilling, water ... fields, or closed-loop optimisation, can be used for optimising the reservoir performance in terms of the net present value of oil recovery or another economic objective. In order to solve the optimal control problem we use a direct collocation method, where we translate a continuous problem into a discrete ... for large-scale nonlinear optimisation was applied. Because of its versatile compatibility with programming technologies, the C++ programming language in the Microsoft Visual Studio integrated development environment was used for modelling the optimal control problem. Thanks to object-oriented features ...

  7. Solving dynamic multi-objective problems with vector evaluated particle swarm optimisation

    CSIR Research Space (South Africa)

    Greeff, M

    2008-06-01

    Full Text Available Many optimisation problems are multi-objective and change dynamically. Many methods use a weighted average approach to the multiple objectives. This paper introduces the usage of the vector evaluated particle swarm optimiser (VEPSO) to solve dynamic...

  8. Agent-Based Decision Control—How to Appreciate Multivariate Optimisation in Architecture

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer; Perkov, Thomas Holmer; Kolarik, Jakub

    2015-01-01

    ... in the early design stage. The main focus is to demonstrate the optimisation method, which is done in two ways. Firstly, the newly developed agent-based optimisation algorithm named Moth is tested on three different single objective search spaces. Here Moth is compared to two evolutionary algorithms. Secondly, the method is applied to a multivariate optimisation problem. The aim is specifically to demonstrate optimisation for entire building energy consumption, daylight distribution and capital cost. Based on the demonstrations Moth’s ability to find local minima is discussed. It is concluded that agent-based optimisation algorithms like Moth open up new uses of optimisation in the early design stage. With Moth the final outcome is less dependent on pre- and post-processing, and Moth allows user intervention during optimisation. Therefore, agent-based models for optimisation such as Moth can be a powerful ...

  9. Non-linear Total Energy Optimisation of a Fleet of Power Plants

    Science.gov (United States)

    Nolle, Lars; Biegler-König, Friedrich; Deeskow, Peter

    In order to optimise the energy production in a fleet of power plants, it is necessary to solve a mixed integer optimisation problem. Traditionally, the continuous parts of the problem are linearized and a Simplex scheme is applied. Alternatively, heuristic "bionic" optimisation methods can be used without having to linearize the problem. We are going to demonstrate this approach by modelling power plant blocks with fast Neural Networks and optimising the operation of multi-block power plants over one day with Simulated Annealing.
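
    A compact simulated-annealing loop for an invented load-allocation problem is sketched below to make the approach concrete; the quadratic block cost curves stand in for the fast neural-network block models mentioned above, and all coefficients, the demand and the cooling schedule are arbitrary.

        # Compact simulated-annealing loop for allocating a demanded load across power plant
        # blocks.  The quadratic cost curves stand in for the fast neural-network block models
        # mentioned in the abstract; all coefficients and the demand value are invented.
        import math
        import random

        COST = [(0.020, 18.0), (0.015, 22.0), (0.030, 14.0)]   # (quadratic, linear) per block
        DEMAND, PENALTY = 250.0, 1000.0

        def total_cost(loads):
            running = sum(a * x * x + b * x for (a, b), x in zip(COST, loads))
            return running + PENALTY * abs(sum(loads) - DEMAND)   # penalise demand mismatch

        state = [DEMAND / len(COST)] * len(COST)
        best, best_cost, temp = list(state), total_cost(state), 50.0
        for step in range(5000):
            i = random.randrange(len(state))
            candidate = list(state)
            candidate[i] = max(0.0, candidate[i] + random.uniform(-5.0, 5.0))
            delta = total_cost(candidate) - total_cost(state)
            if delta < 0 or random.random() < math.exp(-delta / temp):   # Metropolis acceptance
                state = candidate
                if total_cost(state) < best_cost:
                    best, best_cost = list(state), total_cost(state)
            temp *= 0.999                                                # geometric cooling

        print("block loads:", [round(x, 1) for x in best], "cost:", round(best_cost, 1))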

  10. Mucosal immunization with recombinant adenoviral vectors expressing murine gammaherpesvirus-68 genes M2 and M3 can reduce latent viral load

    DEFF Research Database (Denmark)

    Hoegh-Petersen, Mette; Thomsen, Allan R; Christensen, Jan P

    2009-01-01

    ... of the gammaherpesvirinae speaks against using a similar approach in humans. DNA immunization with plasmids encoding the MHV-68 genes M2 or M3 caused a reduction in either acute or early latent viral load, respectively, but neither immunization had an effect at times later than 14 days post-infection. Adenovirus-based vaccines are substantially more immunogenic than DNA vaccines and can be applied to induce mucosal immunity. Here we show that a significant reduction of the late viral load in the spleens, at 60 days post-infection, was achieved when immunizing mice both intranasally and subcutaneously with adenoviral ...

  11. Optimisation of a novel trailing edge concept for a high lift device

    CSIR Research Space (South Africa)

    Botha, JDM

    2014-09-01

    Full Text Available A novel concept (referred to as the flap extension) is implemented on the leading edge of the flap of a three element high lift device. The concept is optimised using two optimisation approaches based on Genetic Algorithm optimisations. A zero order...

  12. Mobile app and app store analysis, testing and optimisation

    OpenAIRE

    Harman, M.; Al-Subaihin, A.; Jia, Y.; Martin, W.; Sarro, F.; Zhang, Y.

    2016-01-01

    This talk presents results on analysis and testing of mobile apps and app stores, reviewing the work of the UCL App Analysis Group (UCLappA) on App Store Mining and Analysis. The talk also covers the work of the UCL CREST centre on Genetic Improvement, applicable to app improvement and optimisation.

  13. Optimisation of sampling windows design for population pharmacokinetic experiments.

    Science.gov (United States)

    Ogungbenro, Kayode; Aarons, Leon

    2008-08-01

    This paper describes an approach for optimising sampling windows for population pharmacokinetic experiments. Sampling windows designs are more practical in late phase drug development where patients are enrolled in many centres and in out-patient clinic settings. Collection of samples under the uncontrolled environment at these centres at fixed times may be problematic and can result in uninformative data. Population pharmacokinetic sampling windows design provides an opportunity to control when samples are collected by allowing some flexibility and yet provide satisfactory parameter estimation. This approach uses information obtained from previous experiments about the model and parameter estimates to optimise sampling windows for population pharmacokinetic experiments within a space of admissible sampling windows sequences. The optimisation is based on a continuous design and in addition to sampling windows the structure of the population design in terms of the proportion of subjects in elementary designs, number of elementary designs in the population design and number of sampling windows per elementary design is also optimised. The results obtained showed that optimal sampling windows designs obtained using this approach are very efficient for estimating population PK parameters and provide greater flexibility in terms of when samples are collected. The results obtained also showed that the generalized equivalence theorem holds for this approach.

  14. Plant-wide performance optimisation – The refrigeration system case

    DEFF Research Database (Denmark)

    Green, Torben; Razavi-Far, Roozbeh; Izadi-Zamanabadi, Roozbeh;

    2012-01-01

    This paper investigates the problem of plant-wide performance optimisation seen from an industrial perspective. The refrigeration system is used as a case study, because it has a distributed control architecture and operates in steady state conditions, which is common for many industrial applications.

  15. Evaluation of a high throughput starch analysis optimised for wood.

    Directory of Open Access Journals (Sweden)

    Chandra Bellasio

    Full Text Available Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  16. Evaluation of a high throughput starch analysis optimised for wood.

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  17. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  18. Optimisation of Kinematics for Tracked Vehicle Hydro Gas Suspension System

    Directory of Open Access Journals (Sweden)

    S. Sridhar

    2006-11-01

    Full Text Available The modern-day armoured fighting vehicles (AFVs) are basically tracked vehicles equipped with hydro gas suspensions, in lieu of conventional mechanical suspensions like torsion bar and coil spring bogie suspensions. The uniqueness of hydro gas suspension is that it offers a nonlinear spring rate, which is very much required for the cross-country moveability of a tracked vehicle. The AFVs have to negotiate different cross-country terrains like sandy, rocky, riverbed, etc., and the road irregularities pose innumerable problems of dynamic loading for the design of the hydro gas suspension system. Optimising various design parameters demands innovative design methodologies to achieve better ride performance. Hence, a comprehensive kinematic analysis is needed. In this study, a methodology has been derived to optimise the kinematics of the suspension by reorienting the cylinder axis and optimising the load-transferring leverage factor so that the side thrust on the cylinder is minimised to a greater extent. The optimisation ultimately increases the life of the high-pressure and high-temperature piston seals, resulting in enhanced system life for better dependability.

  19. Cluster Optimisation using Cgroups at a Tier-2

    Science.gov (United States)

    Qin, G.; Roy, G.; Crooks, D.; Skipsey, S. C.; Stewart, G. P.; Britton, D.

    2016-10-01

    The Linux kernel feature Control Groups (cgroups) has been used to gather metrics on the resource usage of single and eight-core ATLAS workloads. It has been used to study the effects on performance of a reduction in the amount of physical memory. The results were used to optimise cluster performance, and consequently increase cluster throughput by up to 10%.
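
    Reading per-job memory metrics from a cgroup is straightforward; the sketch below assumes the cgroups v1 memory controller layout and a hypothetical group name, so the exact paths and file names will differ on cgroups v2 systems.

        # Sketch of reading per-job memory metrics from a cgroup, assuming the cgroups v1
        # memory controller layout; the group name "atlas_job_1234" is hypothetical and
        # paths differ under cgroups v2 (e.g. memory.current instead of usage_in_bytes).
        from pathlib import Path

        CGROUP = Path("/sys/fs/cgroup/memory/atlas_job_1234")

        def read_bytes(filename):
            return int((CGROUP / filename).read_text().strip())

        if CGROUP.exists():
            usage = read_bytes("memory.usage_in_bytes")
            peak = read_bytes("memory.max_usage_in_bytes")
            print(f"current usage: {usage / 2**20:.1f} MiB, peak: {peak / 2**20:.1f} MiB")
        else:
            print("cgroup not found; is the v1 memory controller mounted?")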

  20. A comparative study of marriage in honey bees optimisation (MBO ...

    African Journals Online (AJOL)

    2012-02-15

    Feb 15, 2012 ... complicate water management decision-making. ... evolutionary algorithms, such as the genetic algorithm (GA), ant colony optimisation for continuous ... biological properties. ... and proposed a new algorithm, called the 'artificial bee colony' ... as a set of transitions in a state–space (the environment), where.

  1. Smart optimisation and sensitivity analysis in water distribution systems

    CSIR Research Space (South Africa)

    Page, Philip R

    2015-12-01

    Full Text Available

  2. Application of Surpac and Whittle Software in Open Pit Optimisation ...

    African Journals Online (AJOL)

    Michael

    2015-06-01

    Jun 1, 2015 ... Optimisation and Design”, Ghana Mining Journal, Vol. 15, No. 1, pp. 35 - 43. Abstract ... reduction in metal price in the market. The work involved in open .... also aided in viewing the model in graphics. A constraint is a logical ...

  3. Optimising Microbial Growth with a Bench-Top Bioreactor

    Science.gov (United States)

    Baker, A. M. R.; Borin, S. L.; Chooi, K. P.; Huang, S. S.; Newgas, A. J. S.; Sodagar, D.; Ziegler, C. A.; Chan, G. H. T.; Walsh, K. A. P.

    2006-01-01

    The effects of impeller size, agitation and aeration on the rate of yeast growth were investigated using bench-top bioreactors. This exercise, carried out over a six-month period, served as an effective demonstration of the importance of different operating parameters on cell growth and provided a means of determining the optimisation conditions…

  4. Deterministic and robust optimisation strategies for metal forming proceesses

    NARCIS (Netherlands)

    Bonte, M.H.A.; Boogaard, van den A.H.; Huetink, J.

    2007-01-01

    Product improvement and cost reduction have always been important goals in the metal forming industry. The rise of Finite Element simulations for metal forming processes has contributed to these goals in a major way. More recently, coupling FEM simulations to mathematical optimisation techniques has

  5. A Bayesian Approach for Sensor Optimisation in Impact Identification

    Directory of Open Access Journals (Sweden)

    Vincenzo Mallardo

    2016-11-01

    Full Text Available This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination aimed at locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure is tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both the uniform and non-uniform probability of impact occurrence.

  6. Optimising a fall out dust monitoring sampling programme at a ...

    African Journals Online (AJOL)

    GREG

    The aim of this study at the specific cement manufacturing plant and open cast mine was ... Key words: Fall out dust monitoring, cement plant, optimising, air pollution sampling, ..... meters as this is in line with the height of a typical fall out dust.

  7. Optimisation of searches for Supersymmetry with the ATLAS detector

    Energy Technology Data Exchange (ETDEWEB)

    Zvolsky, Milan

    2012-01-15

    The ATLAS experiment is one of the four large experiments at the Large Hadron Collider which is specifically designed to search for the Higgs boson and physics beyond the Standard Model. The aim of this thesis is the optimisation of searches for Supersymmetry in decays with two leptons and missing transverse energy in the final state. Two different optimisation studies have been performed for two important analysis aspects: The final signal region selection and the choice of the trigger selection. In the first part of the analysis, a cut-based optimisation of signal regions is performed, maximising the signal for a minimal background contamination. By this, the signal yield can in parts be more than doubled. The second approach is to introduce di-lepton triggers which allow to lower the lepton transverse momentum threshold, thus enhancing the number of selected signal events significantly. The signal region optimisation was considered for the choice of the final event selection in the ATLAS di-lepton analyses. The trigger study contributed to the incorporation of di-lepton triggers to the ATLAS trigger menu. (orig.)

  8. Optimised prefactored compact schemes for linear wave propagation phenomena

    Science.gov (United States)

    Rona, A.; Spisso, I.; Hall, E.; Bernardini, M.; Pirozzoli, S.

    2017-01-01

    A family of space- and time-optimised prefactored compact schemes are developed that minimise the computational cost for given levels of numerical error in wave propagation phenomena, with special reference to aerodynamic sound. This work extends the approach of Pirozzoli [1] to the MacCormack type prefactored compact high-order schemes developed by Hixon [2], in which their shorter Padé stencil from the prefactorisation leads to a simpler enforcement of numerical boundary conditions. An explicit low-storage multi-step Runge-Kutta integration advances the states in time. Theoretical predictions for spatial and temporal error bounds are derived for the cost-optimised schemes and compared against benchmark schemes of current use in computational aeroacoustic applications in terms of computational cost for a given relative numerical error value. One- and two-dimensional test cases are presented to examine the effectiveness of the cost-optimised schemes for practical flow computations. An effectiveness up to about 50% higher than the standard schemes is verified for the linear one-dimensional advection solver, which is a popular baseline solver kernel for computational physics problems. A substantial error reduction for a given cost is also obtained in the more complex case of a two-dimensional acoustic pulse propagation, provided the optimised schemes are made to operate close to their nominal design points.

  9. Using break quantities for tactical optimisation in multistage distribution systems

    NARCIS (Netherlands)

    M.J. Kleijn (Marcel); R. Dekker (Rommert)

    1997-01-01

    In this chapter we discuss a tactical optimisation problem that arises in a multistage distribution system where customer orders can be delivered from any stockpoint. A simple rule to allocate orders to locations is a break quantity rule, which routes large orders to higher-stage stockpoints ...
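
    The break quantity rule itself is simple to state in code, as sketched below with an invented threshold and invented order sizes.

        # The break quantity rule in its simplest form: orders at or above the break quantity
        # are routed to a higher-stage stockpoint, smaller orders are served locally.
        # The threshold and order sizes are invented for illustration.

        def allocate(order_qty, break_qty=50):
            return "higher-stage stockpoint" if order_qty >= break_qty else "local stockpoint"

        for qty in (5, 49, 50, 120):
            print(f"order of {qty} units -> {allocate(qty)}")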

  10. Wear/comfort Pareto optimisation of bogie suspension

    Science.gov (United States)

    Milad Mousavi Bideleh, Seyed; Berbyuk, Viktor; Persson, Rickard

    2016-08-01

    Pareto optimisation of bogie suspension components is considered for a 50 degrees of freedom railway vehicle model to reduce wheel/rail contact wear and improve passenger ride comfort. Several operational scenarios including tracks with different curve radii ranging from very small radii up to straight tracks are considered for the analysis. In each case, the maximum admissible speed is applied to the vehicle. Design parameters are categorised into two levels and the wear/comfort Pareto optimisation is accordingly accomplished in a multistep manner to improve the computational efficiency. The genetic algorithm (GA) is employed to perform the multi-objective optimisation. Two suspension system configurations are considered, a symmetric and an asymmetric in which the primary or secondary suspension elements on the right- and left-hand sides of the vehicle are not the same. It is shown that the vehicle performance on curves can be significantly improved using the asymmetric suspension configuration. The Pareto-optimised values of the design parameters achieved here guarantee wear reduction and comfort improvement for railway vehicles and can also be utilised in developing the reference vehicle models for design of bogie active suspension systems.

  11. Adaptive optimisation of a generalised beam shaping system

    DEFF Research Database (Denmark)

    Kenny, F.; Choi, F. S.; Glückstad, Jesper

    2015-01-01

    filter were generated by the SLM. This provided extra flexibility and control over the parameters of the system including the phase step magnitude, shape, radius and position of the filter. A feedback method for the on-line optimisation of these properties was also developed. Using feedback from images...

  12. Multi-disciplinary design optimisation via dashboard portals

    NARCIS (Netherlands)

    Uijtenhaak, T.; Coenders, J.L.

    2012-01-01

    This paper will present the opportunities a dashboard-based system provides to gather information on alternatives and display this in such a way that it will help the user making decisions by making use of multi-disciplinary optimisation (MDO) technology. The multi-disciplinary set up of the program

  13. A metamodel based optimisation algorithm for metal forming processes

    NARCIS (Netherlands)

    Bonte, M.H.A.; Boogaard, van den A.H.; Huetink, J.; Banabic, Dorel

    2007-01-01

    Cost saving and product improvement have always been important goals in the metal forming industry. To achieve these goals, metal forming processes need to be optimised. During the last decades, simulation software based on the Finite Element Method (FEM) has significantly contributed to designing f

  14. Optimised cantilever biosensor with piezoresistive read-out

    DEFF Research Database (Denmark)

    Rasmussen, Peter; Thaysen, J.; Hansen, Ole

    2003-01-01

    We present a cantilever-based biochemical sensor with piezoresistive read-out which has been optimised for measuring surface stress. The resistors and the electrical wiring on the chip are encapsulated in low-pressure chemical vapor deposition (LPCVD) silicon nitride, so that the chip is well sui...

  15. Optimisation of selective breeding program for Nile tilapia (Oreochromis niloticus)

    NARCIS (Netherlands)

    Trong, T.Q.

    2013-01-01

      The aim of this thesis was to optimise the selective breeding program for Nile tilapia in the Mekong Delta region of Vietnam. Two breeding schemes, the “classic” BLUP scheme following the GIFT method (with pair mating) and a rotational mating scheme with own performance selection

  16. SINGLE FIXED CRANE OPTIMISATION WITHIN A DISTRIBUTION CENTRE

    Directory of Open Access Journals (Sweden)

    J. Matthews

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: This paper considers the optimisation of the movement of a fixed crane operating in a single aisle of a distribution centre. The crane must move pallets in inventory between docking bays, storage locations, and picking lines. Both a static and a dynamic approach to the problem are presented. The optimisation is performed by means of tabu search, ant colony metaheuristics, and hybrids of these two methods. All these solution approaches were tested on real life data obtained from an operational distribution centre. Results indicate that the hybrid methods outperform the other approaches.

    AFRIKAANSE OPSOMMING (translated): The optimisation of the movement of a fixed crane in a single aisle of a distribution centre is considered in this article. The crane must transport pallets between docking bays, storage positions, and picking lines. Both a static and a dynamic approach to the problem are presented. The optimisation is done with the aid of tabu searches, ant colony optimisation, and hybrids of these two methods. All the solution approaches were tested with real data obtained from an operational distribution centre. The results show that the hybrid methods deliver the best solutions.

  17. Hierarchical optimisation on scissor seat suspension characteristic and structure

    Science.gov (United States)

    Wang, Chunlei; Zhang, Xinjie; Guo, Konghui; Lv, Jiming; Yang, Yi

    2016-11-01

    Scissor seat suspension has been applied widely to attenuate the cab vibrations of commercial vehicles, while its design generally needs a trade-off between the seat acceleration and suspension travel, which creates a typical optimisation issue. A complexity for this issue is that the optimal dynamics parameters are not easy to approach solutions fast and unequivocally. Hence, the hierarchical optimisation on scissor seat suspension characteristic and structure is proposed, providing a top-down methodology with the globally optimal and fast convergent solutions to compromise these design contradictions. In details, a characteristic-oriented non-parametric dynamics model of the scissor seat suspension is formulated firstly via databases, describing its vertical dynamics accurately. Then, the ideal vertical stiffness-damping characteristic is cascaded via the characteristic-oriented model, and the structure parameters are optimised in accordance with a structure-oriented multi-body dynamics model of the scissor seat suspension. Eventually, the seat effective amplitude transmissibility factor, suspension travel and the CPU time for solving are evaluated. The results show the seat suspension performance and convergent speed of the globally optimal solutions are improved well. Hence, the proposed hierarchical optimisation methodology regarding characteristic and structure of the scissor seat suspension is promising for its virtual development.

  18. Multi-disciplinary design optimisation via dashboard portals

    NARCIS (Netherlands)

    Uijtenhaak, T.; Coenders, J.L.

    2012-01-01

    This paper will present the opportunities a dashboard-based system provides to gather information on alternatives and display this in such a way that it will help the user making decisions by making use of multi-disciplinary optimisation (MDO) technology. The multi-disciplinary set up of the program

  19. Analysing the performance of dynamic multi-objective optimisation algorithms

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available Presented at the Congress on Evolutionary Computation, 20-23 June 2013, Cancún, México.

  20. Self-organising sensor web using cell-fate optimisation

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2009-07-01

    Full Text Available may be doing so both dynamically and stochastically. When presented by a dynamic and stochastic changing environment, such as a sensor resource unexpectedly going down, a self-adaptive system should exhibit robustness. Cell-fate optimisation and signal...

  1. Gene

    Data.gov (United States)

    U.S. Department of Health & Human Services — Gene integrates information from a wide range of species. A record may include nomenclature, Reference Sequences (RefSeqs), maps, pathways, variations, phenotypes,...

  2. Optimised electroporation mediated DNA vaccination for treatment of prostate cancer.

    LENUS (Irish Health Repository)

    Ahmad, Sarfraz

    2010-01-01

    ABSTRACT: BACKGROUND: Immunological therapies enhance the ability of the immune system to recognise and destroy cancer cells via selective killing mechanisms. DNA vaccines have potential to activate the immune system against specific antigens, with accompanying potent immunological adjuvant effects from unmethylated CpG motifs as on prokaryotic DNA. We investigated an electroporation driven plasmid DNA vaccination strategy in animal models for treatment of prostate cancer. METHODS: Plasmid expressing human PSA gene (phPSA) was delivered in vivo by intra-muscular electroporation, to induce effective anti-tumour immune responses against prostate antigen expressing tumours. Groups of male C57BL/6 mice received intra-muscular injections of phPSA plasmid. For phPSA delivery, quadriceps muscle was injected with 50 µg plasmid. After 80 seconds, square-wave pulses were administered in sequence using a custom designed pulse generator and a custom-designed applicator with 2 needles placed through the skin central to the muscle. To determine an optimum treatment regimen, three different vaccination schedules were investigated. In a separate experiment, the immune potential of the phPSA vaccine was further enhanced with co-administration of synthetic CpG rich oligonucleotides. One week after last vaccination, the mice were challenged subcutaneously with TRAMPC1/hPSA (prostate cancer cell line stably expressing human PSA) and tumour growth was monitored. Serum from animals was examined by ELISA for anti-hPSA antibodies and for IFN-gamma. Histological assessment of the tumours was also carried out. In vivo and in vitro cytotoxicity assays were performed with splenocytes from treated mice. RESULTS: The phPSA vaccine therapy significantly delayed the appearance of tumours and resulted in prolonged survival of the animals. Four-dose vaccination regimen provided optimal immunological effects. Co-administration of the synthetic CpG with phPSA increased anti-tumour responses

  3. A methodological approach to the design of optimising control strategies for sewer systems

    DEFF Research Database (Denmark)

    Mollerup, Ane Loft; Mikkelsen, Peter Steen; Sin, Gürkan

    2016-01-01

    This study focuses on designing an optimisation based control for a sewer system in a methodological way and linking it to a regulatory control. Optimisation based design is found to depend on proper choice of a model, formulation of the objective function and tuning of optimisation parameters ... control; a rule based expert system. On the other hand, compared with a regulatory control technique designed earlier in Mollerup et al. (2015), the optimisation showed similar performance with respect to minimising overflow volume. Hence for the operation of small sewer systems, regulatory control strategies can offer promising potential and should be considered along with more advanced strategies when identifying novel solutions.

  4. Mesh dependence in PDE-constrained optimisation: an application in tidal turbine array layouts

    CERN Document Server

    Schwedes, Tobias; Funke, Simon W; Piggott, Matthew D

    2017-01-01

    This book provides an introduction to PDE-constrained optimisation using finite elements and the adjoint approach. The practical impact of the mathematical insights presented here are demonstrated using the realistic scenario of the optimal placement of marine power turbines, thereby illustrating the real-world relevance of best-practice Hilbert space aware approaches to PDE-constrained optimisation problems. Many optimisation problems that arise in a real-world context are constrained by partial differential equations (PDEs). That is, the system whose configuration is to be optimised follows physical laws given by PDEs. This book describes general Hilbert space formulations of optimisation algorithms, thereby facilitating optimisations whose controls are functions of space. It demonstrates the importance of methods that respect the Hilbert space structure of the problem by analysing the mathematical drawbacks of failing to do so. The approaches considered are illustrated using the optimisation problem arisin...

  5. Optimising steel production schedules via a hierarchical genetic algorithm

    Directory of Open Access Journals (Sweden)

    Worapradya, Kiatkajohn

    2014-08-01

    This paper presents an effective scheduling approach for a steel-making continuous casting (SCC) plant. The main contribution of this paper is the formulation of a new optimisation model that more closely represents real-world situations, and a hierarchical genetic algorithm (HGA) tailored particularly for searching for an optimal SCC schedule. The optimisation model is developed by integrating the two main planning phases of traditional scheduling: (1) planning the cast sequence, and (2) scheduling of steel-making and timing of all jobs. A novel procedure is given for genetic algorithm (GA) chromosome coding that maps between the Gantt chart and hierarchical chromosomes. The performance of the proposed methodology is illustrated and compared with a two-phase traditional scheduling approach and a standard GA toolbox. Both qualitative and quantitative performance measures are investigated.
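
    The hierarchical GA described above is built around chromosome codings that map onto a Gantt chart; that coding is not reproduced here. As a minimal, hedged sketch of the underlying machinery, the Python snippet below evolves a permutation of casts against a toy tardiness objective using order crossover and swap mutation; the cast data and the objective function are invented for illustration and are not the paper's model.

        import random

        # Toy cast data: (processing_time, due_date) -- illustrative values only.
        CASTS = [(4, 10), (3, 6), (6, 18), (2, 5), (5, 14), (7, 25)]

        def total_tardiness(order):
            """Sum of tardiness when casts run back-to-back in the given order."""
            clock, tardiness = 0, 0
            for idx in order:
                processing, due = CASTS[idx]
                clock += processing
                tardiness += max(0, clock - due)
            return tardiness

        def order_crossover(p1, p2):
            """Classic OX: copy a slice from p1, fill the rest in p2's order."""
            n = len(p1)
            a, b = sorted(random.sample(range(n), 2))
            child = [None] * n
            child[a:b] = p1[a:b]
            fill = [g for g in p2 if g not in child]
            for i in range(n):
                if child[i] is None:
                    child[i] = fill.pop(0)
            return child

        def mutate(perm, rate=0.2):
            """Occasionally swap two positions to keep diversity."""
            if random.random() < rate:
                i, j = random.sample(range(len(perm)), 2)
                perm[i], perm[j] = perm[j], perm[i]
            return perm

        def genetic_schedule(pop_size=30, generations=200):
            pop = [random.sample(range(len(CASTS)), len(CASTS)) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=total_tardiness)
                parents = pop[: pop_size // 2]          # truncation selection
                children = [mutate(order_crossover(*random.sample(parents, 2)))
                            for _ in range(pop_size - len(parents))]
                pop = parents + children
            best = min(pop, key=total_tardiness)
            return best, total_tardiness(best)

        if __name__ == "__main__":
            order, tardiness = genetic_schedule()
            print("best cast order:", order, "total tardiness:", tardiness)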

  6. CAE process to simulate and optimise engine noise and vibration

    Science.gov (United States)

    Junhong, Zhang; Jun, Han

    2006-08-01

    The vibratory and acoustic behaviour of the internal combustion engine is highly complex, consisting of many components that are subject to loads that vary greatly in magnitude and which operate over a wide range of speeds. CAE tool development will lead to a significant reduction in the duration of the engine development period as well as ensure a dramatic increase in product quality. This paper presents today's state-of-the-art CAE capabilities in the simulation of the dynamic and acoustic behaviour of the engine and focuses on the relative merits of modification and full-scale structural/acoustic optimisation of the engine, together with the creation of new low-noise designs. Modern CAE tools allow the analysis, assessment and acoustic optimisation of the engine.

  7. Optimisation of metal charge material for electric arc furnace

    Directory of Open Access Journals (Sweden)

    T. Lis

    2011-10-01

    The analysis of changes in crude steel production volumes implies a gradual increase in production since the mid-20th century. This tendency has been slightly hampered by the economic depression. At the same time, market requirements enforce improvement of the quality of the manufactured products with simultaneous minimisation of production costs. One of the tools applied to solve these problems is mathematical optimisation. The author of this paper presents an example of applying a multi-criteria optimisation method to improve the efficiency of steel smelting in an electric arc furnace (EAF) through appropriate choice of the charge scrap. A measurable effect of applying such a methodology for choosing the metal charge is the ability to reduce the unit cost of steel smelting.
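
    The scrap-charge choice can be illustrated with a classic blending formulation: minimise charge cost subject to chemistry limits and a fixed heat mass. The sketch below solves such a toy single-criterion blend with scipy's linprog; the scrap grades, prices and composition limits are invented assumptions and do not come from the paper, whose model is multi-criteria.

        from scipy.optimize import linprog

        # Invented scrap grades: (price EUR/t, %Cu, %Ni) -- illustrative only.
        grades = {
            "heavy_scrap": (310.0, 0.25, 0.10),
            "shredded":    (290.0, 0.40, 0.15),
            "pig_iron":    (420.0, 0.02, 0.01),
        }
        names = list(grades)
        price = [grades[g][0] for g in names]

        HEAT_MASS = 100.0              # tonnes of metallic charge per heat
        MAX_CU, MAX_NI = 0.30, 0.12    # chemistry limits, %

        # Decision variables: tonnes of each grade; objective: total charge cost.
        A_ub = [
            [grades[g][1] for g in names],   # mass-weighted Cu below limit
            [grades[g][2] for g in names],   # mass-weighted Ni below limit
        ]
        b_ub = [MAX_CU * HEAT_MASS, MAX_NI * HEAT_MASS]
        A_eq = [[1.0] * len(names)]          # grades must add up to the heat mass
        b_eq = [HEAT_MASS]

        res = linprog(price, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, None)] * len(names))
        print(dict(zip(names, res.x.round(2))), "cost:", round(res.fun, 1))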

  8. Compressed Sensing with Nonlinear Observations and Related Nonlinear Optimisation Problems

    CERN Document Server

    Blumensath, Thomas

    2012-01-01

    Non-convex constraints have recently proven a valuable tool in many optimisation problems. In particular sparsity constraints have had a significant impact on sampling theory, where they are used in Compressed Sensing and allow structured signals to be sampled far below the rate traditionally prescribed. Nearly all of the theory developed for Compressed Sensing signal recovery assumes that samples are taken using linear measurements. In this paper we instead address the Compressed Sensing recovery problem in a setting where the observations are non-linear. We show that, under conditions similar to those required in the linear setting, the Iterative Hard Thresholding algorithm can be used to accurately recover sparse or structured signals from few non-linear observations. Similar ideas can also be developed in a more general non-linear optimisation framework. In the second part of this paper we therefore present related results that show how this can be done under sparsity and union of subspaces constraints, wh...
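
    The linear-measurement baseline that the paper generalises is easy to state in code. The sketch below is a minimal Iterative Hard Thresholding loop for the standard linear Compressed Sensing setting (the non-linear extension analysed in the paper is not reproduced); matrix sizes, sparsity level and step size are illustrative assumptions.

        import numpy as np

        def hard_threshold(x, s):
            """Keep the s largest-magnitude entries of x and zero the rest."""
            out = np.zeros_like(x)
            idx = np.argsort(np.abs(x))[-s:]
            out[idx] = x[idx]
            return out

        def iht(A, y, s, iters=200):
            """Iterative Hard Thresholding: x <- H_s(x + mu * A^T (y - A x))."""
            mu = 1.0 / np.linalg.norm(A, 2) ** 2     # conservative step size
            x = np.zeros(A.shape[1])
            for _ in range(iters):
                x = hard_threshold(x + mu * A.T @ (y - A @ x), s)
            return x

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            n, m, s = 256, 80, 5
            A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
            x_true = np.zeros(n)
            x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
            y = A @ x_true                                  # linear observations
            x_hat = iht(A, y, s)
            print("recovery error:", np.linalg.norm(x_hat - x_true))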

  9. Optimisation of VSC-HVDC Transmission for Wind Power Plants

    DEFF Research Database (Denmark)

    Silva, Rodrigo Da

    Connection of Wind Power Plants (WPP), typically offshore, using VSC-HVDC transmission is an emerging solution with many benefits compared to the traditional AC solution, especially concerning the impact on the control architecture of the wind farms and the grid. The VSC-HVDC solution is likely to meet more stringent grid codes than a conventional AC transmission connection. The purpose of this project is to analyse how an HVDC solution, considering the voltage-source converter based technology, for grid connection of large wind power plants can be designed and optimised. By optimisation, the project ... the requirements established by the operators in the multiterminal VSC-HVDC transmission system. Moreover, the possibility of minimising the overall transmission losses can be a solution for small grids, and the minimisation of the dispatch error is a new solution for power delivery maximisation. The second study...

  10. Comparing and Optimising Parallel Haskell Implementations for Multicore Machines

    DEFF Research Database (Denmark)

    Berthold, Jost; Marlow, Simon; Hammond, Kevin

    2009-01-01

    The GpH implementation investigated here uses a physically-shared heap, which should be well-suited to multicore architectures. In contrast, the Eden implementation adopts an approach that has been designed for use on distributed-memory parallel machines: a system of multiple, independent heaps (one per core), with inter-core communication handled by message-passing rather than through shared heap cells. We report two main results. Firstly, we report on the effect of a number of optimisations that we applied to the shared-memory GpH implementation in order to address some performance issues that were revealed by our testing: for example, we implemented a work-stealing approach to task allocation. Our optimisations improved the performance of the shared-heap GpH implementation by as much as 30% on eight cores. Secondly, the shared heap approach is, rather surprisingly, not superior to a distributed heap...

  11. Optimising stroke volume and oxygen delivery in abdominal aortic surgery

    DEFF Research Database (Denmark)

    Bisgaard, J; Gilsaa, T; Rønholm, E

    2012-01-01

    BACKGROUND: Post-operative complications after open elective abdominal aortic surgery are common, and individualised goal-directed therapy may improve outcome in high-risk surgery. We hypothesised that individualised goal-directed therapy, targeting stroke volume and oxygen delivery, can reduce ... In the intervention group, stroke volume was optimised by 250 ml colloid boluses intraoperatively and for the first 6 h post-operatively. The optimisation aimed at an oxygen delivery of 600 ml/min/m(2) in the post-operative period. Haemodynamic data were collected at pre-defined time points, including baseline, intraoperatively and post-operatively. Patients were followed up for 30 days. RESULTS: Stroke volume index and oxygen delivery index were both higher in the post-operative period in the intervention group. In this group, 27 of 32 achieved the post-operative oxygen delivery index target vs. 18 of 32 in the control...

  12. Eyes Wide Open - Optimising Cosmological Surveys in a Crowded Market

    CERN Document Server

    Bassett, B A

    2004-01-01

    Optimising the major next-generation cosmological surveys (such as SNAP, KAOS, etc.) is a key problem given our ignorance of the physics underlying cosmic acceleration and the plethora of surveys planned. We propose a Bayesian design framework which (1) maximises the discrimination power of a survey without assuming any underlying dark energy model, (2) finds the best niche survey geometry given current data and future competing experiments, (3) maximises the cross-section for serendipitous discoveries and (4) can be adapted to answer specific questions (such as `is dark energy dynamical?'). Integrated Parameter Space Optimisation (IPSO) is a design framework that integrates projected parameter errors over an entire dark energy parameter space and then extremises a figure of merit (such as Shannon entropy gain, which we show is stable to off-diagonal covariance matrix perturbations) as a function of survey parameters using analytical, grid or MCMC techniques. IPSO is thus a flexible, model-independent and scal...

  13. Biomass supply chain optimisation for Organosolv-based biorefineries.

    Science.gov (United States)

    Giarola, Sara; Patel, Mayank; Shah, Nilay

    2014-05-01

    This work aims at providing a Mixed Integer Linear Programming modelling framework to help define planning strategies for the development of sustainable biorefineries. The up-scaling of an Organosolv biorefinery was addressed via optimisation of the whole system economics. Three real world case studies were addressed to show the high-level flexibility and wide applicability of the tool to model different biomass typologies (i.e. forest fellings, cereal residues and energy crops) and supply strategies. Model outcomes have revealed how supply chain optimisation techniques could help shed light on the development of sustainable biorefineries. Feedstock quality, quantity, temporal and geographical availability are crucial to determine biorefinery location and the cost-efficient way to supply the feedstock to the plant. Storage costs are relevant for biorefineries based on cereal stubble, while wood supply chains present dominant pretreatment operations costs.
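
    As a hedged illustration of the kind of MILP such a framework rests on, the sketch below sets up a toy single-period biorefinery location problem in PuLP: binary variables open plants, continuous variables ship biomass, and the objective trades fixed plant costs against transport costs. Every site name, cost, capacity and demand figure is invented for the example and is not taken from the study.

        from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

        # Invented data: biomass supply zones and candidate plant sites.
        zones = {"Z1": 120, "Z2": 80, "Z3": 60}             # available biomass, kt/yr
        sites = {"S1": 900, "S2": 700}                       # fixed opening cost, kEUR/yr
        capacity = {"S1": 150, "S2": 120}                    # plant capacity, kt/yr
        transport = {("Z1", "S1"): 8,  ("Z1", "S2"): 12,     # kEUR per kt shipped
                     ("Z2", "S1"): 10, ("Z2", "S2"): 6,
                     ("Z3", "S1"): 14, ("Z3", "S2"): 7}
        demand = 180                                         # feedstock required, kt/yr

        prob = LpProblem("biorefinery_location", LpMinimize)
        open_site = {s: LpVariable(f"open_{s}", cat=LpBinary) for s in sites}
        ship = {(z, s): LpVariable(f"ship_{z}_{s}", lowBound=0)
                for z in zones for s in sites}

        # Objective: annualised fixed costs plus transport costs.
        prob += (lpSum(sites[s] * open_site[s] for s in sites)
                 + lpSum(transport[z, s] * ship[z, s] for z in zones for s in sites))

        # Supply limits, capacities tied to the opening decision, total demand.
        for z in zones:
            prob += lpSum(ship[z, s] for s in sites) <= zones[z]
        for s in sites:
            prob += lpSum(ship[z, s] for z in zones) <= capacity[s] * open_site[s]
        prob += lpSum(ship[z, s] for z in zones for s in sites) >= demand

        prob.solve()
        print({s: open_site[s].value() for s in sites})
        print({k: v.value() for k, v in ship.items() if v.value()})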

  14. Optimised Calibration Method for Six-Port Junction

    Institute of Scientific and Technical Information of China (English)

    XIONG Xiang-zheng; LIAO Cheng; XIAO Hua-qing

    2008-01-01

    A dual-tone technique is used to produce multiple samples when optimising the calibration of a six-port junction. More accurate results are achieved by using the least-squares method and excluding those samples that may cause larger errors. A 0.80-1.10 GHz microwave integrated circuit (MIC) six-port reflectometer is constructed. Nine test samples are used in the measurement. With Engen's calibration procedure, the difference between the HP8510 and the six-port reflectometer is of the order of 0.20 dB/1.5° for most cases, and above 0.50 dB/5.0° at the boundary frequencies. With the optimised method, the difference is less than 0.10 dB/1.0° for most cases, and the biggest error is 0.42 dB/2.1° at the boundary frequencies.
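
    The central idea, solving an over-determined calibration by least squares and discarding the samples that contribute the largest errors, can be illustrated generically. The sketch below fits a linear calibration model with numpy, drops the worst-residual samples and refits; the model and the data are placeholders and are not the six-port calibration equations.

        import numpy as np

        def fit_with_outlier_rejection(X, y, keep_fraction=0.8):
            """Least-squares fit, then refit using only the best-fitting samples."""
            c, *_ = np.linalg.lstsq(X, y, rcond=None)     # initial fit on all samples
            residuals = np.abs(y - X @ c)
            n_keep = max(X.shape[1], int(keep_fraction * len(y)))
            keep = np.argsort(residuals)[:n_keep]         # discard largest residuals
            c_refit, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
            return c_refit

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            f = np.linspace(0.8, 1.1, 9)                  # nine test samples (GHz, notional)
            X = np.column_stack([np.ones_like(f), f])     # simple linear model
            y = 0.3 + 2.0 * f + 0.01 * rng.standard_normal(9)
            y[-1] += 0.5                                  # one corrupted boundary sample
            print("coefficients:", fit_with_outlier_rejection(X, y))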

  15. An Experimental Approach for Optimising Mobile Agent Migrations

    CERN Document Server

    Gavalas, Damianos

    2010-01-01

    The field of mobile agent (MA) technology has been intensively researched during the past few years, resulting in the phenomenal proliferation of available MA platforms, all sharing several common design characteristics. Research projects have mainly focused on identifying applications where the employment of MAs is preferable compared to centralised or alternative distributed computing models. Very little work has been done on examining how MA platform design can be optimised so that the network traffic and latency associated with MA transfers are minimised. The work presented in this paper addresses these issues by investigating the effect of several optimisation ideas applied to our MA platform prototype. Furthermore, we discuss the results of a set of timing experiments that offer a better understanding of the agent migration process, and recommend new techniques for reducing MA transfer delay.

  16. Topology Optimised Broadband Photonic Crystal Y-Splitter

    DEFF Research Database (Denmark)

    Borel, Peter Ingo; Frandsen, Lars Hagedorn; Harpøth, Anders

    2005-01-01

    A planar photonic crystal waveguide Y-splitter that exhibits large-bandwidth low-loss 3 dB splitting for TE-polarised light has been fabricated in silicon-on-insulator material. The high performance is achieved by utilising topology optimisation to design the Y-junction and by using topology optimised low-loss 60° bends. The average excess loss of the entire component is found to be 0.44±0.29 dB for a 100 nm bandwidth, and the excess loss due to the Y-junction is found to be 0.34±0.30 dB in a 175 nm bandwidth.

  17. Development and optimisation of electrode materials in solid oxide fuel cells

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A solid oxide fuel cell (SOFC) is an all-solid electrochemical device that converts fuels such as hydrogen and natural gas to electricity with high efficiency and very low greenhouse gas emissions compared to traditional thermal power generation plants. The reliability and efficiency of an SOFC are critically dependent on the performance and stability of its components, including the anode, cathode and electrolyte. This in turn is largely dependent on material selection and the fabrication processes. In this paper, specific examples are given to demonstrate the strategy and process in the development and optimisation of electrode materials such as Ni/Y2O3-ZrO2 cermet anodes and (La,Sr)MnO3-based cathodes. The results also demonstrate the importance of fabrication processes and that an understanding of the electrode process plays a very important role in the optimisation of electrode materials.

  18. Computed tomography dose optimisation in cystic fibrosis: A review.

    LENUS (Irish Health Repository)

    Ferris, Helena

    2016-04-28

    Cystic fibrosis (CF) is the most common autosomal recessive disease of the Caucasian population worldwide, with respiratory disease remaining the most relevant source of morbidity and mortality. Computed tomography (CT) is frequently used for monitoring disease complications and progression. Over the last fifteen years there has been a six-fold increase in the use of CT, which has led to growing concern in relation to cumulative radiation exposure. The challenge to the medical profession is to identify dose reduction strategies that maintain acceptable image quality while fulfilling the requirements of a diagnostic-quality CT. Dose optimisation, particularly in CT, is essential as it reduces the chances of patients receiving cumulative radiation doses in excess of 100 mSv, a dose deemed significant by the United Nations Scientific Committee on the Effects of Atomic Radiation. This review article explores the current trends in imaging in CF with particular emphasis on new developments in dose optimisation.

  19. Alginate microencapsulated hepatocytes optimised for transplantation in acute liver failure.

    Directory of Open Access Journals (Sweden)

    Suttiruk Jitraruch

    BACKGROUND AND AIM: Intraperitoneal transplantation of alginate-microencapsulated human hepatocytes is an attractive option for the management of acute liver failure (ALF), providing short-term support to allow native liver regeneration. The main aim of this study was to establish an optimised protocol for production of alginate-encapsulated human hepatocytes and evaluate their suitability for clinical use. METHODS: Human hepatocyte microbeads (HMBs) were prepared using sterile GMP grade materials. We determined the physical stability, cell viability, and hepatocyte metabolic function of HMBs using different polymerisation times and cell densities. The immune activation of peripheral blood mononuclear cells (PBMCs) after co-culture with HMBs was studied. Rats with ALF induced by galactosamine were transplanted intraperitoneally with rat hepatocyte microbeads (RMBs) produced using a similar optimised protocol. Survival rate and biochemical profiles were determined. Retrieved microbeads were evaluated for morphology and functionality. RESULTS: The optimised HMBs were of uniform size (583.5±3.3 µm) and mechanically stable using a 15 min polymerisation time compared to 10 min and 20 min (p<0.001). 3D confocal microscopy images demonstrated that hepatocytes with similar cell viability were evenly distributed within HMBs. A cell density of 3.5×10(6) cells/ml provided the highest viability. HMBs incubated in human ascitic fluid showed better cell viability and function than controls. There was no significant activation of PBMCs co-cultured with empty or hepatocyte microbeads, compared to PBMCs alone. Intraperitoneal transplantation of RMBs was safe and significantly improved the severity of liver damage compared to control groups (empty microbeads and medium alone; p<0.01). Retrieved RMBs were intact and free of immune cell adherence and contained viable hepatocytes with preserved function. CONCLUSION: An optimised protocol to produce GMP grade alginate

  20. Integrated planning tool for optimisation in municipal home care

    OpenAIRE

    Røhne, Mette; Sandåker, Torjus; Ausen, Dag; Grut, Lisbet

    2016-01-01

    Purpose: The objective is to improve collaboration and enhance the quality of care services in municipal home care services by implementing and developing an integrated planning tool making use of optimisation technology for better decision support. The project will, through piloting and action-based research, establish knowledge on changes in work processes to improve collaboration and efficiency. Context: A planning tool called Spider has been piloted in home care in Horten municipality since 201...

  1. Topology Optimisation for Energy Management in Underwater Sensor Networks

    Science.gov (United States)

    2015-01-01

    International Journal of Control, 2015, http://dx.doi.org/10.1080/00207179.2015.1017006. Author affiliations: Pennsylvania State University, University Park, PA; Naval Undersea Warfare Center, Newport, RI; United Technology Research Center, Cork, Ireland.

  2. Optimising automation of a manual enzyme-linked immunosorbent assay

    OpenAIRE

    Corena de Beer; Monika Esser; Wolfgang Preiser

    2011-01-01

    Objective: Enzyme-linked immunosorbent assays (ELISAs) are widely used to quantify immunoglobulin levels induced by infection or vaccination. Compared to conventional manual assays, automated ELISA systems offer more accurate and reproducible results, faster turnaround times and cost effectiveness due to the use of multianalyte reagents. Design: The VaccZyme™ Human Anti-Haemophilus influenzae type B (Hib) kit (MK016) from The Binding Site Company was optimised to be used on an automated BioRad...

  3. Optimisation of brain SPET and portability of normal databases

    Energy Technology Data Exchange (ETDEWEB)

    Barnden, Leighton R.; Behin-Ain, Setayesh; Goble, Elizabeth A. [The Queen Elizabeth Hospital, Adelaide (Australia); Hatton, Rochelle L.; Hutton, Brian F. [Westmead Hospital, Sydney (Australia)

    2004-03-01

    Use of a normal database in quantitative regional analysis of brain single-photon emission tomography (SPET) facilitates the detection of functional defects in individual or group studies by accounting for inter-subject variability. Different reconstruction methods and suboptimal attenuation and scatter correction methods can introduce additional variance that will adversely affect such analysis. Similarly, processing differences across different instruments and/or institutions may invalidate the use of external normal databases. The object of this study was to minimise additional variance by comparing reconstructions of a physical phantom with its numerical template so as to optimise processing parameters. Age- and gender-matched normal scans acquired on two different systems were compared using SPM99 after processing with both standard and optimised parameters. For three SPET systems we have optimised parameters for attenuation correction, lower window scatter subtraction, reconstructed pixel size and fanbeam focal length for both filtered back-projection (FBP) and iterative (OSEM) reconstruction. Both attenuation and scatter correction improved accuracy for all systems. For single-iteration Chang attenuation correction the optimum attenuation coefficient (mu) was 0.45-0.85 of the narrow beam value (Nmu) before, and 0.75-0.85 Nmu after, scatter subtraction. For accurately modelled OSEM attenuation correction, optimum mu was 0.6-0.9 Nmu before and 0.9-1.1 Nmu after scatter subtraction. FBP appeared to change in-plane voxel dimensions by about 2% and this was confirmed by line phantom measurements. Improvement in accuracy with scatter subtraction was most marked for the highest spatial resolution system. Optimised processing reduced but did not remove highly significant regional differences between normal databases acquired on two different SPET systems. (orig.)

  4. Geometrical optimisation of a biochip microchannel fluidic separator.

    Science.gov (United States)

    Xue, Xiangdong; Patel, Mayur K; Bailey, Chris; Desmulliez, Marc P Y

    2012-01-01

    This article reports on the geometric optimisation of a T-shaped biochip microchannel fluidic separator aiming to maximise the separation efficiency of plasma from blood through the improvement of the unbalanced separation performance among different channel bifurcations. For this purpose, an algebraic analysis is firstly implemented to identify the key parameters affecting fluid separation. A numerical optimisation is then carried out to search the key parameters for improved separation performance of the biochip. Three parameters, the interval length between bifurcations, the main channel length from the outlet to the bifurcation region and the side channel geometry, are identified as the key characteristic sizes and defined as optimisation variables. A balanced flow rate ratio between the main and side channels, which is an indication of separation effectiveness, is defined as the objective. It is found that the degradation of the separation performance is caused by the unbalanced channel resistance ratio between the main and side channel routes from bifurcations to outlets. The effects of the three key parameters can be summarised as follows: (a) shortening the interval length between bifurcations moderately reduces the differences in the flow rate ratios; (b) extending the length of the main channel from the main outlet is effective for achieving a uniformity of flow rate ratio but ineffective in changing the velocity difference of the side channels and (c) decreasing the lengths of side channels from upstream to downstream is effective for both obtaining a uniform flow rate ratio and reducing the differences in the flow velocities between the side branch channels. An optimisation process combining the three parameters is suggested as this integration approach leads to fast convergent process and also offers flexible design options for satisfying different requirements.
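
    In a lumped model, the balancing argument above reduces to comparing hydraulic resistances of the competing routes, in direct analogy with an electrical resistor network. The sketch below uses the standard shallow-rectangular-channel resistance approximation to estimate the flow-rate split at a single bifurcation; the viscosity and channel dimensions are illustrative assumptions, not the paper's geometry.

        # Lumped "hydraulic resistance" view of a T-junction separator: with both
        # outlets at the same pressure, the flow split between the main and side
        # routes is set by the ratio of their resistances.
        MU = 1.0e-3   # dynamic viscosity, Pa*s (assumed, water/plasma-like)

        def rect_channel_resistance(length, width, height):
            """Approximate Poiseuille resistance of a shallow rectangular channel."""
            # Common lumped-model formula, valid for height < width.
            return 12 * MU * length / (width * height**3 * (1 - 0.63 * height / width))

        # Illustrative geometry in metres; not taken from the paper.
        R_main = rect_channel_resistance(length=5e-3, width=200e-6, height=50e-6)
        R_side = rect_channel_resistance(length=2e-3, width=100e-6, height=50e-6)

        # Equal outlet pressures => flow rate is inversely proportional to resistance.
        print(f"side/main flow-rate ratio: {R_main / R_side:.3f}")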

  5. Using Lean principles to optimise inpatient phlebotomy services.

    Science.gov (United States)

    Le, Rachel D; Melanson, Stacy E F; Santos, Katherine S; Paredes, Jose D; Baum, Jonathan M; Goonan, Ellen M; Torrence-Hill, Joi N; Gustafson, Michael L; Tanasijevic, Milenko J

    2014-08-01

    In the USA, inpatient phlebotomy services are under constant operational pressure to optimise workflow, improve the timeliness of blood draws, and decrease errors in the context of increasing patient volume and complexity of work. To date, the principles of Lean continuous process improvement have rarely been applied to inpatient phlebotomy. To optimise supply replenishment and cart standardisation, communication and workload management, blood draw process standardisation, and rounding schedules and assignments using Lean principles in inpatient phlebotomy services, we conducted four Lean process improvement events and implemented a number of interventions in inpatient phlebotomy over a 9-month period. We then assessed their impact using three primary metrics: (1) the percentage of phlebotomists drawing their first patient by 05:30 for 05:00 rounds, (2) the percentage of phlebotomists completing 08:00 rounds by 09:30, and (3) the number of errors per 1000 draws. We saw marked increases in the percentage of phlebotomists drawing their first patient by 05:30 and the percentage of phlebotomists completing rounds by 09:30 after process improvement. A decrease in the number of errors per 1000 draws was also observed. This study illustrates how continuous process improvement through Lean can optimise workflow, improve timeliness, and decrease error in inpatient phlebotomy. We believe this manuscript adds to the field of clinical pathology as it can be used as a guide for other laboratories with similar goals of optimising workflow, improving timeliness, and decreasing error, providing examples of interventions and metrics that can be tailored to specific laboratories with particular services and resources.
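
    The three metrics used in this study are straightforward to compute from a draw log. The sketch below shows one way to do so in plain Python; the record layout (phlebotomist, first-draw time, round-completion time, errors, draws) and the values are invented for illustration.

        from datetime import time

        # Hypothetical draw-log records: (phlebotomist, first_draw, round_done, errors, draws)
        log = [
            ("A", time(5, 25), time(9, 20), 0, 38),
            ("B", time(5, 40), time(9, 45), 1, 41),
            ("C", time(5, 28), time(9, 15), 0, 35),
        ]

        def pct(flags):
            """Percentage of True values in a list of booleans."""
            return 100.0 * sum(flags) / len(flags)

        first_by_0530 = pct([rec[1] <= time(5, 30) for rec in log])
        done_by_0930 = pct([rec[2] <= time(9, 30) for rec in log])
        errors_per_1000 = 1000.0 * sum(rec[3] for rec in log) / sum(rec[4] for rec in log)

        print(f"% drawing first patient by 05:30: {first_by_0530:.1f}")
        print(f"% completing 08:00 rounds by 09:30: {done_by_0930:.1f}")
        print(f"errors per 1000 draws: {errors_per_1000:.2f}")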

  7. Parameter Screening and Optimisation for ILP Using Designed Experiments

    Science.gov (United States)

    Srinivasan, Ashwin; Ramakrishnan, Ganesh

    Reports of experiments conducted with an Inductive Logic Programming system rarely describe how specific values of parameters of the system are arrived at when constructing models. Usually, no attempt is made to identify sensitive parameters, and those that are used are often given "factory-supplied" default values, or values obtained from some non-systematic exploratory analysis. The immediate consequence of this is, of course, that it is not clear if better models could have been obtained if some form of parameter selection and optimisation had been performed. Questions follow inevitably on the experiments themselves: specifically, are all algorithms being treated fairly, and is the exploratory phase sufficiently well-defined to allow the experiments to be replicated? In this paper, we investigate the use of parameter selection and optimisation techniques grouped under the study of experimental design. Screening and "response surface" methods determine, in turn, sensitive parameters and good values for these parameters. This combined use of parameter selection and response surface-driven optimisation has a long history of application in industrial engineering, and its role in ILP is investigated using two well-known benchmarks. The results suggest that computational overheads from this preliminary phase are not substantial, and that much can be gained, both on improving system performance and on enabling controlled experimentation, by adopting well-established procedures such as the ones proposed here.
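
    Screening followed by a response-surface fit is easy to demonstrate outside ILP. The sketch below fits a full quadratic response surface to a small two-factor designed experiment with numpy and reads off the stationary point; the coded factors and response values are invented for the example and do not correspond to any particular ILP system.

        import numpy as np

        # Two coded factors (e.g. clause length limit, search depth) over a small
        # central-composite-style design; responses are invented accuracy scores.
        x1 = np.array([-1, -1,  1,  1, 0, 0, 0, -1.5, 1.5,  0.0,  0.0])
        x2 = np.array([-1,  1, -1,  1, 0, 0, 0,  0.0, 0.0, -1.5,  1.5])
        y  = np.array([0.62, 0.66, 0.71, 0.78, 0.80, 0.79, 0.81, 0.60, 0.74, 0.68, 0.75])

        # Quadratic model: y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
        A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        b, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Stationary point of the fitted surface: solve grad(y) = 0.
        H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
        x_star = np.linalg.solve(H, -np.array([b[1], b[2]]))
        print("fitted coefficients:", np.round(b, 3))
        print("stationary point (coded units):", np.round(x_star, 3))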

  8. Statistical optimisation techniques in fatigue signal editing problem

    Energy Technology Data Exchange (ETDEWEB)

    Nopiah, Z. M.; Osman, M. H. [Fundamental Engineering Studies Unit Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia); Baharin, N.; Abdullah, S. [Department of Mechanical and Materials Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia)

    2015-02-03

    Success in fatigue signal editing is determined by the level of length reduction achieved without compromising statistical constraints. A great reduction rate can be achieved by removing small-amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting, which has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments, whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.

  9. Optimising the Target and Capture Sections of the Neutrino Factory

    CERN Document Server

    Hansen, Ole Martin; Stapnes, Steinar

    The Neutrino Factory is designed to produce an intense high energy neutrino beam from stored muons. The majority of the muons are obtained from the decay of pions, produced by a proton beam impinging on a free-flowing mercury-jet target and captured by a high magnetic field. It is important to capture a large fraction of the produced pions to maximize the intensity of the neutrino beam. Various optimisation studies have been performed with the aim of maximising the muon influx to the accelerator and thus the neutrino beam intensity. The optimisation studies were performed with the use of Monte Carlo simulation tools. The production of secondary particles, by interactions between the incoming proton beam and the mercury target, was optimised by varying the proton beam impact position and impact angles on the target. The proton beam and target interaction region was studied and showed to be off the central axis of the capture section in the baseline configuration. The off-centred interaction region resulted in ...

  10. Direct and Indirect Gradient Control for Static Optimisation

    Institute of Scientific and Technical Information of China (English)

    Yi Cao

    2005-01-01

    Static "self-optimising" control is an important concept, which provides a link between static optimisation and control[1]. According to the concept, a dynamic control system could be configured in such a way that when a set of certain variables are maintained at their setpoints, the overall process operation is automatically optimal or near optimal at steady state in the presence of disturbances. A novel approach using constrained gradient control to achieve "self-optimisation" has been proposed by Cao[2]. However, for most process plants, the information required to get the gradient measure may not be available in real-time. In such cases, controlled variable selection has to be carried out based on measurable candidates. In this work, the idea of direct gradient control has been extended to controlled variable selection based on gradient sensitivity analysis (indirect gradient control). New criteria, which indicate the sensitivity of the gradient function to disturbances and implementation errors, have been derived for selection. The particular case study shows that the controlled variables selected by gradient sensitivity measures are able to achieve near optimal performance.

  11. Optimisation of Lilla Edet Landslide GPS Monitoring Network

    Science.gov (United States)

    Alizadeh-Khameneh, M. A.; Eshagh, M.; Sjöberg, L. E.

    2015-06-01

    Since the year 2000, periodic investigations have been performed in the Lilla Edet region to monitor, and possibly determine, the landslide movement of the area with GPS measurements. The responsible consultant has conducted this project by setting up stable stations for GPS receivers in the risky areas of Lilla Edet and measuring the independent baselines amongst the stations according to their observation plan. Here, we optimise the existing surveying network and determine the optimal configuration of the observation plan based on different criteria. We aim to optimise the current network to become sensitive enough to detect possible displacements of 5 mm at each network point. The network quality criteria of precision, reliability and cost are used as objective functions to perform single-, bi- and multi-objective optimisation models. The results show that the single-objective model of reliability, which is constrained to the precision, provides much higher precision than the defined criterion while preserving almost all of the observations. However, in this study, the multi-objective model can fulfil all the mentioned quality criteria of the network with 17% fewer measurements than the original observation plan, meaning 17% savings in time, cost and effort in the project.

  12. Module detection in complex networks using integer optimisation

    Directory of Open Access Journals (Sweden)

    Tsoka Sophia

    2010-11-01

    Background: The detection of modules or community structure is widely used to reveal the underlying properties of complex networks in biology, as well as the physical and social sciences. Since the adoption of modularity as a measure of network topological properties, several methodologies for the discovery of community structure based on modularity maximisation have been developed. However, satisfactory partitions of large graphs with modest computational resources are particularly challenging due to the NP-hard nature of the related optimisation problem. Furthermore, it has been suggested that optimising the modularity metric can reach a resolution limit whereby the algorithm fails to detect communities smaller than a specific size in large networks. Results: We present a novel solution approach to identify community structure in large complex networks and address resolution limitations in module detection. The proposed algorithm employs modularity to express network community structure and is based on mixed integer optimisation models. The solution procedure is extended through an iterative procedure to diminish effects that tend to agglomerate smaller modules (resolution limitations). Conclusions: A comprehensive comparative analysis of methodologies for module detection based on modularity maximisation shows that our approach outperforms previously reported methods. Furthermore, in contrast to previous reports, we propose a strategy to handle resolution limitations in modularity maximisation. Overall, we illustrate ways to improve existing methodologies for community structure identification so as to increase their efficiency and applicability.
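
    Modularity itself, the quantity these methods maximise, is simple to evaluate for a given partition. The sketch below computes Newman-Girvan modularity for an undirected, unweighted graph stored as an edge list; the toy graph and partition are invented for illustration and the paper's mixed integer formulation is not reproduced.

        from collections import defaultdict

        def modularity(edges, community):
            """Newman-Girvan modularity Q for an undirected, unweighted graph.

            edges:     iterable of (u, v) pairs
            community: dict mapping node -> community label
            """
            m = len(edges)
            degree = defaultdict(int)
            for u, v in edges:
                degree[u] += 1
                degree[v] += 1
            # Observed fraction of edges falling inside communities.
            internal = sum(1 for u, v in edges if community[u] == community[v]) / m
            # Expected fraction under the configuration (random-rewiring) null model.
            comm_degree = defaultdict(int)
            for node, deg in degree.items():
                comm_degree[community[node]] += deg
            expected = sum((d / (2 * m)) ** 2 for d in comm_degree.values())
            return internal - expected

        if __name__ == "__main__":
            # Two dense triangles joined by a single bridge edge.
            edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
            partition = {0: "a", 1: "a", 2: "a", 3: "b", 4: "b", 5: "b"}
            print(f"Q = {modularity(edges, partition):.3f}")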

  13. Customisable 3D printed microfluidics for integrated analysis and optimisation.

    Science.gov (United States)

    Monaghan, T; Harding, M J; Harris, R A; Friel, R J; Christie, S D R

    2016-08-16

    The formation of smart Lab-on-a-Chip (LOC) devices featuring integrated sensing optics is currently hindered by convoluted and expensive manufacturing procedures. In this work, a series of 3D-printed LOC devices were designed and manufactured via stereolithography (SL) in a matter of hours. The spectroscopic performance of a variety of optical fibre combinations were tested, and the optimum path length for performing Ultraviolet-visible (UV-vis) spectroscopy determined. The information gained in these trials was then used in a reaction optimisation for the formation of carvone semicarbazone. The production of high resolution surface channels (100-500 μm) means that these devices were capable of handling a wide range of concentrations (9 μM-38 mM), and are ideally suited to both analyte detection and process optimisation. This ability to tailor the chip design and its integrated features as a direct result of the reaction being assessed, at such a low time and cost penalty greatly increases the user's ability to optimise both their device and reaction. As a result of the information gained in this investigation, we are able to report the first instance of a 3D-printed LOC device with fully integrated, in-line monitoring capabilities via the use of embedded optical fibres capable of performing UV-vis spectroscopy directly inside micro channels.

  14. MOPTOP: a multi-colour optimised optical polarimeter

    Science.gov (United States)

    Jermak, Helen; Steele, Iain A.; Smith, Robert J.

    2016-08-01

    We present the design and science case for the Liverpool Telescope's fourth-generation polarimeter, MOPTOP: a Multicolour OPTimised Optical Polarimeter, optimised for sensitivity and bi-colour observations. We introduce an optimised polarimeter which is, as far as possible, limited only by the photon counting efficiency of the detectors. Using a combination of CMOS cameras, a continuously rotating half-wave plate and a wire grid polarising beamsplitter, we predict we can accurately measure the polarisation of sources to 1% at 19th magnitude in 10 minutes on a 2 metre telescope. For brighter sources we anticipate much lower systematics (...). The design also gives the ability to measure polarisation and photometric variability on timescales as short as a few seconds. Overall the instrument will allow accurate measurements of the intra-nightly variability of the polarisation of sources such as gamma-ray bursts and blazars (AGN orientated with the jet pointing toward the observer), allowing magnetic field models to be constrained and revealing more information about the formation, ejection and collimation of jets.

  15. Crystal structure optimisation using an auxiliary equation of state.

    Science.gov (United States)

    Jackson, Adam J; Skelton, Jonathan M; Hendon, Christopher H; Butler, Keith T; Walsh, Aron

    2015-11-14

    Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy-volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other "beyond" density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.
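
    The approach rests on fitting an equation of state to a handful of energy-volume points and reading off the equilibrium volume. A minimal sketch with scipy is shown below, fitting the third-order Birch-Murnaghan form to synthetic E(V) data; the numbers are invented placeholders for the single-point calculations that would supply them in practice.

        import numpy as np
        from scipy.optimize import curve_fit

        def birch_murnaghan(V, E0, V0, B0, Bp):
            """Third-order Birch-Murnaghan energy-volume equation of state."""
            eta = (V0 / V) ** (2.0 / 3.0)
            return E0 + 9.0 * V0 * B0 / 16.0 * (
                (eta - 1.0) ** 3 * Bp + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
            )

        # Synthetic energy-volume data (eV, Angstrom^3); placeholders for DFT output.
        V = np.array([36.0, 38.0, 40.0, 42.0, 44.0, 46.0])
        E = np.array([-10.20, -10.42, -10.50, -10.48, -10.38, -10.22])

        p0 = [E.min(), V[np.argmin(E)], 0.5, 4.0]      # rough initial guess
        (E0, V0, B0, Bp), _ = curve_fit(birch_murnaghan, V, E, p0=p0)
        print(f"equilibrium volume V0 = {V0:.2f} A^3, bulk modulus B0 = {B0:.3f} eV/A^3")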

  16. Crystal structure optimisation using an auxiliary equation of state

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T. [Centre for Sustainable Chemical Technologies and Department of Chemistry, University of Bath, Claverton Down, Bath BA2 7AY (United Kingdom); Walsh, Aron, E-mail: a.walsh@bath.ac.uk [Centre for Sustainable Chemical Technologies and Department of Chemistry, University of Bath, Claverton Down, Bath BA2 7AY (United Kingdom); Global E3 Institute and Department of Materials Science and Engineering, Yonsei University, Seoul 120-749 (Korea, Republic of)

    2015-11-14

    Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy–volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other “beyond” density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.

  17. Numerical optimisation of an axial turbine; Numerische Optimierung einer Axialturbine

    Energy Technology Data Exchange (ETDEWEB)

    Welzel, B.

    1998-12-31

    The author presents a method for the automatic shape optimisation of components with internal or external flow. The method couples a program for the numerical calculation of frictional, turbulent flow with an optimisation algorithm; the algorithms used are a simplex search strategy and an evolution strategy. The shape of the component to be optimised is made variable through shape parameters that are modified by the algorithm. For each shape, a flow calculation is carried out, on the basis of which a functional value such as efficiency, loss, lift or drag force is calculated. For validation, the optimisation method is applied to simple examples with known solutions. It is then applied to the individual components of a slow-running axial turbine: components with accelerated and decelerated rotationally symmetric flow and 2D blade profiles are optimised. (orig.)
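
    The coupling described here, an optimiser repeatedly calling a flow solver that returns a scalar figure of merit, can be sketched generically. Below, scipy's Nelder-Mead simplex search minimises a placeholder "loss" over two shape parameters; in the real workflow that function would mesh the geometry, run the turbulent flow solver and return, for example, the computed loss or the negative efficiency. The parameter names, values and surrogate function are assumptions for illustration only.

        import numpy as np
        from scipy.optimize import minimize

        def evaluate_shape(params):
            """Placeholder for a CFD evaluation of one candidate geometry.

            In the actual method each call would mesh the shape defined by
            `params`, run the flow solver and return a scalar such as the
            hydraulic loss. A smooth analytic surrogate stands in here.
            """
            hub_angle, blade_camber = params
            return ((hub_angle - 12.0) ** 2 / 40.0
                    + (blade_camber - 0.35) ** 2 * 25.0 + 0.8)

        x0 = np.array([20.0, 0.10])        # initial shape parameters (assumed units)
        result = minimize(evaluate_shape, x0, method="Nelder-Mead",
                          options={"xatol": 1e-4, "fatol": 1e-6})
        print("optimised parameters:", result.x, "loss:", result.fun)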

  18. Honeybee economics: optimisation of foraging in a variable world.

    Science.gov (United States)

    Stabentheiner, Anton; Kovac, Helmut

    2016-06-20

    In honeybees, fast and efficient exploitation of nectar and pollen sources is achieved by persistent endothermy throughout the foraging cycle, which means extremely high energy costs. The need for food promotes maximisation of the intake rate, and the high costs call for energetic optimisation. Experiments on how honeybees resolve this conflict have to consider that foraging takes place in a variable environment concerning microclimate and food quality and availability. Here we report, in simultaneous measurements of energy costs, gains, and intake rate and efficiency, how honeybee foragers manage this challenge in their highly variable environment. If possible, during unlimited sucrose flow, they follow an 'investment-guided' ('time is honey') economic strategy promising increased returns. They maximise net intake rate by investing both their own heat production and solar heat to increase body temperature to a level which guarantees a high suction velocity. They switch to an 'economizing' ('save the honey') optimisation of energetic efficiency if the intake rate is restricted by the food source, when an increased body temperature would not guarantee a high intake rate. With this flexible and graded change between economic strategies, honeybees can both maximise colony intake rate and optimise foraging efficiency in reaction to environmental variation.

  19. Techno-economic optimisation of energy systems; Contribution a l'optimisation technico-economique de systemes energetiques

    Energy Technology Data Exchange (ETDEWEB)

    Mansilla Pellen, Ch

    2006-07-15

    The traditional approach currently used to assess the economic interest of energy systems is based on a defined flow-sheet. Some studies have shown that the flow-sheets corresponding to the best thermodynamic efficiencies do not necessarily lead to the best production costs. A method called techno-economic optimisation was proposed. This method aims at minimising the production cost of a given energy system, including both investment and operating costs. It was implemented using genetic algorithms. This approach was compared to the heat integration method on two different examples, thus validating its interest. Techno-economic optimisation was then applied to different energy systems dealing with hydrogen as well as electricity production. (author)
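
    The core loop, a population-based optimiser minimising a production cost that bundles annualised investment and operating terms, can be sketched with scipy's differential evolution (an evolutionary method in the same family as the genetic algorithms used in the thesis, whose own cost model is not reproduced). The cost function below, with its capacity and efficiency variables and economic constants, is entirely illustrative.

        import numpy as np
        from scipy.optimize import differential_evolution

        ANNUAL_OUTPUT = 5.0e6    # kg of hydrogen per year (assumed)
        CRF = 0.10               # capital recovery factor (assumed)

        def production_cost(x):
            """Illustrative levelised cost: annualised investment + operating costs."""
            capacity_mw, efficiency = x
            capex = 1.2e6 * capacity_mw ** 0.8                 # economy of scale, EUR
            energy_mwh = ANNUAL_OUTPUT * 0.045 / efficiency    # electricity demand
            opex = 55.0 * energy_mwh + 0.02 * capex            # power + maintenance, EUR/yr
            # Penalise designs that cannot meet the annual output at 8000 h/yr.
            max_output = capacity_mw * 8000.0 * efficiency / 0.045
            penalty = 1e9 if max_output < ANNUAL_OUTPUT else 0.0
            return (CRF * capex + opex) / ANNUAL_OUTPUT + penalty   # EUR per kg

        bounds = [(5.0, 200.0),   # plant capacity, MW
                  (0.55, 0.80)]   # conversion efficiency
        result = differential_evolution(production_cost, bounds, seed=3, tol=1e-8)
        print("capacity, efficiency:", np.round(result.x, 3),
              "cost EUR/kg:", round(result.fun, 3))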

  20. Shape optimisation and performance analysis of flapping wings

    KAUST Repository

    Ghommem, Mehdi

    2012-09-04

    In this paper, shape optimisation of flapping wings in forward flight is considered. This analysis is performed by combining a local gradient-based optimizer with the unsteady vortex lattice method (UVLM). Although the UVLM applies only to incompressible, inviscid flows where the separation lines are known a priori, Persson et al. [1] showed through a detailed comparison between UVLM and higher-fidelity computational fluid dynamics methods for flapping flight that the UVLM schemes produce accurate results for attached flow cases and even remain trend-relevant in the presence of flow separation. As such, they recommended the use of an aerodynamic model based on UVLM to perform preliminary design studies of flapping wing vehicles. Unlike standard computational fluid dynamics schemes, this method requires meshing of the wing surface only and not of the whole flow domain [2]. From the design or optimisation perspective taken in our work, it is fairly common (and sometimes entirely necessary, as a result of the excessive computational cost of the highest fidelity tools such as Navier-Stokes solvers) to rely upon such a moderate level of modelling fidelity to traverse the design space in an economical manner. The objective of the work, described in this paper, is to identify a set of optimised shapes that maximise the propulsive efficiency, defined as the ratio of the propulsive power over the aerodynamic power, under lift, thrust, and area constraints. The shape of the wings is modelled using B-splines, a technology used in the computer-aided design (CAD) field for decades. This basis can be used to smoothly discretize wing shapes with few degrees of freedom, referred to as control points. The locations of the control points constitute the design variables. The results suggest that changing the shape yields significant improvement in the performance of the flapping wings. The optimisation pushes the design to "bird-like" shapes with substantial increase in the time
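
    B-splines make a compact shape parameterisation because a handful of control-point coordinates define a smooth curve, and those coordinates become the design variables. The sketch below builds a clamped cubic B-spline outline from six control points with scipy and evaluates it at a few spanwise stations; the numbers are purely illustrative and unrelated to the optimised wing shapes.

        import numpy as np
        from scipy.interpolate import BSpline

        degree = 3
        # Control-point offsets describing a wing outline (illustrative values);
        # in the optimisation these are the design variables.
        control = np.array([0.00, 0.15, 0.30, 0.28, 0.12, 0.00])
        n = len(control)

        # Clamped knot vector so the curve starts and ends on the end control points.
        knots = np.concatenate([np.zeros(degree),
                                np.linspace(0.0, 1.0, n - degree + 1),
                                np.ones(degree)])
        curve = BSpline(knots, control, degree)

        span = np.linspace(0.0, 1.0, 11)   # normalised spanwise stations
        print(np.round(curve(span), 3))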

  1. Differential integrity of TALE nuclease genes following adenoviral and lentiviral vector gene transfer into human cells.

    Science.gov (United States)

    Holkers, Maarten; Maggio, Ignazio; Liu, Jin; Janssen, Josephine M; Miselli, Francesca; Mussolino, Claudio; Recchia, Alessandra; Cathomen, Toni; Gonçalves, Manuel A F V

    2013-03-01

    The array of genome editing strategies based on targeted double-stranded DNA break formation have recently been enriched through the introduction of transcription activator-like type III effector (TALE) nucleases (TALENs). To advance the testing of TALE-based approaches, it will be crucial to deliver these custom-designed proteins not only into transformed cell types but also into more relevant, chromosomally stable, primary cells. Viral vectors are among the most effective gene transfer vehicles. Here, we investigated the capacity of human immunodeficiency virus type 1- and adenovirus-based vectors to package and deliver functional TALEN genes into various human cell types. To this end, we attempted to assemble particles of these two vector classes, each encoding a monomer of a TALEN pair targeted to a bipartite sequence within the AAVS1 'safe harbor' locus. Vector DNA analyses revealed that adenoviral vectors transferred intact TALEN genes, whereas lentiviral vectors failed to do so, as shown by their heterogeneously sized proviruses in target cells. Importantly, adenoviral vector-mediated TALEN gene delivery resulted in site-specific double-stranded DNA break formation at the intended AAVS1 target site at similarly high levels in both transformed and non-transformed cells. In conclusion, we demonstrate that adenoviral, but not lentiviral, vectors constitute a valuable TALEN gene delivery platform.

  2. Optimisation of the Nonlinear Suspension Characteristics of a Light Commercial Vehicle

    Directory of Open Access Journals (Sweden)

    Dinçer Özcan

    2013-01-01

    The optimum functional characteristics of suspension components, namely linear/nonlinear spring and nonlinear damper characteristic functions, are determined using simple lumped-parameter models. A quarter car model is used to represent the front independent suspension, and a half car model is used to represent the rear solid axle suspension of a light commercial vehicle. The functional shapes of the suspension characteristics used in the optimisation process are based on typical shapes supplied by a car manufacturer. The complexity of the nonlinear function optimisation problem is reduced by scaling the characteristics up or down from the aforementioned shapes in the optimisation process. The nonlinear optimised suspension characteristics are first obtained using the lower-complexity lumped parameter models. Then, the performance of the optimised suspension units is verified using the higher-fidelity and more realistic Carmaker model. An interactive software module is developed to ease the nonlinear suspension optimisation process using the Matlab Graphical User Interface tool.
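
    A quarter-car model of the kind used here is a two-mass spring-damper system that is cheap to simulate, which is what makes it usable inside an optimisation loop. The sketch below integrates a linear quarter car over a step road input with scipy and reports an RMS body-acceleration comfort metric; all parameter values are generic illustrative numbers, not the studied vehicle's.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Generic quarter-car parameters (illustrative only).
        ms, mu = 300.0, 40.0            # sprung / unsprung mass, kg
        ks, kt = 20_000.0, 180_000.0    # suspension spring / tyre stiffness, N/m
        cs = 1_500.0                    # suspension damping, N*s/m

        def road(t):
            """Step road input of 0.05 m reached at t = 1 s."""
            return 0.05 if t >= 1.0 else 0.0

        def quarter_car(t, y):
            zs, vs, zu, vu = y          # body pos/vel, wheel pos/vel
            f_susp = ks * (zu - zs) + cs * (vu - vs)
            f_tyre = kt * (road(t) - zu)
            return [vs, f_susp / ms, vu, (f_tyre - f_susp) / mu]

        sol = solve_ivp(quarter_car, (0.0, 5.0), [0.0, 0.0, 0.0, 0.0],
                        max_step=1e-3, dense_output=True)
        t = np.linspace(0.0, 5.0, 5001)
        zs, vs, zu, vu = sol.sol(t)
        body_acc = (ks * (zu - zs) + cs * (vu - vs)) / ms
        print(f"RMS body acceleration: {np.sqrt(np.mean(body_acc**2)):.3f} m/s^2")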

  3. Functional testing of topical skin formulations using an optimised ex vivo skin organ culture model.

    Science.gov (United States)

    Sidgwick, G P; McGeorge, D; Bayat, A

    2016-07-01

    A number of equivalent-skin models are available for investigation of the ex vivo effect of topical application of drugs and cosmeceuticals onto skin; however, many have their drawbacks. With the March 2013 ban on animal models for cosmetic testing of products or ingredients for sale in the EU, their utility for testing toxicity and effect on skin becomes more relevant. The aim of this study was to demonstrate proof of principle that altered expression of key gene and protein markers could be quantified in an optimised whole tissue biopsy culture model. Topical formulations containing green tea catechins (GTC) were investigated in a skin biopsy culture model (n = 11). Punch biopsies were harvested at 3, 7 and 10 days, and analysed using qRT-PCR, histology and HPLC to determine gene and protein expression, and transdermal delivery of compounds of interest. Reduced gene expression of α-SMA, fibronectin, mast cell tryptase, mast cell chymase, TGF-β1, CTGF and PAI-1 was observed after 7 and 10 days compared with treated controls (p ...). ... skin, negating the requirement for animal models in this context, prior to study in a clinical trial environment.

  4. Software tools overview : process integration, modelling and optimisation for energy saving and pollution reduction

    OpenAIRE

    Lam, Hon Loong; Klemeš, Jiri; Kravanja, Zdravko; Varbanov, Petar

    2012-01-01

    This paper provides an overview of software tools based on long experience and applications in the area of process integration, modelling and optimisation. The first part reviews the current design practice and the development of supporting software tools. These are categorised as: (1) process integration and retrofit analysis tools, (2) general mathematical modelling suites with optimisation libraries, (3) flowsheeting simulation and (4) graph-based process optimisation tools. The second part...

  5. Stacking sequence optimisation of composite panels subjected to slamming impact loads using a genetic algorithm

    OpenAIRE

    Khedmati,Mohammad Reza; Sangtabi,Mohammad Rezai; Fakoori,Mehdi

    2013-01-01

    Optimisation of stacking sequence for composite panels under slamming impact loads using a genetic algorithm method is studied in this paper. For this purpose, slamming load is assumed to have a uniform distribution with a triangular-pulse type of intensity function. In order to perform optimisation based on a genetic algorithm, a special code is written in MATLAB software environment. The optimiser is coupled with the commercial software ANSYS in order to analyse the composite panel under st...

  6. Optimisation of logistics processes of energy grass collection

    Science.gov (United States)

    Bányai, Tamás.

    2010-05-01

    The collection of energy grass is a logistics-intensive process [1]. The optimal design and control of transportation and collection subprocesses is a critical point of the supply chain. To avoid decisions based solely on experience and intuition, the optimisation and analysis of collection processes based on mathematical models and methods is the scientifically sound way forward. Within the frame of this work, the author focuses on the optimisation possibilities of the collection processes, especially from the point of view of transportation and related warehousing operations. The optimisation methods developed in the literature [2] take into account the harvesting processes, county-specific yields, transportation distances, erosion constraints, machinery specifications, and other key variables, but the possibility of multiple collection points and multi-level collection was not taken into consideration. The possible areas of use for energy grass are very wide (energetic use, biogas and bio-alcohol production, paper and textile industry, industrial fibre material, foddering purposes, biological soil protection [3], etc.), so not only a single-level but also a multi-level collection system with more collection and production facilities has to be taken into consideration. The input parameters of the optimisation problem are the following: total amount of energy grass to be harvested in each region; specific facility costs of collection, warehousing and production units; specific costs of transportation resources; pre-scheduling of the harvesting process; specific transportation and warehousing costs; pre-scheduling of the processing of energy grass at each facility (exclusive warehousing). The model takes into consideration the following assumptions: (1) cooperative relations among processing and production facilities, (2) capacity constraints are not ignored, (3) the cost function of transportation is non-linear, (4) the drivers' conditions are ignored. The

  7. Optimisation of fertiliser rates in crop production against energy use indicators

    DEFF Research Database (Denmark)

    Rossner, Helis; Ritz, Christian; Astover, Alar

    2014-01-01

Optimising mineral nitrogen (N) use in crop production is an inevitable target, as mineral fertilisers reflect one of the highest inputs both in terms of economy and energy. The aim of the study was to compare the relationship between the rate of N fertiliser application and different measures of energy.......05) optimisation. Both of the new combined indices gave optimum N norms in between the rates of ER and EG. A composted cow manure background did not affect mineral N optimisation significantly. We suggest optimisation of mineral N according to bi-dimensional parameters as they capture important features of production...

  8. An Inverse Robust Optimisation Approach for a Class of Vehicle Routing Problems under Uncertainty

    Directory of Open Access Journals (Sweden)

    Liang Sun

    2016-01-01

Full Text Available There is a trade-off between the total penalty paid to customers (TPC) and the total transportation cost (TTC) in the depot for vehicle routing problems under uncertainty (VRPU). The trade-off refers to the fact that the TTC in the depot inevitably increases when the TPC decreases, and vice versa. With respect to this issue, the vehicle routing problem (VRP) with uncertain customer demand and travel time was studied to optimise the TPC and the TTC in the depot. In addition, an inverse robust optimisation approach was proposed to solve this kind of VRPU by combining the ideas of inverse optimisation and robust optimisation so as to improve both the TPC and the TTC in the depot. The method aims to improve the TTC corresponding to the robust optimisation solution under the minimum TPC by minimising the adjustment of the benchmark road transportation cost. According to the characteristics of the inverse robust optimisation model, a genetic algorithm (GA) and a column generation algorithm are combined to solve the problem. Moreover, 39 test problems are solved using the inverse robust optimisation approach: the results show that both the TPC and the TTC obtained using the inverse robust optimisation approach are less than those calculated using a robust optimisation approach.

  9. Challenges of additive manufacturing technologies from an optimisation perspective

    Directory of Open Access Journals (Sweden)

    Guessasma Sofiane

    2015-01-01

Full Text Available Three-dimensional printing offers varied design possibilities that can be bridged to optimisation tools. In this review paper, a critical opinion on optimal design is delivered to show the limits, benefits and ways of improvement in additive manufacturing. The review emphasises design constraints related to additive manufacturing and the differences that may appear between the virtual and the real design. These differences are explored based on 3D imaging techniques that are intended to show defects related to processing. Guidelines for the safe use of the term “optimal design” are derived based on 3D structural information.

  10. Semiconductor lasers as integrated optical biosensors: sensitivity optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Coote, J; Sweeney, S J [Advanced Technology Institute, University of Surrey, Guildford, UK GU2 7XH (United Kingdom)

    2007-07-15

Semiconductor lasers contain both a light source and a waveguide, rendering them suitable for adaptation to evanescent-field biosensing. One-dimensional simulations using the beam propagation method have been carried out for planar semiconductor waveguide structures, with a view to maximising the sensitivity of the effective index to changes in the refractive index and thickness of a film on the waveguide surface. Various structural parameters are investigated and it is found that thinning the upper cladding layer maximises the sensitivity. Implications for laser operation are considered, and an optimised structure is proposed. Surface-layer index and thickness resolutions of 0.2 and 2 nm are predicted.

  11. Review of magnesium hydride-based materials: development and optimisation

    Science.gov (United States)

    Crivello, J.-C.; Dam, B.; Denys, R. V.; Dornheim, M.; Grant, D. M.; Huot, J.; Jensen, T. R.; de Jongh, P.; Latroche, M.; Milanese, C.; Milčius, D.; Walker, G. S.; Webb, C. J.; Zlotea, C.; Yartys, V. A.

    2016-02-01

    Magnesium hydride has been studied extensively for applications as a hydrogen storage material owing to the favourable cost and high gravimetric and volumetric hydrogen densities. However, its high enthalpy of decomposition necessitates high working temperatures for hydrogen desorption while the slow rates for some processes such as hydrogen diffusion through the bulk create challenges for large-scale implementation. The present paper reviews fundamentals of the Mg-H system and looks at the recent advances in the optimisation of magnesium hydride as a hydrogen storage material through the use of catalytic additives, incorporation of defects and an understanding of the rate-limiting processes during absorption and desorption.

  12. Adaptive optimisation of a generalised phase contrast beam shaping system

    Science.gov (United States)

    Kenny, F.; Choi, F. S.; Glückstad, J.; Booth, M. J.

    2015-05-01

    The generalised phase contrast (GPC) method provides versatile and efficient light shaping for a range of applications. We have implemented a generalised phase contrast system that used two passes on a single spatial light modulator (SLM). Both the pupil phase distribution and the phase contrast filter were generated by the SLM. This provided extra flexibility and control over the parameters of the system including the phase step magnitude, shape, radius and position of the filter. A feedback method for the on-line optimisation of these properties was also developed. Using feedback from images of the generated light field, it was possible to dynamically adjust the phase filter parameters to provide optimum contrast.

  13. Optimisation of biodiesel production by sunflower oil transesterification.

    Science.gov (United States)

    Antolín, G; Tinaut, F V; Briceño, Y; Castaño, V; Pérez, C; Ramírez, A I

    2002-06-01

    In this work the transformation process of sunflower oil in order to obtain biodiesel by means of transesterification was studied. Taguchi's methodology was chosen for the optimisation of the most important variables (temperature conditions, reactants proportion and methods of purification), with the purpose of obtaining a high quality biodiesel that fulfils the European pre-legislation with the maximum process yield. Finally, sunflower methyl esters were characterised to test their properties as fuels in diesel engines, such as viscosity, flash point, cold filter plugging point and acid value. Results showed that biodiesel obtained under the optimum conditions is an excellent substitute for fossil fuels.

  14. An optimisation framework for determination of capacity in railway networks

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup

    2015-01-01

Within the railway industry, high-quality estimates of railway capacity are crucial information that helps railway companies to utilise the expensive (infrastructure) resources as efficiently as possible. This paper therefore proposes an optimisation framework to estimate the capacity of a railway...... to the train types. This is done using a mathematical model which is solved with a heuristic. The developed approach is used on a case network to obtain the capacity of the given railway system. Furthermore, we test different parameters to explore computation time, precision and sensitivity to input...

  15. Word Sense Disambiguation using Optimised Combinations of Knowledge Sources

    CERN Document Server

    Wilks, Y A; Wilks, Yorick; Stevenson, Mark

    1998-01-01

    Word sense disambiguation algorithms, with few exceptions, have made use of only one lexical knowledge source. We describe a system which performs unrestricted word sense disambiguation (on all content words in free text) by combining different knowledge sources: semantic preferences, dictionary definitions and subject/domain codes along with part-of-speech tags. The usefulness of these sources is optimised by means of a learning algorithm. We also describe the creation of a new sense tagged corpus by combining existing resources. Tested accuracy of our approach on this corpus exceeds 92%, demonstrating the viability of all-word disambiguation rather than restricting oneself to a small sample.

  16. Optimising performance in steady state for a supermarket refrigeration system

    DEFF Research Database (Denmark)

    Green, Torben; Kinnaert, Michel; Razavi-Far, Roozbeh

    2012-01-01

Using a supermarket refrigeration system as an illustrative example, the paper postulates that by appropriately utilising knowledge of plant operation, the plant-wide performance can be optimised based on a small set of variables. Focusing on steady-state operations, the total system performance...... is shown to be predominantly influenced by the suction pressure. Employing an appropriate performance function leads to conclusions on the choice of set-point for the suction pressure that are contrary to existing practice. Analysis of the resulting data leads to a simple method for finding optimal...

  17. Optimisation is at the heart of the operation.

    Science.gov (United States)

    Jones, Darren

    2013-11-01

    In our other article based around operating theatres in this issue of HEJ (see pages 64-72), we examine how some of the latest technology is benefiting users, but in this article--with all areas of the NHS charged with reducing energy consumption and cutting carbon emissions--Darren Jones, MD at carbon and energy management specialist, Low Carbon Europe, takes a detailed look, with the help of a 'real-life' case study based on recent experience at London's Heart Hospital, at operating theatre optimisation and HTM 03-01 audits.

  18. Optimised quantum hacking of superconducting nanowire single-photon detectors

    CERN Document Server

    Tanner, Michael G; Hadfield, Robert H

    2013-01-01

    We explore optimised control of superconducting nanowire single-photon detectors (SNSPDs) through bright illumination. We consider the behaviour of the SNSPD in the shunted configuration (a practical measure to avoid latching) in long-running quantum key distribution experiments. We propose and demonstrate an effective bright-light attack on this realistic configuration, by applying transient blinding illumination lasting for a fraction of a microsecond and producing several deterministic fake clicks during this time. We show that this attack does not lead to elevated timing jitter in the spoofed output pulse, and is hence not introducing significant errors. Five different SNSPD chip designs were tested. We consider possible countermeasures to this attack.

  19. Operation, optimisation, and performance of the DELPHI RICH detectors

    CERN Document Server

    Albrecht, E; Augustinus, A; Baillon, Paul; Battaglia, Marco; Bloch, D; Boudinov, E; Brunet, J M; Carrié, P; Cavalli, P; Christophel, E; Davenport, M; Dracos, M; Eklund, L; Erzen, B; Fischer, P A; Fokitis, E; Fontanelli, F; Gracco, Valerio; Hallgren, A; Joram, C; Juillot, P; Kjaer, N J; Kluit, P M; Lenzen, G; Liko, D; Mahon, J R; Maltezos, S; Markou, A; Neufeld, N; Nielsen, B S; Petrolini, A; Podobnik, T; Polok, G; Sajot, G; Sannino, M; Schyns, E; Strub, R; Tegenfeldt, F; Thadome, J; Tristram, G; Ullaland, O; Vulpen, I V

    1999-01-01

The Ring Imaging Cherenkov detectors of DELPHI represent a large-scale particle identification system which covers almost the full angular acceptance of DELPHI. The combination of liquid and gas radiators (C4F10, C5F12 and C6F14) provides particle identification over the whole secondary particle momentum spectrum at LEP I and LEP II. Continuing optimisation at the hardware level as well as at the online and offline software level has resulted in stable operation of the complete detector system for more than five years at full physics performance.

  20. Transmitter antenna placement in indoor environments using particle swarm optimisation

    Science.gov (United States)

    Talepour, Zeinab; Tavakoli, Saeed; Ahmadi-Shokouh, Javad

    2013-07-01

The aim of this article is to suitably locate the minimum number of transmitter antennas in a given indoor environment to achieve good propagation coverage. To calculate the electromagnetic field at various points of the environment, we develop a software engine, named the ray-tracing engine (RTE), in Matlab. To achieve realistic calculations, all parameters of the geometry and materials of the building are considered. Particle swarm optimisation is employed to determine good locations for the transmitters. Simulation results show that full coverage is obtained by suitably locating three transmitters.

  1. Optimising the Target and Capture Sections of the Neutrino Factory

    OpenAIRE

    Hansen, Ole Martin

    2016-01-01

    The Neutrino Factory is designed to produce an intense high energy neutrino beam from stored muons. The majority of the muons are obtained from the decay of pions, produced by a proton beam impinging on a free-flowing mercury-jet target and captured by a high magnetic field. It is important to capture a large fraction of the produced pions to maximize the intensity of the neutrino beam. Various optimisation studies have been performed with the aim of maximising the muon influx to the accel...

  2. Design and manufacture of Portland cement - application of sensitivity analysis in exploration and optimisation Part II. Optimisation

    DEFF Research Database (Denmark)

    Svinning, K.; Høskuldsson, Agnar

    2006-01-01

A program for model-based optimisation has been developed. The program contains two subprograms. The first performs minimisation or maximisation constrained by one original PLS-component or by one equal to a combination of several. The second searches for the optimal combination of PLS......-components which gives the maximum or minimum y. The program has proved applicable for achieving realistic results for implementation in the design of Portland cement with respect to performance and in quality control during production....

  3. Optimisation of surface passivation for highly reliable angular AMR sensors

    Energy Technology Data Exchange (ETDEWEB)

    Isler, M.; Christoffer, B.; Schoer, G.; Philippsen, B.; Mahnke, M.; Thorns, A.; Riethmueller, W.; Matz, H.; Tobescu, C. [NXP Semiconductors, Stresemannallee 101, 22529 Hamburg (Germany); Vanhelmont, F.; Wolters, R. [NXP Semiconductors, High Tech Campus 4, 5656 AE Eindhoven (Netherlands); Engelen, R.; Opran, A. [Philips Applied Technologies, High Tech Campus 7, 5656 AE Eindhoven (Netherlands); Stolk, P. [NXP Semiconductors, Gerstweg 2, 6534 AE Nijmegen (Netherlands)

    2010-02-15

    For state-of-the-art angular sensors based on the anisotropic magnetoresistive (AMR) effect in NiFe layers, the angular accuracy over time is limited by a drift of the offset voltage of the Wheatstone bridge configuration. It is shown that the interaction of the passivation layer and the magnetic permalloy is crucial for the drift of the offset voltage. By investigating the time and temperature dependence, the offset drift is attributed to stress relief of the PECVD passivation layer due to microstructural changes. Hydrogen outdiffusion from the passivation layer is involved in the observed stress evolution. It is demonstrated that optimising the passivation layer composition as well as the time of the subsequent annealing is beneficial for stress stabilisation of the permalloy-passivation layer system. With this optimised passivation layer a significant offset drift reduction of the NiFe Wheatstone bridge has been achieved resulting in highly accurate and long-term stable angular AMR sensors. (copyright 2010 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  4. An optimisation approach for shallow lake restoration through macrophyte management

    Science.gov (United States)

    Xu, Z. H.; Yin, X. A.; Yang, Z. F.

    2014-06-01

    Lake eutrophication is a serious global environmental issue. Phytoremediation is a promising, cost-effective, and environmentally friendly technology for water quality restoration. However, besides nutrient removal, macrophytes also deeply affect the hydrologic cycle of a lake system through evapotranspiration. Changes in hydrologic cycle caused by macrophytes have a great influence on lake water quality restoration. As a result of the two opposite effects of macrophytes on water quality restoration (i.e. an increase in macrophytes can increase nutrient removal and improve water quality while also increasing evapotranspiration, reducing water volume and consequently decreasing water quality), rational macrophyte control through planting and harvest is very important. In this study, a new approach is proposed to optimise the initial planting area and monthly harvest scheme of macrophytes for water quality restoration. The month-by-month effects of macrophyte management on lake water quality are considered. Baiyangdian Lake serves as a case study, using the common reed. It was found that water quality was closest to Grade III on the Chinese water quality scale when the reed planting area was 123 km2 (40% of the lake surface area) and most reeds would be harvested at the end of June. The optimisation approach proposed in this study will be a useful reference for lake restoration.

  5. An optimisation approach for shallow lake restoration through macrophyte management

    Directory of Open Access Journals (Sweden)

    Z. H. Xu

    2014-01-01

Full Text Available Lake eutrophication is a serious global environmental issue. Phytoremediation is a promising, cost-effective, and environmentally friendly technology for water quality restoration. However, besides nutrient removal, macrophytes also deeply affect the hydrologic cycle of a lake system through evapotranspiration. Changes in the hydrologic cycle caused by macrophytes have a great influence on lake water quality restoration. As a result of the two opposite effects of macrophytes on water quality restoration (i.e. an increase in macrophytes can increase nutrient removal and improve water quality while also increasing evapotranspiration, reducing water volume and consequently decreasing water quality), rational macrophyte control through planting and harvest is very important. In this study, a new approach is proposed to optimise the initial planting area and monthly harvest scheme of macrophytes for water quality restoration. The month-by-month effects of macrophyte management on lake water quality are considered. Baiyangdian Lake serves as a case study, using the common reed. It was found that water quality was closest to Grade III on the Chinese water quality scale when the reed planting area was 123 km2 (40% of the lake surface area) and most reeds would be harvested at the end of June. The optimisation approach proposed in this study will be a useful reference for lake restoration.

  6. Optimisation of Storage for Concentrated Solar Power Plants

    Directory of Open Access Journals (Sweden)

    Luigi Cirocco

    2014-12-01

Full Text Available The proliferation of non-scheduled generation from renewable electrical energy sources such as concentrated solar power (CSP) presents a need for enabling scheduled generation by incorporating energy storage, either via directly coupled Thermal Energy Storage (TES) or via Electrical Storage Systems (ESS) distributed within the electrical network or grid. The challenges for 100% renewable energy generation are to minimise capitalisation cost and to maximise energy dispatch capacity. The aims of this review article are twofold: to review storage technologies and to survey the most appropriate optimisation techniques for determining the optimal operation and size of storage for a system operating in the Australian National Energy Market (NEM). Storage technologies are reviewed to establish indicative characterisations of energy density, conversion efficiency, charge/discharge rates and costings. A partitioning of optimisation techniques based on the methods most appropriate for various time scales is performed: from "whole of year", seasonal, monthly, weekly and daily averaging to those best suited to matching the NEM bid timing of five-minute dispatch bidding, averaged on the half hour as the trading settlement spot price. Finally, a selection of the most promising research directions and methods to determine the optimal operation and sizing of storage for renewables in the grid is presented.
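
    To make the dispatch side of such an optimisation concrete, the sketch below formulates a toy storage-arbitrage problem as a small linear programme. It is not taken from the review; the price series, round-trip efficiency and power/energy limits are assumed values.

    ```python
    # Illustrative sketch: operating a storage unit against a dispatch price signal,
    # formulated as a linear programme. Prices, efficiency and limits are assumptions.
    import numpy as np
    from scipy.optimize import linprog

    price = np.array([20, 15, 30, 80, 120, 60], dtype=float)  # assumed spot prices
    T = len(price)
    eta, p_max, e_max, s0 = 0.9, 1.0, 4.0, 2.0  # assumed efficiency, power/energy limits

    # decision vector x = [charge_0..T-1, discharge_0..T-1, soc_0..T-1]
    c = np.concatenate([price, -price, np.zeros(T)])   # minimise cost = -revenue
    A_eq = np.zeros((T, 3 * T))
    b_eq = np.zeros(T)
    for t in range(T):
        A_eq[t, 2 * T + t] = 1.0           # soc_t
        if t > 0:
            A_eq[t, 2 * T + t - 1] = -1.0  # -soc_{t-1}
        else:
            b_eq[t] = s0                   # initial state of charge
        A_eq[t, t] = -eta                  # -eta * charge_t
        A_eq[t, T + t] = 1.0 / eta         # +discharge_t / eta
    bounds = [(0, p_max)] * (2 * T) + [(0, e_max)] * T

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    charge, discharge, soc = np.split(res.x, 3)
    print("revenue:", -res.fun)
    ```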

  7. Nuclear power plant maintenance optimisation SENUF network activity

    Energy Technology Data Exchange (ETDEWEB)

    Ahlstrand, R.; Bieth, M.; Pla, P.; Rieg, C.; Trampus, P. [Inst. for Energy, EC DG Joint Research Centre, Petten (Netherlands)

    2004-07-01

While providing scientific and technical support to the TACIS and PHARE nuclear safety programs, a large amount of knowledge related to Russian-design reactor systems has accumulated and has led to the creation of a new network concerning nuclear safety in Central and Eastern Europe, called "Safety of Eastern European type Nuclear Facilities" (SENUF). SENUF contributes to bringing together all stakeholders of TACIS and PHARE: beneficiaries, end users, and Eastern and Western nuclear industries, and thus to favouring fruitful technical exchanges and feedback of experience. At present the main focus of SENUF is nuclear power plant maintenance as a substantial element of plant operational safety as well as of life management. A Working Group has been established on plant maintenance. One of its major tasks in 2004 is to prepare a status report on advanced strategies to optimise maintenance. Optimisation projects have an interface with the plant's overall life management program. Today, almost all plants involved in the SENUF network have an explicit policy to extend their service life; thus, component ageing management, modernization and refurbishment actions have become much more important. A database is also under development, which is intended to help share the available knowledge and specific equipment and tools. (orig.)

  8. Computed tomography dose optimisation in cystic fibrosis:A review

    Institute of Scientific and Technical Information of China (English)

    Helena Ferris; Maria Twomey; Fiachra Moloney; Siobhan B O’Neill; Kevin Murphy; Owen J O’Connor; Michael Maher

    2016-01-01

Cystic fibrosis (CF) is the most common autosomal recessive disease of the Caucasian population worldwide, with respiratory disease remaining the most relevant source of morbidity and mortality. Computed tomography (CT) is frequently used for monitoring disease complications and progression. Over the last fifteen years there has been a six-fold increase in the use of CT, which has led to growing concern in relation to cumulative radiation exposure. The challenge to the medical profession is to identify dose reduction strategies that meet acceptable image quality but fulfil the requirements of a diagnostic-quality CT. Dose optimisation, particularly in CT, is essential as it reduces the chances of patients receiving cumulative radiation doses in excess of 100 mSv, a dose deemed significant by the United Nations Scientific Committee on the Effects of Atomic Radiation. This review article explores the current trends in imaging in CF with particular emphasis on new developments in dose optimisation.

  9. Power and delay optimisation in multi-hop wireless networks

    KAUST Repository

    Xia, Li

    2014-02-05

In this paper, we study the optimisation problem of transmission power and delay in a multi-hop wireless network consisting of multiple nodes. The goal is to determine the optimal policy of transmission rates at various buffer and channel states in order to minimise the power consumption and the queueing delay of the whole network. Under the assumptions of interference-free links and independently and identically distributed (i.i.d.) channel states, we formulate this problem using a semi-open Jackson network model for data transmission and a Markov model for channel state transitions. We derive a difference equation of the system performance under any two different policies. The necessary and sufficient condition of the optimal policy is obtained. We also prove that the system performance is monotonic with respect to (w.r.t.) the transmission rate and that the optimal transmission rate can be either maximal or minimal. That is, ‘bang-bang’ control is an optimal control. This optimality structure greatly reduces the problem complexity. Furthermore, we develop an iterative algorithm to find the optimal solution. Finally, we conduct simulation experiments to demonstrate the effectiveness of our approach. We hope our work can shed some light on solving this complicated optimisation problem.
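
    The structural result can be illustrated with a toy buffer model. The sketch below is not the authors' formulation: it is a small value-iteration MDP with an assumed arrival probability, power cost and delay weight, in which the optimal transmission rate tends to sit at one of the extremes.

    ```python
    # Toy value-iteration MDP trading transmission power against queueing delay.
    # All numbers (arrival probability, power model, weights) are assumptions.
    import numpy as np

    B = 10                            # buffer capacity (packets)
    rates = [0, 1, 2]                 # available transmission rates (packets/slot)
    power = {0: 0.0, 1: 1.5, 2: 3.0}  # assumed (linear) power cost of each rate
    p_arr, w, gamma = 0.6, 0.5, 0.95  # arrival prob., delay weight, discount

    def q_value(b, r, V):
        """Expected discounted cost of using rate r in buffer state b."""
        served = min(b, r)
        cost = power[r] + w * b       # power + holding (delay) cost
        for arrival, prob in ((1, p_arr), (0, 1 - p_arr)):
            cost += gamma * prob * V[min(B, b - served + arrival)]
        return cost

    V = np.zeros(B + 1)
    for _ in range(500):              # value iteration to (approximate) convergence
        V = np.array([min(q_value(b, r, V) for r in rates) for b in range(B + 1)])

    policy = [min(rates, key=lambda r: q_value(b, r, V)) for b in range(B + 1)]
    print(policy)  # in this toy model the best rate tends to jump between the extremes
    ```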

  10. Merchandise and Replenishment Planning Optimisation for Fashion Retail

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-08-01

Full Text Available The integration among different company functions, collaborative planning and the elaboration of focused distribution plans are critical to the success of every kind of company working in the complex retail sector. In this context, the present work proposes the description of a model able to support the coordinated strategic choices continually made by Supply Chain (SC) actors. The final objective is the achievement of full optimisation of the Merchandise & Replenishment Planning phases, identifying the right replenishment quantities and periods. To test the proposed model's effectiveness, it was applied to an important Italian fashion company in the complex field of fast fashion, a sector in which promptness is a main competitive leverage and, therefore, planning cannot exclude the time variable. The passage from a total push strategy, currently used by the company, to a push-pull one, suggested by the model, allowed us not only to estimate a reduction in the quantity of goods to purchase at the beginning of a sales period (with considerable economic savings), but also to elaborate a focused replenishment plan that permits reduction and optimisation of departures from network warehouses to Points of Sale (POS).

  11. Optimising Performance of a Cantilever-type Micro Accelerometer Sensor

    Directory of Open Access Journals (Sweden)

    B.P. Joshi

    2007-05-01

Full Text Available A technique for optimising the performance of a cantilever-type micro acceleration sensor has been developed. The performance of a sensor is judged mainly by its sensitivity and bandwidth. Maximising the product of these two important parameters of inertial sensors helps to optimise the sensor performance. It is observed that placement of a lumped mass (add-mass) on the sensor's proof-mass helps to control both the sensitivity and the first resonant frequency of the cantilever structure to the designer's choice. Simulation and modelling of various dimensions of rectangular structures for the acceleration sensor with this novel add-mass technique are discussed. Coventorware MEMS CAD has been used to model, simulate, and carry out FEM analysis. A simple analytical model is discussed to elaborate the mechanics of the cantilever-type micro accelerometer. The comparison of the results obtained from the analytical model and the finite element simulations reveals these to be in good agreement. The advantages of this technique for choosing the two most important sensor parameters (i.e., sensitivity and bandwidth) of an inertial sensor are brought out.

  12. Tanlock loop noise reduction using an optimised phase detector

    Science.gov (United States)

    Al-kharji Al-Ali, Omar; Anani, Nader; Al-Qutayri, Mahmoud; Al-Araji, Saleh

    2013-06-01

This article proposes a time-delay digital tanlock loop (TDTL) which uses a new phase detector (PD) design optimised for noise reduction, making it suitable for applications that require a wide lock range without sacrificing noise immunity. The proposed system uses an improved design comprising two phase detectors: one PD is used to optimise the noise immunity whilst the other is used to control the acquisition time of the TDTL system. Using the modified phase detector it is possible to reduce the second- and higher-order harmonics by at least 50% compared with the conventional TDTL system. The proposed system was simulated and tested in MATLAB/Simulink using frequency step inputs and inputs corrupted with varying levels of harmonic distortion. A hardware prototype of the system was implemented using a field-programmable gate array (FPGA). The practical and simulation results indicate a considerable improvement in the noise performance of the proposed system over the conventional TDTL architecture.

  13. Water distribution systems design optimisation using metaheuristics and hyperheuristics

    Directory of Open Access Journals (Sweden)

    DN Raad

    2011-06-01

Full Text Available The topic of multi-objective water distribution systems (WDS) design optimisation using metaheuristics is investigated, comparing numerous modern metaheuristics, including several multi-objective evolutionary algorithms, an estimation of distribution algorithm and a recent hyperheuristic named AMALGAM (an evolutionary framework for the simultaneous incorporation of multiple metaheuristics), in order to determine which approach is most capable with respect to WDS design optimisation. Novel metaheuristics and variants of existing algorithms are developed, for a total of twenty-three algorithms examined. Testing with respect to eight small-to-large-sized WDS benchmarks from the literature reveals that the four top-performing algorithms are mutually non-dominated with respect to the various performance metrics used. These algorithms are NSGA-II, TAMALGAMJndu, TAMALGAMndu and AMALGAMSndp (the last three being novel variants of AMALGAM). However, when these four algorithms are applied to the design of a very large real-world benchmark, the AMALGAM paradigm outperforms NSGA-II convincingly, with AMALGAMSndp exhibiting the best performance overall.

  14. Track gauge optimisation of railway switches using a genetic algorithm

    Science.gov (United States)

    Pålsson, Björn A.; Nielsen, Jens C. O.

    2012-01-01

    A methodology for the optimisation of a prescribed track gauge variation (gauge widening) in the switch panel of a railway turnout (switch and crossing, S&C) is presented. The aim is to reduce rail profile degradation. A holistic approach is applied, where both routes and travel directions (moves) of traffic in the switch panel are considered simultaneously. The problem is formulated as a multi-objective minimisation problem which is solved using a genetic-type optimisation algorithm which provides a set of Pareto optimal solutions. The dynamic vehicle-turnout interaction is evaluated using a multi-body simulation tool and the energy dissipation in the wheel-rail contacts is used for the assessment of gauge parameters. Two different vehicle models are used, one freight car and one passenger train set, and a stochastic spread in wheel profile and wheel-rail friction coefficient is accounted for. It is found that gauge configurations with a large gauge-widening amplitude for the stock rail on the field side are optimal for both the through and diverging routes, while the results for the gauge side show a larger route dependence. The optimal gauge configurations are observed to be similar for both vehicle types.

  15. Fractures in sport: Optimising their management and outcome.

    Science.gov (United States)

    Robertson, Greg Aj; Wood, Alexander M

    2015-12-18

Fractures in sport are a specialised cohort of fracture injuries, occurring in a high-functioning population, in which the goals are rapid restoration of function and return to play with the minimal symptom profile possible. While the general principles of fracture management, namely accurate fracture reduction, appropriate immobilisation and timely rehabilitation, guide the treatment of these injuries, the management of fractures in athletic populations can differ significantly from that in the general population, due to the need to facilitate a rapid return to high-demand activities. However, despite fractures comprising up to 10% of all sporting injuries, dedicated research into the management and outcome of sport-related fractures is limited. In order to assess the optimal methods of treating such injuries, and so allow optimisation of their outcome, the evidence for the management of each specific sport-related fracture type requires assessment and analysis. We present and review the current evidence directing the management of fractures in athletes, with the aim of promoting valid innovative methods and optimising the outcome of such injuries. From this, key recommendations are provided for the management of the common fracture types seen in the athlete. Six case reports are also presented to illustrate the management planning and application of sport-focussed fracture management in the clinical setting.

  16. Optimising Gas Quenching Technology through Modelling of Heat Transfer

    Institute of Scientific and Technical Information of China (English)

Florent Chaffotte; Linda Lefevre; Didier Domergue; Aymeric Goldsteinas; Xavier Doussot; Qingfei Zhang

    2004-01-01

Gas quenching represents an environmentally friendly alternative to the more commonly used oil quenching. Yet the performance of this technology remains limited in terms of the cooling rates reached compared with oil quenching. Distortion and process homogeneity also have to be controlled carefully. The efficiency of the gas quenching process depends entirely on the heat transfer between the gas and the quenched parts. The goal of this study is the optimisation of gas quenching process efficiency through a better understanding of the heat transfer phenomena involved. The study has been performed by means of modelling and validated by an experimental approach. The configuration of the gas flow has a major influence on the heat transfer between the gas and the parts. The fluid dynamics modelling approach used in this study makes it possible to optimise the heat transfer. New gas quenching processes allowing enhanced performance through higher cooling rates can thereby be identified. The new solutions have been validated under experimental and industrial conditions. The results obtained suggest that significant improvement of high-pressure gas quenching technology can be expected.

  17. Microstructure stability: Optimisation of 263 Ni-based superalloy

    Directory of Open Access Journals (Sweden)

    Crozet Coraline

    2014-01-01

Full Text Available To reduce CO2 emissions from coal-fired power plants, advanced ultra-supercritical (A-USC) power plants whose steam conditions exceed 700 °C are being developed. At these elevated temperatures, the use of Ni-based superalloys becomes necessary. In this context, and within the European project NextGenPower, focus is placed on commercial Nimonic C-263 as a candidate material for turbine rotors. Nimonic C-263 is known to have low sensitivity to segregation, high workability and high weldability, which are major properties for the manufacture of large shafts. Long-term creep strength is also required for this application, and unfortunately Nimonic C-263 shows η-phase precipitation after long-time exposure between 700 °C and 900 °C, which is detrimental to long-term creep properties. The composition of Nimonic C-263 was thus optimised to avoid the formation of the η-phase. Trial tests were made in order to study the effect of hardening-contributing elements on microstructural and mechanical properties. Then, a 500 mm diameter forged rotor was made from the optimised 263 alloy; it shows promising properties.

  18. Optimisation of a large WWTP thanks to mathematical modelling.

    Science.gov (United States)

    Printemps, C; Baudin, A; Dormoy, T; Zug, M; Vanrolleghem, P A

    2004-01-01

Better controlling and optimising the plant's processes has become a priority for WWTP (Wastewater Treatment Plant) managers. The main objective of this project is to develop a simplified mathematical tool able to reproduce and anticipate the behaviour of the Tougas WWTP (Nantes, France). This tool is intended to be used directly by the managers of the site. The mathematical WWTP model was created using the software WEST. This paper describes the studied site and the modelling results obtained during the model calibration and validation stages. The good simulation results showed that, despite an initially very simple description of the WWTP, the model was able to correctly predict the nitrogen composition (ammonia and nitrate) of the effluent and the daily sludge extraction. A second, more detailed configuration of the WWTP was then implemented, which allowed the behaviour of each of the four biological trains to be studied independently. Once this first stage is fully achieved, the remainder of the study will focus on the operational use of a simplified simulator with the purpose of optimising the operation of the Tougas WWTP.

  19. Dynamic Programming and Genetic Algorithm for Business Processes Optimisation

    Directory of Open Access Journals (Sweden)

    Mateusz Wibig

    2012-12-01

Full Text Available There are many business process modelling techniques which allow the features of those processes to be captured, but graphical, diagrammatic models seem to be the most widely used in companies and organizations. Although the modelling notations are increasingly mature and can be used not only to visualise the process idea but also to implement it in a workflow solution, and although modern software allows us to gather a lot of data for analysis purposes, there are still few commercially used business process optimisation methods. In this paper a scheduling/optimisation method for automatic task scheduling in business process models is described. The Petri net model is used, but the method can easily be applied to any other modelling notation in which the process is presented as a set of tasks, e.g. BPMN (Business Process Modelling Notation). The method uses Petri nets, the scalability of business processes and the dynamic programming concept to reduce the necessary computations by revising only those parts of the model to which a change was applied.
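
    The idea of revising only the affected parts of the model can be illustrated with a memoised dynamic programme over a small task graph. The sketch below is an assumption-laden stand-in, not the paper's Petri-net formulation; the task names, durations and dependency structure are invented for illustration.

    ```python
    # Memoised DP over a task graph: changing one task invalidates only the tasks
    # whose schedule can actually change, so only that part is re-evaluated.
    durations = {"A": 2, "B": 3, "C": 1, "D": 4}
    deps = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}   # D needs B and C
    _cache = {}

    def finish_time(task):
        if task not in _cache:
            start = max((finish_time(d) for d in deps[task]), default=0)
            _cache[task] = start + durations[task]
        return _cache[task]

    def update_duration(task, new_duration):
        """Invalidate only the tasks downstream of the change."""
        durations[task] = new_duration
        stale, changed = {task}, True
        while changed:
            changed = False
            for t, ds in deps.items():
                if t not in stale and any(d in stale for d in ds):
                    stale.add(t)
                    changed = True
        for t in stale:
            _cache.pop(t, None)

    print(finish_time("D"))      # 2 + 3 + 4 = 9
    update_duration("C", 6)      # only C and D are re-evaluated
    print(finish_time("D"))      # 2 + 6 + 4 = 12
    ```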

  20. Radiation protection optimisation techniques and their application in industry

    Energy Technology Data Exchange (ETDEWEB)

    Lefaure, C

    1996-12-31

Since International Commission on Radiological Protection (ICRP) Recommendation 60, the optimisation principle has been at the core of the radiation protection system. In practice, applying it means implementing an approach that is both predictive and evolutionary and that relies essentially on a prudent and responsible state of mind. The formal expression of this process, called the optimisation procedure, implies an indispensable tool for its implementation: the system of monetary values for the unit of collective dose. During the last few years, applying the ALARA principle has come to mean that a global work management approach must be adopted, considering together all factors contributing to radiation dose. In the nuclear field, the ALARA approach appears to be most successful when implemented in the framework of a managerial approach through structured ALARA programmes. Outside the nuclear industry it is necessary to clearly define priorities through generic optimisation studies and ALARA audits. At the international level much effort remains to be made to extend the ALARA process efficiently to internal exposure as well as to public exposure. (author) 2 graphs, 5 figs., 3 tabs.

  1. A Synchronous-Asynchronous Particle Swarm Optimisation Algorithm

    Directory of Open Access Journals (Sweden)

    Nor Azlina Ab Aziz

    2014-01-01

Full Text Available In the original particle swarm optimisation (PSO) algorithm, the particles’ velocities and positions are updated after the whole swarm performance is evaluated. This algorithm is also known as synchronous PSO (S-PSO). The strength of this update method is in the exploitation of the information. Asynchronous update PSO (A-PSO) has been proposed as an alternative to S-PSO. A particle in A-PSO updates its velocity and position as soon as its own performance has been evaluated. Hence, particles are updated using partial information, leading to stronger exploration. In this paper, we attempt to improve PSO by merging both update methods to utilise the strengths of both methods. The proposed synchronous-asynchronous PSO (SA-PSO) algorithm divides the particles into smaller groups. The best member of a group and the swarm’s best are chosen to lead the search. Members within a group are updated synchronously, while the groups themselves are asynchronously updated. Five well-known unimodal functions, four multimodal functions, and a real-world optimisation problem are used to study the performance of SA-PSO, which is compared with the performances of S-PSO and A-PSO. The results are statistically analysed and show that the proposed SA-PSO has performed consistently well.
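
    A minimal sketch of the synchronous-within-groups, asynchronous-between-groups update order is given below. It is a simplified variant, not the authors' implementation: the sphere objective, the parameter values and the use of personal and swarm bests in the velocity update are assumptions.

    ```python
    # Simplified synchronous-asynchronous PSO sketch on a sphere objective.
    import random

    def sphere(x):
        return sum(v * v for v in x)

    DIM, GROUPS, GROUP_SIZE, ITERS = 5, 4, 5, 100
    W, C1, C2 = 0.72, 1.49, 1.49

    def new_particle():
        pos = [random.uniform(-5, 5) for _ in range(DIM)]
        return {"pos": pos, "vel": [0.0] * DIM, "best": pos[:], "best_f": sphere(pos)}

    swarm = [[new_particle() for _ in range(GROUP_SIZE)] for _ in range(GROUPS)]
    gbest, gbest_f = None, float("inf")

    for _ in range(ITERS):
        for group in swarm:                      # groups handled one after another (asynchronous)
            for p in group:                      # evaluate the whole group first (synchronous)
                f = sphere(p["pos"])
                if f < p["best_f"]:
                    p["best"], p["best_f"] = p["pos"][:], f
            leader = min(group, key=lambda p: p["best_f"])
            if leader["best_f"] < gbest_f:       # swarm best refreshed before the next group
                gbest, gbest_f = leader["best"][:], leader["best_f"]
            for p in group:                      # then update the group's velocities/positions together
                for d in range(DIM):
                    r1, r2 = random.random(), random.random()
                    p["vel"][d] = (W * p["vel"][d]
                                   + C1 * r1 * (p["best"][d] - p["pos"][d])
                                   + C2 * r2 * (gbest[d] - p["pos"][d]))
                    p["pos"][d] += p["vel"][d]

    print(gbest_f)
    ```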

  2. Optimisation of Transmission Systems by use of Phase Shifting Transformers

    Energy Technology Data Exchange (ETDEWEB)

    Verboomen, J.

    2008-10-13

    In this thesis, transmission grids with PSTs (Phase Shifting Transformers) are investigated. In particular, the following goals are put forward: (a) The analysis and quantification of the impact of a PST on a meshed grid. This includes the development of models for the device; (b) The development of methods to obtain optimal coordination of several PSTs in a meshed grid. An objective function should be formulated, and an optimisation method must be adopted to solve the problem; and (c) The investigation of different strategies to use a PST. Chapter 2 gives a short overview of active power flow controlling devices. In chapter 3, a first step towards optimal PST coordination is taken. In chapter 4, metaheuristic optimisation methods are discussed. Chapter 5 introduces DC load flow approximations, leading to analytically closed equations that describe the relation between PST settings and active power flows. In chapter 6, some applications of the methods that are developed in earlier chapters are presented. Chapter 7 contains the conclusions of this thesis, as well as recommendations for future work.

  3. Optimising Gas Quenching Technology through Modelling of Heat Transfer

    Institute of Scientific and Technical Information of China (English)

Florent Chaffotte; Linda Lefevre; Didier Domergue; Aymeric Goldsteinas; Xavier Doussot; Qingfei Zhang

    2004-01-01

Gas quenching represents an environmentally friendly alternative to the more commonly used oil quenching. Yet the performance of this technology remains limited in terms of the cooling rates reached compared with oil quenching. Distortion and process homogeneity also have to be controlled carefully. The efficiency of the gas quenching process depends entirely on the heat transfer between the gas and the quenched parts. The goal of this study is the optimisation of gas quenching process efficiency through a better understanding of the heat transfer phenomena involved. The study has been performed by means of modelling and validated by an experimental approach. The configuration of the gas flow has a major influence on the heat transfer between the gas and the parts. The fluid dynamics modelling approach used in this study makes it possible to optimise the heat transfer. New gas quenching processes allowing enhanced performance through higher cooling rates can thereby be identified. The new solutions have been validated under experimental and industrial conditions. The results obtained suggest that significant improvement of high-pressure gas quenching technology can be expected.

  4. Multigrid Implementation of Cellular Automata for Topology Optimisation of Continuum Structures with Design Dependent loads

    NARCIS (Netherlands)

    Zakhama, R.

    2009-01-01

    Topology optimisation of continuum structures has become mature enough to be often applied in industry and continues to attract the attention of researchers and software companies in various engineering fields. Traditionally, most available algorithms for solving topology optimisation problems are b

  5. SCOOP: A Tool for SymboliC Optimisations Of Probabilistic Processes

    NARCIS (Netherlands)

    Timmer, Mark; Palamidessi, C.; Riska, A.

    2011-01-01

    This paper presents SCOOP: a tool that symbolically optimises process-algebraic specifications of probabilistic processes. It takes specifications in the prCRL language (combining data and probabilities), which are linearised first to an intermediate format: the LPPE. On this format, optimisations s

  6. Asynchronous Adaptive Optimisation for Generic Data-Parallel Array Programming and Beyond

    NARCIS (Netherlands)

    Grelck, C.

    2011-01-01

    We present the concept of an adaptive compiler optimisation framework for the functional array programming language SaC, Single Assignment C. SaC advocates shape- and rank-generic programming with multidimensional arrays. A sophisticated, highly optimising compiler technology nonetheless achieves co

  7. Asynchronous Adaptive Optimisation for Generic Data-Parallel Array Programming and Beyond

    NARCIS (Netherlands)

    Grelck, C.

    2011-01-01

    We present the concept of an adaptive compiler optimisation framework for the functional array programming language SaC, Single Assignment C. SaC advocates shape- and rank-generic programming with multidimensional arrays. A sophisticated, highly optimising compiler technology nonetheless achieves co

  8. Optimisation of the $VH$ to $b\\bar{b} + X$ Selection

    CERN Document Server

    Wilk, Fabian

    2013-01-01

This report presents results from two separate yet related studies, both using truth data. First, a cut optimisation study and its results are presented. This study aims to provide a numerically optimised set of cuts for the current $8\, \mathrm{TeV}$ and the upcoming $14 \, \mathrm{TeV}$ analysis of the $WH \to l\

  9. Optimisation of electrical system for offshore wind farms via genetic algorithm

    DEFF Research Database (Denmark)

    Chen, Zhe; Zhao, Menghua; Blaabjerg, Frede

    2009-01-01

    An optimisation platform based on genetic algorithm (GA) is presented, where the main components of a wind farm and key technical specifications are used as input parameters and the electrical system design of the wind farm is optimised in terms of both production cost and system reliability...

  10. Design of Circularly-Polarised, Crossed Drooping Dipole, Phased Array Antenna Using Genetic Algorithm Optimisation

    DEFF Research Database (Denmark)

    Larsen, Niels Vesterdal

    2007-01-01

    A printed drooping dipole array is designed and constructed. The design is based on a genetic algorithm optimisation procedure used in conjunction with the software programme AWAS. By optimising the array G/T for specific combinations of scan angles and frequencies an optimum design is obtained...

  11. Optimal Energy Consumption in Refrigeration Systems - Modelling and Non-Convex Optimisation

    DEFF Research Database (Denmark)

    Hovgaard, Tobias Gybel; Larsen, Lars F. S.; Skovrup, Morten J.

    2012-01-01

    that is somewhat more efficient than general purpose optimisation algorithms for NMPC and still near to optimal. Since the non-convex cost function has multiple extrema, standard methods for optimisation cannot be directly applied. A qualitative analysis of the system's constraints is presented and a unique...

  12. Design of Circularly-Polarised, Crossed Drooping Dipole, Phased Array Antenna Using Genetic Algorithm Optimisation

    DEFF Research Database (Denmark)

    Larsen, Niels Vesterdal

    2007-01-01

    A printed drooping dipole array is designed and constructed. The design is based on a genetic algorithm optimisation procedure used in conjunction with the software programme AWAS. By optimising the array G/T for specific combinations of scan angles and frequencies an optimum design is obtained...

  13. Using response surface methodology in optimisation of biodiesel production via alkali catalysed transesterification of waste cooking oil

    CSIR Research Space (South Africa)

    Naidoo, R

    2016-03-01

    Full Text Available The report focuses on optimisation of alkali catalysis as a process for producing biodiesel from waste cooking oils. Biodiesel production parameters that were optimised were methanol to oil ratio, catalyst concentration, reaction temperature...

  14. Towards an unbiased metabolic profiling of protozoan parasites : optimisation of a Leishmania sampling protocol for HILIC-orbitrap analysis

    NARCIS (Netherlands)

t'Kindt, Ruben; Jankevics, Andris; Scheltema, Richard A.; Zheng, Liang; Watson, David G.; Dujardin, Jean-Claude; Breitling, Rainer; Coombs, Graham H.; Decuypere, Saskia

    2010-01-01

    Comparative metabolomics of Leishmania species requires the simultaneous identification and quantification of a large number of intracellular metabolites. Here, we describe the optimisation of a comprehensive metabolite extraction protocol for Leishmania parasites and the subsequent optimisation of

  15. Sustainable management of a coupled groundwater-agriculture hydrosystem using multi-criteria simulation based optimisation.

    Science.gov (United States)

    Grundmann, Jens; Schütze, Niels; Lennartz, Franz

    2013-01-01

    In this paper we present a new simulation-based integrated water management tool for sustainable water resources management in arid coastal environments. This tool delivers optimised groundwater withdrawal scenarios considering saltwater intrusion as a result of agricultural and municipal water abstraction. It also yields a substantially improved water use efficiency of irrigated agriculture. To allow for a robust and fast operation we unified process modelling with artificial intelligence tools and evolutionary optimisation techniques. The aquifer behaviour is represented using an artificial neural network (ANN) which emulates a numerical density-dependent groundwater flow model. The impact of agriculture is represented by stochastic crop water production functions (SCWPF). Simulation-based optimisation techniques together with the SCWPF and ANN deliver optimal groundwater abstraction and cropping patterns. To address contradicting objectives, e.g. profit-oriented agriculture vs. sustainable abstraction scenarios, we performed multi-objective optimisations using a multi-criteria optimisation algorithm.
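
    A highly simplified sketch of surrogate-based constrained optimisation in this spirit is shown below. The quadratic "surrogate" merely stands in for the trained ANN emulator of the density-dependent groundwater model, and the profit function, salinity limit and penalty weight are assumed toy values.

    ```python
    # Toy surrogate-based optimisation: maximise crop profit subject to a salinity
    # limit predicted by a stand-in surrogate model of the aquifer response.
    from scipy.optimize import minimize_scalar

    def salinity_surrogate(q):
        """Stand-in for the ANN emulator: salinity rises non-linearly with abstraction q."""
        return 0.5 + 0.02 * q + 0.004 * q ** 2

    def crop_profit(q):
        """Stand-in for the stochastic crop water production functions."""
        return 12.0 * q - 0.15 * q ** 2

    SALINITY_LIMIT, PENALTY = 6.0, 1e3

    def objective(q):
        # penalise abstractions whose predicted salinity exceeds the limit
        violation = max(0.0, salinity_surrogate(q) - SALINITY_LIMIT)
        return -crop_profit(q) + PENALTY * violation ** 2

    res = minimize_scalar(objective, bounds=(0.0, 60.0), method="bounded")
    print(f"optimal abstraction ~ {res.x:.1f}, profit ~ {crop_profit(res.x):.1f}, "
          f"salinity ~ {salinity_surrogate(res.x):.2f}")
    ```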

  16. Simulation and optimisation modelling approach for operation of the Hoa Binh Reservoir, Vietnam

    DEFF Research Database (Denmark)

    Ngo, Long le; Madsen, Henrik; Rosbjerg, Dan

    2007-01-01

    Hoa Binh, the largest reservoir in Vietnam, plays an important role in flood control for the Red River delta and hydropower generation. Due to its multi-purpose character, conflicts and disputes in operating the reservoir have been ongoing since its construction, particularly in the flood season......, the hydro-meteorological conditions, and the time of the year. A heuristic global optimisation tool, the shuffled complex evolution (SCE) algorithm, is adopted for optimising the reservoir operation. The optimisation puts focus on the trade-off between flood control and hydropower generation for the Hoa....... This paper proposes to optimise the control strategies for the Hoa Binh reservoir operation by applying a combination of simulation and optimisation models. The control strategies are set up in the MIKE 11 simulation model to guide the releases of the reservoir system according to the current storage level...

  17. Large scale three-dimensional topology optimisation of heat sinks cooled by natural convection

    CERN Document Server

    Alexandersen, Joe; Aage, Niels

    2015-01-01

    This work presents the application of density-based topology optimisation to the design of three-dimensional heat sinks cooled by natural convection. The governing equations are the steady-state incompressible Navier-Stokes equations coupled to the thermal convection-diffusion equation through the Bousinessq approximation. The fully coupled non-linear multiphysics system is solved using stabilised trilinear equal-order finite elements in a parallel framework allowing for the optimisation of large scale problems with order of 40-330 million state degrees of freedom. The flow is assumed to be laminar and several optimised designs are presented for Grashof numbers between $10^3$ and $10^6$. Interestingly, it is observed that the number of branches in the optimised design increases with increasing Grashof numbers, which is opposite to two-dimensional optimised designs.

  18. Analysis and Optimisation of Hierarchically Scheduled Multiprocessor Embedded Systems

    DEFF Research Database (Denmark)

Pop, Traian; Pop, Paul; Eles, Petru

    2008-01-01

    , they are organised in a hierarchy. In this paper, we first develop a holistic scheduling and schedulability analysis that determines the timing properties of a hierarchically scheduled system. Second, we address design problems that are characteristic to such hierarchically scheduled systems: assignment......We present an approach to the analysis and optimisation of heterogeneous multiprocessor embedded systems. The systems are heterogeneous not only in terms of hardware components, but also in terms of communication protocols and scheduling policies. When several scheduling policies share a resource...... of scheduling policies to tasks, mapping of tasks to hardware components, and the scheduling of the activities. We also present several algorithms for solving these problems. Our heuristics are able to find schedulable implementations under limited resources, achieving an efficient utilisation of the system...

  19. Optimisation of LHCb Applications for Multi- and Manycore Job Submission

    CERN Document Server

    Rauschmayr, Nathalie; Graciani Diaz, Ricardo; Charpentier, Philippe

The Worldwide LHC Computing Grid (WLCG) is the largest computing grid and is used by all Large Hadron Collider experiments in order to process their recorded data. It provides approximately 400k cores and storage. Nowadays, most of the resources consist of multi- and manycore processors. Conditions at the Large Hadron Collider experiments will change, and much larger workloads and jobs consuming more memory are expected in the future. This has led to a paradigm shift which focuses on executing jobs as multiprocessor tasks in order to use multi- and manycore processors more efficiently. All experiments at CERN are currently investigating how such computing resources can be used more efficiently in terms of memory requirements and handling of concurrency. Until now, there are still many unsolved issues regarding software, scheduling, CPU accounting and task queues, which need to be solved by grid sites and experiments. This thesis develops a systematic approach to optimise the software of the LHCb experiment fo...

  20. Optimisation de contrôleurs par essaim particulaire

    OpenAIRE

    Fix, Jérémy; Geist, Matthieu

    2012-01-01

http://cap2012.loria.fr/pub/Papers/10.pdf; National audience; Finding optimal controllers for stochastic systems is a particularly difficult problem addressed in the reinforcement learning and optimal control communities. The classical paradigm employed to solve these problems is that of Markov decision processes. Nevertheless, the resulting optimisation problem can be difficult to solve. In this paper, we explore the use of...

  1. Optimised to Fail: Card Readers for Online Banking

    Science.gov (United States)

    Drimer, Saar; Murdoch, Steven J.; Anderson, Ross

    The Chip Authentication Programme (CAP) has been introduced by banks in Europe to deal with the soaring losses due to online banking fraud. A handheld reader is used together with the customer’s debit card to generate one-time codes for both login and transaction authentication. The CAP protocol is not public, and was rolled out without any public scrutiny. We reverse engineered the UK variant of card readers and smart cards and here provide the first public description of the protocol. We found numerous weaknesses that are due to design errors such as reusing authentication tokens, overloading data semantics, and failing to ensure freshness of responses. The overall strategic error was excessive optimisation. There are also policy implications. The move from signature to PIN for authorising point-of-sale transactions shifted liability from banks to customers; CAP introduces the same problem for online banking. It may also expose customers to physical harm.

  2. Optimised Iteration in Coupled Monte Carlo - Thermal-Hydraulics Calculations

    Science.gov (United States)

    Hoogenboom, J. Eduard; Dufek, Jan

    2014-06-01

    This paper describes an optimised iteration scheme for the number of neutron histories and the relaxation factor in successive iterations of coupled Monte Carlo and thermal-hydraulic reactor calculations based on the stochastic iteration method. The scheme results in an increasing number of neutron histories for the Monte Carlo calculation in successive iteration steps and a decreasing relaxation factor for the spatial power distribution to be used as input to the thermal-hydraulics calculation. The theoretical basis is discussed in detail and practical consequences of the scheme are shown, among which is a nearly linear increase per iteration in the number of cycles in the Monte Carlo calculation. The scheme is demonstrated for a full PWR-type fuel assembly. Results are shown for the axial power distribution during several iteration steps. A few alternative iteration methods are also tested, and it is concluded that the presented iteration method is near optimal.
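
    As a rough sketch of the kind of scheme the abstract describes (assumed here, not taken from the paper), the Python fragment below couples a mock Monte Carlo power estimate to a relaxed update in which the number of histories grows linearly with the iteration index while the relaxation factor decays as 1/i; the growth law, node count and toy solver are illustrative assumptions.

      import numpy as np

      def mock_monte_carlo_power(nodes, n_histories, rng):
          # Stand-in for a Monte Carlo neutronics solve: a noisy axial power
          # shape whose statistical error shrinks roughly as 1/sqrt(N).
          true_shape = np.sin(np.pi * (np.arange(nodes) + 0.5) / nodes)
          noisy = true_shape + rng.normal(0.0, 1.0 / np.sqrt(n_histories), nodes)
          noisy = np.clip(noisy, 0.0, None)
          return noisy / noisy.sum()

      def coupled_iteration(nodes=20, n0=10_000, iterations=8, seed=1):
          rng = np.random.default_rng(seed)
          power = np.full(nodes, 1.0 / nodes)      # flat initial power guess
          for i in range(1, iterations + 1):
              n_i = n0 * i                         # histories grow with the iteration index
              alpha_i = 1.0 / i                    # relaxation factor decays as 1/i
              mc_power = mock_monte_carlo_power(nodes, n_i, rng)
              # Relaxed power distribution that would be passed to the
              # (omitted) thermal-hydraulics solver in a real coupling.
              power = (1.0 - alpha_i) * power + alpha_i * mc_power
          return power

      print(np.round(coupled_iteration(), 4))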

  3. A Metric and Optimisation Scheme for Microlens Planet Searches

    CERN Document Server

    Horne, Keith; Tsapras, Yianni

    2009-01-01

    OGLE III and MOA II are discovering 600-1000 Galactic Bulge microlens events each year. This stretches the resources available for intensive follow-up monitoring of the lightcurves in search of anomalies caused by planets near the lens stars. We advocate optimizing microlens planet searches by using an automatic prioritization algorithm based on the planet detection zone area probed by each new data point. This optimization scheme takes account of the telescope and detector characteristics, observing overheads, sky conditions, and the time available for observing on each night. The predicted brightness and magnification of each microlens target is estimated by fitting to available data points. The optimisation scheme then yields a decision on which targets to observe and which to skip, and a recommended exposure time for each target, designed to maximize the planet detection capability of the observations. The optimal strategy maximizes detection of planet anomalies, and must be coupled with rapid data reduct...

  4. Optimisation of Resource Scheduling in VCIM Systems Using Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Son Duy Dao

    2012-11-01

    The concept of Virtual Computer-Integrated Manufacturing (VCIM) has been proposed for one and a half decades with the purpose of overcoming the limitation of traditional Computer-Integrated Manufacturing (CIM), which only works within an enterprise. The VCIM system is a promising solution for enterprises to survive in the globally competitive market because it can effectively exploit locally as well as globally distributed resources. In this paper, a Genetic Algorithm (GA) based approach for optimising resource scheduling in the VCIM system is proposed. Firstly, based on the latest concept of the VCIM system, a class of resource scheduling problems in the system is modelled using an agent-based approach. Secondly, a GA with new strategies for handling constraints, chromosome encoding, crossover and mutation is developed to search for an optimal solution to the problem. Finally, a case study is given to demonstrate the robustness of the proposed approach.
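
    As a hypothetical illustration of the GA ingredients named in the abstract (chromosome encoding, selection, crossover and mutation), the Python sketch below assigns tasks to resources so as to minimise the makespan; the workloads, resource speeds and GA settings are invented for the example and are not the paper's VCIM model.

      import random

      # Toy problem: assign 8 tasks to 3 resources with different speeds so
      # that the makespan is minimised (all data are illustrative).
      WORKLOADS = [4, 7, 2, 5, 9, 3, 6, 8]
      SPEEDS = [1.0, 1.5, 2.0]

      def makespan(assignment):
          loads = [0.0] * len(SPEEDS)
          for task, resource in enumerate(assignment):
              loads[resource] += WORKLOADS[task] / SPEEDS[resource]
          return max(loads)

      def crossover(a, b):
          cut = random.randint(1, len(a) - 1)          # one-point crossover
          return a[:cut] + b[cut:]

      def mutate(chromosome, rate=0.1):
          return [random.randrange(len(SPEEDS)) if random.random() < rate else gene
                  for gene in chromosome]

      def genetic_algorithm(pop_size=40, generations=100):
          population = [[random.randrange(len(SPEEDS)) for _ in WORKLOADS]
                        for _ in range(pop_size)]
          for _ in range(generations):
              population.sort(key=makespan)            # lower makespan = fitter
              parents = population[: pop_size // 2]    # truncation selection
              children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                          for _ in range(pop_size - len(parents))]
              population = parents + children
          best = min(population, key=makespan)
          return best, makespan(best)

      best, span = genetic_algorithm()
      print("best assignment:", best, "makespan:", round(span, 2))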

  5. Analysis and Optimisation of Pulse Dynamics for Magnetic Stimulation

    CERN Document Server

    Goetz, Stefan M; Gerhofer, Manuel G; Weyh, Thomas; Herzog, Hans-Georg

    2011-01-01

    Magnetic stimulation is a standard tool in brain research and many fields of neurology, as well as psychiatry. From a physical perspective, one key aspect of this method is the inefficiency of available setups. Whereas the spatial field properties have been studied rather intensively with coil designs, the dynamics have been neglected almost completely for a long time. Instead, the devices and their technology defined the waveform. Here, an analysis of the waveform space is performed. Based on these data, an appropriate optimisation approach is outlined which makes use of a modern nonlinear axon description of a mammalian motor nerve. The approach is based on a hybrid global-local method; different coordinate systems for describing the continuous waveforms in a limited parameter space are defined for sufficient stability. The results of the numeric setup suggest that there is plenty of room for waveforms with higher efficiency than the traditional shapes. One class of such pulses is analysed further. Although...

  6. Pre-Industry-Optimisation of the Laser Welding Process

    DEFF Research Database (Denmark)

    Gong, Hui

    This dissertation documents the investigations into on-line monitoring of the CO2 laser welding process and optimising the process parameters for achieving high quality welds. The requirements for realisation of an on-line control system are, first of all, a clear understanding of the dynamic phenomena of the laser welding process, including the behaviour of the keyhole and plume, and the correlation between the adjustable process parameters (laser power, welding speed, focal point position, gas parameters etc.) and the characteristics describing the quality of the weld (seam depth and width, porosity etc.). Secondly, a reliable monitoring system for sensing the laser-induced plasma and plume emission and detecting weld defects and process parameter deviations from the optimum conditions. Finally, an efficient control system with a fast signal processor and a precise feed-back controller...

  7. Analysis and Optimisation of Carcass Production for Flexible Pipes

    DEFF Research Database (Denmark)

    Nielsen, Peter Søe

    structure that provides mechanical and collapse strength for the flexible pipe. The manufacturing process of the carcass is a combination of roll forming stainless steel strips and helically winding the profiles around a mandrel, interlocking the profiles with themselves. The focus of the present project is the analysis and optimisation of the carcass manufacturing process by means of a fundamental investigation in the fields of formability, failure modes/mechanisms, Finite Element Analysis (FEA), simulative testing and tribology. A study of failure mechanisms in carcass production is performed by being present... strip thickness deep. Simulative tribo-testing in the strip-reduction test showed that biodegradable rapeseed oil is an acceptable lubricant for the carcass process. Testing of two lean duplex stainless steel surfaces showed that an EN 2E brushed surface had better lubricant entrapment capabilities than...

  8. Dose evaluation from multiple detector outputs using convex optimisation.

    Science.gov (United States)

    Hashimoto, Makoto; Iimoto, Takeshi; Kosako, Toshiso

    2011-07-01

    A dose evaluation using multiple radiation detectors can be improved by the convex optimisation method. It enables flexible dose evaluation corresponding to the actual radiation energy spectrum. An application to the evaluation of the neutron ambient dose equivalent is investigated using a mixed-gas proportional counter. The convex optimisation derives the neutron ambient dose, within a certain width, corresponding to the true neutron energy spectrum. The range of the evaluated dose is comparable to the error of conventional neutron dose measurement equipment. An application to neutron individual dose equivalent measurement is also investigated. Convex optimisation over particular dosemeter combinations evaluates the individual dose equivalent better than the dose evaluation of a single dosemeter. Combinations of dosemeters whose response characteristics are highly orthogonal tend to be well suited for dose evaluation.
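
    A minimal sketch of the general idea, under assumed numbers: several detector readings are unfolded into group fluences with a non-negativity constraint and then folded with dose-conversion coefficients. Non-negative least squares stands in here for the paper's convex formulation, and the response matrix and coefficients are invented.

      import numpy as np
      from scipy.optimize import nnls

      # Hypothetical response matrix R[i, j]: reading of detector i per unit
      # fluence in energy group j, and fluence-to-dose coefficients h[j].
      R = np.array([[0.9, 0.4, 0.1],
                    [0.3, 0.8, 0.5],
                    [0.1, 0.3, 0.9]])
      h = np.array([2.0, 8.0, 20.0])            # pSv per unit fluence per group

      true_fluence = np.array([5.0, 2.0, 1.0])
      readings = R @ true_fluence               # noiseless detector outputs

      # Unfold the group fluences under a non-negativity constraint, then fold
      # them with the conversion coefficients to evaluate the dose.
      fluence_est, _ = nnls(R, readings)
      print("estimated fluences:", np.round(fluence_est, 3))
      print("estimated dose:", round(float(h @ fluence_est), 2),
            "pSv (true:", float(h @ true_fluence), "pSv)")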

  9. Optimisation of geometrical ratchets for spin-current amplification

    Energy Technology Data Exchange (ETDEWEB)

    Abdullah, Ranjdar M. [Department of Electronics, University of York, Heslington, York YO10 5DD (United Kingdom); Vick, Andrew J. [Department of Electronics, University of York, Heslington, York YO10 5DD (United Kingdom); Department of Physics, University of York, York YO10 5DD (United Kingdom); Murphy, Benedict A. [Department of Physics, University of York, York YO10 5DD (United Kingdom); Hirohata, Atsufumi, E-mail: atsufumi.hirohata@york.ac.uk [Department of Electronics, University of York, Heslington, York YO10 5DD (United Kingdom); PRESTO, Japan Science and Technology Agency, Kawaguchi 332-0012 (Japan)

    2015-05-07

    A two-dimensional model is used to study the geometrical effects of a nonmagnetic (NM) nanowire upon a spin-polarised electron current in a lateral spin-valve structure. We found that the implemented ratchet shapes at the centre of the NM have a crucial effect on the diffusive rate for up- and down-spin electrons along the wire, which leads to the amplification of non-local spin-current signals. By using our simple model, the geometries have been optimised. The calculated spin-current signals are in good qualitative agreement with our recent experimental results [Abdullah et al., J. Phys. D: Appl. Phys. 47, 482001(FTC) (2014)]. Our model may be very useful to evaluate such a geometrical effect on spin-polarised electron transport.

  10. X-ray energy optimisation in computed microtomography.

    Science.gov (United States)

    Spanne, P

    1989-06-01

    Expressions describing the absorbed dose and the number of incident photons necessary for the detection of a contrasting detail in x-ray transmission CT imaging of a circular phantom are derived as functions of the linear attenuation coefficients of the materials comprising the object and the detail. A shell of a different material can be included to allow simulation of CT imaging of the skulls of small laboratory animals. The equations are used to estimate the optimum photon energy in x-ray transmission computed microtomography. The optimum energy depends on whether the number of incident photons or the absorbed dose at a point in the object is minimised. For a water object of 300 mm diameter the two optimisation criteria yield optimum photon energies differing by an order of magnitude.

  11. Symmetrisation schemes for global optimisation of atomic clusters.

    Science.gov (United States)

    Oakley, Mark T; Johnston, Roy L; Wales, David J

    2013-03-21

    Locating the global minima of atomic and molecular clusters can be a difficult optimisation problem. Here we report benchmarks for procedures that exploit approximate symmetry. This strategy was implemented in the GMIN program following a theoretical analysis, which explained why high-symmetry structures are more likely to have particularly high or particularly low energy. The analysis, and the corresponding algorithms, allow for approximate point group symmetry, and can be combined with basin-hopping and genetic algorithms. We report results for 38-, 75-, and 98-atom Lennard-Jones clusters, which are all multiple-funnel systems. Exploiting approximate symmetry reduces the mean time taken to locate the global minimum by up to two orders of magnitude, with smaller improvements in efficiency for LJ(55) and LJ(74), which correspond to simpler single-funnel energy landscapes.
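
    A minimal basin-hopping run on a small Lennard-Jones cluster, using SciPy rather than GMIN and without the symmetrisation moves that are the subject of the paper, gives a feel for the kind of global optimisation benchmarked above; the cluster size and iteration count are arbitrary choices.

      import numpy as np
      from scipy.optimize import basinhopping

      def lj_energy(x):
          # Total Lennard-Jones energy (epsilon = sigma = 1) for 3N flattened coordinates.
          pos = x.reshape(-1, 3)
          energy = 0.0
          for i in range(len(pos) - 1):
              d = np.linalg.norm(pos[i + 1:] - pos[i], axis=1)
              energy += np.sum(4.0 * (d ** -12 - d ** -6))
          return energy

      # 7-atom cluster started from random coordinates; the known LJ7 global
      # minimum is about -16.505 epsilon and is usually found within a few
      # hundred basin-hopping steps.
      x0 = np.random.default_rng(0).uniform(-1.0, 1.0, size=7 * 3)
      result = basinhopping(lj_energy, x0, niter=200,
                            minimizer_kwargs={"method": "L-BFGS-B"})
      print("lowest energy found:", round(result.fun, 3))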

  12. Optimisation-Based Solution Methods for Set Partitioning Models

    DEFF Research Database (Denmark)

    Rasmussen, Matias Sevel

    The scheduling of crew, i.e. the construction of work schedules for crew members, is often not a trivial task, but a complex puzzle. The task is complicated by rules, restrictions, and preferences. Therefore, manual solutions as well as solutions from standard software packages are not always sufficient with respect to solution quality and solution time. Enhancement of the overall solution quality as well as the solution time can be of vital importance to many organisations. The fields of operations research and mathematical optimisation deal with mathematical modelling of difficult scheduling problems (among other topics). The fields also deal with the development of sophisticated solution methods for these mathematical models. This thesis describes the set partitioning model which has been widely used for modelling crew scheduling problems. Integer properties for the set partitioning model are shown...

  13. Optimisation of arbitrary light beam generation with spatial light modulators

    Science.gov (United States)

    Radwell, Neal; Offer, Rachel F.; Selyem, Adam; Franke-Arnold, Sonja

    2017-09-01

    Phase only spatial light modulators (SLMs) have become the tool of choice for shaped light generation, allowing the creation of arbitrary amplitude and phase patterns. These patterns are generated using digital holograms and are useful for a wide range of applications as well as for fundamental research. There have been many proposed methods for optimal generation of the digital holograms, all of which perform well under ideal conditions. Here we test a range of these methods under specific experimental constraints, by varying grating period, filter size, hologram resolution, number of phase levels, phase throw and phase nonlinearity. We model beam generation accuracy and efficiency and show that our results are not limited to the specific beam shapes, but should hold for general beam shaping. Our aim is to demonstrate how to optimise and improve the performance of phase-only SLMs for experimentally relevant implementations.
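
    A common way such phase-only holograms are built, sketched below under assumed parameters, is to add a blazed grating to the target phase, wrap the result modulo 2π and quantise it to the number of phase levels the SLM supports; the beam choice, grating period and resolution are illustrative only.

      import numpy as np

      def vortex_phase(charge, size=512):
          # Azimuthal phase of an optical vortex with the given topological charge.
          y, x = np.mgrid[-size // 2: size // 2, -size // 2: size // 2]
          return charge * np.arctan2(y, x)

      def phase_hologram(target_phase, grating_period=8, phase_levels=256):
          # Add a blazed grating so the shaped beam appears in the first
          # diffraction order, wrap to [0, 2*pi) and quantise to the SLM's levels.
          columns = np.arange(target_phase.shape[1])
          blazed = 2.0 * np.pi * columns / grating_period
          total = np.mod(target_phase + blazed[None, :], 2.0 * np.pi)
          quantised = np.round(total / (2.0 * np.pi) * (phase_levels - 1))
          return quantised / (phase_levels - 1) * 2.0 * np.pi

      hologram = phase_hologram(vortex_phase(charge=2))
      print(hologram.shape, round(float(hologram.max()), 3))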

  14. Kinetic model of metabolic network for xiamenmycin biosynthetic optimisation.

    Science.gov (United States)

    Xu, Min-juan; Chen, Yong-cong; Xu, Jun; Ao, Ping; Zhu, Xiao-mei

    2016-02-01

    Xiamenmycins, a series of prenylated benzopyran compounds with anti-fibrotic bioactivities, were isolated from a mangrove-derived Streptomyces xiamenensis. To fulfil the requirements of pharmaceutical investigations, a high production of xiamenmycin is needed. In this study, the authors present a kinetic metabolic model to evaluate fluxes in an engineered Streptomyces lividans with xiamenmycin-oriented genetic modification, based on generic enzymatic rate equations and stability constraints. A Lyapunov function was used for viability optimisation. From their kinetic model, the flux distributions for the engineered S. lividans fed on glucose and glycerol as carbon sources were calculated. They found that if the bacterium can utilise glucose simultaneously with glycerol, xiamenmycin production can theoretically be enhanced by 40%, while maintaining the same growth rate. Glycerol may increase the flux for phosphoenolpyruvate synthesis without interfering with the citric acid cycle. They therefore believe this study demonstrates a possible new direction for the bioengineering of S. lividans.

  15. Optimising, generalising and integrating educational practice using neuroscience

    Science.gov (United States)

    Colvin, Robert

    2016-07-01

    Practical collaboration at the intersection of education and neuroscience research is difficult because the combined discipline encompasses both the activity of microscopic neurons and the complex social interactions of teachers and students in a classroom. Taking a pragmatic view, this paper discusses three education objectives to which neuroscience can be effectively applied: optimising, generalising and integrating instructional techniques. These objectives are characterised by: (1) being of practical importance; (2) building on existing education and cognitive research; and (3) being infeasible to address based on behavioural experiments alone. The focus of the neuroscientific aspect of collaborative research should be on the activity of the brain before, during and after learning a task, as opposed to performance of a task. The objectives are informed by literature that highlights possible pitfalls with educational neuroscience research, and are described with respect to the static and dynamic aspects of brain physiology that can be measured by current technology.

  16. Optimisation of a parallel ocean general circulation model

    Directory of Open Access Journals (Sweden)

    M. I. Beare

    This paper presents the development of a general-purpose parallel ocean circulation model, for use on a wide range of computer platforms, from traditional scalar machines to workstation clusters and massively parallel processors. Parallelism is provided, as a modular option, via high-level message-passing routines, thus hiding the technical intricacies from the user. An initial implementation highlights that the parallel efficiency of the model is adversely affected by a number of factors, for which optimisations are discussed and implemented. The resulting ocean code is portable and, in particular, allows science to be achieved on local workstations that could otherwise only be undertaken on state-of-the-art supercomputers.

  17. Optimisation of assembly scheduling in VCIM systems using genetic algorithm

    Science.gov (United States)

    Dao, Son Duy; Abhary, Kazem; Marian, Romeo

    2017-01-01

    Assembly plays an important role in any production system, as it constitutes a significant portion of the lead time and cost of a product. The virtual computer-integrated manufacturing (VCIM) system is a modern production system being conceptually developed to extend the application of the traditional computer-integrated manufacturing (CIM) system to the global level. Assembly scheduling in VCIM systems is quite different from that in traditional production systems because of the difference in the working principles of the two systems. In this article, the assembly scheduling problem in VCIM systems is modelled and then an integrated approach based on a genetic algorithm (GA) is proposed to search for a globally optimised solution to the problem. Because of the dynamic nature of the scheduling problem, a novel GA with a unique chromosome representation and modified genetic operations is developed herein. The robustness of the proposed approach is verified by a numerical example.

  18. Topology optimisation for energy management in underwater sensor networks

    Science.gov (United States)

    Jha, Devesh K.; Wettergren, Thomas A.; Ray, Asok; Mukherjee, Kushal

    2015-09-01

    In general, battery-powered sensors in a sensor network are operable as long as they can communicate sensed data to a processing node. In this context, a sensor network has two competing objectives: (1) maximisation of the network performance with respect to the probability of successful search for a specified upper bound on the probability of false alarms, and (2) maximisation of the network's operable life. As both sensing and communication of data consume battery energy at the sensing nodes of the sensor network, judicious use of sensing power and communication power is needed to improve the lifetime of the sensor network. This paper presents an adaptive energy management policy that will optimally allocate the available energy between sensing and communication at each sensing node to maximise the network performance subject to specified constraints. Under the assumptions of fixed total energy allocation for a sensor network operating for a specified time period, the problem is reduced to synthesis of an optimal network topology that maximises the probability of successful search (of a target) over a surveillance region. In a two-stage optimisation, a genetic algorithm-based meta-heuristic search is first used to efficiently explore the global design space, and then a local pattern search algorithm is used for convergence to an optimal solution. The results of performance optimisation are generated on a simulation test bed to validate the proposed concept. Adaptation to energy variations across the network is shown to be manifested as a change in the optimal network topology by using sensing and communication models for underwater environment. The approximate Pareto-optimal surface is obtained as a trade-off between network lifetime and probability of successful search over the surveillance region.
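
    The two-stage idea (a global evolutionary search followed by a local refinement) can be sketched on a toy sensor-placement objective as below; SciPy's differential evolution and Nelder-Mead stand in for the paper's genetic algorithm and pattern search, and the targets, detection model and sensor count are invented.

      import numpy as np
      from scipy.optimize import differential_evolution, minimize

      rng = np.random.default_rng(3)
      targets = rng.uniform(0.0, 1.0, size=(40, 2))   # points to be surveilled
      SCALE = 0.25                                    # assumed sensing length scale

      def mean_miss_probability(x):
          # Gaussian detection model: each sensor detects a target with
          # probability exp(-d^2 / SCALE^2); minimise the mean miss probability.
          sensors = x.reshape(-1, 2)
          d2 = ((targets[:, None, :] - sensors[None, :, :]) ** 2).sum(axis=2)
          return np.prod(1.0 - np.exp(-d2 / SCALE ** 2), axis=1).mean()

      bounds = [(0.0, 1.0)] * 6                       # (x, y) for each of 3 sensors

      # Stage 1: global evolutionary search over the whole placement space.
      global_result = differential_evolution(mean_miss_probability, bounds, seed=1)
      # Stage 2: local refinement of the best candidate found globally.
      local_result = minimize(mean_miss_probability, global_result.x, method="Nelder-Mead")

      print("detection probability, global stage :", round(1.0 - global_result.fun, 3))
      print("detection probability, after refining:", round(1.0 - local_result.fun, 3))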

  19. Optimisation of Sintering Factors of Titanium Foams Using Taguchi Method

    Directory of Open Access Journals (Sweden)

    S. Ahmad

    2010-06-01

    Metal foams have the potential to be used in the production of bipolar plates in Polymer Electrolyte Membrane Fuel Cells (PEMFC). In this paper, pure titanium was used to prepare titanium foam using the slurry method. The electrical conductivity is the most important parameter to be considered in the production of good bipolar plates. To achieve a high conductivity of the titanium foam, the effects of various parameters including temperature, time profile and composition have to be characterised and optimised. This paper reports the use of the Taguchi method in optimising the processing parameters of pure titanium foams. The effects of four sintering factors, namely composition, sintering temperature, heating rate and soaking time, on the electrical conductivity have been studied. The titanium slurry was prepared by mixing titanium alloy powder, polyethylene glycol (PEG), methylcellulose and water. Polyurethane (PU) foams were then impregnated into the slurry and later dried at room temperature. These were next sintered in a high temperature vacuum furnace. The various factors were assigned to an L9 orthogonal array. From the Analysis of Variance (ANOVA), the composition of the titanium powder has the highest percentage contribution (24.51%) to the electrical conductivity, followed by the heating rate (10.29%). The optimum electrical conductivity was found to be 1336.227 ± 240.61 S/cm for this titanium foam. It was achieved with a 70% composition of titanium, a sintering temperature of 1200 °C, a heating rate of 0.5 °C/min and a 2 hour soaking time. Confirmatory experiments have produced results that lie within the 90% confidence interval.
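
    The main-effects and percentage-contribution calculation behind such a Taguchi study can be sketched as below for a standard L9(3^4) array; the response values are invented and are not the conductivities reported in the paper.

      import numpy as np

      # Standard L9(3^4) orthogonal array: 9 runs, 4 factors at 3 levels (0, 1, 2).
      L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
                     [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
                     [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])
      FACTORS = ["composition", "sintering temperature", "heating rate", "soaking time"]

      # Hypothetical conductivity responses (S/cm) for the nine runs.
      response = np.array([820.0, 900.0, 1010.0, 950.0, 1120.0, 880.0, 1290.0, 1000.0, 1150.0])

      grand_mean = response.mean()
      total_ss = ((response - grand_mean) ** 2).sum()

      for column, name in enumerate(FACTORS):
          level_means = np.array([response[L9[:, column] == level].mean() for level in range(3)])
          factor_ss = 3.0 * ((level_means - grand_mean) ** 2).sum()   # 3 runs per level
          print(f"{name:22s} level means {np.round(level_means, 1)}"
                f"  contribution {100.0 * factor_ss / total_ss:5.1f}%")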

  20. Ants Colony Optimisation of a Measuring Path of Prismatic Parts on a CMM

    Directory of Open Access Journals (Sweden)

    Stojadinovic Slavenko M.

    2016-03-01

    This paper presents the optimisation of a measuring probe path for inspecting prismatic parts on a CMM. The optimisation model is based on: (i) a mathematical model that establishes an initial collision-free path represented by a set of points, and (ii) the solution of the Travelling Salesman Problem (TSP) obtained with Ant Colony Optimisation (ACO). In order to solve the TSP, an ACO algorithm that aims to find the shortest path of ant colony movement (i.e. the optimised path) is applied. Then, the optimised path is compared with the measuring path obtained with online programming on a CMM ZEISS UMM500 and with the measuring path obtained in the CMM inspection module of Pro/ENGINEER® software. The results of comparing the optimised path with the other two generated paths show that the optimised path is at least 20% shorter than the path obtained by on-line programming on the CMM ZEISS UMM500, and at least 10% shorter than the path obtained by using the CMM module in Pro/ENGINEER®.
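
    A minimal ant colony optimisation loop for a small TSP instance over hypothetical probe points is sketched below; the pheromone parameters and point set are illustrative assumptions rather than the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(42)
      points = rng.uniform(0.0, 100.0, size=(12, 2))   # hypothetical probe points
      n = len(points)
      dist = np.linalg.norm(points[:, None] - points[None, :], axis=2)

      def tour_length(tour):
          return sum(dist[tour[i], tour[(i + 1) % n]] for i in range(n))

      def ant_colony(n_ants=20, iterations=100, alpha=1.0, beta=3.0, rho=0.5, q=100.0):
          pheromone = np.ones((n, n))
          best_tour, best_length = None, np.inf
          for _ in range(iterations):
              tours = []
              for _ in range(n_ants):
                  tour = [int(rng.integers(n))]
                  unvisited = set(range(n)) - {tour[0]}
                  while unvisited:
                      here, candidates = tour[-1], list(unvisited)
                      # Transition rule: pheromone^alpha * (1/distance)^beta.
                      weights = (pheromone[here, candidates] ** alpha
                                 * dist[here, candidates] ** -beta)
                      nxt = int(rng.choice(candidates, p=weights / weights.sum()))
                      tour.append(nxt)
                      unvisited.remove(nxt)
                  tours.append(tour)
              pheromone *= (1.0 - rho)                 # evaporation
              for tour in tours:
                  length = tour_length(tour)
                  if length < best_length:
                      best_tour, best_length = tour, length
                  for i in range(n):                   # deposit on the edges used
                      a, b = tour[i], tour[(i + 1) % n]
                      pheromone[a, b] += q / length
                      pheromone[b, a] += q / length
          return best_tour, best_length

      tour, length = ant_colony()
      print("best tour length:", round(length, 1))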

  1. Optimisation of culture parameters for exopolysaccharides production by the microalga Rhodella violacea.

    Science.gov (United States)

    Villay, Aurore; Laroche, Céline; Roriz, Diane; El Alaoui, Hicham; Delbac, Frederic; Michaud, Philippe

    2013-10-01

    A unicellular Rhodophyte was identified, by sequencing of its 18S rRNA encoding gene, as belonging to the species Rhodella violacea. With the objective of optimising the production of biomass and exopolysaccharide by this strain, the effects of irradiance, pH and temperature on its photosynthetic activity were investigated. Subsequently, a stoichiometric study of the well-known f/2 medium led to its supplementation with N and P to increase biomass, and hence exopolysaccharide yields, when the strain was cultivated in photobioreactors. The use of optimal culture conditions (irradiance of 420 μE/m²/s, pH of 8.3 and temperature of 24 °C) and the supplemented f/2 medium led to significant increases in biomass and exopolysaccharide production. The structural characterisation of the produced exopolysaccharide revealed that it was sulphated and mainly composed of xylose. The different culture conditions and culture media tested had no significant impact on the structure of the produced exopolysaccharides.

  2. Thickness Optimisation of Textiles Subjected to Heat and Mass Transport during Ironing

    Directory of Open Access Journals (Sweden)

    Korycki Ryszard

    2016-09-01

    The coupled problem during ironing of textiles is analysed, that is, heat is transported with mass, whereas mass transport driven by heat is negligible. It is necessary to define both physical and mathematical models. Introducing a two-phase system of mass sorption by fibres, the transport equations are formulated and accompanied by a set of boundary and initial conditions. The optimisation of material thickness during ironing is gradient oriented. The first-order sensitivity of an arbitrary objective functional is analysed and included in the optimisation procedure. A numerical example presents the thickness optimisation of different textile materials in an ironing device.

  3. Distributed convex optimisation with event-triggered communication in networked systems

    Science.gov (United States)

    Liu, Jiayun; Chen, Weisheng

    2016-12-01

    This paper studies the distributed convex optimisation problem over directed networks. Motivated by practical considerations, we propose a novel distributed zero-gradient-sum optimisation algorithm with event-triggered communication. Communication and control updates therefore occur only at discrete instants when a predefined condition is satisfied. Thus, compared with time-driven distributed optimisation algorithms, the proposed algorithm has the advantages of lower energy consumption and lower communication cost. Based on Lyapunov approaches, we show that the proposed algorithm makes the system states converge to the solution of the problem exponentially fast, and that Zeno behaviour is excluded. Finally, a simulation example is given to illustrate the effectiveness of the proposed algorithm.

  4. Combined Optimisation of Indoor Environment and Energy Consumptionusing the Eco-factor

    DEFF Research Database (Denmark)

    Brohus, Henrik

    2006-01-01

    The paper presents a combined optimisation of the indoor environment and the energy consumption using the Eco-factor. It enables environmental assessment of different energy sources and techniques in the design and planning of energy efficient buildings with low environmental impact and desired indoor comfort. A typical office building is selected for further investigation and several solutions are investigated and optimised. It is found that the combined optimisation of indoor environment and energy consumption may influence otherwise obvious solutions significantly. Furthermore, it is found...

  5. Interventions to optimise prescribing for older people in care homes.

    Science.gov (United States)

    Alldred, David P; Raynor, David K; Hughes, Carmel; Barber, Nick; Chen, Timothy F; Spoor, Pat

    2013-02-28

    There is a substantial body of evidence that prescribing for care home residents is suboptimal and requires improvement. Consequently, there is a need to identify effective interventions to optimise prescribing and resident outcomes in this context. The objective of the review was to determine the effect of interventions to optimise prescribing for older people living in care homes. We searched the Cochrane Effective Practice and Organisation of Care (EPOC) Group Specialised Register; Cochrane Central Register of Controlled Trials (CENTRAL), The Cochrane Library (Issue 11, 2012); Cochrane Database of Systematic Reviews, The Cochrane Library (Issue 11, 2012); MEDLINE OvidSP (1980 on); EMBASE, OvidSP (1980 on); Ageline, EBSCO (1966 on); CINAHL, EBSCO (1980 on); International Pharmaceutical Abstracts, OvidSP (1980 on); PsycINFO, OvidSP (1980 on); conference proceedings in Web of Science, Conference Proceedings Citation Index - SSH & Science, ISI Web of Knowledge (1990 on); grey literature sources and trial registries; and contacted authors of relevant studies. We also reviewed the reference lists of included studies and related reviews (search period November 2012). We included randomised controlled trials evaluating interventions aimed at optimising prescribing for older people (aged 65 years or older) living in institutionalised care facilities. Studies were included if they measured one or more of the following primary outcomes: adverse drug events; hospital admissions; mortality; or secondary outcomes: quality of life (using validated instrument); medication-related problems; medication appropriateness (using validated instrument); medicine costs. Two authors independently screened titles and abstracts, assessed studies for eligibility, assessed risk of bias and extracted data. A narrative summary of results was presented. The eight included studies involved 7653 residents in 262 (range 1 to 85) care homes in six countries. Six studies were cluster...

  6. Modelling and Optimising TinyTP over IrDA Stacks

    Directory of Open Access Journals (Sweden)

    Boucouvalas A. C.

    2005-01-01

    TinyTP is the IrDA transport layer protocol for indoor infrared communications. For the first time, this paper presents a mathematical model for TinyTP over the IrDA protocol stacks taking into account the presence of bit errors. Based on this model, we carry out a comprehensive optimisation study to improve system performance at the transport layer. Four major parameters are optimised for maximum throughput including TinyTP receiver window, IrLAP window and frame size, as well as IrLAP turnaround time. Equations are derived for the optimum IrLAP window and frame sizes. Numerical results show that the system throughput is significantly improved by implementing the optimised parameters. The major contribution of this work is the modelling of TinyTP including the low-layer protocols and optimisation of the overall throughput by appropriate parameter selection.
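
    The basic trade-off behind such an optimisation, not the paper's TinyTP/IrLAP model itself, can be sketched numerically: larger frames amortise the per-frame overhead but are more likely to be corrupted by a bit error, so an intermediate frame size maximises the expected goodput. The bit error rate, overhead and link rate below are assumed values.

      import numpy as np

      def expected_goodput(frame_bytes, ber=1e-5, overhead_bytes=8, rate_bps=4_000_000):
          # A frame is useful only if every bit arrives intact; larger frames
          # amortise the overhead but fail more often. All numbers are assumed.
          total_bits = 8 * (frame_bytes + overhead_bytes)
          p_frame_ok = (1.0 - ber) ** total_bits
          frame_time = total_bits / rate_bps            # seconds per attempt
          return 8 * frame_bytes * p_frame_ok / frame_time

      frame_sizes = np.arange(64, 16384, 64)
      goodputs = np.array([expected_goodput(s) for s in frame_sizes])
      best = int(frame_sizes[int(goodputs.argmax())])
      print("optimum frame size:", best, "bytes,",
            round(float(goodputs.max()) / 1e6, 2), "Mbit/s goodput")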

  7. Plant-wide dynamic and static optimisation of supermarket refrigeration systems

    DEFF Research Database (Denmark)

    Green, Torben; Izadi-Zamanabadi, Roozbeh; Razavi-Far, Roozbeh;

    2013-01-01

    Optimising the operation of a supermarket refrigeration system under dynamic as well as steady-state conditions is addressed in this paper. For this purpose an appropriate performance function that encompasses food quality, system efficiency, and also component reliability is established. The choice...

  8. Large scale three-dimensional topology optimisation of heat sinks cooled by natural convection

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Sigmund, Ole; Aage, Niels

    2016-01-01

    This work presents the application of density-based topology optimisation to the design of three-dimensional heat sinks cooled by natural convection. The governing equations are the steady-state incompressible Navier-Stokes equations coupled to the thermal convection-diffusion equation through... and several optimised designs are presented for Grashof numbers between 10³ and 10⁶. Interestingly, it is observed that the number of branches in the optimised design increases with increasing Grashof numbers, which is opposite to two-dimensional topology optimised designs. Furthermore, the obtained topologies verify prior conclusions regarding fin length/thickness ratios and Biot numbers, but also indicate that carefully tailored and complex geometries may improve cooling behaviour considerably compared to simple heat fin geometries.

  9. APPLICATION OF THE MULTIRESPONSE OPTIMISATION SIMPLEX METHOD TO THE BIODIESEL - B100 OBTAINING PROCESS

    Directory of Open Access Journals (Sweden)

    Julyane Karolyne Teixeira da Costa

    2016-03-01

    The process of obtaining B100 biodiesel from vegetable oil and animal fat mixtures by transesterification under basic conditions was optimised using the super-modified simplex method. For the simultaneous optimisation, yield, cost, oxidative stability and Cold Filter Plugging Point (CFPP) were used as responses, and the limits were established according to the experimental data and the conformity parameters established by legislation. Based on the predictive equations obtained from the simplex-centroid design-coupled functions, the multi-response optimisation showed an optimal formulation containing 38.34 % soybean oil, 21.90 % beef tallow and 39.25 % poultry fat. The validation showed that there are no significant differences between the predicted and experimental values. The simplex-centroid mixture design and simplex optimisation methods were effective tools for obtaining biodiesel B100 using a mixture of different raw materials.

  10. Fault diagnosis based on support vector machines with parameter optimisation by artificial immunisation algorithm

    Science.gov (United States)

    Yuan, Shengfa; Chu, Fulei

    2007-04-01

    Support vector machines (SVM) are a general machine-learning tool based on the structural risk minimisation principle that exhibits good generalisation when fault samples are few; they are especially well suited to classification, forecasting and estimation in small-sample cases such as fault diagnosis. However, some parameters in SVM have traditionally been selected based on experience, which has hampered its efficiency in practical applications. In this paper, the artificial immunisation algorithm (AIA) is used to optimise the parameters in SVM. The AIA is a new optimisation method based on the biological immune principle of humans and other living beings. It can effectively avoid premature convergence and guarantees the diversity of solutions. With the parameters optimised by AIA, the overall capability of the SVM classifier is improved. The fault diagnosis of a turbo pump rotor shows that the SVM optimised by AIA gives higher recognition accuracy than the standard SVM.
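
    The immune-inspired search itself is not reproduced here; the sketch below only shows the surrounding idea of tuning the SVM kernel parameters (C, gamma) against cross-validated accuracy, using a plain random search as a stand-in and a synthetic data set in place of the turbo pump features (scikit-learn is assumed to be available).

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      # Synthetic stand-in for vibration-feature fault data (3 fault classes).
      X, y = make_classification(n_samples=300, n_features=10, n_informative=6,
                                 n_classes=3, n_clusters_per_class=1, random_state=0)

      rng = np.random.default_rng(0)
      best_score, best_params = -np.inf, None

      # Random search over (C, gamma) as a simple stand-in for the immune-inspired
      # search; each candidate is scored by 5-fold cross-validated accuracy.
      for _ in range(40):
          C = 10.0 ** rng.uniform(-2, 3)
          gamma = 10.0 ** rng.uniform(-4, 1)
          score = cross_val_score(SVC(C=C, gamma=gamma, kernel="rbf"), X, y, cv=5).mean()
          if score > best_score:
              best_score, best_params = score, (C, gamma)

      print("best C, gamma:", tuple(round(p, 4) for p in best_params),
            "cross-validated accuracy:", round(best_score, 3))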

  11. The use of a genetic algorithm in optical thin film design and optimisation

    Directory of Open Access Journals (Sweden)

    Efrem K. Ejigu

    2010-07-01

    We used a genetic algorithm in the design and optimisation of optical thin films and, in this paper, present the effects of the choice of variables, refractive index and optical thickness, in both applications of this algorithm. The Fourier transform optical thin film design method was used to create a starting population, which was later optimised by the genetic algorithm. In the genetic algorithm design application, the effect of the choice of variable was not distinct, as it depended on the type of design specification. In the genetic algorithm optimisation application, the choice of refractive index as a variable showed better performance than that of optical thickness. The results of this study indicate that a genetic algorithm is more effective in the design application than in the optimisation application of optical thin film synthesis.

  12. Muscle forces during running predicted by gradient-based and random search static optimisation algorithms.

    Science.gov (United States)

    Miller, Ross H; Gillette, Jason C; Derrick, Timothy R; Caldwell, Graham E

    2009-04-01

    Muscle forces during locomotion are often predicted using static optimisation and sequential quadratic programming (SQP). SQP has been criticised for over-estimating force magnitudes and under-estimating co-contraction. These problems may be related to SQP's difficulty in locating the global minimum of complex optimisation problems. Algorithms designed to locate the global minimum may be useful in addressing these problems. Muscle forces for 18 flexors and extensors of the lower extremity were predicted for 10 subjects during the stance phase of running. Static optimisation using SQP and two random search (RS) algorithms (a genetic algorithm and simulated annealing) estimated muscle forces by minimising the sum of cubed muscle stresses. The RS algorithms predicted smaller peak forces (42% smaller on average) and smaller muscle impulses (46% smaller on average) than SQP, and located solutions with smaller cost function scores. The results suggest that RS may be a more effective tool than SQP for minimising the sum of cubed muscle stresses in static optimisation.
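
    The static optimisation problem named in the abstract, minimising the sum of cubed muscle stresses subject to a joint-moment equilibrium constraint, can be sketched with SciPy's SLSQP solver as below; the muscle areas, moment arms and required moment are invented for a two-muscle toy case.

      import numpy as np
      from scipy.optimize import minimize

      # Hypothetical two-muscle example: cross-sectional areas (cm^2), moment
      # arms (m) and a required net joint moment (N m).
      AREAS = np.array([20.0, 35.0])
      MOMENT_ARMS = np.array([0.04, 0.05])
      REQUIRED_MOMENT = 60.0

      def cost(forces):
          # Sum of cubed muscle stresses, the criterion named in the abstract.
          return np.sum((forces / AREAS) ** 3)

      constraints = [{"type": "eq", "fun": lambda f: MOMENT_ARMS @ f - REQUIRED_MOMENT}]
      bounds = [(0.0, None)] * len(AREAS)        # muscle forces cannot be negative
      x0 = np.full(len(AREAS), 300.0)            # rough initial guess

      result = minimize(cost, x0, method="SLSQP", bounds=bounds, constraints=constraints)
      print("muscle forces (N):", np.round(result.x, 1),
            "net moment (N m):", round(float(MOMENT_ARMS @ result.x), 2))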

  13. Optimising network flow for cost- and value- efficient operation of the supplier-to-foundry system

    Directory of Open Access Journals (Sweden)

    A. Smoliński

    2007-04-01

    Skillful control of network flow, which creates a real bridge between the supplier and the user, is one of the most important conditions for cost-efficient operation of an enterprise, a foundry shop included. This paper describes modern principles of network optimisation for better distribution of the moulding sand, using modern methods of operational research and the commonly available Excel spreadsheet equipped with the optimisation tool called Solver.

  14. Brief Cognitive Behavioural Therapy Compared to Optimised General Practitioners' Care for Depression: A Randomised Trial

    OpenAIRE

    Schene, A.H.; Baas, K.D.; Koeter, M; Lucassen, P.; Bockting, C.L.H.; Wittkampf, K. F.; van Weert, H.C.; Huyser, J.

    2014-01-01

    Background: How should Major Depressive Disorder (MDD) be treated in primary care? Studies that compared (brief) Cognitive Behavioural Therapy (CBT) with care as usual by the General Practitioner (GP) found the former to be more effective. However, to make a fair comparison, GP care should be optimised and protocolised according to current evidence-based guidelines for depression. So far this has not been the case. We studied whether a protocolised 8-session CBT is more effective than optimised and prot...

  15. Advanced treatment planning using direct 4D optimisation for pencil-beam scanned particle therapy

    Science.gov (United States)

    Bernatowicz, Kinga; Zhang, Ye; Perrin, Rosalind; Weber, Damien C.; Lomax, Antony J.

    2017-08-01

    We report on the development of a new four-dimensional (4D) optimisation approach for scanned proton beams, which incorporates both irregular motion patterns and the delivery dynamics of the treatment machine into the plan optimiser. Furthermore, we assess the effectiveness of this technique in reducing dose to critical structures in proximity to moving targets, while maintaining effective target dose homogeneity and coverage. The proposed approach has been tested using both a simulated phantom and a clinical liver cancer case, and allows for realistic 4D calculations and optimisation using irregular breathing patterns extracted from e.g. 4DCT-MRI (4D computed tomography-magnetic resonance imaging). 4D dose distributions resulting from our 4D optimisation can achieve almost the same quality as static plans, independent of the studied geometry/anatomy or selected motion (regular and irregular). Additionally, the current implementation of the 4D optimisation approach requires less than 3 min to find the solution for a single field planned on the 4DCT of a liver cancer patient. Although 4D optimisation allows for realistic calculations using irregular breathing patterns, it is very sensitive to variations from the planned motion. Based on a sensitivity analysis, target dose homogeneity comparable to static plans (D5-D95 < 5%) has been found only for differences in amplitude of up to 1 mm, for changes in respiratory phase < 200 ms and for changes in the breathing period of < 20 ms in comparison to the motions used during optimisation. As such, methods to robustly deliver 4D optimised plans employing 4D intensity-modulated delivery are discussed.

  16. Pre-segmented 2-Step IMRT with subsequent direct machine parameter optimisation – a planning study

    Directory of Open Access Journals (Sweden)

    Flentje Michael

    2008-11-01

    Background: Modern intensity modulated radiotherapy (IMRT) mostly uses iterative optimisation methods. The integration of machine parameters into the optimisation process of step-and-shoot leaf positions has been shown to be successful. For IMRT segmentation algorithms based on the analysis of the geometrical structure of the planning target volumes (PTV) and the organs at risk (OAR), the potential of such procedures has not yet been fully explored. In this work, 2-Step IMRT was combined with subsequent direct machine parameter optimisation (DMPO; RaySearch Laboratories, Sweden) to investigate this potential. Methods: In a planning study, DMPO on a commercial planning system was compared with manual primary 2-Step IMRT segment generation followed by DMPO optimisation. 15 clinical cases and the ESTRO Quasimodo phantom were employed. Both the same number of optimisation steps and the same set of objective values were used. The plans were compared with a clinical DMPO reference plan and a traditional IMRT plan based on fluence optimisation and subsequent segmentation. The composite objective value (the weighted sum of quadratic deviations of the objective values and the related points in the dose volume histogram) was used as a measure of plan quality. Additionally, a more extended set of parameters was used for the breast cases to compare the plans. Results: The plans with segments pre-defined with 2-Step IMRT were slightly superior to DMPO alone in the majority of cases. The composite objective value tended to be even lower for a smaller number of segments. The total number of monitor units was slightly higher than for the DMPO plans. Traditional IMRT fluence optimisation with subsequent segmentation could not compete. Conclusion: 2-Step IMRT segmentation is suitable as a starting point for further DMPO optimisation and, in general, results in less complex plans which are equal or superior to plans generated by DMPO alone.

  17. Contribution à l'optimisation de la conduite des procédés alimentaires

    OpenAIRE

    Olmos-Perez, Alejandra

    2003-01-01

    The main objective of this work is the development of a methodology to calculate the optimal operating conditions applicable to food process control. In the first part of this work, we developed an optimisation strategy in two stages. Firstly, the optimisation problem is constructed. Afterwards, a feasible optimisation method is chosen to solve the problem. This choice is made through a decision diagram, which proposes a deterministic (sequential quadratic programming, SQP), a stochastic ...

  18. An Optimisation Approach Applied to Design the Hydraulic Power Supply for a Forklift Truck

    DEFF Research Database (Denmark)

    Pedersen, Henrik Clemmensen; Andersen, Torben Ole; Hansen, Michael Rygaard

    2004-01-01

    -level optimisation approach, and is in the current paper exemplified through the design of the hydraulic power supply for a forklift truck. The paper first describes the prerequisites for the method and then explains the different steps in the approach to design the hydraulic system. Finally the results of the optimisation example for the forklift truck are presented along with a discussion of the method.

  19. PHYSICAL-MATHEMATICAL SCIENCE MECHANICS SIMULATION CHALLENGES IN OPTIMISING THEORETICAL METAL CUTTING TASKS

    Directory of Open Access Journals (Sweden)

    Rasul V. Guseynov

    2017-01-01

    Objectives: In the article, problems in optimising machining operations, which provide end-unit production of the required quality with a minimum processing cost, are addressed. Methods: Increasing the effectiveness of experimental research was achieved through the use of mathematical methods for planning experiments for optimising metal cutting tasks. The minimal processing cost model, in which the objective function is polynomial, is adopted as the criterion for the selection of optimal parameters. Results: Polynomial models of the influence of the angles φ, α, γ on the torque applied when cutting threads in various steels are constructed. Optimum values of the geometrical tool parameters were obtained using the criterion of minimum cutting forces during processing. The high stability of tools having optimal geometric parameters is determined. It is shown that the use of experimental planning methods allows the optimisation of cutting parameters. In optimising solutions to metal cutting problems, it is found to be expedient to use multifactor experimental planning methods and to select the cutting force as the optimisation parameter when determining tool geometry. Conclusion: The joint use of geometric programming and experiment planning methods in order to optimise the cutting parameters significantly increases the efficiency of technological metal processing approaches.

  20. Impact of Battery Ageing on an Electric Vehicle Powertrain Optimisation

    Directory of Open Access Journals (Sweden)

    Daniel J. Auger

    2014-12-01

    An electric vehicle’s battery is its most expensive component, and it cannot be charged and discharged indefinitely. This affects a consumer vehicle’s end-user value. Ageing is tolerated as an unwanted operational side-effect; manufacturers have little control over it. Recent publications have considered trade-offs between efficiency and ageing in plug-in hybrids (PHEVs), but there is no equivalent literature for pure EVs. For PHEVs, battery ageing has been modelled by translating current demands into chemical degradation. Given such models, it is possible to produce similar trade-offs for EVs. We consider the effects of varying battery size and introducing a parallel supercapacitor pack. (Supercapacitors can smooth current demands, but their weight and electronics reduce economy.) We extend existing EV optimisation techniques to include battery ageing, illustrated with vehicle case studies. We comment on the applicability to similar EV problems and identify where additional research is needed to improve on our assumptions.

  1. Enhanced selective metal adsorption on optimised agroforestry waste mixtures.

    Science.gov (United States)

    Rosales, Emilio; Ferreira, Laura; Sanromán, M Ángeles; Tavares, Teresa; Pazos, Marta

    2015-04-01

    The aim of this work is to ascertain the potential of different agroforestry wastes to be used as biosorbents in the removal of a mixture of heavy metals. Fern (FE), rice husk (RI) and oak leaves (OA) presented the best removal percentages for Cu(II) and Ni(II), Mn(II) and Zn(II), and Cr(VI), respectively. The performance of a mixture of these three biosorbents was evaluated, and an improvement of 10% in the overall removal was obtained (19.25 mg/g). The optimum mixture proportions were determined using the simplex-centroid mixture design method (FE:OA:RI = 50:13.7:36.3). The adsorption kinetics and isotherms of the optimised mixture were fitted by the pseudo-first-order kinetic model and the Langmuir isotherm. The adsorption mechanism was studied, and the effects of the carboxylic, hydroxyl and phenolic groups on metal-biomass binding were demonstrated. Finally, the recovery of the metals from the biomass was investigated, and cationic metal recoveries of 100% were achieved when acidic solutions were used.
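
    The Langmuir fit mentioned in the abstract can be reproduced in outline with SciPy's curve_fit, as in the sketch below; the equilibrium data points are invented and are not the study's measurements.

      import numpy as np
      from scipy.optimize import curve_fit

      def langmuir(ce, q_max, k_l):
          # Adsorbed amount qe as a function of the equilibrium concentration ce.
          return q_max * k_l * ce / (1.0 + k_l * ce)

      # Hypothetical equilibrium data (ce in mg/L, qe in mg/g).
      ce = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])
      qe = np.array([4.1, 7.3, 12.8, 16.0, 18.2, 19.1])

      (q_max, k_l), _ = curve_fit(langmuir, ce, qe, p0=[20.0, 0.05])
      print(f"fitted q_max = {q_max:.2f} mg/g, K_L = {k_l:.4f} L/mg")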

  2. Optimising functional properties during preparation of cowpea protein concentrate.

    Science.gov (United States)

    Mune Mune, Martin Alain; Minka, Samuel René; Mbome, Israël Lape

    2014-07-01

    Response surface methodology (RSM) was used for the modelling and optimisation of protein extraction parameters in order to obtain a protein concentrate with high functional properties. A central composite rotatable design of experiments was used to investigate the effects of two factors, namely pH and NaCl concentration, on six responses: water solubility index (WSI), water absorption capacity (WAC), oil holding capacity (OHC), emulsifying activity (EA), emulsifying stability (ES) and foam ability (FA). The results of the analysis of variance (ANOVA) and correlation showed that the second-order polynomial model was appropriate to fit the experimental data. The optimum condition was pH 8.43 and NaCl concentration 0.25 M, and under this condition WSI was ⩾17.20%, WAC ⩾383.62%, OHC ⩾1.75 g/g, EA ⩾0.15, ES ⩾19.76 min and FA ⩾66.30%. The suitability of the model employed was confirmed by the agreement between the experimental and predicted values for the functional properties.
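
    A minimal second-order response-surface fit in the two factors (pH and NaCl concentration) can be written as an ordinary least-squares problem, as sketched below; the design points and responses are invented, not the study's data.

      import numpy as np

      # Hypothetical central-composite-style design points and one response
      # (water solubility index, %); the numbers are illustrative only.
      ph   = np.array([6.0, 6.0, 10.0, 10.0, 5.2, 10.8, 8.0, 8.0, 8.0])
      nacl = np.array([0.10, 0.40, 0.10, 0.40, 0.25, 0.25, 0.04, 0.46, 0.25])
      wsi  = np.array([11.2, 12.0, 14.5, 13.8, 10.9, 15.1, 13.0, 14.2, 17.3])

      # Second-order model: b0 + b1*pH + b2*C + b12*pH*C + b11*pH^2 + b22*C^2.
      X = np.column_stack([np.ones_like(ph), ph, nacl, ph * nacl, ph ** 2, nacl ** 2])
      coeffs, *_ = np.linalg.lstsq(X, wsi, rcond=None)

      def predict(p, c):
          return float(coeffs @ np.array([1.0, p, c, p * c, p ** 2, c ** 2]))

      print("coefficients:", np.round(coeffs, 3))
      print("predicted WSI at pH 8.43, 0.25 M NaCl:", round(predict(8.43, 0.25), 2))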

  3. Load optimised piezoelectric generator for powering battery-less TPMS

    Science.gov (United States)

    Blažević, D.; Kamenar, E.; Zelenika, S.

    2013-05-01

    The design of a piezoelectric device aimed at harvesting the kinetic energy of random vibrations on a vehicle's wheel is presented. The harvester is optimised for powering a Tire Pressure Monitoring System (TPMS). On-road experiments are performed in order to measure the frequencies and amplitudes of wheel vibrations. It is hence determined that the highest amplitudes occur in an aperiodic manner. Initial tests of the battery-less TPMS are performed in laboratory conditions where tuning and system set-up optimisation are achieved. The energy obtained from the piezoelectric bimorph is managed by the control electronics, which convert the AC voltage to DC and condition the output voltage to make it compatible with the load (i.e. sensor electronics and transmitter). The control electronics also manage the sleep/measure/transmit cycles so that the harvested energy is used efficiently. The system is finally tested in real on-road conditions, successfully powering the pressure sensor and transmitting the data to a receiver in the car cockpit.

  4. Optimisation of Marine Boilers using Model-based Multivariable Control

    DEFF Research Database (Denmark)

    Solberg, Brian

    Traditionally, marine boilers have been controlled using classical single loop controllers. To optimise marine boiler performance, reduce new installation time and minimise the physical dimensions of these large steel constructions, a more comprehensive and coherent control strategy is needed. ... In the thesis the pressure control is based on this new method when on/off burner switching is required, while the water level control is handled by a model predictive controller. This research deals with the application of advanced control to a specific class of marine boilers, combining well-known design methods for multivariable systems. This thesis presents contributions for modelling and control of one-pass smoke tube marine boilers as well as for hybrid systems control. Much of the focus has been directed towards water level control, which is complicated by the nature of the disturbances acting on the system as well as by low frequency sensor noise. This focus was motivated by an estimated large potential to minimise the boiler geometry by reducing water level fluctuations...

  5. Process development and optimisation of lactic acid purification using electrodialysis.

    Science.gov (United States)

    Madzingaidzo, L; Danner, H; Braun, R

    2002-07-03

    Cell free sodium lactate solutions were subjected to purification based on mono- and bi-polar electrodialysis. Lactate concentration in the product stream increased to a maximum of 15% during mono-polar electrodialysis. Stack energy consumption averaged 0.6 kW h kg(-1) lactate transported at current efficiencies in the 90% range. Under optimum feed concentration (125 g l(-1)) and process conditions (auto-current mode with conductivity setpoints of minimum 5 and maximum 40 mS cm(-1)), lactate flux reached 300 g m(-2) h(-1) and water flux were low for mono-polar electrodialysis averaging 0.3 kg H(2)O per M lactate transported. Glucose in the concentrate stream solutions was reduced to concentrate stream solutions. After mono-polar electrodialysis, the concentrated sodium lactate solutions were further purified using bi-polar electrodialysis. Water transport during bi-polar electrodialysis reached figures of 0.070 - 0.222 kg H(2)O per M lactate. Free lactic acid concentration reached 16% with lactate flux of up to 300 g m(-2) h(-1). Stack energy consumption ranged from 0.6 to 1 kW h per kg lactate. Under optimised process conditions current efficiency during bi-polar electrodialysis was consistently around 90%. Glucose was further reduced from 2 to solution. Acetic acid impurity remained at around 1 g l(-1). Significant reduction in colour and minerals in the product streams was observed during electrodialysis purification.

  6. Optics Design and Lattice Optimisation for the HL-LHC

    CERN Document Server

    Holzer, B J; Fartoukh, S; Chancé, A; Dalena, B; Payet, J; Bogomyagkov, A; Appleby, R B; Kelly, S; Thomas, M B; Thompson, L; Korostelev, M; Hock, K M; Wolski, A; Milardi, C; Faus-Golfe, A; Resta Lopez, J

    2013-01-01

    The luminosity upgrade project of the LHC collider at CERN is based on a strong focusing scheme to reach the lowest values of the β function at the collision points. Depending on the magnet technology (Nb3Sn or Nb-Ti) that will be available, a number of beam optics versions have been developed to define the specifications for the new superconducting magnets. In the context of the optics matching, new issues have been addressed and new concepts have been used that play a major role in dealing with the extremely high beta functions. Quadrupole strength flexibility and chromatic corrections have been studied, the influence of the quadrupole fringe fields has been taken into account, and the lattice in the matching section has been optimised, including the needs of the crab cavities that will be installed. The transition between injection and low-β optics has to guarantee smooth gradient changes over a wide range of β* values, and the tolerances on misalignments and power converter ripple have been re-evaluated. Finally the succ...

  7. Metabolic approaches for the optimisation of recombinant fermentation processes.

    Science.gov (United States)

    Cserjan-Puschmann, M; Kramer, W; Duerrschmid, E; Striedner, G; Bayer, K

    1999-12-01

    The aim of this work was the establishment of a novel method to determine the metabolic load on host-cell metabolism resulting from recombinant protein production in Escherichia coli. This tool can be used to develop strategies to optimise recombinant fermentation processes through adjustment of recombinant-protein expression to the biosynthetic capacity of the host-cell. The signal molecule of the stringent-response network, guanosine tetraphosphate (ppGpp), and its precursor nucleotides were selected for the estimation of the metabolic load relating to recombinant-protein production. An improved analytical method for the quantification of nucleotides by ion-pair, high-performance liquid chromatography was established. The host-cell response upon overexpression of recombinant protein in fed-batch fermentations was investigated using the production of human superoxide dismutase (rhSOD) as a model system. E. coli strains with different recombinant systems (the T7 and pKK promoter system) exerting different loads on host-cell metabolism were analysed with regard to intracellular nucleotide concentration, rate of product formation and plasmid copy number.

  8. Parallelization Strategies for Ant Colony Optimisation on GPUs

    CERN Document Server

    Cecilia, Jose M; Ujaldon, Manuel; Nisbet, Andy; Amos, Martyn

    2011-01-01

    Ant Colony Optimisation (ACO) is an effective population-based meta-heuristic for the solution of a wide variety of problems. As a population-based algorithm, its computation is intrinsically massively parallel, and it is therefore theoretically well-suited for implementation on Graphics Processing Units (GPUs). The ACO algorithm comprises two main stages: Tour construction and Pheromone update. The former has been previously implemented on the GPU, using a task-based parallelism approach. However, up until now, the latter has always been implemented on the CPU. In this paper, we discuss several parallelisation strategies for both stages of the ACO algorithm on the GPU. We propose an alternative data-based parallelism scheme for Tour construction, which fits better on the GPU architecture. We also describe novel GPU programming strategies for the Pheromone update stage. Our results show a total speed-up exceeding 28x for the Tour construction stage, and 20x for Pheromone update, and suggest that ACO is a po...

  9. Optimising pharmacological maintenance treatment for COPD in primary care.

    Science.gov (United States)

    Jones, Rupert; Østrem, Anders

    2011-03-01

    Chronic obstructive pulmonary disease (COPD) is a multi-faceted disease that is a major cause of morbidity and mortality worldwide, and is a significant burden in terms of healthcare resource utilisation and cost. Despite the availability of national and international guidelines, and effective, well-tolerated pharmacological treatments, COPD remains substantially under-diagnosed and under-treated within primary care. As COPD is both preventable and treatable, there is an urgent need to raise the awareness and profile of the disease among primary care physicians and patients. Increasing evidence suggests that initiation of long-acting bronchodilator treatment at an early stage can significantly improve the patient's long-term health and quality of life (QoL). Recent large-scale trials in COPD have confirmed the long-term benefits of maintenance treatment with long-acting bronchodilators. A wide range of benefits have been shown in selected patient groups including improved lung function and QoL, reduced exacerbations and, in some studies, delayed disease progression and improved survival. In this review, we consider recent developments in our understanding of COPD, including current and emerging pharmacological treatment options, and identify steps for optimising early diagnosis and pharmacological treatment of COPD within the primary care environment.

  10. Optimisation of key performance measures in air cargo demand management

    Directory of Open Access Journals (Sweden)

    Alexander May

    2014-03-01

    This article sought to facilitate the optimisation of key performance measures utilised for demand management in air cargo operations. The focus was on the Revenue Management team at Virgin Atlantic Cargo and a fuzzy group decision-making method was used. Utilising intelligent fuzzy multi-criteria methods, the authors generated a ranking order of ten key outcome-based performance indicators for Virgin Atlantic air cargo Revenue Management. The result of this industry-driven study showed that for Air Cargo Revenue Management, ‘Network Optimisation’ represents a critical outcome-based performance indicator. This collaborative study contributes to existing logistics management literature, especially in the area of Revenue Management, and it seeks to enhance Revenue Management practice. It also provides a platform for Air Cargo operators seeking to improve reliability values for their key performance indicators as a means of enhancing operational monitoring power.

  11. Optimising rigid motion compensation for small animal brain PET imaging

    Science.gov (United States)

    Spangler-Bickell, Matthew G.; Zhou, Lin; Kyme, Andre Z.; De Laat, Bart; Fulton, Roger R.; Nuyts, Johan

    2016-10-01

    Motion compensation (MC) in PET brain imaging of awake small animals is attracting increased attention in preclinical studies since it avoids the confounding effects of anaesthesia and enables behavioural tests during the scan. A popular MC technique is to use multiple external cameras to track the motion of the animal’s head, which is assumed to be represented by the motion of a marker attached to its forehead. In this study we have explored several methods to improve the experimental setup and the reconstruction procedures of this method: optimising the camera-marker separation; improving the temporal synchronisation between the motion tracker measurements and the list-mode stream; post-acquisition smoothing and interpolation of the motion data; and list-mode reconstruction with appropriately selected subsets. These techniques have been tested and verified on measurements of a moving resolution phantom and brain scans of an awake rat. The proposed techniques improved the reconstructed spatial resolution of the phantom by 27% and of the rat brain by 14%. We suggest a set of optimal parameter values to use for awake animal PET studies and discuss the relative significance of each parameter choice.

  12. Production optimisation in the petrochemical industry by hierarchical multivariate modelling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Magnus; Furusjoe, Erik; Jansson, Aasa

    2004-06-01

    This project demonstrates the advantages of applying hierarchical multivariate modelling in the petrochemical industry in order to increase knowledge of the total process. The models indicate possible ways to optimise the process regarding the use of energy and raw material, which is directly linked to the environmental impact of the process. The refinery of Nynaes Refining AB (Goeteborg, Sweden) has acted as a demonstration site in this project. The models developed for the demonstration site resulted in: Detection of an unknown process disturbance and suggestions of possible causes; Indications on how to increase the yield in combination with energy savings; The possibility to predict product quality from on-line process measurements, making the results available at a higher frequency than customary laboratory analysis; Quantification of the gradually lowered efficiency of heat transfer in the furnace and increased fuel consumption as an effect of soot build-up on the furnace coils; Increased knowledge of the relation between production rate and the efficiency of the heat exchangers. This report is one of two reports from the project. It contains a technical discussion of the result with some degree of detail. A shorter and more easily accessible report is also available, see IVL report B1586-A.

  13. Optimisation of Graft Copolymerisation of Fibres from Banana Trunk

    Directory of Open Access Journals (Sweden)

    Richard Mpon

    2012-01-01

    Sheets from banana trunks were opened out and dried for several weeks in air. Pulp was obtained by the nitric acid process with a yield of 37.7%, while fibres were obtained according to the modified standard Japanese method for cellulose in wood for pulp (JIS 8007) with a yield of 65% with respect to oven-dried plant material. Single fibres obtained by the JIS method had an average diameter of 11.0 μm and a Young's modulus, tensile strength and strain at break of 7.05 GPa, 81.7 MPa and 5.2%, respectively. Modification of the fibres was carried out by grafting ethyl acrylate in the presence of ammonium cerium(IV) nitrate. Optimisation of the copolymerisation reaction conditions was studied by measuring the rate of conversion, the rate of grafting and the grafting efficiency. The results showed that at a low ceric ion concentration (0.04 M), at ambient temperature, after three hours and at a concentration of 0.2 M ethyl acrylate, maximum values of the cited parameters were obtained.

  14. On the Selection of MAC Optimised Routing Protocol for VANET

    Directory of Open Access Journals (Sweden)

    Kanu Priya

    2017-02-01

    In today's era of modernization, the concept of smart vehicles, smart cities and automated vehicles is trending day by day. VANET (Vehicular Adhoc Network) has also been emerging as a potential candidate to enable these smart applications. Though VANET is very similar to MANET (Mobile Adhoc Network), it faces more severe challenges than MANET due to hostile channel conditions and a high degree of mobility, so a lot of work related to the MAC and Network Layers needs attention from network designers. In this paper the MAC Layer has been optimised in terms of Queue Size using the QoS parameters Packet Collision Rate, Packet Drop Rate, Throughput Rate and Broadcast Rate. In doing so, simulative investigations have been carried out to find the optimum queue size. For this purpose various routing protocols, namely DSDV, AODV, ADV and GOD, have been considered and the optimum queue length for each of these has been obtained. Further, the most efficient routing protocol has also been identified. Moreover, this paper also compares the performance of the most efficient routing protocols selected, in terms of QoS parameters, for different MAC interfaces.

  15. Random matrix theory filters and currency portfolio optimisation

    Science.gov (United States)

    Daly, J.; Crane, M.; Ruskin, H. J.

    2010-04-01

    Random matrix theory (RMT) filters have recently been shown to improve the optimisation of financial portfolios. This paper studies the effect of three RMT filters on realised portfolio risk, using bootstrap analysis and out-of-sample testing. We considered the case of a foreign exchange and commodity portfolio, weighted towards foreign exchange, and consisting of 39 assets. This was intended to test the limits of RMT filtering, which is more obviously applicable to portfolios with larger numbers of assets. We considered both equally and exponentially weighted covariance matrices, and observed that, despite the small number of assets involved, RMT filters reduced risk in a way that was consistent with a much larger S&P 500 portfolio. The indicated exponential weighting showed good consistency with the value suggested by Riskmetrics, in contrast to previous results involving stocks. This decay factor, along with the low number of past moves preferred in the filtered, equally weighted case, displayed a trend towards models which were reactive to recent market changes. On testing portfolios with fewer assets, RMT filtering provided less or no overall risk reduction. In particular, no long-term out-of-sample risk reduction was observed for a portfolio consisting of 15 major currencies and commodities.
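
    As a concrete illustration of the general technique (not necessarily the three specific filters studied in the paper), the sketch below applies the common Marchenko-Pastur eigenvalue-clipping filter to a correlation matrix of synthetic returns for 39 assets and then forms global minimum-variance weights. All data and thresholds are illustrative.

```python
# Minimal sketch of one common RMT filter: eigenvalues of the correlation
# matrix lying inside the Marchenko-Pastur "noise band" are replaced by
# their average before the matrix is used for portfolio weighting.
import numpy as np

rng = np.random.default_rng(1)
T, N = 500, 39                        # observations, assets (39 as in the study)
returns = rng.standard_normal((T, N)) * 0.01

corr = np.corrcoef(returns, rowvar=False)
evals, evecs = np.linalg.eigh(corr)

q = T / N
lam_max = (1 + 1 / np.sqrt(q)) ** 2   # Marchenko-Pastur upper edge (unit variance)
noise = evals < lam_max
evals_filtered = evals.copy()
evals_filtered[noise] = evals[noise].mean()   # flatten the noise band, roughly keep the trace

corr_filtered = evecs @ np.diag(evals_filtered) @ evecs.T
np.fill_diagonal(corr_filtered, 1.0)

# Global minimum-variance weights from the filtered correlation matrix.
inv = np.linalg.inv(corr_filtered)
ones = np.ones(N)
w = inv @ ones / (ones @ inv @ ones)
print("largest absolute weight:", np.abs(w).max())
```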

  16. Genetic Algorithm Optimisation of PID Controllers for a Multivariable Process

    Directory of Open Access Journals (Sweden)

    Wael Alharbi

    2017-03-01

    This project is about the design of PID controllers and the improvement of outputs in multivariable processes. The optimisation of PID controllers for the Shell oil process is presented in this paper, using Genetic Algorithms (GAs). GAs are used to automatically tune PID controllers according to given specifications, employing an objective function which is specially formulated to measure the performance of the controller in terms of time-domain bounds on the responses of the closed-loop process. A specific objective function is suggested that allows the designer of a single-input, single-output (SISO) process to explicitly specify the process performance specifications associated with the given problem in terms of time-domain bounds, and then experimentally evaluate the closed-loop responses. This is investigated using a simple two-term parametric PID controller tuning problem. The results are then analysed and compared with those obtained using a number of popular conventional controller tuning methods. The intention is to demonstrate that the proposed objective function is inherently capable of accurately quantifying complex performance specifications in the time domain, something that cannot normally be achieved with conventional controller design or tuning methods. Finally, the recommended objective function will be used to examine the control problems of Multi-Input-Multi-Output (MIMO) processes, and the results will be presented in order to determine the efficiency of the suggested control system.
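
    A minimal sketch of the idea, under stated assumptions: a basic genetic algorithm tunes (Kp, Ki, Kd) for a simple first-order plant, with a fitness that combines integrated absolute error and penalties for violating time-domain bounds (overshoot and a settling band) on the closed-loop step response. The plant, the bounds and the GA settings are placeholders, not the Shell process or the paper's exact objective function.

```python
# Hedged sketch: GA tuning of PID gains against a time-domain-bound fitness.
import numpy as np

rng = np.random.default_rng(2)
dt, t_end = 0.01, 10.0
steps = int(t_end / dt)

def step_response(kp, ki, kd, tau=1.0):
    """Closed-loop unit-step response of G(s) = 1/(tau*s + 1) under PID control."""
    y, integ, prev_err = 0.0, 0.0, 1.0
    out = np.empty(steps)
    for k in range(steps):
        err = 1.0 - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        prev_err = err
        y += dt * (-y + u) / tau          # explicit Euler step of the first-order plant
        out[k] = y
    return out

def fitness(gains):
    kp, ki, kd = gains
    y = step_response(kp, ki, kd)
    if not np.all(np.isfinite(y)):
        return 1e9                                   # unstable closed loop
    overshoot = max(0.0, y.max() - 1.05)             # bound: at most 5 % overshoot
    settle = np.abs(y[int(0.7 * steps):] - 1.0)      # bound: within 2 % after 7 s
    settle_violation = max(0.0, settle.max() - 0.02)
    iae = np.sum(np.abs(1.0 - y)) * dt               # integral of absolute error
    return iae + 100.0 * overshoot + 100.0 * settle_violation

pop = rng.uniform([0, 0, 0], [10, 5, 1], size=(40, 3))
for gen in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[:20]]                          # truncation selection
    idx_a = rng.integers(20, size=20)
    idx_b = rng.integers(20, size=20)
    mix = rng.random((20, 1))
    children = mix * parents[idx_a] + (1 - mix) * parents[idx_b]    # arithmetic crossover
    children += rng.normal(scale=0.1, size=children.shape)          # Gaussian mutation
    children = np.clip(children, [0, 0, 0], [10, 5, 1])
    pop = np.vstack([parents, children])

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("tuned gains (Kp, Ki, Kd):", best)
```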

  18. Design and optimisation of novel configurations of stormwater constructed wetlands

    Science.gov (United States)

    Kiiza, Christopher

    2017-04-01

    Constructed wetlands (CWs) are recognised as a cost-effective technology for wastewater treatment. CWs have been deployed and could be retrofitted into existing urban drainage systems to prevent surface water pollution, attenuate floods and act as sources for reusable water. However, there exist numerous criteria for design configuration and operation of CWs. The aim of the study was to examine effects of design and operational variables on performance of CWs. To achieve this, 8 novel designs of vertical flow CWs were continuously operated and monitored (weekly) for 2 years. Pollutant removal efficiency in each CW unit was evaluated from physico-chemical analyses of influent and effluent water samples. Hybrid optimised multi-layer perceptron artificial neural networks (MLP ANNs) were applied to simulate treatment efficiency in the CWs. Subsequently, predictive and analytical models were developed for each design unit. Results show the models have sound generalisation abilities, with various design configurations and operational variables influencing the performance of CWs. Although some design configurations attained faster and higher removal efficiencies than others, all 8 CW designs produced effluents permissible for discharge into watercourses with strict regulatory standards.

  19. Heterogeneous locational optimisation using a generalised Voronoi partition

    Science.gov (United States)

    Guruprasad, K. R.; Ghose, Debasish

    2013-06-01

    In this paper a generalisation of the Voronoi partition is used for locational optimisation of facilities having different service capabilities and limited range or reach. The facilities can be stationary, such as base stations in a cellular network, hospitals, schools, etc., or mobile units, such as multiple unmanned aerial vehicles, automated guided vehicles, etc., carrying sensors, or mobile units carrying relief personnel and materials. An objective function for optimal deployment of the facilities is formulated, and its critical points are determined. The locally optimal deployment is shown to be a generalised centroidal Voronoi configuration in which the facilities are located at the centroids of the corresponding generalised Voronoi cells. The problem is formulated for more general mobile facilities, and formal results on the stability, convergence and spatial distribution of the proposed control laws responsible for the motion of the agents carrying facilities, under some constraints on the agents' speed and limit on the sensor range, are provided. The theoretical results are supported with illustrative simulation results.
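
    The centroidal-Voronoi idea behind the result can be sketched with a simplified Lloyd-type iteration on a discretised square region, in which every facility repeatedly moves to the centroid of its plain, nearest-facility Voronoi cell. The generalised partition used in the paper, which accounts for heterogeneous service capabilities and limited range, is deliberately omitted here; the sketch only shows the basic fixed-point structure.

```python
# Simplified sketch of a Lloyd-type deployment on a discretised unit square.
import numpy as np

rng = np.random.default_rng(3)
n_facilities = 5
grid = np.linspace(0.0, 1.0, 100)
X, Y = np.meshgrid(grid, grid)
points = np.column_stack([X.ravel(), Y.ravel()])   # discretised region

sites = rng.random((n_facilities, 2))              # initial facility locations
for iteration in range(50):
    # Assign each grid point to its nearest facility (the Voronoi partition).
    d2 = ((points[:, None, :] - sites[None, :, :]) ** 2).sum(axis=-1)
    owner = d2.argmin(axis=1)
    # Move every facility to the centroid of its cell.
    new_sites = sites.copy()
    for i in range(n_facilities):
        cell = points[owner == i]
        if len(cell):
            new_sites[i] = cell.mean(axis=0)
    if np.allclose(new_sites, sites, atol=1e-6):
        break
    sites = new_sites

print("converged facility locations:\n", sites)
```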

  20. Interference Aware Optimisation of Throughput in Cognitive Radio System

    Directory of Open Access Journals (Sweden)

    Gaurav Verma

    2015-07-01

    In a cognitive radio (CR) system where spectrum sensing and data transmission are performed simultaneously, the proper selection of the frame duration (τ) is of utmost importance. A small τ leads to an increased false alarm probability, while a large value delays the implementation of the sensing decision of the current frame to the next. The former decreases the achievable throughput of the CR user, while the latter may disturb the licensed user's communication. Under the constraint of maintaining a target detection probability P_d, this paper attempts to design a frame duration τ at which the achieved throughput of the CR system is maximised. To do so, an analysis of the achievable throughput as a function of τ was performed, which reveals that, initially, the achievable throughput increases sharply with τ, but beyond a certain value the increments are negligible and the achievable throughput approaches a constant value. The analysis shows that it is not possible to optimise τ perfectly; however, a close optimisation can still be performed which maximises the achievable throughput. From a realistic point of view, the CR system is further modelled under uncertain noise conditions. The simulation results justify the presented analysis. Defence Science Journal, Vol. 65, No. 4, July 2015, pp. 312-318, DOI: http://dx.doi.org/10.14429/dsj.65.8739
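
    The qualitative behaviour described in the abstract can be reproduced with the widely used energy-detection sensing-throughput model; this is an assumption, since the abstract does not give its formulas. In the sketch below, the false alarm probability under a target detection probability falls as the frame duration grows, so the normalised throughput rises sharply and then saturates; a "close" optimum is taken as the smallest τ within 1 % of the plateau. The SNR, sampling rate and target P_d are illustrative values, not the paper's parameters.

```python
# Hedged sketch of the standard sensing-throughput trade-off for an energy
# detector constrained to meet a target detection probability.
import numpy as np
from scipy.stats import norm

fs = 6e6                    # sampling frequency [Hz]
snr = 10 ** (-15 / 10)      # primary-user SNR at the CR receiver (-15 dB)
pd_target = 0.9             # target detection probability
c0 = 1.0                    # normalised throughput when the channel is used

def false_alarm_prob(tau):
    """P_f for an energy detector meeting pd_target (Gaussian Q-function form)."""
    n = tau * fs
    arg = np.sqrt(2 * snr + 1) * norm.ppf(1 - pd_target) + np.sqrt(n) * snr
    return 1 - norm.cdf(arg)     # Q(arg)

taus = np.linspace(1e-4, 20e-3, 200)      # candidate frame durations [s]
throughput = c0 * (1 - false_alarm_prob(taus))

# "Close" optimisation: smallest tau whose throughput is within 1 % of the plateau.
plateau = throughput.max()
tau_opt = taus[np.argmax(throughput >= 0.99 * plateau)]
print(f"near-optimal frame duration: {tau_opt * 1e3:.2f} ms")
```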

  1. Optimised low-thrust mission to the Atira asteroids

    Science.gov (United States)

    Di Carlo, Marilena; Romero Martin, Juan Manuel; Ortiz Gomez, Natalia; Vasile, Massimiliano

    2017-04-01

    Atira asteroids are recently-discovered celestial bodies characterised by orbits lying completely inside the heliocentric orbit of the Earth. The study of these objects is difficult due to the limitations of ground-based observations: objects can only be detected when the Sun is not in the field of view of the telescope. However, many asteroids are expected to exist in the inner region of the Solar System, many of which could pose a significant threat to our planet. In this paper, a small, low-cost mission to visit the known Atira asteroids and to discover new Near Earth Asteroids (NEA) is proposed. The mission is realised using electric propulsion. The trajectory is optimised to maximise the number of visited asteroids of the Atira group with minimum propellant consumption. During the tour of the Atira asteroids, an opportunistic NEA discovery campaign is proposed to increase our knowledge of the asteroid population. The mission ends with a transfer to an orbit with perihelion equal to Venus's orbit radius. This orbit represents a vantage point to monitor and detect asteroids in the inner part of the Solar System and provide early warning in the case of a potential impact.

  2. Ecological optimisation of an irreversible Stirling heat engine

    Energy Technology Data Exchange (ETDEWEB)

    He, J.; Chen, J. [Xiamen Univ. (China). Dept. of Physics; Wu, C. [US Naval Academy, Annapolis, MD (United States). Dept. of Mechanical Engineering

    2001-10-01

    A general cycle model of an irreversible Stirling heat engine using an ideal or Van der Waals gas as the working substance is established. It includes three main sources of irreversibility: the heat transfer across finite temperature differences in the isothermal processes, the regenerative loss resulting from non-perfect regeneration in the regenerator, and the heat leak between the external heat reservoirs. The ecological function is taken as an objective function for optimisation. The performance characteristics of the Stirling heat engine at maximum ecological function are revealed. They are compared with the performance characteristics of the Stirling heat engine at maximum power output and efficiency in order to expound the significance of the ecological objective function. The results obtained here are of importance in the optimal design and operation of real Stirling heat engines. Finally, it is pointed out that the results obtained in this paper are very general, from which the optimal performance of the Ericsson heat engine using an ideal gas as the working substance and the Carnot heat engine can be derived directly. (author)
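
    The abstract does not spell out the form of the ecological function; the commonly used ecological criterion (written here as an assumption about the intended objective) trades power output against dissipation:

```latex
% Ecological objective: power output penalised by dissipation, with T_L the
% temperature of the cold reservoir and \sigma the entropy production rate of
% the cycle; the optimisation then seeks the operating point maximising E
% rather than the power P or the efficiency \eta alone.
E \;=\; P \;-\; T_{L}\,\sigma ,
\qquad
\max E \quad\text{versus}\quad \max P \;\text{ or }\; \max \eta .
```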

  3. Optimising Ambient Setting Bayer Derived Fly Ash Geopolymers

    Directory of Open Access Journals (Sweden)

    Evan Jamieson

    2016-05-01

    The Bayer process utilises high concentrations of caustic and elevated temperature to liberate alumina from bauxite, for the production of aluminium and other chemicals. Within Australia, this process results in 40 million tonnes of mineral residues (red mud) each year. Over the same period, the energy production sector will produce 14 million tonnes of coal combustion products (fly ash). Both industrial residues require impoundment storage, yet combining some of these components can produce geopolymers, an alternative to cement. Geopolymers derived from Bayer liquor and fly ash have been made successfully with a compressive strength in excess of 40 MPa after oven curing. However, any product from these industries would require large-volume applications with robust operational conditions to maximise utilisation. To facilitate potential unconfined large-scale production, Bayer derived fly ash geopolymers have been optimised to achieve ambient curing. Fly ash from two different power stations has been successfully trialled, showing the versatility of the Bayer liquor-ash combination for making geopolymers.

  4. Optimising the turbocharging of large engines in the future

    Energy Technology Data Exchange (ETDEWEB)

    Codan, E. [ABB Turbo Systems, Ltd., R and D Turbocharging, Baden (Switzerland)

    1998-12-31

    The new ABB turbocharger generations TPL and TPS were developed to match the most advanced turbocharged engines over 500 kW of the coming years. High performance in terms of pressure ratio and turbocharging efficiency no longer guarantees efficient engine operation over the whole operating field. Therefore, matched turbocharger characteristics for different applications are increasingly important. This paper shows the influence of the turbocharging system characteristics on the steady state and transient behaviour of a turbocharged engine for different applications. The basis for the study is the well-proven simulation system SiSy, which is widely used for the performance prediction of turbocharged engines. Some simple parameters were developed that numerically describe the correlation between the characteristic of the turbocharging system and the engine operation. The limits of the commonly used turbocharging systems are shown together with an overview of future possibilities, e.g. two-stage turbocharging and turbocompound. A joint optimisation of the turbocharging system and of the engine will be of paramount importance in the future, to exploit the improvement potential. (au)

  5. Optimisation of radio transmitter locations in mobile telecommunications networks

    Directory of Open Access Journals (Sweden)

    Schmidt-Dumont, Thorsten

    2016-08-01

    Multiple factors have to be taken into account when mobile telecommunication network providers make decisions about radio transmitter placement. Generally, area coverage and the average signal level provided are of prime importance in these decisions. These criteria give rise to a bi-objective problem of facility location, with the goal of achieving an acceptable trade-off between maximising the total area coverage and maximising the average signal level provided to the demand region by a network of radio transmitters. This paper establishes a mathematical modelling framework, based on these two placement criteria, for evaluating the effectiveness of a given set of radio transmitter locations. In the framework, coverage is measured according to the degree of obstruction of the so-called ‘Fresnel zone’ that is formed between handset and base station, while signal strength is modelled taking radio wave propagation loss into account. This framework is used to formulate a novel bi-objective facility location model that may form the basis for decision support aimed at identifying high-quality transmitter location trade-off solutions for mobile telecommunication network providers. But it may also find application in various other contexts (such as radar, watchtower, or surveillance camera placement optimisation).
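
    A toy evaluation of the two placement criteria is sketched below: for one candidate set of transmitter locations, coverage and average received signal are computed over a grid of demand points with a simple log-distance path-loss model. The Fresnel-zone obstruction test used in the paper is omitted, and the transmit power, coverage threshold and propagation constants are illustrative.

```python
# Toy bi-objective evaluation of a candidate transmitter placement.
import numpy as np

rng = np.random.default_rng(4)
grid = np.linspace(0.0, 10.0, 60)                    # demand region [km]
X, Y = np.meshgrid(grid, grid)
demand = np.column_stack([X.ravel(), Y.ravel()])

P_TX, THRESHOLD = 43.0, -95.0                        # transmit power / coverage threshold [dBm]

def received_power_dbm(distance_km):
    d = np.maximum(distance_km, 0.01)
    return P_TX - (120.0 + 35.0 * np.log10(d))       # simple log-distance path loss

def evaluate(sites):
    d = np.linalg.norm(demand[:, None, :] - sites[None, :, :], axis=-1)
    best_signal = received_power_dbm(d).max(axis=1)  # serve each point from the strongest site
    coverage = np.mean(best_signal >= THRESHOLD)
    return coverage, best_signal.mean()

sites = rng.uniform(0.0, 10.0, size=(6, 2))          # one candidate placement of 6 transmitters
coverage, avg_signal = evaluate(sites)
print(f"coverage: {coverage:.1%}, average signal: {avg_signal:.1f} dBm")
```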

  6. Optimising daytime deliveries when inducing labour using prostaglandin vaginal inserts.

    Science.gov (United States)

    Miller, Hugh; Goetzl, Laura; Wing, Deborah A; Powers, Barbara; Rugarn, Olof

    2016-01-01

    To determine induction start time(s) that would maximise daytime deliveries when using prostaglandin vaginal inserts. Women enrolled into the Phase III trial, EXPEDITE (clinical trial registration: NCT01127581), had labour induced with either a misoprostol or dinoprostone vaginal insert (MVI or DVI). A secondary analysis was conducted to determine the optimal start times for induction by identifying the 12-h period with the highest proportion of deliveries by parity and treatment. Optimal start times for achieving daytime deliveries when using MVI appear to be 19:00 in nulliparae and 23:00 in multiparae. Applying these start times, the median time of onset of active labour would be approximately 08:30 for both parities and the median time of delivery would be the following day at approximately 16:30 for nulliparae and 12:00 (midday) for multiparae. Optimal start times when using DVI appear to be 07:00 for nulliparae and 23:00 for multiparae. Using these start times, the median time of onset of active labour would be the following day at approximately 04:00 and 11:50, and the median time of delivery would be approximately 13:40 and 16:10, respectively. When optimising daytime deliveries, different times to initiate induction of labour may be appropriate depending on parity and the type of retrievable prostaglandin vaginal insert used.

  7. Optimised Dirac operators on the lattice. Construction, properties and applications

    Energy Technology Data Exchange (ETDEWEB)

    Bietenholz, W. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik]|[Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC

    2006-11-15

    We review a number of topics related to block variable renormalisation group transformations of quantum fields on the lattice, and to the emerging perfect lattice actions. We first illustrate this procedure by considering scalar fields. Then we proceed to lattice fermions, where we discuss perfect actions for free fields, for the Gross-Neveu model and for a supersymmetric spin model. We also consider the extension to perfect lattice perturbation theory, in particular regarding the axial anomaly and the quark gluon vertex function. Next we deal with properties and applications of truncated perfect fermions, and their chiral correction by means of the overlap formula. This yields a formulation of lattice fermions, which combines exact chiral symmetry with an optimisation of further essential properties. We summarise simulation results for these so-called overlap-hypercube fermions in the two-flavour Schwinger model and in quenched QCD. In the latter framework we establish a link to Chiral Perturbation Theory, both, in the p-regime and in the epsilon-regime. In particular we present an evaluation of the leading Low Energy Constants of the chiral Lagrangian - the chiral condensate and the pion decay constant - from QCD simulations with extremely light quarks. (orig.)

  8. Optimised Dirac operators on the lattice: construction, properties and applications

    Energy Technology Data Exchange (ETDEWEB)

    Bietenholz, Wolfgang [Humboldt-Universitaet zu Berlin (Germany). Inst. fuer Physik; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing (NIC)

    2006-12-15

    We review a number of topics related to block variable renormalisation group transformations of quantum fields on the lattice, and to the emerging perfect lattice actions. We first illustrate this procedure by considering scalar fields. Then we proceed to lattice fermions, where we discuss perfect actions for free fields, for the Gross-Neveu model and for a supersymmetric spin model. We also consider the extension to perfect lattice perturbation theory, in particular regarding the axial anomaly and the quark gluon vertex function. Next we deal with properties and applications of truncated perfect fermions, and their chiral correction by means of the overlap formula. This yields a formulation of lattice fermions, which combines exact chiral symmetry with an optimisation of further essential properties. We summarise simulation results for these so-called overlap-hypercube fermions in the two-flavour Schwinger model and in quenched QCD. In the latter framework we establish a link to Chiral Perturbation Theory, both, in the p-regime and in the epsilon-regime. In particular we present an evaluation of the leading Low Energy Constants of the chiral Lagrangian - the chiral condensate and the pion decay constant - from QCD simulations with extremely light quarks. (author)

  9. Optimising neuroimaging effectiveness in a district general hospital.

    Science.gov (United States)

    McCarron, M O; Wade, C; McCarron, P

    2014-01-01

    Diagnostic accuracy in neurology frequently depends on clinical assessment and neuroimaging interpretation. We assessed neuroimaging discrepancy rates in reported findings between general radiologists and neuroradiologists among patients from a district general hospital (DGH). A neuroradiologist's report was sought on selected DGH patients over 28 months. Pre-planned outcomes included comparisons of primary findings (main diagnosis or abnormality), secondary findings (differential diagnoses and incidental findings) and advice from neuroradiologists for further investigations. A total of 233 patients (119 men and 114 women), mean age 47.2 (SD 17.8) years were studied: 43 had a computed tomography (CT) brain scan only, 37 had CT and magnetic resonance imaging (MRI) scans and 153 had only MRI scans. Discrepancies in the primary diagnosis/abnormality were identified in 33 patients (14.2%). This included 7 of 43 patients (16.3%) who had a CT brain scan as their only neuroimaging. Secondary outcomes differed in 50 patients (21.5%). Neuroradiologists recommended further neuroimaging for 29 patients (12.4%). The most common discrepancies in the primary diagnosis/abnormality were misinterpreting normal for hippocampal sclerosis and missed posterior fossa lesions. There was no evidence of temporal changes in discrepancy rates. Selecting CT and MR neuroimaging studies from general hospitals for reviewing by neuroradiologists is an important and effective way of optimising management of neurological patients.

  10. A stepwise atomic, valence-molecular, and full-molecular optimisation of the Hartree-Fock/Kohn-Sham energy.

    Science.gov (United States)

    Jansík, Branislav; Høst, Stinne; Johansson, Mikael P; Olsen, Jeppe; Jørgensen, Poul; Helgaker, Trygve

    2009-07-21

    A hierarchical optimisation strategy has been introduced for minimising the Hartree-Fock/Kohn-Sham energy, consisting of three levels (3L): an atom-in-a-molecule optimisation, a valence-basis molecular optimisation, and a full-basis molecular optimisation. The density matrix formed at one level is used as a starting density matrix at the next level with no loss of information. To ensure a fast and reliable convergence to a minimum, the augmented Roothaan-Hall (ARH) algorithm is used in both the valence-basis and full-basis molecular optimisations. The performance of the ARH-3L method is compared with standard optimisation algorithms. Both for efficiency and reliability, we recommend using the ARH-3L algorithm.

  11. An optimised protocol for molecular identification of Eimeria from chickens.

    Science.gov (United States)

    Kumar, Saroj; Garg, Rajat; Moftah, Abdalgader; Clark, Emily L; Macdonald, Sarah E; Chaudhry, Abdul S; Sparagano, Olivier; Banerjee, Partha S; Kundu, Krishnendu; Tomley, Fiona M; Blake, Damer P

    2014-01-17

    Molecular approaches supporting identification of Eimeria parasites infecting chickens have been available for more than 20 years, although they have largely failed to replace traditional measures such as microscopy and pathology. Limitations of microscopy-led diagnostics, including a requirement for specialist parasitological expertise and low sample throughput, are yet to be outweighed by the difficulties associated with accessing genomic DNA from environmental Eimeria samples. A key step towards the use of Eimeria species-specific PCR as a sensitive and reproducible discriminatory tool for use in the field is the production of a standardised protocol that includes sample collection and DNA template preparation, as well as primer selection from the numerous PCR assays now published. Such a protocol will facilitate development of valuable epidemiological datasets which may be easily compared between studies and laboratories. The outcome of an optimisation process undertaken in laboratories in India and the UK is described here, identifying four steps. First, samples were collected into a 2% (w/v) potassium dichromate solution. Second, oocysts were enriched by flotation in saturated saline. Third, genomic DNA was extracted using a QIAamp DNA Stool mini kit protocol including a mechanical homogenisation step. Finally, nested PCR was carried out using previously published primers targeting the internal transcribed spacer region 1 (ITS-1). Alternative methods tested included sample processing in the presence of faecal material, DNA extraction using a traditional phenol/chloroform protocol, the use of SCAR multiplex PCR (one tube and two tube versions) and speciation using the morphometric tool COCCIMORPH for the first time with field samples.

  12. Optimising automation of a manual enzyme-linked immunosorbent assay

    Directory of Open Access Journals (Sweden)

    Corena de Beer

    2011-12-01

    Objective: Enzyme-linked immunosorbent assays (ELISAs) are widely used to quantify immunoglobulin levels induced by infection or vaccination. Compared to conventional manual assays, automated ELISA systems offer more accurate and reproducible results, faster turnaround times and cost effectiveness due to the use of multianalyte reagents. Design: The VaccZyme™ Human Anti-Haemophilus influenzae type B (Hib) kit (MK016) from The Binding Site Company was optimised to be used on an automated BioRad PhD™ system in the Immunology Laboratory (National Health Laboratory Service) in Tygerberg, South Africa. Methods: An automated ELISA system that uses individual well incubation was compared to a manual method that uses whole-plate incubation. Results: Results were calculated from calibration curves constructed with each assay. Marked differences in calibration curves were observed for the two methods. The automated method produced lower-than-recommended optical density values and resulted in invalid calibration curves and diagnostic results. A comparison of the individual steps of the two methods showed a difference of 10 minutes per incubation cycle. All incubation steps of the automated method were subsequently increased from 30 minutes to 40 minutes. Several comparative assays were performed according to the amended protocol and all calibration curves obtained were valid. Calibrators and controls were also included as samples in different positions and orders on the plate and all results were valid. Conclusion: Proper validation is vital before converting manual ELISA assays to automated or semi-automated methods.

  13. Toward an optimisation technique for dynamically monitored environment

    Science.gov (United States)

    Shurrab, Orabi M.

    2016-10-01

    The data fusion community has introduced multiple procedures for situational assessment in order to facilitate timely responses to emerging situations. More directly, the process refinement level of the Joint Directors of Laboratories (JDL) model is a meta-process to assess and improve the data fusion task during real-time operation. In other words, it is an optimisation technique to verify the overall data fusion performance and enhance it towards the top goals of the decision-making resources. This paper discusses the theoretical concept of prioritisation, where the analyst team is required to keep up to date with a dynamically changing environment spanning different domains such as air, sea, land, space and cyberspace. Furthermore, it demonstrates an illustrative example of how various tracking activities are ranked simultaneously into a predetermined order. Specifically, it presents a modelling scheme for a case-study-based scenario in which the real-time system reports different classes of prioritised events, followed by a performance metric for evaluating the prioritisation process in the situational awareness (SWA) domain. The proposed performance metric has been designed and evaluated using an analytical approach. The modelling scheme represents the situational awareness system outputs mathematically, in the form of a list of activities. Such methods allowed the evaluation process to conduct a rigorous analysis of the prioritisation process, despite any constraints related to a domain-specific configuration. After conducting three levels of assessment over three separate scenarios, the Prioritisation Capability Score (PCS) provided an appropriate scoring scheme for the different ranking instances. Indeed, from the data fusion perspective, the proposed metric assessed real-time system performance adequately and is capable of supporting a verification process that directs the operator's attention to any issue concerning the prioritisation capability.

  14. Towards the optimisation and adaptation of dry powder inhalers.

    Science.gov (United States)

    Cui, Y; Schmalfuß, S; Zellnitz, S; Sommerfeld, M; Urbanetz, N

    2014-08-15

    Pulmonary drug delivery by dry powder inhalers is becoming more and more popular. Such an inhalation device must ensure that during the inhalation process the drug powder is detached from the carrier by fluid flow stresses. The goal of the project is the development of a drug powder detachment model to be used in numerical computations (CFD, computational fluid dynamics) of the fluid flow and carrier particle motion through the inhaler and the resulting efficiency of drug delivery. This programme will be the basis for the optimisation of inhaler geometry and dry powder inhaler formulation. For this purpose a multi-scale approach is adopted. First, the flow field through the inhaler is numerically calculated with OpenFOAM(®) and the flow stresses experienced by the carrier particles are recorded. This information is used for micro-scale simulations using the Lattice-Boltzmann method, where a single carrier particle covered with drug powder is placed in a cubic flow domain and exposed to the relevant flow situations, e.g. plug and shear flow with different Reynolds numbers. From these simulations the fluid forces on the drug particles are obtained. In order to determine the possibility of drug particle detachment by lift-off, sliding or rolling, measurements by AFM (atomic force microscopy) were also conducted for different carrier particle surface structures. The contact properties, such as the van der Waals force, friction coefficient and adhesion surface energy, were used to determine, from a force or moment balance (fluid forces versus contact forces), the detachment probability for the three mechanisms as a function of the carrier particle Reynolds number. These results will be used for deriving the drug powder detachment model.
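
    The force/moment balance mentioned at the end of the abstract can be sketched as three simple criteria, one per detachment mechanism. The numerical inputs below are placeholders standing in for the CFD/Lattice-Boltzmann fluid forces and the AFM contact data; the criteria follow common textbook forms rather than the project's final model.

```python
# Rough force/moment-balance sketch for the three detachment mechanisms
# (lift-off, sliding, rolling); all input values are placeholders.
from dataclasses import dataclass

@dataclass
class DetachmentCase:
    f_drag: float          # fluid drag force on the drug particle [N]
    f_lift: float          # fluid lift force [N]
    f_adhesion: float      # van der Waals pull-off force from AFM [N]
    mu: float              # friction coefficient from AFM
    radius: float          # drug particle radius [m]
    contact_radius: float  # adhesion contact radius [m]

    def lift_off(self) -> bool:
        return self.f_lift > self.f_adhesion

    def sliding(self) -> bool:
        normal_load = max(self.f_adhesion - self.f_lift, 0.0)
        return self.f_drag > self.mu * normal_load

    def rolling(self) -> bool:
        # Moment balance about the edge of the contact area.
        driving = self.f_drag * self.radius + self.f_lift * self.contact_radius
        resisting = self.f_adhesion * self.contact_radius
        return driving > resisting

case = DetachmentCase(f_drag=2e-9, f_lift=5e-10, f_adhesion=4e-9,
                      mu=0.3, radius=2e-6, contact_radius=1e-7)
print("lift-off:", case.lift_off(), "sliding:", case.sliding(), "rolling:", case.rolling())
```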

  15. Optimising treatment for COPD--new strategies for combination therapy.

    Science.gov (United States)

    Welte, T

    2009-08-01

    Chronic obstructive pulmonary disease (COPD) is a multi-component disease characterised by airflow limitation and airway inflammation. Exacerbations of COPD have a considerable impact on the quality of life, daily activities and general well-being of patients and are a great burden on the health system. Thus, the aims of COPD management include not only relieving symptoms and preventing disease progression but also preventing and treating exacerbations. Attention towards the day-to-day burden of the disease is also required in light of evidence that suggests COPD may be variable throughout the day with morning being the time when symptoms are most severe and patients' ability to perform regular morning activities the most problematic. While available therapies improve clinical symptoms and decrease airway inflammation, they do not unequivocally slow long-term progression or address all disease components. With the burden of COPD continuing to increase, research into new and improved treatment strategies to optimise pharmacotherapy is ongoing - in particular, combination therapies, with a view to their complementary modes of action enabling multiple components of the disease to be addressed. Evidence from recent clinical trials indicates that triple therapy, combining an anticholinergic with an inhaled corticosteroid and a long-acting beta(2)-agonist, may provide clinical benefits additional to those associated with each treatment alone in patients with more severe COPD. This article reviews the evidence for treatment strategies used in COPD with a focus on combination therapies and introduces the 3-month CLIMB study (Evaluation of Efficacy and Safety of Symbicort as an Add-on Treatment to Spiriva in Patients With Severe COPD) which investigated the potential treatment benefits of combining tiotropium with budesonide/formoterol in patients with COPD with regard to lung function, exacerbations, symptoms and morning activities.

  16. Dose optimisation of double-contrast barium enema examinations.

    Science.gov (United States)

    Berner, K; Båth, M; Jonasson, P; Cappelen-Smith, J; Fogelstam, P; Söderberg, J

    2010-01-01

    The purpose of the present work was to optimise the filtration and dose setting for double-contrast barium enema examinations using a Philips MultiDiagnost Eleva FD system. A phantom study was performed prior to a patient study. A CDRAD phantom was used in a study where copper and aluminium filtration, different detector doses and tube potentials were examined. The image quality was evaluated using the software CDRAD Analyser and the phantom dose was determined using the Monte Carlo-based software PCXMC. The original setting [100 % detector dose (660 nGy air kerma) and a total filtration of 3.5 mm Al, at 81 kVp] and two other settings identified by the phantom study (100 % detector dose and additional filtration of 1 mm Al and 0.2 mm Cu as well as 80 % detector dose and added filtration of 1 mm Al and 0.2 mm Cu) were included in the patient study. The patient study included 60 patients and up to 8 images from each patient. Six radiologists performed a visual grading characteristics study to evaluate the image quality. A four-step scale was used to judge the fulfillment of three image quality criteria. No overall statistically significant difference in image quality was found between the three settings (P > 0.05). The decrease in the effective dose for the settings in the patient study was 15 % when filtration was added and 34 % when filtration was added and the detector dose was also reduced. The study indicates that additional filtration of 1 mm Al and 0.2 mm Cu and a decrease in detector dose by 20 % from the original setting can be used in colon examinations with Philips MultiDiagnost Eleva FD to reduce the patient dose by 30 % without significantly affecting the image quality. For 20 exposures, this corresponds to a decrease in the effective dose from 1.6 to 1.1 mSv.

  17. Optimisation of computed radiography systems for chest imaging

    Energy Technology Data Exchange (ETDEWEB)

    Alzimami, K. [Department of Physics, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom)], E-mail: k.alzimami@surrey.ac.uk; Sassi, S. [Royal Marsden NHS Foundation Trust, Sutton, Surrey SM2 5PT (United Kingdom); Alkhorayef, M. [Department of Physics, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom); Britten, A.J. [Department of Medical Physics, St George' s Hospital, London SW17 0QT (United Kingdom); Spyrou, N.M. [Department of Physics, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom)

    2009-03-01

    The main thrust of this study is to investigate methods of optimising the radiation dose-image quality relationship in computed radiography (CR) systems for chest imaging. Specifically, this study investigates the possibility of reducing the patient radiation exposure through an optimal selection of tube filtration, exposure parameters and air gap technique, in parallel with a study of the image quality, particularly low contrast detail detectability, signal-to-noise ratio (SNR) and scatter fraction (SF). The CDRAD phantom was used to assess the quality of the CR images. Tissue equivalent Polystyrene blocks were placed in the front of the phantom as scattering material with thicknesses of 5 and 15 cm to simulate an adult chest and heart/diaphragm regions, respectively. A series of exposure techniques were used, including Cu filtration with various thicknesses of Cu in the presence and absence of an air gap, whilst the exposure was kept as constant as possible throughout. The estimated patient effective dose and skin entrance dose were calculated using the NRPB-SR262 X-ray dose calculation software. The results have shown that the low contrast-detail detectability in the lung and the heart/diaphragm regions improves when using an air gap and no Cu filtration, particularly at low kilovoltage (kVp). However, there is no significant difference in low contrast-detail in the absence or presence of a 0.2 mm Cu filtration. SF values for the lung and heart regions decrease when using both, the air gap technique and a 0.2 mm Cu filtration, particularly at low kVp. SNR values for the lung and heart regions improve when using a small Cu thickness. In conclusion, this investigation has shown that the quality of chest CR images could be improved by using an air gap technique and a 0.2 mm Cu filtration at low kVp, particularly at 99 kVp.

  18. Reduction environmental effects of civil aircraft through multi-objective flight plan optimisation

    Science.gov (United States)

    Lee, D. S.; Gonzalez, L. F.; Walker, R.; Periaux, J.; Onate, E.

    2010-06-01

    With rising environmental concern, the reduction of critical aircraft emissions, including carbon dioxide (CO2) and nitrogen oxides (NOx), is one of the most important aeronautical problems. There are many possible ways to attack this problem, such as designing new wing/aircraft shapes, more efficient engines, etc. This paper instead provides a set of acceptable flight plans as a first step, besides replacing current aircraft. The paper investigates green aircraft design optimisation in terms of aircraft range, mission fuel weight (CO2) and NOx using advanced Evolutionary Algorithms coupled to flight optimisation system software. Two multi-objective design optimisations are conducted to find the best set of flight plans for current aircraft, considering discretised altitudes and Mach numbers without redesigning the aircraft shape or engine type. The objectives of the first optimisation are to maximise the range of the aircraft while minimising NOx with constant mission fuel weight. The second optimisation considers minimisation of mission fuel weight and NOx with fixed aircraft range. Numerical results show that the method is able to capture a set of useful trade-offs that reduce both NOx and CO2 (minimum mission fuel weight).

  19. A simulation-optimisation approach for designing water distribution networks under multiple objectives

    Science.gov (United States)

    Grundmann, Jens; Pham Van, Tinh; Müller, Ruben; Schütze, Niels

    2014-05-01

    Especially in arid and semi-arid regions, water distribution networks are of major importance for integrated water resources management in order to convey water over long distances from sources to consumers. However, designing a network optimally is still a challenge, which requires an appropriate determination of (1) pipe/pump/tank characteristics (the decision variables) and (2) cost and network reliability (the objective functions), subject to (3) a given set of constraints. The objective functions are conflicting: minimising costs decreases network reliability, resulting in a higher risk of network failures. For solving this multi-objective design problem, a simulation-optimisation approach is developed. The approach couples a hydraulic network model (Epanet) with an optimiser, namely the covariance matrix adaptation evolution strategy (CMAES). The simulation-optimisation model is applied to internationally published benchmark cases for single- and multi-objective optimisation and the simultaneous optimisation of the above-mentioned decision variables as well as the network layout. Results are encouraging. The proposed model performs with similar or better results, i.e. smaller costs and higher network reliability. Subsequently, the new model is applied to the optimal design and operation of a water distribution system to supply the coastal arid region of Al-Batinah (North of Oman) with water for agricultural production.
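
    A minimal sketch of such a simulation-optimisation loop is given below, under assumptions: the open-source `cma` package supplies the CMA-ES optimiser named in the abstract, while `simulate_network` is a hypothetical stand-in for the coupled Epanet hydraulic model and simply returns fabricated cost and reliability values so the loop runs. Reliability enters through a penalty term; scanning the weight `lam` would trace out cost/reliability trade-off solutions.

```python
# Sketch of a CMA-ES simulation-optimisation loop with a placeholder hydraulic model.
import numpy as np
import cma

N_PIPES = 12
DIAM_MIN, DIAM_MAX = 0.1, 1.0          # pipe diameters [m], the decision variables

def simulate_network(diameters):
    """Placeholder for the Epanet run: returns (cost, reliability in [0, 1])."""
    cost = np.sum(diameters ** 1.5) * 1e4                  # bigger pipes cost more
    reliability = 1.0 - np.exp(-4.0 * diameters.mean())    # and make the network more reliable
    return cost, reliability

def objective(x, lam=0.5, reliability_target=0.95):
    diameters = np.clip(x, DIAM_MIN, DIAM_MAX)
    cost, reliability = simulate_network(diameters)
    penalty = max(0.0, reliability_target - reliability)
    return (1 - lam) * cost + lam * 1e6 * penalty

x0 = np.full(N_PIPES, 0.5)
es = cma.CMAEvolutionStrategy(x0, 0.2, {"bounds": [DIAM_MIN, DIAM_MAX],
                                        "verbose": -9, "maxiter": 200})
best_x, best_f = None, np.inf
while not es.stop():
    candidates = es.ask()
    values = [objective(np.asarray(x)) for x in candidates]
    es.tell(candidates, values)
    i = int(np.argmin(values))
    if values[i] < best_f:
        best_f, best_x = values[i], np.asarray(candidates[i])

cost, reliability = simulate_network(np.clip(best_x, DIAM_MIN, DIAM_MAX))
print(f"cost: {cost:.0f}, reliability: {reliability:.3f}")
```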

  20. Portfolio optimisation for hydropower producers that balances riverine ecosystem protection and producer needs

    Science.gov (United States)

    Yin, X. A.; Yang, Z. F.; Liu, C. L.

    2014-04-01

    In deregulated electricity markets, hydropower portfolio design has become an essential task for producers. The previous research on hydropower portfolio optimisation focused mainly on the maximisation of profits but did not take into account riverine ecosystem protection. Although profit maximisation is the major objective for producers in deregulated markets, protection of riverine ecosystems must be incorporated into the process of hydropower portfolio optimisation, especially against a background of increasing attention to environmental protection and stronger opposition to hydropower generation. This research seeks mainly to remind hydropower producers of the requirement of river protection when they design portfolios and help shift portfolio optimisation from economically oriented to ecologically friendly. We establish a framework to determine the optimal portfolio for a hydropower reservoir, accounting for both economic benefits and ecological needs. In this framework, the degree of natural flow regime alteration is adopted as a constraint on hydropower generation to protect riverine ecosystems, and the maximisation of mean annual revenue is set as the optimisation objective. The electricity volumes assigned in different electricity submarkets are optimised by the noisy genetic algorithm. The proposed framework is applied to China's Wangkuai Reservoir to test its effectiveness. The results show that the new framework could help to design eco-friendly portfolios that can ensure a planned profit and reduce alteration of the natural flow regime.

  1. Optimisation of Fabric Reinforced Polymer Composites Using a Variant of Genetic Algorithm

    Science.gov (United States)

    Axinte, Andrei; Taranu, Nicolae; Bejan, Liliana; Hudisteanu, Iuliana

    2017-03-01

    Fabric reinforced polymeric composites are high performance materials with a rather complex fabric geometry. Therefore, modelling this type of material is a cumbersome task, especially when efficient use is targeted. One of the most important issues of the design process is the optimisation of the individual laminae and of the laminated structure as a whole. In order to do that, a parametric model of the material has been defined, emphasising the many geometric variables that need to be correlated in the complex process of optimisation. The input parameters involved in this work include the widths or heights of the tows and the laminate stacking sequence, which are discrete variables, while the gaps between adjacent tows and the height of the neat matrix are continuous variables. This work is one of the first attempts at using a Genetic Algorithm (GA) to optimise the geometrical parameters of satin-reinforced multi-layer composites. Given the mixed type of the input parameters involved, an original software package called SOMGA (Satin Optimisation with a Modified Genetic Algorithm) has been conceived and utilised in this work. The main goal is to find the best possible solution to the problem of designing a composite material which is able to withstand a given set of external in-plane loads. The optimisation process has been performed using a fitness function which can analyse and compare the mechanical behaviour of different fabric reinforced composites, the results being correlated with the ultimate strains, which demonstrates the efficiency of the composite structure.

  2. Pump schedules optimisation with pressure aspects in complex large-scale water distribution systems

    Directory of Open Access Journals (Sweden)

    P. Skworcow

    2014-06-01

    This paper considers optimisation of pump and valve schedules in complex large-scale water distribution networks (WDN), taking into account pressure aspects such as minimum service pressure and pressure-dependent leakage. An optimisation model is automatically generated in the GAMS language from a hydraulic model in the EPANET format and from additional files describing operational constraints, electricity tariffs and pump station configurations. The paper describes in detail how each hydraulic component is modelled. To reduce the size of the optimisation problem the full hydraulic model is simplified using a module reduction algorithm, while retaining the nonlinear characteristics of the model. Subsequently, the nonlinear programming solver CONOPT is used to solve the optimisation model, which is in the form of Nonlinear Programming with Discontinuous Derivatives (DNLP). The results produced by CONOPT are processed further by heuristic algorithms to generate an integer solution. The proposed approach was tested on a large-scale WDN model provided in the EPANET format. The considered WDN included complex structures and interactions between pump stations. Solving several scenarios considering different horizons, time steps, operational constraints, demand levels and topological changes demonstrated the ability of the approach to automatically generate and solve optimisation problems for a variety of requirements.

  3. Pump schedules optimisation with pressure aspects in complex large-scale water distribution systems

    Directory of Open Access Journals (Sweden)

    P. Skworcow

    2014-02-01

    This paper considers optimisation of pump and valve schedules in complex large-scale water distribution networks (WDN), taking into account pressure aspects such as minimum service pressure and pressure-dependent leakage. An optimisation model is automatically generated in the GAMS language from a hydraulic model in the EPANET format and from additional files describing operational constraints, electricity tariffs and pump station configurations. The paper describes in detail how each hydraulic component is modelled. To reduce the size of the optimisation problem the full hydraulic model is simplified using a module reduction algorithm, while retaining the nonlinear characteristics of the model. Subsequently, the nonlinear programming solver CONOPT is used to solve the optimisation model, which is in the form of Nonlinear Programming with Discontinuous Derivatives (DNLP). The results produced by CONOPT are processed further by heuristic algorithms to generate an integer solution. The proposed approach was tested on a large-scale WDN model provided in the EPANET format. The considered WDN included complex structures and interactions between pump stations. Solving several scenarios considering different horizons, time steps, operational constraints, demand levels and topological changes demonstrated the ability of the approach to automatically generate and solve optimisation problems for a variety of requirements.

  4. A multiobjective scatter search algorithm for fault-tolerant NoC mapping optimisation

    Science.gov (United States)

    Le, Qianqi; Yang, Guowu; Hung, William N. N.; Zhang, Xinpeng; Fan, Fuyou

    2014-08-01

    Mapping IP cores to an on-chip network is an important step in Network-on-Chip (NoC) design and affects the performance of NoC systems. A mapping optimisation algorithm and a fault-tolerant mechanism are proposed in this article. The fault-tolerant mechanism and the corresponding routing algorithm can recover NoC communication from switch failures, while preserving high performance. The mapping optimisation algorithm is based on scatter search (SS), which is an intelligent algorithm with a powerful combinatorial search ability. To meet the requests of the NoC mapping application, the standard SS is improved for multiple objective optimisation. This method helps to obtain high-performance mapping layouts. The proposed algorithm was implemented on the Embedded Systems Synthesis Benchmarks Suite (E3S). Experimental results show that this optimisation algorithm achieves low-power consumption, little communication time, balanced link load and high reliability, compared to particle swarm optimisation and genetic algorithm.

  5. Approaches and challenges to optimising primary care teams’ electronic health record usage

    Directory of Open Access Journals (Sweden)

    Nancy Pandhi

    2014-07-01

    Full Text Available Background Although the presence of an electronic health record (EHR) alone does not ensure high quality, efficient care, few studies have focused on the work of those charged with optimising use of existing EHR functionality. Objective To examine the approaches used and challenges perceived by analysts supporting the optimisation of primary care teams' EHR use at a large U.S. academic health care system. Methods A qualitative study was conducted. Optimisation analysts and their supervisor were interviewed and data were analysed for themes. Results Analysts needed to reconcile the tension created by organisational mandates focused on the standardisation of EHR processes with the primary care teams' demand for EHR customisation. They gained an understanding of health information technology (HIT) leadership's and primary care teams' goals through attending meetings, reading meeting minutes and visiting with clinical teams. Within what was organisationally possible, EHR education could then be tailored to fit team needs. Major challenges were related to organisational attempts to standardise EHR use despite varied clinic contexts, personnel readiness and technical issues with the EHR platform. Forcing standardisation upon clinical needs that current EHR functionality could not satisfy was difficult. Conclusions Dedicated optimisation analysts can add value to health systems through playing a mediating role between HIT leadership and care teams. Our findings imply that EHR optimisation should be performed with an in-depth understanding of the workflow, cognitive and interactional activities in primary care.

  6. System optimisation for automatic wood-fired heating systems; Systemoptimierung automatischer Holzheizung - Projektphase 1

    Energy Technology Data Exchange (ETDEWEB)

    Good, J.; Nussbaumer, T. [Verenum, Zuerich (Switzerland); Jenni, A. [Ardens GmbH, Liestal (Switzerland); Buehler, R. [Umwelt und Energie, Maschwanden (Switzerland)

    2002-07-01

    This final report for the Swiss Federal Office of Energy (SFOE) presents the results of the first phase of a project that is to optimise the performance of existing automatic wood-fired heating systems in the range 330 kW to 1 MW from the ecological and economical points of view. The report presents the results of an initial phase of the project in which five selected installations were optimised in order to be able to assess the potential for optimisation in general. The study looks at the efficiency of the plant as far as heat generation and distribution are concerned. The report presents details on various factors measured such as operating hours, heat distribution, control strategies, fuel-quality requirements, integration in heating systems and safety aspects and compares power delivered with rated power. The authors consider the potential for optimisation to be high and suggest optimisation targets concerning consumer density in district heating schemes, full-load operating hours, minimum yearly operational efficiency and the control of heating power.

  7. Navigating catastrophes: Local but not global optimisation allows for macro-economic navigation of crises

    Science.gov (United States)

    Harré, Michael S.

    2013-02-01

    Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: Crashes in financial markets and whether or not traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, businesses collapsing and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium where growth dominates the relatively minor fluctuations in prices. Recent work from within economics as well as by physicists, psychologists and computational scientists has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that under very specific conditions local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required then under very mild assumptions market catastrophes are an unavoidable consequence. Third, if only local optimisation and economic growth are required then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.

  8. Induction Heating Process: 3D Modeling and Optimisation

    Science.gov (United States)

    Naar, R.; Bay, F.

    2011-05-01

    An increasing number of problems in mechanics and physics involve multiphysics coupled problems, and among these we often find electromagnetically coupled problems. Electromagnetic coupling may arise through the use of direct or induced currents for thermal purposes, in order to generate heat inside a workpiece so as to obtain either a prescribed temperature field or given mechanical or metallurgical properties through accurate control of the temperature evolution in time, or for solid or fluid mechanics purposes, in order to create magnetic forces as in fluid mechanics (electromagnetic stirring) or solid mechanics (magnetoforming). Induction heat treatment processes are therefore quite difficult to control; trying, for instance, to minimise the distortions generated by such a process is not easy. In order to achieve these objectives, we have developed a computational tool which includes an optimisation stage. A 3D finite element modelling tool for local quenching after induction heating has already been developed in our laboratory. Modelling such a multiphysics coupled process requires taking into account electromagnetic, thermal, mechanical and metallurgical phenomena, as well as their mutual interactions during the whole process: heating and quenching. The model developed is based on the Maxwell equations, the heat transfer equation, mechanical equilibrium computations, and the Johnson-Mehl-Avrami and Koistinen-Marburger laws. All these equations and laws may be coupled, but some couplings may be neglected. In our study, we also focus on an induction heating process aiming at optimising the Heat Affected Zone (HAZ). The problem is thus formalised as an optimisation problem, minimising a cost function which measures the difference between computed and optimal temperatures, along with some constraints on process parameters. The optimisation algorithms may be of two kinds, either zero-order or first-order algorithms. First

  9. Modelling and genetic algorithm based optimisation of inverse supply chain

    Science.gov (United States)

    Bányai, T.

    2009-04-01

    The design and control of recycling systems for products with environmental risk have been discussed worldwide for a long time. The main reasons to address this subject are the following: reduction of waste volume, intensification of material recycling, closing the loop, use of fewer resources and reduction of environmental risk [1, 2]. The development of recycling systems is based on the integrated solution of technological and logistic resources and know-how [3]. The financial conditions of recycling systems are partly based on the recovery, disassembly and remanufacturing options of the used products [4, 5, 6], but the investment and operation costs of recycling systems are characterised by high logistics costs, caused by geographically wide collection systems with several collection levels and a high number of operation points in the inverse supply chain. The reduction of these costs is a popular area of logistics research. This research includes the design and implementation of comprehensive environmental waste and recycling programmes to suit business strategies (global system), the design and supply of all equipment for production-line collection (external system), and the design of logistics processes to suit economic and ecological requirements (external system) [7]. To the knowledge of the author, there has been no research work on supply chain design problems whose purpose is the logistics-oriented optimisation of an inverse supply chain in the case of a non-linear total cost function comprising not only operation costs but also environmental risk costs. The antecedent of this research is that the author has taken part in several research projects in the field of the closed-loop economy ("Closing the loop of electr(on)ic products and domestic appliances from product planning to end-of-life technologies"), environmentally friendly disassembly ("Concept for logistical and environmental disassembly technologies") and design of recycling systems of household appliances

  10. Optimising root system hydraulic architectures for water uptake

    Science.gov (United States)

    Meunier, Félicien; Couvreur, Valentin; Draye, Xavier; Javaux, Mathieu

    2015-04-01

    In this study we started from a local hydraulic analysis of idealised root systems to develop the mathematical framework necessary for understanding global root system behaviour. The underlying assumption of this study was that the plant is naturally optimised for water uptake. The root system is thus a pipe network dedicated to the capture and transport of water. The main objective of the present research is to explain the fitness of the major types of root architectures to their environment. In a first step, we developed links between local hydraulic properties and macroscopic parameters of (un)branched roots. The outcomes of this approach were functions for the apparent conductance of the entire root system and for the uptake distribution along the roots. We compared our development with allometric scaling laws for root water uptake: under the same simplifying assumptions we were able to obtain the same results and even to extend them to more physiological cases. Using empirical data of measured root conductance, we were also able to fit the data set extremely well with this model. In a second stage we used generic architecture parameters and an existing root growth model to generate various types of root systems (from fibrous to tap-rooted). We then combined both sides (hydraulics and architecture) to maximise, under a volume constraint, either the apparent conductance of the root system or the soil volume explored by active roots during the plant growth period. This approach identified the sensitive parameters behind the macroscopic properties (conductance and location of water uptake) of each plant selected for this study. Scientific questions such as "What is the optimal sowing density of a given hydraulic architecture?" or "Which plant traits can we change to better explore the soil domain?" can also be addressed with this approach: some potential applications are illustrated. The next (and ultimate) phase will be to validate our conclusions with real architectures

  11. MACHINING OPTIMISATION AND OPERATION ALLOCATION FOR NC LATHE MACHINES IN A JOB SHOP MANUFACTURING SYSTEM

    Directory of Open Access Journals (Sweden)

    MUSSA I. MGWATU

    2013-08-01

    Full Text Available Numerical control (NC) machines in a job shop may not be cost and time effective if the assignment of cutting operations and the optimisation of machining parameters are overlooked. In order to justify better utilisation and higher productivity of invested NC machine tools, it is necessary to determine the optimum machining parameters and realise an effective assignment of cutting operations to machines. This paper presents two mathematical models for optimising machining parameters and effectively allocating turning operations on NC lathe machines in a job shop manufacturing system. The models are developed as non-linear programming problems and solved using the commercial LINGO software package. The results show that the decisions of machining optimisation and operation allocation on NC lathe machines can be made simultaneously while minimising both production cost and cycle time. In addition, the results indicate that production cost and cycle time can be minimised while significantly reducing or totally eliminating idle times among machines.
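
As a rough, self-contained illustration of the kind of non-linear machining model involved, the sketch below chooses cutting speed and feed to minimise cost per part under a simple feed-limiting constraint. The cost rates, Taylor-type tool-life constants and the roughness proxy are textbook-style assumptions, not the paper's LINGO models.

```python
# Toy machining-parameter NLP: minimise production cost per part over speed and feed.
import numpy as np
from scipy.optimize import minimize

D_mm, L_mm = 50.0, 200.0              # workpiece diameter and length
c0, ct, tc = 0.5, 2.5, 1.0            # machine rate [$/min], tool cost [$], tool-change time [min]

def cost(x):
    v, f = x                                              # cutting speed [m/min], feed [mm/rev]
    t_m = np.pi * D_mm * L_mm / (1000.0 * v * f)          # machining time per part [min]
    tool_life = 9e6 / (v ** 3 * f ** 1.5)                 # Taylor-type tool life [min], assumed
    return c0 * t_m + (ct + c0 * tc) * t_m / tool_life    # cost per part [$]

cons = [{"type": "ineq", "fun": lambda x: 3.0 - 100.0 * x[1] ** 2}]  # crude surface-finish proxy
res = minimize(cost, x0=[150.0, 0.2], bounds=[(50.0, 400.0), (0.05, 0.5)],
               constraints=cons, method="SLSQP")
print("speed [m/min], feed [mm/rev]:", np.round(res.x, 3), "cost [$]:", round(res.fun, 2))
```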

  12. Optimising Job-Shop Functions Utilising the Score-Function Method

    DEFF Research Database (Denmark)

    Nielsen, Erland Hejn

    2000-01-01

    During the last 1-2 decades, simulation optimisation of discrete event dynamic systems (DEDS) has made considerable theoretical progress with respect to computational efficiency. The score-function (SF) method and the infinitesimal perturbation analysis (IPA) are two candidates belonging to this … if the gradients are unbiased, the SA-algorithm will be known as a Robbins-Monro algorithm. The present work will focus on the SF method and show how to migrate it to general types of discrete event simulation systems, in this case represented by SIMNET II, and discuss how the optimisation of the functioning of a Job-Shop can be handled by the SF method.

  13. An investigation into the Gustafsson limit for small planar antennas using optimisation

    CERN Document Server

    Shahpari, Morteza; Lewis, Andrew

    2013-01-01

    The fundamental limit for small antennas provides a guide to the effectiveness of designs. Gustafsson et al., Yaghjian et al., and Mohammadpour-Aghdam et al. independently deduced a variation of the Chu-Harrington limit for planar antennas in different forms. Using a multi-parameter optimisation technique based on the ant colony algorithm, planar meander dipole antenna designs were selected on the basis of lowest resonant frequency and maximum radiation efficiency. The optimal antenna designs across the spectrum from 570 to 1750 MHz, occupying an area of 56 mm × 25 mm, were compared with these limits calculated using the polarizability tensor. The results were compared with Sievenpiper's comparison of published planar antenna properties. The optimised antennas have greater than 90% polarizability compared to the containing conductive box in the range 0.3 … optimisation algorithm. The generalized absorption efficiency of the small meander line antennas is less than 50%, and resu...

  14. Recent research development of process integration in analysis and optimisation of energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, F.X.X.; Vaideeswaran, L. [UMIST, Manchester (United Kingdom). Dept. of Process Integration

    2000-11-01

    The design of energy systems in a process plant requires a good understanding of each subsystem (e.g. processes, heat exchanger networks, utility systems), and of their interactions in the context of the overall plant. An effective design method should be able to explore the synergy between the subsystems to the maximum extent and allow users to interact with the design process. To achieve this, the most effective way is to combine physical insights with mathematical optimisation techniques. Physical insights are used as a wise man's brain and eyes, while optimisation techniques are employed as a superman's power in searching for optimal solutions. In the past, concepts and methods have been developed for handling grassroots design, operational management, retrofit and debottlenecking scenarios. This paper describes the recent research progress at UMIST in developing fundamental concepts and methodologies for the analysis and optimisation of energy systems. (author)

  15. Towards a Statistical Methodology to Evaluate Program Speedups and their Optimisation Techniques

    CERN Document Server

    Touati, Sid

    2009-01-01

    The community of program optimisation and analysis, code performance evaluation, parallelisation and optimising compilation has published hundreds of research and engineering articles in major conferences and journals over many decades. These articles study efficient algorithms, strategies and techniques to accelerate program execution times, or to optimise other performance metrics (MIPS, code size, energy/power, MFLOPS, etc.). Many speedups are published, but nobody is able to reproduce them exactly. The non-reproducibility of our research results is a dark point of the art, and we cannot be qualified as computer scientists if we do not provide rigorous experimental methodology. This article provides a first effort towards a correct statistical protocol for analysing and measuring speedups. As we will see, some common mistakes are made by the community in published articles, explaining part of the non-reproducibility of the results. Our current article is not sufficient on its own to deliver a comp...

  16. Global stability and optimisation of a general impulsive biological control model

    CERN Document Server

    Mailleret, Ludovic

    2008-01-01

    An impulsive model of augmentative biological control, consisting of a general continuous predator-prey model in ordinary differential equations augmented by a discrete part describing periodic introductions of predators, is considered. It is shown that there exists an invariant periodic solution that corresponds to prey eradication, and a condition ensuring its global asymptotic stability is given. An optimisation problem related to the preemptive use of augmentative biological control is then considered. It is assumed that the per-time-unit budget of biological control (i.e. the number of predators to be released) is fixed, and the best deployment of this budget is sought in terms of release frequency. The cost function to be minimised is the time taken to reduce an unforeseen prey (pest) invasion below some harmless level. The analysis shows that the optimisation problem admits a countably infinite number of solutions. An argumentation considering the required robustness of the optimisation result is the...

  17. Particle swarm optimisation driven low cost single event transient fault secured design during architectural synthesis

    Directory of Open Access Journals (Sweden)

    Anirban Sengupta

    2017-04-01

    Full Text Available Owing to aggressive shrinking at the nanometre scale as well as faster devices, particle strikes manifesting themselves as transient faults spanning multiple cycles and multiple units will be a central concern for application-specific datapaths generated through high-level synthesis (HLS)/architectural synthesis. Addressing each of the above problems separately leads to a large area/delay overhead; thus, tackling both problems concurrently incurs a huge overhead. To tackle this complex problem, this paper proposes a novel low-cost particle swarm optimisation driven dual modular redundant (DMR) HLS methodology for the generation of a design secured against both the temporal and spatial effects of transient faults. The authors' approach provides a low-cost, optimised, fault-secured solution through a particle swarm optimisation exploration framework based on user area-delay constraints. Results indicated that the proposed approach achieves an area overhead reduction of 34.08% and a latency overhead reduction of 5.8% compared with a recent approach.

  18. An analytical solution to patient prioritisation in radiotherapy based on utilitarian optimisation.

    Science.gov (United States)

    Ebert, M A; Li, W; Jennings, L

    2014-03-01

    The detrimental impact of a radiotherapy waiting list can in part be compensated for by patient prioritisation. Such prioritisation is phrased as an optimisation problem in which the probability of local control for the overall population is the objective to be maximised, and a simple analytical solution is derived. This solution is compared with a simulation of a waiting list for the same population of patients. It is found that the analytical solution can provide an optimal ordering of patients, though it cannot explicitly constrain optimal waiting times. The simulation-based solution was undertaken using both the analytical solution and a numerical optimisation routine for daily patient ordering. Both solutions provided very similar results, with the analytical approach reducing the calculation time of the numerical solution by several orders of magnitude. It is suggested that treatment delays due to resource limitations and the resulting waiting lists be incorporated into treatment optimisation, and that the derived analytical solution provides a mechanism for this to occur.
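
A generic, simplified illustration of utilitarian prioritisation follows: with an assumed exponential loss of local control per day of delay, a simple greedy ordering is compared against first-come-first-served. The decay rates and the greedy rule are illustrative assumptions and do not reproduce the paper's analytical solution.

```python
# Toy waiting-list prioritisation: maximise total expected tumour control probability (TCP).
import numpy as np

rng = np.random.default_rng(3)
n = 10
tcp0 = rng.uniform(0.5, 0.9, n)        # control probability if treated immediately (assumed)
decay = rng.uniform(0.001, 0.02, n)    # per-day loss rate, tumour-type dependent (assumed)
slot_days = np.arange(n)               # one treatment start per day

def total_tcp(order):
    """Population objective: sum of TCPs when patient order[k] starts on day k."""
    return float(np.sum(tcp0[order] * np.exp(-decay[order] * slot_days)))

fifo = np.arange(n)                                    # arbitrary arrival order
prioritised = np.argsort(-decay * tcp0)                # fastest-losing patients first (greedy)
print("total expected control, FIFO       :", round(total_tcp(fifo), 3))
print("total expected control, prioritised:", round(total_tcp(prioritised), 3))
```

The greedy rule is only a first-order heuristic; the point is to show how the ordering changes the population-level objective.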

  19. Simultaneous Topology, Shape, and Sizing Optimisation of Plane Trusses with Adaptive Ground Finite Elements Using MOEAs

    Directory of Open Access Journals (Sweden)

    Norapat Noilublao

    2013-01-01

    Full Text Available This paper proposes a novel integrated design strategy to accomplish simultaneous topology, shape and sizing optimisation of a two-dimensional (2D) truss. An optimisation problem is posed to find a structural topology, shape, and element sizes of the truss such that two objective functions, mass and compliance, are minimised. Design constraints include stress, buckling, and compliance. The procedure for an adaptive ground elements approach is proposed and its encoding/decoding process is detailed. Two sets of design variables defining truss layout, shape, and element sizes at the same time are applied. A number of multiobjective evolutionary algorithms (MOEAs) are implemented to solve the design problem. Comparative performance based on a hypervolume indicator shows that multiobjective population-based incremental learning (PBIL) is the best performer. Optimising three design variable types simultaneously is more efficient and effective.

  20. Optimisation of confinement in a fusion reactor using a nonlinear turbulence model

    CERN Document Server

    Highcock, E G; Barnes, M; Dorland, W

    2016-01-01

    The confinement of heat in the core of a magnetic fusion reactor is optimised using a multidimensional optimisation algorithm. For the first time in such a study, the loss of heat due to turbulence is modelled at every stage using first-principles nonlinear simulations which accurately capture the turbulent cascade and large-scale zonal flows. The simulations utilise a novel approach, with gyrofluid treatment of the small-scale drift waves and gyrokinetic treatment of the large-scale zonal flows. A simple near-circular equilibrium with standard parameters is chosen as the initial condition. The figure of merit, fusion power per unit volume, is calculated, and then two control parameters, the elongation and triangularity of the outer flux surface, are varied, with the algorithm seeking to optimise the chosen figure of merit. An optimal configuration is discovered at an elongation of 1.5 and a triangularity of 0.03.

  1. Optimisation of a Generic Ionic Model of Cardiac Myocyte Electrical Activity

    Directory of Open Access Journals (Sweden)

    Tianruo Guo

    2013-01-01

    Full Text Available A generic cardiomyocyte ionic model, whose complexity lies between a simple phenomenological formulation and a biophysically detailed ionic membrane current description, is presented. The model provides a user-defined number of ionic currents, employing two-gate Hodgkin-Huxley type kinetics. Its generic nature allows accurate reconstruction of action potential waveforms recorded experimentally from a range of cardiac myocytes. Using a multiobjective optimisation approach, the generic ionic model was optimised to accurately reproduce multiple action potential waveforms recorded from central and peripheral sinoatrial nodes and right atrial and left atrial myocytes from rabbit cardiac tissue preparations, under different electrical stimulus protocols and pharmacological conditions. When fitted simultaneously to multiple datasets, the time course of several physiologically realistic ionic currents could be reconstructed. Model behaviours tend to be well identified when extra experimental information is incorporated into the optimisation.
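
For readers unfamiliar with the formulation, the sketch below integrates a single generic two-gate Hodgkin-Huxley-type current of the kind such a model is built from; the gating parameters, leak and stimulus are illustrative values, not the paper's optimised ones.

```python
# One generic two-gate (activation x inactivation) Hodgkin-Huxley-type current.
import numpy as np
from scipy.integrate import solve_ivp

def gate_inf(v, v_half, k):
    """Boltzmann steady state for an activation (k > 0) or inactivation (k < 0) gate."""
    return 1.0 / (1.0 + np.exp(-(v - v_half) / k))

def rhs(t, y, g_max=1.0, e_rev=40.0, c_m=1.0):
    v, m, h = y
    i_ion = g_max * m * h * (v - e_rev)              # generic two-gate current
    i_leak = 0.05 * (v + 80.0)                       # linear leak pulling back to rest
    i_stim = -20.0 if 10.0 < t < 12.0 else 0.0       # brief stimulus [uA/uF]
    dm = (gate_inf(v, -30.0, 5.0) - m) / 2.0         # activation gate, tau = 2 ms
    dh = (gate_inf(v, -50.0, -5.0) - h) / 50.0       # inactivation gate, tau = 50 ms
    dv = -(i_ion + i_leak + i_stim) / c_m
    return [dv, dm, dh]

sol = solve_ivp(rhs, (0.0, 400.0), [-80.0, 0.0, 1.0], max_step=0.5)
print("peak membrane potential [mV]:", round(float(sol.y[0].max()), 1))
```

In a multiobjective fitting of the kind described, the gate parameters and conductances of several such currents would be the free variables.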

  2. Optimisation of polyherbal gels for vaginal drug delivery by Box-Behnken statistical design.

    Science.gov (United States)

    Chopra, Shruti; Motwani, Sanjay K; Iqbal, Zeenat; Talegaonkar, Sushma; Ahmad, Farhan J; Khar, Roop K

    2007-08-01

    The present research work aimed at the development and optimisation of mucoadhesive polyherbal gels (MPG) for vaginal drug delivery. As the rheological and mucoadhesive properties of the gels correlate well with each other, the prepared MPGs were optimised for maximum mucoadhesion using a relationship between the storage modulus (G') and Gel Index (GI), by employing a 3-factor, 3-level Box-Behnken statistical design. The independent variables studied were the polymer concentration (X1), honey concentration (X2) and aerosil concentration (X3). Aerosil has been investigated for the first time to improve the consistency of gels. The dependent variables studied were the elastic modulus G' (Y1), gel index (Y2) and maximum detachment force (Y3), with applied constraints of 500 … The optimised formulations were selected by feasibility and grid search. The three types of Carbopol studied were Carbopol 934P, Carbopol 974P and Polycarbophil. In vitro release studies were carried out for the optimised formulations and the data were fitted to release kinetics equations. Validation of the optimisation study with 8 confirmatory runs indicated a high degree of prognostic ability of the response surface methodology. The gels showed a gradual sustained release by a non-Fickian diffusion process. Incorporation of aerosil into the gels was found to improve the rheological and mucoadhesion properties by about 50-54% and 7-11%, respectively. The Box-Behnken design facilitated the optimisation of polyherbal gel formulations for enhanced vaginal drug delivery through optimum mucoadhesion and longer retention.
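
The coded design points of a 3-factor Box-Behnken design are easy to generate programmatically; the sketch below does so in Python. Factor names and the number of centre points are generic assumptions, not tied to the gel formulation above.

```python
# Generate a Box-Behnken design in coded levels (-1, 0, +1).
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    runs = []
    for i, j in combinations(range(n_factors), 2):   # each pair of factors
        for a, b in product((-1, 1), repeat=2):      # at their low/high levels
            run = [0] * n_factors                    # remaining factors at the centre
            run[i], run[j] = a, b
            runs.append(run)
    runs += [[0] * n_factors] * n_center             # replicated centre points
    return runs

design = box_behnken(3)
print(len(design), "runs")          # 12 edge-midpoint runs + 3 centre points = 15
for row in design:
    print(row)
```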

  3. Understanding the Relationship between Interactive Optimisation and Visual Analytics in the Context of Prostate Brachytherapy.

    Science.gov (United States)

    Liu, Jie; Dwyer, Tim; Marriott, Kim; Millar, Jeremy; Haworth, Annette

    2017-08-29

    The fields of operations research and computer science have long sought to find automatic solver techniques that can find high-quality solutions to difficult real-world optimisation problems. The traditional workflow is to exactly model the problem and then enter this model into a general-purpose "black-box" solver. In practice, however, many problems cannot be solved completely automatically, but require a "human-in-the-loop" to iteratively refine the model and give hints to the solver. In this paper, we explore the parallels between this interactive optimisation workflow and the visual analytics sense-making loop. We assert that interactive optimisation is essentially a visual analytics task and propose a problem-solving loop analogous to the sense-making loop. We explore these ideas through an in-depth analysis of a use-case in prostate brachytherapy, an application where interactive optimisation may be able to provide significant assistance to practitioners in creating prostate cancer treatment plans customised to each patient's tumour characteristics. However, current brachytherapy treatment planning is usually a careful, mostly manual process involving multiple professionals. We developed a prototype interactive optimisation tool for brachytherapy that goes beyond current practice in supporting focal therapy - targeting tumour cells directly rather than simply seeking coverage of the whole prostate gland. We conducted semi-structured interviews, in two stages, with seven radiation oncology professionals in order to establish whether they would prefer to use interactive optimisation for treatment planning and whether such a tool could improve their trust in the novel focal therapy approach and in machine generated solutions to the problem.

  4. Optimisation of decision making under uncertainty throughout field lifetime: A fractured reservoir example

    Science.gov (United States)

    Arnold, Dan; Demyanov, Vasily; Christie, Mike; Bakay, Alexander; Gopa, Konstantin

    2016-10-01

    Assessing the change in uncertainty in reservoir production forecasts over field lifetime is rarely undertaken because of the complexity of joining together the individual workflows. This becomes particularly important in complex fields such as naturally fractured reservoirs. The impact of this problem has been identified in previous work and many solutions have been proposed, but never implemented on complex reservoir problems due to the computational cost of quantifying uncertainty and optimising the reservoir development, specifically knowing how many and what kind of simulations to run. This paper demonstrates a workflow that propagates uncertainty throughout field lifetime, and into the decision-making process, by a combination of a metric-based approach, multi-objective optimisation and Bayesian estimation of uncertainty. The workflow propagates uncertainty estimates from appraisal into initial development optimisation, then updates uncertainty through history matching and finally propagates it into late-life optimisation. The combination of techniques applied, namely the metric approach and multi-objective optimisation, helps evaluate development options under uncertainty. This was achieved with a significantly reduced number of flow simulations, such that the combined workflow is computationally feasible to run for a real-field problem. This workflow is applied to two synthetic naturally fractured reservoir (NFR) case studies in appraisal, field development, history matching and mid-life EOR stages. The first is a simple sector model, while the second is a more complex full-field example based on a real-life analogue. This study infers geological uncertainty from an ensemble of models based on a Brazilian carbonate outcrop, which is propagated through the field lifetime, before and after the start of production, with the inclusion of production data significantly collapsing the spread of P10-P90 in reservoir forecasts. The workflow links uncertainty

  5. Reservoir optimisation using El Niño information. Case study of Daule Peripa (Ecuador)

    Science.gov (United States)

    Gelati, Emiliano; Madsen, Henrik; Rosbjerg, Dan

    2010-05-01

    The optimisation of water resources systems requires the ability to produce runoff scenarios that are consistent with available climatic information. We approach stochastic runoff modelling with a Markov-modulated autoregressive model with exogenous input, which belongs to the class of Markov-switching models. The model assumes runoff parameterisation to be conditioned on a hidden climatic state following a Markov chain, whose state transition probabilities depend on climatic information. This approach allows stochastic modeling of non-stationary runoff, as runoff anomalies are described by a mixture of autoregressive models with exogenous input, each one corresponding to a climate state. We calibrate the model on the inflows of the Daule Peripa reservoir located in western Ecuador, where the occurrence of El Niño leads to anomalously heavy rainfall caused by positive sea surface temperature anomalies along the coast. El Niño - Southern Oscillation (ENSO) information is used to condition the runoff parameterisation. Inflow predictions are realistic, especially at the occurrence of El Niño events. The Daule Peripa reservoir serves a hydropower plant and a downstream water supply facility. Using historical ENSO records, synthetic monthly inflow scenarios are generated for the period 1950-2007. These scenarios are used as input to perform stochastic optimisation of the reservoir rule curves with a multi-objective Genetic Algorithm (MOGA). The optimised rule curves are assumed to be the reservoir base policy. ENSO standard indices are currently forecasted at monthly time scale with nine-month lead time. These forecasts are used to perform stochastic optimisation of reservoir releases at each monthly time step according to the following procedure: (i) nine-month inflow forecast scenarios are generated using ENSO forecasts; (ii) a MOGA is set up to optimise the upcoming nine monthly releases; (iii) the optimisation is carried out by simulating the releases on the
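
The Markov-switching idea can be illustrated with a two-state AR(1) model driven by an exogenous climate index; the transition matrix, coefficients and noise levels below are invented for illustration and are not the calibrated Daule Peripa parameters (the paper additionally lets the transition probabilities depend on the climate information).

```python
# Two-regime Markov-switching AR(1) with an exogenous climate index (illustrative).
import numpy as np

rng = np.random.default_rng(1)
P = np.array([[0.9, 0.1],      # state 0 = "neutral"
              [0.3, 0.7]])     # state 1 = "El Nino-like"
phi   = [0.6, 0.8]             # AR coefficients per hidden state
beta  = [0.0, 1.5]             # weight on the exogenous index per state
sigma = [0.3, 0.8]             # noise scale per state

def simulate(index, n=120):
    """Simulate n months of runoff anomalies driven by an exogenous index."""
    s, y, out = 0, 0.0, []
    for t in range(n):
        s = rng.choice(2, p=P[s])                       # hidden climate-state transition
        y = phi[s] * y + beta[s] * index[t] + sigma[s] * rng.normal()
        out.append(y)
    return np.array(out)

climate_index = rng.normal(size=120)                    # stand-in for an ENSO index
print("simulated anomaly std:", simulate(climate_index).std().round(2))
```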

  6. Thermodynamic optimisation and analysis of four Kalina cycle layouts for high temperature applications

    DEFF Research Database (Denmark)

    Modi, Anish; Haglind, Fredrik

    2015-01-01

    The Kalina cycle has seen increased interest in the last few years as an efficient alternative to the conventional steam Rankine cycle. However, the available literature gives little information on the algorithms to solve or optimise this inherently complex cycle. This paper presents a detailed approach to solve and optimise a Kalina cycle for high temperature (a turbine inlet temperature of 500°C) and high pressure (over 100 bar) applications using a computationally efficient solution algorithm. A central receiver solar thermal power plant with direct steam generation was considered as a case...

  7. Optimisation of a wet FGD pilot plant using fine limestone and organic acids

    DEFF Research Database (Denmark)

    Frandsen, Jan; Kiil, Søren; Johnsson, Jan Erik

    2001-01-01

    The effects of adding an organic acid or using a limestone with a fine particle size distribution (PSD) have been examined in a wet flue gas desulphurisation (FGD) pilot plant. Optimisation of the plant with respect to the degree of desulphurisation and the residual limestone content of the gypsum …, but the residual limestone content in the gypsum increased to somewhere between 19 and 30 wt%, making this pH range unsuitable for use in a full-scale plant. The investigations have shown that both the addition of organic acids and the use of a limestone with a fine PSD can be used to optimise wet FGD plants. (C...

  8. Vertical transportation systems embedded on shuffled frog leaping algorithm for manufacturing optimisation problems in industries.

    Science.gov (United States)

    Aungkulanon, Pasura; Luangpaiboon, Pongchanun

    2016-01-01

    Response surface methods via first- or second-order models are important in manufacturing processes. This study, however, proposes differently structured mechanisms of vertical transportation systems (VTS) embedded in a shuffled frog leaping-based approach. There are three VTS scenarios: a motion reaching a normal operating velocity, and motions both reaching and not reaching the transitional motion. These variants were performed to simultaneously inspect multiple responses affected by machining parameters in multi-pass turning processes. The numerical results of two machining optimisation problems demonstrated the high performance measures of the proposed methods when compared with other optimisation algorithms for an actual deep cut design.

  9. Optimisation of [11C]Raclopride production using a Synthra GPextent system.

    Science.gov (United States)

    Perkins, Gary; Sheth, Rajeev; Greguric, Ivan; Pascali, Giancarlo

    2014-01-01

    The dopamine D2 receptor radiotracer [(11)C]Raclopride is used extensively in clinical and preclinical imaging. Currently, a wide range of methods to produce [(11)C]Raclopride have been developed using traditional vessel reactions as well as cartridge or captive solvent. This work reports the optimisation of the production of [(11)C]Raclopride using a Synthra GPextent, comparing various methods. With optimised conditions, we were able to obtain 4±2% (ndc) yield of [(11)C]Raclopride (100 GBq [(11)C]CO2, n = 42) in 25 min. The radiochemical purity was >95% with specific activities of 135±41 MBq/nmol at end of synthesis.

  10. LHCb: Design of a Highly Optimised Vacuum Chamber Support for the LHCb Experiment

    CERN Multimedia

    Leduc, L; Veness, R

    2011-01-01

    The beam vacuum chamber in the LHCb experimental area passes through the centre of a large aperture dipole magnet. The vacuum chamber and all its support systems lie in the acceptance of the detector, so must be highly optimised for transparency to particles. As part of the upgrade programme for the LHCb vacuum system, the support system has been re-designed using advanced lightweight materials. In this paper we discuss the physics motivation for the modifications, the criteria for the selection of materials and tests performed to qualify them for the particular environment of a particle physics experiment. We also present the design of the re-optimised support system.

  11. Optimisations and evolution of the mammalian respiratory system : A suggestion of possible gene sharing in evolution.

    Science.gov (United States)

    Sapoval, Bernard; Filoche, Marcel

    2013-09-01

    The respiratory system of mammals is made of two successive branched structures with different physiological functions. The upper structure, or bronchial tree, is a fluid transportation system made of approximately 15 generations of bifurcations leading to on the order of 2^15 ≈ 30,000 terminal bronchioles with a diameter of approximately 0.5 mm in the human lung. The branching pattern continues up to generation 23, but the structure and function of each of the subsequent structures, called acini, is different. Each acinus consists of a branched system of ducts surrounded by alveoli and plays the role of a diffusion cell where oxygen and carbon dioxide are exchanged with blood across the alveolar membrane. We show here that the bronchial tree simultaneously presents several different optimal properties. First, it is energy efficient; second, it is space filling; and third, it is also "rapid". This physically based multi-optimality suggests that, in the course of evolution, an organ selected on one criterion could have been used later for a totally different purpose. For example, once selected for its energetic efficiency for the transport of a viscous fluid like blood, the same genetic material could have been used for its optimised rapidity. This would have allowed the emergence of atmospheric respiration made of inspiration-expiration cycles. For this phenomenon to exist, rapidity is essential, as fresh air has to reach the gas exchange organs, the pulmonary acini, before the beginning of expiration. We finally show that the pulmonary acinus is optimised in the sense that the acinus morphology is directly related to the notion of a "best possible" extraction of entropic energy by a diffusion exchanger that has to feed oxygen efficiently from air to blood across a membrane of finite permeability.

  12. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry

    NARCIS (Netherlands)

    Schutyser, M.A.I.; Straatsma, J.; Keijzer, P.M.; Verschueren, M.; Jong, de P.

    2008-01-01

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. I

  13. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry

    NARCIS (Netherlands)

    Schutyser, M.A.I.; Straatsma, J.; Keijzer, P.M.; Verschueren, M.; Jong, de P.

    2008-01-01

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs.

  14. Optimising ICT effectiveness in instruction and learning: Multilevel transformation theory and a pilot project in secondary education

    NARCIS (Netherlands)

    Mooij, Ton

    2016-01-01

    Specific combinations of educational and ICT conditions including computer use may optimise learning processes, particularly for learners at risk. This position paper asks which curricular, instructional, and ICT characteristics can be expected to optimise learning processes and outcomes, and how to

  15. Optimising ICT effectiveness in instruction and learning: Multilevel transformation theory and a pilot project in secondary education

    NARCIS (Netherlands)

    Mooij, Ton

    2016-01-01

    Specific combinations of educational and ICT conditions including computer use may optimise learning processes, particularly for learners at risk. This position paper asks which curricular, instructional, and ICT characteristics can be expected to optimise learning processes and outcomes, and how to

  16. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry

    NARCIS (Netherlands)

    Schutyser, M.A.I.; Straatsma, J.; Keijzer, P.M.; Verschueren, M.; Jong, de P.

    2008-01-01

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. I

  17. MULTIMODAL FUNCTION OPTIMISATION BASED ON IMPROVED GLOWWORM SWARM OPTIMISATION

    Institute of Scientific and Technical Information of China (English)

    吴伟民; 亢少将; 林志毅; 郭涛

    2014-01-01

    In order to improve the performance of glowworm swarm optimisation (GSO) on multimodal function optimisation, and to address its low peak discovery rate, slow convergence and limited accuracy, we propose an improved glowworm swarm optimisation (IGSO) in which each glowworm (agent) adaptively searches for peaks and moves with a variable step size. IGSO introduces a tentative moving strategy to enhance the search ability of the algorithm and, at the same time, adjusts the agent's step size with reference to the average neighbourhood distance. Experiments on typical multimodal test functions indicate that IGSO is superior to GSO in multimodal function optimisation, with a higher peak discovery rate, faster convergence and higher accuracy.
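
A compact sketch of the basic GSO update loop is given below, with a step size crudely scaled by the average neighbour distance as a nod to the adaptive-step idea; the constants and the test function are illustrative, and this is not the authors' IGSO.

```python
# Basic glowworm swarm optimisation (GSO) loop on a 2-D multimodal test function.
import numpy as np

rng = np.random.default_rng(2)

def peaks(x):                                  # illustrative multimodal function
    return np.sin(x[:, 0]) * np.cos(x[:, 1]) + 0.1 * np.cos(3 * x[:, 0])

n, dim, iters = 50, 2, 200
rho, gamma, radius, base_step = 0.4, 0.6, 2.0, 0.05
pos = rng.uniform(-3, 3, (n, dim))
luc = np.full(n, 5.0)                          # luciferin levels

for _ in range(iters):
    luc = (1 - rho) * luc + gamma * peaks(pos)              # luciferin update
    for i in range(n):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbrs = np.where((d < radius) & (luc > luc[i]))[0]   # brighter neighbours in range
        if nbrs.size == 0:
            continue
        w = luc[nbrs] - luc[i]
        j = rng.choice(nbrs, p=w / w.sum())                 # move towards a brighter neighbour
        step = base_step * (1 + d[nbrs].mean())             # crude adaptive step size
        pos[i] += step * (pos[j] - pos[i]) / (d[j] + 1e-12)

print("best value found:", round(peaks(pos).max(), 3))
```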

  18. Optimisation and inhibition of anaerobic digestion of livestock manure

    Energy Technology Data Exchange (ETDEWEB)

    Sutaryo, S.

    2012-11-15

    The optimisation process during this PhD study focused on mixed enzyme (ME) addition, thermal pre-treatment and co-digestion of raw manure with solid fractions of acidified manure, while for inhibition processes, ammonia and sulphide inhibition were studied. ME addition increased the methane yield of both dairy cow manure (DCM) and solid fractions of DCM (by 4.44% and 4.15% respectively, compared to the control) when ME was added to manure and incubated prior to anaerobic digestion (AD). However, no positive effect was found when ME was added to manure and fed immediately to either mesophilic (35 deg. C) or thermophilic (50 deg. C) digesters. Low-temperature pre-treatment (65 deg. C to 80 deg. C for 20 h) followed by batch assays increased the methane yield of pig manure in the range from 9.5% to 26.4% at 11 d of incubation. These treatments also increased the methane yield of the solid fraction of pig manure in the range from 6.1% to 25.3% at 11 d of the digestion test. However, at 90 d the increase in methane yield of pig manure was only significant for the 65 deg. C treatment; thus low-temperature thermal pre-treatment increased the rate of gas production but did not increase the ultimate yield (B0). High-temperature pre-treatment (100 deg. C to 225 deg. C for 15 min.) increased the methane yield of DCM by 13% and 21% for treatments at 175 deg. C and 200 deg. C, respectively, at 27 d of batch assays. For pig manure, methane yield was increased by 29% following the 200 deg. C treatment and 27 d of a batch digestion test. No positive effect of high-temperature pre-treatment was found on the methane yield of chicken manure. At the end of the experiment (90 d), high-temperature thermal pre-treatment significantly increased the B0 of pig manure and DCM. Acidification of animal manure using sulphuric acid is a well-known technology to reduce ammonia emission from animal manure. AD of acidified manure showed sulphide inhibition and consequently methane production was 45

  19. Optimised low-dose multidetector CT protocol for children with cranial deformity

    Energy Technology Data Exchange (ETDEWEB)

    Vazquez, Jose Luis [Complejo Hospitalario Universitario de Vigo, Department of Radiology, Vigo, Pontevedra (Spain); Pombar, Miguel Angel [Complejo Hospitalario Universitario de Santiago, Department of Radiophysics, Santiago de Compostela, La Coruna (Spain); Pumar, Jose Manuel [Complejo Hospitalario Universitario de Santiago, Department of Radiology, Santiago de Compostela, La Coruna (Spain); Campo, Victor Miguel del [Complejo Hospitalario Universitario de Vigo, Department of Public Health, Vigo, Pontevedra (Spain)

    2013-08-15

    To present an optimised low-dose multidetector computed tomography (MDCT) protocol for the study of children with cranial deformity. Ninety-one consecutive MDCT studies were performed in 80 children. Studies were performed with either our standard head CT protocol (group 1, n = 20) or a low-dose cranial deformity protocol (groups 2 and 3): group 2 (n = 38) with an initial version and group 3 (n = 33) with a final, more optimised version. All studies were performed on the same 64-MDCT equipment. The cranial deformity protocol was gradually optimised by decreasing kVp, limiting the mA range, using automatic exposure control (AEC) and increasing the noise index (NI). Image quality was assessed. Dose indicators such as CT dose index volume (CTDIvol), dose-length product (DLP) and effective dose (E) were used. The optimised low-dose protocol reached the following values: 80 kVp, mA range 50-150 and NI = 23. We achieved a maximum dose reduction of 10-22 times in the 1- to 12-month-old cranium with regard to the 2004 European guidelines for MDCT. The result is a low-dose MDCT protocol that may be used as the first diagnostic imaging option in clinically selected patients with skull abnormalities. (orig.)

  20. Décomposition-coordination en optimisation déterministe et stochastique

    CERN Document Server

    Carpentier, Pierre

    2017-01-01

    This book deals with the treatment of large-scale optimisation problems. The idea is to split the global optimisation problem into smaller sub-problems, which are therefore easier to solve, each involving one of the sub-systems (decomposition), but without giving up the global optimum, which requires the use of an iterative procedure (coordination). This subject was covered in several books published in the 1970s in the context of deterministic optimisation. We present here the essential principles and methods of decomposition-coordination through typical situations, and then propose a general framework that makes it possible to construct correct algorithms and to study their convergence. This theory is presented in the context of both deterministic and stochastic optimisation. This material has been taught by the authors in various graduate courses and has also been put to work in numerous industrial applications. Exerc...

  1. Optimising magnetic sentinel lymph node biopsy in an in vivo porcine model

    NARCIS (Netherlands)

    Ahmed, M.; Anninga, Bauke; Pouw, Joost Jacob; Vreemann, S.; Peek, M.; van Hemelrijck, Mieke; Pinder, Sara E.; ten Haken, Bernard; Pankhurst, Quentin A.; Douek, Michael

    2015-01-01

    The magnetic technique for sentinel lymph node biopsy (SLNB) has been evaluated in several clinical trials. An in vivo porcine model was developed to optimise the magnetic technique by evaluating the effect of differing volume, concentration and time of injection of magnetic tracer. A total of 60

  2. Statistical meandering wake model and its application to yaw-angle optimisation of wind farms

    Science.gov (United States)

    Thøgersen, E.; Tranberg, B.; Herp, J.; Greiner, M.

    2017-05-01

    The wake produced by a wind turbine is dynamically meandering and of rather narrow nature. Only when looking at large time averages, the wake appears to be static and rather broad, and is then well described by simple engineering models like the Jensen wake model (JWM). We generalise the latter deterministic models to a statistical meandering wake model (SMWM), where a random directional deflection is assigned to a narrow wake in such a way that on average it resembles a broad Jensen wake. In a second step, the model is further generalised to wind-farm level, where the deflections of the multiple wakes are treated as independently and identically distributed random variables. When carefully calibrated to the Nysted wind farm, the ensemble average of the statistical model produces the same wind-direction dependence of the power efficiency as obtained from the standard Jensen model. Upon using the JWM to perform a yaw-angle optimisation of wind-farm power output, we find an optimisation gain of 6.7% for the Nysted wind farm when compared to zero yaw angles and averaged over all wind directions. When applying the obtained JWM-based optimised yaw angles to the SMWM, the ensemble-averaged gain is calculated to be 7.5%. This outcome indicates the possible operational robustness of an optimised yaw control for real-life wind farms.
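
The relationship between the broad deterministic wake and the ensemble of deflected narrow wakes can be sketched directly from the Jensen top-hat deficit formula; the decay constants, thrust coefficient and deflection spread below are assumed values rather than the Nysted calibration.

```python
# Compare a broad Jensen wake deficit with the ensemble mean of randomly deflected
# narrow wakes, in the spirit of the statistical meandering wake model (SMWM).
import numpy as np

D, ct = 80.0, 0.8                           # rotor diameter [m], thrust coefficient (assumed)
x = 7 * D                                   # downstream distance

def jensen_deficit(y, k):
    """Top-hat Jensen velocity deficit at lateral offset y for wake-decay constant k."""
    r = D / 2 + k * x                       # linearly expanded wake radius
    inside = np.abs(y) <= r
    return inside * (1 - np.sqrt(1 - ct)) / (1 + 2 * k * x / D) ** 2

y = np.linspace(-300, 300, 601)
broad = jensen_deficit(y, 0.05)             # standard broad Jensen wake

rng = np.random.default_rng(0)
narrow_k = 0.01                             # narrow wake expands slowly
deflections = rng.normal(0.0, 40.0, size=2000)   # random lateral meandering [m] (assumed spread)
ensemble = np.mean([jensen_deficit(y - d, narrow_k) for d in deflections], axis=0)

print("centreline deficit, broad Jensen :", round(broad[300], 3))
print("centreline deficit, SMWM average :", round(ensemble[300], 3))
```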

  3. An empirical study on website usability elements and how they affect search engine optimisation

    Directory of Open Access Journals (Sweden)

    Eugene B. Visser

    2011-03-01

    Full Text Available The primary objective of this research project was to identify and investigate the website usability attributes which are in contradiction with search engine optimisation elements. The secondary objective was to determine if these usability attributes affect conversion. Although the literature review identifies the contradictions, experts disagree about their existence. An experiment was conducted whereby the conversion and/or traffic ratio results of an existing control website were compared with those of a usability-designed version of the control website, namely the experimental website. All optimisation elements were ignored, thus implementing only usability. The results clearly show that inclusion of the usability attributes positively affects conversion, indicating that usability is a prerequisite for effective website design. Search engine optimisation is also a prerequisite for the very reason that if a website does not rank on the first page of the search engine results page for a given keyword, then that website might as well not exist. According to this empirical work, usability is in contradiction with search engine optimisation best practices. Therefore the two need to be weighed up in terms of their importance to search engines and visitors.

  4. A study into ant colony optimisation, evolutionary computation and constraint programming on binary constraint satisfaction problems.

    NARCIS (Netherlands)

    J.I. van Hemert; C. Solnon

    2004-01-01

    We compare two heuristic approaches, evolutionary computation and ant colony optimisation, and a complete tree-search approach, constraint programming, for solving binary constraint satisfaction problems. We experimentally show that, if evolutionary computation is far from being able to

  5. Dispersion-Flattened Composite Highly Nonlinear Fibre Optimised for Broadband Pulsed Four-Wave Mixing

    DEFF Research Database (Denmark)

    Lillieholm, Mads; Galili, Michael; Oxenløwe, Leif Katsuo

    2016-01-01

    We present a segmented composite HNLF optimised for mitigation of dispersion-fluctuation impairments for broadband pulsed four-wave mixing. The HNLF segmentation allows for pulsed FWM processing of a 13-nm wide input WDM signal with -4.6-dB conversion efficiency...

  6. Application of the genetic algorithm for optimisation of large solar hot water systems

    NARCIS (Netherlands)

    Loomans, M.G.L.C.; Visser, H.

    2002-01-01

    An implementation of the genetic algorithm in a design support tool for (large) solar hot water systems is described. The tool calculates the yield and the costs of solar hot water systems based on technical and financial data of the system components. The genetic algorithm allows for optimisation o

  7. Time course of specific AGEs during optimised glycaemic control in type 2 diabetes

    NARCIS (Netherlands)

    Mentink, CJAL; Kilhovd, BK; Rondas-Colbers, GJWM; Torjesen, PA; Wolffenbuttel, BHR

    2006-01-01

    Background: Several advanced glycation endproducts (AGEs) are formed in the hyperglycaemic state. Although serum AGEs correlate with average glycaemic control in patients with type 2 diabetes and predict the development of complications, it is not known how serum AGEs change during optimisation of d

  8. Topology optimisation of passive coolers for light-emitting diode lamps

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    2015-01-01

    This work applies topology optimisation to the design of passive coolers for light-emitting diode (LED) lamps. The heat sinks are cooled by the natural convection currents arising from the temperature difference between the LED lamp and the surrounding air. A large scale parallel computational...

  9. A numerical method to optimise the spatial dose distribution in carbon ion radiotherapy planning.

    Science.gov (United States)

    Grzanka, L; Korcyl, M; Olko, P; Waligorski, M P R

    2015-09-01

    The authors describe a numerical algorithm to optimise the entrance spectra of a composition of pristine carbon ion beams which delivers a pre-assumed dose-depth profile over a given depth range within the spread-out Bragg peak. The physical beam transport model is based on tabularised data generated using the SHIELD-HIT10A Monte-Carlo code. Depth-dose profile optimisation is achieved by minimising the deviation from the pre-assumed profile evaluated on a regular grid of points over a given depth range. This multi-dimensional minimisation problem is solved using the L-BFGS-B algorithm, with parallel processing support. Another multi-dimensional interpolation algorithm is used to calculate at given beam depths the cumulative energy-fluence spectra for primary and secondary ions in the optimised beam composition. Knowledge of such energy-fluence spectra for each ion is required by the mixed-field calculation of Katz's cellular Track Structure Theory (TST) that predicts the resulting depth-survival profile. The optimisation algorithm and the TST mixed-field calculation are essential tools in the development of a one-dimensional kernel of a carbon ion therapy planning system. All codes used in the work are generally accessible within the libamtrack open source platform.
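
The weight-optimisation step can be sketched with SciPy's L-BFGS-B on a toy basis of pristine depth-dose curves; the analytic peak shape below is a crude stand-in for the SHIELD-HIT10A tables, and the target is simply a flat unit dose over the spread-out Bragg peak (SOBP) region.

```python
# Fit non-negative weights of pristine Bragg-peak-like curves to a flat SOBP target.
import numpy as np
from scipy.optimize import minimize

depth = np.linspace(0.0, 20.0, 200)                      # depth grid [cm]
peak_depths = np.linspace(8.0, 15.0, 15)                 # pristine-beam ranges [cm]

def pristine(d0):
    """Toy pristine depth-dose: a low plateau plus a sharp peak near range d0."""
    plateau = 0.3 * (depth < d0)
    peak = np.exp(-0.5 * ((depth - d0) / 0.4) ** 2)
    return plateau + peak

basis = np.array([pristine(d0) for d0 in peak_depths])   # shape (n_beams, n_depths)
grid = (depth >= 8.0) & (depth <= 15.0)                  # evaluation points over the SOBP

def objective(w):
    dose = w @ basis
    return float(np.sum((dose[grid] - 1.0) ** 2))        # deviation from flat unit dose

res = minimize(objective, x0=np.full(len(peak_depths), 0.1),
               method="L-BFGS-B", bounds=[(0.0, None)] * len(peak_depths))
print("residual:", round(res.fun, 4))
```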

  10. Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics.

    Science.gov (United States)

    Trianni, Vito; López-Ibáñez, Manuel

    2015-01-01

    The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics.

  11. Shape Optimisation of Holes in Loaded Plates by Minimisation of Multiple Stress Peaks

    Science.gov (United States)

    2015-04-01


  12. Direct drive TFPM wind generator analytical design optimised for minimum active mass usage

    DEFF Research Database (Denmark)

    Nica, Florin Valentin Traian; Leban, Krisztina Monika; Ritchie, Ewen

    2013-01-01

    The paper focuses on the Transverse Flux Permanent Magnet (TFPM) generator as a solution for offshore direct-drive wind turbines. A complex design algorithm is presented. Two topologies (U core and C core) of TFPM were considered. The analytical design is optimised using a combination of genetic algorithms and three-dimensional finite element (FEM) analyses to obtain a minimum active mass for the generator.

  13. MIMO-Radar Waveform Design for Beampattern Using Particle-Swarm-Optimisation

    KAUST Repository

    Ahmed, Sajid

    2012-07-31

    Multiple input multiple output (MIMO) radars have many advantages over their phased-array counterparts: improved spatial resolution, better parametric identifiability, and greater flexibility to achieve the desired transmit beampattern. Achieving a desired transmit beampattern with MIMO radar requires the waveforms to have arbitrary auto- and cross-correlations. To design such waveforms, generally a waveform covariance matrix, R, is synthesised first and then the actual waveforms are designed. Synthesis of the covariance matrix, R, is a constrained optimisation problem, which requires R to be positive semidefinite and all of its diagonal elements to be equal. To simplify the first constraint the covariance matrix is synthesised indirectly from its square-root matrix U, while for the second constraint the elements of the m-th column of U are parameterised using the coordinates of the m-hypersphere. This implicitly fulfils both constraints and enables the cost function to be written in closed form. The cost function is then optimised using a simple particle-swarm-optimisation (PSO) technique, which requires only the cost function and can optimise any choice of norm cost function. © 2012 IEEE.
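
    The covariance-synthesis idea can be sketched as follows; SciPy's differential_evolution stands in for the paper's PSO, and the array geometry, desired beampattern and matrix size are illustrative assumptions.

```python
# Sketch of covariance-matrix synthesis via hypersphere parameterisation of the
# square-root matrix U. A stand-in global optimiser (differential_evolution)
# replaces the paper's PSO; geometry and desired beampattern are invented.
import numpy as np
from scipy.optimize import differential_evolution

N = 6                                    # transmit antennas, half-wavelength ULA
theta = np.radians(np.linspace(-90, 90, 181))
A = np.exp(1j * np.pi * np.outer(np.arange(N), np.sin(theta)))   # steering matrix
desired = np.where(np.abs(np.degrees(theta)) <= 20, 1.0, 0.0)    # wide mainlobe

def hypersphere_column(angles):
    """Unit-norm vector from (N-1) spherical angles: implicit norm constraint."""
    v, s = np.empty(len(angles) + 1), 1.0
    for k, a in enumerate(angles):
        v[k] = s * np.cos(a)
        s *= np.sin(a)
    v[-1] = s
    return v

def cost(x):
    # Each column of U has unit norm, so R = U.T @ U is positive semidefinite
    # with an all-ones diagonal: both constraints are handled by construction.
    U = np.column_stack([hypersphere_column(x[m*(N-1):(m+1)*(N-1)]) for m in range(N)])
    R = U.T @ U
    pattern = np.real(np.einsum("ij,jk,ik->i", A.conj().T, R, A.T))
    return float(np.mean((pattern / pattern.max() - desired) ** 2))

bounds = [(0.0, 2 * np.pi)] * (N * (N - 1))
result = differential_evolution(cost, bounds, maxiter=100, seed=1, polish=False)
print("beampattern MSE:", result.fun)
```

    Because the unit-norm columns enforce the equal-diagonal and positive-semidefinite constraints implicitly, the optimiser only ever sees an unconstrained box of angles, which is what makes a simple swarm-style search applicable.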

  14. Optimisation of ultrafiltration of a highly viscous protein solution using spiral-wound modules

    DEFF Research Database (Denmark)

    Lipnizki, Jens; Casani, S.; Jonsson, Gunnar Eigil

    2005-01-01

    The ultrafiltration process of highly viscous protein process water with spiral-wound modules was optimised by analysing the fouling and developing a strategy to reduce it. It was shown that the flux reduction during filtration is mainly caused by the adsorption of proteins on the membrane and no...

  15. Optimising Job-Shop Functions Utilising the Score-Function Method

    DEFF Research Database (Denmark)

    Nielsen, Erland Hejn

    2000-01-01

    if the gradients are unbiased, the SA-algorithm will be known as a Robbins-Monro-algorithm. The present work will focus on the SF method and show how to migrate it to general types of discrete event simulation systems, in this case represented by SIMNET II, and discuss how the optimisation of the functioning...
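
    The score-function (SF) method referred to here estimates gradients of an expected performance measure from simulation output via the likelihood-ratio identity, roughly grad_theta E[f(X)] = E[f(X) * grad_theta log p(X; theta)]. The sketch below is a minimal, SIMNET-independent illustration with an exponentially distributed input; the cost function is invented.

```python
# Minimal illustration of the score-function (likelihood-ratio) gradient
# estimator: grad_theta E[f(X)] = E[ f(X) * d/dtheta log p(X; theta) ].
# The exponential input distribution and the cost function are toy choices.
import numpy as np

rng = np.random.default_rng(0)

def estimate_gradient(rate, n=200_000):
    """X ~ Exp(rate); f(X) is a toy cost; return the SF gradient estimate."""
    x = rng.exponential(1.0 / rate, size=n)
    f = x ** 2 + 3.0 * x                        # performance measure of one run
    score = 1.0 / rate - x                      # d/d(rate) log p(x; rate)
    return np.mean(f * score)

rate = 1.5
sf_grad = estimate_gradient(rate)
# Analytical check: E[X^2 + 3X] = 2/rate^2 + 3/rate, derivative = -4/rate^3 - 3/rate^2.
print(sf_grad, -4 / rate**3 - 3 / rate**2)
```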

  16. Optimisation of an idealised ocean model, stochastic parameterisation of sub-grid eddies

    CERN Document Server

    Cooper, Fenwick C

    2014-01-01

    An optimisation scheme is developed to accurately represent the sub-grid scale forcing of a high dimensional chaotic ocean system. Using a simple parameterisation scheme, the velocity components of a 30km resolution shallow water ocean model are optimised to have the same climatological mean and variance as that of a less viscous 7.5km resolution model. The 5 day lag-covariance is also optimised, leading to a more accurate estimate of the high resolution response to forcing using the low resolution model. The system considered is an idealised barotropic double gyre that is chaotic at both resolutions. Using the optimisation scheme, we find and apply the constant in time, but spatially varying, forcing term that is equal to the time integrated forcing of the sub-mesoscale eddies. A linear stochastic term, independent of the large-scale flow, with no spatial correlation but a spatially varying amplitude and time scale is used to represent the transient eddies. The climatological mean, variance and 5 day lag-cov...
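
    The linear stochastic term described here can be illustrated by a per-grid-point red-noise (Ornstein-Uhlenbeck) process with spatially varying amplitude and time scale and no spatial correlation; the grid size, time step and parameter fields below are invented.

```python
# Sketch of a red-noise (Ornstein-Uhlenbeck) transient-eddy forcing term with a
# spatially varying amplitude and time scale and no spatial correlation.
# Grid size, time step and the sigma/tau fields are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
nx, ny, dt = 64, 64, 3600.0                          # grid points, time step [s]
sigma = 1e-6 * (1.0 + rng.random((ny, nx)))          # forcing amplitude per point
tau = 86400.0 * (1.0 + 4.0 * rng.random((ny, nx)))   # decorrelation time [s]

eta = np.zeros((ny, nx))                             # stochastic forcing field
phi = np.exp(-dt / tau)                              # per-point decay factor
for _ in range(1000):
    noise = rng.standard_normal((ny, nx))
    # Exact OU update over one step: mean reverts to 0, stationary std = sigma.
    eta = phi * eta + sigma * np.sqrt(1.0 - phi**2) * noise
# 'eta' would be added to the momentum tendency alongside the constant-in-time
# mean eddy forcing obtained from the optimisation.
print(eta.std(), sigma.mean())
```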

  17. Brief Cognitive Behavioural Therapy Compared to Optimised General Practitioners? Care for Depression: A Randomised Trial

    NARCIS (Netherlands)

    Schene, A. H.; Baas, K. D.; Koeter, M.; Lucassen, P.; Bockting, C. L. H.; Wittkampf, K. F.; van Weert, H. C.; Huyser, J.

    2014-01-01

    Background: How to treat Major Depressive Disorder (MDD) in primary care? Studies that compared (brief) Cognitive Behavioural Therapy (CBT) with care as usual by the General Practitioner (GP) found the first to be more effective. However, to make a fair comparison GP care should be optimised and pro

  18. Brief Cognitive Behavioural Therapy compared to optimised general practitioners’ care for depression : A randomised trial

    NARCIS (Netherlands)

    Schene, A.H.; Baas, K.D.; Koeter, M.W.J.; Lucassen, P.; Bockting, C.L.H.; Wittkampf, K.A.; Huyser, J.; van Weert, H.C.

    2014-01-01

    Background: How to treat Major Depressive Disorder (MDD) in primary care? Studies that compared (brief) Cognitive Behavioural Therapy (CBT) with care as usual by the General Practitioner (GP) found the first to be more effective. However, to make a fair comparison GP care should be optimised and pro

  19. Energetic analysis and optimisation of an integrated coal gasification-combined cycle power plant

    NARCIS (Netherlands)

    Vlaswinkel, E.E.

    1992-01-01

    Methods are presented to analyse and optimise the energetic performance of integrated coal gasification-combined cycle (IGCC) power plants. The methods involve exergy analysis and pinch technology and can be used to identify key process parameters and to generate alternative design options for impro

  20. STATISTICALLY OPTIMISED NEAR FIELD ACOUSTIC HOLOGRAPHY AND THE HELMHOLTZ EQUATION LEAST SQUARES METHOD: A COMPARISON

    DEFF Research Database (Denmark)

    Gomes, Jesper Skovhus; Jacobsen, Finn

    Several variants of near field acoustic holography (NAH) that do not require measurement areas larger than the source have been proposed. This paper examines and compares two such methods, statistically optimised near field acoustic holography (SONAH) and the Helmholtz equation least squares method...

  1. Neuromuscular blockade for optimising surgical conditions during abdominal and gynaecological surgery

    DEFF Research Database (Denmark)

    Madsen, M V; Staehr-Rye, A K; Gätke, M R

    2015-01-01

    BACKGROUND: The level of neuromuscular blockade (NMB) that provides optimal surgical conditions during abdominal surgery has not been well established. The aim of this systematic review was to evaluate current evidence on the use of neuromuscular blocking agents in order to optimise surgical cond...

  2. Technical model for optimising PV/diesel/battery hybrid power systems

    CSIR Research Space (South Africa)

    Tazvinga, Henerica

    2010-08-31

    Full Text Available is required for optimising the sizing and operational strategy of the PV-diesel-battery hybrid system than is required for single-source systems. Various models are available on the market and in research groups but the challenge is to customise these to suit...

  3. Sizing Combined Heat and Power Units and Domestic Building Energy Cost Optimisation

    Directory of Open Access Journals (Sweden)

    Dongmin Yu

    2017-06-01

    Full Text Available Many combined heat and power (CHP) units have been installed in domestic buildings to increase energy efficiency and reduce energy costs. However, inappropriate sizing of a CHP may actually increase energy costs and reduce energy efficiency. Moreover, the high manufacturing cost of batteries makes batteries less affordable. Therefore, this paper attempts to size the capacity of a CHP and optimise daily energy costs for a domestic building with only CHP installed. Electricity and heat loads are first used as sizing criteria in finding the best capacities of different types of CHP with the help of the maximum rectangle (MR) method. Subsequently, the genetic algorithm (GA) is used to optimise the daily energy costs of the different cases. Then, heat and electricity loads are jointly considered for sizing different types of CHP and for optimising the daily energy costs through the GA method. The optimisation results show that the GA sizing method gives a higher average daily energy cost saving, a 13% reduction compared to a building without CHP installed. However, to achieve this, there is an energy efficiency reduction of about 3% and an input-power-to-rated-power ratio reduction of about 7% compared to using the MR method and heat demand in sizing the CHP.
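
    The maximum rectangle (MR) sizing step can be sketched directly: on the annual load-duration curve, pick the point where capacity times running hours (the inscribed rectangle) is largest. The demand profile below is synthetic, and the GA cost-optimisation stage is not reproduced.

```python
# Maximum-rectangle (MR) sizing sketch: pick the point on the annual
# load-duration curve that maximises capacity x running-hours.
# The synthetic hourly heat-demand profile is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(7)
hours = 8760
heat_demand = np.clip(10 + 6 * np.sin(np.arange(hours) * 2 * np.pi / hours)
                      + rng.normal(0, 2, hours), 0, None)        # kW

duration_curve = np.sort(heat_demand)[::-1]          # demand exceeded for h hours
rectangle = duration_curve * np.arange(1, hours + 1)
best = int(np.argmax(rectangle))

chp_capacity = duration_curve[best]                  # kW_th
print(f"MR-sized CHP: {chp_capacity:.1f} kW_th, run for ~{best + 1} h/year,"
      f" covering {rectangle[best] / heat_demand.sum():.0%} of annual heat demand")
```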

  4. Optimising Reactive Control in non-ideal Efficiency Wave Energy Converters

    DEFF Research Database (Denmark)

    Strager, Thomas; Lopez, Pablo Fernandez; Giorgio, Giuseppe

    2014-01-01

    When analytically optimising the control strategy in wave energy converters which use a point absorber, the efficiency aspect is generally neglected. The results presented in this paper provide an analytical expression for the mean harvested electrical power in non-ideal efficiency situations...

  5. Therapeutic drug monitoring: an aid to optimising response to antiretroviral drugs?

    NARCIS (Netherlands)

    Aarnoutse, R.E.; Schapiro, J.M.; Boucher, C.A.B.; Hekster, Y.A.; Burger, D.M.

    2003-01-01

    Therapeutic drug monitoring (TDM) has been proposed as a means to optimise response to highly active antiretroviral therapy (HAART) in HIV infection. Protease inhibitors (PIs) and the non-nucleoside reverse transcriptase inhibitors (NNRTIs) efavirenz and nevirapine satisfy many criteria for TDM. Nuc

  6. Optimising the Collaborative Practice of Nurses in Primary Care Settings Using a Knowledge Translation Approach

    Science.gov (United States)

    Oelke, Nelly; Wilhelm, Amanda; Jackson, Karen

    2016-01-01

    The role of nurses in primary care is poorly understood and many are not working to their full scope of practice. Building on previous research, this knowledge translation (KT) project's aim was to facilitate nurses' capacity to optimise their practice in these settings. A Summit engaging Alberta stakeholders in a deliberative discussion was the…

  7. Optimised sensitivity to leptonic CP violation from spectral information: the LBNO case at 2300 km baseline

    CERN Document Server

    Agarwalla, S K; Aittola, M; Alekou, A; Andrieu, B; Antoniou, F; Asfandiyarov, R; Autiero, D; Bésida, O; Balik, A; Ballett, P; Bandac, I; Banerjee, D; Bartmann, W; Bay, F; Biskup, B; Blebea-Apostu, A M; Blondel, A; Bogomilov, M; Bolognesi, S; Borriello, E; Brancus, I; Bravar, A; Buizza-Avanzini, M; Caiulo, D; Calin, M; Calviani, M; Campanelli, M; Cantini, C; Cata-Danil, G; Chakraborty, S; Charitonidis, N; Chaussard, L; Chesneanu, D; Chipesiu, F; Crivelli, P; Dawson, J; De Bonis, I; Declais, Y; Sanchez, P Del Amo; Delbart, A; Di Luise, S; Duchesneau, D; Dumarchez, J; Efthymiopoulos, I; Eliseev, A; Emery, S; Enqvist, T; Enqvist, K; Epprecht, L; Erykalov, A N; Esanu, T; Franco, D; Friend, M; Galymov, V; Gavrilov, G; Gendotti, A; Giganti, C; Gilardoni, S; Goddard, B; Gomoiu, C M; Gornushkin, Y A; Gorodetzky, P; Haesler, A; Hasegawa, T; Horikawa, S; Huitu, K; Izmaylov, A; Jipa, A; Kainulainen, K; Karadzhov, Y; Khabibullin, M; Khotjantsev, A; Kopylov, A N; Korzenev, A; Kosyanenko, S; Kryn, D; Kudenko, Y; Kuusiniemi, P; Lazanu, I; Lazaridis, C; Levy, J -M; Loo, K; Maalampi, J; Margineanu, R M; Marteau, J; Martin-Mari, C; Matveev, V; Mazzucato, E; Mefodiev, A; Mineev, O; Mirizzi, A; Mitrica, B; Murphy, S; Nakadaira, T; Narita, S; Nesterenko, D A; Nguyen, K; Nikolics, K; Noah, E; Novikov, Yu; Oprima, A; Osborne, J; Ovsyannikova, T; Papaphilippou, Y; Pascoli, S; Patzak, T; Pectu, M; Pennacchio, E; Periale, L; Pessard, H; Popov, B; Ravonel, M; Rayner, M; Resnati, F; Ristea, O; Robert, A; Rubbia, A; Rummukainen, K; Saftoiu, A; Sakashita, K; Sanchez-Galan, F; Sarkamo, J; Saviano, N; Scantamburlo, E; Sergiampietri, F; Sgalaberna, D; Shaposhnikova, E; Slupecki, M; Smargianaki, D; Stanca, D; Steerenberg, R; Sterian, A R; Sterian, P; Stoica, S; Strabel, C; Suhonen, J; Suvorov, V; Toma, G; Tonazzo, A; Trzaska, W H; Tsenov, R; Tuominen, K; Valram, M; Vankova-Kirilova, G; Vannucci, F; Vasseur, G; Velotti, F; Velten, P; Venturi, V; Viant, T; Vihonen, S; Vincke, H; Vorobyev, A; Weber, A; Wu, S; Yershov, N; Zambelli, L; Zito, M

    2014-01-01

    One of the main goals of the Long Baseline Neutrino Observatory (LBNO) is to study the $L/E$ behaviour (spectral information) of the electron neutrino and antineutrino appearance probabilities, in order to determine the unknown CP-violation phase $\delta_{CP}$ and discover CP-violation in the leptonic sector. The result is based on the measurement of the appearance probabilities in a broad range of energies, covering the 1st and 2nd oscillation maxima, at a very long baseline of 2300 km. The sensitivity of the experiment can be maximised by optimising the energy spectra of the neutrino and anti-neutrino fluxes. Such an optimisation requires exploring an extended range of parameters describing in detail the geometries and properties of the primary protons, hadron target and focusing elements in the neutrino beam line. In this paper we present a numerical solution that leads to optimised energy spectra and study its impact on the sensitivity of LBNO to discover leptonic CP violation. In the optimised flux ...

  8. Application of the genetic algorithm for optimisation of large solar hot water systems

    NARCIS (Netherlands)

    Loomans, M.G.L.C.; Visser, H.

    2002-01-01

    An implementation of the genetic algorithm in a design support tool for (large) solar hot water systems is described. The tool calculates the yield and the costs of solar hot water systems based on technical and financial data of the system components. The genetic algorithm allows for optimisation
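
    A compact sketch of the kind of genetic-algorithm search such a design support tool can run is shown below; the yield and cost models and the design-variable bounds are toy assumptions, not the tool's actual component data.

```python
# Compact genetic-algorithm sketch for sizing a solar hot water system.
# The yield and cost models are toy stand-ins for the design tool's
# component data; design-variable bounds are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
BOUNDS = np.array([[5.0, 200.0],       # collector area [m2]
                   [0.2, 20.0]])       # storage volume [m3]

def lifecycle_cost(x):
    area, volume = x
    yield_kwh = 450 * area * (1 - np.exp(-3 * volume / area))    # toy solar yield
    capex = 300 * area + 800 * volume
    return capex - 20 * 0.08 * yield_kwh                         # 20 yr, 0.08 EUR/kWh

def ga(pop_size=40, generations=60, mutation=0.1):
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, 2))
    for _ in range(generations):
        cost = np.apply_along_axis(lifecycle_cost, 1, pop)
        parents = pop[np.argsort(cost)[:pop_size // 2]]          # truncation selection
        kids = parents[rng.integers(0, len(parents), pop_size - len(parents))].copy()
        alpha = rng.random((len(kids), 1))                       # blend crossover
        kids = alpha * kids + (1 - alpha) * parents[rng.integers(0, len(parents), len(kids))]
        kids += rng.normal(0, mutation, kids.shape) * (BOUNDS[:, 1] - BOUNDS[:, 0])
        pop = np.clip(np.vstack([parents, kids]), BOUNDS[:, 0], BOUNDS[:, 1])
    cost = np.apply_along_axis(lifecycle_cost, 1, pop)
    return pop[np.argmin(cost)], cost.min()

best, cost = ga()
print(f"area={best[0]:.1f} m2, volume={best[1]:.2f} m3, lifecycle cost={cost:.0f} EUR")
```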

  9. Topology optimisation of manufacturable microstructural details without length scale separation using a spectral coarse basis preconditioner

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Lazarov, Boyan Stefanov

    2015-01-01

    to massive savings in computational cost. The density-based topology optimisation approach combined with a Heaviside projection filter and a stochastic robust formulation is used on various problems, with both periodic and layered microstructures. The presented approach is shown to allow for the topology...
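
    The Heaviside projection filter mentioned in this record is commonly implemented as a smoothed tanh threshold that pushes filtered densities towards 0/1; shifting the threshold eta gives the eroded and dilated designs used in robust formulations. The sketch below uses the standard formula with illustrative parameter values.

```python
# Smoothed Heaviside projection used in density-based topology optimisation:
# pushes filtered densities toward 0/1; eta shifts give the eroded/dilated
# designs used in the robust (worst-case) formulation. Values are illustrative.
import numpy as np

def heaviside_projection(rho_filtered, beta=8.0, eta=0.5):
    """Standard tanh-based projection of filtered densities onto [0, 1]."""
    num = np.tanh(beta * eta) + np.tanh(beta * (rho_filtered - eta))
    den = np.tanh(beta * eta) + np.tanh(beta * (1.0 - eta))
    return num / den

rho = np.linspace(0.0, 1.0, 11)             # filtered (smoothed) densities
for eta in (0.3, 0.5, 0.7):                 # dilated / intermediate / eroded
    print(eta, np.round(heaviside_projection(rho, beta=8.0, eta=eta), 2))
```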

  10. Optimisation of product quality and minimisation of its variation in climate controlled operations

    NARCIS (Netherlands)

    Verdijck, G.J.C.; Straten, van G.; Preisig, H.A.

    2005-01-01

    An optimisation procedure is presented for direct control of product quality of agro-material and minimisation of its quality variation. The procedure builds on a previously presented model structure, which is briefly reviewed, together forming a methodological framework for direct product quality c

  11. A soft-core processor architecture optimised for radar signal processing applications

    CSIR Research Space (South Africa)

    Broich, R

    2013-12-01

    Full Text Available [Abstract not recoverable; the record fragment lists thesis sections (Datapath Architecture, Control Unit Architecture) and part of a comparison of FPGA design tools (MATLAB/Simulink plug-ins: Xilinx System Generator, Altera DSP Builder), noting graphical interconnection of DSP blocks, multi-rate systems and fused datapath optimisations versus the difficulty of describing and debugging complex designs.]

  12. Bus Access Optimisation for FlexRay-based Distributed Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Traian; Pop, Paul; Eles, Petru

    2007-01-01

    -real time communication in a deterministic manner. In this paper, we propose techniques for optimising the FlexRay bus access mechanism of a distributed system, so that the hard real-time deadlines are met for all the tasks and messages in the system. We have evaluated the proposed techniques using...

  13. Optimising mechanical behaviour of new advanced steels based on fine non-equilibrium microstructures

    NARCIS (Netherlands)

    HajyAkbary, F.

    2015-01-01

    This Ph.D. thesis investigates the relation between microstructural and mechanical properties of Advanced High Strength Steels (AHSS), with the goal of developing a microstructure with optimised mechanical properties. Among different grades of AHSS, Quenching and Partitioning (Q&P) steel which is co

  14. Optimising mechanical behaviour of new advanced steels based on fine non-equilibrium microstructures

    NARCIS (Netherlands)

    HajyAkbary, F.

    2015-01-01

    This Ph.D. thesis investigates the relation between microstructural and mechanical properties of Advanced High Strength Steels (AHSS), with the goal of developing a microstructure with optimised mechanical properties. Among different grades of AHSS, Quenching and Partitioning (Q&P) steel which is

  15. Optimisation of product quality and minimisation of its variation in climate controlled operations

    NARCIS (Netherlands)

    Verdijck, G.J.C.; Straten, van G.; Preisig, H.A.

    2005-01-01

    An optimisation procedure is presented for direct control of product quality of agro-material and minimisation of its quality variation. The procedure builds on a previously presented model structure, which is briefly reviewed, together forming a methodological framework for direct product quality

  16. Optimisation of the level-1 calorimeter trigger at ATLAS for Run II

    Energy Technology Data Exchange (ETDEWEB)

    Suchek, Stanislav [Kirchhoff-Institute for Physics, Im Neuenheimer Feld 227, 69120 Heidelberg (Germany); Collaboration: ATLAS-Collaboration

    2015-07-01

    The Level-1 Calorimeter Trigger (L1Calo) is a central part of the ATLAS Level-1 Trigger system, designed to identify jet, electron, photon, and hadronic tau candidates, and to measure their transverse energies, as well as the total transverse energy and missing transverse energy. The optimisation of the jet energy resolution is an important part of the L1Calo upgrade for Run II. A Look-Up Table (LUT) is used to translate the electronic signal from each trigger tower to its transverse energy. By optimising the LUT calibration we can achieve better jet energy resolution and better performance of the jet transverse energy triggers, which are vital for many physics analyses. In addition, the improved energy calibration leads to significant improvements of the missing transverse energy resolution. A new Multi-Chip Module (MCM), as a part of the L1Calo upgrade, provides two separate LUTs for jets and electrons/photons/taus, allowing jet transverse energy and missing transverse energy to be optimised separately from the electromagnetic objects. The optimisation is validated using jet transverse energy and missing transverse energy trigger turn-on curves and rates.
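
    A software illustration of the LUT idea (not the actual L1Calo firmware or its calibration constants) is given below: a per-tower table maps the digitised amplitude to a calibrated transverse energy, with separate tables for the jet and e/gamma/tau paths as provided by the new MCM; pedestal, gain and noise cut are invented.

```python
# Illustrative software model (not the L1Calo firmware) of per-trigger-tower
# look-up tables converting a digitised amplitude (ADC counts) into calibrated
# transverse energy, with separate LUTs for the jet and e/gamma/tau paths.
# Calibration constants, pedestal and noise cut are invented for the example.
import numpy as np

ADC_RANGE = 1024                          # 10-bit digitised amplitude
pedestal, noise_cut = 32, 4               # counts
adc = np.arange(ADC_RANGE)

def build_lut(gain_gev_per_count, extra_scale=1.0):
    et = (adc - pedestal) * gain_gev_per_count * extra_scale
    et[adc - pedestal < noise_cut] = 0.0   # suppress pedestal/noise
    return np.clip(et, 0.0, 255.0)         # saturate at the output scale

lut_jet = build_lut(0.25, extra_scale=1.05)    # jet path: its own calibration
lut_em  = build_lut(0.25, extra_scale=1.00)    # e/gamma/tau path

def tower_et(adc_counts, lut):
    return lut[np.clip(adc_counts, 0, ADC_RANGE - 1)]

pulse = np.array([30, 60, 200, 900])
print("jet-path ET [GeV]:", tower_et(pulse, lut_jet))
print("em-path  ET [GeV]:", tower_et(pulse, lut_em))
```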

  17. Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics.

    Directory of Open Access Journals (Sweden)

    Vito Trianni

    Full Text Available The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics.

  18. Optimising operational amplifiers by evolutionary algorithms and gm/Id method

    Science.gov (United States)

    Tlelo-Cuautle, E.; Sanabria-Borbon, A. C.

    2016-10-01

    The evolutionary algorithm called non-dominated sorting genetic algorithm (NSGA-II) is applied herein in the optimisation of operational transconductance amplifiers. NSGA-II is accelerated by applying the gm/Id method to estimate reduced search spaces associated with the widths (W) and lengths (L) of the metal-oxide-semiconductor field-effect transistors (MOSFETs), and to guarantee appropriate bias-level conditions. In addition, we introduce an integer encoding for the W/L sizes of the MOSFETs to avoid a post-processing step for rounding off their values to be multiples of the integrated circuit fabrication technology. Finally, from the feasible solutions generated by NSGA-II, we introduce a second optimisation stage to guarantee that the final feasible W/L sizes support process, voltage and temperature (PVT) variations. The optimisation results lead us to conclude that the gm/Id method and integer encoding are quite useful to accelerate the convergence of the evolutionary algorithm NSGA-II, while the second optimisation stage guarantees robustness of the feasible solutions to PVT variations.
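
    The integer encoding can be sketched as follows: each gene counts layout-grid steps, so decoded W/L values are always multiples of the fabrication grid and need no rounding afterwards; the grid value and the gm/Id-derived bounds are illustrative assumptions.

```python
# Sketch of the integer gene encoding for MOSFET sizes: each gene is an integer
# number of layout-grid steps, so decoded W and L are always multiples of the
# fabrication grid and need no post-optimisation rounding. The 180 nm grid and
# the gm/Id-derived bounds are illustrative assumptions.
import numpy as np

GRID = 0.18e-6                                    # layout grid [m] (assumed)

# Per-transistor search ranges (in grid units), e.g. narrowed using gm/Id curves.
bounds = {"M1": {"W": (5, 300), "L": (1, 10)},
          "M2": {"W": (10, 600), "L": (1, 10)}}

def decode(genes):
    """Map integer genes -> {device: (W [m], L [m])}, all on the grid."""
    sizes, g = {}, iter(genes)
    for dev, b in bounds.items():
        w_units = int(np.clip(next(g), *b["W"]))
        l_units = int(np.clip(next(g), *b["L"]))
        sizes[dev] = (w_units * GRID, l_units * GRID)
    return sizes

rng = np.random.default_rng(0)
genes = [rng.integers(lo, hi + 1) for b in bounds.values() for lo, hi in (b["W"], b["L"])]
print(decode(genes))    # e.g. {'M1': (W, L), 'M2': (W, L)} in metres, grid-aligned
```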

  19. GIS-based approach for optimised collection of household waste in Mostaganem city (Western Algeria).

    Science.gov (United States)

    Abdelli, I S; Abdelmalek, F; Djelloul, A; Mesghouni, K; Addou, A

    2016-05-01

    This work proposes an optimisation of municipal solid waste collection in terms of collection cost and polluting emissions (carbon oxides, carbon dioxides, nitrogen oxides and particulate matter). The method is based on a simultaneous optimisation of the vehicle routing (distance and time travelled) and the routing system for household waste collection, based on the existing network of containers, the capacity of the vehicles and the quantities generated at every collection point. The vehicle routing optimisation involves a geographical information system. This optimisation has enabled a reduction of travelled distances, collection time, fuel consumption and polluting emissions. Pertinent parameters affecting the fuel consumption have been utilised, such as the state of the road, the vehicle speed on the different paths, the vehicle load and the collection frequencies. Several scenarios have been proposed. The results show the importance of constructing a waste transfer station, which can reduce the cost of household waste collection and the emissions of waste transfer pollutants. Among the five proposed scenarios, the fourth scenario (constructing a waste transfer centre) performed best: the optimised travelled distance of the new collection routes was reduced by 71.81%, the fuel consumption was reduced by 72.05%, and the total cost of the collection was reduced by 46.8%. For the polluting emissions, the reduction was 60.2% for carbon oxides, 67.9% for carbon dioxides, 74.2% for nitrogen oxides and 65% for particulate matter.
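
    The routing part of such a study can be illustrated (without the GIS layer) by a simple nearest-neighbour heuristic over container locations, showing how a shorter tour translates into distance, fuel and emission savings; the coordinates below are invented.

```python
# Minimal nearest-neighbour routing sketch (a stand-in for the GIS-based vehicle
# routing): orders container locations to shorten the collection tour, which is
# what drives the fuel and emission savings. Coordinates are invented.
import numpy as np

rng = np.random.default_rng(11)
depot = np.array([0.0, 0.0])
containers = rng.uniform(-5, 5, size=(25, 2))       # km, planar approximation

def tour_length(order):
    pts = np.vstack([depot, containers[order], depot])
    return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

def nearest_neighbour():
    unvisited, route, pos = set(range(len(containers))), [], depot
    while unvisited:
        nxt = min(unvisited, key=lambda i: np.linalg.norm(containers[i] - pos))
        route.append(nxt)
        pos = containers[nxt]
        unvisited.remove(nxt)
    return route

naive = list(range(len(containers)))                # visit in arbitrary order
opt = nearest_neighbour()
print(f"naive {tour_length(naive):.1f} km -> heuristic {tour_length(opt):.1f} km")
```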

  20. Finding gene-environment interactions for Phobias

    OpenAIRE

    Gregory, Alice M.; Lau, Jennifer Y. F.; Eley, Thalia C

    2008-01-01

    Phobias are common disorders causing a great deal of suffering. Studies of gene-environment interaction (G × E) have revealed much about the complex processes underlying the development of various psychiatric disorders but have told us little about phobias. This article describes what is already known about genetic and environmental influences upon phobias and suggests how this information can be used to optimise the chances of discovering G × Es for phobias. In addition to the careful concep...

  1. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience

    Science.gov (United States)

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B.; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471
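
    The underlying idea, scoring candidate parameter sets against experimental target features and evolving them, can be illustrated with a bare-bones (mu + lambda) evolution strategy; this is not the BluePyOpt API, and the toy model and target features are invented.

```python
# Bare-bones illustration of data-driven parameter optimisation (not the
# BluePyOpt API): a (mu + lambda) evolution strategy pulls model parameters
# toward experimentally measured target features. Model and features are toys.
import numpy as np

rng = np.random.default_rng(5)
TARGET = np.array([12.0, 0.8])                  # e.g. spike count, adaptation index

def model_features(params):
    g_na, g_k = params                          # toy "conductances"
    return np.array([10 * g_na / (0.5 + g_k), g_k / (g_na + g_k)])

def fitness(params):
    return float(np.sum((model_features(params) - TARGET) ** 2))

def evolve(mu=10, lam=40, generations=80, sigma=0.1):
    parents = rng.uniform(0.1, 2.0, size=(mu, 2))
    for _ in range(generations):
        offspring = parents[rng.integers(0, mu, lam)] + rng.normal(0, sigma, (lam, 2))
        pool = np.clip(np.vstack([parents, offspring]), 1e-3, None)
        scores = np.array([fitness(p) for p in pool])
        parents = pool[np.argsort(scores)[:mu]]
    return parents[0], fitness(parents[0])

best, err = evolve()
print("best parameters:", np.round(best, 3), "feature error:", round(err, 4))
```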

  2. Multi-objective optimisation of cost-benefit of urban flood management using a 1D2D coupled model.

    Science.gov (United States)

    Delelegn, S W; Pathirana, A; Gersonius, B; Adeogun, A G; Vairavamoorthy, K

    2011-01-01

    This paper presents a multi-objective optimisation (MOO) tool for urban drainage management that is based on a 1D2D coupled model of SWMM5 (1D sub-surface flow model) and BreZo (2D surface flow model). This coupled model is linked with NSGA-II, which is an Evolutionary Algorithm-based optimiser. Previously the combination of a surface/sub-surface flow model and evolutionary optimisation has been considered to be infeasible due to the computational demands. The 1D2D coupled model used here shows a computational efficiency that is acceptable for optimisation. This technological advance is the result of the application of a triangular irregular discretisation process and an explicit finite volume solver in the 2D surface flow model. Besides that, OpenMP based parallelisation was employed at optimiser level to further improve the computational speed of the MOO tool. The MOO tool has been applied to an existing sewer network in West Garforth, UK. This application demonstrates the advantages of using multi-objective optimisation by providing an easy-to-comprehend Pareto-optimal front (relating investment cost to expected flood damage) that could be used for decision making processes, without repeatedly going through the modelling-optimisation stage.
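
    The decision-support output described here, a Pareto front relating investment cost to expected flood damage, can be illustrated with a small non-dominated-filtering sketch; the cost and damage values are toy numbers and the coupled 1D2D model evaluation is not reproduced.

```python
# Sketch of extracting the Pareto-optimal front of (investment cost, expected
# flood damage) from a set of candidate drainage interventions. The cost and
# damage values are toy numbers; the 1D2D model evaluation is not reproduced.
import numpy as np

rng = np.random.default_rng(2)
n = 200
cost = rng.uniform(0.1, 5.0, n)                          # MEUR invested
damage = 4.0 * np.exp(-0.8 * cost) + rng.normal(0, 0.15, n).clip(0)   # MEUR/yr

def pareto_front(objs):
    """Return indices of non-dominated points (both objectives minimised)."""
    idx = np.argsort(objs[:, 0])                         # sort by first objective
    front, best_second = [], np.inf
    for i in idx:
        if objs[i, 1] < best_second:                     # strictly better damage
            front.append(i)
            best_second = objs[i, 1]
    return np.array(front)

front = pareto_front(np.column_stack([cost, damage]))
print(f"{len(front)} Pareto-optimal options out of {n} candidates")
```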

  3. Multidisciplinary design optimisation of a recurve bow based on applications of the autogenetic design theory and distributed computing

    Science.gov (United States)

    Fritzsche, Matthias; Kittel, Konstantin; Blankenburg, Alexander; Vajna, Sándor

    2012-08-01

    The focus of this paper is to present a method of multidisciplinary design optimisation based on the autogenetic design theory (ADT) that provides methods, which are partially implemented in the optimisation software described here. The main thesis of the ADT is that biological evolution and the process of developing products are broadly similar, i.e. procedures from biological evolution can be transferred into product development. In order to fulfil requirements and boundary conditions of any kind (that may change at any time), both biological evolution and product development look for appropriate solution possibilities in a certain area, and try to optimise those that are actually promising by varying parameters and combinations of these solutions. As the time needed for multidisciplinary design optimisations is a critical aspect of product development, distributing the optimisation process across otherwise unused computing capacity can reduce the optimisation time drastically. Finally, a practical example shows how ADT methods and distributed optimisation are applied to improve a product.

  4. Organisational design elements and competencies for optimising the expertise of knowledge workers in a shared services centre

    Directory of Open Access Journals (Sweden)

    Mark Ramsey

    2011-02-01

    Full Text Available Orientation: Organisations are still structured according to the Industrial Age control model that restricts optimising the expertise of knowledge workers. Research purpose: The general aim of the research was to explore the organisation design elements and competencies that contribute to optimising the expertise of knowledge workers in a shared services centre. Motivation for the study: Current organisational design methodologies do not emphasise optimising the expertise of knowledge workers. This research addresses the challenge of how an organisation design can improve the creation and availability of the expertise of knowledge workers. Research design/approach method: The researcher followed a qualitative case study research design and collected data in six focus group sessions (N = 25). Main findings: The findings showed that the shared services centre (SSC) is not designed to enable its structure, culture and codifying system to optimise the expertise of knowledge workers. In addition, the SSC does not share the knowledge generated with other knowledge workers. Furthermore, it does not use the output of the knowledge workers to improve business processes. Practical/managerial implications: The expertise of knowledge workers is the basis of competitive advantage. Therefore, managers should create an organisational design that is conducive to optimising knowledge work expertise. Contribution/value add: This research highlights the important organisational design elements and supportive organisational structures for optimising the expertise of knowledge workers. The research also proposes a framework for optimising the expertise of knowledge workers and helping an organisation to achieve sustainable competitive advantage.

  5. An optimised PCR/T-RFLP fingerprinting approach for the investigation of protistan communities in groundwater environments.

    Science.gov (United States)

    Euringer, Kathrin; Lueders, Tillmann

    2008-10-01

    Due to the scarcity or complete absence of higher organisms, protists may represent an important higher trophic level (above Prokaryotes) in the food webs of groundwater habitats. Nevertheless, the importance of aquifer protists, especially in contaminated groundwater environments, is poorly understood. Partly, this may be due to a lack of adequate PCR and fingerprinting approaches for protists in aquifers, which can be considered low in protistan or high in non-target rRNA gene copy numbers. Therefore, we have validated the suitability of distinct eukaryote-targeted primer pairs and restriction endonucleases for T-RFLP fingerprinting of protistan communities. By in silico predictions, and by fingerprinting, cloning and sequencing of microeukaryote amplicons from hydrocarbon-contaminated aquifer sediment DNA, we show that the Euk20f/Euk516r primer set in combination with Bsh1236I digestion is best suited for the recovery of diverse protistan 18S rRNA lineages. In contrast to other tested primer sets, a preferred recovery of fungal and archaeal non-target amplicons was not observed. In summary, we present an optimised microeukaryote-targeted PCR/T-RFLP fingerprinting approach which may be of value for the characterisation of protistan communities in groundwater and other habitats.
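
    The in silico prediction step behind such a primer/enzyme choice can be sketched simply: for each amplicon, the predicted terminal restriction fragment is the distance from the labelled forward-primer end to the first Bsh1236I recognition site (CGCG, cut as CG^CG). The sequences below are placeholders, not real 18S rRNA data.

```python
# Minimal in silico T-RF prediction for the primer/enzyme combination named
# above: the terminal restriction fragment length is the distance from the
# labelled (forward-primer) end of the amplicon to the first Bsh1236I site.
# Bsh1236I is assumed to recognise CGCG and cut CG^CG; sequences are placeholders.

BSH1236I_SITE = "CGCG"
CUT_OFFSET = 2                # cut after the second base of the recognition site

def trf_length(amplicon):
    """Predicted terminal restriction fragment length (bp), or full length if uncut."""
    pos = amplicon.upper().find(BSH1236I_SITE)
    return len(amplicon) if pos < 0 else pos + CUT_OFFSET

amplicons = {
    "clone_A": "ATGGTTCGCGATTACCGGTTA" * 20,    # placeholder sequences, not real 18S data
    "clone_B": "ATGGTTATTACCGGTTAGGCA" * 20,
}
for name, seq in amplicons.items():
    print(name, trf_length(seq), "bp")
```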

  6. Muscle fibre size optimisation provides flexibility for energy budgeting in calorie-restricted coho salmon transgenic for growth hormone.

    Science.gov (United States)

    Johnston, Ian A; de la Serrana, Daniel Garcia; Devlin, Robert H

    2014-10-01

    Coho salmon (Oncorhynchus kisutch) transgenic for growth hormone (GH) show substantially faster growth than wild-type (WT) fish. We fed GH-transgenic salmon either to satiation (1 year; TF) or the same smaller ration of wild-type fish (2 years; TR), resulting in groups matched for body size to WT salmon. The myotomes of TF and WT fish had the same number and size distribution of muscle fibres, indicating a twofold higher rate of fibre recruitment in the GH transgenics. Unexpectedly, calorie restriction was found to decrease the rate of fibre production in transgenics, resulting in a 20% increase in average fibre size and reduced costs of ionic homeostasis. Genes for myotube formation were downregulated in TR relative to TF and WT fish. We suggest that muscle fibre size optimisation allows the reallocation of energy from maintenance to locomotion, explaining the observation that calorie-restricted transgenics grow at the same rate as WT fish whilst exhibiting markedly higher foraging activity.

  7. Optimised collision avoidance for an ultra-close rendezvous with a failed satellite based on the Gauss pseudospectral method

    Science.gov (United States)

    Chu, Xiaoyu; Zhang, Jingrui; Lu, Shan; Zhang, Yao; Sun, Yue

    2016-11-01

    This paper presents a trajectory planning algorithm to optimise the collision avoidance of a chasing spacecraft operating in ultra-close proximity to a failed satellite. The complex configuration and the tumbling motion of the failed satellite are considered. The two-spacecraft rendezvous dynamics are formulated in the target body frame, and the collision avoidance constraints are detailed, particularly concerning the uncertainties. An optimised solution of the approach problem is generated using the Gauss pseudospectral method. Closed-loop control is used to track the optimised trajectory. Numerical results are provided to demonstrate the effectiveness of the proposed algorithms.

  8. Optimisation of protocol for Clostridium botulinum detection in mink feed

    Directory of Open Access Journals (Sweden)

    Grenda Tomasz

    2015-09-01

    Full Text Available Mink feed with natural microflora was used as the test material. The analyses were conducted using Wrzosek and TPGY broth media, and Willis–Hobbs and Zeissler differential agar media. Wrzosek, Willis–Hobbs, and Zeissler media are described in Polish Standards approved by the National Standards Body in Poland and are routinely used for the detection of anaerobic bacteria in Poland. Detection and identification of C. botulinum was performed with a previously validated real-time PCR method based on detection of the ntnh gene, which is common to all C. botulinum toxotypes. The use of Wrzosek broth and Zeissler agar in routine analyses for detection and identification of C. botulinum was ineffective and limited. The results showed the highest culturing effectiveness with TPGY broth incubated for 72 h at 30°C and isolation on Willis–Hobbs agar. The real-time PCR method based on ntnh gene detection used in this study could be utilised as a supplementary tool to the mouse lethality assay.

  9. The hepatitis E virus ORF3 protein regulates the expression of liver-specific genes by modulating localization of hepatocyte nuclear factor 4.

    Directory of Open Access Journals (Sweden)

    Vivek Chandra

    Full Text Available The hepatitis E virus (HEV) is a small RNA virus and the cause of acute viral hepatitis E. The open reading frame 3 protein (pORF3) of HEV appears to be a pleiotropic regulatory protein that helps in the establishment, propagation and progression of viral infection. However, the global cellular effects of this protein remain to be explored. In the absence of traditional in vitro viral infection systems or efficient replicon systems, we made an adenovirus based ORF3 protein expression system to study its effects on host cell gene expression. We infected Huh7 hepatoma cells with recombinant adenoviruses expressing pORF3 and performed microarray-based gene expression analyses. Several genes down regulated in pORF3-expressing cells were found to be under regulation of the liver-enriched hepatocyte nuclear factor 4 (HNF4), which regulates hepatocyte-specific gene expression. While HNF4 localizes to the nucleus, its phosphorylation results in impaired nuclear localization of HNF4. Here we report that pORF3 increases HNF4 phosphorylation through the ERK and Akt kinases, which results in impaired nuclear translocation of HNF4 and subsequently the down modulation of HNF4-responsive genes in pORF3-expressing cells. We propose that modulation of several hepatocyte specific genes by pORF3 will create an environment favorable for viral replication and pathogenesis.

  10. Long-term gene therapy in the CNS: reversal of hypothalamic diabetes insipidus in the Brattleboro rat by using an adenovirus expressing arginine vasopressin.

    Science.gov (United States)

    Geddes, B J; Harding, T C; Lightman, S L; Uney, J B

    1997-12-01

    The ability of adenovirus (Ad) to transfect most cell types efficiently has already resulted in human gene therapy trials involving the systemic administration of adenoviral constructs. However, because of the complexity of brain function and the difficulty in noninvasively monitoring alterations in neuronal gene expression, the potential of Ad gene therapy strategies for treating disorders of the CNS has been difficult to assess. In the present study, we have used an Ad encoding the arginine vasopressin cDNA (AdAVP) in an AVP-deficient animal model of diabetes insipidus (the Brattleboro rat), which allowed us to monitor chronically the success of the gene therapy treatment by noninvasive assays. Injection of AdAVP into the supraoptic nuclei (SON) of the hypothalamus resulted in expression of AVP in magnocellular neurons. This was accompanied by reduced daily water intake and urine volume, as well as increased urine osmolality lasting 4 months. These data show that a single gene defect leading to a neurological disorder can be corrected with an adenovirus-based strategy. This study highlights the potential of using Ad gene therapy for the long-term treatment of disorders of the CNS.

  11. Lease-management (location-gérance): a way of operating a business or an instrument of tax optimisation?

    National Research Council Canada - National Science Library

    Mohamed EL GDAIHI

    2014-01-01

    ...; fiscal optimisation is carried out more or less transparently. The result is reversed when the group of companies has neither legal personality nor the processes and arrangements that allow tax savings; the lease-management agreement is a case in point...

  12. LEASE-MANAGEMENT (LOCATION-GÉRANCE): A WAY OF OPERATING A BUSINESS OR AN INSTRUMENT OF TAX OPTIMISATION?

    National Research Council Canada - National Science Library

    Mohamed El Gdaihi

    2014-01-01

    ...; fiscal optimisation is carried out more or less transparently. The result is reversed when the group of companies has neither legal personality nor the processes and arrangements that allow tax savings; the lease-management agreement is a case in point...

  13. Optimisation methodologies and algorithms for research on catalysis employing high-throughput methods: comparison using the Selox benchmark.

    Science.gov (United States)

    Pereira, Sílvia Raquel Morais; Clerc, Frédéric; Farrusseng, David; van der Waala, Jan Cornelis; Maschmeyer, Thomas

    2007-02-01

    Selox is a catalytic benchmark for the selective CO oxidation reaction in the presence of H2, in the form of mathematical equations obtained via modelling of experimental results. The optimisation efficiencies of several global optimisation algorithms were studied using the Selox benchmark. Genetic Algorithms, Evolutionary Strategies, Simulated Annealing, Taboo Search and Genetic Algorithms hybridised with Knowledge Discovery procedures were the methods compared. A Design of Experiments (DoE) search strategy was also exemplified using this benchmark. The main differences regarding the applicability of DoE and global optimisation techniques are highlighted. Evolutionary Strategies, Genetic Algorithms using the sharing procedure, and the hybrid Genetic Algorithms proved to be the most successful in the benchmark optimisation.

  14. Optimising threshold levels for information transmission in binary threshold networks: Independent multiplicative noise on each threshold

    Science.gov (United States)

    Zhou, Bingchang; McDonnell, Mark D.

    2015-02-01

    The problem of optimising the threshold levels in a multilevel threshold system subject to multiplicative Gaussian and uniform noise is considered. Similar to previous results for additive noise, we find a bifurcation phenomenon in the optimal threshold values as the noise intensity changes. This occurs when the number of threshold units is greater than one. We also study the optimal thresholds for combined additive and multiplicative Gaussian noise, and find that all threshold levels need to be identical to optimise the system when the additive noise intensity is a constant. However, this identical value is not equal to the signal mean, unlike the case of additive noise. When the multiplicative noise intensity is instead held constant, the optimal threshold levels are not all identical for small additive noise intensity but are all equal to zero for large additive noise intensity. The model and our results are potentially relevant for sensor network design and understanding neurobiological sensory neurons such as in the peripheral auditory system.
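
    The system being optimised can be illustrated with a small Monte-Carlo model of binary threshold units whose inputs carry independent multiplicative Gaussian noise; the exact noise coupling, signal statistics and threshold values below are illustrative assumptions, and the information-theoretic threshold optimisation itself is not reproduced.

```python
# Monte-Carlo sketch of a binary threshold network with independent
# multiplicative Gaussian noise on each unit: the overall output is the count
# of units whose noisy input exceeds its threshold. Signal statistics, the
# noise coupling and the threshold values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(9)
n_units, n_samples = 8, 100_000
signal = rng.normal(0.0, 1.0, n_samples)                  # common input signal
thresholds = np.zeros(n_units)                            # e.g. all-identical case

def network_output(x, thetas, noise_std=0.5):
    # Independent multiplicative noise per unit: each sees x * (1 + noise).
    noise = rng.normal(0.0, noise_std, (len(thetas), len(x)))
    fired = (x[None, :] * (1.0 + noise)) > thetas[:, None]
    return fired.sum(axis=0)                              # multilevel output 0..n_units

y = network_output(signal, thresholds)
print("output level frequencies:", np.bincount(y, minlength=n_units + 1) / len(y))
```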

  15. Friction stir welding: multi-response optimisation using Taguchi-based GRA

    Directory of Open Access Journals (Sweden)

    Jitender Kundu

    2016-01-01

    Full Text Available In the present experimental work, friction stir welding of aluminium alloy 5083-H321 is performed to optimise the process parameters for maximum tensile strength. Taguchi’s L9 orthogonal array has been used for three parameters – tool rotational speed (TRS), traverse speed (TS), and tool tilt angle (TTA) – each at three levels. Multi-response optimisation has been carried out through Taguchi-based grey relational analysis. The grey relational grade has been calculated for all three responses – ultimate tensile strength, percentage elongation, and micro-hardness. Analysis of variance is applied to the grey relational grade to identify the significant process parameters. TRS and TS are the two most significant parameters influencing the quality characteristics of the friction stir welded joint. Validation of the predicted values through confirmation experiments at the optimum setting shows good agreement with the experimental values.

  16. Comparing, Optimising and Benchmarking Quantum Control Algorithms in a Unifying Programming Framework

    CERN Document Server

    Machnes, S; Glaser, S J; de Fouquieres, P; Gruslys, A; Schirmer, S; Schulte-Herbrueggen, T

    2010-01-01

    To pave the way to novel applications in quantum simulation, computation, and technology, increasingly large quantum systems have to be steered with high precision. Turning the time course of pulses, i.e. piecewise constant control amplitudes, iteratively into an optimised shape is a typical task amenable to numerical optimal control. Here, we present the first comparative study of optimal control algorithms for a wide range of finite-dimensional applications. We focus on the most commonly used algorithms: GRAPE methods, which update all controls concurrently, and Krotov-type methods, which do so sequentially. Guidelines for their use are given and open research questions are pointed out. Moreover, we introduce a novel unifying algorithmic framework, DYNAMO (Dynamic Optimisation Platform), designed to provide the quantum-technology community with a convenient MATLAB-based toolset for optimal control. In addition, it gives researchers in optimal-control techniques a framework for benchmarking and compari...

  17. Optimal risky bidding strategy for a generating company by self-organising hierarchical particle swarm optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Boonchuay, Chanwit [Energy Field of Study, School of Environment, Resources and Development, Asian Institute of Technology (Thailand); Ongsakul, Weerakorn, E-mail: ongsakul@ait.asi [Energy Field of Study, School of Environment, Resources and Development, Asian Institute of Technology (Thailand)

    2011-02-15

    In this paper, an optimal risky bidding strategy for a generating company (GenCo) based on self-organising hierarchical particle swarm optimisation with time-varying acceleration coefficients (SPSO-TVAC) is proposed. A significant risk index based on the mean-standard deviation ratio (MSR) is maximised to provide the optimal bid prices and quantities. The Monte Carlo (MC) method is employed to simulate rivals' behaviour in a competitive environment. Non-convex operating cost functions of thermal generating units and minimum up/down time constraints are taken into account. The proposed bidding strategy is implemented for multi-hourly trading in a uniform-price spot market and compared to other particle swarm optimisation (PSO) variants. Test results indicate that the proposed SPSO-TVAC approach can provide a higher MSR than the other PSO methods. It is potentially applicable to risk management of the profit variation of a GenCo in the spot market.

  18. Identification of the mechanical behaviour of biopolymer composites using multistart optimisation technique

    KAUST Repository

    Brahim, Elhacen

    2013-10-01

    This paper aims at identifying the mechanical behaviour of starch-zein composites as a function of zein content using a novel optimisation technique. Starting from bending experiments, force-deflection response is used to derive adequate mechanical parameters representing the elastic-plastic behaviour of the studied material. For such a purpose, a finite element model is developed accounting for a simple hardening rule, namely isotropic hardening model. A deterministic optimisation strategy is implemented to provide rapid matching between parameters of the constitutive law and the observed behaviour. Results are discussed based on the robustness of the numerical approach and predicted tendencies with regards to the role of zein content. © 2013 Elsevier Ltd.

  19. Optimised In2S3 Thin Films Deposited by Spray Pyrolysis

    Directory of Open Access Journals (Sweden)

    Hristina Spasevska

    2012-01-01

    Full Text Available Indium sulphide has been extensively investigated as a component for different kinds of photovoltaic devices (organic-inorganic hybrid devices, all-inorganic devices, dye-sensitised cells). In this paper, we have optimised the growth conditions of indium sulphide thin films by means of a low-cost, versatile deposition technique, spray pyrolysis. The quality of the deposited films has been characterised by micro-Raman, vis-UV spectroscopy, and atomic force microscopy. Substrate deposition temperature and different postdeposition annealing conditions have been investigated in order to obtain information about the quality of the obtained compound (which crystalline or amorphous phases are present) and the morphology of the deposited films. We have shown that the deposition temperature strongly influences the amount of amorphous phase and the roughness of the indium sulphide films. Optimised postdeposition annealing treatments can strongly improve the final amount of the beta phase, almost independently of the percentage of amorphous phase present in the as-deposited films.

  20. Multi-parameter building thermal analysis using the lattice method for global optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Saporito, A. [Fire and Environmental Modelling Centre, Building Research Establishment, Watford (United Kingdom); Day, A.R.; Karayiannis, T.G. [School of Engineering Systems and Design, South Bank University, London (United Kingdom); Parand, F. [Centre for Construction IT, Building Research Establishment, Watford (United Kingdom)

    2000-07-01

    The energy performance in buildings is a complex function of the building form and structure, heating system, occupancy pattern, operating schedules, and external climatic conditions. Computer simulations can help understand the dynamic interactions of these parameters. However, to carry out a multi-parameter analysis for the optimisation of the building energy performance, it is necessary to reduce the large number of tests resulting from all possible parameter combinations. In this paper, the lattice method for global optimisation (LMGO) for reducing the number of tests was used. A multi-parameter study was performed to investigate the heating energy use in office buildings using the thermal simulation code APACHE (IES-FACET). From the results of the sensitivity analysis it was possible to estimate the relative importance of various energy saving features. (author)