WorldWideScience

Sample records for global optimisation toolbox

  1. A Data Analysis Toolbox for Modeling the Global Food-Energy-Water Nexus

    Science.gov (United States)

    AghaKouchak, A.; Sadegh, M.; Mallakpour, I.

    2017-12-01

    Water, food, and energy systems are highly interconnected. More than seventy percent of global water resources are used for food production. Water withdrawal, purification, and transfer systems are energy intensive. Furthermore, energy generation strongly depends on water availability. Considering the interactions in the water-food-energy nexus is therefore crucial for sustainable management of available resources. In this presentation, we introduce a user-friendly data analysis toolbox that mines the available global data on food, energy, and water and analyzes their interactions. This toolbox provides estimates of the water footprint for a wide range of food types in different countries and also approximates the required energy and water resources. The toolbox also provides estimates of the corresponding emissions and biofuel production of different crops. In summary, this toolbox allows evaluation of the dependencies among the food, energy, and water systems at the country scale. We present a global analysis of the interactions between water, food, and energy from different perspectives, including efficiency and diversity of resource use.
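
As a toy illustration of the kind of country-scale accounting such a toolbox performs, the sketch below totals the water footprint of a food basket from per-kilogram footprints. The footprint values and function names are hypothetical placeholders, not outputs of the actual toolbox.

```python
# Illustrative water-footprint accounting. The per-kg figures below are rough,
# made-up literature-style values used purely for illustration.
WATER_FOOTPRINT_L_PER_KG = {  # litres of water per kg of product (illustrative)
    "wheat": 1800,
    "rice": 2500,
    "beef": 15400,
}

def basket_water_footprint(basket_kg):
    """Total water footprint (litres) of a basket {food: kg}."""
    return sum(WATER_FOOTPRINT_L_PER_KG[food] * kg for food, kg in basket_kg.items())

print(basket_water_footprint({"wheat": 2.0, "beef": 0.5}))  # 2*1800 + 0.5*15400 = 11300.0
```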

  2. Toolbox for super-structured and super-structure free multi-disciplinary building spatial design optimisation

    NARCIS (Netherlands)

    Boonstra, S.; van der Blom, K.; Hofmeyer, H.; Emmerich, M.T.M.; van Schijndel, A.W.M.; de Wilde, P.

    2018-01-01

    Multi-disciplinary optimisation of building spatial designs is characterised by large solution spaces. Here two approaches are introduced, one being super-structured and the other super-structure free. Both are different in nature and perform differently for large solution spaces and each requires

  3. Toolbox for Urban Mobility Simulation: High Resolution Population Dynamics for Global Cities

    Science.gov (United States)

    Bhaduri, B. L.; Lu, W.; Liu, C.; Thakur, G.; Karthik, R.

    2015-12-01

    In this rapidly urbanizing world, the unprecedented rate of population growth is not only mirrored by increasing demand for energy, food, water, and other natural resources, but also has detrimental impacts on environmental and human security. Transportation simulations are frequently used for mobility assessment in urban planning, traffic operation, and emergency management. Previous research, ranging from purely analytical techniques to simulations capturing behavior, has investigated questions and scenarios regarding the relationships among energy, emissions, air quality, and transportation. The primary limitations of past attempts have been the availability of input data, useful "energy and behavior focused" models, validation data, and adequate computational capability that allows a thorough understanding of the interdependencies of our transportation system. With the increasing availability and quality of traditional and crowdsourced data, we have utilized the OpenStreetMap road network and integrated high-resolution population data with traffic simulation to create a Toolbox for Urban Mobility Simulations (TUMS) at global scale. TUMS consists of three major components: data processing, traffic simulation models, and Internet-based visualizations. It integrates OpenStreetMap, LandScanTM population data, and other open data (Census Transportation Planning Products, National Household Travel Survey, etc.) to generate both normal traffic operation and emergency evacuation scenarios. TUMS integrates TRANSIMS and MITSIM as traffic simulation engines, which are open-source and widely accepted for scalable traffic simulations. A consistent data and simulation platform allows quick adaptation to various geographic areas, as has been demonstrated for multiple cities across the world. We are combining the strengths of geospatial data sciences, high-performance simulations, transportation planning, and emissions, vehicle and energy technology development to design and develop a simulation

  4. Navigating catastrophes: Local but not global optimisation allows for macro-economic navigation of crises

    Science.gov (United States)

    Harré, Michael S.

    2013-02-01

    Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: crashes in financial markets, and whether traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, collapsing businesses, and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium, where growth dominates the relatively minor fluctuations in prices. Recent work from within economics, as well as by physicists, psychologists, and computational scientists, has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that, under very specific conditions, local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required, then under very mild assumptions market catastrophes are an unavoidable consequence. The third is that if only local optimisation and economic growth are required, then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.
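
The contrast between local and global optimisation that this record builds on can be sketched numerically. The toy example below (unrelated to the paper's economic model; the function and parameters are made up) shows gradient descent settling into whichever basin it starts in, while a multi-start global search compares basins.

```python
# Local vs global optimisation on a one-dimensional multimodal function.
def f(x):
    return x**4 - 3 * x**2 + x   # two local minima, the left one is global

def grad(x):
    return 4 * x**3 - 6 * x + 1

def local_minimise(x, lr=0.01, steps=5000):
    # plain gradient descent: converges to the minimum of the starting basin
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def global_minimise(lo=-3.0, hi=3.0, n=61):
    # crude global strategy: refine each coarse grid point locally, keep the best
    starts = [lo + i * (hi - lo) / (n - 1) for i in range(n)]
    return min((local_minimise(s) for s in starts), key=f)

x_local = local_minimise(1.5)   # basin-dependent result (right basin)
x_global = global_minimise()    # best over all basins (left basin)
print(round(x_local, 2), round(x_global, 2))
```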

  5. Global performance enhancements via pedestal optimisation on ASDEX Upgrade

    NARCIS (Netherlands)

    Dunne, M.G.; Frassinetti, L.; Beurkens, M.N.A.; Cavedon, M.; Fietz, S.; Fischer, R.; Giannone, L.; Huijsmans, G.T.A.; Kurzan, B.; Laggner, F.; McCarthy, P.J.; McDermott, R.M.; Tardini, G.; Viezzer, E.; Willensdorfer, M.; Wolfrum, E.

    2017-01-01

    Results of experimental scans of heating power, plasma shape, and nitrogen content are presented, with a focus on global performance and pedestal alteration. In detailed scans at low triangularity, it is shown that the increase in stored energy due to nitrogen seeding stems from the pedestal. It is

  6. Combining local and global optimisation for virtual camera control

    OpenAIRE

    Burelli, Paolo; Yannakakis, Georgios N.; 2010 IEEE Symposium on Computational Intelligence and Games

    2010-01-01

    Controlling a virtual camera in 3D computer games is a complex task. The camera is required to react to dynamically changing environments and produce high quality visual results and smooth animations. This paper proposes an approach that combines local and global search to solve the virtual camera control problem. The automatic camera control problem is described and it is decomposed into sub-problems; then a hierarchical architecture that solves each sub-problem using the most appropriate op...

  7. Switch of Sensitivity Dynamics Revealed with DyGloSA Toolbox for Dynamical Global Sensitivity Analysis as an Early Warning for System's Critical Transition

    Science.gov (United States)

    Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas

    2013-01-01

    Systems with bifurcations may experience abrupt, irreversible, and often unwanted shifts in their performance, called critical transitions. For many systems, such as climate, the economy, and ecosystems, it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions, including increased variance, autocorrelation, and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA, a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equation models. We suggest that the switch in the dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with existing SA techniques. DyGloSA is based on SBToolbox2 and contains functions that dynamically compute the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC and WALS. It includes parallelized versions of the functions, enabling a significant reduction of the computational time (up to 12-fold). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (version R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on 32- and 64-bit Windows and Linux systems. PMID:24367574
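
DyGloSA applies variance-based GPSA methods (eFAST, Sobol's ANOVA, PRCC, WALS). As a much simpler, hypothetical analogue of the "dynamical sensitivity" idea, the sketch below tracks a finite-difference parameter sensitivity of an ODE solution over time and locates where it peaks; the model and numbers are illustrative only.

```python
# Time-resolved finite-difference parameter sensitivity of an ODE solution.
def simulate(r, x0=0.1, dt=0.01, steps=500):
    """Logistic growth dx/dt = r*x*(1-x), integrated with forward Euler."""
    xs, x = [], x0
    for _ in range(steps):
        x += dt * r * x * (1 - x)
        xs.append(x)
    return xs

def dynamic_sensitivity(r, eps=1e-4):
    """dx(t)/dr approximated by a forward finite difference in r."""
    base, pert = simulate(r), simulate(r + eps)
    return [(p - b) / eps for b, p in zip(base, pert)]

s = dynamic_sensitivity(1.0)
# the sensitivity first grows, then decays as the state approaches the
# r-independent equilibrium x = 1, so its maximum lies inside the time window
peak = s.index(max(s))
print(peak, len(s))
```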

  9. Global nuclear industry views: challenges arising from the evolution of the optimisation principle in radiological protection

    International Nuclear Information System (INIS)

    Saint-Pierre, S.

    2012-01-01

    Over the last few decades, the steady progress achieved in reducing planned exposures of both workers and the public has been admirable in the nuclear sector. However, the disproportionate focus on tiny public exposures and radioactive discharges associated with normal operations came at a high price, and the quasi-denial of a risk of major accident and related weaknesses in emergency preparedness and response came at an even higher price. Fukushima has unfortunately taught us that radiological protection (RP) for emergency and post-emergency situations can be much more than a simple evacuation that lasts 24–48 h, with people returning safely to their homes soon afterwards. On optimisation of emergency and post-emergency exposures, the only ‘show in town’ in terms of international RP policy improvements has been the issuance of the 2007 Recommendations of the International Commission on Radiological Protection (ICRP). However, no matter how genuine these improvements are, they have not been ‘road tested’ on the practical reality of severe accidents. Post-Fukushima, there is a compelling case to review the practical adequacy of key RP notions such as optimisation, evacuation, sheltering, and reference levels for workers and the public, and to amend these notions with a view to making the international RP system more useful in the event of a severe accident. On optimisation of planned exposures, the reality is that, nowadays, margins for further reductions of public doses in the nuclear sector are very small, and the smaller the dose, the greater the extra effort needed to reduce the dose further. If sufficient caution is not exercised in the use of RP notions such as dose constraints, there is a real risk of challenging nuclear power technologies beyond safety reasons. For nuclear new build, it is the optimisation of key operational parameters of nuclear power technologies (not RP) that is of paramount importance to improve their overall efficiency. In

  11. Virtual toolbox

    Science.gov (United States)

    Jacobus, Charles J.; Jacobus, Heidi N.; Mitchell, Brian T.; Riggs, A. J.; Taylor, Mark J.

    1993-04-01

    At least three of the five senses must be fully addressed in a successful virtual reality (VR) system. Sight, sound, and touch are the most critical elements for creating the illusion of presence. Since humans depend so much on sight to collect information about their environment, this area has been the focus of much of the prior art in virtual reality; however, it is also crucial that we provide facilities for force, torque, and touch reflection, as well as sound replay and 3-D localization. In this paper we present a sampling of hardware and software in the virtual environment maker's 'toolbox' which can support the rapid construction of customized VR systems. We provide demonstrative examples of how some of the tools work, and we speculate about VR applications and future technology needs.

  12. Optimal Optimisation in Chemometrics

    NARCIS (Netherlands)

    Hageman, J.A.

    2004-01-01

    The use of global optimisation methods is not straightforward, especially for the more difficult optimisation problems. Solutions have to be found for items such as the evaluation function, representation, step function and meta-parameters, before any useful results can be obtained. This thesis aims

  13. The FRISBEE tool, a software for optimising the trade-off between food quality, energy use, and global warming impact of cold chains

    NARCIS (Netherlands)

    Gwanpua, S.G.; Verboven, P.; Leducq, D.; Brown, T.; Verlinden, B.E.; Bekele, E.; Aregawi, W. Evans, J.; Foster, A.; Duret, S.; Hoang, H.M.; Sluis, S. van der; Wissink, E.; Hendriksen, L.J.A.M.; Taoukis, P.; Gogou, E.; Stahl, V.; El Jabri, M.; Le Page, J.F.; Claussen, I.; Indergård, E.; Nicolai, B.M.; Alvarez, G.; Geeraerd, A.H.

    2015-01-01

    Food quality (including safety) along the cold chain, energy use and global warming impact of refrigeration systems are three key aspects in assessing cold chain sustainability. In this paper, we present the framework of a dedicated software, the FRISBEE tool, for optimising quality of refrigerated

  14. Air Sensor Toolbox

    Science.gov (United States)

    Air Sensor Toolbox provides information to citizen scientists, researchers and developers interested in learning more about new lower-cost compact air sensor technologies and tools for measuring air quality.

  15. Resolution of crystal structures by X-ray and neutrons powder diffraction using global optimisation methods; Resolution des structures cristallines par diffraction des rayons X et neutrons sur poudres en utilisant les methodes d'optimisation globale

    Energy Technology Data Exchange (ETDEWEB)

    Palin, L

    2005-03-15

    We have shown in this work that powder X-ray diffraction is a powerful tool for analyzing crystal structures. The purpose of this thesis is the resolution of crystal structures by X-ray and neutron powder diffraction using global optimisation methods. We have studied three different topics. The first is order-disorder phenomena observed in some globular organic molecular solids. The second is the opiate family of neuropeptides. These neurotransmitters regulate sensory functions, including pain and the control of respiration, in the central nervous system. The aim of our study was to determine the crystal structure of Leu-enkephalin and some of its sub-fragments. The determination of the crystal structures was carried out by performing Monte Carlo simulations. The third is the location of benzene in a sodium-X zeolite. The zeolite framework was already known, and the benzene was localized by simulated annealing and by the use of maximum-entropy maps.
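
The structures in this record were solved with Monte Carlo and simulated annealing searches. The generic sketch below shows the core accept/reject loop of simulated annealing on a toy one-dimensional cost; the real crystallographic cost function, which compares calculated and observed diffraction patterns, is replaced here by an illustrative multimodal function.

```python
# Minimal simulated annealing: accept downhill moves always, uphill moves with
# probability exp(-delta/T), and geometrically cool the temperature.
import math, random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.995, iters=4000, seed=0):
    rng = random.Random(seed)
    x, t, best = x0, t0, x0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        d = cost(cand) - cost(x)
        if d < 0 or rng.random() < math.exp(-d / t):
            x = cand                     # accept move
        if cost(x) < cost(best):
            best = x                     # track best state ever visited
        t *= cooling                     # cooling schedule
    return best

# toy multimodal cost whose global minimum lies near x = 2.2
toy_cost = lambda x: (x - 2) ** 2 + math.sin(5 * x)
best = simulated_annealing(toy_cost, x0=-3.0)
print(round(best, 2))
```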

  16. Artificial neural network optimisation for monthly average daily global solar radiation prediction

    International Nuclear Information System (INIS)

    Alsina, Emanuel Federico; Bortolini, Marco; Gamberi, Mauro; Regattieri, Alberto

    2016-01-01

    Highlights: • Prediction of the monthly average daily global solar radiation over Italy. • Multi-location Artificial Neural Network (ANN) model: 45 locations considered. • Optimal ANN configuration with 7 input climatologic/geographical parameters. • Statistical indicators: MAPE, NRMSE, MPBE. - Abstract: The availability of reliable climatologic data is essential for multiple purposes in a wide set of anthropic activities and operative sectors. Direct measurements frequently present spatial and temporal gaps, so predictive approaches become of interest. This paper focuses on the prediction of the Monthly Average Daily Global Solar Radiation (MADGSR) over Italy using Artificial Neural Networks (ANNs). Data from 45 locations compose the multi-location ANN training and testing sets. For each location, 13 input parameters are considered, including the geographical coordinates and the monthly values of the most frequently adopted climatologic parameters. A subset of 17 locations is used for ANN training, while the testing step uses data from the remaining 28 locations. Furthermore, the Automatic Relevance Determination (ARD) method is used to identify the most relevant inputs for accurate MADGSR prediction. The best ANN configuration includes only 7 parameters, i.e. Top of Atmosphere (TOA) radiation, day length, number of rainy days and average rainfall, latitude and altitude. The correlation performances, expressed through statistical indicators such as the Mean Absolute Percentage Error (MAPE), range between 1.67% and 4.25%, depending on the number and type of the chosen inputs, representing a good solution compared to the current standards.
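
The MAPE figures quoted above follow directly from its definition as the mean absolute relative error. The sketch below computes it for made-up monthly values; the numbers and units are illustrative only, not data from the paper.

```python
# Mean Absolute Percentage Error between measured and predicted series.
def mape(actual, predicted):
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

measured  = [12.0, 15.0, 20.0, 25.0]   # e.g. MJ/m^2/day, illustrative values
predicted = [11.5, 15.6, 19.0, 26.0]

print(round(mape(measured, predicted), 2))  # 4.29
```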

  17. DACE - A Matlab Kriging Toolbox

    DEFF Research Database (Denmark)

    2002-01-01

    DACE, Design and Analysis of Computer Experiments, is a Matlab toolbox for working with kriging approximations to computer models.
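
DACE builds kriging (Gaussian-process) surrogates of computer models. As a minimal, hypothetical analogue, the sketch below performs simple kriging with a fixed Gaussian correlation in pure Python; DACE itself also fits the correlation parameters by maximum likelihood, which is omitted here.

```python
# Simple kriging predictor: solve K w = k, then predict y(x) = w . y.
import math

def kernel(a, b, theta=1.0):
    return math.exp(-theta * (a - b) ** 2)   # Gaussian correlation

def krige(xs, ys, x, theta=1.0, nugget=1e-10):
    n = len(xs)
    K = [[kernel(xs[i], xs[j], theta) + (nugget if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    k = [kernel(xi, x, theta) for xi in xs]
    # Gaussian elimination with partial pivoting on the augmented system [K | k]
    A = [row[:] + [k[i]] for i, row in enumerate(K)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (A[r][n] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    return sum(wi * yi for wi, yi in zip(w, ys))

xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 4.0]   # samples of y = x^2
print(round(krige(xs, ys, 1.0), 3))          # interpolates the sample point: 1.0
```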

  18. Broadview Radar Altimetry Toolbox

    Science.gov (United States)

    Garcia-Mondejar, Albert; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrózio, Américo; Restano, Marco; Benveniste, Jérôme

    2017-04-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox), which can read data from all previous and current altimetry missions, now incorporates the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT, involving combinations of data fields that the user can save for later reuse, or using the embedded formulas, which include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a strong introduction to altimetry, shows its applications in different fields such as oceanography, the cryosphere, geodesy, and hydrology, among others. Also included are "use cases", with step-by-step examples, on how to use the toolbox in different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version. While developing the toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific

  19. Global Topology Optimisation

    Science.gov (United States)

    2016-10-31

    [Abstract unavailable: the harvested record text consists of fragments of figures and equations from the report, concerning the discretisation of a boundary Γ with unequally spaced points (Fig. 2), a comparison of geometric and finite-difference perturbation estimates of mean curvature against radius, and the sensitivity (Eq. 3) of the perimeter of a curve represented by a set of points (Fig. 9).]

  20. ICT: isotope correction toolbox.

    Science.gov (United States)

    Jungreuthmayer, Christian; Neubauer, Stefan; Mairinger, Teresa; Zanghellini, Jürgen; Hann, Stephan

    2016-01-01

    Isotope tracer experiments are an invaluable technique for analyzing and studying the metabolism of biological systems. However, isotope labeling experiments are often affected by naturally abundant isotopes, especially in cases where mass spectrometric methods make use of derivatization. The correction of these additive interferences--in particular for complex isotopic systems--is numerically challenging and still an emerging field of research. When positional information is generated via collision-induced dissociation, even more complex calculations for isotopic interference correction are necessary. So far, no freely available tools can handle tandem mass spectrometry data. We present the isotope correction toolbox (ICT), a program that corrects tandem mass isotopomer data from tandem mass spectrometry experiments. The isotope correction toolbox is written in the multi-platform programming language Perl and can therefore be used on all commonly available computer platforms. Source code and documentation can be freely obtained under the Artistic License or the GNU General Public License from: https://github.com/jungreuthc/isotope_correction_toolbox/ {christian.jungreuthmayer@boku.ac.at,juergen.zanghellini@boku.ac.at} Supplementary data are available at Bioinformatics online.
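
A minimal sketch of the additive-interference problem such tools solve: measured mass isotopomer intensities are the true labelling pattern convolved with natural isotope abundance, so correction inverts a lower-triangular matrix. The two-carbon example below is hypothetical and ignores derivatization and tandem MS, which ICT itself handles.

```python
# Natural-abundance correction for a fragment with n carbons and natural
# 13C abundance p, via forward substitution on a lower-triangular matrix.
from math import comb

def correction_matrix(n, p=0.0107):
    """M[i][j]: probability that a molecule with j labelled carbons is measured
    at mass shift i, because i - j of its n - j unlabelled carbons are 13C."""
    return [[comb(n - j, i - j) * p ** (i - j) * (1 - p) ** (n - i) if i >= j else 0.0
             for j in range(n + 1)] for i in range(n + 1)]

def correct(measured, p=0.0107):
    """Recover the true labelling pattern (M is lower triangular)."""
    n = len(measured) - 1
    M = correction_matrix(n, p)
    true = [0.0] * (n + 1)
    for i in range(n + 1):
        true[i] = (measured[i] - sum(M[i][j] * true[j] for j in range(i))) / M[i][i]
    return true

# round trip: simulate a measurement from a known pattern, then correct it
M = correction_matrix(2)
pattern = [0.7, 0.0, 0.3]   # 70% unlabelled, 30% doubly labelled (illustrative)
measured = [sum(M[i][j] * pattern[j] for j in range(3)) for i in range(3)]
recovered = correct(measured)
print([round(v, 6) for v in recovered])
```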

  1. ESA Atmospheric Toolbox

    Science.gov (United States)

    Niemeijer, Sander

    2017-04-01

    The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission, it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, the Copernicus Atmosphere Monitoring Service (CAMS), ground-based data, etc. The toolbox consists of three main components, called CODA, HARP and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command-line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing and inter-comparing satellite remote sensing data, model data, in-situ data, and ground-based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets. By appropriately chaining calls to HARP command-line tools, one can pre-process datasets such that two datasets that need to be compared end up having the same temporal/spatial grid, the same data format/structure, and the same physical units. The toolkit comes with its own data format conventions, the HARP format, which is based on netCDF/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data, such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, collocation of two datasets, etc. VISAN is a cross-platform visualization and

  2. Beam position optimisation for IMRT

    International Nuclear Information System (INIS)

    Holloway, L.; Hoban, P.

    2001-01-01

    Full text: The introduction of IMRT has not generally resulted in the use of optimised beam positions, because finding the global solution of the problem requires a time-consuming stochastic optimisation method. Although a deterministic method may not achieve the global minimum, it should achieve a superior dose distribution compared to no optimisation. This study aimed to develop and test such a method. The beam optimisation method developed relies on an iterative process to reach the desired number of beams from a large initial number of beams. The number of beams is reduced in a 'weeding-out' process based on the total fluence that each beam delivers. The process is gradual, with only three beams removed each time (following a small number of iterations), ensuring that the reduction in beams does not dramatically affect the fluence maps of those remaining. A comparison was made between the dose distributions achieved when the beam positions were optimised in this fashion and when the beam positions were evenly distributed. The method has been shown to work effectively and efficiently. The Figure shows a comparison of the dose distributions with optimised and non-optimised beam positions for 5 beams. It can be clearly seen that, with beam position optimisation, there is an improvement in the dose distribution delivered to the tumour and a reduction in the dose to the critical structure. A method for beam position optimisation for use in IMRT optimisations has been developed. This method, although not necessarily achieving the global minimum in beam position, still achieves a dramatic improvement compared with no beam position optimisation, and does so very efficiently. Copyright (2001) Australasian College of Physical Scientists and Engineers in Medicine
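
The 'weeding-out' procedure described above can be sketched as a greedy loop: rank candidate beams by delivered fluence and repeatedly drop the lowest few until the desired count remains. The fluence function below is a hypothetical stand-in for the optimised per-beam fluence totals; in the actual method the fluence maps are re-optimised between removals.

```python
# Greedy beam 'weeding-out': drop the lowest-fluence beams a few at a time.
def weed_beams(beams, fluence_of, keep, drop_per_round=3):
    beams = list(beams)
    while len(beams) > keep:
        ranked = sorted(beams, key=fluence_of)          # ascending fluence
        n_drop = min(drop_per_round, len(beams) - keep) # never drop below 'keep'
        beams = [b for b in beams if b not in ranked[:n_drop]]
    return beams

angles = list(range(0, 360, 20))   # 18 candidate beam angles (degrees)

def fluence(a):
    # hypothetical stand-in for 'total fluence delivered by the beam at angle a':
    # here beams near 180 degrees happen to deliver the most
    return 100 - abs(a - 180)

selected = weed_beams(angles, fluence, keep=5)
print(sorted(selected))  # [140, 160, 180, 200, 220]
```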

  3. ESA BRAT (Broadview Radar Altimetry Toolbox) and GUT (GOCE User Toolbox) toolboxes

    Science.gov (United States)

    Benveniste, J.; Ambrozio, A.; Restano, M.

    2016-12-01

    The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including the upcoming Sentinel-3A L1 and L2 products. A tutorial is included, providing plenty of use cases. BRAT's next release (4.0.0) is planned for September 2016. Based on community feedback, the front-end has been further improved and simplified, whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain the desired data while bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving combinations of data fields, which can be saved for future use, either by using embedded formulas, including those from oceanographic altimetry, or by implementing ad hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data, and allows gravity field data, in conjunction, and consistently, with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as solid-earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during the GRACE and GOCE missions. In the current 3.0 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements, aimed at facilitating the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies, have been introduced. Packaged with GUT is also GUT's VCM (Variance-Covariance Matrix) tool for analysing GOCE

  4. Towards a binaural modelling toolbox

    DEFF Research Database (Denmark)

    Søndergaard, Peter Lempel; Culling, John F.; Dau, Torsten

    2011-01-01

The Auditory Modelling Toolbox (AMToolbox) is a new Matlab/Octave toolbox for developing and applying auditory perceptual models, in particular binaural models. The philosophy behind the project is that the models should be implemented in a consistent manner, well documented and user...

  5. Ubuntu Linux toolbox

    CERN Document Server

    Negus, Christopher

    2012-01-01

This bestseller from Linux guru Chris Negus is packed with an array of new and revised material. As a longstanding bestseller, Ubuntu Linux Toolbox has taught you how to get the most out of Ubuntu, the world's most popular Linux distribution. With this eagerly anticipated new edition, Christopher Negus returns with a host of new and expanded coverage on tools for managing file systems, ways to connect to networks, techniques for securing Ubuntu systems, and a look at the latest Long Term Support (LTS) release of Ubuntu, all aimed at getting you up and running with Ubuntu Linux quickly.

  6. Accelerator Toolbox for MATLAB

    International Nuclear Information System (INIS)

    Terebilo, Andrei

    2001-01-01

This paper introduces the Accelerator Toolbox (AT), a collection of tools to model particle accelerators and beam transport lines in the MATLAB environment. At SSRL, it has become the modeling code of choice for the ongoing design and future operation of the SPEAR 3 synchrotron light source. AT was designed to take advantage of the power and simplicity of MATLAB, a commercially developed environment for technical computing and visualization. Many examples in this paper illustrate the advantages of the AT approach and contrast it with existing accelerator code frameworks.

  7. Wheat Rust Toolbox Related to New Initiatives on Yellow Rust

    DEFF Research Database (Denmark)

    Hansen, Jens Grønbech; Lassen, Poul

    ://www.fao.org/agriculture/crops/rust/stem/rust-report/en/). The Wheat rust toolbox is one of several International research platforms hosted by Aarhus University, and it uses the same ICT framework and databases as EuroWheat (www.eurowheat.org) and EuroBlight (www.EuroBlight.net). The Wheat Rust Toolbox will also serve the Global Rust Reference Centre (GRRC) as well...... – 2009), and as soon as possible this will be expanded to cover all global yellow rust data available via the GRRC. The presentation will focus on experiences from the previous work on global databases and web based information systems, as well as propose ideas how the toolbox can be helpful regarding...

  8. Simulation optimisation

    International Nuclear Information System (INIS)

    Anon

    2010-01-01

Over the past decade there has been a significant advance in flotation circuit optimisation through performance benchmarking using metallurgical modelling and steady-state computer simulation. This benchmarking includes traditional measures, such as grade and recovery, as well as new flotation measures, such as ore floatability, bubble surface area flux and froth recovery. To further this optimisation, Outotec has released its HSC Chemistry software with simulation modules. The flotation model developed by the AMIRA P9 Project, of which Outotec is a sponsor, is regarded by industry as the most suitable flotation model to use for circuit optimisation. This model incorporates ore floatability with flotation cell pulp and froth parameters, residence time, entrainment and water recovery. Outotec's HSC Sim enables you to simulate mineral processes at different levels, from comminution circuits with sizes and no composition, through flotation processes with minerals by size by floatability components, to full processes with true particles with MLA data.

  9. Smoke Ready Toolbox for Wildfires

    Science.gov (United States)

    This site provides an online Smoke Ready Toolbox for Wildfires, which lists resources and tools that provide information on health impacts from smoke exposure, current fire conditions and forecasts and strategies to reduce exposure to smoke.

  10. The Asian Criminological Paradigm and How It Links Global North and South: Combining an Extended Conceptual Toolbox from the North with Innovative Asian Contexts

    Directory of Open Access Journals (Sweden)

    Jianhong Liu

    2017-03-01

Full Text Available In their recent seminal paper ‘Southern Criminology’, Carrington, Hogg and Sozzo (2016) address the issue of the global divide between South/North relations in the hierarchical production of criminological knowledge. They point out that the divide privileges theories, assumptions and methods that are largely based on the empirical specificities of the global North. Carrington et al. contend that the dominance of global North criminology has led to a severe underdevelopment of criminology in the global South, except ‘in Asia, with the establishment of the Asian Criminological Society and its journal’ (Liu 2009, in Carrington et al. 2016: 3). Carrington et al. propose an important task of bridging the global divide through further developing criminology in the global South. My present paper reviews the development of Asian criminology under the framework of the Asian Criminological Paradigm (Liu 2009). I primarily review the conceptual and theoretical developments, to suggest strategies that can contribute to the task of bridging the gap between global North and South. What Asian criminology has done is expand the theoretical toolbox originally developed in the global North through the strategies of transportation of theories, elaboration of theories, and proposing new concepts and theories based on the empirical grounds of Asian contexts.

  11. Integrated Budget Office Toolbox

    Science.gov (United States)

    Rushing, Douglas A.; Blakeley, Chris; Chapman, Gerry; Robertson, Bill; Horton, Allison; Besser, Thomas; McCarthy, Debbie

    2010-01-01

    The Integrated Budget Office Toolbox (IBOT) combines budgeting, resource allocation, organizational funding, and reporting features in an automated, integrated tool that provides data from a single source for Johnson Space Center (JSC) personnel. Using a common interface, concurrent users can utilize the data without compromising its integrity. IBOT tracks planning changes and updates throughout the year using both phasing and POP-related (program-operating-plan-related) budget information for the current year, and up to six years out. Separating lump-sum funds received from HQ (Headquarters) into separate labor, travel, procurement, Center G&A (general & administrative), and servicepool categories, IBOT creates a script that significantly reduces manual input time. IBOT also manages the movement of travel and procurement funds down to the organizational level and, using its integrated funds management feature, helps better track funding at lower levels. Third-party software is used to create integrated reports in IBOT that can be generated for plans, actuals, funds received, and other combinations of data that are currently maintained in the centralized format. Based on Microsoft SQL, IBOT incorporates generic budget processes, is transportable, and is economical to deploy and support.

  12. Optimisation by hierarchical search

    Science.gov (United States)

    Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias

    2015-03-01

    Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general purpose approach to potentially improve the performance of conventional algorithms or special purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.
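    The group-wise idea can be sketched compactly. The toy below, our own minimal illustration and not the authors' algorithm, exhaustively re-optimises one small group of binary variables at a time while the rest stay fixed:

```python
import itertools

def group_optimise(cost, x, groups, sweeps=10):
    """Greedy sketch of group-wise optimisation: exhaustively
    re-optimise one group of binary variables at a time."""
    x = list(x)
    for _ in range(sweeps):
        improved = False
        for g in groups:
            for assign in itertools.product([0, 1], repeat=len(g)):
                trial = list(x)
                for i, v in zip(g, assign):
                    trial[i] = v
                if cost(trial) < cost(x):
                    x, improved = trial, True
        if not improved:
            break  # no group can be improved further
    return x, cost(x)

# Toy cost: Hamming distance to a hidden target assignment
target = [1, 0, 1, 0, 1, 1]
cost = lambda v: sum(a != b for a, b in zip(v, target))
x, c = group_optimise(cost, [0] * 6, [[0, 1, 2], [3, 4, 5]])
print(x, c)
```

    For this separable toy cost one sweep suffices; on rugged landscapes the grouping only improves the odds of escaping local optima, it does not guarantee the global one.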

  13. Particle Swarm Optimization Toolbox

    Science.gov (United States)

    Grant, Michael J.

    2010-01-01

The Particle Swarm Optimization Toolbox is a library of evolutionary optimization tools developed in the MATLAB environment. The algorithms contained in the library include a genetic algorithm (GA), a single-objective particle swarm optimizer (SOPSO), and a multi-objective particle swarm optimizer (MOPSO). Development focused on both the SOPSO and MOPSO. A GA was included mainly for comparison purposes, and the particle swarm optimizers appeared to perform better for a wide variety of optimization problems. All algorithms are capable of performing unconstrained and constrained optimization. The particle swarm optimizers are capable of performing single- and multi-objective optimization. The SOPSO and MOPSO algorithms are based on swarming theory and bird-flocking patterns, searching the trade space for the optimal solution or the optimal trade between competing objectives. The MOPSO generates Pareto fronts for objectives that are in competition. A GA, based on Darwinian evolutionary theory, is also included in the library. The GA consists of individuals that form a population in the design space. The population mates to form offspring at new locations in the design space. These offspring contain traits from both parents; the algorithm relies on this combination of traits to provide, ideally, a better solution than either parent. As the algorithm progresses, individuals that hold these optimal traits will emerge as the optimal solutions. Due to the generic design of all optimization algorithms, each algorithm interfaces with a user-supplied objective function. This function serves as a "black box" whose only purpose is to evaluate solutions proposed by the optimizers. Hence, the user-supplied function can be a numerical simulation, an analytical function, etc., since the specific detail of this function is of no concern to the optimizer. These algorithms were originally developed to support entry
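    The single-objective particle swarm update (inertia plus cognitive and social pulls) can be sketched in a few lines. This is a generic, hedged illustration in Python with conventional parameter values, not the toolbox's MATLAB implementation:

```python
import random

def pso(objective, bounds, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal single-objective PSO: velocities blend inertia (w),
    pull towards each particle's best (c1) and the swarm's best (c2)."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])  # objective is a black box here too
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimise the 2-D sphere function over [-5, 5]^2
best, val = pso(lambda p: sum(x * x for x in p), [(-5.0, 5.0)] * 2)
print(best, val)
```

    Note how the objective is consumed exactly as the abstract describes: the optimizer only ever asks it to evaluate candidate solutions.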

  14. The GMT/MATLAB Toolbox

    Science.gov (United States)

    Wessel, Paul; Luis, Joaquim F.

    2017-02-01

    The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.

  15. GOCE User Toolbox and Tutorial

    Science.gov (United States)

    Knudsen, Per; Benveniste, Jerome

    2017-04-01

The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models is made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of the GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, added the capability to compute the Simple Bouguer Anomaly (Solid Earth). This fall, a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at an implementation of the remaining functionalities and facilitating a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies.
Accordingly, GUT version 3 has:
 - an attractive and easy-to-use Graphical User Interface (GUI) for the toolbox,
 - further software functionalities, such as facilities for the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies,
 - an associated GUT VCM tool for analysing the GOCE variance-covariance matrices.

  16. Remote care of a patient with stroke in rural Trinidad: use of telemedicine to optimise global neurological care.

    Science.gov (United States)

    Reyes, Antonio Jose; Ramcharan, Kanterpersad

    2016-08-02

We report a patient-driven home care system that successfully assisted 24/7 with the management of a 68-year-old woman after a stroke, a global illness. The patient's caregiver and physician used computer devices, smartphones and internet access for information exchange. Patient, caregiver, family and physician satisfaction, coupled with outcome and cost, were indicators of quality of care. The novelty of this basic model of teleneurology is characterised by implementing a patient/caregiver-driven system designed to improve access to cost-efficient neurological care, which has potential for use at the primary, secondary and tertiary levels of healthcare in rural and underserved regions of the world. We suggest involvement of healthcare stakeholders in teleneurology to address this global problem of limited access to neurological care. This model can facilitate the management of neurological diseases, impact outcome, reduce the frequency of consultations and hospitalisations, facilitate teaching of healthcare workers and promote research. 2016 BMJ Publishing Group Ltd.

  17. Distributed Aerodynamic Sensing and Processing Toolbox

    Science.gov (United States)

    Brenner, Martin; Jutte, Christine; Mangalam, Arun

    2011-01-01

    A Distributed Aerodynamic Sensing and Processing (DASP) toolbox was designed and fabricated for flight test applications with an Aerostructures Test Wing (ATW) mounted under the fuselage of an F-15B on the Flight Test Fixture (FTF). DASP monitors and processes the aerodynamics with the structural dynamics using nonintrusive, surface-mounted, hot-film sensing. This aerodynamic measurement tool benefits programs devoted to static/dynamic load alleviation, body freedom flutter suppression, buffet control, improvement of aerodynamic efficiency through cruise control, supersonic wave drag reduction through shock control, etc. This DASP toolbox measures local and global unsteady aerodynamic load distribution with distributed sensing. It determines correlation between aerodynamic observables (aero forces) and structural dynamics, and allows control authority increase through aeroelastic shaping and active flow control. It offers improvements in flutter suppression and, in particular, body freedom flutter suppression, as well as aerodynamic performance of wings for increased range/endurance of manned/ unmanned flight vehicles. Other improvements include inlet performance with closed-loop active flow control, and development and validation of advanced analytical and computational tools for unsteady aerodynamics.

  18. A toolbox for European judges

    NARCIS (Netherlands)

    Hesselink, M.W.

    2011-01-01

    The forthcoming instrument on European contract law, be it in the shape of an optional code for cross-border contracts or as an official toolbox for the European legislator, is likely to have a spill-over effect on private law adjudication in Europe. Judges will have no great difficulty in finding

  19. Extend your toolbox with R

    CERN Multimedia

    CERN. Geneva

    2015-01-01

R seems to be an ideal tool for visualising your data, as well as for practically all other data-related tasks. Learn how to get started with R; it is worth a place in more than just statisticians' and data scientists' toolboxes.

  20. The Life Cycle Analysis Toolbox

    International Nuclear Information System (INIS)

    Bishop, L.; Tonn, B.E.; Williams, K.A.; Yerace, P.; Yuracko, K.L.

    1999-01-01

    The life cycle analysis toolbox is a valuable integration of decision-making tools and supporting materials developed by Oak Ridge National Laboratory (ORNL) to help Department of Energy managers improve environmental quality, reduce costs, and minimize risk. The toolbox provides decision-makers access to a wide variety of proven tools for pollution prevention (P2) and waste minimization (WMin), as well as ORNL expertise to select from this toolbox exactly the right tool to solve any given P2/WMin problem. The central element of the toolbox is a multiple criteria approach to life cycle analysis developed specifically to aid P2/WMin decision-making. ORNL has developed numerous tools that support this life cycle analysis approach. Tools are available to help model P2/WMin processes, estimate human health risks, estimate costs, and represent and manipulate uncertainties. Tools are available to help document P2/WMin decision-making and implement programs. Tools are also available to help track potential future environmental regulations that could impact P2/WMin programs and current regulations that must be followed. An Internet-site will provide broad access to the tools

  1. GOCE user toolbox and tutorial

    DEFF Research Database (Denmark)

    Knudsen, Per; Benveniste, Jerome

    2011-01-01

    consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models are made...

  2. Detection of crack-like indications in digital radiography by global optimisation of a probabilistic estimation function

    Energy Technology Data Exchange (ETDEWEB)

    Alekseychuk, O.

    2006-07-01

A new algorithm for the detection of longitudinal crack-like indications in radiographic images is developed in this work. Conventional local detection techniques give unsatisfactory results for this task due to the low signal-to-noise ratio (SNR ∝ 1) of crack-like indications in radiographic images. The use of global features of crack-like indications provides the necessary noise resistance, but it entails prohibitive computational complexity of detection and difficulties in formally describing the indication shape. Conventionally, this excessive computational complexity is reduced by the use of heuristics. The heuristics used are selected on a trial-and-error basis, are problem-dependent and do not guarantee the optimal solution. A distinctive feature of the algorithm developed here is that it does not follow this route. Instead, a global characteristic of a crack-like indication (the estimation function) is used, whose maximum in the space of all possible positions, lengths and shapes can be found exactly, i.e. without any heuristics. The proposed estimation function is defined as the sum of the a posteriori information gains about the hypothesis of indication presence at each point along the whole hypothetical indication. The gain in information about the hypothesis of indication presence results from the analysis of the underlying image in a local area. Such an estimation function is theoretically justified and exhibits the desired behaviour on changing signals. The developed algorithm is implemented in the C++ programming language and tested on synthetic as well as real images. It delivers good results (a high correct-detection rate at a given false-alarm rate) that are comparable to the performance of trained human inspectors.
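    The claim that the maximum can be found exactly, without heuristics, is easiest to see on a much-simplified stand-in: if a per-pixel score (playing the role of the local information gain) is given, the best top-to-bottom crack path can be found exactly by dynamic programming. The score model below is our illustration, not the paper's estimation function:

```python
def best_path(score):
    """Exact maximum, over all top-to-bottom paths moving at most one
    column per row, of the summed per-pixel score (dynamic programming).
    score[r][c] stands in for the local information gain at pixel (r, c)."""
    rows, cols = len(score), len(score[0])
    best = [list(score[0])]
    for r in range(1, rows):
        prev = best[-1]
        best.append([score[r][c] + max(prev[max(c - 1, 0):min(c + 2, cols)])
                     for c in range(cols)])
    return max(best[-1])

# A faint vertical "crack" in the middle column
score = [[0, 1, 0],
         [0, 1, 0],
         [0, 1, 0]]
print(best_path(score))  # prints 3
```

    The structural point carried over from the abstract: because the score decomposes into per-point gains, the search over all positions and shapes collapses to an exact polynomial-time recursion, with no trial-and-error heuristics.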

  3. Detection of crack-like indications in digital radiography by global optimisation of a probabilistic estimation function

    International Nuclear Information System (INIS)

    Alekseychuk, O.

    2006-01-01

A new algorithm for the detection of longitudinal crack-like indications in radiographic images is developed in this work. Conventional local detection techniques give unsatisfactory results for this task due to the low signal-to-noise ratio (SNR ∝ 1) of crack-like indications in radiographic images. The use of global features of crack-like indications provides the necessary noise resistance, but it entails prohibitive computational complexity of detection and difficulties in formally describing the indication shape. Conventionally, this excessive computational complexity is reduced by the use of heuristics. The heuristics used are selected on a trial-and-error basis, are problem-dependent and do not guarantee the optimal solution. A distinctive feature of the algorithm developed here is that it does not follow this route. Instead, a global characteristic of a crack-like indication (the estimation function) is used, whose maximum in the space of all possible positions, lengths and shapes can be found exactly, i.e. without any heuristics. The proposed estimation function is defined as the sum of the a posteriori information gains about the hypothesis of indication presence at each point along the whole hypothetical indication. The gain in information about the hypothesis of indication presence results from the analysis of the underlying image in a local area. Such an estimation function is theoretically justified and exhibits the desired behaviour on changing signals. The developed algorithm is implemented in the C++ programming language and tested on synthetic as well as real images. It delivers good results (a high correct-detection rate at a given false-alarm rate) that are comparable to the performance of trained human inspectors.

  4. GOCE User Toolbox and Tutorial

    Science.gov (United States)

    Knudsen, P.; Benveniste, J.

    2011-07-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT support applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance in how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models are made available as well. GUT has been developed in a collaboration within the GUT Core Group. The GUT Core Group: S. Dinardo, D. Serpe, B.M. Lucas, R. Floberghagen, A. Horvath (ESA), O. Andersen, M. Herceg (DTU), M.-H. Rio, S. Mulet, G. Larnicol (CLS), J. Johannessen, L.Bertino (NERSC), H. Snaith, P. Challenor (NOC), K. Haines, D. Bretherton (NCEO), C. Hughes (POL), R.J. Bingham (NU), G. Balmino, S. Niemeijer, I. Price, L. Cornejo (S&T), M. Diament, I Panet (IPGP), C.C. Tscherning (KU), D. Stammer, F. Siegismund (UH), T. Gruber (TUM),

  5. Wave data processing toolbox manual

    Science.gov (United States)

    Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul

    2006-01-01

Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP(tm)), the Sontek Argonaut, and the Nortek Aquadopp(tm) Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive, and further analysis of the data becomes cumbersome. More importantly, these files alone do not include sufficient information pertinent to the deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox congregates the processed files output from the proprietary software into two NetCDF files: one containing the statistics of the burst data and the other the raw burst data (additional details are described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format are easy to disseminate, portable to any computer platform, and viewable with freely available public-domain software. Another important advantage is that a metadata
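    As a hedged illustration of the kind of per-burst statistic such a toolbox derives before writing it to NetCDF (the function and the 4-sigma significant-wave-height estimate are a generic sketch, not this toolbox's code):

```python
import math

def burst_stats(eta):
    """Summary statistics for one burst of sea-surface elevation samples.
    Hs = 4 * sigma is the standard spectral estimate of significant
    wave height."""
    n = len(eta)
    mean = sum(eta) / n
    var = sum((e - mean) ** 2 for e in eta) / n
    return {"mean": mean, "hs": 4.0 * math.sqrt(var)}

# One synthetic burst: a unit-amplitude wave sampled over full periods
eta = [math.sin(2 * math.pi * k / 64) for k in range(256)]
print(burst_stats(eta))
```

    A real implementation would then write such per-burst records, together with the deployment metadata the report emphasises, into the statistics NetCDF file.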

  6. The Linear Time Frequency Analysis Toolbox

    DEFF Research Database (Denmark)

    Søndergaard, Peter Lempel; Torrésani, Bruno; Balazs, Peter

    2011-01-01

The Linear Time Frequency Analysis Toolbox is a Matlab/Octave toolbox for computational time-frequency analysis. It is intended both as an educational and a computational tool. The toolbox provides the basic Gabor, Wilson and MDCT transforms along with routines for constructing windows (filter prototypes) and routines for manipulating coefficients. It also provides a number of demo scripts devoted either to demonstrating the main functions of the toolbox or to exemplifying their use in specific signal processing applications. In this paper we describe the used algorithms, their mathematical background...
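    As a bare-bones illustration of the discrete Gabor/short-time Fourier transform that such a toolbox computes (our sketch; LTFAT's routines are far more general and efficient):

```python
import cmath

def stft(signal, win, hop):
    """Windowed DFT frames: the simplest discrete Gabor transform.
    Returns one list of complex coefficients per analysis frame."""
    n = len(win)
    frames = []
    for start in range(0, len(signal) - n + 1, hop):
        seg = [signal[start + k] * win[k] for k in range(n)]
        frames.append([sum(seg[k] * cmath.exp(-2j * cmath.pi * f * k / n)
                           for k in range(n))
                       for f in range(n)])
    return frames

# A constant signal with a rectangular window puts all energy in bin 0
frames = stft([1.0] * 8, [1.0] * 8, hop=8)
print(abs(frames[0][0]), abs(frames[0][1]))
```

    The window here plays the role of the "filter prototype" the abstract mentions; manipulating the returned coefficients before resynthesis is the basis of Gabor-domain signal processing.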

  7. Evolutionary programming for neutron instrument optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Bentley, Phillip M. [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany)]. E-mail: phillip.bentley@hmi.de; Pappas, Catherine [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Habicht, Klaus [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Lelievre-Berna, Eddy [Institut Laue-Langevin, 6 rue Jules Horowitz, BP 156, 38042 Grenoble Cedex 9 (France)

    2006-11-15

Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development, and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters, and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN at the HMI. We discuss the potential of the GA, which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.
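    The canonical GA is easy to sketch. The toy below, our own hedged illustration rather than the authors' instrument-optimisation code, evolves bit strings with tournament selection, one-point crossover and bit-flip mutation:

```python
import random

def ga(fitness, n_bits, pop_size=30, gens=60, p_mut=0.02):
    """Canonical genetic algorithm over fixed-length bit strings."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]

    def tournament():
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = random.randrange(1, n_bits)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            nxt.append([b ^ (random.random() < p_mut) for b in child])
        pop = nxt
    return max(pop, key=fitness)

# Toy objective: maximise the number of 1-bits ("onemax")
best = ga(sum, 16)
print(sum(best))
```

    In the instrument setting, each bit string would encode spectrometer parameters and the fitness would come from a Monte-Carlo virtual-instrument run (Vitess, McSTAS), which is what makes the GA attractive: it needs only fitness values, not gradients.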

  8. Evolutionary programming for neutron instrument optimisation

    International Nuclear Information System (INIS)

    Bentley, Phillip M.; Pappas, Catherine; Habicht, Klaus; Lelievre-Berna, Eddy

    2006-01-01

Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development, and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters, and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN at the HMI. We discuss the potential of the GA, which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.

  9. Benchmarking in University Toolbox

    Directory of Open Access Journals (Sweden)

    Katarzyna Kuźmicz

    2015-06-01

Full Text Available In the face of global competition and the rising challenges that higher education institutions (HEIs) meet, it is imperative to increase the innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess an institution's competitive position and learn from the best in order to improve. The primary purpose of the paper is to present an in-depth analysis of benchmarking application in HEIs worldwide. The study involves indicating the premises of using benchmarking in HEIs. It also contains a detailed examination of the types, approaches and scope of benchmarking initiatives. This thorough insight into benchmarking applications enabled the development of a classification of benchmarking undertakings in HEIs. The paper includes a review of the most recent benchmarking projects, relating them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity). The presented examples were chosen in order to exemplify different approaches to benchmarking in a higher education setting. The study was performed on the basis of published reports from benchmarking projects, the scientific literature and the author's experience of active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.

  10. Structural Time Domain Identification Toolbox User's Guide

    DEFF Research Database (Denmark)

    Andersen, P.; Kirkegaard, Poul Henning; Brincker, Rune

This manual describes the Structural Time Domain Identification toolbox for use with MATLAB. This version of the toolbox has been developed using the PC-based MATLAB version 4.2c, but is compatible with prior versions of MATLAB and UNIX-based versions. The routines of the toolbox are the so...

  11. Sentinel-3 SAR Altimetry Toolbox

    Science.gov (United States)

    Benveniste, Jerome; Lucas, Bruno; DInardo, Salvatore

    2015-04-01

The prime objective of the SEOM (Scientific Exploitation of Operational Missions) element is to federate, support and expand the large international research community that the ERS, ENVISAT and the Envelope programmes have built up over the last 20 years for the future European operational Earth Observation missions, the Sentinels. Sentinel-3 builds directly on the proven heritage of ERS-2, Envisat and CryoSat-2, with a dual-frequency (Ku and C band) advanced Synthetic Aperture Radar Altimeter (SRAL) that provides measurements at a resolution of ~300 m in SAR mode along track. Sentinel-3 will provide exact measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The first of the two Sentinels is expected to be launched in early 2015. The current universal altimetry toolbox is BRAT (Basic Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data, but it does not have the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA will endeavour to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats; BratGUI is the front end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle.
BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth

  12. What's in Your Teaching Toolbox?

    Science.gov (United States)

    Mormer, Elaine

    2018-02-01

    Educators are faced with an array of tools available to enhance learning outcomes in the classroom and clinic. These tools range from those that are very simple to those that are sufficiently complex to require an investment in learning time. This article summarizes a collection of teaching tools, ordered by the time involved in learning proficient use. Simple tools described include specific online blogs providing support for faculty and student writing and a simple method to capture and download videos from YouTube for classroom use. More complex tools described include a Web-based application for custom-created animated videos and an interactive audience polling system. Readers are encouraged to reflect on the tools most appropriate for use in their own teaching toolbox by considering the requisite time to proficiency and suitability to address desired learner outcomes.

  13. NEMO: A Stellar Dynamics Toolbox

    Science.gov (United States)

    Barnes, Joshua; Hut, Piet; Teuben, Peter

    2010-10-01

    NEMO is an extendible Stellar Dynamics Toolbox, following an open-source software model. It has various programs to create, integrate, analyze and visualize N-body and SPH-like systems, following the pipe-and-filter architecture. In addition there are various tools to operate on images, tables and orbits, including FITS files to export/import to/from other astronomical data reduction packages. A large and growing fraction of NEMO has been contributed by a growing list of authors. The source code consists of a little over 4,000 files and a little under 1,000,000 lines of code and documentation, mostly C, with some C++ and Fortran. NEMO development started in 1986 in Princeton (USA) with Barnes, Hut and Teuben. See also ZENO (ascl:1102.027) for the version that Barnes maintains.

  14. Drinking Water Cyanotoxin Risk Communication Toolbox

    Science.gov (United States)

    The drinking water cyanotoxin risk communication toolbox is a ready-to-use, “one-stop-shop” to support public water systems, states, and local governments in developing, as they deem appropriate, their own risk communication materials.

  15. Channel Access Client Toolbox for Matlab

    International Nuclear Information System (INIS)

    2002-01-01

    This paper reports on the MATLAB Channel Access (MCA) Toolbox, a MATLAB [1] interface to the EPICS Channel Access (CA) client library. We are developing the toolbox for SPEAR3 accelerator controls, but it is of general use for accelerator and experimental physics applications programming. It is packaged as a MATLAB toolbox to allow easy development of complex CA client applications entirely in MATLAB. The benefits include: the ability to calculate and display parameters that use EPICS process variables as inputs, availability of MATLAB graphics tools for user interface design, and integration with the MATLAB-based accelerator modeling software - Accelerator Toolbox [2-4]. Another purpose of this paper is to propose a feasible path to a synergy between accelerator control systems and accelerator simulation codes, the idea known as the on-line accelerator model.

  16. Computer Based Optimisation Routines

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    In this paper the need for optimisation methods for the laser cutting process has been identified as three different situations. Demands on the optimisation methods for these situations are presented, and one method for each situation is suggested. The adaptation and implementation of the methods...

  17. Optimised Renormalisation Group Flows

    CERN Document Server

    Litim, Daniel F

    2001-01-01

    Exact renormalisation group (ERG) flows interpolate between a microscopic or classical theory and the corresponding macroscopic or quantum effective theory. For most problems of physical interest, the efficiency of the ERG is constrained due to unavoidable approximations. Approximate solutions of ERG flows depend spuriously on the regularisation scheme which is determined by a regulator function. This is similar to the spurious dependence on the ultraviolet regularisation known from perturbative QCD. Providing a good control over approximated ERG flows is at the root for reliable physical predictions. We explain why the convergence of approximate solutions towards the physical theory is optimised by appropriate choices of the regulator. We study specific optimised regulators for bosonic and fermionic fields and compare the optimised ERG flows with generic ones. This is done up to second order in the derivative expansion at both vanishing and non-vanishing temperature. An optimised flow for a ``proper-time ren...

  18. A conceptual toolbox for designing CSCW applications

    DEFF Research Database (Denmark)

    Bødker, Susanne; Christiansen, Ellen

    1995-01-01

    This paper presents a conceptual toolbox, developed to support the design of CSCW applications in a large Esprit project, EuroCODE. Here, several groups of designers work to investigate computer support for cooperative work in large use organizations, at the same time as they work to develop an open development platform for CSCW applications. The conceptual toolbox has been developed to support communication in and among these design groups, between designers and users, and in future use of the open development platform. Rejecting the idea that one may design from a framework describing CSCW, the toolbox aims to support design by doing and to help bridge between work with users, technical design, and insights gained from theoretical and empirical CSCW research.

  19. Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox

    Directory of Open Access Journals (Sweden)

    Andre Santos Ribeiro

    2015-07-01

    Aim. In recent years, connectivity studies using neuroimaging data have increased the understanding of the organization of large-scale structural and functional brain networks. However, data analysis is time consuming as rigorous procedures must be assured, from structuring data and pre-processing to modality-specific data procedures. Until now, no single toolbox was able to perform such investigations on truly multimodal image data from beginning to end, including the combination of different connectivity analyses. Thus, we have developed the Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox with the goal of diminishing time waste in data processing and allowing an innovative and comprehensive approach to brain connectivity. Materials and Methods. The MIBCA toolbox is a fully automated all-in-one connectivity toolbox that offers pre-processing, connectivity and graph-theoretical analyses of multimodal image data such as diffusion-weighted imaging, functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). It was developed in the MATLAB environment and pipelines well-known neuroimaging software such as FreeSurfer, SPM, FSL, and Diffusion Toolkit. It further implements routines for the construction of structural, functional and effective or combined connectivity matrices, as well as routines for the extraction and calculation of imaging and graph-theory metrics, the latter also using functions from the Brain Connectivity Toolbox. Finally, the toolbox performs group statistical analysis and enables data visualization in the form of matrices, 3D brain graphs and connectograms. In this paper the MIBCA toolbox is presented by illustrating its capabilities using multimodal image data from a group of 35 healthy subjects (19-73 years old) with volumetric T1-weighted, diffusion tensor imaging, and resting-state fMRI data, and 10 subjects with 18F-Altanserin PET data. Results. It was observed both a high inter...

  20. The thermal performance monitoring and optimisation system (TEMPO): lessons learnt

    International Nuclear Information System (INIS)

    Beere, W.H.Aa.

    2005-09-01

    The goal of condition monitoring, fault detection and diagnosis is to ensure the success of planned operations by recognizing anomalies in a plant. This is achieved by monitoring the condition of equipment and instrumentation, and by detection, identification, diagnosis and removal of faults. The method of using physical modelling for condition monitoring has been investigated at the Institutt for energiteknikk since 1998. The result of this work was the development of the TEMPO (ThErMal Performance monitoring and Optimisation) toolbox. In this toolbox, plant-wide models are built up from unit sub-models. These are then linked to measurements by using data reconciliation. This enables the comparison of calculated to measured values, as well as an indication of the significance of any deviation. It also allows the calculation of unmeasured variables and of an overall 'goodness of fit' indicator. Since its first release in 2000, the TEMPO toolbox has been used to model the turbine cycles of several NPPs. Installations include Forsmark 3 and Loviisa 2, with feasibility studies for Dukovany, Olkiluoto 2, Almaraz and Paks. The experience from creating and installing TEMPO at these plants has now been collated and is presented in this report. This experience is used to indicate which direction the further development of TEMPO should take. The experience of using TEMPO has shown that the data-reconciliation method can be applied to the turbine cycles of NPPs. Problems that have arisen have primarily been connected to the usability of the toolbox. This has prompted a shift in the development emphasis from the task of developing the method to that of developing its usability. A summary of improvement proposals is given in this paper. The reader is welcome to comment on these proposals or to suggest alternative improvements. (Author)
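    The data-reconciliation step at the heart of TEMPO can be sketched in miniature: redundant measurements are adjusted, weighted by their uncertainties, so that they satisfy a physical balance exactly, and the size of the correction indicates how significant the deviation is. The sketch below (Python/NumPy rather than the toolbox's own environment) uses invented flow values, uncertainties, and a single mass-balance constraint, far simpler than a plant-wide turbine-cycle model:

    ```python
    import numpy as np

    # Hypothetical redundant flow measurements (kg/s): one stream splitting in two.
    # Balance constraint: total - branch1 - branch2 = 0 (violated by the raw data).
    y = np.array([100.0, 60.5, 41.0])   # measured values (100 != 60.5 + 41.0)
    sigma = np.array([2.0, 1.0, 1.0])   # measurement standard deviations
    V = np.diag(sigma ** 2)             # measurement covariance matrix
    A = np.array([[1.0, -1.0, -1.0]])   # linear balance constraint A @ x = 0

    # Weighted least-squares reconciliation (closed form for linear constraints):
    # x_hat = y - V A' (A V A')^-1 A y.  Less certain measurements absorb more
    # of the correction.
    correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ y)
    x_hat = y - correction

    print(x_hat)        # reconciled flows: [101.0, 60.25, 40.75]
    print(A @ x_hat)    # ~0: the balance is now satisfied exactly
    ```

    The residual `A @ y` (here -1.5 kg/s) compared against its variance `A V A'` is exactly the kind of significance indicator the abstract describes.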

  1. A Total Factor Productivity Toolbox for MATLAB

    NARCIS (Netherlands)

    B.M. Balk (Bert); J. Barbero (Javier); J.L. Zofío (José)

    2018-01-01

    Total Factor Productivity Toolbox is a new package for MATLAB that includes functions to calculate the main Total Factor Productivity (TFP) indices and their decompositions, based on Shephard’s distance functions and using Data Envelopment Analysis (DEA) programming techniques. The

  2. The RTMM Toolbox for DMM Applications

    DEFF Research Database (Denmark)

    Sharp, Robin; Todirica, Edward Alexandru

    2002-01-01

    This paper describes an approach to implementing distributed multimedia applications based on the use of a software toolbox. The tools in the box allow the designer to specify which components are to be used, how they are logically connected and what properties the streams of data to be passed...

  3. Accelerator Modeling with MATLAB Accelerator Toolbox

    International Nuclear Information System (INIS)

    2002-01-01

    This paper introduces Accelerator Toolbox (AT)--a collection of tools to model storage rings and beam transport lines in the MATLAB environment. The objective is to illustrate the flexibility and efficiency of the AT-MATLAB framework. The paper discusses three examples of problems that are analyzed frequently in connection with ring-based synchrotron light sources

  4. Tolkku - a toolbox for decision support from condition monitoring data

    International Nuclear Information System (INIS)

    Saarela, Olli; Lehtonen, Mikko; Halme, Jari; Aikala, Antti; Raivio, Kimmo

    2012-01-01

    This paper describes a software toolbox (a software library) designed for condition monitoring and diagnosis of machines. This toolbox implements both new methods and prior art and is aimed at practical down-to-earth data analysis work. The target is to improve knowledge of the operation and behaviour of machines and processes throughout their entire life-cycles. The toolbox supports different phases of condition-based maintenance with tools that extract essential information and automate data processing. The paper discusses principles that have guided toolbox design and the implemented toolbox structure. Case examples are used to illustrate how condition monitoring applications can be built using the toolbox. In the first case study the toolbox is applied to fault detection of industrial centrifuges based on measured electrical current. The second case study outlines an application for centralized monitoring of a fleet of machines that supports organizational learning.

  5. Optimising Magnetostatic Assemblies

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Smith, Anders

    ... theorem. This theorem formulates an energy equivalence principle with several implications concerning the optimisation of objective functionals that are linear with respect to the magnetic field. Linear functionals represent different optimisation goals, e.g. maximising a certain component of the field ... approached employing a heuristic algorithm, which led to new design concepts. Some of the procedures developed for linear objective functionals have been extended to non-linear objectives by employing iterative techniques. Even though most of the optimality results discussed in this work have been derived ...

  6. Modified cuckoo search: A new gradient free optimisation algorithm

    International Nuclear Information System (INIS)

    Walton, S.; Hassan, O.; Morgan, K.; Brown, M.R.

    2011-01-01

    Highlights: → Modified cuckoo search (MCS) is a new gradient free optimisation algorithm. → MCS shows a high convergence rate, able to outperform other optimisers. → MCS is particularly strong at high dimension objective functions. → MCS performs well when applied to engineering problems. - Abstract: A new robust optimisation algorithm, which can be regarded as a modification of the recently developed cuckoo search, is presented. The modification involves the addition of information exchange between the top eggs, or the best solutions. Standard optimisation benchmarking functions are used to test the effects of these modifications and it is demonstrated that, in most cases, the modified cuckoo search performs as well as, or better than, the standard cuckoo search, a particle swarm optimiser, and a differential evolution strategy. In particular the modified cuckoo search shows a high convergence rate to the true global minimum even at high numbers of dimensions.
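    The flavour of the underlying algorithm can be conveyed with a stripped-down cuckoo-search-style optimiser: Lévy-flight perturbations generate candidate solutions, and a fraction of the worst nests is abandoned and resampled each generation. This is an illustrative sketch only (Python/NumPy); it does not implement the authors' modification (information exchange between the top eggs), and all parameter values are arbitrary:

    ```python
    import numpy as np
    from math import gamma, sin, pi

    def levy_step(dim, rng, beta=1.5):
        """Mantegna's algorithm for Levy-distributed step directions."""
        sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
                 (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
        u = rng.normal(0.0, sigma, dim)
        v = rng.normal(0.0, 1.0, dim)
        return u / np.abs(v) ** (1 / beta)

    def cuckoo_search(f, lo, hi, n_nests=15, pa=0.25, iters=300, seed=0):
        rng = np.random.default_rng(seed)
        dim = len(lo)
        nests = rng.uniform(lo, hi, (n_nests, dim))
        fit = np.array([f(x) for x in nests])
        for _ in range(iters):
            # Levy flight from a random nest, biased toward the current best.
            i = rng.integers(n_nests)
            step = 0.01 * levy_step(dim, rng) * (nests[i] - nests[fit.argmin()])
            cand = np.clip(nests[i] + step, lo, hi)
            j = rng.integers(n_nests)
            if f(cand) < fit[j]:
                nests[j], fit[j] = cand, f(cand)
            # Abandon a fraction pa of the worst nests and resample them.
            n_drop = int(pa * n_nests)
            worst = fit.argsort()[-n_drop:]
            nests[worst] = rng.uniform(lo, hi, (n_drop, dim))
            fit[worst] = [f(x) for x in nests[worst]]
        best = fit.argmin()
        return nests[best], fit[best]

    sphere = lambda x: float(np.sum(x ** 2))
    x_best, f_best = cuckoo_search(sphere, np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
    print(x_best, f_best)
    ```

    On the 2-D sphere benchmark the routine reliably drives the objective toward zero; the benchmarking functions mentioned in the abstract play exactly this role when comparing optimisers.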

  7. The ROC Toolbox: A toolbox for analyzing receiver-operating characteristics derived from confidence ratings.

    Science.gov (United States)

    Koen, Joshua D; Barrett, Frederick S; Harlow, Iain M; Yonelinas, Andrew P

    2017-08-01

    Signal-detection theory, and the analysis of receiver-operating characteristics (ROCs), has played a critical role in the development of theories of episodic memory and perception. The purpose of the current paper is to present the ROC Toolbox. This toolbox is a set of functions written in the Matlab programming language that can be used to fit various common signal-detection models to ROC data obtained from confidence-rating experiments. The goals for developing the ROC Toolbox were to create a tool (1) that is easy to use and easy for researchers to implement with their own data, (2) that can flexibly define models based on varying study parameters, such as the number of response options (e.g., confidence ratings) and experimental conditions, and (3) that provides optimization routines (e.g., maximum likelihood estimation) to obtain parameter estimates and numerous goodness-of-fit measures. The ROC Toolbox allows for various different confidence scales and currently includes the models commonly used in recognition memory and perception: (1) the unequal-variance signal detection (UVSD) model, (2) the dual-process signal detection (DPSD) model, and (3) the mixture signal detection (MSD) model. For each model fit to a given data set the ROC Toolbox plots summary information about the best-fitting model parameters and various goodness-of-fit measures. Here, we present an overview of the ROC Toolbox, illustrate how it can be used to input and analyse real data, and finish with a brief discussion of features that can be added to the toolbox.
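    The kind of data the toolbox ingests, and how an ROC is derived from it, can be sketched without the toolbox itself: confidence-rating frequencies are cumulated from the strictest criterion outward, each criterion contributing one (false-alarm rate, hit rate) point. The frequency counts below are hypothetical, and this Python/NumPy sketch stops at the empirical ROC and its area rather than fitting the UVSD/DPSD/MSD models:

    ```python
    import numpy as np

    # Hypothetical frequency tables from a 6-point confidence scale,
    # ordered from "sure old" to "sure new".
    old_counts = np.array([60, 30, 15, 10, 5, 5])    # responses to studied items
    new_counts = np.array([10, 10, 15, 20, 30, 40])  # responses to unstudied items

    # Cumulate from the strictest criterion outward: each criterion yields
    # one (false-alarm rate, hit rate) point on the ROC.
    hits = np.cumsum(old_counts) / old_counts.sum()
    fas = np.cumsum(new_counts) / new_counts.sum()

    # Prepend the origin so the curve starts at (0, 0).
    roc_x = np.concatenate(([0.0], fas))
    roc_y = np.concatenate(([0.0], hits))

    # Area under the empirical ROC (trapezoidal rule), a rough accuracy summary.
    auc = float(np.sum(np.diff(roc_x) * (roc_y[1:] + roc_y[:-1]) / 2))
    print(list(zip(roc_x.round(2), roc_y.round(2))), round(auc, 4))
    ```

    Model fitting, as the toolbox does by maximum likelihood, amounts to choosing model parameters whose predicted cumulative rates best reproduce these points.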

  8. The laboratory test utilization management toolbox.

    Science.gov (United States)

    Baird, Geoffrey

    2014-01-01

    Efficiently managing laboratory test utilization requires both ensuring adequate utilization of needed tests in some patients and discouraging superfluous tests in other patients. After the difficult clinical decision is made to define the patients that do and do not need a test, a wealth of interventions is available to the clinician and laboratorian to help guide appropriate utilization. These interventions are collectively referred to here as the utilization management toolbox. Experience has shown that some tools in the toolbox are weak and others are strong, and that tools are most effective when many are used simultaneously. While the outcomes of utilization management studies are not always as concrete as may be desired, the data available in the literature indicate that strong utilization management interventions are safe and effective measures to improve patient health and reduce waste in an era of increasing financial pressure.

  9. ESA's Multi-mission Sentinel-1 Toolbox

    Science.gov (United States)

    Veci, Luis; Lu, Jun; Foumelis, Michael; Engdahl, Marcus

    2017-04-01

    The Sentinel-1 Toolbox is a new open-source software for scientific learning, research and exploitation of the large archives of Sentinel and heritage missions. The Toolbox is based on the proven BEAM/NEST architecture, inheriting all current NEST functionality including multi-mission support for most civilian satellite SAR missions. The project is funded through ESA's Scientific Exploitation of Operational Missions (SEOM). The Sentinel-1 Toolbox will strive to serve the SEOM mandate by providing leading-edge software to the science and application users in support of ESA's operational SAR mission as well as by educating and growing a SAR user community. The Toolbox consists of a collection of processing tools, data product readers and writers, and a display and analysis application. A common architecture for all Sentinel Toolboxes, called the Sentinel Application Platform (SNAP), is being jointly developed by Brockmann Consult, Array Systems Computing and C-S. The SNAP architecture is ideal for Earth Observation processing and analysis due to the following technological innovations: extensibility, portability, a modular rich client platform, generic EO data abstraction, tiled memory management, and a graph processing framework. The project has developed new tools for working with Sentinel-1 data, in particular for working with the new interferometric TOPSAR mode. TOPSAR complex coregistration and a complete interferometric processing chain have been implemented for Sentinel-1 TOPSAR data. To accomplish this, a coregistration following the Spectral Diversity [4] method has been developed, as well as special azimuth handling in the coherence, interferogram and spectral filter operators. The Toolbox includes reading of L0, L1 and L2 products in SAFE format, calibration and de-noising, slice product assembling, TOPSAR deburst and sub-swath merging, terrain-flattening radiometric normalization, and visualization for L2 OCN products. The Toolbox also provides several new tools for

  10. CONTINEX: A Toolbox for Continuation in Experiments

    DEFF Research Database (Denmark)

    Schilder, Frank; Bureau, Emil; Santos, Ilmar

    2014-01-01

    CONTINEX is a MATLAB toolbox for bifurcation analysis based on the development platform COCO (computational continuation core). CONTINEX is specifically designed for coupling to experimental test specimens via DSPACE, but also provides interfaces to SIMULINK, ODE, and so-called equation-free models. The current version of the interface for experimental set-ups implements an algorithm for tuning control parameters, a robust noise-tolerant covering algorithm, and functions for monitoring (in)stability. In this talk we will report on experiments with an impact oscillator with magnetic actuators...

  11. Segmentation Toolbox for Tomographic Image Data

    DEFF Research Database (Denmark)

    Einarsdottir, Hildur

    Motivation: Image acquisition has vastly improved over the past years, introducing techniques such as X-ray computed tomography (CT). CT images provide the means to probe a sample non-invasively to investigate its inner structure. Given the wide usage of this technique and massive data amounts, techniques to automatically analyze such data become ever more important. Most segmentation methods for large datasets, such as CT images, deal with simple thresholding techniques, where intensity value cut-offs are predetermined and hard coded. For data where the intensity difference is not sufficient, and partial volume voxels occur frequently, thresholding methods do not suffice and more advanced methods are required. Contribution: To meet these requirements a toolbox has been developed, combining well-known methods within the image analysis field. The toolbox includes cluster-based methods...

  12. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizure. Therefore there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools to generate the models of TMS due to some difficulties in realistic modeling of the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head, with isotropic and anisotropic electrical conductivities in five different tissues of the head, and the coils in 3D. The generated TMS model is importable to FE software packages such as ANSYS for further and efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  13. Insights into the European Years’ Communication Toolboxes

    Directory of Open Access Journals (Sweden)

    Camelia-Mihaela Cmeciu

    2012-08-01

    Since 1983 the European syntagm “unity in diversity” has been implemented in the European Years’ communication campaigns. Dependent on subsidiarity and decentralization, European Years focus on a specific issue which constitutes the subject of a year-long awareness campaign. Beyond the involvement of Europe’s citizens, through their local, regional and national authorities, in the implementation of the European Years’ policies, there is a unity at the level of the visual communication of the EU through two important image-building elements: EY logos and communication toolboxes. The European Years’ communication toolboxes can be considered signs of inclusion, since every organization is expected to customize the templates in the official campaign design of the European Year. The analysis will focus on the image-building elements of three European Years (2010, 2011, 2012). Having social semiotics as the qualitative research method, and an analytical framework based on the distinction between design resources and representational resources, I will analyze the double layers of the high-intensity point of inclusion: (1) the European Years’ branding process; (2) the visual deontic modality within the visual guidelines of the EY communication toolbox.

  14. HYDRORECESSION: A toolbox for streamflow recession analysis

    Science.gov (United States)

    Arciniega, S.

    2015-12-01

    Streamflow recession curves are hydrological signatures that allow studying the relationship between groundwater storage and baseflow and/or low flows at the catchment scale. Recent studies have shown that streamflow recession analysis can be quite sensitive to the combination of different models, extraction techniques and parameter estimation methods. In order to better characterize streamflow recession curves, new methodologies combining multiple approaches have been recommended. The HYDRORECESSION toolbox, presented here, is a Matlab graphical user interface developed to analyse streamflow recession time series, with support for different tools for parameterizing linear and nonlinear storage-outflow relationships through four of the most useful recession models (Maillet, Boussinesq, Coutagne and Wittenberg). The toolbox includes four parameter-fitting techniques (linear regression, lower envelope, data binning and mean squared error) and three different methods to extract hydrograph recession segments (Vogel, Brutsaert and Aksoy). In addition, the toolbox has a module that separates the baseflow component from the observed hydrograph using the inverse reservoir algorithm. Potential applications provided by HYDRORECESSION include model parameter analysis, hydrological regionalization and classification, baseflow index estimates, and catchment-scale recharge and low-flow modelling, among others. HYDRORECESSION is freely available for non-commercial and academic purposes.
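    Of the four recession models, the Maillet (linear reservoir) case illustrates the parameter-fitting idea most simply: Q(t) = Q0 exp(-t/k) becomes linear in log space, so the storage constant k follows from an ordinary least-squares slope. The sketch below (Python/NumPy, not the toolbox's Matlab GUI) uses synthetic, noise-free data with invented values, so the fit recovers the parameters exactly:

    ```python
    import numpy as np

    # Synthetic recession limb following the Maillet (linear reservoir) model
    # Q(t) = Q0 * exp(-t / k); all values are illustrative, not from a real gauge.
    k_true, q0_true = 25.0, 12.0        # storage constant [days], initial flow [m3/s]
    t = np.arange(0.0, 60.0)            # days since the start of the recession
    q = q0_true * np.exp(-t / k_true)

    # Linearise: ln Q = ln Q0 - t/k, then fit by ordinary least squares.
    slope, intercept = np.polyfit(t, np.log(q), 1)
    k_est = -1.0 / slope
    q0_est = np.exp(intercept)
    print(k_est, q0_est)                # recovers 25.0 and 12.0
    ```

    On real hydrographs the harder part is the one the toolbox automates: extracting clean recession segments (Vogel, Brutsaert or Aksoy style) before any such fit is meaningful.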

  15. Optimisation of radiation protection

    International Nuclear Information System (INIS)

    1988-01-01

    Optimisation of radiation protection is one of the key elements in the current radiation protection philosophy. The present system of dose limitation was issued in 1977 by the International Commission on Radiological Protection (ICRP) and includes, in addition to the requirements of justification of practices and limitation of individual doses, the requirement that all exposures be kept as low as is reasonably achievable, taking social and economic factors into account. This last principle is usually referred to as optimisation of radiation protection, or the ALARA principle. The NEA Committee on Radiation Protection and Public Health (CRPPH) organised an ad hoc meeting, in liaison with the NEA committees on the safety of nuclear installations and radioactive waste management. Separate abstracts were prepared for individual papers presented at the meeting

  16. Object Oriented Toolbox for Modelling and Simulation of Dynamical Systems

    DEFF Research Database (Denmark)

    Poulsen, Mikael Zebbelin; Wagner, Falko Jens; Thomsen, Per Grove

    1998-01-01

    This paper presents the results of an ongoing project, dealing with design and implementation of a simulation toolbox based on object oriented modelling techniques. The paper describes an experimental implementation of parts of such a toolbox in C++, and discusses the experiences drawn from that process. Essential to the work is the focus on simulation of complex dynamical systems, from modelling the single components/subsystems to building complete systems.

  17. The ABRAVIBE toolbox for teaching vibration analysis and structural dynamics

    DEFF Research Database (Denmark)

    Brandt, A.

    2013-01-01

    A MATLAB toolbox (the ABRAVIBE toolbox) has been developed as an accompanying toolbox for the recent book "Noise and Vibration Analysis" by the author. This free, open software, published under the GNU Public License, can be used with GNU Octave, with a few functional limitations, if an entirely free software platform is wanted. The toolbox includes functionality for simulation of mechanical models as well as advanced analysis such as time series analysis, spectral analysis, frequency response and correlation function estimation, modal parameter extraction, and rotating machinery analysis (order tracking)...

  18. Advanced optimisation - coal fired power plant operations

    Energy Technology Data Exchange (ETDEWEB)

    Turney, D.M.; Mayes, I. [E.ON UK, Nottingham (United Kingdom)

    2005-03-01

    The purpose of this unit optimisation project is to develop an integrated approach to unit optimisation and to develop an overall optimiser that is able to resolve any conflicts between the individual optimisers. The individual optimisers considered during this project are: an on-line thermal efficiency package, the GNOCIS boiler optimiser, the GNOCIS steam-side optimiser, ESP optimisation, and an intelligent sootblowing system. 6 refs., 7 figs., 3 tabs.

  19. The Biopsychology-Toolbox: a free, open-source Matlab-toolbox for the control of behavioral experiments.

    Science.gov (United States)

    Rose, Jonas; Otto, Tobias; Dittrich, Lars

    2008-10-30

    The Biopsychology-Toolbox is a free, open-source Matlab toolbox for the control of behavioral experiments. The major aim of the project was to provide a set of basic tools that allow programming novices to control basic hardware used for behavioral experimentation without limiting the power and flexibility of the underlying programming language. The modular design of the toolbox allows porting of parts as well as entire paradigms between different types of hardware. In addition to the toolbox, this project offers a platform for the exchange of functions, hardware solutions and complete behavioral paradigms.

  20. Optimisation in radiotherapy II: Programmed and inversion optimisation algorithms

    International Nuclear Information System (INIS)

    Ebert, M.

    1997-01-01

    This is the second article in a three-part examination of optimisation in radiotherapy. The previous article established the bases of optimisation in radiotherapy and the formulation of the optimisation problem. This paper outlines several algorithms that have been used in radiotherapy to search for the best irradiation strategy within the full set of possible strategies. Two principal classes of algorithm are considered - those associated with mathematical programming, which employ specific search techniques, linear-programming-type searches or artificial intelligence, and those which seek to perform a numerical inversion of the optimisation problem, finishing with deterministic iterative inversion. (author)

  1. Optimisation of monochrome images

    International Nuclear Information System (INIS)

    Potter, R.

    1983-01-01

    Gamma cameras with modern imaging systems usually digitize the signals to allow storage and processing of the image in a computer. Although such computer systems are widely used for the extraction of quantitative uptake estimates and the analysis of time-variant data, the vast majority of nuclear medicine images are still interpreted on the basis of an observer's visual assessment of a photographic hardcopy image. The optimisation of hardcopy devices is therefore vital, and factors such as resolution, uniformity, noise, grey scales and display matrices are discussed. Once optimum display parameters have been determined, routine procedures for quality control need to be established; suitable procedures are discussed. (U.K.)

  2. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    Science.gov (United States)

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
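A hedged, toy illustration of the kind of quantitative model comparison the authors advocate (this is not their hierarchical model): a Bayes factor between a fixed "guessing" strategy that predicts choice probability 0.5 and a strategy with a free choice probability under a uniform prior, for hypothetical data of 31 out of 40 choices.

```python
import math

# Toy Bayesian comparison of two candidate "toolbox" strategies.
n, k = 40, 31                      # hypothetical data: 31 of 40 choices

comb = math.comb(n, k)
m_guess = comb * 0.5 ** n          # marginal likelihood of the guessing model
# A uniform prior over the free probability p integrates analytically:
# integral of C(n,k) p^k (1-p)^(n-k) dp = C(n,k) * k!(n-k)!/(n+1)! = 1/(n+1).
m_free = comb * math.factorial(k) * math.factorial(n - k) / math.factorial(n + 1)

bayes_factor = m_free / m_guess
print(f"Bayes factor (free vs. guessing): {bayes_factor:.1f}")
```

The full framework in the paper extends this idea to mixtures of strategies and to hierarchical (group-level) inference, which require numerical rather than analytical marginal likelihoods.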

  3. The Matlab Radial Basis Function Toolbox

    Directory of Open Access Journals (Sweden)

    Scott A. Sarra

    2017-03-01

Full Text Available Radial Basis Function (RBF) methods are important tools for scattered data interpolation and for the solution of Partial Differential Equations in complexly shaped domains. The most straightforward approach to evaluating the methods involves solving a linear system which is typically poorly conditioned. The Matlab Radial Basis Function toolbox features a regularization method for the ill-conditioned system, extended-precision floating-point arithmetic, and symmetry exploitation for the purpose of reducing the flop counts of the associated numerical linear algebra algorithms.
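A minimal sketch of the regularisation idea (illustrative only; these are not the toolbox's own routines, and the shape and regularisation parameters are made up): a small Tikhonov term mu*I is added to the ill-conditioned RBF system matrix before solving.

```python
import numpy as np

# Gaussian RBF interpolation with Tikhonov-style regularisation of the
# ill-conditioned system matrix (1-D toy problem).
def rbf_interpolate(x, y, x_eval, shape=5.0, mu=1e-12):
    """Gaussian RBF interpolant; mu damps the poorly conditioned system."""
    phi = lambda r: np.exp(-(shape * r) ** 2)
    B = phi(np.abs(x[:, None] - x[None, :]))           # system (Gram) matrix
    coeffs = np.linalg.solve(B + mu * np.eye(len(x)), y)
    return phi(np.abs(x_eval[:, None] - x[None, :])) @ coeffs

x = np.linspace(0.0, 1.0, 15)                 # data sites
y = np.sin(2 * np.pi * x)
x_eval = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(rbf_interpolate(x, y, x_eval) - np.sin(2 * np.pi * x_eval)))
print(f"max interpolation error: {err:.2e}")
```

The toolbox additionally offers extended-precision arithmetic and symmetry exploitation, which a plain double-precision sketch like this cannot reproduce.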

  4. Multiobjective optimisation of bogie suspension to boost speed on curves

    Science.gov (United States)

    Milad Mousavi-Bideleh, Seyed; Berbyuk, Viktor

    2016-01-01

To improve safety and maximum admissible speed in different operational scenarios, multiobjective optimisation of the bogie suspension components of a one-car railway vehicle model is considered. The vehicle model has 50 degrees of freedom and is developed in the multibody dynamics software SIMPACK. Track shift force, running stability, and risk of derailment are selected as safety objective functions. The improved maximum admissible speeds of the vehicle on curves are determined based on track plane accelerations of up to 1.5 m/s2. To reduce the number of design parameters for optimisation and improve computational efficiency, a global sensitivity analysis is carried out using the multiplicative dimensional reduction method (M-DRM). A multistep optimisation routine based on a genetic algorithm (GA) and MATLAB/SIMPACK co-simulation is executed at three levels. The conventional secondary and primary bogie suspension components are chosen as the design parameters in the first two steps, respectively. The last step focuses on semi-active suspension: the input electrical current to magnetorheological yaw dampers is optimised to guarantee an appropriate safety level. Semi-active controllers are also applied and their effects on bogie dynamics are explored. The safety Pareto-optimised results are compared with those associated with in-service values. The global sensitivity analysis and multistep approach significantly reduced the number of design parameters and improved the computational efficiency of the optimisation. Furthermore, using the optimised values of the design parameters makes it possible to run the vehicle up to 13% faster on curves while guaranteeing a satisfactory safety level. The results obtained can be used in Pareto optimisation and active bogie suspension design problems.
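The GA step of such a routine can be sketched as follows (a toy quadratic "safety" objective with made-up parameter bounds stands in for the paper's 50-DOF SIMPACK vehicle model; selection and crossover operators are generic choices, not the authors'):

```python
import random

random.seed(1)  # reproducible toy run

def toy_safety_objective(params):
    k, c = params                      # pretend stiffness and damping values
    return (k - 0.3) ** 2 + (c - 0.7) ** 2   # minimised at (0.3, 0.7)

def ga(objective, bounds, pop_size=30, generations=80, sigma=0.05):
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[: pop_size // 2]               # elitist truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]          # blend crossover
            child = [min(max(g + random.gauss(0, sigma), lo), hi)  # mutation
                     for g, (lo, hi) in zip(child, bounds)]
            children.append(child)
        pop = parents + children
    return min(pop, key=objective)

best = ga(toy_safety_objective, bounds=[(0.0, 1.0), (0.0, 1.0)])
print([round(g, 2) for g in best])    # approaches (0.3, 0.7)
```

In the paper each objective evaluation is a co-simulated multibody run, which is why reducing the parameter count via sensitivity analysis matters so much.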

  5. Rad Toolbox User's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Eckerman, Keith F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sjoreen, Andrea L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2013-05-01

The Radiological Toolbox software developed by Oak Ridge National Laboratory (ORNL) for the U.S. Nuclear Regulatory Commission (NRC) is designed to provide electronic access to the vast and varied data that underlie the field of radiation protection. These data represent physical, chemical, anatomical, physiological, and mathematical parameters detailed in the various handbooks that a health physicist might consult while in the office. The initial motivation for the software was to serve the needs of health physicists working away from the office and without access to their handbooks, e.g., NRC inspectors. The earlier releases of the software were widely used and accepted around the world, not only by practising health physicists but also within educational programs. This release updates the software to accommodate changes in Windows operating systems and, in some respects, in radiation protection. It has been tested on Windows 7 and 8 and on 32- and 64-bit machines. The nuclear decay data have been updated, and thermal neutron capture cross sections and cancer risk coefficients have been included. This document and the software's user's guide provide further details and documentation of the information captured within the Radiological Toolbox.

  6. MTpy: A Python toolbox for magnetotellurics

    Science.gov (United States)

    Krieger, Lars; Peacock, Jared R.

    2014-11-01

We present the software package MTpy that allows handling, processing, and imaging of magnetotelluric (MT) data sets. Written in Python, the code is open source, containing sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides the independent definition of classes and functions, MTpy provides wrappers and convenience scripts to call standard external data processing and modelling software. In its current state, modules and functions of MTpy work on raw and pre-processed MT data. However, rather than providing a static compilation of software, we prefer to introduce MTpy as a flexible software toolbox, whose contents can be combined and utilised according to the respective needs of the user. Just as the overall functionality of a mechanical toolbox can be extended by adding new tools, MTpy is a flexible framework, which will be dynamically extended in the future. Furthermore, it can help to unify and extend existing codes and algorithms within the (academic) MT community. In this paper, we introduce the structure and concept of MTpy. Additionally, we show some examples from an everyday work-flow of MT data processing: the generation of standard EDI data files from raw electric (E-) and magnetic flux density (B-) field time series as input, the conversion into MiniSEED data format, as well as the generation of a graphical data representation in the form of a Phase Tensor pseudosection.

  7. Optimisation of occupational exposure

    International Nuclear Information System (INIS)

    Webb, G.A.M.; Fleishman, A.B.

    1982-01-01

    The general concept of the optimisation of protection of the public is briefly described. Some ideas being developed for extending the cost benefit framework to include radiation workers with full implementation of the ALARA criterion are described. The role of cost benefit analysis in radiological protection and the valuation of health detriment including the derivation of monetary values and practical implications are discussed. Cost benefit analysis can lay out for inspection the doses, the associated health detriment costs and the costs of protection for alternative courses of action. However it is emphasised that the cost benefit process is an input to decisions on what is 'as low as reasonably achievable' and not a prescription for making them. (U.K.)

  8. Standardised approach to optimisation

    International Nuclear Information System (INIS)

    Warren-Forward, Helen M.; Beckhaus, Ronald

    2004-01-01

Optimisation of radiographic images is said to have been achieved if the patient has received an acceptable level of dose and the image is of diagnostic value. In the near future, it will probably be recommended that radiographers measure patient doses and compare them to reference levels. The aim of this paper is to describe a standardised approach to the optimisation of radiographic examinations in a diagnostic imaging department. A three-step approach is outlined, with specific examples for some common examinations (chest, abdomen, pelvis and lumbar spine series). Step One: patient doses are calculated. Step Two: doses are compared to existing reference levels, and the technique used is compared to image quality criteria. Step Three: appropriate action is taken if doses are above the reference level. Results: average entrance surface doses for two rooms were as follows: AP abdomen (6.3 mGy and 3.4 mGy); AP lumbar spine (6.4 mGy and 4.1 mGy); AP pelvis (4.8 mGy and 2.6 mGy); and PA chest (0.19 mGy and 0.20 mGy). Comparison with the Commission of the European Communities (CEC) recommended techniques identified large differences in the applied potential: the kVp values in this study were significantly lower (by up to 10 kVp) than the CEC recommendations. The results of this study indicate that there is a need to monitor the radiation doses received by patients undergoing diagnostic radiography examinations. Not only has the assessment allowed valuable comparison with International Diagnostic Reference Levels and Radiography Good Practice, it has also demonstrated large variations in the mean doses delivered by different rooms of the same radiology department. Following the simple three-step approach advocated in this paper should either provide evidence that departments are practising the ALARA principle or assist in making suitable changes to current practice. Copyright (2004) Australian Institute of Radiography
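The three-step check reduces to a simple comparison, sketched below. The measured mean entrance surface doses are the two-room values quoted in the abstract; the reference levels are hypothetical placeholders, not the CEC's actual diagnostic reference levels.

```python
# Step one: calculated patient doses (mGy) for two rooms, from the abstract.
measured = {
    "AP Abdomen":      (6.3, 3.4),
    "AP Lumbar Spine": (6.4, 4.1),
    "AP Pelvis":       (4.8, 2.6),
    "PA Chest":        (0.19, 0.20),
}
# Step two: compare against reference levels (hypothetical values, mGy).
reference = {"AP Abdomen": 10.0, "AP Lumbar Spine": 6.0,
             "AP Pelvis": 10.0, "PA Chest": 0.3}

# Step three: flag any dose above the reference level for action.
actions = []
for exam, doses in measured.items():
    for room, dose in zip((1, 2), doses):
        if dose > reference[exam]:
            actions.append(f"{exam}, room {room}: {dose} mGy exceeds DRL "
                           f"{reference[exam]} mGy - review technique")
print(actions)
```

With these placeholder levels only one room/examination pair is flagged, mirroring the paper's observation that the same examination can be delivered at very different doses in different rooms.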

  9. A DOE Computer Code Toolbox: Issues and Opportunities

    International Nuclear Information System (INIS)

    Vincent, A.M. III

    2001-01-01

The initial activities of a Department of Energy (DOE) Safety Analysis Software Group to establish a Safety Analysis Toolbox of computer models are discussed. The toolbox shall be a DOE Complex repository of verified and validated computer models that are configuration-controlled and made available for specific accident analysis applications. The toolbox concept was recommended by the Defense Nuclear Facilities Safety Board staff as a mechanism to partially address Software Quality Assurance issues. Toolbox candidate codes have been identified through review of a DOE survey of software practices and processes, and through consideration of earlier findings of the Accident Phenomenology and Consequence Evaluation program sponsored by the DOE National Nuclear Security Administration/Office of Defense Programs. Planning is described to collect these high-use codes, apply tailored SQA specific to the individual codes, and implement the software toolbox concept. While issues exist, such as resource allocation and the interface among code developers, code users, and toolbox maintainers, significant benefits can be achieved through a centralized toolbox and subsequent standardized applications.

  10. Presentation of the International Building Physics Toolbox for Simulink

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Sasic Kalagasidis, Angela; Nielsen, Toke Rammer

    2003-01-01

The international building physics toolbox (IBPT) is a software library specially constructed for HAM system analysis in building physics. The toolbox is constructed as a modular structure of the standard building elements using the graphical programming language Simulink. Two research groups have participated in this project. In order to enable the development of the toolbox, a common modelling platform was defined: a set of unique communication signals, a material database and a documentation protocol. The IBPT is open source and publicly available on the Internet. Any researcher and student can use...

  11. MMM: A toolbox for integrative structure modeling.

    Science.gov (United States)

    Jeschke, Gunnar

    2018-01-01

Structural characterization of proteins and their complexes may require integration of restraints from various experimental techniques. MMM (Multiscale Modeling of Macromolecules) is a Matlab-based open-source modeling toolbox for this purpose, with a particular emphasis on distance distribution restraints obtained from electron paramagnetic resonance experiments on spin-labelled proteins and nucleic acids, and their combination with atomistic structures of domains or whole protomers, small-angle scattering data, secondary structure information, homology information, and elastic network models. MMM not only integrates various types of restraints, but also various existing modeling tools, by providing a common graphical user interface to them. The types of restraints that can support such modeling and the available model types are illustrated by recent application examples. © 2017 The Protein Society.

  12. Expanding the UniFrac Toolbox.

    Directory of Open Access Journals (Sweden)

    Ruth G Wong

    Full Text Available The UniFrac distance metric is often used to separate groups in microbiome analysis, but requires a constant sequencing depth to work properly. Here we demonstrate that unweighted UniFrac is highly sensitive to rarefaction instance and to sequencing depth in uniform data sets with no clear structure or separation between groups. We show that this arises because of subcompositional effects. We introduce information UniFrac and ratio UniFrac, two new weightings that are not as sensitive to rarefaction and allow greater separation of outliers than classic unweighted and weighted UniFrac. With this expansion of the UniFrac toolbox, we hope to empower researchers to extract more varied information from their data.

  13. Topology optimised wavelength dependent splitters

    DEFF Research Database (Denmark)

    Hede, K. K.; Burgos Leon, J.; Frandsen, Lars Hagedorn

A photonic crystal wavelength dependent splitter has been constructed by utilising topology optimisation [1]. The splitter has been fabricated in a silicon-on-insulator material (Fig. 1). The topology optimised wavelength dependent splitter demonstrates promising 3D FDTD simulation results. This complex photonic crystal structure is very sensitive to small fabrication variations from the expected topology optimised design. A wavelength dependent splitter is an important basic building block for high-performance nanophotonic circuits. [1] J. S. Jensen and O. Sigmund, App. Phys. Lett. 84, 2022...

  14. Object-oriented Matlab adaptive optics toolbox

    Science.gov (United States)

    Conan, R.; Correia, C.

    2014-08-01

Object-Oriented Matlab Adaptive Optics (OOMAO) is a Matlab toolbox dedicated to Adaptive Optics (AO) systems. OOMAO is based on a small set of classes representing the source, atmosphere, telescope, wavefront sensor, Deformable Mirror (DM) and an imager of an AO system. This simple set of classes allows simulating Natural Guide Star (NGS) and Laser Guide Star (LGS) Single Conjugate AO (SCAO) and tomography AO systems on telescopes up to the size of the Extremely Large Telescopes (ELT). The discrete phase screens that make up the atmosphere model can be of infinite size, useful for modeling system performance on large time scales. OOMAO comes with its own parametric influence function model to emulate different types of DMs. The cone effect, altitude thickness and intensity profile of LGSs are also reproduced. Both modal and zonal modeling approaches are implemented. OOMAO also has an extensive library of theoretical expressions to evaluate the statistical properties of turbulent wavefronts. The main design characteristics of the OOMAO toolbox are object-oriented modularity, vectorized code and transparent parallel computing. OOMAO has been used to simulate and to design the Multi-Object AO prototype Raven at the Subaru telescope and the Laser Tomography AO system of the Giant Magellan Telescope. In this paper, a Laser Tomography AO system on an ELT is simulated with OOMAO. In the first part, we set up the class parameters and we link the instantiated objects to create the source optical path. Then we build the tomographic reconstructor and write the script for the pseudo-open-loop controller.

  15. The Toolbox for Local and Global Plagiarism Detection

    Science.gov (United States)

    Butakov, Sergey; Scherbinin, Vladislav

    2009-01-01

    Digital plagiarism is a problem for educators all over the world. There are many software tools on the market for uncovering digital plagiarism. Most of them can work only with text submissions. In this paper, we present a new architecture for a plagiarism detection tool that can work with many different kinds of digital submissions, from plain or…

  16. The dismantling of nuclear installations: The dismantling of nuclear installations at the CEA's Directorate for nuclear energy; The CEA's sanitation and dismantling works: example of one of the Marcoule UP1 program lots; Research and innovation in sanitation-dismantling; Global optimisation of the management of dismantling radioactive wastes

    International Nuclear Information System (INIS)

    Hauet, Jean-Pierre; Piketty, Laurence; Moitrier, Cyril; Blanchard, Samuel; Soulabaille, Yves; Georges, Christine; Dutzer, Michel; Legee, Frederic

    2016-01-01

This publication proposes a set of four articles which address issues related to the dismantling of nuclear installations in France, notably for the different actors involved, such as the CEA and ANDRA. The authors more particularly address the issue and the general strategy of dismantling within the CEA's Directorate for nuclear energy; discuss the example of one of the Marcoule UP1 program lots to highlight the sanitation and dismantling works performed by the CEA; present current research and innovation activities within the CEA regarding sanitation and dismantling; and discuss how to globally optimise the management of the radioactive wastes produced by dismantling activities.

  17. European Studies and Public Engagement: A Conceptual Toolbox

    Directory of Open Access Journals (Sweden)

    Andreas Müllerleile

    2014-11-01

Full Text Available This article examines public engagement strategies for academics working in the field of European Studies. Should academics engage with the public? What are the most effective outreach strategies? And what are the implications for universities and departments? The article argues that engaging with the public should be considered an integral part of the work of academics working on topics that relate to the European Union or European politics. The article has a theoretical and a practical dimension. The first part of the paper deals with the nature of public engagement, explaining why it is an important issue and how it differs from the mainstream understanding of public engagement. The practical part of the paper presents the idea of building an online presence through which academics can engage with the public debate both during periods of low issue salience and high issue salience. The final section includes a toolbox

  18. Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

Dragsted, Birgitte; Olsen, Flemming Ove

    1996-01-01

The problem of optimising the laser cutting process is outlined. Basic optimisation criteria and principles for adapting an optimisation method, the simplex method, are presented. The results of implementing a response function in the optimisation are discussed with respect to the quality as well...

  19. Turbulence optimisation in stellarator experiments

    Energy Technology Data Exchange (ETDEWEB)

    Proll, Josefine H.E. [Max-Planck/Princeton Center for Plasma Physics (Germany); Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstr. 1, 17491 Greifswald (Germany); Faber, Benjamin J. [HSX Plasma Laboratory, University of Wisconsin-Madison, Madison, WI 53706 (United States); Helander, Per; Xanthopoulos, Pavlos [Max-Planck/Princeton Center for Plasma Physics (Germany); Lazerson, Samuel A.; Mynick, Harry E. [Plasma Physics Laboratory, Princeton University, P.O. Box 451 Princeton, New Jersey 08543-0451 (United States)

    2015-05-01

Stellarators, the twisted siblings of the axisymmetric fusion experiments called tokamaks, have historically confined the heat of the plasma less effectively than tokamaks and were therefore considered to be less promising candidates for a fusion reactor. This has changed, however, with the advent of stellarators in which the laminar transport is reduced to levels below those of tokamaks by shaping the magnetic field accordingly. As in tokamaks, the turbulent transport remains as the now-dominant transport channel. Recent analytical theory suggests that the large configuration space of stellarators allows for an additional optimisation of the magnetic field to also reduce the turbulent transport. In this talk, the idea behind the turbulence optimisation is explained. We also present how an optimised equilibrium is obtained and how it might differ from the equilibrium field of an already existing device, and we compare experimental turbulence measurements in different configurations of the HSX stellarator in order to test the optimisation procedure.

  20. Optimisation of load control

    International Nuclear Information System (INIS)

    Koponen, P.

    1998-01-01

Electricity cannot be stored in large quantities. That is why electricity supply and consumption are always almost equal in large power supply systems. If this balance were disturbed beyond stability, the system or a part of it would collapse until a new stable equilibrium was reached. The balance between supply and consumption is mainly maintained by controlling power production, but the electricity consumption, or in other words the load, is also controlled. Controlling the load of the power supply system is important if easily controllable power production capacity is limited. A temporary shortage of capacity causes high peaks in the energy price in the electricity market. Load control either reduces electricity consumption during peak consumption and peak price, or moves electricity consumption to some other time. The project Optimisation of Load Control is a part of the EDISON research program for distribution automation. The following areas were studied: optimisation of space heating and ventilation when the electricity price is time-variable; a load control model in power purchase optimisation; optimisation of direct load control sequences; the interaction between load control optimisation and power purchase optimisation; the literature on load control and optimisation methods; field tests and response models of direct load control; and the effects of electricity market deregulation on load control. An overview of the main results is given in this chapter.
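The core idea of load control under time-varying prices can be sketched in a few lines: shift a deferrable block of consumption from the most expensive hour to the cheapest one. The hourly prices and loads below are illustrative numbers, not the EDISON project's data or models.

```python
# Toy load-shifting sketch: move deferrable consumption off the price peak.
prices = [30, 28, 25, 60, 95, 90, 55, 32]   # EUR/MWh per hour (illustrative)
load   = [4, 4, 4, 6, 8, 8, 6, 4]           # MWh per hour (illustrative)
deferrable = 4                               # MWh that may be moved in time

cost_before = sum(p * q for p, q in zip(prices, load))
peak = max(range(len(prices)), key=prices.__getitem__)     # priciest hour
trough = min(range(len(prices)), key=prices.__getitem__)   # cheapest hour

shifted = load[:]
shifted[peak] -= deferrable       # reduce consumption at the price peak
shifted[trough] += deferrable     # consume it at the cheap hour instead
cost_after = sum(p * q for p, q in zip(prices, shifted))
print(f"cost before: {cost_before}, after shifting: {cost_after}")  # 2630 -> 2350
```

Total energy is unchanged; only its timing moves, which is exactly the distinction between load reduction and load shifting made in the abstract. The project's actual optimisation additionally couples such shifts to power purchase optimisation and control sequences.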

  2. SPS batch spacing optimisation

    CERN Document Server

    Velotti, F M; Carlier, E; Goddard, B; Kain, V; Kotzian, G

    2017-01-01

Until 2015, the LHC filling schemes used the batch spacing as specified in the LHC design report. The maximum number of bunches injectable in the LHC directly depends on the batch spacing at injection in the SPS and hence on the MKP rise time. As part of the LHC Injectors Upgrade project for LHC heavy ions, a reduction of the batch spacing is needed. In this direction, studies to approach the MKP design rise time of 150 ns (2-98%) have been carried out. These measurements gave clear indications that such optimisation, and beyond, could be done also for higher injection momentum beams, where the additional slower MKP (MKP-L) is needed. After the successful results from the 2015 SPS batch spacing optimisation for the Pb-Pb run [1], the same concept was thought to be used also for proton beams. In fact, thanks to the SPS transverse feedback, it was already observed that a lower batch spacing than the design one (225 ns) could be achieved. For the 2016 p-Pb run, a batch spacing of 200 ns for the proton beam with 100 ns bunch spacing was reque...

  3. ARC Code TI: X-Plane Communications Toolbox (XPC)

    Data.gov (United States)

    National Aeronautics and Space Administration — The X-Plane Communications Toolbox (XPC) is an open source research tool used to interact with the commercial flight simulator software X-Plane. XPC allows users to...

  4. iamxt: Max-tree toolbox for image processing and analysis

    Directory of Open Access Journals (Sweden)

    Roberto Souza

    2017-01-01

Full Text Available The iamxt is an array-based max-tree toolbox implemented in Python using the NumPy library for array processing. It has state-of-the-art methods for building and processing the max-tree, and a large set of visualization tools that allow viewing the tree and the contents of its nodes. The array-based programming style and the max-tree representation used in the toolbox make it simple to use. The intended audience of this toolbox includes mathematical morphology students and researchers who want to develop research in the field, and image processing researchers who need a toolbox that is simple to use and easy to integrate into their applications.

  5. Managing Fieldwork Data with Toolbox and the Natural Language Toolkit

    Directory of Open Access Journals (Sweden)

    Stuart Robinson

    2007-06-01

    Full Text Available This paper shows how fieldwork data can be managed using the program Toolbox together with the Natural Language Toolkit (NLTK for the Python programming language. It provides background information about Toolbox and describes how it can be downloaded and installed. The basic functionality of the program for lexicons and texts is described, and its strengths and weaknesses are reviewed. Its underlying data format is briefly discussed, and Toolbox processing capabilities of NLTK are introduced, showing ways in which it can be used to extend the functionality of Toolbox. This is illustrated with a few simple scripts that demonstrate basic data management tasks relevant to language documentation, such as printing out the contents of a lexicon as HTML.
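The "lexicon as HTML" task mentioned above can be illustrated without NLTK, since Toolbox files use a simple backslash-coded (SFM) format. The two-entry lexicon and the field names \lx (lexeme), \ps (part of speech) and \ge (gloss) below follow common MDF conventions, but the data themselves are invented for illustration.

```python
# Minimal sketch: parse a tiny Toolbox-style (SFM) lexicon and render it
# as an HTML table, the kind of task NLTK's toolbox support streamlines.
sfm = """\\lx kaa
\\ps N
\\ge tree

\\lx kira
\\ps V
\\ge to fly"""

entries = []
for record in sfm.split("\n\n"):                   # blank line separates entries
    fields = dict(line.lstrip("\\").split(" ", 1)  # "\\lx kaa" -> ("lx", "kaa")
                  for line in record.splitlines())
    entries.append(fields)

rows = "".join(
    f"<tr><td>{e['lx']}</td><td>{e['ps']}</td><td>{e['ge']}</td></tr>"
    for e in entries
)
html = f"<table>{rows}</table>"
print(html)
```

Real Toolbox files have repeated and hierarchical fields, which is where a dedicated parser such as the one in NLTK earns its keep over a dictionary-per-record sketch like this.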

  6. III. NIH TOOLBOX COGNITION BATTERY (CB): MEASURING EPISODIC MEMORY

    OpenAIRE

    Bauer, Patricia J.; Dikmen, Sureyya S.; Heaton, Robert K.; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L.

    2013-01-01

    One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are adminis...

  7. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise, which is very common in industrial environments. The PFA Toolbox can be used to address those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available toolbox able to perform Interval and Possibilistic MFA estimations.
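For context, classical MFA combines steady-state stoichiometric balances (S @ v = 0) with a few measured fluxes and solves for the rest. The sketch below does this by least squares on a toy two-metabolite network (the network, measured values, and flux indices are invented; the possibilistic and interval machinery of the PFA Toolbox goes well beyond this):

```python
import numpy as np

# Toy network: v0 -> A -> v1 -> B -> v2, with A also drained by v3.
S = np.array([
    [1, -1,  0, -1],   # steady-state balance on metabolite A
    [0,  1, -1,  0],   # steady-state balance on metabolite B
])
v_measured = {0: 10.0, 3: 2.0}   # measured uptake and drain fluxes (toy)

# Stack the balances and the measurement equations, solve in one least squares.
rows = [S[0], S[1]]
rhs = [0.0, 0.0]
for i, val in v_measured.items():
    e = np.zeros(4)
    e[i] = 1.0
    rows.append(e)
    rhs.append(val)
v, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
print(np.round(v, 2))            # solution: v = [10, 8, 8, 2]
```

When measurements are scarce or imprecise, this point estimate becomes unreliable; the interval and possibilistic estimations the toolbox implements replace it with ranges and distributions consistent with the uncertain data.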

  8. Globalization

    Directory of Open Access Journals (Sweden)

    Tulio Rosembuj

    2006-12-01

    Full Text Available There is no single globalization, nor is it the result of an individual agent. We could start by saying that global action has different angles, and the subjects who perform it differ, as do its objectives. The global is an invisible invasion of materials and immediate effects.

  9. Globalization

    OpenAIRE

    Tulio Rosembuj

    2006-01-01

    There is no single globalization, nor is it the result of an individual agent. We could start by saying that global action has different angles, and the subjects who perform it differ, as do its objectives. The global is an invisible invasion of materials and immediate effects.

  10. Vaccine strategies: Optimising outcomes.

    Science.gov (United States)

    Hardt, Karin; Bonanni, Paolo; King, Susan; Santos, Jose Ignacio; El-Hodhod, Mostafa; Zimet, Gregory D; Preiss, Scott

    2016-12-20

    factors that encourage success, which often include strong support from government and healthcare organisations, as well as tailored, culturally-appropriate local approaches to optimise outcomes. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Cultural-based particle swarm for dynamic optimisation problems

    Science.gov (United States)

    Daneshyari, Moayed; Yen, Gary G.

    2012-07-01

    Many practical optimisation problems involve uncertainties; a significant number belong to the dynamic optimisation problem (DOP) category, in which the fitness function changes over time. In this study, we propose a cultural-based particle swarm optimisation (PSO) to solve DOPs. A cultural framework is adopted that incorporates the required information from the PSO into five sections of the belief space, namely situational, temporal, domain, normative and spatial knowledge. The stored information is used to detect changes in the environment, assists the response to change through diversity-based repulsion among particles and migration among swarms in the population space, and helps select the leading particles at three levels: personal, swarm and global. Comparison of the proposed heuristic over several difficult dynamic benchmark problems demonstrates better or equal performance with respect to most other selected state-of-the-art dynamic PSO heuristics.
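
    The core loop of a dynamic PSO (detect the change by re-evaluating stored memories, then re-inject diversity) can be sketched in simplified form. This is an illustrative stand-in, not the authors' cultural-algorithm framework: the moving-peak objective, the re-randomisation rule, and all parameter values are hypothetical.

```python
import random

random.seed(1)

def make_objective(optimum):
    # Hypothetical moving-peak fitness: minimise distance to a shifting optimum.
    return lambda x: (x - optimum) ** 2

def pso_dynamic(schedule, n=20, iters=30):
    """Stripped-down dynamic PSO: re-evaluate the best memory at each phase to
    detect environment changes, then re-randomise half the swarm (a crude
    stand-in for diversity-based repulsion)."""
    pos = [random.uniform(-10, 10) for _ in range(n)]
    vel = [0.0] * n
    pbest = pos[:]
    f = make_objective(schedule[0])
    pf = [f(x) for x in pos]
    for phase, opt in enumerate(schedule):
        new_f = make_objective(opt)
        g = min(range(n), key=lambda i: pf[i])
        if phase > 0 and new_f(pbest[g]) > pf[g] + 1e-12:   # change detected
            for i in random.sample(range(n), n // 2):       # diversity re-injection
                pos[i] = random.uniform(-10, 10)
            pf = [new_f(x) for x in pbest]                  # refresh memories
        f = new_f
        for _ in range(iters):
            g = min(range(n), key=lambda i: pf[i])
            for i in range(n):
                r1, r2 = random.random(), random.random()
                vel[i] = 0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i]) \
                         + 1.5 * r2 * (pbest[g] - pos[i])
                pos[i] += vel[i]
                if f(pos[i]) < pf[i]:
                    pbest[i], pf[i] = pos[i], f(pos[i])
    g = min(range(n), key=lambda i: pf[i])
    return pbest[g]

best = pso_dynamic([3.0, -5.0])   # optimum moves from x = 3 to x = -5
print(best)
```

    Without the change-detection step the swarm's memories would keep attracting it to the stale optimum at x = 3.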

  12. Globalization

    OpenAIRE

    Andruşcă Maria Carmen

    2013-01-01

    Globalization has highlighted an interdependence among nations, shaped by their daily interaction, the promotion of peace, and efforts to streamline and improve the effectiveness of the global economy. For globalization to function, the developing countries, which can be helped by the developed ones, must be involved. The international community can contribute to the institution of the development environment of the gl...

  13. Design of a Toolbox of RNA Thermometers.

    Science.gov (United States)

    Sen, Shaunak; Apurva, Divyansh; Satija, Rohit; Siegal, Dan; Murray, Richard M

    2017-08-18

    Biomolecular temperature sensors can be used for efficient control of large-volume bioreactors, for spatiotemporal imaging and control of gene expression, and to engineer robustness to temperature in biomolecular circuit design. Although RNA-based sensors, called "thermometers", have been investigated in both natural and synthetic contexts, an important challenge is to design diverse responses to temperature differing in sensitivity and threshold. We address this issue by constructing a library of RNA thermometers based on thermodynamic computations and experimentally measuring their activities in cell-free biomolecular "breadboards". Using free energies of the minimum free energy structures as well as melt profile computations, we estimated that a diverse set of temperature responses were possible. We experimentally found a wide range of responses to temperature in the range 29-37 °C with fold-changes varying over 3-fold around the starting thermometer. The sensitivities of these responses ranged over 10-fold around the starting thermometer. We correlated these measurements with computational expectations, finding that although there was no strong correlation for the individual thermometers, overall trends of diversity, fold-changes, and sensitivities were similar. These results present a toolbox of RNA-based circuit elements with diverse temperature responses.

  14. Robotic inspection technology - process and toolbox

    Energy Technology Data Exchange (ETDEWEB)

    Hermes, Markus [ROSEN Group (United States). R and D Dept.]

    2005-07-01

    Pipeline deterioration grows progressively as pipeline systems (on-plot and cross country) age. This includes both very localized corrosion and an increasing failure probability due to fatigue cracking. Limiting regular inspection activities to the 'scrapable' part of the pipelines only will ultimately result in a pipeline system with questionable integrity. The confidence level in the integrity of these systems will drop below acceptance levels. Inspection of presently un-inspectable sections of the pipeline system becomes a must. This paper provides information on ROSEN's progress on the 'robotic inspection technology' project. The robotic inspection concept developed by ROSEN is based on a modular toolbox principle. This is mandatory. A universal 'all purpose' robot would not be reliable and efficient in resolving the postulated inspection task. A preparatory Quality Function Deployment (QFD) analysis is performed prior to the decision about the adequate robotic solution. This enhances the serviceability and efficiency of the provided technology. The word 'robotic' can be understood in its full meaning of Recognition - Strategy - Motion - Control. Cooperation of different individual systems with established communication, e.g. utilizing Bluetooth technology, supports the robustness of the ROSEN robotic inspection approach. Besides the navigation strategy, the inspection strategy is also part of the QFD process. Multiple inspection technologies, combined on a single carrier or distributed across interacting containers, must be selected with a clear vision of the particular goal. (author)

  15. Multi-Optimisation Consensus Clustering

    Science.gov (United States)

    Li, Jian; Swift, Stephen; Liu, Xiaohui

    Ensemble Clustering has been developed to provide an alternative way of obtaining more stable and accurate clustering results. It aims to avoid the biases of individual clustering algorithms. However, it is still a challenge to develop an efficient and robust method for Ensemble Clustering. Based on an existing ensemble clustering method, Consensus Clustering (CC), this paper introduces an advanced Consensus Clustering algorithm called Multi-Optimisation Consensus Clustering (MOCC), which utilises an optimised Agreement Separation criterion and a Multi-Optimisation framework to improve the performance of CC. Fifteen different data sets are used for evaluating the performance of MOCC. The results reveal that MOCC can generate more accurate clustering results than the original CC algorithm.
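
    The general idea behind such consensus methods can be conveyed with a minimal sketch: build a co-association matrix from several base clusterings, then extract a consensus partition from it. The simple agreement threshold below is an illustrative stand-in, not MOCC's optimised Agreement Separation criterion or its Multi-Optimisation framework.

```python
# Minimal co-association (consensus) clustering sketch; illustrative only.

def co_association(partitions, n):
    """Fraction of base clusterings placing each pair in the same cluster."""
    m = [[0.0] * n for _ in range(n)]
    for labels in partitions:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    m[i][j] += 1.0 / len(partitions)
    return m

def consensus_clusters(partitions, n, threshold=0.5):
    """Link pairs whose agreement exceeds the threshold, then take
    connected components (via union-find) as the consensus partition."""
    m = co_association(partitions, n)
    labels = list(range(n))
    def find(i):
        while labels[i] != i:
            labels[i] = labels[labels[i]]
            i = labels[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if m[i][j] > threshold:
                labels[find(i)] = find(j)
    return [find(i) for i in range(n)]

# Three noisy base clusterings of 6 items (true groups: 0-2 and 3-5).
base = [
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 1],   # item 2 mis-assigned in this run
    [0, 0, 0, 1, 1, 1],
]
print(consensus_clusters(base, 6))
```

    The consensus recovers the two underlying groups even though one base clustering mis-assigns item 2, which is the bias-averaging effect the abstract describes.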

  16. Isogeometric Analysis and Shape Optimisation

    DEFF Research Database (Denmark)

    Gravesen, Jens; Evgrafov, Anton; Gersborg, Allan Roulund

    of the whole domain. So in every optimisation cycle we need to extend a parametrisation of the boundary of a domain to the whole domain. It has to be fast in order not to slow the optimisation down but it also has to be robust and give a parametrisation of high quality. These are conflicting requirements so we...... will explain how the validity of a parametrisation can be checked and we will describe various ways to parametrise a domain. We will in particular study the Winslow functional which turns out to have some desirable properties. Other problems we touch upon is clustering of boundary control points (design...

  17. Open Babel: An open chemical toolbox

    Directory of Open Access Journals (Sweden)

    O'Boyle Noel M

    2011-10-01

    Full Text Available Abstract Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org.

  18. Open Babel: An open chemical toolbox

    Science.gov (United States)

    2011-01-01

    Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300

  19. FAST: FAST Analysis of Sequences Toolbox

    Directory of Open Access Journals (Sweden)

    Travis J. Lawrence

    2015-05-01

    Full Text Available FAST (FAST Analysis of Sequences Toolbox) provides simple, powerful open source command-line tools to filter, transform, annotate and analyze biological sequence data. Modeled after the GNU (GNU’s Not Unix) Textutils such as grep, cut, and tr, FAST tools such as fasgrep, fascut, and fastr make it easy to rapidly prototype expressive bioinformatic workflows in a compact and generic command vocabulary. Compact combinatorial encoding of data workflows with FAST commands can simplify the documentation and reproducibility of bioinformatic protocols, supporting better transparency in biological data science. Interface self-consistency and conformity with conventions of GNU, Matlab, Perl, BioPerl, R and GenBank help make FAST easy and rewarding to learn. FAST automates numerical, taxonomic, and text-based sorting, selection and transformation of sequence records and alignment sites based on content, index ranges, descriptive tags, annotated features, and in-line calculated analytics, including composition and codon usage. Automated content- and feature-based extraction of sites and support for molecular population genetic statistics makes FAST useful for molecular evolutionary analysis. FAST is portable, easy to install and secure thanks to the relative maturity of its Perl and BioPerl foundations, with stable releases posted to CPAN. Development as well as a publicly accessible Cookbook and Wiki are available on the FAST GitHub repository at https://github.com/tlawrence3/FAST. The default data exchange format in FAST is Multi-FastA (specifically, a restriction of BioPerl FastA format). Sanger and Illumina 1.8+ FastQ formatted files are also supported. FAST makes it easier for non-programmer biologists to interactively investigate and control biological data at the speed of thought.

  20. A constriction factor based particle swarm optimisation algorithm to solve the economic dispatch problem including losses

    Energy Technology Data Exchange (ETDEWEB)

    Young, Steven; Montakhab, Mohammad; Nouri, Hassan

    2011-07-15

    Economic dispatch (ED) is one of the most important problems to be solved in power generation, as fractional percentage fuel reductions represent significant cost savings. ED seeks to optimise the power generated by each generating unit in a system in order to find the minimum operating cost at a required load demand, whilst ensuring both equality and inequality constraints are met. For the process of optimisation, a model must be created for each generating unit. The particle swarm optimisation technique is an evolutionary computation technique and one of the most powerful methods for solving global optimisation problems. The aim of this paper is to add a constriction factor to the particle swarm optimisation algorithm (CFBPSO). Results show that the algorithm is very good at solving the ED problem; for CFBPSO to work in a practical environment, a valve-point effect with transmission losses should be included in future work.
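
    A constriction-factor PSO on a toy economic dispatch problem can be sketched as follows. The two-unit quadratic cost data and the penalty handling of the power balance are hypothetical, and, as in the paper's baseline, transmission losses and the valve-point effect are omitted. The constriction factor follows the standard Clerc formula with phi = c1 + c2 = 4.1.

```python
import math, random

random.seed(7)

# Hypothetical two-unit dispatch: quadratic fuel costs, 400 MW demand.
cost = lambda p: (0.004 * p[0]**2 + 5.0 * p[0]) + (0.006 * p[1]**2 + 4.0 * p[1])
demand = 400.0

def fitness(p):
    # Penalty enforces the power-balance equality constraint.
    return cost(p) + 1e4 * abs(p[0] + p[1] - demand)

# Constriction factor: chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|, phi = 4.1.
c1 = c2 = 2.05
phi = c1 + c2
chi = 2.0 / abs(2.0 - phi - math.sqrt(phi**2 - 4.0 * phi))

n, dims, iters = 30, 2, 200
lo, hi = 0.0, 400.0               # generator limits (inequality constraints)
pos = [[random.uniform(lo, hi) for _ in range(dims)] for _ in range(n)]
vel = [[0.0] * dims for _ in range(n)]
pbest = [p[:] for p in pos]
pf = [fitness(p) for p in pos]
for _ in range(iters):
    g = min(range(n), key=lambda i: pf[i])
    for i in range(n):
        for d in range(dims):
            r1, r2 = random.random(), random.random()
            vel[i][d] = chi * (vel[i][d]
                               + c1 * r1 * (pbest[i][d] - pos[i][d])
                               + c2 * r2 * (pbest[g][d] - pos[i][d]))
            pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
        if fitness(pos[i]) < pf[i]:
            pbest[i], pf[i] = pos[i][:], fitness(pos[i])
g = min(range(n), key=lambda i: pf[i])
print([round(x, 1) for x in pbest[g]], round(pf[g], 1))
```

    For this cost data the analytic optimum (equal incremental costs) is p1 = 190 MW, p2 = 210 MW at a cost of 2199, which the swarm should approach closely.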

  1. Simulation and optimisation modelling approach for operation of the Hoa Binh Reservoir, Vietnam

    DEFF Research Database (Denmark)

    Ngo, Long le; Madsen, Henrik; Rosbjerg, Dan

    2007-01-01

    Hoa Binh, the largest reservoir in Vietnam, plays an important role in flood control for the Red River delta and hydropower generation. Due to its multi-purpose character, conflicts and disputes in operating the reservoir have been ongoing since its construction, particularly in the flood season....... This paper proposes to optimise the control strategies for the Hoa Binh reservoir operation by applying a combination of simulation and optimisation models. The control strategies are set up in the MIKE 11 simulation model to guide the releases of the reservoir system according to the current storage level......, the hydro-meteorological conditions, and the time of the year. A heuristic global optimisation tool, the shuffled complex evolution (SCE) algorithm, is adopted for optimising the reservoir operation. The optimisation puts focus on the trade-off between flood control and hydropower generation for the Hoa...

  2. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing to standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) Improved graphical display of model results. 2) Improved error analysis and reporting. 3) Increase in the default maximum model mesh size from 301 to 501 nodes. 4) The ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  3. Cogeneration technologies, optimisation and implementation

    CERN Document Server

    Frangopoulos, Christos A

    2017-01-01

    Cogeneration refers to the use of a power station to deliver two or more useful forms of energy, for example, to generate electricity and heat at the same time. This book provides an integrated treatment of cogeneration, including a tour of the available technologies and their features, and how these systems can be analysed and optimised.

  4. For Time-Continuous Optimisation

    DEFF Research Database (Denmark)

    Heinrich, Mary Katherine; Ayres, Phil

    2016-01-01

    Strategies for optimisation in design normatively assume an artefact end-point, disallowing continuous architecture that engages living systems, dynamic behaviour, and complex systems. In our Flora Robotica investigations of symbiotic plant-robot bio-hybrids, we require computational tools...

  5. VQone MATLAB toolbox: A graphical experiment builder for image and video quality evaluations: VQone MATLAB toolbox.

    Science.gov (United States)

    Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka

    2016-03-01

    This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).

  6. SBEToolbox: A Matlab Toolbox for Biological Network Analysis.

    Science.gov (United States)

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.
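
    As a small illustration of the kinds of centrality and topology metrics such a toolbox computes, the sketch below implements degree centrality and the local clustering coefficient in plain Python on a hypothetical four-node network; it is not SBEToolbox's Matlab API.

```python
# Two common network metrics on a tiny hypothetical undirected graph,
# stored as an adjacency-set dictionary.

graph = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"B"},
}

def degree_centrality(g):
    """Degree divided by the maximum possible degree (n - 1)."""
    n = len(g)
    return {v: len(nb) / (n - 1) for v, nb in g.items()}

def clustering_coefficient(g, v):
    """Fraction of a node's neighbour pairs that are themselves connected."""
    nb = list(g[v])
    k = len(nb)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nb[j] in g[nb[i]])
    return 2.0 * links / (k * (k - 1))

print(degree_centrality(graph))
print({v: clustering_coefficient(graph, v) for v in graph})
```

    Node B is the hub (centrality 1.0) but has a low clustering coefficient, while the triangle A-B-C gives A and C a coefficient of 1.0 — the kind of module structure such metrics expose.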

  7. Globalization

    DEFF Research Database (Denmark)

    Plum, Maja

    Globalization is often referred to as external to education - a state of affair facing the modern curriculum with numerous challenges. In this paper it is examined as internal to curriculum; analysed as a problematization in a Foucaultian sense. That is, as a complex of attentions, worries, ways...... of reasoning, producing curricular variables. The analysis is made through an example of early childhood curriculum in Danish Pre-school, and the way the curricular variable of the pre-school child comes into being through globalization as a problematization, carried forth by the comparative practices of PISA...

  8. Globalization

    OpenAIRE

    F. Gerard Adams

    2008-01-01

    The rapid globalization of the world economy is causing fundamental changes in patterns of trade and finance. Some economists have argued that globalization has arrived and that the world is "flat". While the geographic scope of markets has increased, the author argues that new patterns of trade and finance are a result of the discrepancies between "old" countries and "new". As the differences are gradually wiped out, particularly if knowledge and technology spread worldwide, the t...

  9. NATbox: a network analysis toolbox in R.

    Science.gov (United States)

    Chavan, Shweta S; Bauer, Michael A; Scutari, Marco; Nagarajan, Radhakrishnan

    2009-10-08

    There has been recent interest in capturing the functional relationships (FRs) from high-throughput assays using suitable computational techniques. FRs elucidate the working of genes in concert as a system as opposed to independent entities hence may provide preliminary insights into biological pathways and signalling mechanisms. Bayesian structure learning (BSL) techniques and its extensions have been used successfully for modelling FRs from expression profiles. Such techniques are especially useful in discovering undocumented FRs, investigating non-canonical signalling mechanisms and cross-talk between pathways. The objective of the present study is to develop a graphical user interface (GUI), NATbox: Network Analysis Toolbox in the language R that houses a battery of BSL algorithms in conjunction with suitable statistical tools for modelling FRs in the form of acyclic networks from gene expression profiles and their subsequent analysis. NATbox is a menu-driven open-source GUI implemented in the R statistical language for modelling and analysis of FRs from gene expression profiles. It provides options to (i) impute missing observations in the given data (ii) model FRs and network structure from gene expression profiles using a battery of BSL algorithms and identify robust dependencies using a bootstrap procedure, (iii) present the FRs in the form of acyclic graphs for visualization and investigate its topological properties using network analysis metrics, (iv) retrieve FRs of interest from published literature. Subsequently, use these FRs as structural priors in BSL (v) enhance scalability of BSL across high-dimensional data by parallelizing the bootstrap routines. NATbox provides a menu-driven GUI for modelling and analysis of FRs from gene expression profiles. By incorporating readily available functions from existing R-packages, it minimizes redundancy and improves reproducibility, transparency and sustainability, characteristic of open-source environments
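
    The bootstrap procedure for identifying robust dependencies (point ii above) can be sketched simply: resample the observations, re-learn the dependency each time, and report the fraction of resamples in which it appears. The dependency test below is a plain correlation threshold, an illustrative stand-in for NATbox's Bayesian structure learning, and the expression profiles are hypothetical.

```python
import random

random.seed(3)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5 if vx and vy else 0.0

def edge_confidence(data, boot=200, r_min=0.7):
    """Bootstrap the samples, re-learn the edge each time (here a simple
    correlation threshold), and return how often the edge appears."""
    n = len(data)
    hits = 0
    for _ in range(boot):
        sample = [data[random.randrange(n)] for _ in range(n)]
        xs, ys = [s[0] for s in sample], [s[1] for s in sample]
        if abs(pearson(xs, ys)) > r_min:
            hits += 1
    return hits / boot

# Hypothetical expression profiles: gene2 tracks gene1 closely plus noise.
profiles = [(x, 2.0 * x + random.uniform(-0.5, 0.5)) for x in range(20)]
conf = edge_confidence(profiles)
print(conf)   # close to 1.0 for this strongly coupled pair
```

    Edges with high bootstrap confidence are the "robust dependencies" retained in the final network; weakly supported edges drop out across resamples.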

  10. Versatile Cas9-driven subpopulation selection toolbox for Lactococcus lactis

    NARCIS (Netherlands)

    Els, van der Simon; James, Jennelle K.; Kleerebezem, Michiel; Bron, Peter A.

    2018-01-01

    CRISPR-Cas9 technology has been exploited for the removal or replacement of genetic elements in a wide range of prokaryotes and eukaryotes. Here, we describe the extension of the Cas9 application toolbox to the industrially important dairy species Lactococcus lactis. The Cas9 expression vector

  11. 40 CFR 141.719 - Additional filtration toolbox components.

    Science.gov (United States)

    2010-07-01

    ... taken from a surface water or GWUDI source. A cap, such as GAC, on a single stage of filtration is not... separate stage of filtration if both filtration stages treat entire plant flow taken from a surface water... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Additional filtration toolbox...

  12. Geoplotlib: a Python Toolbox for Visualizing Geographical Data

    OpenAIRE

    Cuttone, Andrea; Lehmann, Sune; Larsen, Jakob Eg

    2016-01-01

    We introduce geoplotlib, an open-source python toolbox for visualizing geographical data. geoplotlib supports the development of hardware-accelerated interactive visualizations in pure python, and provides implementations of dot maps, kernel density estimation, spatial graphs, Voronoi tesselation, shapefiles and many more common spatial visualizations. We describe geoplotlib design, functionalities and use cases.

  13. Wyrm: A Brain-Computer Interface Toolbox in Python.

    Science.gov (United States)

    Venthur, Bastian; Dähne, Sven; Höhne, Johannes; Heller, Hendrik; Blankertz, Benjamin

    2015-10-01

    In recent years Python has gained more and more traction in the scientific community. Projects like NumPy, SciPy, and Matplotlib have created a strong foundation for scientific computing in Python and machine learning packages like scikit-learn or packages for data analysis like Pandas are building on top of it. In this paper we present Wyrm ( https://github.com/bbci/wyrm ), an open source BCI toolbox in Python. Wyrm is applicable to a broad range of neuroscientific problems. It can be used as a toolbox for analysis and visualization of neurophysiological data and in real-time settings, like an online BCI application. In order to prevent software defects, Wyrm makes extensive use of unit testing. We will explain the key aspects of Wyrm's software architecture and design decisions for its data structure, and demonstrate and validate the use of our toolbox by presenting our approach to the classification tasks of two different data sets from the BCI Competition III. Furthermore, we will give a brief analysis of the data sets using our toolbox, and demonstrate how we implemented an online experiment using Wyrm. With Wyrm we add the final piece to our ongoing effort to provide a complete, free and open source BCI system in Python.

  14. Testing adaptive toolbox models: a Bayesian hierarchical approach

    NARCIS (Netherlands)

    Scheibehenne, B.; Rieskamp, J.; Wagenmakers, E.-J.

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often

  15. Structural Time Domain Identification (STDI) Toolbox for Use with MATLAB

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.; Brincker, Rune

    1997-01-01

    The Structural Time Domain Identification (STDI) toolbox for use with MATLABTM is developed at Aalborg University, Denmark, based on the system identification research performed during recent years. By now, a reliable set of functions offers a wide spectrum of services for all the important steps...

  16. Structural Time Domain Identification (STDI) Toolbox for Use with MATLAB

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.; Brincker, Rune

    The Structural Time Domain Identification (STDI) toolbox for use with MATLABTM is developed at Aalborg University, Denmark, based on the system identification research performed during recent years. By now, a reliable set of functions offers a wide spectrum of services for all the important steps...

  17. DeltaProt: a software toolbox for comparative genomics

    Directory of Open Access Journals (Sweden)

    Willassen Nils P

    2010-11-01

    Full Text Available Abstract Background Statistical bioinformatics is the study of biological data sets obtained by new micro-technologies by means of proper statistical methods. For a better understanding of environmental adaptations of proteins, orthologous sequences from different habitats may be explored and compared. The main goal of the DeltaProt Toolbox is to provide users with important functionality that is needed for comparative screening and studies of extremophile proteins and protein classes. Visualization of the data sets is also the focus of this article, since visualizations can play a key role in making the various relationships transparent. This application paper is intended to inform the reader of the existence, functionality, and applicability of the toolbox. Results We present the DeltaProt Toolbox, a software toolbox that may be useful in importing, analyzing and visualizing data from multiple alignments of proteins. The toolbox has been written in MATLAB™ to provide an easy and user-friendly platform, including a graphical user interface, while ensuring good numerical performance. Problems in genome biology may be easily stated thanks to a compact input format. The toolbox also offers the possibility of utilizing structural information from the SABLE or other structure predictors. Different sequence plots can then be viewed and compared in order to find their similarities and differences. Detailed statistics are also calculated during the procedure. Conclusions The DeltaProt package is open source and freely available for academic, non-commercial use. The latest version of DeltaProt can be obtained from http://services.cbu.uib.no/software/deltaprot/. The website also contains documentation, and the toolbox comes with real data sets that are intended for training in applying the models to carry out bioinformatical and statistical analyses of protein sequences. Equipped with the new algorithms proposed here, DeltaProt serves as an auxiliary

  18. Optimising Comprehensibility in Interlingual Translation

    DEFF Research Database (Denmark)

    Nisbeth Jensen, Matilde

    2015-01-01

    The increasing demand for citizen engagement in areas traditionally belonging exclusively to experts, such as health, law and technology has given rise to the necessity of making expert knowledge available to the general public through genres such as instruction manuals for consumer goods, patien...... the functional text type of Patient Information Leaflet. Finally, the usefulness of applying the principles of Plain Language and intralingual translation for optimising comprehensibility in interlingual translation is discussed....

  19. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled, safety analysis codes, recognized for DOE-broad, safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed "toolbox-equivalent". The process is based on the model established to meet IP Commitment 4.2.1.2: Establish SQA criteria for the safety analysis "toolbox" codes. Implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plan/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  20. COMETS2: An advanced MATLAB toolbox for the numerical analysis of electric fields generated by transcranial direct current stimulation.

    Science.gov (United States)

    Lee, Chany; Jung, Young-Jin; Lee, Sang Jun; Im, Chang-Hwan

    2017-02-01

    Since there is no way to measure the electric current generated by transcranial direct current stimulation (tDCS) inside the human head through in vivo experiments, numerical analysis based on the finite element method has been widely used to estimate the electric field inside the head. In 2013, we released a MATLAB toolbox named COMETS, which has been used by a number of groups and has helped researchers gain insight into the electric field distribution during stimulation. The aim of this study was to develop an advanced MATLAB toolbox, named COMETS2, for the numerical analysis of the electric field generated by tDCS. COMETS2 can generate rectangular pad electrodes of any size at any position on the scalp surface. To reduce the large computational burden of repeatedly testing multiple electrode locations and sizes, a new technique to decompose the global stiffness matrix was proposed. As examples of potential applications, we observed the effects of electrode sizes and displacements on the results of the electric field analysis. The proposed mesh decomposition method significantly enhanced the overall computational efficiency. We implemented an automatic electrode modeler for the first time, and proposed a new technique to enhance the computational efficiency. In this paper, an efficient toolbox for tDCS analysis is introduced (freely available at http://www.cometstool.com). It is expected that COMETS2 will be a useful toolbox for researchers who want to benefit from the numerical analysis of electric fields generated by tDCS. Copyright © 2016. Published by Elsevier B.V.
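
    The abstract does not describe COMETS2's decomposition scheme in detail, but the underlying computational idea — factor the global stiffness matrix once, then reuse the factor for every new electrode configuration, since each configuration only changes the right-hand side — can be sketched generically. The miniature dense Cholesky solver below is an illustration of that idea under stated assumptions, not COMETS2 code:

```python
import math

def cholesky(A):
    """Dense Cholesky factorisation A = L L^T (A symmetric positive-definite)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

def solve_with_factor(L, b):
    """Solve A x = b given A = L L^T, by forward then back substitution."""
    n = len(L)
    y = [0.0] * n
    for i in range(n):                      # forward: L y = b
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):            # backward: L^T x = y
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

# Toy "stiffness" matrix, factored once...
K = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
L = cholesky(K)
# ...then reused for several load vectors (e.g. one per electrode montage),
# avoiding a full re-factorisation for every electrode position tested.
x1 = solve_with_factor(L, [1.0, 0.0, 0.0])
x2 = solve_with_factor(L, [0.0, 0.0, 1.0])
```

    In a real FEM pipeline the matrix is sparse and the factorisation correspondingly sparse, but the cost asymmetry — one expensive factorisation, many cheap solves — is the same.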

  1. TEM turbulence optimisation in stellarators

    Science.gov (United States)

    Proll, J. H. E.; Mynick, H. E.; Xanthopoulos, P.; Lazerson, S. A.; Faber, B. J.

    2016-01-01

    With the advent of neoclassically optimised stellarators, optimising stellarators for turbulent transport is an important next step. The reduction of ion-temperature-gradient-driven turbulence has been achieved via shaping of the magnetic field, and the reduction of trapped-electron mode (TEM) turbulence is addressed in the present paper. Recent analytical and numerical findings suggest TEMs are stabilised when a large fraction of trapped particles experiences favourable bounce-averaged curvature. This is the case for example in Wendelstein 7-X (Beidler et al 1990 Fusion Technol. 17 148) and other Helias-type stellarators. Using this knowledge, a proxy function was designed to estimate the TEM dynamics, allowing optimal configurations for TEM stability to be determined with the STELLOPT (Spong et al 2001 Nucl. Fusion 41 711) code without extensive turbulence simulations. A first proof-of-principle optimised equilibrium stemming from the TEM-dominated stellarator experiment HSX (Anderson et al 1995 Fusion Technol. 27 273) is presented for which a reduction of the linear growth rates is achieved over a broad range of the operational parameter space. As an important consequence of this property, the turbulent heat flux levels are reduced compared with the initial configuration.

  2. Integrated system dynamics toolbox for water resources planning.

    Energy Technology Data Exchange (ETDEWEB)

    Reno, Marissa Devan; Passell, Howard David; Malczynski, Leonard A.; Peplinski, William J.; Tidwell, Vincent Carroll; Coursey, Don (University of Chicago, Chicago, IL); Hanson, Jason (University of New Mexico, Albuquerque, NM); Grimsrud, Kristine (University of New Mexico, Albuquerque, NM); Thacher, Jennifer (University of New Mexico, Albuquerque, NM); Broadbent, Craig (University of New Mexico, Albuquerque, NM); Brookshire, David (University of New Mexico, Albuquerque, NM); Chemak, Janie (University of New Mexico, Albuquerque, NM); Cockerill, Kristan (Cockerill Consulting, Boone, NC); Aragon, Carlos (New Mexico University of Technology and Mining (NM-TECH), Socorro, NM); Hallett, Heather (New Mexico University of Technology and Mining (NM-TECH), Socorro, NM); Vivoni, Enrique (New Mexico University of Technology and Mining (NM-TECH), Socorro, NM); Roach, Jesse

    2006-12-01

    Public mediated resource planning is quickly becoming the norm rather than the exception. Unfortunately, supporting tools are lacking that interactively engage the public in the decision-making process and integrate over the myriad values that influence water policy. In the pages of this report we document the first steps toward developing a specialized decision framework to meet this need; specifically, a modular and generic resource-planning "toolbox". The technical challenge lies in the integration of the disparate systems of hydrology, ecology, climate, demographics, economics, policy and law, each of which influence the supply and demand for water. Specifically, these systems, their associated processes, and most importantly the constitutive relations that link them must be identified, abstracted, and quantified. For this reason, the toolbox forms a collection of process modules and constitutive relations that the analyst can "swap" in and out to model the physical and social systems unique to their problem. This toolbox with all of its modules is developed within the common computational platform of system dynamics linked to a Geographical Information System (GIS). Development of this resource-planning toolbox represents an important foundational element of the proposed interagency center for Computer Aided Dispute Resolution (CADRe). The Center's mission is to manage water conflict through the application of computer-aided collaborative decision-making methods. The Center will promote the use of decision-support technologies within collaborative stakeholder processes to help stakeholders find common ground and create mutually beneficial water management solutions. The Center will also serve to develop new methods and technologies to help federal, state and local water managers find innovative and balanced solutions to the nation's most vexing water problems. The toolbox is an important step toward

  3. Particle swarm optimisation classical and quantum perspectives

    CERN Document Server

    Sun, Jun; Wu, Xiao-Jun

    2016-01-01

    Introduction: Optimisation Problems and Optimisation Methods; Random Search Techniques; Metaheuristic Methods; Swarm Intelligence. Particle Swarm Optimisation: Overview; Motivations; PSO Algorithm: Basic Concepts and the Procedure; Paradigm: How to Use PSO to Solve Optimisation Problems; Some Harder Examples. Some Variants of Particle Swarm Optimisation: Why Does the PSO Algorithm Need to Be Improved?; Inertia and Constriction-Acceleration Techniques for PSO; Local Best Model; Probabilistic Algorithms; Other Variants of PSO. Quantum-Behaved Particle Swarm Optimisation: Overview; Motivation: From Classical Dynamics to Quantum Mechanics; Quantum Model: Fundamentals of QPSO; QPSO Algorithm; Some Essential Applications; Some Variants of QPSO; Summary. Advanced Topics: Behaviour Analysis of Individual Particles; Convergence Analysis of the Algorithm; Time Complexity and Rate of Convergence; Parameter Selection and Performance; Summary. Industrial Applications: Inverse Problems for Partial Differential Equations; Inverse Problems for Non-Linear Dynamical Systems; Optimal De...
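
    The basic global-best PSO procedure covered in the book's opening chapters fits in a few lines. The sketch below is a generic illustration, assuming conventional parameter values (inertia w, acceleration coefficients c1 and c2) and a sphere test function; none of it is taken from the book itself:

```python
import random

def pso(f, dim, n_particles=20, iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise f over a box using canonical global-best PSO."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda x: sum(xi * xi for xi in x)
best, best_val = pso(sphere, dim=3)
```

    The quantum-behaved variant (QPSO) the book develops replaces the velocity update with a sampling rule derived from a quantum potential-well model, but the surrounding bookkeeping stays the same.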

  4. Energy Savings from Optimised In-Field Route Planning for Agricultural Machinery

    Directory of Open Access Journals (Sweden)

    Efthymios Rodias

    2017-10-01

    Full Text Available Various types of sensor technologies, such as machine vision and the global positioning system (GPS), have been implemented in the navigation of agricultural vehicles. Automated navigation systems have demonstrated the potential for the execution of optimised route plans for field area coverage. This paper presents an assessment of the reduction in energy requirements derived from the implementation of optimised field area coverage planning. The assessment regards the analysis of the energy requirements and the comparison between the non-optimised and optimised plans for field area coverage in the whole sequence of operations required in two different cropping systems: Miscanthus and switchgrass production. An algorithmic approach for the simulation of the executed field operations following both non-optimised and optimised field-work patterns was developed. As a result, the corresponding time requirements were estimated as the basis of the subsequent energy cost analysis. Based on the results, the optimised routes reduce fuel energy consumption by up to 8%, embodied energy consumption by up to 7%, and total energy consumption by between 3% and 8%.

  5. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    Science.gov (United States)

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To promote such modeling, we previously developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors and analyzes its robustness. To enhance the usability of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.

  6. An Optimisation Approach for Room Acoustics Design

    DEFF Research Database (Denmark)

    Holm-Jørgensen, Kristian; Kirkegaard, Poul Henning; Andersen, Lars

    2005-01-01

    This paper discusses, on a conceptual level, the value of optimisation techniques in architectural room acoustics design from a practical point of view. A single objective room acoustics design criterion, estimated from the sound field inside the room, is chosen for optimisation. The sound field is modeled using the boundary element method, in which absorption is incorporated. An example is given where the geometry of a room is defined by four design modes. The room geometry is optimised to obtain a uniform sound pressure.

  7. Optimisation of technical specifications using probabilistic methods

    International Nuclear Information System (INIS)

    Ericsson, G.; Knochenhauer, M.; Hultqvist, G.

    1986-01-01

    During the last few years, the development of methods for modifying and optimising nuclear power plant Technical Specifications (TS) for plant operations has received increased attention. Probabilistic methods in general, and the plant and system models of probabilistic safety assessment (PSA) in particular, seem to provide the most powerful tools for optimisation. This paper first gives some general comments on optimisation, identifying important parameters, and then describes recent Swedish experience from the use of nuclear power plant PSA models and results for TS optimisation.

  8. Layout Optimisation of Wave Energy Converter Arrays

    DEFF Research Database (Denmark)

    Ruiz, Pau Mercadé; Nava, Vincenzo; Topper, Mathew B. R.

    2017-01-01

    This paper proposes an optimisation strategy for the layout design of wave energy converter (WEC) arrays. Optimal layouts are sought so as to maximise the absorbed power given a minimum q-factor, the minimum distance between WECs, and an area of deployment. To guarantee an efficient optimisation, a four-parameter layout description is proposed. Three different optimisation algorithms are further compared in terms of performance and computational cost: the covariance matrix adaptation evolution strategy (CMA), a genetic algorithm (GA) and the glowworm swarm optimisation (GSO) algorithm.

  9. Optimisation of the LHCb detector

    CERN Document Server

    Hierck, R H

    2003-01-01

    This thesis describes a comparison of the LHCb classic and LHCb light concepts from a tracking perspective. The comparison includes the detector occupancies, the various pattern recognition algorithms and the reconstruction performance. The final optimised LHCb setup is used to study the physics performance of LHCb for the Bs→DsK and Bs→Dsπ decay channels. This includes both the event selection and a study of the sensitivity to the Bs oscillation frequency, Δm_s, the Bs lifetime difference, ΔΓ_s, and the CP parameter γ − 2δγ.

  10. Optimisation combinatoire Theorie et algorithmes

    CERN Document Server

    Korte, Bernhard; Fonlupt, Jean

    2010-01-01

    This book is the French translation of the fourth and final edition of Combinatorial Optimization: Theory and Algorithms, written by two eminent specialists in the field: Bernhard Korte and Jens Vygen of the University of Bonn, Germany. It emphasises the theoretical aspects of combinatorial optimisation as well as efficient and exact problem-solving algorithms, and in this respect it stands apart from the simpler heuristic approaches often described elsewhere. The book contains many concise and elegant proofs of difficult results. Intended for students...

  11. III. NIH Toolbox Cognition Battery (CB): measuring episodic memory.

    Science.gov (United States)

    Bauer, Patricia J; Dikmen, Sureyya S; Heaton, Robert K; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L

    2013-08-01

    One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are administered to increase reliability. Pediatric data from the validation study revealed the TPSMT to be sensitive to age-related changes. The task also has high test-retest reliability and promising construct validity. Steps to further increase the sensitivity of the instrument to individual and age-related variability are described. © 2013 The Society for Research in Child Development, Inc.

  12. The panacea toolbox of a PhD biomedical student.

    Science.gov (United States)

    Skaik, Younis

    2014-01-01

    Doing a PhD (doctor of philosophy) for the sake of contributing to knowledge should give the student immense enthusiasm throughout the PhD period. It is the time in one's life that one spends to "hit the nail on the head" in a specific area and topic of interest. A PhD consists mostly of hard work and tenacity; luck and genius might also play a small role, but one can pass all PhD phases without either. The PhD student should have pre-PhD and PhD toolboxes, which are "sine quibus non" for successfully obtaining a PhD degree. In this manuscript, the toolboxes of the PhD student are discussed.

  13. Tadarida: A Toolbox for Animal Detection on Acoustic Recordings

    Directory of Open Access Journals (Sweden)

    Yves Bas

    2017-02-01

    Full Text Available Passive Acoustic Monitoring (PAM) has recently been extended to a very wide range of animals, but no available open software has been sufficiently generic to automatically treat several taxonomic groups. Here we present Tadarida, a software toolbox for detecting and labelling recorded sound events and for classifying any new acoustic data into known classes. It is made up of three modules handling Detection, Labelling and Classification, running on either Linux or Windows. This development resulted in the first open software (1) allowing generic sound event detection (multi-taxa), (2) providing graphical sound labelling at a single-instance level and (3) covering the whole process from sound detection to classification. This generic and modular design opens numerous reuse opportunities among (bio)acoustics researchers, especially those managing and/or developing PAM schemes. The whole toolbox is openly developed in C++ (Detection and Labelling) and R (Classification) and stored at https://github.com/YvesBas.

  14. User's manual for Ecolego Toolbox and the Discretization Block

    International Nuclear Information System (INIS)

    Broed, Robert; Shulan Xu

    2008-03-01

    The CLIMB modelling team (Catchment LInked Models of radiological effects in the Biosphere) was instituted in 2004 to provide SSI with an independent modelling capability for reviewing SKB's assessment of long-term safety for a geological repository. Modelling in CLIMB covers all aspects of performance assessment (PA), from near-field releases to radiological consequences in the surface environment. Software used to implement assessment models has been developed within the project. The software comprises a toolbox based on the commercial packages Matlab and Simulink, used to solve compartment-based differential equation systems, with an added user-friendly graphical interface. This report documents the new simulation toolbox and the newly developed Discretisation Block, a powerful tool for solving problems involving a network of compartments in two dimensions.

  15. System design through Matlab, control toolbox and Simulink

    CERN Document Server

    Singh, Krishna K

    2001-01-01

    MATLAB, a software package developed by MathWorks, Inc., is powerful, versatile and interactive software for scientific and technical computations, including simulations. Specialised toolboxes provided with several built-in functions are a special feature of MATLAB. This book, titled System Design through MATLAB, Control Toolbox and SIMULINK, aims at getting the reader started with computations and simulations in system engineering quickly and easily, and then proceeds to build concepts for advanced computations and simulations, including the control and compensation of systems. Simulation through SIMULINK has also been described, to give the reader a feel for real-world situations. This book is appropriate for undergraduate students undergoing the final semester of their project work, postgraduate students who have MATLAB integrated in their course or wish to take up a simulation problem in the area of system engineering for their dissertation work, and research scholars for whom MATLAB

  16. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing.

    Science.gov (United States)

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first comprises the Staircase family (method of limits, simple up-down and transformed up-down); the second is Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments, each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user-friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible, as it comes with several signal generators and can be easily extended for any experiment.
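
    Of the procedure families named above, the transformed up-down staircase is the easiest to sketch. The fragment below is a minimal 2-down/1-up rule, which converges on the 70.7%-correct point; the respond callback, starting level, step size and reversal count are illustrative stand-ins, not the toolbox's actual parameters or API:

```python
def two_down_one_up(respond, start=40.0, step=2.0, n_reversals=8):
    """2-down/1-up transformed staircase: the level drops after two
    consecutive correct responses and rises after any error.  The
    threshold is estimated as the mean of the last few reversal levels."""
    level, correct_run, direction = start, 0, 0
    reversals = []
    while len(reversals) < n_reversals:
        if respond(level):                      # one trial at this level
            correct_run += 1
            if correct_run == 2:                # two correct -> go down
                correct_run = 0
                if direction == +1:             # was going up: reversal
                    reversals.append(level)
                direction = -1
                level -= step
        else:                                   # any error -> go up
            correct_run = 0
            if direction == -1:                 # was going down: reversal
                reversals.append(level)
            direction = +1
            level += step
    half = len(reversals) // 2
    return sum(reversals[half:]) / (n_reversals - half)

# Deterministic stand-in listener: always correct at or above 20 dB.
estimate = two_down_one_up(lambda lvl: lvl >= 20.0)
```

    With a real (stochastic) listener the track oscillates around the 70.7% point of the psychometric function instead of between two fixed levels, but the update rule is identical.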

  17. Optimising resource management in neurorehabilitation.

    Science.gov (United States)

    Wood, Richard M; Griffiths, Jeff D; Williams, Janet E; Brouwers, Jakko

    2014-01-01

    To date, little research has been published regarding the effective and efficient management of resources (beds and staff) in neurorehabilitation, despite it being an expensive service in limited supply. This study demonstrates how mathematical modelling can be used to optimise service delivery, by way of a case study at a major 21-bed neurorehabilitation unit in the UK. An automated computer program for assigning weekly treatment sessions is developed. Queue modelling is used to construct a mathematical model of the hospital in terms of referral submissions to a waiting list, admission and treatment, and ultimately discharge. This is used to analyse the impact of hypothetical strategic decisions on a variety of performance measures and costs. The project culminates in a hybridised model of these two approaches, since a relationship is found between the number of therapy hours received each week (scheduling output) and length of stay (queuing model input). The introduction of the treatment scheduling program has substantially improved timetable quality (meaning a better and fairer service to patients) and has reduced employee time expended in its creation by approximately six hours each week (freeing up time for clinical work). The queuing model has been used to assess the effect of potential strategies, such as increasing the number of beds or employing more therapists. The use of mathematical modelling has not only optimised resources in the short term, but has allowed the optimality of longer-term strategic decisions to be assessed.
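
    The abstract gives no equations for the queuing model, but a minimal stand-in for this kind of bed-capacity analysis is the M/M/c (Erlang-C) queue, with beds as servers: referrals arrive at rate lam and each occupied bed discharges at rate mu. The figures below are illustrative, not the unit's actual rates:

```python
from math import factorial

def erlang_c(c, lam, mu):
    """Probability that an arriving patient must wait in an M/M/c queue
    (c beds, arrival rate lam, service rate mu per bed); requires lam < c*mu."""
    a = lam / mu                              # offered load in beds
    rho = a / c                               # occupancy, must be < 1
    below = sum(a**k / factorial(k) for k in range(c))
    top = a**c / (factorial(c) * (1.0 - rho))
    return top / (below + top)

def mean_wait(c, lam, mu):
    """Mean time on the waiting list (same time units as 1/mu)."""
    return erlang_c(c, lam, mu) / (c * mu - lam)

# Hypothetical figures: 21 beds, ~0.18 referrals/day, mean stay ~100 days
# (so mu = 0.01/day and the offered load is about 18 beds).
p_wait = erlang_c(21, 0.18, 0.01)
w = mean_wait(21, 0.18, 0.01)
```

    Raising the bed count c or the discharge rate mu lowers both the probability of waiting and the mean wait, which is exactly the kind of what-if question the study's queuing model is used to answer.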

  18. RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research.

    Science.gov (United States)

    Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H

    2014-02-07

    RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.

  19. fMRI Artefact Rejection and Sleep Scoring Toolbox

    Directory of Open Access Journals (Sweden)

    Yves Leclercq

    2011-01-01

    Full Text Available We started writing the “fMRI artefact rejection and sleep scoring toolbox”, or “FAST”, to process our sleep EEG-fMRI data, that is, the simultaneous recording of electroencephalographic and functional magnetic resonance imaging data acquired while a subject is asleep. FAST tackles three crucial issues typical of this kind of data: (1) data manipulation (viewing, comparing, chunking, etc.) of long continuous M/EEG recordings, (2) rejection of the fMRI-induced artefact in the EEG signal, and (3) manual sleep-scoring of the M/EEG recording. Currently, the toolbox can efficiently deal with these issues via a GUI, the SPM8 batching system or hand-written scripts. The tools developed are, of course, also useful for other EEG applications, for example, involving simultaneous EEG-fMRI acquisition, continuous EEG eye-balling, and manipulation. Even though the toolbox was originally devised for EEG data, it will also gracefully handle MEG data without any problem. “FAST” is developed in Matlab as an add-on toolbox for SPM8 and, therefore, internally uses its SPM8-meeg data format. “FAST” is available for free, under the GNU-GPL.

  20. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    Directory of Open Access Journals (Sweden)

    Margaret M. MacDonell

    2013-01-01

    Full Text Available The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities.

  1. Semi-automated Robust Quantification of Lesions (SRQL Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori Ito

    2017-02-01

    Full Text Available Quantifying lesions in a robust manner is fundamental for studying the effects of neuroanatomical changes in the post-stroke brain on recovery. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. We developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.267213) Toolbox that performs several analysis steps: (1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; (2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups; and (3) an option to perform analyses in both native and standard space to facilitate analyses in either space, or comparisons between spaces. Here, we describe the methods implemented in the toolbox and demonstrate the outputs of the SRQL toolbox.

  2. An Open-Source Toolbox for PEM Fuel Cell Simulation

    Directory of Open Access Journals (Sweden)

    Jean-Paul Kone

    2018-05-01

    Full Text Available In this paper, an open-source toolbox that can be used to accurately predict the distribution of the major physical quantities transported within a proton exchange membrane (PEM) fuel cell is presented. The toolbox has been developed using the Open Source Field Operation and Manipulation (OpenFOAM) platform, an open-source computational fluid dynamics (CFD) code. The base case results for the distribution of velocity, pressure, chemical species, Nernst potential, current density, and temperature are as expected. The plotted polarization curve was compared to results from a numerical model and experimental data taken from the literature. The conducted simulations have generated a significant amount of data and information about the transport processes involved in the operation of a PEM fuel cell. The key role played by the concentration constant in shaping the cell polarization curve has been explored. The development of the present toolbox is in line with the objectives outlined in the International Energy Agency (IEA, Paris, France) Advanced Fuel Cell Annex 37, which is devoted to developing open-source computational tools to facilitate fuel cell technologies. The work therefore serves as a basis for devising additional features that are not always feasible with a commercial code.
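
    The role of the concentration term in shaping the polarization curve can be sketched with a standard zero-dimensional loss model: cell voltage equals the open-circuit value minus activation (Tafel), ohmic and concentration losses, the last of which blows up near the limiting current density. All constants below are illustrative placeholders, not values from the toolbox:

```python
import math

def cell_voltage(i, E0=1.2, i0=1e-4, iL=1.4, b=0.05, R=0.25, c=0.1):
    """Zero-dimensional polarisation model for current density i [A/cm^2]:
    open-circuit voltage E0 minus activation (Tafel slope b, exchange
    current i0), ohmic (area resistance R) and concentration losses
    (limiting current iL, concentration constant c). Valid for 0 < i < iL."""
    act = b * math.log(i / i0)
    ohm = R * i
    conc = c * math.log(iL / (iL - i))
    return E0 - act - ohm - conc

# Voltage falls monotonically with current density, dropping sharply as
# i approaches iL, where the concentration term dominates the curve.
curve = [(i / 100.0, cell_voltage(i / 100.0)) for i in range(1, 140)]
```

    Increasing the concentration constant c steepens the final drop of the curve, which is the sensitivity the abstract refers to.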

  3. A Module for Graphical Display of Model Results with the CBP Toolbox

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-04-21

    This report describes work performed by the Savannah River National Laboratory (SRNL) in fiscal year 2014 to add enhanced graphical capabilities to display model results in the Cementitious Barriers Project (CBP) Toolbox. Because Version 2.0 of the CBP Toolbox has just been released, the graphing enhancements described in this report have not yet been integrated into a new version of the Toolbox. Instead they have been tested using a standalone GoldSim model and, while they are substantially complete, may undergo further refinement before full implementation. Nevertheless, this report is issued to document the FY14 development efforts which will provide a basis for further development of the CBP Toolbox.

  4. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Science.gov (United States)

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
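
    For the simplest persons-by-trials design, the G-theory dependability coefficient reduces to a one-line formula, which makes the trial-count effect estimated by the toolbox easy to see. The variance components below are invented numbers for illustration, not output of the ERA Toolbox:

```python
def dependability(var_person, var_residual, n_trials):
    """G-theory dependability for a persons x trials design: true (person)
    variance over the observed variance of an n-trial average score."""
    return var_person / (var_person + var_residual / n_trials)

# Retaining more trials in the ERP average raises score reliability.
phi_10 = dependability(4.0, 16.0, 10)   # 4 / (4 + 1.6) ~= 0.71
phi_40 = dependability(4.0, 16.0, 40)   # 4 / (4 + 0.4) ~= 0.91
```

    Real ERP designs add further facets (group, event type), each contributing its own variance component to the denominator; estimating those components from data is what the toolbox automates.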

  5. The BRAT and GUT Couple: Broadview Radar Altimetry and GOCE User Toolboxes

    Science.gov (United States)

    Benveniste, J.; Restano, M.; Ambrózio, A.

    2017-12-01

    The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including Sentinel-3A L1 and L2 products. A tutorial with plenty of use cases is included. BRAT's next release (4.2.0) is planned for October 2017. Based on community feedback, the front end has been further improved and simplified, whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain the desired data while bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving the combination of data fields, which can be saved for future use, either by using embedded formulas, including those from oceanographic altimetry, or by implementing ad hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data, in conjunction and consistently with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during the GRACE and GOCE missions. In the current 3.1 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements aiming at facilitating the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies have been introduced. Packaged with GUT is also GUT's Variance-Covariance Matrix tool (VCM). The BRAT and GUT toolboxes can be freely

  6. Dose optimisation in computed radiography

    International Nuclear Information System (INIS)

    Schreiner-Karoussou, A.

    2005-01-01

    After the installation of computed radiography (CR) systems in three hospitals in Luxembourg, a patient dose survey was carried out for three radiographic examinations: thorax, pelvis and lumbar spine. It was found that the patient doses had changed in comparison with the patient doses measured for conventional radiography in the same three hospitals. A close collaboration between the manufacturers of the X-ray installations, the manufacturers of the CR imaging systems and the medical physicists led to the discovery that the speed class with which each radiographic examination was to be performed had been ignored during installation of the digital imaging systems. A number of procedures were carried out in order to calibrate and program the X-ray installations in conjunction with the CR systems. Following this optimisation procedure, a new patient dose survey was carried out for the three radiographic examinations. It was found that patient doses in the three hospitals were reduced. (authors)

  7. Optimising costs in WLCG operations

    CERN Document Server

    Pradillo, Mar; Flix, Josep; Forti, Alessandra; Sciabà, Andrea

    2015-01-01

    The Worldwide LHC Computing Grid project (WLCG) provides the computing and storage resources required by the LHC collaborations to store, process and analyse the 50 Petabytes of data annually generated by the LHC. The WLCG operations are coordinated by a distributed team of managers and experts and performed by people at all participating sites and from all the experiments. Several improvements in the WLCG infrastructure have been implemented during the first long LHC shutdown to prepare for the increasing needs of the experiments during Run2 and beyond. However, constraints in funding will affect not only the computing resources but also the available effort for operations. This paper presents the results of a detailed investigation on the allocation of the effort in the different areas of WLCG operations, identifies the most important sources of inefficiency and proposes viable strategies for optimising the operational cost, taking into account the current trends in the evolution of the computing infrastruc...

  8. Evaluation Toolbox: Ex-Ante Impact Assessment and Value Network Analysis for SI

    NARCIS (Netherlands)

    Dhondt, S.; Ven, H. van de; Cressey, P.; Kaderabkova, A.; Luna, Á.; Moghadam Saman, S.; Castro Spila, J.; Ziauberyte, R.; Torre, W. van der; Terstriep, J.

    2016-01-01

    This report contains a toolbox for use with the Ex-Ante Impact Assessment for social innovations as was developed in the report D7.1. This toolbox proposes a series of convenient and useful tools to apply in an ex-ante assessment of social innovation within SIMPACT's policy areas unemployment,

  9. FIT3D toolbox: multiple view geometry and 3D reconstruction for Matlab

    NARCIS (Netherlands)

    Esteban, I.; Dijk, J.; Groen, F.

    2010-01-01

    FIT3D is a Toolbox built for Matlab that aims at unifying and distributing a set of tools that will allow the researcher to obtain a complete 3D model from a set of calibrated images. In this paper we motivate and present the structure of the toolbox in a tutorial and example based approach. Given

  10. Design and Implementation of a Space Environment Simulation Toolbox for Small Satellites

    DEFF Research Database (Denmark)

    Amini, Rouzbeh; Larsen, Jesper A.; Izadi-Zamanabadi, Roozbeh

    This paper presents a developed toolbox for space environment model in SIMULINK that facilitates development and design of Attitude Determination and Control Systems (ADCS) for a Low Earth Orbit (LEO) spacecraft. The toolbox includes, among others, models of orbit propagators, disturbances, Earth...

  11. Adding tools to the open source toolbox: The Internet

    Science.gov (United States)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial databases. Internet tools such as e-mail and file transfer protocol (FTP) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as FTP, Gopher, Wide Area Information Server, and the World Wide Web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  12. SPATIAL DATA MINING TOOLBOX FOR MAPPING SUITABILITY OF LANDFILL SITES USING NEURAL NETWORKS

    Directory of Open Access Journals (Sweden)

    S. K. M. Abujayyab

    2016-09-01

    Full Text Available Mapping the suitability of landfill sites is a complex, multidisciplinary task. The purpose of this research is to create an ArcGIS spatial data mining toolbox for mapping the suitability of landfill sites at a regional scale using neural networks. The toolbox is constructed from six sub-tools to prepare, train, and process data, and is straightforward to use. A multilayer perceptron (MLP) neural network structure with a backpropagation learning algorithm is used. The dataset is mined from the northern states of Malaysia. A total of 14 criteria are utilized to build the training dataset. The toolbox provides a platform for decision makers to implement neural networks for mapping the suitability of landfill sites in the ArcGIS environment. The result shows the ability of the toolbox to produce suitability maps for landfill sites.
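    The MLP scoring step can be sketched as follows. The weights here are random placeholders, not the toolbox's trained model (in practice they come from backpropagation over the 14 criteria), and the criteria values are assumed normalised to [0, 1]:

```python
import numpy as np

# Minimal sketch of an MLP suitability scorer. Weights are illustrative
# random values, not a trained model from the toolbox.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mlp_score(x, W1, b1, W2, b2):
    """One-hidden-layer perceptron: returns a suitability score in (0, 1)."""
    h = sigmoid(W1 @ x + b1)       # hidden-layer activations
    return sigmoid(W2 @ h + b2)    # output score

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # 3 criteria -> 4 hidden units
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # 4 hidden units -> 1 score
score = mlp_score(np.array([0.8, 0.2, 0.5]), W1, b1, W2, b2)
```

    Applied cell by cell over a raster of criteria, such scores form the suitability map.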

  13. Combining simulation and multi-objective optimisation for equipment quantity optimisation in container terminals

    OpenAIRE

    Lin, Zhougeng

    2013-01-01

    This thesis proposes a combination framework to integrate simulation and multi-objective optimisation (MOO) for container terminal equipment optimisation. It addresses how the strengths of simulation and multi-objective optimisation can be integrated to find high quality solutions for multiple objectives with low computational cost. Three structures for the combination framework are proposed respectively: pre-MOO structure, integrated MOO structure and post-MOO structure. The applications of ...

  14. Layout Optimisation of Wave Energy Converter Arrays

    Directory of Open Access Journals (Sweden)

    Pau Mercadé Ruiz

    2017-08-01

    Full Text Available This paper proposes an optimisation strategy for the layout design of wave energy converter (WEC arrays. Optimal layouts are sought so as to maximise the absorbed power given a minimum q-factor, the minimum distance between WECs, and an area of deployment. To guarantee an efficient optimisation, a four-parameter layout description is proposed. Three different optimisation algorithms are further compared in terms of performance and computational cost. These are the covariance matrix adaptation evolution strategy (CMA, a genetic algorithm (GA and the glowworm swarm optimisation (GSO algorithm. The results show slightly higher performances for the latter two algorithms; however, the first turns out to be significantly less computationally demanding.
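    A minimal genetic-algorithm loop of the kind compared in the paper can be sketched as below. The four-parameter layout and the objective are stand-ins, not the paper's actual parametrisation or hydrodynamic model; the minimum-distance constraint is handled with a simple penalty:

```python
import random

# Toy GA over a four-parameter layout description. The fitness function is a
# stand-in for absorbed power with a spacing penalty, purely for illustration.
def fitness(layout):
    spacing, angle, rows, offset = layout
    power = rows * spacing * (1.5 + angle)          # stand-in objective
    penalty = 100.0 if spacing < 50.0 else 0.0      # minimum-distance constraint
    return power - penalty

def ga(pop_size=20, generations=40, seed=1):
    rng = random.Random(seed)
    new = lambda: [rng.uniform(40, 200), rng.uniform(0, 1),
                   rng.uniform(1, 5), rng.uniform(0, 50)]
    pop = [new() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                # elitist selection
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # midpoint crossover
            i = rng.randrange(4)
            child[i] *= rng.uniform(0.9, 1.1)             # small mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = ga()
```

    Elitism makes the best fitness non-decreasing across generations, which is why longer runs can only match or improve on shorter ones.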

  15. Topology optimisation of natural convection problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Aage, Niels; Andreasen, Casper Schousboe

    2014-01-01

    This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equations...... coupled to the convection-diffusion equation through the Boussinesq approximation. In order to facilitate topology optimisation, the Brinkman approach is taken to penalise velocities inside the solid domain and the effective thermal conductivity is interpolated in order to accommodate differences...... in thermal conductivity of the solid and fluid phases. The governing equations are discretised using stabilised finite elements and topology optimisation is performed for two different problems using discrete adjoint sensitivity analysis. The study shows that topology optimisation is a viable approach...

  16. Topology Optimisation for Coupled Convection Problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    This thesis deals with topology optimisation for coupled convection problems. The aim is to extend and apply topology optimisation to steady-state conjugate heat transfer problems, where the heat conduction equation governs the heat transfer in a solid and is coupled to thermal transport...... in a surrounding uid, governed by a convection-diffusion equation, where the convective velocity field is found from solving the isothermal incompressible steady-state Navier-Stokes equations. Topology optimisation is also applied to steady-state natural convection problems. The modelling is done using stabilised...... finite elements, the formulation and implementation of which was done partly during a special course as prepatory work for this thesis. The formulation is extended with a Brinkman friction term in order to facilitate the topology optimisation of fluid flow and convective cooling problems. The derived...

  17. Benchmarks for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...

  18. Credit price optimisation within retail banking

    African Journals Online (AJOL)

    2014-02-14

    Feb 14, 2014 ... cost based pricing, where the price of a product or service is based on the .... function obtained from fitting a logistic regression model .... Note that the proposed optimisation approach below will allow us to also incorporate.
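    The snippet above hints at the core idea: a logistic regression gives the probability that a customer accepts a given rate, and the price is chosen to maximise expected profit. A toy sketch with invented coefficients (not the paper's fitted model):

```python
import math

# Illustrative sketch of risk-based price optimisation: take-up probability
# falls with the offered rate via a logistic curve (coefficients a, b are
# made up), and we search a rate grid for maximum expected profit.
def take_up(rate, a=8.0, b=40.0):
    """P(customer accepts) as a logistic function of the interest rate."""
    return 1.0 / (1.0 + math.exp(-(a - b * rate)))

def expected_profit(rate, funding_cost=0.10):
    return (rate - funding_cost) * take_up(rate)

rates = [0.10 + 0.005 * i for i in range(41)]       # grid: 10% .. 30%
best_rate = max(rates, key=expected_profit)
```

    The optimum balances margin against take-up: raising the rate increases profit per accepted loan but shrinks the acceptance probability.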

  19. Design, modeling and optimization of poly-air gap actuators with global coils: application to multi-rod linear structures; Conception, modelisation et optimisation des actionneurs polyentrefers a bobinages globaux: application aux structures lineaires multi-tiges

    Energy Technology Data Exchange (ETDEWEB)

    Cavarec, P.E.

    2002-11-15

    The aim of this thesis is the study and design of split structures of global-coil synchronous machines for the maximisation of specific torque or thrust. This machine concept, called multi-air gap, is applied more specifically to the development of a new linear multi-rod actuator, and is clearly connected to the context of direct-drive solutions. First, a classification of the different electromagnetic actuator families shows the particular place of multi-air gap actuators. Then, a study based on geometrical parameter optimisation underlines the interest of this kind of topology for reaching very high specific forces and mechanical dynamics. A similitude law governing these actuators is then derived. A study of the mechanical behaviour, taking into account mechanical tolerances and normal (guidance) forces, is carried out. Hence, methods for filtering the ripple force and decreasing the parasitic forces without affecting the useful force are presented. This approach leads to multi-rod structures. A prototype is then tested; it validates the feasibility of this kind of device and the accuracy of the magnetic models. This motor, having only eight rods in an active volume of one litre, reaches an electromagnetic force of 1000 N in static conditions. A method for estimating the optimal performance of multi-rod actuators under several mechanical constraints is presented. (author)

  20. User perspectives in public transport timetable optimisation

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    The present paper deals with timetable optimisation from the perspective of minimising the waiting time experienced by passengers when transferring either to or from a bus. Due to its inherent complexity, this bi-level minimisation problem is extremely difficult to solve mathematically, since tim...... on the large-scale public transport network in Denmark. The timetable optimisation approach yielded a yearly reduction in weighted waiting time equivalent to approximately 45 million Danish kroner (9 million USD)....
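    The waiting-time objective at a single transfer stop can be sketched in a few lines. This toy version (a fixed bus headway and a handful of train arrivals) only illustrates the idea, not the paper's bi-level network model:

```python
# Toy sketch of transfer-wait minimisation: buses run every 20 minutes, and we
# shift the timetable offset to minimise the total wait of passengers arriving
# on trains at known minutes past the hour. Numbers are invented.
def wait(arrival, offset, headway=20):
    """Minutes until the next bus, for buses at offset, offset+headway, ..."""
    return (offset - arrival) % headway

def total_wait(offset, arrivals, headway=20):
    return sum(wait(a, offset, headway) for a in arrivals)

arrivals = [3, 12, 27, 41, 55]                      # train arrival minutes
best = min(range(20), key=lambda o: total_wait(o, arrivals))
```

    The real problem couples many lines and stops, which is what makes the full optimisation so hard to solve exactly.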

  1. Methodological principles for optimising functional MRI experiments

    International Nuclear Information System (INIS)

    Wuestenberg, T.; Giesel, F.L.; Strasburger, H.

    2005-01-01

    Functional magnetic resonance imaging (fMRI) is one of the most common methods for localising neuronal activity in the brain. Even though the sensitivity of fMRI is comparatively low, the optimisation of certain experimental parameters allows obtaining reliable results. In this article, approaches for optimising the experimental design, imaging parameters and analytic strategies will be discussed. Clinical neuroscientists and interested physicians will receive practical rules of thumb for improving the efficiency of brain imaging experiments. (orig.) [de

  2. Optimisation: how to develop stake holder involvement

    International Nuclear Information System (INIS)

    Weiss, W.

    2003-01-01

    The Precautionary Principle is an internationally recognised approach for dealing with risk situations characterised by uncertainties and potential irreversible damages. Since the late fifties, ICRP has adopted this prudent attitude because of the lack of scientific evidence concerning the existence of a threshold at low doses for stochastic effects. The 'linear, no-threshold' model and the 'optimisation of protection' principle have been developed as a pragmatic response for the management of this risk. Progress in epidemiology and radiobiology over the last decades has affirmed the initial assumption, and optimisation remains the appropriate response for the application of the precautionary principle in the context of radiological protection. The basic objective of optimisation is, for any source within the system of radiological protection, to maintain the level of exposure as low as reasonably achievable, taking into account social and economic factors. Methods, tools and procedures have been developed over the last two decades to put the optimisation principle into practice, with a central role given to cost-benefit analysis as a means to determine the optimised level of protection. However, as implementation of the principle advanced, more emphasis was progressively given to good practice, as well as to the importance of controlling individual levels of exposure through the optimisation process. In the context of the revision of its present recommendations, the Commission is reinforcing the emphasis on protection of the individual with the adoption of an equity-based system that recognises individual rights and a basic level of health protection. Another advancement is the role now given to stakeholder involvement in the optimisation process as a means to improve the quality of the decision-aiding process for identifying and selecting protection actions accepted by all those involved. The paper

  3. Dose optimisation in single plane interstitial brachytherapy

    DEFF Research Database (Denmark)

    Tanderup, Kari; Hellebust, Taran Paulsen; Honoré, Henriette Benedicte

    2006-01-01

    BACKGROUND AND PURPOSE: Brachytherapy dose distributions can be optimised by modulation of source dwell times. In this study dose optimisation in single planar interstitial implants was evaluated in order to quantify the potential benefit in patients. MATERIAL AND METHODS: In 14 patients, treated for recurrent rectal and cervical cancer, flexible catheters were sutured intra-operatively to the tumour bed in areas with compromised surgical margin. Both non-optimised, geometrically and graphically optimised CT-based dose plans were made. The overdose index ... on the regularity of the implant, such that the benefit of optimisation was larger for irregular implants. OI and HI correlated strongly with target volume, limiting the usability of these parameters for comparison of dose plans between patients. CONCLUSIONS: Dwell time optimisation significantly ...

  4. A part toolbox to tune genetic expression in Bacillus subtilis

    Science.gov (United States)

    Guiziou, Sarah; Sauveplane, Vincent; Chang, Hung-Ju; Clerté, Caroline; Declerck, Nathalie; Jules, Matthieu; Bonnet, Jerome

    2016-01-01

    Libraries of well-characterised components regulating gene expression levels are essential to many synthetic biology applications. While widely available for the Gram-negative model bacterium Escherichia coli, such libraries are lacking for the Gram-positive model Bacillus subtilis, a key organism for basic research and biotechnological applications. Here, we engineered a genetic toolbox comprising libraries of promoters, Ribosome Binding Sites (RBS), and protein degradation tags to precisely tune gene expression in B. subtilis. We first designed a modular Expression Operating Unit (EOU) facilitating parts assembly and modifications and providing a standard genetic context for gene circuits implementation. We then selected native, constitutive promoters of B. subtilis and efficient RBS sequences from which we engineered three promoters and three RBS sequence libraries exhibiting ∼14 000-fold dynamic range in gene expression levels. We also designed a collection of SsrA proteolysis tags of variable strength. Finally, by using fluorescence fluctuation methods coupled with two-photon microscopy, we quantified the absolute concentration of GFP in a subset of strains from the library. Our complete promoters and RBS sequences library comprising over 135 constructs enables tuning of GFP concentration over five orders of magnitude, from 0.05 to 700 μM. This toolbox of regulatory components will support many research and engineering applications in B. subtilis. PMID:27402159

  5. SCoT: a Python toolbox for EEG source connectivity.

    Science.gov (United States)

    Billinger, Martin; Brunner, Clemens; Müller-Putz, Gernot R

    2014-01-01

    Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT, a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill-posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivity estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT.
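    The VAR backbone of such connectivity estimates can be illustrated without SCoT itself: simulate a two-channel VAR(1) process in which channel 0 drives channel 1, then recover the directed coefficient by least squares. This is a sketch of the principle, not SCoT's API:

```python
import numpy as np

# Two-channel VAR(1) with directed coupling 0 -> 1; the off-diagonal entry
# A[1, 0] = 0.4 is the "connectivity" a least-squares fit should recover.
rng = np.random.default_rng(42)
A = np.array([[0.5, 0.0],      # x0[t] depends only on x0[t-1]
              [0.4, 0.3]])     # x1[t] also depends on x0[t-1]
n = 5000
x = np.zeros((n, 2))
for t in range(1, n):
    x[t] = A @ x[t - 1] + 0.1 * rng.standard_normal(2)

# Least-squares VAR(1) fit: x[t] ~ A_hat @ x[t-1].
B, *_ = np.linalg.lstsq(x[:-1], x[1:], rcond=None)
A_hat = B.T
```

    A_hat[1, 0] comes out near 0.4 while A_hat[0, 1] stays near zero, reflecting the direction of the coupling; spectral connectivity measures are then derived from such fitted coefficients.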

  6. SCoT: A Python Toolbox for EEG Source Connectivity

    Directory of Open Access Journals (Sweden)

    Martin eBillinger

    2014-03-01

    Full Text Available Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT, a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill-posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivity estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT.

  7. Semi-automated Robust Quantification of Lesions (SRQL Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori L Ito

    2017-05-01

    Full Text Available Quantifying lesions in a reliable manner is fundamental for studying the effects of neuroanatomical changes related to recovery in the post-stroke brain. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This often makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. Thus, we developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.557114) Toolbox that performs several analysis steps: (1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; (2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups; and (3) an option to perform analyses in both native and standard space to facilitate analyses in either space. Here, we describe the methods implemented in the toolbox.
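    Step (1), the white matter intensity correction, amounts to removing lesion-mask voxels whose intensity looks like healthy white matter. A toy numpy sketch with invented intensities and thresholds (SRQL itself operates on registered MRI volumes, not four-voxel arrays):

```python
import numpy as np

# Sketch of a white-matter intensity correction: drop lesion voxels whose
# intensity is within k standard deviations of the white-matter mean.
# All numbers here are invented for illustration.
def wm_correct(lesion_mask, intensity, wm_mean, wm_sd, k=1.0):
    """Remove voxels that look like healthy white matter from the mask."""
    healthy = intensity > (wm_mean - k * wm_sd)
    return lesion_mask & ~healthy

intensity = np.array([30.0, 55.0, 95.0, 98.0])   # voxel intensities
mask = np.array([True, True, True, False])       # hand-drawn lesion mask
corrected = wm_correct(mask, intensity, wm_mean=100.0, wm_sd=10.0)
```

    The third voxel, bright enough to pass for healthy white matter, is dropped from the mask; the darker lesion voxels survive.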

  8. Thermal performance monitoring and optimisation

    International Nuclear Information System (INIS)

    Sunde, Svein; Berg, Oeyvind

    1998-01-01

    Monitoring of the thermal efficiency of nuclear power plants is expected to become increasingly important as energy-market liberalisation exposes plants to increasing availability requirements and fiercer competition. The general goal in thermal performance monitoring is straightforward: to maximise the ratio of profit to cost under the constraints of safe operation. This goal may be pursued in two ways, one oriented towards fault detection and cost-optimal predictive maintenance, and another aimed at optimising target values of parameters in response to any detected component degradation, changes in ambient conditions, or the like. Annual savings associated with effective thermal-performance monitoring are expected to be in the order of $100 000 for power plants of representative size. A literature review shows that a number of computer systems for thermal-performance monitoring exist, either as prototypes or commercially available. The characteristics and needs of power plants may vary widely, however, and decisions concerning the exact scope, content and configuration of a thermal-performance monitor may well follow a heuristic approach. Furthermore, re-use of existing software modules may be desirable. We therefore suggest here the design of a flexible workbench for easy assembly of an experimental thermal-performance monitor at the Halden Project. The suggested design draws heavily on our extended experience in implementing control-room systems featuring high levels of customisation, flexibility in configuration and modularity in structure, and on a number of relevant adjoining activities. The design includes a multi-computer communication system and a graphical user interface, and aims at a system adaptable to any combination of in-house or end-user modules, as well as commercially available software. (author)

  9. Interval Coded Scoring: a toolbox for interpretable scoring systems

    Directory of Open Access Journals (Sweden)

    Lieven Billiet

    2018-04-01

    Full Text Available Over the last decades, clinical decision support systems have been gaining importance. They help clinicians to make effective use of the overload of available information to obtain correct diagnoses and appropriate treatments. However, their power often comes at the cost of a black-box model which cannot be interpreted easily. This interpretability is of paramount importance in a medical setting with regard to trust and (legal) responsibility. In contrast, existing medical scoring systems are easy to understand and use, but they are often a simplified rule-of-thumb summary of previous medical experience rather than a well-founded system based on available data. Interval Coded Scoring (ICS) connects these two approaches, exploiting the power of sparse optimization to derive scoring systems from training data. The presented toolbox interface makes this theory easily applicable to both small and large datasets. It contains two possible problem formulations based on linear programming or elastic net. Both make it possible to construct a model for a binary classification problem and establish risk profiles that can be used for future diagnosis. All of this requires only a few lines of code. ICS differs from standard machine learning through its model consisting of interpretable main effects and interactions. Furthermore, insertion of expert knowledge is possible because the training can be semi-automatic. This allows end users to make a trade-off between complexity and performance based on cross-validation results and expert knowledge. Additionally, the toolbox offers an accessible way to assess classification performance via accuracy and the ROC curve, whereas the calibration of the risk profile can be evaluated via a calibration curve. Finally, the colour-coded model visualization has particular appeal if one wants to apply ICS manually on new observations, as well as for validation by experts in the specific application domains.
The validity and applicability
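    The kind of scoring system ICS produces can be illustrated with a hand-made table. The intervals and points below are invented, not an ICS-trained model: each feature interval contributes integer points, and the summed score is compared with a threshold:

```python
# Toy interval-coded scoring table (invented points, not an ICS model):
# each feature maps value intervals [lo, hi) to integer points.
SCORE_TABLE = {
    "age":      [((0, 40), 0), ((40, 60), 1), ((60, 200), 2)],
    "systolic": [((0, 120), 0), ((120, 140), 1), ((140, 300), 3)],
}

def points(feature, value):
    for (lo, hi), pts in SCORE_TABLE[feature]:
        if lo <= value < hi:
            return pts
    raise ValueError(f"{feature}={value} out of range")

def score(patient):
    return sum(points(f, v) for f, v in patient.items())

patient = {"age": 67, "systolic": 128}
risk = "high" if score(patient) >= 3 else "low"
```

    The appeal is exactly what the abstract argues: a clinician can apply and audit such a table by hand, unlike a black-box classifier.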

  10. Balanced the Trade-offs problem of ANFIS Using Particle Swarm Optimisation

    Directory of Open Access Journals (Sweden)

    Dian Palupi Rini

    2013-11-01

    Full Text Available Improving the approximation accuracy and interpretability of fuzzy systems is an important issue, both in fuzzy systems theory and in its applications. Simultaneously optimising both issues is known to be a trade-off problem, but doing so improves the performance of the system and avoids overtraining on the data. Particle swarm optimisation (PSO) is an evolutionary algorithm that is a good candidate for problems with multiple optimal solutions and for searching the global space. This paper introduces an integration of PSO and ANFIS to optimise ANFIS learning, especially for tuning membership function parameters and finding the optimal rules for better classification. The proposed method has been tested on four standard datasets from the UCI machine learning repository, i.e. Iris Flower, Haberman's Survival, Balloon and Thyroid. The results have shown better classification using the proposed PSO-ANFIS, and the time complexity is reduced accordingly.
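    The PSO component can be sketched independently of ANFIS: a global-best PSO minimising a stand-in cost function over a generic parameter vector (in the paper, the vector would hold membership function parameters and the cost would be classification error):

```python
import random

# Minimal global-best PSO. The sphere function stands in for the real cost;
# inertia 0.7 and acceleration coefficients 1.5 are common textbook choices.
def pso(dim=4, particles=15, iters=60, seed=3):
    rng = random.Random(seed)
    f = lambda p: sum(x * x for x in p)              # stand-in cost function
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]                      # personal bests
    gbest = min(pbest, key=f)[:]                     # global best
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest

best = pso()
```

    Each particle is pulled toward its own best position and the swarm's best, which is what gives PSO its global search character.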

  11. Process and Economic Optimisation of a Milk Processing Plant with Solar Thermal Energy

    DEFF Research Database (Denmark)

    Bühler, Fabian; Nguyen, Tuong-Van; Elmegaard, Brian

    2016-01-01

    This work investigates the integration of solar thermal systems for process energy use. A shift from fossil fuels to renewable energy could be beneficial both from environmental and economic perspectives, after the process itself has been optimised and efficiency measures have been implemented. Based on the case study of a dairy factory, where first a heat integration is performed to optimise the system, a model for solar thermal process integration is developed. The detailed model is based on annual hourly global direct and diffuse solar radiation, from which the radiation on a defined surface is calculated. Based on hourly process stream data from the dairy factory, the optimal streams for solar thermal process integration are found, with an optimal thermal storage tank volume. The last step consists of an economic optimisation of the problem to determine the optimal size...

  12. Design and Implementation of a Space Environment Simulation Toolbox for Small Satellites

    DEFF Research Database (Denmark)

    Amini, Rouzbeh; Larsen, Jesper A.; Izadi-Zamanabadi, Roozbeh

    2005-01-01

    This paper presents a developed toolbox for space environment model in SIMULINK that facilitates development and design of Attitude Determination and Control Systems (ADCS) for a Low Earth Orbit (LEO) spacecraft. The toolbox includes, among others, models of orbit propagators, disturbances, Earth...... gravity field, Earth magnetic field and eclipse. The structure and facilities within the toolbox are described and exemplified using a student satellite case (AAUSAT-II). The validity of developed models is confirmed by comparing the simulation results with the realistic data obtained from the Danish...

  13. Optimising India's small hydro resources

    International Nuclear Information System (INIS)

    Kumar, A.

    1995-01-01

    A brief history is given of an initiative to develop small-scale hydropower projects in the Himalayas. The experience of the Indian project managers in utilising international funds from the Global Environment Facility could serve as a model for other small remote communities in the rest of the world. Lessons learned are reported. (UK)

  14. Optimisation of Investment Resources at Small Enterprises

    Directory of Open Access Journals (Sweden)

    Shvets Iryna B.

    2014-03-01

    Full Text Available The goal of the article lies in the study of the process of optimisation of the structure of investment resources, and the development of criteria and stages of optimisation of volumes of investment resources for small enterprises by type of economic activity. The article characterises the process of transformation of investment resources into assets and liabilities of the balance sheets of small enterprises, and calculates the structure of sources of formation of investment resources at small enterprises in Ukraine by type of economic activity in 2011. On the basis of the conducted analysis of the structure of investment resources of small enterprises, the article forms the main groups of optimisation criteria in the context of individual small enterprises by type of economic activity. The article offers an algorithm and a step-by-step scheme of optimisation of investment resources at small enterprises, in the form of a multi-stage process of managing investment resources so as to increase their mobility and the rate at which existing resources are transformed into investments. The prospect of further studies in this direction is the development of a structural and logical scheme of optimisation of volumes of investment resources at small enterprises.

  15. An effective approach to reducing strategy space for maintenance optimisation of multistate series–parallel systems

    International Nuclear Information System (INIS)

    Zhou, Yifan; Lin, Tian Ran; Sun, Yong; Bian, Yangqing; Ma, Lin

    2015-01-01

    Maintenance optimisation of series–parallel systems is a research topic of practical significance. Nevertheless, a cost-effective maintenance strategy is difficult to obtain because of the large strategy space of such systems. A heuristic algorithm is often employed to deal with this problem; however, the solution obtained by a heuristic algorithm is not always the global optimum, and the algorithm itself can be very time-consuming. An alternative method based on linear programming is thus developed in this paper to overcome these difficulties by reducing the strategy space of maintenance optimisation. A theoretical proof is provided to verify that the proposed method is at least as effective as the existing methods for strategy space reduction. Numerical examples are also presented for maintenance optimisation of series–parallel systems that have multistate components and consider both economic dependence among components and multiple-level imperfect maintenance. The simulation results confirm that the proposed method is more effective than the existing methods in removing inappropriate maintenance strategies of multistate series–parallel systems. - Highlights: • A new method using linear programming is developed to reduce the strategy space. • The effectiveness of the new method for strategy reduction is theoretically proved. • Imperfect maintenance and economic dependence are considered during optimisation.
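
    The general idea of pruning a maintenance strategy space can be sketched generically. The toy below is an assumed illustration, not the paper's linear-programming formulation: a candidate strategy is discarded when another strategy costs no more and is at least as reliable. All names and figures are invented.

```python
# Hypothetical sketch: pruning a maintenance-strategy space by dominance
# (cost vs. achieved system reliability) before exact optimisation.
# This is NOT the paper's LP method; it only illustrates strategy removal.

def prune_dominated(strategies):
    """Keep only strategies not dominated in both cost and reliability.

    Each strategy is a (name, cost, reliability) tuple.  Strategy A dominates
    B when A costs no more and is at least as reliable, with at least one
    strict inequality.
    """
    kept = []
    for a in strategies:
        dominated = any(
            b[1] <= a[1] and b[2] >= a[2] and (b[1] < a[1] or b[2] > a[2])
            for b in strategies
        )
        if not dominated:
            kept.append(a)
    return kept

candidates = [
    ("minimal repair", 10.0, 0.90),
    ("imperfect overhaul", 25.0, 0.96),
    ("costly but weak", 30.0, 0.92),   # dominated by "imperfect overhaul"
    ("full replacement", 40.0, 0.99),
]
print([s[0] for s in prune_dominated(candidates)])
```

    Only the non-dominated strategies need to be passed to the exact optimiser, which is what shrinks the search space.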

  16. DYNAMIC SYSTEM ANALYSIS WITH pplane8.m (MATLAB® toolbox)

    Directory of Open Access Journals (Sweden)

    Alejandro Regalado-Méndez

    2013-12-01

    In this work, four dynamic systems (physical, chemical, ecological and economic), represented by autonomous systems of two ordinary differential equations, were analysed. The main objective was to show that pplane8.m is an adequate and efficient computational tool. The analysis of the autonomous systems was performed by characterising their critical points according to Coughanowr & LeBlanc (2009), using the MATLAB® toolbox pplane8.m. The main results are that pplane8.m (Polking, 2009) can quickly and precisely draw the trajectories of each phase plane, easily computes each critical point, and correctly characterises each equilibrium point of all the autonomous systems studied. Finally, we can say that pplane8.m is a powerful tool to support the teaching-learning process for engineering students.
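
    The kind of equilibrium classification pplane8.m performs rests on the trace and determinant of the Jacobian of the linearised system. The sketch below is a minimal, assumed example (the system and coefficients are invented, not taken from the paper):

```python
# Minimal sketch of pplane8.m-style equilibrium classification: evaluate the
# Jacobian of a 2D autonomous system at a critical point and inspect its
# trace and determinant.  Example system is an invented damped oscillator.

def classify(J):
    """Classify the equilibrium of a 2x2 linearised system."""
    (a, b), (c, d) = J
    trace, det = a + d, a * d - b * c
    if det < 0:
        return "saddle"
    disc = trace ** 2 - 4 * det
    kind = "node" if disc >= 0 else "spiral"
    stability = "stable" if trace < 0 else "unstable"
    return f"{stability} {kind}"

# Damped oscillator linearised at the origin: x' = y, y' = -x - 0.5*y
print(classify([[0.0, 1.0], [-1.0, -0.5]]))  # → stable spiral
```

    The same trace/determinant test reproduces the standard phase-plane taxonomy (saddle, node, spiral) used in the characterisation above.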

  17. The influence of industrial applications on a control system toolbox

    International Nuclear Information System (INIS)

    Clout, P.

    1992-01-01

    Vsystem is an open, advanced software application toolbox for rapidly creating fast, efficient and cost-effective control and data-acquisition systems. Vsystem's modular architecture is designed for single computers, networked computers and workstations running under VAX/VMS or VAX/ELN. At the heart of Vsystem lies Vaccess, a user-extendible real-time database and library of access routines. The application database provides the link to the hardware of the application and can be organized as one database or as separate databases installed in different computers on the network. Vsystem has found application in charged-particle accelerator control, tokamak control, and industrial research, as well as its more recent industrial applications. This paper describes the broad features of Vsystem and the influence that recent industrial applications have had on the software. (author)

  18. Neuro-QOL and the NIH Toolbox: implications for epilepsy

    Science.gov (United States)

    Nowinski, Cindy J; Victorson, David; Cavazos, Jose E; Gershon, Richard; Cella, David

    2011-01-01

    The impact of neurological disorders on the lives of patients is often far more complex than what is measured in routine examination. Measurement of this impact can be challenging owing to a lack of brief, psychometrically sound and generally accepted instruments. Two NIH-funded initiatives are developing assessment tools, in English and Spanish, which address these issues, and should prove useful to the study and treatment of epilepsy and other neurological conditions. The first, Neuro-QOL, has created a set of health-related quality of life measures that are applicable for people with common neurological disorders. The second, the NIH Toolbox for the Assessment of Neurological and Behavioral Function, is assembling measures of cognitive, emotional, motor and sensory health and function that can be used across all ages, from 3 to 85 years. This article describes both the projects and their potential value to epilepsy treatment and research. PMID:21552344

  19. A fractured rock geophysical toolbox method selection tool

    Science.gov (United States)

    Day-Lewis, F. D.; Johnson, C.D.; Slater, L.D.; Robinson, J.L.; Williams, J.H.; Boyden, C.L.; Werkema, D.D.; Lane, J.W.

    2016-01-01

    Geophysical technologies have the potential to improve site characterization and monitoring in fractured rock, but the appropriate and effective application of geophysics at a particular site strongly depends on project goals (e.g., identifying discrete fractures) and site characteristics (e.g., lithology). No method works at every site or for every goal. New approaches are needed to identify a set of geophysical methods appropriate to specific project goals and site conditions while considering budget constraints. To this end, we present the Excel-based Fractured-Rock Geophysical Toolbox Method Selection Tool (FRGT-MST). We envision the FRGT-MST (1) equipping remediation professionals with a tool to understand what is likely to be realistic and cost-effective when contracting geophysical services, and (2) reducing applications of geophysics with unrealistic objectives or where methods are likely to fail.

  20. Matlab Stability and Control Toolbox: Trim and Static Stability Module

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.

    2006-01-01

    This paper presents the technical background of the Trim and Static module of the Matlab Stability and Control Toolbox. This module performs a low-fidelity stability and control assessment of an aircraft model for a set of flight critical conditions. This is attained by determining if the control authority available for trim is sufficient and if the static stability characteristics are adequate. These conditions can be selected from a prescribed set or can be specified to meet particular requirements. The prescribed set of conditions includes horizontal flight, take-off rotation, landing flare, steady roll, steady turn and pull-up/push-over flight, for which several operating conditions can be specified. A mathematical model was developed allowing for six-dimensional trim, adjustable inertial properties, asymmetric vehicle layouts, arbitrary number of engines, multi-axial thrust vectoring, engine(s)-out conditions, crosswind and gyroscopic effects.
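
    A textbook-style miniature of the kind of check such a module performs is sketched below: solve a linear pitching-moment balance for the trim elevator deflection, and test longitudinal static stability via the sign of Cm_alpha. The coefficients are invented for illustration and are not from the toolbox.

```python
# Hedged sketch of a trim and static-stability check (coefficients assumed):
# pitching moment model  Cm = Cm0 + Cma*alpha + Cmde*de
Cm0, Cma, Cmde = 0.05, -0.8, -1.2   # per radian, invented values
alpha = 0.04                         # rad, desired trim angle of attack

# Trim condition: Cm0 + Cma*alpha + Cmde*de = 0
de = -(Cm0 + Cma * alpha) / Cmde     # required elevator deflection

# Static stability requires a restoring pitching moment: Cm_alpha < 0
statically_stable = Cma < 0.0

print(round(de, 4), statically_stable)
```

    A full assessment would additionally verify that the required deflection stays within the control authority limits, which is the "sufficient control authority for trim" test described above.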

  1. Peptide chemistry toolbox - Transforming natural peptides into peptide therapeutics.

    Science.gov (United States)

    Erak, Miloš; Bellmann-Sickert, Kathrin; Els-Heindl, Sylvia; Beck-Sickinger, Annette G

    2018-06-01

    The development of solid phase peptide synthesis has released tremendous opportunities for using synthetic peptides in medicinal applications. In the last decades, peptide therapeutics became an emerging market in pharmaceutical industry. The need for synthetic strategies in order to improve peptidic properties, such as longer half-life, higher bioavailability, increased potency and efficiency is accordingly rising. In this mini-review, we present a toolbox of modifications in peptide chemistry for overcoming the main drawbacks during the transition from natural peptides to peptide therapeutics. Modifications at the level of the peptide backbone, amino acid side chains and higher orders of structures are described. Furthermore, we are discussing the future of peptide therapeutics development and their impact on the pharmaceutical market. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Optimisation of NMR dynamic models II. A new methodology for the dual optimisation of the model-free parameters and the Brownian rotational diffusion tensor

    International Nuclear Information System (INIS)

    D'Auvergne, Edward J.; Gooley, Paul R.

    2008-01-01

    Finding the dynamics of an entire macromolecule is a complex problem, as the model-free parameter values are intricately linked to the Brownian rotational diffusion of the molecule: mathematically through the autocorrelation function of the motion, and statistically through model selection. The solution to this problem was formulated using set theory as an element of the universal set U, the union of all model-free spaces (d'Auvergne EJ and Gooley PR (2007) Mol BioSyst 3(7), 483-494). The current procedure commonly used to find the universal solution is to initially estimate the diffusion tensor parameters, to optimise the model-free parameters of numerous models, and then to choose the best model via model selection. The global model is then optimised and the procedure repeated until convergence. In this paper a new methodology is presented which takes a different approach to this diffusion-seeded model-free paradigm. Rather than starting with the diffusion tensor, this iterative protocol begins by optimising the model-free parameters in the absence of any global model parameters, selecting between all the model-free models, and finally optimising the diffusion tensor. The new model-free optimisation protocol is validated using synthetic data from Schurr JM et al. (1994) J Magn Reson B 105(3), 211-224 and the relaxation data of the bacteriorhodopsin (1-36)BR fragment from Orekhov VY (1999) J Biomol NMR 14(4), 345-356. To demonstrate the importance of this new procedure, the NMR relaxation data of the Olfactory Marker Protein (OMP) of Gitti R et al. (2005) Biochem 44(28), 9673-9679 are reanalysed. The result is that the dynamics of certain secondary structural elements are very different from those originally reported.
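
    The iterative protocol above alternates between local (model-free) and global (diffusion tensor) parameters until convergence. The toy below shows only that alternating pattern, on an invented two-block objective; it has nothing to do with the actual relaxation equations.

```python
# Assumed illustration of alternating (block-coordinate) optimisation:
# minimise over the "local" block with the "global" block fixed, then the
# reverse, and repeat until convergence.  Objective is invented.

def f(local, glob):
    return (local - 2 * glob) ** 2 + (glob - 1.0) ** 2   # minimum at (2, 1)

def minimise_over(var_f, lo=-10.0, hi=10.0, steps=2001):
    """Crude 1D grid search standing in for a real optimiser."""
    grid = [lo + i * (hi - lo) / (steps - 1) for i in range(steps)]
    return min(grid, key=var_f)

local, glob = 0.0, 0.0
for _ in range(20):                                  # alternate the two blocks
    local = minimise_over(lambda v: f(v, glob))      # "model-free" step
    glob = minimise_over(lambda v: f(local, v))      # "diffusion tensor" step
print(round(local, 2), round(glob, 2))
```

    Each half-step can only decrease the objective, which is why such alternating schemes converge; the paper's contribution concerns which block to seed the alternation with.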

  3. PANDA: a pipeline toolbox for analyzing brain diffusion images.

    Science.gov (United States)

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named "Pipeline for Analyzing braiN Diffusion imAges" (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.
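
    PANDA itself is a MATLAB toolbox; the sketch below only illustrates, in Python, the parallelisation idea described above: each subject's pipeline is independent, so subjects can be dispatched to separate workers. The step names and outputs are invented placeholders, not PANDA's API.

```python
# Assumed sketch of per-subject parallel processing (PANDA-style):
# one independent pipeline per subject, run concurrently.
from concurrent.futures import ThreadPoolExecutor

def preprocess(subject):
    # Placeholder for DICOM/NIfTI conversion, eddy-current correction, etc.
    return f"{subject}:converted"

def compute_metrics(converted):
    # Placeholder for diffusion-metric computation (e.g. FA, MD).
    return converted + ":FA,MD"

def run_pipeline(subject):
    # The full chain for one subject; no shared state between subjects.
    return compute_metrics(preprocess(subject))

subjects = ["sub-01", "sub-02", "sub-03"]
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_pipeline, subjects))
print(results)
```

    Because subjects share no state, the same pattern scales from multiple cores on one machine to a distributed scheduler, which is the time saving the abstract describes.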

  4. PANDA: a pipeline toolbox for analyzing brain diffusion images

    Directory of Open Access Journals (Sweden)

    Zaixu eCui

    2013-02-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named Pipeline for Analyzing braiN Diffusion imAges (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics (e.g., FA and MD) that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS) level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.

  5. GISMO: A MATLAB toolbox for seismic research, monitoring, & education

    Science.gov (United States)

    Thompson, G.; Reyes, C. G.; Kempler, L. A.

    2017-12-01

    GISMO is an open-source MATLAB toolbox which provides an object-oriented framework to build workflows and applications that read, process, visualize and write seismic waveform, catalog and instrument response data. GISMO can retrieve data from a variety of sources (e.g. FDSN web services, Earthworm/Winston servers) and data formats (SAC, Seisan, etc.). It can handle waveform data that crosses file boundaries. All this alleviates one of the most time-consuming parts of scientists developing their own codes. GISMO simplifies seismic data analysis by providing a common interface for your data, regardless of its source. Several common plots are built in to GISMO, such as record section plots, spectrograms, depth-time sections, event count per unit time, energy release per unit time, etc. Other visualizations include map views and cross-sections of hypocentral data. Several common processing methods are also included, such as an extensive set of tools for correlation analysis. Support is being added to interface GISMO with ObsPy. GISMO encourages community development of an integrated set of codes and accompanying documentation, eliminating the need for seismologists to "reinvent the wheel". By sharing code, the consistency and repeatability of results can be enhanced. GISMO is hosted on GitHub with documentation both within the source code and in the project wiki. GISMO has been used at the University of South Florida and University of Alaska Fairbanks in graduate-level courses including Seismic Data Analysis, Time Series Analysis and Computational Seismology. GISMO has also been tailored to interface with the common seismic monitoring software and data formats used by volcano observatories in the US and elsewhere. As an example, toolbox training was delivered to researchers at INETER (Nicaragua). Applications built on GISMO include IceWeb (e.g. web-based spectrograms), which has been used by Alaska Volcano Observatory since 1998 and became the prototype for the USGS

  6. Development of an Ontology-Directed Signal Processing Toolbox

    Energy Technology Data Exchange (ETDEWEB)

    Stephen W. Lang

    2011-05-27

    This project was focused on the development of tools for the automatic configuration of signal processing systems. The goal is to develop tools that will be useful in a variety of Government and commercial areas and useable by people who are not signal processing experts. In order to get the most benefit from signal processing techniques, deep technical expertise is often required in order to select appropriate algorithms, combine them into a processing chain, and tune algorithm parameters for best performance on a specific problem. Therefore a significant benefit would result from the assembly of a toolbox of processing algorithms that has been selected for their effectiveness in a group of related problem areas, along with the means to allow people who are not signal processing experts to reliably select, combine, and tune these algorithms to solve specific problems. Defining a vocabulary for problem domain experts that is sufficiently expressive to drive the configuration of signal processing functions will allow the expertise of signal processing experts to be captured in rules for automated configuration. In order to test the feasibility of this approach, we addressed a lightning classification problem, which was proposed by DOE as a surrogate for problems encountered in nuclear nonproliferation data processing. We coded a toolbox of low-level signal processing algorithms for extracting features of RF waveforms, and demonstrated a prototype tool for screening data. We showed examples of using the tool for expediting the generation of ground-truth metadata, for training a signal recognizer, and for searching for signals with particular characteristics. The public benefits of this approach, if successful, will accrue to Government and commercial activities that face the same general problem - the development of sensor systems for complex environments. It will enable problem domain experts (e.g. analysts) to construct signal and image processing chains without

  7. Expanding access to rheumatology care: the rheumatology general practice toolbox.

    LENUS (Irish Health Repository)

    Conway, R

    2015-02-01

    Management guidelines for many rheumatic diseases are published in specialty rheumatology literature but rarely in general medical journals. Musculoskeletal disorders comprise 14% of all consultations in primary care. Formal post-graduate training in rheumatology is limited or absent for many primary care practitioners. Primary care practitioners can be trained to effectively treat complex diseases and have expressed a preference for interactive educational courses. The Rheumatology General Practice (GP) Toolbox is an intensive one day course designed to offer up to date information to primary care practitioners on the latest diagnostic and treatment guidelines for seven common rheumatic diseases. The course structure involves a short lecture on each topic and workshops on arthrocentesis, joint injection and DXA interpretation. Participants evaluated their knowledge and educational experience before, during and after the course. Thirty-two primary care practitioners attended, who had a median of 13 (IQR 6.5, 20) years' experience in their specialty. The median number of educational symposia attended in the previous 5 years was 10 (IQR 5, 22.5), with a median of 0 (IQR 0, 1) in rheumatology. All respondents agreed that the course format was appropriate. Numerical improvements were demonstrated in participants' confidence in diagnosing and managing all seven common rheumatologic conditions, with statistically significant improvements (p < 0.05) in 11 of the 14 aspects assessed. The Rheumatology Toolbox is an effective educational method for disseminating current knowledge in rheumatology to primary care physicians and improved participants' self-assessed competence in diagnosis and management of common rheumatic diseases.

  8. Developing a Conceptual Design Engineering Toolbox and its Tools

    Directory of Open Access Journals (Sweden)

    R. W. Vroom

    2004-01-01

    In order to develop a successful product, a design engineer needs to pay attention to all relevant aspects of that product. Many tools are available: software, books, websites, and commercial services. To unlock these potentially useful sources of knowledge, we are developing C-DET, a toolbox for conceptual design engineering. The idea of C-DET is that designers are supported by a system that provides them with a knowledge portal on the one hand, and a system to store their current work on the other. The knowledge portal is to help the designer find the most appropriate sites, experts, tools etc. at short notice. Such a toolbox offers opportunities to incorporate extra functionalities to support the design engineering work. One of these functionalities could be to help the designer reach a balanced comprehension in his work. Furthermore, C-DET enables researchers in the area of design engineering and design engineers themselves to find each other or their work earlier and more easily. Newly developed design tools that can be used by design engineers but have not yet been developed up to a commercial level can be linked to by C-DET. In this way these tools can be evaluated at an early stage by design engineers who would like to use them. This paper describes the first prototypes of C-DET, an example of the development of a design tool that enables designers to forecast the use process, and an example of the future functionalities of C-DET such as balanced comprehension.

  9. Analysis and optimisation of a mixed fluid cascade (MFC) process

    Science.gov (United States)

    Ding, He; Sun, Heng; Sun, Shoujun; Chen, Cheng

    2017-04-01

    A mixed fluid cascade (MFC) process that comprises three refrigeration cycles has great capacity for large-scale LNG production, which consumes a great amount of energy. Therefore, any performance enhancement of the liquefaction process will significantly reduce the energy consumption. The MFC process is simulated and analysed using proprietary software, Aspen HYSYS. The effects of feed gas pressure, LNG storage pressure, water-cooler outlet temperature, different pre-cooling regimes, and liquefaction and sub-cooling refrigerant composition on MFC performance are investigated and presented. The excellent numerical calculation ability and user-friendly interface of MATLAB™ are combined with the powerful thermo-physical property package of Aspen HYSYS. A genetic algorithm is then invoked to optimise the MFC process globally. After optimisation, the unit power consumption can be reduced to 4.655 kW h/kmol, or 4.366 kW h/kmol, on condition that the compressor adiabatic efficiency is 80% or 85%, respectively. Additionally, to improve the process further with regard to its thermodynamic efficiency, configuration optimisation is conducted for the MFC process and several configurations are established. By analysing heat transfer and thermodynamic performance, the configuration entailing a pre-cooling cycle with three pressure levels, liquefaction, and a sub-cooling cycle with one pressure level is identified as the most efficient and thus optimal: its unit power consumption is 4.205 kW h/kmol. Additionally, the mechanism responsible for the weak performance of the suggested liquefaction cycle configuration lies in the unbalanced distribution of cold energy in the liquefaction temperature range.
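
    A genetic algorithm of the kind invoked above can be sketched in a few lines. The following toy is an assumed illustration only: it minimises a made-up "unit power consumption" surrogate over two normalised composition variables, not the actual HYSYS-coupled objective.

```python
import random

# Toy genetic algorithm (assumed surrogate objective, known minimum at
# (0.3, 0.7) with value 4.2): truncation selection, midpoint crossover,
# Gaussian mutation.  Illustrative only, not the paper's optimiser.

def power(x):
    return 4.2 + (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2

def ga(obj, pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.random(), rng.random()] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=obj)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]   # crossover
            child[rng.randrange(2)] += rng.gauss(0, 0.05)     # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=obj)

best = ga(power)
print(round(power(best), 3))
```

    In the real study each objective evaluation is a full Aspen HYSYS flowsheet simulation driven from MATLAB, which is why a derivative-free global method such as a GA is attractive.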

  10. Multicriteria Optimisation in Logistics Forwarder Activities

    Directory of Open Access Journals (Sweden)

    Tanja Poletan Jugović

    2007-05-01

    The logistics forwarder, as organiser and planner of the coordination and integration of all the elements of transport and logistics chains, uses adequate ways and methods in the process of planning and decision-making. One of these methods, analysed in this paper, which could be used in the optimisation of transport and logistics processes and activities of the logistics forwarder, is the multicriteria optimisation method. Using that method, this paper suggests a model of multicriteria optimisation of logistics forwarder activities. The suggested optimisation model is justified in keeping with the principles of multicriteria optimisation, which belongs to the operations research methods and represents the process of multicriteria optimisation of variants. Among the many different multicriteria optimisation procedures, PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluations) and Promcalc & Gaia V. 3.2, a computer program for multicriteria programming based on the mentioned procedure, were used.
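
    The core of PROMETHEE II can be shown compactly: pairwise preference degrees are aggregated into net outranking flows, whose ordering ranks the variants. The sketch below uses the simplest ("usual") preference function and invented data; it is an illustration of the method, not the paper's case study.

```python
# Assumed PROMETHEE II sketch with the "usual" preference function
# (preference = 1 when an alternative is strictly better on a criterion).

def promethee_ii(scores, weights):
    """scores[a][c]: value of alternative a on criterion c (higher = better).
    Returns the net outranking flow phi for each alternative."""
    n = len(scores)
    pref = [[sum(w for s_a, s_b, w in zip(scores[a], scores[b], weights)
                 if s_a > s_b)
             for b in range(n)] for a in range(n)]
    # net flow = (leaving flow - entering flow), averaged over the n-1 others
    phi = [(sum(pref[a]) - sum(row[a] for row in pref)) / (n - 1)
           for a in range(n)]
    return phi

# Three routing variants judged on cost, time and reliability (weights sum to 1).
scores = [[0.6, 0.8, 0.7],   # variant A
          [0.9, 0.5, 0.6],   # variant B
          [0.7, 0.7, 0.9]]   # variant C
weights = [0.5, 0.2, 0.3]
phi = promethee_ii(scores, weights)
best = max(range(3), key=lambda a: phi[a])
print("ABC"[best])  # → C
```

    Net flows always sum to zero, so the ranking expresses only relative outranking strength, which is exactly what a forwarder needs to compare variants.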

  11. Noise aspects at aerodynamic blade optimisation projects

    International Nuclear Information System (INIS)

    Schepers, J.G.

    1997-06-01

    The Netherlands Energy Research Foundation (ECN) has often been involved in industrial projects in which blade geometries are created automatically by means of numerical optimisation. Usually, these projects aim at the determination of the aerodynamically optimal wind turbine blade, i.e. the goal is to design a blade which is optimal with regard to energy yield. In other cases, blades have been designed which are optimal with regard to the cost of generated energy. However, it is obvious that the wind turbine blade designs which result from these optimisations are not necessarily optimal with regard to noise emission. In this paper an example is shown of an aerodynamic blade optimisation using the ECN program PVOPT. PVOPT calculates the optimal wind turbine blade geometry such that the maximum energy yield is obtained. Using the aerodynamically optimal blade design as a basis, the possibilities of noise reduction are investigated. 11 figs., 8 refs.

  12. Topology Optimisation of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Thike Aye Min

    2016-01-01

    Wireless sensor networks are widely used in a variety of fields, including industrial environments. In the case of a clustered network, the location of the cluster head affects the reliability of the network operation. Finding the optimum location of the cluster head is therefore critical for the design of a network. This paper discusses an optimisation approach, based on the brute force algorithm, in the context of topology optimisation of a cluster-structure centralised wireless sensor network. Two examples are given to verify the approach, demonstrating the implementation of the brute force algorithm to find an optimum location of the cluster head.
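
    A brute-force placement of this kind is easy to sketch: enumerate every candidate position and keep the one minimising some link-quality proxy. The miniature below is an assumed illustration (invented node coordinates and a max-distance criterion), not the paper's network model.

```python
# Assumed brute-force sketch: try every grid position for the cluster head
# and keep the one minimising the largest head-to-sensor distance
# (a simple proxy for the weakest radio link).
from itertools import product
from math import hypot

sensors = [(1, 1), (4, 2), (2, 5), (5, 5)]   # invented sensor positions
grid = product(range(6), range(6))            # candidate head positions

def worst_link(head):
    return max(hypot(head[0] - x, head[1] - y) for x, y in sensors)

best = min(grid, key=worst_link)
print(best)
```

    Exhaustive search is viable here because the candidate set is tiny; its cost grows with the grid size, which is the usual limitation of the brute force algorithm.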

  13. The ASTRA Toolbox: A platform for advanced algorithm development in electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Aarle, Wim van, E-mail: wim.vanaarle@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Palenstijn, Willem Jan, E-mail: willemjan.palenstijn@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde & Informatica, Science Park 123, NL-1098 XG Amsterdam (Netherlands); De Beenhouwer, Jan, E-mail: jan.debeenhouwer@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Altantzis, Thomas, E-mail: thomas.altantzis@uantwerpen.be [Electron Microscopy for Materials Science, University of Antwerp, Groenenborgerlaan 171, B-2020 Wilrijk (Belgium); Bals, Sara, E-mail: sara.bals@uantwerpen.be [Electron Microscopy for Materials Science, University of Antwerp, Groenenborgerlaan 171, B-2020 Wilrijk (Belgium); Batenburg, K. Joost, E-mail: joost.batenburg@cwi.nl [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde & Informatica, Science Park 123, NL-1098 XG Amsterdam (Netherlands); Mathematical Institute, Leiden University, P.O. Box 9512, NL-2300 RA Leiden (Netherlands); Sijbers, Jan, E-mail: jan.sijbers@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium)

    2015-10-15

    We present the ASTRA Toolbox as an open platform for 3D image reconstruction in tomography. Most of the software tools that are currently used in electron tomography offer limited flexibility with respect to the geometrical parameters of the acquisition model and the algorithms used for reconstruction. The ASTRA Toolbox provides an extensive set of fast and flexible building blocks that can be used to develop advanced reconstruction algorithms, effectively removing these limitations. We demonstrate this flexibility, the resulting reconstruction quality, and the computational efficiency of this toolbox by a series of experiments, based on experimental dual-axis tilt series. - Highlights: • The ASTRA Toolbox is an open platform for 3D image reconstruction in tomography. • Advanced reconstruction algorithms can be prototyped using the fast and flexible building blocks. • This flexibility is demonstrated on a common use case: dual-axis tilt series reconstruction with prior knowledge. • The computational efficiency is validated on an experimentally measured tilt series.
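
    The "building blocks" such a toolbox composes are forward projection, backprojection, and iterative update steps. The deliberately tiny sketch below shows the algebraic idea with Kaczmarz/ART iterations reconstructing a 2x2 "image" from its row and column ray sums; the geometry and data are invented and bear no relation to ASTRA's actual API.

```python
# Assumed miniature of algebraic reconstruction (Kaczmarz/ART):
# recover a 2x2 image from four ray sums by cyclic projections.

rays = [[1, 1, 0, 0], [0, 0, 1, 1],   # row sums
        [1, 0, 1, 0], [0, 1, 0, 1]]   # column sums
b = [3, 7, 4, 6]                      # projections of the true image [1,2,3,4]

x = [0.0] * 4
for _ in range(200):                  # ART sweeps over all rays
    for a, bi in zip(rays, b):
        residual = bi - sum(ai * xi for ai, xi in zip(a, x))
        norm = sum(ai * ai for ai in a)
        # project the current estimate onto the hyperplane of this ray
        x = [xi + residual * ai / norm for ai, xi in zip(a, x)]
print([round(v, 3) for v in x])
```

    Real toolboxes supply exactly these pieces (projectors, geometries, update rules) as fast GPU primitives so that researchers can assemble and modify such algorithms at scale.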

  14. Reinforcement Toolbox, a Parametric Reinforcement Modelling Tool for Curved Surface Structures

    NARCIS (Netherlands)

    Lauppe, J.; Rolvink, A.; Coenders, J.L.

    2013-01-01

    This paper presents a computational strategy and parametric modelling toolbox which aim at enhancing the design- and production process of reinforcement in freeform curved surface structures. The computational strategy encompasses the necessary steps of raising an architectural curved surface model

  15. A microfluidic toolbox for the development of in-situ product removal strategies in biocatalysis

    DEFF Research Database (Denmark)

    Heintz, Søren; Mitic, Aleksandar; Ringborg, Rolf Hoffmeyer

    2016-01-01

    A microfluidic toolbox for accelerated development of biocatalytic processes has great potential. This is especially the case for the development of advanced biocatalytic process concepts, where reactors and product separation methods are closely linked together to intensify the process performan...

  16. A toolbox for safety instrumented system evaluation based on improved continuous-time Markov chain

    Science.gov (United States)

    Wardana, Awang N. I.; Kurniady, Rahman; Pambudi, Galih; Purnama, Jaka; Suryopratomo, Kutut

    2017-08-01

    A safety instrumented system (SIS) is designed to restore a plant to a safe condition when a pre-hazardous event occurs. It has a vital role, especially in the process industries. A SIS shall meet its safety requirement specifications, and to confirm this, the SIS shall be evaluated. Typically, the evaluation is calculated by hand. This paper presents a toolbox for SIS evaluation, developed on the basis of an improved continuous-time Markov chain, which supports a detailed evaluation approach. The paper also illustrates an industrial application of the toolbox to evaluate the arch burner safety system of a primary reformer. The results of the case study demonstrate that the toolbox can be used to evaluate an industrial SIS in detail and to plan the maintenance strategy.
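
    A minimal continuous-time Markov chain evaluation can be sketched as follows. This is an assumed two-state example (one repairable channel with invented failure and repair rates), integrated with the Kolmogorov forward equations; it is far simpler than the toolbox's improved CTMC model.

```python
# Assumed CTMC sketch: single repairable channel, states [working, failed].
# Forward equations:  p0' = -lam*p0 + mu*p1,  p1' = -p0'  (probabilities sum to 1)

lam, mu = 1e-3, 1e-1          # failure and repair rates per hour (invented)
p = [1.0, 0.0]                # start in the working state
dt, t_end = 0.1, 200.0
for _ in range(int(t_end / dt)):        # explicit Euler integration
    dp0 = -lam * p[0] + mu * p[1]
    p = [p[0] + dp0 * dt, p[1] - dp0 * dt]

steady = lam / (lam + mu)     # analytic steady-state unavailability
print(round(p[1], 4), round(steady, 4))
```

    The transient probability of the failed state is the quantity a SIS evaluation compares against the safety requirement specification (e.g. a PFD target); here it has already relaxed to its analytic steady-state value.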

  17. Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox

    Directory of Open Access Journals (Sweden)

    David M. Zalk

    2011-06-01

    Conclusion: The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.

  18. The ASTRA Toolbox: A platform for advanced algorithm development in electron tomography

    International Nuclear Information System (INIS)

    Aarle, Wim van; Palenstijn, Willem Jan; De Beenhouwer, Jan; Altantzis, Thomas; Bals, Sara; Batenburg, K. Joost; Sijbers, Jan

    2015-01-01

    We present the ASTRA Toolbox as an open platform for 3D image reconstruction in tomography. Most of the software tools that are currently used in electron tomography offer limited flexibility with respect to the geometrical parameters of the acquisition model and the algorithms used for reconstruction. The ASTRA Toolbox provides an extensive set of fast and flexible building blocks that can be used to develop advanced reconstruction algorithms, effectively removing these limitations. We demonstrate this flexibility, the resulting reconstruction quality, and the computational efficiency of this toolbox by a series of experiments, based on experimental dual-axis tilt series. - Highlights: • The ASTRA Toolbox is an open platform for 3D image reconstruction in tomography. • Advanced reconstruction algorithms can be prototyped using the fast and flexible building blocks. • This flexibility is demonstrated on a common use case: dual-axis tilt series reconstruction with prior knowledge. • The computational efficiency is validated on an experimentally measured tilt series
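    ASTRA's actual building blocks are fast projection and reconstruction operators; purely as an illustration of the kind of iterative algebraic reconstruction such toolboxes support, here is a toy Kaczmarz (ART) solver on a hypothetical two-ray, two-pixel "projection" system (not ASTRA's API):

```python
# Toy illustration of algebraic reconstruction (Kaczmarz/ART), the kind of
# iterative scheme tomography toolboxes such as ASTRA build on.
# The 2x2 "projection" system here is made up, not an ASTRA example.

def kaczmarz(A, b, sweeps=200):
    """Cyclically project the estimate onto the hyperplane of each ray."""
    x = [0.0] * len(A[0])
    for _ in range(sweeps):
        for row, bi in zip(A, b):
            norm2 = sum(a * a for a in row)
            resid = (bi - sum(a * xj for a, xj in zip(row, x))) / norm2
            x = [xj + resid * a for a, xj in zip(row, x)]
    return x

# Two "rays" through a two-pixel object: a row sum and a weighted sum.
A = [[1.0, 1.0],
     [1.0, 2.0]]
b = [3.0, 5.0]          # consistent with the object x = (1, 2)
x = kaczmarz(A, b)
print(x)  # converges towards [1.0, 2.0]
```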

  19. Radiation protection optimisation techniques and their application in industry

    Energy Technology Data Exchange (ETDEWEB)

    Lefaure, C

    1997-12-31

    Since International Commission on Radiological Protection (ICRP) Publication 60, the optimisation principle has been the core of the radiation protection system. Applying it in practice means implementing an approach that is both predictive and evolutionary, and that relies essentially on a prudent and responsible state of mind. The formal expression of this process, called the optimisation procedure, implies an indispensable tool for its implementation: the system of monetary values for the unit of collective dose. During the last few years it has become clear that applying the ALARA principle means that a global work management approach must be adopted, considering together all factors contributing to radiation dose. In the nuclear field, the ALARA approach appears to be most successful when implemented in the framework of a managerial approach, through structured ALARA programmes. Outside the nuclear industry it is necessary to clearly define priorities through generic optimisation studies and ALARA audits. At the international level, much effort remains to be done to extend the ALARA process efficiently to internal exposure as well as to public exposure. (author) 2 graphs, 5 figs., 3 tabs.
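    The cost-benefit side of the optimisation procedure described above can be made concrete with a small sketch: choose the protection option minimising protection cost plus the monetised collective dose. The options and the monetary value per person-sievert below are invented for illustration, not values from the paper:

```python
# Minimal sketch of the ALARA cost-benefit idea: pick the protection option
# whose protection cost plus monetised collective dose is lowest.
# All numbers below are hypothetical illustrations.

ALPHA = 3000.0  # assumed monetary value of one person-sievert

options = [
    {"name": "no extra shielding",  "cost": 0.0,    "dose_personSv": 0.50},
    {"name": "temporary shielding", "cost": 800.0,  "dose_personSv": 0.20},
    {"name": "full enclosure",      "cost": 5000.0, "dose_personSv": 0.05},
]

def total_cost(opt):
    # Protection cost plus monetised detriment of the residual collective dose.
    return opt["cost"] + ALPHA * opt["dose_personSv"]

best = min(options, key=total_cost)
print(best["name"], total_cost(best))
```

With these invented numbers the intermediate option wins: spending more than the monetised dose reduction is worth (the full enclosure) is, by this criterion, not ALARA.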

  20. Radiation protection optimisation techniques and their application in industry

    International Nuclear Information System (INIS)

    Lefaure, C.

    1996-01-01

    Since International Commission on Radiological Protection (ICRP) Publication 60, the optimisation principle has been the core of the radiation protection system. Applying it in practice means implementing an approach that is both predictive and evolutionary, and that relies essentially on a prudent and responsible state of mind. The formal expression of this process, called the optimisation procedure, implies an indispensable tool for its implementation: the system of monetary values for the unit of collective dose. During the last few years it has become clear that applying the ALARA principle means that a global work management approach must be adopted, considering together all factors contributing to radiation dose. In the nuclear field, the ALARA approach appears to be most successful when implemented in the framework of a managerial approach, through structured ALARA programmes. Outside the nuclear industry it is necessary to clearly define priorities through generic optimisation studies and ALARA audits. At the international level, much effort remains to be done to extend the ALARA process efficiently to internal exposure as well as to public exposure. (author)

  1. Radiation protection optimisation techniques and their application in industry

    Energy Technology Data Exchange (ETDEWEB)

    Lefaure, C

    1996-12-31

    Since International Commission on Radiological Protection (ICRP) Publication 60, the optimisation principle has been the core of the radiation protection system. Applying it in practice means implementing an approach that is both predictive and evolutionary, and that relies essentially on a prudent and responsible state of mind. The formal expression of this process, called the optimisation procedure, implies an indispensable tool for its implementation: the system of monetary values for the unit of collective dose. During the last few years it has become clear that applying the ALARA principle means that a global work management approach must be adopted, considering together all factors contributing to radiation dose. In the nuclear field, the ALARA approach appears to be most successful when implemented in the framework of a managerial approach, through structured ALARA programmes. Outside the nuclear industry it is necessary to clearly define priorities through generic optimisation studies and ALARA audits. At the international level, much effort remains to be done to extend the ALARA process efficiently to internal exposure as well as to public exposure. (author) 2 graphs, 5 figs., 3 tabs.

  2. Non-stationary Condition Monitoring of large diesel engines with the AEWATT toolbox

    DEFF Research Database (Denmark)

    Pontoppidan, Niels Henrik; Larsen, Jan; Sigurdsson, Sigurdur

    2005-01-01

    We are developing a specialized toolbox for non-stationary condition monitoring of large 2-stroke diesel engines based on acoustic emission measurements. The main contribution of this toolbox has so far been the utilization of adaptive linear models such as Principal and Independent Component Analysis. ... The inversion of those angular timing changes, called "event alignment", has allowed for condition monitoring across operation load settings, successfully enabling a single model to be used with realistic data under varying operational conditions. ...

  3. Décomposition-coordination en optimisation déterministe et stochastique

    CERN Document Server

    Carpentier, Pierre

    2017-01-01

    This book addresses the treatment of large-scale optimisation problems. The idea is to split the global optimisation problem into smaller sub-problems that are easier to solve, each involving one of the subsystems (decomposition), without giving up the global optimum, which requires the use of an iterative procedure (coordination). This subject was treated in several books published in the 1970s in the context of deterministic optimisation. Here we present the essential principles and methods of decomposition-coordination through typical situations, and then propose a general framework for constructing correct algorithms and studying their convergence. This theory is presented in the context of both deterministic and stochastic optimisation. The material has been taught by the authors in various graduate courses and has also been put to work in numerous industrial applications. Exerc...

  4. Application of Surpac and Whittle Software in Open Pit Optimisation ...

    African Journals Online (AJOL)

    Application of Surpac and Whittle Software in Open Pit Optimisation and Design. ... This paper studies the Surpac and Whittle software and their application in designing an optimised pit.

  5. (MBO) algorithm in multi-reservoir system optimisation

    African Journals Online (AJOL)

    A comparative study of marriage in honey bees optimisation (MBO) algorithm in ... A practical application of the marriage in honey bees optimisation (MBO) ... to those of other evolutionary algorithms, such as the genetic algorithm (GA), ant ...

  6. Optimisation of logistics processes of energy grass collection

    Science.gov (United States)

    Bányai, Tamás.

    2010-05-01

    ... The objective function of the optimisation is the maximisation of profit, that is, of the difference between revenue and cost. It trades off the income of the assigned transportation demands against the logistics costs. The constraints are the following: (1) the free capacity of the assigned transportation resource is not less than the requested capacity of the transportation demand; (2) the calculated arrival time of the transportation resource at the harvesting place is not later than its requested arrival time; (3) the calculated arrival time of the transportation demand at the processing and production facility is not later than the requested arrival time; (4) one transportation demand is assigned to one transportation resource, and one resource is assigned to one transportation demand. The decision variables of the optimisation problem are the scheduling variables and the assignment of resources to transportation demands. The evaluation parameters of the optimised system are the following: total cost of the collection process; utilisation of transportation resources and warehouses; and efficiency of production and/or processing facilities. The multidimensional heuristic optimisation method is based on a genetic algorithm, while the routing sequence is optimised with an ant colony algorithm: the optimal routes are calculated by the ant colony algorithm as a subroutine of the global optimisation method, and the optimal assignment is given by the genetic algorithm. An important part of the mathematical method is the sensitivity analysis of the objective function, which shows the influence of the different input parameters. Acknowledgements: This research was implemented within the frame of the project entitled "Development and operation of the Technology and Knowledge Transfer Centre of the University of Miskolc", with support from the European Union and co-funding from the European Social Fund.
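    The assignment problem described above can be made concrete with a toy instance. The paper couples a genetic algorithm with an ant colony routing subroutine; this sketch instead brute-forces a hypothetical three-demand, two-resource instance, just to show the profit objective and the capacity/assignment constraints:

```python
# Hypothetical miniature of the assignment problem: each transportation demand
# goes to exactly one resource, subject to capacity limits, maximising revenue
# minus logistics cost. The instance data are invented; a real solver would use
# a metaheuristic (the paper uses a GA with an ant colony routing subroutine).

from itertools import product

demands = [  # (requested capacity, revenue)
    (4, 120.0),
    (2, 70.0),
    (5, 150.0),
]
resources = [  # (free capacity, cost per unit of carried capacity)
    (6, 10.0),
    (5, 12.0),
]

def profit(assignment):
    """assignment[i] = resource index serving demand i."""
    load = [0] * len(resources)
    total = 0.0
    for (cap, revenue), r in zip(demands, assignment):
        load[r] += cap
        total += revenue - cap * resources[r][1]
    if any(load[r] > resources[r][0] for r in range(len(resources))):
        return float("-inf")  # capacity constraint violated
    return total

best = max(product(range(len(resources)), repeat=len(demands)), key=profit)
print(best, profit(best))
```

On this tiny instance only one assignment respects both capacities, so the brute-force search returns it directly; the point of the paper's GA/ACO machinery is that real instances are far too large to enumerate.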

  7. Extending Particle Swarm Optimisers with Self-Organized Criticality

    DEFF Research Database (Denmark)

    Løvbjerg, Morten; Krink, Thiemo

    2002-01-01

    Particle swarm optimisers (PSOs) show potential in function optimisation, but still have room for improvement. Self-organized criticality (SOC) can help control the PSO and add diversity. Extending the PSO with SOC seems promising, reaching faster convergence and better solutions.
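    For reference, a minimal global-best PSO (without the SOC extension the paper proposes) on the sphere function might look like this; the coefficient values are common textbook choices, not the paper's settings:

```python
# Baseline global-best PSO on the sphere function: a minimal sketch of the
# optimiser the paper extends (the SOC mechanism itself is not shown here).

import random

def pso(dim=2, particles=20, iters=300, seed=1):
    rng = random.Random(seed)
    f = lambda x: sum(xi * xi for xi in x)       # sphere benchmark
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    V = [[0.0] * dim for _ in range(particles)]
    P = [x[:] for x in X]                        # personal bests
    g = min(P, key=f)[:]                         # global best
    w, c1, c2 = 0.72, 1.49, 1.49                 # common coefficient choices
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            if f(X[i]) < f(P[i]):
                P[i] = X[i][:]
                if f(P[i]) < f(g):
                    g = P[i][:]
    return g, f(g)

best, fitness = pso()
print(fitness)  # should be very close to the optimum 0.0
```

Premature convergence, the problem SOC is meant to counteract, shows up when all particles collapse onto the global best too early on multimodal functions; the sphere function above is unimodal, so the baseline converges without trouble.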

  8. Operational Radiological Protection and Aspects of Optimisation

    International Nuclear Information System (INIS)

    Lazo, E.; Lindvall, C.G.

    2005-01-01

    Since 1992, the Nuclear Energy Agency (NEA), along with the International Atomic Energy Agency (IAEA), has sponsored the Information System on Occupational Exposure (ISOE). ISOE collects and analyses occupational exposure data and experience from over 400 nuclear power plants around the world and is a forum for radiological protection experts from both nuclear power plants and regulatory authorities to share lessons learned and best practices in the management of worker radiation exposures. In connection to the ongoing work of the International Commission on Radiological Protection (ICRP) to develop new recommendations, the ISOE programme has been interested in how the new recommendations would affect operational radiological protection application at nuclear power plants. Bearing in mind that the ICRP is developing, in addition to new general recommendations, a new recommendation specifically on optimisation, the ISOE programme created a working group to study the operational aspects of optimisation, and to identify the key factors in optimisation that could usefully be reflected in ICRP recommendations. In addition, the Group identified areas where further ICRP clarification and guidance would be of assistance to practitioners, both at the plant and the regulatory authority. The specific objective of this ISOE work was to provide operational radiological protection input, based on practical experience, to the development of new ICRP recommendations, particularly in the area of optimisation. This will help assure that new recommendations will best serve the needs of those implementing radiation protection standards, for the public and for workers, at both national and international levels. (author)

  9. Optimisation of surgical care for rectal cancer

    NARCIS (Netherlands)

    Borstlap, W.A.A.

    2017-01-01

    Optimisation of surgical care means weighing the risk of treatment related morbidity against the patients’ potential benefits of a surgical intervention. The first part of this thesis focusses on the anaemic patient undergoing colorectal surgery. Hypothesizing that a more profound haemoglobin

  10. On optimal development and becoming an optimiser

    NARCIS (Netherlands)

    de Ruyter, D.J.

    2012-01-01

    The article aims to provide a justification for the claim that optimal development and becoming an optimiser are educational ideals that parents should pursue in raising their children. Optimal development is conceptualised as enabling children to grow into flourishing persons, that is persons who

  11. Particle Swarm Optimisation with Spatial Particle Extension

    DEFF Research Database (Denmark)

    Krink, Thiemo; Vesterstrøm, Jakob Svaneborg; Riget, Jacques

    2002-01-01

    In this paper, we introduce spatial extension to particles in the PSO model in order to overcome premature convergence in iterative optimisation. The standard PSO and the new model (SEPSO) are compared w.r.t. performance on well-studied benchmark problems. We show that the SEPSO indeed managed...

  12. OPTIMISATION OF COMPRESSIVE STRENGTH OF PERIWINKLE ...

    African Journals Online (AJOL)

    In this paper, a regression model is developed to predict and optimise the compressive strength of periwinkle shell aggregate concrete using Scheffe's regression theory. The results obtained from the derived regression model agreed favourably with the experimental data. The model was tested for adequacy using a student ...

  13. An efficient optimisation method in groundwater resource ...

    African Journals Online (AJOL)

    DRINIE

    2003-10-04

    Oct 4, 2003 ... theories developed in the field of stochastic subsurface hydrology. In reality, many ... Recently, some researchers have applied the multi-stage ... Then a robust solution of the optimisation problem given by Eqs. (1) to (3) is as ...

  14. Water distribution systems design optimisation using metaheuristics ...

    African Journals Online (AJOL)

    The topic of multi-objective water distribution systems (WDS) design optimisation using metaheuristics is investigated, comparing numerous modern metaheuristics, including several multi-objective evolutionary algorithms, an estimation of distribution algorithm and a recent hyperheuristic named AMALGAM (an evolutionary ...

  15. Optimisation of efficiency of axial fans

    NARCIS (Netherlands)

    Kruyt, Nicolaas P.; Pennings, P.C.; Faasen, R.

    2014-01-01

    A three-stage research project has been executed to develop ducted axial-fans with increased efficiency. In the first stage a design method has been developed in which various conflicting design criteria can be incorporated. Based on this design method, an optimised design has been determined

  16. Thermodynamic optimisation of a heat exchanger

    NARCIS (Netherlands)

    Cornelissen, Rene; Hirs, Gerard

    1999-01-01

    The objective of this paper is to show that for the optimal design of an energy system, where there is a trade-off between exergy saving during operation and exergy use during construction of the energy system, exergy analysis and life cycle analysis should be combined. An exergy optimisation of a

  17. Self-optimising control of sewer systems

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Montero-Castro, Ignacio; Mollerup, Ane Loft

    2013-01-01

    ... The definition of optimal performance was carried out through a two-stage optimisation (stochastic and deterministic) to take into account both the overflow during the current rain event and the expected overflow, given the probability of a future rain event. The methodology is successfully applied...

  18. The Basic Radar Altimetry Toolbox for Sentinel 3 Users

    Science.gov (United States)

    Lucas, Bruno; Rosmorduc, Vinca; Niemeijer, Sander; Bronner, Emilie; Dinardo, Salvatore; Benveniste, Jérôme

    2013-04-01

    The Basic Radar Altimetry Toolbox (BRAT) is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. The project started in 2006 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales); the latest version of the software, 3.1, was released in March 2012. The tools enable users to interact with the most common altimetry data formats, the most used entry point being the graphical user interface (BratGui). This GUI is a front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing users to obtain the desired data while bypassing the data-formatting hassle. BratDisplay (the graphic visualizer) can be launched from BratGui, or used as a stand-alone tool to visualize netCDF files; it is distributed with another ESA toolbox (GUT) as its visualizer. The most frequent uses of BRAT are teaching remote sensing, altimetry data reading (all missions from ERS-1 to SARAL, and soon Sentinel-3), quick data visualization/export, and simple computation on the data fields. BRAT can be used for importing data and taking a quick look at its contents, with several different types of plotting available. One can also use it to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BratGui involving combinations of data fields, which the user can save for later reuse, or using the already embedded formulas that include the standard oceanographic altimetry formulas (MSS, SSH, MSLA, editing of spurious data, etc.). The documentation collection includes the standard user manual explaining all the ways to interact with the set of software tools, but the most important item is the Radar Altimeter Tutorial, which contains a strong introduction to...

  19. Optimising Shovel-Truck Fuel Consumption using Stochastic ...

    African Journals Online (AJOL)

    Optimising the fuel consumption and truck waiting time can result in significant fuel savings. The paper demonstrates that stochastic simulation is an effective tool for optimising the utilisation of fossil-based fuels in mining and related industries. Keywords: Stochastic, Simulation Modelling, Mining, Optimisation, Shovel-Truck ...

  20. Design of optimised backstepping controller for the synchronisation ...

    Indian Academy of Sciences (India)

    Ehsan Fouladi

    2017-12-18

    Dec 18, 2017 ... for the proposed optimised method compared to PSO optimised controller or any non-optimised backstepping controller. Keywords. Colpitts oscillator; backstepping controller; chaos synchronisation; shark smell algorithm; particle .... The velocity model is based on the gradient of the objective function, tilting ...

  1. Efficient topology optimisation of multiscale and multiphysics problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    The aim of this Thesis is to present efficient methods for optimising high-resolution problems of a multiscale and multiphysics nature. The Thesis consists of two parts: one treating topology optimisation of microstructural details and the other treating topology optimisation of conjugate heat...

  2. Versatile Cas9-Driven Subpopulation Selection Toolbox for Lactococcus lactis.

    Science.gov (United States)

    van der Els, Simon; James, Jennelle K; Kleerebezem, Michiel; Bron, Peter A

    2018-04-15

    CRISPR-Cas9 technology has been exploited for the removal or replacement of genetic elements in a wide range of prokaryotes and eukaryotes. Here, we describe the extension of the Cas9 application toolbox to the industrially important dairy species Lactococcus lactis. The Cas9 expression vector pLABTarget, encoding the Streptococcus pyogenes Cas9 under the control of a constitutive promoter, was constructed, allowing plug-and-play introduction of short guide RNA (sgRNA) sequences to target specific genetic loci. Introduction of a pepN-targeting derivative of pLABTarget into L. lactis strain MG1363 led to a strong reduction in the number of transformants obtained, which did not occur in a pepN deletion derivative of the same strain, demonstrating the specificity and lethality of the Cas9-mediated double-strand breaks in the lactococcal chromosome. Moreover, the same pLABTarget derivative allowed the selection of a pepN deletion subpopulation from its corresponding single-crossover plasmid integrant precursor, accelerating the construction and selection of gene-specific deletion derivatives in L. lactis. Finally, pLABTarget, which contained sgRNAs designed to target mobile genetic elements, allowed the effective curing of plasmids, prophages, and integrative conjugative elements (ICEs). These results establish that pLABTarget enables the effective exploitation of Cas9 targeting in L. lactis, while the broad-host-range vector used suggests that this toolbox could readily be expanded to other Gram-positive bacteria. IMPORTANCE: Mobile genetic elements in Lactococcus lactis and other lactic acid bacteria (LAB) play an important role in dairy fermentation, having both positive and detrimental effects during the production of fermented dairy products. The pLABTarget vector offers an efficient cloning platform for Cas9 application in lactic acid bacteria. Targeting Cas9 toward mobile genetic elements enabled their effective curing, which is of particular interest in the...
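    The sgRNA targeting rule this work relies on (a 20-nt protospacer immediately followed by an NGG PAM for S. pyogenes Cas9) is easy to sketch; the scanning helper and the sequence below are made up for illustration, are not part of pLABTarget, and search only the forward strand:

```python
# Illustrative helper (not part of pLABTarget): scan a sequence for SpCas9
# target sites, i.e. a 20-nt protospacer immediately followed by an NGG PAM.
# The example sequence is invented; only the forward strand is searched.

def find_spcas9_sites(seq, protospacer_len=20):
    seq = seq.upper()
    sites = []
    # Last valid start leaves room for the protospacer plus a 3-nt PAM.
    for i in range(len(seq) - protospacer_len - 2):
        pam = seq[i + protospacer_len: i + protospacer_len + 3]
        if pam[1:] == "GG":  # NGG: any base followed by two guanines
            sites.append((i, seq[i:i + protospacer_len], pam))
    return sites

seq = "ATGCTTACGGATCCATTGCAAGCTTGACGGTTAACCGG"
for pos, protospacer, pam in find_spcas9_sites(seq):
    print(pos, protospacer, pam)
```

A practical guide-design tool would also scan the reverse complement and score off-target matches; this sketch only makes the PAM rule concrete.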

  3. ObsPy - A Python Toolbox for Seismology - and Applications

    Science.gov (United States)

    Krischer, L.; Megies, T.; Barsch, R.; MacCarthy, J.; Lecocq, T.; Koymans, M. R.; Carothers, L.; Eulenfeld, T.; Reyes, C. G.; Falco, N.; Sales de Andrade, E.

    2017-12-01

    Recent years witnessed the evolution of Python's ecosystem into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It is a Python toolbox offering: read and write support for essentially every commonly used data format in seismology, with a unified interface and automatic format detection, including waveform data (MiniSEED, SAC, SEG-Y, Reftek, ...) as well as station (SEED, StationXML, SC3ML, ...) and event meta information (QuakeML, ZMAP, ...); integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, ...); a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality such as travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than eight years and is developed and used by scientists around the world, with successful applications in all branches of seismology. Additionally, it nowadays serves as the foundation for a large number of more specialized packages. The newest features include: full interoperability of SEED and StationXML/Inventory objects; access to the Nominal Response Library (NRL) for easy and quick creation of station metadata from scratch; support for the IRIS Federated Catalog Service; improved performance of the EarthWorm client; several improvements to the MiniSEED read/write module; improved plotting capabilities for PPSD (spectrograms, PSD of discrete frequencies over time, ...); and support for reading ArcLink Inventory XML, reading the Reftek data format, writing SeisComp3 ML (SC3ML), and writing the StationTXT format. This presentation will give a short overview of the capabilities of ObsPy and point out several representative or new use cases, show-casing some projects that are based on ObsPy, e.g.: seismo...

  4. Analysis and Optimisation of Carcass Production for Flexible Pipes

    DEFF Research Database (Denmark)

    Nielsen, Peter Søe

    Un-bonded flexible pipes are used in the offshore oil and gas industry worldwide transporting hydrocarbons from seafloor to floating production vessels topside. Flexible pipes are advantageous over rigid pipelines in dynamic applications and during installation as they are delivered in full length......-axial tension FLC points were attained. Analysis of weld fracture of duplex stainless steel EN 1.4162 is carried out determining strains with GOM ARAMIS automated strain measurement system, which shows that strain increases faster in the weld zone than the global strain of the parent material. Fracture...... is the analysis and optimisation of the carcass manufacturing process by means of a fundamental investigation in the fields of formability, failure modes / mechanisms, Finite Element Analysis (FEA), simulative testing and tribology. A study of failure mechanisms in carcass production is performed by being present...

  5. Software Toolbox for Low-Frequency Conductivity and Current Density Imaging Using MRI.

    Science.gov (United States)

    Sajib, Saurav Z K; Katoch, Nitish; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je

    2017-11-01

    Low-frequency conductivity and current density imaging using MRI includes magnetic resonance electrical impedance tomography (MREIT), diffusion tensor MREIT (DT-MREIT), conductivity tensor imaging (CTI), and magnetic resonance current density imaging (MRCDI). MRCDI and MREIT provide current density and isotropic conductivity images, respectively, using current-injection phase MRI techniques. DT-MREIT produces anisotropic conductivity tensor images by incorporating diffusion weighted MRI into MREIT. These current-injection techniques are finding clinical applications in diagnostic imaging and also in transcranial direct current stimulation (tDCS), deep brain stimulation (DBS), and electroporation, where treatment currents can function as imaging currents. To avoid adverse effects of nerve and muscle stimulation due to injected currents, conductivity tensor imaging (CTI) utilizes B1 mapping and multi-b diffusion weighted MRI to produce low-frequency anisotropic conductivity tensor images without injecting current. This paper describes numerical implementations of several key mathematical functions for conductivity and current density image reconstructions in MRCDI, MREIT, DT-MREIT, and CTI. To facilitate experimental studies of clinical applications, we developed a software toolbox for these low-frequency conductivity and current density imaging methods. This MR-based conductivity imaging (MRCI) toolbox includes 11 toolbox functions which can be used in the MATLAB environment. The MRCI toolbox is available at http://iirc.khu.ac.kr/software.html. Its functions were tested by using several experimental datasets, which are provided together with the toolbox. Users of the toolbox can focus on experimental designs and interpretations of reconstructed images instead of developing their own image reconstruction software. We expect more toolbox functions to be added from future research outcomes.

  6. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave.

    Science.gov (United States)

    Silva, Ikaro; Moody, George B

    The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox provides access to over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.

  7. ALEA: a toolbox for allele-specific epigenomics analysis.

    Science.gov (United States)

    Younesy, Hamid; Möller, Torsten; Heravi-Moussavi, Alireza; Cheng, Jeffrey B; Costello, Joseph F; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2014-04-15

    The assessment of expression and epigenomic status using sequencing based methods provides an unprecedented opportunity to identify and correlate allelic differences with epigenomic status. We present ALEA, a computational toolbox for allele-specific epigenomics analysis, which incorporates allelic variation data within existing resources, allowing for the identification of significant associations between epigenetic modifications and specific allelic variants in human and mouse cells. ALEA provides a customizable pipeline of command line tools for allele-specific analysis of next-generation sequencing data (ChIP-seq, RNA-seq, etc.) that takes the raw sequencing data and produces separate allelic tracks ready to be viewed on genome browsers. The pipeline has been validated using human and hybrid mouse ChIP-seq and RNA-seq data. The package, test data and usage instructions are available online at http://www.bcgsc.ca/platform/bioinfo/software/alea. Contact: mkarimi1@interchange.ubc.ca or sjones@bcgsc.ca. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. ObsPy: A Python Toolbox for Seismology

    Science.gov (United States)

    Krischer, Lion; Megies, Tobias; Sales de Andrade, Elliott; Barsch, Robert; MacCarthy, Jonathan

    2017-04-01

    In recent years the Python ecosystem evolved into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It does so by offering Read and write support for essentially every commonly used data format in seismology with a unified interface and automatic format detection. This includes waveform data (MiniSEED, SAC, SEG-Y, Reftek, …) as well as station (SEED, StationXML, …) and event meta information (QuakeML, ZMAP, …). Integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, ...). A powerful signal processing toolbox tuned to the specific needs of seismologists. Utility functionality like travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than seven years and is developed and used by scientists around the world with successful applications in all branches of seismology. Additionally it nowadays serves as the foundation for a large number of more specialized packages. This presentation will give a short overview of the capabilities of ObsPy and point out several representative or new use cases. Additionally we will discuss the road ahead as well as the long-term sustainability of open-source scientific software.

  9. A Statistical Toolbox For Mining And Modeling Spatial Data

    Directory of Open Access Journals (Sweden)

    D’Aubigny Gérard

    2016-12-01

    Most data mining projects in spatial economics start with an evaluation of a set of attribute variables on a sample of spatial entities, looking for the existence and strength of spatial autocorrelation based on Moran's and Geary's coefficients, the adequacy of which is rarely challenged, despite the fact that many users reporting on their properties seem likely to make mistakes and to foster confusion. My paper begins with a critical appraisal of the classical definition and rationale of these indices. I argue that while intuitively founded, they are plagued by an inconsistency in their conception. Then, I propose a principled small change leading to corrected spatial autocorrelation coefficients, which strongly simplifies their relationship, and opens the way to an augmented toolbox of statistical methods of dimension reduction and data visualization, also useful for modeling purposes. A second section presents a formal framework, adapted from recent work in statistical learning, which gives theoretical support to our definition of corrected spatial autocorrelation coefficients. More specifically, the multivariate data mining methods presented here are easily implemented with existing (free) software and are useful for exploiting the proposed corrections in spatial data analysis practice; from a mathematical point of view, their asymptotic behavior, already studied in a series of papers by Belkin & Niyogi, suggests robustness and a limited sensitivity to the Modifiable Areal Unit Problem (MAUP), valuable in exploratory spatial data analysis.
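
For reference, the classical Moran's I that the paper takes as its starting point (and critiques), I = (n / S0) · Σij wij zi zj / Σi zi² with zi = xi − x̄ and S0 = Σij wij, can be computed directly from a spatial weight matrix; a minimal sketch, not the paper's corrected coefficient:

```python
def morans_i(x, w):
    """Classical (uncorrected) Moran's I for attribute values x and a
    spatial weight matrix w, where w[i][j] weights the link between
    spatial units i and j."""
    n = len(x)
    mean = sum(x) / n
    z = [v - mean for v in x]                       # centred values
    s0 = sum(sum(row) for row in w)                 # total weight
    num = sum(w[i][j] * z[i] * z[j] for i in range(n) for j in range(n))
    den = sum(v * v for v in z)
    return (n / s0) * (num / den)
```

On a path of four units with values (1, 1, −1, −1) and rook contiguity weights, I = 1/3, reflecting mild positive autocorrelation.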

  10. MATLAB Stability and Control Toolbox Trim and Static Stability Module

    Science.gov (United States)

    Kenny, Sean P.; Crespo, Luis

    2012-01-01

    MATLAB Stability and Control Toolbox (MASCOT) utilizes geometric, aerodynamic, and inertial inputs to calculate air vehicle stability in a variety of critical flight conditions. The code is based on fundamental, nonlinear equations of motion and is able to translate results into a qualitative, graphical scale useful to the non-expert. MASCOT was created to provide the conceptual aircraft designer with accurate predictions of air vehicle stability and control characteristics. The code takes as input mass property data in the form of an inertia tensor, aerodynamic loading data, and propulsion (i.e. thrust) loading data. Using fundamental nonlinear equations of motion, MASCOT then calculates vehicle trim and static stability data for the desired flight condition(s). Available flight conditions include six horizontal and six landing rotation conditions with varying options for engine out, crosswind, and sideslip, plus three take-off rotation conditions. Results are displayed through a unique graphical interface developed to give the conceptual design engineer who is not a stability and control expert a qualitative scale indicating whether the vehicle has acceptable, marginal, or unacceptable static stability characteristics. If desired, the user can also examine the detailed, quantitative results.
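
The qualitative static-stability scale can be illustrated with the longitudinal static margin, (x_np − x_cg)/MAC. The sketch below is in Python for brevity (MASCOT itself is MATLAB), and the classification thresholds are hypothetical, since the abstract does not publish MASCOT's actual criteria.

```python
def static_margin(x_np, x_cg, mac):
    """Longitudinal static margin: distance from centre of gravity to the
    neutral point, as a fraction of the mean aerodynamic chord (MAC)."""
    return (x_np - x_cg) / mac

def classify(margin, ok=0.05, marginal=0.0):
    """Map a static margin onto a three-level qualitative scale.
    The 5% / 0% thresholds are illustrative placeholders."""
    if margin >= ok:
        return "acceptable"
    if margin >= marginal:
        return "marginal"
    return "unacceptable"
```

A negative margin (CG aft of the neutral point) is statically unstable, which is why it maps to "unacceptable" here.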

  11. SELANSI: a toolbox for simulation of stochastic gene regulatory networks.

    Science.gov (United States)

    Pájaro, Manuel; Otero-Muras, Irene; Vázquez, Carlos; Alonso, Antonio A

    2018-03-01

    Gene regulation is inherently stochastic. In many applications in Systems and Synthetic Biology, such as the reverse engineering and the de novo design of genetic circuits, stochastic effects, though potentially crucial, are often neglected due to the high computational cost of stochastic simulations. With advances in these fields there is an increasing need for tools providing accurate approximations of the stochastic dynamics of gene regulatory networks (GRNs) with reduced computational effort. This work presents SELANSI (SEmi-LAgrangian SImulation of GRNs), a software toolbox for the simulation of stochastic multidimensional gene regulatory networks. SELANSI exploits intrinsic structural properties of gene regulatory networks to accurately approximate the corresponding Chemical Master Equation with a partial integro-differential equation that is solved by a semi-Lagrangian method with high efficiency. Networks under consideration might involve multiple genes with self and cross regulations, in which genes can be regulated by different transcription factors. Moreover, the validity of the method is not restricted to a particular type of kinetics. The tool offers total flexibility regarding network topology, kinetics and parameterization, as well as simulation options. SELANSI runs under the MATLAB environment, and is available under GPLv3 license at https://sites.google.com/view/selansi. Contact: antonio@iim.csic.es. © The Author(s) 2017. Published by Oxford University Press.
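
The semi-Lagrangian idea behind the solver can be shown in one dimension: to advance an advection equation u_t + a·u_x = 0, trace each grid node's characteristic back one step and interpolate the old solution at the departure point. A minimal sketch on a periodic grid, illustrative of the method only and not of SELANSI's actual solver:

```python
import math

def semi_lagrangian_step(u, speed, dx, dt):
    """One semi-Lagrangian step for u_t + speed * u_x = 0 on a periodic
    grid: for each node, trace the characteristic back by speed*dt and
    linearly interpolate the previous solution at the departure point."""
    n = len(u)
    out = []
    for i in range(n):
        xd = (i * dx - speed * dt) / dx        # departure point, in grid units
        j = math.floor(xd)                     # left neighbour of departure point
        frac = xd - j                          # interpolation weight
        out.append((1.0 - frac) * u[j % n] + frac * u[(j + 1) % n])
    return out
```

A key property of the scheme, exploited for efficiency, is that the time step is not limited by a CFL condition on the advection speed.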

  12. Optimisation-based worst-case analysis and anti-windup synthesis for uncertain nonlinear systems

    Science.gov (United States)

    Menon, Prathyush Purushothama

    This thesis describes the development and application of optimisation-based methods for worst-case analysis and anti-windup synthesis for uncertain nonlinear systems. The worst-case analysis methods developed in the thesis are applied to the problem of nonlinear flight control law clearance for highly augmented aircraft. Local, global and hybrid optimisation algorithms are employed to evaluate worst-case violations of a nonlinear response clearance criterion, for a highly realistic aircraft simulation model and flight control law. The reliability and computational overheads associated with different optimisation algorithms are compared, and the capability of optimisation-based approaches to clear flight control laws over continuous regions of the flight envelope is demonstrated. An optimisation-based method for computing worst-case pilot inputs is also developed, and compared with current industrial approaches for this problem. The importance of explicitly considering uncertainty in aircraft parameters when computing worst-case pilot demands is clearly demonstrated. Preliminary results on extending the proposed framework to the problems of limit-cycle analysis and robustness analysis in the presence of time-varying uncertainties are also included. A new method for the design of anti-windup compensators for nonlinear constrained systems controlled using nonlinear dynamics inversion control schemes is presented and successfully applied to some simple examples. An algorithm based on the use of global optimisation is proposed to design the anti-windup compensator. Some conclusions are drawn from the results of the research presented in the thesis, and directions for future work are identified.
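
The optimisation view of worst-case analysis can be sketched as: search the uncertain-parameter box for the point that maximises the violation of a clearance criterion. The stand-in below uses plain random search (the thesis compares local, global and hybrid algorithms; the toy violation function in the usage is hypothetical).

```python
import random

def worst_case_search(violation, bounds, n_samples=5000, seed=0):
    """Global random-search sketch for worst-case analysis: sample the
    box of uncertain parameters and keep the parameter combination with
    the largest criterion violation.  bounds is a list of (lo, hi)."""
    rng = random.Random(seed)
    best_p, best_v = None, float("-inf")
    for _ in range(n_samples):
        p = [rng.uniform(lo, hi) for lo, hi in bounds]
        v = violation(p)
        if v > best_v:
            best_p, best_v = p, v
    return best_p, best_v
```

In a clearance study, `violation` would run a nonlinear closed-loop simulation and return, e.g., the exceedance of a response limit; here any scalar function of the parameters will do.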

  13. Real-time optimisation of the Hoa Binh reservoir, Vietnam

    DEFF Research Database (Denmark)

    Richaud, Bertrand; Madsen, Henrik; Rosbjerg, Dan

    2011-01-01

    Multi-purpose reservoirs often have to be managed according to conflicting objectives, which requires efficient tools for trading-off the objectives. This paper proposes a multi-objective simulation-optimisation approach that couples off-line rule curve optimisation with on-line real-time optimisation. First, the simulation-optimisation framework is applied for optimising reservoir operating rules. Secondly, real-time and forecast information is used for on-line optimisation that focuses on short-term goals, such as flood control or hydropower generation, without compromising the deviation … in the downstream part of the Red River, and at the same time to increase hydropower generation and to save water for the dry season. The real-time optimisation procedure further improves the efficiency of the reservoir operation and enhances the flexibility for the decision-making. Finally, the quality …
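
The off-line part of such a simulation-optimisation framework can be sketched as a mass-balance simulation wrapped in a search over a release policy. The model below (a single fixed release rate and illustrative penalty weights) is a deliberately minimal stand-in for the rule curves and multiple objectives of the actual study.

```python
def simulate(inflows, release, capacity, s0):
    """Mass-balance reservoir simulation: returns total spill (flood proxy)
    and total release deficit (supply/hydropower proxy)."""
    s, spill, deficit = s0, 0.0, 0.0
    for q in inflows:
        r = min(release, s + q)        # cannot release more water than is available
        deficit += release - r
        s = s + q - r
        if s > capacity:
            spill += s - capacity      # uncontrolled spill over the dam
            s = capacity
    return spill, deficit

def optimise_release(inflows, capacity, s0, candidates):
    """Brute-force search over candidate release rates, trading off
    flood (spill) against supply (deficit) with illustrative weights."""
    def cost(rel):
        spill, deficit = simulate(inflows, rel, capacity, s0)
        return 10.0 * spill + 1.0 * deficit
    return min(candidates, key=cost)
```

With a steady inflow of 5 units, the search recovers the intuitive answer: release what flows in.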

  14. Acoustic Resonator Optimisation for Airborne Particle Manipulation

    Science.gov (United States)

    Devendran, Citsabehsan; Billson, Duncan R.; Hutchins, David A.; Alan, Tuncay; Neild, Adrian

    Advances in micro-electromechanical systems (MEMS) technology and biomedical research necessitate micro-machined manipulators to capture, handle and position delicate micron-sized particles. To this end, a parallel plate acoustic resonator system has been investigated for the purposes of manipulation and entrapment of micron-sized particles in air. Numerical and finite element modelling was performed to optimise the design of the layered acoustic resonator. To obtain an optimised resonator design, careful consideration of the effect of thickness and material properties is required. Furthermore, the effect of acoustic attenuation, which is frequency dependent, is also considered within this study, leading to an optimum operational frequency range. Finally, experimental results demonstrated good levitation and capture of particles of various properties and sizes, down to as small as 14.8 μm.

  15. Techno-economic optimisation of energy systems

    International Nuclear Information System (INIS)

    Mansilla Pellen, Ch.

    2006-07-01

    The traditional approach currently used to assess the economic interest of energy systems is based on a defined flow-sheet. Some studies have shown that the flow-sheets corresponding to the best thermodynamic efficiencies do not necessarily lead to the best production costs. A method called techno-economic optimisation was proposed. This method aims at minimising the production cost of a given energy system, including both investment and operating costs. It was implemented using genetic algorithms. This approach was compared to the heat integration method on two different examples, thus validating its interest. Techno-economic optimisation was then applied to different energy systems dealing with hydrogen as well as electricity production. (author)
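
The genetic-algorithm approach described above can be sketched in miniature: minimise a total production cost combining investment and operating terms over a design variable. Everything below (the cost model 2c + 50/c, the GA parameters) is a hypothetical illustration, not the author's actual plant model.

```python
import random

def genetic_minimise(cost, lo, hi, pop_size=20, generations=100, seed=1):
    """Tiny real-coded genetic algorithm: rank selection with elitism,
    blend crossover and Gaussian mutation.  Illustrative only."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[: pop_size // 4]                   # keep the best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)                # pick two elite parents
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.1)  # blend + mutate
            children.append(min(max(child, lo), hi))   # clamp to the design box
        pop = elite + children
    return min(pop, key=cost)
```

For a cost of 2c (investment) plus 50/c (operating), the analytic optimum is c = 5 with cost 20; the GA should land close to it.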

  16. Pre-operative optimisation of lung function

    Directory of Open Access Journals (Sweden)

    Naheed Azhar

    2015-01-01

    The anaesthetic management of patients with pre-existing pulmonary disease is a challenging task. It is associated with increased morbidity in the form of post-operative pulmonary complications. Pre-operative optimisation of lung function helps in reducing these complications. Patients are advised to stop smoking for a period of 4–6 weeks. This reduces airway reactivity, improves mucociliary function and decreases carboxy-haemoglobin. The widely used incentive spirometry may be useful only when combined with other respiratory muscle exercises. Volume-based inspiratory devices have the best results. Pharmacotherapy of asthma and chronic obstructive pulmonary disease must be optimised before considering the patient for elective surgery. Beta-2 agonists, inhaled corticosteroids and systemic corticosteroids are the main drugs used for this, and several other drugs play an adjunctive role in medical therapy. A graded approach has been suggested to manage these patients for elective surgery with an aim to achieve optimal pulmonary function.

  17. Optimised dipper fine tunes shovel performance

    Energy Technology Data Exchange (ETDEWEB)

    Fiscor, S.

    2005-06-01

    Joint efforts between mine operators, OEMs, and researchers yield unexpected benefits: dippers for coal, oil, or hardrock mining shovels can now be tailored to meet site-specific conditions. The article outlines a process being developed by CRCMining and P&H Mining Equipment to optimise the dipper that involves rapid prototyping and scale modelling of the dipper and the mine conditions. Scale models have been successfully field tested. 2 photos.

  18. Public transport optimisation emphasising passengers’ travel behaviour.

    OpenAIRE

    Jensen, Jens Parbo; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    Passengers in public transport complaining about their travel experiences are not uncommon. This might seem counterintuitive since several operators worldwide are presenting better key performance indicators year by year. The present PhD study focuses on developing optimisation algorithms to enhance the operations of public transport while explicitly emphasising passengers’ travel behaviour and preferences. Similar to economic theory, interactions between supply and demand are omnipresent in ...

  19. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for build; and introduction of package level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT command optimisation in general, which made the commands approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build time and environment setup time several-fold and increased the efficiency of …
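
The package-level build parallelism described above reduces to a scheduling problem: group packages into "waves" such that every package's prerequisites were built in an earlier wave, then build each wave in parallel. A minimal sketch of that grouping (not CMT's actual code):

```python
def build_waves(deps):
    """Group packages into waves that can be built in parallel.
    deps maps each package to the set of packages it depends on.
    Raises on a dependency cycle, which would make the build impossible."""
    remaining = {p: set(d) for p, d in deps.items()}
    waves, built = [], set()
    while remaining:
        # A package is ready once all its prerequisites are already built.
        ready = sorted(p for p, d in remaining.items() if d <= built)
        if not ready:
            raise ValueError("dependency cycle")
        waves.append(ready)
        built.update(ready)
        for p in ready:
            del remaining[p]
    return waves
```

Each wave would then be dispatched to as many parallel build commands as there are processors.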

  20. Natural Erosion of Sandstone as Shape Optimisation.

    Science.gov (United States)

    Ostanin, Igor; Safonov, Alexander; Oseledets, Ivan

    2017-12-11

    Natural arches, pillars and other exotic sandstone formations have always attracted attention for their unusual shapes and amazing mechanical balance, which leave a strong impression of intelligent design rather than of a stochastic process. It has recently been demonstrated that these shapes could have been the result of the negative feedback between stress and erosion that originates in fundamental laws of friction between the rock's constituent particles. Here we present a deeper analysis of this idea and bridge it with the approaches utilized in shape and topology optimisation. It appears that the processes of natural erosion, driven by stochastic surface forces and the Mohr-Coulomb law of dry friction, can be viewed within the framework of local optimisation for minimum elastic strain energy. Our hypothesis is confirmed by numerical simulations of the erosion using the topological-shape optimisation model. Our work contributes to a better understanding of stochastic erosion and of feasible landscape formations that could be found on Earth and beyond.

  1. Exploration of automatic optimisation for CUDA programming

    KAUST Repository

    Al-Mouhamed, Mayez

    2014-09-16

    © 2014 Taylor & Francis. Writing optimised compute unified device architecture (CUDA) program for graphic processing units (GPUs) is complex even for experts. We present a design methodology for a restructuring tool that converts C-loops into optimised CUDA kernels based on a three-step algorithm which are loop tiling, coalesced memory access and resource optimisation. A method for finding possible loop tiling solutions with coalesced memory access is developed and a simplified algorithm for restructuring C-loops into an efficient CUDA kernel is presented. In the evaluation, we implement matrix multiply (MM), matrix transpose (M-transpose), matrix scaling (M-scaling) and matrix vector multiply (MV) using the proposed algorithm. We present the analysis of the execution time and GPU throughput for the above applications, which favourably compare to other proposals. Evaluation is carried out while scaling the problem size and running under a variety of kernel configurations. The obtained speedup is about 28-35% for M-transpose compared to NVIDIA Software Development Kit, 33% speedup for MV compared to general purpose computation on graphics processing unit compiler, and more than 80% speedup for MM and M-scaling compared to CUDA-lite.
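
The loop-tiling step of the restructuring algorithm can be illustrated in plain Python (the paper targets CUDA C; Python is used here only to show the transformation itself). Blocking the i/j/k loops is what lets a CUDA kernel stage small tiles of the operands in shared memory for coalesced, reused access:

```python
def matmul_tiled(A, B, tile=2):
    """Loop-tiled matrix multiply: the i/j/k loops are blocked so each
    tile of A and B is reused while it is 'hot', mirroring the
    shared-memory tiling a CUDA kernel performs on the GPU."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for ii in range(0, n, tile):
        for kk in range(0, m, tile):
            for jj in range(0, p, tile):
                # Multiply one tile of A by one tile of B.
                for i in range(ii, min(ii + tile, n)):
                    for k in range(kk, min(kk + tile, m)):
                        a = A[i][k]
                        for j in range(jj, min(jj + tile, p)):
                            C[i][j] += a * B[k][j]
    return C
```

In the CUDA version, each (ii, jj) tile maps to a thread block, and the inner loops read the staged tiles from shared memory instead of global memory.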

  3. Optimisation and symmetry in experimental radiation physics

    International Nuclear Information System (INIS)

    Ghose, A.

    1988-01-01

    The present monograph is concerned with the optimisation of geometric factors in radiation physics experiments. The discussions are essentially confined to those systems in which optimisation is equivalent to symmetrical configurations of the measurement systems. They include measurements of interaction cross sections of diverse types, determination of polarisations, development of detectors with almost ideal characteristics, production of radiations with continuously variable energies and development of high efficiency spectrometers. The monograph is intended for use by experimental physicists investigating primary interactions of radiations with matter and associated technologies. We have illustrated the various optimisation procedures by considering the cases of the so-called '14 MeV' d-t neutrons and of gamma rays with energies less than 3 MeV. Developments in fusion technology are critically dependent on the availability of accurate cross sections of nuclei for fast neutrons of energies at least as high as those of d-t neutrons. In this monograph we have discussed various techniques which can be used to improve the accuracy of such measurements and have also presented a method for generating almost monoenergetic neutrons in the 8 MeV to 13 MeV energy range, which can be used to measure cross sections in this sparsely investigated region.

  4. moviEEG: An animation toolbox for visualization of intracranial electroencephalography synchronization dynamics.

    Science.gov (United States)

    Wong, Simeon M; Ibrahim, George M; Ochi, Ayako; Otsubo, Hiroshi; Rutka, James T; Snead, O Carter; Doesburg, Sam M

    2016-06-01

    We introduce and describe the functions of moviEEG (Multiple Overlay Visualizations for Intracranial ElectroEncephaloGraphy), a novel MATLAB-based toolbox for spatiotemporal mapping of network synchronization dynamics in intracranial electroencephalography (iEEG) data. The toolbox integrates visualizations of inter-electrode phase-locking relationships in peri-ictal epileptogenic networks with signal spectral properties and graph-theoretical network measures overlaid upon operating room images of the electrode grid. Functional connectivity between every electrode pair is evaluated over a sliding window and indexed by phase synchrony. Two case studies are presented to provide preliminary evidence for the application of the toolbox to guide network-based mapping of epileptogenic cortex and to distinguish these regions from eloquent brain networks. In both cases, epileptogenic cortex was visually distinct. Despite evidence that network synchronization is altered in epileptogenic brain regions, network-based techniques have yet to be incorporated into clinical pre-surgical mapping. moviEEG provides a set of functions to enable easy visualization with network-based techniques and shows preliminary utility in delineating the epileptogenic zone. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  5. CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.

    Science.gov (United States)

    Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka

    2017-09-15

    CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one of the eyes at a time is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required for using the toolbox; experiments are generated by modifying csv files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).

  6. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG.

    Science.gov (United States)

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    Reference electrode standardization technique (REST) has been increasingly acknowledged and applied as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERPs) community around the world in recent years. However, an easy-to-use toolbox for re-referencing scalp EEG data to zero reference has been lacking. Here, we have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version intended to be more convenient and efficient for experienced users. Both are designed to be easy to use for novice researchers while offering flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, and detailed information including publications, comments and documents on REST can also be found on this website. An example of usage is given with comparative results of REST and average reference. We hope these user-friendly REST toolboxes will make the relatively novel technique of REST easier to study, especially for applications in various EEG studies.
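
REST itself requires a head-model lead field to reconstruct the reference and is beyond a short sketch, but the average reference it is compared against is simple to state: subtract the instantaneous mean across channels from every channel. Sketched in Python for brevity (the toolboxes themselves are MATLAB):

```python
def average_reference(eeg):
    """Re-reference multi-channel EEG to the common average: at each time
    point, subtract the mean across channels from every channel.
    eeg is a list of channels, each a list of samples."""
    n_ch = len(eeg)
    n_s = len(eeg[0])
    out = [list(ch) for ch in eeg]           # copy, leave the input intact
    for t in range(n_s):
        m = sum(ch[t] for ch in eeg) / n_ch
        for ch in out:
            ch[t] -= m
    return out
```

After average referencing the channels sum to zero at every sample, which is the property REST aims to approximate more accurately via the head model.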

  7. Neural Parallel Engine: A toolbox for massively parallel neural signal processing.

    Science.gov (United States)

    Tam, Wing-Kin; Yang, Zhi

    2018-05-01

    Large-scale neural recordings provide detailed information on neuronal activities and can help elicit the underlying neural mechanisms of the brain. However, the computational burden is also formidable when we try to process the huge data stream generated by such recordings. In this study, we report the development of Neural Parallel Engine (NPE), a toolbox for massively parallel neural signal processing on graphical processing units (GPUs). It offers a selection of the most commonly used routines in neural signal processing such as spike detection and spike sorting, including advanced algorithms such as exponential-component-power-component (EC-PC) spike detection and binary pursuit spike sorting. We also propose a new method for detecting peaks in parallel through a parallel compact operation. Our toolbox is able to offer a 5× to 110× speedup compared with its CPU counterparts depending on the algorithms. A user-friendly MATLAB interface is provided to allow easy integration of the toolbox into existing workflows. Previous efforts on GPU neural signal processing only focus on a few rudimentary algorithms, are not well-optimized and often do not provide a user-friendly programming interface to fit into existing workflows. There is a strong need for a comprehensive toolbox for massively parallel neural signal processing. A new toolbox for massively parallel neural signal processing has been created. It can offer significant speedup in processing signals from large-scale recordings up to thousands of channels. Copyright © 2018 Elsevier B.V. All rights reserved.
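
A serial sketch of the kind of routine NPE parallelises, e.g. threshold-based spike (peak) detection with a refractory window, is shown below. This is illustrative only; NPE's actual GPU kernels and the EC-PC detector are considerably more sophisticated.

```python
def detect_spikes(signal, threshold, refractory=3):
    """Report the index of each local peak exceeding the threshold,
    then skip a refractory window of samples so one spike waveform
    is not counted twice."""
    spikes = []
    i, n = 1, len(signal)
    while i < n - 1:
        if (signal[i] > threshold
                and signal[i] >= signal[i - 1]
                and signal[i] >= signal[i + 1]):
            spikes.append(i)
            i += refractory       # skip samples inside the refractory window
        else:
            i += 1
    return spikes
```

On a GPU, the per-sample threshold and local-maximum tests run in parallel across samples and channels, with the surviving indices gathered by a parallel compact operation.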

  8. Travel demand management: a toolbox of strategies to reduce single-occupant vehicle trips and increase alternate mode usage in Arizona.

    Science.gov (United States)

    2012-02-01

    The report provides a suite of recommended strategies to reduce single-occupant vehicle traffic in the urban areas of Phoenix and Tucson, Arizona, which are presented as a travel demand management toolbox. The toolbox includes supporting research...

  9. Toolboxes for cyanobacteria: Recent advances and future direction.

    Science.gov (United States)

    Sun, Tao; Li, Shubin; Song, Xinyu; Diao, Jinjin; Chen, Lei; Zhang, Weiwen

    2018-05-03

    Photosynthetic cyanobacteria are important primary producers and model organisms for studying photosynthesis and element cycling on earth. Due to their ability to absorb sunlight and utilize carbon dioxide, cyanobacteria have also been proposed as renewable chassis for carbon-neutral "microbial cell factories". Recent progress in cyanobacterial synthetic biology has led to the successful production of more than two dozen fuels and fine chemicals directly from CO2, demonstrating their potential for scale-up application in the future. However, compared with popular heterotrophic chassis like Escherichia coli and Saccharomyces cerevisiae, where abundant genetic tools are available for manipulations at levels from single gene and pathway to whole genome, limited genetic tools are accessible for cyanobacteria. Consequently, this significant technical hurdle restricts both basic biological research and the further development and application of these renewable systems. Though still lagging behind the heterotrophic chassis, the vital roles of genetic tools in tuning gene expression, redirecting carbon flux and performing genome-wide manipulations have been increasingly recognized in cyanobacteria. In recent years, significant progress has been made on developing and introducing new and efficient genetic tools for cyanobacteria, including promoters, riboswitches, ribosome binding site engineering, clustered regularly interspaced short palindromic repeats/CRISPR-associated nuclease (CRISPR/Cas) systems, small RNA regulatory tools and genome-scale modeling strategies. In this review, we critically summarize recent advances on the development and applications as well as the technical limitations and future directions of genetic tools in cyanobacteria. In addition, toolboxes feasible for use in large-scale cultivation are also briefly discussed. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Hemostats, sealants, and adhesives: components of the surgical toolbox.

    Science.gov (United States)

    Spotnitz, William D; Burks, Sandra

    2008-07-01

    The surgical toolbox is expanding, and newer products are being developed to improve results. Reducing blood loss so that bloodless surgery can be performed may help minimize morbidity and length of stay. As patients, hospital administrators, and government regulators desire less invasive procedures, the surgical technical challenge is increasing. More operations are being performed through minimally invasive incisions with laparoscopic, endoscopic, and robotic approaches. In this setting, tools that can reduce bleeding by causing blood to clot, sealing vessels, or gluing tissues are gaining an increasing importance. Thus, hemostats, sealants, and adhesives are becoming a more important element of surgical practice. This review is designed to facilitate the reader's basic knowledge of these tools so that informed choices are made for controlling bleeding in specific clinical situations. Such information is useful for all members of the operative team. The team includes surgeons, anesthesiologists, residents, and nurses as well as hematologists and other medical specialists who may be involved in the perioperative care of surgical patients. An understanding of these therapeutic options may also be helpful to the transfusion service. In some cases, these materials may be stored in the blood bank, and their appropriate use may reduce demand for other transfusion components. The product classification used in this review includes hemostats as represented by product categories that include mechanical agents, active agents, flowables, and fibrin sealants; sealants as represented by fibrin sealants and polyethylene glycol hydrogels; and adhesives as represented by cyanoacrylates and albumin cross-linked with glutaraldehyde. Only those agents approved by the Food and Drug Administration (FDA) and presently available (February 2008) for sale in the United States are discussed in this review.

  11. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    International Nuclear Information System (INIS)

    Spezi, E; Lewis, D G; Smith, C W

    2002-01-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans, which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communications in Medicine) based toolbox, developed for the evaluation and verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.
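
One elementary building block of dose-distribution evaluation of the kind such a toolbox provides is the cumulative dose-volume histogram (DVH): for each dose level, the fraction of the structure's voxels receiving at least that dose. A minimal sketch over a flattened dose array (hypothetical helper, not the toolbox's API):

```python
def cumulative_dvh(doses, dose_levels):
    """Cumulative DVH: for each dose level, the fraction of voxels
    receiving at least that dose.  doses is a flat list of per-voxel
    doses for one structure."""
    n = len(doses)
    return [sum(1 for d in doses if d >= level) / n for level in dose_levels]
```

Comparing the DVHs of a planned and a Monte Carlo recomputed dose distribution is a common way to summarise their agreement per structure.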

  12. Designed Green Toolbox as built environment educating method-analytical comparison between two groups of students with different cultural background

    NARCIS (Netherlands)

    El Fiky, U.; Hamdy, I.; Fikry, M.

    2006-01-01

    This paper is concerned with evaluating and testing the application process of green architecture design strategies using a tool-box as a built environment educating method and a pre-design reminder. Understanding the suggested green design strategies, testing the tool-box effectiveness,

  13. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided

  14. Process evaluation of a Toolbox-training program for construction foremen in Denmark

    DEFF Research Database (Denmark)

    Jeschke, Katharina Christiane; Kines, Pete; Rasmussen, Liselotte

    2017-01-01

    foremen’s knowledge and communication skills in daily planning of work tasks and their related OSH risks on construction sites. The program builds on the popular 'toolbox meeting' concept, however there is very little research evaluating these types of meetings. This article describes the development...... and revised until the final version after the fifth group. The evaluation utilized an action research strategy with a mixed–methods approach of triangulating questionnaire, interview, and observation data. Process evaluation results showed that the eight Toolbox-training topics were relevant and useful...

  15. ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.

    Science.gov (United States)

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section.

  16. Imbalanced-learn: A Python Toolbox to Tackle the Curse of Imbalanced Datasets in Machine Learning

    OpenAIRE

    Lemaitre , Guillaume; Nogueira , Fernando; Aridas , Christos ,

    2017-01-01

    imbalanced-learn is an open-source python toolbox aiming at providing a wide range of methods to cope with the problem of imbalanced datasets frequently encountered in machine learning and pattern recognition. The implemented state-of-the-art methods can be categorized into 4 groups: (i) under-sampling, (ii) over-sampling, (iii) combination of over- and under-sampling, and (iv) ensemble learning methods. The proposed toolbox depends only on numpy, scipy, and scikit-learn...
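
    The resampling idea behind the toolbox can be illustrated without imbalanced-learn itself. The sketch below (plain NumPy, not the imbalanced-learn API) balances a binary dataset by random over-sampling of the minority class, the simplest member of group (ii):

```python
import numpy as np

def random_oversample(X, y, seed=0):
    """Balance a dataset by resampling minority classes with replacement.
    Illustrative only -- imbalanced-learn's resamplers offer a richer API."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    X_parts, y_parts = [], []
    for c, n in zip(classes, counts):
        idx = np.flatnonzero(y == c)
        extra = rng.choice(idx, size=n_max - n, replace=True)
        keep = np.concatenate([idx, extra])
        X_parts.append(X[keep])
        y_parts.append(y[keep])
    return np.vstack(X_parts), np.concatenate(y_parts)

X = np.arange(20, dtype=float).reshape(10, 2)
y = np.array([0] * 8 + [1] * 2)       # 8-vs-2 imbalance
Xb, yb = random_oversample(X, y)      # both classes now have 8 samples
```

    The informed methods in the toolbox (e.g. SMOTE-style synthesis) go beyond this naive duplication, but the balancing goal is the same.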

  17. Methods for Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte

    This thesis deals with the adaptation and implementation of various optimisation methods, in the field of experimental design, for the laser cutting process. The problem in optimising the laser cutting process has been defined and a structure for a Decision Support System (DSS......) for the optimisation of the laser cutting process has been suggested. The DSS consists of a database with the currently used and old parameter settings. Also one of the optimisation methods has been implemented in the DSS in order to facilitate the optimisation procedure for the laser operator. The Simplex Method has...... been adapted in two versions. A qualitative one, that by comparing the laser cut items optimises the process, and a quantitative one that uses a weighted quality response in order to achieve a satisfactory quality and after that maximises the cutting speed thus increasing the productivity of the process...
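
    The sequential Simplex idea behind the quantitative version can be sketched as a minimal reflection-and-shrink search. The quality-loss function and its "sweet spot" below are invented for illustration; the thesis's actual responses come from cutting experiments:

```python
import numpy as np

def quality_loss(x):
    # Hypothetical weighted quality response: deviation of the
    # (cutting speed, assist-gas pressure) pair from a sweet spot.
    speed, pressure = x
    return (speed - 3.0) ** 2 + 0.5 * (pressure - 1.2) ** 2

def sequential_simplex(f, simplex, steps=300):
    """Basic sequential-simplex search: repeatedly reflect the worst
    vertex through the centroid of the others, shrinking on failure."""
    simplex = [np.asarray(v, dtype=float) for v in simplex]
    for _ in range(steps):
        simplex.sort(key=f)                       # best first, worst last
        centroid = np.mean(simplex[:-1], axis=0)
        reflected = centroid + (centroid - simplex[-1])
        if f(reflected) < f(simplex[-1]):
            simplex[-1] = reflected
        else:
            # shrink toward the best vertex when reflection fails
            simplex = [simplex[0] + 0.5 * (v - simplex[0]) for v in simplex]
    return min(simplex, key=f)

best = sequential_simplex(quality_loss, [[0, 0], [1, 0], [0, 1]])
# best converges near the sweet spot (3.0, 1.2)
```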

  18. Mechatronic System Design Based On An Optimisation Approach

    DEFF Research Database (Denmark)

    Andersen, Torben Ole; Pedersen, Henrik Clemmensen; Hansen, Michael Rygaard

    The envisaged objective of this project is to extend the current state of the art regarding the design of complex mechatronic systems utilizing an optimisation approach. We propose to investigate a novel framework for mechatronic system design. The novelty and originality being the use...... of optimisation techniques. The methods used to optimise/design within the classical disciplines will be identified and extended to mechatronic system design....

  19. Analysis and optimisation of heterogeneous real-time embedded systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2005-01-01

    . The success of such new design methods depends on the availability of analysis and optimisation techniques. Analysis and optimisation techniques for heterogeneous real-time embedded systems are presented in the paper. The authors address in more detail a particular class of such systems called multi...... of application messages to frames. Optimisation heuristics for frame packing aimed at producing a schedulable system are presented. Extensive experiments and a real-life example show the efficiency of the frame-packing approach....
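
    The frame-packing step described above can be pictured as a bin-packing problem: application messages must be grouped into frames of bounded payload. The first-fit-decreasing sketch below uses invented payload and message sizes; the paper's heuristics additionally optimise for schedulability, not just frame count:

```python
# Toy frame-packing sketch: greedily pack application messages (sizes in
# bytes) into frames with a fixed payload limit, first-fit decreasing.
FRAME_PAYLOAD = 8  # bytes, e.g. a CAN-like frame (illustrative value)

def pack_messages(sizes, capacity=FRAME_PAYLOAD):
    frames = []
    for size in sorted(sizes, reverse=True):   # largest messages first
        for frame in frames:
            if sum(frame) + size <= capacity:  # first frame that still fits
                frame.append(size)
                break
        else:
            frames.append([size])              # open a new frame
    return frames

frames = pack_messages([1, 7, 3, 4, 2, 5])
# 22 bytes total cannot fit in fewer than 3 eight-byte frames
```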

  20. Analysis and optimisation of heterogeneous real-time embedded systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    . The success of such new design methods depends on the availability of analysis and optimisation techniques. Analysis and optimisation techniques for heterogeneous real-time embedded systems are presented in the paper. The authors address in more detail a particular class of such systems called multi...... of application messages to frames. Optimisation heuristics for frame packing aimed at producing a schedulable system are presented. Extensive experiments and a real-life example show the efficiency of the frame-packing approach....

  1. Flotation process control optimisation at Prominent Hill

    International Nuclear Information System (INIS)

    Lombardi, Josephine; Muhamad, Nur; Weidenbach, M.

    2012-01-01

    OZ Minerals' Prominent Hill copper- gold concentrator is located 130 km south east of the town of Coober Pedy in the Gawler Craton of South Australia. The concentrator was built in 2008 and commenced commercial production in early 2009. The Prominent Hill concentrator is comprised of a conventional grinding and flotation processing plant with a 9.6 Mtpa ore throughput capacity. The flotation circuit includes six rougher cells, an IseMill for regrinding the rougher concentrate and a Jameson cell heading up the three stage conventional cell cleaner circuit. In total there are four level controllers in the rougher train and ten level controllers in the cleaning circuit for 18 cells. Generic proportional — integral and derivative (PID) control used on the level controllers alone propagated any disturbances downstream in the circuit that were generated from the grinding circuit, hoppers, between cells and interconnected banks of cells, having a negative impact on plant performance. To better control such disturbances, FloatStar level stabiliser was selected for installation on the flotation circuit to account for the interaction between the cells. Multivariable control was also installed on the five concentrate hoppers to maintain consistent feed to the cells and to the IsaMill. An additional area identified for optimisation in the flotation circuit was the mass pull rate from the rougher cells. FloatStar flow optimiser was selected to be installed subsequent to the FloatStar level stabiliser. This allowed for a unified, consistent and optimal approach to running the rougher circuit. This paper describes the improvement in the stabilisation of the circuit achieved by the FloatStar level stabiliser by using the interaction matrix between cell level controllers and the results and benefits of implementing the FloatStar flow optimiser on the rougher train.

  2. Risk based test interval and maintenance optimisation - Application and uses

    International Nuclear Information System (INIS)

    Sparre, E.

    1999-10-01

    The project is part of an IAEA co-ordinated Research Project (CRP) on 'Development of Methodologies for Optimisation of Surveillance Testing and Maintenance of Safety Related Equipment at NPPs'. The purpose of the project is to investigate the sensitivity of the results obtained when performing risk based optimisation of the technical specifications. Previous projects have shown that complete LPSA models can be created and that these models allow optimisation of technical specifications. However, these optimisations did not include any in-depth check of the result sensitivity with regard to methods, model completeness etc. Four different test intervals have been investigated in this study. Aside from an original, nominal optimisation, a set of sensitivity analyses has been performed and the results from these analyses have been compared to the original optimisation. The analyses indicate that the result of an optimisation is rather stable. However, it is not possible to draw any certain conclusions without performing a number of sensitivity analyses. Significant differences in the optimisation result were discovered when analysing an alternative configuration. Deterministic uncertainties also appear to strongly affect the result of an optimisation. The sensitivity of failure data uncertainties is important to investigate in detail since the methodology is based on the assumption that the unavailability of a component is dependent on the length of the test interval
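
    The trade-off being optimised can be sketched with a textbook standby-component model in which a longer test interval raises the failure-related unavailability but lowers the test-outage contribution. The rates below are illustrative, not the study's data:

```python
import numpy as np

# Illustrative standby-component model: mean unavailability of a
# periodically tested component is approximately
#   q(T) = lam * T / 2 + tau / T
# where lam is the failure rate, tau the outage time per test, T the interval.
lam, tau = 1e-5, 2.0   # failures per hour; hours out of service per test

def mean_unavailability(T):
    return lam * T / 2.0 + tau / T

# Grid search over candidate test intervals (hours).
T = np.linspace(100.0, 20000.0, 2000)
T_best = T[np.argmin(mean_unavailability(T))]

# Analytic optimum for this simple model: T* = sqrt(2 * tau / lam).
T_star = np.sqrt(2 * tau / lam)   # about 632 hours for these rates
```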

  3. Improving Vector Evaluated Particle Swarm Optimisation by incorporating nondominated solutions.

    Science.gov (United States)

    Lim, Kian Sheng; Ibrahim, Zuwairie; Buyamin, Salinda; Ahmad, Anita; Naim, Faradila; Ghazali, Kamarul Hawari; Mokhtar, Norrima

    2013-01-01

    The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles where their movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the best solution at the objective function optimised by that swarm, yielding poor solutions for the multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced by incorporating the nondominated solutions as the guidance for a swarm rather than using the best solution from another swarm. In this paper, the performance of improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using performance measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved Vector Evaluated Particle Swarm Optimisation algorithm has impressive performance compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm.
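
    The key ingredient of the improved algorithm is an archive of nondominated solutions from which each swarm draws its guide, instead of the single best of another swarm. A minimal sketch of the Pareto-dominance bookkeeping, assuming minimisation of both objectives:

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def nondominated(points):
    """Return the nondominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
archive = nondominated(pts)
# (3, 3) is dominated by (2, 2); (5, 5) is dominated by every other point.
```

    In the improved VEPSO, a member of such an archive (rather than another swarm's best particle) guides each swarm's velocity update.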

  4. Improving Vector Evaluated Particle Swarm Optimisation by Incorporating Nondominated Solutions

    Directory of Open Access Journals (Sweden)

    Kian Sheng Lim

    2013-01-01

    Full Text Available The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles where their movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the best solution at the objective function optimised by that swarm, yielding poor solutions for the multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced by incorporating the nondominated solutions as the guidance for a swarm rather than using the best solution from another swarm. In this paper, the performance of improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using performance measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved Vector Evaluated Particle Swarm Optimisation algorithm has impressive performance compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm.

  5. Optimising end of generation of Magnox reactors

    International Nuclear Information System (INIS)

    Hall, D.; Hopper, E.D.A.

    2014-01-01

    Designing, justifying and gaining regulatory approval for optimised, terminal fuel cycles for the last 4 of the 13-strong Magnox fleet is described, covering: - constraints set by the plant owner's integrated closure plan, opportunities for innovative fuel cycles while preserving flexibility to respond to business changes; - methods of collectively determining best options for each site; - selected strategies including lower fuel element retention and inter-reactor transfer of fuel; - the required work scope, its technical, safety case and resource challenges and how they were met; - achieving additional electricity generation worth in excess of £1b from 4 sites (a total of 8 reactors); - the keys to success. (authors)

  6. Advanced manufacturing: optimising the factories of tomorrow

    International Nuclear Information System (INIS)

    Philippon, Patrick

    2013-01-01

    Faced with competition from the emerging countries, the competitiveness of the industrialised nations depends on the ability of their industries to innovate (Patrick Philippon, Les Defis du CEA no. 179, April 2013). This strategy necessarily entails the reorganisation and optimisation of the production systems. This is the whole challenge for 'advanced manufacturing', which relies on the new information and communication technologies. Interactive robotics, virtual reality and non-destructive testing are all technological building blocks developed by CEA, now approved within a cross-cutting programme, to meet the needs of industry and together build the factories of tomorrow. (author)

  7. Biorefinery plant design, engineering and process optimisation

    DEFF Research Database (Denmark)

    Holm-Nielsen, Jens Bo; Ehimen, Ehiazesebhor Augustine

    2014-01-01

    Before new biorefinery systems can be implemented, or the modification of existing single product biomass processing units into biorefineries can be carried out, proper planning of the intended biorefinery scheme must be performed initially. This chapter outlines design and synthesis approaches...... applicable for the planning and upgrading of intended biorefinery systems, and includes discussions on the operation of an existing lignocellulosic-based biorefinery platform. Furthermore, technical considerations and tools (i.e., process analytical tools) which could be applied to optimise the operations...... of existing and potential biorefinery plants are elucidated....

  8. Specification, Verification and Optimisation of Business Processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas

    is extended with stochastic branching, message passing and reward annotations which allow for the modelling of resources consumed during the execution of a business process. Further, it is shown how this structure can be used to formalise the established business process modelling language Business Process...... fault tree analysis and the automated optimisation of business processes by means of an evolutionary algorithm. This work is motivated by problems that stem from the healthcare sector, and examples encountered in this field are used to illustrate these developments....

  9. Topology Optimisation for Coupled Convection Problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Andreasen, Casper Schousboe; Aage, Niels

    stabilised finite elements implemented in a parallel multiphysics analysis and optimisation framework DFEM [1], developed and maintained in house. Focus is put on control of the temperature field within the solid structure and the problems can therefore be seen as conjugate heat transfer problems, where heat...... conduction governs in the solid parts of the design domain and couples to convection-dominated heat transfer to a surrounding fluid. Both loosely coupled and tightly coupled problems are considered. The loosely coupled problems are convection-diffusion problems, based on an advective velocity field from...

  10. Cost optimisation studies of high power accelerators

    Energy Technology Data Exchange (ETDEWEB)

    McAdams, R.; Nightingale, M.P.S.; Godden, D. [AEA Technology, Oxon (United Kingdom)] [and others]

    1995-10-01

    Cost optimisation studies are carried out for an accelerator based neutron source consisting of a series of linear accelerators. The characteristics of the lowest-cost design for a given beam current and energy, such as machine power and length, are found to depend on the lifetime envisaged for it. For a fixed neutron yield it is preferable to have a low current, high energy machine. The benefits of superconducting technology are also investigated. A Separated Orbit Cyclotron (SOC) has the potential to reduce capital and operating costs and initial estimates for the transverse and longitudinal current limits of such machines are made.

  11. Automation of route identification and optimisation based on data-mining and chemical intuition.

    Science.gov (United States)

    Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G

    2017-09-21

    Data-mining of Reaxys and network analysis of the combined literature and in-house reactions set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of the continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and generation of environmental and other performance indicators, such as cost indicators. However, the identified further challenge is to automate model generation to evolve optimal multi-step chemical routes and optimal process configurations.
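
    Route generation over a mined reaction network reduces to path search in a directed graph. The toy network below is hypothetical (the intermediate names are placeholders, not the actual Reaxys-mined chemistry); breadth-first search returns a fewest-steps route:

```python
from collections import deque

# Toy reaction network: node -> products reachable by one known reaction.
network = {
    "limonene": ["intermediate_1", "intermediate_2"],
    "intermediate_1": ["intermediate_3"],
    "intermediate_2": ["intermediate_3", "paracetamol"],
    "intermediate_3": ["paracetamol"],
}

def shortest_route(graph, start, target):
    """Breadth-first search: fewest-step route through the network."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        for nxt in graph.get(path[-1], []):
            if nxt == target:
                return path + [nxt]
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

route = shortest_route(network, "limonene", "paracetamol")
# -> ['limonene', 'intermediate_2', 'paracetamol']
```

    Real route selection must of course weight edges by yield, cost and practicality rather than step count alone, which is where the flowsheet simulation described above comes in.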

  12. Biofuels carbon footprints: Whole-systems optimisation for GHG emissions reduction.

    Science.gov (United States)

    Zamboni, Andrea; Murphy, Richard J; Woods, Jeremy; Bezzo, Fabrizio; Shah, Nilay

    2011-08-01

    A modelling approach for strategic design of ethanol production systems combining lifecycle analysis (LCA) and supply chain optimisation (SCO) can significantly contribute to assessing their economic and environmental sustainability and to guiding decision makers towards a more conscious implementation of ad hoc farming and processing practices. Most model applications so far have been descriptive in nature; the model proposed in this work is "normative" in that it aims to guide actions towards optimal outcomes (e.g. optimising the nitrogen balance through the whole supply chain). The modelling framework was conceived to steer strategic policies through a geographically specific design process considering economic and environmental criteria. Results show how a crop management strategy devised from a whole-systems perspective can significantly contribute to mitigating global warming even in first-generation technologies. Copyright © 2011 Elsevier Ltd. All rights reserved.
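
    The flavour of coupling LCA coefficients to a supply-chain decision can be shown with a deliberately tiny allocation problem (all numbers invented): meet an ethanol demand from two production sites at minimum total GHG emissions.

```python
# Toy supply-chain allocation: meet demand from two sites, minimising
# total GHG emissions subject to site capacities (illustrative numbers).
demand = 100.0                               # kt ethanol per year
capacity = {"site_A": 70.0, "site_B": 80.0}  # kt/yr
emission = {"site_A": 0.4, "site_B": 0.9}    # ktCO2e per kt ethanol

# Greedy merit-order dispatch (cleanest site first) is optimal for this
# single-constraint linear programme.
plan, left = {}, demand
for site in sorted(capacity, key=emission.get):
    take = min(capacity[site], left)
    plan[site] = take
    left -= take

total_emissions = sum(plan[s] * emission[s] for s in plan)
# 70 kt from site_A and 30 kt from site_B -> 0.4*70 + 0.9*30 = 55 ktCO2e
```

    A real SCO model adds transport, land-use and nitrogen terms and many sites, which is why a full optimisation solver replaces the greedy rule.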

  13. National Institutes of Health Toolbox Emotion Battery for English- and Spanish-speaking adults: normative data and factor-based summary scores.

    Science.gov (United States)

    Babakhanyan, Ida; McKenna, Benjamin S; Casaletto, Kaitlin B; Nowinski, Cindy J; Heaton, Robert K

    2018-01-01

    The National Institutes of Health Toolbox Emotion Battery (NIHTB-EB) is a "common currency", computerized assessment developed to measure the full spectrum of emotional health. Though comprehensive, the NIHTB-EB's 17 scales may be unwieldy for users aiming to capture more global indices of emotional functioning. NIHTB-EB was administered to 1,036 English-speaking and 408 Spanish-speaking adults as a part of the NIH Toolbox norming project. We examined the factor structure of the NIHTB-EB in English- and Spanish-speaking adults and developed factor analysis-based summary scores. Census-weighted norms were presented for English speakers, and sample-weighted norms were presented for Spanish speakers. Exploratory factor analysis for both English- and Spanish-speaking cohorts resulted in the same 3-factor solution: 1) negative affect, 2) social satisfaction, and 3) psychological well-being. Confirmatory factor analysis supported similar factor structures for English- and Spanish-speaking cohorts. Model fit indices fell within the acceptable/good range, and our final solution was optimal compared to other solutions. Summary scores based upon the normative samples appear to be psychometrically supported and should be applied to clinical samples to further validate the factor structures and investigate rates of problematic emotions in medical and psychiatric populations.
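
    The factor-based summary scores can be mimicked on synthetic data: generate 17 correlated scale scores from 3 latent factors, then recover 3-dimensional summaries by projecting standardised scores onto the leading eigenvectors. This is a PCA stand-in for the article's exploratory factor analysis, and all data below are simulated:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for 17 scale scores driven by 3 latent factors.
n, n_scales, n_factors = 500, 17, 3
loadings = rng.normal(size=(n_scales, n_factors))
latent = rng.normal(size=(n, n_factors))
scores = latent @ loadings.T + 0.1 * rng.normal(size=(n, n_scales))

# Standardise, then project onto the top three principal components to
# obtain factor-like summary scores (one 3-column summary per person).
Z = (scores - scores.mean(0)) / scores.std(0)
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)      # ascending order
top3 = eigvecs[:, ::-1][:, :3]              # leading eigenvectors
summary = Z @ top3
explained = eigvals[::-1][:3].sum() / eigvals.sum()
# with this construction the 3 components capture nearly all the variance
```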

  14. HVAC system optimisation-in-building section

    Energy Technology Data Exchange (ETDEWEB)

    Lu, L.; Cai, W.; Xie, L.; Li, S.; Soh, Y.C. [School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore (Singapore)

    2004-07-01

    This paper presents a practical method to optimise the in-building section of centralised Heating, Ventilation and Air-Conditioning (HVAC) systems which consist of indoor air loops and chilled water loops. First, through component characteristic analysis, mathematical models associated with cooling loads and energy consumption for heat exchangers and energy consuming devices are established. By considering variation of the cooling load of each end user, an adaptive neuro-fuzzy inference system (ANFIS) is employed to model duct and pipe networks and obtain optimal differential pressure (DP) set points based on limited sensor information. A mixed-integer nonlinear constrained optimization of system energy is formulated and solved by a modified genetic algorithm. The main feature of our paper is a systematic approach in optimizing the overall system energy consumption rather than that of individual components. A simulation study for a typical centralized HVAC system is provided to compare the proposed optimisation method with traditional ones. The results show that the proposed method indeed improves the system performance significantly. (author)
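
    The final genetic-algorithm step can be sketched as follows. The two-variable energy surrogate and its coefficients are invented for illustration; in the paper the objective comes from the ANFIS network models:

```python
import numpy as np

def energy(x):
    # Hypothetical plant-energy surrogate over two differential-pressure
    # set points (kPa); a real model comes from component characteristics.
    dp1, dp2 = x
    return (dp1 - 120) ** 2 + (dp2 - 80) ** 2 + 0.1 * dp1 * dp2

def genetic_minimise(f, bounds, pop_size=40, gens=100, seed=1):
    """Minimal real-valued GA: truncation selection + Gaussian mutation."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(gens):
        fit = np.apply_along_axis(f, 1, pop)
        parents = pop[np.argsort(fit)[: pop_size // 2]]  # keep best half
        kids = parents[rng.integers(0, len(parents), pop_size - len(parents))]
        kids = kids + rng.normal(0.0, 1.0, kids.shape)   # Gaussian mutation
        pop = np.clip(np.vstack([parents, kids]), lo, hi)
    return pop[np.argmin(np.apply_along_axis(f, 1, pop))]

best = genetic_minimise(energy, bounds=[(50, 200), (30, 150)])
# the analytic minimiser of this surrogate is near (116.3, 74.2)
```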

  15. Optimisation of milling parameters using neural network

    Directory of Open Access Journals (Sweden)

    Lipski Jerzy

    2017-01-01

    Full Text Available The purpose of this study was to design and test intelligent computer software developed to increase the average productivity of milling without compromising the design features of the final product. The developed system generates optimal milling parameters based on the extent of tool wear. The introduced optimisation algorithm employs a multilayer model of a milling process developed in the artificial neural network. The input parameters for model training are the following: cutting speed vc, feed per tooth fz and the degree of tool wear measured by means of localised flank wear (VB3). The output parameter is the surface roughness of a machined surface Ra. Since the model in the neural network exhibits good approximation of functional relationships, it was applied to determine optimal milling parameters in changeable tool wear conditions (VB3) and stabilisation of surface roughness parameter Ra. Our solution enables constant control over surface roughness parameters and productivity of the milling process after each assessment of tool condition. The recommended parameters, i.e. those which, applied in milling, ensure the desired surface roughness and maximal productivity, are selected from all the parameters generated by the model. The developed software may constitute an expert system supporting a milling machine operator. In addition, the application may be installed on a mobile device (smartphone), connected to a tool wear diagnostics instrument and the machine tool controller in order to supply updated optimal parameters of milling. The presented solution facilitates tool-life optimisation and reduces tool-change costs, particularly during prolonged operation.
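
    The selection logic described above, querying the roughness model and keeping the most productive parameters that still meet the roughness target, can be sketched as follows. The closed-form `ra_model` is an invented stand-in for the trained neural network, and the grids and limits are illustrative:

```python
import numpy as np

def ra_model(vc, fz, vb):
    # Hypothetical stand-in for the trained neural model: roughness grows
    # with feed per tooth and flank wear, falls mildly with cutting speed.
    return 0.8 + 4.0 * fz + 2.5 * vb - 0.002 * vc

def best_parameters(vb, ra_limit=1.6):
    """Among a parameter grid, keep combinations meeting the roughness
    limit and return the one maximising a productivity proxy (vc * fz)."""
    vc_grid = np.linspace(100, 300, 21)     # cutting speed, m/min
    fz_grid = np.linspace(0.05, 0.25, 21)   # feed per tooth, mm/tooth
    best, best_prod = None, -np.inf
    for vc in vc_grid:
        for fz in fz_grid:
            if ra_model(vc, fz, vb) <= ra_limit and vc * fz > best_prod:
                best, best_prod = (vc, fz), vc * fz
    return best

params = best_parameters(vb=0.1)   # fresh tool: full feed is feasible
worn = best_parameters(vb=0.3)     # worn tool: feed must drop to hold Ra
```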

  16. Noise aspects at aerodynamic blade optimisation projects

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G. [Netherlands Energy Research Foundation, Petten (Netherlands)

    1997-12-31

    This paper shows an example of an aerodynamic blade optimisation, using the program PVOPT. PVOPT calculates the optimal wind turbine blade geometry such that the maximum energy yield is obtained. Using the aerodynamic optimal blade design as a basis, the possibilities of noise reduction are investigated. The aerodynamically optimised geometry from PVOPT is the 'real' optimum (up to the latest decimal). The most important conclusion from this study is that it is worthwhile to investigate the behaviour of the objective function (in the present case the energy yield) around the optimum: if the optimum is flat, there is a possibility to apply modifications to the optimum configuration with only a limited loss in energy yield. It is obvious that the modified configurations emit a different (and possibly lower) noise level. In the BLADOPT program (the successor of PVOPT) it will be possible to quantify the noise level and hence to assess the reduced noise emission more thoroughly. At present the most promising approaches for noise reduction are believed to be a reduction of the rotor speed (if at all possible), and a reduction of the tip angle by means of low-lift profiles, or decreased twist at the outboard stations. These modifications were possible without a significant loss in energy yield. (LN)

  17. The Psychometric Toolbox: An Excel Package for Use in Measurement and Psychometrics Courses

    Science.gov (United States)

    Ferrando, Pere J.; Masip-Cabrera, Antoni; Navarro-González, David; Lorenzo-Seva, Urbano

    2017-01-01

    The Psychometric Toolbox (PT) is a user-friendly, non-commercial package mainly intended to be used for instructional purposes in introductory courses of educational and psychological measurement, psychometrics and statistics. The PT package is organized in six separate modules or sub-programs: Data preprocessor (descriptive analyses and data…

  18. 40 CFR 141.717 - Pre-filtration treatment toolbox components.

    Science.gov (United States)

    2010-07-01

    ... surface water or GWUDI source. (c) Bank filtration. Systems receive Cryptosporidium treatment credit for... paragraph. Systems using bank filtration when they begin source water monitoring under § 141.701(a) must... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Pre-filtration treatment toolbox...

  19. A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction

    Science.gov (United States)

    Abulnaga, S. Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M.; Onyike, Chiadi U.; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers with a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We show a few case studies highlighting the utility of the toolbox and compare the analysis results with the literature.
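
    The PCA-then-regression pipeline sketched in the abstract can be reproduced on synthetic data: reduce landmark shapes with PCA, regress the scores on a functional measure, sample the regression line, and map sampled points back to landmark space as synthetic shapes. Everything below (subject count, landmark dimension, severity scores) is simulated, not the toolbox's data:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic stand-in: 40 subjects, 30 landmark coordinates per lobule shape,
# with shape variation driven by a scalar functional "severity" score.
n, d = 40, 30
direction = rng.normal(size=d)
severity = rng.uniform(0, 1, n)
shapes = np.outer(severity, direction) + 0.05 * rng.normal(size=(n, d))

# PCA for dimension reduction.
mean = shapes.mean(0)
U, S, Vt = np.linalg.svd(shapes - mean, full_matrices=False)
pcs = (shapes - mean) @ Vt[:5].T            # 5-dimensional shape scores

# Regress PC scores on the functional measure, then sample the regression
# line and reconstruct synthetic landmark shapes from the sampled points.
A = np.c_[severity, np.ones(n)]
coef, *_ = np.linalg.lstsq(A, pcs, rcond=None)
samples = np.c_[np.linspace(0, 1, 5), np.ones(5)] @ coef
synthetic = samples @ Vt[:5] + mean         # 5 shapes along the line
```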


  1. Par@Graph - a parallel toolbox for the construction and analysis of large complex climate networks

    NARCIS (Netherlands)

    Tantet, A.J.J.

    2015-01-01

    In this paper, we present Par@Graph, a software toolbox to reconstruct and analyze complex climate networks having a large number of nodes (up to at least 10^6) and edges (up to at least 10^12). The key innovation is an efficient set of parallel software tools designed to leverage the inherited hybrid

  2. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    Science.gov (United States)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

    Ladar system simulation uses computer modelling of ladar components to predict the performance of the ladar system. This paper reviews developments in laser imaging radar simulation in domestic and overseas studies, covering computer simulation of ladar systems for different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research in imaging ladar simulation has so far been limited in scale and non-unified in design, mostly achieving simple functional simulation based on ranging equations for ladar systems. A laser imaging radar simulation with an open and modularized structure is therefore proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings and system controller. A unified Matlab toolbox and standard control modules have been built with regulated inputs and outputs of the functions, and defined communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle was performed with the toolbox. The simulation result shows that the models and parameter settings of the Matlab toolbox are able to simulate the actual detection process precisely. The unified control module and pre-defined parameter settings simplify the simulation of imaging ladar detection, and the open, modular structure enables the toolbox to be modified for specialized requirements.
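The modular architecture proposed above (emitter, atmosphere, target and receiver modules chained by a system controller) can be illustrated with a toy pipeline. The real toolbox is MATLAB; here the physics is reduced to a bare-bones ladar energy budget plus round-trip time of flight, and all numerical parameters are invented.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def emitter(pulse_energy_j):
    """Laser emitter module: produce a pulse with the given energy."""
    return {"energy": pulse_energy_j}

def atmosphere(signal, range_m, attenuation_per_km=0.2):
    """Two-way exponential atmospheric attenuation (invented coefficient)."""
    loss = math.exp(-2.0 * attenuation_per_km * range_m / 1000.0)
    return {"energy": signal["energy"] * loss}

def target(signal, reflectivity=0.3):
    """Target module: diffuse reflection with an invented reflectivity."""
    return {"energy": signal["energy"] * reflectivity}

def receiver(signal, range_m, aperture_area_m2=0.01):
    """Receiver module: inverse-square spreading back onto the aperture,
    plus the round-trip time of flight."""
    spread = aperture_area_m2 / (4.0 * math.pi * range_m ** 2)
    return signal["energy"] * spread, 2.0 * range_m / C

def simulate(range_m):
    """System controller: chain the modules, as in the proposed design."""
    s = emitter(1e-3)
    s = atmosphere(s, range_m)
    s = target(s)
    return receiver(s, range_m)

e1, t1 = simulate(1000.0)
e2, t2 = simulate(2000.0)
```

Because each stage only consumes and produces the same signal structure, any module (e.g. a different atmosphere model) can be swapped without touching the rest, which is the point of the modularized design.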

  3. MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.

    Science.gov (United States)

    Elliott, Mark T; Welchman, Andrew E; Wing, Alan M

    2009-02-15

    Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy, the capture and automatic storage of corresponding participant responses and an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox and so can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.
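The central quantity in such synchronisation experiments is the cue-to-tap asynchrony. A minimal stand-in for the analysis module's summary statistics (mean and standard deviation of asynchronies), with simulated tap times in place of captured responses, might look like this (MatTAP itself is MATLAB):

```python
# Metronome cues at 500 ms intervals; simulated taps showing the usual
# negative mean asynchrony (taps precede cues by ~30 ms) plus jitter.
cues = [0.5 * i for i in range(20)]
taps = [c - 0.03 + 0.005 * ((2 * i) % 5 - 2) for i, c in enumerate(cues)]

def asynchronies(cues, taps):
    """Signed cue-to-tap asynchronies (negative = tap before cue)."""
    return [t - c for c, t in zip(cues, taps)]

def summarise(asyncs):
    """Mean and sample standard deviation of the asynchronies."""
    n = len(asyncs)
    mean = sum(asyncs) / n
    sd = (sum((a - mean) ** 2 for a in asyncs) / (n - 1)) ** 0.5
    return mean, sd

asyncs = asynchronies(cues, taps)
mean_async, sd_async = summarise(asyncs)
```

The real toolbox additionally has to pair each response with the correct cue and reject spurious responses; pairing is trivial here because the simulated data are already aligned.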

  4. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Caroline Looms, Majken

    2013-01-01

    on the solution. The combined states of information (i.e. the solution to the inverse problem) is a probability density function typically referred to as the a posteriori probability density function. We present a generic toolbox for Matlab and Gnu Octave called SIPPI that implements a number of methods...

  5. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Looms, Majken Caroline

    2013-01-01

    We present an application of the SIPPI Matlab toolbox, to obtain a sample from the a posteriori probability density function for the classical tomographic inversion problem. We consider a number of different forward models, linear and non-linear, such as ray based forward models that rely...
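The sampling idea behind SIPPI can be sketched for the simplest case: a linear forward model with Gaussian noise and a Gaussian prior, explored with a random-walk Metropolis sampler whose stationary distribution is the a posteriori probability density. SIPPI itself is a MATLAB/Octave toolbox supporting much richer priors and non-linear forward models; everything below is an invented toy stand-in.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear forward model (e.g. ray-based travel times): d = G m + noise.
G = rng.normal(size=(8, 4))
m_true = rng.normal(size=4)
d_obs = G @ m_true + 0.05 * rng.normal(size=8)
sigma_d, sigma_m = 0.05, 1.0          # noise and prior standard deviations

def log_posterior(m):
    """Log of the a posteriori density: Gaussian likelihood x Gaussian prior."""
    misfit = d_obs - G @ m
    return -0.5 * (misfit @ misfit) / sigma_d**2 - 0.5 * (m @ m) / sigma_m**2

# Random-walk Metropolis sampler of the posterior.
m = np.zeros(4)
lp = log_posterior(m)
samples = []
for _ in range(4000):
    proposal = m + 0.05 * rng.normal(size=4)
    lp_prop = log_posterior(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        m, lp = proposal, lp_prop
    samples.append(m.copy())

post_mean = np.mean(samples[1000:], axis=0)    # discard burn-in
```

The collection of retained samples, not just the mean, is the point: it characterises the full posterior, which is what SIPPI delivers for the tomographic problem.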

  6. An integrated GIS-MARKAL toolbox for designing a CO2 infrastructure network in the Netherlands

    NARCIS (Netherlands)

    van den Broek, M.A.; Brederode, E.; Ramirez, C.A.; Kramers, K.; van der Kuip, M.; Wildenborg, T.; Faaij, A.P.C.; Turkenburg, W.C.

    2009-01-01

    Large-scale implementation of carbon capture and storage needs a whole new infrastructure to transport and store CO2. Tools that can support planning and designing of such infrastructure require incorporation of both temporal and spatial aspects. Therefore, a toolbox that integrates ArcGIS, a

  7. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    Science.gov (United States)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, starting from early steps in model development and ending at the quality control of model based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged it is often not among the most popular tasks in ocean modelling. In order to ease the validation work a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time-series at stations, vertical profiles, surface fields or along track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, features all parts of the validation process - ranging from read-in procedures of datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. read-in procedures, forms a module in which all available functions of this particular part are collected. The interface between the functions, the module and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks, new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned for each validation task in user specific settings, which are externally stored in so-called namelists and gather all information of the used datasets as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre. Hereby the performance of any new product
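At its core, each validation task reduces to computing comparison statistics between co-located model and observation values. A minimal stand-alone version of typical metrics (bias, RMSE, correlation coefficient), with made-up station values, might be (the actual toolbox is MATLAB and far more general):

```python
import math

def validate(model, obs):
    """Bias, RMSE and Pearson correlation between model and observations."""
    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mm, mo = sum(model) / n, sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    sm = math.sqrt(sum((m - mm) ** 2 for m in model))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    return {"bias": bias, "rmse": rmse, "corr": cov / (sm * so)}

# e.g. modelled vs observed temperature at one (fictitious) Baltic station
stats = validate([10.2, 11.0, 12.1, 12.9], [10.0, 11.2, 12.0, 13.1])
```

In the toolbox this computation is one module; the standardized interfaces around it (read-in, pairing of model and observation points, output) are what allow the single program call described above.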

  8. Screening and assessment of chronic pain among children with cerebral palsy: a process evaluation of a pain toolbox.

    Science.gov (United States)

    Orava, Taryn; Provvidenza, Christine; Townley, Ashleigh; Kingsnorth, Shauna

    2018-06-08

    Though high numbers of children with cerebral palsy experience chronic pain, it remains under-recognized. This paper describes an evaluation of implementation supports and adoption of the Chronic Pain Assessment Toolbox for Children with Disabilities (the Toolbox) to enhance pain screening and assessment practices within a pediatric rehabilitation and complex continuing care hospital. A multicomponent knowledge translation strategy facilitated Toolbox adoption, inclusive of a clinical practice guideline, cerebral palsy practice points and assessment tools. Across the hospital, seven ambulatory care clinics with cerebral palsy caseloads participated in a staggered roll-out (Group 1: exclusive CP caseloads, March-December; Group 2: mixed diagnostic caseloads, August-December). Evaluation measures included client electronic medical record audit, document review and healthcare provider survey and interviews. A significant change in documentation of pain screening and assessment practice from pre-Toolbox (<2%) to post-Toolbox adoption (53%) was found. Uptake in Group 2 clinics lagged behind Group 1. Opportunities to use the Toolbox consistently (based on diagnostic caseload) and frequently (based on client appointments) were noted among contextual factors identified. Overall, the Toolbox was positively received and clinically useful. Findings affirm that the Toolbox, in conjunction with the application of integrated knowledge translation principles and an established knowledge translation framework, has potential to be a useful resource to enrich and standardize chronic pain screening and assessment practices among children with cerebral palsy. Implications for Rehabilitation It is important to engage healthcare providers in the conceptualization, development, implementation and evaluation of a knowledge-to-action best practice product. The Chronic Pain Toolbox for Children with Disabilities provides rehabilitation staff with guidance on pain screening and assessment

  9. The 'Toolbox' of strategies for managing Haemonchus contortus in goats: What's in and what's out.

    Science.gov (United States)

    Kearney, P E; Murray, P J; Hoy, J M; Hohenhaus, M; Kotze, A

    2016-04-15

    A dynamic and innovative approach to managing the blood-consuming nematode Haemonchus contortus in goats is critical to breaking dependence on veterinary anthelmintics. H. contortus management strategies have been the subject of intense research for decades, and must be selected to create a tailored, individualized program for goat farms. Through the selection and combination of strategies from the Toolbox, an effective management program for H. contortus can be designed according to the unique conditions of each particular farm. This Toolbox investigates strategies including vaccines, bioactive forages, pasture/grazing management, behavioural management, natural immunity, FAMACHA, refugia and strategic drenching, mineral/vitamin supplementation, copper oxide wire particles (COWPs), breeding and selecting resistant and resilient individuals, biological control and anthelmintic drugs. Barbervax(®), the ground-breaking Haemonchus vaccine developed and currently commercially available on a pilot scale for sheep, is primed for trialling in goats and would be an invaluable inclusion in this Toolbox. The specialised behaviours of goats, specifically their preferences to browse a variety of plants and accompanying physiological adaptations to the consumption of secondary compounds contained in browse, have long been unappreciated and thus overlooked as a valuable, sustainable strategy for Haemonchus management. These strategies are discussed in this review as to their current value for inclusion in the 'Toolbox', and the future implications of ongoing research for goat producers. Combining and manipulating strategies such as browsing behaviour, pasture management, bioactive forages and identifying and treating individual animals for haemonchosis, in addition to continuous evaluation of strategy effectiveness, is conducted using a model farm scenario. Selecting strategies from the Toolbox, with regard to their current availability, feasibility, economical cost

  10. DICOM router: an open source toolbox for communication and correction of DICOM objects.

    Science.gov (United States)

    Hackländer, Thomas; Kleber, Klaus; Martin, Jens; Mertens, Heinrich

    2005-03-01

    Today, the exchange of medical images and clinical information is well defined by the digital imaging and communications in medicine (DICOM) and Health Level Seven (ie, HL7) standards. The interoperability among information systems is specified by the integration profiles of IHE (Integrating the Healthcare Enterprise). However, older imaging modalities frequently do not correctly support these interfaces and integration profiles, and some use cases are not yet specified by IHE. Therefore, corrections of DICOM objects are necessary to establish conformity. The aim of this project was to develop a toolbox that can automatically perform these recurrent corrections of the DICOM objects. The toolbox is composed of three main components: 1) a receiver to receive DICOM objects, 2) a processing pipeline to correct each object, and 3) one or more senders to forward each corrected object to predefined addressees. The toolbox is implemented under Java as an open source project. The processing pipeline is realized by means of plug ins. One of the plug ins can be programmed by the user via an external eXtensible Stylesheet Language (ie, XSL) file. Using this plug in, DICOM objects can also be converted into eXtensible Markup Language (ie, XML) documents or other data formats. DICOM storage services, DICOM CD-ROMs, and the local file system are defined as input and output channel. The toolbox is used clinically for different application areas. These are the automatic correction of DICOM objects from non-IHE-conforming modalities, the import of DICOM CD-ROMs into the picture archiving and communication system and the pseudo naming of DICOM images. The toolbox has been accepted by users in a clinical setting. Because of the open programming interfaces, the functionality can easily be adapted to future applications.
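The three-component design (receiver, processing pipeline of plug-ins, senders) is easy to model. The sketch below uses plain dictionaries in place of real DICOM objects and two invented correction rules, so tag names and defaults are illustrative only; the actual toolbox is implemented in Java on top of real DICOM services.

```python
# Each "DICOM object" is modelled as a plain dict of tag name -> value;
# plugins are callables object -> object, chained in a processing pipeline.

def fix_missing_modality(obj):
    """Invented correction rule: supply a default Modality if absent."""
    obj.setdefault("Modality", "OT")
    return obj

def pseudonymise(obj):
    """Invented pseudo-naming rule: blank out the patient name."""
    if "PatientName" in obj:
        obj["PatientName"] = "ANON"
    return obj

class Router:
    """Receiver -> plugin pipeline -> one or more senders."""

    def __init__(self, plugins, senders):
        self.plugins = plugins
        self.senders = senders

    def receive(self, obj):
        for plugin in self.plugins:       # correction pipeline
            obj = plugin(obj)
        for sender in self.senders:       # forward to addressees
            sender(obj)

outbox = []                               # stand-in for a DICOM send
router = Router([fix_missing_modality, pseudonymise], [outbox.append])
router.receive({"PatientName": "DOE^JOHN", "StudyDate": "20050301"})
```

The plug-in interface is the extension point: the toolbox's XSL-programmable plug-in occupies the same slot as the two hard-coded rules here.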

  11. Optimising polarised neutron scattering measurements--XYZ and polarimetry analysis

    International Nuclear Information System (INIS)

    Cussen, L.D.; Goossens, D.J.

    2002-01-01

    The analytic optimisation of neutron scattering measurements made using XYZ polarisation analysis and neutron polarimetry techniques is discussed. Expressions for the 'quality factor' and the optimum division of counting time for the XYZ technique are presented. For neutron polarimetry the optimisation is identified as analogous to that for measuring the flipping ratio and reference is made to the results already in the literature

  12. Optimising polarised neutron scattering measurements--XYZ and polarimetry analysis

    CERN Document Server

    Cussen, L D

    2002-01-01

    The analytic optimisation of neutron scattering measurements made using XYZ polarisation analysis and neutron polarimetry techniques is discussed. Expressions for the 'quality factor' and the optimum division of counting time for the XYZ technique are presented. For neutron polarimetry the optimisation is identified as analogous to that for measuring the flipping ratio and reference is made to the results already in the literature.
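The abstracts do not reproduce the quality-factor expressions, but the flavour of the counting-time optimisation can be seen in the standard Poisson-counting result for a flipping-ratio measurement: total time is divided between the two spin states in inverse proportion to the square roots of the count rates. A sketch with invented rates:

```python
import math

def optimal_split(rate_plus, rate_minus, total_time):
    """Divide counting time between the two spin states to minimise the
    relative variance of the flipping ratio R = N+/N-.

    For Poisson counts, var(R)/R^2 = 1/(r+ t+) + 1/(r- t-); minimising
    subject to t+ + t- = T gives t+ : t- = 1/sqrt(r+) : 1/sqrt(r-).
    """
    w_plus = 1.0 / math.sqrt(rate_plus)
    w_minus = 1.0 / math.sqrt(rate_minus)
    t_plus = total_time * w_plus / (w_plus + w_minus)
    return t_plus, total_time - t_plus

def rel_variance(rate_plus, rate_minus, t_plus, t_minus):
    """Relative variance of the flipping ratio for a given time split."""
    return 1.0 / (rate_plus * t_plus) + 1.0 / (rate_minus * t_minus)

# Invented rates: 100 counts/s in one channel, 25 in the other,
# one hour of total counting time.
t_p, t_m = optimal_split(100.0, 25.0, 3600.0)
```

More time goes to the weaker channel (2400 s versus 1200 s here), and the optimal split always beats an equal split; the papers' quality factors generalise this trade-off to the full XYZ measurement scheme.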

  13. Application of ant colony optimisation in distribution transformer sizing

    African Journals Online (AJOL)

    This study proposes an optimisation method for transformer sizing in power systems using ant colony optimisation, with verification of the process in MATLAB software. The aim is to address the issue of transformer sizing, which is a major challenge affecting its effective performance, longevity, huge capital cost and power ...
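The snippet above does not spell out the method, so the following is a generic ant-colony selection loop applied to an invented toy version of the sizing problem: candidate kVA ratings, a load to serve, pheromone reinforcement of feasible low-cost choices, and evaporation. All ratings, costs and parameters are made up for illustration.

```python
import random

random.seed(3)

# Invented candidate ratings (kVA) and load; undersizing is infeasible
# and cost otherwise grows with the rating.
ratings = [100, 160, 250, 315, 400, 500]
load_kva = 290.0

def cost(rating):
    return float("inf") if rating < load_kva else float(rating)

pheromone = [1.0] * len(ratings)
best, best_cost = None, float("inf")
for _iteration in range(50):
    for _ant in range(10):
        # roulette-wheel selection proportional to pheromone
        x, acc, idx = random.uniform(0.0, sum(pheromone)), 0.0, 0
        for i, p in enumerate(pheromone):
            acc += p
            if x <= acc:
                idx = i
                break
        c = cost(ratings[idx])
        if c < best_cost:
            best, best_cost = ratings[idx], c
        if c != float("inf"):
            pheromone[idx] += 100.0 / c   # reinforce feasible, cheap choices
    pheromone = [0.9 * p for p in pheromone]  # evaporation
```

The pheromone trail concentrates on the smallest rating that can carry the load; a realistic cost function would also account for losses, loading profile and lifetime.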

  14. Multi-objective evolutionary optimisation for product design and manufacturing

    CERN Document Server

    2011-01-01

    Presents state-of-the-art research in the area of multi-objective evolutionary optimisation for integrated product design and manufacturing. Provides a comprehensive review of the literature. Gives in-depth descriptions of recently developed innovative and novel methodologies, algorithms and systems in the area of modelling, simulation and optimisation.

  15. Design Optimisation and Control of a Pilot Operated Seat Valve

    DEFF Research Database (Denmark)

    Nielsen, Brian; Andersen, Torben Ole; Hansen, Michael Rygaard

    2004-01-01

    The paper gives an approach for optimisation of the bandwidth of a pilot operated seat valve for mobile applications. Physical dimensions as well as parameters of the implemented control loop are optimised simultaneously. The frequency response of the valve varies as a function of the pressure drop...

  16. DACIA LOGAN LIVE AXLE OPTIMISATION USING COMPUTER GRAPHICS

    Directory of Open Access Journals (Sweden)

    KIRALY Andrei

    2017-05-01

    The paper presents some contributions to the calculus and optimisation of a live axle used on the Dacia Logan, using computer graphics software to create the model and FEA evaluation to determine the effectiveness of the optimisation. Using specialized computer software, a simulation was made and the results were compared to measurements of the real prototype.

  17. Spatial-structural interaction and strain energy structural optimisation

    NARCIS (Netherlands)

    Hofmeyer, H.; Davila Delgado, J.M.; Borrmann, A.; Geyer, P.; Rafiq, Y.; Wilde, de P.

    2012-01-01

    A research engine iteratively transforms spatial designs into structural designs and vice versa. Furthermore, spatial and structural designs are optimised. It is suggested to optimise a structural design by evaluating the strain energy of its elements and by then removing, adding, or changing the
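A minimal sketch of the strain-energy criterion: given the strain energy carried by each element, the least-loaded element is removed while its contribution falls below a threshold share of the current average. The threshold and energies are invented, and the actual research engine also adds and changes elements and re-analyses the structure after every change.

```python
def optimise_structure(strain_energy, threshold=0.2):
    """Repeatedly drop the least-loaded element while its strain energy
    lies below `threshold` times the current average (invented criterion)."""
    elements = dict(strain_energy)
    while len(elements) > 1:
        avg = sum(elements.values()) / len(elements)
        name, energy = min(elements.items(), key=lambda kv: kv[1])
        if energy >= threshold * avg:
            break
        del elements[name]          # a real engine would re-analyse here
    return elements

# element -> strain energy from a (fictitious) structural analysis
kept = optimise_structure(
    {"e1": 10.0, "e2": 0.2, "e3": 8.0, "e4": 0.1, "e5": 5.0}
)
```

Here the two nearly unloaded elements are removed and the load-carrying ones survive; in the full method this interacts with the spatial design on every iteration.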

  18. Adjoint Optimisation of the Turbulent Flow in an Annular Diffuser

    DEFF Research Database (Denmark)

    Gotfredsen, Erik; Agular Knudsen, Christian; Kunoy, Jens Dahl

    2017-01-01

    In the present study, a numerical optimisation of guide vanes in an annular diffuser is performed. The optimisation is performed to improve the following two parameters simultaneously; the first parameter is the uniformity perpendicular to the flow direction, a 1/3 diameter do...

  19. Optimising of Steel Fiber Reinforced Concrete Mix Design | Beddar ...

    African Journals Online (AJOL)

    Optimising of Steel Fiber Reinforced Concrete Mix Design. ... as a result of the loss of mixture workability that will be translated into a difficult concrete casting in site. ... An experimental study of an optimisation method of fibres in reinforced ...

  20. GAOS: Spatial optimisation of crop and nature within agricultural fields

    NARCIS (Netherlands)

    Bruin, de S.; Janssen, H.; Klompe, A.; Lerink, P.; Vanmeulebrouk, B.

    2010-01-01

    This paper proposes and demonstrates a spatial optimiser that allocates areas of inefficient machine manoeuvring to field margins thus improving the use of available space and supporting map-based Controlled Traffic Farming. A prototype web service (GAOS) allows farmers to optimise tracks within

  1. Comparison of the genetic algorithm and incremental optimisation routines for a Bayesian inverse modelling based network design

    Science.gov (United States)

    Nickless, A.; Rayner, P. J.; Erni, B.; Scholes, R. J.

    2018-05-01

    The design of an optimal network of atmospheric monitoring stations for the observation of carbon dioxide (CO2) concentrations can be obtained by applying an optimisation algorithm to a cost function based on minimising posterior uncertainty in the CO2 fluxes obtained from a Bayesian inverse modelling solution. Two candidate optimisation methods assessed were the evolutionary algorithm: the genetic algorithm (GA), and the deterministic algorithm: the incremental optimisation (IO) routine. This paper assessed the ability of the IO routine in comparison to the more computationally demanding GA routine to optimise the placement of a five-member network of CO2 monitoring sites located in South Africa. The comparison considered the reduction in uncertainty of the overall flux estimate, the spatial similarity of solutions, and computational requirements. Although the IO routine failed to find the solution with the global maximum uncertainty reduction, the resulting solution had only fractionally lower uncertainty reduction compared with the GA, and at only a quarter of the computational resources used by the lowest specified GA algorithm. The GA solution set showed more inconsistency if the number of iterations or population size was small, and more so for a complex prior flux covariance matrix. If the GA completed with a sub-optimal solution, these solutions were similar in fitness to the best available solution. Two additional scenarios were considered, with the objective of creating circumstances where the GA may outperform the IO. The first scenario considered an established network, where the optimisation was required to add an additional five stations to an existing five-member network. In the second scenario the optimisation was based only on the uncertainty reduction within a subregion of the domain. The GA was able to find a better solution than the IO under both scenarios, but with only a marginal improvement in the uncertainty reduction. These results suggest
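The incremental optimisation routine is essentially greedy: stations are added one at a time, each chosen to give the largest reduction in posterior uncertainty (here, the trace of the posterior covariance from a rank-one Bayesian update). A toy version with an invented linear observation model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented setting: 6 flux regions with a diagonal prior covariance and
# 8 candidate stations, each observing one noisy linear combination.
P0 = np.diag(rng.uniform(1.0, 3.0, size=6))
H = rng.normal(size=(8, 6))
r = 0.5                                   # observation noise variance

def update(P, h):
    """Rank-one Kalman/Bayesian update for one scalar observation."""
    s = h @ P @ h + r
    K = (P @ h) / s
    return P - np.outer(K, h @ P)

def incremental(n_pick):
    """Greedy IO routine: add the station that most reduces tr(P)."""
    P, chosen = P0.copy(), []
    for _ in range(n_pick):
        best_i, best_P, best_tr = None, None, np.inf
        for i in range(len(H)):
            if i in chosen:
                continue
            Pi = update(P, H[i])
            if np.trace(Pi) < best_tr:
                best_i, best_P, best_tr = i, Pi, np.trace(Pi)
        chosen.append(best_i)
        P = best_P
    return chosen, np.trace(P)

network, post_trace = incremental(5)
```

A GA would instead evolve whole five-station candidate networks; the greedy routine evaluates far fewer networks, which is exactly the cost/quality trade-off the paper quantifies.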

  2. Active vibration reduction of a flexible structure bonded with optimised piezoelectric pairs using half and quarter chromosomes in genetic algorithms

    International Nuclear Information System (INIS)

    Daraji, A H; Hale, J M

    2012-01-01

    The optimal placement of sensors and actuators in active vibration control is limited by the number of candidates in the search space. The search space of a small structure discretized to one hundred elements for optimising the location of ten actuators gives 1.73 × 10^13 possible solutions, one of which is the global optimum. In this work, a new quarter and half chromosome technique based on symmetry is developed, by which the search space for optimisation of sensor/actuator locations in active vibration control of flexible structures may be greatly reduced. The technique is applied to the optimisation of eight and ten actuators located on a 500 × 500 mm square plate, in which the search space is reduced by up to 99.99%. The technique also allows the genetic algorithm program to update natural frequencies and mode shapes in each generation, finding the global optimal solution in a greatly reduced number of generations. An isotropic plate with piezoelectric sensor/actuator pairs bonded to its surface was investigated using the finite element method and Hamilton's principle based on first order shear deformation theory. The placement and feedback gain of ten and eight sensor/actuator pairs was optimised for a cantilever and clamped-clamped plate to attenuate the first six modes of vibration, using minimization of a linear quadratic index as an objective function.
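The search-space arithmetic in the abstract is easy to reproduce, and the symmetry encoding can be sketched directly: a "half chromosome" lists element indices on one half of the plate, and each gene is mirrored about the mid-line to give the full symmetric placement. The 10 × 10 discretisation matches the abstract's numbers, but the gene values and mirroring convention below are illustrative.

```python
from math import comb

# A plate discretised into a 10 x 10 grid; actuators are placed
# symmetrically about the vertical mid-line, so only the 50 left-half
# elements need to be encoded ("half chromosome", 5 genes for 10 actuators).
FULL, HALF, N_ACT = 100, 50, 10

full_space = comb(FULL, N_ACT)        # ~1.73e13, as quoted in the abstract
half_space = comb(HALF, N_ACT // 2)   # symmetric encoding

def mirror(half_genes, cols=10):
    """Expand half-chromosome genes into the full symmetric placement."""
    full = []
    for g in half_genes:
        row, col = divmod(g, cols // 2)
        full.append(row * cols + col)               # left-half element
        full.append(row * cols + (cols - 1 - col))  # mirrored element
    return sorted(full)

placement = mirror([0, 7, 22, 33, 48])
reduction = 1.0 - half_space / full_space
```

`comb(100, 10)` reproduces the 1.73 × 10^13 figure, and the symmetric encoding shrinks the space by far more than 99.99%, which is where the claimed speed-up comes from.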

  3. EasyCloneYALI: CRISPR/Cas9-based synthetic toolbox for engineering of the yeast Yarrowia lipolytica

    DEFF Research Database (Denmark)

    Holkenbrink, Carina; Dam, Marie Inger; Kildegaard, Kanchana Rueksomtawin

    2018-01-01

    Here, we present the EasyCloneYALI genetic toolbox, which allows streamlined strain construction with high genome editing efficiencies in Y. lipolytica via the CRISPR/Cas9 technology. The toolbox allows marker-free integration of gene expression vectors into characterized genome sites as well as marker-free deletion of genes with the help of CRISPR/Cas9. Genome editing efficiencies above 80% were achieved with transformation protocols using non-replicating DNA repair fragments (such as DNA oligos). Furthermore, the toolbox includes a set of integrative gene expression vectors with prototrophic markers...

  4. Results of the 2010 IGSC Topical Session on Optimisation

    International Nuclear Information System (INIS)

    Bailey, Lucy

    2014-01-01

    Document available in abstract form only. Full text follows: The 2010 IGSC topical session on optimisation explored a wide range of issues concerning optimisation throughout the radioactive waste management process. Philosophical and ethical questions were discussed, such as: - To what extent is the process of optimisation more important than the end result? - How do we balance long-term environmental safety with near-term operational safety? - For how long should options be kept open? - In balancing safety and excessive cost, when is BAT achieved and who decides on this? - How should we balance the needs of current society with those of future generations? It was clear that optimisation is about getting the right balance between a range of issues that cover: radiation protection, environmental protection, operational safety, operational requirements, social expectations and cost. The optimisation process will also need to respect various constraints, which are likely to include: regulatory requirements, site restrictions, community-imposed requirements or restrictions and resource constraints. These issues were explored through a number of presentations that discussed practical cases of optimisation occurring at different stages of international radioactive waste management programmes. These covered: - Operations and decommissioning - management of large disused components, from the findings of an international study, presented by WPDD; - Concept option selection, prior to site selection - upstream and disposal system optioneering in the UK; - Siting decisions - examples from both Germany and France, explaining how optimisation is being used to support site comparisons and communicate siting decisions; - Repository design decisions - comparison of KBS-3 horizontal and vertical deposition options in Finland; and - On-going optimisation during repository operation - operational experience from WIPP in the US. The variety of the remarks and views expressed during the

  5. Work management to optimise occupational radiological protection

    International Nuclear Information System (INIS)

    Ahier, B.

    2009-01-01

    Although work management is no longer a new concept, continued efforts are still needed to ensure that good performance, outcomes and trends are maintained in the face of current and future challenges. The ISOE programme thus created an Expert Group on Work Management in 2007 to develop an updated report reflecting the current state of knowledge, technology and experience in the occupational radiological protection of workers at nuclear power plants. Published in 2009, the new ISOE report on Work Management to Optimise Occupational Radiological Protection in the Nuclear Power Industry provides up-to-date practical guidance on the application of work management principles. Work management measures aim at optimising occupational radiological protection in the context of the economic viability of the installation. Important factors in this respect are measures and techniques influencing i) dose and dose rate, including source-term reduction; ii) exposure, including amount of time spent in controlled areas for operations; and iii) efficiency in short- and long-term planning, worker involvement, coordination and training. Equally important due to their broad, cross-cutting nature are the motivational and organisational arrangements adopted. The responsibility for these aspects may reside in various parts of an installation's organisational structure, and thus, a multi-disciplinary approach must be recognised, accounted for and well-integrated in any work. Based on the operational experience within the ISOE programme, the following key areas of work management have been identified: - regulatory aspects; - ALARA management policy; - worker involvement and performance; - work planning and scheduling; - work preparation; - work implementation; - work assessment and feedback; - ensuring continuous improvement. The details of each of these areas are elaborated and illustrated in the report through examples and case studies arising from ISOE experience. They are intended to

  6. A comparison of forward planning and optimised inverse planning

    International Nuclear Information System (INIS)

    Oldham, Mark; Neal, Anthony; Webb, Steve

    1995-01-01

    A radiotherapy treatment plan optimisation algorithm has been applied to 48 prostate plans and the results compared with those of an experienced human planner. Twelve patients were used in the study, and a 3, 4, 6 and 8 field plan (with standard coplanar beam angles for each plan type) were optimised by both the human planner and the optimisation algorithm. The human planner 'optimised' the plan by conventional forward planning techniques. The optimisation algorithm was based on fast-simulated-annealing. 'Importance factors' assigned to different regions of the patient provide a method for controlling the algorithm, and it was found that the same values gave good results for almost all plans. The plans were compared on the basis of dose statistics and normal-tissue-complication-probability (NTCP) and tumour-control-probability (TCP). The results show that the optimisation algorithm yielded results that were at least as good as the human planner for all plan types, and on the whole slightly better. A study of the beam-weights chosen by the optimisation algorithm and the planner will be presented. The optimisation algorithm showed greater variation, in response to individual patient geometry. For simple (e.g. 3 field) plans it was found to consistently achieve slightly higher TCP and lower NTCP values. For more complicated (e.g. 8 fields) plans the optimisation also achieved slightly better results with generally fewer beams. The optimisation time was always ≤5 minutes; a factor of up to 20 times faster than the human planner.
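The optimisation approach described (fast simulated annealing over beam weights, steered by per-region importance factors) can be caricatured with a linear toy dose model: three regions, four beams, and a weighted quadratic objective. The dose matrix, importance factors and annealing schedule are all invented.

```python
import math
import random

random.seed(5)

# Invented linear dose model: dose in each region is A @ w for beam weights w.
A = [
    [1.0, 0.9, 0.8, 1.1],   # tumour
    [0.5, 0.1, 0.4, 0.2],   # organ at risk
    [0.3, 0.3, 0.2, 0.3],   # other normal tissue
]
target = [60.0, 0.0, 0.0]       # prescribed dose per region
importance = [1.0, 0.6, 0.3]    # importance factors per region

def objective(w):
    cost = 0.0
    for row, t, imp in zip(A, target, importance):
        dose = sum(a * x for a, x in zip(row, w))
        cost += imp * (dose - t) ** 2
    return cost

# Simulated-annealing search over non-negative beam weights.
w = [10.0] * 4
cur = objective(w)
best_w, best = w[:], cur
T = 50.0
for _ in range(5000):
    cand = w[:]
    i = random.randrange(4)
    cand[i] = max(0.0, cand[i] + random.gauss(0.0, 1.0))
    c = objective(cand)
    if c < cur or random.random() < math.exp((cur - c) / T):
        w, cur = cand, c
        if c < best:
            best_w, best = cand[:], c
    T *= 0.999                  # cooling schedule
```

Tuning the importance factors shifts the compromise between tumour coverage and sparing, which mirrors the paper's finding that a single set of factors worked for almost all plans.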

  7. Automatic optimisation of gamma dose rate sensor networks: The DETECT Optimisation Tool

    DEFF Research Database (Denmark)

    Helle, K.B.; Müller, T.O.; Astrup, Poul

    2014-01-01

    of the EU FP 7 project DETECT. It evaluates the gamma dose rates that a proposed set of sensors might measure in an emergency and uses this information to optimise the sensor locations. The gamma dose rates are taken from a comprehensive library of simulations of atmospheric radioactive plumes from 64......Fast delivery of comprehensive information on the radiological situation is essential for decision-making in nuclear emergencies. Most national radiological agencies in Europe employ gamma dose rate sensor networks to monitor radioactive pollution of the atmosphere. Sensor locations were often...... source locations. These simulations cover the whole European Union, so the DOT allows evaluation and optimisation of sensor networks for all EU countries, as well as evaluation of fencing sensors around possible sources. Users can choose from seven cost functions to evaluate the capability of a given...

  8. VEHICLE DRIVING CYCLE OPTIMISATION ON THE HIGHWAY

    Directory of Open Access Journals (Sweden)

    Zinoviy STOTSKO

    2016-06-01

    Full Text Available This paper is devoted to the problem of reducing vehicle energy consumption. The authors consider the optimisation of the highway driving cycle as a way to use the kinetic energy of a car more effectively under various road conditions. A model of vehicle driving control on the highway was designed, consisting of elementary cycles such as acceleration, free rolling and deceleration under external resistance forces. Braking, as an energy-dissipating regime, was not included. The influence of various longitudinal road profiles was taken into consideration and included in the model. Ways to use the results of monitoring road and traffic conditions are presented. The method of non-linear programming is used to design the optimal vehicle control function and phase trajectory. The results are presented as improved typical driving cycles that treat energy saving as a subject of choice for a specified schedule.

  9. Optimisation algorithms for ECG data compression.

    Science.gov (United States)

    Haugland, D; Heber, J G; Husøy, J H

    1997-07-01

    The use of exact optimisation algorithms for compressing digital electrocardiograms (ECGs) is demonstrated. As opposed to traditional time-domain methods, which use heuristics to select a small subset of representative signal samples, the problem of selecting the subset is formulated in rigorous mathematical terms. This approach makes it possible to derive algorithms guaranteeing the smallest possible reconstruction error when a bounded selection of signal samples is interpolated. The proposed model resembles well-known network models and is solved by a cubic dynamic programming algorithm. When applied to standard test problems, the algorithm produces a compressed representation for which the distortion is about one-half of that obtained by traditional time-domain compression techniques at reasonable compression ratios. This illustrates that, in terms of the accuracy of decoded signals, existing time-domain heuristics for ECG compression may be far from what is theoretically achievable. The paper is an attempt to bridge this gap.
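The sample-selection idea lends itself to a compact illustration: choose k retained samples (endpoints included) so that linear interpolation between them minimises the total squared reconstruction error. The sketch below is a generic O(k·n²) dynamic programme in the spirit of the abstract, not the authors' network-model algorithm.

```python
def segment_error(x, i, j):
    """Squared error when samples i..j are replaced by linear interpolation
    between x[i] and x[j]."""
    err = 0.0
    for t in range(i + 1, j):
        interp = x[i] + (x[j] - x[i]) * (t - i) / (j - i)
        err += (x[t] - interp) ** 2
    return err

def best_subset(x, k):
    """Pick k samples (always keeping both endpoints) minimising the total
    reconstruction error under linear interpolation."""
    n = len(x)
    INF = float("inf")
    # E[j][m]: min error covering x[0..j] with m retained samples ending at j
    E = [[INF] * (k + 1) for _ in range(n)]
    parent = [[-1] * (k + 1) for _ in range(n)]
    E[0][1] = 0.0
    for j in range(1, n):
        for m in range(2, k + 1):
            for i in range(j):
                if E[i][m - 1] < INF:
                    c = E[i][m - 1] + segment_error(x, i, j)
                    if c < E[j][m]:
                        E[j][m] = c
                        parent[j][m] = i
    # Backtrack the optimal retained-sample indices
    idx, j, m = [], n - 1, k
    while j >= 0 and m >= 1:
        idx.append(j)
        j, m = parent[j][m], m - 1
    return list(reversed(idx)), E[n - 1][k]
```

Because the recursion considers every admissible predecessor, the returned subset is exactly optimal for the given k, which is the guarantee heuristic time-domain methods lack.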

  10. Optimisation and constraints - a view from ICRP

    International Nuclear Information System (INIS)

    Dunster, H.J.

    1994-01-01

    The optimisation of protection has been the major policy underlying the recommendations of the International Commission on Radiological Protection for more than 20 years. In earlier forms, the concept can be traced back to 1951. Constraints are more recent, appearing in their present form only in the 1990 recommendations of the Commission. The requirement to keep all exposures as low as reasonably achievable applies to both normal and potential exposures. The policy and the techniques are well established for normal exposures, i.e. exposures that are certain to occur. The application to potential exposures, i.e. exposures that have a probability of occurring that is less than unity, is more difficult and is still under international discussion. Constraints are needed to limit the inequity associated with the use of collective dose in cost-benefit analysis and to provide a margin to protect individuals who may be exposed to more than one source. (author)

  11. Optimising Impact in Astronomy for Development Projects

    Science.gov (United States)

    Grant, Eli

    2015-08-01

    Positive outcomes in the fields of science education and international development are notoriously difficult to achieve. Among the challenges facing projects that use astronomy to improve education and socio-economic development is how to optimise project design in order to achieve the greatest possible benefits. Over the past century, medical scientists along with statisticians and economists have developed an increasingly sophisticated and scientific approach to designing, testing and improving social intervention and public health education strategies. This talk offers a brief review of the history and current state of `intervention science'. A similar framework is then proposed for astronomy outreach and education projects, with applied examples given of how existing evidence can be used to inform project design, predict and estimate cost-effectiveness, minimise the risk of unintended negative consequences and increase the likelihood of target outcomes being achieved.

  12. Optimisation of Multilayer Insulation an Engineering Approach

    CERN Document Server

    Chorowski, M; Parente, C; Riddone, G

    2001-01-01

    A mathematical model has been developed to describe the heat flux through multilayer insulation (MLI). The total heat flux between the layers is the result of three distinct heat transfer modes: radiation, residual gas conduction and solid spacer conduction. The model describes the MLI behaviour considering a layer-to-layer approach and is based on an electrical analogy, in which the three heat transfer modes are treated as parallel thermal impedances. The value of each transfer mode varies from layer to layer, although the total heat flux remains constant across the whole MLI blanket. The model enables the optimisation of the insulation with regard to different MLI parameters, such as residual gas pressure, number of layers and boundary temperatures. The model has been tested against experimental measurements carried out at CERN and the results were found to be in good agreement, especially for insulation vacuum between 10⁻⁵ Pa and 10⁻³ Pa.
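The layer-to-layer electrical analogy can be sketched as follows: in each gap the three heat transfer modes act as parallel conductances, the gaps add in series, and the constant flux follows from the total resistance. All coefficients below are invented placeholders, not CERN's fitted MLI parameters.

```python
import numpy as np

def gap_conductance(t_hot, t_cold, gas_pressure):
    """Parallel conductances per layer gap, W/(m^2 K); coefficients invented."""
    sigma_sb, eps = 5.67e-8, 0.05
    # Radiation, linearised between the two layer temperatures
    g_rad = sigma_sb * eps / (2.0 - eps) * (t_hot**2 + t_cold**2) * (t_hot + t_cold)
    g_gas = 1.2 * gas_pressure   # free-molecular gas conduction scales with pressure
    g_spacer = 0.05              # solid conduction through the spacer
    return g_rad + g_gas + g_spacer  # parallel impedances: conductances add

def mli_heat_flux(t_warm, t_cold, n_layers, gas_pressure, iters=200):
    """Constant flux through n_layers reflective layers (n_layers + 1 gaps in
    series), found by relaxing the intermediate layer temperatures."""
    temps = np.linspace(t_warm, t_cold, n_layers + 2)
    q = 0.0
    for _ in range(iters):
        g = np.array([gap_conductance(temps[i], temps[i + 1], gas_pressure)
                      for i in range(len(temps) - 1)])
        r_total = (1.0 / g).sum()          # series: resistances add
        q = (t_warm - t_cold) / r_total    # the same flux crosses every gap
        temps[1:] = t_warm - np.cumsum(1.0 / g) * q  # update layer temperatures
    return q
```

Sweeping `n_layers` or `gas_pressure` with this function reproduces the qualitative optimisation behaviour the abstract describes: more layers or better vacuum reduce the flux.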

  13. Public transport optimisation emphasising passengers’ travel behaviour

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo

    to the case where the two problems are solved sequentially without taking into account interdependencies. Figure 1 - Planning public transport The PhD study develops a metaheuristic algorithm to adapt the line plan configuration in order better to match passengers’ travel demand in terms of transfers as well......Passengers in public transport complaining about their travel experiences are not uncommon. This might seem counterintuitive since several operators worldwide are presenting better key performance indicators year by year. The present PhD study focuses on developing optimisation algorithms...... to enhance the operations of public transport while explicitly emphasising passengers’ travel behaviour and preferences. Similar to economic theory, interactions between supply and demand are omnipresent in the context of public transport operations. In public transport, the demand is represented...

  14. Value Chain Optimisation of Biogas Production

    DEFF Research Database (Denmark)

    Jensen, Ida Græsted

    economically feasible. In this PhD thesis, the focus is to create models for investigating the profitability of biogas projects by: 1) including the whole value chain in a mathematical model and considering mass and energy changes on the upstream part of the chain; and 2) including profit allocation in a value......, the costs on the biogas plant has been included in the model using economy of scale. For the second point, a mathematical model considering profit allocation was developed applying three allocation mechanisms. This mathematical model can be applied as a second step after the value chain optimisation. After...... in the energy systems model to find the optimal end use of each type of gas and fuel. The main contributions of this thesis are the methods developed on plant level. Both the mathematical model for the value chain and the profit allocation model can be generalised and used in other industries where mass...

  15. Expert systems and optimisation in process control

    Energy Technology Data Exchange (ETDEWEB)

    Mamdani, A.; Efstathiou, J. (eds.)

    1986-01-01

    This report brings together recent developments both in expert systems and in optimisation, and deals with current applications in industry. Part One is concerned with Artificial Intelligence in planning and scheduling and with rule-based control implementation. The tasks of control maintenance, rescheduling and planning are each discussed in relation to new theoretical developments, techniques available, and sample applications. Part Two covers model based control techniques in which the control decisions are used in a computer model of the process. Fault diagnosis, maintenance and trouble-shooting are just some of the activities covered. Part Three contains case studies of projects currently in progress, giving details of the software available and the likely future trends. One of these, on qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations is indexed separately.

  16. Expert systems and optimisation in process control

    International Nuclear Information System (INIS)

    Mamdani, A.; Efstathiou, J.

    1986-01-01

    This report brings together recent developments both in expert systems and in optimisation, and deals with current applications in industry. Part One is concerned with Artificial Intelligence in planning and scheduling and with rule-based control implementation. The tasks of control maintenance, rescheduling and planning are each discussed in relation to new theoretical developments, techniques available, and sample applications. Part Two covers model based control techniques in which the control decisions are used in a computer model of the process. Fault diagnosis, maintenance and trouble-shooting are just some of the activities covered. Part Three contains case studies of projects currently in progress, giving details of the software available and the likely future trends. One of these, on qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations is indexed separately. (author)

  17. Improving and optimising road pricing in Copenhagen

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker; Larsen, Marie Karen

    2008-01-01

    though quite a number of proposed charging systems have been examined only a few pricing strategies have been investigated. This paper deals with the optimisation of different designs for a road pricing system in the Greater Copenhagen area with respect to temporal and spatial differentiation......The question whether to introduce toll rings or road pricing in Copenhagen has been discussed intensively during the last 10 years. The main results of previous analyses are that none of the systems would make a positive contribution at present, when considered from a socio-economic view. Even...... of the pricing levels. A detailed transport model was used to describe the demand effects. The model was based on data from a real test of road pricing on 500 car drivers. The paper compares the price systems with regard to traffic effects and generalised costs for users and society. It is shown how important...

  18. A code for optimising triplet layout

    CERN Document Server

    AUTHOR|(CDS)2141109; Seryi, Andrei; Abelleira, Jose; Cruz Alaniz, Emilia

    2017-01-01

    One of the main challenges when designing final focus systems of particle accelerators is maximising the beam stay clear in the strong quadrupole magnets of the inner triplet. Moreover, it is desirable to keep the quadrupoles in the inner triplet as short as possible for space and cost reasons, but also to reduce chromaticity and simplify correction schemes. An algorithm that explores the triplet parameter space to optimise both these aspects was written. It uses thin lenses as a first approximation for a broad parameter scan and MADX for more precise calculations. The thin lens algorithm is significantly faster than a full scan using MADX and relatively precise at indicating the approximate area where the optimum solution lies.
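A thin-lens version of such a scan can be sketched with 2x2 transfer matrices: propagate the Twiss beta through a symmetric triplet and keep the focal length that minimises the peak beta, a proxy for beam stay clear at fixed aperture. The layout lengths and scan logic below are illustrative assumptions, not the code described in the paper.

```python
import numpy as np

def drift(length):
    return np.array([[1.0, length], [0.0, 1.0]])

def thin_quad(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def max_beta(line, beta_star):
    """Track the Twiss beta (alpha = 0 at the interaction point) through a
    list of 2x2 matrices via the sigma matrix; return the largest beta seen
    at the element boundaries."""
    sigma = np.diag([beta_star, 1.0 / beta_star])
    peak = beta_star
    for m in line:
        sigma = m @ sigma @ m.T
        peak = max(peak, sigma[0, 0])
    return peak

def scan_triplet(l_star, beta_star, f_values):
    """Brute-force scan of a shared focal length for a symmetric F-D-F
    thin-lens triplet; a lower peak beta means more beam stay clear for a
    fixed magnet aperture."""
    best = None
    for f in f_values:
        line = [drift(l_star), thin_quad(f), drift(1.0),
                thin_quad(-f), drift(1.0), thin_quad(f), drift(1.0)]
        peak = max_beta(line, beta_star)
        if best is None or peak < best[1]:
            best = (f, peak)
    return best
```

As in the paper's workflow, such a coarse scan only brackets the optimum; a full optics code like MADX would then refine the candidate region.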

  19. Optimising Signalised Intersection Using Wireless Vehicle Detectors

    DEFF Research Database (Denmark)

    Adjin, Daniel Michael Okwabi; Torkudzor, Moses; Asare, Jack

    Traffic congestion on roads wastes travel time. In this paper, we developed a vehicular traffic model to optimise a signalised intersection in Accra, using wireless vehicle detectors. Traffic volume data gathered were extrapolated to cover 2011 and 2016 and analysed to obtain the peak hour traffic...... volume causing congestion. The intersection was modelled and simulated in Synchro7 as an actuated signalised model using results from the analysed data. The model for morning peak periods gave optimal cycle lengths of 100s and 150s with corresponding intersection delay of 48.9s and 90.6s in 2011 and 2016...... respectively while that for the evening was 55s giving delay of 14.2s and 16.3s respectively. It is shown that the model will improve traffic flow at the intersection....

  20. Dynamic optimisation of an industrial web process

    Directory of Open Access Journals (Sweden)

    M Soufian

    2008-09-01

    Full Text Available An industrial web process has been studied and it is shown that the underlying physics of such processes is governed by the Navier-Stokes partial differential equations with moving boundary conditions, which in turn have to be determined by the solution of the thermodynamic equations. The development of a two-dimensional continuous-discrete model structure based on this study is presented. Other models are constructed based on this model for better identification and optimisation purposes. The parameters of the proposed models are then estimated using real data obtained from identification experiments with the process plant. Various simulation tests for validation are accompanied by the design, development and real-time industrial implementation of an optimal controller for dynamic optimisation of this web process. It is shown that, in comparison with the traditional controller, the new controller resulted in better performance, an improvement in film quality and savings in raw materials. This demonstrates the efficiency and validity of the developed models.

  1. Recent perspectives on optimisation of radiological protection

    International Nuclear Information System (INIS)

    Robb, J.D.; Croft, J.R.

    1992-01-01

    The ALARA principle as a requirement in radiological protection has evolved from its theoretical roots. Based on several years work, this paper provides a backdrop to practical approaches to ALARA for the 1990s. The key step, developing ALARA thinking so that it becomes an integral part of radiological protection programmes, is discussed using examples from the UK and France, as is the role of tools to help standardise judgements for decision-making. In its latest recommendations, ICRP have suggested that the optimisation of protection should be constrained by restrictions on the doses to individuals. This paper also considers the function of such restrictions for occupational, public and medical exposure, and in the design process. (author)

  2. Optimisation of parameters of DCD for PHWRs

    International Nuclear Information System (INIS)

    Velmurugan, S.; Sathyaseelan, V.S.; Narasimhan, S.V.; Mathur, P.K.

    1991-01-01

    A decontamination formulation based on EDTA, oxalic acid and citric acid was evaluated for its efficacy in removing oxide layers in PHWRs. An ion exchange system specifically suited to the fission-product-dominated contamination in PHWRs was optimised for the reagent regeneration stage of the decontamination process. An analysis of the nature of the complexed metal species formed in the dissolution process and electrochemical measurements were employed as tools to follow the course of oxide removal during dissolution. An attempt was made to understand the redeposition behaviour of various isotopes during the decontamination process. SEM and ESCA studies of metal coupons before and after dissolution were used to analyse the deposits in this context. The pick-up of DCD reagents on the ion exchangers and material compatibility tests on carbon steel, Monel-400 and Zircaloy-2 with the decontaminant under the conditions of the decontamination experiment are reported. (author)

  3. Optimisation of Inulinase Production by Kluyveromyces bulgaricus

    Directory of Open Access Journals (Sweden)

    Darija Vranešić

    2002-01-01

    Full Text Available The present work is based on observing the effects of the pH and temperature of fermentation on the production of the microbial enzyme inulinase by Kluyveromyces marxianus var. bulgaricus. Inulinase hydrolyzes inulin, a polysaccharide which can be isolated from plants such as Jerusalem artichoke, chicory or dahlia, and transformed into pure fructose or fructooligosaccharides. Fructooligosaccharides have great potential in the food industry because they can be used as calorie-reduced compounds and noncariogenic sweeteners as well as soluble fibre and prebiotic compounds. Fructose formation from inulin is a single-step enzymatic reaction with yields of up to 95 % fructose. By contrast, conventional fructose production from starch needs at least three enzymatic steps, yielding only 45 % fructose. The process of inulinase production was optimised using the experimental design method. The pH value of the cultivation medium proved to be the most significant variable and should be maintained at the optimum value of 3.6. The effect of temperature was slightly smaller, with optimal values between 30 and 33 °C. At a low pH value of the cultivation medium the microorganism was not able to produce enough enzyme, and enzyme activities were low. A similar effect was caused by high temperature. The highest enzyme activities were achieved at optimal fermentation conditions: 100.16–124.36 IU/mL (with sucrose as substrate for determination of enzyme activity) or 8.6–11.6 IU/mL (with inulin as substrate), respectively. The method of factorial design and response surface analysis makes it possible to study several factors simultaneously, to quantify the individual effect of each factor and to investigate their possible interactions. As a comparison to this method, optimisation of a physiological enzyme activity model depending on pH and temperature was also studied.

  4. Optimisation on processing parameters for minimising warpage on side arm using response surface methodology (RSM) and particle swarm optimisation (PSO)

    Science.gov (United States)

    Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Sazli, M.; Yahya, Z. R.

    2017-09-01

    This study presents the application of an optimisation method to reduce the warpage of a side arm part. Autodesk Moldflow Insight software was integrated into this study to analyse the warpage. A design of experiments (DOE) for Response Surface Methodology (RSM) was constructed, and Particle Swarm Optimisation (PSO) was then applied to the equation obtained from the RSM. The optimisation method results in optimised processing parameters with minimum warpage. Mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as the variable parameters; the selection was based on the factors previous researchers identified as most significantly affecting warpage. The results show that warpage was improved by 28.16 % with RSM and 28.17 % with PSO; the additional improvement from PSO over RSM is thus only 0.01 %. Optimisation using RSM alone is therefore already sufficient to give the best combination of parameters and an optimum warpage value for the side arm part. The most significant parameter affecting warpage is packing pressure.
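The RSM-then-PSO workflow can be sketched generically: fit (or assume) a second-order response surface, then let a particle swarm search it for the minimum-warpage setting. The quadratic surface and the PSO constants below are invented for illustration; they are not the fitted model or settings from this study.

```python
import random

def rsm_warpage(p):
    """Hypothetical second-order RSM surface for warpage (mm) in two coded
    factors (e.g. packing pressure, melt temperature); coefficients invented."""
    x, y = p
    return 0.35 + 0.08 * x - 0.05 * y + 0.06 * x * x + 0.04 * y * y + 0.02 * x * y

def pso(f, dim=2, n=20, iters=100, lo=-1.0, hi=1.0, seed=1):
    """Standard global-best PSO with inertia and cognitive/social pulls."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [list(p) for p in pos]
    gbest = list(min(pbest, key=f))
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = list(pos[i])
                if f(pos[i]) < f(gbest):
                    gbest = list(pos[i])
    return gbest, f(gbest)
```

For a smooth convex surface like this one the swarm converges to essentially the same optimum RSM would give analytically, mirroring the study's finding that PSO adds almost nothing over RSM here.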

  5. A New Computational Technique for the Generation of Optimised Aircraft Trajectories

    Science.gov (United States)

    Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto

    2017-12-01

    A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for a global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented to allow an efficient solution of problems in which two or more performance indices are to be minimised simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two simultaneous objectives. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in-depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
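The adaptive bisection ɛ-constraint idea can be illustrated on a toy bi-objective problem: minimise f1 subject to f2 ≤ ɛ, and bisect on ɛ wherever neighbouring Pareto points are further apart than a target gap. The closed-form inner solver below is a stand-in for the pseudospectral optimal-control solve used in the paper.

```python
import math

def f1(x):
    return x * x

def f2(x):
    return (x - 1.0) ** 2

def solve_eps_constraint(eps):
    """Minimise f1 subject to f2 <= eps; for this toy problem the
    constrained minimiser is available in closed form."""
    return max(0.0, 1.0 - math.sqrt(eps))

def pareto_front(max_gap=0.1):
    """Adaptive bisection on the epsilon bound: keep splitting an interval
    while neighbouring front points are further apart than max_gap in f1."""
    pts = {}

    def record(eps):
        x = solve_eps_constraint(eps)
        pts[eps] = (f1(x), f2(x))
        return f1(x)

    stack = [(1e-6, 1.0, record(1e-6), record(1.0))]
    while stack:
        lo, hi, v_lo, v_hi = stack.pop()
        if abs(v_lo - v_hi) > max_gap and hi - lo > 1e-9:
            mid = 0.5 * (lo + hi)
            v_mid = record(mid)
            stack.append((lo, mid, v_lo, v_mid))
            stack.append((mid, hi, v_mid, v_hi))
    return sorted(pts.values())
```

The adaptive refinement places more ɛ values where the front is steep, so the trade-off curve is sampled evenly in objective space rather than evenly in ɛ.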

  6. PDB2Graph: A toolbox for identifying critical amino acids map in proteins based on graph theory.

    Science.gov (United States)

    Niknam, Niloofar; Khakzad, Hamed; Arab, Seyed Shahriar; Naderi-Manesh, Hossein

    2016-05-01

    The integrative and cooperative nature of protein structure involves the assessment of topological and global features of its constituent parts. The network concept takes full advantage of both of these properties concomitantly in the analysis. High compatibility with structural concepts and physicochemical properties, in addition to a remarkable simplification of the system, has made networks an ideal tool for exploring biological systems. There are numerous examples in which different protein structural and functional characteristics have been clarified by the network approach. Here, we present an interactive and user-friendly Matlab-based toolbox, PDB2Graph, devoted to protein structure network construction, visualization, and analysis. Moreover, PDB2Graph is an appropriate tool for identifying critical nodes involved in protein structural robustness and function based on centrality indices. It maps critical amino acids in protein networks and can greatly aid structural biologists in selecting proper amino acid candidates for manipulating protein structures in a more reasonable and rational manner. To introduce the capability and efficiency of PDB2Graph in detail, the structural modification of Calmodulin through allosteric binding of Ca²⁺ is considered. In addition, a mutational analysis for three well-identified model proteins, Phage T4 lysozyme, Barnase and Ribonuclease HI, was performed to inspect the influence of mutating important central residues on protein activity. Copyright © 2016 Elsevier Ltd. All rights reserved.
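Centrality-based identification of critical nodes, as implemented in tools like PDB2Graph, reduces to standard graph computations. The sketch below computes degree and closeness centrality on a small hypothetical residue-contact graph (node labels invented); PDB2Graph itself is MATLAB-based, so Python here is purely illustrative.

```python
from collections import deque

def degree_centrality(adj):
    """Fraction of the other nodes each node touches directly."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def closeness_centrality(adj):
    """Closeness = (n-1) / sum of shortest-path distances (BFS, unit edges)."""
    result = {}
    n = len(adj)
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        total = sum(dist.values())
        result[src] = (n - 1) / total if total else 0.0
    return result

# Toy residue-contact graph: nodes C and D bridge two lobes of the "protein"
contacts = {
    "A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"},
    "D": {"C", "E", "F"}, "E": {"D", "F"}, "F": {"D", "E"},
}
```

Residues with the highest centrality (here the bridging nodes C and D) are exactly the candidates such a toolbox would flag as critical for structural robustness.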

  7. A House Full of Trap Doors. Identifying barriers to resilient drylands in the toolbox of pastoral development

    Science.gov (United States)

    Krätli, Saverio; Kaufmann, Brigitte; Roba, Hassan; Hiernaux, Pierre; Li, Wenjun; Easdale, Marcos H.; Huelsebusch, Christian

    2016-04-01

    The theoretical understanding of drylands and pastoral systems has long undergone a U-turn from the initial perspective rooted in classical ecology. The shift has hinged on the way asymmetric variability is represented: from a disturbance in an ecosystem that naturally tends towards uniformity and stability, to a constitutive part of a dynamic ecosystem. Operationalising the new reversed perspective, including the need to update the methodological infrastructure for planning around drylands and pastoral development, remains a challenge. Underlying assumptions about stability and uniformity, a legacy of equilibrium thinking, remain embedded in the toolbox of pastoral development, starting from the technical language used to talk about the subject. This effectively gets in the way of operationalising state-of-the-art understanding of pastoral systems and the drylands. Unless these barriers are identified, unpacked and managed, even the present calls for increasing the rigour and intensity of data collection - for example as part of the ongoing global process to revise and improve agricultural data - cannot deliver a realistic representation of pastoral systems in statistics and policy making. This contribution presents the case for understanding variability as an asset, and provides a range of examples of methodological barriers, including classifications of livestock systems, scale of observation, key parameters in animal production, indicators in the measurement of ecological efficiency, concepts of ecological fragility, natural resources, and pastoral risk. The need to update this legacy is a pressing challenge for policy makers concerned with both modernisation and resilience in the drylands.

  8. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave

    Directory of Open Access Journals (Sweden)

    Ikaro Silva

    2014-09-01

    Full Text Available The WaveForm DataBase (WFDB Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox allows direct loading into MATLAB/Octave's workspace of over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by meta data such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.

  9. Sequential projection pursuit for optimised vibration-based damage detection in an experimental wind turbine blade

    Science.gov (United States)

    Hoell, Simon; Omenzetter, Piotr

    2018-02-01

    To advance the concept of smart structures in large systems, such as wind turbines (WTs), it is desirable to be able to detect structural damage early while using minimal instrumentation. Data-driven vibration-based damage detection methods can be competitive in that respect because global vibrational responses encompass the entire structure. Multivariate damage sensitive features (DSFs) extracted from acceleration responses make it possible to detect changes in a structure via statistical methods. However, even though such DSFs contain information about the structural state, they may not be optimised for the damage detection task. This paper addresses the shortcoming by exploring a DSF projection technique specialised for statistical structural damage detection. High dimensional initial DSFs are projected onto a low-dimensional space for improved damage detection performance and simultaneous computational burden reduction. The technique is based on sequential projection pursuit where the projection vectors are optimised one by one using an advanced evolutionary strategy. The approach is applied to laboratory experiments with a small-scale WT blade under wind-like excitations. Autocorrelation function coefficients calculated from acceleration signals are employed as DSFs. The optimal numbers of projection vectors are identified with the help of a fast forward selection procedure. To benchmark the proposed method, selections of original DSFs as well as principal component analysis scores from these features are additionally investigated. The optimised DSFs are tested for damage detection on previously unseen data from the healthy state and a wide range of damage scenarios. It is demonstrated that using selected subsets of the initial and transformed DSFs improves damage detectability compared to the full set of features. Furthermore, superior results can be achieved by projecting autocorrelation coefficients onto just a single optimised projection vector.
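The core step, optimising a single projection vector with an evolutionary strategy so that a scalar projection index separates healthy from damaged feature sets, can be sketched as a (1+1)-ES on the unit sphere. The projection index and the step-size adaptation below are generic textbook choices, not the advanced strategy used in the paper.

```python
import math
import random

def project(X, w):
    return [sum(xi * wi for xi, wi in zip(x, w)) for x in X]

def separation(healthy, damaged, w):
    """Projection index: absolute difference of the projected class means
    over the pooled projected standard deviation."""
    a, b = project(healthy, w), project(damaged, w)
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    var = (sum((v - ma) ** 2 for v in a)
           + sum((v - mb) ** 2 for v in b)) / (len(a) + len(b) - 2)
    return abs(ma - mb) / math.sqrt(var + 1e-12)

def evolve_projection(healthy, damaged, dim, steps=3000, seed=2):
    """(1+1) evolution strategy with multiplicative step-size adaptation,
    keeping the projection vector on the unit sphere."""
    rng = random.Random(seed)

    def normalise(v):
        n = math.sqrt(sum(x * x for x in v)) or 1.0
        return [x / n for x in v]

    w = normalise([rng.gauss(0, 1) for _ in range(dim)])
    fit, sigma = separation(healthy, damaged, w), 0.3
    for _ in range(steps):
        cand = normalise([x + rng.gauss(0, sigma) for x in w])
        c_fit = separation(healthy, damaged, cand)
        if c_fit > fit:
            w, fit, sigma = cand, c_fit, sigma * 1.1   # success: widen search
        else:
            sigma *= 0.98                              # failure: narrow search
    return w, fit
```

Sequential projection pursuit would repeat this optimisation for further vectors after removing the structure already captured, which is how the paper builds its low-dimensional DSF space.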

  10. Combining optimisation and simulation in an energy systems analysis of a Swedish iron foundry

    International Nuclear Information System (INIS)

    Mardan, Nawzad; Klahr, Roger

    2012-01-01

    To face global competition, and also reduce environmental and climate impact, industry-wide changes are needed, especially regarding energy use, which is closely related to global warming. Energy efficiency is therefore an essential task for the future as it has a significant impact on both business profits and the environment. For the analysis of possible changes in industrial production processes, and to choose what changes should be made, various modelling tools can be used as a decision support. This paper uses two types of energy analysis tool: Discrete Event Simulation (DES) and Energy Systems Optimisation (ESO). The aim of this study is to describe how a DES and an ESO tool can be combined. A comprehensive five-step approach is proposed for reducing system costs and making a more robust production system. A case study representing a new investment in part of a Swedish iron foundry is also included to illustrate the method's use. The method described in this paper is based on the use of the DES program QUEST and the ESO tool reMIND. The method combination itself is generic, i.e. other similar programs can be used as well with some adjustments and adaptations. The results from the case study show that when different boundary conditions are used the result obtained from the simulation tools is not optimum, in other words, the result shows only a feasible solution and not the best way to run the factory. It is therefore important to use the optimisation tool in such cases in order to obtain the optimum operating strategy. By using the optimisation tool a substantial amount of resources can be saved. The results also show that the combination of optimisation and simulation tools is useful to provide very detailed information about how the system works and to predict system behaviour as well as to minimise the system cost. -- Highlights: ► This study describes how a simulation and an optimisation tool can be combined. ► A case study representing a new

  11. COMPLIANCE ANALYSIS, OPTIMISATION AND COMPARISON OF A NEW 3PUS-PU MECHANISM

    Directory of Open Access Journals (Sweden)

    B. Wei

    2013-06-01

    Full Text Available This paper investigates the compliance of a new 3PUS-PU hybrid mechanism with three degrees of freedom: translation along the Z axis and rotations about the X and Y axes. Firstly, the kinematics of the mechanism are analysed and the compliance model of the mechanism derived. Secondly, the effects of the geometric parameters and of the position and orientation parameters on the compliance of the mechanism in each direction are investigated, and a genetic algorithm is used to optimise the global compliance by simultaneously adjusting the design variables. Finally, the compliance in each direction of two similar kinds of 3PUS-PU mechanism is compared.
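The genetic-algorithm optimisation of global compliance over design variables can be sketched generically. The surrogate compliance function below is an invented convex stand-in in two hypothetical design variables, not the 3PUS-PU compliance model derived in the paper.

```python
import random

def compliance(params):
    """Stand-in for global compliance as a function of two geometric design
    variables; the quadratic form and its minimiser are invented."""
    a, b = params
    return (a - 0.6) ** 2 + 0.5 * (b - 0.3) ** 2 + 0.1

def genetic_minimise(f, bounds, pop_size=30, gens=60, seed=3):
    """Simple real-coded GA: truncation selection, arithmetic crossover,
    gaussian mutation, with the elite carried over unchanged."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        elite = pop[: pop_size // 3]
        children = list(elite)                 # elitism
        while len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(p1, p2)]   # crossover
            for d in range(dim):                            # mutation
                if rng.random() < 0.3:
                    lo, hi = bounds[d]
                    child[d] = min(hi, max(lo, child[d] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        pop = children
    return min(pop, key=f)
```

Swapping `compliance` for a model that maps design variables to the mechanism's compliance matrix norm would reproduce the paper's optimisation loop.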

  12. Design optimisation of a flywheel hybrid vehicle

    Energy Technology Data Exchange (ETDEWEB)

    Kok, D.B.

    1999-11-04

    This thesis describes the design optimisation of a flywheel hybrid vehicle with respect to fuel consumption and exhaust gas emissions. The driveline of this passenger car uses two power sources: a small spark-ignition internal combustion engine with a three-way catalyst, and a high-speed flywheel system for kinetic energy storage. A custom-made continuously variable transmission (CVT) with so-called i{sup 2} control transports energy between these power sources and the vehicle wheels. The driveline includes auxiliary systems for hydraulic, vacuum and electric purposes. In this fully mechanical driveline, parasitic energy losses determine the vehicle's fuel saving potential to a large extent. Practicable energy loss models have been derived to quantify friction losses in bearings, gearwheels, the CVT, clutches and dynamic seals. In addition, the aerodynamic drag in the flywheel system and the power consumption of auxiliaries are charted. With the energy loss models available, a calculation procedure is introduced to optimise the flywheel as a subsystem in which the rotor geometry, the safety containment, and the vacuum system are designed for minimum energy use within the context of automotive applications. A first prototype of the flywheel system was tested experimentally and subsequently redesigned to improve rotordynamics and safety aspects. Coast-down experiments with the improved version show that the energy losses have been lowered significantly. The use of a kinetic energy storage device enables the uncoupling of vehicle wheel power and engine power. Therefore, the engine can be smaller, and it can be chosen to operate in its region of best efficiency in start-stop mode. On a test-rig, the measured engine fuel consumption was reduced by more than 30 percent when the engine was intermittently restarted with the aid of the flywheel system. Although the start-stop mode proves advantageous for fuel consumption, exhaust gas emissions increase temporarily.

  13. Profile control studies for JET optimised shear regime

    Energy Technology Data Exchange (ETDEWEB)

    Litaudon, X.; Becoulet, A.; Eriksson, L.G.; Fuchs, V.; Huysmans, G.; How, J.; Moreau, D.; Rochard, F.; Tresset, G.; Zwingmann, W. [Association Euratom-CEA, CEA/Cadarache, Dept. de Recherches sur la Fusion Controlee, DRFC, 13 - Saint-Paul-lez-Durance (France); Bayetti, P.; Joffrin, E.; Maget, P.; Mayorat, M.L.; Mazon, D.; Sarazin, Y. [JET Abingdon, Oxfordshire (United Kingdom); Voitsekhovitch, I. [Universite de Provence, LPIIM, Aix-Marseille 1, 13 (France)

    2000-03-01

    This report summarises the profile control studies, i.e. the preparation and analysis of JET Optimised Shear plasmas, carried out during 1999 within the framework of the Task Agreement (RF/CEA/02) between JET and the Association Euratom-CEA/Cadarache. We report on our participation in the preparation of the JET Optimised Shear experiments, together with their comprehensive analysis and modelling. Emphasis is put on the various aspects of pressure profile control (core and edge pressure), together with detailed studies of current profile control by non-inductive means, with the prospect of achieving steady, high-performance Optimised Shear plasmas. (authors)

  14. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    Directory of Open Access Journals (Sweden)

    Lucian A B Purvis

    Full Text Available In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or at specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria has been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  15. A toolbox and sample object perception data for equalization of natural images

    Directory of Open Access Journals (Sweden)

    Wilma A. Bainbridge

    2015-12-01

    Full Text Available For psychologists and neuroscientists, careful selection of their stimuli is essential, so that low-level visual features such as color or spatial frequency do not serve as confounds between conditions of interest. Here, we detail the Natural Image Statistical Toolbox, which allows scientists to measure, visualize, and control stimulus sets along a set of low-level visual properties. Additionally, we provide a set of object images varying along several perceptual object properties, including physical size and interaction envelope size (i.e., the space around an object traversed during an interaction), serving as a test-bed for the Natural Image Statistical Toolbox. This stimulus set is also a highly characterized set useful for psychology and neuroscience studies of object perception.
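    As an illustration of the kind of low-level properties such a toolbox measures, the sketch below computes two of them for a grayscale image: mean luminance and the log-log slope of the rotationally averaged amplitude spectrum (natural images typically fall off roughly as 1/f). This is an independent Python sketch, not the Natural Image Statistical Toolbox itself (which is MATLAB-based); the function name is illustrative.

```python
import numpy as np

def luminance_and_spectral_slope(img):
    """Two low-level properties commonly equalised across stimulus sets:
    mean luminance, and the log-log slope of the rotationally averaged
    amplitude spectrum of the image."""
    h, w = img.shape
    mean_lum = img.mean()
    amp = np.abs(np.fft.fftshift(np.fft.fft2(img - mean_lum)))
    fy, fx = np.indices((h, w))
    r = np.hypot(fy - h // 2, fx - w // 2).astype(int)   # integer radius bins
    counts = np.bincount(r.ravel())
    radial = np.bincount(r.ravel(), amp.ravel()) / np.maximum(counts, 1)
    fr = np.arange(1, min(h, w) // 2)         # skip DC, stay under Nyquist
    slope = np.polyfit(np.log(fr), np.log(radial[fr] + 1e-12), 1)[0]
    return mean_lum, slope

rng = np.random.default_rng(1)
img = rng.random((128, 128))      # white noise has a flat amplitude spectrum
lum, slope = luminance_and_spectral_slope(img)
print(lum, slope)
```

    For a white-noise image the fitted slope is near zero; a typical natural photograph gives a clearly negative slope, which is exactly the kind of difference one would want matched across experimental conditions.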

  16. Genetic toolbox for controlled expression of functional proteins in Geobacillus spp.

    Directory of Open Access Journals (Sweden)

    Ivan Pogrebnyakov

    Full Text Available Species of the genus Geobacillus are thermophilic bacteria and play an ever increasing role as hosts for biotechnological applications both in academia and industry. Here we screened a number of Geobacillus strains to determine which industrially relevant carbon sources they can utilize. One of the strains, G. thermoglucosidasius C56-YS93, was then chosen to develop a toolbox for controlled gene expression over a wide range of levels. It includes a library of semi-synthetic constitutive promoters (76-fold difference in expression levels) and an inducible promoter from the xylA gene. A library of synthetic, in silico designed ribosome binding sites was also created for further tuning of translation. The PxylA promoter was further used to successfully express native and heterologous xylanases in G. thermoglucosidasius. This toolbox enables fine-tuning of gene expression in Geobacillus species for metabolic engineering approaches in the production of biochemicals and heterologous proteins.

  17. User's manual for Ecolego Toolbox and the Discretization Block

    Energy Technology Data Exchange (ETDEWEB)

    Broed, Robert (Facilia consulting AB (Sweden)); Shulan Xu (Swedish Radiation Protection Authority, Stockholm (Sweden))

    2008-03-15

    The CLIMB modelling team (Catchment LInked Models of radiological effects in the Biosphere) was instituted in 2004 to provide SSI with an independent modelling capability when reviewing SKB's assessment of long-term safety for a geological repository. Modelling in CLIMB covers all aspects of performance assessment (PA), from near-field releases to radiological consequences in the surface environment. Software used to implement assessment models has been developed within the project. The software comprises a toolbox based on the commercial packages Matlab and Simulink, used to solve compartment-based differential equation systems, but with an added user-friendly graphical interface. This report documents the new simulation toolbox and a newly developed Discretisation Block, which is a powerful tool for solving problems involving a network of compartments in two dimensions.

  18. The SSI TOOLBOX Source Term Model SOSIM - Screening for important radionuclides and parameter sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Avila Moreno, R.; Barrdahl, R.; Haegg, C.

    1995-05-01

    The main objective of the present study was to carry out a screening and a sensitivity analysis of the SSI TOOLBOX source term model SOSIM. This model is a part of the SSI TOOLBOX for radiological impact assessment of the Swedish disposal concept for high-level waste, KBS-3. The outputs of interest for this purpose were: the total released fraction, the time of total release, the time and value of the maximum release rate, and the dose rates after direct releases to the biosphere. The source term equations were derived, and simple equations and methods were proposed for their calculation. A literature survey has been performed in order to determine a characteristic variation range and a nominal value for each model parameter. In order to reduce the model uncertainties, the authors recommend a change in the initial boundary condition for the solution of the diffusion equation for highly soluble nuclides. 13 refs.

  19. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    Science.gov (United States)

    Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T

    2017-01-01

    In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  20. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    Science.gov (United States)

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two-dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstruction, and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include:
    •The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms.
    •The modular approach, along with the lookup tables implemented, helps avoid the indeterminate results that may occur when attempting to evaluate the transform directly.
    •The concept also prevents unnecessary recomputation of already-known transforms, thereby saving memory and processing time.
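    The "significantly simpler transforms" referred to are, in essence, a Fourier series in the angular variable combined with Hankel transforms in the radial variable. For a radially symmetric function only the zeroth-order Hankel transform survives, which makes for a compact numerical check. The sketch below (plain Python/SciPy, not the symbolic toolbox described in the abstract) verifies the known result that the Gaussian exp(-pi r^2) is its own 2D Fourier transform.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import j0

def hankel0(f, rho):
    """Zeroth-order Hankel transform in the 2*pi-frequency convention:
    F(rho) = 2*pi * int_0^inf f(r) * J0(2*pi*rho*r) * r dr,
    which equals the 2D Fourier transform of a radially symmetric f."""
    val, _ = quad(lambda r: f(r) * j0(2 * np.pi * rho * r) * r,
                  0.0, np.inf, limit=200)
    return 2 * np.pi * val

f = lambda r: np.exp(-np.pi * r ** 2)     # radially symmetric Gaussian
for rho in (0.0, 0.5, 1.0):
    exact = np.exp(-np.pi * rho ** 2)     # the Gaussian is its own transform
    print(rho, hankel0(f, rho), exact)
```

    For a non-symmetric function the same machinery applies term by term to each angular Fourier coefficient, using the Hankel transform of matching order.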

  1. Optimisation of tungsten ore processing through a deep mineralogical characterisation and the study of the crushing process

    OpenAIRE

    Bascompte Vaquero, Jordi

    2017-01-01

    The relentlessly increasing global demand for metals calls for the urgent development of more efficient extraction and processing methods in the mining industry. Comminution is responsible for nearly half of the energy consumption of the entire mining process, and in the majority of cases it is far from optimised. Within comminution, grinding is widely known to be less efficient than crushing; however, it is needed to reach liberation at an ultrafine particle size. ...

  2. Development of a CRISPR/Cas9 genome editing toolbox for Corynebacterium glutamicum.

    Science.gov (United States)

    Liu, Jiao; Wang, Yu; Lu, Yujiao; Zheng, Ping; Sun, Jibin; Ma, Yanhe

    2017-11-16

    Corynebacterium glutamicum is an important industrial workhorse for which advanced genetic engineering tools are urgently needed. Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) and their CRISPR-associated proteins (Cas) have revolutionized the field of genome engineering. The CRISPR/Cas9 system, which utilizes NGG as the protospacer adjacent motif (PAM) and has good targeting specificity, can be developed into a powerful tool for efficient and precise genome editing of C. glutamicum. Herein, we developed a versatile CRISPR/Cas9 genome editing toolbox for C. glutamicum. Cas9 and gRNA expression cassettes were reconstituted to combat Cas9 toxicity and facilitate effective termination of gRNA transcription. Co-transformation of Cas9 and gRNA expression plasmids was exploited to overcome high-frequency mutation of cas9, allowing not only highly efficient gene deletion and insertion with plasmid-borne editing templates (efficiencies up to 60.0 and 62.5%, respectively) but also simple and time-saving operation. Furthermore, CRISPR/Cas9-mediated ssDNA recombineering was developed to precisely introduce small modifications and single-nucleotide changes into the genome of C. glutamicum with efficiencies over 80.0%. Notably, double-locus editing was also achieved in C. glutamicum. This toolbox works well in several C. glutamicum strains, including the widely used strains ATCC 13032 and ATCC 13869. In this study, we developed a CRISPR/Cas9 toolbox that could facilitate markerless gene deletion, gene insertion, precise base editing, and double-locus editing in C. glutamicum. The CRISPR/Cas9 toolbox holds promise for accelerating the engineering of C. glutamicum and advancing its application in the production of biochemicals and biofuels.

  3. Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox

    Science.gov (United States)

    Spee, Ton; Gillen, Matt; Lentz, Thomas J.; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-01-01

    Objectives This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Methods Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. Results This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. Conclusion The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions. PMID:22953194

  4. PredPsych: A toolbox for predictive machine learning based approach in experimental psychology research

    OpenAIRE

    Cavallo, Andrea; Becchio, Cristina; Koul, Atesh

    2016-01-01

    Recent years have seen increased interest in machine learning based predictive methods for analysing quantitative behavioural data in experimental psychology. While these methods can achieve greater sensitivity than conventional univariate techniques, they still lack an established and accessible software framework. The goal of this work was to build an open-source toolbox – “PredPsych” – that could make these methods readily available to all psychologists. PredPsych is a...

  5. Orfeo Toolbox: A Free And Open Source Solution For Research And Operational Remote Sensing Projects

    Science.gov (United States)

    Savinaud, Mickael; OTB-CS Team

    2013-12-01

    The free and open-source Orfeo ToolBox (OTB) offers the possibility to deal with large-scale data processing. This library, designed by CNES in the framework of the ORFEO accompaniment program to promote the use of Pleiades and other VHR data, now offers a larger number of applications aimed at end users. Due to its modular design, OTB is now used in different contexts, from R&D studies to operational chains.

  6. Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) Users' Workshop Presentations

    Science.gov (United States)

    Litt, Jonathan S. (Compiler)

    2018-01-01

    NASA Glenn Research Center hosted a Users' Workshop on the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) on August 21, 2017. The objective of this workshop was to update the user community on the latest features of T-MATS, and to provide a forum to present work performed using T-MATS. Presentations highlighted creative applications and the development of new features and libraries, and emphasized the flexibility and simulation power of T-MATS.

  7. Numerical solution of multi groups point kinetic equations by simulink toolbox of Matlab software

    International Nuclear Information System (INIS)

    Hadad, K.; Mohamadi, A.; Sabet, H.; Ayobian, N.; Khani, M.

    2004-01-01

    The Simulink toolbox of the Matlab software was employed to solve the point kinetics equations with six groups of delayed neutrons. The Adams-Bashforth method showed good convergence in solving the system of simultaneous equations, and the obtained results showed good agreement with other numerical schemes. The flexibility of the package in changing the system parameters and the user-friendly interface make this approach a reliable educational package for revealing the effects of reactivity changes on power excursions.
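    The six-group point kinetics equations referred to are dn/dt = ((rho - beta)/Lambda) n + sum_i lambda_i C_i and dC_i/dt = (beta_i/Lambda) n - lambda_i C_i. The Python sketch below integrates them for a small step reactivity insertion. The report used Simulink with an Adams-Bashforth scheme; here a generic stiff ODE solver is used, and the delayed-neutron constants and generation time are typical textbook values for a thermal U-235 system, not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative six-group delayed-neutron constants (typical thermal U-235
# values) and neutron generation time; assumptions for this sketch, the
# report does not list its parameters.
beta_i = np.array([0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273])
lam = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # decay consts, 1/s
beta = beta_i.sum()
Lam = 1e-4                                                   # generation time, s

def point_kinetics(t, y, rho):
    """Six-group point kinetics, y = [n, C1..C6]."""
    n, C = y[0], y[1:]
    dn = (rho - beta) / Lam * n + lam @ C
    dC = beta_i / Lam * n - lam * C
    return np.concatenate(([dn], dC))

n0 = 1.0
C0 = beta_i * n0 / (lam * Lam)     # precursor equilibrium at n = 1
rho_step = 0.1 * beta              # +0.1 dollar step reactivity insertion
sol = solve_ivp(point_kinetics, (0.0, 10.0), np.concatenate(([n0], C0)),
                args=(rho_step,), method="LSODA", rtol=1e-8, atol=1e-10)
print("relative power after 10 s:", sol.y[0, -1])
```

    The prompt jump followed by slow delayed-neutron-governed growth is exactly the behaviour such an educational package is meant to reveal when the reactivity is varied.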

  8. SU-E-J-253: The Radiomics Toolbox in the Computational Environment for Radiological Research (CERR)

    Energy Technology Data Exchange (ETDEWEB)

    Apte, A; Veeraraghavan, H; Oh, J; Kijewski, P; Deasy, J [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2015-06-15

    Purpose: To present an open source and free platform to facilitate radiomics research — the “Radiomics toolbox” in CERR. Method: There is a scarcity of open source tools that support end-to-end modeling of image features to predict patient outcomes. The “Radiomics toolbox” strives to fill the need for such a software platform. The platform supports (1) import of various image modalities such as CT, PET, MR, SPECT, and US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features such as first-order statistics, gray-level co-occurrence and zone-size matrix based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and data management are implemented in Matlab for ease of development and readability of the code by a wide audience. Open-source software developed in other programming languages is integrated to enhance various components of this toolbox, for example, the Java-based DCM4CHE for DICOM import and R for statistical analysis. Results: The Radiomics toolbox will be distributed as open source, GNU-licensed software. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC; that analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the “Computational Environment for Radiotherapy Research” to the “Computational Environment for Radiological Research”.
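    As an illustration of the co-occurrence texture features mentioned (the toolbox itself is Matlab-based; this is an independent Python sketch), a gray-level co-occurrence matrix and two classic Haralick features can be computed in a few lines.

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Gray-level co-occurrence matrix for a single pixel offset (dx, dy),
    symmetrised and normalised to a joint probability table."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            i, j = img[y, x], img[y + dy, x + dx]
            P[i, j] += 1
            P[j, i] += 1                  # count each pair in both orders
    return P / P.sum()

# Small quantised image (4 gray levels); a classic textbook example
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
P = glcm(img, levels=4)
i, j = np.indices(P.shape)
contrast = np.sum(P * (i - j) ** 2)       # Haralick contrast
energy = np.sum(P ** 2)                   # angular second moment
print(contrast, energy)
```

    In a radiomics pipeline the image inside a delineated structure is first quantised to a fixed number of gray levels, and features like these are then averaged over several offsets and directions.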

  9. An open-source toolbox for automated phenotyping of mice in behavioral tasks

    Directory of Open Access Journals (Sweden)

    Tapan P Patel

    2014-10-01

    Full Text Available Classifying behavior patterns in mouse models of neurological, psychiatric and neurodevelopmental disorders is critical for understanding disease causality and treatment. However, complete characterization of behavior is time-intensive, prone to subjective scoring, and often requires specialized equipment. Although several reports describe automated home-cage monitoring and individual task scoring methods, we report the first open source, comprehensive toolbox for automating the scoring of several common behavior tasks used by the neuroscience community. We show this new toolbox is robust and achieves equal or better consistency when compared to manual scoring methods. We use this toolbox to study the alterations in behavior that occur following blast-induced traumatic brain injury (bTBI), and study whether these behavior patterns are altered following genetic deletion of the transcription factor Ets-like kinase 1 (Elk-1). Due to the role of Elk-1 in neuronal survival and its proposed role in synaptic plasticity, we hypothesized that Elk-1 deletion would improve some neurobehavioral deficits, while impairing others, following blast exposure. In Elk-1 knockout animals, deficits in open field, spatial object recognition and elevated zero maze performance after blast exposure disappeared, while new significant deficits appeared in spatial and associative memory. These are the first data suggesting a molecular mediator of anxiety deficits following blast-induced traumatic brain injury, and they demonstrate the utility of the broad screening tool we developed. More broadly, we envision this open-source toolbox will provide a more consistent and rapid analysis of behavior across many neurological diseases, promoting the rapid discovery of novel pathways mediating disease progression and treatment.

  10. Evaluation of an Advanced Harmonic Filter for Adjustable Speed Drives using a Toolbox Approach

    DEFF Research Database (Denmark)

    Asiminoaei, Lucian; Hansen, Steffan; Blaabjerg, Frede

    2004-01-01

    A large diversity of solutions exists to reduce the harmonic emission of the 6-pulse Adjustable Speed Drive in order to fulfill the requirements of the international harmonic standards. Among them, new types of advanced harmonic filters have recently gained increased attention due to their good... Using a combination of a pre-stored database and new interpolation techniques, the toolbox can provide the harmonic data on real applications, allowing comparisons between different mitigation solutions.

  11. Visualizing flow fields using acoustic Doppler current profilers and the Velocity Mapping Toolbox

    Science.gov (United States)

    Jackson, P. Ryan

    2013-01-01

    The purpose of this fact sheet is to provide examples of how the U.S. Geological Survey is using acoustic Doppler current profilers for much more than routine discharge measurements. These instruments are capable of mapping complex three-dimensional flow fields within rivers, lakes, and estuaries. Using the Velocity Mapping Toolbox to process the ADCP data allows detailed visualization of the data, providing valuable information for a range of studies and applications.

  12. Review of qualitative approaches for the construction industry: designing a risk management toolbox.

    Science.gov (United States)

    Zalk, David M; Spee, Ton; Gillen, Matt; Lentz, Thomas J; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-06-01

    This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.

  13. IV. NIH Toolbox Cognition Battery (CB): measuring language (vocabulary comprehension and reading decoding).

    Science.gov (United States)

    Gershon, Richard C; Slotkin, Jerry; Manly, Jennifer J; Blitz, David L; Beaumont, Jennifer L; Schnipke, Deborah; Wallner-Allen, Kathleen; Golinkoff, Roberta Michnick; Gleason, Jean Berko; Hirsh-Pasek, Kathy; Adams, Marilyn Jager; Weintraub, Sandra

    2013-08-01

    Mastery of language skills is an important predictor of daily functioning and health. Vocabulary comprehension and reading decoding are relatively quick and easy to measure and correlate highly with overall cognitive functioning, as well as with success in school and work. New measures of vocabulary comprehension and reading decoding (in both English and Spanish) were developed for the NIH Toolbox Cognition Battery (CB). In the Toolbox Picture Vocabulary Test (TPVT), participants hear a spoken word while viewing four pictures, and then must choose the picture that best represents the word. This approach tests receptive vocabulary knowledge without the need to read or write, removing the literacy load for children who are developing literacy and for adults who struggle with reading and writing. In the Toolbox Oral Reading Recognition Test (TORRT), participants see a letter or word onscreen and must pronounce or identify it. The examiner determines whether it was pronounced correctly by comparing the response to the pronunciation guide on a separate computer screen. In this chapter, we discuss the importance of language during childhood and the relation of language and brain function. We also review the development of the TPVT and TORRT, including information about the item calibration process and results from a validation study. Finally, the strengths and weaknesses of the measures are discussed. © 2013 The Society for Research in Child Development, Inc.

  14. Wind wave analysis in depth limited water using OCEANLYZ, A MATLAB toolbox

    Science.gov (United States)

    Karimpour, Arash; Chen, Qin

    2017-09-01

    There are a number of well-established methods in the literature describing how to assess and analyze measured wind wave data. However, obtaining reliable results from these methods requires adequate knowledge of their behavior, strengths, and weaknesses. A proper implementation of these methods requires a series of procedures, including a pretreatment of the raw measurements and adjustment and refinement of the processed data, to provide quality assurance of the outcomes; otherwise it can lead to untrustworthy results. This paper discusses potential issues in these procedures, explains which parameters are influential for the outcomes, and suggests practical solutions to avoid and minimize errors in the wave results. The procedures for converting water pressure data into water surface elevation data, treating high-frequency data with a low signal-to-noise ratio, partitioning swell energy from wind sea, and estimating the peak wave frequency from a weighted integral of the wave power spectrum are described. Conversion and recovery of data acquired by a pressure transducer, particularly in depth-limited water such as estuaries and lakes, are explained in detail. To provide researchers with tools for reliable estimation of wind wave parameters, the Ocean Wave Analyzing toolbox, OCEANLYZ, is introduced. The toolbox contains a number of MATLAB functions for estimating wave properties in the time and frequency domains. The toolbox has been developed and tested during a number of field study projects in Louisiana's estuaries.
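    Two of the quantities discussed, significant wave height from the spectral moments and a peak frequency estimated from a weighted integral of the wave power spectrum, can be sketched as follows. The weight exponent q = 5 is a common choice for this estimator in the wave-analysis literature; it is an assumption of this sketch, not a statement about OCEANLYZ's internals.

```python
import numpy as np

def _integrate(y, x):
    """Trapezoidal integration (written out to avoid NumPy-version issues)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def wave_parameters(f, S, q=5):
    """Hm0 = 4*sqrt(m0) from the zeroth spectral moment, and a peak
    frequency from the weighted integral fp = int f*S^q df / int S^q df,
    which is less sensitive to spectral noise than taking argmax(S)."""
    m0 = _integrate(S, f)                 # zeroth moment of the spectrum
    Hm0 = 4.0 * np.sqrt(m0)               # significant wave height
    w = S ** q
    fp = _integrate(f * w, f) / _integrate(w, f)
    return Hm0, fp

# Synthetic single-peaked spectrum centred near 0.2 Hz (arbitrary units)
f = np.linspace(0.05, 1.0, 500)
S = np.exp(-0.5 * ((f - 0.2) / 0.03) ** 2)
Hm0, fp = wave_parameters(f, S)
print(Hm0, fp)
```

    On a noisy measured spectrum, argmax(S) can jump between neighbouring bins from burst to burst, whereas the weighted-integral estimate varies smoothly, which is precisely why such estimators are preferred in practice.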

  15. A Transcription Activator-Like Effector (TALE) Toolbox for Genome Engineering

    Science.gov (United States)

    Sanjana, Neville E.; Cong, Le; Zhou, Yang; Cunniff, Margaret M.; Feng, Guoping; Zhang, Feng

    2013-01-01

    Transcription activator-like effectors (TALEs) are a class of naturally occurring DNA binding proteins found in the plant pathogen Xanthomonas sp. The DNA binding domain of each TALE consists of tandem 34-amino acid repeat modules that can be rearranged according to a simple cipher to target new DNA sequences. Customized TALEs can be used for a wide variety of genome engineering applications, including transcriptional modulation and genome editing. Here we describe a toolbox for rapid construction of custom TALE transcription factors (TALE-TFs) and nucleases (TALENs) using a hierarchical ligation procedure. This toolbox facilitates affordable and rapid construction of custom TALE-TFs and TALENs within one week and can be easily scaled up to construct TALEs for multiple targets in parallel. We also provide details for testing the activity in mammalian cells of custom TALE-TFs and TALENs using, respectively, qRT-PCR and Surveyor nuclease. The TALE toolbox described here will enable a broad range of biological applications. PMID:22222791

  16. NIH Toolbox Cognitive Function Battery (CFB): Composite Scores of Crystallized, Fluid, and Overall Cognition

    Science.gov (United States)

    Akshoomoff, Natacha; Beaumont, Jennifer L.; Bauer, Patricia J.; Dikmen, Sureyya; Gershon, Richard; Mungas, Dan; Slotkin, Jerry; Tulsky, David; Weintraub, Sandra; Zelazzo, Philip; Heaton, Robert K.

    2014-01-01

    The NIH Toolbox Cognitive Function Battery (CFB) includes 7 tests covering 8 cognitive abilities considered to be important in adaptive functioning across the lifespan (from early childhood to late adulthood). Here we present data on psychometric characteristics in children (N = 208; ages 3–15 years) of a total summary score and composite scores reflecting two major types of cognitive abilities: “crystallized” (more dependent upon past learning experiences) and “fluid” (capacity for new learning and information processing in novel situations). Both types of cognition are considered important in everyday functioning, but are thought to be differently affected by brain health status throughout life, from early childhood through older adulthood. All three Toolbox composite scores showed excellent test-retest reliability, robust developmental effects across the childhood age range considered here, and strong correlations with established, “gold standard” measures of similar abilities. Additional preliminary evidence of validity includes significant associations between all three Toolbox composite scores and maternal reports of children’s health status and school performance. PMID:23952206

  17. ARCTIS — A MATLAB® Toolbox for Archaeological Imaging Spectroscopy

    Directory of Open Access Journals (Sweden)

    Clement Atzberger

    2014-09-01

    Full Text Available Imaging spectroscopy acquires imagery in hundreds or more narrow contiguous spectral bands. This offers unprecedented information for archaeological research. To extract the maximum of useful archaeological information from it, however, a number of problems have to be solved. Major problems relate to data redundancy and the visualization of the large amount of data. This makes data mining approaches necessary, as well as efficient data visualization tools. Additional problems relate to data quality. Indeed, the upwelling electromagnetic radiation is recorded in small spectral bands that are only about ten nanometers wide. The signal received by the sensor is thus quite low compared to sensor noise and possible atmospheric perturbations. The often small instantaneous field of view (IFOV)—essential for archaeologically relevant imaging spectrometer datasets—further limits the useful signal stemming from the ground. The combination of both effects makes radiometric smoothing techniques mandatory. The present study details the functionality of a MATLAB®-based toolbox, called ARCTIS (ARChaeological Toolbox for Imaging Spectroscopy), for filtering, enhancing, analyzing, and visualizing imaging spectrometer datasets. The toolbox addresses the above-mentioned problems. Its Graphical User Interface (GUI) is designed to allow non-experts in remote sensing to extract a wealth of information from imaging spectroscopy for archaeological research. ARCTIS will be released under a Creative Commons license, free of charge, via website (http://luftbildarchiv.univie.ac.at).

  18. ERPLAB: An Open-Source Toolbox for the Analysis of Event-Related Potentials

    Directory of Open Access Journals (Sweden)

    Javier eLopez-Calderon

    2014-04-01

    Full Text Available ERPLAB Toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.

  19. Velocity Mapping Toolbox (VMT): a processing and visualization suite for moving-vessel ADCP measurements

    Science.gov (United States)

    Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.

    2013-01-01

    The use of acoustic Doppler current profilers (ADCP) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years and has been primarily driven by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments and this paper builds on this progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging and smoothing), visualization (planform and cross-section vector and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
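
    The vector rotation and projection step this record describes amounts to projecting east/north velocity components onto a transect's along- and cross-section axes. The minimal NumPy sketch below is an illustration, not VMT's actual routine; the azimuth convention (degrees clockwise from north) is an assumption stated in the comments.

```python
import numpy as np

def rotate_to_section(u_east, v_north, section_azimuth_deg):
    """Project east/north velocities onto along-section (u_s) and
    cross-section (v_s) components for a transect whose azimuth is given
    in degrees clockwise from north."""
    theta = np.deg2rad(section_azimuth_deg)
    # Unit vector along the section and its right-hand normal, as (east, north)
    along = np.array([np.sin(theta), np.cos(theta)])
    normal = np.array([np.cos(theta), -np.sin(theta)])
    vel = np.stack([u_east, v_north])
    return along @ vel, normal @ vel

# Hypothetical ensemble velocities on a due-east transect (azimuth 90 deg)
u_s, v_s = rotate_to_section(np.array([0.5, 1.0]), np.array([0.5, 0.0]), 90.0)
```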

  20. Weight Optimisation of Steel Monopile Foundations for Offshore Windfarms

    DEFF Research Database (Denmark)

    Fog Gjersøe, Nils; Bouvin Pedersen, Erik; Kristensen, Brian

    2015-01-01

    The potential for mass reduction of monopiles in offshore windfarms using current design practice is investigated. Optimisation by sensitivity analysis is carried out for the following important parameters: wall thickness distribution between tower and monopile, soil stiffness, damping ratio...

  1. Protection against natural radiation: Optimisation and decision exercises

    International Nuclear Information System (INIS)

    O'Riordan, M.C.

    1984-02-01

    Six easy exercises are presented in which cost-benefit analysis is used to optimise protection against natural radiation or to decide whether protection is appropriate. The exercises are illustrative only and do not commit the Board. (author)

  2. Optimisation of wheat-sprouted soybean flour bread using response ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-11-16

    Nov 16, 2009 ... Full Length Research Paper. Optimisation of ... Victoria A. Jideani1* and Felix C. Onwubali2. 1Department of Food Technology, Cape Peninsula University of Technology, P. O. Box 652, Cape Town 8000, South. Africa.

  3. Optimised intake stroke analysis for flat and dome head pistons ...

    African Journals Online (AJOL)

    Optimised intake stroke analysis for flat and dome head pistons. ... in understanding the performance characteristics optioned between flat head and dome head pistons in engine design. ...

  4. Distributed optimisation problem with communication delay and external disturbance

    Science.gov (United States)

    Tran, Ngoc-Tu; Xiao, Jiang-Wen; Wang, Yan-Wu; Yang, Wu

    2017-12-01

    This paper investigates the distributed optimisation problem for multi-agent systems (MASs) with the simultaneous presence of external disturbance and communication delay. To solve this problem, a two-step design scheme is introduced. In the first step, based on the internal model principle, an internal model term is constructed to compensate for the disturbance asymptotically. In the second step, a distributed optimisation algorithm is designed to solve the distributed optimisation problem for MASs with the simultaneous presence of disturbance and communication delay. Moreover, in the proposed algorithm, each agent interacts with its neighbours through the connected topology and the delay occurs during the information exchange. By utilising a Lyapunov-Krasovskii functional, delay-dependent conditions are derived for both slowly and fast time-varying delays to ensure the convergence of the algorithm to the optimal solution of the optimisation problem. Several numerical simulation examples are provided to illustrate the effectiveness of the theoretical results.
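
    As a much-simplified illustration of the distributed setting described here (ignoring the disturbance and delay the paper actually handles), each agent can mix its estimate with its neighbours' via doubly stochastic weights and then take a local gradient step. All numbers below are invented.

```python
import numpy as np

# Three agents with local costs f_i(x) = (x - a_i)^2; the minimiser of
# sum_i f_i is the mean of the a_i. Each iteration: consensus averaging
# followed by a local gradient step.
a = np.array([1.0, 2.0, 6.0])
W = np.full((3, 3), 1.0 / 3.0)   # doubly stochastic averaging matrix
x = np.zeros(3)
alpha = 0.01                     # small constant step size
for _ in range(2000):
    x = W @ x - alpha * 2.0 * (x - a)
# Each agent's estimate ends up close to mean(a) = 3.0, with a small
# residual disagreement proportional to alpha.
```

    With a constant step size the agents converge to a neighbourhood of the optimum; a diminishing step size (as in most of the distributed-optimisation literature) removes the residual error.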

  5. Shape optimisation and performance analysis of flapping wings

    KAUST Repository

    Ghommem, Mehdi; Collier, Nathan; Niemi, Antti; Calo, Victor M.

    2012-01-01

    optimised shapes produce efficient flapping flights, the wake pattern and its vorticity strength are examined. The work described in this paper should facilitate better guidance for shape design of engineered flying systems.

  6. Share-of-Surplus Product Line Optimisation with Price Levels

    Directory of Open Access Journals (Sweden)

    X. G. Luo

    2014-01-01

    Full Text Available Kraus and Yano (2003) established the share-of-surplus product line optimisation model and developed a heuristic procedure for this nonlinear mixed-integer optimisation model. In their model, the price of a product is defined as a continuous decision variable. However, because product line optimisation is a planning process in the early stage of product development, pricing decisions usually are not very precise. In this research, a nonlinear integer programming share-of-surplus product line optimisation model that allows the selection of candidate price levels for products is established. The model is further transformed into an equivalent linear mixed-integer optimisation model by applying linearisation techniques. Experimental results in different market scenarios show that the computation time of the transformed model is much less than that of the original model.

  7. Optimising a fall out dust monitoring sampling programme at a ...

    African Journals Online (AJOL)

    GREG

    Key words: Fall out dust monitoring, cement plant, optimising, air pollution sampling, fall out dust sampler locations. .... applied for those areas where controls are in place. Sampling ..... mass balance in the total cement manufacturing process.

  8. Issues with performance measures for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available Symposium on Computational Intelligence in Dynamic and Uncertain Environments (CIDUE), Mexico, 20-23 June 2013 Issues with Performance Measures for Dynamic Multi-objective Optimisation Mardé Helbig CSIR: Meraka Institute Brummeria, South Africa...

  9. Optimisation Study on the Production of Anaerobic Digestate ...

    African Journals Online (AJOL)

    DR. AMIN

    optimise the production of ADC from organic fractions of domestic wastes and the effects of ADC amendments on soil .... (22%), cooked meat (9%), lettuce (11%), carrots. (3%), potato (44%) ... seed was obtained from a mesophilic anaerobic.

  10. Algorithm for optimisation of paediatric chest radiography

    International Nuclear Information System (INIS)

    Kostova-Lefterova, D.

    2016-01-01

    The purpose of this work is to assess the current practice and patient doses in paediatric chest radiography in a large university hospital. The X-ray unit is used in the paediatric department for respiratory diseases. Another purpose was to recommend and apply optimised protocols that reduce patient dose while maintaining diagnostic image quality of the X-ray images. The practice of two different radiographers was studied. The results were compared with the existing practice in paediatric chest radiography and the opportunities for optimisation were identified in order to reduce patient doses. A methodology was developed for optimisation of the X-ray examinations by grouping children into age groups or according to other appropriate indications and creating an algorithm for proper selection of the exposure parameters for each group. The algorithm for the optimisation of paediatric chest radiography reduced patient doses (PKA, organ dose, effective dose) by a factor of 1.5 to 6 for the different age groups, the average glandular dose by up to a factor of 10, and the lung dose by a factor of 2 to 5. The resulting X-ray images were of good diagnostic quality. The subjectivity in the choice of exposure parameters was reduced and standardization was achieved in the work of the radiographers. The roles of the radiologist, the medical physicist and the radiographer in the process of optimisation were shown. The effect of teamwork in reducing patient doses while keeping adequate image quality was demonstrated. Key words: Chest Radiography. Paediatric Radiography. Optimization. Radiation Exposure. Radiation Protection

  11. Optimising preterm nutrition: present and future

    LENUS (Irish Health Repository)

    Brennan, Ann-Marie

    2016-04-01

    The goal of preterm nutrition in achieving growth and body composition approximating that of the fetus of the same postmenstrual age is difficult to achieve. Current nutrition recommendations depend largely on expert opinion, due to lack of evidence, and are primarily birth weight based, with no consideration given to gestational age and/or need for catch-up growth. Assessment of growth is based predominately on anthropometry, which gives insufficient attention to the quality of growth. The present paper provides a review of the current literature on the nutritional management and assessment of growth in preterm infants. It explores several approaches that may be required to optimise nutrient intakes in preterm infants, such as personalising nutritional support, collection of nutrient intake data in real-time, and measurement of body composition. In clinical practice, the response to inappropriate nutrient intakes is delayed as the effects of under- or overnutrition are not immediate, and there is limited nutritional feedback at the cot-side. The accurate and non-invasive measurement of infant body composition, assessed by means of air displacement plethysmography, has been shown to be useful in assessing quality of growth. The development and implementation of personalised, responsive nutritional management of preterm infants, utilising real-time nutrient intake data collection, with ongoing nutritional assessments that include measurement of body composition is required to help meet the individual needs of preterm infants.

  12. Optimising Boltzmann codes for the PLANCK era

    International Nuclear Information System (INIS)

    Hamann, Jan; Lesgourgues, Julien; Balbi, Amedeo; Quercellini, Claudia

    2009-01-01

    High precision measurements of the Cosmic Microwave Background (CMB) anisotropies, as can be expected from the PLANCK satellite, will require high-accuracy theoretical predictions as well. One possible source of theoretical uncertainty is the numerical error in the output of the Boltzmann codes used to calculate angular power spectra. In this work, we carry out an extensive study of the numerical accuracy of the public Boltzmann code CAMB, and identify a set of parameters which determine the error of its output. We show that at the current default settings, the cosmological parameters extracted from data of future experiments like Planck can be biased by several tenths of a standard deviation for the six parameters of the standard ΛCDM model, and potentially more seriously for extended models. We perform an optimisation procedure that leads the code to achieve sufficient precision while at the same time keeping the computation time within reasonable limits. Our conclusion is that the contribution of numerical errors to the theoretical uncertainty of model predictions is well under control—the main challenges for more accurate calculations of CMB spectra will be of an astrophysical nature instead

  13. Towards a Toolbox for a Process Innovation Laboratory

    DEFF Research Database (Denmark)

    Gong, Jie; Møller, Charles

    2011-01-01

    Driven by global competition, process innovation has been a challenge to large enterprises for many years. The research in this paper discusses the theory and methods of Business Process Innovation (BPI) in designing Intelligent Supply Chains (ISC) in inter-organizational settings. We first review...... the BPI concepts, compare them with Business Process Improvement (BPIm) and Business Process Reengineering (BPR), and then build an analytical framework to define business process innovation from four different perspectives: management science, knowledge management, information technology and quality......

  14. Towards a Toolbox for a Process Innovation Laboratory

    DEFF Research Database (Denmark)

    Gong, Jie; Møller, Charles

    2012-01-01

    Driven by global competition, process innovation has been a challenge to large enterprises for many years. The research in this paper discusses the theory and methods of Business Process Innovation (BPI) in designing Intelligent Supply Chains (ISC) in inter-organizational settings. We first review...... the BPI concepts, compare them with Business Process Improvement (BPIm) and Business Process Reengineering (BPR), and then build an analytical framework to define business process innovation from four different perspectives: management science, knowledge management, information technology and quality......

  15. Geometrical exploration of a flux-optimised sodium receiver through multi-objective optimisation

    Science.gov (United States)

    Asselineau, Charles-Alexis; Corsi, Clothilde; Coventry, Joe; Pye, John

    2017-06-01

    A stochastic multi-objective optimisation method is used to determine receiver geometries with maximum second law efficiency, minimal average temperature and minimal surface area. The method is able to identify a set of Pareto optimal candidates that show advantageous geometrical features, mainly in being able to maximise the intercepted flux within the geometrical boundaries set. Receivers with first law thermal efficiencies ranging from 87% to 91% are also evaluated using the second law of thermodynamics and found to have similar efficiencies of over 60%, highlighting the influence that the geometry can play in the maximisation of the work output of receivers by influencing the distribution of the flux from the concentrator.
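
    Identifying the Pareto-optimal candidates among evaluated receiver geometries reduces to a dominance filter over the objective tuples. The generic sketch below uses made-up numbers and casts all three objectives as minimisation (efficiency is negated into a loss); it is an illustration of the concept, not the paper's method.

```python
def pareto_front(points):
    """Keep the points not dominated by any other (all objectives minimised).
    A point q dominates p if q is at least as good in every objective and differs."""
    front = []
    for p in points:
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p)) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical candidates as (1 - efficiency, average temperature K, area m^2)
candidates = [(0.10, 900.0, 4.0), (0.12, 850.0, 5.0), (0.15, 950.0, 6.0)]
front = pareto_front(candidates)  # the third candidate is dominated by the first
```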

  16. Intelligent Support for a Computer Aided Design Optimisation Cycle

    OpenAIRE

    B. Dolšak; M. Novak; J. Kaljun

    2006-01-01

    It is becoming more and more evident that adding intelligence to existing computer aids, such as computer aided design systems, can lead to significant improvements in the effective and reliable performance of various engineering tasks, including design optimisation. This paper presents three different intelligent modules to be applied within a computer aided design optimisation cycle to enable more intelligent and less experience-dependent design performance.

  17. A supportive architecture for CFD-based design optimisation

    Science.gov (United States)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

    Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their applications in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. It is desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing works has found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing techniques and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or the data level to fully utilise the capabilities of different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation.
To illustrate the effectiveness of the proposed architecture and algorithms, the case studies on aerodynamic shape design of a hypersonic cruising vehicle are provided, and the result has shown that the proposed architecture

  18. OPTIMISATION OF A DRIVE SYSTEM AND ITS EPICYCLIC GEAR SET

    OpenAIRE

    Bellegarde, Nicolas; Dessante, Philippe; Vidal, Pierre; Vannier, Jean-Claude

    2007-01-01

    This paper describes the design of a drive consisting of a DC motor, a speed reducer, a lead screw transformation system, a power converter and its associated DC source. The objective is to reduce the mass of the system. Indeed, the volume and weight optimisation of an electrical drive is an important issue for embedded applications. Here, we present an analytical model of the system in a specific application and afterwards an optimisation of the motor and speed reduce...

  19. Design of optimised backstepping controller for the synchronisation of chaotic Colpitts oscillator using shark smell algorithm

    Science.gov (United States)

    Fouladi, Ehsan; Mojallali, Hamed

    2018-01-01

    In this paper, an adaptive backstepping controller has been tuned to synchronise two chaotic Colpitts oscillators in a master-slave configuration. The parameters of the controller are determined using the shark smell optimisation (SSO) algorithm. Numerical results are presented and compared with those of the particle swarm optimisation (PSO) algorithm. Simulation results show better performance in terms of accuracy and convergence for the proposed method compared to the PSO-optimised controller or any non-optimised backstepping controller.
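
    For context, the PSO baseline mentioned here can be sketched in a few lines. The example below is a generic 1-D particle swarm with textbook inertia and acceleration coefficients, minimising a stand-in quadratic cost instead of the paper's controller-synchronisation error; all parameter values are illustrative.

```python
import random

def pso(f, lo, hi, n_particles=20, iters=200, seed=1):
    """Minimal particle swarm optimisation for a 1-D objective on [lo, hi]."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]
    v = [0.0] * n_particles
    pbest = x[:]                  # personal best positions
    gbest = min(x, key=f)         # global best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            v[i] = 0.7 * v[i] + 1.5 * r1 * (pbest[i] - x[i]) + 1.5 * r2 * (gbest - x[i])
            x[i] = min(hi, max(lo, x[i] + v[i]))
            if f(x[i]) < f(pbest[i]):
                pbest[i] = x[i]
        gbest = min(pbest, key=f)
    return gbest

# Toy stand-in for a controller-tuning cost: minimise (gain - 2)^2
best = pso(lambda g: (g - 2.0) ** 2, 0.0, 10.0)
```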

  20. Optimisation of the formulation of a bubble bath by a chemometric approach: market segmentation and optimisation.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gennaro, Maria Carla; Bertetto, Mariella

    2003-03-01

    The optimisation of the formulation of a commercial bubble bath was performed by chemometric analysis of Panel Test results. A first Panel Test was performed to choose the best essence among four proposed to the consumers; the chosen essence was used in the revised commercial bubble bath. Afterwards, the effect of changing the amounts of four components of the bubble bath (the primary surfactant, the essence, the hydrating agent and the colouring agent) was studied by a fractional factorial design. The segmentation of the bubble bath market was performed by a second Panel Test, in which the consumers were asked to evaluate the samples coming from the experimental design. The results were then treated by Principal Component Analysis. The market had two segments: people preferring a product with a rich formulation and people preferring a poor one. The final target, i.e. the optimisation of the formulation for each segment, was achieved by calculating regression models relating the subjective evaluations given by the Panel to the compositions of the samples. The regression models allowed the identification of the best formulations for the two segments of the market.
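
    The regression step described here, relating panel scores to sample composition, amounts to an ordinary least-squares fit on the coded design factors. The NumPy sketch below uses an invented 2^(4-1) fractional factorial design and invented liking scores, not the paper's data.

```python
import numpy as np

# Hypothetical data: four formulation factors coded -1/+1 (a 2^(4-1)
# fractional factorial, fourth column = product of the first three) and
# the panel's mean liking score for each of the eight samples.
X = np.array([
    [-1, -1, -1, -1],
    [ 1, -1, -1,  1],
    [-1,  1, -1,  1],
    [ 1,  1, -1, -1],
    [-1, -1,  1,  1],
    [ 1, -1,  1, -1],
    [-1,  1,  1, -1],
    [ 1,  1,  1,  1],
], dtype=float)
y = np.array([5.1, 6.8, 5.5, 7.0, 4.9, 6.6, 5.2, 7.1])

A = np.column_stack([np.ones(len(y)), X])   # intercept + main effects
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
# coef[1:] estimates how each factor shifts the panel score; a segment's
# formulation is chosen by pushing each factor in the sign of its effect.
```

    Because the design is orthogonal, the intercept equals the overall mean score and each effect is estimated independently of the others.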

  1. Optimisation of integrated energy and materials systems

    International Nuclear Information System (INIS)

    Gielen, D.J.; Okken, P.A.

    1994-06-01

    To define cost-effective long-term CO2 reduction strategies, an integrated energy and materials system model for the Netherlands for the period 2000-2040 is developed. The model is based upon the energy system model MARKAL, which configures an optimal mix of technologies to satisfy the specified energy and product/materials service demands. This study concentrates on CO2 emission reduction in the materials system. For this purpose, the energy system model is enlarged with a materials system model including all steps 'from cradle to grave'. The materials system model includes 29 materials, 20 product groups and 30 waste materials. The system is divided into seven types of technologies; 250 technologies are modeled. The results show that the integrated optimisation of the energy system and the materials system can significantly reduce emission reduction costs, especially at higher reduction percentages. The reduction is achieved through shifts in materials production and waste handling and through materials substitution in products. Shifts in materials production and waste management seem cost-effective, while the cost-effectiveness of shifts in product composition is sensitive to the cost structure of products. For the building sector, transportation applications and packaging, CO2 policies show a significant impact on prices, and shifts in product composition could occur. For other products, the reduction through materials substitution seems less promising. The impact on materials consumption seems most significant for cement (reduced), timber and aluminium (both increased). For steel and plastics, the net effect is balanced, but shifts between applications do occur. The MARKAL approach is well suited to studying integrated energy and materials systems. Its advance over other environmental system analysis instruments is much greater insight into the interaction of technologies on a national scale and over time.
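
    The core of such a model is choosing a technology mix that minimises cost under an emission constraint. MARKAL does this with linear programming over hundreds of technologies; the idea can be caricatured with a brute-force toy in which every number is invented.

```python
from itertools import product

# Toy MARKAL-style choice: pick one technology per service demand so that
# total cost is minimised subject to a CO2 cap. Entries are (name, cost, CO2).
options = {
    "electricity": [("coal", 10, 9.0), ("gas", 14, 4.0), ("wind", 20, 0.0)],
    "materials":   [("steel", 8, 2.0), ("timber", 9, 0.5)],
}

def best_mix(options, co2_cap):
    """Enumerate all combinations and keep the cheapest one under the cap."""
    best = None
    for combo in product(*options.values()):
        cost = sum(c for _, c, _ in combo)
        co2 = sum(e for _, _, e in combo)
        if co2 <= co2_cap and (best is None or cost < best[0]):
            best = (cost, [name for name, _, _ in combo])
    return best

# Tightening the cap forces a switch from the cheapest mix to a cleaner one.
mix_strict = best_mix(options, co2_cap=5.0)
mix_loose = best_mix(options, co2_cap=100.0)
```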

  2. An approach to next step device optimisation

    International Nuclear Information System (INIS)

    Salpietro, E.

    2000-01-01

    The requirements for the ITER EDA were to achieve ignition with a good safety margin and a controlled long inductive burn. These requirements lead to a big device, which required too ambitious a step to be undertaken by the world fusion community. More realistic objectives for a next step device are to demonstrate the net production of energy with a high energy gain factor (Q) and a high bootstrap current fraction (>60%), which is required for a Fusion Power Plant (FPP). The Next Step Device (NSD) shall also allow operational flexibility in order to explore a large range of plasma parameters to find the optimum concept for the fusion power plant prototype. These requirements could be too demanding for a single device and could probably be better explored in a strongly integrated world programme. The cost of one or more devices is the decisive factor in the choice of the fusion power development programme strategy. The plasma elongation and triangularity have a strong impact on the cost of the device and are limited by the plasma vertical position control issue. The distance between the plasma separatrix and the toroidal field conductor does not vary much between devices. It is determined by the sum of the distance between the first wall and the plasma separatrix and the thickness of the nuclear shield required to protect the toroidal field coil insulation. The thickness of the TF coil is determined by the allowable stresses and superconducting characteristics. The outer radius of the central solenoid is the result of an optimisation to provide the magnetic flux to inductively drive the plasma. Therefore, in order to achieve the objectives for Q and bootstrap current fraction at minimum cost, the plasma aspect ratio and magnetic field value must be determined. The paper presents the critical issues for the next device and considers the optimal way to proceed towards the realisation of the fusion power plant.

  3. Optimisation of material discrimination using spectral CT

    International Nuclear Information System (INIS)

    Nik, S.J.; Meyer, J.; Watts, R.

    2010-01-01

    Full text: Spectral computed tomography (CT) using novel X-ray photon counting detectors (PCDs) with energy-resolving capabilities is capable of providing energy-selective images. This extra energy information may allow materials such as iodine and calcium, or water and fat, to be distinguished. PCDs have energy thresholds, enabling the classification of photons into multiple energy bins. The information content of spectral CT images depends on how the photons are grouped together. In this work, a method is presented to optimise energy windows for maximum material discrimination. Given a combination of thicknesses, the reference number of expected photons in each energy bin is computed using the Beer-Lambert equation. A similar calculation is performed for an exhaustive range of thicknesses and the number of photons in each case is compared to the reference, allowing a statistical map of the uncertainty in thickness parameters to be constructed. The 63%-confidence region in the two-dimensional thickness space is a representation of how optimal the bins are for material separation. The model is demonstrated with 0.1 mm of iodine and 2.2 mm of calcium using two adjacent bins encompassing the entire energy range. Bins bordering at the iodine K-edge of 33.2 keV are found to be optimal. When compared to two abutted energy bins with equal incident counts as used in the literature (bordering at 54 keV), the thickness uncertainties are reduced from approximately 4% to less than 1% (see Figure). This approach has been developed for two materials and is expandable to an arbitrary number of materials and bins.
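
    The Beer-Lambert forward model at the heart of this method can be sketched for two materials and two energy bins. The attenuation coefficients below are illustrative stand-ins, not tabulated values; the only physical feature they encode is that iodine's attenuation jumps upward across its K-edge while calcium's decreases smoothly with energy.

```python
import math

# Illustrative (not tabulated) attenuation coefficients in 1/mm for bins
# just below and just above the iodine K-edge at 33.2 keV.
MU = {  # material: (mu_low_bin, mu_high_bin)
    "iodine":  (6.0, 9.0),   # attenuation jumps up across the K-edge
    "calcium": (1.2, 0.8),   # decreases smoothly with energy
}

def expected_counts(n0, t_iodine, t_calcium):
    """Beer-Lambert expected photon counts in each bin behind the two slabs."""
    return tuple(
        n0 * math.exp(-(MU["iodine"][b] * t_iodine + MU["calcium"][b] * t_calcium))
        for b in (0, 1)
    )

# Thicknesses from the record's demonstration case: 0.1 mm iodine, 2.2 mm calcium
low, high = expected_counts(1e5, 0.1, 2.2)
# Adding iodine suppresses the high bin more than the low bin, while adding
# calcium does the opposite; that opposite signature makes the pair separable.
```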

  4. Toolbox for Research, or how to facilitate a central data management in small-scale research projects.

    Science.gov (United States)

    Bialke, Martin; Rau, Henriette; Thamm, Oliver C; Schuldt, Ronny; Penndorf, Peter; Blumentritt, Arne; Gött, Robert; Piegsa, Jens; Bahls, Thomas; Hoffmann, Wolfgang

    2018-01-25

    In most research projects, budget, staff and IT infrastructure are limiting resources. Especially for small-scale registries and cohort studies, professional IT support and commercial electronic data capture systems are too expensive. Consequently, these projects use simple local approaches (e.g. Excel) for data capture instead of a central data management including web-based data capture and proper research databases. This leads to manual processes to merge, analyze and, if possible, pseudonymize research data of different study sites. To support multi-site data capture, storage and analyses in small-scale research projects, corresponding requirements were analyzed within the MOSAIC project. Based on the identified requirements, the Toolbox for Research was developed as a flexible software solution for various research scenarios. Additionally, the Toolbox facilitates integration of research data as well as metadata by performing the necessary procedures automatically. Toolbox modules also allow the integration of device data. Moreover, separation of personally identifiable information and medical data, using only pseudonyms for storing medical data, ensures compliance with data protection regulations. The pseudonymized data can then be exported in SPSS format in order to enable scientists to prepare reports and analyses. The Toolbox for Research was successfully piloted in the German Burn Registry in 2016, facilitating the documentation of 4350 burn cases at 54 study sites. The Toolbox for Research can be downloaded free of charge from the project website and automatically installed thanks to the use of Docker technology.

  5. Algorithme intelligent d'optimisation d'un design structurel de grande envergure (An intelligent optimisation algorithm for large-scale structural design)

    Science.gov (United States)

    Dominique, Stephane

    genetic algorithm that prevents new individuals from being born too close to previously evaluated solutions. The restricted area becomes smaller or larger during the optimisation to allow global or local search when necessary. Also, a new search operator named the Substitution Operator is incorporated in GATE. This operator allows an ANN surrogate model to guide the algorithm toward the most promising areas of the design space. The suggested CBR approach and GATE were tested on several simple test problems, as well as on the industrial problem of designing a gas turbine engine rotor's disc. These results are compared to results obtained for the same problems by many other popular optimisation algorithms, such as (depending on the problem) gradient algorithms, a binary genetic algorithm, a real-number genetic algorithm, a genetic algorithm using multiple-parent crossovers, a differential evolution algorithm, the Hooke & Jeeves generalized pattern search method and POINTER from the software I-SIGHT 3.5. Results show that GATE is quite competitive, giving the best results for 5 of the 6 constrained optimisation problems. GATE also provided the best results of all on a problem produced by a Maximum Set Gaussian landscape generator. Finally, GATE provided a disc 4.3% lighter than the best other tested algorithm (POINTER) for the gas turbine engine rotor's disc problem. One drawback of GATE is its lower efficiency on highly multimodal unconstrained problems, for which it gave quite poor results relative to its implementation cost. To conclude, according to the preliminary results obtained during this thesis, the suggested CBR process, combined with GATE, seems to be a very good candidate to automate and accelerate the structural design of mechanical devices, potentially reducing significantly the cost of industrial preliminary design processes.
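The "restricted area" mechanic described above can be caricatured in a few lines: candidates born too close to any previously evaluated solution are rejected, and the exclusion radius shrinks over time to shift from global toward local search. This sketch keeps only that mechanic (uniform random sampling stands in for GATE's crossover and mutation); all parameter names are illustrative.

```python
import math
import random

def gate_style_search(f, bounds, radius, generations=200, pop=20, shrink=0.95):
    """Toy restricted-area search: reject candidates within `radius` of any
    archived (already evaluated) solution, and shrink the radius each
    generation so the search gradually becomes more local."""
    archive = []   # list of (point, value) for every evaluated solution
    best = None
    for _ in range(generations):
        for _ in range(pop):
            x = [random.uniform(lo, hi) for lo, hi in bounds]
            if any(math.dist(x, a) < radius for a, _ in archive):
                continue  # too close to a previously evaluated solution
            fx = f(x)
            archive.append((x, fx))
            if best is None or fx < best[1]:
                best = (x, fx)
        radius *= shrink  # allow progressively more local search
    return best
```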

  6. A methodological approach to the design of optimising control strategies for sewer systems

    DEFF Research Database (Denmark)

    Mollerup, Ane Loft; Mikkelsen, Peter Steen; Sin, Gürkan

    2016-01-01

    This study focuses on designing an optimisation-based control for a sewer system in a methodological way and linking it to a regulatory control. Optimisation-based design is found to depend on the proper choice of a model, the formulation of the objective function and the tuning of optimisation parameters. Accordingly, two novel optimisation configurations are developed, where the optimisation either acts on the actuators or acts on the regulatory control layer. These two optimisation designs are evaluated on a sub-catchment of the sewer system in Copenhagen, and found to perform better than the existing...

  7. Transmit Power Optimisation in Wireless Network

    Directory of Open Access Journals (Sweden)

    Besnik Terziu

    2011-09-01

    Full Text Available Transmit power optimisation in wireless networks based on beamforming has emerged as a promising technique to enhance the spectrum efficiency of present and future wireless communication systems. The aim of this study is to minimise the access point power consumption in cellular networks while maintaining a targeted quality of service (QoS) for the mobile terminals. In this study, the targeted quality of service is delivered to a mobile station by providing a desired level of Signal to Interference and Noise Ratio (SINR). Base stations are coordinated across multiple cells in a multi-antenna beamforming system. This study focuses on a multi-cell multi-antenna downlink scenario where each mobile user is equipped with a single antenna, but where multiple mobile users may be active simultaneously in each cell and are separated via spatial multiplexing using beamforming. The design criterion is to minimise the total weighted transmitted power across the base stations subject to SINR constraints at the mobile users. The main contribution of this study is to define an iterative algorithm that is capable of finding the joint optimal beamformers for all base stations, based on a correlation-based channel model, the full-correlation model. Among all correlated channel models, the correlated channel model used in this study is the most accurate, giving the best performance in terms of power consumption. The environment in this study is chosen to be a Non-Line-of-Sight (NLOS) condition, where a signal from a wireless transmitter passes several obstructions before arriving at a wireless receiver. Moreover, there are many scatterers local to the mobile, and multiple reflections can occur among them before the energy arrives at the mobile. The proposed algorithm is based on uplink-downlink duality using Lagrangian duality theory.
Time-Division Duplex (TDD) is chosen as the platform for this study since it has been adopted in the latest technologies in Fourth
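The full duality-based beamforming algorithm is beyond a few lines, but its inner power-control fixed point can be illustrated with a scalar Foschini-Miljanic-style iteration: each user's power is repeatedly set to exactly meet its SINR target given everyone else's interference, converging to the minimum-power feasible allocation. The gain matrix below is made up for illustration.

```python
def power_control(gains, noise, sinr_targets, iters=500):
    """Scalar fixed-point power allocation: p_k is set so user k exactly meets
    its SINR target given the current interference from the other users.
    gains[k][j] is the (illustrative) channel gain from transmitter j to user k.
    """
    n = len(gains)
    p = [1.0] * n
    for _ in range(iters):
        p = [sinr_targets[k]
             * (noise + sum(gains[k][j] * p[j] for j in range(n) if j != k))
             / gains[k][k]
             for k in range(n)]
    return p
```

When the targets are feasible this iteration is a contraction and converges to the unique minimum-power solution; the beamforming problem in the record above wraps a step of this kind inside the uplink-downlink duality loop.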

  8. Computationally efficient optimisation algorithms for WECs arrays

    DEFF Research Database (Denmark)

    Ferri, Francesco

    2017-01-01

    In this paper two derivative-free global optimization algorithms are applied for the maximisation of the energy absorbed by wave energy converter (WEC) arrays. Wave energy is a large and mostly untapped source of energy that could have a key role in the future energy mix. The collection of this r...

  9. Mutual information-based LPI optimisation for radar network

    Science.gov (United States)

    Shi, Chenguang; Zhou, Jianjiang; Wang, Fei; Chen, Jun

    2015-07-01

    Radar networks can offer significant performance improvements for target detection and information extraction by employing spatial diversity. For a fixed number of radars, the achievable mutual information (MI) for estimating the target parameters may extend beyond a predefined threshold with full power transmission. In this paper, an effective low probability of intercept (LPI) optimisation algorithm is presented to improve the LPI performance of a radar network. Based on the radar network system model, we first adopt the Schleher intercept factor for the radar network as an optimisation metric for LPI performance. Then, a novel LPI optimisation algorithm is presented where, for a predefined MI threshold, the Schleher intercept factor for the radar network is minimised by optimising the transmission power allocation among the radars in the network, such that enhanced LPI performance can be achieved. A genetic algorithm based on nonlinear programming (GA-NP) is employed to solve the resulting nonconvex and nonlinear optimisation problem. Simulations demonstrate that the proposed algorithm is effective in improving the LPI performance of a radar network.

  10. Robustness analysis of bogie suspension components Pareto optimised values

    Science.gov (United States)

    Mousavi Bideleh, Seyed Milad

    2017-08-01

    The bogie suspension system of high speed trains can significantly affect vehicle performance. Multiobjective optimisation problems are often formulated and solved to find the Pareto optimised values of the suspension components and improve cost efficiency in railway operations from different perspectives. Uncertainties in the design parameters of the suspension system can negatively influence the dynamic behaviour of railway vehicles. In this regard, robustness analysis of the bogie dynamic response with respect to uncertainties in the suspension design parameters is considered. A one-car railway vehicle model with 50 degrees of freedom and wear/comfort Pareto optimised values of the bogie suspension components is chosen for the analysis. Longitudinal and lateral primary stiffnesses, longitudinal and vertical secondary stiffnesses, as well as yaw damping are considered as the five design parameters. The effects of parameter uncertainties on wear, ride comfort, track shift force, stability, and risk of derailment are studied by varying the design parameters around their respective Pareto optimised values according to a lognormal distribution with different coefficients of variation (COVs). The robustness analysis is carried out based on the maximum entropy concept. The multiplicative dimensional reduction method is utilised to simplify the calculation of fractional moments and improve the computational efficiency. The results showed that the dynamic response of the vehicle with wear/comfort Pareto optimised values of the bogie suspension is robust against uncertainties in the design parameters, and the probability of failure is small for parameter uncertainties with COV up to 0.1.
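The lognormal perturbation scheme described above can be sketched directly: for a nominal parameter value and a target coefficient of variation, the underlying normal parameters follow from the standard lognormal moment relations, so that the sample mean equals the nominal value. Function and variable names here are illustrative.

```python
import math
import random

def lognormal_around(nominal, cov, rng):
    """Draw a sample around `nominal` with the given coefficient of variation,
    using the standard lognormal parameterisation:
        sigma^2 = ln(1 + COV^2),   mu = ln(nominal) - sigma^2 / 2,
    which makes the mean of the samples equal to the nominal value."""
    sigma2 = math.log(1.0 + cov * cov)
    mu = math.log(nominal) - sigma2 / 2.0
    return rng.lognormvariate(mu, math.sqrt(sigma2))
```

A robustness study of the kind in the record then evaluates the vehicle response for many such draws of each suspension parameter and inspects the spread of the resulting performance indices.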

  11. A conceptual optimisation strategy for radiography in a digital environment

    International Nuclear Information System (INIS)

    Baath, M.; Haakansson, M.; Hansson, J.; Maansson, L. G.

    2005-01-01

    Using a completely digital environment for the entire imaging process leads to new possibilities for optimisation of radiography since many restrictions of screen/film systems, such as the small dynamic range and the lack of possibilities for image processing, do not apply any longer. However, at the same time these new possibilities lead to a more complicated optimisation process, since more freedom is given to alter parameters. This paper focuses on describing an optimisation strategy that concentrates on taking advantage of the conceptual differences between digital systems and screen/film systems. The strategy can be summarised as: (a) always include the anatomical background during the optimisation, (b) perform all comparisons at a constant effective dose and (c) separate the image display stage from the image collection stage. A three-step process is proposed where the optimal setting of the technique parameters is determined at first, followed by an optimisation of the image processing. In the final step the optimal dose level - given the optimal settings of the image collection and image display stages - is determined. (authors)

  12. Establishing Local Reference Dose Values and Optimisation Strategies

    International Nuclear Information System (INIS)

    Connolly, P.; Moores, B.M.

    2000-01-01

    The revised EC Patient Directive 97/43 EURATOM introduces the concepts of clinical audit, diagnostic reference levels and optimisation of radiation protection in diagnostic radiology. The application of reference dose levels in practice involves the establishment of reference dose values as actual measurable operational quantities. These values should then form part of an ongoing optimisation and audit programme against which routine performance can be compared. The CEC Quality Criteria for Radiographic Images provides guidance reference dose values against which local performance can be compared; in many cases these values can be improved upon quite considerably. This paper presents the results of a local initiative in the North West of the UK aimed at establishing local reference dose values for a number of major hospital sites. The purpose of this initiative is to establish a foundation for both optimisation strategies and clinical audit as an ongoing and routine practice. The paper presents results from an ongoing trial involving patient dose measurements for several radiological examinations at these sites, together with an attempt to establish local reference dose values from the measured dose values and to employ them in optimisation strategies. In particular, emphasis is placed on the routine quality control programmes necessary to underpin this strategy, including the effective data management of results from such programmes and how they can be applied to optimisation practice. (author)
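Reference dose values of the kind discussed here are conventionally set at the third quartile (75th percentile) of the measured patient-dose distribution, so that roughly a quarter of examinations or rooms fall above the reference and become optimisation candidates. A minimal sketch (linear interpolation between order statistics; not the paper's own procedure):

```python
def third_quartile(doses):
    """Third quartile (75th percentile) of a list of measured doses,
    a conventional choice for a local diagnostic reference level."""
    xs = sorted(doses)
    pos = 0.75 * (len(xs) - 1)     # fractional position in the sorted sample
    lo = int(pos)
    frac = pos - lo
    if lo + 1 >= len(xs):
        return xs[lo]
    return xs[lo] * (1.0 - frac) + xs[lo + 1] * frac
```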

  13. HERMES: towards an integrated toolbox to characterize functional and effective brain connectivity.

    Science.gov (United States)

    Niso, Guiomar; Bruña, Ricardo; Pereda, Ernesto; Gutiérrez, Ricardo; Bajo, Ricardo; Maestú, Fernando; del-Pozo, Francisco

    2013-10-01

    The analysis of the interdependence between time series has become an important field of research in recent years, mainly as a result of advances in the characterization of dynamical systems from the signals they produce, the introduction of concepts such as generalized and phase synchronization, and the application of information theory to time series analysis. In neurophysiology, different analytical tools stemming from these concepts have added to the 'traditional' set of linear methods, which includes the cross-correlation and the coherency function in the time and frequency domains, respectively, as well as more elaborate tools such as Granger causality. This increase in the number of approaches to tackle the existence of functional (FC) or effective connectivity (EC) between two (or among many) neural networks, along with the mathematical complexity of the corresponding time series analysis tools, makes it desirable to arrange them into a unified, easy-to-use software package. The goal is to allow neuroscientists, neurophysiologists and researchers from related fields to easily access and make use of these analysis methods from a single integrated toolbox. Here we present HERMES (http://hermes.ctb.upm.es), a toolbox for the Matlab® environment (The MathWorks, Inc.), which is designed to study functional and effective brain connectivity from neurophysiological data such as multivariate EEG and/or MEG records. It also includes visualization tools and statistical methods to address the problem of multiple comparisons. We believe that this toolbox will be very helpful to all the researchers working in the emerging field of brain connectivity analysis.
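As a concrete example of the phase-synchronisation indices a toolbox of this kind offers, the Phase Locking Value (PLV) can be computed in one line from two instantaneous-phase series (in practice the phases would come from a Hilbert or wavelet transform; here they are given directly):

```python
import cmath

def phase_locking_value(phases_x, phases_y):
    """Phase Locking Value: |mean over time of exp(i * (phi_x - phi_y))|.
    1 means a perfectly constant phase relation; values near 0 mean no
    consistent phase relation between the two signals."""
    n = len(phases_x)
    s = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_x, phases_y))
    return abs(s) / n
```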

  14. MagPy: A Python toolbox for controlling Magstim transcranial magnetic stimulators.

    Science.gov (United States)

    McNair, Nicolas A

    2017-01-30

    To date, transcranial magnetic stimulation (TMS) studies manipulating stimulation parameters have largely used blocked paradigms. However, altering these parameters on a trial-by-trial basis in Magstim stimulators is complicated by the need to send regular (1 Hz) commands to the stimulator. Additionally, effecting such control interferes with the ability to send TMS pulses or simultaneously present stimuli with high temporal precision. This manuscript presents the MagPy toolbox, a Python software package that provides full control over Magstim stimulators via the serial port. It is able to maintain this control with no impact on concurrent processing, such as stimulus delivery. In addition, a specially designed "QuickFire" serial cable is specified that allows MagPy to trigger TMS pulses with very low latency. In a series of experimental simulations, MagPy was able to maintain uninterrupted remote control over the connected Magstim stimulator across all testing sessions. In addition, having MagPy enabled had no effect on stimulus timing: all stimuli were presented for precisely the duration specified. Finally, using the QuickFire cable, MagPy was able to elicit TMS pulses with sub-millisecond latencies. The MagPy toolbox allows for experiments that require manipulating stimulation parameters from trial to trial. Furthermore, it can achieve this in contexts that require tight control over timing, such as those seeking to combine TMS with fMRI or EEG. Together, the MagPy toolbox and QuickFire serial cable provide an effective means for controlling Magstim stimulators during experiments while ensuring high-precision timing. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Introduction to TAFI - A Matlab® toolbox for analysis of flexural isostasy

    Science.gov (United States)

    Jha, S.; Harry, D. L.; Schutt, D.

    2016-12-01

    The isostatic response to vertical tectonic loads emplaced on a thin elastic plate overlying an inviscid substrate, and the corresponding gravity anomalies, are commonly modeled using well-established theories and methodologies of flexural analysis. However, such analysis requires some mathematical and coding expertise on the part of users. With that in mind, we designed a new interactive Matlab® toolbox called the Toolbox for Analysis of Flexural Isostasy (TAFI). TAFI allows users to create forward models (2-D and 3-D) of flexural deformation of the lithosphere and the resulting gravity anomaly. TAFI computes Green's functions for the flexure of an elastic plate subjected to point or line loads, and analytical solutions for harmonic loads. Flexure due to non-impulsive, distributed 2-D or 3-D loads is computed by convolving the appropriate Green's function with a user-supplied, spatially discretized load function. The gravity anomaly associated with each density interface is calculated by taking the Fourier transform of the flexural deflection of these interfaces and estimating the gravity in the wavenumber domain. All models created in TAFI are based on Matlab's intrinsic functions and do not require any specialized toolbox, function or library except those distributed with TAFI. Modeling functions within TAFI can be called from the Matlab workspace, from within user-written programs, or from TAFI's graphical user interface (GUI). The GUI enables the user to model the flexural deflection of the lithosphere interactively, enabling real-time comparison of the model fit with observed data constraining the flexural deformation and gravity, facilitating a rapid search for the best-fitting flexural model. TAFI is a very useful teaching and research tool and has been tested rigorously in graduate-level teaching and basic research environments.
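The analytical harmonic-load solution mentioned above is compact enough to sketch: a sinusoidal surface load of amplitude Q0 and wavenumber k deflects a thin elastic plate by W = Q0 / (D k^4 + drho * g), where D is the flexural rigidity. The default parameter values below (density contrast, Young's modulus, Poisson's ratio) are illustrative, not TAFI's.

```python
import math

def flexure_harmonic_amplitude(q0, wavelength, Te,
                               drho=600.0, E=1e11, nu=0.25, g=9.81):
    """Thin-elastic-plate deflection amplitude under a harmonic load:
        W = Q0 / (D k^4 + drho * g),  D = E Te^3 / (12 (1 - nu^2)),
    with q0 the load amplitude (N/m^2), Te the effective elastic thickness (m),
    and drho the density contrast across the compensating interface (kg/m^3)."""
    D = E * Te ** 3 / (12.0 * (1.0 - nu ** 2))   # flexural rigidity (N m)
    k = 2.0 * math.pi / wavelength                # wavenumber (1/m)
    return q0 / (D * k ** 4 + drho * g)
```

At long wavelengths the D k^4 term vanishes and the formula reduces to the local (Airy) isostatic limit q0 / (drho * g), which is a handy sanity check on any flexure code.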

  16. PETPVC: a toolbox for performing partial volume correction techniques in positron emission tomography

    Science.gov (United States)

    Thomas, Benjamin A.; Cuplov, Vesna; Bousse, Alexandre; Mendes, Adriana; Thielemans, Kris; Hutton, Brian F.; Erlandsson, Kjell

    2016-11-01

    Positron emission tomography (PET) images are degraded by a phenomenon known as the partial volume effect (PVE). Approaches have been developed to reduce PVEs, typically through the utilisation of structural information provided by other imaging modalities such as MRI or CT. These methods, known as partial volume correction (PVC) techniques, reduce PVEs by compensating for the effects of the scanner resolution, thereby improving the quantitative accuracy. The PETPVC toolbox described in this paper comprises a suite of methods, both classic and more recent approaches, for the purposes of applying PVC to PET data. Eight core PVC techniques are available. These core methods can be combined to create a total of 22 different PVC techniques. Simulated brain PET data are used to demonstrate the utility of the toolbox in idealised conditions, the effects of applying PVC with mismatched point-spread function (PSF) estimates, and the potential of novel hybrid PVC methods to improve the quantification of lesions. All anatomy-based PVC techniques achieve complete recovery of the PET signal in cortical grey matter (GM) when performed in idealised conditions. Applying deconvolution-based approaches results in incomplete recovery due to premature termination of the iterative process. PVC techniques are sensitive to PSF mismatch, causing a bias of up to 16.7% in GM recovery when over-estimating the PSF by 3 mm. The recovery of both GM and a simulated lesion was improved by combining two PVC techniques. The PETPVC toolbox has been written in C++, supports Windows, Mac and Linux operating systems, is open-source and publicly available.
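The deconvolution-based family of PVC methods mentioned above can be illustrated with a 1-D Van Cittert iteration, one of the classic approaches of the kind PETPVC implements: the estimate is repeatedly corrected by the residual between the observed data and the re-blurred estimate. This is a toy 1-D sketch, not PETPVC's implementation.

```python
def convolve(signal, kernel):
    """Direct 1-D convolution with a centred kernel (zero-padded edges)."""
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(signal):
                acc += w * signal[idx]
        out.append(acc)
    return out

def van_cittert(observed, kernel, iterations=30, alpha=1.0):
    """Van Cittert iterative deconvolution:
        f_{n+1} = f_n + alpha * (observed - PSF * f_n),
    which progressively restores resolution lost to the PSF blur."""
    estimate = list(observed)
    for _ in range(iterations):
        reblurred = convolve(estimate, kernel)
        estimate = [e + alpha * (o - r)
                    for e, o, r in zip(estimate, observed, reblurred)]
    return estimate
```

Run on a blurred rectangle ("hot region") profile, the iteration pushes the smeared edge values back toward the true plateau, which is exactly the partial-volume recovery effect described in the record.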

  17. FACET - a "Flexible Artifact Correction and Evaluation Toolbox" for concurrently recorded EEG/fMRI data.

    Science.gov (United States)

    Glaser, Johann; Beisteiner, Roland; Bauer, Herbert; Fischmeister, Florian Ph S

    2013-11-09

    In concurrent EEG/fMRI recordings, EEG data are impaired by fMRI gradient artifacts which exceed the EEG signal by several orders of magnitude. While several algorithms exist to correct the EEG data, these algorithms lack the flexibility to either leave out or add new steps. The open-source MATLAB toolbox FACET presented here is a modular toolbox for the fast and flexible correction and evaluation of imaging artifacts from concurrently recorded EEG datasets. It consists of an Analysis, a Correction and an Evaluation framework, allowing the user to choose from different artifact correction methods with various pre- and post-processing steps to form flexible combinations. The quality of the chosen correction approach can then be evaluated and compared to different settings. FACET was evaluated on a dataset provided with the FMRIB plugin for EEGLAB using two different correction approaches: Averaged Artifact Subtraction (AAS, Allen et al., NeuroImage 12(2):230-239, 2000) and FMRI Artifact Slice Template Removal (FASTR, Niazy et al., NeuroImage 28(3):720-737, 2005). The obtained results were compared to the FASTR algorithm implemented in the EEGLAB plugin FMRIB. No differences were found between the FACET implementation of FASTR and the original algorithm across all gradient-artifact-relevant performance indices. The FACET toolbox not only provides facilities for all three modalities (data analysis, artifact correction, and evaluation and documentation of the results) but also offers an easily extendable framework for the development and evaluation of new approaches.
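The AAS method cited in this record has a very simple core: because the gradient artifact repeats almost identically for every slice or volume, averaging the artifact epochs yields a template that can be subtracted from each occurrence. This sketch shows only that core; real implementations such as FACET's add epoch alignment, weighted averaging and filtering steps omitted here.

```python
def averaged_artifact_subtraction(eeg, onsets, epoch_len):
    """Averaged Artifact Subtraction (AAS, after Allen et al. 2000):
    average the artifact epochs starting at `onsets`, then subtract the
    resulting template from each epoch of the recording."""
    template = [0.0] * epoch_len
    for onset in onsets:
        for i in range(epoch_len):
            template[i] += eeg[onset + i]
    template = [t / len(onsets) for t in template]

    corrected = list(eeg)
    for onset in onsets:
        for i in range(epoch_len):
            corrected[onset + i] -= template[i]
    return corrected
```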

  18. Using a Toolbox of Tailored Educational Lessons to Improve Fruit, Vegetable, and Physical Activity Behaviors among African American Women in California

    Science.gov (United States)

    Backman, Desiree; Scruggs, Valarie; Atiedu, Akpene Ama; Bowie, Shene; Bye, Larry; Dennis, Angela; Hall, Melanie; Ossa, Alexandra; Wertlieb, Stacy; Foerster, Susan B.

    2011-01-01

    Objective: Evaluate the effectiveness of the "Fruit, Vegetable, and Physical Activity Toolbox for Community Educators" ("Toolbox"), an intervention originally designed for Spanish- and English-speaking audiences, in changing knowledge, attitudes, and behavior among low-income African American women. Design: Quasi-experimental…

  19. CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Smith, III, F. G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-07-29

    One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface, using the GoldSim software, to the STADIUM® code developed by SIMCO Technologies, Inc. and LeachXS/ORCHESTRA developed by the Energy Research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM® code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM® code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results, the code developers having provided validation test results as part of their code QA documentation; and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. Therefore, it is concluded that

  20. Process Evaluation of a Toolbox-training Program for Construction Foremen in Denmark

    DEFF Research Database (Denmark)

    Jeschke, Katharina Christiane; Kines, Pete; Rasmussen, Liselotte

    2017-01-01

    Daily dialogue between leaders and workers on traditional construction sites is primarily focused on production, quality and time issues, and rarely involves occupational safety and health (OSH) issues. A leadership training program entitled 'Toolbox-training' was developed to improve construction ... for the majority of the foremen, who experienced positive changes in their daily work methods and interactions with their crews, colleagues, leaders, customers and other construction professions. The program is a unique contribution to leadership training in the construction industry, and can potentially ...

  1. ACCEPT: Introduction of the Adverse Condition and Critical Event Prediction Toolbox

    Science.gov (United States)

    Martin, Rodney A.; Santanu, Das; Janakiraman, Vijay Manikandan; Hosein, Stefan

    2015-01-01

    The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we introduce a generic framework developed in MATLAB® called ACCEPT (Adverse Condition and Critical Event Prediction Toolbox). ACCEPT is an architectural framework designed to compare and contrast the performance of a variety of machine learning and early-warning algorithms, and tests the capability of these algorithms to robustly predict the onset of adverse events in any time-series data-generating system or process.

  2. GPELab, a Matlab toolbox to solve Gross-Pitaevskii equations II: Dynamics and stochastic simulations

    Science.gov (United States)

    Antoine, Xavier; Duboscq, Romain

    2015-08-01

    GPELab is a free Matlab toolbox for modeling and numerically solving large classes of systems of Gross-Pitaevskii equations that arise in the physics of Bose-Einstein condensates. The aim of this second paper, which follows (Antoine and Duboscq, 2014), is to first present the various pseudospectral schemes available in GPELab for computing the deterministic and stochastic nonlinear dynamics of Gross-Pitaevskii equations (Antoine, et al., 2013). Next, the corresponding GPELab functions are explained in detail. Finally, some numerical examples are provided to show how the code works for the complex dynamics of BEC problems.
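The pseudospectral time stepping that GPELab implements can be illustrated with a toy 1-D split-step (Strang) scheme for the Gross-Pitaevskii equation i dpsi/dt = -(1/2) d2psi/dx2 + g |psi|^2 psi. This is a simplification of what GPELab solves (no trap, hbar = m = 1), and the naive O(N^2) DFT below merely stands in for an FFT.

```python
import cmath
import math

def dft(x, inverse=False):
    """Naive O(N^2) discrete Fourier transform (stand-in for an FFT)."""
    n = len(x)
    sign = 1j if inverse else -1j
    out = [sum(x[m] * cmath.exp(sign * 2 * math.pi * k * m / n)
               for m in range(n)) for k in range(n)]
    return [v / n for v in out] if inverse else out

def gpe_split_step(psi, dt, dx, g, steps):
    """Strang split-step for the 1-D Gross-Pitaevskii equation: half a step of
    the nonlinear phase, a full kinetic step in Fourier space, then the second
    nonlinear half step. Every factor has unit modulus, so the wavefunction
    norm is conserved."""
    n = len(psi)
    k = [2 * math.pi * (i if i < n // 2 else i - n) / (n * dx) for i in range(n)]
    for _ in range(steps):
        psi = [p * cmath.exp(-1j * g * abs(p) ** 2 * dt / 2) for p in psi]
        ph = dft(psi)
        ph = [v * cmath.exp(-1j * 0.5 * ki ** 2 * dt) for v, ki in zip(ph, k)]
        psi = dft(ph, inverse=True)
        psi = [p * cmath.exp(-1j * g * abs(p) ** 2 * dt / 2) for p in psi]
    return psi
```

Norm conservation is the standard correctness check for schemes of this kind, since both the nonlinear phase rotation and the kinetic propagator are unitary.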

  3. OSSIM wave-optics toolbox and its use to simulate AEOS

    Science.gov (United States)

    Smith, Carey A.; Forgham, James L.; Jones, Bruce W.; Jones, Kenneth D.

    2001-12-01

    OSSim (Optical System Simulation) is a simulation toolbox of optical and processing components. By using full wave-optics in the time-domain, OSSim simulates diffractive effects and control loop interactions missed by simpler analyses. OSSim also models the atmosphere, with user customizable turbulence strength, wind, and slew. This paper first presents 2 introductory examples: a simple 2-lens imaging system and a simple tilt-control system. Then it presents a simulation of the 3.67-meter AEOS (Advanced Electro-Optics System) telescope on Maui. The OSSim simulation agrees well with the AEOS experimental results.
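The tilt-control example mentioned above reduces, in its simplest form, to a discrete integrator loop: each frame the sensor measures the residual tilt, and the mirror command is updated by that residual scaled by the loop gain. This toy loop (names and gain are illustrative, and it ignores the wave-optics propagation OSSim actually models) shows the closed-loop rejection of a static disturbance.

```python
def tilt_control_loop(disturbance, gain=0.5):
    """Toy integrator tip/tilt control loop: the sensor sees the disturbance
    minus the current mirror correction, and the command integrates the
    measured residual with the given loop gain. For a constant disturbance
    the residual decays geometrically as (1 - gain)^t."""
    command = 0.0
    residuals = []
    for d in disturbance:
        residual = d - command       # sensor measurement this frame
        residuals.append(residual)
        command += gain * residual   # integrator update of the mirror command
    return residuals
```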

  4. Recent advances of molecular toolbox construction expand Pichia pastoris in synthetic biology applications.

    Science.gov (United States)

    Kang, Zhen; Huang, Hao; Zhang, Yunfeng; Du, Guocheng; Chen, Jian

    2017-01-01

    Pichia pastoris (reclassified as Komagataella phaffii), a methylotrophic yeast, has been widely used for heterologous protein production because of its unique advantages, such as readily achievable high-density fermentation, tractable genetic modification and typical eukaryotic post-translational modifications. More recently, P. pastoris has also gained much attention as a metabolic pathway engineering platform. In this mini-review, we address recent advances in the molecular toolboxes established for P. pastoris, including synthetic promoters, signal peptides and genome engineering tools. Furthermore, the applications of P. pastoris in synthetic biology are discussed and future prospects outlined, especially in the context of genome-scale metabolic pathway analysis.

  5. Cognitive functioning in socially anxious adults: Insights from the NIH Toolbox Cognition Battery

    Directory of Open Access Journals (Sweden)

    Sonya Violet Troller-Renfree

    2015-06-01

    Full Text Available Theory suggests that individuals with social anxiety manifest unique patterns of cognition with less efficient fluid cognition and unperturbed crystallized cognition; however, empirical support for these ideas remains inconclusive. The heterogeneity of past findings may reflect unreliability in cognitive assessments or the influence of confounding variables. The present study examined the relations among social anxiety and performance on the reliable, newly established NIH Toolbox Cognition Battery. Results indicate that high socially anxious adults performed as well as low anxious participants on all measures of fluid cognition. However, highly socially anxious adults demonstrated enhanced crystallized cognitive abilities relative to a low socially anxious comparison group.

  6. Data for TROTS – The Radiotherapy Optimisation Test Set

    Directory of Open Access Journals (Sweden)

    Sebastiaan Breedveld

    2017-06-01

    Full Text Available The Radiotherapy Optimisation Test Set (TROTS) is an extensive set of problems originating from radiotherapy (radiation therapy) treatment planning. This dataset was created for two purposes: (1) to supply a large-scale dense dataset to measure the performance and quality of mathematical solvers, and (2) to supply a dataset to investigate the multi-criteria optimisation and decision-making nature of the radiotherapy problem. The dataset contains 120 problems (patients), divided over 6 different treatment protocols/tumour types. Each problem contains numerical data, a configuration for the optimisation problem, and the data required to visualise and interpret the results. The data is stored as HDF5-compatible Matlab files, and includes scripts to work with the dataset.

  7. CLIC crab cavity design optimisation for maximum luminosity

    Energy Technology Data Exchange (ETDEWEB)

    Dexter, A.C., E-mail: a.dexter@lancaster.ac.uk [Lancaster University, Lancaster, LA1 4YR (United Kingdom); Cockcroft Institute, Daresbury, Warrington, WA4 4AD (United Kingdom); Burt, G.; Ambattu, P.K. [Lancaster University, Lancaster, LA1 4YR (United Kingdom); Cockcroft Institute, Daresbury, Warrington, WA4 4AD (United Kingdom); Dolgashev, V. [SLAC, Menlo Park, CA 94025 (United States); Jones, R. [University of Manchester, Manchester, M13 9PL (United Kingdom)

    2011-11-21

    The bunch size and crossing angle planned for CERN's compact linear collider CLIC dictate that crab cavities on opposing linacs will be needed to rotate bunches of particles into alignment at the interaction point if the desired luminosity is to be achieved. Wakefield effects, RF phase errors between crab cavities on opposing linacs and unpredictable beam loading can each act to reduce luminosity below that anticipated for bunches colliding in perfect alignment. Unlike acceleration cavities, which are normally optimised for gradient, crab cavities must be optimised primarily for luminosity. Accepting the crab cavity technology choice of a 12 GHz, normal conducting, travelling wave structure as explained in the text, this paper develops an analytical approach to optimise cell number and iris diameter.

  8. The principle of optimisation: reasons for success and legal criticism

    International Nuclear Information System (INIS)

    Fernandez Regalado, Luis

    2008-01-01

    The International Commission on Radiological Protection (ICRP) adopted new recommendations in 2007. In broad outline they fundamentally continue the recommendations approved in 1990 and later. The principle of optimisation of protection, together with the principles of justification and dose limits, continues to play a key role in the ICRP recommendations, as it has for many years. This principle, somewhat reinforced in the 2007 ICRP recommendations, has been incorporated into norms and legislation which have been peacefully in force in many countries all over the world. There are three main reasons for the success of the principle of optimisation in radiological protection. First, the subjectivity of the phrase that embodies the principle, 'as low as reasonably achievable' (ALARA), which allows different valid interpretations under different circumstances. Second, the pragmatism and adaptability of ALARA to all exposure situations. And third, the scientific humility behind the principle of optimisation, in clear contrast with the old-fashioned scientific positivism that enshrined scientists' opinions. Nevertheless, from a legal point of view, some criticism has been cast on the principle of optimisation in radiological protection where it has been transformed into a compulsory norm. This criticism is based on two arguments: the lack of democratic participation in the process of elaborating the norm, and the legal uncertainty associated with its application. Both arguments are to some extent known by the ICRP which, on the one hand, has broadened the participation of experts, associations and the professional radiological protection community, increasing the transparency of how decisions on recommendations have been taken, and, on the other hand, has warned about the need for authorities to specify general criteria to develop the principle of optimisation in national

  9. Optimisation in X-ray and Molecular Imaging 2015

    International Nuclear Information System (INIS)

    Baath, Magnus; Hoeschen, Christoph; Mattsson, Soeren; Mansson, Lars Gunnar

    2016-01-01

    This issue of Radiation Protection Dosimetry is based on contributions to Optimisation in X-ray and Molecular Imaging 2015 - the 4th Malmoe Conference on Medical Imaging (OXMI 2015). The conference was jointly organised by members of former and current research projects supported by the European Commission EURATOM Radiation Protection Research Programme, in cooperation with the Swedish Society for Radiation Physics. It brought together over 150 researchers and other professionals from hospitals, universities and industry with interests in different aspects of the optimisation of medical imaging. More than 100 presentations were given at this international gathering of medical physicists, radiologists, engineers, technicians, nurses and educational researchers. Additionally, invited talks were offered by world-renowned experts on radiation protection, spectral imaging and medical image perception, covering several important aspects of the generation and interpretation of medical images. The conference consisted of 13 oral sessions and a poster session, the contributions being connected by their focus on the optimisation of the use of ionising radiation in medical imaging. The conference included technology-specific topics such as computed tomography and tomosynthesis, but also generic issues of interest for the optimisation of all medical imaging, such as image perception and quality assurance. Radiation protection was covered by, for example, sessions on patient dose benchmarking and occupational exposure. Technically advanced topics such as modelling, Monte Carlo simulation, reconstruction, classification and segmentation were seen taking advantage of recent developments in hardware and software, showing that the optimisation community is at the forefront of technology and adapts well to new requirements. These peer-reviewed proceedings represent a continuation of a series of selected reports from meetings in the field of medical imaging

  10. Numerical optimisation of friction stir welding: review of future challenges

    DEFF Research Database (Denmark)

    Tutum, Cem Celal; Hattel, Jesper Henri

    2011-01-01

    During the last decade, the combination of increasingly advanced numerical simulation software with high computational power has resulted in models for friction stir welding (FSW) which have substantially improved the understanding of the determining physical phenomena behind the process. This has made optimisation of certain process parameters possible and has in turn led to better performing friction stir welded products, thus contributing to a general increase in the popularity of the process and its applications. However, most of these optimisation studies do not go well beyond manual…

  11. Alternatives for optimisation of rumen fermentation in ruminants

    Directory of Open Access Journals (Sweden)

    T. Slavov

    2017-06-01

    Proper knowledge of the variety of events occurring in the rumen makes it possible to optimise them with respect to complete feed conversion and increased productive performance of ruminants. The inclusion of various dietary additives (supplements, biologically active substances, nutritional antibiotics, probiotics, enzymatic preparations, plant extracts, etc.) has an effect on the intensity and specific pathway of fermentation, and thus on overall digestion and systemic metabolism. The optimisation of rumen digestion is a method with substantial potential for improving the efficiency of ruminant husbandry, increasing the quality of ruminant produce and maintaining health.

  12. Separative power of an optimised concurrent gas centrifuge

    Energy Technology Data Exchange (ETDEWEB)

    Bogovalov, Sergey; Boman, Vladimir [National Research Nuclear University (MEPHI), Moscow (Russian Federation)

    2016-06-15

    The problem of separation of isotopes in a concurrent gas centrifuge is solved analytically for an arbitrary binary mixture of isotopes. The separative power of optimised concurrent gas centrifuges for the uranium isotopes equals δU = 12.7 (V/700 m/s)²(300 K/T)(L/1 m) kg·SWU/yr, where L and V are the length and linear velocity of the rotor of the gas centrifuge and T is the temperature. This equation agrees well with the empirically determined separative power of optimised counter-current gas centrifuges.
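    The quoted scaling law can be evaluated directly. The following sketch (the function name is hypothetical, the constant and reference values are taken from the abstract) reproduces the stated figure at the reference rotor speed, temperature and length:

```python
def separative_power(V, T, L):
    """Separative power (kg SWU/yr) from the scaling law quoted above.

    V: rotor peripheral speed in m/s, T: gas temperature in K,
    L: rotor length in m. The constant 12.7 and the reference values
    (700 m/s, 300 K, 1 m) are taken from the abstract.
    """
    return 12.7 * (V / 700.0) ** 2 * (300.0 / T) * L

# At the reference point the law returns the quoted 12.7 kg SWU/yr.
print(separative_power(700, 300, 1.0))  # → 12.7
```

    Note the quadratic dependence on rotor speed: halving V at fixed T and L cuts the separative power by a factor of four.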

  13. MANAGEMENT OPTIMISATION OF MASS CUSTOMISATION MANUFACTURING USING COMPUTATIONAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    Louwrens Butler

    2018-05-01

    Computational intelligence paradigms can be used for advanced manufacturing system optimisation. A static simulation model of an advanced manufacturing system was developed, the purpose of which was to mass-produce a customisable product range at a competitive cost. The aim of this study was to determine whether the proposed algorithm could outperform traditional optimisation methods. The algorithm produced a lower-cost plan than a simulated annealing algorithm, and had a lower impact on the workforce.

  14. Optimising a shaft's geometry by applying genetic algorithms

    Directory of Open Access Journals (Sweden)

    María Alejandra Guzmán

    2005-05-01

    Many engineering design tasks involve optimising several conflicting goals; these types of problem are known as Multiobjective Optimisation Problems (MOPs). Evolutionary techniques have proved to be an effective tool for finding solutions to MOPs during the last decade; in particular, variations on the basic genetic algorithm have been proposed by different researchers for rapidly finding optimal solutions to MOPs. In this paper the NSGA (Non-dominated Sorting Genetic Algorithm) has been implemented to find an optimal design for a shaft subjected to cyclic loads, the conflicting goals being minimum weight and minimum lateral deflection.
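    The non-dominated sorting at the heart of NSGA can be sketched in a few lines. This is a minimal illustration on invented (weight, deflection) pairs, not the paper's shaft model:

```python
def dominates(a, b):
    """True if design a dominates b: no worse in every objective, better in one.

    Objectives are to be minimised (e.g. weight and lateral deflection).
    """
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(pop):
    """Split a population of objective vectors into successive Pareto fronts."""
    fronts, remaining = [], list(pop)
    while remaining:
        # The current front holds every point not dominated by any other survivor.
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q is not p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Hypothetical (weight, deflection) pairs for four candidate shafts.
designs = [(2.0, 0.8), (3.0, 0.5), (2.5, 0.9), (3.5, 0.6)]
print(non_dominated_sort(designs))
```

    The first front returned contains the trade-off designs no other design beats in both objectives; NSGA ranks individuals by front membership when selecting parents.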

  15. The optimisation study of tbp synthesis process by phosphoric acid

    International Nuclear Information System (INIS)

    Amedjkouh, A.; Attou, M.; Azzouz, A.; Zaoui, B.

    1995-07-01

    The present work deals with the optimisation of a TBP synthesis process using phosphoric acid. This route is more advantageous than using POCl3 or P2O5 as phosphating agents, as the latter are toxic and dangerous for the environment. The optimisation study is based on a series of 16 experiments taking into account the range of variation of the following parameters: temperature, pressure, reagent mole ratio, and promoter content. The yield calculation is based on the randomisation of an equation including all parameters. The resolution of this equation gave a 30% TBP molar ratio. This value is in agreement with the experimental data.

  16. Optimisation of BPMN Business Models via Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for the optimisation of business processes modelled in the business process modelling language BPMN, which builds upon earlier work where we developed a model checking based method for the analysis of BPMN models. We define a structure for expressing optimisation goals for synthesized BPMN components, based on probabilistic computation tree logic and real-valued reward structures of the BPMN model, allowing for the specification of complex quantitative goals. We here present a simple algorithm, inspired by concepts from evolutionary algorithms, which iteratively generates…

  17. The optimisation of wedge filters in radiotherapy of the prostate

    International Nuclear Information System (INIS)

    Oldham, Mark; Neal, Anthony J.; Webb, Steve

    1995-01-01

    A treatment plan optimisation algorithm has been applied to 12 patients with early prostate cancer in order to determine the optimum beam-weights and wedge angles for a standard conformal three-field treatment technique. The optimisation algorithm was based on fast-simulated-annealing using a cost function designed to achieve a uniform dose in the planning-target-volume (PTV) and to minimise the integral doses to the organs-at-risk. The algorithm has been applied to standard conformal three-field plans created by an experienced human planner, and run in three PLAN MODES: (1) where the wedge angles were fixed by the human planner and only the beam-weights were optimised; (2) where both the wedge angles and beam-weights were optimised; and (3) where both the wedge angles and beam-weights were optimised and a non-uniform dose was prescribed to the PTV. In the latter PLAN MODE, a uniform 100% dose was prescribed to all of the PTV except for that region that overlaps with the rectum where a lower (e.g., 90%) dose was prescribed. The resulting optimised plans have been compared with those of the human planner who found beam-weights by conventional forward planning techniques. Plans were compared on the basis of dose statistics, normal-tissue-complication-probability (NTCP) and tumour-control-probability (TCP). The results of the comparison showed that all three PLAN MODES produced plans with slightly higher TCP for the same rectal NTCP, than the human planner. The best results were observed for PLAN MODE 3, where an average increase in TCP of 0.73% (± 0.20, 95% confidence interval) was predicted by the biological models. This increase arises from a beneficial dose gradient which is produced across the tumour. Although the TCP gain is small it comes with no increase in treatment complexity, and could translate into increased cures given the large numbers of patients being referred. 
A study of the beam-weights and wedge angles chosen by the optimisation algorithm revealed…
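    The simulated-annealing idea used for the beam-weight search can be sketched as follows. The cost function, step generator and cooling schedule here are illustrative stand-ins, not the paper's clinical cost function:

```python
import math
import random

def anneal(cost, x0, step, n_iter=2000, t0=1.0, seed=0):
    """Minimise `cost` by simulated annealing with a fast 1/(1+k) cooling schedule."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    for k in range(n_iter):
        t = t0 / (1.0 + k)
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = cost(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
        if fx < fbest:
            best, fbest = x, fx
    return best, fbest

# Toy "plan": find three beam weights, summing to one, that hit a target dose.
target = 2.0
doses = [1.0, 2.0, 3.0]  # hypothetical dose per unit weight of three beams
cost = lambda w: (sum(wi * di for wi, di in zip(w, doses)) - target) ** 2 \
                 + abs(sum(w) - 1.0)
w, f = anneal(cost, [0.3, 0.3, 0.3], step=0.05)
print("final cost:", round(f, 4))
```

    The real algorithm balances uniform PTV dose against integral organ-at-risk dose in the cost function; the acceptance rule above is what lets it escape local minima early in the run.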

  18. Modern multicore and manycore architectures: Modelling, optimisation and benchmarking a multiblock CFD code

    Science.gov (United States)

    Hadade, Ioan; di Mare, Luca

    2016-08-01

    Modern multicore and manycore processors exhibit multiple levels of parallelism through a wide range of architectural features such as SIMD for data parallel execution or threads for core parallelism. The exploitation of multi-level parallelism is therefore crucial for achieving superior performance on current and future processors. This paper presents the performance tuning of a multiblock CFD solver on Intel SandyBridge and Haswell multicore CPUs and the Intel Xeon Phi Knights Corner coprocessor. Code optimisations have been applied on two computational kernels exhibiting different computational patterns: the update of flow variables and the evaluation of the Roe numerical fluxes. We discuss at great length the code transformations required for achieving efficient SIMD computations for both kernels across the selected devices including SIMD shuffles and transpositions for flux stencil computations and global memory transformations. Core parallelism is expressed through threading based on a number of domain decomposition techniques together with optimisations pertaining to alleviating NUMA effects found in multi-socket compute nodes. Results are correlated with the Roofline performance model in order to assert their efficiency for each distinct architecture. We report significant speedups for single thread execution across both kernels: 2-5X on the multicore CPUs and 14-23X on the Xeon Phi coprocessor. Computations at full node and chip concurrency deliver a factor of three speedup on the multicore processors and up to 24X on the Xeon Phi manycore coprocessor.
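    The Roofline correlation mentioned above reduces to a simple min() of two ceilings. This sketch uses hypothetical machine numbers, not the SandyBridge/Haswell/Xeon Phi figures from the paper:

```python
def roofline(peak_gflops, bw_gbs, arith_intensity):
    """Attainable performance (GFLOP/s) under the Roofline model.

    A kernel is capped either by compute (peak_gflops) or by memory traffic:
    bandwidth (GB/s) times arithmetic intensity (flop/byte).
    """
    return min(peak_gflops, bw_gbs * arith_intensity)

# Hypothetical machine: 500 GFLOP/s peak, 50 GB/s memory bandwidth.
# A kernel at 0.5 flop/byte is memory bound; at 20 flop/byte, compute bound.
print(roofline(500, 50, 0.5))   # → 25.0
print(roofline(500, 50, 20.0))  # → 500
```

    Plotting measured kernel performance against this ceiling is how the authors assert whether each optimisation has reached the relevant hardware limit.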

  19. Optimisation of expansion liquefaction processes using mixed refrigerant N_2–CH_4

    International Nuclear Information System (INIS)

    Ding, He; Sun, Heng; He, Ming

    2016-01-01

    Highlights: • A refrigerant composition matching method for N_2–CH_4 expansion processes. • Efficiency improvements for propane pre-cooled N_2–CH_4 expansion processes. • The process shows good adaptability to varying natural gas compositions. - Abstract: An expansion process with a pre-cooling system is simulated and optimised by Aspen HYSYS and MATLAB™. Taking advantage of the higher specific refrigeration effect of methane and the easily reduced refrigeration temperature of nitrogen, the designed process adopts N_2–CH_4 as a mixed refrigerant. Based on the different thermodynamic properties and sensitivity difference of N_2 and CH_4 over the same heat transfer temperature range, this work proposes a novel method of matching refrigerant composition aimed at single-stage or multi-stage series expansion liquefaction processes with pre-cooling systems. This method is applied successfully in a propane pre-cooled N_2–CH_4 expansion process, and the unit power consumption is reduced to 7.09 kWh/kmol, only 5.35% higher than the global optimised solution obtained by genetic algorithm. The method achieves both low energy consumption and a high liquefaction rate, and thus narrows the gap between mixed refrigerant and expansion processes in energy consumption. Furthermore, the high exergy efficiency of the process indicates good adaptability to varying natural gas compositions.

  20. Optimising mobile phase composition, its flow-rate and column temperature in HPLC using taboo search.

    Science.gov (United States)

    Guillaume, Y C; Peyrin, E

    2000-03-06

    A chemometric methodology is proposed to study the separation of seven p-hydroxybenzoic esters in reversed phase liquid chromatography (RPLC). Fifteen experiments were found to be necessary to find a mathematical model which linked a novel chromatographic response function (CRF) with the column temperature, the water fraction in the mobile phase and its flow-rate. The CRF optimum was determined using a new algorithm based on Glover's taboo search (TS). A flow-rate of 0.9 ml min⁻¹ with a water fraction of 0.64 in the ACN-water mixture and a column temperature of 10°C gave the most efficient separation conditions. The usefulness of TS was compared with the pure random search (PRS) and simplex search (SS). As demonstrated by calculations, the algorithm avoids entrapment in local minima and continues the search to give a near-optimal final solution. Unlike other methods of global optimisation, this procedure is generally applicable, easy to implement, derivative free, conceptually simple and could be used in the future for much more complex optimisation problems.
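    The escape-from-local-minima behaviour of tabu (taboo) search comes from forbidding recently visited points. A minimal sketch over a discrete grid of (temperature, water fraction, flow-rate) settings, with an invented response surface standing in for the CRF:

```python
def tabu_search(f, grid, start, tabu_len=5, n_iter=50):
    """Minimise f over a discrete grid by tabu search.

    Recently visited points are kept on a tabu list so the search can
    leave local minima instead of cycling back into them.
    """
    def neighbours(p):
        # Moves: change one coordinate by one grid step.
        for i in range(len(p)):
            for d in (-1, 1):
                q = list(p); q[i] += d
                if all(0 <= q[j] < len(grid[j]) for j in range(len(q))):
                    yield tuple(q)

    current = best = start
    tabu = [start]
    for _ in range(n_iter):
        cands = [q for q in neighbours(current) if q not in tabu]
        if not cands:
            break
        current = min(cands, key=f)   # best non-tabu neighbour, even if uphill
        tabu.append(current)
        tabu = tabu[-tabu_len:]
        if f(current) < f(best):
            best = current
    return best

# Hypothetical response surface over indices into (T, water fraction, flow) grids.
grid = (range(5), range(5), range(5))
f = lambda p: (p[0] - 1) ** 2 + (p[1] - 3) ** 2 + (p[2] - 2) ** 2
print(tabu_search(f, grid, start=(4, 0, 0)))  # → (1, 3, 2)
```

    Because the move to the best non-tabu neighbour is accepted even when it is uphill, the search keeps exploring after reaching a minimum, which is the property the abstract contrasts with pure random and simplex search.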

  1. Time varying acceleration coefficients particle swarm optimisation (TVACPSO): A new optimisation algorithm for estimating parameters of PV cells and modules

    International Nuclear Information System (INIS)

    Jordehi, Ahmad Rezaee

    2016-01-01

    Highlights: • A modified PSO has been proposed for parameter estimation of PV cells and modules. • In the proposed modified PSO, acceleration coefficients are changed during the run. • The proposed modified PSO mitigates the premature convergence problem. • The parameter estimation problem has been solved for both PV cells and PV modules. • The results show that the proposed PSO outperforms other state-of-the-art algorithms. - Abstract: Estimating the circuit model parameters of PV cells/modules represents a challenging problem. The PV cell/module parameter estimation problem is typically translated into an optimisation problem and solved by metaheuristic optimisation algorithms. Particle swarm optimisation (PSO) is a popular and well-established optimisation algorithm. Despite all its advantages, PSO suffers from a premature convergence problem, meaning that it may get trapped in local optima. The personal and social acceleration coefficients are two control parameters that, due to their effect on explorative and exploitative capabilities, play important roles in the computational behaviour of PSO. In this paper, in an attempt to mitigate premature convergence in PSO, the personal acceleration coefficient is decreased during the course of the run, while the social acceleration coefficient is increased. In this way, an appropriate tradeoff between the explorative and exploitative capabilities of PSO is established during the course of the run and the premature convergence problem is significantly mitigated. The results show that in parameter estimation of PV cells and modules, the proposed time varying acceleration coefficients PSO (TVACPSO) offers more accurate parameters than conventional PSO, the teaching learning-based optimisation (TLBO) algorithm, the imperialistic competitive algorithm (ICA), grey wolf optimisation (GWO), the water cycle algorithm (WCA), pattern search (PS) and the Newton algorithm. For validation of the proposed methodology, parameter estimation has been done both for
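    The time-varying acceleration coefficients can be sketched as below. The 2.5 → 0.5 schedule endpoints follow the common TVAC choice in the PSO literature, not necessarily the paper's settings, and the sphere function stands in for the PV-model fitting objective:

```python
import random

def tvac_pso(f, dim, n_particles=20, n_iter=200, seed=1):
    """Minimise f with PSO whose acceleration coefficients vary over the run.

    The personal (cognitive) coefficient c1 decreases and the social
    coefficient c2 increases linearly, shifting the swarm from exploration
    to exploitation.
    """
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for t in range(n_iter):
        frac = t / n_iter
        c1 = 2.5 - 2.0 * frac          # personal: 2.5 -> 0.5
        c2 = 0.5 + 2.0 * frac          # social:   0.5 -> 2.5
        w = 0.9 - 0.5 * frac           # inertia:  0.9 -> 0.4
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

# Sphere function as a stand-in for the parameter-fitting objective.
best, fbest = tvac_pso(lambda x: sum(xi * xi for xi in x), dim=3)
print("best objective:", fbest)
```

    Early in the run the large c1 keeps particles circling their own best points (exploration); late in the run the large c2 pulls the swarm toward the global best (exploitation), which is the tradeoff the abstract describes.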

  2. KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.

    Science.gov (United States)

    Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert

    2017-05-15

    Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. In order to automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies reliable handling and processing of NGS data, and corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, Linux-based toolbox of 42 processing modules that can be combined to construct workflows facilitating a variety of tasks such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high-throughput executor (HTE) helps to increase reliability and to reduce manual interventions when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As the basis for this actively developed toolbox we use the workflow management software KNIME. See http://ibisngs.github.io/knime4ngs for nodes and user manual (GPLv3 license). robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.

  3. PredPsych: A toolbox for predictive machine learning-based approach in experimental psychology research.

    Science.gov (United States)

    Koul, Atesh; Becchio, Cristina; Cavallo, Andrea

    2017-12-12

    Recent years have seen an increased interest in machine learning-based predictive methods for analyzing quantitative behavioral data in experimental psychology. While these methods can achieve relatively greater sensitivity compared to conventional univariate techniques, they still lack an established and accessible implementation. The aim of the current work was to build an open-source R toolbox - "PredPsych" - that could make these methods readily available to all psychologists. PredPsych is a user-friendly R toolbox based on machine-learning predictive algorithms. In this paper, we present the framework of PredPsych via the analysis of a recently published multiple-subject motion capture dataset. In addition, we discuss examples of possible research questions that can be addressed with the machine-learning algorithms implemented in PredPsych that cannot easily be addressed with univariate statistical analysis. We anticipate that PredPsych will be of use to researchers with limited programming experience, not only in the field of psychology but also in clinical neuroscience, enabling computational assessment of putative bio-behavioral markers for both prognosis and diagnosis.

  4. Integration of Lead Discovery Tactics and the Evolution of the Lead Discovery Toolbox.

    Science.gov (United States)

    Leveridge, Melanie; Chung, Chun-Wa; Gross, Jeffrey W; Phelps, Christopher B; Green, Darren

    2018-06-01

    There has been much debate around the success rates of various screening strategies to identify starting points for drug discovery. Although high-throughput target-based and phenotypic screening has been the focus of this debate, techniques such as fragment screening, virtual screening, and DNA-encoded library screening are also increasingly reported as a source of new chemical equity. Here, we provide examples in which integration of more than one screening approach has improved the campaign outcome and discuss how strengths and weaknesses of various methods can be used to build a complementary toolbox of approaches, giving researchers the greatest probability of successfully identifying leads. Among others, we highlight case studies for receptor-interacting serine/threonine-protein kinase 1 and the bromo- and extra-terminal domain family of bromodomains. In each example, the unique insight or chemistries individual approaches provided are described, emphasizing the synergy of information obtained from the various tactics employed and the particular question each tactic was employed to answer. We conclude with a short prospective discussing how screening strategies are evolving, what this screening toolbox might look like in the future, how to maximize success through integration of multiple tactics, and scenarios that drive selection of one combination of tactics over another.

  5. ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis.

    Science.gov (United States)

    Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas

    2016-01-01

    Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of the community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and are reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/.

  6. MBEToolbox: a Matlab toolbox for sequence data analysis in molecular biology and evolution

    Directory of Open Access Journals (Sweden)

    Xia Xuhua

    2005-03-01

    Background: MATLAB is a high-performance language for technical computing, integrating computation, visualization, and programming in an easy-to-use environment. It has been widely used in many areas, such as mathematics and computation, algorithm development, data acquisition, modeling, simulation, and scientific and engineering graphics. However, few functions are freely available in MATLAB to perform the sequence data analyses specifically required for molecular biology and evolution. Results: We have developed a MATLAB toolbox, called MBEToolbox, aimed at filling this gap by offering efficient implementations of the most needed functions in molecular biology and evolution. It can be used to manipulate aligned sequences, calculate evolutionary distances, estimate synonymous and nonsynonymous substitution rates, and infer phylogenetic trees. Moreover, it provides an extensible, functional framework for users with more specialized requirements to explore and analyze aligned nucleotide or protein sequences from an evolutionary perspective. The full functions in the toolbox are accessible through the command line for seasoned MATLAB users. A graphical user interface, which may be especially useful for non-specialist end users, is also provided. Conclusion: MBEToolbox is a useful tool that can aid in the exploration, interpretation and visualization of data in molecular biology and evolution. The software is publicly available at http://web.hku.hk/~jamescai/mbetoolbox/ and http://bioinformatics.org/project/?group_id=454.

  7. Robust Correlation Analyses: False Positive and Power Validation Using a New Open Source Matlab Toolbox

    Science.gov (United States)

    Pernet, Cyril R.; Wilcox, Rand; Rousselet, Guillaume A.

    2012-01-01

    Pearson’s correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab(R) based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand. PMID:23335907

  8. SPP Toolbox: Supporting Sustainable Public Procurement in the Context of Socio-Technical Transitions

    Directory of Open Access Journals (Sweden)

    Paula Cayolla Trindade

    2017-12-01

    Full Text Available Public procurement can shape production and consumption trends and represents a stimulus for both innovation and diversification in products and services, through a direct increase in demand. In recent years, the interest in demand-side policies has grown and several approaches have emerged, such as Green Public Procurement (GPP, Sustainable Public Procurement (SPP and Public Procurement of Innovation (PPI, representing strategic goals to be achieved through public procurement. In this context, there is a need to guide and support public organizations in the uptake of GPP, SPP and PPI practices. To respond to the challenges raised by the operationalization of such strategies, this paper proposes a new tool—the SPP Toolbox—for guiding public organizations as they re-think the procurement process, raising their ambitions and broadening their vision, thus changing the organizational approach towards culture, strategies, structures and practices. This toolbox integrates insights from GPP, SPP and PPI objectives and practices, in the context of the emergence of socio-technical transitions. The toolbox coherently links GPP, SPP and PPI, allowing flexibility in terms of goals, yet promoting an increasing complexity of institutionalized practices and skills—from GPP to SPP and then from SPP to PPI, organized in a framework fully integrated into the organizational strategy.

  9. Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    Science.gov (United States)

    Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink (The MathWorks, Inc.) environment. These elements, along with a Newton-Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with Cantera was developed using MATLAB native M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.

  10. SinCHet: a MATLAB toolbox for single cell heterogeneity analysis in cancer.

    Science.gov (United States)

    Li, Jiannong; Smalley, Inna; Schell, Michael J; Smalley, Keiran S M; Chen, Y Ann

    2017-09-15

Single-cell technologies allow characterization of transcriptomes and epigenomes for individual cells under different conditions and provide unprecedented resolution for researchers to investigate cellular heterogeneity in cancer. The SinCHet (Single Cell Heterogeneity) toolbox is developed in MATLAB and has a graphical user interface (GUI) for visualization and user interaction. It analyzes both continuous (e.g. mRNA expression) and binary omics data (e.g. discretized methylation data). The toolbox not only quantifies cellular heterogeneity using the Shannon Profile (SP) at different clonal resolutions but also detects heterogeneity differences using a D statistic between two populations, defined as the area under the Profile of Shannon Difference (PSD). This flexible tool provides a default clonal resolution using the change point of the PSD detected by a multivariate adaptive regression splines model; it also allows user-defined clonal resolutions for further investigation. This tool provides insights into emerging or disappearing clones between conditions, and enables the prioritization of biomarkers for follow-up experiments based on heterogeneity or marker differences between and/or within cell populations. The SinCHet software is freely available for non-profit academic use. The source code, example datasets, and the compiled package are available at http://labpages2.moffitt.org/chen/software/. Contact: ann.chen@moffitt.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
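The Shannon Profile and D statistic described in this record can be illustrated with a minimal sketch (our own simplification in plain Python, not the toolbox's MATLAB implementation): cluster each population at several clonal resolutions, compute the Shannon index of clone sizes at each resolution, and take the area under the absolute difference of the two profiles.

```python
import math

def shannon_index(counts):
    """Shannon diversity of clone sizes within one cell population."""
    total = sum(counts)
    return -sum(c / total * math.log(c / total) for c in counts if c > 0)

def shannon_profile(counts_by_resolution):
    """Shannon index at each clonal resolution (coarse to fine)."""
    return [shannon_index(c) for c in counts_by_resolution]

def d_statistic(profile_a, profile_b):
    """Area under the profile of absolute Shannon differences
    (trapezoidal rule, resolutions treated as unit-spaced)."""
    diff = [abs(a - b) for a, b in zip(profile_a, profile_b)]
    return sum((diff[i] + diff[i + 1]) / 2 for i in range(len(diff) - 1))

# Two populations clustered at three resolutions (clone counts per cluster):
# population B is dominated by one clone, so its profile sits lower.
pop_a = [[10], [5, 5], [5, 3, 2]]
pop_b = [[10], [9, 1], [8, 1, 1]]
d = d_statistic(shannon_profile(pop_a), shannon_profile(pop_b))
```

A positive `d` flags a heterogeneity difference between the two populations; the toolbox additionally picks a default resolution from the change point of the PSD, which this sketch omits.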

  11. Robust correlation analyses: false positive and power validation using a new open source matlab toolbox.

    Science.gov (United States)

    Pernet, Cyril R; Wilcox, Rand; Rousselet, Guillaume A

    2012-01-01

Pearson's correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab® based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down-weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand.
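The percentage-bend correlation mentioned in the abstract can be sketched as follows, following Wilcox's published definition with the conventional bend constant β = 0.2. This is a plain-Python illustration, not the toolbox's code: scores far from a robust centre are "bent" back to ±1 before the Pearson-like normalisation, which bounds the influence any single outlier can have.

```python
import math

def percentage_bend_correlation(x, y, beta=0.2):
    """Percentage-bend correlation (after Wilcox, beta = 0.2)."""
    def bend_scores(v):
        n = len(v)
        srt = sorted(v)
        med = srt[n // 2] if n % 2 else (srt[n // 2 - 1] + srt[n // 2]) / 2
        w = sorted(abs(vi - med) for vi in v)
        m = int(math.floor((1 - beta) * n + 0.5))
        omega = w[m - 1]                                   # robust scale
        lo = sum(1 for vi in v if (vi - med) / omega < -1)
        hi = sum(1 for vi in v if (vi - med) / omega > 1)
        mid_sum = sum(vi for vi in v if abs((vi - med) / omega) <= 1)
        phi = (omega * (hi - lo) + mid_sum) / (n - lo - hi)  # robust location
        # Scores clipped ("bent") to [-1, 1]
        return [max(-1.0, min(1.0, (vi - phi) / omega)) for vi in v]

    a, b = bend_scores(x), bend_scores(y)
    num = sum(ai * bi for ai, bi in zip(a, b))
    return num / math.sqrt(sum(ai * ai for ai in a) * sum(bi * bi for bi in b))
```

For perfectly linearly related data the statistic equals ±1, as with Pearson's r, but a single wild observation moves it far less.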

  12. BOLDSync: a MATLAB-based toolbox for synchronized stimulus presentation in functional MRI.

    Science.gov (United States)

    Joshi, Jitesh; Saharan, Sumiti; Mandal, Pravat K

    2014-02-15

Precise and synchronized presentation of paradigm stimuli in functional magnetic resonance imaging (fMRI) is central to obtaining accurate information about brain regions involved in a specific task. In this manuscript, we present a new MATLAB-based toolbox, BOLDSync, for synchronized stimulus presentation in fMRI. BOLDSync provides a user-friendly platform for design and presentation of visual, audio, as well as multimodal audio-visual (AV) stimuli in functional imaging experiments. We present simulation experiments that demonstrate the millisecond synchronization accuracy of BOLDSync, and also illustrate the functionalities of BOLDSync through application to an AV fMRI study. BOLDSync gains an advantage over other available proprietary and open-source toolboxes by offering a user-friendly and accessible interface that affords both precision in stimulus presentation and versatility across various types of stimulus designs and system setups. BOLDSync is a reliable, efficient, and versatile solution for synchronized stimulus presentation in fMRI studies. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. An open-source toolbox for multiphase flow in porous media

    Science.gov (United States)

    Horgue, P.; Soulaine, C.; Franc, J.; Guibert, R.; Debenest, G.

    2015-02-01

    Multiphase flow in porous media provides a wide range of applications: from the environmental understanding (aquifer, site-pollution) to industrial process improvements (oil production, waste management). Modeling of such flows involves specific volume-averaged equations and therefore specific computational fluid dynamics (CFD) tools. In this work, we develop a toolbox for modeling multiphase flow in porous media with OpenFOAM®, an open-source platform for CFD. The underlying idea of this approach is to provide an easily adaptable tool that can be used in further studies to test new mathematical models or numerical methods. The package provides the most common effective properties models of the literature (relative permeability, capillary pressure) and specific boundary conditions related to porous media flows. To validate this package, solvers based on the IMplicit Pressure Explicit Saturation (IMPES) method are developed in the toolbox. The numerical validation is performed by comparison with analytical solutions on academic cases. Then, a satisfactory parallel efficiency of the solver is shown on a more complex configuration.
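The effective-property models the toolbox provides (relative permeability, capillary pressure) are commonly of Brooks-Corey or Van Genuchten form. Below is a minimal Brooks-Corey sketch with illustrative parameter values; it is our own stand-alone illustration of those constitutive relations, not the OpenFOAM® implementation.

```python
def effective_saturation(s_w, s_wc=0.1, s_or=0.1):
    """Normalise wetting-phase saturation to [0, 1] between the
    connate (s_wc) and residual (s_or) end points."""
    return (s_w - s_wc) / (1.0 - s_wc - s_or)

def brooks_corey(s_w, s_wc=0.1, s_or=0.1, n_w=2.0, n_n=2.0,
                 pc0=1.0e3, lam=2.0):
    """Relative permeabilities and capillary pressure, Brooks-Corey form.

    Illustrative parameters: exponents n_w, n_n, entry pressure pc0 [Pa]
    and pore-size distribution index lam are problem-specific inputs.
    """
    se = effective_saturation(s_w, s_wc, s_or)
    kr_w = se ** n_w                    # wetting-phase relative permeability
    kr_n = (1.0 - se) ** n_n            # non-wetting-phase relative permeability
    pc = pc0 * se ** (-1.0 / lam)       # capillary pressure [Pa]
    return kr_w, kr_n, pc
```

In an IMPES-type solver such as those in the toolbox, these closures feed the phase mobilities in the implicit pressure equation and the explicit saturation update.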

  14. A Toolbox of Genetically Encoded FRET-Based Biosensors for Rapid l-Lysine Analysis

    Directory of Open Access Journals (Sweden)

    Victoria Steffen

    2016-09-01

Full Text Available Background: The fast development of microbial production strains for basic and fine chemicals is increasingly carried out in small-scale cultivation systems to allow for higher throughput. Such parallelized systems create a need for new rapid online detection systems to quantify the respective target compound. In this regard, biosensors, especially genetically encoded Förster resonance energy transfer (FRET)-based biosensors, offer tremendous opportunities. As a proof-of-concept, we have created a toolbox of FRET-based biosensors for the ratiometric determination of l-lysine in fermentation broth. Methods: The sensor toolbox was constructed based on a sensor that consists of an optimized central lysine-/arginine-/ornithine-binding protein (LAO-BP) flanked by two fluorescent proteins (enhanced cyan fluorescent protein (ECFP) and Citrine). Further sensor variants with altered affinity and sensitivity were obtained by circular permutation of the binding protein as well as the introduction of flexible and rigid linkers between the fluorescent proteins and the LAO-BP, respectively. Results: The sensor prototype was applied to monitor the extracellular l-lysine concentration of the l-lysine producing Corynebacterium glutamicum (C. glutamicum) strain DM1933 in a BioLector® microscale cultivation device. The results matched well with data obtained by HPLC analysis and the Ninhydrin assay, demonstrating the high potential of FRET-based biosensors for high-throughput microbial bioprocess optimization.
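Ratiometric read-out of such a FRET sensor reduces to a one-site binding isotherm. The sketch below uses hypothetical calibration values (`r_apo`, `r_sat`, `kd` are illustrative, not those of the actual sensors) to show the forward model and its inversion from a measured ratio to an l-lysine concentration:

```python
def ratio_from_lysine(conc, r_apo=0.8, r_sat=1.6, kd=0.5):
    """Forward model: one-site binding isotherm mapping l-lysine
    concentration (same units as kd) to a FRET ratio."""
    theta = conc / (kd + conc)                 # fractional occupancy
    return r_apo + (r_sat - r_apo) * theta

def lysine_from_ratio(ratio, r_apo=0.8, r_sat=1.6, kd=0.5):
    """Invert the isotherm to estimate concentration from a measured ratio."""
    theta = (ratio - r_apo) / (r_sat - r_apo)  # back out the occupancy
    return kd * theta / (1.0 - theta)
```

Because the read-out is a ratio of two emission channels, it is largely insensitive to sensor expression level, which is what makes this form of online monitoring attractive in microscale cultivations.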

  15. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/

  16. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-01-01

Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in form of qualitative if–then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de. PMID: 22451270

  17. Safety, dose optimisation and security: the quadrature of the circle

    Energy Technology Data Exchange (ETDEWEB)

    Hardeman, Frank; Vermeersch, Fernand [Belgian Nuclear Research Centre (SCK.CEN), Boeretang 200, BE-2400 Mol (Belgium)

    2010-07-01

    differences being present. There are aspects of trust and distrust, supervision and coaching versus control and verification; acceptability of measures implemented; having control or being victim of global evolutions. As a conclusion, it is indispensable that some people, both at the level of regulators and operators dispose of a helicopter view on this subject, in order to achieve optimal solutions understandable to the workforce and taking into account all aspects: safety, security and dose optimisation. (authors)

  18. Safety, dose optimisation and security: the quadrature of the circle

    International Nuclear Information System (INIS)

    Hardeman, Frank; Vermeersch, Fernand

    2010-01-01

    present. There are aspects of trust and distrust, supervision and coaching versus control and verification; acceptability of measures implemented; having control or being victim of global evolutions. As a conclusion, it is indispensable that some people, both at the level of regulators and operators dispose of a helicopter view on this subject, in order to achieve optimal solutions understandable to the workforce and taking into account all aspects: safety, security and dose optimisation. (authors)

  19. Agent-Based Decision Control—How to Appreciate Multivariate Optimisation in Architecture

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer; Perkov, Thomas Holmer; Kolarik, Jakub

    2015-01-01

The main focus is to demonstrate the optimisation method, which is done in two ways. Firstly, the newly developed agent-based optimisation algorithm named Moth is tested on three different single objective search spaces. Here Moth is compared to two evolutionary algorithms. Secondly, the method is applied to a multivariate optimisation problem. The aim is specifically to demonstrate optimisation for entire building energy consumption, daylight distribution and capital cost. Based on the demonstrations Moth’s ability to find local minima is discussed. It is concluded that agent-based optimisation algorithms like Moth open up for new uses of optimisation in the early design stage. With Moth the final outcome is less dependent on pre- and post-processing, and Moth allows user intervention during optimisation. Therefore, agent-based models for optimisation such as Moth can be a powerful tool in the early design stage.

  20. Multigrid Implementation of Cellular Automata for Topology Optimisation of Continuum Structures with Design Dependent loads

    NARCIS (Netherlands)

    Zakhama, R.

    2009-01-01

    Topology optimisation of continuum structures has become mature enough to be often applied in industry and continues to attract the attention of researchers and software companies in various engineering fields. Traditionally, most available algorithms for solving topology optimisation problems are

  1. Solving dynamic multi-objective problems with vector evaluated particle swarm optimisation

    CSIR Research Space (South Africa)

    Greeff, M

    2008-06-01

    Full Text Available Many optimisation problems are multi-objective and change dynamically. Many methods use a weighted average approach to the multiple objectives. This paper introduces the usage of the vector evaluated particle swarm optimiser (VEPSO) to solve dynamic...

  2. Sentinel-3 SAR Altimetry Toolbox - Scientific Exploitation of Operational Missions (SEOM) Program Element

    Science.gov (United States)

    Benveniste, Jérôme; Lucas, Bruno; Dinardo, Salvatore

    2014-05-01

The prime objective of the SEOM (Scientific Exploitation of Operational Missions) element is to federate, support and expand the large international research community that the ERS, ENVISAT and the Envelope programmes have built up over the last 20 years for the future European operational Earth Observation missions, the Sentinels. Sentinel-3 builds directly on a proven heritage pioneered by ERS-1, ERS-2, Envisat and CryoSat-2, with a dual-frequency (Ku and C band) advanced Synthetic Aperture Radar Altimeter (SRAL) that provides measurements at a resolution of ~300 m along track in SAR mode. Sentinel-3 will provide exact measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The first of the Sentinel-3 series is planned for launch in early 2015. The current universal altimetry toolbox is BRAT (Basic Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data, but it does not have the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA will endeavour to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales, the French Space Agency), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats; BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as net

  3. SINGLE FIXED CRANE OPTIMISATION WITHIN A DISTRIBUTION CENTRE

    Directory of Open Access Journals (Sweden)

    J. Matthews

    2012-01-01

    Full Text Available

ENGLISH ABSTRACT: This paper considers the optimisation of the movement of a fixed crane operating in a single aisle of a distribution centre. The crane must move pallets in inventory between docking bays, storage locations, and picking lines. Both a static and a dynamic approach to the problem are presented. The optimisation is performed by means of tabu search, ant colony metaheuristics, and hybrids of these two methods. All these solution approaches were tested on real life data obtained from an operational distribution centre. Results indicate that the hybrid methods outperform the other approaches.

AFRIKAANS ABSTRACT: The optimisation of the movement of a fixed crane in a single aisle of a distribution centre is considered in this article. The crane must transport pallets between docking bays, storage locations, and picking lines. Both a static and a dynamic approach to the problem are presented. The optimisation is performed using tabu search, ant colony optimisation, and hybrids of these two methods. All the solution approaches were tested with real data obtained from an operational distribution centre. The results show that the hybrid methods yield the best solutions.

4. Formulation of self-compacting concretes: optimisation of the granular skeleton ...

    African Journals Online (AJOL)

Formulation of self-compacting concretes: optimisation of the granular skeleton using the Dreux-Gorisse graphical method. Fatiha Boumaza-Zeraoulia* & Mourad Behim. Laboratoire Materiaux, Geo-Materiaux et Environnement, Departement de Genie Civil, Universite Badji Mokhtar Annaba, BP 12, 23000 Annaba - ...

  5. Parameter Optimisation for the Behaviour of Elastic Models over Time

    DEFF Research Database (Denmark)

    Mosegaard, Jesper

    2004-01-01

    Optimisation of parameters for elastic models is essential for comparison or finding equivalent behaviour of elastic models when parameters cannot simply be transferred or converted. This is the case with a large range of commonly used elastic models. In this paper we present a general method tha...

  6. Development of an Optimised Losartan Potassium Press-Coated ...

    African Journals Online (AJOL)

    The optimised formulation was further characterized with Fourier-transform infrared spectroscopy (FTIR) and powder X-ray diffractometry (PXRD) to investigate any drug/excipient modifications/interactions. Results: The tensile strength values of all the PCT were between 1.12 and 1.23MNm-2 and friability was < 0.36 %.

  7. Statistical Optimisation of Fermentation Conditions for Citric Acid ...

    African Journals Online (AJOL)

    This study investigated the optimisation of fermentation conditions during citric acid production via solid state fermentation (SSF) of pineapple peels using Aspergillus niger. A three-variable, three-level Box-Behnken design (BBD) comprising 17 experimental runs was used to develop a statistical model for the fermentation ...
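A Box-Behnken design of the kind used in this study can be generated directly from its definition (a generic sketch, not tied to the study's statistical software): each pair of factors takes all four (±1, ±1) level combinations while the remaining factors sit at the centre level, plus replicate centre runs.

```python
from itertools import combinations, product

def box_behnken(k, center_points=5):
    """Box-Behnken design matrix in coded units (-1, 0, +1)."""
    runs = []
    for i, j in combinations(range(k), 2):      # every factor pair
        for a, b in product((-1, 1), repeat=2): # its four corner settings
            run = [0] * k
            run[i], run[j] = a, b
            runs.append(run)
    runs += [[0] * k for _ in range(center_points)]  # centre replicates
    return runs
```

With k = 3 factors and five centre points this yields 12 + 5 = 17 runs, consistent with the 17-run three-variable design mentioned in the abstract (the exact number of centre replicates in the study is our inference from that run count).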

  8. Optimising the Blended Learning Environment: The Arab Open University Experience

    Science.gov (United States)

    Hamdi, Tahrir; Abu Qudais, Mohammed

    2018-01-01

    This paper will offer some insights into possible ways to optimise the blended learning environment based on experience with this modality of teaching at Arab Open University/Jordan branch and also by reflecting upon the results of several meta-analytical studies, which have shown blended learning environments to be more effective than their face…

  9. Comparing and Optimising Parallel Haskell Implementations for Multicore Machines

    DEFF Research Database (Denmark)

    Berthold, Jost; Marlow, Simon; Hammond, Kevin

    2009-01-01

    In this paper, we investigate the differences and tradeoffs imposed by two parallel Haskell dialects running on multicore machines. GpH and Eden are both constructed using the highly-optimising sequential GHC compiler, and share thread scheduling, and other elements, from a common code base. The ...

  10. Larval feeding inhibition assay – need for optimisation

    DEFF Research Database (Denmark)

    Azuhnwi, Blasius; Desrues, O.; Hoste, H.

    2013-01-01

    for this observed variation in results include: parasite (species/strain); material tested; or season. There is thus need to optimise LFIA to permit intra and inter-laboratory comparison of results. We investigate here, if changes in EC50 values occur over the patency phase of a nematode species using two test...

  11. A comparative study of marriage in honey bees optimisation (MBO ...

    African Journals Online (AJOL)

    2012-02-15

Feb 15, 2012 ... In a typical mating, the queen mates with 7 to 20 drones. Each time the ... Honey bee mating optimisation model's pseudo-code ... for this analysis, which consists of 47 years of monthly time ... tive of Karkheh Reservoir is to control and regulate the flow of ... Masters thesis, Maastricht University, Maastricht.

  12. Analysing the performance of dynamic multi-objective optimisation algorithms

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available and the goal of the algorithm is to track a set of tradeoff solutions over time. Analysing the performance of a dynamic multi-objective optimisation algorithm (DMOA) is not a trivial task. For each environment (before a change occurs) the DMOA has to find a set...

  13. Optimisation of dialysate flow in on-line hemodiafiltration

    Directory of Open Access Journals (Sweden)

    Francisco Maduell

    2015-09-01

Conclusion: Qd variations in OL-HDF do not change convective volume. A higher Qd was associated with a slightly increased urea clearance, with no change being observed for medium and large molecules. Qd optimisation to the minimal level assuring an adequate dialysis dose and allowing water and dialysate use to be rationalised should be recommended.

  14. Day-ahead economic optimisation of energy storage

    NARCIS (Netherlands)

    Lampropoulos, I.; Garoufalis, P.; Bosch, van den P.P.J.; Groot, de R.J.W.; Kling, W.L.

    2014-01-01

    This article addresses the day-ahead economic optimisation of energy storage systems within the setting of electricity spot markets. The case study is about a lithium-ion battery system integrated in a low voltage distribution grid with residential customers and photovoltaic generation in the
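The underlying spot-market arbitrage problem can be sketched as a small dynamic program over a discretised state of charge. This is a toy illustration with a simplified loss model (round-trip efficiency split evenly between charge and discharge); the article's battery and market model is richer, including grid and customer constraints not reproduced here.

```python
def schedule_storage(prices, capacity=4, power=1, efficiency=0.9):
    """Maximise day-ahead arbitrage profit for a storage unit by dynamic
    programming over integer state-of-charge steps.

    Simplified losses: buying d steps costs price*d/efficiency and
    selling d steps earns price*d*efficiency (round trip = efficiency**2).
    """
    value = [0.0] * (capacity + 1)   # value[s]: profit-to-go with charge s
    for t in range(len(prices) - 1, -1, -1):
        new = []
        for s in range(capacity + 1):
            best = value[s]                              # stay idle
            for d in range(1, power + 1):
                if s + d <= capacity:                    # charge d steps
                    best = max(best, value[s + d] - prices[t] * d / efficiency)
                if s - d >= 0:                           # discharge d steps
                    best = max(best, value[s - d] + prices[t] * d * efficiency)
            new.append(best)
        value = new
    return value[0]                                      # start empty

# Cheap morning, expensive evening: buy low, sell high.
profit = schedule_storage([10, 10, 50, 50], capacity=2, power=1, efficiency=1.0)
```

With lossless storage the schedule buys both cheap hours and sells both expensive ones (profit 80 for the example prices); with realistic efficiency, small price spreads become unprofitable and the unit stays idle.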

  15. Optimisation of the image resolution of a positron emission tomograph

    International Nuclear Information System (INIS)

    Ziemons, K.

    1993-10-01

The resolution and the corresponding signal-to-noise ratios of reconstructed images were the main focus of this work on the optimisation of PET systems. Monte-Carlo modelling calculations were applied to derive possible improvements of the technical design or performance of the PET system. (DG) [de]

  16. FISHRENT; Bio-economic simulation and optimisation model

    NARCIS (Netherlands)

    Salz, P.; Buisman, F.C.; Soma, K.; Frost, H.; Accadia, P.; Prellezo, R.

    2011-01-01

Key findings: The FISHRENT model is a major step forward in bio-economic modelling, combining features that have not been fully integrated in earlier models: 1- Incorporation of any number of species (or stock) and/or fleets 2- Integration of simulation and optimisation over a period of 25 years 3-

  17. Design of optimised backstepping controller for the synchronisation

    Indian Academy of Sciences (India)

    In this paper, an adaptive backstepping controller has been tuned to synchronise two chaotic Colpitts oscillators in a master–slave configuration. The parameters of the controller are determined using shark smell optimisation (SSO) algorithm. Numerical results are presented and compared with those of particle swarm ...

  18. Plant-wide performance optimisation – The refrigeration system case

    DEFF Research Database (Denmark)

    Green, Torben; Razavi-Far, Roozbeh; Izadi-Zamanabadi, Roozbeh

    2012-01-01

applications in the process industry. The paper addresses the fact that dynamic performance of the system is important, to ensure optimal changes between different operating conditions. To enable optimisation of the dynamic controller behaviour a method for designing the required excitation signal is presented...

  19. Preconditioned stochastic gradient descent optimisation for monomodal image registration

    NARCIS (Netherlands)

    Klein, S.; Staring, M.; Andersson, J.P.; Pluim, J.P.W.; Fichtinger, G.; Martel, A.; Peters, T.

    2011-01-01

    We present a stochastic optimisation method for intensity-based monomodal image registration. The method is based on a Robbins-Monro stochastic gradient descent method with adaptive step size estimation, and adds a preconditioning matrix. The derivation of the pre-conditioner is based on the

  20. Optimisation of Hidden Markov Model using Baum–Welch algorithm

    Indian Academy of Sciences (India)

Journal of Earth System Science, Volume 126, Issue 1, February 2017. Optimisation of Hidden Markov Model using Baum–Welch algorithm for prediction of maximum and minimum temperature over Indian Himalaya. J C Joshi, Tankeshwar Kumar, Sunita Srivastava, Divya Sachdeva.
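The Baum–Welch re-estimation named in the title can be sketched for a discrete hidden Markov model as follows. This is the generic EM update for a single observation sequence, not the paper's temperature model: a forward and a backward pass give state posteriors, from which the initial, transition and emission probabilities are re-estimated.

```python
def forward(obs, pi, A, B):
    """Forward recursion: per-step alphas and the sequence likelihood."""
    n = len(pi)
    alpha = [[pi[i] * B[i][obs[0]] for i in range(n)]]
    for t in range(1, len(obs)):
        alpha.append([B[j][obs[t]] * sum(alpha[-1][i] * A[i][j] for i in range(n))
                      for j in range(n)])
    return alpha, sum(alpha[-1])

def backward(obs, A, B):
    """Backward recursion with beta[T-1] = 1."""
    n = len(A)
    beta = [[1.0] * n]
    for t in range(len(obs) - 1, 0, -1):
        beta.insert(0, [sum(A[i][j] * B[j][obs[t]] * beta[0][j] for j in range(n))
                        for i in range(n)])
    return beta

def baum_welch_step(obs, pi, A, B):
    """One EM re-estimation of (pi, A, B) from a single observation sequence."""
    n, m, T = len(pi), len(B[0]), len(obs)
    alpha, like = forward(obs, pi, A, B)
    beta = backward(obs, A, B)
    # gamma[t][i]: posterior of being in state i at time t
    gamma = [[alpha[t][i] * beta[t][i] / like for i in range(n)] for t in range(T)]
    # xi[t][i][j]: posterior of the transition i -> j at time t
    xi = [[[alpha[t][i] * A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j] / like
            for j in range(n)] for i in range(n)] for t in range(T - 1)]
    new_pi = gamma[0][:]
    new_A = [[sum(x[i][j] for x in xi) / sum(g[i] for g in gamma[:-1])
              for j in range(n)] for i in range(n)]
    new_B = [[sum(g[i] for t, g in enumerate(gamma) if obs[t] == k)
              / sum(g[i] for g in gamma) for k in range(m)] for i in range(n)]
    return new_pi, new_A, new_B
```

Each such step is guaranteed not to decrease the sequence likelihood, which is the EM property the paper exploits when fitting the temperature HMM.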

  1. Optimisation of Compressive Strength of Periwinkle Shell Aggregate

    African Journals Online (AJOL)


    2017-01-01

    Jan 1, 2017 ... In this paper, a regression model is developed to predict and optimise the compressive strength of periwinkle shell aggregate concrete using Scheffe's regression theory. The results obtained from the derived regression model agreed favourably with the experimental data. The model was tested for ...

  2. Smart optimisation and sensitivity analysis in water distribution systems

    CSIR Research Space (South Africa)

    Page, Philip R

    2015-12-01

    Full Text Available optimisation of a water distribution system by keeping the average pressure unchanged as water demands change, by changing the speed of the pumps. Another application area considered, using the same mathematical notions, is the study of the sensitivity...

  3. Estimators for initial conditions for optimisation in learning hydraulic systems

    NARCIS (Netherlands)

    Post, W.J.A.E.M.; Burrows, C.R.; Edge, K.A.

    1998-01-01

In Learning Hydraulic Systems (LHS), developed at the Eindhoven University of Technology, a specialised optimisation routine is employed in order to reduce energy losses in hydraulic systems. Typical load situations which can be managed by LHS are variable cyclic loads, as can be observed in many

  4. An optimised portfolio management model, incorporating best practices

    OpenAIRE

    2015-01-01

M.Ing. (Engineering Management) Driving sustainability, optimising return on investments and cultivating a competitive market advantage, are imperative for organisational success and growth. In order to achieve the business objectives and value proposition, effective management strategies must be efficiently implemented, monitored and controlled. Failure to do so ultimately results in financial loss due to increased capital and operational expenditure, schedule slippages, substandard deliv...

  5. Optimisation of searches for Supersymmetry with the ATLAS detector

    Energy Technology Data Exchange (ETDEWEB)

    Zvolsky, Milan

    2012-01-15

The ATLAS experiment is one of the four large experiments at the Large Hadron Collider which is specifically designed to search for the Higgs boson and physics beyond the Standard Model. The aim of this thesis is the optimisation of searches for Supersymmetry in decays with two leptons and missing transverse energy in the final state. Two different optimisation studies have been performed for two important analysis aspects: the final signal region selection and the choice of the trigger selection. In the first part of the analysis, a cut-based optimisation of signal regions is performed, maximising the signal for a minimal background contamination. In this way, the signal yield can in some cases be more than doubled. The second approach is to introduce di-lepton triggers which allow the lepton transverse momentum threshold to be lowered, thus enhancing the number of selected signal events significantly. The signal region optimisation was considered for the choice of the final event selection in the ATLAS di-lepton analyses. The trigger study contributed to the incorporation of di-lepton triggers into the ATLAS trigger menu. (orig.)

  6. Optimisation Study on the Production of Anaerobic Digestate ...

    African Journals Online (AJOL)

    Organic fraction of municipal solid waste (OFMSW) is a rich substrate for biogas and compost production. Anaerobic Digestate compost (ADC) is an organic fertilizer produced from stabilized residuals of anaerobic digestion of OFMSW. This paper reports the result of studies carried out to optimise the production of ADC from ...

  7. Optimising performance in steady state for a supermarket refrigeration system

    DEFF Research Database (Denmark)

    Green, Torben; Kinnaert, Michel; Razavi-Far, Roozbeh

    2012-01-01

    Using a supermarket refrigeration system as an illustrative example, the paper postulates that by appropriately utilising knowledge of plant operation, the plant wide performance can be optimised based on a small set of variables. Focusing on steady state operations, the total system performance...

  8. A Bayesian Approach for Sensor Optimisation in Impact Identification

    Directory of Open Access Journals (Sweden)

    Vincenzo Mallardo

    2016-11-01

Full Text Available This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination aimed at locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both the uniform and non-uniform probability of impact occurrence.
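The genetic-algorithm search over sensor combinations can be sketched as follows. This is a generic subset-selection GA with a hypothetical 1-D coverage fitness standing in for the paper's Bayesian objective; the meta-models and sensor-failure probabilities of the actual procedure are not reproduced.

```python
import random

def genetic_sensor_selection(n_candidates, n_sensors, fitness,
                             generations=60, pop_size=30,
                             mutation_rate=0.2, seed=1):
    """Search for the best n_sensors-subset of candidate positions.

    Individuals are sorted tuples of site indices; elitism keeps the
    top half, children come from union-crossover plus point mutation.
    """
    rng = random.Random(seed)

    def random_individual():
        return tuple(sorted(rng.sample(range(n_candidates), n_sensors)))

    def crossover(a, b):
        pool = list(set(a) | set(b))              # union of parents' sites
        return tuple(sorted(rng.sample(pool, n_sensors)))

    def mutate(ind):
        if rng.random() >= mutation_rate:
            return ind
        ind = list(ind)
        free = [c for c in range(n_candidates) if c not in ind]
        ind[rng.randrange(n_sensors)] = rng.choice(free)
        return tuple(sorted(ind))

    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]               # best individual never lost
        pop = elite + [mutate(crossover(rng.choice(elite), rng.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

# Hypothetical fitness: impact points on a 1-D structure; a good sensor
# layout minimises each impact's distance to its nearest sensor.
impacts = [0.5, 2.5, 7.5, 9.0]
def coverage(combo):
    return -sum(min(abs(p - s) for s in combo) for p in impacts)

best = genetic_sensor_selection(10, 3, coverage)
```

In the paper's setting, `coverage` would be replaced by the Bayesian objective evaluated on meta-models for each candidate sensor combination, with malfunctioning sensors folded into the expectation.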

  9. Optimisation of selective breeding program for Nile tilapia (Oreochromis niloticus)

    NARCIS (Netherlands)

    Trong, T.Q.

    2013-01-01

    The aim of this thesis was to optimise the selective breeding program for Nile tilapia in the Mekong Delta region of Vietnam. Two breeding schemes, the “classic” BLUP scheme following the GIFT method (with pair mating) and a rotational mating scheme with own performance selection and

  10. Optimised cantilever biosensor with piezoresistive read-out

    DEFF Research Database (Denmark)

    Rasmussen, Peter; Thaysen, J.; Hansen, Ole

    2003-01-01

    We present a cantilever-based biochemical sensor with piezoresistive read-out which has been optimised for measuring surface stress. The resistors and the electrical wiring on the chip are encapsulated in low-pressure chemical vapor deposition (LPCVD) silicon nitride, so that the chip is well sui...

  11. Optimisation of Heterogeneous Migration Paths to High Bandwidth Home Connections

    NARCIS (Netherlands)

    Phillipson, F.

    2017-01-01

    Operators are building architectures and systems for delivering voice, audio, and data services at the required speed for now and in the future. For fixed access networks, this means in many countries a shift from copper based to fibre based access networks. This paper proposes a method to optimise ...

  12. Optimisation of wort production from rice malt using enzymes and ...

    African Journals Online (AJOL)

    Commercially, rice malt has never been successfully used in brewing because of its low free α-amino nitrogen (FAN) content. This study was designed to optimise rice malt replacement for barley malt in wort production and to improve FAN by adding α-amylase and protease. The response surface methodology (RSM) ...
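
    Response surface methodology at its simplest fits a low-order polynomial to observed responses and reads the optimum off the fitted surface. The sketch below is illustrative only — one factor and fabricated FAN data, not the study's actual multi-factor design — but it shows the mechanics of fitting a quadratic and locating its stationary point:

    ```python
    import numpy as np

    # Fabricated single-factor data: FAN response vs. enzyme dose (illustrative units)
    dose = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0, 10.0])
    fan = -(dose - 5.0) ** 2 + 20.0        # a quadratic response peaking at dose = 5

    b2, b1, b0 = np.polyfit(dose, fan, 2)  # least-squares fit of y = b2*x^2 + b1*x + b0
    x_opt = -b1 / (2.0 * b2)               # stationary point of the fitted surface
    ```

    With more than one factor, RSM fits a full quadratic with interaction terms and solves for the stationary point of that surface in the same spirit.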

  13. Optimisation of a novel trailing edge concept for a high lift device

    CSIR Research Space (South Africa)

    Botha, JDM

    2014-09-01

    Full Text Available. A novel concept (referred to as the flap extension) is implemented on the leading edge of the flap of a three-element high lift device. The concept is optimised using two Genetic Algorithm-based optimisation approaches. A zero order...

  14. Coil optimisation for transcranial magnetic stimulation in realistic head geometry.

    Science.gov (United States)

    Koponen, Lari M; Nieminen, Jaakko O; Mutanen, Tuomas P; Stenroos, Matti; Ilmoniemi, Risto J

    Transcranial magnetic stimulation (TMS) allows focal, non-invasive stimulation of the cortex. A TMS pulse is inherently weakly coupled to the cortex; thus, magnetic stimulation requires both high current and high voltage to reach sufficient intensity. These requirements limit, for example, the maximum repetition rate and the maximum number of consecutive pulses with the same coil due to the rise of its temperature. Our aim was to develop methods to optimise, design, and manufacture energy-efficient TMS coils in realistic head geometry with an arbitrary overall coil shape. We derive a semi-analytical integration scheme for computing the magnetic field energy of an arbitrary surface current distribution, compute the electric field induced by this distribution with a boundary element method, and optimise a TMS coil for focal stimulation. Additionally, we introduce a method for manufacturing such a coil by using Litz wire and a coil former machined from polyvinyl chloride. We designed, manufactured, and validated an optimised TMS coil and applied it to brain stimulation. Our simulations indicate that this coil requires less than half the power of a commercial figure-of-eight coil, with a 41% reduction due to the optimised winding geometry and a partial contribution due to our thinner coil former and reduced conductor height. With the optimised coil, the resting motor threshold of abductor pollicis brevis was reached with the capacitor voltage below 600 V and peak current below 3000 A. The described method allows designing practical TMS coils that have considerably higher efficiency than conventional figure-of-eight coils. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Various Solution Methods, Accompanied by Dynamic Investigation, for the Same Problem as a Means for Enriching the Mathematical Toolbox

    Science.gov (United States)

    Oxman, Victor; Stupel, Moshe

    2018-01-01

    A geometrical task is presented with multiple solutions using different methods, in order to show the connection between various branches of mathematics and to highlight the importance of providing the students with an extensive 'mathematical toolbox'. Investigation of the property that appears in the task was carried out using a computerized tool.

  16. A molecular toolbox for rapid generation of viral vectors to up- or down-regulate in vivo neuronal gene expression

    Directory of Open Access Journals (Sweden)

    Melanie D. White

    2011-07-01

    Full Text Available. We introduce a molecular toolbox for manipulation of neuronal gene expression in vivo. The toolbox includes promoters, ion channels, optogenetic tools, fluorescent proteins and intronic artificial microRNAs. The components are easily assembled into adeno-associated virus (AAV) or lentivirus vectors using recombination cloning. We demonstrate assembly of toolbox components into lentivirus and AAV vectors and use these vectors for in vivo expression of inwardly rectifying potassium channels (Kir2.1, Kir3.1 and Kir3.2) and an artificial microRNA targeted against the ion channel HCN1 (HCN1 miR). We show that AAV assembled to express HCN1 miR produces efficacious and specific in vivo knockdown of HCN1 channels. Comparison of in vivo viral transduction using HCN1 miR with mice containing a germ-line deletion of HCN1 reveals similar physiological phenotypes in cerebellar Purkinje cells. The easy assembly and re-usability of the toolbox components, together with the ability to up- or down-regulate neuronal gene expression in vivo, may be useful for applications in many areas of neuroscience.

  17. User Guide for Compressible Flow Toolbox Version 2.1 for Use With MATLAB®; Version 7

    Science.gov (United States)

    Melcher, Kevin J.

    2006-01-01

    This report provides a user guide for the Compressible Flow Toolbox, a collection of algorithms that solve almost 300 linear and nonlinear classical compressible flow relations. The algorithms, implemented in the popular MATLAB programming language, are useful for analysis of one-dimensional steady flow with constant entropy, friction, heat transfer, or shock discontinuities. The solutions do not include any gas dissociative effects. The toolbox also contains functions for comparing and validating the equation-solving algorithms against solutions previously published in the open literature. The classical equations solved by the Compressible Flow Toolbox are: isentropic-flow equations, Fanno flow equations (pertaining to flow of an ideal gas in a pipe with friction), Rayleigh flow equations (pertaining to frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section), normal-shock equations, oblique-shock equations, and Prandtl-Meyer expansion equations. At the time this report was published, the Compressible Flow Toolbox was available without cost from the NASA Software Repository.
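
    As a flavour of the relations such a toolbox implements, the isentropic stagnation-to-static ratios for a calorically perfect gas take only a few lines (a Python sketch rather than the toolbox's MATLAB; `gamma` is the ratio of specific heats, 1.4 for air):

    ```python
    def isentropic_ratios(mach, gamma=1.4):
        """Stagnation-to-static ratios for isentropic flow of a calorically perfect gas."""
        t_ratio = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2    # T0/T
        p_ratio = t_ratio ** (gamma / (gamma - 1.0))       # p0/p
        rho_ratio = t_ratio ** (1.0 / (gamma - 1.0))       # rho0/rho
        return t_ratio, p_ratio, rho_ratio

    t, p, rho = isentropic_ratios(1.0)  # at Mach 1 in air, T0/T = 1.2 and p0/p ≈ 1.893
    ```

    The Fanno, Rayleigh, and shock relations in the toolbox follow the same pattern: closed-form ratios in the Mach number, with nonlinear inversions (ratio back to Mach) handled numerically.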

  18. TopoToolbox: using sensor topography to calculate psychologically meaningful measures from event-related EEG/MEG.

    Science.gov (United States)

    Tian, Xing; Poeppel, David; Huber, David E

    2011-01-01

    The open-source toolbox "TopoToolbox" is a suite of functions that use sensor topography to calculate psychologically meaningful measures (similarity, magnitude, and timing) from multisensor event-related EEG and MEG data. Using a GUI and data visualization, TopoToolbox can be used to calculate and test the topographic similarity between different conditions (Tian and Huber, 2008). This topographic similarity indicates whether different conditions involve a different distribution of underlying neural sources. Furthermore, this similarity calculation can be applied at different time points to discover when a response pattern emerges (Tian and Poeppel, 2010). Because the topographic patterns are obtained separately for each individual, these patterns are used to produce reliable measures of response magnitude that can be compared across individuals using conventional statistics (Davelaar et al., submitted; Huber et al., 2008). TopoToolbox can be freely downloaded. It runs under MATLAB (The MathWorks, Inc.) and supports user-defined data structures as well as standard EEG/MEG data import using EEGLAB (Delorme and Makeig, 2004).
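
    The core idea of comparing sensor topographies can be illustrated with a plain cosine (angle) similarity between two sensor patterns. This is a simplified stand-in for the toolbox's actual angle test (Tian and Huber, 2008), not its implementation:

    ```python
    import math

    def topographic_similarity(pattern_a, pattern_b):
        # Cosine of the angle between two sensor topographies: 1.0 means the
        # same spatial pattern regardless of overall response magnitude.
        dot = sum(a * b for a, b in zip(pattern_a, pattern_b))
        norm_a = math.sqrt(sum(a * a for a in pattern_a))
        norm_b = math.sqrt(sum(b * b for b in pattern_b))
        return dot / (norm_a * norm_b)

    # A scaled copy of a pattern has the same topography, so similarity is 1.0
    same_shape = topographic_similarity([1.0, -2.0, 0.5], [2.0, -4.0, 1.0])
    ```

    Because the measure is magnitude-invariant, two conditions that engage the same sources at different strengths still score near 1.0, which is exactly why magnitude is reported as a separate measure.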

  19. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    Science.gov (United States)

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
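
    A toy version of event-interval detection — flagging stretches where a signal departs from a baseline — can be sketched as follows. This is a fixed threshold rule for illustration only; DETECT itself trains a model on user-labelled event classes rather than thresholding:

    ```python
    def detect_events(signal, baseline_mean, baseline_std, k=3.0, min_len=2):
        """Return (start, end) index intervals where |signal - baseline| > k*std
        for at least min_len consecutive samples."""
        events, start = [], None
        for i, x in enumerate(signal):
            outlier = abs(x - baseline_mean) > k * baseline_std
            if outlier and start is None:
                start = i                      # open a candidate event
            elif not outlier and start is not None:
                if i - start >= min_len:       # keep only sustained excursions
                    events.append((start, i))
                start = None
        if start is not None and len(signal) - start >= min_len:
            events.append((start, len(signal)))
        return events

    events = detect_events([0, 0, 10, 11, 12, 0, 0],
                           baseline_mean=0.0, baseline_std=1.0)  # → [(2, 5)]
    ```

    The `min_len` guard plays the role of rejecting single-sample glitches, which a trained classifier handles more gracefully by learning the temporal signature of each artifact class.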

  20. A preclinical cognitive test battery to parallel the National Institute of Health Toolbox in humans: bridging the translational gap.

    Science.gov (United States)

    Snigdha, Shikha; Milgram, Norton W; Willis, Sherry L; Albert, Marilyn; Weintraub, S; Fortin, Norbert J; Cotman, Carl W

    2013-07-01

    A major goal of animal research is to identify interventions that can promote successful aging and delay or reverse age-related cognitive decline in humans. Recent advances in standardizing cognitive assessment tools for humans have the potential to bring preclinical work closer to human research in aging and Alzheimer's disease. The National Institute of Health (NIH) has led an initiative to develop a comprehensive Toolbox for Neurologic Behavioral Function (NIH Toolbox) to evaluate cognitive, motor, sensory and emotional function for use in epidemiologic and clinical studies spanning 3 to 85 years of age. This paper aims to analyze the strengths and limitations of animal behavioral tests that can be used to parallel those in the NIH Toolbox. We conclude that there are several paradigms available to define a preclinical battery that parallels the NIH Toolbox. We also suggest areas in which new tests may benefit the development of a comprehensive preclinical test battery for assessment of cognitive function in animal models of aging and Alzheimer's disease. Copyright © 2013 Elsevier Inc. All rights reserved.