WorldWideScience

Sample records for optimisation study big

  1. Profile control studies for JET optimised shear regime

    Energy Technology Data Exchange (ETDEWEB)

    Litaudon, X.; Becoulet, A.; Eriksson, L.G.; Fuchs, V.; Huysmans, G.; How, J.; Moreau, D.; Rochard, F.; Tresset, G.; Zwingmann, W. [Association Euratom-CEA, CEA/Cadarache, Dept. de Recherches sur la Fusion Controlee, DRFC, 13 - Saint-Paul-lez-Durance (France); Bayetti, P.; Joffrin, E.; Maget, P.; Mayorat, M.L.; Mazon, D.; Sarazin, Y. [JET Abingdon, Oxfordshire (United Kingdom); Voitsekhovitch, I. [Universite de Provence, LPIIM, Aix-Marseille 1, 13 (France)

    2000-03-01

    This report summarises the profile control studies, i.e. the preparation and analysis of JET Optimised Shear plasmas, carried out during 1999 within the framework of the Task Agreement (RF/CEA/02) between JET and the Association Euratom-CEA/Cadarache. We report on our participation in the preparation of the JET Optimised Shear experiments, together with their comprehensive analysis and modelling. Emphasis is put on the various aspects of pressure profile control (core and edge pressure), together with detailed studies of current profile control by non-inductive means, with the prospect of achieving steady, high-performance Optimised Shear plasmas. (authors)

  2. Big data to optimise product strategy in electronic industry

    OpenAIRE

    Khan, Nawaz; Lakshmi Sabih, Vijay; Georgiadou, Elli; Repanovich, Angela

    2016-01-01

    This research identifies the success factors for new product development and competitive advantage as well as argues how big data can expedite the process of launching a new product initiative. By combining the research findings and the patterns of background theories, an inquisitive framework for the new product development and competitive advantage is proposed. This model and framework is a prototype, which with the aid of scenario recommends the parsimonious and an unified way to elucidate...

  3. Optimising parallel R correlation matrix calculations on gene expression data using MapReduce.

    Science.gov (United States)

    Wang, Shicai; Pandis, Ioannis; Johnson, David; Emam, Ibrahim; Guitton, Florian; Oehmichen, Axel; Guo, Yike

    2014-11-05

    High-throughput molecular profiling data has been used to improve clinical decision making by stratifying subjects based on their molecular profiles. Unsupervised clustering algorithms can be used for stratification purposes. However, the current speed of the clustering algorithms cannot meet the requirement of large-scale molecular data due to poor performance of the correlation matrix calculation. With high-throughput sequencing technologies promising to produce even larger datasets per subject, we expect the performance of the state-of-the-art statistical algorithms to be further impacted unless efforts towards optimisation are carried out. MapReduce is a widely used high-performance parallel framework that can solve the problem. In this paper, we evaluate the current parallel modes for correlation calculation methods and introduce an efficient data distribution and parallel calculation algorithm based on MapReduce to optimise the correlation calculation. We studied the performance of our algorithm using two gene expression benchmarks. In the micro-benchmark, our implementation using MapReduce, based on the R package RHIPE, demonstrates a 3.26-5.83 fold increase compared to the default Snowfall and a 1.56-1.64 fold increase compared to the basic RHIPE in the Euclidean, Pearson and Spearman correlations. Though vanilla R and the optimised Snowfall outperform our optimised RHIPE in the micro-benchmark, they do not scale well with the macro-benchmark. In the macro-benchmark the optimised RHIPE performs 2.03-16.56 times faster than vanilla R. Benefiting from the 3.30-5.13 times faster data preparation, the optimised RHIPE performs 1.22-1.71 times faster than the optimised Snowfall. Both the optimised RHIPE and the optimised Snowfall successfully perform the Kendall correlation with the TCGA dataset within 7 hours, more than 30 times faster than the estimated vanilla R. The performance evaluation found that the new MapReduce algorithm and its ...
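
    The RHIPE- and Snowfall-based implementations above are not reproduced in this record; as a rough illustration of the underlying data-distribution idea, the following sketch splits a gene-expression matrix into row blocks and computes the Pearson correlation sub-matrices in parallel with Python's multiprocessing (all data and sizes here are synthetic, and numpy is assumed):

```python
import numpy as np
from multiprocessing import Pool

def _standardise(x):
    """Z-score each row (gene) using the population standard deviation."""
    return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

def _corr_block(args):
    """Pearson correlations of one block of genes against all genes."""
    block_z, full_z = args
    return block_z @ full_z.T / full_z.shape[1]

def parallel_corr(expr, n_workers=4):
    """expr: genes x samples matrix; returns the genes x genes correlation matrix."""
    z = _standardise(expr)
    chunks = np.array_split(z, n_workers, axis=0)
    with Pool(n_workers) as pool:
        blocks = pool.map(_corr_block, [(c, z) for c in chunks])
    return np.vstack(blocks)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    expr = rng.normal(size=(200, 50))            # 200 synthetic genes, 50 samples
    C = parallel_corr(expr)
    print(C.shape, np.allclose(C, np.corrcoef(expr)))
```

    Distributing whole row blocks keeps each worker's output a contiguous slab of the final matrix, which is essentially why block-wise data distribution scales better than dispatching individual gene pairs.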

  4. The optimisation study of tbp synthesis process by phosphoric acid

    International Nuclear Information System (INIS)

    Amedjkouh, A.; Attou, M.; Azzouz, A.; Zaoui, B.

    1995-07-01

    The present work deals with the optimisation of the TBP synthesis process using phosphoric acid. This route of synthesis is more advantageous than using POCl3 or P2O5 as phosphating agents, since the latter are toxic and dangerous for the environment. The optimisation study is based on a series of 16 experiments taking into account the range of variation of the following parameters: temperature, pressure, reagent mole ratio and promoter content. The yield calculation is based on the randomisation of an equation including all parameters. The resolution of this equation gave a 30% TBP molar ratio, a value in agreement with the experimental data

  5. Optimisation in radiotherapy II: Programmed and inversion optimisation algorithms

    International Nuclear Information System (INIS)

    Ebert, M.

    1997-01-01

    This is the second article in a three part examination of optimisation in radiotherapy. The previous article established the bases of optimisation in radiotherapy, and the formulation of the optimisation problem. This paper outlines several algorithms that have been used in radiotherapy for searching for the best irradiation strategy within the full set of possible strategies. Two principal classes of algorithm are considered - those associated with mathematical programming which employ specific search techniques, linear programming type searches or artificial intelligence - and those which seek to perform a numerical inversion of the optimisation problem, finishing with deterministic iterative inversion. (author)

  6. Optimisation Study on the Production of Anaerobic Digestate ...

    African Journals Online (AJOL)

    Organic fraction of municipal solid waste (OFMSW) is a rich substrate for biogas and compost production. Anaerobic Digestate compost (ADC) is an organic fertilizer produced from stabilized residuals of anaerobic digestion of OFMSW. This paper reports the result of studies carried out to optimise the production of ADC from ...

  7. Beam position optimisation for IMRT

    International Nuclear Information System (INIS)

    Holloway, L.; Hoban, P.

    2001-01-01

    The introduction of IMRT has not generally resulted in the use of optimised beam positions, because finding the global solution of the problem requires a time-consuming stochastic optimisation method. Although a deterministic method may not achieve the global minimum, it should achieve a superior dose distribution compared to no optimisation. This study aimed to develop and test such a method. The beam optimisation method developed relies on an iterative process to reach the desired number of beams from a large initial number of beams. The number of beams is reduced in a 'weeding-out' process based on the total fluence which each beam delivers. The process is gradual, with only three beams removed each time (following a small number of iterations), ensuring that the reduction in beams does not dramatically affect the fluence maps of those remaining. A comparison was made between the dose distributions achieved when the beam positions were optimised in this fashion and when the beam positions were evenly distributed. The method has been shown to work effectively and efficiently. A comparison of the dose distributions with optimised and non-optimised beam positions for 5 beams clearly shows an improvement in the dose delivered to the tumour and a reduction in the dose to the critical structure with beam position optimisation. A method for beam position optimisation for use in IMRT optimisations has been developed. This method, although not necessarily achieving the global minimum in beam position, still achieves a dramatic improvement compared with no beam position optimisation, and does so very efficiently. Copyright (2001) Australasian College of Physical Scientists and Engineers in Medicine
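
    The abstract's 'weeding-out' loop can be pictured with a small sketch (not the authors' code): a stand-in least-squares fit plays the role of the fluence optimisation, and the beams delivering the least total fluence are dropped, three at a time, until the desired number remains. The dose matrix and target below are hypothetical.

```python
import numpy as np

def optimise_weights(dose_matrix, target_dose):
    """Stand-in fluence optimisation: least-squares weights, clipped non-negative."""
    w = np.linalg.lstsq(dose_matrix, target_dose, rcond=None)[0]
    return np.clip(w, 0.0, None)

def weed_out_beams(dose_matrix, target_dose, n_final, drop_per_round=3):
    """Iteratively remove the beams delivering the least total fluence."""
    active = list(range(dose_matrix.shape[1]))
    while len(active) > n_final:
        w = optimise_weights(dose_matrix[:, active], target_dose)
        n_drop = min(drop_per_round, len(active) - n_final)
        worst = set(np.argsort(w)[:n_drop])          # lowest total fluence first
        active = [b for i, b in enumerate(active) if i not in worst]
    return active, optimise_weights(dose_matrix[:, active], target_dose)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    D = rng.random((120, 36))        # 120 dose points, 36 candidate beam angles
    beams, weights = weed_out_beams(D, np.ones(120), n_final=5)
    print(beams, weights.round(3))
```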

  8. Optimised Renormalisation Group Flows

    CERN Document Server

    Litim, Daniel F

    2001-01-01

    Exact renormalisation group (ERG) flows interpolate between a microscopic or classical theory and the corresponding macroscopic or quantum effective theory. For most problems of physical interest, the efficiency of the ERG is constrained due to unavoidable approximations. Approximate solutions of ERG flows depend spuriously on the regularisation scheme, which is determined by a regulator function. This is similar to the spurious dependence on the ultraviolet regularisation known from perturbative QCD. Providing good control over approximated ERG flows is at the root of reliable physical predictions. We explain why the convergence of approximate solutions towards the physical theory is optimised by appropriate choices of the regulator. We study specific optimised regulators for bosonic and fermionic fields and compare the optimised ERG flows with generic ones. This is done up to second order in the derivative expansion at both vanishing and non-vanishing temperature. An optimised flow for a 'proper-time ren...

  9. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    Science.gov (United States)

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  10. Big(ger) Data as Better Data in Open Distance Learning

    Directory of Open Access Journals (Sweden)

    Paul Prinsloo

    2015-02-01

    In the context of the hype, promise and perils of Big Data and the currently dominant paradigm of data-driven decision-making, it is important to critically engage with the potential of Big Data for higher education. We do not question the potential of Big Data, but we do raise a number of issues, and present a number of theses to be seriously considered in realising this potential. The University of South Africa (Unisa) is one of the mega ODL institutions in the world, with more than 360,000 students and a range of courses and programmes. Unisa already has access to a staggering amount of student data, hosted in disparate sources, and governed by different processes. As the university moves to mainstreaming online learning, the amount of and need for analyses of data are increasing, raising important questions regarding our assumptions, understanding, data sources, systems and processes. This article presents a descriptive case study of the current state of student data at Unisa, as well as explores the impact of existing data sources and analytic approaches. From the analysis it is clear that in order for big(ger) data to be better data, a number of issues need to be addressed. The article concludes by presenting a number of theses that should form the basis for the imperative to optimise the harvesting, analysis and use of student data.

  11. Transforming fragments into candidates: small becomes big in medicinal chemistry.

    Science.gov (United States)

    de Kloe, Gerdien E; Bailey, David; Leurs, Rob; de Esch, Iwan J P

    2009-07-01

    Fragment-based drug discovery (FBDD) represents a logical and efficient approach to lead discovery and optimisation. It can draw on structural, biophysical and biochemical data, incorporating a wide range of inputs, from precise mode-of-binding information on specific fragments to wider ranging pharmacophoric screening surveys using traditional HTS approaches. It is truly an enabling technology for the imaginative medicinal chemist. In this review, we analyse a representative set of 23 published FBDD studies that describe how low molecular weight fragments are being identified and efficiently transformed into higher molecular weight drug candidates. FBDD is now becoming warmly endorsed by industry as well as academia and the focus on small interacting molecules is making a big scientific impact.

  12. Study on the evolutionary optimisation of the topology of network control systems

    Science.gov (United States)

    Zhou, Zude; Chen, Benyuan; Wang, Hong; Fan, Zhun

    2010-08-01

    Computer networks have been very popular in enterprise applications. However, optimisation of network designs that allows networks to be used more efficiently in industrial environments and enterprise applications remains an interesting research topic. This article mainly discusses the topology optimisation theory and methods of the network control system based on switched Ethernet in an industrial context. Factors that affect the real-time performance of the industrial control network are presented in detail, and optimisation criteria with their internal relations are analysed. After the definition of performance parameters, normalised indices for the evaluation of the topology optimisation are proposed. The topology optimisation problem is formulated as a multi-objective optimisation problem and an evolutionary algorithm is applied to solve it. Special communication characteristics of the industrial control network are considered in the optimisation process. With respect to the design of the evolutionary algorithm, an improved arena algorithm is proposed for the construction of the non-dominated set of the population. In addition, for the evaluation of individuals, the integrated use of the dominance relation method and the objective function combination method is introduced to reduce the computational cost of the algorithm. Simulation tests show that the performance of the proposed algorithm is superior to that of other algorithms. The final solution greatly improves the following indices: traffic localisation, traffic balance and utilisation rate balance of switches. In addition, a new performance index with its estimation process is proposed.
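
    The paper's improved arena algorithm is not reproduced in this record; the textbook construction of the non-dominated set that such variants accelerate looks roughly as follows (objective values below are hypothetical, and all objectives are assumed to be minimised):

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a dominates b (all objectives minimised)."""
    return bool(np.all(a <= b) and np.any(a < b))

def non_dominated_set(objectives):
    """Indices of individuals that no other individual dominates."""
    front = []
    for i, fi in enumerate(objectives):
        if not any(dominates(fj, fi) for j, fj in enumerate(objectives) if j != i):
            front.append(i)
    return front

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Hypothetical objectives per candidate topology,
    # e.g. (mean end-to-end delay, switch utilisation imbalance).
    objs = rng.random((50, 2))
    print(non_dominated_set(objs))
```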

  13. Topology optimisation of natural convection problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Aage, Niels; Andreasen, Casper Schousboe

    2014-01-01

    This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equations coupled to the convection-diffusion equation through the Boussinesq approximation. In order to facilitate topology optimisation, the Brinkman approach is taken to penalise velocities inside the solid domain and the effective thermal conductivity is interpolated in order to accommodate differences in thermal conductivity of the solid and fluid phases. The governing equations are discretised using stabilised finite elements and topology optimisation is performed for two different problems using discrete adjoint sensitivity analysis. The study shows that topology optimisation is a viable approach...

  14. Pre-segmented 2-Step IMRT with subsequent direct machine parameter optimisation – a planning study

    International Nuclear Information System (INIS)

    Bratengeier, Klaus; Meyer, Jürgen; Flentje, Michael

    2008-01-01

    Modern intensity modulated radiotherapy (IMRT) mostly uses iterative optimisation methods. The integration of machine parameters into the optimisation process of step-and-shoot leaf positions has been shown to be successful. For IMRT segmentation algorithms based on the analysis of the geometrical structure of the planning target volumes (PTV) and the organs at risk (OAR), the potential of such procedures has not yet been fully explored. In this work, 2-Step IMRT was combined with subsequent direct machine parameter optimisation (DMPO, RaySearch Laboratories, Sweden) to investigate this potential. In a planning study DMPO on a commercial planning system was compared with manual primary 2-Step IMRT segment generation followed by DMPO optimisation. 15 clinical cases and the ESTRO Quasimodo phantom were employed. Both the same number of optimisation steps and the same set of objective values were used. The plans were compared with a clinical DMPO reference plan and a traditional IMRT plan based on fluence optimisation and subsequent segmentation. The composite objective value (the weighted sum of quadratic deviations of the objective values and the related points in the dose volume histogram) was used as a measure for the plan quality. Additionally, a more extended set of parameters was used for the breast cases to compare the plans. The plans with segments pre-defined with 2-Step IMRT were slightly superior to DMPO alone in the majority of cases. The composite objective value tended to be even lower for a smaller number of segments. The total number of monitor units was slightly higher than for the DMPO plans. Traditional IMRT fluence optimisation with subsequent segmentation could not compete. 2-Step IMRT segmentation is suitable as a starting point for further DMPO optimisation and, in general, results in less complex plans which are equal or superior to plans generated by DMPO alone.

  15. CFD optimisation of a stadium roof geometry: a qualitative study to improve the wind microenvironment

    Directory of Open Access Journals (Sweden)

    Sofotasiou Polytimi

    2017-01-01

    The complexity of the built environment requires the adoption of coupled techniques to predict the flow phenomena and provide optimum design solutions. In this study, coupled computational fluid dynamics (CFD) and response surface methodology (RSM) optimisation tools are employed to investigate the parameters that determine wind comfort in a two-dimensional stadium model, by optimising the roof geometry. The roof height, width and length are evaluated against the flow homogeneity at the spectator terraces and the playing field area, the roof flow rate and the average interior pressure. Based on non-parametric regression analysis, both symmetric and asymmetric configurations are considered for optimisation. The optimum design solutions revealed that it is achievable to provide an improved wind environment in both the playing field area and the spectator terraces, giving further insight into the interrelations of the parameters involved. Considering the limitations of conducting a two-dimensional study, the obtained results may beneficially be used as a basis for the optimisation of a complex three-dimensional stadium structure and thus become an important design guide for stadium structures.

  16. Cost optimisation studies of high power accelerators

    Energy Technology Data Exchange (ETDEWEB)

    McAdams, R.; Nightingale, M.P.S.; Godden, D. [AEA Technology, Oxon (United Kingdom)] [and others]

    1995-10-01

    Cost optimisation studies are carried out for an accelerator-based neutron source consisting of a series of linear accelerators. The characteristics of the lowest-cost design for a machine of given beam current and energy, such as its power and length, are found to depend on the lifetime envisaged for it. For a fixed neutron yield it is preferable to have a low-current, high-energy machine. The benefits of superconducting technology are also investigated. A Separated Orbit Cyclotron (SOC) has the potential to reduce capital and operating costs, and initial estimates for the transverse and longitudinal current limits of such machines are made.

  17. Advanced optimisation - coal fired power plant operations

    Energy Technology Data Exchange (ETDEWEB)

    Turney, D.M.; Mayes, I. [E.ON UK, Nottingham (United Kingdom)

    2005-03-01

    The purpose of this unit optimisation project is to develop an integrated approach to unit optimisation and an overall optimiser that is able to resolve any conflicts between the individual optimisers. The individual optimisers considered during this project are: the on-line thermal efficiency package, the GNOCIS boiler optimiser, the GNOCIS steam side optimiser, ESP optimisation, and an intelligent sootblowing system. 6 refs., 7 figs., 3 tabs.

  18. Dose optimisation in single plane interstitial brachytherapy

    DEFF Research Database (Denmark)

    Tanderup, Kari; Hellebust, Taran Paulsen; Honoré, Henriette Benedicte

    2006-01-01

    BACKGROUND AND PURPOSE: Brachytherapy dose distributions can be optimised by modulation of source dwell times. In this study dose optimisation in single planar interstitial implants was evaluated in order to quantify the potential benefit in patients. MATERIAL AND METHODS: In 14 patients, treated for recurrent rectal and cervical cancer, flexible catheters were sutured intra-operatively to the tumour bed in areas with compromised surgical margin. Both non-optimised, geometrically and graphically optimised CT-based dose plans were made. The overdose index ... on the regularity of the implant, such that the benefit of optimisation was larger for irregular implants. OI and HI correlated strongly with target volume, limiting the usability of these parameters for comparison of dose plans between patients. CONCLUSIONS: Dwell time optimisation significantly...
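
    As a generic illustration of dwell-time optimisation (not the geometrical or graphical optimisation of the planning system used in the study), one can solve for non-negative dwell times that bring the dose at sampled target points close to the prescription, given a dose-rate kernel; scipy's non-negative least squares is assumed and all geometry below is hypothetical:

```python
import numpy as np
from scipy.optimize import nnls

def optimise_dwell_times(kernel, prescribed_dose):
    """kernel[i, j]: dose at point i per unit dwell time at position j."""
    times, residual = nnls(kernel, prescribed_dose)
    return times, residual

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    n_points, n_dwell = 80, 20
    # Inverse-square-like fall-off for random point/dwell-position distances (cm).
    dist = rng.uniform(0.5, 5.0, size=(n_points, n_dwell))
    kernel = 1.0 / dist**2
    prescription = np.full(n_points, 10.0)       # e.g. 10 Gy at every target point
    t, res = optimise_dwell_times(kernel, prescription)
    print(t.round(2), round(res, 3))
```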

  19. Optimisation on processing parameters for minimising warpage on side arm using response surface methodology (RSM) and particle swarm optimisation (PSO)

    Science.gov (United States)

    Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Sazli, M.; Yahya, Z. R.

    2017-09-01

    This study presents the application of an optimisation method to reduce the warpage of a side arm part. Autodesk Moldflow Insight software was used to analyse the warpage. A design of experiments (DOE) for Response Surface Methodology (RSM) was constructed and, using the equation from RSM, Particle Swarm Optimisation (PSO) was applied. The optimisation yields processing parameters with minimum warpage. Mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as the variable parameters. Parameter selection was based on the most significant factors affecting warpage reported by previous researchers. The results show that warpage was improved by 28.16% for RSM and 28.17% for PSO; the additional improvement of PSO over RSM is only 0.01%. Thus, optimisation using RSM is already sufficient to give the best combination of parameters and an optimum warpage value for the side arm part. The most significant parameter affecting warpage is packing pressure.
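
    A hedged sketch of the RSM-plus-PSO workflow described above: fit a second-order response surface to hypothetical DOE warpage data, then minimise it with a basic particle swarm. Only two factors are used here for brevity, whereas the study varied five; all values are invented for illustration.

```python
import numpy as np

def quad_features(X):
    """Second-order (quadratic) model terms for two factors."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

def fit_rsm(X, y):
    """Least-squares fit of the response surface; returns a predictor function."""
    beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
    return lambda pts: quad_features(np.atleast_2d(pts)) @ beta

def pso_minimise(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(4)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), f(x)
    g = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = f(x)
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        g = pbest[np.argmin(pbest_val)]
    return g, f(g[None, :])[0]

if __name__ == "__main__":
    # Hypothetical DOE: (melt temperature degC, packing pressure MPa) -> warpage mm
    X = np.array([[220, 60], [220, 90], [260, 60], [260, 90], [240, 75],
                  [220, 75], [260, 75], [240, 60], [240, 90]], float)
    y = np.array([1.10, 0.82, 1.05, 0.95, 0.78, 0.92, 0.99, 0.88, 0.81])
    surface = fit_rsm(X, y)
    best_x, best_w = pso_minimise(surface, np.array([[220, 260], [60, 90]], float))
    print(best_x.round(1), round(best_w, 3))
```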

  20. (MBO) algorithm in multi-reservoir system optimisation

    African Journals Online (AJOL)

    A comparative study of marriage in honey bees optimisation (MBO) algorithm in ... A practical application of the marriage in honey bees optimisation (MBO) ... to those of other evolutionary algorithms, such as the genetic algorithm (GA), ant ...

  1. The Study of “big data” to support internal business strategists

    Science.gov (United States)

    Ge, Mei

    2018-01-01

    How is big data different from previous data analysis systems? The primary purpose behind traditional small data analytics, which all managers are more or less familiar with, is to support internal business strategies. But big data also offers a promising new dimension: discovering new opportunities to offer customers high-value products and services. This study focuses on introducing some of the strategies that big data can support. Business decisions using big data can also involve several areas of analytics, including customer satisfaction, customer journeys, supply chains, risk management, competitive intelligence, pricing, and discovery and experimentation, or facilitating big data discovery.

  2. Combining simulation and multi-objective optimisation for equipment quantity optimisation in container terminals

    OpenAIRE

    Lin, Zhougeng

    2013-01-01

    This thesis proposes a combination framework to integrate simulation and multi-objective optimisation (MOO) for container terminal equipment optimisation. It addresses how the strengths of simulation and multi-objective optimisation can be integrated to find high quality solutions for multiple objectives with low computational cost. Three structures for the combination framework are proposed respectively: pre-MOO structure, integrated MOO structure and post-MOO structure. The applications of ...

  3. Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    The problem in optimising the laser cutting process is outlined. Basic optimisation criteria and principles for adapting an optimisation method, the simplex method, are presented. The results of implementing a response function in the optimisation are discussed with respect to the quality as well...

  4. A comparison of forward planning and optimised inverse planning

    International Nuclear Information System (INIS)

    Oldham, Mark; Neal, Anthony; Webb, Steve

    1995-01-01

    A radiotherapy treatment plan optimisation algorithm has been applied to 48 prostate plans and the results compared with those of an experienced human planner. Twelve patients were used in the study, and 3, 4, 6 and 8 field plans (with standard coplanar beam angles for each plan type) were optimised by both the human planner and the optimisation algorithm. The human planner 'optimised' the plan by conventional forward planning techniques. The optimisation algorithm was based on fast simulated annealing. 'Importance factors' assigned to different regions of the patient provide a method for controlling the algorithm, and it was found that the same values gave good results for almost all plans. The plans were compared on the basis of dose statistics and normal-tissue-complication-probability (NTCP) and tumour-control-probability (TCP). The results show that the optimisation algorithm yielded results that were at least as good as the human planner for all plan types, and on the whole slightly better. A study of the beam weights chosen by the optimisation algorithm and the planner will be presented. The optimisation algorithm showed greater variation, in response to individual patient geometry. For simple (e.g. 3 field) plans it was found to consistently achieve slightly higher TCP and lower NTCP values. For more complicated (e.g. 8 field) plans the optimisation also achieved slightly better results, generally with fewer beams. The optimisation time was always ≤5 minutes, up to a factor of 20 faster than the human planner
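
    The fast-simulated-annealing optimiser itself is not included in this record; a toy version of the idea, with regional 'importance factors' weighting a quadratic dose cost over beam weights, could look like the following (synthetic data, simple linear cooling schedule):

```python
import numpy as np

def cost(weights, dose_matrix, prescription, importance):
    """Importance-weighted squared deviation of delivered dose from the prescription."""
    dose = dose_matrix @ weights
    return float(np.sum(importance * (dose - prescription) ** 2))

def anneal_beam_weights(dose_matrix, prescription, importance,
                        iters=5000, t0=1.0, seed=5):
    rng = np.random.default_rng(seed)
    w = np.ones(dose_matrix.shape[1])
    cur = cost(w, dose_matrix, prescription, importance)
    best_w, best_c = w.copy(), cur
    for k in range(iters):
        temp = max(t0 * (1.0 - k / iters), 1e-9)          # simple linear cooling
        trial = np.clip(w + rng.normal(0.0, 0.05, w.size), 0.0, None)
        c = cost(trial, dose_matrix, prescription, importance)
        if c < cur or rng.random() < np.exp(-(c - cur) / temp):
            w, cur = trial, c                             # accept (possibly worse) move
            if c < best_c:
                best_w, best_c = trial.copy(), c
    return best_w, best_c

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    D = rng.random((60, 4))                               # 60 dose points, 4 beams
    target = np.where(np.arange(60) < 30, 1.0, 0.2)       # PTV, then normal tissue
    imp = np.where(np.arange(60) < 30, 1.0, 0.3)          # importance factors
    w, c = anneal_beam_weights(D, target, imp)
    print(w.round(3), round(c, 4))
```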

  5. Optimisation Study on the Production of Anaerobic Digestate ...

    African Journals Online (AJOL)

    DR. AMIN

    optimise the production of ADC from organic fractions of domestic wastes and the effects of ADC amendments on soil ... (22%), cooked meat (9%), lettuce (11%), carrots (3%), potato (44%) ... seed was obtained from a mesophilic anaerobic ...

  6. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system with improved healthcare outcomes. However, more recently, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and on patient and clinical care in particular. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  7. Optimisation of Investment Resources at Small Enterprises

    Directory of Open Access Journals (Sweden)

    Shvets Iryna B.

    2014-03-01

    The goal of the article is to study the process of optimisation of the structure of investment resources and to develop criteria and stages of optimisation of the volumes of investment resources for small enterprises by type of economic activity. The article characterises the process of transformation of investment resources into assets and liabilities of the balances of small enterprises and calculates the structure of sources of formation of investment resources at small enterprises in Ukraine by type of economic activity in 2011. On the basis of the conducted analysis of the structure of investment resources of small enterprises, the article forms the main groups of optimisation criteria in the context of individual small enterprises by type of economic activity. The article offers an algorithm and step-by-step scheme of optimisation of investment resources at small enterprises in the form of a multi-stage process of management of investment resources in the context of increasing their mobility and the rate of transformation of existing resources into investments. The prospect of further studies in this direction is the development of a structural and logic scheme of optimisation of the volumes of investment resources at small enterprises.

  8. Parametric studies and optimisation of pumped thermal electricity storage

    International Nuclear Information System (INIS)

    McTigue, Joshua D.; White, Alexander J.; Markides, Christos N.

    2015-01-01

    Highlights: • PTES is modelled by cycle analysis and a Schumann-style model of the thermal stores. • Optimised trade-off surfaces show a flat efficiency vs. energy density profile. • Overall roundtrip efficiencies of around 70% are not inconceivable. - Abstract: Several of the emerging technologies for electricity storage are based on some form of thermal energy storage (TES). Examples include liquid air energy storage, pumped heat energy storage and, at least in part, advanced adiabatic compressed air energy storage. Compared to other large-scale storage methods, TES benefits from relatively high energy densities, which should translate into a low cost per MW h of storage capacity and a small installation footprint. TES is also free from the geographic constraints that apply to hydro storage schemes. TES concepts for electricity storage rely on either a heat pump or refrigeration cycle during the charging phase to create a hot or a cold storage space (the thermal stores), or in some cases both. During discharge, the thermal stores are depleted by reversing the cycle such that it acts as a heat engine. The present paper is concerned with a form of TES that has both hot and cold packed-bed thermal stores, and for which the heat pump and heat engine are based on a reciprocating Joule cycle, with argon as the working fluid. A thermodynamic analysis is presented based on traditional cycle calculations coupled with a Schumann-style model of the packed beds. Particular attention is paid to the various loss-generating mechanisms and their effect on roundtrip efficiency and storage density. A parametric study is first presented that examines the sensitivity of results to assumed values of the various loss factors and demonstrates the rather complex influence of the numerous design variables. Results of an optimisation study are then given in the form of trade-off surfaces for roundtrip efficiency, energy density and power density. The optimised designs show a ...

  9. Simulation optimisation

    International Nuclear Information System (INIS)

    Anon

    2010-01-01

    Over the past decade there has been a significant advance in flotation circuit optimisation through performance benchmarking using metallurgical modelling and steady-state computer simulation. This benchmarking includes traditional measures, such as grade and recovery, as well as new flotation measures, such as ore floatability, bubble surface area flux and froth recovery. To further this optimisation, Outotec has released its HSC Chemistry software with simulation modules. The flotation model developed by the AMIRA P9 Project, of which Outotec is a sponsor, is regarded by industry as the most suitable flotation model to use for circuit optimisation. This model incorporates ore floatability with flotation cell pulp and froth parameters, residence time, entrainment and water recovery. Outotec's HSC Sim enables you to simulate mineral processes in different levels, from comminution circuits with sizes and no composition, through to flotation processes with minerals by size by floatability components, to full processes with true particles with MLA data.

  10. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns for your bottom line. Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni...

  11. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  12. A methodological approach to the design of optimising control strategies for sewer systems

    DEFF Research Database (Denmark)

    Mollerup, Ane Loft; Mikkelsen, Peter Steen; Sin, Gürkan

    2016-01-01

    This study focuses on designing an optimisation-based control for a sewer system in a methodological way and linking it to a regulatory control. Optimisation-based design is found to depend on the proper choice of a model, the formulation of the objective function and the tuning of the optimisation parameters. Accordingly, two novel optimisation configurations are developed, where the optimisation either acts on the actuators or acts on the regulatory control layer. These two optimisation designs are evaluated on a sub-catchment of the sewer system in Copenhagen, and found to perform better than the existing...
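
    As an illustration of the first configuration, where the optimisation acts directly on the actuators (this is not the study's model or objective), a receding-horizon sketch can choose pumping rates for a single storage basin so as to trade predicted overflow against pumping effort; all quantities below are hypothetical and scipy is assumed:

```python
import numpy as np
from scipy.optimize import differential_evolution

def predict_overflow(pump_rates, volume, inflow, capacity):
    """Simple mass balance over the horizon; returns total predicted overflow."""
    overflow = 0.0
    for q_in, q_pump in zip(inflow, pump_rates):
        volume = max(volume + q_in - q_pump, 0.0)
        if volume > capacity:
            overflow += volume - capacity
            volume = capacity
    return overflow

def optimise_actuator(volume, inflow_forecast, capacity, q_max, w_energy=0.01):
    """Return the pumping rate to apply now (first move of the optimised horizon)."""
    def objective(q):
        return (predict_overflow(q, volume, inflow_forecast, capacity)
                + w_energy * np.sum(q))                  # overflow dominates the cost
    res = differential_evolution(objective,
                                 bounds=[(0.0, q_max)] * len(inflow_forecast),
                                 seed=7, tol=1e-6)
    return res.x[0]

if __name__ == "__main__":
    forecast = np.array([4.0, 6.0, 8.0, 5.0, 2.0])       # hypothetical inflow, m3/min
    print(round(optimise_actuator(volume=20.0, inflow_forecast=forecast,
                                  capacity=50.0, q_max=6.0), 2))
```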

  13. A study of certain Monte Carlo search and optimisation methods

    International Nuclear Information System (INIS)

    Budd, C.

    1984-11-01

    Studies are described which might lead to the development of a search and optimisation facility for the Monte Carlo criticality code MONK. The facility envisaged could be used to maximise a function of k-effective with respect to certain parameters of the system or, alternatively, to find the system (in a given range of systems) for which that function takes a given value. (UK)

  14. A supportive architecture for CFD-based design optimisation

    Science.gov (United States)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

    Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has caused a rise in their applications in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. It is desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing works has found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing techniques and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or at the data level to fully utilise the capabilities of different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided, and the results show that the proposed architecture ...
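
    The architecture itself is not reproduced here; the integration idea can be sketched generically by hiding the CFD toolset behind a single evaluation call and letting the optimisation loop farm candidate designs out to parallel workers. In the sketch below, run_cfd() is a stand-in analytic function and the search is a plain random search, not the paper's intelligent algorithms:

```python
import random
from concurrent.futures import ProcessPoolExecutor

def run_cfd(design):
    """Stand-in for a code- or data-level call into an external CFD solver."""
    length, sweep = design
    return (length - 8.0) ** 2 + 0.1 * (sweep - 30.0) ** 2   # pseudo drag metric

def evaluate_population(population, workers=4):
    """Dispatch candidate designs to parallel workers and collect their objectives."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_cfd, population))

def random_search(bounds, generations=20, pop_size=8, seed=8):
    rng = random.Random(seed)
    best, best_val = None, float("inf")
    for _ in range(generations):
        pop = [tuple(rng.uniform(lo, hi) for lo, hi in bounds) for _ in range(pop_size)]
        for design, val in zip(pop, evaluate_population(pop)):
            if val < best_val:
                best, best_val = design, val
    return best, best_val

if __name__ == "__main__":
    print(random_search(bounds=[(5.0, 12.0), (10.0, 60.0)]))
```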

  15. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  16. Power supply of Eurotunnel. Optimisation based on traffic and simulation studies

    Energy Technology Data Exchange (ETDEWEB)

    Marie, Stephane [SNCF, Direction de l' Ingenierie, Saint-Denis (France). Dept. des Installations Fixes de Traction Electrique; Dupont, Jean-Pierre; Findinier, Bertrand; Maquaire, Christian [Eurotunnel, Coquelles (France)

    2010-12-15

    In order to reduce electrical power costs and also to cope with the significant traffic increase, a new study was carried out on feeding the tunnel section from the French power station, thus improving and reinforcing the existing network. Based on a design study established by the SNCF engineering department, EUROTUNNEL chose a new electrical scheme to cope with the traffic increase and optimise investments. (orig.)

  17. Computer Based Optimisation Routines

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    In this paper the need for optimisation methods for the laser cutting process has been identified as three different situations. Demands on the optimisation methods for these situations are presented, and one method for each situation is suggested. The adaptation and implementation of the methods...

  18. Optimal Optimisation in Chemometrics

    NARCIS (Netherlands)

    Hageman, J.A.

    2004-01-01

    The use of global optimisation methods is not straightforward, especially for the more difficult optimisation problems. Solutions have to be found for items such as the evaluation function, representation, step function and meta-parameters, before any useful results can be obtained. This thesis aims ...

  19. Global fluctuation spectra in big-crunch-big-bang string vacua

    International Nuclear Information System (INIS)

    Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study big-crunch-big-bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a big crunch and a big bang cosmology, as well as additional 'whisker' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the big crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function Δ, which is momentum and time dependent. We compute Δ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to 'entanglement' entropy in the big bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that Δ→1 and, hence, the fluctuation spectrum is unaltered by the big-crunch-big-bang singularity. We comment on, but do not attempt to resolve, subtleties related to gravitational back reaction and light winding modes when interactions are taken into account

  20. Enhancing organization and maintenance of big data with Apache Solr in IBM WebSphere Commerce deployments

    OpenAIRE

    Grigel, Rudolf

    2015-01-01

    The main objective of this thesis was to enhance the organization and maintenance of big data with Apache Solr in IBM WebSphere Commerce deployments. This objective can be split into several subtasks: reorganization of data, fast and optimised exporting and importing, efficient update and cleanup operations. E-Commerce is a fast growing and frequently changing environment. There is a constant flow of data that is rapidly growing larger and larger every day which is becoming an ...

  1. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  2. Risk based test interval and maintenance optimisation - Application and uses

    International Nuclear Information System (INIS)

    Sparre, E.

    1999-10-01

    The project is part of an IAEA Co-ordinated Research Project (CRP) on 'Development of Methodologies for Optimisation of Surveillance Testing and Maintenance of Safety Related Equipment at NPPs'. The purpose of the project is to investigate the sensitivity of the results obtained when performing risk-based optimisation of the technical specifications. Previous projects have shown that complete LPSA models can be created and that these models allow optimisation of technical specifications. However, these optimisations did not include any in-depth check of the sensitivity of the results with regard to methods, model completeness, etc. Four different test intervals have been investigated in this study. Aside from an original, nominal optimisation, a set of sensitivity analyses has been performed and the results from these analyses have been compared to the original optimisation. The analyses indicate that the result of an optimisation is rather stable. However, it is not possible to draw any firm conclusions without performing a number of sensitivity analyses. Significant differences in the optimisation result were discovered when analysing an alternative configuration, and deterministic uncertainties also seem to strongly affect the result of an optimisation. The sensitivity to failure data uncertainties is important to investigate in detail, since the methodology is based on the assumption that the unavailability of a component depends on the length of the test interval

  3. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  4. Multi-Optimisation Consensus Clustering

    Science.gov (United States)

    Li, Jian; Swift, Stephen; Liu, Xiaohui

    Ensemble Clustering has been developed to provide an alternative way of obtaining more stable and accurate clustering results. It aims to avoid the biases of individual clustering algorithms. However, it is still a challenge to develop an efficient and robust method for Ensemble Clustering. Based on an existing ensemble clustering method, Consensus Clustering (CC), this paper introduces an advanced Consensus Clustering algorithm called Multi-Optimisation Consensus Clustering (MOCC), which utilises an optimised Agreement Separation criterion and a Multi-Optimisation framework to improve the performance of CC. Fifteen different data sets are used for evaluating the performance of MOCC. The results reveal that MOCC can generate more accurate clustering results than the original CC algorithm.
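
    MOCC itself is not reproduced in this record; its basic consensus-clustering building block, accumulating a co-association matrix over repeated base clusterings and then cutting a hierarchical clustering of that matrix, can be sketched as follows (synthetic two-cluster data, scipy assumed):

```python
import numpy as np
from scipy.cluster.vq import kmeans2
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def consensus_cluster(data, k, n_runs=20):
    """Co-association matrix over repeated k-means runs, cut by average-linkage clustering."""
    n = data.shape[0]
    coassoc = np.zeros((n, n))
    for _ in range(n_runs):
        _, labels = kmeans2(data, k, minit="++")
        coassoc += (labels[:, None] == labels[None, :])
    coassoc /= n_runs
    dist = 1.0 - coassoc                    # points that often co-cluster are 'close'
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="average")
    return fcluster(Z, t=k, criterion="maxclust")

if __name__ == "__main__":
    rng = np.random.default_rng(10)
    data = np.vstack([rng.normal(0.0, 0.3, (30, 2)), rng.normal(2.0, 0.3, (30, 2))])
    print(consensus_cluster(data, k=2))
```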

  5. Particle swarm optimisation classical and quantum perspectives

    CERN Document Server

    Sun, Jun; Wu, Xiao-Jun

    2016-01-01

    Contents: Introduction; Optimisation Problems and Optimisation Methods; Random Search Techniques; Metaheuristic Methods; Swarm Intelligence; Particle Swarm Optimisation; Overview; Motivations; PSO Algorithm: Basic Concepts and the Procedure; Paradigm: How to Use PSO to Solve Optimisation Problems; Some Harder Examples; Some Variants of Particle Swarm Optimisation; Why Does the PSO Algorithm Need to Be Improved?; Inertia and Constriction-Acceleration Techniques for PSO; Local Best Model; Probabilistic Algorithms; Other Variants of PSO; Quantum-Behaved Particle Swarm Optimisation; Overview; Motivation: From Classical Dynamics to Quantum Mechanics; Quantum Model: Fundamentals of QPSO; QPSO Algorithm; Some Essential Applications; Some Variants of QPSO; Summary; Advanced Topics; Behaviour Analysis of Individual Particles; Convergence Analysis of the Algorithm; Time Complexity and Rate of Convergence; Parameter Selection and Performance; Summary; Industrial Applications; Inverse Problems for Partial Differential Equations; Inverse Problems for Non-Linear Dynamical Systems; Optimal De...

  6. Warpage optimisation on the moulded part with straight-drilled and conformal cooling channels using response surface methodology (RSM) and glowworm swarm optimisation (GSO)

    Science.gov (United States)

    Hazwan, M. H. M.; Shayfull, Z.; Sharif, S.; Nasir, S. M.; Zainal, N.

    2017-09-01

    In the injection moulding process, quality and productivity are notably important and must be controlled for each product type produced. Quality is measured as the extent of warpage of the moulded parts, while productivity is measured as the duration of the moulding cycle time. To control quality, many researchers have introduced various optimisation approaches, which have been proven to enhance the quality of the moulded parts produced. In order to improve the productivity of the injection moulding process, some researchers have proposed the application of conformal cooling channels, which have been proven to reduce the duration of the moulding cycle time. Therefore, this paper presents an application of an alternative optimisation approach, Response Surface Methodology (RSM) with Glowworm Swarm Optimisation (GSO), on a moulded part with straight-drilled and conformal cooling channel moulds. This study examined the warpage of the moulded parts before and after the optimisation work for both cooling channel types. A front panel housing was selected as a specimen and the performance of the proposed optimisation approach was analysed on the conventional straight-drilled cooling channels compared to the Milled Groove Square Shape (MGSS) conformal cooling channels by simulation analysis using Autodesk Moldflow Insight (AMI) 2013. Based on the results, melt temperature is the most significant factor contributing to warpage, and warpage was improved by 39.1% after optimisation for the straight-drilled cooling channels; cooling time is the most significant factor contributing to warpage, and warpage was improved by 38.7% after optimisation for the MGSS conformal cooling channels. In addition, the findings show that applying the optimisation work to the conformal cooling channels offers better quality and productivity of the moulded part produced.

  7. Evolutionary programming for neutron instrument optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Bentley, Phillip M. [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany)]. E-mail: phillip.bentley@hmi.de; Pappas, Catherine [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Habicht, Klaus [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Lelievre-Berna, Eddy [Institut Laue-Langevin, 6 rue Jules Horowitz, BP 156, 38042 Grenoble Cedex 9 (France)

    2006-11-15

    Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development, and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN, at the HMI. We discuss the potential of the GA which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.
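
    A canonical genetic algorithm of the kind described can be sketched as below; the fitness call is a stand-in for where a McStas/Vitess virtual-instrument run and its figure of merit would go, and the parameter names are hypothetical:

```python
import random

def fitness(params):
    """Stand-in figure of merit (e.g. a simulated flux/resolution trade-off)."""
    guide_m, curvature = params
    return -(guide_m - 3.5) ** 2 - 10.0 * (curvature - 0.02) ** 2

def genetic_optimise(bounds, pop_size=30, generations=60, mutation=0.1, seed=11):
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]                    # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]      # arithmetic crossover
            child = [min(max(g + rng.gauss(0, mutation * (hi - lo)), lo), hi)
                     for g, (lo, hi) in zip(child, bounds)]  # Gaussian mutation, clipped
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    print(genetic_optimise(bounds=[(1.0, 6.0), (0.0, 0.05)]))
```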

  8. Evolutionary programming for neutron instrument optimisation

    International Nuclear Information System (INIS)

    Bentley, Phillip M.; Pappas, Catherine; Habicht, Klaus; Lelievre-Berna, Eddy

    2006-01-01

    Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development, and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN, at the HMI. We discuss the potential of the GA which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations

  9. Topology optimised wavelength dependent splitters

    DEFF Research Database (Denmark)

    Hede, K. K.; Burgos Leon, J.; Frandsen, Lars Hagedorn

    A photonic crystal wavelength dependent splitter has been constructed by utilising topology optimisation [1]. The splitter has been fabricated in a silicon-on-insulator material (Fig. 1). The topology optimised wavelength dependent splitter demonstrates promising 3D FDTD simulation results. This complex photonic crystal structure is very sensitive to small fabrication variations from the expected topology optimised design. A wavelength dependent splitter is an important basic building block for high-performance nanophotonic circuits. [1] J. S. Jensen and O. Sigmund, Appl. Phys. Lett. 84, 2022...

  10. Optimising of Steel Fiber Reinforced Concrete Mix Design | Beddar ...

    African Journals Online (AJOL)

    Optimising of Steel Fiber Reinforced Concrete Mix Design. ... as a result of the loss of mixture workability that will be translated into a difficult concrete casting in site. ... An experimental study of an optimisation method of fibres in reinforced ...

  11. A pilot investigation to optimise methods for a future satiety preload study

    OpenAIRE

    Hobden, Mark R.; Guérin-Deremaux, Laetitia; Commane, Daniel M.; Rowland, Ian; Gibson, Glenn R.; Kennedy, Orla B.

    2017-01-01

    Background Preload studies are used to investigate the satiating effects of foods and food ingredients. However, the design of preload studies is complex, with many methodological considerations influencing appetite responses. The aim of this pilot investigation was to determine acceptability, and optimise methods, for a future satiety preload study. Specifically, we investigated the effects of altering (i) energy intake at a standardised breakfast (gender-specific or non-gender specific), an...

  12. Telecom Big Data for Urban Transport Analysis - a Case Study of Split-Dalmatia County in Croatia

    Science.gov (United States)

    Baučić, M.; Jajac, N.; Bućan, M.

    2017-09-01

    Today, big data has become widely available, and new technologies are being developed for big data storage architectures and big data analytics. An ongoing challenge is how to incorporate big data into GIS applications supporting the various domains. The International Transport Forum explains how the arrival of big data and real-time data, together with new data processing algorithms, leads to new insights and operational improvements in transport. Based on the telecom customer data, the Study of Tourist Movement and Traffic in Split-Dalmatia County in Croatia is carried out as a part of the "IPA Adriatic CBC//N.0086/INTERMODAL" project. This paper briefly explains the big data used in the study and the results of the study. Furthermore, this paper investigates the main considerations when using telecom customer big data: data privacy and data quality. The paper concludes with GIS visualisation and proposes the further use of big data used in the study.

  13. Big Data technology in traffic: A case study of automatic counters

    Directory of Open Access Journals (Sweden)

    Janković Slađana R.

    2016-01-01

    Full Text Available Modern information and communication technologies, together with intelligent devices, provide a continuous inflow of large amounts of data that are used by traffic and transport systems. Collecting traffic data does not represent a challenge nowadays, but issues remain in relation to storing and processing increasing amounts of data. In this paper we have investigated the possibilities of using Big Data technology to store and process data in the transport domain. The term Big Data refers to the large volume, velocity and variety of information resources, far beyond the capabilities of commonly used software for data storage, processing and management. In our case study, the Apache™ Hadoop® Big Data platform was used for processing data collected from 10 automatic traffic counters set up in Novi Sad and its surroundings. Indicators of traffic load calculated on the Big Data platform were presented using tables and graphs in the Microsoft Office Excel tool. The visualization and geolocation of the obtained indicators were performed using Microsoft Business Intelligence (BI) tools such as Excel Power View and Excel Power Map. This case study showed that Big Data technologies combined with BI tools can be used as reliable support in monitoring traffic management systems.
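    The entry above reports aggregating automatic traffic-counter records on Hadoop. As a loose illustration of the general pattern only (not the study's actual pipeline), a Hadoop Streaming job can be written as a small Python mapper/reducer pair; the input format assumed here, one "counter_id,timestamp,vehicle_count" record per line, is hypothetical.

        #!/usr/bin/env python3
        # Minimal Hadoop Streaming sketch. Could be run, e.g., as:
        #   hadoop jar hadoop-streaming.jar -mapper "traffic.py map" \
        #       -reducer "traffic.py reduce" -input counts/ -output load/
        import sys

        def mapper():
            for line in sys.stdin:
                try:
                    counter_id, timestamp, vehicles = line.strip().split(",")
                    day = timestamp[:10]                 # group by counter and day
                    print(f"{counter_id}_{day}\t{int(vehicles)}")
                except ValueError:
                    continue                             # skip malformed lines

        def reducer():
            # Hadoop sorts map output by key before the reduce phase,
            # so equal keys arrive consecutively.
            current_key, total = None, 0
            for line in sys.stdin:
                key, value = line.rstrip("\n").split("\t")
                if key != current_key and current_key is not None:
                    print(f"{current_key}\t{total}")     # emit daily traffic load
                    total = 0
                current_key = key
                total += int(value)
            if current_key is not None:
                print(f"{current_key}\t{total}")

        if __name__ == "__main__":
            mapper() if sys.argv[1:] == ["map"] else reducer()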

  14. Optimising Magnetostatic Assemblies

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Smith, Anders

    theorem. This theorem formulates an energy equivalence principle with several implications concerning the optimisation of objective functionals that are linear with respect to the magnetic field. Linear functionals represent different optimisation goals, e.g. maximising a certain component of the field...... approached employing a heuristic algorithm, which led to new design concepts. Some of the procedures developed for linear objective functionals have been extended to non-linear objectives, by employing iterative techniques. Even though most of the optimality results discussed in this work have been derived...

  15. Isogeometric Analysis and Shape Optimisation

    DEFF Research Database (Denmark)

    Gravesen, Jens; Evgrafov, Anton; Gersborg, Allan Roulund

    of the whole domain. So in every optimisation cycle we need to extend a parametrisation of the boundary of a domain to the whole domain. It has to be fast in order not to slow the optimisation down, but it also has to be robust and give a parametrisation of high quality. These are conflicting requirements, so we...... will explain how the validity of a parametrisation can be checked and we will describe various ways to parametrise a domain. We will in particular study the Winslow functional which turns out to have some desirable properties. Other problems we touch upon include clustering of boundary control points (design...

  16. Recurrent personality dimensions in inclusive lexical studies: indications for a big six structure.

    Science.gov (United States)

    Saucier, Gerard

    2009-10-01

    Previous evidence for both the Big Five and the alternative six-factor model has been drawn from lexical studies with relatively narrow selections of attributes. This study examined factors from previous lexical studies using a wider selection of attributes in 7 languages (Chinese, English, Filipino, Greek, Hebrew, Spanish, and Turkish) and found 6 recurrent factors, each with common conceptual content across most of the studies. The previous narrow-selection-based six-factor model outperformed the Big Five in capturing the content of the 6 recurrent wideband factors. Adjective markers of the 6 recurrent wideband factors showed substantial incremental prediction of important criterion variables over and above the Big Five. Correspondence between wideband 6 and narrowband 6 factors indicates that they are variants of a "Big Six" model that is more general across variable-selection procedures and may be more general across languages and populations.

  17. Optimising Shovel-Truck Fuel Consumption using Stochastic ...

    African Journals Online (AJOL)

    Optimising the fuel consumption and truck waiting time can result in significant fuel savings. The paper demonstrates that stochastic simulation is an effective tool for optimising the utilisation of fossil-based fuels in mining and related industries. Keywords: Stochastic, Simulation Modelling, Mining, Optimisation, Shovel-Truck ...

  18. Application of Surpac and Whittle Software in Open Pit Optimisation ...

    African Journals Online (AJOL)

    Application of Surpac and Whittle Software in Open Pit Optimisation and Design. ... This paper studies the Surpac and Whittle software and their application in designing an optimised pit. ...

  19. Optimisation of technical specifications using probabilistic methods

    International Nuclear Information System (INIS)

    Ericsson, G.; Knochenhauer, M.; Hultqvist, G.

    1986-01-01

    During the last few years the development of methods for modifying and optimising nuclear power plant Technical Specifications (TS) for plant operations has received increased attention. Probabilistic methods in general, and the plant and system models of probabilistic safety assessment (PSA) in particular, seem to provide the most forceful tools for optimisation. This paper first gives some general comments on optimisation, identifying important parameters, and then gives a description of recent Swedish experiences from the use of nuclear power plant PSA models and results for TS optimisation

  20. Results of the 2010 IGSC Topical Session on Optimisation

    International Nuclear Information System (INIS)

    Bailey, Lucy

    2014-01-01

    Document available in abstract form only. Full text follows: The 2010 IGSC topical session on optimisation explored a wide range of issues concerning optimisation throughout the radioactive waste management process. Philosophical and ethical questions were discussed, such as: - To what extent is the process of optimisation more important than the end result? - How do we balance long-term environmental safety with near-term operational safety? - For how long should options be kept open? - In balancing safety and excessive cost, when is BAT achieved and who decides on this? * How should we balance the needs of current society with those of future generations? It was clear that optimisation is about getting the right balance between a range of issues that cover: radiation protection, environmental protection, operational safety, operational requirements, social expectations and cost. The optimisation process will also need to respect various constraints, which are likely to include: regulatory requirements, site restrictions, community-imposed requirements or restrictions and resource constraints. These issues were explored through a number of presentations that discussed practical cases of optimisation occurring at different stages of international radioactive waste management programmes. These covered: - Operations and decommissioning - management of large disused components, from the findings of an international study, presented by WPDD; - Concept option selection, prior to site selection - upstream and disposal system optioneering in the UK; - Siting decisions - examples from both Germany and France, explaining how optimisation is being used to support site comparisons and communicate siting decisions; - Repository design decisions - comparison of KBS-3 horizontal and vertical deposition options in Finland; and - On-going optimisation during repository operation - operational experience from WIPP in the US. The variety of the remarks and views expressed during the

  1. Turbulence optimisation in stellarator experiments

    Energy Technology Data Exchange (ETDEWEB)

    Proll, Josefine H.E. [Max-Planck/Princeton Center for Plasma Physics (Germany); Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstr. 1, 17491 Greifswald (Germany); Faber, Benjamin J. [HSX Plasma Laboratory, University of Wisconsin-Madison, Madison, WI 53706 (United States); Helander, Per; Xanthopoulos, Pavlos [Max-Planck/Princeton Center for Plasma Physics (Germany); Lazerson, Samuel A.; Mynick, Harry E. [Plasma Physics Laboratory, Princeton University, P.O. Box 451 Princeton, New Jersey 08543-0451 (United States)

    2015-05-01

    Stellarators, the twisted siblings of the axisymmetric fusion experiments called tokamaks, have historically suffered from confining the heat of the plasma insufficiently compared with tokamaks and were therefore considered to be less promising candidates for a fusion reactor. This has changed, however, with the advent of stellarators in which the laminar transport is reduced to levels below that of tokamaks by shaping the magnetic field accordingly. As in tokamaks, the turbulent transport remains as the now dominant transport channel. Recent analytical theory suggests that the large configuration space of stellarators allows for an additional optimisation of the magnetic field to also reduce the turbulent transport. In this talk, the idea behind the turbulence optimisation is explained. We also present how an optimised equilibrium is obtained and how it might differ from the equilibrium field of an already existing device, and we compare experimental turbulence measurements in different configurations of the HSX stellarator in order to test the optimisation procedure.

  2. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  3. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview...... of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big...... data....

  4. Layout Optimisation of Wave Energy Converter Arrays

    DEFF Research Database (Denmark)

    Ruiz, Pau Mercadé; Nava, Vincenzo; Topper, Mathew B. R.

    2017-01-01

    This paper proposes an optimisation strategy for the layout design of wave energy converter (WEC) arrays. Optimal layouts are sought so as to maximise the absorbed power given a minimum q-factor, the minimum distance between WECs, and an area of deployment. To guarantee an efficient optimisation......, a four-parameter layout description is proposed. Three different optimisation algorithms are further compared in terms of performance and computational cost. These are the covariance matrix adaptation evolution strategy (CMA), a genetic algorithm (GA) and the glowworm swarm optimisation (GSO) algorithm...

  5. BIG DATA IN SUPPLY CHAIN MANAGEMENT: AN EXPLORATORY STUDY

    Directory of Open Access Journals (Sweden)

    Gheorghe MILITARU

    2015-12-01

    Full Text Available The objective of this paper is to set a framework for examining the conditions under which big data can create long-term profitability through developing dynamic operations and digital supply networks in the supply chain. We investigate the extent to which big data analytics has the power to change the competitive landscape of industries and could offer operational, strategic and competitive advantages. This paper is based upon a qualitative study of the convergence of predictive analytics and big data in the field of supply chain management. Our findings indicate a need for manufacturers to introduce analytics tools, real-time data, and more flexible production techniques to improve their productivity in line with the new business model. By gathering and analysing vast volumes of data, analytics tools help companies to allocate resources and capital spend more effectively based on risk assessment. Finally, implications and directions for future research are discussed.

  6. TELECOM BIG DATA FOR URBAN TRANSPORT ANALYSIS – A CASE STUDY OF SPLIT-DALMATIA COUNTY IN CROATIA

    Directory of Open Access Journals (Sweden)

    M. Baučić

    2017-09-01

    Full Text Available Today, big data has become widely available, and new technologies are being developed for big data storage architectures and big data analytics. An ongoing challenge is how to incorporate big data into GIS applications supporting the various domains. The International Transport Forum explains how the arrival of big data and real-time data, together with new data processing algorithms, leads to new insights and operational improvements in transport. Based on the telecom customer data, the Study of Tourist Movement and Traffic in Split-Dalmatia County in Croatia is carried out as a part of the “IPA Adriatic CBC//N.0086/INTERMODAL” project. This paper briefly explains the big data used in the study and the results of the study. Furthermore, this paper investigates the main considerations when using telecom customer big data: data privacy and data quality. The paper concludes with GIS visualisation and proposes the further use of big data used in the study.

  7. Application of ant colony optimisation in distribution transformer sizing

    African Journals Online (AJOL)

    This study proposes an optimisation method for transformer sizing in power systems using ant colony optimisation and a verification of the process by MATLAB software. The aim is to address the issue of transformer sizing, which is a major challenge affecting its effective performance, longevity, huge capital cost and power ...

  8. A STUDY ON OPTIMISATION OF RESOURCES FOR MULTIPLE PROJECTS BY USING PRIMAVERA

    Directory of Open Access Journals (Sweden)

    B. S. K. REDDY

    2015-02-01

    Full Text Available Resources play a vital role in construction projects. The performance of the construction industry depends chiefly on how well resources are managed. Optimisation plays a pivotal role in resource management, but the task is highly haphazard and chaotic under the influence of complexity and vastness. Management always looks for the optimum utility of the resources available to it. Hence, project management has an important place, especially in resource allocation and smooth functioning within the allocated budget. To achieve these goals and to exercise enhanced optimisation, certain tools are used to allocate resources optimally. The present work illustrates resource optimisation exercises on two ongoing projects in Dubai, UAE. When the resource demands of projects A and B are individually levelled, the observed cumulative requirement is 17475. In the other option, the demands of projects A and B are aggregated and then levelled together, and the necessary resource observed is 16490. Comparing the individually-levelled option with the aggregated-and-levelled option clearly indicates a 5.65% reduction in resource demand in the latter option, which could best be considered for economy.
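    As a quick back-of-the-envelope check of the figures quoted above (levelling the two projects separately versus levelling the aggregated demand), the relative saving can be recomputed from the reported totals; the snippet assumes the reduction is expressed relative to the individually-levelled total.

        individually_levelled = 17475   # projects A and B levelled separately
        aggregated_levelled = 16490     # demands aggregated, then levelled together

        saving = individually_levelled - aggregated_levelled
        # prints 985 and roughly 5.6%, in line with the 5.65% reported in the abstract
        print(saving, f"{100 * saving / individually_levelled:.2f}%")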

  9. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  10. Topology Optimisation for Coupled Convection Problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    This thesis deals with topology optimisation for coupled convection problems. The aim is to extend and apply topology optimisation to steady-state conjugate heat transfer problems, where the heat conduction equation governs the heat transfer in a solid and is coupled to thermal transport...... in a surrounding fluid, governed by a convection-diffusion equation, where the convective velocity field is found from solving the isothermal incompressible steady-state Navier-Stokes equations. Topology optimisation is also applied to steady-state natural convection problems. The modelling is done using stabilised...... finite elements, the formulation and implementation of which was done partly during a special course as preparatory work for this thesis. The formulation is extended with a Brinkman friction term in order to facilitate the topology optimisation of fluid flow and convective cooling problems. The derived...
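    The Brinkman friction term mentioned in this record is the standard device for interpolating between fluid and solid regions in density-based topology optimisation of flow problems. A commonly used form, given here only as a hedged sketch of the general idea rather than the exact formulation in the thesis, adds a design-dependent friction force to the steady incompressible Navier-Stokes momentum equation:

        \[
        \rho (\mathbf{u}\cdot\nabla)\mathbf{u}
          = -\nabla p + \mu \nabla^{2}\mathbf{u} - \alpha(\gamma)\,\mathbf{u},
        \qquad
        \nabla\cdot\mathbf{u} = 0,
        \]

    where \(\gamma \in [0,1]\) is the design field and \(\alpha(\gamma)\) is large in solid regions (\(\gamma = 0\)), forcing the velocity towards zero, and (near) zero in fluid regions (\(\gamma = 1\)).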

  11. Optimisation of searches for Supersymmetry with the ATLAS detector

    Energy Technology Data Exchange (ETDEWEB)

    Zvolsky, Milan

    2012-01-15

    The ATLAS experiment is one of the four large experiments at the Large Hadron Collider which is specifically designed to search for the Higgs boson and physics beyond the Standard Model. The aim of this thesis is the optimisation of searches for Supersymmetry in decays with two leptons and missing transverse energy in the final state. Two different optimisation studies have been performed for two important analysis aspects: The final signal region selection and the choice of the trigger selection. In the first part of the analysis, a cut-based optimisation of signal regions is performed, maximising the signal for a minimal background contamination. By this, the signal yield can in parts be more than doubled. The second approach is to introduce di-lepton triggers which allow to lower the lepton transverse momentum threshold, thus enhancing the number of selected signal events significantly. The signal region optimisation was considered for the choice of the final event selection in the ATLAS di-lepton analyses. The trigger study contributed to the incorporation of di-lepton triggers to the ATLAS trigger menu. (orig.)

  12. Real-time optimisation of the Hoa Binh reservoir, Vietnam

    DEFF Research Database (Denmark)

    Richaud, Bertrand; Madsen, Henrik; Rosbjerg, Dan

    2011-01-01

    -time optimisation. First, the simulation-optimisation framework is applied for optimising reservoir operating rules. Secondly, real-time and forecast information is used for on-line optimisation that focuses on short-term goals, such as flood control or hydropower generation, without compromising the deviation...... in the downstream part of the Red River, and at the same time to increase hydropower generation and to save water for the dry season. The real-time optimisation procedure further improves the efficiency of the reservoir operation and enhances the flexibility for the decision-making. Finally, the quality......Multi-purpose reservoirs often have to be managed according to conflicting objectives, which requires efficient tools for trading-off the objectives. This paper proposes a multi-objective simulation-optimisation approach that couples off-line rule curve optimisation with on-line real...

  13. An empirical study on website usability elements and how they affect search engine optimisation

    Directory of Open Access Journals (Sweden)

    Eugene B. Visser

    2011-03-01

    Full Text Available The primary objective of this research project was to identify and investigate the website usability attributes which are in contradiction with search engine optimisation elements. The secondary objective was to determine if these usability attributes affect conversion. Although the literature review identifies the contradictions, experts disagree about their existence. An experiment was conducted, whereby the conversion and/or traffic ratio results of an existing control website were compared to a usability-designed version of the control website, namely the experimental website. All optimisation elements were ignored, thus implementing only usability. The results clearly show that inclusion of the usability attributes positively affects conversion, indicating that usability is a prerequisite for effective website design. Search engine optimisation is also a prerequisite for the very reason that if a website does not rank on the first page of the search engine result page for a given keyword, then that website might as well not exist. According to this empirical work, usability is in contradiction with search engine optimisation best practices. Therefore the two need to be weighed up in terms of importance to search engines and visitors.

  14. A Study of the Application of Big Data in a Rural Comprehensive Information Service

    Directory of Open Access Journals (Sweden)

    Leifeng Guo

    2015-05-01

    Full Text Available Big data has attracted extensive interest due to its potential tremendous social and scientific value. Researchers are also trying to extract potential value from agriculture big data. This paper presents a study of information services based on big data from the perspective of a rural comprehensive information service. First, we introduce the background of the rural comprehensive information service, and then we present in detail the National Rural Comprehensive Information Service Platform (NRCISP, which is supported by the national science and technology support program. Next, we discuss big data in the NRCISP according to data characteristics, data sources, and data processing. Finally, we discuss a service model and services based on big data in the NRCISP.

  15. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to

  16. Efficient topology optimisation of multiscale and multiphysics problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    The aim of this Thesis is to present efficient methods for optimising high-resolution problems of a multiscale and multiphysics nature. The Thesis consists of two parts: one treating topology optimisation of microstructural details and the other treating topology optimisation of conjugate heat...

  17. Methods for Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte

    This thesis deals with the adaptation and implementation of various optimisation methods, in the field of experimental design, for the laser cutting process. The problem in optimising the laser cutting process has been defined and a structure for a Decision Support System (DSS......) for the optimisation of the laser cutting process has been suggested. The DSS consists of a database with the currently used and old parameter settings. Also one of the optimisation methods has been implemented in the DSS in order to facilitate the optimisation procedure for the laser operator. The Simplex Method has...... been adapted in two versions. A qualitative one that optimises the process by comparing the laser-cut items, and a quantitative one that uses a weighted quality response in order to achieve a satisfactory quality and thereafter maximises the cutting speed, thus increasing the productivity of the process...
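    The quantitative Simplex variant described above searches the cutting-parameter space against a weighted quality response. As a loose illustration of that kind of search (not the thesis implementation), SciPy's Nelder-Mead simplex can be pointed at a weighted-quality objective; the two parameters (laser power and cutting speed), the weights and the toy response model below are entirely hypothetical.

        from scipy.optimize import minimize

        def weighted_quality_penalty(x):
            """Hypothetical weighted quality response for a laser cut.
            x = (power [W], speed [mm/s]); lower is better."""
            power, speed = x
            dross = max(0.0, 1.5 - power / (200.0 * speed))   # toy penalty terms
            kerf = abs(power / 1000.0 - 1.2)
            slowness = 50.0 / speed                           # reward productivity
            return 2.0 * dross + 1.0 * kerf + 0.5 * slowness  # weighted response

        # Nelder-Mead is derivative-free, which suits noisy, experiment-driven responses.
        result = minimize(weighted_quality_penalty, x0=[1000.0, 20.0],
                          method="Nelder-Mead")
        print(result.x, result.fun)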

  18. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  19. Adjoint Optimisation of the Turbulent Flow in an Annular Diffuser

    DEFF Research Database (Denmark)

    Gotfredsen, Erik; Agular Knudsen, Christian; Kunoy, Jens Dahl

    2017-01-01

    In the present study, a numerical optimisation of guide vanes in an annular diffuser is performed. The optimisation is performed for the purpose of improving the following two parameters simultaneously: the first parameter is the uniformity perpendicular to the flow direction, a 1/3 diameter do...

  20. Design of optimised backstepping controller for the synchronisation ...

    Indian Academy of Sciences (India)

    Ehsan Fouladi

    2017-12-18

    Dec 18, 2017 ... for the proposed optimised method compared to PSO optimised controller or any non-optimised backstepping controller. Keywords. Colpitts oscillator; backstepping controller; chaos synchronisation; shark smell algorithm; particle .... The velocity model is based on the gradient of the objective function, tilting ...

  1. Big Earth Data Initiative: Metadata Improvement: Case Studies

    Science.gov (United States)

    Kozimor, John; Habermann, Ted; Farley, John

    2016-01-01

    Big Earth Data Initiative (BEDI) The Big Earth Data Initiative (BEDI) invests in standardizing and optimizing the collection, management and delivery of the U.S. Government's civil Earth observation data to improve discovery, access, use, and understanding of Earth observations by the broader user community. Complete and consistent standard metadata helps address all three goals.

  2. Extending Particle Swarm Optimisers with Self-Organized Criticality

    DEFF Research Database (Denmark)

    Løvbjerg, Morten; Krink, Thiemo

    2002-01-01

    Particle swarm optimisers (PSOs) show potential in function optimisation, but still have room for improvement. Self-organized criticality (SOC) can help control the PSO and add diversity. Extending the PSO with SOC seems promising, reaching faster convergence and better solutions.

  3. Techno-economic optimisation of energy systems; Contribution a l'optimisation technico-economique de systemes energetiques

    Energy Technology Data Exchange (ETDEWEB)

    Mansilla Pellen, Ch

    2006-07-15

    The traditional approach currently used to assess the economic interest of energy systems is based on a defined flow-sheet. Some studies have shown that the flow-sheets corresponding to the best thermodynamic efficiencies do not necessarily lead to the best production costs. A method called techno-economic optimisation was proposed. This method aims at minimising the production cost of a given energy system, including both investment and operating costs. It was implemented using genetic algorithms. This approach was compared to the heat integration method on two different examples, thus validating its interest. Techno-economic optimisation was then applied to different energy systems dealing with hydrogen as well as electricity production. (author)
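    The production cost objective described above (investment plus operating costs per unit of product) is often formalised as a levelised cost. One conventional form, given here only as a hedged sketch of what such an objective function can look like rather than the formulation used in the study, annualises the investment with a capital recovery factor:

        \[
        C_{\mathrm{prod}}
          = \frac{\dfrac{i(1+i)^{n}}{(1+i)^{n}-1}\, I \;+\; C_{\mathrm{op}}}{Q_{\mathrm{annual}}},
        \]

    where \(I\) is the investment cost, \(i\) the discount rate, \(n\) the plant lifetime in years, \(C_{\mathrm{op}}\) the annual operating cost and \(Q_{\mathrm{annual}}\) the annual production of hydrogen or electricity. A genetic algorithm then searches the flow-sheet parameters that minimise this cost.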

  4. Application of Three Existing Stope Boundary Optimisation Methods in an Operating Underground Mine

    Science.gov (United States)

    Erdogan, Gamze; Yavuz, Mahmut

    2017-12-01

    The underground mine planning and design optimisation process has received little attention because of the complexity and variability of problems in underground mines. Although a number of optimisation studies and software tools are available, and some of them, in particular, have been implemented effectively to determine the ultimate-pit limits in an open pit mine, there is still a lack of studies on the optimisation of ultimate stope boundaries in underground mines. The proposed approaches for this purpose aim at maximizing the economic profit by selecting the best possible layout under operational, technical and physical constraints. In this paper, three existing heuristic techniques, namely the Floating Stope Algorithm, the Maximum Value Algorithm and the Mineable Shape Optimiser (MSO), are examined for the optimisation of stope layout in a case study. Each technique is assessed in terms of applicability, algorithm capabilities and limitations, considering the underground mine planning challenges. Finally, the results are evaluated and compared.

  5. Improving Vector Evaluated Particle Swarm Optimisation by incorporating nondominated solutions.

    Science.gov (United States)

    Lim, Kian Sheng; Ibrahim, Zuwairie; Buyamin, Salinda; Ahmad, Anita; Naim, Faradila; Ghazali, Kamarul Hawari; Mokhtar, Norrima

    2013-01-01

    The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles where their movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the best solution at the objective function optimised by that swarm, yielding poor solutions for the multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced by incorporating the nondominated solutions as the guidance for a swarm rather than using the best solution from another swarm. In this paper, the performance of improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using performance measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved Vector Evaluated Particle Swarm Optimisation algorithm has impressive performance compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm.
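    To make the algorithmic change described above concrete, the sketch below contrasts only the guidance step: instead of steering a swarm by the best particle of another swarm, each particle is steered by a leader drawn from an archive of nondominated solutions. This is a simplified, hedged reading of the idea for a two-objective minimisation problem, collapsed into a single swarm to keep the example short; the objective functions, coefficients and archive policy are illustrative, not the authors' code.

        import random

        def dominates(a, b):
            """True if objective vector a Pareto-dominates b (minimisation)."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def update_archive(archive, candidate):
            """Keep only nondominated (position, objectives) pairs."""
            pos, obj = candidate
            if any(dominates(a_obj, obj) for _, a_obj in archive):
                return archive
            archive = [(p, o) for p, o in archive if not dominates(obj, o)]
            return archive + [candidate]

        # Two illustrative objectives over a 1-D decision variable in [0, 2].
        objectives = (lambda x: x[0] ** 2, lambda x: (x[0] - 2.0) ** 2)

        def evaluate(x):
            return tuple(f(x) for f in objectives)

        random.seed(1)
        swarm = [{"x": [random.uniform(0.0, 2.0)], "v": [0.0]} for _ in range(20)]
        archive = []
        for p in swarm:
            p["pbest"], p["pbest_obj"] = list(p["x"]), evaluate(p["x"])
            archive = update_archive(archive, (list(p["x"]), p["pbest_obj"]))

        w, c1, c2 = 0.6, 1.5, 1.5
        for _ in range(100):
            for p in swarm:
                leader, _ = random.choice(archive)   # guidance from the nondominated
                for d in range(len(p["x"])):         # archive, not another swarm's best
                    p["v"][d] = (w * p["v"][d]
                                 + c1 * random.random() * (p["pbest"][d] - p["x"][d])
                                 + c2 * random.random() * (leader[d] - p["x"][d]))
                    p["x"][d] = min(2.0, max(0.0, p["x"][d] + p["v"][d]))
                obj = evaluate(p["x"])
                if dominates(obj, p["pbest_obj"]):   # simplified personal-best rule
                    p["pbest"], p["pbest_obj"] = list(p["x"]), obj
                archive = update_archive(archive, (list(p["x"]), obj))

        print(f"{len(archive)} nondominated solutions approximating the Pareto front")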

  6. Layout Optimisation of Wave Energy Converter Arrays

    Directory of Open Access Journals (Sweden)

    Pau Mercadé Ruiz

    2017-08-01

    Full Text Available This paper proposes an optimisation strategy for the layout design of wave energy converter (WEC arrays. Optimal layouts are sought so as to maximise the absorbed power given a minimum q-factor, the minimum distance between WECs, and an area of deployment. To guarantee an efficient optimisation, a four-parameter layout description is proposed. Three different optimisation algorithms are further compared in terms of performance and computational cost. These are the covariance matrix adaptation evolution strategy (CMA, a genetic algorithm (GA and the glowworm swarm optimisation (GSO algorithm. The results show slightly higher performances for the latter two algorithms; however, the first turns out to be significantly less computationally demanding.
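    The q-factor constraint referred to in this record is conventionally defined as the ratio between the power absorbed by the array and N times the power a single, isolated device would absorb, so values above one indicate constructive interaction between converters. Stated as a hedged reminder of the usual definition (notation assumed, not taken from the paper):

        \[
        q = \frac{P_{\mathrm{array}}}{N \, P_{\mathrm{isolated}}}.
        \]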

  7. Peran Dimensi Kepribadian Big Five terhadap Psychological Adjustment Pada Mahasiswa Indonesia yang Studi Keluar Negeri

    OpenAIRE

    Adelia, Cindy Inge

    2012-01-01

    This study aims to examine the effect of the Big Five personality dimensions on psychological adjustment in Indonesian sojourners. The instruments used to collect the data are a psychological adjustment scale and the Big Five Inventory. The psychological adjustment scale consists of 33 items. The Big Five Inventory used was a version that had been adapted by a professional translator. A convenience sampling method was used to gather responses from 117 participants. The data obtained are later analyzed us...

  8. Radiation dose to children in diagnostic radiology. Measurements and methods for clinical optimisation studies

    Energy Technology Data Exchange (ETDEWEB)

    Almen, A J

    1995-09-01

    A method for estimating mean absorbed dose to different organs and tissues was developed for paediatric patients undergoing X-ray investigations. The absorbed dose distribution in water was measured for the specific X-ray beam used. Clinical images were studied to determine X-ray beam positions and field sizes. Size and position of organs in the patient were estimated using ORNL phantoms and complementary clinical information. Conversion factors between the mean absorbed dose to various organs and entrance surface dose for five different body sizes were calculated. Direct measurements on patients estimating entrance surface dose and energy imparted for common X-ray investigations were performed. The examination technique for a number of paediatric X-ray investigations used in 19 Swedish hospitals was studied. For a simulated pelvis investigation of a 1-year-old child the entrance surface dose was measured and image quality was estimated using a contrast-detail phantom. Mean absorbed doses to organs and tissues in urography, lung, pelvis, thoracic spine, lumbar spine and scoliosis investigations were calculated. Calculations of effective dose were supplemented with risk calculations for special organs, e.g. the female breast. The work shows that the examination technique in paediatric radiology is not yet optimised, and that the non-optimised procedures contribute to a considerable variation in radiation dose. In order to optimise paediatric radiology there is a need for more standardised methods in patient dosimetry. It is especially important to relate measured quantities to the size of the patient, using e.g. the patient weight and length. 91 refs, 17 figs, 8 tabs.

  9. Radiation dose to children in diagnostic radiology. Measurements and methods for clinical optimisation studies

    International Nuclear Information System (INIS)

    Almen, A.J.

    1995-09-01

    A method for estimating mean absorbed dose to different organs and tissues was developed for paediatric patients undergoing X-ray investigations. The absorbed dose distribution in water was measured for the specific X-ray beam used. Clinical images were studied to determine X-ray beam positions and field sizes. Size and position of organs in the patient were estimated using ORNL phantoms and complementary clinical information. Conversion factors between the mean absorbed dose to various organs and entrance surface dose for five different body sizes were calculated. Direct measurements on patients estimating entrance surface dose and energy imparted for common X-ray investigations were performed. The examination technique for a number of paediatric X-ray investigations used in 19 Swedish hospitals was studied. For a simulated pelvis investigation of a 1-year-old child the entrance surface dose was measured and image quality was estimated using a contrast-detail phantom. Mean absorbed doses to organs and tissues in urography, lung, pelvis, thoracic spine, lumbar spine and scoliosis investigations were calculated. Calculations of effective dose were supplemented with risk calculations for special organs, e.g. the female breast. The work shows that the examination technique in paediatric radiology is not yet optimised, and that the non-optimised procedures contribute to a considerable variation in radiation dose. In order to optimise paediatric radiology there is a need for more standardised methods in patient dosimetry. It is especially important to relate measured quantities to the size of the patient, using e.g. the patient weight and length. 91 refs, 17 figs, 8 tabs.

  10. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  11. Big-data-based edge biomarkers: study on dynamical drug sensitivity and resistance in individuals.

    Science.gov (United States)

    Zeng, Tao; Zhang, Wanwei; Yu, Xiangtian; Liu, Xiaoping; Li, Meiyi; Chen, Luonan

    2016-07-01

    The big-data-based edge biomarker is a new concept to characterize disease features based on biomedical big data in a dynamical and network manner, which also provides alternative strategies to indicate disease status in single samples. This article gives a comprehensive review of big-data-based edge biomarkers for complex diseases in an individual patient, which are defined as biomarkers based on network information and high-dimensional data. Specifically, we firstly introduce the sources and structures of biomedical big data accessible in public for edge biomarker and disease study. We show that biomedical big data are typically 'small-sample size in high-dimension space', i.e. small samples but with high dimensions on features (e.g. omics data) for each individual, in contrast to traditional big data in many other fields characterized as 'large-sample size in low-dimension space', i.e. big samples but with low dimensions on features. Then, we demonstrate the concept, model and algorithm for edge biomarkers and further big-data-based edge biomarkers. Unlike conventional biomarkers, edge biomarkers, e.g. module biomarkers in module network rewiring-analysis, are able to predict the disease state by learning differential associations between molecules rather than differential expressions of molecules during disease progression or treatment in individual patients. In particular, in contrast to using the information of the common molecules or edges (i.e. molecule-pairs) across a population in traditional biomarkers including network and edge biomarkers, big-data-based edge biomarkers are specific for each individual and thus can accurately evaluate the disease state by considering the individual heterogeneity. Therefore, the measurement of big data in a high-dimensional space is required not only in the learning process but also in the diagnosing or predicting process of the tested individual. Finally, we provide a case study on analyzing the temporal expression
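    The contrast drawn above, differential associations between molecules rather than differential expression of single molecules, can be illustrated with a toy computation: score each molecule pair (edge) by how much its correlation changes between two phenotype groups. The snippet below is only a schematic rendering of that idea on synthetic data; it is not the authors' edge-biomarker algorithm.

        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(0)
        n_genes, n_samples = 20, 30

        # Synthetic expression matrices for two phenotype groups (genes x samples).
        healthy = rng.normal(size=(n_genes, n_samples))
        disease = rng.normal(size=(n_genes, n_samples))
        # Plant one "rewired" association between genes 0 and 1 in the disease group.
        disease[1] = 0.8 * disease[0] + 0.2 * rng.normal(size=n_samples)

        def correlation(group, i, j):
            return np.corrcoef(group[i], group[j])[0, 1]

        # Edge score: absolute change in Pearson correlation between the two groups.
        edge_scores = {
            (i, j): abs(correlation(disease, i, j) - correlation(healthy, i, j))
            for i, j in combinations(range(n_genes), 2)
        }

        top_edges = sorted(edge_scores, key=edge_scores.get, reverse=True)[:5]
        for edge in top_edges:
            print(edge, round(edge_scores[edge], 3))   # candidate "edge" features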

  12. Improving Vector Evaluated Particle Swarm Optimisation by Incorporating Nondominated Solutions

    Directory of Open Access Journals (Sweden)

    Kian Sheng Lim

    2013-01-01

    Full Text Available The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles where their movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the best solution at the objective function optimised by that swarm, yielding poor solutions for the multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced by incorporating the nondominated solutions as the guidance for a swarm rather than using the best solution from another swarm. In this paper, the performance of improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using performance measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved Vector Evaluated Particle Swarm Optimisation algorithm has impressive performance compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm.

  13. Techno-economic optimisation of energy systems

    International Nuclear Information System (INIS)

    Mansilla Pellen, Ch.

    2006-07-01

    The traditional approach currently used to assess the economic interest of energy systems is based on a defined flow-sheet. Some studies have shown that the flow-sheets corresponding to the best thermodynamic efficiencies do not necessarily lead to the best production costs. A method called techno-economic optimisation was proposed. This method aims at minimising the production cost of a given energy system, including both investment and operating costs. It was implemented using genetic algorithms. This approach was compared to the heat integration method on two different examples, thus validating its interest. Techno-economic optimisation was then applied to different energy systems dealing with hydrogen as well as electricity production. (author)

  14. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Mauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  15. The big data potential of epidemiological studies for criminology and forensics.

    Science.gov (United States)

    DeLisi, Matt

    2018-07-01

    Big data, the analysis of original datasets with large samples ranging from ∼30,000 to one million participants to mine unexplored data, has been under-utilized in criminology. However, there have been recent calls for greater synthesis between epidemiology and criminology, and a small number of scholars have utilized epidemiological studies that were designed to measure alcohol and substance use to harvest behavioral and psychiatric measures that relate to the study of crime. These studies have been helpful in producing knowledge about the most serious, violent, and chronic offenders, but applications to more pathological forensic populations are lagging. Unfortunately, big data relating to crime and justice are restricted and limited to criminal justice purposes and not easily available to the research community. Thus, the study of criminal and forensic populations is limited in terms of data volume, velocity, and variety. Additional forays into epidemiology, increased use of available online judicial and correctional data, and unknown new frontiers are needed to bring criminology up to speed in the big data arena. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  16. An Optimisation Approach for Room Acoustics Design

    DEFF Research Database (Denmark)

    Holm-Jørgensen, Kristian; Kirkegaard, Poul Henning; Andersen, Lars

    2005-01-01

    This paper discusses, on a conceptual level, the value of optimisation techniques in architectural room acoustics design from a practical point of view. A single objective room acoustics design criterion, estimated from the sound field inside the room, is chosen for optimisation. The sound field is modeled...... using the boundary element method, where absorption is incorporated. An example is given where the geometry of a room is defined by four design modes. The room geometry is optimised to obtain a uniform sound pressure.

  17. Optimisation: how to develop stake holder involvement

    International Nuclear Information System (INIS)

    Weiss, W.

    2003-01-01

    The Precautionary Principle is an internationally recognised approach for dealing with risk situations characterised by uncertainties and potentially irreversible damage. Since the late fifties, ICRP has adopted this prudent attitude because of the lack of scientific evidence concerning the existence of a threshold at low doses for stochastic effects. The 'linear, no-threshold' model and the 'optimisation of protection' principle have been developed as a pragmatic response for the management of the risk. The progress in epidemiology and radiobiology over the last decades has affirmed the initial assumption, and optimisation remains the appropriate response for the application of the precautionary principle in the context of radiological protection. The basic objective of optimisation is, for any source within the system of radiological protection, to maintain the level of exposure as low as reasonably achievable, taking into account social and economic factors. Methods, tools and procedures have been developed over the last two decades to put into practice the optimisation principle, with a central role given to cost-benefit analysis as a means to determine the optimised level of protection. However, with the advancement in the implementation of the principle, more emphasis was progressively given to good practice, as well as to the importance of controlling individual levels of exposure through the optimisation process. In the context of the revision of its present recommendations, the Commission is reinforcing the emphasis on protection of the individual with the adoption of an equity-based system that recognizes individual rights and a basic level of health protection. Another advancement is the role now recognised for 'stakeholder involvement' in the optimisation process as a means to improve the quality of the decision-aiding process for identifying and selecting protection actions considered as being accepted by all those involved. The paper

  18. A COMPARATIVE STUDY ON MULTI-SWARM OPTIMISATION AND BAT ALGORITHM FOR UNCONSTRAINED NON LINEAR OPTIMISATION PROBLEMS

    Directory of Open Access Journals (Sweden)

    Evans BAIDOO

    2016-12-01

    Full Text Available Swarm intelligence is a branch of study that models a population of networked swarms or agents with the ability to self-organise. In spite of the huge amount of work that has been done in this area, both theoretically and empirically, and the considerable success that has been attained in several aspects, the field is still evolving and at an early stage. An immune system, a cloud of bats, or a flock of birds are distinctive examples of a swarm system. In this study, two types of meta-heuristic algorithms based on population and swarm intelligence - Multi Swarm Optimization (MSO) and the Bat Algorithm (BA) - are set up to find optimal solutions of continuous non-linear optimisation models. In order to analyze and compare the quality of the solutions and the performance of both algorithms, a series of computational experiments on six commonly used test functions for assessing the accuracy and performance of algorithms in swarm intelligence fields is used. The computational experiments show that the MSO algorithm appears much superior to BA.

  19. A comparative study of marriage in honey bees optimisation (MBO ...

    African Journals Online (AJOL)

    2012-02-15

    Feb 15, 2012 ... In a typical mating, the queen mates with 7 to 20 drones. Each time the .... Honey bee mating optimisation model's pseudo-code ... for this analysis, which consists of 47 years of monthly time ... tive of Karkheh Reservoir is to control and regulate the flow of ..... Masters thesis, Maastricht University, Maastricht.

  20. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered a proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  1. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  2. Optimising neutron polarizers--measuring the flipping ratio and related quantities

    CERN Document Server

    Goossens, D J

    2002-01-01

    The continuing development of gaseous spin-polarized ³He transmission filters for use as neutron polarizers makes the choice of optimum thickness for these filters an important consideration. The 'quality factors' derived for the optimisation of transmission filters for particular measurements are general to all neutron polarizers. In this work optimisation conditions for neutron polarizers are derived and discussed for the family of studies related to measuring the flipping ratio from samples. The application of the optimisation conditions to ³He transmission filters and other types of neutron polarizers is discussed. Absolute comparisons are made between the effectiveness of different types of polarizers for this sort of work.
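    For orientation, the quantities usually traded off when choosing a polarising-filter thickness are the neutron polarisation and the transmission, and for a ³He cell both depend on the cell opacity. The expressions below are the commonly quoted textbook forms, given here as a hedged reminder rather than as the specific quality factors derived in the paper:

        \[
        Q = P_n^{2}\, T_n, \qquad
        P_n = \tanh\!\left(O\, P_{\mathrm{He}}\right), \qquad
        T_n = T_0\, e^{-O}\cosh\!\left(O\, P_{\mathrm{He}}\right),
        \]

    where \(O\) is the cell opacity (proportional to ³He pressure, cell length and neutron wavelength), \(P_{\mathrm{He}}\) the ³He nuclear polarisation and \(T_0\) the transmission of the empty cell; increasing the thickness raises \(P_n\) but lowers \(T_n\), hence the optimum.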

  3. Mechatronic System Design Based On An Optimisation Approach

    DEFF Research Database (Denmark)

    Andersen, Torben Ole; Pedersen, Henrik Clemmensen; Hansen, Michael Rygaard

    The envisaged objective of this project is to extend the current state of the art regarding the design of complex mechatronic systems utilizing an optimisation approach. We propose to investigate a novel framework for mechatronic system design. The novelty and originality being the use...... of optimisation techniques. The methods used to optimise/design within the classical disciplines will be identified and extended to mechatronic system design....

  4. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  5. Optimised low-dose multidetector CT protocol for children with cranial deformity

    Energy Technology Data Exchange (ETDEWEB)

    Vazquez, Jose Luis [Complejo Hospitalario Universitario de Vigo, Department of Radiology, Vigo, Pontevedra (Spain); Pombar, Miguel Angel [Complejo Hospitalario Universitario de Santiago, Department of Radiophysics, Santiago de Compostela, La Coruna (Spain); Pumar, Jose Manuel [Complejo Hospitalario Universitario de Santiago, Department of Radiology, Santiago de Compostela, La Coruna (Spain); Campo, Victor Miguel del [Complejo Hospitalario Universitario de Vigo, Department of Public Health, Vigo, Pontevedra (Spain)

    2013-08-15

    To present an optimised low-dose multidetector computed tomography (MDCT) protocol for the study of children with cranial deformity. Ninety-one consecutive MDCT studies were performed in 80 children. Studies were performed with either our standard head CT protocol (group 1, n = 20) or a low-dose cranial deformity protocol (groups 2 and 3). Group 2 (n = 38) used the initial protocol and group 3 (n = 33) the final, more optimised protocol. All studies were performed on the same 64-MDCT equipment. The cranial deformity protocol was gradually optimised by decreasing the kVp, limiting the mA range, using automatic exposure control (AEC) and increasing the noise index (NI). Image quality was assessed. Dose indicators such as CT dose index volume (CTDIvol), dose-length product (DLP) and effective dose (E) were used. The optimised low-dose protocol reached the following values: 80 kVp, mA range 50-150 and NI = 23. We achieved a maximum dose reduction of 10-22 times in the 1- to 12-month-old cranium with regard to the 2004 European guidelines for MDCT. The result is a low-dose MDCT protocol that may be used as the first diagnostic imaging option in clinically selected patients with skull abnormalities. (orig.)
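    The three dose indicators quoted in this record are linked by standard CT dosimetry relations, noted here only for orientation (generic conventions, not values from the study): the dose-length product scales CTDIvol by the scanned length, and effective dose is commonly estimated from DLP with an age- and region-specific conversion coefficient.

        \[
        \mathrm{DLP} = \mathrm{CTDI_{vol}} \times L, \qquad
        E \approx k \times \mathrm{DLP},
        \]

    where \(L\) is the scan length and \(k\) is a published conversion coefficient for the head region and patient age group.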

  6. Centralising and optimising decentralised stroke care systems : A simulation study on short-term costs and effects

    NARCIS (Netherlands)

    Lahr, Maarten M. H.; van der Zee, Durk-Jouke; Luijckx, Gert-Jan; Vroomen, Patrick C. A. J.; Buskens, Erik

    2017-01-01

    Background: Centralisation of thrombolysis may offer substantial benefits. The aim of this study was to assess short term costs and effects of centralisation of thrombolysis and optimised care in a decentralised system. Methods: Using simulation modelling, three scenarios to improve decentralised

  7. Effectiveness of an implementation optimisation intervention aimed at increasing parent engagement in HENRY, a childhood obesity prevention programme - the Optimising Family Engagement in HENRY (OFTEN) trial: study protocol for a randomised controlled trial.

    Science.gov (United States)

    Bryant, Maria; Burton, Wendy; Cundill, Bonnie; Farrin, Amanda J; Nixon, Jane; Stevens, June; Roberts, Kim; Foy, Robbie; Rutter, Harry; Hartley, Suzanne; Tubeuf, Sandy; Collinson, Michelle; Brown, Julia

    2017-01-24

    Family-based interventions to prevent childhood obesity depend upon parents' taking action to improve diet and other lifestyle behaviours in their families. Programmes that attract and retain high numbers of parents provide an enhanced opportunity to improve public health and are also likely to be more cost-effective than those that do not. We have developed a theory-informed optimisation intervention to promote parent engagement within an existing childhood obesity prevention group programme, HENRY (Health Exercise Nutrition for the Really Young). Here, we describe a proposal to evaluate the effectiveness of this optimisation intervention in regard to the engagement of parents and cost-effectiveness. The Optimising Family Engagement in HENRY (OFTEN) trial is a cluster randomised controlled trial being conducted across 24 local authorities (approximately 144 children's centres) which currently deliver HENRY programmes. The primary outcome will be parental enrolment and attendance at the HENRY programme, assessed using routinely collected process data. Cost-effectiveness will be presented in terms of primary outcomes using acceptability curves and through eliciting the willingness to pay for the optimisation from HENRY commissioners. Secondary outcomes include the longitudinal impact of the optimisation, parent-reported infant intake of fruits and vegetables (as a proxy to compliance) and other parent-reported family habits and lifestyle. This innovative trial will provide evidence on the implementation of a theory-informed optimisation intervention to promote parent engagement in HENRY, a community-based childhood obesity prevention programme. The findings will be generalisable to other interventions delivered to parents in other community-based environments. This research meets the expressed needs of commissioners, children's centres and parents to optimise the potential impact that HENRY has on obesity prevention. A subsequent cluster randomised controlled pilot

  8. Agent-Based Decision Control—How to Appreciate Multivariate Optimisation in Architecture

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer; Perkov, Thomas Holmer; Kolarik, Jakub

    2015-01-01

    ... in the early design stage. The main focus is to demonstrate the optimisation method, which is done in two ways. Firstly, the newly developed agent-based optimisation algorithm named Moth is tested on three different single objective search spaces. Here Moth is compared to two evolutionary algorithms. Secondly, the method is applied to a multivariate optimisation problem. The aim is specifically to demonstrate optimisation for entire building energy consumption, daylight distribution and capital cost. Based on the demonstrations Moth’s ability to find local minima is discussed. It is concluded that agent-based optimisation algorithms like Moth open up for new uses of optimisation in the early design stage. With Moth the final outcome is less dependent on pre- and post-processing, and Moth allows user intervention during optimisation. Therefore, agent-based models for optimisation such as Moth can be a powerful ...

  9. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  10. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  11. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  12. Metal Removal Process Optimisation using Taguchi Method - Simplex Algorithm (TM-SA) with Case Study Applications

    OpenAIRE

    Ajibade, Oluwaseyi A.; Agunsoye, Johnson O.; Oke, Sunday A.

    2018-01-01

    In the metal removal process industry, the current practice to optimise cutting parameters adopts a conventional method. It is based on trial and error, in which the machine operator uses experience, coupled with handbook guidelines, to determine optimal parametric values of choice. This method is not accurate, is time-consuming and costly. Therefore, there is a need for a method that is scientific, cost-effective and precise. Keeping this in mind, a different direction for process optimisation is ...

  13. Stress analysis studies in optimised 'D' shaped TOKAMAK magnet designs

    International Nuclear Information System (INIS)

    Diserens, N.J.

    1975-07-01

    A suite of computer programs TOK was developed which enabled simple data input to be used for computation of magnetic fields and forces in a toroidal system of coils with either D-shaped or circular cross section. An additional requirement was that input data to the Swansea stress analysis program FINESSE could be output from the TOK fields and forces program, and that graphical output from either program should be available. A further program was required to optimise the coil shape. This used the field calculating routines from the TOK program. The starting point for these studies was the proposed 40 coil Princeton design. The stresses resulting from three different shapes of D-coil were compared. (author)

  14. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.

  15. Optimisation of X-ray examinations: General principles and an Irish perspective

    International Nuclear Information System (INIS)

    Matthews, Kate; Brennan, Patrick C.

    2009-01-01

    In Ireland, the European Medical Exposures Directive [Council Directive 97/43] was enacted into national law in Statutory Instrument 478 of 2002. This series of three review articles discusses the status of justification and optimisation of X-ray examinations nationally, and progress with the establishment of Irish diagnostic reference levels. In this second article, literature relating to optimisation issues arising in SI 478 of 2002 is reviewed. Optimisation associated with X-ray equipment and optimisation during day-to-day practice are considered. Optimisation proposals found in published research are summarised, and indicate the complex nature of optimisation. A paucity of current, research-based guidance documentation is identified. This is needed in order to support a range of professional staff in their practical implementation of optimisation.

  16. Cultural-based particle swarm for dynamic optimisation problems

    Science.gov (United States)

    Daneshyari, Moayed; Yen, Gary G.

    2012-07-01

    Many practical optimisation problems involve uncertainties, and a significant number of these belong to the dynamic optimisation problem (DOP) category, in which the fitness function changes through time. In this study, we propose a cultural-based particle swarm optimisation (PSO) to solve DOPs. A cultural framework is adopted, incorporating the required information from the PSO into five sections of the belief space, namely situational, temporal, domain, normative and spatial knowledge. The stored information is used to detect changes in the environment and assists in responding to the change through a diversity-based repulsion among particles and migration among swarms in the population space; it also helps in selecting the leading particles at three different levels: personal, swarm and global. Comparison of the proposed heuristic over several difficult dynamic benchmark problems demonstrates better or equal performance with respect to most of the other selected state-of-the-art dynamic PSO heuristics.
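    The belief-space bookkeeping is specific to this cultural-based PSO, but the underlying particle swarm update can be sketched generically. The following minimal PSO with a crude re-scattering step when the swarm collapses is offered as an illustration of the basic mechanics only (a stand-in for the diversity-based repulsion and migration described above), not as the authors' algorithm.

```python
import numpy as np

def pso(cost, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimiser (illustrative only, not the cultural
    PSO of the paper). Returns the best position and its cost."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = x[better], c[better]
        g = pbest[np.argmin(pbest_cost)].copy()
        # Crude diversity response: if the swarm collapses, re-scatter half of
        # the particles (a stand-in for the repulsion/migration mechanism).
        if np.std(x) < 1e-3:
            idx = rng.choice(n_particles, n_particles // 2, replace=False)
            x[idx] = rng.uniform(lo, hi, (len(idx), dim))
    return g, cost(g)

best_x, best_f = pso(lambda p: float(np.sum(p ** 2)), dim=5)
```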

  17. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formulation that preceded Big Data, it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  18. Robustness analysis of bogie suspension components Pareto optimised values

    Science.gov (United States)

    Mousavi Bideleh, Seyed Milad

    2017-08-01

    The bogie suspension system of high speed trains can significantly affect vehicle performance. Multiobjective optimisation problems are often formulated and solved to find the Pareto optimised values of the suspension components and improve cost efficiency in railway operations from different perspectives. Uncertainties in the design parameters of the suspension system can negatively influence the dynamic behaviour of railway vehicles. In this regard, robustness analysis of the bogie dynamic response with respect to uncertainties in the suspension design parameters is considered. A one-car railway vehicle model with 50 degrees of freedom and wear/comfort Pareto optimised values of bogie suspension components is chosen for the analysis. Longitudinal and lateral primary stiffnesses, longitudinal and vertical secondary stiffnesses, as well as yaw damping are considered as the five design parameters. The effects of parameter uncertainties on wear, ride comfort, track shift force, stability, and risk of derailment are studied by varying the design parameters around their respective Pareto optimised values according to a lognormal distribution with different coefficients of variation (COVs). The robustness analysis is carried out based on the maximum entropy concept. The multiplicative dimensional reduction method is utilised to simplify the calculation of fractional moments and improve the computational efficiency. The results showed that the dynamic response of the vehicle with wear/comfort Pareto optimised values of bogie suspension is robust against uncertainties in the design parameters and the probability of failure is small for parameter uncertainties with COV up to 0.1.
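    The moment-based robustness analysis itself is specific to the paper, but the underlying experiment can be sketched as plain Monte Carlo: perturb each design parameter around its Pareto-optimised nominal value with a lognormal distribution of a given COV and estimate a failure probability. In the sketch below the parameter names, the response model and the failure limit are all hypothetical placeholders.

```python
import numpy as np

def lognormal_samples(nominal, cov, n, rng):
    """Sample around a nominal value with a lognormal distribution whose
    mean equals `nominal` and whose coefficient of variation equals `cov`."""
    sigma2 = np.log(1.0 + cov ** 2)
    mu = np.log(nominal) - 0.5 * sigma2
    return rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=n)

def robustness_study(response, nominal_params, cov=0.1, n=5000, limit=1.0, seed=1):
    """Monte Carlo stand-in for the moment-based analysis: perturb each
    parameter and estimate P(response > limit). `response` is user-supplied."""
    rng = np.random.default_rng(seed)
    samples = {k: lognormal_samples(v, cov, n, rng) for k, v in nominal_params.items()}
    out = np.array([response({k: samples[k][i] for k in samples}) for i in range(n)])
    return out.mean(), out.std(), np.mean(out > limit)

# Hypothetical example with made-up parameter names and a toy response model.
nominal = {"k_long_prim": 1.0, "k_lat_prim": 1.0, "k_long_sec": 1.0,
           "k_vert_sec": 1.0, "c_yaw": 1.0}
mean, std, p_fail = robustness_study(
    response=lambda p: sum(p.values()) / len(p), nominal_params=nominal)
```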

  19. Renewables portfolio standard and regional energy structure optimisation in China

    International Nuclear Information System (INIS)

    Fan, J.; Sun, W.; Ren, D.-M.

    2005-01-01

    Eastern Coastal areas of China have been developing rapidly since the implementation of reforms and the opening of China's economic markets in 1978. As in most areas of the world, this rapid economic growth has been accompanied by large increases in energy consumption. China's coal-dominated energy structure has resulted in serious ecological and environmental problems. Exploiting renewable energy resources and introducing Renewables Portfolio Standard (RPS) are some of the most important approaches towards optimising and sustaining the energy structure of China. This paper discusses international experiences in the implementation of RPS policies and prospects for using these policies to encourage renewable energy development in China, establishes a concise definition of renewable resources, differentiating between the broad definition (which includes hydro over 25 MW in size) from the narrow definition (which limits the eligibility of hydro to below 25 MW in size), and quantitatively analyses the potential renewable energy target. The research shows that: (1) Under the narrow hydro definition the renewable energy target would be 5.1% and under the broad hydro definition it would be 18.4%. (2) Western China has contributed 90.2% of the total renewable electricity generation in the country (if big and medium hydropowers are not included). Including big and medium hydropower, the figure is 63.8%. (3) Eastern electricity companies can achieve their quota by buying Tradable Renewable Energy Certificates (TRCs or Green Certificates) and by exploiting renewable energy resources in Western China. The successful implementation of the RPS policy will achieve the goal of sharing the benefits and responsibilities of energy production between the different regions of China

  20. Optimisation of rocker sole footwear for prevention of first plantar ulcer: comparison of group-optimised and individually-selected footwear designs.

    Science.gov (United States)

    Preece, Stephen J; Chapman, Jonathan D; Braunstein, Bjoern; Brüggemann, Gert-Peter; Nester, Christopher J

    2017-01-01

    Appropriate footwear for individuals with diabetes but no ulceration history could reduce the risk of first ulceration. However, individuals who deem themselves at low risk are unlikely to seek out bespoke footwear which is personalised. Therefore, our primary aim was to investigate whether group-optimised footwear designs, which could be prefabricated and delivered in a retail setting, could achieve appropriate pressure reduction, or whether footwear selection must be on a patient-by-patient basis. A second aim was to compare responses to footwear design between healthy participants and people with diabetes in order to understand the transferability of previous footwear research, performed in healthy populations. Plantar pressures were recorded from 102 individuals with diabetes, considered at low risk of ulceration. This cohort included 17 individuals with peripheral neuropathy. We also collected data from 66 healthy controls. Each participant walked in 8 rocker shoe designs (4 apex positions × 2 rocker angles). ANOVA analysis was then used to understand the effect of two design features and descriptive statistics used to identify the group-optimised design. Using 200 kPa as a target, this group-optimised design was then compared to the design identified as the best for each participant (using plantar pressure data). Peak plantar pressure increased significantly as apex position was moved distally and rocker angle reduced ( p  footwear which was individually selected. In terms of optimised footwear designs, healthy participants demonstrated the same response as participants with diabetes, despite having lower plantar pressures. This is the first study demonstrating that a group-optimised, generic rocker shoe might perform almost as well as footwear selected on a patient by patient basis in a low risk patient group. This work provides a starting point for clinical evaluation of generic versus personalised pressure reducing footwear.

  1. Optimising the Target and Capture Sections of the Neutrino Factory

    CERN Document Server

    Hansen, Ole Martin; Stapnes, Steinar

    The Neutrino Factory is designed to produce an intense high energy neutrino beam from stored muons. The majority of the muons are obtained from the decay of pions, produced by a proton beam impinging on a free-flowing mercury-jet target and captured by a high magnetic field. It is important to capture a large fraction of the produced pions to maximize the intensity of the neutrino beam. Various optimisation studies have been performed with the aim of maximising the muon influx to the accelerator and thus the neutrino beam intensity. The optimisation studies were performed with the use of Monte Carlo simulation tools. The production of secondary particles, by interactions between the incoming proton beam and the mercury target, was optimised by varying the proton beam impact position and impact angles on the target. The proton beam and target interaction region was studied and showed to be off the central axis of the capture section in the baseline configuration. The off-centred interaction region resulted in ...

  2. Ontology-based coupled optimisation design method using state-space analysis for the spindle box system of large ultra-precision optical grinding machine

    Science.gov (United States)

    Wang, Qianren; Chen, Xing; Yin, Yuehong; Lu, Jian

    2017-08-01

    With the increasing complexity of mechatronic products, traditional empirical or step-by-step design methods are facing great challenges, with various factors and different stages having become inevitably coupled during the design process. Management of massive information or big data, as well as the efficient operation of information flow, is deeply involved in the process of coupled design. Designers have to address increasingly sophisticated situations when coupled optimisation is also engaged. Aiming at overcoming these difficulties in the design of the spindle box system of an ultra-precision optical grinding machine, this paper proposes a coupled optimisation design method based on state-space analysis, with the design knowledge represented by ontologies and their semantic networks. An electromechanical coupled model integrating the mechanical structure, control system and motor driving system is established, mainly concerning the stiffness matrices of the hydrostatic bearings, ball screw nut and rolling guide sliders. The effectiveness and precision of the method are validated by the simulation results of the natural frequency and deformation of the spindle box when applying an impact force to the grinding wheel.

  3. Mutual information-based LPI optimisation for radar network

    Science.gov (United States)

    Shi, Chenguang; Zhou, Jianjiang; Wang, Fei; Chen, Jun

    2015-07-01

    Radar network can offer significant performance improvement for target detection and information extraction employing spatial diversity. For a fixed number of radars, the achievable mutual information (MI) for estimating the target parameters may extend beyond a predefined threshold with full power transmission. In this paper, an effective low probability of intercept (LPI) optimisation algorithm is presented to improve LPI performance for radar network. Based on radar network system model, we first provide Schleher intercept factor for radar network as an optimisation metric for LPI performance. Then, a novel LPI optimisation algorithm is presented, where for a predefined MI threshold, Schleher intercept factor for radar network is minimised by optimising the transmission power allocation among radars in the network such that the enhanced LPI performance for radar network can be achieved. The genetic algorithm based on nonlinear programming (GA-NP) is employed to solve the resulting nonconvex and nonlinear optimisation problem. Some simulations demonstrate that the proposed algorithm is valuable and effective to improve the LPI performance for radar network.
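    The paper's Schleher intercept factor and mutual information expressions are not reproduced here; the sketch below only shows the generic shape of the optimisation problem, minimising a total-power surrogate subject to an MI threshold with off-the-shelf constrained optimisation. The gain values, the threshold and both model functions are placeholder assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder system model: these functions stand in for the paper's
# Schleher intercept factor and mutual information expressions.
gains = np.array([1.0, 0.8, 1.2, 0.9])       # assumed radar-target gains

def intercept_factor(p):
    return float(np.sum(p))                   # surrogate: grows with radiated power

def mutual_information(p):
    return float(np.sum(np.log2(1.0 + gains * p)))  # toy MI model

MI_MIN, P_MAX = 6.0, 10.0                     # assumed MI threshold / per-radar cap

res = minimize(
    intercept_factor,
    x0=np.full(len(gains), P_MAX / 2),
    method="SLSQP",
    bounds=[(0.0, P_MAX)] * len(gains),
    constraints=[{"type": "ineq", "fun": lambda p: mutual_information(p) - MI_MIN}],
)
print(res.x, mutual_information(res.x))
```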

  4. Multicriteria Optimisation in Logistics Forwarder Activities

    Directory of Open Access Journals (Sweden)

    Tanja Poletan Jugović

    2007-05-01

    The logistics forwarder, as organizer and planner of the coordination and integration of all the elements of transport and logistics chains, uses adequate ways and methods in the process of planning and decision-making. One of these methods, analysed in this paper, which could be used in the optimisation of transport and logistics processes and of the activities of the logistics forwarder, is the multicriteria optimisation method. Using that method, this paper suggests a model of multicriteria optimisation of logistics forwarder activities. The suggested optimisation model is justified in keeping with the method principles of multicriteria optimisation, which belongs to the operations research methods and represents the process of multicriteria optimisation of variants. Among many different processes of multicriteria optimisation, PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluations) and Promcalc & Gaia V. 3.2, a computer program for multicriteria programming based on the mentioned process, were used.
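    PROMETHEE ranks alternatives by aggregating pairwise preference comparisons into positive, negative and net outranking flows. As a compact illustration (not the Promcalc & Gaia implementation used in the paper), the sketch below computes PROMETHEE II net flows with the simple 'usual' preference function; the criteria, weights and scores are hypothetical.

```python
import numpy as np

def promethee_ii(evaluations, weights, maximise):
    """PROMETHEE II net outranking flows with the 'usual' preference function
    (preference = 1 if strictly better on a criterion, else 0).
    Rows = alternatives, columns = criteria; weights should sum to 1."""
    a, _ = evaluations.shape
    e = np.where(maximise, evaluations, -evaluations)  # turn every criterion into "max"
    pi = np.zeros((a, a))                              # aggregated preference matrix
    for i in range(a):
        for j in range(a):
            if i != j:
                pi[i, j] = np.sum(weights * (e[i] > e[j]))
    phi_plus = pi.sum(axis=1) / (a - 1)   # positive (leaving) flow
    phi_minus = pi.sum(axis=0) / (a - 1)  # negative (entering) flow
    return phi_plus - phi_minus           # net flow; higher is better

# Hypothetical example: 4 route variants scored on cost, time and reliability.
scores = np.array([[100, 12, 0.90],
                   [ 90, 15, 0.80],
                   [110, 10, 0.95],
                   [ 95, 14, 0.85]], dtype=float)
net = promethee_ii(scores, weights=np.array([0.5, 0.3, 0.2]),
                   maximise=np.array([False, False, True]))
ranking = np.argsort(-net)  # best variant first
```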

  5. Optimisation by hierarchical search

    Science.gov (United States)

    Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias

    2015-03-01

    Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general purpose approach to potentially improve the performance of conventional algorithms or special purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.
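    As a concrete, if simplified, illustration of optimising groups of variables rather than individual ones, the sketch below exhaustively re-optimises one small block of a binary vector at a time while the rest is held fixed. It is a stand-in for the idea described above, not the authors' hierarchical method.

```python
import numpy as np
from itertools import product

def optimise_by_groups(cost, x0, group_size=4, sweeps=10, seed=0):
    """Block (group-wise) optimisation of a binary vector: exhaustively
    re-optimise one small group of variables at a time, holding the rest
    fixed. A simplified stand-in for the hierarchical idea described above."""
    rng = np.random.default_rng(seed)
    x, n = x0.copy(), len(x0)
    for _ in range(sweeps):
        for start in rng.permutation(np.arange(0, n, group_size)):
            idx = np.arange(start, min(start + group_size, n))
            best_vals, best_cost = x[idx].copy(), cost(x)
            for assignment in product([0, 1], repeat=len(idx)):
                x[idx] = assignment
                c = cost(x)
                if c < best_cost:
                    best_vals, best_cost = np.array(assignment), c
            x[idx] = best_vals
    return x, cost(x)

# Toy example: a random quadratic (Ising-like) cost over 16 binary variables.
rng = np.random.default_rng(1)
J = rng.normal(size=(16, 16)); J = (J + J.T) / 2
cost = lambda s: float((2 * s - 1) @ J @ (2 * s - 1))
x_best, c_best = optimise_by_groups(cost, rng.integers(0, 2, 16))
```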

  6. Operational Radiological Protection and Aspects of Optimisation

    International Nuclear Information System (INIS)

    Lazo, E.; Lindvall, C.G.

    2005-01-01

    Since 1992, the Nuclear Energy Agency (NEA), along with the International Atomic Energy Agency (IAEA), has sponsored the Information System on Occupational Exposure (ISOE). ISOE collects and analyses occupational exposure data and experience from over 400 nuclear power plants around the world and is a forum for radiological protection experts from both nuclear power plants and regulatory authorities to share lessons learned and best practices in the management of worker radiation exposures. In connection to the ongoing work of the International Commission on Radiological Protection (ICRP) to develop new recommendations, the ISOE programme has been interested in how the new recommendations would affect operational radiological protection application at nuclear power plants. Bearing in mind that the ICRP is developing, in addition to new general recommendations, a new recommendation specifically on optimisation, the ISOE programme created a working group to study the operational aspects of optimisation, and to identify the key factors in optimisation that could usefully be reflected in ICRP recommendations. In addition, the Group identified areas where further ICRP clarification and guidance would be of assistance to practitioners, both at the plant and the regulatory authority. The specific objective of this ISOE work was to provide operational radiological protection input, based on practical experience, to the development of new ICRP recommendations, particularly in the area of optimisation. This will help assure that new recommendations will best serve the needs of those implementing radiation protection standards, for the public and for workers, at both national and international levels. (author)

  7. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  8. Big Data access and infrastructure for modern biology: case studies in data repository utility.

    Science.gov (United States)

    Boles, Nathan C; Stone, Tyler; Bergeron, Charles; Kiehl, Thomas R

    2017-01-01

    Big Data is no longer solely the purview of big organizations with big resources. Today's routine tools and experimental methods can generate large slices of data. For example, high-throughput sequencing can quickly interrogate biological systems for the expression levels of thousands of different RNAs, examine epigenetic marks throughout the genome, and detect differences in the genomes of individuals. Multichannel electrophysiology platforms produce gigabytes of data in just a few minutes of recording. Imaging systems generate videos capturing biological behaviors over the course of days. Thus, any researcher now has access to a veritable wealth of data. However, the ability of any given researcher to utilize that data is limited by her/his own resources and skills for downloading, storing, and analyzing the data. In this paper, we examine the necessary resources required to engage Big Data, survey the state of modern data analysis pipelines, present a few data repository case studies, and touch on current institutions and programs supporting the work that relies on Big Data. © 2016 New York Academy of Sciences.

  9. A Study on SE Methodology for Design of Big Data Pilot Platform to Improve Nuclear Power Plant Safety

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Junguk; Cha, Jae-Min; Kim, Jun-Young; Park, Sung-Ho; Yeom, Choong-Sub [Institute for Advanced Engineering (IAE), Yongin (Korea, Republic of)

    2016-10-15

    A big data concept is expected to have a large impact on the safety of the nuclear power plant (NPP) from the beginning of the big data era. Though there is high interest in NPP safety with big data, almost no studies on the logical and physical structures and the systematic design methods of a big data platform for NPP safety have been conducted. For the current study, a new big data pilot platform for NPP safety is designed with the main focus on the health monitoring and early warning systems, and a tailored design process based on systems engineering approaches is proposed to manage the inherent high complexity of the platform design. In this study, therefore, the big data pilot platform for the health monitoring and early warning of the NPP is designed, the development process based on the SE approach for the pilot platform is proposed, and the design results along with the proposed process are presented. Implementation of the individual modules and their integration is currently in progress.

  10. A Study on SE Methodology for Design of Big Data Pilot Platform to Improve Nuclear Power Plant Safety

    International Nuclear Information System (INIS)

    Shin, Junguk; Cha, Jae-Min; Kim, Jun-Young; Park, Sung-Ho; Yeom, Choong-Sub

    2016-01-01

    A big data concept is expected to have a large impact on the safety of the nuclear power plant (NPP) from the beginning of the big data era. Though there is high interest in NPP safety with big data, almost no studies on the logical and physical structures and the systematic design methods of a big data platform for NPP safety have been conducted. For the current study, a new big data pilot platform for NPP safety is designed with the main focus on the health monitoring and early warning systems, and a tailored design process based on systems engineering approaches is proposed to manage the inherent high complexity of the platform design. In this study, therefore, the big data pilot platform for the health monitoring and early warning of the NPP is designed, the development process based on the SE approach for the pilot platform is proposed, and the design results along with the proposed process are presented. Implementation of the individual modules and their integration is currently in progress.

  11. Optimised cut-off function for Tersoff-like potentials for a BN nanosheet: a molecular dynamics study

    International Nuclear Information System (INIS)

    Kumar, Rajesh; Rajasekaran, G; Parashar, Avinash

    2016-01-01

    In this article, molecular dynamics based simulations were carried out to study the tensile behaviour of boron nitride nanosheets (BNNSs). Four different sets of Tersoff potential parameters were used in the simulations for estimating the interatomic interactions between boron and nitrogen atoms. Modifications were incorporated in the Tersoff cut-off function to improve the accuracy of results with respect to fracture stress, fracture strain and Young’s modulus. In this study, the original cut-off function was optimised in such a way that small and large cut-off distances were made equal, and hence a single cut-off distance was used with all sets of Tersoff potential parameters. The single value of cut-off distance for the Tersoff potential was chosen after analysing the potential energy and bond forces experienced by boron and nitrogen atoms subjected to bond stretching. The simulations performed with the optimised cut-off function help in identifying the Tersoff potential parameters that reproduce the experimentally evaluated mechanical behaviour of BNNSs. (paper)
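    The standard Tersoff cut-off tapers the interaction smoothly between an inner radius R - D and an outer radius R + D with a half-sine; choosing a very small D approximates the single cut-off distance used in the optimised function described above. The sketch below implements the standard form with illustrative parameter values, not the fitted B-N parameters.

```python
import numpy as np

def tersoff_cutoff(r, R, D):
    """Standard Tersoff cut-off: 1 inside R-D, a half-sine taper on
    [R-D, R+D], and 0 outside R+D. A very small D approximates a sharp,
    single cut-off distance in the spirit of the modification above."""
    r = np.asarray(r, dtype=float)
    fc = np.zeros_like(r)
    fc[r < R - D] = 1.0
    mid = (r >= R - D) & (r <= R + D)
    fc[mid] = 0.5 - 0.5 * np.sin(0.5 * np.pi * (r[mid] - R) / D)
    return fc

# Illustrative values only (not the fitted boron-nitrogen parameters).
r = np.linspace(1.2, 2.4, 7)
print(tersoff_cutoff(r, R=1.9, D=0.1))
```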

  12. Risk-informed optimisation of railway tracks inspection and maintenance procedures

    International Nuclear Information System (INIS)

    Podofillini, Luca; Zio, Enrico; Vatn, Jorn

    2006-01-01

    Nowadays, efforts are being made by the railway industry for the application of reliability-based and risk-informed approaches to maintenance optimisation of railway infrastructures, with the aim of reducing the operation and maintenance expenditures while still assuring high safety standards. In particular, in this paper, we address the use of ultrasonic inspection cars and develop a methodology for the determination of an optimal strategy for their use. A model is developed to calculate the risks and costs associated with an inspection strategy, giving credit to the realistic issues of the rail failure process and including the actual inspection and maintenance procedures followed by the railway company. A multi-objective optimisation viewpoint is adopted in an effort to optimise inspection and maintenance procedures with respect to both economical and safety-related aspects. More precisely, the objective functions here considered are such to drive the search towards solutions characterized by low expenditures and low derailment probability. The optimisation is performed by means of a genetic algorithm. The work has been carried out within a study of the Norwegian National Rail Administration (Jernbaneverket)

  13. The optimisation of wedge filters in radiotherapy of the prostate

    International Nuclear Information System (INIS)

    Oldham, Mark; Neal, Anthony J.; Webb, Steve

    1995-01-01

    A treatment plan optimisation algorithm has been applied to 12 patients with early prostate cancer in order to determine the optimum beam-weights and wedge angles for a standard conformal three-field treatment technique. The optimisation algorithm was based on fast-simulated-annealing using a cost function designed to achieve a uniform dose in the planning-target-volume (PTV) and to minimise the integral doses to the organs-at-risk. The algorithm has been applied to standard conformal three-field plans created by an experienced human planner, and run in three PLAN MODES: (1) where the wedge angles were fixed by the human planner and only the beam-weights were optimised; (2) where both the wedge angles and beam-weights were optimised; and (3) where both the wedge angles and beam-weights were optimised and a non-uniform dose was prescribed to the PTV. In the latter PLAN MODE, a uniform 100% dose was prescribed to all of the PTV except for that region that overlaps with the rectum where a lower (e.g., 90%) dose was prescribed. The resulting optimised plans have been compared with those of the human planner who found beam-weights by conventional forward planning techniques. Plans were compared on the basis of dose statistics, normal-tissue-complication-probability (NTCP) and tumour-control-probability (TCP). The results of the comparison showed that all three PLAN MODES produced plans with slightly higher TCP for the same rectal NTCP, than the human planner. The best results were observed for PLAN MODE 3, where an average increase in TCP of 0.73% (± 0.20, 95% confidence interval) was predicted by the biological models. This increase arises from a beneficial dose gradient which is produced across the tumour. Although the TCP gain is small it comes with no increase in treatment complexity, and could translate into increased cures given the large numbers of patients being referred. A study of the beam-weights and wedge angles chosen by the optimisation algorithm revealed
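    The cost function and dose engine of the planning system are not available here, so the sketch below only illustrates a fast-simulated-annealing loop over beam weights with a toy linear dose model and a cost that penalises PTV non-uniformity and organ-at-risk dose. All matrices, weights and the cost trade-off factor are assumptions.

```python
import numpy as np

def anneal_beam_weights(dose_ptv, dose_oar, w0, iters=2000, t0=1.0, seed=0):
    """Toy simulated annealing over beam weights. `dose_ptv` and `dose_oar`
    are (voxels x beams) dose-per-unit-weight matrices from a placeholder
    linear dose model; the cost penalises PTV non-uniformity (target 1.0)
    and mean organ-at-risk dose."""
    rng = np.random.default_rng(seed)

    def cost(w):
        d_ptv = dose_ptv @ w
        d_oar = dose_oar @ w
        return np.mean((d_ptv - 1.0) ** 2) + 0.3 * np.mean(d_oar)

    w, c = w0.copy(), cost(w0)
    for k in range(iters):
        t = t0 / (1.0 + k)                          # fast-annealing-style schedule
        cand = np.clip(w + rng.normal(0, 0.05, w.shape), 0.0, None)
        cc = cost(cand)
        if cc < c or rng.random() < np.exp(-(cc - c) / t):
            w, c = cand, cc
    return w, c

# Hypothetical 3-beam problem with random dose-influence matrices.
rng = np.random.default_rng(2)
w_opt, c_opt = anneal_beam_weights(rng.random((200, 3)), 0.3 * rng.random((80, 3)),
                                   w0=np.ones(3) / 3)
```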

  14. Modified cuckoo search: A new gradient free optimisation algorithm

    International Nuclear Information System (INIS)

    Walton, S.; Hassan, O.; Morgan, K.; Brown, M.R.

    2011-01-01

    Highlights: → Modified cuckoo search (MCS) is a new gradient free optimisation algorithm. → MCS shows a high convergence rate, able to outperform other optimisers. → MCS is particularly strong at high dimension objective functions. → MCS performs well when applied to engineering problems. - Abstract: A new robust optimisation algorithm, which can be regarded as a modification of the recently developed cuckoo search, is presented. The modification involves the addition of information exchange between the top eggs, or the best solutions. Standard optimisation benchmarking functions are used to test the effects of these modifications and it is demonstrated that, in most cases, the modified cuckoo search performs as well as, or better than, the standard cuckoo search, a particle swarm optimiser, and a differential evolution strategy. In particular the modified cuckoo search shows a high convergence rate to the true global minimum even at high numbers of dimensions.
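    As a rough illustration of the idea, the sketch below runs a simplified cuckoo search with Lévy-flight steps plus one extra blending step between top nests, in the spirit of the information exchange between the best solutions described above. It is not the authors' exact modified cuckoo search, and all parameter values are illustrative.

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(size, beta=1.5, rng=None):
    """Mantegna's algorithm for Levy-distributed step lengths."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(cost, dim, n_nests=25, iters=300, pa=0.25, bounds=(-5, 5), seed=0):
    """Simplified cuckoo search with an extra crossover between top nests
    (illustrating the information-exchange idea, not the paper's exact MCS)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    nests = rng.uniform(lo, hi, (n_nests, dim))
    fit = np.array([cost(n) for n in nests])
    for _ in range(iters):
        best = nests[np.argmin(fit)]
        # Levy-flight moves biased towards the current best nest.
        new = np.clip(nests + 0.01 * levy_step((n_nests, dim), rng=rng) * (nests - best),
                      lo, hi)
        new_fit = np.array([cost(n) for n in new])
        improve = new_fit < fit
        nests[improve], fit[improve] = new[improve], new_fit[improve]
        # Information exchange: blend a random pair of top nests.
        top = np.argsort(fit)[: max(2, n_nests // 5)]
        i, j = rng.choice(top, 2, replace=False)
        mix = np.clip((nests[i] + nests[j]) / 2, lo, hi)
        worst = np.argmax(fit)
        if cost(mix) < fit[worst]:
            nests[worst], fit[worst] = mix, cost(mix)
        # Abandon a fraction pa of the worst nests.
        n_aband = int(pa * n_nests)
        worst_idx = np.argsort(fit)[-n_aband:]
        nests[worst_idx] = rng.uniform(lo, hi, (n_aband, dim))
        fit[worst_idx] = [cost(n) for n in nests[worst_idx]]
    return nests[np.argmin(fit)], fit.min()

best, f = cuckoo_search(lambda x: float(np.sum(x ** 2)), dim=10)
```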

  15. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  16. Revisiting EOR Projects in Indonesia through Integrated Study: EOR Screening, Predictive Model, and Optimisation

    KAUST Repository

    Hartono, A. D.; Hakiki, Farizal; Syihab, Z.; Ambia, F.; Yasutra, A.; Sutopo, S.; Efendi, M.; Sitompul, V.; Primasari, I.; Apriandi, R.

    2017-01-01

    EOR preliminary analysis is pivotal at an early stage of assessment in order to elucidate EOR feasibility. This study proposes an in-depth analysis toolkit for EOR preliminary evaluation. The toolkit incorporates EOR screening, predictive, economic, risk analysis and optimisation modules. The screening module introduces algorithms which assimilate statistical and engineering notions. The United States Department of Energy (U.S. DOE) predictive models were implemented in the predictive module. The economic module is available to assess project attractiveness, while Monte Carlo Simulation is applied to quantify risk and uncertainty of the evaluated project. Optimisation scenarios of EOR practice can be evaluated using the optimisation module, in which the stochastic methods of Genetic Algorithms (GA), Particle Swarm Optimization (PSO) and Evolutionary Strategy (ES) were applied. The modules were combined into an integrated package for EOR preliminary assessment. Finally, we utilised the toolkit to evaluate several Indonesian oil fields for EOR evaluation (past projects) and feasibility (future projects). The attempt was able to update the previous consideration regarding EOR attractiveness and open new opportunities for EOR implementation in Indonesia.

  17. Revisiting EOR Projects in Indonesia through Integrated Study: EOR Screening, Predictive Model, and Optimisation

    KAUST Repository

    Hartono, A. D.

    2017-10-17

    EOR preliminary analysis is pivotal at an early stage of assessment in order to elucidate EOR feasibility. This study proposes an in-depth analysis toolkit for EOR preliminary evaluation. The toolkit incorporates EOR screening, predictive, economic, risk analysis and optimisation modules. The screening module introduces algorithms which assimilate statistical and engineering notions. The United States Department of Energy (U.S. DOE) predictive models were implemented in the predictive module. The economic module is available to assess project attractiveness, while Monte Carlo Simulation is applied to quantify risk and uncertainty of the evaluated project. Optimisation scenarios of EOR practice can be evaluated using the optimisation module, in which the stochastic methods of Genetic Algorithms (GA), Particle Swarm Optimization (PSO) and Evolutionary Strategy (ES) were applied. The modules were combined into an integrated package for EOR preliminary assessment. Finally, we utilised the toolkit to evaluate several Indonesian oil fields for EOR evaluation (past projects) and feasibility (future projects). The attempt was able to update the previous consideration regarding EOR attractiveness and open new opportunities for EOR implementation in Indonesia.
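    The record mentions Monte Carlo Simulation for quantifying project risk; the sketch below shows that kind of analysis in miniature, sampling uncertain price, incremental recovery and operating cost into an NPV distribution. All distributions, parameter names and values are hypothetical and not taken from the toolkit.

```python
import numpy as np

def npv(cash_flows, discount_rate):
    """Net present value of a series of yearly cash flows."""
    t = np.arange(len(cash_flows))
    return np.sum(cash_flows / (1.0 + discount_rate) ** t)

def eor_npv_monte_carlo(n=10_000, seed=0):
    """Hypothetical Monte Carlo risk sketch: sample uncertain oil price,
    incremental recovery and operating cost, and return the NPV sample."""
    rng = np.random.default_rng(seed)
    out = np.empty(n)
    for i in range(n):
        price = rng.normal(60.0, 10.0)              # USD/bbl, assumed
        recovery = rng.triangular(0.5, 1.0, 1.8)    # MMbbl per year, assumed
        opex = rng.uniform(15.0, 25.0)              # USD/bbl, assumed
        capex = 40.0                                # MMUSD up-front, assumed
        yearly = (price - opex) * recovery          # MMUSD per year
        cash = np.array([-capex] + [yearly] * 10)
        out[i] = npv(cash, discount_rate=0.1)
    return out

samples = eor_npv_monte_carlo()
print(samples.mean(), np.percentile(samples, 10), (samples < 0).mean())
```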

  18. Study on Effects of Different Replacement Rate on Bending Behavior of Big Recycled Aggregate Self Compacting Concrete

    Science.gov (United States)

    Li, Jing; Guo, Tiantian; Gao, Shuai; Jiang, Lin; Zhao, Zhijun; Wang, Yalin

    2018-03-01

    Big recycled aggregate self compacting concrete is a new type of recycled concrete, which has the advantages of low hydration heat and green environmental protection, but its bending behavior can be affected by different replacement rate. Therefor, in this paper, the research status of big Recycled aggregate self compacting concrete was systematically introduced, and the effect of different replacement rate of big recycled aggregate on failure mode, crack distribution and bending strength of the beam were studied through the bending behavior test of 4 big recycled aggregate self compacting concrete beams. The results show that: The crack distribution of the beam can be affected by the replacement rate; The failure modes of big recycled aggregate beams are the same as those of ordinary concrete; The plane section assumption is applicable to the big recycled aggregate self compacting concrete beam; The higher the replacement rate, the lower the bending strength of big recycled aggregate self compacting concrete beams.

  19. The big data-big model (BDBM) challenges in ecological research

    Science.gov (United States)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites, such as Terra and OCO-2 among others, have collected data relevant on global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedback of terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from sensors from those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  20. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  1. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  2. Intelligent inversion method for pre-stack seismic big data based on MapReduce

    Science.gov (United States)

    Yan, Xuesong; Zhu, Zhixin; Wu, Qinghua

    2018-01-01

    Seismic exploration is a method of oil exploration that uses seismic information: by inverting the seismic data, useful information about the reservoir parameters can be obtained so that exploration can be carried out effectively. Pre-stack data are characterised by their large volume and abundant information, and their inversion yields rich information about the reservoir parameters. Owing to the large amount of pre-stack seismic data, existing single-machine environments cannot meet the computational needs; thus, an efficient and fast method for solving the pre-stack inversion problem is urgently needed. Optimisation of the elastic parameters using a genetic algorithm easily falls into a local optimum, which weakens the inversion results, especially for the density. Therefore, an intelligent optimisation algorithm is proposed in this paper and used for the elastic parameter inversion of pre-stack seismic data. This algorithm improves the population initialisation strategy by using the Gardner formula and improves the genetic operations of the algorithm, and the improved algorithm obtains better inversion results in a model test with logging data. All of the elastic parameters obtained by inversion fit the logging curves of the theoretical model well, which effectively improves the inversion precision of the density. The algorithm was implemented with a MapReduce model to solve the seismic big data inversion problem. The experimental results show that the parallel model can effectively reduce the running time of the algorithm.
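    The Gardner formula used for population initialisation is the empirical relation rho ≈ a·Vp^b, commonly quoted with a ≈ 0.31 and b ≈ 0.25 for Vp in m/s and density in g/cm³. The sketch below uses those textbook coefficients (which may differ from the paper's fitted values) to seed an initial population of density models.

```python
import numpy as np

def gardner_density(vp_ms, a=0.31, b=0.25):
    """Gardner's empirical relation rho ~ a * Vp**b, giving density in g/cm^3
    for Vp in m/s (textbook coefficients; a given basin may need refitting)."""
    return a * np.power(vp_ms, b)

def init_population(vp_ms, pop_size=50, spread=0.05, seed=0):
    """Seed a population of density models around the Gardner prediction,
    as a stand-in for the initialisation strategy described above."""
    rng = np.random.default_rng(seed)
    base = gardner_density(np.asarray(vp_ms, dtype=float))
    return base * (1.0 + spread * rng.standard_normal((pop_size, base.size)))

vp = [2500.0, 3000.0, 3500.0]   # example P-wave velocities (m/s)
pop = init_population(vp)        # shape (50, 3) density candidates
```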

  3. MANAGEMENT OPTIMISATION OF MASS CUSTOMISATION MANUFACTURING USING COMPUTATIONAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    Louwrens Butler

    2018-05-01

    Computational intelligence paradigms can be used for advanced manufacturing system optimisation. A static simulation model of an advanced manufacturing system was developed, the purpose of which was to mass-produce a customisable product range at a competitive cost. The aim of this study was to determine whether the new algorithm could produce better performance than traditional optimisation methods. The algorithm produced a lower-cost plan than a simulated annealing algorithm, and had a lower impact on the workforce.

  4. High School Learners' Mental Construction during Solving Optimisation Problems in Calculus: A South African Case Study

    Science.gov (United States)

    Brijlall, Deonarain; Ndlovu, Zanele

    2013-01-01

    This qualitative case study in a rural school in Umgungundlovu District in KwaZulu-Natal, South Africa, explored Grade 12 learners' mental constructions of mathematical knowledge during engagement with optimisation problems. Ten Grade 12 learners who do pure Mathematics participated, and data were collected through structured activity sheets and…

  5. Modeling and processing for next-generation big-data technologies with applications and case studies

    CERN Document Server

    Barolli, Leonard; Barolli, Admir; Papajorgji, Petraq

    2015-01-01

    This book covers the latest advances in Big Data technologies and provides the readers with a comprehensive review of the state-of-the-art in Big Data processing, analysis, analytics, and other related topics. It presents new models, algorithms, software solutions and methodologies, covering the full data cycle, from data gathering to their visualization and interaction, and includes a set of case studies and best practices. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data are also identified and presented throughout the book, which is intended for researchers, scholars, advanced students, software developers and practitioners working at the forefront in their field.

  6. Automatic optimisation of gamma dose rate sensor networks: The DETECT Optimisation Tool

    DEFF Research Database (Denmark)

    Helle, K.B.; Müller, T.O.; Astrup, Poul

    2014-01-01

    Fast delivery of comprehensive information on the radiological situation is essential for decision-making in nuclear emergencies. Most national radiological agencies in Europe employ gamma dose rate sensor networks to monitor radioactive pollution of the atmosphere. Sensor locations were often ... of the EU FP 7 project DETECT. It evaluates the gamma dose rates that a proposed set of sensors might measure in an emergency and uses this information to optimise the sensor locations. The gamma dose rates are taken from a comprehensive library of simulations of atmospheric radioactive plumes from 64 ... source locations. These simulations cover the whole European Union, so the DOT allows evaluation and optimisation of sensor networks for all EU countries, as well as evaluation of fencing sensors around possible sources. Users can choose from seven cost functions to evaluate the capability of a given ...

  7. Auto-optimisation for three-dimensional conformal radiotherapy of nasopharyngeal carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Wu, V.W.C. E-mail: orvinwu@polyu.edu.hk; Kwong, D.W.L.; Sham, J.S.T.; Mui, A.W.L

    2003-08-01

    Purpose: The purpose of this study was to evaluate the application of auto-optimisation in the treatment planning of three-dimensional conformal radiotherapy (3DCRT) of nasopharyngeal carcinoma (NPC). Methods: Twenty-nine NPC patients were planned by both forward planning and auto-optimisation methods. The forward plans, which consisted of three coplanar facial fields, were produced according to the routine planning criteria. The auto-optimised plans, which consisted of 5-15 (median 9) fields, were generated by the planning system after prescribing the dose requirements and the importance weightings of the planning target volume and organs at risk. Plans produced by the two planning methods were compared by the dose volume histogram, tumour control probability (TCP), conformity index and normal tissue complication probability (NTCP). Results: The auto-optimised plans reduced the average planner's time by over 35 min. They demonstrated better TCP and conformity index than the forward plans (P=0.03 and 0.04, respectively). Besides, the parotid gland and temporo-mandibular (TM) joint were better spared, with mean dose reductions of 31.8% and 17.7%, respectively. The slight trade-off was a mild dose increase in the spinal cord and brain stem, with their maximum doses remaining within the tolerance limits. Conclusions: The findings demonstrated the potential of auto-optimisation for improving target dose and parotid sparing in the 3DCRT of NPC with saving of the planner's time.
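    TCP and NTCP are standard radiobiological plan-comparison metrics. Purely as a generic illustration (the study's actual models and parameters are not reproduced here), the sketch below evaluates a Lyman-Kutcher-Burman-type NTCP from a differential DVH via the generalised EUD, with placeholder organ parameters.

```python
import numpy as np
from math import erf, sqrt

def geud(doses, volumes, n):
    """Generalised equivalent uniform dose from a differential DVH
    (relative volumes are normalised to sum to 1)."""
    v = np.asarray(volumes, dtype=float)
    v = v / v.sum()
    return float(np.sum(v * np.asarray(doses, dtype=float) ** (1.0 / n)) ** n)

def lkb_ntcp(doses, volumes, td50, m, n):
    """Lyman-Kutcher-Burman NTCP from a DVH; td50, m and n are organ-specific
    model parameters (the values below are placeholders, not the study's)."""
    t = (geud(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

# Hypothetical DVH bins (Gy) and parameters, purely for illustration.
ntcp = lkb_ntcp(doses=[10, 25, 40, 55], volumes=[0.4, 0.3, 0.2, 0.1],
                td50=39.9, m=0.40, n=1.0)
```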

  8. Auto-optimisation for three-dimensional conformal radiotherapy of nasopharyngeal carcinoma

    International Nuclear Information System (INIS)

    Wu, V.W.C.; Kwong, D.W.L.; Sham, J.S.T.; Mui, A.W.L.

    2003-01-01

    Purpose: The purpose of this study was to evaluate the application of auto-optimisation in the treatment planning of three-dimensional conformal radiotherapy (3DCRT) of nasopharyngeal carcinoma (NPC). Methods: Twenty-nine NPC patients were planned by both forward planning and auto-optimisation methods. The forward plans, which consisted of three coplanar facial fields, were produced according to the routine planning criteria. The auto-optimised plans, which consisted of 5-15 (median 9) fields, were generated by the planning system after prescribing the dose requirements and the importance weightings of the planning target volume and organs at risk. Plans produced by the two planning methods were compared by the dose volume histogram, tumour control probability (TCP), conformity index and normal tissue complication probability (NTCP). Results: The auto-optimised plans reduced the average planner's time by over 35 min. They demonstrated better TCP and conformity index than the forward plans (P=0.03 and 0.04, respectively). Besides, the parotid gland and temporo-mandibular (TM) joint were better spared, with mean dose reductions of 31.8% and 17.7%, respectively. The slight trade-off was a mild dose increase in the spinal cord and brain stem, with their maximum doses remaining within the tolerance limits. Conclusions: The findings demonstrated the potential of auto-optimisation for improving target dose and parotid sparing in the 3DCRT of NPC with saving of the planner's time.

  9. A big-data model for multi-modal public transportation with application to macroscopic control and optimisation

    Science.gov (United States)

    Faizrahnemoon, Mahsa; Schlote, Arieh; Maggi, Lorenzo; Crisostomi, Emanuele; Shorten, Robert

    2015-11-01

    This paper describes a Markov-chain-based approach to modelling multi-modal transportation networks. An advantage of the model is the ability to accommodate complex dynamics and handle huge amounts of data. The transition matrix of the Markov chain is built and the model is validated using the data extracted from a traffic simulator. A realistic test-case using multi-modal data from the city of London is given to further support the ability of the proposed methodology to handle big quantities of data. Then, we use the Markov chain as a control tool to improve the overall efficiency of a transportation network, and some practical examples are described to illustrate the potentials of the approach.
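    The core construction described here, building a row-stochastic transition matrix from observed movements and analysing it, can be sketched compactly; the example below uses made-up counts between three network states and recovers the stationary distribution by power iteration. It is an illustration of the modelling idea, not the validated London model.

```python
import numpy as np

def transition_matrix(counts):
    """Row-normalise observed move counts between network states
    (e.g. station/mode pairs) into a Markov transition matrix."""
    counts = np.asarray(counts, dtype=float)
    return counts / counts.sum(axis=1, keepdims=True)

def stationary_distribution(P, tol=1e-12, max_iter=10_000):
    """Stationary distribution by power iteration (assumes the chain is
    irreducible and aperiodic)."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(max_iter):
        nxt = pi @ P
        if np.abs(nxt - pi).max() < tol:
            break
        pi = nxt
    return pi

# Made-up counts of observed transitions between three network states.
C = [[120, 30, 10],
     [ 40, 80, 40],
     [ 10, 50, 90]]
P = transition_matrix(C)
print(stationary_distribution(P))
```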

  10. Spatial-structural interaction and strain energy structural optimisation

    NARCIS (Netherlands)

    Hofmeyer, H.; Davila Delgado, J.M.; Borrmann, A.; Geyer, P.; Rafiq, Y.; Wilde, de P.

    2012-01-01

    A research engine iteratively transforms spatial designs into structural designs and vice versa. Furthermore, spatial and structural designs are optimised. It is suggested to optimise a structural design by evaluating the strain energy of its elements and by then removing, adding, or changing the

  11. Energy Savings from Optimised In-Field Route Planning for Agricultural Machinery

    Directory of Open Access Journals (Sweden)

    Efthymios Rodias

    2017-10-01

    Full Text Available Various types of sensor technologies, such as machine vision and the global positioning system (GPS), have been implemented in the navigation of agricultural vehicles. Automated navigation systems have demonstrated the potential for executing optimised route plans for field area coverage. This paper presents an assessment of the reduction in energy requirements derived from the implementation of optimised field area coverage planning. The assessment comprises an analysis of the energy requirements and a comparison between the non-optimised and optimised plans for field area coverage across the whole sequence of operations required in two different cropping systems: Miscanthus and Switchgrass production. An algorithmic approach was developed for simulating the executed field operations under both non-optimised and optimised field-work patterns. From this, the corresponding time requirements were estimated as the basis of the subsequent energy cost analysis. Based on the results, the optimised routes reduce fuel energy consumption by up to 8%, embodied energy consumption by up to 7%, and total energy consumption by 3-8%.
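
    Purely as an illustration of the kind of energy bookkeeping involved (the fuel rate, energy densities and operation times below are invented placeholders, not values from the study), the comparison reduces to converting the operation time under each field-work pattern into fuel and embodied energy:

```python
# Hypothetical comparison of non-optimised vs optimised field-work patterns.
FUEL_ENERGY_MJ_PER_L = 38.0      # assumed energy content of diesel
EMBODIED_MJ_PER_H = 95.0         # assumed embodied energy of machinery per hour

def total_energy(operation_hours, fuel_rate_l_per_h=15.0):
    """Return (fuel, embodied, total) energy in MJ for a given operation time."""
    fuel = operation_hours * fuel_rate_l_per_h * FUEL_ENERGY_MJ_PER_L
    embodied = operation_hours * EMBODIED_MJ_PER_H
    return fuel, embodied, fuel + embodied

non_opt = total_energy(operation_hours=12.0)    # e.g. conventional field-work pattern
optimised = total_energy(operation_hours=11.1)  # e.g. optimised route plan

saving = 100 * (non_opt[2] - optimised[2]) / non_opt[2]
print(f"Total energy saving: {saving:.1f}%")
```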

  12. Optimisation of radiation protection

    International Nuclear Information System (INIS)

    1988-01-01

    Optimisation of radiation protection is one of the key elements in the current radiation protection philosophy. The present system of dose limitation was issued in 1977 by the International Commission on Radiological Protection (ICRP) and includes, in addition to the requirements of justification of practices and limitation of individual doses, the requirement that all exposures be kept as low as is reasonably achievable, taking social and economic factors into account. This last principle is usually referred to as optimisation of radiation protection, or the ALARA principle. The NEA Committee on Radiation Protection and Public Health (CRPPH) organised an ad hoc meeting, in liaison with the NEA committees on the safety of nuclear installations and radioactive waste management. Separate abstracts were prepared for individual papers presented at the meeting

  13. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness

    Science.gov (United States)

    Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen

    2016-01-01

    The rapid development of technology has made enormous volumes of data available and accessible anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term “Big Data”, which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding of what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing. PMID:27763525

  14. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness

    Directory of Open Access Journals (Sweden)

    Ho Ting Wong

    2016-10-01

    Full Text Available The rapid development of technology has made enormous volumes of data available and accessible anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term “Big Data”, which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding of what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing.

  15. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness.

    Science.gov (United States)

    Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen

    2016-10-17

    The rapid development of technology has made enormous volumes of data available and accessible anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term "Big Data", which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding of what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing.

  16. Day-ahead economic optimisation of energy storage

    NARCIS (Netherlands)

    Lampropoulos, I.; Garoufalis, P.; Bosch, van den P.P.J.; Groot, de R.J.W.; Kling, W.L.

    2014-01-01

    This article addresses the day-ahead economic optimisation of energy storage systems within the setting of electricity spot markets. The case study is about a lithium-ion battery system integrated in a low voltage distribution grid with residential customers and photovoltaic generation in the

  17. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  18. TEM turbulence optimisation in stellarators

    Science.gov (United States)

    Proll, J. H. E.; Mynick, H. E.; Xanthopoulos, P.; Lazerson, S. A.; Faber, B. J.

    2016-01-01

    With the advent of neoclassically optimised stellarators, optimising stellarators for turbulent transport is an important next step. The reduction of ion-temperature-gradient-driven turbulence has been achieved via shaping of the magnetic field, and the reduction of trapped-electron mode (TEM) turbulence is addressed in the present paper. Recent analytical and numerical findings suggest TEMs are stabilised when a large fraction of trapped particles experiences favourable bounce-averaged curvature. This is the case for example in Wendelstein 7-X (Beidler et al 1990 Fusion Technol. 17 148) and other Helias-type stellarators. Using this knowledge, a proxy function was designed to estimate the TEM dynamics, allowing optimal configurations for TEM stability to be determined with the STELLOPT (Spong et al 2001 Nucl. Fusion 41 711) code without extensive turbulence simulations. A first proof-of-principle optimised equilibrium stemming from the TEM-dominated stellarator experiment HSX (Anderson et al 1995 Fusion Technol. 27 273) is presented for which a reduction of the linear growth rates is achieved over a broad range of the operational parameter space. As an important consequence of this property, the turbulent heat flux levels are reduced compared with the initial configuration.

  19. Analysis and optimisation of heterogeneous real-time embedded systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2005-01-01

    . The success of such new design methods depends on the availability of analysis and optimisation techniques. Analysis and optimisation techniques for heterogeneous real-time embedded systems are presented in the paper. The authors address in more detail a particular class of such systems called multi...... of application messages to frames. Optimisation heuristics for frame packing aimed at producing a schedulable system are presented. Extensive experiments and a real-life example show the efficiency of the frame-packing approach....

  20. Analysis and optimisation of heterogeneous real-time embedded systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    . The success of such new design methods depends on the availability of analysis and optimisation techniques. Analysis and optimisation techniques for heterogeneous real-time embedded systems are presented in the paper. The authors address in more detail a particular class of such systems called multi...... of application messages to frames. Optimisation heuristics for frame packing aimed at producing a schedulable system are presented. Extensive experiments and a real-life example show the efficiency of the frame-packing approach....

  1. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  2. A study and analysis of recommendation systems for location-based social network (LBSN) with big data

    Directory of Open Access Journals (Sweden)

    Murale Narayanan

    2016-03-01

    Full Text Available Recommender systems play an important role in our day-to-day life. A recommender system automatically suggests an item to a user that he/she might be interested in. Small-scale datasets are used to provide recommendations based on location, but in real time, the volume of data is large. We have selected the Foursquare dataset to study the need for big data in recommendation systems for location-based social networks (LBSN). A few quality parameters like parallel processing and multimodal interface have been selected to study the need for big data in recommender systems. This paper provides a study and analysis of quality parameters of recommendation systems for LBSN with big data.

  3. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  4. Acoustic Resonator Optimisation for Airborne Particle Manipulation

    Science.gov (United States)

    Devendran, Citsabehsan; Billson, Duncan R.; Hutchins, David A.; Alan, Tuncay; Neild, Adrian

    Advances in micro-electromechanical systems (MEMS) technology and biomedical research necessitate micro-machined manipulators to capture, handle and position delicate micron-sized particles. To this end, a parallel plate acoustic resonator system has been investigated for the manipulation and entrapment of micron-sized particles in air. Numerical and finite element modelling was performed to optimise the design of the layered acoustic resonator. To obtain an optimised resonator design, careful consideration of the effects of thickness and material properties is required. Furthermore, the effect of acoustic attenuation, which is dependent on frequency, is also considered within this study, leading to an optimum operational frequency range. Finally, experimental results demonstrated good levitation and capture of particles with a range of properties and sizes, down to 14.8 μm.

  5. Multiobjective optimisation of energy systems and building envelope retrofit in a residential community

    International Nuclear Information System (INIS)

    Wu, Raphael; Mavromatidis, Georgios; Orehounig, Kristina; Carmeliet, Jan

    2017-01-01

    Highlights: • Simultaneous optimisation of building envelope retrofit and energy systems. • Retrofit and energy system changes interact and should be considered simultaneously. • Case study quantifies cost-GHG emission tradeoffs for different retrofit options. - Abstract: In this paper, a method for a multi-objective and simultaneous optimisation of building energy systems and retrofit is presented. Tailored to be suitable for the diverse range of existing buildings in terms of age, size, and use, it combines dynamic energy demand simulation to explore individual retrofit scenarios with an energy hub optimisation. Implemented as an epsilon-constrained mixed integer linear program (MILP), the optimisation matches envelope retrofit with renewable and high efficiency energy supply technologies such as biomass boilers, heat pumps, photovoltaic and solar thermal panels to minimise life cycle cost and greenhouse gas (GHG) emissions. Due to its multi-objective, integrated assessment of building transformation options and its ability to capture both individual building characteristics and trends within a neighbourhood, this method aims to provide developers, neighbourhood and town policy makers with the necessary information to make adequate decisions. Our method is deployed in a case study of typical residential buildings in the Swiss village of Zernez, simulating energy demands in EnergyPlus and solving the optimisation problem with CPLEX. Although common trade-offs in energy system and retrofit choice can be observed, optimisation results suggest that the diversity in building age and size leads to optimal strategies for retrofitting and building system solutions, which are specific to different categories. With this method, GHG emissions of the entire community can be reduced by up to 76% at a cost increase of 3% compared to the current emission levels, if an optimised solution is selected for each building category.
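
    A toy version of the epsilon-constraint step, sketched with the PuLP modelling library and two made-up candidate technologies (the study's actual MILP covers retrofit measures and full energy-hub dispatch and is solved with CPLEX): cost is minimised while the emission objective is turned into a progressively tightened constraint, tracing a cost-emissions Pareto front.

```python
import pulp

# Placeholder techno-economic data for two candidate heat supply options (assumed values).
techs = ["biomass_boiler", "heat_pump"]
cost = {"biomass_boiler": 900.0, "heat_pump": 1200.0}   # cost per unit capacity
emis = {"biomass_boiler": 0.4, "heat_pump": 0.1}        # tCO2 per unit capacity
demand = 100.0                                          # required capacity

def solve_for_epsilon(eps_emissions):
    """Minimise cost subject to an emission cap (epsilon-constraint)."""
    prob = pulp.LpProblem("energy_hub_retrofit", pulp.LpMinimize)
    build = pulp.LpVariable.dicts("build", techs, cat="Binary")
    size = pulp.LpVariable.dicts("size", techs, lowBound=0)

    prob += pulp.lpSum(cost[t] * size[t] for t in techs)                   # objective: cost
    prob += pulp.lpSum(size[t] for t in techs) >= demand                   # meet demand
    prob += pulp.lpSum(emis[t] * size[t] for t in techs) <= eps_emissions  # emission cap
    for t in techs:
        prob += size[t] <= demand * build[t]   # link sizing to the binary build decision

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return pulp.value(prob.objective)

# Sweep epsilon to trace a cost-emissions Pareto front.
for eps in (40, 30, 20, 15):
    print(eps, solve_for_epsilon(eps))
```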

  6. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance that the term Big Data has acquired, this research sought to study and analyse exhaustively the state of the art of Big Data; in addition, as a second objective, it analysed the characteristics, tools, technologies, models and standards related to Big Data, and finally it sought to identify the most relevant characteristics in the management of Big Data, so as to provide an overall view of the central topic of the research. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; describing Big Data technologies; introducing some of the NoSQL databases, which are the ones that allow data in unstructured formats to be processed; and showing data models and the technologies for analysing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables were manipulated, and exploratory, because this research is a first step towards understanding the Big Data landscape.

  7. Interpretation of optimisation in the context of a disposal facility for long-lived radioactive waste

    International Nuclear Information System (INIS)

    1999-01-01

    Full text: Guidance on the Requirements for Authorisation (the GRA) issued by the Environment Agency for England and Wales requires that all disposals of radioactive waste are undertaken in a manner consistent with four principles for the protection of the public. Among these is a principle of Optimisation, that: 'The radiological detriment to members of the public that may result from the disposal of radioactive waste shall be as low as reasonably achievable, economic and social factors being taken into account'. The principle of optimisation is widely accepted and has been discussed in both UK national policy and guidance and in documents from international organisations. The practical interpretation of optimisation in the context of post-closure safety of radioactive waste repositories is, however, still open to question. In particular, the strategies and procedures that a developer might employ to implement optimisation in the siting and development of a repository, and demonstrate optimisation in a safety case, are not defined. In preparation for its role of regulatory review, the Agency has undertaken a pilot study to explore the possible interpretations of optimisation stemming from the GRA, and to identify possible strategies and procedures that a developer might follow. A review has been undertaken of UK regulatory guidance and related documents, and also international guidance, referring to optimisation in relation to radioactive waste disposal facilities. In addition, diverse examples of the application of optimisation have been identified in the international and UK performance assessment literature. A one-day meeting was organised bringing together Agency staff and technical experts with different experiences and perspectives on the subject of optimisation in the context of disposal facilities for radioactive waste. This meeting identified and discussed key issues and possible approaches to optimisation, and specifically: (1) The meaning of

  8. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.

  9. Time varying acceleration coefficients particle swarm optimisation (TVACPSO): A new optimisation algorithm for estimating parameters of PV cells and modules

    International Nuclear Information System (INIS)

    Jordehi, Ahmad Rezaee

    2016-01-01

    Highlights: • A modified PSO has been proposed for parameter estimation of PV cells and modules. • In the proposed modified PSO, acceleration coefficients are changed during run. • The proposed modified PSO mitigates premature convergence problem. • Parameter estimation problem has been solved for both PV cells and PV modules. • The results show that the proposed PSO outperforms other state of the art algorithms. - Abstract: Estimating circuit model parameters of PV cells/modules represents a challenging problem. The PV cell/module parameter estimation problem is typically translated into an optimisation problem and is solved by metaheuristic optimisation algorithms. Particle swarm optimisation (PSO) is considered as a popular and well-established optimisation algorithm. Despite all its advantages, PSO suffers from a premature convergence problem, meaning that it may get trapped in local optima. Personal and social acceleration coefficients are two control parameters that, due to their effect on explorative and exploitative capabilities, play important roles in the computational behavior of PSO. In this paper, in an attempt toward premature convergence mitigation in PSO, its personal acceleration coefficient is decreased during the course of run, while its social acceleration coefficient is increased. In this way, an appropriate tradeoff between explorative and exploitative capabilities of PSO is established during the course of run and the premature convergence problem is significantly mitigated. The results show that in parameter estimation of PV cells and modules, the proposed time varying acceleration coefficients PSO (TVACPSO) offers more accurate parameters than conventional PSO, teaching learning-based optimisation (TLBO) algorithm, imperialistic competitive algorithm (ICA), grey wolf optimisation (GWO), water cycle algorithm (WCA), pattern search (PS) and Newton algorithm. For validation of the proposed methodology, parameter estimation has been done both for
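
    A compact sketch of the time-varying acceleration coefficients idea (the coefficient ranges, swarm settings and sphere test function below are illustrative assumptions, not the paper's PV parameter-estimation setup): the personal coefficient c1 is ramped down and the social coefficient c2 ramped up over the run.

```python
import numpy as np

def tvac_pso(objective, dim, n_particles=30, iters=200,
             c1_range=(2.5, 0.5), c2_range=(0.5, 2.5), w=0.7, bounds=(-5, 5)):
    """Minimise `objective` with PSO whose acceleration coefficients vary over time."""
    lo, hi = bounds
    x = np.random.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_val)].copy()

    for t in range(iters):
        frac = t / (iters - 1)
        c1 = c1_range[0] + (c1_range[1] - c1_range[0]) * frac  # decreases: explore early
        c2 = c2_range[0] + (c2_range[1] - c2_range[0]) * frac  # increases: converge late
        r1, r2 = np.random.rand(n_particles, dim), np.random.rand(n_particles, dim)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Example: sphere function as a stand-in for a PV model-fitting error.
best, best_val = tvac_pso(lambda z: float(np.sum(z ** 2)), dim=5)
print(best_val)
```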

  10. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  11. Share-of-Surplus Product Line Optimisation with Price Levels

    Directory of Open Access Journals (Sweden)

    X. G. Luo

    2014-01-01

    Full Text Available Kraus and Yano (2003) established the share-of-surplus product line optimisation model and developed a heuristic procedure for this nonlinear mixed-integer optimisation model. In their model, price of a product is defined as a continuous decision variable. However, because product line optimisation is a planning process in the early stage of product development, pricing decisions usually are not very precise. In this research, a nonlinear integer programming share-of-surplus product line optimization model that allows the selection of candidate price levels for products is established. The model is further transformed into an equivalent linear mixed-integer optimisation model by applying linearisation techniques. Experimental results in different market scenarios show that the computation time of the transformed model is much less than that of the original model.

  12. Multiobjective optimisation of bogie suspension to boost speed on curves

    Science.gov (United States)

    Milad Mousavi-Bideleh, Seyed; Berbyuk, Viktor

    2016-01-01

    To improve safety and maximum admissible speed on different operational scenarios, multiobjective optimisation of bogie suspension components of a one-car railway vehicle model is considered. The vehicle model has 50 degrees of freedom and is developed in multibody dynamics software SIMPACK. Track shift force, running stability, and risk of derailment are selected as safety objective functions. The improved maximum admissible speeds of the vehicle on curves are determined based on track plane accelerations up to 1.5 m/s². To reduce the number of design parameters for optimisation and improve the computational efficiency, a global sensitivity analysis is carried out using the multiplicative dimensional reduction method (M-DRM). A multistep optimisation routine based on genetic algorithm (GA) and MATLAB/SIMPACK co-simulation is executed at three levels. The bogie conventional secondary and primary suspension components are chosen as the design parameters in the first two steps, respectively. In the last step semi-active suspension is in focus. The input electrical current to magnetorheological yaw dampers is optimised to guarantee an appropriate safety level. Semi-active controllers are also applied and the respective effects on bogie dynamics are explored. The safety Pareto optimised results are compared with those associated with in-service values. The global sensitivity analysis and multistep approach significantly reduced the number of design parameters and improved the computational efficiency of the optimisation. Furthermore, using the optimised values of the design parameters makes it possible to run the vehicle up to 13% faster on curves while a satisfactory safety level is guaranteed. The results obtained can be used in Pareto optimisation and active bogie suspension design problems.

  13. HVAC system optimisation-in-building section

    Energy Technology Data Exchange (ETDEWEB)

    Lu, L.; Cai, W.; Xie, L.; Li, S.; Soh, Y.C. [School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore (Singapore)

    2004-07-01

    This paper presents a practical method to optimise the in-building section of centralised Heating, Ventilation and Air-Conditioning (HVAC) systems, which consists of indoor air loops and chilled water loops. First, through component characteristic analysis, mathematical models associated with cooling loads and energy consumption for heat exchangers and energy consuming devices are established. By considering the variation of the cooling load of each end user, an adaptive neuro-fuzzy inference system (ANFIS) is employed to model duct and pipe networks and obtain optimal differential pressure (DP) set points based on limited sensor information. A mixed-integer nonlinear constrained optimization of system energy is formulated and solved by a modified genetic algorithm. The main feature of our paper is a systematic approach to optimizing the overall system energy consumption rather than that of individual components. A simulation study for a typical centralized HVAC system is provided to compare the proposed optimisation method with traditional ones. The results show that the proposed method indeed improves the system performance significantly. (author)
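
    A schematic genetic-algorithm loop over a made-up energy surrogate (one integer decision, the number of running chillers, and one continuous decision, the duct differential-pressure set point); it only illustrates the mixed-integer search structure, not the paper's ANFIS-based network models or its modified GA.

```python
import random

# Hypothetical surrogate of total HVAC energy (kW) as a function of the number
# of running chillers (integer, 1..4) and the duct DP set point (kPa, 0.3..1.2).
def energy_kw(n_chillers, dp_set):
    part_load_penalty = (2.5 - n_chillers) ** 2 * 4.0   # assumed chiller efficiency curve
    fan_power = 18.0 * dp_set ** 1.5                    # assumed fan-law behaviour
    unmet_load_penalty = 0.0 if n_chillers * 350 >= 900 else 500.0
    return 40.0 * n_chillers + part_load_penalty + fan_power + unmet_load_penalty

def random_individual():
    return (random.randint(1, 4), random.uniform(0.3, 1.2))

def mutate(ind):
    n, dp = ind
    if random.random() < 0.5:
        n = min(4, max(1, n + random.choice((-1, 1))))          # perturb integer gene
    else:
        dp = min(1.2, max(0.3, dp + random.gauss(0, 0.1)))      # perturb continuous gene
    return (n, dp)

def ga(pop_size=30, generations=60):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: energy_kw(*ind))
        survivors = pop[: pop_size // 2]                        # truncation selection
        children = [mutate(random.choice(survivors)) for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=lambda ind: energy_kw(*ind))

best = ga()
print(best, energy_kw(*best))
```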

  14. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension that is related to storage, analytics and visualization of big data; the human aspects of big data; and, in addition, the process management dimension that involves in a technological and business approach the aspects of big data management.

  15. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  16. Thermodynamic optimisation and analysis of four Kalina cycle layouts for high temperature applications

    International Nuclear Information System (INIS)

    Modi, Anish; Haglind, Fredrik

    2015-01-01

    The Kalina cycle has seen increased interest in the last few years as an efficient alternative to the conventional steam Rankine cycle. However, the available literature gives little information on the algorithms to solve or optimise this inherently complex cycle. This paper presents a detailed approach to solve and optimise a Kalina cycle for high temperature (a turbine inlet temperature of 500 °C) and high pressure (over 100 bar) applications using a computationally efficient solution algorithm. A central receiver solar thermal power plant with direct steam generation was considered as a case study. Four different layouts for the Kalina cycle based on the number and/or placement of the recuperators in the cycle were optimised and compared based on performance parameters such as the cycle efficiency and the cooling water requirement. The cycles were modelled in steady state and optimised with the maximisation of the cycle efficiency as the objective function. It is observed that the different cycle layouts result in different regions for the optimal value of the turbine inlet ammonia mass fraction. Out of the four compared layouts, the most complex layout KC1234 gives the highest efficiency. The cooling water requirement is closely related to the cycle efficiency, i.e., the better the efficiency, the lower is the cooling water requirement. - Highlights: • Detailed methodology for solving and optimising Kalina cycle for high temperature applications. • A central receiver solar thermal power plant with direct steam generation considered as a case study. • Four Kalina cycle layouts based on the placement of recuperators optimised and compared

  17. Approaches and challenges to optimising primary care teams’ electronic health record usage

    Directory of Open Access Journals (Sweden)

    Nancy Pandhi

    2014-07-01

    Full Text Available Background: Although the presence of an electronic health record (EHR) alone does not ensure high quality, efficient care, few studies have focused on the work of those charged with optimising use of existing EHR functionality. Objective: To examine the approaches used and challenges perceived by analysts supporting the optimisation of primary care teams’ EHR use at a large U.S. academic health care system. Methods: A qualitative study was conducted. Optimisation analysts and their supervisor were interviewed and data were analysed for themes. Results: Analysts needed to reconcile the tension created by organisational mandates focused on the standardisation of EHR processes with the primary care teams’ demand for EHR customisation. They gained an understanding of health information technology (HIT) leadership’s and primary care team’s goals through attending meetings, reading meeting minutes and visiting with clinical teams. Within what was organisationally possible, EHR education could then be tailored to fit team needs. Major challenges were related to organisational attempts to standardise EHR use despite varied clinic contexts, personnel readiness and technical issues with the EHR platform. Forcing standardisation upon clinical needs that current EHR functionality could not satisfy was difficult. Conclusions: Dedicated optimisation analysts can add value to health systems through playing a mediating role between HIT leadership and care teams. Our findings imply that EHR optimisation should be performed with an in-depth understanding of the workflow, cognitive and interactional activities in primary care.

  18. Establishing Local Reference Dose Values and Optimisation Strategies

    International Nuclear Information System (INIS)

    Connolly, P.; Moores, B.M.

    2000-01-01

    The revised EC Patient Directive 97/43 EURATOM introduces the concepts of clinical audit, diagnostic reference levels and optimisation of radiation protection in diagnostic radiology. The application of reference dose levels in practice involves the establishment of reference dose values as actual measurable operational quantities. These values should then form part of an ongoing optimisation and audit programme against which routine performance can be compared. The CEC Quality Criteria for Radiographic Images provides guidance reference dose values against which local performance can be compared. In many cases these values can be improved upon quite considerably. This paper presents the results of a local initiative in the North West of the UK aimed at establishing local reference dose values for a number of major hospital sites. The purpose of this initiative is to establish a foundation for both optimisation strategies and clinical audit as an ongoing and routine practice. The paper presents results from an ongoing trial involving patient dose measurements for several radiological examinations upon the sites. The results of an attempt to establish local reference dose values from measured dose values and to employ them in optimisation strategies are presented. In particular emphasis is placed on the routine quality control programmes necessary to underpin this strategy including the effective data management of results from such programmes and how they can be employed to optimisation practices. (author)
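
    As a general illustration of how local reference dose values are commonly derived (a widely used convention, not necessarily the exact procedure followed in this trial), the value for an examination can be set at the third quartile of the mean doses measured per room, with rooms above it flagged as candidates for optimisation:

```python
import numpy as np

# Hypothetical mean entrance surface doses (mGy) per X-ray room for one
# examination type; real values would come from the routine QC programme.
room_mean_doses = np.array([2.1, 1.8, 3.4, 2.7, 2.2, 4.1, 1.9, 2.9])

# A common convention sets the local reference dose value at the 75th percentile.
local_reference_value = np.percentile(room_mean_doses, 75)
print(f"Local reference dose value: {local_reference_value:.2f} mGy")

# Rooms exceeding the reference value are candidates for an optimisation review.
for room, dose in enumerate(room_mean_doses, start=1):
    if dose > local_reference_value:
        print(f"Room {room}: {dose:.1f} mGy exceeds the local reference value")
```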

  19. For Time-Continuous Optimisation

    DEFF Research Database (Denmark)

    Heinrich, Mary Katherine; Ayres, Phil

    2016-01-01

    Strategies for optimisation in design normatively assume an artefact end-point, disallowing continuous architecture that engages living systems, dynamic behaviour, and complex systems. In our Flora Robotica investigations of symbiotic plant-robot bio-hybrids, we require computational tools...

  20. Particle Swarm Optimisation with Spatial Particle Extension

    DEFF Research Database (Denmark)

    Krink, Thiemo; Vesterstrøm, Jakob Svaneborg; Riget, Jacques

    2002-01-01

    In this paper, we introduce spatial extension to particles in the PSO model in order to overcome premature convergence in iterative optimisation. The standard PSO and the new model (SEPSO) are compared w.r.t. performance on well-studied benchmark problems. We show that the SEPSO indeed managed...

  1. Advanced Research and Data Methods in Women's Health: Big Data Analytics, Adaptive Studies, and the Road Ahead.

    Science.gov (United States)

    Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika

    2017-02-01

    Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.

  2. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  3. Methodology implementation for multi objective optimisation for nuclear fleet evolution scenarios

    International Nuclear Information System (INIS)

    Freynet, David

    2016-01-01

    The issue of the evolution of the French nuclear fleet can be considered through the study of nuclear transition scenarios. These studies are of paramount importance as their results can greatly affect the decision-making process, given that they take into account industrial concerns, investments, time, and nuclear system complexity. Such studies can be performed with the COSI code (developed at the CEA/DEN), which enables the calculation of matter inventories and fluxes across the fuel cycle (nuclear reactors and associated facilities), especially when coupled with the CESAR depletion code. The studies performed today with COSI require the definition of the various scenarios' input parameters, in order to fulfil different objectives such as minimising natural uranium consumption, waste production and so on. These parameters concern the quantities and the scheduling of spent fuel destined for reprocessing, and the number, the type and the commissioning dates of deployed reactors. This work aims to develop, validate and apply an optimisation methodology coupled with COSI, in order to determine optimal nuclear transition scenarios for a multi-objective platform. Firstly, this methodology is based on the acceleration of scenario evaluation, enabling the use of optimisation methods in a reasonable time-frame. With this goal in mind, artificial neural network irradiation surrogate models are created with the URANIE platform (developed at the CEA/DEN) and are implemented within COSI. The next step in this work is to use, adapt and compare different optimisation methods, such as URANIE's genetic algorithm and particle swarm methods, in order to define a methodology suited to this type of study. This methodology development is based on an incremental approach which progressively adds objectives, constraints and decision variables to the optimisation problem definition. The variables added, which are related to reactor deployment and spent fuel reprocessing strategies, are chosen
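
    A generic sketch of the surrogate idea only (a scikit-learn MLP fitted to a synthetic response surface standing in for an expensive depletion calculation; the thesis itself uses URANIE neural networks coupled to COSI/CESAR):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for an expensive irradiation/depletion calculation:
# inputs could be burn-up and initial enrichment, output a spent-fuel isotope mass.
X = rng.uniform([10.0, 3.0], [60.0, 5.0], size=(500, 2))      # (burn-up GWd/t, enrichment %)
y = 0.8 * X[:, 0] - 5.0 * X[:, 1] + 0.02 * X[:, 0] * X[:, 1]  # made-up response surface

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
surrogate.fit(X, y)

# The cheap surrogate can now be evaluated thousands of times inside a
# scenario optimiser instead of calling the full physics code.
candidate = np.array([[45.0, 4.2]])
print(surrogate.predict(candidate))
```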

  4. Numerical optimisation of friction stir welding: review of future challenges

    DEFF Research Database (Denmark)

    Tutum, Cem Celal; Hattel, Jesper Henri

    2011-01-01

    During the last decade, the combination of increasingly more advanced numerical simulation software with high computational power has resulted in models for friction stir welding (FSW), which have improved the understanding of the determining physical phenomena behind the process substantially....... This has made optimisation of certain process parameters possible and has in turn led to better performing friction stir welded products, thus contributing to a general increase in the popularity of the process and its applications. However, most of these optimisation studies do not go well beyond manual...

  5. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    Science.gov (United States)

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived-notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines.

  6. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Science.gov (United States)

    Raman, Rajeev; Rajanikanth, V; Palaniappan, Raghavan U M; Lin, Yi-Pin; He, Hongxuan; McDonough, Sean P; Sharma, Yogendra; Chang, Yung-Fu

    2010-12-29

    Many bacterial surface exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface exposed proteins containing Bacterial immunoglobulin like (Big) domains. The function of proteins which contain the Big fold is not known. Based on the possible similarities of immunoglobulin and βγ-crystallin folds, we here explore the important question whether Ca²+ binds to a Big domain, which would provide a novel functional role of the proteins containing the Big fold. We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted as Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All the selected four domains bind Ca²+ with dissociation constants of 2-4 µM. Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of selected Big domains is similar and follows a two-state model, suggesting the similarity in their fold. We demonstrate that the Lig proteins are Ca²+-binding proteins, with Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying the proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is a part of many proteins in the bacterial kingdom, we suggest a possible function of these proteins via Ca²+ binding.

  7. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a registration culture, and IT-competent employees and customers, which make a leading position possible, but only if companies get ready for the next big data wave.

  8. Current applications of big data in obstetric anesthesiology.

    Science.gov (United States)

    Klumpner, Thomas T; Bauer, Melissa E; Kheterpal, Sachin

    2017-06-01

    The narrative review aims to highlight several recently published 'big data' studies pertinent to the field of obstetric anesthesiology. Big data has been used to study rare outcomes, to identify trends within the healthcare system, to identify variations in practice patterns, and to highlight potential inequalities in obstetric anesthesia care. Big data studies have helped define the risk of rare complications of obstetric anesthesia, such as the risk of neuraxial hematoma in thrombocytopenic parturients. Also, large national databases have been used to better understand trends in anesthesia-related adverse events during cesarean delivery as well as outline potential racial/ethnic disparities in obstetric anesthesia care. Finally, real-time analysis of patient data across a number of disparate health information systems through the use of sophisticated clinical decision support and surveillance systems is one promising application of big data technology on the labor and delivery unit. 'Big data' research has important implications for obstetric anesthesia care and warrants continued study. Real-time electronic surveillance is a potentially useful application of big data technology on the labor and delivery unit.

  9. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein.
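
    The second pitfall can be made concrete: for identically distributed observations with common variance σ² and pairwise correlation ρ, the variance of the sample mean is (σ²/n)(1 + (n - 1)ρ), so weak but pervasive dependence keeps the uncertainty far above the naive σ²/n. A small simulation with assumed σ = 1 and ρ = 0.05 illustrates the gap:

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho, sigma, reps = 10_000, 0.05, 1.0, 500

# Equicorrelated samples: x_i = sqrt(rho)*z_common + sqrt(1-rho)*z_i
means = []
for _ in range(reps):
    z_common = rng.normal()
    x = sigma * (np.sqrt(rho) * z_common + np.sqrt(1 - rho) * rng.normal(size=n))
    means.append(x.mean())

naive_var = sigma**2 / n                         # what i.i.d. theory would predict
dependent_var = sigma**2 / n * (1 + (n - 1) * rho)  # accounts for the weak dependence
print(f"empirical variance of the mean: {np.var(means):.4f}")
print(f"naive (independent) prediction: {naive_var:.6f}")
print(f"prediction with correlation:    {dependent_var:.4f}")
```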

  10. Optimisation in X-ray and Molecular Imaging 2015

    International Nuclear Information System (INIS)

    Baath, Magnus; Hoeschen, Christoph; Mattsson, Soeren; Mansson, Lars Gunnar

    2016-01-01

    This issue of Radiation Protection Dosimetry is based on contributions to Optimisation in X-ray and Molecular Imaging 2015 - the 4th Malmoe Conference on Medical Imaging (OXMI 2015). The conference was jointly organised by members of former and current research projects supported by the European Commission EURATOM Radiation Protection Research Programme, in cooperation with the Swedish Society for Radiation Physics. The conference brought together over 150 researchers and other professionals from hospitals, universities and industries with interests in different aspects of the optimisation of medical imaging. More than 100 presentations were given at this international gathering of medical physicists, radiologists, engineers, technicians, nurses and educational researchers. Additionally, invited talks were offered by world-renowned experts on radiation protection, spectral imaging and medical image perception, thus covering several important aspects of the generation and interpretation of medical images. The conference consisted of 13 oral sessions and a poster session, covering topics which, as reflected by the conference title, are connected by their focus on the optimisation of the use of ionising radiation in medical imaging. The conference included technology-specific topics such as computed tomography and tomosynthesis, but also generic issues of interest for the optimisation of all medical imaging, such as image perception and quality assurance. Radiation protection was covered by e.g. sessions on patient dose benchmarking and occupational exposure. Technically-advanced topics such as modelling, Monte Carlo simulation, reconstruction, classification, and segmentation were seen taking advantage of recent developments of hardware and software, showing that the optimisation community is at the forefront of technology and adapts well to new requirements. These peer-reviewed proceedings, representing a continuation of a series of selected reports from meetings in the field of medical imaging

  11. Optimisation of the LHCb detector

    CERN Document Server

    Hierck, R H

    2003-01-01

    This thesis describes a comparison of the LHCb classic and LHCb light concept from a tracking perspective. The comparison includes the detector occupancies, the various pattern recognition algorithms and the reconstruction performance. The final optimised LHCb setup is used to study the physics performance of LHCb for the Bs->DsK and Bs->DsPi decay channels. This includes both the event selection and a study of the sensitivity to the Bs oscillation frequency Δm_s, the Bs lifetime difference ΔΓ_s, and the CP parameter γ - 2δγ.

  12. User perspectives in public transport timetable optimisation

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    The present paper deals with timetable optimisation from the perspective of minimising the waiting time experienced by passengers when transferring either to or from a bus. Due to its inherent complexity, this bi-level minimisation problem is extremely difficult to solve mathematically, since tim...... on the large-scale public transport network in Denmark. The timetable optimisation approach yielded a yearly reduction in weighted waiting time equivalent to approximately 45 million Danish kroner (9 million USD)....
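
    A minimal illustration of the lower-level quantity being minimised (the arrival and departure times below are synthetic minutes-past-the-hour, not Danish timetable data): the transfer waiting time is the gap between each feeder arrival plus a minimum transfer time and the next feasible departure, and shifting a timetable changes its total.

```python
import bisect

def transfer_wait(arrivals, departures, min_transfer=2):
    """Total waiting time (minutes) if one passenger transfers at each arrival.

    arrivals, departures: sorted lists of minutes past the hour (synthetic example).
    min_transfer: assumed minimum walking/transfer time between services.
    """
    total = 0
    for a in arrivals:
        earliest = a + min_transfer
        i = bisect.bisect_left(departures, earliest)
        if i == len(departures):                  # no departure left this hour: wrap to next cycle
            earliest_dep = departures[0] + 60
        else:
            earliest_dep = departures[i]
        total += earliest_dep - earliest
    return total

arrivals = [3, 13, 23, 33, 43, 53]        # feeder train arrivals
departures = [10, 30, 50]                 # connecting bus departures (current offset)
shifted = [d + 7 for d in departures]     # candidate timetable shift of +7 minutes

print(transfer_wait(arrivals, departures), transfer_wait(arrivals, shifted))
```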

  13. Study of alexithymia trait based on Big-Five Personality Dimensions

    Directory of Open Access Journals (Sweden)

    Rasoul Heshmati

    2017-12-01

    Full Text Available The purpose of this research was to study the relationship between Big Five personality traits and alexithymia and to determine how alexithymic individuals differ from non-alexithymic individuals in these personality traits among university students. In the present study, 150 university students at Tabriz University were selected and asked to complete the NEO Five-Factor Inventory (NEO-FFI) and the Toronto Alexithymia Scale (TAS-20). Results showed significant negative relationships of conscientiousness and openness to experience with alexithymia, and a significant positive relationship between neuroticism and alexithymia. There was also a significant difference between alexithymic and non-alexithymic individuals in neuroticism and openness to experience. On the one hand, these results suggest that neuroticism, conscientiousness and openness to experience are determinants of alexithymia; on the other hand, a high level of neuroticism and a low level of openness to experience characterise alexithymic people in terms of the Big Five. Therefore, it can be concluded that high neuroticism and low openness to experience are characteristic traits of alexithymic individuals.

  14. State of the art concerning optimum location of capacitors and studying the exhaustive search approach for optimising a given solution

    Directory of Open Access Journals (Sweden)

    Sergio Raúl Rivera Rodríguez

    2004-09-01

    Full Text Available The present article reviews the state of the art of optimum capacitor location in distribution systems, providing guidelines for planners engaged in optimising voltage profiles and controlling reactive power in distribution networks. Optimising a given solution by exhaustive search is studied here; the dimensions of the problem are determined by evaluating the different possibilities for solving it, and the computational times and requirements of the solution algorithm are assessed. An example system (9-node, IEEE) is used to illustrate the exhaustive search approach; it was found that methods used in the literature on this topic do not always lead to the optimum solution.
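
    The exhaustive-search idea in its simplest form, sketched with a placeholder objective (annual_cost below is invented; a real study would evaluate losses and the voltage profile from a power-flow calculation for every combination): enumerate all combinations of candidate buses and keep the cheapest.

```python
from itertools import combinations

candidate_nodes = list(range(1, 10))   # buses of a 9-node example system
max_capacitors = 3

def annual_cost(placement):
    """Placeholder objective standing in for power-flow-based loss plus capacitor cost."""
    # Purely illustrative: pretends nodes far from the source benefit more.
    loss_reduction = sum(0.8 * node for node in placement)
    capacitor_cost = 3.0 * len(placement)
    return capacitor_cost - loss_reduction

# Enumerate every placement of 1..max_capacitors capacitors and keep the best.
best = min(
    (combo for r in range(1, max_capacitors + 1)
     for combo in combinations(candidate_nodes, r)),
    key=annual_cost,
)
print("Best placement:", best, "objective:", annual_cost(best))
```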

  15. Big data and biomedical informatics: a challenging opportunity.

    Science.gov (United States)

    Bellazzi, R

    2014-05-22

    Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations.

  16. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply nonexistent. Large volumes of data allow organizations to tap, in real time, the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  17. Big Data Analytics for Smart Manufacturing: Case Studies in Semiconductor Manufacturing

    Directory of Open Access Journals (Sweden)

    James Moyne

    2017-07-01

    Full Text Available Smart manufacturing (SM) is a term generally applied to the improvement in manufacturing operations through integration of systems, linking of physical and cyber capabilities, and taking advantage of information including leveraging the big data evolution. SM adoption has been occurring unevenly across industries, thus there is an opportunity to look to other industries to determine solution and roadmap paths for industries such as biochemistry or biology. The big data evolution affords an opportunity for managing significantly larger amounts of information and acting on it with analytics for improved diagnostics and prognostics. The analytics approaches can be defined in terms of dimensions to understand their requirements and capabilities, and to determine technology gaps. The semiconductor manufacturing industry has been taking advantage of the big data and analytics evolution by improving existing capabilities such as fault detection, and supporting new capabilities such as predictive maintenance. For most of these capabilities: (1) data quality is the most important big data factor in delivering high quality solutions; and (2) incorporating subject matter expertise in analytics is often required for realizing effective on-line manufacturing solutions. In the future, an improved big data environment incorporating smart manufacturing concepts such as digital twin will further enable analytics; however, it is anticipated that the need for incorporating subject matter expertise in solution design will remain.

  18. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  19. Design of passive coolers for light-emitting diode lamps using topology optimisation

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Sigmund, Ole; Meyer, Knud Erik

    2018-01-01

    Topology optimised designs for passive cooling of light-emitting diode (LED) lamps are investigated through extensive numerical parameter studies. The designs are optimised for either horizontal or vertical orientations and are compared to a lattice-fin design as well as a simple parameter......, while maintaining low sensitivity to orientation. Furthermore, they exhibit several defining features and provide insight and general guidelines for the design of passive coolers for LED lamps....

  20. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  1. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  2. Optimisation of process parameters on thin shell part using response surface methodology (RSM)

    Science.gov (United States)

    Faiz, J. M.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.

    2017-09-01

    This study focuses on the optimisation of process parameters by simulation using Autodesk Moldflow Insight (AMI) software. The process parameters are taken as the inputs, and the warpage value is the output analysed in this study. The significant parameters considered are melt temperature, mould temperature, packing pressure, and cooling time. A plastic part made of polypropylene (PP) was selected as the study part. Optimisation of the process parameters is performed in Design Expert software with the aim of minimising the warpage value. Response Surface Methodology (RSM) is applied together with Analysis of Variance (ANOVA) to investigate the interactions between the parameters that are significant to the warpage value. The optimised warpage value is then obtained from the model designed with RSM, owing to its minimum error, and the study shows that the warpage value is improved by using RSM.
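
    As a hedged illustration of the response-surface idea used above (not the authors' Moldflow/Design Expert workflow), the sketch below fits a second-order model to a handful of invented (melt temperature, packing pressure) settings and their warpage values, then minimises the fitted surface; all parameter names, ranges and numbers are assumptions made for the example.

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical (melt temperature [C], packing pressure [MPa]) -> warpage [mm]
        # observations, standing in for a designed set of simulation runs.
        X = np.array([[200, 60], [200, 80], [220, 60], [220, 80], [210, 70],
                      [210, 60], [210, 80], [200, 70], [220, 70]])
        y = np.array([0.42, 0.35, 0.38, 0.35, 0.31, 0.36, 0.33, 0.37, 0.34])

        def quad_features(x):
            t, p = x[..., 0], x[..., 1]
            return np.stack([np.ones_like(t), t, p, t * p, t**2, p**2], axis=-1)

        # Least-squares fit of the second-order response surface.
        beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

        def predicted_warpage(x):
            return float(quad_features(np.asarray(x)) @ beta)

        # Minimise the fitted surface within the studied parameter ranges.
        res = minimize(predicted_warpage, x0=[210, 70],
                       bounds=[(200, 220), (60, 80)])
        print("optimal setting:", res.x, "predicted warpage:", res.fun)

    In a real RSM study the observations would come from the designed simulation runs and the remaining factors (mould temperature, cooling time) would enter the model in the same way.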

  3. Methodological principles for optimising functional MRI experiments

    International Nuclear Information System (INIS)

    Wuestenberg, T.; Giesel, F.L.; Strasburger, H.

    2005-01-01

    Functional magnetic resonance imaging (fMRI) is one of the most common methods for localising neuronal activity in the brain. Even though the sensitivity of fMRI is comparatively low, the optimisation of certain experimental parameters allows obtaining reliable results. In this article, approaches for optimising the experimental design, imaging parameters and analytic strategies will be discussed. Clinical neuroscientists and interested physicians will receive practical rules of thumb for improving the efficiency of brain imaging experiments. (orig.) [de

  4. Noise aspects at aerodynamic blade optimisation projects

    International Nuclear Information System (INIS)

    Schepers, J.G.

    1997-06-01

    The Netherlands Energy Research Foundation (ECN) has often been involved in industrial projects, in which blade geometries are created automatically by means of numerical optimisation. Usually, these projects aim at the determination of the aerodynamically optimal wind turbine blade, i.e. the goal is to design a blade which is optimal with regard to energy yield. In other cases, blades have been designed which are optimal with regard to the cost of generated energy. However, it is obvious that the wind turbine blade designs which result from these optimisations are not necessarily optimal with regard to noise emission. In this paper an example is shown of an aerodynamic blade optimisation, using the ECN program PVOPT. PVOPT calculates the optimal wind turbine blade geometry such that the maximum energy yield is obtained. Using the aerodynamically optimal blade design as a basis, the possibilities of noise reduction are investigated. 11 figs., 8 refs

  5. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions, based on the fact that we identify patterns in the data rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  6. DACIA LOGAN LIVE AXLE OPTIMISATION USING COMPUTER GRAPHICS

    Directory of Open Access Journals (Sweden)

    KIRALY Andrei

    2017-05-01

    Full Text Available The paper presents some contributions to the calculation and optimisation of a live axle used on the Dacia Logan, using computer graphics software to create the model and afterwards using FEA evaluation to determine the effectiveness of the optimisation. Thus, using specialized computer software, a simulation is made and the results are compared to measurements on the real prototype.

  7. Mesh dependence in PDE-constrained optimisation an application in tidal turbine array layouts

    CERN Document Server

    Schwedes, Tobias; Funke, Simon W; Piggott, Matthew D

    2017-01-01

    This book provides an introduction to PDE-constrained optimisation using finite elements and the adjoint approach. The practical impact of the mathematical insights presented here are demonstrated using the realistic scenario of the optimal placement of marine power turbines, thereby illustrating the real-world relevance of best-practice Hilbert space aware approaches to PDE-constrained optimisation problems. Many optimisation problems that arise in a real-world context are constrained by partial differential equations (PDEs). That is, the system whose configuration is to be optimised follows physical laws given by PDEs. This book describes general Hilbert space formulations of optimisation algorithms, thereby facilitating optimisations whose controls are functions of space. It demonstrates the importance of methods that respect the Hilbert space structure of the problem by analysing the mathematical drawbacks of failing to do so. The approaches considered are illustrated using the optimisation problem arisin...

  8. CLIC crab cavity design optimisation for maximum luminosity

    Energy Technology Data Exchange (ETDEWEB)

    Dexter, A.C., E-mail: a.dexter@lancaster.ac.uk [Lancaster University, Lancaster, LA1 4YR (United Kingdom); Cockcroft Institute, Daresbury, Warrington, WA4 4AD (United Kingdom); Burt, G.; Ambattu, P.K. [Lancaster University, Lancaster, LA1 4YR (United Kingdom); Cockcroft Institute, Daresbury, Warrington, WA4 4AD (United Kingdom); Dolgashev, V. [SLAC, Menlo Park, CA 94025 (United States); Jones, R. [University of Manchester, Manchester, M13 9PL (United Kingdom)

    2011-11-21

    The bunch size and crossing angle planned for CERN's compact linear collider CLIC dictate that crab cavities on opposing linacs will be needed to rotate bunches of particles into alignment at the interaction point if the desired luminosity is to be achieved. Wakefield effects, RF phase errors between crab cavities on opposing linacs and unpredictable beam loading can each act to reduce luminosity below that anticipated for bunches colliding in perfect alignment. Unlike acceleration cavities, which are normally optimised for gradient, crab cavities must be optimised primarily for luminosity. Accepting the crab cavity technology choice of a 12 GHz, normal conducting, travelling wave structure as explained in the text, this paper develops an analytical approach to optimise cell number and iris diameter.

  9. Optimisation of logistics processes of energy grass collection

    Science.gov (United States)

    Bányai, Tamás.

    2010-05-01

    The collection of energy grass is a logistics-intensive process [1]. The optimal design and control of transportation and collection subprocesses is a critical point of the supply chain. To avoid ill-founded decisions based only on experience and intuition, the optimisation and analysis of collection processes based on mathematical models and methods is the scientifically sound way forward. Within the frame of this work, the author focuses on the optimisation possibilities of the collection processes, especially from the point of view of transportation and the related warehousing operations. The optimisation methods developed in the literature [2] take into account the harvesting processes, county-specific yields, transportation distances, erosion constraints, machinery specifications, and other key variables, but the possibility of multiple collection points and multi-level collection was not taken into consideration. The possible areas of using energy grass are very wide (energetic use, biogas and bio-alcohol production, the paper and textile industries, industrial fibre material, foddering purposes, biological soil protection [3], etc.), so not only a single-level but also a multi-level collection system with several collection and production facilities has to be considered. The input parameters of the optimisation problem are the following: total amount of energy grass to be harvested in each region; specific facility costs of collection, warehousing and production units; specific costs of transportation resources; pre-scheduling of the harvesting process; specific transportation and warehousing costs; and pre-scheduling of the processing of energy grass at each facility (exclusive warehousing). The model takes into consideration the following assumptions: (1) cooperative relations among processing and production facilities, (2) capacity constraints are not ignored, (3) the cost function of transportation is non-linear, and (4) driver conditions are ignored. The
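
    The model sketched in the abstract is multi-level with non-linear transport costs; as a much simpler, hedged illustration of the underlying collection logistics, the snippet below solves a single-level, linearised transport problem (harvesting regions shipping grass to processing facilities) with scipy. All supplies, capacities and unit costs are invented.

        import numpy as np
        from scipy.optimize import linprog

        # Invented data: 3 harvesting regions, 2 processing facilities.
        supply = np.array([120.0, 80.0, 100.0])      # tonnes available per region
        capacity = np.array([180.0, 150.0])          # tonnes each facility can process
        cost = np.array([[14.0, 22.0],               # transport cost per tonne
                         [18.0, 10.0],
                         [25.0, 12.0]])

        n_regions, n_fac = cost.shape
        c = cost.ravel()                              # decision vars x[i, j], row-major

        # Each region ships out exactly its harvested supply.
        A_eq = np.zeros((n_regions, n_regions * n_fac))
        for i in range(n_regions):
            A_eq[i, i * n_fac:(i + 1) * n_fac] = 1.0

        # Each facility receives no more than its processing capacity.
        A_ub = np.zeros((n_fac, n_regions * n_fac))
        for j in range(n_fac):
            A_ub[j, j::n_fac] = 1.0

        res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=supply,
                      bounds=(0, None), method="highs")
        print(res.x.reshape(n_regions, n_fac), res.fun)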

  10. A conceptual optimisation strategy for radiography in a digital environment

    International Nuclear Information System (INIS)

    Baath, M.; Haakansson, M.; Hansson, J.; Maansson, L. G.

    2005-01-01

    Using a completely digital environment for the entire imaging process leads to new possibilities for optimisation of radiography since many restrictions of screen/film systems, such as the small dynamic range and the lack of possibilities for image processing, do not apply any longer. However, at the same time these new possibilities lead to a more complicated optimisation process, since more freedom is given to alter parameters. This paper focuses on describing an optimisation strategy that concentrates on taking advantage of the conceptual differences between digital systems and screen/film systems. The strategy can be summarised as: (a) always include the anatomical background during the optimisation, (b) perform all comparisons at a constant effective dose and (c) separate the image display stage from the image collection stage. A three-step process is proposed where the optimal setting of the technique parameters is determined at first, followed by an optimisation of the image processing. In the final step the optimal dose level - given the optimal settings of the image collection and image display stages - is determined. (authors)

  11. Big Data and Biomedical Informatics: A Challenging Opportunity

    Science.gov (United States)

    2014-01-01

    Summary Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations. PMID:24853034

  12. Optimisation of a novel trailing edge concept for a high lift device

    CSIR Research Space (South Africa)

    Botha, JDM

    2014-09-01

    Full Text Available A novel concept (referred to as the flap extension) is implemented on the leading edge of the flap of a three element high lift device. The concept is optimised using two optimisation approaches based on Genetic Algorithm optimisations. A zero order...

  13. Utility systems operation: Optimisation-based decision making

    International Nuclear Information System (INIS)

    Velasco-Garcia, Patricia; Varbanov, Petar Sabev; Arellano-Garcia, Harvey; Wozny, Guenter

    2011-01-01

    Utility systems provide heat and power to industrial sites. The importance of operating these systems in an optimal way has increased significantly due to the unstable and, in the long term, rising prices of fossil fuels, as well as the need to reduce greenhouse gas emissions. This paper presents an analysis of the problem of supporting operator decision making under conditions of variable steam demands from the production processes on an industrial site. An optimisation model has been developed in which, besides the costs of running the utility system, the costs associated with starting up the operating units are also modelled. The illustrative case study shows that accounting for the shut-downs and start-ups of utility operating units can bring significant cost savings. - Highlights: → Optimisation methodology for decision making on running utility systems. → Accounting for varying steam demands. → Optimal operating specifications when a demand change occurs. → Operating costs include start-up costs of boilers and other units. → Validated on a real-life case study. Up to 20% cost savings are possible.
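
    A toy version of the kind of operating-cost model described above, assuming the open-source PuLP package and invented numbers: two boilers are scheduled over a few periods to meet a varying steam demand, and start-ups are charged explicitly so the solver can weigh keeping a unit running against switching it off and restarting it later.

        import pulp

        periods = range(6)
        demand = [40, 90, 120, 60, 0, 80]            # steam demand per period (t/h)
        cap = {"B1": 80, "B2": 70}                   # boiler capacities (t/h)
        fuel = {"B1": 3.0, "B2": 4.5}                # fuel cost per tonne of steam
        no_load = {"B1": 40.0, "B2": 25.0}           # fixed cost per period while on
        startup = {"B1": 200.0, "B2": 120.0}         # cost charged at each start-up

        prob = pulp.LpProblem("utility_scheduling", pulp.LpMinimize)
        on = pulp.LpVariable.dicts("on", (cap, periods), cat="Binary")
        start = pulp.LpVariable.dicts("start", (cap, periods), cat="Binary")
        q = pulp.LpVariable.dicts("steam", (cap, periods), lowBound=0)

        prob += pulp.lpSum(fuel[b] * q[b][t] + no_load[b] * on[b][t]
                           + startup[b] * start[b][t]
                           for b in cap for t in periods)

        for t in periods:
            prob += pulp.lpSum(q[b][t] for b in cap) >= demand[t]
            for b in cap:
                prob += q[b][t] <= cap[b] * on[b][t]
                # A start-up is counted whenever a boiler switches from off to on.
                prob += start[b][t] >= on[b][t] - (on[b][t - 1] if t > 0 else 0)

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        for b in cap:
            print(b, [int(pulp.value(on[b][t])) for t in periods])
        print("total cost:", pulp.value(prob.objective))

    With these invented numbers the cheaper boiler stays on through the zero-demand period because one period of no-load cost is less than a restart, which is the kind of start-up trade-off the abstract highlights.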

  14. Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics.

    Science.gov (United States)

    Trianni, Vito; López-Ibáñez, Manuel

    2015-01-01

    The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics.
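
    As a small, generic illustration of the multi-objective machinery the paper builds on (not the authors' evolutionary setup), the snippet below extracts the Pareto-optimal solutions from a population scored on two objectives that are both to be maximised; the candidate controllers and their scores are made up.

        from typing import List, Tuple

        def dominates(a: Tuple[float, float], b: Tuple[float, float]) -> bool:
            """True if a is at least as good as b everywhere and better somewhere."""
            return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

        def pareto_front(scores: List[Tuple[float, float]]) -> List[int]:
            """Indices of the non-dominated solutions (objectives to be maximised)."""
            return [i for i, s in enumerate(scores)
                    if not any(dominates(t, s) for j, t in enumerate(scores) if j != i)]

        # Made-up (task performance, behavioural diversity) scores for six controllers.
        population = [(0.9, 0.2), (0.7, 0.7), (0.4, 0.9), (0.6, 0.6), (0.9, 0.1), (0.3, 0.3)]
        print(pareto_front(population))   # -> [0, 1, 2]

    Keeping the whole non-dominated set, rather than a single scalarised winner, is what lets a multi-objective run return the varied trade-off behaviours the paper argues for.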

  15. Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics.

    Directory of Open Access Journals (Sweden)

    Vito Trianni

    Full Text Available The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics.

  16. Distributed optimisation problem with communication delay and external disturbance

    Science.gov (United States)

    Tran, Ngoc-Tu; Xiao, Jiang-Wen; Wang, Yan-Wu; Yang, Wu

    2017-12-01

    This paper investigates the distributed optimisation problem for multi-agent systems (MASs) with the simultaneous presence of external disturbance and communication delay. To solve this problem, a two-step design scheme is introduced. In the first step, based on the internal model principle, the internal model term is constructed to compensate for the disturbance asymptotically. In the second step, a distributed optimisation algorithm is designed to solve the distributed optimisation problem for MASs with the simultaneous presence of disturbance and communication delay. Moreover, in the proposed algorithm, each agent interacts with its neighbours through the connected topology, and the delay occurs during the information exchange. By utilising a Lyapunov-Krasovskii functional, delay-dependent conditions are derived for both slowly and fast time-varying delays, respectively, to ensure the convergence of the algorithm to the optimal solution of the optimisation problem. Several numerical simulation examples are provided to illustrate the effectiveness of the theoretical results.
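
    A stripped-down numerical sketch of the distributed optimisation setting (deliberately ignoring the disturbance and delay terms that are the paper's actual contribution): each agent holds a private quadratic cost, exchanges its estimate with its ring neighbours through a doubly stochastic mixing matrix, and applies a consensus-plus-gradient update with a diminishing step size; all numbers are illustrative.

        import numpy as np

        # Four agents on a ring; agent i privately minimises f_i(x) = (x - a_i)^2,
        # so the team objective sum_i f_i is minimised at x = mean(a) = 2.0.
        a = np.array([1.0, 3.0, -2.0, 6.0])
        W = np.array([[0.50, 0.25, 0.00, 0.25],      # doubly stochastic mixing matrix
                      [0.25, 0.50, 0.25, 0.00],      # (each agent averages with its
                      [0.00, 0.25, 0.50, 0.25],      #  two ring neighbours)
                      [0.25, 0.00, 0.25, 0.50]])
        x = np.zeros(4)                              # local estimates, one per agent

        for k in range(2000):
            step = 0.5 / (k + 1)                     # diminishing step size
            grad = 2.0 * (x - a)                     # purely local gradients
            x = W @ x - step * grad                  # consensus step + gradient step

        print(x)   # every agent's estimate ends up close to 2.0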

  17. A Bayesian Approach for Sensor Optimisation in Impact Identification

    Directory of Open Access Journals (Sweden)

    Vincenzo Mallardo

    2016-11-01

    Full Text Available This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination aimed at locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both uniform and non-uniform probabilities of impact occurrence.

  18. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Directory of Open Access Journals (Sweden)

    Rajeev Raman

    Full Text Available BACKGROUND: Many bacterial surface exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins which contain the Big fold is not known. Based on the possible similarities of immunoglobulin and βγ-crystallin folds, we here explore the important question whether Ca²+ binds to a Big domain, which would provide a novel functional role for proteins containing the Big fold. PRINCIPAL FINDINGS: We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted as Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All the selected four domains bind Ca²+ with dissociation constants of 2-4 µM. Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of selected Big domains is similar and follows a two-state model, suggesting the similarity in their fold. CONCLUSIONS: We demonstrate that the Lig proteins are Ca²+-binding proteins, with Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying the proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is a part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.

  19. SINGLE FIXED CRANE OPTIMISATION WITHIN A DISTRIBUTION CENTRE

    Directory of Open Access Journals (Sweden)

    J. Matthews

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: This paper considers the optimisation of the movement of a fixed crane operating in a single aisle of a distribution centre. The crane must move pallets in inventory between docking bays, storage locations, and picking lines. Both a static and a dynamic approach to the problem are presented. The optimisation is performed by means of tabu search, ant colony metaheuristics, and hybrids of these two methods. All these solution approaches were tested on real life data obtained from an operational distribution centre. Results indicate that the hybrid methods outperform the other approaches.

    AFRIKAANS ABSTRACT: The optimisation of the movement of a fixed crane in a single aisle of a distribution centre is considered in this article. The crane must transport pallets between docking bays, storage locations, and picking lines. Both a static and a dynamic approach to the problem are presented. The optimisation is carried out using tabu search, ant colony optimisation, and hybrids of these two methods. All the solution approaches were tested with real data obtained from an operational distribution centre. The results show that the hybrid methods deliver the best solutions.
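
    A generic tabu search sketch in the spirit of the metaheuristics named in the abstracts above, applied to a deliberately simplified stand-in problem: sequencing a fixed set of pallet jobs along a one-dimensional aisle so that the crane's total travel from the docking bay is minimised. The aisle coordinates, the neighbourhood (pairwise swaps) and the tabu tenure are all assumptions for the example, not the authors' formulation.

        # Invented 1-D aisle coordinates of the pallet jobs the crane must visit.
        jobs = [12, 3, 27, 8, 19, 31, 5, 22]

        def travel(order):
            """Total distance travelled, starting from the docking bay at position 0."""
            pos, dist = 0, 0
            for j in order:
                dist += abs(jobs[j] - pos)
                pos = jobs[j]
            return dist

        def tabu_search(iterations=200, tenure=5):
            current = list(range(len(jobs)))
            best, best_cost = current[:], travel(current)
            tabu = {}                                  # swap (i, j) -> iteration it expires
            for it in range(iterations):
                candidates = []
                for i in range(len(jobs) - 1):
                    for j in range(i + 1, len(jobs)):
                        if tabu.get((i, j), 0) > it:   # move is currently tabu
                            continue
                        neigh = current[:]
                        neigh[i], neigh[j] = neigh[j], neigh[i]
                        candidates.append((travel(neigh), (i, j), neigh))
                cost, move, neigh = min(candidates)    # best admissible neighbour
                current = neigh
                tabu[move] = it + tenure               # forbid re-swapping for a while
                if cost < best_cost:
                    best, best_cost = neigh[:], cost
            return best, best_cost

        order, cost = tabu_search()
        print("visit order:", [jobs[i] for i in order], "total travel:", cost)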

  20. Toward a Literature-Driven Definition of Big Data in Healthcare.

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    The aim of this study was to provide a definition of big data in healthcare. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. A total of 196 papers were included. Big data can be defined as datasets with Log(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
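
    The volume criterion quoted above is simple enough to state directly in code; the check below assumes the logarithm is base 10 (consistent with the threshold of 7) and uses made-up dataset sizes.

        import math

        def is_big_data(n_individuals: int, n_variables: int) -> bool:
            """Literature-driven criterion: log10(n * p) >= 7 (base-10 assumed)."""
            return math.log10(n_individuals * n_variables) >= 7

        # Made-up sizes: a nationwide claims database versus a small clinical trial.
        print(is_big_data(n_individuals=60_000_000, n_variables=250))   # True
        print(is_big_data(n_individuals=400, n_variables=150))          # False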

  1. Optimisation of optical receiver for 10 Gbit/s optical duobinary transmission system

    DEFF Research Database (Denmark)

    Zheng, Xueyan; Liu, Fenghai; Jeppesen, Palle

    2001-01-01

    Optimisation of a receiver for an optical duobinary signal is studied numerically. It is shown that a conventional receiver is not optimum, neither when a DCF is used before the receiver nor when no DCF is used. The optimum receiver for an optical duobinary system is identified.

  2. Optimisation of decision making under uncertainty throughout field lifetime: A fractured reservoir example

    Science.gov (United States)

    Arnold, Dan; Demyanov, Vasily; Christie, Mike; Bakay, Alexander; Gopa, Konstantin

    2016-10-01

    Assessing the change in uncertainty in reservoir production forecasts over field lifetime is rarely undertaken because of the complexity of joining together the individual workflows. This becomes particularly important in complex fields such as naturally fractured reservoirs. The impact of this problem has been identified in previous studies, and many solutions have been proposed but never implemented on complex reservoir problems due to the computational cost of quantifying uncertainty and optimising the reservoir development, specifically knowing how many and what kind of simulations to run. This paper demonstrates a workflow that propagates uncertainty throughout field lifetime, and into the decision making process, by a combination of a metric-based approach, multi-objective optimisation and Bayesian estimation of uncertainty. The workflow propagates uncertainty estimates from appraisal into initial development optimisation, then updates uncertainty through history matching and finally propagates it into late-life optimisation. The combination of techniques applied, namely the metric approach and multi-objective optimisation, helps evaluate development options under uncertainty. This was achieved with a significantly reduced number of flow simulations, such that the combined workflow is computationally feasible to run for a real-field problem. This workflow is applied to two synthetic naturally fractured reservoir (NFR) case studies in appraisal, field development, history matching and mid-life EOR stages. The first is a simple sector model, while the second is a more complex full field example based on a real life analogue. This study infers geological uncertainty from an ensemble of models based on a Brazilian carbonate outcrop, which is propagated through the field lifetime, before and after the start of production, with the inclusion of production data significantly collapsing the spread of P10-P90 in reservoir forecasts. The workflow links uncertainty

  3. Multi-objective evolutionary optimisation for product design and manufacturing

    CERN Document Server

    2011-01-01

    Presents state-of-the-art research in the area of multi-objective evolutionary optimisation for integrated product design and manufacturing. Provides a comprehensive review of the literature. Gives in-depth descriptions of recently developed innovative and novel methodologies, algorithms and systems in the area of modelling, simulation and optimisation.

  4. Natural Erosion of Sandstone as Shape Optimisation.

    Science.gov (United States)

    Ostanin, Igor; Safonov, Alexander; Oseledets, Ivan

    2017-12-11

    Natural arches, pillars and other exotic sandstone formations have always attracted attention for their unusual shapes and amazing mechanical balance, which leave a strong impression of intelligent design rather than of the result of a stochastic process. It has recently been demonstrated that these shapes could have been the result of the negative feedback between stress and erosion that originates in fundamental laws of friction between the rock's constituent particles. Here we present a deeper analysis of this idea and bridge it with the approaches utilized in shape and topology optimisation. It appears that the processes of natural erosion, driven by stochastic surface forces and the Mohr-Coulomb law of dry friction, can be viewed within the framework of local optimisation for minimum elastic strain energy. Our hypothesis is confirmed by numerical simulations of the erosion using the topological-shape optimisation model. Our work contributes to a better understanding of stochastic erosion and feasible landscape formations that could be found on Earth and beyond.

  5. Statistical Optimisation of Fermentation Conditions for Citric Acid ...

    African Journals Online (AJOL)

    This study investigated the optimisation of fermentation conditions during citric acid production via solid state fermentation (SSF) of pineapple peels using Aspergillus niger. A three-variable, three-level Box-Behnken design (BBD) comprising 17 experimental runs was used to develop a statistical model for the fermentation ...

  6. Big Data Management in US Hospitals: Benefits and Barriers.

    Science.gov (United States)

    Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto

    Big data has been considered as an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate a hospital's ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that there were a number of benefits expected by hospitals when using big data analytics, including cost savings and business intelligence. By using big data, many hospitals have recognized that there have been challenges, including lack of experience and the cost of developing the analytics. Many hospitals will need to invest in acquiring adequate personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.

  7. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions and the various questions related to big data. In this first part I will try to set this out with regard to Big Data theory and

  8. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  9. Analysis of Big Data Maturity Stage in Hospitality Industry

    OpenAIRE

    Shabani, Neda; Munir, Arslan; Bose, Avishek

    2017-01-01

    Big data analytics has an extremely significant impact on many areas in all businesses and industries including hospitality. This study aims to guide information technology (IT) professionals in hospitality on their big data expedition. In particular, the purpose of this study is to identify the maturity stage of the big data in hospitality industry in an objective way so that hotels be able to understand their progress, and realize what it will take to get to the next stage of big data matur...

  10. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  11. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  12. Topology Optimisation of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Thike Aye Min

    2016-01-01

    Full Text Available Wireless sensor networks are widely used in a variety of fields, including industrial environments. In the case of a clustered network, the location of the cluster head affects the reliability of the network operation. Finding the optimum location of the cluster head is therefore critical for the design of a network. This paper discusses an optimisation approach, based on the brute force algorithm, in the context of topology optimisation of a cluster-structured centralised wireless sensor network. Two examples that demonstrate the implementation of the brute force algorithm to find an optimum location of the cluster head are given to verify the approach.
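
    A minimal sketch of the brute force idea described above, under invented assumptions: candidate cluster head positions are enumerated on a coarse grid and scored by the summed squared distance to a handful of made-up sensor node coordinates, a common proxy for transmission energy.

        import itertools

        # Invented sensor node coordinates on a factory floor (metres).
        nodes = [(2, 3), (8, 1), (5, 9), (7, 7), (1, 8), (9, 5)]

        def total_squared_distance(head, nodes):
            """Energy proxy: transmission cost grows with squared distance to the head."""
            return sum((x - head[0]) ** 2 + (y - head[1]) ** 2 for x, y in nodes)

        # Brute force: evaluate every candidate position on a 1 m grid, keep the best.
        candidates = itertools.product(range(0, 11), range(0, 11))
        best = min(candidates, key=lambda c: total_squared_distance(c, nodes))
        print("best cluster head position:", best)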

  13. Application and Prospect of Big Data in Water Resources

    Science.gov (United States)

    Xi, Danchi; Xu, Xinyi

    2017-04-01

    Because of developed information technology and affordable data storage, we have entered the era of data explosion. The term "Big Data" and the technology related to it have been created and are commonly applied in many fields. However, academic studies have only recently paid attention to Big Data applications in water resources. As a result, water resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying big data in water resources and summarizes prior research by others. Most studies in this field only set up a theoretical frame, whereas we define "Water Big Data" and explain its three-dimensional properties: the time dimension, the spatial dimension and the intelligence dimension. Based on HBase, the classification system of Water Big Data is introduced: hydrology data, ecology data and socio-economic data. Then, after analyzing the challenges in water resources management, a series of solutions using Big Data technologies, such as data mining and web crawlers, is proposed. Finally, the prospect of applying big data in water resources is discussed; it can be predicted that, as Big Data technology keeps developing, "3D" (Data Driven Decision) will be utilized more in water resources management in the future.
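
    To make the MapReduce idea mentioned above concrete without assuming a Hadoop or HBase installation, the sketch below runs the map, shuffle and reduce phases in plain Python on a few invented hydrology readings, computing the mean water level per station.

        from functools import reduce

        # Toy hydrology records: (station id, water level in metres).
        readings = [("R01", 2.4), ("R02", 1.1), ("R01", 2.9), ("R03", 4.0), ("R02", 1.3)]

        # Map phase: emit (key, value) pairs -- here simply (station, (level, 1)).
        mapped = [(station, (level, 1)) for station, level in readings]

        # Shuffle phase: group the emitted values by key.
        groups = {}
        for key, value in mapped:
            groups.setdefault(key, []).append(value)

        # Reduce phase: combine each key's values into (sum, count), then average.
        def combine(a, b):
            return (a[0] + b[0], a[1] + b[1])

        averages = {}
        for key, values in groups.items():
            total, count = reduce(combine, values)
            averages[key] = total / count
        print(averages)   # mean water level per station

    In a real deployment the same map and reduce functions would be handed to the framework, which distributes them over the cluster instead of running them in a single process.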

  14. Application Study of Self-balanced Testing Method on Big Diameter Rock-socketed Piles

    Directory of Open Access Journals (Sweden)

    Qing-biao WANG

    2013-07-01

    Full Text Available Through a technological test of the self-balanced testing method on big-diameter rock-socketed piles at the broadcasting centre building of Tai’an, this paper studies and analyzes the selection of the balance position, the production and installation of the load cell, the selection and installation of the displacement sensor, the loading steps, the stability conditions and the determination of the bearing capacity in the process of self-balanced testing. The paper summarizes the key technology and engineering experience of the self-balanced testing method for big-diameter rock-socketed piles and also analyzes the difficult technical problems that urgently need to be resolved at present. The conclusions of the study have important significance for the popularization and application of the self-balanced testing method and for similar projects.

  15. Optimised operation of an off-grid hybrid wind-diesel-battery system using genetic algorithm

    International Nuclear Information System (INIS)

    Gan, Leong Kit; Shek, Jonathan K.H.; Mueller, Markus A.

    2016-01-01

    Highlights: • The diesel generator’s operation is optimised in a hybrid wind-diesel-battery system. • Optimisation is performed using wind speed and load demand forecasts. • The objective is to maximise wind energy utilisation with limited battery storage. • A physical modelling approach (Simscape) is used to verify the mathematical model. • Sensitivity analyses are performed with synthesised wind and load forecast errors. - Abstract: In an off-grid hybrid wind-diesel-battery system, the diesel generator is often not utilised efficiently, therefore compromising its lifetime. In particular, the general rule of thumb of running the diesel generator at more than 40% of its rated capacity is often unmet. This is due to the variation in power demand and wind speed which needs to be supplied by the diesel generator. In addition, the frequent start-stop of the diesel generator leads to additional mechanical wear and fuel wastage. This research paper proposes a novel control algorithm which optimises the operation of a diesel generator using a genetic algorithm. With a given day-ahead forecast of the local renewable energy resource and load demand, it is possible to optimise the operation of a diesel generator, subject to other pre-defined constraints. Thus, the utilisation of the renewable energy sources to supply electricity can be maximised. Usually, optimisation studies of a hybrid system are conducted through simple analytical modelling, coupled with a selected optimisation algorithm to seek the optimised solution. The obtained solution is not verified using a more realistic system model, for instance a physical modelling approach. This often leads to the question of the applicability of such optimised operation in reality. In order to take a step further, model-based design using Simulink is employed in this research to perform a comparison through a physical modelling approach. The Simulink model has the capability to incorporate the electrical
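
    A hedged, heavily simplified sketch of a genetic algorithm for the generator scheduling idea described above (not the authors' Simulink-verified algorithm): a binary chromosome encodes the on/off state of the diesel generator in each hour, and the fitness penalises fuel use, start-ups and any forecast deficit left unserved. All power figures and cost weights are invented.

        import random

        random.seed(42)

        # Invented hourly deficit (kW) that wind + battery cannot cover; the diesel
        # generator (rated 50 kW) must fill whatever deficit remains in each hour.
        deficit = [0, 5, 30, 45, 25, 0, 0, 10, 35, 40, 15, 0]
        RATED, HOURS = 50.0, len(deficit)
        FUEL_PER_HOUR, START_PENALTY, UNMET_PENALTY = 6.0, 15.0, 100.0

        def cost(schedule):
            """Fuel use + start-up wear + heavy penalty for any unmet deficit."""
            total, prev_on = 0.0, 0
            for on, d in zip(schedule, deficit):
                if on:
                    total += FUEL_PER_HOUR
                    if not prev_on:
                        total += START_PENALTY
                if d > (RATED if on else 0):
                    total += UNMET_PENALTY * (d - (RATED if on else 0))
                prev_on = on
            return total

        def genetic_algorithm(pop_size=40, generations=200, mutation=0.05):
            pop = [[random.randint(0, 1) for _ in range(HOURS)] for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=cost)
                parents = pop[: pop_size // 2]                  # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, HOURS)            # one-point crossover
                    child = a[:cut] + b[cut:]
                    child = [bit ^ (random.random() < mutation) for bit in child]
                    children.append(child)
                pop = parents + children
            return min(pop, key=cost)

        best = genetic_algorithm()
        print(best, cost(best))

    With these made-up weights the search tends to keep the generator running across a short zero-deficit gap rather than stopping and restarting, which is the start-stop trade-off the paper targets.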

  16. Big Data in HEP: A comprehensive use case study

    Science.gov (United States)

    Gutsche, Oliver; Cremonesi, Matteo; Elmer, Peter; Jayatilaka, Bo; Kowalkowski, Jim; Pivarski, Jim; Sehrish, Saba; Mantilla Surez, Cristina; Svyatkovskiy, Alexey; Tran, Nhan

    2017-10-01

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems, collectively called Big Data technologies, have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at the analysis of very large datasets, and could potentially reduce the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication physics plots. We will discuss advantages and disadvantages of each approach and give an outlook on further studies needed.
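
    A hedged sketch of what the Spark side of such a comparison can look like, assuming PySpark is available and using an invented Parquet file and invented column names (met, n_jets) rather than the CMS data formats used in the paper: events are filtered by a missing-energy cut and histogrammed in parallel.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("dark-matter-skim").getOrCreate()

        # Hypothetical flat ntuple already converted to Parquet, one row per event.
        events = spark.read.parquet("events.parquet")

        # Event selection: large missing transverse energy and at least one jet.
        selected = events.filter((F.col("met") > 200.0) & (F.col("n_jets") >= 1))

        # A coarse histogram of the missing-energy spectrum, computed in parallel.
        histogram = (selected
                     .withColumn("met_bin", (F.col("met") / 50).cast("int") * 50)
                     .groupBy("met_bin")
                     .count()
                     .orderBy("met_bin"))

        histogram.show()
        spark.stop()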

  17. Optimisation of Software-Defined Networks Performance Using a Hybrid Intelligent System

    Directory of Open Access Journals (Sweden)

    Ann Sabih

    2017-06-01

    Full Text Available This paper proposes a novel intelligent technique that has been designed to optimise the performance of Software Defined Networks (SDN). The proposed hybrid intelligent system employs the integration of intelligence-based optimisation approaches with an artificial neural network. These heuristic optimisation methods include Genetic Algorithms (GA) and Particle Swarm Optimisation (PSO). These methods were utilised separately in order to select the best inputs to maximise SDN performance. In order to identify SDN behaviour, the neural network model is trained and applied. The best optimisation approach has been identified using an analytical approach that considered SDN performance and the computational time as objective functions. Initially, the general model of the neural network was tested with unseen data before implementing the model using GA and PSO to determine the optimal performance of SDN. The results showed that the SDN represented by the artificial neural network (ANN) and optimised by PSO generated a better configuration with regard to computational efficiency and performance index.
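
    A generic particle swarm optimisation sketch in the spirit of the paper's second optimiser, run against a stand-in quadratic surrogate rather than the trained ANN model of SDN performance; the surrogate, the parameter names and the bounds are all invented for the example.

        import random

        random.seed(0)

        def surrogate_performance(x):
            """Stand-in for the trained ANN model of SDN performance (to be maximised)."""
            buffer_size, poll_interval = x
            return -(buffer_size - 6.0) ** 2 - 2.0 * (poll_interval - 1.5) ** 2

        LOW, HIGH = [1.0, 0.1], [10.0, 5.0]            # per-dimension search bounds

        def pso(n_particles=20, iterations=100, w=0.7, c1=1.5, c2=1.5):
            dim = len(LOW)
            pos = [[random.uniform(LOW[d], HIGH[d]) for d in range(dim)]
                   for _ in range(n_particles)]
            vel = [[0.0] * dim for _ in range(n_particles)]
            pbest = [p[:] for p in pos]
            gbest = max(pos, key=surrogate_performance)[:]
            for _ in range(iterations):
                for i in range(n_particles):
                    for d in range(dim):
                        r1, r2 = random.random(), random.random()
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * r1 * (pbest[i][d] - pos[i][d])
                                     + c2 * r2 * (gbest[d] - pos[i][d]))
                        pos[i][d] = min(max(pos[i][d] + vel[i][d], LOW[d]), HIGH[d])
                    if surrogate_performance(pos[i]) > surrogate_performance(pbest[i]):
                        pbest[i] = pos[i][:]
                        if surrogate_performance(pos[i]) > surrogate_performance(gbest):
                            gbest = pos[i][:]
            return gbest

        print(pso())   # should approach the surrogate optimum near (6.0, 1.5)

    In the paper's setting the surrogate_performance function would be replaced by the trained neural network's prediction of SDN performance for a candidate input configuration.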

  18. Optimisation of the formulation of a bubble bath by a chemometric approach market segmentation and optimisation.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gennaro, Maria Carla; Bertetto, Mariella

    2003-03-01

    The optimisation of the formulation of a commercial bubble bath was performed by chemometric analysis of Panel Test results. A first Panel Test was performed to choose the best essence among four proposed to the consumers; the best essence chosen was used in the revised commercial bubble bath. Afterwards, the effect of changing the amounts of four components of the bubble bath (the primary surfactant, the essence, the hydrating agent and the colouring agent) was studied by a fractional factorial design. The segmentation of the bubble bath market was performed by a second Panel Test, in which the consumers were requested to evaluate the samples coming from the experimental design. The results were then treated by Principal Component Analysis. The market had two segments: people preferring a product with a rich formulation and people preferring a poor product. The final target, i.e. the optimisation of the formulation for each segment, was achieved by the calculation of regression models relating the subjective evaluations given by the Panel to the compositions of the samples. The regression models allowed the identification of the best formulations for the two segments of the market.

  19. Toward a Literature-Driven Definition of Big Data in Healthcare

    Directory of Open Access Journals (Sweden)

    Emilie Baro

    2015-01-01

    Full Text Available Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with Log(n*p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
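
    The volume criterion above is easy to apply in code. A tiny check, assuming the logarithm in Log(n*p) ≥ 7 is base 10 (as the order-of-magnitude reading suggests); the n and p values are illustrative:

        # Quick check of the proposed criterion Log(n*p) >= 7, assuming base-10 log.
        import math

        def is_big_data(n_individuals, n_variables):
            return math.log10(n_individuals * n_variables) >= 7

        print(is_big_data(100_000, 500))   # True:  log10(5e7) ~ 7.7
        print(is_big_data(2_000, 100))     # False: log10(2e5) ~ 5.3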

  20. Toward a Literature-Driven Definition of Big Data in Healthcare

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with Log⁡(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data. PMID:26137488

  1. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  2. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  3. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  4. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  5. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. Unified theory of the moment of creation, evidence of an expanding Universe, the X-boson - the particle produced very soon after the big bang which vanished from the Universe one-hundredth of a second later - and the fate of the Universe are all discussed. (U.K.)

  6. Ants Colony Optimisation of a Measuring Path of Prismatic Parts on a CMM

    Directory of Open Access Journals (Sweden)

    Stojadinovic Slavenko M.

    2016-03-01

    Full Text Available This paper presents the optimisation of a measuring probe path for inspecting prismatic parts on a CMM. The optimisation model is based on: (i) a mathematical model that establishes an initial collision-free path represented by a set of points, and (ii) the solution of the Travelling Salesman Problem (TSP) obtained with Ant Colony Optimisation (ACO). In order to solve the TSP, an ACO algorithm that aims to find the shortest path of ant colony movement (i.e. the optimised path) is applied. Then, the optimised path is compared with the measuring path obtained with online programming on the CMM ZEISS UMM500 and with the measuring path obtained in the CMM inspection module of Pro/ENGINEER® software. The results of comparing the optimised path with the other two generated paths show that the optimised path is at least 20% shorter than the path obtained by on-line programming on the CMM ZEISS UMM500, and at least 10% shorter than the path obtained by using the CMM module in Pro/ENGINEER®.
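
    As a rough sketch of the ACO-for-TSP idea used here, the following applies a standard ant colony loop to a set of random 3D points standing in for the collision-free probe points; the number of ants and the alpha, beta, rho and Q parameters are illustrative assumptions.

        # Sketch of an Ant Colony Optimisation loop for a TSP over probe points.
        import numpy as np

        rng = np.random.default_rng(1)
        points = rng.uniform(0, 100, (20, 3))                # 20 probe points in 3D
        dist = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
        np.fill_diagonal(dist, np.inf)

        n, n_ants, n_iter = len(points), 20, 200
        alpha, beta, rho, Q = 1.0, 3.0, 0.5, 100.0
        tau = np.ones((n, n))                                # pheromone levels
        eta = 1.0 / dist                                     # heuristic visibility
        best_len, best_tour = np.inf, None

        for _ in range(n_iter):
            tours, lengths = [], []
            for _ in range(n_ants):
                tour = [int(rng.integers(n))]
                unvisited = set(range(n)) - {tour[0]}
                while unvisited:
                    i = tour[-1]
                    cand = np.array(sorted(unvisited))
                    w = (tau[i, cand] ** alpha) * (eta[i, cand] ** beta)
                    nxt = int(rng.choice(cand, p=w / w.sum()))
                    tour.append(nxt)
                    unvisited.remove(nxt)
                length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
                tours.append(tour)
                lengths.append(length)
                if length < best_len:
                    best_len, best_tour = length, tour
            tau *= (1 - rho)                                 # pheromone evaporation
            for tour, length in zip(tours, lengths):
                for k in range(n):
                    tau[tour[k], tour[(k + 1) % n]] += Q / length   # deposit

        print(best_len, best_tour)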

  7. Focal psychodynamic therapy, cognitive behaviour therapy, and optimised treatment as usual in outpatients with anorexia nervosa (ANTOP study): randomised controlled trial.

    Science.gov (United States)

    Zipfel, Stephan; Wild, Beate; Groß, Gaby; Friederich, Hans-Christoph; Teufel, Martin; Schellberg, Dieter; Giel, Katrin E; de Zwaan, Martina; Dinkel, Andreas; Herpertz, Stephan; Burgmer, Markus; Löwe, Bernd; Tagay, Sefik; von Wietersheim, Jörn; Zeeck, Almut; Schade-Brittinger, Carmen; Schauenburg, Henning; Herzog, Wolfgang

    2014-01-11

    Psychotherapy is the treatment of choice for patients with anorexia nervosa, although evidence of efficacy is weak. The Anorexia Nervosa Treatment of OutPatients (ANTOP) study aimed to assess the efficacy and safety of two manual-based outpatient treatments for anorexia nervosa--focal psychodynamic therapy and enhanced cognitive behaviour therapy--versus optimised treatment as usual. The ANTOP study is a multicentre, randomised controlled efficacy trial in adults with anorexia nervosa. We recruited patients from ten university hospitals in Germany. Participants were randomly allocated to 10 months of treatment with either focal psychodynamic therapy, enhanced cognitive behaviour therapy, or optimised treatment as usual (including outpatient psychotherapy and structured care from a family doctor). The primary outcome was weight gain, measured as increased body-mass index (BMI) at the end of treatment. A key secondary outcome was rate of recovery (based on a combination of weight gain and eating disorder-specific psychopathology). Analysis was by intention to treat. This trial is registered at http://isrctn.org, number ISRCTN72809357. Of 727 adults screened for inclusion, 242 underwent randomisation: 80 to focal psychodynamic therapy, 80 to enhanced cognitive behaviour therapy, and 82 to optimised treatment as usual. At the end of treatment, 54 patients (22%) were lost to follow-up, and at 12-month follow-up a total of 73 (30%) had dropped out. At the end of treatment, BMI had increased in all study groups (focal psychodynamic therapy 0·73 kg/m(2), enhanced cognitive behaviour therapy 0·93 kg/m(2), optimised treatment as usual 0·69 kg/m(2)); no differences were noted between groups (mean difference between focal psychodynamic therapy and enhanced cognitive behaviour therapy -0·45, 95% CI -0·96 to 0·07; focal psychodynamic therapy vs optimised treatment as usual -0·14, -0·68 to 0·39; enhanced cognitive behaviour therapy vs optimised treatment as usual -0·30

  8. Komunikasi Pemasaran Produk Big-Cola Dan Coca-Cola Terhadap Minat Beli Konsumen (Studi Komparatif Komunikasi Pemasaran Produk Big-Cola Dan Coca-Cola Terahadap Minat Beli Konsumen Pada Mahasiswa Di Universitas Sumatera Utara )

    OpenAIRE

    Harahap, Mirza swardani

    2015-01-01

    This research, on the marketing communication of Big-Cola and Coca-Cola products and consumer buying interest, was conducted among students at the University of North Sumatra. The purpose of this research is to determine the influence of the marketing communication of Big-Cola and Coca-Cola products on the buying interest of students at the University of North Sumatra. A correlational study was used to examine the difference, or comparison, between the marketing communication of Big-Cola and Coca-Cola products and the buying interest of s...

  9. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  10. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  11. PHYSICAL-MATHEMATICAL SCIENCE. MECHANICS: SIMULATION CHALLENGES IN OPTIMISING THEORETICAL METAL CUTTING TASKS

    Directory of Open Access Journals (Sweden)

    Rasul V. Guseynov

    2017-01-01

    Full Text Available Abstract. Objectives: The article addresses problems in optimising machining operations that provide end-unit production of the required quality at minimum processing cost. Methods: The effectiveness of experimental research was increased through the use of mathematical methods for planning experiments to optimise metal cutting tasks. The minimal processing cost model, in which the objective function is polynomial, is adopted as the criterion for the selection of optimal parameters. Results: Polynomial models of the influence of the angles φ, α and γ on the torque applied when cutting threads in various steels are constructed. Optimum values of the geometrical tool parameters were obtained using the criterion of minimum cutting forces during processing. The high stability of tools having optimal geometric parameters is established. It is shown that the use of experimental planning methods allows the optimisation of cutting parameters. In optimising solutions to metal cutting problems, it is found to be expedient to use multifactor experimental planning methods and to select the cutting force as the optimisation parameter when determining tool geometry. Conclusion: The joint use of geometric programming and experiment planning methods to optimise the cutting parameters significantly increases the efficiency of technological metal processing approaches.
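
    A minimal sketch of the kind of polynomial modelling described above: a second-order model of cutting torque as a function of the three coded tool angles, fitted by least squares. The factor levels and torque values are invented placeholders, not the article's measurements.

        # Fit a quadratic polynomial model of torque vs. coded angles (phi, alpha, gamma).
        import numpy as np

        # Coded factor levels from a small three-factor experiment plan.
        levels = np.array([[a, b, c] for a in (-1, 0, 1) for b in (-1, 0, 1) for c in (-1, 0, 1)])
        torque = 12 + 1.5 * levels[:, 0] - 2.0 * levels[:, 1] + 0.8 * levels[:, 2] ** 2 \
                 + np.random.default_rng(0).normal(0, 0.2, len(levels))   # toy response

        # Quadratic design matrix: intercept, linear, interaction and square terms.
        x1, x2, x3 = levels.T
        X = np.column_stack([np.ones(len(levels)), x1, x2, x3,
                             x1 * x2, x1 * x3, x2 * x3, x1**2, x2**2, x3**2])
        coef, *_ = np.linalg.lstsq(X, torque, rcond=None)

        # The angle combination minimising predicted torque can then be read off the model.
        print(np.round(coef, 3))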

  12. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  13. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously out of reach. Big data is generally characterized by three factors: volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  14. The Big Five of Personality and structural imaging revisited: a VBM - DARTEL study.

    Science.gov (United States)

    Liu, Wei-Yin; Weber, Bernd; Reuter, Martin; Markett, Sebastian; Chu, Woei-Chyn; Montag, Christian

    2013-05-08

    The present study focuses on the neurostructural foundations of the human personality. In a large sample of 227 healthy human individuals (168 women and 59 men), we used MRI to examine the relationship between personality traits and both regional gray and white matter volume, while controlling for age and sex. Personality was assessed using the German version of the NEO Five-Factor Inventory that measures individual differences in the 'Big Five of Personality': extraversion, neuroticism, agreeableness, conscientiousness, and openness to experience. In contrast to most previous studies on neural correlates of the Big Five, we used improved processing strategies: white and gray matter were independently assessed by segmentation steps before data analysis. In addition, customized sex-specific diffeomorphic anatomical registration using exponentiated lie algebra templates were used. Our results did not show significant correlations between any dimension of the Big Five and regional gray matter volume. However, among others, higher conscientiousness scores correlated significantly with reductions in regional white matter volume in different brain areas, including the right insula, putamen, caudate, and left fusiformis. These correlations were driven by the female subsample. The present study suggests that many results from the literature on the neurostructural basis of personality should be reviewed carefully, considering the results when the sample size is larger, imaging methods are rigorously applied, and sex-related and age-related effects are controlled.

  15. Smart optimisation and sensitivity analysis in water distribution systems

    CSIR Research Space (South Africa)

    Page, Philip R

    2015-12-01

    Full Text Available Optimisation of a water distribution system is considered by keeping the average pressure unchanged as water demands change, through changes in the speed of the pumps. Another application area considered, using the same mathematical notions, is the study of the sensitivity...

  16. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    Science.gov (United States)

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than life time prevalence. This study was designed to assess the usability of the DSM-IV criteria based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of current DSM-5 criteria in a clinical setting.

  17. Data for TROTS – The Radiotherapy Optimisation Test Set

    Directory of Open Access Journals (Sweden)

    Sebastiaan Breedveld

    2017-06-01

    Full Text Available The Radiotherapy Optimisation Test Set (TROTS) is an extensive set of problems originating from radiotherapy (radiation therapy) treatment planning. This dataset is created for 2 purposes: (1) to supply a large-scale dense dataset to measure performance and quality of mathematical solvers, and (2) to supply a dataset to investigate the multi-criteria optimisation and decision-making nature of the radiotherapy problem. The dataset contains 120 problems (patients), divided over 6 different treatment protocols/tumour types. Each problem contains numerical data, a configuration for the optimisation problem, and data required to visualise and interpret the results. The data is stored as HDF5-compatible Matlab files, and includes scripts to work with the dataset.
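
    Because the problem files are HDF5-compatible Matlab (v7.3) files, they can also be inspected outside Matlab, for example with h5py. The file name below is a hypothetical example, not a documented TROTS file name, and the printed group names depend on the actual file layout.

        # Minimal sketch of inspecting one TROTS problem file outside Matlab.
        import h5py

        with h5py.File("Protocol1_01.mat", "r") as f:
            # List the top-level Matlab variables stored in the file.
            for name, item in f.items():
                print(name, type(item).__name__, getattr(item, "shape", ""))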

  18. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  19. Big Data Analytics Methodology in the Financial Industry

    Science.gov (United States)

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  20. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Book chapter on cryptography for big data security, prepared for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Chapter 1, Cryptography for Big Data Security, 1.1 Introduction: With the amount

  1. MULTI-OBJECTIVE OPTIMISATION OF LASER CUTTING USING CUCKOO SEARCH ALGORITHM

    Directory of Open Access Journals (Sweden)

    M. MADIĆ

    2015-03-01

    Full Text Available Determining optimal laser cutting conditions for improving cut quality characteristics is of great importance in process planning. This paper presents multi-objective optimisation of the CO2 laser cutting process considering three cut quality characteristics: surface roughness, heat affected zone (HAZ) and kerf width. It combines an experimental design using Taguchi’s method, modelling of the relationships between the laser cutting factors (laser power, cutting speed, assist gas pressure and focus position) and the cut quality characteristics by artificial neural networks (ANNs), formulation of the multi-objective optimisation problem using the weighted sum method, and its solution by the novel meta-heuristic cuckoo search algorithm (CSA). The objective is to obtain optimal cutting conditions depending on the importance order of the cut quality characteristics for each of four different case studies presented in this paper. The case studies considered are: minimisation of cut quality characteristics with equal priority, minimisation with priority given to surface roughness, minimisation with priority given to HAZ, and minimisation with priority given to kerf width. The results indicate that the applied CSA for solving the multi-objective optimisation problem is effective, and that the proposed approach can be used for selecting the optimal laser cutting factors for specific production requirements.
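
    A minimal sketch of the weighted-sum formulation used above, with toy surrogate functions in place of the trained ANN models and a plain random search standing in for the cuckoo search algorithm; every function, range and constant here is an assumption for illustration only.

        # Weighted-sum scalarisation of three cut quality objectives (toy surrogates),
        # searched with simple random sampling instead of the paper's cuckoo search.
        import numpy as np

        rng = np.random.default_rng(0)

        def cut_quality(x):
            power, speed, pressure, focus = x          # normalised laser cutting factors
            roughness = 2.0 - 1.2 * speed + 0.6 * power          # toy surrogates for the
            haz       = 1.5 + 0.9 * power - 0.7 * pressure       # ANN models of Ra, HAZ
            kerf      = 0.8 + 0.5 * power - 0.3 * abs(focus)     # and kerf width
            return np.array([roughness, haz, kerf])

        def weighted_sum(x, weights):
            return float(np.dot(weights, cut_quality(x)))

        # Case study: priority given to surface roughness.
        weights = np.array([0.6, 0.2, 0.2])
        best_x, best_f = None, np.inf
        for _ in range(5000):
            x = rng.uniform([0, 0, 0, -1], [1, 1, 1, 1])
            f = weighted_sum(x, weights)
            if f < best_f:
                best_x, best_f = x, f
        print(best_x, best_f)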

  2. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  3. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  4. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  5. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports...

  6. GAOS: Spatial optimisation of crop and nature within agricultural fields

    NARCIS (Netherlands)

    Bruin, de S.; Janssen, H.; Klompe, A.; Lerink, P.; Vanmeulebrouk, B.

    2010-01-01

    This paper proposes and demonstrates a spatial optimiser that allocates areas of inefficient machine manoeuvring to field margins thus improving the use of available space and supporting map-based Controlled Traffic Farming. A prototype web service (GAOS) allows farmers to optimise tracks within

  7. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure such as the exabyte, zettabyte, and yottabyte to express the amount of data. The growth of data creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the volume, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. Moreover, Big Data is recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  8. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data is data that exceeds conventional storage capacity and processing power. The term is used for data sets so large or complex that traditional data processing techniques struggle with them. The size threshold is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; on social networking sites, for example, the amount of data produced by people is growing rapidly every year. Big data is not only data; it has become a complete subject that includes various tools, techniques and frameworks. It encompasses the rapid growth and evolution of data, both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex, and of a massive scale. Such data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds or even thousands of servers. A big data environment is used to capture, organize and resolve these various types of data. In this paper we describe applications, problems and tools of big data and give an overview of big data.

  9. Concurrence of big data analytics and healthcare: A systematic review.

    Science.gov (United States)

    Mehta, Nishita; Pandit, Anil

    2018-06-01

    The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of literature aims to determine the scope of Big Data analytics in healthcare including its applications and challenges in its adoption in healthcare. It also intends to identify the strategies to overcome the challenges. A systematic search of the articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. The articles on Big Data analytics in healthcare published in English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources including government, laboratories, pharma companies, data aggregators, medical journals etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds its application in clinical decision support, optimization of clinical operations and reduction of the cost of care; (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils that there is a paucity of information on evidence of real-world use of

  10. Using Big Book to Teach Things in My House

    OpenAIRE

    Effrien, Intan; Lailatus, Sa’diyah; Nuruliftitah Maja, Neneng

    2017-01-01

    The purpose of this study is to determine students' interest in learning using the big book medium. A big book is an enlarged version of an ordinary book. The big book contains simple words and images that match the content of the sentences and spelling. From this, researchers can gauge students' interest and the development of their knowledge, and it also trains researchers to remain creative in developing learning media for students.

  11. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  12. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  13. Comparative case study on website traffic generated by search engine optimisation and a pay-per-click campaign, versus marketing expenditure

    Directory of Open Access Journals (Sweden)

    Wouter T. Kritzinger

    2015-09-01

    Full Text Available Background: No empirical work was found on how marketing expenses compare when used solely for one or the other of the two main types of search engine marketing. Objectives: This research set out to determine how the results of the implementation of a pay-per-click campaign compared to those of a search engine optimisation campaign, given the same website and environment. At the same time, the expenses incurred on both these marketing methods were recorded and compared. Method: The active website of an existing, successful e-commerce concern was used as platform. The company had been using pay-per-click only for a period, whilst traffic was monitored. This system was decommissioned on a particular date and time, and an alternative search engine optimisation system was started at the same time. Again, both traffic and expenses were monitored. Results: The results indicate that the pay-per-click system did produce favourable results, but only on condition that a monthly fee is set aside to guarantee consistent traffic. The implementation of search engine optimisation required a relatively large investment at the outset, but it was once-off. After a drop in traffic owing to crawler visitation delays, the website traffic surpassed the average figure achieved during the pay-per-click period after a little over three months, whilst the expenditure crossed over after just six months. Conclusion: Whilst considering the specific parameters of this study, an investment in search engine optimisation rather than a pay-per-click campaign appears to produce better results at a lower cost, after a given period of time.

  14. Discussion on Implementation of ICRP Recommendations Concerning Reference Levels and Optimisation

    International Nuclear Information System (INIS)

    2013-02-01

    International Commission on Radiological Protection (ICRP) Publication 103, 'The 2007 Recommendations of the International Commission on Radiological Protection', issued in 2007, defines emergency exposure situations as unexpected situations that may require the implementation of urgent protective actions and perhaps longer term protective actions. The ICRP continues to recommend optimisation and the use of reference levels to ensure an adequate degree of protection in regard to exposure to ionising radiation in emergency exposure situations. Reference levels represent the level of dose or risk above which it is judged to be inappropriate to plan to allow exposures to occur and for which protective actions should therefore be planned and optimised. National authorities are responsible for establishing reference levels. The Expert Group on the Implementation of New International Recommendations for Emergency Exposure Situations (EGIRES) performed a survey to analyse the established processes for optimisation of the protection strategy for emergency exposure situations and for practical implementation of the reference level concept in several member states of the Nuclear Energy Agency (NEA). The EGIRES collected information on several national optimisation strategy definitions, on optimisation of protection for different protective actions, and also on optimisation of urgent protective actions. In addition, national criteria for setting reference levels, their use, and relevant processes, including specific triggers and dosimetric quantities in setting reference levels, are focus points that the EGIRES also evaluated. The analysis of national responses to this 2011 survey shows many differences in the interpretation and application of the established processes and suggests that most countries are still in the early stages of implementing these processes. Since 2011, national authorities have continued their study of the ICRP recommendations to incorporate them into

  15. Shape optimisation and performance analysis of flapping wings

    KAUST Repository

    Ghommem, Mehdi

    2012-09-04

    In this paper, shape optimisation of flapping wings in forward flight is considered. This analysis is performed by combining a local gradient-based optimizer with the unsteady vortex lattice method (UVLM). Although the UVLM applies only to incompressible, inviscid flows where the separation lines are known a priori, Persson et al. [1] showed through a detailed comparison between UVLM and higher-fidelity computational fluid dynamics methods for flapping flight that the UVLM schemes produce accurate results for attached flow cases and even remain trend-relevant in the presence of flow separation. As such, they recommended the use of an aerodynamic model based on UVLM to perform preliminary design studies of flapping wing vehicles. Unlike standard computational fluid dynamics schemes, this method requires meshing of the wing surface only and not of the whole flow domain [2]. From the design or optimisation perspective taken in our work, it is fairly common (and sometimes entirely necessary, as a result of the excessive computational cost of the highest fidelity tools such as Navier-Stokes solvers) to rely upon such a moderate level of modelling fidelity to traverse the design space in an economical manner. The objective of the work, described in this paper, is to identify a set of optimised shapes that maximise the propulsive efficiency, defined as the ratio of the propulsive power over the aerodynamic power, under lift, thrust, and area constraints. The shape of the wings is modelled using B-splines, a technology used in the computer-aided design (CAD) field for decades. This basis can be used to smoothly discretize wing shapes with few degrees of freedom, referred to as control points. The locations of the control points constitute the design variables. The results suggest that changing the shape yields significant improvement in the performance of the flapping wings. The optimisation pushes the design to "bird-like" shapes with substantial increase in the time
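
    A small sketch of the B-spline shape parameterisation idea: a wing outline described by a cubic B-spline whose control points act as design variables. The control point values and knot construction are illustrative assumptions, not the paper's wing geometry.

        # Cubic B-spline outline whose control points are the design variables.
        import numpy as np
        from scipy.interpolate import BSpline

        degree = 3
        control_points = np.array([[0.0, 0.0], [0.2, 0.12], [0.5, 0.18],
                                   [0.8, 0.12], [1.0, 0.0]])          # design variables
        n_ctrl = len(control_points)
        # Clamped knot vector so the curve starts and ends at the end control points.
        knots = np.concatenate([np.zeros(degree + 1),
                                np.linspace(0, 1, n_ctrl - degree + 1)[1:-1],
                                np.ones(degree + 1)])

        spline = BSpline(knots, control_points, degree)
        u = np.linspace(0, 1, 50)
        outline = spline(u)                       # 50 points on the wing outline

        # A gradient-based optimiser would perturb control_points, rebuild the spline,
        # re-run the aerodynamic model and re-evaluate the propulsive efficiency.
        print(outline[:3])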

  16. A New Computational Technique for the Generation of Optimised Aircraft Trajectories

    Science.gov (United States)

    Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto

    2017-12-01

    A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for a global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented to allow an efficient solution of problems in which two or multiple performance indices are to be minimized simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two objectives simultaneously. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in-depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
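
    The ε-constraint idea can be illustrated on a toy bi-objective problem: minimise one objective subject to the other staying below a threshold ε, and sweep ε to trace a Pareto front (the adaptive bisection in the paper refines how ε is chosen). The two objective functions below are invented stand-ins; the real method also involves a pseudospectral transcription of the aircraft dynamics, which is beyond this sketch.

        # Toy epsilon-constraint sweep for a two-objective problem with SciPy's SLSQP.
        import numpy as np
        from scipy.optimize import minimize

        def f1(x):            # stand-in for a fuel-like objective
            return (x[0] - 2.0) ** 2 + 0.5 * x[1] ** 2

        def f2(x):            # stand-in for a time-like objective
            return (x[0] + 1.0) ** 2 + (x[1] - 1.0) ** 2

        pareto = []
        for eps in np.linspace(1.0, 9.0, 9):
            res = minimize(f1, x0=[0.0, 0.0], method="SLSQP",
                           constraints=[{"type": "ineq", "fun": lambda x, e=eps: e - f2(x)}])
            if res.success:
                pareto.append((f1(res.x), f2(res.x)))

        for point in pareto:
            print(np.round(point, 3))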

  17. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  18. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Full Text Available Farming is undergoing a digital revolution. Our existing review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations. For example, Who retains ownership of the data generated by applications like Monsanto Corproation's Weed I.D . “app”? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies and it allows for a focus on the material consequences of big data in society.

  19. Optimised sensitivity to leptonic CP violation from spectral information: the LBNO case at 2300 km baseline

    CERN Document Server

    Agarwalla, S K; Aittola, M; Alekou, A; Andrieu, B; Antoniou, F; Asfandiyarov, R; Autiero, D; Bésida, O; Balik, A; Ballett, P; Bandac, I; Banerjee, D; Bartmann, W; Bay, F; Biskup, B; Blebea-Apostu, A M; Blondel, A; Bogomilov, M; Bolognesi, S; Borriello, E; Brancus, I; Bravar, A; Buizza-Avanzini, M; Caiulo, D; Calin, M; Calviani, M; Campanelli, M; Cantini, C; Cata-Danil, G; Chakraborty, S; Charitonidis, N; Chaussard, L; Chesneanu, D; Chipesiu, F; Crivelli, P; Dawson, J; De Bonis, I; Declais, Y; Sanchez, P Del Amo; Delbart, A; Di Luise, S; Duchesneau, D; Dumarchez, J; Efthymiopoulos, I; Eliseev, A; Emery, S; Enqvist, T; Enqvist, K; Epprecht, L; Erykalov, A N; Esanu, T; Franco, D; Friend, M; Galymov, V; Gavrilov, G; Gendotti, A; Giganti, C; Gilardoni, S; Goddard, B; Gomoiu, C M; Gornushkin, Y A; Gorodetzky, P; Haesler, A; Hasegawa, T; Horikawa, S; Huitu, K; Izmaylov, A; Jipa, A; Kainulainen, K; Karadzhov, Y; Khabibullin, M; Khotjantsev, A; Kopylov, A N; Korzenev, A; Kosyanenko, S; Kryn, D; Kudenko, Y; Kuusiniemi, P; Lazanu, I; Lazaridis, C; Levy, J -M; Loo, K; Maalampi, J; Margineanu, R M; Marteau, J; Martin-Mari, C; Matveev, V; Mazzucato, E; Mefodiev, A; Mineev, O; Mirizzi, A; Mitrica, B; Murphy, S; Nakadaira, T; Narita, S; Nesterenko, D A; Nguyen, K; Nikolics, K; Noah, E; Novikov, Yu; Oprima, A; Osborne, J; Ovsyannikova, T; Papaphilippou, Y; Pascoli, S; Patzak, T; Pectu, M; Pennacchio, E; Periale, L; Pessard, H; Popov, B; Ravonel, M; Rayner, M; Resnati, F; Ristea, O; Robert, A; Rubbia, A; Rummukainen, K; Saftoiu, A; Sakashita, K; Sanchez-Galan, F; Sarkamo, J; Saviano, N; Scantamburlo, E; Sergiampietri, F; Sgalaberna, D; Shaposhnikova, E; Slupecki, M; Smargianaki, D; Stanca, D; Steerenberg, R; Sterian, A R; Sterian, P; Stoica, S; Strabel, C; Suhonen, J; Suvorov, V; Toma, G; Tonazzo, A; Trzaska, W H; Tsenov, R; Tuominen, K; Valram, M; Vankova-Kirilova, G; Vannucci, F; Vasseur, G; Velotti, F; Velten, P; Venturi, V; Viant, T; Vihonen, S; Vincke, H; Vorobyev, A; Weber, A; Wu, S; Yershov, N; Zambelli, L; Zito, M

    2014-01-01

    One of the main goals of the Long Baseline Neutrino Observatory (LBNO) is to study the $L/E$ behaviour (spectral information) of the electron neutrino and antineutrino appearance probabilities, in order to determine the unknown CP-violation phase $\delta_{CP}$ and discover CP-violation in the leptonic sector. The result is based on the measurement of the appearance probabilities in a broad range of energies, covering the 1st and 2nd oscillation maxima, at a very long baseline of 2300 km. The sensitivity of the experiment can be maximised by optimising the energy spectra of the neutrino and anti-neutrino fluxes. Such an optimisation requires exploring an extended range of parameters describing in detail the geometries and properties of the primary protons, hadron target and focusing elements in the neutrino beam line. In this paper we present a numerical solution that leads to optimised energy spectra and study its impact on the sensitivity of LBNO to discover leptonic CP violation. In the optimised flux ...

  20. Optimising polarised neutron scattering measurements--XYZ and polarimetry analysis

    International Nuclear Information System (INIS)

    Cussen, L.D.; Goossens, D.J.

    2002-01-01

    The analytic optimisation of neutron scattering measurements made using XYZ polarisation analysis and neutron polarimetry techniques is discussed. Expressions for the 'quality factor' and the optimum division of counting time for the XYZ technique are presented. For neutron polarimetry the optimisation is identified as analogous to that for measuring the flipping ratio and reference is made to the results already in the literature

  1. Optimising polarised neutron scattering measurements--XYZ and polarimetry analysis

    CERN Document Server

    Cussen, L D

    2002-01-01

    The analytic optimisation of neutron scattering measurements made using XYZ polarisation analysis and neutron polarimetry techniques is discussed. Expressions for the 'quality factor' and the optimum division of counting time for the XYZ technique are presented. For neutron polarimetry the optimisation is identified as analogous to that for measuring the flipping ratio and reference is made to the results already in the literature.

  2. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  3. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  4. Big Data: Survey, Technologies, Opportunities, and Challenges

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  5. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  6. Design Optimisation and Control of a Pilot Operated Seat Valve

    DEFF Research Database (Denmark)

    Nielsen, Brian; Andersen, Torben Ole; Hansen, Michael Rygaard

    2004-01-01

    The paper gives an approach for optimisation of the bandwidth of a pilot operated seat valve for mobile applications. Physical dimensions as well as parameters of the implemented control loop are optimised simultaneously. The frequency response of the valve varies as a function of the pressure drop...

  7. Public transport optimisation emphasising passengers’ travel behaviour.

    OpenAIRE

    Jensen, Jens Parbo; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    Passengers in public transport complaining about their travel experiences are not uncommon. This might seem counterintuitive since several operators worldwide are presenting better key performance indicators year by year. The present PhD study focuses on developing optimisation algorithms to enhance the operations of public transport while explicitly emphasising passengers’ travel behaviour and preferences. Similar to economic theory, interactions between supply and demand are omnipresent in ...

  8. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  9. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  10. Biomass supply chain optimisation for Organosolv-based biorefineries.

    Science.gov (United States)

    Giarola, Sara; Patel, Mayank; Shah, Nilay

    2014-05-01

    This work aims at providing a Mixed Integer Linear Programming modelling framework to help define planning strategies for the development of sustainable biorefineries. The up-scaling of an Organosolv biorefinery was addressed via optimisation of the whole system economics. Three real world case studies were addressed to show the high-level flexibility and wide applicability of the tool to model different biomass typologies (i.e. forest fellings, cereal residues and energy crops) and supply strategies. Model outcomes have revealed how supply chain optimisation techniques could help shed light on the development of sustainable biorefineries. Feedstock quality, quantity, temporal and geographical availability are crucial to determine biorefinery location and the cost-efficient way to supply the feedstock to the plant. Storage costs are relevant for biorefineries based on cereal stubble, while wood supply chains present dominant pretreatment operations costs. Copyright © 2014 Elsevier Ltd. All rights reserved.
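
    A toy sketch of the kind of MILP behind such a framework: choosing where to open a biorefinery and how to ship biomass to it at minimum cost, written with the PuLP library. All sites, costs and capacities below are invented; the actual model in the work is far richer (storage, pretreatment, multiple periods and feedstocks).

        # Toy biorefinery siting and biomass supply MILP with PuLP.
        from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

        sites = ["A", "B"]                     # candidate biorefinery locations
        regions = ["r1", "r2", "r3"]           # biomass supply regions
        supply = {"r1": 40, "r2": 60, "r3": 50}            # kt/year available
        demand = 100                                        # kt/year needed by the plant
        open_cost = {"A": 500, "B": 420}                    # annualised capital cost
        ship_cost = {("r1", "A"): 3, ("r1", "B"): 6, ("r2", "A"): 4,
                     ("r2", "B"): 2, ("r3", "A"): 5, ("r3", "B"): 3}

        prob = LpProblem("biorefinery_siting", LpMinimize)
        build = LpVariable.dicts("build", sites, cat=LpBinary)
        flow = LpVariable.dicts("flow", ship_cost.keys(), lowBound=0)

        # Objective: capital cost of the opened plant plus transport cost.
        prob += lpSum(open_cost[s] * build[s] for s in sites) + \
                lpSum(ship_cost[k] * flow[k] for k in ship_cost)
        prob += lpSum(build[s] for s in sites) == 1                      # build one plant
        for r in regions:
            prob += lpSum(flow[(r, s)] for s in sites) <= supply[r]      # availability
        for s in sites:
            prob += lpSum(flow[(r, s)] for r in regions) >= demand * build[s]  # feed it

        prob.solve()
        print({s: value(build[s]) for s in sites}, value(prob.objective))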

  11. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  12. Optimisation of Protection as applicable to geological disposal: the ICRP view

    International Nuclear Information System (INIS)

    Weiss, W.

    2010-01-01

    Wolfgang Weiss (BfS), vice-chair of ICRP Committee 4, recalled that the role of optimisation is to select the best protection options under the prevailing circumstances based on scientific considerations, societal concerns and ethical aspects as well as considerations of transparency. An important role of the concept of optimisation of protection is to foster a 'safety culture' and thereby to engender a state of thinking in everyone responsible for control of radiation exposures, such that they are continuously asking themselves the question, 'Have I done all that I reasonably can to avoid or reduce these doses?' Clearly, the answer to this question is a matter of judgement and necessitates co-operation between all parties involved and, as a minimum, the operating management and the regulatory agencies, but the dialogue would be more complete if other stakeholders were also involved. What kinds of checks and balances or factors would need to be considered for an 'optimal' system? Can indicators be identified? Quantitative methods may provide input to this dialogue but they should never be the sole input. The ICRP considers that the parameters to take into account also include social considerations and values, environmental considerations, as well as technical and economic considerations. Wolfgang Weiss approached the question of the distinction to be made between system optimisation (in the sense of taking account of social and economic as well as of all types of hazards) and optimisation of radiological protection. The position of the ICRP is that the system of protection that it proposes is based on both science (quantification of the health risk) and value judgement (what is an acceptable risk?) and optimisation is the recommended process to integrate both aspects. Indeed, there has been evolution since the old system of intervention levels to the new system, whereby, even if the level of the dose or risk (which is called a constraint in ICRP-81) is met

  13. Numerical Analysis and Geometry Optimisation of Vertical Vane of Room Air-conditioner

    Directory of Open Access Journals (Sweden)

    Al-Obaidi Abdulkareem Sh. Mahdi

    2018-01-01

    Full Text Available Vertical vanes of room air-conditioners are used to control and direct cold air. This paper studies the vertical vane as one of the parameters that affect the efficiency of dissipating cold air into a given space. The vertical vane geometry is analysed and optimised for lower production cost using CFD. The optimised geometry should have the same or higher efficiency of dissipating cold air and a lower mass than the existing original design. The existing original design of the vertical vane is simplified and analysed using ANSYS Fluent. Efficiency of wind direction is defined as how accurately the airflow leaving the vertical vane follows the intended direction. To calculate this efficiency, 15° and 30° rotations of the vertical vane inside the room air-conditioner are simulated. The efficiency of wind direction is 57.81% for the 15° rotation and 47.54% for the 30° rotation. These results are used as the baseline for a parametric study focused on the length of the long span, the tip chord and the short span. The design with a 15% decrease in vane surface area at the tip chord is the best optimised design of the vertical vane because its efficiency of wind direction is the highest, at 60.32%.

  14. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  15. Big data in forensic science and medicine.

    Science.gov (United States)

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon and many biomedical disciplines have established their own forums on the topic. Perspectives and debates are flourishing, while a consensual definition of big data is still lacking. The 3Vs paradigm, standing for Volume, Variety and Velocity, is frequently evoked to define the big data principles. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On the one hand, techniques usually presented as specific to big data, such as machine learning, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the oldest are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields such as artificial intelligence are often underestimated, if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention since they delineate what is at stake. In this context, forensic science is still awaiting its position papers as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a true interdisciplinary approach in forensic science and medicine that is mainly based on evidence. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  16. Visualization of big data security: a case study on the KDD99 cup data set

    Directory of Open Access Journals (Sweden)

    Zichan Ruan

    2017-11-01

    Full Text Available Cyber security has been thrust into the limelight in the modern technological era because of an array of attacks that often bypass untrained intrusion detection systems (IDSs). Therefore, greater attention has been directed at deciphering better methods for identifying attack types so as to train IDSs more effectively. Key cyber-attack insights exist in big data; however, an efficient approach is required to determine strong attack types to train IDSs to become more effective in key areas. Despite the rising growth in IDS research, there is a lack of studies involving big data visualization, which is key. The KDD99 data set has served as a strong benchmark since 1999; therefore, we utilized this data set in our experiment. In this study, we utilized a hash algorithm, a weight table, and a sampling method to deal with the inherent problems of analyzing big data: volume, variety, and velocity. By utilizing a visualization algorithm, we were able to gain insights into the KDD99 data set, with a clear identification of “normal” clusters, and described distinct clusters of effective attacks.
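
    The pipeline sketched in the abstract (a hash algorithm, a weight table and sampling to cope with volume, variety and velocity) could look roughly like the following: records are hashed for stable identification, a weight table counts how many records of each label were seen, and reservoir sampling keeps a plottable subset per attack class. The record fields and helper names are illustrative assumptions, not the authors' implementation.

    ```python
    # Sketch of the kind of pre-processing the abstract describes: hash each record,
    # keep a weight (count) table per attack label, and down-sample the stream so a
    # visualisation tool only has to plot a manageable subset. The record format and
    # helper names below are illustrative, not the authors' actual pipeline.
    import hashlib
    import random
    from collections import defaultdict

    def record_hash(record: dict) -> int:
        """Stable hash of a connection record (e.g. a KDD99 row serialised as text)."""
        text = ",".join(str(record[k]) for k in sorted(record))
        return int(hashlib.md5(text.encode()).hexdigest(), 16)

    def sample_stream(records, per_label=100, seed=0):
        rng = random.Random(seed)
        weights = defaultdict(int)        # weight table: how many records of each label were seen
        samples = defaultdict(list)       # reservoir sample per label, at most `per_label` each
        for rec in records:
            label = rec["label"]
            weights[label] += 1
            if len(samples[label]) < per_label:
                samples[label].append(rec)
            else:                         # classic reservoir-sampling replacement step
                j = rng.randint(0, weights[label] - 1)
                if j < per_label:
                    samples[label][j] = rec
        return weights, samples

    # Tiny synthetic stream standing in for KDD99 rows.
    stream = [{"duration": i % 5, "src_bytes": i * 7 % 100,
               "label": "normal" if i % 3 else "smurf"} for i in range(10_000)]
    weights, samples = sample_stream(stream, per_label=50)
    print({k: (weights[k], len(v), hex(record_hash(v[0]))[:10]) for k, v in samples.items()})
    ```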

  17. Internet of things and Big Data as potential solutions to the problems in waste electrical and electronic equipment management: An exploratory study.

    Science.gov (United States)

    Gu, Fu; Ma, Buqing; Guo, Jianfeng; Summers, Peter A; Hall, Philip

    2017-10-01

    Management of Waste Electrical and Electronic Equipment (WEEE) is a vital part of solid waste management, yet some difficult issues still require attention. This paper investigates the potential of applying the Internet of Things (IoT) and Big Data as solutions to WEEE management problems. The massive data generated during the production, consumption and disposal of Electrical and Electronic Equipment (EEE) fit the characteristics of Big Data. Using state-of-the-art communication technologies, the IoT derives the WEEE "Big Data" from the life cycle of EEE, and Big Data technologies process the WEEE "Big Data" to support decision making in WEEE management. A framework for implementing the IoT and Big Data technologies is proposed, and its multiple layers are illustrated. Case studies with potential application scenarios of the framework are presented and discussed. As an unprecedented exploration, the combined application of the IoT and Big Data technologies in WEEE management brings a series of opportunities as well as new challenges. This study provides insights and visions for stakeholders in solving WEEE management problems in the context of IoT and Big Data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Study of LBS for characterization and analysis of big data benchmarks

    International Nuclear Information System (INIS)

    Chandio, A.A.; Zhang, F.; Memon, T.D.

    2014-01-01

    In the past few years, most organizations have been gradually diverting their applications and services to the Cloud. This is because the Cloud paradigm enables (a) on-demand access and (b) large-scale data processing for their applications and users anywhere in the world over the Internet. The rapid growth of urbanization in developed and developing countries has led to a new emerging concept called Urban Computing, one of the application domains rapidly being deployed to the Cloud. More precisely, in Urban Computing, sensors, vehicles, devices, buildings, and roads are used as components to probe city dynamics. Their data are widely available, including GPS traces of vehicles. However, their applications are data-processing and storage hungry, because data volumes grow from a few dozen terabytes (TB) to thousands of petabytes (PB), i.e. Big Data. To advance the development and assessment of applications such as LBS (Location Based Services), a Big Data benchmark is urgently needed. This is a novel study of LBS to characterize and analyze Big Data benchmarks. We focused on map-matching, which is used as a pre-processing step in many LBS applications. In this preliminary work, the paper also describes the current status of Big Data benchmarks and our future direction. (author)

  19. Study on LBS for Characterization and Analysis of Big Data Benchmarks

    Directory of Open Access Journals (Sweden)

    Aftab Ahmed Chandio

    2014-10-01

    Full Text Available In the past few years, most organizations have been gradually diverting their applications and services to the Cloud. This is because the Cloud paradigm enables (a) on-demand access and (b) large-scale data processing for their applications and users anywhere in the world over the Internet. The rapid growth of urbanization in developed and developing countries has led to a new emerging concept called Urban Computing, one of the application domains rapidly being deployed to the Cloud. More precisely, in Urban Computing, sensors, vehicles, devices, buildings, and roads are used as components to probe city dynamics. Their data are widely available, including GPS traces of vehicles. However, their applications are data-processing and storage hungry, because data volumes grow from a few dozen terabytes (TB) to thousands of petabytes (PB), i.e. Big Data. To advance the development and assessment of applications such as LBS (Location Based Services), a Big Data benchmark is urgently needed. This is a novel study of LBS to characterize and analyze Big Data benchmarks. We focused on map-matching, which is used as a pre-processing step in many LBS applications. In this preliminary work, the paper also describes the current status of Big Data benchmarks and our future direction

  20. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  1. Statistical optimisation techniques in fatigue signal editing problem

    International Nuclear Information System (INIS)

    Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.

    2015-01-01

    Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses joint application of the Running Damage Extraction (RDE) technique and single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a single constrained optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.

  2. Statistical optimisation techniques in fatigue signal editing problem

    Energy Technology Data Exchange (ETDEWEB)

    Nopiah, Z. M.; Osman, M. H. [Fundamental Engineering Studies Unit Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia); Baharin, N.; Abdullah, S. [Department of Mechanical and Materials Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia)

    2015-02-03

    Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses joint application of the Running Damage Extraction (RDE) technique and single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a single constrained optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.
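
    A minimal sketch of the constrained-GA idea shared by the two records above: segments of a strain record are selected with a binary chromosome, the edited length is minimised, and retention constraints are enforced through penalties. The segment statistics and thresholds below are made up; the paper constrains cumulative fatigue damage, root mean square and kurtosis computed from real strain signals.

    ```python
    # Minimal sketch of the paper's idea: treat segment selection as a constrained
    # binary optimisation solved with a simple GA. Segment statistics and constraint
    # thresholds here are made up for illustration; the paper uses fatigue damage,
    # RMS and kurtosis computed from real strain records.
    import random

    random.seed(1)
    N_SEG = 40
    segments = [{"length": random.uniform(1, 10),
                 "damage": random.uniform(0, 1),
                 "energy": random.uniform(0, 5)} for _ in range(N_SEG)]
    TOTAL_DAMAGE = sum(s["damage"] for s in segments)
    TOTAL_ENERGY = sum(s["energy"] for s in segments)

    def fitness(bits):
        """Shorter edited signal is better, but retained damage/energy must stay within 5%."""
        length = sum(s["length"] for s, b in zip(segments, bits) if b)
        damage = sum(s["damage"] for s, b in zip(segments, bits) if b)
        energy = sum(s["energy"] for s, b in zip(segments, bits) if b)
        penalty = 0.0
        penalty += max(0.0, 0.95 - damage / TOTAL_DAMAGE) * 1e3   # keep >= 95% of damage
        penalty += max(0.0, 0.95 - energy / TOTAL_ENERGY) * 1e3   # keep >= 95% of energy
        return length + penalty                                    # minimise

    def evolve(pop_size=60, generations=200, p_mut=0.02):
        pop = [[random.randint(0, 1) for _ in range(N_SEG)] for _ in range(pop_size)]
        for _ in range(generations):
            parents = sorted(pop, key=fitness)[: pop_size // 2]    # truncation selection
            children = []
            while len(children) < pop_size:
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, N_SEG)
                child = a[:cut] + b[cut:]                          # one-point crossover
                child = [1 - g if random.random() < p_mut else g for g in child]
                children.append(child)
            pop = children
        return min(pop, key=fitness)

    best = evolve()
    print("kept", sum(best), "of", N_SEG, "segments, fitness", round(fitness(best), 2))
    ```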

  3. Big data and analytics strategic and organizational impacts

    CERN Document Server

    Morabito, Vincenzo

    2015-01-01

    This book presents and discusses the main strategic and organizational challenges posed by Big Data and analytics in a manner relevant to both practitioners and scholars. The first part of the book analyzes strategic issues relating to the growing relevance of Big Data and analytics for competitive advantage, which is also attributable to empowerment of activities such as consumer profiling, market segmentation, and development of new products or services. Detailed consideration is also given to the strategic impact of Big Data and analytics on innovation in domains such as government and education and to Big Data-driven business models. The second part of the book addresses the impact of Big Data and analytics on management and organizations, focusing on challenges for governance, evaluation, and change management, while the concluding part reviews real examples of Big Data and analytics innovation at the global level. The text is supported by informative illustrations and case studies, so that practitioners...

  4. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  5. A national optimisation model for energy wood streams; Energiapuuvirtojen valtakunnallinen optimointimalli

    Energy Technology Data Exchange (ETDEWEB)

    Iikkanen, P.; Keskinen, S.; Korpilahti, A.; Raesaenen, T.; Sirkiae, A.

    2011-07-01

    In 2010 a total of 12.5 terawatt hours of forest energy was used in Finland's heat and power plants. According to studies by Metsaeteho and Poeyry, use of energy wood will nearly double to 21.6 terawatt hours by 2020. There are also plans to use energy wood as a raw material for biofuel plants. The techno-ecological supply potential of energy wood in 2020 is estimated at 42.9 terawatt hours. Energy wood has been transported almost entirely by road. The situation is changing, however, because growing demand for energy wood will expand raw wood procurement areas and lengthen transport distances. A cost-effective transport system therefore also requires the use of rail and waterway transports. In Finland, however, there is almost a complete absence of the terminals required for the use of rail and waterway transports: terminals where energy wood is chipped, temporarily stored and loaded onto railway wagons and vessels for further transport. A national optimisation model for energy wood has been developed to serve transport system planning in particular. The linear optimisation model optimises, on a national level, goods streams between supply points and usage points based on forest energy procurement costs. The model simultaneously covers deliveries of forest chips, stumps and small-sized thinning wood. The procurement costs used in the optimisation include the energy wood's roadside price and the costs of chipping, transport and terminal handling. The transport system described in the optimisation model consists of wood supply points (2007 municipality precision), wood usage points, railway terminals and the connections between them along the main road and rail network. Elements required for the examination of waterway transports can also be easily added to the model. The optimisation model can be used to examine, for example, the effects of changes in energy wood demand and supply, as well as in transport costs, on energy wood goods streams, the relative use of different
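
    A toy version of the linear goods-stream optimisation described above, assuming it reduces to a plain transportation LP solved with scipy.optimize.linprog. The supply, demand and unit-cost numbers are invented, and the real model additionally covers terminals, rail and waterway modes and several wood assortments.

    ```python
    # Toy transportation LP in the spirit of the national optimisation model described
    # above: choose energy-wood flows from supply points to usage points that minimise
    # procurement plus transport cost. Supply, demand and unit costs are invented for
    # illustration and are not from the Metsaeteho/Poeyry study.
    import numpy as np
    from scipy.optimize import linprog

    supply = np.array([300.0, 200.0, 250.0])          # GWh available at three supply regions
    demand = np.array([280.0, 320.0])                 # GWh required at two plants
    unit_cost = np.array([[18.0, 27.0],               # EUR/MWh: roadside price + chipping
                          [22.0, 19.0],               #          + transport + terminal handling
                          [30.0, 17.0]])

    n_s, n_d = unit_cost.shape
    c = unit_cost.ravel()                             # decision vector x[i*n_d + j] = flow i -> j

    # Supply constraints: sum_j x_ij <= supply_i
    A_ub = np.zeros((n_s, n_s * n_d))
    for i in range(n_s):
        A_ub[i, i * n_d:(i + 1) * n_d] = 1.0
    # Demand constraints: sum_i x_ij = demand_j
    A_eq = np.zeros((n_d, n_s * n_d))
    for j in range(n_d):
        A_eq[j, j::n_d] = 1.0

    res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand, bounds=(0, None))
    print("minimum total cost:", round(res.fun, 1))
    print("flows (GWh):\n", res.x.reshape(n_s, n_d).round(1))
    ```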

  6. TELECOM BIG DATA FOR URBAN TRANSPORT ANALYSIS – A CASE STUDY OF SPLIT-DALMATIA COUNTY IN CROATIA

    OpenAIRE

    M. Baučić; N. Jajac; M. Bućan

    2017-01-01

    Today, big data has become widely available and the new technologies are being developed for big data storage architecture and big data analytics. An ongoing challenge is how to incorporate big data into GIS applications supporting the various domains. International Transport Forum explains how the arrival of big data and real-time data, together with new data processing algorithms lead to new insights and operational improvements of transport. Based on the telecom customer data, the...

  7. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    Energy Technology Data Exchange (ETDEWEB)

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  8. An exergy-based multi-objective optimisation model for energy retrofit strategies in non-domestic buildings

    International Nuclear Information System (INIS)

    García Kerdan, Iván; Raslan, Rokia; Ruyssevelt, Paul

    2016-01-01

    While the building sector has a significant thermodynamic improvement potential, exergy analysis has been shown to provide new insight for the optimisation of building energy systems. This paper presents an exergy-based multi-objective optimisation tool that aims to assess the impact of a diverse range of retrofit measures with a focus on non-domestic buildings. EnergyPlus was used as a dynamic calculation engine for first law analysis, while a Python add-on was developed to link dynamic exergy analysis and a Genetic Algorithm optimisation process with the aforementioned software. Two UK archetype case studies (an office and a primary school) were used to test the feasibility of the proposed framework. Different measures combinations based on retrofitting the envelope insulation levels and the application of different HVAC configurations were assessed. The objective functions in this study are annual energy use, occupants' thermal comfort, and total building exergy destructions. A large range of optimal solutions was achieved highlighting the framework capabilities. The model achieved improvements of 53% in annual energy use, 51% of exergy destructions and 66% of thermal comfort for the school building, and 50%, 33%, and 80% for the office building. This approach can be extended by using exergoeconomic optimisation. - Highlights: • Integration of dynamic exergy analysis into a retrofit-oriented simulation tool. • Two UK non-domestic building archetypes are used as case studies. • The model delivers non-dominated solutions based on energy, exergy and comfort. • Exergy destructions of ERMs are optimised using GA algorithms. • Strengths and limitations of the proposed exergy-based framework are discussed.
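
    The core of an NSGA-II-style search such as the one described above is non-dominated sorting over several objectives. The self-contained sketch below filters candidate retrofit packages to a Pareto front using a toy three-objective evaluator that stands in for the EnergyPlus/exergy simulation; the decision variables and the evaluator are illustrative only.

    ```python
    # Sketch of the multi-objective selection step used by NSGA-II-style tools such as
    # the one described above: evaluate candidate retrofit packages on several
    # objectives and keep the non-dominated (Pareto) set. The three-objective toy
    # evaluator below stands in for an EnergyPlus/exergy simulation.
    import random

    random.seed(0)

    def evaluate(package):
        """Return (energy use, exergy destruction, discomfort) -- all to be minimised."""
        insulation, hvac_eff = package
        energy = 100.0 / (1.0 + insulation) + 40.0 / hvac_eff
        exergy_destroyed = 60.0 / hvac_eff + 5.0 * insulation      # thicker insulation costs exergy to make
        discomfort = max(0.0, 8.0 - 4.0 * insulation - 2.0 * hvac_eff)
        return (energy, exergy_destroyed, discomfort)

    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(candidates):
        scored = [(c, evaluate(c)) for c in candidates]
        return [(c, f) for c, f in scored
                if not any(dominates(g, f) for _, g in scored if g != f)]

    candidates = [(random.uniform(0.0, 3.0), random.uniform(0.5, 3.0)) for _ in range(200)]
    front = pareto_front(candidates)
    for pkg, objs in sorted(front, key=lambda t: t[1][0])[:5]:
        print("insulation=%.2f  hvac_eff=%.2f  objectives=%s" %
              (pkg[0], pkg[1], tuple(round(o, 1) for o in objs)))
    ```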

  9. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  10. Solving dynamic multi-objective problems with vector evaluated particle swarm optimisation

    CSIR Research Space (South Africa)

    Greeff, M

    2008-06-01

    Full Text Available Many optimisation problems are multi-objective and change dynamically. Many methods use a weighted average approach to the multiple objectives. This paper introduces the usage of the vector evaluated particle swarm optimiser (VEPSO) to solve dynamic...

  11. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  12. Exploration of automatic optimisation for CUDA programming

    KAUST Repository

    Al-Mouhamed, Mayez; Khan, Ayaz ul Hassan

    2014-01-01

    © 2014 Taylor & Francis. Writing optimised compute unified device architecture (CUDA) program for graphic processing units (GPUs) is complex even for experts. We present a design methodology for a restructuring tool that converts C-loops into optimised CUDA kernels based on a three-step algorithm which are loop tiling, coalesced memory access and resource optimisation. A method for finding possible loop tiling solutions with coalesced memory access is developed and a simplified algorithm for restructuring C-loops into an efficient CUDA kernel is presented. In the evaluation, we implement matrix multiply (MM), matrix transpose (M-transpose), matrix scaling (M-scaling) and matrix vector multiply (MV) using the proposed algorithm. We present the analysis of the execution time and GPU throughput for the above applications, which favourably compare to other proposals. Evaluation is carried out while scaling the problem size and running under a variety of kernel configurations. The obtained speedup is about 28-35% for M-transpose compared to NVIDIA Software Development Kit, 33% speedup for MV compared to general purpose computation on graphics processing unit compiler, and more than 80% speedup for MM and M-scaling compared to CUDA-lite.

  13. Exploration of automatic optimisation for CUDA programming

    KAUST Repository

    Al-Mouhamed, Mayez

    2014-09-16

    © 2014 Taylor & Francis. Writing optimised compute unified device architecture (CUDA) program for graphic processing units (GPUs) is complex even for experts. We present a design methodology for a restructuring tool that converts C-loops into optimised CUDA kernels based on a three-step algorithm which are loop tiling, coalesced memory access and resource optimisation. A method for finding possible loop tiling solutions with coalesced memory access is developed and a simplified algorithm for restructuring C-loops into an efficient CUDA kernel is presented. In the evaluation, we implement matrix multiply (MM), matrix transpose (M-transpose), matrix scaling (M-scaling) and matrix vector multiply (MV) using the proposed algorithm. We present the analysis of the execution time and GPU throughput for the above applications, which favourably compare to other proposals. Evaluation is carried out while scaling the problem size and running under a variety of kernel configurations. The obtained speedup is about 28-35% for M-transpose compared to NVIDIA Software Development Kit, 33% speedup for MV compared to general purpose computation on graphics processing unit compiler, and more than 80% speedup for MM and M-scaling compared to CUDA-lite.
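
    The restructuring described in the two records above targets CUDA kernels; as a language-neutral illustration of the loop-tiling idea itself, the sketch below blocks the loops of a matrix multiply so that each tile of the operands is reused while it is resident in fast memory (cache on a CPU, shared memory in the generated CUDA kernels). It is a NumPy stand-in, not the paper's generated code.

    ```python
    # Illustration of the loop-tiling idea behind the restructuring tool described
    # above: the i/j/k loops of a matrix multiply are blocked so each tile of A and B
    # is reused many times while it is "hot" in fast memory. Pure NumPy stand-in.
    import numpy as np

    def matmul_tiled(A, B, tile=64):
        n, k = A.shape
        k2, m = B.shape
        assert k == k2
        C = np.zeros((n, m), dtype=A.dtype)
        for i0 in range(0, n, tile):
            for j0 in range(0, m, tile):
                for k0 in range(0, k, tile):
                    # One tile-sized block of work; all operands fit in fast memory.
                    C[i0:i0 + tile, j0:j0 + tile] += (
                        A[i0:i0 + tile, k0:k0 + tile] @ B[k0:k0 + tile, j0:j0 + tile])
        return C

    A = np.random.rand(200, 150)
    B = np.random.rand(150, 120)
    assert np.allclose(matmul_tiled(A, B), A @ B)
    ```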

  14. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  15. Validation of a large-scale audit technique for CT dose optimisation

    International Nuclear Information System (INIS)

    Wood, T. J.; Davis, A. W.; Moore, C. S.; Beavis, A. W.; Saunderson, J. R.

    2008-01-01

    The expansion and increasing availability of computed tomography (CT) imaging means that there is a greater need for the development of efficient optimisation strategies that are able to inform clinical practice, without placing a significant burden on limited departmental resources. One of the most fundamental aspects to any optimisation programme is the collection of patient dose information, which can be compared with appropriate diagnostic reference levels. This study has investigated the implementation of a large-scale audit technique, which utilises data that already exist in the radiology information system, to determine typical doses for a range of examinations on four CT scanners. This method has been validated against what is considered the 'gold standard' technique for patient dose audits, and it has been demonstrated that results equivalent to the 'standard-sized patient' can be inferred from this much larger data set. This is particularly valuable where CT optimisation is concerned as it is considered a 'high dose' technique, and hence close monitoring of patient dose is particularly important. (authors)

  16. Classical propagation of strings across a big crunch/big bang singularity

    International Nuclear Information System (INIS)

    Niz, Gustavo; Turok, Neil

    2007-01-01

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z_2, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001).][P. J. Steinhardt and N. Turok, Science 296, 1436 (2002).][N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004).]. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example

  17. Optimisation of BPMN Business Models via Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for the optimisation of business processes modelled in the business process modelling language BPMN, which builds upon earlier work, where we developed a model checking based method for the analysis of BPMN models. We define a structure for expressing optimisation goals...... for synthesized BPMN components, based on probabilistic computation tree logic and real-valued reward structures of the BPMN model, allowing for the specification of complex quantitative goals. We here present a simple algorithm, inspired by concepts from evolutionary algorithms, which iteratively generates...

  18. Intelligent Support for a Computer Aided Design Optimisation Cycle

    OpenAIRE

    B. Dolšak; M. Novak; J. Kaljun

    2006-01-01

    It is becoming more and more evident that adding intelligence to existing computer aids, such as computer aided design systems, can lead to significant improvements in the effective and reliable performance of various engineering tasks, including design optimisation. This paper presents three different intelligent modules to be applied within a computer aided design optimisation cycle to enable more intelligent and less experience-dependent design performance.

  19. Practice variation in Big-4 transparency reports

    NARCIS (Netherlands)

    Girdhar, Sakshi; Jeppesen, K.K.

    2018-01-01

    Purpose The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach The study draws on a

  20. Process and Economic Optimisation of a Milk Processing Plant with Solar Thermal Energy

    DEFF Research Database (Denmark)

    Bühler, Fabian; Nguyen, Tuong-Van; Elmegaard, Brian

    2016-01-01

    This work investigates the integration of solar thermal systems for process energy use. A shift from fossil fuels to renewable energy could be beneficial both from environmental and economic perspectives, after the process itself has been optimised and efficiency measures have been implemented. Based on the case study of a dairy factory, where first a heat integration is performed to optimise the system, a model for solar thermal process integration is developed. The detailed model is based on annual hourly global direct and diffuse solar radiation, from which the radiation on a defined surface is calculated. Based on hourly process stream data from the dairy factory, the optimal streams for solar thermal process integration are found, with an optimal thermal storage tank volume. The last step consists of an economic optimisation of the problem to determine the optimal size...

  1. The principle of optimisation: reasons for success and legal criticism

    International Nuclear Information System (INIS)

    Fernandez Regalado, Luis

    2008-01-01

    The International Commission on Radiological Protection (ICRP) adopted new recommendations in 2007. In broad outline they fundamentally continue the recommendations already approved in 1990 and later on. The principle of optimisation of protection, together with the principles of justification and dose limits, continues to play a key role in the ICRP recommendations, as it has for many years. This principle, somewhat reinforced in the 2007 ICRP recommendations, has been incorporated into norms and legislation which have peacefully been in force in many countries all over the world. There are three main reasons that explain the success in the application of the principle of optimisation in radiological protection: First, the subjectivity of the sentence that embraces the principle of optimisation, 'As low as reasonably achievable' (ALARA), which allows different valid interpretations under different circumstances. Second, the pragmatism and adaptability of ALARA to all exposure situations. And third, the scientific humility behind the principle of optimisation, which contrasts clearly with the old-fashioned scientific positivism that enshrined scientists' opinions. Nevertheless, from a legal point of view, some criticism has been cast over the principle of optimisation in radiological protection where it has been transformed into a compulsory norm. This criticism is based on two arguments: the lack of democratic participation in the process of elaborating the norm, and the legal uncertainty associated with its application. Both arguments are to some extent acknowledged by the ICRP which, on the one hand, has broadened the participation of experts, associations and the professional radiological protection community, increasing the transparency of how decisions on recommendations have been taken, and on the other hand has warned about the need for authorities to specify general criteria to develop the principle of optimisation in national

  2. A comparison of an energy/economic-based against an exergoeconomic-based multi-objective optimisation for low carbon building energy design

    International Nuclear Information System (INIS)

    García Kerdan, Iván; Raslan, Rokia; Ruyssevelt, Paul; Morillón Gálvez, David

    2017-01-01

    This study presents a comparison of the optimisation of building energy retrofit strategies from two different perspectives: an energy/economic-based analysis and an exergy/exergoeconomic-based analysis. A recently retrofitted community centre is used as a case study. ExRET-Opt, a novel building energy/exergy simulation tool with multi-objective optimisation capabilities based on NSGA-II, is used to run both analyses. The first analysis, based on the 1st Law only, simultaneously optimises building energy use and the design's Net Present Value (NPV). The second analysis, based on the 1st and the 2nd Laws, simultaneously optimises exergy destructions and the exergoeconomic cost-benefit index. Occupant thermal comfort is considered as a common objective function for both approaches. The aim is to assess the difference between the methods and compare the performance across the main indicators, considering the same decision variables and constraints. Outputs show that the inclusion of exergy/exergoeconomics as objective functions into the optimisation procedure has resulted in similar 1st Law and thermal comfort outputs, while providing solutions with less environmental impact under similar capital investments. These outputs demonstrate how the 1st Law is only a necessary calculation while the utilisation of the 1st and 2nd Laws becomes a sufficient condition for the analysis and design of low carbon buildings. - Highlights: • The study compares an energy-based and an exergy-based building design optimisation. • Occupant thermal comfort is considered as a common objective function. • A comparison of thermodynamic outputs is made against the actual retrofit design. • Under similar constraints, second law optimisation presents better overall results. • Exergoeconomic optimisation solutions improve building exergy efficiency twofold.

  3. Achieving a Sustainable Urban Form through Land Use Optimisation: Insights from Bekasi City’s Land-Use Plan (2010–2030)

    Directory of Open Access Journals (Sweden)

    Rahmadya Trias Handayanto

    2017-02-01

    Full Text Available Cities worldwide have been trying to achieve a sustainable urban form to handle their rapid urban growth. Many sustainable urban forms have been studied and two of them, the compact city and the eco city, were chosen in this study as urban form foundations. Based on these forms, four sustainable city criteria (compactness, compatibility, dependency, and suitability) were considered as necessary functions for land use optimisation. This study presents land use optimisation as a method for achieving a sustainable urban form. Three optimisation methods (particle swarm optimisation, genetic algorithms, and a local search method) were combined into a single hybrid optimisation method for land use in Bekasi city, Indonesia. It was also used for examining Bekasi city’s land-use plan (2010–2030) after optimising current (2015) and future land use (2030). After current land use optimisation, the score of the sustainable city criteria increased significantly. Three important centres of land use (commercial, industrial, and residential) were also created through clustering the results. These centres were slightly different from the centres of the city plan zones. Additional land uses in 2030 were predicted using a nonlinear autoregressive neural network with external input. Three scenarios were used for allocating these additional land uses: sustainable development, government policy, and business-as-usual. The future land use allocation for 2030 showed that the sustainable development scenario performed better than the government policy and business-as-usual scenarios.

  4. Simulation and optimisation modelling approach for operation of the Hoa Binh Reservoir, Vietnam

    DEFF Research Database (Denmark)

    Ngo, Long le; Madsen, Henrik; Rosbjerg, Dan

    2007-01-01

    Hoa Binh, the largest reservoir in Vietnam, plays an important role in flood control for the Red River delta and hydropower generation. Due to its multi-purpose character, conflicts and disputes in operating the reservoir have been ongoing since its construction, particularly in the flood season....... This paper proposes to optimise the control strategies for the Hoa Binh reservoir operation by applying a combination of simulation and optimisation models. The control strategies are set up in the MIKE 11 simulation model to guide the releases of the reservoir system according to the current storage level......, the hydro-meteorological conditions, and the time of the year. A heuristic global optimisation tool, the shuffled complex evolution (SCE) algorithm, is adopted for optimising the reservoir operation. The optimisation puts focus on the trade-off between flood control and hydropower generation for the Hoa...

  5. MIMO-Radar Waveform Design for Beampattern Using Particle-Swarm-Optimisation

    KAUST Repository

    Ahmed, Sajid

    2012-07-31

    Multiple input multiple output (MIMO) radars have many advantages over their phased-array counterparts: improved spatial resolution, better parametric identifiability and greater flexibility to achieve the desired transmit beampattern. Achieving the desired transmit beampattern with MIMO radar requires the waveforms to have arbitrary auto- and cross-correlations. To design such waveforms, generally a waveform covariance matrix, R, is synthesised first and then the actual waveforms are designed. Synthesis of the covariance matrix, R, is a constrained optimisation problem, which requires R to be positive semidefinite and all of its diagonal elements to be equal. To simplify the first constraint, the covariance matrix is synthesised indirectly from its square-root matrix U, while for the second constraint the elements of the m-th column of U are parameterised using the coordinates of the m-hypersphere. This implicitly fulfils both of the constraints and enables us to write the cost-function in closed form. Then the cost-function is optimised using a simple particle-swarm-optimisation (PSO) technique, which requires only the cost-function and can optimise any choice of norm cost-function. © 2012 IEEE.
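
    A rough sketch of the synthesis idea in this abstract: each column of the square-root matrix U is generated from hypersphere angles, so that R = U^T U is positive semidefinite with equal diagonal entries by construction, and a small particle swarm then searches the angles to match a desired beampattern. The array size, desired pattern and PSO settings are illustrative choices rather than the paper's configuration, and a real-valued U is assumed for simplicity.

    ```python
    # Sketch of the covariance-synthesis idea in the abstract: build U column by column
    # from hypersphere angles (so R = U.T @ U is PSD with equal diagonal entries), score
    # the resulting transmit beampattern against a desired one, and search the angles
    # with a small particle swarm. Geometry and settings are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    M = 4                                            # antennas, half-wavelength spacing
    theta = np.linspace(-np.pi / 2, np.pi / 2, 181)
    steer = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(theta)))   # M x angles
    desired = (np.abs(theta) < np.deg2rad(20)).astype(float)

    def unit_vector(angles):
        """Point on the unit (M-1)-sphere from M-1 spherical angles."""
        v, s = np.empty(M), 1.0
        for i, a in enumerate(angles):
            v[i] = s * np.cos(a)
            s *= np.sin(a)
        v[-1] = s
        return v

    def covariance(x):
        U = np.column_stack([unit_vector(x[m * (M - 1):(m + 1) * (M - 1)]) for m in range(M)])
        return U.T @ U                               # PSD, unit diagonal by construction

    def cost(x):
        R = covariance(x)
        pattern = np.real(np.einsum("ia,ij,ja->a", steer.conj(), R, steer))
        return float(np.mean((pattern / pattern.mean() - desired / desired.mean()) ** 2))

    # Minimal particle swarm over the M*(M-1) hypersphere angles.
    dim, n_particles = M * (M - 1), 30
    x = rng.uniform(0, np.pi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(200):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        c = np.array([cost(p) for p in x])
        improved = c < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], c[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    print("best beampattern mismatch:", round(float(pbest_cost.min()), 4))
    ```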

  6. Implementing the “Big Data” Concept in Official Statistics

    Directory of Open Access Journals (Sweden)

    О. V.

    2017-02-01

    Full Text Available Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the concept of “Big Data” in the world and its impact on the transformation of statistical modelling of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in the conditions of globalization and brings new forms of economic development in small open economies. Statistical science should take into account such phenomena as the gig economy, the common economy, institutional factors, etc. The concepts of “Big Data” and open data are analyzed, and problems of implementing “Big Data” in official statistics are shown. Ways of implementing “Big Data” in the official statistics of Ukraine through active use of the technological opportunities of mobile operators, navigation systems, surveillance cameras, social networks, etc. are presented. The possibilities of using “Big Data” in different sectors of the economy, as well as at the company level, are shown. The problems of storing large volumes of data are highlighted. The study shows that “Big Data” is a huge resource that should be used across the Ukrainian economy.

  7. Use of artificial intelligence techniques for optimisation of co-combustion of coal with biomass

    Energy Technology Data Exchange (ETDEWEB)

    Tan, C.K.; Wilcox, S.J.; Ward, J. [University of Glamorgan, Pontypridd (United Kingdom). Division of Mechanical Engineering

    2006-03-15

    The optimisation of burner operation in conventional pulverised-coal-fired boilers for co-combustion applications represents a significant challenge. This paper describes a strategic framework in which Artificial Intelligence (AI) techniques can be applied to solve such an optimisation problem. The effectiveness of the proposed system is demonstrated by a case study that simulates the co-combustion of coal with sewage sludge in a 500-kW pilot-scale combustion rig equipped with a swirl-stabilised low-NOx burner. A series of Computational Fluid Dynamics (CFD) simulations were performed to generate data for different operating conditions, which were then used to train several Artificial Neural Networks (ANNs) to predict the co-combustion performance. Once trained, the ANNs were able to make estimations of unseen situations in a fraction of the time taken by the CFD simulation. Consequently, the networks were capable of representing the underlying physics of the CFD models and could be executed efficiently for the large number of iterations required by optimisation techniques based on Evolutionary Algorithms (EAs). Four operating parameters of the burner, namely the swirl angles and flow rates of the secondary and tertiary combustion air, were optimised with the objective of minimising the NOx and CO emissions as well as the unburned carbon at the furnace exit. The results suggest that ANNs combined with EAs provide a useful tool for optimising co-combustion processes.
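
    A compact sketch of the surrogate-plus-evolutionary-search framework outlined above, with a synthetic stand-in for the CFD data: a small neural network is fitted to samples of burner settings versus emissions, and a simple (mu + lambda) evolution strategy then searches the settings on the fast surrogate. The emissions function, parameter ranges and weighting are invented for illustration.

    ```python
    # Sketch of the surrogate-plus-evolutionary-search framework the abstract outlines:
    # an ANN is trained on (here synthetic) CFD-style samples of burner settings vs.
    # emissions, and an evolutionary loop then searches the burner settings on the
    # fast surrogate. Emissions function, ranges and weights are invented.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(42)

    def fake_cfd(x):
        """Stand-in for a CFD run: maps [swirl2, swirl3, flow2, flow3] to [NOx, CO, UBC]."""
        swirl2, swirl3, flow2, flow3 = x.T
        nox = 200 + 80 * np.sin(swirl2) + 0.5 * flow2 - 0.2 * flow3
        co = 50 + 30 * np.cos(swirl3) + 0.1 * (flow2 - flow3) ** 2
        ubc = 5 + 2 * np.abs(np.sin(swirl2 - swirl3)) + 0.02 * (100 - flow2)
        return np.column_stack([nox, co, ubc])

    # "CFD" training set over the four burner parameters.
    X = rng.uniform([0, 0, 50, 20], [1.2, 1.2, 150, 80], size=(400, 4))
    Y = fake_cfd(X)
    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0).fit(X, Y)

    def objective(pop):
        nox, co, ubc = surrogate.predict(pop).T
        return nox + 2.0 * co + 10.0 * ubc          # weighted emissions to minimise

    # Simple (mu + lambda) evolution strategy on the surrogate.
    lo, hi = np.array([0, 0, 50, 20.0]), np.array([1.2, 1.2, 150, 80.0])
    pop = rng.uniform(lo, hi, size=(40, 4))
    for _ in range(60):
        children = np.clip(pop + rng.normal(0, 0.05, pop.shape) * (hi - lo), lo, hi)
        both = np.vstack([pop, children])
        pop = both[np.argsort(objective(both))[:40]]
    print("best burner settings:", pop[0].round(2),
          "predicted score:", round(float(objective(pop[:1])[0]), 1))
    ```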

  8. Design of optimised backstepping controller for the synchronisation of chaotic Colpitts oscillator using shark smell algorithm

    Science.gov (United States)

    Fouladi, Ehsan; Mojallali, Hamed

    2018-01-01

    In this paper, an adaptive backstepping controller has been tuned to synchronise two chaotic Colpitts oscillators in a master-slave configuration. The parameters of the controller are determined using shark smell optimisation (SSO) algorithm. Numerical results are presented and compared with those of particle swarm optimisation (PSO) algorithm. Simulation results show better performance in terms of accuracy and convergence for the proposed optimised method compared to PSO optimised controller or any non-optimised backstepping controller.

  9. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  10. A little big history of Tiananmen

    OpenAIRE

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why people built the gate the way they did can be found. These explanations are useful in their own right and may also be used to deepen our understanding of more traditional explanations of why Tiananmen ...

  11. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. It analyzes the existing definitions of the term “big data”. The article proposes and describes the elements of a generalized formal model of big data, analyzes the peculiarities of applying the proposed model components, and describes the fundamental differences between Big Data technology and business analytics. Big Data is supported by the distributed file system Google File System ...

  12. BigWig and BigBed: enabling browsing of large distributed datasets.

    Science.gov (United States)

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols and Linux and UNIX operating systems files, R trees and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
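
    For readers who want to consume BigWig files programmatically, the snippet below uses the third-party pyBigWig library rather than the UCSC utilities named above; the file name and chromosome naming are placeholders to be replaced with a real track.

    ```python
    # Small example of reading a BigWig file from Python with the pyBigWig library
    # (one of several readers for the format; the browser itself uses the C utilities
    # described above). The file name is a placeholder -- point it at any .bw file.
    import pyBigWig

    bw = pyBigWig.open("signal.bw")                      # remote URLs also work
    print(bw.chroms())                                   # e.g. {'chr1': 248956422, ...}

    # Because BigWig stores pre-computed summaries at several zoom levels, interval
    # statistics come back quickly without reading the whole file.
    print(bw.stats("chr1", 1_000_000, 2_000_000, type="mean"))
    print(bw.stats("chr1", 0, 10_000_000, type="max", nBins=10))

    # Base-resolution values are only fetched for the window actually requested.
    values = bw.values("chr1", 1_000_000, 1_000_100)
    print(len(values), values[:5])

    bw.close()
    ```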

  13. Separative power of an optimised concurrent gas centrifuge

    Energy Technology Data Exchange (ETDEWEB)

    Bogovalov, Sergey; Boman, Vladimir [National Research Nuclear University (MEPHI), Moscow (Russian Federation)

    2016-06-15

    The problem of separation of isotopes in a concurrent gas centrifuge is solved analytically for an arbitrary binary mixture of isotopes. The separative power of the optimised concurrent gas centrifuges for the uranium isotopes equals δU = 12.7 (V/700 m/s)² (300 K/T) (L/1 m) kg·SWU/yr, where L and V are the length and linear velocity of the rotor of the gas centrifuge and T is the temperature. This equation agrees well with the empirically determined separative power of optimised counter-current gas centrifuges.

  14. D-branes in a big bang/big crunch universe: Misner space

    International Nuclear Information System (INIS)

    Hikida, Yasuaki; Nayak, Rashmi R.; Panigrahi, Kamal L.

    2005-01-01

    We study D-branes in a two-dimensional lorentzian orbifold R^{1,1}/Γ with a discrete boost Γ. This space is known as Misner or Milne space, and includes big crunch/big bang singularity. In this space, there are D0-branes in spiral orbits and D1-branes with or without flux on them. In particular, we observe imaginary parts of partition functions, and interpret them as the rates of open string pair creation for D0-branes and emission of winding closed strings for D1-branes. These phenomena occur due to the time-dependence of the background. Open string 2→2 scattering amplitude on a D1-brane is also computed and found to be less singular than closed string case

  15. D-branes in a big bang/big crunch universe: Misner space

    Energy Technology Data Exchange (ETDEWEB)

    Hikida, Yasuaki [Theory Group, High Energy Accelerator Research Organization (KEK), Tsukuba, Ibaraki 305-0801 (Japan); Nayak, Rashmi R. [Dipartimento di Fisica and INFN, Sezione di Roma 2, 'Tor Vergata', Rome 00133 (Italy); Panigrahi, Kamal L. [Dipartimento di Fisica and INFN, Sezione di Roma 2, 'Tor Vergata', Rome 00133 (Italy)

    2005-09-01

    We study D-branes in a two-dimensional lorentzian orbifold R^{1,1}/Γ with a discrete boost Γ. This space is known as Misner or Milne space, and includes big crunch/big bang singularity. In this space, there are D0-branes in spiral orbits and D1-branes with or without flux on them. In particular, we observe imaginary parts of partition functions, and interpret them as the rates of open string pair creation for D0-branes and emission of winding closed strings for D1-branes. These phenomena occur due to the time-dependence of the background. Open string 2→2 scattering amplitude on a D1-brane is also computed and found to be less singular than closed string case.

  16. Reducing passengers’ travel time by optimising stopping patterns in a large-scale network: A case-study in the Copenhagen Region

    DEFF Research Database (Denmark)

    Parbo, Jens; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2018-01-01

    Optimising stopping patterns in railway schedules is a cost-effective way to reduce passengers’ generalised travel costs without increasing train operators’ costs. The challenge consists in striking a balance between an increase in waiting time for passengers at skipped stations and a decrease...... in travel time for through-going passengers, with possible consequent changes in the passenger demand and route choices. This study presents the formulation of the skip-stop problem as a bi-level optimisation problem where the lower level is a schedule-based transit assignment model that delivers passengers...... is a mixed-integer problem, whereas the route choice model is a non-linear non-continuous mapping of the timetable. The method was tested on the suburban railway network in the Greater Copenhagen Region (Denmark): the reduction in railway passengers’ in-vehicle travel time was 5.5%, the reduction...

  17. Research on the Impact of Big Data on Logistics

    Directory of Open Access Journals (Sweden)

    Wang Yaxing

    2017-01-01

    Full Text Available In the context of big data development, large amounts of data accumulate at logistics enterprises, especially in logistics activities such as transportation, warehousing and distribution. Based on an analysis of the characteristics of big data, this paper studies the impact of big data on logistics and its mechanism of action, and gives reasonable suggestions. By building a logistics data centre with big data technology, hidden value behind the data can be uncovered, from which logistics enterprises can benefit.

  18. Big data-driven business how to use big data to win customers, beat competitors, and boost profits

    CERN Document Server

    Glass, Russell

    2014-01-01

    Get the expert perspective and practical advice on big data The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples-from Nate Silver to Copernicus, and Apple to Blackberry-to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehens

  19. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  20. Benchmarks for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...

  1. Optimising resource management in neurorehabilitation.

    Science.gov (United States)

    Wood, Richard M; Griffiths, Jeff D; Williams, Janet E; Brouwers, Jakko

    2014-01-01

    To date, little research has been published regarding the effective and efficient management of resources (beds and staff) in neurorehabilitation, despite being an expensive service in limited supply. To demonstrate how mathematical modelling can be used to optimise service delivery, by way of a case study at a major 21 bed neurorehabilitation unit in the UK. An automated computer program for assigning weekly treatment sessions is developed. Queue modelling is used to construct a mathematical model of the hospital in terms of referral submissions to a waiting list, admission and treatment, and ultimately discharge. This is used to analyse the impact of hypothetical strategic decisions on a variety of performance measures and costs. The project culminates in a hybridised model of these two approaches, since a relationship is found between the number of therapy hours received each week (scheduling output) and length of stay (queuing model input). The introduction of the treatment scheduling program has substantially improved timetable quality (meaning a better and fairer service to patients) and has reduced employee time expended in its creation by approximately six hours each week (freeing up time for clinical work). The queuing model has been used to assess the effect of potential strategies, such as increasing the number of beds or employing more therapists. The use of mathematical modelling has not only optimised resources in the short term, but has allowed the optimality of longer term strategic decisions to be assessed.
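
    The record describes the queue model only in outline. As a rough illustration of the kind of bed-occupancy question it addresses, the sketch below applies the standard Erlang-B formula to ask what fraction of referrals would find every bed occupied; the referral rate, length of stay and bed counts are hypothetical values, not figures from the study.

    ```python
    # Illustrative sketch only: an M/M/c/c (Erlang-B) view of a fixed-bed unit,
    # loosely inspired by the queue-modelling approach described in the abstract.
    # Referral rate, length of stay and bed counts below are hypothetical values.

    def erlang_b(offered_load: float, beds: int) -> float:
        """Probability that an arriving referral finds all beds occupied."""
        b = 1.0
        for k in range(1, beds + 1):
            b = offered_load * b / (k + offered_load * b)
        return b

    if __name__ == "__main__":
        referrals_per_week = 0.8                      # hypothetical arrival rate
        mean_stay_weeks = 20.0                        # hypothetical mean length of stay
        load = referrals_per_week * mean_stay_weeks   # offered load in "beds"
        for beds in (18, 21, 24):
            print(f"{beds} beds: blocking probability {erlang_b(load, beds):.2%}")
    ```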

  2. Discontinuous permeable adsorptive barrier design and cost analysis: a methodological approach to optimisation.

    Science.gov (United States)

    Santonastaso, Giovanni Francesco; Bortone, Immacolata; Chianese, Simeone; Di Nardo, Armando; Di Natale, Michele; Erto, Alessandro; Karatza, Despina; Musmarra, Dino

    2017-09-19

    The following paper presents a method to optimise a discontinuous permeable adsorptive barrier (PAB-D). This method is based on the comparison of different PAB-D configurations obtained by changing some of the main PAB-D design parameters. In particular, the well diameters, the distance between two consecutive passive wells and the distance between two consecutive well lines were varied, and a cost analysis for each configuration was carried out in order to define the best performing and most cost-effective PAB-D configuration. As a case study, a benzene-contaminated aquifer located in an urban area in the north of Naples (Italy) was considered. The PAB-D configuration with a well diameter of 0.8 m proved to be the best layout in terms of performance and cost-effectiveness. Moreover, in order to identify the best configuration for the remediation of the aquifer studied, a comparison with a continuous permeable adsorptive barrier (PAB-C) was added. This comparison showed a 40% reduction of the total remediation costs when the optimised PAB-D was used.

  3. An Empirical Study on Visualizing the Intellectual Structure and Hotspots of Big Data Research from a Sustainable Perspective

    Directory of Open Access Journals (Sweden)

    Feng Hu

    2018-03-01

    Full Text Available Big data has been extensively applied in many fields and is in demand for sustainable development. However, the rapidly growing number of publications and the dynamic nature of research fronts make it difficult to understand the current research situation and the sustainable development directions of big data. In this paper, we conduct a visual bibliometric study of the big data literature from the Web of Science (WoS) between 2002 and 2016, covering 4927 effective journal articles in 1729 journals contributed by 16,404 authors from 4137 institutions. The bibliometric results reveal the current annual publication distribution, journal distribution and co-citation network, institution distribution and collaboration network, author distribution, collaboration network and co-citation network, and research hotspots. The results can help researchers worldwide to understand the panorama of current big data research, to find potential research gaps, and to focus on future sustainable development directions.

  4. Structure and weights optimisation of a modified Elman network emotion classifier using hybrid computational intelligence algorithms: a comparative study

    Science.gov (United States)

    Sheikhan, Mansour; Abbasnezhad Arabi, Mahdi; Gharavian, Davood

    2015-10-01

    Artificial neural networks are efficient models in pattern recognition applications, but their performance depends on employing a suitable structure and connection weights. This study used a hybrid method for obtaining the optimal weight set and architecture of a recurrent neural emotion classifier, based on the gravitational search algorithm (GSA) and its binary version (BGSA), respectively. By considering features of the speech signal related to prosody, voice quality, and spectrum, a rich feature set was constructed. To select more efficient features, a fast feature selection method was employed. The performance of the proposed hybrid GSA-BGSA method was compared with similar hybrid methods based on the particle swarm optimisation (PSO) algorithm and its binary version, PSO and the discrete firefly algorithm, and a hybrid of error back-propagation and genetic algorithm used for optimisation. Experimental tests on the Berlin emotional database demonstrated the superior performance of the proposed method using a lighter network structure.

  5. Big Data Science Education: A Case Study of a Project-Focused Introductory Course

    Science.gov (United States)

    Saltz, Jeffrey; Heckman, Robert

    2015-01-01

    This paper reports on a case study of a project-focused introduction to big data science course. The pedagogy of the course leveraged boundary theory, where students were positioned to be at the boundary between a client's desire to understand their data and the academic class. The results of the case study demonstrate that using live clients…

  6. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  7. Optimal design and operation of a photovoltaic-electrolyser system using particle swarm optimisation

    Science.gov (United States)

    Sayedin, Farid; Maroufmashat, Azadeh; Roshandel, Ramin; Khavas, Sourena Sattari

    2016-07-01

    In this study, hydrogen generation is maximised by optimising the size and the operating conditions of an electrolyser (EL) directly connected to a photovoltaic (PV) module at different irradiances. Due to the variations of the maximum power points of the PV module during a year and the complexity of the system, a nonlinear approach is considered. A mathematical model has been developed to determine the performance of the PV/EL system. The optimisation methodology presented here is based on the particle swarm optimisation algorithm. By this method, for the given number of PV modules, the optimal size and operating condition of a PV/EL system are achieved. The approach can be applied for different sizes of PV systems, various ambient temperatures and different locations with various climatic conditions. The results show that for the given location and PV system, the energy transfer efficiency of the PV/EL system can reach up to 97.83%.
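
    The abstract does not give the particle swarm formulation itself, so the sketch below shows a generic particle swarm optimiser maximising a toy one-dimensional objective standing in for the PV/EL energy-transfer efficiency; the objective function, bounds and swarm parameters are all illustrative assumptions.

    ```python
    # Generic particle swarm optimisation sketch (not the authors' implementation).
    # The objective below is a toy stand-in for PV/EL energy-transfer efficiency.
    import numpy as np

    rng = np.random.default_rng(0)

    def efficiency(x):
        """Hypothetical smooth objective with a single maximum near x = 3.2."""
        return np.exp(-0.5 * (x - 3.2) ** 2)

    def pso(obj, lo, hi, n_particles=20, n_iter=100, w=0.7, c1=1.5, c2=1.5):
        x = rng.uniform(lo, hi, n_particles)          # positions
        v = np.zeros(n_particles)                     # velocities
        pbest_x, pbest_f = x.copy(), obj(x)           # personal bests
        gbest_x = pbest_x[np.argmax(pbest_f)]         # global best
        for _ in range(n_iter):
            r1, r2 = rng.random(n_particles), rng.random(n_particles)
            v = w * v + c1 * r1 * (pbest_x - x) + c2 * r2 * (gbest_x - x)
            x = np.clip(x + v, lo, hi)
            f = obj(x)
            improved = f > pbest_f
            pbest_x[improved], pbest_f[improved] = x[improved], f[improved]
            gbest_x = pbest_x[np.argmax(pbest_f)]
        return gbest_x, obj(gbest_x)

    best_x, best_f = pso(efficiency, lo=0.0, hi=10.0)
    print(f"best decision variable {best_x:.3f}, objective {best_f:.3f}")
    ```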

  8. Smart Information Management in Health Big Data.

    Science.gov (United States)

    Muteba A, Eustache

    2017-01-01

    The smart information management system (SIMS) is concerned with the organization of anonymous patient records in big data and their extraction in order to provide the needed real-time intelligence. The purpose of the present study is to highlight the design and the implementation of the smart information management system. We emphasize, on the one hand, the organization of big data in flat files simulating a NoSQL database and, on the other hand, the extraction of information based on a lookup table and a cache mechanism. The SIMS in health big data aims at the identification of new therapies and approaches to delivering care.

  9. Visual grading characteristics and ordinal regression analysis during optimisation of CT head examinations.

    Science.gov (United States)

    Zarb, Francis; McEntee, Mark F; Rainford, Louise

    2015-06-01

    To evaluate visual grading characteristics (VGC) and ordinal regression analysis during head CT optimisation as a potential alternative to visual grading assessment (VGA), traditionally employed to score anatomical visualisation. Patient images (n = 66) were obtained using current and optimised imaging protocols from two CT suites: a 16-slice scanner at the national Maltese centre for trauma and a 64-slice scanner in a private centre. Local resident radiologists (n = 6) performed VGA followed by VGC and ordinal regression analysis. VGC alone indicated that optimised protocols had image quality similar to current protocols. Ordinal logistic regression analysis provided an in-depth evaluation, criterion by criterion, allowing the selective implementation of the protocols. The local radiology review panel supported the implementation of optimised protocols for brain CT examinations (including trauma) in one centre, achieving radiation dose reductions ranging from 24 % to 36 %. In the second centre a 29 % reduction in radiation dose was achieved for follow-up cases. The combined use of VGC and ordinal logistic regression analysis led to clinical decisions being taken on the implementation of the optimised protocols. This improved method of image quality analysis provided the evidence to support imaging protocol optimisation, resulting in significant radiation dose savings. • There is a need for scientifically based image quality evaluation during CT optimisation. • VGC and ordinal regression analysis in combination led to better informed clinical decisions. • VGC and ordinal regression analysis led to dose reductions without compromising diagnostic efficacy.
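
    The record names ordinal logistic regression but gives no model details. A minimal sketch of fitting such a model to visual-grading scores is shown below using statsmodels' OrderedModel; the data are randomly generated placeholders and the variable names (score, protocol) are assumptions, not the study's dataset.

    ```python
    # Minimal ordinal (proportional-odds) logistic regression sketch with
    # synthetic visual-grading data; not the study's actual model or dataset.
    import numpy as np
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    rng = np.random.default_rng(1)
    n = 120
    protocol = rng.integers(0, 2, n)                   # 0 = current, 1 = optimised
    latent = 0.3 * protocol + rng.logistic(size=n)     # hidden quality tendency
    score = np.digitize(latent, [-1.0, 0.0, 1.0]) + 1  # ordinal grades 1..4

    df = pd.DataFrame({"score": score, "protocol": protocol})

    model = OrderedModel(df["score"], df[["protocol"]], distr="logit")
    result = model.fit(method="bfgs", disp=False)
    print(result.summary())   # the protocol coefficient estimates the grading shift
    ```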

  10. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  11. Cholesterol, Cholesterol-Lowering Medication Use, and Breast Cancer Outcome in the BIG 1-98 Study

    DEFF Research Database (Denmark)

    Borgquist, Signe; Giobbie-Hurder, Anita; Ahern, Thomas P

    2017-01-01

    on cholesterol levels and hypercholesterolemia per se may counteract the intended effect of aromatase inhibitors. Patients and Methods The Breast International Group (BIG) conducted a randomized, phase III, double-blind trial, BIG 1-98, which enrolled 8,010 postmenopausal women with early-stage, hormone receptor......-positive invasive breast cancer from 1998 to 2003. Systemic levels of total cholesterol and use of CLM were measured at study entry and every 6 months up to 5.5 years. Cumulative incidence functions were used to describe the initiation of CLM in the presence of competing risks. Marginal structural Cox proportional...

  12. Big Data and HPC collocation: Using HPC idle resources for Big Data Analytics

    OpenAIRE

    MERCIER , Michael; Glesser , David; Georgiou , Yiannis; Richard , Olivier

    2017-01-01

    International audience; Executing Big Data workloads upon High Performance Computing (HPC) infrastructures has become an attractive way to improve their performance. However, the collocation of HPC and Big Data workloads is not an easy task, mainly because of their core concepts' differences. This paper focuses on the challenges related to the scheduling of both Big Data and HPC workloads on the same computing platform. In classic HPC workloads, the rigidity of jobs tends to create holes in ...

  13. Optimisation and symmetry in experimental radiation physics

    International Nuclear Information System (INIS)

    Ghose, A.

    1988-01-01

    The present monograph is concerned with the optimisation of geometric factors in radiation physics experiments. The discussions are essentially confined to those systems in which optimisation is equivalent to symmetrical configurations of the measurement systems. They include measurements of interaction cross sections of diverse types, determination of polarisations, development of detectors with almost ideal characteristics, production of radiations with continuously variable energies and development of high-efficiency spectrometers, etc. The monograph is intended for use by experimental physicists investigating primary interactions of radiations with matter and associated technologies. We have illustrated the various optimisation procedures by considering the cases of the so-called '14 MeV' or d-t neutrons and gamma rays with energies less than 3 MeV. Developments in fusion technology are critically dependent on the availability of accurate cross sections of nuclei for fast neutrons of energies at least as high as d-t neutrons. In this monograph we have discussed various techniques which can be used to improve the accuracy of such measurements and have also presented a method for generating almost monoenergetic neutrons in the 8 MeV to 13 MeV energy range which can be used to measure cross sections in this sparingly investigated region.

  14. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects...... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact....

  15. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today, Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge, the value of the information. The information value is not only defined by the value extraction from huge data sets, as fast and optimal as possible, but also by the value extraction from uncertain and inaccurate data, in an innovative manner using Big data analytics. At this point, the main challenge of the businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that the real value can be gained. This article aims to explain the Big data concept, its various classification criteria, its architecture, as well as its impact on processes worldwide.

  16. Credit price optimisation within retail banking

    African Journals Online (AJOL)

    2014-02-14

    Feb 14, 2014 ... cost based pricing, where the price of a product or service is based on the .... function obtained from fitting a logistic regression model .... Note that the proposed optimisation approach below will allow us to also incorporate.

  17. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  18. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    Science.gov (United States)

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or will solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is not "lots of data" as a phenomenon anymore; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers and science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking in problem defining to address science challenges.

  19. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  20. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  1. PNGMDR 2013-2015. 2015 report by the 'optimisation of the distribution of wastes among management sectors' work-group

    International Nuclear Information System (INIS)

    2015-01-01

    PNGMDR is the French national plan for the management of radioactive materials and wastes. After a discussion of the stakes associated with the optimisation of the different sectors involved in radioactive waste management (production, sorting, processing, packaging, warehousing, storage), this report describes the wastes for which a sector optimisation study has been performed (graphite wastes, Marcoule bitumen packages, common solid residues, cemented solid wastes in concrete containers from Areva La Hague) and the optimisation routes and alternative scenarios studied for these different wastes. It analyses and discusses the progress of these optimisation studies: improvement of the characterisation of graphite and bituminous wastes, studies and works on the storage of low-level long-lived wastes, and studies on the processing of graphite wastes, bituminous packages and common solid residues. Other issues are addressed: the case of wastes produced by dismantling operations, and a technical-economic approach. Perspectives and future work are discussed for the different wastes considered.

  2. Sampling design optimisation for rainfall prediction using a non-stationary geostatistical model

    Science.gov (United States)

    Wadoux, Alexandre M. J.-C.; Brus, Dick J.; Rico-Ramirez, Miguel A.; Heuvelink, Gerard B. M.

    2017-09-01

    The accuracy of spatial predictions of rainfall by merging rain-gauge and radar data is partly determined by the sampling design of the rain-gauge network. Optimising the locations of the rain-gauges may increase the accuracy of the predictions. Existing spatial sampling design optimisation methods are based on minimisation of the spatially averaged prediction error variance under the assumption of intrinsic stationarity. Over the past years, substantial progress has been made to deal with non-stationary spatial processes in kriging. Various well-documented geostatistical models relax the assumption of stationarity in the mean, while recent studies show the importance of considering non-stationarity in the variance for environmental processes occurring in complex landscapes. We optimised the sampling locations of rain-gauges using an extension of the Kriging with External Drift (KED) model for prediction of rainfall fields. The model incorporates both non-stationarity in the mean and in the variance, which are modelled as functions of external covariates such as radar imagery, distance to radar station and radar beam blockage. Spatial predictions are made repeatedly over time, each time recalibrating the model. The space-time averaged KED variance was minimised by Spatial Simulated Annealing (SSA). The methodology was tested using a case study predicting daily rainfall in the north of England for a one-year period. Results show that (i) the proposed non-stationary variance model outperforms the stationary variance model, and (ii) a small but significant decrease of the rainfall prediction error variance is obtained with the optimised rain-gauge network. In particular, it pays off to place rain-gauges at locations where the radar imagery is inaccurate, while keeping the distribution over the study area sufficiently uniform.
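
    The record describes the optimisation loop (minimise the space-time averaged KED variance by spatial simulated annealing) but not its code. Below is a generic spatial simulated annealing sketch that perturbs one gauge location at a time and accepts worse designs with a temperature-dependent probability; the objective is a simple geometric spread criterion standing in for the KED variance, and all coordinates and parameters are invented for illustration.

    ```python
    # Generic spatial simulated annealing (SSA) sketch for sensor placement.
    # The objective is a toy spread criterion, not the kriging-with-external-drift
    # variance used in the study; all coordinates and parameters are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)

    def objective(gauges):
        """Toy criterion: mean distance from a fine grid to its nearest gauge."""
        gx, gy = np.meshgrid(np.linspace(0, 1, 30), np.linspace(0, 1, 30))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        d = np.linalg.norm(grid[:, None, :] - gauges[None, :, :], axis=2)
        return d.min(axis=1).mean()

    def ssa(n_gauges=15, n_iter=1500, t0=0.05, cooling=0.999, step=0.1):
        cur = rng.random((n_gauges, 2))
        cur_f = objective(cur)
        best, best_f, t = cur.copy(), cur_f, t0
        for _ in range(n_iter):
            cand = cur.copy()
            i = rng.integers(n_gauges)                       # move one gauge
            cand[i] = np.clip(cand[i] + rng.normal(0, step, 2), 0, 1)
            cand_f = objective(cand)
            if cand_f < cur_f or rng.random() < np.exp((cur_f - cand_f) / t):
                cur, cur_f = cand, cand_f
                if cur_f < best_f:
                    best, best_f = cur.copy(), cur_f
            t *= cooling
        return best, best_f

    locations, score = ssa()
    print(f"optimised mean nearest-gauge distance: {score:.4f}")
    ```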

  3. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm changes in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
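
    One of the pitfalls listed above, spurious correlation, is easy to reproduce: when there are far more features than samples, some purely random feature will correlate strongly with an independent response. The sketch below illustrates this with synthetic numbers chosen for illustration only.

    ```python
    # Spurious correlation demo: with far more random features than samples,
    # the best-correlated feature looks impressive despite all data being noise.
    # Sample and feature counts below are arbitrary illustrative choices.
    import numpy as np

    rng = np.random.default_rng(3)
    n_samples, n_features = 60, 5000

    X = rng.standard_normal((n_samples, n_features))
    y = rng.standard_normal(n_samples)          # independent of every column of X

    # Pearson correlation of each feature with y.
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    yc = (y - y.mean()) / y.std()
    corr = Xc.T @ yc / n_samples

    print(f"max |correlation| among {n_features} pure-noise features: "
          f"{np.abs(corr).max():.2f}")
    ```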

  4. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  5. Energy efficiency optimisation for distillation column using artificial neural network models

    International Nuclear Information System (INIS)

    Osuolale, Funmilayo N.; Zhang, Jie

    2016-01-01

    This paper presents a neural network based strategy for the modelling and optimisation of energy efficiency in distillation columns incorporating the second law of thermodynamics. Real-time optimisation of distillation columns based on mechanistic models is often infeasible due to the effort in model development and the large computation effort associated with mechanistic model computation. This issue can be addressed by using neural network models which can be quickly developed from process operation data. The computation time in neural network model evaluation is very short making them ideal for real-time optimisation. Bootstrap aggregated neural networks are used in this study for enhanced model accuracy and reliability. Aspen HYSYS is used for the simulation of the distillation systems. Neural network models for exergy efficiency and product compositions are developed from simulated process operation data and are used to maximise exergy efficiency while satisfying products qualities constraints. Applications to binary systems of methanol-water and benzene-toluene separations culminate in a reduction of utility consumption of 8.2% and 28.2% respectively. Application to multi-component separation columns also demonstrate the effectiveness of the proposed method with a 32.4% improvement in the exergy efficiency. - Highlights: • Neural networks can accurately model exergy efficiency in distillation columns. • Bootstrap aggregated neural network offers improved model prediction accuracy. • Improved exergy efficiency is obtained through model based optimisation. • Reductions of utility consumption by 8.2% and 28.2% were achieved for binary systems. • The exergy efficiency for multi-component distillation is increased by 32.4%.
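
    The bootstrap aggregated neural network idea mentioned above can be sketched with scikit-learn by bagging several multilayer perceptrons and averaging their predictions; the synthetic target and all hyperparameters below are assumptions for illustration, not the authors' Aspen HYSYS-derived models.

    ```python
    # Sketch of bootstrap aggregated neural networks (bagged MLPs) for regression.
    # The data are synthetic stand-ins for process operation data; hyperparameters
    # are illustrative and not those used in the study.
    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    X = rng.uniform(0, 1, size=(500, 4))                 # e.g. reflux, duty, feed...
    y = 0.6 * X[:, 0] - 0.3 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(500)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    model = BaggingRegressor(
        MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0),
        n_estimators=20,           # number of bootstrap replications
        random_state=0,
    )
    model.fit(X_tr, y_tr)
    print(f"held-out R^2 of the bagged neural network: {model.score(X_te, y_te):.3f}")
    ```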

  6. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    , modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing......, and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications....

  7. Big Five personality group differences across academic majors

    DEFF Research Database (Denmark)

    Vedel, Anna

    2016-01-01

    During the past decades, a number of studies have explored personality group differences in the Big Five personality traits among students in different academic majors. To date, though, this research has not been reviewed systematically. This was the aim of the present review. A systematic...... literature search identified twelve eligible studies yielding an aggregated sample size of 13,389. Eleven studies reported significant group differences in one or multiple Big Five personality traits. Consistent findings across studies were that students of arts/humanities and psychology scored high...... on Conscientiousness. Effect sizes were calculated to estimate the magnitude of the personality group differences. These effect sizes were consistent across studies comparing similar pairs of academic majors. For all Big Five personality traits medium effect sizes were found frequently, and for Openness even large...

  8. Big data science: A literature review of nursing research exemplars.

    Science.gov (United States)

    Westra, Bonnie L; Sylvia, Martha; Weinfurter, Elizabeth F; Pruinelli, Lisiane; Park, Jung In; Dodd, Dianna; Keenan, Gail M; Senk, Patricia; Richesson, Rachel L; Baukner, Vicki; Cruz, Christopher; Gao, Grace; Whittenburg, Luann; Delaney, Connie W

    Big data and cutting-edge analytic methods in nursing research challenge nurse scientists to extend the data sources and analytic methods used for discovering and translating knowledge. The purpose of this study was to identify, analyze, and synthesize exemplars of big data nursing research applied to practice and disseminated in key nursing informatics, general biomedical informatics, and nursing research journals. A literature review of studies published between 2009 and 2015. There were 650 journal articles identified in 17 key nursing informatics, general biomedical informatics, and nursing research journals in the Web of Science database. After screening for inclusion and exclusion criteria, 17 studies published in 18 articles were identified as big data nursing research applied to practice. Nurses clearly are beginning to conduct big data research applied to practice. These studies represent multiple data sources and settings. Although numerous analytic methods were used, the fundamental issue remains to define the types of analyses consistent with big data analytic methods. There are needs to increase the visibility of big data and data science research conducted by nurse scientists, further examine the use of state of the science in data analytics, and continue to expand the availability and use of a variety of scientific, governmental, and industry data resources. A major implication of this literature review is whether nursing faculty and preparation of future scientists (PhD programs) are prepared for big data and data science. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  10. A knowledge representation model for the optimisation of electricity generation mixes

    International Nuclear Information System (INIS)

    Chee Tahir, Aidid; Bañares-Alcántara, René

    2012-01-01

    Highlights: ► Prototype energy model which uses semantic representation (ontologies). ► Model accepts both quantitative and qualitative based energy policy goals. ► Uses logic inference to formulate equations for linear optimisation. ► Proposes electricity generation mix based on energy policy goals. -- Abstract: Energy models such as MARKAL, MESSAGE and DNE-21 are optimisation tools which aid in the formulation of energy policies. The strength of these models lies in their solid theoretical foundations built on rigorous mathematical equations designed to process numerical (quantitative) data related to economics and the environment. Nevertheless, a complete consideration of energy policy issues also requires the consideration of the political and social aspects of energy. These political and social issues are often associated with non-numerical (qualitative) information. To enable the evaluation of these aspects in a computer model, we hypothesise that a different approach to energy model optimisation design is required. A prototype energy model that is based on a semantic representation using ontologies and is integrated to engineering models implemented in Java has been developed. The model provides both quantitative and qualitative evaluation capabilities through the use of logical inference. The semantic representation of energy policy goals is used (i) to translate a set of energy policy goals into a set of logic queries which is then used to determine the preferred electricity generation mix and (ii) to assist in the formulation of a set of equations which is then solved in order to obtain a proposed electricity generation mix. Scenario case studies have been developed and tested on the prototype energy model to determine its capabilities. Knowledge queries were made on the semantic representation to determine an electricity generation mix which fulfilled a set of energy policy goals (e.g. CO2 emissions reduction, water conservation, energy supply

  11. Optimisation of Oil Production in Two – Phase Flow Reservoir Using Simultaneous Method and Interior Point Optimiser

    DEFF Research Database (Denmark)

    Lerch, Dariusz Michal; Völcker, Carsten; Capolei, Andrea

    2012-01-01

    in the reservoir. A promising decrease of these remained resources can be provided by smart wells applying water injections to sustain satisfactory pressure level in the reservoir throughout the whole process of oil production. Basically to enhance secondary recovery of the remaining oil after drilling, water...... is injected at the injection wells of the down-hole pipes. This sustains the pressure in the reservoir and drives oil towards production wells. There are however, many factors contributing to the poor conventional secondary recovery methods e.g. strong surface tension, heterogeneity of the porous rock...... fields, or closed loop optimisation, can be used for optimising the reservoir performance in terms of net present value of oil recovery or another economic objective. In order to solve an optimal control problem we use a direct collocation method where we translate a continuous problem into a discrete...

  12. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  13. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  14. Alternatives for optimisation of rumen fermentation in ruminants

    Directory of Open Access Journals (Sweden)

    T. Slavov

    2017-06-01

    Full Text Available Abstract. Proper knowledge of the variety of events occurring in the rumen makes possible their optimisation with respect to complete feed conversion and increasing the productive performance of ruminants. The inclusion of various dietary additives (supplements, biologically active substances, nutritional antibiotics, probiotics, enzymatic preparations, plant extracts, etc.) has an effect on the intensity and specific pathway of fermentation, and thus, on the general digestion and systemic metabolism. The optimisation of rumen digestion is a method with substantial potential for improving the efficiency of ruminant husbandry, increasing the quality of their produce and maintaining health.

  15. Optimising a shaft's geometry by applying genetic algorithms

    Directory of Open Access Journals (Sweden)

    María Alejandra Guzmán

    2005-05-01

    Full Text Available Many engineering design tasks involve optimising several conflicting goals; these types of problem are known as Multiobjective Optimisation Problems (MOPs). Evolutionary techniques have proved to be an effective tool for finding solutions to these MOPs during the last decade, and variations on the basic genetic algorithm have been proposed by different researchers for finding optimal solutions to MOPs rapidly. The NSGA (Non-dominated Sorting Genetic Algorithm) has been implemented in this paper for finding an optimal design for a shaft subjected to cyclic loads, the conflicting goals being minimum weight and minimum lateral deflection.
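
    The core of NSGA-style methods is identifying the non-dominated (Pareto) set among candidate designs. The following sketch extracts the Pareto front from randomly generated (weight, deflection) pairs; the candidate values are invented placeholders, not shaft designs from the paper.

    ```python
    # Sketch: extract the Pareto front (non-dominated set) for two objectives that
    # are both minimised, as in the weight-vs-deflection trade-off of the paper.
    # Candidate designs below are random placeholders, not real shaft geometries.
    import numpy as np

    rng = np.random.default_rng(5)
    weight = rng.uniform(5.0, 20.0, 200)        # kg, hypothetical
    deflection = rng.uniform(0.1, 2.0, 200)     # mm, hypothetical
    objs = np.column_stack([weight, deflection])

    def pareto_front(points):
        """Return a boolean mask of non-dominated points (minimisation)."""
        n = len(points)
        mask = np.ones(n, dtype=bool)
        for i in range(n):
            if mask[i]:
                # Points weakly worse in all objectives and strictly worse in one
                # are dominated by point i and can be discarded.
                dominated = (np.all(points >= points[i], axis=1)
                             & np.any(points > points[i], axis=1))
                mask &= ~dominated
        return mask

    front = objs[pareto_front(objs)]
    print(f"{len(front)} non-dominated designs out of {len(objs)} candidates")
    ```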

  16. Optimisation of process parameters in friction stir welding based on residual stress analysis: a feasibility study

    DEFF Research Database (Denmark)

    Tutum, Cem Celal; Hattel, Jesper Henri

    2010-01-01

    The present paper considers the optimisation of process parameters in friction stir welding (FSW). More specifically, the choices of rotational speed and traverse welding speed have been investigated using genetic algorithms. The welding process is simulated in a transient, two......-dimensional sequentially coupled thermomechanical model in ANSYS. This model is then used in an optimisation case where the two objectives are the minimisation of the peak residual stresses and the maximisation of the welding speed. The results indicate that the objectives for the considered case are conflicting......, and this is presented as a Pareto optimal front. Moreover, a higher welding speed for a fixed rotational speed results, in general, in slightly higher stress levels in the tension zone, whereas a higher rotational speed for a fixed welding speed yields somewhat lower peak residual stress, however, a wider tension zone...

  17. How to Use TCM Informatics to Study Traditional Chinese Medicine in Big Data Age.

    Science.gov (United States)

    Shi, Cheng; Gong, Qing-Yue; Zhou, Jinhai

    2017-01-01

    This paper introduces the characteristics and complexity of traditional Chinese medicine (TCM) data, considers that modern big data processing technology has brought new opportunities for the research of TCM, and gives some ideas and methods to apply big data technology in TCM.

  18. Urban Big Data and the Development of City Intelligence

    Directory of Open Access Journals (Sweden)

    Yunhe Pan

    2016-06-01

    Full Text Available This study provides a definition of urban big data while exploring its features and its applications in China's city intelligence. The differences between city intelligence in China and the "smart city" concept in other countries are compared to highlight the unique definition and model of China's city intelligence presented in this paper. Furthermore, this paper examines the role of urban big data in city intelligence by showing that it not only serves as the cornerstone of this trend but also plays a core role in the diffusion of city intelligence technology and acts as an inexhaustible resource for the sustained development of city intelligence. This study also points out the challenges of shaping and developing China's urban big data. Considering the supporting and core role that urban big data plays in city intelligence, the study then expounds on the key points of urban big data, including infrastructure support, urban governance, public services, and economic and industrial development. Finally, this study points to the utility of city intelligence as an ideal policy tool for advancing the goals of China's urban development. In conclusion, it is imperative that China make full use of its unique advantages (including the nation's current state of development and resources, its geographical advantages, and good human relations) in both subjective and objective conditions to promote the development of city intelligence through the proper application of urban big data.

  19. A reliability-based maintenance technicians' workloads optimisation model with stochastic consideration

    Science.gov (United States)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2016-06-01

    The growing interest in technicians' workloads research is probably associated with the recent surge in competition. This was prompted by unprecedented technological development that triggers changes in customer tastes and preferences for industrial goods. In a quest for business improvement, this worldwide intense competition in industries has stimulated theories and practical frameworks that seek to optimise performance in workplaces. In line with this drive, the present paper proposes an optimisation model which considers technicians' reliability and complements the factory information obtained. The information used emerged from technicians' productivity and earned values within a multi-objective modelling approach. Since technicians are expected to carry out routine and stochastic maintenance work, we consider these workloads as constraints. The influence of training, fatigue and experiential knowledge of technicians on workload management was considered. These workloads were combined with maintenance policy in optimising reliability, productivity and earned values using the goal programming approach. Practical datasets were utilised in studying the applicability of the proposed model in practice. It was observed that our model was able to generate information that practising maintenance engineers can apply in making more informed decisions on technicians' management.
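
    Goal programming, as referenced above, can be written as a linear programme that minimises weighted deviations from target levels. The sketch below solves a toy two-goal instance with SciPy; the targets, coefficients and weights are invented for illustration and bear no relation to the study's datasets.

    ```python
    # Toy goal-programming sketch solved as a linear programme with SciPy.
    # Decision variables: weekly hours x1, x2 assigned to two technicians.
    # Goals (targets) and weights below are illustrative assumptions only.
    import numpy as np
    from scipy.optimize import linprog

    # Variables: [x1, x2, d1_minus, d1_plus, d2_minus, d2_plus]
    # Goal 1: x1 + x2         + d1_minus - d1_plus = 70  (total workload target)
    # Goal 2: 0.9*x1 + 0.8*x2 + d2_minus - d2_plus = 60  (earned-value target)
    A_eq = np.array([
        [1.0, 1.0, 1.0, -1.0, 0.0,  0.0],
        [0.9, 0.8, 0.0,  0.0, 1.0, -1.0],
    ])
    b_eq = np.array([70.0, 60.0])

    # Minimise the weighted under/over-achievement of the goals.
    c = np.array([0.0, 0.0, 2.0, 1.0, 3.0, 1.0])

    bounds = [(0, 40), (0, 40), (0, None), (0, None), (0, None), (0, None)]

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    x1, x2 = res.x[:2]
    print(f"assigned hours: technician 1 = {x1:.1f}, technician 2 = {x2:.1f}")
    ```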

  20. Coil optimisation for transcranial magnetic stimulation in realistic head geometry.

    Science.gov (United States)

    Koponen, Lari M; Nieminen, Jaakko O; Mutanen, Tuomas P; Stenroos, Matti; Ilmoniemi, Risto J

    Transcranial magnetic stimulation (TMS) allows focal, non-invasive stimulation of the cortex. A TMS pulse is inherently weakly coupled to the cortex; thus, magnetic stimulation requires both high current and high voltage to reach sufficient intensity. These requirements limit, for example, the maximum repetition rate and the maximum number of consecutive pulses with the same coil due to the rise of its temperature. To develop methods to optimise, design, and manufacture energy-efficient TMS coils in realistic head geometry with an arbitrary overall coil shape. We derive a semi-analytical integration scheme for computing the magnetic field energy of an arbitrary surface current distribution, compute the electric field induced by this distribution with a boundary element method, and optimise a TMS coil for focal stimulation. Additionally, we introduce a method for manufacturing such a coil by using Litz wire and a coil former machined from polyvinyl chloride. We designed, manufactured, and validated an optimised TMS coil and applied it to brain stimulation. Our simulations indicate that this coil requires less than half the power of a commercial figure-of-eight coil, with a 41% reduction due to the optimised winding geometry and a partial contribution due to our thinner coil former and reduced conductor height. With the optimised coil, the resting motor threshold of abductor pollicis brevis was reached with the capacitor voltage below 600 V and peak current below 3000 A. The described method allows designing practical TMS coils that have considerably higher efficiency than conventional figure-of-eight coils. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  2. On Study of Application of Big Data and Cloud Computing Technology in Smart Campus

    Science.gov (United States)

    Tang, Zijiao

    2017-12-01

    We live in an era of networks and information, which means that we produce and face large amounts of data every day. However, it is not easy for traditional databases to store, process and analyse such mass data, which is why big data was born. Meanwhile, the development and operation of big data rest on cloud computing, which provides the space and resources needed to process and analyse the data handled by big data technology. Nowadays, the proposal of smart campus construction aims at improving the informatisation process in colleges and universities. It is therefore necessary to consider combining big data technology and cloud computing technology in the construction of the smart campus, so that the campus database system and the campus management system are mutually integrated rather than isolated, and so that smart campus construction is supported by integrating, storing, processing and analysing mass data.

  3. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. The methodology and working of a system that will use this data are also briefly described.

  4. Reduction environmental effects of civil aircraft through multi-objective flight plan optimisation

    International Nuclear Information System (INIS)

    Lee, D S; Gonzalez, L F; Walker, R; Periaux, J; Onate, E

    2010-01-01

    With rising environmental concern, the reduction of critical aircraft emissions, including carbon dioxide (CO2) and nitrogen oxides (NOx), is one of the most important aeronautical problems. There are many possible ways to address the problem, such as designing new wing/aircraft shapes or new, more efficient engines. This paper instead provides a set of acceptable flight plans as a first step, short of replacing current aircraft. The paper investigates green aircraft design optimisation in terms of aircraft range, mission fuel weight (CO2) and NOx using advanced Evolutionary Algorithms coupled to flight optimisation system software. Two multi-objective design optimisations are conducted to find the best set of flight plans for current aircraft, considering discretised altitudes and Mach numbers without redesigning aircraft shape or engine type. The objectives of the first optimisation are to maximise aircraft range while minimising NOx at constant mission fuel weight. The second optimisation considers minimisation of mission fuel weight and NOx with fixed aircraft range. Numerical results show that the method is able to capture a set of useful trade-offs that reduce NOx and CO2 (minimum mission fuel weight).

  5. Topology optimisation of passive coolers for light-emitting diode lamps

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    2015-01-01

    This work applies topology optimisation to the design of passive coolers for light-emitting diode (LED) lamps. The heat sinks are cooled by the natural convection currents arising from the temperature difference between the LED lamp and the surrounding air. A large scale parallel computational....... The optimisation results show interesting features that are currently being incorporated into industrial designs for enhanced passive cooling abilities....

  6. Optimisation of parameters for co-precipitation of uranium and plutonium - results of simulation studies

    International Nuclear Information System (INIS)

    Pandey, N.K.; Velvandan, P.V.; Murugesan, S.; Ahmed, M.K.; Koganti, S.B.

    1999-01-01

    Preparation of plutonium oxide from plutonium nitrate solution generally proceeds via the oxalate precipitation route. In a nuclear fuel reprocessing scheme this step succeeds the partitioning step (separation of uranium and plutonium). Results of the present studies confirm that it is possible to avoid the partitioning step and recover plutonium and uranium as a co-precipitated product. This also helps in minimising the risk of proliferation of fissile material. In this procedure, the solubility of uranium oxalate in nitric acid is effectively used. Co-precipitation parameters are optimised with simulated solutions of uranium nitrate and thorium nitrate (in place of plutonium). On the basis of the results obtained, a reconversion flow-sheet is designed and reported here. (author)

  7. A study of lateral fall-off (penumbra) optimisation for pencil beam scanning (PBS) proton therapy

    Science.gov (United States)

    Winterhalter, C.; Lomax, A.; Oxley, D.; Weber, D. C.; Safai, S.

    2018-01-01

    The lateral fall-off is crucial for sparing organs at risk in proton therapy. It is therefore of high importance to minimize the penumbra for pencil beam scanning (PBS). Three optimisation approaches are investigated: edge-collimated uniformly weighted spots (collimation), pencil beam optimisation of uncollimated pencil beams (edge-enhancement) and the optimisation of edge collimated pencil beams (collimated edge-enhancement). To deliver energies below 70 MeV, these strategies are evaluated in combination with the following pre-absorber methods: field specific fixed thickness pre-absorption (fixed), range specific, fixed thickness pre-absorption (automatic) and range specific, variable thickness pre-absorption (variable). All techniques are evaluated by Monte Carlo simulated square fields in a water tank. For a typical air gap of 10 cm, without pre-absorber collimation reduces the penumbra only for water equivalent ranges between 4-11 cm by up to 2.2 mm. The sharpest lateral fall-off is achieved through collimated edge-enhancement, which lowers the penumbra down to 2.8 mm. When using a pre-absorber, the sharpest fall-offs are obtained when combining collimated edge-enhancement with a variable pre-absorber. For edge-enhancement and large air gaps, it is crucial to minimize the amount of material in the beam. For small air gaps however, the superior phase space of higher energetic beams can be employed when more material is used. In conclusion, collimated edge-enhancement combined with the variable pre-absorber is the recommended setting to minimize the lateral penumbra for PBS. Without collimator, it would be favourable to use a variable pre-absorber for large air gaps and an automatic pre-absorber for small air gaps.

  8. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.

  9. Standardised approach to optimisation

    International Nuclear Information System (INIS)

    Warren-Forward, Helen M.; Beckhaus, Ronald

    2004-01-01

    Optimisation of radiographic images is said to have been obtained if the patient has received an acceptable level of dose and the image is of diagnostic value. In the near future, it will probably be recommended that radiographers measure patient doses and compare them to reference levels. The aim of this paper is to describe a standardised approach to optimisation of radiographic examinations in a diagnostic imaging department. A three-step approach is outlined with specific examples for some common examinations (chest, abdomen, pelvis and lumbar spine series). Step One: Patient doses are calculated. Step Two: Doses are compared to existing reference levels and the technique used is compared to image quality criteria. Step Three: Appropriate action is taken if doses are above the reference level. Results: Average entrance surface doses for two rooms were as follows: AP Abdomen (6.3 mGy and 3.4 mGy); AP Lumbar Spine (6.4 mGy and 4.1 mGy); AP Pelvis (4.8 mGy and 2.6 mGy) and PA Chest (0.19 mGy and 0.20 mGy). Comparison with the Commission of the European Communities (CEC) recommended techniques identified large differences in the applied potential. The kVp values in this study were significantly lower (by up to 10 kVp) than the CEC recommendations. The results of this study indicate that there is a need to monitor radiation doses received by patients undergoing diagnostic radiography examinations. Not only has the assessment allowed valuable comparison with International Diagnostic Reference Levels and Radiography Good Practice, but it has also demonstrated large variations in mean doses being delivered from different rooms of the same radiology department. Following the simple 3-step approach advocated in this paper should either provide evidence that departments are practising the ALARA principle or assist in making suitable changes to current practice. Copyright (2004) Australian Institute of Radiography
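
    The three-step approach above reduces to comparing measured mean entrance surface doses against reference levels. The short sketch below performs such a comparison; the reference values used are placeholders and must not be read as the actual diagnostic reference levels cited in the paper.

    ```python
    # Sketch of Step Two: compare measured mean entrance surface doses (ESD, mGy)
    # against reference levels. Reference values here are placeholders, not the
    # actual diagnostic reference levels used in the paper.
    measured_esd = {                   # room A and room B means from the abstract
        "AP Abdomen":      (6.3, 3.4),
        "AP Lumbar Spine": (6.4, 4.1),
        "AP Pelvis":       (4.8, 2.6),
        "PA Chest":        (0.19, 0.20),
    }
    reference_level = {                # hypothetical reference levels (mGy)
        "AP Abdomen": 10.0, "AP Lumbar Spine": 10.0, "AP Pelvis": 10.0, "PA Chest": 0.3,
    }

    for exam, doses in measured_esd.items():
        for room, dose in zip("AB", doses):
            flag = "review technique" if dose > reference_level[exam] else "ok"
            print(f"{exam:16s} room {room}: {dose:5.2f} mGy -> {flag}")
    ```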

  10. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and BigBOSS was used as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  11. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  12. Integrative methods for analyzing big data in precision medicine.

    Science.gov (United States)

    Gligorijević, Vladimir; Malod-Dognin, Noël; Pržulj, Nataša

    2016-03-01

    We provide an overview of recent developments in big data analyses in the context of precision medicine and health informatics. With the advances in technologies capturing molecular and medical data, we have entered the era of "Big Data" in biology and medicine. These data offer many opportunities to advance precision medicine. We outline key challenges in precision medicine and present recent advances in data integration-based methods to uncover personalized information from big data produced by various omics studies. We survey recent integrative methods for disease subtyping, biomarker discovery, and drug repurposing, and list the tools that are available to domain scientists. Given the ever-growing nature of these big data, we highlight key issues that big data integration methods will face. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Opening the Big Black Box: European study reveals visitors' impressions of science laboratories

    CERN Multimedia

    2004-01-01

    "On 29 - 30 March the findings of 'Inside the Big Black Box'- a Europe-wide science and society project - will be revealed during a two-day seminar hosted by CERN*. The principle aim of Inside the Big Black Box (IN3B) is to determine whether a working scientific laboratory can capture the curiosity of the general public through visits" (1 page)

  14. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  15. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  16. Crystal structure optimisation using an auxiliary equation of state

    Science.gov (United States)

    Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T.; Walsh, Aron

    2015-11-01

    Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy-volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other "beyond" density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.
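    The single-point-plus-equation-of-state idea summarised above can be illustrated with a short numerical sketch. The snippet below fits a third-order Birch-Murnaghan equation of state to a handful of invented energy-volume points and reports the fitted equilibrium volume; the data values and function names are illustrative assumptions, not the authors' code or results.

```python
# Hedged sketch: fit a 3rd-order Birch-Murnaghan EOS to illustrative
# energy-volume data and predict the equilibrium volume V0.
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, E0, V0, B0, Bp):
    """Third-order Birch-Murnaghan energy-volume relation."""
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9.0 * V0 * B0 / 16.0 * (
        (eta - 1.0) ** 3 * Bp + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
    )

# Illustrative (made-up) energy-volume samples in eV and Angstrom^3.
volumes = np.array([36.0, 38.0, 40.0, 42.0, 44.0])
energies = np.array([-10.20, -10.35, -10.40, -10.37, -10.28])

popt, _ = curve_fit(birch_murnaghan, volumes, energies,
                    p0=(energies.min(), volumes.mean(), 1.0, 4.0))
E0, V0, B0, Bp = popt
print(f"Predicted equilibrium volume: {V0:.2f} A^3, E0 = {E0:.3f} eV")
```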

  17. Crystal structure optimisation using an auxiliary equation of state

    International Nuclear Information System (INIS)

    Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T.; Walsh, Aron (Centre for Sustainable Chemical Technologies and Department of Chemistry, University of Bath, Claverton Down, Bath BA2 7AY (United Kingdom); Global E3 Institute and Department of Materials Science and Engineering, Yonsei University, Seoul 120-749 (Korea, Republic of))

    2015-01-01

    Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy–volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other “beyond” density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.

  18. Crystal structure optimisation using an auxiliary equation of state

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T. [Centre for Sustainable Chemical Technologies and Department of Chemistry, University of Bath, Claverton Down, Bath BA2 7AY (United Kingdom); Walsh, Aron, E-mail: a.walsh@bath.ac.uk [Centre for Sustainable Chemical Technologies and Department of Chemistry, University of Bath, Claverton Down, Bath BA2 7AY (United Kingdom); Global E3 Institute and Department of Materials Science and Engineering, Yonsei University, Seoul 120-749 (Korea, Republic of)]

    2015-11-14

    Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy–volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other “beyond” density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.

  19. Cogeneration technologies, optimisation and implementation

    CERN Document Server

    Frangopoulos, Christos A

    2017-01-01

    Cogeneration refers to the use of a power station to deliver two or more useful forms of energy, for example, to generate electricity and heat at the same time. This book provides an integrated treatment of cogeneration, including a tour of the available technologies and their features, and how these systems can be analysed and optimised.

  20. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data also imply specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges), which ultimately seems to be of greater importance than the sheer data volume.

  1. Formulation and optimisation of raft-forming chewable tablets containing H2 antagonist.

    Science.gov (United States)

    Prajapati, Shailesh T; Mehta, Anant P; Modhia, Ishan P; Patel, Chhagan N

    2012-10-01

    The purpose of this research work was to formulate raft-forming chewable tablets of an H2 antagonist (Famotidine) using a raft-forming agent along with antacid and gas-generating agents. Tablets were prepared by wet granulation and evaluated for raft strength, acid neutralisation capacity, weight variation, % drug content, thickness, hardness, friability and in vitro drug release. Various raft-forming agents were used in preliminary screening. A 2(3) full-factorial design was used in the present study for optimisation. The amount of sodium alginate, the amount of calcium carbonate and the amount of sodium bicarbonate were selected as independent variables. Raft strength, acid neutralisation capacity and drug release at 30 min were selected as responses. Tablets containing sodium alginate had the highest raft strength compared with the other raft-forming agents. The acid neutralisation capacity and in vitro drug release of all factorial batches were found to be satisfactory. The F5 batch was selected as optimal based on maximum raft strength and good acid neutralisation capacity. A drug-excipient compatibility study showed no interaction between the drug and excipients. A stability study of the optimised formulation showed that the tablets were stable under accelerated environmental conditions. It was concluded that raft-forming chewable tablets prepared using optimum amounts of sodium alginate, calcium carbonate and sodium bicarbonate could be an efficient dosage form in the treatment of gastro-oesophageal reflux disease.
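    As a generic aside on the 2(3) full-factorial layout mentioned above, the sketch below simply enumerates the eight coded runs for three factors at low/high levels; the factor names are illustrative assumptions and do not reproduce the authors' actual formulation levels.

```python
# Hedged sketch: enumerate a 2^3 full-factorial design in coded units.
# Factor names and levels are illustrative assumptions only.
from itertools import product

factors = {
    "sodium_alginate": (-1, +1),     # low / high coded levels
    "calcium_carbonate": (-1, +1),
    "sodium_bicarbonate": (-1, +1),
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, start=1):
    print(f"Run {i}: {run}")
```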

  2. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, the relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general theory of relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be; the cosmological problem is still wide open and alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  3. The Big Five Personality Factors and Application Fields

    Directory of Open Access Journals (Sweden)

    Agnė Matuliauskaitė

    2011-07-01

    The Big Five factors are used in many research fields. The literature survey showed that personality trait theory has been used to study and explain relations with different variables. The article focuses on a brief description of methods that can help with identifying the Big Five factors and considers a model for applying them in personnel selection. The paper looks at scientific research assessing relations between the Big Five factors and different variables such as job performance, academic performance, student knowledge management and evaluation. (Article in Lithuanian)

  4. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions for using suitable and promising open-source distributed data-processing software platforms are given.

  5. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  6. Energy balance of the optimised CVT-hybrid-driveline

    Energy Technology Data Exchange (ETDEWEB)

    Hoehn, Bernd-Robert; Pflaum, Hermann; Lechner, Claus [Forschungsstelle fuer Zahnraeder und Getriebebau, Technische Univ. Muenchen, Garching (Germany)

    2009-07-01

    Funded by the DFG (German Research Foundation) and industry partners such as GM Powertrain Europe, ZF and EPCOS, the Optimised CVT-Hybrid was developed at Technische Universitaet Muenchen in close collaboration with industry and is currently under scientific investigation. Designed as a parallel hybrid vehicle, the Optimised CVT-Hybrid combines a series-production diesel engine with a small electric motor. The core element of the driveline is a two-range continuously variable transmission (i√i-transmission) based on a chain variator. Through a special shifting process without interruption of traction force, the ratio range of the chain variator is used twice; a wide transmission-ratio spread is thereby achieved with low complexity. The transmission thus provides a large pull-away ratio for the small electric motor and a fuel-efficient overdrive ratio for the ic-engine. Instead of heavy and space-consuming accumulators, a small efficient package of double-layer capacitors (UltraCaps) is used for electric energy and power storage. The driveline management is performed by an optimised vehicle controller. Within the scope of the research project, two prototype drivelines were manufactured. One driveline is integrated into an Opel Vectra Caravan and is available for investigations on the roller dynamometer and in actual road traffic. The second hybrid driveline is assembled on the powertrain test rig of the FZG for detailed analysis of system behaviour and fuel consumption. Based on measurements of standardised driving cycles, the system behaviour, fuel consumption and a detailed energy balance of the Optimised CVT-Hybrid are presented. The fuel savings in comparison to the series-production vehicle are shown. (orig.)

  7. Optimising the neutron environment of Radiation Portal Monitors: A computational study

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Mark R., E-mail: mark.gilbert@ccfe.ac.uk [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Ghani, Zamir [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); McMillan, John E. [Department of Physics and Astronomy, University of Sheffield, Hicks building, Hounsfield Road, Sheffield S3 7RH (United Kingdom); Packer, Lee W. [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-09-21

    Efficient and reliable detection of radiological or nuclear threats is a crucial part of national and international efforts to prevent terrorist activities. Radiation Portal Monitors (RPMs), which are deployed worldwide, are intended to interdict smuggled fissile material by detecting emissions of neutrons and gamma rays. However, considering the range and variety of threat sources, vehicular and shielding scenarios, and that only a small signature is present, it is important that the design of the RPMs allows these signatures to be accurately differentiated from the environmental background. Using Monte-Carlo neutron-transport simulations of a model 3He detector system, we have conducted a parameter study to identify the optimum combination of detector shielding, moderation, and collimation that maximises the sensitivity of neutron-sensitive RPMs. These structures, which could be simply and cost-effectively added to existing RPMs, can improve the detector response by more than a factor of two relative to an unmodified, bare design. Furthermore, optimisation of the air gap surrounding the helium tubes also improves detector efficiency.

  8. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  9. Energy thermal management in commercial bread-baking using a multi-objective optimisation framework

    International Nuclear Information System (INIS)

    Khatir, Zinedine; Taherkhani, A.R.; Paton, Joe; Thompson, Harvey; Kapur, Nik; Toropov, Vassili

    2015-01-01

    In response to increasing energy costs and legislative requirements, energy-efficient high-speed air impingement jet baking systems are now being developed. In this paper, a multi-objective optimisation framework for oven designs is presented which uses experimentally verified heat transfer correlations and high-fidelity Computational Fluid Dynamics (CFD) analyses to identify optimal combinations of design features which maximise desirable characteristics such as temperature uniformity in the oven and overall energy efficiency of baking. A surrogate-assisted multi-objective optimisation framework is proposed and used to explore a range of practical oven designs, providing information on overall temperature uniformity within the oven together with ensuing energy usage and potential savings. - Highlights: • A multi-objective optimisation framework to design commercial ovens is presented. • High-fidelity CFD embeds experimentally calibrated heat transfer inputs. • The optimum oven design minimises specific energy and bake time. • The Pareto front outlining the surrogate-assisted optimisation framework is built. • Optimisation of industrial bread-baking ovens reveals an energy saving of 637.6 GWh
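    The framework described above ultimately reduces candidate oven designs to a Pareto front over conflicting objectives. As a minimal, generic sketch (assuming two objectives such as specific energy and bake time, both to be minimised, with invented sample values), a non-dominated filter can be written as follows.

```python
# Hedged sketch: extract the Pareto front (both objectives minimised)
# from a list of candidate designs evaluated on two objectives.
def pareto_front(points):
    """Return the points not dominated by any other point (minimisation)."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Illustrative (specific energy [kWh/kg], bake time [min]) pairs.
candidates = [(1.20, 18.0), (1.05, 21.0), (1.30, 16.5), (1.10, 19.5), (1.25, 20.5)]
print(pareto_front(candidates))   # the last candidate is dominated and dropped
```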

  10. Integration of Monte-Carlo ray tracing with a stochastic optimisation method: application to the design of solar receiver geometry.

    Science.gov (United States)

    Asselineau, Charles-Alexis; Zapata, Jose; Pye, John

    2015-06-01

    A stochastic optimisation method adapted to illumination and radiative heat transfer problems involving Monte-Carlo ray-tracing is presented. A solar receiver shape optimisation case study illustrates the advantages of the method and its potential: efficient receivers are identified using a moderate computational cost.
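    The coupling of a stochastic optimiser with a noisy Monte-Carlo objective can be sketched generically as below. The efficiency function is a toy stand-in (a noisy quadratic), not the authors' ray-tracing model, and the parameter name and bounds are assumptions; the sketch only illustrates searching over a shape parameter when each objective evaluation carries Monte-Carlo noise.

```python
# Hedged sketch: random-search optimisation of a shape parameter whose
# objective is only available as a noisy Monte-Carlo estimate.
import random

def mc_estimate_efficiency(aperture_ratio, n_rays=2000):
    """Toy stand-in for a Monte-Carlo ray-tracing efficiency estimate."""
    true_eff = 0.85 - (aperture_ratio - 0.6) ** 2      # hidden from the optimiser
    noise = random.gauss(0.0, 0.5 / n_rays ** 0.5)     # statistical noise
    return true_eff + noise

def random_search(lo, hi, iterations=200, seed=1):
    random.seed(seed)
    best_x, best_f = None, float("-inf")
    for _ in range(iterations):
        x = random.uniform(lo, hi)
        f = mc_estimate_efficiency(x)
        if f > best_f:
            best_x, best_f = x, f
    return best_x, best_f

x_opt, f_opt = random_search(0.2, 1.0)
print(f"Best aperture ratio ~ {x_opt:.3f}, estimated efficiency ~ {f_opt:.3f}")
```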

  11. Décomposition-coordination en optimisation déterministe et stochastique

    CERN Document Server

    Carpentier, Pierre

    2017-01-01

    This book addresses the treatment of large-scale optimisation problems. The idea is to split the global optimisation problem into smaller sub-problems that are easier to solve, each involving one of the subsystems (decomposition), without giving up the global optimum, which requires the use of an iterative procedure (coordination). This subject was covered in several books published in the 1970s in the context of deterministic optimisation. We present here the essential principles and methods of decomposition-coordination through typical situations, and then propose a general framework that allows correct algorithms to be constructed and their convergence to be studied. This theory is presented in the context of both deterministic and stochastic optimisation. The material has been taught by the authors in various graduate courses and has also been applied in numerous industrial applications. Exercises...

  12. A proposed framework of big data readiness in public sectors

    Science.gov (United States)

    Ali, Raja Haslinda Raja Mohd; Mohamad, Rosli; Sudin, Suhizaz

    2016-08-01

    Growing interest in big data is mainly linked to its great potential to unveil unforeseen patterns or profiles that support an organisation's key business decisions. Following private sector moves to embrace big data, the government sector is now getting on the bandwagon. Big data has been considered one of the potential tools to enhance service delivery of the public sector within its financial resource constraints. The Malaysian government, in particular, has made big data one of the main items on the national agenda. Regardless of government commitment to promote big data amongst government agencies, the degree of readiness of the government agencies as well as their employees is crucial in ensuring successful deployment of big data. This paper, therefore, proposes a conceptual framework to investigate the perceived readiness for big data potentials amongst Malaysian government agencies. Perceived readiness of 28 ministries and their respective employees will be assessed using both qualitative (interview) and quantitative (survey) approaches. The outcome of the study is expected to offer meaningful insight into the factors affecting change readiness among public agencies with regard to big data potentials and the expected outcomes from greater or lower change readiness among the public sector.

  13. The structure of the big magnetic storms

    International Nuclear Information System (INIS)

    Mihajlivich, J. Spomenko; Chop, Rudi; Palangio, Paolo

    2010-01-01

    The records of geomagnetic activity during Solar Cycles 22 and 23 (which occurred from 1986 to 2006) indicate several extremely intensive A-class geomagnetic storms. These were storms classified in the category of the Big Magnetic Storms. In a year of maximum solar activity during Solar Cycle 23, or more precisely, during a phase designated as the post-maximum phase in solar activity (PPM - Phase Post Maximum), near the autumn equinox, on 29 October 2003, an extremely strong and intensive magnetic storm was recorded. In the first half of November 2004 (7 November 2004), an intensive magnetic storm of the Big Magnetic Storm class was recorded. The level of geomagnetic field variations recorded for the selected Big Magnetic Storms was ΔDst = 350 nT. For the Big Magnetic Storms, the three-hour geomagnetic activity index was Kp = 9. This study presents the spectral composition of the Di variations recorded during the magnetic storms of October 2003 and November 2004. (Author)

  14. Benefits, Challenges and Tools of Big Data Management

    Directory of Open Access Journals (Sweden)

    Fernando L. F. Almeida

    2017-10-01

    Big Data is one of the most predominant fields of knowledge and research and has generated high repercussions in the process of digital transformation of organizations in recent years. Big Data's main goal is to improve work processes through the analysis and interpretation of large amounts of data. Knowing how Big Data works, and its benefits, challenges and tools, is essential for business success. Our study performs a systematic review of the Big Data field adopting a mind map approach, which allows us to easily and visually identify its main elements and dependencies. The findings identified and mapped a total of 12 main branches of benefits, challenges and tools, and a total of 52 sub-branches within the main areas of the model.

  15. Thickness Optimisation of Textiles Subjected to Heat and Mass Transport during Ironing

    Directory of Open Access Journals (Sweden)

    Korycki Ryszard

    2016-09-01

    This paper analyses the coupled problem during ironing of textiles, that is, heat is transported with mass whereas mass transport with heat is negligible. It is necessary to define both physical and mathematical models. Introducing a two-phase system of mass sorption by fibres, the transport equations are formulated and accompanied by a set of boundary and initial conditions. Optimisation of the material thickness during ironing is gradient-oriented. The first-order sensitivity of an arbitrary objective functional is analysed and included in the optimisation procedure. The numerical example is the thickness optimisation of different textile materials in an ironing device.

  16. Optimisation of NMR dynamic models II. A new methodology for the dual optimisation of the model-free parameters and the Brownian rotational diffusion tensor

    International Nuclear Information System (INIS)

    D'Auvergne, Edward J.; Gooley, Paul R.

    2008-01-01

    Finding the dynamics of an entire macromolecule is a complex problem as the model-free parameter values are intricately linked to the Brownian rotational diffusion of the molecule, mathematically through the autocorrelation function of the motion and statistically through model selection. The solution to this problem was formulated using set theory as an element of the universal set U, the union of all model-free spaces (d'Auvergne EJ and Gooley PR (2007) Mol BioSyst 3(7), 483-494). The current procedure commonly used to find the universal solution is to initially estimate the diffusion tensor parameters, to optimise the model-free parameters of numerous models, and then to choose the best model via model selection. The global model is then optimised and the procedure repeated until convergence. In this paper a new methodology is presented which takes a different approach to this diffusion-seeded model-free paradigm. Rather than starting with the diffusion tensor, this iterative protocol begins by optimising the model-free parameters in the absence of any global model parameters, selecting between all the model-free models, and finally optimising the diffusion tensor. The new model-free optimisation protocol will be validated using synthetic data from Schurr JM et al. (1994) J Magn Reson B 105(3), 211-224 and the relaxation data of the bacteriorhodopsin (1-36)BR fragment from Orekhov VY (1999) J Biomol NMR 14(4), 345-356. To demonstrate the importance of this new procedure, the NMR relaxation data of the Olfactory Marker Protein (OMP) of Gitti R et al. (2005) Biochem 44(28), 9673-9679 is reanalysed. The result is that the dynamics of certain secondary structural elements are very different from those originally reported.

  17. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  18. Toward a Literature-Driven Definition of Big Data in Healthcare

    OpenAIRE

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. ...

  19. Strategic optimisation of microgrid by evolving a unitised regenerative fuel cell system operational criterion

    Science.gov (United States)

    Bhansali, Gaurav; Singh, Bhanu Pratap; Kumar, Rajesh

    2016-09-01

    In this paper, the problem of microgrid optimisation with storage is addressed in a more comprehensive way rather than being confined to loss minimisation. Unitised regenerative fuel cell (URFC) systems have been studied and employed in microgrids to store energy and feed it back into the system when required. A value function dependent on line losses, URFC system operational cost and stored energy at the end of the day is defined here. The function is highly complex, nonlinear and multi-dimensional in nature. Therefore, heuristic optimisation techniques in combination with load flow analysis are used here to resolve the network and time-domain complexity associated with the problem. Particle swarm optimisation with the forward/backward sweep algorithm ensures optimal operation of the microgrid, thereby minimising its operational cost. Results are shown and are found to improve consistently with the evolution of the solution strategy.

  20. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build-time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at the package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for the build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT command optimisation in general, which made the commands approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of
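    The package-level parallelism mentioned above (building independent packages concurrently while respecting their dependencies) can be illustrated with a generic sketch; the package graph and the build action below are invented placeholders and are not CMT internals.

```python
# Hedged sketch: build independent packages in parallel, level by level,
# while respecting a dependency graph. Package names are illustrative only.
from concurrent.futures import ThreadPoolExecutor

deps = {
    "Core": [],
    "Event": ["Core"],
    "Tracking": ["Core"],
    "Calorimetry": ["Core"],
    "Reconstruction": ["Event", "Tracking", "Calorimetry"],
}

def build(package):
    print(f"building {package}")
    return package

built = set()
while len(built) < len(deps):
    # Packages whose dependencies are all satisfied can be built in parallel.
    ready = [p for p in deps if p not in built and all(d in built for d in deps[p])]
    with ThreadPoolExecutor() as pool:
        built.update(pool.map(build, ready))
```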

  1. Optimisation of wort production from rice malt using enzymes and ...

    African Journals Online (AJOL)

    Commercially, rice malt has never been successfully used in brewing because of its low free α-amino nitrogen (FAN) content. This study was designed to optimise rice malt replacement for barley malt in wort production and to improve FAN by adding α-amylase and protease. The response surface methodology (RSM) ...

  2. Multi-objective optimisation of aircraft flight trajectories in the ATM and avionics context

    Science.gov (United States)

    Gardi, Alessandro; Sabatini, Roberto; Ramasamy, Subramanian

    2016-05-01

    The continuous increase of air transport demand worldwide and the push for a more economically viable and environmentally sustainable aviation are driving significant evolutions of aircraft, airspace and airport systems design and operations. Although extensive research has been performed on the optimisation of aircraft trajectories and very efficient algorithms were widely adopted for the optimisation of vertical flight profiles, it is only in the last few years that higher levels of automation were proposed for integrated flight planning and re-routing functionalities of innovative Communication Navigation and Surveillance/Air Traffic Management (CNS/ATM) and Avionics (CNS+A) systems. In this context, the implementation of additional environmental targets and of multiple operational constraints introduces the need to efficiently deal with multiple objectives as part of the trajectory optimisation algorithm. This article provides a comprehensive review of Multi-Objective Trajectory Optimisation (MOTO) techniques for transport aircraft flight operations, with a special focus on the recent advances introduced in the CNS+A research context. In the first section, a brief introduction is given, together with an overview of the main international research initiatives where this topic has been studied, and the problem statement is provided. The second section introduces the mathematical formulation and the third section reviews the numerical solution techniques, including discretisation and optimisation methods for the specific problem formulated. The fourth section summarises the strategies to articulate the preferences and to select optimal trajectories when multiple conflicting objectives are introduced. The fifth section introduces a number of models defining the optimality criteria and constraints typically adopted in MOTO studies, including fuel consumption, air pollutant and noise emissions, operational costs, condensation trails, airspace and airport operations

  3. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.

  4. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

    The term Big Data has been coined to refer to the extensive volumes of data that cannot be managed by traditional data handling methods or techniques. The field of Big Data plays an indispensable role in various areas, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, healthcare and stocks. Big data analytics is the method of examining big data to reveal hidden patterns, previously unknown relationships and other important information that can be used to make better decisions. There has been a perpetually expanding interest in big data because of its rapid growth and because it covers many areas of application. Apache Hadoop, an open-source technology written in Java that runs on the Linux operating system, was used. The primary contribution of this work is to present an effective and free solution for big data applications in a distributed environment, along with its advantages and ease of use. Looking ahead, there appears to be a need for an analytical review of new developments in big data technology. Healthcare is one of the greatest concerns of the world. Big data in healthcare refers to electronic health data sets that relate to patient healthcare and well-being. Data in the healthcare area is growing beyond the management capacity of healthcare organisations and is expected to increase significantly in the coming years.

  5. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  6. Optimisation of efficiency of axial fans

    NARCIS (Netherlands)

    Kruyt, Nicolaas P.; Pennings, P.C.; Faasen, R.

    2014-01-01

    A three-stage research project has been executed to develop ducted axial-fans with increased efficiency. In the first stage a design method has been developed in which various conflicting design criteria can be incorporated. Based on this design method, an optimised design has been determined

  7. Optimising Job-Shop Functions Utilising the Score-Function Method

    DEFF Research Database (Denmark)

    Nielsen, Erland Hejn

    2000-01-01

    During the last 1-2 decades, simulation optimisation of discrete event dynamic systems (DEDS) has made considerable theoretical progress with respect to computational efficiency. The score-function (SF) method and the infinitesimal perturbation analysis (IPA) are two candidates belonging to this ... of a Job-Shop can be handled by the SF method.
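    For readers unfamiliar with the score-function (likelihood-ratio) approach, the generic idea is to estimate d/dθ E[f(X)] as E[f(X) · d log p(X; θ)/dθ] from simulation samples, without differentiating the simulation itself. The sketch below does this for an exponentially distributed random variable; the distribution and cost function are illustrative choices, not the paper's job-shop model.

```python
# Hedged sketch: score-function (likelihood-ratio) estimate of
# d/dtheta E[f(X)] for X ~ Exponential(rate=theta) with f(x) = x**2.
# For this choice, E[X^2] = 2/theta^2, so the exact gradient is -4/theta^3.
import random

def sf_gradient(theta, n=200_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(theta)      # sample X ~ Exp(theta)
        score = 1.0 / theta - x         # d/dtheta log(theta * exp(-theta * x))
        total += (x ** 2) * score       # f(X) * score
    return total / n

theta = 2.0
print("SF estimate :", sf_gradient(theta))
print("Exact value :", -4.0 / theta ** 3)
```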

  8. Pre-operative optimisation of lung function

    Directory of Open Access Journals (Sweden)

    Naheed Azhar

    2015-01-01

    The anaesthetic management of patients with pre-existing pulmonary disease is a challenging task. It is associated with increased morbidity in the form of post-operative pulmonary complications. Pre-operative optimisation of lung function helps in reducing these complications. Patients are advised to stop smoking for a period of 4–6 weeks. This reduces airway reactivity, improves mucociliary function and decreases carboxyhaemoglobin. The widely used incentive spirometry may be useful only when combined with other respiratory muscle exercises. Volume-based inspiratory devices have the best results. Pharmacotherapy of asthma and chronic obstructive pulmonary disease must be optimised before considering the patient for elective surgery. Beta-2 agonists, inhaled corticosteroids and systemic corticosteroids are the main drugs used for this, and several other drugs play an adjunctive role in medical therapy. A graded approach has been suggested to manage these patients for elective surgery, with the aim of achieving optimal pulmonary function.

  9. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  10. Application and Exploration of Big Data Mining in Clinical Medicine

    Science.gov (United States)

    Zhang, Yue; Guo, Shu-Li; Han, Li-Na; Li, Tie-Ling

    2016-01-01

    Objective: To review theories and technologies of big data mining and their application in clinical medicine. Data Sources: Literature published in English or Chinese regarding theories and technologies of big data mining and the concrete applications of data mining technology in clinical medicine was obtained from PubMed and the Chinese Hospital Knowledge Database from 1975 to 2015. Study Selection: Original articles regarding big data mining theory/technology and big data mining's application in the medical field were selected. Results: This review characterized the basic theories and technologies of big data mining including fuzzy theory, rough set theory, cloud theory, Dempster–Shafer theory, artificial neural network, genetic algorithm, inductive learning theory, Bayesian network, decision tree, pattern recognition, high-performance computing, and statistical analysis. The application of big data mining in clinical medicine was analyzed in the fields of disease risk assessment, clinical decision support, prediction of disease development, guidance of rational use of drugs, medical management, and evidence-based medicine. Conclusion: Big data mining has the potential to play an important role in clinical medicine. PMID:26960378

  11. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    OpenAIRE

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph

    2016-01-01

    Background A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationsh...

  12. Solution structure of leptospiral LigA4 Big domain

    Energy Technology Data Exchange (ETDEWEB)

    Mei, Song; Zhang, Jiahai [Hefei National Laboratory for Physical Sciences at Microscale, School of Life Sciences, University of Science and Technology of China, Hefei, Anhui 230026 (China); Zhang, Xuecheng [School of Life Sciences, Anhui University, Hefei, Anhui 230039 (China); Tu, Xiaoming, E-mail: xmtu@ustc.edu.cn [Hefei National Laboratory for Physical Sciences at Microscale, School of Life Sciences, University of Science and Technology of China, Hefei, Anhui 230026 (China)

    2015-11-13

    Pathogenic Leptospira species express immunoglobulin-like proteins which serve as adhesins to bind to the extracellular matrices of host cells. Leptospiral immunoglobulin-like protein A (LigA), a surface-exposed protein containing tandem repeats of bacterial immunoglobulin-like (Big) domains, has been proved to be involved in the interaction of pathogenic Leptospira with the mammalian host. In this study, the solution structure of the fourth Big domain of LigA (LigA4 Big domain) from Leptospira interrogans was solved by nuclear magnetic resonance (NMR). The structure of the LigA4 Big domain displays a bacterial immunoglobulin-like fold similar to other Big domains, implying some common structural aspects of the Big domain family. On the other hand, it displays some structural characteristics significantly different from the classic Ig-like domain. Furthermore, a Stains-all assay and NMR chemical shift perturbation revealed the Ca2+ binding property of the LigA4 Big domain. - Highlights: • Determining the solution structure of a bacterial immunoglobulin-like domain from a surface protein of Leptospira. • The solution structure shows some structural characteristics significantly different from the classic Ig-like domains. • A potential Ca2+-binding site was identified by Stains-all assay and NMR chemical shift perturbation.

  13. Solution structure of leptospiral LigA4 Big domain

    International Nuclear Information System (INIS)

    Mei, Song; Zhang, Jiahai; Zhang, Xuecheng; Tu, Xiaoming

    2015-01-01

    Pathogenic Leptospira species express immunoglobulin-like proteins which serve as adhesins to bind to the extracellular matrices of host cells. Leptospiral immunoglobulin-like protein A (LigA), a surface-exposed protein containing tandem repeats of bacterial immunoglobulin-like (Big) domains, has been proved to be involved in the interaction of pathogenic Leptospira with the mammalian host. In this study, the solution structure of the fourth Big domain of LigA (LigA4 Big domain) from Leptospira interrogans was solved by nuclear magnetic resonance (NMR). The structure of the LigA4 Big domain displays a bacterial immunoglobulin-like fold similar to other Big domains, implying some common structural aspects of the Big domain family. On the other hand, it displays some structural characteristics significantly different from the classic Ig-like domain. Furthermore, a Stains-all assay and NMR chemical shift perturbation revealed the Ca2+ binding property of the LigA4 Big domain. - Highlights: • Determining the solution structure of a bacterial immunoglobulin-like domain from a surface protein of Leptospira. • The solution structure shows some structural characteristics significantly different from the classic Ig-like domains. • A potential Ca2+-binding site was identified by Stains-all assay and NMR chemical shift perturbation.

  14. Satellite Vibration Testing: Angle optimisation method to Reduce Overtesting

    Science.gov (United States)

    Knight, Charly; Remedia, Marcello; Aglietti, Guglielmo S.; Richardson, Guy

    2018-06-01

    Spacecraft overtesting is a long-running problem, and the main focus of most attempts to reduce it has been to adjust the base vibration input (i.e. notching). Instead, this paper examines testing alternatives for secondary structures (equipment) coupled to the main structure (satellite) when they are tested separately. Even if the vibration source is applied along one of the orthogonal axes at the base of the coupled system (satellite plus equipment), the dynamics of the system and potentially the interface configuration mean that the vibration at the interface may not occur all along one axis, much less the corresponding orthogonal axis of the base excitation. This paper proposes an alternative testing methodology in which the testing of a piece of equipment occurs at an offset angle. This Angle Optimisation method may involve multiple tests, each with an altered input direction, allowing the best match between all specified equipment system responses and those of the coupled-system tests. An optimisation process compares the calculated equipment RMS values for a range of inputs with the maximum coupled-system RMS values and is used to find the optimal testing configuration for the given parameters. A case study was performed to find the best testing angles to match the acceleration responses of the centre of mass and the sum of interface forces for all three axes, as well as the von Mises stress for an element near a fastening point. The angle optimisation method resulted in RMS values and PSD responses that were much closer to those of the coupled system when compared with traditional testing. The optimum testing configuration resulted in an overall average error significantly smaller than that of the traditional method. Crucially, this case study shows that the optimum test campaign could be a single equipment-level test as opposed to the traditional three orthogonal-direction tests.
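    A toy version of the angle-matching idea can be written as a one-dimensional optimisation over the test angle; the response model, target RMS values and bounds below are invented assumptions standing in for the finite-element responses used in the study.

```python
# Hedged sketch: choose a single test angle so that equipment RMS responses
# best match target RMS values taken from a coupled-system analysis.
# The response model and all numbers are illustrative assumptions.
import math
from scipy.optimize import minimize_scalar

targets = {"x": 4.2, "y": 2.9, "z": 1.1}          # coupled-system RMS targets [g]

def equipment_rms(angle_rad):
    """Toy model: per-axis RMS response for a base input applied at 'angle_rad'."""
    return {
        "x": 5.0 * abs(math.cos(angle_rad)),
        "y": 3.5 * abs(math.sin(angle_rad)),
        "z": 1.0 + 0.3 * abs(math.sin(2.0 * angle_rad)),
    }

def mismatch(angle_rad):
    resp = equipment_rms(angle_rad)
    return sum((resp[a] - targets[a]) ** 2 for a in targets)

result = minimize_scalar(mismatch, bounds=(0.0, math.pi / 2), method="bounded")
print(f"Optimal test angle ~ {math.degrees(result.x):.1f} deg, "
      f"mismatch = {result.fun:.3f}")
```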

  15. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and progress education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education as well as critically exploring the perils of applying a data-driven approach to education. Despite the claimed value of the...

  16. Optimisation of electrical system for offshore wind farms via genetic algorithm

    DEFF Research Database (Denmark)

    Chen, Zhe; Zhao, Menghua; Blaabjerg, Frede

    2009-01-01

    An optimisation platform based on genetic algorithm (GA) is presented, where the main components of a wind farm and key technical specifications are used as input parameters and the electrical system design of the wind farm is optimised in terms of both production cost and system reliability. The power losses, wind power production, initial investment and maintenance costs are considered in the production cost. The availability of components and network redundancy are included in the reliability evaluation. The method of coding an electrical system to a binary string, which is processed by the GA, is developed. Different GA techniques are investigated based on a real example offshore wind farm. This optimisation platform has been demonstrated as a powerful tool for offshore wind farm design and evaluation.
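    The coding of discrete electrical-system choices into a binary string, as mentioned above, can be illustrated generically; the design decisions, option counts and GA operators below are invented placeholders rather than the paper's cost and reliability model.

```python
# Hedged sketch: encode discrete design choices as a binary string and apply
# one-point crossover and bit-flip mutation, as in a simple GA.
import random

CHOICES = {                        # illustrative option counts per decision
    "cable_cross_section": 4,      # 2 bits
    "transformer_rating": 4,       # 2 bits
    "topology": 2,                 # 1 bit
}
BITS = {name: (n - 1).bit_length() for name, n in CHOICES.items()}

def encode(design):
    return "".join(format(design[name], f"0{BITS[name]}b") for name in CHOICES)

def decode(chromosome):
    design, pos = {}, 0
    for name in CHOICES:
        design[name] = int(chromosome[pos:pos + BITS[name]], 2) % CHOICES[name]
        pos += BITS[name]
    return design

def crossover(a, b):
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(chromosome, rate=0.05):
    return "".join(bit if random.random() > rate else str(1 - int(bit))
                   for bit in chromosome)

parent1 = encode({"cable_cross_section": 2, "transformer_rating": 1, "topology": 0})
parent2 = encode({"cable_cross_section": 3, "transformer_rating": 0, "topology": 1})
child, _ = crossover(parent1, parent2)
print(decode(mutate(child)))
```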

  17. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  18. Identification of the mechanical behaviour of biopolymer composites using multistart optimisation technique

    KAUST Repository

    Brahim, Elhacen

    2013-10-01

    This paper aims at identifying the mechanical behaviour of starch-zein composites as a function of zein content using a novel optimisation technique. Starting from bending experiments, force-deflection response is used to derive adequate mechanical parameters representing the elastic-plastic behaviour of the studied material. For such a purpose, a finite element model is developed accounting for a simple hardening rule, namely isotropic hardening model. A deterministic optimisation strategy is implemented to provide rapid matching between parameters of the constitutive law and the observed behaviour. Results are discussed based on the robustness of the numerical approach and predicted tendencies with regards to the role of zein content. © 2013 Elsevier Ltd.
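    The identification strategy of matching simulated and measured force-deflection curves lends itself to a multistart sketch: run a local optimiser from several random starting points and keep the best fit. The two-parameter model and the synthetic data below are invented stand-ins for the finite-element model and the bending experiments.

```python
# Hedged sketch: multistart local optimisation to identify two material
# parameters by matching a measured force-deflection curve.
# The model form and the data are invented for illustration.
import numpy as np
from scipy.optimize import minimize

deflection = np.linspace(0.0, 5.0, 20)                     # mm
force_meas = 12.0 * deflection / (1.0 + 0.8 * deflection)  # synthetic "measurement"

def force_model(params, x):
    k, h = params                                          # stiffness-like, hardening-like
    return k * x / (1.0 + h * x)

def objective(params):
    return np.sum((force_model(params, deflection) - force_meas) ** 2)

rng = np.random.default_rng(42)
best = None
for _ in range(10):                                        # multistart loop
    x0 = rng.uniform([1.0, 0.1], [50.0, 5.0])              # random starting point
    res = minimize(objective, x0, method="Nelder-Mead")
    if best is None or res.fun < best.fun:
        best = res

print("Identified parameters:", best.x, "residual:", best.fun)
```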

  19. Design and optimisation of dual-mode heat pump systems using natural fluids

    International Nuclear Information System (INIS)

    Zhang Wenling; Klemeš, Jiří Jaromír; Kim, Jin-Kuk

    2012-01-01

    The paper introduces a new multi-period modelling and design methodology for dual-mode heat pumps using natural fluids. First, a mathematical model is developed to capture the thermodynamic and operating characteristics of dual-mode heat pump systems, subject to different ambient temperatures. The multi-period optimisation framework has been developed to reflect different ambient conditions and their influence on heat pump performance, as well as to determine a system capacity of the heat pump which allows systematic economic trade-offs between supplementary heating (or cooling) and the operating cost of the heat pump. A case study considering three geographical locations with different heating and cooling demands is presented to illustrate the importance of using multi-period optimisation for the design of heat pump systems.

  20. A Synchronous-Asynchronous Particle Swarm Optimisation Algorithm

    Science.gov (United States)

    Ab Aziz, Nor Azlina; Mubin, Marizan; Mohamad, Mohd Saberi; Ab Aziz, Kamarulzaman

    2014-01-01

    In the original particle swarm optimisation (PSO) algorithm, the particles' velocities and positions are updated after the whole swarm performance is evaluated. This algorithm is also known as synchronous PSO (S-PSO). The strength of this update method is in the exploitation of the information. Asynchronous update PSO (A-PSO) has been proposed as an alternative to S-PSO. A particle in A-PSO updates its velocity and position as soon as its own performance has been evaluated. Hence, particles are updated using partial information, leading to stronger exploration. In this paper, we attempt to improve PSO by merging both update methods to utilise the strengths of both methods. The proposed synchronous-asynchronous PSO (SA-PSO) algorithm divides the particles into smaller groups. The best member of a group and the swarm's best are chosen to lead the search. Members within a group are updated synchronously, while the groups themselves are asynchronously updated. Five well-known unimodal functions, four multimodal functions, and a real world optimisation problem are used to study the performance of SA-PSO, which is compared with the performances of S-PSO and A-PSO. The results are statistically analysed and show that the proposed SA-PSO has performed consistently well. PMID:25121109
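
    A minimal Python sketch of the synchronous-asynchronous idea follows. Particles inside a group are evaluated and updated together, while groups are processed one after another so that later groups already see the refreshed swarm best; the velocity update below uses the group best and the swarm best as leaders, with illustrative parameter values, and is a simplification rather than the authors' exact formulation.

        import random

        DIM, GROUPS, PER_GROUP, ITERS = 5, 4, 5, 100
        W, C1, C2 = 0.7, 1.5, 1.5                        # inertia and acceleration weights (illustrative)

        def sphere(x):                                   # simple unimodal test function
            return sum(v * v for v in x)

        def new_particle():
            pos = [random.uniform(-5, 5) for _ in range(DIM)]
            return {"pos": pos, "vel": [0.0] * DIM, "best": pos[:], "best_f": sphere(pos)}

        swarm = [[new_particle() for _ in range(PER_GROUP)] for _ in range(GROUPS)]
        gbest = min((p for g in swarm for p in g), key=lambda p: p["best_f"])
        gbest_pos, gbest_f = gbest["best"][:], gbest["best_f"]

        for _ in range(ITERS):
            for group in swarm:                          # groups handled asynchronously
                for p in group:                          # synchronous evaluation within the group
                    f = sphere(p["pos"])
                    if f < p["best_f"]:
                        p["best"], p["best_f"] = p["pos"][:], f
                leader = min(group, key=lambda q: q["best_f"])
                for p in group:                          # synchronous velocity/position update
                    for d in range(DIM):
                        p["vel"][d] = (W * p["vel"][d]
                                       + C1 * random.random() * (leader["best"][d] - p["pos"][d])
                                       + C2 * random.random() * (gbest_pos[d] - p["pos"][d]))
                        p["pos"][d] += p["vel"][d]
                if leader["best_f"] < gbest_f:           # later groups see this update immediately
                    gbest_pos, gbest_f = leader["best"][:], leader["best_f"]

        print("best value found:", round(gbest_f, 6))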

  1. Noise aspects at aerodynamic blade optimisation projects

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G. [Netherlands Energy Research Foundation, Petten (Netherlands)

    1997-12-31

    This paper shows an example of an aerodynamic blade optimisation, using the program PVOPT. PVOPT calculates the optimal wind turbine blade geometry such that the maximum energy yield is obtained. Using the aerodynamically optimal blade design as a basis, the possibilities for noise reduction are investigated. The aerodynamically optimised geometry from PVOPT is the 'real' optimum (up to the latest decimal). The most important conclusion from this study is that it is worthwhile to investigate the behaviour of the objective function (in the present case the energy yield) around the optimum: if the optimum is flat, there is a possibility to apply modifications to the optimum configuration with only a limited loss in energy yield. It is obvious that the modified configurations emit a different (and possibly lower) noise level. In the BLADOPT program (the successor of PVOPT) it will be possible to quantify the noise level and hence to assess the reduced noise emission more thoroughly. At present the most promising approaches for noise reduction are believed to be a reduction of the rotor speed (if at all possible), and a reduction of the tip angle by means of low-lift profiles or decreased twist at the outboard stations. These modifications were possible without a significant loss in energy yield. (LN)

  2. OPTIMISATION OF COMPRESSIVE STRENGTH OF PERIWINKLE ...

    African Journals Online (AJOL)

    In this paper, a regression model is developed to predict and optimise the compressive strength of periwinkle shell aggregate concrete using Scheffe's regression theory. The results obtained from the derived regression model agreed favourably with the experimental data. The model was tested for adequacy using a student ...
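
    The abstract does not reproduce the fitted model; purely for orientation, a Scheffé-type canonical polynomial for a q-component mixture takes the form below (the second-degree form is shown as an example; the degree actually used in the study is not stated):

        \hat{y} \;=\; \sum_{i=1}^{q} \beta_i x_i \;+\; \sum_{i<j} \beta_{ij}\, x_i x_j ,
        \qquad \sum_{i=1}^{q} x_i = 1 , \quad x_i \ge 0 ,

    where the x_i are the mixture proportions and the beta coefficients are estimated from the experimental mix designs.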

  3. Self-optimising control of sewer systems

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Montero-Castro, Ignacio; Mollerup, Ane Loft

    2013-01-01

    … The definition of optimal performance was carried out through a two-stage optimisation (stochastic and deterministic) to take into account both the overflow during the current rain event and the expected overflow, given the probability of a future rain event. The methodology is successfully applied…

  4. An empirical study on website usability elements and how they affect search engine optimisation

    OpenAIRE

    Eugene B. Visser; Melius Weideman

    2011-01-01

    The primary objective of this research project was to identify and investigate the website usability attributes which are in contradiction with search engine optimisation elements. The secondary objective was to determine if these usability attributes affect conversion. Although the literature review identifies the contradictions, experts disagree about their existence. An experiment was conducted, whereby the conversion and/or traffic ratio results of an existing control website were compared...

  5. On the impact of optimisation models in maintenance decision making: the state of the art

    International Nuclear Information System (INIS)

    Dekker, Rommert; Scarf, Philip A.

    1998-01-01

    In this paper we discuss the state of the art in applications of maintenance optimisation models. After giving a short introduction to the area, we consider several ways in which models may be used to optimise maintenance, such as case studies, operational and strategic decision support systems, and give examples of each of them. Next we discuss several areas where the models have been applied successfully. These include civil structures and aeroplane maintenance. From a comparative point of view, we discuss future prospects.

  6. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  7. Fuzzy VIKOR approach for selection of big data analyst in procurement management

    Directory of Open Access Journals (Sweden)

    Surajit Bag

    2016-07-01

    Background: Big data and predictive analysis have been hailed as the fourth paradigm of science. Big data and analytics are critical to the future of business sustainability. The demand for data scientists is increasing with the dynamic nature of businesses, thus making it indispensable to manage big data, derive meaningful results and interpret management decisions. Objectives: The purpose of this study was to provide a brief conceptual review of big data and analytics and further illustrate the use of a multicriteria decision-making technique in selecting the right skilled candidate for big data and analytics in procurement management. Method: It is important for firms to select and recruit the right data analyst, both in terms of skill sets and scope of analysis. Such a problem is by nature a complex, multicriteria decision-making one, dealing with both qualitative and quantitative factors. In the current study, an application of the Fuzzy VIsekriterijumska optimizacija i KOmpromisno Resenje (VIKOR) method was used to solve the big data analyst selection problem. Results: From this study, it was identified that Technical knowledge (C1), Intellectual curiosity (C4) and Business acumen (C5) are the strongest influential criteria and must be present in the candidate for the big data and analytics job. Conclusion: Fuzzy VIKOR is well suited to this kind of multiple-criteria decision-making scenario. This study will assist human resource managers and procurement managers in selecting the right workforce for big data analytics.
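
    For readers unfamiliar with VIKOR, the following Python sketch walks through the crisp ranking step on an invented decision matrix; the fuzzy variant used in the study additionally represents judgements as fuzzy numbers and defuzzifies them, which is omitted here, and the weights, scores and candidate names are assumptions for the example.

        # Crisp VIKOR ranking sketch on an illustrative decision matrix (benefit criteria only).
        candidates = ["A", "B", "C"]
        weights = [0.3, 0.2, 0.2, 0.15, 0.15]             # C1..C5
        scores = [                                        # rows: candidates, columns: criteria
            [8, 6, 7, 9, 5],
            [7, 8, 6, 6, 8],
            [9, 5, 8, 7, 6],
        ]

        m = len(weights)
        f_best = [max(row[j] for row in scores) for j in range(m)]
        f_worst = [min(row[j] for row in scores) for j in range(m)]

        def s_and_r(row):
            terms = [weights[j] * (f_best[j] - row[j]) / (f_best[j] - f_worst[j]) for j in range(m)]
            return sum(terms), max(terms)                 # group utility S and individual regret R

        S = [s_and_r(row)[0] for row in scores]
        R = [s_and_r(row)[1] for row in scores]
        v = 0.5                                           # weight of the "majority rule" strategy
        Q = [v * (S[i] - min(S)) / (max(S) - min(S))
             + (1 - v) * (R[i] - min(R)) / (max(R) - min(R)) for i in range(len(scores))]

        for name, q in sorted(zip(candidates, Q), key=lambda t: t[1]):
            print(name, round(q, 3))                      # lowest Q ranks first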

  8. Classical and quantum Big Brake cosmology for scalar field and tachyonic models

    Energy Technology Data Exchange (ETDEWEB)

    Kamenshchik, A. Yu. [Dipartimento di Fisica e Astronomia and INFN, Via Irnerio 46, 40126 Bologna (Italy) and L.D. Landau Institute for Theoretical Physics of the Russian Academy of Sciences, Kosygin str. 2, 119334 Moscow (Russian Federation); Manti, S. [Scuola Normale Superiore, Piazza dei Cavalieri 7, 56126 Pisa (Italy)

    2013-02-21

    We study a relation between the cosmological singularities in classical and quantum theory, comparing the classical and quantum dynamics in some models possessing the Big Brake singularity - the model based on a scalar field and two models based on a tachyon-pseudo-tachyon field. It is shown that the effect of quantum avoidance is absent for the soft singularities of the Big Brake type while it is present for the Big Bang and Big Crunch singularities. Thus, there is some kind of a classical-quantum correspondence, because soft singularities are traversable in classical cosmology, while the strong Big Bang and Big Crunch singularities are not traversable.

  9. Classical and quantum Big Brake cosmology for scalar field and tachyonic models

    International Nuclear Information System (INIS)

    Kamenshchik, A. Yu.; Manti, S.

    2013-01-01

    We study a relation between the cosmological singularities in classical and quantum theory, comparing the classical and quantum dynamics in some models possessing the Big Brake singularity - the model based on a scalar field and two models based on a tachyon-pseudo-tachyon field. It is shown that the effect of quantum avoidance is absent for the soft singularities of the Big Brake type while it is present for the Big Bang and Big Crunch singularities. Thus, there is some kind of a classical-quantum correspondence, because soft singularities are traversable in classical cosmology, while the strong Big Bang and Big Crunch singularities are not traversable.

  10. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Big data is and will be used more in the future as a tool for everything that happens both online and offline. Of course, being online is a real habit, and Big Data is found in this medium, offering many advantages and being a real help for all consumers. In this paper we discuss Big Data as a plus in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architecture principles of this technology. The most important benefit discussed in this paper is presented in the cloud section.

  11. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  12. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  13. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  14. Big Five personality traits, job satisfaction and subjective wellbeing in China.

    Science.gov (United States)

    Zhai, Qingguo; Willis, Mike; O'Shea, Bob; Zhai, Yubo; Yang, Yuwen

    2013-01-01

    This paper examines the effect of the Big Five personality traits on job satisfaction and subjective wellbeing (SWB). The paper also examines the mediating role of job satisfaction on the Big Five-SWB relationship. Data were collected from a sample of 818 urban employees from five Chinese cities: Harbin, Changchun, Shenyang, Dalian, and Fushun. All the study variables were measured with well-established multi-item scales that have been validated both in English-speaking populations and in China. The study found only extraversion to have an effect on job satisfaction, suggesting that there could be cultural difference in the relationships between the Big Five and job satisfaction in China and in the West. The study found that three factors in the Big Five--extraversion, conscientiousness, and neuroticism--have an effect on SWB. This finding is similar to findings in the West, suggesting convergence in the relationship between the Big Five and SWB in different cultural contexts. The research found that only the relationship between extraversion and SWB is partially mediated by job satisfaction, implying that the effect of the Big Five on SWB is mainly direct, rather than indirect via job satisfaction. The study also found that extraversion was the strongest predictor of both job satisfaction and SWB. This finding implies that extraversion could be more important than other factors in the Big Five in predicting job satisfaction and SWB in a "high collectivism" and "high power distance" country such as China. The research findings are discussed in the Chinese cultural context. The study also offers suggestions on the directions for future research.

  15. Probabilistic sensitivity analysis of optimised preventive maintenance strategies for deteriorating infrastructure assets

    International Nuclear Information System (INIS)

    Daneshkhah, A.; Stocks, N.G.; Jeffrey, P.

    2017-01-01

    Efficient life-cycle management of civil infrastructure systems under continuous deterioration can be improved by studying the sensitivity of optimised preventive maintenance decisions with respect to changes in model parameters. Sensitivity analysis in maintenance optimisation problems is important because, if the calculation of the cost of preventive maintenance strategies is not sufficiently robust, the use of the maintenance model can generate optimised maintenance strategies that are not cost-effective. Probabilistic sensitivity analysis methods (particularly variance-based ones) only partially respond to this issue, and their use is limited to evaluating the extent to which uncertainty in each input contributes to the overall output's variance. These methods do not take account of the decision-making problem in a straightforward manner. To address this issue, we use the concept of the Expected Value of Perfect Information (EVPI) to perform decision-informed sensitivity analysis: to identify the key parameters of the problem and quantify the value of learning about certain aspects of the life-cycle management of civil infrastructure systems. This approach allows us to quantify the benefits of the maintenance strategies in terms of expected costs and in the light of accumulated information about the model parameters and aspects of the system, such as the ageing process. We use a Gamma process model to represent the uncertainty associated with asset deterioration, illustrating the use of EVPI to perform sensitivity analysis on the optimisation problem for age-based and condition-based preventive maintenance strategies. The evaluation of EVPI indices is computationally demanding and Markov Chain Monte Carlo techniques would not be helpful. To overcome this computational difficulty, we approximate the EVPI indices using Gaussian process emulators. The implications of the worked numerical examples are discussed in the context of analytical efficiency and organisational…
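
    A toy Monte Carlo illustration of EVPI for a two-option maintenance decision is sketched below in Python. The cost functions and the Gamma prior on the deterioration rate are invented stand-ins, not the paper's Gamma-process deterioration model or its Gaussian-process emulators.

        import numpy as np

        rng = np.random.default_rng(1)
        theta = rng.gamma(shape=2.0, scale=1.5, size=100_000)     # uncertain deterioration rate

        cost_age_based = 10.0 + 4.0 * theta                       # cheap to plan, cost grows with wear
        cost_condition_based = 14.0 + 2.5 * theta                 # dearer monitoring, gentler slope
        costs = np.column_stack([cost_age_based, cost_condition_based])

        cost_decide_now = costs.mean(axis=0).min()                # best strategy chosen under uncertainty
        cost_perfect_info = costs.min(axis=1).mean()              # strategy chosen after learning theta

        evpi = cost_decide_now - cost_perfect_info                # expected value of perfect information
        print(f"EVPI estimate: {evpi:.3f} cost units")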

  16. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is the subject of many publications. Some pay attention to the many possibilities of big data, others warn us about its consequences. This special…

  17. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data…

  18. Work management to optimise occupational radiological protection

    International Nuclear Information System (INIS)

    Ahier, B.

    2009-01-01

    Although work management is no longer a new concept, continued efforts are still needed to ensure that good performance, outcomes and trends are maintained in the face of current and future challenges. The ISOE programme thus created an Expert Group on Work Management in 2007 to develop an updated report reflecting the current state of knowledge, technology and experience in the occupational radiological protection of workers at nuclear power plants. Published in 2009, the new ISOE report on Work Management to Optimise Occupational Radiological Protection in the Nuclear Power Industry provides up-to-date practical guidance on the application of work management principles. Work management measures aim at optimising occupational radiological protection in the context of the economic viability of the installation. Important factors in this respect are measures and techniques influencing i) dose and dose rate, including source-term reduction; ii) exposure, including the amount of time spent in controlled areas for operations; and iii) efficiency in short- and long-term planning, worker involvement, coordination and training. Equally important due to their broad, cross-cutting nature are the motivational and organisational arrangements adopted. The responsibility for these aspects may reside in various parts of an installation's organisational structure, and thus a multi-disciplinary approach must be recognised, accounted for and well integrated in any work. Based on the operational experience within the ISOE programme, the following key areas of work management have been identified: regulatory aspects; ALARA management policy; worker involvement and performance; work planning and scheduling; work preparation; work implementation; work assessment and feedback; ensuring continuous improvement. The details of each of these areas are elaborated and illustrated in the report through examples and case studies arising from ISOE experience. They are intended to…

  19. Optimisation of the PCR-invA primers for the detection of Salmonella ...

    African Journals Online (AJOL)

    A polymerase chain reaction (PCR)-based method for the detection of Salmonella species in water samples was optimised and evaluated for speed, specificity and sensitivity. Optimisation of Mg2+ and primer concentrations and cycling parameters increased the sensitivity and limit of detection of PCR to 2.6 x 10^4 cfu/m.

  20. Epidemiological study of venous thromboembolism in a big Danish cohort

    DEFF Research Database (Denmark)

    Severinsen, Marianne Tang; Kristensen, Søren Risom; Overvad, Kim

    Introduction: Epidemiological data on venous thromboembolism (VT), i.e. pulmonary emboli (PE) and deep venous thrombosis (DVT), are sparse. We have examined VT diagnoses registered in a big Danish cohort study. Methods: All first-time VT diagnoses in The Danish National Patient Register were … were probable cases (1.7%), whereas for 449 (41.6%) the diagnosis could be excluded. The incidence rate was 1 per 1000 person-years. Out of the 632 cases, 60% were DVT and 40% PE. 315 VT were considered idiopathic (49.8%), 311 were secondary (49.2%) and 15 were unclassifiable. 122 patients had cancer, 87…

  1. Impact of physical exercise on reaction time in patients with Parkinson's disease-data from the Berlin BIG Study.

    Science.gov (United States)

    Ebersbach, Georg; Ebersbach, Almut; Gandor, Florin; Wegner, Brigitte; Wissel, Jörg; Kupsch, Andreas

    2014-05-01

    To determine whether physical activity may affect cognitive performance in patients with Parkinson's disease by measuring reaction times in patients participating in the Berlin BIG study. Randomized controlled trial, rater-blinded. Ambulatory care. Patients with mild to moderate Parkinson's disease (N=60) were randomly allocated to 3 treatment arms. Outcome was measured at the termination of training and at follow-up 16 weeks after baseline in 58 patients (completers). Patients received 16 hours of individual Lee Silverman Voice Treatment-BIG training (BIG; duration of treatment, 4wk), 16 hours of group training with Nordic Walking (WALK; duration of treatment, 8wk), or nonsupervised domestic exercise (HOME; duration of instruction, 1hr). Cued reaction time (cRT) and noncued reaction time (nRT). Differences between treatment groups in improvement in reaction times from baseline to intermediate and baseline to follow-up assessments were observed for cRT but not for nRT. Pairwise t test comparisons revealed differences in change in cRT at both measurements between BIG and HOME groups (intermediate: -52ms; 95% confidence interval [CI], -84/-20; P=.002; follow-up: -55ms; CI, -105/-6; P=.030) and between WALK and HOME groups (intermediate: -61ms; CI, -120/-2; P=.042; follow-up: -78ms; CI, -136/-20; P=.010). There was no difference between BIG and WALK groups (intermediate: 9ms; CI, -49/67; P=.742; follow-up: 23ms; CI, -27/72; P=.361). Supervised physical exercise with Lee Silverman Voice Treatment-BIG or Nordic Walking is associated with improvement in cognitive aspects of movement preparation. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  2. Implementing large-scale programmes to optimise the health workforce in low- and middle-income settings: a multicountry case study synthesis.

    Science.gov (United States)

    Gopinathan, Unni; Lewin, Simon; Glenton, Claire

    2014-12-01

    To identify factors affecting the implementation of large-scale programmes to optimise the health workforce in low- and middle-income countries. We conducted a multicountry case study synthesis. Eligible programmes were identified through consultation with experts and using Internet searches. Programmes were selected purposively to match the inclusion criteria. Programme documents were gathered via Google Scholar and PubMed and from key informants. The SURE Framework - a comprehensive list of factors that may influence the implementation of health system interventions - was used to organise the data. Thematic analysis was used to identify the key issues that emerged from the case studies. Programmes from Brazil, Ethiopia, India, Iran, Malawi, Venezuela and Zimbabwe were selected. Key system-level factors affecting the implementation of the programmes were related to health worker training and continuing education, management and programme support structures, the organisation and delivery of services, community participation, and the sociopolitical environment. Existing weaknesses in health systems may undermine the implementation of large-scale programmes to optimise the health workforce. Changes in the roles and responsibilities of cadres may also, in turn, impact the health system throughout. © 2014 John Wiley & Sons Ltd.

  3. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by the big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  4. Thermodynamic optimisation of a heat exchanger

    NARCIS (Netherlands)

    Cornelissen, Rene; Hirs, Gerard

    1999-01-01

    The objective of this paper is to show that for the optimal design of an energy system, where there is a trade-off between exergy saving during operation and exergy use during construction of the energy system, exergy analysis and life cycle analysis should be combined. An exergy optimisation of a

  5. Topology optimisation of micro fluidic mixers considering fluid-structure interactions with a coupled Lattice Boltzmann algorithm

    Science.gov (United States)

    Munk, David J.; Kipouros, Timoleon; Vio, Gareth A.; Steven, Grant P.; Parks, Geoffrey T.

    2017-11-01

    Recently, the study of micro fluidic devices has gained much interest in various fields from biology to engineering. In the constant development cycle, the need to optimise the topology of the interior of these devices, where there are two or more optimality criteria, is always present. In this work, twin physical situations, whereby optimal fluid mixing in the form of vorticity maximisation is accompanied by the requirement that the casing in which the mixing takes place has the best structural performance in terms of the greatest specific stiffness, are considered. In the steady state of mixing this also means that the stresses in the casing are as uniform as possible, thus giving a desired operating life with minimum weight. The ultimate aim of this research is to couple two key disciplines, fluids and structures, into a topology optimisation framework, which shows fast convergence for multidisciplinary optimisation problems. This is achieved by developing a bi-directional evolutionary structural optimisation algorithm that is directly coupled to the Lattice Boltzmann method, used for simulating the flow in the micro fluidic device, for the objectives of minimum compliance and maximum vorticity. The need to explore larger design spaces and to produce innovative designs makes meta-heuristic algorithms, such as genetic algorithms, particle swarms and Tabu Searches, less efficient for this task. The multidisciplinary topology optimisation framework presented in this article is shown to increase the stiffness of the structure from the datum case and produce physically acceptable designs. Furthermore, the topology optimisation method outperforms a Tabu Search algorithm in designing the baffle to maximise the mixing of the two fluids.

  6. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
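
    The paper demonstrates the procedure in R; purely as an illustration of the same split/analyze/meta-analyze pattern, the following Python sketch splits a simulated dataset into chunks, estimates a correlation in each chunk, and pools the chunk estimates with a fixed-effect (inverse-variance) meta-analysis on the Fisher-z scale.

        import numpy as np

        rng = np.random.default_rng(42)
        n, true_r = 1_000_000, 0.3
        x = rng.standard_normal(n)
        y = true_r * x + np.sqrt(1 - true_r**2) * rng.standard_normal(n)   # correlated outcome

        estimates, variances = [], []
        for xs, ys in zip(np.array_split(x, 20), np.array_split(y, 20)):   # split into 20 chunks
            r = np.corrcoef(xs, ys)[0, 1]                                  # analyze: per-chunk correlation
            estimates.append(np.arctanh(r))                                # Fisher z-transform
            variances.append(1.0 / (len(xs) - 3))                          # approximate variance of z

        w = 1.0 / np.asarray(variances)                                    # meta-analyze: inverse-variance weights
        pooled_z = np.sum(w * np.asarray(estimates)) / np.sum(w)
        print("pooled correlation:", round(float(np.tanh(pooled_z)), 4))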

  7. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  8. Optimising the Blended Learning Environment: The Arab Open University Experience

    Science.gov (United States)

    Hamdi, Tahrir; Abu Qudais, Mohammed

    2018-01-01

    This paper will offer some insights into possible ways to optimise the blended learning environment based on experience with this modality of teaching at Arab Open University/Jordan branch and also by reflecting upon the results of several meta-analytical studies, which have shown blended learning environments to be more effective than their face…

  9. Medical Big Data Warehouse: Architecture and System Design, a Case Study: Improving Healthcare Resources Distribution.

    Science.gov (United States)

    Sebaa, Abderrazak; Chikh, Fatima; Nouicer, Amina; Tari, AbdelKamel

    2018-02-19

    The huge increase in medical devices and clinical applications which generate enormous amounts of data has raised a big issue in managing, processing, and mining this massive amount of data. Indeed, traditional data warehousing frameworks cannot be effective when managing the volume, variety, and velocity of current medical applications. As a result, several data warehouses face many issues over medical data and many challenges need to be addressed. New solutions have emerged, and Hadoop is one of the best examples; it can be used to process these streams of medical data. However, without an efficient system design and architecture, these performances will not be significant and valuable for medical managers. In this paper, we provide a short review of the literature about research issues of traditional data warehouses and we present some important Hadoop-based data warehouses. In addition, a Hadoop-based architecture and a conceptual data model for designing a medical Big Data warehouse are given. In our case study, we provide implementation details of a big data warehouse based on the proposed architecture and data model in the Apache Hadoop platform to ensure an optimal allocation of health resources.

  10. Monte carlo study of MOSFET packaging, optimised for improved energy response: single MOSFET filtration.

    Science.gov (United States)

    Othman, M A R; Cutajar, D L; Hardcastle, N; Guatelli, S; Rosenfeld, A B

    2010-09-01

    Monte Carlo simulations of the energy response of a conventionally packaged single metal-oxide-semiconductor field-effect transistor (MOSFET) detector were performed with the goal of improving MOSFET energy dependence for personal accident or military dosimetry. The MOSFET detector packaging was optimised. Two different 'drop-in' design packages for a single MOSFET detector were modelled and optimised using the GEANT4 Monte Carlo toolkit. Absorbed photon dose simulations of the MOSFET dosemeter in free-air geometry, corresponding to the absorbed doses at depths of 0.07 mm (Dw(0.07)) and 10 mm (Dw(10)) in a water-equivalent phantom of size 30 x 30 x 30 cm^3, were performed for photon energies of 0.015-2 MeV. Energy dependence was reduced to within ±60% for photon energies of 0.06-2 MeV for both Dw(0.07) and Dw(10). Variations in the response for photon energies of 15-60 keV were 200% and 330% for Dw(0.07) and Dw(10), respectively. The obtained energy dependence was reduced compared with that of conventionally packaged MOSFET detectors, which usually exhibit a 500-700% over-response when used in free-air geometry.

  11. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  12. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big-bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: the big bang is replaced by a quantum bounce; the 'horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in the presence of a standard inflation potential, initial conditions are naturally set for a long, slow roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  13. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  14. Big data analysis new algorithms for a new society

    CERN Document Server

    Stefanowski, Jerzy

    2016-01-01

    This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area. It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought about by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued...

  15. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, has opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  16. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than 10 years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of the neuroticism factor. They found associations with empathy in the domains of openness, agreeableness, conscientiousness and extraversion. In our data, women show significantly higher values both on empathy and on the Big Five…

  17. Control strategies to optimise power output in heave buoy energy convertors

    International Nuclear Information System (INIS)

    Abu Zarim, M A U A; Sharip, R M

    2013-01-01

    Wave energy converter (WEC) designs are frequently discussed with the aim of obtaining an optimum design for generating power from waves. The output power of a wave energy converter can be improved by controlling its oscillation so as to exploit the interaction between the WEC and the incident wave. The purpose of this research is to study heave buoys with a view to generating optimum power output by optimising phase and amplitude control in order to maximise the active power. In line with the aims of this study, which investigates the theory and operation of heave buoys as renewable energy sources in order to optimise their power generation, the conditions that influence the heave buoy must be understood before control strategies can be proposed that adjust these parameters to obtain optimum power output. This research is at an early stage, however, and further analysis and technical development are required.
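
    As a point of reference for the phase- and amplitude-control discussion above, a standard relation from linear wave-energy theory (quoted here as background, not taken from this record) gives the time-averaged power absorbed by an oscillating body as

        \bar{P} \;=\; \tfrac{1}{2}\,\lvert \hat{F}_e \rvert\,\lvert \hat{u} \rvert \cos\varphi ,

    where \hat{F}_e is the wave excitation force, \hat{u} the buoy velocity and \varphi the phase angle between them; phase control aims to drive \varphi towards zero (velocity in phase with the excitation force), while amplitude control bounds \lvert \hat{u} \rvert so that device constraints are respected.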

  18. OPTIMISATION OF A DRIVE SYSTEM AND ITS EPICYCLIC GEAR SET

    OpenAIRE

    Bellegarde , Nicolas; Dessante , Philippe; Vidal , Pierre; Vannier , Jean-Claude

    2007-01-01

    This paper describes the design of a drive consisting of a DC motor, a speed reducer, a lead screw transformation system, a power converter and its associated DC source. The objective is to reduce the mass of the system. Indeed, the volume and weight optimisation of an electrical drive is an important issue for embedded applications. Here, we present an analytical model of the system in a specific application and afterwards an optimisation of the motor and speed reduce...

  19. Optimisation of gas-cooled reactors with the aid of mathematical computers

    Energy Technology Data Exchange (ETDEWEB)

    Margen, P H

    1959-04-15

    Reactor optimisation is the task of finding the combination of values of the independent variables in a reactor design producing the lowest cost of electricity. In a gas-cooled reactor the number of independent variables is particularly large and the optimisation process is, therefore, laborious. The present note describes a procedure for performing the entire optimisation procedure with the aid of a mathematical computer in a single operation, thus saving time for the design staff. Detailed equations and numerical constants are proposed for the thermal and cost relations involved. The reactor physics equations, on the other hand are merely stated as general functions of the relevant variables. The task of expressing these functions as detailed equations will be covered by separate documents prepared by the reactor physics department.

  20. Optimisation of gas-cooled reactors with the aid of mathematical computers

    International Nuclear Information System (INIS)

    Margen, P.H.

    1959-04-01

    Reactor optimisation is the task of finding the combination of values of the independent variables in a reactor design producing the lowest cost of electricity. In a gas-cooled reactor the number of independent variables is particularly large and the optimisation process is, therefore, laborious. The present note describes a procedure for performing the entire optimisation procedure with the aid of a mathematical computer in a single operation, thus saving time for the design staff. Detailed equations and numerical constants are proposed for the thermal and cost relations involved. The reactor physics equations, on the other hand are merely stated as general functions of the relevant variables. The task of expressing these functions as detailed equations will be covered by separate documents prepared by the reactor physics department

  1. Infrastructure optimisation via MBR retrofit: a design guide.

    Science.gov (United States)

    Bagg, W K

    2009-01-01

    Wastewater management is continually evolving with the development and implementation of new, more efficient technologies. One of these is the Membrane Bioreactor (MBR). Although a relatively new technology in Australia, MBR wastewater treatment has been widely used elsewhere for over 20 years, with thousands of MBRs now in operation worldwide. Over the past 5 years, MBR technology has been enthusiastically embraced in Australia as a potential treatment upgrade option, and via retrofit typically offers two major benefits: (1) more capacity using mostly existing facilities, and (2) very high quality treated effluent. However, infrastructure optimisation via MBR retrofit is not a simple or low-cost solution and there are many factors which should be carefully evaluated before deciding on this method of plant upgrade. The paper reviews a range of design parameters which should be carefully evaluated when considering an MBR retrofit solution. Several actual and conceptual case studies are considered to demonstrate both advantages and disadvantages. Whilst optimising existing facilities and production of high quality water for reuse are powerful drivers, it is suggested that MBRs are perhaps not always the most sustainable Whole-of-Life solution for a wastewater treatment plant upgrade, especially by way of a retrofit.

  2. Water distribution systems design optimisation using metaheuristics ...

    African Journals Online (AJOL)

    The topic of multi-objective water distribution systems (WDS) design optimisation using metaheuristics is investigated, comparing numerous modern metaheuristics, including several multi-objective evolutionary algorithms, an estimation of distribution algorithm and a recent hyperheuristic named AMALGAM (an evolutionary ...

  3. Pre-big-bang model on the brane

    International Nuclear Information System (INIS)

    Foffa, Stefano

    2002-01-01

    The equations of motion and junction conditions for a gravi-dilaton brane-world scenario are studied in the string frame. It is shown that they allow Kasner-like solutions on the brane, which makes the dynamics of the brane very similar to the low-curvature phase of pre-big-bang cosmology. Analogies and differences of this scenario with the Randall-Sundrum one and with the standard bulk pre-big-bang dynamics are also discussed.

  4. Clinical research of traditional Chinese medicine in big data era.

    Science.gov (United States)

    Zhang, Junhua; Zhang, Boli

    2014-09-01

    With the advent of the big data era, our thinking, technology and methodology are being transformed. Data-intensive scientific discovery based on big data, named "The Fourth Paradigm," has become a new paradigm of scientific research. Along with the development and application of Internet information technology in the field of healthcare, individual health records, clinical data on diagnosis and treatment, and genomic data have accumulated dramatically, generating big data in the medical field for clinical research and assessment. With the support of big data, the defects and weaknesses of the conventional, sampling-based methodology of clinical evaluation may be overcome. Our research target shifts from "causality inference" to "correlativity analysis." This not only facilitates the evaluation of individualized treatment, disease prediction, prevention and prognosis, but is also suitable for the practice of preventive healthcare and symptom pattern differentiation for treatment in terms of traditional Chinese medicine (TCM), and for the post-marketing evaluation of Chinese patent medicines. To conduct clinical studies involving big data in the TCM domain, top-level design is needed and should be carried out in an orderly manner. Fundamental construction and innovation studies should be strengthened in the areas of data platform creation, data analysis technology, and the fostering and training of big data professionals.

  5. Optimisation of window settings for traditional and noise-optimised virtual monoenergetic imaging in dual-energy computed tomography pulmonary angiography

    International Nuclear Information System (INIS)

    D'Angelo, Tommaso; "G. Martino" University Hospital, Messina; Bucher, Andreas M.; Lenga, Lukas; Arendt, Christophe T.; Peterke, Julia L.; Martin, Simon S.; Leithner, Doris; Vogl, Thomas J.; Wichmann, Julian L.; Caruso, Damiano; University Hospital, Latina; Mazziotti, Silvio; Blandino, Alfredo; Ascenti, Giorgio; University Hospital, Messina; Othman, Ahmed E.

    2018-01-01

    To define optimal window settings for displaying virtual monoenergetic images (VMI) of dual-energy CT pulmonary angiography (DE-CTPA). Forty-five patients who underwent clinically-indicated third-generation dual-source DE-CTPA were retrospectively evaluated. Standard linearly-blended (M0.6), 70-keV traditional VMI (M70), and 40-keV noise-optimised VMI (M40+) reconstructions were analysed. For M70 and M40+ datasets, the subjectively best window setting (width and level, B-W/L) was independently determined by two observers and subsequently related to pulmonary artery attenuation to calculate separate optimised values (O-W/L) using linear regression. Subjective evaluation of image quality (IQ) between W/L settings was assessed by two additional readers. Repeated-measures analyses of variance were performed to compare W/L settings and IQ indices between M0.6, M70, and M40+. B-W/L and O-W/L for M70 were 460/140 and 450/140, and were 1100/380 and 1070/380 for M40+, respectively, differing from standard DE-CTPA W/L settings (450/100). Highest subjective scores were observed for M40+ regarding vascular contrast, embolism demarcation, and overall IQ (all p<0.001). Application of O-W/L settings is beneficial to optimise subjective IQ of VMI reconstructions of DE-CTPA. A width slightly less than two times the pulmonary trunk attenuation and a level approximately equal to the overall pulmonary vessel attenuation are recommended. (orig.)
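
    The following Python sketch illustrates the regression step described above: reader-preferred window width and level are regressed against pulmonary artery attenuation, and the fitted lines are applied to a new case. All numbers are invented for illustration and are not the study's measurements.

        import numpy as np

        attenuation = np.array([450, 520, 600, 680, 750])        # HU, pulmonary artery, per patient
        best_width = np.array([820, 960, 1100, 1260, 1400])      # HU, reader-preferred window width
        best_level = np.array([440, 515, 590, 670, 745])         # HU, reader-preferred window level

        width_fit = np.polyfit(attenuation, best_width, deg=1)   # width ~ a * attenuation + b
        level_fit = np.polyfit(attenuation, best_level, deg=1)

        new_attenuation = 570.0
        suggested_width = np.polyval(width_fit, new_attenuation)
        suggested_level = np.polyval(level_fit, new_attenuation)
        print(f"suggested W/L: {suggested_width:.0f}/{suggested_level:.0f}")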

  6. Elitism, Sharing and Ranking Choices in Evolutionary Multi-Criterion Optimisation

    OpenAIRE

    Pursehouse, R.C.; Fleming, P.J.

    2002-01-01

    Elitism and sharing are two mechanisms that are believed to improve the performance of an evolutionary multi-criterion optimiser. The relative performance of the two most popular ranking strategies is largely unknown. Using a new empirical inquiry framework, this report studies the effect of elitism, sharing and ranking design choices using a benchmark suite of two-criterion problems.

  7. Optimisation of quantitative lung SPECT applied to mild COPD: a software phantom simulation study.

    Science.gov (United States)

    Norberg, Pernilla; Olsson, Anna; Alm Carlsson, Gudrun; Sandborg, Michael; Gustafsson, Agnetha

    2015-01-01

    The amount of inhomogeneities in a (99m)Tc Technegas single-photon emission computed tomography (SPECT) lung image, caused by reduced ventilation in lung regions affected by chronic obstructive pulmonary disease (COPD), is correlated to disease advancement. A quantitative analysis method, the CVT method, measuring these inhomogeneities was proposed in earlier work. To detect mild COPD, which is a difficult task, optimised parameter values are needed. In this work, the CVT method was optimised with respect to the parameter values of acquisition, reconstruction and analysis. The ordered subset expectation maximisation (OSEM) algorithm was used for reconstructing the lung SPECT images. As a first step towards clinical application of the CVT method in detecting mild COPD, this study was based on simulated SPECT images of an advanced anthropomorphic lung software phantom including respiratory and cardiac motion, where the mild COPD lung had an overall ventilation reduction of 5%. The best separation between healthy and mild COPD lung images, as determined using the CVT measure of ventilation inhomogeneity and 125 MBq (99m)Tc, was obtained using a low-energy high-resolution (LEHR) collimator and a power 6 Butterworth post-filter with a cutoff frequency of 0.6 to 0.7 cm^-1. Sixty-four reconstruction updates and a small kernel size should be used when the whole lung is analysed, and for the reduced lung a greater number of updates and a larger kernel size are needed. A LEHR collimator and 125 MBq (99m)Tc, together with an optimal combination of cutoff frequency, number of updates and kernel size, gave the best result. Suboptimal selection of cutoff frequency, number of updates or kernel size will reduce the imaging system's ability to detect mild COPD in the lung phantom.
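
    To show what the post-filter step looks like in practice, the following Python sketch applies a three-dimensional Butterworth transfer function, B(f) = 1 / (1 + (f/fc)^(2n)), to a noisy volume in the frequency domain, using the power-6, 0.6 cm^-1 settings mentioned above. The volume, voxel size and implementation details are assumptions made for the example, not the study's reconstruction chain.

        import numpy as np

        def butterworth_filter(volume, voxel_size_cm, cutoff_cm_inv=0.6, power=6):
            # Build the radial spatial-frequency grid (cycles per cm) and apply B(f) via FFT.
            freqs = [np.fft.fftfreq(n, d=voxel_size_cm) for n in volume.shape]
            fx, fy, fz = np.meshgrid(*freqs, indexing="ij")
            radial = np.sqrt(fx**2 + fy**2 + fz**2)
            transfer = 1.0 / (1.0 + (radial / cutoff_cm_inv) ** (2 * power))
            return np.real(np.fft.ifftn(np.fft.fftn(volume) * transfer))

        volume = np.random.default_rng(0).poisson(lam=20.0, size=(64, 64, 64)).astype(float)
        smoothed = butterworth_filter(volume, voxel_size_cm=0.4)
        print("noise before/after:", round(volume.std(), 2), round(smoothed.std(), 2))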

  8. Optimisation of surgical care for rectal cancer

    NARCIS (Netherlands)

    Borstlap, W.A.A.

    2017-01-01

    Optimisation of surgical care means weighing the risk of treatment related morbidity against the patients’ potential benefits of a surgical intervention. The first part of this thesis focusses on the anaemic patient undergoing colorectal surgery. Hypothesizing that a more profound haemoglobin

  9. An efficient optimisation method in groundwater resource ...

    African Journals Online (AJOL)

    DRINIE

    2003-10-04

    ... theories developed in the field of stochastic subsurface hydrology. In reality, many ... Recently, some researchers have applied the multi-stage ... Then a robust solution of the optimisation problem given by Eqs. (1) to (3) is as ...

  10. Topology optimised photonic crystal waveguide intersections with high-transmittance and low crosstalk

    DEFF Research Database (Denmark)

    Ikeda, N; Sugimoto, Y; Watanabe, Y

    2006-01-01

    Numerical and experimental studies on the photonic crystal waveguide intersection based on the topology optimisation design method are reported and the effectiveness of this technique is shown by achieving high transmittance spectra with low crosstalk for the straightforward beam-propagation line...

  11. Semantic Web Technologies and Big Data Infrastructures: SPARQL Federated Querying of Heterogeneous Big Data Stores

    OpenAIRE

    Konstantopoulos, Stasinos; Charalambidis, Angelos; Mouchakis, Giannis; Troumpoukis, Antonis; Jakobitsch, Jürgen; Karkaletsis, Vangelis

    2016-01-01

    The ability to cross-link large scale data with each other and with structured Semantic Web data, and the ability to uniformly process Semantic Web and other data adds value to both the Semantic Web and to the Big Data community. This paper presents work in progress towards integrating Big Data infrastructures with Semantic Web technologies, allowing for the cross-linking and uniform retrieval of data stored in both Big Data infrastructures and Semantic Web data. The technical challenges invo...

  12. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  13. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally distant levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astro-cosmo observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.
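
    For orientation only, the bounce mechanism is commonly summarised by the effective Friedmann equation of standard LQC (quoted here in its textbook form, which need not coincide with the nonstandard LQC treatment of this record):

        H^2 = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_c}\right)

    The Hubble rate H vanishes when the energy density reaches the critical density \rho_c, so the classical big-bang singularity is replaced by a bounce at finite volume.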

  14. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated, especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user's code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines by up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem's search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
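
    As a self-contained illustration of the inequality-join problem that IEJoin targets (a simplified sketch, not the IEJoin algorithm itself), the snippet below evaluates a two-condition inequality self-join by sorting on one attribute and binary-searching the candidate suffix, then filtering on the second attribute; the column names and records are invented for the example.

        import bisect

        # Illustrative records: (id, duration, revenue). The join asks for pairs
        # (r, s) with r.duration < s.duration AND r.revenue > s.revenue.
        rows = [(1, 100, 9), (2, 140, 12), (3, 80, 15), (4, 90, 5)]

        def inequality_self_join(rows):
            # Sorting by duration makes every r's join partners a contiguous
            # suffix that a single binary search can locate.
            by_dur = sorted(rows, key=lambda t: t[1])
            durations = [t[1] for t in by_dur]
            matches = []
            for r in rows:
                start = bisect.bisect_right(durations, r[1])  # first s with s.duration > r.duration
                for s in by_dur[start:]:                      # check the second condition
                    if r[2] > s[2]:
                        matches.append((r[0], s[0]))
            return matches

        print(inequality_self_join(rows))  # [(3, 4), (3, 1), (3, 2)]

    IEJoin, as described in the dissertation, goes further by encoding the second sort order in a permutation array and marking visited tuples in a space-efficient bit-array, so that both conditions are resolved by sorted scans rather than the residual filter loop shown above.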

  15. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  16. Big Data in Health: a Literature Review from the Year 2005.

    Science.gov (United States)

    de la Torre Díez, Isabel; Cosgaya, Héctor Merino; Garcia-Zapirain, Begoña; López-Coronado, Miguel

    2016-09-01

    The information stored in healthcare systems has increased over the last ten years, leading it to be considered Big Data. There is a wealth of health information ready to be analysed. However, the sheer volume raises a challenge for traditional methods. The aim of this article is to conduct a cutting-edge study on Big Data in healthcare from 2005 to the present. This literature review will help researchers to know how Big Data has developed in the health industry and open up new avenues for research. Information searches have been made on various scientific databases such as Pubmed, Science Direct, Scopus and Web of Science for Big Data in healthcare. The search criteria were "Big Data" and "health" with a date range from 2005 to the present. A total of 9724 articles were found on the databases. 9515 articles were discarded as duplicates or for not having a title of interest to the study. 209 articles were read, with the resulting decision that 46 were useful for this study. 52.6 % of the articles used were found in Science Direct, 23.7 % in Pubmed, 22.1 % through Scopus and the remaining 2.6 % through the Web of Science. Big Data has undergone extremely high growth since 2011 and its use is becoming compulsory in developed nations and in an increasing number of developing nations. Big Data is a step forward and a cost reducer for public and private healthcare.

  17. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.
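
    For context, the 'supernegative barotropic index' refers to an equation of state p = w\rho with w < -1; in the homogeneous and isotropic limit with constant w this drives the scale factor to infinity at a finite time (the standard big-rip behaviour, quoted here for orientation rather than the inhomogeneous analysis itself):

        p = w\rho, \qquad w < -1, \qquad a(t) \propto (t_{\mathrm{rip}} - t)^{\frac{2}{3(1+w)}} \longrightarrow \infty \quad \text{as } t \to t_{\mathrm{rip}}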

  18. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  19. Waste flow analysis and life cycle assessment of integrated waste management systems as planning tools: Application to optimise the system of the City of Bologna.

    Science.gov (United States)

    Tunesi, Simonetta; Baroni, Sergio; Boarini, Sandro

    2016-09-01

    The results of this case study are used to argue that waste management planning should follow a detailed process, adequately confronting the complexity of the waste management problems and the specificity of each urban area and of regional/national situations. To support the development or completion of integrated waste management systems, this article proposes a planning method based on: (1) the detailed analysis of waste flows and (2) the application of a life cycle assessment to compare alternative scenarios and optimise solutions. The evolution of the City of Bologna waste management system is used to show how this approach can be applied to assess which elements improve environmental performance. The assessment of the contribution of each waste management phase in the Bologna integrated waste management system has proven that the changes applied from 2013 to 2017 result in a significant improvement of the environmental performance, mainly as a consequence of the optimised integration between materials and energy recovery: Global Warming Potential at 100 years (GWP100) diminishes from 21,949 to -11,169 t CO2-eq y(-1) and abiotic resource depletion from -403 to -520 t antimony-eq y(-1). This study analyses the collection phase in great detail. Outcomes provide specific operational recommendations to policy makers, showing: (a) the relevance of the choice of the materials forming the bags for 'door to door' collection (for non-recycled low-density polyethylene bags, 22 kg CO2-eq (tonne of waste)(-1)); (b) the relatively low environmental impacts associated with underground tanks (3.9 kg CO2-eq (tonne of waste)(-1)); (c) the relatively low impact of big street containers with respect to plastic bags (2.6 kg CO2-eq (tonne of waste)(-1)). © The Author(s) 2016.

  20. Intelligent Test Mechanism Design of Worn Big Gear

    Directory of Open Access Journals (Sweden)

    Hong-Yu LIU

    2014-10-01

    With the continuous development of the national economy, big gears are widely applied in the metallurgy and mining domains, where they play an important role. In practical production, abrasion and breakage of big gears occur frequently, affecting normal production and causing unnecessary economic loss. An intelligent test method for worn big gears is put forward, mainly aimed at the constraints of high production cost, long production cycles and labour-intensive manual repair welding. The measurement equations for the involute straight gear were transformed: the original polar-coordinate equations were converted into rectangular-coordinate equations. The measurement principle for big gear abrasion is introduced, a detection principle diagram is given, and the method for realising the detection route is described. An OADM12 laser sensor was selected, and detection of the worn area of the big gear was realised by the detection mechanism. Measured data from unworn and worn gears were fed into a calculation program written in Visual Basic, from which the abrasion quantity of the big gear can be obtained. This provides a feasible method for intelligent testing and intelligent repair welding of worn big gears.
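
    As a worked illustration of the coordinate transformation mentioned above, the standard textbook parametrisation of a circle involute (which may differ in detail from the authors' equations) maps the base-circle radius r_b and roll angle t to rectangular coordinates as:

        x(t) = r_b (\cos t + t \sin t), \qquad y(t) = r_b (\sin t - t \cos t)

    Equivalently, in polar form the involute point lies at radius \rho = r_b \sqrt{1 + t^2} with polar angle \theta = t - \arctan t; writing \tan\alpha = t, this is the familiar involute function \mathrm{inv}\,\alpha = \tan\alpha - \alpha used in gear metrology.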