WorldWideScience

Sample records for optimisation study big

  1. Big data to optimise product strategy in electronic industry

    OpenAIRE

    Khan, Nawaz; Lakshmi Sabih, Vijay; Georgiadou, Elli; Repanovich, Angela

    2016-01-01

    This research identifies the success factors for new product development and competitive advantage, and argues how big data can expedite the process of launching a new product initiative. By combining the research findings with the patterns of background theories, an inquisitive framework for new product development and competitive advantage is proposed. This model and framework constitute a prototype which, with the aid of scenarios, recommends a parsimonious and unified way to elucidate...

  2. Cost optimisation studies of high power accelerators

    Energy Technology Data Exchange (ETDEWEB)

    McAdams, R.; Nightingale, M.P.S.; Godden, D. [AEA Technology, Oxon (United Kingdom)] [and others]

    1995-10-01

    Cost optimisation studies are carried out for an accelerator-based neutron source consisting of a series of linear accelerators. The characteristics of the lowest-cost design for a machine of given beam current and energy, such as its power and length, are found to depend on the lifetime envisaged for it. For a fixed neutron yield it is preferable to have a low-current, high-energy machine. The benefits of superconducting technology are also investigated. A Separated Orbit Cyclotron (SOC) has the potential to reduce capital and operating costs, and initial estimates for the transverse and longitudinal current limits of such machines are made.

  3. Profile control studies for JET optimised shear regime

    Energy Technology Data Exchange (ETDEWEB)

    Litaudon, X.; Becoulet, A.; Eriksson, L.G.; Fuchs, V.; Huysmans, G.; How, J.; Moreau, D.; Rochard, F.; Tresset, G.; Zwingmann, W. [Association Euratom-CEA, CEA/Cadarache, Dept. de Recherches sur la Fusion Controlee, DRFC, 13 - Saint-Paul-lez-Durance (France); Bayetti, P.; Joffrin, E.; Maget, P.; Mayorat, M.L.; Mazon, D.; Sarazin, Y. [JET Abingdon, Oxfordshire (United Kingdom); Voitsekhovitch, I. [Universite de Provence, LPIIM, Aix-Marseille 1, 13 (France)

    2000-03-01

    This report summarises the profile control studies, i.e. preparation and analysis of JET Optimised Shear plasmas, carried out during the year 1999 within the framework of the Task-Agreement (RF/CEA/02) between JET and the Association Euratom-CEA/Cadarache. We report on our participation in the preparation of the JET Optimised Shear experiments together with their comprehensive analyses and the modelling. Emphasis is put on the various aspects of pressure profile control (core and edge pressure) together with detailed studies of current profile control by non-inductive means, in the prospects of achieving steady, high performance, Optimised Shear plasmas. (authors)

  4. The optimisation study of tbp synthesis process by phosphoric acid

    International Nuclear Information System (INIS)

    Amedjkouh, A.; Attou, M.; Azzouz, A.; Zaoui, B.

    1995-07-01

    The present work deals with the optimisation of the TBP synthesis process using phosphoric acid. This synthesis route is more advantageous than those using POCl3 or P2O5 as phosphating agents, since the latter are toxic and hazardous to the environment. The optimisation study is based on a series of 16 experiments covering the range of variation of the following parameters: temperature, pressure, reagent mole ratio and promoter content. The yield calculation is based on the randomisation of an equation including all parameters. The resolution of this equation gave a 30% TBP molar ratio; this value is in agreement with the experimental data.
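
    Four varied parameters and 16 runs are consistent with a two-level full factorial design (2^4 = 16); the sketch below enumerates such a design. This is an assumption for illustration only: the parameter levels are placeholders, not values taken from the study.

```python
# Hypothetical two-level full factorial over the four parameters named in the
# abstract; 2**4 = 16 runs. Levels are illustrative placeholders only.
from itertools import product

levels = {
    "temperature_C": (60, 90),
    "pressure_bar": (1.0, 2.0),
    "mole_ratio": (3.0, 5.0),      # phosphoric acid : alcohol ratio (assumed)
    "promoter_pct": (0.5, 2.0),
}

runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
print(len(runs))                   # 16 experimental runs
for i, run in enumerate(runs[:3], 1):
    print(i, run)
```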

  5. Optimisation Study on the Production of Anaerobic Digestate ...

    African Journals Online (AJOL)

    Organic fraction of municipal solid waste (OFMSW) is a rich substrate for biogas and compost production. Anaerobic Digestate compost (ADC) is an organic fertilizer produced from stabilized residuals of anaerobic digestion of OFMSW. This paper reports the result of studies carried out to optimise the production of ADC from ...

  6. The Cell Factory Aspergillus Enters the Big Data Era: Opportunities and Challenges for Optimising Product Formation.

    Science.gov (United States)

    Meyer, Vera; Fiedler, Markus; Nitsche, Benjamin; King, Rudibert

    2015-01-01

    Living with limits. Getting more from less. Producing commodities and high-value products from renewable resources, including waste. What reads as the driving force and quintessence of the bioeconomy also outlines the lifestyle and product portfolio of Aspergillus, a saprophytic genus to which some of the top-performing microbial cell factories belong: Aspergillus niger, Aspergillus oryzae and Aspergillus terreus. What makes them so interesting for exploitation in biotechnology and how can they help us to address key challenges of the twenty-first century? How can these strains be trimmed for better growth on second-generation feedstocks, and how can we enlarge their product portfolio by genetic and metabolic engineering to get more from less? On the other hand, what makes it so challenging to deduce biological meaning from the wealth of Aspergillus-omics data? And which hurdles hinder us from modelling and engineering industrial strains for higher productivity and better rheological performance under industrial cultivation conditions? In this review, we will address these issues by highlighting the most recent findings from Aspergillus research, with a focus on fungal growth, physiology, morphology and product formation. Indeed, recent years have brought many surprising insights into model and industrial strains. They clearly told us that similar is not the same: there are different ways to make a hypha, there are more protein secretion routes than anticipated, and there are different molecular and physical mechanisms which control polar growth and the development of hyphal networks. We will discuss new conceptual frameworks derived from these insights and the future scientific advances necessary to create value from Aspergillus Big Data.

  7. A study of certain Monte Carlo search and optimisation methods

    International Nuclear Information System (INIS)

    Budd, C.

    1984-11-01

    Studies are described which might lead to the development of a search and optimisation facility for the Monte Carlo criticality code MONK. The facility envisaged could be used to maximise a function of k-effective with respect to certain parameters of the system or, alternatively, to find the system (in a given range of systems) for which that function takes a given value. (UK)

  8. Parametric studies and optimisation of pumped thermal electricity storage

    International Nuclear Information System (INIS)

    McTigue, Joshua D.; White, Alexander J.; Markides, Christos N.

    2015-01-01

    Highlights: • PTES is modelled by cycle analysis and a Schumann-style model of the thermal stores. • Optimised trade-off surfaces show a flat efficiency vs. energy density profile. • Overall roundtrip efficiencies of around 70% are not inconceivable. - Abstract: Several of the emerging technologies for electricity storage are based on some form of thermal energy storage (TES). Examples include liquid air energy storage, pumped heat energy storage and, at least in part, advanced adiabatic compressed air energy storage. Compared to other large-scale storage methods, TES benefits from relatively high energy densities, which should translate into a low cost per MW h of storage capacity and a small installation footprint. TES is also free from the geographic constraints that apply to hydro storage schemes. TES concepts for electricity storage rely on either a heat pump or refrigeration cycle during the charging phase to create a hot or a cold storage space (the thermal stores), or in some cases both. During discharge, the thermal stores are depleted by reversing the cycle such that it acts as a heat engine. The present paper is concerned with a form of TES that has both hot and cold packed-bed thermal stores, and for which the heat pump and heat engine are based on a reciprocating Joule cycle, with argon as the working fluid. A thermodynamic analysis is presented based on traditional cycle calculations coupled with a Schumann-style model of the packed beds. Particular attention is paid to the various loss-generating mechanisms and their effect on roundtrip efficiency and storage density. A parametric study is first presented that examines the sensitivity of results to assumed values of the various loss factors and demonstrates the rather complex influence of the numerous design variables. Results of an optimisation study are then given in the form of trade-off surfaces for roundtrip efficiency, energy density and power density. The optimised designs show a
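
    The two quantities traded off in the optimisation are roundtrip efficiency and energy density. A minimal statement of the usual definitions (the symbols below are mine, not the paper's notation):

```latex
\chi_{\mathrm{rt}} = \frac{W_{\mathrm{dis}}}{W_{\mathrm{chg}}},
\qquad
\rho_E = \frac{W_{\mathrm{dis}}}{V_{\mathrm{hot}} + V_{\mathrm{cold}}}
```

    where $W_{\mathrm{chg}}$ is the electrical work absorbed during charging, $W_{\mathrm{dis}}$ the electrical work recovered during discharge, and $V_{\mathrm{hot}}$, $V_{\mathrm{cold}}$ the volumes of the two packed-bed stores; each loss mechanism examined in the paper pushes $\chi_{\mathrm{rt}}$ further below unity.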

  9. Stress analysis studies in optimised 'D' shaped TOKAMAK magnet designs

    International Nuclear Information System (INIS)

    Diserens, N.J.

    1975-07-01

    A suite of computer programs TOK was developed which enabled simple data input to be used for computation of magnetic fields and forces in a toroidal system of coils with either D-shaped or circular cross section. An additional requirement was that input data to the Swansea stress analysis program FINESSE could be output from the TOK fields and forces program, and that graphical output from either program should be available. A further program was required to optimise the coil shape. This used the field calculating routines from the TOK program. The starting point for these studies was the proposed 40 coil Princeton design. The stresses resulting from three different shapes of D-coil were compared. (author)

  10. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    Science.gov (United States)

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  11. Optimisation Study on the Production of Anaerobic Digestate ...

    African Journals Online (AJOL)

    DR. AMIN

    optimise the production of ADC from organic fractions of domestic wastes and the effects of ADC amendments on soil ... (22%), cooked meat (9%), lettuce (11%), carrots (3%), potato (44%) ... seed was obtained from a mesophilic anaerobic ...

  12. A comparative study of marriage in honey bees optimisation (MBO ...

    African Journals Online (AJOL)

    2012-02-15

    Feb 15, 2012 ... In a typical mating, the queen mates with 7 to 20 drones. Each time the ... Honey bee mating optimisation model's pseudo-code ... for this analysis, which consists of 47 years of monthly time ... objective of Karkheh Reservoir is to control and regulate the flow of ... Masters thesis, Maastricht University, Maastricht.

  13. A big-data model for multi-modal public transportation with application to macroscopic control and optimisation

    Science.gov (United States)

    Faizrahnemoon, Mahsa; Schlote, Arieh; Maggi, Lorenzo; Crisostomi, Emanuele; Shorten, Robert

    2015-11-01

    This paper describes a Markov-chain-based approach to modelling multi-modal transportation networks. An advantage of the model is its ability to accommodate complex dynamics and handle huge amounts of data. The transition matrix of the Markov chain is built and the model is validated using data extracted from a traffic simulator. A realistic test case using multi-modal data from the city of London is given to further support the ability of the proposed methodology to handle large quantities of data. The Markov chain is then used as a control tool to improve the overall efficiency of a transportation network, and some practical examples are described to illustrate the potential of the approach.
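
    As a sketch of the core modelling step, the snippet below turns observed origin-destination trip counts into a row-stochastic transition matrix and extracts its stationary distribution by power iteration. The 3x3 count matrix is illustrative, not the London or simulator data used in the paper.

```python
# Build a Markov transition matrix from origin-destination trip counts and
# estimate the long-run (stationary) distribution of travellers over nodes.
import numpy as np

counts = np.array([[120., 30., 10.],
                   [ 25., 90., 35.],
                   [ 15., 40., 80.]])           # illustrative OD counts

P = counts / counts.sum(axis=1, keepdims=True)  # row-normalise -> transition matrix

pi = np.full(P.shape[0], 1.0 / P.shape[0])      # uniform initial distribution
for _ in range(1000):                           # power iteration: pi <- pi P
    pi = pi @ P

print(P)
print(pi)   # long-run share of travellers at each node under this model
```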

  14. Pre-segmented 2-Step IMRT with subsequent direct machine parameter optimisation – a planning study

    International Nuclear Information System (INIS)

    Bratengeier, Klaus; Meyer, Jürgen; Flentje, Michael

    2008-01-01

    Modern intensity modulated radiotherapy (IMRT) mostly uses iterative optimisation methods. The integration of machine parameters into the optimisation process of step-and-shoot leaf positions has been shown to be successful. For IMRT segmentation algorithms based on the analysis of the geometrical structure of the planning target volumes (PTV) and the organs at risk (OAR), the potential of such procedures has not yet been fully explored. In this work, 2-Step IMRT was combined with subsequent direct machine parameter optimisation (DMPO, RaySearch Laboratories, Sweden) to investigate this potential. In a planning study, DMPO on a commercial planning system was compared with manual primary 2-Step IMRT segment generation followed by DMPO optimisation. 15 clinical cases and the ESTRO Quasimodo phantom were employed. Both the same number of optimisation steps and the same set of objective values were used. The plans were compared with a clinical DMPO reference plan and a traditional IMRT plan based on fluence optimisation and subsequent segmentation. The composite objective value (the weighted sum of quadratic deviations of the objective values and the related points in the dose volume histogram) was used as a measure of plan quality. Additionally, a more extended set of parameters was used for the breast cases to compare the plans. The plans with segments pre-defined with 2-Step IMRT were slightly superior to DMPO alone in the majority of cases. The composite objective value tended to be even lower for a smaller number of segments. The total number of monitor units was slightly higher than for the DMPO plans. Traditional IMRT fluence optimisation with subsequent segmentation could not compete. 2-Step IMRT segmentation is suitable as a starting point for further DMPO optimisation and, in general, results in less complex plans which are equal or superior to plans generated by DMPO alone.
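
    Written out from the description above, the composite objective value is a weighted sum of quadratic deviations between achieved and requested dose-volume points (the symbols are mine, not the planning system's notation):

```latex
F = \sum_{i} w_i \left( D_i - D_i^{\mathrm{obj}} \right)^2
```

    where $D_i$ is the achieved dose at the $i$-th dose-volume objective point, $D_i^{\mathrm{obj}}$ the requested value and $w_i$ its weight; a lower $F$ indicates a plan closer to its objectives.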

  15. A COMPARATIVE STUDY ON MULTI-SWARM OPTIMISATION AND BAT ALGORITHM FOR UNCONSTRAINED NON LINEAR OPTIMISATION PROBLEMS

    Directory of Open Access Journals (Sweden)

    Evans BAIDOO

    2016-12-01

    Full Text Available Swarm intelligence is a branch of study that models populations of networked swarms or agents with the ability to self-organise. In spite of the huge amount of work that has been done in this area, both theoretically and empirically, and the considerable success that has been attained in several respects, the field is still evolving and at an early stage. An immune system, a cloud of bats or a flock of birds are distinctive examples of a swarm system. In this study, two types of population-based, swarm-intelligence meta-heuristic algorithms - Multi Swarm Optimization (MSO) and the Bat algorithm (BA) - are set up to find optimal solutions of continuous non-linear optimisation models. In order to analyse the quality of the solutions and compare the performance of both algorithms, a series of computational experiments on six commonly used test functions for assessing the accuracy and performance of algorithms in the swarm intelligence field is carried out. The computational experiments show that the MSO algorithm is markedly superior to BA.
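
    For context, the sketch below runs a single-swarm particle swarm optimiser on the Rastrigin benchmark, one of the standard test functions of the kind such comparisons use. It illustrates the benchmarking setup under my own assumptions; it is not the authors' MSO or BA implementation.

```python
# Global-best PSO on the 6-dimensional Rastrigin function (minimum 0 at the origin).
import numpy as np

rng = np.random.default_rng(0)

def rastrigin(x):
    return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=-1)

dim, n_particles, iters = 6, 30, 500
w, c1, c2 = 0.72, 1.49, 1.49                   # common inertia/acceleration settings

pos = rng.uniform(-5.12, 5.12, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), rastrigin(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -5.12, 5.12)
    val = rastrigin(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best value found:", pbest_val.min())
```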

  16. Power supply of Eurotunnel. Optimisation based on traffic and simulation studies

    Energy Technology Data Exchange (ETDEWEB)

    Marie, Stephane [SNCF, Direction de l'Ingenierie, Saint-Denis (France). Dept. des Installations Fixes de Traction Electrique; Dupont, Jean-Pierre; Findinier, Bertrand; Maquaire, Christian [Eurotunnel, Coquelles (France)

    2010-12-15

    In order to reduce electrical power costs and also to cope with the significant traffic increase, a new study was carried out on feeding the tunnel section from the French power station, thus improving and reinforcing the existing network. Based on a design study established by the SNCF engineering department, EUROTUNNEL chose a new electrical scheme to cope with the traffic increase and optimise investments. (orig.)

  17. High School Learners' Mental Construction during Solving Optimisation Problems in Calculus: A South African Case Study

    Science.gov (United States)

    Brijlall, Deonarain; Ndlovu, Zanele

    2013-01-01

    This qualitative case study in a rural school in Umgungundlovu District in KwaZulu-Natal, South Africa, explored Grade 12 learners' mental constructions of mathematical knowledge during engagement with optimisation problems. Ten Grade 12 learners who do pure Mathematics participated, and data were collected through structured activity sheets and…

  18. CFD optimisation of a stadium roof geometry: a qualitative study to improve the wind microenvironment

    Directory of Open Access Journals (Sweden)

    Sofotasiou Polytimi

    2017-01-01

    Full Text Available The complexity of the built environment requires the adoption of coupled techniques to predict the flow phenomena and provide optimum design solutions. In this study, coupled computational fluid dynamics (CFD) and response surface methodology (RSM) optimisation tools are employed to investigate the parameters that determine wind comfort in a two-dimensional stadium model by optimising the roof geometry. The roof height, width and length are evaluated against the flow homogeneity at the spectator terraces and the playing field area, the roof flow rate and the average interior pressure. Based on non-parametric regression analysis, both symmetric and asymmetric configurations are considered for optimisation. The optimum design solutions revealed that it is possible to provide an improved wind environment in both the playing field area and the spectator terraces, giving further insight into the interrelations of the parameters involved. Considering the limitations of conducting a two-dimensional study, the obtained results may beneficially be used as a basis for the optimisation of a complex three-dimensional stadium structure and thus become an important design guide for stadium structures.
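
    A minimal sketch of the response-surface step described above: fit a quadratic surface to a handful of sampled design points and locate its optimum on a grid. The sample data are synthetic stand-ins, not CFD results from the study, and the variable ranges are assumed.

```python
# Quadratic response surface y ~ b0 + b1*h + b2*w + b3*h^2 + b4*w^2 + b5*h*w
# fitted by least squares to mock samples of an objective to be minimised.
import numpy as np

rng = np.random.default_rng(1)
h = rng.uniform(5, 15, 20)           # roof height samples (m, assumed range)
w = rng.uniform(20, 60, 20)          # roof width samples (m, assumed range)
y = (h - 10)**2 + 0.05 * (w - 40)**2 + rng.normal(0, 0.5, 20)   # mock objective

X = np.column_stack([np.ones_like(h), h, w, h**2, w**2, h * w])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

hh, ww = np.meshgrid(np.linspace(5, 15, 101), np.linspace(20, 60, 101))
G = np.column_stack([np.ones(hh.size), hh.ravel(), ww.ravel(),
                     hh.ravel()**2, ww.ravel()**2, hh.ravel() * ww.ravel()])
best = np.argmin(G @ beta)
print("surrogate optimum near height=%.1f m, width=%.1f m"
      % (hh.ravel()[best], ww.ravel()[best]))
```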

  19. BIG DATA IN SUPPLY CHAIN MANAGEMENT: AN EXPLORATORY STUDY

    Directory of Open Access Journals (Sweden)

    Gheorghe MILITARU

    2015-12-01

    Full Text Available The objective of this paper is to set a framework for examining the conditions under which big data can create long-term profitability through developing dynamic operations and digital supply networks in the supply chain. We investigate the extent to which big data analytics has the power to change the competitive landscape of industries and could offer operational, strategic and competitive advantages. This paper is based upon a qualitative study of the convergence of predictive analytics and big data in the field of supply chain management. Our findings indicate a need for manufacturers to introduce analytics tools, real-time data and more flexible production techniques to improve their productivity in line with the new business model. By gathering and analysing vast volumes of data, analytics tools help companies allocate resources and capital spend more effectively, based on risk assessment. Finally, implications and directions for future research are discussed.

  20. A pilot investigation to optimise methods for a future satiety preload study

    OpenAIRE

    Hobden, Mark R.; Guérin-Deremaux, Laetitia; Commane, Daniel M.; Rowland, Ian; Gibson, Glenn R.; Kennedy, Orla B.

    2017-01-01

    Background: Preload studies are used to investigate the satiating effects of foods and food ingredients. However, the design of preload studies is complex, with many methodological considerations influencing appetite responses. The aim of this pilot investigation was to determine acceptability, and optimise methods, for a future satiety preload study. Specifically, we investigated the effects of altering (i) energy intake at a standardised breakfast (gender-specific or non-gender specific), an...

  1. Robust optimisation of forest transportation networks: a case study ...

    African Journals Online (AJOL)

    Forest transportation costs are the major cost component for many forest product supply chains. In order to minimise these costs, many organisations have turned ... The simulation results are then evaluated for robustness by means of seven robustness performance measures. For our case study, the results show that (1) the ...

  2. Peran Dimensi Kepribadian Big Five terhadap Psychological Adjustment Pada Mahasiswa Indonesia yang Studi Keluar Negeri [The Role of the Big Five Personality Dimensions in the Psychological Adjustment of Indonesian Students Studying Abroad]

    OpenAIRE

    Adelia, Cindy Inge

    2012-01-01

    This study aims to examine the effect of the Big Five personality dimensions on psychological adjustment among Indonesian sojourners. The instruments used to collect the data were a psychological adjustment scale and the Big Five Inventory. The psychological adjustment scale consisted of 33 items; the Big Five Inventory used had been adapted by a professional translator. A convenience sampling method was used to gather responses from 117 participants. The data obtained are later analyzed us...

  3. Radiation dose to children in diagnostic radiology. Measurements and methods for clinical optimisation studies

    International Nuclear Information System (INIS)

    Almen, A.J.

    1995-09-01

    A method for estimating the mean absorbed dose to different organs and tissues was developed for paediatric patients undergoing X-ray investigations. The absorbed dose distribution in water was measured for the specific X-ray beam used. Clinical images were studied to determine X-ray beam positions and field sizes. The size and position of organs in the patient were estimated using ORNL phantoms and complementary clinical information. Conversion factors between the mean absorbed dose to various organs and the entrance surface dose for five different body sizes were calculated. Direct measurements on patients estimating entrance surface dose and energy imparted for common X-ray investigations were performed. The examination technique for a number of paediatric X-ray investigations used in 19 Swedish hospitals was studied. For a simulated pelvis investigation of a 1-year-old child the entrance surface dose was measured and image quality was estimated using a contrast-detail phantom. Mean absorbed doses to organs and tissues in urography, lung, pelvis, thoracic spine, lumbar spine and scoliosis investigations were calculated. Calculations of effective dose were supplemented with risk calculations for special organs, e.g. the female breast. The work shows that the examination technique in paediatric radiology is not yet optimised, and that the non-optimised procedures contribute to a considerable variation in radiation dose. In order to optimise paediatric radiology there is a need for more standardised methods in patient dosimetry. It is especially important to relate measured quantities to the size of the patient, using e.g. the patient weight and length. 91 refs, 17 figs, 8 tabs
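
    The conversion factors described here relate organ dose to the measured entrance surface dose; in the usual form (the notation below is mine, not the report's):

```latex
\bar{D}_{T} = c_{T}\,D_{\mathrm{ESD}}
```

    where $\bar{D}_{T}$ is the mean absorbed dose to organ or tissue $T$, $D_{\mathrm{ESD}}$ the entrance surface dose, and $c_{T}$ the conversion factor tabulated for the relevant body size, projection and beam quality.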

  4. Radiation dose to children in diagnostic radiology. Measurements and methods for clinical optimisation studies

    Energy Technology Data Exchange (ETDEWEB)

    Almen, A J

    1995-09-01

    A method for estimating the mean absorbed dose to different organs and tissues was developed for paediatric patients undergoing X-ray investigations. The absorbed dose distribution in water was measured for the specific X-ray beam used. Clinical images were studied to determine X-ray beam positions and field sizes. The size and position of organs in the patient were estimated using ORNL phantoms and complementary clinical information. Conversion factors between the mean absorbed dose to various organs and the entrance surface dose for five different body sizes were calculated. Direct measurements on patients estimating entrance surface dose and energy imparted for common X-ray investigations were performed. The examination technique for a number of paediatric X-ray investigations used in 19 Swedish hospitals was studied. For a simulated pelvis investigation of a 1-year-old child the entrance surface dose was measured and image quality was estimated using a contrast-detail phantom. Mean absorbed doses to organs and tissues in urography, lung, pelvis, thoracic spine, lumbar spine and scoliosis investigations were calculated. Calculations of effective dose were supplemented with risk calculations for special organs, e.g. the female breast. The work shows that the examination technique in paediatric radiology is not yet optimised, and that the non-optimised procedures contribute to a considerable variation in radiation dose. In order to optimise paediatric radiology there is a need for more standardised methods in patient dosimetry. It is especially important to relate measured quantities to the size of the patient, using e.g. the patient weight and length. 91 refs, 17 figs, 8 tabs.

  5. The Study of “big data” to support internal business strategists

    Science.gov (United States)

    Ge, Mei

    2018-01-01

    How is big data different from previous data analysis systems? The primary purpose behind the traditional small-data analytics that all managers are more or less familiar with is to support internal business strategies. But big data also offers a promising new dimension: to discover new opportunities to offer customers high-value products and services. This study focuses on introducing some of the strategies that big data supports. Business decisions using big data can also involve several areas for analytics, including customer satisfaction, customer journeys, supply chains, risk management, competitive intelligence, pricing, discovery and experimentation, and facilitating big data discovery.

  6. Epidemiological study of venous thromboembolism in a big Danish cohort

    DEFF Research Database (Denmark)

    Severinsen, Marianne Tang; Kristensen, Søren Risom; Overvad, Kim

    Introduction: Epidemiological data on venous thromboembolism (VT), i.e. pulmonary emboli (PE) and deep venous thrombosis (DVT), are sparse. We have examined VT diagnoses registered in a big Danish cohort study. Methods: All first-time VT diagnoses in the Danish National Patient Register were ... were probable cases (1.7%), whereas for 449 (41.6%) the diagnosis could be excluded. The incidence rate was 1 per 1000 person-years. Out of the 632 cases, 60% were DVT and 40% PE. 315 VT were considered idiopathic (49.8%), 311 were secondary (49.2%) and 15 were unclassifiable. 122 patients had cancer, 87...

  7. Revisiting EOR Projects in Indonesia through Integrated Study: EOR Screening, Predictive Model, and Optimisation

    KAUST Repository

    Hartono, A. D.; Hakiki, Farizal; Syihab, Z.; Ambia, F.; Yasutra, A.; Sutopo, S.; Efendi, M.; Sitompul, V.; Primasari, I.; Apriandi, R.

    2017-01-01

    Preliminary EOR analysis is pivotal to perform at an early stage of assessment in order to elucidate EOR feasibility. This study proposes an in-depth analysis toolkit for preliminary EOR evaluation. The toolkit incorporates EOR screening, predictive, economic, risk analysis and optimisation modules. The screening module introduces algorithms which take statistical and engineering notions into consideration. The United States Department of Energy (U.S. DOE) predictive models were implemented in the predictive module. The economic module is available to assess project attractiveness, while Monte Carlo Simulation is applied to quantify the risk and uncertainty of the evaluated project. Optimisation scenarios of EOR practice can be evaluated using the optimisation module, in which the stochastic methods of Genetic Algorithms (GA), Particle Swarm Optimization (PSO) and Evolutionary Strategy (ES) were applied. The modules were combined into an integrated package for preliminary EOR assessment. Finally, we utilised the toolkit to evaluate several Indonesian oil fields for EOR evaluation (past projects) and feasibility (future projects). The attempt was able to update previous considerations regarding EOR attractiveness and open new opportunities for EOR implementation in Indonesia.
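
    As a sketch of what the Monte Carlo risk module does, the snippet below propagates uncertain inputs through a simple NPV model and reports the probability of a positive outcome. The distributions and the single-line NPV formula are illustrative assumptions of mine, not the toolkit's actual economics.

```python
# Monte Carlo propagation of uncertain price, volume and cost assumptions
# through a toy NPV model for an EOR project.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

oil_price = rng.normal(60, 10, n)                        # USD/bbl
incremental_oil = rng.lognormal(np.log(2e6), 0.3, n)     # bbl over project life
capex = rng.uniform(40e6, 60e6, n)                       # USD
opex_per_bbl = rng.normal(12, 2, n)                      # USD/bbl
discount, years = 0.10, 10

# Spread production evenly over the project life and discount the cash flows.
annual_cash = (oil_price - opex_per_bbl) * incremental_oil / years
discount_sum = sum(1.0 / (1 + discount) ** t for t in range(1, years + 1))
npv = annual_cash * discount_sum - capex

print("P(NPV > 0)      = %.2f" % (npv > 0).mean())
print("P10/P50/P90 NPV =", np.percentile(npv, [10, 50, 90]).round(-5))
```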

  8. Revisiting EOR Projects in Indonesia through Integrated Study: EOR Screening, Predictive Model, and Optimisation

    KAUST Repository

    Hartono, A. D.

    2017-10-17

    Preliminary EOR analysis is pivotal to perform at an early stage of assessment in order to elucidate EOR feasibility. This study proposes an in-depth analysis toolkit for preliminary EOR evaluation. The toolkit incorporates EOR screening, predictive, economic, risk analysis and optimisation modules. The screening module introduces algorithms which take statistical and engineering notions into consideration. The United States Department of Energy (U.S. DOE) predictive models were implemented in the predictive module. The economic module is available to assess project attractiveness, while Monte Carlo Simulation is applied to quantify the risk and uncertainty of the evaluated project. Optimisation scenarios of EOR practice can be evaluated using the optimisation module, in which the stochastic methods of Genetic Algorithms (GA), Particle Swarm Optimization (PSO) and Evolutionary Strategy (ES) were applied. The modules were combined into an integrated package for preliminary EOR assessment. Finally, we utilised the toolkit to evaluate several Indonesian oil fields for EOR evaluation (past projects) and feasibility (future projects). The attempt was able to update previous considerations regarding EOR attractiveness and open new opportunities for EOR implementation in Indonesia.

  9. Optimised Renormalisation Group Flows

    CERN Document Server

    Litim, Daniel F

    2001-01-01

    Exact renormalisation group (ERG) flows interpolate between a microscopic or classical theory and the corresponding macroscopic or quantum effective theory. For most problems of physical interest, the efficiency of the ERG is constrained due to unavoidable approximations. Approximate solutions of ERG flows depend spuriously on the regularisation scheme which is determined by a regulator function. This is similar to the spurious dependence on the ultraviolet regularisation known from perturbative QCD. Providing good control over approximated ERG flows is at the root of reliable physical predictions. We explain why the convergence of approximate solutions towards the physical theory is optimised by appropriate choices of the regulator. We study specific optimised regulators for bosonic and fermionic fields and compare the optimised ERG flows with generic ones. This is done up to second order in the derivative expansion at both vanishing and non-vanishing temperature. An optimised flow for a "proper-time ren...

  10. Big Data in HEP: A comprehensive use case study

    Science.gov (United States)

    Gutsche, Oliver; Cremonesi, Matteo; Elmer, Peter; Jayatilaka, Bo; Kowalkowski, Jim; Pivarski, Jim; Sehrish, Saba; Mantilla Suárez, Cristina; Svyatkovskiy, Alexey; Tran, Nhan

    2017-10-01

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems collectively called Big Data technologies have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at analysis of very large datasets and could potentially reduce the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication physics plots. We will discuss advantages and disadvantages of each approach and give an outlook on further studies needed.
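
    To make the comparison concrete, the snippet below shows the generic Spark filter-and-aggregate pattern that replaces an NTuple event loop: read columnar event data, apply an analysis selection and histogram a variable on the cluster. The file path and column names (met_pt, n_jets) are hypothetical, not the CMS analysis's actual schema.

```python
# Generic Spark DataFrame pattern: read, select events, aggregate.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("darkmatter-sketch").getOrCreate()

events = spark.read.parquet("hdfs:///data/run2_events.parquet")   # hypothetical path

selected = events.filter((F.col("met_pt") > 200) & (F.col("n_jets") >= 1))

# Coarse missing-ET histogram computed in the cluster rather than client-side.
hist = (selected
        .withColumn("met_bin", (F.col("met_pt") / 50).cast("int") * 50)
        .groupBy("met_bin").count()
        .orderBy("met_bin"))

hist.show()
print("events passing selection:", selected.count())
spark.stop()
```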

  11. Simulation optimisation

    International Nuclear Information System (INIS)

    Anon

    2010-01-01

    Over the past decade there has been a significant advance in flotation circuit optimisation through performance benchmarking using metallurgical modelling and steady-state computer simulation. This benchmarking includes traditional measures, such as grade and recovery, as well as new flotation measures, such as ore floatability, bubble surface area flux and froth recovery. To further this optimisation, Outotec has released its HSC Chemistry software with simulation modules. The flotation model developed by the AMIRA P9 Project, of which Outotec is a sponsor, is regarded by industry as the most suitable flotation model to use for circuit optimisation. This model incorporates ore floatability with flotation cell pulp and froth parameters, residence time, entrainment and water recovery. Outotec's HSC Sim enables mineral processes to be simulated at different levels, from comminution circuits with sizes and no composition, through flotation processes with minerals by size by floatability components, to full processes with true particles with MLA data.

  12. Big Data, the perfect instrument to study today's consumer behavior

    Directory of Open Access Journals (Sweden)

    Cristina STOICESCU

    2016-01-01

    Full Text Available Consumer behavior study is a new, interdisciplinary and emerging science, developed in the 1960s. Its main sources of information come from economics, psychology, sociology, anthropology and artificial intelligence. If a century ago most people lived in small towns, with limited possibilities to leave their community and few ways to satisfy their needs, now, due to the accelerated evolution of technology and the radical change in lifestyle, consumers have begun to have increasingly diverse needs. At the same time, the instruments used to study their behavior have evolved, and today databases are included in consumer behavior research. Over time many models were developed, first to analyze and later to predict consumer behavior. As a result, the concept of Big Data developed, and by applying it now, companies are trying to understand and predict the behavior of their consumers.

  13. Optimisation of parameters for co-precipitation of uranium and plutonium - results of simulation studies

    International Nuclear Information System (INIS)

    Pandey, N.K.; Velvandan, P.V.; Murugesan, S.; Ahmed, M.K.; Koganti, S.B.

    1999-01-01

    Preparation of plutonium oxide from plutonium nitrate solution generally proceeds via the oxalate precipitation route. In a nuclear fuel reprocessing scheme this step follows the partitioning step (separation of uranium and plutonium). Results of the present studies confirm that it is possible to avoid the partitioning step and recover plutonium and uranium as a co-precipitated product. This also helps in minimising the risk of proliferation of fissile material. In this procedure, the solubility of uranium oxalate in nitric acid is effectively used. Co-precipitation parameters are optimised with simulated solutions of uranium nitrate and thorium nitrate (in place of plutonium). On the basis of the results obtained, a reconversion flow-sheet is designed and reported here. (author)

  14. State of the art concerning optimum location of capacitors and studying the exhaustive search approach for optimising a given solution

    Directory of Open Access Journals (Sweden)

    Sergio Raúl Rivera Rodríguez

    2004-09-01

    Full Text Available The present article reviews the state of the art of optimum capacitor location in distribution systems, providing guidelines for planners engaged in optimising voltage profiles and controlling reactive power in distribution networks. Optimising a given solution by exhaustive search is studied here; the dimensions of a given problem are determined by evaluating the different possibilities for resolving it, and the solution algorithm's computational times and requirements are visualised. An example system (9-node, IEEE) is used for illustrating the exhaustive search approach, where it was found that methods used in the literature regarding this topic do not always lead to the optimum solution.

  15. Big Earth Data Initiative: Metadata Improvement: Case Studies

    Science.gov (United States)

    Kozimor, John; Habermann, Ted; Farley, John

    2016-01-01

    Big Earth Data Initiative (BEDI): The Big Earth Data Initiative (BEDI) invests in standardizing and optimizing the collection, management and delivery of the U.S. Government's civil Earth observation data to improve discovery, access and use, and understanding of Earth observations by the broader user community. Complete and consistent standard metadata helps address all three goals.

  16. Study on the evolutionary optimisation of the topology of network control systems

    Science.gov (United States)

    Zhou, Zude; Chen, Benyuan; Wang, Hong; Fan, Zhun

    2010-08-01

    Computer networks have been very popular in enterprise applications. However, optimisation of network designs that allows networks to be used more efficiently in industrial environments and enterprise applications remains an interesting research topic. This article mainly discusses the topology optimisation theory and methods of network control systems based on switched Ethernet in an industrial context. Factors that affect the real-time performance of the industrial control network are presented in detail, and optimisation criteria with their internal relations are analysed. After the definition of performance parameters, normalised indices for the evaluation of the topology optimisation are proposed. The topology optimisation problem is formulated as a multi-objective optimisation problem and an evolutionary algorithm is applied to solve it. Special communication characteristics of the industrial control network are considered in the optimisation process. With respect to the evolutionary algorithm design, an improved arena algorithm is proposed for the construction of the non-dominated set of the population. In addition, for the evaluation of individuals, the integrated use of the dominative relation method and the objective function combination method is described, which reduces the computational cost of the algorithm. Simulation tests show that the performance of the proposed algorithm is preferable and superior compared to other algorithms. The final solution greatly improves the following indices: traffic localisation, traffic balance and utilisation rate balance of switches. In addition, a new performance index with its estimation process is proposed.
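
    The non-dominated set mentioned above is the Pareto front of the current population. A brute-force sketch of that construction for a minimisation problem is shown below; it illustrates the operation the improved arena algorithm is designed to speed up, not the arena algorithm itself.

```python
# Brute-force construction of the non-dominated (Pareto) set, minimisation sense.
import numpy as np

def dominates(a, b):
    """True if objective vector a dominates b (all <= and at least one <)."""
    return np.all(a <= b) and np.any(a < b)

def non_dominated(objs):
    """Return indices of the non-dominated individuals in a population."""
    keep = []
    for i, a in enumerate(objs):
        if not any(dominates(b, a) for j, b in enumerate(objs) if j != i):
            keep.append(i)
    return keep

# Example: three objectives per individual (e.g. delay, load imbalance, cost).
population = np.array([[1.0, 5.0, 2.0],
                       [2.0, 4.0, 2.0],
                       [1.5, 4.5, 1.0],
                       [3.0, 6.0, 3.0]])   # last point is dominated
print(non_dominated(population))           # -> [0, 1, 2]
```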

  17. Optimisation of quantitative lung SPECT applied to mild COPD: a software phantom simulation study.

    Science.gov (United States)

    Norberg, Pernilla; Olsson, Anna; Alm Carlsson, Gudrun; Sandborg, Michael; Gustafsson, Agnetha

    2015-01-01

    The amount of inhomogeneity in a (99m)Tc Technegas single-photon emission computed tomography (SPECT) lung image, caused by reduced ventilation in lung regions affected by chronic obstructive pulmonary disease (COPD), is correlated with disease advancement. A quantitative analysis method, the CVT method, measuring these inhomogeneities was proposed in earlier work. To detect mild COPD, which is a difficult task, optimised parameter values are needed. In this work, the CVT method was optimised with respect to the parameter values of acquisition, reconstruction and analysis. The ordered subset expectation maximisation (OSEM) algorithm was used for reconstructing the lung SPECT images. As a first step towards clinical application of the CVT method in detecting mild COPD, this study was based on simulated SPECT images of an advanced anthropomorphic lung software phantom including respiratory and cardiac motion, where the mild COPD lung had an overall ventilation reduction of 5%. The best separation between healthy and mild COPD lung images, as determined using the CVT measure of ventilation inhomogeneity and 125 MBq (99m)Tc, was obtained using a low-energy high-resolution collimator (LEHR) and a power-6 Butterworth post-filter with a cutoff frequency of 0.6 to 0.7 cm(-1). Sixty-four reconstruction updates and a small kernel size should be used when the whole lung is analysed, and for the reduced lung a greater number of updates and a larger kernel size are needed. A LEHR collimator and 125 MBq (99m)Tc, together with an optimal combination of cutoff frequency, number of updates and kernel size, gave the best result. Suboptimal selection of cutoff frequency, number of updates or kernel size will reduce the imaging system's ability to detect mild COPD in the lung phantom.
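
    The Butterworth post-filter referred to above has the standard frequency response H(f) = 1 / (1 + (f/f_c)^(2n)). The sketch below applies such a filter to a 3-D volume via the FFT; the voxel size is an assumed value and the volume is random noise, not a reconstructed phantom study.

```python
# 3-D Butterworth low-pass post-filter, order n and cutoff f_c in cm^-1.
import numpy as np

def butterworth_3d(volume, voxel_cm, cutoff_cm_inv, order):
    fx = np.fft.fftfreq(volume.shape[0], d=voxel_cm)
    fy = np.fft.fftfreq(volume.shape[1], d=voxel_cm)
    fz = np.fft.fftfreq(volume.shape[2], d=voxel_cm)
    FX, FY, FZ = np.meshgrid(fx, fy, fz, indexing="ij")
    freq = np.sqrt(FX**2 + FY**2 + FZ**2)        # radial spatial frequency (cm^-1)
    H = 1.0 / (1.0 + (freq / cutoff_cm_inv) ** (2 * order))
    return np.real(np.fft.ifftn(np.fft.fftn(volume) * H))

volume = np.random.default_rng(0).poisson(20.0, (64, 64, 64)).astype(float)
smoothed = butterworth_3d(volume, voxel_cm=0.44, cutoff_cm_inv=0.6, order=6)
print(volume.std(), smoothed.std())              # noise level drops after filtering
```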

  18. Centralising and optimising decentralised stroke care systems : A simulation study on short-term costs and effects

    NARCIS (Netherlands)

    Lahr, Maarten M. H.; van der Zee, Durk-Jouke; Luijckx, Gert-Jan; Vroomen, Patrick C. A. J.; Buskens, Erik

    2017-01-01

    Background: Centralisation of thrombolysis may offer substantial benefits. The aim of this study was to assess short term costs and effects of centralisation of thrombolysis and optimised care in a decentralised system. Methods: Using simulation modelling, three scenarios to improve decentralised

  19. Metal Removal Process Optimisation using Taguchi Method - Simplex Algorithm (TM-SA) with Case Study Applications

    OpenAIRE

    Ajibade, Oluwaseyi A.; Agunsoye, Johnson O.; Oke, Sunday A.

    2018-01-01

    In the metal removal process industry, the current practice to optimise cutting parameters adopts a conventional method. It is based on trial and error, in which the machine operator uses experience, coupled with handbook guidelines, to determine optimal parametric values of choice. This method is not accurate, is time-consuming and costly. Therefore, there is a need for a method that is scientific, cost-effective and precise. Keeping this in mind, a different direction for process optimisation is ...

  20. Energetic study of combustion instabilities and genetic optimisation of chemical kinetics; Etude energetique des instabilites thermo-acoustiques et optimisation genetique des cinetiques reduites

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Ch.E.

    2005-12-15

    Gas turbine burners are now widely operated in lean premixed combustion mode. This technology has been introduced in order to limit pollutant emissions (especially NOx), and thus comply with environmental norms. Nevertheless, the use of lean premixed combustion decreases the stability margin of the flames, which are then more prone to be disturbed by flow disturbances. Combustion instabilities are therefore a major concern in the design of modern gas turbines. Some active control systems have been used to ensure the stability of gas turbines retrofitted to lean premixed combustion. The current generation of gas turbines aims to dispense with these control devices by achieving stability through proper design. To do so, precise and adapted numerical tools are needed, even if it is impossible at the moment to guarantee the absolute stability of a combustion chamber at the design stage. Simulation tools for unsteady combustion are now able to compute the whole combustion chamber. Its intrinsic precision allows Large Eddy Simulation (LES) to take into account numerous phenomena involved in combustion instabilities. Chemical modelling is an important element for the precision of reactive LES. This study includes the description of an optimisation tool for reduced chemical kinetics. The capacity of LES to capture combustion instabilities in a gas turbine chamber is also demonstrated. The acoustic energy analysis points out that the boundary impedances of combustion systems are of prime importance for their stability. (author)

  1. High school learners' mental construction during solving optimisation problems in Calculus: a South African case study

    Directory of Open Access Journals (Sweden)

    Deonarain Brijlall

    2013-01-01

    Full Text Available This qualitative case study in a rural school in Umgungundlovu District in KwaZulu-Natal, South Africa, explored Grade 12 learners' mental constructions of mathematical knowledge during engagement with optimisation problems. Ten Grade 12 learners who do pure Mathematics participated, and data were collected through structured activity sheets and semi-structured interviews. Structured activity sheets with three tasks were given to learners; these tasks were done in groups, and the group leaders were interviewed. It was found that learners tended to do well with routine-type questions, implying that they were functioning at an action level. From the interviews it appeared that learners might have the correct answer, but lacked conceptual understanding. Exploring learners' mental constructions via their responses to activity sheets and interviews enabled common errors and misconceptions to be identified. Themes that emerged were that learners: (1) lacked the understanding of notation dy/dx, (2) had not constructed the derivative and minima/maxima schema, (3) had some difficulty in modelling problems, (4) preferred rules and formulas, and (5) applied algebraic notions incorrectly. Inferences are drawn for curriculum developers and teachers. This study also formulated itemised genetic decompositions for particular tasks, which contribute to APOS theory.

  2. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  3. Influenza serological studies to inform public health action: best practices to optimise timing, quality and reporting.

    Science.gov (United States)

    Laurie, Karen L; Huston, Patricia; Riley, Steven; Katz, Jacqueline M; Willison, Donald J; Tam, John S; Mounts, Anthony W; Hoschler, Katja; Miller, Elizabeth; Vandemaele, Kaat; Broberg, Eeva; Van Kerkhove, Maria D; Nicoll, Angus

    2013-03-01

    Serological studies can detect infection with a novel influenza virus in the absence of symptoms or positive virology, providing useful information on infection that goes beyond the estimates from epidemiological, clinical and virological data. During the 2009 A(H1N1) pandemic, an impressive number of detailed serological studies were performed, yet the majority of serological data were available only after the first wave of infection. This limited the ability to estimate the transmissibility and severity of this novel infection, and the variability in methodology and reporting limited the ability to compare and combine the serological data.   To identify best practices for conduct and standardisation of serological studies on outbreak and pandemic influenza to inform public policy. An international meeting was held in February 2011 in Ottawa, Canada, to foster the consensus for greater standardisation of influenza serological studies. Best practices for serological investigations of influenza epidemiology include the following: classification of studies as pre-pandemic, outbreak, pandemic or inter-pandemic with a clearly identified objective; use of international serum standards for laboratory assays; cohort and cross-sectional study designs with common standards for data collection; use of serum banks to improve sampling capacity; and potential for linkage of serological, clinical and epidemiological data. Advance planning for outbreak studies would enable a rapid and coordinated response; inclusion of serological studies in pandemic plans should be considered. Optimising the quality, comparability and combinability of influenza serological studies will provide important data upon emergence of a novel or variant influenza virus to inform public health action. © 2012 Blackwell Publishing Ltd.

  4. Optimising the neutron environment of Radiation Portal Monitors: A computational study

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Mark R., E-mail: mark.gilbert@ccfe.ac.uk [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Ghani, Zamir [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); McMillan, John E. [Department of Physics and Astronomy, University of Sheffield, Hicks building, Hounsfield Road, Sheffield S3 7RH (United Kingdom); Packer, Lee W. [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-09-21

    Efficient and reliable detection of radiological or nuclear threats is a crucial part of national and international efforts to prevent terrorist activities. Radiation Portal Monitors (RPMs), which are deployed worldwide, are intended to interdict smuggled fissile material by detecting emissions of neutrons and gamma rays. However, considering the range and variety of threat sources, vehicular and shielding scenarios, and that only a small signature is present, it is important that the design of the RPMs allows these signatures to be accurately differentiated from the environmental background. Using Monte-Carlo neutron-transport simulations of a model 3He detector system we have conducted a parameter study to identify the optimum combination of detector shielding, moderation, and collimation that maximises the sensitivity of neutron-sensitive RPMs. These structures, which could be simply and cost-effectively added to existing RPMs, can improve the detector response by more than a factor of two relative to an unmodified, bare design. Furthermore, optimisation of the air gap surrounding the helium tubes also improves detector efficiency.

  5. A Study of the Application of Big Data in a Rural Comprehensive Information Service

    Directory of Open Access Journals (Sweden)

    Leifeng Guo

    2015-05-01

    Full Text Available Big data has attracted extensive interest due to its potential tremendous social and scientific value. Researchers are also trying to extract potential value from agriculture big data. This paper presents a study of information services based on big data from the perspective of a rural comprehensive information service. First, we introduce the background of the rural comprehensive information service, and then we present in detail the National Rural Comprehensive Information Service Platform (NRCISP, which is supported by the national science and technology support program. Next, we discuss big data in the NRCISP according to data characteristics, data sources, and data processing. Finally, we discuss a service model and services based on big data in the NRCISP.

  6. Beam position optimisation for IMRT

    International Nuclear Information System (INIS)

    Holloway, L.; Hoban, P.

    2001-01-01

    Full text: The introduction of IMRT has not generally resulted in the use of optimised beam positions, because finding the global solution of the problem requires a time-consuming stochastic optimisation method. Although a deterministic method may not achieve the global minimum, it should achieve a superior dose distribution compared to no optimisation. This study aimed to develop and test such a method. The beam optimisation method developed relies on an iterative process to reach the desired number of beams from a large initial number of beams. The number of beams is reduced in a 'weeding-out' process based on the total fluence which each beam delivers. The process is gradual, with only three beams removed each time (following a small number of iterations), ensuring that the reduction in beams does not dramatically affect the fluence maps of those remaining. A comparison was made between the dose distributions achieved when the beam positions were optimised in this fashion and when the beam positions were evenly distributed. The method has been shown to work quite effectively and efficiently. The Figure shows a comparison in dose distribution with optimised and non-optimised beam positions for 5 beams. It can be clearly seen that there is an improvement in the dose distribution delivered to the tumour and a reduction in the dose to the critical structure with beam position optimisation. A method for beam position optimisation for use in IMRT optimisations has been developed. This method, although not necessarily achieving the global minimum in beam position, still achieves quite a dramatic improvement compared with no beam position optimisation, and is achieved very efficiently. Copyright (2001) Australasian College of Physical Scientists and Engineers in Medicine
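
    A minimal sketch of the 'weeding-out' loop described above: start from many candidate beam angles and repeatedly drop the three beams delivering the least total fluence until the requested number remains. The fluence optimiser is a user-supplied stand-in (optimise_fluence is a hypothetical callback, not part of any named planning system).

```python
# Iterative beam-position selection by removing the lowest-fluence beams.
import numpy as np

def select_beams(candidate_angles, n_final, optimise_fluence, drop_per_round=3):
    angles = list(candidate_angles)
    while len(angles) > n_final:
        # optimise_fluence returns one total-fluence value per remaining beam
        # after a small number of fluence optimisation iterations.
        fluence = np.asarray(optimise_fluence(angles))
        n_drop = min(drop_per_round, len(angles) - n_final)
        weakest = set(np.argsort(fluence)[:n_drop])
        angles = [a for i, a in enumerate(angles) if i not in weakest]
    return angles

# Toy stand-in: pretend beams nearer 0 and 180 degrees carry the most fluence.
mock = lambda angles: [abs(np.cos(np.radians(a))) + 0.01 for a in angles]
print(select_beams(range(0, 360, 10), n_final=5, optimise_fluence=mock))
```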

  7. Telecom Big Data for Urban Transport Analysis - a Case Study of Split-Dalmatia County in Croatia

    Science.gov (United States)

    Baučić, M.; Jajac, N.; Bućan, M.

    2017-09-01

    Today, big data has become widely available and the new technologies are being developed for big data storage architecture and big data analytics. An ongoing challenge is how to incorporate big data into GIS applications supporting the various domains. International Transport Forum explains how the arrival of big data and real-time data, together with new data processing algorithms lead to new insights and operational improvements of transport. Based on the telecom customer data, the Study of Tourist Movement and Traffic in Split-Dalmatia County in Croatia is carried out as a part of the "IPA Adriatic CBC//N.0086/INTERMODAL" project. This paper briefly explains the big data used in the study and the results of the study. Furthermore, this paper investigates the main considerations when using telecom customer big data: data privacy and data quality. The paper concludes with GIS visualisation and proposes the further use of big data used in the study.

  8. TELECOM BIG DATA FOR URBAN TRANSPORT ANALYSIS – A CASE STUDY OF SPLIT-DALMATIA COUNTY IN CROATIA

    Directory of Open Access Journals (Sweden)

    M. Baučić

    2017-09-01

    Full Text Available Today, big data has become widely available and the new technologies are being developed for big data storage architecture and big data analytics. An ongoing challenge is how to incorporate big data into GIS applications supporting the various domains. International Transport Forum explains how the arrival of big data and real-time data, together with new data processing algorithms lead to new insights and operational improvements of transport. Based on the telecom customer data, the Study of Tourist Movement and Traffic in Split-Dalmatia County in Croatia is carried out as a part of the “IPA Adriatic CBC//N.0086/INTERMODAL” project. This paper briefly explains the big data used in the study and the results of the study. Furthermore, this paper investigates the main considerations when using telecom customer big data: data privacy and data quality. The paper concludes with GIS visualisation and proposes the further use of big data used in the study.

  9. An empirical study on website usability elements and how they affect search engine optimisation

    Directory of Open Access Journals (Sweden)

    Eugene B. Visser

    2011-03-01

    Full Text Available The primary objective of this research project was to identify and investigate the website usability attributes which are in contradiction with search engine optimisation elements. The secondary objective was to determine if these usability attributes affect conversion. Although the literature review identifies the contradictions, experts disagree about their existence. An experiment was conducted whereby the conversion and/or traffic ratio results of an existing control website were compared to a usability-designed version of the control website, namely the experimental website. All optimisation elements were ignored, thus implementing only usability. The results clearly show that inclusion of the usability attributes positively affects conversion, indicating that usability is a prerequisite for effective website design. Search engine optimisation is also a prerequisite, for the very reason that if a website does not rank on the first page of the search engine result page for a given keyword, then that website might as well not exist. According to this empirical work, usability is in contradiction with search engine optimisation best practices. Therefore the two need to be weighed up in terms of importance to search engines and visitors.

  10. Structure and weights optimisation of a modified Elman network emotion classifier using hybrid computational intelligence algorithms: a comparative study

    Science.gov (United States)

    Sheikhan, Mansour; Abbasnezhad Arabi, Mahdi; Gharavian, Davood

    2015-10-01

    Artificial neural networks are efficient models in pattern recognition applications, but their performance depends on employing a suitable structure and connection weights. This study used a hybrid method for obtaining the optimal weight set and architecture of a recurrent neural emotion classifier, based on the gravitational search algorithm (GSA) and its binary version (BGSA), respectively. By considering features of the speech signal related to prosody, voice quality, and spectrum, a rich feature set was constructed. To select more efficient features, a fast feature selection method was employed. The performance of the proposed hybrid GSA-BGSA method was compared with similar hybrid methods based on the particle swarm optimisation (PSO) algorithm and its binary version, PSO and the discrete firefly algorithm, and a hybrid of error back-propagation and genetic algorithm that were used for optimisation. Experimental tests on the Berlin emotional database demonstrated the superior performance of the proposed method using a lighter network structure.
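
    The record names the gravitational search algorithm (GSA) without detailing it. Purely as an illustration of the GSA mechanics (fitness-derived masses, a decaying gravitational constant, force-driven velocity updates), a minimal continuous GSA loop on a toy sphere objective is sketched below; the population size, decay schedule, bounds and objective are illustrative assumptions, and the paper's binary variant (BGSA) and its encoding of network weights and structure are not reproduced.

        import math
        import random

        def sphere(x):
            # Toy objective to minimise; it stands in for the classifier error
            # that the study minimises over network weights.
            return sum(v * v for v in x)

        def gsa(obj, dim=5, agents=20, iters=100, g0=100.0, alpha=20.0, seed=1):
            rng = random.Random(seed)
            pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(agents)]
            vel = [[0.0] * dim for _ in range(agents)]
            best, best_fit = None, float("inf")
            for t in range(iters):
                fit = [obj(p) for p in pos]
                if min(fit) < best_fit:
                    best_fit = min(fit)
                    best = list(pos[fit.index(best_fit)])
                worst, bst = max(fit), min(fit)
                # Better (lower) fitness -> larger normalised mass.
                raw = [(worst - f) / (worst - bst + 1e-12) for f in fit]
                mass = [m / (sum(raw) + 1e-12) for m in raw]
                g = g0 * math.exp(-alpha * t / iters)  # gravitational "constant" decays over time
                for i in range(agents):
                    acc = [0.0] * dim
                    for j in range(agents):
                        if i == j:
                            continue
                        dist = math.dist(pos[i], pos[j]) + 1e-12
                        for d in range(dim):
                            acc[d] += rng.random() * g * mass[j] * (pos[j][d] - pos[i][d]) / dist
                    for d in range(dim):
                        vel[i][d] = rng.random() * vel[i][d] + acc[d]
                        pos[i][d] += vel[i][d]
            return best, best_fit

        best_pos, best_fit = gsa(sphere)
        print(f"best fitness found: {best_fit:.4f}")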

  11. Options for radiation dose optimisation in pelvic digital radiography: A phantom study

    International Nuclear Information System (INIS)

    Manning-Stanley, Anthony S.; Ward, Anthony J.; England, Andrew

    2012-01-01

    Purpose: To investigate the effects of phantom orientation and AEC chamber selection on radiation dose and image quality (IQ) for digital radiography (DR) examinations of the pelvis. Methods: A phantom study was conducted using a DR detector, utilising all AEC chamber combinations. The currently recommended orientation (Cr-AEC) was with the outer AEC chambers cranially orientated. The given mAs, source-to-skin distance and kVp data facilitated entrance surface dose and effective dose calculations. Six anatomical areas were blindly graded by two observers (3-point scale) for IQ. Statistical differences in radiation dose were determined using the paired Student’s t-test. IQ data were analysed for inter-observer variability (ICC) and statistical differences (Wilcoxon test). Results: Switching phantom orientation (caudally orientated outer AEC chambers: Ca-AEC) reduced mean radiation dose by 36.8% (p < 0.001). A minor reduction in median IQ (15.5 vs. 15) was seen (p < 0.001). One Ca-AEC orientated image (1.6%) had all anatomical areas graded ‘inadequate’ by at least one observer; all other images were considered ‘adequate’ for all areas. In the Ca-AEC orientation, at least a 44% dose reduction was achievable (p < 0.001) when only the outer AEC chambers were used. In the Cr-AEC orientation, at least an 11% dose reduction was achieved (p < 0.001); here the central chamber was used alone, or in combination. IQ scores fell, but remained ‘adequate’. Conclusion: Switching pelvic orientation relative to AEC chamber position can optimise radiation dose during pelvic radiography. AEC chamber position should be clearly marked on equipment to facilitate this. AEC selection should be an active process.

  12. Recurrent personality dimensions in inclusive lexical studies: indications for a big six structure.

    Science.gov (United States)

    Saucier, Gerard

    2009-10-01

    Previous evidence for both the Big Five and the alternative six-factor model has been drawn from lexical studies with relatively narrow selections of attributes. This study examined factors from previous lexical studies using a wider selection of attributes in 7 languages (Chinese, English, Filipino, Greek, Hebrew, Spanish, and Turkish) and found 6 recurrent factors, each with common conceptual content across most of the studies. The previous narrow-selection-based six-factor model outperformed the Big Five in capturing the content of the 6 recurrent wideband factors. Adjective markers of the 6 recurrent wideband factors showed substantial incremental prediction of important criterion variables over and above the Big Five. The correspondence between the wideband 6 and narrowband 6 factors indicates that they are variants of a "Big Six" model that is more general across variable-selection procedures and may be more general across languages and populations.

  13. Modeling and processing for next-generation big-data technologies with applications and case studies

    CERN Document Server

    Barolli, Leonard; Barolli, Admir; Papajorgji, Petraq

    2015-01-01

    This book covers the latest advances in Big Data technologies and provides the readers with a comprehensive review of the state-of-the-art in Big Data processing, analysis, analytics, and other related topics. It presents new models, algorithms, software solutions and methodologies, covering the full data cycle, from data gathering to their visualization and interaction, and includes a set of case studies and best practices. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data are also identified and presented throughout the book, which is intended for researchers, scholars, advanced students, software developers and practitioners working at the forefront in their field.

  14. Big Data in HEP: A comprehensive use case study

    OpenAIRE

    Gutsche, Oliver; Cremonesi, Matteo; Elmer, Peter; Jayatilaka, Bo; Kowalkowski, Jim; Pivarski, Jim; Sehrish, Saba; Surez, Cristina Mantilla; Svyatkovskiy, Alexey; Tran, Nhan

    2017-01-01

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems collectively called Big Data technologies have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats),...

  15. A pilot investigation to optimise methods for a future satiety preload study.

    Science.gov (United States)

    Hobden, Mark R; Guérin-Deremaux, Laetitia; Commane, Daniel M; Rowland, Ian; Gibson, Glenn R; Kennedy, Orla B

    2017-01-01

    Preload studies are used to investigate the satiating effects of foods and food ingredients. However, the design of preload studies is complex, with many methodological considerations influencing appetite responses. The aim of this pilot investigation was to determine acceptability, and optimise methods, for a future satiety preload study. Specifically, we investigated the effects of altering (i) energy intake at a standardised breakfast (gender-specific or non-gender specific), and (ii) the duration between mid-morning preload and ad libitum lunch meal, on morning appetite scores and energy intake at lunch. Participants attended a single study visit. Female participants consumed a 214-kcal breakfast (n = 10) or 266-kcal breakfast (n = 10), equivalent to 10% of recommended daily energy intakes for females and males, respectively. Male participants (n = 20) consumed a 266-kcal breakfast. All participants received a 250-ml orange juice preload 2 h after breakfast. The impact of different study timings was evaluated in male participants, with 10 males following one protocol (protocol 1) and 10 males following another (protocol 2). The duration between preload and ad libitum lunch meal was 2 h (protocol 1) or 2.5 h (protocol 2), with the ad libitum lunch meal provided at 12.00 or 13.00, respectively. All female participants followed protocol 2. Visual analogue scale (VAS) questionnaires were used to assess appetite responses and food/drink palatability. Correlation between male and female appetite scores was higher with the provision of a gender-specific breakfast, compared to a non-gender-specific breakfast (Pearson correlation of 0.747 and 0.479, respectively). No differences in subjective appetite or ad libitum energy intake were found between protocols 1 and 2. VAS mean ratings of liking, enjoyment, and palatability were all > 66 out of 100 mm for breakfast, preload, and lunch meals. The findings of this pilot study confirm the acceptability

  16. Optimisation of process parameters in friction stir welding based on residual stress analysis: a feasibility study

    DEFF Research Database (Denmark)

    Tutum, Cem Celal; Hattel, Jesper Henri

    2010-01-01

    The present paper considers the optimisation of process parameters in friction stir welding (FSW). More specifically, the choices of rotational speed and traverse welding speed have been investigated using genetic algorithms. The welding process is simulated in a transient, two-dimensional sequentially coupled thermomechanical model in ANSYS. This model is then used in an optimisation case where the two objectives are the minimisation of the peak residual stresses and the maximisation of the welding speed. The results indicate that the objectives for the considered case are conflicting, and this is presented as a Pareto optimal front. Moreover, a higher welding speed for a fixed rotational speed results, in general, in slightly higher stress levels in the tension zone, whereas a higher rotational speed for a fixed welding speed yields somewhat lower peak residual stress, however, a wider tension zone...
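
    As a rough illustration of the bi-objective trade-off described above (minimise peak residual stress, maximise welding speed), the sketch below filters a grid of candidate (rotational speed, welding speed) settings down to its Pareto-optimal front. The surrogate stress function and the parameter ranges are illustrative assumptions chosen only to reproduce the qualitative trends stated in the abstract; they are not the ANSYS thermomechanical model used in the study.

        import itertools

        def surrogate_peak_stress(rpm, weld_speed):
            # Placeholder trend only: per the abstract, a higher welding speed at fixed
            # rotational speed slightly raises peak residual stress, while a higher
            # rotational speed at fixed welding speed lowers it somewhat.
            return 250.0 + 0.08 * weld_speed - 0.05 * rpm  # MPa, purely illustrative

        def dominates(a, b):
            # a dominates b if it is no worse on both objectives and strictly better on one
            # (objectives: minimise peak stress, maximise welding speed).
            return (a["stress"] <= b["stress"] and a["speed"] >= b["speed"]
                    and (a["stress"] < b["stress"] or a["speed"] > b["speed"]))

        # Candidate process parameters (illustrative ranges).
        candidates = [{"rpm": rpm, "speed": v, "stress": surrogate_peak_stress(rpm, v)}
                      for rpm, v in itertools.product(range(400, 1201, 100),   # rev/min
                                                      range(60, 301, 20))]     # mm/min

        # Keep only the non-dominated points: the Pareto-optimal front.
        front = [c for c in candidates
                 if not any(dominates(o, c) for o in candidates if o is not c)]

        for p in sorted(front, key=lambda c: c["speed"]):
            print(f"rpm={p['rpm']:4d}  speed={p['speed']:3d} mm/min  "
                  f"peak stress ~ {p['stress']:.1f} MPa")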

  17. An empirical study on website usability elements and how they affect search engine optimisation

    OpenAIRE

    Eugene B. Visser; Melius Weideman

    2011-01-01

    The primary objective of this research project was to identify and investigate the website usability attributes which are in contradiction with search engine optimisation elements. The secondary objective was to determine if these usability attributes affect conversion. Although the literature review identifies the contradictions, experts disagree about their existence. An experiment was conducted, whereby the conversion and/or traffic ratio results of an existing control website were compared...

  18. A STUDY ON OPTIMISATION OF RESOURCES FOR MULTIPLE PROJECTS BY USING PRIMAVERA

    Directory of Open Access Journals (Sweden)

    B. S. K. REDDY

    2015-02-01

    Full Text Available Resources play a vital role in construction projects. The performance of the construction industry depends chiefly on how well its resources are managed. Optimisation plays a pivotal role in resource management, but the task is highly haphazard and chaotic under the influence of complexity and scale. Management always looks for the optimum use of the resources available to it. Hence, project management has an important place, especially in resource allocation and smooth functioning within the allocated budget. To achieve these goals and to enhance optimisation, certain tools are used to allocate resources optimally. The present work illustrates resource optimisation exercises on two ongoing projects in Dubai, UAE. When the resource demands of projects A and B are levelled individually, the observed cumulative requirement is 17,475. In the other option, the demands of projects A and B are aggregated and then levelled together, and the necessary resource observed is 16,490. Comparing the individually levelled option with the aggregated-then-levelled option clearly indicates a reduction in resource demand of 5.65% in the latter option, which could best be considered for economy.
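
    The reported saving follows from a single line of arithmetic; the short check below simply compares the two cumulative resource requirements quoted above (17,475 when the projects are levelled individually and then combined, 16,490 when they are aggregated and then levelled).

        individual_then_combined = 17475   # projects A and B levelled separately, demands summed
        aggregated_then_levelled = 16490   # projects A and B aggregated first, then levelled

        saving = individual_then_combined - aggregated_then_levelled
        saving_pct = 100.0 * saving / individual_then_combined

        print(f"Absolute reduction: {saving} resource units")
        print(f"Relative reduction: {saving_pct:.2f}%")   # ~5.64%, i.e. roughly the 5.65% reported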

  19. Big-data-based edge biomarkers: study on dynamical drug sensitivity and resistance in individuals.

    Science.gov (United States)

    Zeng, Tao; Zhang, Wanwei; Yu, Xiangtian; Liu, Xiaoping; Li, Meiyi; Chen, Luonan

    2016-07-01

    Big-data-based edge biomarkers are a new concept for characterizing disease features based on biomedical big data in a dynamical and network manner, which also provides alternative strategies to indicate disease status in single samples. This article gives a comprehensive review of big-data-based edge biomarkers for complex diseases in an individual patient, which are defined as biomarkers based on network information and high-dimensional data. Specifically, we first introduce the sources and structures of biomedical big data publicly accessible for edge biomarker and disease studies. We show that biomedical big data are typically 'small-sample size in high-dimension space', i.e. small samples but with high dimensions on features (e.g. omics data) for each individual, in contrast to traditional big data in many other fields characterized as 'large-sample size in low-dimension space', i.e. big samples but with low dimensions on features. Then, we demonstrate the concept, model and algorithm for edge biomarkers and, further, big-data-based edge biomarkers. Dissimilar to conventional biomarkers, edge biomarkers, e.g. module biomarkers in module network rewiring-analysis, are able to predict the disease state by learning differential associations between molecules rather than differential expressions of molecules during disease progression or treatment in individual patients. In particular, in contrast to using the information of the common molecules or edges (i.e. molecule-pairs) across a population in traditional biomarkers, including network and edge biomarkers, big-data-based edge biomarkers are specific to each individual and thus can accurately evaluate the disease state by considering the individual heterogeneity. Therefore, the measurement of big data in a high-dimensional space is required not only in the learning process but also in the diagnosing or predicting process of the tested individual. Finally, we provide a case study on analyzing the temporal expression

  20. A study of lateral fall-off (penumbra) optimisation for pencil beam scanning (PBS) proton therapy

    Science.gov (United States)

    Winterhalter, C.; Lomax, A.; Oxley, D.; Weber, D. C.; Safai, S.

    2018-01-01

    The lateral fall-off is crucial for sparing organs at risk in proton therapy. It is therefore of high importance to minimize the penumbra for pencil beam scanning (PBS). Three optimisation approaches are investigated: edge-collimated uniformly weighted spots (collimation), pencil beam optimisation of uncollimated pencil beams (edge-enhancement) and the optimisation of edge-collimated pencil beams (collimated edge-enhancement). To deliver energies below 70 MeV, these strategies are evaluated in combination with the following pre-absorber methods: field-specific, fixed-thickness pre-absorption (fixed), range-specific, fixed-thickness pre-absorption (automatic) and range-specific, variable-thickness pre-absorption (variable). All techniques are evaluated by Monte Carlo simulated square fields in a water tank. For a typical air gap of 10 cm and without a pre-absorber, collimation reduces the penumbra only for water-equivalent ranges between 4-11 cm, by up to 2.2 mm. The sharpest lateral fall-off is achieved through collimated edge-enhancement, which lowers the penumbra down to 2.8 mm. When using a pre-absorber, the sharpest fall-offs are obtained when combining collimated edge-enhancement with a variable pre-absorber. For edge-enhancement and large air gaps, it is crucial to minimize the amount of material in the beam. For small air gaps, however, the superior phase space of higher-energy beams can be employed when more material is used. In conclusion, collimated edge-enhancement combined with the variable pre-absorber is the recommended setting to minimize the lateral penumbra for PBS. Without a collimator, it would be favourable to use a variable pre-absorber for large air gaps and an automatic pre-absorber for small air gaps.

  1. Study on Effects of Different Replacement Rate on Bending Behavior of Big Recycled Aggregate Self Compacting Concrete

    Science.gov (United States)

    Li, Jing; Guo, Tiantian; Gao, Shuai; Jiang, Lin; Zhao, Zhijun; Wang, Yalin

    2018-03-01

    Big recycled aggregate self compacting concrete is a new type of recycled concrete which has the advantages of low hydration heat and green environmental protection, but its bending behavior can be affected by the replacement rate. Therefore, in this paper, the research status of big recycled aggregate self compacting concrete was systematically introduced, and the effects of different replacement rates of big recycled aggregate on the failure mode, crack distribution and bending strength of the beam were studied through bending behavior tests of 4 big recycled aggregate self compacting concrete beams. The results show that the crack distribution of the beam can be affected by the replacement rate; the failure modes of big recycled aggregate beams are the same as those of ordinary concrete; the plane section assumption is applicable to the big recycled aggregate self compacting concrete beam; and the higher the replacement rate, the lower the bending strength of big recycled aggregate self compacting concrete beams.

  2. Monte carlo study of MOSFET packaging, optimised for improved energy response: single MOSFET filtration.

    Science.gov (United States)

    Othman, M A R; Cutajar, D L; Hardcastle, N; Guatelli, S; Rosenfeld, A B

    2010-09-01

    Monte Carlo simulations of the energy response of a conventionally packaged single metal-oxide-semiconductor field-effect transistor (MOSFET) detector were performed with the goal of improving MOSFET energy dependence for personal accident or military dosimetry. The MOSFET detector packaging was optimised. Two different 'drop-in' design packages for a single MOSFET detector were modelled and optimised using the GEANT4 Monte Carlo toolkit. Absorbed photon dose simulations of the MOSFET dosemeter placed in free air were performed, corresponding to the absorbed doses at depths of 0.07 mm (D(w)(0.07)) and 10 mm (D(w)(10)) in a water-equivalent phantom of size 30 x 30 x 30 cm(3), for photon energies of 0.015-2 MeV. Energy dependence was reduced to within ±60% for photon energies of 0.06-2 MeV for both D(w)(0.07) and D(w)(10). Variations in the response for photon energies of 15-60 keV were 200 and 330% for D(w)(0.07) and D(w)(10), respectively. The obtained energy dependence was reduced compared with that of conventionally packaged MOSFET detectors, which usually exhibit a 500-700% over-response when used in free-air geometry.

  3. Big Data technology in traffic: A case study of automatic counters

    Directory of Open Access Journals (Sweden)

    Janković Slađana R.

    2016-01-01

    Full Text Available Modern information and communication technologies, together with intelligent devices, provide a continuous inflow of large amounts of data that are used by traffic and transport systems. Collecting traffic data does not represent a challenge nowadays, but issues remain in relation to storing and processing increasing amounts of data. In this paper we have investigated the possibilities of using Big Data technology to store and process data in the transport domain. The term Big Data refers to information resources of large volume, velocity and variety, far beyond the capabilities of commonly used software for storing, processing and managing data. In our case study, the Apache™ Hadoop® Big Data platform was used for processing data collected from 10 automatic traffic counters set up in Novi Sad and its surroundings. Indicators of traffic load which were calculated using the Big Data platform were presented using tables and graphs in the Microsoft Office Excel tool. The visualization and geolocation of the obtained indicators were performed using Microsoft Business Intelligence (BI) tools such as Excel Power View and Excel Power Map. This case study showed that Big Data technologies combined with BI tools can be used as reliable support in the monitoring of traffic management systems.
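
    The record does not give implementation details, but the heart of such a workflow is a map/reduce-style aggregation of raw counter readings into traffic-load indicators. The sketch below mimics that aggregation pattern in plain Python (hourly vehicle counts per counter); the counter IDs, record layout and the choice of indicator are illustrative assumptions, not the actual Apache Hadoop job used in the case study.

        from collections import defaultdict

        # Raw readings as (counter_id, ISO timestamp, vehicles counted in the interval).
        # In the case study such records come from 10 automatic counters and are processed
        # on a Hadoop cluster; here a handful are hard-coded purely for illustration.
        readings = [
            ("C01", "2016-05-12T07:15", 42),
            ("C01", "2016-05-12T07:45", 55),
            ("C01", "2016-05-12T08:10", 61),
            ("C07", "2016-05-12T07:20", 18),
            ("C07", "2016-05-12T08:05", 23),
        ]

        def map_phase(record):
            counter_id, timestamp, vehicles = record
            hour = timestamp[:13]                  # e.g. "2016-05-12T07"
            return (counter_id, hour), vehicles    # key/value pair, as in MapReduce

        def reduce_phase(pairs):
            totals = defaultdict(int)
            for key, value in pairs:
                totals[key] += value               # sum vehicles per counter and hour
            return totals

        hourly_load = reduce_phase(map_phase(r) for r in readings)
        for (counter, hour), total in sorted(hourly_load.items()):
            print(f"{counter} {hour}:00  {total} vehicles")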

  4. Optimised cut-off function for Tersoff-like potentials for a BN nanosheet: a molecular dynamics study

    International Nuclear Information System (INIS)

    Kumar, Rajesh; Rajasekaran, G; Parashar, Avinash

    2016-01-01

    In this article, molecular dynamics based simulations were carried out to study the tensile behaviour of boron nitride nanosheets (BNNSs). Four different sets of Tersoff potential parameters were used in the simulations for estimating the interatomic interactions between boron and nitrogen atoms. Modifications were incorporated in the Tersoff cut-off function to improve the accuracy of results with respect to fracture stress, fracture strain and Young’s modulus. In this study, the original cut-off function was optimised in such a way that small and large cut-off distances were made equal, and hence a single cut-off distance was used with all sets of Tersoff potential parameters. The single value of cut-off distance for the Tersoff potential was chosen after analysing the potential energy and bond forces experienced by boron and nitrogen atoms subjected to bond stretching. The simulations performed with the optimised cut-off function help in identifying the Tersoff potential parameters that reproduce the experimentally evaluated mechanical behaviour of BNNSs. (paper)
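
    For reference, the standard Tersoff cut-off is a smooth sine taper between an inner radius R − D and an outer radius R + D, and the optimisation described above effectively makes the two cut-off distances coincide. The sketch below only evaluates the textbook cut-off function next to a single-distance (step-like) variant for comparison; the numerical values of R and D are illustrative and are not the fitted parameters from the study.

        import math

        def tersoff_cutoff(r, R, D):
            # Standard Tersoff cut-off: 1 inside R - D, smooth sine taper to 0 at R + D.
            if r < R - D:
                return 1.0
            if r > R + D:
                return 0.0
            return 0.5 - 0.5 * math.sin(0.5 * math.pi * (r - R) / D)

        def single_distance_cutoff(r, R):
            # Variant with the small and large cut-off distances made equal (D -> 0):
            # a sharp step at the single cut-off distance R.
            return 1.0 if r <= R else 0.0

        R, D = 2.0, 0.1   # angstroms; illustrative values, not the fitted B-N parameters
        for r in (1.7, 1.95, 2.0, 2.05, 2.3):
            print(f"r = {r:.2f} Å  smooth fc = {tersoff_cutoff(r, R, D):.3f}  "
                  f"single-distance fc = {single_distance_cutoff(r, R):.0f}")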

  5. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys have been mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements on data management. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  6. Big Data access and infrastructure for modern biology: case studies in data repository utility.

    Science.gov (United States)

    Boles, Nathan C; Stone, Tyler; Bergeron, Charles; Kiehl, Thomas R

    2017-01-01

    Big Data is no longer solely the purview of big organizations with big resources. Today's routine tools and experimental methods can generate large slices of data. For example, high-throughput sequencing can quickly interrogate biological systems for the expression levels of thousands of different RNAs, examine epigenetic marks throughout the genome, and detect differences in the genomes of individuals. Multichannel electrophysiology platforms produce gigabytes of data in just a few minutes of recording. Imaging systems generate videos capturing biological behaviors over the course of days. Thus, any researcher now has access to a veritable wealth of data. However, the ability of any given researcher to utilize that data is limited by her/his own resources and skills for downloading, storing, and analyzing the data. In this paper, we examine the necessary resources required to engage Big Data, survey the state of modern data analysis pipelines, present a few data repository case studies, and touch on current institutions and programs supporting the work that relies on Big Data. © 2016 New York Academy of Sciences.

  7. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  8. Big Data Science Education: A Case Study of a Project-Focused Introductory Course

    Science.gov (United States)

    Saltz, Jeffrey; Heckman, Robert

    2015-01-01

    This paper reports on a case study of a project-focused introduction to big data science course. The pedagogy of the course leveraged boundary theory, where students were positioned to be at the boundary between a client's desire to understand their data and the academic class. The results of the case study demonstrate that using live clients…

  9. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  10. Multi-objective optimisation with stochastic discrete-event simulation in retail banking: a case study

    Directory of Open Access Journals (Sweden)

    E Scholtz

    2012-12-01

    Full Text Available The cash management of an automated teller machine (ATM) is a multi-objective optimisation problem which aims to maximise the service level provided to customers at minimum cost. This paper focuses on improved cash management in a section of the South African retail banking industry, for which a decision support system (DSS) was developed. This DSS integrates four Operations Research (OR) methods: the vehicle routing problem (VRP), the continuous review policy for inventory management, the knapsack problem and stochastic, discrete-event simulation. The DSS was applied to an ATM network in the Eastern Cape, South Africa, to investigate 90 different scenarios. Results show that the application of a formal vehicle routing method, in conjunction with selected ATM reorder levels and a knapsack-based notes dispensing algorithm, consistently yields higher service levels at lower cost when compared to two other routing approaches. It is concluded that the use of vehicle routing methods is especially beneficial when the bank has substantial control over transportation cost.
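
    Two of the four OR building blocks listed above lend themselves to a compact illustration: the continuous review reorder check and a greedy, knapsack-style note-dispensing rule. The sketch below is a toy version of those two ideas only; the reorder levels, denominations and the coupling to vehicle routing and discrete-event simulation in the actual DSS are not reproduced.

        def needs_replenishment(cash_on_hand, reorder_level):
            # Continuous review policy: trigger a replenishment order whenever the
            # cash position drops to or below the reorder level.
            return cash_on_hand <= reorder_level

        def dispense_notes(amount, denominations):
            # Greedy, knapsack-style dispensing: fill the request with the largest
            # notes first. Illustrative only; the study optimises the note mix loaded
            # into the ATM rather than individual withdrawals.
            plan = {}
            for note in sorted(denominations, reverse=True):
                count, amount = divmod(amount, note)
                if count:
                    plan[note] = count
            if amount:
                raise ValueError("amount cannot be dispensed with these denominations")
            return plan

        print(needs_replenishment(cash_on_hand=180_000, reorder_level=200_000))   # True
        print(dispense_notes(1_750, denominations=[200, 100, 50]))                 # {200: 8, 100: 1, 50: 1}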

  11. Integrated energy optimisation for the cement industry: A case study perspective

    International Nuclear Information System (INIS)

    Swanepoel, Jan Adriaan; Mathews, Edward Henry; Vosloo, Jan; Liebenberg, Leon

    2014-01-01

    Highlights: • Integration of all energy-intensive components of a cement plant production process in a simulation package. • Uniquely, the simulation model incorporates constraints such as maintenance, production and dynamic energy costs. • The system was implemented on four different cement plants and a total energy cost saving of 7.1% was achieved. - Abstract: Energy costs play a major role in the cement production process. As much as 60% of total cost is allocated to energy, and 18% to the consumption of electrical energy. Historically, energy cost savings were achieved by large infrastructure upgrades. These upgrades are often costly and lead to interruptions in production. In this paper the operation of all the energy-intensive components of the cement production process is identified, modelled, integrated and optimised for minimum operational cost while meeting production targets. This integrated approach allows for simulation of the collective effect of individual production components. The system incorporates constraints such as maintenance, production and dynamic energy costs. No published research could be found where these constraints are incorporated into a single operational solution. The system was implemented on four cement plants and a total energy cost saving of 7% was achieved. This highlights the practical significance of an integrated approach to energy cost savings.

  12. Design of farm winery façades for the optimisation of indoor natural lighting: a case study

    Directory of Open Access Journals (Sweden)

    Daniele Torreggiani

    2013-06-01

    Full Text Available This paper deals with the theme of the daylighting performance of rural buildings, within a broader research context aimed at establishing design criteria for farm wineries. The objective is to benchmark the performance of different window systems in order to define design guidelines directed towards the optimisation of natural lighting, to improve visual comfort and reduce energy consumption. A winegrowing and producing farm with standard features in the Emilia-Romagna region, Northern Italy, is considered as a case study. Particular attention was given to the part of the building designated for tasting activities. The study considered several opening solutions in the building envelope, and showed the effectiveness of those involving south façade glazing with appropriate screening systems. Further analyses will aim to investigate the performance of windows distributed on several fronts, including heat balance assessment.

  13. Communicating the Nature of Science through "The Big Bang Theory": Evidence from a Focus Group Study

    Science.gov (United States)

    Li, Rashel; Orthia, Lindy A.

    2016-01-01

    In this paper, we discuss a little-studied means of communicating about or teaching the nature of science (NOS)--through fiction television. We report some results of focus group research which suggest that the American sitcom "The Big Bang Theory" (2007-present), whose main characters are mostly working scientists, has influenced…

  14. The relationship between the big five personality factors and burnout : A study among volunteer counselors

    NARCIS (Netherlands)

    Bakker, A.B.; Van der Zee, K.I.; Lewig, K.A.; Dollard, M.F.

    In the present study of 80 volunteer counselors who cared for terminally ill patients, the authors examined the relationship between burnout as measured by the Maslach Burnout Inventory (C. Maslach, S. E. Jackson, & M. P. Leiter, 1996) and the 5 basic (Big Five) personality factors (A. A. J.

  15. Rationale, design and baseline results of the Treatment Optimisation in Primary care of Heart failure in the Utrecht region (TOPHU) study : a cluster randomised controlled trial

    NARCIS (Netherlands)

    Valk, Mark J.; Hoes, Arno W.; Mosterd, Arend; Landman, Marcel A.; Broekhuizen, Berna D L; Rutten, Frans H.

    2015-01-01

    BACKGROUND: Heart failure (HF) is mainly detected and managed in primary care, but the care is considered suboptimal. We present the rationale, design and baseline results of the Treatment Optimisation in Primary care of Heart failure in the Utrecht region (TOPHU) study. In this study we assess the

  16. Optimising parallel R correlation matrix calculations on gene expression data using MapReduce.

    Science.gov (United States)

    Wang, Shicai; Pandis, Ioannis; Johnson, David; Emam, Ibrahim; Guitton, Florian; Oehmichen, Axel; Guo, Yike

    2014-11-05

    High-throughput molecular profiling data have been used to improve clinical decision making by stratifying subjects based on their molecular profiles. Unsupervised clustering algorithms can be used for stratification purposes. However, the current speed of the clustering algorithms cannot meet the requirements of large-scale molecular data due to the poor performance of the correlation matrix calculation. With high-throughput sequencing technologies promising to produce even larger datasets per subject, we expect the performance of state-of-the-art statistical algorithms to be further impacted unless efforts towards optimisation are carried out. MapReduce is a widely used high-performance parallel framework that can solve this problem. In this paper, we evaluate the current parallel modes for correlation calculation methods and introduce an efficient data distribution and parallel calculation algorithm based on MapReduce to optimise the correlation calculation. We studied the performance of our algorithm using two gene expression benchmarks. In the micro-benchmark, our implementation using MapReduce, based on the R package RHIPE, demonstrates a 3.26-5.83 fold increase compared to the default Snowfall and a 1.56-1.64 fold increase compared to the basic RHIPE in the Euclidean, Pearson and Spearman correlations. Though vanilla R and the optimised Snowfall outperform our optimised RHIPE in the micro-benchmark, they do not scale well with the macro-benchmark. In the macro-benchmark the optimised RHIPE performs 2.03-16.56 times faster than vanilla R. Benefiting from the 3.30-5.13 times faster data preparation, the optimised RHIPE performs 1.22-1.71 times faster than the optimised Snowfall. Both the optimised RHIPE and the optimised Snowfall successfully perform the Kendall correlation with the TCGA dataset within 7 hours, more than 30 times faster than the estimated vanilla R time. The performance evaluation found that the new MapReduce algorithm and its
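
    The operation being optimised above is the pairwise correlation of many expression profiles, parallelised by distributing blocks of row pairs across workers. The Python sketch below mimics that data-distribution idea with the standard library's multiprocessing pool and a plain Pearson correlation; it is not the RHIPE/Hadoop implementation evaluated in the paper, and the matrix size is deliberately tiny.

        import math
        import random
        from itertools import combinations
        from multiprocessing import Pool

        def pearson(x, y):
            n = len(x)
            mx, my = sum(x) / n, sum(y) / n
            cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
            sx = math.sqrt(sum((a - mx) ** 2 for a in x))
            sy = math.sqrt(sum((b - my) ** 2 for b in y))
            return cov / (sx * sy)

        def correlate_pair(args):
            (i, j), data = args
            return i, j, pearson(data[i], data[j])

        if __name__ == "__main__":
            random.seed(0)
            genes, samples = 50, 20   # illustrative sizes only
            data = [[random.gauss(0, 1) for _ in range(samples)] for _ in range(genes)]

            pairs = list(combinations(range(genes), 2))
            with Pool(processes=4) as pool:
                # Each worker receives a chunk of row-index pairs, loosely analogous to
                # the key distribution used in the MapReduce-based approach.
                results = pool.map(correlate_pair, [(p, data) for p in pairs], chunksize=100)

            corr = {(i, j): r for i, j, r in results}
            print(f"computed {len(corr)} pairwise correlations; corr(0, 1) = {corr[(0, 1)]:.3f}")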

  17. A structural study for the optimisation of functional motifs encoded in protein sequences

    Directory of Open Access Journals (Sweden)

    Helmer-Citterich Manuela

    2004-04-01

    Full Text Available Abstract Background: A large number of PROSITE patterns select false positives and/or miss known true positives. It is possible that – at least in some cases – the weak specificity and/or sensitivity of a pattern is due to the fact that one, or maybe more, functional and/or structural key residues are not represented in the pattern. Multiple sequence alignments are commonly used to build functional sequence patterns. If residues structurally conserved in proteins sharing a function cannot be aligned in a multiple sequence alignment, they are likely to be missed in a standard pattern construction procedure. Results: Here we present a new procedure aimed at improving the sensitivity and/or specificity of poorly-performing patterns. The procedure can be summarised as follows: 1. residues structurally conserved in different proteins that are true positives for a pattern are identified by means of a computational technique and by visual inspection; 2. the sequence positions of the structurally conserved residues falling outside the pattern are used to build extended sequence patterns; 3. the extended patterns are optimised on the SWISS-PROT database for their sensitivity and specificity. The method was applied to eight PROSITE patterns. Whenever structurally conserved residues are found in the surface region close to the pattern (seven out of eight cases), the addition of information inferred from structural analysis is shown to improve pattern selectivity and, in some cases, sensitivity as well. In some of the cases considered the procedure allowed the identification of functionally interesting residues, whose biological role is also discussed. Conclusion: Our method can be applied to any type of functional motif or pattern (not only PROSITE ones) which is not able to select all and only the true positive hits and for which at least two true positive structures are available. The computational technique for the identification of
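
    To make the notion of an 'extended' sequence pattern concrete, the sketch below translates a PROSITE-style pattern into a regular expression and scans two toy sequences, once with a base pattern and once with one extra position appended. The pattern, the added residue constraint and the sequences are invented for illustration only; they are not the PROSITE entries optimised in the study.

        import re

        def prosite_to_regex(pattern):
            # Translate a simplified PROSITE-style pattern into a regular expression.
            # Handles single residues, x (any), [..] (alternatives), {..} (exclusions)
            # and (n) repetition counts; start/end anchors are not handled.
            regex = []
            for element in pattern.split("-"):
                m = re.fullmatch(r"(x|\[[A-Z]+\]|\{[A-Z]+\}|[A-Z])(?:\((\d+)\))?", element)
                core, count = m.group(1), m.group(2)
                if core == "x":
                    piece = "."
                elif core.startswith("{"):
                    piece = "[^" + core[1:-1] + "]"
                else:
                    piece = core
                regex.append(piece + ("{%s}" % count if count else ""))
            return "".join(regex)

        base = "C-x(2)-[DE]-H"          # toy PROSITE-like pattern
        extended = base + "-x-[FYW]"    # hypothetical extra structurally conserved position

        sequences = {"seq1": "MKCAADHLFYWK", "seq2": "MKCAADHAGSTK"}
        for name, seq in sequences.items():
            for label, pat in (("base", base), ("extended", extended)):
                hit = re.search(prosite_to_regex(pat), seq)
                outcome = f"match at {hit.start()}" if hit else "no match"
                print(f"{name}  {label:8s} {pat:18s} -> {outcome}")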

  18. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  19. A Study Of The Internet Of Things And Rfid Technology: Big Data In Navy Medicine

    Science.gov (United States)

    2017-12-01

    technology on electrical power poses a threat to hospitals as well. In the event of a power failure, whether from natural or nefarious causes, the... (Naval Postgraduate School, Monterey, California; MBA professional report, December 2017)

  20. Isogeometric Analysis and Shape Optimisation

    DEFF Research Database (Denmark)

    Gravesen, Jens; Evgrafov, Anton; Gersborg, Allan Roulund

    of the whole domain. So in every optimisation cycle we need to extend a parametrisation of the boundary of a domain to the whole domain. It has to be fast in order not to slow the optimisation down, but it also has to be robust and give a parametrisation of high quality. These are conflicting requirements, so we will explain how the validity of a parametrisation can be checked and we will describe various ways to parametrise a domain. We will in particular study the Winslow functional, which turns out to have some desirable properties. Other problems we touch upon include the clustering of boundary control points (design...

  1. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    , modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing..., and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications.

  2. Application of Surpac and Whittle Software in Open Pit Optimisation ...

    African Journals Online (AJOL)

    Application of Surpac and Whittle Software in Open Pit Optimisation and Design. ... This paper studies the Surpac and Whittle software and their application in designing an optimised pit. ...

  3. (MBO) algorithm in multi-reservoir system optimisation

    African Journals Online (AJOL)

    A comparative study of marriage in honey bees optimisation (MBO) algorithm in ... A practical application of the marriage in honey bees optimisation (MBO) ... to those of other evolutionary algorithms, such as the genetic algorithm (GA), ant ...

  4. Application Study of Self-balanced Testing Method on Big Diameter Rock-socketed Piles

    Directory of Open Access Journals (Sweden)

    Qing-biao WANG

    2013-07-01

    Full Text Available Through a field test of the self-balanced testing method on big diameter rock-socketed piles at the broadcasting centre building of Tai’an, this paper studies and analyzes the selection of the balance position, the production and installation of the load cell, the selection and installation of the displacement sensors, the loading steps, the stability conditions and the determination of the bearing capacity in the process of self-balanced testing. The paper summarizes the key technology and engineering experience of the self-balanced testing method for big diameter rock-socketed piles and also analyzes the difficult technical problems that urgently need to be resolved at present. The conclusions of the study have important significance for the popularization and application of the self-balanced testing method in similar projects.

  5. The big five as tendencies in situations : A replication study

    NARCIS (Netherlands)

    Hendriks, AAJ

    1996-01-01

    Van Heck, Perugini, Caprara and Froger (1994) report the average generalizability coefficient reflecting the consistent ordering of persons across different situations and different trait markers (items) to be in the order of 0.70. We performed a replication study in which we improved on their

  6. The big data potential of epidemiological studies for criminology and forensics.

    Science.gov (United States)

    DeLisi, Matt

    2018-07-01

    Big data, the analysis of original datasets with large samples ranging from ∼30,000 to one million participants to mine unexplored data, has been under-utilized in criminology. However, there have been recent calls for greater synthesis between epidemiology and criminology, and a small number of scholars have utilized epidemiological studies that were designed to measure alcohol and substance use to harvest behavioral and psychiatric measures that relate to the study of crime. These studies have been helpful in producing knowledge about the most serious, violent, and chronic offenders, but applications to more pathological forensic populations are lagging. Unfortunately, big data relating to crime and justice are restricted and limited to criminal justice purposes and not easily available to the research community. Thus, the study of criminal and forensic populations is limited in terms of data volume, velocity, and variety. Additional forays into epidemiology, increased use of available online judicial and correctional data, and unknown new frontiers are needed to bring criminology up to speed in the big data arena. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  7. Big Data Processing for a Central Texas Groundwater Case Study

    Science.gov (United States)

    Cantu, A.; Rivera, O.; Martínez, A.; Lewis, D. H.; Gentle, J. N., Jr.; Fuentes, G.; Pierce, S. A.

    2016-12-01

    As computational methods improve, scientists are able to expand the level and scale of experimental simulation and testing that can be completed for case studies. This study presents a comparative analysis of multiple models for the Barton Springs segment of the Edwards aquifer. Several numerical simulations using state-mandated MODFLOW models, run on Stampede, a High Performance Computing system housed at the Texas Advanced Computing Center, were performed for multiple-scenario testing. One goal of this multidisciplinary project is to visualize and compare the output data of the groundwater model using the statistical programming language R, to find revealing data patterns produced by different pumping scenarios. Presenting the data in a friendly post-processing format is covered in this paper. Visualization of the data and creating workflows applicable to the management of the data are tasks performed after data extraction. The resulting analyses provide an example of how supercomputing can be used to accelerate the evaluation of scientific uncertainty and geological knowledge in relation to policy and management decisions. Understanding the aquifer's behavior helps policy makers avoid negative impacts on endangered species and environmental services, and aids in maximizing the aquifer yield.

  8. The Big Five of Personality and structural imaging revisited: a VBM - DARTEL study.

    Science.gov (United States)

    Liu, Wei-Yin; Weber, Bernd; Reuter, Martin; Markett, Sebastian; Chu, Woei-Chyn; Montag, Christian

    2013-05-08

    The present study focuses on the neurostructural foundations of the human personality. In a large sample of 227 healthy human individuals (168 women and 59 men), we used MRI to examine the relationship between personality traits and both regional gray and white matter volume, while controlling for age and sex. Personality was assessed using the German version of the NEO Five-Factor Inventory that measures individual differences in the 'Big Five of Personality': extraversion, neuroticism, agreeableness, conscientiousness, and openness to experience. In contrast to most previous studies on neural correlates of the Big Five, we used improved processing strategies: white and gray matter were independently assessed by segmentation steps before data analysis. In addition, customized sex-specific diffeomorphic anatomical registration using exponentiated Lie algebra templates was used. Our results did not show significant correlations between any dimension of the Big Five and regional gray matter volume. However, among others, higher conscientiousness scores correlated significantly with reductions in regional white matter volume in different brain areas, including the right insula, putamen, caudate, and left fusiformis. These correlations were driven by the female subsample. The present study suggests that many results from the literature on the neurostructural basis of personality should be reviewed carefully, considering the results when the sample size is larger, imaging methods are rigorously applied, and sex-related and age-related effects are controlled.

  9. submitter Performance studies of CMS workflows using Big Data technologies

    CERN Document Server

    Ambroz, Luca; Grandi, Claudio

    At the Large Hadron Collider (LHC), more than 30 petabytes of data are produced from particle collisions every year of data taking. The data processing requires large volumes of simulated events through Monte Carlo techniques. Furthermore, physics analysis implies daily access to derived data formats by hundreds of users. The Worldwide LHC Computing Grid (WLCG) - an international collaboration involving personnel and computing centers worldwide - is successfully coping with these challenges, enabling the LHC physics program. With the continuation of LHC data taking and the approval of ambitious projects such as the High-Luminosity LHC, such challenges will reach the edge of current computing capacity and performance. One of the keys to success in the next decades - also under severe financial resource constraints - is to optimize the efficiency in exploiting the computing resources. This thesis focuses on performance studies of CMS workflows, namely centrally scheduled production activities and unpredictable d...

  10. A Study on SE Methodology for Design of Big Data Pilot Platform to Improve Nuclear Power Plant Safety

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Junguk; Cha, Jae-Min; Kim, Jun-Young; Park, Sung-Ho; Yeom, Choong-Sub [Institute for Advanced Engineering (IAE), Yongin (Korea, Republic of)

    2016-10-15

    A big data concept is expected to have a large impact on the safety of the nuclear power plant (NPP) from the beginning of the big data era. Although there is high interest in NPP safety with big data, almost no studies on the logical and physical structures and the systematic design methods of a big data platform for NPP safety have been conducted. In the current study, a new big data pilot platform for NPP safety is designed, with the main focus on health monitoring and early warning systems, and a tailored design process based on systems engineering (SE) approaches is proposed to manage the inherently high complexity of the platform design. The big data concept is expected to have a large impact on the safety of the NPP, so in this study the big data pilot platform for the health monitoring and early warning of the NPP is designed. For this, the development process based on the SE approach for the pilot platform is proposed, and the design results along with the proposed process are also presented. Implementation of the individual modules and their integration is currently in progress.

  11. A Study on SE Methodology for Design of Big Data Pilot Platform to Improve Nuclear Power Plant Safety

    International Nuclear Information System (INIS)

    Shin, Junguk; Cha, Jae-Min; Kim, Jun-Young; Park, Sung-Ho; Yeom, Choong-Sub

    2016-01-01

    A big data concept is expected to have a large impact on the safety of the nuclear power plant (NPP) from the beginning of the big data era. Although there is high interest in NPP safety with big data, almost no studies on the logical and physical structures and the systematic design methods of a big data platform for NPP safety have been conducted. In the current study, a new big data pilot platform for NPP safety is designed, with the main focus on health monitoring and early warning systems, and a tailored design process based on systems engineering (SE) approaches is proposed to manage the inherently high complexity of the platform design. The big data concept is expected to have a large impact on the safety of the NPP, so in this study the big data pilot platform for the health monitoring and early warning of the NPP is designed. For this, the development process based on the SE approach for the pilot platform is proposed, and the design results along with the proposed process are also presented. Implementation of the individual modules and their integration is currently in progress.

  12. Computer Based Optimisation Routines

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    In this paper the need for optimisation methods for the laser cutting process has been identified in three different situations. Demands on the optimisation methods for these situations are presented, and one method for each situation is suggested. The adaptation and implementation of the methods...

  13. Optimal Optimisation in Chemometrics

    NARCIS (Netherlands)

    Hageman, J.A.

    2004-01-01

    The use of global optimisation methods is not straightforward, especially for the more difficult optimisation problems. Solutions have to be found for items such as the evaluation function, representation, step function and meta-parameters, before any useful results can be obtained. This thesis aims

  14. Visualization of big data security: a case study on the KDD99 cup data set

    Directory of Open Access Journals (Sweden)

    Zichan Ruan

    2017-11-01

    Full Text Available Cyber security has been thrust into the limelight in the modern technological era because of an array of attacks that often bypass untrained intrusion detection systems (IDSs). Therefore, greater attention has been directed at deciphering better methods for identifying attack types to train IDSs more effectively. Key cyber-attack insights exist in big data; however, an efficient approach is required to determine strong attack types to train IDSs to become more effective in key areas. Despite the rising growth in IDS research, there is a lack of studies involving big data visualization, which is key. The KDD99 data set has served as a strong benchmark since 1999; therefore, we utilized this data set in our experiment. In this study, we utilized a hash algorithm, a weight table, and a sampling method to deal with the inherent problems of analyzing big data: volume, variety, and velocity. By utilizing a visualization algorithm, we were able to gain insights into the KDD99 data set, with a clear identification of “normal” clusters and a description of distinct clusters of effective attacks.
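
    The abstract names three devices for taming the data volume (a hash algorithm, a weight table and sampling) without spelling them out. The sketch below shows one plausible combination, deterministic hash-based sampling with per-label weights so that rare attack classes stay visible; the weights, field layout and sampling rule are assumptions made for illustration and are not the authors' actual procedure.

        import hashlib

        # Per-label sampling weights (fraction of records kept). Rare attack classes get
        # higher weights so they remain visible after down-sampling; values are illustrative.
        weight_table = {"normal": 0.01, "smurf": 0.02, "neptune": 0.02, "guess_passwd": 1.0}

        def keep_record(record_line, label):
            # Deterministic, hash-based sampling: a record is kept when its hash falls
            # below the label's sampling weight, so repeated runs stay comparable.
            digest = hashlib.md5(record_line.encode("utf-8")).hexdigest()
            bucket = int(digest[:8], 16) / 0xFFFFFFFF        # roughly uniform in [0, 1]
            return bucket < weight_table.get(label, 0.05)

        # Toy KDD99-like lines: comma-separated features with the label as the last field.
        records = [
            "0,tcp,http,SF,215,45076,normal",
            "0,icmp,ecr_i,SF,1032,0,smurf",
            "0,tcp,telnet,SF,126,179,guess_passwd",
        ]
        for line in records:
            label = line.rsplit(",", 1)[1]
            print(f"{label:13s} kept={keep_record(line, label)}")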

  15. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  16. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness

    Science.gov (United States)

    Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen

    2016-01-01

    The rapid development of technology has made enormous volumes of data available and achievable anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term “Big Data”, which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding on what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing. PMID:27763525

  17. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness

    Directory of Open Access Journals (Sweden)

    Ho Ting Wong

    2016-10-01

    Full Text Available The rapid development of technology has made enormous volumes of data available and achievable anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term “Big Data”, which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding on what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing.

  18. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness.

    Science.gov (United States)

    Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen

    2016-10-17

    The rapid development of technology has made enormous volumes of data available and achievable anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term "Big Data", which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding on what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing.

  19. Optimisation and performance studies of the ATLAS $b$-tagging algorithms for the 2017-18 LHC run

    CERN Document Server

    The ATLAS collaboration

    2017-01-01

    The optimisation and performance of the ATLAS $b$-tagging algorithms for the 2017-18 data taking at the LHC are described. This note presents the use of additional taggers to further enhance the discrimination between $b$-, $c$- and light-flavour jets, and on new studies for more performant training of the algorithms and for assessing the universality of the training campaign in typical physics processes where flavour tagging plays a crucial role. Particular attention is paid to the inclusion of novel taggers, namely a Soft Muon Tagger, based on the reconstruction of muons from the semileptonic decay of $b$/$c$-hadrons, and a Recurrent Neural Network Impact-Parameter tagger that exploits correlations between tracks within the jet. New variants of the high-level discriminant, based on boosted decision trees and modern deep learning techniques, are also presented. The overlap between the jets tagged by the various $b$-tagging algorithms is studied, and the dependence of the tagging performance on the physics pr...

  20. Geotechnical studies at Jaduguda uranium mine for optimisation of stoping and support parameters in molybdenite shear zone

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Sinha, A.; Prasad, L.; Prasad, M.; Raju, N.M.

    1991-01-01

    In recent years, a few geotechnical studies have been conducted by the Central Mining Research Station, Dhanbad, at Jaduguda mine to improve the ground control system and to optimise stoping parameters in the wide orebody zone at deeper levels, and thus to add to the productivity and recovery of these mines while ensuring adequate safety. The replacement of mechanical point-anchored rock-bolts by full column cement grouted bolts, installed as per the designed pattern, has improved the ground condition, decreased the consumption of timber supports by around 70%, curtailed the support installation time and reduced the support cost to a remarkable extent even at the most problematic sites of Jaduguda mine. The analysis of stress development observations in the stope pillars of this mine reveals that the size of the stope pillars may be reduced by 20% in width, which means an extra recovery of about 75 to 100 tonnes of ore per pillar per slice. In this paper, the authors have presented a brief account of their studies at this mine in the last four years. (author). 10 refs., 10 tabs., 9 figs

  1. Medical Big Data Warehouse: Architecture and System Design, a Case Study: Improving Healthcare Resources Distribution.

    Science.gov (United States)

    Sebaa, Abderrazak; Chikh, Fatima; Nouicer, Amina; Tari, AbdelKamel

    2018-02-19

    The huge increase in medical devices and clinical applications which generate enormous data has raised a big issue in managing, processing, and mining this massive amount of data. Indeed, traditional data warehousing frameworks cannot be effective when managing the volume, variety, and velocity of current medical applications. As a result, several data warehouses face many issues over medical data and many challenges need to be addressed. New solutions have emerged and Hadoop is one of the best examples; it can be used to process these streams of medical data. However, without an efficient system design and architecture, the resulting performance will not be significant or valuable to medical managers. In this paper, we provide a short review of the literature about research issues of traditional data warehouses and we present some important Hadoop-based data warehouses. In addition, a Hadoop-based architecture and a conceptual data model for designing a medical Big Data warehouse are given. In our case study, we provide implementation details of a Big Data warehouse based on the proposed architecture and data model on the Apache Hadoop platform to ensure an optimal allocation of health resources.
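
    As a rough illustration of the kind of Hadoop-based layout described above (a minimal sketch; the table name, column names, partitioning scheme and file paths are hypothetical and not taken from the paper), a PySpark session with Hive support can store encounter records as a partitioned, Parquet-backed warehouse table and query it in SQL:

        from pyspark.sql import SparkSession

        # Hypothetical sketch: medical encounter records kept as a Hive-backed,
        # Parquet, partitioned table so that queries restricted to a region/year
        # only touch the relevant HDFS directories.
        spark = (SparkSession.builder
                 .appName("medical-dw-sketch")
                 .enableHiveSupport()
                 .getOrCreate())

        raw = spark.read.csv("hdfs:///staging/encounters.csv",
                             header=True, inferSchema=True)

        (raw.write
            .mode("append")
            .partitionBy("region", "year")
            .format("parquet")
            .saveAsTable("encounters"))

        # Downstream resource-allocation queries can then be expressed in SQL.
        busiest = spark.sql("""
            SELECT region, COUNT(*) AS admissions
            FROM encounters
            WHERE year = 2017
            GROUP BY region
            ORDER BY admissions DESC
        """)
        busiest.show()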

  2. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system with the promise of improved healthcare outcomes. However, more recently, healthcare researchers are exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare, in general, and specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to more improved healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  3. Standardised approach to optimisation

    International Nuclear Information System (INIS)

    Warren-Forward, Helen M.; Beckhaus, Ronald

    2004-01-01

    Optimisation of radiographic images is said to have been obtained if the patient has received an acceptable level of dose and the image is of diagnostic value. In the near future, it will probably be recommended that radiographers measure patient doses and compare them to reference levels. The aim of this paper is to describe a standardised approach to optimisation of radiographic examinations in a diagnostic imaging department. A three-step approach is outlined with specific examples for some common examinations (chest, abdomen, pelvis and lumbar spine series). Step One: Patient doses are calculated. Step Two: Doses are compared to existing reference levels and the technique used is compared to image quality criteria. Step Three: Appropriate action is taken if doses are above the reference level. Results: Average entrance surface doses for two rooms were as follows: AP Abdomen (6.3 mGy and 3.4 mGy); AP Lumbar Spine (6.4 mGy and 4.1 mGy); AP Pelvis (4.8 mGy and 2.6 mGy) and PA Chest (0.19 mGy and 0.20 mGy). Comparison with the Commission of the European Communities (CEC) recommended techniques identified large differences in the applied potential. The kVp values in this study were significantly lower (by up to 10 kVp) than the CEC recommendations. The results of this study have indicated that there is a need to monitor radiation doses received by patients undergoing diagnostic radiography examinations. Not only has the assessment allowed valuable comparison with International Diagnostic Reference Levels and Radiography Good Practice, but it has also demonstrated large variations in mean doses being delivered from different rooms of the same radiology department. Following the simple 3-step approach advocated in this paper should either provide evidence that departments are practising the ALARA principle or assist in making suitable changes to current practice. Copyright (2004) Australian Institute of Radiography
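
    A minimal sketch of the three-step comparison described above, using the mean entrance surface doses quoted in the abstract; the reference-level values used here are illustrative placeholders, not the CEC or departmental figures:

        # Hypothetical sketch of the 3-step check: measured mean entrance surface
        # doses (ESD, mGy) per room are compared against a reference level and
        # flagged for corrective action where they exceed it.
        REFERENCE_LEVELS_MGY = {          # placeholder reference levels, illustration only
            "AP Abdomen": 10.0,
            "AP Lumbar Spine": 10.0,
            "AP Pelvis": 10.0,
            "PA Chest": 0.3,
        }

        measured = {                      # Step One: mean ESD per examination and room
            "AP Abdomen":      {"room 1": 6.3,  "room 2": 3.4},
            "AP Lumbar Spine": {"room 1": 6.4,  "room 2": 4.1},
            "AP Pelvis":       {"room 1": 4.8,  "room 2": 2.6},
            "PA Chest":        {"room 1": 0.19, "room 2": 0.20},
        }

        for exam, rooms in measured.items():      # Step Two: compare to reference level
            ref = REFERENCE_LEVELS_MGY[exam]
            for room, esd in rooms.items():
                status = "review technique" if esd > ref else "within reference level"
                print(f"{exam:16s} {room}: {esd:5.2f} mGy (ref {ref:5.2f}) -> {status}")
        # Step Three: where "review technique" is flagged, kVp/mAs and image quality
        # criteria are re-assessed before changing practice.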

  4. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  5. Research article – Optimisation of paediatric computed radiography for full spine curvature measurements using a phantom: a pilot study

    NARCIS (Netherlands)

    de Haan, Seraphine; Reis, Cláudia; Ndlovu, Junior; Serrenho, Catarina; Akhtar, Ifrah; Garcia, José Antonio; Linde, Daniël; Thorskog, Martine; Franco, Loris; Hogg, Peter

    2015-01-01

    Aim: Optimise a set of exposure factors, with the lowest effective dose, to delineate spinal curvature with the modified Cobb method in a full spine using computed radiography (CR) for a 5-year-old paediatric anthropomorphic phantom. Methods: Images were acquired by varying a set of parameters:

  6. Effectiveness of an implementation optimisation intervention aimed at increasing parent engagement in HENRY, a childhood obesity prevention programme - the Optimising Family Engagement in HENRY (OFTEN) trial: study protocol for a randomised controlled trial.

    Science.gov (United States)

    Bryant, Maria; Burton, Wendy; Cundill, Bonnie; Farrin, Amanda J; Nixon, Jane; Stevens, June; Roberts, Kim; Foy, Robbie; Rutter, Harry; Hartley, Suzanne; Tubeuf, Sandy; Collinson, Michelle; Brown, Julia

    2017-01-24

    Family-based interventions to prevent childhood obesity depend upon parents' taking action to improve diet and other lifestyle behaviours in their families. Programmes that attract and retain high numbers of parents provide an enhanced opportunity to improve public health and are also likely to be more cost-effective than those that do not. We have developed a theory-informed optimisation intervention to promote parent engagement within an existing childhood obesity prevention group programme, HENRY (Health Exercise Nutrition for the Really Young). Here, we describe a proposal to evaluate the effectiveness of this optimisation intervention in regard to the engagement of parents and cost-effectiveness. The Optimising Family Engagement in HENRY (OFTEN) trial is a cluster randomised controlled trial being conducted across 24 local authorities (approximately 144 children's centres) which currently deliver HENRY programmes. The primary outcome will be parental enrolment and attendance at the HENRY programme, assessed using routinely collected process data. Cost-effectiveness will be presented in terms of primary outcomes using acceptability curves and through eliciting the willingness to pay for the optimisation from HENRY commissioners. Secondary outcomes include the longitudinal impact of the optimisation, parent-reported infant intake of fruits and vegetables (as a proxy to compliance) and other parent-reported family habits and lifestyle. This innovative trial will provide evidence on the implementation of a theory-informed optimisation intervention to promote parent engagement in HENRY, a community-based childhood obesity prevention programme. The findings will be generalisable to other interventions delivered to parents in other community-based environments. This research meets the expressed needs of commissioners, children's centres and parents to optimise the potential impact that HENRY has on obesity prevention. A subsequent cluster randomised controlled pilot

  7. Knee Kinematics Estimation Using Multi-Body Optimisation Embedding a Knee Joint Stiffness Matrix: A Feasibility Study.

    Directory of Open Access Journals (Sweden)

    Vincent Richard

    Full Text Available The use of multi-body optimisation (MBO) to estimate joint kinematics from stereophotogrammetric data while compensating for soft tissue artefact is still open to debate. Presently used joint models embedded in MBO, such as mechanical linkages, constitute a considerable simplification of joint function, preventing a detailed understanding of it. The present study proposes a knee joint model where femur and tibia are represented as rigid bodies connected through an elastic element the behaviour of which is described by a single stiffness matrix. The deformation energy, computed from the stiffness matrix and joint angles and displacements, is minimised within the MBO. Implemented as a "soft" constraint using a penalty-based method, this elastic joint description challenges the strictness of "hard" constraints. In this study, estimates of knee kinematics obtained using MBO embedding four different knee joint models (i.e., no constraints, spherical joint, parallel mechanism, and elastic joint) were compared against reference kinematics measured using bi-planar fluoroscopy on two healthy subjects ascending stairs. Bland-Altman analysis and sensitivity analysis investigating the influence of variations in the stiffness matrix terms on the estimated kinematics substantiate the conclusions. The difference between the reference knee joint angles and displacements and the corresponding estimates obtained using MBO embedding the stiffness matrix showed an average bias and standard deviation for kinematics of 0.9±3.2° and 1.6±2.3 mm. These values were lower than when no joint constraints (1.1±3.8°, 2.4±4.1 mm) or a parallel mechanism (7.7±3.6°, 1.6±1.7 mm) were used and were comparable to the values obtained with a spherical joint (1.0±3.2°, 1.3±1.9 mm). The study demonstrated the feasibility of substituting an elastic joint for more classic joint constraints in MBO.

  8. Knee Kinematics Estimation Using Multi-Body Optimisation Embedding a Knee Joint Stiffness Matrix: A Feasibility Study.

    Science.gov (United States)

    Richard, Vincent; Lamberto, Giuliano; Lu, Tung-Wu; Cappozzo, Aurelio; Dumas, Raphaël

    2016-01-01

    The use of multi-body optimisation (MBO) to estimate joint kinematics from stereophotogrammetric data while compensating for soft tissue artefact is still open to debate. Presently used joint models embedded in MBO, such as mechanical linkages, constitute a considerable simplification of joint function, preventing a detailed understanding of it. The present study proposes a knee joint model where femur and tibia are represented as rigid bodies connected through an elastic element the behaviour of which is described by a single stiffness matrix. The deformation energy, computed from the stiffness matrix and joint angles and displacements, is minimised within the MBO. Implemented as a "soft" constraint using a penalty-based method, this elastic joint description challenges the strictness of "hard" constraints. In this study, estimates of knee kinematics obtained using MBO embedding four different knee joint models (i.e., no constraints, spherical joint, parallel mechanism, and elastic joint) were compared against reference kinematics measured using bi-planar fluoroscopy on two healthy subjects ascending stairs. Bland-Altman analysis and sensitivity analysis investigating the influence of variations in the stiffness matrix terms on the estimated kinematics substantiate the conclusions. The difference between the reference knee joint angles and displacements and the corresponding estimates obtained using MBO embedding the stiffness matrix showed an average bias and standard deviation for kinematics of 0.9±3.2° and 1.6±2.3 mm. These values were lower than when no joint constraints (1.1±3.8°, 2.4±4.1 mm) or a parallel mechanism (7.7±3.6°, 1.6±1.7 mm) were used and were comparable to the values obtained with a spherical joint (1.0±3.2°, 1.3±1.9 mm). The study demonstrated the feasibility of substituting an elastic joint for more classic joint constraints in MBO.
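
    A sketch of the penalty term implied by the abstract (the notation here is ours, not the authors'): if $\Delta\mathbf{q}$ collects the six knee joint angles and displacements of the tibia relative to the femur, measured from a reference pose, and $\mathbf{K}$ is the $6\times6$ stiffness matrix, the stored deformation energy minimised within the MBO can be written as

        $E = \tfrac{1}{2}\,\Delta\mathbf{q}^{\mathsf{T}}\,\mathbf{K}\,\Delta\mathbf{q}$,

    so that the overall objective becomes the usual soft-tissue-artefact least-squares term plus $E$ acting as a soft, penalty-based constraint, in place of the hard constraints imposed by a spherical joint or a parallel mechanism.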

  9. An Empirical Study on Visualizing the Intellectual Structure and Hotspots of Big Data Research from a Sustainable Perspective

    Directory of Open Access Journals (Sweden)

    Feng Hu

    2018-03-01

    Full Text Available Big data has been extensively applied to many fields and is in demand for sustainable development. However, the rapidly growing number of publications and the dynamic nature of research fronts pose challenges to understanding the current research situation and the sustainable development directions of big data. In this paper, we visually conducted a bibliometric study of the big data literature from the Web of Science (WoS) between 2002 and 2016, involving 4927 effective journal articles in 1729 journals contributed by 16,404 authors from 4137 institutions. The bibliometric results reveal the current annual publications distribution, journals distribution and co-citation network, institutions distribution and collaboration network, authors distribution, collaboration network and co-citation network, and research hotspots. The results can help researchers worldwide to understand the panorama of current big data research, to find the potential research gaps, and to focus on the future sustainable development directions.

  10. Reducing passengers’ travel time by optimising stopping patterns in a large-scale network: A case-study in the Copenhagen Region

    DEFF Research Database (Denmark)

    Parbo, Jens; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2018-01-01

    Optimising stopping patterns in railway schedules is a cost-effective way to reduce passengers’ generalised travel costs without increasing train operators’ costs. The challenge consists in striking a balance between an increase in waiting time for passengers at skipped stations and a decrease...... in travel time for through-going passengers, with possible consequent changes in the passenger demand and route choices. This study presents the formulation of the skip-stop problem as a bi-level optimisation problem where the lower level is a schedule-based transit assignment model that delivers passengers...... is a mixed-integer problem, whereas the route choice model is a non-linear non-continuous mapping of the timetable. The method was tested on the suburban railway network in the Greater Copenhagen Region (Denmark): the reduction in railway passengers’ in-vehicle travel time was 5.5%, the reduction...
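
    As a toy illustration of the bi-level idea (a deliberately simplified stand-in, not the authors' model): the upper level enumerates stopping patterns for a small line, and the lower level assigns a fixed origin-destination demand to compute the resulting generalised travel time, trading through-passengers' in-vehicle savings against penalties for passengers whose stations are skipped. All numbers below are illustrative.

        from itertools import product

        # Toy stand-in for the bi-level skip-stop problem (illustrative numbers only).
        stations = ["A", "B", "C", "D", "E"]
        run_time = 4.0        # minutes between consecutive stations
        dwell = 1.0           # minutes added per intermediate stop actually served
        skip_penalty = 10.0   # minutes added per passenger whose origin/destination is skipped
        demand = {("A", "C"): 120, ("A", "E"): 300, ("B", "D"): 80, ("C", "E"): 150}

        def generalised_time(pattern):
            """Total passenger-minutes for one stopping pattern (terminals always served)."""
            served = {s for s, stops_here in zip(stations, pattern) if stops_here}
            total = 0.0
            for (o, d), pax in demand.items():
                i, j = stations.index(o), stations.index(d)
                in_vehicle = (j - i) * run_time
                if o in served and d in served:
                    stops_between = sum(1 for s in stations[i + 1:j] if s in served)
                    total += pax * (in_vehicle + stops_between * dwell)
                else:               # origin or destination skipped: transfer/waiting penalty
                    total += pax * (in_vehicle + skip_penalty)
            return total

        # Upper level: brute-force enumeration of stopping patterns (a MIP in the real model).
        candidates = [p for p in product([True, False], repeat=len(stations)) if p[0] and p[-1]]
        best = min(candidates, key=generalised_time)
        print("best pattern:", dict(zip(stations, best)),
              "-> passenger-minutes:", generalised_time(best))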

  11. Spatial issues when optimising waste treatment and energy systems – A Danish Case Study

    DEFF Research Database (Denmark)

    Pizarro Alonso, Amalia Rosa; Münster, Marie; Petrovic, Stefan

    2014-01-01

    This study addresses the challenge of including geographical information related to waste resources, energy demands and production plants, and transport options in the optimization of waste management. It analyses how waste may serve as an energy source through thermal conversion and anaerobic di...

  12. Optimisation of anomalous scattering and structural studies of proteins using synchrotron radiation

    International Nuclear Information System (INIS)

    Helliwell, J.R.

    1979-01-01

    Measurements from crystalline protein samples using SR can be conveniently divided into two classes. Firstly, small samples, large unit cells, the rapid collection of accurate high resolution data and dynamical studies can all benefit from the high intensity. Secondly, an important extension of the classical methods of protein structure determination arises from use of the tunability of SR for optimization of anomalous scattering and subsequent phase determination. This paper concentrates on this area of application. (author)

  13. Optimising intraperitoneal gentamicin dosing in peritoneal dialysis patients with peritonitis (GIPD study

    Directory of Open Access Journals (Sweden)

    Lipman Jeffrey

    2009-12-01

    Full Text Available Abstract Background Antibiotics are preferentially delivered via the peritoneal route to treat peritonitis, a major complication of peritoneal dialysis (PD), so that maximal concentrations are delivered at the site of infection. However, drugs administered intraperitoneally can be absorbed into the systemic circulation. Drugs excreted by the kidneys accumulate in PD patients, increasing the risk of toxicity. The aim of this study is to examine a model of gentamicin pharmacokinetics and to develop an intraperitoneal drug dosing regime that maximises bacterial killing and minimises toxicity. Methods/Design This is an observational pharmacokinetic study of consecutive PD patients presenting to the Royal Brisbane and Women's Hospital with PD peritonitis and who meet the inclusion criteria. Participants will be allocated to either group 1, if anuric as defined by urine output less than 100 ml/day, or group 2, if non-anuric, as defined by urine output more than 100 ml/day. Recruitment will be limited to 15 participants in each group. Gentamicin dosing will be based on the present Royal Brisbane & Women's Hospital guidelines, which reflect the current International Society for Peritoneal Dialysis Peritonitis Treatment Recommendations. The primary endpoint is to describe the pharmacokinetics of gentamicin administered intraperitoneally in PD patients with peritonitis based on serial blood and dialysate drug levels. Discussion The study will develop improved dosing recommendations for intraperitoneally administered gentamicin in PD patients with peritonitis. This will guide clinicians and pharmacists in selecting the most appropriate dosing regime of intraperitoneal gentamicin to treat peritonitis. Trial Registration ACTRN12609000446268

  14. Optimisation of CT procedures by dose reduction in abdominal-pelvic studies of chronic patients

    International Nuclear Information System (INIS)

    Calvo, D.; Rodriguez, A.M.; Peinado, M.A.; Fernandez, B.; Fernandez, B.M.; Jimenez, J.R.

    2006-01-01

    Full text of publication follows: Objectives: CT explorations are responsible for a significant increase of the collective dose during the last twenty years. However, by adapting the procedures to the specific diagnostic requirements of each kind of exploration, dose values can be decreased. This can be especially interesting for chronic patients who undergo several CT controls. The aim of this research is to contrast CT image diagnostic quality by comparing those techniques commonly used in our hospital with lower dose ones. Materials and methods: In a first phase, a phantom study was carried out to evaluate the image quality variations obtained with the standard and several low-dose techniques. Dose reduction was quantified as well by means of CTDIw measurements on an abdominal phantom. Both aspects were taken into account to determine a dose threshold below which image quality degradation was considered unacceptable from a diagnostic point of view. Subsequently, a group of 50 chronic patients under follow-up was selected to undergo a control CT but with a low-dose technique. Image diagnostic quality was compared with that of previous controls obtained using the standard technique. Three experienced radiologists carried out this evaluation over a sample of six particular slices located at the abdomen and pelvis using an ordinal scale. Such a scale grades the confidence level of the image for each radiologist. This evaluation was repeated one and two months later without knowledge of previous results to calculate inter- and intra-observer variability. Conclusions: CT studies can be carried out with a significant dose reduction while preserving their diagnostic capabilities. A quantitative evaluation will be offered at the end of the study, which is still running. (authors)

  15. Studies of Big Data metadata segmentation between relational and non-relational databases

    Science.gov (United States)

    Golosova, M. V.; Grigorieva, M. A.; Klimentov, A. A.; Ryabinkin, E. A.; Dimitrov, G.; Potekhin, M.

    2015-12-01

    In recent years the concept of Big Data has become well established in IT. Systems managing large data volumes produce metadata that describe data and workflows. These metadata are used to obtain information about the current system state and for statistical and trend analysis of the processes these systems drive. Over time, the amount of stored metadata can grow dramatically. In this article we present our studies to demonstrate how metadata storage scalability and performance can be improved by using a hybrid RDBMS/NoSQL architecture.

  16. Studies of Big Data metadata segmentation between relational and non-relational databases

    CERN Document Server

    Golosova, M V; Klimentov, A A; Ryabinkin, E A; Dimitrov, G; Potekhin, M

    2015-01-01

    In recent years the concept of Big Data has become well established in IT. Systems managing large data volumes produce metadata that describe data and workflows. These metadata are used to obtain information about the current system state and for statistical and trend analysis of the processes these systems drive. Over time, the amount of stored metadata can grow dramatically. In this article we present our studies to demonstrate how metadata storage scalability and performance can be improved by using a hybrid RDBMS/NoSQL architecture.
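
    A minimal sketch of the segmentation idea (the schema and field names below are hypothetical, not the paper's): the stable, frequently queried metadata fields go into a relational table, while the variable, schema-less remainder of each record is kept in a document-style store keyed by the same identifier (an in-memory dict stands in for the NoSQL backend here).

        import json
        import sqlite3

        # Hypothetical hybrid layout: relational side holds the fixed, indexed fields;
        # the flexible part of each metadata record lives in a key -> JSON document store.
        rdb = sqlite3.connect(":memory:")
        rdb.execute("""CREATE TABLE tasks (
                           task_id INTEGER PRIMARY KEY,
                           status  TEXT,
                           created TEXT)""")

        doc_store = {}  # stand-in for the NoSQL side

        def insert_task(task_id, status, created, extra):
            rdb.execute("INSERT INTO tasks VALUES (?, ?, ?)", (task_id, status, created))
            doc_store[task_id] = json.dumps(extra)      # arbitrary, evolving metadata

        def fetch_task(task_id):
            row = rdb.execute("SELECT status, created FROM tasks WHERE task_id = ?",
                              (task_id,)).fetchone()
            return {"task_id": task_id, "status": row[0], "created": row[1],
                    **json.loads(doc_store[task_id])}

        insert_task(1, "done", "2015-06-01",
                    {"input_datasets": ["data15_13TeV.x"], "nfiles": 412, "site": "CERN-PROD"})
        print(fetch_task(1))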

  17. Summary of the Big Lost River fish study on the Idaho National Engineering Laboratory Site

    International Nuclear Information System (INIS)

    Overton, C.K.; Johnson, D.W.

    1978-01-01

    Winter fish mortality and fish migration in the Big Lost River were related to natural phenomena and man-made impacts. Low winter flows resulted in a reduction in habitat and increased rainbow trout mortality. Man-altered flows stimulated movement and created deleterious conditions. Migratory patterns were related to water discharge and temperature. A food habit study of three sympatric salmonid fishes was undertaken during a low water period. The ratio of food items differed between the three species. Flesh of salmonid fishes from within the INEL Site boundary was monitored for three years for radionuclides. Only one trout contained Cs-137 concentrations above the minimum detection limits

  18. Optimisation of metabolic criteria in the prognostic assessment in patients with lymphoma. A multicentre study.

    Science.gov (United States)

    Del Puig Cózar-Santiago, M; García-Garzón, J R; Moragas-Freixa, M; Soler-Peter, M; Bassa Massanas, P; Sánchez-Delgado, M; Sanchez-Jurado, R; Aguilar-Barrios, J E; Sanz-Llorens, R; Ferrer-Rebolleda, J

    To compare sensitivity, specificity and predictive value of Deauville score (DS) vs. ΔSUVmax in interim-treatment PET (iPET) and end-treatment PET (ePET), in patients with diffuse large B cell lymphoma (DLBCL), Hodgkin lymphoma (HL), and follicular lymphoma (FL). Retrospective longitudinal multicentre study including 138 patients (46 DLBCL, 46 HL, 46 FL), on whom three ¹⁸F-FDG PET/CT scans were performed: baseline, iPET, and ePET. Visual (DS) and semi-quantitative (ΔSUVmax) parameters were determined for iPET and ePET. Predictive value was determined in relation to disease-free interval. Statistical analysis. iPET for DLBCL, HL, and FL: 1) sensitivity of DS: 76.92/83.33/61.53%; specificity: 78.78/85/81.81%; 2) sensitivity of ΔSUVmax: 53.84/83.33/61.53%; specificity: 87.87/87.50/78.78%. ePET for DLBCL, HL and FL: 1) sensitivity of DS: 61.53/83.33/69.23%; specificity: 90.90/85/87.87%; 2) sensitivity of ΔSUVmax: 69.23/83.33/69.23%; specificity: 90.90/87.50/84.84%. Predictive assessment. iPET study: in DLBCL, DS resulted in 10.3% recurrence of negative iPET, and 17.1% in ΔSUVmax at disease-free interval; in HL, both parameters showed a 2.8% recurrence of negative iPET; in FL, DS resulted in 15.6% recurrence of negative iPET, and 16.1% in ΔSUVmax, with no statistical significance. ePET study: in DLBCL, DS resulted in 14.3% recurrence of negative ePET, and 11.8% in ΔSUVmax at disease-free interval; in HL and FL, both methods showed 2.8 and 12.5% recurrence in negative ePET, respectively. DS and ΔSUVmax did not show significant differences in DLBCL, HL and FL. Their predictive value also did not show significant differences in HL and FL. In DLBCL, DS was higher in iPET, and ΔSUVmax in ePET. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.
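
    For reference, the semi-quantitative parameter is commonly defined as the relative drop in SUVmax between the baseline scan and the interim (or end-of-treatment) scan,

        $\Delta SUV_{max} = \dfrac{SUV_{max}^{baseline} - SUV_{max}^{interim}}{SUV_{max}^{baseline}} \times 100\%$,

    with metabolic response declared when the drop exceeds a pre-specified threshold; the exact cut-off used in this study is not stated in the abstract.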

  19. Centralising and optimising decentralised stroke care systems: a simulation study on short-term costs and effects

    Directory of Open Access Journals (Sweden)

    Maarten M. H. Lahr

    2017-01-01

    Full Text Available Abstract Background Centralisation of thrombolysis may offer substantial benefits. The aim of this study was to assess short term costs and effects of centralisation of thrombolysis and optimised care in a decentralised system. Methods Using simulation modelling, three scenarios to improve decentralised settings in the North of Netherlands were compared from the perspective of the policy maker and compared to current decentralised care: (1) improving stroke care at nine separate hospitals, (2) centralising and improving thrombolysis treatment to four, and (3) two hospitals. Outcomes were annual mean and incremental costs per patient up to the treatment with thrombolysis, incremental cost-effectiveness ratio (iCER) per 1% increase in thrombolysis rate, and the proportion treated with thrombolysis. Results Compared to current decentralised care, improving stroke care at individual community hospitals led to mean annual costs per patient of $US 1,834 (95% CI, 1,823–1,843) whereas centralising to four and two hospitals led to $US 1,462 (95% CI, 1,451–1,473) and $US 1,317 (95% CI, 1,306–1,328), respectively (P < 0.001). The iCER of improving community hospitals was $US 113 (95% CI, 91–150), and $US 71 (95% CI, 59–94), $US 56 (95% CI, 44–74) when centralising to four and two hospitals, respectively. Thrombolysis rates decreased from 22.4 to 21.8% and 21.2% (P = 0.120 and P = 0.001) in case of increasing centralisation. Conclusions Centralising thrombolysis substantially lowers mean annual costs per patient compared to raising stroke care at community hospitals simultaneously. Small, but negative effects on thrombolysis rates may be expected.
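
    The incremental cost-effectiveness ratio quoted per 1% increase in thrombolysis rate follows the standard form

        $iCER = \dfrac{C_{scenario} - C_{current}}{E_{scenario} - E_{current}}$,

    where, in this simulation, $C$ is the mean annual cost per patient and $E$ the thrombolysis rate (in %) of a scenario compared with current decentralised care.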

  20. An Optimisation Study on Integrating and Incentivising Thermal Energy Storage (TES) in a Dwelling Energy System

    Directory of Open Access Journals (Sweden)

    Gbemi Oluleye

    2018-04-01

    Full Text Available In spite of the benefits from thermal energy storage (TES) integration in dwellings, the penetration rate in Europe is 5%. Effective fiscal policies are necessary to accelerate deployment. However, there is currently no direct support for TES in buildings compared to support for electricity storage. This could be due to a lack of evidence to support incentivisation. In this study, a novel systematic framework is developed to provide a case in support of TES incentivisation. The model determines the costs, CO2 emissions, dispatch strategy and sizes of technologies, and TES for a domestic user under policy neutral and policy intensive scenarios. The model is applied to different building types in the UK; here it is applied to a case study of a detached dwelling (floor area of 122 m2), where heat demand is satisfied by a boiler and electricity imported from the grid. Results show that under a policy neutral scenario, integrating a micro-Combined Heat and Power (CHP) unit reduces the primary energy demand by 11% and CO2 emissions by 21%, but with a 16 year payback. Additional benefits from TES integration can pay for the investment within the first 9 years, reducing to 3.5–6 years when the CO2 levy is accounted for. Under a policy intensive scenario (for example, considering the Feed-in Tariff (FIT)), primary energy demand and CO2 emissions reduce by 17 and 33% respectively, with a 5 year payback. In this case, the additional benefits from TES integration can pay for the investment in TES within the first 2 years. The framework developed is a useful tool in determining the role of TES in decarbonising domestic energy systems.
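
    The payback periods quoted above follow from a simple (undiscounted) payback calculation; a minimal sketch with illustrative figures that are assumptions for demonstration, not the paper's cost data:

        # Simple payback sketch for a TES retrofit (illustrative figures, not from the study).
        def simple_payback(capital_cost, annual_saving):
            """Years needed for annual savings to repay the upfront investment."""
            if annual_saving <= 0:
                return float("inf")
            return capital_cost / annual_saving

        tes_capital = 900.0           # GBP, assumed installed cost of the thermal store
        saving_energy_only = 150.0    # GBP/yr, assumed bill saving from shifting heat production
        saving_with_co2_levy = 260.0  # GBP/yr, assumed saving once a CO2 levy is credited

        print(f"payback, energy saving only: {simple_payback(tes_capital, saving_energy_only):4.1f} yr")
        print(f"payback, incl. CO2 levy    : {simple_payback(tes_capital, saving_with_co2_levy):4.1f} yr")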

  1. Optimising Controlled Human Malaria Infection Studies Using Cryopreserved P. falciparum Parasites Administered by Needle and Syringe.

    Directory of Open Access Journals (Sweden)

    Susanne H Sheehy

    Full Text Available Controlled human malaria infection (CHMI) studies have become a routine tool to evaluate efficacy of candidate anti-malarial drugs and vaccines. To date, CHMI trials have mostly been conducted using the bite of infected mosquitoes, restricting the number of trial sites that can perform CHMI studies. Aseptic, cryopreserved P. falciparum sporozoites (PfSPZ Challenge) provide a potentially more accurate, reproducible and practical alternative, allowing a known number of sporozoites to be administered simply by injection. We sought to assess the infectivity of PfSPZ Challenge administered in different dosing regimens to malaria-naive healthy adults (n = 18). Six participants received 2,500 sporozoites intradermally (ID), six received 2,500 sporozoites intramuscularly (IM) and six received 25,000 sporozoites IM. Five out of six participants receiving 2,500 sporozoites ID, 3/6 participants receiving 2,500 sporozoites IM and 6/6 participants receiving 25,000 sporozoites IM were successfully infected. The median time to diagnosis was 13.2, 17.8 and 12.7 days for 2,500 sporozoites ID, 2,500 sporozoites IM and 25,000 sporozoites IM respectively (Kaplan-Meier method; p = 0.024, log-rank test). 2,500 sporozoites ID and 25,000 sporozoites IM have similar infectivities. Given the dose response in infectivity seen with IM administration, further work should evaluate increasing doses of PfSPZ Challenge IM to identify a dosing regimen that reliably infects 100% of participants. ClinicalTrials.gov NCT01465048.

  2. Optimising perioperative care for hip and knee arthroplasty in South Africa: a Delphi consensus study.

    Science.gov (United States)

    Plenge, U; Nortje, M B; Marais, L C; Jordaan, J D; Parker, R; van der Westhuizen, N; van der Merwe, J F; Marais, J; September, W V; Davies, G L; Pretorius, T; Solomon, C; Ryan, P; Torborg, A M; Farina, Z; Smit, R; Cairns, C; Shanahan, H; Sombili, S; Mazibuko, A; Hobbs, H R; Porrill, O S; Timothy, N E; Siebritz, R E; van der Westhuizen, C; Troskie, A J; Blake, C A; Gray, L A; Munting, T W; Steinhaus, H K S; Rowe, P; van der Walt, J G; Isaacs Noordien, R; Theron, A; Biccard, B M

    2018-05-09

    A structured approach to perioperative patient management based on an enhanced recovery pathway protocol facilitates early recovery and reduces morbidity in high-income countries. However, in low- and middle-income countries (LMICs), the feasibility of implementing enhanced recovery pathways and their influence on patient outcomes has scarcely been investigated. To inform similar practice in LMICs for total hip and knee arthroplasty, it is necessary to identify potential factors for inclusion in such a programme, appropriate for LMICs. Applying a Delphi method, 33 stakeholders (13 arthroplasty surgeons, 12 anaesthetists and 8 physiotherapists) from 10 state hospitals representing 4 South African provinces identified and prioritised i) risk factors associated with poor outcomes, ii) perioperative interventions to improve outcomes and iii) patient and clinical outcomes necessary to benchmark practice for patients scheduled for primary elective unilateral total hip and knee arthroplasty. Thirty of the thirty-three stakeholders completed the 3-month Delphi study. The first round yielded i) 36 suggestions for preoperative risk factors, ii) 14 (preoperative), 18 (intraoperative) and 23 (postoperative) suggestions for best practices for perioperative interventions to improve outcomes and iii) 25 suggestions for important postsurgical outcomes. These items were prioritised by the group in the consecutive rounds and consensus was reached for the top ten priorities for each category. The consensus-derived risk factors, perioperative interventions and important outcomes will inform the development of a structured, perioperative multidisciplinary enhanced patient care protocol for total hip and knee arthroplasty. It is anticipated that this study will provide the construct necessary for developing pragmatic enhanced care pathways aimed at improving patient outcomes after arthroplasty in LMICs.

  3. Assessment of empirical antibiotic therapy optimisation in six hospitals: an observational cohort study.

    Science.gov (United States)

    Braykov, Nikolay P; Morgan, Daniel J; Schweizer, Marin L; Uslan, Daniel Z; Kelesidis, Theodoros; Weisenberg, Scott A; Johannsson, Birgir; Young, Heather; Cantey, Joseph; Srinivasan, Arjun; Perencevich, Eli; Septimus, Edward; Laxminarayan, Ramanan

    2014-12-01

    Modification of empirical antimicrobials when warranted by culture results or clinical signs is recommended to control antimicrobial overuse and resistance. We aimed to assess the frequency with which patients were started on empirical antimicrobials, characteristics of the empirical regimen and the clinical characteristics of patients at the time of starting antimicrobials, patterns of changes to empirical therapy at different timepoints, and modifiable factors associated with changes to the initial empirical regimen in the first 5 days of therapy. We did a chart review of adult inpatients receiving one or more antimicrobials in six US hospitals on 4 days during 2009 and 2010. Our primary outcome was the modification of antimicrobial regimen on or before the 5th day of empirical therapy, analysed as a three-category variable. Bivariate analyses were used to establish demographic and clinical variables associated with the outcome. Variables with p values below 0·1 were included in a multivariable generalised linear latent and mixed model with multinomial logit link to adjust for clustering within hospitals and accommodate a non-binary outcome variable. Across the six study sites, 4119 (60%) of 6812 inpatients received antimicrobials. Of 1200 randomly selected patients with active antimicrobials, 730 (61%) met inclusion criteria. At the start of therapy, 220 (30%) patients were afebrile and had normal white blood cell counts. Appropriate cultures were collected from 432 (59%) patients, and 250 (58%) were negative. By the 5th day of therapy, 12·5% of empirical antimicrobials were escalated, 21·5% were narrowed or discontinued, and 66·4% were unchanged. Narrowing or discontinuation was more likely when cultures were collected at the start of therapy (adjusted OR 1·68, 95% CI 1·05-2·70) and no infection was noted on an initial radiological study (1·76, 1·11-2·79). Escalation was associated with multiple infection sites (2·54, 1·34-4·83) and a positive

  4. Contribution to optimisation of Environmental Isotopes tracing in Hydrogeology. Case study of Madagascar

    International Nuclear Information System (INIS)

    RAJAOBELISON, J.

    2003-01-01

    The aim of this work is to suggest some improvements to the theory of interpretation and to the methodological approach for the optimum use of environmental isotope tracing applied to hydrogeological investigation. A review of the theory of environmental isotopes used in hydrogeology has been made. The main constraints have been highlighted and led to some comments and proposals for improvement, in particular with regard to the continental effect on stable isotopes, to the seasonal variation of groundwater 14C content, and to the appropriate model for fractured crystalline aquifers. A literature survey of ten specific scientific papers, dealing with isotope hydrology in miscellaneous types of aquifers and catchments, allowed a synthesis of the hydrogeological, geochemical and isotopic constraints to be drawn. A proposal for an optimum methodological approach, taking into account the above mentioned constraints, has been inferred. The results of an on-going hydrogeological investigation carried out in the Southern crystalline basement and coastal sedimentary aquifers of Madagascar highlight an unusual methodological approach necessitated by the lack of initial basic hydrogeological data. Besides, it shows to what extent the experience of the above mentioned research works can apply in the specific case of the complex aquifers of Madagascar. The lessons gained from this study contribute to enriching the synthesis of environmental isotope constraints in hydrogeology and lead to a more realistic methodological approach proposal which is likely to make better use of isotope hydrology technology

  5. Study and optimisation of manganese oxide-based electrodes for electrochemical supercapacitors

    Energy Technology Data Exchange (ETDEWEB)

    Staiti, P.; Lufrano, F. [CNR-ITAE, Istituto di Tecnologie Avanzate per l' Energia ' ' Nicola Giordano' ' , Via Salita S. Lucia n. 5, 98126 S. Lucia, Messina (Italy)

    2009-02-01

    A manganese oxide material was synthesised by an easy precipitation method based on reduction of potassium permanganate(VII) with a manganese(II) salt. The material was treated at different temperatures to study the effect of thermal treatment on its capacitive properties. The best capacitive performance was obtained with the material treated at 200 °C. This material was used to prepare electrodes with different amounts of polymer binder, carbon black and graphite fibres, in order to identify the optimal composition giving the best electrochemical performance. It was found that graphite fibres improve the electrochemical performance of the electrodes. The highest specific capacitance (267 F g⁻¹ of MnOx) was obtained with an electrode containing 70% of MnOx, 15% of carbon black, 10% of graphite fibres and 5% of PVDF. This electrode, with a CB/GF ratio of 1.5, showed a higher utilization of manganese oxide. The results reported in the present paper further confirm that manganese oxide is a very interesting material for supercapacitor applications. (author)

  6. Study and optimisation of SIMS performed with He+ and Ne+ bombardment

    International Nuclear Information System (INIS)

    Pillatsch, L.; Vanhove, N.; Dowsett, D.; Sijbrandij, S.; Notte, J.; Wirtz, T.

    2013-01-01

    The combination of the high-brightness He⁺/Ne⁺ atomic level ion source with the detection capabilities of secondary ion mass spectrometry (SIMS) opens up the prospect of obtaining chemical information with high lateral resolution and high sensitivity on the Zeiss ORION helium ion microscope (HIM). A feasibility study with He⁺ and Ne⁺ ion bombardment is presented in order to determine the performance of SIMS analyses using the HIM. Therefore, the sputtering yields, useful yields and detection limits obtained for metallic (Al, Ni and W) as well as semiconductor samples (Si, Ge, GaAs and InP) were investigated. All the experiments were performed on a Cameca IMS4f SIMS instrument which was equipped with a caesium evaporator and oxygen flooding system. For most of the elements, useful yields in the range of 10⁻⁴ to 3 × 10⁻² were measured with either O₂ or Cs flooding. SIMS experiments performed directly on the ORION with a prototype secondary ion extraction and detection system lead to results that are consistent with those obtained on the IMS4f. Taking into account the obtained useful yields and the analytical conditions, such as the ion current and typical dwell time on the ORION HIM, detection limits in the at% range and better can be obtained during SIMS imaging at 10 nm lateral resolution with Ne⁺ bombardment, and down to the ppm level when a lateral resolution of 100 nm is chosen. Performing SIMS on the HIM with a good detection limit while maintaining an excellent lateral resolution (<50 nm) is therefore very promising.
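
    A back-of-the-envelope sketch of how a detection limit follows from the useful yield (all numbers below are illustrative assumptions, not measurements from the paper): the expected secondary-ion counts from a pixel scale with the atomic density, the sputtered volume, the analyte concentration and the useful yield, and the detection limit is the concentration giving some minimum acceptable number of counts.

        # Illustrative estimate of a SIMS detection limit from the useful yield (assumed values).
        ATOM_DENSITY = 5.0e22     # atoms/cm^3, of the order of a typical solid matrix
        useful_yield = 1.0e-3     # assumed: detected ions per sputtered atom of the analyte
        pixel_size_cm = 10e-7     # 10 nm pixel
        depth_cm = 50e-7          # 50 nm analysed depth
        min_counts = 10           # counts needed to call a signal significant

        sputtered_atoms = ATOM_DENSITY * (pixel_size_cm ** 2) * depth_cm

        # counts = concentration * sputtered_atoms * useful_yield -> solve for concentration
        detection_limit = min_counts / (sputtered_atoms * useful_yield)
        print(f"sputtered atoms per pixel: {sputtered_atoms:.2e}")
        print(f"detection limit          : {detection_limit:.2e} (atomic fraction)"
              f" = {detection_limit * 100:.2f} at.%")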

  7. Comparative case study on website traffic generated by search engine optimisation and a pay-per-click campaign, versus marketing expenditure

    Directory of Open Access Journals (Sweden)

    Wouter T. Kritzinger

    2015-09-01

    Full Text Available Background: No empirical work was found on how marketing expenses compare when used solely for either the one or the other of the two main types of search engine marketing. Objectives: This research set out to determine how the results of the implementation of a pay-per-click campaign compared to those of a search engine optimisation campaign, given the same website and environment. At the same time, the expenses incurred on both these marketing methods were recorded and compared. Method: The active website of an existing, successful e-commerce concern was used as platform. The company had been using pay-per-click only for a period, whilst traffic was monitored. This system was decommissioned on a particular date and time, and an alternative search engine optimisation system was started at the same time. Again, both traffic and expenses were monitored. Results: The results indicate that the pay-per-click system did produce favourable results, but on the condition that a monthly fee has to be set aside to guarantee consistent traffic. The implementation of search engine optimisation required a relatively large investment at the outset, but it was once-off. After a drop in traffic owing to crawler visitation delays, the website traffic bypassed the average figure achieved during the pay-per-click period after a little over three months, whilst the expenditure crossed over after just six months. Conclusion: Whilst considering the specific parameters of this study, an investment in search engine optimisation rather than a pay-per-click campaign appears to produce better results at a lower cost, after a given period of time.

  8. Optimising conservative management of chronic low back pain: study protocol for a randomised controlled trial.

    Science.gov (United States)

    Simson, Katherine J; Miller, Clint T; Ford, Jon; Hahne, Andrew; Main, Luana; Rantalainen, Timo; Teo, Wei-Peng; Teychenne, Megan; Connell, David; Trudel, Guy; Zheng, Guoyan; Thickbroom, Gary; Belavy, Daniel L

    2017-04-20

    captured fortnightly by questionnaires. Chronic low back pain is ranked the highest disabling disorder in Australia. The findings of this study will inform clinical practice guidelines to assist with decision-making approaches where outcomes beyond pain are sought for adults with chronic low back pain. Australian New Zealand Clinical Trials Registry, ACTRN12615001270505. Registered on 20 November 2015.

  9. Physical aspects of thermotherapy: A study of heat transport with a view to treatment optimisation

    Science.gov (United States)

    Olsrud, Johan Karl Otto

    1998-12-01

    Local treatment with the aim of destroying tissue by heating (thermotherapy) may in some cases be an alternative or complement to surgical methods, and has gained increased interest during the last decade. The major advantage of these often minimally-invasive methods is that the disease can be controlled with reduced treatment trauma and complications. The extent of thermal damage is a complex function of the physical properties of tissue, which influence the temperature distribution, and of the biological response to heat. In this thesis, methods of obtaining a well-controlled treatment have been studied from a physical point of view, with emphasis on interstitial laser-induced heating of tumours in the liver and intracavitary heating as a treatment for menorrhagia. Hepatic inflow occlusion, in combination with temperature-feedback control of the output power of the laser, resulted in well-defined damaged volumes during interstitial laser thermotherapy in normal porcine liver. In addition, phantom experiments showed that the use of multiple diffusing laser fibres allows heating of clinically relevant tissue volumes in a single session. Methods for numerical simulation of heat transport were used to calculate the temperature distribution and the results agreed well with experiments. It was also found from numerical simulation that the influence of light transport on the damaged volume may be negligible in interstitial laser thermotherapy in human liver. Finite element analysis, disregarding light transport, was therefore proposed as a suitable method for 3D treatment planning. Finite element simulation was also used to model intracavitary heating of the uterus, with the purpose of providing an increased understanding of the influence of various treatment parameters on blood flow and on the depth of tissue damage. The thermal conductivity of human uterine tissue, which was used in these simulations, was measured. Furthermore, magnetic resonance imaging (MRI) was
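
    A minimal numerical sketch of the kind of heat-transport calculation involved: a 1D explicit finite-difference solution of the Pennes bioheat equation with generic, assumed tissue properties and an assumed heat source near the applicator. The thesis itself uses 3D finite-element models; this is only an illustration of the governing balance of conduction, perfusion and source terms.

        import numpy as np

        # 1D Pennes bioheat equation, explicit finite differences (illustrative parameters):
        # rho*c*dT/dt = k*d2T/dx2 + w_b*rho_b*c_b*(T_art - T) + q_source
        rho, c, k = 1050.0, 3600.0, 0.51       # tissue density (kg/m3), heat capacity, conductivity
        w_b = 0.0005                            # blood perfusion rate (1/s), assumed
        rho_b, c_b, T_art = 1060.0, 3800.0, 37.0
        L, n = 0.04, 201                        # 4 cm domain, number of grid points
        dx = L / (n - 1)
        dt = 0.4 * rho * c * dx**2 / (2 * k)    # stable explicit time step
        q = np.zeros(n)
        q[:5] = 5.0e5                           # W/m3 heat source near the applicator (x = 0)

        T = np.full(n, 37.0)                    # start at body temperature
        for _ in range(int(600 / dt)):          # simulate 10 minutes of heating
            lap = np.zeros(n)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
            T[1:-1] += dt / (rho * c) * (k * lap[1:-1]
                                         + w_b * rho_b * c_b * (T_art - T[1:-1])
                                         + q[1:-1])
            T[0] = T[1]                         # insulated at the applicator side
            T[-1] = 37.0                        # body temperature far from the source

        print(f"peak temperature after 10 min: {T.max():.1f} °C "
              f"at x = {np.argmax(T) * dx * 1000:.1f} mm")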

  10. Extraversion Is a Mediator of Gelotophobia: A Study of Autism Spectrum Disorder and the Big Five

    Directory of Open Access Journals (Sweden)

    Meng-Ning Tsai

    2018-02-01

    Full Text Available Previous research has shown that individuals with autism are frequently mocked in their childhood and are consequently more anxious about being ridiculed. Research has also shown that autistic individuals have a higher level of gelotophobia (fear of being laughed at) compared to typically developed individuals. However, recent studies have also found that gelotophobia is strongly related to personality, which suggests that personality is a factor that helps to create a higher level of gelotophobia in autistic individuals. To investigate whether this is the case, we recruited 279 Taiwanese high school students, 123 with autism spectrum disorder (ASD) and 156 typically developed students as a control group. Self-reporting questionnaires were used to gather data on the Big Five personality traits and on the gelotophobia-related traits of gelotophobia, gelotophilia, and katagelasticism. The results were analyzed and the two groups were compared for differences in gelotophobia and personality. The ASD group was found to have a higher level of gelotophobia than the typically developed group, but lower levels of gelotophilia and katagelasticism. Additionally, the ASD group was found to have lower levels of extraversion and agreeableness than the typically developed group, but no significant difference was found between the two groups in terms of conscientiousness, openness, and emotional stability. We then investigated the possible correlations between gelotophobia-related traits and the Big Five, and consequently the mediation effect of the Big Five on gelotophobia. The results show, firstly, that extraversion rather than ASD is a direct factor in gelotophobia. Secondly, the level of gelotophilia was partly influenced by autism but also to a certain extent by the level of extraversion. Lastly, the results indicate that autism and the level of agreeableness are in conflict when predicting the level of katagelasticism.

  11. Extraversion Is a Mediator of Gelotophobia: A Study of Autism Spectrum Disorder and the Big Five.

    Science.gov (United States)

    Tsai, Meng-Ning; Wu, Ching-Lin; Tseng, Lei-Pin; An, Chih-Pei; Chen, Hsueh-Chih

    2018-01-01

    Previous research has shown that individuals with autism are frequently mocked in their childhood and are consequently more anxious about being ridiculed. Research has also shown that autistic individuals have a higher level of gelotophobia (fear of being laughed at) compared to typically developed individuals. However, recent studies have also found that gelotophobia is strongly related to personality, which suggests that personality is a factor that helps to create a higher level of gelotophobia in autistic individuals. To investigate whether this is the case, we recruited 279 Taiwanese high school students, 123 with autism spectrum disorder (ASD) and 156 typically developed students as a control group. Self-reporting questionnaires were used to gather data on the Big Five personality traits and on the gelotophobia-related traits of gelotophobia, gelotophilia, and katagelasticism. The results were analyzed and the two groups were compared for differences in gelotophobia and personality. The ASD group was found to have a higher level of gelotophobia than the typically developed group, but lower levels of gelotophilia and katagelasticism. Additionally, the ASD group was found to have lower levels of extraversion and agreeableness than the typically developed group, but no significant difference was found between the two groups in terms of conscientiousness, openness, and emotional stability. We then investigated the possible correlations between gelotophobia-related traits and the Big Five, and consequently the mediation effect of the Big Five on gelotophobia. The results show, firstly, that extraversion rather than ASD is a direct factor in gelotophobia. Secondly, the level of gelotophilia was partly influenced by autism but also to a certain extent by the level of extraversion. Lastly, the results indicate that autism and the level of agreeableness are in conflict when predicting the level of katagelasticism.

  12. Advances in the optimisation of apparel heating products: A numerical approach to study heat transport through a blanket with an embedded smart heating system

    International Nuclear Information System (INIS)

    Neves, S.F.; Couto, S.; Campos, J.B.L.M.; Mayor, T.S.

    2015-01-01

    The optimisation of the performance of products with smart/active functionalities (e.g. in protective clothing, home textile products, automotive seats, etc.) is still a challenge for manufacturers and developers. The aim of this study was to optimise the thermal performance of a heating product by a numerical approach, by analysing several opposing requirements and defining solutions for the identified limitations, before the construction of the first prototype. A heat transfer model was developed to investigate the transport of heat from the skin to the environment, across a heating blanket with an embedded smart heating system. Several parameters of the textile material and of the heating system were studied, in order to optimise the thermal performance of the heating blanket. Focus was put on the effects of thickness and thermal conductivity of each layer, and on parameters associated with the heating elements, e.g. position of the heating wires relative to the skin, distance between heating wires, applied heating power, and temperature range for operation of the heating system. Furthermore, several configurations of the blanket (and corresponding heating powers) were analysed in order to minimise the heat loss from the body to the environment, and the temperature distribution along the skin. The results show that, to ensure an optimal compromise between the thermal performance of the product and the temperature oscillation along its surface, the distance between the wires should be small (and no bigger than 50 mm), and each layer of the heating blanket should have a specific thermal resistance, based on the expected external conditions during use and the requirements of the heating system (i.e. requirements regarding energy consumption/efficiency and capacity to effectively regulate body heat exchange with the surrounding environment). The heating system should operate in an ON/OFF mode based on the body's heating needs and within a temperature range specified based on
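
    A small sketch of the layer-sizing trade-off discussed above (the layer thicknesses, conductivities and set-points are assumptions for illustration, not the paper's data): treating the blanket as thermal resistances in series gives the heat flux from the skin to the environment and hence how much heating power per unit area the wires must add to hold a skin-side comfort temperature.

        # Steady-state series thermal-resistance sketch for a heated blanket (assumed values).
        T_skin_target = 33.0    # degC, desired skin-side temperature
        T_ambient = 5.0         # degC, cold environment
        layers = [              # (name, thickness m, thermal conductivity W/(m.K))
            ("inner textile", 0.002, 0.05),
            ("heating layer", 0.001, 0.20),
            ("outer textile", 0.004, 0.04),
        ]
        R_surface = 0.11        # m2.K/W, assumed outer convective/radiative surface resistance

        R_total = R_surface + sum(t / k for _, t, k in layers)
        heat_loss = (T_skin_target - T_ambient) / R_total   # W/m2 lost to the environment
        body_supply = 40.0                                   # W/m2, assumed share covered by body heat
        heater_power = max(0.0, heat_loss - body_supply)     # W/m2 the heating wires must add

        print(f"total thermal resistance : {R_total:.3f} m2.K/W")
        print(f"heat loss to environment : {heat_loss:.1f} W/m2")
        print(f"required heating power   : {heater_power:.1f} W/m2")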

  13. Study of alexithymia trait based on Big-Five Personality Dimensions

    Directory of Open Access Journals (Sweden)

    Rasoul Heshmati

    2017-12-01

    Full Text Available The purpose of this research was to study the relationship between the Big Five personality traits and alexithymia, and to determine how alexithymic individuals differ from non-alexithymic individuals in these personality traits among university students. In the present study, 150 university students at Tabriz University were selected and asked to answer the NEO Five-Factor Inventory (NEO-FFI) and the Toronto Alexithymia Scale (TAS-20). Results showed that there are negative and significant relationships between conscientiousness and openness to experience and alexithymia, and a positive and significant relationship between neuroticism and alexithymia. As well, there is a significant difference between alexithymic and non-alexithymic individuals in neuroticism and openness to experience. On the one hand, these results suggest that neuroticism, conscientiousness and openness to experience are determinants of alexithymia; on the other hand, a high level of neuroticism and a low level of openness to experience are characteristic of alexithymic people based on the Big Five. Therefore, it can be concluded that high neuroticism and low openness to experience are traits of alexithymic individuals.

  14. Geochemical study of acid mine drainage of the Big Lick Tunnel area, Williamstown, PA

    International Nuclear Information System (INIS)

    Tollin, S.

    1993-01-01

    Acid mine drainage in the anthracite region of Pennsylvania continues to be a significant environmental problem. This study examines the acid mine outflow from the Big Lick Tunnel, north of Williamstown, Dauphin County, Pennsylvania. The tunnel drains abandoned mines on the north side of Big Lick Mountain. Mining ceased in the area circa 1940, and the tunnel has been in operation since that time. The water, soil and stream bed sediment geochemistry has been studied to determine their changes in chemistry over distance. The pH, TDS and metal concentrations were the primary focus. Metal concentrations were determined using an ICP unit. Data indicate the pH of the outflow to range between 6.7 and 7.3. Fe and Mn concentrations are as high as 9.7 ppb. Extensive metal precipitation ("yellow boy") occurs within the tunnel and for several hundred meters from the mouth of the tunnel. The combination of near neutral pH and high metal concentrations suggests that the drainage is in contact with highly alkaline materials prior to discharge from the tunnel. The geology of the area does not suggest bedrock as the possible source of alkaline material. One hypothesis is that the acidic water is reacting with the concrete tunnel and being neutralized. Data also suggest that the Fe precipitates much more quickly than the Mn, resulting in a zonation between Fe-rich and Mn-rich sediments along the length of the drainage

  15. Optimisation of tungsten ore processing through a deep mineralogical characterisation and the study of the crushing process

    OpenAIRE

    Bascompte Vaquero, Jordi

    2017-01-01

    The unstoppable increase in global demand for metals calls for the urgent development of more efficient extraction and processing methods in the mining industry. Comminution is responsible for nearly half of the energy consumption of the entire mining process, and in the majority of cases it is far from being optimised. Within comminution, grinding is widely known to be less efficient than crushing; however, it is needed to reach liberation at ultrafine particle sizes. ...

  16. Focal psychodynamic therapy, cognitive behaviour therapy, and optimised treatment as usual in outpatients with anorexia nervosa (ANTOP study): randomised controlled trial.

    Science.gov (United States)

    Zipfel, Stephan; Wild, Beate; Groß, Gaby; Friederich, Hans-Christoph; Teufel, Martin; Schellberg, Dieter; Giel, Katrin E; de Zwaan, Martina; Dinkel, Andreas; Herpertz, Stephan; Burgmer, Markus; Löwe, Bernd; Tagay, Sefik; von Wietersheim, Jörn; Zeeck, Almut; Schade-Brittinger, Carmen; Schauenburg, Henning; Herzog, Wolfgang

    2014-01-11

    Psychotherapy is the treatment of choice for patients with anorexia nervosa, although evidence of efficacy is weak. The Anorexia Nervosa Treatment of OutPatients (ANTOP) study aimed to assess the efficacy and safety of two manual-based outpatient treatments for anorexia nervosa--focal psychodynamic therapy and enhanced cognitive behaviour therapy--versus optimised treatment as usual. The ANTOP study is a multicentre, randomised controlled efficacy trial in adults with anorexia nervosa. We recruited patients from ten university hospitals in Germany. Participants were randomly allocated to 10 months of treatment with either focal psychodynamic therapy, enhanced cognitive behaviour therapy, or optimised treatment as usual (including outpatient psychotherapy and structured care from a family doctor). The primary outcome was weight gain, measured as increased body-mass index (BMI) at the end of treatment. A key secondary outcome was rate of recovery (based on a combination of weight gain and eating disorder-specific psychopathology). Analysis was by intention to treat. This trial is registered at http://isrctn.org, number ISRCTN72809357. Of 727 adults screened for inclusion, 242 underwent randomisation: 80 to focal psychodynamic therapy, 80 to enhanced cognitive behaviour therapy, and 82 to optimised treatment as usual. At the end of treatment, 54 patients (22%) were lost to follow-up, and at 12-month follow-up a total of 73 (30%) had dropped out. At the end of treatment, BMI had increased in all study groups (focal psychodynamic therapy 0·73 kg/m(2), enhanced cognitive behaviour therapy 0·93 kg/m(2), optimised treatment as usual 0·69 kg/m(2)); no differences were noted between groups (mean difference between focal psychodynamic therapy and enhanced cognitive behaviour therapy -0·45, 95% CI -0·96 to 0·07; focal psychodynamic therapy vs optimised treatment as usual -0·14, -0·68 to 0·39; enhanced cognitive behaviour therapy vs optimised treatment as usual -0·30

  17. Marketing Communication of Big-Cola and Coca-Cola Products and Consumer Buying Interest (A Comparative Study of the Marketing Communication of Big-Cola and Coca-Cola Products and Consumer Buying Interest among Students at Universitas Sumatera Utara)

    OpenAIRE

    Harahap, Mirza swardani

    2015-01-01

    This research is titled "Marketing Communication of Big-Cola and Coca-Cola Products and Consumer Buying Interest among Students at Universitas Sumatera Utara". The purpose of this research is to determine the influence of the marketing communication of Big-Cola and Coca-Cola on the buying interest of students at Universitas Sumatera Utara. A correlational study was conducted to perceive the difference between, and to compare, the marketing communications of Big-Cola and Coca-Cola products with respect to the buying interest of s...

  18. Big cats as a model system for the study of the evolution of intelligence.

    Science.gov (United States)

    Borrego, Natalia

    2017-08-01

    Currently, carnivores, and felids in particular, are vastly underrepresented in the cognitive literature, despite being an ideal model system for tests of social and ecological intelligence hypotheses. Within Felidae, big cats (Panthera) are uniquely suited to studies investigating the evolutionary links between social, ecological, and cognitive complexity. Intelligence likely did not evolve in a unitary way but instead evolved as the result of mutually reinforcing feedback loops within the physical and social environments. The domain-specific social intelligence hypothesis proposes that social complexity drives the evolution of cognitive abilities adapted only to social domains. The domain-general hypothesis proposes that the unique demands of social life serve as a bootstrap for the evolution of superior general cognition. Big cats are one of the few systems in which we can directly address conflicting predictions of the domain-general and domain-specific hypotheses by comparing cognition among closely related species that face roughly equivalent ecological complexity but vary considerably in social complexity. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Study on key technologies of optimization of big data for thermal power plant performance

    Science.gov (United States)

    Mao, Mingyang; Xiao, Hong

    2018-06-01

    Thermal power generation accounts for 70% of China's power generation, and its pollutants account for 40% of emissions of the same kind. Optimising thermal power efficiency requires monitoring and understanding the whole process of coal combustion and pollutant migration, while power system performance data show an explosive growth trend. The purpose is to study the integration of numerical simulation with big data technology, and to develop a thermal power plant efficiency data optimization platform and a nitrogen oxide emission reduction system, providing reliable technical support for improving efficiency, saving energy and reducing emissions in thermal power plants. The method is big data technology, represented by "multi-source heterogeneous data integration", "large data distributed storage" and "high-performance real-time and off-line computing", which can greatly enhance the energy consumption analysis capacity and intelligent decision-making level of thermal power plants. Data mining algorithms are then used to establish a mathematical model of boiler combustion and to mine boiler efficiency data, combined with numerical simulation technology, in order to find the laws of boiler combustion and pollutant generation and the influence of combustion parameters on them. The result is optimized boiler combustion parameters, which can achieve energy saving.

  20. How Can Big Data Complement Expert Analysis? A Value Chain Case Study

    Directory of Open Access Journals (Sweden)

    Kyungtae Kim

    2018-03-01

    Full Text Available In the world of big data, there is a need to investigate how data-driven approaches can support expert-based analyses during a technology planning process. To meet this goal, we examined opportunities and challenges for big data analytics in the social sciences, particularly with respect to value chain analysis. To accomplish this, we designed a value chain mapping experiment that aimed to compare the results of expert-based and data-based mappings. In the expert-based approach, we asked an industry expert to visually depict an industry value chain based on insights and collected data. We also reviewed a previously published value chain developed by a panel of industry experts during a national technology planning process. In the data-driven analysis, we used a massive number of business transaction records between companies under the assumption that the data would be useful in identifying relationships between items in a value chain. The case study results demonstrated that data-driven analysis can help researchers understand the current status of industry structures, enabling them to develop more realistic, although less flexible value chain maps. This approach is expected to provide more value when used in combination with other databases. It is important to note that significant effort is required to develop an elaborate analysis algorithm, and data preprocessing is essential for obtaining meaningful results, both of which make this approach challenging. Experts’ insights are still helpful for validating the analytic results in value chain mapping.

  1. Topology optimisation of natural convection problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Aage, Niels; Andreasen, Casper Schousboe

    2014-01-01

    This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equations...... coupled to the convection-diffusion equation through the Boussinesq approximation. In order to facilitate topology optimisation, the Brinkman approach is taken to penalise velocities inside the solid domain and the effective thermal conductivity is interpolated in order to accommodate differences...... in thermal conductivity of the solid and fluid phases. The governing equations are discretised using stabilised finite elements and topology optimisation is performed for two different problems using discrete adjoint sensitivity analysis. The study shows that topology optimisation is a viable approach...
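
    As a rough illustration of the modelling approach summarised above (written in a generic textbook form; the exact interpolation functions and parameter values used by the authors are not given in this record and are assumed here), the Brinkman term penalises velocities inside solid regions and the effective conductivity is interpolated between solid and fluid values through a design field gamma in [0, 1]:

      \rho\,(\mathbf{u}\cdot\nabla)\mathbf{u} = -\nabla p + \mu\nabla^{2}\mathbf{u} - \alpha(\gamma)\,\mathbf{u} + \rho\beta\,(T - T_{0})\,\mathbf{g},
      \qquad
      \alpha(\gamma) = \alpha_{\max}\,\frac{1-\gamma}{1+q\,\gamma},
      \qquad
      k_{\mathrm{eff}}(\gamma) = k_{s} + \gamma\,(k_{f} - k_{s})

    With these (assumed) interpolations, gamma = 0 gives alpha = alpha_max (flow suppressed, solid conductivity) and gamma = 1 gives alpha = 0 (free fluid), which is the behaviour the abstract describes.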

  2. A study on effect of big five personality traits on emotional intelligence

    Directory of Open Access Journals (Sweden)

    Hamed Dehghanan

    2014-06-01

    Full Text Available This paper presents a study to investigate the effects of the big five personality traits on emotional intelligence in some Iranian firms located in the city of Tehran, Iran. The proposed study uses two questionnaires: one originally developed by McCrae and Costa (1992) [McCrae, R. R., & Costa, P. T., Jr. (1992). Discriminant validity of NEO-PI-R facet scales. Educational and Psychological Measurement, 52, 229-237.] for measuring personality traits, and another used for measuring emotional intelligence. The first questionnaire consists of five personality categories: extraversion, agreeableness, conscientiousness, emotional stability versus neuroticism, and openness. Using structural equation modeling and a stepwise regression model, the study has detected a positive and meaningful relationship between four components, namely extraversion, agreeableness, conscientiousness and openness, and emotional intelligence. In addition, the study detects a negative and meaningful relationship between neuroticism and emotional intelligence.

  3. Optimisation of load control

    International Nuclear Information System (INIS)

    Koponen, P.

    1998-01-01

    Electricity cannot be stored in large quantities. That is why the electricity supply and consumption are always almost equal in large power supply systems. If this balance were disturbed beyond stability, the system or a part of it would collapse until a new stable equilibrium is reached. The balance between supply and consumption is mainly maintained by controlling the power production, but also the electricity consumption or, in other words, the load is controlled. Controlling the load of the power supply system is important, if easily controllable power production capacity is limited. Temporary shortage of capacity causes high peaks in the energy price in the electricity market. Load control either reduces the electricity consumption during peak consumption and peak price or moves electricity consumption to some other time. The project Optimisation of Load Control is a part of the EDISON research program for distribution automation. The following areas were studied: Optimization of space heating and ventilation, when electricity price is time variable, load control model in power purchase optimization, optimization of direct load control sequences, interaction between load control optimization and power purchase optimization, literature on load control, optimization methods and field tests and response models of direct load control and the effects of the electricity market deregulation on load control. An overview of the main results is given in this chapter

  4. Optimisation of load control

    Energy Technology Data Exchange (ETDEWEB)

    Koponen, P [VTT Energy, Espoo (Finland)

    1998-08-01

    Electricity cannot be stored in large quantities. That is why the electricity supply and consumption are always almost equal in large power supply systems. If this balance were disturbed beyond stability, the system or a part of it would collapse until a new stable equilibrium is reached. The balance between supply and consumption is mainly maintained by controlling the power production, but also the electricity consumption or, in other words, the load is controlled. Controlling the load of the power supply system is important, if easily controllable power production capacity is limited. Temporary shortage of capacity causes high peaks in the energy price in the electricity market. Load control either reduces the electricity consumption during peak consumption and peak price or moves electricity consumption to some other time. The project Optimisation of Load Control is a part of the EDISON research program for distribution automation. The following areas were studied: Optimization of space heating and ventilation, when electricity price is time variable, load control model in power purchase optimization, optimization of direct load control sequences, interaction between load control optimization and power purchase optimization, literature on load control, optimization methods and field tests and response models of direct load control and the effects of the electricity market deregulation on load control. An overview of the main results is given in this chapter
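
    The report describes the optimisation tasks only at a high level; purely as an illustrative sketch of the kind of load-shifting problem involved (the prices, powers and the use of scipy below are assumptions, not taken from the project), a time-variable electricity price can be exploited with a small linear programme that schedules a flexible heating load into the cheapest hours:

      # Minimal, hypothetical sketch of price-driven load shifting (not from the report).
      import numpy as np
      from scipy.optimize import linprog

      price = np.array([30.0, 28.0, 25.0, 40.0, 55.0, 45.0])  # assumed hourly prices, EUR/MWh
      p_max = 2.0          # assumed maximum heating power per hour, MW
      energy_needed = 6.0  # assumed total heating energy to deliver, MWh

      # minimise sum_t price[t] * p[t]  subject to  sum_t p[t] = energy_needed, 0 <= p[t] <= p_max
      res = linprog(
          c=price,
          A_eq=np.ones((1, len(price))),
          b_eq=[energy_needed],
          bounds=[(0.0, p_max)] * len(price),
      )
      print("hourly schedule [MWh]:", res.x)
      print("total cost [EUR]:", res.fun)

    The solver simply fills the cheapest hours first, which is the intuition behind moving consumption away from peak-price periods.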

  5. SPS batch spacing optimisation

    CERN Document Server

    Velotti, F M; Carlier, E; Goddard, B; Kain, V; Kotzian, G

    2017-01-01

    Until 2015, the LHC filling schemes used the batch spacing as specified in the LHC design report. The maximum number of bunches injectable in the LHC directly depends on the batch spacing at injection in the SPS and hence on the MKP rise time. As part of the LHC Injectors Upgrade project for LHC heavy ions, a reduction of the batch spacing is needed. In this direction, studies to approach the MKP design rise time of 150 ns (2-98%) have been carried out. These measurements gave clear indications that such optimisation, and beyond, could be done also for higher injection momentum beams, where the additional slower MKP (MKP-L) is needed. After the successful results from the 2015 SPS batch spacing optimisation for the Pb-Pb run [1], the same concept was thought to be used also for proton beams. In fact, thanks to the SPS transverse feedback, it was already observed that a lower batch spacing than the design one (225 ns) could be achieved. For the 2016 p-Pb run, a batch spacing of 200 ns for the proton beam with 100 ns bunch spacing was reque...

  6. A study and analysis of recommendation systems for location-based social network (LBSN with big data

    Directory of Open Access Journals (Sweden)

    Murale Narayanan

    2016-03-01

    Full Text Available Recommender systems play an important role in our day-to-day life. A recommender system automatically suggests an item to a user that he/she might be interested in. Small-scale datasets are used to provide recommendations based on location, but in real time, the volume of data is large. We have selected Foursquare dataset to study the need for big data in recommendation systems for location-based social network (LBSN. A few quality parameters like parallel processing and multimodal interface have been selected to study the need for big data in recommender systems. This paper provides a study and analysis of quality parameters of recommendation systems for LBSN with big data.

  7. Advanced Research and Data Methods in Women's Health: Big Data Analytics, Adaptive Studies, and the Road Ahead.

    Science.gov (United States)

    Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika

    2017-02-01

    Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.

  8. Optimising Magnetostatic Assemblies

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Smith, Anders

    theorem. This theorem formulates an energy equivalence principle with several implications concerning the optimisation of objective functionals that are linear with respect to the magnetic field. Linear functionals represent different optimisation goals, e.g. maximising a certain component of the field...... approached employing a heuristic algorithm, which led to new design concepts. Some of the procedures developed for linear objective functionals have been extended to non-linear objectives, by employing iterative techniques. Even though most of the optimality results discussed in this work have been derived...

  9. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  10. Study on the application of big cup membranate stent on restenosis after stenting for carcinoma of esophago cardia

    International Nuclear Information System (INIS)

    Wang Xiuping; Yao Zhongqiang; Liu Jian; Zhang Yan

    2007-01-01

    Objective: To evaluate the clinical value of a self-designed big cup membranate stent for restenosis after stenting for carcinoma of the esophago-cardia. Methods: 12 cases of restenosis after stenting for carcinoma of the esophago-cardia were involved in the study. A self-designed big cup membranate stent made by Nanjing Weichuang Company (the length of the cup was 3.5 cm) was placed into the constricted stent under fluoroscopic guidance. Clinical effect, restenosis, and complications were observed during follow-up. Results: All 12 cases of big cup membranate stent placement went smoothly, without indigitation of the cup of the stent. Follow-up of 1.5-8 months showed that 2 cases developed severe restenosis at the big cup of the stent, resulting in third-grade difficult deglutition. Among them, one occurred 1 month after stenting, caused by hyperplasia of a large amount of granulation tissue; another occurred 6 months after stenting, caused by growth of tumor tissue. 3 cases developed mild to moderate restenosis, 2.3-7 months (mean: 4.6 months) after stenting, resulting in first-grade difficult deglutition. The remaining 7 cases (mean 5.6 months of follow-up) did not have difficulty during deglutition. Conclusions: Application of the big cup membranate stent for restenosis after stenting for carcinoma of the esophago-cardia can effectively prevent the stent from moving downwards, thus lowering the rate of restenosis and postponing its occurrence. (authors)

  11. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Full Text Available Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formation before Big Data it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of a meaningful public deliberation. Understanding, and challenging, Big Data requires an attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  12. Opening the Big Black Box: European study reveals visitors' impressions of science laboratories

    CERN Multimedia

    2004-01-01

    "On 29 - 30 March the findings of 'Inside the Big Black Box'- a Europe-wide science and society project - will be revealed during a two-day seminar hosted by CERN*. The principle aim of Inside the Big Black Box (IN3B) is to determine whether a working scientific laboratory can capture the curiosity of the general public through visits" (1 page)

  13. How to Use TCM Informatics to Study Traditional Chinese Medicine in Big Data Age.

    Science.gov (United States)

    Shi, Cheng; Gong, Qing-Yue; Zhou, Jinhai

    2017-01-01

    This paper introduces the characteristics and complexity of traditional Chinese medicine (TCM) data, considers that modern big data processing technology has brought new opportunities for the research of TCM, and gives some ideas and methods to apply big data technology in TCM.

  14. Dose optimisation in single plane interstitial brachytherapy

    DEFF Research Database (Denmark)

    Tanderup, Kari; Hellebust, Taran Paulsen; Honoré, Henriette Benedicte

    2006-01-01

    BACKGROUND AND PURPOSE: Brachytherapy dose distributions can be optimised by modulation of source dwell times. In this study dose optimisation in single planar interstitial implants was evaluated in order to quantify the potential benefit in patients. MATERIAL AND METHODS: In 14 patients, treated for recurrent rectal and cervical cancer, flexible catheters were sutured intra-operatively to the tumour bed in areas with compromised surgical margin. Both non-optimised, geometrically and graphically optimised CT-based dose plans were made. The overdose index... on the regularity of the implant, such that the benefit of optimisation was larger for irregular implants. OI and HI correlated strongly with target volume, limiting the usability of these parameters for comparison of dose plans between patients. CONCLUSIONS: Dwell time optimisation significantly...

  15. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  16. Experimental Study on the Compressive Strength of Big Mobility Concrete with Nondestructive Testing Method

    Directory of Open Access Journals (Sweden)

    Huai-Shuai Shang

    2012-01-01

    Full Text Available An experimental study of C20, C25, C30, C40, and C50 big mobility concrete cubes from the laboratory and from a construction site was completed. Nondestructive testing (NDT) was carried out using impact rebound hammer (IRH) techniques to establish a correlation between the compressive strengths and the rebound number. A local curve for measuring strength was set up by the regression method, and its superiority is demonstrated. The rebound method presented is simple, quick, and reliable and covers wide ranges of concrete strengths. The rebound method can be easily applied to concrete specimens as well as existing concrete structures. The final results were compared with previous ones from the literature and also with actual results obtained from samples extracted from existing structures.
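
    As a purely illustrative sketch of how such a local strength curve can be fitted (the data points and the power-law form below are invented for illustration and are not the regression reported in the study), the rebound number R can be related to cube strength f_c by a simple curve fit:

      # Hypothetical example of fitting a local rebound-number/strength curve.
      import numpy as np
      from scipy.optimize import curve_fit

      R = np.array([22.0, 26.0, 30.0, 34.0, 38.0, 42.0])   # rebound numbers (invented)
      fc = np.array([18.5, 24.0, 30.5, 38.0, 45.5, 53.0])  # cube strength, MPa (invented)

      def strength(R, a, b):
          return a * R ** b   # assumed power-law form f_c = a * R^b

      (a, b), _ = curve_fit(strength, R, fc, p0=(0.1, 1.5))
      print(f"fitted local curve: f_c = {a:.3f} * R^{b:.2f}")
      print("predicted strength at R = 36:", strength(36.0, a, b))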

  17. A longitudinal study of the relationships between the Big Five personality traits and body size perception.

    Science.gov (United States)

    Hartmann, Christina; Siegrist, Michael

    2015-06-01

    The present study investigated the longitudinal development of body size perception in relation to different personality traits. A sample of Swiss adults (N=2905, 47% men), randomly selected from the telephone book, completed a questionnaire on two consecutive years (2012, 2013). Body size perception was assessed with the Contour Drawing Rating Scale and personality traits were assessed with a short version of the Big Five Inventory. Longitudinal analysis of change indicated that men and women scoring higher on conscientiousness perceived themselves as thinner one year later. In contrast, women scoring higher on neuroticism perceived their body size as larger one year later. No significant effect was observed for men scoring higher on neuroticism. These results were independent of weight changes, body mass index, age, and education. Our findings suggest that personality traits contribute to body size perception among adults. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Using Big Data in oncology to prospectively impact clinical patient care: A proof of concept study.

    Science.gov (United States)

    Dougoud-Chauvin, Vérène; Lee, Jae Jin; Santos, Edgardo; Williams, Vonetta L; Battisti, Nicolò M L; Ghia, Kavita; Sehovic, Marina; Croft, Cortlin; Kim, Jongphil; Balducci, Lodovico; Kish, Julie A; Extermann, Martine

    2018-04-17

    Big Data is widely seen as a major opportunity for progress in the practice of personalized medicine, attracting the attention from medical societies and presidential teams alike as it offers a unique opportunity to enlarge the base of evidence, especially for older patients underrepresented in clinical trials. This study prospectively assessed the real-time availability of clinical cases in the Health & Research Informatics Total Cancer Care™ (TCC) database matching community patients with cancer, and the impact of such a consultation on treatment. Patients aged 70 and older seen at the Lynn Cancer Institute (LCI) with a documented malignancy were eligible. Geriatric screening information and the oncologist's pre-consultation treatment plan were sent to Moffitt. A search for similar patients was done in TCC and additional information retrieved from Electronic Medical Records. A report summarizing the data was sent and the utility of such a consultation was assessed per email after the treatment decision. Thirty one patients were included. The geriatric screening was positive in 87.1% (27) of them. The oncogeriatric consultation took on average 2.2 working days. It influenced treatment in 38.7% (12), and modified it in 19.4% (6). The consultation was perceived as "somewhat" to "very useful" in 83.9% (26). This study establishes a proof of concept of the feasibility of real time use of Big Data for clinical practice. The geriatric screening and the consultation report influenced treatment in 38.7% of cases and modified it in 19.4%, which compares very well with oncogeriatric literature. Additional steps are needed to render it financially and clinically viable. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  20. Cholesterol, Cholesterol-Lowering Medication Use, and Breast Cancer Outcome in the BIG 1-98 Study

    DEFF Research Database (Denmark)

    Borgquist, Signe; Giobbie-Hurder, Anita; Ahern, Thomas P

    2017-01-01

    on cholesterol levels and hypercholesterolemia per se may counteract the intended effect of aromatase inhibitors. Patients and Methods The Breast International Group (BIG) conducted a randomized, phase III, double-blind trial, BIG 1-98, which enrolled 8,010 postmenopausal women with early-stage, hormone receptor-positive invasive breast cancer from 1998 to 2003. Systemic levels of total cholesterol and use of CLM were measured at study entry and every 6 months up to 5.5 years. Cumulative incidence functions were used to describe the initiation of CLM in the presence of competing risks. Marginal structural Cox proportional...

  1. Impact of physical exercise on reaction time in patients with Parkinson's disease-data from the Berlin BIG Study.

    Science.gov (United States)

    Ebersbach, Georg; Ebersbach, Almut; Gandor, Florin; Wegner, Brigitte; Wissel, Jörg; Kupsch, Andreas

    2014-05-01

    To determine whether physical activity may affect cognitive performance in patients with Parkinson's disease by measuring reaction times in patients participating in the Berlin BIG study. Randomized controlled trial, rater-blinded. Ambulatory care. Patients with mild to moderate Parkinson's disease (N=60) were randomly allocated to 3 treatment arms. Outcome was measured at the termination of training and at follow-up 16 weeks after baseline in 58 patients (completers). Patients received 16 hours of individual Lee Silverman Voice Treatment-BIG training (BIG; duration of treatment, 4wk), 16 hours of group training with Nordic Walking (WALK; duration of treatment, 8wk), or nonsupervised domestic exercise (HOME; duration of instruction, 1hr). Cued reaction time (cRT) and noncued reaction time (nRT). Differences between treatment groups in improvement in reaction times from baseline to intermediate and baseline to follow-up assessments were observed for cRT but not for nRT. Pairwise t test comparisons revealed differences in change in cRT at both measurements between BIG and HOME groups (intermediate: -52ms; 95% confidence interval [CI], -84/-20; P=.002; follow-up: 55ms; CI, -105/-6; P=.030) and between WALK and HOME groups (intermediate: -61ms; CI, -120/-2; P=.042; follow-up: -78ms; CI, -136/-20; P=.010). There was no difference between BIG and WALK groups (intermediate: 9ms; CI, -49/67; P=.742; follow-up: 23ms; CI, -27/72; P=.361). Supervised physical exercise with Lee Silverman Voice Treatment-BIG or Nordic Walking is associated with improvement in cognitive aspects of movement preparation. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  2. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  3. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  4. The Interplay of "Big Five" Personality Factors and Metaphorical Schemas: A Pilot Study with 20 Lung Transplant Recipients

    Science.gov (United States)

    Goetzmann, Lutz; Moser, Karin S.; Vetsch, Esther; Grieder, Erhard; Klaghofer, Richard; Naef, Rahel; Russi, Erich W.; Boehler, Annette; Buddeberg, Claus

    2007-01-01

    The aim of the present study was to investigate the interplay between personality factors and metaphorical schemas. The "Big Five" personality factors of 20 patients after lung transplantation were examined with the NEO-FFI. Patients were questioned about their social network, and self- and body-image. The interviews were assessed with metaphor…

  5. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction

    Directory of Open Access Journals (Sweden)

    Nigsch Florian

    2008-10-01

    Full Text Available Abstract Background We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024–1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581–590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Results Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6°C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, ε of 0.21) and an RMSE of 45.1°C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3°C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5°C, R2 of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. Conclusion With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.

  6. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction.

    Science.gov (United States)

    O'Boyle, Noel M; Palmer, David S; Nigsch, Florian; Mitchell, John Bo

    2008-10-29

    We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024-1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581-590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6 degrees C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, epsilon of 0.21) and an RMSE of 45.1 degrees C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3 degrees C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5 degrees C, R2 of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.
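
    A heavily simplified, illustrative sketch of an ant-colony-style feature selection loop with moving selection probabilities and a winnowing-like clipping step is given below. It is written in the spirit of the WAAC idea rather than as a reimplementation of it: the data are random, the model is plain least squares instead of PLS or SVM, and all parameter values are assumptions.

      # Simplified ant-colony-style feature selection (illustrative only).
      import numpy as np

      rng = np.random.default_rng(0)
      n_samples, n_features = 200, 30
      X = rng.normal(size=(n_samples, n_features))
      true_coef = np.zeros(n_features)
      true_coef[:5] = rng.normal(size=5)               # only 5 features are informative
      y = X @ true_coef + 0.1 * rng.normal(size=n_samples)

      def cv_rmse(mask, X, y, n_splits=5):
          """Cross-validated RMSE of a least-squares model on the selected features."""
          fold = np.arange(len(y)) % n_splits
          errs = []
          for k in range(n_splits):
              tr, te = fold != k, fold == k
              coef, *_ = np.linalg.lstsq(X[tr][:, mask], y[tr], rcond=None)
              pred = X[te][:, mask] @ coef
              errs.append(np.sqrt(np.mean((y[te] - pred) ** 2)))
          return float(np.mean(errs))

      prob = np.full(n_features, 0.5)                  # per-feature selection probability
      n_ants, n_iter = 20, 30
      for _ in range(n_iter):
          masks = [rng.random(n_features) < prob for _ in range(n_ants)]
          masks = [m if m.any() else np.ones(n_features, bool) for m in masks]
          scores = [cv_rmse(m, X, y) for m in masks]
          best = masks[int(np.argmin(scores))]
          prob = 0.7 * prob + 0.3 * best               # move probabilities towards the best ant
          prob = np.clip(prob, 0.05, 0.95)             # crude stand-in for the winnowing step

      print("features with high selection probability:", np.where(prob > 0.5)[0])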

  7. Optimisation of the LHCb detector

    CERN Document Server

    Hierck, R H

    2003-01-01

    This thesis describes a comparison of the LHCb classic and LHCb light concept from a tracking perspective. The comparison includes the detector occupancies, the various pattern recognition algorithms and the reconstruction performance. The final optimised LHCb setup is used to study the physics performance of LHCb for the Bs->DsK and Bs->DsPi decay channels. This includes both the event selection and a study of the sensitivity for the Bs oscillation frequency, delta m_s, the Bs lifetime difference, DGamma_s, and the CP parameter gamma-2delta gamma.

  8. Implementing large-scale programmes to optimise the health workforce in low- and middle-income settings: a multicountry case study synthesis.

    Science.gov (United States)

    Gopinathan, Unni; Lewin, Simon; Glenton, Claire

    2014-12-01

    To identify factors affecting the implementation of large-scale programmes to optimise the health workforce in low- and middle-income countries. We conducted a multicountry case study synthesis. Eligible programmes were identified through consultation with experts and using Internet searches. Programmes were selected purposively to match the inclusion criteria. Programme documents were gathered via Google Scholar and PubMed and from key informants. The SURE Framework - a comprehensive list of factors that may influence the implementation of health system interventions - was used to organise the data. Thematic analysis was used to identify the key issues that emerged from the case studies. Programmes from Brazil, Ethiopia, India, Iran, Malawi, Venezuela and Zimbabwe were selected. Key system-level factors affecting the implementation of the programmes were related to health worker training and continuing education, management and programme support structures, the organisation and delivery of services, community participation, and the sociopolitical environment. Existing weaknesses in health systems may undermine the implementation of large-scale programmes to optimise the health workforce. Changes in the roles and responsibilities of cadres may also, in turn, impact the health system throughout. © 2014 John Wiley & Sons Ltd.

  9. Comparison of Oncentra® Brachy IPSA and graphical optimisation techniques: a case study of HDR brachytherapy head and neck and prostate plans

    International Nuclear Information System (INIS)

    Jameson, Michael G; Ohanessian, Lucy; Batumalai, Vikneswary; Patel, Virendra; Holloway, Lois C

    2015-01-01

    There are a number of different dwell positions and time optimisation options available in the Oncentra® Brachy (Elekta Brachytherapy Solutions, Veenendaal, The Netherlands) brachytherapy treatment planning system. The purpose of this case study was to compare graphical (GRO) and inverse planning by simulated annealing (IPSA) optimisation techniques for interstitial head and neck (HN) and prostate plans considering dosimetry, modelled radiobiology outcome and planning time. Four retrospective brachytherapy patients were chosen for this study, two recurrent HN and two prostatic boosts. Manual GRO and IPSA plans were generated for each patient. Plans were compared using dose–volume histograms (DVH) and dose coverage metrics including; conformity index (CI), homogeneity index (HI) and conformity number (CN). Logit and relative seriality models were used to calculate tumour control probability (TCP) and normal tissue complication probability (NTCP). Approximate planning time was also recorded. There was no significant difference between GRO and IPSA in terms of dose metrics with mean CI of 1.30 and 1.57 (P > 0.05) respectively. IPSA achieved an average HN TCP of 0.32 versus 0.12 for GRO while for prostate there was no significant difference. Mean GRO planning times were greater than 75 min while average IPSA planning times were less than 10 min. Planning times for IPSA were greatly reduced compared to GRO and plans were dosimetrically similar. For this reason, IPSA makes for a useful planning tool in HN and prostate brachytherapy
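
    The abstract does not state how the reported indices are defined; one widely used set of definitions in the brachytherapy literature (given here for orientation only, and not necessarily the exact formulas used in this case study) is

      HI = \frac{V_{100} - V_{150}}{V_{100}},
      \qquad
      CN = \frac{TV_{\mathrm{ref}}}{TV} \times \frac{TV_{\mathrm{ref}}}{V_{\mathrm{ref}}}

    where V_{100} and V_{150} are the volumes receiving at least 100% and 150% of the prescribed dose, TV is the target volume, TV_{\mathrm{ref}} is the part of the target receiving at least the reference dose, and V_{\mathrm{ref}} is the total volume receiving at least the reference dose.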

  10. On Study of Application of Big Data and Cloud Computing Technology in Smart Campus

    Science.gov (United States)

    Tang, Zijiao

    2017-12-01

    We live in an era of networks and information, which means that we produce and face a large amount of data every day; however, it is not easy for databases in the traditional sense to store, process and analyze such mass data, and therefore big data was born at the right moment. Meanwhile, the development and operation of big data rest on cloud computing, which provides sufficient space and resources for big data technology to process and analyze data. Nowadays, the proposal of smart campus construction aims at improving the informatization of colleges and universities, and it is therefore necessary to consider combining big data technology and cloud computing technology in the construction of a smart campus, so that the campus database system and campus management system are mutually combined rather than isolated, and to serve smart campus construction through integrating, storing, processing and analyzing mass data.

  11. A methodological approach to the design of optimising control strategies for sewer systems

    DEFF Research Database (Denmark)

    Mollerup, Ane Loft; Mikkelsen, Peter Steen; Sin, Gürkan

    2016-01-01

    This study focuses on designing an optimisation-based control for a sewer system in a methodological way and linking it to a regulatory control. Optimisation-based design is found to depend on proper choice of a model, formulation of the objective function and tuning of optimisation parameters. Accordingly, two novel optimisation configurations are developed, where the optimisation either acts on the actuators or acts on the regulatory control layer. These two optimisation designs are evaluated on a sub-catchment of the sewer system in Copenhagen, and found to perform better than the existing...
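
    Purely as an illustrative sketch of the kind of optimisation step such a controller could solve at each time instant (the structure, the numbers and the use of scipy are assumptions, not taken from the study), inflow can be allocated between treatment, storage and overflow with overflow penalised most heavily:

      # Hypothetical single-step flow allocation for a sewer sub-catchment.
      import numpy as np
      from scipy.optimize import linprog

      inflow = 12.0     # current inflow to be allocated, m3/s (assumed)
      cap_plant = 6.0   # treatment plant capacity, m3/s (assumed)
      cap_basin = 4.0   # remaining storage-basin intake capacity, m3/s (assumed)

      # decision variables: x = [to_plant, to_basin, overflow]
      c = np.array([0.0, 0.1, 10.0])        # overflow penalised far more than storage use
      A_eq = np.array([[1.0, 1.0, 1.0]])    # mass balance: all inflow must go somewhere
      b_eq = np.array([inflow])
      bounds = [(0.0, cap_plant), (0.0, cap_basin), (0.0, None)]

      res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      to_plant, to_basin, overflow = res.x
      print(f"to plant {to_plant:.1f}, to basin {to_basin:.1f}, overflow {overflow:.1f} m3/s")

    In an actual optimisation-based (e.g. model predictive) setup this allocation would be solved over a prediction horizon with a sewer model in the constraints, but the objective-plus-constraints structure is the same idea.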

  12. How big data analytics affect decision-making : A study of the newspaper industry

    OpenAIRE

    Björkman, Filip; Franco, Sebastian

    2017-01-01

    Big data analytics is a topic that is surrounded by a lot of enthusiasm and hype among both researchers and practitioners and is quickly being applied in different industries. The purpose of the thesis is to investigate the emerging technology of big data analytics and how it affects decision-making. In order to investigate this, we conducted empirical research in the newspaper industry, which is an industry that is going through a crisis with decreasing revenues, old business models collapsi...

  13. A study on decision-making of food supply chain based on big data

    OpenAIRE

    Ji, Guojun; Hu, Limei; Tan, Kim Hua

    2017-01-01

    As more and more companies have captured and analyzed huge volumes of data to improve the performance of supply chain, this paper develops a big data harvest model that uses big data as inputs to make more informed production decisions in the food supply chain. By introducing a method of Bayesian network, this paper integrates sample data and finds a cause-and-effect between data to predict market demand. Then the deduction graph model that translates products demand into processes and divide...

  14. The Peculiarities of Functioning of the Infrastructure in the Russian Big Cities (Study in Sterlitamak

    Directory of Open Access Journals (Sweden)

    E V Frolova

    2012-06-01

    Full Text Available The development of the infrastructure in modern municipal entities is one of the most pressing problems of the Russian state. This article focuses on the analysis of the results of researches, which describe the characteristics of the infrastructure in big cities of Russia. The poor condition of urban roads, poor quality of health services, inefficient functioning of the culture and leisure complex - these problems are very important today for the population of big cities.

  15. A study of the Internet of things and RFID technology: big data in Navy medicine

    OpenAIRE

    Trainor, Gill S.

    2017-01-01

    Approved for public release; distribution is unlimited The Internet of things (IoT) and big data describe the movement of many industries toward a focus on data collection, communication and analysis. Tools such as radio frequency identification (RFID) are leveraged in order to maximize the potential benefits gained from the IoT and big data, connecting devices to one another and providing end users meaningful ways to interact with technology. The healthcare industry has revolutionized its...

  16. Development of an optimised 1:1 physiotherapy intervention post first-time lumbar discectomy: a mixed-methods study

    Science.gov (United States)

    Rushton, A; White, L; Heap, A; Heneghan, N; Goodwin, P

    2016-01-01

    Objectives To develop an optimised 1:1 physiotherapy intervention that reflects best practice, with flexibility to tailor management to individual patients, thereby ensuring patient-centred practice. Design Mixed-methods combining evidence synthesis, expert review and focus groups. Setting Secondary care involving 5 UK specialist spinal centres. Participants A purposive panel of clinical experts from the 5 spinal centres, comprising spinal surgeons, inpatient and outpatient physiotherapists, provided expert review of the draft intervention. Purposive samples of patients (n=10) and physiotherapists (n=10) (inpatient/outpatient physiotherapists managing patients with lumbar discectomy) were invited to participate in the focus groups at 1 spinal centre. Methods A draft intervention developed from 2 systematic reviews; a survey of current practice and research related to stratified care was circulated to the panel of clinical experts. Lead physiotherapists collaborated with physiotherapy and surgeon colleagues to provide feedback that informed the intervention presented at 2 focus groups investigating acceptability to patients and physiotherapists. The focus groups were facilitated by an experienced facilitator, recorded in written and tape-recorded forms by an observer. Tape recordings were transcribed verbatim. Data analysis, conducted by 2 independent researchers, employed an iterative and constant comparative process of (1) initial descriptive coding to identify categories and subsequent themes, and (2) deeper, interpretive coding and thematic analysis enabling concepts to emerge and overarching pattern codes to be identified. Results The intervention reflected best available evidence and provided flexibility to ensure patient-centred care. The intervention comprised up to 8 sessions of 1:1 physiotherapy over 8 weeks, starting 4 weeks postsurgery. The intervention was acceptable to patients and physiotherapists. Conclusions A rigorous process informed an

  17. Optimisation of radiation protection

    International Nuclear Information System (INIS)

    1988-01-01

    Optimisation of radiation protection is one of the key elements in the current radiation protection philosophy. The present system of dose limitation was issued in 1977 by the International Commission on Radiological Protection (ICRP) and includes, in addition to the requirements of justification of practices and limitation of individual doses, the requirement that all exposures be kept as low as is reasonably achievable, taking social and economic factors into account. This last principle is usually referred to as optimisation of radiation protection, or the ALARA principle. The NEA Committee on Radiation Protection and Public Health (CRPPH) organised an ad hoc meeting, in liaison with the NEA committees on the safety of nuclear installations and radioactive waste management. Separate abstracts were prepared for individual papers presented at the meeting

  18. Optimisation by hierarchical search

    Science.gov (United States)

    Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias

    2015-03-01

    Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general purpose approach to potentially improve the performance of conventional algorithms or special purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.
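
    A minimal sketch of the general idea of optimising groups of variables rather than single variables is given below; the toy cost function, the coarse-to-fine group sizes and the exhaustive search within each group are assumptions for illustration, not the authors' algorithm.

      # Toy block-wise (group) optimisation of a random Ising-like cost function.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 16
      J = rng.normal(size=(n, n)); J = (J + J.T) / 2   # random symmetric couplings

      def cost(s):
          return float(-s @ J @ s)

      s = rng.choice([-1, 1], size=n)                  # random initial configuration
      for group_size in (8, 4, 2, 1):                  # optimise progressively smaller groups
          for start in range(0, n, group_size):
              group = slice(start, start + group_size)
              best_s, best_c = s.copy(), cost(s)
              # try all configurations of this group with the other variables held fixed
              for bits in range(2 ** group_size):
                  trial = s.copy()
                  trial[group] = [1 if (bits >> k) & 1 else -1 for k in range(group_size)]
                  c = cost(trial)
                  if c < best_c:
                      best_s, best_c = trial, c
              s = best_s

      print("final cost after hierarchical group optimisation:", cost(s))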

  19. Big data analytics for the Future Circular Collider reliability and availability studies

    Science.gov (United States)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.

  20. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore we compare and contrast the two geometries throughout.
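
    For reference, the standard form of the FRW line element and the first Friedmann equation, written here in units with c = 1 and without a cosmological constant term (textbook material rather than content specific to this work):

      ds^{2} = -dt^{2} + a(t)^{2}\left[\frac{dr^{2}}{1 - k r^{2}} + r^{2}\left(d\theta^{2} + \sin^{2}\theta\, d\phi^{2}\right)\right],
      \qquad
      \left(\frac{\dot{a}}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho - \frac{k}{a^{2}}

    For matter or radiation with positive energy density, these equations force a(t) to vanish at a finite time in the past, which is the FRW version of the big bang singularity mentioned above.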

  1. Big Data for better urban life?: An exploratory study of critical urban issues in two Caribbean cities: Paramaribo (Suriname) and Port of Spain (Trinidad and Tobago)

    NARCIS (Netherlands)

    Pfeffer, K.; Verrest, H.; Poorthuis, A.

    2015-01-01

    Big Data is increasingly seen as important in studying the city. This pertains to both its methodological capacity and the societal implications it may have. In this article we draw on contemporary literature to discuss the potentials and challenges of Big Data for addressing pressing urban issues.

  2. Optimisation of Investment Resources at Small Enterprises

    Directory of Open Access Journals (Sweden)

    Shvets Iryna B.

    2014-03-01

    Full Text Available The goal of the article lies in the study of the process of optimisation of the structure of investment resources, development of criteria and stages of optimisation of volumes of investment resources for small enterprises by types of economic activity. The article characterises the process of transformation of investment resources into assets and liabilities of the balances of small enterprises and conducts calculation of the structure of sources of formation of investment resources in Ukraine at small enterprises by types of economic activity in 2011. On the basis of the conducted analysis of the structure of investment resources of small enterprises the article forms main groups of criteria of optimisation in the context of individual small enterprises by types of economic activity. The article offers an algorithm and step-by-step scheme of optimisation of investment resources at small enterprises in the form of a multi-stage process of management of investment resources in the context of increase of their mobility and rate of transformation of existing resources into investments. The prospect of further studies in this direction is development of a structural and logic scheme of optimisation of volumes of investment resources at small enterprises.

  3. Risk based test interval and maintenance optimisation - Application and uses

    International Nuclear Information System (INIS)

    Sparre, E.

    1999-10-01

    The project is part of an IAEA co-ordinated Research Project (CRP) on 'Development of Methodologies for Optimisation of Surveillance Testing and Maintenance of Safety Related Equipment at NPPs'. The purpose of the project is to investigate the sensitivity of the results obtained when performing risk based optimisation of the technical specifications. Previous projects have shown that complete LPSA models can be created and that these models allow optimisation of technical specifications. However, these optimisations did not include any in-depth check of the sensitivity of the results with regard to methods, model completeness etc. Four different test intervals have been investigated in this study. Aside from the original, nominal optimisation, a set of sensitivity analyses has been performed and the results from these analyses have been compared to the original optimisation. The analyses indicate that the result of an optimisation is rather stable. However, it is not possible to draw any firm conclusions without performing a number of sensitivity analyses. Significant differences in the optimisation result were discovered when analysing an alternative configuration. Deterministic uncertainties also seem to affect the result of an optimisation considerably. The sensitivity to failure data uncertainties is important to investigate in detail, since the methodology is based on the assumption that the unavailability of a component depends on the length of the test interval.

  4. Advanced optimisation - coal fired power plant operations

    Energy Technology Data Exchange (ETDEWEB)

    Turney, D.M.; Mayes, I. [E.ON UK, Nottingham (United Kingdom)

    2005-03-01

    The purpose of this unit optimisation project is to develop an integrated approach to unit optimisation and an overall optimiser that is able to resolve any conflicts between the individual optimisers. The individual optimisers considered during this project are: the on-line thermal efficiency package, the GNOCIS boiler optimiser, the GNOCIS steam-side optimiser, ESP optimisation, and an intelligent sootblowing system. 6 refs., 7 figs., 3 tabs.

  5. Underground Study of Big Bang Nucleosynthesis in the Precision Era of Cosmology

    Directory of Open Access Journals (Sweden)

    Gustavino Carlo

    2017-01-01

    Big Bang Nucleosynthesis (BBN) theory provides definite predictions for the abundance of light elements produced in the early universe, as far as the knowledge of the relevant nuclear processes of the BBN chain is accurate. At BBN energies (30 ≲ Ecm ≲ 300 keV) the cross section of many BBN processes is very low because of the Coulomb repulsion between the interacting nuclei. For this reason it is convenient to perform the measurements deep underground. Presently the world’s only facility operating underground is LUNA (Laboratory for Underground Nuclear Astrophysics) at LNGS (“Laboratorio Nazionale del Gran Sasso”, Italy). In this presentation the BBN measurements of LUNA are briefly reviewed and discussed. It will be shown that the ongoing study of the D(p, γ)3He reaction is of primary importance to derive the baryon density of the universe Ωb with high accuracy. Moreover, this study makes it possible to constrain the existence of the so-called “dark radiation”, composed of undiscovered relativistic species permeating the universe, such as sterile neutrinos.

  6. Underground Study of Big Bang Nucleosynthesis in the Precision Era of Cosmology

    Science.gov (United States)

    Gustavino, Carlo

    2017-03-01

    Big Bang Nucleosynthesis (BBN) theory provides definite predictions for the abundance of light elements produced in the early universe, as far as the knowledge of the relevant nuclear processes of the BBN chain is accurate. At BBN energies (30 ≲ Ecm ≲ 300 keV) the cross section of many BBN processes is very low because of the Coulomb repulsion between the interacting nuclei. For this reason it is convenient to perform the measurements deep underground. Presently the world's only facility operating underground is LUNA (Laboratory for Underground Nuclear Astrophysics) at LNGS ("Laboratorio Nazionale del Gran Sasso", Italy). In this presentation the BBN measurements of LUNA are briefly reviewed and discussed. It will be shown that the ongoing study of the D(p, γ)3He reaction is of primary importance to derive the baryon density of the universe Ωb with high accuracy. Moreover, this study allows us to constrain the existence of the so-called "dark radiation", composed of undiscovered relativistic species permeating the universe, such as sterile neutrinos.

  7. Internet of things and Big Data as potential solutions to the problems in waste electrical and electronic equipment management: An exploratory study.

    Science.gov (United States)

    Gu, Fu; Ma, Buqing; Guo, Jianfeng; Summers, Peter A; Hall, Philip

    2017-10-01

    Management of Waste Electrical and Electronic Equipment (WEEE) is a vital part of solid waste management, yet some difficult issues still require attention. This paper investigates the potential of applying the Internet of Things (IoT) and Big Data as solutions to WEEE management problems. The massive data generated during the production, consumption and disposal of Electrical and Electronic Equipment (EEE) fit the characteristics of Big Data. Using state-of-the-art communication technologies, the IoT derives the WEEE "Big Data" from the life cycle of EEE, and Big Data technologies process the WEEE "Big Data" to support decision making in WEEE management. A framework for implementing the IoT and Big Data technologies is proposed and its multiple layers are illustrated. Case studies with potential application scenarios of the framework are presented and discussed. As an unprecedented exploration, the combined application of the IoT and Big Data technologies in WEEE management brings a series of opportunities as well as new challenges. This study provides insights and visions for stakeholders in solving WEEE management problems in the context of IoT and Big Data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Optimisation in radiotherapy II: Programmed and inversion optimisation algorithms

    International Nuclear Information System (INIS)

    Ebert, M.

    1997-01-01

    This is the second article in a three-part examination of optimisation in radiotherapy. The previous article established the bases of optimisation in radiotherapy and the formulation of the optimisation problem. This paper outlines several algorithms that have been used in radiotherapy for searching for the best irradiation strategy within the full set of possible strategies. Two principal classes of algorithm are considered: those associated with mathematical programming, which employ specific search techniques, linear-programming-type searches or artificial intelligence; and those which seek to perform a numerical inversion of the optimisation problem, finishing with deterministic iterative inversion. (author)

  9. Family Connections versus optimised treatment-as-usual for family members of individuals with borderline personality disorder: non-randomised controlled study.

    LENUS (Irish Health Repository)

    Flynn, Daniel

    2017-01-01

    Borderline personality disorder (BPD) is challenging for family members who are often required to fulfil multiple roles such as those of advocate, caregiver, coach and guardian. To date, two uncontrolled studies by the treatment developers suggest that Family Connections (FC) is an effective programme to support, educate and teach skills to family members of individuals with BPD. However, such studies have been limited by lack of comparison to other treatment approaches. This study aimed to compare the effectiveness of FC with an optimised treatment-as-usual (OTAU) programme for family members of individuals with BPD. A secondary aim was to introduce a long term follow-up to investigate if positive gains from the intervention would be maintained following programme completion.

  10. Big Data Research in Neurosurgery: A Critical Look at this Popular New Study Design.

    Science.gov (United States)

    Oravec, Chesney S; Motiwala, Mustafa; Reed, Kevin; Kondziolka, Douglas; Barker, Fred G; Michael, L Madison; Klimo, Paul

    2018-05-01

    The use of "big data" in neurosurgical research has become increasingly popular. However, using this type of data comes with limitations. This study aimed to shed light on this new approach to clinical research. We compiled a list of commonly used databases that were not specifically created to study neurosurgical procedures, conditions, or diseases. Three North American journals were manually searched for articles published since 2000 utilizing these and other non-neurosurgery-specific databases. A number of data points per article were collected, tallied, and analyzed.A total of 324 articles were identified since 2000 with an exponential increase since 2011 (257/324, 79%). The Journal of Neurosurgery Publishing Group published the greatest total number (n = 200). The National Inpatient Sample was the most commonly used database (n = 136). The average study size was 114 841 subjects (range, 30-4 146 777). The most prevalent topics were vascular (n = 77) and neuro-oncology (n = 66). When categorizing study objective (recognizing that many papers reported more than 1 type of study objective), "Outcomes" was the most common (n = 154). The top 10 institutions by primary or senior author accounted for 45%-50% of all publications. Harvard Medical School was the top institution, using this research technique with 59 representations (31 by primary author and 28 by senior).The increasing use of data from non-neurosurgery-specific databases presents a unique challenge to the interpretation and application of the study conclusions. The limitations of these studies must be more strongly considered in designing and interpreting these studies.

  11. Big Data Analytics for Smart Manufacturing: Case Studies in Semiconductor Manufacturing

    Directory of Open Access Journals (Sweden)

    James Moyne

    2017-07-01

    Smart manufacturing (SM) is a term generally applied to the improvement in manufacturing operations through integration of systems, linking of physical and cyber capabilities, and taking advantage of information including leveraging the big data evolution. SM adoption has been occurring unevenly across industries, thus there is an opportunity to look to other industries to determine solution and roadmap paths for industries such as biochemistry or biology. The big data evolution affords an opportunity for managing significantly larger amounts of information and acting on it with analytics for improved diagnostics and prognostics. The analytics approaches can be defined in terms of dimensions to understand their requirements and capabilities, and to determine technology gaps. The semiconductor manufacturing industry has been taking advantage of the big data and analytics evolution by improving existing capabilities such as fault detection, and supporting new capabilities such as predictive maintenance. For most of these capabilities: (1) data quality is the most important big data factor in delivering high quality solutions; and (2) incorporating subject matter expertise in analytics is often required for realizing effective on-line manufacturing solutions. In the future, an improved big data environment incorporating smart manufacturing concepts such as digital twin will further enable analytics; however, it is anticipated that the need for incorporating subject matter expertise in solution design will remain.

  12. Big five personality factors and cigarette smoking: a 10-year study among US adults.

    Science.gov (United States)

    Zvolensky, Michael J; Taha, Farah; Bono, Amanda; Goodwin, Renee D

    2015-04-01

    The present study examined the relation between the big five personality traits and any lifetime cigarette use, progression to daily smoking, and smoking persistence among adults in the United States (US) over a ten-year period. Data were drawn from the Midlife Development in the US (MIDUS) I and II (N = 2101). Logistic regression was used to examine the relationship between continuously measured personality factors and any lifetime cigarette use, smoking progression, and smoking persistence at baseline (1995-1996) and at follow-up (2004-2006). The results revealed that higher levels of openness to experience and neuroticism were each significantly associated with increased risk of any lifetime cigarette use. Neuroticism also was associated with increased risk of progression from ever smoking to daily smoking and persistent daily smoking over a ten-year period. In contrast, conscientiousness was associated with decreased risk of lifetime cigarette use, progression to daily smoking, and smoking persistence. Most, but not all, associations between smoking and personality persisted after adjusting for demographic characteristics, depression, anxiety disorders, and substance use problems. The findings suggest that openness to experience and neuroticism may be involved in any lifetime cigarette use and smoking progression, and that conscientiousness appears to protect against smoking progression and persistence. These data add to a growing literature suggesting that certain personality factors--most consistently neuroticism--are important to assess and perhaps target during intervention programs for smoking behavior. Copyright © 2015 Elsevier Ltd. All rights reserved.
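
    A minimal sketch of the kind of model described above is given below, using synthetic stand-in data rather than the MIDUS sample; the trait effects, sample size and absence of covariate adjustment are assumptions made purely for illustration.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      traits = ["Openness", "Conscientiousness", "Extraversion", "Agreeableness", "Neuroticism"]

      # Synthetic z-scored Big Five scores and a binary lifetime-smoking indicator.
      n = 2000
      X = rng.normal(size=(n, 5))
      logit = 0.2 * X[:, 0] - 0.25 * X[:, 1] + 0.3 * X[:, 4] - 0.4
      y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

      model = LogisticRegression().fit(X, y)
      for trait, beta in zip(traits, model.coef_[0]):
          # Odds ratio per one standard deviation increase in the trait.
          print(f"{trait:18s} OR = {np.exp(beta):.2f}")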

  13. Mobilizing and integrating big data in studies of spatial and phylogenetic patterns of biodiversity

    Directory of Open Access Journals (Sweden)

    Douglas E. Soltis

    2016-12-01

    The current global challenges that threaten biodiversity are immense and rapidly growing. These biodiversity challenges demand approaches that meld bioinformatics, large-scale phylogeny reconstruction, use of digitized specimen data, and complex post-tree analyses (e.g. niche modeling, niche diversification, and other ecological analyses). Recent developments in phylogenetics coupled with emerging cyberinfrastructure and new data sources provide unparalleled opportunities for mobilizing and integrating massive amounts of biological data, driving the discovery of complex patterns and new hypotheses for further study. These developments are not trivial in that biodiversity data on the global scale now being collected and analyzed are inherently complex. The ongoing integration and maturation of biodiversity tools discussed here is transforming biodiversity science, enabling what we broadly term "next-generation" investigations in systematics, ecology, and evolution (i.e., "biodiversity science"). New training that integrates domain knowledge in biodiversity and data science skills is also needed to accelerate research in these areas. Integrative biodiversity science is crucial to the future of global biodiversity. We cannot simply react to continued threats to biodiversity, but via the use of an integrative, multifaceted, big data approach, researchers can now make biodiversity projections to provide crucial data not only for scientists, but also for the public, land managers, policy makers, urban planners, and agriculture.

  14. Modelling study, efficiency analysis and optimisation of large-scale Adiabatic Compressed Air Energy Storage systems with low-temperature thermal storage

    International Nuclear Information System (INIS)

    Luo, Xing; Wang, Jihong; Krupke, Christopher; Wang, Yue; Sheng, Yong; Li, Jian; Xu, Yujie; Wang, Dan; Miao, Shihong; Chen, Haisheng

    2016-01-01

    Highlights: • The paper presents an A-CAES system thermodynamic model with low-temperature thermal energy storage integration. • The initial parameter value ranges for A-CAES system simulation are identified from the study of a CAES plant in operation. • The strategies for system efficiency improvement are investigated via a parametric study with a sensitivity analysis. • Various system configurations are discussed for analysing the efficiency improvement potentials. - Abstract: The key feature of Adiabatic Compressed Air Energy Storage (A-CAES) is the reuse of the heat generated from the air compression process at the stage of air expansion. This increases the complexity of the whole system, since the heat exchange and thermal storage units must have the capacity and performance to match the air compression/expansion units. This creates a strong demand for a whole-system modelling and simulation tool for A-CAES system optimisation. The paper presents a new whole-system mathematical model for A-CAES with a simulation implementation; the model is developed with a view to lowering the capital cost of the system. The paper then focuses on the study of system efficiency improvement strategies via parametric analysis and system structure optimisation. The paper investigates how the system efficiency is affected by the system component performance and parameters. From the study, the key parameters are identified which have the dominant influence on improving the system efficiency. The study is extended to optimal system configuration and recommendations are made for achieving higher efficiency, which provides useful guidance for A-CAES system design.

  15. Optimisation of monochrome images

    International Nuclear Information System (INIS)

    Potter, R.

    1983-01-01

    Gamma cameras with modern imaging systems usually digitize the signals to allow storage and processing of the image in a computer. Although such computer systems are widely used for the extraction of quantitative uptake estimates and the analysis of time-variant data, the vast majority of nuclear medicine images are still interpreted on the basis of an observer's visual assessment of a photographic hardcopy image. The optimisation of hardcopy devices is therefore vital, and factors such as resolution, uniformity, noise, grey scales and display matrices are discussed. Once optimum display parameters have been determined, routine procedures for quality control need to be established; suitable procedures are discussed. (U.K.)

  16. Optimising resource management in neurorehabilitation.

    Science.gov (United States)

    Wood, Richard M; Griffiths, Jeff D; Williams, Janet E; Brouwers, Jakko

    2014-01-01

    To date, little research has been published regarding the effective and efficient management of resources (beds and staff) in neurorehabilitation, despite it being an expensive service in limited supply. To demonstrate how mathematical modelling can be used to optimise service delivery, by way of a case study at a major 21-bed neurorehabilitation unit in the UK. An automated computer program for assigning weekly treatment sessions is developed. Queue modelling is used to construct a mathematical model of the hospital in terms of referral submissions to a waiting list, admission and treatment, and ultimately discharge. This is used to analyse the impact of hypothetical strategic decisions on a variety of performance measures and costs. The project culminates in a hybridised model of these two approaches, since a relationship is found between the number of therapy hours received each week (scheduling output) and length of stay (queuing model input). The introduction of the treatment scheduling program has substantially improved timetable quality (meaning a better and fairer service to patients) and has reduced employee time expended in its creation by approximately six hours each week (freeing up time for clinical work). The queuing model has been used to assess the effect of potential strategies, such as increasing the number of beds or employing more therapists. The use of mathematical modelling has not only optimised resources in the short term, but has allowed the optimality of longer-term strategic decisions to be assessed.
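
    The abstract does not give the unit's referral rates or the exact queueing formulation used, so the sketch below uses a generic M/M/c bed-occupancy model with invented figures simply to illustrate how such a queuing model links arrivals, length of stay and bed numbers to waiting-time performance measures.

      import math

      def mmc_metrics(lam, mu, c):
          """Steady-state metrics of an M/M/c queue: lam = referral rate,
          mu = 1 / mean length of stay, c = number of beds."""
          rho = lam / (c * mu)                      # bed utilisation
          if rho >= 1.0:
              raise ValueError("unstable system: utilisation >= 1")
          a = lam / mu
          p0 = 1.0 / (sum(a ** k / math.factorial(k) for k in range(c))
                      + a ** c / (math.factorial(c) * (1.0 - rho)))
          lq = p0 * a ** c * rho / (math.factorial(c) * (1.0 - rho) ** 2)  # mean waiting list
          wq = lq / lam                                                    # mean wait
          return rho, lq, wq

      # Illustrative figures only: 1.2 referrals/week, 16-week mean stay, 21 beds.
      util, wl, wait = mmc_metrics(lam=1.2, mu=1.0 / 16.0, c=21)
      print(f"utilisation={util:.2f}, mean waiting list={wl:.1f}, mean wait={wait:.1f} weeks")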

  17. Application of ant colony optimisation in distribution transformer sizing

    African Journals Online (AJOL)

    This study proposes an optimisation method for transformer sizing in power systems using ant colony optimisation, with verification of the process in MATLAB. The aim is to address the issue of transformer sizing, which is a major challenge affecting effective performance, longevity, huge capital cost and power ...

  18. Adjoint Optimisation of the Turbulent Flow in an Annular Diffuser

    DEFF Research Database (Denmark)

    Gotfredsen, Erik; Agular Knudsen, Christian; Kunoy, Jens Dahl

    2017-01-01

    In the present study, a numerical optimisation of guide vanes in an annular diffuser is performed. The optimisation is performed for the purpose of improving the following two parameters simultaneously; the first parameter is the uniformity perpendicular to the flow direction, a 1/3 diameter do...

  19. Optimising of Steel Fiber Reinforced Concrete Mix Design | Beddar ...

    African Journals Online (AJOL)

    Optimising of Steel Fiber Reinforced Concrete Mix Design. ... as a result of the loss of mixture workability that will be translated into a difficult concrete casting in site. ... An experimental study of an optimisation method of fibres in reinforced ...

  20. From experimental zoology to big data: Observation and integration in the study of animal development.

    Science.gov (United States)

    Bolker, Jessica; Brauckmann, Sabine

    2015-06-01

    The founding of the Journal of Experimental Zoology in 1904 was inspired by a widespread turn toward experimental biology in the 19th century. The founding editors sought to promote experimental, laboratory-based approaches, particularly in developmental biology. This agenda raised key practical and epistemological questions about how and where to study development: Does the environment matter? How do we know that a cell or embryo isolated to facilitate observation reveals normal developmental processes? How can we integrate descriptive and experimental data? R.G. Harrison, the journal's first editor, grappled with these questions in justifying his use of cell culture to study neural patterning. Others confronted them in different contexts: for example, F.B. Sumner insisted on the primacy of fieldwork in his studies on adaptation, but also performed breeding experiments using wild-collected animals. The work of Harrison, Sumner, and other early contributors exemplified both the power of new techniques, and the meticulous explanation of practice and epistemology that was marshaled to promote experimental approaches. A century later, experimentation is widely viewed as the standard way to study development; yet at the same time, cutting-edge "big data" projects are essentially descriptive, closer to natural history than to the approaches championed by Harrison et al. Thus, the original questions about how and where we can best learn about development are still with us. Examining their history can inform current efforts to incorporate data from experiment and description, lab and field, and a broad range of organisms and disciplines, into an integrated understanding of animal development. © 2015 Wiley Periodicals, Inc.

  1. TELECOM BIG DATA FOR URBAN TRANSPORT ANALYSIS – A CASE STUDY OF SPLIT-DALMATIA COUNTY IN CROATIA

    OpenAIRE

    M. Baučić; N. Jajac; M. Bućan

    2017-01-01

    Today, big data has become widely available and new technologies are being developed for big data storage architectures and big data analytics. An ongoing challenge is how to incorporate big data into GIS applications supporting the various domains. The International Transport Forum explains how the arrival of big data and real-time data, together with new data processing algorithms, leads to new insights and operational improvements in transport. Based on the telecom customer data, the...

  2. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  3. Molecular pathological epidemiology: new developing frontiers of big data science to study etiologies and pathogenesis.

    Science.gov (United States)

    Hamada, Tsuyoshi; Keum, NaNa; Nishihara, Reiko; Ogino, Shuji

    2017-03-01

    Molecular pathological epidemiology (MPE) is an integrative field that utilizes molecular pathology to incorporate interpersonal heterogeneity of a disease process into epidemiology. In each individual, the development and progression of a disease are determined by a unique combination of exogenous and endogenous factors, resulting in different molecular and pathological subtypes of the disease. Based on "the unique disease principle," the primary aim of MPE is to uncover an interactive relationship between a specific environmental exposure and disease subtypes in determining disease incidence and mortality. This MPE approach can provide etiologic and pathogenic insights, potentially contributing to precision medicine for personalized prevention and treatment. Although breast, prostate, lung, and colorectal cancers have been among the most commonly studied diseases, the MPE approach can be used to study any disease. In addition to molecular features, host immune status and microbiome profile likely affect a disease process, and thus serve as informative biomarkers. As such, further integration of several disciplines into MPE has been achieved (e.g., pharmaco-MPE, immuno-MPE, and microbial MPE), to provide novel insights into underlying etiologic mechanisms. With the advent of high-throughput sequencing technologies, available genomic and epigenomic data have expanded dramatically. The MPE approach can also provide a specific risk estimate for each disease subgroup, thereby enhancing the impact of genome-wide association studies on public health. In this article, we present recent progress of MPE, and discuss the importance of accounting for the disease heterogeneity in the era of big-data health science and precision medicine.

  4. Using Multiple Big Datasets and Machine Learning to Produce a New Global Particulate Dataset: A Technology Challenge Case Study

    Science.gov (United States)

    Lary, D. J.

    2013-12-01

    A Big Data case study is described where multiple datasets from several satellites, high-resolution global meteorological data, social media and in-situ observations are combined using machine learning on a distributed cluster with an automated workflow. The global particulate dataset is relevant to global public health studies and would not be possible to produce without the use of the multiple big datasets, in-situ data and machine learning. To greatly reduce the development time and enhance the functionality, a high-level language capable of parallel processing has been used (Matlab). A key consideration for the system is high-speed access due to the large data volume, persistence of the large data volumes and a precise process time scheduling capability.
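
    The study's actual workflow (Matlab on a distributed cluster) is not reproduced here; the sketch below only illustrates the core idea of learning a particulate estimate from co-located multi-source predictors, using invented satellite, meteorological and in-situ surrogate data and a random forest as a stand-in for the machine-learning step.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(42)

      # Invented co-located training table: aerosol optical depth, meteorology and
      # ground PM2.5 from in-situ monitors (all synthetic surrogates).
      n = 5000
      aod = rng.gamma(2.0, 0.2, n)
      rh = rng.uniform(20, 95, n)            # relative humidity (%)
      blh = rng.uniform(200, 2500, n)        # boundary-layer height (m)
      wind = rng.uniform(0, 12, n)           # wind speed (m/s)
      pm25 = 80 * aod * (1 + rh / 200) * (500 / blh) + rng.normal(0, 3, n)

      X = np.column_stack([aod, rh, blh, wind])
      X_tr, X_te, y_tr, y_te = train_test_split(X, pm25, test_size=0.2, random_state=0)

      model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
      print("held-out R^2:", round(model.score(X_te, y_te), 3))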

  5. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  6. Carbon dioxide sequestration using NaHSO4 and NaOH: A dissolution and carbonation optimisation study.

    Science.gov (United States)

    Sanna, Aimaro; Steel, Luc; Maroto-Valer, M Mercedes

    2017-03-15

    The use of NaHSO4 to leach out Mg from lizardite-rich serpentinite (in the form of MgSO4) and the carbonation of CO2 (captured in the form of Na2CO3 using NaOH) to form MgCO3 and Na2SO4 was investigated. Unlike ammonium sulphate, sodium sulphate can be separated via precipitation during the recycling step, avoiding the energy-intensive evaporation process required in NH4-based processes. To determine the effectiveness of the NaHSO4/NaOH process when applied to lizardite, the optimisation of the dissolution and carbonation steps was performed using a UK lizardite-rich serpentine. Temperature, solid/liquid ratio, particle size, concentration and molar ratio were evaluated. An optimal dissolution efficiency of 69.6% was achieved over 3 h at 100 °C using 1.4 M sodium bisulphate and 50 g/l serpentine with particle size 75-150 μm. An optimal carbonation efficiency of 95.4% was achieved over 30 min at 90 °C and 1:1 magnesium:sodium carbonate molar ratio using non-synthesised solution. The CO2 sequestration capacity was 223.6 g carbon dioxide/kg serpentine (66.4% in terms of Mg bonded to hydromagnesite), which is comparable with those obtained using ammonium-based processes. Therefore, lizardite-rich serpentinites represent a valuable resource for the NaHSO4/NaOH-based pH swing mineralisation process. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Deciding between carbon trading and carbon capture and sequestration: an optimisation-based case study for methanol synthesis from syngas.

    Science.gov (United States)

    Üçtuğ, Fehmi Görkem; Ağralı, Semra; Arıkan, Yıldız; Avcıoğlu, Eray

    2014-01-01

    The economic and technical feasibility of carbon capture and sequestration (CCS) systems is gaining importance as CO2 emission reduction becomes a more pressing issue for parties in production sectors. Public and private entities have to comply with national schemes imposing tighter limits on their emission allowances. Often these parties face two options: whether to invest in CCS or to buy carbon credits for the excess emissions above their limits. CCS is an expensive system to invest in and to operate. Therefore, its feasibility depends on the carbon credit prices prevailing in the markets now and in the future. In this paper we consider the problem of installing a CCS unit in order to ensure that the amount of CO2 emissions is within its allowable limits. We formulate this problem as a non-linear optimisation problem where the objective is to maximise the net returns from pursuing an optimal mix of the two options described above. General Algebraic Modelling System (GAMS) software was used to solve the model. The results were found to be sensitive to carbon credit prices and the discount rate, which determines the choices with respect to the future and the present. The model was applied to a methanol synthesis plant as an example. However, the formulation can easily be extended to any production process if the CO2 emissions level per unit of physical production is known. The results showed that for CCS to be feasible, carbon credit prices must be above 15 Euros per ton. This value, naturally, depends on the plant-specific data and the costs we have employed for CCS. The actual prices (≈5 Euros/ton CO2) at present are far from encouraging investors into CCS technology. Copyright © 2013 Elsevier Ltd. All rights reserved.
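
    The paper's GAMS model (multi-period, discounted, plant-specific) is not available here; the sketch below is a deliberately simplified static version of the same trade-off, with an assumed capture cost curve and invented plant figures, minimising annual compliance cost over the fraction of excess emissions that is captured rather than bought as credits.

      from scipy.optimize import minimize_scalar

      # Illustrative figures only (assumptions, not the paper's data).
      excess = 500_000.0          # t CO2/yr above the emission allowance
      credit_price = 15.0         # EUR per t CO2 (around the paper's break-even region)
      capex_annual = 1.0e6        # EUR/yr annualised CCS capital cost, if installed

      def capture_cost_per_t(f):
          # Assumed rising marginal cost: capturing the last tonnes is the hardest.
          return 8.0 + 12.0 * f ** 2

      def total_cost(f):
          """Annual compliance cost if a fraction f of the excess is captured."""
          captured = f * excess
          ccs = capex_annual + capture_cost_per_t(f) * captured if f > 0 else 0.0
          credits = credit_price * (excess - captured)
          return ccs + credits

      res = minimize_scalar(total_cost, bounds=(1e-6, 1.0), method="bounded")
      print(f"optimal capture fraction {res.x:.2f}: "
            f"{total_cost(res.x) / 1e6:.2f} MEUR/yr vs credits-only {total_cost(0) / 1e6:.2f} MEUR/yr")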

  8. Trait Emotional Intelligence and the Big Five: A Study on Italian Children and Preadolescents

    Science.gov (United States)

    Russo, Paolo Maria; Mancini, Giacomo; Trombini, Elena; Baldaro, Bruno; Mavroveli, Stella; Petrides, K. V.

    2012-01-01

    Trait emotional intelligence (EI) is a constellation of emotion-related self-perceptions located at the lower levels of personality hierarchies. This article examines the validity of the Trait Emotional Intelligence Questionnaire-Child Form and investigates its relationships with Big Five factors and cognitive ability. A total of 690 children (317…

  9. Study on LBS for Characterization and Analysis of Big Data Benchmarks

    Directory of Open Access Journals (Sweden)

    Aftab Ahmed Chandio

    2014-10-01

    In the past few years, most organizations have been gradually diverting their applications and services to the Cloud. This is because the Cloud paradigm enables (a) on-demand access and (b) large-scale data processing for their applications and users on the Internet anywhere in the world. The rapid growth of urbanization in developed and developing countries leads to a new emerging concept called Urban Computing, one of the application domains that is rapidly being deployed to the Cloud. More precisely, in the concept of Urban Computing, sensors, vehicles, devices, buildings, and roads are used as components to probe city dynamics. Their data representation is widely available, including GPS traces of vehicles. However, these applications are data-processing and storage hungry, because their data grow in large volumes, from a few dozen TB (terabytes) to thousands of PB (petabytes), i.e. Big Data. To support the development and the assessment of applications such as LBS (Location Based Services), a benchmark of Big Data is urgently needed. This research is a novel study of LBS to characterize and analyze Big Data benchmarks. We focused on map-matching, which is used as a pre-processing step in many LBS applications. In this preliminary work, this paper also describes the current status of Big Data benchmarks and our future direction.

  10. Study of LBS for characterization and analysis of big data benchmarks

    International Nuclear Information System (INIS)

    Chandio, A.A.; Zhang, F.; Memon, T.D.

    2014-01-01

    In the past few years, most organizations have been gradually diverting their applications and services to the Cloud. This is because the Cloud paradigm enables (a) on-demand access and (b) large-scale data processing for their applications and users on the Internet anywhere in the world. The rapid growth of urbanization in developed and developing countries leads to a new emerging concept called Urban Computing, one of the application domains that is rapidly being deployed to the Cloud. More precisely, in the concept of Urban Computing, sensors, vehicles, devices, buildings, and roads are used as components to probe city dynamics. Their data representation is widely available, including GPS traces of vehicles. However, these applications are data-processing and storage hungry, because their data grow in large volumes, from a few dozen TB (terabytes) to thousands of PB (petabytes), i.e. Big Data. To support the development and the assessment of applications such as LBS (Location Based Services), a benchmark of Big Data is urgently needed. This research is a novel study of LBS to characterize and analyze Big Data benchmarks. We focused on map-matching, which is used as a pre-processing step in many LBS applications. In this preliminary work, this paper also describes the current status of Big Data benchmarks and our future direction. (author)
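
    Since map-matching is highlighted as the pre-processing step of interest, the sketch below shows the simplest geometric variant: snapping each GPS fix to the nearest road segment by point-to-segment projection. The coordinates are assumed to be planar (already projected); real benchmarks would use full road networks and more robust (e.g. topological or HMM-based) matchers.

      import math

      def project_to_segment(p, a, b):
          """Project point p onto segment (a, b); return (snapped point, distance)."""
          ax, ay = a; bx, by = b; px, py = p
          dx, dy = bx - ax, by - ay
          seg_len2 = dx * dx + dy * dy
          if seg_len2 == 0.0:
              q = a
          else:
              t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
              q = (ax + t * dx, ay + t * dy)
          return q, math.hypot(px - q[0], py - q[1])

      def map_match(trace, road_segments):
          """Snap every GPS fix in the trace to its nearest road segment."""
          matched = []
          for p in trace:
              snapped, _ = min((project_to_segment(p, a, b) for a, b in road_segments),
                               key=lambda qd: qd[1])
              matched.append(snapped)
          return matched

      # Hypothetical local-grid coordinates (metres) for two road segments and a trace.
      roads = [((0, 0), (100, 0)), ((100, 0), (100, 100))]
      trace = [(10, 3), (55, -4), (98, 20), (103, 60)]
      print(map_match(trace, roads))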

  11. Understanding the implementation and adoption of an information technology intervention to support medicine optimisation in primary care: qualitative study using strong structuration theory.

    Science.gov (United States)

    Jeffries, Mark; Phipps, Denham; Howard, Rachel L; Avery, Anthony; Rodgers, Sarah; Ashcroft, Darren

    2017-05-10

    Using strong structuration theory, we aimed to understand the adoption and implementation of an electronic clinical audit and feedback tool to support medicine optimisation for patients in primary care. This is a qualitative study informed by strong structuration theory. The analysis was thematic, using a template approach. An a priori set of thematic codes, based on strong structuration theory, was developed from the literature and applied to the transcripts. The coding template was then modified through successive readings of the data. Clinical commissioning group in the south of England. Four focus groups and five semi-structured interviews were conducted with 18 participants purposively sampled from a range of stakeholder groups (general practitioners, pharmacists, patients and commissioners). Using the system could lead to improved medication safety, but use was determined by broad institutional contexts; by the perceptions, dispositions and skills of users; and by the structures embedded within the technology. These included perceptions of the system as new and requiring technical competence and skill; the adoption of the system for information gathering; and interactions and relationships that involved individual, shared or collective use. The dynamics between these external, internal and technological structures affected the adoption and implementation of the system. Successful implementation of information technology interventions for medicine optimisation will depend on a combination of the infrastructure within primary care, social structures embedded in the technology and the conventions, norms and dispositions of those utilising it. Future interventions, using electronic audit and feedback tools to improve medication safety, should consider the complexity of the social and organisational contexts and how internal and external structures can affect the use of the technology in order to support effective implementation. © Article author(s) (or their

  12. Study and optimisation of the high energy detector in Cd(Zn)Te of the Simbol-X space mission for X and gamma astronomy

    International Nuclear Information System (INIS)

    Meuris, A.

    2009-09-01

    Stars in the final phases of their evolution are sites of the most energetic phenomena in the Universe. The understanding of their mechanisms is based on the observation of the X and gamma rays from the sources. The French-Italian Simbol-X project is a novel concept of telescope with two satellites flying in formation. This space mission combines upgraded optics from X-ray telescopes with detection systems from gamma-ray telescopes. CEA Saclay, involved in major space missions for gamma astronomy, is in charge of the definition and the design of the High Energy Detector (HED) of Simbol-X to cover the spectral range from 8 to 80 keV. Two generations of micro-cameras called Caliste have been designed, fabricated and tested. They integrate cadmium telluride (CdTe) crystals and optimised front-end electronics named Idef-X. The hybridization technique enables them to be placed side by side in a mosaic to achieve, for the first time, a CdTe detection plane with fine spatial resolution (600 μm) and arbitrarily large surface. By setting up test benches and leading test campaigns, I was involved in the fabrication of the Caliste prototypes and I assessed temporal, spatial and spectral resolutions. From these experiments and simulations, I propose a detector type, operating conditions and digital processing on board the spacecraft to optimise HED performance. The best detector candidate is CdTe Schottky, well suited to high-resolution spectroscopy; however, it suffers from a loss of stability under bias. Beyond the Simbol-X mission, I studied this kind of detector theoretically and experimentally to build an updated model that can apply to other projects in gamma spectroscopy and imaging. (author)

  13. The scientific production on data quality in big data: a study in the Web of Science database

    Directory of Open Access Journals (Sweden)

    Priscila Basto Fagundes

    2017-11-01

    More and more, the big data theme has attracted the interest of researchers from different areas of knowledge, among them information scientists, who need to understand its concepts and applications in order to contribute new proposals for managing the information generated from the data stored in these environments. The objective of this article is to present a survey of publications about data quality in big data in the Web of Science database up to the year 2016. The article presents the total number of publications indexed in the database, the number of publications per year, the geographic origin of the research and a synthesis of the studies found. The survey of the database was conducted in July 2017 and resulted in a total of 23 publications. To make a summary of the publications possible, the full texts of all the publications were searched for on the Internet and those that were available were read. From this survey it was possible to conclude that publications on data quality in big data began in 2013, and that most present literature reviews, with few effective proposals for monitoring and managing data quality in environments with large volumes of data. This survey is therefore intended to contribute to and foster new research on data quality in big data environments.

  14. Particle Swarm Optimisation with Spatial Particle Extension

    DEFF Research Database (Denmark)

    Krink, Thiemo; Vesterstrøm, Jakob Svaneborg; Riget, Jacques

    2002-01-01

    In this paper, we introduce spatial extension to particles in the PSO model in order to overcome premature convergence in iterative optimisation. The standard PSO and the new model (SEPSO) are compared w.r.t. performance on well-studied benchmark problems. We show that the SEPSO indeed managed...
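
    A minimal sketch of the idea is given below: a standard PSO in which each particle is given a spatial radius, and particles that come too close simply reverse their velocities (one of several possible bouncing strategies). The objective function, parameter values and bouncing rule are assumptions chosen for illustration, not the exact settings of the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def sphere(x):                       # well-studied benchmark objective
          return float(np.sum(x ** 2))

      def sepso(f, dim=10, n=20, iters=500, radius=0.5, w=0.72, c1=1.49, c2=1.49):
          lo, hi = -5.0, 5.0
          x = rng.uniform(lo, hi, (n, dim))
          v = np.zeros((n, dim))
          pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
          g = pbest[np.argmin(pbest_f)].copy()
          for _ in range(iters):
              r1, r2 = rng.random((n, dim)), rng.random((n, dim))
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
              x = x + v
              # Spatial extension: particles closer than two radii bounce off each other.
              for i in range(n):
                  for j in range(i + 1, n):
                      if np.linalg.norm(x[i] - x[j]) < 2 * radius:
                          v[i], v[j] = -v[i], -v[j]
              fx = np.array([f(p) for p in x])
              better = fx < pbest_f
              pbest[better], pbest_f[better] = x[better], fx[better]
              g = pbest[np.argmin(pbest_f)].copy()
          return g, f(g)

      best_x, best_f = sepso(sphere)
      print("best objective value:", round(best_f, 6))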

  15. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  16. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  17. Optimisation on processing parameters for minimising warpage on side arm using response surface methodology (RSM) and particle swarm optimisation (PSO)

    Science.gov (United States)

    Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Sazli, M.; Yahya, Z. R.

    2017-09-01

    This study presents the application of optimisation methods to reduce the warpage of a side arm part. Autodesk Moldflow Insight software was integrated into this study to analyse the warpage. A design of experiments (DOE) for Response Surface Methodology (RSM) was constructed and, using the equation from RSM, Particle Swarm Optimisation (PSO) was applied. The optimisation method results in optimised processing parameters with minimum warpage. Mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as the variable parameters. Parameter selection was based on the most significant factors affecting warpage reported by previous researchers. The results show that warpage was improved by 28.16% for RSM and 28.17% for PSO. The additional improvement of PSO over RSM is only 0.01%. Thus, optimisation using RSM is already sufficient to give the best combination of parameters and the optimum warpage value for the side arm part. The most significant parameter affecting warpage is packing pressure.
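
    The sketch below reproduces the workflow on an invented two-factor slice of the problem (packing pressure and melt temperature only): a second-order response surface is fitted to hypothetical DOE results standing in for Moldflow runs, and the fitted surface is then minimised with a small particle swarm. All numbers are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(5)

      # Hypothetical DOE results: packing pressure P (MPa), melt temperature T (deg C)
      # and warpage (mm), generated from an invented quadratic plus noise.
      P = rng.uniform(60, 100, 20)
      T = rng.uniform(200, 240, 20)
      warp = (0.9 - 0.012 * (P - 85) + 0.0009 * (P - 85) ** 2
              + 0.004 * (T - 215) + 0.0004 * (T - 215) ** 2 + rng.normal(0, 0.005, 20))

      # Fit the second-order RSM model: w = b0 + b1*P + b2*T + b3*P^2 + b4*T^2 + b5*P*T.
      A = np.column_stack([np.ones_like(P), P, T, P ** 2, T ** 2, P * T])
      beta, *_ = np.linalg.lstsq(A, warp, rcond=None)

      def rsm(x):
          p, t = x
          return float(beta @ np.array([1.0, p, t, p ** 2, t ** 2, p * t]))

      # Minimise the fitted surface with a small particle swarm.
      lo, hi = np.array([60.0, 200.0]), np.array([100.0, 240.0])
      x = rng.uniform(lo, hi, (30, 2)); v = np.zeros_like(x)
      pb, pbf = x.copy(), np.array([rsm(p) for p in x])
      g = pb[pbf.argmin()].copy()
      for _ in range(200):
          r1, r2 = rng.random((30, 2)), rng.random((30, 2))
          v = 0.72 * v + 1.49 * r1 * (pb - x) + 1.49 * r2 * (g - x)
          x = np.clip(x + v, lo, hi)
          f = np.array([rsm(p) for p in x])
          improved = f < pbf
          pb[improved], pbf[improved] = x[improved], f[improved]
          g = pb[pbf.argmin()].copy()

      print("optimum (P, T):", np.round(g, 1), "predicted warpage:", round(rsm(g), 4))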

  18. A Study of the Subtitle Translation in “The Big Bang Theory” from Newmark’s Communicative Translation Theory

    Institute of Scientific and Technical Information of China (English)

    甄宽; 彭念凡; 甄顺

    2015-01-01

    Subtitle translation is very different from other forms of translation, and translators should meet the particular needs of the subtitle. This study analyses the subtitle translation of "The Big Bang Theory" from the perspective of Newmark's Communicative Translation Theory in three main respects: information transmission, aesthetic effect and emotional transmission. For information transmission, the study puts emphasis on the constrained circumstances of subtitling. For aesthetic effect, the study explores the expression of a sense of beauty. For emotional transmission, the study examines the use of rhetoric to express different emotions.

  19. An artificial intelligence approach fit for tRNA gene studies in the era of big sequence data.

    Science.gov (United States)

    Iwasaki, Yuki; Abe, Takashi; Wada, Kennosuke; Wada, Yoshiko; Ikemura, Toshimichi

    2017-09-12

    Unsupervised data mining capable of extracting a wide range of knowledge from big data without prior knowledge or particular models is a timely application in the era of big sequence data accumulation in genome research. By handling oligonucleotide compositions as high-dimensional data, we have previously modified the conventional self-organizing map (SOM) for genome informatics and established BLSOM, which can analyze more than ten million sequences simultaneously. Here, we develop BLSOM specialized for tRNA genes (tDNAs) that can cluster (self-organize) more than one million microbial tDNAs according to their cognate amino acid solely depending on tetra- and pentanucleotide compositions. This unsupervised clustering can reveal combinatorial oligonucleotide motifs that are responsible for the amino acid-dependent clustering, as well as other functionally and structurally important consensus motifs, which have been evolutionarily conserved. BLSOM is also useful for identifying tDNAs as phylogenetic markers for special phylotypes. When we constructed BLSOM with 'species-unknown' tDNAs from metagenomic sequences plus 'species-known' microbial tDNAs, a large portion of metagenomic tDNAs self-organized with species-known tDNAs, yielding information on microbial communities in environmental samples. BLSOM can also enhance accuracy in the tDNA database obtained from big sequence data. This unsupervised data mining should become important for studying numerous functionally unclear RNAs obtained from a wide range of organisms.
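
    The sketch below illustrates only the data representation and clustering idea: tetranucleotide composition vectors are computed for a few randomly generated stand-in sequences and fed to a small self-organizing map. The MiniSom library is used here as a generic substitute for BLSOM (a batch-learning SOM variant scaled to millions of sequences), and the sequences and map size are invented for illustration.

      import numpy as np
      from itertools import product
      from minisom import MiniSom   # generic SOM library, a stand-in for BLSOM

      K = 4
      KMERS = ["".join(p) for p in product("ACGT", repeat=K)]
      INDEX = {kmer: i for i, kmer in enumerate(KMERS)}

      def kmer_composition(seq):
          """Normalised tetranucleotide composition vector of one sequence."""
          v = np.zeros(len(KMERS))
          for i in range(len(seq) - K + 1):
              idx = INDEX.get(seq[i:i + K])
              if idx is not None:
                  v[idx] += 1
          return v / max(v.sum(), 1.0)

      rng = np.random.default_rng(0)
      def random_tdna(length=76):
          return "".join(rng.choice(list("ACGT"), size=length))

      # Stand-in (sequence, cognate amino acid) records; a real run would use >1e6 tDNAs.
      tdnas = [(random_tdna(), aa) for aa in ["Ala", "Phe", "Gly", "Ser"]]
      X = np.array([kmer_composition(s) for s, _ in tdnas])

      som = MiniSom(10, 10, X.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
      som.train_random(X, 1000)
      for (seq, aa), vec in zip(tdnas, X):
          print(aa, "maps to node", som.winner(vec))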

  20. Optimisation of occupational exposure

    International Nuclear Information System (INIS)

    Webb, G.A.M.; Fleishman, A.B.

    1982-01-01

    The general concept of the optimisation of protection of the public is briefly described. Some ideas being developed for extending the cost benefit framework to include radiation workers with full implementation of the ALARA criterion are described. The role of cost benefit analysis in radiological protection and the valuation of health detriment including the derivation of monetary values and practical implications are discussed. Cost benefit analysis can lay out for inspection the doses, the associated health detriment costs and the costs of protection for alternative courses of action. However it is emphasised that the cost benefit process is an input to decisions on what is 'as low as reasonably achievable' and not a prescription for making them. (U.K.)
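
    As a purely illustrative sketch of the differential cost-benefit comparison described above, the snippet below ranks hypothetical protection options by the sum of protection cost and monetised collective-dose detriment; the alpha value and the options are invented, and real ALARA decisions would weigh many further factors.

      # Assumed monetary value of detriment per unit collective dose (illustrative only).
      ALPHA = 50_000.0   # EUR per man-sievert

      # Hypothetical options: (name, annual protection cost in EUR, collective dose in man-Sv).
      options = [
          ("no additional measures",      0.0, 2.0),
          ("local shielding",        40_000.0, 1.0),
          ("remote handling",       120_000.0, 0.7),
      ]

      def total_cost(protection_cost, collective_dose):
          """Protection cost plus monetised health detriment."""
          return protection_cost + ALPHA * collective_dose

      for name, cost, dose in options:
          print(f"{name:24s} total cost = {total_cost(cost, dose):>9,.0f} EUR")
      best = min(options, key=lambda o: total_cost(o[1], o[2]))
      print("lowest total cost option:", best[0])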

  1. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with especial reference to any future enquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors and an attempt is made to tackle the problem that participation in a public inquiry is far too late to be objective. It is felt by the author that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  2. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of the bachelor thesis is to describe the Big Data problem area and the OLAP aggregation operations for decision support that are applied to such data using Apache Hadoop technology. The major part of the thesis is devoted to describing this technology. The last chapter deals with how the aggregation operations are applied and with the issues involved in implementing them. The thesis concludes with an overall evaluation of the work and the possibilities for future use of the resulting system.

  3. A comparison of forward planning and optimised inverse planning

    International Nuclear Information System (INIS)

    Oldham, Mark; Neal, Anthony; Webb, Steve

    1995-01-01

    A radiotherapy treatment plan optimisation algorithm has been applied to 48 prostate plans and the results compared with those of an experienced human planner. Twelve patients were used in the study, and 3-, 4-, 6- and 8-field plans (with standard coplanar beam angles for each plan type) were optimised by both the human planner and the optimisation algorithm. The human planner 'optimised' the plans by conventional forward planning techniques. The optimisation algorithm was based on fast simulated annealing. 'Importance factors' assigned to different regions of the patient provide a method for controlling the algorithm, and it was found that the same values gave good results for almost all plans. The plans were compared on the basis of dose statistics, normal tissue complication probability (NTCP) and tumour control probability (TCP). The results show that the optimisation algorithm yielded results that were at least as good as the human planner's for all plan types, and on the whole slightly better. A study of the beam weights chosen by the optimisation algorithm and the planner is presented. The optimisation algorithm showed greater variation in response to individual patient geometry. For simple (e.g. 3-field) plans it was found to consistently achieve slightly higher TCP and lower NTCP values. For more complicated (e.g. 8-field) plans the optimisation also achieved slightly better results, generally with fewer beams. The optimisation time was always ≤5 minutes, up to 20 times faster than the human planner
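
    A toy sketch of the kind of algorithm described is given below: beam weights for a four-field plan are optimised by fast simulated annealing against a quadratic cost with importance factors for target coverage and normal-tissue sparing. The dose-influence matrices, importance factors and cooling schedule are invented stand-ins; a real planning system would use dose distributions from a dose engine and NTCP/TCP-based objectives.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy dose model: dose to each voxel is a linear combination of beam weights,
      # D = A @ w, with influence matrices assumed to come from a dose engine.
      n_target, n_normal, n_beams = 50, 30, 4
      A_target = rng.uniform(0.8, 1.2, (n_target, n_beams))
      A_normal = rng.uniform(0.1, 0.5, (n_normal, n_beams))

      def cost(w, prescribed=60.0, imp_target=1.0, imp_normal=0.3):
          """Quadratic cost with importance factors for target and normal tissue."""
          d_t = A_target @ w
          d_n = A_normal @ w
          return (imp_target * np.mean((d_t - prescribed) ** 2)
                  + imp_normal * np.mean(d_n ** 2))

      def fast_simulated_annealing(n_iter=5000, t0=50.0):
          w = np.full(n_beams, 15.0)
          best_w, best_c = w.copy(), cost(w)
          for k in range(1, n_iter + 1):
              temp = t0 / k                                  # fast cooling schedule
              trial = np.clip(w + rng.normal(0.0, 1.0, n_beams), 0.0, None)
              dc = cost(trial) - cost(w)
              if dc < 0 or rng.random() < np.exp(-dc / temp):
                  w = trial
                  if cost(w) < best_c:
                      best_w, best_c = w.copy(), cost(w)
          return best_w, best_c

      w_opt, c_opt = fast_simulated_annealing()
      print("beam weights:", np.round(w_opt, 2), "cost:", round(c_opt, 2))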

  4. BIG DATA

    OpenAIRE

    Abhishek Dubey

    2018-01-01

    The term 'Big Data' describes innovative methods and technologies to capture, store, distribute, manage and analyse petabyte-scale or larger datasets characterised by high velocity and varied structures. Big data can be structured, unstructured or semi-structured, rendering conventional data management techniques inadequate. Data are generated from many distinct sources and can arrive in the system at various rates. In order to handle this...

  5. Big geo data surface approximation using radial basis functions: A comparative study

    Science.gov (United States)

    Majdisova, Zuzana; Skala, Vaclav

    2017-12-01

    Approximation of scattered data is often a task in many engineering problems. The Radial Basis Function (RBF) approximation is appropriate for big scattered datasets in n-dimensional space. It is a non-separable approximation, as it is based on the distance between two points. This method leads to the solution of an overdetermined linear system of equations. In this paper the RBF approximation methods are briefly described, a new approach to the RBF approximation of big datasets is presented, and a comparison for different Compactly Supported RBFs (CS-RBFs) is made with respect to the accuracy of the computation. The proposed approach uses symmetry of a matrix, partitioning the matrix into blocks and data structures for storage of the sparse matrix. The experiments are performed for synthetic and real datasets.
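
    A minimal sketch of a CS-RBF approximation of a big scattered dataset is given below, using Wendland's C2 function and a sparse least-squares solve of the overdetermined system; the test surface, number of centres and support radius are arbitrary choices, and the block-wise, symmetry-exploiting storage scheme of the paper is not reproduced.

      import numpy as np
      from scipy import sparse
      from scipy.sparse.linalg import lsqr

      rng = np.random.default_rng(3)

      # Scattered 2-D samples of an assumed test surface.
      n_pts, n_centres, support = 20_000, 400, 0.15
      pts = rng.random((n_pts, 2))
      vals = np.sin(6 * pts[:, 0]) * np.cos(4 * pts[:, 1])
      centres = rng.random((n_centres, 2))

      def wendland_c2(r):
          """Compactly supported Wendland C2 RBF; zero for r >= 1."""
          return np.where(r < 1.0, (1.0 - r) ** 4 * (4.0 * r + 1.0), 0.0)

      # Build the sparse collocation matrix column by column: only nearby points contribute.
      rows, cols, data = [], [], []
      for j, c in enumerate(centres):
          r = np.linalg.norm(pts - c, axis=1) / support
          nz = np.flatnonzero(r < 1.0)
          rows.extend(nz); cols.extend([j] * len(nz)); data.extend(wendland_c2(r[nz]))
      A = sparse.csr_matrix((data, (rows, cols)), shape=(n_pts, n_centres))

      coef = lsqr(A, vals)[0]      # least-squares solution of the overdetermined system
      rmse = np.sqrt(np.mean((A @ coef - vals) ** 2))
      print(f"matrix density: {A.nnz / (n_pts * n_centres):.2%}, RMSE: {rmse:.3f}")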

  6. Data, BigData and smart cities. Considerations and case study on environmental monitoring

    Directory of Open Access Journals (Sweden)

    Giacomo Chiesa

    2014-10-01

    The growing interest in technologies and strategies for constructing smart cities and smart buildings promotes the spread of ICT solutions, which often use large amounts of data. Nowadays, urban monitoring is often interrelated with the innovations introduced by Big Data and the neologism "datization", passing from the collection of a limited number of data points to the accumulation of as much data as possible, regardless of their future uses. The paper focuses on the data production phase in the monitoring of environmental variables using several measurement stations spread across the territory. The aim is to identify operational problems and possible solutions for a bottom-up construction of Big Data datasets.

  7. Predictive Big Data Analytics: A Study of Parkinson?s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    OpenAIRE

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph

    2016-01-01

    Background A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationsh...

  8. Stable isotope and trace element studies of black bear hair, Big Bend ecosystem, Texas and Mexico

    Science.gov (United States)

    Shanks, W.C. Pat; Hellgren, Eric C.; Stricker, Craig A.; Gemery-Hill, Pamela A.; Onorato, David P.

    2008-01-01

    Hair from black bears (Ursus americanus), collected from four areas in the Big Bend ecosystem, has been analyzed for stable isotopes of carbon, nitrogen, and sulfur to determine major food sources and for trace metals to infer possible effects of environmental contaminants. Results indicate that black bears are largely vegetarian, feeding on desert plants, nuts, and berries. Mercury concentrations in bear hair are below safe level standards (

  9. The Role of Distributed Computing in Big Data Science: Case Studies in Forensics and Bioinformatics

    OpenAIRE

    Roscigno, Gianluca

    2016-01-01

    2014 - 2015 The era of Big Data is leading to the generation of large amounts of data, which require storage and analysis capabilities that can only be addressed by distributed computing systems. To facilitate large-scale distributed computing, many programming paradigms and frameworks have been proposed, such as MapReduce and Apache Hadoop, which transparently address some issues of distributed systems and hide most of their technical details. Hadoop is curren...

  10. Results of the 2010 IGSC Topical Session on Optimisation

    International Nuclear Information System (INIS)

    Bailey, Lucy

    2014-01-01

    Document available in abstract form only. Full text follows: The 2010 IGSC topical session on optimisation explored a wide range of issues concerning optimisation throughout the radioactive waste management process. Philosophical and ethical questions were discussed, such as: - To what extent is the process of optimisation more important than the end result? - How do we balance long-term environmental safety with near-term operational safety? - For how long should options be kept open? - In balancing safety and excessive cost, when is BAT achieved and who decides on this? - How should we balance the needs of current society with those of future generations? It was clear that optimisation is about getting the right balance between a range of issues that cover: radiation protection, environmental protection, operational safety, operational requirements, social expectations and cost. The optimisation process will also need to respect various constraints, which are likely to include: regulatory requirements, site restrictions, community-imposed requirements or restrictions and resource constraints. These issues were explored through a number of presentations that discussed practical cases of optimisation occurring at different stages of international radioactive waste management programmes. These covered: - Operations and decommissioning - management of large disused components, from the findings of an international study, presented by WPDD; - Concept option selection, prior to site selection - upstream and disposal system optioneering in the UK; - Siting decisions - examples from both Germany and France, explaining how optimisation is being used to support site comparisons and communicate siting decisions; - Repository design decisions - comparison of KBS-3 horizontal and vertical deposition options in Finland; and - On-going optimisation during repository operation - operational experience from WIPP in the US. The variety of the remarks and views expressed during the

  11. Operational Radiological Protection and Aspects of Optimisation

    International Nuclear Information System (INIS)

    Lazo, E.; Lindvall, C.G.

    2005-01-01

    Since 1992, the Nuclear Energy Agency (NEA), along with the International Atomic Energy Agency (IAEA), has sponsored the Information System on Occupational Exposure (ISOE). ISOE collects and analyses occupational exposure data and experience from over 400 nuclear power plants around the world and is a forum for radiological protection experts from both nuclear power plants and regulatory authorities to share lessons learned and best practices in the management of worker radiation exposures. In connection to the ongoing work of the International Commission on Radiological Protection (ICRP) to develop new recommendations, the ISOE programme has been interested in how the new recommendations would affect operational radiological protection application at nuclear power plants. Bearing in mind that the ICRP is developing, in addition to new general recommendations, a new recommendation specifically on optimisation, the ISOE programme created a working group to study the operational aspects of optimisation, and to identify the key factors in optimisation that could usefully be reflected in ICRP recommendations. In addition, the Group identified areas where further ICRP clarification and guidance would be of assistance to practitioners, both at the plant and the regulatory authority. The specific objective of this ISOE work was to provide operational radiological protection input, based on practical experience, to the development of new ICRP recommendations, particularly in the area of optimisation. This will help assure that new recommendations will best serve the needs of those implementing radiation protection standards, for the public and for workers, at both national and international levels. (author)

  12. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Science.gov (United States)

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches
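
    The study's full analytics pipeline is not reproduced in this record; as a rough illustration of two of the steps named above, cohort rebalancing and classification, the following sketch uses scikit-learn on synthetic placeholder data (the random-oversampling choice, the feature matrix and the labels are assumptions, not PPMI data or the authors' methods).

    ```python
    # Minimal sketch of cohort rebalancing + classification on placeholder data (not PPMI).
    # Rebalancing here is plain random oversampling of the minority class; in practice it
    # should be done inside each training fold to avoid leakage into the evaluation split.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.utils import resample

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 20))            # stand-in imaging/genetic/clinical features
    y = (rng.random(500) < 0.15).astype(int)  # imbalanced diagnosis label

    majority, minority = X[y == 0], X[y == 1]
    minority_up = resample(minority, replace=True, n_samples=len(majority), random_state=0)
    X_bal = np.vstack([majority, minority_up])
    y_bal = np.array([0] * len(majority) + [1] * len(minority_up))

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("CV accuracy:", cross_val_score(clf, X_bal, y_bal, cv=5).mean())
    ```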

  13. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    Full Text Available A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model

  14. Optimising Ankle Foot Orthoses for children with Cerebral Palsy walking with excessive knee flexion to improve their mobility and participation; protocol of the AFO-CP study

    Directory of Open Access Journals (Sweden)

    Kerkum Yvette L

    2013-02-01

    Full Text Available Abstract Background Ankle-Foot-Orthoses with a ventral shell, also known as Floor Reaction Orthoses (FROs), are often used to reduce gait-related problems in children with spastic cerebral palsy (SCP), walking with excessive knee flexion. However, current evidence for the effectiveness (e.g. in terms of walking energy cost) of FROs is both limited and inconclusive. Much of this ambiguity may be due to a mismatch between the FRO ankle stiffness and the patient’s gait deviations. The primary aim of this study is to evaluate the effect of FROs optimised for ankle stiffness on the walking energy cost in children with SCP, compared to walking with shoes alone. In addition, effects on various secondary outcome measures will be evaluated in order to identify possible working mechanisms and potential predictors of FRO treatment success. Method/Design A pre-post experimental study design will include 32 children with SCP, walking with excessive knee flexion in midstance, recruited from our university hospital and affiliated rehabilitation centres. All participants will receive a newly designed FRO, allowing ankle stiffness to be varied into three configurations by means of a hinge. Gait biomechanics will be assessed for each FRO configuration. The FRO that results in the greatest reduction in knee flexion during the single stance phase will be selected as the subject’s optimal FRO. Subsequently, the effects of wearing this optimal FRO will be evaluated after 12–20 weeks. The primary study parameter will be walking energy cost, with the most important secondary outcomes being intensity of participation, daily activity, walking speed and gait biomechanics. Discussion The AFO-CP trial will be the first experimental study to evaluate the effect of individually optimised FROs on mobility and participation. The evaluation will include outcome measures at all levels of the International Classification of Functioning, Disability and Health, providing a unique

  15. Machine Learning for Big Data: A Study to Understand Limits at Scale

    Energy Technology Data Exchange (ETDEWEB)

    Sukumar, Sreenivas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Del-Castillo-Negrete, Carlos Emilio [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-21

    This report aims to empirically understand the limits of machine learning when applied to Big Data. We observe that recent innovations in being able to collect, access, organize, integrate, and query massive amounts of data from a wide variety of data sources have brought statistical data mining and machine learning under more scrutiny, evaluation and application for gleaning insights from the data than ever before. Much is expected from algorithms without understanding their limitations at scale while dealing with massive datasets. In that context, we pose and address the following questions How does a machine learning algorithm perform on measures such as accuracy and execution time with increasing sample size and feature dimensionality? Does training with more samples guarantee better accuracy? How many features to compute for a given problem? Do more features guarantee better accuracy? Do efforts to derive and calculate more features and train on larger samples worth the effort? As problems become more complex and traditional binary classification algorithms are replaced with multi-task, multi-class categorization algorithms do parallel learners perform better? What happens to the accuracy of the learning algorithm when trained to categorize multiple classes within the same feature space? Towards finding answers to these questions, we describe the design of an empirical study and present the results. We conclude with the following observations (i) accuracy of the learning algorithm increases with increasing sample size but saturates at a point, beyond which more samples do not contribute to better accuracy/learning, (ii) the richness of the feature space dictates performance - both accuracy and training time, (iii) increased dimensionality often reflected in better performance (higher accuracy in spite of longer training times) but the improvements are not commensurate the efforts for feature computation and training and (iv) accuracy of the learning algorithms
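
    The questions above about accuracy and training time as sample size grows can be probed with a simple learning-curve experiment; the sketch below does this on synthetic data with a single classifier (both choices are assumptions for illustration, not the report's actual benchmark setup).

    ```python
    # Learning-curve sketch: accuracy and training time vs. training-set size (synthetic data).
    import time
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=50_000, n_features=50, n_informative=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

    for n in (500, 2_000, 8_000, 32_000):
        t0 = time.perf_counter()
        model = LogisticRegression(max_iter=1000).fit(X_tr[:n], y_tr[:n])
        elapsed = time.perf_counter() - t0
        acc = accuracy_score(y_te, model.predict(X_te))
        print(f"n={n:>6}  accuracy={acc:.3f}  train_time={elapsed:.2f}s")
    ```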

  16. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  17. Topology optimised wavelength dependent splitters

    DEFF Research Database (Denmark)

    Hede, K. K.; Burgos Leon, J.; Frandsen, Lars Hagedorn

    A photonic crystal wavelength dependent splitter has been constructed by utilising topology optimisation [1]. The splitter has been fabricated in a silicon-on-insulator material (Fig. 1). The topology optimised wavelength dependent splitter demonstrates promising 3D FDTD simulation results. ... This complex photonic crystal structure is very sensitive to small fabrication variations from the expected topology optimised design. A wavelength dependent splitter is an important basic building block for high-performance nanophotonic circuits. [1] J. S. Jensen and O. Sigmund, Appl. Phys. Lett. 84, 2022...

  18. Effects of benchmarking on the quality of type 2 diabetes care: results of the OPTIMISE (Optimal Type 2 Diabetes Management Including Benchmarking and Standard Treatment) study in Greece

    Science.gov (United States)

    Tsimihodimos, Vasilis; Kostapanos, Michael S.; Moulis, Alexandros; Nikas, Nikos; Elisaf, Moses S.

    2015-01-01

    Objectives: To investigate the effect of benchmarking on the quality of type 2 diabetes (T2DM) care in Greece. Methods: The OPTIMISE (Optimal Type 2 Diabetes Management Including Benchmarking and Standard Treatment) study [ClinicalTrials.gov identifier: NCT00681850] was an international multicenter, prospective cohort study. It included physicians randomized 3:1 to either receive benchmarking for glycated hemoglobin (HbA1c), systolic blood pressure (SBP) and low-density lipoprotein cholesterol (LDL-C) treatment targets (benchmarking group) or not (control group). The proportions of patients achieving the targets of the above-mentioned parameters were compared between groups after 12 months of treatment. Also, the proportions of patients achieving those targets at 12 months were compared with baseline in the benchmarking group. Results: In the Greek region, the OPTIMISE study included 797 adults with T2DM (570 in the benchmarking group). At month 12 the proportion of patients within the predefined targets for SBP and LDL-C was greater in the benchmarking compared with the control group (50.6 versus 35.8%, and 45.3 versus 36.1%, respectively). However, these differences were not statistically significant. No difference between groups was noted in the percentage of patients achieving the predefined target for HbA1c. At month 12 the increase in the percentage of patients achieving all three targets was greater in the benchmarking (5.9–15.0%) than in the control group (2.7–8.1%). In the benchmarking group more patients were on target regarding SBP (50.6% versus 29.8%), LDL-C (45.3% versus 31.3%) and HbA1c (63.8% versus 51.2%) at 12 months compared with baseline (p Benchmarking may comprise a promising tool for improving the quality of T2DM care. Nevertheless, target achievement rates of each, and of all three, quality indicators were suboptimal, indicating there are still unmet needs in the management of T2DM. PMID:26445642

  19. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  20. Study rationale and design of OPTIMISE, a randomised controlled trial on the effect of benchmarking on quality of care in type 2 diabetes mellitus

    Science.gov (United States)

    2011-01-01

    Background To investigate the effect of physician- and patient-specific feedback with benchmarking on the quality of care in adults with type 2 diabetes mellitus (T2DM). Methods Study centres in six European countries were randomised to either a benchmarking or control group. Physicians in both groups received feedback on modifiable outcome indicators (glycated haemoglobin [HbA1c], glycaemia, total cholesterol, high density lipoprotein-cholesterol, low density lipoprotein [LDL]-cholesterol and triglycerides) for each patient at 0, 4, 8 and 12 months, based on the four times yearly control visits recommended by international guidelines. The benchmarking group also received comparative results on three critical quality indicators of vascular risk (HbA1c, LDL-cholesterol and systolic blood pressure [SBP]), checked against the results of their colleagues from the same country, and versus pre-set targets. After 12 months of follow up, the percentage of patients achieving the pre-determined targets for the three critical quality indicators will be assessed in the two groups. Results Recruitment was completed in December 2008 with 3994 evaluable patients. Conclusions This paper discusses the study rationale and design of OPTIMISE, a randomised controlled study, that will help assess whether benchmarking is a useful clinical tool for improving outcomes in T2DM in primary care. Trial registration NCT00681850 PMID:21939502

  1. Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    The problem in optimising the laser cutting process is outlined. Basic optimisation criteria and principles for adapting an optimisation method, the simplex method, are presented. The results of implementing a response function in the optimisation are discussed with respect to the quality as well...
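
    The record does not give the response function used in the study; as a hedged sketch of the simplex approach it describes, the snippet below minimises an invented cut-quality response over two made-up process parameters with SciPy's Nelder-Mead implementation.

    ```python
    # Nelder-Mead simplex optimisation of a stand-in cut-quality response function.
    # The quadratic response surface and the (speed, power) parameters are placeholders;
    # in the study the response would come from measured cut quality.
    import numpy as np
    from scipy.optimize import minimize

    def response(params):
        """Lower is better: fictitious roughness penalty around an invented optimum."""
        speed, power = params                  # e.g. cutting speed [m/min], laser power [kW]
        return (speed - 2.5) ** 2 + 0.5 * (power - 1.8) ** 2 + 0.1 * speed * power

    result = minimize(response, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
    print("optimal (speed, power):", result.x, "response:", result.fun)
    ```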

  2. The Promise and Perils of Using Big Data in the Study of Corporate Networks

    DEFF Research Database (Denmark)

    Heemskerk, Eelke; Young, Kevin; Takes, Frank W.

    2018-01-01

    ... challenges associated with the nature of the subject matter, variable data quality and other problems associated with currently available data on this scale, we discuss the promise and perils of using big corporate network data (BCND). We propose a standard procedure for helping researchers deal with BCND problems. While acknowledging that different research questions require different approaches to data quality, we offer a schematic platform that researchers can follow to make informed and intelligent decisions about BCND issues and address these through a specific work-flow procedure. For each step...

  3. Turbulence optimisation in stellarator experiments

    Energy Technology Data Exchange (ETDEWEB)

    Proll, Josefine H.E. [Max-Planck/Princeton Center for Plasma Physics (Germany); Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstr. 1, 17491 Greifswald (Germany); Faber, Benjamin J. [HSX Plasma Laboratory, University of Wisconsin-Madison, Madison, WI 53706 (United States); Helander, Per; Xanthopoulos, Pavlos [Max-Planck/Princeton Center for Plasma Physics (Germany); Lazerson, Samuel A.; Mynick, Harry E. [Plasma Physics Laboratory, Princeton University, P.O. Box 451 Princeton, New Jersey 08543-0451 (United States)

    2015-05-01

    Stellarators, the twisted siblings of the axisymmetric fusion experiments called tokamaks, have historically suffered from confining the heat of the plasma insufficiently compared with tokamaks and were therefore considered to be less promising candidates for a fusion reactor. This has changed, however, with the advent of stellarators in which the laminar transport is reduced to levels below that of tokamaks by shaping the magnetic field accordingly. As in tokamaks, the turbulent transport remains as the now dominant transport channel. Recent analytical theory suggests that the large configuration space of stellarators allows for an additional optimisation of the magnetic field to also reduce the turbulent transport. In this talk, the idea behind the turbulence optimisation is explained. We also present how an optimised equilibrium is obtained and how it might differ from the equilibrium field of an already existing device, and we compare experimental turbulence measurements in different configurations of the HSX stellarator in order to test the optimisation procedure.

  4. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  5. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  6. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  7. Transforming fragments into candidates: small becomes big in medicinal chemistry.

    Science.gov (United States)

    de Kloe, Gerdien E; Bailey, David; Leurs, Rob; de Esch, Iwan J P

    2009-07-01

    Fragment-based drug discovery (FBDD) represents a logical and efficient approach to lead discovery and optimisation. It can draw on structural, biophysical and biochemical data, incorporating a wide range of inputs, from precise mode-of-binding information on specific fragments to wider ranging pharmacophoric screening surveys using traditional HTS approaches. It is truly an enabling technology for the imaginative medicinal chemist. In this review, we analyse a representative set of 23 published FBDD studies that describe how low molecular weight fragments are being identified and efficiently transformed into higher molecular weight drug candidates. FBDD is now becoming warmly endorsed by industry as well as academia and the focus on small interacting molecules is making a big scientific impact.

  8. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seeds of new, valuable operational insights for private companies and public organisations. While optimistic claims abound, research on Big Data in the public sector has so far been limited. This article examines how the public healthcare sector can reuse and exploit an ever-growing volume of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (Dansk AlmenMedicinsk Database, DAMD). The analysis shows that (re)use of data in new contexts involves a multi-faceted trade-off, not only between economic rationales and quality considerations, but also concerning control over sensitive personal data and the ethical implications for citizens. In the DAMD case, data are on the one hand used "in the service of a good cause" to...

  9. Are Big Food's corporate social responsibility strategies valuable to communities? A qualitative study with parents and children.

    Science.gov (United States)

    Richards, Zoe; Phillipson, Lyn

    2017-12-01

    Recent studies have identified parents and children as two target groups whom Big Food hopes to positively influence through its corporate social responsibility (CSR) strategies. The current preliminary study aimed to gain an in-depth understanding of parents and children's awareness and interpretation of Big Food's CSR strategies to understand how CSR shapes their beliefs about companies. Community-based qualitative semi-structured interviews. New South Wales, Australia. Parents (n 15) and children aged 8-12 years (n 15). Parents and children showed unprompted recognition of CSR activities when shown McDonald's and Coca-Cola brand logos, indicating a strong level of association between the brands and activities that target the settings of children. When discussing CSR strategies some parents and most children saw value in the activities, viewing them as acts of merit or worth. For some parents and children, the companies' CSR activities were seen as a reflection of the company's moral attributes, which resonated with their own values of charity and health. For others, CSR strategies were in conflict with companies' core business. Finally, some also viewed the activities as harmful, representing a deceit of the public and a smokescreen for the companies' ultimately unethical behaviour. A large proportion of participants valued the CSR activities, signalling that denormalising CSR to sever the strong ties between the community and Big Food will be a difficult process for the public health community. Efforts to gain public acceptance for action on CSR may need greater levels of persuasion to gain public support of a comprehensive and restrictive approach.

  10. Development of a United States-Mexico Emissions Inventory for the Big Bend Regional Aerosol and Visibility Observational (BRAVO) Study.

    Science.gov (United States)

    Kuhns, Hampden; Knipping, Eladio M; Vukovich, Jeffrey M

    2005-05-01

    The Big Bend Regional Aerosol and Visibility Observational (BRAVO) Study was commissioned to investigate the sources of haze at Big Bend National Park in southwest Texas. The modeling domain of the BRAVO Study includes most of the continental United States and Mexico. The BRAVO emissions inventory was constructed from the 1999 National Emission Inventory for the United States, modified to include finer-resolution data for Texas and 13 U.S. states in close proximity. The first regional-scale Mexican emissions inventory designed for air-quality modeling applications was developed for 10 northern Mexican states, the Tula Industrial Park in the state of Hidalgo, and the Popocatépetl volcano in the state of Puebla. Emissions data were compiled from numerous sources, including the U.S. Environmental Protection Agency (EPA), the Texas Natural Resources Conservation Commission (now Texas Commission on Environmental Quality), the Eastern Research Group, the Minerals Management Service, the Instituto Nacional de Ecología, and the Instituto Nacional de Estadistica Geografía y Informática. The inventory includes emissions for CO, nitrogen oxides, sulfur dioxide, volatile organic compounds (VOCs), ammonia, particulate matter (PM) < 10 microm in aerodynamic diameter, and PM < 2.5 microm in aerodynamic diameter. Wind-blown dust and biomass burning were not included in the inventory, although high concentrations of dust and organic PM attributed to biomass burning have been observed at Big Bend National Park. The SMOKE modeling system was used to generate gridded emissions fields for use with the Regional Modeling System for Aerosols and Deposition (REMSAD) and the Community Multiscale Air Quality model modified with the Model of Aerosol Dynamics, Reaction, Ionization and Dissolution (CMAQ-MADRID). The compilation of the inventory, supporting model input data, and issues encountered during the development of the inventory are documented. A comparison of the BRAVO emissions

  11. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  12. Vaccine strategies: Optimising outcomes.

    Science.gov (United States)

    Hardt, Karin; Bonanni, Paolo; King, Susan; Santos, Jose Ignacio; El-Hodhod, Mostafa; Zimet, Gregory D; Preiss, Scott

    2016-12-20

    factors that encourage success, which often include strong support from government and healthcare organisations, as well as tailored, culturally-appropriate local approaches to optimise outcomes. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  13. ASAP ECMO: Antibiotic, Sedative and Analgesic Pharmacokinetics during Extracorporeal Membrane Oxygenation: a multi-centre study to optimise drug therapy during ECMO

    Directory of Open Access Journals (Sweden)

    Shekar Kiran

    2012-11-01

    Full Text Available Abstract Background Given the expanding scope of extracorporeal membrane oxygenation (ECMO) and its variable impact on drug pharmacokinetics as observed in neonatal studies, it is imperative that the effects of the device on the drugs commonly prescribed in the intensive care unit (ICU) are further investigated. Currently, there are no data to confirm the appropriateness of standard drug dosing in adult patients on ECMO. Ineffective drug regimens in these critically ill patients can seriously worsen patient outcomes. This study was designed to describe the pharmacokinetics of the commonly used antibiotic, analgesic and sedative drugs in adult patients receiving ECMO. Methods/Design This is a multi-centre, open-label, descriptive pharmacokinetic (PK) study. Eligible patients will be adults treated with ECMO for severe cardiac and/or respiratory failure at five Intensive Care Units in Australia and New Zealand. Patients will receive the study drugs as part of their routine management. Blood samples will be taken from indwelling catheters to investigate plasma concentrations of several antibiotics (ceftriaxone, meropenem, vancomycin, ciprofloxacin, gentamicin, piperacillin-tazobactam, ticarcillin-clavulanate, linezolid, fluconazole, voriconazole, caspofungin, oseltamivir), sedatives and analgesics (midazolam, morphine, fentanyl, propofol, dexmedetomidine, thiopentone). The PK of each drug will be characterised to determine the variability of PK in these patients and to develop dosing guidelines for prescription during ECMO. Discussion The evidence-based dosing algorithms generated from this analysis can be evaluated in later clinical studies. This knowledge is vitally important for optimising pharmacotherapy in these most severely ill patients to maximise the opportunity for therapeutic success and minimise the risk of therapeutic failure. Trial registration ACTRN12612000559819

  14. Study of the magnetic spectrograph BIG KARL on image errors and their causes

    International Nuclear Information System (INIS)

    Paul, D.

    1987-12-01

    The ion-optical aberrations of the QQDDQ spectrograph BIG KARL are measured and analyzed in order to improve resolution and transmission at large acceptance. The entrance phase space is scanned in a Cartesian grid by means of a narrow collimated beam of scattered deuterons. The distortions due to the nonlinear transformation by the system are measured in the detector plane. A model is developed which describes the measured distortions. The model makes it possible to locate the nonlinearities in the system responsible for the observed distortions. It gives a good understanding of geometrical nonlinearities up to the fifth order and chromatic nonlinearities up to the third order. To confirm the model, the magnetic field in the quadrupoles is measured including the fringe field region. Furthermore, nonlinearities appearing in ideal magnets are discussed and compared to experimental data. (orig.)
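
    The study's actual transfer-map model is not given here; as a loose illustration of fitting a polynomial distortion model to grid-scan data, the sketch below builds monomials up to a chosen order and solves a least-squares fit (the grid, the synthetic "measured" positions and the variable names are assumptions).

    ```python
    # Sketch: least-squares fit of a polynomial distortion model x_det = f(x_in, a_in).
    # Degree, grid and the synthetic detector positions below are illustrative placeholders.
    import numpy as np

    def design_matrix(x, a, max_order=5):
        """Columns are monomials x**i * a**j with i + j <= max_order."""
        cols = [x**i * a**j for i in range(max_order + 1)
                            for j in range(max_order + 1 - i)]
        return np.column_stack(cols)

    x_in, a_in = np.meshgrid(np.linspace(-1, 1, 21), np.linspace(-1, 1, 21))
    x_in, a_in = x_in.ravel(), a_in.ravel()
    x_det = 2.0 * x_in + 0.3 * x_in**3 + 0.1 * x_in * a_in**2   # pretend measured positions

    A = design_matrix(x_in, a_in)
    coeffs, *_ = np.linalg.lstsq(A, x_det, rcond=None)
    print("indices of largest-magnitude terms:", np.argsort(-np.abs(coeffs))[:3])
    ```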

  15. Big(ger Data as Better Data in Open Distance Learning

    Directory of Open Access Journals (Sweden)

    Paul Prinsloo

    2015-02-01

    Full Text Available In the context of the hype, promise and perils of Big Data and the currently dominant paradigm of data-driven decision-making, it is important to critically engage with the potential of Big Data for higher education. We do not question the potential of Big Data, but we do raise a number of issues, and present a number of theses to be seriously considered in realising this potential. The University of South Africa (Unisa) is one of the mega ODL institutions in the world with more than 360,000 students and a range of courses and programmes. Unisa already has access to a staggering amount of student data, hosted in disparate sources, and governed by different processes. As the university moves to mainstreaming online learning, the amount of and need for analyses of data are increasing, raising important questions regarding our assumptions, understanding, data sources, systems and processes. This article presents a descriptive case study of the current state of student data at Unisa, as well as explores the impact of existing data sources and analytic approaches. From the analysis it is clear that in order for big(ger) data to be better data, a number of issues need to be addressed. The article concludes by presenting a number of theses that should form the basis for the imperative to optimise the harvesting, analysis and use of student data.

  16. Public transport optimisation emphasising passengers’ travel behaviour.

    OpenAIRE

    Jensen, Jens Parbo; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    Passengers in public transport complaining about their travel experiences are not uncommon. This might seem counterintuitive since several operators worldwide are presenting better key performance indicators year by year. The present PhD study focuses on developing optimisation algorithms to enhance the operations of public transport while explicitly emphasising passengers’ travel behaviour and preferences. Similar to economic theory, interactions between supply and demand are omnipresent in ...

  17. Analysis of total productive maintenance (TPM) implementation using overall equipment effectiveness (OEE) and six big losses: A case study

    Science.gov (United States)

    Martomo, Zenithia Intan; Laksono, Pringgo Widyo

    2018-02-01

    To improve machine productivity, maintenance decisions and policies must be appropriate. In the Spinning II unit at PT Apac Inti Corpora, 124 ring frame machines frequently break down, causing high downtime and missed production targets, so this research focused on the ring frame machines. This study aims to measure equipment effectiveness, find the root causes of the problem and provide suggestions for improvement. The research begins by measuring the overall equipment effectiveness (OEE) value and then identifying the six big losses that occur. The results show that the average OEE of the ring frame machines is 79.96%; this effectiveness is relatively low, since the ideal OEE standard for a world-class company is 85%. The factor contributing most to the low OEE is the performance rate, with reduced-speed losses (one of the six big losses) accounting for 17.303% of all time loss. Proposed improvement actions are the application of autonomous maintenance, training for operators and maintenance technicians, and supervision of operators in the workplace.
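
    OEE is conventionally the product of availability, performance rate and quality rate; the sketch below shows that standard calculation on invented machine counts and times (none of the numbers are from the study).

    ```python
    # Overall Equipment Effectiveness: OEE = Availability x Performance x Quality.
    # All input values are hypothetical examples, not data from the cited case study.
    def oee(planned_time_min, downtime_min, ideal_cycle_min, total_count, defect_count):
        """Return (availability, performance, quality, oee) as fractions."""
        operating_time = planned_time_min - downtime_min
        availability = operating_time / planned_time_min
        performance = (ideal_cycle_min * total_count) / operating_time
        quality = (total_count - defect_count) / total_count
        return availability, performance, quality, availability * performance * quality

    a, p, q, o = oee(planned_time_min=480, downtime_min=45,
                     ideal_cycle_min=0.9, total_count=420, defect_count=12)
    print(f"Availability={a:.1%} Performance={p:.1%} Quality={q:.1%} OEE={o:.1%}")
    ```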

  18. Optimising the quality of antibiotic prescribing in out-of-hours primary care in Belgium: a study protocol for an action research project.

    Science.gov (United States)

    Colliers, Annelies; Coenen, Samuel; Philips, Hilde; Remmen, Roy; Anthierens, Sibyl

    2017-10-15

    Antimicrobial resistance is a major public health threat driven by inappropriate antibiotic use, mainly in general practice and for respiratory tract infections. In Belgium, the quality of general practitioners' (GPs) antibiotic prescribing is low. To improve antibiotic use, we need a better understanding of this quality problem and corresponding interventions. A general practitioners cooperative (GPC) for out-of-hours (OOH) care presents a unique opportunity to reach a large group of GPs and work on quality improvement. Participatory action research (PAR) is a bottom-up approach that focuses on implementing change into daily practice and has the potential to empower practitioners to produce their own solutions to optimise their antibiotic prescribing. This PAR study to improve antibiotic prescribing quality in OOH care uses a mixed methods approach. In a first exploratory phase, we will develop a partnership with a GPC and map the existing barriers and opportunities. In a second phase, we will focus on facilitating change and implementing interventions through PDSA (Plan-Do-Study-Act) cycles. In a third phase, antibiotic prescribing quality outside and antibiotic use during office hours will be evaluated. Equally important are the process evaluation and theory building on improving antibiotic prescribing. The study protocol was approved by the Ethics Committee of the Antwerp University Hospital/University of Antwerp. PAR unfolds in response to the needs and issues of the stakeholders, therefore new ethics approval will be obtained at each new stage of the research. Interventions to improve antibiotic prescribing are needed now more than ever and outcomes will be highly relevant for GPCs, GPs in daily practice, national policymakers and the international scientific community. NCT03082521; Pre-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless

  19. Acoustic Resonator Optimisation for Airborne Particle Manipulation

    Science.gov (United States)

    Devendran, Citsabehsan; Billson, Duncan R.; Hutchins, David A.; Alan, Tuncay; Neild, Adrian

    Advances in micro-electromechanical systems (MEMS) technology and biomedical research necessitate micro-machined manipulators to capture, handle and position delicate micron-sized particles. To this end, a parallel plate acoustic resonator system has been investigated for the manipulation and entrapment of micron-sized particles in air. Numerical and finite element modelling was performed to optimise the design of the layered acoustic resonator. To obtain an optimised resonator design, careful consideration of the effects of layer thickness and material properties is required. Furthermore, the effect of frequency-dependent acoustic attenuation is also considered within this study, leading to an optimum operational frequency range. Finally, experimental results demonstrated good levitation and capture of particles with a range of properties and sizes, down to 14.8 μm.

  20. Techno-economic optimisation of energy systems

    International Nuclear Information System (INIS)

    Mansilla Pellen, Ch.

    2006-07-01

    The traditional approach currently used to assess the economic interest of energy systems is based on a defined flow-sheet. Some studies have shown that the flow-sheets corresponding to the best thermodynamic efficiencies do not necessarily lead to the best production costs. A method called techno-economic optimisation was proposed. This method aims at minimising the production cost of a given energy system, including both investment and operating costs. It was implemented using genetic algorithms. This approach was compared to the heat integration method on two different examples, thus validating its interest. Techno-economic optimisation was then applied to different energy systems dealing with hydrogen as well as electricity production. (author)
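
    The thesis's cost model and genetic-algorithm details are not reproduced in this record; the toy sketch below only illustrates the shape of a techno-economic objective, a levelised production cost combining annualised investment and operating cost, searched over a single design variable (all coefficients are invented).

    ```python
    # Toy techno-economic objective: levelised cost = (annualised CAPEX + OPEX) / annual output.
    # Cost coefficients, scale exponent and availability are illustrative assumptions only.
    def annualised_capex(capex, rate=0.08, years=20):
        return capex * rate / (1 - (1 + rate) ** -years)   # capital recovery factor

    def production_cost(capacity_mw):
        capex = 1.2e6 * capacity_mw ** 0.8                 # economies of scale (made-up exponent)
        opex = 4.0e4 * capacity_mw + 150 * capacity_mw**2  # operating cost growing with size
        output = capacity_mw * 8000                        # MWh per year at assumed availability
        return (annualised_capex(capex) + opex) / output   # cost per MWh

    # Crude grid search over one design variable; a genetic algorithm would explore the same
    # objective, typically over many coupled design variables at once.
    best = min(range(10, 501, 10), key=production_cost)
    print(best, "MW ->", round(production_cost(best), 2), "per MWh")
    ```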

  1. A corporate ALARA engineering support for all EDF sites. A major improvement: the generic work areas optimisation studies

    Energy Technology Data Exchange (ETDEWEB)

    Quiot, Alain [EDF, SPT, UTO, Le Central, Bat. 420, BP 129, 93162 Noisy-le-Grand Cedex (France); Lebeau, Jacques [Electricite de France, ALARA Project, Site Cap Ampere, 1, place Pleyel, 93282 Saint Denis Cedex (France)

    2004-07-01

    ALARA studies performed by EDF plants are quite simple and empirical. Most often, feedback experience and common sense, with the help of simple calculations allow reaching useful and efficient decisions. This is particularly the case when the exposure situations are not complex, within a simple environment and with a single source, or one major source. However, in more complex cases this is not enough to guarantee that actual ALARA solutions are implemented. EDF has then decided to use its national corporate engineering as a support for its sites. That engineering support is in charge of using very efficient tools such as PANTHER-RP. The objective of the presentation is to describe the engineering process and tools now available at EDF, to illustrate them with a few case studies and to describe the goals and procedures set up by EDF. (authors)

  2. The OPTIMIST study: optimisation of cost effectiveness through individualised FSH stimulation dosages for IVF treatment. A randomised controlled trial

    Directory of Open Access Journals (Sweden)

    van Tilborg Theodora C

    2012-09-01

    Full Text Available Abstract Background Costs of in vitro fertilisation (IVF) are high, which is partly due to the use of follicle stimulating hormone (FSH). FSH is usually administered in a standard dose. However, due to differences in ovarian reserve between women, ovarian response also differs with potential negative consequences on pregnancy rates. A Markov decision-analytic model showed that FSH dose individualisation according to ovarian reserve is likely to be cost-effective in women who are eligible for IVF. However, this has never been confirmed in a large randomised controlled trial (RCT). The aim of the present study is to assess whether an individualised FSH dose regime based on an ovarian reserve test (ORT) is more cost-effective than a standard dose regime. Methods/Design Multicentre RCT in subfertile women indicated for a first IVF or intracytoplasmic sperm injection cycle, who are aged ... Discussion The results of this study will be integrated into a decision model that compares cost-effectiveness of the three dose-adjustment strategies to a standard dose strategy. The study outcomes will provide scientific foundation for national and international guidelines. Trial registration NTR2657

  3. Multi-Optimisation Consensus Clustering

    Science.gov (United States)

    Li, Jian; Swift, Stephen; Liu, Xiaohui

    Ensemble Clustering has been developed to provide an alternative way of obtaining more stable and accurate clustering results. It aims to avoid the biases of individual clustering algorithms. However, it is still a challenge to develop an efficient and robust method for Ensemble Clustering. Based on an existing ensemble clustering method, Consensus Clustering (CC), this paper introduces an advanced Consensus Clustering algorithm called Multi-Optimisation Consensus Clustering (MOCC), which utilises an optimised Agreement Separation criterion and a Multi-Optimisation framework to improve the performance of CC. Fifteen different data sets are used for evaluating the performance of MOCC. The results reveal that MOCC can generate more accurate clustering results than the original CC algorithm.
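
    MOCC's agreement-separation criterion and multi-optimisation framework are not detailed in this record; the sketch below only shows the underlying consensus-clustering idea it builds on: accumulate a co-association matrix from several base clusterings, then cluster that matrix (synthetic data and the choice of base clusterers are assumptions).

    ```python
    # Basic consensus clustering via a co-association matrix (not the MOCC algorithm itself).
    import numpy as np
    from sklearn.cluster import AgglomerativeClustering, KMeans
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=200, centers=3, random_state=0)
    n = len(X)

    # Ensemble of K-means runs with different k and seeds
    labels_list = [KMeans(n_clusters=k, n_init=10, random_state=s).fit_predict(X)
                   for k, s in [(3, 0), (3, 1), (4, 2), (5, 3)]]
    coassoc = np.zeros((n, n))
    for labels in labels_list:
        coassoc += (labels[:, None] == labels[None, :]).astype(float)
    coassoc /= len(labels_list)

    # Consensus partition: average-linkage clustering on 1 - co-association as a distance
    # (scikit-learn >= 1.2 uses metric=; older releases call this parameter affinity=)
    consensus = AgglomerativeClustering(n_clusters=3, metric="precomputed",
                                        linkage="average").fit_predict(1 - coassoc)
    print(np.bincount(consensus))
    ```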

  4. The OPTIMIST study: optimisation of cost effectiveness through individualised FSH stimulation dosages for IVF treatment. A randomised controlled trial.

    Science.gov (United States)

    van Tilborg, Theodora C; Eijkemans, Marinus J C; Laven, Joop S E; Koks, Carolien A M; de Bruin, Jan Peter; Scheffer, Gabrielle J; van Golde, Ron J T; Fleischer, Kathrin; Hoek, Annemieke; Nap, Annemiek W; Kuchenbecker, Walter K H; Manger, Petra A; Brinkhuis, Egbert A; van Heusden, Arne M; Sluijmer, Alexander V; Verhoeff, Arie; van Hooff, Marcel H A; Friederich, Jaap; Smeenk, Jesper M J; Kwee, Janet; Verhoeve, Harold R; Lambalk, Cornelis B; Helmerhorst, Frans M; van der Veen, Fulco; Mol, Ben Willem J; Torrance, Helen L; Broekmans, Frank J M

    2012-09-18

    Costs of in vitro fertilisation (IVF) are high, which is partly due to the use of follicle stimulating hormone (FSH). FSH is usually administered in a standard dose. However, due to differences in ovarian reserve between women, ovarian response also differs with potential negative consequences on pregnancy rates. A Markov decision-analytic model showed that FSH dose individualisation according to ovarian reserve is likely to be cost-effective in women who are eligible for IVF. However, this has never been confirmed in a large randomised controlled trial (RCT). The aim of the present study is to assess whether an individualised FSH dose regime based on an ovarian reserve test (ORT) is more cost-effective than a standard dose regime. Multicentre RCT in subfertile women indicated for a first IVF or intracytoplasmic sperm injection cycle, who are aged IVF with oocyte donation, will not be included. Ovarian reserve will be assessed by measuring the antral follicle count. Women with a predicted poor response or hyperresponse will be randomised for a standard versus an individualised FSH regime (150 IU/day, 225-450 IU/day and 100 IU/day, respectively). Participants will undergo a maximum of three stimulation cycles during maximally 18 months. The primary study outcome is the cumulative ongoing pregnancy rate resulting in live birth achieved within 18 months after randomisation. Secondary outcomes are parameters for ovarian response, multiple pregnancies, number of cycles needed per live birth, total IU of FSH per stimulation cycle, and costs. All data will be analysed according to the intention-to-treat principle. Cost-effectiveness analysis will be performed to assess whether the health and associated economic benefits of individualised treatment of subfertile women outweigh the additional costs of an ORT. The results of this study will be integrated into a decision model that compares cost-effectiveness of the three dose-adjustment strategies to a standard dose strategy

  5. A supportive architecture for CFD-based design optimisation

    Science.gov (United States)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

    Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge because of its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their application in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. Solving a CFD-based design problem, which is usually high-dimensional with multiple objectives and constraints, is a challenge. It is desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing work has found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or at the data level to fully utilise the capabilities of different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided, and the result has shown that the proposed architecture
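
    The paper's two intelligent algorithms and its code- and data-level interfaces are not reproduced here; as a rough sketch of the loop such an architecture has to support, the snippet below wraps a stand-in CFD evaluation (run_cfd, a hypothetical placeholder for launching an external solver) in a simple parallel evolutionary search.

    ```python
    # Generic sketch of CFD-in-the-loop design optimisation (not the paper's actual system).
    import random
    from multiprocessing import Pool

    def run_cfd(design):
        # Hypothetical stand-in for an expensive CFD run returning e.g. a drag coefficient;
        # a cheap analytic "drag" keeps the sketch runnable without a solver.
        return sum((x - 0.5) ** 2 for x in design)

    def evolve(n_vars=5, pop_size=16, generations=20, seed=1):
        rng = random.Random(seed)
        pop = [[rng.random() for _ in range(n_vars)] for _ in range(pop_size)]
        with Pool() as pool:                         # candidates evaluated in parallel
            for _ in range(generations):
                scores = pool.map(run_cfd, pop)
                ranked = [p for _, p in sorted(zip(scores, pop))]
                parents = ranked[: pop_size // 2]
                children = [[x + rng.gauss(0, 0.05) for x in rng.choice(parents)]
                            for _ in range(pop_size - len(parents))]
                pop = parents + children
        return min(pop, key=run_cfd)

    if __name__ == "__main__":
        print(evolve())
    ```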

  6. Clinical validation of a public health policy-making platform for hearing loss (EVOTION): protocol for a big data study.

    Science.gov (United States)

    Dritsakis, Giorgos; Kikidis, Dimitris; Koloutsou, Nina; Murdin, Louisa; Bibas, Athanasios; Ploumidou, Katherine; Laplante-Lévesque, Ariane; Pontoppidan, Niels Henrik; Bamiou, Doris-Eva

    2018-02-15

    The holistic management of hearing loss (HL) requires an understanding of factors that predict hearing aid (HA) use and benefit beyond the acoustics of listening environments. Although several predictors have been identified, no study has explored the role of audiological, cognitive, behavioural and physiological data nor has any study collected real-time HA data. This study will collect 'big data', including retrospective HA logging data, prospective clinical data and real-time data via smart HAs, a mobile application and biosensors. The main objective is to enable the validation of the EVOTION platform as a public health policy-making tool for HL. This will be a big data international multicentre study consisting of retrospective and prospective data collection. Existing data from approximately 35 000 HA users will be extracted from clinical repositories in the UK and Denmark. For the prospective data collection, 1260 HA candidates will be recruited across four clinics in the UK and Greece. Participants will complete a battery of audiological and other assessments (measures of patient-reported HA benefit, mood, cognition, quality of life). Patients will be offered smart HAs and a mobile phone application and a subset will also be given wearable biosensors, to enable the collection of dynamic real-life HA usage data. Big data analytics will be used to detect correlations between contextualised HA usage and effectiveness, and different factors and comorbidities affecting HL, with a view to informing public health decision-making. Ethical approval was received from the London South East Research Ethics Committee (17/LO/0789), the Hippokrateion Hospital Ethics Committee (1847) and the Athens Medical Center's Ethics Committee (KM140670). Results will be disseminated through national and international events in Greece and the UK, scientific journals, newsletters, magazines and social media. Target audiences include HA users, clinicians, policy-makers and the

  7. Mothers' experiences of a Touch and Talk nursing intervention to optimise pain management in the PICU: a qualitative descriptive study.

    Science.gov (United States)

    Rennick, Janet E; Lambert, Sylvie; Childerhose, Janet; Campbell-Yeo, Marsha; Filion, Françoise; Johnston, C Celeste

    2011-06-01

    Parents consistently express a desire to support their child and retain a care-giving role in the paediatric intensive care unit (PICU). Qualitative data gathered as part of a PICU intervention study were analysed to explore mothers' experiences using a Touch and Talk intervention to comfort their children during invasive procedures. To describe how mothers experienced involvement in their children's care through a Touch and Talk intervention and whether they would participate in a similar intervention again. RESEARCH METHODOLOGY AND SETTING: A qualitative descriptive design was used and semi-structured interviews conducted with 65 mothers in three Canadian PICUs. Data were subjected to thematic analysis. The overarching theme centred on the importance of comforting the critically ill child. This included being there for the child (the importance of parental presence); making a difference in the child's pain experience; and feeling comfortable and confident about participating in care. All but two mothers would participate in the intervention again and all would recommend it to others. Giving parents the choice of being involved in their child's care using touch and distraction techniques during painful procedures can provide an invaluable opportunity to foster parenting and support the child during a difficult PICU experience. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  9. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  10. Real-Time Prediction of Gamers Behavior Using Variable Order Markov and Big Data Technology: A Case of Study

    Directory of Open Access Journals (Sweden)

    Alejandro Baldominos Gómez

    2016-03-01

    Full Text Available This paper presents the results and conclusions found when predicting the behavior of gamers in commercial videogames datasets. In particular, it uses Variable-Order Markov (VOM) to build a probabilistic model that is able to use the historic behavior of gamers and to infer what their next actions will be. Being able to predict the next user’s actions with accuracy can be of special interest to learn from the behavior of gamers, to make them more engaged and to reduce churn rate. In order to support a big volume and velocity of data, the system is built on top of the Hadoop ecosystem, using HBase for real-time processing; and the prediction tool is provided as a service (SaaS) and accessible through a RESTful API. The prediction system is evaluated using a case study with two commercial videogames, attaining promising results with high prediction accuracies.
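
    The record does not specify which VOM variant the authors use, so the sketch below is only a minimal back-off illustration of variable-order next-action prediction; the action names and toy sequence are invented, and a production system would train and serve this over HBase rather than in memory.

    ```python
    # Minimal variable-order Markov next-action prediction via context back-off.
    from collections import Counter, defaultdict

    class VOMPredictor:
        def __init__(self, max_order=3):
            self.max_order = max_order
            self.counts = defaultdict(Counter)   # context tuple -> Counter of next actions

        def fit(self, sequence):
            for i, action in enumerate(sequence):
                for k in range(self.max_order + 1):      # record contexts of length 0..max_order
                    if i - k >= 0:
                        self.counts[tuple(sequence[i - k:i])][action] += 1

        def predict(self, history):
            for k in range(self.max_order, -1, -1):      # fall back to shorter contexts
                ctx = tuple(history[-k:]) if k else ()
                if ctx in self.counts:
                    return self.counts[ctx].most_common(1)[0][0]
            return None

    model = VOMPredictor(max_order=2)
    model.fit(["login", "play", "shop", "play", "shop", "logout", "login", "play"])
    print(model.predict(["login", "play"]))   # "shop" given this toy history
    ```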

  11. The Role of Teamwork in the Analysis of Big Data: A Study of Visual Analytics and Box Office Prediction.

    Science.gov (United States)

    Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy

    2017-03-01

    Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.

  12. Symptoms of endocrine treatment and outcome in the BIG 1-98 study.

    Science.gov (United States)

    Huober, J; Cole, B F; Rabaglio, M; Giobbie-Hurder, A; Wu, J; Ejlertsen, B; Bonnefoi, H; Forbes, J F; Neven, P; Láng, I; Smith, I; Wardley, A; Price, K N; Goldhirsch, A; Coates, A S; Colleoni, M; Gelber, R D; Thürlimann, B

    2014-01-01

    There may be a relationship between the incidence of vasomotor and arthralgia/myalgia symptoms and treatment outcomes for postmenopausal breast cancer patients with endocrine-responsive disease who received adjuvant letrozole or tamoxifen. Data on patients randomized into the monotherapy arms of the BIG 1-98 clinical trial who did not have either vasomotor or arthralgia/myalgia/carpal tunnel (AMC) symptoms reported at baseline, started protocol treatment and were alive and disease-free at the 3-month landmark (n = 4,798) and at the 12-month landmark (n = 4,682) were used for this report. Cohorts of patients with vasomotor symptoms, AMC symptoms, neither, or both were defined at both 3 and 12 months from randomization. Landmark analyses were performed for disease-free survival (DFS) and for breast cancer-free interval (BCFI), using regression analysis to estimate hazard ratios (HR) and 95% confidence intervals (CI). Median follow-up was 7.0 years. Reporting of AMC symptoms was associated with better outcome for both the 3- and 12-month landmark analyses [e.g., 12-month landmark, HR (95% CI) for DFS = 0.65 (0.49-0.87), and for BCFI = 0.70 (0.49-0.99)]. By contrast, reporting of vasomotor symptoms was less clearly associated with DFS [12-month DFS HR (95% CI) = 0.82 (0.70-0.96)] and BCFI [12-month BCFI HR (95% CI) = 0.97 (0.80-1.18)]. Interaction tests indicated no effect of treatment group on associations between symptoms and outcomes. While reporting of AMC symptoms was clearly associated with better DFS and BCFI, the association between vasomotor symptoms and outcome was less clear, especially with respect to breast cancer-related events.
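
    For readers unfamiliar with the method, a landmark analysis of this kind amounts to restricting the dataset to patients alive and event-free at the landmark time and fitting a proportional hazards model with the symptom indicators as covariates. A minimal sketch using the (assumed) lifelines library and entirely fabricated toy data is given below; hazard ratios appear as exp(coef) in the printed summary.

      import pandas as pd
      from lifelines import CoxPHFitter

      # Fabricated toy rows: one per patient alive and disease-free at the 12-month
      # landmark, with follow-up time measured from that landmark (years).
      df = pd.DataFrame({
          "time":  [5.1, 6.3, 2.4, 7.0, 4.8, 6.9, 3.5, 5.9],   # time to DFS event or censoring
          "event": [1,   0,   1,   0,   1,   0,   1,   0],     # 1 = DFS event, 0 = censored
          "amc":   [1,   0,   0,   1,   1,   0,   0,   1],     # AMC symptoms reported by 12 months
          "vaso":  [0,   1,   0,   1,   1,   0,   1,   0],     # vasomotor symptoms reported by 12 months
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="time", event_col="event")      # Cox proportional hazards fit
      cph.print_summary()                                      # HR = exp(coef), with 95% CIs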

  13. Comparative case study on website traffic generated by search engine optimisation and a pay-per-click campaign, versus marketing expenditure

    OpenAIRE

    Wouter T. Kritzinger; Melius Weideman

    2015-01-01

    Background: No empirical work was found on how marketing expenses compare when used solely for either the one or the other of the two main types of search engine marketing. Objectives: This research set out to determine how the results of the implementation of a pay-per-click campaign compared to those of a search engine optimisation campaign, given the same website and environment. At the same time, the expenses incurred on both these marketing methods were recorded and compared. M...

  14. Cogeneration technologies, optimisation and implementation

    CERN Document Server

    Frangopoulos, Christos A

    2017-01-01

    Cogeneration refers to the use of a power station to deliver two or more useful forms of energy, for example, to generate electricity and heat at the same time. This book provides an integrated treatment of cogeneration, including a tour of the available technologies and their features, and how these systems can be analysed and optimised.

  15. For Time-Continuous Optimisation

    DEFF Research Database (Denmark)

    Heinrich, Mary Katherine; Ayres, Phil

    2016-01-01

    Strategies for optimisation in design normatively assume an artefact end-point, disallowing continuous architecture that engages living systems, dynamic behaviour, and complex systems. In our Flora Robotica investigations of symbiotic plant-robot bio-hybrids, we require computational tools...

  16. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  17. Big Bounce and inhomogeneities

    International Nuclear Information System (INIS)

    Brizuela, David; Mena Marugan, Guillermo A; Pawlowski, Tomasz

    2010-01-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified. (fast track communication)

  18. A Study of Ethnic Minority College Students: A Relationship among the Big Five Personality Traits, Cultural Intelligence, and Psychological Well-Being

    Science.gov (United States)

    Smith, Teresa Ann

    2012-01-01

    Institutions of Higher Education are challenged to educate an increasing, diverse ethnic minority population. This study examines (1) if the theory of the Big Five personality traits as a predictor of the cultural intelligence theoretical model remains constant with ethnic minority college students attending a southeastern United States…

  19. The INTERVAL trial to determine whether intervals between blood donations can be safely and acceptably decreased to optimise blood supply: study protocol for a randomised controlled trial.

    Science.gov (United States)

    Moore, Carmel; Sambrook, Jennifer; Walker, Matthew; Tolkien, Zoe; Kaptoge, Stephen; Allen, David; Mehenny, Susan; Mant, Jonathan; Di Angelantonio, Emanuele; Thompson, Simon G; Ouwehand, Willem; Roberts, David J; Danesh, John

    2014-09-17

    Ageing populations may demand more blood transfusions, but the blood supply could be limited by difficulties in attracting and retaining a decreasing pool of younger donors. One approach to increase blood supply is to collect blood more frequently from existing donors. If more donations could be safely collected in this manner at marginal cost, then it would be of considerable benefit to blood services. National Health Service (NHS) Blood and Transplant in England currently allows men to donate up to every 12 weeks and women to donate up to every 16 weeks. In contrast, some other European countries allow donations as frequently as every 8 weeks for men and every 10 weeks for women. The primary aim of the INTERVAL trial is to determine whether donation intervals can be safely and acceptably decreased to optimise blood supply whilst maintaining the health of donors. INTERVAL is a randomised trial of whole blood donors enrolled from all 25 static centres of NHS Blood and Transplant. Recruitment of about 50,000 male and female donors started in June 2012 and was completed in June 2014. Men have been randomly assigned to standard 12-week versus 10-week versus 8-week inter-donation intervals, while women have been assigned to standard 16-week versus 14-week versus 12-week inter-donation intervals. Sex-specific comparisons will be made by intention-to-treat analysis of outcomes assessed after two years of intervention. The primary outcome is the number of blood donations made. A key secondary outcome is donor quality of life, assessed using the Short Form Health Survey. Additional secondary endpoints include the number of 'deferrals' due to low haemoglobin (and other factors), iron status, cognitive function, physical activity, and donor attitudes. A comprehensive health economic analysis will be undertaken. The INTERVAL trial should yield novel information about the effect of inter-donation intervals on blood supply, acceptability, and donors' physical and mental well-being.

  20. Is By-passing the Stomach a Means to Optimise Sodium Bicarbonate Supplementation? A Case-study With a Post-Bariatric Surgery Individual.

    Science.gov (United States)

    de Oliveira, Luana Farias; Saunders, Bryan; Artioli, Guilherme Giannini

    2018-05-03

    Sodium bicarbonate (SB) is an ergogenic supplement shown to improve high-intensity exercise via increased blood bicarbonate buffering. Substantial amounts of the ingested bicarbonate are neutralised in the stomach. Bariatric surgery results in a small gastric pouch which dramatically reduces exposure time of any ingested food in the stomach. The aim of this study was to examine the pharmacokinetics of orally ingested SB in a post-gastric bypass individual to determine the magnitude of changes in blood bicarbonate and associated side-effects. We hypothesized that SB supplementation in a gastric bypass model would result in greater blood bicarbonate increases and less side-effects than in healthy individuals due to minimal bicarbonate losses in the stomach. One post-bariatric male ingested 0.3 g·kg⁻¹ BM of SB on three occasions (SB1, SB2, SB3) and 0.3 g·kg⁻¹ BM of placebo (PL) on a further occasion. Blood bicarbonate was determined before and every 10 min following supplement ingestion for 3 h and then every 20 min for a further 1 h. Side-effects were reported using an adapted questionnaire at identical time points. Maximal increases in blood bicarbonate with SB were +20.0, +15.2 and +12.6 mM, resulting in maximal bicarbonate concentrations of 42.8, 39.3 and 36.2 mM. Area under the curve was SB1: 8328, SB2: 7747, SB3: 7627 mM·min⁻¹ and 6436 mM·min⁻¹ for PL. Side-effects with SB were scarce. Maximal bicarbonate increases were well above those shown previously, with minimal side-effects, indicative of minimal neutralisation of bicarbonate in the stomach. The large increases in circulating bicarbonate and minimal side-effects experienced by our post-gastric surgery patient are indicative that minimising neutralisation of bicarbonate in the stomach, as would occur with enteric coated capsules, may optimise SB supplementation and thus warrants investigation.

  1. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  2. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  3. Study and optimisation of the high energy detector in Cd(Zn)Te of the Simbol-X space mission for X and gamma astronomy; Etude et optimisation du plan de detection de haute energie en Cd(Zn)Te pour la mission spatiale d'observation astronomie X et gamma SIMBOL-X

    Energy Technology Data Exchange (ETDEWEB)

    Meuris, A.

    2009-09-15

    Stars in the final phases of their evolution are sites of the most energetic phenomena in the Universe. The understanding of their mechanisms is based on the observation of the X and gamma rays from these sources. The French-Italian Simbol-X project is a novel concept of telescope with two satellites flying in formation. This space mission combines upgraded optics from X-ray telescopes with detection systems from gamma-ray telescopes. CEA Saclay, involved in major space missions for gamma astronomy, is in charge of the definition and the design of the High Energy Detector (HED) of Simbol-X, which covers the spectral range from 8 to 80 keV. Two generations of micro-cameras called Caliste have been designed, fabricated and tested. They integrate cadmium telluride (CdTe) crystals and optimised front-end electronics named Idef-X. The hybridization technique enables them to be placed side by side as a mosaic, achieving for the first time a CdTe detection plane with fine spatial resolution (600 µm) and arbitrarily large surface. By setting up test benches and leading test campaigns, I was involved in the fabrication of Caliste prototypes and assessed their temporal, spatial and spectral resolution. Based on these experiments and simulations, I propose a detector type, operating conditions and on-board digital processing to optimise HED performance. The best detector candidate is CdTe Schottky, well suited to high-resolution spectroscopy; however, it suffers from a loss of stability during biasing. Beyond the Simbol-X mission, I studied this kind of detector theoretically and experimentally to build an updated model that can be applied to other gamma-ray spectroscopy and imaging projects. (author)

  4. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  5. Optimising Comprehensibility in Interlingual Translation

    DEFF Research Database (Denmark)

    Nisbeth Jensen, Matilde

    2015-01-01

    The increasing demand for citizen engagement in areas traditionally belonging exclusively to experts, such as health, law and technology has given rise to the necessity of making expert knowledge available to the general public through genres such as instruction manuals for consumer goods, patien...... the functional text type of Patient Information Leaflet. Finally, the usefulness of applying the principles of Plain Language and intralingual translation for optimising comprehensibility in interlingual translation is discussed....

  6. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  7. TEM turbulence optimisation in stellarators

    Science.gov (United States)

    Proll, J. H. E.; Mynick, H. E.; Xanthopoulos, P.; Lazerson, S. A.; Faber, B. J.

    2016-01-01

    With the advent of neoclassically optimised stellarators, optimising stellarators for turbulent transport is an important next step. The reduction of ion-temperature-gradient-driven turbulence has been achieved via shaping of the magnetic field, and the reduction of trapped-electron mode (TEM) turbulence is addressed in the present paper. Recent analytical and numerical findings suggest TEMs are stabilised when a large fraction of trapped particles experiences favourable bounce-averaged curvature. This is the case for example in Wendelstein 7-X (Beidler et al 1990 Fusion Technol. 17 148) and other Helias-type stellarators. Using this knowledge, a proxy function was designed to estimate the TEM dynamics, allowing optimal configurations for TEM stability to be determined with the STELLOPT (Spong et al 2001 Nucl. Fusion 41 711) code without extensive turbulence simulations. A first proof-of-principle optimised equilibrium stemming from the TEM-dominated stellarator experiment HSX (Anderson et al 1995 Fusion Technol. 27 273) is presented for which a reduction of the linear growth rates is achieved over a broad range of the operational parameter space. As an important consequence of this property, the turbulent heat flux levels are reduced compared with the initial configuration.

  8. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  9. An approach to next step device optimisation

    International Nuclear Information System (INIS)

    Salpietro, E.

    2000-01-01

    The requirements for the ITER EDA were to achieve ignition with a good safety margin and a controlled long inductive burn. These requirements lead to a big device, which required too ambitious a step for the world fusion community to undertake. More realistic objectives for a next step device shall be to demonstrate the net production of energy with a high energy gain factor (Q) and a high bootstrap current fraction (>60%), which is required for a Fusion Power Plant (FPP). The Next Step Device (NSD) shall also allow operational flexibility in order to explore a large range of plasma parameters and find the optimum concept for the fusion power plant prototype. These requirements could be too demanding for one single device and could probably be better explored in a strongly integrated world programme. The cost of one or more devices is the decisive factor for the choice of the fusion power development programme strategy. The plasma elongation and triangularity have a strong impact on the cost of the device and are limited by the plasma vertical position control issue. The distance between the plasma separatrix and the toroidal field conductor does not vary much between devices. It is determined by the sum of the distance between the first wall and the plasma separatrix and the thickness of the nuclear shield required to protect the toroidal field coil insulation. The thickness of the TF coil is determined by the allowable stresses and superconducting characteristics. The outer radius of the central solenoid is the result of an optimisation to provide the magnetic flux to inductively drive the plasma. Therefore, in order to achieve the objectives for Q and bootstrap current fraction at minimum cost, the plasma aspect ratio and magnetic field value shall be determined. The paper presents the critical issues for the next device and considers the optimal way to proceed towards the realisation of the fusion power plant.

  10. Robustness analysis of bogie suspension components Pareto optimised values

    Science.gov (United States)

    Mousavi Bideleh, Seyed Milad

    2017-08-01

    The bogie suspension system of high-speed trains can significantly affect vehicle performance. Multiobjective optimisation problems are often formulated and solved to find the Pareto optimised values of the suspension components and improve cost efficiency in railway operations from different perspectives. Uncertainties in the design parameters of the suspension system can negatively influence the dynamic behaviour of railway vehicles. In this regard, robustness analysis of the bogie dynamic response with respect to uncertainties in the suspension design parameters is considered. A one-car railway vehicle model with 50 degrees of freedom and wear/comfort Pareto optimised values of bogie suspension components is chosen for the analysis. Longitudinal and lateral primary stiffnesses, longitudinal and vertical secondary stiffnesses, as well as yaw damping are considered as five design parameters. The effects of parameter uncertainties on wear, ride comfort, track shift force, stability, and risk of derailment are studied by varying the design parameters around their respective Pareto optimised values according to a lognormal distribution with different coefficients of variation (COVs). The robustness analysis is carried out based on the maximum entropy concept. The multiplicative dimensional reduction method is utilised to simplify the calculation of fractional moments and improve the computational efficiency. The results showed that the dynamic response of the vehicle with wear/comfort Pareto optimised values of bogie suspension is robust against uncertainties in the design parameters and the probability of failure is small for parameter uncertainties with COV up to 0.1.
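
    As a simplified illustration of this kind of robustness analysis, the sketch below perturbs five nominal suspension parameters with lognormal noise at a given coefficient of variation and estimates the mean response and an exceedance probability by plain Monte Carlo. The paper instead uses the multiplicative dimensional reduction method with the maximum entropy principle; the nominal values, response function and failure threshold here are invented stand-ins.

      import numpy as np

      rng = np.random.default_rng(0)

      # Nominal (Pareto-optimised) values of five suspension parameters -- placeholders, not the paper's numbers.
      nominal = {"kx_primary": 1.0e7, "ky_primary": 5.0e6,
                 "kx_secondary": 2.0e5, "kz_secondary": 4.0e5, "c_yaw": 1.0e5}

      def lognormal(mean, cov, size, rng):
          """Draw lognormal samples with the requested mean and coefficient of variation."""
          sigma2 = np.log(1.0 + cov**2)
          return rng.lognormal(np.log(mean) - 0.5 * sigma2, np.sqrt(sigma2), size)

      def response(params):
          """Stand-in for a vehicle-dynamics response index (e.g. wear or comfort); works on scalars or arrays."""
          return sum(np.log(v) for v in params.values())

      cov, n = 0.1, 100_000                      # COV at the upper end of the range studied
      samples = {k: lognormal(v, cov, n, rng) for k, v in nominal.items()}
      out = response(samples)                    # response for every sampled parameter set

      threshold = 1.02 * response(nominal)       # arbitrary failure threshold, for illustration only
      print(f"mean response {out.mean():.2f}, probability of exceeding threshold {np.mean(out > threshold):.4f}")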

  11. Study protocol for the optimisation, feasibility testing and pilot cluster randomised trial of Positive Choices: a school-based social marketing intervention to promote sexual health, prevent unintended teenage pregnancies and address health inequalities in England.

    Science.gov (United States)

    Ponsford, Ruth; Allen, Elizabeth; Campbell, Rona; Elbourne, Diana; Hadley, Alison; Lohan, Maria; Melendez-Torres, G J; Mercer, Catherine H; Morris, Steve; Young, Honor; Bonell, Chris

    2018-01-01

    Since the introduction of the Teenage Pregnancy Strategy (TPS), England's under-18 conception rate has fallen by 55%, but a continued focus on prevention is needed to maintain and accelerate progress. The teenage birth rate remains higher in the UK than in comparable Western European countries. Previous trials indicate that school-based social marketing interventions are a promising approach to addressing teenage pregnancy and improving sexual health. Such interventions are yet to be trialled in the UK. This study aims to optimise and establish the feasibility and acceptability of one such intervention: Positive Choices. Design: optimisation, feasibility testing and pilot cluster randomised trial. Interventions: the Positive Choices intervention comprises a student needs survey, a student/staff-led School Health Promotion Council (SHPC), a classroom curriculum for year nine students covering social and emotional skills and sex education, student-led social marketing activities, parent information and a review of school sexual health services. Systematic optimisation of Positive Choices will be carried out with the National Children's Bureau Sex Education Forum (NCB SEF), one state secondary school in England and other youth and policy stakeholders. Feasibility testing will involve the same state secondary school and will assess progression criteria to advance to the pilot cluster RCT. The pilot cluster RCT with integral process evaluation will involve six different state secondary schools (four interventions and two controls) and will assess the feasibility and utility of progressing to a full effectiveness trial. The following outcome measures will be trialled as part of the pilot: self-reported pregnancy and unintended pregnancy (initiation of pregnancy for boys) and sexually transmitted infections; age of sexual debut, number of sexual partners, use of contraception at first and last sex, and non-volitional sex; and educational attainment. The feasibility of linking administrative

  12. Web Site Optimisation

    OpenAIRE

    Petrželka, Jiří

    2007-01-01

    This BSc Project was performed during a study stay at the Coventry University, UK. The goal of this project is to enhance the accessibility and usability of an existing company presentation located at http://www.hcc.cz, boost the site's traffic and so increase the company's revenues. The project follows these steps to accomplish this: a ) A partial refactoring of the back-end (PHP scripts). b ) Transformation of the website contents according to the recommendations of the World Wide Web conso...

  13. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, which relates to storage, analytics and visualization of big data; the human aspects of big data; and, in addition, the process management dimension, which addresses the aspects of big data management from a combined technological and business perspective.

  14. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

    Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.

  15. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  16. Optimising Lecture Time

    DEFF Research Database (Denmark)

    Holst-Christensen, Bo

    interest in getting a degree, they prefer the educators to do the work for them. The focus of my experiments have therefore been to develop teaching techniques that ensures that the students study efficiently and at the same time moves the task of identifying which parts of the subjects that are giving...... the students problems from the educator to the students. By using techniques that put more weight on student participation, cooperation and preparation, I have been able to cut significantly down on the time used for lecturing, allowing more time for student work and reflection. As an example by getting...... the students to identify the parts of the subjects that need further explanation, I get the students to take ownership of the learning task and at the same time give me a more direct feedback. By creating teaching materials and exercises that can be used in a number of different ways, it is possible to involve...

  17. Experimental study of the combined utilization of nuclear power heating plants for big towns and industrial complexes

    International Nuclear Information System (INIS)

    Neumann, J.; Barabas, K.

    1977-01-01

    The paper describes a comparison of nuclear power heating plants with an output corresponding to 1000 MW(e) with plants of the same output using coal or oil. The economic aspects are compared, with regard to both investment and operating costs. The environmental aspects are compared in terms of atmospheric pollution from exhausts and gaseous emissions and of thermal pollution of the hydrosphere and atmosphere. Basic nuclear power plant schemes with two PWRs, each of 1500 MW(th), are described. The plant supplies electric power and heat for factories and municipal heating systems (apartments, shops, and other auxiliary municipal facilities). At the same time, the basic heat-flow diagram of a nuclear power heating plant is given, together with the relative losses. The study emphasizes the possible utilization of waste heat for heating glasshouses of 200 m². The problems of utilizing waste heat, and the needs of a big town and of industrial complexes in the vicinity of the nuclear power heating plant, are also considered. (author)

  18. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  19. Optimisation of staff protection

    International Nuclear Information System (INIS)

    Faulkner, K.; Marshall, N.W.; Rawlings, D.J.

    1997-01-01

    It is important to minimize the radiation dose received by staff, but it is particularly important in interventional radiology. Staff doses may be reduced by minimizing the fluoroscopic screening time and number of images, compatible with the clinical objective of the procedure. Staff may also move to different positions in the room in an attempt to reduce doses. Finally, staff should wear appropriate protective clothing to reduce their occupational doses. This paper will concentrate on the optimization of personal shielding in interventional radiology. The effect of changing the lead equivalence of various protective devices on the effective dose to staff has been studied by modeling the exposure of staff to realistic scattered radiation. Both overcouch x-ray tube/undercouch image intensifier and overcouch image intensifier/undercouch x-ray tube geometries were simulated. It was deduced from this simulation that increasing the lead apron thickness from 0.35 mm lead to 0.5 mm lead had only a small reducing effect. By contrast, wearing a lead rubber thyroid shield or face mask is a superior means of reducing the effective dose to staff. Standing back from the couch when the x-ray tube is emitting radiation is another good method of reducing doses, being better than exchanging a 0.35 mm lead apron for a 0.5 mm apron. In summary, it is always preferable to shield more organs than to increase the thickness of the lead apron. (author)

  20. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.

  1. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.

  2. Particle swarm optimisation classical and quantum perspectives

    CERN Document Server

    Sun, Jun; Wu, Xiao-Jun

    2016-01-01

    Introduction. Optimisation Problems and Optimisation Methods. Random Search Techniques. Metaheuristic Methods. Swarm Intelligence. Particle Swarm Optimisation. Overview. Motivations. PSO Algorithm: Basic Concepts and the Procedure. Paradigm: How to Use PSO to Solve Optimisation Problems. Some Harder Examples. Some Variants of Particle Swarm Optimisation. Why Does the PSO Algorithm Need to Be Improved? Inertia and Constriction-Acceleration Techniques for PSO. Local Best Model. Probabilistic Algorithms. Other Variants of PSO. Quantum-Behaved Particle Swarm Optimisation. Overview. Motivation: From Classical Dynamics to Quantum Mechanics. Quantum Model: Fundamentals of QPSO. QPSO Algorithm. Some Essential Applications. Some Variants of QPSO. Summary. Advanced Topics. Behaviour Analysis of Individual Particles. Convergence Analysis of the Algorithm. Time Complexity and Rate of Convergence. Parameter Selection and Performance. Summary. Industrial Applications. Inverse Problems for Partial Differential Equations. Inverse Problems for Non-Linear Dynamical Systems. Optimal De...
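
    For orientation, a minimal global-best particle swarm optimiser of the kind introduced in the book's opening chapters, with an inertia weight and two acceleration coefficients, can be sketched as follows; the toy sphere objective and parameter values are illustrative only, not taken from the book.

      import numpy as np

      def pso(objective, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
              w=0.7, c1=1.5, c2=1.5, seed=0):
          """Minimise `objective` with a basic global-best particle swarm (inertia w, acceleration c1, c2)."""
          rng = np.random.default_rng(seed)
          lo, hi = bounds
          x = rng.uniform(lo, hi, (n_particles, dim))              # positions
          v = np.zeros((n_particles, dim))                         # velocities
          pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
          gbest = pbest[np.argmin(pbest_f)].copy()

          for _ in range(iters):
              r1, r2 = rng.random((2, n_particles, dim))
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = np.clip(x + v, lo, hi)
              f = np.apply_along_axis(objective, 1, x)
              improved = f < pbest_f
              pbest[improved], pbest_f[improved] = x[improved], f[improved]
              gbest = pbest[np.argmin(pbest_f)].copy()
          return gbest, float(pbest_f.min())

      best_x, best_f = pso(lambda z: float(np.sum(z**2)), dim=5)   # toy sphere objective
      print(best_x, best_f)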

  3. Warpage optimisation on the moulded part with straight-drilled and conformal cooling channels using response surface methodology (RSM) and glowworm swarm optimisation (GSO)

    Science.gov (United States)

    Hazwan, M. H. M.; Shayfull, Z.; Sharif, S.; Nasir, S. M.; Zainal, N.

    2017-09-01

    In the injection moulding process, quality and productivity are notably important and must be controlled for each product type produced. Quality is measured as the extent of warpage of the moulded parts, while productivity is measured as the duration of the moulding cycle time. To control quality, many researchers have introduced various optimisation approaches which have been proven to enhance the quality of the moulded part produced. To improve the productivity of the injection moulding process, some researchers have proposed the application of conformal cooling channels, which have been proven to reduce the duration of the moulding cycle time. Therefore, this paper presents an application of an alternative optimisation approach, Response Surface Methodology (RSM) with Glowworm Swarm Optimisation (GSO), on a moulded part with straight-drilled and conformal cooling channel moulds. This study examined the warpage condition of the moulded parts before and after the optimisation work was applied for both cooling channel designs. A front panel housing was selected as the specimen, and the performance of the proposed optimisation approach was analysed on conventional straight-drilled cooling channels compared to Milled Groove Square Shape (MGSS) conformal cooling channels by simulation analysis using Autodesk Moldflow Insight (AMI) 2013. Based on the results, melt temperature is the most significant factor contributing to warpage, and warpage was reduced by 39.1% after optimisation for the straight-drilled cooling channels; cooling time is the most significant factor contributing to warpage, and warpage was reduced by 38.7% after optimisation for the MGSS conformal cooling channels. In addition, the findings show that applying the optimisation work to the conformal cooling channels offers better quality and productivity of the moulded part produced.
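
    As a rough illustration of the response-surface half of this approach (the glowworm swarm step is omitted), the sketch below fits a full quadratic response surface to a hypothetical two-factor design-of-experiments table, with melt temperature and cooling time as the factors the abstract singles out, and then grid-searches the fitted surface for the settings with minimum predicted warpage. All numbers are placeholders, not data from the study.

      import numpy as np

      # Hypothetical DOE results: factors are melt temperature (deg C) and cooling time (s);
      # the response is measured warpage (mm). All values are placeholders.
      T    = np.array([220, 220, 240, 240, 230, 230, 210, 250, 230], dtype=float)
      tc   = np.array([ 10,  20,  10,  20,  15,  15,  15,  15,  25], dtype=float)
      warp = np.array([0.82, 0.74, 0.91, 0.79, 0.70, 0.71, 0.78, 0.95, 0.73])

      def design(T, tc):
          """Full quadratic response-surface design matrix."""
          return np.column_stack([np.ones_like(T), T, tc, T * tc, T**2, tc**2])

      beta, *_ = np.linalg.lstsq(design(T, tc), warp, rcond=None)     # least-squares RSM fit

      # Coarse grid search over the fitted surface for the settings with minimum predicted warpage.
      Tg, tg = np.meshgrid(np.linspace(210, 250, 81), np.linspace(10, 25, 61))
      pred = design(Tg.ravel(), tg.ravel()) @ beta
      i = np.argmin(pred)
      print(f"predicted minimum warpage {pred[i]:.3f} mm at T = {Tg.ravel()[i]:.1f} deg C, t_cool = {tg.ravel()[i]:.1f} s")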

  4. Cultural-based particle swarm for dynamic optimisation problems

    Science.gov (United States)

    Daneshyari, Moayed; Yen, Gary G.

    2012-07-01

    Many practical optimisation problems involve uncertainties, and a significant number of these belong to the dynamic optimisation problem (DOP) category, in which the fitness function changes through time. In this study, we propose cultural-based particle swarm optimisation (PSO) to solve DOPs. A cultural framework is adopted, incorporating the required information from the PSO into five sections of the belief space, namely situational, temporal, domain, normative and spatial knowledge. The stored information is used to detect changes in the environment and assists the response to change through diversity-based repulsion among particles and migration among swarms in the population space; it also helps in selecting the leading particles at three different levels: personal, swarm and global. Comparison of the proposed heuristic over several difficult dynamic benchmark problems demonstrates better or equal performance with respect to most other selected state-of-the-art dynamic PSO heuristics.

  5. An Optimisation Approach for Room Acoustics Design

    DEFF Research Database (Denmark)

    Holm-Jørgensen, Kristian; Kirkegaard, Poul Henning; Andersen, Lars

    2005-01-01

    This paper discusses, on a conceptual level, the value of optimisation techniques in architectural room acoustics design from a practical point of view. One objective room acoustics design criterion, estimated from the sound field inside the room, is chosen for optimisation. The sound field is modelled using the boundary element method, with absorption incorporated. An example is given in which the geometry of a room is defined by four design modes. The room geometry is optimised to obtain a uniform sound pressure.

  6. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  7. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  8. Optimisation of technical specifications using probabilistic methods

    International Nuclear Information System (INIS)

    Ericsson, G.; Knochenhauer, M.; Hultqvist, G.

    1986-01-01

    During the last few years, the development of methods for modifying and optimising nuclear power plant Technical Specifications (TS) for plant operations has received increased attention. Probabilistic methods in general, and the plant and system models of probabilistic safety assessment (PSA) in particular, seem to provide the most powerful tools for optimisation. This paper first gives some general comments on optimisation, identifying important parameters, and then describes recent Swedish experience with the use of nuclear power plant PSA models and results for TS optimisation.

  9. Layout Optimisation of Wave Energy Converter Arrays

    DEFF Research Database (Denmark)

    Ruiz, Pau Mercadé; Nava, Vincenzo; Topper, Mathew B. R.

    2017-01-01

    This paper proposes an optimisation strategy for the layout design of wave energy converter (WEC) arrays. Optimal layouts are sought so as to maximise the absorbed power given a minimum q-factor, the minimum distance between WECs, and an area of deployment. To guarantee an efficient optimisation, a four-parameter layout description is proposed. Three different optimisation algorithms are further compared in terms of performance and computational cost. These are the covariance matrix adaptation evolution strategy (CMA), a genetic algorithm (GA) and the glowworm swarm optimisation (GSO) algorithm...

  10. Optimisation combinatoire Theorie et algorithmes

    CERN Document Server

    Korte, Bernhard; Fonlupt, Jean

    2010-01-01

    This book is the French translation of the fourth and final edition of Combinatorial Optimization: Theory and Algorithms, written by two eminent specialists in the field: Bernhard Korte and Jens Vygen of the University of Bonn in Germany. It emphasises the theoretical aspects of combinatorial optimisation as well as efficient and exact algorithms for solving problems, and in this respect it differs from the simpler heuristic approaches often described elsewhere. The book contains numerous concise and elegant proofs of difficult results. It is intended for students...

  11. HVAC system optimisation-in-building section

    Energy Technology Data Exchange (ETDEWEB)

    Lu, L.; Cai, W.; Xie, L.; Li, S.; Soh, Y.C. [School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore (Singapore)

    2004-07-01

    This paper presents a practical method to optimise the in-building section of centralised Heating, Ventilation and Air-Conditioning (HVAC) systems, which consists of indoor air loops and chilled water loops. First, through component characteristic analysis, mathematical models associated with cooling loads and energy consumption for heat exchangers and energy-consuming devices are established. Considering the variation of the cooling load of each end user, an adaptive neuro-fuzzy inference system (ANFIS) is employed to model the duct and pipe networks and obtain optimal differential pressure (DP) set points based on limited sensor information. A mixed-integer nonlinear constrained optimisation of system energy is formulated and solved by a modified genetic algorithm. The main feature of the paper is a systematic approach to optimising the overall system energy consumption rather than that of individual components. A simulation study of a typical centralised HVAC system is provided to compare the proposed optimisation method with traditional ones. The results show that the proposed method improves the system performance significantly. (author)
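
    To make the formulation concrete, here is a heavily simplified sketch of the kind of mixed-integer, constrained energy minimisation the abstract describes, solved with a basic penalty genetic algorithm. The component models (pump, fan and flow relations), the load figure and the GA settings are all invented placeholders; the paper itself uses ANFIS network models and a modified genetic algorithm.

      import numpy as np

      rng = np.random.default_rng(1)

      def energy(n_pumps, dp):
          """Toy in-building energy model: pump power grows with head, fan power with duct DP (placeholder physics)."""
          pump_power = n_pumps * 2.0 * (dp / 150.0) ** 1.5          # kW
          fan_power = 8.0 + 0.05 * dp                               # kW
          return pump_power + fan_power

      def feasible(n_pumps, dp, load_kw):
          """Constraint: delivered flow (proportional to pumps and sqrt(DP)) must cover the cooling load."""
          flow = n_pumps * 6.0 * np.sqrt(dp / 150.0)                # L/s, toy relation
          return flow * 4.2 * 5.0 >= load_kw                        # m_dot * cp * dT >= load

      def fitness(ind, load_kw):
          n_pumps, dp = int(ind[0]), ind[1]
          penalty = 0.0 if feasible(n_pumps, dp, load_kw) else 1e3  # penalise constraint violation
          return energy(n_pumps, dp) + penalty

      def ga(load_kw, pop_size=40, gens=60):
          # Genes: [number of pumps in {1..4}, DP set point in kPa].
          pop = np.column_stack([rng.integers(1, 5, pop_size), rng.uniform(50, 250, pop_size)])
          for _ in range(gens):
              f = np.array([fitness(ind, load_kw) for ind in pop])
              parents = pop[np.argsort(f)][: pop_size // 2]          # truncation selection
              children = parents[rng.integers(0, len(parents), pop_size - len(parents))].copy()
              children[:, 1] += rng.normal(0, 10, len(children))     # mutate DP set point
              flip = rng.random(len(children)) < 0.2                 # occasionally mutate pump count
              children[flip, 0] = rng.integers(1, 5, flip.sum())
              children[:, 1] = np.clip(children[:, 1], 50, 250)
              pop = np.vstack([parents, children])
          f = np.array([fitness(ind, load_kw) for ind in pop])
          best = pop[np.argmin(f)]
          return int(best[0]), float(best[1]), float(f.min())

      print(ga(load_kw=300.0))   # (pumps on, DP set point, estimated energy in kW)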

  12. Optimisation of milling parameters using neural network

    Directory of Open Access Journals (Sweden)

    Lipski Jerzy

    2017-01-01

    Full Text Available The purpose of this study was to design and test intelligent computer software developed to increase the average productivity of milling without compromising the design features of the final product. The developed system generates optimal milling parameters based on the extent of tool wear. The optimisation algorithm employs a multilayer model of the milling process developed as an artificial neural network. The input parameters for model training are the cutting speed vc, the feed per tooth fz and the degree of tool wear measured by means of localised flank wear (VB3). The output parameter is the surface roughness Ra of the machined surface. Since the neural network model provides a good approximation of the functional relationships, it was applied to determine optimal milling parameters under changing tool wear conditions (VB3) and to stabilise the surface roughness parameter Ra. Our solution enables constant control over surface roughness and the productivity of the milling process after each assessment of tool condition. The recommended parameters, i.e. those which, when applied in milling, ensure the desired surface roughness and maximal productivity, are selected from all the parameters generated by the model. The developed software may constitute an expert system supporting a milling machine operator. In addition, the application may be installed on a mobile device (smartphone) connected to a tool wear diagnostics instrument and the machine tool controller in order to supply updated optimal milling parameters. The presented solution facilitates tool life optimisation and decreases tool change costs, particularly during prolonged operation.
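
    The following sketch mirrors the described workflow with scikit-learn (an assumed substitute for the paper's own network implementation): a small multilayer perceptron is trained on synthetic (vc, fz, VB3) to Ra data, and a grid search then picks the (vc, fz) pair that maximises a simple productivity proxy subject to a predicted roughness limit at the current tool wear. All data, limits and the productivity proxy are hypothetical.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(2)

      # Synthetic training data: cutting speed vc (m/min), feed per tooth fz (mm), flank wear VB3 (mm) -> roughness Ra (um).
      n = 400
      vc = rng.uniform(100, 300, n)
      fz = rng.uniform(0.05, 0.25, n)
      vb = rng.uniform(0.0, 0.3, n)
      ra = 0.4 + 8.0 * fz**2 + 2.0 * vb - 0.001 * vc + rng.normal(0, 0.02, n)   # invented relation

      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0))
      model.fit(np.column_stack([vc, fz, vb]), ra)

      def best_parameters(vb_now, ra_limit=0.8):
          """Pick (vc, fz) maximising a productivity proxy (vc * fz) subject to predicted Ra <= ra_limit."""
          vc_g, fz_g = np.meshgrid(np.linspace(100, 300, 41), np.linspace(0.05, 0.25, 41))
          cand = np.column_stack([vc_g.ravel(), fz_g.ravel(), np.full(vc_g.size, vb_now)])
          ra_pred = model.predict(cand)
          ok = ra_pred <= ra_limit
          if not ok.any():
              return None                                  # no feasible setting at this wear level
          prod = cand[:, 0] * cand[:, 1]                   # crude productivity proxy
          i = np.argmax(np.where(ok, prod, -np.inf))
          return cand[i, 0], cand[i, 1], ra_pred[i]

      print(best_parameters(vb_now=0.15))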

  13. Noise aspects at aerodynamic blade optimisation projects

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G. [Netherlands Energy Research Foundation, Petten (Netherlands)

    1997-12-31

    This paper shows an example of an aerodynamic blade optimisation using the program PVOPT. PVOPT calculates the optimal wind turbine blade geometry such that the maximum energy yield is obtained. Using the aerodynamically optimal blade design as a basis, the possibilities for noise reduction are investigated. The aerodynamically optimised geometry from PVOPT is the 'real' optimum (to the last decimal). The most important conclusion from this study is that it is worthwhile to investigate the behaviour of the objective function (in the present case the energy yield) around the optimum: if the optimum is flat, there is a possibility to apply modifications to the optimum configuration with only a limited loss in energy yield. It is obvious that the modified configurations emit a different (and possibly lower) noise level. In the BLADOPT program (the successor of PVOPT) it will be possible to quantify the noise level and hence to assess the reduced noise emission more thoroughly. At present the most promising approaches for noise reduction are believed to be a reduction of the rotor speed (if at all possible), and a reduction of the tip angle by means of low-lift profiles or decreased twist at the outboard stations. These modifications were possible without a significant loss in energy yield. (LN)

  14. Thinking big

    Science.gov (United States)

    Collins, Harry

    2008-02-01

    Physicists are often quick to discount social research based on qualitative techniques such as ethnography and "deep case studies" - where a researcher draws conclusions about a community based on immersion in the field - thinking that only quantitative research backed up by statistical analysis is sound. The balance is not so clear, however.

  15. Preintervention lesion remodelling affects operative mechanisms of balloon optimised directional coronary atherectomy procedures: a volumetric study with three dimensional intravascular ultrasound

    Science.gov (United States)

    von Birgelen, C; Mintz, G; de Vrey, E A; Serruys, P; Kimura, T; Nobuyoshi, M; Popma, J; Leon, M; Erbel, R; de Feyter, P J

    2000-01-01

    AIMS—To classify atherosclerotic coronary lesions on the basis of adequate or inadequate compensatory vascular enlargement, and to examine changes in lumen, plaque, and vessel volumes during balloon optimised directional coronary atherectomy procedures in relation to the state of adaptive remodelling before the intervention.
DESIGN—29 lesion segments in 29 patients were examined with intravascular ultrasound before and after successful balloon optimised directional coronary atherectomy procedures, and a validated volumetric intravascular ultrasound analysis was performed off-line to assess the atherosclerotic lesion remodelling and changes in plaque and vessel volumes that occurred during the intervention. Based on the intravascular ultrasound data, lesions were classified according to whether there was inadequate (group I) or adequate (group II) compensatory enlargement.
RESULTS—There was no significant difference in patient and lesion characteristics between groups I and II (n = 10 and 19), including lesion length and details of the intervention. Quantitative coronary angiographic data were similar for both groups. However, plaque and vessel volumes were significantly smaller in group I than in group II. In group I, 9 (4)% (mean (SD)) of the plaque volume was ablated, while in group II 16 (11)% was ablated (p = 0.01). This difference was reflected in a lower lumen volume gain in group I than in group II (46 (18) mm³ v 80 (49) mm³). The extent of preintervention remodelling therefore influenced the operative mechanisms of balloon optimised directional coronary atherectomy procedures; plaque ablation was found to be particularly low in lesions with inadequate compensatory vascular enlargement.


Keywords: intravascular ultrasound; ultrasonics; remodelling; coronary artery disease; atherectomy PMID:10648496

  16. Numerical optimisation of friction stir welding: review of future challenges

    DEFF Research Database (Denmark)

    Tutum, Cem Celal; Hattel, Jesper Henri

    2011-01-01

    During the last decade, the combination of increasingly more advanced numerical simulation software with high computational power has resulted in models for friction stir welding (FSW), which have improved the understanding of the determining physical phenomena behind the process substantially....... This has made optimisation of certain process parameters possible and has in turn led to better performing friction stir welded products, thus contributing to a general increase in the popularity of the process and its applications. However, most of these optimisation studies do not go well beyond manual...

  17. MANAGEMENT OPTIMISATION OF MASS CUSTOMISATION MANUFACTURING USING COMPUTATIONAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    Louwrens Butler

    2018-05-01

    Full Text Available Computational intelligence paradigms can be used for advanced manufacturing system optimisation. A static simulation model of an advanced manufacturing system was developed, the purpose of which was to mass-produce a customisable product range at a competitive cost. The aim of this study was to determine whether the proposed algorithm could produce better performance than traditional optimisation methods. The algorithm produced a lower-cost plan than a simulated annealing algorithm, and had a lower impact on the workforce.

  18. The impact of population aging on medical expenses: A big data study based on the life table.

    Science.gov (United States)

    Wang, Changying; Li, Fen; Wang, Linan; Zhou, Wentao; Zhu, Bifan; Zhang, Xiaoxi; Ding, Lingling; He, Zhimin; Song, Peipei; Jin, Chunlin

    2018-01-09

    This study examined the amount and structure of medical service utilization and medical expenses among Shanghai permanent residents based on big data, simulated lifetime medical expenses by combining expenditure data with a life table model, and explored how aging shapes medical expenditures. Using five-year age classes, the study collected and descriptively analysed medical service utilization and expense information for Shanghai permanent residents of all ages in 2015, and simulated lifetime medical expenses using the current life table and cross-sectional expenditure data. The results showed that in 2015, outpatient and emergency visits per capita in the elderly group (aged 60 and over) were 4.1 and 4.5 times those of the childhood group (aged 1-14) and the youth and adult group (aged 15-59), respectively; hospitalizations per capita in the elderly group were 3.0 and 3.5 times those of the childhood group and the youth and adult group. For people surviving to the 60-64 years group, the expected medical expenses over the rest of their lives (105,447 purchasing power parity dollars) accounted for 75.6% of their lifetime medical expenses. A similar study in Michigan, US showed that the expenses of the population aged 65 and over accounted for about half of lifetime medical expenses, a much lower share than in Shanghai. The medical expenses of the advanced elderly group (aged 80 and over) accounted for 38.8% of their lifetime expenses, comprising 38.2% of outpatient and emergency expenses and 39.5% of hospitalization expenses, the latter slightly higher. There is room to economize on the medical expenditures of elderly people in Shanghai; controlling hospitalization expenses in particular is the key to reducing the medical expenses of people aged 80 and over.
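    A minimal sketch of the life-table arithmetic used in studies of this kind is given below. Every number in it (age bands, survivors l_x, person-years L_x and per-capita costs c_x), and the function name, are hypothetical placeholders rather than the Shanghai figures; the point is only how expected remaining lifetime expenses at a given age are obtained by weighting per-capita annual costs by person-years lived and dividing by survivors at that age.

```python
# Hypothetical illustration of the life-table method for expected remaining
# lifetime medical expenses; all numbers are made up for demonstration only.

# Abridged life table per 5-year band: survivors at the start of the band (l_x),
# person-years lived in the band (L_x), and per-capita annual cost (c_x).
life_table = [
    # age,  l_x,     L_x,     c_x (arbitrary currency units)
    (60, 90_000, 445_000, 2_000),
    (65, 87_000, 425_000, 2_600),
    (70, 82_000, 395_000, 3_400),
    (75, 74_000, 345_000, 4_500),
    (80, 62_000, 270_000, 6_000),
    (85, 45_000, 165_000, 7_500),
    (90, 25_000,  70_000, 8_500),
]

def expected_remaining_expenses(from_age: int) -> float:
    """Expected medical expenses over the rest of life for a person alive at from_age.

    E_x = sum over bands a >= x of (L_a * c_a) / l_x,
    i.e. total cost incurred by the cohort above age x divided by survivors at x.
    """
    l_x = next(l for age, l, L, c in life_table if age == from_age)
    total_cohort_cost = sum(L * c for age, l, L, c in life_table if age >= from_age)
    return total_cohort_cost / l_x

if __name__ == "__main__":
    for age in (60, 80):
        print(f"Expected remaining expenses at age {age}: {expected_remaining_expenses(age):,.0f}")
```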

  19. The 'Big Karl' magnetic spectrometer - studies of the 103Ru transition nucleus with (d,p) and (p,d) reactions

    International Nuclear Information System (INIS)

    Huerlimann, W.

    1981-04-01

    The paper describes the structure and characteristics of the spectrometer and its application in a study of the 102Ru(d,p)103Ru and 104Ru(p,d)103Ru reactions. The study is structured as follows: to begin with, the theoretical fundamentals, ion-optical characteristics and layout of BIG KARL are described. Field measurements and analyses carried out on the magnets of the spectrometer are described, as well as the functioning of the 'Ht correction coils' used here for the first time to prevent faulty imaging. Chapter IV then describes methods employed so far to optimize resolution for large aperture angles of the spectrometer. Finally, chapter V investigates the 103Ru transition nucleus on the basis of the 102Ru(d,p)103Ru and 104Ru(p,d)103Ru transfer reactions measured in BIG KARL. (orig./HSI) [de]

  20. "All Flying Insects with Big, Beautiful Wings are Butterflies!" A Study in Challenging This Misconception

    Science.gov (United States)

    Tsoi, Kwok-Ho

    2013-01-01

    This study investigated the level of understanding among student teachers in differentiating lepidopterans. It adopted a constructive approach to promoting conceptual change in students on the issue of animal classification by generating cognitive conflict. Most of the students used inaccurate morphological traits for identification, such as wing…

  1. Big Business as a Policy Innovator in State School Reform: A Minnesota Case Study.

    Science.gov (United States)

    Mazzoni, Tim L.; Clugston, Richard M., Jr.

    1987-01-01

    The Minnesota Business Partnership (MBP) was studied as a policy innovator in state school reform (for kindergarten through grade 12) in relation to agenda setting, alternative formulation, and authoritative enactment. Focus is on the MBP's policy-making involvement during the 1985 state legislative session. Overall, the MBP's influence was…

  2. Practice variation in Big-4 transparency reports

    NARCIS (Netherlands)

    Girdhar, Sakshi; Jeppesen, K.K.

    2018-01-01

    Purpose The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach The study draws on a

  3. Application of Three Existing Stope Boundary Optimisation Methods in an Operating Underground Mine

    Science.gov (United States)

    Erdogan, Gamze; Yavuz, Mahmut

    2017-12-01

    The underground mine planning and design optimisation process has received little attention because of the complexity and variability of problems in underground mines. Although a number of optimisation studies and software tools are available, and some of them in particular have been implemented effectively to determine the ultimate-pit limits of open pit mines, there is still a lack of studies on the optimisation of ultimate stope boundaries in underground mines. The approaches proposed for this purpose aim at maximizing economic profit by selecting the best possible layout under operational, technical and physical constraints. In this paper, three existing heuristic techniques, the Floating Stope Algorithm, the Maximum Value Algorithm and the Mineable Shape Optimiser (MSO), are examined for the optimisation of stope layout in a case study. Each technique is assessed in terms of applicability, algorithm capabilities and limitations, considering the underground mine planning challenges. Finally, the results are evaluated and compared.
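    The three algorithms themselves are not specified here, but the general idea behind floating-stope-style heuristics can be sketched as follows: slide a fixed-size stope envelope over a block economic model and keep every block that belongs to at least one envelope with positive total value. The grid, block values, stope dimensions and function name below are illustrative assumptions, not parameters from the case study.

```python
import numpy as np

# Toy 2D block economic model: positive values are ore blocks, negatives are waste/cost.
rng = np.random.default_rng(0)
block_value = rng.normal(loc=-1.0, scale=3.0, size=(20, 30))  # hypothetical values

STOPE_ROWS, STOPE_COLS = 3, 4  # assumed minimum mineable stope envelope, in blocks

def floating_stope_envelope(values: np.ndarray, rows: int, cols: int) -> np.ndarray:
    """Boolean mask of blocks covered by at least one economically positive envelope.

    This mimics the 'floating stope' idea: every possible envelope position is
    evaluated, and blocks inside any envelope whose total value is positive are kept.
    """
    n_r, n_c = values.shape
    keep = np.zeros_like(values, dtype=bool)
    for r in range(n_r - rows + 1):
        for c in range(n_c - cols + 1):
            window = values[r:r + rows, c:c + cols]
            if window.sum() > 0:          # envelope pays for itself
                keep[r:r + rows, c:c + cols] = True
    return keep

mask = floating_stope_envelope(block_value, STOPE_ROWS, STOPE_COLS)
print(f"Blocks selected: {mask.sum()} of {mask.size}, "
      f"undiscounted value inside envelopes: {block_value[mask].sum():.1f}")
```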

  4. Geological, geochemical, and geophysical studies by the U.S. Geological Survey in Big Bend National Park, Texas

    Science.gov (United States)

    Page, W.R.; Turner, K.J.; Bohannon, R.G.; Berry, M.E.; Williams, V.S.; Miggins, D.P.; Ren, M.; Anthony, E.Y.; Morgan, L.A.; Shanks, P.W.C.; Gray, J. E.; Theodorakos, P.M.; Krabbenhoft, D. P.; Manning, A.H.; Gemery-Hill, P. A.; Hellgren, E.C.; Stricker, C.A.; Onorato, D.P.; Finn, C.A.; Anderson, E.; Gray, J. E.; Page, W.R.

    2008-01-01

    Big Bend National Park (BBNP), Tex., covers 801,163 acres (3,242 km2) and was established in 1944 through a transfer of land from the State of Texas to the United States. The park is located along a 118-mile (190-km) stretch of the Rio Grande at the United States-Mexico border. The park is in the Chihuahuan Desert, an ecosystem with high mountain ranges and basin environments containing a wide variety of native plants and animals, including more than 1,200 species of plants, more than 450 species of birds, 56 species of reptiles, and 75 species of mammals. In addition, the geology of BBNP, which varies widely from high mountains to broad open lowland basins, also enhances the beauty of the park. For example, the park contains the Chisos Mountains, which are dominantly composed of thick outcrops of Tertiary extrusive and intrusive igneous rocks that reach an altitude of 7,832 ft (2,387 m) and are considered the southernmost mountain range in the United States. Geologic features in BBNP provide opportunities to study the formation of mineral deposits and their environmental effects; the origin and formation of sedimentary and igneous rocks; Paleozoic, Mesozoic, and Cenozoic fossils; and surface and ground water resources. Mineral deposits in and around BBNP contain commodities such as mercury (Hg), uranium (U), and fluorine (F), but of these, the only significant mining has been for Hg. Because of the biological and geological diversity of BBNP, more than 350,000 tourists visit the park each year. The U.S. Geological Survey (USGS) has been investigating a number of broad and diverse geologic, geochemical, and geophysical topics in BBNP to provide fundamental information needed by the National Park Service (NPS) to address resource management goals in this park. Scientists from the USGS Mineral Resources and National Cooperative Geologic Mapping Programs have been working cooperatively with the NPS and several universities on several research studies within BBNP

  5. The Person-Event Data Environment: leveraging big data for studies of psychological strengths in soldiers

    Science.gov (United States)

    2013-12-13

    Subsequent analyses have also conditioned growth in psychological assets on various deployment indices and demographic factors (e.g., gender, age) … for different subgroups (e.g., gender, age, education, and marital status). The Penn team is currently studying the impact of combat deployments on … information to bear on issues which have widespread implications for the DoD.

  6. Big hearts, small hands: a focus group study exploring parental food portion behaviours

    OpenAIRE

    Curtis, Kristina; Atkins, Louise; Brown, Katherine

    2017-01-01

    Abstract Background The development of healthy food portion sizes among families is deemed critical to childhood weight management; yet little is known about the interacting factors influencing parents’ portion control behaviours. This study aimed to use two synergistic theoretical models of behaviour: the COM-B model (Capability, Opportunity, Motivation – Behaviour) and Theoretical Domains Framework (TDF) to identify a broad spectrum of theoretically derived influences on parents’ portion co...

  7. Comparative study of ambient air quality status for big cities of Punjab (Pakistan)

    International Nuclear Information System (INIS)

    Shahid, M.A.K.; Mahmood, A.

    2010-01-01

    This study was undertaken to investigate the quality of air in Lahore and Faisalabad at selected sites. A total of eight sampling stations were selected, falling in different environmental backdrops such as residential, commercial, industrial and rural (control) areas. To assess air quality, suspended particulate matter (SPM), nitrogen dioxide (NO2) and sulphur dioxide (SO2) were selected. In the present study, it was found that the SPM, NO2 and SO2 levels at all the sampling locations are within the permissible limits. However, elevated levels were indicated at the residential-cum-industrial area (shopping complex along with banks), followed by the purely industrial area. The source of these pollutants is primarily the transport sector and secondly industries. The ambient air quality was reported to be low, except at 2Kl, which was reported as medium. A sociological survey was conducted to determine the health hazards and the diseases related to air pollution. The results were alarming and found to be compatible with those of the Punjab Public Health and Engineering Department (PPHE). It is therefore suggested that air quality management demands. (author)

  8. The General Factor of Personality: A meta-analysis of Big Five intercorrelations and a criterion-related validity study

    NARCIS (Netherlands)

    van der Linden, D.; te Nijenhuis, J.; Bakker, A.B.

    2010-01-01

    Recently, it has been proposed that a General Factor of Personality (GFP) occupies the top of the hierarchical personality structure. We present a meta-analysis (K = 212, total N = 144,117) on the intercorrelations among the Big Five personality factors (Openness, Conscientiousness, Extraversion,

  9. Rethinking climate change adaptation and place through a situated pathways framework: A case study from the Big Hole Valley, USA

    Science.gov (United States)

    Daniel J. Murphy; Laurie Yung; Carina Wyborn; Daniel R. Williams

    2017-01-01

    This paper critically examines the temporal and spatial dynamics of adaptation in climate change science and explores how dynamic notions of 'place' elucidate novel ways of understanding community vulnerability and adaptation. Using data gathered from a narrative scenario-building process carried out among communities of the Big Hole Valley in Montana, the...

  10. Beyond Big

    DEFF Research Database (Denmark)

    Smith, Shelley

    2003-01-01

    An empirical examination of airport space as a relevant case for the study of how enormous scale and flux challenge traditional spatial and perceptual understandings of architecture is undertaken through an alternative historical mapping which traces the airport through 3 metaphorical developmental phases; 'field' – 'port' – 'city', and via the 'Airport Hop' - a round-trip tour of 5+ international airports in Europe and North America. The physical large-scale of airports is addressed cartographically while the perceptual large-scale of airports is examined with film recordings, interviews … The summation of these preliminary chapters uncovers a situation in which the descriptive vocabulary used to characterise the spatial and perceptual aspects of contemporary space is based on both negation and excess. Terms such as 'underspatialisation', 'non-place', 'anti-form' and even the 'concept…

  11. The Person-Event Data Environment (PDE: Leveraging Big Data for Studies of Psychological Strengths in Soldiers

    Directory of Open Access Journals (Sweden)

    Loryana L. Vie

    2013-12-01

    Full Text Available The Department of Defense (DoD strives to efficiently manage the large volumes of administrative data collected and repurpose this information for research and analyses with policy implications. This need is especially present in the United States Army, which maintains numerous electronic databases with information on more than one million Active-Duty, Reserve, and National Guard soldiers, their family members, and Army civilian employees. The accumulation of vast amounts of digitized health, military service, and demographic data thus approaches, and may even exceed, traditional benchmarks for Big Data. Given the challenges of disseminating sensitive personal and health information, the Person-Event Data Environment (PDE was created to unify disparate Army and DoD databases in a secure cloud-based enclave. This electronic repository serves the ultimate goal of achieving cost efficiencies in psychological and healthcare studies and provides a platform for collaboration among diverse scientists. This paper provides an overview of the uses of the PDE to perform command surveillance and policy analysis for Army leadership. The paper highlights the confluence of both economic and behavioral science perspectives elucidating empirically-based studies examining relations between psychological assets, health, and healthcare utilization. Specific examples explore the role of psychological assets in major cost drivers such as medical expenditures both during deployment and stateside, drug use, attrition from basic training, and low reenlistment rates. Through creation of the PDE, the Army and scientific community can now capitalize on the vast amounts of personnel, financial, medical, training and education, deployment and security systems that influence Army-wide policies and procedures.

  12. #europehappinessmap: A Framework for Multi-Lingual Sentiment Analysis via Social Media Big Data (A Twitter Case Study

    Directory of Open Access Journals (Sweden)

    Mustafa Coşkun

    2018-04-01

    Full Text Available The growth and popularity of social media platforms have generated a new social interaction environment and thus a new collaboration and communication network among individuals. These platforms hold a tremendous amount of data about users' behaviors and sentiments, since people create, share or exchange information, ideas, pictures or video using them. One of these popular platforms is Twitter, which, through its voluntary information-sharing structure, provides researchers with data of potential benefit for their studies. Based on Twitter data, this study proposes a multilingual sentiment detection framework to compute European Gross National Happiness (GNH). The framework consists of a novel data collection, filtering and sampling method and a newly constructed multilingual sentiment detection algorithm for social media big data, and it is tested on nine European countries (United Kingdom, Germany, Sweden, Turkey, Portugal, The Netherlands, Italy, France and Spain) and their national languages over a six-year period. The reliability of the data is checked by comparing peaks and troughs for special days against Wikipedia news lists. The validity is checked with a group of correlation analyses against OECD Life Satisfaction survey reports, Euro-Dollar and other currency exchanges, and national stock market time series data. After the validity and reliability confirmations, the European GNH map is drawn for six years. The main contribution is a novel multilingual social media sentiment analysis framework for calculating GNH for countries, offering an alternative to the survey and interview methodology of OECD-type organizations. It is also believed that this framework can provide more detailed results (e.g., daily or hourly sentiments of society) in different languages.
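    A very rough sketch of the kind of aggregation such a framework performs is shown below: score each post with a per-language lexicon and average the scores per country and day. The tiny lexicons, the example posts and the function names are all invented for illustration; the actual framework uses full multilingual sentiment detection, not a four-word lexicon.

```python
from collections import defaultdict
from datetime import date

# Tiny made-up sentiment lexicons per language; a real framework would use full
# lexicons or trained classifiers for each national language.
LEXICON = {
    "en": {"happy": 1, "great": 1, "sad": -1, "awful": -1},
    "de": {"glücklich": 1, "super": 1, "traurig": -1, "schlimm": -1},
}

# Hypothetical (country, language, day, text) records standing in for collected tweets.
tweets = [
    ("UK", "en", date(2016, 6, 1), "what a great sunny day, so happy"),
    ("UK", "en", date(2016, 6, 1), "awful traffic this morning"),
    ("DE", "de", date(2016, 6, 1), "super Wetter, ich bin glücklich"),
]

def tweet_score(text: str, lang: str) -> float:
    """Average lexicon polarity of the words in one post (0 if no lexicon hit)."""
    hits = [LEXICON[lang][w] for w in text.lower().split() if w in LEXICON.get(lang, {})]
    return sum(hits) / len(hits) if hits else 0.0

def daily_index(records):
    """Mean post polarity per (country, day): a crude GNH-style index."""
    sums, counts = defaultdict(float), defaultdict(int)
    for country, lang, day, text in records:
        key = (country, day)
        sums[key] += tweet_score(text, lang)
        counts[key] += 1
    return {key: sums[key] / counts[key] for key in sums}

print(daily_index(tweets))
```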

  13. How big is a food portion? A pilot study in Australian families.

    Science.gov (United States)

    Collins, Clare E; Bucher, Tamara; Taylor, Aimee; Pezdirc, Kristine; Lucas, Hannah; Watson, Jane; Rollo, Megan; Duncanson, Kerith; Hutchesson, Melinda J; Burrows, Tracy

    2015-08-01

    It is not known whether individuals can accurately estimate the portion size of foods usually consumed relative to standard serving sizes in national food selection guides. The aim of the present cross-sectional pilot study was to quantify what adults and children deem a typical portion for a variety of foods and compare these with the serving sizes specified in the Australian Guide to Healthy Eating (AGHE). Adults and children were independently asked to serve out their typical portion of 10 common foods (rice, pasta, breakfast cereal, chocolate, confectionary, ice cream, meat, vegetables, soft drink and milk). They were also asked to serve what they perceived a small, medium and large portion of each food to be. Each portion was weighed and recorded by an assessor and compared with the standard AGHE serving sizes. Twenty-one individuals (nine mothers, one father, 11 children) participated in the study. There was a large degree of variability in portion sizes measured out by both parents and children, with means exceeding the standard AGHE serving size for all items, except for soft drink and milk, where mean portion sizes were less than the AGHE serving size. The greatest mean overestimations were for pasta (155%; mean 116 g; range 94-139 g) and chocolate (151%; mean 38 g; range 25-50 g), each of which represented approximately 1.5 standard AGHE servings. The findings of the present study indicate that there is variability between parents' and children's estimation of typical portion sizes compared with national recommendations. SO WHAT? Dietary interventions to improve individuals' dietary patterns should target education regarding portion size.

  14. Analysis of Big Data Maturity Stage in Hospitality Industry

    OpenAIRE

    Shabani, Neda; Munir, Arslan; Bose, Avishek

    2017-01-01

    Big data analytics has an extremely significant impact on many areas in all businesses and industries including hospitality. This study aims to guide information technology (IT) professionals in hospitality on their big data expedition. In particular, the purpose of this study is to identify the maturity stage of the big data in hospitality industry in an objective way so that hotels be able to understand their progress, and realize what it will take to get to the next stage of big data matur...

  15. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  16. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply inexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and the society as a whole. Big data platforms and product portfolio will help customers harness to the full the value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  17. Study on inventory control model based on the B2C mode in big data environment

    Directory of Open Access Journals (Sweden)

    Zhiping Zhang

    2017-03-01

    Full Text Available The inventory problem has become a key issue for enterprise survival and development. In this paper, we take "Taobao" as an example to conduct a detailed study of inventory management for high conversion rates based on data mining. First, by using a funnel model to predict the conversion of commodities along the critical path, we capture the factors influencing consumer decision-making at each key point and propose corresponding solutions for improving the conversion rate. Second, we use a BP neural network algorithm to predict goods traffic, obtaining the corresponding weights by relation analysis and producing the goods traffic as output from large samples of goods data as input. Third, we predict the inventory in accordance with the commodity conversion rate and the traffic prediction, and amend the predicted results to obtain an accurate and real-time inventory forecast, avoiding the economic loss caused by inaccurate inventory.
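    The funnel part of such an approach can be illustrated with a short sketch: multiply the predicted traffic entering the funnel by the conversion rate of each stage on the critical path to estimate expected orders, and hence the stock required. The stage names, conversion rates, units-per-order and safety factor below are illustrative assumptions, not figures from the paper, and the traffic forecast is simply taken as an input (in the paper it comes from a BP neural network).

```python
from math import ceil

# Hypothetical funnel stages on the critical path and their conversion rates.
funnel = [
    ("search impression -> item page view", 0.12),
    ("item page view   -> add to cart",     0.25),
    ("add to cart      -> payment",         0.60),
]

def expected_orders(predicted_traffic: float) -> float:
    """Traffic entering the funnel multiplied through all stage conversion rates."""
    orders = predicted_traffic
    for _, rate in funnel:
        orders *= rate
    return orders

def inventory_target(predicted_traffic: float, units_per_order: float = 1.1,
                     safety_factor: float = 1.2) -> int:
    """Stock to hold for the period, with a safety margin for forecast error."""
    return ceil(expected_orders(predicted_traffic) * units_per_order * safety_factor)

# e.g. a traffic forecast of 50,000 visits for the period
print(inventory_target(50_000))   # -> required stock units
```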

  18. An optimal generic model for multi-parameters and big data optimizing: a laboratory experimental study

    Science.gov (United States)

    Utama, D. N.; Ani, N.; Iqbal, M. M.

    2018-03-01

    Optimization is a process for finding the parameter (or parameters) that delivers an optimal value of an objective function. Seeking an optimal generic model for optimization is a computer science problem that has been pursued in practice by numerous researchers. A generic model is a model that can be operated to solve a wide variety of optimization problems. The generic model for optimization was constructed using an object-oriented method. Moreover, two types of optimization method, simulated annealing and hill climbing, were used in constructing the model and compared to find the more effective one. The results showed that both methods gave the same objective function value, with the hill-climbing-based model consuming the shorter running time.
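    The comparison reported here can be reproduced in miniature: run hill climbing and simulated annealing on the same objective and compare the best value found and the running time. The toy objective, step sizes, iteration counts and cooling schedule below are arbitrary choices for illustration, not the settings used in the study.

```python
import math, random, time

def objective(x: float) -> float:
    """Toy single-parameter objective to be maximised (stand-in for the real model)."""
    return -(x - 3.0) ** 2 + 10.0

def hill_climbing(x0=0.0, step=0.1, iters=5000):
    x, best = x0, objective(x0)
    for _ in range(iters):
        cand = x + random.uniform(-step, step)
        val = objective(cand)
        if val > best:                 # accept only improving moves
            x, best = cand, val
    return x, best

def simulated_annealing(x0=0.0, step=0.1, iters=5000, t0=1.0, cooling=0.999):
    x, fx, t = x0, objective(x0), t0
    best_x, best = x, fx
    for _ in range(iters):
        cand = x + random.uniform(-step, step)
        val = objective(cand)
        # accept worse moves with a temperature-dependent probability
        if val > fx or random.random() < math.exp((val - fx) / t):
            x, fx = cand, val
            if fx > best:
                best_x, best = x, fx
        t *= cooling
    return best_x, best

for name, method in [("hill climbing", hill_climbing), ("simulated annealing", simulated_annealing)]:
    start = time.perf_counter()
    x, val = method()
    print(f"{name:20s} best f = {val:.4f} at x = {x:.3f} in {time.perf_counter() - start:.3f} s")
```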

  19. The Application and Future of Big Database Studies in Cardiology: A Single-Center Experience.

    Science.gov (United States)

    Lee, Kuang-Tso; Hour, Ai-Ling; Shia, Ben-Chang; Chu, Pao-Hsien

    2017-11-01

    As medical research techniques and quality have improved, it has become apparent that cardiovascular problems could be better resolved by stricter experimental design. In fact, substantial time and resources should be expended to fulfil the requirements of high-quality studies. Many worthy ideas and hypotheses have been unable to be verified or proven because of ethical or economic limitations. In recent years, new and varied applications of databases have received increasing attention. Important information on issues such as rare cardiovascular diseases, women's heart health, post-marketing analysis of different medications, or a combination of clinical and regional cardiac features can be obtained by the use of rigorous statistical methods. However, limitations exist in all databases. A key essential for creating and correctly addressing this research is a reliable process for analyzing and interpreting these cardiologic databases.

  20. Impact Response Study on Covering Cap of Aircraft Big-Size Integral Fuel Tank

    Science.gov (United States)

    Wang, Fusheng; Jia, Senqing; Wang, Yi; Yue, Zhufeng

    2016-10-01

    The aim was to assess various design concepts and choose a covering cap design scheme that meets the requirements of the airworthiness standard and ensures the safety of the fuel tank. Using the finite element software ANSYS/LS-DYNA, the impact of a projectile on the covering cap of an aircraft fuel tank was simulated. The dynamic characteristics of a simple single covering cap and a gland double-layer covering cap impacted by a titanium alloy projectile and a rubber projectile were studied, as well as the effects of impact region, impact angle and impact energy on both cap designs. Comparison of the critical damage velocity and the number of deleted elements of the covering caps shows that the external covering cap provides good protection for the internal covering cap. Regions close to the boundary are vulnerable to impact damage from the titanium alloy projectile, while regions close to the center are vulnerable to damage from the rubber projectile. The equivalent strain in the covering cap is very small when the impact angle is less than 15°. The number of deleted elements in the covering cap reaches its maximum when the impact angle is between 60° and 65° for the titanium alloy projectile, whereas for the rubber projectile impacting the composite covering cap, the larger the impact angle, the more serious the damage. The energy needed to damage the external covering cap and the internal covering cap is, respectively, less than and higher than that needed to damage a single covering cap, and the energy needed for complete breakdown of the double-layer covering cap is much higher than that of the single covering cap.

  1. A study of operational cycle of terminal distributed power supply based on Big-data

    Science.gov (United States)

    Nie, Erbao; Liu, Zhoubin; He, Jinhong; Li, Chao

    2018-01-01

    In China, the distributed power supply industry is developing rapidly, and there are various types of operation mode on the users' side. Taking rural areas as an example, this paper mainly studies the whole life-cycle operation mode of rural distributed solar power plants, including the feasibility study plan and investment suggestions for the initial construction of the rural power station, and the operation and maintenance in the middle period. China's rural areas are vast, the area per capita is large, and the average household has independent housing and courtyards, so available building area is not a problem. Compared with urban areas, the return on investment is low, investment options are rare, collectives are strong, and risk tolerance is weak. In view of these characteristics of rural areas, three investment schemes for rural distributed photovoltaic power plants are put forward, and their concrete implementation plans are analysed in detail. The second option, in particular, considers the farmers' investment risk and guarantees their principal, which greatly reduces the risk that farmers lose the funds they put into the power plant. At the same time, corresponding investment advice is given according to each farmer's risk profile. Rural income is generally low, and the expected benefits of distributed photovoltaic power plants can significantly improve farmers' income and quality of life. Coupled with strong rural collectives, rural distributed photovoltaic power plants are expected to multiply, which is of great significance to China's photovoltaic construction and clean energy supply, truly benefiting the national energy strategy and rural construction.

  2. Big hearts, small hands: a focus group study exploring parental food portion behaviours

    Directory of Open Access Journals (Sweden)

    Kristina Curtis

    2017-09-01

    Full Text Available Abstract Background The development of healthy food portion sizes among families is deemed critical to childhood weight management; yet little is known about the interacting factors influencing parents’ portion control behaviours. This study aimed to use two synergistic theoretical models of behaviour: the COM-B model (Capability, Opportunity, Motivation – Behaviour and Theoretical Domains Framework (TDF to identify a broad spectrum of theoretically derived influences on parents’ portion control behaviours including examination of affective and habitual influences often excluded from prevailing theories of behaviour change. Methods Six focus groups exploring family weight management comprised of one with caseworkers (n = 4, four with parents of overweight children (n = 14 and one with parents of healthy weight children (n = 8. A thematic analysis was performed across the dataset where the TDF/COM-B were used as coding frameworks. Results To achieve the target behaviour, the behavioural analysis revealed the need for eliciting change in all three COM-B domains and nine associated TDF domains. Findings suggest parents’ internal processes such as their emotional responses, habits and beliefs, along with social influences from partners and grandparents, and environmental influences relating to items such as household objects, interact to influence portion size behaviours within the home environment. Conclusion This is the first study underpinned by COM-B/TDF frameworks applied to childhood weight management and provides new targets for intervention development and the opportunity for future research to explore the mediating and moderating effects of these variables on one another.

  3. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. Unified theory of the moment of creation, evidence of an expanding Universe, the X-boson - the particle produced very soon after the big bang and which vanished from the Universe one-hundredth of a second after the big bang, and the fate of the Universe, are all discussed. (U.K.)

  4. Lost in a random forest: Using Big Data to study rare events

    Directory of Open Access Journals (Sweden)

    Christopher A Bail

    2015-12-01

    Full Text Available Sudden, broad-scale shifts in public opinion about social problems are relatively rare. Until recently, social scientists were forced to conduct post-hoc case studies of such unusual events that ignore the broader universe of possible shifts in public opinion that do not materialize. The vast amount of data that has recently become available via social media sites such as Facebook and Twitter—as well as the mass-digitization of qualitative archives provide an unprecedented opportunity for scholars to avoid such selection on the dependent variable. Yet the sheer scale of these new data creates a new set of methodological challenges. Conventional linear models, for example, minimize the influence of rare events as “outliers”—especially within analyses of large samples. While more advanced regression models exist to analyze outliers, they suffer from an even more daunting challenge: equifinality, or the likelihood that rare events may occur via different causal pathways. I discuss a variety of possible solutions to these problems—including recent advances in fuzzy set theory and machine learning—but ultimately advocate an ecumenical approach that combines multiple techniques in iterative fashion.
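    One common way of keeping rare events from being treated as outliers, in the machine-learning spirit the abstract alludes to, is to reweight classes when fitting a model. The sketch below is a generic illustration using synthetic imbalanced data and scikit-learn's class_weight option; it is not the author's own procedure, and all data and thresholds are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic, highly imbalanced data: roughly 1% "rare event" cases.
rng = np.random.default_rng(42)
n = 20_000
X = rng.normal(size=(n, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 3.0).astype(int)
print(f"rare-event rate: {y.mean():.3%}")

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

# class_weight="balanced" upweights the rare class so it is not learned away as noise.
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```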

  5. Interpretation of optimisation in the context of a disposal facility for long-lived radioactive waste

    International Nuclear Information System (INIS)

    1999-01-01

    Full text: Guidance on the Requirements for Authorisation (the GRA) issued by the Environment Agency for England and Wales requires that all disposals of radioactive waste are undertaken in a manner consistent with four principles for the protection of the public. Among these is a principle of Optimisation, that: 'The radiological detriment to members of the public that may result from the disposal of radioactive waste shall be as low as reasonably achievable, economic and social factors being taken into account'. The principle of optimisation is widely accepted and has been discussed in both UK national policy and guidance and in documents from international organisations. The practical interpretation of optimisation in the context of post-closure safety of radioactive waste repositories is, however, still open to question. In particular, the strategies and procedures that a developer might employ to implement optimisation in the siting and development of a repository, and demonstrate optimisation in a safety case, are not defined. In preparation for its role of regulatory review, the Agency has undertaken a pilot study to explore the possible interpretations of optimisation stemming from the GRA, and to identify possible strategies and procedures that a developer might follow. A review has been undertaken of UK regulatory guidance and related documents, and also international guidance, referring to optimisation in relation to radioactive waste disposal facilities. In addition, diverse examples of the application of optimisation have been identified in the international and UK performance assessment literature. A one-day meeting was organised bringing together Agency staff and technical experts with different experiences and perspectives on the subject of optimisation in the context of disposal facilities for radioactive waste. This meeting identified and discussed key issues and possible approaches to optimisation, and specifically: (1) The meaning of

  6. Managing Astronomy Research Data: Case Studies of Big and Small Research Projects

    Science.gov (United States)

    Sands, Ashley E.

    2015-01-01

    Astronomy data management refers to all actions taken upon data over the course of the entire research process. It includes activities involving the collection, organization, analysis, release, storage, archiving, preservation, and curation of research data. Astronomers have cultivated data management tools, infrastructures, and local practices to ensure the use and future reuse of their data. However, new sky surveys will soon amass petabytes of data requiring new data management strategies.The goal of this dissertation, to be completed in 2015, is to identify and understand data management practices and the infrastructure and expertise required to support best practices. This will benefit the astronomy community in efforts toward an integrated scholarly communication framework.This dissertation employs qualitative, social science research methods (including interviews, observations, and document analysis) to conduct case studies of data management practices, covering the entire data lifecycle, amongst three populations: Sloan Digital Sky Survey (SDSS) collaboration team members; Individual and small-group users of SDSS data; and Large Synoptic Survey Telescope (LSST) collaboration team members. I have been observing the collection, release, and archiving of data by the SDSS collaboration, the data practices of individuals and small groups using SDSS data in journal articles, and the LSST collaboration's planning and building of infrastructure to produce data.Preliminary results demonstrate that current data management practices in astronomy are complex, situational, and heterogeneous. Astronomers often have different management repertoires for working on sky surveys and for their own data collections, varying their data practices as they move between projects. The multitude of practices complicates coordinated efforts to maintain data.While astronomy expertise proves critical to managing astronomy data in the short, medium, and long term, the larger astronomy

  7. From 'Big 4' to 'Big 5': a review and epidemiological study on the relationship between psychiatric disorders and World Health Organization preventable diseases.

    Science.gov (United States)

    Chartier, Gabrielle; Cawthorpe, David

    2016-09-01

    This study outlines the rationale and provides evidence in support of including psychiatric disorders in the World Health Organization's classification of preventable diseases. The methods used represent a novel approach to describe clinical pathways, highlighting the importance of considering the full range of comorbid disorders within an integrated population-based data repository. Review of literature focused on comorbidity in relation to the four preventable diseases identified by the World Health Organization. This revealed that only 29 publications over the last 5 years focus on populations and tend only to consider one or two comorbid disorders simultaneously in regard to any main preventable disease class. This article draws attention to the importance of physical and psychiatric comorbidity and illustrates the complexity related to describing clinical pathways in terms of understanding the etiological and prognostic clinical profile for patients. Developing a consistent and standardized approach to describe these features of disease has the potential to dramatically shift the format of both clinical practice and medical education when taking into account the complex relationships between and among diseases, such as psychiatric and physical disease, that, hitherto, have been largely unrelated in research.

  8. The optimisation of wedge filters in radiotherapy of the prostate

    International Nuclear Information System (INIS)

    Oldham, Mark; Neal, Anthony J.; Webb, Steve

    1995-01-01

    A treatment plan optimisation algorithm has been applied to 12 patients with early prostate cancer in order to determine the optimum beam-weights and wedge angles for a standard conformal three-field treatment technique. The optimisation algorithm was based on fast-simulated-annealing using a cost function designed to achieve a uniform dose in the planning-target-volume (PTV) and to minimise the integral doses to the organs-at-risk. The algorithm has been applied to standard conformal three-field plans created by an experienced human planner, and run in three PLAN MODES: (1) where the wedge angles were fixed by the human planner and only the beam-weights were optimised; (2) where both the wedge angles and beam-weights were optimised; and (3) where both the wedge angles and beam-weights were optimised and a non-uniform dose was prescribed to the PTV. In the latter PLAN MODE, a uniform 100% dose was prescribed to all of the PTV except for that region that overlaps with the rectum where a lower (e.g., 90%) dose was prescribed. The resulting optimised plans have been compared with those of the human planner who found beam-weights by conventional forward planning techniques. Plans were compared on the basis of dose statistics, normal-tissue-complication-probability (NTCP) and tumour-control-probability (TCP). The results of the comparison showed that all three PLAN MODES produced plans with slightly higher TCP for the same rectal NTCP, than the human planner. The best results were observed for PLAN MODE 3, where an average increase in TCP of 0.73% (± 0.20, 95% confidence interval) was predicted by the biological models. This increase arises from a beneficial dose gradient which is produced across the tumour. Although the TCP gain is small it comes with no increase in treatment complexity, and could translate into increased cures given the large numbers of patients being referred. A study of the beam-weights and wedge angles chosen by the optimisation algorithm revealed
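    A heavily simplified sketch of the kind of cost function and annealing loop described above is given below. The dose model is a toy linear combination of per-beam dose templates, and every number (the templates, the rectum weighting, the cooling schedule) is invented for illustration; a real planning system computes dose from the actual beam geometry and wedge angles rather than from random templates.

```python
import math
import random
import numpy as np

# Toy dose templates: dose per unit beam weight to 50 PTV voxels and 20 rectum voxels
# for 3 beams (purely illustrative numbers, not a real beam model).
rng = np.random.default_rng(1)
ptv_dose_per_beam = rng.uniform(0.2, 0.5, size=(3, 50))
rectum_dose_per_beam = rng.uniform(0.0, 0.3, size=(3, 20))

def cost(weights: np.ndarray) -> float:
    """Penalise non-uniform PTV dose (target 1.0) plus mean (integral) rectum dose."""
    ptv = weights @ ptv_dose_per_beam
    rectum = weights @ rectum_dose_per_beam
    uniformity_penalty = np.mean((ptv - 1.0) ** 2)
    return uniformity_penalty + 0.05 * rectum.mean()

def anneal(iters=20_000, t0=0.1, cooling=0.9995):
    w = np.array([1.0, 1.0, 1.0])
    c, t = cost(w), t0
    best_w, best_c = w.copy(), c
    for _ in range(iters):
        cand = np.clip(w + rng.normal(scale=0.05, size=3), 0.0, None)  # weights stay >= 0
        cc = cost(cand)
        if cc < c or random.random() < math.exp((c - cc) / t):  # annealing acceptance rule
            w, c = cand, cc
            if c < best_c:
                best_w, best_c = w.copy(), c
        t *= cooling
    return best_w, best_c

weights, final_cost = anneal()
print("optimised beam weights:", np.round(weights, 3), "cost:", round(final_cost, 5))
```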

  9. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on patterns we identify in the data, rather than trying to understand the underlying causes in more detail. The summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  10. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  11. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative…

  12. Rupture Propagation through the Big Bend of the San Andreas Fault: A Dynamic Modeling Case Study of the Great Earthquake of 1857

    Science.gov (United States)

    Lozos, J.

    2017-12-01

    The great San Andreas Fault (SAF) earthquake of 9 January 1857, estimated at M7.9, was one of California's largest historic earthquakes. Its 360 km rupture trace follows the Carrizo and Mojave segments of the SAF, including the 30° compressional Big Bend in the fault. If 1857 were a characteristic rupture, the hazard implications for southern California would be dire, especially given the inferred 150 year recurrence interval for this section of the fault. However, recent paleoseismic studies in this region suggest that 1857-type events occur less frequently than single-segment Carrizo or Mojave ruptures, and that the hinge of the Big Bend is a barrier to through-going rupture. Here, I use 3D dynamic rupture modeling to attempt to reproduce the rupture length and surface slip distribution of the 1857 earthquake, to determine which physical conditions allow rupture to negotiate the Big Bend of the SAF. These models incorporate the nonplanar geometry of the SAF, an observation-based heterogeneous regional velocity structure (SCEC CVM), and a regional stress field from seismicity literature. Under regional stress conditions, I am unable to produce model events that both match the observed surface slip on the Carrizo and Mojave segments of the SAF and include rupture through the hinge of the Big Bend. I suggest that accumulated stresses at the bend hinge from multiple smaller Carrizo or Mojave ruptures may be required to allow rupture through the bend — a concept consistent with paleoseismic observations. This study may contribute to understanding the cyclicity of hazard associated with the southern-central SAF.

  13. Statistical Optimisation of Fermentation Conditions for Citric Acid ...

    African Journals Online (AJOL)

    This study investigated the optimisation of fermentation conditions during citric acid production via solid state fermentation (SSF) of pineapple peels using Aspergillus niger. A three-variable, three-level Box-Behnken design (BBD) comprising 17 experimental runs was used to develop a statistical model for the fermentation ...

  14. Optimising the Blended Learning Environment: The Arab Open University Experience

    Science.gov (United States)

    Hamdi, Tahrir; Abu Qudais, Mohammed

    2018-01-01

    This paper will offer some insights into possible ways to optimise the blended learning environment based on experience with this modality of teaching at Arab Open University/Jordan branch and also by reflecting upon the results of several meta-analytical studies, which have shown blended learning environments to be more effective than their face…

  15. Day-ahead economic optimisation of energy storage

    NARCIS (Netherlands)

    Lampropoulos, I.; Garoufalis, P.; Bosch, van den P.P.J.; Groot, de R.J.W.; Kling, W.L.

    2014-01-01

    This article addresses the day-ahead economic optimisation of energy storage systems within the setting of electricity spot markets. The case study is about a lithium-ion battery system integrated in a low voltage distribution grid with residential customers and photovoltaic generation in the
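    Although the abstract is cut short here, the general structure of a day-ahead economic optimisation of storage can be illustrated as a small linear programme: choose hourly charge and discharge power to maximise spot-market revenue subject to power and state-of-charge limits. The prices, battery size, efficiency and variable layout below are assumptions for illustration only, not the case-study parameters.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical day-ahead hourly prices (EUR/MWh) and battery parameters.
prices = np.array([30, 28, 25, 24, 26, 35, 55, 70, 65, 50, 45, 40,
                   38, 36, 35, 40, 55, 75, 80, 70, 55, 45, 38, 32], dtype=float)
T = len(prices)
p_max, capacity, soc0, eta = 1.0, 4.0, 2.0, 0.9   # MW, MWh, MWh, discharge efficiency

# Decision variables x = [charge_0..charge_{T-1}, discharge_0..discharge_{T-1}] (MW, 1 h steps).
# Revenue = sum_t price_t * (eta * discharge_t - charge_t); linprog minimises, so negate.
c = np.concatenate([prices, -eta * prices])

# State of charge after hour t: soc0 + cumsum(charge - discharge), kept within [0, capacity].
cum = np.tril(np.ones((T, T)))
A_soc = np.hstack([cum, -cum])
A_ub = np.vstack([A_soc, -A_soc])                       # soc <= capacity and soc >= 0
b_ub = np.concatenate([np.full(T, capacity - soc0), np.full(T, soc0)])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, p_max)] * (2 * T), method="highs")
charge, discharge = res.x[:T], res.x[T:]
print(f"day-ahead profit: {-res.fun:.1f} EUR")
```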

  16. Smart optimisation and sensitivity analysis in water distribution systems

    CSIR Research Space (South Africa)

    Page, Philip R

    2015-12-01

    Full Text Available optimisation of a water distribution system by keeping the average pressure unchanged as water demands change, by changing the speed of the pumps. Another application area considered, using the same mathematical notions, is the study of the sensitivity...
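    The operation described, adjusting pump speed so that the average pressure stays constant as water demand changes, can be illustrated with a toy calculation based on pump affinity laws and a one-dimensional root-find. The pump curve coefficients, network resistance, pressure margin and demand values below are invented for illustration; the actual study presumably relies on a full hydraulic model of the distribution network.

```python
from scipy.optimize import brentq

# Toy pump curve via affinity laws: head H(Q, n) = a*n**2 - b*Q**2, with n the relative speed.
# Head required to hold the target average pressure at demand Q: H_req = h0 + r*Q**2 + margin.
a, b = 60.0, 8.0          # hypothetical pump curve coefficients (m, m/(m3/s)^2)
h0, r = 25.0, 5.0         # hypothetical static lift and network resistance
TARGET_MARGIN = 10.0      # extra head (m) standing in for the desired average service pressure

def speed_for_demand(q: float) -> float:
    """Relative pump speed n such that pump head matches the required head at demand q."""
    def residual(n: float) -> float:
        return (a * n**2 - b * q**2) - (h0 + r * q**2 + TARGET_MARGIN)
    return brentq(residual, 0.1, 2.0)   # search between 10% and 200% of nominal speed

for demand in (0.5, 1.0, 1.5):          # m3/s, illustrative demand scenarios
    print(f"demand {demand:.1f} m3/s -> pump speed {speed_for_demand(demand):.2f} x nominal")
```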

  17. Optimisation of wort production from rice malt using enzymes and ...

    African Journals Online (AJOL)

    Commercially, rice malt has never been successfully used in brewing because of its low free α-amino nitrogen (FAN) content. This study was designed to optimise rice malt replacement for barley malt in wort production and to improve FAN by adding α-amylase and protease. The response surface methodology (RSM) ...

  18. Research Ethics in Big Data.

    Science.gov (United States)

    Hammer, Marilyn J

    2017-05-01

    The ethical conduct of research includes, in part, patient agreement to participate in studies and the protection of health information. In the evolving world of data science and the accessibility of large quantities of web-based data created by millions of individuals, novel methodologic approaches to answering research questions are emerging. This article explores research ethics in the context of big data.

  19. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  20. Using Big Book to Teach Things in My House

    OpenAIRE

    Effrien, Intan; Lailatus, Sa’diyah; Nuruliftitah Maja, Neneng

    2017-01-01

    The purpose of this study is to determine students' interest in learning using big book media. A big book is an enlarged version of an ordinary book; it contains simple words and images that match the content and spelling of the sentences. From this, the researchers can gauge students' interest and the development of their knowledge, and it also trains the researchers to remain creative in developing learning media for students.

  1. Evolutionary programming for neutron instrument optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Bentley, Phillip M. [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany)]. E-mail: phillip.bentley@hmi.de; Pappas, Catherine [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Habicht, Klaus [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Lelievre-Berna, Eddy [Institut Laue-Langevin, 6 rue Jules Horowitz, BP 156, 38042 Grenoble Cedex 9 (France)

    2006-11-15

    Virtual instruments based on Monte-Carlo techniques are now integral part of novel instrumentation development and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN, at the HMI. We discuss the potential of the GA which combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.) leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.
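    In lieu of the full coupling to a Monte-Carlo instrument simulation, the canonical GA loop referred to here (selection, crossover, mutation over a parameter vector) can be sketched as follows. The fitness function is only a stand-in for a Vitess/McStas figure of merit, and the population size, bounds and mutation settings are illustrative choices, not those used for SPAN.

```python
import random

N_PARAMS, POP, GENS = 6, 40, 80          # illustrative GA settings
LOW, HIGH = -1.0, 1.0                     # parameter bounds (e.g. normalised coil currents)

def fitness(params):
    """Stand-in figure of merit; in practice this would call a Monte-Carlo instrument simulation."""
    return -sum((p - 0.3) ** 2 for p in params)

def tournament(pop, scores, k=3):
    """Pick the best of k randomly chosen individuals (tournament selection)."""
    return max(random.sample(list(zip(pop, scores)), k), key=lambda t: t[1])[0]

def crossover(a, b):
    """Single-point crossover of two parameter vectors."""
    cut = random.randrange(1, N_PARAMS)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.1, scale=0.1):
    """Gaussian mutation of each gene with probability `rate`, clipped to the bounds."""
    return [min(HIGH, max(LOW, p + random.gauss(0, scale))) if random.random() < rate else p
            for p in ind]

population = [[random.uniform(LOW, HIGH) for _ in range(N_PARAMS)] for _ in range(POP)]
for gen in range(GENS):
    scores = [fitness(ind) for ind in population]
    population = [mutate(crossover(tournament(population, scores),
                                   tournament(population, scores)))
                  for _ in range(POP)]

best = max(population, key=fitness)
print("best parameters:", [round(p, 2) for p in best], "fitness:", round(fitness(best), 4))
```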

  2. Evolutionary programming for neutron instrument optimisation

    International Nuclear Information System (INIS)

    Bentley, Phillip M.; Pappas, Catherine; Habicht, Klaus; Lelievre-Berna, Eddy

    2006-01-01

    Virtual instruments based on Monte-Carlo techniques are now integral part of novel instrumentation development and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN, at the HMI. We discuss the potential of the GA which combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.) leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations

  3. Big Data Analytics Methodology in the Financial Industry

    Science.gov (United States)

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  4. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  5. Public transport optimisation emphasising passengers’ travel behaviour

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo

    Passengers in public transport complaining about their travel experiences are not uncommon. This might seem counterintuitive since several operators worldwide are presenting better key performance indicators year by year. The present PhD study focuses on developing optimisation algorithms to enhance the operations of public transport while explicitly emphasising passengers' travel behaviour and preferences. Similar to economic theory, interactions between supply and demand are omnipresent in the context of public transport operations. In public transport, the demand is represented … The PhD study develops a metaheuristic algorithm to adapt the line plan configuration in order better to match passengers' travel demand in terms of transfers as well … compared to the case where the two problems are solved sequentially without taking into account interdependencies. (Figure 1: Planning public transport.)
  6. Dynamic optimisation of an industrial web process

    Directory of Open Access Journals (Sweden)

    M Soufian

    2008-09-01

    Full Text Available An industrial web process has been studied and it is shown that the underlying physics of such processes is governed by the Navier-Stokes partial differential equations with moving boundary conditions, which in turn have to be determined by the solution of the thermodynamics equations. The development of a two-dimensional continuous-discrete model structure based on this study is presented. Other models are constructed based on this model for better identification and optimisation purposes. The parameters of the proposed models are then estimated using real data obtained from identification experiments with the process plant. Various simulation tests for validation accompany the design, development and real-time industrial implementation of an optimal controller for dynamic optimisation of this web process. It is shown that, in comparison with the traditional controller, the new controller resulted in better performance, an improvement in film quality and savings in raw materials. This demonstrates the efficiency and validity of the developed models.

  7. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports…
  8. Dose optimisation in computed radiography

    International Nuclear Information System (INIS)

    Schreiner-Karoussou, A.

    2005-01-01

    After the installation of computed radiography (CR) systems in three hospitals in Luxembourg a patient dose survey was carried out for three radiographic examinations, thorax, pelvis and lumbar spine. It was found that the patient doses had changed in comparison with the patient doses measured for conventional radiography in the same three hospitals. A close collaboration between the manufacturers of the X-ray installations, the CR imaging systems and the medical physicists led to the discovery that the speed class with which each radiographic examination was to be performed, had been ignored, during installation of the digital imaging systems. A number of procedures were carried out in order to calibrate and program the X-ray installations in conjunction with the CR systems. Following this optimisation procedure, a new patient dose survey was carried out for the three radiographic examinations. It was found that patient doses for the three hospitals were reduced. (authors)

  9. Optimising costs in WLCG operations

    CERN Document Server

    Pradillo, Mar; Flix, Josep; Forti, Alessandra; Sciabà, Andrea

    2015-01-01

    The Worldwide LHC Computing Grid project (WLCG) provides the computing and storage resources required by the LHC collaborations to store, process and analyse the 50 Petabytes of data annually generated by the LHC. The WLCG operations are coordinated by a distributed team of managers and experts and performed by people at all participating sites and from all the experiments. Several improvements in the WLCG infrastructure have been implemented during the first long LHC shutdown to prepare for the increasing needs of the experiments during Run2 and beyond. However, constraints in funding will affect not only the computing resources but also the available effort for operations. This paper presents the results of a detailed investigation on the allocation of the effort in the different areas of WLCG operations, identifies the most important sources of inefficiency and proposes viable strategies for optimising the operational cost, taking into account the current trends in the evolution of the computing infrastruc...

  10. A study of pricing and trading model of Blockchain & Big data-based Energy-Internet electricity

    Science.gov (United States)

    Fan, Tao; He, Qingsu; Nie, Erbao; Chen, Shaozhen

    2018-01-01

    The development of the Energy-Internet is currently hampered by a series of issues, such as the conflict among high capital requirements, low cost and high efficiency, the widening gap between capital demand and supply, and the lagging trading and valuation mechanisms, any of which could hinder the Energy-Internet's evolution. However, with the development of blockchain and big-data technology, it is possible to work out solutions for these issues. Based on the current situation of the Energy-Internet and its requirements for future progress, this paper demonstrates the validity of employing blockchain technology to solve the problems encountered by the Energy-Internet during its development. It proposes applying blockchain and big-data technologies to pricing and trading energy products through the Energy-Internet, and to accomplishing the transformation of cyber-based energy and power from physical products into financial assets.

  11. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of record-keeping, and IT-competent employees and customers, which make a leading position possible, but only if companies get themselves ready for the next big data wave.

  12. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  13. Optimisation of searches for Supersymmetry with the ATLAS detector

    Energy Technology Data Exchange (ETDEWEB)

    Zvolsky, Milan

    2012-01-15

    The ATLAS experiment is one of the four large experiments at the Large Hadron Collider which is specifically designed to search for the Higgs boson and physics beyond the Standard Model. The aim of this thesis is the optimisation of searches for Supersymmetry in decays with two leptons and missing transverse energy in the final state. Two different optimisation studies have been performed for two important analysis aspects: the final signal region selection and the choice of the trigger selection. In the first part of the analysis, a cut-based optimisation of signal regions is performed, maximising the signal for a minimal background contamination. In some regions this more than doubles the signal yield. The second approach is to introduce di-lepton triggers, which allow the lepton transverse momentum threshold to be lowered, thus significantly enhancing the number of selected signal events. The signal region optimisation was considered for the choice of the final event selection in the ATLAS di-lepton analyses. The trigger study contributed to the incorporation of di-lepton triggers into the ATLAS trigger menu. (orig.)
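    A cut-based signal-region optimisation of the kind described above can be sketched as a simple significance scan; the event samples, the weights and the s/sqrt(s+b) figure of merit below are illustrative assumptions, not the ATLAS analysis itself.

```python
# Toy significance scan: pick the missing-ET cut that maximises s / sqrt(s + b).
# Synthetic "events" stand in for signal and background samples.
import numpy as np

rng = np.random.default_rng(1)
met_signal = rng.exponential(scale=150.0, size=10_000)      # hypothetical signal MET [GeV]
met_background = rng.exponential(scale=60.0, size=100_000)  # hypothetical background MET [GeV]
w_s, w_b = 0.01, 0.1                                        # assumed normalisation weights

best_cut, best_sig = None, -1.0
for cut in np.arange(50.0, 400.0, 10.0):
    s = w_s * np.sum(met_signal > cut)                      # expected signal after cut
    b = w_b * np.sum(met_background > cut)                  # expected background after cut
    sig = s / np.sqrt(s + b + 1e-9)
    if sig > best_sig:
        best_cut, best_sig = cut, sig

print(f"best significance {best_sig:.2f} at MET > {best_cut:.0f} GeV")
```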

  14. Big Data: A Tool for all Strategic Decisions: A Study of Three Large Food and Beverage Processing Organizations

    OpenAIRE

    Arsenovic, Jasenko

    2015-01-01

    I look at what impact big data has had on managerial strategic decisions in the food and beverage industry. In order to understand the complexity and theory of organizational strategic management, an effort is made to consolidate contemporary strategic theory into a holistic conceptual model through a literature review on organizational strategy. This literature explicitly proposes four distinctly different types of strategies that management needs to consider in the organization...

  15. Work management to optimise occupational radiological protection

    International Nuclear Information System (INIS)

    Ahier, B.

    2009-01-01

    Although work management is no longer a new concept, continued efforts are still needed to ensure that good performance, outcomes and trends are maintained in the face of current and future challenges. The ISOE programme thus created an Expert Group on Work Management in 2007 to develop an updated report reflecting the current state of knowledge, technology and experience in the occupational radiological protection of workers at nuclear power plants. Published in 2009, the new ISOE report on Work Management to Optimise Occupational Radiological Protection in the Nuclear Power Industry provides up-to-date practical guidance on the application of work management principles. Work management measures aim at optimising occupational radiological protection in the context of the economic viability of the installation. Important factors in this respect are measures and techniques influencing i) dose and dose rate, including source-term reduction; ii) exposure, including amount of time spent in controlled areas for operations; and iii) efficiency in short- and long-term planning, worker involvement, coordination and training. Equally important due to their broad, cross-cutting nature are the motivational and organisational arrangements adopted. The responsibility for these aspects may reside in various parts of an installation's organisational structure, and thus, a multi-disciplinary approach must be recognised, accounted for and well-integrated in any work. Based on the operational experience within the ISOE programme, the following key areas of work management have been identified: - regulatory aspects; - ALARA management policy; - worker involvement and performance; - work planning and scheduling; - work preparation; - work implementation; - work assessment and feedback; - ensuring continuous improvement. The details of each of these areas are elaborated and illustrated in the report through examples and case studies arising from ISOE experience. They are intended to

  16. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  17. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  18. Expert systems and optimisation in process control

    Energy Technology Data Exchange (ETDEWEB)

    Mamdani, A.; Efstathiou, J. (eds.)

    1986-01-01

    This report brings together recent developments both in expert systems and in optimisation, and deals with current applications in industry. Part One is concerned with Artificial Intelligence in planning and scheduling and with rule-based control implementation. The tasks of control maintenance, rescheduling and planning are each discussed in relation to new theoretical developments, techniques available, and sample applications. Part Two covers model based control techniques in which the control decisions are used in a computer model of the process. Fault diagnosis, maintenance and trouble-shooting are just some of the activities covered. Part Three contains case studies of projects currently in progress, giving details of the software available and the likely future trends. One of these, on qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations, is indexed separately.

  19. Expert systems and optimisation in process control

    International Nuclear Information System (INIS)

    Mamdani, A.; Efstathiou, J.

    1986-01-01

    This report brings together recent developments both in expert systems and in optimisation, and deals with current applications in industry. Part One is concerned with Artificial Intelligence in planning and scheduling and with rule-based control implementation. The tasks of control maintenance, rescheduling and planning are each discussed in relation to new theoretical developments, techniques available, and sample applications. Part Two covers model based control techniques in which the control decisions are used in a computer model of the process. Fault diagnosis, maintenance and trouble-shooting are just some of the activities covered. Part Three contains case studies of projects currently in progress, giving details of the software available and the likely future trends. One of these, on qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations, is indexed separately. (author)

  20. Optimisation of parameters of DCD for PHWRs

    International Nuclear Information System (INIS)

    Velmurugan, S.; Sathyaseelan, V.S.; Narasimhan, S.V.; Mathur, P.K.

    1991-01-01

    A decontamination formulation based on EDTA, oxalic acid and citric acid was evaluated for its efficacy in removing oxide layers in PHWRs. An ion exchange system specifically suited to the fission-product-dominated contamination in PHWRs was optimised for the reagent regeneration stage of the decontamination process. An analysis of the nature of the complexed metal species formed in the dissolution process and electrochemical measurements were employed as tools to follow the course of oxide removal during dissolution. An attempt was made to understand the redeposition behaviour of various isotopes during the decontamination process. SEM and ESCA studies of metal coupons before and after the dissolution process were used to analyse the deposits in this context. The pick-up of DCD reagents on the ion exchangers and material compatibility tests on carbon steel, Monel-400 and Zircaloy-2 with the decontaminant under the conditions of the decontamination experiment are reported. (author)

  1. Predictive Big Data Analytics: A Study of Parkinson’s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    Science.gov (United States)

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.

    2016-01-01

    Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data–large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources–all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several
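    Two of the steps listed above, rebalancing an imbalanced cohort and generating cross-validated classifications, can be sketched as follows; the synthetic feature matrix and the choice of classifier are assumptions for illustration, not the PPMI data or the authors' pipeline.

```python
# Hedged sketch: upsample the minority cohort, then cross-validate a classifier.
import numpy as np
from sklearn.utils import resample
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X_majority = rng.normal(size=(900, 20))             # e.g. controls (synthetic features)
X_minority = rng.normal(loc=0.5, size=(100, 20))    # e.g. cases (synthetic features)

# Rebalance by upsampling the minority cohort to the majority cohort size.
X_minority_up = resample(X_minority, replace=True, n_samples=900, random_state=42)
X = np.vstack([X_majority, X_minority_up])
y = np.array([0] * 900 + [1] * 900)

scores = cross_val_score(RandomForestClassifier(random_state=42), X, y, cv=5)
print("mean cross-validated accuracy:", round(scores.mean(), 3))
```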

  2. Optimisation of Inulinase Production by Kluyveromyces bulgaricus

    Directory of Open Access Journals (Sweden)

    Darija Vranešić

    2002-01-01

    Full Text Available The present work is based on observation of the effects of pH and temperature of fermentation on the production of the microbial enzyme inulinase by Kluyveromyces marxianus var. bulgaricus. Inulinase hydrolyzes inulin, a polysaccharide which can be isolated from plants such as Jerusalem artichoke, chicory or dahlia, and transformed into pure fructose or fructooligosaccharides. Fructooligosaccharides have great potential in the food industry because they can be used as calorie-reduced compounds and noncariogenic sweeteners as well as soluble fibre and prebiotic compounds. Fructose formation from inulin is a single-step enzymatic reaction with fructose yields of up to 95 %. By contrast, conventional fructose production from starch needs at least three enzymatic steps, yielding only 45 % fructose. The process of inulinase production was optimised by using the experimental design method. The pH value of the cultivation medium proved to be the most significant variable and should be maintained at the optimum value of 3.6. The effect of temperature was slightly smaller, with optimal values between 30 and 33 °C. At a low pH value of the cultivation medium the microorganism was not able to produce enough enzyme and enzyme activities were low. A similar effect was caused by high temperature. The highest enzyme activities were achieved at optimal fermentation conditions and the values were 100.16–124.36 IU/mL (with sucrose as substrate for determination of enzyme activity) or 8.6–11.6 IU/mL (with inulin as substrate), respectively. The method of factorial design and response surface analysis makes it possible to study several factors simultaneously, to quantify the individual effect of each factor and to investigate their possible interactions. As a comparison to this method, optimisation of a physiological enzyme activity model depending on pH and temperature was also studied.
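    The response-surface step described above can be illustrated with a quadratic model of enzyme activity as a function of pH and temperature, maximised on a grid; the design points and activity values below are invented for illustration and are not the study's measurements.

```python
# Sketch of a response-surface step: fit a quadratic model of activity vs.
# pH and temperature and locate its maximum on a grid. Design points and
# activity values are invented placeholders.
import numpy as np

# (pH, temperature in deg C, activity in IU/mL) -- hypothetical design points.
data = np.array([
    [3.0, 28, 70], [3.0, 34, 80], [4.2, 28, 85],
    [4.2, 34, 75], [3.6, 31, 120], [3.6, 28, 100],
    [3.6, 34, 95], [3.0, 31, 90], [4.2, 31, 88],
])
ph, temp, act = data[:, 0], data[:, 1], data[:, 2]

# Full quadratic response surface: b0 + b1*pH + b2*T + b3*pH^2 + b4*T^2 + b5*pH*T.
X = np.column_stack([np.ones_like(ph), ph, temp, ph**2, temp**2, ph * temp])
beta, *_ = np.linalg.lstsq(X, act, rcond=None)

grid_ph, grid_t = np.meshgrid(np.linspace(3.0, 4.2, 61), np.linspace(28, 34, 61))
pred = (beta[0] + beta[1] * grid_ph + beta[2] * grid_t
        + beta[3] * grid_ph**2 + beta[4] * grid_t**2 + beta[5] * grid_ph * grid_t)
i = np.unravel_index(np.argmax(pred), pred.shape)
print(f"predicted optimum near pH {grid_ph[i]:.2f} and {grid_t[i]:.1f} deg C")
```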

  3. 2015 OLC Lidar DEM: Big Wood, ID

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Quantum Spatial has collected Light Detection and Ranging (LiDAR) data for the Oregon LiDAR Consortium (OLC) Big Wood 2015 study area. This study area is located in...

  4. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  5. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  6. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group with the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the TF including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  7. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  8. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  9. Towards identifying the business value of big data in a digital business ecosystem : a case study from the financial services industry

    NARCIS (Netherlands)

    de Vries, A.; Chituc, C.M.; Pommee, F.; Abramowicz, W.; Alt, R.; Franczyk, B.

    2016-01-01

    In today’s increasingly digital business ecosystem, big data offers numerous opportunities. Although research on big data receives a lot of attention, research on the business value of big data is scarce. The research project presented in this article aims at advancing the research in this area,

  10. How can general paediatric training be optimised in highly specialised tertiary settings? Twelve tips from an interview-based study of trainees.

    Science.gov (United States)

    Al-Yassin, Amina; Long, Andrew; Sharma, Sanjiv; May, Joanne

    2017-01-01

    Both general and subspecialty paediatric trainees undertake attachments in highly specialised tertiary hospitals. Trainee feedback suggests that mismatches in expectations between trainees and supervisors and a perceived lack of educational opportunities may lead to trainee dissatisfaction in such settings. With the 'Shape of Training' review (reshaping postgraduate training in the UK to focus on more general themes), this issue is likely to become more apparent. We wished to explore the factors that contribute to a positive educational environment and training experience and identify how this may be improved in highly specialised settings. General paediatric trainees working at all levels in subspecialty teams at a tertiary hospital were recruited (n=12). Semistructured interviews were undertaken to explore the strengths and weaknesses of training in such a setting and how this could be optimised. Appreciative inquiry methodology was used to identify areas of perceived best practice and consider how these could be promoted and disseminated. Twelve best practice themes were identified: (1) managing expectations by acknowledging the challenges; (2) educational contracting to identify learning needs and opportunities; (3) creative educational supervision; (4) centralised teaching events; (5) signposting learning opportunities; (6) curriculum-mapped pan-hospital teaching programmes; (7) local faculty groups with trainee representation; (8) interprofessional learning; (9) pastoral support systems; (10) crossover weeks to increase clinical exposure; (11) adequate clinical supervision; and (12) rota design to include teaching and clinic time. Tertiary settings have strengths, as well as challenges, for general paediatric training. Twelve trainee-generated tips have been identified to capitalise on the educational potential within these settings. Trainee feedback is essential to diagnose and improve educational environments and appreciative inquiry is a useful tool for

  11. Environmental optimisation of waste combustion

    Energy Technology Data Exchange (ETDEWEB)

    Schuster, Robert [AaF Energikonsult, Stockholm (Sweden); Berge, Niclas; Stroemberg, Birgitta [TPS Termiska Processer AB, Nykoeping (Sweden)

    2000-12-01

    The regulations concerning waste combustion evolve through R and D and a strive to get better and common regulations for the European countries. This study discusses whether today's rules concerning oxygen concentration, minimum temperature and residence time in the furnace and the use of stand-by burners are needed, are possible to monitor, are optimal from an environmental point of view, or could be improved. No evidence from well-controlled laboratory experiments validates that 850 deg C at 6 % oxygen content is in general the best lower limit. A lower excess air level increases the temperature, which has a significant effect on the destruction of hydrocarbons, favourably increases the residence time, increases the thermal efficiency and the efficiency of the precipitators. Low oxygen content is also necessary to achieve low NO{sub x}-emissions. The conclusion is that the demands on the accuracy of the measurement devices and methods are too high, if they are to be used inside the furnace to control the combustion process. The big problem is however to find representative locations to measure temperature, oxygen content and residence time in the furnace. Another major problem is that the monitoring of the operating conditions today does not ensure good combustion. It can lead to a false sense of security. The reason is that it is very hard to find boilers without stratifications. These stratifications (stream lines) each have a different history of residence time, mixing time, oxygen and combustible gas levels and temperature when they reach the convection area. The combustion result is the sum of all these different histories. The hydrocarbon emissions are in general not produced at a steady level. Small clouds of unburnt hydrocarbons travel along the stream lines, showing up as peaks on a THC measurement device. High-amplitude peaks tend to contain a higher ratio of heavy hydrocarbons than lower peaks. The good correlation between some easily detected

  12. Engineering Study for a Full Scale Demonstration of Steam Reforming Black Liquor Gasification at Georgia-Pacific's Mill in Big Island, Virginia; FINAL

    International Nuclear Information System (INIS)

    Robert De Carrera; Mike Ohl

    2002-01-01

    Georgia-Pacific Corporation performed an engineering study to determine the feasibility of installing a full-scale demonstration project of steam reforming black liquor chemical recovery at Georgia-Pacific's mill in Big Island, Virginia. The technology considered was the Pulse Enhanced Steam Reforming technology that was developed and patented by Manufacturing and Technology Conversion, International (MTCI) and is currently licensed to StoneChem, Inc., for use in North America. Pilot studies of steam reforming have been carried out on a 25-ton per day reformer at Inland Container's Ontario, California mill and on a 50-ton per day unit at Weyerhaeuser's New Bern, North Carolina mill

  13. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte and yottabyte, to express the amount of data. The growth of data creates a situation where the classic systems for the collection, storage, processing and visualization of data are losing the battle with the volume, speed and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data is a hot topic in recent years in IT circles. However, Big Data is recognized in the business world, and increasingly in the public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics services-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  14. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = {lambda}/{Delta}{lambda} = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k{sub max} = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k{sub max} = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  15. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It has moved past the hype cycle and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to

  16. The mediating role of self-efficacy in the relationship between Big five personality and depressive symptoms among Chinese unemployed population: a cross-sectional study.

    Science.gov (United States)

    Wang, Yang; Yao, Lutian; Liu, Li; Yang, Xiaoshi; Wu, Hui; Wang, Jiana; Wang, Lie

    2014-03-03

    Alongside the rapid growth of the economy, unemployment has become a severe socio-economic problem in China. China's huge population base means that the unemployed population is extremely large. However, the health status of the unemployed population has been largely ignored, and few studies have described the depressive symptoms of unemployed individuals in China. This study aims to examine the relationship between Big Five personality and depressive symptoms and the mediating role of self-efficacy in this relationship. This cross-sectional study was performed during the period of July to September 2011. Questionnaires consisting of the Center for Epidemiologic Studies Depression Scale (CES-D), the Big Five Inventory (BFI) and the General Self-efficacy Scale (GSE), as well as demographic factors, were used to collect information on the unemployed population. A total of 1,832 individuals (effective response rate: 73.28%) became our subjects. Hierarchical linear regression analyses were performed to explore the mediating role of self-efficacy. The prevalence of depressive symptoms was 67.7% among Chinese unemployed individuals. After adjusting for demographic characteristics, extraversion, agreeableness and conscientiousness were all negatively associated with depressive symptoms whereas neuroticism was positively associated with depressive symptoms. The proportion of the mediating effect of self-efficacy in the relationship between extraversion/agreeableness/conscientiousness/neuroticism and depressive symptoms was 25.42%, 10.91%, 32.21% and 36.44%, respectively. Self-efficacy is a mediator in the relationship between extraversion/agreeableness/conscientiousness/neuroticism and depressive symptoms. Self-efficacy partially mediated the relationship between Big Five personality and depressive symptoms among Chinese unemployed individuals. Interventions that focus on both individuals' personality and self-efficacy may be most successful to reduce depressive symptoms of unemployed
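    The hierarchical regression logic used above to probe mediation can be sketched as two nested models, a trait-only model and a trait-plus-mediator model, followed by a comparison of the trait coefficients; the simulated variables below are stand-ins, not the survey data.

```python
# Rough sketch of hierarchical-regression mediation checking: does the trait
# coefficient shrink once the proposed mediator (self-efficacy) is added?
# All data are simulated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
neuroticism = rng.normal(size=n)
self_efficacy = -0.5 * neuroticism + rng.normal(scale=0.8, size=n)
depression = 0.4 * neuroticism - 0.6 * self_efficacy + rng.normal(size=n)

# Step 1: trait only.
m1 = sm.OLS(depression, sm.add_constant(neuroticism)).fit()
# Step 2: trait plus proposed mediator.
m2 = sm.OLS(depression, sm.add_constant(np.column_stack([neuroticism, self_efficacy]))).fit()

print("trait coefficient without mediator:", round(m1.params[1], 3))
print("trait coefficient with mediator:   ", round(m2.params[1], 3))
```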

  17. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  18. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  19. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is and will increasingly be used as a tool for everything that happens both online and offline. Online use is, of course, a real habit: Big Data is found in this medium, offering many advantages and being a real help for all consumers. In this paper we discuss Big Data as an asset in developing new applications by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architectural principles of this technology. The most important benefit discussed in this paper is presented in the cloud section.

  20. The Mediating Role of Resilience in the Relationship between Big Five Personality and Anxiety among Chinese Medical Students: A Cross-Sectional Study

    Science.gov (United States)

    Shi, Meng; Liu, Li; Wang, Zi Yue; Wang, Lie

    2015-01-01

    Backgrounds The psychological distress of medical students is a major concern of public health worldwide. However, few studies have been conducted to evaluate anxiety symptoms of medical students in China. The purpose of this study was to investigate the anxiety symptoms among Chinese medical students, to examine the relationships between big five personality traits and anxiety symptoms among medical students, and to explore the mediating role of resilience in these relationships. Methods This multicenter cross-sectional study was conducted in June 2014. Self-reported questionnaires consisting of the Zung Self-Rating Anxiety Scale (SAS), Big Five Inventory (BFI), Wagnild and Young Resilience Scale (RS-14) and demographic section were distributed to the subjects. A stratified random cluster sampling method was used to select 2925 medical students (effective response rate: 83.57%) at four medical colleges and universities in Liaoning province, China. Asymptotic and resampling strategies were used to explore the mediating role of resilience. Results The prevalence of anxiety symptoms was 47.3% (SAS index score≥50) among Chinese medical students. After adjusting for the demographic factors, the traits of agreeableness, conscientiousness and openness were all negatively associated with anxiety whereas neuroticism was positively associated with it. Resilience functioned as a mediator in the relationships between agreeableness/conscientiousness/openness and anxiety symptoms. Conclusions Among Chinese medical students, the prevalence of anxiety symptoms was high and resilience mediated the relationships between big five personality traits and anxiety symptoms. Identifying at-risk individuals and undertaking appropriate intervention strategies that focus on both personality traits and resilience might be more effective to prevent and reduce anxiety symptoms. PMID:25794003
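    The asymptotic and resampling strategies mentioned in the Methods can be illustrated with a bootstrap of the indirect (mediated) effect a·b; the simulated trait, resilience and anxiety scores below are placeholders, not the study data.

```python
# Hedged sketch: bootstrap the indirect effect a*b, where a is the trait ->
# resilience path and b is the resilience -> anxiety path controlling for
# the trait. All data are simulated.
import numpy as np

rng = np.random.default_rng(7)
n = 800
trait = rng.normal(size=n)                         # e.g. conscientiousness (synthetic)
resilience = 0.5 * trait + rng.normal(size=n)
anxiety = -0.3 * trait - 0.4 * resilience + rng.normal(size=n)

def indirect_effect(t, m, y):
    a = np.polyfit(t, m, 1)[0]                     # a-path: mediator on trait
    X = np.column_stack([np.ones_like(t), t, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]    # b-path: outcome on mediator, given trait
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                    # resample subjects with replacement
    boot.append(indirect_effect(trait[idx], resilience[idx], anxiety[idx]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")
```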

  1. The mediating role of resilience in the relationship between big five personality and anxiety among Chinese medical students: a cross-sectional study.

    Directory of Open Access Journals (Sweden)

    Meng Shi

    Full Text Available The psychological distress of medical students is a major concern of public health worldwide. However, few studies have been conducted to evaluate anxiety symptoms of medical students in China. The purpose of this study was to investigate the anxiety symptoms among Chinese medical students, to examine the relationships between big five personality traits and anxiety symptoms among medical students, and to explore the mediating role of resilience in these relationships. This multicenter cross-sectional study was conducted in June 2014. Self-reported questionnaires consisting of the Zung Self-Rating Anxiety Scale (SAS), Big Five Inventory (BFI), Wagnild and Young Resilience Scale (RS-14) and demographic section were distributed to the subjects. A stratified random cluster sampling method was used to select 2925 medical students (effective response rate: 83.57%) at four medical colleges and universities in Liaoning province, China. Asymptotic and resampling strategies were used to explore the mediating role of resilience. The prevalence of anxiety symptoms was 47.3% (SAS index score≥50) among Chinese medical students. After adjusting for the demographic factors, the traits of agreeableness, conscientiousness and openness were all negatively associated with anxiety whereas neuroticism was positively associated with it. Resilience functioned as a mediator in the relationships between agreeableness/conscientiousness/openness and anxiety symptoms. Among Chinese medical students, the prevalence of anxiety symptoms was high and resilience mediated the relationships between big five personality traits and anxiety symptoms. Identifying at-risk individuals and undertaking appropriate intervention strategies that focus on both personality traits and resilience might be more effective to prevent and reduce anxiety symptoms.

  2. The mediating role of resilience in the relationship between big five personality and anxiety among Chinese medical students: a cross-sectional study.

    Science.gov (United States)

    Shi, Meng; Liu, Li; Wang, Zi Yue; Wang, Lie

    2015-01-01

    The psychological distress of medical students is a major concern of public health worldwide. However, few studies have been conducted to evaluate anxiety symptoms of medical students in China. The purpose of this study was to investigate the anxiety symptoms among Chinese medical students, to examine the relationships between big five personality traits and anxiety symptoms among medical students, and to explore the mediating role of resilience in these relationships. This multicenter cross-sectional study was conducted in June 2014. Self-reported questionnaires consisting of the Zung Self-Rating Anxiety Scale (SAS), Big Five Inventory (BFI), Wagnild and Young Resilience Scale (RS-14) and demographic section were distributed to the subjects. A stratified random cluster sampling method was used to select 2925 medical students (effective response rate: 83.57%) at four medical colleges and universities in Liaoning province, China. Asymptotic and resampling strategies were used to explore the mediating role of resilience. The prevalence of anxiety symptoms was 47.3% (SAS index score≥50) among Chinese medical students. After adjusting for the demographic factors, the traits of agreeableness, conscientiousness and openness were all negatively associated with anxiety whereas neuroticism was positively associated with it. Resilience functioned as a mediator in the relationships between agreeableness/conscientiousness/openness and anxiety symptoms. Among Chinese medical students, the prevalence of anxiety symptoms was high and resilience mediated the relationships between big five personality traits and anxiety symptoms. Identifying at-risk individuals and undertaking appropriate intervention strategies that focus on both personality traits and resilience might be more effective to prevent and reduce anxiety symptoms.

  3. Combining simulation and multi-objective optimisation for equipment quantity optimisation in container terminals

    OpenAIRE

    Lin, Zhougeng

    2013-01-01

    This thesis proposes a combination framework to integrate simulation and multi-objective optimisation (MOO) for container terminal equipment optimisation. It addresses how the strengths of simulation and multi-objective optimisation can be integrated to find high quality solutions for multiple objectives with low computational cost. Three structures for the combination framework are proposed respectively: pre-MOO structure, integrated MOO structure and post-MOO structure. The applications of ...

  4. Smart Information Management in Health Big Data.

    Science.gov (United States)

    Muteba A, Eustache

    2017-01-01

    The smart information management system (SIMS) is concerned with the organization of anonymous patient records in a big data store and their extraction in order to provide the needed real-time intelligence. The purpose of the present study is to highlight the design and the implementation of the smart information management system. We emphasise, on the one hand, the organisation of big data in a flat file simulating a NoSQL database and, on the other hand, the extraction of information based on a lookup table and a cache mechanism. In health big data, the SIMS aims at the identification of new therapies and approaches to delivering care.
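    A minimal sketch of the idea, assuming a flat file of one JSON record per line, an in-memory lookup table of byte offsets and an LRU cache for repeated extractions, is given below; the file name, record fields and cache size are hypothetical and this is not the authors' implementation.

```python
# Toy flat-file store with an offset lookup table and a cached extraction path.
import json
from functools import lru_cache

DATA_FILE = "records.jsonl"  # hypothetical flat file: one JSON record per line

# Write a few toy anonymised records so the sketch is self-contained.
with open(DATA_FILE, "w") as f:
    for i in range(3):
        f.write(json.dumps({"record_id": f"P-{i:06d}", "age": 40 + i, "dx": "n/a"}) + "\n")

def build_lookup(path):
    """Index record_id -> byte offset so a single record can be read directly."""
    lookup, offset = {}, 0
    with open(path, "rb") as fh:
        line = fh.readline()
        while line:
            lookup[json.loads(line)["record_id"]] = offset
            offset = fh.tell()
            line = fh.readline()
    return lookup

LOOKUP = build_lookup(DATA_FILE)

@lru_cache(maxsize=10_000)
def get_record(record_id):
    """Cached extraction: seek straight to the record's offset and parse one line."""
    with open(DATA_FILE, "rb") as fh:
        fh.seek(LOOKUP[record_id])
        return json.loads(fh.readline())

print(get_record("P-000001"))  # e.g. {'record_id': 'P-000001', 'age': 41, 'dx': 'n/a'}
```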

  5. Layout Optimisation of Wave Energy Converter Arrays

    Directory of Open Access Journals (Sweden)

    Pau Mercadé Ruiz

    2017-08-01

    Full Text Available This paper proposes an optimisation strategy for the layout design of wave energy converter (WEC) arrays. Optimal layouts are sought so as to maximise the absorbed power given a minimum q-factor, the minimum distance between WECs, and an area of deployment. To guarantee an efficient optimisation, a four-parameter layout description is proposed. Three different optimisation algorithms are further compared in terms of performance and computational cost. These are the covariance matrix adaptation evolution strategy (CMA), a genetic algorithm (GA) and the glowworm swarm optimisation (GSO) algorithm. The results show slightly higher performances for the latter two algorithms; however, the first turns out to be significantly less computationally demanding.
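    The constrained layout search can be sketched with a generic (1+1) evolution strategy over a four-parameter layout description and a penalised objective; the power proxy, the constraint constants and the parameterisation below are invented placeholders and do not reproduce the paper's hydrodynamic model or its q-factor computation.

```python
# Generic constrained layout search over four layout parameters (not the
# paper's method or model; all numbers are assumptions for illustration).
import numpy as np

rng = np.random.default_rng(3)
N_WEC, MIN_DIST, AREA_HALF = 10, 50.0, 500.0            # assumed constraints [m]

def layout_from_params(p):
    """Four layout parameters -> WEC positions: spacing dx, dy, row shear, rotation."""
    dx, dy, shear, angle = p
    pts = np.array([[(i % 5) * dx + (i // 5) * shear, (i // 5) * dy] for i in range(N_WEC)])
    c, s = np.cos(angle), np.sin(angle)
    return pts @ np.array([[c, -s], [s, c]])

def objective(p):
    pts = layout_from_params(p)
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    power = N_WEC - np.sum(np.exp(-d / 100.0)) / 2.0    # placeholder "absorbed power"
    penalty = 0.0
    if d.min() < MIN_DIST:                              # minimum-distance constraint
        penalty += 100.0 * (MIN_DIST - d.min())
    if np.abs(pts).max() > AREA_HALF:                   # deployment-area constraint
        penalty += 100.0 * (np.abs(pts).max() - AREA_HALF)
    return -power + penalty                             # minimise

x = np.array([80.0, 80.0, 0.0, 0.0])                    # initial layout parameters
fx = objective(x)
for _ in range(2000):                                   # simple (1+1) evolution strategy
    cand = x + rng.normal(scale=[5.0, 5.0, 2.0, 0.05])
    fc = objective(cand)
    if fc < fx:
        x, fx = cand, fc

print("best parameters:", np.round(x, 2), "objective value:", round(float(fx), 3))
```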

  6. Topology Optimisation for Coupled Convection Problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    This thesis deals with topology optimisation for coupled convection problems. The aim is to extend and apply topology optimisation to steady-state conjugate heat transfer problems, where the heat conduction equation governs the heat transfer in a solid and is coupled to thermal transport in a surrounding fluid, governed by a convection-diffusion equation, where the convective velocity field is found from solving the isothermal incompressible steady-state Navier-Stokes equations. Topology optimisation is also applied to steady-state natural convection problems. The modelling is done using stabilised finite elements, the formulation and implementation of which was done partly during a special course as preparatory work for this thesis. The formulation is extended with a Brinkman friction term in order to facilitate the topology optimisation of fluid flow and convective cooling problems. The derived...
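    The Brinkman friction term mentioned above is commonly added to the momentum equation so that a single design field can interpolate between fluid and (nominally impermeable) solid; one common form, with notation assumed here rather than taken from the thesis, is:

```latex
% gamma: design field (1 = fluid, 0 = solid); alpha(gamma): interpolated inverse permeability
\rho\,(\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\nabla p + \mu\,\nabla^{2}\mathbf{u} - \alpha(\gamma)\,\mathbf{u},
\qquad
\alpha(\gamma) = \alpha_{\max}\,\frac{1-\gamma}{1+q\,\gamma}
```

    In fluid regions the friction coefficient vanishes and the standard momentum balance is recovered; in solid regions the large coefficient drives the velocity towards zero, which in turn suppresses convective transport in the coupled energy equation.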

  7. Benchmarks for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...

  8. Credit price optimisation within retail banking

    African Journals Online (AJOL)

    2014-02-14

    ... cost-based pricing, where the price of a product or service is based on the ... function obtained from fitting a logistic regression model ... Note that the proposed optimisation approach below will allow us to also incorporate ...
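    The ingredient alluded to in the snippet, a logistic take-up (response) model feeding a profit-maximising price search, can be sketched as below; the coefficients, the cost of funds and the rate grid are made-up assumptions, not values from the article.

```python
# Illustrative price optimisation with a logistic take-up model: expected
# profit = P(take-up | rate) * margin. A real model would be fitted to
# historical offer/response data.
import numpy as np

beta0, beta1 = 4.0, -0.25   # assumed logistic coefficients (intercept, rate slope)
funding_cost = 8.0          # assumed cost of funds plus risk cost, in rate points

def take_up_probability(rate):
    return 1.0 / (1.0 + np.exp(-(beta0 + beta1 * rate)))

def expected_profit(rate):
    return take_up_probability(rate) * (rate - funding_cost)

rates = np.linspace(8.0, 30.0, 441)
best_rate = rates[np.argmax(expected_profit(rates))]
print(f"profit-maximising rate is approximately {best_rate:.2f}%")
```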

  9. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  10. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  11. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security. Book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Authors: Ariel Hamlin, Nabil... (contact: arkady@ll.mit.edu). Chapter 1, Section 1.1 Introduction: With the amount

  12. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  13. Big Data and central banks

    OpenAIRE

    David Bholat

    2015-01-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  14. Optimisation of the formulation of a bubble bath by a chemometric approach market segmentation and optimisation.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gennaro, Maria Carla; Bertetto, Mariella

    2003-03-01

    The optimisation of the formulation of a commercial bubble bath was performed by chemometric analysis of Panel Test results. A first Panel Test was performed to choose the best essence among four proposed to the consumers; the chosen essence was used in the revised commercial bubble bath. Afterwards, the effect of changing the amount of four components of the bubble bath (the primary surfactant, the essence, the hydratant and the colouring agent) was studied by a fractional factorial design. The segmentation of the bubble bath market was performed by a second Panel Test, in which the consumers were requested to evaluate the samples coming from the experimental design. The results were then treated by Principal Component Analysis. The market had two segments: people preferring a product with a rich formulation and people preferring a poor product. The final target, i.e. the optimisation of the formulation for each segment, was achieved by calculating regression models relating the subjective evaluations given by the Panel to the compositions of the samples. The regression models allowed the identification of the best formulations for the two segments of the market.
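    The final step, regressing panel evaluations on the component levels of the fractional factorial design and picking the formulation with the best predicted score, can be sketched as follows; the coded design, the defining relation and the panel scores are invented for illustration.

```python
# Hedged sketch: fit a main-effects model to a 2^(4-1) fractional factorial
# design (defining relation D = A*B*C assumed) and pick the coded formulation
# with the highest predicted panel score. Scores are hypothetical.
import numpy as np
from itertools import product

design = np.array([row + (row[0] * row[1] * row[2],) for row in product([-1, 1], repeat=3)])
scores = np.array([4.1, 5.0, 4.6, 6.2, 3.9, 5.4, 4.8, 6.8])  # hypothetical panel means

X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, scores, rcond=None)

candidates = np.array(list(product([-1, 1], repeat=4)))      # all coded formulations
pred = candidates @ coef[1:] + coef[0]
best = candidates[np.argmax(pred)]
print("best coded formulation:", best, "predicted score:", round(float(pred.max()), 2))
```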

  15. User perspectives in public transport timetable optimisation

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    The present paper deals with timetable optimisation from the perspective of minimising the waiting time experienced by passengers when transferring either to or from a bus. Due to its inherent complexity, this bi-level minimisation problem is extremely difficult to solve mathematically, since tim...... on the large-scale public transport network in Denmark. The timetable optimisation approach yielded a yearly reduction in weighted waiting time equivalent to approximately 45 million Danish kroner (9 million USD)....
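    The quantity being minimised, passenger-weighted transfer waiting time as a function of the lines' departure-time offsets, can be illustrated with a brute-force toy example; the lines, headway, transfer volumes and walking times below are hypothetical, and the method shown is not the heuristic used in the paper.

```python
# Toy example: passenger-weighted transfer waiting time under cyclic timetables,
# minimised by enumerating all offset combinations for three hypothetical lines.
import numpy as np
from itertools import product

HEADWAY = 10                     # minutes, assumed common headway
transfers = [                    # (from_line, to_line, passengers, walk_time)
    ("A", "B", 120, 2),
    ("B", "C", 80, 3),
    ("A", "C", 60, 4),
]

def waiting_time(offsets):
    total = 0.0
    for frm, to, pax, walk in transfers:
        arrival = offsets[frm] + walk               # passenger reaches the connecting stop
        wait = (offsets[to] - arrival) % HEADWAY    # wait until the next cyclic departure
        total += pax * wait
    return total

best = min(
    (waiting_time({"A": a, "B": b, "C": c}), (a, b, c))
    for a, b, c in product(range(HEADWAY), repeat=3)
)
print("minimum weighted waiting time:", best[0], "with offsets A,B,C =", best[1])
```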

  16. Methodological principles for optimising functional MRI experiments

    International Nuclear Information System (INIS)

    Wuestenberg, T.; Giesel, F.L.; Strasburger, H.

    2005-01-01

    Functional magnetic resonance imaging (fMRI) is one of the most common methods for localising neuronal activity in the brain. Even though the sensitivity of fMRI is comparatively low, the optimisation of certain experimental parameters allows obtaining reliable results. In this article, approaches for optimising the experimental design, imaging parameters and analytic strategies will be discussed. Clinical neuroscientists and interested physicians will receive practical rules of thumb for improving the efficiency of brain imaging experiments. (orig.)

  17. Optimisation: how to develop stake holder involvement

    International Nuclear Information System (INIS)

    Weiss, W.

    2003-01-01

    The Precautionary Principle is an internationally recognised approach for dealing with risk situations characterised by uncertainties and potentially irreversible damage. Since the late fifties, ICRP has adopted this prudent attitude because of the lack of scientific evidence concerning the existence of a threshold at low doses for stochastic effects. The 'linear, no-threshold' model and the 'optimisation of protection' principle have been developed as a pragmatic response for managing the risk. Progress in epidemiology and radiobiology over the last decades has affirmed the initial assumption, and optimisation remains the appropriate response for the application of the precautionary principle in the context of radiological protection. The basic objective of optimisation is, for any source within the system of radiological protection, to maintain the level of exposure as low as reasonably achievable, taking into account social and economic factors. Methods, tools and procedures have been developed over the last two decades to put the optimisation principle into practice, with a central role given to cost-benefit analysis as a means to determine the optimised level of protection. However, as the implementation of the principle advanced, more emphasis was progressively given to good practice, as well as to the importance of controlling individual levels of exposure through the optimisation process. In the context of the revision of its present recommendations, the Commission is reinforcing the emphasis on protection of the individual with the adoption of an equity-based system that recognizes individual rights and a basic level of health protection. Another advance is the role now recognised for 'stakeholder involvement' in the optimisation process as a means to improve the quality of the decision-aiding process for identifying and selecting protection actions accepted by all those involved. The paper

  18. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact...

  19. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution...... and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim...... that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data....

  20. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic

  1. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge, the value of the information. The value of the information is defined not only by extracting value from huge data sets as quickly and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that the real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on worldwide processes.

  2. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-01-01

    on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also

  3. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data......’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD......) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  4. A little big history of Tiananmen

    OpenAIRE

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why people built the gate the way they did can be found. These explanations are useful in their own right and may also be used to deepen our understanding of more traditional explanations of why Tiananmen ...

  5. Measuring of the airway dimensions with spiral CT images: an experimental study in Japanese white big-ear rabbits

    International Nuclear Information System (INIS)

    Han Xinwei; Lu Huibin; Ma Ji; Wu Gang; Wang Nan; Si Jiangtao

    2009-01-01

    Objective: To measure the length, angle and their correlations for the main anatomical dimensions of the trachea and bronchi in experimental Japanese white big-ear rabbits with the help of spiral CT 3D images, in order to lay the foundation for treating airway disorders with stenting in animal experiments. Methods: Multi-slice CT scanning of the cervico-thoracic region was performed in 30 healthy adult Japanese white big-ear rabbits; the longitudinal and transversal dimensions of the trachea, the glottis-carina length, the inner diameter and length of the bronchi, and the angle formed by the bronchial long axis and the sagittal plane were measured. Results: No significant difference was found in the inner diameters of the various parts of the trachea and the upper apical bronchi. The angle formed by the right bronchial long axis and the sagittal plane was smaller than that on the left, and the inner diameter of the right main bronchus was larger than that of the left. Conclusion: The complex branching structure of the rabbit airway tree can be well displayed on spiral CT 3D images. Through measurement and statistical analysis of the results, the authors obtained a regression equation for estimating the inner diameter, length, angle, etc. of the airway tree, which is very helpful for providing useful anatomical parameters in rabbit experiments. (authors)

  6. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance that the term Big Data has acquired, this research sought to study and analyse exhaustively the state of the art of Big Data; in addition, as a second objective, it analysed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics in the management of Big Data, so as to cover everything concerning the central topic of the research. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; getting to know Big Data technologies; presenting some of the NoSQL databases, which are the ones that make it possible to process data in unstructured formats; and showing data models and the technologies for analysing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables were manipulated, and exploratory, because this research is a first step towards understanding the Big Data environment.

  7. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  8. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyse how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  9. Big data in forensic science and medicine.

    Science.gov (United States)

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have got their own tribune on the topic. Perspectives and debates are flourishing, yet a consensual definition of big data is still lacking. The 3Vs paradigm is frequently evoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On the one hand, techniques usually presented as specific to big data, such as machine learning techniques, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields, such as artificial intelligence, are often underestimated if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention, since they delineate what is at stake. In this context, forensic science is still awaiting its position papers as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a truly interdisciplinary approach in forensic science and medicine that is mainly based on evidence. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  10. Optimising neutron polarizers--measuring the flipping ratio and related quantities

    CERN Document Server

    Goossens, D J

    2002-01-01

    The continuing development of gaseous spin polarized ³He transmission filters for use as neutron polarizers makes the choice of optimum thickness for these filters an important consideration. The 'quality factors' derived for the optimisation of transmission filters for particular measurements are general to all neutron polarizers. In this work optimisation conditions for neutron polarizers are derived and discussed for the family of studies related to measuring the flipping ratio from samples. The application of the optimisation conditions to ³He transmission filters and other types of neutron polarizers is discussed. Absolute comparisons are made between the effectiveness of different types of polarizers for this sort of work.
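
    As a rough numerical illustration of this kind of thickness optimisation (not the paper's specific quality factors), the sketch below assumes the textbook expressions for a ³He spin filter - neutron polarisation P_n = tanh(O·P_He) and transmission T = T_0·exp(-O)·cosh(O·P_He), with O the filter opacity - and maximises the commonly used figure of merit P_n²·T over O. Both the formulas and the choice of figure of merit are assumptions on our part.

```python
import numpy as np

def polariser_figure_of_merit(opacity, p_he=0.7, t0=0.9):
    """P_n^2 * T for a 3He spin filter (textbook expressions, assumed here)."""
    p_n = np.tanh(opacity * p_he)                         # neutron beam polarisation
    t = t0 * np.exp(-opacity) * np.cosh(opacity * p_he)   # filter transmission
    return p_n**2 * t

# Scan the opacity (proportional to gas pressure x cell length x wavelength).
opacities = np.linspace(0.01, 10.0, 2000)
fom = polariser_figure_of_merit(opacities)
best = opacities[np.argmax(fom)]
print(f"optimum opacity ~ {best:.2f}, figure of merit {fom.max():.3f}")
```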

  11. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  12. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
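
    The dependence argument can be made concrete with a toy simulation (our own sketch, not taken from the paper): weakly but pervasively correlated observations make the usual s/sqrt(n) standard error of the mean overconfident, so nominal 95% intervals cover the true value noticeably less often than advertised.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_sample(n, rho, rng):
    """Weakly dependent data: AR(1) with correlation rho between neighbours, mean 0, variance 1."""
    x = np.empty(n)
    x[0] = rng.standard_normal()
    for t in range(1, n):
        x[t] = rho * x[t - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
    return x

n, rho, reps = 2000, 0.2, 1000
covered = 0
for _ in range(reps):
    x = ar1_sample(n, rho, rng)
    se_naive = x.std(ddof=1) / np.sqrt(n)          # standard error assuming independence
    lo, hi = x.mean() - 1.96 * se_naive, x.mean() + 1.96 * se_naive
    covered += (lo <= 0.0 <= hi)                   # the true mean is 0

print(f"nominal 95% coverage, actual ~ {covered / reps:.2%}")  # typically well below 95%
```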

  13. Global fluctuation spectra in big-crunch-big-bang string vacua

    International Nuclear Information System (INIS)

    Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study big-crunch-big-bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a big crunch and a big bang cosmology, as well as additional 'whisker' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the big crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function Δ, which is momentum and time dependent. We compute Δ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to 'entanglement' entropy in the big bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that Δ→1 and, hence, the fluctuation spectrum is unaltered by the big-crunch-big-bang singularity. We comment on, but do not attempt to resolve, subtleties related to gravitational back reaction and light winding modes when interactions are taken into account

  14. Thermal performance monitoring and optimisation

    International Nuclear Information System (INIS)

    Sunde, Svein; Berg; Oeyvind

    1998-01-01

    Monitoring of the thermal efficiency of nuclear power plants is expected to become increasingly important as energy-market liberalisation exposes plants to increasing availability requirements and fiercer competition. The general goal in thermal performance monitoring is straightforward: to maximise the ratio of profit to cost under the constraints of safe operation. One may perceive this goal to be pursued in two ways, one oriented towards fault detection and cost-optimal predictive maintenance, and another determined at optimising target values of parameters in response to any component degradation detected, changes in ambient conditions, or the like. Annual savings associated with effective thermal-performance monitoring are expected to be in the order of $ 100 000 for power plants of representative size. A literature review shows that a number of computer systems for thermal-performance monitoring exists, either as prototypes or commercially available. The characteristics and needs of power plants may vary widely, however, and decisions concerning the exact scope, content and configuration of a thermal-performance monitor may well follow a heuristic approach. Furthermore, re-use of existing software modules may be desirable. Therefore, we suggest here the design of a flexible workbench for easy assembly of an experimental thermal-performance monitor at the Halden Project. The suggested design draws heavily on our extended experience in implementing control-room systems featured by assets like high levels of customisation, flexibility in configuration and modularity in structure, and on a number of relevant adjoining activities. The design includes a multi-computer communication system and a graphical user's interface, and aims at a system adaptable to any combination of in-house or end user's modules, as well as commercially available software. (author)

  15. Dark energy, wormholes, and the big rip

    International Nuclear Information System (INIS)

    Faraoni, V.; Israel, W.

    2005-01-01

    The time evolution of a wormhole in a Friedmann universe approaching the big rip is studied. The wormhole is modeled by a thin spherical shell accreting the superquintessence fluid--two different models are presented. Contrary to recent claims that the wormhole overtakes the expansion of the universe and engulfs it before the big rip is reached, it is found that the wormhole becomes asymptotically comoving with the cosmic fluid and the future evolution of the universe is fully causal

  16. Study of the 2H(p,γ)3He reaction in the Big Bang Nucleosynthesis energy range at LUNA

    Science.gov (United States)

    Mossa, Viviana

    2018-01-01

    Deuterium is the first nucleus produced in the Universe, and its accumulation marks the beginning of the so-called Big Bang Nucleosynthesis (BBN). Its primordial abundance is very sensitive to some cosmological parameters, such as the baryon density and the number of neutrino families. Presently the main obstacle to an accurate theoretical evaluation of the deuterium abundance is the poor knowledge of the 2H(p,γ)3He cross section at BBN energies. The aim of the present work is to describe the experimental approach proposed by the LUNA collaboration, whose goal is to measure, with unprecedented precision, the total and differential cross sections of the reaction in the energy range 30 < Ec.m. [keV] < 300.

  17. I big data e gli strumenti di visualizzazione analitica: interazioni e studi induttivi per le P.A.

    Directory of Open Access Journals (Sweden)

    Giuseppe Roccasalva

    2012-12-01

    Full Text Available The essay presents some results of a collaboration between Politecnico di Torino and CSI Piemonte (an IT services company partly owned by the Regione Piemonte). Several scientific data visualisation tools (Gapminder, ManyEyes, Open eXplorer and Fineo) were selected and studied in order to identify the one most useful for an inductive reading of large amounts of information (big data). The intelligent exploitation of digital data can lead to growth in knowledge but also to profit, whose exploitation thresholds can be measured within an economic system. In the irreversible growth of digital data, the discipline of "Data Visualization" becomes crucial for accessing and understanding complex information. Few, a guru of visual communication, writes that "we discover the world through our eyes"; traditional forms of data communication and interpretation have relied on the visual dimension to improve understanding and have allowed both analysts and users to experiment with new interactions ("story-telling"). As urban planners and citizens, we rely on sight, which manages many of the sensors (70%) involved in perception, cognitive maps, errors and new thoughts. The underlying hypothesis of this article is to generate reflections on Big Data as an important strategy for public and private enterprises that intend to learn to change on the basis of the digital information available to us today. Through the use of an analytical tool for visualising information, a recent case study is described in a territorial context, that of the new administrative consortia (Unione dei Comuni NordEst Torino). In this experiment, the need to plan choices systematically becomes topical again, including by trying to use the already available territorial information systems in a new and simple way.

  18. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Full Text Available Farming is undergoing a digital revolution. Our existing review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. "app"? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies and it allows for a focus on the material consequences of big data in society.

  19. Propeptide big-endothelin, N-terminal-pro brain natriuretic peptide and mortality. The Ludwigshafen risk and cardiovascular health (LURIC) study.

    Science.gov (United States)

    Gergei, Ingrid; Krämer, Bernhard K; Scharnagl, Hubert; Stojakovic, Tatjana; März, Winfried; Mondorf, Ulrich

    The endothelin system (Big-ET-1) is a key regulator in cardiovascular (CV) disease and congestive heart failure (CHF). We have examined the incremental value of Big-ET-1 in predicting total and CV mortality next to the well-established CV risk marker N-Terminal Pro-B-Type Natriuretic Peptide (NT-proBNP). Big-ET-1 and NT-proBNP were determined in 2829 participants referred for coronary angiography (follow-up 9.9 years). Big-ET-1 is an independent predictor of total mortality, CV mortality and death due to CHF. The combined use of Big-ET-1 and NT-proBNP improves the risk stratification of patients with intermediate to high risk of CV death and CHF. Big-ET-1 improves risk stratification in patients referred for coronary angiography.

  20. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
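
    A minimal simulation (ours, not from the paper) illustrates the point about small tail probabilities: with a skewed population and a small sample, the one-sample t-test's actual type-I error at a stringent nominal level - the kind of level forced by multiple-testing corrections - can differ markedly from the declared one.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

n, reps = 10, 200_000           # small sample, many simulated "tests"
alpha = 0.001                   # a stringent level, e.g. after a Bonferroni correction

# Skewed null population with true mean 0: exponential(1) shifted by -1.
data = rng.exponential(1.0, size=(reps, n)) - 1.0
t_stat, p_val = stats.ttest_1samp(data, popmean=0.0, axis=1)

actual = np.mean(p_val < alpha)
print(f"nominal level {alpha:.4f}, actual rejection rate {actual:.4f}")
# For n = 10 and a skewed population, the actual rate typically deviates
# substantially from the nominal 0.001.
```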

  1. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

    find the suitable method, that is, the mode of estimation and uncertainty-measure determination that optimizes a selected measure for prescribed values close to the initial estimates. Also here, intelligent exploration methods (gradient, Brent, etc.) are useful. The third task is to apply the optimal estimation method to the climate dataset. This conference paper illustrates by means of three examples that optimal estimation has the potential to shape future big climate data analysis. First, we consider various hypothesis tests to study whether climate extremes are increasing in their occurrence. Second, we compare Pearson's and Spearman's correlation measures. Third, we introduce a novel estimator of the tail index, which helps to better quantify climate-change related risks.
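
    Two of the three examples lend themselves to a compact sketch (ours, with made-up data): comparing Pearson's and Spearman's correlation measures on heavy-tailed data, and estimating the tail index with the classical Hill estimator - not necessarily the novel estimator mentioned in the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Heavy-tailed "climate-like" series: Pearson is dragged around by extremes,
# while Spearman (rank-based) is more robust.
x = rng.pareto(2.5, size=500)
y = 0.5 * x + rng.pareto(2.5, size=500)
print("Pearson :", stats.pearsonr(x, y)[0])
print("Spearman:", stats.spearmanr(x, y)[0])

def hill_tail_index(sample, k=50):
    """Classical Hill estimator of the tail index from the k largest values."""
    tail = np.sort(sample)[-k - 1:]                 # k largest values plus the threshold
    logs = np.log(tail[1:]) - np.log(tail[0])
    return 1.0 / logs.mean()

print("Hill tail index (asymptotic value 2.5):", hill_tail_index(x, k=50))
```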

  2. A Big Five approach to self-regulation: personality traits and health trajectories in the Hawaii longitudinal study of personality and health.

    Science.gov (United States)

    Hampson, Sarah E; Edmonds, Grant W; Barckley, Maureen; Goldberg, Lewis R; Dubanoski, Joan P; Hillier, Teresa A

    2016-01-01

    Self-regulatory processes influencing health outcomes may have their origins in childhood personality traits. The Big Five approach to personality was used here to investigate the associations between childhood traits, trait-related regulatory processes and changes in health across middle age. Participants (N = 1176) were members of the Hawaii longitudinal study of personality and health. Teacher assessments of the participants' traits when they were in elementary school were related to trajectories of self-rated health measured on 6 occasions over 14 years in middle age. Five trajectories of self-rated health were identified by latent class growth analysis: Stable Excellent, Stable Very Good, Good, Decreasing and Poor. Childhood Conscientiousness was the only childhood trait to predict membership in the Decreasing class vs. the combined healthy classes (Stable Excellent, Stable Very Good and Good), even after controlling for adult Conscientiousness and the other adult Big Five traits. The Decreasing class had poorer objectively assessed clinical health measured on one occasion in middle age, was less well-educated, and had a history of more lifespan health-damaging behaviors compared to the combined healthy classes. These findings suggest that higher levels of childhood Conscientiousness (i.e. greater self-discipline and goal-directedness) may prevent subsequent health decline decades later through self-regulatory processes involving the acquisition of lifelong healthful behavior patterns and higher educational attainment.

  3. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
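
    To make the notion of a declarative quality rule concrete, here is a single-machine sketch (ours, not BigDansing's API) of detecting violations of a functional dependency of the kind commonly used as a running example in data-cleansing papers, such as zipcode → city; the column names and data are invented.

```python
from collections import defaultdict

# Toy relation; the FD "zipcode -> city" says equal zipcodes must have equal cities.
rows = [
    {"id": 1, "zipcode": "60601", "city": "Chicago"},
    {"id": 2, "zipcode": "60601", "city": "Chicago"},
    {"id": 3, "zipcode": "60601", "city": "Evanston"},   # violates the rule
    {"id": 4, "zipcode": "10001", "city": "New York"},
]

def fd_violations(rows, lhs, rhs):
    """Group tuples by the left-hand side; any group with >1 distinct RHS value violates the FD."""
    groups = defaultdict(list)
    for r in rows:
        groups[r[lhs]].append(r)
    for key, group in groups.items():
        if len({r[rhs] for r in group}) > 1:
            yield key, [r["id"] for r in group]

for zipcode, ids in fd_violations(rows, "zipcode", "city"):
    print(f"zipcode {zipcode}: conflicting city values in tuples {ids}")
```

    On a cluster, the same grouping step would typically be expressed as a key-based shuffle, which is where shared scans and specialised join strategies become relevant.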

  4. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  5. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  6. Big Data: Survey, Technologies, Opportunities, and Challenges

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  7. Optimising the Target and Capture Sections of the Neutrino Factory

    CERN Document Server

    Hansen, Ole Martin; Stapnes, Steinar

    The Neutrino Factory is designed to produce an intense high energy neutrino beam from stored muons. The majority of the muons are obtained from the decay of pions, produced by a proton beam impinging on a free-flowing mercury-jet target and captured by a high magnetic field. It is important to capture a large fraction of the produced pions to maximize the intensity of the neutrino beam. Various optimisation studies have been performed with the aim of maximising the muon influx to the accelerator and thus the neutrino beam intensity. The optimisation studies were performed with the use of Monte Carlo simulation tools. The production of secondary particles, by interactions between the incoming proton beam and the mercury target, was optimised by varying the proton beam impact position and impact angles on the target. The proton beam and target interaction region was studied and showed to be off the central axis of the capture section in the baseline configuration. The off-centred interaction region resulted in ...

  8. Biomass supply chain optimisation for Organosolv-based biorefineries.

    Science.gov (United States)

    Giarola, Sara; Patel, Mayank; Shah, Nilay

    2014-05-01

    This work aims at providing a Mixed Integer Linear Programming modelling framework to help define planning strategies for the development of sustainable biorefineries. The up-scaling of an Organosolv biorefinery was addressed via optimisation of the whole system economics. Three real world case studies were addressed to show the high-level flexibility and wide applicability of the tool to model different biomass typologies (i.e. forest fellings, cereal residues and energy crops) and supply strategies. Model outcomes have revealed how supply chain optimisation techniques could help shed light on the development of sustainable biorefineries. Feedstock quality, quantity, temporal and geographical availability are crucial to determine biorefinery location and the cost-efficient way to supply the feedstock to the plant. Storage costs are relevant for biorefineries based on cereal stubble, while wood supply chains present dominant pretreatment operations costs. Copyright © 2014 Elsevier Ltd. All rights reserved.
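
    As a hedged illustration of the kind of MILP formulation the abstract refers to (a toy model of our own, not the authors'), the sketch below uses the open-source PuLP library to choose a biorefinery site and feedstock flows that minimise transport plus fixed opening costs; all names and numbers are invented.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

farms = ["farm_A", "farm_B"]
sites = ["site_1", "site_2"]
supply = {"farm_A": 120.0, "farm_B": 80.0}          # kt/year available
demand = 150.0                                      # kt/year needed by the biorefinery
transport = {("farm_A", "site_1"): 10, ("farm_A", "site_2"): 25,
             ("farm_B", "site_1"): 30, ("farm_B", "site_2"): 12}  # cost per kt
open_cost = {"site_1": 900, "site_2": 700}          # annualised fixed cost

prob = LpProblem("toy_biorefinery_location", LpMinimize)
ship = {(f, s): LpVariable(f"ship_{f}_{s}", lowBound=0) for f in farms for s in sites}
build = {s: LpVariable(f"build_{s}", cat=LpBinary) for s in sites}

# Objective: transport costs plus fixed costs of opened sites.
prob += lpSum(transport[f, s] * ship[f, s] for f in farms for s in sites) \
        + lpSum(open_cost[s] * build[s] for s in sites)

for f in farms:                                     # cannot ship more than each farm supplies
    prob += lpSum(ship[f, s] for s in sites) <= supply[f]
prob += lpSum(ship[f, s] for f in farms for s in sites) >= demand
for f in farms:
    for s in sites:                                 # only ship to a site that is actually built
        prob += ship[f, s] <= supply[f] * build[s]

prob.solve()
print({s: build[s].value() for s in sites})
```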

  9. On the impact of optimisation models in maintenance decision making: the state of the art

    International Nuclear Information System (INIS)

    Dekker, Rommert; Scarf, Philip A.

    1998-01-01

    In this paper we discuss the state of the art in applications of maintenance optimisation models. After giving a short introduction to the area, we consider several ways in which models may be used to optimise maintenance, such as case studies, operational and strategic decision support systems, and give examples of each of them. Next we discuss several areas where the models have been applied successfully. These include civil structure and aeroplane maintenance. From a comparative point of view, we discuss future prospects

  10. Design of passive coolers for light-emitting diode lamps using topology optimisation

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Sigmund, Ole; Meyer, Knud Erik

    2018-01-01

    Topology optimised designs for passive cooling of light-emitting diode (LED) lamps are investigated through extensive numerical parameter studies. The designs are optimised for either horizontal or vertical orientations and are compared to a lattice-fin design as well as a simple parameter......, while maintaining low sensitivity to orientation. Furthermore, they exhibit several defining features and provide insight and general guidelines for the design of passive coolers for LED lamps....

  11. Integration of Monte-Carlo ray tracing with a stochastic optimisation method: application to the design of solar receiver geometry.

    Science.gov (United States)

    Asselineau, Charles-Alexis; Zapata, Jose; Pye, John

    2015-06-01

    A stochastic optimisation method adapted to illumination and radiative heat transfer problems involving Monte-Carlo ray-tracing is presented. A solar receiver shape optimisation case study illustrates the advantages of the method and its potential: efficient receivers are identified using a moderate computational cost.
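
    The coupling of a noisy Monte-Carlo objective with a stochastic optimiser can be sketched generically (our illustration, not the authors' method): a random-search loop proposes receiver parameters, and each proposal is scored by a crude Monte-Carlo estimate - here a stand-in function rather than a real ray tracer.

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_intercepted_fraction(aperture, depth, n_rays=2000):
    """Stand-in for a Monte-Carlo ray tracer: a noisy estimate of intercepted flux.
    The 'true' optimum of this toy objective sits near aperture=1.0, depth=2.0."""
    miss = (np.abs(rng.normal(aperture - 1.0, 0.3, n_rays))
            + np.abs(rng.normal(depth - 2.0, 0.3, n_rays)))
    return float(np.mean(miss < 0.5))

# Simple stochastic search: perturb the best-known design, keep improvements.
best = np.array([0.5, 1.0])
best_score = mc_intercepted_fraction(*best)
for step in range(200):
    candidate = best + rng.normal(0.0, 0.1, size=2)
    score = mc_intercepted_fraction(*candidate)
    if score > best_score:
        best, best_score = candidate, score

print(f"best design {best.round(2)}, estimated intercepted fraction {best_score:.2f}")
```

    In practice the acceptance rule has to tolerate the Monte-Carlo noise, which is one reason stochastic (rather than gradient-based) optimisers are attractive here.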

  12. Renewables portfolio standard and regional energy structure optimisation in China

    International Nuclear Information System (INIS)

    Fan, J.; Sun, W.; Ren, D.-M.

    2005-01-01

    Eastern coastal areas of China have been developing rapidly since the implementation of reforms and the opening of China's economic markets in 1978. As in most areas of the world, this rapid economic growth has been accompanied by large increases in energy consumption. China's coal-dominated energy structure has resulted in serious ecological and environmental problems. Exploiting renewable energy resources and introducing a Renewables Portfolio Standard (RPS) are among the most important approaches towards optimising and sustaining the energy structure of China. This paper discusses international experiences in the implementation of RPS policies and prospects for using these policies to encourage renewable energy development in China, establishes a concise definition of renewable resources, differentiating the broad definition (which includes hydro over 25 MW in size) from the narrow definition (which limits the eligibility of hydro to below 25 MW in size), and quantitatively analyses the potential renewable energy target. The research shows that: (1) Under the narrow hydro definition the renewable energy target would be 5.1%, and under the broad hydro definition it would be 18.4%. (2) Western China has contributed 90.2% of the total renewable electricity generation in the country (if big and medium hydropower is not included). Including big and medium hydropower, the figure is 63.8%. (3) Eastern electricity companies can achieve their quota by buying Tradable Renewable Energy Certificates (TRCs or Green Certificates) and by exploiting renewable energy resources in Western China. The successful implementation of the RPS policy will achieve the goal of sharing the benefits and responsibilities of energy production between the different regions of China.

  13. Application and Prospect of Big Data in Water Resources

    Science.gov (United States)

    Xi, Danchi; Xu, Xinyi

    2017-04-01

    Because of developed information technology and affordable data storage, we have entered the era of data explosion. The term "Big Data" and the technology related to it have been created and are commonly applied in many fields. However, academic studies have only recently paid attention to Big Data applications in water resources. As a result, water resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying big data in water resources and summarises prior research by others. Most studies in this field only set up a theoretical frame, but we define "Water Big Data" and explain its three-dimensional properties: the time dimension, the spatial dimension and the intelligent dimension. Based on HBase, the classification system of Water Big Data is introduced: hydrology data, ecology data and socio-economic data. Then, after analysing the challenges in water resources management, a series of solutions using Big Data technologies, such as data mining and web crawlers, are proposed. Finally, the prospect of applying big data in water resources is discussed; it can be predicted that as Big Data technology keeps developing, "3D" (Data-Driven Decision) will be used more in water resources management in the future.
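
    Since the abstract leans on Hadoop and MapReduce, a minimal in-memory imitation of the MapReduce pattern may help fix ideas (ours; real Hadoop jobs are written against its Java or streaming APIs): mappers emit key-value pairs - say, (station, discharge reading) - a shuffle groups them by key, and reducers aggregate each group.

```python
from collections import defaultdict

# Toy hydrology records: (station id, daily discharge in m^3/s) -- invented data.
records = [("S1", 12.0), ("S2", 7.5), ("S1", 15.5), ("S2", 8.0), ("S1", 9.5)]

def map_phase(record):
    station, discharge = record
    yield station, discharge                 # emit key-value pairs

def shuffle(pairs):
    grouped = defaultdict(list)              # group values by key, as the framework would
    for key, value in pairs:
        grouped[key].append(value)
    return grouped.items()

def reduce_phase(key, values):
    return key, sum(values) / len(values)    # mean discharge per station

pairs = (pair for record in records for pair in map_phase(record))
print(dict(reduce_phase(k, v) for k, v in shuffle(pairs)))
# {'S1': 12.33..., 'S2': 7.75}
```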

  14. A Big Social Media Data Study of the 2017 German Federal Election Based on Social Set Analysis of Political Party Facebook Pages with SoSeVi

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Vatrapu, Ravi; Mukkamala, Raghava Rao

    2017-01-01

    We present a big social media data study that comprises of 1 million individuals who interact with Facebook pages of the seven major political parties CDU, CSU, SPD, FDP, Greens, Die Linke and AfD during the 2017 German federal election. Our study uses the Social Set Analysis (SSA) approach, which...... is based on the sociology of associations, mathematics of set theory, and advanced visual analytics of event studies. We illustrate the capabilities of SSA through the most recent version of our Social Set Analysis (SoSeVi) tool, which enables us to deep dive into Facebook activity concerning the election....... We explore a significant gender-based difference between female and male interactions with political party Facebook pages. Furthermore, we perform a multi-faceted analysis of social media interactions using gender detection, user segmentation and retention analysis, and visualize our findings...

  15. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  16. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  17. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
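
    To make the supervised/unsupervised distinction concrete, here is a short Python sketch using scikit-learn on a small built-in dataset (our example; the review itself points to R packages and webservers rather than this code).

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# Supervised: labels are used to fit and evaluate a classifier.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# Unsupervised: the same measurements are grouped without using the labels.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", [int((clusters == k).sum()) for k in range(3)])
```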

  19. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  20. Multicriteria Optimisation in Logistics Forwarder Activities

    Directory of Open Access Journals (Sweden)

    Tanja Poletan Jugović

    2007-05-01

    Full Text Available The logistics forwarder, as organizer and planner of the coordination and integration of all the elements of transport and logistics chains, uses adequate ways and methods in the process of planning and decision-making. One of these methods, analysed in this paper, which could be used in the optimisation of transport and logistics processes and activities of the logistics forwarder, is the multicriteria optimisation method. Using that method, this paper suggests a model of multicriteria optimisation of logistics forwarder activities. The suggested model of optimisation is justified in keeping with the method principles of multicriteria optimization, which is included among operations research methods and represents the process of multicriteria optimization of variants. Among many different processes of multicriteria optimization, PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluations) and Promcalc & Gaia V. 3.2, a computer program of multicriteria programming based on the mentioned process, were used.
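
    A bare-bones PROMETHEE II calculation can be sketched in a few lines (our simplification: the 'usual' preference function, i.e. strict preference for any positive difference, and invented alternatives, criteria and weights); real studies would use Promcalc/Gaia or a dedicated package and richer preference functions.

```python
import numpy as np

# Rows: alternatives (e.g. candidate transport routes); columns: criteria (all to maximise).
scores = np.array([[7.0, 3.0, 9.0],
                   [8.0, 5.0, 4.0],
                   [6.0, 8.0, 6.0]])
weights = np.array([0.5, 0.3, 0.2])
n = scores.shape[0]

# Usual preference function: P(a, b) = 1 if a beats b on the criterion, else 0.
pref = np.zeros((n, n))
for a in range(n):
    for b in range(n):
        if a != b:
            pref[a, b] = np.sum(weights * (scores[a] > scores[b]))

phi_plus = pref.sum(axis=1) / (n - 1)    # positive outranking flow
phi_minus = pref.sum(axis=0) / (n - 1)   # negative outranking flow
net_flow = phi_plus - phi_minus          # PROMETHEE II complete ranking
print("net flows:", net_flow.round(3), "best alternative:", int(np.argmax(net_flow)))
```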

  1. Noise aspects at aerodynamic blade optimisation projects

    International Nuclear Information System (INIS)

    Schepers, J.G.

    1997-06-01

    The Netherlands Energy Research Foundation (ECN) has often been involved in industrial projects in which blade geometries are created automatically by means of numerical optimisation. Usually, these projects aim at determining the aerodynamically optimal wind turbine blade, i.e. the goal is to design a blade which is optimal with regard to energy yield. In other cases, blades have been designed which are optimal with regard to the cost of generated energy. However, it is obvious that the wind turbine blade designs which result from these optimisations are not necessarily optimal with regard to noise emission. In this paper an example is shown of an aerodynamic blade optimisation using the ECN program PVOPT. PVOPT calculates the optimal wind turbine blade geometry such that the maximum energy yield is obtained. Using the aerodynamically optimal blade design as a basis, the possibilities of noise reduction are investigated. 11 figs., 8 refs

  2. Techno-economic optimisation of energy systems; Contribution a l'optimisation technico-economique de systemes energetiques

    Energy Technology Data Exchange (ETDEWEB)

    Mansilla Pellen, Ch

    2006-07-15

    The traditional approach currently used to assess the economic interest of energy systems is based on a defined flow-sheet. Some studies have shown that the flow-sheets corresponding to the best thermodynamic efficiencies do not necessarily lead to the best production costs. A method called techno-economic optimisation was proposed. This method aims at minimising the production cost of a given energy system, including both investment and operating costs. It was implemented using genetic algorithms. This approach was compared to the heat integration method on two different examples, thus validating its interest. Techno-economic optimisation was then applied to different energy systems dealing with hydrogen as well as electricity production. (author)
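
    The idea of optimising the flow-sheet economics rather than the thermodynamic efficiency can be caricatured in a few lines (our toy objective, not the thesis' model), using SciPy's differential evolution - an evolutionary optimiser in the same family as the genetic algorithms mentioned above.

```python
import numpy as np
from scipy.optimize import differential_evolution

def production_cost(x):
    """Toy levelised cost: bigger equipment raises capital cost but improves efficiency."""
    area, temperature = x                                   # invented design variables
    efficiency = 0.2 + 0.5 * (1 - np.exp(-area / 50.0)) * (temperature / 1100.0)
    capex = 1.0e6 + 8.0e3 * area + 300.0 * temperature      # annualised, arbitrary units
    opex = 4.0e5 / efficiency
    annual_output = 1.0e4 * efficiency
    return (capex * 0.1 + opex) / annual_output             # cost per unit produced

result = differential_evolution(production_cost, bounds=[(10, 500), (700, 1100)], seed=3)
print("optimal design:", result.x.round(1), "cost:", round(result.fun, 2))
```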

  3. Algorithm for optimisation of paediatric chest radiography

    International Nuclear Information System (INIS)

    Kostova-Lefterova, D.

    2016-01-01

    The purpose of this work is to assess the current practice and patient doses in paediatric chest radiography in a large university hospital. The X-ray unit is used in the paediatric department for respiratory diseases. Another purpose was to recommend and apply optimized protocols to reduce patient dose while maintaining diagnostic image quality of the x-ray images. The practice of two different radiographers was studied. The results were compared with the existing practice in paediatric chest radiography and the opportunities for optimization were identified in order to reduce patient doses. A methodology was developed for optimization of the x-ray examinations by grouping children in age groups or according to other appropriate indications and creating an algorithm for proper selection of the exposure parameters for each group. The algorithm for the optimisation of paediatric chest radiography reduced patient doses (PKA, organ dose, effective dose) between 1.5 and 6 times for the different age groups, the average glandular dose up to 10 times and the dose to the lung between 2 and 5 times. The resulting X-ray images were of good diagnostic quality. The subjectivity in the choice of exposure parameters was reduced and standardization was achieved in the work of the radiographers. The role of the radiologist, the medical physicist and the radiographer in the process of optimization was shown, demonstrating the effect of teamwork in reducing patient doses while keeping adequate image quality. Key words: Chest Radiography. Paediatric Radiography. Optimization. Radiation Exposure. Radiation Protection
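
    The grouping-by-age idea can be expressed as a simple lookup, sketched below with placeholder values only - the kV/mAs numbers are invented for illustration and are not clinical recommendations; an actual table would come from the local optimisation work described in the abstract.

```python
# Placeholder exposure protocols per age group -- illustrative values only,
# NOT clinical settings; a real table is established by the local optimisation study.
CHEST_PA_PROTOCOLS = {
    (0, 1):   {"kV": 60, "mAs": 1.0, "added_filtration_mm_Al": 1.0},
    (1, 5):   {"kV": 66, "mAs": 1.2, "added_filtration_mm_Al": 1.0},
    (5, 10):  {"kV": 70, "mAs": 1.6, "added_filtration_mm_Al": 2.0},
    (10, 16): {"kV": 75, "mAs": 2.0, "added_filtration_mm_Al": 2.0},
}

def select_protocol(age_years):
    """Return the exposure protocol for the age group containing age_years."""
    for (low, high), protocol in CHEST_PA_PROTOCOLS.items():
        if low <= age_years < high:
            return protocol
    raise ValueError("no paediatric protocol defined for this age")

print(select_protocol(3))   # {'kV': 66, 'mAs': 1.2, 'added_filtration_mm_Al': 1.0}
```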

  4. Optimising Boltzmann codes for the PLANCK era

    International Nuclear Information System (INIS)

    Hamann, Jan; Lesgourgues, Julien; Balbi, Amedeo; Quercellini, Claudia

    2009-01-01

    High precision measurements of the Cosmic Microwave Background (CMB) anisotropies, as can be expected from the PLANCK satellite, will require high-accuracy theoretical predictions as well. One possible source of theoretical uncertainty is the numerical error in the output of the Boltzmann codes used to calculate angular power spectra. In this work, we carry out an extensive study of the numerical accuracy of the public Boltzmann code CAMB, and identify a set of parameters which determine the error of its output. We show that at the current default settings, the cosmological parameters extracted from data of future experiments like Planck can be biased by several tenths of a standard deviation for the six parameters of the standard ΛCDM model, and potentially more seriously for extended models. We perform an optimisation procedure that leads the code to achieve sufficient precision while at the same time keeping the computation time within reasonable limits. Our conclusion is that the contribution of numerical errors to the theoretical uncertainty of model predictions is well under control—the main challenges for more accurate calculations of CMB spectra will be of an astrophysical nature instead
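
    The accuracy-versus-runtime trade-off described above can be probed with CAMB's Python wrapper along the following lines (a sketch that assumes the camb package is installed; the keyword names follow the current Python interface, which differs from the Fortran-era settings tuned in the paper, and the error metric is a crude one of our own).

```python
import time
import camb

def spectrum(accuracy_boost):
    """Return the TT power spectrum and the wall-clock time for a given accuracy boost."""
    pars = camb.set_params(H0=67.5, ombh2=0.022, omch2=0.122, ns=0.965,
                           As=2e-9, tau=0.06, lmax=2500)
    pars.set_accuracy(AccuracyBoost=accuracy_boost, lAccuracyBoost=accuracy_boost)
    t0 = time.time()
    results = camb.get_results(pars)
    cl = results.get_cmb_power_spectra(pars, CMB_unit='muK')['total'][:, 0]  # TT column
    return cl, time.time() - t0

reference, _ = spectrum(3.0)                  # treat a high-accuracy run as "truth"
for boost in (1.0, 1.5, 2.0):
    cl, seconds = spectrum(boost)
    err = abs(cl[2:] - reference[2:len(cl)]).max() / reference[2:len(cl)].max()
    print(f"boost {boost}: {seconds:.1f}s, max relative TT deviation {err:.2e}")
```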

  5. Statistical optimisation techniques in fatigue signal editing problem

    International Nuclear Information System (INIS)

    Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.

    2015-01-01

    Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.

  6. Statistical optimisation techniques in fatigue signal editing problem

    Energy Technology Data Exchange (ETDEWEB)

    Nopiah, Z. M.; Osman, M. H. [Fundamental Engineering Studies Unit Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia); Baharin, N.; Abdullah, S. [Department of Mechanical and Materials Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia)

    2015-02-03

    Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments, whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single-objective optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.

  7. Topology Optimisation of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Thike Aye Min

    2016-01-01

    Wireless sensor networks are widely used in a variety of fields, including industrial environments. In the case of a clustered network, the location of the cluster head affects the reliability of the network operation. Finding the optimum location of the cluster head is therefore critical for the design of a network. This paper discusses an optimisation approach, based on the brute force algorithm, in the context of topology optimisation of a cluster-structure centralised wireless sensor network. Two examples are given to verify the approach, demonstrating the use of the brute force algorithm to find an optimum location of the cluster head.
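
    A minimal sketch of what such a brute-force search can look like is given below. It assumes a simple cost model (sum of node-to-head distances on a discrete candidate grid) and made-up node coordinates, not the paper's formulation (Python).

        # Brute-force search for a cluster-head position minimising total distance
        # to the sensor nodes over a discrete grid of candidate positions.
        import itertools
        import math

        # Hypothetical sensor coordinates on a 100 m x 100 m field
        nodes = [(10, 20), (35, 80), (60, 15), (75, 60), (90, 90), (20, 55)]

        def cost(head, nodes):
            # Sum of Euclidean distances as a simple proxy for transmission energy
            return sum(math.dist(head, n) for n in nodes)

        candidates = itertools.product(range(0, 101, 5), range(0, 101, 5))
        best_head = min(candidates, key=lambda h: cost(h, nodes))

        print("optimum cluster-head location:", best_head)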

  8. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  9. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  10. Big ideas: innovation policy

    OpenAIRE

    John Van Reenen

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  11. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  12. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  13. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  14. Moving Another Big Desk.

    Science.gov (United States)

    Fawcett, Gay

    1996-01-01

    New ways of thinking about leadership require that leaders move their big desks and establish environments that encourage trust and open communication. Educational leaders must trust their colleagues to make wise choices. When teachers are treated democratically as leaders, classrooms will also become democratic learning organizations. (SM)

  15. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner that can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future may be connected in a non-paradoxical manner in the universes described by the new symmetric solutions

  16. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  17. Big Data Analytics

    Indian Academy of Sciences (India)

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 Zettabytes (Zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse ...

  18. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  19. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  20. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated, especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user’s code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem’s search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
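
    The following sketch is not the IEJoin algorithm itself; it only illustrates the underlying idea of exploiting sorted order to evaluate a single inequality predicate without comparing every pair, which IEJoin extends to two inequality predicates using permutation and bit arrays. Tables and column names are made up (Python).

        # Evaluate R.x < S.y with a sort plus binary search instead of a cross product.
        import bisect

        R = [("r1", 100), ("r2", 140), ("r3", 80)]     # (id, x)
        S = [("s1", 90), ("s2", 120), ("s3", 150)]     # (id, y)

        # Sort S on y once; for each r, binary-search the first y strictly greater
        # than x, and everything to its right joins with r.
        S_sorted = sorted(S, key=lambda t: t[1])
        ys = [t[1] for t in S_sorted]

        result = []
        for rid, x in R:
            i = bisect.bisect_right(ys, x)             # first index with y > x
            result.extend((rid, sid) for sid, _ in S_sorted[i:])

        print(result)   # pairs satisfying R.x < S.y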

  1. Big Data and central banks

    Directory of Open Access Journals (Sweden)

    David Bholat

    2015-04-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  2. Big data and analytics strategic and organizational impacts

    CERN Document Server

    Morabito, Vincenzo

    2015-01-01

    This book presents and discusses the main strategic and organizational challenges posed by Big Data and analytics in a manner relevant to both practitioners and scholars. The first part of the book analyzes strategic issues relating to the growing relevance of Big Data and analytics for competitive advantage, which is also attributable to empowerment of activities such as consumer profiling, market segmentation, and development of new products or services. Detailed consideration is also given to the strategic impact of Big Data and analytics on innovation in domains such as government and education and to Big Data-driven business models. The second part of the book addresses the impact of Big Data and analytics on management and organizations, focusing on challenges for governance, evaluation, and change management, while the concluding part reviews real examples of Big Data and analytics innovation at the global level. The text is supported by informative illustrations and case studies, so that practitioners...

  3. Extending Particle Swarm Optimisers with Self-Organized Criticality

    DEFF Research Database (Denmark)

    Løvbjerg, Morten; Krink, Thiemo

    2002-01-01

    Particle swarm optimisers (PSOs) show potential in function optimisation, but still have room for improvement. Self-organized criticality (SOC) can help control the PSO and add diversity. Extending the PSO with SOC seems promising, reaching faster convergence and better solutions.
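
    The sketch below is a schematic PSO on a toy function extended with a simple SOC-inspired diversity rule; it is not the authors' exact scheme, and the objective, thresholds and coefficients are illustrative assumptions (Python).

        # Basic PSO with an SOC-inspired rule: each particle keeps a "criticality"
        # counter that grows when neighbours crowd it; past a threshold the
        # particle is relocated ("avalanche") to restore diversity.
        import numpy as np

        rng = np.random.default_rng(1)

        def sphere(x):                       # toy objective to minimise
            return float(np.sum(x ** 2))

        dim, n, iters = 5, 20, 200
        w, c1, c2 = 0.72, 1.49, 1.49         # standard PSO coefficients
        pos = rng.uniform(-5, 5, (n, dim))
        vel = np.zeros((n, dim))
        pbest = pos.copy()
        pbest_val = np.array([sphere(p) for p in pos])
        crit = np.zeros(n)                   # SOC criticality counters

        for _ in range(iters):
            gbest = pbest[np.argmin(pbest_val)]
            r1, r2 = rng.random((n, dim)), rng.random((n, dim))
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = pos + vel
            for i in range(n):
                val = sphere(pos[i])
                if val < pbest_val[i]:
                    pbest_val[i], pbest[i] = val, pos[i].copy()
                # Crowding increases criticality
                others = np.delete(pos, i, axis=0)
                if np.min(np.linalg.norm(pos[i] - others, axis=1)) < 0.1:
                    crit[i] += 1
                if crit[i] > 5:              # avalanche: relocate and reset
                    pos[i] = rng.uniform(-5, 5, dim)
                    vel[i] = 0.0
                    crit[i] = 0

        print("best value found:", pbest_val.min())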

  4. Optimisation of surgical care for rectal cancer

    NARCIS (Netherlands)

    Borstlap, W.A.A.

    2017-01-01

    Optimisation of surgical care means weighing the risk of treatment related morbidity against the patients’ potential benefits of a surgical intervention. The first part of this thesis focusses on the anaemic patient undergoing colorectal surgery. Hypothesizing that a more profound haemoglobin

  5. On optimal development and becoming an optimiser

    NARCIS (Netherlands)

    de Ruyter, D.J.

    2012-01-01

    The article aims to provide a justification for the claim that optimal development and becoming an optimiser are educational ideals that parents should pursue in raising their children. Optimal development is conceptualised as enabling children to grow into flourishing persons, that is persons who

  6. OPTIMISATION OF COMPRESSIVE STRENGTH OF PERIWINKLE ...

    African Journals Online (AJOL)

    In this paper, a regression model is developed to predict and optimise the compressive strength of periwinkle shell aggregate concrete using Scheffe's regression theory. The results obtained from the derived regression model agreed favourably with the experimental data. The model was tested for adequacy using a student ...

  7. An efficient optimisation method in groundwater resource ...

    African Journals Online (AJOL)

    DRINIE

    2003-10-04

    Oct 4, 2003 ... theories developed in the field of stochastic subsurface hydrology. In reality, many ... Recently, some researchers have applied the multi-stage ... Then a robust solution of the optimisation problem given by Eqs. (1) to (3) is as ...

  8. Water distribution systems design optimisation using metaheuristics ...

    African Journals Online (AJOL)

    The topic of multi-objective water distribution systems (WDS) design optimisation using metaheuristics is investigated, comparing numerous modern metaheuristics, including several multi-objective evolutionary algorithms, an estimation of distribution algorithm and a recent hyperheuristic named AMALGAM (an evolutionary ...

  9. Optimisation of efficiency of axial fans

    NARCIS (Netherlands)

    Kruyt, Nicolaas P.; Pennings, P.C.; Faasen, R.

    2014-01-01

    A three-stage research project has been executed to develop ducted axial-fans with increased efficiency. In the first stage a design method has been developed in which various conflicting design criteria can be incorporated. Based on this design method, an optimised design has been determined

  10. Thermodynamic optimisation of a heat exchanger

    NARCIS (Netherlands)

    Cornelissen, Rene; Hirs, Gerard

    1999-01-01

    The objective of this paper is to show that for the optimal design of an energy system, where there is a trade-off between exergy saving during operation and exergy use during construction of the energy system, exergy analysis and life cycle analysis should be combined. An exergy optimisation of a

  11. Self-optimising control of sewer systems

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Montero-Castro, Ignacio; Mollerup, Ane Loft

    2013-01-01

    The definition of optimal performance was carried out through a two-stage optimisation (stochastic and deterministic) to take into account both the overflow during the current rain event and the expected overflow, given the probability of a future rain event. The methodology is successfully applied...

  12. Shape optimisation and performance analysis of flapping wings

    KAUST Repository

    Ghommem, Mehdi

    2012-09-04

    In this paper, shape optimisation of flapping wings in forward flight is considered. This analysis is performed by combining a local gradient-based optimizer with the unsteady vortex lattice method (UVLM). Although the UVLM applies only to incompressible, inviscid flows where the separation lines are known a priori, Persson et al. [1] showed through a detailed comparison between UVLM and higher-fidelity computational fluid dynamics methods for flapping flight that the UVLM schemes produce accurate results for attached flow cases and even remain trend-relevant in the presence of flow separation. As such, they recommended the use of an aerodynamic model based on UVLM to perform preliminary design studies of flapping wing vehicles. Unlike standard computational fluid dynamics schemes, this method requires meshing of the wing surface only and not of the whole flow domain [2]. From the design or optimisation perspective taken in our work, it is fairly common (and sometimes entirely necessary, as a result of the excessive computational cost of the highest-fidelity tools such as Navier-Stokes solvers) to rely upon such a moderate level of modelling fidelity to traverse the design space in an economical manner. The objective of the work described in this paper is to identify a set of optimised shapes that maximise the propulsive efficiency, defined as the ratio of the propulsive power over the aerodynamic power, under lift, thrust, and area constraints. The shape of the wings is modelled using B-splines, a technology used in the computer-aided design (CAD) field for decades. This basis can be used to smoothly discretize wing shapes with few degrees of freedom, referred to as control points. The locations of the control points constitute the design variables. The results suggest that changing the shape yields significant improvement in the performance of the flapping wings. The optimisation pushes the design to "bird-like" shapes with a substantial increase in the time
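
    A minimal sketch of the kind of B-spline parameterisation referred to above is shown below. It is not the paper's implementation: the quantity parameterised (a spanwise chord distribution), the control-point values and the area proxy are all illustrative assumptions (Python).

        # A cubic B-spline whose control points act as design variables.
        import numpy as np
        from scipy.interpolate import BSpline

        degree = 3
        # Hypothetical control points: chord length [m] at stations along the span
        control_points = np.array([0.05, 0.09, 0.11, 0.10, 0.06, 0.02])
        n_ctrl = len(control_points)

        # Clamped knot vector so the curve starts/ends at the first/last control point
        knots = np.concatenate((np.zeros(degree),
                                np.linspace(0.0, 1.0, n_ctrl - degree + 1),
                                np.ones(degree)))

        chord = BSpline(knots, control_points, degree)

        span_stations = np.linspace(0.0, 1.0, 50)   # normalised spanwise coordinate
        planform = chord(span_stations)             # smooth chord distribution

        # An optimiser would perturb control_points (the design variables), rebuild
        # the spline, and re-evaluate the aerodynamic objective and constraints.
        area = float(np.sum(planform) * (span_stations[1] - span_stations[0]))
        print(f"half-wing area proxy: {area:.4f}")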

  13. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  14. Optimising Shovel-Truck Fuel Consumption using Stochastic ...

    African Journals Online (AJOL)

    Optimising the fuel consumption and truck waiting time can result in significant fuel savings. The paper demonstrates that stochastic simulation is an effective tool for optimising the utilisation of fossil-based fuels in mining and related industries. Keywords: Stochastic, Simulation Modelling, Mining, Optimisation, Shovel-Truck ...
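
    The flavour of such a stochastic simulation is sketched below. It assumes a single shovel serving a truck fleet with exponential load and haul times and a made-up idle fuel-burn rate; none of the numbers or the queueing model come from the paper (Python).

        # Monte Carlo comparison of fleet sizes by estimated truck waiting time
        # and idle fuel burn at the shovel.
        import numpy as np

        rng = np.random.default_rng(42)

        def simulate(n_trucks, cycles=2000, load_mean=3.0, haul_mean=18.0,
                     idle_burn=25.0 / 60.0):        # litres of fuel per idle minute
            """Return mean waiting time [min] and idle fuel [l] per truck cycle."""
            shovel_free_at = 0.0
            arrivals = np.zeros(n_trucks)           # next arrival time of each truck
            wait_total, fuel_total, served = 0.0, 0.0, 0
            for _ in range(cycles):
                i = int(np.argmin(arrivals))        # next truck to reach the shovel
                t = arrivals[i]
                start = max(t, shovel_free_at)      # queue if the shovel is busy
                wait = start - t
                load = rng.exponential(load_mean)
                shovel_free_at = start + load
                arrivals[i] = shovel_free_at + rng.exponential(haul_mean)
                wait_total += wait
                fuel_total += wait * idle_burn
                served += 1
            return wait_total / served, fuel_total / served

        for fleet in (3, 5, 7, 9):
            w, f = simulate(fleet)
            print(f"{fleet} trucks: avg wait {w:5.1f} min, idle fuel {f:4.1f} l/cycle")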

  15. Design of optimised backstepping controller for the synchronisation ...

    Indian Academy of Sciences (India)

    Ehsan Fouladi

    2017-12-18

    Dec 18, 2017 ... for the proposed optimised method compared to PSO optimised controller or any non-optimised backstepping controller. Keywords. Colpitts oscillator; backstepping controller; chaos synchronisation; shark smell algorithm; particle .... The velocity model is based on the gradient of the objective function, tilting ...

  16. Efficient topology optimisation of multiscale and multiphysics problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    The aim of this Thesis is to present efficient methods for optimising high-resolution problems of a multiscale and multiphysics nature. The Thesis consists of two parts: one treating topology optimisation of microstructural details and the other treating topology optimisation of conjugate heat...

  17. Research on the Impact of Big Data on Logistics

    Directory of Open Access Journals (Sweden)

    Wang Yaxing

    2017-01-01

    In the context of big data development, logistics enterprises generate large amounts of data across operations such as transportation, warehousing and distribution. Based on an analysis of the characteristics of big data, this paper studies the impact of big data on logistics and its mechanism of action, and gives reasonable suggestions. By building a logistics data centre with big data technology, hidden value behind the data can be uncovered, from which logistics enterprises can benefit.

  18. Optimisation of integrated energy and materials systems

    International Nuclear Information System (INIS)

    Gielen, D.J.; Okken, P.A.

    1994-06-01

    To define cost-effective long-term CO2 reduction strategies, an integrated energy and materials system model for the Netherlands for the period 2000-2040 is developed. The model is based upon the energy system model MARKAL, which configures an optimal mix of technologies to satisfy the specified energy and product/materials service demands. This study concentrates on CO2 emission reduction in the materials system. For this purpose, the energy system model is enlarged with a materials system model including all steps 'from cradle to grave'. The materials system model includes 29 materials, 20 product groups and 30 waste materials. The system is divided into seven types of technologies; 250 technologies are modelled. The results show that the integrated optimisation of the energy system and the materials system can significantly reduce the emission reduction costs, especially at higher reduction percentages. The reduction is achieved through shifts in materials production and waste handling and through materials substitution in products. Shifts in materials production and waste management seem cost-effective, while the cost-effectiveness of shifts in product composition is sensitive to the cost structure of products. For the building sector, transportation applications and packaging, CO2 policies show a significant impact on prices, and shifts in product composition could occur. For other products, the reduction through materials substitution seems less promising. The impact on materials consumption seems most significant for cement (reduced) and for timber and aluminium (both increased). For steel and plastics, the net effect is balanced, but shifts between applications do occur. The MARKAL approach is well suited to studying integrated energy and materials systems. Its advance over other environmental system analysis instruments is the much greater insight it gives into the interaction of technologies on a national scale and over time
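
    At its core, a MARKAL-type model chooses the cheapest mix of technologies that meets service demands under an emissions constraint. The linear programme below is a toy sketch of that structure only, with made-up costs, emission factors and a hypothetical CO2 cap, not the actual Dutch model (Python).

        # Cheapest mix of three hypothetical supply routes meeting a demand
        # subject to a CO2 cap.
        from scipy.optimize import linprog

        costs = [40.0, 55.0, 70.0]          # cost per unit: primary, recycled, substitute route
        co2 = [2.0, 0.8, 0.3]               # CO2 per unit for each route

        demand = 100.0                      # required units of the material service
        co2_cap = 120.0                     # allowed total emissions

        res = linprog(
            c=costs,                                  # minimise total cost
            A_ub=[co2], b_ub=[co2_cap],               # emissions below the cap
            A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],    # meet the service demand
            bounds=[(0, None)] * 3,
        )

        print("activity of each route:", res.x)
        print("total cost:", res.fun)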

  19. Design of the New Life(style) study : a randomised controlled trial to optimise maternal weight development during pregnancy. [ISRCTN85313483

    NARCIS (Netherlands)

    Althuizen, Ellen; van Poppel, Mireille Nm; Seidell, Jacob C; van der Wijden, Carla; van Mechelen, Willem

    2006-01-01

    BACKGROUND: Preventing excessive weight gain during pregnancy is potentially important in the prevention of overweight and obesity among women of childbearing age. However, few intervention studies aiming at weight management during pregnancy have been performed and most of these interventions were

  20. Extended prediction rule to optimise early detection of heart failure in older persons with non-acute shortness of breath : A cross-sectional study

    NARCIS (Netherlands)

    Van Riet, Evelien E S; Hoes, Arno W.; Limburg, Alexander; Landman, Marcel A J; Kemperman, Hans; Rutten, Frans H.

    2016-01-01

    Objectives: There is a need for a practical tool to aid general practitioners in early detection of heart failure in the elderly with shortness of breath. In this study, such a screening rule was developed based on an existing rule for detecting heart failure in older persons with a diagnosis of