WorldWideScience

Sample records for optimisation study big

  1. Profile control studies for JET optimised shear regime

    Energy Technology Data Exchange (ETDEWEB)

    Litaudon, X.; Becoulet, A.; Eriksson, L.G.; Fuchs, V.; Huysmans, G.; How, J.; Moreau, D.; Rochard, F.; Tresset, G.; Zwingmann, W. [Association Euratom-CEA, CEA/Cadarache, Dept. de Recherches sur la Fusion Controlee, DRFC, 13 - Saint-Paul-lez-Durance (France); Bayetti, P.; Joffrin, E.; Maget, P.; Mayorat, M.L.; Mazon, D.; Sarazin, Y. [JET Abingdon, Oxfordshire (United Kingdom); Voitsekhovitch, I. [Universite de Provence, LPIIM, Aix-Marseille 1, 13 (France)

    2000-03-01

    This report summarises the profile control studies, i.e. the preparation and analysis of JET Optimised Shear plasmas, carried out during 1999 within the framework of the Task Agreement (RF/CEA/02) between JET and the Association Euratom-CEA/Cadarache. We report on our participation in the preparation of the JET Optimised Shear experiments together with their comprehensive analysis and modelling. Emphasis is put on the various aspects of pressure profile control (core and edge pressure) together with detailed studies of current profile control by non-inductive means, with the prospect of achieving steady, high-performance Optimised Shear plasmas. (authors)

  2. Review Study of Mining Big Data

    Directory of Open Access Journals (Sweden)

    Mohammad Misagh Javaherian

    2016-06-01

    Full Text Available Big data is the term for extensive and complex data sets that include both structured and unstructured information. Data can come from everywhere: sensors collecting environmental data, social networking sites, digital images and recordings, and so on; this information is known as big data. Valuable knowledge can be extracted from big data using data mining, a method for finding interesting patterns and logical models in data at a wide scale. This article presents the types of big data and the future problems of such extensive information in chart form, and analyses the issues of the data-centred model alongside big data.

  3. The Cell Factory Aspergillus Enters the Big Data Era: Opportunities and Challenges for Optimising Product Formation.

    Science.gov (United States)

    Meyer, Vera; Fiedler, Markus; Nitsche, Benjamin; King, Rudibert

    2015-01-01

    Living with limits. Getting more from less. Producing commodities and high-value products from renewable resources including waste. What constitutes the driving force and quintessence of the bioeconomy also outlines the lifestyle and product portfolio of Aspergillus, a saprophytic genus to which some of the top-performing microbial cell factories belong: Aspergillus niger, Aspergillus oryzae and Aspergillus terreus. What makes them so interesting for exploitation in biotechnology, and how can they help us to address key challenges of the twenty-first century? How can these strains be trimmed for better growth on second-generation feedstocks, and how can we enlarge their product portfolio by genetic and metabolic engineering to get more from less? On the other hand, what makes it so challenging to deduce biological meaning from the wealth of Aspergillus -omics data? And which hurdles hinder us from modelling and engineering industrial strains for higher productivity and better rheological performance under industrial cultivation conditions? In this review, we address these issues by highlighting the most recent findings from Aspergillus research, with a focus on fungal growth, physiology, morphology and product formation. Indeed, recent years have brought many surprising insights into model and industrial strains. They clearly told us that similar is not the same: there are different ways to make a hypha, there are more protein secretion routes than anticipated, and there are different molecular and physical mechanisms which control polar growth and the development of hyphal networks. We discuss new conceptual frameworks derived from these insights and the future scientific advances necessary to create value from Aspergillus Big Data.

  4. A case study in scanner optimisation.

    Science.gov (United States)

    Dudley, N J; Gibson, N M

    2014-02-01

    Ultrasound scanner preset programmes are factory set or tailored to user requirements. Scanners may, therefore, have different settings for the same application, even on similar equipment in a single department. The aims of this study were: (1) to attempt to match the performance of two scanners, where one was preferred and (2) to assess differences between six scanners used for breast ultrasound within our organisation. The Nottingham Ultrasound Quality Assurance software was used to compare imaging performance. Images of a Gammex RMI 404GS test object were collected from six scanners, using default presets, factory presets and settings matched to a preferred scanner. Resolution, low contrast performance and high contrast performance were measured. The performance of two scanners was successfully matched, where one had been preferred. Default presets varied across the six scanners, three different presets being used. The most used preset differed in settings across the scanners, most notably in the use of different frequency modes. The factory preset was more consistent across the scanners, the main variation being in dynamic range (55-70 dB). Image comparisons showed significant differences, which were reduced or eliminated by adjustment of settings to match a reference scanner. It is possible to match scanner performance using the Nottingham Ultrasound Quality Assurance software as a verification tool. Ultrasound users should be aware that scanners may not behave in a similar fashion, even with apparently equivalent presets. It should be possible to harmonise presets by consensus amongst users.

  5. STUDY OF FACTORS AFFECTING CUSTOMER BEHAVIOUR USING BIG DATA TECHNOLOGY

    OpenAIRE

    Prabin Sahoo; Dr. Nilay Yajnik

    2014-01-01

    Big data technology has been gaining momentum recently. There are several articles, books, blogs and discussions covering various facets of big data technology. The study in this paper focuses on big data as a concept, gives insights into the 3 Vs (Volume, Velocity and Variety), and demonstrates their significance with respect to the factors that can be processed using big data for studying the behaviour of online customers.

  6. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    Science.gov (United States)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. Automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, thereby facilitating automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system-level failure possibility remains acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy finite element (FE) analysis applied for the uncertainty treatment in crash simulations. It is the counterpart of reliability-based design optimisation, used in a probabilistic context with statistically defined parameters (variabilities).

  7. A comparative study of marriage in honey bees optimisation (MBO ...

    African Journals Online (AJOL)

    2012-02-15

    Feb 15, 2012 ... complicate water management decision-making. ... evolutionary algorithms, such as the genetic algorithm (GA), ant colony optimisation for continuous ... biological properties. ... and proposed a new algorithm, called the 'artificial bee colony' ... as a set of transitions in a state–space (the environment), where.

  8. Pre-segmented 2-Step IMRT with subsequent direct machine parameter optimisation – a planning study

    Directory of Open Access Journals (Sweden)

    Flentje Michael

    2008-11-01

    Full Text Available Abstract Background Modern intensity modulated radiotherapy (IMRT) mostly uses iterative optimisation methods. The integration of machine parameters into the optimisation process of step-and-shoot leaf positions has been shown to be successful. For IMRT segmentation algorithms based on the analysis of the geometrical structure of the planning target volumes (PTV) and the organs at risk (OAR), the potential of such procedures has not yet been fully explored. In this work, 2-Step IMRT was combined with subsequent direct machine parameter optimisation (DMPO, RaySearch Laboratories, Sweden) to investigate this potential. Methods In a planning study, DMPO on a commercial planning system was compared with manual primary 2-Step IMRT segment generation followed by DMPO optimisation. 15 clinical cases and the ESTRO Quasimodo phantom were employed. Both the same number of optimisation steps and the same set of objective values were used. The plans were compared with a clinical DMPO reference plan and a traditional IMRT plan based on fluence optimisation and subsequent segmentation. The composite objective value (the weighted sum of quadratic deviations of the objective values and the related points in the dose volume histogram) was used as a measure of plan quality. Additionally, a more extended set of parameters was used for the breast cases to compare the plans. Results The plans with segments pre-defined with 2-Step IMRT were slightly superior to DMPO alone in the majority of cases. The composite objective value tended to be even lower for a smaller number of segments. The total number of monitor units was slightly higher than for the DMPO plans. Traditional IMRT fluence optimisation with subsequent segmentation could not compete. Conclusion 2-Step IMRT segmentation is suitable as a starting point for further DMPO optimisation and, in general, results in less complex plans which are equal or superior to plans generated by DMPO alone.
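
    A composite objective value of this kind is the weighted sum of squared deviations between achieved and requested dose-volume points; the exact definition used by the planning system is not given in the record, so the notation below is an assumption:

        F = \sum_{i} w_i \left( D_i - D_i^{\mathrm{obj}} \right)^2

    where D_i is the dose achieved at the i-th dose-volume objective point, D_i^{obj} is the corresponding objective value and w_i its weight; a lower F indicates a plan that is closer to its objectives.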

  9. Optimising the neutron environment of Radiation Portal Monitors: a computational optimisation study

    CERN Document Server

    Gilbert, Mark R; Packer, Lee W

    2015-01-01

    Efficient and reliable detection of radiological or nuclear threats is a crucial part of national and international efforts to prevent terrorist activities. Radiation Portal Monitors (RPMs), which are deployed worldwide, are intended to interdict smuggled fissile material by detecting emissions of neutrons and gamma rays. However, considering the range and variety of threat sources, vehicular and shielding scenarios, and that only a small signature is present, it is important that the design of the RPMs allows these signatures to be accurately differentiated from the environmental background. Using Monte-Carlo neutron-transport simulations of a model helium-3 detector system we have conducted a parameter study to identify the optimum combination of detector shielding and collimation that maximises the sensitivity of RPMs. These structures, which could be simply and cost-effectively added to existing RPMs, can improve the detector response by more than a factor of two relative to an unmodified, bare design. Fu...

  10. A Critical Axiology for Big Data Studies

    Directory of Open Access Journals (Sweden)

    Saif Shahin

    2016-01-01

    Full Text Available Big data has had a great impact on journalism and communication studies, while also generating a large number of social concerns ranging from mass surveillance to the legitimisation of prejudices such as racism. This article develops an agenda for critical Big Data research and discusses what the purpose of such research should be, which pitfalls to guard against, and the possibility of adapting Big Data methods to conduct empirical research from a critical standpoint. Such a research programme will not only allow critical scholarship to meaningfully challenge Big Data as a hegemonic tool, but will also allow scholars to use Big Data resources to address a range of social problems in previously impossible ways. The article calls for methodological innovation that combines emerging Big Data techniques with critical, qualitative research methods, such as ethnography and discourse analysis, in ways that complement each other.

  11. How to optimise drug study design: pharmacokinetics and pharmacodynamics studies introduced to paediatricians.

    OpenAIRE

    Vermeulen, E.; van den Anker, J N; Della Pasqua, O; Hoppu, K.; Lee, J.H.; Global Research in Paediatrics

    2016-01-01

    OBJECTIVES: In children, there is often a lack of sufficient information concerning the pharmacokinetics (PK) and pharmacodynamics (PD) of a study drug to support dose selection and effective evaluation of efficacy in a randomised clinical trial (RCT). Therefore, one should consider the relevance of relatively small PKPD studies, which can provide the appropriate data to optimise the design of an RCT. METHODS: Based on the experience of experts collaborating in the EU-funded Global Research in ...

  12. ELM Meets Urban Big Data Analysis: Case Studies

    Science.gov (United States)

    Chen, Huajun; Chen, Jiaoyan

    2016-01-01

    In recent years, the rapid progress of urban computing has engendered big issues, which create both opportunities and challenges. The heterogeneity and sheer volume of data, and the large gap between the physical and virtual worlds, make it difficult to solve practical problems in urban computing quickly. In this paper, we propose a general application framework of ELM for urban computing. We present several real case studies of the framework, such as smog-related health hazard prediction and optimal retail store placement. Experiments involving urban data in China show the efficiency, accuracy, and flexibility of our proposed framework. PMID:27656203

  13. Study on the material character steel big-end bolt

    Institute of Scientific and Technical Information of China (English)

    SHI Jian-jun; MA Nian-jie; ZHAN Ping; ZHANG Dian-li

    2008-01-01

    The manufacturing method currently used in our country causes low bearing capacity at the bolt end, which is both a potential danger in bolt-supported tunnels and a waste of money; a new type of strong steel big-end bolt is presented to solve this problem. The loading state of the bolt end was analysed with ANSYS, showing that the bolt is at a clear disadvantage when bearing an eccentric load. Comparing a common bolt with a big-end bolt under the same loading, the stress in the common bolt is higher than in the big-end bolt. The processing technique of the new strong steel big-end bolt was studied: the new metal big-end bolt is produced by hot working of the bolt end and upsetting. Microscopic examination of the bolt metal shows that hot working of the bolt end refines the grain of the metal material, which not only increases its elongation but also improves its other properties, so that the strength of the bolt material can be fully exploited.

  14. Study on the material character steel big-end bolt

    Institute of Scientific and Technical Information of China (English)

    SHI Jian-jun; MA Nian-jie; ZHAN Ping; ZHANG Dian-li

    2008-01-01

    The manufacturing method currently used in our country causes low bearing capacity at the bolt end, which is both a potential danger in bolt-supported tunnels and a waste of money; a new type of strong steel big-end bolt is presented to solve this problem. The loading state of the bolt end was analysed with ANSYS, showing that the bolt is at a clear disadvantage when bearing an eccentric load. Comparing a common bolt with a big-end bolt under the same loading, the stress in the common bolt is higher than in the big-end bolt. The processing technique of the new strong steel big-end bolt was studied: the new metal big-end bolt is produced by hot working of the bolt end and upsetting. Microscopic examination of the bolt metal shows that hot working of the bolt end refines the grain of the metal material, which not only increases its elongation but also improves its other properties, so that the strength of the bolt material can be fully exploited.

  15. Single-cell Transcriptome Study as Big Data

    Institute of Scientific and Technical Information of China (English)

    Pingjian Yu; Wei Lin

    2016-01-01

    The rapid growth of single-cell RNA-seq studies (scRNA-seq) demands efficient data storage, processing, and analysis. Big-data technology provides a framework that facilitates the comprehensive discovery of biological signals from inter-institutional scRNA-seq datasets. The strategies to solve the stochastic and heterogeneous single-cell transcriptome signal are discussed in this article. After extensively reviewing the available big-data applications of next-generation sequencing (NGS)-based studies, we propose a workflow that accounts for the unique characteristics of scRNA-seq data and primary objectives of single-cell studies.

  16. Single-cell Transcriptome Study as Big Data

    Science.gov (United States)

    Yu, Pingjian; Lin, Wei

    2016-01-01

    The rapid growth of single-cell RNA-seq studies (scRNA-seq) demands efficient data storage, processing, and analysis. Big-data technology provides a framework that facilitates the comprehensive discovery of biological signals from inter-institutional scRNA-seq datasets. The strategies to solve the stochastic and heterogeneous single-cell transcriptome signal are discussed in this article. After extensively reviewing the available big-data applications of next-generation sequencing (NGS)-based studies, we propose a workflow that accounts for the unique characteristics of scRNA-seq data and primary objectives of single-cell studies. PMID:26876720

  17. BIG DATA IN SUPPLY CHAIN MANAGEMENT: AN EXPLORATORY STUDY

    Directory of Open Access Journals (Sweden)

    Gheorghe MILITARU

    2015-12-01

    Full Text Available The objective of this paper is to set a framework for examining the conditions under which big data can create long-term profitability through developing dynamic operations and digital supply networks in the supply chain. We investigate the extent to which big data analytics has the power to change the competitive landscape of industries, offering operational, strategic and competitive advantages. This paper is based upon a qualitative study of the convergence of predictive analytics and big data in the field of supply chain management. Our findings indicate a need for manufacturers to introduce analytics tools, real-time data, and more flexible production techniques to improve their productivity in line with the new business model. By gathering and analysing vast volumes of data, analytics tools help companies allocate resources and capital spend more effectively, based on risk assessment. Finally, implications and directions for future research are discussed.

  18. Epidemiological study of venous thromboembolism in a big Danish cohort

    DEFF Research Database (Denmark)

    Severinsen, Marianne Tang; Kristensen, Søren Risom; Overvad, Kim

    Introduction: Epidemiological data on venous thromboembolism (VT), i.e. pulmonary emboli (PE) and deep venous thrombosis (DVT), are sparse. We have examined VT diagnoses registered in a big Danish cohort study. Methods: All first-time VT diagnoses in The Danish National Patient Register were...

  19. A study on Scintillating Fiber tracker optimisation for the LHCb upgrade

    CERN Document Server

    Piucci, Alessio; Esen, Sevda; Nikodem, Thomas

    2017-01-01

    New tracking stations made from scintillating fibres are designed as a part of the LHCb Upgrade to provide the tracking capabilities in the data-taking environment foreseen after Run II. Due to the higher track multiplicity and higher total integrated radiation dose, larger inefficiencies are found in the simulation compared to the current tracker. This note presents the optimisation studies and proposals which would improve the reconstruction efficiency in the three stations of the scintillating fiber tracker.

  20. Studies towards optimisation of the analog hadronic calorimeter for future linear collider detectors

    Energy Technology Data Exchange (ETDEWEB)

    Tran, Huong Lan [Deutsches Elektronen-Synchrotron (DESY), Notkestrasse 85, 22607 Hamburg (Germany); Collaboration: CALICE-D-Collaboration

    2016-07-01

    The Analog Hadronic Calorimeter (AHCAL) is a highly granular calorimeter developed in the CALICE collaboration for future linear collider detectors. Its design concept is based on 3 × 3 cm² scintillator tiles read out by silicon photomultipliers (SiPMs). With this design, the ambitious required jet energy resolution of 3-4% can be achieved using the Pandora Particle Flow Algorithm (PandoraPFA). Recent discussions concerning the overall size and cost of the ILD detector have triggered new studies to optimise the AHCAL cell size. A smaller number of cells can reduce the detector cost, but the corresponding larger cell size can lead to a degradation of the jet energy resolution. The AHCAL optimisation study therefore has to achieve the best balance between physics performance and cost. Recent studies using the latest version of PandoraPFA with improved pattern recognition have shown significant improvement of the jet energy resolution. Moreover, a better energy reconstruction of single particles, in which software compensation plays an important role, can lead to further improvements. This talk discusses the software compensation technique and its impact on the final cell size optimisation.

  1. Radiation dose to children in diagnostic radiology. Measurements and methods for clinical optimisation studies

    Energy Technology Data Exchange (ETDEWEB)

    Almen, A.J.

    1995-09-01

    A method for estimating the mean absorbed dose to different organs and tissues was developed for paediatric patients undergoing X-ray investigations. The absorbed dose distribution in water was measured for the specific X-ray beam used. Clinical images were studied to determine X-ray beam positions and field sizes. The size and position of organs in the patient were estimated using ORNL phantoms and complementary clinical information. Conversion factors between the mean absorbed dose to various organs and the entrance surface dose were calculated for five different body sizes. Direct measurements on patients estimating entrance surface dose and energy imparted were performed for common X-ray investigations. The examination technique for a number of paediatric X-ray investigations used in 19 Swedish hospitals was studied. For a simulated pelvis investigation of a 1-year-old child, the entrance surface dose was measured and image quality was estimated using a contrast-detail phantom. Mean absorbed doses to organs and tissues in urography, lung, pelvis, thoracic spine, lumbar spine and scoliosis investigations were calculated. Calculations of effective dose were supplemented with risk calculations for specific organs, e.g. the female breast. The work shows that the examination technique in paediatric radiology is not yet optimised, and that the non-optimised procedures contribute to a considerable variation in radiation dose. In order to optimise paediatric radiology there is a need for more standardised methods in patient dosimetry. It is especially important to relate measured quantities to the size of the patient, using e.g. the patient weight and length. 91 refs, 17 figs, 8 tabs.

  2. Clinical studies of optimised single crystal and polycrystalline diamonds for radiotherapy dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Descamps, C. [CEA-LIST (Recherche Technologique)/DETECS/SSTM/LCD, CEA/Saclay, Gif-sur-Yvette (France)], E-mail: cdescamps23@yahoo.fr; Tromson, D.; Tranchant, N. [CEA-LIST (Recherche Technologique)/DETECS/SSTM/LCD, CEA/Saclay, Gif-sur-Yvette (France); Isambert, A.; Bridier, A. [Institut Gustave Roussy, Villejuif (France); De Angelis, C.; Onori, S. [Dipartimento di Tecnologie e Salute, Istituto Superiore di Sanita, Roma (Italy); Bucciolini, M. [Dipartimento di Fisiopatologia dell' Universita, Firenze (Italy); Bergonzo, P. [CEA-LIST (Recherche Technologique)/DETECS/SSTM/LCD, CEA/Saclay, Gif-sur-Yvette (France)

    2008-02-15

    Natural diamond based ionisation chambers commercialised by PTW are used in several hospitals, and their dosimetric properties have been reported in many papers. Nevertheless their high costs and long delivery times are strong drawbacks. Advancements in the growth of synthetic diamonds offer new possibilities. This paper presents the dosimetric analysis in terms of stability and repeatability of the signal, background signal, detector response dynamics, linearity of the signal with the absorbed dose and dose rate dependence of synthetic optimised polycrystalline and single crystal diamonds. Both were elaborated at the CEA-LIST using the chemical vapour deposition (CVD) growth technique. The first dosimetric evaluation of single crystal diamond detector, reported here, shows a repeatability better than 0.1%, a good sensitivity around 70 nC/Gy compared to 3 nC/Gy for optimised polycrystalline diamond, very fast response with rise time around 1 s. Moreover, the signal linearity vs absorbed dose and energy dependence are very satisfactory. This preliminary dosimetric study with medical linear accelerators proves that diamond, and more precisely synthetic single crystal diamond, appears as a good alternative to air ionisation chambers for quality beam control and could be a good candidate for intensity modulated radiation therapy (IMRT) beams dosimetry.

  3. Using modified fruit fly optimisation algorithm to perform the function test and case studies

    Science.gov (United States)

    Pan, Wen-Tsao

    2013-06-01

    Evolutionary computation is a computing paradigm established by simulating natural evolutionary processes based on the concepts of Darwinian theory, and it is a common research method. The main contribution of this paper is to reinforce the ability of the fruit fly optimization algorithm (FOA) to search for the optimal solution, in order to avoid becoming trapped in local extrema. Evolutionary computation has grown to include the concepts of animal foraging behaviour and group behaviour. This study discussed three common evolutionary computation methods and compared them with the modified fruit fly optimization algorithm (MFOA). It further investigated the ability of the algorithms to compute extreme values of three mathematical functions, as well as the algorithm execution speed and the forecast ability of the forecasting model built using the optimised general regression neural network (GRNN) parameters. The findings indicated that there was no obvious difference between particle swarm optimization and the MFOA with regard to the ability to compute extreme values; however, both were better than the artificial fish swarm algorithm and the FOA. In addition, the MFOA performed better than particle swarm optimization with regard to algorithm execution speed, and the forecast ability of the forecasting model built using the MFOA's GRNN parameters was better than that of the other three forecasting models.
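
    For readers unfamiliar with the underlying method, a minimal Python sketch of the basic FOA loop is given below. It is an illustrative rendering of the standard algorithm, not the authors' MFOA modification; the function name foa and its parameter values are assumptions.

        import math
        import random

        def foa(smell_fn, dim=2, n_flies=20, iters=200, step=1.0):
            # Minimal fruit fly optimisation sketch: flies search randomly around the
            # current swarm location, the smell concentration judgement value S is the
            # reciprocal of the distance to the origin, and the swarm flies to the
            # position whose S gives the best (lowest) smell_fn value.
            axis = [random.uniform(-10.0, 10.0) for _ in range(dim)]
            best_val, best_s = float("inf"), None
            for _ in range(iters):
                candidates = []
                for _ in range(n_flies):
                    pos = [a + random.uniform(-step, step) for a in axis]   # osphresis (smell) search
                    dist = math.sqrt(sum(p * p for p in pos)) or 1e-12
                    s = 1.0 / dist                                          # judgement value S
                    candidates.append((smell_fn(s), s, pos))
                val, s, pos = min(candidates)                               # vision stage: pick best fly
                if val < best_val:
                    best_val, best_s, axis = val, s, pos                    # swarm moves to best position
            return best_s, best_val

        # usage: minimise a simple function of S; the optimum S is 0.5
        print(foa(lambda s: (s - 0.5) ** 2))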

  4. Big Data in HEP: A comprehensive use case study

    Energy Technology Data Exchange (ETDEWEB)

    Gutsche, Oliver [Fermilab; Cremonesi, Matteo [Fermilab; Elmer, Peter [Princeton U.; Jayatilaka, Bo [Fermilab; Kowalkowski, Jim [Fermilab; Pivarski, Jim [Princeton U.; Sehrish, Saba [Fermilab; Mantilla Suárez, Cristina [Johns Hopkins U.; Svyatkovskiy, Alexey [Princeton U.; Tran, Nhan [Fermilab

    2017-01-31

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems collectively called Big Data technologies have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at analysis of very large datasets and could potentially reduce the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication physics plots. We discuss the advantages and disadvantages of each approach and give an outlook on further studies needed.
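
    To make the Spark-style approach concrete, a minimal PySpark sketch of an event selection and histogram is shown below. It is purely illustrative and not taken from the CMS analysis in the record: the column names met_pt and n_jets, the cut values and the file events.parquet are hypothetical, and a local Apache Spark installation is assumed.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("dm-search-sketch").getOrCreate()

        # Hypothetical flat event table; the real analysis starts from official experiment formats.
        events = spark.read.parquet("events.parquet")

        # Event selection: large missing transverse momentum plus at least one jet (illustrative cuts).
        selected = events.filter((F.col("met_pt") > 200) & (F.col("n_jets") >= 1))

        # Crude histogram of met_pt with a 25 GeV bin width, computed distributively.
        hist = (selected
                .withColumn("met_bin", (F.col("met_pt") / 25).cast("int") * 25)
                .groupBy("met_bin")
                .count()
                .orderBy("met_bin"))

        hist.show()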

  5. Big Data, the perfect instrument to study today's consumer behavior

    Directory of Open Access Journals (Sweden)

    Cristina STOICESCU

    2016-01-01

    Full Text Available Consumer behavior study is a new, interdisciplinary and emerging science, developed in the 1960s. Its main sources of information come from economics, psychology, sociology, anthropology and artificial intelligence. A century ago, most people lived in small towns, with limited possibilities to leave their community and few ways to satisfy their needs; now, due to the accelerated evolution of technology and radical changes in lifestyle, consumers have increasingly diverse needs. At the same time, the instruments used to study their behavior have evolved, and today databases are included in consumer behavior research. Over time, many models were developed, first to analyze and later to predict consumer behavior. As a result, the concept of Big Data developed, and by applying it, companies are now trying to understand and predict the behavior of their consumers.

  6. Big Earth Data Initiative: Metadata Improvement: Case Studies

    Science.gov (United States)

    Kozimor, John; Habermann, Ted; Farley, John

    2016-01-01

    The Big Earth Data Initiative (BEDI) invests in standardizing and optimizing the collection, management and delivery of the U.S. Government's civil Earth observation data to improve discovery, access, use, and understanding of Earth observations by the broader user community. Complete and consistent standard metadata helps address all three goals.

  7. A Study of Energy Optimisation of Urban Water Distribution Systems Using Potential Elements

    Directory of Open Access Journals (Sweden)

    Ioan Sarbu

    2016-12-01

    Full Text Available Energy use in water supply systems represents a significant portion of global energy consumption. Electricity consumption due to water pumping represents the highest proportion of the energy costs in these systems. This paper presents several comparative studies of energy efficiency in water distribution systems, considering distinct configurations of the networks and also the implementation of variable-speed pumps. The main objective of this study is the energy optimisation of urban systems using optimal network configurations that reduce energy consumption and improve energy efficiency. The paper describes in detail four strategies for improving the energy efficiency of water pumping: control systems to vary pump speed according to water demand, pumped storage tanks, intermediary pumping stations integrated in the network, and elevated storage tanks floating on the system. Improving the energy efficiency of water pumping is briefly reviewed using a representative real case study. In addition, a different approach for the hydraulic analysis of the networks and the determination of the optimal location of a pumped storage tank is provided. Finally, this study compares the results of the application of four water supply strategies to a real case in Romania. The results indicate a high potential for operating cost savings.
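
    For context, the electrical power drawn by pumping follows the standard hydraulic relation (not taken from the record itself):

        P = \frac{\rho g Q H}{\eta}

    where ρ is the water density, g the gravitational acceleration, Q the flow rate, H the pumping head and η the overall pump-motor efficiency; the strategies listed above save energy mainly by lowering the required head H or by operating the pumps closer to their best-efficiency point.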

  8. BIG DATA IN SUPPLY CHAIN MANAGEMENT: AN EXPLORATORY STUDY

    National Research Council Canada - National Science Library

    Gheorghe MILITARU; Massimo POLLIFRONI; Alexandra IOANID

    2015-01-01

    ... networks in supply chain. We investigate the extent to which big data analytics has the power to change the competitive landscape of industries that could offer operational, strategic and competitive advantages...

  9. Energetic study of combustion instabilities and genetic optimisation of chemical kinetics; Etude energetique des instabilites thermo-acoustiques et optimisation genetique des cinetiques reduites

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Ch.E.

    2005-12-15

    Gas turbine burners are now widely operated in lean premixed combustion mode. This technology has been introduced in order to limit pollutant emissions (especially NOx) and thus comply with environmental norms. Nevertheless, the use of lean premixed combustion decreases the stability margin of the flames, which are then more prone to being disturbed by flow perturbations. Combustion instabilities are therefore a major concern in modern gas turbine design. Active control systems have been used to ensure the stability of gas turbines retrofitted to lean premixed combustion. The current generation of gas turbines aims to dispense with these control devices, achieving stability through proper design. To do so, precise and well-adapted numerical tools are needed, even if it is at present impossible to guarantee the absolute stability of a combustion chamber at the design stage. Simulation tools for unsteady combustion are now able to compute the whole combustion chamber. Its intrinsic precision allows Large Eddy Simulation (LES) to take into account numerous phenomena involved in combustion instabilities. Chemical modelling is an important element for the precision of reactive LES. This study includes the description of an optimisation tool for reduced chemical kinetics. The capacity of LES to capture combustion instabilities in gas turbine chambers is also demonstrated. The acoustic energy analysis points out that the boundary impedances of combustion systems are of prime importance for their stability. (author)

  10. Optimising qualitative longitudinal analysis: Insights from a study of traumatic brain injury recovery and adaptation.

    Science.gov (United States)

    Fadyl, Joanna K; Channon, Alexis; Theadom, Alice; McPherson, Kathryn M

    2017-04-01

    Knowledge about aspects that influence recovery and adaptation in the postacute phase of disabling health events is key to understanding how best to provide appropriate rehabilitation and health services. Qualitative longitudinal research makes it possible to look for patterns, key time points and critical moments that could be vital for interventions and supports. However, strategies that support robust data management and analysis for longitudinal qualitative research in health-care are not well documented in the literature. This article reviews three challenges encountered in a large longitudinal qualitative descriptive study about experiences of recovery and adaptation after traumatic brain injury in New Zealand, and the strategies and technologies used to address them. These were (i) tracking coding and analysis decisions during an extended analysis period; (ii) navigating interpretations over time and in response to new data; and (iii) exploiting data volume and complexity. Concept mapping during coding review, a considered combination of information technologies, employing both cross-sectional and narrative analysis, and an expectation that subanalyses would be required for key topics helped us manage the study in a way that facilitated useful and novel insights. These strategies could be applied in other qualitative longitudinal studies in healthcare inquiry to optimise data analysis and stimulate important insights. © 2016 John Wiley & Sons Ltd.

  11. Optimising the neutron environment of Radiation Portal Monitors: A computational study

    Science.gov (United States)

    Gilbert, Mark R.; Ghani, Zamir; McMillan, John E.; Packer, Lee W.

    2015-09-01

    Efficient and reliable detection of radiological or nuclear threats is a crucial part of national and international efforts to prevent terrorist activities. Radiation Portal Monitors (RPMs), which are deployed worldwide, are intended to interdict smuggled fissile material by detecting emissions of neutrons and gamma rays. However, considering the range and variety of threat sources, vehicular and shielding scenarios, and that only a small signature is present, it is important that the design of the RPMs allows these signatures to be accurately differentiated from the environmental background. Using Monte-Carlo neutron-transport simulations of a model 3He detector system we have conducted a parameter study to identify the optimum combination of detector shielding, moderation, and collimation that maximises the sensitivity of neutron-sensitive RPMs. These structures, which could be simply and cost-effectively added to existing RPMs, can improve the detector response by more than a factor of two relative to an unmodified, bare design. Furthermore, optimisation of the air gap surrounding the helium tubes also improves detector efficiency.

  12. Study of Rotor Spun Basofil/Cotton Blended Yarn Quality Characteristics during Optimisation of Processing Parameters

    Institute of Scientific and Technical Information of China (English)

    Mwasiagi J.I.; WANG Xin-hou; Tuigong D.R.; Wang J.

    2005-01-01

    Yarn quality characteristics are affected by processing parameters. A 36 tex rotor spun yarn of 50/50 Basofil/cotton (B/C) blend was spun, and the spinning process was optimised for rotor speed, opening roller speed and twist factor (TF). Selected yarn characteristics were studied during the optimization process. During the optimization process, yarn elongation and hairiness decreased with increasing rotor speed, while tenacity increased with increasing rotor speed. Increasing TF caused tenacity and CV of count to rise to a peak and then decrease with further increase of TF. While TF caused an increase in yarn hairiness, elongation decreased to a minimum level and then started to increase with further increase of TF. CV of count and hairiness increased with increasing opening roller speed, but tenacity and elongation decreased with increasing opening roller speed. The optimization process yielded optimum levels for rotor speed, opening roller speed and TF of 45,000 rpm, 6,500 rpm and 450, respectively. As per Uster Standards, the optimum yarn showed good results for CV of count, CV of tenacity and thin places/km.

  13. Optimisation of integrated biodiesel production. Part I. A study of the biodiesel purity and yield.

    Science.gov (United States)

    Vicente, Gemma; Martínez, Mercedes; Aracil, José

    2007-07-01

    This study consists of the development and optimisation of the potassium hydroxide-catalysed synthesis of fatty acid methyl esters (biodiesel) from sunflower oil. A factorial design of experiments and a central composite design have been used. The variables chosen were temperature, initial catalyst concentration by weight of sunflower oil and the methanol:vegetable oil molar ratio, while the responses were biodiesel purity and yield. The initial catalyst concentration is the most important factor, having a positive influence on biodiesel purity but a negative one on biodiesel yield. Temperature has a significant positive effect on biodiesel purity and a significant negative influence on biodiesel yield. The methanol:vegetable oil molar ratio is only significant for biodiesel purity, on which it has a positive influence. Second-order models were obtained to predict biodiesel purity and yield as a function of these variables. The best conditions are 25 °C, a catalyst concentration of 1.3 wt% and a 6:1 methanol:sunflower oil molar ratio.
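
    As an illustration of the kind of second-order model fitted in such central composite designs (the notation is generic; the actual coefficients of this study are not reproduced here), each response y (biodiesel purity or yield) is expressed as

        y = \beta_0 + \sum_{i=1}^{3} \beta_i x_i + \sum_{i=1}^{3} \beta_{ii} x_i^2 + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon

    where x_1, x_2 and x_3 are the coded temperature, catalyst concentration and methanol:oil molar ratio, and the β coefficients are estimated by least squares from the design points.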

  14. Optimising the neutron environment of Radiation Portal Monitors: A computational study

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Mark R., E-mail: mark.gilbert@ccfe.ac.uk [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Ghani, Zamir [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); McMillan, John E. [Department of Physics and Astronomy, University of Sheffield, Hicks building, Hounsfield Road, Sheffield S3 7RH (United Kingdom); Packer, Lee W. [United Kingdom Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-09-21

    Efficient and reliable detection of radiological or nuclear threats is a crucial part of national and international efforts to prevent terrorist activities. Radiation Portal Monitors (RPMs), which are deployed worldwide, are intended to interdict smuggled fissile material by detecting emissions of neutrons and gamma rays. However, considering the range and variety of threat sources, vehicular and shielding scenarios, and that only a small signature is present, it is important that the design of the RPMs allows these signatures to be accurately differentiated from the environmental background. Using Monte-Carlo neutron-transport simulations of a model 3He detector system we have conducted a parameter study to identify the optimum combination of detector shielding, moderation, and collimation that maximises the sensitivity of neutron-sensitive RPMs. These structures, which could be simply and cost-effectively added to existing RPMs, can improve the detector response by more than a factor of two relative to an unmodified, bare design. Furthermore, optimisation of the air gap surrounding the helium tubes also improves detector efficiency.

  15. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  16. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Ruppert, Evelyn; Flyverbom, Mikkel

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of sub-themes that each deserve dedicated scrutiny when studying the interplay between big...

  17. Reservoir optimisation using El Niño information. Case study of Daule Peripa (Ecuador)

    Science.gov (United States)

    Gelati, Emiliano; Madsen, Henrik; Rosbjerg, Dan

    2010-05-01

    The optimisation of water resources systems requires the ability to produce runoff scenarios that are consistent with available climatic information. We approach stochastic runoff modelling with a Markov-modulated autoregressive model with exogenous input, which belongs to the class of Markov-switching models. The model assumes runoff parameterisation to be conditioned on a hidden climatic state following a Markov chain, whose state transition probabilities depend on climatic information. This approach allows stochastic modeling of non-stationary runoff, as runoff anomalies are described by a mixture of autoregressive models with exogenous input, each one corresponding to a climate state. We calibrate the model on the inflows of the Daule Peripa reservoir located in western Ecuador, where the occurrence of El Niño leads to anomalously heavy rainfall caused by positive sea surface temperature anomalies along the coast. El Niño - Southern Oscillation (ENSO) information is used to condition the runoff parameterisation. Inflow predictions are realistic, especially at the occurrence of El Niño events. The Daule Peripa reservoir serves a hydropower plant and a downstream water supply facility. Using historical ENSO records, synthetic monthly inflow scenarios are generated for the period 1950-2007. These scenarios are used as input to perform stochastic optimisation of the reservoir rule curves with a multi-objective Genetic Algorithm (MOGA). The optimised rule curves are assumed to be the reservoir base policy. ENSO standard indices are currently forecasted at monthly time scale with nine-month lead time. These forecasts are used to perform stochastic optimisation of reservoir releases at each monthly time step according to the following procedure: (i) nine-month inflow forecast scenarios are generated using ENSO forecasts; (ii) a MOGA is set up to optimise the upcoming nine monthly releases; (iii) the optimisation is carried out by simulating the releases on the
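
    A hedged sketch of the model class described above (the notation is assumed rather than taken from the record): with a hidden climate state s_t following a Markov chain whose transition probabilities depend on ENSO information z_t, the runoff anomaly q_t is modelled as

        q_t = a_{s_t} q_{t-1} + b_{s_t} u_t + \varepsilon_t, \qquad \varepsilon_t \sim N(0, \sigma_{s_t}^2)
        P(s_{t+1} = j \mid s_t = i, z_t) = p_{ij}(z_t)

    where u_t is the exogenous climatic input and the parameters (a, b, σ) switch with the hidden state; this switching is what makes the simulated runoff non-stationary and responsive to El Niño conditions.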

  18. Knee Kinematics Estimation Using Multi-Body Optimisation Embedding a Knee Joint Stiffness Matrix: A Feasibility Study

    OpenAIRE

    Richard, V.; Lamberto, G.; Lu, T.W.; Cappozzo, A.; Dumas, R

    2016-01-01

    The use of multi-body optimisation (MBO) to estimate joint kinematics from stereophotogrammetric data while compensating for soft tissue artefact is still open to debate. Presently used joint models embedded in MBO, such as mechanical linkages, constitute a considerable simplification of joint function, preventing a detailed understanding of it. The present study proposes a knee joint model where femur and tibia are represented as rigid bodies connected through an elastic element the behaviou...

  19. Optimised motion tracking for positron emission tomography studies of brain function in awake rats.

    Directory of Open Access Journals (Sweden)

    Andre Z Kyme

    Full Text Available Positron emission tomography (PET) is a non-invasive molecular imaging technique using positron-emitting radioisotopes to study functional processes within the body. High resolution PET scanners designed for imaging rodents and non-human primates are now commonplace in preclinical research. Brain imaging in this context, with motion compensation, can potentially enhance the usefulness of PET by avoiding confounds due to anaesthetic drugs and enabling freely moving animals to be imaged during normal and evoked behaviours. Due to the frequent and rapid motion exhibited by alert, awake animals, optimal motion correction requires frequently sampled pose information and precise synchronisation of these data with events in the PET coincidence data stream. Motion measurements should also be as accurate as possible to avoid degrading the excellent spatial resolution provided by state-of-the-art scanners. Here we describe and validate methods for optimised motion tracking suited to the correction of motion in awake rats. A hardware-based synchronisation approach is used to achieve temporal alignment of tracker and scanner data to within 10 ms. We explored the impact of motion tracker synchronisation error, pose sampling rate, rate of motion, and marker size on motion correction accuracy. With accurate synchronisation, a 20 Hz pose sampling rate and a small head marker suitable for awake animal studies, excellent motion correction results were obtained in phantom studies with a variety of continuous motion patterns, including realistic rat motion (<5% bias in mean concentration). Feasibility of the approach was also demonstrated in an awake rat study. We conclude that the motion tracking parameters needed for effective motion correction in preclinical brain imaging of awake rats are achievable in the laboratory setting. This could broaden the scope of animal experiments currently possible with PET.

  20. Structure and weights optimisation of a modified Elman network emotion classifier using hybrid computational intelligence algorithms: a comparative study

    Science.gov (United States)

    Sheikhan, Mansour; Abbasnezhad Arabi, Mahdi; Gharavian, Davood

    2015-10-01

    Artificial neural networks are efficient models in pattern recognition applications, but their performance is dependent on employing suitable structure and connection weights. This study used a hybrid method for obtaining the optimal weight set and architecture of a recurrent neural emotion classifier based on gravitational search algorithm (GSA) and its binary version (BGSA), respectively. By considering the features of speech signal that were related to prosody, voice quality, and spectrum, a rich feature set was constructed. To select more efficient features, a fast feature selection method was employed. The performance of the proposed hybrid GSA-BGSA method was compared with similar hybrid methods based on particle swarm optimisation (PSO) algorithm and its binary version, PSO and discrete firefly algorithm, and hybrid of error back-propagation and genetic algorithm that were used for optimisation. Experimental tests on Berlin emotional database demonstrated the superior performance of the proposed method using a lighter network structure.
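
    For reference, the core update rules of the standard gravitational search algorithm are sketched below (generic formulation; the paper's hybrid GSA-BGSA scheme, its binary component and the coupling with back-propagation are not reproduced here):

        F_{ij}^d(t) = G(t) \frac{M_i(t) M_j(t)}{R_{ij}(t) + \epsilon} \left( x_j^d(t) - x_i^d(t) \right)
        a_i^d(t) = \frac{1}{M_i(t)} \sum_{j \neq i} r_j F_{ij}^d(t), \qquad v_i^d(t+1) = r_i v_i^d(t) + a_i^d(t), \qquad x_i^d(t+1) = x_i^d(t) + v_i^d(t+1)

    where r_i and r_j are uniform random numbers in [0,1], M_i is a mass derived from the fitness of candidate i, R_{ij} is the Euclidean distance between candidates i and j, and G(t) is a gravitational constant that decays over the iterations.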

  1. TELECOM BIG DATA FOR URBAN TRANSPORT ANALYSIS – A CASE STUDY OF SPLIT-DALMATIA COUNTY IN CROATIA

    Directory of Open Access Journals (Sweden)

    M. Baučić

    2017-09-01

    Full Text Available Today, big data has become widely available, and new technologies are being developed for big data storage architecture and big data analytics. An ongoing challenge is how to incorporate big data into GIS applications supporting the various domains. The International Transport Forum explains how the arrival of big data and real-time data, together with new data processing algorithms, leads to new insights and operational improvements in transport. Based on telecom customer data, the Study of Tourist Movement and Traffic in Split-Dalmatia County in Croatia was carried out as a part of the “IPA Adriatic CBC//N.0086/INTERMODAL” project. This paper briefly explains the big data used in the study and the results of the study. Furthermore, this paper investigates the main considerations when using telecom customer big data: data privacy and data quality. The paper concludes with GIS visualisation and proposes further uses of the big data used in the study.

  2. Telecom Big Data for Urban Transport Analysis - a Case Study of Split-Dalmatia County in Croatia

    Science.gov (United States)

    Baučić, M.; Jajac, N.; Bućan, M.

    2017-09-01

    Today, big data has become widely available, and new technologies are being developed for big data storage architecture and big data analytics. An ongoing challenge is how to incorporate big data into GIS applications supporting the various domains. The International Transport Forum explains how the arrival of big data and real-time data, together with new data processing algorithms, leads to new insights and operational improvements in transport. Based on telecom customer data, the Study of Tourist Movement and Traffic in Split-Dalmatia County in Croatia was carried out as a part of the "IPA Adriatic CBC//N.0086/INTERMODAL" project. This paper briefly explains the big data used in the study and the results of the study. Furthermore, this paper investigates the main considerations when using telecom customer big data: data privacy and data quality. The paper concludes with GIS visualisation and proposes further uses of the big data used in the study.

  3. An empirical study on website usability elements and how they affect search engine optimisation

    Directory of Open Access Journals (Sweden)

    Eugene B. Visser

    2011-03-01

    Full Text Available The primary objective of this research project was to identify and investigate the website usability attributes which are in contradiction with search engine optimisation elements. The secondary objective was to determine whether these usability attributes affect conversion. Although the literature review identifies the contradictions, experts disagree about their existence. An experiment was conducted whereby the conversion and/or traffic ratio results of an existing control website were compared to a usability-designed version of the control website, namely the experimental website. All optimisation elements were ignored, thus implementing only usability. The results clearly show that inclusion of the usability attributes positively affects conversion, indicating that usability is a prerequisite for effective website design. Search engine optimisation is also a prerequisite, for the very reason that if a website does not rank on the first page of the search engine results page for a given keyword, then that website might as well not exist. According to this empirical work, usability is in contradiction with search engine optimisation best practices. Therefore the two need to be weighed up in terms of their importance to search engines and visitors.

  4. A study into ant colony optimisation, evolutionary computation and constraint programming on binary constraint satisfaction problems.

    NARCIS (Netherlands)

    J.I. van Hemert; C. Solnon

    2004-01-01

    We compare two heuristic approaches, evolutionary computation and ant colony optimisation, and a complete tree-search approach, constraint programming, for solving binary constraint satisfaction problems. We experimentally show that, if evolutionary computation is far from being able to

  5. Recurrent personality dimensions in inclusive lexical studies: indications for a big six structure.

    Science.gov (United States)

    Saucier, Gerard

    2009-10-01

    Previous evidence for both the Big Five and the alternative six-factor model has been drawn from lexical studies with relatively narrow selections of attributes. This study examined factors from previous lexical studies using a wider selection of attributes in 7 languages (Chinese, English, Filipino, Greek, Hebrew, Spanish, and Turkish) and found 6 recurrent factors, each with common conceptual content across most of the studies. The previous narrow-selection-based six-factor model outperformed the Big Five in capturing the content of the 6 recurrent wideband factors. Adjective markers of the 6 recurrent wideband factors showed substantial incremental prediction of important criterion variables over and above the Big Five. Correspondence between wideband 6 and narrowband 6 factors indicate they are variants of a "Big Six" model that is more general across variable-selection procedures and may be more general across languages and populations.

  6. Multidimensional Big Spatial Data Modeling Through A Case Study: Lte Rf Subsystem Power Consumption Modeling

    DEFF Research Database (Denmark)

    Antón Castro, Francesc/François; Musiige, Deogratius; Mioc, Darka

    2016-01-01

    This paper presents a case study for comparing different multidimensional mathematical modeling methodologies used in multidimensional spatial big data modeling and proposing a new technique. An analysis of multidimensional modeling approaches (neural networks, polynomial interpolation and homotopy...

  7. Modeling and processing for next-generation big-data technologies with applications and case studies

    CERN Document Server

    Barolli, Leonard; Barolli, Admir; Papajorgji, Petraq

    2015-01-01

    This book covers the latest advances in Big Data technologies and provides the readers with a comprehensive review of the state-of-the-art in Big Data processing, analysis, analytics, and other related topics. It presents new models, algorithms, software solutions and methodologies, covering the full data cycle, from data gathering to their visualization and interaction, and includes a set of case studies and best practices. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data are also identified and presented throughout the book, which is intended for researchers, scholars, advanced students, software developers and practitioners working at the forefront in their field.

  8. Engineering Optimisation by Cuckoo Search

    CERN Document Server

    Yang, Xin-She

    2010-01-01

    A new metaheuristic optimisation algorithm, called Cuckoo Search (CS), was developed recently by Yang and Deb (2009). This paper presents a more extensive comparison study using some standard test functions and newly designed stochastic test functions. We then apply the CS algorithm to solve engineering design optimisation problems, including the design of springs and welded beam structures. The optimal solutions obtained by CS are far better than the best solutions obtained by an efficient particle swarm optimiser. We will discuss the unique search features used in CS and the implications for further research.
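    Editor's note: a compact sketch of the core Cuckoo Search loop, minimising a simple sphere test function. The Lévy-flight step uses Mantegna's standard approximation; the population size, step scaling, abandonment fraction and search bounds are illustrative defaults, not values from Yang and Deb's paper.

```python
import math
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def levy_step(dim, rng, beta=1.5):
    """Mantegna's approximation of a Levy-distributed step."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(f, dim=5, n_nests=15, iters=500, pa=0.25, alpha=0.01, seed=0):
    rng = np.random.default_rng(seed)
    nests = rng.uniform(-5, 5, (n_nests, dim))
    fitness = np.array([f(n) for n in nests])
    n_abandon = max(1, int(pa * n_nests))
    for _ in range(iters):
        # New candidate via a Levy flight around a random nest; replace a random nest if better
        i, j = rng.integers(n_nests), rng.integers(n_nests)
        candidate = nests[i] + alpha * levy_step(dim, rng)
        if f(candidate) < fitness[j]:
            nests[j], fitness[j] = candidate, f(candidate)
        # Abandon a fraction pa of the worst nests and build new ones at random
        worst = np.argsort(fitness)[-n_abandon:]
        nests[worst] = rng.uniform(-5, 5, (n_abandon, dim))
        fitness[worst] = [f(n) for n in nests[worst]]
    best = np.argmin(fitness)
    return nests[best], fitness[best]

x_best, f_best = cuckoo_search(sphere)
print("best value found:", f_best)
```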

  9. Neutron Self-decay Characteristic Study on Big Sample by NAA

    Institute of Scientific and Technical Information of China (English)

    SUN; Hong-chao; YUAN; Guo-jun; XIAO; Cai-jin; ZHANG; Zi-zhu; YANG; Wei; JIN; Xiang-chun; ZHANG; Gui-ying; WANG; Ping-sheng; NI; Bang-fa

    2012-01-01

    The advantages of neutron activation analysis are that it is non-destructive, multi-element and highly accurate, but the study of big samples still presents many difficulties. The IAEA organized a CRP cooperative project to find solutions for big sample analysis. In this study, we analyzed the horizontal-beam neutron self-decay characteristics of the in-hospital neutron irradiator (IHNI) when it irradiates high-purity zinc (Zn) and zirconium (Zr) plates.

  10. Optimisation of process parameters in friction stir welding based on residual stress analysis: a feasibility study

    DEFF Research Database (Denmark)

    Tutum, Cem Celal; Hattel, Jesper Henri

    2010-01-01

    The present paper considers the optimisation of process parameters in friction stir welding (FSW). More specifically, the choices of rotational speed and traverse welding speed have been investigated using genetic algorithms. The welding process is simulated in a transient, two-dimensional sequen..., and this is presented as a Pareto optimal front. Moreover, a higher welding speed for a fixed rotational speed results, in general, in slightly higher stress levels in the tension zone, whereas a higher rotational speed for a fixed welding speed yields somewhat lower peak residual stress, however, a wider tension zone...
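    Editor's note: to illustrate the Pareto-front idea used in the record above, the sketch below evaluates a grid of (rotational speed, welding speed) candidates against two invented surrogate objectives (production time and a residual-stress proxy) and extracts the non-dominated set. The objective formulas are purely hypothetical placeholders, not the thermomechanical model of the paper.

```python
import itertools

def objectives(rot_speed_rpm, weld_speed_mm_s):
    """Hypothetical surrogate objectives: lower is better for both."""
    production_time = 1000.0 / weld_speed_mm_s                     # faster traverse -> shorter time
    stress_proxy = 0.5 * weld_speed_mm_s + 2000.0 / rot_speed_rpm  # invented trade-off shape
    return production_time, stress_proxy

def pareto_front(points):
    """Return the non-dominated candidates (minimisation in every objective)."""
    front = []
    for p, fp in points:
        dominated = any(all(fq[k] <= fp[k] for k in range(len(fp))) and fq != fp
                        for _, fq in points)
        if not dominated:
            front.append((p, fp))
    return front

candidates = [((rot, weld), objectives(rot, weld))
              for rot, weld in itertools.product(range(400, 1201, 100), range(2, 11))]
for (rot, weld), (t, s) in sorted(pareto_front(candidates)):
    print(f"rot={rot} rpm, weld={weld} mm/s -> time={t:.1f}, stress proxy={s:.1f}")
```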

  11. A STUDY ON OPTIMISATION OF RESOURCES FOR MULTIPLE PROJECTS BY USING PRIMAVERA

    OpenAIRE

    B. S. K. REDDY; SK. NAGARAJU; MD. SALMAN

    2015-01-01

    Resources play a vital role in construction projects. The performance of the construction industry depends chiefly on how well the resources are managed. Optimisation plays a pivotal role in resource management, but the task is highly haphazard and chaotic under the influence of complexity and vastness. Management always looks for optimum utility of the resources available to them. Hence, project management has an important place, especially in resource allocation and smooth functioning with alloca...

  12. A STUDY ON OPTIMISATION OF RESOURCES FOR MULTIPLE PROJECTS BY USING PRIMAVERA

    Directory of Open Access Journals (Sweden)

    B. S. K. REDDY

    2015-02-01

    Full Text Available Resources play a vital role in construction projects. The performance of the construction industry depends chiefly on how well the resources are managed. Optimisation plays a pivotal role in resource management, but the task is highly haphazard and chaotic under the influence of complexity and vastness. Management always looks for optimum utility of the resources available to them. Hence, project management has an important place, especially in resource allocation and smooth functioning within the allocated budget. To achieve these goals and to enhance optimisation, certain tools are used to allocate resources optimally. The present work illustrates resource optimisation exercises on two ongoing projects in Dubai, UAE. The resource demands of projects A & B are individually levelled, and the observed cumulative requirement is 17475. In the other option, the demands of projects A & B are aggregated and then levelled together; the necessary resource observed is 16490. Comparison of the individually-levelled-then-combined option with the aggregated-then-levelled option clearly indicates a 5.65% reduction in resource demand for the latter option, which could be best considered for economy.
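    Editor's note: a tiny numerical check of the comparison reported above. Only the two totals quoted in the record (17475 and 16490 resource units) are taken from the study; the script simply computes the saving they imply.

```python
# Totals quoted in the record above (resource units)
individually_levelled_then_combined = 17475
aggregated_then_levelled = 16490

saving = individually_levelled_then_combined - aggregated_then_levelled
saving_pct = 100.0 * saving / individually_levelled_then_combined
print(f"absolute saving: {saving} resource units")   # 985
print(f"relative saving: {saving_pct:.2f}%")         # ~5.64%, close to the 5.65% reported in the study
```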

  13. Big Bang 6Li nucleosynthesis studied deep underground (LUNA collaboration)

    Science.gov (United States)

    Trezzi, D.; Anders, M.; Aliotta, M.; Bellini, A.; Bemmerer, D.; Boeltzig, A.; Broggini, C.; Bruno, C. G.; Caciolli, A.; Cavanna, F.; Corvisiero, P.; Costantini, H.; Davinson, T.; Depalo, R.; Elekes, Z.; Erhard, M.; Ferraro, F.; Formicola, A.; Fülop, Zs.; Gervino, G.; Guglielmetti, A.; Gustavino, C.; Gyürky, Gy.; Junker, M.; Lemut, A.; Marta, M.; Mazzocchi, C.; Menegazzo, R.; Mossa, V.; Pantaleo, F.; Prati, P.; Rossi Alvarez, C.; Scott, D. A.; Somorjai, E.; Straniero, O.; Szücs, T.; Takacs, M.

    2017-03-01

    The correct prediction of the abundances of the light nuclides produced during the epoch of Big Bang Nucleosynthesis (BBN) is one of the main topics of modern cosmology. For many of the nuclear reactions that are relevant for this epoch, direct experimental cross section data are available, ushering the so-called "age of precision". The present work addresses an exception to this current status: the 2H(α,γ)6Li reaction that controls 6Li production in the Big Bang. Recent controversial observations of 6Li in metal-poor stars have heightened the interest in understanding primordial 6Li production. If confirmed, these observations would lead to a second cosmological lithium problem, in addition to the well-known 7Li problem. In the present work, the direct experimental cross section data on 2H(α,γ)6Li in the BBN energy range are reported. The measurement has been performed deep underground at the LUNA (Laboratory for Underground Nuclear Astrophysics) 400 kV accelerator in the Laboratori Nazionali del Gran Sasso, Italy. The cross section has been directly measured at the energies of interest for Big Bang Nucleosynthesis for the first time, at Ecm = 80, 93, 120, and 133 keV. Based on the new data, the 2H(α,γ)6Li thermonuclear reaction rate has been derived. Our rate is even lower than previously reported, thus increasing the discrepancy between predicted Big Bang 6Li abundance and the amount of primordial 6Li inferred from observations.

  14. Physicists create a "perfect" way to study the Big Bang

    CERN Multimedia

    2005-01-01

    Physicists have created the state of matter thought to have filled the Universe just a few microseconds after the big bang and found it to be different from what they were expecting: instead of a gas, it is more like a liquid. Understanding why it is a liquid should take physicists a step closer to explaining the earliest moments of our Universe (1 ½ page)

  15. Experimental study of an optimised Pyramid wave-front sensor for Extremely Large Telescopes

    Science.gov (United States)

    Bond, Charlotte Z.; El Hadi, Kacem; Sauvage, Jean-François; Correia, Carlos; Fauvarque, Olivier; Rabaud, Didier; Lamb, Masen; Neichel, Benoit; Fusco, Thierry

    2016-07-01

    Over the last few years the Laboratoire d'Astrophysique de Marseille (LAM) has been heavily involved in R&D for adaptive optics systems dedicated to future large telescopes, particularly in preparation for the European Extremely Large Telescope (E-ELT). Within this framework an investigation into a Pyramid wave-front sensor is underway. The Pyramid sensor is at the cutting edge of high order, high precision wave-front sensing for ground based telescopes. Investigations have demonstrated the ability to achieve a greater sensitivity than the standard Shack-Hartmann wave-front sensor whilst the implementation of a Pyramid sensor on the Large Binocular Telescope (LBT) has provided compelling operational results.1, 2 The Pyramid now forms part of the baseline for several next generation Extremely Large Telescopes (ELTs). As such its behaviour under realistic operating conditions must be further understood in order to optimise performance. At LAM a detailed investigation into the performance of the Pyramid aims to fully characterise the behaviour of this wave-front sensor in terms of linearity, sensitivity and operation. We have implemented a Pyramid sensor using a high speed OCAM2 camera (with close to 0 readout noise and a frame rate of 1.5kHz) in order to study the performance of the Pyramid within a full closed loop adaptive optics system. This investigation involves tests on all fronts, from theoretical models and numerical simulations to experimental tests under controlled laboratory conditions, with an aim to fully understand the Pyramid sensor in both modulated and non-modulated configurations. We include results demonstrating the linearity of the Pyramid signals, compare measured interaction matrices with those derived in simulation and evaluate the performance in closed loop operation. The final goal is to provide an on sky comparison between the Pyramid and a Shack-Hartmann wave-front sensor, at Observatoire de la Côte d'Azur (ONERA-ODISSEE bench). Here we

  16. Big-data-based edge biomarkers: study on dynamical drug sensitivity and resistance in individuals.

    Science.gov (United States)

    Zeng, Tao; Zhang, Wanwei; Yu, Xiangtian; Liu, Xiaoping; Li, Meiyi; Chen, Luonan

    2016-07-01

    Big-data-based edge biomarker is a new concept to characterize disease features based on biomedical big data in a dynamical and network manner, which also provides alternative strategies to indicate disease status in single samples. This article gives a comprehensive review on big-data-based edge biomarkers for complex diseases in an individual patient, which are defined as biomarkers based on network information and high-dimensional data. Specifically, we firstly introduce the sources and structures of biomedical big data accessible in public for edge biomarker and disease study. We show that biomedical big data are typically 'small-sample size in high-dimension space', i.e. small samples but with high dimensions on features (e.g. omics data) for each individual, in contrast to traditional big data in many other fields characterized as 'large-sample size in low-dimension space', i.e. big samples but with low dimensions on features. Then, we demonstrate the concept, model and algorithm for edge biomarkers and further big-data-based edge biomarkers. Dissimilar to conventional biomarkers, edge biomarkers, e.g. module biomarkers in module network rewiring-analysis, are able to predict the disease state by learning differential associations between molecules rather than differential expressions of molecules during disease progression or treatment in individual patients. In particular, in contrast to using the information of the common molecules or edges (i.e.molecule-pairs) across a population in traditional biomarkers including network and edge biomarkers, big-data-based edge biomarkers are specific for each individual and thus can accurately evaluate the disease state by considering the individual heterogeneity. Therefore, the measurement of big data in a high-dimensional space is required not only in the learning process but also in the diagnosing or predicting process of the tested individual. Finally, we provide a case study on analyzing the temporal expression

  17. Big Data: Big Confusion? Big Challenges?

    Science.gov (United States)

    2015-05-01

    Briefing presented at the 12th Annual Acquisition Research Symposium by Mary Maureen...: "Big Data: Big Confusion? Big Challenges?" It notes that 90% of the data in the world today was created in the last two years, and charts Big Data growth from

  18. Big Fish in a Big Pond: a study of academic self concept in first year medical students

    Directory of Open Access Journals (Sweden)

    Seaton Marjorie

    2011-07-01

    Full Text Available Abstract Background Big-fish-little-pond effect (BFLPE) research has demonstrated that students in high-ability environments have lower academic self-concepts than equally able students in low-ability settings. Research has shown low academic self-concepts to be associated with negative educational outcomes. Social comparison processes have been implicated as fundamental to the BFLPE. Methods Twenty first-year students in an Australian medical school completed a survey that included academic self-concept and social comparison measures, before and after their first written assessments. Focus groups were also conducted with a separate group of students to explore students' perceptions of competence, the medical school environment, and social comparison processes. Results The quantitative study did not reveal any changes in academic self-concept or self-evaluation. The qualitative study suggested that the attributions that students used when discussing performance were those that have been demonstrated to negatively affect self-concept. Students reported that the environment was slightly competitive and they used social comparison to evaluate their performance. Conclusions Although the BFLPE was not evident in the quantitative study, results from the qualitative study suggest that the BFLPE might be operating, in that students were using attributions that are associated with lower self-concepts, the environment was slightly competitive, and social comparisons were used for evaluation.

  19. Optimising survey effort to monitor environmental variables: A case study using New Zealand kiwifruit orchards.

    Science.gov (United States)

    MacLeod, Catriona J; Green, Peter; Tompkins, Daniel M; Benge, Jayson; Moller, Henrik

    2016-12-01

    Environmental monitoring is increasingly used to assess spatial and temporal trends in agricultural sustainability, and to test the effectiveness of farm management policies. However, detecting changes in environmental variables is often technically and logistically challenging. To demonstrate how survey effort for environmental monitoring can be optimised, we applied the new statistical power analysis R package simr to pilot survey data. Specifically, we identified the amount of survey effort required to have an 80% chance of detecting specified trends (-1 to -4% pa) in 13 environmental variables on New Zealand kiwifruit orchards within an 11-year period. The variables assessed were related to soil status, agricultural pests (birds), or ecosystem composition (birds). Analyses were conducted on average values (for each orchard and year combination) to provide a consistent scale for comparison among variables. Survey frequency varied from annual (11 surveys) to every 5 years (3 surveys). Survey size was set at either 30, 60, 150 or 300 orchards. In broad terms, we show the power to detect a specified range of trends over an 11-year period in this sector is much higher for 'soil status' than for 'agricultural pest' or 'ecosystem composition', with changes in one subset of native bird species (nectar-feeders) requiring a particularly high level of relative survey effort to detect with confidence. Monitoring programmes for soil status can thus be smaller and less frequent than those which also aim to detect changes in agricultural pests or ecosystem composition (with the latter requiring the most effort), but this will depend on the magnitude of change that is meaningful to detect. This assessment thus allows the kiwifruit industry in New Zealand to optimise survey design for the desired information, and provides a template for other industries to do likewise. Power analyses are now more accessible through the provision of the simr package, so deploying and integrating them into design and decision
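    Editor's note: simr is an R package; as a rough Python analogue of the simulation-based power analysis described above, the sketch below estimates the power to detect a -2% per-annum trend over an 11-year period by repeatedly simulating noisy orchard-level means and testing the fitted slope. The effect size, noise level and baseline are illustrative assumptions, not values from the kiwifruit dataset.

```python
import numpy as np
from scipy import stats

def simulate_power(trend_pct=-2.0, years=11, n_orchards=60, cv=0.3,
                   n_sim=2000, alpha=0.05, seed=0):
    """Share of simulations in which a declining linear trend is detected (p < alpha, slope < 0)."""
    rng = np.random.default_rng(seed)
    t = np.arange(years)
    true_mean = 100.0 * (1.0 + trend_pct / 100.0) ** t   # geometric decline from a baseline of 100
    detected = 0
    for _ in range(n_sim):
        # Yearly value = average of n_orchards noisy orchard measurements
        yearly = rng.normal(true_mean, cv * true_mean / np.sqrt(n_orchards))
        slope, _, _, p_value, _ = stats.linregress(t, yearly)
        if p_value < alpha and slope < 0:
            detected += 1
    return detected / n_sim

for n in (30, 60, 150, 300):
    print(f"{n:>3} orchards, annual surveys: power ~ {simulate_power(n_orchards=n):.2f}")
```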

  20. Big Data technology in traffic: A case study of automatic counters

    Directory of Open Access Journals (Sweden)

    Janković Slađana R.

    2016-01-01

    Full Text Available Modern information and communication technologies, together with intelligent devices, provide a continuous inflow of large amounts of data that are used by traffic and transport systems. Collecting traffic data does not represent a challenge nowadays, but issues remain in relation to storing and processing increasing amounts of data. In this paper we have investigated the possibilities of using Big Data technology to store and process data in the transport domain. The term Big Data refers to a large volume of information resources, and to their velocity and variety, far beyond the capabilities of commonly used software for storing, processing and data management. In our case study, the Apache™ Hadoop® Big Data platform was used for processing data collected from 10 automatic traffic counters set up in Novi Sad and its surroundings. Indicators of traffic load, which were calculated using the Big Data platform, were presented using tables and graphs in the Microsoft Office Excel tool. The visualization and geolocation of the obtained indicators were performed using Microsoft Business Intelligence (BI) tools such as Excel Power View and Excel Power Map. This case study showed that Big Data technologies combined with BI tools can be used as reliable support in monitoring traffic management systems.
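    Editor's note: a minimal sketch, outside of Hadoop, of the kind of map/reduce-style aggregation the case study runs over counter records: each record (counter_id, timestamp, vehicle count) is mapped to a (counter, hour) key and reduced to an hourly traffic-load indicator. The field names and sample records are invented for illustration.

```python
from collections import defaultdict
from datetime import datetime

# Invented sample of raw counter records: (counter_id, ISO timestamp, vehicles in interval)
records = [
    ("C01", "2015-06-01T07:05:00", 42),
    ("C01", "2015-06-01T07:35:00", 55),
    ("C01", "2015-06-01T08:10:00", 61),
    ("C02", "2015-06-01T07:20:00", 17),
    ("C02", "2015-06-01T08:40:00", 23),
]

def map_record(record):
    """Map phase: key each record by (counter, hour of day)."""
    counter_id, ts, vehicles = record
    hour = datetime.fromisoformat(ts).replace(minute=0, second=0)
    return (counter_id, hour.isoformat()), vehicles

def reduce_counts(mapped):
    """Reduce phase: sum vehicle counts per key to get an hourly load indicator."""
    totals = defaultdict(int)
    for key, vehicles in mapped:
        totals[key] += vehicles
    return dict(totals)

hourly_load = reduce_counts(map(map_record, records))
for (counter, hour), total in sorted(hourly_load.items()):
    print(f"{counter} {hour}: {total} vehicles/hour")
```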

  1. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  2. Big Data access and infrastructure for modern biology: case studies in data repository utility.

    Science.gov (United States)

    Boles, Nathan C; Stone, Tyler; Bergeron, Charles; Kiehl, Thomas R

    2017-01-01

    Big Data is no longer solely the purview of big organizations with big resources. Today's routine tools and experimental methods can generate large slices of data. For example, high-throughput sequencing can quickly interrogate biological systems for the expression levels of thousands of different RNAs, examine epigenetic marks throughout the genome, and detect differences in the genomes of individuals. Multichannel electrophysiology platforms produce gigabytes of data in just a few minutes of recording. Imaging systems generate videos capturing biological behaviors over the course of days. Thus, any researcher now has access to a veritable wealth of data. However, the ability of any given researcher to utilize that data is limited by her/his own resources and skills for downloading, storing, and analyzing the data. In this paper, we examine the necessary resources required to engage Big Data, survey the state of modern data analysis pipelines, present a few data repository case studies, and touch on current institutions and programs supporting the work that relies on Big Data. © 2016 New York Academy of Sciences.

  3. Device Data Ingestion for Industrial Big Data Platforms with a Case Study.

    Science.gov (United States)

    Ji, Cun; Shao, Qingshi; Sun, Jiao; Liu, Shijun; Pan, Li; Wu, Lei; Yang, Chenglei

    2016-02-26

    Despite having played a significant role in the Industry 4.0 era, the Internet of Things is currently faced with the challenge of how to ingest large-scale heterogeneous and multi-type device data. In response to this problem we present a heterogeneous device data ingestion model for an industrial big data platform. The model includes device templates and four strategies for data synchronization, data slicing, data splitting and data indexing, respectively. We can ingest device data from multiple sources with this heterogeneous device data ingestion model, which has been verified on our industrial big data platform. In addition, we present a case study on device data-based scenario analysis of industrial big data.

  4. Device Data Ingestion for Industrial Big Data Platforms with a Case Study

    Science.gov (United States)

    Ji, Cun; Shao, Qingshi; Sun, Jiao; Liu, Shijun; Pan, Li; Wu, Lei; Yang, Chenglei

    2016-01-01

    Despite having played a significant role in the Industry 4.0 era, the Internet of Things is currently faced with the challenge of how to ingest large-scale heterogeneous and multi-type device data. In response to this problem we present a heterogeneous device data ingestion model for an industrial big data platform. The model includes device templates and four strategies for data synchronization, data slicing, data splitting and data indexing, respectively. We can ingest device data from multiple sources with this heterogeneous device data ingestion model, which has been verified on our industrial big data platform. In addition, we present a case study on device data-based scenario analysis of industrial big data. PMID:26927121
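    Editor's note: a small sketch of the template/slicing/indexing ideas behind the ingestion model described in the two records above, under the assumption that heterogeneous device readings arrive as dictionaries validated against a per-device-type template. The template fields, slice size and index structure are invented for illustration and are not the platform's actual schema.

```python
from collections import defaultdict

# Hypothetical device templates: required fields per device type
TEMPLATES = {
    "vibration_sensor": {"device_id", "timestamp", "rms"},
    "power_meter": {"device_id", "timestamp", "kwh"},
}

def validate(reading, device_type):
    """Keep only readings that carry every field required by the device template."""
    return TEMPLATES[device_type].issubset(reading)

def slice_stream(readings, slice_size=3):
    """Data slicing: cut the incoming stream into fixed-size slices for batch ingestion."""
    for start in range(0, len(readings), slice_size):
        yield readings[start:start + slice_size]

def build_index(slices):
    """Data indexing: record which slice numbers contain each device's data."""
    index = defaultdict(set)
    for slice_no, chunk in enumerate(slices):
        for reading in chunk:
            index[reading["device_id"]].add(slice_no)
    return index

stream = [
    ("power_meter", {"device_id": "pm-7", "timestamp": 1, "kwh": 0.4}),
    ("vibration_sensor", {"device_id": "vs-2", "timestamp": 1, "rms": 0.02}),
    ("power_meter", {"device_id": "pm-7", "timestamp": 2, "kwh": 0.5}),
    ("power_meter", {"device_id": "pm-7", "timestamp": 3}),   # missing 'kwh', dropped
]
valid = [r for device_type, r in stream if validate(r, device_type)]
slices = list(slice_stream(valid, slice_size=2))
index = build_index(slices)
print(len(valid), "valid readings in", len(slices), "slices; index:", dict(index))
```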

  5. Big Data Science Education: A Case Study of a Project-Focused Introductory Course

    Science.gov (United States)

    Saltz, Jeffrey; Heckman, Robert

    2015-01-01

    This paper reports on a case study of a project-focused introduction to big data science course. The pedagogy of the course leveraged boundary theory, where students were positioned to be at the boundary between a client's desire to understand their data and the academic class. The results of the case study demonstrate that using live clients…

  6. On Subtitle Translation of Sitcoms-A Case Study of The Big Bang Theory

    Institute of Scientific and Technical Information of China (English)

    杨雯婷

    2013-01-01

    As we all know, exquisite subtitle translation of foreign films and television series is a vital element in their spread among Chinese audiences. This article draws on Eugene Nida's "Functional Equivalence" principle, together with three characteristics of sitcom subtitles, to study the type, form and features of The Big Bang Theory, leading to conclusions about the characteristics of sitcom subtitles. It helps us to analyze its subtitles from six aspects. As a result, the author of the paper draws conclusions about translation tactics for The Big Bang Theory, which could help the subtitle translation of similar sitcoms.

  7. Design of farm winery façades for the optimisation of indoor natural lighting: a case study

    Directory of Open Access Journals (Sweden)

    Daniele Torreggiani

    2013-06-01

    Full Text Available This paper deals with the theme of daylighting performance of rural buildings, within a broader research context aimed at establishing design criteria for farm wineries. The objective is to benchmark the performance of different window systems in order to define design guidelines directed towards the optimisation of natural lighting to improve visual comfort and reduce energy consumption. A winegrowing and producing farm with standard features in the Emilia-Romagna region, Northern Italy, is considered as a case study. Particular attention was given to the part of the building designated for tasting activities. The study considered several opening solutions in the building envelope, and showed the effectiveness of those involving south façade glazing with appropriate screening systems. Further analyses will aim to investigate the performance of windows distributed on several fronts, including heat balance assessment.

  8. Optimising parallel R correlation matrix calculations on gene expression data using MapReduce.

    Science.gov (United States)

    Wang, Shicai; Pandis, Ioannis; Johnson, David; Emam, Ibrahim; Guitton, Florian; Oehmichen, Axel; Guo, Yike

    2014-11-05

    High-throughput molecular profiling data has been used to improve clinical decision making by stratifying subjects based on their molecular profiles. Unsupervised clustering algorithms can be used for stratification purposes. However, the current speed of the clustering algorithms cannot meet the requirement of large-scale molecular data due to poor performance of the correlation matrix calculation. With high-throughput sequencing technologies promising to produce even larger datasets per subject, we expect the performance of the state-of-the-art statistical algorithms to be further impacted unless efforts towards optimisation are carried out. MapReduce is a widely used high performance parallel framework that can solve the problem. In this paper, we evaluate the current parallel modes for correlation calculation methods and introduce an efficient data distribution and parallel calculation algorithm based on MapReduce to optimise the correlation calculation. We studied the performance of our algorithm using two gene expression benchmarks. In the micro-benchmark, our implementation using MapReduce, based on the R package RHIPE, demonstrates a 3.26-5.83 fold increase compared to the default Snowfall and a 1.56-1.64 fold increase compared to the basic RHIPE in the Euclidean, Pearson and Spearman correlations. Though vanilla R and the optimised Snowfall outperform our optimised RHIPE in the micro-benchmark, they do not scale well with the macro-benchmark. In the macro-benchmark the optimised RHIPE performs 2.03-16.56 times faster than vanilla R. Benefiting from the 3.30-5.13 times faster data preparation, the optimised RHIPE performs 1.22-1.71 times faster than the optimised Snowfall. Both the optimised RHIPE and the optimised Snowfall successfully perform the Kendall correlation with the TCGA dataset within 7 hours. Both of them run more than 30 times faster than the estimated vanilla R. The performance evaluation found that the new MapReduce algorithm and its
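    Editor's note: the record above is about distributing a correlation-matrix computation; as a minimal illustration outside of Hadoop/RHIPE, the sketch below splits the rows (genes) of an expression matrix into blocks and computes each block's Pearson correlations against all genes in parallel with Python's multiprocessing, then stitches the blocks together. Matrix sizes and the chunking scheme are arbitrary.

```python
import numpy as np
from multiprocessing import Pool

def standardise(rows):
    """Z-score each row (gene) using the population standard deviation."""
    mu = rows.mean(axis=1, keepdims=True)
    sd = rows.std(axis=1, keepdims=True)
    return (rows - mu) / sd

def chunk_correlation(args):
    """Pearson correlations between one block of genes and all genes."""
    z_block, z_all = args
    return z_block @ z_all.T / z_all.shape[1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    expression = rng.normal(size=(400, 50))          # 400 genes x 50 samples (toy sizes)
    z = standardise(expression)
    blocks = np.array_split(z, 4)                    # split the genes into 4 blocks
    with Pool(processes=4) as pool:
        parts = pool.map(chunk_correlation, [(b, z) for b in blocks])
    corr = np.vstack(parts)                          # full 400 x 400 correlation matrix
    print("max |difference| vs numpy.corrcoef:",
          float(np.max(np.abs(corr - np.corrcoef(expression)))))
```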

  9. Multi-objective optimisation with stochastic discrete-event simulation in retail banking: a case study

    Directory of Open Access Journals (Sweden)

    E Scholtz

    2012-12-01

    Full Text Available The cash management of an automated teller machine (ATM) is a multi-objective optimisation problem which aims to maximise the service level provided to customers at minimum cost. This paper focuses on improved cash management in a section of the South African retail banking industry, for which a decision support system (DSS) was developed. This DSS integrates four Operations Research (OR) methods: the vehicle routing problem (VRP), the continuous review policy for inventory management, the knapsack problem and stochastic, discrete-event simulation. The DSS was applied to an ATM network in the Eastern Cape, South Africa, to investigate 90 different scenarios. Results show that the application of a formal vehicle routing method consistently yields higher service levels at lower cost when compared to two other routing approaches, in conjunction with selected ATM reorder levels and a knapsack-based notes dispensing algorithm. It is concluded that the use of vehicle routing methods is especially beneficial when the bank has substantial control over transportation cost.
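    Editor's note: as a hedged illustration of the "knapsack-based notes dispensing" component mentioned above (not the DSS's actual algorithm), the sketch below fills an ATM cassette load up to a cash target using a simple greedy choice over note denominations, subject to per-denomination slot capacities. Denominations and capacities are invented.

```python
def greedy_note_load(target_amount, denominations, slot_capacity):
    """Greedily load the largest notes first until the cash target (or slot capacity) is reached.

    A simple stand-in for the knapsack-style dispensing decision; a true knapsack
    formulation would optimise value subject to the same capacity constraints.
    """
    load = {}
    remaining = target_amount
    for note in sorted(denominations, reverse=True):
        count = min(remaining // note, slot_capacity[note])
        load[note] = count
        remaining -= count * note
    return load, target_amount - remaining

# Invented example: load as close as possible to R200,000 in cash
denoms = [200, 100, 50, 20]
capacity = {200: 500, 100: 800, 50: 1000, 20: 1000}
load, loaded_value = greedy_note_load(200_000, denoms, capacity)
print("notes per denomination:", load)
print("value loaded:", loaded_value)
```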

  10. The relationship between the big five personality factors and burnout : A study among volunteer counselors

    NARCIS (Netherlands)

    Bakker, AB; Van der Zee, KI; Lewig, KA; Dollard, MF

    2006-01-01

    In the present study of 80 volunteer counselors who cared for terminally ill patients, the authors examined the relationship between burnout as measured by the Maslach Burnout Inventory (C. Maslach, S. E. Jackson, & M. P. Leiter, 1996) and the 5 basic (Big Five) personality factors (A. A. J. Hendrik

  11. The relationship between the big five personality factors and burnout : A study among volunteer counselors

    NARCIS (Netherlands)

    Bakker, A.B.; Van der Zee, K.I.; Lewig, K.A.; Dollard, M.F.

    2006-01-01

    In the present study of 80 volunteer counselors who cared for terminally ill patients, the authors examined the relationship between burnout as measured by the Maslach Burnout Inventory (C. Maslach, S. E. Jackson, & M. P. Leiter, 1996) and the 5 basic (Big Five) personality factors (A. A. J. Hendrik

  12. Dose optimisation in single plane interstitial brachytherapy

    DEFF Research Database (Denmark)

    Tanderup, Kari; Hellebust, Taran Paulsen; Honoré, Henriette Benedicte;

    2006-01-01

    BACKGROUND AND PURPOSE: Brachytherapy dose distributions can be optimised by modulation of source dwell times. In this study dose optimisation in single planar interstitial implants was evaluated in order to quantify the potential benefit in patients. MATERIAL AND METHODS: In 14...

  13. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  14. A structural study for the optimisation of functional motifs encoded in protein sequences

    Directory of Open Access Journals (Sweden)

    Helmer-Citterich Manuela

    2004-04-01

    Full Text Available Abstract Background A large number of PROSITE patterns select false positives and/or miss known true positives. It is possible that – at least in some cases – the weak specificity and/or sensitivity of a pattern is due to the fact that one, or maybe more, functional and/or structural key residues are not represented in the pattern. Multiple sequence alignments are commonly used to build functional sequence patterns. If residues structurally conserved in proteins sharing a function cannot be aligned in a multiple sequence alignment, they are likely to be missed in a standard pattern construction procedure. Results Here we present a new procedure aimed at improving the sensitivity and/or specificity of poorly-performing patterns. The procedure can be summarised as follows: 1. residues structurally conserved in different proteins, that are true positives for a pattern, are identified by means of a computational technique and by visual inspection. 2. the sequence positions of the structurally conserved residues falling outside the pattern are used to build extended sequence patterns. 3. the extended patterns are optimised on the SWISS-PROT database for their sensitivity and specificity. The method was applied to eight PROSITE patterns. Whenever structurally conserved residues are found in the surface region close to the pattern (seven out of eight cases), the addition of information inferred from structural analysis is shown to improve pattern selectivity and in some cases selectivity and sensitivity as well. In some of the cases considered the procedure allowed the identification of functionally interesting residues, whose biological role is also discussed. Conclusion Our method can be applied to any type of functional motif or pattern (not only PROSITE ones) which is not able to select all and only the true positive hits and for which at least two true positive structures are available. The computational technique for the identification of
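    Editor's note: a compact sketch of the pattern-evaluation step in the procedure described above: a PROSITE-style pattern is translated to a regular expression and scored for sensitivity and specificity on labelled sequences. The example pattern and the toy sequence sets are invented, and the translation handles only a simplified pattern syntax; the real procedure optimises extended patterns against SWISS-PROT.

```python
import re

def prosite_to_regex(pattern):
    """Very simplified PROSITE-to-regex translation (handles fixed residues, 'x' and x(n) only)."""
    regex = pattern.replace("-", "").replace("x", ".")
    return re.sub(r"\((\d+)\)", r"{\1}", regex)     # x(12) -> .{12}

def evaluate(pattern, positives, negatives):
    """Sensitivity = hit rate on true positives; specificity = rejection rate on negatives."""
    rx = re.compile(prosite_to_regex(pattern))
    tp = sum(bool(rx.search(s)) for s in positives)
    tn = sum(not rx.search(s) for s in negatives)
    return tp / len(positives), tn / len(negatives)

# Invented zinc-finger-like pattern and toy sequences (built so the spacing is explicit)
pattern = "C-x(2)-C-x(12)-H-x(3)-H"
positives = ["MK" + "C" + "AA" + "C" + "A" * 12 + "H" + "AAA" + "H",
             "GG" + "C" + "TT" + "C" + "G" * 12 + "H" + "TTT" + "H" + "KK"]
negatives = ["MKKKKKKAAAAAAAHHHH", "CCCCCCCCCCCC"]

sens, spec = evaluate(pattern, positives, negatives)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```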

  15. Eysenck's BIG THREE and Communication Traits: Three Correlational Studies.

    Science.gov (United States)

    McCroskey, James C.; Heisel, Alan D.; Richmond, Virginia P.

    2001-01-01

    Examines the relationship between H. Eysenck's personality dimensions (extraversion, neuroticism, and psychoticism) and communication variables, in three separate studies encompassing more than a dozen communication variables. Finds consistent patterns across the three studies. (SR)

  16. Big data, big governance

    NARCIS (Netherlands)

    Reep, Frans van der

    2016-01-01

    “Of course it is nice that my refrigerator orders milk by itself on the basis of data-related patterns. Deep learning based on big data holds great promise,” says Frans van der Reep of Inholland. No wonder that this will be a main theme at the Hannover Messe during ScienceGuide's Wissenstag.

  17. Computer Based Optimisation Routines

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    In this paper the need for optimisation methods for the laser cutting process has been identified as three different situations. Demands on the optimisation methods for these situations are presented, and one method for each situation is suggested. The adaptation and implementation of the methods...

  18. Computer Based Optimisation Routines

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    In this paper the need for optimisation methods for the laser cutting process has been identified as three different situations. Demands on the optimisation methods for these situations are presented, and one method for each situation is suggested. The adaptation and implementation of the methods...

  19. Optimal optimisation in chemometrics

    NARCIS (Netherlands)

    Hageman, Joseph Albert

    2004-01-01

    The use of global optimisation methods is not straightforward, especially for the more difficult optimisation problems. Solutions have to be found for items such as the evaluation function, representation, step function and meta-parameters, before any useful results can be obtained. This thesis aims

  20. Application Study of Self-balanced Testing Method on Big Diameter Rock-socketed Piles

    Directory of Open Access Journals (Sweden)

    Qing-biao WANG

    2013-07-01

    Full Text Available Through a technological test of the self-balanced testing method on big-diameter rock-socketed piles for the broadcasting centre building of Tai'an, this paper studies and analyzes the selection of the balance position, the production and installation of the load cell, the selection and installation of the displacement sensors, the loading steps, the stability conditions and the determination of the bearing capacity in the process of self-balanced testing. The paper summarizes the key technology and engineering experience of the self-balanced testing method for big-diameter rock-socketed piles and, meanwhile, analyzes the difficult technical problems that urgently need to be resolved at present. The conclusions of the study are of great significance for the popularization and application of the self-balanced testing method in similar projects.

  1. Allocation of solid waste collection bins and route optimisation using geographical information system: A case study of Dhanbad City, India.

    Science.gov (United States)

    Khan, D; Samadder, S R

    2016-07-01

    Collection of municipal solid waste is one of the most important elements of municipal waste management and requires the largest share of the funds allocated for waste management. The cost of collection and transportation can be reduced in comparison with the present scenario if the solid waste collection bins are located at suitable places so that the collection routes are minimised. This study presents a method for allocating solid waste collection bins at appropriate, uniformly spaced and easily accessible locations so that the collection vehicle routes are minimised, for the city of Dhanbad, India. The network analyst tool set available in ArcGIS was used to find the optimised route for solid waste collection, considering all the required parameters for efficient solid waste collection. These parameters include the positions of solid waste collection bins, the road network, the population density, waste collection schedules, truck capacities and their characteristics. The present study also demonstrates the significant cost reductions that can be obtained compared with the current practices in the study area. The vehicle routing problem solver tool of ArcGIS was used to identify the cost-effective scenario for waste collection, to estimate its running costs and to simulate its application considering both travel time and travel distance simultaneously.
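    Editor's note: the routing side of the study relies on ArcGIS Network Analyst; as a rough stand-alone illustration of why bin placement affects route length, the sketch below compares a nearest-neighbour collection tour over two alternative bin layouts using straight-line distances. The coordinates and depot location are invented.

```python
import math

def tour_length(depot, stops):
    """Nearest-neighbour collection tour starting and ending at the depot (straight-line distances)."""
    remaining = list(stops)
    route, pos, total = [depot], depot, 0.0
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(pos, p))
        total += math.dist(pos, nxt)
        remaining.remove(nxt)
        route.append(nxt)
        pos = nxt
    total += math.dist(pos, depot)   # return to the depot
    return total, route + [depot]

depot = (0.0, 0.0)
scattered_bins = [(9, 1), (2, 8), (7, 7), (1, 3), (8, 4)]   # ad hoc placement
clustered_bins = [(2, 2), (3, 2), (4, 3), (5, 3), (6, 4)]   # bins along a main road

for name, bins in (("scattered", scattered_bins), ("clustered", clustered_bins)):
    length, _ = tour_length(depot, bins)
    print(f"{name:>9} layout: tour length ~ {length:.1f} distance units")
```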

  2. The big five as tendencies in situations : A replication study

    NARCIS (Netherlands)

    Hendriks, AAJ

    1996-01-01

    Van Heck, Perugini, Caprara and Froger (1994) report the average generalizability coefficient reflecting the consistent ordering of persons across different situations and different trait markers (items) to be in the order of 0.70. We performed a replication study in which we improved on their selec

  3. Optimisation and performance studies of the ATLAS $b$-tagging algorithms for the 2017-18 LHC run

    CERN Document Server

    The ATLAS collaboration

    2017-01-01

    The optimisation and performance of the ATLAS $b$-tagging algorithms for the 2017-18 data taking at the LHC are described. This note presents the use of additional taggers to further enhance the discrimination between $b$-, $c$- and light-flavour jets, and on new studies for more performant training of the algorithms and for assessing the universality of the training campaign in typical physics processes where flavour tagging plays a crucial role. Particular attention is paid to the inclusion of novel taggers, namely a Soft Muon Tagger, based on the reconstruction of muons from the semileptonic decay of $b$/$c$-hadrons, and a Recurrent Neural Network Impact-Parameter tagger that exploits correlations between tracks within the jet. New variants of the high-level discriminant, based on boosted decision trees and modern deep learning techniques, are also presented. The overlap between the jets tagged by the various $b$-tagging algorithms is studied, and the dependence of the tagging performance on the physics pr...

  4. The Big Five of Personality and structural imaging revisited: a VBM - DARTEL study.

    Science.gov (United States)

    Liu, Wei-Yin; Weber, Bernd; Reuter, Martin; Markett, Sebastian; Chu, Woei-Chyn; Montag, Christian

    2013-05-01

    The present study focuses on the neurostructural foundations of the human personality. In a large sample of 227 healthy human individuals (168 women and 59 men), we used MRI to examine the relationship between personality traits and both regional gray and white matter volume, while controlling for age and sex. Personality was assessed using the German version of the NEO Five-Factor Inventory that measures individual differences in the 'Big Five of Personality': extraversion, neuroticism, agreeableness, conscientiousness, and openness to experience. In contrast to most previous studies on neural correlates of the Big Five, we used improved processing strategies: white and gray matter were independently assessed by segmentation steps before data analysis. In addition, customized sex-specific diffeomorphic anatomical registration using exponentiated lie algebra templates were used. Our results did not show significant correlations between any dimension of the Big Five and regional gray matter volume. However, among others, higher conscientiousness scores correlated significantly with reductions in regional white matter volume in different brain areas, including the right insula, putamen, caudate, and left fusiformis. These correlations were driven by the female subsample. The present study suggests that many results from the literature on the neurostructural basis of personality should be reviewed carefully, considering the results when the sample size is larger, imaging methods are rigorously applied, and sex-related and age-related effects are controlled.

  5. Detecting and characterizing high-frequency oscillations in epilepsy: a case study of big data analysis.

    Science.gov (United States)

    Huang, Liang; Ni, Xuan; Ditto, William L; Spano, Mark; Carney, Paul R; Lai, Ying-Cheng

    2017-01-01

    We develop a framework to uncover and analyse dynamical anomalies from massive, nonlinear and non-stationary time series data. The framework consists of three steps: preprocessing of massive datasets to eliminate erroneous data segments, application of the empirical mode decomposition and Hilbert transform paradigm to obtain the fundamental components embedded in the time series at distinct time scales, and statistical/scaling analysis of the components. As a case study, we apply our framework to detecting and characterizing high-frequency oscillations (HFOs) from a big database of rat electroencephalogram recordings. We find a striking phenomenon: HFOs exhibit on-off intermittency that can be quantified by algebraic scaling laws. Our framework can be generalized to big data-related problems in other fields such as large-scale sensor data and seismic data analysis.

  6. Detecting and characterizing high-frequency oscillations in epilepsy: a case study of big data analysis

    Science.gov (United States)

    Huang, Liang; Ni, Xuan; Ditto, William L.; Spano, Mark; Carney, Paul R.; Lai, Ying-Cheng

    2017-01-01

    We develop a framework to uncover and analyse dynamical anomalies from massive, nonlinear and non-stationary time series data. The framework consists of three steps: preprocessing of massive datasets to eliminate erroneous data segments, application of the empirical mode decomposition and Hilbert transform paradigm to obtain the fundamental components embedded in the time series at distinct time scales, and statistical/scaling analysis of the components. As a case study, we apply our framework to detecting and characterizing high-frequency oscillations (HFOs) from a big database of rat electroencephalogram recordings. We find a striking phenomenon: HFOs exhibit on-off intermittency that can be quantified by algebraic scaling laws. Our framework can be generalized to big data-related problems in other fields such as large-scale sensor data and seismic data analysis.
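    Editor's note: a minimal sketch of the envelope-based part of the HFO analysis described in the two records above: band-pass filter a synthetic EEG-like signal into an assumed HFO band, take the Hilbert-transform amplitude envelope, and mark candidate on-off events where the envelope crosses a threshold. The synthetic signal, band edges and threshold rule are illustrative; the paper's full pipeline also includes preprocessing, empirical mode decomposition and scaling analysis, which are omitted here.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 2000.0                                   # sampling rate in Hz (toy value)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic recording: background noise plus intermittent 180 Hz bursts ("HFO-like" events)
signal = rng.normal(0, 1.0, t.size)
for start in (2.0, 5.5, 8.2):
    burst = (t > start) & (t < start + 0.05)
    signal[burst] += 4.0 * np.sin(2 * np.pi * 180 * t[burst])

# Band-pass into an assumed HFO band (here 100-500 Hz)
b, a = butter(4, [100 / (fs / 2), 500 / (fs / 2)], btype="band")
hfo_band = filtfilt(b, a, signal)

# Amplitude envelope via the Hilbert transform, then simple on/off thresholding
envelope = np.abs(hilbert(hfo_band))
threshold = envelope.mean() + 3 * envelope.std()
on = envelope > threshold
onsets = np.flatnonzero(np.diff(on.astype(int)) == 1)
print(f"detected {onsets.size} candidate HFO onsets at t ~ {np.round(t[onsets], 2)} s")
```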

  7. submitter Performance studies of CMS workflows using Big Data technologies

    CERN Document Server

    Ambroz, Luca; Grandi, Claudio

    At the Large Hadron Collider (LHC), more than 30 petabytes of data are produced from particle collisions every year of data taking. The data processing requires large volumes of simulated events through Monte Carlo techniques. Furthermore, physics analysis implies daily access to derived data formats by hundreds of users. The Worldwide LHC Computing Grid (WLCG) - an international collaboration involving personnel and computing centers worldwide - is successfully coping with these challenges, enabling the LHC physics program. With the continuation of LHC data taking and the approval of ambitious projects such as the High-Luminosity LHC, such challenges will reach the edge of current computing capacity and performance. One of the keys to success in the next decades - also under severe financial resource constraints - is to optimize the efficiency in exploiting the computing resources. This thesis focuses on performance studies of CMS workflows, namely centrally scheduled production activities and unpredictable d...

  8. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  9. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness

    Directory of Open Access Journals (Sweden)

    Ho Ting Wong

    2016-10-01

    Full Text Available The rapid development of technology has made enormous volumes of data available and achievable anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term “Big Data”, which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding on what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing.

  10. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness

    Science.gov (United States)

    Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen

    2016-01-01

    The rapid development of technology has made enormous volumes of data available and achievable anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term “Big Data”, which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding on what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing. PMID:27763525

  11. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness.

    Science.gov (United States)

    Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen

    2016-10-17

    The rapid development of technology has made enormous volumes of data available and achievable anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term "Big Data", which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding on what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing.

  12. A study on two-warehouse partially backlogged deteriorating inventory models under inflation via particle swarm optimisation

    Science.gov (United States)

    Bhunia, A. K.; Shaikh, A. A.; Gupta, R. K.

    2015-04-01

    This paper deals with a deterministic inventory model for linear trend in demand under inflationary conditions with different rates of deterioration in two separate warehouses (owned and rented warehouses). The replenishment rate is infinite. The stock is transferred from the rented warehouse to owned warehouse in continuous release pattern and the associated transportation cost is taken into account. At owned warehouse, shortages, if any, are allowed and partially backlogged with a rate dependent on the duration of waiting time up to the arrival of the next lot. The corresponding problems have been formulated as nonlinear constrained optimisation problems for two different policies (inventory follows shortage (IFS) and shortage follows inventory (SFI)). Finally, the model has been illustrated with a numerical example and to study the effects of changes of different system parameters on initial stock level, maximum shortage level and cycle length with the minimum cost of the system, sensitivity analyses have been carried out by changing one parameter at a time and keeping the others at their original values.
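    Editor's note: a generic particle swarm optimisation loop of the kind the record above applies to its two-warehouse inventory cost function. The objective here is a placeholder quadratic over two decision variables with box constraints, standing in for the (much more involved) inventory cost model; the inertia and acceleration coefficients are common textbook defaults.

```python
import numpy as np

def placeholder_cost(x):
    """Stand-in for the two-warehouse inventory cost (the real model is far more complex)."""
    stock_level, cycle_length = x
    return (stock_level - 120.0) ** 2 + 5.0 * (cycle_length - 0.8) ** 2

def pso(cost, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    p_best, p_val = x.copy(), np.array([cost(p) for p in x])
    g_best = p_best[np.argmin(p_val)]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
        x = np.clip(x + v, lo, hi)                    # keep particles inside the box constraints
        vals = np.array([cost(p) for p in x])
        improved = vals < p_val
        p_best[improved], p_val[improved] = x[improved], vals[improved]
        g_best = p_best[np.argmin(p_val)]
    return g_best, cost(g_best)

best_x, best_cost = pso(placeholder_cost, bounds=[(0.0, 500.0), (0.1, 2.0)])
print("best decision variables:", np.round(best_x, 3), "cost:", round(best_cost, 4))
```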

  13. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. However, more recently, healthcare researchers are exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare, in general, and specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  14. Progress Report: Integrated Ecological Studies at Lisbon Bottom Unit, Big Muddy Fish and Wildlife Refuge, Fiscal Year 1999

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The U.S. Geological Survey has been carrying out integrated ecological studies at the Lisbon Bottom Unit of the Big Muddy Fish and Wildlife Refuge since 1996. This...

  15. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  16. Isogeometric design optimisation

    NARCIS (Netherlands)

    Nagy, A.P.

    2011-01-01

    Design optimisation is of paramount importance in most engineering, e.g. aeronautical, automotive, or naval, disciplines. Its interdisciplinary character is manifested in the synthesis of geometric modelling, numerical analysis, mathematical programming, and computer sciences. The evolution of the f

  17. Knee Kinematics Estimation Using Multi-Body Optimisation Embedding a Knee Joint Stiffness Matrix: A Feasibility Study.

    Directory of Open Access Journals (Sweden)

    Vincent Richard

    Full Text Available The use of multi-body optimisation (MBO) to estimate joint kinematics from stereophotogrammetric data while compensating for soft tissue artefact is still open to debate. Presently used joint models embedded in MBO, such as mechanical linkages, constitute a considerable simplification of joint function, preventing a detailed understanding of it. The present study proposes a knee joint model where femur and tibia are represented as rigid bodies connected through an elastic element the behaviour of which is described by a single stiffness matrix. The deformation energy, computed from the stiffness matrix and joint angles and displacements, is minimised within the MBO. Implemented as a "soft" constraint using a penalty-based method, this elastic joint description challenges the strictness of "hard" constraints. In this study, estimates of knee kinematics obtained using MBO embedding four different knee joint models (i.e., no constraints, spherical joint, parallel mechanism, and elastic joint) were compared against reference kinematics measured using bi-planar fluoroscopy on two healthy subjects ascending stairs. Bland-Altman analysis and sensitivity analysis investigating the influence of variations in the stiffness matrix terms on the estimated kinematics substantiate the conclusions. The difference between the reference knee joint angles and displacements and the corresponding estimates obtained using MBO embedding the stiffness matrix showed an average bias and standard deviation for kinematics of 0.9±3.2° and 1.6±2.3 mm. These values were lower than when no joint constraints (1.1±3.8°, 2.4±4.1 mm) or a parallel mechanism (7.7±3.6°, 1.6±1.7 mm) were used and were comparable to the values obtained with a spherical joint (1.0±3.2°, 1.3±1.9 mm). The study demonstrated the feasibility of substituting an elastic joint for more classic joint constraints in MBO.

  18. Knee Kinematics Estimation Using Multi-Body Optimisation Embedding a Knee Joint Stiffness Matrix: A Feasibility Study.

    Science.gov (United States)

    Richard, Vincent; Lamberto, Giuliano; Lu, Tung-Wu; Cappozzo, Aurelio; Dumas, Raphaël

    2016-01-01

    The use of multi-body optimisation (MBO) to estimate joint kinematics from stereophotogrammetric data while compensating for soft tissue artefact is still open to debate. Presently used joint models embedded in MBO, such as mechanical linkages, constitute a considerable simplification of joint function, preventing a detailed understanding of it. The present study proposes a knee joint model where femur and tibia are represented as rigid bodies connected through an elastic element the behaviour of which is described by a single stiffness matrix. The deformation energy, computed from the stiffness matrix and joint angles and displacements, is minimised within the MBO. Implemented as a "soft" constraint using a penalty-based method, this elastic joint description challenges the strictness of "hard" constraints. In this study, estimates of knee kinematics obtained using MBO embedding four different knee joint models (i.e., no constraints, spherical joint, parallel mechanism, and elastic joint) were compared against reference kinematics measured using bi-planar fluoroscopy on two healthy subjects ascending stairs. Bland-Altman analysis and sensitivity analysis investigating the influence of variations in the stiffness matrix terms on the estimated kinematics substantiate the conclusions. The difference between the reference knee joint angles and displacements and the corresponding estimates obtained using MBO embedding the stiffness matrix showed an average bias and standard deviation for kinematics of 0.9±3.2° and 1.6±2.3 mm. These values were lower than when no joint constraints (1.1±3.8°, 2.4±4.1 mm) or a parallel mechanism (7.7±3.6°, 1.6±1.7 mm) were used and were comparable to the values obtained with a spherical joint (1.0±3.2°, 1.3±1.9 mm). The study demonstrated the feasibility of substituting an elastic joint for more classic joint constraints in MBO.
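
    The penalty-based "soft" constraint described above can be illustrated with a minimal sketch: a deformation energy 0.5 * d^T K d computed from a stiffness matrix is added as a penalty term to a marker-tracking cost, and the joint pose is found by numerical minimisation. The 6x6 stiffness matrix, the "measured" pose and the penalty weight below are hypothetical placeholders, not values from the study.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 6x6 joint stiffness matrix (3 rotations, 3 translations);
# in the study such a matrix encodes the elastic behaviour of the knee.
K = np.diag([0.5, 0.8, 0.3, 40.0, 60.0, 25.0])

# Hypothetical marker-based joint pose estimate (rad for rotations, mm for
# translations), affected by soft tissue artefact.
measured_pose = np.array([0.20, 0.05, -0.03, 2.0, -1.5, 0.8])

def deformation_energy(pose):
    """0.5 * d^T K d, with d the joint angles/displacements."""
    return 0.5 * pose @ K @ pose

def tracking_cost(pose):
    """Squared distance to the marker-based pose estimate."""
    return np.sum((pose - measured_pose) ** 2)

def objective(pose, penalty_weight=0.1):
    # Penalty-based "soft" constraint: the elastic joint model enters the
    # optimisation as an energy term rather than as a rigid linkage.
    return tracking_cost(pose) + penalty_weight * deformation_energy(pose)

result = minimize(objective, x0=np.zeros(6), method="BFGS")
print("estimated joint pose:", np.round(result.x, 3))
print("deformation energy  :", round(deformation_energy(result.x), 4))
```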

  19. Fluorescence optimisation and lifetime studies of fingerprints treated with magnetic powders.

    Science.gov (United States)

    Seah, L K; Dinish, U S; Phang, W F; Chao, Z X; Murukeshan, V M

    2005-09-10

    Fluorescence study plays a significant role in fingerprint detection when conventional chemical enhancement methods fail. The basic properties of fluorescence emission such as colour, intensity and lifetime can be well exploited in the detection of latent fingerprints under steady-state and dynamic methods. This paper describes a systematic study of the fluorescence emission intensity from fingerprint samples treated with different magnetic powders. Understanding the excitation wavelength required to obtain maximum fluorescence emission intensity can be beneficial when selecting the appropriate fluorescent powders for fingerprint detection. A lifetime study of fingerprints treated with various magnetic powders was also carried out. The importance of the lifetime study is well explained through the time-resolved (TR) imaging of fingerprints with nanosecond resolution. Results from the TR imaging study revealed an improvement in fingerprint image contrast. This is significant when the print is deposited on a fluorescing background whose emission wavelength is close to that of the treated fingerprint.

  20. Comparative case study on website traffic generated by search engine optimisation and a pay-per-click campaign, versus marketing expenditure

    Directory of Open Access Journals (Sweden)

    Wouter T. Kritzinger

    2015-02-01

    Full Text Available Background: No empirical work was found on how marketing expenses compare when used solely for either one or the other of the two main types of search engine marketing. Objectives: This research set out to determine how the results of the implementation of a pay-per-click campaign compared to those of a search engine optimisation campaign, given the same website and environment. At the same time, the expenses incurred on both these marketing methods were recorded and compared. Method: The active website of an existing, successful e-commerce concern was used as the platform. The company had been using pay-per-click only for a period, whilst traffic was monitored. This system was decommissioned on a particular date and time, and an alternative search engine optimisation system was started at the same time. Again, both traffic and expenses were monitored. Results: The results indicate that the pay-per-click system did produce favourable results, but on condition that a monthly fee is set aside to guarantee consistent traffic. The implementation of search engine optimisation required a relatively large investment at the outset, but it was once-off. After a drop in traffic owing to crawler visitation delays, the website traffic bypassed the average figure achieved during the pay-per-click period after a little over three months, whilst the expenditure crossed over after just six months. Conclusion: Whilst considering the specific parameters of this study, an investment in search engine optimisation rather than a pay-per-click campaign appears to produce better results at a lower cost, after a given period of time.
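
    The cost crossover reported above can be reasoned about with simple arithmetic by comparing a recurring pay-per-click fee against a once-off search engine optimisation investment. The figures in the sketch below are invented for illustration and do not come from the case study.

```python
# Illustrative monthly figures (hypothetical, not taken from the case study).
ppc_monthly_fee = 1200.0      # recurring pay-per-click budget
seo_once_off_cost = 7000.0    # once-off search engine optimisation investment

def crossover_month(ppc_fee, seo_cost, horizon=36):
    """First month at which cumulative PPC spend reaches the SEO investment."""
    for month in range(1, horizon + 1):
        if ppc_fee * month >= seo_cost:
            return month
    return None

month = crossover_month(ppc_monthly_fee, seo_once_off_cost)
print(f"cumulative PPC spend overtakes the SEO investment in month {month}")
```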

  1. Improving recruitment to a study of telehealth management for long-term conditions in primary care: two embedded, randomised controlled trials of optimised patient information materials.

    Science.gov (United States)

    Man, Mei-See; Rick, Jo; Bower, Peter

    2015-07-19

    Patient understanding of study information is fundamental to gaining informed consent to take part in a randomised controlled trial. In order to meet the requirements of research ethics committees, patient information materials can be long and need to communicate complex messages. There is concern that standard approaches to providing patient information may deter potential participants from taking part in trials. The Systematic Techniques for Assisting Recruitment to Trials (MRC-START) research programme aims to test interventions to improve trial recruitment. The aim of this study was to investigate the effect on recruitment of optimised patient information materials (with improved readability and ease of comprehension) compared with standard materials. The study was embedded within two primary care trials involving patients with long-term conditions. The Healthlines Study involves two linked trials evaluating a telehealth intervention in patients with depression (Healthlines Depression) or raised cardiovascular disease risk (Healthlines CVD). We conducted two trials of a recruitment intervention, embedded within the Healthlines host trials. Patients identified as potentially eligible in each of the Healthlines trials were randomised to receive either the original patient information materials or optimised versions of these materials. Primary outcomes were the proportion of participants randomised (Healthlines Depression) and the proportion expressing interest in taking part (Healthlines CVD). In Healthlines Depression (n = 1364), 6.3% of patients receiving the optimised patient information materials were randomised into the study compared to 4.0% in those receiving standard materials (OR = 1.63, 95% CI = 1.00 to 2.67). In Healthlines CVD (n = 671) 24.0% of those receiving optimised patient information materials responded positively to the invitation to participate, compared to 21.9% in those receiving standard materials (OR = 1.12, 95% CI = 0.78 to 1
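
    The odds ratios quoted above follow from a standard 2x2 calculation. The sketch below reconstructs approximate counts for Healthlines Depression by assuming a 1:1 split of the 1,364 patients and the reported 6.3% and 4.0% recruitment rates; the counts are therefore illustrative approximations, not the trial's raw data.

```python
import math

# Approximate 2x2 table (assumed 1:1 allocation of 1,364 patients):
# rows = optimised vs standard information, columns = recruited vs not.
a, b = 43, 639   # optimised materials: recruited / not recruited (~6.3%)
c, d = 27, 655   # standard materials:  recruited / not recruited (~4.0%)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
z = 1.96  # ~95% confidence
ci_low = math.exp(math.log(odds_ratio) - z * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + z * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI = {ci_low:.2f} to {ci_high:.2f}")
```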

  2. Spatial issues when optimising waste treatment and energy systems – A Danish Case Study

    DEFF Research Database (Denmark)

    Pizarro Alonso, Amalia Rosa; Münster, Marie; Petrovic, Stefan;

    2014-01-01

    This study addresses the challenge of including geographical information related to waste resources, energy demands and production plants, and transport options in the optimization of waste management. It analyses how waste may serve as an energy source through thermal conversion and anaerobic...... digestion. The relation to the energy sector is taken into account. The geographically specific potentials and utilization possibilities of waste are taken into account. Thus, the relative location of the resources (in this study waste and manure for codegestion) is accounted for. Also the location...... of the resources relative to their utilization (in this study mainly the location of district heating networks) is considered. The temporal dimension is important for the energy sector which displays distinct variations over the year, week and day, and this is reflected by a subdivision of the extension...

  3. Centralising and optimising decentralised stroke care systems: a simulation study on short-term costs and effects

    Directory of Open Access Journals (Sweden)

    Maarten M. H. Lahr

    2017-01-01

    Full Text Available Abstract Background Centralisation of thrombolysis may offer substantial benefits. The aim of this study was to assess short term costs and effects of centralisation of thrombolysis and optimised care in a decentralised system. Methods Using simulation modelling, three scenarios to improve decentralised settings in the North of the Netherlands were compared, from the perspective of the policy maker, to current decentralised care: (1) improving stroke care at nine separate hospitals, (2) centralising and improving thrombolysis treatment to four hospitals, and (3) centralising to two hospitals. Outcomes were annual mean and incremental costs per patient up to the treatment with thrombolysis, the incremental cost-effectiveness ratio (iCER) per 1% increase in thrombolysis rate, and the proportion treated with thrombolysis. Results Compared to current decentralised care, improving stroke care at individual community hospitals led to mean annual costs per patient of $US 1,834 (95% CI, 1,823–1,843) whereas centralising to four and two hospitals led to $US 1,462 (95% CI, 1,451–1,473) and $US 1,317 (95% CI, 1,306–1,328), respectively (P < 0.001). The iCER of improving community hospitals was $US 113 (95% CI, 91–150), and $US 71 (95% CI, 59–94) and $US 56 (95% CI, 44–74) when centralising to four and two hospitals, respectively. Thrombolysis rates decreased from 22.4 to 21.8% and 21.2% (P = 0.120 and P = 0.001) in case of increasing centralisation. Conclusions Centralising thrombolysis substantially lowers mean annual costs per patient compared to raising stroke care at community hospitals simultaneously. Small, but negative effects on thrombolysis rates may be expected.
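
    For readers unfamiliar with the incremental cost-effectiveness ratio (iCER) used above, it is simply the cost difference divided by the effect difference. The sketch below shows the arithmetic using rounded figures loosely taken from the abstract; it compares two of the scenarios directly and therefore does not reproduce the paper's reported iCERs, which are computed against current decentralised care.

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per unit of extra effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Rounded figures from the abstract (annual mean cost per patient in $US,
# thrombolysis rate in %); used here only to illustrate the arithmetic.
centralised_cost, centralised_rate = 1317.0, 21.2   # two-hospital scenario
improved_cost, improved_rate = 1834.0, 22.4         # improved decentralised care

extra_cost_per_percent = icer(improved_cost, centralised_cost,
                              improved_rate, centralised_rate)
print(f"incremental cost per 1% thrombolysis gained: $US {extra_cost_per_percent:.0f}")
```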

  4. From monomer to monolayer: a global optimisation study of (ZnO)n nanoclusters on the Ag surface.

    Science.gov (United States)

    Demiroglu, Ilker; Woodley, Scott M; Sokol, Alexey A; Bromley, Stefan T

    2014-12-21

    We employ global optimisation to investigate how oxide nanoclusters of increasing size can best adapt their structure to lower the system energy when interacting with a realistic extended metal support. Specifically, we focus on the (ZnO)@Ag(111) system where experiment has shown that the infinite Ag(111)-supported ZnO monolayer limit corresponds to an epitaxially 7 : 8 matched graphene-like (Zn(3)O(3))-based hexagonal sheet. Using a two-stage search method based on classical interatomic potentials and then on more accurate density functional theory, we report global minima candidate structures for Ag-supported (ZnO)n clusters with sizes ranging from n = 1-24. Comparison with the respective global minima structures of free space (ZnO)n clusters reveals that the surface interaction plays a decisive role in determining the lowest energy Ag-supported (ZnO)n cluster structures. Whereas free space (ZnO)n clusters tend to adopt cage-like bubble structures as they grow larger, Ag-supported (ZnO)n clusters of increasing size become progressively more like planar cuts from the infinite graphene-like ZnO single monolayer. This energetic favourability for planar hexagonal Ag-supported clusters over their 3D counterparts can be partly rationalised by the ZnO-Ag(111) epitaxial matching and the increased number of close interactions with the Ag surface. Detailed analysis shows that this tendency can also be attributed to the capacity of 2D clusters to distort to improve their interaction with the Ag surface relative to more rigid 3D bubble cluster isomers. For the larger sized clusters we find that the adsorption energies and most stable structural types appear to be rather converged, confirming that our study makes a bridge between the Ag-supported ZnO monomer and the infinite Ag-supported ZnO monolayer.

  5. Big queues

    CERN Document Server

    Ganesh, Ayalvadi; Wischik, Damon

    2004-01-01

    Big Queues aims to give a simple and elegant account of how large deviations theory can be applied to queueing problems. Large deviations theory is a collection of powerful results and general techniques for studying rare events, and has been applied to queueing problems in a variety of ways. The strengths of large deviations theory are these: it is powerful enough that one can answer many questions which are hard to answer otherwise, and it is general enough that one can draw broad conclusions without relying on special case calculations.

  6. Optimisation study of the synthesis of vanadium oxide nanostructures using pulsed laser deposition

    CSIR Research Space (South Africa)

    Masina, BN

    2014-02-01

    Full Text Available A fast-imaging plasma plume study has been carried out on vanadium-oxygen plasma generated using 248 nm, 25 ns pulses from an excimer KrF laser under an oxygen atmosphere. The plume expansion dynamics of an ablated VO(sub2) target was investigated using...

  7. Studies on big data in domestic libraries

    Institute of Scientific and Technical Information of China (English)

    王秀艳

    2015-01-01

    Papers on big data research in domestic libraries were analysed using the g-index and cluster analysis, with CNKI (China National Knowledge Infrastructure) as the data source, in order to identify the themes of the research literature. The analysis summarises the main hotspots of big data research in domestic libraries: big data driving innovative library service models, subject-oriented services contributing to library literature resource development, personalised library service models, big data storage for digital libraries, library processing, mining and analysis of big data, and big data processing technologies and tools. The shortcomings of the existing research are also pointed out.
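
    The g-index used in the bibliometric analysis above has a compact definition: it is the largest number g such that the g most-cited papers together have received at least g squared citations. A minimal sketch, with made-up citation counts, is:

```python
def g_index(citations):
    """Largest g such that the top g papers have at least g^2 citations in total."""
    counts = sorted(citations, reverse=True)
    total, g = 0, 0
    for i, c in enumerate(counts, start=1):
        total += c
        if total >= i * i:
            g = i
    return g

# Made-up citation counts for a set of papers on library big data.
sample_citations = [48, 30, 22, 15, 10, 9, 6, 4, 3, 2, 1, 0]
print("g-index:", g_index(sample_citations))
```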

  8. Design evaluation and optimisation in crossover pharmacokinetic studies analysed by nonlinear mixed effects models.

    Science.gov (United States)

    Nguyen, Thu Thuy; Bazzoli, Caroline; Mentré, France

    2012-05-20

    Bioequivalence or interaction trials are commonly studied in crossover design and can be analysed by nonlinear mixed effects models as an alternative to the noncompartmental approach. We propose an extension of the population Fisher information matrix in nonlinear mixed effects models to design crossover pharmacokinetic trials, using a linearisation of the model around the random effect expectation, including within-subject variability and discrete covariates fixed or changing between periods. We use the expected standard errors of the treatment effect to compute the power for the Wald test of comparison or equivalence and the number of subjects needed for a given power. We perform various simulations mimicking two-period crossover trials to show the relevance of these developments. We then apply these developments to design a crossover pharmacokinetic study of amoxicillin in piglets and implement them in the new version 3.2 of the R function PFIM.
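
    The final step of the approach described above, computing power and sample size from the expected standard error of the treatment effect, can be sketched as follows. The effect size, expected standard error, reference number of subjects and significance level are placeholders, not values from the paper or from PFIM, and the usual assumption that the standard error shrinks as 1/sqrt(N) is made explicit.

```python
from scipy.stats import norm

def wald_power(beta, se, alpha=0.05):
    """Power of the two-sided Wald test of H0: beta = 0, given the expected SE."""
    z = norm.ppf(1 - alpha / 2)
    return norm.cdf(abs(beta) / se - z) + norm.cdf(-abs(beta) / se - z)

def subjects_needed(beta, se_ref, n_ref, target_power=0.9, alpha=0.05):
    """Scale a reference design (n_ref subjects, expected SE se_ref),
    assuming the SE shrinks as 1/sqrt(N)."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(target_power)
    se_needed = abs(beta) / (z_a + z_b)
    return n_ref * (se_ref / se_needed) ** 2

# Placeholder values: log treatment-effect ratio and its expected SE for N = 24.
beta, se, n = 0.15, 0.08, 24
print(f"power with {n} subjects: {wald_power(beta, se):.2f}")
print(f"subjects for 90% power: {subjects_needed(beta, se, n):.0f}")
```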

  9. Design evaluation and optimisation in crossover pharmacokinetic studies analysed by nonlinear mixed effects models

    OpenAIRE

    Nguyen, Thu Thuy; Bazzoli, Caroline; Mentré, France

    2012-01-01

    International audience; Bioequivalence or interaction trials are commonly studied in crossover design and can be analysed by nonlinear mixed effects models as an alternative to noncompartmental approach. We propose an extension of the population Fisher information matrix in nonlinear mixed effects models to design crossover pharmacokinetic trials, using a linearisation of the model around the random effect expectation, including within-subject variability and discrete covariates fixed or chan...

  10. Optimising intraperitoneal gentamicin dosing in peritoneal dialysis patients with peritonitis (GIPD study

    Directory of Open Access Journals (Sweden)

    Lipman Jeffrey

    2009-12-01

    Full Text Available Abstract Background Antibiotics are preferentially delivered via the peritoneal route to treat peritonitis, a major complication of peritoneal dialysis (PD), so that maximal concentrations are delivered at the site of infection. However, drugs administered intraperitoneally can be absorbed into the systemic circulation. Drugs excreted by the kidneys accumulate in PD patients, increasing the risk of toxicity. The aim of this study is to examine a model of gentamicin pharmacokinetics and to develop an intraperitoneal drug dosing regime that maximises bacterial killing and minimises toxicity. Methods/Design This is an observational pharmacokinetic study of consecutive PD patients presenting to the Royal Brisbane and Women's Hospital with PD peritonitis and who meet the inclusion criteria. Participants will be allocated to either group 1, if anuric as defined by urine output less than 100 ml/day, or group 2, if non-anuric, as defined by urine output more than 100 ml/day. Recruitment will be limited to 15 participants in each group. Gentamicin dosing will be based on the present Royal Brisbane & Women's Hospital guidelines, which reflect the current International Society for Peritoneal Dialysis Peritonitis Treatment Recommendations. The primary endpoint is to describe the pharmacokinetics of gentamicin administered intraperitoneally in PD patients with peritonitis based on serial blood and dialysate drug levels. Discussion The study will develop improved dosing recommendations for intraperitoneally administered gentamicin in PD patients with peritonitis. This will guide clinicians and pharmacists in selecting the most appropriate dosing regime of intraperitoneal gentamicin to treat peritonitis. Trial Registration ACTRN12609000446268

  11. Analysing bone regeneration using topological optimisation

    Directory of Open Access Journals (Sweden)

    Diego Alexander Garzón Alvarado

    2010-04-01

    Full Text Available The object of the present article is to present the mathematical foundations of topological optimisation aimed at carrying out a study of bone regeneration. Bone structure can be economically adapted to different mechanical demands responding to topological optimisation models (having "minimum" mass and "high" resistance). Such analysis is essential for formulating physical therapy in patients needing partial or total strengthening of a particular bone's tissue structure. A mathematical model is formulated, as are the methods for resolving it.

  12. The application of process integration to the optimisation of cruise ship energy systems: a case study

    DEFF Research Database (Denmark)

    Baldi, Francesco; Nguyen, Tuong-Van; Ahlgren, Fredrik

    2016-01-01

    and its integration with available sources of waste heat on board. In this study, the principles of process integration are applied to the energy system of a cruise ship operating in the Baltic Sea. The heat sources (waste heat from the main and auxiliary engines in form of exhaust gas, cylinder cooling......” scenario and one in the “design” scenario, with a reduction of 13-33%, 15-27% and 46-56% of the external heat demand, respectively. Given the high amount of heat being available after the process integration, we also analysed the potential for the installation of a steam turbine for the recovery...

  13. Integrated Geologic, Geochemical, and Geophysical Studies of Big Bend National Park, Texas

    Science.gov (United States)

    Gray, John E.; Finn, Carol A.; Morgan, Lisa A.; Page, William R.; Shanks, Wayne C.

    2007-01-01

    Introduction Big Bend National Park (BBNP), Texas, covers 801,163 acres (3,242 km2) and was established in 1944 through a transfer of land from the State of Texas to the United States. The park is located along a 118-mi (190-km) stretch of the Rio Grande at the United States border with Mexico. The U.S. Geological Survey (USGS) began a 5-year project in 2003 with the objective of studying a number of broad and diverse geologic, geochemical, and geophysical topics in BBNP. This fact sheet describes results of some of the research by USGS scientists working in BBNP.

  14. Studies of Big Data metadata segmentation between relational and non-relational databases

    CERN Document Server

    Golosova, M V; Klimentov, A A; Ryabinkin, E A; Dimitrov, G; Potekhin, M

    2015-01-01

    In recent years the concepts of Big Data became well established in IT. Systems managing large data volumes produce metadata that describe data and workflows. These metadata are used to obtain information about current system state and for statistical and trend analysis of the processes these systems drive. Over time, the amount of stored metadata can grow dramatically. In this article we present our studies to demonstrate how metadata storage scalability and performance can be improved by using a hybrid RDBMS/NoSQL architecture.

  15. Studies of Big Data metadata segmentation between relational and non-relational databases

    Science.gov (United States)

    Golosova, M. V.; Grigorieva, M. A.; Klimentov, A. A.; Ryabinkin, E. A.; Dimitrov, G.; Potekhin, M.

    2015-12-01

    In recent years the concepts of Big Data became well established in IT. Systems managing large data volumes produce metadata that describe data and workflows. These metadata are used to obtain information about current system state and for statistical and trend analysis of the processes these systems drive. Over time, the amount of stored metadata can grow dramatically. In this article we present our studies to demonstrate how metadata storage scalability and performance can be improved by using a hybrid RDBMS/NoSQL architecture.
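
    As a purely illustrative sketch of the hybrid storage idea described above (not the authors' architecture), compact, frequently queried metadata fields can be kept in a relational store while bulky, schema-free documents are kept elsewhere; here SQLite and an in-memory dictionary stand in for the RDBMS and the NoSQL back end.

```python
import json
import sqlite3

# Relational side: compact, frequently queried task metadata.
rdb = sqlite3.connect(":memory:")
rdb.execute("CREATE TABLE task (id INTEGER PRIMARY KEY, status TEXT, owner TEXT)")

# "NoSQL" side: schema-free documents (stand-in for a document database).
document_store = {}

def store_task(task_id, status, owner, raw_metadata):
    """Segment metadata: summary fields go to the RDBMS, the full document elsewhere."""
    rdb.execute("INSERT INTO task (id, status, owner) VALUES (?, ?, ?)",
                (task_id, status, owner))
    document_store[task_id] = json.dumps(raw_metadata)

store_task(1, "done", "prod", {"inputs": ["f1.root", "f2.root"], "attempts": 3})
store_task(2, "failed", "user", {"inputs": ["f3.root"], "error": "stage-out timeout"})

# Cheap relational query for the hot fields, document lookup only when needed.
for (task_id,) in rdb.execute("SELECT id FROM task WHERE status = 'failed'"):
    print(task_id, json.loads(document_store[task_id]))
```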

  16. Optimisation of metabolic criteria in the prognostic assessment in patients with lymphoma. A multicentre study.

    Science.gov (United States)

    Del Puig Cózar-Santiago, M; García-Garzón, J R; Moragas-Freixa, M; Soler-Peter, M; Bassa Massanas, P; Sánchez-Delgado, M; Sanchez-Jurado, R; Aguilar-Barrios, J E; Sanz-Llorens, R; Ferrer-Rebolleda, J

    To compare sensitivity, specificity and predictive value of Deauville score (DS) vs. ΔSUVmax in interim-treatment PET (iPET) and end-treatment PET (ePET), in patients with diffuse large B cell lymphoma (DLBCL), Hodgkin lymphoma (HL), and follicular lymphoma (FL). Retrospective longitudinal multicentre study including 138 patients (46 DLBCL, 46 HL, 46 FL), on whom 3 (18)F-FDG PET/CT were performed: baseline, iPET, and ePET. Visual (DS) and semi-quantitative (ΔSUVmax) parameters were determined for iPET and ePET. Predictive value was determined in relation to disease-free interval. Statistical analysis. iPET for DLBCL, HL, and FL: 1) sensitivity of DS: 76.92/83.33/61.53%; specificity: 78.78/85/81.81%; 2) sensitivity of ΔSUVmax: 53.84/83.33/61.53%; specificity: 87.87/87.50/78.78%. ePET for DLBCL, HL and FL: 1) sensitivity of DS: 61.53/83.33/69.23%; specificity: 90.90/85/87.87%; 2) sensitivity of ΔSUVmax: 69.23/83.33/69.23%; specificity: 90.90/87.50/84.84%. Predictive assessment. iPET study: in DLBCL, DS resulted in 10.3% recurrence of negative iPET, and 17.1% in ΔSUVmax at disease-free interval; in HL, both parameters showed a 2.8% recurrence of negative iPET; in FL, DS resulted in 15.6% recurrence of negative iPET, and 16.1% in ΔSUVmax, with no statistical significance. ePET study: in DLBCL, DS resulted in 14.3% recurrence of negative ePET, and 11.8% in ΔSUVmax at disease-free interval; in HL and FL, both methods showed 2.8 and 12.5% recurrence in negative ePET, respectively. DS and ΔSUVmax did not show significant differences in DLBCL, HL and FL. Their predictive value also did not show significant differences in HL and FL. In DLBCL, DS was higher in iPET, and ΔSUVmax in ePET. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.
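
    Sensitivity and specificity figures of the kind reported above come directly from a 2x2 cross-tabulation of the test result against relapse. A minimal sketch with hypothetical counts (not the study's data) is:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: positive interim PET vs relapse within follow-up.
tp, fn = 10, 3    # relapsed patients with / without a positive iPET
tn, fp = 26, 7    # non-relapsed patients with negative / positive iPET

sens, spec = sensitivity_specificity(tp, fn, tn, fp)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```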

  17. Study of laser megajoule calorimeter's thermal behaviour for energy measurement uncertainty optimisation.

    Science.gov (United States)

    Crespy, Charles; Villate, Denis; Lobios, Olivier

    2013-01-01

    For the laser megajoule (LMJ) facility, an accurate procedure for laser pulse energy measurement is a crucial requirement. In this study, the influence of the measurement procedure on LMJ calorimeter uncertainty is experimentally and numerically investigated. To this end, a 3D thermal model is developed and two experimental techniques are implemented. The metrological characteristics of both techniques are presented. As a first step, the model is validated by comparing numerical and experimental results. Then, the influence of a large number of parameters considered as likely uncertainty sources on the calorimeter response is investigated: wavelength, pulse duration, ambient temperature, laser beam diameter.... The post-processing procedure is also examined. The paper provides some of the parameters required to allow a robust and efficient calibration procedure to be produced.

  18. Optimising Controlled Human Malaria Infection Studies Using Cryopreserved P. falciparum Parasites Administered by Needle and Syringe.

    Directory of Open Access Journals (Sweden)

    Susanne H Sheehy

    Full Text Available Controlled human malaria infection (CHMI) studies have become a routine tool to evaluate efficacy of candidate anti-malarial drugs and vaccines. To date, CHMI trials have mostly been conducted using the bite of infected mosquitoes, restricting the number of trial sites that can perform CHMI studies. Aseptic, cryopreserved P. falciparum sporozoites (PfSPZ Challenge) provide a potentially more accurate, reproducible and practical alternative, allowing a known number of sporozoites to be administered simply by injection. We sought to assess the infectivity of PfSPZ Challenge administered in different dosing regimens to malaria-naive healthy adults (n = 18). Six participants received 2,500 sporozoites intradermally (ID), six received 2,500 sporozoites intramuscularly (IM) and six received 25,000 sporozoites IM. Five out of six participants receiving 2,500 sporozoites ID, 3/6 participants receiving 2,500 sporozoites IM and 6/6 participants receiving 25,000 sporozoites IM were successfully infected. The median time to diagnosis was 13.2, 17.8 and 12.7 days for 2,500 sporozoites ID, 2,500 sporozoites IM and 25,000 sporozoites IM respectively (Kaplan-Meier method; p = 0.024, log-rank test). 2,500 sporozoites ID and 25,000 sporozoites IM have similar infectivities. Given the dose response in infectivity seen with IM administration, further work should evaluate increasing doses of PfSPZ Challenge IM to identify a dosing regimen that reliably infects 100% of participants. ClinicalTrials.gov NCT01465048.
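
    Median times to diagnosis such as those above are usually summarised with the Kaplan-Meier product-limit estimator. The short sketch below hand-rolls the estimator on made-up diagnosis times for a single dose group; it is not the trial's data, and the between-group log-rank comparison is omitted.

```python
def kaplan_meier(times, events):
    """Product-limit estimator; returns (time, survival probability) steps.
    `events` is 1 if malaria was diagnosed at that time, 0 if censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    for i in order:
        if events[i]:
            surv *= (at_risk - 1) / at_risk
            curve.append((times[i], surv))
        at_risk -= 1
    return curve

# Made-up days-to-diagnosis for one dose group (21.0 = censored, undiagnosed).
times  = [11.5, 12.7, 13.0, 13.2, 14.1, 21.0]
events = [1,    1,    1,    1,    1,    0]
for t, s in kaplan_meier(times, events):
    print(f"day {t:>5}: fraction still undiagnosed = {s:.2f}")
```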

  19. Topology optimisation for natural convection problems

    CERN Document Server

    Alexandersen, Joe; Andreasen, Casper Schousboe; Sigmund, Ole

    2014-01-01

    This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equations coupled to the convection-diffusion equation through the Boussinesq approximation. In order to facilitate topology optimisation, the Brinkman approach is taken to penalise velocities inside the solid domain and the effective thermal conductivity is interpolated in order to accommodate differences in thermal conductivity of the solid and fluid phases. The governing equations are discretised using stabilised finite elements and topology optimisation is performed for two different problems using discrete adjoint sensitivity analysis. The study shows that topology optimisation is a viable approach for designing heat sink geometries cooled by natural convection and micropumps powered by natural convection.

  20. An optimisation method for complex product design

    Science.gov (United States)

    Li, Ni; Yi, Wenqing; Bi, Zhuming; Kong, Haipeng; Gong, Guanghong

    2013-11-01

    Designing a complex product such as an aircraft usually requires both qualitative and quantitative data and reasoning. To assist the design process, a critical issue is how to represent qualitative data and utilise it in the optimisation. In this study, a new method is proposed for the optimal design of complex products: to make full use of available data, information and knowledge, qualitative reasoning is integrated into the optimisation process. The transformation and fusion of qualitative and quantitative data are achieved via fuzzy set theory and a cloud model. To shorten the design process, parallel computing is implemented to solve the formulated optimisation problems. A parallel adaptive hybrid algorithm (PAHA) has been proposed. The performance of the new algorithm has been verified by comparison with two other existing algorithms. Further, PAHA has been applied to determine the shape parameters of an aircraft model for aerodynamic optimisation purposes.

  1. A Comparative Study of Control Strategies for Performance Optimisation of Brushless Doubly- Fed Reluctance Machines

    Directory of Open Access Journals (Sweden)

    Milutin G. Jovanović

    2006-12-01

    Full Text Available The brushless doubly-fed machine (BDFM) allows the use of a partially rated inverter and represents an attractive cost-effective candidate for variable speed applications with limited speed ranges. In its induction machine form (BDFIM), the BDFM has significant rotor losses and poor efficiency due to the cage rotor design which makes the machine dynamic models heavily parameter dependent and the resulting controller configuration complicated and difficult to implement. A reluctance version of the BDFM, the brushless doubly-fed reluctance machine (BDFRM), ideally has no rotor losses, and therefore offers the prospect for higher efficiency and simpler control compared to the BDFIM. A detailed study of this interesting and emerging machine is very important to gain a thorough understanding of its unusual operation, control aspects and compromises between optimal performance and the size of the inverter and the machine. This paper will attempt to address these issues specifically concentrating on developing conditions for various control properties of the machine such as maximum power factor, maximum torque per inverter ampere and minimum copper losses, as well as analysing the associated trade-offs.

  2. An optimised direct lysis method for gene expression studies on low cell numbers.

    Science.gov (United States)

    Le, Anh Viet-Phuong; Huang, Dexing; Blick, Tony; Thompson, Erik W; Dobrovic, Alexander

    2015-08-05

    There is increasing interest in gene expression analysis of either single cells or limited numbers of cells. One such application is the analysis of harvested circulating tumour cells (CTCs), which are often present in very low numbers. A highly efficient protocol for RNA extraction, which involves a minimal number of steps to avoid RNA loss, is essential for low input cell numbers. We compared several lysis solutions that enable reverse transcription (RT) to be performed directly on the cell lysate, offering a simple rapid approach to minimise RNA loss for RT. The lysis solutions were assessed by reverse transcription quantitative polymerase chain reaction (RT-qPCR) in low cell numbers isolated from four breast cancer cell lines. We found that a lysis solution containing both the non-ionic detergent (IGEPAL CA-630, chemically equivalent to Nonidet P-40 or NP-40) and bovine serum albumin (BSA) gave the best RT-qPCR yield. This direct lysis to reverse transcription protocol outperformed a column-based extraction method using a commercial kit. This study demonstrates a simple, reliable, time- and cost-effective method that can be widely used in any situation where RNA needs to be prepared from low to very low cell numbers.

  3. Study and optimisation of SIMS performed with He{sup +} and Ne{sup +} bombardment

    Energy Technology Data Exchange (ETDEWEB)

    Pillatsch, L.; Vanhove, N.; Dowsett, D. [Department “Science and Analysis of Materials” (SAM), Centre de Recherche Public – Gabriel Lippmann, 41 rue du Brill, L-4422 Belvaux (Luxembourg); Sijbrandij, S.; Notte, J. [Carl Zeiss Microscopy LLC, One Corporation Way, Peabody, MA 01960 (United States); Wirtz, T., E-mail: wirtz@lippmann.lu [Department “Science and Analysis of Materials” (SAM), Centre de Recherche Public – Gabriel Lippmann, 41 rue du Brill, L-4422 Belvaux (Luxembourg)

    2013-10-01

    The combination of the high-brightness He{sup +}/Ne{sup +} atomic level ion source with the detection capabilities of secondary ion mass spectrometry (SIMS) opens up the prospect of obtaining chemical information with high lateral resolution and high sensitivity on the Zeiss ORION helium ion microscope (HIM). A feasibility study with He{sup +} and Ne{sup +} ion bombardment is presented in order to determine the performance of SIMS analyses using the HIM. Therefore, the sputtering yields, useful yields and detection limits obtained for metallic (Al, Ni and W) as well as semiconductor samples (Si, Ge, GaAs and InP) were investigated. All the experiments were performed on a Cameca IMS4f SIMS instrument which was equipped with a caesium evaporator and oxygen flooding system. For most of the elements, useful yields in the range of 10{sup −4} to 3 × 10{sup −2} were measured with either O{sub 2} or Cs flooding. SIMS experiments performed directly on the ORION with a prototype secondary ion extraction and detection system lead to results that are consistent with those obtained on the IMS4f. Taking into account the obtained useful yields and the analytical conditions, such as the ion current and typical dwell time on the ORION HIM, detection limits in the at% range and better can be obtained during SIMS imaging at 10 nm lateral resolution with Ne{sup +} bombardment and down to the ppm level when a lateral resolution of 100 nm is chosen. Performing SIMS on the HIM with a good detection limit while maintaining an excellent lateral resolution (<50 nm) is therefore very promising.

  4. Architecture technology for Big Data

    National Research Council Canada - National Science Library

    Juan José Camargo Vega; Jonathan Felipe Camargo Ortega; Luis Joyanes Aguilar

    2015-01-01

    The term Big Data becomes more important with each passing day, which is why this research studies, analyzes and discloses in a comprehensive manner the different architectures of Big Data...

  5. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    Astrophysics and cosmology are rich with data. The advent of wide-area digital cameras on large aperture telescopes has led to ever more ambitious surveys of the sky. Data volumes of entire surveys a decade ago can now be acquired in a single night and real-time analysis is often desired. Thus, modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with label and measurement noise. We argue that this makes astronomy a great domain for computer science research, as it pushes the boundaries of data analysis. In the following, we will present this exciting application area for data scientists. We will focus on exemplary results, discuss main challenges...

  6. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    Astrophysics and cosmology are rich with data. The advent of wide-area digital cameras on large aperture telescopes has led to ever more ambitious surveys of the sky. Data volumes of entire surveys a decade ago can now be acquired in a single night and real-time analysis is often desired. Thus, modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with label and measurement noise. We argue that this makes astronomy a great domain for computer science research, as it pushes the boundaries of data analysis. In the following, we will present this exciting application area for data scientists. We will focus on exemplary results, discuss main challenges...

  7. HEPTopTagger optimisation studies in the context of a t anti t fully-hadronic resonance search

    Energy Technology Data Exchange (ETDEWEB)

    Sosa, David; Anders, Christoph; Kasieczka, Gregor; Schoening, Andre; Schaetzel, Sebastian [Physikalischens Institut, Heidelberg (Germany)

    2013-07-01

    The HEPTopTagger algorithm identifies boosted, hadronically decaying top quarks. It has already been validated using 2011 data taken with the ATLAS detector. The performance of the HEPTopTagger can be optimised by tuning internal parameters of the algorithm to improve the signal efficiency and the background rejection. Using the HEPTopTagger, a fully-hadronic resonance search has been conducted with the ATLAS detector with 2011 data. In order to improve the mass reach of the search, the full 2012 data set can be used. The HEPTopTagger is tested and re-optimised as the running conditions have changed. This optimisation of the HEPTopTagger in the context of a fully-hadronic resonance search is presented.

  8. Focal psychodynamic therapy, cognitive behaviour therapy, and optimised treatment as usual in outpatients with anorexia nervosa (ANTOP study): randomised controlled trial.

    Science.gov (United States)

    Zipfel, Stephan; Wild, Beate; Groß, Gaby; Friederich, Hans-Christoph; Teufel, Martin; Schellberg, Dieter; Giel, Katrin E; de Zwaan, Martina; Dinkel, Andreas; Herpertz, Stephan; Burgmer, Markus; Löwe, Bernd; Tagay, Sefik; von Wietersheim, Jörn; Zeeck, Almut; Schade-Brittinger, Carmen; Schauenburg, Henning; Herzog, Wolfgang

    2014-01-11

    Psychotherapy is the treatment of choice for patients with anorexia nervosa, although evidence of efficacy is weak. The Anorexia Nervosa Treatment of OutPatients (ANTOP) study aimed to assess the efficacy and safety of two manual-based outpatient treatments for anorexia nervosa--focal psychodynamic therapy and enhanced cognitive behaviour therapy--versus optimised treatment as usual. The ANTOP study is a multicentre, randomised controlled efficacy trial in adults with anorexia nervosa. We recruited patients from ten university hospitals in Germany. Participants were randomly allocated to 10 months of treatment with either focal psychodynamic therapy, enhanced cognitive behaviour therapy, or optimised treatment as usual (including outpatient psychotherapy and structured care from a family doctor). The primary outcome was weight gain, measured as increased body-mass index (BMI) at the end of treatment. A key secondary outcome was rate of recovery (based on a combination of weight gain and eating disorder-specific psychopathology). Analysis was by intention to treat. This trial is registered at http://isrctn.org, number ISRCTN72809357. Of 727 adults screened for inclusion, 242 underwent randomisation: 80 to focal psychodynamic therapy, 80 to enhanced cognitive behaviour therapy, and 82 to optimised treatment as usual. At the end of treatment, 54 patients (22%) were lost to follow-up, and at 12-month follow-up a total of 73 (30%) had dropped out. At the end of treatment, BMI had increased in all study groups (focal psychodynamic therapy 0·73 kg/m(2), enhanced cognitive behaviour therapy 0·93 kg/m(2), optimised treatment as usual 0·69 kg/m(2)); no differences were noted between groups (mean difference between focal psychodynamic therapy and enhanced cognitive behaviour therapy -0·45, 95% CI -0·96 to 0·07; focal psychodynamic therapy vs optimised treatment as usual -0·14, -0·68 to 0·39; enhanced cognitive behaviour therapy vs optimised treatment as usual -0·30

  9. Results of new petrologic and remote sensing studies in the Big Bend region

    Science.gov (United States)

    Benker, Stevan Christian

    Mesa. Based on the amount of surface relief depicted, inconsistency with subsequent normal faulting, and distance from magmatic features capable of surface doming or inflation, we believe the paleo-topographic highs modeled legitimately reflect the post-Laramide surface. We interpret the paleo-surface in this area as reflecting a post-Laramide surface that has experienced significant erosion. We attribute the paleo-topographic highs as Laramide topography that was more resistant. The model also implies a southern paleo-drainage direction for the area and suggests the present day topographic low through which the Rio Grande flows may have formed very soon after the Laramide Orogeny. Based on the newly calculated horizontal and vertical position accuracies for the Big Bend region and results of modeled Google Earth-NED data in easternmost Big Bend Ranch State Park, it seems Google Earth can be effectively utilized for remote sensing and geologic studies, however we urge caution as developers remain reluctant to disclose detailed program information to the public.

  10. Optimisation of load control

    Energy Technology Data Exchange (ETDEWEB)

    Koponen, P. [VTT Energy, Espoo (Finland)

    1998-08-01

    Electricity cannot be stored in large quantities. That is why the electricity supply and consumption are always almost equal in large power supply systems. If this balance were disturbed beyond stability, the system or a part of it would collapse until a new stable equilibrium is reached. The balance between supply and consumption is mainly maintained by controlling the power production, but also the electricity consumption or, in other words, the load is controlled. Controlling the load of the power supply system is important if easily controllable power production capacity is limited. Temporary shortage of capacity causes high peaks in the energy price in the electricity market. Load control either reduces the electricity consumption during peak consumption and peak price or moves electricity consumption to some other time. The project Optimisation of Load Control is a part of the EDISON research program for distribution automation. The following areas were studied: optimization of space heating and ventilation when the electricity price is time-variable; a load control model in power purchase optimization; optimization of direct load control sequences; interaction between load control optimization and power purchase optimization; literature on load control, optimization methods and field tests; response models of direct load control; and the effects of the electricity market deregulation on load control. An overview of the main results is given in this chapter.
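
    A toy version of the load-shifting idea discussed above: given an hourly price forecast and a block of deferrable consumption, the load is scheduled into the cheapest hours. The prices, the amount of deferrable energy and the per-hour limit are invented for illustration.

```python
# Hypothetical day-ahead electricity prices (EUR/MWh) for 24 hours.
prices = [42, 40, 38, 36, 35, 37, 45, 60, 72, 70, 65, 58,
          55, 52, 50, 53, 62, 80, 85, 78, 66, 55, 48, 44]

deferrable_mwh = 6      # energy that can be moved freely within the day
max_per_hour = 2        # limit on how much can be shifted into one hour

def schedule_deferrable_load(prices, energy, cap):
    """Greedy load control: fill the cheapest hours first, up to `cap` per hour."""
    plan = [0.0] * len(prices)
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        take = min(cap, energy)
        plan[hour] = take
        energy -= take
        if energy <= 0:
            break
    return plan

plan = schedule_deferrable_load(prices, deferrable_mwh, max_per_hour)
cost = sum(p * e for p, e in zip(prices, plan))
print("hourly schedule:", plan)
print(f"energy cost of shifted load: {cost:.0f} EUR")
```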

  11. Simulation versus Optimisation

    DEFF Research Database (Denmark)

    Lund, Henrik; Arler, Finn; Østergaard, Poul Alberg

    2017-01-01

    In recent years, several tools and models have been developed and used for the design and analysis of future national energy systems. Many of these models focus on the integration of various renewable energy resources and the transformation of existing fossil-based energy systems into future... On the one hand, the investment optimisation or optimal solutions approach; on the other hand, the analytical simulation or alternatives assessment approach. Awareness of the dissimilar theoretical assumptions behind the models clarifies differences between the models, explains dissimilarities in results, and provides a theoretical and methodological foundation for understanding and interpreting results from the two archetypes. Keywords: energy system analysis; investment optimisation models; simulation models; modelling theory; renewable energy

  12. Optimising AspectJ

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    AspectJ, an aspect-oriented extension of Java, is becoming increasingly popular. However, not much work has been directed at optimising compilers for AspectJ. Optimising AOP languages provides many new and interesting challenges for compiler writers, and this paper identifies and addresses three...... all of the techniques in this paper in abc, our AspectBench Compiler for AspectJ, and we demonstrate significant speedups with empirical results. Some of our techniques have already been integrated into the production AspectJ compiler, ajc 1.2.1....

  13. A study on effect of big five personality traits on emotional intelligence

    Directory of Open Access Journals (Sweden)

    Hamed Dehghanan

    2014-06-01

    Full Text Available This paper presents a study investigating the effects of the Big Five personality traits on emotional intelligence in some Iranian firms located in the city of Tehran, Iran. The proposed study uses two questionnaires: one originally developed by McCrae and Costa (1992) [McCrae, R. R., & Costa, P. T., Jr. (1992). Discriminant validity of NEO-PI-R facet scales. Educational and Psychological Measurement, 52, 229-237.] for measuring personality traits, and the other for measuring emotional intelligence. The first questionnaire consists of five personal categories including extraversion, agreeableness, conscientiousness, emotional stability versus neuroticism, and openness. Using structural equation modeling and a stepwise regression model, the study has detected a positive and meaningful relationship between four components, namely extraversion, agreeableness, conscientiousness and openness, and emotional intelligence. In addition, the study detects a negative and meaningful relationship between neuroticism and emotional intelligence.
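
    The regression analysis reported above amounts to fitting a linear model of emotional intelligence on the five trait scores. The minimal ordinary-least-squares sketch below runs on simulated scores with an assumed negative neuroticism coefficient, not on the survey data, and omits the stepwise variable selection.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated standardised trait scores; columns are E, A, C, neuroticism, O.
traits = rng.normal(size=(n, 5))
true_coefs = np.array([0.30, 0.25, 0.35, -0.40, 0.20])  # assumed, for illustration
ei = traits @ true_coefs + rng.normal(scale=0.5, size=n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), traits])
coefs, *_ = np.linalg.lstsq(X, ei, rcond=None)

names = ["intercept", "extraversion", "agreeableness", "conscientiousness",
         "neuroticism", "openness"]
for name, b in zip(names, coefs):
    print(f"{name:>18}: {b:+.2f}")
```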

  14. A Study of the Perceptions Held by Information Technology Professionals in Relation to the Maturity, Value, and Practical Deployment of Big Data Solutions

    Directory of Open Access Journals (Sweden)

    Damon Andrick Runion

    2016-07-01

    Full Text Available This research study investigated relationships between an information technology (IT professional's self-assigned understanding of big data and their assessment of the maturity, value, hype, and future trends of big data. The study also examined if there was any relationship between an IT professional's understanding of big data and the position they occupy professionally. The study consisted of a twenty question survey. Research findings indicate that IT professionals are still becoming familiar with big data and related technologies. The results supported rejecting two of the five hypotheses. The study produced evidence that there is a relationship between an IT professional's level of big data understanding and their expectation that there will be an increase in technological developments related to big data in the near future.

  15. A study and analysis of recommendation systems for location-based social network (LBSN with big data

    Directory of Open Access Journals (Sweden)

    Murale Narayanan

    2016-03-01

    Full Text Available Recommender systems play an important role in our day-to-day life. A recommender system automatically suggests an item to a user that he/she might be interested in. Small-scale datasets are used to provide recommendations based on location, but in real time, the volume of data is large. We have selected Foursquare dataset to study the need for big data in recommendation systems for location-based social network (LBSN. A few quality parameters like parallel processing and multimodal interface have been selected to study the need for big data in recommender systems. This paper provides a study and analysis of quality parameters of recommendation systems for LBSN with big data.

  16. Advanced Research and Data Methods in Women's Health: Big Data Analytics, Adaptive Studies, and the Road Ahead.

    Science.gov (United States)

    Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika

    2017-02-01

    Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.

  17. Big Data

    Directory of Open Access Journals (Sweden)

    Prachi More

    2013-05-01

    Full Text Available The demand for and spurt in the collection and accumulation of data has coined the new term "Big Data", and its era has begun. Accidentally, incidentally and by the interaction of people, information, so-called data, is massively generated. This Big Data is to be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists and a variety of other intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google, Facebook, Wikipedia and every space where large groups of people leave digital traces and deposit data. Given the rise of Big Data as both a phenomenon and a methodological persuasion, it is time to start critically interrogating this phenomenon, its assumptions and its biases. Big data, which refers to data sets that are too big to be handled using existing database management tools, is emerging in many important applications, such as Internet search, business informatics, social networks, social media, genomics, and meteorology. Big data presents a grand challenge for database and data analytics research. This paper is a blend of non-technical and introductory-level technical detail, ideal for the novice. We conclude with some technical challenges as well as the solutions that can be used to address these challenges. Big Data differs from other data through five characteristics: volume, variety, value, velocity and complexity. The article will focus on some current and future cases and causes for Big Data.

  18. Optimising Magnetostatic Assemblies

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Smith, Anders

    the optimal remanence distribution with respect to a linear objective functional. Additionally, it is shown here that the same formalism can be applied to the optimisation of the geometry of magnetic systems. Specifically, the border separating the permanent magnet from regions occupied by air or soft...

  19. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview...... of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big...... data....

  20. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview...... of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big...... data....

  1. Opening the Big Black Box: European study reveals visitors' impressions of science laboratories

    CERN Multimedia

    2004-01-01

    "On 29 - 30 March the findings of 'Inside the Big Black Box'- a Europe-wide science and society project - will be revealed during a two-day seminar hosted by CERN*. The principle aim of Inside the Big Black Box (IN3B) is to determine whether a working scientific laboratory can capture the curiosity of the general public through visits" (1 page)

  2. Experimental Study on the Compressive Strength of Big Mobility Concrete with Nondestructive Testing Method

    Directory of Open Access Journals (Sweden)

    Huai-Shuai Shang

    2012-01-01

    Full Text Available An experimental study of C20, C25, C30, C40, and C50 big mobility concrete cubes from the laboratory and from a construction site was completed. Nondestructive testing (NDT) was carried out using impact rebound hammer (IRH) techniques to establish a correlation between the compressive strengths and the rebound number. The local strength curve is set up by the regression method and its superiority is demonstrated. The rebound method presented is simple, quick, and reliable and covers wide ranges of concrete strengths. The rebound method can be easily applied to concrete specimens as well as existing concrete structures. The final results were compared with previous ones from the literature and also with actual results obtained from samples extracted from existing structures.
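
    The local curve relating rebound number to compressive strength is, in essence, a regression fitted to paired calibration measurements. The sketch below fits a simple power-law curve f = a * R^b to invented rebound/strength pairs; both the data and the fitted coefficients are assumptions, not the study's curve.

```python
import numpy as np

# Invented calibration pairs: rebound number R vs measured cube strength (MPa).
rebound  = np.array([22, 25, 28, 30, 33, 36, 38, 41, 44])
strength = np.array([18, 22, 27, 31, 36, 42, 45, 51, 57])

# Fit log(f) = log(a) + b*log(R), i.e. a power-law local curve f = a * R**b.
b, log_a = np.polyfit(np.log(rebound), np.log(strength), 1)
a = np.exp(log_a)

def predicted_strength(r):
    return a * r ** b

print(f"fitted local curve: f = {a:.3f} * R^{b:.2f}")
print("predicted strength at R=35:", round(float(predicted_strength(35)), 1), "MPa")
```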

  3. The relationship between the Big Five personality factors and burnout: a study among volunteer counselors.

    Science.gov (United States)

    Bakker, Arnold B; Van der Zee, Karen I; Lewig, Kerry A; Dollard, Maureen F

    2006-02-01

    In the present study of 80 volunteer counselors who cared for terminally ill patients, the authors examined the relationship between burnout as measured by the Maslach Burnout Inventory (C. Maslach, S. E. Jackson, & M. P. Leiter, 1996) and the 5 basic (Big Five) personality factors (A. A. J. Hendriks, 1997): extraversion, agreeableness, conscientiousness, emotional stability, and intellect/autonomy. The results of 3 separate stepwise multiple regression analyses showed that (a) emotional exhaustion is uniquely predicted by emotional stability; (b) depersonalization is predicted by emotional stability, extraversion, and intellect/autonomy; and (c) personal accomplishment is predicted by extraversion and emotional stability. In addition, some of the basic personality factors moderated the relationship between relative number of negative experiences and burnout, suggesting that personality may help to protect against known risks of developing burnout in volunteer human service work.

  4. A longitudinal study of the relationships between the Big Five personality traits and body size perception.

    Science.gov (United States)

    Hartmann, Christina; Siegrist, Michael

    2015-06-01

    The present study investigated the longitudinal development of body size perception in relation to different personality traits. A sample of Swiss adults (N=2905, 47% men), randomly selected from the telephone book, completed a questionnaire on two consecutive years (2012, 2013). Body size perception was assessed with the Contour Drawing Rating Scale and personality traits were assessed with a short version of the Big Five Inventory. Longitudinal analysis of change indicated that men and women scoring higher on conscientiousness perceived themselves as thinner one year later. In contrast, women scoring higher on neuroticism perceived their body size as larger one year later. No significant effect was observed for men scoring higher on neuroticism. These results were independent of weight changes, body mass index, age, and education. Our findings suggest that personality traits contribute to body size perception among adults.

  5. Optimisation of the LHCb detector

    CERN Document Server

    Hierck, R

    2003-01-01

    This thesis describes a comparison of the LHCb classic and LHCb light concept from a tracking perspective. The comparison includes the detector occupancies, the various pattern recognition algorithms and the reconstruction performance. The final optimised LHCb setup is used to study the physics performance of LHCb for the Bs->DsK and Bs->DsPi decay channels. This includes both the event selection and a study of the sensitivity for the Bs oscillation frequency, delta m_s, the Bs lifetime difference, DGamma_s, and the CP parameter gamma-2delta gamma.

  6. Optimisation of the LHCb detector

    CERN Document Server

    Hierck, R H

    2003-01-01

    This thesis describes a comparison of the LHCb classic and LHCb light concept from a tracking perspective. The comparison includes the detector occupancies, the various pattern recognition algorithms and the reconstruction performance. The final optimised LHCb setup is used to study the physics performance of LHCb for the Bs->DsK and Bs->DsPi decay channels. This includes both the event selection and a study of the sensitivity for the Bs oscillation frequency, delta m_s, the Bs lifetime difference, DGamma_s, and the CP parameter gamma-2delta gamma.

  7. Implementing large-scale programmes to optimise the health workforce in low- and middle-income settings: a multicountry case study synthesis.

    Science.gov (United States)

    Gopinathan, Unni; Lewin, Simon; Glenton, Claire

    2014-12-01

    To identify factors affecting the implementation of large-scale programmes to optimise the health workforce in low- and middle-income countries. We conducted a multicountry case study synthesis. Eligible programmes were identified through consultation with experts and using Internet searches. Programmes were selected purposively to match the inclusion criteria. Programme documents were gathered via Google Scholar and PubMed and from key informants. The SURE Framework - a comprehensive list of factors that may influence the implementation of health system interventions - was used to organise the data. Thematic analysis was used to identify the key issues that emerged from the case studies. Programmes from Brazil, Ethiopia, India, Iran, Malawi, Venezuela and Zimbabwe were selected. Key system-level factors affecting the implementation of the programmes were related to health worker training and continuing education, management and programme support structures, the organisation and delivery of services, community participation, and the sociopolitical environment. Existing weaknesses in health systems may undermine the implementation of large-scale programmes to optimise the health workforce. Changes in the roles and responsibilities of cadres may also, in turn, impact the health system throughout. © 2014 John Wiley & Sons Ltd.

  8. Comparison of Oncentra® Brachy IPSA and graphical optimisation techniques: a case study of HDR brachytherapy head and neck and prostate plans

    Energy Technology Data Exchange (ETDEWEB)

    Jameson, Michael G, E-mail: michael.jameson@sswahs.nsw.gov.au [Liverpool and Macarthur Cancer Therapy Centres, Liverpool, New South Wales (Australia); Centre for Medical Radiation Physics, University of Wollongong, Wollongong, New South Wales (Australia); Ingham Institute of Applied Medical Research, Liverpool, New South Wales (Australia); Ohanessian, Lucy [Liverpool and Macarthur Cancer Therapy Centres, Liverpool, New South Wales (Australia); Batumalai, Vikneswary [Liverpool and Macarthur Cancer Therapy Centres, Liverpool, New South Wales (Australia); Ingham Institute of Applied Medical Research, Liverpool, New South Wales (Australia); South Western Sydney Clinical School, School of Medicine, University of New South Wales (Australia); Patel, Virendra [Liverpool and Macarthur Cancer Therapy Centres, Liverpool, New South Wales (Australia); Holloway, Lois C [Liverpool and Macarthur Cancer Therapy Centres, Liverpool, New South Wales (Australia); Centre for Medical Radiation Physics, University of Wollongong, Wollongong, New South Wales (Australia); Ingham Institute of Applied Medical Research, Liverpool, New South Wales (Australia); South Western Sydney Clinical School, School of Medicine, University of New South Wales (Australia); Institute of Medical Physics, School of Physics, University of Sydney, Sydney, New South Wales (Australia)

    2015-06-15

    There are a number of different dwell positions and time optimisation options available in the Oncentra® Brachy (Elekta Brachytherapy Solutions, Veenendaal, The Netherlands) brachytherapy treatment planning system. The purpose of this case study was to compare graphical (GRO) and inverse planning by simulated annealing (IPSA) optimisation techniques for interstitial head and neck (HN) and prostate plans considering dosimetry, modelled radiobiology outcome and planning time. Four retrospective brachytherapy patients were chosen for this study, two recurrent HN and two prostatic boosts. Manual GRO and IPSA plans were generated for each patient. Plans were compared using dose–volume histograms (DVH) and dose coverage metrics including: conformity index (CI), homogeneity index (HI) and conformity number (CN). Logit and relative seriality models were used to calculate tumour control probability (TCP) and normal tissue complication probability (NTCP). Approximate planning time was also recorded. There was no significant difference between GRO and IPSA in terms of dose metrics, with mean CIs of 1.30 and 1.57, respectively (P > 0.05). IPSA achieved an average HN TCP of 0.32 versus 0.12 for GRO, while for prostate there was no significant difference. Mean GRO planning times were greater than 75 min while average IPSA planning times were less than 10 min. Planning times for IPSA were greatly reduced compared to GRO and plans were dosimetrically similar. For this reason, IPSA makes for a useful planning tool in HN and prostate brachytherapy.

  9. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  10. Optimisation of Investment Resources at Small Enterprises

    Directory of Open Access Journals (Sweden)

    Shvets Iryna B.

    2014-03-01

    Full Text Available The goal of the article lies in the study of the process of optimisation of the structure of investment resources, development of criteria and stages of optimisation of volumes of investment resources for small enterprises by types of economic activity. The article characterises the process of transformation of investment resources into assets and liabilities of the balances of small enterprises and conducts calculation of the structure of sources of formation of investment resources in Ukraine at small enterprises by types of economic activity in 2011. On the basis of the conducted analysis of the structure of investment resources of small enterprises the article forms main groups of criteria of optimisation in the context of individual small enterprises by types of economic activity. The article offers an algorithm and step-by-step scheme of optimisation of investment resources at small enterprises in the form of a multi-stage process of management of investment resources in the context of increase of their mobility and rate of transformation of existing resources into investments. The prospect of further studies in this direction is development of a structural and logic scheme of optimisation of volumes of investment resources at small enterprises.

  11. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  12. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction

    Directory of Open Access Journals (Sweden)

    Nigsch Florian

    2008-10-01

    Full Text Available Abstract Background We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024–1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581–590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Results Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6°C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, ε of 0.21) and an RMSE of 45.1°C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3°C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5°C, R2 of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. Conclusion With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.
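
    The record above describes simultaneous feature selection and hyperparameter tuning. As a rough illustration of that general idea (and emphatically not the authors' WAAC implementation), the sketch below keeps per-feature selection probabilities, lets a small population of "ants" sample feature subsets and SVR hyperparameters, scores each by cross-validation, and nudges the probabilities towards the best model; the dataset, population sizes and update rule are all arbitrary choices made for the example.

```python
# Rough sketch of simultaneous feature selection and hyperparameter tuning.
# NOT the authors' WAAC code: a toy probability-driven wrapper in the same spirit.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=300, n_features=50, n_informative=10,
                       noise=10.0, random_state=0)

n_feat = X.shape[1]
probs = np.full(n_feat, 0.5)              # per-feature selection probabilities
best_score, best_mask, best_params = -np.inf, np.ones(n_feat, dtype=bool), None

for iteration in range(30):               # a few "colony" iterations
    for ant in range(10):                 # each ant proposes a candidate model
        mask = rng.random(n_feat) < probs
        if not mask.any():
            continue
        C = 10 ** rng.uniform(-1, 2)      # hyperparameters sampled on a log scale
        eps = 10 ** rng.uniform(-2, 0)
        score = cross_val_score(SVR(C=C, epsilon=eps), X[:, mask], y,
                                cv=5, scoring="neg_mean_squared_error").mean()
        if score > best_score:
            best_score, best_mask, best_params = score, mask, (C, eps)
    # reinforce the features used by the best model, winnow the rest
    probs = np.clip(0.9 * probs + 0.1 * best_mask, 0.05, 0.95)

print("features kept:", int(best_mask.sum()),
      "CV MSE:", round(-best_score, 1), "params:", best_params)
```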

  13. Optimisation by hierarchical search

    Science.gov (United States)

    Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias

    2015-03-01

    Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general purpose approach to potentially improve the performance of conventional algorithms or special purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.
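
    For readers unfamiliar with the idea of optimising groups of variables hierarchically, the toy sketch below conveys the flavour only: it performs block-wise local search on a random Ising-style cost, exhaustively optimising one small group of spins at a time while the others are frozen. It is not the authors' algorithm; the cost function, group size and sweep count are arbitrary.

```python
# Toy illustration of optimising groups of variables hierarchically / block-wise.
# Not the authors' algorithm: block-local exhaustive search on a random Ising cost.
import numpy as np

rng = np.random.default_rng(1)
n, group_size = 64, 8
J = rng.normal(size=(n, n))
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)

def energy(s):
    # Ising-style cost function with random couplings
    return -0.5 * s @ J @ s

s = rng.choice([-1, 1], size=n)
for sweep in range(20):
    for start in range(0, n, group_size):
        idx = np.arange(start, start + group_size)
        best_e, best_bits = np.inf, s[idx].copy()
        # exhaustively optimise this small group while the rest is frozen
        for b in range(2 ** group_size):
            trial = s.copy()
            trial[idx] = np.where((b >> np.arange(group_size)) & 1, 1, -1)
            e = energy(trial)
            if e < best_e:
                best_e, best_bits = e, trial[idx].copy()
        s[idx] = best_bits

print("final energy:", round(float(energy(s)), 2))
```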

  14. Men Get 70% of Money Available for Athletic Scholarships at Colleges That Play Big-Time Sports, New Study Finds.

    Science.gov (United States)

    Lederman, Douglas

    1992-01-01

    A study on sex equity by the National Collegiate Athletic Association found men's college athletic teams receive 70 percent of athletic scholarship money, 77 percent of operating money, and 83 percent of recruiting money spent by colleges playing big-time sports, despite virtually equal enrollment of men and women. Interpretations of the data…

  15. The Interplay of "Big Five" Personality Factors and Metaphorical Schemas: A Pilot Study with 20 Lung Transplant Recipients

    Science.gov (United States)

    Goetzmann, Lutz; Moser, Karin S.; Vetsch, Esther; Grieder, Erhard; Klaghofer, Richard; Naef, Rahel; Russi, Erich W.; Boehler, Annette; Buddeberg, Claus

    2007-01-01

    The aim of the present study was to investigate the interplay between personality factors and metaphorical schemas. The "Big Five" personality factors of 20 patients after lung transplantation were examined with the NEO-FFI. Patients were questioned about their social network, and self- and body-image. The interviews were assessed with metaphor…

  16. Optimisation study of {alpha}-cyclotron production of At-211/Po-211g for high-LET metabolic radiotherapy purposes

    Energy Technology Data Exchange (ETDEWEB)

    Groppi, F. [Universita degli Studi di Milano and INFN-Milano, LASA, Radiochemistry Laboratory, via F.lli Cervi 201, I-20090 Segrate, Milan (Italy)]. E-mail: flavia.groppi@mi.infn.it; Bonardi, M.L. [Universita degli Studi di Milano and INFN-Milano, LASA, Radiochemistry Laboratory, via F.lli Cervi 201, I-20090 Segrate, Milan (Italy); Birattari, C. [Universita degli Studi di Milano and INFN-Milano, LASA, Radiochemistry Laboratory, via F.lli Cervi 201, I-20090 Segrate, Milan (Italy); Menapace, E. [ENEA, Division for Advanced Physics Technologies, via Don Fiammelli 2, I-40128 Bologna (Italy); Abbas, K. [Institute for Health and Consumer Protection, IHCP, JRC-Ispra, via E. Fermi, I-21020 Varese (Italy); Holzwarth, U. [Institute for Health and Consumer Protection, IHCP, JRC-Ispra, via E. Fermi, I-21020 Varese (Italy); Alfarano, A. [Universita degli Studi di Milano and INFN-Milano, LASA, Radiochemistry Laboratory, via F.lli Cervi 201, I-20090 Segrate, Milan (Italy); Institute for Health and Consumer Protection, IHCP, JRC-Ispra, via E. Fermi, I-21020 Varese (Italy); Morzenti, S. [Universita degli Studi di Milano and INFN-Milano, LASA, Radiochemistry Laboratory, via F.lli Cervi 201, I-20090 Segrate, Milan (Italy); Zona, C. [Universita degli Studi di Milano and INFN-Milano, LASA, Radiochemistry Laboratory, via F.lli Cervi 201, I-20090 Segrate, Milan (Italy); Alfassi, Z.B. [Department of Nuclear Engineering, Ben Gurion University of Negev, Beer-Sheva, Il-84105 (Israel)

    2005-12-01

    The production of no-carrier-added (NCA) α-emitter At-211/Po-211g radionuclides for high-LET targeted radiotherapy and immunoradiotherapy, through the Bi-209(α,2n) reaction, together with the required wet radiochemistry and radioanalytical quality controls carried out at LASA is described, through dedicated irradiation experiments at the MC-40 cyclotron of JRC-Ispra. The amount of both the γ-emitter At-210 and its long half-lived α-emitting daughter Po-210 is optimised and minimised by appropriate choice of energy and energy loss of the α particle beam. The measured excitation functions for production of the main radioisotopic impurity At-210 → Po-210 are compared with theoretical predictions from model calculations performed at ENEA.

  17. Big Data

    OpenAIRE

    2013-01-01

    The surge in the collection and accumulation of data has coined the new term "Big Data". Information, so-called data, is massively generated accidentally, incidentally and through the interactions of people, and this big data is to be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists and a wide variety of the intelligentsia debate over the potential benefits and costs of analysing information from Twitter, Google,...

  18. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers from a multitude of institutions ... a stochastic actor oriented model (SAOM) to analyze both the network-endogenous mechanisms and the individual agency driving the collaboration network, and further whether being a Big Ego in Big Science translates into increasing performance. Our findings suggest that the selection of collaborators is not based ... knowledge-producing environments with more visible boundaries and higher thresholds for collaboration.

  19. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.

  20. 1989 Waterfowl Nesting Study and Nesting Summary 1984-1989 : Big Stone National Wildlife Refuge

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The 1989 nesting season marked the sixth consecutive and final year that Big Stone personnel have conducted waterfowl nesting research. In addition, an experimental...

  1. Optimisation of GnRH antagonist use in ART

    NARCIS (Netherlands)

    Hamdine, O.

    2014-01-01

    This thesis focuses on the optimisation of controlled ovarian stimulation for IVF using exogenous FSH and GnRH antagonist co-treatment, by studying the timing of the initiation of GnRH antagonist co-medication and the role of ovarian reserve markers in optimising ovarian response and reproductive outcome.

  2. Multi-wavelength studies of the statistical properties of active galaxies using Big Data

    Science.gov (United States)

    Mickaelian, A. M.; Abrahamyan, H. V.; Gyulzadyan, M. V.; Mikayelyan, G. A.; Paronyan, G. M.

    2017-06-01

    Statistical studies of active galaxies (both AGN and Starburst) using large multi-wavelength data are presented, including new studies of Markarian galaxies, a large sample of IR galaxies, variable radio sources, and a large homogeneous sample of X-ray selected AGN. The Markarian survey (the First Byurakan Survey) was digitized and the DFBS database was created, the biggest spectroscopic database by number of objects involved (~20 million). This database provides both 2D images and 1D spectra. We have carried out a number of projects aimed at revealing active galaxies among optical, X-ray, IR and radio sources and studying them across multiple wavelengths. Thousands of X-ray sources were identified from ROSAT, including many AGN (52% of all identified sources). IRAS PSC/FSC sources with accurate positions from WISE were studied and a large extragalactic sample was created for a further search for AGNs. The fraction of active galaxies among IR-selected galaxies was estimated as 24%. Variable radio sources at 1.4 GHz were revealed by cross-correlation of the NVSS and FIRST catalogues using the method we introduced for optical variability. Radio-X-ray sources were revealed from NVSS and ROSAT for the detection of new active galaxies. Big Data in astronomy, which provides new possibilities for statistical research of active galaxies and other objects, is also described.

  3. Sharing big biomedical data.

    Science.gov (United States)

    Toga, Arthur W; Dinov, Ivo D

    The promise of Big Biomedical Data may be offset by the enormous challenges in handling, analyzing, and sharing it. In this paper, we provide a framework for developing practical and reasonable data sharing policies that incorporate the sociological, financial, technical and scientific requirements of a sustainable Big Data dependent scientific community. Many biomedical and healthcare studies may be significantly impacted by using large, heterogeneous and incongruent datasets; however, there are significant technical, social, regulatory, and institutional barriers that need to be overcome before the power of Big Data can be realised. Pragmatic policies that demand extensive sharing of data, promote data fusion, provenance and interoperability, and balance security with the protection of personal information are critical for the long-term impact of translational Big Data analytics.

  4. Family Connections versus optimised treatment-as-usual for family members of individuals with borderline personality disorder: non-randomised controlled study.

    LENUS (Irish Health Repository)

    Flynn, Daniel

    2017-01-01

    Borderline personality disorder (BPD) is challenging for family members who are often required to fulfil multiple roles such as those of advocate, caregiver, coach and guardian. To date, two uncontrolled studies by the treatment developers suggest that Family Connections (FC) is an effective programme to support, educate and teach skills to family members of individuals with BPD. However, such studies have been limited by lack of comparison to other treatment approaches. This study aimed to compare the effectiveness of FC with an optimised treatment-as-usual (OTAU) programme for family members of individuals with BPD. A secondary aim was to introduce a long term follow-up to investigate if positive gains from the intervention would be maintained following programme completion.

  5. Studies in Microbiome Big Data

    Institute of Scientific and Technical Information of China (English)

    蒋兴鹏; 胡小华

    2015-01-01

    Microbiome big data has had a great influence on environmental ecology, human health and disease research. Extracting useful information from high-dimensional, complex data sets by mathematical and statistical data-mining methods is the key question in modelling and analysing microbiome big data. This paper summarises the characteristics of microbiome big data, discusses current hot topics and difficult questions in data analysis and computational studies, and reviews the current status of pattern mining in microbiome big data as well as network reconstruction and analysis.

  6. Simulating stem growth using topological optimisation

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Narváez

    2010-04-01

    Full Text Available Engineers are currently resorting to observations of nature for making new designs. Studying the functioning of the bodies of plants and animals has required them to be modelled and simulated; however, some models born from engineering problems could be used for such purposes. This article shows how topological optimisation (a mathematical model for optimising the design of structural elements) can be used for modelling and simulating the way a stem grows in terms of carrying out its function of providing support for the leaves and a plant's other upper organs.

  7. A case study to optimise and validate the brine shrimp Artemia franciscana immobilisation assay with silver nanoparticles: The role of harmonisation.

    Science.gov (United States)

    Kos, Monika; Kahru, Anne; Drobne, Damjana; Singh, Shashi; Kalčíková, Gabriela; Kühnel, Dana; Rohit, Rekulapelly; Gotvajn, Andreja Žgajnar; Jemec, Anita

    2016-06-01

    Brine shrimp Artemia sp. has been recognised as an important ecotoxicity and nanotoxicity test model organism for salt-rich aquatic environments, but currently there is still no harmonised testing protocol that would ensure comparable results for hazard identification. In this paper we aimed to design a harmonised protocol for nanomaterial toxicity testing using Artemia franciscana and present a case study to validate the protocol with silver nanoparticles (AgNPs). We (i) revised the existing nanotoxicity test protocols with Artemia sp., (ii) optimised certain methodological steps based on the experiments with AgNPs and potassium dichromate (K2Cr2O7) as a soluble reference chemical and (iii) tested the optimised protocol in an international inter-laboratory exercise conducted within the EU FP7 NanoValid project. The intra- and inter-laboratory reproducibility of the proposed protocol with the soluble reference chemical K2Cr2O7 was good, which confirms the suitability of this assay for conventional chemicals. However, the variability of the AgNP toxicity results was very high, showing again that nanomaterials are inherently challenging for toxicity studies, especially those whose toxic effect is linked to shed metal ions. Among the identified sources of this variability were: the hatching conditions, the type of test plate incubation and the illumination regime. The latter induced variations presumably due to changes in the concentrations of bioavailable silver species. To the best of our knowledge, this is the first inter-laboratory comparison of an Artemia sp. toxicity study involving nanomaterials. Although the inter-laboratory exercise revealed poor repeatability of the AgNP toxicity results, this study provides valuable information regarding the importance of harmonisation of all steps in the test procedure. Also, the presented AgNP toxicity case study may serve as a platform for further validation steps with other types of NMs.

  8. Principles of Experimental Design for Big Data Analysis.

    Science.gov (United States)

    Drovandi, Christopher C; Holmes, Christopher; McGree, James M; Mengersen, Kerrie; Richardson, Sylvia; Ryan, Elizabeth G

    2017-08-01

    Big Datasets are endemic, but are often notoriously difficult to analyse because of their size, heterogeneity and quality. The purpose of this paper is to open a discourse on the potential for modern decision theoretic optimal experimental design methods, which by their very nature have traditionally been applied prospectively, to improve the analysis of Big Data through retrospective designed sampling in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has the potential for wide generality and advantageous inferential and computational properties. We highlight current hurdles and open research questions surrounding efficient computational optimisation in using retrospective designs, and in part this paper is a call to the optimisation and experimental design communities to work together in the field of Big Data analysis.
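
    The notion of retrospective designed sampling can be made concrete with a toy example. The sketch below, which is not the paper's methodology, greedily selects a small subsample of rows from a large synthetic design matrix so as to maximise a D-optimality criterion, then fits a linear model on the chosen rows only; the data-generating model, pool size and subsample size are assumptions made purely for illustration.

```python
# Toy example of retrospective "designed" subsampling (not the paper's method):
# greedily pick rows of a big design matrix that maximise the D-optimality
# criterion log det(X_s' X_s), then fit a linear model on that subsample only.
import numpy as np

rng = np.random.default_rng(2)
N, p, k = 100_000, 5, 200                       # big dataset, small designed subsample
X = rng.normal(size=(N, p))
beta = np.array([1.0, -2.0, 0.5, 0.0, 3.0])     # assumed true coefficients
y = X @ beta + rng.normal(scale=0.5, size=N)

candidates = rng.choice(N, size=5_000, replace=False)   # random candidate pool
chosen = list(rng.choice(candidates, size=p, replace=False))
candidates = np.setdiff1d(candidates, chosen)
M = X[chosen].T @ X[chosen]

for _ in range(k - p):
    Minv = np.linalg.inv(M)
    # matrix determinant lemma: det(M + x x') = det(M) * (1 + x' M^{-1} x)
    gains = 1.0 + np.einsum("ij,jk,ik->i", X[candidates], Minv, X[candidates])
    best = candidates[np.argmax(gains)]
    chosen.append(best)
    candidates = candidates[candidates != best]
    M += np.outer(X[best], X[best])

beta_hat, *_ = np.linalg.lstsq(X[chosen], y[chosen], rcond=None)
print("estimate from", len(chosen), "designed rows:", np.round(beta_hat, 2))
```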

  9. Optimising resource management in neurorehabilitation.

    Science.gov (United States)

    Wood, Richard M; Griffiths, Jeff D; Williams, Janet E; Brouwers, Jakko

    2014-01-01

    To date, little research has been published regarding the effective and efficient management of resources (beds and staff) in neurorehabilitation, despite being an expensive service in limited supply. To demonstrate how mathematical modelling can be used to optimise service delivery, by way of a case study at a major 21 bed neurorehabilitation unit in the UK. An automated computer program for assigning weekly treatment sessions is developed. Queue modelling is used to construct a mathematical model of the hospital in terms of referral submissions to a waiting list, admission and treatment, and ultimately discharge. This is used to analyse the impact of hypothetical strategic decisions on a variety of performance measures and costs. The project culminates in a hybridised model of these two approaches, since a relationship is found between the number of therapy hours received each week (scheduling output) and length of stay (queuing model input). The introduction of the treatment scheduling program has substantially improved timetable quality (meaning a better and fairer service to patients) and has reduced employee time expended in its creation by approximately six hours each week (freeing up time for clinical work). The queuing model has been used to assess the effect of potential strategies, such as increasing the number of beds or employing more therapists. The use of mathematical modelling has not only optimised resources in the short term, but has allowed the optimality of longer term strategic decisions to be assessed.
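
    As a hint of what such a queue model can look like in practice, the sketch below simulates referrals arriving at a fixed pool of beds with exponentially distributed length of stay and compares mean waiting time across bed counts. All parameters (arrival rate, mean length of stay, bed numbers) are made up for illustration and are not taken from the case study.

```python
# Minimal bed-pool simulation with made-up parameters (not the unit's actual model):
# Poisson referrals feed a fixed number of beds, length of stay is exponential, and
# mean waiting time is compared across hypothetical bed counts.
import heapq
import numpy as np

def simulate(n_beds, arrival_rate=0.35, mean_los_days=55.0,
             horizon_days=3650, seed=3):
    rng = np.random.default_rng(seed)
    free_at = [0.0] * n_beds            # time at which each bed next becomes free
    heapq.heapify(free_at)
    t, waits = 0.0, []
    while t < horizon_days:
        t += rng.exponential(1.0 / arrival_rate)        # next referral arrives
        bed_free = heapq.heappop(free_at)               # earliest available bed
        start = max(t, bed_free)                        # wait if no bed is free
        waits.append(start - t)
        heapq.heappush(free_at, start + rng.exponential(mean_los_days))
    return float(np.mean(waits))

for beds in (19, 21, 23, 25):
    print(beds, "beds -> mean wait", round(simulate(beds), 1), "days")
```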

  10. Internet of things and Big Data as potential solutions to the problems in waste electrical and electronic equipment management: An exploratory study.

    Science.gov (United States)

    Gu, Fu; Ma, Buqing; Guo, Jianfeng; Summers, Peter A; Hall, Philip

    2017-10-01

    Management of Waste Electrical and Electronic Equipment (WEEE) is a vital part of solid waste management, yet some difficult issues still require attention. This paper investigates the potential of applying the Internet of Things (IoT) and Big Data as solutions to WEEE management problems. The massive data generated during the production, consumption and disposal of Electrical and Electronic Equipment (EEE) fit the characteristics of Big Data. Using state-of-the-art communication technologies, the IoT derives the WEEE "Big Data" from the life cycle of EEE, and Big Data technologies process the WEEE "Big Data" to support decision making in WEEE management. A framework for implementing the IoT and Big Data technologies is proposed, and its multiple layers are illustrated. Case studies with potential application scenarios of the framework are presented and discussed. As an unprecedented exploration, the combined application of the IoT and Big Data technologies in WEEE management brings a series of opportunities as well as new challenges. This study provides insights and visions for stakeholders in solving WEEE management problems in the context of IoT and Big Data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. The EpiDerm test protocol for the upcoming ECVAM validation study on in vitro skin irritation tests--an assessment of the performance of the optimised test.

    Science.gov (United States)

    Kandárová, Helena; Liebsch, Manfred; Gerner, Ingrid; Schmidt, Elisabeth; Genschow, Elke; Traue, Dieter; Spielmann, Horst

    2005-08-01

    During the past decade, several validation studies have been conducted on in vitro methods for discriminating between skin irritating and non-irritating chemicals. The reconstructed human skin models, EpiDerm and EPISKIN, provided the most promising results. Based on experience of the similar performance of the two skin models, it was suggested that a common test protocol and prediction model should be developed for the prediction of skin irritation potential with the two models. When the EPISKIN protocol was applied with the EpiDerm model, an acceptable specificity (80%) was achieved, whereas the sensitivity (60%) was low. In 2003, the EPISKIN protocol was further refined by extending the post-incubation period following exposure to test chemicals. This extension and additional technical improvements to the EpiDerm protocol were evaluated with 19 chemicals from the prevalidation study. With the new test design, high sensitivity (80%) and specificity (78%) were obtained. The statistical probability for correct classifications was high, so the test was considered to be ready for formal validation. However, since test optimisation had been conducted with the same test chemicals as were used in the ECVAM prevalidation study, it was decided that the optimisation of the protocol had to be verified with a new set of chemicals. Thus, in the current study, 26 additional chemicals (10 rabbit irritants and 16 non-irritants), which had previously been selected and tested by LOREAL with EPISKIN, were evaluated in three independent experiments with EpiDerm. With this unbalanced testing set, a specificity of 94%, and a sensitivity of 60% were obtained, while the positive and negative predictivity and accuracy remained almost unchanged (around 80%) in comparison to the in vivo rabbit data. Overall, 45 chemicals (20 irritants and 25 non-irritants) were tested according to the final protocol. The resulting high positive (82%) and negative predictive values (79%) confirmed the

  12. Optimisation of Microstrip Antenna

    Directory of Open Access Journals (Sweden)

    H. El Hamchary

    1996-04-01

    Full Text Available When choosing the most appropriate microstrip antenna configuration for particular applications, the kind of excitation of the radiating element is an essential factor that requires careful consideration. For controlling the distribution of energy across a linear or planar array of elements and for coupling energy to the individual elements, a wide variety of feed mechanisms are available. In this paper, coaxial feeding of the antenna is assumed and the best (optimised) feeding is found. Then, antenna characteristics such as radiation pattern, return loss, input impedance, and VSWR are obtained.

  13. A CASE STUDY ON MAXIMISING THE PROFITABILITY OF A FORM FILL AND SEAL MACHINE BY OPTIMISING INTERRUPTION INTERVALS

    Directory of Open Access Journals (Sweden)

    P.J. Vlok

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: The practice of delivering consumer liquids in sachets, as opposed to alternative disposable packaging, has gained significant ground in the market in recent years because of environmental considerations, the cost benefit of sachets, and the relatively simple machinery required to produce sachets. In this paper, data obtained from a form, fill and seal (FFS) sachet producing machine is analysed for financial feasibility. A statistical model is fitted to the data to optimise production interruptions, and the model’s relevance and value is confirmed on a second data set obtained from the same machine.

  14. Big Data Analytics for Smart Manufacturing: Case Studies in Semiconductor Manufacturing

    Directory of Open Access Journals (Sweden)

    James Moyne

    2017-07-01

    Full Text Available Smart manufacturing (SM) is a term generally applied to the improvement in manufacturing operations through integration of systems, linking of physical and cyber capabilities, and taking advantage of information including leveraging the big data evolution. SM adoption has been occurring unevenly across industries, thus there is an opportunity to look to other industries to determine solution and roadmap paths for industries such as biochemistry or biology. The big data evolution affords an opportunity for managing significantly larger amounts of information and acting on it with analytics for improved diagnostics and prognostics. The analytics approaches can be defined in terms of dimensions to understand their requirements and capabilities, and to determine technology gaps. The semiconductor manufacturing industry has been taking advantage of the big data and analytics evolution by improving existing capabilities such as fault detection, and supporting new capabilities such as predictive maintenance. For most of these capabilities: (1) data quality is the most important big data factor in delivering high quality solutions; and (2) incorporating subject matter expertise in analytics is often required for realizing effective on-line manufacturing solutions. In the future, an improved big data environment incorporating smart manufacturing concepts such as digital twin will further enable analytics; however, it is anticipated that the need for incorporating subject matter expertise in solution design will remain.

  15. Big five personality factors and cigarette smoking: a 10-year study among US adults.

    Science.gov (United States)

    Zvolensky, Michael J; Taha, Farah; Bono, Amanda; Goodwin, Renee D

    2015-04-01

    The present study examined the relation between the big five personality traits and any lifetime cigarette use, progression to daily smoking, and smoking persistence among adults in the United States (US) over a ten-year period. Data were drawn from the Midlife Development in the US (MIDUS) I and II (N = 2101). Logistic regression was used to examine the relationship between continuously measured personality factors and any lifetime cigarette use, smoking progression, and smoking persistence at baseline (1995-1996) and at follow-up (2004-2006). The results revealed that higher levels of openness to experience and neuroticism were each significantly associated with increased risk of any lifetime cigarette use. Neuroticism also was associated with increased risk of progression from ever smoking to daily smoking and persistent daily smoking over a ten-year period. In contrast, conscientiousness was associated with decreased risk of lifetime cigarette use, progression to daily smoking, and smoking persistence. Most, but not all, associations between smoking and personality persisted after adjusting for demographic characteristics, depression, anxiety disorders, and substance use problems. The findings suggest that openness to experience and neuroticism may be involved in any lifetime cigarette use and smoking progression, and that conscientiousness appears to protect against smoking progression and persistence. These data add to a growing literature suggesting that certain personality factors--most consistently neuroticism--are important to assess and perhaps target during intervention programs for smoking behavior.
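
    To make the type of analysis concrete, the sketch below fits a logistic regression of a binary smoking outcome on continuously measured Big Five scores plus a covariate and reports odds ratios. It uses synthetic data with an assumed data-generating process, not the MIDUS data, and it is not a reproduction of the authors' models.

```python
# Illustration only (synthetic data, not MIDUS): logistic regression of a binary
# smoking outcome on continuous Big Five scores plus a covariate, reported as
# odds ratios. The data-generating process below is assumed for the example.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 2000
traits = rng.normal(size=(n, 5))        # columns: O, C, E, A, N (standardised)
age = rng.normal(47, 12, size=n)
# assumed toy effects: neuroticism +, conscientiousness -, openness +
logit = -0.5 + 0.30 * traits[:, 4] - 0.35 * traits[:, 1] + 0.20 * traits[:, 0]
smoker = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([traits, (age - age.mean()) / age.std()]))
names = ["const", "openness", "conscientiousness", "extraversion",
         "agreeableness", "neuroticism", "age_z"]
res = sm.Logit(smoker, X).fit(disp=0)
for name, coef in zip(names, res.params):
    print(f"{name:18s} OR = {np.exp(coef):.2f}")
```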

  16. A Global Optimisation Toolbox for Massively Parallel Engineering Optimisation

    CERN Document Server

    Biscani, Francesco; Yam, Chit Hong

    2010-01-01

    A software platform for global optimisation, called PaGMO, has been developed within the Advanced Concepts Team (ACT) at the European Space Agency, and was recently released as an open-source project. PaGMO is built to tackle high-dimensional global optimisation problems, and it has been successfully used to find solutions to real-life engineering problems among which the preliminary design of interplanetary spacecraft trajectories - both chemical (including multiple flybys and deep-space maneuvers) and low-thrust (limited, at the moment, to single phase trajectories), the inverse design of nano-structured radiators and the design of non-reactive controllers for planetary rovers. Featuring an arsenal of global and local optimisation algorithms (including genetic algorithms, differential evolution, simulated annealing, particle swarm optimisation, compass search, improved harmony search, and various interfaces to libraries for local optimisation such as SNOPT, IPOPT, GSL and NLopt), PaGMO is at its core a C++ ...
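
    PaGMO itself is not shown here; as a minimal taste of the population-based global optimisation family it provides, the sketch below runs differential evolution (via SciPy) on the Rastrigin benchmark. The benchmark, bounds and settings are arbitrary choices for illustration.

```python
# Not PaGMO's API: a quick look at the same family of population-based global
# optimisers, using SciPy's differential evolution on the Rastrigin benchmark.
import numpy as np
from scipy.optimize import differential_evolution

def rastrigin(x):
    x = np.asarray(x)
    return 10.0 * len(x) + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

bounds = [(-5.12, 5.12)] * 10
result = differential_evolution(rastrigin, bounds, maxiter=500, tol=1e-8, seed=5)
print("best value:", round(result.fun, 4), "at", np.round(result.x, 3))
```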

  17. A study on specialist or special disease clinics based on big data.

    Science.gov (United States)

    Fang, Zhuyuan; Fan, Xiaowei; Chen, Gong

    2014-09-01

    Correlation analysis and processing of massive medical information can be implemented through big data technology to find the relevance of different factors in the life cycle of a disease and to provide a basis for scientific research and clinical practice. This paper explores the concept of constructing a big medical data platform and introduces the construction of clinical models. Medical data can be collected and consolidated by distributed computing technology. Through analysis techniques such as artificial neural networks and grey models, a medical model can be built. Big data analysis frameworks, such as Hadoop, can be used to construct early prediction and intervention models as well as clinical decision-making models for specialist and special disease clinics. This establishes a new model for common clinical research for specialist and special disease clinics.

  18. Particle Swarm Optimisation with Spatial Particle Extension

    DEFF Research Database (Denmark)

    Krink, Thiemo; Vesterstrøm, Jakob Svaneborg; Riget, Jacques

    2002-01-01

    In this paper, we introduce spatial extension to particles in the PSO model in order to overcome premature convergence in iterative optimisation. The standard PSO and the new model (SEPSO) are compared w.r.t. performance on well-studied benchmark problems. We show that the SEPSO indeed managed...
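
    The sketch below is a bare-bones illustration of the idea rather than the paper's SEPSO: a standard PSO on the sphere benchmark, extended with a crude spatial rule that reflects the velocities of particles that come within a collision radius of one another, so as to preserve diversity. Coefficients, radius and benchmark are arbitrary choices.

```python
# Bare-bones sketch, not the paper's SEPSO: standard PSO on the sphere function,
# with a crude "spatial extension" -- particles closer than a collision radius
# have their velocities reflected to push them apart and preserve diversity.
import numpy as np

rng = np.random.default_rng(6)
dim, n_particles, radius = 10, 30, 0.5

def f(x):
    return np.sum(x**2, axis=-1)        # sphere benchmark

x = rng.uniform(-10, 10, size=(n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), f(x)
gbest = pbest[np.argmin(pbest_val)].copy()

for step in range(200):
    r1, r2 = rng.random((2, n_particles, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    # spatial extension: bounce particles that collide
    for i in range(n_particles):
        for j in range(i + 1, n_particles):
            if np.linalg.norm(x[i] - x[j]) < radius:
                v[i], v[j] = -v[i], -v[j]
    val = f(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best value found:", float(f(gbest)))
```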

  19. Big Data Quality Case Study Preliminary Findings, U.S. Army MEDCOM MODS

    Science.gov (United States)

    2013-09-01

    ... relational technology in concert with their newer intermediate Aster products and the standard Hadoop map-reduce Big Data technologies. ... The document and records data, as well as the text data, will be processed through a separate Hadoop-based Big Data Analytics environment using the ...

  20. The hydrodynamics of the Big Horn Basin: a study of the role of faults

    Science.gov (United States)

    Bredehoeft, J.D.; Belitz, K.; Sharp-Hansen, S.

    1992-01-01

    A three-dimensional mathematical model simulates groundwater flow in the Big Horn basin, Wyoming. The hydraulic head at depth over much of the Big Horn basin is near the land surface elevation, a condition usually defined as hydrostatic. This condition indicates a high, regional-scale, vertical conductivity for the sediments in the basin. Our hypothesis to explain the high conductivity is that the faults act as vertical conduits for fluid flow. These same faults can act as either horizontal barriers to flow or nonbarriers, depending upon whether the fault zones are more permeable or less permeable than the adjoining aquifers. -from Authors

  1. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  3. From experimental zoology to big data: Observation and integration in the study of animal development.

    Science.gov (United States)

    Bolker, Jessica; Brauckmann, Sabine

    2015-06-01

    The founding of the Journal of Experimental Zoology in 1904 was inspired by a widespread turn toward experimental biology in the 19th century. The founding editors sought to promote experimental, laboratory-based approaches, particularly in developmental biology. This agenda raised key practical and epistemological questions about how and where to study development: Does the environment matter? How do we know that a cell or embryo isolated to facilitate observation reveals normal developmental processes? How can we integrate descriptive and experimental data? R.G. Harrison, the journal's first editor, grappled with these questions in justifying his use of cell culture to study neural patterning. Others confronted them in different contexts: for example, F.B. Sumner insisted on the primacy of fieldwork in his studies on adaptation, but also performed breeding experiments using wild-collected animals. The work of Harrison, Sumner, and other early contributors exemplified both the power of new techniques, and the meticulous explanation of practice and epistemology that was marshaled to promote experimental approaches. A century later, experimentation is widely viewed as the standard way to study development; yet at the same time, cutting-edge "big data" projects are essentially descriptive, closer to natural history than to the approaches championed by Harrison et al. Thus, the original questions about how and where we can best learn about development are still with us. Examining their history can inform current efforts to incorporate data from experiment and description, lab and field, and a broad range of organisms and disciplines, into an integrated understanding of animal development. © 2015 Wiley Periodicals, Inc.

  4. Understanding the implementation and adoption of an information technology intervention to support medicine optimisation in primary care: qualitative study using strong structuration theory.

    Science.gov (United States)

    Jeffries, Mark; Phipps, Denham; Howard, Rachel L; Avery, Anthony; Rodgers, Sarah; Ashcroft, Darren

    2017-05-10

    Using strong structuration theory, we aimed to understand the adoption and implementation of an electronic clinical audit and feedback tool to support medicine optimisation for patients in primary care. This is a qualitative study informed by strong structuration theory. The analysis was thematic, using a template approach. An a priori set of thematic codes, based on strong structuration theory, was developed from the literature and applied to the transcripts. The coding template was then modified through successive readings of the data. Clinical commissioning group in the south of England. Four focus groups and five semi-structured interviews were conducted with 18 participants purposively sampled from a range of stakeholder groups (general practitioners, pharmacists, patients and commissioners). Using the system could lead to improved medication safety, but use was determined by broad institutional contexts; by the perceptions, dispositions and skills of users; and by the structures embedded within the technology. These included perceptions of the system as new and requiring technical competence and skill; the adoption of the system for information gathering; and interactions and relationships that involved individual, shared or collective use. The dynamics between these external, internal and technological structures affected the adoption and implementation of the system. Successful implementation of information technology interventions for medicine optimisation will depend on a combination of the infrastructure within primary care, social structures embedded in the technology and the conventions, norms and dispositions of those utilising it. Future interventions, using electronic audit and feedback tools to improve medication safety, should consider the complexity of the social and organisational contexts and how internal and external structures can affect the use of the technology in order to support effective implementation. © Article author(s) (or their

  5. Simple Combinatorial Optimisation Cost Games

    NARCIS (Netherlands)

    van Velzen, S.

    2005-01-01

    In this paper we introduce the class of simple combinatorial optimisation cost games, which are games associated with {0, 1}-matrices. A coalitional value of a combinatorial optimisation game is determined by solving an integer program associated with this matrix and the characteristic vector of the

  6. Deciding between carbon trading and carbon capture and sequestration: an optimisation-based case study for methanol synthesis from syngas.

    Science.gov (United States)

    Üçtuğ, Fehmi Görkem; Ağralı, Semra; Arıkan, Yıldız; Avcıoğlu, Eray

    2014-01-01

    The economic and technical feasibility of carbon capture and sequestration (CCS) systems is gaining importance as CO2 emission reduction becomes a more pressing issue for parties in production sectors. Public and private entities have to comply with national schemes imposing tighter limits on their emission allowances. Often these parties face two options: whether to invest in CCS or to buy carbon credits for the excess emissions above their limits. CCS is an expensive system to invest in and to operate. Therefore, its feasibility depends on the carbon credit prices prevailing in the markets now and in the future. In this paper we consider the problem of installing a CCS unit in order to ensure that the amount of CO2 emissions is within its allowable limits. We formulate this problem as a non-linear optimisation problem where the objective is to maximise the net returns from pursuing an optimal mix of the two options described above. The General Algebraic Modeling System (GAMS) software was used to solve the model. The results were found to be sensitive to carbon credit prices and the discount rate, which determines the choices with respect to the future and the present. The model was applied to a methanol synthesis plant as an example. However, the formulation can easily be extended to any production process if the CO2 emissions level per unit of physical production is known. The results showed that for CCS to be feasible, carbon credit prices must be above 15 Euros per ton. This value, naturally, depends on the plant-specific data and the costs we have employed for CCS. The actual prices (≈5 Euros/ton CO2) at present are far from encouraging investors into CCS technology.
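
    A stripped-down version of the underlying decision can be written in a few lines. The sketch below, with placeholder numbers rather than the paper's plant data, and with a simple piecewise-linear cost in place of the paper's non-linear GAMS model, chooses the fraction of CO2 to capture so as to maximise net return, trading off a per-tonne CCS cost against buying credits for emissions above a free allowance.

```python
# Toy decision model with placeholder numbers (not the paper's plant data, and a
# simple piecewise-linear cost rather than the paper's non-linear GAMS model):
# choose the captured fraction of CO2 that maximises annual net return.
from scipy.optimize import minimize_scalar

emissions_t = 500_000        # t CO2 / year without capture (assumed)
allowance_t = 350_000        # free allowance, t CO2 / year (assumed)
credit_price = 15.0          # EUR / t CO2 (assumed)
ccs_cost = 40.0              # EUR / t CO2 captured, capex annuity + opex (assumed)
revenue = 60_000_000.0       # EUR / year from product sales (assumed)

def net_return(capture_frac):
    captured = capture_frac * emissions_t
    residual = emissions_t - captured
    credits_bought = max(0.0, residual - allowance_t)
    return revenue - ccs_cost * captured - credit_price * credits_bought

res = minimize_scalar(lambda frac: -net_return(frac), bounds=(0.0, 1.0),
                      method="bounded")
print(f"optimal capture fraction: {res.x:.2f}, "
      f"net return: {net_return(res.x):,.0f} EUR/year")
```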

  7. Carbon dioxide sequestration using NaHSO4 and NaOH: A dissolution and carbonation optimisation study.

    Science.gov (United States)

    Sanna, Aimaro; Steel, Luc; Maroto-Valer, M Mercedes

    2017-03-15

    The use of NaHSO4 to leach out Mg from lizardite-rich serpentinite (in the form of MgSO4) and the carbonation of CO2 (captured in the form of Na2CO3 using NaOH) to form MgCO3 and Na2SO4 was investigated. Unlike ammonium sulphate, sodium sulphate can be separated via precipitation during the recycling step, avoiding the energy-intensive evaporation process required in NH4-based processes. To determine the effectiveness of the NaHSO4/NaOH process when applied to lizardite, the optimisation of the dissolution and carbonation steps was performed using a UK lizardite-rich serpentine. Temperature, solid/liquid ratio, particle size, concentration and molar ratio were evaluated. An optimal dissolution efficiency of 69.6% was achieved over 3 h at 100 °C using 1.4 M sodium bisulphate and 50 g/l serpentine with particle size 75-150 μm. An optimal carbonation efficiency of 95.4% was achieved over 30 min at 90 °C and 1:1 magnesium:sodium carbonate molar ratio using non-synthesised solution. The CO2 sequestration capacity was 223.6 g carbon dioxide/kg serpentine (66.4% in terms of Mg bonded to hydromagnesite), which is comparable with capacities obtained using ammonium-based processes. Therefore, lizardite-rich serpentinites represent a valuable resource for the NaHSO4/NaOH based pH swing mineralisation process.

  8. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  9. Using Multiple Big Datasets and Machine Learning to Produce a New Global Particulate Dataset: A Technology Challenge Case Study

    Science.gov (United States)

    Lary, D. J.

    2013-12-01

    A Big Data case study is described in which multiple datasets from several satellites, high-resolution global meteorological data, social media and in-situ observations are combined by machine learning on a distributed cluster within an automated workflow. The resulting global particulate dataset is relevant to global public health studies and would not be possible to produce without the use of the multiple big datasets, in-situ data and machine learning. To greatly reduce the development time and enhance the functionality, a high-level language capable of parallel processing has been used (Matlab). Key considerations for the system are high-speed access due to the large data volume, persistence of the large data volumes and a precise process-time scheduling capability.
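
    The workflow's flavour, merging several feature sources on a common space-time key and training a machine-learning regressor against in-situ observations, can be sketched as below. The inputs are synthetic stand-ins (not the actual satellite, meteorological or social-media data), and the random forest is just one plausible choice of learner.

```python
# Flavour of the workflow only, with synthetic stand-ins for the real satellite,
# meteorological and social-media inputs: merge feature sources on a common key
# and train a machine-learning regressor against in-situ PM2.5 observations.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 5000
key = np.arange(n)                                     # shared space-time key
satellite = pd.DataFrame({"key": key, "aod": rng.gamma(2.0, 0.2, n)})
weather = pd.DataFrame({"key": key,
                        "rh": rng.uniform(20, 95, n),       # relative humidity
                        "blh": rng.uniform(200, 2500, n)})  # boundary-layer height
merged = satellite.merge(weather, on="key")
# assumed relationship used only to generate the synthetic target
merged["pm25"] = (80 * merged["aod"] * merged["rh"] / 100
                  + 5e4 / merged["blh"] + rng.normal(0, 3, n))

X = merged[["aod", "rh", "blh"]].values
y = merged["pm25"].values
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
```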

  10. The Person-Event Data Environment: leveraging big data for studies of psychological strengths in soldiers.

    Science.gov (United States)

    Vie, Loryana L; Griffith, Kevin N; Scheier, Lawrence M; Lester, Paul B; Seligman, Martin E P

    2013-01-01

    The Department of Defense (DoD) strives to efficiently manage the large volumes of administrative data collected and repurpose this information for research and analyses with policy implications. This need is especially present in the United States Army, which maintains numerous electronic databases with information on more than one million Active-Duty, Reserve, and National Guard soldiers, their family members, and Army civilian employees. The accumulation of vast amounts of digitized health, military service, and demographic data thus approaches, and may even exceed, traditional benchmarks for Big Data. Given the challenges of disseminating sensitive personal and health information, the Person-Event Data Environment (PDE) was created to unify disparate Army and DoD databases in a secure cloud-based enclave. This electronic repository serves the ultimate goal of achieving cost efficiencies in psychological and healthcare studies and provides a platform for collaboration among diverse scientists. This paper provides an overview of the uses of the PDE to perform command surveillance and policy analysis for Army leadership. The paper highlights the confluence of both economic and behavioral science perspectives elucidating empirically-based studies examining relations between psychological assets, health, and healthcare utilization. Specific examples explore the role of psychological assets in major cost drivers such as medical expenditures both during deployment and stateside, drug use, attrition from basic training, and low reenlistment rates. Through creation of the PDE, the Army and scientific community can now capitalize on the vast amounts of personnel, financial, medical, training and education, deployment, and security systems that influence Army-wide policies and procedures.

  11. Trait Emotional Intelligence and the Big Five: A Study on Italian Children and Preadolescents

    Science.gov (United States)

    Russo, Paolo Maria; Mancini, Giacomo; Trombini, Elena; Baldaro, Bruno; Mavroveli, Stella; Petrides, K. V.

    2012-01-01

    Trait emotional intelligence (EI) is a constellation of emotion-related self-perceptions located at the lower levels of personality hierarchies. This article examines the validity of the Trait Emotional Intelligence Questionnaire-Child Form and investigates its relationships with Big Five factors and cognitive ability. A total of 690 children (317…

  12. Study on LBS for Characterization and Analysis of Big Data Benchmarks

    Directory of Open Access Journals (Sweden)

    Aftab Ahmed Chandio

    2014-10-01

    Full Text Available In the past few years, most organizations have gradually been diverting their applications and services to the Cloud, because the Cloud paradigm enables (a) on-demand access and (b) large-scale data processing for their applications and users anywhere in the world. The rapid growth of urbanization in developed and developing countries has led to a new emerging concept called Urban Computing, one of the application domains that is rapidly being deployed to the Cloud. More precisely, in Urban Computing, sensors, vehicles, devices, buildings and roads are used as components to probe city dynamics, and their data, including GPS traces of vehicles, are widely available. However, their applications are data-processing and storage hungry, because data volumes grow from a few dozen TB (terabytes) to thousands of PB (petabytes), i.e. Big Data. To support the development and assessment of applications such as LBS (Location Based Services), a benchmark of Big Data is urgently needed. This research is a novel study of LBS to characterize and analyze Big Data benchmarks. We focus on map-matching, which is used as a pre-processing step in many LBS applications. In this preliminary work, the paper also describes the current status of Big Data benchmarks and our future directions.
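    Map-matching itself can be illustrated with a minimal geometric sketch: each GPS fix is snapped to the nearest road segment by orthogonal projection. This is only a toy version under simplified assumptions (planar x/y coordinates, no road topology or HMM smoothing), not the benchmark's implementation; the segment ids and trace below are invented.

```python
# Minimal point-to-segment map-matching sketch: each GPS fix is snapped to the
# nearest road segment by orthogonal projection.
from math import hypot

def snap_to_segment(p, a, b):
    """Project point p onto segment a-b; return (distance, projected point)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    t = 0.0 if seg_len2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    qx, qy = ax + t * dx, ay + t * dy
    return hypot(px - qx, py - qy), (qx, qy)

def map_match(trace, segments):
    """For each GPS fix, return (segment id, snapped point) of the nearest segment."""
    matched = []
    for p in trace:
        best = min(
            ((*snap_to_segment(p, a, b), sid) for sid, (a, b) in segments.items()),
            key=lambda t: t[0],
        )
        matched.append((best[2], best[1]))
    return matched

if __name__ == "__main__":
    roads = {"r1": ((0, 0), (10, 0)), "r2": ((0, 5), (10, 5))}
    gps_trace = [(1.0, 0.8), (4.0, 4.1), (9.0, 5.3)]
    print(map_match(gps_trace, roads))
```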

  13. A Study of the Subtitle Translation in “The Big Bang Theory” from Newmark’s Communicative Translation Theory

    Institute of Scientific and Technical Information of China (English)

    甄宽; 彭念凡; 甄顺

    2015-01-01

    The subtitle translation is very different from other forms of translation. We translators should meet the particular needs of the subtitle. This study analyzes the subtitle translation in “The Big Bang Theory” from Newmark’s Communicative Translation Theory in three main perspectives: the information transmission, the aesthetic effect and the emotional transmission. For the information transmission, the study puts emphasis on the limited circumstances. For the aesthetic effect, the study explores the expression of the sense of beauty. For the emotional transmission, the study examines the use of rhetoric to express different emotions.

  15. [Analysis of a blog for gastrointestinal disease in the view point of the big data: a single institutional study].

    Science.gov (United States)

    Choi, Jungran; Park, Hyojin; Lee, Choong-Hyun

    2014-06-01

    With the enormous increase in the amount of data, the concept of big data has emerged, allowing us to gain new insights and appreciate its value. However, analysis of gastrointestinal diseases from the viewpoint of big data has not yet been performed in Korea. This study analyzed the data of a blog's visitors as a set of big data to investigate questions they did not raise in the clinical setting. We analyzed the blog of a professor whose subspecialty is gastroenterology at Gangnam Severance Hospital. We assessed the changes in the number of visitors, the access paths of visitors, and the queries from January 2011 to December 2013. A total of 50,084 visitors gained access to the blog. An average of 1,535.3 people visited the blog per month and 49.5 per day. The number of visitors and the cumulative number of registered posts showed a positive correlation. The most utilized access path to the website was blog.iseverance.com (42.2%), followed by Google (32.8%) and Daum (6.6%). The term most searched by visitors to the blog was intestinal metaplasia (16.6%), followed by dizziness (8.3%) and gastric submucosal tumor (7.0%). A personal blog can function as a communication route for patients with digestive diseases. The most frequently searched word necessitating explanation and education was 'intestinal metaplasia'. Identifying and analyzing even unstructured data as a set of big data is expected to provide meaningful information.

  16. Optimising Ankle Foot Orthoses for children with Cerebral Palsy walking with excessive knee flexion to improve their mobility and participation; protocol of the AFO-CP study

    Directory of Open Access Journals (Sweden)

    Kerkum Yvette L

    2013-02-01

    Full Text Available Abstract Background Ankle-Foot Orthoses with a ventral shell, also known as Floor Reaction Orthoses (FROs), are often used to reduce gait-related problems in children with spastic cerebral palsy (SCP) walking with excessive knee flexion. However, current evidence for the effectiveness of FROs (e.g. in terms of walking energy cost) is both limited and inconclusive. Much of this ambiguity may be due to a mismatch between the FRO ankle stiffness and the patient’s gait deviations. The primary aim of this study is to evaluate the effect of FROs optimised for ankle stiffness on the walking energy cost in children with SCP, compared to walking with shoes alone. In addition, effects on various secondary outcome measures will be evaluated in order to identify possible working mechanisms and potential predictors of FRO treatment success. Method/Design A pre-post experimental study design will include 32 children with SCP, walking with excessive knee flexion in midstance, recruited from our university hospital and affiliated rehabilitation centres. All participants will receive a newly designed FRO, allowing ankle stiffness to be varied into three configurations by means of a hinge. Gait biomechanics will be assessed for each FRO configuration. The FRO that results in the greatest reduction in knee flexion during the single stance phase will be selected as the subject’s optimal FRO. Subsequently, the effects of wearing this optimal FRO will be evaluated after 12–20 weeks. The primary study parameter will be walking energy cost, with the most important secondary outcomes being intensity of participation, daily activity, walking speed and gait biomechanics. Discussion The AFO-CP trial will be the first experimental study to evaluate the effect of individually optimised FROs on mobility and participation. The evaluation will include outcome measures at all levels of the International Classification of Functioning, Disability and Health, providing a unique…

  17. Predictive Big Data Analytics: A Study of Parkinson’s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    OpenAIRE

    2016-01-01

    Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationsh...

  18. Topology optimised wavelength dependent splitters

    DEFF Research Database (Denmark)

    Hede, K. K.; Burgos Leon, J.; Frandsen, Lars Hagedorn

    A photonic crystal wavelength-dependent splitter has been constructed by utilising topology optimisation [1]. The splitter has been fabricated in a silicon-on-insulator material (Fig. 1). The topology-optimised wavelength-dependent splitter demonstrates promising 3D FDTD simulation results. This complex photonic crystal structure is very sensitive to small fabrication variations from the expected topology-optimised design. A wavelength-dependent splitter is an important basic building block for high-performance nanophotonic circuits. [1] J. S. Jensen and O. Sigmund, Appl. Phys. Lett. 84, 2022.

  20. Data, BigData and smart cities. Considerations and case study on environmental monitoring

    Directory of Open Access Journals (Sweden)

    Giacomo Chiesa

    2014-10-01

    Full Text Available The growing interest in technologies and strategies for constructing smart cities and smart buildings promotes the spread of ICT solutions, which often use large amounts of data. Nowadays, urban monitoring is often interrelated with the innovations introduced by BigData and the neologism “datization”, passing from the collection of a limited number of datapoints to the accumulation of as much data as possible, regardless of their future uses. The paper focuses on the production phase of data from the monitoring of environmental variables, using several measurement stations spread across the territory. The aim is to identify operational problems and possible solutions for a bottom-up construction of BigData datasets.

  1. A methodological approach to the design of optimising control strategies for sewer systems

    DEFF Research Database (Denmark)

    Mollerup, Ane Loft; Mikkelsen, Peter Steen; Sin, Gürkan

    2016-01-01

    This study focuses on designing an optimisation-based control for a sewer system in a methodological way and linking it to a regulatory control. Optimisation-based design is found to depend on proper choice of a model, formulation of the objective function and tuning of the optimisation parameters. Accordingly, the resulting optimisation-based control was compared with a rule-based expert system. On the other hand, compared with a regulatory control technique designed earlier in Mollerup et al. (2015), the optimisation showed similar performance with respect to minimising overflow volume. Hence, for the operation of small sewer systems, regulatory control strategies can offer promising potential and should be considered alongside more advanced strategies when identifying novel solutions.

  2. Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte; Olsen, Flemmming Ove

    1996-01-01

    The problem in optimising the laser cutting process is outlined. Basic optimisation criteria and principles for adapting an optimisation method, the simplex method, are presented. The results of implementing a response function in the optimisation are discussed with respect to the quality as well...

  3. Application of optimisation techniques in groundwater quantity and quality management

    Indian Academy of Sciences (India)

    Amlan Das; Bithin Datta

    2001-08-01

    This paper presents the state of the art in the application of optimisation techniques to groundwater quality and quantity management. In order to solve optimisation-based groundwater management models, researchers have used various mathematical programming techniques such as linear programming (LP), nonlinear programming (NLP), mixed-integer programming (MIP), optimal control theory-based mathematical programming, differential dynamic programming (DDP), stochastic programming (SP), combinatorial optimisation (CO), and multiple objective programming for multipurpose management. Studies reported in the literature on the application of these methods are reviewed in this paper.
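    A minimal example of the LP class of models reviewed here is an allocation of pumping rates that meets demand at minimum cost subject to well capacities and a linearised drawdown limit. The formulation below is generic and all coefficients are invented for illustration; it is not taken from any of the reviewed studies.

```python
# Toy linear-programming model for groundwater quantity management: choose
# pumping rates q at three wells to meet a demand at minimum cost, subject to
# per-well capacity and a linearised drawdown constraint. All numbers invented.
from scipy.optimize import linprog

cost = [1.0, 1.3, 0.9]            # cost per unit volume pumped at each well

# Inequality constraints A_ub @ q <= b_ub
A_ub = [
    [-1.0, -1.0, -1.0],           # total pumping >= demand  ->  -sum(q) <= -demand
    [0.4, 0.2, 0.3],              # linear response row: drawdown at a control point
]
b_ub = [-100.0, 35.0]             # demand = 100 units, maximum allowed drawdown = 35

bounds = [(0, 60), (0, 50), (0, 70)]   # per-well capacity limits

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("optimal pumping rates:", res.x, "total cost:", round(res.fun, 2))
```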

  4. Unstructured medical image query using big data - An epilepsy case study.

    Science.gov (United States)

    Istephan, Sarmad; Siadat, Mohammad-Reza

    2016-02-01

    Big data technologies are critical to the medical field, which requires new frameworks to leverage them. Such frameworks would allow medical experts to test hypotheses by querying huge volumes of unstructured medical data and so provide better patient care. The objective of this work is to implement and examine the feasibility of such a framework, providing efficient querying of unstructured data in unlimited ways. The feasibility study was conducted specifically in the epilepsy field. The proposed framework evaluates a query in two phases. In phase 1, structured data are used to filter the clinical data warehouse. In phase 2, feature-extraction modules are executed on the unstructured data in a distributed manner via Hadoop to complete the query. Three modules have been created: volume comparer, surface-to-volume conversion and average intensity. The framework allows user-defined modules to be imported, providing unlimited ways to process the unstructured data and hence potentially extending the application of this framework beyond the epilepsy field. Two types of criteria were used to validate the feasibility of the proposed framework: the ability/accuracy of fulfilling an advanced medical query and the efficiency that Hadoop provides. For the first criterion, the framework executed an advanced medical query that spanned both structured and unstructured data, with accurate results. For the second criterion, different architectures were explored to evaluate the performance of various Hadoop configurations, which were compared to a traditional Single Server Architecture (SSA). The surface-to-volume conversion module performed up to 40 times faster than the SSA (using a 20-node Hadoop cluster) and the average intensity module performed up to 85 times faster than the SSA (using a 40-node Hadoop cluster). Furthermore, the 40-node Hadoop cluster executed the average intensity module on 10,000 models in 3 h, which was not even practical for the SSA. The current study is…
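    The two-phase idea can be sketched in a few lines: phase 1 filters structured records, phase 2 applies a pluggable feature-extraction module to the matching unstructured objects in parallel. In the sketch below a process pool stands in for the Hadoop cluster, and the records, fields and the average-intensity module are invented for illustration; this is not the study's code.

```python
# Sketch of the two-phase query: phase 1 filters a structured "warehouse",
# phase 2 runs a user-defined feature-extraction module over the matching
# unstructured objects in parallel (a process pool stands in for Hadoop).
from concurrent.futures import ProcessPoolExecutor
import statistics

WAREHOUSE = [
    {"id": 1, "age": 34, "diagnosis": "epilepsy", "mri_voxels": [12.0, 14.5, 13.2]},
    {"id": 2, "age": 51, "diagnosis": "epilepsy", "mri_voxels": [22.1, 19.8, 20.4]},
    {"id": 3, "age": 46, "diagnosis": "migraine", "mri_voxels": [15.0, 15.2, 14.9]},
]

def phase1_filter(records, **criteria):
    """Phase 1: structured filtering (e.g. diagnosis == 'epilepsy')."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

def average_intensity(record):
    """Phase 2 module: a user-defined feature extractor on unstructured data."""
    return record["id"], statistics.mean(record["mri_voxels"])

def run_query(**criteria):
    hits = phase1_filter(WAREHOUSE, **criteria)
    with ProcessPoolExecutor() as pool:   # stand-in for the distributed step
        return list(pool.map(average_intensity, hits))

if __name__ == "__main__":
    print(run_query(diagnosis="epilepsy"))
```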

  5. Transforming fragments into candidates: small becomes big in medicinal chemistry.

    Science.gov (United States)

    de Kloe, Gerdien E; Bailey, David; Leurs, Rob; de Esch, Iwan J P

    2009-07-01

    Fragment-based drug discovery (FBDD) represents a logical and efficient approach to lead discovery and optimisation. It can draw on structural, biophysical and biochemical data, incorporating a wide range of inputs, from precise mode-of-binding information on specific fragments to wider ranging pharmacophoric screening surveys using traditional HTS approaches. It is truly an enabling technology for the imaginative medicinal chemist. In this review, we analyse a representative set of 23 published FBDD studies that describe how low molecular weight fragments are being identified and efficiently transformed into higher molecular weight drug candidates. FBDD is now becoming warmly endorsed by industry as well as academia and the focus on small interacting molecules is making a big scientific impact.

  6. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Science.gov (United States)

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches…
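    A generic sketch of points (i)-(iii), rebalancing an imbalanced cohort and producing a reproducible, cross-validated classifier, is shown below with synthetic data; it is not the authors' pipeline and does not use PPMI data.

```python
# Generic pattern for (i)-(iii): handle class imbalance and fit a classifier
# with fixed random seeds and cross-validation so results are reproducible.
# Synthetic data stands in for the PPMI cohort.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(
    n_samples=2000, n_features=30, n_informative=10,
    weights=[0.9, 0.1],            # 9:1 imbalance, e.g. controls vs cases
    random_state=42,
)

# class_weight="balanced" reweights samples inversely to class frequency,
# a simple alternative to explicit over/under-sampling.
clf = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=42)

auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("5-fold ROC AUC: %.3f +/- %.3f" % (auc.mean(), auc.std()))
```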

  7. Optimisation of patient protection and image quality in diagnostic ...

    African Journals Online (AJOL)

    Optimisation of patient protection and image quality in diagnostic radiology. ... The study leads to the introduction of the concept of plan-do-check-act on QC results ... (QA) programme and continues to collect data for the establishment of DRLs.

  8. Isogeometric Analysis and Shape Optimisation

    DEFF Research Database (Denmark)

    Gravesen, Jens; Evgrafov, Anton; Gersborg, Allan Roulund

    One of the attractive features of isogeometric analysis is the exact representation of the geometry. The geometry is furthermore given by a relatively low number of control points, and this makes isogeometric analysis an ideal basis for shape optimisation. I will describe some of the results we have obtained and also some of the problems we have encountered. One of these problems is that the geometry of the shape is given by the boundary alone, and it is the parametrisation of the boundary which is changed by the optimisation procedure. But isogeometric analysis requires a parametrisation of the whole domain. So in every optimisation cycle we need to extend a parametrisation of the boundary of a domain to the whole domain. It has to be fast in order not to slow the optimisation down, but it also has to be robust and give a parametrisation of high quality. These are conflicting requirements, so we…

  9. Turbulence optimisation in stellarator experiments

    Energy Technology Data Exchange (ETDEWEB)

    Proll, Josefine H.E. [Max-Planck/Princeton Center for Plasma Physics (Germany); Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstr. 1, 17491 Greifswald (Germany); Faber, Benjamin J. [HSX Plasma Laboratory, University of Wisconsin-Madison, Madison, WI 53706 (United States); Helander, Per; Xanthopoulos, Pavlos [Max-Planck/Princeton Center for Plasma Physics (Germany); Lazerson, Samuel A.; Mynick, Harry E. [Plasma Physics Laboratory, Princeton University, P.O. Box 451 Princeton, New Jersey 08543-0451 (United States)

    2015-05-01

    Stellarators, the twisted siblings of the axisymmetric fusion experiments called tokamaks, have historically suffered from confining the heat of the plasma insufficiently compared with tokamaks and were therefore considered to be less promising candidates for a fusion reactor. This has changed, however, with the advent of stellarators in which the laminar transport is reduced to levels below that of tokamaks by shaping the magnetic field accordingly. As in tokamaks, the turbulent transport remains as the now dominant transport channel. Recent analytical theory suggests that the large configuration space of stellarators allows for an additional optimisation of the magnetic field to also reduce the turbulent transport. In this talk, the idea behind the turbulence optimisation is explained. We also present how an optimised equilibrium is obtained and how it might differ from the equilibrium field of an already existing device, and we compare experimental turbulence measurements in different configurations of the HSX stellarator in order to test the optimisation procedure.

  10. Study rationale and design of OPTIMISE, a randomised controlled trial on the effect of benchmarking on quality of care in type 2 diabetes mellitus

    Directory of Open Access Journals (Sweden)

    Hermans Michel P

    2011-09-01

    Full Text Available Abstract Background To investigate the effect of physician- and patient-specific feedback with benchmarking on the quality of care in adults with type 2 diabetes mellitus (T2DM). Methods Study centres in six European countries were randomised to either a benchmarking or a control group. Physicians in both groups received feedback on modifiable outcome indicators (glycated haemoglobin [HbA1c], glycaemia, total cholesterol, high density lipoprotein-cholesterol, low density lipoprotein [LDL]-cholesterol and triglycerides) for each patient at 0, 4, 8 and 12 months, based on the four times yearly control visits recommended by international guidelines. The benchmarking group also received comparative results on three critical quality indicators of vascular risk (HbA1c, LDL-cholesterol and systolic blood pressure [SBP]), checked against the results of their colleagues from the same country, and versus pre-set targets. After 12 months of follow-up, the percentage of patients achieving the pre-determined targets for the three critical quality indicators will be assessed in the two groups. Results Recruitment was completed in December 2008 with 3994 evaluable patients. Conclusions This paper discusses the study rationale and design of OPTIMISE, a randomised controlled study that will help assess whether benchmarking is a useful clinical tool for improving outcomes in T2DM in primary care. Trial registration NCT00681850

  11. Machine Learning for Big Data: A Study to Understand Limits at Scale

    Energy Technology Data Exchange (ETDEWEB)

    Sukumar, Sreenivas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Del-Castillo-Negrete, Carlos Emilio [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-21

    This report aims to empirically understand the limits of machine learning when applied to Big Data. We observe that recent innovations in being able to collect, access, organize, integrate, and query massive amounts of data from a wide variety of data sources have brought statistical data mining and machine learning under more scrutiny, evaluation and application for gleaning insights from the data than ever before. Much is expected from algorithms without understanding their limitations at scale while dealing with massive datasets. In that context, we pose and address the following questions: How does a machine learning algorithm perform on measures such as accuracy and execution time with increasing sample size and feature dimensionality? Does training with more samples guarantee better accuracy? How many features should be computed for a given problem? Do more features guarantee better accuracy? Are the efforts to derive and calculate more features and to train on larger samples worth the effort? As problems become more complex and traditional binary classification algorithms are replaced with multi-task, multi-class categorization algorithms, do parallel learners perform better? What happens to the accuracy of the learning algorithm when trained to categorize multiple classes within the same feature space? Towards finding answers to these questions, we describe the design of an empirical study and present the results. We conclude with the following observations: (i) accuracy of the learning algorithm increases with increasing sample size but saturates at a point, beyond which more samples do not contribute to better accuracy/learning; (ii) the richness of the feature space dictates performance, both accuracy and training time; (iii) increased dimensionality is often reflected in better performance (higher accuracy in spite of longer training times), but the improvements are not commensurate with the effort required for feature computation and training; and (iv) accuracy of the learning algorithms…
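    The first question (accuracy versus sample size) can be illustrated with a standard learning-curve experiment. The sketch below uses synthetic data and scikit-learn rather than the report's actual datasets and algorithms, and simply reproduces the typical rise-then-saturate behaviour described in observation (i).

```python
# Learning-curve sketch: cross-validated accuracy as a function of training-set
# size, on synthetic data. Illustrative only; not the report's experimental setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = make_classification(n_samples=20000, n_features=50, n_informative=15,
                           random_state=0)

sizes, train_scores, val_scores = learning_curve(
    LogisticRegression(max_iter=1000), X, y,
    train_sizes=np.linspace(0.05, 1.0, 8), cv=5, scoring="accuracy", n_jobs=-1,
)

for n, score in zip(sizes, val_scores.mean(axis=1)):
    print(f"{n:6d} training samples -> cross-validated accuracy {score:.3f}")
```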

  12. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  13. Study on the Definitions of Big Data Based on the Concept Replacement (基于概念置换的大数据定义研究)

    Institute of Scientific and Technical Information of China (English)

    李天柱; 王圣慧; 马佳

    2015-01-01

    In current big data research, a crisis of definitional disorder has emerged. Poor definitions of big data, which are both vague and poorly interchangeable, hamper further big data research in related areas. Based on a literature review, this study uses the concept-replacement method of analytic philosophy to put forward a strict, clear and broadly accepted definition of big data. With this common definition, a series of follow-up research questions becomes easier and more workable. Finally, the study discusses the application prospects of the definition through case studies of enterprise technological innovation management.

  14. Public transport optimisation emphasising passengers’ travel behaviour

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo

    [The aim of this PhD study is] to enhance the operations of public transport while explicitly emphasising passengers’ travel behaviour and preferences. Similar to economic theory, interactions between supply and demand are omnipresent in the context of public transport operations. In public transport, the demand is represented […]. [Given the gap between] the published performance measures and what passengers actually experience, a large academic contribution of the current PhD study is the explicit consideration of passengers’ travel behaviour in optimisation studies and in the performance assessment. Besides the explicit passenger focus in transit planning, [passengers’ travel behaviour is also looked] at as the motivator for delay-robust railway timetables. Interestingly, passenger-oriented optimisation studies considering robustness in railway planning typically limit their emphasis on passengers to the consideration of transfer maintenance. Clearly, passengers’ travel behaviour is more complex and multifaceted…

  15. Big Data and Cycling

    NARCIS (Netherlands)

    Romanillos, Gustavo; Zaltz Austwick, Martin; Ettema, Dick; De Kruijf, Joost

    2016-01-01

    Big Data has begun to create significant impacts in urban and transport planning. This paper covers the explosion in data-driven research on cycling, most of which has occurred in the last ten years. We review the techniques, objectives and findings of a growing number of studies we have classified

  16. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in the lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  17. Amplitude-oriented exercise in Parkinson's disease: a randomized study comparing LSVT-BIG and a short training protocol.

    Science.gov (United States)

    Ebersbach, Georg; Grust, Ute; Ebersbach, Almut; Wegner, Brigitte; Gandor, Florin; Kühn, Andrea A

    2015-02-01

    LSVT-BIG is an exercise programme for patients with Parkinson's disease (PD) comprising 16 one-hour sessions within 4 weeks. LSVT-BIG was compared with a 2-week short protocol (AOT-SP) consisting of 10 sessions with identical exercises in 42 patients with PD. The UPDRS-III score was reduced by 6.6 points with LSVT-BIG and by 5.7 points with AOT-SP at follow-up after 16 weeks (p […]). Improvements were obtained with both LSVT-BIG and AOT-SP, but high-intensity LSVT-BIG was more effective in obtaining patient-perceived benefit.

  18. Risk based methods for optimised operation of power stations - a pilot study; Riskbaserade metoder foer optimerad drift av kraftvaermeverk - en foerstudie

    Energy Technology Data Exchange (ETDEWEB)

    Gunnars, Jens; Gustavsson, Fredrik [Det Norske Veritas AB, Stockholm (Sweden)

    2002-03-01

    Methods for risk-based planning and management of maintenance and operation of mechanical components in power stations have been studied. Risk-based methods may be utilised for analysis of the risk level with reference to both the safety and the economy of the plant. The methods can be an important tool for planning and optimisation of the annual investment in different types of maintenance actions, with the purpose of improving long-term profitability. The risk-based planning can include: selection of components, inspection intervals and coverage, planning of the time for replacement/repair of components, and selection of operating conditions. The first part of the report is a general survey and description of risk-based methods for the analysis of mechanical components. Some problems specific to power stations are discussed. The application of quantitative RBI is illustrated for the water system in steam boiler number 5 at Aabyverket. The possibility of decreasing inspection costs or increasing availability for power stations as well is obvious and is expected to result in competitive advantages. The use and understanding of quantitative reliability methods are a necessary and essential part of any RBI assessment.

  19. Self-optimising control of sewer systems

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Montero-Castro, I.; Mollerup, Ane Loft

    2013-01-01

    Self-optimising control is a useful concept for selecting optimal controlled variables from a set of candidate measurements in a systematic manner. In this study, we use self-optimising control tools and apply them to the specific features of sewer systems, e.g. the continuously transient dynamics, the availability of a large number of measurements, and the stochastic and unforeseeable character of the disturbances (rainfall). Using a subcatchment area in the Copenhagen sewer system as a case study, we demonstrate, step by step, the formulation of the self-optimising control problem. The final result is an improved control structure aimed at minimising the losses for a given control objective, here the minimisation of combined sewer overflows despite rainfall variations.

  20. Big Man

    Institute of Scientific and Technical Information of China (English)

    郑秀文

    2012-01-01

    梁炳 ("Edmond") says that after his concert he will go travelling with his wife. No matter where in the world the plane touches down, having a companion by your side is happiness. His concert is called Big Man; at first I misread it as a Big Mac concert and wondered why anyone would stage a big-hamburger show. Ha! Only later did I realise I had misread it. Thinking about it, though, on the road to growing up, who has not lived like a silly lump of bread, a ball of dough exposed to this vast world? Time and all kinds of life experience are the yeast, and over the years you and I all ferment and grow. Friendship is also a yeast that spurs each other's growth. Seeing that he has long since grown from a boy into a man, I realise that I can no longer call myself a "girl" either. In my eyes, he has changed a great deal; his playful, outgoing personality has narrowed. And we, now…

  1. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seed of new, valuable operational insights for private companies and public organisations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (Dansk AlmenMedicinsk Database, DAMD). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off involving not only economic rationales and quality considerations, but also control over sensitive personal data and ethical implications for the citizen. In the DAMD case, data are, on the one hand, used "in the service of a good cause" to…

  2. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line. Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni…

  3. Are Big Food's corporate social responsibility strategies valuable to communities? A qualitative study with parents and children.

    Science.gov (United States)

    Richards, Zoe; Phillipson, Lyn

    2017-08-29

    Recent studies have identified parents and children as two target groups whom Big Food hopes to positively influence through its corporate social responsibility (CSR) strategies. The current preliminary study aimed to gain an in-depth understanding of parents' and children's awareness and interpretation of Big Food's CSR strategies, to understand how CSR shapes their beliefs about companies. Design: community-based qualitative semi-structured interviews. Setting: New South Wales, Australia. Participants: parents (n = 15) and children aged 8-12 years (n = 15). Parents and children showed unprompted recognition of CSR activities when shown McDonald's and Coca-Cola brand logos, indicating a strong level of association between the brands and activities that target the settings of children. When discussing CSR strategies, some parents and most children saw value in the activities, viewing them as acts of merit or worth. For some parents and children, the companies' CSR activities were seen as a reflection of the companies' moral attributes, which resonated with their own values of charity and health. For others, CSR strategies were in conflict with the companies' core business. Finally, some also viewed the activities as harmful, representing a deceit of the public and a smokescreen for the companies' ultimately unethical behaviour. A large proportion of participants valued the CSR activities, signalling that denormalising CSR to sever the strong ties between the community and Big Food will be a difficult process for the public health community. Efforts to gain public acceptance for action on CSR may need greater levels of persuasion to gain public support for a comprehensive and restrictive approach.

  4. Development of a United States-Mexico Emissions Inventory for the Big Bend Regional Aerosol and Visibility Observational (BRAVO) Study.

    Science.gov (United States)

    Kuhns, Hampden; Knipping, Eladio M; Vukovich, Jeffrey M

    2005-05-01

    The Big Bend Regional Aerosol and Visibility Observational (BRAVO) Study was commissioned to investigate the sources of haze at Big Bend National Park in southwest Texas. The modeling domain of the BRAVO Study includes most of the continental United States and Mexico. The BRAVO emissions inventory was constructed from the 1999 National Emission Inventory for the United States, modified to include finer-resolution data for Texas and 13 U.S. states in close proximity. The first regional-scale Mexican emissions inventory designed for air-quality modeling applications was developed for 10 northern Mexican states, the Tula Industrial Park in the state of Hidalgo, and the Popocatépetl volcano in the state of Puebla. Emissions data were compiled from numerous sources, including the U.S. Environmental Protection Agency (EPA), the Texas Natural Resources Conservation Commission (now Texas Commission on Environmental Quality), the Eastern Research Group, the Minerals Management Service, the Instituto Nacional de Ecología, and the Instituto Nacional de Estadistica Geografía y Informática. The inventory includes emissions for CO, nitrogen oxides, sulfur dioxide, volatile organic compounds (VOCs), ammonia, particulate matter (PM) < 10 microm in aerodynamic diameter, and PM < 2.5 microm in aerodynamic diameter. Wind-blown dust and biomass burning were not included in the inventory, although high concentrations of dust and organic PM attributed to biomass burning have been observed at Big Bend National Park. The SMOKE modeling system was used to generate gridded emissions fields for use with the Regional Modeling System for Aerosols and Deposition (REMSAD) and the Community Multiscale Air Quality model modified with the Model of Aerosol Dynamics, Reaction, Ionization and Dissolution (CMAQ-MADRID). The compilation of the inventory, supporting model input data, and issues encountered during the development of the inventory are documented. A comparison of the BRAVO emissions

  5. Big(ger) Data as Better Data in Open Distance Learning

    Directory of Open Access Journals (Sweden)

    Paul Prinsloo

    2015-02-01

    Full Text Available In the context of the hype, promise and perils of Big Data and the currently dominant paradigm of data-driven decision-making, it is important to critically engage with the potential of Big Data for higher education. We do not question the potential of Big Data, but we do raise a number of issues, and present a number of theses to be seriously considered in realising this potential. The University of South Africa (Unisa) is one of the mega ODL institutions in the world, with more than 360,000 students and a range of courses and programmes. Unisa already has access to a staggering amount of student data, hosted in disparate sources and governed by different processes. As the university moves to mainstreaming online learning, the amount of and need for analyses of data are increasing, raising important questions regarding our assumptions, understanding, data sources, systems and processes. This article presents a descriptive case study of the current state of student data at Unisa, and explores the impact of existing data sources and analytic approaches. From the analysis it is clear that in order for big(ger) data to be better data, a number of issues need to be addressed. The article concludes by presenting a number of theses that should form the basis for the imperative to optimise the harvesting, analysis and use of student data.

  6. optPBN: An Optimisation Toolbox for Probabilistic Boolean Networks

    Science.gov (United States)

    Trairatphisan, Panuwat; Mizera, Andrzej; Pang, Jun; Tantar, Alexandru Adrian; Sauter, Thomas

    2014-01-01

    Background There exist several computational tools which allow for the optimisation and inference of biological networks using a Boolean formalism. Nevertheless, the results from such tools yield only limited quantitative insights into the complexity of biological systems because of the inherently qualitative nature of Boolean networks. Results We introduce optPBN, a Matlab-based toolbox for the optimisation of probabilistic Boolean networks (PBN) which operates under the framework of the BN/PBN toolbox. optPBN offers easy generation of probabilistic Boolean networks from rule-based Boolean model specifications and allows for flexible integration of measurement data from multiple experiments. Subsequently, optPBN generates integrated optimisation problems which can be solved by various optimisers. In terms of functionality, optPBN allows for the construction of a probabilistic Boolean network from a given set of potential constitutive Boolean networks by optimising the selection probabilities for these networks so that the resulting PBN fits experimental data. Furthermore, the optPBN pipeline can also be operated on large-scale computational platforms to solve complex optimisation problems. Apart from exemplary case studies in which we correctly inferred the original network, we also successfully applied optPBN to study a large-scale Boolean model of apoptosis, where it allowed us to quantitatively identify the inverse correlation between UVB irradiation, NFκB and Caspase 3 activation, and apoptosis in primary hepatocytes. The results from optPBN also help elucidate the relevance of crosstalk interactions in the apoptotic network. Summary The optPBN toolbox provides a simple yet comprehensive pipeline for integrated optimisation problem generation in the PBN formalism that can readily be solved by various optimisers on local or grid-based computational platforms. optPBN can be further applied to various biological studies such as the inference of gene regulatory…
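    optPBN itself is a Matlab toolbox; the toy Python sketch below only illustrates the underlying idea of fitting rule-selection probabilities so that a PBN's predicted activation matches measured data. The two candidate rules, the input conditions and the "measured" activations are all invented.

```python
# Toy PBN fitting: an output node C chooses between two candidate update rules,
# 'C = A AND B' (with probability p) and 'C = A OR B' (otherwise). We fit p so
# that predicted activations match invented measurements across experiments.
from scipy.optimize import minimize_scalar

def predicted_activation(p_and: float, p_a: float, p_b: float) -> float:
    """P(C=1) for independent inputs A, B under the two-rule mixture."""
    p_and_ab = p_a * p_b
    p_or_ab = p_a + p_b - p_a * p_b
    return p_and * p_and_ab + (1.0 - p_and) * p_or_ab

# (P(A=1), P(B=1), measured activation of C) for three invented experiments.
experiments = [(0.9, 0.1, 0.35), (0.5, 0.5, 0.55), (0.2, 0.8, 0.45)]

def cost(p: float) -> float:
    return sum((predicted_activation(p, pa, pb) - measured) ** 2
               for pa, pb, measured in experiments)

fit = minimize_scalar(cost, bounds=(0.0, 1.0), method="bounded")
print(f"fitted selection probability for rule 'A AND B': {fit.x:.2f}")
```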

  7. A supportive architecture for CFD-based design optimisation

    Science.gov (United States)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

    Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their application in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. It is therefore desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing works has found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or at the data level to fully utilise the capabilities of the different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided, and the results show that the proposed architecture…
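    The general CFD-in-the-loop pattern can be sketched as an optimiser that repeatedly sends design vectors to an expensive solver and collects objective values, evaluating a population in parallel. In the sketch below the "solver" is a cheap analytic stand-in (a real setup would write a mesh, launch the CFD code via a subprocess and parse its output), and all parameter names and numbers are illustrative; this is not the paper's architecture.

```python
# Generic CFD-in-the-loop optimisation pattern: propose designs, evaluate each
# with an (expensive) external solver, keep the best. A cheap analytic function
# stands in for the CFD run; a process pool mimics parallel evaluation.
from concurrent.futures import ProcessPoolExecutor
import random

def evaluate_design(design):
    """Stand-in objective: a pretend 'drag' as a function of two shape parameters."""
    sweep, thickness = design
    return (sweep - 0.6) ** 2 + 3.0 * (thickness - 0.1) ** 2  # lower is better

def random_search(n_iter=40, pop=8, seed=1):
    """Toy population-based search with parallel objective evaluations."""
    rng = random.Random(seed)
    best, best_f = None, float("inf")
    with ProcessPoolExecutor() as pool:
        for _ in range(n_iter):
            population = [(rng.uniform(0, 1), rng.uniform(0.02, 0.3)) for _ in range(pop)]
            for design, f in zip(population, pool.map(evaluate_design, population)):
                if f < best_f:
                    best, best_f = design, f
    return best, best_f

if __name__ == "__main__":
    design, drag = random_search()
    print("best design (sweep, thickness):", design, "objective:", round(drag, 4))
```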

  8. Study on Perspective Texture Synthesis Based on Improved Energy Optimisation (基于改进能量优化的透视纹理合成研究)

    Institute of Scientific and Technical Information of China (English)

    马爽; 史巍; 许刚

    2014-01-01

    In light of the insufficiency of the traditional pixel-based energy optimisation method, which requires point-by-point computation, this paper proposes an improved energy optimisation algorithm combined with image quilting for perspective texture image synthesis. First, pixel-by-pixel calculation is replaced with patch-based processing, and an image patch-based energy optimisation model is built. Second, during the optimisation process, image quilting is introduced to solve problems that may arise in patch processing, such as visible seams and overlapping, and thus to improve the quality of the synthesised image. Experimental results show that the proposed energy optimisation algorithm greatly improves computational efficiency while also achieving a better visual effect.

  9. Experimental Study of the Cloud Architecture Selection for Effective Big Data Processing

    Directory of Open Access Journals (Sweden)

    Evgeny Nikulchev

    2015-06-01

    Full Text Available Big data dictates its requirements to hardware and software. Simple migration to cloud data processing solves the problem of increasing computational capability, but creates some issues: the need to ensure safety, to control quality during data transmission, and to optimise requests. A computational cloud does not simply provide scalable resources; it also involves network infrastructure, unknown routes and varying numbers of user requests. In addition, situations can occur during operation in which the architecture of the application needs to change: part of the data needs to be placed in a private cloud, part in a public cloud, and part stays on the client.

  10. TEM turbulence optimisation in stellarators

    CERN Document Server

    Proll, J H E; Xanthopoulos, P; Lazerson, S A; Faber, B J

    2015-01-01

    With the advent of neoclassically optimised stellarators, optimising stellarators for turbulent transport is an important next step. The reduction of ion-temperature-gradient-driven turbulence has been achieved via shaping of the magnetic field, and the reduction of trapped-electron mode (TEM) turbulence is addressed in the present paper. Recent analytical and numerical findings suggest TEMs are stabilised when a large fraction of trapped particles experiences favourable bounce-averaged curvature. This is the case, for example, in Wendelstein 7-X [C.D. Beidler et al., Fusion Technology 17, 148 (1990)] and other Helias-type stellarators. Using this knowledge, a proxy function was designed to estimate the TEM dynamics, allowing optimal configurations for TEM stability to be determined with the STELLOPT [D.A. Spong et al., Nucl. Fusion 41, 711 (2001)] code without extensive turbulence simulations. A first proof-of-principle optimised equilibrium stemming from the TEM-dominated stella…

  11. Design of the New Life(style) study: a randomised controlled trial to optimise maternal weight development during pregnancy [ISRCTN85313483]

    Directory of Open Access Journals (Sweden)

    Seidell Jacob C

    2006-06-01

    Full Text Available Abstract Background Preventing excessive weight gain during pregnancy is potentially important in the prevention of overweight and obesity among women of childbearing age. However, few intervention studies aiming at weight management during pregnancy have been performed, and most of these interventions were not as successful as expected. In this paper the design of the New Life(style) study is described, as well as the content of the individually tailored intervention programme, which focuses on controlling weight development during pregnancy. Methods The effectiveness of the New Life(style) intervention programme versus usual care by midwives is evaluated in a randomised controlled trial. Women who expect their first child and visit one of the participating midwifery practices are included. The intervention is standardised in a protocol and executed by trained counsellors with the women who are randomised into the intervention group. During 5 sessions – at 18, 22, 30 and 36 weeks of pregnancy and at 8 weeks postpartum – individual weight gain is discussed in relation to the weight-gain guidelines for pregnant women of the American Institute of Medicine. Counsellors coach the women to maintain or optimise a healthy lifestyle in a period of drastic physical and mental changes. Data are collected at 15, 25 and 35 weeks of pregnancy and at 6, 26 and 52 weeks after delivery. Primary outcome measures are body weight, BMI and skinfold thickness. Secondary outcome measures include physical activity, nutrition and blood levels of factors that are associated with energy homeostasis. Discussion Results of the current RCT will improve the knowledge of determinants of weight gain during pregnancy and weight retention after childbirth, and of the effectiveness of the intervention programme that is described. Caregivers and researchers in the field of health promotion are offered more insight into specific elements of the New Life(style) intervention programme.

  12. STROBE-AMS: recommendations to optimise reporting of epidemiological studies on antimicrobial resistance and informing improvement in antimicrobial stewardship

    NARCIS (Netherlands)

    Tacconelli, Evelina; Cataldo, Maria A; Paul, M; Leibovici, L; Kluytmans, Jan; Schröder, Wiebke; Foschi, Federico; De Angelis, Giulia; De Waure, Chiara; Cadeddu, Chiara; Mutters, Nico T; Gastmeier, Petra; Cookson, Barry

    2016-01-01

    OBJECTIVES: To explore the accuracy of application of the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) tool in epidemiological studies focused on the evaluation of the role of antibiotics in selecting resistance, and to derive and test an extension of STROBE to impro

  13. Comparative study of visual acuity and aberrations after intralase femtosecond LASIK: small corneal flap versus big corneal flap

    Institute of Scientific and Technical Information of China (English)

    Ya-Li Zhang; Lei Liu; Chang-Xia Cui; Ming Hu; Zhao-Na Li; Li-Jun Cao; Xiu-Hua Jing; Guo-Ying Mu

    2014-01-01

    AIM: To study the effects of different flap sizes on visual acuity, refractive outcomes, and aberrations after femtosecond laser in situ keratomileusis (LASIK). ·METHODS: In each of the forty patients enrolled, 1 eye was randomly assigned to receive treatment with an 8.1 mm diameter corneal flap, defined as the small flap, while the other eye was treated with an 8.6 mm diameter corneal flap, defined as the big flap. Refractive errors, visual acuity, and higher-order aberrations were compared between the two groups at week 1, month 1 and month 3 postoperatively. ·RESULTS: The postoperative refractive errors and visual acuity all conformed to the intended goal. Postoperative higher-order aberrations were increased, especially the spherical aberration (Z12) and vertical coma (Z7). There were no statistically significant differences between the two groups in terms of postoperative refractive errors, visual acuity, root mean square of total HOAs (HO-RMS), trefoil 30° (Z6), vertical coma (Z7), horizontal coma (Z8), trefoil 0° (Z9), and spherical aberration (Z12) at any point during the postoperative follow-up. ·CONCLUSION: Both the small and big flaps are safe and effective procedures to correct myopia, provided the exposed stroma meets the excimer laser ablation requirements. A personalised corneal flap size is feasible, as the flap can be designed based on the principle that the corneal flap diameter should be equal to or greater than the sum of the maximum ablation diameter and the apparatus error.

  14. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo

    In this paper we investigate the micro-mechanisms governing the structural evolution of a scientific collaboration. Empirical evidence indicates that we have transcended into a new paradigm with a new modus operandi, where scientific discoveries are not led by so-called lone 'stars', or big egos, but instead by a group of people from a multitude of institutions, having a diverse knowledge set and capable of operating more and more complex instrumentation. Using a dataset consisting of full bibliometric coverage from a Large Scale Research Facility, we utilize a stochastic actor oriented model to estimate both the structural and performance effects of selection, as well as the behavioral effects of crossing organizational boundaries. Preliminary results suggest that the selection of collaborators is still skewed, and identify a large assortativity effect, as well as a tendency to interact with both authors…

  15. Methodology for optimising location of new primary health care facilities in rural communities: a case study in KwaZulu-Natal, South Africa.

    Science.gov (United States)

    Tanser, Frank

    2006-10-01

    To develop a quantitative methodology to optimally site new primary health care facilities so as to achieve the maximum population-level increase in accessibility to care. The study aims to test the methodology in a rural community characterised by considerable heterogeneity in population distribution and health care access. A geographical information system was used to estimate travel time to the nearest primary health care facility for each of the 26 000 homesteads in the subdistrict. Each homestead's travel time estimate was then converted into an impedance-to-care estimate using distance-decay (clinic usage) data obtained from the subdistrict. A map of total person impedance per km² was then produced using a 3 km standard Gaussian filter. The resulting map was used to site a test clinic in the largest contiguous area of high person impedance. Setting: Hlabisa health subdistrict, KwaZulu-Natal, South Africa. The population-level increase in accessibility that would be achieved by the construction of the test clinic would be 3.6 times the increase in accessibility achieved by the construction of the newest clinic in the subdistrict. The corresponding ratio for increasing clinic coverage (% of the population within 60 minutes of care) would be 4.7. The methodology successfully identifies a locality for a new facility that would maximise the population-level increase in accessibility to care. The same principles used in this research could also be applied in other settings. The methodology is of practical value in health research and practice and provides a framework for optimising the location of new primary health care facilities.
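    The impedance-surface step can be sketched numerically: build a person-impedance grid, smooth it with a Gaussian filter whose width corresponds to roughly 3 km, and take the maximum of the smoothed surface as the candidate site. The grid, population counts and the impedance formula below are synthetic stand-ins, not the study's data or its exact distance-decay definition.

```python
# Sketch of the siting step: person-impedance grid -> 3 km Gaussian smoothing
# -> candidate clinic at the maximum of the smoothed surface. Synthetic data.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)
cell_km = 0.5                                        # grid resolution, km per cell
population = rng.poisson(4, size=(200, 200))         # homesteads per cell (synthetic)
travel_min = rng.uniform(5, 120, size=(200, 200))    # minutes to nearest clinic

# Illustrative distance-decay weight: clinic usage falls with travel time,
# so unmet need (impedance) rises with it.
usage_weight = np.exp(-travel_min / 60.0)
person_impedance = population * travel_min * (1 - usage_weight)

smoothed = gaussian_filter(person_impedance.astype(float), sigma=3.0 / cell_km)
row, col = np.unravel_index(np.argmax(smoothed), smoothed.shape)
print(f"candidate clinic cell: row={row}, col={col}")
```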

  16. Alpha-synuclein oligomers and fibrils originate in two distinct conformer pools: a small angle X-ray scattering and ensemble optimisation modelling study.

    Science.gov (United States)

    Curtain, Cyril C; Kirby, Nigel M; Mertens, Haydyn D T; Barnham, Kevin J; Knott, Robert B; Masters, Colin L; Cappai, Roberto; Rekas, Agata; Kenche, Vijaya B; Ryan, Timothy

    2015-01-01

    The 140 residue intrinsically disordered protein α-synuclein (α-syn) self-associates to form fibrils that are the major constituent of the Lewy body intracellular protein inclusions, and neurotoxic oligomers. Both of these macromolecular structures are associated with a number of neurodegenerative diseases, including Parkinson's disease and dementia with Lewy bodies. Using ensemble optimisation modelling (EOM) and small angle X-ray scattering (SAXS) on a size-exclusion column equipped beamline, we studied how the distribution of structural conformers in α-syn may be influenced by the presence of the familial early-onset mutations A30P, E45K and A53T, by substituting the four methionine residues with alanines and by reaction with copper (Cu2+) or an anti-fibril organic platinum (Pt) complex. We found that the WT had two major conformer groups, representing ensembles of compact and extended structures. The population of the extended group was increased in the more rapidly fibril-forming E45K and A53T mutants, while the compact group was enlarged in the oligomer-forming A30P mutant. Addition of Cu2+ resulted in the formation of an ensemble of compact conformers, while the anti-fibril agent and alanine substitution substantially reduced the population of extended conformers. Since our observations with the mutants suggest that fibrils may be drawn from the extended conformer ensemble, we propose that the compact and extended ensembles represent the beginning of oligomer and fibril formation pathways respectively, both of which have been reported to lead to a toxic gain of function. Manipulating these pathways and monitoring the results by EOM and SAXS may be useful in the development of anti-Parkinson's disease therapies.

  17. The OPTIMIST study: optimisation of cost effectiveness through individualised FSH stimulation dosages for IVF treatment. A randomised controlled trial

    Directory of Open Access Journals (Sweden)

    van Tilborg Theodora C

    2012-09-01

    Full Text Available Abstract Background: Costs of in vitro fertilisation (IVF) are high, which is partly due to the use of follicle stimulating hormone (FSH). FSH is usually administered in a standard dose. However, due to differences in ovarian reserve between women, ovarian response also differs with potential negative consequences on pregnancy rates. A Markov decision-analytic model showed that FSH dose individualisation according to ovarian reserve is likely to be cost-effective in women who are eligible for IVF. However, this has never been confirmed in a large randomised controlled trial (RCT). The aim of the present study is to assess whether an individualised FSH dose regime based on an ovarian reserve test (ORT) is more cost-effective than a standard dose regime. Methods/Design: Multicentre RCT in subfertile women indicated for a first IVF or intracytoplasmic sperm injection cycle, who are aged … Discussion: The results of this study will be integrated into a decision model that compares cost-effectiveness of the three dose-adjustment strategies to a standard dose strategy. The study outcomes will provide scientific foundation for national and international guidelines. Trial registration: NTR2657

  18. pARIS-htt: an optimised expression platform to study huntingtin reveals functional domains required for vesicular trafficking

    Directory of Open Access Journals (Sweden)

    Pardo Raúl

    2010-06-01

    Full Text Available Abstract Background: Huntingtin (htt) is a multi-domain protein of 350 kDa that is mutated in Huntington's disease (HD) but whose function is yet to be fully understood. This absence of information is due in part to the difficulty of manipulating large DNA fragments by using conventional molecular cloning techniques. Consequently, few studies have addressed the cellular function(s) of full-length htt and its dysfunction(s) associated with the disease. Results: We describe a flexible synthetic vector encoding full-length htt called pARIS-htt (Adaptable, RNAi Insensitive & Synthetic). It includes synthetic cDNA coding for full-length human htt modified so that: (1) it is improved for codon usage, (2) it is insensitive to four different siRNAs allowing gene replacement studies, (3) it contains unique restriction sites (URSs) dispersed throughout the entire sequence without modifying the translated amino acid sequence, (4) it contains multiple cloning sites at the N and C-ter ends and (5) it is Gateway compatible. These modifications facilitate mutagenesis, tagging and cloning into diverse expression plasmids. Htt regulates dynein/dynactin-dependent trafficking of vesicles, such as brain-derived neurotrophic factor (BDNF)-containing vesicles, and of organelles, including reforming and maintenance of the Golgi near the cell centre. We used tests of these trafficking functions to validate various pARIS-htt constructs. We demonstrated, after silencing of endogenous htt, that full-length htt expressed from pARIS-htt rescues Golgi apparatus reformation following reversible microtubule disruption. A mutant form of htt that contains a 100Q expansion and a htt form devoid of either HAP1 or dynein interaction domains are both unable to rescue loss of endogenous htt. These mutants have also an impaired capacity to promote BDNF vesicular trafficking in neuronal cells. Conclusion: We report the validation of a synthetic gene encoding full-length htt protein that will facilitate

  19. pARIS-htt: an optimised expression platform to study huntingtin reveals functional domains required for vesicular trafficking.

    Science.gov (United States)

    Pardo, Raúl; Molina-Calavita, Maria; Poizat, Ghislaine; Keryer, Guy; Humbert, Sandrine; Saudou, Frédéric

    2010-06-01

    Huntingtin (htt) is a multi-domain protein of 350 kDa that is mutated in Huntington's disease (HD) but whose function is yet to be fully understood. This absence of information is due in part to the difficulty of manipulating large DNA fragments by using conventional molecular cloning techniques. Consequently, few studies have addressed the cellular function(s) of full-length htt and its dysfunction(s) associated with the disease. We describe a flexible synthetic vector encoding full-length htt called pARIS-htt (Adaptable, RNAi Insensitive & Synthetic). It includes synthetic cDNA coding for full-length human htt modified so that: 1) it is improved for codon usage, 2) it is insensitive to four different siRNAs allowing gene replacement studies, 3) it contains unique restriction sites (URSs) dispersed throughout the entire sequence without modifying the translated amino acid sequence, 4) it contains multiple cloning sites at the N and C-ter ends and 5) it is Gateway compatible. These modifications facilitate mutagenesis, tagging and cloning into diverse expression plasmids. Htt regulates dynein/dynactin-dependent trafficking of vesicles, such as brain-derived neurotrophic factor (BDNF)-containing vesicles, and of organelles, including reforming and maintenance of the Golgi near the cell centre. We used tests of these trafficking functions to validate various pARIS-htt constructs. We demonstrated, after silencing of endogenous htt, that full-length htt expressed from pARIS-htt rescues Golgi apparatus reformation following reversible microtubule disruption. A mutant form of htt that contains a 100Q expansion and a htt form devoid of either HAP1 or dynein interaction domains are both unable to rescue loss of endogenous htt. These mutants have also an impaired capacity to promote BDNF vesicular trafficking in neuronal cells. We report the validation of a synthetic gene encoding full-length htt protein that will facilitate analyses of its structure/function. This may help

  20. Enhancement of compact heat exchanger fins: numerical and experimental study; Optimisation des echangeurs compacts a ailettes: etude numerique et experimentale

    Energy Technology Data Exchange (ETDEWEB)

    Michel, F.

    2003-10-01

    This work concerns plate-fin compact heat exchangers. These compact devices (C > 700 m²/m³) reduce bulk and weight thanks to their large heat transfer surfaces. These exchangers, widely used in automotive systems, cryogenics and aeronautics, are currently designed with empirical correlations, which limits the evolution of fin geometries. We propose a numerical methodology for designing and enhancing Offset Strip Fin (OSF) geometries. Numerical models and methods have been validated to correctly predict the thermohydraulics of Offset Strip Fin heat exchangers. The simulations were validated against data from the literature and against two experimental devices built for this thesis. Local and global temperature and velocity measurements were carried out in geometries close to Offset Strip Fins. Hot-wire and cold-wire anemometry and Laser Doppler Anemometry (LDA) were used to obtain validation data. Finally, the validated numerical simulations were used to enhance fin geometries and to propose innovative geometries. (author)

  1. Optimising iron chelation therapy with deferasirox for non-transfusion-dependent thalassaemia patients: 1-year results from the THETIS study.

    Science.gov (United States)

    Taher, Ali T; Cappellini, M Domenica; Aydinok, Yesim; Porter, John B; Karakas, Zeynep; Viprakasit, Vip; Siritanaratkul, Noppadol; Kattamis, Antonis; Wang, Candace; Zhu, Zewen; Joaquin, Victor; Uwamahoro, Marie José; Lai, Yong-Rong

    2016-03-01

    Efficacy and safety of iron chelation therapy with deferasirox in iron-overloaded non-transfusion-dependent thalassaemia (NTDT) patients were established in the THALASSA study. THETIS, an open-label, single-arm, multicentre, Phase IV study, added to this evidence by investigating earlier dose escalation by baseline liver iron concentration (LIC) (week 4: escalation according to baseline LIC; week 24: adjustment according to LIC response, maximum 30 mg/kg/day). The primary efficacy endpoint was absolute change in LIC from baseline to week 52. 134 iron-overloaded non-transfusion-dependent anaemia patients were enrolled and received deferasirox starting at 10 mg/kg/day. Mean actual dose ± SD over 1 year was 14.70 ± 5.48 mg/kg/day. At week 52, mean LIC ± SD decreased significantly from 15.13 ± 10.72 mg Fe/g dw at baseline to 8.46 ± 6.25 mg Fe/g dw (absolute change from baseline, -6.68 ± 7.02 mg Fe/g dw [95% CI: -7.91, -5.45]; P < 0.0001). Most common drug-related adverse events were gastrointestinal: abdominal discomfort, diarrhoea and nausea (n = 6 each). There was one death (pneumonia, not considered drug related). With significant and clinically relevant reductions in iron burden alongside a safety profile similar to that in THALASSA, these data support earlier escalation with higher deferasirox doses in iron-overloaded non-transfusion-dependent anaemia patients.

  2. Optimisation of biogas production from manure through serial digestion: lab-scale and pilot-scale studies.

    Science.gov (United States)

    Kaparaju, Prasad; Ellegaard, Lars; Angelidaki, Irini

    2009-01-01

    In the present study, the possibility of optimizing biogas production from manure by serial digestion was investigated. In the lab-scale experiments, the process performance and biogas production of serial digestion, two methanogenic continuously stirred tank reactors (CSTR) connected in series, were compared to a conventional one-step CSTR process. The one-step process was operated at 55 degrees C with a 15 d HRT and a 5 L working volume (control). For serial digestion, the total working volume of 5 L was distributed as 70/30%, 50/50%, 30/70% or 13/87% between the two methanogenic reactors, respectively. Results showed that serial digestion improved biogas production from manure compared to the one-step process. Among the tested reactor configurations, the best results were obtained when the serial reactors were operated with 70/30% and 50/50% volume distribution. Serial digestion at 70/30% and 50/50% volume distribution produced 13-17.8% more biogas and methane, and the effluent contained low VFA and showed low residual methane potential loss compared to the one-step CSTR process. At 30/70% volume distribution, an increase in biogas production was also noticed, but the process was very unstable with low methane production. At 13/87% volume distribution, no difference in biogas production was noticed and methane production was much lower than in the one-step CSTR process. Pilot-scale experiments also showed that serial digestion with 77/23% volume distribution could improve biogas yields by 1.9-6.1% compared to the one-step process. The study thus suggests that biogas production from manure can be optimized through serial digestion with an optimal volume distribution of 70/30% or 50/50%, as operational fluctuations are typically high during full-scale application. However, the process temperatures of the two methanogenic reactors should be as close as possible in order to derive the benefits of serial coupling.

  3. EVITEACH: a study exploring ways to optimise the uptake of evidence-based practice to undergraduate nurses.

    Science.gov (United States)

    Hickman, Louise D; Kelly, Helen; Phillips, Jane L

    2014-11-01

    EVITEACH aimed to increase undergraduate nursing students' engagement with evidence-based practice and enhance their knowledge utilisation and translation capabilities. Building students' capabilities to apply evidence in professional practice is a fundamental university role. Undergraduate nursing students need to actively engage with knowledge utilisation and translational skill development to narrow the evidence-practice gap in the clinical setting. A two-phase mixed methods study was undertaken over a three year period (2008-2010, inclusive) utilizing a Plan-Do-Study-Act (PDSA) approach. Three undergraduate nursing cohorts (N = 188) enrolled in a compulsory knowledge translation and utilisation subject at one Australian university participated. Data collection comprised subject evaluation data and reflective statements. Preliminary investigations identified priority areas related to the subject: materials, resources, teaching and workload. These priority areas became the focus of action for two PDSA cycles. PDSA cycle 1 demonstrated significant improvement of the subject overall (p < 0.05), the evaluation of the materials used (p < 0.001) and the teaching sub-groups (p < 0.05). PDSA cycle 2 continued to sustain improvement of the subject overall (p < 0.05). Furthermore, reflective statements collected during PDSA cycle 2 identified four themes: (1) What engages undergraduate nurses in the learning process; (2) The undergraduate nurses' learning trajectory; (3) Undergraduate nurses' preconceptions of research and evidence-based practice; and (4) Appreciating the importance of research and evidence-based practice to nursing. There is little robust evidence to guide the most effective way to build knowledge utilisation and translational skills. Effectively engaging undergraduate nursing students in knowledge translation and utilisation subjects could have immediate and long-term benefits for nursing as a profession and for patient outcomes. Developing evidence-based practice

  4. Railway vehicle performance optimisation using virtual homologation

    Science.gov (United States)

    Magalhães, H.; Madeira, J. F. A.; Ambrósio, J.; Pombo, J.

    2016-09-01

    Unlike regular automotive vehicles, which are designed to travel on different types of roads, railway vehicles travel mostly on the same route during their life cycle. To accept the operation of a railway vehicle in a particular network, a homologation process is required according to local standard regulations. In Europe, the standards EN 14363 and UIC 518, which are used for railway vehicle acceptance, require on-track tests and/or numerical simulations. An important advantage of using virtual homologation is the reduction of the high costs associated with on-track tests by studying the railway vehicle performance in different operation conditions. This work proposes a methodology for the improvement of railway vehicle design, with the objective of its operation in selected railway tracks, by using optimisation. The analyses required for the vehicle improvement are performed under the control of the global and local optimisation using direct search method. To quantify the performance of the vehicle, a new objective function is proposed, which includes: a Dynamic Performance Index, defined as a weighted sum of the indices obtained from the virtual homologation process; the non-compensated acceleration, which is related to the operational velocity; and a penalty associated with cases where the vehicle presents an unacceptable dynamic behaviour according to the standards. Thus, the optimisation process intends not only to improve the quality of the vehicle in terms of running safety and ride quality, but also to increase the vehicle availability via the reduction of the time for a journey while ensuring its operational acceptance under the standards. The design variables include the suspension characteristics and the operational velocity of the vehicle, which are allowed to vary within an acceptable range. The results of the optimisation lead to a global minimum of the objective function in which the suspension characteristics of the vehicle are

  5. On Big Data Benchmarking

    OpenAIRE

    Han, Rui; Lu, Xiaoyi

    2014-01-01

    Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...

  6. High Energy Density Plasmas (HEDP) for studies of basic nuclear science relevant to Stellar and Big Bang Nucleosynthesis

    Science.gov (United States)

    Frenje, Johan

    2014-06-01

    Thermonuclear reaction rates and nuclear processes have been explored traditionally by means of conventional accelerator experiments, which are difficult to execute at conditions relevant to stellar nucleosynthesis. Thus, nuclear reactions at stellar energies are often studied through extrapolations from higher-energy data or in low-background underground experiments. Even when measurements are possible using accelerators at relevant energies, thermonuclear reaction rates in stars are inherently different from those in accelerator experiments. The fusing nuclei are surrounded by bound electrons in accelerator experiments, whereas electrons occupy mainly continuum states in a stellar environment. Nuclear astrophysics research will therefore benefit from an enlarged toolkit for studies of nuclear reactions. In this presentation, we report on the first use of High Energy Density Plasmas for studies of nuclear reactions relevant to basic nuclear science, stellar and Big Bang nucleosynthesis. These experiments were carried out at the OMEGA laser facility at University of Rochester and the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory, in which spherical capsules were irradiated with powerful lasers to compress and heat the fuel to high enough temperatures and densities for nuclear reactions to occur. Four experiments will be highlighted in this presentation. In the first experiment, the differential cross section for the elastic neutron-triton (n-T) scattering at 14.1 MeV was measured with significantly higher accuracy than achieved in accelerator experiments. In the second experiment, the T(t,2n)4He reaction, a mirror reaction to the 3He(3He,2p)4He reaction that plays an important role in the proton-proton chain that transforms hydrogen into ordinary 4He in stars like our Sun, was studied at energies in the range 15-40 keV. In the third experiment, the 3He+3He solar fusion reaction was studied directly, and in the fourth experiment, we

  7. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  8. Stacking sequence optimisation of composite panels subjected to slamming impact loads using a genetic algorithm

    OpenAIRE

    Khedmati, Mohammad Reza; Sangtabi, Mohammad Rezai; Fakoori, Mehdi

    2013-01-01

    Optimisation of stacking sequence for composite panels under slamming impact loads using a genetic algorithm method is studied in this paper. For this purpose, slamming load is assumed to have a uniform distribution with a triangular-pulse type of intensity function. In order to perform optimisation based on a genetic algorithm, a special code is written in MATLAB software environment. The optimiser is coupled with the commercial software ANSYS in order to analyse the composite panel under st...

  9. Cogeneration technologies, optimisation and implementation

    CERN Document Server

    Frangopoulos, Christos A

    2017-01-01

    Cogeneration refers to the use of a power station to deliver two or more useful forms of energy, for example, to generate electricity and heat at the same time. This book provides an integrated treatment of cogeneration, including a tour of the available technologies and their features, and how these systems can be analysed and optimised.

  10. Optimisation of solar synoptic observations

    Science.gov (United States)

    Klvaña, Miroslav; Sobotka, Michal; Švanda, Michal

    2012-09-01

    The development of instrumental and computer technologies is connected with steadily increasing needs for archiving of large data volumes. The current trend to meet this requirement includes data compression and growth of storage capacities. This approach, however, has technical and practical limits. A further reduction of the archived data volume can be achieved by means of an optimisation of the archiving that consists in selecting data without losing the useful information. We describe a method of optimised archiving of solar images, based on the selection of images that contain new information. The new information content is evaluated by means of the analysis of changes detected in the images. We present characteristics of different kinds of image changes and divide them into fictitious changes with a disturbing effect and real changes that provide new information. In block diagrams describing the selection and archiving, we demonstrate the influence of clouds, the recording of images during an active event on the Sun, including a period before the event onset, and the archiving of the long-term history of solar activity. The described optimisation technique is not suitable for helioseismology, because it does not conserve the uniform time step in the archived sequence and removes the information about solar oscillations. In the case of long-term synoptic observations, the optimised archiving can save a large amount of storage capacity. The actual saving will depend on the setting of the change-detection sensitivity and on the capability to exclude fictitious changes.
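
    A minimal sketch of this selection principle (the threshold, the difference measure and the synthetic image stream below are assumptions for illustration, not the observatory's actual pipeline) archives a frame only when it differs from the last archived frame by more than the chosen change-detection sensitivity:

        import numpy as np

        def select_for_archive(frames, threshold=0.05):
            """Archive a frame only if it differs enough from the last archived one."""
            archived, last = [], None
            for i, frame in enumerate(frames):
                if last is None or np.mean(np.abs(frame - last)) > threshold:
                    archived.append(i)   # frame carries new information
                    last = frame
            return archived

        # Synthetic stream: a quiet Sun with small noise, plus a simulated event
        # that brightens frames 40-49.
        rng = np.random.default_rng(1)
        base = rng.random((64, 64))
        frames = [base + 0.01 * rng.random((64, 64)) + (0.5 if 40 <= i < 50 else 0.0)
                  for i in range(100)]
        print(select_for_archive(frames))   # expected: frames 0, 40 and 50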

  11. For Time-Continuous Optimisation

    DEFF Research Database (Denmark)

    Heinrich, Mary Katherine; Ayres, Phil

    2016-01-01

    Strategies for optimisation in design normatively assume an artefact end-point, disallowing continuous architecture that engages living systems, dynamic behaviour, and complex systems. In our Flora Robotica investigations of symbiotic plant-robot bio-hybrids, we require computational tools...

  13. Big Data Analysis: Current Situation, Problems and Countermeasures

    Institute of Scientific and Technical Information of China (English)

    官思发; 孟玺; 李宗洁; 刘扬

    2015-01-01

    With the rapid development of big data, it has attracted wide attention at home and abroad. How to analyse and process big data effectively and efficiently is the core problem of the field. Using a literature review, this paper summarises the state of big data analysis research in China and abroad, focusing on analysis as a service, big data analysis methods and the emergence of big-data-driven science. It then identifies five key challenges in big data analysis, namely data storage, weak data usability, data modelling, resource scheduling and a shortage of professional big data analytics tools, and proposes five corresponding countermeasures: deploying cloud-based storage, improving data usability, optimising data analysis models, elastically scheduling analytics resources, and developing big data analysis platforms.

  14. Computing seismic damage estimates for buildings within a big city. Bucharest case study.

    Science.gov (United States)

    Toma-Danila, Dragos; Armas, Iuliana

    2016-04-01

    The seismic risk analysis of big cities is a very demanding yet necessary task; the modeling of such complex systems requires first of all insightful input data at good resolution, referring to local effects, buildings and socio-economic aspects. Also, seismic risk estimation methods with good confidence levels are needed. Until recently, these requirements were not fulfilled for Bucharest, one of the capital cities in Europe most endangered by earthquakes. Based on 2011 and 2002 census data, standardized according to the framework of the Near-real time System for Estimating the Seismic Damage in Romania (SeisDaRo) through a unique approach, and on relevant hazard scenarios, we estimate for the first time the building damage within the city, divided into more than 120 areas. The methodology applied relies on 48 vulnerability curves for buildings, on the Improved Displacement Coefficient Analytical Method included in the SELENA software for computing damage probabilities and on multiple seismic hazard scenarios, including the maximum possible. In order to compare results with real losses, we use a scenario based on the 4 March 1977 Vrancea earthquake (7.4 moment-magnitude) that led to 1424 deaths in Bucharest. By using overlay analysis with satellite imagery and a new methodology integrated in GIS, we show how results can be enhanced, reflecting even more local characteristics. Best practices for seismic risk mapping are also discussed. Results are promising and contribute to the mitigation efforts in Bucharest.

  15. Pentaho and Jaspersoft: A Comparative Study of Business Intelligence Open Source Tools Processing Big Data to Evaluate Performances

    Directory of Open Access Journals (Sweden)

    Victor M. Parra

    2016-10-01

    … Meanwhile, Pentaho BI showed a marked increase in CPU time for data processing compared with Jaspersoft, evidenced by the reporting analysis outcomes, with an average difference of 43.12% over the six databases, which proves the point of this study. This study is a guiding reference for researchers and IT professionals who are weighing the conveniences of Big Data processing and the implementation of a BI open source tool based on their needs.

  16. Disease activity-guided dose optimisation of adalimumab and etanercept is a cost-effective strategy compared with non-tapering tight control rheumatoid arthritis care: analyses of the DRESS study.

    Science.gov (United States)

    Kievit, Wietske; van Herwaarden, Noortje; van den Hoogen, Frank Hj; van Vollenhoven, Ronald F; Bijlsma, Johannes Wj; van den Bemt, Bart Jf; van der Maas, Aatke; den Broeder, Alfons A

    2016-11-01

    A disease activity-guided dose optimisation strategy of adalimumab or etanercept (TNFi (tumour necrosis factor inhibitors)) has shown to be non-inferior in maintaining disease control in patients with rheumatoid arthritis (RA) compared with usual care. However, the cost-effectiveness of this strategy is still unknown. This is a preplanned cost-effectiveness analysis of the Dose REduction Strategy of Subcutaneous TNF inhibitors (DRESS) study, a randomised controlled, open-label, non-inferiority trial performed in two Dutch rheumatology outpatient clinics. Patients with low disease activity using TNF inhibitors were included. Total healthcare costs were measured and quality adjusted life years (QALY) were based on EQ5D utility scores. Decremental cost-effectiveness analyses were performed using bootstrap analyses; incremental net monetary benefit (iNMB) was used to express cost-effectiveness. 180 patients were included, and 121 were allocated to the dose optimisation strategy and 59 to control. The dose optimisation strategy resulted in a mean cost saving of -€12 280 (95 percentile -€10 502; -€14 104) per patient per 18 months. There is an 84% chance that the dose optimisation strategy results in a QALY loss with a mean QALY loss of -0.02 (-0.07 to 0.02). The decremental cost-effectiveness ratio (DCER) was €390 493 (€5 085 184; dominant) of savings per QALY lost. The mean iNMB was €10 467 (€6553-€14 037). Sensitivity analyses using 30% and 50% lower prices for TNFi remained cost-effective. Disease activity-guided dose optimisation of TNFi results in considerable cost savings while no relevant loss of quality of life was observed. When the minimal QALY loss is compensated with the upper limit of what society is willing to pay or accept in the Netherlands, the net savings are still high. NTR3216; Post-results.
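
    To make the reported decision metrics concrete, a small sketch of the incremental net monetary benefit and the savings-per-QALY-lost ratio computed from the headline means is given below; the willingness-to-accept threshold is an assumption for illustration, and this point-estimate arithmetic will not exactly reproduce the paper's bootstrap-based values.

        # Headline means from the record above; the threshold is assumed, not from the paper.
        cost_saving = 12280.0   # euro saved per patient per 18 months
        qaly_change = -0.02     # mean QALY difference (a small loss)
        wta = 80000.0           # assumed willingness-to-accept, euro per QALY

        inmb = cost_saving + wta * qaly_change   # incremental net monetary benefit
        dcer = cost_saving / abs(qaly_change)    # euro saved per QALY lost
        print(f"iNMB ~ {inmb:,.0f} euro; savings per QALY lost ~ {dcer:,.0f} euro")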

  17. Comparative case study on website traffic generated by search engine optimisation and a pay-per-click campaign, versus marketing expenditure

    OpenAIRE

    Wouter T. Kritzinger; Melius Weideman

    2015-01-01

    Background: No empirical work was found on how marketing expenses compare when used solely for either the one or the other of the two main types of search engine marketing. Objectives: This research set out to determine how the results of the implementation of a pay-per-click campaign compared to those of a search engine optimisation campaign, given the same website and environment. At the same time, the expenses incurred on both these marketing methods were recorded and compared. Method: Th...

  18. A study on the Translation of Cultural Elements in TV Subtitles-A Case Study of The Big Bang Theory

    Institute of Scientific and Technical Information of China (English)

    胡亚庆

    2014-01-01

    With the growing popularity of TV series outside their country of origin, subtitle translation has attracted considerable attention in recent years. The Big Bang Theory (TBBT) is a widespread American TV series whose subtitles have been translated into many foreign languages, and it also boasts a richness of cultural elements, the most typical ones being science elements and religious cultural elements.

  19. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  20. Clarifying the role of social comparison in the big-fish-little-pond effect (BFLPE): an integrative study.

    Science.gov (United States)

    Huguet, Pascal; Dumas, Florence; Marsh, Herbert; Wheeler, Ladd; Seaton, Marjorie; Nezlek, John; Suls, Jerry; Régner, Isabelle

    2009-07-01

    It has been speculated that the big-fish-little-pond effect (BFLPE; the negative impact of highly selective academic settings on academic self-concept) is a consequence of invidious social comparisons experienced in higher ability schools. However, the direct role of such comparisons for the BFLPE has not heretofore been documented. The present study comprises the first evidence that the BFLPE (a) is eliminated after controlling for students' invidious comparisons with their class and (b) coexists with the assimilative and contrastive effects of upward social comparison choices on academic self-concept. These results increase understanding of the BFLPE and offer support for integrative approaches of social comparison (selective accessibility and interpretation comparison models) in a natural setting. They also lend support for the distinction between forced and deliberate social comparisons and the usefulness of distinguishing between absolute and relative comparison-level choice in self-assessment.

  1. Real-Time Prediction of Gamers Behavior Using Variable Order Markov and Big Data Technology: A Case of Study

    Directory of Open Access Journals (Sweden)

    Alejandro Baldominos Gómez

    2016-03-01

    Full Text Available This paper presents the results and conclusions found when predicting the behavior of gamers in commercial videogame datasets. In particular, it uses Variable-Order Markov (VOM) to build a probabilistic model that is able to use the historic behavior of gamers and to infer their next actions. Being able to predict the user's next actions with accuracy can be of special interest to learn from the behavior of gamers, to make them more engaged and to reduce churn rate. In order to support a big volume and velocity of data, the system is built on top of the Hadoop ecosystem, using HBase for real-time processing; and the prediction tool is provided as a service (SaaS) and accessible through a RESTful API. The prediction system is evaluated using a case study with two commercial videogames, attaining promising results with high prediction accuracies.
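
    The idea can be sketched with a minimal variable-order Markov predictor (the back-off rule, maximum order and session log below are assumptions for illustration, not the paper's implementation): action sequences up to a maximum context length are counted, and prediction backs off from the longest matching context to shorter ones.

        from collections import defaultdict

        class VOMPredictor:
            """Variable-order Markov next-action predictor with simple back-off."""

            def __init__(self, max_order=3):
                self.max_order = max_order
                self.counts = defaultdict(lambda: defaultdict(int))  # context -> action -> count

            def fit(self, sequence):
                for i, action in enumerate(sequence):
                    for k in range(self.max_order + 1):
                        if i - k < 0:
                            break
                        context = tuple(sequence[i - k:i])
                        self.counts[context][action] += 1

            def predict(self, history):
                # Back off from the longest context present in the history to the empty one.
                for k in range(min(self.max_order, len(history)), -1, -1):
                    context = tuple(history[len(history) - k:])
                    if context in self.counts:
                        nxt = self.counts[context]
                        return max(nxt, key=nxt.get)
                return None

        # Hypothetical session log of in-game actions.
        log = ["login", "shop", "play", "play", "shop", "play", "play", "logout"]
        model = VOMPredictor(max_order=2)
        model.fit(log)
        print(model.predict(["shop", "play"]))   # most likely next action: "play"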

  2. Big data=Big marketing?!

    Institute of Scientific and Technical Information of China (English)

    肖明超

    2012-01-01

    When the internet was just emerging, a saying was very popular: "On the internet, nobody knows you're a dog." Today, more than twenty years later, that saying has long been thrown into the dustbin of history. Driven by technology, and with the rapid development of mobile internet, social networks and e-commerce, consumers' "whereabouts" have become ever easier to capture: their online attention, behavioural trails, conversations, preferences, purchase histories and so on can all be recorded, and consumers have entered an almost transparent existence in the "Age of Big Data". Not only is data becoming more usable, but the development of artificial intelligence (AI) technologies, including natural language processing, pattern recognition and machine learning, is making data ever easier for computers to understand…

  3. Optimisation of fertiliser rates in crop production against energy use indicators

    DEFF Research Database (Denmark)

    Rossner, Helis; Ritz, Christian; Astover, Alar

    2014-01-01

    Optimising mineral nitrogen (N) use in crop production is an inevitable target, as mineral fertilisers represent one of the highest inputs both in terms of economy and energy. The aim of the study was to compare the relationship between the rate of N fertiliser application and different measures of energy… 0.05) optimisation. Both the new combined indices gave optimum N norms in between the rates of ER and EG. The composted cow manure background did not affect mineral N optimisation significantly. We suggest optimisation of mineral N according to bi-dimensional parameters as they capture important features of production...

  4. The Role of Teamwork in the Analysis of Big Data: A Study of Visual Analytics and Box Office Prediction.

    Science.gov (United States)

    Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy

    2017-03-01

    Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.

  5. Symptoms of endocrine treatment and outcome in the BIG 1-98 study.

    Science.gov (United States)

    Huober, J; Cole, B F; Rabaglio, M; Giobbie-Hurder, A; Wu, J; Ejlertsen, B; Bonnefoi, H; Forbes, J F; Neven, P; Láng, I; Smith, I; Wardley, A; Price, K N; Goldhirsch, A; Coates, A S; Colleoni, M; Gelber, R D; Thürlimann, B

    2014-01-01

    There may be a relationship between the incidence of vasomotor and arthralgia/myalgia symptoms and treatment outcomes for postmenopausal breast cancer patients with endocrine-responsive disease who received adjuvant letrozole or tamoxifen. Data on patients randomized into the monotherapy arms of the BIG 1-98 clinical trial who did not have either vasomotor or arthralgia/myalgia/carpal tunnel (AMC) symptoms reported at baseline, started protocol treatment and were alive and disease-free at the 3-month landmark (n = 4,798) and at the 12-month landmark (n = 4,682) were used for this report. Cohorts of patients with vasomotor symptoms, AMC symptoms, neither, or both were defined at both 3 and 12 months from randomization. Landmark analyses were performed for disease-free survival (DFS) and for breast cancer free interval (BCFI), using regression analysis to estimate hazard ratios (HR) and 95 % confidence intervals (CI). Median follow-up was 7.0 years. Reporting of AMC symptoms was associated with better outcome for both the 3- and 12-month landmark analyses [e.g., 12-month landmark, HR (95 % CI) for DFS = 0.65 (0.49-0.87), and for BCFI = 0.70 (0.49-0.99)]. By contrast, reporting of vasomotor symptoms was less clearly associated with DFS [12-month DFS HR (95 % CI) = 0.82 (0.70-0.96)] and BCFI [12-month BCFI HR (95 % CI) = 0.97 (0.80-1.18)]. Interaction tests indicated no effect of treatment group on associations between symptoms and outcomes. While reporting of AMC symptoms was clearly associated with better DFS and BCFI, the association between vasomotor symptoms and outcome was less clear, especially with respect to breast cancer-related events.

  6. Optimisation of the $VH$ to $b\bar{b} + X$ Selection

    CERN Document Server

    Wilk, Fabian

    2013-01-01

    This report presents results from two separate, yet related studies, both using truth data. First, a cut optimisation study and its results are presented. This study aims to provide a numerically optimised set of cuts for the current $8\,\mathrm{TeV}$ and the upcoming $14\,\mathrm{TeV}$ analysis of the $WH \to l\
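
    For orientation, a cut optimisation of this kind can be reduced to scanning a selection cut and maximising an expected-significance estimate such as s/sqrt(s+b); the discriminating variable, event weights and samples below are invented for the sketch and are not the selection used in the report.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical truth-level distributions of a discriminating variable (GeV).
        signal = rng.normal(150, 40, size=20000)
        background = rng.exponential(60, size=200000)
        w_sig, w_bkg = 0.01, 0.05          # assumed per-event weights

        best_cut, best_z = None, -1.0
        for cut in np.linspace(0, 300, 151):
            s = w_sig * np.sum(signal > cut)
            b = w_bkg * np.sum(background > cut)
            if s + b == 0:
                continue
            z = s / np.sqrt(s + b)         # simple significance estimate
            if z > best_z:
                best_cut, best_z = cut, z

        print(f"optimal cut: > {best_cut:.0f} GeV, expected significance {best_z:.2f}")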

  7. Contingency Factors and Matrix Study of Big Data Solutions

    Institute of Scientific and Technical Information of China (English)

    尹晓锋; 辛伯宇

    2015-01-01

    With the development of the internet, the Internet of Things (IoT), cloud computing and related technologies, big data has gradually attracted the attention of, and found application in, all kinds of organizations. However, there is still no unified standard for how to build or organize a big data solution. To address this problem, this paper conducts a comparative study and classified analysis of existing big data solutions, focusing on their contingency factors, and proposes a contingency factor matrix for big data solutions, which provides technical staff and organizations with a practical strategy for organizing and choosing a big data solution.

  8. An Inverse Robust Optimisation Approach for a Class of Vehicle Routing Problems under Uncertainty

    Directory of Open Access Journals (Sweden)

    Liang Sun

    2016-01-01

    Full Text Available There is a trade-off between the total penalty paid to customers (TPC) and the total transportation cost (TTC) in depot for vehicle routing problems under uncertainty (VRPU). The trade-off refers to the fact that the TTC in depot inevitably increases when the TPC decreases and vice versa. With respect to this issue, the vehicle routing problem (VRP) with uncertain customer demand and travel time was studied to optimise the TPC and the TTC in depot. In addition, an inverse robust optimisation approach was proposed to solve this kind of VRPU by combining the ideas of inverse optimisation and robust optimisation so as to improve both the TPC and the TTC in depot. The method aimed to improve the corresponding TTC of the robust optimisation solution under the minimum TPC through minimising the adjustment of the benchmark road transportation cost. According to the characteristics of the inverse robust optimisation model, a genetic algorithm (GA) and a column generation algorithm are combined to solve the problem. Moreover, 39 test problems are solved using the inverse robust optimisation approach: the results show that both the TPC and TTC obtained by the inverse robust optimisation approach are less than those calculated using a robust optimisation approach.

  9. A Study of Ethnic Minority College Students: A Relationship among the Big Five Personality Traits, Cultural Intelligence, and Psychological Well-Being

    Science.gov (United States)

    Smith, Teresa Ann

    2012-01-01

    Institutions of Higher Education are challenged to educate an increasing, diverse ethnic minority population. This study examines (1) if the theory of the Big Five personality traits as a predictor of the cultural intelligence theoretical model remains constant with ethnic minority college students attending a southeastern United States…

  10. Big data impact on society: a research roadmap for Europe

    OpenAIRE

    Cuquet, Martí; Fensel, Anna

    2016-01-01

    With its rapid growth and increasing adoption, big data is producing a growing impact in society. Its usage is opening both opportunities such as new business models and economic gains and risks such as privacy violations and discrimination. Europe is in need of a comprehensive strategy to optimise the use of data for a societal benefit and increase the innovation and competitiveness of its productive activities. In this paper, we contribute to the definition of this strategy with a research ...

  11. Study and optimisation of the high energy detector in Cd(Zn)Te of the Simbol-X space mission for X and gamma astronomy; Etude et optimisation du plan de detection de haute energie en Cd(Zn)Te pour la mission spatiale d'observation astronomie X et gamma SIMBOL-X

    Energy Technology Data Exchange (ETDEWEB)

    Meuris, A.

    2009-09-15

    Stars in the final phases of their evolution are the sites of the most energetic phenomena in the Universe. The understanding of their mechanisms is based on the observation of the X and gamma rays coming from the sources. The French-Italian Simbol-X project is a novel concept of telescope with two satellites flying in formation. This space mission combines upgraded optics from X-ray telescopes with detection systems from gamma-ray telescopes. CEA Saclay, involved in major space missions for gamma astronomy, is in charge of the definition and the design of the High Energy Detector (HED) of Simbol-X, covering the spectral range from 8 to 80 keV. Two generations of micro-cameras called Caliste have been designed, fabricated and tested. They integrate cadmium telluride (CdTe) crystals and optimised front-end electronics named Idef-X. The hybridization technique enables them to be placed side by side as a mosaic to achieve, for the first time, a CdTe detection plane with fine spatial resolution (600 µm) and arbitrarily large surface. By setting up test benches and leading test campaigns, I was involved in the fabrication of the Caliste prototypes and I assessed their temporal, spatial and spectral resolutions. Based on the experiments and simulations, I propose a detector type, operating conditions and digital processing on board the spacecraft to optimise HED performance. The best detector candidate is CdTe Schottky, well suited to high-resolution spectroscopy; however, it suffers from a loss of stability during biasing. Beyond the Simbol-X mission, I studied this kind of detector theoretically and experimentally to build an updated model that can apply to other projects of gamma-ray spectroscopy and imaging. (author)

  12. Optimising Code Generation with haggies

    OpenAIRE

    Reiter, Thomas

    2009-01-01

    This article describes haggies, a program for the generation of optimised programs for the efficient numerical evaluation of mathematical expressions. It uses a multivariate Horner-scheme and Common Subexpression Elimination to reduce the overall number of operations. The package can serve as a back-end for virtually any general purpose computer algebra program. Built-in type inference that allows to deal with non-standard data types in strongly typed languages and a very flexible, pattern-ba...

  13. Hybrid Genetic Algorithm with PSO Effect for Combinatorial Optimisation Problems

    Directory of Open Access Journals (Sweden)

    M. H. Mehta

    2012-12-01

    Full Text Available In the engineering field, many problems are hard to solve within a definite interval of time. These problems, known as combinatorial optimisation problems, belong to the class NP. They are easy to solve in polynomial time when the input size is small, but as the input size grows they become extremely hard to solve within a definite interval of time. Long-known conventional methods are not able to solve them, and thus a proper heuristic is necessary. Evolutionary algorithms based on the behaviours of different animals and species have been invented and studied for this purpose. The genetic algorithm is considered a powerful algorithm for solving combinatorial optimisation problems. Genetic algorithms work on these problems by mimicking human genetics, following a "survival of the fittest" kind of strategy. Particle swarm optimisation is a newer evolutionary approach that copies the behaviour of swarms in nature. However, neither traditional genetic algorithms nor particle swarm optimisation alone has been completely successful in solving combinatorial optimisation problems. Here a hybrid algorithm is proposed in which the strengths of both algorithms are merged, and the performance of the proposed algorithm is compared with a simple genetic algorithm. Results show that the proposed algorithm works definitely better than the simple genetic algorithm.
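
    The abstract does not spell out the hybrid's operators, so the sketch below only illustrates one common way to add a PSO effect to a genetic algorithm, shown on a continuous test function for brevity (the paper itself targets combinatorial problems): after crossover and mutation, every offspring is accelerated towards the population's global best solution.

        import numpy as np

        rng = np.random.default_rng(0)

        def fitness(x):                       # minimise the sphere function
            return np.sum(x * x)

        dim, pop_size, gens = 10, 40, 200
        pop = rng.uniform(-5, 5, size=(pop_size, dim))
        velocity = np.zeros_like(pop)

        for gen in range(gens):
            scores = np.array([fitness(ind) for ind in pop])
            gbest = pop[np.argmin(scores)].copy()      # global best, as in PSO

            # GA part: tournament selection, uniform crossover, Gaussian mutation.
            new_pop = []
            for _ in range(pop_size):
                a, b = rng.integers(pop_size, size=2)
                p1 = pop[a] if scores[a] < scores[b] else pop[b]
                a, b = rng.integers(pop_size, size=2)
                p2 = pop[a] if scores[a] < scores[b] else pop[b]
                mask = rng.random(dim) < 0.5
                new_pop.append(np.where(mask, p1, p2) + rng.normal(0, 0.1, dim))
            new_pop = np.array(new_pop)

            # PSO effect: accelerate each offspring towards the global best.
            velocity = 0.7 * velocity + 1.5 * rng.random((pop_size, dim)) * (gbest - new_pop)
            pop = new_pop + velocity

        best = pop[np.argmin([fitness(ind) for ind in pop])]
        print("best fitness found:", fitness(best))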

  14. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  15. Microfluidic converging/diverging channels optimised for homogeneous extensional deformation

    Science.gov (United States)

    Zografos, K.; Oliveira, M. S. N.

    2016-01-01

    In this work, we optimise microfluidic converging/diverging geometries in order to produce constant strain-rates along the centreline of the flow, for performing studies under homogeneous extension. The design is examined for both two-dimensional and three-dimensional flows where the effects of aspect ratio and dimensionless contraction length are investigated. Initially, pressure driven flows of Newtonian fluids under creeping flow conditions are considered, which is a reasonable approximation in microfluidics, and the limits of the applicability of the design in terms of Reynolds numbers are investigated. The optimised geometry is then used for studying the flow of viscoelastic fluids and the practical limitations in terms of Weissenberg number are reported. Furthermore, the optimisation strategy is also applied for electro-osmotic driven flows, where the development of a plug-like velocity profile allows for a wider region of homogeneous extensional deformation in the flow field. PMID:27478523

  16. Plant-wide performance optimisation – The refrigeration system case

    DEFF Research Database (Denmark)

    Izadi-Zamanabadi, Roozbeh; Green, Torben; Razavi-Far, Roozbeh

    2012-01-01

    This paper investigates the problem of plant-wide performance optimisation seen from an industrial perspective. The refrigeration system is used as a case study, because it has a distributed control architecture and operates in steady state conditions, which is common for many industrial applications...

  17. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  18. Five Big Ideas

    Science.gov (United States)

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  19. Fabrication optimisation of carbon fiber electrode with Taguchi method.

    Science.gov (United States)

    Cheng, Ching-Ching; Young, Ming-Shing; Chuang, Chang-Lin; Chang, Ching-Chang

    2003-07-01

    In this study, we describe an optimised procedure for fabricating carbon fiber electrodes using the Taguchi quality engineering method (TQEM). The preliminary results show an S/N ratio improvement from 22 to 30 dB. The optimised parameters were tested by dipping a glass micropipette (0.3 mm outer/2.5 mm inner length of carbon fiber) into PBS solution under 2.9 V triangle-wave electrochemical processing for 15 s, followed by coating the micropipette at 2.6 V DC for 45 s in 5% Nafion solution. It is thus shown that Taguchi process optimisation can improve the cost, manufacturing time and quality of carbon fiber electrodes.
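
    For reference, the quantity maximised in a Taguchi optimisation is the signal-to-noise ratio; a minimal sketch of the larger-the-better form in decibels is given below, with invented replicate measurements (they are not the paper's data).

        import numpy as np

        def sn_larger_is_better(y):
            """Taguchi larger-the-better S/N ratio in dB: -10*log10(mean(1/y^2))."""
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(1.0 / y**2))

        # Hypothetical replicate responses for two runs of an orthogonal-array experiment.
        run_a = [18.5, 21.0, 19.7]
        run_b = [29.4, 31.2, 30.1]
        print(round(sn_larger_is_better(run_a), 1), "dB")
        print(round(sn_larger_is_better(run_b), 1), "dB")   # the higher S/N is preferred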

  20. Robust optimisation of railway crossing geometry

    Science.gov (United States)

    Wan, Chang; Markine, Valeri; Dollevoet, Rolf

    2016-05-01

    This paper presents a methodology for improving the crossing (frog) geometry through the robust optimisation approach, wherein the variability of the design parameters within a prescribed tolerance is included in the optimisation problem. Here, the crossing geometry is defined by parameterising the B-spline represented cross-sectional shape and the longitudinal height profile of the nose rail. The dynamic performance of the crossing is evaluated considering the variation of wheel profiles and track alignment. A multipoint approximation method (MAM) is applied in solving the optimisation problem of minimising the contact pressure during the wheel-rail contact and constraining the location of wheel transition at the crossing. To clarify the difference between the robust optimisation and the normal deterministic optimisation approaches, the optimisation problems are solved in both approaches. The results show that the deterministic optimum fails under slight change of the design variables; the robust optimum, however, has improved and robust performance.

  1. Optimising Antibiotic Usage to Treat Bacterial Infections

    Science.gov (United States)

    Paterson, Iona K.; Hoyle, Andy; Ochoa, Gabriela; Baker-Austin, Craig; Taylor, Nick G. H.

    2016-11-01

    The increase in antibiotic resistant bacteria poses a threat to the continued use of antibiotics to treat bacterial infections. The overuse and misuse of antibiotics has been identified as a significant driver in the emergence of resistance. Finding optimal treatment regimens is therefore critical in ensuring the prolonged effectiveness of these antibiotics. This study uses mathematical modelling to analyse the effect traditional treatment regimens have on the dynamics of a bacterial infection. Using a novel approach, a genetic algorithm, the study then identifies improved treatment regimens. Using a single antibiotic the genetic algorithm identifies regimens which minimise the amount of antibiotic used while maximising bacterial eradication. Although exact treatments are highly dependent on parameter values and initial bacterial load, a significant common trend is identified throughout the results. A treatment regimen consisting of a high initial dose followed by an extended tapering of doses is found to optimise the use of antibiotics. This consistently improves the success of eradicating infections, uses less antibiotic than traditional regimens and reduces the time to eradication. The use of genetic algorithms to optimise treatment regimens enables an extensive search of possible regimens, with previous regimens directing the search into regions of better performance.
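
    A heavily simplified sketch of the approach is given below; the pharmacodynamic model, parameter values and the bare-bones evolutionary scheme (truncation selection plus Gaussian mutation) are assumptions for illustration, not the authors' model. Candidate daily-dose vectors are evolved against an objective that first demands eradication of the simulated bacterial load and then minimises the total antibiotic used.

        import numpy as np

        rng = np.random.default_rng(3)
        days, pop_size, gens = 7, 60, 150

        def simulate(doses, b0=1e6):
            """Daily-step toy model: logistic growth followed by a dose-dependent kill."""
            b = b0
            for d in doses:
                b *= 1 + 0.8 * (1 - b / 1e9)     # growth
                b *= np.exp(-0.9 * d)            # kill, stronger with dose
                if b < 1:
                    return 0.0
            return b

        def objective(doses):
            survivors = simulate(doses)
            # Heavy penalty if bacteria survive, then minimise total drug used.
            return (1e3 if survivors > 0 else 0.0) + np.sum(doses)

        pop = rng.uniform(0, 5, size=(pop_size, days))   # candidate daily doses
        for gen in range(gens):
            scores = np.array([objective(ind) for ind in pop])
            parents = pop[np.argsort(scores)[: pop_size // 2]]
            children = np.clip(parents + rng.normal(0, 0.3, parents.shape), 0, 5)
            pop = np.vstack([parents, children])

        best = pop[np.argmin([objective(ind) for ind in pop])]
        print("best regimen (daily doses):", np.round(best, 2))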

  2. Towards an Integral Meta-Studies: Describing and Transcending Boundaries in the Development of Big Picture Science

    Directory of Open Access Journals (Sweden)

    Mark G. Edwards

    2013-06-01

    Full Text Available We are entering a period in human civilisation when we will either act globally to establish a sustainable and sustaining network of world societies or be enmired, for the foreseeable future, in a regressive cycle of ever-deepening global crises. We will need to develop global forms of big picture science that possess institutionalised capacities for carrying out meta-level research and practice. It will be global in that such research cannot be undertaken in isolation from practical global concerns and global social movements. In this paper I propose a general schema, called integral meta-studies, that describes some of the characteristics of this meta-level science. Integral here refers to the long tradition of scientific and philosophic endeavours to develop integrative models and methods. Given the disastrous outcomes of some of the totalising theories of the nineteenth century, the subsequent focus on ideas of the middle-range is entirely understandable. But middle-range theory will not resolve global problems. A more reflexive and wider conceptual vision is required. Global problems of the scale that we currently face require a response that can navigate through theoretical pluralism and not be swallowed up by it. In saying that, twenty-first-century metatheories will need to be different from the monistic, grand theories of the past. They will have to be integrative rather than totalising, pluralistic rather than monistic, based on science and not only on philosophy, methodical rather than idiosyncratic, find inspiration in theories, methods and interpretive frameworks from the edge more than from the centre and provide means for inventing new ways of understanding as much as new technologies. Integrative meta-studies describes an open system of knowledge acquisition that has a place for many forms of scientific inquiry and their respective theories, methods, techniques of analysis and interpretive frameworks. Note: The word

  3. Numerical optimisation of friction stir welding: review of future challenges

    DEFF Research Database (Denmark)

    Tutum, Cem Celal; Hattel, Jesper Henri

    2011-01-01

    During the last decade, the combination of increasingly more advanced numerical simulation software with high computational power has resulted in models for friction stir welding (FSW), which have improved the understanding of the determining physical phenomena behind the process substantially....... This has made optimisation of certain process parameters possible and has in turn led to better performing friction stir welded products, thus contributing to a general increase in the popularity of the process and its applications. However, most of these optimisation studies do not go well beyond manual...

  4. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  5. An Optimisation Approach for Room Acoustics Design

    DEFF Research Database (Denmark)

    Holm-Jørgensen, Kristian; Kirkegaard, Poul Henning; Andersen, Lars

    2005-01-01

    This paper discusses on a conceptual level the value of optimisation techniques in architectural acoustics room design from a practical point of view. One objective room acoustics design criterion, estimated from the sound field inside the room, is chosen for optimisation. The sound field is modeled...... using the boundary element method where absorption is incorporated. An example is given where the geometry of a room is defined by four design modes. The room geometry is optimised to get a uniform sound pressure....

  6. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, which relates to the storage, analytics and visualization of big data; the human dimension of big data; and the process management dimension, which addresses big data management from both a technological and a business perspective.

  7. Multiwavelength astronomy and big data

    Science.gov (United States)

    Mickaelian, A. M.

    2016-09-01

    Two major characteristics of modern astronomy are multiwavelength (MW) studies (from γ-ray to radio) and big data (data acquisition, storage and analysis). Present astronomical databases and archives contain billions of objects, both galactic and extragalactic, observed at various wavelengths, and the vast amount of data on them allows new studies and discoveries. Astronomers deal with big numbers. Surveys are the main source for the discovery of astronomical objects and the accumulation of observational data for further analysis, interpretation, and achieving scientific results. We review the main characteristics of astronomical surveys, compare the photographic and digital eras of astronomical studies (including the development of wide-field observations), describe the present state of MW surveys, and discuss Big Data in astronomy and the related topics of Virtual Observatories and Computational Astrophysics. The review includes many numbers and data that can be compared to gain an overall understanding of the Universe, cosmic numbers and their relationship to modern computational facilities.

  8. Optimising Optimal Image Subtraction

    CERN Document Server

    Israel, H; Schuh, S; Israel, Holger; Hessman, Frederic V.; Schuh, Sonja

    2006-01-01

    Difference imaging is a technique for obtaining precise relative photometry of variable sources in crowded stellar fields and, as such, constitutes a crucial part of the data reduction pipeline in surveys for microlensing events or transiting extrasolar planets. The Optimal Image Subtraction (OIS) algorithm permits the accurate differencing of images by determining convolution kernels which, when applied to reference images of particularly good quality, provide excellent matches to the point-spread functions (PSF) in other images of the time series to be analysed. The convolution kernels are built as linear combinations of a set of basis functions, conventionally bivariate Gaussians modulated by polynomials. The kernel parameters must be supplied by the user and should ideally be matched to the PSF, pixel-sampling, and S/N of the data to be analysed. We have studied the outcome of the reduction as a function of the kernel parameters using our implementation of OIS within the TRIPP package. From the analysis o...
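
    As a concrete illustration of the kernel model described in this record (a generic sketch, not the TRIPP implementation), the snippet below builds the conventional basis of bivariate Gaussians modulated by low-order polynomials in the pixel offsets, and solves for the least-squares linear combination that convolves a reference patch into a target patch.

```python
import numpy as np
from scipy.signal import fftconvolve

def kernel_basis(half_size=7, sigmas=(0.7, 1.5, 3.0), poly_deg=2):
    """Conventional OIS basis: Gaussians modulated by polynomials in the offsets (u, v)."""
    u, v = np.meshgrid(*(np.arange(-half_size, half_size + 1),) * 2)
    basis = []
    for s in sigmas:
        g = np.exp(-(u**2 + v**2) / (2 * s**2))
        for i in range(poly_deg + 1):
            for j in range(poly_deg + 1 - i):
                basis.append(g * u**i * v**j)
    return np.array(basis)                     # shape (n_basis, 2*half_size+1, 2*half_size+1)

def solve_kernel(ref_patch, target_patch, basis):
    """Least-squares coefficients such that (ref * kernel) best matches the target."""
    cols = [fftconvolve(ref_patch, b, mode="same").ravel() for b in basis]
    A = np.stack(cols, axis=1)
    coeffs, *_ = np.linalg.lstsq(A, target_patch.ravel(), rcond=None)
    return np.tensordot(coeffs, basis, axes=1)  # the fitted convolution kernel
```

    A production pipeline would additionally fit a spatially varying kernel and a differential background term, and would choose the Gaussian widths and polynomial degree to match the PSF, pixel sampling and S/N of the data, which is exactly the parameter study the record describes.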

  9. Noise aspects at aerodynamic blade optimisation projects

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G. [Netherlands Energy Research Foundation, Petten (Netherlands)

    1997-12-31

    This paper shows an example of an aerodynamic blade optimisation, using the program PVOPT. PVOPT calculates the optimal wind turbine blade geometry such that the maximum energy yield is obtained. Using the aerodynamically optimal blade design as a basis, the possibilities for noise reduction are investigated. The aerodynamically optimised geometry from PVOPT is the `real` optimum (up to the latest decimal). The most important conclusion from this study is that it is worthwhile to investigate the behaviour of the objective function (in the present case the energy yield) around the optimum: if the optimum is flat, there is a possibility to apply modifications to the optimum configuration with only a limited loss in energy yield. It is obvious that the modified configurations emit a different (and possibly lower) noise level. In the BLADOPT program (the successor of PVOPT) it will be possible to quantify the noise level and hence to assess the reduced noise emission more thoroughly. At present the most promising approaches for noise reduction are believed to be a reduction of the rotor speed (if at all possible), and a reduction of the tip angle by means of low-lift profiles or decreased twist at the outboard stations. These modifications were possible without a significant loss in energy yield. (LN)
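
    The recommendation to examine the objective function around the optimum can be pictured with a toy scan; the yield function below is a hypothetical stand-in for the PVOPT/BLADOPT objective, and the 1% tolerance is an arbitrary illustrative choice.

```python
import numpy as np

def energy_yield(tip_angle_deg):
    # Hypothetical smooth objective with a flat top around 2 degrees.
    return 100.0 - 0.15 * (tip_angle_deg - 2.0) ** 2

angles = np.linspace(-2.0, 6.0, 81)
yields = energy_yield(angles)
best = yields.max()

# All designs losing at most 1% of the optimal yield are kept as candidates;
# among these one would then pick the quietest configuration.
acceptable = angles[yields >= 0.99 * best]
print(f"tip angles within 1% of the optimum: {acceptable.min():.2f} to {acceptable.max():.2f} deg")
```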

  10. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing approaches. Big data analysis focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality owing to residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
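
    To make the propensity-score idea concrete, the sketch below (a generic illustration, not taken from any particular study) fits a logistic propensity model for treatment assignment and estimates a treatment effect with inverse-probability-of-treatment weighting; the data frame and its column names are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def iptw_effect(df, treatment="treated", outcome="outcome",
                covariates=("age", "egfr", "diabetes")):
    """Inverse-probability-of-treatment weighting with a logistic propensity model."""
    X, t, y = df[list(covariates)].values, df[treatment].values, df[outcome].values
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    w = np.where(t == 1, 1.0 / ps, 1.0 / (1.0 - ps))  # weight stabilisation omitted for brevity
    return (np.average(y[t == 1], weights=w[t == 1])
            - np.average(y[t == 0], weights=w[t == 0]))

# Tiny synthetic example where age confounds treatment and outcome.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({"age": rng.normal(60, 10, n),
                   "egfr": rng.normal(70, 15, n),
                   "diabetes": rng.integers(0, 2, n)})
df["treated"] = (rng.random(n) < 1 / (1 + np.exp(-(df["age"] - 60) / 10))).astype(int)
df["outcome"] = 0.5 * df["treated"] + 0.02 * df["age"] + rng.normal(0, 1, n)
print("IPTW estimate of treatment effect:", round(iptw_effect(df), 2))
```

    Instrumental variable analysis, the other approach mentioned above, instead relies on a variable that influences treatment but not the outcome directly; both methods address confounding that a naive comparison of treated and untreated patients cannot.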

  11. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing approaches. Big data analysis focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality owing to residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  12. Optimisation combinatoire Theorie et algorithmes

    CERN Document Server

    Korte, Bernhard; Fonlupt, Jean

    2010-01-01

    This book is the French translation of the fourth and final edition of Combinatorial Optimization: Theory and Algorithms, written by two eminent specialists in the field, Bernhard Korte and Jens Vygen of the University of Bonn in Germany. It emphasises the theoretical aspects of combinatorial optimisation as well as efficient and exact algorithms for solving problems, and in this it differs from the simpler heuristic approaches often described elsewhere. The book contains numerous concise and elegant proofs of difficult results. Intended for students...

  13. A Study on the Quality of User Service Guarantees for Libraries in the Big Data Era

    Institute of Scientific and Technical Information of China (English)

    陈臣

    2014-01-01

    This paper analyses the characteristics of libraries in the big data era and the requirements for guaranteeing reader service quality, and argues that user service guarantees should be addressed from four aspects. The big data era has arrived: the paper first identifies four characteristics of the library in this era, namely large data volume, a variety of data types, sparse (low-density) value and high velocity. It then discusses how the quality of user services can be guaranteed and proposes a strategy that can effectively safeguard user service quality in libraries in the big data era.

  14. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  15. Plant-wide performance optimisation – The refrigeration system case

    DEFF Research Database (Denmark)

    Green, Torben; Razavi-Far, Roozbeh; Izadi-Zamanabadi, Roozbeh;

    2012-01-01

    This paper investigates the problem of plant-wide performance optimisation seen from an industrial perspective. The refrigeration system is used as a case study, because it has a distributed control architecture and operates in steady state conditions, which is common for many industrial applicat...

  16. Cluster Optimisation using Cgroups at a Tier-2

    Science.gov (United States)

    Qin, G.; Roy, G.; Crooks, D.; Skipsey, S. C.; Stewart, G. P.; Britton, D.

    2016-10-01

    The Linux kernel feature Control Groups (cgroups) has been used to gather metrics on the resource usage of single and eight-core ATLAS workloads. It has been used to study the effects on performance of a reduction in the amount of physical memory. The results were used to optimise cluster performance, and consequently increase cluster throughput by up to 10%.
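
    A minimal sketch of the kind of per-job accounting that cgroups make possible is shown below; it assumes a cgroup v1 memory controller and a hypothetical job path, both of which differ between batch systems and kernel versions, and it is not the tooling used in the study.

```python
from pathlib import Path

def memory_metrics(cgroup_path="/sys/fs/cgroup/memory/batch/job_123"):
    """Read basic memory accounting for one job's cgroup (cgroup v1 layout assumed;
    the job path above is a hypothetical placeholder)."""
    base = Path(cgroup_path)
    metrics = {}
    for name in ("memory.usage_in_bytes", "memory.max_usage_in_bytes", "memory.limit_in_bytes"):
        f = base / name
        if f.exists():
            metrics[name] = int(f.read_text().strip())
    stat_file = base / "memory.stat"
    if stat_file.exists():
        for line in stat_file.read_text().splitlines():
            key, value = line.split()
            if key in ("rss", "cache", "swap"):
                metrics[key] = int(value)
    return metrics

if __name__ == "__main__":
    print(memory_metrics())
```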

  17. Optimising a fall out dust monitoring sampling programme at a ...

    African Journals Online (AJOL)

    GREG

    The aim of this study at the specific cement manufacturing plant and open cast mine was ... Key words: Fall out dust monitoring, cement plant, optimising, air pollution sampling, ..... meters as this is in line with the height of a typical fall out dust.

  18. Big, bad and stupid or big, good and smart? : a three-year participant observational field study of the male bodybuilder stereotype and its consequences

    OpenAIRE

    Persson, Roland S

    2004-01-01

    This research aims at exploring the male bodybuilder stereotype by establishing whether there indeed exists a stereotypical response pattern in being confronted with this type of athlete. If so, which is the content of this stereotype? Is there also a cross-cultural fit to such a pattern? The study is socio-biologically oriented and designed mainly as a participant-observation field study in varying Swedish settings over a period of three years. An international sample proper of bodybuilders (N = 2...

  19. Self-optimising control of sewer systems

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Montero-Castro, Ignacio; Mollerup, Ane Loft;

    The design of sewer system control is a complex task given the large size of the sewer networks, the transient dynamics of the water flows and the stochastic nature of rainfall. This contribution presents a generic methodology for the design of a self-optimising controller in sewer systems...... to design an optimising control strategy for a subcatchment area in Copenhagen.

  20. An Optimisation Approach for Room Acoustics Design

    DEFF Research Database (Denmark)

    Holm-Jørgensen, Kristian; Kirkegaard, Poul Henning; Andersen, Lars

    2005-01-01

    This paper discusses on a conceptual level the value of optimisation techniques in architectural acoustics room design from a practical point of view. One objective room acoustics design criterion, estimated from the sound field inside the room, is chosen for optimisation. The sound field is modeled...

  1. Haemodynamic optimisation in lower limb arterial surgery

    DEFF Research Database (Denmark)

    Bisgaard, J; Gilsaa, T; Rønholm, E;

    2012-01-01

    index was optimised by administering 250 ml aliquots of colloid intraoperatively and during the first 6 h post-operatively. Following surgery, fluid optimisation was supplemented with dobutamine, if necessary, targeting an oxygen delivery index level ≥ 600 ml/min/m² in the intervention group...

  2. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  3. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply nonexistent. Large volumes of data allow organizations to tap, in real time, the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  4. "All Flying Insects with Big, Beautiful Wings are Butterflies!" A Study in Challenging This Misconception

    Science.gov (United States)

    Tsoi, Kwok-Ho

    2013-01-01

    This study investigated the level of understanding among student teachers in differentiating lepidopterans. It adopted a constructive approach to promoting conceptual change in students on the issue of animal classification by generating cognitive conflict. Most of the students used inaccurate morphological traits for identification, such as wing…

  5. Beyond Big

    DEFF Research Database (Denmark)

    Smith, Shelley

    2003-01-01

    In the 1990's the focus in the field of architecture shifted from limited and defined works of architecture and planning to areas of vast and undefined space characterised by large scale, dissolving borders and flux. This Ph.D. thesis examines a contemporary situation in which large scale has...... become an architectural parameter challenging the way architectural space has traditionally been regarded and experienced. A multidisciplinary approach employing socio-cultural, architectural, and aesthetic discourses is developed into the theoretically based chapters, Urbanism, Space and Aesthetics......, Airport. An empirical examination of airport space as a relevant case for the study of how enormous scale and flux challenge traditional spatial and perceptual understandings of architecture is undertaken through an alternative historical mapping which traces the airport through 3 metaphorical...

  6. Evolutionary programming for neutron instrument optimisation

    Science.gov (United States)

    Bentley, Phillip M.; Pappas, Catherine; Habicht, Klaus; Lelièvre-Berna, Eddy

    2006-11-01

    Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development, and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN, at the HMI. We discuss the potential of the GA which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.
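
    Coupling an evolutionary search to an external Monte-Carlo instrument simulation amounts to treating the simulation as a black-box fitness function. The sketch below is purely illustrative: the executable name, its argument and output formats, and the parameter bounds are hypothetical placeholders (not the Vitess or McSTAS interfaces), and a simplified truncation-selection loop stands in for the canonical GA operators used in the work above.

```python
import random
import subprocess

BOUNDS = {"coil_current": (0.0, 50.0), "coil_spacing": (0.1, 2.0)}  # hypothetical parameters

def simulate_figure_of_merit(params):
    """Run an external Monte-Carlo simulation and parse a figure of merit.
    The command, its arguments and its output format are hypothetical."""
    args = ["instrument_sim"] + [f"--{k}={v:.4f}" for k, v in params.items()]
    out = subprocess.run(args, capture_output=True, text=True, check=True).stdout
    for line in out.splitlines():
        if line.startswith("figure_of_merit:"):
            return float(line.split(":", 1)[1])
    raise RuntimeError("figure of merit not found in simulation output")

def random_candidate():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def mutate(p, rate=0.3):
    q = dict(p)
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            q[k] = min(max(q[k] + random.gauss(0, 0.1 * (hi - lo)), lo), hi)
    return q

def evolve(generations=20, pop_size=16):
    pop = [random_candidate() for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the better half, refill with mutated copies (truncation selection).
        parents = sorted(pop, key=simulate_figure_of_merit, reverse=True)[: pop_size // 2]
        pop = parents + [mutate(random.choice(parents)) for _ in range(pop_size - len(parents))]
    return max(pop, key=simulate_figure_of_merit)
```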

  7. Evolutionary programming for neutron instrument optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Bentley, Phillip M. [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany)]. E-mail: phillip.bentley@hmi.de; Pappas, Catherine [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Habicht, Klaus [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Lelievre-Berna, Eddy [Institut Laue-Langevin, 6 rue Jules Horowitz, BP 156, 38042 Grenoble Cedex 9 (France)

    2006-11-15

    Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development, and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN, at the HMI. We discuss the potential of the GA which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.

  8. The Person-Event Data Environment (PDE): Leveraging Big Data for Studies of Psychological Strengths in Soldiers

    Directory of Open Access Journals (Sweden)

    Loryana L. Vie

    2013-12-01

    Full Text Available The Department of Defense (DoD) strives to efficiently manage the large volumes of administrative data collected and repurpose this information for research and analyses with policy implications. This need is especially present in the United States Army, which maintains numerous electronic databases with information on more than one million Active-Duty, Reserve, and National Guard soldiers, their family members, and Army civilian employees. The accumulation of vast amounts of digitized health, military service, and demographic data thus approaches, and may even exceed, traditional benchmarks for Big Data. Given the challenges of disseminating sensitive personal and health information, the Person-Event Data Environment (PDE) was created to unify disparate Army and DoD databases in a secure cloud-based enclave. This electronic repository serves the ultimate goal of achieving cost efficiencies in psychological and healthcare studies and provides a platform for collaboration among diverse scientists. This paper provides an overview of the uses of the PDE to perform command surveillance and policy analysis for Army leadership. The paper highlights the confluence of economic and behavioral science perspectives in empirically based studies examining relations between psychological assets, health, and healthcare utilization. Specific examples explore the role of psychological assets in major cost drivers such as medical expenditures both during deployment and stateside, drug use, attrition from basic training, and low reenlistment rates. Through the creation of the PDE, the Army and scientific community can now capitalize on the vast amounts of data from personnel, financial, medical, training and education, deployment and security systems that influence Army-wide policies and procedures.

  9. Studying Plant-Rhizobium Mutualism in the Biology Classroom: Connecting the Big Ideas in Biology through Inquiry

    Science.gov (United States)

    Suwa, Tomomi; Williamson, Brad

    2014-01-01

    We present a guided-inquiry biology lesson, using the plant-rhizobium symbiosis as a model system. This system provides a rich environment for developing connections between the big ideas in biology as outlined in the College Board's new AP Biology Curriculum. Students gain experience with the practice of scientific investigation, from…

  10. Studying Plant-Rhizobium Mutualism in the Biology Classroom: Connecting the Big Ideas in Biology through Inquiry

    Science.gov (United States)

    Suwa, Tomomi; Williamson, Brad

    2014-01-01

    We present a guided-inquiry biology lesson, using the plant-rhizobium symbiosis as a model system. This system provides a rich environment for developing connections between the big ideas in biology as outlined in the College Board's new AP Biology Curriculum. Students gain experience with the practice of scientific investigation, from…

  11. Optimisation of searches for Supersymmetry with the ATLAS detector

    Energy Technology Data Exchange (ETDEWEB)

    Zvolsky, Milan

    2012-01-15

    The ATLAS experiment is one of the four large experiments at the Large Hadron Collider which is specifically designed to search for the Higgs boson and physics beyond the Standard Model. The aim of this thesis is the optimisation of searches for Supersymmetry in decays with two leptons and missing transverse energy in the final state. Two different optimisation studies have been performed for two important analysis aspects: the final signal region selection and the choice of the trigger selection. In the first part of the analysis, a cut-based optimisation of signal regions is performed, maximising the signal for a minimal background contamination. In this way, the signal yield can in some cases be more than doubled. The second approach is to introduce di-lepton triggers, which allow the lepton transverse momentum threshold to be lowered, thus enhancing the number of selected signal events significantly. The signal region optimisation was considered for the choice of the final event selection in the ATLAS di-lepton analyses. The trigger study contributed to the incorporation of di-lepton triggers into the ATLAS trigger menu. (orig.)

  12. Big hearts, small hands: a focus group study exploring parental food portion behaviours.

    Science.gov (United States)

    Curtis, Kristina; Atkins, Louise; Brown, Katherine

    2017-09-18

    The development of healthy food portion sizes among families is deemed critical to childhood weight management; yet little is known about the interacting factors influencing parents' portion control behaviours. This study aimed to use two synergistic theoretical models of behaviour, the COM-B model (Capability, Opportunity, Motivation - Behaviour) and the Theoretical Domains Framework (TDF), to identify a broad spectrum of theoretically derived influences on parents' portion control behaviours, including examination of affective and habitual influences often excluded from prevailing theories of behaviour change. Six focus groups exploring family weight management were conducted: one with caseworkers (n = 4), four with parents of overweight children (n = 14) and one with parents of healthy weight children (n = 8). A thematic analysis was performed across the dataset, with the TDF and COM-B used as coding frameworks. To achieve the target behaviour, the behavioural analysis revealed the need for eliciting change in all three COM-B domains and nine associated TDF domains. Findings suggest parents' internal processes such as their emotional responses, habits and beliefs, along with social influences from partners and grandparents, and environmental influences relating to items such as household objects, interact to influence portion size behaviours within the home environment. This is the first study underpinned by the COM-B/TDF frameworks applied to childhood weight management, and it provides new targets for intervention development and the opportunity for future research to explore the mediating and moderating effects of these variables on one another.

  13. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on identifying patterns in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  14. Study on inventory control model based on the B2C mode in big data environment

    Directory of Open Access Journals (Sweden)

    Zhiping Zhang

    2017-03-01

    Full Text Available The inventory problem has become a key issue in enterprise survival and development. In this paper, we take Taobao as an example to conduct a detailed study of inventory management for high conversion rates based on data mining. First, using a funnel model to predict the conversion of commodities on the critical path, we capture the factors influencing consumer decision-making at each key point and propose corresponding solutions for improving the conversion rate. Second, we use a BP neural network algorithm to predict the goods traffic, obtaining the corresponding weights by relation analysis and producing the predicted traffic as output from a large sample of goods data as input. Third, we predict the inventory in accordance with the commodity conversion rate and the traffic prediction, and amend the predicted results to obtain an accurate and real-time inventory forecast, avoiding the economic loss caused by inaccurate inventory.
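
    The back-propagation (BP) step can be sketched with a one-hidden-layer network trained by plain gradient descent; the code below is a generic illustration rather than the authors' model, and the three input features and the synthetic data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: rows = [page_views, favourites, cart_adds], target = goods traffic.
X = rng.random((200, 3))
y = (0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2]
     + 0.05 * rng.standard_normal(200)).reshape(-1, 1)

n_hidden, lr = 8, 0.1
W1, b1 = rng.normal(0, 0.5, (3, n_hidden)), np.zeros(n_hidden)
W2, b2 = rng.normal(0, 0.5, (n_hidden, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(2000):                      # plain batch back-propagation
    h = sigmoid(X @ W1 + b1)                   # hidden layer activations
    pred = h @ W2 + b2                         # linear output layer
    err = pred - y
    grad_W2 = h.T @ err / len(X)
    grad_b2 = err.mean(axis=0)
    delta_h = (err @ W2.T) * h * (1 - h)       # back-propagate through the sigmoid
    grad_W1 = X.T @ delta_h / len(X)
    grad_b1 = delta_h.mean(axis=0)
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

print("training MSE:", float((err ** 2).mean()))
```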

  15. 6 Top Tools for Taming Big Data

    Institute of Scientific and Technical Information of China (English)

    Jakob Björklund

    2012-01-01

    The industry now has a buzzword, "big data," for how we're going to do something with the huge amount of information piling up. "Big data" is replacing "business intelligence," which subsumed "reporting," which put a nicer gloss on "spreadsheets," which beat out the old-fashioned "printouts." Managers who long ago studied printouts are now hiring mathematicians who claim to be big data specialists to help them solve the same old problem: What's selling and why?

  16. Impact Response Study on Covering Cap of Aircraft Big-Size Integral Fuel Tank

    Science.gov (United States)

    Wang, Fusheng; Jia, Senqing; Wang, Yi; Yue, Zhufeng

    2016-10-01

    In order to assess various design concepts and choose a covering cap design scheme which can meet the requirements of the airworthiness standard and ensure the safety of the fuel tank, the impact of a projectile on the covering cap of an aircraft fuel tank was simulated using the finite element software ANSYS/LS-DYNA. The dynamic characteristics of a simple single covering cap and a gland double-layer covering cap impacted by a titanium alloy projectile and a rubber projectile were studied, as were the effects of impact region, impact angle and impact energy on both configurations. Through the comparison of the critical damage velocity and the number of deleted elements in the covering caps, it is shown that the external covering cap has a good protective effect on the internal covering cap. Regions close to the boundary are vulnerable to impact damage from the titanium alloy projectile, while regions close to the centre are vulnerable to damage from the rubber projectile. The equivalent strain in the covering cap is very small when the impact angle is less than 15°. The number of deleted elements in the covering cap reaches a maximum when the impact angle is between 60° and 65° for the titanium alloy projectile, whereas for the rubber projectile impacting the composite covering cap the damage becomes more serious as the impact angle increases. The energy needed to damage the external covering cap is less than, and that needed to damage the internal covering cap higher than, the energy at which a single covering cap is damaged, and the energy needed for complete breakdown of the double-layer covering cap is much higher than that of the single covering cap.

  17. Study of Coping-competence among Unmarried Pregnant Young Women in Three Big Cities in China

    Institute of Scientific and Technical Information of China (English)

    Wei WEI; Xiao-ming YU

    2009-01-01

    Objective To identify the coping-competence of unmarried pregnant young women. Methods A cross-sectional, clinic-based study was conducted. A total of 1391 unmarried young women were recruited in the youth clinics of three maternal care hospitals in Beijing, Jinan and Guangzhou, China. The "Behavioral Attributes of Psychosocial Competence Scale-Condensed Form" was administered to assess the coping-competence of these women. All were aged 10-24 years and were divided into three groups based on whether or not they had experienced sex and pregnancy: a pregnancy group (women who had experienced both sex and pregnancy), a sex group (women who had experienced sex but not pregnancy), and a non-sex group (women with no sexual experience). Results Among the adolescents aged 10-19 years, coping-competence differed among the three groups (P=0.050). Compared with the pregnancy group, the non-sex group was more inclined to active coping (P=0.026). Among all the pregnant women aged 10-24 years, coping-competence varied by region (P<0.001): the women in Jinan were more inclined to active coping than the women in the other two cities (P=0.009, P<0.001), and there was no difference between the women from Beijing and Guangzhou (P=0.324). Conclusion This is the first study of coping among unmarried pregnant young women in China. The results support the view that pregnant adolescents are more inclined to passive coping, and that coping shows regional differences.

  18. Lost in a random forest: Using Big Data to study rare events

    Directory of Open Access Journals (Sweden)

    Christopher A Bail

    2015-12-01

    Full Text Available Sudden, broad-scale shifts in public opinion about social problems are relatively rare. Until recently, social scientists were forced to conduct post-hoc case studies of such unusual events that ignore the broader universe of possible shifts in public opinion that do not materialize. The vast amount of data that has recently become available via social media sites such as Facebook and Twitter, as well as the mass-digitization of qualitative archives, provides an unprecedented opportunity for scholars to avoid such selection on the dependent variable. Yet the sheer scale of these new data creates a new set of methodological challenges. Conventional linear models, for example, minimize the influence of rare events as "outliers", especially within analyses of large samples. While more advanced regression models exist to analyze outliers, they suffer from an even more daunting challenge: equifinality, or the likelihood that rare events may occur via different causal pathways. I discuss a variety of possible solutions to these problems, including recent advances in fuzzy set theory and machine learning, but ultimately advocate an ecumenical approach that combines multiple techniques in iterative fashion.
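
    One of the machine-learning options alluded to above can be sketched with a class-weighted random forest, so that a roughly 1%-rare positive class is not simply absorbed as noise; the data here are synthetic and the settings illustrative, not the author's analysis.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(1)

# Hypothetical data: 10 000 cases, roughly 1% of them are rare "opinion shift" events.
X = rng.normal(size=(10_000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=10_000) > 2.8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(
    n_estimators=300,
    class_weight="balanced_subsample",   # up-weight the rare class within each bootstrap sample
    random_state=0,
).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```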

  19. Managing Astronomy Research Data: Case Studies of Big and Small Research Projects

    Science.gov (United States)

    Sands, Ashley E.

    2015-01-01

    Astronomy data management refers to all actions taken upon data over the course of the entire research process. It includes activities involving the collection, organization, analysis, release, storage, archiving, preservation, and curation of research data. Astronomers have cultivated data management tools, infrastructures, and local practices to ensure the use and future reuse of their data. However, new sky surveys will soon amass petabytes of data requiring new data management strategies. The goal of this dissertation, to be completed in 2015, is to identify and understand data management practices and the infrastructure and expertise required to support best practices. This will benefit the astronomy community in efforts toward an integrated scholarly communication framework. This dissertation employs qualitative, social science research methods (including interviews, observations, and document analysis) to conduct case studies of data management practices, covering the entire data lifecycle, amongst three populations: Sloan Digital Sky Survey (SDSS) collaboration team members; individual and small-group users of SDSS data; and Large Synoptic Survey Telescope (LSST) collaboration team members. I have been observing the collection, release, and archiving of data by the SDSS collaboration, the data practices of individuals and small groups using SDSS data in journal articles, and the LSST collaboration's planning and building of infrastructure to produce data. Preliminary results demonstrate that current data management practices in astronomy are complex, situational, and heterogeneous. Astronomers often have different management repertoires for working on sky surveys and for their own data collections, varying their data practices as they move between projects. The multitude of practices complicates coordinated efforts to maintain data. While astronomy expertise proves critical to managing astronomy data in the short, medium, and long term, the larger astronomy

  20. Study on the detection of red-tide outbreaks using big satellite database

    Science.gov (United States)

    Son, Young Baek; Eun, Yoon Joo; Park, Kyongseok; Lee, Sanghwan; Lee, Ryong; Kim, Sang-Hyun; Yoo, Sinjae

    2014-11-01

    Satellite remote sensing has been successfully employed to monitor and detect the increasing incidence of harmful algal blooms (HABs) under various water conditions. In this study, to establish a comprehensive monitoring system for HAB outbreaks (particularly Cochlodinium polykrikoides blooms) along the southern coast of Korea (SCK), we tested several proposed red-tide detection methods using SeaWiFS and MODIS ocean color data. Temporal and spatial information on red tide events from 2002 to 2013 was obtained from the National Fisheries Research and Development Institute of Korea (NFRDI) and matched with synchronously obtained satellite-derived ocean color data. The spectral characteristics of C. polykrikoides red tides were that increased phytoplankton absorption at 443 nm and pigment backscattering at 555 nm resulted in a steeper slope between 488 and 555 nm, with a hinge point at 488 (or 490) nm. Non-red-tide waters, on the other hand, typically presented broader radiance spectra between the blue and green bands, associated with reduced pigment absorption and backscattering. The analysis of ocean color imagery that captured C. polykrikoides red tide blooms showed discolored waters with enhanced pigment concentrations, high chlorophyll, fluorescence and absorption at 443 nm. However, most red tide detection algorithms found a large number of false-positive but only a small number of true-positive areas, and are therefore of limited use in distinguishing true red-tide water from optically complex non-red-tide water. Our proposed method substantially reduces the false signal rate (false positives) arising from strong absorption at short wavelengths and provides a more reliable and robust detection of C. polykrikoides blooms in the SCK from space.
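
    A schematic version of the blue-green slope criterion is sketched below. It is one plausible reading of the description above, not the algorithm used in the study: the thresholds are placeholders and the exact form of the hinge test would need to be taken from the original method.

```python
def red_tide_flag(rrs_443, rrs_488, rrs_555, chl,
                  slope_thresh=1e-5, chl_thresh=2.0):
    """Schematic C. polykrikoides flag: a steep 488-555 nm slope, a kink
    ("hinge") at 488 nm and elevated chlorophyll. Thresholds are illustrative only."""
    slope_green = (rrs_555 - rrs_488) / (555.0 - 488.0)   # reflectance slope, sr^-1 nm^-1
    slope_blue = (rrs_488 - rrs_443) / (488.0 - 443.0)
    hinge = slope_green > slope_blue                      # slope steepens at 488 nm
    return (slope_green > slope_thresh) & hinge & (chl > chl_thresh)

# Example with made-up reflectances for a single pixel:
print(red_tide_flag(rrs_443=0.0020, rrs_488=0.0024, rrs_555=0.0036, chl=5.0))
```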

  1. Evaluation of a high throughput starch analysis optimised for wood.

    Directory of Open Access Journals (Sweden)

    Chandra Bellasio

    Full Text Available Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  2. Evaluation of a high throughput starch analysis optimised for wood.

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  3. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  4. Optimisation of Kinematics for Tracked Vehicle Hydro Gas Suspension System

    Directory of Open Access Journals (Sweden)

    S. Sridhar

    2006-11-01

    Full Text Available The modern-day armoured fighting vehicles (AFVs) are basically tracked vehicles equipped with hydro gas suspensions, in lieu of conventional mechanical suspensions like torsion bar and coil spring bogie suspensions. The uniqueness of the hydro gas suspension is that it offers a nonlinear spring rate, which is very much required for the cross-country moveability of a tracked vehicle. The AFVs have to negotiate different cross-country terrains like sandy, rocky, riverbed, etc., and the road irregularities pose innumerable problems for the design of the hydro gas suspension system under dynamic loading. Optimising the various design parameters demands innovative design methodologies to achieve better ride performance. Hence, a comprehensive kinematic analysis is needed. In this study, a methodology has been derived to optimise the kinematics of the suspension by reorienting the cylinder axis and optimising the load-transferring leverage factor so that the side thrust on the cylinder is minimised to a great extent. The optimisation ultimately increases the life of the high-pressure and high-temperature piston seals, resulting in enhanced system life for better dependability.

  5. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  6. Big Data Analytics Methodology in the Financial Industry

    Science.gov (United States)

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  7. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  8. Niobium Cavity Electropolishing Modelling and Optimisation

    CERN Document Server

    Ferreira, L M A; Forel, S; Shirra, J A

    2013-01-01

    It is widely accepted that electropolishing (EP) is the most suitable surface finishing process to achieve high-performance bulk Nb accelerating cavities. At CERN, in preparation for the processing of the 704 MHz high-beta Superconducting Proton Linac (SPL) cavities, a new vertical electropolishing facility has been assembled and a study is ongoing on the modelling of electropolishing of cavities with COMSOL® software. In a first phase, the electrochemical parameters were taken into account for a fixed process temperature and flow rate; these are presented in this poster together with the results obtained on a real SPL single-cell cavity. The procedure used to acquire the data serving as input for the simulation is presented. The modelling procedure adopted to optimise the cathode geometry, aimed at a uniform current density distribution in the cavity cell for the minimum working potential and total current, is explained. Some preliminary results on fluid dynamics are also briefly described.

  9. Optimising costs in WLCG operations

    CERN Document Server

    Pradillo, Mar; Flix, Josep; Forti, Alessandra; Sciabà, Andrea

    2015-01-01

    The Worldwide LHC Computing Grid project (WLCG) provides the computing and storage resources required by the LHC collaborations to store, process and analyse the 50 Petabytes of data annually generated by the LHC. The WLCG operations are coordinated by a distributed team of managers and experts and performed by people at all participating sites and from all the experiments. Several improvements in the WLCG infrastructure have been implemented during the first long LHC shutdown to prepare for the increasing needs of the experiments during Run2 and beyond. However, constraints in funding will affect not only the computing resources but also the available effort for operations. This paper presents the results of a detailed investigation on the allocation of the effort in the different areas of WLCG operations, identifies the most important sources of inefficiency and proposes viable strategies for optimising the operational cost, taking into account the current trends in the evolution of the computing infrastruc...

  10. Optimising Comprehensibility in Interlingual Translation

    DEFF Research Database (Denmark)

    Nisbeth Jensen, Matilde

    2015-01-01

    . It is argued that Plain Language writing is a type of intralingual translation as it involves rewriting or translating a complex monolingual text into comprehensible language. Based on Plain Language literature, a comprehensibility framework is elaborated, which is subsequently exemplified through...... the functional text type of Patient Information Leaflet. Finally, the usefulness of applying the principles of Plain Language and intralingual translation for optimising comprehensibility in interlingual translation is discussed....... information on medication and tax information. Such texts are often written by experts and received by lay people, and, in today’s globalised world, they are often translated as well. In these functional texts, the receiver is not a mere recipient of information, but s/he needs to be able to act upon it...

  11. A comparative study of physicochemical characteristics and functionalities of pinto bean protein isolate (PBPI) against the soybean protein isolate (SPI) after the extraction optimisation.

    Science.gov (United States)

    Tan, Ee-San; Ying-Yuan, Ngoh; Gan, Chee-Yuen

    2014-01-01

    Optimisation of the protein extraction yield from pinto bean was investigated using response surface methodology. The maximum protein yield of 54.8 mg/g was obtained under the optimal conditions of temperature = 25 °C, time = 1 h and buffer-to-sample ratio = 20 ml/g. PBPI was found to contain higher amounts of essential amino acids such as leucine, lysine and phenylalanine compared to SPI. The predominant proteins of PBPI were vicilin and phytohemagglutinins, whereas the predominant proteins of SPI were glycinin and conglycinins. A significantly higher emulsifying capacity was found in PBPI (84.8%) compared to SPI (61.9%). Different isoelectric points were found for PBPI (4.0-5.5) and SPI (4.0-5.0). It was also found that PBPI exhibited a much higher denaturation temperature of 110.2 °C compared to SPI (92.5 °C). Other properties such as structural information, gelling capacity, water- and oil-holding capacities, emulsion stability as well as digestibility are also reported.
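
    The response-surface step can be pictured as fitting a quadratic model to designed-experiment data and searching the fitted surface; the design points and yields below are invented for illustration and are not the measurements behind the study.

```python
import numpy as np

# Hypothetical design data: columns are temperature (deg C), time (h),
# buffer-to-sample ratio (ml/g); y is protein yield (mg/g).
X = np.array([
    [25, 1, 20], [35, 1, 20], [45, 1, 20], [25, 2, 20], [35, 2, 20],
    [45, 2, 20], [25, 3, 20], [35, 3, 20], [45, 3, 20], [25, 1, 10],
    [25, 1, 30], [35, 2, 10], [35, 2, 30], [45, 3, 10], [45, 3, 30]])
y = np.array([54.8, 50.1, 44.0, 52.3, 49.0, 42.5, 50.0, 47.2, 40.2,
              47.9, 53.0, 46.5, 50.5, 38.9, 41.8])

def quad_features(X):
    """Full quadratic response surface: intercept, linear, interaction and squared terms."""
    t, h, r = X.T
    return np.column_stack([np.ones(len(X)), t, h, r, t*h, t*r, h*r, t**2, h**2, r**2])

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

# Evaluate the fitted surface on a coarse grid and report the best setting found.
grid = np.array([[t, h, r] for t in range(20, 46, 5)
                           for h in (1, 2, 3)
                           for r in (10, 20, 30)])
pred = quad_features(grid) @ beta
print("predicted optimum:", grid[pred.argmax()], "predicted yield:", round(float(pred.max()), 1))
```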

  12. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of registration, and IT-competent employees and customers, which make a leading position possible, but only if companies get ready for the next big data wave.

  13. Big Boss Interval Games

    NARCIS (Netherlands)

    Alparslan-Gok, S.Z.; Brânzei, R.; Tijs, S.H.

    2008-01-01

    In this paper big boss interval games are introduced and various characterizations are given. The structure of the core of a big boss interval game is explicitly described and plays an important role relative to interval-type bi-monotonic allocation schemes for such games. Specifically, each element

  14. Big Ideas in Art

    Science.gov (United States)

    Day, Kathleen

    2008-01-01

    In this article, the author shares how she was able to discover some big ideas about art education. She relates how she found great ideas to improve her teaching from the book "Rethinking Curriculum in Art." She also shares how she designed a "Big Idea" unit in her class.

  15. Efficient big data assimilation through sparse representation: A 3D benchmark case study in seismic history matching

    CERN Document Server

    Luo, Xiaodong; Jakobsen, Morten; Nævdal, Geir

    2016-01-01

    In a previous work \\citep{luo2016sparse2d_spej}, the authors proposed an ensemble-based 4D seismic history matching (SHM) framework, which has some relatively new ingredients, in terms of the type of seismic data in choice, the way to handle big seismic data and related data noise estimation, and the use of a recently developed iterative ensemble history matching algorithm. In seismic history matching, it is customary to use inverted seismic attributes, such as acoustic impedance, as the observed data. In doing so, extra uncertainties may arise during the inversion processes. The proposed SHM framework avoids such intermediate inversion processes by adopting amplitude versus angle (AVA) data. In addition, SHM typically involves assimilating a large amount of observed seismic attributes into reservoir models. To handle the big-data problem in SHM, the proposed framework adopts the following wavelet-based sparse representation procedure: First, a discrete wavelet transform is applied to observed seismic attribu...

  16. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  17. Big Bounce Genesis

    CERN Document Server

    Li, Changhong; Cheung, Yeuk-Kwan E

    2014-01-01

    We report on the possibility to use dark matter mass and its interaction cross section as a smoking gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model independent study of dark matter production in the contraction and expansion phases of the bounce universe reveals a new venue for achieving the observed relic abundance in which a significantly smaller amount of dark matter--compared to the standard cosmology--is produced and survives until today, diluted only by the cosmic expansion since the radiation dominated era. Once DM mass and its interaction strength with ordinary matter are determined by experiments, this alternative route becomes a signature of the bounce universe scenario.

  18. Big data are coming to psychiatry: a general introduction.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Bauer, Michael

    2015-12-01

    Big data are coming to the study of bipolar disorder and all of psychiatry. Data are coming from providers and payers (including EMR, imaging, insurance claims and pharmacy data), from omics (genomic, proteomic, and metabolomic data), and from patients and non-providers (data from smart phone and Internet activities, sensors and monitoring tools). Analysis of the big data will provide unprecedented opportunities for exploration, descriptive observation, hypothesis generation, and prediction, and the results of big data studies will be incorporated into clinical practice. Technical challenges remain in the quality, analysis and management of big data. This paper discusses some of the fundamental opportunities and challenges of big data for psychiatry.

  19. Predictive Big Data Analytics: A Study of Parkinson’s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    Science.gov (United States)

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.

    2016-01-01

    Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data–large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources–all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several
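
    A minimal sketch of the cohort-rebalancing and classification steps mentioned above, using scikit-learn; the feature matrix, labels and model choice are placeholders, not the PPMI data or the authors' pipeline. In practice the rebalancing would be done inside each cross-validation fold to avoid leakage.

```python
# Hedged sketch: upsample the minority class of an imbalanced cohort and fit a
# reproducible classifier. X and y are synthetic stand-ins, not PPMI variables.
import numpy as np
from sklearn.utils import resample
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                # hypothetical imaging/clinical features
y = (rng.random(500) < 0.15).astype(int)      # imbalanced labels (~15% cases)

X_min, X_maj = X[y == 1], X[y == 0]
X_min_up = resample(X_min, replace=True, n_samples=len(X_maj), random_state=0)
X_bal = np.vstack([X_maj, X_min_up])          # balanced training matrix
y_bal = np.array([0] * len(X_maj) + [1] * len(X_maj))

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("balanced CV accuracy:", cross_val_score(clf, X_bal, y_bal, cv=5).mean())
```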

  20. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
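
    A highly simplified illustration of the block-wise extraction pattern described above: blocks of a large data source are condensed with a keyed hash seeded by a short uniform string. This is an assumption-laden toy, not the extractor construction of the paper, and it carries none of its provable guarantees.

```python
# Toy seeded extractor: hash fixed-size blocks of a big, imperfect source with a
# short uniform key to produce near-uniform output bytes. Illustrative only.
import hashlib
import os

def extract_bits(stream, block_size=1 << 20, seed=None):
    """Yield 32 condensed bytes per block of raw source data."""
    seed = seed if seed is not None else os.urandom(32)  # short uniform seed
    while True:
        block = stream.read(block_size)
        if not block:
            break
        yield hashlib.blake2b(block, key=seed, digest_size=32).digest()

# Usage: treat any large file (logs, sensor dumps, genomic data) as the source.
# with open("huge_source.bin", "rb") as f:
#     random_bytes = b"".join(extract_bits(f))
```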

  1. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  2. Distributed convex optimisation with event-triggered communication in networked systems

    Science.gov (United States)

    Liu, Jiayun; Chen, Weisheng

    2016-12-01

    This paper studies the distributed convex optimisation problem over directed networks. Motivated by practical considerations, we propose a novel distributed zero-gradient-sum optimisation algorithm with event-triggered communication. Communication and control updates therefore occur only at discrete instants when a predefined condition is satisfied. Thus, compared with time-driven distributed optimisation algorithms, the proposed algorithm has the advantages of lower energy consumption and lower communication cost. Based on Lyapunov approaches, we show that the proposed algorithm makes the system states converge to the solution of the problem exponentially fast, and that Zeno behaviour is excluded. Finally, a simulation example is given to illustrate the effectiveness of the proposed algorithm.
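
    The event-triggered idea can be sketched with a deliberately simplified example: a distributed gradient method on a ring of four nodes in which each node broadcasts its state only when it has drifted sufficiently from its last broadcast value. This is not the zero-gradient-sum algorithm of the paper; the graph, objectives, step-size rule and trigger threshold are all illustrative assumptions.

```python
# Minimal event-triggered distributed gradient sketch. Each node i holds a local
# quadratic objective f_i(x) = 0.5*a_i*(x - c_i)^2; nodes mix the last broadcast
# values of their neighbours and take a local gradient step with a diminishing
# step size, approaching the centralised minimiser up to an error set by the
# trigger threshold.
import numpy as np

a = np.array([1.0, 2.0, 0.5, 1.5])            # local curvatures
c = np.array([4.0, -1.0, 2.0, 0.0])           # local minimisers
P = np.array([[1, 1, 0, 1],                   # Metropolis mixing weights for the
              [1, 1, 1, 0],                   # ring 0-1-2-3-0 (doubly stochastic)
              [0, 1, 1, 1],
              [1, 0, 1, 1]], dtype=float) / 3.0

x = np.zeros(4)                               # node states
x_hat = x.copy()                              # last broadcast states
threshold, events = 0.02, 0

for k in range(5000):
    trigger = np.abs(x - x_hat) > threshold   # event condition, checked locally
    x_hat = np.where(trigger, x, x_hat)       # communicate only when triggered
    events += int(trigger.sum())
    step = 0.5 / (k + 1)                      # diminishing step size
    x = P @ x_hat - step * a * (x - c)        # mix broadcasts, take local gradient step

print("node states:", np.round(x, 3))
print("centralised optimum:", (a * c).sum() / a.sum(), "| broadcast events:", events)
```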

  3. Systematic delay-driven power optimisation and power-driven delay optimisation of combinational circuits

    OpenAIRE

    Mehrotra, Rashmi

    2013-01-01

    With the proliferation of mobile wireless communication and embedded systems, energy efficiency has become a major design constraint. The dissipated energy is often referred to as the product of power dissipation and the input-output delay. Most electronic design automation techniques focus on optimising only one of these parameters, either power or delay. Industry-standard design flows integrate systematic methods of optimising either area or timing, while for power consumption optimisation o...

  4. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  5. Environmental optimisation of waste combustion

    Energy Technology Data Exchange (ETDEWEB)

    Schuster, Robert [AaF Energikonsult, Stockholm (Sweden); Berge, Niclas; Stroemberg, Birgitta [TPS Termiska Processer AB, Nykoeping (Sweden)

    2000-12-01

    The regulations concerning waste combustion evolve through R&D and a striving for better, common regulations across the European countries. This study discusses whether today's rules concerning oxygen concentration, minimum temperature and residence time in the furnace, and the use of stand-by burners are needed, are possible to monitor, are optimal from an environmental point of view, or could be improved. No evidence from well-controlled laboratory experiments validates that 850 deg C at 6% oxygen content is in general the best lower limit. A lower excess air level increases the temperature, which has a significant effect on the destruction of hydrocarbons, favourably increases the residence time, and improves the thermal efficiency and the efficiency of the precipitators. Low oxygen content is also necessary to achieve low NOx emissions. The conclusion is that the demands on the accuracy of the measurement devices and methods are too high if they are to be used inside the furnace to control the combustion process. The big problem, however, is to find representative locations to measure temperature, oxygen content and residence time in the furnace. Another major problem is that today's monitoring of the operating conditions does not ensure good combustion; it can lead to a false sense of security. The reason is that it is very hard to find boilers without stratifications. These stratifications (stream lines) each have a different history of residence time, mixing time, oxygen and combustible gas levels, and temperature when they reach the convection area. The combustion result is the sum of all these different histories. The hydrocarbon emissions are in general not produced at a steady level. Small clouds of unburnt hydrocarbons travel along the stream lines, showing up as peaks on a THC measurement device. High-amplitude peaks tend to contain a higher ratio of heavy hydrocarbons than lower peaks. The good correlation between some easily detected

  6. Benchmarks for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...

  7. Topology Optimisation for Coupled Convection Problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe

    This thesis deals with topology optimisation for coupled convection problems. The aim is to extend and apply topology optimisation to steady-state conjugate heat transfer problems, where the heat conduction equation governs the heat transfer in a solid and is coupled to thermal transport...... in a surrounding fluid, governed by a convection-diffusion equation, where the convective velocity field is found from solving the isothermal incompressible steady-state Navier-Stokes equations. Topology optimisation is also applied to steady-state natural convection problems. The modelling is done using stabilised...... finite elements, the formulation and implementation of which was done partly during a special course as preparatory work for this thesis. The formulation is extended with a Brinkman friction term in order to facilitate the topology optimisation of fluid flow and convective cooling problems. The derived...

  8. Optimisation of the formulation of a bubble bath by a chemometric approach: market segmentation and optimisation.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gennaro, Maria Carla; Bertetto, Mariella

    2003-03-01

    The optimisation of the formulation of a commercial bubble bath was performed by chemometric analysis of Panel Test results. A first Panel Test was performed to choose the best essence among four proposed to the consumers; the essence chosen was used in the revised commercial bubble bath. Afterwards, the effect of changing the amounts of four components of the bubble bath (the primary surfactant, the essence, the hydrating agent and the colouring agent) was studied by a fractional factorial design. The segmentation of the bubble bath market was performed by a second Panel Test, in which the consumers were asked to evaluate the samples coming from the experimental design. The results were then treated by Principal Component Analysis. The market had two segments: people preferring a product with a rich formulation and people preferring a poor product. The final target, i.e. the optimisation of the formulation for each segment, was achieved by calculating regression models relating the subjective evaluations given by the Panel to the compositions of the samples. The regression models made it possible to identify the best formulations for the two segments of the market.
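
    The design-and-regression workflow can be mimicked with a short sketch: a 2^(4-1) fractional factorial design in the four coded formulation factors and an ordinary least-squares model of panel scores. The factor names follow the abstract, but the scores and the choice of generator are illustrative assumptions, not the study's data.

```python
# Hedged sketch: half-fraction 2^(4-1) design (generator D = ABC) in coded -1/+1
# levels for surfactant (A), essence (B), hydrating agent (C) and colouring agent (D),
# followed by a least-squares fit of hypothetical mean panel scores.
import numpy as np

full = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])
design = np.column_stack([full, full[:, 0] * full[:, 1] * full[:, 2]])  # D = ABC

scores = np.array([5.1, 6.3, 5.8, 7.0, 4.9, 6.8, 5.5, 7.4])  # invented panel means

X = np.column_stack([np.ones(len(design)), design])          # intercept + main effects
coef, *_ = np.linalg.lstsq(X, scores, rcond=None)
for name, b in zip(["intercept", "surfactant", "essence", "hydratant", "colouring"], coef):
    print(f"{name:>10}: {b:+.2f}")

best_run = design[np.argmax(X @ coef)]                        # predicted best formulation
print("predicted best coded levels:", best_run)
```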

  9. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = {lambda}/{Delta}{lambda} = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k{sub max} = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k{sub max} = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  10. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte and yottabyte, to express the amount of data. The growth of data creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, velocity and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data is a hot topic in recent years in IT circles. However, Big Data is recognized in the business world, and increasingly in the public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics services-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  11. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are so often mentioned in relation to big data entail? As an introduction to

  12. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    2016-01-01

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are so often mentioned in relation to big data entail? As an introduction to this

  13. User perspectives in public transport timetable optimisation

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    The present paper deals with timetable optimisation from the perspective of minimising the waiting time experienced by passengers when transferring either to or from a bus. Due to its inherent complexity, this bi-level minimisation problem is extremely difficult to solve mathematically, since...... on the large-scale public transport network in Denmark. The timetable optimisation approach yielded a yearly reduction in weighted waiting time equivalent to approximately 45 million Danish kroner (9 million USD)....

  14. Brief Cognitive Behavioural Therapy Compared to Optimised General Practitioners' Care for Depression: A Randomised Trial

    OpenAIRE

    Schene, A.H.; Baas, K.D.; Koeter, M; Lucassen, P.; Bockting, C.L.H.; Wittkampf, K. F.; van Weert, H.C.; Huyser, J.

    2014-01-01

    Background: How to treat Major Depressive Disorder (MDD) in primary care? Studies that compared (brief) Cognitive Behavioural Therapy (CBT) with care as usual by the General Practitioner (GP) found the former to be more effective. However, to make a fair comparison, GP care should be optimised and protocolised according to current evidence-based guidelines for depression. So far this has not been the case. We studied whether a protocolised 8-session CBT is more effective than optimised and prot...

  15. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  16. Inhomogeneous Big Bang Cosmology

    CERN Document Server

    Wagh, S M

    2002-01-01

    In this letter, we outline an inhomogeneous model of the Big Bang cosmology. For the inhomogeneous spacetime used here, the universe originates in the infinite past as one dominated by vacuum energy and ends in the infinite future as one consisting of "hot and relativistic" matter. The spatial distribution of matter in the considered inhomogeneous spacetime is arbitrary. Hence, observed structures can arise in this cosmology from a suitable "initial" density contrast. Different problems of the standard model of Big Bang cosmology are also resolved in the present inhomogeneous model. This inhomogeneous model of the Big Bang cosmology predicts a "hot death" for the universe.

  17. Optimising costs in WLCG operations

    Science.gov (United States)

    Alandes Pradillo, María; Dimou, Maria; Flix, Josep; Forti, Alessandra; Sciabà, Andrea

    2015-12-01

    The Worldwide LHC Computing Grid project (WLCG) provides the computing and storage resources required by the LHC collaborations to store, process and analyse the 50 Petabytes of data annually generated by the LHC. The WLCG operations are coordinated by a distributed team of managers and experts and performed by people at all participating sites and from all the experiments. Several improvements in the WLCG infrastructure have been implemented during the first long LHC shutdown to prepare for the increasing needs of the experiments during Run2 and beyond. However, constraints in funding will affect not only the computing resources but also the available effort for operations. This paper presents the results of a detailed investigation on the allocation of the effort in the different areas of WLCG operations, identifies the most important sources of inefficiency and proposes viable strategies for optimising the operational cost, taking into account the current trends in the evolution of the computing infrastructure and the computing models of the experiments.

  18. Health Informatics Scientists' Perception About Big Data Technology.

    Science.gov (United States)

    Minou, John; Routsis, Fotios; Gallos, Parisis; Mantas, John

    2017-01-01

    The aim of this paper is to present the perceptions of Health Informatics scientists about Big Data technology in healthcare. An empirical study was conducted among 46 scientists to assess their knowledge about Big Data technology and their perceptions about using this technology in healthcare. Based on the study findings, 86.7% of the scientists had knowledge of Big Data technology. Furthermore, 59.1% of the scientists believed that Big Data technology refers to structured data. Additionally, 100% of the population believed that Big Data technology can be implemented in healthcare. Finally, the majority did not know of any use cases of Big Data technology in Greece, while 57.8% of them mentioned that they knew of use cases of Big Data technology abroad.

  19. LSVT-BIG Improves UPDRS III Scores at 4 Weeks in Parkinson's Disease Patients with Wearing Off: A Prospective, Open-Label Study.

    Science.gov (United States)

    Ueno, Tatsuya; Sasaki, Megumi; Nishijima, Haruo; Funamizu, Yukihisa; Kon, Tomoya; Haga, Rie; Arai, Akira; Suzuki, Chieko; Nunomura, Jin-Ichi; Baba, Masayuki; Tomiyama, Masahiko

    2017-01-01

    The efficacy of LSVT-BIG for advanced Parkinson's disease (PD) patients with wearing off remains to be determined. Therefore, we evaluated whether LSVT-BIG improves motor disability in eight PD patients with wearing off. Unified Parkinson's Disease Rating Scale (UPDRS) scores, daily off time, and mobility assessments were evaluated during the "on" time before and after the LSVT-BIG course. LSVT-BIG significantly improved UPDRS III scores at 4 weeks and UPDRS II scores in the "off" state at 12 weeks, with no changes in the other measures. The findings suggest that LSVT-BIG may be an effective therapy for advanced PD patients with wearing off.

  20. Study on Big Data and Higher Education Management

    Institute of Scientific and Technical Information of China (English)

    于亮; 孟宇

    2016-01-01

    The term "Big Data" has been sweeping the globe and attracting unprecedented attention. Although people disagree about what the term really means, the use of the word has to a large extent preceded the understanding of it. The ubiquitous use of "big data" has become a trend and a fashion. This article seeks to invert that fashion: it starts from understanding the term and discusses the opportunities and challenges big data brings to society as a whole and to the higher education system. The discussion unfolds in three sections: in the first section, the author articulates the epochal turn and social transformation rooted in the transition from data to big data, especially the new trends in acquiring and analysing data led by user-generated content in the Internet era; in the second section, the author analyses the opportunities and challenges contemporary society faces in the era of big data; in the last section, the author applies the analysis to the current state of Chinese higher education management and explicates how big data influences the university management environment, the level of management, and management decision-making.

  1. The mediating role of resilience in the relationship between big five personality and anxiety among Chinese medical students: a cross-sectional study.

    Directory of Open Access Journals (Sweden)

    Meng Shi

    Full Text Available The psychological distress of medical students is a major concern of public health worldwide. However, few studies have been conducted to evaluate anxiety symptoms of medical students in China. The purpose of this study was to investigate the anxiety symptoms among Chinese medical students, to examine the relationships between big five personality traits and anxiety symptoms among medical students, and to explore the mediating role of resilience in these relationships. This multicenter cross-sectional study was conducted in June 2014. Self-reported questionnaires consisting of the Zung Self-Rating Anxiety Scale (SAS), Big Five Inventory (BFI), Wagnild and Young Resilience Scale (RS-14) and a demographic section were distributed to the subjects. A stratified random cluster sampling method was used to select 2925 medical students (effective response rate: 83.57%) at four medical colleges and universities in Liaoning province, China. Asymptotic and resampling strategies were used to explore the mediating role of resilience. The prevalence of anxiety symptoms was 47.3% (SAS index score ≥ 50) among Chinese medical students. After adjusting for the demographic factors, the traits of agreeableness, conscientiousness and openness were all negatively associated with anxiety, whereas neuroticism was positively associated with it. Resilience functioned as a mediator in the relationships between agreeableness/conscientiousness/openness and anxiety symptoms. Among Chinese medical students, the prevalence of anxiety symptoms was high and resilience mediated the relationships between big five personality traits and anxiety symptoms. Identifying at-risk individuals and undertaking appropriate intervention strategies that focus on both personality traits and resilience might be more effective to prevent and reduce anxiety symptoms.

  2. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  3. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  4. The big head and the long tail

    DEFF Research Database (Denmark)

    Helles, Rasmus

    2013-01-01

    This paper discusses how the advent of big data challenges established theories in Internet studies to redevelop existing explanatory strategies in order to incorporate the possibilities offered by this new empirical resource. The article suggests that established analytical procedures and theore...

  5. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  6. Big Data Analytics

    Indian Academy of Sciences (India)

    2016-08-01

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 zettabytes (zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse such huge data collections with clusters of thousands of inexpensive computers to discover patterns in the data that have many applications. But analysing massive amounts of data available in the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what is big data, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics.

  7. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects...... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact....

  8. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data......’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD......) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  9. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution...... and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim...... that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data....

  10. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. The information value is not only defined by extracting value from huge data sets as fast and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on processes worldwide.

  11. ANALYTICS OF BIG DATA

    OpenAIRE

    Asst. Prof. Shubhada Talegaon

    2014-01-01

    Big Data analytics has started to impact all types of organizations, as it carries the potential power to extract embedded knowledge from large amounts of data and react to it in real time. The current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, senti...

  12. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance that the term Big Data has acquired, this study sought to examine and analyse exhaustively the state of the art of Big Data; in addition, as a second objective, it analysed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics in the management of Big Data, so that everything concerning the central topic of the research can be known. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; describing Big Data technologies; presenting some of the NoSQL databases, which are the ones that allow data in unstructured formats to be processed; and showing the data models and the technologies for analysing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research is a first step towards getting to know the Big Data environment.

  13. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available The objective of this paper is to assess, in light of the main works of Minsky, his view and analysis of what he called "Big Government" as that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  14. Big data need big theory too

    Science.gov (United States)

    Dougherty, Edward R.; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their ‘depth’ and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote ‘blind’ big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698035

  15. Big data need big theory too

    OpenAIRE

    Coveney, Peter V.; Dougherty, Edward R; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, ma...

  16. Big data need big theory too.

    OpenAIRE

    Coveney, P. V.; Dougherty, E. R.; Highfield, R. R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, ma...

  17. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  18. Study on Clinical Value of Changes of Plasma NT-proBNP and Big ET-1 in Patients with Coronary Heart Disease

    Institute of Scientific and Technical Information of China (English)

    陈剑雄; 彭紫元; 张琼丽; 李延武; 屠洪; 李卓成

    2011-01-01

    Objective: To investigate the changes in plasma levels of N-terminal pro-brain natriuretic peptide (NT-proBNP) and big endothelin-1 (Big ET-1) in patients with coronary heart disease (CHD) and their relationship with cardiac function. Methods: Plasma NT-proBNP and Big ET-1 were measured by double-antibody sandwich enzyme-linked immunosorbent assay in 136 CHD patients and 55 healthy controls. Left ventricular end-diastolic diameter (LVEDD), left ventricular end-systolic diameter (LVESD) and left ventricular ejection fraction (LVEF) were determined in the CHD patients by colour Doppler echocardiography, and their correlations with plasma NT-proBNP and Big ET-1 levels were analysed. Results: Plasma NT-proBNP and Big ET-1 levels were markedly higher in CHD patients than in healthy controls (P<0.01), and both differed significantly between patients with different grades of cardiac function (P<0.01). In CHD patients, plasma NT-proBNP correlated well with LVEF, LVESD and LVEDD (r = -0.63, +0.57 and +0.61, respectively, all P<0.01), and plasma Big ET-1 was also closely related to the changes in LVEF, LVESD and LVEDD (r = -0.51, +0.46 and +0.49, respectively, all P<0.01); plasma NT-proBNP and Big ET-1 levels were positively correlated (r = +0.47, P<0.01). Conclusion: NT-proBNP and Big ET-1 may be involved in the pathophysiological processes of CHD and impaired cardiac function; the elevation of plasma NT-proBNP is positively correlated with changes in Big ET-1.

  19. Heavy carbon travertine related to methane generation: A case study of the Big Tarkhan cold spring, Kerch Peninsula, Crimea

    Science.gov (United States)

    Kokh, Svetlana N.; Shnyukov, Yevgeny F.; Sokol, Ella V.; Novikova, Sofya A.; Kozmenko, Olga A.; Semenova, Dina V.; Rybak, Elena N.

    2015-07-01

    The Big Tarkhan cold spring is located within the Bulganak zone of mud volcanism in the northern Kerch Peninsula (Crimea). The spring waters mainly have Cl-HCO3-Na-Ca chemistry and temperatures from 18 to 23 °C. Active travertine deposition at the site produces abundant calcite and minor amounts of siderite, halite, tincalconite, trona, gaylussite, northupite and amorphous iron hydroxides. Calcite contains impurities of 0.26-2.16 wt.% MgO, up to 0.87 wt.% FeO, 0.15-0.73 wt.% SrO, 0.28-0.98 wt.% BaO, up to 0.43 wt.% MnO, and 0.09-0.60 wt.% Na2O. Active travertines are depleted in REE (ΣREE = 2.6-4.8 ppm) compared to the inactive ones. The Kerch travertine calcite shows unusual carbon and oxygen isotope compositions (δ13C from +8.1 to +12.5‰ VPDB and δ18O from +10.1 to +12.9‰ VPDB). Their isotopic and trace element signatures (including REE patterns) suggest their relation to basinal waters and an origin from the organic-rich clayey Maikop Fm., which is the principal source rock of the area. The Big Tarkhan travertine deposits are associated with thermogenic methane production.

  20. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

    Full Text Available Nowadays companies are starting to realize the importance of using more data in order to support decisions for their strategies. It has been said and proved through case studies that "More data usually beats better algorithms". With this statement, companies started to realize that they can choose to invest more in processing larger sets of data rather than investing in expensive algorithms. A large quantity of data is better used as a whole because of the possible correlations across a larger amount, correlations that can never be found if the data is analyzed in separate or smaller sets. A larger amount of data gives a better output, but working with it can also become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.

  1. Global Fluctuation Spectra in Big Crunch/Big Bang String Vacua

    CERN Document Server

    Craps, B; Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study Big Crunch/Big Bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a Big Crunch and a Big Bang cosmology, as well as additional "whisker" asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the Big Crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function $\Delta$, which is momentum and time-dependent. We compute $\Delta$ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to "entanglement" entropy in the Big Bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that $\Delta \to 1$ and, hence, the fluctuation spectrum is unaltered by the Big Crunch/Big Bang singularity. We comment on, but do not attempt to resolve, su...

  2. Modelling and Optimising TinyTP over IrDA Stacks

    Directory of Open Access Journals (Sweden)

    Boucouvalas A. C.

    2005-01-01

    Full Text Available TinyTP is the IrDA transport layer protocol for indoor infrared communications. For the first time, this paper presents a mathematical model for TinyTP over the IrDA protocol stacks taking into account the presence of bit errors. Based on this model, we carry out a comprehensive optimisation study to improve system performance at the transport layer. Four major parameters are optimised for maximum throughput including TinyTP receiver window, IrLAP window and frame size, as well as IrLAP turnaround time. Equations are derived for the optimum IrLAP window and frame sizes. Numerical results show that the system throughput is significantly improved by implementing the optimised parameters. The major contribution of this work is the modelling of TinyTP including the low-layer protocols and optimisation of the overall throughput by appropriate parameter selection.
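
    A much-simplified throughput model shows the kind of frame-size trade-off this optimisation targets: longer frames amortise per-frame overhead but are more likely to be hit by a bit error. This is not the paper's TinyTP/IrLAP model (it ignores windows and turnaround time), and the line rate, overhead and BER values are assumptions.

```python
# Simplified sketch: sweep the frame payload size for a given bit error rate and
# pick the size that maximises effective throughput when a single bit error loses
# the whole frame. All numeric values are illustrative assumptions.
import numpy as np

line_rate = 4e6            # assumed 4 Mbit/s infrared link
ber = 1e-5                 # assumed bit error rate
overhead_bits = 8 * 8      # assumed 8 bytes of per-frame header/CRC overhead

payload_bits = np.arange(64, 16384, 64) * 8
frame_ok = (1.0 - ber) ** (payload_bits + overhead_bits)   # P(frame received intact)
throughput = line_rate * frame_ok * payload_bits / (payload_bits + overhead_bits)

best = np.argmax(throughput)
print(f"optimum payload ≈ {payload_bits[best] // 8} bytes, "
      f"throughput ≈ {throughput[best] / 1e6:.2f} Mbit/s")
```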

  3. The use of a genetic algorithm in optical thin film design and optimisation

    Directory of Open Access Journals (Sweden)

    Efrem K. Ejigu

    2010-07-01

    Full Text Available We used a genetic algorithm in the design and optimisation of optical thin films, and in this paper we present the effects of the choice of variable, refractive index or optical thickness, in both applications of the algorithm. The Fourier transform optical thin film design method was used to create a starting population, which was later optimised by the genetic algorithm. In the genetic algorithm design application, the effect of the choice of variable was not distinct, as it depended on the type of design specification. In the genetic algorithm optimisation application, the choice of refractive index as the variable performed better than that of optical thickness. The results of this study indicate that a genetic algorithm is more effective in the design application than in the optimisation application of optical thin film synthesis.
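
    A bare-bones genetic algorithm over layer optical thicknesses gives a feel for the optimisation loop described above. The merit function here is a deliberately crude placeholder (distance from a quarter-wave thickness at 550 nm); a real design would evaluate the stack's spectral response, and this sketch is not the Fourier-transform-seeded method of the paper.

```python
# Toy genetic algorithm with layer optical thickness as the design variable.
# Selection, crossover and mutation are standard; the objective is a placeholder.
import numpy as np

rng = np.random.default_rng(1)
n_layers, pop_size, generations = 10, 40, 200
target = 550e-9 / 4                                      # quarter-wave at 550 nm

def merit(thicknesses):
    return np.sum((thicknesses - target) ** 2, axis=-1)  # placeholder objective

pop = rng.uniform(50e-9, 300e-9, size=(pop_size, n_layers))
for _ in range(generations):
    fitness = merit(pop)
    parents = pop[np.argsort(fitness)[: pop_size // 2]]          # truncation selection
    cut = rng.integers(1, n_layers, size=pop_size // 2)          # one-point crossover
    children = np.array([np.concatenate([parents[i % len(parents)][:k],
                                         parents[(i + 1) % len(parents)][k:]])
                         for i, k in enumerate(cut)])
    children += rng.normal(0, 5e-9, children.shape)              # Gaussian mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin(merit(pop))]
print("best layer optical thicknesses (nm):", np.round(best * 1e9, 1))
```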

  4. Optimising the Target and Capture Sections of the Neutrino Factory

    CERN Document Server

    Hansen, Ole Martin; Stapnes, Steinar

    The Neutrino Factory is designed to produce an intense high energy neutrino beam from stored muons. The majority of the muons are obtained from the decay of pions, produced by a proton beam impinging on a free-flowing mercury-jet target and captured by a high magnetic field. It is important to capture a large fraction of the produced pions to maximize the intensity of the neutrino beam. Various optimisation studies have been performed with the aim of maximising the muon influx to the accelerator and thus the neutrino beam intensity. The optimisation studies were performed with the use of Monte Carlo simulation tools. The production of secondary particles, by interactions between the incoming proton beam and the mercury target, was optimised by varying the proton beam impact position and impact angles on the target. The proton beam and target interaction region was studied and showed to be off the central axis of the capture section in the baseline configuration. The off-centred interaction region resulted in ...

  5. Optimisation of VSC-HVDC Transmission for Wind Power Plants

    DEFF Research Database (Denmark)

    Silva, Rodrigo Da

    Connection of Wind Power Plants (WPP), typically offshore, using VSC-HVDC transmission is an emerging solution with many benefits compared to the traditional AC solution, especially concerning the impact on the control architecture of the wind farms and the grid. The VSC-HVDC solution is likely to meet...... more stringent grid codes than a conventional AC transmission connection. The purpose of this project is to analyse how an HVDC solution, based on voltage-source converter technology, for grid connection of large wind power plants can be designed and optimised. By optimisation, the project...... the requirements established by the operators in the multiterminal VSC-HVDC transmission system. Moreover, the possibility of minimising the overall transmission losses can be a solution for small grids, and the minimisation of the dispatch error is a new solution for power delivery maximisation. The second study...

  6. Biomass supply chain optimisation for Organosolv-based biorefineries.

    Science.gov (United States)

    Giarola, Sara; Patel, Mayank; Shah, Nilay

    2014-05-01

    This work aims at providing a Mixed Integer Linear Programming modelling framework to help define planning strategies for the development of sustainable biorefineries. The up-scaling of an Organosolv biorefinery was addressed via optimisation of the whole system economics. Three real world case studies were addressed to show the high-level flexibility and wide applicability of the tool to model different biomass typologies (i.e. forest fellings, cereal residues and energy crops) and supply strategies. Model outcomes have revealed how supply chain optimisation techniques could help shed light on the development of sustainable biorefineries. Feedstock quality, quantity, temporal and geographical availability are crucial to determine biorefinery location and the cost-efficient way to supply the feedstock to the plant. Storage costs are relevant for biorefineries based on cereal stubble, while wood supply chains present dominant pretreatment operations costs.
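
    A toy mixed-integer model conveys the flavour of the framework described above: choose one of two candidate biorefinery sites and the biomass shipments that meet its demand at minimum fixed-plus-transport cost. The numbers, site and region names are invented, the real framework additionally handles multiple feedstocks, time periods and storage, and the sketch relies on the PuLP package.

```python
# Toy biomass supply-chain MILP (illustrative only): site selection plus shipments.
import pulp

regions, sites = ["R1", "R2", "R3"], ["S1", "S2"]
supply = {"R1": 120, "R2": 80, "R3": 60}                 # kt/year available per region
demand = 150                                             # kt/year required by the plant
open_cost = {"S1": 900, "S2": 700}                       # fixed opening cost (cost units)
ship_cost = {("R1", "S1"): 4, ("R1", "S2"): 7, ("R2", "S1"): 5,
             ("R2", "S2"): 3, ("R3", "S1"): 6, ("R3", "S2"): 2}

prob = pulp.LpProblem("biorefinery_siting", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", sites, cat="Binary")
x = pulp.LpVariable.dicts("ship", list(ship_cost), lowBound=0)

prob += pulp.lpSum(open_cost[s] * y[s] for s in sites) + \
        pulp.lpSum(ship_cost[r, s] * x[r, s] for (r, s) in ship_cost)
prob += pulp.lpSum(y[s] for s in sites) == 1                        # open exactly one site
for r in regions:
    prob += pulp.lpSum(x[r, s] for s in sites) <= supply[r]         # regional supply limits
for s in sites:
    prob += pulp.lpSum(x[r, s] for r in regions) >= demand * y[s]   # meet demand if opened
    for r in regions:
        prob += x[r, s] <= supply[r] * y[s]                         # ship only to an open site

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("chosen site:", [s for s in sites if y[s].value() == 1])
```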

  7. Big data processing with Hadoop

    OpenAIRE

    Wu, Shiqi

    2015-01-01

    Computing technology has changed the way we work, study, and live. The distributed data processing technology is one of the popular topics in the IT field. It provides a simple and centralized computing platform by reducing the cost of the hardware. The characteristics of distributed data processing technology have changed the whole industry. Hadoop, as the open source project of Apache foundation, is the most representative platform of distributed big data processing. The Hadoop distribu...

  8. Alginate microencapsulated hepatocytes optimised for transplantation in acute liver failure.

    Directory of Open Access Journals (Sweden)

    Suttiruk Jitraruch

    Full Text Available BACKGROUND AND AIM: Intraperitoneal transplantation of alginate-microencapsulated human hepatocytes is an attractive option for the management of acute liver failure (ALF providing short-term support to allow native liver regeneration. The main aim of this study was to establish an optimised protocol for production of alginate-encapsulated human hepatocytes and evaluate their suitability for clinical use. METHODS: Human hepatocyte microbeads (HMBs were prepared using sterile GMP grade materials. We determined physical stability, cell viability, and hepatocyte metabolic function of HMBs using different polymerisation times and cell densities. The immune activation of peripheral blood mononuclear cells (PBMCs after co-culture with HMBs was studied. Rats with ALF induced by galactosamine were transplanted intraperitoneally with rat hepatocyte microbeads (RMBs produced using a similar optimised protocol. Survival rate and biochemical profiles were determined. Retrieved microbeads were evaluated for morphology and functionality. RESULTS: The optimised HMBs were of uniform size (583.5±3.3 µm and mechanically stable using 15 min polymerisation time compared to 10 min and 20 min (p<0.001. 3D confocal microscopy images demonstrated that hepatocytes with similar cell viability were evenly distributed within HMBs. Cell density of 3.5×10(6 cells/ml provided the highest viability. HMBs incubated in human ascitic fluid showed better cell viability and function than controls. There was no significant activation of PBMCs co-cultured with empty or hepatocyte microbeads, compared to PBMCs alone. Intraperitoneal transplantation of RMBs was safe and significantly improved the severity of liver damage compared to control groups (empty microbeads and medium alone; p<0.01. Retrieved RMBs were intact and free of immune cell adherence and contained viable hepatocytes with preserved function. CONCLUSION: An optimised protocol to produce GMP grade alginate

  9. Optimisation of brain SPET and portability of normal databases

    Energy Technology Data Exchange (ETDEWEB)

    Barnden, Leighton R.; Behin-Ain, Setayesh; Goble, Elizabeth A. [The Queen Elizabeth Hospital, Adelaide (Australia); Hatton, Rochelle L.; Hutton, Brian F. [Westmead Hospital, Sydney (Australia)

    2004-03-01

    Use of a normal database in quantitative regional analysis of brain single-photon emission tomography (SPET) facilitates the detection of functional defects in individual or group studies by accounting for inter-subject variability. Different reconstruction methods and suboptimal attenuation and scatter correction methods can introduce additional variance that will adversely affect such analysis. Similarly, processing differences across different instruments and/or institutions may invalidate the use of external normal databases. The object of this study was to minimise additional variance by comparing reconstructions of a physical phantom with its numerical template so as to optimise processing parameters. Age- and gender-matched normal scans acquired on two different systems were compared using SPM99 after processing with both standard and optimised parameters. For three SPET systems we have optimised parameters for attenuation correction, lower window scatter subtraction, reconstructed pixel size and fanbeam focal length for both filtered back-projection (FBP) and iterative (OSEM) reconstruction. Both attenuation and scatter correction improved accuracy for all systems. For single-iteration Chang attenuation correction the optimum attenuation coefficient (mu) was 0.45-0.85 of the narrow beam value (Nmu) before, and 0.75-0.85 Nmu after, scatter subtraction. For accurately modelled OSEM attenuation correction, optimum mu was 0.6-0.9 Nmu before and 0.9-1.1 Nmu after scatter subtraction. FBP appeared to change in-plane voxel dimensions by about 2% and this was confirmed by line phantom measurements. Improvement in accuracy with scatter subtraction was most marked for the highest spatial resolution system. Optimised processing reduced but did not remove highly significant regional differences between normal databases acquired on two different SPET systems. (orig.)
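
    The parameter search described here is essentially a sweep over candidate attenuation coefficients, scoring each reconstruction against the numerical template. A toy version of that loop is sketched below; reconstruct() is a deliberately crude stand-in for the real FBP/OSEM chain, and all numbers are illustrative rather than taken from the study.

        # Grid search for the attenuation coefficient that best matches a template.
        import numpy as np

        def reconstruct(projections, mu, path_cm=5.0):
            # Toy stand-in for the real FBP/OSEM pipeline: a single Chang-like
            # multiplicative attenuation correction.  Replace with the clinical
            # reconstruction software in practice.
            return projections * np.exp(mu * path_cm)

        def optimise_mu(projections, template, mu_narrow_beam, fractions):
            best_mu, best_err = None, np.inf
            for frac in fractions:
                mu = frac * mu_narrow_beam
                err = np.sqrt(np.mean((reconstruct(projections, mu) - template) ** 2))
                if err < best_err:
                    best_mu, best_err = mu, err
            return best_mu, best_err

        # Synthetic demonstration: the "true" effective mu is 0.8 of the narrow-beam value.
        rng = np.random.default_rng(0)
        proj = rng.uniform(0.5, 1.0, size=(64, 64))
        template = reconstruct(proj, 0.8 * 0.15) + rng.normal(0, 0.01, (64, 64))
        print(optimise_mu(proj, template, 0.15, np.arange(0.4, 1.21, 0.05)))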

  10. Optimisation Modelling of Efficiency of Enterprise Restructuring

    Directory of Open Access Journals (Sweden)

    Yefimova Hanna V.

    2014-03-01

    Full Text Available The article considers optimisation of the use of resources directed at the restructuring of a shipbuilding enterprise, which is the main prerequisite of its efficiency. Restructuring is treated as a process of complex and interconnected change in the structure of assets, liabilities and enterprise functions, initiated by a dynamic environment, based on the strategic concept of the enterprise's development and aimed at increasing the efficiency of its activity, which is expressed in the growth of its value. The decision to restructure a shipbuilding enterprise and the selection of a specific restructuring project belong to the optimisation tasks of prospective planning. The enterprise resources allocated for restructuring serve as the constraints of the mathematical model. The main optimisation criteria are maximisation of net discounted income or minimisation of expenditure on restructuring measures. The optimisation model is designed to assess the volumes of own and borrowed funds to be attracted for restructuring, while a simulation model generates the corresponding cash flows. The solution is achieved with a complex of interrelated optimisation and simulation models and procedures for the formation, selection and co-ordination of managerial decisions.
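
    A heavily simplified, linear version of the allocation problem the article describes might look as follows: choose how much own and borrowed money to spend on each restructuring measure so that discounted income net of borrowing cost is maximised. All coefficients are invented, and scipy's linprog is used purely for illustration; the article's own model also involves simulation of the cash flows, which is not reproduced here.

        # Allocate own and borrowed funds across restructuring measures so that
        # discounted income is maximised (hypothetical figures throughout).
        import numpy as np
        from scipy.optimize import linprog

        npv = np.array([1.8, 1.3, 1.1])      # discounted income per unit spent on measure i
        caps = [30.0, 20.0, 25.0]            # maximum sensible spend per measure
        own_funds, credit_limit = 40.0, 25.0 # available financing
        interest = 0.15                      # discounted cost per unit of borrowed funds

        # x = [spend_1, spend_2, spend_3, borrowed]; maximise npv@spend - interest*borrowed.
        c = np.concatenate([-npv, [interest]])          # linprog minimises, so negate income
        A_ub = [[1.0, 1.0, 1.0, -1.0]]                  # total spend <= own + borrowed funds
        b_ub = [own_funds]
        bounds = [(0, caps[0]), (0, caps[1]), (0, caps[2]), (0, credit_limit)]

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        print(res.x, -res.fun)               # optimal allocation and resulting discounted income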

  11. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research.

  12. Big Data in Transport Geography

    DEFF Research Database (Denmark)

    Reinau, Kristian Hegner; Agerholm, Niels; Lahrmann, Harry Spaabæk

    The emergence of new tracking technologies and Big Data has caused a transformation of the transport geography field in recent years. One new datatype, which is starting to play a significant role in public transport, is smart card data. Despite the growing focus on smart card data, there is a need...... for studies that explicitly compare the quality of this new type of data to traditional data sources. With the current focus on Big Data in the transport field, public transport planners are increasingly looking towards smart card data to analyze and optimize flows of passengers. However, in many cases...... it is not all public transport passengers in a city, region or country with a smart card system that uses the system, and in such cases, it is important to know what biases smart card data has in relation to giving a complete view upon passenger flows. This paper therefore analyses the quality and biases...

  13. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Full Text Available Farming is undergoing a digital revolution. Our existing review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. “app”? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies and it allows for a focus on the material consequences of big data in society.

  14. Techno-economic optimisation of energy systems; Contribution a l'optimisation technico-economique de systemes energetiques

    Energy Technology Data Exchange (ETDEWEB)

    Mansilla Pellen, Ch

    2006-07-15

    The traditional approach currently used to assess the economic interest of energy systems is based on a defined flow-sheet. Some studies have shown that the flow-sheets corresponding to the best thermodynamic efficiencies do not necessarily lead to the best production costs. A method called techno-economic optimisation was proposed. This method aims at minimising the production cost of a given energy system, including both investment and operating costs. It was implemented using genetic algorithms. This approach was compared to the heat integration method on two different examples, thus validating its interest. Techno-economic optimisation was then applied to different energy systems dealing with hydrogen as well as electricity production. (author)
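
    The report's central idea, minimising a production cost that combines investment and operating terms with a genetic algorithm, can be illustrated with a self-contained real-coded GA. The cost model below (two design variables, a heat-exchange area and an operating temperature) is entirely made up; only the overall optimisation pattern reflects the approach described.

        # Minimal real-coded genetic algorithm minimising a made-up production-cost
        # model cost(x) = annualised investment + operating cost.  Purely illustrative.
        import numpy as np

        rng = np.random.default_rng(1)
        LOW, HIGH = np.array([10.0, 300.0]), np.array([200.0, 900.0])

        def production_cost(x):
            area, temp = x
            investment = 5_000 + 120 * area                 # capital, annualised
            operating = 2.0e6 / area + 0.8 * (temp - 300)   # energy and utilities
            return investment + operating

        def ga(pop_size=40, generations=120, mut_sigma=0.1):
            pop = rng.uniform(LOW, HIGH, size=(pop_size, 2))
            for _ in range(generations):
                fitness = np.apply_along_axis(production_cost, 1, pop)
                # Tournament selection of parents.
                idx = rng.integers(0, pop_size, size=(pop_size, 2))
                parents = pop[np.where(fitness[idx[:, 0]] < fitness[idx[:, 1]],
                                       idx[:, 0], idx[:, 1])]
                # Arithmetic crossover between consecutive parents.
                alpha = rng.random((pop_size, 1))
                children = alpha * parents + (1 - alpha) * np.roll(parents, 1, axis=0)
                # Gaussian mutation, then clip back into bounds.
                children += rng.normal(0, mut_sigma, children.shape) * (HIGH - LOW)
                pop = np.clip(children, LOW, HIGH)
            best = pop[np.argmin(np.apply_along_axis(production_cost, 1, pop))]
            return best, production_cost(best)

        print(ga())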

  15. Advanced treatment planning using direct 4D optimisation for pencil-beam scanned particle therapy

    Science.gov (United States)

    Bernatowicz, Kinga; Zhang, Ye; Perrin, Rosalind; Weber, Damien C.; Lomax, Antony J.

    2017-08-01

    We report on development of a new four-dimensional (4D) optimisation approach for scanned proton beams, which incorporates both irregular motion patterns and the delivery dynamics of the treatment machine into the plan optimiser. Furthermore, we assess the effectiveness of this technique to reduce dose to critical structures in proximity to moving targets, while maintaining effective target dose homogeneity and coverage. The proposed approach has been tested using both a simulated phantom and a clinical liver cancer case, and allows for realistic 4D calculations and optimisation using irregular breathing patterns extracted from e.g. 4DCT-MRI (4D computed tomography-magnetic resonance imaging). 4D dose distributions resulting from our 4D optimisation can achieve almost the same quality as static plans, independent of the studied geometry/anatomy or selected motion (regular and irregular). Additionally, current implementation of the 4D optimisation approach requires less than 3 min to find the solution for a single field planned on 4DCT of a liver cancer patient. Although 4D optimisation allows for realistic calculations using irregular breathing patterns, it is very sensitive to variations from the planned motion. Based on a sensitivity analysis, target dose homogeneity comparable to static plans (D5-D95  <5%) has been found only for differences in amplitude of up to 1 mm, for changes in respiratory phase  <200 ms and for changes in the breathing period of  <20 ms in comparison to the motions used during optimisation. As such, methods to robustly deliver 4D optimised plans employing 4D intensity-modulated delivery are discussed.

  16. Using Lean principles to optimise inpatient phlebotomy services.

    Science.gov (United States)

    Le, Rachel D; Melanson, Stacy E F; Santos, Katherine S; Paredes, Jose D; Baum, Jonathan M; Goonan, Ellen M; Torrence-Hill, Joi N; Gustafson, Michael L; Tanasijevic, Milenko J

    2014-08-01

    In the USA, inpatient phlebotomy services are under constant operational pressure to optimise workflow, improve timeliness of blood draws, and decrease error in the context of increasing patient volume and complexity of work. To date, the principles of Lean continuous process improvement have been rarely applied to inpatient phlebotomy. To optimise supply replenishment and cart standardisation, communication and workload management, blood draw process standardisation, and rounding schedules and assignments using Lean principles in inpatient phlebotomy services. We conducted four Lean process improvement events and implemented a number of interventions in inpatient phlebotomy over a 9-month period. We then assessed their impact using three primary metrics: (1) percentage of phlebotomists drawing their first patient by 05:30 for 05:00 rounds, (2) percentage of phlebotomists completing 08:00 rounds by 09:30, and (3) number of errors per 1000 draws. We saw marked increases in the percentage of phlebotomists drawing their first patient by 05:30, and the percentage of phlebotomists completing rounds by 09:30, after process improvement. A decrease in the number of errors per 1000 draws was also observed. This study illustrates how continuous process improvement through Lean can optimise workflow, improve timeliness, and decrease error in inpatient phlebotomy. We believe this manuscript adds to the field of clinical pathology as it can be used as a guide for other laboratories with similar goals of optimising workflow, improving timeliness, and decreasing error, providing examples of interventions and metrics that can be tailored to specific laboratories with particular services and resources.
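
    The three metrics themselves are straightforward to compute once draw events are logged. The pandas sketch below assumes a hypothetical log layout (one row per phlebotomist per round, with first-draw time, round-end time, draw and error counts); the column names and figures are invented.

        # Compute the three phlebotomy metrics from a hypothetical draw log.
        import pandas as pd

        log = pd.DataFrame({
            "phlebotomist": ["a", "b", "c", "a", "b", "c"],
            "round":        ["05:00"] * 3 + ["08:00"] * 3,
            "first_draw":   ["05:20", "05:40", "05:25", "08:05", "08:10", "08:02"],
            "round_end":    ["06:30", "06:50", "06:40", "09:20", "09:45", "09:25"],
            "n_draws":      [18, 22, 20, 25, 27, 24],
            "n_errors":     [0, 1, 0, 0, 0, 1],
        })
        # Times parsed with a bare %H:%M format default to the date 1900-01-01.
        log["first_draw"] = pd.to_datetime(log["first_draw"], format="%H:%M")
        log["round_end"] = pd.to_datetime(log["round_end"], format="%H:%M")

        early = log["round"] == "05:00"
        pct_first_by_0530 = (log.loc[early, "first_draw"]
                             <= pd.Timestamp("1900-01-01 05:30")).mean() * 100
        pct_done_by_0930 = (log.loc[~early, "round_end"]
                            <= pd.Timestamp("1900-01-01 09:30")).mean() * 100
        errors_per_1000 = log["n_errors"].sum() / log["n_draws"].sum() * 1000
        print(pct_first_by_0530, pct_done_by_0930, errors_per_1000)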

  17. Big data and analytical visualisation tools: interactions and inductive studies for public administrations

    Directory of Open Access Journals (Sweden)

    Giuseppe Roccasalva

    2012-12-01

    Full Text Available The essay presents some results of a collaboration between the Politecnico di Torino and CSI Piemonte (an IT services company partly owned by the Piedmont Region). Several scientific data visualisation tools (Gapminder, ManyEyes, Open eXplorer and Fineo) were selected and studied in order to identify the one most useful for an inductive reading of large amounts of information (big data). The intelligent exploitation of digital data can lead to growth in knowledge but also to profit, whose exploitation thresholds can be measured within an economic system. Given the irreversible growth of digital data, the discipline of Data Visualization becomes crucial for accessing and understanding complex information. Few, a guru of visual communication, writes that "we discover the world through our eyes"; traditional forms of data communication and interpretation have relied on the visual dimension to improve understanding and have allowed both analysts and users to experiment with new interactions ("story-telling"). As urban planners and citizens, we rely on sight, which drives many of the sensors (70%) involved in perception, cognitive maps, errors and new ideas. The underlying hypothesis of this article is to generate reflections on Big Data as an important strategy for public and private enterprises that intend to learn to change from the digital information available today. Using an analytical data visualisation tool, a recent case study is described in a territorial context, that of the new administrative consortia (Unione dei Comuni NordEst Torino). In this experiment, the need to plan choices systematically becomes topical again, partly by trying to use the already available territorial information systems in a new and simple way.

  18. Big Data: Survey, Technologies, Opportunities, and Challenges

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  19. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  20. Parameter Screening and Optimisation for ILP Using Designed Experiments

    Science.gov (United States)

    Srinivasan, Ashwin; Ramakrishnan, Ganesh

    Reports of experiments conducted with an Inductive Logic Programming system rarely describe how specific values of parameters of the system are arrived at when constructing models. Usually, no attempt is made to identify sensitive parameters, and those that are used are often given "factory-supplied" default values, or values obtained from some non-systematic exploratory analysis. The immediate consequence of this is, of course, that it is not clear if better models could have been obtained if some form of parameter selection and optimisation had been performed. Questions follow inevitably on the experiments themselves: specifically, are all algorithms being treated fairly, and is the exploratory phase sufficiently well-defined to allow the experiments to be replicated? In this paper, we investigate the use of parameter selection and optimisation techniques grouped under the study of experimental design. Screening and "response surface" methods determine, in turn, sensitive parameters and good values for these parameters. This combined use of parameter selection and response surface-driven optimisation has a long history of application in industrial engineering, and its role in ILP is investigated using two well-known benchmarks. The results suggest that computational overheads from this preliminary phase are not substantial, and that much can be gained, both on improving system performance and on enabling controlled experimentation, by adopting well-established procedures such as the ones proposed here.
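
    In the spirit of the screening-plus-response-surface procedure the paper investigates, the sketch below runs a small full-factorial design over two hypothetical system parameters, fits a quadratic response surface by least squares, and reads off the best predicted setting. The response function is a noisy synthetic stand-in for an actual ILP system's estimated accuracy.

        # Response-surface sketch: fit a quadratic model of system performance over
        # two sensitive parameters (stand-ins for e.g. clause-length and
        # minimum-accuracy settings) and pick the setting that maximises the fit.
        import numpy as np

        rng = np.random.default_rng(2)

        def performance(c):
            # Hypothetical response: peaks near clause_len=4, min_acc=0.75.
            clause_len, min_acc = c
            score = 0.8 - 0.01 * (clause_len - 4) ** 2 - 2.0 * (min_acc - 0.75) ** 2
            return score + rng.normal(0, 0.005)

        # Full-factorial design over the two parameters.
        levels_len = np.array([2, 4, 6, 8])
        levels_acc = np.array([0.6, 0.7, 0.8, 0.9])
        X, y = [], []
        for L in levels_len:
            for a in levels_acc:
                X.append([1, L, a, L * a, L ** 2, a ** 2])
                y.append(performance((L, a)))
        beta, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)

        # Evaluate the fitted surface on a fine grid and report the best setting.
        grid_L, grid_a = np.meshgrid(np.linspace(2, 8, 61), np.linspace(0.6, 0.9, 61))
        surface = (beta[0] + beta[1] * grid_L + beta[2] * grid_a
                   + beta[3] * grid_L * grid_a + beta[4] * grid_L ** 2 + beta[5] * grid_a ** 2)
        best = np.unravel_index(np.argmax(surface), surface.shape)
        print(grid_L[best], grid_a[best])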

  1. Statistical optimisation techniques in fatigue signal editing problem

    Energy Technology Data Exchange (ETDEWEB)

    Nopiah, Z. M.; Osman, M. H. [Fundamental Engineering Studies Unit Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia); Baharin, N.; Abdullah, S. [Department of Mechanical and Materials Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia)

    2015-02-03

    Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelling segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.
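
    The optimisation framing can be made concrete with a much simpler stand-in for the GA: rank fixed-length segments by a crude damage proxy, keep the most damaging ones, and stop once most of the damage is retained and the edited signal's root mean square and kurtosis stay close to the original. The damage measure, thresholds and synthetic signal below are all placeholders, not the paper's strain-life formulation.

        # Greedy stand-in for GA segment selection with RMS and kurtosis checks.
        import numpy as np
        from scipy.stats import kurtosis

        rng = np.random.default_rng(3)
        signal = rng.normal(0, 1, 60_000) * (1 + 0.5 * np.sin(np.linspace(0, 60, 60_000)))
        segments = signal.reshape(-1, 1000)                # 60 segments of 1000 samples

        damage = np.sum(np.abs(segments) ** 3, axis=1)     # crude fatigue-damage proxy
        order = np.argsort(damage)[::-1]                   # most damaging first

        def within_limits(edited, original, tol=0.05):
            rms_dev = abs(np.std(edited) - np.std(original)) / np.std(original)
            # Compare Pearson-style kurtosis (excess + 3) to avoid dividing by ~0.
            kur_dev = abs(kurtosis(edited) - kurtosis(original)) / abs(kurtosis(original) + 3)
            return rms_dev < tol and kur_dev < tol

        kept = []
        for idx in order:
            kept.append(idx)
            edited = np.concatenate([segments[i] for i in sorted(kept)])
            if (damage[kept].sum() / damage.sum() > 0.95 and
                    within_limits(edited, signal)):
                break

        print(f"kept {len(kept)}/{len(segments)} segments, "
              f"{len(edited) / len(signal):.0%} of original length")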

  2. Direct and Indirect Gradient Control for Static Optimisation

    Institute of Scientific and Technical Information of China (English)

    Yi Cao

    2005-01-01

    Static "self-optimising" control is an important concept, which provides a link between static optimisation and control[1]. According to the concept, a dynamic control system could be configured in such a way that when a set of certain variables are maintained at their setpoints, the overall process operation is automatically optimal or near optimal at steadystate in the presence of disturbances. A novel approach using constrained gradient control to achieve "self-optimisation" has been proposed by Cao[2]. However, for most process plants, the information required to get the gradient measure may not be available in real-time. In such cases, controlled variable selection has to be carried out based on measurable candidates. In this work, the idea of direct gradient control has been extended to controlled variable selection based on gradient sensitivity analysis (indirect gradient control). New criteria, which indicate the sensitivity of the gradient function to disturbances and implementation errors, have been derived for selection. The particular case study shows that the controlled variables selected by gradient sensitivity measures are able to achieve near optimal performance.

  3. Optimisation of Lilla Edet Landslide GPS Monitoring Network

    Science.gov (United States)

    Alizadeh-Khameneh, M. A.; Eshagh, M.; Sjöberg, L. E.

    2015-06-01

    Since the year 2000, some periodic investigations have been performed in the Lilla Edet region to monitor and possibly determine the landslide of the area with GPS measurements. The responsible consultant has conducted this project by setting up some stable stations for GPS receivers in the risky areas of Lilla Edet and measured the independent baselines amongst the stations according to their observation plan. Here, we optimise the existing surveying network and determine the optimal configuration of the observation plan based on different criteria. We aim to optimise the current network so that it is sensitive enough to detect possible displacements of 5 mm at each net point. The network quality criteria of precision, reliability and cost are used as objective functions to perform single-, bi- and multi-objective optimisation models. The results show that the single-objective reliability model, constrained by the precision criterion, provides much higher precision than the defined criterion while preserving almost all of the observations. However, in this study, the multi-objective model can fulfil all the mentioned network quality criteria with 17% fewer measurements than the original observation plan, meaning a 17% saving of time, cost and effort in the project.

  4. A Big Five approach to self-regulation: personality traits and health trajectories in the Hawaii longitudinal study of personality and health.

    Science.gov (United States)

    Hampson, Sarah E; Edmonds, Grant W; Barckley, Maureen; Goldberg, Lewis R; Dubanoski, Joan P; Hillier, Teresa A

    2016-01-01

    Self-regulatory processes influencing health outcomes may have their origins in childhood personality traits. The Big Five approach to personality was used here to investigate the associations between childhood traits, trait-related regulatory processes and changes in health across middle age. Participants (N = 1176) were members of the Hawaii longitudinal study of personality and health. Teacher assessments of the participants' traits when they were in elementary school were related to trajectories of self-rated health measured on 6 occasions over 14 years in middle age. Five trajectories of self-rated health were identified by latent class growth analysis: Stable Excellent, Stable Very Good, Good, Decreasing and Poor. Childhood Conscientiousness was the only childhood trait to predict membership in the Decreasing class vs. the combined healthy classes (Stable Excellent, Stable Very Good and Good), even after controlling for adult Conscientiousness and the other adult Big Five traits. The Decreasing class had poorer objectively assessed clinical health measured on one occasion in middle age, was less well-educated, and had a history of more lifespan health-damaging behaviors compared to the combined healthy classes. These findings suggest that higher levels of childhood Conscientiousness (i.e. greater self-discipline and goal-directedness) may prevent subsequent health decline decades later through self-regulatory processes involving the acquisition of lifelong healthful behavior patterns and higher educational attainment.

  5. Landsat Big Data Analysis for Detecting Long-Term Water Quality Changes: a Case Study in the Han River, South Korea

    Science.gov (United States)

    Seong, J. C.; Hwang, C. S.; Gibbs, R.; Roh, K.; Mehdi, M. R.; Oh, C.; Jeong, J. J.

    2017-05-01

    Landsat imagery satisfies the characteristics of big data because of its massive data archive since 1972, continuous temporal updates, and various spatial resolutions from different sensors. As a case study of Landsat big data analysis, a total of 776 Landsat scenes were analyzed that cover a part of the Han River in South Korea. A total of eleven sample datasets was taken at the upstream, mid-stream and downstream along the Han River. This research aimed at analyzing locational variance of reflectance, analyzing seasonal difference, finding long-term changes, and modeling algal amount change. There were distinctive reflectance differences among the downstream, mid-stream and upstream areas. Red, green, blue and near-infrared reflectance values decreased significantly toward the upstream. Results also showed that reflectance values are significantly associated with the seasonal factor. In the case of long-term trends, reflectance values have slightly increased in the downstream, while decreased slightly in the mid-stream and upstream. The modeling of chlorophyll-a and Secchi disk depth imply that water clarity has decreased over time while chlorophyll-a amounts have decreased. The decreasing water clarity seems to be attributed to other reasons than chlorophyll-a.

  6. Multicriteria Optimisation in Logistics Forwarder Activities

    Directory of Open Access Journals (Sweden)

    Tanja Poletan Jugović

    2007-05-01

    Full Text Available The logistics forwarder, as organizer and planner of the coordination and integration of all the elements of transport and logistics chains, uses adequate ways and methods in the process of planning and decision-making. One of these methods, analysed in this paper, which could be used in the optimisation of transport and logistics processes and of the logistics forwarder's activities, is the multicriteria optimisation method. Using that method, this paper suggests a model of multicriteria optimisation of logistics forwarder activities. The suggested optimisation model is justified in keeping with the principles of multicriteria optimization, which belongs to the operations research methods and represents the process of multicriteria optimization of variants. Among the many different multicriteria optimization procedures, PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluations) and Promcalc & Gaia V. 3.2, a computer program for multicriteria programming based on that procedure, were used.
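
    For readers unfamiliar with the method, the core of PROMETHEE II fits in a few lines: pairwise preferences per criterion are aggregated with weights, and alternatives are ranked by net outranking flow. The sketch below uses the simple "usual" preference function and invented alternatives, criteria and weights; Promcalc & Gaia adds richer preference functions and graphical analysis on top of this.

        # PROMETHEE II sketch: rank forwarding variants on invented criteria
        # (cost and transit time to minimise, reliability to maximise).
        import numpy as np

        alternatives = ["road", "rail", "sea-road combined"]
        # Columns: cost, transit time, reliability.
        scores = np.array([[100.0, 3.0, 0.90],
                           [ 80.0, 5.0, 0.85],
                           [ 70.0, 8.0, 0.80]])
        weights = np.array([0.4, 0.3, 0.3])
        maximise = np.array([False, False, True])

        # Orient every criterion so that "larger is better".
        oriented = np.where(maximise, scores, -scores)

        n = len(alternatives)
        pi = np.zeros((n, n))                      # aggregated preference of a over b
        for a in range(n):
            for b in range(n):
                if a != b:
                    pref = (oriented[a] > oriented[b]).astype(float)   # usual criterion
                    pi[a, b] = np.dot(weights, pref)

        phi_plus = pi.sum(axis=1) / (n - 1)        # positive (leaving) flow
        phi_minus = pi.sum(axis=0) / (n - 1)       # negative (entering) flow
        net_flow = phi_plus - phi_minus
        for name, phi in sorted(zip(alternatives, net_flow), key=lambda t: -t[1]):
            print(f"{name}: {phi:+.3f}")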

  7. Topology Optimisation for Coupled Convection Problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Andreasen, Casper Schousboe; Aage, Niels

    conduction governs in the solid parts of the design domain and couples to convection-dominated heat transfer to a surrounding fluid. Both loosely coupled and tightly coupled problems are considered. The loosely coupled problems are convection-diffusion problems, based on an advective velocity field from......The work focuses on applying topology optimisation to forced and natural convection problems in fluid dynamics and conjugate (fluid-structure) heat transfer. To the authors' knowledge, topology optimisation has not yet been applied to natural convection flow problems in the published literature...... and the current work is thus seen as contributing new results to the field. In the literature, most works on the topology optimisation of weakly coupled convection-diffusion problems focus on the temperature distribution of the fluid, but a selection of notable exceptions also focusing on the temperature...

  8. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing turns these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.

  9. Application and Prospect of Big Data in Water Resources

    Science.gov (United States)

    Xi, Danchi; Xu, Xinyi

    2017-04-01

    Because of developed information technology and affordable data storage, we have entered the era of data explosion. The term "Big Data" and the technology related to it have been created and are commonly applied in many fields. However, academic studies have only recently paid attention to Big Data applications in water resources. As a result, water resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying big data in water resources and summarizes prior research by others. Most studies in this field only set up a theoretical frame, but we define "Water Big Data" and explain its three-dimensional properties: the time dimension, the spatial dimension and the intelligent dimension. Based on HBase, a classification system for Water Big Data is introduced: hydrology data, ecology data and socio-economic data. Then, after analyzing the challenges in water resources management, a series of solutions using Big Data technologies, such as data mining and web crawlers, is proposed. Finally, the prospects of applying big data in water resources are discussed; it can be predicted that, as Big Data technology keeps developing, "3D" (Data-Driven Decision) will be utilized more in water resources management in the future.

  10. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at CBS......) have developed a research-based capability mapping tool, entitled DataProfit, which the public business consultants can use to upgrade their tool kit to enable data-driven growth in manufacturing organizations. Benefit: The DataProfit model/tool comprises insights of an extensive research project...

  11. Focus : big data, little questions?

    OpenAIRE

    Uprichard, Emma

    2013-01-01

    Big data. Little data. Deep data. Surface data. Noisy, unstructured data. Big. The world of data has gone from being analogue and digital, qualitative and quantitative, transactional and a by-product, to, simply, BIG. It is as if we couldn’t quite deal with its omnipotence and just ran out of adjectives. BIG. With all the data power it is supposedly meant to entail, one might have thought that a slightly better descriptive term might have been latched onto. But, no. BIG. Just BIG.

  12. The BigBOSS Experiment

    CERN Document Server

    Schlegel, D; Abraham, T; Ahn, C; Prieto, C Allende; Annis, J; Aubourg, E; Azzaro, M; Baltay, S Bailey C; Baugh, C; Bebek, C; Becerril, S; Blanton, M; Bolton, A; Bromley, B; Cahn, R; Carton, P -H; Cervantes-Cota, J L; Chu, Y; Cortes, M; Dawson, K; Dey, A; Dickinson, M; Diehl, H T; Doel, P; Ealet, A; Edelstein, J; Eppelle, D; Escoffier, S; Evrard, A; Faccioli, L; Frenk, C; Geha, M; Gerdes, D; Gondolo, P; Gonzalez-Arroyo, A; Grossan, B; Heckman, T; Heetderks, H; Ho, S; Honscheid, K; Huterer, D; Ilbert, O; Ivans, I; Jelinsky, P; Jing, Y; Joyce, D; Kennedy, R; Kent, S; Kieda, D; Kim, A; Kim, C; Kneib, J -P; Kong, X; Kosowsky, A; Krishnan, K; Lahav, O; Lampton, M; LeBohec, S; Brun, V Le; Levi, M; Li, C; Liang, M; Lim, H; Lin, W; Linder, E; Lorenzon, W; de la Macorra, A; Magneville, Ch; Malina, R; Marinoni, C; Martinez, V; Majewski, S; Matheson, T; McCloskey, R; McDonald, P; McKay, T; McMahon, J; Menard, B; Miralda-Escude, J; Modjaz, M; Montero-Dorta, A; Morales, I; Mostek, N; Newman, J; Nichol, R; Nugent, P; Olsen, K; Padmanabhan, N; Palanque-Delabrouille, N; Park, I; Peacock, J; Percival, W; Perlmutter, S; Peroux, C; Petitjean, P; Prada, F; Prieto, E; Prochaska, J; Reil, K; Rockosi, C; Roe, N; Rollinde, E; Roodman, A; Ross, N; Rudnick, G; Ruhlmann-Kleider, V; Sanchez, J; Sawyer, D; Schimd, C; Schubnell, M; Scoccimaro, R; Seljak, U; Seo, H; Sheldon, E; Sholl, M; Shulte-Ladbeck, R; Slosar, A; Smith, D S; Smoot, G; Springer, W; Stril, A; Szalay, A S; Tao, C; Tarle, G; Taylor, E; Tilquin, A; Tinker, J; Valdes, F; Wang, J; Wang, T; Weaver, B A; Weinberg, D; White, M; Wood-Vasey, M; Yang, J; Yeche, X Yang Ch; Zakamska, N; Zentner, A; Zhai, C; Zhang, P

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy red...

  13. The optimisation, design and verification of feed horn structures for future Cosmic Microwave Background missions

    Science.gov (United States)

    McCarthy, Darragh; Trappe, Neil; Murphy, J. Anthony; O'Sullivan, Créidhe; Gradziel, Marcin; Doherty, Stephen; Huggard, Peter G.; Polegro, Arturo; van der Vorst, Maarten

    2016-05-01

    In order to investigate the origins of the Universe, it is necessary to carry out full sky surveys of the temperature and polarisation of the Cosmic Microwave Background (CMB) radiation, the remnant of the Big Bang. Missions such as COBE and Planck have previously mapped the CMB temperature, however in order to further constrain evolutionary and inflationary models, it is necessary to measure the polarisation of the CMB with greater accuracy and sensitivity than before. Missions undertaking such observations require large arrays of feed horn antennas to feed the detector arrays. Corrugated horns provide the best performance, however owing to the large number required (circa 5000 in the case of the proposed COrE+ mission), such horns are prohibitive in terms of thermal, mechanical and cost limitations. In this paper we consider the optimisation of an alternative smooth-walled piecewise conical profiled horn, using the mode-matching technique alongside a genetic algorithm. The technique is optimised to return a suitable design using efficient modelling software and standard desktop computing power. A design is presented showing a directional beam pattern and low levels of return loss, cross-polar power and sidelobes, as required by future CMB missions. This design is manufactured and the measured results compared with simulation, showing excellent agreement and meeting the required performance criteria. The optimisation process described here is robust and can be applied to many other applications where specific performance characteristics are required, with the user simply defining the beam requirements.

  14. [Process optimisation: from theory to practical implementation].

    Science.gov (United States)

    Töpfer, Armin

    2010-01-01

    Today process optimisation is an indispensable approach to mastering the current challenges of modern health care management. The objective is to design business processes free of defects and free of waste as well as their monitoring and controlling with meaningful test statistics. Based on the identification of essential key performance indicators, key success factors and value cash generators two basic approaches to process optimisation, which are well-established and widely used in the industry, are now being implemented in the health care sector as well: Lean Management and Six Sigma.

  15. Bat Algorithm for Multi-objective Optimisation

    CERN Document Server

    Yang, Xin-She

    2012-01-01

    Engineering optimization is typically multiobjective and multidisciplinary with complex constraints, and the solution of such complex problems requires efficient optimization algorithms. Recently, Xin-She Yang proposed a bat-inspired algorithm for solving nonlinear, global optimisation problems. In this paper, we extend this algorithm to solve multiobjective optimisation problems. The proposed multiobjective bat algorithm (MOBA) is first validated against a subset of test functions, and then applied to solve multiobjective design problems such as welded beam design. Simulation results suggest that the proposed algorithm works efficiently.
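
    One way to read the approach is as the basic bat update rules (frequency-tuned velocities, a local random walk, loudness-gated acceptance) wrapped in a weighted-sum scalarisation that is re-drawn to sample the Pareto front. The sketch below applies that pattern to Schaffer's classic bi-objective test problem; the parameter values are common defaults, the loudness and pulse rate are kept constant and the velocity is mildly damped for stability in this toy version, and none of it is taken verbatim from the paper.

        # Compact multi-objective bat algorithm sketch on Schaffer's problem
        # f1 = x^2, f2 = (x - 2)^2, using random weighted-sum scalarisation.
        import numpy as np

        rng = np.random.default_rng(7)

        def objectives(x):
            return np.array([x ** 2, (x - 2.0) ** 2])

        def bat_search(weight, n=20, iters=200, fmin=0.0, fmax=2.0):
            x = rng.uniform(-5, 5, n)                 # bat positions (1-D problem)
            v = np.zeros(n)
            loud, pulse = 0.9, 0.5                    # loudness A and pulse rate r
            cost = weight @ objectives(x)
            best = x[np.argmin(cost)]
            for _ in range(iters):
                freq = fmin + (fmax - fmin) * rng.random(n)
                v = 0.9 * v + (x - best) * freq       # damping added for this toy version
                cand = np.clip(x + v, -5, 5)
                walk = rng.random(n) > pulse          # local random walk around the best
                cand[walk] = np.clip(best + 0.01 * rng.normal(size=walk.sum()), -5, 5)
                cand_cost = weight @ objectives(cand)
                accept = (cand_cost < cost) & (rng.random(n) < loud)
                x[accept], cost[accept] = cand[accept], cand_cost[accept]
                best = x[np.argmin(cost)]
            return best

        pareto = []
        for _ in range(15):
            w = rng.random(2)
            w /= w.sum()
            xs = bat_search(w)
            pareto.append((xs, *objectives(xs)))
        for x, f1, f2 in sorted(pareto):
            print(f"x={x:+.3f}  f1={f1:.3f}  f2={f2:.3f}")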

  16. Topology Optimisation of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Thike Aye Min

    2016-01-01

    Full Text Available Wireless sensor networks are widely used in a variety of fields, including industrial environments. In the case of a clustered network, the location of the cluster head affects the reliability of the network operation. Finding the optimum location of the cluster head is therefore critical for the design of a network. This paper discusses an optimisation approach, based on the brute-force algorithm, in the context of topology optimisation of a cluster-structured centralised wireless sensor network. Two examples that demonstrate the implementation of the brute-force algorithm to find an optimum location of the cluster head are given to verify the approach.
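
    The brute-force idea is easy to state in code: enumerate every candidate position on a grid and keep the one with the best objective, here the worst-case node-to-head distance. Node coordinates, grid resolution and the objective are all illustrative choices.

        # Brute-force search for the cluster-head position that minimises the
        # worst-case node-to-head distance over a candidate grid (toy coordinates).
        import numpy as np

        rng = np.random.default_rng(4)
        nodes = rng.uniform(0, 100, size=(20, 2))            # sensor node coordinates, m

        xs = ys = np.arange(0, 101, 1.0)                     # 1 m candidate grid
        best_pos, best_cost = None, np.inf
        for x in xs:
            for y in ys:
                cost = np.max(np.hypot(nodes[:, 0] - x, nodes[:, 1] - y))
                if cost < best_cost:
                    best_pos, best_cost = (x, y), cost

        print(best_pos, best_cost)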

  17. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  18. Big and Small

    CERN Document Server

    Ekers, R D

    2010-01-01

    Technology leads discovery in astronomy, as in all other areas of science, so growth in technology leads to the continual stream of new discoveries which makes our field so fascinating. Derek de Solla Price had analysed the discovery process in science in the 1960s and he introduced the terms 'Little Science' and 'Big Science' as part of his discussion of the role of exponential growth in science. I will show how the development of astronomical facilities has followed this same trend from 'Little Science' to 'Big Science' as a field matures. We can see this in the discoveries resulting in Nobel Prizes in astronomy. A more detailed analysis of discoveries in radio astronomy shows the same effect. I include a digression to look at how science progresses, comparing the roles of prediction, serendipity, measurement and explanation. Finally I comment on the differences between the 'Big Science' culture in Physics and in Astronomy.

  19. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  20. Optimisation of interventional cardiology procedures; Optimisation des procedures en cardiologie interventionnelle

    Energy Technology Data Exchange (ETDEWEB)

    Bar, Olivier [SELARL, Cardiologie Interventionnelle Imagerie Cardiaque - CIIC, 8, place de la Cathedrale - 37042 Tours (France)

    2011-07-15

    Radiation-guided procedures in interventional cardiology include diagnostic and/or therapeutic procedures, primarily coronary catheterization and coronary angioplasty. Application of the principles of radiation protection and the use of optimised procedures are contributing to dose reduction while maintaining the radiological image quality necessary for performance of the procedures. The mandatory training in patient radiation protection and technical training in the use of radiology devices mean that implementing continuous optimisation of procedures is possible in practice. This optimisation approach is the basis of patient radiation protection; when associated with the wearing of protective equipment it also contributes to the radiation protection of the cardiologists. (author)

  1. The Value, Foundation and Directions of the Studies on Marketing Innovation in the Big Data Era

    Institute of Scientific and Technical Information of China (English)

    李巍; 席小涛

    2014-01-01

    Research on big data has become a current hotspot, but research combining big data with management, and especially with marketing management, is still very scarce. On the basis of analyzing the practical value of research on marketing innovation in the big data era, this paper systematically reviews the existing research foundations of both big data and marketing innovation. Then, from the perspective of the interaction between technology development and management change, it presents three major directions for research on marketing innovation in the big data era, namely the application value of big data in marketing, and the internal mechanisms and support systems of marketing innovation in the big data era. The paper aims to combine big data effectively with changes in marketing management, and to provide a basic reference and orientation guide for promoting related research on marketing innovation in the big data era.

  2. Extending Particle Swarm Optimisers with Self-Organized Criticality

    DEFF Research Database (Denmark)

    Løvbjerg, Morten; Krink, Thiemo

    2002-01-01

    Particle swarm optimisers (PSOs) show potential in function optimisation, but still have room for improvement. Self-organized criticality (SOC) can help control the PSO and add diversity. Extending the PSO with SOC seems promising, reaching faster convergence and better solutions.
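
    A minimal way to picture the combination is a standard global-best PSO plus a crude criticality mechanism: particles accumulate "criticality" while crowded, and once a threshold is exceeded they are relocated, injecting diversity. The sketch below implements that simplified reading on the Rastrigin function; it is not the authors' exact SOC scheme, and all parameter values are ordinary defaults.

        # Global-best PSO with a simplified, SOC-inspired relocation mechanism.
        import numpy as np

        rng = np.random.default_rng(5)
        DIM, N, ITER = 10, 30, 500

        def rastrigin(x):
            return 10 * DIM + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x), axis=-1)

        pos = rng.uniform(-5.12, 5.12, (N, DIM))
        vel = np.zeros((N, DIM))
        pbest, pbest_val = pos.copy(), rastrigin(pos)
        crit = np.zeros(N)

        for _ in range(ITER):
            gbest = pbest[np.argmin(pbest_val)]
            r1, r2 = rng.random((N, DIM)), rng.random((N, DIM))
            vel = 0.72 * vel + 1.49 * r1 * (pbest - pos) + 1.49 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, -5.12, 5.12)

            # SOC-inspired criticality: count close neighbours; relocate "critical" particles.
            dists = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
            crit += np.sum(dists < 0.5, axis=1) - 1          # exclude self
            critical = crit > 50
            pos[critical] = rng.uniform(-5.12, 5.12, (critical.sum(), DIM))
            crit[critical] = 0

            val = rastrigin(pos)
            improved = val < pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], val[improved]

        print(pbest_val.min())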

  3. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  4. Shape optimisation and performance analysis of flapping wings

    KAUST Repository

    Ghommem, Mehdi

    2012-09-04

    In this paper, shape optimisation of flapping wings in forward flight is considered. This analysis is performed by combining a local gradient-based optimizer with the unsteady vortex lattice method (UVLM). Although the UVLM applies only to incompressible, inviscid flows where the separation lines are known a priori, Persson et al. [1] showed through a detailed comparison between UVLM and higher-fidelity computational fluid dynamics methods for flapping flight that the UVLM schemes produce accurate results for attached flow cases and even remain trend-relevant in the presence of flow separation. As such, they recommended the use of an aerodynamic model based on UVLM to perform preliminary design studies of flapping wing vehicles. Unlike standard computational fluid dynamics schemes, this method requires meshing of the wing surface only and not of the whole flow domain [2]. From the design or optimisation perspective taken in our work, it is fairly common (and sometimes entirely necessary, as a result of the excessive computational cost of the highest fidelity tools such as Navier-Stokes solvers) to rely upon such a moderate level of modelling fidelity to traverse the design space in an economical manner. The objective of the work, described in this paper, is to identify a set of optimised shapes that maximise the propulsive efficiency, defined as the ratio of the propulsive power over the aerodynamic power, under lift, thrust, and area constraints. The shape of the wings is modelled using B-splines, a technology used in the computer-aided design (CAD) field for decades. This basis can be used to smoothly discretize wing shapes with few degrees of freedom, referred to as control points. The locations of the control points constitute the design variables. The results suggest that changing the shape yields significant improvement in the performance of the flapping wings. The optimisation pushes the design to "bird-like" shapes with substantial increase in the time

  5. Textural and stable isotope studies of the Big Mike cupriferous volcanogenic massive sulfide deposit, Pershing County, Nevada.

    Science.gov (United States)

    Rye, R.O.; Roberts, R.J.; Snyder, W.S.; Lahusen, G.L.; Motica, J.E.

    1984-01-01

    The Big Mike deposit is a massive sulphide lens entirely within a carbonaceous argillite of the Palaeozoic Havallah pelagic sequence. The massive ore contains two generations of pyrite, a fine- and a coarse-grained variety; framboidal pyrite occurs in the surrounding carbonaceous argillite. Coarse grained pyrite is largely recrystallized fine-grained pyrite and is proportionately more abundant toward the margins of the lens. Chalcopyrite and sphalerite replace fine-grained pyrite and vein-fragmented coarse-grained pyrite. Quartz fills openings in the sulphide fabric. S-isotope data are related to sulphide mineralogy and textures. Isotopically light S in the early fine-grained pyrite was probably derived from framboidal biogenic pyrite. The S-isotope values of the later coarse-grained pyrite and chalcopyrite probably reflect a combination of reduced sea-water sulphate and igneous S. Combined S- and O-isotope and textural data accord with precipitation of fine-grained pyrite from a hydrothermal plume like those at the East Pacific Rise spreading centre at lat. 21°N. The primary material was recrystallized and mineralized by later fluids of distinctly different S-isotope composition. -G.J.N.

  6. Study on clear stereo image pair acquisition method for small objects with big vertical size in SLM vision system.

    Science.gov (United States)

    Wang, Yuezong; Jin, Yan; Wang, Lika; Geng, Benliang

    2016-05-01

    A microscopic vision system with a stereo light microscope (SLM) has been applied to surface profile measurement. If the vertical size of a small object exceeds the depth-of-field range, its images will contain both clear and fuzzy regions. Hence, in order to obtain clear stereo images, we propose a microscopic sequence image fusion method suitable for an SLM vision system. First, a solution to capture and align the image sequence is designed, which outputs aligned stereo images. Second, we decompose the stereo image sequence by wavelet analysis and obtain a series of high- and low-frequency coefficients at different resolutions. Fused stereo images are then output based on the high- and low-frequency coefficient fusion rules proposed in this article. The results show that Δw1 (Δw2) and ΔZ of stereo images in a sequence have a linear relationship; hence, a procedure for image alignment is necessary before image fusion. In contrast with other image fusion methods, our method can output clear fused stereo images with better performance, which is suitable for an SLM vision system and very helpful for avoiding the image blur caused by the large vertical size of small objects.
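
    The generic pattern behind such wavelet fusion, decompose both aligned images, merge approximation and detail coefficients, then reconstruct, can be sketched with PyWavelets. The average/maximum-magnitude rule used here is a common textbook placeholder, not the specific fusion rules proposed in the article, and the two synthetic "partially focused" images are only for demonstration.

        # Generic wavelet image-fusion pattern for an aligned image pair:
        # average the approximation coefficients, keep the detail coefficient
        # with the larger magnitude.
        import numpy as np
        import pywt

        def fuse(img_a, img_b, wavelet="db2", level=3):
            ca = pywt.wavedec2(img_a, wavelet, level=level)
            cb = pywt.wavedec2(img_b, wavelet, level=level)
            fused = [(ca[0] + cb[0]) / 2.0]                  # approximation: average
            for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
                fused.append(tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                                   for x, y in ((ha, hb), (va, vb), (da, db))))
            return pywt.waverec2(fused, wavelet)

        # Toy demonstration with two synthetic "partially focused" images.
        rng = np.random.default_rng(6)
        sharp = rng.random((128, 128))
        img_a = sharp.copy()
        img_a[:, 64:] = 0.5                                  # right half degraded
        img_b = sharp.copy()
        img_b[:, :64] = 0.5                                  # left half degraded
        print(fuse(img_a, img_b).shape)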

  7. Topology optimisation of natural convection problems

    DEFF Research Database (Denmark)

    Alexandersen, Joe; Aage, Niels; Andreasen, Casper Schousboe

    2014-01-01

    This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equation...

  8. Thermodynamic optimisation of a heat exchanger

    NARCIS (Netherlands)

    Cornelissen, R.L.; Hirs, G.G.

    1999-01-01

    The objective of this paper is to show that for the optimal design of an energy system, where there is a trade-off between exergy saving during operation and exergy use during construction of the energy system, exergy analysis and life cycle analysis should be combined. An exergy optimisation of a h

  9. Optimised Design of Transparent Optical Domains

    DEFF Research Database (Denmark)

    Hanik, N.; Caspar, C.; Schmidt, F.;

    2000-01-01

    Three different design concepts for transparent, dispersion compensated, optical WDM transmission links are optimised numerically and experimentally for 10 Gbit/s data rate per channel. It is shown that robust transparent domains of 1,500 km in diameter can be realised using simple design rules....

  10. The Promise and Prejudice of Big Data in Intelligence Community

    OpenAIRE

    Jani, Karan

    2016-01-01

    Big data holds critical importance in the current generation of information technology, with applications ranging from financial, industrial, academic to defense sectors. With the exponential rise of open source data from social media and increasing government monitoring, big data is now also linked with national security, and subsequently to the intelligence community. In this study I review the scope of big data sciences in the functioning of the intelligence community. The major part of my stu...

  11. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  12. Study on Big Data Protection Technology of Telecom Operators

    Institute of Scientific and Technical Information of China (English)

    陶冶; 张云勇; 张尼

    2014-01-01

    Big data has brought the telecom operators not only huge development opportunities but also various kinds of security problems. Protecting big data security has become a key factor in the development of the big data business. Based on an analysis of the current situation of telecom operators’ big data security, this paper summarises the big data protection solutions and technologies of Chinese telecom operators.

  13. LSVT-BIG Improves UPDRS III Scores at 4 Weeks in Parkinson’s Disease Patients with Wearing Off: A Prospective, Open-Label Study

    Directory of Open Access Journals (Sweden)

    Tatsuya Ueno

    2017-01-01

    Full Text Available The efficacy of LSVT-BIG for advanced Parkinson’s disease (PD) patients with wearing off remains to be determined. Therefore, we evaluated whether LSVT-BIG improves motor disability in eight PD patients with wearing off. Unified Parkinson’s Disease Rating Scale (UPDRS) scores, daily off time, and mobility assessments were evaluated during the “on” time before and after the LSVT-BIG course. LSVT-BIG significantly improved UPDRS III scores at 4 weeks and UPDRS II scores in the “off” state at 12 weeks, with no changes in the other measures. The findings suggest that LSVT-BIG may be an effective therapy for advanced PD patients with wearing off.

  14. The big bang

    Science.gov (United States)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The Big Bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: Photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  15. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  16. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  17. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  18. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  19. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  20. Governing Big Data

    Directory of Open Access Journals (Sweden)

    Andrej J. Zwitter

    2014-04-01

    Full Text Available 2.5 quintillion bytes of data are created every day through pictures, messages, GPS data, etc. "Big Data" is seen simultaneously as the new Philosopher's Stone and Pandora's box: a source of great knowledge and power, but equally, the root of serious problems.

  1. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  2. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  3. Space big book

    CERN Document Server

    Homer, Charlene

    2007-01-01

    Our combined resource includes all necessary areas of Space for grades five to eight. Get the big picture about the Solar System, Galaxies and the Universe as your students become fascinated by the interesting information about the Sun, Earth, Moon, Comets, Asteroids, Meteoroids, Stars and Constellations. Also, thrill your young astronomers as they connect Earth and space cycles with their daily life.

  4. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  5. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  6. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user’s code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space efficient bit-arrays to reduce the problem’s search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
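    As a much simplified illustration of how sorted arrays prune the search space of an inequality join, the toy Python sketch below joins two relations on two inequality predicates; the published IEJoin algorithm additionally relies on permutation and bit arrays and is far more scalable than this version.

```python
# Toy illustration only: inequality join (r.dur < s.dur AND r.rev > s.rev) using a
# sorted array to prune candidates on the first predicate. Not the IEJoin algorithm.
from bisect import bisect_right

def inequality_join(R, S):
    """R, S: lists of (id, dur, rev). Returns (r_id, s_id) pairs with
    r.dur < s.dur and r.rev > s.rev."""
    S_sorted = sorted(S, key=lambda t: t[1])       # sort S on dur
    durs = [t[1] for t in S_sorted]
    out = []
    for r_id, r_dur, r_rev in R:
        start = bisect_right(durs, r_dur)          # first s with s.dur > r.dur
        for s_id, _, s_rev in S_sorted[start:]:    # candidates satisfying predicate 1
            if r_rev > s_rev:                      # check predicate 2
                out.append((r_id, s_id))
    return out

# Example with made-up tuples:
R = [(1, 100, 12), (2, 140, 9), (3, 80, 5)]
S = [(10, 90, 5), (11, 120, 7), (12, 160, 11)]
print(inequality_join(R, S))   # includes (1, 11) since 100 < 120 and 12 > 7
```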

  7. Business and Science - Big Data, Big Picture

    Science.gov (United States)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  8. Disease activity-guided dose optimisation of adalimumab and etanercept is a cost-effective strategy compared with non-tapering tight control rheumatoid arthritis care: analyses of the DRESS study

    NARCIS (Netherlands)

    Kievit, W.; Herwaarden, N. van; Hoogen, F.H.J. van den; Vollenhoven, R.F. van; Bijlsma, J.W.; Bemt, B.J. van den; Maas, A. van der; Broeder, A.A. den

    2016-01-01

    BACKGROUND: A disease activity-guided dose optimisation strategy of adalimumab or etanercept (TNFi (tumour necrosis factor inhibitors)) has shown to be non-inferior in maintaining disease control in patients with rheumatoid arthritis (RA) compared with usual care. However, the cost-effectiveness of

  9. Organisational design elements and competencies for optimising the expertise of knowledge workers in a shared services centre

    Directory of Open Access Journals (Sweden)

    Mark Ramsey

    2011-02-01

    Full Text Available Orientation: Organisations are still structured according to the Industrial Age control model that restricts optimising the expertise of knowledge workers. Research purpose: The general aim of the research was to explore the organisation design elements and competencies that contribute to optimising the expertise of knowledge workers in a shared services centre. Motivation for the study: Current organisational design methodologies do not emphasise optimising the expertise of knowledge workers. This research addresses the challenge of how an organisation design can improve the creation and availability of the expertise of knowledge workers. Research design/approach/method: The researcher followed a qualitative case study research design and collected data in six focus group sessions (N = 25). Main findings: The findings showed that the shared services centre (SSC) is not designed to enable its structure, culture and codifying system to optimise the expertise of knowledge workers. In addition, the SSC does not share the knowledge generated with other knowledge workers. Furthermore, it does not use the output of the knowledge workers to improve business processes. Practical/managerial implications: The expertise of knowledge workers is the basis of competitive advantage. Therefore, managers should create an organisational design that is conducive to optimising knowledge work expertise. Contribution/value add: This research highlights the important organisational design elements and supportive organisational structures for optimising the expertise of knowledge workers. The research also proposes a framework for optimising the expertise of knowledge workers and helping an organisation to achieve sustainable competitive advantage.

  10. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  11. Comparing Audit Quality of Big 4 and Non-Big 4 Auditors

    OpenAIRE

    PING, CHAO-WEN

    2010-01-01

    In the first decade of the 21st century, many multinational corporations collapsed because of fraud. These scandals have raised concerns about audit quality, even among the large international accounting firms. This study examines the quality of audit services in the Taiwanese audit market. Consistent with prior research, this paper treats audit quality as a dichotomous variable and assumes that Big 4 auditors are of higher quality than non-Big 4 auditors, as it is widely accepted...

  12. The Study of Pedantic Language in The Big Bang Theory

    Institute of Scientific and Technical Information of China (English)

    王亚辉

    2014-01-01

    Language variants vary across speech communities. Social dialects reflect the distinct speech communities produced by social factors. Pedantic language, a kind of academic style used by speech communities with a high educational background, is a language variety characterised by advanced terminology and complex sentences. The TV series The Big Bang Theory, with its unique script setting, categorises its dialogue as pedantic language. Its episodes clearly illustrate the use of pedantic language in specific registers and, to some extent, display its interplay with the other varieties of different speech communities. This paper tentatively studies the influence of academic style in inappropriate registers through a corpus drawn from season two of The Big Bang Theory.

  13. Conceptualization and theorization of the Big Data

    Directory of Open Access Journals (Sweden)

    Marcos Mazzieri

    2016-06-01

    Full Text Available The term Big Data is being used widely by companies and researchers who consider its functionalities or applications relevant for creating value and business innovation. However, some questions arise about what this phenomenon is and, more precisely, how it occurs and under what conditions it can create value and innovation in business. In our view, the lack of depth regarding the principles involved in Big Data, and the very absence of a conceptual definition, made it difficult to answer these questions, which have been the basis for our research. To answer them we carried out a bibliometric study and an extensive literature review. The bibliometric studies were based on articles and citations from the Web of Knowledge database. The main result of our research is a conceptual definition for the term Big Data. We also propose how the principles discovered can contribute to other research aimed at value creation through Big Data. Finally, we propose viewing value creation through Big Data using the Resource-Based View as the main theory for discussing this theme.

  14. Transcriptome marker diagnostics using big data.

    Science.gov (United States)

    Han, Henry; Liu, Ying

    2016-02-01

    Big omics data are challenging translational bioinformatics in an unprecedented way because of their complexity and volume. How to employ big omics data to achieve reproducible disease diagnosis that rivals clinical practice, from a systems approach, is an urgent problem to be solved in translational bioinformatics and machine learning. In this study, the authors propose a novel transcriptome marker diagnosis to tackle this problem using big RNA-seq data, by systematically viewing the whole transcriptome as a profile marker. The systems diagnosis not only avoids the reproducibility issues of existing gene-/network-marker-based diagnostic methods, but also achieves diagnostic results rivalling clinical practice by extracting true signals from big RNA-seq data. By using systems-level information, their method attains better diagnostic performance than competing methods, demonstrates a better fit for personalised diagnostics, and is a good candidate for clinical use. To the best of their knowledge, it is the first study on this topic and will inspire further investigations in big omics data diagnostics.
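    The abstract does not spell out the authors' algorithm, but the underlying idea of treating the entire expression profile as the feature vector of a classifier can be sketched generically; everything in the snippet below (synthetic counts, pipeline choices, parameters) is an illustrative assumption, not the authors' method.

```python
# Generic sketch of "whole transcriptome as a profile marker": every gene's expression
# is a feature and a classifier is trained on the full profile (synthetic data only).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_genes = 60, 5000                # hypothetical RNA-seq cohort
X = rng.poisson(lam=10, size=(n_samples, n_genes)).astype(float)  # stand-in counts
y = rng.integers(0, 2, size=n_samples)       # disease / control labels

X = np.log1p(X)                              # simple variance-stabilising transform
clf = make_pipeline(StandardScaler(), LinearSVC(C=0.1, dual=False))
scores = cross_val_score(clf, X, y, cv=5)    # profile-level diagnostic performance
print(scores.mean())
```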

  15. Perceptions and experiences of the implementation, management, use and optimisation of electronic prescribing systems in hospital settings: protocol for a systematic review of qualitative studies

    Science.gov (United States)

    Farre, Albert; Bem, Danai; Heath, Gemma; Shaw, Karen; Cummins, Carole

    2016-01-01

    Introduction There is increasing evidence that electronic prescribing (ePrescribing) or computerised provider/physician order entry (CPOE) systems can improve the quality and safety of healthcare services. However, it has also become clear that their implementation is not straightforward and may create unintended or undesired consequences once in use. In this context, qualitative approaches have been particularly useful and their interpretative synthesis could make an important and timely contribution to the field. This review will aim to identify, appraise and synthesise qualitative studies on ePrescribing/CPOE in hospital settings, with or without clinical decision support. Methods and analysis Data sources will include the following bibliographic databases: MEDLINE, MEDLINE In Process, EMBASE, PsycINFO, Social Policy and Practice via Ovid, CINAHL via EBSCO, The Cochrane Library (CDSR, DARE and CENTRAL databases), Nursing and Allied Health Sources, Applied Social Sciences Index and Abstracts via ProQuest and SCOPUS. In addition, other sources will be searched for ongoing studies (ClinicalTrials.gov) and grey literature: Healthcare Management Information Consortium, Conference Proceedings Citation Index (Web of Science) and Sociological abstracts. Studies will be independently screened for eligibility by 2 reviewers. Qualitative studies, either standalone or in the context of mixed-methods designs, reporting the perspectives of any actors involved in the implementation, management and use of ePrescribing/CPOE systems in hospital-based care settings will be included. Data extraction will be conducted by 2 reviewers using a piloted form. Quality appraisal will be based on criteria from the Critical Appraisal Skills Programme checklist and Standards for Reporting Qualitative Research. Studies will not be excluded based on quality assessment. A postsynthesis sensitivity analysis will be undertaken. Data analysis will follow the thematic synthesis method. Ethics and

  16. Classical and quantum Big Brake cosmology for scalar field and tachyonic models

    CERN Document Server

    Kamenshchik, A

    2013-01-01

    We study a relation between the cosmological singularities in classical and quantum theory, comparing the classical and quantum dynamics in some models possessing the Big Brake singularity - the model based on a scalar field and two models based on a tachyon-pseudo-tachyon field. It is shown that the effect of quantum avoidance is absent for the soft singularities of the Big Brake type while it is present for the Big Bang and Big Crunch singularities. Thus, there is some kind of a classical - quantum correspondence, because soft singularities are traversable in classical cosmology, while the strong Big Bang and Big Crunch singularities are not traversable.

  17. Classical and Quantum Big Brake Cosmology for Scalar Field and Tachyonic Models

    Science.gov (United States)

    Kamenshchik, Alexander; Manti, Serena

    2015-01-01

    We study a relation between the cosmological singularities in classical and quantum theory, comparing the classical and quantum dynamics in some models possessing the Big Brake singularity - the model based on a scalar field and two models based on a tachyon-pseudo-tachyon field. It is shown that the effect of quantum avoidance is absent for the soft singularities of the Big Brake type while it is present for the Big Bang and Big Crunch singularities. Thus, there is some kind of a classical - quantum correspondence, because soft singularities are traversable in classical cosmology, while the strong Big Bang and Big Crunch singularities are not traversable.

  18. Classical and quantum Big Brake cosmology for scalar field and tachyonic models

    Energy Technology Data Exchange (ETDEWEB)

    Kamenshchik, A. Yu. [Dipartimento di Fisica e Astronomia and INFN, Via Irnerio 46, 40126 Bologna (Italy) and L.D. Landau Institute for Theoretical Physics of the Russian Academy of Sciences, Kosygin str. 2, 119334 Moscow (Russian Federation); Manti, S. [Scuola Normale Superiore, Piazza dei Cavalieri 7, 56126 Pisa (Italy)

    2013-02-21

    We study a relation between the cosmological singularities in classical and quantum theory, comparing the classical and quantum dynamics in some models possessing the Big Brake singularity - the model based on a scalar field and two models based on a tachyon-pseudo-tachyon field. It is shown that the effect of quantum avoidance is absent for the soft singularities of the Big Brake type while it is present for the Big Bang and Big Crunch singularities. Thus, there is some kind of a classical - quantum correspondence, because soft singularities are traversable in classical cosmology, while the strong Big Bang and Big Crunch singularities are not traversable.

  19. A study to assess COPD Symptom-based Management and to Optimise treatment Strategy in Japan (COSMOS-J) based on GOLD 2011

    Directory of Open Access Journals (Sweden)

    Betsuyaku T

    2013-10-01

    Full Text Available Tomoko Betsuyaku,1 Motokazu Kato,2 Keisaku Fujimoto,3 Gerry Hagan,4 Akihiro Kobayashi,5 Hideki Hitosugi,5 Mark James,5 Paul W Jones6 1Division of Pulmonary Medicine, Department of Medicine, Keio University, Tokyo, Japan; 2Department of Respiratory Disease, Kishiwada City Hospital, Osaka, Japan; 3Department of Clinical Laboratory Sciences, Shinshu University, Nagano, Japan; 4Private Practice, Marbella, Spain; 5GlaxoSmithKline KK, Tokyo, Japan; 6Division of Clinical Science, St George’s, University of London, London, UK. Background and objective: The Global initiative for chronic Obstructive Lung Disease (GOLD) Committee has proposed a COPD assessment framework focused on symptoms and on exacerbation risk. This study will evaluate a symptom and exacerbation risk-based treatment strategy based on GOLD in a real-world setting in Japan. Optimal management of COPD will be determined by assessing symptoms using the COPD Assessment Test (CAT) and by assessing the frequency of exacerbations. Methods: This study (ClinicalTrials.gov identifier: NCT01762800) is a 24-week, multicenter, randomized, double-blind, double-dummy, parallel-group study. It aims to recruit 400 patients with moderate-to-severe COPD. Patients will be randomized to receive treatment with either salmeterol/fluticasone propionate (SFC) 50/250 µg twice daily or with tiotropium bromide 18 µg once daily. Optimal management of patients will be assessed at four-weekly intervals and, if patients remain symptomatic, as measured using the CAT, or experience an exacerbation, they have the option to step up to treatment with both drugs, ie, SFC twice daily and tiotropium once daily (TRIPLE therapy). The primary endpoint of the study will be the proportion of patients who are able to remain on the randomized therapy. Results: No data are available. This paper summarizes the methodology of the study in advance of the study starting. Conclusion: The results of this study will help physicians to understand

  20. Reliability and validity of needle biopsy evaluation of breast abnormalities using the B-categorization – design and objectives of the Diagnosis Optimisation Study (DIOS)

    Directory of Open Access Journals (Sweden)

    Schmidt-Pokrzywniak Andrea

    2007-06-01

    Full Text Available Abstract Background The planned nationwide implementation of mammography screening in 2007 in Germany will increase the occurrence of mammographically detected breast abnormalities. These abnormalities are normally evaluated by minimal invasive core biopsy. To minimize false positive and false negative histological findings, quality assurance of the pathological evaluation of the biopsies is essential. Various guidelines for quality assurance in breast cancer diagnosis recommend applying the B-classification for histopathological categorization. However, to date there are only a few studies that have reported results on the reliability and validity of the B-classification. Therefore, the objectives of our study are to determine the inter- and intraobserver variability (reliability study) and the construct and predictive validity (validity study) of core biopsy evaluation of breast abnormalities. This paper describes the design and objectives of the DIOS Study. Methods/Design All consecutive asymptomatic and symptomatic women with breast imaging abnormalities who are referred to the University Hospital of Halle for core breast biopsy over a period of 24 months are eligible. According to the sample size calculation we need 800 women for the study. All patients in the study population underwent clinical and radiological examination. Core biopsy is performed by stereotactic-, ultrasound- or magnetic resonance (MR) guided automated gun method or vacuum assisted method. The histopathologic agreement (intra- and interobserver) of pathologists and the histopathologic validity will be evaluated. Two reference standards are implemented, a reference pathologist and, in case of suspicious or malignant findings, the histopathologic result of excision biopsy. Furthermore, a self-administered questionnaire, which contains questions about potential risk factors of breast cancer, is sent to the participants approximately two weeks after core biopsy. This enables us to run a case

  1. The Study on the Concept, Framework and Subject Positioning of Big Data Economics

    Institute of Scientific and Technical Information of China (English)

    俞立平

    2015-01-01

    Building on a study of the impact that big data has on economics, this paper first defines the content and structure of big data economics, which comprises big data econometrics, big data statistics and big data applied economics. It then analyses the relationship between big data economics and computer science and technology, software engineering, management science and engineering, statistics, library and information science and archives, psychology, and applied economics, and argues that big data economics is a new interdisciplinary field. Finally, it examines the subject positioning of big data economics, proposing that while the field is in its infancy it should provisionally be treated as a second-level discipline under applied economics, and be promoted to a first-level discipline under economics once it has matured.

  2. Transmit Power Optimisation in Wireless Network

    Directory of Open Access Journals (Sweden)

    Besnik Terziu

    2011-09-01

    Full Text Available Transmit power optimisation in wireless networks based on beamforming has emerged as a promising technique to enhance the spectrum efficiency of present and future wireless communication systems. The aim of this study is to minimise the access point power consumption in cellular networks while maintaining a targeted quality of service (QoS) for the mobile terminals. In this study, the targeted quality of service is delivered to a mobile station by providing a desired level of Signal to Interference and Noise Ratio (SINR). Base-stations are coordinated across multiple cells in a multi-antenna beamforming system. This study focuses on a multi-cell multi-antenna downlink scenario where each mobile user is equipped with a single antenna, but where multiple mobile users may be active simultaneously in each cell and are separated via spatial multiplexing using beamforming. The design criterion is to minimise the total weighted transmitted power across the base-stations subject to SINR constraints at the mobile users. The main contribution of this study is to define an iterative algorithm that is capable of finding the joint optimal beamformers for all base-stations, based on a correlation-based channel model, the full-correlation model. Among all correlated channel models, the correlated channel model used in this study is the most accurate, giving the best performance in terms of power consumption. The environment in this study is chosen to be a Non-Line-of-Sight (NLOS) condition, where a signal from a wireless transmitter passes several obstructions before arriving at a wireless receiver. Moreover, there are many scatterers local to the mobile, and multiple reflections can occur among them before energy arrives at the mobile. The proposed algorithm is based on uplink-downlink duality using the Lagrangian duality theory. Time-Division Duplex (TDD) is chosen as the platform for this study since it has been adopted in the latest technologies in Fourth
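    The study's joint beamforming algorithm based on Lagrangian uplink-downlink duality is involved; as a minimal illustration of the underlying "minimise power subject to per-user SINR targets" structure, the sketch below runs the classic fixed-point power-control iteration for a fixed single-antenna channel. The gains, noise levels and SINR targets are made-up numbers, and this is not the algorithm proposed in the study.

```python
# Toy power minimisation under SINR constraints via the classic fixed-point
# power-control iteration (single antenna per link, made-up channel gains).
import numpy as np

rng = np.random.default_rng(1)
K = 4                                      # number of mobile users / links
G = rng.uniform(0.01, 0.1, size=(K, K))    # G[i, j]: gain from transmitter j to receiver i
np.fill_diagonal(G, rng.uniform(0.8, 1.2, size=K))   # stronger direct links
noise = 1e-3 * np.ones(K)
gamma = np.full(K, 2.0)                    # per-user target SINR

p = np.ones(K)                             # initial transmit powers
for _ in range(500):
    interference = G @ p - np.diag(G) * p  # interference seen by each receiver
    p_new = gamma * (interference + noise) / np.diag(G)
    if np.allclose(p_new, p, rtol=1e-10):
        break
    p = p_new

sinr = np.diag(G) * p / (G @ p - np.diag(G) * p + noise)
print(np.round(p, 6), np.round(sinr, 3))   # minimal feasible powers and achieved SINRs
```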

  3. Approaches and challenges to optimising primary care teams’ electronic health record usage

    Directory of Open Access Journals (Sweden)

    Nancy Pandhi

    2014-07-01

    Full Text Available Background Although the presence of an electronic health record (EHR) alone does not ensure high quality, efficient care, few studies have focused on the work of those charged with optimising use of existing EHR functionality. Objective To examine the approaches used and challenges perceived by analysts supporting the optimisation of primary care teams’ EHR use at a large U.S. academic health care system. Methods A qualitative study was conducted. Optimisation analysts and their supervisor were interviewed and data were analysed for themes. Results Analysts needed to reconcile the tension created by organisational mandates focused on the standardisation of EHR processes with the primary care teams’ demand for EHR customisation. They gained an understanding of health information technology (HIT) leadership’s and primary care team’s goals through attending meetings, reading meeting minutes and visiting with clinical teams. Within what was organisationally possible, EHR education could then be tailored to fit team needs. Major challenges were related to organisational attempts to standardise EHR use despite varied clinic contexts, personnel readiness and technical issues with the EHR platform. Forcing standardisation upon clinical needs that current EHR functionality could not satisfy was difficult. Conclusions Dedicated optimisation analysts can add value to health systems through playing a mediating role between HIT leadership and care teams. Our findings imply that EHR optimisation should be performed with an in-depth understanding of the workflow, cognitive and interactional activities in primary care.

  4. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    Science.gov (United States)

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived-notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record

  5. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  6. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Prof. Shubhada Talegaon

    2015-10-01

    Full Text Available Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from large amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set instead of sampling. This has tremendous implications in areas such as machine learning, pattern recognition and classification, sentiment analysis and social network analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data mining techniques. The purpose of this paper is to review various techniques for analysing data.

  7. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes hard to process using conventional data processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualisation and privacy violations. To spot business trends, anticipate diseases, address conflict, etc., we require larger data sets than the smaller data sets used before. Big data is hard to work with using most relational database management systems and desktop statistics and visualisation packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. This paper surveys the Hadoop architecture, the different tools used for big data and its security issues.

  8. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  9. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  10. Optimisation of logistics processes of energy grass collection

    Science.gov (United States)

    Bányai, Tamás.

    2010-05-01

    The collection of energy grass is a logistics-intensive process [1]. The optimal design and control of transportation and collection subprocesses is a critical point of the supply chain. To avoid ill-founded decisions made purely on the basis of experience and intuition, the optimisation and analysis of collection processes based on mathematical models and methods is the scientifically sound way forward. Within the frame of this work, the author focuses on the optimisation possibilities of the collection processes, especially from the point of view of transportation and related warehousing operations. However, the optimisation methods developed in the literature [2] take into account harvesting processes, county-specific yields, transportation distances, erosion constraints, machinery specifications, and other key variables, but the possibility of multiple collection points and multi-level collection has not been taken into consideration. The possible areas of use of energy grass are very wide (energetic use, biogas and bio-alcohol production, the paper and textile industries, industrial fibre material, fodder, biological soil protection [3], etc.), so not only a single-level but also a multi-level collection system with several collection and production facilities has to be taken into consideration. The input parameters of the optimisation problem are the following: the total amount of energy grass to be harvested in each region; the specific facility costs of collection, warehousing and production units; the specific costs of transportation resources; the pre-scheduling of the harvesting process; specific transportation and warehousing costs; and the pre-scheduling of the processing of energy grass at each facility (exclusive warehousing). The model takes into consideration the following assumptions: (1) cooperative relations exist among processing and production facilities, (2) capacity constraints are not ignored, (3) the cost function of transportation is non-linear, (4) the drivers' conditions are ignored. The
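    As a heavily simplified, hypothetical illustration of the assignment subproblem hidden in such a model, the sketch below linearises the transport cost and solves a single-level region-to-facility transportation problem; the model described above is non-linear and multi-level, so the numbers and structure here are for illustration only.

```python
# Hypothetical single-level, linearised region-to-facility transportation problem.
import numpy as np
from scipy.optimize import linprog

yields = np.array([120.0, 80.0, 150.0])          # t of energy grass per region
capacity = np.array([200.0, 180.0])              # t capacity per processing facility
cost = np.array([[12.0, 20.0],                   # transport cost per t, region -> facility
                 [18.0, 9.0],
                 [10.0, 16.0]])

n_r, n_f = cost.shape
c = cost.ravel()                                  # decision variables x[r, f], flattened
# each region must ship out its full yield
A_eq = np.zeros((n_r, n_r * n_f))
for r in range(n_r):
    A_eq[r, r * n_f:(r + 1) * n_f] = 1.0
# facility capacities must not be exceeded
A_ub = np.zeros((n_f, n_r * n_f))
for f in range(n_f):
    A_ub[f, f::n_f] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=yields, bounds=(0, None))
print(res.x.reshape(n_r, n_f), res.fun)           # optimal shipments and total cost
```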

  11. Review of magnesium hydride-based materials: development and optimisation

    Science.gov (United States)

    Crivello, J.-C.; Dam, B.; Denys, R. V.; Dornheim, M.; Grant, D. M.; Huot, J.; Jensen, T. R.; de Jongh, P.; Latroche, M.; Milanese, C.; Milčius, D.; Walker, G. S.; Webb, C. J.; Zlotea, C.; Yartys, V. A.

    2016-02-01

    Magnesium hydride has been studied extensively for applications as a hydrogen storage material owing to the favourable cost and high gravimetric and volumetric hydrogen densities. However, its high enthalpy of decomposition necessitates high working temperatures for hydrogen desorption while the slow rates for some processes such as hydrogen diffusion through the bulk create challenges for large-scale implementation. The present paper reviews fundamentals of the Mg-H system and looks at the recent advances in the optimisation of magnesium hydride as a hydrogen storage material through the use of catalytic additives, incorporation of defects and an understanding of the rate-limiting processes during absorption and desorption.
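    The temperature problem mentioned above can be oriented with a quick van't Hoff estimate using commonly cited approximate values for MgH2 (not taken from this review); it illustrates why desorption at 1 bar requires roughly 280-300 °C.

```python
# Back-of-the-envelope van't Hoff estimate of the 1 bar desorption temperature of MgH2,
# using commonly cited approximate literature values (assumptions, not from this review):
# ln(p/p0) = -dH/(R*T) + dS/R  =>  at p = p0 = 1 bar, T = dH/dS.
dH = 75.0e3      # J per mol H2, approximate decomposition enthalpy of MgH2
dS = 135.0       # J/(mol K), approximate decomposition entropy
T_1bar = dH / dS
print(f"~{T_1bar:.0f} K (~{T_1bar - 273.15:.0f} deg C)")   # roughly 556 K, i.e. ~280 deg C
```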

  12. Optimisation of biodiesel production by sunflower oil transesterification.

    Science.gov (United States)

    Antolín, G; Tinaut, F V; Briceño, Y; Castaño, V; Pérez, C; Ramírez, A I

    2002-06-01

    In this work, the transesterification of sunflower oil to obtain biodiesel was studied. Taguchi's methodology was chosen for the optimisation of the most important variables (temperature conditions, reactant proportions and methods of purification), with the purpose of obtaining a high-quality biodiesel that fulfils the European pre-legislation with the maximum process yield. Finally, sunflower methyl esters were characterised to test their properties as fuels in diesel engines, such as viscosity, flash point, cold filter plugging point and acid value. Results showed that biodiesel obtained under the optimum conditions is an excellent substitute for fossil fuels.

  13. Optimisation is at the heart of the operation.

    Science.gov (United States)

    Jones, Darren

    2013-11-01

    In our other article based around operating theatres in this issue of HEJ (see pages 64-72), we examine how some of the latest technology is benefiting users, but in this article--with all areas of the NHS charged with reducing energy consumption and cutting carbon emissions--Darren Jones, MD at carbon and energy management specialist, Low Carbon Europe, takes a detailed look, with the help of a 'real-life' case study based on recent experience at London's Heart Hospital, at operating theatre optimisation and HTM 03-01 audits.

  14. Optimising the Target and Capture Sections of the Neutrino Factory

    OpenAIRE

    Hansen, Ole Martin

    2016-01-01

    The Neutrino Factory is designed to produce an intense high energy neutrino beam from stored muons. The majority of the muons are obtained from the decay of pions, produced by a proton beam impinging on a free-flowing mercury-jet target and captured by a high magnetic field. It is important to capture a large fraction of the produced pions to maximize the intensity of the neutrino beam. Various optimisation studies have been performed with the aim of maximising the muon influx to the accel...

  15. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics, with an overview of its content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model is provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have only just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, to explore its strengths, weaknesses, and risks, to discuss emerging trends, tools, and applications, and to stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. For the first time in its history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  16. DARPA's Big Mechanism program

    Science.gov (United States)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  17. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang starts with a motivating overview and guiding questions and then moves from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and narrative. Volume 8 provides an accessible treatment of relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology and bionics.

  18. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang starts with a motivating overview and guiding questions and then moves from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and narrative. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  19. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang starts with a motivating overview and guiding questions and then moves from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and narrative. Volume 5 RG covers the fundamentals (system of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  20. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang starts with a motivating overview and guiding questions and then moves from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and narrative. In addition to an introduction, Volume 7 covers many current aspects of quantum mechanics (e.g. "beaming"/teleportation) and electrodynamics (e.g. electrosmog), as well as climate issues and chaos theory.

  1. DARPA's Big Mechanism program.

    Science.gov (United States)

    Cohen, Paul R

    2015-07-16

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  2. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  3. A cancer research UK pharmacokinetic study of BPA-mannitol in patients with high grade glioma to optimise uptake parameters for clinical trials of BNCT

    Energy Technology Data Exchange (ETDEWEB)

    Cruickshank, G.S. [University of Birmingham and University Hospital Birmingham, Birmingham (United Kingdom)], E-mail: garth.cruickshank@uhb.nhs.uk; Ngoga, D.; Detta, A.; Green, S.; James, N.D.; Wojnecki, C.; Doran, J.; Hardie, J.; Chester, M.; Graham, N.; Ghani, Z. [University of Birmingham and University Hospital Birmingham, Birmingham (United Kingdom); Halbert, G.; Elliot, M.; Ford, S. [CR-UK Formulation Unit, University of Strathclyde, Glasgow (United Kingdom); Braithwaite, R.; Sheehan, T.M.T. [Regional Laboratory for Toxicology, Sandwell and West Birmingham Hospitals Trust, Birmingham (United Kingdom); Vickerman, J.; Lockyer, N. [Surface Analysis Research Centre, University of Manchester, Manchester (United Kingdom); Steinfeldt, H.; Croswell, G. [CR-UK Drug Development Office, London (United Kingdom)] (and others)

    2009-07-15

    This paper describes results to-date from a human pharmacokinetic study which began recruitment in December 2007. Results are presented for a single patient recruited in December 2007. A second patient was recruited in July 2008 but detailed data are not available at the time of writing. The trial is an open-label, non-comparative, non-therapeutic study of BPA-mannitol in patients with high-grade glioma, who will be undergoing stereotactic brain biopsy as part of the diagnostic process before definitive treatment. The study investigates the route of infusion (intra-venous (IV) or intra-carotid artery) and in each case will assess the effect of administration of mannitol as a blood-brain barrier disrupter. All cohorts will receive a 2 h infusion of BPA-mannitol, and for some cohorts an additional mannitol bolus will be administered at the beginning of this infusion. Measurements are made by inductively coupled plasma mass spectrometry (ICP-MS) of ¹⁰B concentration in samples of blood, urine, extra-cellular fluid in normal brain (via a dialysis probe), brain tissue around tumour and tumour tissue. Additional analysis of the tumour tissue is performed using secondary ion mass spectrometry (SIMS). The first patient was part of the cohort having intra-venous infusion without mannitol bolus. No serious clinical problems were experienced and the assay results can be compared with available patient data from other BNCT centres. In particular we note that the peak ¹⁰B concentration in blood was 28.1 mg/ml for a total BPA administration of 350 mg/kg which is very consistent with the previous experience with BPA-fructose reported by the Helsinki group.

  4. A phantom-based JAFROC observer study of two CT reconstruction methods: the search for optimisation of lesion detection and effective dose

    Science.gov (United States)

    Thompson, John D.; Chakraborty, Dev P.; Szczepura, Katy; Vamvakas, Ioannis; Tootell, Andrew; Manning, David J.; Hogg, Peter

    2015-03-01

    Purpose: To investigate the dose-saving potential of iterative reconstruction (IR) in a computed tomography (CT) examination of the thorax. Materials and Methods: An anthropomorphic chest phantom containing various configurations of simulated lesions (5, 8, 10 and 12 mm; +100, -630 and -800 Hounsfield Units, HU) was imaged on a modern CT system over a tube current range (20, 40, 60 and 80 mA). Images were reconstructed with IR and filtered back projection (FBP). An ATOM 701D (CIRS, Norfolk, VA) dosimetry phantom was used to measure organ dose. Effective dose was calculated. Eleven observers (15.11 ± 8.75 years of experience) completed a free-response study, localizing lesions in 544 single CT image slices. A modified jackknife alternative free-response receiver operating characteristic (JAFROC) analysis was completed to look for a significant effect of two factors: reconstruction method and tube current. Alpha was set at 0.05 to control the Type I error in this study. Results: For the modified JAFROC analysis of reconstruction method there was no statistically significant difference in lesion detection performance between FBP and IR when figures-of-merit were averaged over tube current (F(1,10) = 0.08, p = 0.789). For the tube current analysis, significant differences were revealed between multiple pairs of tube current settings (F(3,10) = 16.96, p < 0.05). Conclusion: The free-response study suggests that lesion detection can be optimized at 40 mA in this phantom model, at a measured effective dose of 0.97 mSv. In high-contrast regions the diagnostic value of IR, compared to FBP, is less clear.
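
    For illustration, a simplified Wilcoxon-type figure of merit of the kind the JAFROC family builds on can be computed as below. This is a sketch with hypothetical ratings, not the modified JAFROC analysis used in the study.

      # Simplified JAFROC-type figure of merit: compare each lesion rating on abnormal
      # images against the highest false-positive rating on each normal image
      # (Wilcoxon kernel). Illustrative only; the study uses a modified JAFROC analysis.

      def jafroc_fom(lesion_ratings, normal_max_fp_ratings):
          """lesion_ratings: one confidence rating per lesion (unmarked lesions -> -inf).
          normal_max_fp_ratings: highest false-positive rating per normal image (none -> -inf)."""
          def kernel(lesion, fp):
              if lesion > fp:
                  return 1.0
              if lesion == fp:
                  return 0.5
              return 0.0
          pairs = [(l, f) for l in lesion_ratings for f in normal_max_fp_ratings]
          return sum(kernel(l, f) for l, f in pairs) / len(pairs)

      # Hypothetical ratings on a 1-5 scale; float('-inf') marks lesions/images with no mark.
      print(jafroc_fom([4, 5, 3, float('-inf')], [2, float('-inf'), 3]))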

  5. Big Data Knowledge Mining

    Directory of Open Access Journals (Sweden)

    Huda Umar Banuqitah

    2016-11-01

    Full Text Available The Big Data (BD) era has arrived: big data applications have emerged in which data accumulation has grown beyond the ability of present software tools to capture, manage and process within a tolerably short time. Volume is not the only characteristic that defines big data; velocity, variety and value matter as well. Many resources contain BD that should be processed. The biomedical research literature is one of many domains that hide rich knowledge; MEDLINE is a huge biomedical research database which remains a significantly underutilised source of biological information. Discovering useful knowledge from such a huge corpus raises many problems related to the type of information, such as the related concepts of the domain of the texts and the semantic relationships associated with them. In this paper, a two-level agent-based system for self-supervised relation extraction from MEDLINE using the Unified Medical Language System (UMLS) knowledge base is proposed. The model uses a self-supervised approach for relation extraction (RE) by constructing enhanced training examples from UMLS information together with hybrid text features. The model combines the Apache Spark and HBase BD technologies with multiple data mining and machine learning techniques within a multi-agent system (MAS). The system shows better results than the current state of the art and a naïve approach in terms of accuracy, precision, recall and F-score.
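
    A minimal sketch of the self-supervised labelling idea only (not the paper's Spark/HBase multi-agent system): sentences that mention a concept pair with a known UMLS relation become positive training examples, while partial matches become weak negatives. The relation table and sentences below are hypothetical placeholders.

      # Sketch of self-supervised training-example construction from a known relation table.
      # Concept pairs and sentences are hypothetical stand-ins for UMLS content and MEDLINE text.

      known_relations = {("aspirin", "pain"): "treats"}   # stand-in for UMLS relations

      sentences = [
          "Aspirin is commonly used to relieve mild pain.",
          "Aspirin was first synthesised in the nineteenth century.",
      ]

      def build_examples(sentences, known_relations):
          examples = []
          for sentence in sentences:
              text = sentence.lower()
              for (c1, c2), relation in known_relations.items():
                  if c1 in text and c2 in text:
                      examples.append((sentence, c1, c2, relation))       # positive example
                  elif c1 in text or c2 in text:
                      examples.append((sentence, c1, c2, "no_relation"))  # weak negative
          return examples

      for example in build_examples(sentences, known_relations):
          print(example)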

  6. Supporting Treatment decision making to Optimise the Prevention of STROKE in Atrial Fibrillation: The STOP STROKE in AF study. Protocol for a cluster randomised controlled trial

    Directory of Open Access Journals (Sweden)

    Gattellari Melina

    2012-07-01

    Full Text Available Abstract Background Suboptimal uptake of anticoagulation for stroke prevention in atrial fibrillation has persisted for over 20 years, despite high-level evidence demonstrating its effectiveness in reducing the risk of fatal and disabling stroke. Methods The STOP STROKE in AF study is a national, cluster randomised controlled trial designed to improve the uptake of anticoagulation in primary care. General practitioners from around Australia enrolling in this ‘distance education’ program are mailed written educational materials, followed by an academic detailing session delivered via telephone by a medical peer, during which participants discuss patient de-identified cases. General practitioners are then randomised to receive written specialist feedback about the patient de-identified cases either before or after completing a three-month posttest audit. Specialist feedback is designed to provide participants with support and confidence to prescribe anticoagulation. The primary outcome is the proportion of patients with atrial fibrillation receiving oral anticoagulation at the time of the posttest audit. Discussion The STOP STROKE in AF study aims to evaluate a feasible intervention via distance education to prevent avoidable stroke due to atrial fibrillation. It provides a systematic test of augmenting academic detailing with expert feedback about patient management. Trial registration Australian Clinical Trials Registry Registration Number: ACTRN12611000076976.

  7. Topology optimised planar photonic crystal building blocks

    DEFF Research Database (Denmark)

    Frandsen, Lars Hagedorn; Hede, K. K.; Borel, Peter Ingo

    A photonic crystal waveguide (PhCW) 1x4 splitter has been constructed from PhCW 60° bends [1] and Y-splitters [2] that have been designed individually by utilising topology optimisation [3]. The splitter has been fabricated in a silicon-on-insulator material (Fig. 1) and exhibits broadband splitting … for the TE-polarisation with an average excess loss of 1.55±0.54 dB for a 110 nm bandwidth. The 1x4 splitter demonstrates that individual topology-optimised parts can be used as building blocks to realise high-performance nanophotonic circuits. [1] L. H. Frandsen et al., Opt. Express 12, 5916-5921 (2004); [2] P. I...

  8. Improved Squeaky Wheel Optimisation for Driver Scheduling

    CERN Document Server

    Aickelin, Uwe; Li, Jingpeng

    2008-01-01

    This paper presents a technique called Improved Squeaky Wheel Optimisation (ISWO) for driver scheduling problems. It improves the original Squeaky Wheel Optimisation's effectiveness and execution speed by incorporating two additional steps, Selection and Mutation, which implement evolution within a single solution. In the ISWO, a cycle of Analysis-Selection-Mutation-Prioritization-Construction continues until stopping conditions are reached. The Analysis step first computes the fitness of a current solution to identify troublesome components. The Selection step then discards these troublesome components probabilistically by using the fitness measure, and the Mutation step follows to further discard a small number of components at random. After the above steps, an input solution becomes partial and thus the resulting partial solution needs to be repaired. The repair is carried out by using the Prioritization step to first produce priorities that determine an order by which the following Construction step then schedul...
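
    A toy sketch of the ISWO cycle on a simple assignment problem is given below; the cost matrix, trouble scoring, discard probabilities and greedy repair are illustrative choices, not the paper's driver-scheduling implementation.

      # Toy sketch of the ISWO cycle (Analysis-Selection-Mutation-Prioritisation-Construction)
      # on a simple assignment problem: give each task the cheapest driver still available.
      import random

      random.seed(1)
      COST = [[random.randint(1, 9) for _ in range(6)] for _ in range(6)]  # task x driver costs

      def analyse(solution):
          # trouble score per assigned component: costlier assignment = more troublesome
          return {task: COST[task][driver] for task, driver in solution.items()}

      def select(solution, trouble):
          # probabilistically discard the most troublesome components
          worst = max(trouble.values())
          return {t: d for t, d in solution.items()
                  if random.random() > trouble[t] / (worst + 1)}

      def mutate(solution, rate=0.2):
          # discard a small number of components at random
          return {t: d for t, d in solution.items() if random.random() > rate}

      def prioritise(partial):
          # unassigned tasks first, hardest (costliest best option) first
          missing = [t for t in range(len(COST)) if t not in partial]
          return sorted(missing, key=lambda t: -min(COST[t]))

      def construct(partial, order):
          # greedily repair the partial solution: cheapest unused driver for each task
          solution, used = dict(partial), set(partial.values())
          for task in order:
              driver = min((d for d in range(len(COST)) if d not in used),
                           key=lambda d: COST[task][d])
              solution[task] = driver
              used.add(driver)
          return solution

      solution = construct({}, prioritise({}))
      for _ in range(20):   # Analysis-Selection-Mutation-Prioritisation-Construction cycle
          trouble = analyse(solution)
          partial = mutate(select(solution, trouble))
          solution = construct(partial, prioritise(partial))
      print(sum(COST[t][d] for t, d in solution.items()), solution)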

  9. Buckling optimisation of sandwich cylindrical panels

    Science.gov (United States)

    Abouhamzeh, M.; Sadighi, M.

    2016-06-01

    In this paper, buckling load optimisation is performed on sandwich cylindrical panels. A finite element program is developed in MATLAB to solve the governing differential equations for the global buckling of the structure. In order to find the optimal solution, the genetic algorithm Toolbox in MATLAB is used. Both the buckling finite element code and the genetic algorithm results are verified by comparison with results available in the literature. Sandwich cylindrical panels with isotropic or orthotropic cores are optimised for buckling strength under different boundary conditions. Results are presented in terms of the stacking sequence of fibres in the face sheets and the core-to-face-sheet thickness ratio.
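
    A minimal genetic-algorithm sketch for a stacking-sequence search is shown below, written in Python rather than the MATLAB toolbox used by the authors; the objective function is a made-up surrogate standing in for the buckling load that the finite element model would return.

      # Minimal genetic-algorithm sketch for stacking-sequence optimisation (illustrative only).
      # The fitness function is a made-up surrogate, NOT the real buckling load from the FE model.
      import random

      random.seed(0)
      ANGLES = [0, 45, -45, 90]        # candidate ply orientations
      N_PLIES, POP, GENERATIONS = 8, 30, 40

      def fitness(layup):
          # surrogate objective rewarding alternating plies and 45-degree content
          return sum(1.0 for a, b in zip(layup, layup[1:]) if a != b) + layup.count(45) * 0.1

      def crossover(p1, p2):
          cut = random.randint(1, N_PLIES - 1)
          return p1[:cut] + p2[cut:]

      def mutate(layup, rate=0.1):
          return [random.choice(ANGLES) if random.random() < rate else a for a in layup]

      population = [[random.choice(ANGLES) for _ in range(N_PLIES)] for _ in range(POP)]
      for _ in range(GENERATIONS):
          population.sort(key=fitness, reverse=True)    # maximise the surrogate objective
          parents = population[:POP // 2]
          children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(POP - len(parents))]
          population = parents + children

      best = max(population, key=fitness)
      print("best stacking sequence:", best, "objective:", round(fitness(best), 2))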

  10. Applying the Theory of Optimising Professional Life

    Directory of Open Access Journals (Sweden)

    Lesley Margaret Piko

    2014-12-01

    Full Text Available Glaser (2014) wrote that “the application of grounded theory (GT) is a relatively neglected topic” (p. 1) in the literature. Applying GT to purposely intervene and improve a situation is an important adjunct to our knowledge and understanding of GT. A recent workshop of family doctors and general practitioners provides a useful example. The theory of optimising professional life explains that doctors are concerned about sustainment in their career and, to resolve this concern, they implement solutions to optimise their personal situation. Sustainment is a new, overarching concept of three needs: the need for self-care to sustain well-being, the need for work interest to sustain motivation, and the need for income to sustain lifestyle. The objective of the workshop was to empower doctors to reinvent their careers using this theory. Working individually and in small groups, participants were able to analyse a problem and to identify potential solutions.

  11. Fermionic orbital optimisation in tensor network states

    CERN Document Server

    Krumnow, C; Eisert, J

    2015-01-01

    Tensor network states, and specifically matrix-product states, have proven to be a powerful tool for simulating ground states of strongly correlated spin models. Recently, they have also been applied to interacting fermionic problems, specifically in the context of quantum chemistry. A new freedom arising in such non-local fermionic systems is the choice of orbitals, and it is far from clear which fermionic orbitals to choose. In this work, we propose a way to overcome this challenge. We suggest a method that intertwines the optimisation over matrix product states with suitable fermionic Gaussian mode transformations, hence bringing the advantages of both approaches together. The described algorithm generalises basis changes in the spirit of Hartree-Fock methods to matrix-product states, and provides a black-box tool for basis optimisation in tensor network methods.
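
    The fermionic mode (orbital) transformations referred to above are unitary single-particle rotations; in assumed notation (not taken from the paper), interleaving them with the matrix-product-state optimisation can be summarised as:

      % Orbital rotation of fermionic modes and the correspondingly rotated Hamiltonian
      % (notation assumed, not taken from the paper).
      \tilde{c}_i^{\dagger} \;=\; \sum_{j} U_{ij}\, c_j^{\dagger},
      \qquad U U^{\dagger} = \mathbb{1},
      \qquad
      \tilde{H} \;=\; \mathcal{G}(U)\, H\, \mathcal{G}(U)^{\dagger}

    Here \mathcal{G}(U) denotes the Gaussian unitary implementing the rotation on Fock space; the matrix-product state is optimised in the rotated basis and U is then updated, with the two steps alternating.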

  12. Reducing Medical Admissions into Hospital through Optimising Medicines (REMAIN HOME) Study: protocol for a stepped-wedge, cluster-randomised trial.

    Science.gov (United States)

    Foot, Holly; Freeman, Christopher; Hemming, Karla; Scott, Ian; Coombes, Ian D; Williams, Ian D; Connelly, Luke; Whitty, Jennifer A; Sturman, Nancy; Kirsa, Sue; Nicholson, Caroline; Russell, Grant; Kirkpatrick, Carl; Cottrell, Neil

    2017-04-13

    A model of general practitioner (GP) and pharmacist collaboration in primary care may be an effective strategy to reduce medication-related problems and provide better support to patients after discharge. The aim of this study is to investigate whether a model of structured pharmacist and GP care reduces hospital readmissions in high-risk patients. This protocol details a stepped-wedge, cluster-randomised trial that will recruit participants over 9 months with a 12-month follow-up. There will be 14 clusters, each representing a different general practice medical centre. A total of 2240 participants will be recruited from hospital who attend an enrolled medical centre, take five or more long-term medicines or whose reason for admission was related to heart failure or chronic obstructive pulmonary disease. The intervention is a multifaceted service, involving a pharmacist integrated into a medical centre to assist patients after hospitalisation. Participants will meet with the practice pharmacist and their GP after discharge to review and reconcile their medicines and discuss changes made in hospital. The pharmacist will follow up with the participant and liaise with other health professionals involved in the participant's care. The control will be usual care, which usually involves a patient self-organising a visit to their GP after hospital discharge. The primary outcome is the rate of unplanned, all-cause hospital readmissions over 12 months, which will be analysed using a mixed effects Poisson regression model with a random effect for cluster and a fixed effect to account for any temporal trend. A cost analysis will be undertaken to compare the healthcare costs associated with the intervention to those of usual care. The study has received ethical approval (HREC/16/QRBW/410). The study findings will be disseminated through peer-reviewed publications, conferences and reports to key stakeholders. ACTRN12616001627448.
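
    In assumed notation, the primary analysis model described above (readmission counts with a random cluster effect and a fixed temporal-trend effect) could be written roughly as:

      % Mixed-effects Poisson regression sketch; the offset for follow-up time t_ij is an assumption.
      \log \mathbb{E}[y_{ij}] \;=\; \log t_{ij} + \beta_0 + \beta_1\,\mathrm{intervention}_{ij}
          + \beta_2\,\mathrm{period}_{j} + u_i,
      \qquad u_i \sim \mathcal{N}(0,\,\sigma_u^2)

    where y_{ij} counts unplanned readmissions for participant j in cluster i, period_j indexes the step of the wedge (the temporal trend), and u_i is the random cluster effect.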

  13. Reliability centered maintenance applied to gas turbines, a deeper methodological study; Optimisation de la maintenance par la fiabilite appliquee aux turbines a combustion: approfondissement methodologique

    Energy Technology Data Exchange (ETDEWEB)

    Despujols, A.; Delbos, J.P.; Zuliani, G.

    1995-12-31

    The 9000E combustion turbine study is unique in that it applied the RCM procedure to a power station before construction had begun on it. This has resulted in a maintenance programme being determined from the outset, rather than an existing programme being improved. The analysis begins with a search for the functions required of the unit, its operating states and its failure modes, then moves on to the different systems to establish functional trees followed by failure trees. The lowest leaves of these graphs correspond respectively to functions carried out by important equipment and to their failure modes. Following on from this hierarchical procedure, each item of equipment is studied in an analysis of the failure modes, their effects and their criticality. The absence of operational feedback on this machine, the lack of data on the technology and the small size of the equipment (by comparison with that encountered in a nuclear unit) all contribute to limiting the level of decomposition. Tables are thus obtained which show the failure modes of an item of equipment, their origin, their effects at system and unit level, the potential damage they cause, their severity, expert estimates of the frequency with which they are likely to appear, their criticality and whether they are evident to the operator. The next step is a crucial one since it produces the results expected from the previous stages. It relies on task selection logic which has been refined to better take account of the obvious or hidden character of the failures. Not only must preventive operations be suggested to stop critical failure modes from occurring, but certain operations must also be prescribed in order for breakdowns to be revealed. In practice, the state of redundant or protective equipment has to be known when some of its failure modes remain hidden from the operator; the risk of a double failure with serious consequences becomes real when a breakdown state goes undetected. (authors) 15 refs.
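
    As a rough illustration of the tables described above, criticality is commonly derived from severity and an estimated frequency class, with hidden failures flagged for revealing tasks; the scales, weights and equipment below are hypothetical, not those of the study.

      # Hypothetical FMECA-style scoring: criticality = severity x frequency class, plus a flag
      # for failures that are hidden from the operator. Scales and thresholds are illustrative only.
      FAILURE_MODES = [
          # (equipment, failure mode, severity 1-4, frequency class 1-4, evident to operator?)
          ("fuel valve", "fails to close", 4, 2, False),
          ("igniter", "no spark", 2, 3, True),
      ]

      for equipment, mode, severity, frequency, evident in FAILURE_MODES:
          criticality = severity * frequency
          task = "preventive task" if criticality >= 6 else "run to failure"
          if not evident:
              task += " + periodic test to reveal hidden failure"
          print(f"{equipment:10s} | {mode:15s} | criticality {criticality:2d} | {task}")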

  14. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma.

  15. Adaptive Java Optimisation using machine learning techniques

    OpenAIRE

    Long, Shun

    2004-01-01

    There is a continuing demand for higher performance, particularly in the area of scientific and engineering computation. In order to achieve high performance in the context of frequent hardware upgrading, software must be adaptable for portable performance. What is required is an optimising compiler that evolves and adapts itself to environmental change without sacrificing performance. Java has emerged as a dominant programming language widely used in a variety of application areas. Howeve...

  16. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base is over 6 million lines of code organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and from project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for the build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of
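
    Package-level build parallelism of the kind described above amounts to launching the build of a package as soon as all of its dependencies are built; the generic worker-pool sketch below illustrates the idea and is not CMT's actual implementation (the dependency graph is hypothetical).

      # Generic sketch of package-level build parallelism: build a package as soon as all of
      # its dependencies are built, using a worker pool. This is not CMT's implementation.
      from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED
      import os, time

      # Hypothetical dependency graph: package -> packages it depends on.
      DEPS = {"Core": [], "Event": ["Core"], "Reco": ["Core", "Event"], "Analysis": ["Reco"]}

      def build(package):
          time.sleep(0.1)                   # stand-in for compiling the package
          return package

      def parallel_build(deps, workers=os.cpu_count()):
          built, running = set(), {}
          with ThreadPoolExecutor(max_workers=workers) as pool:
              while len(built) < len(deps):
                  for pkg, requires in deps.items():
                      if pkg not in built and pkg not in running and all(r in built for r in requires):
                          running[pkg] = pool.submit(build, pkg)   # dependencies satisfied
                  done, _ = wait(running.values(), return_when=FIRST_COMPLETED)
                  for future in done:
                      pkg = future.result()
                      built.add(pkg)
                      del running[pkg]
                  print("built:", sorted(built))

      parallel_build(DEPS)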

  17. Optimised polarisation measurements on Bragg peaks

    Energy Technology Data Exchange (ETDEWEB)

    Lelievre-Berna, E. [Institut Laue Langevin, 6 rue Jules Horowitz, 38042 Grenoble Cedex 9 (France)]. E-mail: lelievre@ill.fr; Brown, P.J. [Institut Laue Langevin, 6 rue Jules Horowitz, 38042 Grenoble Cedex 9 (France); Tasset, F. [Institut Laue Langevin, 6 rue Jules Horowitz, 38042 Grenoble Cedex 9 (France)

    2007-07-15

    Experimentally, the asymmetry A (or the flipping ratio R) is deduced from the two count rates observed for the |+> and |-> neutron spin states. Since the count rates for the two spin states may be quite different, and both need to be corrected for background, the optimum strategy for the measurement is important. We present here the theory for optimising the accuracy of the measurement of A (or R) within the constraint of a fixed total measuring time.
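
    For reference, the standard relations between the background-corrected count rates N+ and N-, the flipping ratio and the asymmetry are:

      % Flipping ratio and asymmetry from background-corrected count rates (standard definitions).
      R \;=\; \frac{N^{+}}{N^{-}},
      \qquad
      A \;=\; \frac{N^{+} - N^{-}}{N^{+} + N^{-}} \;=\; \frac{R - 1}{R + 1}

    The optimisation question addressed in the paper is then how to divide the fixed total measuring time between the two spin states (and the background measurements) so as to minimise the uncertainty on A or R.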

  18. Implant Optimisation for Primary Hip Replacement in Patients over 60 Years with Osteoarthritis: A Cohort Study of Clinical Outcomes and Implant Costs Using Data from England and Wales.

    Directory of Open Access Journals (Sweden)

    Simon S Jameson

    Full Text Available Hip replacement is one of the most commonly performed surgical procedures worldwide; hundreds of implant configurations provide options for femoral head size, joint surface material and fixation method, with dramatically varying costs. Robust comparative evidence to inform the choice of implant is needed. This retrospective cohort study uses linked national databases from England and Wales to determine the optimal type of replacement for patients over 60 years undergoing hip replacement for osteoarthritis. Implants included were the commonest brand from each of the four types of replacement (cemented, cementless, hybrid and resurfacing); the reference prosthesis was the cemented hip procedure. Patient-reported outcome scores (PROMs), costs and risk of repeat (revision) surgery were examined. Multivariable analyses included analysis of covariance to assess improvement in PROMs (Oxford hip score, OHS, and EQ5D index; 9159 linked episodes) and competing-risks modelling of implant survival (79,775 procedures). Costs of implants and ancillary equipment were obtained from National Health Service procurement data. EQ5D score improvements (at 6 months) were similar for all hip replacement types. In females, revision risk was significantly higher for cementless hip prostheses (hazard ratio, HR = 2.22, p<0.001) when compared to the reference hip. Although improvement in OHS was statistically higher (22.1 versus 20.5, p<0.001) for cementless implants, this small difference is unlikely to be clinically important. In males, revision risk was significantly higher for cementless (HR = 1.95, p = 0.003) and resurfacing implants (HR = 3.46, p<0.001), with no differences in OHS. Material costs were lowest with the reference implant (cemented, range £1103 to £1524) and highest with cementless implants (£1928 to £4285). Limitations include the design of the study, which is intrinsically vulnerable to omitted variables, a paucity of long-term implant survival data (reflecting the

  19. Testing the credibility, feasibility and acceptability of an optimised behavioural intervention (OBI) for avoidant chronic low back pain patients: protocol for a randomised feasibility study.

    Science.gov (United States)

    Pincus, Tamar; Anwar, Shamaila; McCracken, Lance; McGregor, Alison; Graham, Liz; Collinson, Michelle; Farrin, Amanda J

    2013-06-13

    Chronic back pain continues to be a costly and prevalent condition. The latest NICE guidelines, issued in 2009, state that for patients with persistent back pain (of between six weeks and twelve months duration) who are highly distressed and/or disabled, and for whom exercise, manual therapy and acupuncture have not been beneficial, the evidence supports a combination of around 100 hours of combined physical and psychological treatment. This is costly and may prove unacceptable to many patients. A key recommendation of these guidelines was for further randomised controlled trials (RCTs) of psychological treatment and for targeting treatment to specific sub-groups of patients. Recent trials that have included psychological interventions have shown only moderate improvement at best, and results are not maintained long term. There is therefore a need to test theoretically driven interventions that focus on specific high-risk sub-groups, in which the intervention is delivered at full integrity against a credible control. This is a feasibility study of a pragmatic randomised controlled trial comparing psychologist-delivered Contextual Cognitive Behavioural Therapy (CCBT) against Treatment As Usual (TAU) physiotherapy delivered by physiotherapists for the treatment of chronic lower back pain in 'avoidant' patients. Ninety-two patients referred for physiotherapy will be recruited and randomised on a 1:1 basis to receive CCBT or TAU. Treatment groups will be balanced by centre and pain interference score. Primary outcomes include assessing the credibility and acceptability of the intervention, and demonstrating proof of principle through a greater change in pain acceptance in the CCBT arm, measured by the Acceptance and Action Questionnaire-II and the Chronic Pain Acceptance Questionnaire. In addition, the feasibility of carrying out a full trial will be explored with reference to recruitment and follow-up rates, including assessment of the burden of outcome measure completion. Secondary

  20. System optimisation for automatic wood-fired heating systems; Systemoptimierung automatischer Holzheizung - Projektphase 1

    Energy Technology Data Exchange (ETDEWEB)

    Good, J.; Nussbaumer, T. [Verenum, Zuerich (Switzerland); Jenni, A. [Ardens GmbH, Liestal (Switzerland); Buehler, R. [Umwelt und Energie, Maschwanden (Switzerland)

    2002-07-01

    This final report for the Swiss Federal Office of Energy (SFOE) presents the results of the first phase of a project that aims to optimise the performance of existing automatic wood-fired heating systems in the 330 kW to 1 MW range from the ecological and economic points of view. In this initial phase, five selected installations were optimised in order to assess the potential for optimisation in general. The study looks at plant efficiency in both heat generation and distribution. The report presents details of measured factors such as operating hours, heat distribution, control strategies, fuel-quality requirements, integration in heating systems and safety aspects, and compares delivered power with rated power. The authors consider the potential for optimisation to be high and suggest optimisation targets concerning consumer density in district heating schemes, full-load operating hours, minimum yearly operational efficiency and the control of heating power.