WorldWideScience

Sample records for large scientific problem

  1. Speedup predictions on large scientific parallel programs

    International Nuclear Information System (INIS)

    Williams, E.; Bobrowicz, F.

    1985-01-01

How much speedup can we expect for large scientific parallel programs running on supercomputers? For insight into this problem we extend the parallel processing environment currently existing on the Cray X-MP (a shared-memory multiprocessor with at most four processors) to a simulated N-processor environment, where N ≥ 1. Several large scientific parallel programs from Los Alamos National Laboratory were run in this simulated environment, and speedups were predicted. A speedup of 14.4 on 16 processors was measured for one of the three most used codes at the Laboratory.
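Speedup predictions of this kind are usually read against Amdahl's law, which bounds the speedup of a program whose serial fraction does not shrink with processor count. A minimal sketch in Python (the law is standard background rather than the authors' method, and the 99%-parallel figure is a hypothetical illustration):

```python
def amdahl_speedup(parallel_fraction: float, n_procs: int) -> float:
    """Predicted speedup when a fraction of the work parallelizes
    perfectly over n_procs processors and the remainder stays serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_procs)

# A hypothetical code whose runtime is 99% parallelizable:
predicted = amdahl_speedup(0.99, 16)
```

Under this model a 99%-parallel code reaches about 13.9x on 16 processors, the same neighbourhood as the 14.4 measured for the Los Alamos code.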

  2. Problems of scientific research in developing countries

    International Nuclear Information System (INIS)

    Vose, P.B.; Cervellini, A.

    1983-01-01

The paper gives a general consideration of the problems encountered in scientific research in developing countries. Possible long-term optimizations, as well as short-term strategies, are pointed out.

  3. Scientific perspectives on greenhouse problem. Part 2

    International Nuclear Information System (INIS)

    Jastrow, R.; Nierenberg, W.; Seitz, F.

    1992-01-01

The spectre of major climate change caused by the greenhouse effect has generated intensive research, heated scientific debate and a concerted international effort to draft agreements for the reduction of greenhouse gas emissions. This report on scientific perspectives on the greenhouse problem explains the technical issues in the debate in language readily understandable to the non-specialist. The inherent complexities of attempts to simulate the earth's climate are explained, particularly with regard to the effects of clouds and the circulation of the oceans, which together represent the largest factors of uncertainty in current global warming forecasts. Results of the search for the 'greenhouse signal' in existing climate records are described in chapter 3 (part two). Chapter 5 (part two) develops a projection of 21st-century warming based on relatively firm evidence of the earth's actual response to known increases in greenhouse gas emissions during the last 100 years.

  4. Fundamental Scientific Problems in Magnetic Recording

    Energy Technology Data Exchange (ETDEWEB)

    Schulthess, T.C.; Miller, M.K.

    2007-06-27

Magnetic data storage technology is presently leading the high-tech industry in advancing device integration, doubling the storage density every 12 months. To continue these advancements and to achieve terabit-per-square-inch recording densities, new approaches to storing and accessing data will be needed in about 3-5 years. In this project, a collaboration between Oak Ridge National Laboratory (ORNL), the Center for Materials for Information Technology (MINT) at the University of Alabama (UA), Imago Scientific Instruments, and Seagate Technologies was undertaken to address the fundamental scientific problems confronting the industry in meeting the upcoming challenges. The focus areas of this study were to: (1) develop atom probe tomography for atomic-scale imaging of the magnetic heterostructures used in magnetic data storage technology; (2) develop first-principles-based tools for the study of exchange bias, aimed at finding new antiferromagnetic materials to reduce the thickness of the pinning layer in the read head; (3) develop high-moment magnetic materials and tools to study magnetic switching in nanostructures, aimed at developing improved writers for high-anisotropy magnetic storage media.

  5. Shutdown problems in large tokamaks

    International Nuclear Information System (INIS)

    Weldon, D.M.

    1978-01-01

    Some of the problems connected with a normal shutdown at the end of the burn phase (soft shutdown) and with a shutdown caused by disruptive instability (hard shutdown) have been considered. For a soft shutdown a cursory literature search was undertaken and methods for controlling the thermal wall loading were listed. Because shutdown computer codes are not widespread, some of the differences between start-up codes and shutdown codes were discussed along with program changes needed to change a start-up code to a shutdown code. For a hard shutdown, the major problems are large induced voltages in the ohmic-heating and equilibrium-field coils and high first wall erosion. A literature search of plasma-wall interactions was carried out. Phenomena that occur at the plasma-wall interface can be quite complicated. For example, material evaporated from the wall can form a virtual limiter or shield protecting the wall from major damage. Thermal gradients that occur during the interaction can produce currents whose associated magnetic field also helps shield the wall

  6. Key scientific problems from Cosmic Ray History

    Science.gov (United States)

    Lev, Dorman

    2016-07-01

Recently the monograph "Cosmic Ray History" by Lev Dorman and Irina Dorman was published (Nova Publishers, New York). What does the history of cosmic rays teach us, and what key scientific problems does it raise? 1. Like many great discoveries, the phenomenon of cosmic rays was discovered accidentally, during investigations that sought to answer another question: what are the sources of air ionization? This problem became interesting for science about 230 years ago, at the end of the 18th century, when physics was confronted with the problem of the leakage of electrical charge from very well insulated bodies. 2. At the beginning of the 20th century, in connection with the discovery of natural radioactivity, it became apparent that this problem was largely solved: it was widely accepted that the main sources of air ionization were the α, β, and γ radiations from radioactive substances in the ground (γ-radiation was considered the most important cause, because α- and β-radiations are rapidly absorbed in the air). 3. The generally accepted but wrong opinion that ground radioactivity was the main source of air ionization prevented the German meteorologist Franz Linke from drawing the correct conclusion from correct measurements. In fact, he made 12 balloon flights in 1900-1903 during his PhD studies at Berlin University, carrying an electroscope to a height of 5500 m. The PhD thesis was not published, but in it he concludes: "Were one to compare the presented values with those on ground, one must say that at 1000 m altitude the ionization is smaller than on the ground, between 1 and 3 km the same amount, and above it is larger with values increasing up to a factor of 4 (at 5500 m). The uncertainties in the observations only allow the conclusion that the reason for the ionization has to be found first in the Earth." Nobody later quoted Franz Linke: although he had made the right measurements, he had reached the wrong conclusion, and the discovery of cosmic rays came only about 10 years later. 4. Victor Hess, a

  7. Problems of information support in scientific research

    Science.gov (United States)

    Shamaev, V. G.; Gorshkov, A. B.

    2015-11-01

This paper reports on the creation of the open-access Akustika portal (AKDATA.RU), designed to provide easy-to-read and searchable Russian-language information on acoustics and related topics. The absence of a Russian-language publication from foreign databases means that it is effectively lost to much of the scientific community. The portal has three interrelated sections: the Akustika information search system (ISS) (Acoustics), the full-text archive of the Akusticheskii Zhurnal (Acoustic Journal), and 'Signal'naya informatsiya' ('Signaling information') on acoustics. The paper presents a description of the Akustika ISS, including its structure, content, interface, and information search capabilities for basic and applied research in diverse areas of science, engineering, biology, medicine, etc. The intended users of the portal are physicists, engineers, and engineering technologists interested in expanding their research activities and seeking to increase their knowledge base. Those studying current trends in the Russian-language contribution to international science may also find the portal useful.

  8. Examining the Relationship of Scientific Reasoning with Physics Problem Solving

    Science.gov (United States)

    Fabby, Carol; Koenig, Kathleen

    2015-01-01

    Recent research suggests students with more formal reasoning patterns are more proficient learners. However, little research has been done to establish a relationship between scientific reasoning and problem solving abilities by novices. In this exploratory study, we compared scientific reasoning abilities of students enrolled in a college level…

  9. INTEGRATION OF UKRAINIAN INDUSTRY SCIENTIFIC PERIODICALS INTO WORLD SCIENTIFIC INFORMATION SPACE: PROBLEMS AND SOLUTIONS

    Directory of Open Access Journals (Sweden)

    T. O. Kolesnykova

    2013-11-01

Full Text Available Purpose. The lack of representation of the publications of Ukrainian scientists, including transport scientists, in the international scientometric databases is an urgent problem for Ukrainian science. To solve the problem one should study the structure and quality of the information flow of the scientific periodicals of the railway universities of Ukraine and determine an algorithm for integrating the scientific publications of Ukrainian scientists into the world scientific information space. Methodology. Applying the methods of scientific analysis, synthesis, analogy, comparison and prediction, the author has investigated the problem of the distribution of scientific knowledge through formal communications. The readiness of Ukrainian railway periodicals for the registration procedure in the international scientometric systems was analyzed. The level of representation of articles and authors from Ukrainian railway universities in the scientometric database Scopus was studied. Findings. Monitoring of the portals of the railway industry universities of Ukraine and the sites of their scientific periodicals, and analysis of the obtained data, prove the insufficient readiness of most scientific publications for submission to a scientometric database. Ways of providing sufficient "visibility" of the industry periodicals of Ukrainian universities in the global scientific information space were proposed. Originality. The structure and quality of the documentary flow of scientific periodicals in the railway transport universities of Ukraine, and its reflection in the scientometric DB Scopus, were investigated for the first time. The basic directions of university activities to integrate the results of transport scientists' research into the global scientific digital environment were outlined. The leading role of university libraries in the processes of integrating the scientific documentary resources of universities into the global scientific and information communicative space was determined. Practical value. Implementation of the proposed

  10. Application of Logic Models in a Large Scientific Research Program

    Science.gov (United States)

    O'Keefe, Christine M.; Head, Richard J.

    2011-01-01

    It is the purpose of this article to discuss the development and application of a logic model in the context of a large scientific research program within the Commonwealth Scientific and Industrial Research Organisation (CSIRO). CSIRO is Australia's national science agency and is a publicly funded part of Australia's innovation system. It conducts…

  11. Scientific and technical problems in the production of coke

    Energy Technology Data Exchange (ETDEWEB)

    Glushchenko, I M

    1979-05-01

    This paper lists several scientific and technological problems facing the producers of coke in the future. The demand for coke on the world market is steadily increasing despite the efforts of metallurgists to find non-coke methods of ore smelting. The major problems are held to be the gap between very successful results of research and the lack of their application to industry, unsatisfactory coke oven construction, problems in quenching and imperfections in the formed coke process. (In Russian)

  12. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.
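The derivative calculation the abstract refers to can be illustrated on the smallest possible "structure": a single linear spring, for which the analytic design sensitivity can be checked against a central finite difference. This toy model and its numbers are hypothetical, not taken from the paper:

```python
def tip_displacement(stiffness: float, load: float = 1000.0) -> float:
    """Static displacement of a single linear spring, u = F/k."""
    return load / stiffness

def analytic_sensitivity(stiffness: float, load: float = 1000.0) -> float:
    """Exact design sensitivity du/dk = -F/k**2."""
    return -load / stiffness ** 2

def fd_sensitivity(stiffness: float, h: float = 1e-4) -> float:
    """Central finite-difference approximation of du/dk."""
    return (tip_displacement(stiffness + h)
            - tip_displacement(stiffness - h)) / (2 * h)

k = 50.0
exact = analytic_sensitivity(k)   # -1000 / 50**2 = -0.4
approx = fd_sensitivity(k)
```

For framed structures the same comparison is made on the assembled stiffness matrix; the toy case just makes the agreement between the analytic and finite-difference derivatives easy to verify.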

  13. Advanced I/O for large-scale scientific applications

    International Nuclear Information System (INIS)

    Klasky, Scott; Schwan, Karsten; Oldfield, Ron A.; Lofstead, Gerald F. II

    2010-01-01

    As scientific simulations scale to use petascale machines and beyond, the data volumes generated pose a dual problem. First, with increasing machine sizes, the careful tuning of IO routines becomes more and more important to keep the time spent in IO acceptable. It is not uncommon, for instance, to have 20% of an application's runtime spent performing IO in a 'tuned' system. Careful management of the IO routines can move that to 5% or even less in some cases. Second, the data volumes are so large, on the order of 10s to 100s of TB, that trying to discover the scientifically valid contributions requires assistance at runtime to both organize and annotate the data. Waiting for offline processing is not feasible due both to the impact on the IO system and the time required. To reduce this load and improve the ability of scientists to use the large amounts of data being produced, new techniques for data management are required. First, there is a need for techniques for efficient movement of data from the compute space to storage. These techniques should understand the underlying system infrastructure and adapt to changing system conditions. Technologies include aggregation networks, data staging nodes for a closer parity to the IO subsystem, and autonomic IO routines that can detect system bottlenecks and choose different approaches, such as splitting the output into multiple targets, staggering output processes. Such methods must be end-to-end, meaning that even with properly managed asynchronous techniques, it is still essential to properly manage the later synchronous interaction with the storage system to maintain acceptable performance. Second, for the data being generated, annotations and other metadata must be incorporated to help the scientist understand output data for the simulation run as a whole, to select data and data features without concern for what files or other storage technologies were employed. All of these features should be attained while
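The staging idea sketched above (compute processes hand buffers to dedicated IO workers so the simulation never blocks on storage, then synchronize at the end) can be caricatured in a few lines of Python with a queue and a writer thread. This is a hedged illustration of the pattern only; real systems of the kind described use dedicated staging nodes and fast interconnects, not threads:

```python
import queue
import threading

def run_simulation(n_steps: int) -> list:
    """Toy compute loop that hands each step's output to a staging
    worker instead of writing synchronously."""
    staged = queue.Queue()
    written = []

    def io_worker():
        # Drains staged buffers; in a real system this is the part
        # that talks to the parallel file system.
        while True:
            buf = staged.get()
            if buf is None:
                break
            written.append(buf)

    writer = threading.Thread(target=io_worker)
    writer.start()
    for step in range(n_steps):
        data = [step] * 4      # stand-in for a simulation buffer
        staged.put(data)       # returns immediately; compute continues
    staged.put(None)           # sentinel: flush and stop the writer
    writer.join()              # end-to-end: wait until storage is done
    return written

out = run_simulation(3)
```

The final `join` is the "properly managed synchronous interaction" the abstract insists on: asynchronous staging only helps if the eventual handoff to storage is still completed and accounted for.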

  14. The Large-Scale Structure of Scientific Method

    Science.gov (United States)

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  15. Modern algorithms for large sparse eigenvalue problems

    International Nuclear Information System (INIS)

    Meyer, A.

    1987-01-01

    The volume is written for mathematicians interested in (numerical) linear algebra and in the solution of large sparse eigenvalue problems, as well as for specialists in engineering, who use the considered algorithms in the investigation of eigenoscillations of structures, in reactor physics, etc. Some variants of the algorithms based on the idea of a gradient-type direction of movement are presented and their convergence properties are discussed. From this, a general strategy for the direct use of preconditionings for the eigenvalue problem is derived. In this new approach the necessity of the solution of large linear systems is entirely avoided. Hence, these methods represent a new alternative to some other modern eigenvalue algorithms, as they show a slightly slower convergence on the one hand but essentially lower numerical and data processing problems on the other hand. A brief description and comparison of some well-known methods (i.e. simultaneous iteration, Lanczos algorithm) completes this volume. (author)
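The "gradient-type direction of movement" the volume describes can be sketched as steepest descent on the Rayleigh quotient: step against the residual A x − ρx and renormalize, so no large linear system is ever solved. A pure-Python toy (the matrix, step size, and iteration count are hypothetical choices, not from the book):

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def rayleigh(A, x):
    Ax = matvec(A, x)
    return sum(xi * yi for xi, yi in zip(x, Ax)) / sum(xi * xi for xi in x)

def gradient_eig(A, x, tau=0.3, iters=500):
    """Gradient-type iteration for the smallest eigenvalue: move against
    the Rayleigh-quotient gradient (the residual A x - rho x) and
    renormalize -- no linear system is solved at any point."""
    for _ in range(iters):
        rho = rayleigh(A, x)
        Ax = matvec(A, x)
        x = [xi - tau * (axi - rho * xi) for xi, axi in zip(x, Ax)]
        norm = sum(xi * xi for xi in x) ** 0.5
        x = [xi / norm for xi in x]
    return rayleigh(A, x), x

A = [[2.0, 1.0, 0.0],
     [1.0, 2.0, 1.0],
     [0.0, 1.0, 2.0]]
lam, vec = gradient_eig(A, [1.0, 1.0, 1.0])
```

For this tridiagonal matrix the iteration converges to the smallest eigenvalue 2 − √2 ≈ 0.586; preconditioning, as discussed in the volume, replaces the raw residual step to accelerate exactly this kind of iteration.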

  16. Technologies for Large Data Management in Scientific Computing

    CERN Document Server

    Pace, A

    2014-01-01

In recent years, intense usage of computing has been the main strategy of investigation in several scientific research projects. The progress in computing technology has opened unprecedented opportunities for the systematic collection of experimental data and the associated analysis that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, the long-term preservation, and the worldwide distribution of the large quantities of data that are necessary in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.

  17. On-demand Overlay Networks for Large Scientific Data Transfers

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Guok, Chin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jackson, Keith [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kissel, Ezra [Univ. of Delaware, Newark, DE (United States); Swany, D. Martin [Univ. of Delaware, Newark, DE (United States); Agarwal, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2009-10-12

Large-scale scientific data transfers are central to scientific processes. Data from large experimental facilities have to be moved to local institutions for analysis, and data often needs to be moved between local clusters and large supercomputing centers. In this paper, we propose and evaluate a network overlay architecture to enable high-throughput, on-demand, coordinated data transfers over wide-area networks. Our work leverages Phoebus and the On-demand Secure Circuits and Advance Reservation System (OSCARS) to provide high-performance wide-area network connections. OSCARS enables dynamic provisioning of network paths with guaranteed bandwidth, and Phoebus enables the coordination and effective utilization of the OSCARS network paths. Our evaluation shows that this approach leads to improved end-to-end data transfer throughput with minimal overheads. The achieved throughput using our overlay was limited only by the ability of the end hosts to sink the data.

  18. Problems of future philologists’ training in modern scientific discourse

    Directory of Open Access Journals (Sweden)

    Maryna Ikonnikova

    2017-04-01

Full Text Available Philosophical, psychological-pedagogical and sociolinguistic projections of future philologists' professional training have been studied in the paper. It has been determined that they provide for creating optimal conditions for learning language, literature, translation, etc.; stimulating the speech and mental activity of students; developing their critical thinking skills, linguistic personality, multiple intelligences, and the ability to model conceptual information; and widening the knowledge space, taking into account individual styles and strategies of student learning. It has been indicated that within foreign scientific discourse, scholars focus on the problem of training philologists of an integrated type, which is possible provided the methodology is scientifically justified, based on significant achievements of philosophy, psychology, pedagogy, linguodidactics, sociolinguistics and culturology, and oriented toward European requirements for language education, positive foreign experience and national traditions. Key words: future philologists, professional training, philological education, philology, scientific discourse.

  19. Doing physics with scientific notebook a problem solving approach

    CERN Document Server

    Gallant, Joseph

    2012-01-01

    The goal of this book is to teach undergraduate students how to use Scientific Notebook (SNB) to solve physics problems. SNB software combines word processing and mathematics in standard notation with the power of symbolic computation. As its name implies, SNB can be used as a notebook in which students set up a math or science problem, write and solve equations, and analyze and discuss their results. Written by a physics teacher with over 20 years experience, this text includes topics that have educational value, fit within the typical physics curriculum, and show the benefits of using SNB.

  20. Insufficiencies in solving global environmental problems by specialized scientific expertise

    International Nuclear Information System (INIS)

    Hartwig, S.G.; Kra, R.S.

    1989-01-01

A most paradoxical and urgent problem faces the world today. We find ourselves between the horns of a dilemma: one horn represents the accelerating demand for energy, and the other the irreversible degradation of our natural environment. There are two directions that we can take to solve our growing global crisis. The first step is to encourage scientific specialists to think in broader terms. The second necessary approach is to make decision makers aware of the complexity of the situation, as well as the dangers of the tunnel vision that experts often fall into. Therefore, to find a long-term holistic solution, decision makers, be they government officials or academics, must themselves be solution-oriented and capable of directing scientists along broadened problem-solving pathways. Up till now, scientists have been required to research environmental problems, discover causal associations and determine effects. Contemporary scientists, in the truest sense of the meaning, are no longer generalists but are specialists in their own fields, with great depth and accuracy of knowledge. However, experts of high standing may have difficulty visualizing adjacent sciences, which causes them to lose sight of topics peripheral to their main field of interest. The consequence of this can be that solutions to a problem will be sought only within particular and specialized areas, but it is, unfortunately, a fact of life that environmental problems do not come neatly packaged in scientific disciplines: they happen in their entirety, with all their synergistic implications. 5 refs., 5 figs

  1. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  2. From normative towards positive epistemology: Problem of scientific fact

    Directory of Open Access Journals (Sweden)

    Brdar Milan

    2009-01-01

Full Text Available In this article the author provides a sketch of the twist from normative towards descriptive epistemology that resulted from the transformation of contemporary philosophy during the linguistic turn. The best illustrations of this epistemic twist are given in the theories of Karl Popper, Ludwik Fleck and Thomas Kuhn. These three theories determined the deconstruction of traditional philosophy of science and of the traditional picture of science and its practice. At the same time, by the same twist towards descriptive and historical epistemology, some core problems of epistemology and methodology are posed in a sharper form. This concerns mainly the problem of scientific statements and the procedures for establishing their objectivity and truthfulness. For in all three theories, those of Popper, Fleck and Kuhn, empirical statements are dependent on a theoretical framework, thought style or paradigm, so the problem of truthfulness and objective justification shows itself within the Münchhausen trilemma. The task of positive epistemology is not to prescribe some procedure, but to provide a reconstruction of the real procedures found in the practice of scientific communities. That means showing how they resolve the problem of the epistemic foundation of their theories and how they provide justified reasons to defend their theories as truthful and objective.

  3. Bayesian nonlinear regression for large p small n problems

    KAUST Repository

    Chakraborty, Sounak; Ghosh, Malay; Mallick, Bani K.

    2012-01-01

Statistical modeling and inference problems with sample sizes substantially smaller than the number of available covariates are challenging. This is known as the large p, small n problem. Furthermore, the problem is more complicated when we have multiple correlated responses. We develop multivariate nonlinear regression models in this setup for accurate prediction. In this paper, we introduce a full Bayesian support vector regression model with Vapnik's ε-insensitive loss function, based on reproducing kernel Hilbert spaces (RKHS), under the multivariate correlated response setup. This provides a full probabilistic description of the support vector machine (SVM) rather than an algorithm for fitting purposes. We have also introduced a multivariate version of the relevance vector machine (RVM). Instead of the original treatment of the RVM relying on the use of type II maximum likelihood estimates of the hyper-parameters, we put a prior on the hyper-parameters and use a Markov chain Monte Carlo technique for computation. We have also proposed an empirical Bayes method for our RVM and SVM. Our methods are illustrated with a prediction problem in near-infrared (NIR) spectroscopy. A simulation study is also undertaken to check the prediction accuracy of our models. © 2012 Elsevier Inc.


  5. Ethics in scientific communication: study of a problem case.

    Science.gov (United States)

    Berger, R L

    1994-12-01

The hypothermia experiments performed on humans during the Second World War at the German concentration camp in Dachau have been regarded as crimes against humanity, disguised as medical research. For almost 50 years, scientists maintained that the study produced valuable, even if not totally reliable, information. In recent years, the results from the Dachau hypothermia project were glamorized with life-saving potential and a heated ethical dialogue was activated about the use of life-saving but tainted scientific information. In the wake of the debate, an in-depth examination of the scientific rigour of the project was performed and revealed that neither the science nor the scientists from Dachau could be trusted and that the data were worthless. The body of medical opinion accepted the unfavourable determination, but a few scientists and ethicists have continued to endorse the validity of at least parts of the Dachau hypothermia data. The conduct of the scientific communications about the Dachau hypothermia experiments by the scientific and ethical communities invites serious consideration of a possible ethical misadventure. It appears that for almost 50 years, the results of the study had been endorsed without careful examination of the scientific basis of the experiments, and that secondary citation of relevant original material may have been commonly employed. These infractions contributed to a myth that good science was practised by the Nazis at Dachau. The more recent emphasis on the life-saving potential of the Dachau data, without citation of credible supporting evidence, has also been misleading. Similarly, acceptance of a determination by an in-depth examination that the 'whole' Dachau project is flawed, with simultaneous endorsement of the validity of 'parts' of the results, poses an ethical problem. It is advisable that before seeking ethical consultation about the use of unethically obtained data, scientists should examine the quality of science behind

  6. Computational approach to large quantum dynamical problems

    International Nuclear Information System (INIS)

    Friesner, R.A.; Brunet, J.P.; Wyatt, R.E.; Leforestier, C.; Binkley, S.

    1987-01-01

    The organizational structure is described for a new program that permits computations on a variety of quantum mechanical problems in chemical dynamics and spectroscopy. Particular attention is devoted to developing and using algorithms that exploit the capabilities of current vector supercomputers. A key component in this procedure is the recursive transformation of the large sparse Hamiltonian matrix into a much smaller tridiagonal matrix. An application to time-dependent laser molecule energy transfer is presented. Rate of energy deposition in the multimode molecule for systematic variations in the molecular intermode coupling parameters is emphasized
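The recursive reduction of a large sparse Hamiltonian to a much smaller tridiagonal matrix described here is the Lanczos recursion, which needs only matrix-vector products. A pure-Python sketch on a hypothetical 3×3 symmetric matrix (the real codes apply the same recursion to very large sparse operators):

```python
def lanczos(A, q0, k):
    """Lanczos recursion: reduces symmetric A to a k-by-k tridiagonal
    matrix, returned as its diagonal entries `alphas` and off-diagonal
    entries `betas`, using only matrix-vector products."""
    n = len(A)
    norm = sum(x * x for x in q0) ** 0.5
    q = [x / norm for x in q0]
    q_prev = [0.0] * n
    beta = 0.0
    alphas, betas, basis = [], [], []
    for _ in range(k):
        basis.append(q)
        w = [sum(A[i][j] * q[j] for j in range(n)) for i in range(n)]
        alpha = sum(wi * qi for wi, qi in zip(w, q))
        alphas.append(alpha)
        # Three-term recurrence: orthogonalize against the two latest vectors.
        w = [wi - alpha * qi - beta * pi for wi, qi, pi in zip(w, q, q_prev)]
        beta = sum(wi * wi for wi in w) ** 0.5
        if beta < 1e-12:      # invariant subspace found: stop early
            break
        betas.append(beta)
        q_prev, q = q, [wi / beta for wi in w]
    return alphas, betas, basis

A = [[2.0, 1.0, 1.0],
     [1.0, 3.0, 0.0],
     [1.0, 0.0, 4.0]]
alphas, betas, basis = lanczos(A, [1.0, 0.0, 0.0], 3)
```

Because the reduction is an orthogonal similarity restricted to the Krylov subspace, the small tridiagonal matrix inherits spectral information from A; when the recursion runs to full dimension, the diagonal entries even sum to the trace of A.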

  7. Linux software for large topology optimization problems

    DEFF Research Database (Denmark)

evolving product, which allows a parallel solution of the PDE, it lacks the important feature that the matrix-generation part of the computations is localized to each processor. This is well-known to be critical for obtaining a useful speedup on a Linux cluster and it motivates the search for a COMSOL......-like package for large topology optimization problems. One candidate for such software, developed for Linux by Sandia National Laboratories in the USA, is the Sundance system. Sundance also uses a symbolic representation of the PDE and a scalable numerical solution is achieved by employing the underlying Trilinos...

  8. Control problems in very large accelerators

    International Nuclear Information System (INIS)

    Crowley-Milling, M.C.

    1985-06-01

    There is no fundamental difference of kind in the control requirements between a small and a large accelerator since they are built of the same types of components, which individually have similar control inputs and outputs. The main difference is one of scale; the large machine has many more components of each type, and the distances involved are much greater. Both of these factors must be taken into account in determining the optimum way of carrying out the control functions. Small machines should use standard equipment and software for control as much as possible, as special developments for small quantities cannot normally be justified if all costs are taken into account. On the other hand, the very great number of devices needed for a large machine means that, if special developments can result in simplification, they may make possible an appreciable reduction in the control equipment costs. It is the purpose of this report to look at the special control problems of large accelerators, which the author shall arbitrarily define as those with a length of circumference in excess of 10 km, and point out where special developments, or the adoption of developments from outside the accelerator control field, can be of assistance in minimizing the cost of the control system. Most of the first part of this report was presented as a paper to the 1985 Particle Accelerator Conference. It has now been extended to include a discussion on the special case of the controls for the SSC

  9. Research on the Construction Management and Sustainable Development of Large-Scale Scientific Facilities in China

    Science.gov (United States)

    Guiquan, Xi; Lin, Cong; Xuehui, Jin

    2018-05-01

    As an important platform for scientific and technological development, large-scale scientific facilities are the cornerstone of technological innovation and a guarantee for economic and social development. Research on the management of large-scale scientific facilities can play a key role in scientific research, sociology and national strategy. This paper reviews the characteristics of large-scale scientific facilities, and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.

  10. [Credit problems on scientific article authorships: some solutions].

    Science.gov (United States)

    Casas-Martínez, María de la Luz

    2008-01-01

    Unfortunately, since the number of publications has become a parameter for qualifying scientific production, abuses and even frauds can take place easily. Along with the rise of new investigation projects in which numerous scientists participate, multi-authorship has also entered the scientific scene, with articles having been damaged, modified or altered purposely with erroneous information and plagiarism, against the true author's will. The disagreements among research team members create a tense and stressful work atmosphere that interferes with the investigation process itself; therefore, it is necessary to bring more transparency to these activities. The present article analyzes the recommendations of the International Committee of Medical Journal Editors on authorship of medical articles. It points out when an authorship should be considered as such, and offers possible solutions to avoid this problem, such as the elaboration of authors' consent letters written prior to the publication of the article, the elaboration of credits, and the limitation of qualified publications regarding each activity.

  11. (Mis)understanding Science: The Problem with Scientific Breakthroughs.

    Science.gov (United States)

    Evans, James P

    2016-09-01

    On Saturday morning, February 28, 1953, the mystery of heredity appeared secure. Humans hadn't the faintest idea of how genetic information was transmitted: how the uncanny resemblance between mother and daughter, grandfather and grandson was conveyed across generations. Yet, by that Saturday afternoon, two individuals, James Watson and Francis Crick, had glimpsed the solution to these mysteries. The story of Watson and Crick's great triumph has been told and retold and has rightly entered the pantheon of scientific legend. But Watson and Crick's breakthrough was just that: a rupture and dramatic discontinuity in human knowledge that solved a deep mystery, the likes of which occurs, perhaps, a couple of times each century. And that's the problem. The story is just so good and so irresistible that it has misled generations of scientists about what to expect regarding a life in science. And more damaging, the resulting breakthrough mentality misleads the public, the media, and society's decision-makers about how science really works, all to the detriment of scientific progress and our society's well-being. © 2016 The Hastings Center.

  12. Web-based visualization of very large scientific astronomy imagery

    Science.gov (United States)

    Bertin, E.; Pillay, R.; Marmo, C.

    2015-04-01

    Visualizing and navigating through large astronomy images from a remote location with current astronomy display tools can be a frustrating experience in terms of speed and ergonomics, especially on mobile devices. In this paper, we present a high-performance, versatile and robust client-server system for remote visualization and analysis of extremely large scientific images. Applications of this work include survey image quality control, interactive data query and exploration, citizen science, as well as public outreach. The proposed software is entirely open source and is designed to be generic and applicable to a variety of datasets. It provides access to floating point data at terabyte scales, with the ability to precisely adjust image settings in real-time. The proposed clients are light-weight, platform-independent web applications built on standard HTML5 web technologies and compatible with both touch and mouse-based devices. We assess the performance of the system and show that a single server can comfortably handle more than a hundred simultaneous users accessing full-precision 32-bit astronomy data.

  13. Scientific decision of the Chernobyl accident problems (results of 1997)

    International Nuclear Information System (INIS)

    Konoplya, E.F.; Rolevich, I.V.

    1998-12-01

    This publication summarizes the basic results of the research carried out in 1997 in the framework of the 'Scientific maintenance of the decision of problems of the Chernobyl NPP accident consequences' section of the State program of the Republic of Belarus for minimization and overcoming of the Chernobyl NPP accident consequences for 1996-2000, in the following directions: dose monitoring of the population, and estimation and forecast of both collective irradiation doses and risks of radiation-induced diseases; development and justification of measures to increase radiation protection of the population of Belarus during the recovery period after the Chernobyl accident; study of the influence of radiological consequences of the Chernobyl accident on human health, and development of methods and means of diagnostics, treatment and prevention of diseases for various categories of victims; optimisation of the system of measures for preserving the health of the affected population and development of ways to increase its effectiveness; creation of effective prophylactic means and food additives for treatment and rehabilitation of persons who suffered from the Chernobyl accident; development of a complex system of estimation and decision-making on problems of radiation protection of the population living on contaminated territories; development and optimization of a complex of measures for effective land use and reduction of radioactive contamination of agricultural production in order to reduce irradiation doses of the population; development of complex technologies and means of decontamination, treatment and burial of radioactive wastes; study of the dynamics of radioisotope behaviour in the environment (air, water, soil), ecosystems and populated areas; optimization of the system of radiation ecological monitoring in the republic and scientific-methodical ways of fulfilling it; and study of the effects of low-dose irradiation and combined influences

  14. Genesis of scientific research of legal problems of reserves

    Directory of Open Access Journals (Sweden)

    Олександр Олександрович Пономаренко

    2017-12-01

    Full Text Available The problems of the legal status of nature reserves as objects of ecological-legal protection are considered. One of the main directions of the modern strategy of Ukraine’s environmental policy should be the implementation of international standards in the organization and protection of nature reserves as objects of the state natural reserve fund, and the improvement of legislation on the nature reserve fund in accordance with the recommendations of the Pan-European Biological and Landscape Diversity Strategy (1995) on the formation of the Pan-European Ecological Network as a single spatial system of territories of European countries with natural or partially altered landscape. All this allowed the author to formulate the definition of a natural reserve as a state research institution of national importance with the status of a legal entity, which performs the functions of preserving in a natural state natural complexes typical of or unique to the given landscape zone, with all their components; studying natural processes and phenomena and their development; developing scientific principles of environmental protection, effective use of natural resources and environmental safety; and implementing ecological education of the population under conditions of full restriction of economic activity not connected with its functioning.

  15. Study of Scientific Problem-Solving Abilities Based on Scientific Knowledge about Atmosphere and Weather for Seventh Grade Students

    Directory of Open Access Journals (Sweden)

    Phoorin Thaengnoi

    2017-06-01

    Full Text Available The purposes of this research were: 1) to develop a scientific problem-solving abilities test based on scientific knowledge about atmosphere and weather for seventh grade students; and 2) to study the scientific problem-solving abilities of seventh grade students. The samples used in this study were 47 students who were studying in seventh grade in the academic year 2015 at a school in Chai Nat province, Thailand. Purposive sampling was applied to identify the samples. The research instrument of this study was the scientific problem-solving abilities test developed by the researcher. The research data were analyzed by comparing students’ scores with the criteria and considering students’ answers in each element of scientific problem-solving abilities. The results of the study were as follows. The scientific problem-solving abilities test was composed of 2 parts. The first part consisted of multiple-choice questions covering 4 situations, a total of 20 questions. The Index of Item Objective Congruence of this part varied in the range 0.67 – 1.00. The difficulty and discrimination levels were in the ranges 0.33 – 0.63 and 0.27 – 0.67, respectively. The reliability of this part was 0.81. The second part of the test consisted of subjective questions covering 2 situations, a total of 10 questions. The Index of Item Objective Congruence of this part varied in the range 0.67 – 1.00. The reliability of this part was 0.83. All questions in the test covered all elements of scientific problem-solving abilities: 1) identifying the problem; 2) making the hypothesis; 3) collecting data and knowledge to solve the problem; 4) identifying the problem-solving method; and 5) predicting the characteristics of the results. The problem-solving abilities of the students revealed that 40.43% of students (n=19) were at a moderate level and 59.57% of students (n=28) were at a low level

  16. Mathematical Problems in Creating Large Astronomical Catalogs

    Directory of Open Access Journals (Sweden)

    Prokhorov M. E.

    2016-12-01

    Full Text Available The next stage after performing observations and their primary reduction is to transform the set of observations into a catalog. To this end, objects that are irrelevant to the catalog should be excluded from observations and gross errors should be discarded. To transform such a prepared data set into a high-precision catalog, we need to identify and correct systematic errors. Therefore, each object of the survey should be observed several, preferably many, times. The problem formally reduces to solving an overdetermined set of equations. However, in the case of catalogs this system of equations has a very specific form: it is extremely sparse, and its sparseness increases rapidly with the number of objects in the catalog. Such equation systems require special methods for storing data on disks and in RAM, and for the choice of the techniques for their solving. Another specific feature of such systems is their high “stiffness”, which also increases with the volume of a catalog. Special stable mathematical methods should be used in order not to lose precision when solving such systems of equations. We illustrate the problem by the example of photometric star catalogs, although similar problems arise in the case of positional, radial-velocity, and parallax catalogs.
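Overdetermined sparse systems of this kind can be attacked with iterative sparse least-squares solvers that work only through matrix-vector products and never form the dense normal equations. A toy SciPy sketch, assuming a deliberately simplified photometric model (star magnitudes plus per-plate zero points; the model and all names are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(1)
n_star, n_plate, n_obs = 200, 20, 3000

# Each observation k measures star s[k] on plate p[k]:
#   y[k] = mag[s[k]] + zp[p[k]] + noise
true_mag = rng.uniform(8.0, 16.0, n_star)
true_zp = rng.normal(0.0, 0.5, n_plate)
s = rng.integers(0, n_star, n_obs)
p = rng.integers(0, n_plate, n_obs)
y = true_mag[s] + true_zp[p] + rng.normal(0.0, 0.01, n_obs)

# Design matrix: exactly two nonzeros per row, so the system is
# extremely sparse, and sparsity grows with the number of objects.
rows = np.repeat(np.arange(n_obs), 2)
cols = np.stack([s, n_star + p], axis=1).ravel()
A = coo_matrix((np.ones(2 * n_obs), (rows, cols)),
               shape=(n_obs, n_star + n_plate)).tocsr()

# LSQR iterates using sparse mat-vec products only. The model has a
# gauge freedom (mag + c, zp - c), so only magnitude *differences*
# are determined; LSQR returns a minimum-norm solution.
sol = lsqr(A, y, atol=1e-12, btol=1e-12)[0]
mag_hat = sol[:n_star]
```

The recovered magnitudes agree with the true ones up to a single global offset, which a real catalog pipeline would fix with external standards.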

  17. Problems Encountered during the Scientific Research Process in Graduate Education: The Institute of Educational Sciences

    Science.gov (United States)

    Akyürek, Erkan; Afacan, Özlem

    2018-01-01

    This study was conducted to determine the problems faced by graduate students when conducting scientific research and to make suggestions for solving these problems. The research model was a case study. Semi-structured interviews were conducted with participants in the study with questions about the problems encountered during scientific research…

  18. Unified Access Architecture for Large-Scale Scientific Datasets

    Science.gov (United States)

    Karna, Risav

    2014-05-01

    Data-intensive sciences have to deploy diverse large-scale database technologies for data analytics, as scientists now deal with much larger volumes than ever before. While array databases have bridged many gaps between the needs of data-intensive research fields and DBMS technologies (Zhang 2011), invocation of other big data tools accompanying these databases is still manual and separate from the database management interface. We identify this as an architectural challenge that will increasingly complicate the user's workflow owing to the growing number of useful but isolated and niche database tools. Such use of data analysis tools in effect leaves the burden on the user's end to synchronize the results from other data manipulation and analysis tools with the database management system. To this end, we propose a unified access interface for using big data tools within a large-scale scientific array database, using the database queries themselves to embed foreign routines belonging to the big data tools. Such an invocation of foreign data manipulation routines inside a query can be made possible through a user-defined function (UDF). UDFs that allow such levels of freedom as to call modules from another language and interface back and forth between the query body and the side-loaded functions would be needed for this purpose. For the purposes of this research we attempt coupling of four widely used tools, Hadoop (hadoop1), Matlab (matlab1), R (r1) and ScaLAPACK (scalapack1), with the UDF feature of rasdaman (Baumann 98), an array-based data manager, to investigate this concept. The native array data model used by an array-based data manager provides compact data storage and high-performance operations on ordered data such as spatial data, temporal data, and matrix-based data for linear algebra operations (scidbusr1). Performance issues arising due to the coupling of tools with different paradigms, niche functionalities, separate processes and output

  19. Twenty third scientific conference on research-scientific problems of constructing mine buildings and metallurgical plants

    Energy Technology Data Exchange (ETDEWEB)

    Swiadrowski, W

    1978-01-01

    The annual conference was held in Krynica from 16-23 September 1977. One hundred and nineteen papers were delivered, of which 24 papers were on mine buildings. It was noted that damage caused by underground coal mining is prevalent and characterized by a tendency to increase. In the middle of the 1970s, damages paid by mines (mainly by coal mines) reached 5 billion zlotys yearly. Damages which were not compensated, and the social cost of mining damages, are not included in the calculation. The following problems were discussed: interaction between foundations of buildings and the ground in areas affected by deformations; influence of underground coal mining on properties of soil; improving construction of large industrial plants located on grounds characterized by surface deformations; influence of underground coal mining on deformations of walls of sedimentation tanks; complex utilization of mined deposits in the Upper Silesian basin, Rybnik basin and the Lublin black coal basin (coal and other minerals). (In Polish)

  20. Scientific assessments: Matching the process to the problem

    Directory of Open Access Journals (Sweden)

    Robert J. Scholes

    2017-03-01

    Full Text Available Background: The science–policy interface process – known as a ‘scientific assessment’ – has risen to prominence in the past few decades. Complex assessments are appropriate for issues which are technically complicated, multifaceted and of high societal interest. There is increasing interest from the research community that studies biological invasions to undertake such an assessment. Objectives: Providing the relevant background and context, the article describes key principles and steps for designing, planning, resourcing and executing such a process, as well as providing evidence of high-impact assessments enhancing scientific careers. Method: Experience from international and national assessments, most recently the South African scientific assessment for the Shale Gas Development in the Central Karoo, was used to develop this guiding generic template for practitioners. Analyses of researcher publication performances were undertaken to determine the benefit of being involved in assessments. Results: The key success factors for assessments mostly relate to adherence to ‘process’ and ‘governance’ aspects, for which scientists are sometimes ill-equipped. As regards publication outputs, authors involved in assessment processes demonstrated higher H-indices than their environmental scientist peers. We have suggested causal explanations for this. Conclusion: Effectively designed and managed assessments provide the platform for the ‘co-production of knowledge’ – an iterative and collaborative process involving scientists, stakeholders and policymakers. This increases scientific impact in the society–policy domain. While scientists seem concerned that effort directed towards assessments comes to the detriment of scientific credibility and productivity, we have presented data that suggest the opposite.

  1. Trends in scientific publishing: Dark clouds loom large.

    Science.gov (United States)

    Vinny, Pulikottil Wilson; Vishnu, Venugopalan Y; Lal, Vivek

    2016-04-15

    The world wide web has brought about a paradigm shift in the way medical research is published and accessed. The ease with which a new journal can be started/hosted by publishing start-ups is unprecedented. The tremendous capabilities of the world wide web and the open access revolution, when combined with a highly profitable business, have attracted unscrupulous fraudulent operators to the publishing industry. The intent of these fraudulent publishers is solely driven by profit, with utter disregard for scientific content, peer review and ethics. This phenomenon has been referred to as "predatory publishing". The "international" tag of such journals often betrays their true origins. The gold open access model of publishing, where the author pays the publisher, when coupled with a non-existent peer review, threatens to blur the distinction between science and pseudoscience. The average researcher needs to be made more aware of this clear and present danger to the scientific community. Prevention is better than cure. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Basic Properties and Problem Fields of Scientific-Innovation Space of the Region

    Directory of Open Access Journals (Sweden)

    Alexey Aleksandrovich Rumyantsev

    2013-06-01

    Full Text Available The increasing scale of scientific-innovative activity in administrative-territorial units, the increasingly complex structure of regional scientific-innovative complexes and the development of inter-regional horizontal and vertical ties expand the space of scientific and innovation activity, the study of which primarily involves the development of theoretical and methodological provisions. Based on the philosophical category of «space», the paper describes the main properties of the scientific-innovative space of the region and the factors causing them. The author identified problem fields as directions of possible transformation of the scientific-innovative space of the region. The analysis allowed defining some features of the scientific and innovation space and problems of its development. The obtained results show the feasibility of studying scientific-innovative activity in the spatial dimension

  3. Scientific Approach to Improve Mathematical Problem Solving Skills Students of Grade V

    Science.gov (United States)

    Roheni; Herman, T.; Jupri, A.

    2017-09-01

    This study investigates elementary school students’ skills in problem solving through the Scientific Approach. The purpose of this study is to determine whether the mathematical problem solving skills of students taught using the Scientific Approach are better than those of students taught using Direct Instruction. This study uses a quasi-experimental method. The subjects of this study are students in grade V of a state elementary school in Cirebon Regency. The instrument used in this study is a test of mathematical problem solving skills. The results of this study showed that the mathematical problem solving skills of students who learn using the Scientific Approach are significantly better than those of students who learn using Direct Instruction. Based on the results and analysis, the conclusion is that the Scientific Approach can improve students’ mathematical problem solving skills.

  4. Parallel Tensor Compression for Large-Scale Scientific Data.

    Energy Technology Data Exchange (ETDEWEB)

    Kolda, Tamara G. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ballard, Grey [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Austin, Woody Nathan [Univ. of Texas, Austin, TX (United States)

    2015-10-01

    As parallel computing trends towards the exascale, scientific data produced by high-fidelity simulations are growing increasingly massive. For instance, a simulation on a three-dimensional spatial grid with 512 points per dimension that tracks 64 variables per grid point for 128 time steps yields 8 TB of data. By viewing the data as a dense five-way tensor, we can compute a Tucker decomposition to find inherent low-dimensional multilinear structure, achieving compression ratios of up to 10000 on real-world data sets with negligible loss in accuracy. So that we can operate on such massive data, we present the first-ever distributed-memory parallel implementation for the Tucker decomposition, whose key computations correspond to parallel linear algebra operations, albeit with nonstandard data layouts. Our approach specifies a data distribution for tensors that avoids any tensor data redistribution, either locally or in parallel. We provide accompanying analysis of the computation and communication costs of the algorithms. To demonstrate the compression and accuracy of the method, we apply our approach to real-world data sets from combustion science simulations. We also provide detailed performance results, including parallel performance in both weak and strong scaling experiments.
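The Tucker decomposition behind this compression can be sketched, in its simplest serial form, as a sequentially truncated higher-order SVD. This NumPy toy (the function names are mine, and it bears no resemblance to the paper's distributed implementation) shows where the compression comes from: a small core tensor plus one thin factor matrix per mode.

```python
import numpy as np

def hosvd(X, ranks):
    """Sequentially truncated higher-order SVD: a simple, serial way
    to compute a Tucker decomposition X ~ core x_1 U1 x_2 U2 ..."""
    core = X
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold the current core along `mode`; its leading left
        # singular vectors become the factor matrix for that mode.
        unfolded = np.moveaxis(core, mode, 0).reshape(core.shape[mode], -1)
        U = np.linalg.svd(unfolded, full_matrices=False)[0][:, :r]
        factors.append(U)
        # Project the core onto the new factor (mode-`mode` product).
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

def reconstruct(core, factors):
    """Expand a Tucker (core, factors) pair back to a full tensor."""
    X = core
    for mode, U in enumerate(factors):
        X = np.moveaxis(
            np.tensordot(U, np.moveaxis(X, mode, 0), axes=1), 0, mode)
    return X
```

For a tensor that is exactly low multilinear rank, the truncated HOSVD recovers it exactly; for simulation data the truncation error is what trades off against the compression ratio.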

  5. New computational methodology for large 3D neutron transport problems

    International Nuclear Information System (INIS)

    Dahmani, M.; Roy, R.; Koclas, J.

    2004-01-01

    We present a new computational methodology, based on the 3D characteristics method, dedicated to solving very large 3D problems without spatial homogenization. In order to eliminate the input/output problems occurring when solving these large problems, we set up a new computing scheme that requires more CPU resources than the usual one, which is based on sweeps over large tracking files. The huge storage capacity needed in some problems and the related I/O queries required by the characteristics solver are replaced by on-the-fly recalculation of tracks at each iteration step. Using this technique, large 3D problems are no longer I/O-bound, and distributed CPU resources can be used efficiently. (authors)

  6. A Polar Rover for Large-Scale Scientific Surveys: Design, Implementation and Field Test Results

    Directory of Open Access Journals (Sweden)

    Yuqing He

    2015-10-01

    Full Text Available Exploration of polar regions is of great importance to scientific research. Unfortunately, due to the harsh environment, most of the regions on the Antarctic continent are still unreachable for humankind. Therefore, in 2011, the Chinese National Antarctic Research Expedition (CHINARE launched a project to design a rover to conduct large-scale scientific surveys in the Antarctic. The main challenges for the rover are twofold: one is mobility, i.e., how to make a rover that can survive the harsh environment and safely move on the uneven, icy and snowy terrain; the other is autonomy, in that the robot should be able to move at a relatively high speed with little or no human intervention so that it can explore a large region in a limited time interval under the communication constraints. In this paper, the corresponding techniques, especially the polar rover's design and autonomous navigation algorithms, are introduced in detail. Subsequently, an experimental report of the field tests in the Antarctic is given to show some preliminary evaluation of the rover. Finally, experiences and existing challenging problems are summarized.

  7. Intelligent Personal Supercomputer for Solving Scientific and Technical Problems

    Directory of Open Access Journals (Sweden)

    Khimich, O.M.

    2016-09-01

    Full Text Available A new domestic intelligent personal supercomputer of hybrid architecture, Inparkom_pg, was developed for the mathematical modeling of processes in the defense industry, engineering, construction, etc. Intelligent software for the automatic investigation of problems of computational mathematics with approximate data of different structures was designed. Applied software for mathematical modeling problems in construction, welding and filtration processes was implemented.

  8. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1,125,000 elements in 2D and 128,000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  9. Ecology and environmental protection - a scientific and political problem

    Energy Technology Data Exchange (ETDEWEB)

    Trenkler, H

    1983-09-01

    The strategy for life in the inter-relationship between way of life and the environment, the causes of ecological crises and assumptions about biological innovations are indicated in this paper. The aim of man has always been to adapt animate and inanimate nature to his needs. With the help of environmental protection measures it has been possible to extend the ecological load-bearing capacity for the human population. The acceleration of the cycles of elements important for the processes of life and business, which is wide-ranging and required on a worldwide basis, has become a serious problem. Sensible measures for conserving the resources of the environment are a responsibility of the state.

  10. Structuring and assessing large and complex decision problems using MCDA

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    This paper presents an approach for structuring and assessing large and complex decision problems using multi-criteria decision analysis (MCDA). The MCDA problem is structured in a decision tree and assessed using the REMBRANDT technique, featuring a procedure for limiting the number of pair
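The REMBRANDT technique mentioned here derives weights from pairwise comparisons via geometric means on a logarithmic scale, rather than AHP's eigenvector method. A minimal sketch, assuming integer preference grades mapped to ratios by a factor of √2 per grade (the exact scale constant varies between REMBRANDT formulations, so treat it as an illustrative assumption):

```python
import numpy as np

def rembrandt_weights(delta, scale=np.sqrt(2.0)):
    """Weights from a skew-symmetric grade matrix delta, where
    delta[i, j] > 0 means alternative i is preferred to j.
    Ratios are scale**delta; weights are the normalized row
    geometric means of the ratio matrix."""
    R = scale ** np.asarray(delta, dtype=float)   # multiplicative ratios
    gm = np.exp(np.log(R).mean(axis=1))           # row geometric means
    return gm / gm.sum()
```

For a perfectly consistent grade matrix the geometric means reproduce the underlying preference ratios exactly; for inconsistent judgments they give a logarithmic least-squares compromise.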

  11. The Cauchy problem for the Pavlov equation with large data

    Science.gov (United States)

    Wu, Derchyi

    2017-08-01

    We prove local solvability of the Cauchy problem for the Pavlov equation with large initial data by the inverse scattering method. The Pavlov equation arises in studies of Einstein-Weyl geometries and dispersionless integrable models. Our theory yields local solvability of Cauchy problems for a quasi-linear wave equation with a characteristic initial hypersurface.

  12. Actual problems of physics and technology. III International youth scientific school-conference. Book of abstracts

    International Nuclear Information System (INIS)

    2014-01-01

    The third International youth scientific school-conference took place on 10-13 April 2014 in Moscow, hosted by the National Research Nuclear University MEPhI and the P.N. Lebedev Physical Institute of RAS. Actual scientific problems of current fundamental and applied physics, as well as nuclear and physical technologies, were discussed. This book of abstracts contains many interesting items devoted to problems of theoretical physics and astrophysics, nuclear physics, nanotechnology, laser physics and plasma physics [ru]

  13. Professional identity of civil servants as a scientific problem

    Directory of Open Access Journals (Sweden)

    Nataliia Anatoliivna Lypovska

    2013-11-01

    Full Text Available The article examines the concept of «professional identity» and its importance for the analysis of the professionalization of civil servants. The basic concepts, such as “profession” and “professionalism” (“professional development”, “professional competence”), and their relationships are considered. The relevance of the research is due to the fact that professional identity acts as an internal source of professional development and personal growth of any business entity, and the question of the development of professional identity is included in the total range of problems of any professional. The stages of professional identity are substantiated. The paper concludes that professional identity is an integrative concept which expresses the relationship of personal characteristics that provide orientation in the world of professions, allows a person to realize his personal career potential more fully, and makes it possible to predict the consequences of professional choice. Professional identity performs transforming and stabilizing functions. Therefore professional identity serves as a kind of regulator for a profession.

  14. Higgs, moduli problem, baryogenesis and large volume compactifications

    International Nuclear Information System (INIS)

    Higaki, Tetsutaro; Takahashi, Fuminobu

    2012-07-01

    We consider the cosmological moduli problem in the context of high-scale supersymmetry breaking suggested by the recent discovery of the standard-model-like Higgs boson. In order to solve the notorious moduli-induced gravitino problem, we focus on the LARGE volume scenario, in which the modulus decay into gravitinos can be kinematically forbidden. We then consider the Affleck-Dine mechanism with or without an enhanced coupling with the inflaton, taking into account possible Q-ball formation. We show that the baryon asymmetry of the present Universe can be generated by the Affleck-Dine mechanism in the LARGE volume scenario, solving the moduli and gravitino problems.

  15. Higgs, moduli problem, baryogenesis and large volume compactifications

    Energy Technology Data Exchange (ETDEWEB)

    Higaki, Tetsutaro [RIKEN Nishina Center, Saitama (Japan). Mathematical Physics Lab.; Kamada, Kohei [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Takahashi, Fuminobu [Tohoku Univ., Sendai (Japan). Dept. of Physics

    2012-07-15

    We consider the cosmological moduli problem in the context of high-scale supersymmetry breaking suggested by the recent discovery of the standard-model-like Higgs boson. In order to solve the notorious moduli-induced gravitino problem, we focus on the LARGE volume scenario, in which the modulus decay into gravitinos can be kinematically forbidden. We then consider the Affleck-Dine mechanism with or without an enhanced coupling with the inflaton, taking into account possible Q-ball formation. We show that the baryon asymmetry of the present Universe can be generated by the Affleck-Dine mechanism in the LARGE volume scenario, solving the moduli and gravitino problems.

  16. Material-Point Analysis of Large-Strain Problems

    DEFF Research Database (Denmark)

    Andersen, Søren

    The aim of this thesis is to apply and improve the material-point method for modelling of geotechnical problems. One of the geotechnical phenomena that is a subject of active research is the study of landslides. A large amount of research is focused on determining when slopes become unstable. Hence......, it is possible to predict if a certain slope is stable using commercial finite element or finite difference software such as PLAXIS, ABAQUS or FLAC. However, the dynamics during a landslide are less explored. The material-point method (MPM) is a novel numerical method aimed at analysing problems involving...... materials subjected to large strains in a dynamical time–space domain. This thesis explores the material-point method with the specific aim of improving the performance for geotechnical problems. Large-strain geotechnical problems such as landslides pose a major challenge to model numerically. Employing...

  17. Solving Large Clustering Problems with Meta-Heuristic Search

    DEFF Research Database (Denmark)

    Turkensteen, Marcel; Andersen, Kim Allan; Bang-Jensen, Jørgen

    In Clustering Problems, groups of similar subjects are to be retrieved from data sets. In this paper, Clustering Problems with the frequently used Minimum Sum-of-Squares Criterion are solved using meta-heuristic search. Tabu search has proved to be a successful methodology for solving optimization...... problems, but applications to large clustering problems are rare. The simulated annealing heuristic has mainly been applied to relatively small instances. In this paper, we implement tabu search and simulated annealing approaches and compare them to the commonly used k-means approach. We find that the meta-heuristic...
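The minimum sum-of-squares criterion used in the paper can be made concrete through the baseline k-means procedure the meta-heuristics are compared against. The following is a minimal illustrative sketch in plain Python with hypothetical data, not the authors' implementation: assignments and centroids alternate until the centroids stop moving.

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain k-means under the minimum sum-of-squares criterion:
    alternate nearest-center assignment and centroid update."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # pick k distinct points as initial centers
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # assignment step: each point joins its nearest center's cluster
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # update step: move each center to its cluster's centroid
        new = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centers[i]
               for i, cl in enumerate(clusters)]
        if new == centers:  # converged: no centroid moved
            break
        centers = new
    return centers, clusters

# two well-separated point clouds; k-means should recover them
pts = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0),
       (10.0, 10.0), (10.0, 11.0), (11.0, 10.0)]
centers, clusters = kmeans(pts, 2)
```

Tabu search and simulated annealing attack the same objective (total within-cluster squared distance) but accept occasional worsening moves to escape the local optima that this greedy alternation can get stuck in.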

  18. Improving Students’ Scientific Reasoning and Problem-Solving Skills by The 5E Learning Model

    Directory of Open Access Journals (Sweden)

    Sri Mulyani Endang Susilowati

    2017-12-01

    Full Text Available Biology learning in MA (Madrasah Aliyah) Khas Kempek was still dominated by the teacher, with low student involvement. This study analyzed the effectiveness of the 5E (Engagement, Exploration, Explanation, Elaboration, Evaluation) learning model in improving scientific reasoning and problem solving. It also examined the relationship between students’ scientific reasoning and their problem-solving abilities. This was a pre-experimental study with a one-group pre-test post-test design. Sixty students of MA Khas Kempek from XI MIA 3 and XI MIA 4 were involved in the study. The learning outcomes of the students were collected by tests of reasoning and problem solving. The results showed that the gains in students’ scientific reasoning ability were 69.77% for XI MIA 3 and 66.27% for XI MIA 4, in the medium category. The gains in problem-solving skills were 63.40% for XI MIA 3 and 61.67% for XI MIA 4, classified in the moderate category. A simple regression test found a linear correlation between students’ scientific reasoning and problem-solving ability. This study affirms that reasoning ability is needed in problem solving, and that the 5E learning model was effective in improving students' scientific reasoning and problem-solving ability.

  19. Proceedings of 5. international scientific conference 'Sakharov readings 2005: Ecological problems of XXI century'. Pt. 1

    International Nuclear Information System (INIS)

    Kundas, S.P.; Okeanov, A.E.; Shevchuk, V.E.

    2005-05-01

    The first part of the proceedings of the fifth international scientific conference 'Sakharov readings 2005: Ecological problems of XXI century', held at the International A. Sakharov Environmental University, contains materials on the topics: socio-ecological problems, medical ecology, and biological ecology. The proceedings are intended for specialists in the field of ecology and related sciences, teachers, students and post-graduate students. (authors)

  20. Proceedings of 7. international scientific conference 'Sakharov readings 2007: Ecological problems of XXI century'

    International Nuclear Information System (INIS)

    Kundas, S.P.; Mel'nov, S.B.; Poznyak, S.S.

    2007-05-01

    Abstracts of the seventh international scientific conference 'Sakharov readings 2007: Ecological problems of XXI century', held at the International A. Sakharov Environmental University, contain materials on the topics: socio-ecological problems, medical ecology, biomonitoring and bioindication, and biological ecology. The proceedings are intended for specialists in the field of ecology and related sciences, teachers, students and post-graduate students. (authors)

  1. Perceived Problem Solving Skills: As a Predictor of Prospective Teachers' Scientific Epistemological Beliefs

    Science.gov (United States)

    Temel, Senar

    2016-01-01

    This study aims to determine the level of perceived problem-solving skills of prospective teachers and the relations between these skills and their scientific epistemological beliefs. The study was conducted in the fall semester of the 2015-2016 academic year. Prospective teachers completed the Problem Solving Inventory, which was developed by Heppner…

  2. Bonus algorithm for large scale stochastic nonlinear programming problems

    CERN Document Server

    Diwekar, Urmila

    2015-01-01

    This book presents the details of the BONUS algorithm and its real-world applications in areas such as sensor placement in large-scale drinking water networks, sensor placement in advanced power systems, water management in power systems, and capacity expansion of energy systems. A generalized method for stochastic nonlinear programming, based on a sampling approach for uncertainty analysis and statistical reweighting to obtain probability information, is demonstrated in the book. Stochastic optimization problems are difficult to solve since they involve nested optimization and uncertainty loops. There are two fundamental approaches to solving such problems: the first uses decomposition techniques, while the second identifies problem-specific structures and transforms the problem into a deterministic nonlinear programming problem. These techniques have significant limitations on either the objective function type or the underlying distributions of the uncertain variables. Moreover, these ...
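The statistical reweighting idea mentioned in the abstract can be illustrated in miniature. The sketch below is a hedged toy, not the BONUS algorithm itself (which estimates the densities nonparametrically from the samples); it shows how model outputs computed once under a base input distribution can be reused, via likelihood-ratio weights, to estimate the expected output under a changed distribution without re-running the model.

```python
import math
import random

def reweighted_mean(samples, outputs, base_pdf, new_pdf):
    """Self-normalized importance reweighting: estimate E_new[y]
    from (sample, output) pairs drawn under the base distribution."""
    w = [new_pdf(x) / base_pdf(x) for x in samples]
    total = sum(w)
    return sum(wi * yi for wi, yi in zip(w, outputs)) / total

# toy model y = x^2, sampled once under the base distribution N(0, 1)
rng = random.Random(1)
samples = [rng.gauss(0.0, 1.0) for _ in range(20000)]
outputs = [x * x for x in samples]          # "expensive" model run, done once

def normal_pdf(mu):
    return lambda x: math.exp(-(x - mu) ** 2 / 2.0) / math.sqrt(2.0 * math.pi)

# estimate E[x^2] under the shifted distribution N(0.5, 1)
# without drawing new samples; the exact value is 0.5^2 + 1 = 1.25
est = reweighted_mean(samples, outputs, normal_pdf(0.0), normal_pdf(0.5))
```

This avoids the inner sampling loop each time the optimizer changes the decision variables, which is exactly the cost the abstract's "optimization and uncertainty loops" refer to.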

  3. Large-Scale Assessment, Rationality, and Scientific Management: The Case of No Child Left Behind

    Science.gov (United States)

    Roach, Andrew T.; Frank, Jennifer

    2007-01-01

    This article examines the ways in which NCLB and the movement towards large-scale assessment systems are based on Weber's concept of formal rationality and tradition of scientific management. Building on these ideas, the authors use Ritzer's McDonaldization thesis to examine some of the core features of large-scale assessment and accountability…

  4. Solving Large Scale Crew Scheduling Problems in Practice

    NARCIS (Netherlands)

    E.J.W. Abbink (Erwin); L. Albino; T.A.B. Dollevoet (Twan); D. Huisman (Dennis); J. Roussado; R.L. Saldanha

    2010-01-01

    This paper deals with large-scale crew scheduling problems arising at the Dutch railway operator, Netherlands Railways (NS). NS operates about 30,000 trains a week. All these trains need a driver and a certain number of guards. Some labor rules restrict the duties of a certain crew base

  5. How scientific experiments are designed: Problem solving in a knowledge-rich, error-rich environment

    Science.gov (United States)

    Baker, Lisa M.

    While theory formation and the relation between theory and data has been investigated in many studies of scientific reasoning, researchers have focused less attention on reasoning about experimental design, even though the experimental design process makes up a large part of real-world scientists' reasoning. The goal of this thesis was to provide a cognitive account of the scientific experimental design process by analyzing experimental design as problem-solving behavior (Newell & Simon, 1972). Three specific issues were addressed: the effect of potential error on experimental design strategies, the role of prior knowledge in experimental design, and the effect of characteristics of the space of alternate hypotheses on alternate hypothesis testing. A two-pronged in vivo/in vitro research methodology was employed, in which transcripts of real-world scientific laboratory meetings were analyzed as well as undergraduate science and non-science majors' design of biology experiments in the psychology laboratory. It was found that scientists use a specific strategy to deal with the possibility of error in experimental findings: they include "known" control conditions in their experimental designs both to determine whether error is occurring and to identify sources of error. The known controls strategy had not been reported in earlier studies with science-like tasks, in which participants' responses to error had consisted of replicating experiments and discounting results. With respect to prior knowledge: scientists and undergraduate students drew on several types of knowledge when designing experiments, including theoretical knowledge, domain-specific knowledge of experimental techniques, and domain-general knowledge of experimental design strategies. Finally, undergraduate science students generated and tested alternates to their favored hypotheses when the space of alternate hypotheses was constrained and searchable. This result may help explain findings of confirmation

  6. Structural problems of public participation in large-scale projects with environmental impact

    International Nuclear Information System (INIS)

    Bechmann, G.

    1989-01-01

    Four items are discussed, showing that the problems involved in public participation in large-scale projects with environmental impact cannot be solved satisfactorily without suitable modification of the existing legal framework. The problematic items are: the status of the electric utilities as a quasi-public enterprise; informal preliminary negotiations; the penetration of scientific argumentation into administrative decisions; and the procedural concept. The paper discusses the fundamental issue of problem-adequate procedure design and develops suggestions for a cooperative participation design. (orig./HSCH)

  7. Background problem for a large solid angle, high sensitivity detector

    International Nuclear Information System (INIS)

    Chen, M.

    1977-01-01

    With extremely good vacuum (10⁻¹¹ to 10⁻¹³ torr) and well controlled beams, the ISR has a good reputation for clean beam conditions and low background for most types of experiments. However, for a detector covering a large solid angle and measuring processes with small cross sections (approximately 10⁻³⁸ cm²), there are serious background problems which took almost a year to solve. Since ISABELLE may have similar problems, a summary is given of the experience at the ISR, with the hope that some of the solutions can be installed in ISABELLE at an early stage

  8. The coupled dynamical problem of thermoelasticity in case of large temperature differences

    International Nuclear Information System (INIS)

    Szekeres, A.

    1981-01-01

    In problems of thermoelasticity, including dynamical problems, it is common to assume small temperature differences, and the equations used in the scientific literature refer to these. This raises the question of how taking large temperature changes into account influences dynamical problems. To investigate this, we first present the general equation of heat conduction for small temperature differences, following Nowacki and Biot. On this basis we introduce the general equation of heat conduction with large temperature changes, with some remarks showing the connection between the two cases. Using the latter in the equations of thermoelasticity, we write down the governing expressions for the thermal shock of a long bar. Finally we show the results of a numerical example and the experimental opportunity to measure some of the constants. (orig.)
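For orientation, the small-difference coupled heat conduction equation of classical (Biot) thermoelasticity that the abstract refers to can be written in a standard form (the paper's exact notation may differ):

\[
k\,\nabla^{2}\theta \;=\; \rho c_{\varepsilon}\,\dot{\theta} \;+\; (3\lambda + 2\mu)\,\alpha\,T_{0}\,\dot{e},
\]

where \(\theta = T - T_{0}\) is the small temperature increment above the reference temperature \(T_{0}\), \(e\) is the volumetric strain, \(\lambda, \mu\) are the Lamé constants and \(\alpha\) the thermal expansion coefficient. Allowing large temperature changes means \(\theta\) is no longer small relative to \(T_{0}\), and material parameters such as \(k\) may themselves depend on temperature, so the conduction term takes a nonlinear form such as \(\nabla \cdot \bigl(k(T)\,\nabla T\bigr)\) in place of \(k\,\nabla^{2}\theta\).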

  9. Proceedings of 6. international scientific conference 'Sakharov readings 2006: Ecological problems of XXI century'. Pt. 1

    International Nuclear Information System (INIS)

    Kundas, S.P.; Okeanov, A.E.; Poznyak, S.S.

    2006-05-01

    The first part of proceedings of the sixth international scientific conference 'Sakharov readings 2006: Ecological problems of XXI century', which was held in the International A. Sakharov environmental university, contents materials on topics: socio-ecological problems, medical ecology, biomonitoring and bioindication, biological ecology. The proceedings are intended for specialists in field of ecology and related sciences, teachers, students and post-graduate students. (authors)

  10. THE PROBLEM OF CORRUPTION OF BASIC SCIENTIFIC INVESTIGATIONS IN PARTICULAR: FORMAL-ETHIC AND ECONOMIC ASPECTS

    Directory of Open Access Journals (Sweden)

    V. O. Lobovikov

    2016-01-01

    Full Text Available The aim of the paper is to carry out a historical-philosophical and linguistic analysis of Aristotle's ethical and metaphysical doctrine of corruption in general; to discuss a formal-ethical view of the problem of corruption in basic scientific research; and to define the place and role of fundamental scientific research in the knowledge-based economy as a whole, and in the Boston Chart in particular. Methods. The methods involve historical-philosophical and logical-linguistic analysis of texts; the creation and study of an elementary discrete mathematical model of the moral phenomenon under investigation, at the level of the artificial language of the two-valued algebra of natural right and morals; and the use of such conceptual and graphical tools of economic theory as the Boston Chart. Results and scientific novelty. A definition of the concept «basic scientific research» is given for the first time; the concept includes a time parameter and knowledge of the utility (practical importance) of the results of the research. Practical significance. The submitted definition (criterion) makes it possible to establish, at any moment of time, a definite borderline between basic and applied scientific research (a line which changes over time). The effective criterion of basic scientific research offered by the author, together with an exact specification of its place and role in the lifecycle of knowledge as goods in a market economy (at the conceptual level of the Boston Chart), allows the urgent problem of corruption of the scientific sphere to be framed in a new perspective. Along with some additional conditions, this new evidence could help to solve the problem.

  11. Scientific Paradigms and Falsification: Kuhn, Popper, and Problems in Education Research

    Science.gov (United States)

    Hyslop-Margison, Emery James

    2010-01-01

    By examining the respective contributions of Karl Popper and Thomas Kuhn to the philosophy of science, the author highlights some prevailing problems in this article with the methods of so-called scientific research in education. The author enumerates a number of reasons why such research, in spite of its limited tangible return, continues to gain…

  12. Scientific Reasoning and Its Relationship with Problem Solving: The Case of Upper Primary Science Teachers

    Science.gov (United States)

    Alshamali, Mahmoud A.; Daher, Wajeeh M.

    2016-01-01

    This study aimed at identifying the levels of scientific reasoning of upper primary stage (grades 4-7) science teachers based on their use of a problem-solving strategy. The study sample (N = 138; 32 % male and 68 % female) was randomly selected using stratified sampling from an original population of 437 upper primary school teachers. The…

  13. Problems of Chernobyl. Materials of International scientific and practical conference 'Shelter-98'

    International Nuclear Information System (INIS)

    Klyuchnikov, O.O.

    1999-01-01

    These transactions contain materials of the International Scientific and Practical Conference 'Shelter-98', which was held 27-30 November 1998 in Slavutich. They describe the results of the research work of specialists from Ukraine and from near and far foreign countries, targeted at solving the problems of converting the Shelter Object into an ecologically safe state

  14. Ab initio nuclear structure - the large sparse matrix eigenvalue problem

    Energy Technology Data Exchange (ETDEWEB)

    Vary, James P; Maris, Pieter [Department of Physics, Iowa State University, Ames, IA, 50011 (United States); Ng, Esmond; Yang, Chao [Computational Research Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Sosonkina, Masha, E-mail: jvary@iastate.ed [Scalable Computing Laboratory, Ames Laboratory, Iowa State University, Ames, IA, 50011 (United States)

    2009-07-01

    The structure and reactions of light nuclei represent fundamental and formidable challenges for microscopic theory based on realistic strong interaction potentials. Several ab initio methods have now emerged that provide nearly exact solutions for some nuclear properties. The ab initio no core shell model (NCSM) and the no core full configuration (NCFC) method frame this quantum many-particle problem as a large sparse matrix eigenvalue problem where one evaluates the Hamiltonian matrix in a basis space consisting of many-fermion Slater determinants and then solves for a set of the lowest eigenvalues and their associated eigenvectors. The resulting eigenvectors are employed to evaluate a set of experimental quantities to test the underlying potential. For fundamental problems of interest, the matrix dimension often exceeds 10¹⁰ and the number of nonzero matrix elements may saturate available storage on present-day leadership class facilities. We survey recent results and advances in solving this large sparse matrix eigenvalue problem. We also outline the challenges that lie ahead for achieving further breakthroughs in fundamental nuclear theory using these ab initio approaches.

  15. Ab initio nuclear structure - the large sparse matrix eigenvalue problem

    International Nuclear Information System (INIS)

    Vary, James P; Maris, Pieter; Ng, Esmond; Yang, Chao; Sosonkina, Masha

    2009-01-01

    The structure and reactions of light nuclei represent fundamental and formidable challenges for microscopic theory based on realistic strong interaction potentials. Several ab initio methods have now emerged that provide nearly exact solutions for some nuclear properties. The ab initio no core shell model (NCSM) and the no core full configuration (NCFC) method frame this quantum many-particle problem as a large sparse matrix eigenvalue problem where one evaluates the Hamiltonian matrix in a basis space consisting of many-fermion Slater determinants and then solves for a set of the lowest eigenvalues and their associated eigenvectors. The resulting eigenvectors are employed to evaluate a set of experimental quantities to test the underlying potential. For fundamental problems of interest, the matrix dimension often exceeds 10¹⁰ and the number of nonzero matrix elements may saturate available storage on present-day leadership class facilities. We survey recent results and advances in solving this large sparse matrix eigenvalue problem. We also outline the challenges that lie ahead for achieving further breakthroughs in fundamental nuclear theory using these ab initio approaches.
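The core computation described here, extracting the lowest eigenpairs of a huge sparse symmetric matrix, can be illustrated in miniature. Production codes use Lanczos-type Krylov methods; the sketch below instead uses a simpler spectral-shift power iteration on a toy sparse matrix stored as a dictionary of nonzeros (all data hypothetical), which suffices to show how only the nonzero entries ever need to be touched.

```python
import math

def smallest_eigenpair(entries, n, sigma, iters=500):
    """Estimate the lowest eigenvalue/eigenvector of a sparse symmetric
    matrix A (stored as {(i, j): value}) via power iteration on the
    shifted matrix B = sigma*I - A, where sigma is an upper bound on
    A's spectrum (e.g. from Gershgorin discs): the lowest eigenvector
    of A is the dominant eigenvector of B."""
    v = [1.0 / math.sqrt(n)] * n
    for _ in range(iters):
        # w = B v = sigma*v - A v, touching only the stored nonzeros
        w = [sigma * x for x in v]
        for (i, j), a in entries.items():
            w[i] -= a * v[j]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient v^T A v recovers the eigenvalue of A itself
    av = [0.0] * n
    for (i, j), a in entries.items():
        av[i] += a * v[j]
    lam = sum(x * y for x, y in zip(v, av))
    return lam, v

# toy sparse Hamiltonian: 4x4 tridiagonal (2 on the diagonal, -1 off it),
# whose lowest eigenvalue is 2 - 2*cos(pi/5) ~ 0.381966
n = 4
entries = {(i, i): 2.0 for i in range(n)}
for i in range(n - 1):
    entries[(i, i + 1)] = -1.0
    entries[(i + 1, i)] = -1.0
lam, vec = smallest_eigenpair(entries, n, sigma=4.0)
```

Lanczos methods achieve the same result far faster by building an orthogonal Krylov basis, but the essential scalability point is identical: the matrix enters only through matrix-vector products over its nonzeros.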

  16. Proceedings of 8. international scientific conference 'Sakharov readings 2008: Ecological problems of XXI century'

    International Nuclear Information System (INIS)

    Kundas, S.P.; Mel'nov, S.B.; Poznyak, S.S.

    2008-05-01

    The proceedings of the eighth international scientific conference 'Sakharov readings 2008: Ecological problems of XXI century', held at the International A. Sakharov Environmental University, contain materials on the topics: socio-ecological problems in the light of the ideas of academician A. Sakharov; medical ecology; bioecology; biomonitoring, bioindication and bioremediation; radioecology and radiation protection; information systems and technologies in ecology; ecological management; ecological monitoring; ecological education and education for sustainable development; ecological ethics in the bioethics education system; and problems and prospects of renewable energy development in Belarus. The proceedings are intended for specialists in the field of ecology and related sciences, teachers, students and post-graduate students. (authors)

  17. Side effects of problem-solving strategies in large-scale nutrition science: towards a diversification of health.

    Science.gov (United States)

    Penders, Bart; Vos, Rein; Horstman, Klasien

    2009-11-01

    Solving complex problems in large-scale research programmes requires cooperation and division of labour. Simultaneously, large-scale problem solving also gives rise to unintended side effects. Based upon 5 years of researching two large-scale nutrigenomic research programmes, we argue that problems are fragmented in order to be solved. These sub-problems are given priority for practical reasons and in the process of solving them, various changes are introduced in each sub-problem. Combined with additional diversity as a result of interdisciplinarity, this makes reassembling the original and overall goal of the research programme less likely. In the case of nutrigenomics and health, this produces a diversification of health. As a result, the public health goal of contemporary nutrition science is not reached in the large-scale research programmes we studied. Large-scale research programmes are very successful in producing scientific publications and new knowledge; however, in reaching their political goals they often are less successful.

  18. Efficient Feature-Driven Visualization of Large-Scale Scientific Data

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Aidong

    2012-12-12

    Very large, complex scientific data acquired in many research areas creates critical challenges for scientists to understand, analyze, and organize their data. The objective of this project is to expand the feature extraction and analysis capabilities to develop powerful and accurate visualization tools that can assist domain scientists with their requirements in multiple phases of scientific discovery. We have recently developed several feature-driven visualization methods for extracting different data characteristics of volumetric datasets. Our results verify the hypothesis in the proposal and will be used to develop additional prototype systems.

  19. Problems of cleaning the large diameter sections of deep wells

    Energy Technology Data Exchange (ETDEWEB)

    Patsch, F; Gilicz, B

    1966-01-01

    In drilling deep wells, great importance is attached to the problem of cuttings removal from the hole bottom in sections drilled by large-diameter bits. The length of borehole sections drilled by 12-1/4-in. and larger bits has more than doubled in Hungary over the past 4 years. When the drilling fluid jet strikes the borehole bottom, pressure waves arise which take on a crossed flow pattern and retard the cleaning of the well bottom, particularly in the case of larger bottom surfaces. In large-diameter boreholes, cleaning efficiency is achieved by full utilization of the pump power and increased pump delivery. Friction losses in drill pipes are reduced by using 6-in. XH pipes.

  20. A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering

    Science.gov (United States)

    Ackerman, T. P.

    2017-12-01

    Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB has substantial differences to SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.

  1. Implicit solvers for large-scale nonlinear problems

    International Nuclear Information System (INIS)

    Keyes, David E; Reynolds, Daniel R; Woodward, Carol S

    2006-01-01

    Computational scientists are grappling with increasingly complex, multi-rate applications that couple such physical phenomena as fluid dynamics, electromagnetics, radiation transport, chemical and nuclear reactions, and wave and material propagation in inhomogeneous media. Parallel computers with large storage capacities are paving the way for high-resolution simulations of coupled problems; however, hardware improvements alone will not prove enough to enable simulations based on brute-force algorithmic approaches. To accurately capture nonlinear couplings between dynamically relevant phenomena, often while stepping over rapid adjustments to quasi-equilibria, simulation scientists are increasingly turning to implicit formulations that require a discrete nonlinear system to be solved for each time step or steady state solution. Recent advances in iterative methods have made fully implicit formulations a viable option for solution of these large-scale problems. In this paper, we overview one of the most effective iterative methods, Newton-Krylov, for nonlinear systems and point to software packages with its implementation. We illustrate the method with an example from magnetically confined plasma fusion and briefly survey other areas in which implicit methods have bestowed important advantages, such as allowing high-order temporal integration and providing a pathway to sensitivity analyses and optimization. Lastly, we overview algorithm extensions under development motivated by current SciDAC applications
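The implicit-solve step described, a discrete nonlinear system per time step, can be illustrated with a toy Newton iteration in which the Jacobian is probed matrix-free by finite differences, the directional-difference trick underlying Jacobian-free Newton-Krylov. For clarity this sketch assembles the small Jacobian from those probes and solves it directly, rather than running GMRES on Jacobian-vector products as production Newton-Krylov codes do; the example system is hypothetical.

```python
def newton_fd_solve(F, u0, tol=1e-10, eps=1e-7, max_newton=50):
    """Newton's method for F(u) = 0 where each Jacobian column is
    probed matrix-free: J(u) e_j ~ (F(u + eps*e_j) - F(u)) / eps.
    The small dense linear solve stands in for a Krylov inner solve."""
    u = list(u0)
    n = len(u)
    for _ in range(max_newton):
        r = F(u)
        if max(abs(x) for x in r) < tol:
            break
        # assemble J column by column from directional differences
        J = [[0.0] * n for _ in range(n)]
        for j in range(n):
            up = list(u)
            up[j] += eps
            Fp = F(up)
            for i in range(n):
                J[i][j] = (Fp[i] - r[i]) / eps
        # solve J * du = -r by Gaussian elimination with partial pivoting
        A = [row[:] + [-r[i]] for i, row in enumerate(J)]
        for c in range(n):
            p = max(range(c, n), key=lambda k: abs(A[k][c]))
            A[c], A[p] = A[p], A[c]
            for k in range(c + 1, n):
                f = A[k][c] / A[c][c]
                for m in range(c, n + 1):
                    A[k][m] -= f * A[c][m]
        du = [0.0] * n
        for c in range(n - 1, -1, -1):
            s = A[c][n] - sum(A[c][m] * du[m] for m in range(c + 1, n))
            du[c] = s / A[c][c]
        u = [x + d for x, d in zip(u, du)]
    return u

# hypothetical 2x2 nonlinear system: u0 + u1 = 3, u0 * u1 = 2
F = lambda u: [u[0] + u[1] - 3.0, u[0] * u[1] - 2.0]
root = newton_fd_solve(F, [0.5, 2.5])
```

The payoff of the matrix-free formulation at scale is that `F` evaluations are all that is ever required: the Jacobian itself is never stored, which is what makes implicit formulations tractable for the coupled multi-rate problems the paper surveys.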

  2. Proceedings of 5. international scientific conference 'Sakharov readings 2005: Ecological problems of XXI century'. Pt. 2

    International Nuclear Information System (INIS)

    Kundas, S.P.; Okeanov, A.E.; Shevchuk, V.E.

    2005-05-01

    The second part of the proceedings of the fifth international scientific conference 'Sakharov readings 2005: Ecological problems of XXI century', held at the International A. Sakharov Environmental University, contains materials on the topics: radioecology, ecological and radiation monitoring, new information systems and technologies in ecology, priority ecological power engineering, management in ecology, and ecological education. The proceedings are intended for specialists in the field of ecology and related sciences, dosimetry, engineers, teachers, students and post-graduate students

  3. Proceedings of 6. international scientific conference 'Sakharov readings 2006: Ecological problems of XXI century'. Pt. 2

    International Nuclear Information System (INIS)

    Kundas, S.P.; Okeanov, A.E.; Poznyak, S.S.

    2006-05-01

    The second part of the proceedings of the sixth international scientific conference 'Sakharov readings 2006: Ecological problems of XXI century', held at the International A. Sakharov Environmental University, contains materials on the topics: radioecology, environmental monitoring, information systems and technologies in ecology, ecological priority energy engineering, ecological management and ecological education. The proceedings are intended for specialists in the field of ecology and related sciences, teachers, students and post-graduate students. (authors)

  4. Testing foreign language impact on engineering students' scientific problem-solving performance

    Science.gov (United States)

    Tatzl, Dietmar; Messnarz, Bernd

    2013-12-01

    This article investigates the influence of English as the examination language on the solution of physics and science problems by non-native speakers in tertiary engineering education. For this purpose, a statistically significant total of 96 students across four year groups, from freshman to senior level, participated in a testing experiment in the Degree Programme of Aviation at the FH JOANNEUM University of Applied Sciences, Graz, Austria. Half of each test group were given a set of 12 physics problems described in German; the other half received the same set of problems described in English. The goal was to test the linguistic reading comprehension necessary for scientific problem solving rather than physics knowledge as such. The results imply that written undergraduate English-medium engineering tests and examinations may not require additional examination time or language-specific aids for students who have reached university-entrance proficiency in English as a foreign language.

  5. Dirofilariasis in Russian Federation: a big problem with large distribution

    Directory of Open Access Journals (Sweden)

    Tatyana V. Moskvina

    2018-02-01

Dirofilariasis, caused by Dirofilaria spp., is an important vector-borne and largely zoonotic disease. In Russia, dirofilariasis is caused by two agents, D. immitis and D. repens. The present article provides a detailed analysis of human and canine dirofilariasis: methods of diagnosis, treatment, and prevention of the disease, with particular reference to control programmes. Information has been summarised from literature in different languages that is not readily accessible to the international scientific community. Human dirofilariasis was first registered in Russia in 1915, and recent reports show that the total number of infected humans increases on average by 1.8 times every three years. Human dirofilariasis has been registered in 42 federal subjects. In total, 1162 cases of subcutaneous dirofilariasis were registered in the Russian Federation between 1915 and 2013; the most frequent form was ocular dirofilariasis (more than 50% of cases). Seven cases of pulmonary dirofilariasis have been registered in Russia. Treatment of human dirofilariasis is limited to surgical removal of worms; as a result, preventive measures are of major importance for reducing the risk of Dirofilaria infection. Control programmes have been implemented by the government at all administrative levels, including diagnosis and treatment of patients; identification, isolation, and treatment of infected dogs; monthly chemoprophylaxis of dogs during the spring-summer period; and regular vector control.

  6. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

The problems of vertically-integrated aquaculture are outlined; they concern species limitations (market, biological and technological), site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sectors.

  7. Design of General-purpose Industrial signal acquisition system in a large scientific device

    Science.gov (United States)

    Ren, Bin; Yang, Lei

    2018-02-01

In order to measure industrial signals in a large scientific device experiment, a general-purpose industrial data acquisition system has been designed. It can acquire 4-20 mA current signals and 0-10 V voltage signals. Practical experiments show that the system is flexible, reliable, convenient and economical, with high resolution and strong anti-interference capability. Thus, the system fully meets the design requirements.
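The record does not give the scaling formula, but converting a 4-20 mA loop reading into an engineering value is a standard linear mapping. The sketch below is a generic illustration (the function name and default range parameters are invented, not from the paper); the 4 mA live-zero is used to flag a probable open circuit.

```python
def current_to_value(i_ma, lo=4.0, hi=20.0, v_min=0.0, v_max=100.0):
    """Map a 4-20 mA loop current linearly onto an engineering range.

    A reading well below 4 mA usually means a broken loop, so it is
    reported as an error rather than silently clamped to v_min.
    """
    if i_ma < lo - 0.5:  # small tolerance below the 4 mA live-zero
        raise ValueError(f"loop current {i_ma} mA: possible open circuit")
    frac = (i_ma - lo) / (hi - lo)
    frac = max(0.0, min(1.0, frac))  # clamp minor over/under-range
    return v_min + frac * (v_max - v_min)
```

The same mapping with `lo=0.0, hi=10.0` would cover the 0-10 V inputs, minus the open-circuit check, since 0 V is a valid reading there.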

  8. Sakharov readings 2011: environmental problems of the XXI century. Proceedings of 11 international scientific conference

    International Nuclear Information System (INIS)

    Kundas, S.P.; Poznyak, S.S.

    2011-05-01

The proceedings include the reports of the 11th international scientific conference 'Sakharov readings 2011: Environmental problems of the XXI century', which took place 19-20 May 2011 at the International A. Sakharov Environmental University. The proceedings contain abstracts on social-ecological, ecological-ethical and pedagogical problems in light of Sakharov's ideas; medical ecology; biological ecology; biomonitoring, bioindication and bioremediation; radioecology and radiation protection; information systems and technologies in ecology and medicine; ecological monitoring, management and audit; renewable energy sources and energy efficiency; climate change and sustainable development; and regional ecological problems. The conference materials are intended for a wide range of specialists in ecology and related sciences, as well as teachers, post-graduate students and students of universities and colleges.

  9. Sakharov readings 2009: Environmental problems of the XXI century. Proceedings of the 9 scientific conference

    International Nuclear Information System (INIS)

    Kundas, S.P.; Mel'nov, S.B.; Poznyak, S.S.

    2009-05-01

The proceedings include the reports of the 9th international scientific conference 'Sakharov readings 2009: Environmental problems of the XXI century', which took place 21-22 May 2009 at the International Sakharov Environmental University. The materials are organized into the following sections: social-ecological and ecological-ethical problems; medical ecology; bioecology; biomonitoring, bioindication and bioremediation; radioecology and radiation safety; ecological information systems and technologies; ecological monitoring, management and audit; renewable energy sources and energy-saving technologies; and problems of ecological education. The conference materials are intended for a wide range of specialists in ecology and related sciences, as well as teachers, post-graduate students and students of universities and colleges. (authors)

  10. Sakharov readings 2013: environmental problems of the XXI century. Proceedings of the 13 international scientific conference

    International Nuclear Information System (INIS)

    Kundas, S.P.; Poznyak, S.S.; Lysukho, N.A.

    2013-05-01

The proceedings include the reports of the 13th international scientific conference 'Sakharov readings 2013: Environmental problems of the XXI century', which took place 16-17 May 2013 at the International A. Sakharov Environmental University (Minsk, Belarus). The proceedings contain abstracts on social-ecological and ecological-ethical problems of modern times; education for sustainable development; medical ecology; biological ecology; radiobiology; radioecology and radiation protection; information systems and technologies in ecology and medicine; regional ecological problems; ecological monitoring and management; and renewable energy sources and energy efficiency. The conference materials are intended for a wide range of specialists in ecology and related sciences, as well as teachers, post-graduate students and students of universities and colleges.

  11. Sakharov readings 2015: environmental problems of the XXI century. Proceedings of the 15 international scientific conference

    International Nuclear Information System (INIS)

    Poznyak, S.S.; Lysukho, N.A.

    2015-05-01

The proceedings include the reports of the 15th international scientific conference 'Sakharov readings 2015: Environmental problems of the XXI century', which took place 21-22 May 2015 at the International A. Sakharov Environmental University (Minsk, Belarus). The proceedings contain abstracts on philosophical, social-ecological and bioethical problems of modernity; education for sustainable development; medical ecology; biological ecology; radiobiology; radioecology and radiation protection; information systems and technologies in ecology and medicine; regional environmental problems; ecological monitoring and management; and renewable energy sources and energy efficiency. The conference materials are intended for a wide range of specialists in ecology and related sciences, as well as teachers, post-graduate students and students of universities and colleges. (authors)

  12. Sakharov readings 2012: environmental problems of the XXI century. Proceedings of 12 international scientific conference

    International Nuclear Information System (INIS)

    Kundas, S.P.; Poznyak, S.S.

    2012-05-01

The proceedings include the reports of the 12th international scientific conference 'Sakharov readings 2012: Environmental problems of the XXI century', which took place 17-18 May 2012 at the International A. Sakharov Environmental University (Minsk, Belarus). The proceedings contain abstracts on social-ecological, ecological-ethical and pedagogical problems in light of Sakharov's ideas; medical ecology; biological ecology; biomonitoring, bioindication and bioremediation; radioecology and radiation protection; information systems and technologies in ecology and medicine; ecological monitoring, management and audit; renewable energy sources and energy efficiency; climate change and sustainable development; and regional ecological problems. The conference materials are intended for a wide range of specialists in ecology and related sciences, as well as teachers, post-graduate students and students of universities and colleges.

  13. Sakharov readings 2016: environmental problems of the XXI century. Proceedings of the 16 international scientific conference

    International Nuclear Information System (INIS)

    Maskevich, S.A.; Poznyak, S.S.; Lysukho, N.A.

    2016-05-01

The proceedings include the reports of the 16th international scientific conference 'Sakharov readings 2016: Environmental problems of the XXI century', which took place 19-20 May 2016 at the International A. Sakharov Environmental Institute of the Belarusian State University (Minsk, Belarus). The proceedings contain abstracts on philosophical, social-ecological and bioethical problems of modernity; education for sustainable development; medical ecology; biological ecology; radiobiology; radioecology and radiation protection; information systems and technologies in ecology and medicine; regional environmental problems; ecological monitoring and management; and renewable energy sources and energy efficiency. Within the framework of the conference, a discussion on 'Ethical aspects of biomedicine, genetics, nanomedical technologies and human ecology' was held. The conference materials are intended for a wide range of specialists in ecology and related sciences, as well as teachers, post-graduate students and students of universities and colleges. (authors)

  14. Collaborative Problem-Solving Environments; Proceedings for the Workshop CPSEs for Scientific Research, San Diego, California, June 20 to July 1, 1999

    Energy Technology Data Exchange (ETDEWEB)

    Chin, George

    1999-01-11

    A workshop on collaborative problem-solving environments (CPSEs) was held June 29 through July 1, 1999, in San Diego, California. The workshop was sponsored by the U.S. Department of Energy and the High Performance Network Applications Team of the Large Scale Networking Working Group. The workshop brought together researchers and developers from industry, academia, and government to identify, define, and discuss future directions in collaboration and problem-solving technologies in support of scientific research.

  15. Scientific Integrity and Professional Ethics at AGU - The Establishment and Evolution of an Ethics Program at a Large Scientific Society

    Science.gov (United States)

    McPhaden, Michael; Leinen, Margaret; McEntee, Christine; Townsend, Randy; Williams, Billy

    2016-04-01

The American Geophysical Union (AGU), a scientific society of 62,000 members worldwide, has established a set of scientific integrity and professional ethics guidelines governing the actions of its members, the governance of the Union's internal activities, and the operation of and participation in its publications and scientific meetings. This presentation will provide an overview of the Ethics program at AGU, highlighting the reasons for its establishment, the process for dealing with ethical breaches, the number and types of cases considered, how AGU helps educate its members on ethics issues, and AGU's rapidly evolving efforts to address issues in the emerging field of GeoEthics. The presentation will also cover the most recent focus of the AGU Ethics program: the role of AGU and other scientific societies in addressing sexual harassment, and AGU's work to strengthen its program in this area.

  16. Fishery management problems and possibilities on large southeastern reservoirs

    Science.gov (United States)

    Parsons, John W.

    1958-01-01

Principal problems concerning the fisheries of large reservoirs in the Southeast are: inefficient and highly selective exploitation of fish stocks, and protection and reclamation of damaged or threatened fisheries in tailwaters and tributary streams. Seven mainstream reservoirs on which data are available support an average angling pressure of 4.9 trips per acre per year and an average catch of 16 pounds of sport fish and 6 pounds of food fish. The commercial take is 7 pounds per acre. The rate of catch of sport fish, based upon tag returns, is only 3 percent. Sixteen storage reservoirs support an average angling pressure of 5.0 trips per acre per year and an average catch of 13 pounds of sport fish and 1 pound of food fish. The commercial catch is of no significance. The average rate of catch of sport fish is 17 percent of the catchable population. Fish population studies indicate that there are twice as many sport fish and four times as many food fish in mainstream reservoirs as in storage reservoirs.

  17. nanoHUB.org: Experiences and Challenges in Software Sustainability for a Large Scientific Community

    Directory of Open Access Journals (Sweden)

    Lynn Zentner

    2014-07-01

The science gateway nanoHUB.org, funded by the National Science Foundation (NSF), serves a large scientific community dedicated to research and education in nanotechnology, with community-contributed simulation codes as well as a vast repository of other materials such as recorded presentations, teaching materials, workshops and courses. Nearly 330,000 users annually access over 4400 items of content on nanoHUB, including 343 simulation tools. Arguably the largest nanotechnology facility in the world, nanoHUB has led the way not only in providing open access to scientific code in the nanotechnology community, but also in lowering barriers to the use of that code, by providing a platform where developers can easily and quickly deploy code written in a variety of languages with user-friendly graphical user interfaces, and where users can run the latest versions of codes transparently on the grid or other powerful resources without ever having to download or update code. Being a leader in open-access code deployment presents nanoHUB with opportunities and challenges as it meets the current and future needs of its community. This paper discusses the experiences of nanoHUB in addressing and adapting to the changing landscape of scientific software in ways that best serve its community and meet the needs of the largest portion of its user base.

  18. Five decades of tackling models for stiff fluid dynamics problems a scientific autobiography

    CERN Document Server

    Zeytounian, Radyadour Kh

    2014-01-01

    Rationality - as opposed to 'ad-hoc' - and asymptotics - to emphasize the fact that perturbative methods are at the core of the theory - are the two main concepts associated with the Rational Asymptotic Modeling (RAM) approach in fluid dynamics when the goal is to specifically provide useful models accessible to numerical simulation via high-speed computing. This approach has contributed to a fresh understanding of Newtonian fluid flow problems and has opened up new avenues for tackling real fluid flow phenomena, which are known to lead to very difficult mathematical and numerical problems irrespective of turbulence. With the present scientific autobiography the author guides the reader through his somewhat non-traditional career; first discovering fluid mechanics, and then devoting more than fifty years to intense work in the field. Using both personal and general historical contexts, this account will be of benefit to anyone interested in the early and contemporary developments of an important branch of the...

  19. Parallel supercomputing: Advanced methods, algorithms, and software for large-scale linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Carey, G.F.; Young, D.M.

    1993-12-31

The program outlined here is directed to research on methods, algorithms, and software for distributed parallel supercomputers. Of particular interest are finite element methods and finite difference methods together with sparse iterative solution schemes for scientific and engineering computations of very large-scale systems. Both linear and nonlinear problems will be investigated. In the nonlinear case, applications with bifurcation to multiple solutions will be considered using continuation strategies. The parallelizable numerical methods of particular interest are a family of partitioning schemes embracing domain decomposition, element-by-element strategies, and multi-level techniques. The methods will be further developed by incorporating parallel iterative solution algorithms with associated preconditioners in parallel computer software. The schemes will be implemented on distributed-memory parallel architectures such as the CRAY MPP, Intel Paragon, the NCUBE3, and the Connection Machine. We will also consider other new architectures such as the Kendall-Square (KSQ) and proposed machines such as the TERA. The applications will focus on large-scale three-dimensional nonlinear flow and reservoir problems with strong convective transport contributions. These are legitimate grand-challenge-class computational fluid dynamics (CFD) problems of significant practical interest to DOE. The methods and algorithms developed will, however, be of wider interest.

  20. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.
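The abstract describes the model only at a high level. As a hedged illustration of the general idea (the function, parameter names, and numbers below are invented, not the authors' framework), a bandwidth-contention-style model caps each core's effective memory bandwidth at its share of the node's sustained bandwidth and adds a measured communication term:

```python
def predicted_time(t_compute, bytes_moved, bw_core, cores_per_node,
                   bw_node, t_comm):
    """Toy bandwidth-contention performance model (illustrative only).

    When all cores stream memory at once, each core's effective
    bandwidth is the smaller of its standalone bandwidth and its share
    of the node's sustained bandwidth.  Total predicted time is
    compute time + memory time + communication time.
    """
    bw_effective = min(bw_core, bw_node / cores_per_node)
    t_memory = bytes_moved / bw_effective  # seconds spent on memory traffic
    return t_compute + t_memory + t_comm
```

For example, a core that alone could stream at 4 GB/s but shares a 16 GB/s node with 7 other cores is modeled as running at 2 GB/s, which is how contention inflates the memory term as cores per node grow.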

  1. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2013-12-01

In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.

  2. Properties and solution methods for large location-allocation problems

    DEFF Research Database (Denmark)

    Juel, Henrik; Love, Robert F.

    1982-01-01

Location-allocation with l_p distances is studied. It is shown that this structure can be expressed as a concave minimization programming problem. Since concave minimization algorithms are not yet well developed, five solution methods are developed which utilize the special properties of the location-allocation problem. Using the rectilinear distance measure, two of these algorithms achieved optimal solutions in all 102 test problems for which solutions were known. The algorithms can be applied to much larger problems than any existing exact methods.
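The abstract does not spell out the five methods, but the classical baseline for this problem is Cooper's alternating heuristic, sketched below for the rectilinear (l1) case, where the optimal single-facility location for a fixed cluster is the coordinate-wise median. This is a minimal illustration of the problem structure, not a reconstruction of the paper's algorithms.

```python
import statistics

def alternate(points, centers, iters=50):
    """Cooper-style alternating heuristic for multi-facility
    location-allocation with rectilinear (l1) distances.

    Allocation step: assign each demand point to its nearest center.
    Location step: replace each center by the coordinate-wise median
    of its cluster, the l1-optimal single-facility location.
    """
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)),
                    key=lambda k: abs(p[0] - centers[k][0])
                                  + abs(p[1] - centers[k][1]))
            clusters[j].append(p)
        new = [(statistics.median(x for x, _ in c),
                statistics.median(y for _, y in c)) if c else centers[j]
               for j, c in enumerate(clusters)]
        if new == centers:  # fixed point reached
            break
        centers = new
    return centers
```

Like all alternating schemes on this concave-minimization structure, it converges only to a local optimum, which is precisely why exact and specialized methods such as those in the paper are of interest.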

  3. Gender Diversity in a STEM Subfield - Analysis of a Large Scientific Society and Its Annual Conferences

    Science.gov (United States)

    Shishkova, Evgenia; Kwiecien, Nicholas W.; Hebert, Alexander S.; Westphall, Michael S.; Prenni, Jessica E.; Coon, Joshua J.

    2017-12-01

Speaking engagements, serving as session chairs, and receiving awards at national meetings are essential stepping stones towards professional success for scientific researchers. Studies of gender parity in meetings of national scientific societies repeatedly uncover bias in speaker selection, engendering underrepresentation of women among featured presenters. To continue this dialogue, we analyzed membership data and annual conference programs of a large scientific society (>7000 members annually) in a male-rich (~70% male), technology-oriented STEM subfield. We detected a pronounced skew towards males among invited keynote lecturers, plenary speakers, and recipients of the society's Senior Investigator award (15%, 13%, and 8% females, respectively). However, the proportion of females among Mid-Career and Young Investigator award recipients and oral session chairs resembled the current gender distribution of the general membership. Female members were more likely to present at the conferences and equally likely to apply and be accepted for oral presentations as their male counterparts. The gender of a session chair had no effect on the gender distribution of selected applicants. Interestingly, we identified several research subareas that were naturally enriched (i.e., not influenced by unequal selection of presenters) for either female or male participants, illustrating within a single subfield the gender divide along the biology-technology line typical of all STEM disciplines. Two female-enriched topics experienced a rapid growth in popularity within the examined period, more than doubling the number of associated researchers. Collectively, these findings contribute to the contemporary discourse on gender in science and hopefully will propel positive changes within this and other societies.

  4. Nuclear EMP simulation for large-scale urban environments. FDTD for electrically large problems.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, William S. [Los Alamos National Laboratory; Bull, Jeffrey S. [Los Alamos National Laboratory; Wilcox, Trevor [Los Alamos National Laboratory; Bos, Randall J. [Los Alamos National Laboratory; Shao, Xuan-Min [Los Alamos National Laboratory; Goorley, John T. [Los Alamos National Laboratory; Costigan, Keeley R. [Los Alamos National Laboratory

    2012-08-13

In the case of a terrorist nuclear attack in a metropolitan area, EMP measurement could provide: (1) prompt confirmation of the nature of the explosion (chemical or nuclear) for emergency response; and (2) characterization parameters of the device (reaction history, yield) for technical forensics. However, the urban environment could affect the fidelity of the prompt EMP measurement (as with all other types of prompt measurement): (1) the nuclear EMP wavefront would no longer be coherent, due to incoherent production, attenuation, and propagation of gammas and electrons; and (2) EMP propagation outward from the source region would undergo complicated transmission, reflection, and diffraction processes. For EMP simulation in electrically large urban environments we use: (1) a coupled MCNP/FDTD (finite-difference time-domain Maxwell solver) approach; and (2), because FDTD tends to be limited to problems that are not 'too' large compared to the wavelengths of interest owing to numerical dispersion and anisotropy, a higher-order, low-dispersion, isotropic FDTD algorithm for EMP propagation.
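The dispersion limitation mentioned here is a textbook property of the standard Yee scheme (general background, not taken from this record): in one dimension the numerical dispersion relation is

```latex
\sin^2\!\left(\frac{\omega \,\Delta t}{2}\right)
  = \left(\frac{c\,\Delta t}{\Delta x}\right)^{2}
    \sin^2\!\left(\frac{k \,\Delta x}{2}\right)
```

which reduces to the exact relation $\omega = ck$ only in the limit $\Delta x, \Delta t \to 0$. The residual phase error accumulates with propagation distance, which is why domains spanning many wavelengths ('electrically large' problems) demand either very fine grids or, as here, a higher-order low-dispersion algorithm.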

  5. SCIENTIFIC AND METHODICAL ASPECTS OF FORMATION OF SUBJECT CONTENT OF TRAINING COURSESFOR INVERSE PROBLEMS FOR DIFFERENTIAL EQUATIONS

    Directory of Open Access Journals (Sweden)

    В С Корнилов

    2016-12-01

The article presents scientific and methodological aspects of forming the content of instruction in inverse problems for differential equations for students of higher educational institutions in physics, mathematics and natural science programmes. The goals and principles of teaching inverse problems for differential equations are formulated, and attention is drawn to particular issues in teaching such courses. The article describes the classification criteria and target modules that serve as tools for creating and analyzing the model and curriculum that form the learning content on inverse problems for differential equations. It concludes that studying inverse problems for differential equations develops students' scientific, educational and humanitarian potential: as a result of this training they gain fundamental knowledge in applied and computational mathematics, and also develop a scientific worldview and applied, environmental and informational thinking.

  6. Promoting access to and use of seismic data in a large scientific community

    Directory of Open Access Journals (Sweden)

    Michel Eric

    2017-01-01

The growing amount of seismic data available from space missions (SOHO, CoRoT, Kepler, SDO, ...) as well as from ground-based facilities (GONG, BiSON, large ground-based programmes, ...), stellar modelling and numerical simulations creates new scientific perspectives, such as characterizing stellar populations in our Galaxy or planetary systems, by providing model-independent global properties of stars, such as mass, radius, and surface gravity, to within a few percent accuracy, as well as constraints on age. These applications address a broad scientific community beyond the solar and stellar one and require combining indices elaborated from data in different databases (e.g. seismic archives and ground-based spectroscopic surveys). It is thus a basic requirement to develop simple and efficient access to these various data resources and dedicated tools. In the framework of the European project SpaceInn (FP7), several data sources have been developed or upgraded. The Seismic Plus Portal has been developed, where synthetic descriptions of the most relevant existing data sources can be found, together with tools for localizing existing data for a given object or period and assisting data queries. This project has been developed within the Virtual Observatory (VO) framework. In this paper, we review the various facilities and tools developed within this programme. The SpaceInn project (Exploitation of Space Data for Innovative Helio- and Asteroseismology) was initiated by the European Helio- and Asteroseismology Network (HELAS).

  7. Linking to Scientific Data: Identity Problems of Unruly and Poorly Bounded Digital Objects

    Directory of Open Access Journals (Sweden)

    Laura Wynholds

    2011-03-01

Within information systems, a significant aspect of search and retrieval across information objects, such as datasets, journal articles, or images, relies on the identity construction of the objects. This paper uses identity to refer to the qualities or characteristics of an information object that make it definable and recognizable, and that can be used to distinguish it from other objects. Identity, in this context, can be seen as the foundation from which citations, metadata and identifiers are constructed. In recent years the idea of including datasets within the scientific record has been gaining significant momentum, with publishers, granting agencies and libraries engaging with the challenge. However, the task has been fraught with questions of best practice for establishing this infrastructure, especially as regards how citations, metadata and identifiers should be constructed. These questions suggest a problem with how dataset identities are formed, such that an engagement with the definition of datasets as conceptual objects is warranted. This paper explores some of the ways in which scientific data is an unruly and poorly bounded object, and goes on to propose that, in order for datasets to fulfill the roles expected of them, the following identity functions are essential for scholarly publications: (i) the dataset is constructed as a semantically and logically concrete object; (ii) the identity of the dataset is embedded, inherent and/or inseparable; (iii) the identity embodies a framework of authorship, rights and limitations; and (iv) the identity translates into an actionable mechanism for retrieval or reference.

  8. Fuzzy pictures as philosophical problem and scientific practice a study of visual vagueness

    CERN Document Server

    Cat, Jordi

    2017-01-01

    This book presents a comprehensive discussion on the characterization of vagueness in pictures. It reports on how the problem of representation of images has been approached in scientific practice, highlighting the role of mathematical methods and the philosophical background relevant for issues such as representation, categorization and reasoning. Without delving too much into the technical details, the book examines and defends different kinds of values of fuzziness based on a complex approach to categorization as a practice, adopting conceptual and empirical suggestions from different fields including the arts. It subsequently advances criticisms and provides suggestions for interpretation and application. By describing a cognitive framework based on fuzzy, rough and near sets, and discussing all of the relevant mathematical and philosophical theories for the representation and processing of vagueness in images, the book offers a practice-oriented guide to fuzzy visual reasoning, along with novel insights ...

  9. Radiolabeled monoclonal antibodies for imaging and therapy: Potential, problems, and prospects: Scientific highlights

    International Nuclear Information System (INIS)

    Srivastava, S.C.; Buraggi, G.L.

    1986-01-01

    This meeting focused on areas of research on radiolabeled monoclonal antibodies. Topics covered included the production, purification, and fragmentation of monoclonal antibodies and immunochemistry of hybridomas; the production and the chemistry of radionuclides; the radiohalogenation and radiometal labeling techniques; the in-vivo pharmacokinetics of radiolabeled antibodies; the considerations of immunoreactivity of radiolabeled preparations; the instrumentation and imaging techniques as applied to radioimmunodetection; the radiation dosimetry in diagnostic and therapeutic use of labeled antibodies; the radioimmunoscintigraphy and radioimmunotherapy studies; and perspectives and directions for future research. Tutorial as well as scientific lectures describing the latest research data on the above topics were presented. Three workshop panels were convened on ''Methods for Determining Immunoreactivity of Radiolabeled Monoclonal Antibodies - Problems and Pitfalls,'' ''Radiobiological and Dosimetric Considerations for Immunotherapy with Labeled Antibodies,'' and ''The Human Anti-Mouse Antibody Response in Patients.''

  10. Why is risk communication hardly applied in Japan? Psychological problem of scientific experts

    International Nuclear Information System (INIS)

    Kosugi, Motoko; Tsuchiya, Tomoko; Taniguchi, Taketoshi

    2000-01-01

    The purpose of this paper is to discuss the problems that impede communication with the public about technological risks in Japan, focusing especially on the views of experts as suppliers of risk information. Through questionnaire surveys, we also confirmed that, as many previous studies have shown, there are significant differences between the public and scientific experts in risk perception and in their information environments regarding science and technology. Most importantly, experts perceive the gap in risk perception between the public and experts to be larger than the public does. We conclude that this perception among experts impedes taking the first step toward communicating with the public about technological risks. (author)

  11. Scientific provision of the problems of overcoming the Chernobyl catastrophe consequences. Chapter 7

    International Nuclear Information System (INIS)

    Konoplya, E.F.; Rolevich, I.V.; Gurachevskij, V.L.; Poplyko, I.Ya.; Semeshko, A.V.

    1998-01-01

    At present in the Republic of Belarus, research on the problems of overcoming the consequences of the Chernobyl accident is carried out in the following directions: radiation protection of the population; health of the population affected by the Chernobyl NPP accident; complex radiation-ecological assessment of the environment and of the living conditions of the population; rehabilitation of the contaminated territories; and instrumental and methodical provision of radiation control. The experience gained from the scientific approach to solving these wide-scale, multidisciplinary tasks helps transform separate pieces of knowledge about radiation safety into a holistic conception of the safety and protection of the population in emergencies caused by industrial accidents, catastrophes and natural disasters.

  12. Mathematical programming methods for large-scale topology optimization problems

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana

    for mechanical problems, but has rapidly extended to many other disciplines, such as fluid dynamics and biomechanical problems. However, the novelty and improvement of optimization methods have been very limited. It is, indeed, necessary to develop new optimization methods to improve the final designs......, and at the same time, reduce the number of function evaluations. Nonlinear optimization methods, such as sequential quadratic programming and interior point solvers, have hardly been embraced by the topology optimization community. Thus, this work is focused on the introduction of this kind of second...... for the classical minimum compliance problem. Two of the state-of-the-art optimization algorithms are investigated and implemented for this structural topology optimization problem. A Sequential Quadratic Programming method (TopSQP) and an interior point method (TopIP) are developed exploiting the specific mathematical...

  13. Abstracts of papers of international scientific conference 'Ten Years After the Chernobyl Catastrophe (Scientific Aspects of Problem)'

    International Nuclear Information System (INIS)

    Konoplya, E.F.; Amvros'ev, A.P.; Bogdevich, I.M.; Bondar', Yu.I.; Karaseva, E.I.; Lobanok, L.M.; Matsko, V.P.; Pikulik, M.M.; Rolevich, I.V.; Stozharov, A.N.; Yakushev, B.I.

    1996-02-01

    The collection is dedicated to the 10th anniversary of the Chernobyl catastrophe and contains the results of research carried out in Belarus, as well as in Ukraine and the Russian Federation, on different aspects of the Chernobyl problems: radiation medicine and risks, biological radiation effects and their forecasting, agricultural radiology and radioecology, decontamination and radioactive waste management, and socio-economic and psychological problems caused by the Chernobyl catastrophe. (authors)

  14. IYPT problems teach high school students about teamwork and the scientific method

    Science.gov (United States)

    Kochanski, K.; Klishin, A.

    2015-12-01

    Laboratory work is often STEM students' primary exposure to key creative and communicative skills in the sciences, including experimental design, troubleshooting, teamwork, and oral presentation. The International Young Physicists' Tournament (IYPT) teaches these skills by inviting high school students to investigate simple unsolved systems instead of reproducing familiar results. Students work in teams to form hypotheses, gather data, and present their results orally in a tournament format. The IYPT has published 17 questions yearly since 1988, and its archives are an efficient source of experimental problems for outreach programs; they have also been used for first-year undergraduate project classes (Planisic, 2009). We present insights and outcomes from two schools in which we introduced a new extracurricular program based on the IYPT model. Twenty-four students worked in small teams for three hours per day for six weeks. Surprisingly, most teams chose problems in unfamiliar subject areas such as fluid dynamics, and tailored their approaches to take advantage of individual skills including soldering, photography, and theoretical analysis. As the program progressed, students developed an increasingly intuitive understanding of the scientific method. They began to discuss the repeatability of their experiments without prompting, and were increasingly willing to describe alternative hypotheses.

  15. Large-scale computation at PSI scientific achievements and future requirements

    International Nuclear Information System (INIS)

    Adelmann, A.; Markushin, V.

    2008-11-01

    Computational modelling and simulation are among the disciplines that have seen the most dramatic growth in capabilities in the 20th century. Within the past two decades, scientific computing has become an important contributor to all scientific research programs. Computational modelling and simulation are particularly indispensable for solving research problems that are unsolvable by traditional theoretical and experimental approaches, hazardous to study, or time-consuming or expensive to solve by traditional means. Many such research areas are found in PSI's research portfolio. Advances in computing technologies (including hardware and software) during the past decade have set the stage for a major step forward in modelling and simulation. We have now arrived at a situation where we have a number of otherwise unsolvable problems, where simulations are as complex as the systems under study. In 2008 the High-Performance Computing (HPC) community entered the petascale era with the heterogeneous Opteron/Cell machine called Roadrunner, built by IBM for the Los Alamos National Laboratory. We are on the brink of a time when the availability of many hundreds of thousands of cores will open up new challenging possibilities in physics, algorithms (numerical mathematics) and computer science. However, to deliver on this promise, it is not enough to provide 'peak' performance in terms of petaflops, the maximum theoretical speed a computer can attain. Most importantly, this must be translated into a corresponding increase in the capabilities of scientific codes. This is a daunting problem that can only be solved by increasing investment in hardware, in the accompanying system software that enables the reliable use of high-end computers, in scientific competence, i.e. the mathematical (parallel) algorithms that form the basis of the codes, and in education. In the case of Switzerland, the white paper 'Swiss National Strategic Plan for High Performance Computing and Networking

  16. Large-scale computation at PSI scientific achievements and future requirements

    Energy Technology Data Exchange (ETDEWEB)

    Adelmann, A.; Markushin, V

    2008-11-15

    Computational modelling and simulation are among the disciplines that have seen the most dramatic growth in capabilities in the 20th century. Within the past two decades, scientific computing has become an important contributor to all scientific research programs. Computational modelling and simulation are particularly indispensable for solving research problems that are unsolvable by traditional theoretical and experimental approaches, hazardous to study, or time-consuming or expensive to solve by traditional means. Many such research areas are found in PSI's research portfolio. Advances in computing technologies (including hardware and software) during the past decade have set the stage for a major step forward in modelling and simulation. We have now arrived at a situation where we have a number of otherwise unsolvable problems, where simulations are as complex as the systems under study. In 2008 the High-Performance Computing (HPC) community entered the petascale era with the heterogeneous Opteron/Cell machine called Roadrunner, built by IBM for the Los Alamos National Laboratory. We are on the brink of a time when the availability of many hundreds of thousands of cores will open up new challenging possibilities in physics, algorithms (numerical mathematics) and computer science. However, to deliver on this promise, it is not enough to provide 'peak' performance in terms of petaflops, the maximum theoretical speed a computer can attain. Most importantly, this must be translated into a corresponding increase in the capabilities of scientific codes. This is a daunting problem that can only be solved by increasing investment in hardware, in the accompanying system software that enables the reliable use of high-end computers, in scientific competence, i.e. the mathematical (parallel) algorithms that form the basis of the codes, and in education. In the case of Switzerland, the white paper 'Swiss National Strategic Plan for High Performance Computing

  17. Paul Scherrer Institute Scientific and Technical Report 2000. Volume VI: Large Research Facilities

    International Nuclear Information System (INIS)

    Foroughi, Fereydoun; Bercher, Renate; Buechli, Carmen; Zumkeller, Lotty

    2001-01-01

    The PSI Department Large Research Facilities (GFA) joins the efforts to provide an excellent research environment to Swiss and foreign research groups on the experimental facilities driven by our high intensity proton accelerator complex. Its divisions care for the running, maintenance and enhancement of the accelerator complex, the primary proton beamlines, the targets and the secondary beams, as well as the neutron spallation source SINQ. The division for technical support and coordination provides technical support to the research facility complementary to the basic logistics available from the department for logistics and marketing. Besides running the facilities, the staff of the department is also involved in theoretical and experimental research projects. Some of them address basic scientific questions, mainly concerning the properties of micro- or nanostructured materials: experiments as well as large-scale computer simulations of molecular dynamics were performed to investigate nonclassical materials properties. Others are related to improvements or extensions of the capabilities of our facilities. We also report on intriguing results from applications of neutron capture radiography, the prompt gamma activation method and the isotope production facility at SINQ.

  18. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    Science.gov (United States)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as means to diagnose the current status of student achievement in science and compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously scored open-ended items are pervasively used in large-scale assessments such as the Trends in International Mathematics and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. This study collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously scored open-ended items can be used to determine whether students have normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of student open-ended responses do open-ended items become a valid and reliable tool for assessing students' knowledge integration ability.
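The Rasch Partial Credit Model underlying this analysis can be illustrated with a short sketch. This is a minimal illustration only; the ability values and step thresholds below are made up, not calibrated values from the study's data:

```python
import math

def pcm_probs(theta, thresholds):
    """Rasch Partial Credit Model: probabilities of the score categories
    0..m for an item with m step thresholds delta_j, at ability theta."""
    logits = [0.0]                      # empty sum for category 0
    for d in thresholds:
        logits.append(logits[-1] + (theta - d))
    denom = sum(math.exp(v) for v in logits)
    return [math.exp(v) / denom for v in logits]

def expected_score(theta, thresholds):
    """Model-expected item score at a given ability."""
    return sum(k * p for k, p in enumerate(pcm_probs(theta, thresholds)))
```

Higher ability yields a higher expected score, which is the property a Rasch analysis exploits when placing students and items on a common scale.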

  19. Paul Scherrer Institute Scientific and Technical Report 2000. Volume VI: Large Research Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Foroughi, Fereydoun; Bercher, Renate; Buechli, Carmen; Zumkeller, Lotty [eds.

    2001-07-01

    The PSI Department Large Research Facilities (GFA) joins the efforts to provide an excellent research environment to Swiss and foreign research groups on the experimental facilities driven by our high intensity proton accelerator complex. Its divisions care for the running, maintenance and enhancement of the accelerator complex, the primary proton beamlines, the targets and the secondary beams, as well as the neutron spallation source SINQ. The division for technical support and coordination provides technical support to the research facility complementary to the basic logistics available from the department for logistics and marketing. Besides running the facilities, the staff of the department is also involved in theoretical and experimental research projects. Some of them address basic scientific questions, mainly concerning the properties of micro- or nanostructured materials: experiments as well as large-scale computer simulations of molecular dynamics were performed to investigate nonclassical materials properties. Others are related to improvements or extensions of the capabilities of our facilities. We also report on intriguing results from applications of neutron capture radiography, the prompt gamma activation method and the isotope production facility at SINQ.

  20. Advancements in Large-Scale Data/Metadata Management for Scientific Data.

    Science.gov (United States)

    Guntupally, K.; Devarakonda, R.; Palanisamy, G.; Frame, M. T.

    2017-12-01

    Scientific data often comes with complex and diverse metadata which are critical for data discovery and for users. The Online Metadata Editor (OME) tool, which was developed by an Oak Ridge National Laboratory team, effectively manages diverse scientific datasets across several federal data centers, such as DOE's Atmospheric Radiation Measurement (ARM) Data Center and USGS's Core Science Analytics, Synthesis, and Libraries (CSAS&L) project. This presentation will focus mainly on recent developments and future strategies for refining the OME tool within these centers. The ARM OME is a standards-based tool (https://www.archive.arm.gov/armome) that allows scientists to create and maintain metadata about their data products. The tool has been improved with new workflows that help metadata coordinators and submitting investigators to submit and review their data more efficiently. The ARM Data Center's newly upgraded Data Discovery Tool (http://www.archive.arm.gov/discovery) uses rich metadata generated by the OME to enable search and discovery of thousands of datasets, while also providing a citation generator and modern order-delivery techniques like Globus (using GridFTP), Dropbox and THREDDS. The Data Discovery Tool also supports incremental indexing, which allows users to find new data as and when they are added. The USGS CSAS&L search catalog employs a custom version of the OME (https://www1.usgs.gov/csas/ome), which has been upgraded with high-level Federal Geographic Data Committee (FGDC) validations and the ability to reserve and mint Digital Object Identifiers (DOIs). The USGS's Science Data Catalog (SDC) (https://data.usgs.gov/datacatalog) allows users to discover a myriad of science data holdings through a web portal. Recent major upgrades to the SDC and ARM Data Discovery Tool include improved harvesting performance and migration using new search software, such as Apache Solr 6.0 for serving up data/metadata to scientific communities. Our presentation will highlight ...

  1. Guaranteed Discrete Energy Optimization on Large Protein Design Problems.

    Science.gov (United States)

    Simoncini, David; Allouche, David; de Givry, Simon; Delmas, Céline; Barbe, Sophie; Schiex, Thomas

    2015-12-08

    In Computational Protein Design (CPD), assuming a rigid backbone and an amino-acid rotamer library, the problem of finding a sequence with an optimal conformation is NP-hard. In this paper, using Dunbrack's rotamer library and the Talaris2014 decomposable energy function, we use an exact deterministic method combining branch and bound, arc consistency, and tree decomposition to provably identify the global minimum-energy sequence-conformation on full-redesign problems, defining search spaces of size up to 10^234. This is achieved on a single core of a standard computing server, requiring a maximum of 66 GB of RAM. A variant of the algorithm is able to exhaustively enumerate all sequence-conformations within an energy threshold of the optimum. These proven optimal solutions are then used to evaluate the frequencies and amplitudes, in energy and sequence, at which an existing CPD-dedicated simulated annealing implementation may miss the optimum on these full-redesign problems. The probability of finding an optimum drops close to 0 very quickly. In the worst case, despite 1,000 repeats, the annealing algorithm remained more than 1 Rosetta unit away from the optimum, leading to design sequences that could differ from the optimal sequence by more than 30% of their amino acids.
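The benchmarking idea in this abstract, comparing a stochastic annealer against a provably exact optimum, can be sketched on a toy version of the problem. The 4-letter alphabet, length-6 sequences and random pairwise energy table below are illustrative stand-ins (not Dunbrack rotamers, Talaris2014 or Rosetta), small enough that the exact optimum is found by exhaustive search rather than branch and bound:

```python
import itertools
import math
import random

ALPHABET = "ACDE"   # toy 4-letter "amino-acid" alphabet
L = 6               # toy sequence length (real CPD spaces reach 10^234)

# deterministic random table of pairwise interaction energies
_rng = random.Random(0)
PAIR = {(i, a, j, b): _rng.uniform(-1.0, 1.0)
        for i in range(L) for j in range(i + 1, L)
        for a in ALPHABET for b in ALPHABET}

def energy(seq):
    """Decomposable energy: a sum of pairwise terms, as in CPD."""
    return sum(PAIR[(i, seq[i], j, seq[j])]
               for i in range(L) for j in range(i + 1, L))

def exhaustive_min():
    """Exact global minimum by enumeration (stand-in for branch and bound)."""
    return min(itertools.product(ALPHABET, repeat=L), key=energy)

def anneal(steps=2000, t0=2.0, seed=1):
    """Metropolis simulated annealing with single-position mutations."""
    r = random.Random(seed)
    seq = [r.choice(ALPHABET) for _ in range(L)]
    e = energy(seq)
    for s in range(steps):
        t = t0 * (1.0 - s / steps) + 1e-3          # linear cooling schedule
        i = r.randrange(L)
        old = seq[i]
        seq[i] = r.choice(ALPHABET)
        e2 = energy(seq)
        if e2 <= e or r.random() < math.exp((e - e2) / t):
            e = e2                                  # accept the move
        else:
            seq[i] = old                            # reject and revert
    return seq, e
```

The exact minimum gives a floor against which the annealer's best energy can be measured, mirroring how the paper quantifies how often and by how much simulated annealing misses the proven optimum.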

  2. Spectral Analysis of Large Finite Element Problems by Optimization Methods

    Directory of Open Access Journals (Sweden)

    Luca Bergamaschi

    1994-01-01

    Full Text Available Recently an efficient method for the solution of the partial symmetric eigenproblem (DACG, deflated-accelerated conjugate gradient) was developed, based on the conjugate gradient (CG) minimization of successive Rayleigh quotients over deflated subspaces of decreasing size. In this article four different choices of the coefficient β_k required at each DACG iteration for the computation of the new search direction p_k are discussed. The “optimal” choice is the one that yields the same asymptotic convergence rate as the CG scheme applied to the solution of linear systems. Numerical results point out that the optimal β_k leads to a very cost-effective algorithm in terms of CPU time in all the sample problems presented. Various preconditioners are also analyzed. It is found that DACG using the optimal β_k and (LL^T)^{-1} as a preconditioner, L being the incomplete Cholesky factor of A, proves a very promising method for the partial eigensolution. It appears to be superior to the Lanczos method in the evaluation of the 40 leftmost eigenpairs of five finite element problems, and particularly for the largest problem, with size equal to 4560, for which the speed gain turns out to fall between 2.5 and 6.0, depending on the eigenpair level.
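The core idea, minimizing the Rayleigh quotient to reach the leftmost eigenpair, can be sketched in a few lines. This is a deliberately stripped-down version: plain gradient descent with renormalization, with no β_k acceleration, deflation or preconditioning, applied to a small 1-D Laplacian rather than a finite element matrix:

```python
import numpy as np

def rayleigh_min(A, iters=3000, seed=0):
    """Minimize the Rayleigh quotient q(x) = x^T A x / x^T x by plain
    gradient descent with renormalization: a stripped-down cousin of the
    CG/DACG scheme (no beta_k acceleration, no deflation)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)
    step = 1.0 / np.abs(A).sum(axis=1).max()   # crude 1/lambda_max bound
    for _ in range(iters):
        q = x @ A @ x                          # Rayleigh quotient (||x|| = 1)
        g = 2.0 * (A @ x - q * x)              # its gradient
        x = x - step * g
        x /= np.linalg.norm(x)                 # stay on the unit sphere
    return x @ A @ x, x

# 1-D Laplacian test matrix (SPD, tridiagonal)
n = 12
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
```

DACG improves on this sketch precisely through the choice of β_k and the (LL^T)^{-1} preconditioning discussed in the abstract.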

  3. Measuring scientific reasoning through behavioral analysis in a computer-based problem solving exercise

    Science.gov (United States)

    Mead, C.; Horodyskyj, L.; Buxner, S.; Semken, S. C.; Anbar, A. D.

    2016-12-01

    Developing scientific reasoning skills is a common learning objective for general-education science courses. However, effective assessments for such skills typically involve open-ended questions or tasks, which must be hand-scored and may not be usable online. Using computer-based learning environments, reasoning can be assessed automatically by analyzing student actions within the learning environment. We describe such an assessment under development and present pilot results. In our content-neutral instrument, students solve a problem by collecting and interpreting data in a logical, systematic manner. We then infer reasoning skill automatically based on student actions. Specifically, students investigate why Earth has seasons, a scientifically simple but commonly misunderstood topic. Students are given three possible explanations and asked to select a set of locations on a world map from which to collect temperature data. They then explain how the data support or refute each explanation. The best approaches will use locations in both the Northern and Southern hemispheres to argue that the contrasting seasonality of the hemispheres supports only the correct explanation. We administered a pilot version to students at the beginning of an online, introductory science course (n = 223) as an optional extra credit exercise. We were able to categorize students' data collection decisions as more and less logically sound. Students who choose the most logical measurement locations earned higher course grades, but not significantly higher. This result is encouraging, but not definitive. In the future, we will clarify our results in two ways. First, we plan to incorporate more open-ended interactions into the assessment to improve the resolving power of this tool. Second, to avoid relying on course grades, we will independently measure reasoning skill with one of the existing hand-scored assessments (e.g., Critical Thinking Assessment Test) to cross-validate our new

  4. Newton Methods for Large Scale Problems in Machine Learning

    Science.gov (United States)

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…
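As a concrete instance of a Newton method applied to a machine-learning loss, the sketch below fits L2-regularized logistic regression with full Newton steps on synthetic data. It is a toy stand-in for the large-scale (subsampled or stochastic Hessian) variants such a thesis studies; the data, regularization strength and iteration count are illustrative choices:

```python
import numpy as np

def newton_logistic(X, y, lam=0.1, iters=10):
    """L2-regularized logistic regression solved by full Newton steps."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))        # predicted probabilities
        g = X.T @ (p - y) / n + lam * w         # gradient of the loss
        W = p * (1.0 - p)                       # per-sample Hessian weights
        H = (X.T * W) @ X / n + lam * np.eye(d)
        w -= np.linalg.solve(H, g)              # Newton step
    return w

# synthetic, nearly separable data with a known sign pattern
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = (X @ true_w + 0.3 * rng.standard_normal(200) > 0).astype(float)
w = newton_logistic(X, y)
```

Each iteration costs a d-by-d solve; large-scale Newton methods for machine learning replace the exact Hessian with cheaper sampled approximations while trying to keep the fast local convergence seen here.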

  5. Biopolitics problems of large-scale hydraulic engineering construction

    International Nuclear Information System (INIS)

    Romanenko, V.D.

    1997-01-01

    The 20th century, which will enter history as a century of large-scale hydraulic engineering construction, is coming to a close. On the European continent alone, 517 large reservoirs (detaining more than 1000 million km³ of water) were constructed in the period from 1901 to 1985. In the Danube basin a large number of reservoirs for power stations, navigation, navigation sluices and other hydraulic engineering structures have been constructed. Among them, more than 40 especially large objects are located along the main bed of the river. A number of hydro-complexes, such as Dnieper-Danube, Gabcikovo, Danube-Oder-Labe (project), Danube-Tissa, Danube-Adriatic Sea (project), Danube-Aegean Sea and Danube-Black Sea, have entered into operation or are at the design stage. Hydraulic engineering construction was carried out especially intensively in Ukraine. On its territory several large reservoirs were constructed on the Dnieper and the Yuzhny Bug, which have greatly changed the hydrological regime of these rivers. Summarizing the results of river-system regulation in Ukraine, it can be noted that more than 27 thousand ponds (3 km³ per year), 1098 reservoirs with a total volume of 55 km³, and 11 large channels with a total length of more than 2000 km and a productivity of 1000 m²/s have been created in Ukraine. Hydraulic engineering construction played an important role in the development of industry and agriculture, in the water supply of cities and settlements, in environmental effects, and in the maintenance of safe navigation on the Danube, Dnieper and other rivers. In the next part of the paper, the environmental changes on the Aral Sea after the construction of the Karakum Channel in Central Asia are discussed.

  6. The Problem of Scientific Realism Vis-a-Vis the Probabilistic ...

    African Journals Online (AJOL)

    Scientific realism holds that the objects of scientific knowledge exist independently of the minds or activities of scientists and that scientific theories are true of that objective world. This is antithetical to the posits of quantum physics, a body of knowledge taken as the seal of the sciences because of its incredible feat, which ...

  7. Parallel time domain solvers for electrically large transient scattering problems

    KAUST Repository

    Liu, Yang

    2014-09-26

    Marching on in time (MOT)-based integral equation solvers represent an increasingly appealing avenue for analyzing transient electromagnetic interactions with large and complex structures. MOT integral equation solvers for analyzing electromagnetic scattering from perfect electrically conducting objects are obtained by enforcing electric field boundary conditions and implicitly time-advancing electric surface current densities by iteratively solving sparse systems of equations at all time steps. Contrary to their finite-difference and finite-element competitors, these solvers apply to nonlinear and multi-scale structures comprising geometrically intricate and deep sub-wavelength features residing atop electrically large platforms. Moreover, they are high-order accurate, stable in the low- and high-frequency limits, and applicable to conducting and penetrable structures represented by highly irregular meshes. This presentation reviews some recent advances in the parallel implementations of time domain integral equation solvers, specifically those that leverage the multilevel plane-wave time-domain (PWTD) algorithm on modern manycore computer architectures including graphics processing units (GPUs) and distributed memory supercomputers. The GPU-based implementation achieves speedups of at least one order of magnitude compared to serial implementations, while the distributed parallel implementation is highly scalable to thousands of compute nodes. A distributed parallel PWTD kernel has been adopted to solve time domain surface/volume integral equations (TDSIE/TDVIE) for analyzing transient scattering from large and complex-shaped perfectly electrically conducting (PEC)/dielectric objects involving ten million/tens of millions of spatial unknowns.

  8. Empirical Phenomenon, Subjective Construction and Ontological Truth (An Analysis of Problems of Scientific Explanation and the Critical Realism Approach)

    Directory of Open Access Journals (Sweden)

    Faramarz Taghilou

    2014-12-01

    Full Text Available Both the positivist and negativist frameworks of explanation share the naturalist proposition that, unlike metaphysical philosophy, reality is embedded only at the experimental level. Therefore, the scientific explanation of natural and social phenomena should refer to this experimental level in order to be meaningful, verifiable and scientific. The problem, however, has always been that the principle of causality, a necessary condition for every kind of scientific explanation, is not logically deducible from induction at the experimental level and remains a metaphysical principle. The principle of experimental objectivity, as a condition for the verifiability of scientific explanations, could not be defended either, because experimentation is always embedded in subjectivity and theory. The Kantian idealists, in contrast, who consider scientific explanation a mere representation of reality in subjective categories, could not justify experimental knowledge of reality or the rationality of comparison among theories and paradigms. Critical Realism, an important approach in the philosophy of science associated with the works and thought of Roy Bhaskar, tries to solve these problems by resorting to its principles of ontological realism, epistemological relativism and judgmental rationality. Considering and analyzing the issues of scientific explanation, we have focused here on the answers of Critical Realism in this case. We will argue how, in the Critical Realist interpretation of scientific explanation, the empirical phenomenon, the subjective construction and ontological reality all reach logical coherence with one another.

  9. A Glimpse of Scientific Research on Fundamental Problems of Military and Civil Aeronautics

    Science.gov (United States)

    1939-01-01

    Among the outstanding accomplishments of the last century is man's conquest of the air. That conquest began in 1903 when the Wright brothers made the first successful flight of an airplane at Kitty Hawk, N. C. Five years later the United States Government purchased its first airplane for the use of the Army, and began the training of officers for military flying. During the years immediately preceding the outbreak of the World War the Government and a meager aircraft industry had made important progress, but the Government, practically the only customer, had purchased less than 100 airplanes. In the meantime, leading European nations, sensing acutely the potentialities of aircraft in warfare, had made greater progress and had begun laying the foundations for the new science of aeronautics. The World War gave a remarkable impetus to the development of aeronautics and emphasized the need for organized research on the fundamental problems of flight. By act of Congress approved March 3, 1915, the National Advisory Committee for Aeronautics was created and charged with the duty of supervising, directing, and conducting fundamental scientific research and experiment in aeronautics. With the farsighted support of the Congress the Committee has led the world in the development of unique aeronautical research facilities in its laboratories at Langley Field, Va. The research programs include problems initiated by the Committee and its subcommittees and also investigations requested by the Army, the Navy, and the Civil Aeronautics Authority. The results of researches conducted under one control, serve without duplication of effort, the needs of all branches of aviation, civil and military, and exert a profound influence on the progress of aeronautics by improving the performance, efficiency, and safety of aircraft. A brief description of the results of some of the committee's researches and of the equipment employed will be found in the following pages.

  10. Solving Large Quadratic Assignment Problems in Parallel

    DEFF Research Database (Denmark)

    Clausen, Jens; Perregaard, Michael

    1997-01-01

    ... processors, and have hence not been ideally suited for computations essentially involving non-vectorizable computations on integers. In this paper we investigate the combination of one of the best bound functions for a Branch-and-Bound algorithm (the Gilmore-Lawler bound) and various testing, variable binding, and recalculation of bounds between branchings when used in a parallel Branch-and-Bound algorithm. The algorithm has been implemented on a 16-processor MEIKO Computing Surface with Intel i860 processors. Computational results from the solution of a number of large QAPs, including the classical Nugent 20 ...
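    The Gilmore-Lawler bound mentioned in this record can be sketched in a few lines: for each facility-location pair, the off-diagonal flows and distances are paired via the rearrangement inequality, and the resulting cost matrix is handed to a linear assignment solver. This is a minimal illustration of the bound itself for a Koopmans-Beckmann QAP, not of the parallel Branch-and-Bound code described in the record; `scipy` is assumed to be available.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def gilmore_lawler_bound(F, D):
        """Lower bound for the QAP: min over permutations p of sum_ij F[i,j] * D[p(i),p(j)]."""
        n = F.shape[0]
        L = np.empty((n, n))
        for i in range(n):
            f = np.sort(np.delete(F[i], i))                # off-diagonal flows, ascending
            for j in range(n):
                d = np.sort(np.delete(D[j], j))[::-1]      # off-diagonal distances, descending
                # Rearrangement inequality: ascending-times-descending minimizes the scalar product
                L[i, j] = F[i, i] * D[j, j] + f @ d
        rows, cols = linear_sum_assignment(L)              # linear assignment over the bound matrix
        return L[rows, cols].sum()
    ```

    In a Branch-and-Bound code this value would be recomputed (or incrementally updated) at every node to prune subtrees whose bound exceeds the incumbent.
    
    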

  11. Solution accelerators for large scale 3D electromagnetic inverse problems

    International Nuclear Information System (INIS)

    Newman, Gregory A.; Boggs, Paul T.

    2004-01-01

    We provide a framework for preconditioning nonlinear 3D electromagnetic inverse scattering problems using nonlinear conjugate gradient (NLCG) and limited memory (LM) quasi-Newton methods. Key to our approach is the use of an approximate adjoint method that allows for an economical approximation of the Hessian that is updated at each inversion iteration. Using this approximate Hessian as a preconditioner, we show that the preconditioned NLCG iteration converges significantly faster than the non-preconditioned iteration, as well as converging to a data misfit level below that observed for the non-preconditioned method. Similar conclusions are also observed for the LM iteration; preconditioned with the approximate Hessian, the LM iteration converges faster than the non-preconditioned version. At this time, however, we see little difference between the convergence performance of the preconditioned LM scheme and the preconditioned NLCG scheme. A possible reason for this outcome is the behavior of the line search within the LM iteration. It was anticipated that, near convergence, a step size of one would be approached, but what was observed, instead, were step lengths that were nowhere near one. We provide some insights into the reasons for this behavior and suggest further research that may improve the performance of the LM methods.
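    The structure of a preconditioned NLCG iteration of the kind this record describes can be sketched generically: a Polak-Ribière nonlinear CG in which the gradient is passed through a user-supplied approximate inverse Hessian before forming the search direction. The names `apply_Minv`, `f` and `grad` below are illustrative placeholders, not the authors' electromagnetic inversion code; the line search is a simple Armijo backtracking rule.

    ```python
    import numpy as np

    def preconditioned_nlcg(f, grad, x0, apply_Minv, max_iter=100, tol=1e-8):
        """Polak-Ribiere NLCG; apply_Minv(g) applies an approximate inverse Hessian to g."""
        x = x0.copy()
        g = grad(x)
        z = apply_Minv(g)          # preconditioned gradient
        d = -z
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Armijo backtracking line search along d
            t, fx, gTd = 1.0, f(x), g @ d
            while f(x + t * d) > fx + 1e-4 * t * gTd and t > 1e-12:
                t *= 0.5
            x = x + t * d
            g_new = grad(x)
            z_new = apply_Minv(g_new)
            # Preconditioned Polak-Ribiere+ coefficient (reset to steepest descent if negative)
            beta = max(0.0, g_new @ (z_new - z) / (g @ z))
            d = -z_new + beta * d
            g, z = g_new, z_new
        return x
    ```

    With an exact inverse Hessian of a quadratic objective, the first step already lands on the minimizer, which is the idealized limit of the speedups the abstract reports.
    
    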

  12. Density control problems in large stellarators with neoclassical transport

    International Nuclear Information System (INIS)

    Maassberg, H.; Beidler, C.D.; Simmet, E.E.

    1999-01-01

    With respect to the particle flux, the off-diagonal term in the neoclassical transport matrix becomes crucial in the stellarator long-mean-free-path regime. Central heating with peaked temperature profiles can make an active density profile control by central particle refuelling mandatory. The neoclassical particle confinement can significantly exceed the energy confinement at the outer radii. As a consequence, the required central refuelling may be larger than the neoclassical particle fluxes at outer radii leading to the loss of the global density control. Radiative losses as well as additional 'anomalous' electron heat diffusivities further exacerbate this problem. In addition to the analytical formulation of the neoclassical link of particle and energy fluxes, simplified model simulations as well as time-dependent ASTRA code simulations are described. In particular, the 'low-' and 'high-mirror' W7-X configurations are compared. For the W7-X 'high-mirror' configuration especially, the appearance of the neoclassical particle transport barrier is predicted at higher densities. (author)

  13. Paul Scherrer Institute Scientific and Technical Report 2001. Volume VI: Large Research Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Bercher, R.; Buechli, C.; Zumkeller, L. (eds.)

    2002-03-01

    While the main effort in the past ten years was directed towards increasing the beam current from 100 to 2000 μA and installation of additional user facilities like SINQ to satisfy new needs, we are now concentrating on stable operation at these high beam intensities. Unfortunately, 'stable operation' is not clearly defined. A few years ago, the accelerator physicists considered 80% beam on time excellent but the users complained about poor performance. Today we achieve a yearly mean beam on time of almost 90% at 1.7 mA and we have achieved 95% to 98% of the scheduled beam time for periods of weeks. These numbers seem to be satisfactory for the users. Despite this achievement, we try hard to further reduce the number of serious and long breakdowns, which are the main cause of the reduced yearly mean availability. Furthermore, breakdowns that necessitate long repair times are extremely detrimental for many experiments, which have only been allocated a few days of beam time. As a result of our discussions, we launched a number of activities, which include design and construction of improved power supplies, intensifying preventive maintenance, procuring vital spare parts, and reducing repair times through careful preparation. In addition, we were given permission to strengthen the accelerator staff with highly qualified physicists in order to study and solve several pending problems. We are aware that the planned measures will by no means be fast and will require considerable financial and personnel support. A long-standing issue concerning the improvement of the machine performance is the replacement of the aluminum cavities in the main ring accelerator by new high power copper cavities. The studies and tests on a model cavity are finished and we have ordered a prototype cavity, which will arrive in fall 2002 and be installed in the ring after a rigorous test phase in 2004. A list of scientific publications in 2000 is also provided.

  15. Cleaning of scientific references in large patent databases using rule-based scoring and clustering

    NARCIS (Netherlands)

    Caron, Emiel

    2017-01-01

    Patent databases contain patent related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal

  16. Enabling High Performance Large Scale Dense Problems through KBLAS

    KAUST Repository

    Abdelfattah, Ahmad

    2014-05-04

    KBLAS (KAUST BLAS) is a small library that provides highly optimized BLAS routines on systems accelerated with GPUs. KBLAS is entirely written in CUDA C, and targets NVIDIA GPUs with compute capability 2.0 (Fermi) or higher. The current focus is on level-2 BLAS routines, namely the general matrix-vector multiplication (GEMV) kernel and the symmetric/Hermitian matrix-vector multiplication (SYMV/HEMV) kernel. KBLAS provides these two kernels in all four precisions (s, d, c, and z), with support for multi-GPU systems. Through advanced optimization techniques that target latency hiding and pushing memory bandwidth to the limit, KBLAS outperforms state-of-the-art kernels by 20-90%. Competitors include CUBLAS-5.5, MAGMABLAS-1.4.0, and CULA R17. The SYMV/HEMV kernel from KBLAS has been adopted by NVIDIA, and should appear in CUBLAS-6.0. KBLAS has been used in large-scale simulations of multi-object adaptive optics.

  17. DEVELOPMENT OF SCIENTIFIC AND INFORMATIVE POTENTIAL OF STUDENTS IN THE TEACHING OF THE INVERSE PROBLEMS FOR DIFFERENTIAL EQUATIONS

    Directory of Open Access Journals (Sweden)

    Виктор Семенович Корнилов

    2017-12-01

    Full Text Available The article draws attention to the fact that teaching inverse problems for differential equations develops students' scientific and cognitive potential. Students come to realize that mathematical models of inverse problems for differential equations find application in economics, industry, ecology, sociology, biology, chemistry, mathematics and physics, and in studies of processes and phenomena occurring in water, soil, air and space. The reader's attention is drawn to the fact that, in working on inverse problems for differential equations, students develop a scientific outlook; logical, algorithmic and informational thinking; creative activity; independence; and ingenuity. Students acquire the skills to apply knowledge from many physical and mathematical disciplines, to analyse the solution obtained for an inverse problem, and to formulate logical conclusions of an applied character. In solving inverse problems for differential equations, students gain new knowledge in applied and computational mathematics, informatics, the natural sciences and other fields.

  18. Fifth Anniversary youth scientifically-practical conference Nuclear-industrial complex of Ural: problems and prospects. Theses of reports

    International Nuclear Information System (INIS)

    2009-01-01

    Theses of reports of the Fifth Anniversary youth scientifically-practical conference Nuclear-industrial complex of Ural: problems and prospects (21-23 April 2009, Ozersk) are presented. The book contains abstracts of papers from four thematic sections: SNF reprocessing: science and industry; Radioecology and radiobiology; Advanced science-intensive technologies and materials; Education and training for NFC plants

  19. A Test of the Circumvention-of-Limits Hypothesis in Scientific Problem Solving: The Case of Geological Bedrock Mapping

    Science.gov (United States)

    Hambrick, David Z.; Libarkin, Julie C.; Petcovic, Heather L.; Baker, Kathleen M.; Elkins, Joe; Callahan, Caitlin N.; Turner, Sheldon P.; Rench, Tara A.; LaDue, Nicole D.

    2012-01-01

    Sources of individual differences in scientific problem solving were investigated. Participants representing a wide range of experience in geology completed tests of visuospatial ability and geological knowledge, and performed a geological bedrock mapping task, in which they attempted to infer the geological structure of an area in the Tobacco…

  20. A Science-Technology-Society Paradigm and Cross River State Secondary School Students' Scientific Literacy: Problem Solving and Decision Making

    Science.gov (United States)

    Umoren, Grace

    2007-01-01

    The aim of this study was to investigate the effect of Science-Technology-Society (STS) curriculum on students' scientific literacy, problem solving and decision making. Four hundred and eighty (480) Senior Secondary two science and non-science students were randomly selected from intact classes in six secondary schools in Calabar Municipality of…

  1. Using a Scientific Paper Format to Foster Problem-Based, Cohort-Learning in Undergraduate Environmental Science

    Science.gov (United States)

    Wagner, T.; Langley-Turnbaugh, S. J.; Sanford, R.

    2006-01-01

    The Department of Environmental Science at the University of Southern Maine implemented a problem-based, cohort-learning curriculum for undergraduate environmental science majors. The curriculum was based on a five-course sequence patterned after the outline of a scientific paper. Under faculty guidance, students select local environmental…

  2. Influence of Family Processes, Motivation, and Beliefs about Intelligence on Creative Problem Solving of Scientifically Talented Individuals

    Science.gov (United States)

    Cho, Seokhee; Lin, Chia-Yi

    2011-01-01

    Predictive relationships among perceived family processes, intrinsic and extrinsic motivation, incremental beliefs about intelligence, confidence in intelligence, and creative problem-solving practices in mathematics and science were examined. Participants were 733 scientifically talented Korean students in fourth through twelfth grades as well as…

  3. On the role of individual human abilities in the design of adaptive user interfaces for scientific problem solving environments

    NARCIS (Netherlands)

    Zudilova-Seinstra, E.V.

    2007-01-01

    A scientific problem solving environment should be built in such a way that users (scientists) might exploit underlying technologies without a specialised knowledge about available tools and resources. An adaptive user interface can be considered as an opportunity in addressing this challenge. This

  4. The effect of Think Pair Share (TPS) using scientific approach on students’ self-confidence and mathematical problem-solving

    Science.gov (United States)

    Rifa’i, A.; Lestari, H. P.

    2018-03-01

    This study was designed to determine the effects of Think Pair Share (TPS) using a scientific approach on students' self-confidence and mathematical problem-solving. A quasi-experimental pre-test/post-test non-equivalent group design was used as the basis for this study. A self-confidence questionnaire and a problem-solving test were used to measure the two variables. Two classes of the first grade in a religious senior high school (MAN) in Indonesia were randomly selected for this study. The control group was taught from the mathematics textbook in the traditional way, while the experimental group was taught with the TPS method using a scientific approach. For the analysis of data on students' problem-solving skill and self-confidence, the One-Sample t-Test, Independent-Sample t-Test, and Multivariate Analysis of Variance (MANOVA) were used. The results showed that (1) both TPS using a scientific approach and traditional learning had positive effects, and (2) TPS using a scientific approach had a more significant effect than traditional learning on students' self-confidence and problem-solving skill.

  5. On nuclear power problem in science education in Japan. Supplementary reader, authorization and scientific literacy for citizen

    International Nuclear Information System (INIS)

    Ryu, Jumpei

    2012-01-01

    Distribution of the supplementary reader on nuclear power, 'Challenge! Nuclear Power World' (issued in 2010), and the supplementary reader on radiation (issued in October 2011) was shelved in June 2012 by an administrative project review, reflecting the revised policy on education for nuclear power promotion. The Great East Japan Earthquake and the Fukushima Daiichi nuclear power accident greatly affected and changed the fundamental conditions of citizens' lives, as well as national consciousness of Japan's future society. Science education should be reconsidered, taking into account how 'scientific literacy' and 'scientific communication' are understood. This article discusses the nuclear power problem in relation to supplementary readers and nuclear power education, with the aim of establishing a science education framework for citizens' 'scientific literacy'. It reviews the preparation of nuclear power education at junior high schools according to the guidelines of the new course of study, and then describes, for reference, 'scientific literacy' as presented in a British higher-level science textbook for the public understanding of science in society; this comparison suggests some problems in science education in Japan, although the social background differs. (T. Tanaka)

  6. Solving large nonlinear generalized eigenvalue problems from Density Functional Theory calculations in parallel

    DEFF Research Database (Denmark)

    Bendtsen, Claus; Nielsen, Ole Holm; Hansen, Lars Bruno

    2001-01-01

    The quantum mechanical ground state of electrons is described by Density Functional Theory, which leads to large minimization problems. An efficient minimization method uses a self-consistent field (SCF) solution of large eigenvalue problems. The iterative Davidson algorithm is often used, and we...
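    The Davidson iteration this record refers to can be illustrated on a small dense symmetric matrix. The sketch below shows the structure of the classic algorithm (Rayleigh-Ritz projection onto a growing subspace, residual, diagonally preconditioned correction); production DFT codes work matrix-free on far larger problems, so this is a didactic sketch, not the SCF solver the abstract describes.

    ```python
    import numpy as np

    def davidson(A, k_dim_max=20, tol=1e-8):
        """Davidson iteration for the lowest eigenpair of a symmetric matrix A,
        using the diagonal of A as the classic preconditioner."""
        n = A.shape[0]
        diag = np.diag(A)
        V = np.zeros((n, k_dim_max))
        V[np.argmin(diag), 0] = 1.0          # start from the unit vector at the smallest diagonal
        m = 1
        theta, u = diag.min(), V[:, 0]
        for _ in range(k_dim_max - 1):
            Vm = V[:, :m]
            H = Vm.T @ A @ Vm                # Rayleigh-Ritz projection onto the subspace
            w, S = np.linalg.eigh(H)
            theta, s = w[0], S[:, 0]
            u = Vm @ s                       # current Ritz vector
            r = A @ u - theta * u            # residual
            if np.linalg.norm(r) < tol:
                break
            denom = diag - theta             # diagonal (Davidson) preconditioner
            denom[np.abs(denom) < 1e-12] = 1e-12
            t = r / denom
            t -= Vm @ (Vm.T @ t)             # orthogonalize the correction against the subspace
            nt = np.linalg.norm(t)
            if nt < 1e-12:
                break
            V[:, m] = t / nt                 # extend the search subspace
            m += 1
        return theta, u
    ```

    The diagonal preconditioner is what distinguishes Davidson from plain Lanczos and is why the method shines on the diagonally dominant matrices typical of electronic-structure problems.
    
    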

  7. Scalable Newton-Krylov solver for very large power flow problems

    NARCIS (Netherlands)

    Idema, R.; Lahaye, D.J.P.; Vuik, C.; Van der Sluis, L.

    2010-01-01

    The power flow problem is generally solved by the Newton-Raphson method with a sparse direct solver for the linear system of equations in each iteration. While this works fine for small power flow problems, we will show that for very large problems the direct solver is very slow and we present
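    The move from a direct solver to a Krylov method inside Newton-Raphson can be sketched generically: instead of factorizing the Jacobian at each iteration, each Newton step solves J(x) dx = -F(x) with matrix-free GMRES, using a finite-difference Jacobian-vector product so the Jacobian is never formed. This is a toy inexact-Newton sketch on a generic nonlinear system, not the authors' power-flow formulation; `scipy` is assumed.

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def newton_krylov(F, x0, max_iter=20, tol=1e-10):
        """Inexact Newton: each step solves J(x) dx = -F(x) with matrix-free GMRES."""
        x = x0.astype(float)
        for _ in range(max_iter):
            fx = F(x)
            if np.linalg.norm(fx) < tol:
                break
            eps = 1e-7
            def Jv(v):
                # Finite-difference directional derivative approximates J(x) @ v
                return (F(x + eps * v) - fx) / eps
            J = LinearOperator((x.size, x.size), matvec=Jv, dtype=float)
            dx, _ = gmres(J, -fx)            # Krylov solve replaces the sparse direct solver
            x = x + dx
        return x
    ```

    For small systems a direct factorization is faster, but the matvec-only formulation is what lets the linear solve scale to the very large power flow problems the record targets (typically with a problem-specific preconditioner added to the GMRES call).
    
    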

  8. Maintaining Masculinity in Mid-Twentieth-Century American Psychology: Edwin Boring, Scientific Eminence, and the "Woman Problem".

    Science.gov (United States)

    Rutherford, Alexandra

    2015-01-01

    Using mid-twentieth-century American psychology as my focus, I explore how scientific psychology was constructed as a distinctly masculine enterprise and was navigated by those who did not conform easily to this masculine ideal. I show how women emerged as problems for science through the vigorous gatekeeping activities and personal and professional writings of disciplinary figurehead Edwin G. Boring. I trace Boring's intellectual and professional socialization into masculine science and his efforts to understand women's apparent lack of scientific eminence, efforts that were clearly undergirded by preexisting and widely shared assumptions about men's and women's capacities and preferences.

  9. DORT and TORT workshop -- Outline for presentation for performance issues for large problems

    International Nuclear Information System (INIS)

    Barnett, A.

    1998-04-01

    This paper addresses the problem of running large TORT jobs under a limited amount of time per job and limited amounts of memory and disk space. The solution the author outlines here is to break the TORT run up in time and space: for the time limit, run multiple sequential, dependent jobs; for the space limit, use TORT's internal memory conservation features. TORT is a three-dimensional discrete ordinates neutron/photon transport code

  10. Examination of the relationship between preservice science teachers' scientific reasoning and problem solving skills on basic mechanics

    Science.gov (United States)

    Yuksel, Ibrahim; Ates, Salih

    2018-02-01

    The purpose of this study is to determine the relationship between the scientific reasoning and mechanics problem-solving skills of students in a science education program. The Scientific Reasoning Skills Test (SRST) and the Basic Mechanics Knowledge Test (BMKT) were applied to 90 second-, third- and fourth-grade students who took the Scientific Reasoning Skills course in the science teaching program of the Gazi Faculty of Education during three successive fall semesters of the 2014, 2015 and 2016 academic years. A statistically significant (p = 0.038 < 0.05) but low positive correlation (r = 0.219) was found between the SRST and the BMKT. There was no significant relationship between the Conservation Laws, Proportional Thinking, Combinational Thinking, Correlational Thinking and Probabilistic Thinking subskills of reasoning and the BMKT. There were significant positive correlations between the Hypothetical Thinking and Identifying and Controlling Variables subskills of reasoning and the BMKT. The findings of the study were compared with other studies in the field and discussed.

  11. The Art of Scientific Ideas: Teaching and Learning Strategies that Promote Creative Problem Finding

    Science.gov (United States)

    LaBanca, Frank; Ritchie, Krista C.

    2011-01-01

    Problem solving is a valuable skill in the science classroom. Students often use a variety of inquiry strategies to identify problems and their implications; develop action plans; locate relevant sources, information, and data; and formulate solutions. Problem solving is a logical, analytical, and sometimes creative process. The less tangible,…

  12. The effects of problem content and scientific background on information search and the assessment and valuation of correlations.

    Science.gov (United States)

    Soffer, Shira; Kareev, Yaakov

    2011-01-01

    The effects of problem contents and one's scientific background on the detection of correlations and the assessment of their strength were studied using a task that required active data search, assessment of the strength of a correlation, and monetary valuation of the correlation's predictive utility. Participants (N = 72) who were trained either in the natural sciences or in the social sciences and humanities explored data sets differing in contents and actual strength of correlation. Data search was consistent across all variables: Participants drew relatively small samples whose relative sizes would favor the detection of a correlation, if one existed. In contrast, the assessment of the correlation strength and the valuation of its predictive utility were strongly related not only to its objective strength, but also to the correspondence between problem contents and one's scientific background: When the two matched, correlations were judged to be stronger and more valuable than when they did not.

  13. [The Problems with Domestic Introduction of rTMS from the Three Viewpoints of Scientific Evidence, Specialty and Social Responsibility].

    Science.gov (United States)

    Shinosaki, Kazuhiro

    2015-01-01

    The domestic introduction of rTMS is expected as a new treatment option for treatment-resistant depression. I discussed some problems with the introduction from three viewpoints : scientific evidence, specialty, and social responsibility. I surveyed scientific evidence for rTMS regarding the action mechanism, effectiveness, side effects, and its positioning in the treatment guidelines. To secure the quality of rTMS treatment, I proposed rTMS guidelines, nurturing of the specialists, and a center hospital plan, and pointed out some medium-term problems after its introduction and the consistency of rTMS treatment and standard depression treatment. From the viewpoint of social responsibility, rTMS treatment should be a medical service covered by health insurance to avoid its misuse. We should prepare to overcome the public suspicion of brain stimulation treatment for mental disease.

  14. Improving Problem Solving Skill and Self Regulated Learning of Senior High School Students through Scientific Approach using Quantum Learning strategy

    Directory of Open Access Journals (Sweden)

    M Sudirman

    2017-12-01

    Full Text Available This research is a quasi-experiment with a control-group pretest-posttest design. The sample in this research was selected using purposive sampling; the samples used were two classes of 11th-grade students of SMAN 14 Bandung in the academic year 2017/2018. The experimental group used a scientific approach with the Quantum Learning strategy, and the control group used a scientific approach alone. A problem-solving ability test and a self-regulated learning questionnaire were used as the instruments for data collection. The aims of this research are to: (1) find out the improvement of students' mathematical problem solving through a scientific approach using the Quantum Learning strategy, and (2) find out students' self-regulated learning through a scientific approach using Quantum Learning.

  15. Developing a problem-based learning (PBL) curriculum for professionalism and scientific integrity training for biomedical graduate students.

    Science.gov (United States)

    Jones, Nancy L; Peiffer, Ann M; Lambros, Ann; Guthold, Martin; Johnson, A Daniel; Tytell, Michael; Ronca, April E; Eldridge, J Charles

    2010-10-01

    A multidisciplinary faculty committee designed a curriculum to shape biomedical graduate students into researchers with a high commitment to professionalism and social responsibility and to provide students with tools to navigate complex, rapidly evolving academic and societal environments with a strong ethical commitment. The curriculum used problem-based learning (PBL), because it is active and learner-centred and focuses on skill and process development. Two courses were developed: Scientific Professionalism: Scientific Integrity addressed discipline-specific and broad professional norms and obligations for the ethical practice of science and responsible conduct of research (RCR). Scientific Professionalism: Bioethics and Social Responsibility focused on current ethical and bioethical issues within the scientific profession, and implications of research for society. Each small-group session examined case scenarios that included: (1) learning objectives for professional norms and obligations; (2) key ethical issues and philosophies within each topic area; (3) one or more of the RCR instructional areas; and (4) at least one type of moral reflection. Cases emphasised professional standards, obligations and underlying philosophies for the ethical practice of science, competing interests of stakeholders and oversight of science (internal and external). To our knowledge, this is the first use of a longitudinal, multi-semester PBL course to teach scientific integrity and professionalism. Both faculty and students endorsed the active learning approach for these topics, in contrast to a compliance-based approach that emphasises learning rules and regulations.

  16. Design of a Generic and Flexible Data Structure for Efficient Formulation of Large Scale Network Problems

    DEFF Research Database (Denmark)

    Quaglia, Alberto; Sarup, Bent; Sin, Gürkan

    2013-01-01

    The formulation of Enterprise-Wide Optimization (EWO) problems as mixed integer nonlinear programming requires collecting, consolidating and systematizing large amounts of data, coming from different sources and specific to different disciplines. In this manuscript, a generic and flexible data structure for efficient formulation of enterprise-wide optimization problems is presented. Through the integration of the described data structure in our synthesis and design framework, the problem formulation workflow is automated in a software tool, reducing the time and resources needed to formulate large problems, while ensuring at the same time data consistency and quality at the application stage.

  17. Problems in the Science and Mathematics of 'The Logic of Scientific Discovery'

    Directory of Open Access Journals (Sweden)

    Alan B. Whiting

    2012-11-01

    Full Text Available Professor Sir Karl Popper (1902-1994 was one of the most influential philosophers of science of the twentieth century. However, in his most famous work 'The Logic of Scientific Discovery' he displays troubling misunderstandings of science and mathematics at a basic level. These call into question his conclusions concerning the philosophy of science. Quanta 2012; 1: 13–18.

  18. Scientific approach to the problem of early memories and self-criminals

    Directory of Open Access Journals (Sweden)

    Debolsky M.G.

    2014-12-01

    Full Text Available This article deals with a theoretical overview of the problem of early recollections and the self-esteem of criminals. The issues discussed are considered within the framework of Alfred Adler's concept. Adler held that the earliest memories are the starting point of an autobiography, in which one can trace a person's first assessment of himself, which forms the basis of self-esteem and lifestyle. It has been hypothesized that a correlation exists between the content of early recollections and self-esteem, which is a precondition for the formation of a criminal lifestyle. This hypothesis is based on the analysis of a large number of theoretical studies by foreign scientists (J. Bruner, C. Miner, G. Murray, J. Kramer, S. Tomkins, L. Ross and Newby-Clark, M. Singer and P. Salovey, L. A. Polkington and D. McAdams) and some empirical studies (A. Molostvov). The article describes the main points of view on the question of the correctness of reproducible human memories. The authors share Adler's position that certain childhood experiences form a person's self-esteem: if a sense of inferiority formed in a certain area in childhood, the person will strive for superiority all his life, and his self-esteem will be based on the values of that area. It is important to find the fundamental mistakes made in the early period of personality development; then, using appropriate therapy, they can be corrected in adulthood. Thus, the article points the reader to an opportunity to improve psychotherapeutic work with the early recollections of convicts in order to reduce the risk of re-offending.

  19. A scientific operations plan for the large space telescope. [ground support system design

    Science.gov (United States)

    West, D. K.

    1977-01-01

    The paper describes an LST ground system which is compatible with the operational requirements of the LST. The goal of the approach is to minimize the cost of post launch operations without seriously compromising the quality and total throughput of LST science. Attention is given to cost constraints and guidelines, the telemetry operations processing systems (TELOPS), the image processing facility, ground system planning and data flow, and scientific interfaces.

  20. Sakharov readings 2010: Environmental problems of the XXI century. Proceedings of the 10 international scientific conference. Part 1

    International Nuclear Information System (INIS)

    Kundas, S.P.; Mel'nov, S.B.; Poznyak, S.S.

    2010-05-01

    The proceedings include materials of the reports of the 10th international scientific conference 'Sakharov readings 2010: Environmental problems of the XXI century', which took place on 20-21 May 2010 at the International Sakharov Environmental University. The first part of the proceedings contains abstracts on social-ecological and ecological-ethical problems, medical ecology, biological ecology, biomonitoring, bioindication and bioremediation. The materials of the conference are intended for a wide range of specialists in ecology and adjacent sciences, teachers, post-graduate students and students of universities and colleges.

  1. Sakharov readings 2010: Environmental problems of the XXI century. Proceedings of the 10 international scientific conference. Part 2

    International Nuclear Information System (INIS)

    Kundas, S.P.; Mel'nov, S.B.; Poznyak, S.S.

    2010-05-01

    The proceedings include materials of reports of the 10th international scientific conference 'Sakharov readings 2010: Environmental problems of the XXI century', which took place on 20-21 May 2010 at the International Sakharov Environmental University. The second part of the proceedings contains abstracts on radioecology and radiation security; ecological informational systems and technologies; ecological monitoring, management and audit; renewable sources of energy and energy-saving technologies; and problems of ecological education. The conference materials are intended for a wide range of specialists in ecology and adjacent sciences, teachers, post-graduate students and students of universities and colleges. (authors)

  2. Fourth youth scientifically-practical conference Nuclear-industrial complex of Ural: problems and prospects. Theses of reports

    International Nuclear Information System (INIS)

    2007-01-01

    Theses of reports of the Fourth youth scientific-practical conference 'Nuclear-industrial complex of Ural: problems and prospects' (18-20 April 2007, Ozersk) are presented. The book contains theses of reports from seven subject sections: NFC: science and industry; Ecological problems in NFC development: radiation safety, radioecology and radiobiology; Nuclear power engineering: economics, safety, field experience; Atomic branch: history, today and future; New technologies in education, education and training for NFC plants, public opinion; Information technologies and telecommunications; Long-term science-intensive technologies and new materials

  3. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    International Nuclear Information System (INIS)

    Nam, H; Stoitsov, M; Nazarewicz, W; Hagen, G; Kortelainen, M; Pei, J C; Bulgac, A; Maris, P; Vary, J P; Roche, K J; Schunck, N; Thompson, I; Wild, S M

    2012-01-01

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multi-disciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. We illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadership-class computational resources.

  4. Scientific-practical and legal problems of implementation of the personalized medicine.

    Science.gov (United States)

    Bezdieniezhnykh, N O; Reznikova, V V; Rossylna, O V

    2017-09-01

    The article is devoted to a comprehensive analysis of the scientific, practical and legal issues of personalized medicine, a rapidly developing science-driven approach to healthcare. It is concluded that there is a lack of a general legal framework to encourage scientific research and practical implementation in this field. The article surveys foreign experience and the prospects for introducing personalized medicine as a key concept of the healthcare system, one based on the selection of the diagnostic, therapeutic and preventive measures that would be most effective for a particular person in view of individual characteristics. Conclusions and proposals to improve the current legislation and the development of personalized medicine in Ukraine are offered.

  5. Impact of problem-based learning in a large classroom setting: student perception and problem-solving skills.

    Science.gov (United States)

    Klegeris, Andis; Hurren, Heather

    2011-12-01

    Problem-based learning (PBL) can be described as a learning environment where the problem drives the learning. This technique usually involves learning in small groups, which are supervised by tutors. It is becoming evident that PBL in a small-group setting has a robust positive effect on student learning and skills, including better problem-solving skills and an increase in overall motivation. However, very little research has been done on the educational benefits of PBL in a large classroom setting. Here, we describe a PBL approach (using tutorless groups) that was introduced as a supplement to standard didactic lectures in University of British Columbia Okanagan undergraduate biochemistry classes consisting of 45-85 students. PBL was chosen as an effective method to assist students in learning biochemical and physiological processes. By monitoring student attendance and using informal and formal surveys, we demonstrated that PBL has a significant positive impact on student motivation to attend and participate in the course work. Student responses indicated that PBL is superior to traditional lecture format with regard to the understanding of course content and retention of information. We also demonstrated that student problem-solving skills are significantly improved, but additional controlled studies are needed to determine how much PBL exercises contribute to this improvement. These preliminary data indicated several positive outcomes of using PBL in a large classroom setting, although further studies aimed at assessing student learning are needed to further justify implementation of this technique in courses delivered to large undergraduate classes.

  6. Paul Scherrer Institute Scientific Report 1998. Volume VI: Large Research Facilities

    International Nuclear Information System (INIS)

    Bauer, Guenter; Bercher, Renate; Buechli, Carmen; Foroughi, Fereydoun; Meyer, Rosa

    1999-01-01

    The department GFA (Grossforschungsanlagen, Large Research Facilities) was established in October 1998; its main duty is the operation, maintenance and development of the PSI accelerators, the spallation neutron source and the beam transport systems for pions and muons. A large effort of this group concerns the planning and co-ordination of the assembly of the Swiss Light Source (SLS). (author)

  7. Wicked Problems in Large Organizations: Why Pilot Retention Continues to Challenge the Air Force

    Science.gov (United States)

    2017-05-25

    ...solving complex problems even more challenging. This idea of complexity extends to another theoretical concept, the complex adaptive system, which... concept in order to avoid the pitfalls and dangers in group problem-solving. His ideas to mitigate potential groupthink place responsibility...

  8. Integrating scientific knowledge into large-scale restoration programs: the CALFED Bay-Delta Program experience

    Science.gov (United States)

    Taylor, K.A.; Short, A.

    2009-01-01

    Integrating science into resource management activities is a goal of the CALFED Bay-Delta Program, a multi-agency effort to address water supply reliability, ecological condition, drinking water quality, and levees in the Sacramento-San Joaquin Delta of northern California. Under CALFED, many different strategies were used to integrate science, including interaction between the research and management communities, public dialogues about scientific work, and peer review. This paper explores ways science was (and was not) integrated into CALFED's management actions and decision systems through three narratives describing different patterns of scientific integration and application in CALFED. Though a collaborative process and certain organizational conditions may be necessary for developing new understandings of the system of interest, we find that those factors are not sufficient for translating that knowledge into management actions and decision systems. We suggest that the application of knowledge may be facilitated or hindered by (1) differences in the objectives, approaches, and cultures of scientists operating in the research community and those operating in the management community and (2) other factors external to the collaborative process and organization.

  9. Paul Scherrer Institute Scientific and Technical Report 1999. Volume VI: Large Research Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Foroughi, Fereydoun; Bercher, Renate; Buechli, Carmen; Meyer, Rosa [eds.

    2000-07-01

    The department GFA (Grossforschungsanlagen, Large Research Facilities) was established in October 1998. Its main duty is the operation, maintenance and development of the PSI accelerators, the spallation neutron source and the beam transport systems for pions and muons. A large effort of this group concerns the planning and co-ordination of new projects, such as the assembly of the synchrotron light source (SLS), design studies for a new proton therapy facility, the ultracold neutron source and a new intensive secondary beam line for low-energy muons. A large fraction of this report is devoted to research, especially in the field of materials science. The studies include large-scale molecular dynamics computer simulations of the elastic and plastic behavior of nanostructured metals, complemented by experimental mechanical testing using micro-indentation and miniaturized tensile testing, as well as microstructural characterisation and strain-field mapping of metallic coatings and thin ceramic layers, the latter done with synchrotron radiation.

  10. Paul Scherrer Institute Scientific and Technical Report 1999. Volume VI: Large Research Facilities

    International Nuclear Information System (INIS)

    Foroughi, Fereydoun; Bercher, Renate; Buechli, Carmen; Meyer, Rosa

    2000-01-01

    The department GFA (Grossforschungsanlagen, Large Research Facilities) was established in October 1998. Its main duty is the operation, maintenance and development of the PSI accelerators, the spallation neutron source and the beam transport systems for pions and muons. A large effort of this group concerns the planning and co-ordination of new projects, such as the assembly of the synchrotron light source (SLS), design studies for a new proton therapy facility, the ultracold neutron source and a new intensive secondary beam line for low-energy muons. A large fraction of this report is devoted to research, especially in the field of materials science. The studies include large-scale molecular dynamics computer simulations of the elastic and plastic behavior of nanostructured metals, complemented by experimental mechanical testing using micro-indentation and miniaturized tensile testing, as well as microstructural characterisation and strain-field mapping of metallic coatings and thin ceramic layers, the latter done with synchrotron radiation.

  11. Sakharov readings 2014: environmental problems of the XXI century. Proceedings of the 14th international scientific conference

    International Nuclear Information System (INIS)

    Dunaj, V.I.; Poznyak, S.S.; Lysukho, N.A.

    2014-05-01

    The proceedings include materials of reports of the 14th international scientific conference 'Sakharov readings 2014: Environmental problems of the XXI century', which took place on 29-30 May 2014 at the International A. Sakharov Environmental University (Minsk, Belarus). The proceedings contain abstracts on social-ecological, ecological-ethical and pedagogical problems in light of Sakharov's ideas; medical ecology; biological ecology; biomonitoring, bioindication and bioremediation; radioecology and radiation protection; information systems and technologies in ecology and medicine; ecological monitoring, management and audit; renewable energy sources and energy efficiency; climate change and sustainable development; and regional ecological problems. The conference materials are intended for a wide range of specialists in ecology and adjacent sciences, teachers, post-graduate students and students of universities and colleges. (authors)

  12. THE LACK OF SCIENTIFIC ATTITUDE AS A PROBLEM IN THE DESIGNING OF A YACHT CLUB

    Directory of Open Access Journals (Sweden)

    Екатерина Анатольевна Пирогова

    2016-08-01

    Full Text Available The paper draws attention to the fact that there is currently no scientific basis for the design of yacht clubs and ports. There is no legal framework, and many terms are misinterpreted; this can be observed in the new decree issued by the Government of the Leningrad region. The author considers it necessary to develop a uniform legal framework for the design of yacht clubs and ports. The framework should possess a single classification, which would make it possible not only to design modern yacht facilities but also to develop yachting infrastructure.

  13. Solving Large-Scale Computational Problems Using Insights from Statistical Physics

    Energy Technology Data Exchange (ETDEWEB)

    Selman, Bart [Cornell University

    2012-02-29

    Many challenging problems in computer science and related fields can be formulated as constraint satisfaction problems. Such problems consist of a set of discrete variables and a set of constraints between those variables, and represent a general class of so-called NP-complete problems. The goal is to find a value assignment to the variables that satisfies all constraints, generally requiring a search through an exponentially large space of variable-value assignments. Models for disordered systems, as studied in statistical physics, can provide important new insights into the nature of constraint satisfaction problems. Recently, work in this area has resulted in the discovery of a new method for solving such problems, called the survey propagation (SP) method. With SP, we can solve problems with millions of variables and constraints, an improvement of two orders of magnitude over previous methods.
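
    The constraint-satisfaction formulation described in this abstract can be illustrated with a toy example. The sketch below is an exhaustive search, not the survey propagation algorithm (which is a message-passing method); the graph-coloring instance and all names are hypothetical, chosen only to show why the assignment space grows exponentially with the number of variables.

    ```python
    from itertools import product

    def solve_csp(variables, domains, constraints):
        """Brute-force search over all variable-value assignments; returns
        the first assignment satisfying every constraint, or None."""
        names = list(variables)
        for values in product(*(domains[v] for v in names)):
            assignment = dict(zip(names, values))
            if all(check(assignment) for check in constraints):
                return assignment
        return None

    # Toy instance: 3-coloring of a triangle plus one pendant vertex.
    variables = ["a", "b", "c", "d"]
    domains = {v: ["red", "green", "blue"] for v in variables}
    edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
    # One inequality constraint per edge (adjacent vertices differ).
    constraints = [(lambda asg, u=u, v=v: asg[u] != asg[v]) for u, v in edges]

    solution = solve_csp(variables, domains, constraints)
    print(solution is not None)
    ```

    With n variables of domain size d the loop visits up to d^n assignments, which is exactly the exponential blow-up that motivates methods such as SP.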

  14. Paul Scherrer Institute Scientific Report 1998. Volume VI: Large Research Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Guenter; Bercher, Renate; Buechli, Carmen; Foroughi, Fereydoun; Meyer, Rosa [eds.

    1999-09-01

    The department GFA (Grossforschungsanlagen, Large Research Facilities) has been established in October 1998 and its main duty is operation, maintenance and development of the PSI accelerators, the spallation neutron source and the beam transport systems for pions and muons. A large effort of this group concerns the planning and co-ordination of the assembly of the Swiss Light Source (SLS). (author) figs., tabs., refs.

  15. The Cauchy problem for a model of immiscible gas flow with large data

    Energy Technology Data Exchange (ETDEWEB)

    Sande, Hilde

    2008-12-15

    The thesis consists of an introduction and two papers: 1. The solution of the Cauchy problem with large data for a model of a mixture of gases. 2. Front tracking for a model of immiscible gas flow with large data. (AG) refs, figs

  16. Application of Text Analytics to Extract and Analyze Material–Application Pairs from a Large Scientific Corpus

    Directory of Open Access Journals (Sweden)

    Nikhil Kalathil

    2018-01-01

    Full Text Available When assessing the importance of materials (or other components to a given set of applications, machine analysis of a very large corpus of scientific abstracts can provide an analyst a base of insights to develop further. The use of text analytics reduces the time required to conduct an evaluation, while allowing analysts to experiment with a multitude of different hypotheses. Because the scope and quantity of metadata analyzed can, and should, be large, any divergence from what a human analyst determines and what the text analysis shows provides a prompt for the human analyst to reassess any preliminary findings. In this work, we have successfully extracted material–application pairs and ranked them on their importance. This method provides a novel way to map scientific advances in a particular material to the application for which it is used. Approximately 438,000 titles and abstracts of scientific papers published from 1992 to 2011 were used to examine 16 materials. This analysis used coclustering text analysis to associate individual materials with specific clean energy applications, evaluate the importance of materials to specific applications, and assess their importance to clean energy overall. Our analysis reproduced the judgments of experts in assigning material importance to applications. The validated methods were then used to map the replacement of one material with another material in a specific application (batteries.
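
    The extraction step described above (associating materials with applications across a corpus of abstracts) can be sketched in miniature. The mini-corpus and term lists below are hypothetical stand-ins, and the paper's coclustering is replaced here by a simple co-mention count, which is a crude proxy for the same idea.

    ```python
    from collections import Counter

    def material_application_pairs(abstracts, materials, applications):
        """Count how often each material co-occurs with each application
        in the same abstract; higher counts suggest stronger pairing."""
        pairs = Counter()
        for text in abstracts:
            words = set(text.lower().split())
            for m in materials & words:
                for a in applications & words:
                    pairs[(m, a)] += 1
        return pairs

    # Hypothetical mini-corpus of abstract fragments.
    abstracts = [
        "lithium anodes improve battery cycle life",
        "silicon thin films for solar cells",
        "lithium silicon composite battery electrodes",
    ]
    materials = {"lithium", "silicon"}
    applications = {"battery", "solar"}

    pairs = material_application_pairs(abstracts, materials, applications)
    print(pairs.most_common(1)[0][0])
    ```

    Ranking the resulting counts gives a first-pass material-to-application map of the kind the paper builds at scale from roughly 438,000 titles and abstracts.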

  17. Probabilities and Predictions: Modeling the Development of Scientific Problem-Solving Skills

    Science.gov (United States)

    2005-01-01

    The IMMEX (Interactive Multi-Media Exercises) Web-based problem set platform enables the online delivery of complex, multimedia simulations and the rapid collection of student performance data, and has already been used in several genetic simulations. The next step is the use of these data to understand and improve student learning in a formative manner. This article describes the development of probabilistic models of undergraduate student problem solving in molecular genetics that detailed the spectrum of strategies students used when problem solving, and how the strategic approaches evolved with experience. The actions of 776 university sophomore biology majors from three molecular biology lecture courses were recorded and analyzed. Performances on each of six simulations were first grouped by artificial neural network clustering to provide individual performance measures, and then sequences of these performances were probabilistically modeled by hidden Markov modeling to provide measures of progress. The models showed that students with different initial problem-solving abilities choose different strategies. Initial and final strategies varied across different sections of the same course and were not strongly correlated with other achievement measures. In contrast to previous studies, we observed no significant gender differences. We suggest that instructor interventions based on early student performances with these simulations may assist students to recognize effective and efficient problem-solving strategies and enhance learning. PMID:15746978
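
    The hidden Markov modeling step mentioned above can be sketched in miniature. The two strategy states and all probabilities below are hypothetical stand-ins, not the paper's fitted model; the function implements the standard forward algorithm for scoring a sequence of observed performance clusters under an HMM.

    ```python
    def forward(obs, states, start_p, trans_p, emit_p):
        """Forward algorithm: total probability of an observation sequence
        under an HMM, summing over all hidden state paths."""
        alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
        for o in obs[1:]:
            alpha = {
                s: emit_p[s][o] * sum(alpha[r] * trans_p[r][s] for r in states)
                for s in states
            }
        return sum(alpha.values())

    # Hypothetical two-state model: a student's strategy state ("novice"
    # or "expert") emits observed performance clusters ("low" / "high").
    states = ["novice", "expert"]
    start_p = {"novice": 0.8, "expert": 0.2}
    trans_p = {"novice": {"novice": 0.6, "expert": 0.4},
               "expert": {"novice": 0.1, "expert": 0.9}}
    emit_p = {"novice": {"low": 0.7, "high": 0.3},
              "expert": {"low": 0.2, "high": 0.8}}

    p = forward(["low", "low", "high"], states, start_p, trans_p, emit_p)
    print(round(p, 4))
    ```

    Fitting such a model to many students' performance sequences, as the paper does, yields transition probabilities that summarize how strategic approaches evolve with experience.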

  18. [Between scientific management and research-action: the problem of overconsumption of drugs in Kasongo (Zaire)].

    Science.gov (United States)

    De Brouwere, V; Van Lerberghe, W; Criel, B; Van Dormael, M

    1996-01-01

    A Primary Health Care (PHC) system may be effective and efficient to the extent that essential drugs are available in health services and financially accessible to the population. In developing countries, besides the difficulties related to supplying health services with adequate amounts of drugs, the control of drug consumption is one of the problems frequently encountered by health authorities. The literature is relatively abundant in the field of rationalizing the diagnosis and drug prescription processes, and also in the field of drug financing mechanisms; publications are, however, rather scarce on topics related to corruption or drug misappropriation. The case study presented here reports a drug overconsumption problem in the health centres (HC) of the Kasongo district (Zaire). Despite the existence of both direct and indirect control mechanisms (monitoring of drug consumption by HC), the problem was identified belatedly. The district staff then used a step-by-step analysis of the HC drug consumption profiles; this analysis made it possible to demonstrate that misappropriation was the most plausible hypothesis. In order to solve the misappropriation problem, the consequences of which jeopardized the functioning of the health system itself, the district staff chose to involve the nurses in charge of the HC in the entire problem-solving process. This participative approach, involving the different actors as partners, made it possible to deepen the situation analysis and to elaborate solutions congruent with PHC principles and acceptable to all concerned.

  19. Social inequalities in the face of scientific and technological development: an antinomy or an historic problem?

    Science.gov (United States)

    Delgado, Guilherme Costa

    2017-07-01

    This paper aims to conduct a conceptual analysis of the relationship between scientific and technical progress and social equality, or the reduction of inequalities. We examine this relationship by drawing on three theoretical perspectives: 1) ethical economics, championed by classical economic thinkers and centered on utilitarian self-interest; 2) mainstream theories of economic development espousing the endogenous link between labor productivity growth and technical progress; 3) the critique of theories of economic development that emerged in the second half of the twentieth century, including Celso Furtado's critique of the theory of underdevelopment, emphasizing the prevalence of egalitarian tendencies, and ecological economics, which suggest alternative paths to those set by "classical" theories of development. The fundamental antinomy posed by the title of this article, characterized by an intrinsic contradiction between technical progress and social equality, strictly presupposes the ethical economics perspective, dominated by the social relations that constitute the "social order".

  20. DIGITAL HUMANISTIC PEDAGOGY: RELEVANT PROBLEMS OF SCIENTIFIC RESEARCH IN THE FIELD OF USING ICT IN EDUCATION

    Directory of Open Access Journals (Sweden)

    Valeriy Yu. Bykov

    2016-07-01

    Full Text Available In this article, the theoretical and methodological principles of digital humanistic pedagogy are determined: the science of the laws of creating a positive integrated educational reality as a result of the convergence of physical and virtual (ICT-based) training spaces (environments). Based on the use of modern ICT, learning activity (formal, non-formal and informal) lies at the intersection of two worlds: the real and the virtual. The methodology and research methods of classical pedagogy require review and improvement in the context of the current realities of the educational process and the needs and interests of all its subjects. The development of the digital humanities in the international educational space is analyzed; the content of the new field of pedagogical knowledge as part of digital humanistics is outlined; and research methods and directions of current scientific research are defined.

  1. Communicating Science: The Special Problems of Reporting Scientific Enquiry in the Media.

    Science.gov (United States)

    Goodfield, June

    The relationship of reporters, scientists, and the public is explored in this paper. Recent issues that have triggered a demand for a new kind of science writer are noted as including society's increased interest in health care, in problems of the environment, the ethics of genetic engineering, and other issues concerning the autonomy of the…

  2. Testing Foreign Language Impact on Engineering Students' Scientific Problem-Solving Performance

    Science.gov (United States)

    Tatzl, Dietmar; Messnarz, Bernd

    2013-01-01

    This article investigates the influence of English as the examination language on the solution of physics and science problems by non-native speakers in tertiary engineering education. For that purpose, a statistically significant total number of 96 students in four year groups from freshman to senior level participated in a testing experiment in…

  3. Large underground, liquid based detectors for astro-particle physics in Europe scientific case and prospects

    CERN Document Server

    Autiero, D; Badertscher, A; Bezrukov, L; Bouchez, J; Bueno, A; Busto, J; Campagne, J -E; Cavata, C; De Bellefon, A; Dumarchez, J; Ebert, J; Enqvist, T; Ereditato, A; Von Feilitzsch, F; Perez, P Fileviez; Goger-Neff, M; Gninenko, S; Gruber, W; Hagner, C; Hess, M; Hochmuth, K A; Kisiel, J; Knecht, L; Kreslo, I; Kudryavtsev, V A; Kuusiniemi, P; Lachenmaier, T; Laffranchi, M; Lefièvre, B; Lightfoot, P K; Lindner, M; Maalampi, J; Maltoni, M; Marchionni, A; Undagoitia, T Marrodan; Meregaglia, A; Messina, M; Mezzetto, M; Mirizzi, A; Mosca, L; Moser, U; Müller, A; Natterer, G; Oberauer, L; Otiougova, P; Patzak, T; Peltoniemi, J; Potzel, W; Pistillo, C; Raffelt, G G; Rondio, E; Roos, M; Rossi, B; Rubbia, André; Savvinov, N; Schwetz, T; Sobczyk, J; Spooner, N J C; Stefan, D; Tonazzo, A; Trzaska, W; Ulbricht, J; Volpe, C; Winter, J; Wurm, M; Zalewska-Bak, A; Zimmermann, R

    2007-01-01

    This document reports on a series of experimental and theoretical studies conducted to assess the astro-particle physics potential of three future large-scale particle detectors proposed in Europe as next generation underground observatories. The proposed apparatuses employ three different and, to some extent, complementary detection techniques: GLACIER (liquid argon TPC), LENA (liquid scintillator) and MEMPHYS (water Cherenkov), based on the use of large masses of liquids as active detection media. The results of these studies are presented along with a critical discussion of the performance attainable by the three proposed approaches coupled to existing or planned underground laboratories, in relation to open and outstanding physics issues such as the search for matter instability, the detection of astrophysical- and geo-neutrinos and the possible use of these detectors in future high-intensity neutrino beams.

  4. Performance of large-scale scientific applications on the IBM ASCI Blue-Pacific system

    International Nuclear Information System (INIS)

    Mirin, A.

    1998-01-01

    The IBM ASCI Blue-Pacific System is a scalable, distributed/shared memory architecture designed to reach multi-teraflop performance. The IBM SP pieces together a large number of nodes, each having a modest number of processors. The system is designed to accommodate a mixed programming model as well as a pure message-passing paradigm. We examine a number of applications on this architecture and evaluate their performance and scalability

  5. An overview of international actions to deal with climate change problem and the scientific update

    International Nuclear Information System (INIS)

    Usher, P.

    1995-01-01

    The atmospheric environment is under threat from anthropogenic emissions of pollutants and greenhouse gases to the extent that irreversible changes to the climate, the ozone layer and the quality of the air we breathe could occur. However, considerable scientific uncertainty remains with regard to the extent and magnitude of the change in climate as a result of human activities, and the impacts of such change. The natural variability of climate makes assessment of human-induced climate change difficult. Even if the magnitude of global warming from greenhouse gases in the atmosphere could be defined, the impacts of this global average warming on, for example, sea level; weather patterns such as rainfall, cloudiness, storms and droughts; agriculture; and marine and terrestrial ecosystems would have to be defined on regional, national and local scales. The assessments of these environmental impacts are, in turn, necessary for estimating the socio-economic impacts of environmental changes. This paper gives an overview of the international actions in combatting climate change and some information on the status of science on climate change and its impacts. (EG)

  6. An overview of international actions to deal with climate change problem and the scientific update

    Energy Technology Data Exchange (ETDEWEB)

    Usher, P. [United Nations Environment Programme, Climate Unit, Nairobi (Kenya)

    1995-06-01

    The atmospheric environment is under threat from anthropogenic emissions of pollutants and greenhouse gases to the extent that irreversible changes to the climate, the ozone layer and the quality of the air we breathe could occur. However, considerable scientific uncertainty remains with regard to the extent and magnitude of the change in climate as a result of human activities, and the impacts of such change. The natural variability of climate makes assessment of human-induced climate change difficult. Even if the magnitude of global warming from greenhouse gases in the atmosphere could be defined, the impacts of this global average warming on, for example, sea level; weather patterns such as rainfall, cloudiness, storms and droughts; agriculture; and marine and terrestrial ecosystems would have to be defined on regional, national and local scales. The assessments of these environmental impacts are, in turn, necessary for estimating the socio-economic impacts of environmental changes. This paper gives an overview of the international actions in combatting climate change and some information on the status of science on climate change and its impacts. (EG)

  7. Improved Structure and Fabrication of Large, High-Power KHPS Rotors - Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Corren, Dean [Verdant Power, Inc.; Colby, Jonathan [Verdant Power, Inc.; Adonizio, Mary Ann [Verdant Power, Inc.

    2013-01-29

    Verdant Power, Inc, working in partnership with the National Renewable Energy Laboratory (NREL), Sandia National Laboratories (SNL), and the University of Minnesota St. Anthony Falls Laboratory (SAFL), among other partners, used evolving Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) models and techniques to improve the structure and fabrication of large, high-power composite Kinetic Hydropower System (KHPS) rotor blades. The objectives of the project were to: design; analyze; develop for manufacture and fabricate; and thoroughly test, in the lab and at full scale in the water, the improved KHPS rotor blade.

  8. The multilevel fast multipole algorithm (MLFMA) for solving large-scale computational electromagnetics problems

    CERN Document Server

    Ergul, Ozgur

    2014-01-01

    The Multilevel Fast Multipole Algorithm (MLFMA) for Solving Large-Scale Computational Electromagnetic Problems provides a detailed and instructional overview of implementing MLFMA. The book: presents a comprehensive treatment of the MLFMA algorithm, including basic linear algebra concepts, recent developments on parallel computation, and a number of application examples; covers solutions of electromagnetic problems involving dielectric objects and perfectly-conducting objects; discusses applications including scattering from airborne targets, scattering from red

  9. Causes of Indoor Air Quality Problems in Schools: Summary of Scientific Research

    Energy Technology Data Exchange (ETDEWEB)

    Bayer, C.W.

    2001-02-22

    In the modern urban setting, most individuals spend about 80% of their time indoors and are therefore exposed to the indoor environment to a much greater extent than to the outdoors (Lebowitz 1992). Concomitant with this increased habitation in urban buildings, there have been numerous reports of adverse health effects related to indoor air quality (IAQ) (sick buildings). Most of these buildings were built in the last two decades and were constructed to be energy-efficient. The quality of air in the indoor environment can be altered by a number of factors: release of volatile compounds from furnishings, floor and wall coverings, and other finishing materials or machinery; inadequate ventilation; poor temperature and humidity control; re-entrainment of outdoor volatile organic compounds (VOCs); and the contamination of the indoor environment by microbes (particularly fungi). Armstrong Laboratory (1992) found that the three most frequent causes of IAQ problems are (1) inadequate design and/or maintenance of the heating, ventilation, and air-conditioning (HVAC) system, (2) a shortage of fresh air, and (3) lack of humidity control. A similar study by the National Institute for Occupational Safety and Health (NIOSH 1989) recognized inadequate ventilation as the most frequent source of IAQ problems in the work environment (52% of the time). Poor IAQ due to microbial contamination can be the result of the complex interactions of physical, chemical, and biological factors. Harmful fungal populations, once established in the HVAC system or occupied space of a modern building, may episodically produce or intensify what is known as sick building syndrome (SBS) (Cummings and Withers 1998). Indeed, SBS caused by fungi may be more enduring and recalcitrant to treatment than SBS from multiple chemical exposures (Andrae 1988). An understanding of the microbial ecology of the indoor environment is crucial to ultimately resolving many IAQ problems. The incidence of SBS related to multiple

  10. FDTD method for laser absorption in metals for large scale problems.

    Science.gov (United States)

    Deng, Chun; Ki, Hyungson

    2013-10-21

    The FDTD method has been successfully used for many electromagnetic problems, but its application to laser material processing has been limited because even a several-millimeter domain requires a prohibitively large number of grids. In this article, we present a novel FDTD method for simulating large-scale laser beam absorption problems, especially for metals, by enlarging the laser wavelength while maintaining the material's reflection characteristics. For validation purposes, the proposed method has been tested with in-house FDTD codes to simulate p-, s-, and circularly polarized 1.06 μm irradiation on Fe and Sn targets, and the simulation results are in good agreement with theoretical predictions.
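    For readers unfamiliar with the method, the basic FDTD update can be sketched in one dimension. The snippet below is a generic, normalized 1D Yee scheme with a Courant number of 1 (our own illustration of the technique, not the authors' wavelength-scaled code):

```python
import numpy as np

# Minimal 1D vacuum FDTD sketch (illustrative only).
# Normalized Yee updates with Courant number S = c*dt/dx = 1.
def fdtd_1d(nx=200, nt=150, src=50):
    ez = np.zeros(nx)        # electric field at integer grid points
    hy = np.zeros(nx - 1)    # magnetic field at half grid points
    for t in range(nt):
        hy += np.diff(ez)                          # H update
        ez[1:-1] += np.diff(hy)                    # E update (interior)
        ez[src] += np.exp(-((t - 30) / 8.0) ** 2)  # soft Gaussian source
    return ez

field = fdtd_1d()
```

    Enlarging the wavelength, as the article proposes, reduces the number of grid cells needed per unit of physical length, which is what makes millimeter-scale metal targets tractable.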

  11. Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization.

    Science.gov (United States)

    Marai, G Elisabeta

    2018-01-01

    Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process with the abstraction stage-and its evaluation-of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements in activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature.

  12. Solving a large-scale precedence constrained scheduling problem with elastic jobs using tabu search

    DEFF Research Database (Denmark)

    Pedersen, C.R.; Rasmussen, R.V.; Andersen, Kim Allan

    2007-01-01

    This paper presents a solution method for minimizing the makespan of a practical large-scale scheduling problem with elastic jobs. The jobs are processed on three servers and restricted by precedence constraints, time windows and capacity limitations. We derive a new method for approximating the server exploitation of the elastic jobs and solve the problem using a tabu search procedure. Finding an initial feasible solution is in general NP-complete, but the tabu search procedure includes a specialized heuristic for solving this problem. The solution method has proven to be very efficient and leads to a significant decrease in makespan compared to the strategy currently implemented.

  13. Minimization of Linear Functionals Defined on Solutions of Large-Scale Discrete Ill-Posed Problems

    DEFF Research Database (Denmark)

    Elden, Lars; Hansen, Per Christian; Rojas, Marielba

    2003-01-01

    The minimization of linear functionals defined on the solutions of discrete ill-posed problems arises, e.g., in the computation of confidence intervals for these solutions. In 1990, Elden proposed an algorithm for this minimization problem based on a parametric-programming reformulation involving the solution of a sequence of trust-region problems, and using matrix factorizations. In this paper, we describe MLFIP, a large-scale version of this algorithm where a limited-memory trust-region solver is used on the subproblems. We illustrate the use of our algorithm in connection with an inverse heat...

  14. Solving a large-scale precedence constrained scheduling problem with elastic jobs using tabu search

    DEFF Research Database (Denmark)

    Pedersen, C.R.; Rasmussen, R.V.; Andersen, Kim Allan

    2007-01-01

    This paper presents a solution method for minimizing the makespan of a practical large-scale scheduling problem with elastic jobs. The jobs are processed on three servers and restricted by precedence constraints, time windows and capacity limitations. We derive a new method for approximating the server exploitation of the elastic jobs and solve the problem using a tabu search procedure. Finding an initial feasible solution is in general NP-complete, but the tabu search procedure includes a specialized heuristic for solving this problem. The solution method has proven to be very efficient and leads to a significant decrease in makespan compared to the strategy currently implemented.

  15. Solving large-scale sparse eigenvalue problems and linear systems of equations for accelerator modeling

    International Nuclear Information System (INIS)

    Gene Golub; Kwok Ko

    2009-01-01

    The solutions of sparse eigenvalue problems and linear systems constitute one of the key computational kernels in the discretization of partial differential equations for the modeling of linear accelerators. The computational challenges faced by existing techniques for solving those sparse eigenvalue problems and linear systems call for continuing research to improve the algorithms, so that the ever-increasing problem sizes required by the physics applications can be tackled. Under the support of this award, the filter algorithm for solving large sparse eigenvalue problems was developed at Stanford to address the computational difficulties of the previous methods, with the goal of enabling accelerator simulations on what was then the world's largest unclassified supercomputer at NERSC for this class of problems. Specifically, a new method, the Hermitian/skew-Hermitian splitting method, was proposed and researched as an improved method for solving linear systems with non-Hermitian positive definite and semidefinite matrices.
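    The Hermitian/skew-Hermitian splitting (HSS) iteration mentioned above can be sketched compactly. The following is a textbook-style illustration using dense linear algebra on a toy system, not the project's actual large-scale solver:

```python
import numpy as np

# HSS iteration sketch for A x = b with A non-Hermitian positive definite.
# Split A = H + S with H Hermitian and S skew-Hermitian, then alternate
# two shifted half-step solves per iteration.
def hss_solve(A, b, alpha=1.0, iters=50):
    n = A.shape[0]
    I = np.eye(n)
    H = (A + A.conj().T) / 2          # Hermitian part
    S = (A - A.conj().T) / 2          # skew-Hermitian part
    x = np.zeros_like(b)
    for _ in range(iters):
        x_half = np.linalg.solve(alpha * I + H, (alpha * I - S) @ x + b)
        x = np.linalg.solve(alpha * I + S, (alpha * I - H) @ x_half + b)
    return x
```

    For any shift alpha > 0 and a positive definite Hermitian part H, the iteration converges; in practice the two shifted systems are themselves solved approximately at scale.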

  16. Performance Modeling of Hybrid MPI/OpenMP Scientific Applications on Large-scale Multicore Cluster Systems

    KAUST Repository

    Wu, Xingfu

    2011-08-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore clusters: IBM POWER4, POWER5+ and Blue Gene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore clusters, because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application, the Gyrokinetic Toroidal Code (GTC) in magnetic fusion, to validate our performance model of the hybrid application on these multicore clusters. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore clusters. © 2011 IEEE.
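    The flavor of a bandwidth-contention model can be illustrated with a deliberately simplified formula (our own toy simplification; the paper's framework is considerably more detailed): runtime is compute time plus memory time, where the memory time is inflated by the number of cores sharing the sustained bandwidth.

```python
# Toy bandwidth-contention performance model (not the paper's equations).
def predicted_time(flops, bytes_moved, flop_rate, stream_bw, cores_sharing):
    t_compute = flops / flop_rate
    # each of the sharing cores sees roughly 1/cores_sharing of the
    # STREAM-measured sustained bandwidth
    t_memory = bytes_moved / (stream_bw / cores_sharing)
    return t_compute + t_memory
```

    Plugging in per-core workload and a STREAM-style sustained bandwidth gives a weak-scaling prediction: compute time stays flat while memory time grows with the number of cores contending for each memory controller.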

  17. Performance Modeling of Hybrid MPI/OpenMP Scientific Applications on Large-scale Multicore Cluster Systems

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2011-01-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore clusters: IBM POWER4, POWER5+ and Blue Gene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore clusters, because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application, the Gyrokinetic Toroidal Code (GTC) in magnetic fusion, to validate our performance model of the hybrid application on these multicore clusters. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore clusters. © 2011 IEEE.

  18. El problema de la barrera linguistica en el desarrollo cientifico y tecnologico (The Problem of the Language Barrier in Scientific and Technological Development).

    Science.gov (United States)

    Zierer, Ernesto

    This monograph discusses the problem of the language barrier in scientific and technological development in terms of several parameters describing the flow of scientific information from one language to another. The numerical values of the language barrier parameters of the model are calculated in the field of information on second language…

  19. PSEUDO-SCIENTIFIC ECONOMIC POLICIES OF MOLDOVA ASSOCIATION TO THE EU: METHODOLOGY, PROBLEMS, SOLUTIONS

    Directory of Open Access Journals (Sweden)

    Gheorghe RUSU

    2016-01-01

    Full Text Available Economic policies and decisions on EU association dating from the beginning of the 1990s were pseudo-scientific, contradictory and incoherent, because those policies were not based on the modern economic theories elaborated and promoted by the EU. Actuality. The topic is timely from the perspective of analysing the factors that delayed the association process of Moldova to the EU and that, at the same time, increased instability and disequilibrium in the national economy, raised the levels of social vulnerability and constraint, and ultimately widened the gap between the national and EU levels of economic development. During the period 2000-2015, the socio-economic policy of the Republic of Moldova is better described as small and fragmented steps in conceiving economic and financial instruments for integration into the EU, reflected in the Neighbourhood Partnership and the Association Agreement with the EU. These processes led to the state's incapacity to define its own objectives and socio-economic priorities for association, and legitimated a continuous stage of transition to the market economy. The scope of the present article is to propose a real change in the development and socio-economic association policies for achieving the final objective of integration into the EU. The proposals consist in emphasizing and implementing the EU economic principles reflected in the neoclassical synthesis and neo-conservative theories; elaborating and implementing a new strategy of economic supervision, coordination and anticipation of economic disequilibrium; achieving economic stability to diminish the negative effects of the global and regional crises on the national economy; and adapting the development policies to the national socio-economic conditions. The methods used for the elaboration and achievement of the expected results of the study were analysis and synthesis of the

  20. Extracting scientific articles from a large digital archive: BioStor and the Biodiversity Heritage Library.

    Science.gov (United States)

    Page, Roderic D M

    2011-05-23

    The Biodiversity Heritage Library (BHL) is a large digital archive of legacy biological literature, comprising over 31 million pages scanned from books, monographs, and journals. During the digitisation process basic metadata about the scanned items is recorded, but not article-level metadata. Given that the article is the standard unit of citation, this makes it difficult to locate cited literature in BHL. Adding the ability to easily find articles in BHL would greatly enhance the value of the archive. A service was developed to locate articles in BHL based on matching article metadata to BHL metadata using approximate string matching, regular expressions, and string alignment. This article locating service is exposed as a standard OpenURL resolver on the BioStor web site http://biostor.org/openurl/. This resolver can be used on the web, or called by bibliographic tools that support OpenURL. BioStor provides tools for extracting, annotating, and visualising articles from the Biodiversity Heritage Library. BioStor is available from http://biostor.org/.
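    An OpenURL request to a resolver like the one described above is just a key-value query string. The sketch below builds one using standard OpenURL-style keys (journal title, volume, starting page, date); the exact set of keys BioStor honours is an assumption here:

```python
from urllib.parse import urlencode

# Sketch: build an OpenURL-style query for an article-locating resolver.
def biostor_openurl(title, volume, spage, date):
    params = {
        "genre": "article",
        "title": title,      # journal title
        "volume": volume,
        "spage": spage,      # starting page of the article
        "date": date,
    }
    return "http://biostor.org/openurl/?" + urlencode(params)
```

    A reference manager that supports OpenURL can be pointed at the same base URL, so citations resolve directly to the scanned pages.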

  1. Neighborhood communication paradigm to increase scalability in large-scale dynamic scientific applications

    KAUST Repository

    Ovcharenko, Aleksandr

    2012-03-01

    This paper introduces a general-purpose communication package built on top of MPI which is aimed at improving inter-processor communications independently of the supercomputer architecture being considered. The package is developed to support parallel applications that rely on computation characterized by a large number of messages of various sizes, often small, that are focused within processor neighborhoods. In some cases, such as solvers having static mesh partitions, the number and size of messages are known a priori. However, in other cases, such as mesh adaptation, the messages evolve and vary in number and size and include the dynamic movement of partition objects. The current package provides a utility for dynamic applications based on two key attributes: (i) explicit consideration of the neighborhood communication pattern to avoid many-to-many calls and also to reduce the number of collective calls to a minimum, and (ii) use of non-blocking MPI functions along with message packing to manage message flow control and reduce the number and time of communication calls. The test application demonstrated is parallel unstructured mesh adaptation. Results on IBM Blue Gene/P and Cray XE6 computers show that the use of neighborhood-based communication control leads to scalable results when executing generally imbalanced mesh adaptation runs. © 2011 Elsevier B.V. All rights reserved.
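    The message-packing idea is independent of MPI itself and can be illustrated in plain Python: many small payloads bound for the same neighbor are combined into a single length-prefixed buffer, so only one communication call per neighbor is needed per round. This is a sketch of the concept, not the package's implementation:

```python
# Pack many small messages per destination into one buffer per neighbor.
def pack_messages(messages):
    """messages: list of (neighbor_rank, payload_bytes) pairs."""
    buffers = {}
    for rank, payload in messages:
        buffers.setdefault(rank, bytearray()).extend(
            len(payload).to_bytes(4, "little") + payload)
    return buffers  # one combined buffer per neighbor

# Recover the individual payloads on the receiving side.
def unpack_buffer(buf):
    out, i = [], 0
    while i < len(buf):
        n = int.from_bytes(buf[i:i + 4], "little")
        out.append(bytes(buf[i + 4:i + 4 + n]))
        i += 4 + n
    return out
```

    In the MPI setting, each combined buffer would be handed to a single non-blocking send to its neighbor rank, replacing one call per small message with one call per neighbor.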

  2. The Analysis of Students' Scientific Reasoning Ability in Solving the Modified Lawson Classroom Test of Scientific Reasoning (MLCTSR) Problems by Applying the Levels of Inquiry

    Directory of Open Access Journals (Sweden)

    N. Novia

    2017-04-01

    Full Text Available This study aims to determine students' achievement in answering modified Lawson Classroom Test of Scientific Reasoning (MLCTSR) questions, both in overall science teaching and on every aspect of scientific reasoning ability. Six aspects of scientific reasoning ability were measured: conservation reasoning, proportional reasoning, controlling variables, combinatorial reasoning, probabilistic reasoning, and correlational reasoning. The research was also conducted to observe the development of scientific reasoning under the levels-of-inquiry model. Students' reasoning ability was measured using the MLCTSR, a test developed from Lawson's Classroom Test of Scientific Reasoning (LCTSR, 2000), which comprises 12 multiple-choice questions. The research method chosen in this study is the descriptive quantitative method, with a One Group Pretest-Posttest design. The population of this study is the entire grade VII junior high cohort of the 2014/2015 academic year in one junior high school in Bandung. The sample, selected by purposive sampling, is one grade VII class, class VII C. The results showed a quantitative increase in scientific reasoning, although its value is not large.

  3. A note on solving large-scale zero-one programming problems

    NARCIS (Netherlands)

    Adema, Jos J.

    1988-01-01

    A heuristic for solving large-scale zero-one programming problems is provided. The heuristic is based on the modifications made by H. Crowder et al. (1983) to the standard branch-and-bound strategy. First, the initialization is modified. The modification is only useful if the objective function

  4. Towards a Versatile Problem Diagnosis Infrastructure for Large Wireless Sensor Networks

    NARCIS (Netherlands)

    Iwanicki, Konrad; Steen, van Maarten

    2007-01-01

    In this position paper, we address the issue of durable maintenance of a wireless sensor network, which will be crucial if the vision of large, long-lived sensornets is to become reality. Durable maintenance requires tools for diagnosing and fixing occurring problems, which can range from

  5. Large scale inverse problems computational methods and applications in the earth sciences

    CERN Document Server

    Scheichl, Robert; Freitag, Melina A; Kindermann, Stefan

    2013-01-01

    This book is the second volume of a three-volume series recording the "Radon Special Semester 2011 on Multiscale Simulation & Analysis in Energy and the Environment" that took place in Linz, Austria, October 3-7, 2011. The volume addresses the common ground in the mathematical and computational procedures required for large-scale inverse problems and data assimilation in forefront applications.

  6. An efficient method to handle the 'large p, small n' problem for ...

    Indian Academy of Sciences (India)

    The so-called 'large p, small n' or 'short-fat data' problem can occur if the number of ... (SNPs) in GWAS based on the Haseman–Elston regression (H–E) (DeFries 2010). ..... For instance, if a population is admixed or genetically heterogeneous, the ...

  7. Identifying Problems in Students’ Final Projects Based on Scientific Writing Guidelines

    Directory of Open Access Journals (Sweden)

    Endang Ernawati

    2010-11-01

    Full Text Available This article analyzes students' difficulties and abilities in writing their final projects, namely the undergraduate theses and undergraduate paper produced by students at the English Department, Bina Nusantara University. This was a preliminary study to support an appropriate student guideline for writing final projects. The study applied qualitative methods, analyzing four theses and one paper in terms of their format (titles, introduction, theoretical background, analysis, conclusion, bibliography) and using a paper rubric to analyze the contents. It can be concluded that, generally, students guided by their mentor or lecturer understand the final-project guidelines and are able to apply them in their theses and papers. However, there is still a lack of clarity and relevance in how they express their ideas, and their ability to write in both English and Bahasa Indonesia must be improved. These problems can be overcome by socializing the writing guidelines to both students and lecturers, providing students with critical-thinking skills, cooperating with the library to guide them in information-literacy skills, and with the language center to improve their writing skills.

  8. The interaction problems between large and small business in modern conditions

    Directory of Open Access Journals (Sweden)

    Belyaev Mikhail

    2017-01-01

    Full Text Available The development of market relations and changes in the business environment encourage enterprises to look for new management methods and to improve forms of interaction. In this regard, identifying the interactions between large and small businesses, and evaluating the development of their relations, is an important and urgent problem in modern conditions. The purpose of the survey is to study the interaction of large and small businesses and to evaluate the relations between them. The study was conducted on the basis of a comprehensive and systematic approach, using methods of comparative, retrospective, statistical and mathematical analysis. In accordance with this purpose, the prerequisites for the development of large and small businesses, the features and problems of their functioning, and the links between them are identified. The most common forms of interaction between large and small enterprises were identified: outsourcing, franchising, leasing, subcontracting, venture financing, and the establishment of regional forms of cooperation between large and small businesses. However, the cooperative processes of large and small business in Russia are not sufficiently developed today. The authors identify factors that impede the growth of Russian production, offer recommendations for the development of large and small businesses, and justify the state's role in this process. In addition, they describe the mechanism of state support for small business, including its organizational, financial, information and consulting components.

  9. Large area high quality silicon detectors for scientific research and radioactivity monitoring in the environment

    International Nuclear Information System (INIS)

    Frolov, D.; Perevertailo, V.; Frolov, O.; Kononenko, Yu.; Pugatch, V.; Rozenfeld, A.

    1995-01-01

    breakdown of the junction were detected and examined. Examination of gate-controlled junctions allows one to obtain useful information on generation currents, breakdown effects, and the operation of protective rings in detectors. The results of this research were used in the design of a series of Si detectors: detectors with active areas of 0.1 to 20 cm² and a wide range of depletion depths; photodetectors capable of working together with a scintillator; large-area strip detectors with 16 and 128 strips; and an annular strip detector with 40 strips and a maximum strip diameter of 13 cm. (author)

  10. A novel artificial fish swarm algorithm for solving large-scale reliability-redundancy application problem.

    Science.gov (United States)

    He, Qiang; Hu, Xiangtao; Ren, Hong; Zhang, Hongqi

    2015-11-01

    A novel artificial fish swarm algorithm (NAFSA) is proposed for solving the large-scale reliability-redundancy allocation problem (RAP). In NAFSA, the social behaviors of the fish swarm are classified in three ways: foraging behavior, reproductive behavior, and random behavior. The foraging behavior uses two position-updating strategies, and the selection and crossover operators are applied to define the reproductive ability of an artificial fish. For the random behavior, which is essentially a mutation strategy, the basic cloud generator is used as the mutation operator. Finally, numerical results for four benchmark problems and a large-scale RAP are reported and compared. NAFSA shows good performance in terms of computational accuracy and computational efficiency for the large-scale RAP. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
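    A heavily simplified, generic fish-swarm-style search on a toy objective can convey the idea of the behavior classes. The sketch below implements only crude foraging (probe a nearby point, move if better) and random-mutation analogues; it is our own illustration, not the paper's NAFSA:

```python
import random

# Toy fish-swarm-style minimizer (illustrative only).
def afsa_minimize(f, dim=3, fish=20, iters=200, step=0.5, seed=1):
    rng = random.Random(seed)
    swarm = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(fish)]
    best = min(swarm, key=f)
    for _ in range(iters):
        new_swarm = []
        for x in swarm:
            # "foraging": probe a random nearby point, move if it is better
            probe = [xi + rng.uniform(-step, step) for xi in x]
            cand = probe if f(probe) < f(x) else x
            # "random behavior": occasional mutation of one coordinate
            if rng.random() < 0.1:
                cand = list(cand)
                cand[rng.randrange(dim)] += rng.gauss(0, step)
            new_swarm.append(cand)
        swarm = new_swarm
        best = min(swarm + [best], key=f)
    return best, f(best)
```

    On a reliability-redundancy instance the objective and constraints are far more structured, but the interplay of local improvement and randomized diversification is the same.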

  11. Efficient numerical methods for the large-scale, parallel solution of elastoplastic contact problems

    KAUST Repository

    Frohne, Jörg

    2015-08-06

    © 2016 John Wiley & Sons, Ltd. Quasi-static elastoplastic contact problems are ubiquitous in many industrial processes and other contexts, and their numerical simulation is consequently of great interest in accurately describing and optimizing production processes. The key component in these simulations is the solution of a single load step of a time iteration. From a mathematical perspective, the problems to be solved in each time step are characterized by the difficulties of variational inequalities for both the plastic behavior and the contact problem. Computationally, they also often lead to very large problems. In this paper, we present and evaluate a complete set of methods that are (1) designed to work well together and (2) allow for the efficient solution of such problems. In particular, we use adaptive finite element meshes with linear and quadratic elements, a Newton linearization of the plasticity, active set methods for the contact problem, and multigrid-preconditioned linear solvers. Through a sequence of numerical experiments, we show the performance of these methods. This includes highly accurate solutions of a three-dimensional benchmark problem and scaling our methods in parallel to 1024 cores and more than a billion unknowns.

  12. Efficient numerical methods for the large-scale, parallel solution of elastoplastic contact problems

    KAUST Repository

    Frohne, Jörg; Heister, Timo; Bangerth, Wolfgang

    2015-01-01

    © 2016 John Wiley & Sons, Ltd. Quasi-static elastoplastic contact problems are ubiquitous in many industrial processes and other contexts, and their numerical simulation is consequently of great interest in accurately describing and optimizing production processes. The key component in these simulations is the solution of a single load step of a time iteration. From a mathematical perspective, the problems to be solved in each time step are characterized by the difficulties of variational inequalities for both the plastic behavior and the contact problem. Computationally, they also often lead to very large problems. In this paper, we present and evaluate a complete set of methods that are (1) designed to work well together and (2) allow for the efficient solution of such problems. In particular, we use adaptive finite element meshes with linear and quadratic elements, a Newton linearization of the plasticity, active set methods for the contact problem, and multigrid-preconditioned linear solvers. Through a sequence of numerical experiments, we show the performance of these methods. This includes highly accurate solutions of a three-dimensional benchmark problem and scaling our methods in parallel to 1024 cores and more than a billion unknowns.

  13. Mutation-based learning to improve student autonomy and scientific inquiry skills in a large genetics laboratory course.

    Science.gov (United States)

    Wu, Jinlu

    2013-01-01

    Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a "mutation" method in a molecular genetics laboratory course. Students could choose to delete, add, reverse, or replace certain steps of the standard protocol to explore questions of interest to them in a given experimental scenario. They wrote experimental proposals to address their rationales and hypotheses for the "mutations"; conducted experiments in parallel, according to both standard and mutated protocols; and then compared and analyzed results to write individual lab reports. Various autonomy-supportive measures were provided in the entire experimental process. Analyses of student work and feedback suggest that students using the MBL approach 1) spend more time discussing experiments, 2) use more scientific inquiry skills, and 3) find the increased autonomy afforded by MBL more enjoyable than do students following regimented instructions in a conventional "cookbook"-style laboratory. Furthermore, the MBL approach does not incur an obvious increase in labor and financial costs, which makes it feasible for easy adaptation and implementation in a large class.

  14. Solution of large nonlinear time-dependent problems using reduced coordinates

    International Nuclear Information System (INIS)

    Mish, K.D.

    1987-01-01

    This research is concerned with the idea of reducing a large time-dependent problem, such as one obtained from a finite-element discretization, down to a more manageable size while preserving the most important physical behavior of the solution. This reduction process is motivated by the concept of a projection operator on a Hilbert space, and leads to the Lanczos algorithm for the generation of approximate eigenvectors of a large symmetric matrix. The Lanczos algorithm is then used to develop a reduced form of the spatial component of a time-dependent problem. The solution of the remaining temporal part of the problem is considered from the standpoint of numerical-integration schemes in the time domain. All of these theoretical results are combined to motivate the proposed reduced-coordinate algorithm. This algorithm is then developed, discussed, and compared to related methods from the mechanics literature. The proposed reduced-coordinate method is then applied to the solution of some representative problems in mechanics. The results of these problems are discussed, conclusions are drawn, and suggestions are made for related future research.
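    The reduction step described above can be sketched with a basic Lanczos iteration: project the large symmetric matrix onto a small Krylov basis and work with the resulting tridiagonal matrix. This is an illustration of the idea, not the author's algorithm:

```python
import numpy as np

# Basic Lanczos iteration: build an orthonormal Krylov basis Q and the
# small tridiagonal projection T = Q^T A Q of a symmetric matrix A.
def lanczos(A, v0, m):
    n = len(v0)
    Q = np.zeros((n, m))
    alpha = np.zeros(m)       # diagonal of T
    beta = np.zeros(m - 1)    # off-diagonal of T
    q = v0 / np.linalg.norm(v0)
    q_prev = np.zeros(n)
    b = 0.0
    for j in range(m):
        Q[:, j] = q
        w = A @ q - b * q_prev
        alpha[j] = q @ w
        w -= alpha[j] * q
        if j < m - 1:
            b = np.linalg.norm(w)
            beta[j] = b
            q_prev, q = q, w / b
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return Q, T
```

    The eigenvalues of the small tridiagonal T (the Ritz values) approximate the extremal eigenvalues of A, which is what makes a time integration carried out in the reduced coordinates capture the dominant physical modes.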

  15. Applying AN Object-Oriented Database Model to a Scientific Database Problem: Managing Experimental Data at Cebaf.

    Science.gov (United States)

    Ehlmann, Bryon K.

    Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.

  16. A hybrid adaptive large neighborhood search algorithm applied to a lot-sizing problem

    DEFF Research Database (Denmark)

    Muller, Laurent Flindt; Spoorendonk, Simon

    This paper presents a hybrid of a general heuristic framework that has been successfully applied to vehicle routing problems and a general-purpose MIP solver. The framework uses local search and an adaptive procedure which chooses between a set of large neighborhoods to be searched. A mixed integer...... of a solution and to investigate the feasibility of elements in such a neighborhood. The hybrid heuristic framework is applied to the multi-item capacitated lot-sizing problem with dynamic lot sizes, where experiments have been conducted on a series of instances from the literature. On average the heuristic...
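    The adaptive layer of such a framework, i.e. choosing among neighborhoods and rewarding those that recently improved the solution, can be sketched generically. The weights and update rule below are our own assumptions, not the paper's:

```python
import random

# Roulette-wheel selection over neighborhood weights.
def alns_choose(weights, rng):
    r = rng.random() * sum(weights)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1

# Exponentially smoothed weight update: reward improving neighborhoods.
def alns_update(weights, chosen, improved, reaction=0.2, reward=2.0):
    score = reward if improved else 1.0
    weights[chosen] = (1 - reaction) * weights[chosen] + reaction * score
```

    Over many iterations, neighborhoods that keep improving the incumbent accumulate weight and are selected more often, which is what makes the search "adaptive".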

  17. Computational Comparison of Several Greedy Algorithms for the Minimum Cost Perfect Matching Problem on Large Graphs

    DEFF Research Database (Denmark)

    Wøhlk, Sanne; Laporte, Gilbert

    2017-01-01

    The aim of this paper is to computationally compare several algorithms for the Minimum Cost Perfect Matching Problem on an undirected complete graph. Our work is motivated by the need to solve large instances of the Capacitated Arc Routing Problem (CARP) arising in the optimization of garbage...... collection in Denmark. Common heuristics for the CARP involve the optimal matching of the odd-degree nodes of a graph. The algorithms used in the comparison include the CPLEX solution of an exact formulation, the LEDA matching algorithm, a recent implementation of the Blossom algorithm, as well as six...
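    The greedy idea common to such heuristics can be sketched as follows. This is an illustrative toy, not any of the six implementations compared in the paper: scan the edges of the complete graph in nondecreasing cost order and match any pair whose endpoints are both still free.

```python
import itertools

def greedy_perfect_matching(nodes, cost):
    """Greedy heuristic for min-cost perfect matching on a complete graph:
    scan edges in nondecreasing cost order, matching any pair whose
    endpoints are both still unmatched.  Fast, but not optimal in general."""
    edges = sorted(itertools.combinations(nodes, 2), key=lambda e: cost(*e))
    matched, matching = set(), []
    for u, v in edges:
        if u not in matched and v not in matched:
            matched.update((u, v))
            matching.append((u, v))
    return matching

# Example: 4 points on a line with Euclidean cost.
pts = {1: 0.0, 2: 1.0, 3: 5.0, 4: 6.0}
m = greedy_perfect_matching(list(pts), lambda u, v: abs(pts[u] - pts[v]))
# Pairs (1, 2) and (3, 4), total cost 2.0
```

    For odd-degree node sets arising in CARP heuristics, such a greedy matching gives a fast but generally suboptimal starting point, which is why exact and Blossom-based algorithms are included in the comparison.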

  18. Large scientific releases

    International Nuclear Information System (INIS)

    Pongratz, M.B.

    1981-01-01

    The motivation for active experiments in space is considered, taking into account the use of active techniques to obtain a better understanding of the natural space environment, the utilization of the advantages of space as a laboratory to study fundamental plasma physics, and the employment of active techniques to determine the magnitude, degree, and consequences of artificial modification of the space environment. It is pointed out that mass-injection experiments in space plasmas began about twenty years ago with the Project Firefly releases. Attention is given to mass-release techniques and diagnostics, operational aspects of mass release active experiments, the active observation of mass release experiments, active perturbation mass release experiments, simulating an artificial modification of the space environment, and active experiments to study fundamental plasma physics

  19. Improving the computation efficiency of COBRA-TF for LWR safety analysis of large problems

    International Nuclear Information System (INIS)

    Cuervo, D.; Avramova, M. N.; Ivanov, K. N.

    2004-01-01

    A matrix solver is implemented in COBRA-TF in order to improve the computational efficiency of both numerical solution methods existing in the code, Gauss elimination and the Gauss-Seidel iterative technique. Both methods are used to solve the system of linear pressure equations and rely on the solution of large sparse matrices. The introduced solver accelerates the solution of these matrices in cases with a large number of cells. The execution time is reduced by half compared to the execution time without the matrix solver for cases with large matrices. The achieved improvement and the planned future work in this direction are important for performing efficient LWR safety analyses of large problems. (authors)
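    The role of a dedicated sparse matrix solver can be sketched with a toy example. The system below is a hypothetical 1-D Poisson-like stand-in; COBRA-TF's actual pressure matrices and solver are not reproduced here.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Hypothetical stand-in for a pressure-equation system: a large sparse
# tridiagonal (Poisson-like) matrix over n cells.
n = 1000
off = -np.ones(n - 1)
A = sp.diags([off, 2.0 * np.ones(n), off], [-1, 0, 1], format="csc")
b = np.ones(n)

# A sparse direct solve plays the role of the "matrix solver" above:
# only the nonzero diagonals are stored and factored, so for large n
# this is far cheaper than dense Gauss elimination.
p = spla.spsolve(A, b)
```

    Exploiting the sparsity pattern this way is where the savings for large cell counts come from, whether the underlying method is direct elimination or a Gauss-Seidel sweep.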

  20. Numerical solution of large nonlinear boundary value problems by quadratic minimization techniques

    International Nuclear Information System (INIS)

    Glowinski, R.; Le Tallec, P.

    1984-01-01

    The objective of this paper is to describe the numerical treatment of large, highly nonlinear two- or three-dimensional boundary value problems by quadratic minimization techniques. In all the different situations where these techniques were applied, the methodology remains the same and is organized as follows: 1) derive a variational formulation of the original boundary value problem, and approximate it by Galerkin methods; 2) transform this variational formulation into a quadratic minimization problem (least squares methods) or into a sequence of quadratic minimization problems (augmented Lagrangian decomposition); 3) solve each quadratic minimization problem by a conjugate gradient method with preconditioning, the preconditioning matrix being sparse, positive definite, and fixed once and for all in the iterative process. This paper illustrates the methodology above on two different examples: the description of least squares solution methods and their application to the solution of the unsteady Navier-Stokes equations for incompressible viscous fluids; and the description of augmented Lagrangian decomposition techniques and their application to the solution of equilibrium problems in finite elasticity
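    Step 3 of the methodology, conjugate gradients with a sparse, positive definite preconditioner fixed once and for all, can be sketched as follows. This is illustrative only: a simple Jacobi (diagonal) preconditioner and a synthetic matrix stand in for whatever a given application would use.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Symmetric positive definite stand-in for the matrix of one quadratic
# minimization problem (step 2 of the methodology).
n = 200
off = -np.ones(n - 1)
A = sp.diags([off, 4.0 * np.ones(n), off], [-1, 0, 1], format="csr")
b = np.ones(n)

# Jacobi preconditioner: sparse, positive definite, and fixed once
# and for all during the iterative process.
d = A.diagonal()
M = spla.LinearOperator((n, n), matvec=lambda r: r / d)

# Preconditioned conjugate gradient solve (step 3).
x, info = spla.cg(A, b, M=M)
```

    Because the preconditioner is fixed, its setup cost is paid once even when the conjugate gradient solve is repeated for each problem in the sequence.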

  1. Scientific ballooning. Proceedings. PSB Meeting of the COSPAR Panel on Technical Problems Related to Scientific Ballooning which was held during the Thirtieth COSPAR Scientific Assembly, Hamburg (Germany), 11 - 21 Jul 1994.

    Science.gov (United States)

    Riedler, W.; Torkar, K.

    1996-05-01

    This issue is grouped into sections on materials, design, performance and analysis of balloons, reviews of major national and international balloon programmes, novel instrumentation and systems for scientific ballooning, and selected recent scientific observations.

  2. Pioneering space research in the USSR and mathematical modeling of large problems of radiation transfer

    International Nuclear Information System (INIS)

    Sushkevich, T.A.

    2011-01-01

    This review is intended to remind scientists of the older generation of some memorable historical pages and of many famous researchers, teachers and colleagues. For younger researchers and foreign colleagues it will be useful to learn about the pioneering advances of Soviet scientists in the field of informational and mathematical support for cosmonautics problems on the eve of the space era. Main attention is paid to the scientific experiments conducted on piloted space vehicles and to the research teams who created the information and mathematical tools for the first space projects. The role of Mstislav Vsevolodovich Keldysh, the Chief Theoretician of cosmonautics, is particularly emphasized. He largely determined the basic directions of development of space research and of remote sensing of the Earth and planets

  3. An Adaptive Large Neighborhood Search Algorithm for the Resource-constrained Project Scheduling Problem

    DEFF Research Database (Denmark)

    Muller, Laurent Flindt

    2009-01-01

    We present an application of an Adaptive Large Neighborhood Search (ALNS) algorithm to the Resource-constrained Project Scheduling Problem (RCPSP). The ALNS framework was first proposed by Pisinger and Røpke [19] and can be described as a large neighborhood search algorithm with an adaptive layer......, where a set of destroy/repair neighborhoods compete to modify the current solution in each iteration of the algorithm. Experiments are performed on the well-known J30, J60 and J120 benchmark instances, which show that the proposed algorithm is competitive and confirms the strength of the ALNS framework...
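    The adaptive layer described above can be sketched generically: destroy/repair pairs are chosen by roulette-wheel selection, and their weights are updated according to past success. This is a schematic, not the paper's implementation, and the toy "problem" below is hypothetical.

```python
import random

def alns(initial, pairs, cost, iters=2000, reward=2.0, decay=0.9):
    """Schematic Adaptive Large Neighborhood Search: destroy/repair pairs
    compete via weights that are updated when a pair finds an improvement."""
    best = cur = initial
    weights = [1.0] * len(pairs)
    for _ in range(iters):
        # Roulette-wheel selection of a destroy/repair pair.
        i = random.choices(range(len(pairs)), weights=weights)[0]
        destroy, repair = pairs[i]
        cand = repair(destroy(cur))
        improved = cost(cand) < cost(cur)
        # Adaptive layer: reward neighborhoods that find improvements.
        weights[i] = decay * weights[i] + (1 - decay) * (reward if improved else 1.0)
        if improved:
            cur = cand
            if cost(cur) < cost(best):
                best = cur
    return best

# Toy usage: minimize x^2; "destroy" perturbs the solution, "repair" is a no-op.
random.seed(0)
pair = (lambda x: x + random.uniform(-1.0, 1.0), lambda x: x)
best = alns(10.0, [pair], cost=lambda x: x * x)
```

    In a real RCPSP setting, destroy operators would unschedule a subset of activities and repair operators would reinsert them, with several competing pairs rather than one.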

  4. Proceedings of the 7th international scientific conference 'Sakharov readings 2007: Ecological problems of the XXI century'; Materialy 7-oj mezhdunarodnoj nauchnoj konferentsii 'Sakharovskie chteniya 2007 goda: Ehkologicheskie problemy XXI veka'

    Energy Technology Data Exchange (ETDEWEB)

    Kundas, S P; Mel'nov, S B; Poznyak, S S [International A. Sakharov environmental univ., Minsk (Belarus)

    2007-05-15

    Abstracts of the seventh international scientific conference 'Sakharov readings 2007: Ecological problems of the XXI century', held at the International A. Sakharov environmental university, contain materials on the following topics: socio-ecological problems, medical ecology, biomonitoring and bioindication, and biological ecology. The proceedings are intended for specialists in the field of ecology and related sciences, teachers, students and post-graduate students. (authors)

  5. Large radiative corrections to the effective potential and the gauge hierarchy problem

    International Nuclear Information System (INIS)

    Sachrajda, C.T.C.

    1982-01-01

    We study the higher order corrections to the effective potential in a simple toy model and in the SU(5) grand unified theory, with a view to seeing what their effects are on the stability equations, and hence on the gauge hierarchy problem for these theories. These corrections contain powers of log(v²/h²), where v and h are the large and small vacuum expectation values respectively, and hence cannot a priori be neglected. Nevertheless, after summing these large logarithms we find that the stability equations always contain two equations for v (i.e. these equations are independent of h) and hence can only be satisfied by a special (and hence unnatural) choice of parameters. This we claim is the precise statement of the gauge hierarchy problem. (orig.)

  6. A Large Scale Problem Based Learning inter-European Student Satellite Construction Project

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Alminde, Lars; Bisgaard, Morten

    2006-01-01

    A LARGE SCALE PROBLEM BASED LEARNING INTER-EUROPEAN STUDENT SATELLITE CONSTRUCTION PROJECT This paper describes the pedagogical outcome of a large scale PBL experiment. ESA (European Space Agency) Education Office launched in January 2004 an ambitious project: let students from all over Europe build...... that electronic communication technology was vital within the project. Additionally, the SSETI EXPRESS project implied the following problems: it didn't fit a standard semester - 18 months for the satellite project compared to 5/6 months for a "normal" semester project; difficulties in integrating the tasks...... The satellite was successfully launched on October 27th 2005 (http://www.express.space.aau.dk). The project was a student driven project with student project responsibility, adding a lot of international experiences and project management skills to the outcome of more traditional one semester, single group...

  7. Parallel Optimization of Polynomials for Large-scale Problems in Stability and Control

    Science.gov (United States)

    Kamyar, Reza

    In this thesis, we focus on some of the NP-hard problems in control theory. Thanks to the converse Lyapunov theory, these problems can often be modeled as optimization over polynomials. To avoid the problem of intractability, we establish a trade off between accuracy and complexity. In particular, we develop a sequence of tractable optimization problems --- in the form of Linear Programs (LPs) and/or Semi-Definite Programs (SDPs) --- whose solutions converge to the exact solution of the NP-hard problem. However, the computational and memory complexity of these LPs and SDPs grow exponentially with the progress of the sequence - meaning that improving the accuracy of the solutions requires solving SDPs with tens of thousands of decision variables and constraints. Setting up and solving such problems is a significant challenge. The existing optimization algorithms and software are only designed to use desktop computers or small cluster computers --- machines which do not have sufficient memory for solving such large SDPs. Moreover, the speed-up of these algorithms does not scale beyond dozens of processors. This in fact is the reason we seek parallel algorithms for setting-up and solving large SDPs on large cluster- and/or super-computers. We propose parallel algorithms for stability analysis of two classes of systems: 1) Linear systems with a large number of uncertain parameters; 2) Nonlinear systems defined by polynomial vector fields. First, we develop a distributed parallel algorithm which applies Polya's and/or Handelman's theorems to some variants of parameter-dependent Lyapunov inequalities with parameters defined over the standard simplex. The result is a sequence of SDPs which possess a block-diagonal structure. We then develop a parallel SDP solver which exploits this structure in order to map the computation, memory and communication to a distributed parallel environment. Numerical tests on a supercomputer demonstrate the ability of the algorithm to

  8. Solving Large Scale Nonlinear Eigenvalue Problem in Next-Generation Accelerator Design

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Ben-Shan; Bai, Zhaojun; /UC, Davis; Lee, Lie-Quan; Ko, Kwok; /SLAC

    2006-09-28

    A number of numerical methods, including inverse iteration, the method of successive linear problems and the nonlinear Arnoldi algorithm, are studied in this paper to solve a large scale nonlinear eigenvalue problem arising from the finite element analysis of resonant frequencies and external Q_e values of a waveguide-loaded cavity in the next-generation accelerator design. The authors present a nonlinear Rayleigh-Ritz iterative projection algorithm, NRRIT in short, and demonstrate that it is the most promising approach for a model scale cavity design. The NRRIT algorithm is an extension of the nonlinear Arnoldi algorithm due to Voss. Computational challenges of solving such a nonlinear eigenvalue problem for a full scale cavity design are outlined.
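    The method of successive linear problems mentioned above can be illustrated on a toy nonlinear eigenvalue problem T(λ)x = 0. The 2×2 matrices below are hypothetical and do not come from any cavity model, and this sketch is not the NRRIT algorithm: the idea is simply to linearize T in λ around the current estimate and solve a generalized linear eigenproblem for the correction.

```python
import numpy as np
import scipy.linalg as la

# Toy nonlinear eigenvalue problem T(lam) x = 0 with an exponential term.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
T = lambda lam: A - lam * np.eye(2) + 0.1 * np.exp(-lam) * np.eye(2)
Tp = lambda lam: -(1.0 + 0.1 * np.exp(-lam)) * np.eye(2)  # dT/dlam

# Method of successive linear problems: at each step solve the linearized
# generalized eigenproblem T(lam) x = mu * T'(lam) x, then correct lam by -mu.
lam = 1.0  # initial eigenvalue guess
for _ in range(30):
    mu, V = la.eig(T(lam), Tp(lam))
    k = int(np.argmin(np.abs(mu)))  # smallest correction
    lam -= mu[k].real
x = V[:, k].real  # converged eigenvector estimate
```

    Each step is a Newton-like update, so convergence is fast near a simple eigenvalue; methods such as nonlinear Arnoldi avoid solving a full dense eigenproblem at every step for large-scale cases.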

  9. The holographic dual of a Riemann problem in a large number of dimensions

    Energy Technology Data Exchange (ETDEWEB)

    Herzog, Christopher P.; Spillane, Michael [C.N. Yang Institute for Theoretical Physics, Department of Physics and Astronomy,Stony Brook University, Stony Brook, NY 11794 (United States); Yarom, Amos [Department of Physics, Technion,Haifa 32000 (Israel)

    2016-08-22

    We study properties of a non-equilibrium steady state generated when two heat baths are initially in contact with one another. The dynamics of the system we study are governed by holographic duality in a large number of dimensions. We discuss the “phase diagram” associated with the steady state, the dual, dynamical, black hole description of this problem, and its relation to the fluid/gravity correspondence.

  10. SCIENTIFIC AND INNOVATIVE APPROACH TO PROBLEM PERTAINING TO EVALUATION AND MONITORING OF ENVIRONMENT QUALITY IN REPUBLIC OF BELARUS

    Directory of Open Access Journals (Sweden)

    I. V. Voytov

    2009-01-01

    Full Text Available The paper proposes a scientific and innovative approach to the solution of an important problem in the field of rational nature management and ecology, which presupposes the evaluation, analysis and monitoring of environment quality (EQ) in Belarus. This approach is based on methods and facilities of administrative-command and partially automatic-control management. The main components of the innovative approach are: an automatic system for evaluation and monitoring of EQ, including estimation and formation of the nature-resource potential within 11 cadasters and other databases; general principles of evaluation and monitoring of EQ; structural and algorithmic schemes for evaluation of the ecological state of administrative territories; calculation of generalized indices of nature-territorial complexes; and solution of nature protection problems in respect of EQ monitoring. The paper also advances a system of equations for the analysis and evaluation of the technogenic load on the main natural components of the environment (free air, water objects, soil cover), for the realization of monitoring functions in respect of EQ and the ecological state of local and urban territories, natural resources and enterprises, and pollution and the state of some recipients, together with data resources for the execution of analytical calculations and functions directed at monitoring the quality of the natural components of the environment.

  11. Improvement of Monte Carlo code A3MCNP for large-scale shielding problems

    International Nuclear Information System (INIS)

    Miyake, Y.; Ohmura, M.; Hasegawa, T.; Ueki, K.; Sato, O.; Haghighat, A.; Sjoden, G.E.

    2004-01-01

    A3MCNP (Automatic Adjoint Accelerated MCNP) is a revised version of the MCNP Monte Carlo code that automatically prepares variance reduction parameters for the CADIS (Consistent Adjoint Driven Importance Sampling) methodology. Using a deterministic 'importance' (or adjoint) function, CADIS performs source and transport biasing within the weight-window technique. The current version of A3MCNP uses the 3-D Sn transport code TORT to determine a 3-D importance function distribution. Based on simulation of several real-life problems, it is demonstrated that A3MCNP provides precise calculation results with a remarkably short computation time by using proper and objective variance reduction parameters. However, since the first version of A3MCNP provided only a point source configuration option for large-scale shielding problems, such as spent-fuel transport casks, a large amount of memory may be necessary to store enough points to properly represent the source. Hence, we have developed an improved version of A3MCNP (referred to as A3MCNPV) which has a volumetric source configuration option. This paper describes the successful use of A3MCNPV for a concrete cask streaming problem and a PWR dosimetry problem. (author)

  12. The a3 problem solving report: a 10-step scientific method to execute performance improvements in an academic research vivarium.

    Science.gov (United States)

    Bassuk, James A; Washington, Ida M

    2013-01-01

    The purpose of this study was to illustrate the application of A3 Problem Solving Reports of the Toyota Production System to our research vivarium through the methodology of Continuous Performance Improvement, a lean approach to healthcare management at Seattle Children's (Hospital, Research Institute, Foundation). The Report format is described within the perspective of a 10-step scientific method designed to realize measurable improvements of Issues identified by the Report's Author, Sponsor and Coach. The 10-step method (Issue, Background, Current Condition, Goal, Root Cause, Target Condition, Countermeasures, Implementation Plan, Test, and Follow-up) was shown to align with Shewhart's Plan-Do-Check-Act process improvement cycle in a manner that allowed for quantitative analysis of the Countermeasure's outcomes and of Testing results. During fiscal year 2012, 9 A3 Problem Solving Reports were completed in the vivarium under the teaching and coaching system implemented by the Research Institute. Two of the 9 reports are described herein. Report #1 addressed the issue of the vivarium's veterinarian not being able to provide input into sick animal cases during the work day, while report #7 tackled the lack of a standard in keeping track of weekend/holiday animal health inspections. In each Report, a measurable Goal that established the basis for improvement recognition was present. A Five Whys analysis identified the Root Cause for Report #1 as historical work patterns that existed before the veterinarian was hired on and that modern electronic communication tools had not been implemented. The same analysis identified the Root Cause for Report #7 as the vivarium had never standardized the process for weekend/holiday checks. Successful outcomes for both Reports were obtained and validated by robust audit plans. The collective data indicate that vivarium staff acquired a disciplined way of reporting on, as well as solving, problems in a manner consistent with high

  13. The a3 problem solving report: a 10-step scientific method to execute performance improvements in an academic research vivarium.

    Directory of Open Access Journals (Sweden)

    James A Bassuk

    Full Text Available The purpose of this study was to illustrate the application of A3 Problem Solving Reports of the Toyota Production System to our research vivarium through the methodology of Continuous Performance Improvement, a lean approach to healthcare management at Seattle Children's (Hospital, Research Institute, Foundation). The Report format is described within the perspective of a 10-step scientific method designed to realize measurable improvements of Issues identified by the Report's Author, Sponsor and Coach. The 10-step method (Issue, Background, Current Condition, Goal, Root Cause, Target Condition, Countermeasures, Implementation Plan, Test, and Follow-up) was shown to align with Shewhart's Plan-Do-Check-Act process improvement cycle in a manner that allowed for quantitative analysis of the Countermeasure's outcomes and of Testing results. During fiscal year 2012, 9 A3 Problem Solving Reports were completed in the vivarium under the teaching and coaching system implemented by the Research Institute. Two of the 9 reports are described herein. Report #1 addressed the issue of the vivarium's veterinarian not being able to provide input into sick animal cases during the work day, while report #7 tackled the lack of a standard in keeping track of weekend/holiday animal health inspections. In each Report, a measurable Goal that established the basis for improvement recognition was present. A Five Whys analysis identified the Root Cause for Report #1 as historical work patterns that existed before the veterinarian was hired on and that modern electronic communication tools had not been implemented. The same analysis identified the Root Cause for Report #7 as the vivarium had never standardized the process for weekend/holiday checks. Successful outcomes for both Reports were obtained and validated by robust audit plans. The collective data indicate that vivarium staff acquired a disciplined way of reporting on, as well as solving, problems in a manner

  14. PREFACE: XVIII International Scientific Symposium in Honour of Academician M. A. Usov: Problems of Geology and Subsurface Development

    Science.gov (United States)

    2014-08-01

    XVIII International Scientific Symposium in honor of Academician M.A. Usov "Problems of Geology and Subsurface Development" (for students and young scientists) was organized under the guidance of the Ministry of Education and Science of the Russian Federation and the Russian Foundation for Basic Research. Being one of the oldest technical higher education institutions, which trains specialists who contribute to scientific research in the geosciences, the Institute of Natural Resources of National Research Tomsk Polytechnic University (TPU INR) was chosen to hold the symposium. In 2014 the Institute of Natural Resources celebrated its 113th anniversary. It was founded in 1901 by V.A. Obruchev, the first geologist in Siberia, member of the USSR Academy of Sciences, Hero of Socialist Labor, and the first Laureate of the Lenin Prize. He was recognized all over the world as a prominent scientist in the area of geology. INR is the first institute of geological education and geosciences in the Asian part of Russia. The Siberian Mining and Geological School, established by V.A. Obruchev and M.A. Usov, has retained its significance for the discovery, exploration and development of mineral resources not only in Siberia, the Far East and the North-East of the country, but also in Central Asia. There are many outstanding scientists, engineers and industrialists among the alumni of the Institute of Natural Resources. The institute is proud of M.A. Usov, the student and first postgraduate of V.A. Obruchev, the first professor and academician in Siberia, whose name is associated with the development of the mining industry in Siberia; Academician K.I. Satpaev, the founder and first president of the Academy of Sciences of Kazakhstan; Professor N.N. Urvantsev, the discoverer of the unique Norilsk ore deposits in the north of East Siberia; and Professor M.K. Korovin, who considered West Siberia deposits to be promising for oil and gas exploration. 
There are over 35 000 graduates of the institute and

  15. Sparse deconvolution for the large-scale ill-posed inverse problem of impact force reconstruction

    Science.gov (United States)

    Qiao, Baijie; Zhang, Xingwu; Gao, Jiawei; Liu, Ruonan; Chen, Xuefeng

    2017-01-01

    Most previous regularization methods for solving the inverse problem of force reconstruction aim to minimize the l2-norm of the desired force. However, these traditional regularization methods, such as Tikhonov regularization and truncated singular value decomposition, commonly fail to solve the large-scale ill-posed inverse problem at moderate computational cost. In this paper, taking into account the sparse characteristic of impact forces, the idea of sparse deconvolution is first introduced to the field of impact force reconstruction and a general sparse deconvolution model of impact force is constructed. Second, a novel impact force reconstruction method based on the primal-dual interior point method (PDIPM) is proposed to solve such a large-scale sparse deconvolution model, where minimizing the l2-norm is replaced by minimizing the l1-norm. Meanwhile, the preconditioned conjugate gradient algorithm is used to compute the search direction of PDIPM with high computational efficiency. Finally, two experiments, including small-scale and medium-scale single impact force reconstruction and relatively large-scale consecutive impact force reconstruction, are conducted on a composite wind turbine blade and a shell structure to illustrate the advantage of PDIPM. Compared with Tikhonov regularization, PDIPM is more efficient, accurate and robust, whether in single impact force reconstruction or in consecutive impact force reconstruction.
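    The l1-norm model above can be sketched with a simple proximal-gradient (ISTA) iteration. This is an illustrative stand-in only: the paper's solver is a primal-dual interior point method with preconditioned conjugate gradients, which is not reproduced here, and the impulse response below is hypothetical.

```python
import numpy as np

def ista_l1(H, y, lam, iters):
    """Sparse deconvolution: min_f 0.5*||H f - y||^2 + lam*||f||_1,
    solved by proximal gradient (ISTA) with soft-thresholding."""
    L = np.linalg.norm(H, 2) ** 2          # Lipschitz constant of the gradient
    f = np.zeros(H.shape[1])
    for _ in range(iters):
        g = f - H.T @ (H @ f - y) / L      # gradient step on the l2 term
        f = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # prox of l1
    return f

# Toy convolution model: decaying-exponential impulse response, two impacts.
n = 100
h = np.exp(-np.arange(n) / 5.0)
H = np.array([[h[i - j] if i >= j else 0.0 for j in range(n)]
              for i in range(n)])
f_true = np.zeros(n); f_true[10] = 1.0; f_true[60] = 0.5
y = H @ f_true
f_rec = ista_l1(H, y, lam=0.01, iters=2000)
```

    The soft-threshold step is what produces a sparse, spike-like force estimate, in contrast to the smeared solutions typical of l2 (Tikhonov) penalties.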

  16. Proceedings of the 1st International scientific and technological conference 'Modern problems of geophysics, geology, development, processing and use of Kazakhstan hydrocarbon raw materials'. v. 1-2

    International Nuclear Information System (INIS)

    2000-01-01

    Proceedings of the reports presented at the 1st International scientific and technological conference 'Modern problems of geophysics, geology, development, processing and use of Kazakhstan hydrocarbon raw materials', devoted to the 20th anniversary of the Atyrau Institute of Oil and Gas (Atyrau, 18-19 December 2000), are published in 2 volumes. The problems of, and new methods for, prediction of oil and gas as well as other resources in both the coastal lands and the shelf of the Caspian Sea are considered. Scientific problems of drilling and repair of oil and gas wells are highlighted. Results of fundamental and applied studies on problems of oil and oil-product processing and its transportation through pipelines, taking into account the rheological and physico-chemical properties of oils mined in the western fields of the Republic, are presented. Issues of ensuring ecological safety, the operational reliability of machines and mechanisms, and other problems are widely discussed

  17. Titanium condenser tubes. Problems and their solution for wider application to large surface condensers. [PWR

    Energy Technology Data Exchange (ETDEWEB)

    Sato, S; Sugiyama, S; Nagata, K; Nanba, K; Shimono, M [Sumitomo Light Metal Industries Ltd., Tokyo (Japan)

    1977-06-01

    The corrosion resistance of titanium in sea water is extremely good, but titanium tubes are expensive, and since copper alloy tubes resistant to polluted sea water had been developed, titanium tubes were not widely used in practice. In 1970, ammonia attack was found on the copper alloy tubes in the air-cooled portion of condensers, and titanium tubes have been used as the countermeasure. As a result of this use, galvanic attack on copper alloy tube plates with titanium tubes as cathode, and hydrogen absorption at titanium tube ends owing to excess electrolytic protection, were observed, but the corrosion resistance of the titanium tubes themselves was perfect. These problems can be controlled by the application of proper electrolytic protection. The all-titanium-tube condensers recently adopted in the USA are intended to realize completely leak-free condensers as a countermeasure to corrosion in the steam generators of PWR plants. For today's large condensers, three problems are pointed out: the vibration of condenser tubes, the method of joining tubes to tube plates, and ensuring tubes with no coolant leaks. These three problems in the case of titanium tubes were studied, and the problem of tube fouling was also examined. The intervals between supporting plates for titanium tubes should be narrowed. The joining of titanium tubes to titanium tube plates by welding is feasible and promising. Cleaning with sponge balls is effective in controlling fouling.

  18. Scientific ballooning. Proceedings of the symposium on the scientific use of balloons and related technical problems, Innsbruck, Austria, May 29-June 10, 1978

    Energy Technology Data Exchange (ETDEWEB)

    Riedler, W

    1979-01-01

    The book includes works on operational and technical aspects of balloon launching I and II, cooperative balloon campaigns, and new developments in scientific use of balloons. The specific topics discussed are coordinated balloon and rocket measurements of stratospheric wind shears and turbulence, ballooning in Japan and India, magnetospheric processes investigated with data taken from balloon flights, and remote sensing of middle atmosphere winds from balloon platforms.

  19. Large neighborhood search for the double traveling salesman problem with multiple stacks

    Energy Technology Data Exchange (ETDEWEB)

    Bent, Russell W [Los Alamos National Laboratory; Van Hentenryck, Pascal [BROWN UNIV

    2009-01-01

    This paper considers a complex real-life short-haul/long-haul pickup and delivery application. The problem can be modeled as a double traveling salesman problem (TSP) in which the pickups and the deliveries happen in the first and second TSPs, respectively. Moreover, the application features multiple stacks in which the items must be stored, and the pickups and deliveries must take place in reverse (LIFO) order for each stack. The goal is to minimize the total travel time while satisfying these constraints. This paper presents a large neighborhood search (LNS) algorithm which improves the best-known results on 65% of the available instances and is always within 2% of the best-known solutions.
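    The LIFO constraint described above can be checked mechanically: items picked up into a stack must be delivered in reverse pickup order. A minimal sketch, where the `stack_of` mapping and item names are hypothetical:

```python
def lifo_feasible(pickup_order, delivery_order, stack_of):
    """Check the LIFO constraint of the double TSP with multiple stacks:
    within each stack, items must be delivered in reverse pickup order."""
    stacks = {}
    for item in pickup_order:                # first TSP: push on pickup
        stacks.setdefault(stack_of[item], []).append(item)
    for item in delivery_order:              # second TSP: must pop the top
        s = stacks[stack_of[item]]
        if not s or s[-1] != item:
            return False
        s.pop()
    return True

# Two stacks: items "a" and "c" share stack 0; "b" sits alone in stack 1.
stack_of = {"a": 0, "b": 1, "c": 0}
ok = lifo_feasible(["a", "b", "c"], ["c", "b", "a"], stack_of)   # True
bad = lifo_feasible(["a", "b", "c"], ["a", "b", "c"], stack_of)  # False
```

    An LNS neighborhood for this problem must re-check such feasibility after every destroy/repair move, which is why the constraint is typically handled inside the repair operator.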

  20. Future Low Temperature Plasma Science and Technology: Attacking Major Societal Problems by Building on a Tradition of Scientific Rigor

    Science.gov (United States)

    Graves, David

    2014-10-01

    Low temperature plasma (LTP) science is unequivocally one of the most prolific areas for varied applications in modern technology. For example, plasma etching technology is essential for reliably and rapidly patterning nanometer scale features over areas approaching one square meter with relatively inexpensive equipment. This technology enabled the telecommunication and information processing revolution that has transformed human society. I explore two concepts in this talk. The first is that the firm scientific understanding of LTP is and has been the enabling feature of these established technological applications. And the second is that LTP technology is poised to contribute to several emerging societal challenges. Beyond the important, ongoing applications of LTP science to problems of materials processing related to energy generation (e.g. thin film solar cell manufacture), there are novel and less well known potential applications in food and agriculture, infection control and medicine. In some cases, the potentially low cost nature of the applications is so compelling that they can be thought of as examples of frugal innovation. Supported in part by NSF and DoE.

  1. [Scientific bases of the organization of psychiatric care: the solution of practical problems in the framework of priority research].

    Science.gov (United States)

    Yastrebov, V S; Mitikhin, V G; Solokhina, T A; Mitikhina, I A

    OBJECTIVE: a system analysis and modeling of important areas of research on the organization of psychiatric services in Russia: the study of the mental health of the population, identification of factors affecting the formation of the contingent of persons with mental disorders, and the organizational and functional structure of mental health services and mental health care. The authors analyzed scientific publications on the problems of psychiatric care organization, as well as the results of their own research over the last 25 years, using system analysis. An approach was suggested that allows the creation of a range of population models to monitor the status of mental health based on medical, demographic and social factors (more than 60 factors) of life. The basic models and approaches for the evaluation of the activity of divisions of mental health services at the macro- and micro-social levels, taking into account expert information and individual characteristics of patients and relatives, were demonstrated. To improve treatment quality, models were developed to identify the factors that positively or negatively influence adherence to psychopharmacotherapy among patients with schizophrenia and their families.

  2. An analysis of 12th-grade students' reasoning styles and competencies when presented with an environmental problem in a social and scientific context

    Science.gov (United States)

    Yang, Fang-Ying

    This study examined reasoning and problem solving by 182 12th-grade students in Taiwan when considering a socio-scientific issue regarding the use of nuclear energy. Students' information preferences, background characteristics, and eleven everyday scientific thinking skills were scrutinized. Most participants displayed a willingness to take into account both scientific and social information when reasoning about the merits of a proposed construction of a nuclear power plant. Students' reasoning scores obtained from the "information reasoning style" test ranged from -0.5 to 1.917, and the distribution was approximately normal with mean and median at around 0.5. For the purpose of categorization, students whose scores were within one standard deviation of the mean were characterized as having an "equally disposed" reasoning style; 125 subjects, about 69%, belonged to this category. Students with scores at the two tails of the distribution were assigned to either the "scientifically oriented" or the "socially oriented" reasoning category. Among 23 background characteristics investigated using questionnaire data and ANOVA, only students' science performance and knowledge about nuclear energy were statistically significantly related to their information reasoning styles; other characteristics, including religion and political party preference, were not. For everyday scientific thinking skills, interview data showed that both "scientifically oriented" students and those categorized as "equally disposed to using scientific and social scientific sources of data" displayed higher frequencies than "socially oriented" ones in using these skills, except in the use of the "multidisciplinary thinking" skill.
    Among the 11 skills assessed, the "scientifically oriented" students outperformed the "equally disposed" ones only in the use of 3 thinking skills; namely, searching for or recalling scientific concepts/evidence, recognizing and evaluating

  3. Pseudoinverse preconditioners and iterative methods for large dense linear least-squares problems

    Directory of Open Access Journals (Sweden)

    Oskar Cahueñas

    2013-05-01

    Full Text Available We address the issue of approximating the pseudoinverse of the coefficient matrix for dynamically building preconditioning strategies for the numerical solution of large dense linear least-squares problems. The new preconditioning strategies are embedded into simple and well-known iterative schemes that avoid the use of the usually ill-conditioned normal equations. We analyze a scheme to approximate the pseudoinverse, based on the Schulz iterative method, as well as different iterative schemes based on extensions of Richardson's method and on the conjugate gradient method, which are suitable for preconditioning strategies. We present preliminary numerical results to illustrate the advantages of the proposed schemes.
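    The Schulz iteration named in this abstract is a standard fixed-point scheme for the pseudoinverse, X_{k+1} = X_k (2I - A X_k). The following is a minimal NumPy sketch of that textbook iteration, not the authors' implementation; the scaled starting guess X_0 = A^T / sigma_max^2 is one standard choice assumed here for convergence.

```python
import numpy as np

def schulz_pinv(A, iters=100):
    """Approximate the Moore-Penrose pseudoinverse of A via the
    Schulz (Newton-Schulz) iteration X_{k+1} = X_k (2I - A X_k).
    Convergence requires X_0 = alpha * A.T with 0 < alpha < 2/sigma_max^2."""
    sigma_max = np.linalg.norm(A, 2)   # largest singular value
    X = A.T / sigma_max**2             # safe starting guess
    I = np.eye(A.shape[0])
    for _ in range(iters):
        X = X @ (2 * I - A @ X)        # quadratically convergent update
    return X
```

In a preconditioning context one would truncate the iteration early, trading accuracy of the approximate pseudoinverse against setup cost.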

  4. Large-Scale Parallel Finite Element Analysis of the Stress Singular Problems

    International Nuclear Information System (INIS)

    Noriyuki Kushida; Hiroshi Okuda; Genki Yagawa

    2002-01-01

    In this paper, the convergence behavior of the large-scale parallel finite element method for stress singular problems was investigated. The convergence behavior of iterative solvers depends on the efficiency of the preconditioners; however, the efficiency of preconditioners may be influenced by the domain decomposition that is necessary for parallel FEM. In this study, the following results were obtained: the conjugate gradient method without preconditioning and the diagonal-scaling preconditioned conjugate gradient method were not influenced by the domain decomposition, as expected; the symmetric successive over-relaxation preconditioned conjugate gradient method converged up to 6% faster when the stress singular area was contained in one sub-domain. (authors)
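    The diagonal-scaling (Jacobi) preconditioned conjugate gradient method compared in this abstract can be sketched as follows. This is a minimal dense NumPy version of the generic algorithm, not the paper's parallel FEM solver; the preconditioner solve is simply an elementwise division by diag(A).

```python
import numpy as np

def jacobi_pcg(A, b, tol=1e-10, max_iter=1000):
    """Conjugate gradient for SPD A with Jacobi preconditioner M = diag(A)."""
    d = np.diag(A)                 # preconditioner: z = M^{-1} r = r / d
    x = np.zeros_like(b)
    r = b - A @ x
    z = r / d
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)      # step length along search direction
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = r / d                  # apply preconditioner to new residual
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # update conjugate search direction
        rz = rz_new
    return x
```

In a domain-decomposed parallel code the matrix-vector product and inner products become the communication points, while the Jacobi preconditioner stays embarrassingly parallel, which is consistent with the abstract's observation that it is insensitive to the decomposition.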

  5. Escript: Open Source Environment For Solving Large-Scale Geophysical Joint Inversion Problems in Python

    Science.gov (United States)

    Gross, Lutz; Altinay, Cihan; Fenwick, Joel; Smith, Troy

    2014-05-01

    inversion and appropriate solution schemes in escript. We will also give a brief introduction into escript's open framework for defining and solving geophysical inversion problems. Finally we will show some benchmark results to demonstrate the computational scalability of the inversion method across a large number of cores and compute nodes in a parallel computing environment. References: - L. Gross et al. (2013): Escript Solving Partial Differential Equations in Python Version 3.4, The University of Queensland, https://launchpad.net/escript-finley - L. Gross and C. Kemp (2013) Large Scale Joint Inversion of Geophysical Data using the Finite Element Method in escript. ASEG Extended Abstracts 2013, http://dx.doi.org/10.1071/ASEG2013ab306 - T. Poulet, L. Gross, D. Georgiev, J. Cleverley (2012): escript-RT: Reactive transport simulation in Python using escript, Computers & Geosciences, Volume 45, 168-176. http://dx.doi.org/10.1016/j.cageo.2011.11.005.

  6. Scientific and technical conference. Problems and horizons of development of chemical and radiochemical control in nuclear energetics. Collection of summaries of reports

    International Nuclear Information System (INIS)

    2001-01-01

    During the scientific and technical conference on problems of the development of chemical and radiochemical control in nuclear power engineering, the following themes were considered: problems of the methodological and instrumental assurance of chemical and radiochemical control at operating nuclear power plants and nuclear power units; modern concepts of constructing automation systems for chemical and radiochemical control on the basis of intelligent measuring channels; ways of solving system-wide problems of the organization and management of chemical and radiochemical control using computer technologies; and problems of the certification of chemical and radiochemical measurement methods in nuclear power engineering [ru

  7. EvArnoldi: A New Algorithm for Large-Scale Eigenvalue Problems.

    Science.gov (United States)

    Tal-Ezer, Hillel

    2016-05-19

    Eigenvalues and eigenvectors are an essential theme in numerical linear algebra. Their study is mainly motivated by their high importance in a wide range of applications. Knowledge of eigenvalues is essential in quantum molecular science. Solutions of the Schrödinger equation for the electrons composing the molecule are the basis of electronic structure theory. Electronic eigenvalues compose the potential energy surfaces for nuclear motion. The eigenvectors allow calculation of dipole transition matrix elements, the core of spectroscopy. The vibrational dynamics of the molecule also requires knowledge of the eigenvalues of the vibrational Hamiltonian. Typically in these problems, the dimension of the Hilbert space is huge. Practically, only a small subset of eigenvalues is required. In this paper, we present a highly efficient algorithm, named EvArnoldi, for solving the large-scale eigenvalue problem. The algorithm, in its basic formulation, is mathematically equivalent to ARPACK ( Sorensen , D. C. Implicitly Restarted Arnoldi/Lanczos Methods for Large Scale Eigenvalue Calculations ; Springer , 1997 ; Lehoucq , R. B. ; Sorensen , D. C. SIAM Journal on Matrix Analysis and Applications 1996 , 17 , 789 ; Calvetti , D. ; Reichel , L. ; Sorensen , D. C. Electronic Transactions on Numerical Analysis 1994 , 2 , 21 ) (or Eigs of Matlab) but significantly simpler.
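    The ARPACK baseline this abstract compares against (implicitly restarted Arnoldi/Lanczos) is exposed in SciPy as `scipy.sparse.linalg.eigsh`. The sketch below uses a 1-D discrete Laplacian as a stand-in sparse "Hamiltonian" (an assumption for illustration, not the paper's test system) and extracts only the few smallest eigenvalues of a large matrix, the typical use case described above.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Tridiagonal 1-D Laplacian as a stand-in large sparse symmetric operator.
n = 400
H = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")

# A few eigenvalues nearest 0 via shift-invert Lanczos (ARPACK under the hood);
# only a small subset of the spectrum is computed, never the full matrix.
vals = eigsh(H, k=4, sigma=0, return_eigenvectors=False)

# Known closed form for this operator: 2 - 2*cos(k*pi/(n+1)), k = 1..n.
exact = 2 - 2 * np.cos(np.arange(1, 5) * np.pi / (n + 1))
```

The shift-invert mode (`sigma=0`) targets interior/smallest eigenvalues, which is usually far cheaper than asking ARPACK for smallest-magnitude eigenvalues directly.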

  8. Radiation problems in the design of the large electron-positron collider (LEP)

    International Nuclear Information System (INIS)

    Fasso, A.; Goebel, K.; Hoefert, M.; Rau, G.; Schoenbacher, H.; Stevenson, G.R.; Sullivan, A.H.; Swanson, W.P.; Tuyn, J.W.N.

    1984-01-01

    This is a comprehensive review of the radiation problems taken into account in the design studies for the Large Electron-Positron collider (LEP) now under construction at CERN. It provides estimates and calculations of the magnitude of the most important hazards, including those from non-ionizing radiations and magnetic fields as well as from ionizing radiation, and describes the measures to be taken in the design, construction, and operation to limit them. Damage to components is considered as well as the risk to people. More general explanations are given of the physical processes and technical parameters that influence the production and effects of radiation, and a comprehensive bibliography provides access to the basic theories and other discussions of the subject. The report effectively summarizes the findings of the Working Group on LEP radiation problems and parallels the results of analogous studies made for the previous large accelerator. The concluding chapters describe the LEP radiation protection system, which is foreseen to reduce doses far below the legal limits for all those working with the machine or living nearby, and summarize the environmental impact. Costs are also briefly considered. (orig.)

  9. Decomposition and parallelization strategies for solving large-scale MDO problems

    Energy Technology Data Exchange (ETDEWEB)

    Grauer, M.; Eschenauer, H.A. [Research Center for Multidisciplinary Analyses and Applied Structural Optimization, FOMAAS, Univ. of Siegen (Germany)

    2007-07-01

    In recent years, structural optimization has been recognized as a useful tool within the disciplines of engineering and economics. However, the optimization of large-scale systems or structures is impeded by an immense solution effort. This was the reason to start a joint research and development (R and D) project between the Institute of Mechanics and Control Engineering and the Information and Decision Sciences Institute within the Research Center for Multidisciplinary Analyses and Applied Structural Optimization (FOMAAS) on cluster computing for the parallel and distributed solution of multidisciplinary optimization (MDO) problems based on the OpTiX-Workbench. Here the focus of attention is put on coarse-grained parallelization and its implementation on clusters of workstations. A further point of emphasis was the development of a parallel decomposition strategy, called PARDEC, for the solution of very complex optimization problems which cannot be solved efficiently by sequential integrated optimization. The use of the OpTiX-Workbench together with the FEM ground water simulation system FEFLOW is shown for a special water management problem. (orig.)

  10. PREFACE: XIX International Scientific Symposium in honor of Academician M.A. Usov ''Problems of Geology and Subsurface Development''

    Science.gov (United States)

    Ivanova, G. M.

    2015-11-01

    XIX International Scientific Symposium in honor of Academician M.A. Usov ''Problems of Geology and Subsurface Development'' (for students and young scientists) was organized under the guidance of the Ministry of Education and Science of the Russian Federation and the Russian Foundation for Fundamental Research within the National Research Tomsk Polytechnic University (NR TPU). TPU is one of the oldest technical higher education institutions in Russia, training specialists in the domain of geoscience and supporting their further research in this area. The Institute of Natural Resources, National Research Tomsk Polytechnic University (INR TPU) was chosen to hold the International Scientific Symposium. In 2015 the Institute of Natural Resources celebrated its 114th anniversary. It was founded in 1901 by V.A. Obruchev, the first Siberian geologist, a member of the USSR Academy of Sciences, Hero of Socialist Labor, and first Laureate of the Lenin Prize, recognized worldwide as a prominent scientist in the area of geology. INR is the first institute of geological education and geosciences in Asian Russia. Even today the Siberian Mining and Geological School, established by V.A. Obruchev and M.A. Usov, has retained its significance in the discovery, exploration and development of mineral resources not only in Siberia, the Far East and North-East of Russia, but also in Central Asia. There are numerous outstanding scientists and engineers among the alumni of the Institute of Natural Resources. The institute is proud of such outstanding people as: M.A. Usov, student and first postgraduate of V.A. Obruchev, first professor and academician in Siberia, whose name is associated with the mining industry in Siberia; Academician K.I. Satpaev, founder and first president of the Academy of Sciences of Kazakhstan; Professor N.N. Urvantsev, discoverer of the unique Norilsk ore deposits in the North of East Siberia; and Professor M.K. Korovin, who, in the 30s of the 20th century

  11. Interacting star clusters in the Large Magellanic Cloud. Overmerging problem solved by cluster group formation

    Science.gov (United States)

    Leon, Stéphane; Bergond, Gilles; Vallenari, Antonella

    1999-04-01

    We present the tidal tail distributions of a sample of candidate binary clusters located in the bar of the Large Magellanic Cloud (LMC). One isolated cluster, SL 268, is presented in order to study the effect of the LMC tidal field. All the candidate binary clusters show tidal tails, confirming that the pairs are formed by physically linked objects. The stellar mass in the tails covers a large range, from 1.8×10^3 to 3×10^4 M_sun. We derive a total mass estimate for SL 268 and SL 356. At large radii, the projected density profiles of SL 268 and SL 356 fall off as r^(-γ), with γ = 2.27 and γ = 3.44, respectively. Out of 4 pairs or multiple systems, 2 are older than the theoretical survival time of binary clusters (from a few 10^6 years to 10^8 years). One pair shows too large an age difference between its components to be consistent with classical theoretical models of binary cluster formation (Fujimoto & Kumai 1997). We refer to this as the ``overmerging'' problem. A different scenario is proposed: the formation proceeds in large molecular complexes giving birth to groups of clusters over a few 10^7 years. In these groups the expected cluster encounter rate is larger, and tidal capture has a higher probability. Cluster pairs are not born together through the splitting of the parent cloud, but are formed later by tidal capture. For 3 pairs, we tentatively identify the star cluster group (SCG) memberships. SCG formation, through the recent cluster starburst triggered by the LMC-SMC encounter, in contrast with the quiescent open cluster formation in the Milky Way, may explain the paucity of binary clusters observed in our Galaxy. Based on observations collected at the European Southern Observatory, La Silla, Chile.

  12. Problems of Chernobyl. Materials of International scientific and practical conference 'Shelter-98'; Problemi Chornobilya. Materyiali Myizhnarodnoyi Naukovo-Praktichnoyi Konferentsyiyi 'Ukrittya-98'

    Energy Technology Data Exchange (ETDEWEB)

    Klyuchnikov, O O [eds.

    1999-07-01

    These transactions contain materials of the International Scientific and Practical Conference 'Shelter-98', which was held 27-30 November 1998 in Slavutich. They describe the results of the research work of specialists from Ukraine and from neighboring and more distant foreign countries, targeted at solving the problems of converting the Shelter Object into an ecologically safe state.

  13. Proceedings of international scientific conference 'Sakharov readings 2004: Environmental problems of the XXI century'; Materialy mezhdunarodnoj nauchnoj konferentsii 'Sakharovskie chteniya 2004 goda: Ehkologicheskie problemy XXI veka'

    Energy Technology Data Exchange (ETDEWEB)

    Kundas, S P; Chudakov, V A [International A. Sakharov Environmental Univ., Minsk (Belarus)

    2004-05-01

    The present publication represents the collection of materials of a scientific conference organized by the Ministry for Education of the Republic of Belarus on the basis of the International A. Sakharov Environmental University (Minsk, Republic of Belarus). The ecological problems were considered in the following areas: medical and biological ecology, radioecology and ecological monitoring, eco-priority power engineering, and social ecology.

  14. The Effect of Problem-Based Learning on Undergraduate Students' Learning about Solutions and Their Physical Properties and Scientific Processing Skills

    Science.gov (United States)

    Tosun, Cemal; Taskesenligil, Yavuz

    2013-01-01

    The aim of this study was to investigate the effect of Problem-Based Learning (PBL) on undergraduate students' learning about solutions and their physical properties, and on their scientific processing skills. The quasi experimental study was carried out through non-equivalent control and comparison groups pre-post test design. The data were…

  15. SCIENTIFIC AND EDUCATIONAL GEOPORTAL AS INSTRUMENT OF INTEGRATION OF RESULTS OF SCIENTIFIC RESEARCHES OF THE REPUBLIC OF BASHKORTOSTAN BY THE LARGE NUMBER OF USERS

    Directory of Open Access Journals (Sweden)

    Olga I. Hristodulo

    2015-01-01

    Full Text Available The article covers the urgency of establishing a scientific and educational geoportal as a single data center for the Republic of Bashkortostan, providing quick access to a distributed network of geospatial data and geoservices to all responsible and interested parties. We considered the main tasks, functions and architecture of a scientific and educational geoportal for different types of users. We also carried out a comparative analysis of the basic technology for the development of mapping services and information systems, representing the major structural elements of geoportals. As an example, we considered information retrieval problems of the scientific and educational geoportal for the Republic of Bashkortostan.

  16. No firewalls or information problem for black holes entangled with large systems

    Science.gov (United States)

    Stoltenberg, Henry; Albrecht, Andreas

    2015-01-01

    We discuss how under certain conditions the black hole information puzzle and the (related) arguments that firewalls are a typical feature of black holes can break down. We first review the arguments of Almheiri, Marolf, Polchinski and Sully favoring firewalls, focusing on entanglements in a simple toy model for a black hole and the Hawking radiation. By introducing a large and inaccessible system entangled with the black hole (representing perhaps a de Sitter stretched horizon or inaccessible part of a landscape), we show complementarity can be restored and firewalls can be avoided throughout the black hole's evolution. Under these conditions black holes do not have an "information problem." We point out flaws in some of our earlier arguments that such entanglement might be generically present in some cosmological scenarios and call out certain ways our picture may still be realized.

  17. How multiagency partnerships can successfully address large-scale pollution problems: a Hawaii case study.

    Science.gov (United States)

    Donohue, Mary J

    2003-06-01

    Oceanic circulation patterns deposit significant amounts of marine pollution, including derelict fishing gear from North Pacific Ocean fisheries, in the Hawaiian Archipelago [Mar. Pollut. Bull. 42(12) (2001) 1301]. Management responsibility for these islands and their associated natural resources is shared by several government authorities. Non-governmental organizations (NGOs) and private industry also have interests in the archipelago. Since the marine debris problem in this region is too large for any single agency to manage, a multiagency marine debris working group (group) was established in 1998 to improve marine debris mitigation in Hawaii. To date, 16 federal, state, and local agencies, working with industry and NGOs, have removed 195 tons of derelict fishing gear from the Northwestern Hawaiian Islands. This review details the evolution of the partnership, notes its challenges and rewards, and advocates its continued use as an effective resource management tool.

  18. Application of the spectral Lanczos decomposition method to large scale problems arising in geophysics

    Energy Technology Data Exchange (ETDEWEB)

    Tamarchenko, T. [Western Atlas Logging Services, Houston, TX (United States)

    1996-12-31

    This paper presents an application of the Spectral Lanczos Decomposition Method (SLDM) to the numerical modeling of electromagnetic diffusion and elastic wave propagation in inhomogeneous media. SLDM approximates the action of a matrix function as a linear combination of basis vectors in a Krylov subspace. I applied the method to model electromagnetic fields in three dimensions and elastic waves in two dimensions. The finite-difference approximation of the spatial part of the differential operator reduces the initial boundary-value problem to a system of ordinary differential equations with respect to time. The solution of this system requires calculating exponential and sine/cosine functions of the stiffness matrices. Large scale numerical examples are in good agreement with the theoretical error bounds and stability estimates given by Druskin and Knizhnerman, 1987.
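    The core SLDM task, applying a matrix function such as the exponential to a vector without ever forming the function of the matrix densely, has a readily available Krylov-based analogue in SciPy's `expm_multiply`. The sketch below is a stand-in illustration on a 1-D diffusion problem (my choice of operator, not the paper's geophysical models).

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import expm_multiply

# Semi-discrete diffusion du/dt = -A u, with A a sparse 1-D Laplacian.
n = 200
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")

u0 = np.zeros(n)
u0[n // 2] = 1.0  # point source initial condition

# u(t) = exp(-t A) u0, evaluated as an action on the vector:
# the matrix exponential itself is never assembled.
u = expm_multiply(-0.5 * A, u0)
```

For time-domain electromagnetic diffusion the operator and time scales differ, but the pattern is the same: one action of a matrix function per source/time of interest, with cost governed by sparse matrix-vector products.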

  19. A Very Large Area Network (VLAN) knowledge-base applied to space communication problems

    Science.gov (United States)

    Zander, Carol S.

    1988-01-01

    This paper first describes a hierarchical model for very large area networks (VLAN). Space communication problems whose solution could profit by the model are discussed and then an enhanced version of this model incorporating the knowledge needed for the missile detection-destruction problem is presented. A satellite network or VLAN is a network which includes at least one satellite. Due to the complexity, a compromise between fully centralized and fully distributed network management has been adopted. Network nodes are assigned to a physically localized group, called a partition. Partitions consist of groups of cell nodes with one cell node acting as the organizer or master, called the Group Master (GM). Coordinating the group masters is a Partition Master (PM). Knowledge is also distributed hierarchically existing in at least two nodes. Each satellite node has a back-up earth node. Knowledge must be distributed in such a way so as to minimize information loss when a node fails. Thus the model is hierarchical both physically and informationally.

  20. Proceedings of 8. international scientific conference 'Sakharov readings 2008: Ecological problems of XXI century'; Materialy 8-oj mezhdunarodnoj nauchnoj konferentsii 'Sakharovskie chteniya 2008 goda: Ehkologicheskie problemy XXI veka'

    Energy Technology Data Exchange (ETDEWEB)

    Kundas, S P; Mel' nov, S B; Poznyak, S S [International A. Sakharov environmental univ., Minsk (Belarus)

    2008-05-15

    The proceedings of the eighth international scientific conference 'Sakharov readings 2008: Ecological problems of XXI century', which was held at the International A. Sakharov environmental university, contain materials on the following topics: socio-ecological problems in the light of the ideas of Academician A. Sakharov; medical ecology; bioecology; biomonitoring, bioindication and bioremediation; radioecology and radiation protection; information systems and technologies in ecology; ecological management; ecological monitoring; ecological education and education for sustainable development; ecological ethics in the bioethics education system; and problems and prospects of the development of renewable energy in Belarus. The proceedings are intended for specialists in the field of ecology and related sciences, teachers, students and post-graduate students. (authors)

  1. Scientometric analysis of the means of scientific communication of the problem of medical consequences of Chernobyl Nuclear accident

    International Nuclear Information System (INIS)

    Artamonova, N.O.; Kulyinyich, G.V.; Pavlyichenko, Yu.V.; Gorvan', A.Je.; Zakrut'ko, L.Yi.; Novgorods'ka, L.M.; Byilan, L.G.

    2014-01-01

    This paper evaluates, using bibliometric methods, the structure and development trends of Ukrainian scientific communication tools on the medical consequences of the Chernobyl nuclear accident. The main developers of methodical documents are identified, and the dynamics of the distribution of methodical references, information letters and innovations are estimated. The importance of scientific communication tools in the dissemination and use of new medical knowledge is demonstrated

  2. Aspects of the role of scientific-technical expert knowledge in administrative court procedures on licensing of large technical projects

    International Nuclear Information System (INIS)

    Wagner, H.

    1983-01-01

    On the basis of atomic energy law, the author explains some specific problems associated with the respective roles of experts (or expert bodies) and courts of law. In legal theory, it is comparatively easy to draw the line between the two functions, but in practice this delimitation meets with difficulties. Finally, the author proposes to improve the definitions of the respective functions of experts (expert bodies) and courts of law in procedures dealing with permits for large technical facilities as follows: a highly qualified, independent body of experts in a technically representative composition lays down, in a binding way, the main elements of the safety standard of a specific plant or type of plant; the responsible administrative authority, after having examined all other legal conditions, grants the permit for that plant. There are no objections to such a model in the light of constitutional law, legal policy or constitutional policy, nor are there any practical reasons against this approach. The only doubtful aspect is its present political feasibility. (orig.) [de

  3. A Study on "Distinction of the Problem" as the Scientific Thinking : Development of the Teaching Method in Elementary School Science

    OpenAIRE

    川﨑, 弘作; 松浦, 拓也; 中山, 貴司

    2013-01-01

    The purpose of this study is to devise a teaching method that develops the ability to distinguish whether a problem can be investigated, which we call "distinction of the problem". The teaching method has two features: (1) letting students distinguish, in a problem-setting scene, whether they can investigate the problem; (2) letting them use a worksheet about the way of thinking needed to distinguish the problem. This teaching method was administered to 64 sixth graders to investigate its availability ...

  4. SCIENTIFIC PROBLEM: FEATURES AND TRENDS IN PEDAGOGIC RESEARCH / RASGOS Y TENDENCIAS DEL PROBLEMA CIENTÍFICO EN LA INVESTIGACIÓN PEDAGÓGICA

    Directory of Open Access Journals (Sweden)

    Yaritza Tardo Fernández

    2012-12-01

    Full Text Available An analysis of the main historical trends in the elaboration of the scientific problem in pedagogic research is carried out, based on identifying and systematizing its distinguishing features in examples of doctoral theses defended in the field of the Pedagogic Sciences. This reveals their behavior and development as an expression of the transformations in the logic of scientific thought at a given moment, as a didactic contribution to the process of researchers' formation.

  5. Oceans of Data: In what ways can learning research inform the development of electronic interfaces and tools for use by students accessing large scientific databases?

    Science.gov (United States)

    Krumhansl, R. A.; Foster, J.; Peach, C. L.; Busey, A.; Baker, I.

    2012-12-01

    The practice of science and engineering is being revolutionized by the development of cyberinfrastructure for accessing near real-time and archived observatory data. Large cyberinfrastructure projects have the potential to transform the way science is taught in high school classrooms, making enormous quantities of scientific data available, giving students opportunities to analyze and draw conclusions from many kinds of complex data, and providing students with experiences using state-of-the-art resources and techniques for scientific investigations. However, online interfaces to scientific data are built by scientists for scientists, and their design can significantly impede broad use by novices. Knowledge relevant to the design of student interfaces to complex scientific databases is broadly dispersed among disciplines ranging from cognitive science to computer science and cartography and is not easily accessible to designers of educational interfaces. To inform efforts at bridging scientific cyberinfrastructure to the high school classroom, Education Development Center, Inc. and the Scripps Institution of Oceanography conducted an NSF-funded 2-year interdisciplinary review of literature and expert opinion pertinent to making interfaces to large scientific databases accessible to and usable by precollege learners and their teachers. Project findings are grounded in the fundamentals of Cognitive Load Theory, Visual Perception, Schemata formation and Universal Design for Learning. The Knowledge Status Report (KSR) presents cross-cutting and visualization-specific guidelines that highlight how interface design features can address/ ameliorate challenges novice high school students face as they navigate complex databases to find data, and construct and look for patterns in maps, graphs, animations and other data visualizations. The guidelines present ways to make scientific databases more broadly accessible by: 1) adjusting the cognitive load imposed by the user

  6. [Scientific journalism and epidemiological risk].

    Science.gov (United States)

    Luiz, Olinda do Carmo

    2007-01-01

    The importance of the communications media in the construction of symbols has been widely acknowledged. Many of the articles on health published in daily newspapers mention medical studies, sourced from scientific publications focusing on new risks. The disclosure of risk studies in the mass media is also a topic of editorials and articles in scientific journals, which focus on the problem of distortions and the appearance of contradictory news items. The purpose of this paper is to explore the meaning and content of the disclosure of scientific risk studies in large-circulation daily newspapers, analyzing news items published in Brazil during 2000 and the scientific publications used as their sources. "Risk" is presented in the scientific research projects as a "black box" in Latour's sense, with the news items downplaying scientific disputes and underscoring associations between behavioral habits and the occurrence of diseases, emphasizing individual aspects of the epidemiological approach to the detriment of the group.

  7. Solving Man-Induced Large-Scale Conservation Problems: The Spanish Imperial Eagle and Power Lines

    Science.gov (United States)

    López-López, Pascual; Ferrer, Miguel; Madero, Agustín; Casado, Eva; McGrady, Michael

    2011-01-01

    Background Man-induced mortality of birds caused by electrocution with poorly-designed pylons and power lines has been reported to be an important mortality factor that could become a major cause of population decline of one of the world's rarest raptors, the Spanish imperial eagle (Aquila adalberti). Consequently it has resulted in an increasing awareness of this problem amongst land managers and the public at large, as well as increased research into the distribution of electrocution events and likely mitigation measures. Methodology/Principal Findings We provide information on how mitigation measures implemented at a regional level under the conservation program for the Spanish imperial eagle have resulted in a positive shift in demographic trends in Spain. A 35-year data set (1974–2009) on mortality of the Spanish imperial eagle was recorded, including population censuses and data on electrocution and non-electrocution of birds. Additional information was obtained from 32 radio-tracked young eagles and specific field surveys. Data were divided into two periods, before and after the approval in 1990 of a regional regulation of power line design which established mandatory rules aimed at minimizing or eliminating the negative impacts of power line facilities on avian populations. Our results show how population size and the average annual percentage of population change increased between the two periods, whereas the number of electrocuted birds was reduced in spite of the continuous growth of the wiring network. Conclusions Our results demonstrate that bird electrocution is a solvable and affordable problem if political interest is shown and financial investment is made.
    The combination of adequate spatial planning with a sustainable development of human infrastructures will contribute positively to the conservation of the Spanish imperial eagle and may underpin population growth and range expansion, with positive side effects on other endangered

  8. Problems occurred in prospecting and mining of a broken thick large uranium ore body and improvement scheme

    International Nuclear Information System (INIS)

    Hu Longfei; Fang Yang; Wang Yishun; Wang Long

    2014-01-01

    In the prospecting and mining of a broken, thick, large uranium ore body, uncertain prospecting and the shallow-hole shrinkage mining method resulted in a high dilution rate and wasted resources. To address these problems, improvement schemes were applied: intensifying drilling prospecting in place of pit prospecting, and stoping the ore body with the filling method. The results of the improvements were analyzed, accumulating experience and providing evidence for later prospecting and stoping work. (authors)

  9. Scientific communication

    Directory of Open Access Journals (Sweden)

    Aleksander Kobylarek

    2017-09-01

    The article tackles the problem of models of communication in science. The formal division of communication processes into oral and written does not resolve the problem of attitude. The author defines successful communication as a win-win game, based on the respect and equality of the partners, regardless of their position in the world of science. The core characteristics of the process of scientific communication are indicated, such as openness, fairness, support, and creation. The task of creating the right atmosphere for science communication belongs to moderators, who should not allow privilege and differentiation of position to affect scientific communication processes.

  10. Well-posedness of the Cauchy problem for models of large amplitude internal waves

    International Nuclear Information System (INIS)

    Guyenne, Philippe; Lannes, David; Saut, Jean-Claude

    2010-01-01

    We consider in this paper the 'shallow-water/shallow-water' asymptotic model obtained in Choi and Camassa (1999 J. Fluid Mech. 396 1–36), Craig et al (2005 Commun. Pure. Appl. Math. 58 1587–641) (one-dimensional interface) and Bona et al (2008 J. Math. Pures Appl. 89 538–66) (two-dimensional interface) from the two-layer system with rigid lid, for the description of large amplitude internal waves at the interface of two layers of immiscible fluids of different densities. For one-dimensional interfaces, this system is of hyperbolic type and its local well-posedness does not raise serious difficulties, although other issues (blow-up, loss of hyperbolicity, etc) turn out to be delicate. For two-dimensional interfaces, the system is nonlocal. Nevertheless, we prove that it conserves some properties of 'hyperbolic type' and show that the associated Cauchy problem is locally well posed in suitable Sobolev classes provided some natural restrictions are imposed on the data. These results are illustrated by numerical simulations with emphasis on the formation of shock waves

  11. Solving very large scattering problems using a parallel PWTD-enhanced surface integral equation solver

    KAUST Repository

    Liu, Yang

    2013-07-01

    The computational complexity and memory requirements of multilevel plane wave time domain (PWTD)-accelerated marching-on-in-time (MOT)-based surface integral equation (SIE) solvers scale as O(N_t N_s log^2 N_s) and O(N_s^1.5); here N_t and N_s denote the numbers of temporal and spatial basis functions discretizing the current [Shanker et al., IEEE Trans. Antennas Propag., 51, 628-641, 2003]. In the past, serial versions of these solvers have been successfully applied to the analysis of scattering from perfect electrically conducting as well as homogeneous penetrable targets involving up to N_s ≈ 0.5 × 10^6 and N_t ≈ 10^3. To solve larger problems, parallel PWTD-enhanced MOT solvers are called for. Even though a simple parallelization strategy was demonstrated in the context of electromagnetic compatibility analysis [M. Lu et al., in Proc. IEEE Int. Symp. AP-S, 4, 4212-4215, 2004], by and large, progress in this area has been slow. The lack of progress can be attributed largely to difficulties associated with the construction of a scalable PWTD kernel. © 2013 IEEE.

  12. A Framing Link Based Tabu Search Algorithm for Large-Scale Multidepot Vehicle Routing Problems

    Directory of Open Access Journals (Sweden)

    Xuhao Zhang

    2014-01-01

    A framing link (FL) based tabu search algorithm is proposed in this paper for a large-scale multidepot vehicle routing problem (LSMDVRP). Framing links are generated during successive optimization of current solutions and then taken as skeletons so as to improve optimum-seeking ability, speed up the optimization process, and obtain better results. Based on the comparison between pre- and post-mutation routes in the current solution, different parts are extracted. In the current optimization period, links involved in the optimal solution are regarded as candidates for the FL base. Multiple optimization periods exist in the whole algorithm, and there are several potential FLs in each period. If the update condition is satisfied, the FL base is updated, new FLs are added into the current route, and the next period starts. By adjusting the borderline of the multidepot sharing area with dynamic parameters, the authors define candidate selection principles for three kinds of customer connections. Link splitting and the roulette approach are employed to choose FLs. Eighteen LSMDVRP instances in three groups are studied, and new optimal solution values are obtained for nine of them, with higher computation speed and reliability.
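The framing-link mechanism itself is specific to the paper, but the underlying metaheuristic is classical tabu search. As a hedged illustration only (a plain tabu search with 2-opt moves and an aspiration criterion on a small symmetric TSP, not the authors' FL-based algorithm; all parameters are illustrative):

```python
import random

def tour_length(tour, dist):
    # cyclic tour cost under a symmetric distance matrix
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def tabu_search(dist, iterations=100, tenure=8, seed=0):
    """Minimal tabu search over 2-opt moves for a symmetric TSP."""
    rng = random.Random(seed)
    n = len(dist)
    current = list(range(n))
    rng.shuffle(current)
    best, best_len = current[:], tour_length(current, dist)
    tabu = {}  # move -> iteration index until which the move is forbidden
    for it in range(iterations):
        chosen, chosen_len, chosen_move = None, float("inf"), None
        for i in range(1, n - 1):
            for j in range(i + 2, n + 1):
                # 2-opt: reverse the segment current[i:j]
                cand = current[:i] + current[i:j][::-1] + current[j:]
                cand_len = tour_length(cand, dist)
                move = (current[i], current[j - 1])
                # a tabu move is allowed only if it beats the global best (aspiration)
                if tabu.get(move, -1) >= it and cand_len >= best_len:
                    continue
                if cand_len < chosen_len:
                    chosen, chosen_len, chosen_move = cand, cand_len, move
        if chosen is None:
            break
        current = chosen
        tabu[chosen_move] = it + tenure
        if chosen_len < best_len:
            best, best_len = chosen[:], chosen_len
    return best, best_len
```

The tabu list forbids recently used moves for `tenure` iterations, which is what lets the search escape local optima instead of cycling back into them.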

  13. 17 years after the Chernobyl' accident: problems and decisions. Proceedings of the International scientific and practical conference; 17 let posle Chernobylya: problemy i resheniya. Sbornik nauchnykh trudov Mezhdunarodnoj nauchno-prakticheskoj konferentsii

    Energy Technology Data Exchange (ETDEWEB)

    Shevchuk, V E; Gurachevskij, V L; Kolbanov, V V

    2003-04-01

    The book contains proceedings of the scientific conference on various medical and biological problems of the consequences of the Chernobyl NPP accident, as well as on problems of rehabilitation of the contaminated territories and ecosystems.

  14. The Process of Scientific Inquiry as It Relates to the Creation/Evolution Controversy: I. A Serious Social Problem

    Science.gov (United States)

    Miller, Jon S.; Toth, Ronald

    2014-01-01

    We describe how the increased level of religiosity in the United States is correlated with the resistance to the teaching of evolution and argue that this is a social, rather than scientific, issue. Our goal is to foster teachers' understanding of the philosophy of biology and encourage them to proactively deal with creationism at all levels,…

  15. Tackling the "so what" problem in scientific research: a systems-based approach to resource and publication tracking.

    Science.gov (United States)

    Harris, Paul A; Kirby, Jacqueline; Swafford, Jonathan A; Edwards, Terri L; Zhang, Minhua; Yarbrough, Tonya R; Lane, Lynda D; Helmer, Tara; Bernard, Gordon R; Pulley, Jill M

    2015-08-01

    Peer-reviewed publications are one measure of scientific productivity. From a project, program, or institutional perspective, publication tracking provides the quantitative data necessary to guide the prudent stewardship of federal, foundation, and institutional investments by identifying the scientific return for the types of support provided. In this article, the authors describe the Vanderbilt Institute for Clinical and Translational Research's (VICTR's) development and implementation of a semiautomated process through which publications are automatically detected in PubMed and adjudicated using a "just-in-time" workflow by a known pool of researchers (from Vanderbilt University School of Medicine and Meharry Medical College) who receive support from Vanderbilt's Clinical and Translational Science Award. Since implementation, the authors have (1) seen a marked increase in the number of publications citing VICTR support, (2) captured at a more granular level the relationship between specific resources/services and scientific output, (3) increased awareness of VICTR's scientific portfolio, and (4) increased efficiency in complying with annual National Institutes of Health progress reports. They present the methodological framework and workflow, measures of impact for the first 30 months, and a set of practical lessons learned to inform others considering a systems-based approach for resource and publication tracking. They learned that contacting multiple authors from a single publication can increase the accuracy of the resource attribution process in the case of multidisciplinary scientific projects. They also found that combining positive (e.g., congratulatory e-mails) and negative (e.g., not allowing future resource requests until adjudication is complete) triggers can increase compliance with publication attribution requests.

  16. III Scientific-technical conference. Problems and outlooks for development of chemical and radiochemical control in atomic energetics (Atomenergoanalytics-2005). Summaries of reports

    International Nuclear Information System (INIS)

    2005-01-01

    Summaries of reports of the III Scientific-technical conference 'Problems and outlooks for development of chemical and radiochemical control in atomic energetics' (Atomenergoanalytics-2005) are presented. The conference was held 20-22 September 2005 in Sosnovyj Bor. Topics discussed include the methodical, instrumental and metrological support of chemical, radiochemical and radiometric control at operating NPPs and NPUs, modern concepts for constructing automated systems of chemical and radiometric control in atomic energetics, and approaches to organizing and conducting chemical and radiochemical control of the water-chemistry regimes of NPPs and NPUs [ru

  17. Beyond Music Sharing: An Evaluation of Peer-to-Peer Data Dissemination Techniques in Large Scientific Collaborations

    Energy Technology Data Exchange (ETDEWEB)

    Ripeanu, Matei [University of British Columbia, Vancouver; Al-Kiswany, Samer [University of British Columbia, Vancouver; Iamnitchi, Adriana [University of South Florida, Tampa; Vazhkudai, Sudharshan S [ORNL

    2009-03-01

    The avalanche of data from scientific instruments and the ensuing interest from geographically distributed users to analyze and interpret it accentuates the need for efficient data dissemination. A suitable data distribution scheme will find the delicate balance between conflicting requirements of minimizing transfer times, minimizing the impact on the network, and uniformly distributing load among participants. We identify several data distribution techniques, some successfully employed by today's peer-to-peer networks: staging, data partitioning, orthogonal bandwidth exploitation, and combinations of the above. We use simulations to explore the performance of these techniques in contexts similar to those used by today's data-centric scientific collaborations and derive several recommendations for efficient data dissemination. Our experimental results show that the peer-to-peer solutions that offer load balancing and good fault tolerance properties and have embedded participation incentives lead to unjustified costs in today's scientific data collaborations deployed on over-provisioned network cores. However, as user communities grow and these deployments scale, peer-to-peer data delivery mechanisms will likely outperform other techniques.

  18. Parent and adolescent reports in assessing adolescent sleep problems: results from a large population study.

    Science.gov (United States)

    Fatima, Yaqoot; Doi, Suhail A R; O'Callaghan, Michael; Williams, Gail; Najman, Jake M; Mamun, Abdullah Al

    2016-09-01

    To compare parent and adolescent reports in exploring adolescent sleep problems and to identify the factors associated with adolescent sleep problem disclosure. Parent (n = 5185) and adolescent reports (n = 5171, age = 13.9 ± 0.3 years) from a birth cohort were used to explore adolescent sleep problems. Kappa coefficients were used to assess agreement, whereas conditional agreement and disagreement ratios were used to identify the optimal informant. Logistic regression analysis was used to determine the factors affecting adolescent sleep problem disclosure. Parental reports identified only about one-third of the sleep problems reported by adolescents, whereas adolescent reports identified up to two-thirds of the sleep problems reported by parents. Combined reports of parents and adolescents did not show any considerable difference from the adolescent report alone. Adolescent and parent health, maternal depression, and family communication were significantly associated with adolescent sleep problem disclosure. Adolescent reports could be used as the preferred source for exploring adolescent sleep problems. Parental reports should be used when parents as observers are more reliable reporters, or where adolescents are cognitively unable to report sleep problems. Additionally, the impact of poor health, maternal depression and family communication on sleep problem disclosure should be considered in diagnosing adolescent sleep problems. ©2016 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
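The kappa coefficient used above to assess parent-adolescent agreement corrects raw percent agreement for the agreement expected by chance. A minimal sketch for two binary raters (the data and coding below are hypothetical, not the study's):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two binary raters (1 = problem reported, 0 = not)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # observed proportion of cases where the two raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # expected agreement if the raters answered independently
    p_a1 = sum(rater_a) / n
    p_b1 = sum(rater_b) / n
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (observed - expected) / (1 - expected)
```

A kappa of 1 is perfect agreement and 0 is chance-level agreement, which is why it is preferred over raw percent agreement when, as here, one informant reports problems far more often than the other.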

  19. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    Science.gov (United States)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunity for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  20. Narrative Review of Statistical Reporting Checklists, Mandatory Statistical Editing, and Rectifying Common Problems in the Reporting of Scientific Articles.

    Science.gov (United States)

    Dexter, Franklin; Shafer, Steven L

    2017-03-01

    Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.

  1. About the 'scientification' of politics by way of scientific expertise by advisory bodies. Social science expertise and decision-making in social problem areas in the Federal Republic of Germany

    International Nuclear Information System (INIS)

    Wagner, P.

    1985-01-01

    Taking the examples of the Council of Economic Advisors, the Education Council and the Federal Parliament's Commission of Inquiry on Future Nuclear Energy Policy, this paper analyses political situations in the Federal Republic of Germany in which social science expertise entered public debate and decision-making in certain social problem areas in a very pronounced way. By considering the social context in which these advisory bodies were created, an attempt is made to link an analysis of different social actors' interests to a review of existing knowledge and patterns of interpretation in the social sciences. It is shown that by using social science findings some actors achieved advantages in justifying and legitimating their political positions, and that subsequently the relations of actors in some arenas of conflict changed, although this cannot be attributed causally solely to the use of scientific knowledge. If, however, the use of scientific arguments is rapidly generalized, the confrontation of expertise and counter-expertise by opposing actors becomes usual practice. This, in turn, raises questions concerning their 'scientificity', which the social sciences are asked to take up in reflecting on their relation to social practice. (orig./HSCH) [de

  2. POLITICAL RISK ON THE FINANCIAL MARKET The problem of adequate scientific assessment of business operations - the naivety of economists

    Directory of Open Access Journals (Sweden)

    Leszek Dziawgo

    2014-04-01

    One of the significant problems of a modern economy and of economics is political risk. The destructive influence of politics on the financial market cannot be ignored. It is necessary to indicate some specific problems of the financial market connected with politics in the areas of public finance (including the EU), monetary policy and the capital market. Nowadays, the scale and dynamics of political interference in the economy and finance lead to the problem of rationality in business activities. Moreover, many hidden political factors turn measurable political risk into immeasurable political uncertainty.

  3. Improvement in Generic Problem-Solving Abilities of Students by Use of Tutor-less Problem-Based Learning in a Large Classroom Setting

    Science.gov (United States)

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such techniques over classical teaching styles. Previously, we demonstrated that introduction of tutor-less PBL in a large third-year biochemistry undergraduate class increased student satisfaction and attendance. The current study assessed the generic problem-solving abilities of students from the same class at the beginning and end of the term, and compared student scores with similar data obtained in three classes not using PBL. Two generic problem-solving tests of equal difficulty were administered such that students took different tests at the beginning and the end of the term. Blinded marking showed a statistically significant 13% increase in the test scores of the biochemistry students exposed to PBL, while no trend toward significant change in scores was observed in any of the control groups not using PBL. Our study is among the first to demonstrate that use of tutor-less PBL in a large classroom leads to statistically significant improvement in generic problem-solving skills of students. PMID:23463230

  4. Enhancement of a model for Large-scale Airline Network Planning Problems

    NARCIS (Netherlands)

    Kölker, K.; Lopes dos Santos, Bruno F.; Lütjens, K.

    2016-01-01

    The main focus of this study is to solve the network planning problem based on passenger decision criteria including the preferred departure time and travel time for a real-sized airline network. For this purpose, a model of the integrated network planning problem is formulated including scheduling

  5. Adding intelligence to scientific data management

    Science.gov (United States)

    Campbell, William J.; Short, Nicholas M., Jr.; Treinish, Lloyd A.

    1989-01-01

    NASA's plans to solve some of the problems of handling large-scale scientific databases by turning to artificial intelligence (AI) are discussed. The growth of the information glut and the ways that AI can help alleviate the resulting problems are reviewed. The employment of the Intelligent User Interface prototype, in which the user generates his own natural-language query with the assistance of the system, is examined. Spatial data management, scientific data visualization, and data fusion are discussed.

  6. Solution of complex measuring problems for automation of a scientific experiment and technological processes; Reshenie slozhnykh izmeritel`nykh problem pri avtomatizatsii nauchnogo ehksperimenta i tekhnologicheskikh protsessov

    Energy Technology Data Exchange (ETDEWEB)

    Gribov, A A; Zhukov, V A; Sdobnov, S I; Yakovlev, G V [Rossijskij Nauchnyj Tsentr Kurchatovskij Inst., Moskva (Russian Federation)

    1996-12-31

    The paper discusses problems linked with the automation of reactor measurements. It describes an automated system for carrying out neutron-physical experiments involving measurement of the slowly varying currents of ionization chambers. The system is based on the bus-module principle, using a specialized 16-bit bus. The total information capacity for one current channel is 5 bytes. 4 refs.; 1 fig.

  7. Performance Characteristics of Hybrid MPI/OpenMP Scientific Applications on a Large-Scale Multithreaded BlueGene/Q Supercomputer

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we investigate the performance characteristics of five hybrid MPI/OpenMP scientific applications (two NAS Parallel Benchmarks Multi-Zone codes, SP-MZ and BT-MZ; an earthquake simulation, PEQdyna; an aerospace application, PMLB; and a 3D particle-in-cell application, GTC) on a large-scale multithreaded Blue Gene/Q supercomputer at Argonne National Laboratory, and quantify the performance gap resulting from using different numbers of threads per node. We use performance tools and MPI profile and trace libraries available on the supercomputer to analyze and compare the performance of these hybrid scientific applications with increasing numbers of OpenMP threads per node, and find that increasing the number of threads beyond some point saturates or worsens performance of these hybrid applications. For the strong-scaling hybrid scientific applications SP-MZ, BT-MZ, PEQdyna and PMLB, using 32 threads per node results in much better application efficiency than using 64 threads per node; as the number of threads per node increases, the FPU (floating point unit) percentage decreases, while the MPI percentage (except for PMLB) and IPC (instructions per cycle) per core (except for BT-MZ) increase. For the weak-scaling hybrid scientific application GTC, the performance trend (relative speedup) is very similar with increasing numbers of threads per node no matter how many nodes (32, 128, 512) are used. © 2013 IEEE.

  9. On the problem of earthquake correlation in space and time over large distances

    Science.gov (United States)

    Georgoulas, G.; Konstantaras, A.; Maravelakis, E.; Katsifarakis, E.; Stylios, C. D.

    2012-04-01

    such as the well-known k-means, which tend to form 'well-shaped' clusters, may not suffice for the problem at hand, and other families of unsupervised pattern recognition methods might be a better choice. One such algorithm is DBSCAN, which is based on the notion of density. In the proposed version, density is not estimated solely from the number of seismic events occurring in a specific spatio-temporal area, but also takes into account the size of each seismic event. A second method proposes the use of a modified measure of proximity that also accounts for the size of the earthquake, together with traditional clustering schemes such as k-means and agglomerative clustering (k-means is seeded with a fairly large number for k and the results are fed to the hierarchical algorithm, to alleviate memory requirements on the one hand and to allow for irregular cluster shapes on the other). Preliminary results of seismic cluster formation using these algorithms appear promising, as they are in agreement with geophysical observations on distinct seismic regions, such as those of the neighbouring regions in the Ionian Sea and of the southern Hellenic seismic arc, as well as with the location and orientation of the mapped network of underlying natural hazards beneath each cluster's vicinity.
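The proposed modification, estimating density from the size of seismic events rather than their raw count, can be sketched as a DBSCAN variant in which the core-point test sums event magnitudes within eps. The flat 2-D geometry, the event encoding `(x, y, magnitude)`, and the thresholds below are illustrative assumptions, not the authors' implementation:

```python
import math

def weighted_dbscan(events, eps, min_weight):
    """DBSCAN-style clustering where density is the summed magnitude of
    events within eps of a point, not the raw event count.
    events: list of (x, y, magnitude); returns labels (-1 = noise)."""
    n = len(events)
    labels = [None] * n  # None = unvisited, -1 = noise, >=0 = cluster id

    def neighbors(i):
        xi, yi, _ = events[i]
        return [j for j in range(n)
                if math.hypot(events[j][0] - xi, events[j][1] - yi) <= eps]

    def weight(idx):
        return sum(events[j][2] for j in idx)  # sum magnitudes, not counts

    cluster = 0
    for i in range(n):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if weight(nbrs) < min_weight:
            labels[i] = -1  # noise for now; may be claimed as a border point
            continue
        labels[i] = cluster
        frontier = [j for j in nbrs if j != i]
        while frontier:
            j = frontier.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point: joined but not expanded
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if weight(jn) >= min_weight:  # j is itself a core point
                frontier.extend(jn)
        cluster += 1
    return labels
```

With this weighting, one large-magnitude event can make a sparse neighbourhood "dense", which matches the intuition that a few strong earthquakes delimit a seismic region as well as many weak ones.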

  10. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  11. A New Method Based On Modified Shuffled Frog Leaping Algorithm In Order To Solve Nonlinear Large Scale Problem

    Directory of Open Access Journals (Sweden)

    Aliasghar Baziar

    2015-03-01

    In order to handle large-scale problems, this study has used the shuffled frog leaping algorithm. This algorithm is an optimization method based on natural memetics; a new two-phase modification is applied to it to achieve a better search of the problem space. The suggested algorithm is evaluated by comparison with some well-known algorithms on several benchmark optimization problems. The simulation results clearly show the superiority of this algorithm over other well-known methods in the area.
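The two-phase modification is not detailed in the abstract; the following is a minimal sketch of a standard shuffled frog leaping algorithm (global ranking, memeplex partitioning, worst-frog leaps toward the local and then global best, random restart), with illustrative parameters rather than the paper's:

```python
import random

def sfla(f, dim, bounds, n_frogs=30, n_memeplexes=5, memetic_iters=10,
         shuffles=20, seed=1):
    """Minimal shuffled frog leaping algorithm minimizing f over a box."""
    rng = random.Random(seed)
    lo, hi = bounds

    def rand_frog():
        return [rng.uniform(lo, hi) for _ in range(dim)]

    frogs = [rand_frog() for _ in range(n_frogs)]
    for _ in range(shuffles):
        frogs.sort(key=f)                       # "shuffling": global ranking
        best_global = frogs[0]
        # round-robin partition so each memeplex spans the fitness range
        memeplexes = [frogs[i::n_memeplexes] for i in range(n_memeplexes)]
        for mem in memeplexes:
            for _ in range(memetic_iters):
                mem.sort(key=f)
                best, worst = mem[0], mem[-1]
                # leap the worst frog toward the memeplex best
                step = [rng.random() * (b - w) for b, w in zip(best, worst)]
                cand = [min(hi, max(lo, w + s)) for w, s in zip(worst, step)]
                if f(cand) >= f(worst):         # no gain: leap toward global best
                    step = [rng.random() * (b - w)
                            for b, w in zip(best_global, worst)]
                    cand = [min(hi, max(lo, w + s)) for w, s in zip(worst, step)]
                if f(cand) >= f(worst):         # still no gain: random restart
                    cand = rand_frog()
                mem[-1] = cand
        frogs = [frog for mem in memeplexes for frog in mem]
    return min(frogs, key=f)
```

The memetic (within-memeplex) phase exploits locally, while the shuffle phase mixes information across memeplexes; since only each memeplex's worst frog is ever replaced, the best solution found is never lost.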

  12. APPLICATION OF PARAMETER CONTINUATION METHOD FOR INVESTIGATION OF VIBROIMPACT SYSTEMS DYNAMIC BEHAVIOUR. PROBLEM STATE. SHORT SURVEY OF WORLD SCIENTIFIC LITERATURE

    Directory of Open Access Journals (Sweden)

    V.A. Bazhenov

    2014-12-01

    In their previous works the authors have studied vibroimpact system dynamic behaviour by a numerical parameter continuation technique combined with shooting and Newton-Raphson methods. The technique is adapted to a two-mass, two-degree-of-freedom vibroimpact system under periodic excitation. Impact is simulated by a nonlinear contact interaction force based on Hertz's contact theory. Stability or instability of the obtained periodic solutions is determined by the monodromy matrix eigenvalues (multipliers), based on Floquet theory. In the present paper we describe the state of the problem of using the parameter continuation method for nonlinear problems, and give a short survey of the extensive contemporary literature in English and Russian on its application. The method is applied to vibroimpact problems more rarely because of the difficulties connected with repeated impacts.

  13. The IAPG: International Association for Promoting Geoethics: a scientific platform for widening the debate on problems of ethics applied to the geosciences

    Science.gov (United States)

    Bobrowsky, Peter; Brocx, Margaret; Di Capua, Giuseppe; Errami, Ezzoura; Greco, Roberto; Kieffer, Susan W.; Daji Limaye, Shrikant; Peppoloni, Silvia; Silva, Elizabeth; Tinti, Stefano; Wang, Meng

    2013-04-01

    Geoethics consists of research and reflection on the values upon which to base appropriate behaviours and practices regarding the Geosphere. Geoethics also deals with problems related to risk management and the mitigation of geohazards. One of the most important goals of Geoethics is to foster the proper and correct dissemination of the results of scientific studies and other information on risks. Moreover, Geoethics aims to improve the relationships between the scientific community, the mass media and the public, and to organize effective teaching tools to develop awareness, values and responsibility within the population. Geoethics should become part of social knowledge and an essential point of reference for every action affecting land, water and atmosphere usage that is taken by stakeholders and decision-makers. Although Geoethics is a young discipline, it provides a forum for open discussion inside the Geosciences on the social and cultural role that Geoscientists can play in society. First, Geoethics represents an opportunity for Geoscientists to become more conscious of their responsibilities in conducting their activity, highlighting the ethical, cultural and economic repercussions that their behavioural choices may have on society. From this point of view, Geoethics, at this stage of its development, is primarily an attitude of thinking: through consideration of geoethical questions, Geoscientists have the opportunity to ask questions about themselves, their skills, the quality of their work and the contribution they can provide to the healthy progress of humanity. The International Association for Promoting Geoethics (IAPG: http://www.iapg.geoethics.org) is a new multidisciplinary, scientific platform for widening the debate on problems of ethics applied to the Geosciences, through international cooperation, and for encouraging the involvement of geoscientists in Geoethics themes. The IAPG was founded to increase the awareness inside the scientific

  14. Numerical methods for the design of large-scale nonlinear discrete ill-posed inverse problems

    International Nuclear Information System (INIS)

    Haber, E; Horesh, L; Tenorio, L

    2010-01-01

    Design of experiments for discrete ill-posed problems is a relatively new area of research. While there has been some limited work concerning the linear case, little has been done to study design criteria and numerical methods for ill-posed nonlinear problems. We present an algorithmic framework for nonlinear experimental design with an efficient numerical implementation. The data are modeled as indirect, noisy observations of the model collected via a set of plausible experiments. An inversion estimate based on these data is obtained by a weighted Tikhonov regularization whose weights control the contribution of the different experiments to the data misfit term. These weights are selected by minimization of an empirical estimate of the Bayes risk that is penalized to promote sparsity. This formulation entails a bilevel optimization problem that is solved using a simple descent method. We demonstrate the viability of our design with a problem in electromagnetic imaging based on direct current resistivity and magnetotelluric data
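    The weighted Tikhonov inversion step described above can be illustrated with a minimal sketch, assuming dense linear forward operators and a direct solve of the normal equations; the function and variable names are hypothetical, and the paper's bilevel optimization of the weights is not shown.

```python
import numpy as np

def weighted_tikhonov(forward_ops, data, weights, alpha):
    """Solve min_m sum_i w_i ||A_i m - d_i||^2 + alpha ||m||^2
    by assembling the normal equations (dense, for illustration only)."""
    n = forward_ops[0].shape[1]
    lhs = alpha * np.eye(n)
    rhs = np.zeros(n)
    for A, d, w in zip(forward_ops, data, weights):
        lhs += w * A.T @ A        # each experiment's misfit is scaled by its weight
        rhs += w * A.T @ d
    return np.linalg.solve(lhs, rhs)

# Two hypothetical "experiments" observing an unknown model m_true
rng = np.random.default_rng(0)
m_true = np.array([1.0, -2.0, 0.5])
A1, A2 = rng.normal(size=(5, 3)), rng.normal(size=(4, 3))
d1, d2 = A1 @ m_true, A2 @ m_true
m_est = weighted_tikhonov([A1, A2], [d1, d2], weights=[1.0, 0.0], alpha=1e-8)
# A zero weight removes the second experiment's contribution to the misfit.
```

    Setting a weight to zero, as in the last call, mimics the sparsity-promoting selection of experiments: only the experiments with nonzero weight influence the estimate.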

  15. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem

    Directory of Open Access Journals (Sweden)

    Helala AlShehri

    2018-03-01

    Full Text Available The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  16. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem.

    Science.gov (United States)

    AlShehri, Helala; Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-03-28

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  17. A set of vertically integrated inquiry-based practical curricula that develop scientific thinking skills for large cohorts of undergraduate students.

    Science.gov (United States)

    Zimbardi, Kirsten; Bugarcic, Andrea; Colthorpe, Kay; Good, Jonathan P; Lluka, Lesley J

    2013-12-01

    Science graduates require critical thinking skills to deal with the complex problems they will face in their 21st century workplaces. Inquiry-based curricula can provide students with the opportunities to develop such critical thinking skills; however, evidence suggests that an inappropriate level of autonomy provided to underprepared students may not only be daunting to students but also detrimental to their learning. After a major review of the Bachelor of Science, we developed, implemented, and evaluated a series of three vertically integrated courses with inquiry-style laboratory practicals for early-stage undergraduate students in biomedical science. These practical curricula were designed so that students would work with increasing autonomy and ownership of their research projects to develop increasingly advanced scientific thinking and communication skills. Students undertaking the first iteration of these three vertically integrated courses reported learning gains in course content as well as skills in scientific writing, hypothesis construction, experimental design, data analysis, and interpreting results. Students also demonstrated increasing skills in both hypothesis formulation and communication of findings as a result of participating in the inquiry-based curricula and completing the associated practical assessment tasks. Here, we report the specific aspects of the curricula that students reported as having the greatest impact on their learning and the particular elements of hypothesis formulation and communication of findings that were more challenging for students to master. These findings provide important implications for science educators concerned with designing curricula to promote scientific thinking and communication skills alongside content acquisition.

  18. THE INFLUENCE OF SCIENCE LEARNING SET USING SCIENTIFIC APPROACH AND PROBLEM SOLVING MODEL ON LEARNING OUTCOMES OF JUNIOR HIGH SCHOOL STUDENTS IN THE SUBJECT OF HEAT AND TEMPERATURE

    OpenAIRE

    T. Triyuni

    2016-01-01

    This research aims to produce a scientific approach for science learning using a problem solving model on the topic of heat and temperature, and to assess its effect on junior high school learning outcomes. The curriculum used during the study was the 2013 curriculum (valid, practical and effective). The development of the learning set followed the four-D model, reduced to a three-D model (without dissemination). The study was tested in Classes VIIA, VIIB, and VIIC of SMP Negeri 5 in the 2015/2016 academic year. The data...

  19. The Scientific Enterprise

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 13; Issue 9. The Scientific Enterprise - Assumptions, Problems, and Goals in the Modern Scientific Framework. V V Raman. Reflections Volume 13 Issue 9 September 2008 pp 885-894 ...

  20. An Adaptive Large Neighborhood Search-based Three-Stage Matheuristic for the Vehicle Routing Problem with Time Windows

    DEFF Research Database (Denmark)

    Christensen, Jonas Mark; Røpke, Stefan

    that serves all the customers. The second stage usesan Adaptive Large Neighborhood Search (ALNS) algorithm to minimise the travel distance, during the second phase all of the generated routes are considered by solving a set cover problem. The ALNS algorithm uses 4 destroy operators, 2 repair operators...
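    The ALNS mechanism sketched in the abstract (a pool of destroy and repair operators chosen by adaptive weights) can be illustrated with a toy routing example; this is a generic, assumption-laden sketch with random-removal destroy, greedy-insertion repair and a simple greedy acceptance rule, not the authors' implementation, and all names are hypothetical.

```python
import math, random

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(12)]  # toy customers

def cost(tour):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def destroy_random(tour, k=3):
    """Destroy operator: remove k random customers from the tour."""
    removed = random.sample(tour, k)
    return [c for c in tour if c not in removed], removed

def repair_greedy(tour, removed):
    """Repair operator: reinsert each removed customer at its cheapest position."""
    for c in removed:
        best_pos = min(range(len(tour) + 1),
                       key=lambda i: cost(tour[:i] + [c] + tour[i:]))
        tour = tour[:best_pos] + [c] + tour[best_pos:]
    return tour

destroys = [destroy_random, lambda t: destroy_random(t, k=5)]
weights = [1.0] * len(destroys)          # adaptive operator weights

cur = list(range(len(pts)))
best = cur[:]
for _ in range(300):
    i = random.choices(range(len(destroys)), weights)[0]  # roulette selection
    partial, removed = destroys[i](cur)
    cand = repair_greedy(partial, removed)
    if cost(cand) <= cost(cur):          # accept improving/equal moves only
        cur = cand
        if cost(cand) < cost(best):
            best = cand[:]
            weights[i] += 1.0            # reward the operator that found it
```

    A full ALNS would use more operators, a simulated-annealing-style acceptance criterion, and periodic weight decay; the loop above only shows the destroy/repair/select skeleton.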

  1. A decomposition heuristics based on multi-bottleneck machines for large-scale job shop scheduling problems

    Directory of Open Access Journals (Sweden)

    Yingni Zhai

    2014-10-01

    Full Text Available Purpose: A decomposition heuristic based on multi-bottleneck machines for large-scale job shop scheduling problems (JSP) is proposed.
    Design/methodology/approach: In the algorithm, a number of sub-problems are constructed by iteratively decomposing the large-scale JSP according to the process route of each job; the solution of the large-scale JSP is then obtained by iteratively solving the sub-problems. In order to improve the sub-problems' solving efficiency and the solution quality, a detection method for multi-bottleneck machines based on the critical path is proposed, whereby the unscheduled operations can be partitioned into bottleneck operations and non-bottleneck operations. Following the principle that "the bottleneck leads the performance of the whole manufacturing system" in the Theory of Constraints (TOC), the bottleneck operations are scheduled by a genetic algorithm for high solution quality, and the non-bottleneck operations are scheduled by dispatching rules to improve solving efficiency.
    Findings: In constructing the sub-problems, partial operations of the previously scheduled sub-problem are moved into the successive sub-problem for re-optimization; this strategy improves the solution quality of the algorithm. In solving the sub-problems, evaluating a chromosome's fitness by predicting the global scheduling objective value also improves the solution quality.
    Research limitations/implications: Several assumptions reduce the complexity of the large-scale scheduling problem: the processing route of each job is predetermined, the processing time of each operation is fixed, there are no machine breakdowns, and no preemption of operations is allowed. These assumptions should be reconsidered if the algorithm is used in an actual job shop.
    Originality/value: The research provides an efficient scheduling method for the
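    The critical-path computation underlying the bottleneck detection can be sketched as a longest-path calculation over the DAG of operations; this toy example, with hypothetical operation names and precedence links, only illustrates the idea, not the paper's detection method.

```python
def critical_path(durations, preds):
    """durations: op -> processing time; preds: op -> list of predecessor ops.
    Returns the longest (critical) path and the makespan."""
    order, done = [], set()
    def visit(op):                        # topological order via DFS
        if op in done:
            return
        for p in preds.get(op, []):
            visit(p)
        done.add(op)
        order.append(op)
    for op in durations:
        visit(op)
    finish, back = {}, {}
    for op in order:                      # forward pass: earliest finish times
        ps = preds.get(op, [])
        start = max((finish[p] for p in ps), default=0.0)
        finish[op] = start + durations[op]
        back[op] = max(ps, key=lambda p: finish[p]) if ps else None
    end = max(finish, key=finish.get)
    path = []
    while end is not None:                # backtrack along binding predecessors
        path.append(end)
        end = back[end]
    return list(reversed(path)), max(finish.values())

# Hypothetical two-job example; J2O2 must wait for both its job predecessor
# and an operation of job 1 (e.g. a shared machine).
ops = {"J1O1": 3, "J1O2": 2, "J2O1": 2, "J2O2": 4}
preds = {"J1O2": ["J1O1"], "J2O2": ["J2O1", "J1O2"]}
path, makespan = critical_path(ops, preds)
```

    Machines whose operations appear on the returned path would be flagged as bottleneck candidates in a scheme like the one the abstract describes.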

  2. [Medical surveillance in university: organizational difficulties, legal problems, scientific e technical specificities. Experience of University of Milan Bicocca].

    Science.gov (United States)

    D'Orso, M I; Giuliani, C; Assini, R; Riva, M A; Cesana, G

    2012-01-01

    Our research describes the Occupational Health activities carried out during the last year at the University of Milan Bicocca by occupational physicians. We describe the results of medical surveillance of 1153 employees or students exposed to occupational health and safety risks. We report the results obtained, the technical difficulties, the organizational problems, and the preventive actions decided upon to improve the functionality of our activity. Students seem to be less protected and consequently seem to face higher occupational safety and health risks.

  3. Memories of atomic fear: the construction of imaginary scientific risk from the speeches on large radiation accidents by media

    International Nuclear Information System (INIS)

    Ferreira, Maria da Conceição da Rocha

    2018-01-01

    The thesis examines some aspects of the fear that a large part of society feels towards the use of atomic energy. Many studies point to memories of the terror of radioactive contamination or of the destruction caused by atomic weapons, or even to the controversial environmental case for nuclear energy efficiency against climate warming. The object here is the communication of scientific and technological risk, revealing the importance of journalism in informing a non-specialized population. The premise concerning accidents of nuclear or radiological origin is that, when they become objects of mass media communication, they carry something beyond any other technological accident: the image of the destruction wrought by the bomb can be a constant of terror in the public's apprehension of the risk of those accidents. French Discourse Analysis is the theoretical support used to investigate the construction of fear around atomic energy through the analysis of articles from the mass communication media. The time frame, as a first cut, covers the decades of 1980/1990, marked by the Chernobyl accident, worldwide, and the Cesium-137 accident in Goiania, Brazil. The same accidents are given a second cut on the anniversaries of these events, in cycles of up to 30 years, as a way of updating the conditions of production of the discourses around them and their effects on society's learning. The analysis articulated texts and images as discursive materials that have their own significations in the final effect of senses, which, according to the methodology adopted, is strongly affected by ideology. (author)

  4. The Solution of Large Time-Dependent Problems Using Reduced Coordinates.

    Science.gov (United States)

    1987-06-01

    numerical integration schemes for dynamic problems, the algorithm known as Newmark's Method. The behavior of the Newmark scheme, as well as the basic... The horizontal displacements at the mid-height and the bottom of the building are shown in figure 4.13. The solution history illustrated is for a
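    Newmark's method mentioned above can be sketched for a single-degree-of-freedom oscillator; this is the generic textbook formulation (average acceleration, beta = 1/4, gamma = 1/2), an assumed illustration rather than the report's reduced-coordinate implementation.

```python
import math

def newmark(m, c, k, p, u0, v0, dt, nsteps, beta=0.25, gamma=0.5):
    """Newmark time integration for m*u'' + c*u' + k*u = p(t), one DOF."""
    u, v = u0, v0
    a = (p(0.0) - c * v - k * u) / m                  # initial acceleration
    keff = m / (beta * dt**2) + gamma * c / (beta * dt) + k
    hist = [u]
    for n in range(1, nsteps + 1):
        t = n * dt
        # effective load from the previous state
        rhs = (p(t)
               + m * (u / (beta * dt**2) + v / (beta * dt)
                      + (1.0 / (2.0 * beta) - 1.0) * a)
               + c * (gamma * u / (beta * dt) + (gamma / beta - 1.0) * v
                      + dt * (gamma / (2.0 * beta) - 1.0) * a))
        u_new = rhs / keff
        a_new = ((u_new - u) / (beta * dt**2) - v / (beta * dt)
                 - (1.0 / (2.0 * beta) - 1.0) * a)
        v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        u, v, a = u_new, v_new, a_new
        hist.append(u)
    return hist

# Free vibration of an undamped oscillator: exact solution u(t) = cos(w t),
# with w = sqrt(k/m) = 2 here.
hist = newmark(m=1.0, c=0.0, k=4.0, p=lambda t: 0.0, u0=1.0, v0=0.0,
               dt=0.01, nsteps=100)
```

    With beta = 1/4 the scheme is unconditionally stable for linear problems, which is why the average-acceleration variant is the usual default for structural dynamics.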

  5. Asymptotic eigenvalue estimates for a Robin problem with a large parameter

    Czech Academy of Sciences Publication Activity Database

    Exner, Pavel; Minakov, A.; Parnovski, L.

    2014-01-01

    Roč. 71, č. 2 (2014), s. 141-156 ISSN 0032-5155 R&D Projects: GA ČR(CZ) GA14-06818S Institutional support: RVO:61389005 Keywords : Laplacian * Robin problem * eigenvalue asymptotics Subject RIV: BE - Theoretical Physics Impact factor: 0.250, year: 2014

  6. Indefinitely preconditioned inexact Newton method for large sparse equality constrained non-linear programming problems

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Vlček, Jan

    1998-01-01

    Roč. 5, č. 3 (1998), s. 219-247 ISSN 1070-5325 R&D Projects: GA ČR GA201/96/0918 Keywords : nonlinear programming * sparse problems * equality constraints * truncated Newton method * augmented Lagrangian function * indefinite systems * indefinite preconditioners * conjugate gradient method * residual smoothing Subject RIV: BA - General Mathematics Impact factor: 0.741, year: 1998

  7. Development of innovative problem based learning model with PMRI-scientific approach using ICT to increase mathematics literacy and independence-character of junior high school students

    Science.gov (United States)

    Wardono; Waluya, B.; Kartono; Mulyono; Mariani, S.

    2018-03-01

    This research is very urgent in relation to the national issue of human development and the nation's competitiveness, because the mathematics literacy of Indonesian junior high school students, as measured by the OECD Programme for International Student Assessment (PISA) in mathematics, is still very low compared to other countries. The 2013 Curriculum was launched partly in response to these PISA results, which are still far from the expectations of the Indonesian nation. To produce better-quality education and better PISA rankings, which reflect the nation's competitiveness, innovative interactive learning models need to be developed, such as interactive Problem Based Learning (PBL) based on the Indonesian Realistic Mathematics Education (PMRI) approach and the Scientific approach using Information and Communication Technology (ICT). The research was designed using Research and Development (R&D), which follows up the development and dissemination of a product/model. The results show that the innovative interactive PBL model based on PMRI-Scientific using ICT developed here is valid, practical and effective, and can improve the mathematics literacy and independence-character of junior high school students, while the quality of the model meets the good category.

  8. Scientific meetings

    International Nuclear Information System (INIS)

    1973-01-01

    One of the main aims of the IAEA is to foster the exchange of scientific and technical information and one of the main ways of doing this is to convene international scientific meetings. They range from large international conferences bringing together several hundred scientists, smaller symposia attended by an average of 150 to 250 participants and seminars designed to instruct rather than inform, to smaller panels and study groups of 10 to 30 experts brought together to advise on a particular programme or to develop a set of regulations. The topics of these meetings cover every part of the Agency's activities and form a backbone of many of its programmes. (author)

  9. Problem-Based Learning Model Used to Scientific Approach Based Worksheet for Physics to Develop Senior High School Students Characters

    Science.gov (United States)

    Yulianti, D.

    2017-04-01

    The purpose of this study is to explore the application of the Problem Based Learning (PBL) model aided with a scientific approach and character-integrated physics worksheets (LKS). Another purpose is to investigate the increase in cognitive and psychomotor learning outcomes and to observe the character development of students. The method used in this study was the quasi-experiment. The instruments were observation and a cognitive test. The worksheets can improve students' cognitive and psychomotor learning outcomes. Improvements in the cognitive learning results of students who learned using the worksheets are higher than those of students who received learning without worksheets. The LKS can also develop the students' character.

  10. LEGAL PROTECTION OF AVIATION IN THE CONTEXT OF GLOBALIZATION, RISKS AND SOCIAL ENTROPY AS A SCIENTIFIC PROBLEM: APPROACHES AND SOLUTIONS

    Directory of Open Access Journals (Sweden)

    O. O. Chernaya

    2015-01-01

    Full Text Available The article considers the issue concerning the international legal problem of using armed forces to counter the threats posed by the misuse of civil aircraft, in particular, the use of civil aircraft as a weapon to kill people and destroy objects on the territory of States (the events of 11th September 2001 in the USA. It proves the need for universal international legal norms regulating the actions of States to prevent and suppress acts of the misuse of civil aircraft.

  11. The problem of large samples. An activation analysis study of electronic waste material

    International Nuclear Information System (INIS)

    Segebade, C.; Goerner, W.; Bode, P.

    2007-01-01

    Large-volume instrumental photon activation analysis (IPAA) was used for the investigation of shredded electronic waste material. Sample masses from 1 to 150 grams were analyzed to estimate the minimum sample size needed to achieve a representativeness of the results that is satisfactory for a defined investigation task. Furthermore, the influence of irradiation and measurement parameters upon the quality of the analytical results was studied. Finally, the analytical data obtained from IPAA and from instrumental neutron activation analysis (INAA), both carried out in a large-volume mode, were compared. Only part of the values were found in satisfactory agreement. (author)

  12. A parametric study of contact problem on a large size flange

    International Nuclear Information System (INIS)

    Mukherjee, A.B.; Narayanan, T.; Dhondkar, J.K.; Mehra, V.K.

    1989-01-01

    Continuous change of the contact point on the gasket face with the application of bolt load makes this a non-linear problem. The geometric non-linearity of the structure is thus simulated, and the stress distribution over the gasket face is presented in this paper. The paper also describes the use of a taper on the gasket face to reduce stress peaking and to optimize the gasket face separation

  13. Solving and Interpreting Large-scale Harvest Scheduling Problems by Duality and Decomposition

    OpenAIRE

    Berck, Peter; Bible, Thomas

    1982-01-01

    This paper presents a solution to the forest planning problem that takes advantage of both the duality of linear programming formulations currently being used for harvest scheduling and the characteristics of decomposition inherent in the forest land class-relationship. The subproblems of decomposition, defined as the dual, can be solved in a simple, recursive fashion. In effect, such a technique reduces the computational burden in terms of time and computer storage as compared to the traditi...

  14. Problems and Challenges in Human Resource Management: Case of a Large Organization in Pakistan

    Directory of Open Access Journals (Sweden)

    Ali Irshad

    2008-09-01

    Full Text Available This paper critically analyzes the work culture for a mainstream financial organization operating within Pakistan, while drawing a specific example to elucidate certain dilemmas that impede the potential growth for the financial sector and its constituent workforce, besides hampering the performance of the organizations. The case study is related to an organization in financial sector which conducts a Management Trainee Program with the purpose to select, train and develop a high-potential pool of talent into future leaders and fore-runners of the organization. This paper critically analyzes several inherent problems that face the successful implementation of the trainee program under the frameworks of various theories of organizational management. To solve these problems, this article presents a detailed diagnosis of the management shortcomings to improve the firm's corporate culture, work ethics and employee handling strategy and mechanism. Recommendations are also made to minimize the problems and maximize the success of the Management Trainee Program in the case study organization.

  15. An adaptive large neighborhood search heuristic for the Electric Vehicle Scheduling Problem

    DEFF Research Database (Denmark)

    Wen, M.; Linde, Esben; Røpke, Stefan

    2016-01-01

    to minimizing the total deadheading distance. A mixed integer programming formulation as well as an Adaptive Large Neighborhood Search (ALNS) heuristic for the E-VSP are presented. ALNS is tested on newly generated E-VSP benchmark instances. Result shows that the proposed heuristic can provide good solutions...

  16. New methods to interpolate large volume of data from points or particles (Mesh-Free) methods application for its scientific visualization

    International Nuclear Information System (INIS)

    Reyes Lopez, Y.; Yervilla Herrera, H.; Viamontes Esquivel, A.; Recarey Morfa, C. A.

    2009-01-01

    In the following paper we develop a new method to interpolate large volumes of scattered data, focused mainly on the results of applying Mesh-free Methods, Point Methods and Particle Methods. In it, we use local radial basis functions as the interpolating functions. We also use an octree as the data structure that accelerates the localization of the data that influence the interpolated value at a new point, speeding up the application of scientific visualization techniques that generate images from the large data volumes produced by Mesh-free, Point and Particle Methods in the solution of diverse physical-mathematical models. As an example, the results obtained after applying this method using Shepard's local interpolation functions are shown. (Author) 22 refs
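    As a rough illustration of the kind of local interpolation described, here is a minimal sketch of Shepard inverse-distance weighting restricted to a search radius (the neighbourhood lookup that a spatial tree such as an octree would accelerate); the names, the radius and the power parameter are hypothetical choices, not the paper's settings.

```python
import math

def shepard_local(points, values, q, radius, power=2):
    """Shepard inverse-distance-weighted interpolation at query point q,
    using only the scattered points within `radius` of q (the local
    neighbourhood a spatial tree would be used to find quickly)."""
    num = den = 0.0
    for p, f in zip(points, values):
        d = math.dist(p, q)
        if d > radius:
            continue                 # point outside the local neighbourhood
        if d == 0.0:
            return f                 # query coincides with a data point
        w = 1.0 / d**power
        num += w * f
        den += w
    return num / den if den else None  # None: no data within the radius

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
vals = [0.0, 1.0, 1.0, 100.0]
# The far-away point (5, 5) lies outside the radius and has no influence.
v = shepard_local(pts, vals, q=(0.5, 0.5), radius=2.0)
```

    In a mesh-free post-processing pipeline, this local evaluation would be called once per pixel or voxel of the visualization grid, which is why fast neighbour localization dominates the cost.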

  17. Epistemic beliefs of middle and high school students in a problem-based, scientific inquiry unit: An exploratory, mixed methods study

    Science.gov (United States)

    Gu, Jiangyue

    Epistemic beliefs are individuals' beliefs about the nature of knowledge, how knowledge is constructed, and how knowledge can be justified. This study employed a mixed-methods approach to examine: (a) middle and high school students' self-reported epistemic beliefs (quantitative) and epistemic beliefs revealed from practice (qualitative) during a problem-based, scientific inquiry unit, (b) how middle and high school students' epistemic beliefs contribute to the construction of students' problem solving processes, and (c) how and why students' epistemic beliefs change by engaging in PBL. Twenty-one middle and high school students participated in a summer science class to investigate local water quality in a 2-week long problem-based learning (PBL) unit. The students worked in small groups to conduct water quality tests in their local watershed and visited several stakeholders for their investigation. Pretest and posttest versions of the Epistemological Beliefs Questionnaire were administered to assess students' self-reported epistemic beliefs before and after the unit. I videotaped and interviewed three groups of students during the unit and conducted discourse analysis to examine their epistemic beliefs revealed from scientific inquiry activities and to triangulate with their self-reported data. There are three main findings from this study. First, students in this study self-reported relatively sophisticated epistemic beliefs on the pretest. However, the comparison between their self-reported beliefs and beliefs revealed from practice indicated that some students were able to apply sophisticated beliefs during the unit while others failed to do so. The inconsistency between these two types of epistemic beliefs may be due to students' inadequate cognitive ability, low validity of the self-report measure, and the influence of contextual factors. Second, qualitative analysis indicated that students' epistemic beliefs of the nature of knowing influenced their problem

  18. Algorithm for solving the linear Cauchy problem for large systems of ordinary differential equations with the use of parallel computations

    Energy Technology Data Exchange (ETDEWEB)

    Moryakov, A. V., E-mail: sailor@orc.ru [National Research Centre Kurchatov Institute (Russian Federation)

    2016-12-15

    An algorithm for solving the linear Cauchy problem for large systems of ordinary differential equations is presented. The algorithm for systems of first-order differential equations is implemented in the EDELWEISS code with the possibility of parallel computations on supercomputers employing the MPI (Message Passing Interface) standard for the data exchange between parallel processes. The solution is represented by a series of orthogonal polynomials on the interval [0, 1]. The algorithm is characterized by simplicity and the possibility to solve nonlinear problems with a correction of the operator in accordance with the solution obtained in the previous iterative process.
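    The idea of representing the solution of a linear Cauchy problem by a polynomial series on [0, 1] can be illustrated with a minimal collocation sketch using shifted Legendre polynomials; this is an assumed, simplified serial scheme (no MPI parallelism, no iterative operator correction), not the EDELWEISS algorithm itself.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Harmonic oscillator as a first-order system y' = A y on [0, 1];
# for y0 = (1, 0) the exact solution is (cos t, -sin t).
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
y0 = np.array([1.0, 0.0])
d, n = len(y0), 8                        # system dimension, number of basis terms

ts = np.linspace(0.0, 1.0, n)            # collocation nodes; ts[0] carries the IC
Phi = L.legvander(2.0 * ts - 1.0, n - 1) # phi_k(t_i) = P_k(2 t_i - 1)
dPhi = np.zeros_like(Phi)
for k in range(n):                       # d/dt P_k(2t - 1) = 2 P_k'(2t - 1)
    ck = np.zeros(n); ck[k] = 1.0
    dPhi[:, k] = 2.0 * L.legval(2.0 * ts - 1.0, L.legder(ck))

# Unknowns: vector coefficients c_k in R^d, stacked into one long vector.
I = np.eye(d)
M = np.zeros((n * d, n * d))
rhs = np.zeros(n * d)
M[:d, :] = np.kron(Phi[0], I)            # initial condition y(0) = y0
rhs[:d] = y0
for i in range(1, n):                    # collocate y'(t_i) = A y(t_i)
    M[i * d:(i + 1) * d, :] = np.kron(dPhi[i], I) - np.kron(Phi[i], A)
C = np.linalg.solve(M, rhs).reshape(n, d)

y_end = Phi[-1] @ C                      # series evaluated at t = 1
```

    Even with only eight basis terms, the polynomial series reproduces the smooth solution to several digits, which hints at why a series representation is attractive for very large linear systems.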

  19. Robust Branch-Cut-and-Price for the Capacitated Minimum Spanning Tree Problem over a Large Extended Formulation

    DEFF Research Database (Denmark)

    Uchoa, Eduardo; Fukasawa, Ricardo; Lysgaard, Jens

    This paper presents a robust branch-cut-and-price algorithm for the Capacitated Minimum Spanning Tree Problem (CMST). The variables are associated to q-arbs, a structure that arises from a relaxation of the capacitated prize-collecting arborescence problem in order to make it solvable in pseudo-polynomial time. Traditional inequalities over the arc formulation, like Capacity Cuts, are also used. Moreover, a novel feature is introduced in such kind of algorithms. Powerful new cuts expressed over a very large set of variables could be added, without increasing the complexity of the pricing subproblem...

  20. Problems related to the definition of the shielding of a large fast power reactor

    International Nuclear Information System (INIS)

    Moreau, J.

    Solutions for the shielding of a 1000 MW(e) power plant in the same technological line as Phenix are given. They have been evaluated with a one-dimensional transport code. The choice is based on a comparison of their efficiency against neutrons and on the consequences of their characteristics for the design of the reactor tank. A few economic considerations give an idea of the influence of the shielding choice on the cost of the power plant. Finally, the problem of optimization possibilities is approached from the designer's point of view

  1. Solving large-scale PDE-constrained Bayesian inverse problems with Riemann manifold Hamiltonian Monte Carlo

    Science.gov (United States)

    Bui-Thanh, T.; Girolami, M.

    2014-11-01

    We consider the Riemann manifold Hamiltonian Monte Carlo (RMHMC) method for solving statistical inverse problems governed by partial differential equations (PDEs). The Bayesian framework is employed to cast the inverse problem into the task of statistical inference whose solution is the posterior distribution in infinite dimensional parameter space conditional upon observation data and Gaussian prior measure. We discretize both the likelihood and the prior using the H1-conforming finite element method together with a matrix transfer technique. The power of the RMHMC method is that it exploits the geometric structure induced by the PDE constraints of the underlying inverse problem. Consequently, each RMHMC posterior sample is almost uncorrelated/independent from the others providing statistically efficient Markov chain simulation. However this statistical efficiency comes at a computational cost. This motivates us to consider computationally more efficient strategies for RMHMC. At the heart of our construction is the fact that for Gaussian error structures the Fisher information matrix coincides with the Gauss-Newton Hessian. We exploit this fact in considering a computationally simplified RMHMC method combining state-of-the-art adjoint techniques and the superiority of the RMHMC method. Specifically, we first form the Gauss-Newton Hessian at the maximum a posteriori point and then use it as a fixed constant metric tensor throughout RMHMC simulation. This eliminates the need for the computationally costly differential geometric Christoffel symbols, which in turn greatly reduces computational effort at a corresponding loss of sampling efficiency. We further reduce the cost of forming the Fisher information matrix by using a low rank approximation via a randomized singular value decomposition technique. This is efficient since a small number of Hessian-vector products are required. 
The Hessian-vector product in turn requires only two extra PDE solves using the adjoint
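    The simplified scheme described above, which freezes the metric at the Gauss-Newton Hessian of the MAP point, reduces to HMC with a constant mass matrix. Here is a toy sketch on a Gaussian target with a diagonal "Hessian"; it is an illustrative, assumption-laden example (tiny dimension, hand-picked step size), not the paper's PDE-constrained implementation.

```python
import math, random

random.seed(0)

# Toy posterior: zero-mean Gaussian with diagonal precision H, standing in
# for the Gauss-Newton Hessian at the MAP point (variances 1/4 and 1).
H = [4.0, 1.0]
M = H[:]                                   # constant metric (mass matrix) = H

def U(x):      return 0.5 * sum(h * xi * xi for h, xi in zip(H, x))
def gradU(x):  return [h * xi for h, xi in zip(H, x)]

def hmc_step(x, eps=0.2, L=10):
    p = [math.sqrt(m) * random.gauss(0.0, 1.0) for m in M]   # p ~ N(0, M)
    x_new, p_new = x[:], p[:]
    for _ in range(L):                                       # leapfrog
        p_new = [pi - 0.5 * eps * gi for pi, gi in zip(p_new, gradU(x_new))]
        x_new = [xi + eps * pi / m for xi, pi, m in zip(x_new, p_new, M)]
        p_new = [pi - 0.5 * eps * gi for pi, gi in zip(p_new, gradU(x_new))]
    h0 = U(x) + 0.5 * sum(pi * pi / m for pi, m in zip(p, M))
    h1 = U(x_new) + 0.5 * sum(pi * pi / m for pi, m in zip(p_new, M))
    return x_new if random.random() < math.exp(min(0.0, h0 - h1)) else x

x, samples = [1.0, 1.0], []
for i in range(3000):
    x = hmc_step(x)
    if i >= 500:                           # discard burn-in
        samples.append(x)

var0 = sum(s[0] ** 2 for s in samples) / len(samples)  # target value: 1/H[0] = 0.25
```

    Because the mass matrix matches the target's precision, the leapfrog dynamics are effectively isotropic, which is the source of the sampling efficiency the fixed-metric simplification tries to retain.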

  2. Final Scientific/Technical Report: Electronics for Large Superconducting Tunnel Junction Detector Arrays for Synchrotron Soft X-ray Research

    Energy Technology Data Exchange (ETDEWEB)

    Warburton, William K

    2009-03-06

    Superconducting tunnel junction (STJ) detectors offer an approach to detecting soft x-rays with energy resolutions 4-5 times better, and at rates 10 times faster, than traditional semiconductor detectors. To make such detectors feasible, however, they need to be deployed in large arrays of order 1000 detectors, which in turn implies that their processing electronics must be compact, fully computer controlled, and low cost per channel, while still delivering ultra-low-noise performance so as not to degrade the STJs' performance. We report on our progress in designing a compact, low-cost preamplifier intended for this application. In particular, we produced a prototype preamplifier with a 2 sq-cm area and a parts cost of less than $30 that matched the energy resolution of the best conventional system to date, and demonstrated its ability to acquire an STJ I-V curve under computer control, the critical step for determining and setting the detectors' operating points under software control.

  3. Solving sparse linear least squares problems on some supercomputers by using large dense blocks

    DEFF Research Database (Denmark)

    Hansen, Per Christian; Ostromsky, T; Sameh, A

    1997-01-01

    Efficient subroutines for dense matrix computations have recently been developed and are available on many high-speed computers. On some computers the speed of many dense matrix operations is close to the peak performance. For sparse matrices, storage and operations can be saved by operating on and storing only the nonzero elements. However, the price is a great degradation of the speed of computations on supercomputers (due to the use of indirect addressing, the need to insert new nonzeros into the sparse storage scheme, the lack of data locality, etc.). On many high-speed computers a dense matrix technique is therefore preferable to a sparse matrix technique when the matrices are not large, because the high computational speed fully compensates for the disadvantages of using more arithmetic operations and more storage. For very large matrices the computations must be organized as a sequence of tasks in each...

  4. Sparse direct solver for large finite element problems based on the minimum degree algorithm

    Czech Academy of Sciences Publication Activity Database

    Pařík, Petr; Plešek, Jiří

    2017-01-01

    Roč. 113, November (2017), s. 2-6 ISSN 0965-9978 R&D Projects: GA ČR(CZ) GA15-20666S; GA MŠk(CZ) EF15_003/0000493 Institutional support: RVO:61388998 Keywords: sparse direct solution * finite element method * large sparse linear systems Subject RIV: JR - Other Machinery OBOR OECD: Mechanical engineering Impact factor: 3.000, year: 2016 https://www.sciencedirect.com/science/article/pii/S0965997817302582

  5. Standard problem exercise to validate criticality codes for large arrays of packages of fissile materials

    International Nuclear Information System (INIS)

    Whitesides, G.E.; Stephens, M.E.

    1986-01-01

    A study has been conducted by an Organisation for Economic Co-operation and Development-Committee on the Safety of Nuclear Installations (OECD-CSNI) Working Group that examined computational methods used to compute k-eff for large (greater than or equal to 5³) arrays of fissile material (in which each unit is a substantial fraction of a critical mass). Five fissile materials that might typically be transported were used in the study. The "packages" used for this exercise were simplified to allow studies unperturbed by the variety of structural materials which would exist in an actual package. The only material present other than the fissile material was the moderator (water) surrounding it, which was varied. Consistent results were obtained from calculations using several computational methods. That is, when the bias demonstrated by each method for actual critical experiments was used to "correct" the results obtained for systems for which there were no experimental data, there was good agreement between the methods. Two major areas of concern were raised by this exercise. First, the lack of experimental data for arrays with size greater than 5³ limits validation for large systems. Second, there is a distinct possibility that the commingling of two shipments of unlike units could result in a reduction of the safety margins. Additional experiments and calculations will be required to satisfactorily resolve the remaining questions regarding the safe transport of large arrays of fissile materials.

  6. Modeling and solving a large-scale generation expansion planning problem under uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shan; Ryan, Sarah M. [Iowa State University, Department of Industrial and Manufacturing Systems Engineering, Ames (United States); Watson, Jean-Paul [Sandia National Laboratories, Discrete Math and Complex Systems Department, Albuquerque (United States); Woodruff, David L. [University of California Davis, Graduate School of Management, Davis (United States)

    2011-11-15

    We formulate a generation expansion planning problem to determine the type and quantity of power plants to be constructed over each year of an extended planning horizon, considering uncertainty regarding future demand and fuel prices. Our model is expressed as a two-stage stochastic mixed-integer program, which we use to compute solutions independently minimizing the expected cost and the Conditional Value-at-Risk; i.e., the risk of significantly larger-than-expected operational costs. We introduce stochastic process models to capture demand and fuel price uncertainty, which are in turn used to generate trees that accurately represent the uncertainty space. Using a realistic problem instance based on the Midwest US, we explore two fundamental, unexplored issues that arise when solving any stochastic generation expansion model. First, we introduce and discuss the use of an algorithm for computing confidence intervals on obtained solution costs, to account for the fact that a finite sample of scenarios was used to obtain a particular solution. Second, we analyze the nature of solutions obtained under different parameterizations of this method, to assess whether the recommended solutions themselves are invariant to changes in costs. The issues are critical for decision makers who seek truly robust recommendations for generation expansion planning. (orig.)
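    The risk measure used alongside expected cost can be made concrete. Below is a minimal sample-based Conditional Value-at-Risk computation via the Rockafellar-Uryasev representation; the scenario costs are made-up numbers for illustration, not data from the study.

```python
import numpy as np

def cvar(costs, alpha=0.95):
    """Sample CVaR_alpha: the expected cost over the worst (1 - alpha) tail,
    via the Rockafellar-Uryasev formula  min_t  t + E[(c - t)^+] / (1 - alpha)."""
    c = np.sort(np.asarray(costs, dtype=float))
    t = np.quantile(c, alpha)                 # Value-at-Risk serves as the threshold
    return t + np.maximum(c - t, 0.0).mean() / (1.0 - alpha)

# Hypothetical operational costs across ten scenarios
costs = np.array([10.0, 12.0, 11.0, 30.0, 13.0, 50.0, 12.0, 14.0, 11.0, 12.0])
c90 = cvar(costs, alpha=0.9)                  # emphasizes the worst 10% of scenarios
```

    A risk-averse expansion plan would minimize a quantity like `c90` instead of (or blended with) the plain mean, penalizing larger-than-expected operational costs.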

  7. Recreating Raven's: software for systematically generating large numbers of Raven-like matrix problems with normed properties.

    Science.gov (United States)

    Matzen, Laura E; Benz, Zachary O; Dixon, Kevin R; Posey, Jamie; Kroger, James K; Speed, Ann E

    2010-05-01

    Raven's Progressive Matrices is a widely used test for assessing intelligence and reasoning ability (Raven, Court, & Raven, 1998). Since the test is nonverbal, it can be applied to many different populations and has been used all over the world (Court & Raven, 1995). However, relatively few matrices are in the sets developed by Raven, which limits their use in experiments requiring large numbers of stimuli. For the present study, we analyzed the types of relations that appear in Raven's original Standard Progressive Matrices (SPMs) and created a software tool that can combine the same types of relations according to parameters chosen by the experimenter, to produce very large numbers of matrix problems with specific properties. We then conducted a norming study in which the matrices we generated were compared with the actual SPMs. This study showed that the generated matrices both covered and expanded on the range of problem difficulties provided by the SPMs.

  8. The solar elemental abundances problem: Large enhancements in photoionization and bound-free opacity

    Science.gov (United States)

    Pradhan, A.; Nahar, S.

    2016-05-01

    Aimed at solving the outstanding problem of solar opacity and radiation transport, we report substantial photoabsorption in the high-energy regime due to atomic core photo-excitations not heretofore considered. In extensive R-matrix calculations of unprecedented complexity for the important iron ion Fe XVII, with a wave function expansion of 99 Fe XVIII core states from n current opacity models, and ii) demonstrate convergence with respect to successive core excitations. These findings may explain the "higher-than-predicted" monochromatic iron opacity measured recently at the Sandia Z-pinch fusion device at solar interior conditions. The findings will also impact the total atomic photoabsorption and radiation transport in laboratory and astrophysical plasmas, such as UV emission from host stars of extra-solar planets. Support: NSF, DOE, Ohio Supercomputer Center, Columbus, OH.

  9. Implementation of a partitioned algorithm for simulation of large CSI problems

    Science.gov (United States)

    Alvin, Kenneth F.; Park, K. C.

    1991-01-01

    The implementation of a partitioned numerical algorithm for determining the dynamic response of coupled structure/controller/estimator finite-dimensional systems is reviewed. The partitioned approach leads to a set of coupled first and second-order linear differential equations which are numerically integrated with extrapolation and implicit step methods. The present software implementation, ACSIS, utilizes parallel processing techniques at various levels to optimize performance on a shared-memory concurrent/vector processing system. A general procedure for the design of controller and filter gains is also implemented, which utilizes the vibration characteristics of the structure to be solved. Also presented are: example problems; a user's guide to the software; the procedures and algorithm scripts; a stability analysis for the algorithm; and the source code for the parallel implementation.

  10. Use of endochronic plasticity for multi-dimensional small and large strain problems

    International Nuclear Information System (INIS)

    Hsieh, B.J.

    1980-04-01

    The endochronic plasticity theory was proposed in its general form by K. C. Valanis. An intrinsic time measure, which is a property of the material, is used in the theory. The explicit forms of the constitutive equation closely resemble those of the classical theory of linear viscoelasticity. Excellent agreement between predicted and experimental results is obtained for some metallic and non-metallic materials in one-dimensional cases. No reference on the use of endochronic plasticity consistent with the general theory proposed by Valanis is available in the open literature. In this report, explicit constitutive equations are derived that are consistent with the general theory for one-dimensional (simple tension or compression), two-dimensional plane strain or plane stress, and three-dimensional axisymmetric problems.

  11. Non-smooth optimization methods for large-scale problems: applications to mid-term power generation planning

    International Nuclear Information System (INIS)

    Emiel, G.

    2008-01-01

    This manuscript deals with large-scale non-smooth optimization, as typically arises when performing Lagrangian relaxation of difficult problems. This technique is commonly used to tackle mixed-integer linear programs or large-scale convex problems. For example, a classical approach when dealing with power generation planning problems in a stochastic environment is to perform a Lagrangian relaxation of the coupling demand constraints. In this approach, a master problem coordinates local subproblems, specific to each generation unit. The master problem deals with a separable non-smooth dual function which can be maximized with, for example, bundle algorithms. In chapter 2, we introduce basic tools of non-smooth analysis and some recent results regarding incremental or inexact instances of non-smooth algorithms. However, in some situations, the dual problem may still be very hard to solve. For instance, when the number of dualized constraints is very large (exponential in the dimension of the primal problem), explicit dualization may no longer be possible or the update of dual variables may fail. In order to reduce the dual dimension, different heuristics have been proposed. They involve a separation procedure to dynamically select a restricted set of constraints to be dualized along the iterations. This relax-and-cut type approach has shown its numerical efficiency in many combinatorial problems. In chapter 3, we show primal-dual convergence of such a strategy when using an adapted subgradient method for the dual step, under minimal assumptions on the separation procedure. Another limit of Lagrangian relaxation may appear when the dual function is separable into highly numerous or complex sub-functions. In such a situation, the computational burden of solving all local subproblems may dominate the whole iterative process. A natural strategy here would be to take full advantage of the separable dual structure, performing a dual iteration after having
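    As a toy illustration of the subgradient machinery reviewed in chapter 2, the sketch below minimizes a simple piecewise-linear convex function with a diminishing-step subgradient method; the objective is a made-up stand-in for the kind of non-smooth (dual) functions discussed above, not the thesis's actual master problem.

```python
# Subgradient method on f(x) = |x - 3| + |x + 1|, a non-smooth convex
# function whose minimum value is 4, attained anywhere on [-1, 3].
def sign(v):
    return (v > 0) - (v < 0)

def f_and_subgrad(x):
    # At a kink any convex combination of one-sided slopes is a subgradient;
    # summing the signs picks a valid one everywhere.
    return abs(x - 3.0) + abs(x + 1.0), sign(x - 3.0) + sign(x + 1.0)

x, best = 10.0, float("inf")
for k in range(1, 1001):
    fx, g = f_and_subgrad(x)
    best = min(best, fx)          # subgradient steps are not monotone, so track the best
    x -= (1.0 / k) * g            # classical diminishing step size
```

    Bundle methods, used in the thesis, accelerate this basic scheme by keeping a model built from several past subgradients instead of a single step.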

  12. Solution to selected occupational health problems in the operation of large machinery in potash mines

    Energy Technology Data Exchange (ETDEWEB)

    Stuhrmann, D.

    1979-01-01

    This paper discusses health hazards, such as noise, vibrations and air pollution caused by the operation of large machines and their effect on the human body. Means of reducing the influence of these factors are: improved ear protection against noise, special gymnastics at the end of each shift, warm water swimming and a weekly sauna treatment for the effects of vibrations. Air pollution caused by diesel engines is monitored. Engine exhaust standards are examined in detail and methods are proposed to further reduce air pollution.

  13. Fútbol y Racismo: un problema científico y social. Soccer and Racism: a scientific and social problem.

    Directory of Open Access Journals (Sweden)

    Duran González, Javier

    2006-04-01

    Full Text Available Although the problem of racism seemed to have been overcome in Europe, recent events such as monkey chants directed at black players, anti-Semitic songs, and even the use of racist slogans and symbols of the extreme right suggest otherwise. In this sense, the alarm has been raised in Spain. This article presents the policy implemented in Spain to fight this problem, embodied in the creation of the Observatory against Racism and Violence in Sport. The contents are structured as follows: an introduction to the lines of intervention in the area of racism and sport in general, noting some of the main difficulties encountered when intervening in this field; a presentation of the European policies and the bodies responsible for fighting and preventing racism in football; the specific measures adopted in Spain, materialized in the creation of the Observatory of Racism and Violence in Sport on 22 December 2004 within the National Commission against Violence in Sporting Events, together with the measures adopted; concluding with some recommendations to ensure the effectiveness of the fight against racism in sport in Spain.

  14. Performance of the improved version of Monte Carlo code A³MCNP for large-scale shielding problems

    International Nuclear Information System (INIS)

    Omura, M.; Miyake, Y.; Hasegawa, T.; Ueki, K.; Sato, O.; Haghighat, A.; Sjoden, G. E.

    2005-01-01

    A³MCNP (Automatic Adjoint Accelerated MCNP) is a revised version of the MCNP Monte Carlo code, which automatically prepares variance reduction parameters for the CADIS (Consistent Adjoint Driven Importance Sampling) methodology. Using a deterministic 'importance' (or adjoint) function, CADIS performs source and transport biasing within the weight-window technique. The current version of A³MCNP uses the three-dimensional (3-D) Sn transport TORT code to determine a 3-D importance function distribution. Based on simulation of several real-life problems, it is demonstrated that A³MCNP provides precise calculation results with a remarkably short computation time by using proper and objective variance reduction parameters. However, since the first version of A³MCNP provided only a point source configuration option for large-scale shielding problems, such as spent-fuel transport casks, a large amount of memory may be necessary to store enough points to properly represent the source. Hence, we have developed an improved version of A³MCNP (referred to as A³MCNPV) which has a volumetric source configuration option. This paper describes the successful use of A³MCNPV for a concrete cask neutron and gamma-ray shielding problem, and a PWR dosimetry problem. (authors)
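    The biasing idea behind CADIS can be caricatured with a one-dimensional importance-sampling example: estimating a deep-penetration (rare-event) probability by sampling from a stretched density and weighting each score by the ratio of true to biased densities. This is a drastic simplification, assuming an exponential attenuation model; A³MCNP derives its weight-window parameters from a deterministic adjoint solution instead.

```python
import math, random

# Estimate the deep-penetration probability P(X > a) for X ~ Exp(1)
# by sampling from the biased density q(x) = exp(-x/a)/a (mean a, so the
# rare region is hit often) and weighting by p(x)/q(x).
random.seed(42)
a, n = 10.0, 20000
total = 0.0
for _ in range(n):
    x = random.expovariate(1.0 / a)                     # biased draw
    if x > a:                                           # scores the rare event
        total += a * math.exp(-x * (1.0 - 1.0 / a))     # weight = p(x)/q(x)
estimate = total / n
exact = math.exp(-a)                                    # analytic answer, ~4.5e-05
```

    An unbiased analog estimator with the same n would almost never score the event at all; the weighted estimator keeps the mean exact while collapsing the variance.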

  15. Number of deaths due to lung diseases: How large is the problem?

    International Nuclear Information System (INIS)

    Wagener, D.K.

    1990-01-01

    The importance of lung disease as an indicator of environmentally induced adverse health effects has been recognized by inclusion among the Health Objectives for the Nation. The 1990 Health Objectives for the Nation (US Department of Health and Human Services, 1986) include an objective that there should be virtually no new cases among newly exposed workers for four preventable occupational lung diseases: asbestosis, byssinosis, silicosis, and coal workers' pneumoconiosis. This brief communication describes two types of cause-of-death statistics, underlying cause and multiple cause, and demonstrates the differences between the two using lung disease deaths among adult men. The choice of statistic has a large impact on estimated lung disease mortality rates. It may also have a large effect on the estimated mortality rates of other chronic diseases thought to be environmentally mediated. Issues of comorbidity and the way causes of death are reported become important in the interpretation of these statistics. The choice of which statistic to use when comparing data from a study population with national statistics may greatly affect the interpretation of the study findings.

  16. [Substance basis research on Chinese materia medica is one of key scientific problems of inheriting, development and innovation of Chinese materia medica].

    Science.gov (United States)

    Yang, Xiu-wei

    2015-09-01

    The compound Chinese materia medica is the medication pattern of traditional Chinese medicine for disease prevention and treatment. The single Chinese materia medica (mostly in decoction pieces) is the prescription component of the compound Chinese materia medica. The study of the effective substance basis of Chinese materia medica should take the chemical compositions of the compound Chinese materia medica as its entry point, considering the different status of "Monarch, Minister, Assistant, and Guide" that a given single Chinese materia medica holds in different compound Chinese materia medica, while substance basis research on a given single Chinese materia medica should comprise full component analysis together with stable and controllable quality. Substance basis research on Chinese materia medica is one of the key scientific problems in the inheritance, development and innovation of Chinese materia medica.

  17. PROSCENIUM OF SCIENTIFIC MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Vasile Berlingher

    2013-09-01

    Full Text Available During the last three decades of the nineteenth century, organizations developed rapidly, and their managers began to realize that managerial problems were arising too frequently; this awareness led to a new phase in the development of scientific management. Examining the titles published in that period, it can be concluded that the management issues of interest related to payroll and payroll systems, problems exacerbated by the industrial revolution, and to work efficiency. Noting that in large organizations the power of direct supervision was being lost, managers looked for incentives to replace that power. One of the first practitioners of this new management system was Henry R. Towne, president of the well-known enterprise "Yale and Towne Manufacturing Company", who applied these management methods in his company's workshops. The publishers of the magazines "Industrial Management" and "The Engineering Magazine" stated that H. R. Towne is, indisputably, the pioneer of scientific management. He initiated the systematic application of effective management methods, and his famous article "The Engineer as Economist", presented to the American Society of Mechanical Engineers in 1886, probably inspired Frederick W. Taylor to devote his entire life and work to scientific management.

  18. Complaints from emergency department patients largely result from treatment and communication problems.

    Science.gov (United States)

    Taylor, David McD; Wolfe, Rory; Cameron, Peter A

    2002-03-01

    Emergency department patient complaints are often justified and may lead to apology, remedial action or compensation. The aim of the present study was to analyse emergency department patient complaints in order to identify procedures or practices that require change and to make recommendations for intervention strategies aimed at decreasing complaint rates. We undertook a retrospective analysis of patient complaints from 36 Victorian emergency departments over a 61 month period. Data were obtained from the Health Complaint Information Program (Health Services Commissioner). In all, 2,419 emergency department patients complained about a total of 3,418 separate issues (15.4% of all issues from all hospital departments). Of these, 1,157 complaints (47.8%) were received by telephone and 829 (34.3%) were received by letter; 1,526 complaints (63.1%) were made by a person other than the patient. The highest complaint rates were received from patients who were female, born in non-English-speaking countries, and very young or very old. One thousand one hundred and forty-one issues (33.4%) related to patient treatment, including inadequate treatment (329 issues) and inadequate diagnosis (249 issues); 1,079 issues (31.6%) related to communication, including poor staff attitude, discourtesy and rudeness (444 issues); 407 issues (11.9%) related to delay in treatment. Overall, 2,516 issues (73.6%) were resolved satisfactorily, usually by explanation or apology. Only 59 issues (1.7%) resulted in a procedure or policy change. Remedial action was taken for 109 issues (3.2%) and compensation was paid to eight patients. Communication remains a significant factor in emergency department patient dissatisfaction. While patient complaints have resulted in major changes to policy and procedure, further research and intervention strategies targeting communication problems are indicated. In the short term, focused staff training is recommended.

  19. Theory and algorithms for solving large-scale numerical problems. Application to the management of electricity production

    International Nuclear Information System (INIS)

    Chiche, A.

    2012-01-01

    This manuscript deals with large-scale optimization problems, and more specifically with solving the electricity unit commitment problem arising at EDF. First, we focused on the augmented Lagrangian algorithm. The behavior of that algorithm on an infeasible convex quadratic optimization problem is analyzed. It is shown that the algorithm finds a point that satisfies the shifted constraints with the smallest possible shift in the sense of the Euclidean norm, and that it minimizes the objective on the corresponding shifted constrained set. The convergence to such a point is realized at a global linear rate, which depends explicitly on the augmentation parameter. This suggests a rule for determining the augmentation parameter to control the speed of convergence of the shifted constraint norm to zero. This rule has the advantage of generating bounded augmentation parameters even when the problem is infeasible. As a by-product, the algorithm computes the smallest translation in the Euclidean norm that makes the constraints feasible. Furthermore, this work provides solution methods for industrial stochastic optimization problems decomposed on a scenario tree, based on the progressive hedging algorithm introduced by [Rockafellar and Wets, 1991]. We also focus on the convergence of that algorithm. On the one hand, we offer a counter-example showing that the algorithm could diverge if its augmentation parameter is iteratively updated. On the other hand, we show how to recover the multipliers associated with the non-dualized constraints defined on the scenario tree from those associated with the corresponding constraints of the scenario subproblems. Their convergence is also analyzed for convex problems. The practical interest of these solution techniques is corroborated by numerical experiments performed on the electric production management problem. We apply the progressive hedging algorithm to a realistic industrial problem. More precisely, we solve the French medium
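    The augmented Lagrangian iteration analyzed in the thesis can be sketched on a tiny equality-constrained quadratic program; the matrices, the augmentation parameter, and the exact inner minimization below are illustrative choices, not EDF's production code.

```python
import numpy as np

# Equality-constrained convex QP:  minimize 0.5*||x||^2  subject to  A x = b.
# Each outer iteration minimizes the augmented Lagrangian exactly (a single
# linear solve here), then performs the first-order multiplier update.
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
r = 10.0                                   # augmentation parameter
lam = np.zeros(1)                          # Lagrange multiplier estimate
x = np.zeros(2)
for _ in range(50):
    # x = argmin_x 0.5*x.x + lam.(A x - b) + (r/2)*||A x - b||^2
    x = np.linalg.solve(np.eye(2) + r * A.T @ A, r * A.T @ b - A.T @ lam)
    lam = lam + r * (A @ x - b)            # multiplier (dual) update
# Converges to x* = [0.5, 0.5] with multiplier lam* = -0.5
```

    The multiplier error contracts linearly at each outer iteration, and the rate improves as r grows, which is the trade-off behind the thesis's rule for choosing the augmentation parameter.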

  20. Implementing interactive decision support: A case for combining cyberinfrastructure, data fusion, and social process to mobilize scientific knowledge in sustainability problems

    Science.gov (United States)

    Pierce, S. A.

    2014-12-01

    Geosciences are becoming increasingly data intensive, particularly in relation to sustainability problems, which are multi-dimensional, weakly structured and characterized by high levels of uncertainty. In the case of complex resource management problems, the challenge is to extract meaningful information from data and make sense of it. Simultaneously, scientific knowledge alone is insufficient to change practice. Creating tools, and group decision support processes for end users to interact with data are key challenges to transforming science-based information into actionable knowledge. The ENCOMPASS project began as a multi-year case study in the Atacama Desert of Chile to design and implement a knowledge transfer model for energy-water-mining conflicts in the region. ENCOMPASS combines the use of cyberinfrastructure (CI), automated data collection, interactive interfaces for dynamic decision support, and participatory modelling to support social learning. A pilot version of the ENCOMPASS CI uses open source systems and serves as a structure to integrate and store multiple forms of data and knowledge, such as DEM, meteorological, water quality, geomicrobiological, energy demand, and groundwater models. In the case study, informatics and data fusion needs related to scientific uncertainty around deep groundwater flowpaths and energy-water connections. Users may upload data from field sites with handheld devices or desktops. Once uploaded, data assets are accessible for a variety of uses. To address multi-attributed decision problems in the Atacama region a standalone application with touch-enabled interfaces was created to improve real-time interactions with datasets by groups. The tool was used to merge datasets from the ENCOMPASS CI to support exploration among alternatives and build shared understanding among stakeholders. To date, the project has increased technical capacity among stakeholders, resulted in the creation of both for-profit and non

  1. Problems of new processes in coking industry discussed at the Scientific Council of the GKNT and the Section of Coking Industry of the Scientific and Technological Council of the Ministry of Iron and Steel Industry of the USSR

    Energy Technology Data Exchange (ETDEWEB)

    Ermolova, V.P.

    1983-03-01

    A report is given on the conference on new processes in the Soviet coking industry held in Moscow on 28-29 October 1982. The following papers were delivered: implementing research programs on new coking processes; new trends in coking plant design and construction in the 11th five-year plan and in the period until 1990; increasing the efficiency of the coking industry; and the research program of the Scientific Council in 1983. The report concentrates on new trends in coking plant design. Considering that the raw material base of the eastern regions of the USSR is poor in high-quality coking coal, the new coking processes and technologies should use weakly coking coal. The following schemes are discussed: heat treatment (especially useful for black coal from the Kuzbass, characterized by strong fluctuations in petrology), selective crushing (schemes developed by the VUKhIN Institute), and partial briquetting of coal mixtures (experiments carried out by the UKhIN in the Donbass in 1982). Other coking problems are also described, such as the development of new systems of smokeless fuel feeding to coke ovens, dry coke quenching, increasing the capacity of coke chambers to 41.6 m³, environmental protection in the coking industry, and the production of special coke types from poor-quality coal for metallurgy.

  2. Proceedings of 5. international scientific conference 'Sakharov readings 2005: Ecological problems of XXI century'. Pt. 1; Materialy 5-oj mezhdunarodnoj nauchnoj konferentsii 'Sakharovskie chteniya 2005 goda: Ehkologicheskie problemy XXI veka'. Ch. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kundas, S P; Okeanov, A E [International A. Sakharov environmental univ., Minsk (Belarus); Shevchuk, V E [Kamiteht pa prablemam nastupstvaw katastrofy na Charnobyl' skaj AEhS pry Savetse Ministraw Rehspubliki Belarus' , Minsk (Belarus)

    2005-05-15

    The first part of the proceedings of the fifth international scientific conference 'Sakharov readings 2005: Ecological problems of the XXI century', held at the International A. Sakharov Environmental University, contains materials on the following topics: socio-ecological problems, medical ecology, and biological ecology. The proceedings are intended for specialists in the field of ecology and related sciences, teachers, students and post-graduate students. (authors)

  3. An adaptive large neighborhood search heuristic for Two-Echelon Vehicle Routing Problems arising in city logistics

    Science.gov (United States)

    Hemmelmayr, Vera C.; Cordeau, Jean-François; Crainic, Teodor Gabriel

    2012-01-01

    In this paper, we propose an adaptive large neighborhood search heuristic for the Two-Echelon Vehicle Routing Problem (2E-VRP) and the Location Routing Problem (LRP). The 2E-VRP arises in two-level transportation systems such as those encountered in the context of city logistics. In such systems, freight arrives at a major terminal and is shipped through intermediate satellite facilities to the final customers. The LRP can be seen as a special case of the 2E-VRP in which vehicle routing is performed only at the second level. We have developed new neighborhood search operators by exploiting the structure of the two problem classes considered and have also adapted existing operators from the literature. The operators are used in a hierarchical scheme reflecting the multi-level nature of the problem. Computational experiments conducted on several sets of instances from the literature show that our algorithm outperforms existing solution methods for the 2E-VRP and achieves excellent results on the LRP. PMID:23483764

  4. Numerical and analytical approaches to an advection-diffusion problem at small Reynolds number and large Péclet number

    Science.gov (United States)

    Fuller, Nathaniel J.; Licata, Nicholas A.

    2018-05-01

    Obtaining a detailed understanding of the physical interactions between a cell and its environment often requires information about the flow of fluid surrounding the cell. Cells must be able to effectively absorb and discard material in order to survive. Strategies for nutrient acquisition and toxin disposal, which have been evolutionarily selected for their efficacy, should reflect knowledge of the physics underlying this mass transport problem. Motivated by these considerations, in this paper we discuss the results from an undergraduate research project on the advection-diffusion equation at small Reynolds number and large Péclet number. In particular, we consider the problem of mass transport for a Stokesian spherical swimmer. We approach the problem numerically and analytically through a rescaling of the concentration boundary layer. A biophysically motivated first-passage problem for the absorption of material by the swimming cell demonstrates quantitative agreement between the numerical and analytical approaches. We conclude by discussing the connections between our results and the design of smart toxin disposal systems.
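    A minimal numerical counterpart to the boundary-layer discussion is the one-dimensional steady advection-diffusion equation, whose solution develops a layer of width about 1/Pe at large Péclet number. The sketch below (central finite differences, with made-up parameter values) is a simplified stand-in for the spherical swimmer problem, which the paper treats by rescaling the concentration boundary layer.

```python
import numpy as np

# Steady advection-diffusion on [0, 1]:  c'' - Pe * c' = 0,  c(0)=0, c(1)=1.
# The exact solution c(x) = (exp(Pe*x) - 1) / (exp(Pe) - 1) concentrates all
# variation in a boundary layer of width ~1/Pe near x = 1.
Pe, n = 20.0, 400
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
# Central-difference coefficients for the n-1 interior nodes
main = np.full(n - 1, -2.0 / h**2)
off_u = np.full(n - 2, 1.0 / h**2 - Pe / (2 * h))   # superdiagonal (c_{i+1})
off_l = np.full(n - 2, 1.0 / h**2 + Pe / (2 * h))   # subdiagonal   (c_{i-1})
A = np.diag(main) + np.diag(off_u, 1) + np.diag(off_l, -1)
rhs = np.zeros(n - 1)
rhs[-1] -= 1.0 / h**2 - Pe / (2 * h)                # boundary term from c(1) = 1
c = np.zeros(n + 1)
c[-1] = 1.0
c[1:-1] = np.linalg.solve(A, rhs)
exact = (np.exp(Pe * x) - 1.0) / (np.exp(Pe) - 1.0)
err = np.max(np.abs(c - exact))
```

    Note the stability requirement Pe * h / 2 < 1 for central differences: the mesh must resolve the boundary layer, which is exactly why the paper's analytical rescaling becomes valuable at very large Péclet number.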

  5. An adaptive large neighborhood search heuristic for Two-Echelon Vehicle Routing Problems arising in city logistics.

    Science.gov (United States)

    Hemmelmayr, Vera C; Cordeau, Jean-François; Crainic, Teodor Gabriel

    2012-12-01

    In this paper, we propose an adaptive large neighborhood search heuristic for the Two-Echelon Vehicle Routing Problem (2E-VRP) and the Location Routing Problem (LRP). The 2E-VRP arises in two-level transportation systems such as those encountered in the context of city logistics. In such systems, freight arrives at a major terminal and is shipped through intermediate satellite facilities to the final customers. The LRP can be seen as a special case of the 2E-VRP in which vehicle routing is performed only at the second level. We have developed new neighborhood search operators by exploiting the structure of the two problem classes considered and have also adapted existing operators from the literature. The operators are used in a hierarchical scheme reflecting the multi-level nature of the problem. Computational experiments conducted on several sets of instances from the literature show that our algorithm outperforms existing solution methods for the 2E-VRP and achieves excellent results on the LRP.
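
    The adaptive destroy-and-repair loop that the abstract describes can be sketched on a toy problem. The skeleton below applies ALNS to a small TSP rather than the paper's 2E-VRP: two removal operators compete, solutions are rebuilt by greedy insertion, and operator selection weights are updated from recent success. All operator choices, parameter values, and names here are illustrative assumptions, not the authors' implementation:

```python
import math
import random

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def random_removal(tour, k, rng):
    removed = rng.sample(tour, k)
    return [c for c in tour if c not in removed], removed

def worst_removal(tour, k, pts):
    # remove the cities whose presence adds the largest detour cost
    def detour(i):
        a, b, c = tour[i - 1], tour[i], tour[(i + 1) % len(tour)]
        return (math.dist(pts[a], pts[b]) + math.dist(pts[b], pts[c])
                - math.dist(pts[a], pts[c]))
    idx = sorted(range(len(tour)), key=detour, reverse=True)[:k]
    removed = [tour[i] for i in idx]
    return [c for c in tour if c not in removed], removed

def greedy_insert(partial, removed, pts):
    tour = partial[:]
    for c in removed:
        best_pos, best_cost = 0, float("inf")
        for i in range(len(tour)):
            a, b = tour[i - 1], tour[i]
            cost = (math.dist(pts[a], pts[c]) + math.dist(pts[c], pts[b])
                    - math.dist(pts[a], pts[b]))
            if cost < best_cost:
                best_pos, best_cost = i, cost
        tour.insert(best_pos, c)
    return tour

def alns_tsp(pts, iters=2000, seed=0):
    rng = random.Random(seed)
    n = len(pts)
    destroyers = [lambda t, k: random_removal(t, k, rng),
                  lambda t, k: worst_removal(t, k, pts)]
    weights = [1.0, 1.0]
    cur = list(range(n))
    best, best_len = cur[:], tour_length(cur, pts)
    cur_len = best_len
    for _ in range(iters):
        i = rng.choices(range(len(destroyers)), weights)[0]
        partial, removed = destroyers[i](cur, max(2, n // 5))
        cand = greedy_insert(partial, removed, pts)
        cand_len = tour_length(cand, pts)
        score = 0.0
        if cand_len < cur_len:               # accept improving moves only
            cur, cur_len, score = cand, cand_len, 1.0
            if cand_len < best_len:
                best, best_len, score = cand[:], cand_len, 3.0
        # adaptive weight update: operators that improved get chosen more often
        weights[i] = 0.8 * weights[i] + 0.2 * score
    return best, best_len
```

    A real ALNS would add a worse-solution acceptance criterion (e.g. simulated-annealing style) and problem-specific operators such as the satellite-level moves developed in the paper.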

  6. Robust branch-cut-and-price for the Capacitated Minimum Spanning Tree problem over a large extended formulation

    DEFF Research Database (Denmark)

    Uchoa, Eduardo; Fukasawa, Ricardo; Lysgaard, Jens

    2008-01-01

    This paper presents a robust branch-cut-and-price algorithm for the Capacitated Minimum Spanning Tree Problem (CMST). The variables are associated to q-arbs, a structure that arises from a relaxation of the capacitated prize-collecting arborescence problem in order to make it solvable in pseudo-polynomial time. Traditional inequalities over the arc formulation, like Capacity Cuts, are also used. Moreover, a novel feature is introduced in such kind of algorithms: powerful new cuts expressed over a very large set of variables are added, without increasing the complexity of the pricing subproblem or the size of the LPs that are actually solved. Computational results on benchmark instances from the OR-Library show very significant improvements over previous algorithms. Several open instances could be solved to optimality.

  7. The main environmental and social problems in China's large coal mine construction and the countermeasures

    Energy Technology Data Exchange (ETDEWEB)

    Geng, Hai-qing [State Environmental Protection Administration, Beijing (China). Appraisal Center for Environment and Engineering

    2008-05-15

    Due to rapid industrialization and urbanization, the number of China's large proposed coal mines has increased very quickly in recent years. 144 environmental impact assessment reports were submitted to SEPA during 2001 to 2006, so the problem of reconciling coal exploitation with the environmental and social impacts has become urgent in China's sustainable development strategy. Based on analysis of data on coal mine exploitation, it is pointed out that there are four main problems in the management of the coal sector: the SEA (Strategic Environmental Assessment) lags behind the practical needs; the policy is not clear; migration is neglected; and an ecological compensatory mechanism is absent. Four countermeasures are recommended: accelerating the execution of SEA; compartmentalizing typical zones for environmental management; improving the organisation of resettlement in the coal district; and installing an ecological compensatory mechanism in the coal mining industry. 13 refs., 2 figs., 1 tab.

  8. Scientific Notes (Problems of Cybernetics)

    Science.gov (United States)

    1960-10-18

    ...methods worked out by I. P. Pavlov. The attainments of neurocybernetics are opening new prospects for the investigation ... created only for a definite state of the environment, and will expire if the environment changes. Furthermore, its potentialities are no... previous activities and to switch from one program of activity to another. In this connection, the experiments conducted by the famous Polish...

  9. Scientific publications and research groups on alcohol consumption and related problems worldwide: authorship analysis of papers indexed in PubMed and Scopus databases (2005 to 2009).

    Science.gov (United States)

    González-Alcaide, Gregorio; Castelló-Cogollos, Lourdes; Castellano-Gómez, Miguel; Agullo-Calatayud, Víctor; Aleixandre-Benavent, Rafael; Alvarez, Francisco Javier; Valderrama-Zurián, Juan Carlos

    2013-01-01

    The research of alcohol consumption-related problems is a multidisciplinary field. The aim of this study is to analyze the worldwide scientific production in the area of alcohol-drinking and alcohol-related problems from 2005 to 2009. A MEDLINE and Scopus search on alcohol (alcohol-drinking and alcohol-related problems) published from 2005 to 2009 was carried out. Using bibliometric indicators, the distribution of the publications was determined within the journals that publish said articles, specialty of the journal (broad subject terms), article type, language of the publication, and country where the journal is published. Also, authorship characteristics were assessed (collaboration index and number of authors who have published more than 9 documents). The existing research groups were also determined. About 24,100 documents on alcohol, published in 3,862 journals, and authored by 69,640 authors were retrieved from MEDLINE and Scopus between the years 2005 and 2009. The collaboration index of the articles was 4.83 ± 3.7. The number of consolidated research groups in the field was identified as 383, with 1,933 authors. Documents on alcohol were published mainly in journals covering the field of "Substance-Related Disorders," 23.18%, followed by "Medicine," 8.7%, "Psychiatry," 6.17%, and "Gastroenterology," 5.25%. Research on alcohol is a consolidated field, with an average of 4,820 documents published each year between 2005 and 2009 in MEDLINE and Scopus. Alcohol-related publications have a marked multidisciplinary nature. Collaboration was common among alcohol researchers. There is an underrepresentation of alcohol-related publications in languages other than English and from developing countries, in MEDLINE and Scopus databases. Copyright © 2012 by the Research Society on Alcoholism.
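
    The collaboration index reported above (4.83 ± 3.7) is simply the mean ± standard deviation of the number of authors per document. A small helper (ours, not the study's code) makes the definition concrete:

```python
def collaboration_stats(author_counts):
    """Collaboration index: mean (and population SD) of authors per document."""
    n = len(author_counts)
    mean = sum(author_counts) / n
    var = sum((a - mean) ** 2 for a in author_counts) / n
    return mean, var ** 0.5
```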

  10. Health problems awareness during travel among faculty members of a large university in Latin America: preliminary report

    Directory of Open Access Journals (Sweden)

    Ana Cristina Nakamura Tome

    2013-02-01

    Full Text Available Health safety during trips is based on previous counseling, vaccination and prevention of infections, previous diseases or specific problems related to the destination. Our aim was to assess two aspects, incidence of health problems related to travel and the traveler's awareness of health safety. To this end we phone-interviewed faculty members of a large public University, randomly selected from humanities, engineering and health schools. Out of 520 attempts, we were able to contact 67 (12.9%) and 46 (68.6%) agreed to participate in the study. There was a large male proportion (37/44, 84.1%), mature adults mostly in their forties and fifties (32/44, 72.7%), all of them with higher education, as you would expect of faculty members. Most described themselves as being sedentary or as taking occasional exercise, with only 15.9% (7/44) taking regular exercise. Preexisting diseases were reported by 15 travelers. Most trips lasted usually one week or less. Duration of the travel was related to the destination, with 12-h or longer trips being taken by 68.2% (30/44) of travelers, and the others taking shorter (3-h) domestic trips. Most travelling was made by air (41/44) and only 31.8% (14/44) of the trips were motivated by leisure. Field research trips were not reported. Specific health counseling previous to travel was reported only by two (4.5%). Twenty seven of them (61.4%) reported updated immunization, but 11/30 reported unchecked immunizations. 30% (9/30) reported travel without any health insurance coverage. As a whole group, 6 (13.6%) travelers reported at least one health problem attributed to the trip. All of them were males travelling abroad. Five presented respiratory infections, such as influenza and common cold, one neurological, one orthopedic, one social and one hypertension. There were no gender differences regarding age groups, destination, type of transport, previous health counseling, leisure travel motivation or pre-existing diseases.

  11. Health problems awareness during travel among faculty members of a large university in Latin America: preliminary report.

    Science.gov (United States)

    Tome, Ana Cristina Nakamura; Canello, Thaís Brandi; Luna, Expedito José de Albuquerque; Andrade Junior, Heitor Franco de

    2013-01-01

    Health safety during trips is based on previous counseling, vaccination and prevention of infections, previous diseases or specific problems related to the destination. Our aim was to assess two aspects, incidence of health problems related to travel and the traveler's awareness of health safety. To this end we phone-interviewed faculty members of a large public University, randomly selected from humanities, engineering and health schools. Out of 520 attempts, we were able to contact 67 (12.9%) and 46 (68.6%) agreed to participate in the study. There was a large male proportion (37/44, 84.1%), mature adults mostly in their forties and fifties (32/44, 72.7%), all of them with higher education, as you would expect of faculty members. Most described themselves as being sedentary or as taking occasional exercise, with only 15.9% (7/44) taking regular exercise. Preexisting diseases were reported by 15 travelers. Most trips lasted usually one week or less. Duration of the travel was related to the destination, with 12-h or longer trips being taken by 68.2% (30/44) of travelers, and the others taking shorter (3-h) domestic trips. Most travelling was made by air (41/44) and only 31.8% (14/44) of the trips were motivated by leisure. Field research trips were not reported. Specific health counseling previous to travel was reported only by two (4.5%). Twenty seven of them (61.4%) reported updated immunization, but 11/30 reported unchecked immunizations. 30% (9/30) reported travel without any health insurance coverage. As a whole group, 6 (13.6%) travelers reported at least one health problem attributed to the trip. All of them were males travelling abroad. Five presented respiratory infections, such as influenza and common cold, one neurological, one orthopedic, one social and one hypertension. There were no gender differences regarding age groups, destination, type of transport, previous health counseling, leisure travel motivation or pre-existing diseases. Interestingly

  12. Scalable posterior approximations for large-scale Bayesian inverse problems via likelihood-informed parameter and state reduction

    Science.gov (United States)

    Cui, Tiangang; Marzouk, Youssef; Willcox, Karen

    2016-06-01

    Two major bottlenecks to the solution of large-scale Bayesian inverse problems are the scaling of posterior sampling algorithms to high-dimensional parameter spaces and the computational cost of forward model evaluations. Yet incomplete or noisy data, the state variation and parameter dependence of the forward model, and correlations in the prior collectively provide useful structure that can be exploited for dimension reduction in this setting: both in the parameter space of the inverse problem and in the state space of the forward model. To this end, we show how to jointly construct low-dimensional subspaces of the parameter space and the state space in order to accelerate the Bayesian solution of the inverse problem. As a byproduct of state dimension reduction, we also show how to identify low-dimensional subspaces of the data in problems with high-dimensional observations. These subspaces enable approximation of the posterior as a product of two factors: (i) a projection of the posterior onto a low-dimensional parameter subspace, wherein the original likelihood is replaced by an approximation involving a reduced model; and (ii) the marginal prior distribution on the high-dimensional complement of the parameter subspace. We present and compare several strategies for constructing these subspaces using only a limited number of forward and adjoint model simulations. The resulting posterior approximations can rapidly be characterized using standard sampling techniques, e.g., Markov chain Monte Carlo. Two numerical examples demonstrate the accuracy and efficiency of our approach: inversion of an integral equation in atmospheric remote sensing, where the data dimension is very high; and the inference of a heterogeneous transmissivity field in a groundwater system, which involves a partial differential equation forward model with high dimensional state and parameters.
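
    For a linear-Gaussian special case, the likelihood-informed parameter subspace the abstract describes can be sketched as the dominant eigenspace of the prior-preconditioned data-misfit Hessian: directions with large eigenvalues are those where the data dominate the prior. The sketch below assumes a linear forward map `G`, Gaussian noise precision and prior covariance; the function name and this simplification are ours, not the authors' general algorithm.

```python
import numpy as np

def likelihood_informed_subspace(G, noise_prec, prior_cov, r):
    """Top-r directions of the prior-preconditioned Gauss-Newton Hessian
    S = C^{1/2} G^T Gamma_obs^{-1} G C^{1/2}; large eigenvalues mark
    directions where the likelihood is informative relative to the prior."""
    H = G.T @ noise_prec @ G                      # data-misfit Hessian
    w, V = np.linalg.eigh(prior_cov)
    C_half = V @ np.diag(np.sqrt(w)) @ V.T        # symmetric sqrt of prior covariance
    S = C_half @ H @ C_half
    lam, U = np.linalg.eigh(S)                    # eigh returns ascending order
    order = np.argsort(lam)[::-1]
    # return leading eigenvalues and the corresponding parameter-space basis
    return lam[order][:r], C_half @ U[:, order[:r]]
```

    With an identity prior and a diagonal forward map, the returned eigenvalues are simply the squared singular values of `G`, which makes the behavior easy to check.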

  13. Sleep in a large, multi-university sample of college students: sleep problem prevalence, sex differences, and mental health correlates.

    Science.gov (United States)

    Becker, Stephen P; Jarrett, Matthew A; Luebbe, Aaron M; Garner, Annie A; Burns, G Leonard; Kofler, Michael J

    2018-04-01

    To (1) describe sleep problems in a large, multi-university sample of college students; (2) evaluate sex differences; and (3) examine the unique associations of mental health symptoms (i.e., anxiety, depression, attention-deficit/hyperactivity disorder inattention [ADHD-IN], ADHD hyperactivity-impulsivity [ADHD-HI]) in relation to sleep problems. 7,626 students (70% female; 81% White) ages 18-29 years (M=19.14, SD=1.42) from six universities completed measures assessing mental health symptoms and the Pittsburgh Sleep Quality Index (PSQI). A substantial minority of students endorsed sleep problems across specific sleep components. Specifically, 27% described their sleep quality as poor, 36% reported obtaining less than 7 hours of sleep per night, and 43% reported that it takes >30 minutes to fall asleep at least once per week. 62% of participants met cut-off criteria for poor sleep, though rates differed between females (64%) and males (57%). In structural regression models, both anxiety and depression symptoms were uniquely associated with disruptions in most PSQI sleep component domains. However, anxiety (but not depression) symptoms were uniquely associated with more sleep disturbances and sleep medication use, whereas depression (but not anxiety) symptoms were uniquely associated with increased daytime dysfunction. ADHD-IN symptoms were uniquely associated with poorer sleep quality and increased daytime dysfunction, whereas ADHD-HI symptoms were uniquely associated with more sleep disturbances and less daytime dysfunction. Lastly, ADHD-IN, anxiety, and depression symptoms were each independently associated with poor sleep status. This study documents a high prevalence of poor sleep among college students, some sex differences, and distinct patterns of mental health symptoms in relation to sleep problems. Copyright © 2018. Published by Elsevier Inc.

  14. Computing the full spectrum of large sparse palindromic quadratic eigenvalue problems arising from surface Green's function calculations

    Science.gov (United States)

    Huang, Tsung-Ming; Lin, Wen-Wei; Tian, Heng; Chen, Guan-Hua

    2018-03-01

    The full spectrum of a large sparse ⊤-palindromic quadratic eigenvalue problem (⊤-PQEP) is considered, arguably for the first time, in this article. Such a problem is posed by the calculation of surface Green's functions (SGFs) of mesoscopic transistors with a tremendous non-periodic cross-section. For this problem, general purpose eigensolvers are not efficient, nor is it advisable to resort to the decimation method etc. to obtain the Wiener-Hopf factorization. After reviewing some rigorous understanding of SGF calculation from the perspective of the ⊤-PQEP and nonlinear matrix equations, we present our new approach to this problem. In a nutshell, the unit disk where the spectrum of interest lies is broken down adaptively into pieces small enough that they each can be locally tackled by the generalized ⊤-skew-Hamiltonian implicitly restarted shift-and-invert Arnoldi (G⊤SHIRA) algorithm with suitable shifts and other parameters, and the eigenvalues missed by this divide-and-conquer strategy can be recovered thanks to the accurate estimation provided by our newly developed scheme. Notably, a novel non-equivalence deflation is proposed to avoid as much as possible the duplication of nearby known eigenvalues when a new shift of G⊤SHIRA is determined. We demonstrate our new approach by calculating the SGF of a realistic nanowire whose unit cell is described by a matrix of size 4000 × 4000 at the density functional tight binding level, corresponding to an 8 × 8 nm² cross-section. We believe that quantum transport simulation of realistic nano-devices in the mesoscopic regime will greatly benefit from this work.
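
    The defining property of a ⊤-palindromic quadratic P(λ) = λ²A + λQ + Aᵀ (with Q symmetric) is that its eigenvalues come in reciprocal pairs (λ, 1/λ), since λ²P(1/λ)ᵀ = P(λ). This pairing can be verified on a small dense example via a standard companion linearization; this toy sketch (dense, tiny) is only an illustration of the problem structure, not the G⊤SHIRA algorithm of the paper.

```python
import numpy as np

def palindromic_qep_eigs(A, Q):
    """Eigenvalues of the T-palindromic quadratic P(lam) = lam^2*A + lam*Q + A.T
    (Q symmetric), via companion linearization to a generalized eigenproblem:
    [[0, I], [-A.T, -Q]] y = lam * [[I, 0], [0, A]] y, with y = [x; lam*x]."""
    n = A.shape[0]
    I = np.eye(n)
    Z = np.zeros((n, n))
    L = np.block([[Z, I], [-A.T, -Q]])
    M = np.block([[I, Z], [Z, A]])
    return np.linalg.eigvals(np.linalg.solve(M, L))
```

    For a random invertible A the 2n eigenvalues should split into n reciprocal pairs, which is exactly what dedicated palindromic solvers exploit to preserve the pairing under rounding.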

  15. Biomedical ontologies: toward scientific debate.

    Science.gov (United States)

    Maojo, V; Crespo, J; García-Remesal, M; de la Iglesia, D; Perez-Rey, D; Kulikowski, C

    2011-01-01

    Biomedical ontologies have been very successful in structuring knowledge for many different applications, receiving widespread praise for their utility and potential. Yet, the role of computational ontologies in scientific research, as opposed to knowledge management applications, has not been extensively discussed. We aim to stimulate further discussion on the advantages and challenges presented by biomedical ontologies from a scientific perspective. We review various aspects of biomedical ontologies going beyond their practical successes, and focus on some key scientific questions in two ways. First, we analyze and discuss current approaches to improve biomedical ontologies that are based largely on classical, Aristotelian ontological models of reality. Second, we raise various open questions about biomedical ontologies that require further research, analyzing in more detail those related to visual reasoning and spatial ontologies. We outline significant scientific issues that biomedical ontologies should consider, beyond current efforts of building practical consensus between them. For spatial ontologies, we suggest an approach for building "morphospatial" taxonomies, as an example that could stimulate research on fundamental open issues for biomedical ontologies. Analysis of a large number of problems with biomedical ontologies suggests that the field is very much open to alternative interpretations of current work, and in need of scientific debate and discussion that can lead to new ideas and research directions.

  16. Scientific Management Still Endures in Education

    Science.gov (United States)

    Ireh, Maduakolam

    2016-01-01

    Some schools in America have changed, while others remain unchanged due largely to the accretion of small adjustments in what remains a very traditional enterprise. The problem is rooted in the propagation and adoption of scientific management by educators who applied and/or continues to apply it to education to restore order and for…

  17. Scientific Council on problems on new processes in the coking industry. [Effect on coke consumption of moisture, sulfur and ash; substitution possibility

    Energy Technology Data Exchange (ETDEWEB)

    Filippov, B.S.

    1981-07-01

    This paper presents a report on the Coking Section of the Scientific Council held on November 20, 1980 in Moscow. The following problems were discussed: indexes characterizing blast furnace coke (for furnaces with a volume of 5,580 m³); replacing metallurgical coke with other types of fuels; use of brown coal; liners of coke ovens. Papers delivered during the session are summarized. Reducing the moisture content of blast furnace coke permits its consumption to be reduced by 2%. Reducing the sulfur content of blast furnace coke by 0.1% permits its consumption to be reduced by 10 to 15 kg per 1 t of pig iron. An increase in the ash content of coke by 1% causes a coke consumption increase ranging from 1.5 to 2.0%. About 10 Mmt of coke of the size class above 25 mm is used in the USSR for purposes other than blast furnaces. Possibilities of substituting coke with lean coal are evaluated (particularly from the Kuzbass). A method for briquetting a mixture of black and brown coal is proposed. Briquets are a suitable fuel in metallurgy. A new type of liner, which consists of at least 92% silicon dioxide, is described. Physical and mechanical properties of the liners are discussed.

  18. A Nonlinear Multiobjective Bilevel Model for Minimum Cost Network Flow Problem in a Large-Scale Construction Project

    Directory of Open Access Journals (Sweden)

    Jiuping Xu

    2012-01-01

    Full Text Available The aim of this study is to deal with a minimum cost network flow problem (MCNFP in a large-scale construction project using a nonlinear multiobjective bilevel model with birandom variables. The main target of the upper level is to minimize both direct and transportation time costs. The target of the lower level is to minimize transportation costs. After an analysis of the birandom variables, an expectation multiobjective bilevel programming model with chance constraints is formulated to incorporate decision makers’ preferences. To solve the identified special conditions, an equivalent crisp model is proposed with an additional multiobjective bilevel particle swarm optimization (MOBLPSO developed to solve the model. The Shuibuya Hydropower Project is used as a real-world example to verify the proposed approach. Results and analysis are presented to highlight the performances of the MOBLPSO, which is very effective and efficient compared to a genetic algorithm and a simulated annealing algorithm.
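
    The MOBLPSO of the abstract is a multiobjective bilevel method; as a minimal illustration of the underlying particle swarm mechanics only, a single-objective global-best PSO can be sketched as follows. Velocities blend inertia, a pull toward each particle's own best point, and a pull toward the swarm's best; all parameter values are conventional defaults, not the paper's settings:

```python
import random

def pso_minimize(f, dim, bounds, n_particles=30, iters=300, seed=1):
    """Minimal global-best particle swarm optimization of f over a box."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pbest_val = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia, cognitive and social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (pbest[i][d] - X[i][d])
                           + c2 * rng.random() * (gbest[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))  # clamp to box
            val = f(X[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = X[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = X[i][:], val
    return gbest, gbest_val
```

    Bilevel and multiobjective variants wrap machinery around this core: evaluating a particle then requires solving the lower-level problem, and "best" is judged by Pareto dominance rather than a single value.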

  19. Third-order-accurate numerical methods for efficient, large time-step solutions of mixed linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Cobb, J.W.

    1995-02-01

    There is an increasing need for more accurate numerical methods for large-scale nonlinear magneto-fluid turbulence calculations. These methods should not only increase the current state of the art in terms of accuracy, but should also continue to optimize other desired properties such as simplicity, minimized computation, minimized memory requirements, and robust stability. This includes the ability to stably solve stiff problems with long time-steps. This work discusses a general methodology for deriving higher-order numerical methods. It also discusses how the selection of various choices can affect the desired properties. The explicit discussion focuses on third-order Runge-Kutta methods, including general solutions and five examples. The study investigates the linear numerical analysis of these methods, including their accuracy, general stability, and stiff stability. Additional appendices discuss linear multistep methods, discuss directions for further work, and exhibit numerical analysis results for some other commonly used lower-order methods.
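
    A standard instance of the third-order Runge-Kutta family discussed above is Kutta's classical three-stage scheme; the sketch below (a textbook method, not necessarily one of the report's five examples) implements it and lets its third-order convergence be checked numerically, since halving the step should reduce the global error by roughly a factor of eight:

```python
def rk3_step(f, t, y, h):
    """One step of Kutta's classical third-order Runge-Kutta method."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h * k1 / 2)
    k3 = f(t + h, y - h * k1 + 2 * h * k2)
    return y + h * (k1 + 4 * k2 + k3) / 6

def integrate(f, t0, y0, t1, n):
    """Integrate y' = f(t, y) from t0 to t1 in n fixed steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = rk3_step(f, t, y, h)
        t += h
    return y
```

    The report's focus is on choosing among such schemes for additional properties, in particular stiff stability with long time-steps, which an explicit method like this one does not by itself provide.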

  20. Anza palaeoichnological site. Late Cretaceous. Morocco. Part II. Problems of large dinosaur trackways and the first African Macropodosaurus trackway

    Science.gov (United States)

    Masrour, Moussa; Lkebir, Noura; Pérez-Lorente, Félix

    2017-10-01

    The Anza site shows large ichnological surfaces indicating the coexistence in the same area of different vertebrate footprints (dinosaur and pterosaur) and of different types (tridactyl and tetradactyl, semiplantigrade and rounded without digit marks) and the footprint variability of long trackways. This area may become a world reference in ichnology because it contains the second undebatable African site with Cretaceous pterosaur footprints - described in part I - and the first African site with Macropodosaurus footprints. In this work, problems related to long trackways are also analyzed, such as their sinuosity, the order-disorder of the variability (long-short) of the pace length and the difficulty of morphological classification of the theropod footprints due to their morphological variability.

  1. [A new stage of development of gerontology and geriatrics in Russia: Problems o creation of a geriatric care system. Part 2. The structure of the system, scientific approach].

    Science.gov (United States)

    Anisimov, V N; Serpov, V Yu; Finagentov, A V; Khavinson, V Kh

    2017-01-01

    The publication is the second part of an analytical review on the new stage of development of gerontology and geriatrics in Russia. The components of the social support system for senior citizens, and the structure of social-medical care as one of its crucial components, are presented. The problem of positioning geriatric care within the system of social support for senior citizens is discussed, along with its peculiarities and the algorithm for providing geriatric care. The analysis of this algorithm allowed us to justify the indissoluble link and continuity of the individual components of geriatric care and its cost-effectiveness. The position of the Russian Federation Ministry of Health concerning the introduction of geriatric care as an element of the system of medical care for older citizens is examined. The pilot project «Territory of Care», proposed by the Russian Federation Ministry of Labor and Ministry of Health for the establishment of a long-term system of medical and social care for citizens of the older generation on the principles of multidisciplinary and interdepartmental interaction, is also described. Some shortcomings of the project are highlighted and recommendations for its development are given. The role of gerontology as a systemic basis for the creation of a geriatric service in Russia and for the development of integrated social and medical care for citizens of the older generation is underlined. The main priorities in the field of aging for the forthcoming decade are formulated. The most promising areas of research in the field of gerontology are discussed, the implementation of which will make it possible to realize the State social policy goals focused on the quality of life of senior citizens. Finally, the position of the Gerontological Society of the Russian Academy of Sciences on the creation of mechanisms of scientific support for the renovation of geriatric services, including collaboration with experts in the field of practical medicine, social workers, and

  2. Statistical process control charts for attribute data involving very large sample sizes: a review of problems and solutions.

    Science.gov (United States)

    Mohammed, Mohammed A; Panesar, Jagdeep S; Laney, David B; Wilson, Richard

    2013-04-01

    The use of statistical process control (SPC) charts in healthcare is increasing. The primary purpose of SPC is to distinguish between common-cause variation which is attributable to the underlying process, and special-cause variation which is extrinsic to the underlying process. This is important because improvement under common-cause variation requires action on the process, whereas special-cause variation merits an investigation to first find the cause. Nonetheless, when dealing with attribute or count data (eg, number of emergency admissions) involving very large sample sizes, traditional SPC charts often produce tight control limits with most of the data points appearing outside the control limits. This can give a false impression of common and special-cause variation, and potentially misguide the user into taking the wrong actions. Given the growing availability of large datasets from routinely collected databases in healthcare, there is a need to present a review of this problem (which arises because traditional attribute charts only consider within-subgroup variation) and its solutions (which consider within and between-subgroup variation), which involve the use of the well-established measurements chart and the more recently developed attribute charts based on Laney's innovative approach. We close by making some suggestions for practice.
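
    Laney's adjustment described above can be sketched in three steps: standardize each subgroup proportion, estimate the between-subgroup variation σ_z from the average moving range of those z-scores (divided by the unbiasing constant d₂ = 1.128), and inflate the classic p-chart limits by σ_z. The function below is our sketch of that p′-chart calculation, not code from the paper:

```python
def laney_p_prime_limits(counts, sizes):
    """Laney p'-chart control limits: classic p-chart limits inflated by
    sigma_z, estimated from the moving range of standardized proportions."""
    pbar = sum(counts) / sum(sizes)
    p = [c / n for c, n in zip(counts, sizes)]
    sd = [(pbar * (1 - pbar) / n) ** 0.5 for n in sizes]   # binomial sigma per subgroup
    z = [(pi - s_pbar) / s for pi, s, s_pbar in zip(p, sd, [pbar] * len(p))]
    mr = [abs(z[i] - z[i - 1]) for i in range(1, len(z))]
    sigma_z = (sum(mr) / len(mr)) / 1.128                  # average moving range / d2
    limits = [(pbar - 3 * s * sigma_z, pbar + 3 * s * sigma_z) for s in sd]
    return pbar, sigma_z, limits
```

    With large subgroup sizes and overdispersed data, σ_z comes out well above 1 and the limits widen accordingly, avoiding the flood of false "special-cause" signals a traditional p-chart would produce.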

  3. Increasing the efficiency of the TOUGH code for running large-scale problems in nuclear waste isolation

    International Nuclear Information System (INIS)

    Nitao, J.J.

    1990-08-01

    The TOUGH code developed at Lawrence Berkeley Laboratory (LBL) is being extensively used to numerically simulate the thermal and hydrologic environment around nuclear waste packages in the unsaturated zone for the Yucca Mountain Project. At the Lawrence Livermore National Laboratory (LLNL) we have rewritten approximately 80 percent of the TOUGH code to increase its speed and incorporate new options. The geometry of many problems requires large numbers of computational elements in order to realistically model detailed physical phenomena, and, as a result, large amounts of computer time are needed. In order to increase the speed of the code we have incorporated fast linear equation solvers, vectorization of substantial portions of code, improved automatic time stepping, and implementation of table look-up for the steam table properties. These enhancements have increased the speed of the code for typical problems by a factor of 20 on the Cray 2 computer. In addition to the increase in computational efficiency we have added several options: vapor pressure lowering; equivalent continuum treatment of fractures; energy and material volumetric, mass and flux accounting; and Stefan-Boltzmann radiative heat transfer. 5 refs
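
    One of the enhancements listed, table look-up for the steam-table properties, amounts to tabulating an expensive property function once on a grid and replacing each call inside the time-stepping loop with a cheap interpolation. The generic sketch below (class name and grid size are our assumptions, not the TOUGH implementation) shows the idea for a 1-D property:

```python
import bisect

class PropertyTable:
    """Precomputed 1-D lookup table with linear interpolation, the kind of
    table look-up used to replace repeated evaluation of an expensive
    property function (e.g. steam-table calls) inside a simulation loop."""
    def __init__(self, func, lo, hi, n=1024):
        self.xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
        self.ys = [func(x) for x in self.xs]

    def __call__(self, x):
        # locate the bracketing interval and interpolate linearly
        i = bisect.bisect_right(self.xs, x)
        i = min(max(i, 1), len(self.xs) - 1)
        x0, x1 = self.xs[i - 1], self.xs[i]
        y0, y1 = self.ys[i - 1], self.ys[i]
        return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
```

    The trade-off is the classic one: a one-time tabulation cost and O(h²) interpolation error in exchange for constant, branch-light evaluation during the run.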

  4. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  5. C Versus Fortran-77 for Scientific Programming

    Directory of Open Access Journals (Sweden)

    Tom MacDonald

    1992-01-01

    Full Text Available The predominant programming language for numeric and scientific applications is Fortran-77 and supercomputers are primarily used to run large-scale numeric and scientific applications. Standard C is not widely used for numerical and scientific programming, yet Standard C provides many desirable linguistic features not present in Fortran-77. Furthermore, the existence of a standard library and preprocessor eliminates the worst portability problems. A comparison of Standard C and Fortran-77 shows several key deficiencies in C that reduce its ability to adequately solve some numerical problems. Some of these problems have already been addressed by the C standard but others remain. Standard C with a few extensions and modifications could be suitable for all numerical applications and could become more popular in supercomputing environments.

  6. Soil erosion and sediment yield, a double barrel problem in South Africa's only large river network without a dam

    Science.gov (United States)

    Le Roux, Jay

    2016-04-01

    Soil erosion not only involves the loss of fertile topsoil but is also coupled with sedimentation of dams, a double-barrel problem in semi-arid regions where water scarcity is frequent. Due to increasing water requirements in South Africa, the Department of Water and Sanitation is planning water resource development in the Mzimvubu River Catchment, the only large river network in the country without a dam. Two dams are planned: a large irrigation dam and a hydropower dam. However, previous soil erosion studies indicate that large parts of the catchment are severely eroded. Those studies, nonetheless, used mapping and modelling techniques that represent only a selection of erosion processes and provide insufficient information about sediment yield. This study maps and models the sediment yield comprehensively by means of two approaches over a five-year timeframe between 2007 and 2012. Sediment yield contribution from sheet-rill erosion was modelled with ArcSWAT (a graphical user interface for SWAT in a GIS), whereas gully erosion contributions were estimated using time-series mapping with SPOT 5 imagery followed by gully-derived sediment yield modelling in a GIS. Integration of the sheet-rill and gully results produced a total sediment yield map, with an average of 5,300 t km-2 y-1. Importantly, the annual average sediment yield of the areas where the irrigation dam and hydropower dam will be built is around 20,000 t km-2 y-1. Without catchment rehabilitation, the life expectancy of the irrigation dam and hydropower dam could be 50 and 40 years respectively.

  7. Proceedings of 6. international scientific conference 'Sakharov readings 2006: Ecological problems of XXI century'. Pt. 1; Materialy 6-oj mezhdunarodnoj nauchnoj konferentsii 'Sakharovskie chteniya 2006 goda: Ehkologicheskie problemy XXI veka'. Ch. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kundas, S P; Okeanov, A E; Poznyak, S S [International A. Sakharov environmental univ., Minsk (Belarus)

    2006-05-15

    The first part of the proceedings of the sixth international scientific conference 'Sakharov readings 2006: Ecological problems of XXI century', held at the International A. Sakharov environmental university, contains materials on the following topics: socio-ecological problems, medical ecology, biomonitoring and bioindication, and biological ecology. The proceedings are intended for specialists in the field of ecology and related sciences, teachers, students and post-graduate students. (authors)

  8. Blending Problem Based Learning and History of Science Approaches to Enhance Views about Scientific Inquiry: New Wine in an Old Bottle

    Science.gov (United States)

    Dogan, Nihal

    2017-01-01

    In 2016, the Program for International Student Assessment (PISA) showed that approximately 44.4% of students in Turkey obtained very low grades when their scientific knowledge was evaluated. In addition, the vast majority of students were shown to have no knowledge of basic scientific terms or concepts. Science teachers play a significant role in…

  9. Scientific rigor through videogames.

    Science.gov (United States)

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Problemas metodológicos en las investigaciones sobre VIH/SIDA en Bolivia Methodological problems in the scientific research on HIV /AIDS in Bolivia

    Directory of Open Access Journals (Sweden)

    Susana Ramírez Hita

    2013-05-01

    Full Text Available This paper discusses the methodological problems in the scientific research on HIV/AIDS in Bolivia, both in the areas of epidemiology and social sciences. Studies associated with this research served as the basis for the implementation of health programs run by The Global Fund, The Pan-American Health Organization, International Cooperation, Non-Governmental Organizations and the Bolivian Ministry of Health and Sports. An analysis of the methodological contradictions and weaknesses was made by reviewing the bibliography of the studies and by conducting qualitative methodological research, that was focused on the quality of health care available to people living with HIV/AIDS in public hospitals and health centers, and looked at how programs targeted at this sector of the population are designed and delivered. In this manner, it was possible to observe the shortcomings of the methodological design in the

  11. Scientific developments ISFD3

    Science.gov (United States)

    Schropp, M.H.I.; Soong, T.W.

    2006-01-01

    Highlights, trends, and consensus from the 63 papers submitted to the Scientific Developments theme of the Third International Symposium on Flood Defence (ISFD) are presented. Realizing that absolute protection against flooding can never be guaranteed, trends in flood management have shifted: (1) from flood protection to flood-risk management, (2) from reinforcing structural protection to lowering flood levels, and (3) to sustainable management through integrated problem solving. Improved understanding of watershed responses, climate changes, applications of GIS and remote-sensing technologies, and advanced analytical tools appeared to be the driving forces for renewing flood-risk management strategies. Technical competence in integrating analytical tools to form basin-wide management systems is demonstrated by several large, transnational models. However, analyses from socio-economic-environmental points of view are generally found to lag behind. © 2006 Taylor & Francis Group.

  12. Social Network Analysis and Mining to Monitor and Identify Problems with Large-Scale Information and Communication Technology Interventions.

    Science.gov (United States)

    da Silva, Aleksandra do Socorro; de Brito, Silvana Rossy; Vijaykumar, Nandamudi Lankalapalli; da Rocha, Cláudio Alex Jorge; Monteiro, Maurílio de Abreu; Costa, João Crisóstomo Weyl Albuquerque; Francês, Carlos Renato Lisboa

    2016-01-01

    The published literature reveals several arguments concerning the strategic importance of information and communication technology (ICT) interventions for developing countries where the digital divide is a challenge. Large-scale ICT interventions can be an option for countries whose regions, both urban and rural, present a high number of digitally excluded people. Our goal was to monitor and identify problems in interventions aimed at certification for a large number of participants in different geographical regions. Our case study is the training at the Telecentros.BR, a program created in Brazil to install telecenters and certify individuals to use ICT resources. We propose an approach that applies social network analysis and mining techniques to data collected from Telecentros.BR dataset and from the socioeconomics and telecommunications infrastructure indicators of the participants' municipalities. We found that (i) the analysis of interactions in different time periods reflects the objectives of each phase of training, highlighting the increased density in the phase in which participants develop and disseminate their projects; (ii) analysis according to the roles of participants (i.e., tutors or community members) reveals that the interactions were influenced by the center (or region) to which the participant belongs (that is, a community contained mainly members of the same region and always with the presence of tutors, contradicting expectations of the training project, which aimed for intense collaboration of the participants, regardless of the geographic region); (iii) the social network of participants influences the success of the training: that is, given evidence that the degree of the community member is in the highest range, the probability of this individual concluding the training is 0.689; (iv) the North region presented the lowest probability of participant certification, whereas the Northeast, which served municipalities with similar
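    The network metrics the study leans on, interaction density per training phase and participant degree, are standard graph measures. A minimal sketch with networkx on an invented miniature of a tutor/member interaction network (the node names and edges are hypothetical, not the Telecentros.BR data):

    ```python
    import networkx as nx

    # Toy interaction network: nodes are participants, edges are interactions.
    G = nx.Graph()
    G.add_edges_from([
        ("tutor_1", "member_a"), ("tutor_1", "member_b"),
        ("member_a", "member_b"), ("tutor_2", "member_c"),
    ])

    density = nx.density(G)      # fraction of possible edges actually present
    degrees = dict(G.degree())   # interaction count per participant
    ```

    A finding like "(iii)" above then amounts to conditioning an outcome (certification) on a node's degree range.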

  13. SALTON SEA SCIENTIFIC DRILLING PROJECT: SCIENTIFIC PROGRAM.

    Science.gov (United States)

    Sass, J.H.; Elders, W.A.

    1986-01-01

    The Salton Sea Scientific Drilling Project was spudded on 24 October 1985 and reached a total depth of 10,564 ft (3.2 km) on 17 March 1986. There followed a period of logging, a flow test, and downhole scientific measurements. The scientific goals were integrated smoothly with the engineering and economic objectives of the program and the ideal of 'science driving the drill' in continental scientific drilling projects was achieved in large measure. The principal scientific goals of the project were to study the physical and chemical processes involved in an active, magmatically driven hydrothermal system. To facilitate these studies, high priority was attached to four areas of sample and data collection, namely: (1) core and cuttings, (2) formation fluids, (3) geophysical logging, and (4) downhole physical measurements, particularly temperatures and pressures.

  14. Liver diseases: A major, neglected global public health problem requiring urgent actions and large-scale screening.

    Science.gov (United States)

    Marcellin, Patrick; Kutala, Blaise K

    2018-02-01

    CLDs represent an important, and certainly underestimated, global public health problem. CLDs are highly prevalent and silent, related to different, sometimes associated causes. The distribution of the causes of these diseases is slowly changing, and within the next decade, the proportion of virus-induced CLDs will certainly decrease significantly while the proportion of NASH will increase. There is an urgent need for effective global actions including education, prevention and early diagnosis to manage and treat CLDs, thus preventing cirrhosis-related morbidity and mortality. Our role is to increase the awareness of the public, healthcare professionals and public health authorities to encourage active policies for early management that will decrease the short- and long-term public health burden of these diseases. Because necroinflammation is the key mechanism in the progression of CLDs, it should be detected early. Thus, large-scale screening for CLDs is needed. ALT levels are an easy and inexpensive marker of liver necroinflammation and could be the first-line tool in this process. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Inverse problem to constrain the controlling parameters of large-scale heat transport processes: The Tiberias Basin example

    Science.gov (United States)

    Goretzki, Nora; Inbar, Nimrod; Siebert, Christian; Möller, Peter; Rosenthal, Eliyahu; Schneider, Michael; Magri, Fabien

    2015-04-01

    Salty and thermal springs exist along the lakeshore of the Sea of Galilee, which covers most of the Tiberias Basin (TB) in the northern Jordan-Dead Sea Transform, Israel/Jordan. As the lake is the only freshwater reservoir of the entire area, it is important to study the salinisation processes that pollute it. Simulations of thermohaline flow along a 35 km NW-SE profile show that meteoric and relic brines are flushed by the regional flow from the surrounding heights and by thermally induced groundwater flow within the faults (Magri et al., 2015). Several trial-and-error model runs were necessary to calibrate the hydraulic conductivity of both the faults and the major aquifers in order to fit temperature logs and spring salinity. It turned out that the hydraulic conductivity of the faults ranges between 30 and 140 m/yr, whereas the hydraulic conductivity of the Upper Cenomanian aquifer is as high as 200 m/yr. However, large-scale transport processes also depend on other physical parameters, such as thermal conductivity, porosity and the fluid thermal expansion coefficient, which are hardly known. Here, inverse problems (IP) are solved along the NW-SE profile to better constrain the physical parameters: (a) hydraulic conductivity, (b) thermal conductivity and (c) thermal expansion coefficient. The PEST code (Doherty, 2010) is applied via the graphical interface FePEST in FEFLOW (Diersch, 2014). The results show that both thermal and hydraulic conductivity are consistent with the values determined by the trial-and-error calibrations. Besides being an automatic approach that speeds up the calibration process, the IP covers a wide range of parameter values, providing additional solutions not found with the trial-and-error method. Our study shows that geothermal systems like the TB are more comprehensively understood when inverse models are applied to constrain coupled fluid flow processes over large spatial scales. References Diersch, H.-J.G., 2014. FEFLOW Finite
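    At its core, the calibration the abstract describes (adjust a physical parameter until modelled temperatures fit the logs) is a least-squares inverse problem. A minimal sketch using a made-up 1-D conductive temperature profile and `scipy.optimize.least_squares`, not the FEFLOW/PEST setup itself; all numbers are illustrative:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Hypothetical 1-D steady conductive profile: T(z) = T0 + (q / k) * z,
    # with surface temperature T0 (degC), heat flux q (W/m^2), conductivity k.
    T0, q, k_true = 20.0, 0.06, 2.5
    z = np.linspace(0.0, 3000.0, 30)       # depths of the "temperature log" (m)
    rng = np.random.default_rng(0)
    T_obs = T0 + (q / k_true) * z + rng.normal(0.0, 0.5, z.size)  # noisy log

    def residuals(params):
        (k,) = params
        return T0 + (q / k) * z - T_obs    # model minus observations

    fit = least_squares(residuals, x0=[1.0], bounds=(0.1, 10.0))
    k_est = fit.x[0]                       # recovered thermal conductivity
    ```

    Tools like PEST automate exactly this loop for full simulation models, with sensitivities computed numerically.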

  16. Fundamental problem of high-level radioactive waste disposal policy in Japan. Critical analysis responding to the publication of 'Nationwide Map of Scientific Features for Geological Disposal' by the Japanese government

    International Nuclear Information System (INIS)

    Juraku, Kohta

    2017-01-01

    The government explains that the 'Scientific Characteristic Map' (hereinafter 'Map') shows, on a map of Japan, the scientific characteristics of sites that must be taken into account when choosing a place to implement geological disposal, together with their geographical distribution, for the convenience of a 'rough overview'. The Nuclear Waste Management Organization of Japan (NUMO), the implementing agency for geological disposal, and the government (the Agency for Natural Resources and Energy of the Ministry of Economy, Trade and Industry) stress that the Map does not indicate 'optimum' sites but is only the 'first step on a long road toward realizing disposal' of high-level radioactive waste (HLW). Nevertheless, a debate over acceptance of the future disposal site clearly lies ahead. The author has pointed out that the essence of the HLW disposal problem is one of 'value selection' that should be decided before any disposal site is chosen, and argues that it is society's task to identify a path of countermeasures that reconciles, to a high degree, the legitimacy of policies supported by scientific and professional knowledge with the legitimacy of social decision-making through due democratic process. The government, however, is trying to advance HLW disposal solely as a siting problem, while neglecting the problem of 'value selection.' (A.O.)

  17. Decomposing Large Inverse Problems with an Augmented Lagrangian Approach: Application to Joint Inversion of Body-Wave Travel Times and Surface-Wave Dispersion Measurements

    Science.gov (United States)

    Reiter, D. T.; Rodi, W. L.

    2015-12-01

    Constructing 3D Earth models through the joint inversion of large geophysical data sets presents numerous theoretical and practical challenges, especially when diverse types of data and model parameters are involved. Among the challenges are the computational complexity associated with large data and model vectors and the need to unify differing model parameterizations, forward modeling methods and regularization schemes within a common inversion framework. The challenges can be addressed in part by decomposing the inverse problem into smaller, simpler inverse problems that can be solved separately, provided one knows how to merge the separate inversion results into an optimal solution of the full problem. We have formulated an approach to the decomposition of large inverse problems based on the augmented Lagrangian technique from optimization theory. As commonly done, we define a solution to the full inverse problem as the Earth model minimizing an objective function motivated, for example, by a Bayesian inference formulation. Our decomposition approach recasts the minimization problem equivalently as the minimization of component objective functions, corresponding to specified data subsets, subject to the constraints that the minimizing models be equal. A standard optimization algorithm solves the resulting constrained minimization problems by alternating between the separate solution of the component problems and the updating of Lagrange multipliers that serve to steer the individual solution models toward a common model solving the full problem. We are applying our inversion method to the reconstruction of the crust and upper-mantle seismic velocity structure across Eurasia. Data for the inversion comprise a large set of P and S body-wave travel times and fundamental and first-higher mode Rayleigh-wave group velocities.
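    The alternation described above (solve the component problems separately, then update multipliers to pull the copies toward a common model) can be illustrated on a toy consensus problem. The two quadratic misfits below are stand-ins for the body-wave and surface-wave objective functions, not the seismic problem itself:

    ```python
    import numpy as np

    # Toy consensus problem: minimize f1(m1) + f2(m2) subject to m1 == m2,
    # with f1(m) = 0.5*||m - a||^2 and f2(m) = 0.5*||m - b||^2 standing in
    # for two data-subset misfits. Scaled augmented-Lagrangian (ADMM) updates.
    a = np.array([1.0, 3.0])
    b = np.array([3.0, 1.0])
    rho = 1.0                       # augmented-Lagrangian penalty weight
    m1 = m2 = np.zeros(2)
    u = np.zeros(2)                 # scaled Lagrange multipliers

    for _ in range(50):
        # Each component problem has a closed-form quadratic minimizer here.
        m1 = (a + rho * (m2 - u)) / (1.0 + rho)
        m2 = (b + rho * (m1 + u)) / (1.0 + rho)
        u = u + m1 - m2             # multiplier update steers m1, m2 together

    consensus = 0.5 * (m1 + m2)     # approaches (a + b) / 2, the full solution
    ```

    In the full inversion, each closed-form update is replaced by a separate (possibly large) inversion over one data subset, which is what makes the decomposition practical.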

  18. An adaptive large neighborhood search heuristic for the pickup and delivery problem with time Windows and scheduled lines

    NARCIS (Netherlands)

    Ghilas, V.; Demir, E.; van Woensel, T.

    2016-01-01

    The Pickup and Delivery Problem with Time Windows and Scheduled Lines (PDPTW-SL) concerns scheduling a set of vehicles to serve freight requests such that a part of the journey can be carried out on a scheduled public transportation line. Due to the complexity of the problem, which is NP-hard, we

  19. How the Elderly Can Use Scientific Knowledge to Solve Problems While Designing Toys: A Retrospective Analysis of the Design of a Working UFO

    Science.gov (United States)

    Chen, Mei-Yung; Hong, Jon-Chao; Hwang, Ming-Yueh; Wong, Wan-Tzu

    2013-01-01

    The venerable aphorism "an old dog cannot learn new tricks" implies that the elderly rarely learn anything new--in particular, scientific knowledge. On the basis of "learning by doing," the present study emphasized knowledge application (KA) as elderly subjects collaborated on the design of a toy flying saucer (UFO). Three…

  20. Proceedings of 5. international scientific conference 'Sakharov readings 2005: Ecological problems of XXI century'. Pt. 2; Materialy 5-oj mezhdunarodnoj nauchnoj konferentsii 'Sakharovskie chteniya 2005 goda: Ehkologicheskie problemy XXI veka'. Ch. 2

    Energy Technology Data Exchange (ETDEWEB)

    Kundas, S P; Okeanov, A E [International A. Sakharov environmental univ., Minsk (Belarus); Shevchuk, V E [Kamiteht pa prablemam nastupstvaw katastrofy na Charnobyl' skaj AEhS pry Savetse Ministraw Rehspubliki Belarus' , Minsk (Belarus)

    2005-05-15

    The second part of the proceedings of the fifth international scientific conference 'Sakharov readings 2005: Ecological problems of XXI century', held at the International A. Sakharov Environmental University, contains materials on the following topics: radioecology, ecological and radiation monitoring, new information systems and technologies in ecology, priority ecological power engineering, management in ecology, and ecological education. The proceedings are intended for specialists in ecology, related sciences and dosimetry, as well as engineers, teachers, students and post-graduate students.

  1. Empirical Mining of Large Data Sets Already Helps to Solve Practical Ecological Problems; A Panoply of Working Examples (Invited)

    Science.gov (United States)

    Hargrove, W. W.; Hoffman, F. M.; Kumar, J.; Spruce, J.; Norman, S. P.

    2013-12-01

    Here we present diverse examples where empirical mining and statistical analysis of large data sets have already been shown to be useful for a wide variety of practical decision-making problems within the realm of large-scale ecology. Because a full understanding and appreciation of particular ecological phenomena are possible only after hypothesis-directed research regarding the existence and nature of that process, some ecologists may feel that purely empirical data harvesting may represent a less-than-satisfactory approach. Restricting ourselves exclusively to process-driven approaches, however, may actually slow progress, particularly for more complex or subtle ecological processes. We may not be able to afford the delays caused by such directed approaches. Rather than attempting to formulate and ask every relevant question correctly, empirical methods allow trends, relationships and associations to emerge freely from the data themselves, unencumbered by a priori theories, ideas and prejudices that have been imposed upon them. Although they cannot directly demonstrate causality, empirical methods can be extremely efficient at uncovering strong correlations with intermediate "linking" variables. In practice, these correlative structures and linking variables, once identified, may provide sufficient predictive power to be useful themselves. Such correlation "shadows" of causation can be harnessed by, e.g., Bayesian Belief Nets, which bias ecological management decisions, made with incomplete information, toward favorable outcomes. Empirical data-harvesting also generates a myriad of testable hypotheses regarding processes, some of which may even be correct. Quantitative statistical regionalizations based on quantitative multivariate similarity have lent insights into carbon eddy-flux direction and magnitude, wildfire biophysical conditions, phenological ecoregions useful for vegetation type mapping and monitoring, forest disease risk maps (e.g., sudden oak

  2. A solution approach based on Benders decomposition for the preventive maintenance scheduling problem of a stochastic large-scale energy system

    DEFF Research Database (Denmark)

    Lusby, Richard Martin; Muller, Laurent Flindt; Petersen, Bjørn

    2013-01-01

    This paper describes a Benders decomposition-based framework for solving the large scale energy management problem that was posed for the ROADEF 2010 challenge. The problem was taken from the power industry and entailed scheduling the outage dates for a set of nuclear power plants, which need to be regularly taken down for refueling and maintenance, in such a way that the expected cost of meeting the power demand in a number of potential scenarios is minimized. We show that the problem structure naturally lends itself to Benders decomposition; however, not all constraints can be included in the mixed...
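    The cut-generation loop at the heart of Benders decomposition can be sketched on a toy unit-commitment-style instance (all numbers invented; plain enumeration stands in for the master MIP solver, and the recourse LP is solved in closed form):

    ```python
    import itertools

    # Toy stand-in: keep units online (y) at fixed cost f; the recourse cost
    # Q(y) penalizes unmet demand d at rate p given capacities M.
    f = [3.0, 2.0]            # first-stage (scheduling) costs per unit
    M = [6.0, 7.0]            # capacity each unit provides when kept online
    d, p = 10.0, 5.0          # demand and penalty per unit of unmet demand

    def first_stage_cost(y):
        return sum(fi * yi for fi, yi in zip(f, y))

    def capacity(y):
        return sum(Mi * yi for Mi, yi in zip(M, y))

    def subproblem(y):
        """Recourse LP min{p*s : s >= d - capacity(y), s >= 0} and its dual."""
        slack = d - capacity(y)
        lam = p if slack > 0 else 0.0     # optimal dual multiplier
        return p * max(0.0, slack), lam

    cuts = []                             # each cut: theta >= lam*(d - M.y)
    best_y, upper = None, float("inf")
    for _ in range(10):
        # Master: min f.y + theta subject to the cuts collected so far.
        lower, y = min(
            (first_stage_cost(y) +
             max([lam * (d - capacity(y)) for lam in cuts] + [0.0]), y)
            for y in itertools.product((0, 1), repeat=2)
        )
        q, lam = subproblem(y)            # evaluate true recourse cost
        total = first_stage_cost(y) + q
        if total < upper:
            best_y, upper = y, total
        if upper - lower < 1e-9:          # cuts reproduce Q at the optimum
            break
        cuts.append(lam)                  # add the violated optimality cut
    ```

    The ROADEF problem adds many scenarios and scheduling constraints, but the master/subproblem alternation and the dual-based optimality cuts work the same way.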

  3. Shaping a Scientific Self

    DEFF Research Database (Denmark)

    Andrade-Molina, Melissa; Valero, Paola

    us to understand how a truth is reproduced, circulating among diverse fields of human knowledge. It will also show why we accept and reproduce a particular discourse. Finally, we state Euclidean geometry as a truth that circulates in scientific discourse and performs a scientific self. We unfold the importance of having students follow the path of what schools perceive a real scientist to be: not to become a scientist, but to become a logical thinker, a problem solver, a productive citizen who uses reason.

  4. The Einstein Equations and the Large Scale Behavior of Gravitational Fields: 50 years of the Cauchy Problem in General Relativity

    International Nuclear Information System (INIS)

    Coles, P

    2006-01-01

    Cosmology is a discipline that encompasses many diverse aspects of physics and astronomy. This is part of its attraction, but also a reason why it is difficult for new researchers to acquire sufficient grounding to enable them to make significant contributions early in their careers. For this reason there are many cosmology textbooks aimed at the advanced undergraduate/beginning postgraduate level. Physical Foundations of Cosmology by Viatcheslav Mukhanov is a worthy new addition to this genre. Like most of its competitors it does not attempt to cover every single aspect of the subject but chooses a particular angle and tries to unify its treatment around that direction. Mukhanov has chosen to focus on the fundamental principles underlying modern cosmological research at the expense of some detail at the frontiers. The book places great emphasis on the particle-astrophysics interface and issues connected with the thermal history of the big-bang model. The treatment of big-bang nucleosynthesis is done in much more detail than in most texts at a similar level, for example. It also contains a very extended and insightful discussion of inflationary models. Mukhanov makes great use of approximate analytical arguments to develop physical intuition rather than concentrating on numerical approaches. The book is quite mathematical, but not in a pedantically formalistic way. There is much use of 'order-of-magnitude' dimensional arguments which undergraduate students often find difficult to get the hang of, but which they would do well to assimilate as early as possible in their research careers. The text is peppered with problems for the reader to solve, some straightforward and some exceedingly difficult. Solutions are not provided. The price to be paid for this foundational approach is that there is not much about observational cosmology in this book, and neither is there much about galaxy formation or large-scale structure. It also neglects some of the trendier recent

  5. The paradox of scientific expertise

    DEFF Research Database (Denmark)

    Alrøe, Hugo Fjelsted; Noe, Egon

    2011-01-01

    Modern societies depend on a growing production of scientific knowledge, which is based on the functional differentiation of science into still more specialised scientific disciplines and subdisciplines. This is the basis for the paradox of scientific expertise: the growth of science leads to a fragmentation of scientific expertise. To resolve this paradox, the present paper investigates three hypotheses: 1) All scientific knowledge is perspectival. 2) The perspectival structure of science leads to specific forms of knowledge asymmetries. 3) Such perspectival knowledge asymmetries must be handled in cross-disciplinary research and in the collective use of different kinds of scientific expertise, thereby making society better able to solve complex, real-world problems.

  6. The Black-Scholes option pricing problem in mathematical finance: generalization and extensions for a large class of stochastic processes

    Science.gov (United States)

    Bouchaud, Jean-Philippe; Sornette, Didier

    1994-06-01

    The ability to price risks and devise optimal investment strategies in the presence of an uncertain "random" market is the cornerstone of modern finance theory. We first consider the simplest such problem, that of a so-called "European call option", initially solved by Black and Scholes using Ito stochastic calculus for markets modelled by a log-Brownian stochastic process. A simple and powerful formalism is presented which allows us to generalize the analysis to a large class of stochastic processes, such as ARCH, jump or Lévy processes. We also address the case of correlated Gaussian processes, which is shown to be a good description of three different market indices (MATIF, CAC40, FTSE100). Our main result is the introduction of the concept of an optimal strategy in the sense of (functional) minimization of the risk with respect to the portfolio. While the risk may be made to vanish for particular continuous uncorrelated 'quasi-Gaussian' stochastic processes (including the Black and Scholes model), this is no longer the case for more general stochastic processes. The value of the residual risk is obtained and suggests the concept of risk-corrected option prices. In the presence of very large deviations such as in Lévy processes, new criteria for rational fixing of the option prices are discussed. We also apply our method to other types of options, `Asian', `American', and discuss new possibilities (`doubledecker'...). The inclusion of transaction costs leads to the appearance of a natural characteristic trading time scale.
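    The Black-Scholes European call price discussed above has a closed form; a minimal sketch of the standard formula (standard parameter names: spot S, strike K, maturity T, risk-free rate r, volatility sigma):

    ```python
    from math import exp, log, sqrt
    from statistics import NormalDist

    def bs_call(S, K, T, r, sigma):
        """Black-Scholes price of a European call under log-Brownian dynamics."""
        d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        N = NormalDist().cdf
        return S * N(d1) - K * exp(-r * T) * N(d2)

    price = bs_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2)  # approx. 10.45
    ```

    For the more general processes treated in the paper (ARCH, jump, Lévy), no such closed form exists and the residual risk no longer vanishes, which is precisely the paper's point.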

  7. Scientific perspectives on music therapy.

    Science.gov (United States)

    Hillecke, Thomas; Nickel, Anne; Bolay, Hans Volker

    2005-12-01

    What needs to be done on the long road to evidence-based music therapy? First of all, an adequate research strategy is required. For this purpose the general methodology for therapy research should be adopted. Additionally, music therapy needs a variety of methods of allied fields to contribute scientific findings, including mathematics, natural sciences, behavioral and social sciences, as well as the arts. Pluralism seems necessary as well as inevitable. At least two major research problems can be identified, however, that make the path stony: the problem of specificity and the problem of eclecticism. Neuroscientific research in music is giving rise to new ideas, perspectives, and methods; they seem to be promising prospects for a possible contribution to a theoretical and empirical scientific foundation for music therapy. Despite the huge heterogeneity of theoretical approaches in music therapy, an integrative model of working ingredients in music therapy is useful as a starting point for empirical studies in order to question what specifically works in music therapy. For this purpose, a heuristic model, consisting of five music therapy working factors (attention modulation, emotion modulation, cognition modulation, behavior modulation, and communication modulation) has been developed by the Center for Music Therapy Research (Viktor Dulger Institute) in Heidelberg. Evidence shows the effectiveness of music therapy for treating certain diseases, but the question of what it is in music therapy that works remains largely unanswered. The authors conclude with some questions to neuroscientists, which we hope may help elucidate relevant aspects of a possible link between the two disciplines.

  8. Improved formulations and an Adaptive Large Neighborhood Search heuristic for the integrated berth allocation and quay crane assignment problem

    DEFF Research Database (Denmark)

    Iris, Cagatay; Pacino, Dario; Røpke, Stefan

    2017-01-01

    This paper focuses on the integrated berth allocation and quay crane assignment problem in container terminals. We consider the decrease in the marginal productivity of quay cranes and the increase in handling time due to deviation from the desired position. We consider a continuous berth...

  9. Solving large instances of the quadratic cost partition problem on dense graphs by data correcting algorithms

    NARCIS (Netherlands)

    Goldengorin, Boris; Vink, Marius de

    1999-01-01

    The Data-Correcting Algorithm (DCA) corrects the data of a hard problem instance in such a way that we obtain an instance of a well solvable special case. For a given prescribed accuracy of the solution, the DCA uses a branch and bound scheme to make sure that the solution of the corrected instance

  10. Large-Scale Studies on the Transferability of General Problem-Solving Skills and the Pedagogic Potential of Physics

    Science.gov (United States)

    Mashood, K. K.; Singh, Vijay A.

    2013-01-01

    Research suggests that problem-solving skills are transferable across domains. This claim, however, needs further empirical substantiation. We suggest correlation studies as a methodology for making preliminary inferences about transfer. The correlation of the physics performance of students with their performance in chemistry and mathematics in…

  11. Methodological Problems of Nanotechnoscience

    Science.gov (United States)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods for solving scientific and engineering problems. This allows one to move from one explanation and scientific world view to another without difficulty. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering as the analysis and design of large-scale, complex man/machine systems, but for micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. The design orientation influences the priorities of complex research and the relation to knowledge, not only “knowledge about something” but also knowledge as a means of activity: from the beginning, control and restructuring of matter at the nanoscale is a necessary element of nanoscience.

  12. Optimization of the solution of the problem of scheduling theory ...

    African Journals Online (AJOL)

    This article describes the genetic algorithm used to solve a problem from scheduling theory. A large number of different methods are described in the scientific literature. The main difficulty of the problem in question is that the optimal solution must be sought in a large search space over the set of ...
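The abstract above is truncated, but the named method is standard; a minimal genetic-algorithm sketch for a simple scheduling objective (minimizing total completion time of jobs on a single machine, with permutation chromosomes) is shown below. All parameter values, names, and the objective are illustrative, not taken from the article.

```python
import random

def total_completion_time(order, durations):
    """Cost of a schedule: sum of job completion times."""
    t, cost = 0, 0
    for job in order:
        t += durations[job]
        cost += t
    return cost

def crossover(p1, p2):
    """Order crossover (OX): copy a slice from p1, fill the rest from p2."""
    n = len(p1)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = p1[i:j]
    fill = [g for g in p2 if g not in child]
    for k in range(n):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def genetic_schedule(durations, pop_size=30, generations=200, mut_rate=0.2):
    """Evolve permutations of job indices toward a low-cost schedule."""
    n = len(durations)
    cost = lambda o: total_completion_time(o, durations)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    best = min(pop, key=cost)
    for _ in range(generations):
        nxt = [best[:]]  # elitism: always keep the best schedule found so far
        while len(nxt) < pop_size:
            # tournament selection of two parents
            p1 = min(random.sample(pop, 3), key=cost)
            p2 = min(random.sample(pop, 3), key=cost)
            child = crossover(p1, p2)
            if random.random() < mut_rate:  # swap mutation
                a, b = random.sample(range(n), 2)
                child[a], child[b] = child[b], child[a]
            nxt.append(child)
        pop = nxt
        best = min(pop, key=cost)
    return best
```

For total completion time on a single machine, sorting jobs by increasing duration (SPT) is provably optimal, which gives a convenient check on the search.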

  13. Scientific Misconduct.

    Science.gov (United States)

    Goodstein, David

    2002-01-01

    Explores scientific fraud, asserting that while few scientists actually falsify results, the field has become so competitive that many are misbehaving in other ways; an example would be unreasonable criticism by anonymous peer reviewers. (EV)

  14. The problem of maintaining large herbivores in small conservation areas: deterioration of the grassveld in the Addo Elephant National Park

    Directory of Open Access Journals (Sweden)

    P. Novellie

    1991-09-01

    Full Text Available Changes in vegetation cover and species composition in a grassland community during a six-year period are reported. The grass Themeda triandra and the dwarf shrub Helichrysum rosum decreased in abundance, whereas the grass Eragrostis obtusa increased. Comparison of grazed plots with fenced plots revealed that large herbivores were responsible for the increase in abundance of E. obtusa. The abundance of T. triandra was influenced by large herbivores, but rainfall fluctuations apparently also played a role. The decline in relative abundance of H. rosum was evidently not caused by large herbivores. Grass cover was closely determined by rainfall. A drought-induced decline in forage abundance evidently caused the buffalo population to crash.

  15. A modified Symbiotic Organisms Search algorithm for large scale economic dispatch problem with valve-point effects

    International Nuclear Information System (INIS)

    Secui, Dinu Calin

    2016-01-01

    This paper proposes a new metaheuristic algorithm, called the Modified Symbiotic Organisms Search (MSOS) algorithm, to solve the economic dispatch problem considering the valve-point effects, prohibited operating zones (POZ), transmission line losses and multi-fuel sources, as well as other operating constraints of the generating units and the power system. The MSOS algorithm introduces, in all of its phases, new relations for updating the solutions, improving its capacity to identify stable, high-quality solutions in a reasonable time. Furthermore, to increase the exploration capacity of the MSOS algorithm in finding the most promising zones, it is endowed with a chaotic component generated by the Logistic map. The performance of the modified algorithm and of the original Symbiotic Organisms Search (SOS) algorithm is tested on five systems of different characteristics, constraints and dimensions (13-unit, 40-unit, 80-unit, 160-unit and 320-unit). The results obtained by applying the proposed MSOS algorithm show that it performs better than other optimization techniques recently used to solve the economic dispatch problem with valve-point effects. - Highlights: • A new modified SOS algorithm (MSOS) is proposed to solve the EcD problem. • Valve-point effects, ramp-rate limits, POZ, multi-fuel sources, transmission losses were considered. • The algorithm is tested on five systems having 13, 40, 80, 160 and 320 thermal units. • MSOS algorithm outperforms many other optimization techniques.
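The chaotic component mentioned in the abstract is built from the Logistic map, x_{k+1} = r x_k (1 - x_k), which behaves chaotically at r = 4. A minimal sketch of generating such a sequence and using it to perturb a candidate dispatch vector within generator limits follows; the perturbation rule and all names are illustrative, not the paper's exact update relations.

```python
def logistic_sequence(x0, n, r=4.0):
    """Chaotic sequence from the Logistic map x_{k+1} = r*x_k*(1 - x_k)."""
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        seq.append(x)
    return seq

def chaotic_perturb(solution, bounds, chaos):
    """Nudge each unit's output toward a chaotic point inside its limits."""
    out = []
    for p, (lo, hi), c in zip(solution, bounds, chaos):
        candidate = lo + c * (hi - lo)          # chaotic point in [lo, hi]
        out.append(min(hi, max(lo, 0.5 * (p + candidate))))
    return out
```

Because the Logistic map at r = 4 densely explores the interval (0, 1), such a sequence is a cheap substitute for a uniform random generator that tends to avoid premature clustering of candidate solutions.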

  16. Software support for irregular and loosely synchronous problems

    Science.gov (United States)

    Choudhary, A.; Fox, G.; Hiranandani, S.; Kennedy, K.; Koelbel, C.; Ranka, S.; Saltz, J.

    1992-01-01

    A large class of scientific and engineering applications may be classified as irregular and loosely synchronous from the perspective of parallel processing. We present a partial classification of such problems. This classification has motivated us to enhance FORTRAN D to provide language support for irregular, loosely synchronous problems. We present techniques for parallelization of such problems in the context of FORTRAN D.

  17. Popularization of science and scientific journalism: possibilities of scientific literacy

    Directory of Open Access Journals (Sweden)

    Alessandro Augusto Barros Façanha

    2017-07-01

    Full Text Available This study examines the intersection between science education and communication, from the perspective of the popularization of science, based on evidence produced in a specific column of a large-circulation newspaper of the city of Teresina / PI. The discussion is based on content analysis carried out in the context of science classes in a school of basic education with elementary students, where journalistic texts with diverse themes involving science and daily life were used in order to understand the interpretation of texts and their relationship with the context of scientific dissemination and citizenship. Content analysis was applied and the answers were stratified into categories of conceptual nature and application of the themes. The analyses show that the texts of scientific dissemination contribute to the popularization of science, fostering debate in the classroom and didactically enriching science classes, in spite of their still incipient insertion in the context of science education. However, the results of the research reveal the difficulty faced by the students in understanding the dissemination texts, both in conceptual comprehension and in the resolution of daily problems, as well as the distance between the context of the sciences in their theoretical scope and their presentation in everyday situations. Despite this, the texts of divulgation served as an important means of real insertion into the process of scientific literacy and the promotion of citizenship.

  18. North American vegetation model for land-use planning in a changing climate: A solution to large classification problems

    Science.gov (United States)

    Gerald E. Rehfeldt; Nicholas L. Crookston; Cuauhtemoc Saenz-Romero; Elizabeth M. Campbell

    2012-01-01

    Data points intensively sampling 46 North American biomes were used to predict the geographic distribution of biomes from climate variables using the Random Forests classification tree. Techniques were incorporated to accommodate a large number of classes and to predict the future occurrence of climates beyond the contemporary climatic range of the biomes. Errors of...

  19. Large scale implementation of clinical medication reviews in Dutch community pharmacies: Drug-related problems and interventions

    NARCIS (Netherlands)

    Kempen, Thomas G. H.; Van De Steeg-Van Gompel, Caroline H. P. A.; Hoogland, Petra; Liu, Yuqian; Bouvy, Marcel L.

    2014-01-01

    Background: Research on the benefits of clinical medication reviews (CMRs) performed by pharmacists has been conducted mostly in controlled settings and has been widely published. Less is known of the effects after large scale implementation in community pharmacies. An online CMR tool enabled the

  20. Can a large neutron excess help solve the baryon loading problem in gamma-Ray burst fireballs?

    Science.gov (United States)

    Fuller; Pruet; Abazajian

    2000-09-25

    We point out that the baryon loading problem in gamma-ray burst (GRB) models can be ameliorated if a significant fraction of the baryons which inertially confine the fireball is converted to neutrons. A high neutron fraction can result in a reduced transfer of energy from relativistic light particles in the fireball to baryons. The energy needed to produce the required relativistic flow in the GRB is consequently reduced, in some cases by orders of magnitude. A high neutron-to-proton ratio has been calculated in neutron star-merger fireball environments. Significant neutron excess also could occur near compact objects with high neutrino fluxes.

  1. Perturbation series at large orders in quantum mechanics and field theories: application to the problem of resummation

    International Nuclear Information System (INIS)

    Zinn-Justin, J.; Freie Univ. Berlin

    1981-01-01

    In this review I present a method to estimate the large order behavior of perturbation theory in quantum mechanics and field theory. The basic idea, due to Lipatov, is to relate the large order behavior to (in general complex) instanton contributions to the path integral representation of Green's functions. I explain the method first in the case of a simple integral and of the anharmonic oscillator and recover the results of Bender and Wu. I then apply it to the φ4 field theory. I study general potentials and boson field theories. I show, following Parisi, how the method can be generalized to theories with fermions. Finally I outline the implications of these results for the summability of the series. In particular I explain a method to sum divergent series based on a Borel transformation. In a last section I compare the large order behavior predictions to actual series calculations. I also present some numerical examples of series summation. (orig.)
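The Borel summation mentioned in the abstract is the textbook construction (stated here in its standard form, not as the review's specific formulas): given a divergent series $\sum_k a_k x^k$ whose coefficients grow like $k!$, one defines the Borel transform and recovers the function as a Laplace-type integral:

```latex
B(t) \;=\; \sum_{k=0}^{\infty} \frac{a_k}{k!}\, t^{k},
\qquad
f(x) \;=\; \int_{0}^{\infty} e^{-t}\, B(xt)\, \mathrm{d}t .
```

Term by term, the identity $\int_0^\infty e^{-t}\, t^k \,\mathrm{d}t = k!$ restores each coefficient $a_k x^k$; the Borel sum exists when $B(t)$ can be analytically continued without singularities on the positive real axis, which is precisely where the instanton estimates of the large order behavior enter.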

  2. The effect of random matter density perturbations on the large mixing angle solution to the solar neutrino problem

    Science.gov (United States)

    Guzzo, M. M.; Holanda, P. C.; Reggiani, N.

    2003-08-01

    The neutrino energy spectrum observed in KamLAND is compatible with the predictions based on the Large Mixing Angle realization of the MSW (Mikheyev-Smirnov-Wolfenstein) mechanism, which provides the best solution to the solar neutrino anomaly. From the agreement between solar neutrino data and KamLAND observations, we can obtain the best fit values of the mixing angle and the squared mass difference. When fitting the MSW predictions to the solar neutrino data, it is assumed that the solar matter does not have any kind of perturbations, that is, that the matter density decays monotonically from the center to the surface of the Sun. There are reasons to believe, nevertheless, that the solar matter density fluctuates around the equilibrium profile. In this work, we analysed the effect on the Large Mixing Angle parameters when the matter density fluctuates randomly around the equilibrium profile, solving the evolution equation in this case. We find that, in the presence of these density perturbations, the best fit values of the mixing angle and the squared mass difference assume smaller values, compared with the values obtained for the standard Large Mixing Angle solution without noise. Considering this effect of the random perturbations, the lowest island of the allowed region for KamLAND spectral data in the parameter space must be considered; we call it the very-low region.

  3. Heat recovery networks synthesis of large-scale industrial sites: Heat load distribution problem with virtual process subsystems

    International Nuclear Information System (INIS)

    Pouransari, Nasibeh; Maréchal, Francois

    2015-01-01

    Highlights: • Synthesizing an industrial-size heat recovery network with a match reduction approach. • Targeting TSI with minimum exchange between process subsystems. • Generating a feasible close-to-optimum network. • Reducing tremendously the HLD computational time and complexity. • Generating a realistic network with respect to the plant layout. - Abstract: This paper presents a targeting strategy to design a heat recovery network for an industrial plant by dividing the system into subsystems while considering the heat transfer opportunities between them. The methodology is based on a sequential approach. The heat recovery opportunities between process units and the optimal flow rates of utilities are first identified using a Mixed Integer Linear Programming (MILP) model. The site is then divided into a number of subsystems, where the overall interaction is summarized by a pair of virtual hot and cold streams per subsystem, reconstructed by solving the heat cascade inside each subsystem. The Heat Load Distribution (HLD) problem is then solved between those packed subsystems in a sequential procedure where each time one of the subsystems is unpacked by switching from the virtual stream pair back to the original ones. The main advantages are to minimize the number of connections between process subsystems, to alleviate the computational complexity of the HLD problem, and to generate a feasible network compatible with the minimum energy consumption objective. The application of the proposed methodology is illustrated through a number of case studies, discussed and compared with relevant results from the literature.
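The heat cascade referred to in the abstract is the classical problem-table calculation of pinch analysis. A minimal sketch, with illustrative stream data and ΔTmin (none of it from the paper), that returns minimum hot and cold utility targets:

```python
def min_utilities(hot, cold, dt_min=10.0):
    """Problem-table heat cascade.

    hot, cold: lists of (T_supply, T_target, CP) with CP in kW/K.
    Returns (minimum hot utility, minimum cold utility) in kW.
    """
    shift = dt_min / 2.0
    # shift hot streams down and cold streams up by dT_min/2
    shifted_hot = [(ts - shift, tt - shift, cp) for ts, tt, cp in hot]
    shifted_cold = [(ts + shift, tt + shift, cp) for ts, tt, cp in cold]
    temps = sorted({t for s in shifted_hot + shifted_cold for t in s[:2]},
                   reverse=True)
    cascade, running = [0.0], 0.0
    for hi, lo in zip(temps, temps[1:]):
        # net heat surplus of each temperature interval
        surplus = sum(cp for ts, tt, cp in shifted_hot if ts >= hi and tt <= lo)
        demand = sum(cp for ts, tt, cp in shifted_cold if tt >= hi and ts <= lo)
        running += (surplus - demand) * (hi - lo)
        cascade.append(running)
    q_hot = max(0.0, -min(cascade))  # lift the cascade so no interval is negative
    q_cold = cascade[-1] + q_hot
    return q_hot, q_cold
```

For example, one hot stream cooling 150→50 °C at CP = 2 kW/K against one cold stream heating 40→140 °C at CP = 2.5 kW/K, with ΔTmin = 10 K, needs 50 kW of hot utility and no cold utility.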

  4. Global existence and large time asymptotic behavior of strong solutions to the Cauchy problem of 2D density-dependent Navier–Stokes equations with vacuum

    Science.gov (United States)

    Lü, Boqiang; Shi, Xiaoding; Zhong, Xin

    2018-06-01

    We are concerned with the Cauchy problem of the two-dimensional (2D) nonhomogeneous incompressible Navier–Stokes equations with vacuum as far-field density. It is proved that if the initial density decays not too slow at infinity, the 2D Cauchy problem of the density-dependent Navier–Stokes equations on the whole space admits a unique global strong solution. Note that the initial data can be arbitrarily large and the initial density can contain vacuum states and even have compact support. Furthermore, we also obtain the large time decay rates of the spatial gradients of the velocity and the pressure, which are the same as those of the homogeneous case.

  5. Fast and accurate solution for the SCUC problem in large-scale power systems using adapted binary programming and enhanced dual neural network

    International Nuclear Information System (INIS)

    Shafie-khah, M.; Moghaddam, M.P.; Sheikh-El-Eslami, M.K.; Catalão, J.P.S.

    2014-01-01

    Highlights: • A novel hybrid method based on decomposition of SCUC into QP and BP problems is proposed. • An adapted binary programming and an enhanced dual neural network model are applied. • The proposed EDNN is exactly convergent to the global optimal solution of QP. • An AC power flow procedure is developed for including contingency/security issues. • It is suited for large-scale systems, providing both accurate and fast solutions. - Abstract: This paper presents a novel hybrid method for solving the security constrained unit commitment (SCUC) problem. The proposed formulation requires much less computation time in comparison with other methods while assuring the accuracy of the results. Furthermore, the framework provided here allows including an accurate description of warmth-dependent startup costs, valve point effects, multiple fuel costs, forbidden zones of operation, and AC load flow bounds. To solve the nonconvex problem, an adapted binary programming method and an enhanced dual neural network model are utilized as optimization tools, and a procedure for AC power flow modeling is developed for including contingency/security issues, as new contributions to earlier studies. Unlike classical SCUC methods, the proposed method allows the unit commitment problem to be solved while simultaneously complying with the network limits. In addition to conventional test systems, a real-world large-scale power system with 493 units has been used to fully validate the effectiveness of the proposed novel hybrid method.

  6. Scientific millenarianism

    International Nuclear Information System (INIS)

    Weinberg, A.M.

    1997-01-01

    Today, for the first time, scientific concerns are seriously being addressed that span future times--hundreds, even thousands, or more years in the future. One is witnessing what the author calls scientific millenarianism. Are such concerns for the distant future exercises in futility, or are they real issues that, to the everlasting gratitude of future generations, this generation has identified, warned about and even suggested how to cope with in the distant future? Can the four potential catastrophes--bolide impact, CO2 warming, radioactive wastes and thermonuclear war--be avoided by technical fixes, institutional responses, religion, or by doing nothing? These are the questions addressed in this paper

  7. Botany and topography: the problem of the levelling of plants in the scientific historiography on Francisco José de Caldas

    Directory of Open Access Journals (Sweden)

    María Alejandra Puerta Olaya,

    2017-07-01

    Full Text Available The levelling of plants is usually recognized as one of the main concepts in the works and the thought of Francisco José de Caldas. There are different interpretations about this concept, but in general, the treatment is not very careful and does not really go into the details concerning its theoretical assumptions and consequences. In this article, we identify the diverse interpretations that historians have offered regarding the origin, the function and the definition of this concept. Our interest is to show the difficulties that the scientific historiography on Caldas faces when it deals with this concept, and how these difficulties generate uncertainty concerning the coherence that may exist between those different interpretations. In particular, we defend the thesis that the approach to the term “levelling of plants” has been focused more on the plants part than on the levelling part, that is, more on botany than on topography. This historiographic assumption has led to the construction of historical narratives that, despite the explicit topographic dimension of the term, place it in the history of botany and not in the history of topography.

  8. Challenges and opportunities in coding the commons: problems, procedures, and potential solutions in large-N comparative case studies

    Directory of Open Access Journals (Sweden)

    Elicia Ratajczyk

    2016-09-01

    Full Text Available On-going efforts to understand the dynamics of coupled social-ecological (or, more broadly, coupled infrastructure) systems and common pool resources have led to the generation of numerous datasets based on a large number of case studies. These data have facilitated the identification of important factors and fundamental principles which increase our understanding of such complex systems. However, the data at our disposal are often not easily comparable, have limited scope and scale, and are based on disparate underlying frameworks, inhibiting synthesis, meta-analysis, and the validation of findings. Research efforts are further hampered when case inclusion criteria, variable definitions, coding schema, and inter-coder reliability testing are not made explicit in the presentation of research and shared among the research community. This paper first outlines challenges experienced by researchers engaged in a large-scale coding project; then highlights valuable lessons learned; and finally discusses opportunities for further research on comparative case study analysis focusing on social-ecological systems and common pool resources.

  9. Solving Classification Problems for Large Sets of Protein Sequences with the Example of Hox and ParaHox Proteins

    Directory of Open Access Journals (Sweden)

    Stefanie D. Hueber

    2016-02-01

    Full Text Available Phylogenetic methods are key to providing models for how a given protein family evolved. However, these methods run into difficulties when sequence divergence is either too low or too high. Here, we provide a case study of Hox and ParaHox proteins so that additional insights can be gained using a new computational approach to help solve old classification problems. For two (Gsx and Cdx) out of three ParaHox proteins the assignments differ between the currently most established view and four alternative scenarios. We use a non-phylogenetic, pairwise-sequence-similarity-based method to assess which of the previous predictions, if any, are best supported by the sequence-similarity relationships between Hox and ParaHox proteins. The overall sequence similarities show Gsx to be most similar to Hox2–3, and Cdx to be most similar to Hox4–8. The results indicate that a purely pairwise-sequence-similarity-based approach can provide additional information not only when phylogenetic inference methods have insufficient information to provide reliable classifications (as was shown previously for central Hox proteins), but also when the sequence variation is so high that the resulting phylogenetic reconstructions are likely plagued by long-branch-attraction artifacts.
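The paper's actual scoring scheme is not reproduced here, but the general idea of classifying a protein by its average pairwise similarity to labeled groups can be sketched with a stand-in similarity measure from the standard library (difflib's block-matching ratio; the toy sequences and class names below are invented for illustration):

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Stand-in pairwise similarity: fraction of matching subsequence blocks."""
    return SequenceMatcher(None, a, b).ratio()

def classify(query, classes):
    """Assign `query` to the class with the highest mean pairwise similarity."""
    def mean_sim(name):
        members = classes[name]
        return sum(similarity(query, s) for s in members) / len(members)
    return max(classes, key=mean_sim)
```

A real analysis would use alignment-based scores (e.g. BLOSUM-scored global alignments) instead of difflib's block ratio, but the nearest-class logic is the same.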

  10. WHICH HEALTH-RELATED PROBLEMS ARE ASSOCIATED WITH PROBLEMATIC VIDEO-GAMING OR SOCIAL MEDIA USE IN ADOLESCENTS? A LARGE-SCALE CROSS-SECTIONAL STUDY

    Directory of Open Access Journals (Sweden)

    Saskia Y. M. Mérelle

    2017-02-01

    Full Text Available Objective: Problematic video-gaming or social media use may seriously affect adolescents' health status. However, it is not very well known which health-related problems are most strongly related to these issues. To inform the development of prevention and intervention strategies, this study aims to gain a better understanding of the health-related problems and demographic factors associated with problematic video-gaming or social media use in early adolescence. Method: A cross-sectional analysis was performed on data collected by two Municipal Health Services in the Netherlands in 2013-2014. In this survey among youth, 21,053 students from secondary schools (mean age 14.4 years) completed a web-based questionnaire. Multivariate analyses were carried out to assess the strength of the associations between mental health problems, life events, lifestyle and substance use as independent variables, and problematic video-gaming and problematic social media use as dependent variables. Results: Of the participating students, 5.7% reported problematic video-gaming and 9.1% problematic social media use. Problematic video-gaming was most strongly associated with conduct problems and suicidal thoughts (medium effect sizes, OR ≥2, p<0.01), sedentary behavior (large effect size, OR ≥3, p<0.01), and male gender (large effect size). Problematic social media use was highly associated with conduct problems, hyperactivity and sedentary behavior (all medium effect sizes). Additionally, female gender and non-Western ethnicity were relevant demographics (large and medium effect sizes). Conclusions: Most mental health problems were consistently associated with both problematic video-gaming and problematic social media use, though associations were only practically relevant for conduct problems (both groups), suicidal thoughts (problematic video-gaming) and hyperactivity (problematic social media use). This study also highlights sedentary behavior as a health risk as it
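The odds ratios (OR) used as effect sizes in the abstract come from a standard 2×2 exposure-outcome table; a minimal illustration with invented counts (not data from this study):

```python
def odds_ratio(a, b, c, d):
    """OR for a 2x2 table.

    a, b: cases / non-cases among the exposed.
    c, d: cases / non-cases among the unexposed.
    OR = (a/b) / (c/d) = (a*d) / (b*c).
    """
    return (a * d) / (b * c)
```

With 10 cases and 20 non-cases among exposed adolescents versus 5 cases and 40 non-cases among unexposed ones, the OR is (10×40)/(20×5) = 4.0, which the thresholds quoted above would count as a large effect (OR ≥ 3).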

  11. Open scientific communication urged

    Science.gov (United States)

    Richman, Barbara T.

    In a report released last week the National Academy of Sciences' Panel on Scientific Communication and National Security concluded that the ‘limited and uncertain benefits’ of controls on the dissemination of scientific and technological research are ‘outweighed by the importance of scientific progress, which open communication accelerates, to the overall welfare of the nation.’ The 18-member panel, chaired by Dale R. Corson, president emeritus of Cornell University, was created last spring (Eos, April 20, 1982, p. 241) to examine the delicate balance between open dissemination of scientific and technical information and the U.S. government's desire to protect scientific and technological achievements from being translated into military advantages for our political adversaries.The panel dealt almost exclusively with the relationship between the United States and the Soviet Union but noted that there are ‘clear problems in scientific communication and national security involving Third World countries.’ Further study of this matter is necessary.

  12. A large industrial pollution problem on the Kyrgyzstan - Uzbekistan border: Soviet production of mercury and stibium for the Soviet military

    International Nuclear Information System (INIS)

    Hadjamberdiev, I.; Tukhvatshin, R.

    2009-01-01

    The Soviet mercury and stibium (antimony) industry was located in the south-east Fergana valley, on the border of Kyrgyzstan and Uzbekistan. The Khaidarken combine has produced high-purity mercury (99.9997 percent) since 1940; it was the second-largest source in the world (after Almadén, Spain). Maximal production was 790 t in 1990, and after the transitional shock about 300 tons a year. The tailings pond was established in 1967, with a special 5500 m pipeline transporting pulp to it. The pulp contains about 0.003 mg/liter mercury, 0.005 mg/liter arsenic, 21 mg/liter stibium, etc., and is treated with aluminum sulfate and mortar. After drying and self-compaction the concentrations rise: mercury 90-250 mg/kg, arsenic 190-400 mg/kg, stibium 800-1700 mg/kg. The environmental pollution problem has three components: groundwater infiltration; corrosion of the aging pipeline in some places (leaking from cracks in the pipe) - both of which lead to accumulation in vegetables; and the combine's operations spreading mercury through the air to the settlement of Khaidarken. The Kadamjay stibium enterprise (mines, combine, purification plant, tailings) began work in 1936; most of its production was used by the Soviet military. Maximal production was 17,000 t of cleaned ore in 1990, and after the collapse of the USSR 1-6 t/year. Enormous tailings and dams (150 million t in total) remain non-recultivated to this day. The tailings contain electrolysis wastes: sodium sulfides, sulfites and sulfates; stibium; arsenic; cadmium; etc. Seven deposits (in reality tailings dumps), established in 1976, have a total area of 76.1 thousand sq m and a total volume of 250 thousand cub m. The deposits are over-filled and their contents are filtrating out; small saline ponds or lakes have formed (one situated 50 m from the Uzbekistan border). The river Shakhimardan flows into Uzbekistan (settlement Vuadil, Ferghana town). Health damage indices are observed in these areas. (author)

  13. Mechanical problems in turbomachines, steam and gas turbines. Large steam turbine manufacturing requirements to fulfill customer needs for electric power

    International Nuclear Information System (INIS)

    Brazzini, R.

    1975-01-01

    The needs of customers for large steam turbines for electric power are examined. The choices and decisions made by the utility about the equipment are dealt with after considering the evolution of power demand on the French network. These decisions and choices mainly result from a technical and economic optimization of production equipment: the choice of field-proven solutions, a trend toward lower steam characteristics, a trend toward higher output of the units (i.e. the size effect), wider standardization of machines and components (a policy of common technical and technological levels, i.e. the mass-production effect), and standardization of the external characteristics of units of the same output level, and even of some main components. The requirements turbine manufacturers have to meet fall into two categories: on one side, gaining experience and know-how, the capability of making high quality experiments, output capacity, and the will to hold a high efficiency level; on the other side, meeting the technical requirements of the contracts. Among these requirements, one can distinguish those dealing with the service expected from the turbine, which define the responsibility limits of the manufacturer, from those intended to ensure interchangeability, to improve availability of the equipment, to increase safety, and to make operation and maintenance easier [fr

  14. Association of Stressful Life Events with Psychological Problems: A Large-Scale Community-Based Study Using Grouped Outcomes Latent Factor Regression with Latent Predictors

    Directory of Open Access Journals (Sweden)

    Akbar Hassanzadeh

    2017-01-01

    Full Text Available Objective. The current study is aimed at investigating the association between stressful life events and psychological problems in a large sample of Iranian adults. Method. In a cross-sectional large-scale community-based study, 4763 Iranian adults, living in Isfahan, Iran, were investigated. Grouped outcomes latent factor regression on latent predictors was used for modeling the association of psychological problems (depression, anxiety, and psychological distress), measured by the Hospital Anxiety and Depression Scale (HADS) and the General Health Questionnaire (GHQ-12), as the grouped outcomes, and stressful life events, measured by a self-administered stressful life events (SLEs) questionnaire, as the latent predictors. Results. The results showed that the personal stressors domain has a significant positive association with psychological distress (β=0.19), anxiety (β=0.25), depression (β=0.15), and their collective profile score (β=0.20), with greater associations in females (β=0.28) than in males (β=0.13) (all P<0.001). In addition, in the adjusted models, the regression coefficients for the association of the social stressors domain and the psychological problems profile score were 0.37, 0.35, and 0.46 in the total sample, males, and females, respectively (P<0.001). Conclusion. Results of our study indicated that different stressors, particularly those socioeconomically related, have an effective impact on psychological problems. It is important to consider the social and cultural background of a population for managing the stressors as an effective approach for preventing and reducing the destructive burden of psychological problems.

  15. Association of Stressful Life Events with Psychological Problems: A Large-Scale Community-Based Study Using Grouped Outcomes Latent Factor Regression with Latent Predictors

    Science.gov (United States)

    Hassanzadeh, Akbar; Heidari, Zahra; Hassanzadeh Keshteli, Ammar; Afshar, Hamid

    2017-01-01

    Objective The current study is aimed at investigating the association between stressful life events and psychological problems in a large sample of Iranian adults. Method In a cross-sectional large-scale community-based study, 4763 Iranian adults, living in Isfahan, Iran, were investigated. Grouped outcomes latent factor regression on latent predictors was used for modeling the association of psychological problems (depression, anxiety, and psychological distress), measured by Hospital Anxiety and Depression Scale (HADS) and General Health Questionnaire (GHQ-12), as the grouped outcomes, and stressful life events, measured by a self-administered stressful life events (SLEs) questionnaire, as the latent predictors. Results The results showed that the personal stressors domain has significant positive association with psychological distress (β = 0.19), anxiety (β = 0.25), depression (β = 0.15), and their collective profile score (β = 0.20), with greater associations in females (β = 0.28) than in males (β = 0.13) (all P < 0.001). In addition, in the adjusted models, the regression coefficients for the association of social stressors domain and psychological problems profile score were 0.37, 0.35, and 0.46 in total sample, males, and females, respectively (P < 0.001). Conclusion Results of our study indicated that different stressors, particularly socioeconomic-related ones, have an effective impact on psychological problems. It is important to consider the social and cultural background of a population for managing the stressors as an effective approach for preventing and reducing the destructive burden of psychological problems. PMID:29312459

  16. Scientific annual report 1973

    International Nuclear Information System (INIS)

    A report is given on the scientific research at DESY in 1973, which included the first storage of electrons in the double storage ring DORIS. Also mentioned are the two large spectrometers PLUTO and DASP, and experiments relating to elementary particles, synchrotron radiation, and the improvement of the equipment are described. (WL/AK) [de

  17. Group Peer Mentoring: An Answer to the Faculty Mentoring Problem? A Successful Program at a Large Academic Department of Medicine.

    Science.gov (United States)

    Pololi, Linda H; Evans, Arthur T

    2015-01-01

    To address a dearth of mentoring and to avoid the pitfalls of dyadic mentoring, the authors implemented and evaluated a novel collaborative group peer mentoring program in a large academic department of medicine. The mentoring program aimed to facilitate faculty in their career planning, and targeted either early-career or midcareer faculty in 5 cohorts over 4 years, from 2010 to 2014. Each cohort of 9-12 faculty participated in a yearlong program with foundations in adult learning, relationship formation, mindfulness, and culture change. Participants convened for an entire day, once a month. Sessions incorporated facilitated stepwise and values-based career planning, skill development, and reflective practice. Early-career faculty participated in an integrated writing program and midcareer faculty in leadership development. Overall attendance of the 51 participants was 96%, and only 3 of 51 faculty who completed the program left the medical school during the 4 years. All faculty completed a written detailed structured academic development plan. Participants experienced an enhanced, inclusive, and appreciative culture; clarified their own career goals, values, strengths and priorities; enhanced their enthusiasm for collaboration; and developed skills. The program results highlight the need for faculty to personally experience the power of forming deep relationships with their peers for fostering successful career development and vitality. The outcomes of faculty humanity, vitality, professionalism, relationships, appreciation of diversity, and creativity are essential to the multiple missions of academic medicine. © 2015 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education.

  18. Problems of simulation of large, long-lived vortices in the atmospheres of the giant planets (jupiter, saturn, neptune)

    Science.gov (United States)

    Nezlin, Michael V.; Sutyrin, Georgi G.

    1994-01-01

    Large, long-lived vortices are abundant in the atmospheres of the giant planets. Some of them survive a few orders of magnitude longer than the dispersive linear Rossby wave packets, e.g. the Great Red Spot (GRS), Little Red Spot (LRS) and White Ovals (WO) of Jupiter, Big Bertha, Brown Spot and Anne's Spot of Saturn, the Great Dark Spot (GDS) of Neptune, etc. Nonlinear effects which prevent their dispersion spreading are the main subject of our consideration. Particular emphasis is placed on determining the dynamical processes which may explain the remarkable properties of observed vortices, such as anticyclonic rotation in preference to cyclonic rotation and the uniqueness of the GRS, the largest coherent vortex along the perimeter of Jupiter at the corresponding latitude. We review recent experimental and theoretical studies of steadily translating solitary Rossby vortices (anticyclones) in a rotating shallow fluid. Two-dimensional monopolar solitary vortices trap fluid which is transported westward. These dualistic structures appear to be vortices, on the one hand, and solitary “waves”, on the other hand. Owing to the presence of the trapped fluid, such solitary structures collide inelastically and have a memory of the initial disturbance which is responsible for the formation of the structure. As a consequence, they have no definite relationship between the amplitude and characteristic size. Their vortical properties are connected with geostrophic advection of local vorticity. Their solitary properties (nonspreading and stationary translation) are due to a balance between Rossby wave dispersion and nonlinear effects which allow the anticyclones, with an elevation of a free surface, to propagate faster than the linear waves, without a resonance with linear waves, i.e. without wave radiation. On the other hand, cyclones, with a depression of a free surface, are dispersive and nonstationary features. This asymmetry in dispersion-nonlinear properties of cyclones and …

  19. Contemporary problems of health protection for workers employed at a large industrial enterprise and working under occupational hazards

    Directory of Open Access Journals (Sweden)

    E.Ya. Titova

    2017-12-01

    Full Text Available We examined data provided by a healthcare facility at a large industrial enterprise, focusing on occupational morbidity dynamics over 2013–2016 and the results of periodical medical examinations conducted in 2015 and 2016. We created a specialized program and applied it to conduct sociological research on the health of workers undergoing a periodical medical examination. We detected that most questioned workers (50.48 %) whose occupations were associated with dangerous and hazardous occupational factors were poorly aware of occupational disease prevention and needed relevant knowledge. It is shown that over 2013–2016 occupational morbidity decreased from 9.38 to 3.55 cases per 10,000 workers; however, it remained higher than the averages for Russia and Perm region. All occupational diseases were detected in workers older than 40 with a working record longer than 15 years. Compared with 2015, in 2016 the share of people able to work with certain limitations grew from 7.35 to 9.31 %; the number of people who needed sanatorium-resort therapy grew from 19.96 to 32.12 %; the number of people with general somatic diseases increased from 31.23 to 70.17 %; and the health index fell from 38.77 to 29.82 %. Musculoskeletal system and connective tissue diseases, circulatory system diseases, diseases of the eye and its accessory apparatus, respiratory organ diseases, and digestive organ diseases prevailed in the general somatic morbidity structure. We registered a substantial growth in hearing organ diseases (mostly hearing loss; from 49.47 to 99.06 cases per 100 examined), skin and subcutaneous tissue diseases (from 7.73 to 36.3 cases per 100 examined), and urinary system diseases (from 68.42 to 100.62 cases per 100 examined). We detected that most examined workers pursued an unhealthy lifestyle: for example, 29.9 % often consumed strong spirits (men and women equally) and 72.8 % smoked. All the respondents tended to have low physical activity. We also revealed some …

  20. Queen's discovery lauded by top scientific journal

    CERN Multimedia

    McGrady, S

    2002-01-01

    A scientific breakthrough at Queen's University's Sudbury Neutrino Observatory has received major international recognition. The journal Science ranked the discovery that cracked the "neutrino problem" second in the journal's top 10 scientific achievements of 2002 (1/2 page).

  1. Biomedical Scientific and Professional Social Networks in the Service of the Development of Modern Scientific Publishing.

    Science.gov (United States)

    Masic, Izet; Begic, Edin

    2016-12-01

    Information technologies have found their application in virtually every branch of health care. In recent years they have demonstrated their potential in the development of online libraries, where scientists and researchers can share their latest findings. Academia.edu, ResearchGate, Mendeley and Kudos, with the support of the GoogleScholar platform, have indeed increased the visibility of an author's scientific work and enable much greater availability of that work to a broader audience. Online libraries have allowed free access to scientific content in countries that could not bear the economic cost of gaining access to certain scientific databases; the benefit has been especially great in countries in transition and developing countries. Online libraries have great potential in terms of expanding knowledge, but they also present a major problem for many publishers, because the rights an author signs over when publishing a paper can be violated. In the future this will lead to a major conflict between authors, editorial boards and online databases over the right to scientific content. This question certainly represents one of the most pressing issues of publishing, whose future in printed form is already in the past and whose online future will be a large-scale problem.

  2. Use of large pieces of printed circuit boards for bioleaching to avoid 'precipitate contamination problem' and to simplify overall metal recovery.

    Science.gov (United States)

    Adhapure, N N; Dhakephalkar, P K; Dhakephalkar, A P; Tembhurkar, V R; Rajgure, A V; Deshmukh, A M

    2014-01-01

    Very recently bioleaching has been used for removing metals from electronic waste. Most of the research has targeted pulverized PCBs for bioleaching, where the precipitate formed during bioleaching contaminates the pulverized PCB sample, making the overall metal recovery process more complicated. In addition, such mixing of the pulverized sample with precipitate also creates problems for the final separation of the non-metallic fraction of the PCB sample. In the present investigation we attempted the use of large pieces of printed circuit boards instead of a pulverized sample for the removal of metals. The use of large pieces of PCBs for bioleaching had been restricted by the chemical coating present on PCBs; this problem was solved by chemical treatment of the PCBs prior to bioleaching. In short:•Large pieces of PCB can be used for bioleaching instead of a pulverized PCB sample.•The metallic portion of PCBs can be made accessible to bacteria by prior chemical treatment of the PCBs.•Complete metal removal was obtained on PCB pieces of size 4 cm × 2.5 cm, with the exception of solder traces. The final metal-free (non-metallic) PCBs can be easily recycled, and in this way the overall recycling process (metallic and non-metallic parts) of PCBs becomes simple.

  3. Modified truncated randomized singular value decomposition (MTRSVD) algorithms for large scale discrete ill-posed problems with general-form regularization

    Science.gov (United States)

    Jia, Zhongxiao; Yang, Yanfei

    2018-05-01

    In this paper, we propose new randomization-based algorithms for large scale linear discrete ill-posed problems with general-form regularization: min ‖Lx‖₂ subject to ‖Ax − b‖₂ = min, where L is a regularization matrix. Our algorithms are inspired by the modified truncated singular value decomposition (MTSVD) method, which suits only small to medium scale problems, and by randomized SVD (RSVD) algorithms that generate good low rank approximations to A. We use rank-k truncated randomized SVD (TRSVD) approximations to A, obtained by truncating rank-(k + q) RSVD approximations to A, where q is an oversampling parameter. The resulting algorithms are called modified TRSVD (MTRSVD) methods. At every step, we use the LSQR algorithm to solve the resulting inner least squares problem, which is proved to become better conditioned as k increases, so that LSQR converges faster. We present sharp bounds for the approximation accuracy of the RSVDs and TRSVDs for severely, moderately and mildly ill-posed problems, and substantially improve a known basic bound for TRSVD approximations. We prove how to choose the stopping tolerance for LSQR in order to guarantee that the computed and exact best regularized solutions have the same accuracy. Numerical experiments illustrate that the best regularized solutions by MTRSVD are as accurate as the ones by the truncated generalized singular value decomposition (TGSVD) algorithm, and at least as accurate as those by some existing truncated randomized generalized singular value decomposition (TRGSVD) algorithms. This work was supported in part by the National Science Foundation of China (Nos. 11771249 and 11371219).
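    The rank-k TRSVD construction described in the abstract (sample a rank-(k + q) range of A, then truncate to rank k) can be sketched in a few lines. This is a minimal illustration of the randomized truncation step only, not the paper's full MTRSVD algorithm, which additionally involves the regularization matrix L and inner LSQR solves; the test matrix and all names are illustrative.

```python
import numpy as np

def rand_tsvd(A, k, q=5, seed=0):
    """Rank-k truncated randomized SVD: sample a rank-(k + q) range of A,
    then truncate to rank k (q is the oversampling parameter)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + q))
    Q, _ = np.linalg.qr(A @ Omega)            # orthonormal basis of the sampled range
    B = Q.T @ A                               # small (k + q) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :k], s[:k], Vt[:k, :]         # keep only the leading k triplets

# toy ill-conditioned matrix with rapidly decaying singular values 0.5**j
n = 60
U0, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((n, n)))
s0 = 0.5 ** np.arange(n)
A = (U0 * s0) @ U0.T

U, s, Vt = rand_tsvd(A, k=10)
err = np.linalg.norm(A - (U * s) @ Vt, 2)     # close to the optimal error s0[10]
```

With a fast-decaying spectrum, the truncated randomized approximation is nearly as accurate as the exact best rank-10 approximation, which is what makes the RSVD a cheap substitute for a full SVD on large problems.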

  4. Scientific Symposium “Small Solution for Big Water-Related Problems: Innovative Microarrays and Small Sensors to Cope with Water Quality and Food Security”

    Directory of Open Access Journals (Sweden)

    Stefania Marcheggiani

    2015-12-01

    Full Text Available This issue presents the conclusive results of two European Commission funded projects, namely Universal Microarrays for the Evaluation of Fresh-water Quality Based on Detection of Pathogens and their Toxins (MicroAQUA) and Rationally Designed Aquatic Receptors (RADAR). These projects focused their activities on the quality of drinking water as an extremely important factor for the public health of humans and animals. The MicroAQUA Project aimed at developing a universal microarray chip for the detection of various pathogens (cyanobacteria, bacteria, viruses and parasitic protozoa) and their toxins in waters. In addition, the project included the detection of select species of diatoms, which represent reliable bio-indicators to assess overall water quality. Large numbers of compounds are released into the environment; some of these are toxins, such as endocrine disrupting compounds (EDCs), which can affect the endocrine, immune and nervous systems of a wide range of animals, causing alterations such as reproductive disorders and cancer. Detection of these contaminants in water systems is important to protect sensitive environmental sites and reduce the risk of toxins entering the food chain. A modular platform for monitoring toxins in water and food production facilities, using biosensors derived from aquatic organisms, was the main goal of the RADAR Project.

  5. Scalability of Parallel Scientific Applications on the Cloud

    Directory of Open Access Journals (Sweden)

    Satish Narayana Srirama

    2011-01-01

    Full Text Available Cloud computing, with its promise of virtually infinite resources, seems well suited to solving resource-greedy scientific computing problems. To study the effects of moving parallel scientific applications onto the cloud, we deployed several benchmark applications, such as matrix–vector operations and the NAS parallel benchmarks, as well as DOUG (Domain decomposition On Unstructured Grids), on the cloud. DOUG is an open source software package for the parallel iterative solution of very large sparse systems of linear equations. The detailed analysis of DOUG on the cloud showed that parallel applications benefit considerably and scale reasonably on the cloud. We could also observe the limitations of the cloud and compare its performance with that of a cluster. However, to run scientific applications efficiently on cloud infrastructure, the applications must be reduced to frameworks that can successfully exploit the cloud resources, like the MapReduce framework. Several iterative and embarrassingly parallel algorithms are reduced to the MapReduce model and their performance is measured and analyzed. The analysis showed that Hadoop MapReduce has significant problems with iterative methods, while it suits embarrassingly parallel algorithms well. Scientific computing often uses iterative methods to solve large problems. Thus, for scientific computing on the cloud, this paper raises the necessity for better frameworks or optimizations for MapReduce.
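    The contrast drawn above between embarrassingly parallel and iterative algorithms under MapReduce can be illustrated with a framework-free sketch: each map task is an independent work unit and a single reduce combines the partial results. A hypothetical Monte Carlo π estimate stands in for a real workload; Hadoop itself is not used here.

```python
import random

def map_chunk(args):
    """Map task: count darts landing inside the unit quarter-circle."""
    seed, n = args
    rng = random.Random(seed)                 # per-chunk seed keeps tasks independent
    return sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

def reduce_counts(counts, total):
    """Reduce task: combine per-chunk hit counts into a pi estimate."""
    return 4.0 * sum(counts) / total

chunks = [(seed, 100_000) for seed in range(8)]      # independent work units
counts = list(map(map_chunk, chunks))                # a framework would run these in parallel
pi_est = reduce_counts(counts, 8 * 100_000)
```

An iterative solver, by contrast, needs one such map/reduce round per iteration, which is where Hadoop's per-job startup and intermediate I/O costs accumulate, matching the abstract's observation.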

  6. Practice and effectiveness of web-based problem-based learning approach in a large class-size system: A comparative study.

    Science.gov (United States)

    Ding, Yongxia; Zhang, Peili

    2018-06-12

    Problem-based learning (PBL) is an effective and highly efficient teaching approach that is extensively applied in education systems across a variety of countries. This study aimed to investigate the effectiveness of web-based PBL teaching pedagogies in large classes. The cluster sampling method was used to separate two college-level nursing student classes (graduating class of 2013) into two groups. The experimental group (n = 162) was taught using a web-based PBL teaching approach, while the control group (n = 166) was taught using conventional teaching methods. We subsequently assessed the satisfaction of the experimental group in relation to the web-based PBL teaching mode. This assessment was performed following comparison of teaching activity outcomes pertaining to exams and self-learning capacity between the two groups. When compared with the control group, the examination scores and self-learning capabilities were significantly higher in the experimental group, and the experimental group reported satisfaction with the web-based PBL teaching approach. In a large class-size teaching environment, the web-based PBL teaching approach appears to be more optimal than traditional teaching methods. These results demonstrate the effectiveness of web-based teaching technologies in problem-based learning. Copyright © 2018. Published by Elsevier Ltd.

  7. Modelling of natural convection flows with large temperature differences: a benchmark problem for low Mach number solvers. Part. 1 reference solutions

    International Nuclear Information System (INIS)

    Le Quere, P.; Weisman, C.; Paillere, H.; Vierendeels, J.; Dick, E.; Becker, R.; Braack, M.; Locke, J.

    2005-01-01

    Heat transfer by natural convection and conduction in enclosures occurs in numerous practical situations including the cooling of nuclear reactors. For large temperature differences, the flow becomes compressible, with a strong coupling between the continuity, the momentum and the energy equations through the equation of state, and its properties (viscosity, heat conductivity) also vary with the temperature, making the Boussinesq flow approximation inappropriate and inaccurate. There are very few reference solutions in the literature on non-Boussinesq natural convection flows. We propose here a test case problem which extends the well-known De Vahl Davis differentially heated square cavity problem to the case of large temperature differences for which the Boussinesq approximation is no longer valid. The paper is split in two parts: in this first part, we propose as yet unpublished reference solutions for cases characterized by a non-dimensional temperature difference of 0.6, Ra = 10⁶ (constant property and variable property cases) and Ra = 10⁷ (variable property case). These reference solutions were produced after a first international workshop organized by CEA and LIMSI in January 2000, in which the above authors volunteered to produce accurate numerical solutions from which the present reference solutions could be established. (authors)

  18. THE APOLOGETIC CONCERN IN THE WORK OF BIBLICAL THEOLOGIANS OF THE KIEV THEOLOGICAL ACADEMY FROM THE END OF THE NINETEENTH TO THE BEGINNING OF THE TWENTIETH CENTURY (THE PROBLEM OF FINDING COMMON GROUND FOR THEOLOGICAL AND SCIENTIFIC THOUGHT)

    Directory of Open Access Journals (Sweden)

    Sergey Golovashchenko

    2013-06-01

    Full Text Available The author examines the relationship between scientific and theological components in a selection of the works of well-known Biblical scholars active at the Kiev Theological Academy around the turn of the nineteenth century and the beginning of the twentieth. Among them figure the names of F. J. Pokrovsky, V. P. Rybinsky, D. I. Bogdashevsky, and Father A. A. Glagolev. The work of these experts has been little studied until today. The spiritual, intellectual, and ideological context of the time has been taken into account by the author. The author of this article pays special attention to the ideological background surrounding the polemic between Russian Orthodox biblical scholars and the proponents of the negative school of biblical exegesis. The focus is on several key elements of understanding the Bible, the research and exposition of biblical history, as well as points of dogmatic and moral import stemming from an interpretation of the scriptures. The author demonstrates that the position of the Kievan biblical scholars was apologetic, contrasting the theological and scientific schools against the background of a more than positivistic understanding of history and of the Bible seen as the sacred scripture of the Church. In this way, they contributed to academic research and the way of teaching the scriptures in the schools, as well as to the exposition of the scriptures for the purpose of dogmatic and moral enlightenment. At the same time, they began the process of working towards a synthesis as an approach for further scientific and theological research. Important for the continuing development of Russian Orthodox biblical studies during the twentieth century was finding a balance between Orthodox biblical apologetics and scientific thought. This attempt at re-discovering and reconstructing the apologetic atmosphere of the Kievan biblical scholars was made possible through a combination of several factors, one of the most important being …

  9. Scientific Programming in Fortran

    Directory of Open Access Journals (Sweden)

    W. Van Snyder

    2007-01-01

    Full Text Available The Fortran programming language was designed by John Backus and his colleagues at IBM to reduce the cost of programming scientific applications. IBM delivered the first compiler for its model 704 in 1957. IBM's competitors soon offered incompatible versions. ANSI (ASA at the time) developed a standard, largely based on IBM's Fortran IV, in 1966. Revisions of the standard were produced in 1977, 1990, 1995 and 2003. Development of a revision, scheduled for 2008, is under way. Unlike most other programming languages, Fortran is periodically revised to keep pace with developments in language and processor design, while revisions largely preserve compatibility with previous versions. Throughout, the focus on scientific programming, and especially on efficient generated programs, has been maintained.

  10. Manual for JSSL (JAERI scientific subroutine library)

    International Nuclear Information System (INIS)

    Inoue, Shuji; Fujimura, Toichiro; Tsutsui, Tsuneo; Nishida, Takahiko

    1982-09-01

    A manual is presented for the revised version of the JAERI scientific subroutine library, a collection of scientific subroutines developed or modified in JAERI. They are classified into fifteen fields (Special Functions, Linear Problems, Eigenvalue and Eigenvector Problems, Nonlinear Problems, Mathematical Programming, Extreme Value Problems, Transformations, Functional Approximation Methods, Numerical Differential and Integral Methods, Numerical Differential and Integral Equations, Statistical Functions, Physical Problems, I/O Routines, Plotter Routines, Computer System Functions and Others). The main expansion in this version is in the fields of mathematical programming and statistical functions. The present library may be said to be a comprehensive compilation of scientific subroutines covering almost all the important fields. (author)

  11. A synergetic combination of small and large neighborhood schemes in developing an effective procedure for solving the job shop scheduling problem.

    Science.gov (United States)

    Amirghasemi, Mehrdad; Zamani, Reza

    2014-01-01

    This paper presents an effective procedure for solving the job shop problem. Synergistically combining small and large neighborhood schemes, the procedure consists of four components, namely (i) a construction method for generating semi-active schedules by a forward-backward mechanism, (ii) a local search for manipulating a small neighborhood structure guided by a tabu list, (iii) a feedback-based mechanism for perturbing the solutions generated, and (iv) a very large-neighborhood local search guided by a forward-backward shifting bottleneck method. The combination of shifting bottleneck mechanism and tabu list is used as a means of the manipulation of neighborhood structures, and the perturbation mechanism employed diversifies the search. A feedback mechanism, called repeat-check, detects consequent repeats and ignites a perturbation when the total number of consecutive repeats for two identical makespan values reaches a given threshold. The results of extensive computational experiments on the benchmark instances indicate that the combination of these four components is synergetic, in the sense that they collectively make the procedure fast and robust.
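    The repeat-check feedback described above can be sketched as a small stateful monitor. This is a hypothetical reading of the mechanism (count consecutive repeats of the incumbent makespan and signal a perturbation once a threshold is reached), not the authors' exact implementation; the threshold value and class name are illustrative.

```python
class RepeatCheck:
    """Sketch of a repeat-check feedback: watch the stream of makespan
    values produced by the local search and signal a perturbation when
    the same value repeats a threshold number of times in a row."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.last = None
        self.count = 0

    def observe(self, makespan):
        if makespan == self.last:
            self.count += 1
        else:
            self.last, self.count = makespan, 1
        if self.count >= self.threshold:
            self.count = 0        # reset after triggering
            return True           # caller should perturb the current solution
        return False

rc = RepeatCheck(threshold=3)
# a stagnating search: 88 repeats three times, then 87 repeats three times
signals = [rc.observe(m) for m in [90, 88, 88, 88, 87, 87, 87, 87]]
```

The monitor fires exactly at the points where the search has stalled, which is where the paper's procedure diversifies via its perturbation mechanism.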

  12. Multimode Resource-Constrained Multiple Project Scheduling Problem under Fuzzy Random Environment and Its Application to a Large Scale Hydropower Construction Project

    Science.gov (United States)

    Xu, Jiuping

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708

  13. Multimode resource-constrained multiple project scheduling problem under fuzzy random environment and its application to a large scale hydropower construction project.

    Science.gov (United States)

    Xu, Jiuping; Feng, Cuiying

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
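    The expected value operator with an optimistic-pessimistic index used in both abstracts to defuzzify fuzzy variables can be illustrated for a triangular fuzzy number. The formula below is one common form from the fuzzy measure literature and is an assumption for illustration, not necessarily the paper's exact operator; the example duration is invented.

```python
def expected_value(tri, lam=0.5):
    """Expected value of a triangular fuzzy number (a, b, c) under an
    optimistic-pessimistic index lam in [0, 1] (assumed common form):
    E = ((1 - lam) * a + b + lam * c) / 2.
    lam = 0.5 recovers the classic neutral expected value (a + 2b + c) / 4."""
    a, b, c = tri
    return ((1 - lam) * a + b + lam * c) / 2

# defuzzify a hypothetical activity duration "about 10 days", spread (8, 10, 13)
dur = (8.0, 10.0, 13.0)
e_pessimistic = expected_value(dur, lam=0.0)   # weights the low end
e_neutral     = expected_value(dur, lam=0.5)
e_optimistic  = expected_value(dur, lam=1.0)   # weights the high end
```

Sweeping lam lets a decision maker trade off pessimism and optimism before handing the crisp durations to the scheduling algorithm.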

  14. Accelerating scientific discovery : 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Beckman, P.; Dave, P.; Drugan, C.

    2008-11-14

    As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance in using the Blue Gene/L and optimizing user applications. Both the Catalyst and the Applications Performance Engineering and Data Analytics (APEDA) teams support the users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide …

  15. Scientific Council of the State Committee for Science and Technology on the problem 'New processes in the coking industry', Coke-Chemistry Section of the Scientific-Technical Council of the Ministry of Ferrous Metallurgy of the USSR and of the Central Administration of the Scientific-Technical Branch of Ferrous Metallurgy

    Energy Technology Data Exchange (ETDEWEB)

    Bukvareva, O.F.

    1986-12-01

    A report is presented on 3 conferences on the development of the Soviet coking industry held in February and April 1986. Papers delivered at the conferences and selected recommendations for the development of equipment for coking, coking systems, optimization of coal mixtures for coking, environmental protection and research programs on coking are reviewed. The following problems were discussed: modernization of coke oven design, selecting the optimum size of coke ovens, prospects for dry coke quenching in the USSR, evaluation of the operation of systems for dry coke quenching, utilization of waste heat from coke quenching for heat treatment of coal mixtures for coking, research programs on coking in the 12th five-year plan (1986-1990), partial briquetting of coal mixtures for coking and selecting optimum binders for partial briquetting, formed coke processes, economic analysis of formed coke processes, fire-resistant bricks and elements for coke oven construction, new coking technologies, and pollution control in coking plants.

  16. Practical scientific computing

    CERN Document Server

    Muhammad, A

    2011-01-01

    Scientific computing is about developing mathematical models, numerical methods, and computer implementations to study and solve real problems in science, engineering, business, and even the social sciences. Mathematical modelling requires a deep understanding of classical numerical methods. This essential guide provides the reader with sufficient foundations in these areas to venture into more advanced texts. The first section of the book presents numEclipse, an open-source tool for numerical computing modelled after MATLAB®. numEclipse is implemented as a plug-in for Eclipse, a leading integ

  17. Energy and scientific communication

    Science.gov (United States)

    De Sanctis, E.

    2013-06-01

    Energy communication is a paradigmatic case of scientific communication. It is particularly important today, when the world is confronted with a number of immediate, urgent problems. Science communication has become a real duty and a big challenge for scientists. It serves to create and foster a climate of reciprocal knowledge and trust between science and society, and to establish a good level of interest and enthusiasm for research. For effective communication it is important to establish an open dialogue with the audience and close collaboration between scientists and science communicators. International collaboration in energy communication is appropriate to better support international and interdisciplinary research and projects.

  18. REVIEWS OF TOPICAL PROBLEMS Rotational explosion mechanism for collapsing supernovae and the two-stage neutrino signal from supernova 1987A in the Large Magellanic Cloud

    Science.gov (United States)

    Imshennik, Vladimir S.

    2011-02-01

    The two-stage (double) neutrino signal produced by the outburst of the close supernova (SN) in the Large Magellanic Cloud, which started on 23 February 1987 (UT) and involved two neutrino signals during that night, is theoretically interpreted in terms of a scenario of rotationally exploding collapsing SNs, to whose class the outburst undoubtedly belongs. This scenario consists of a set of hydrodynamic and kinetic models in which key results are obtained by numerically solving non-one-dimensional and nonstationary problems. Of vital importance in this context is the inclusion of rotation effects, whose role is particularly significant for the question of how the original collapse of the presupernova iron core is transformed into the explosion of the SN shell, with an energy release on the familiar scale of 10^51 erg. The collapse in itself leads to the birth of neutron stars (or black holes) emitting neutrino and gravitational radiation signals of gigantic intensity, whose total energy significantly (by a factor of hundreds) exceeds the above-cited SN burst energy. The proposed rotational scenario is described briefly by artificially dividing it into three (or four) characteristic stages. This division is dictated by the physical meaning of the chain of events that a rotating iron core of a sufficiently massive (more than 10 solar masses) star triggers when it collapses. An attempt is made to quantitatively describe the properties of the associated neutrino and gravitational radiation. The review highlights the interpretation of the two-stage neutrino signal from SN 1987A, a problem which, given the present status of theoretical astrophysics, cannot, in the author's view, be solved without including rotation effects.

  19. Scientific Resource EXplorer

    Science.gov (United States)

    Xing, Z.; Wormuth, A.; Smith, A.; Arca, J.; Lu, Y.; Sayfi, E.

    2014-12-01

    Inquisitive minds in our society are never satisfied with curated images released by a typical public affairs office. They always want to look deeper and play directly with original data. However, most scientific data products are notoriously hard to use. They are immensely large, highly distributed, and diverse in format. In this presentation, we will demonstrate Resource EXplorer (REX), a novel webtop application that allows anyone to conveniently explore and visualize rich scientific data repositories using only a standard web browser. This tool leverages the power of Webification Science (w10n-sci), a powerful enabling technology that simplifies the use of scientific data on the web platform. W10n-sci is now being deployed at an increasing number of NASA data centers, some of which are the largest digital treasure troves in our nation. With REX, these wonderful scientific resources are open for teachers and students to learn and play.

  20. Professional scientific blog

    Directory of Open Access Journals (Sweden)

    Tamás Beke

    2009-03-01

    The professional blog is a weblog that on the whole meets the requirements of scientific publication. In my opinion it bears a resemblance to a digital notice board, where the competent specialists of a given branch of science can post their ideas, questions, and possible solutions, and can raise problems. Its most important function can be the collectivization of knowledge. In this article I examine the characteristics of the scientific blog as a genre. Conventional learning counts as a rather solitary activity. If students have access to the materials of each other and of the teacher, their sense of solitude diminishes, and this model is also closer to the constructivist approach that reflects the way most people think and learn. Learning does not mean passively collecting tiny pieces of knowledge; it much more resembles 'spinning a conceptual net' made up of the experiences and observations of the individual. With the spread of the Internet, many universities and colleges worldwide have tried online educational methods, but the most efficient one has not yet been found. Publication of the curriculum (the material of the lectures) and the handling of electronic mail are not sufficient; much more is needed for collaborative learning. Our scholastic scientific blog can be a suitable arena for starting a knowledge-building process based on cooperation. The Rocard report states that developing the teaching of the natural sciences is crucial for the future of Europe, and that this requires action at the local, regional, national, and EU levels. Beyond the traditional actors (child, parent, teacher), others (scientists, professionals, universities, local institutions, actors of the economic sphere, etc.) should be involved in the educational process. The scholastic scientific blog answers this purpose as a collaborative knowledge-sharing forum.

  1. Scientific report 1999

    International Nuclear Information System (INIS)

    2000-01-01

    This scientific report of the Fuel Cycle Direction of the Cea, presents the Direction activities and research programs in the fuel cycle domain during the year 1999. The first chapter is devoted to the front end of the fuel cycle with the SILVA process as main topic. The second chapter is largely based on the separation chemistry of the back end cycle. The third and fourth chapters present studies of more applied and sometimes more technical developments in the nuclear industry or not. (A.L.B.)

  2. Different stability of social-communication problems and negative demanding behaviour from infancy to toddlerhood in a large Dutch population sample

    Science.gov (United States)

    2014-01-01

    Background Little is known about the stability of behavioural and developmental problems as children develop from infants to toddlers in the general population. Therefore, we investigated behavioural profiles at two time points and determined whether behaviours are stable during early development. Methods Parents of 4,237 children completed questionnaires with 62 items about externalizing, internalizing, and social-communicative behaviour when the children were 14–15 and 36–37 months old. Factor mixture modelling identified five homogeneous profiles at both time points: three with relatively normal behaviour or with mild/moderate problems, one with clear communication and interaction problems, and another with pronounced negative and demanding behaviour. Results More than 85% of infants with normal behaviour or mild problems at 14–15 months were reported to behave relatively typically as toddlers at 36–37 months. A similar percentage of infants with moderate communication problems outgrew their problems by the time they were toddlers. However, infants with severe problems had mild to severe problems as toddlers, and did not show completely normal behaviour. Improvement over time occurred more often in children with negative and demanding behaviour than in children with communication and interaction problems. The former showed less homotypic continuity than the latter. Conclusions Negative and demanding behaviour is more often transient and a less specific predictor of problems in toddlerhood than communication and interaction problems. PMID:25061477

  3. Scientific impact: opportunity and necessity.

    Science.gov (United States)

    Cohen, Marlene Z; Alexander, Gregory L; Wyman, Jean F; Fahrenwald, Nancy L; Porock, Davina; Wurzbach, Mary E; Rawl, Susan M; Conn, Vicki S

    2010-08-01

    Recent National Institutes of Health changes have focused attention on the potential scientific impact of research projects. Research with the excellent potential to change subsequent science or health care practice may have high scientific impact. Only rigorous studies that address highly significant problems can generate change. Studies with high impact may stimulate new research approaches by changing understanding of a phenomenon, informing theory development, or creating new research methods that allow a field of science to move forward. Research with high impact can transition health care to more effective and efficient approaches. Studies with high impact may propel new policy developments. Research with high scientific impact typically has both immediate and sustained influence on the field of study. The article includes ideas to articulate potential scientific impact in grant applications as well as possible dissemination strategies to enlarge the impact of completed projects.

  4. Large deviations

    CERN Document Server

    Varadhan, S R S

    2016-01-01

    The theory of large deviations deals with rates at which probabilities of certain events decay as a natural parameter in the problem varies. This book, which is based on a graduate course on large deviations at the Courant Institute, focuses on three concrete sets of examples: (i) diffusions with small noise and the exit problem, (ii) large time behavior of Markov processes and their connection to the Feynman-Kac formula and the related large deviation behavior of the number of distinct sites visited by a random walk, and (iii) interacting particle systems, their scaling limits, and large deviations from their expected limits. For the most part the examples are worked out in detail, and in the process the subject of large deviations is developed. The book will give the reader a flavor of how large deviation theory can help in problems that are not posed directly in terms of large deviations. The reader is assumed to have some familiarity with probability, Markov processes, and interacting particle systems.
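    The rate-function idea at the heart of the subject admits a compact informal statement. The following is the standard textbook formulation, not a quotation from this book:

```latex
P(X_n \in A) \asymp e^{-n \inf_{x \in A} I(x)}, \qquad
I(x) = \sup_{\theta \in \mathbb{R}} \bigl[ \theta x - \log \mathbb{E}\, e^{\theta X_1} \bigr]
```

    Here the second expression is Cramér's rate function for sample means of i.i.d. random variables, one of the simplest instances of the theory developed in the book.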

  5. Scientific applications of symbolic computation

    International Nuclear Information System (INIS)

    Hearn, A.C.

    1976-02-01

    The use of symbolic computation systems for problem solving in scientific research is reviewed. The nature of the field is described, and particular examples are considered from celestial mechanics, quantum electrodynamics and general relativity. Symbolic integration and some more recent applications of algebra systems are also discussed

  6. Can scientific medicine incorporate alternative medicine?

    Science.gov (United States)

    Federspil, G; Vettor, R

    2000-06-01

    The authors examine the problem of defining alternative medicine, and after a brief analysis conclude that a satisfactory unifying definition of the different practices is not possible. Scientific knowledge is a function of scientific method. In turn the principle of falsifiability proposed by Karl Popper is used as a demarcation line between science and pseudoscience. They assert that the various alternative modalities do not represent authentic scientific disciplines, as they lack many of the minimum requirements of scientific discourse and, above all, because they violate the principle of falsifiability. Until they overcome these methodological shortcomings, alternative medical practices cannot become authentic scientific disciplines.

  7. Scientific expertise from the inside: AFSSET Working Group on Radio-frequencies (2008-2009)

    International Nuclear Information System (INIS)

    Barthe, Yannick

    2014-01-01

    Although there is now a large amount of social science research on scientific expertise and expert groups, direct evidence from sociologists who have themselves participated in scientific expert groups assessing controversial topics remains rare. This paper offers just this type of feedback. The aim is to analyse the production of scientific expert opinions on the basis of personal experience: the author's participation, as a sociologist, in an expert committee set up by the former French Agency for the Safety of Health, the Environment and Work (AFSSET) on the topic of radio-frequencies. Several problematic aspects of such groups are thus discussed on the basis of this concrete experience: the composition of the expert group, conflicts of interest, the organisation of the work within the group, the effects of the presence of an observer from an association, and the differences between performing scientific research and providing scientific expert opinions. (authors)

  8. Manual for JSSL (JAERI Scientific Subroutine Library)

    International Nuclear Information System (INIS)

    Fujimura, Toichiro; Tsutsui, Tsuneo

    1991-09-01

    JSSL (JAERI Scientific Subroutine Library) is a library of scientific subroutines developed or modified in JAERI. They are classified into sixteen fields (Special Functions, Linear Problems, Eigenvalue and Eigenvector Problems, Non Linear Problems, Mathematical Programming, Extreme Value Problems, Transformations, Functional Approximation Methods, Numerical Differential and Integral Methods, Numerical Differential and Integral Equations, Statistical Functions, Physical Problems, I/O Routines, Plotter Routines, Computer System Functions and Others). This report is the user manual for the revised version of JSSL which involves evaluated subroutines selected from the previous compilation of JSSL, applied in almost all the fields. (author)

  9. Scientific Services on the Cloud

    Science.gov (United States)

    Chapman, David; Joshi, Karuna P.; Yesha, Yelena; Halem, Milt; Yesha, Yaacov; Nguyen, Phuong

    Scientific computing was one of the first-ever applications of parallel and distributed computation. To this day, scientific applications remain some of the most compute-intensive, and have inspired the creation of petaflop compute infrastructure such as the Oak Ridge Jaguar and Los Alamos RoadRunner. Large dedicated hardware infrastructure has become both a blessing and a curse to the scientific community. Scientists are interested in cloud computing for much the same reasons as businesses and other professionals. The hardware is provided, maintained, and administered by a third party. Software abstraction and virtualization provide reliability and fault tolerance. Graduated fees allow for multi-scale prototyping and execution. Cloud computing resources are only a few clicks away, and are by far the easiest high-performance distributed platform to gain access to. There may still be dedicated infrastructure for ultra-scale science, but the cloud can easily play a major part in the scientific computing initiative.

  10. Scientific instruments, scientific progress and the cyclotron

    International Nuclear Information System (INIS)

    Baird, David; Faust, Thomas

    1990-01-01

    Philosophers speak of science in terms of theory and experiment, yet when they speak of the progress of scientific knowledge they speak in terms of theory alone. In this article it is claimed that scientific knowledge consists of, among other things, scientific instruments and instrumental techniques and not simply of some kind of justified beliefs. It is argued that one aspect of scientific progress can be characterized relatively straightforwardly - the accumulation of new scientific instruments. The development of the cyclotron is taken to illustrate this point. Eight different activities which promoted the successful completion of the cyclotron are recognised. The importance is in the machine rather than the experiments which could be run on it and the focus is on how the cyclotron came into being, not how it was subsequently used. The completed instrument is seen as a useful unit of scientific progress in its own right. (UK)

  11. Big Data Challenges for Large Radio Arrays

    Science.gov (United States)

    Jones, Dayton L.; Wagstaff, Kiri; Thompson, David; D'Addario, Larry; Navarro, Robert; Mattmann, Chris; Majid, Walid; Lazio, Joseph; Preston, Robert; Rebbapragada, Umaa

    2012-01-01

    Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields.

  12. Improving the scientific misconduct hearing process.

    Science.gov (United States)

    Parrish, D M

    The overturning and withdrawal of several of the Office of Research Integrity's (ORI's) findings of scientific misconduct have called its role into question. The contested findings of scientific misconduct that have been tried before the hearing body have been based on lengthy and expensive ORI investigations. How could ORI have failed to prove its findings of scientific misconduct after the commitment of substantial resources that far exceed those devoted during institutional investigations? One reason may be that the current hearing process makes it difficult or impossible for ORI, institutions, or individuals to prove scientific misconduct. The hearing process has been criticized by discouraged whistleblowers who believe that their allegations of scientific misconduct should have been upheld, and by the accused for the expensive and protracted nature of the proceedings. The following article examines problems in the scientific misconduct hearing process and suggests that the process could be improved by letting administrative law judges, patent attorneys, and a scientific majority decide these cases.

  13. Ecological potentialities for the future scientific research

    International Nuclear Information System (INIS)

    Fidirko, V.A.

    1996-01-01

    Efficient scientific development may promote the solution of environmental problems. This way of putting the question is new, since science is often considered to be the source of all environmental disasters and is blamed for them. The search for means of resolving scientifically induced crisis situations is therefore of great interest. (author)

  14. Extracting Core Claims from Scientific Articles

    NARCIS (Netherlands)

    Jansen, Tom; Kuhn, Tobias

    2017-01-01

    The number of scientific articles has grown rapidly over the years and there are no signs that this growth will slow down in the near future. Because of this, it becomes increasingly difficult to keep up with the latest developments in a scientific field. To address this problem, we present here an

  15. Evaluation of scheduling problems for the project planning of large-scale projects using the example of nuclear facility dismantling; Evaluation von Schedulingproblemen fuer die Projektplanung von Grossprojekten am Beispiel des kerntechnischen Rueckbaus

    Energy Technology Data Exchange (ETDEWEB)

    Huebner, Felix; Schellenbaum, Uli; Stuerck, Christian; Gerhards, Patrick; Schultmann, Frank

    2017-05-15

    The magnitude of widespread nuclear decommissioning and dismantling, in terms of deconstruction costs and project duration, exceeds even most prominent large-scale projects. The deconstruction costs of one reactor are estimated at several hundred million euros, and the dismantling period at more than a decade. The nuclear power plants built in the 1970s are approaching the end of their planned operating lifespans. Therefore, the decommissioning and dismantling of nuclear facilities, which poses a multitude of challenges to planning and implementation, is becoming more and more relevant. This study describes planning methods for large-scale projects. The goal of this paper is to formulate a project planning problem that appropriately copes with the specific challenges of nuclear deconstruction projects. For this purpose, the requirements for appropriate scheduling methods are presented. Furthermore, a variety of possible scheduling problems are introduced and compared by their specifications and their behaviour. A set of particular scheduling problems, including possible extensions and generalisations, is assessed in detail. Based on the introduced problems and extensions, a Multi-mode Resource Investment Problem with Tardiness Penalty is chosen to fit the requirements of nuclear facility dismantling. This scheduling problem is then customised and adjusted to the specific challenges of nuclear deconstruction projects. The result can be called a Multi-mode Resource Investment Problem with generalised precedence constraints and post-operational costs.
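    To make the problem class concrete, the sketch below evaluates one candidate solution of a tiny multi-mode resource investment problem with a tardiness penalty: each activity picks a mode (duration, resource demand), the cost is the investment in peak resource capacity plus a penalty for finishing after the deadline. All activities, modes, and cost figures are invented for illustration; they are not the paper's model.

```python
# Toy Multi-mode Resource Investment Problem with tardiness penalty.
# All data below is hypothetical, for illustration only.

# activity -> list of modes; each mode = (duration, resource demand)
modes = {
    "A": [(3, 2), (5, 1)],   # mode 0: fast but resource-hungry
    "B": [(2, 3), (4, 2)],
    "C": [(4, 1)],
}
precedence = [("A", "B"), ("A", "C")]   # A must finish before B and C start

def evaluate(chosen_mode, start, deadline, invest_cost, tardy_penalty):
    """Cost = investment for the peak resource usage + tardiness penalty."""
    # check precedence feasibility of the given start times
    for pred, succ in precedence:
        d_pred = modes[pred][chosen_mode[pred]][0]
        assert start[succ] >= start[pred] + d_pred, "precedence violated"
    makespan = max(start[a] + modes[a][chosen_mode[a]][0] for a in modes)
    # peak resource demand over time determines the capacity to invest in
    peak = 0
    for t in range(makespan):
        usage = sum(modes[a][chosen_mode[a]][1]
                    for a in modes
                    if start[a] <= t < start[a] + modes[a][chosen_mode[a]][0])
        peak = max(peak, usage)
    return invest_cost * peak + tardy_penalty * max(0, makespan - deadline)

cost = evaluate(chosen_mode={"A": 0, "B": 1, "C": 0},
                start={"A": 0, "B": 3, "C": 3},
                deadline=6, invest_cost=10, tardy_penalty=50)
print(cost)
```

    A solver for the real problem searches over mode choices and start times; this sketch only shows the objective such a search would minimize.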

  16. WHICH HEALTH-RELATED PROBLEMS ARE ASSOCIATED WITH PROBLEMATIC VIDEO-GAMING OR SOCIAL MEDIA USE IN ADOLESCENTS? A LARGE-SCALE CROSS-SECTIONAL STUDY

    OpenAIRE

    Saskia Y. M. Mérelle; Annet M. Kleiboer; Miriam Schotanus; Theresia L. M. Cluitmans; Cornelia M. Waardenburg; Danielle Kramer; Dike van de Mheen; Antonius J. van Rooij

    2017-01-01

    _Objective:_ Problematic video-gaming or social media use may seriously affect adolescents' health status. However, it is not well known which health-related problems are most strongly related to these issues. To inform the development of prevention and intervention strategies, this study aims to gain a better understanding of the health-related problems and demographic factors associated with problematic video-gaming or social media use in early adolescence. _Method:_ A...

  17. Science and society: The benefits of scientific collaboration

    CERN Multimedia

    2003-01-01

    The guest speaker at the next Science and Society symposium is no stranger to CERN. He is, in fact, Sir Chris Llewellyn Smith, Director General of CERN from 1994 to 1998. His topic is one with which he is particularly familiar, having "lived" it throughout his time at CERN: international scientific collaboration and its advantages. International scientific collaboration is essential in a wide range of areas and for a large number of reasons: scientific problems have no frontiers; certain subjects are so complex that they require the expertise of numerous countries; certain types of research, such as that carried out at CERN, cannot be pursued by one nation on its own. However, scientific collaboration is not only beneficial to science itself. This is the point Chris Llewellyn Smith intends to demonstrate in his address. Scientific collaboration can help to build bridges between societies and act as a spur to the development of certain countries. It can even help to diminish conflicts in certain cases. The his...

  18. Scientific computing vol II - eigenvalues and optimization

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the second of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses more advanced topics than volume one, and is largely not a prerequisite for volume three. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 49 examples, 110 exercises, 66 algorithms, 24 interactive JavaScript programs, 77 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in LAPACK, GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either upper level undergraduate...
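    In the spirit of the volume's eigenvalue chapters, here is a minimal power-iteration sketch for estimating a dominant eigenvalue. It is illustrative only and not taken from the book, which works with production-quality algorithms such as those in LAPACK.

```python
# Power iteration: repeatedly apply the matrix and normalize; the
# iterate converges toward the dominant eigenvector, and the norm of
# the product approaches the dominant eigenvalue's magnitude.

def power_iteration(matrix, iters=100):
    """Estimate the dominant eigenvalue of a small square matrix."""
    n = len(matrix)
    v = [1.0] * n          # arbitrary nonzero starting vector
    eig = 0.0
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w)
        v = [x / norm for x in w]
        eig = norm
    return eig

# eigenvalues of [[2, 1], [1, 2]] are 3 and 1; the dominant one is 3
print(power_iteration([[2.0, 1.0], [1.0, 2.0]]))
```

    Measuring how fast such an iteration converges, and what accuracy is attainable in floating point, is exactly the kind of question the book teaches readers to answer.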

  19. Assignment of uncertainties to scientific data

    International Nuclear Information System (INIS)

    Froehner, F.H.

    1994-01-01

    Long-standing problems of uncertainty assignment to scientific data came into sharp focus in recent years when uncertainty information ('covariance files') had to be added to application-oriented large libraries of evaluated nuclear data such as ENDF and JEF. Questions arose about the best way to express uncertainties, the meaning of statistical and systematic errors, the origin of correlations and the construction of covariance matrices, the combination of uncertain data from different sources, the general usefulness of results that are strictly valid only for Gaussian or only for linear statistical models, etc. Conventional statistical theory is often unable to give unambiguous answers, and tends to fail when statistics are poor, so that prior information becomes crucial. Modern probability theory, on the other hand, incorporating decision-theoretic and group-theoretic results, is shown to provide straight and unique answers to such questions, and to deal easily with prior information and small samples. (author). 10 refs
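    One of the questions the abstract raises, combining uncertain data from different sources, has a standard covariance-based answer for the simplest case: two correlated measurements of the same quantity, averaged by generalized least squares. The sketch below works out the 2x2 case with invented numbers; it illustrates the textbook formula, not this paper's specific treatment.

```python
# Combine two correlated measurements of the same quantity by
# minimizing the chi-square with the full 2x2 covariance matrix
# [[v1, cov], [cov, v2]].  Numbers below are illustrative only.

def combine(x1, x2, v1, v2, cov):
    """Return (best estimate, variance) for two correlated measurements.

    v1, v2 are the variances of x1 and x2; cov is their covariance.
    Assumes v1 + v2 - 2*cov > 0 (non-degenerate covariance matrix).
    """
    denom = v1 + v2 - 2.0 * cov
    w1 = (v2 - cov) / denom       # weight of the first measurement
    w2 = (v1 - cov) / denom       # weight of the second measurement
    estimate = w1 * x1 + w2 * x2
    variance = (v1 * v2 - cov * cov) / denom
    return estimate, variance

est, var = combine(x1=10.0, x2=12.0, v1=1.0, v2=4.0, cov=0.5)
print(est, var)
```

    With cov = 0 this reduces to the familiar inverse-variance weighted mean; a positive correlation shifts weight toward the more precise measurement.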

  20. Load Balancing Scientific Applications

    Energy Technology Data Exchange (ETDEWEB)

    Pearce, Olga Tkachyshyn [Texas A & M Univ., College Station, TX (United States)

    2014-12-01

    The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
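    The "fast, local" end of the algorithmic spectrum the abstract mentions can be illustrated with the simplest greedy rebalancing rule: assign the heaviest remaining work item to the currently least-loaded processor (longest-processing-time-first). This is a generic sketch with invented task costs, not the dissertation's model.

```python
import heapq

def balance(task_costs, n_procs):
    """Greedy LPT assignment: heaviest task to the least-loaded processor.

    Returns the resulting per-processor loads.
    """
    heap = [(0, p) for p in range(n_procs)]       # (load, processor id)
    heapq.heapify(heap)
    assignment = {p: [] for p in range(n_procs)}
    for cost in sorted(task_costs, reverse=True):  # heaviest first
        load, p = heapq.heappop(heap)              # least-loaded processor
        assignment[p].append(cost)
        heapq.heappush(heap, (load + cost, p))
    return [sum(assignment[p]) for p in range(n_procs)]

loads = balance([5, 3, 3, 2, 2, 1], n_procs=2)
print(loads)
```

    The dissertation's point is precisely that such a cheap heuristic is sometimes enough, while other simulation states justify a more expensive algorithm; a cost model decides which to invoke and when.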

  1. Accelerating the scientific exploration process with scientific workflows

    International Nuclear Information System (INIS)

    Altintas, Ilkay; Barney, Oscar; Cheng, Zhengang; Critchlow, Terence; Ludaescher, Bertram; Parker, Steve; Shoshani, Arie; Vouk, Mladen

    2006-01-01

    Although an increasing amount of middleware has emerged in the last few years to achieve remote data access, distributed job execution, and data management, orchestrating these technologies with minimal overhead still remains a difficult task for scientists. Scientific workflow systems improve this situation by creating interfaces to a variety of technologies and automating the execution and monitoring of the workflows. Workflow systems provide domain-independent customizable interfaces and tools that combine different tools and technologies along with efficient methods for using them. As simulations and experiments move into the petascale regime, the orchestration of long running data and compute intensive tasks is becoming a major requirement for the successful steering and completion of scientific investigations. A scientific workflow is the process of combining data and processes into a configurable, structured set of steps that implement semi-automated computational solutions of a scientific problem. Kepler is a cross-project collaboration, co-founded by the SciDAC Scientific Data Management (SDM) Center, whose purpose is to develop a domain-independent scientific workflow system. It provides a workflow environment in which scientists design and execute scientific workflows by specifying the desired sequence of computational actions and the appropriate data flow, including required data transformations, between these steps. Currently deployed workflows range from local analytical pipelines to distributed, high-performance and high-throughput applications, which can be both data- and compute-intensive. 
The scientific workflow approach offers a number of advantages over traditional scripting-based approaches, including ease of configuration, improved reusability and maintenance of workflows and components (called actors), automated provenance management, 'smart' re-running of different versions of workflow instances, on-the-fly updateable parameters, monitoring
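    The core idea, steps declared with data dependencies and executed in dependency order, can be shown in a few lines. The step names and functions below are invented for the sketch; real systems such as Kepler add the provenance tracking, monitoring, and distributed execution the abstract describes.

```python
from graphlib import TopologicalSorter

# three toy workflow steps
def fetch():      return [3, 1, 2]     # e.g. acquire raw data
def transform(x): return sorted(x)     # e.g. clean/reorder it
def analyze(x):   return sum(x)        # e.g. compute a summary

# workflow graph: each step maps to the set of steps it depends on
graph = {"fetch": set(), "transform": {"fetch"}, "analyze": {"transform"}}
steps = {"fetch": fetch, "transform": transform, "analyze": analyze}

# execute steps in dependency order, feeding each its inputs
results = {}
for name in TopologicalSorter(graph).static_order():
    deps = [results[d] for d in sorted(graph[name])]
    results[name] = steps[name](*deps)

print(results["analyze"])
```

    A workflow system replaces the hand-written loop with a configurable engine, which is what makes re-running, monitoring, and provenance capture automatic rather than ad hoc.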

  2. A generative model for scientific concept hierarchies.

    Science.gov (United States)

    Datta, Srayan; Adar, Eytan

    2018-01-01

    In many scientific disciplines, each new 'product' of research (method, finding, artifact, etc.) is often built upon previous findings-leading to extension and branching of scientific concepts over time. We aim to understand the evolution of scientific concepts by placing them in phylogenetic hierarchies where scientific keyphrases from a large, longitudinal academic corpora are used as a proxy of scientific concepts. These hierarchies exhibit various important properties, including power-law degree distribution, power-law component size distribution, existence of a giant component and less probability of extending an older concept. We present a generative model based on preferential attachment to simulate the graphical and temporal properties of these hierarchies which helps us understand the underlying process behind scientific concept evolution and may be useful in simulating and predicting scientific evolution.
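    Preferential attachment, the mechanism the model is based on, is easy to simulate: each new concept extends an existing one with probability proportional to how often that concept has already been extended, plus a smoothing constant. The parameters below are illustrative and not the paper's fitted values.

```python
import random

def grow(n, alpha=1.0, seed=42):
    """Grow a concept hierarchy of n nodes by preferential attachment."""
    rng = random.Random(seed)
    parents = {0: None}        # node 0 is the root concept
    extensions = {0: 0}        # how often each concept has been extended
    for new in range(1, n):
        nodes = list(extensions)
        # attachment probability ~ (times extended + alpha)
        weights = [extensions[v] + alpha for v in nodes]
        parent = rng.choices(nodes, weights=weights)[0]
        parents[new] = parent
        extensions[parent] += 1
        extensions[new] = 0
    return parents

tree = grow(200)
# frequently-extended concepts accumulate many children,
# producing the heavy-tailed degree distribution the paper reports
degree = {}
for child, parent in tree.items():
    if parent is not None:
        degree[parent] = degree.get(parent, 0) + 1
print(max(degree.values()))
```

    Because older concepts have had more chances to be extended, newer concepts are extended less often, matching the "less probability of extending an older concept" observation only once the model's age-dependent terms are added; this sketch shows the plain preferential-attachment core.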

  4. Solved problems in electrochemistry

    International Nuclear Information System (INIS)

    Piron, D.L.

    2004-01-01

    This book presents calculated solutions to problems in fundamental and applied electrochemistry. It uses industrial data to illustrate scientific concepts and scientific knowledge to solve practical problems. It is subdivided into three parts. The first uses modern basic concepts; the second studies the scientific basis for electrode and electrolyte thermodynamics (including E-pH diagrams and the minimum energy involved in transformations) and the kinetics of rate processes (including the energy lost as heat and in parasitic reactions). The third part treats larger problems in electrolysis and power generation, as well as in corrosion and its prevention. Each chapter includes three sections: a presentation of useful principles; some twenty problems with their solutions; and a set of unsolved problems
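As a small worked example of the electrode-thermodynamics material such a problem book covers (illustrative only; the couple and values below are not taken from the book), the Nernst equation relates an electrode potential to the ratio of oxidized to reduced species:

```python
import math

def nernst_potential(e_standard, n_electrons, ox_over_red, temp_k=298.15):
    """Electrode potential (V) via the Nernst equation:
    E = E0 + (R*T)/(n*F) * ln([Ox]/[Red])."""
    R = 8.314      # gas constant, J/(mol*K)
    F = 96485.0    # Faraday constant, C/mol
    return e_standard + (R * temp_k / (n_electrons * F)) * math.log(ox_over_red)

# Illustrative one-electron couple with E0 = 0.771 V (Fe3+/Fe2+):
# a 10:1 oxidized-to-reduced ratio shifts E by about +59 mV at 25 C.
e = nernst_potential(0.771, 1, 10.0)
```

A tenfold change in the concentration ratio shifts the potential by roughly 59 mV per electron at room temperature, which is the standard rule of thumb behind E-pH (Pourbaix) diagram construction.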

  5. Crash problem definition and safety benefits methodology for stability control for single-unit medium and heavy trucks and large-platform buses

    Science.gov (United States)

    2009-10-01

    This report presents the findings of a comprehensive engineering analysis of electronic stability control (ESC) and roll stability control (RSC) systems for single-unit medium and heavy trucks and large-platform buses. This report details the applica...

  6. Scientific integrity in Brazil.

    Science.gov (United States)

    Lins, Liliane; Carvalho, Fernando Martins

    2014-09-01

    This article focuses on scientific integrity and the identification of factors predisposing to scientific misconduct in Brazil. Brazilian scientific production has increased in the last ten years, but the quality of the articles has decreased. Pressure on researchers and students to increase scientific production may contribute to scientific misconduct. Cases of misconduct in science have recently been denounced in the country. Brazil has important institutions for controlling the ethical and safety aspects of human research, but it lacks specific offices to investigate suspected cases of misconduct and policies to deal with scientific dishonesty.

  7. Introduction to scientific publishing backgrounds, concepts, strategies

    CERN Document Server

    Öchsner, Andreas

    2013-01-01

    This book is a very concise introduction to the basic knowledge of scientific publishing. It starts with the basics of writing a scientific paper and recalls the different types of scientific documents. It gives an overview of the major scientific publishing companies and different business models. The book also introduces abstracting and indexing services and how they can be used for the evaluation of science, scientists, and institutions. Last but not least, this short book addresses the problem of plagiarism and publication ethics.

  8. Different stability of social-communication problems and negative demanding behaviour from infancy to toddlerhood in a large Dutch population sample

    NARCIS (Netherlands)

    Moricke, E.; Lappenschaar, G.M.; Swinkels, S.H.N.; Rommelse, N.N.J.; Buitelaar, J.K.

    2014-01-01

    BACKGROUND: Little is known about the stability of behavioural and developmental problems as children develop from infants to toddlers in the general population. Therefore, we investigated behavioural profiles at two time points and determined whether behaviours are stable during early development.

  9. Which health-related problems are associated with problematic video-gaming or social media use in adolescents? : A large-scale cross-sectional study

    NARCIS (Netherlands)

    S.Y.M. Mérelle (Saskia); A. Kleiboer (Annet); M. Schotanus (Miriam); T.L.M. Cluitmans (Theresia L. M.); C.M. Waardenburg (Cornelia M.); D. Kramer (Danielle); H. van de Mheen (Dike); A.J. van Rooij (Antonius)

    2017-01-01

    Objective: Problematic video-gaming or social media use may seriously affect adolescents’ health status. However, it is not very well known which health-related problems are most strongly related to these issues. To inform the development of prevention and intervention strategies,

  10. Concept Formation in Scientific Knowledge Discovery from a Constructivist View

    Science.gov (United States)

    Peng, Wei; Gero, John S.

    The central goal of scientific knowledge discovery is to learn cause-effect relationships among natural phenomena presented as variables, and the consequences of their interactions. Scientific knowledge is normally expressed as scientific taxonomies and qualitative and quantitative laws [1]. This type of knowledge represents intrinsic regularities of the observed phenomena that can be used to explain and predict their behaviors. It is a generalization that is abstracted and externalized from a set of contexts and applicable to a broader scope. Scientific knowledge is a type of third-person knowledge, i.e., knowledge that is independent of a specific enquirer. Artificial intelligence approaches, particularly the data mining algorithms used to identify meaningful patterns in large data sets, aim to facilitate the knowledge discovery process [2]. A broad spectrum of algorithms has been developed to address classification, associative learning, and clustering problems. However, their linkages to the people who use them have not been adequately explored. Issues related to supporting the interpretation of patterns, applying prior knowledge to the data mining process, and addressing user interactions remain challenges for building knowledge discovery tools [3]. As a consequence, scientists rely on their experience to formulate problems, evaluate hypotheses, reason about untraceable factors, and derive new problems. This type of knowledge, which they have developed during their careers, is called "first-person" knowledge. The formation of scientific knowledge (third-person knowledge) is highly influenced by the enquirer's first-person knowledge construct, which is a result of his or her interactions with the environment. There have been attempts to craft automatic knowledge discovery tools, but these systems are limited in their capabilities to handle the dynamics of personal experience. There are now trends in developing

  11. Scientific Data Management Center for Enabling Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Vouk, Mladen A.

    2013-01-15

    Managing scientific data has been identified by the scientific community as one of the most important emerging needs because of the sheer volume and increasing complexity of data being collected. Effectively generating, managing, and analyzing this information requires a comprehensive, end-to-end approach to data management that encompasses all of the stages from the initial data acquisition to the final analysis of the data. Fortunately, the data management problems encountered by most scientific domains are common enough to be addressed through shared technology solutions. Based on community input, we have identified three significant requirements. First, more efficient access to storage systems is needed. In particular, parallel file system and I/O system improvements are needed to write and read large volumes of data without slowing a simulation, analysis, or visualization engine. These processes are complicated by the fact that scientific data are structured differently for specific application domains, and are stored in specialized file formats. Second, scientists require technologies to facilitate better understanding of their data, in particular the ability to effectively perform complex data analysis and searches over extremely large data sets. Specialized feature discovery and statistical analysis techniques are needed before the data can be understood or visualized. Furthermore, interactive analysis requires techniques for efficiently selecting subsets of the data. Finally, generating the data, collecting and storing the results, keeping track of data provenance, data post-processing, and analysis of results is a tedious, fragmented process. Tools for automation of this process in a robust, tractable, and recoverable fashion are required to enhance scientific exploration. The SDM center was established under the SciDAC program to address these issues. The SciDAC-1 Scientific Data Management (SDM) Center succeeded in bringing an initial set of advanced

  12. Testing Scientific Software: A Systematic Literature Review

    Science.gov (United States)

    Kanewala, Upulee; Bieman, James M.

    2014-01-01

    Context Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques. PMID:25125798

  13. Testing Scientific Software: A Systematic Literature Review.

    Science.gov (United States)

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques.

  14. Mental health problems are associated with low-frequency fluctuations in reaction time in a large general population sample. The TRAILS study.

    Science.gov (United States)

    Bastiaansen, J A; van Roon, A M; Buitelaar, J K; Oldehinkel, A J

    2015-02-01

    Increased intra-subject reaction time variability (RT-ISV), as coarsely measured by the standard deviation (RT-SD), has been associated with many forms of psychopathology. Low-frequency RT fluctuations, which have been associated with intrinsic brain rhythms occurring approximately every 15-40 s, have been shown to add unique information for ADHD. In this study, we investigated whether these fluctuations also relate to attentional problems in the general population and contribute to the two major domains of psychopathology: externalizing and internalizing problems. RT was monitored throughout a self-paced sustained attention task (duration: 9.1 ± 1.2 min) in a Dutch population cohort of young adults (n = 1455, mean age: 19.0 ± 0.6 years, 55.1% girls). To characterize temporal fluctuations in RT, we performed a direct Fourier transform on externally validated frequency bands based on the frequency ranges of neuronal oscillations: Slow-5 (0.010-0.027 Hz), Slow-4 (0.027-0.073 Hz), and three additional higher-frequency bands. The relative magnitude of Slow-4 fluctuations was the primary predictor in regression models for attentional, internalizing and externalizing problems (measured by the Adult Self-Report questionnaire). Additionally, stepwise regression models were created to investigate (a) whether Slow-4 significantly improved the prediction of problem behaviors beyond RT-SD and (b) whether the other frequency bands provided important additional information. The magnitude of Slow-4 fluctuations significantly predicted attentional and externalizing problems and even improved model fit after modeling RT-SD first (R(2) change = 0.6%); none of the other frequency bands provided additional information. Low-frequency RT fluctuations have added predictive value for attentional and externalizing, but not internalizing, problems beyond global differences in variability. This study extends previous findings in clinical samples of children with ADHD to adolescents from the general population and
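The band-magnitude measure used in the abstract above can be sketched as follows. This is a hedged illustration, not the authors' code: it computes, via a direct discrete Fourier transform, the fraction of a reaction-time series' spectral magnitude that falls inside a band such as Slow-4 (0.027-0.073 Hz):

```python
import cmath
import math

def band_fraction(x, dt, f_lo, f_hi):
    """Fraction of the (non-DC) spectral magnitude of series x, sampled
    every dt seconds, that falls in the band [f_lo, f_hi] Hz."""
    n = len(x)
    mean = sum(x) / n
    xc = [v - mean for v in x]              # remove the DC component
    total = band = 0.0
    for k in range(1, n // 2 + 1):          # positive frequencies only
        amp = abs(sum(xc[t] * cmath.exp(-2j * math.pi * k * t / n)
                      for t in range(n)))
        total += amp
        if f_lo <= k / (n * dt) <= f_hi:    # bin frequency in Hz
            band += amp
    return band / total if total else 0.0

# A pure 0.05 Hz oscillation sampled once per second for 200 s falls
# entirely inside the Slow-4 band (0.027-0.073 Hz).
rt = [math.sin(2 * math.pi * 0.05 * t) for t in range(200)]
slow4 = band_fraction(rt, 1.0, 0.027, 0.073)
```

In practice the relative Slow-4 magnitude of a real RT series would then be entered as a predictor in the regression models, alongside or after RT-SD.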

  15. Extensional scientific realism vs. intensional scientific realism.

    Science.gov (United States)

    Park, Seungbae

    2016-10-01

    Extensional scientific realism is the view that each believable scientific theory is supported by the unique first-order evidence for it and that if we want to believe that it is true, we should rely on its unique first-order evidence. In contrast, intensional scientific realism is the view that all believable scientific theories have a common feature and that we should rely on it to determine whether a theory is believable or not. Fitzpatrick argues that extensional realism is immune, while intensional realism is not, to the pessimistic induction. I reply that if extensional realism overcomes the pessimistic induction at all, that is because it implicitly relies on the theoretical resource of intensional realism. I also argue that extensional realism, by nature, cannot embed a criterion for distinguishing between believable and unbelievable theories. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. The scientific status of fusion

    International Nuclear Information System (INIS)

    Crandall, D.H.

    1989-01-01

    The development of fusion energy has been a large-scale scientific undertaking of broad interest. Magnetic plasma containment in tokamaks and laser-driven ignition of microfusion capsules appear to be scientifically feasible sources of energy. These concepts are bounded by questions of the required intensity of magnetic fields and plasma currents, or of drive energy, and, for both concepts, by issues of plasma stability and energy transport. The basic concept and the current scientific issues are described for magnetic fusion and for the interesting, but likely infeasible, muon-catalyzed fusion concept. Inertial fusion is mentioned, qualitatively, to complete the context. For magnetic fusion, the required net energy production within the plasma may be accomplished soon, but the more useful goal of self-sustained plasma ignition requires a new device of somewhat uncertain (factor of 2) cost and size. (orig.)

  17. WWW: The Scientific Method

    Science.gov (United States)

    Blystone, Robert V.; Blodgett, Kevin

    2006-01-01

    The scientific method is the principal methodology by which biological knowledge is gained and disseminated. As fundamental as the scientific method may be, its historical development is poorly understood, its definition is variable, and its deployment is uneven. Scientific progress may occur without the strictures imposed by the formal…

  18. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    Full Text Available There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with equational programming language (EPL. Our approach is based on a program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  19. Some problems raised by the operation of large nuclear turbo-generator sets. Automatic control system for steam turbo-generator units

    International Nuclear Information System (INIS)

    Cecconi, F.

    1976-01-01

    An appropriate automatic control system was found to be useful for improving the operation of large turbo-generator units, providing easy and efficient control and monitoring. The manufacturer's experience with these turbo-generator units allowed a system well suited to this function to be designed [fr

  20. Scientific data management challenges, technology and deployment

    CERN Document Server

    Rotem, Doron

    2010-01-01

    Dealing with the volume, complexity, and diversity of data currently being generated by scientific experiments and simulations often causes scientists to waste productive time. Scientific Data Management: Challenges, Technology, and Deployment describes cutting-edge technologies and solutions for managing and analyzing vast amounts of data, helping scientists focus on their scientific goals. The book begins with coverage of efficient storage systems, discussing how to write and read large volumes of data without slowing the simulation, analysis, or visualization processes. It then focuses on the efficient data movement and management of storage spaces and explores emerging database systems for scientific data. The book also addresses how to best organize data for analysis purposes, how to effectively conduct searches over large datasets, how to successfully automate multistep scientific process workflows, and how to automatically collect metadata and lineage information. This book provides a comprehensive u...

  1. The Daily Operational Brief: Fostering Daily Readiness, Care Coordination, and Problem-Solving Accountability in a Large Pediatric Health Care System.

    Science.gov (United States)

    Donnelly, Lane F; Basta, Kathryne C; Dykes, Anne M; Zhang, Wei; Shook, Joan E

    2018-01-01

    At a pediatric health system, the Daily Operational Brief (DOB) was updated in 2015 after three years of operation. Quality and safety metrics, the patient volume and staffing assessment, and the readiness assessment are all presented. In addition, in the problem-solving accountability system, problematic issues are categorized as Quick Hits or Complex Issues. Walk-the-Wall, a biweekly meeting attended by hospital senior administrative leadership and quality and safety leaders, is conducted to chart current progress on Complex Issues. The DOB provides a daily standardized approach to evaluate readiness to provide care to current patients and improvement in the care to be provided for future patients. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  2. Managing environmental issues at large-scale industrial estates: Problems and initiatives in central and eastern Europe and the former Soviet Union

    International Nuclear Information System (INIS)

    Coyle, R.

    1996-01-01

    A great many large-scale industrial sites are undergoing major transformation and restructuring in central and eastern Europe and the former Soviet Union. The EBRD's portfolio of investment projects increasingly includes such sites, presenting the Bank with environmental challenges related to their size, complexity and history. Both technological improvements and changes in management structure are needed in order to address environmental, and health and safety, issues. The EBRD requires "environmental due diligence" on all of its projects under preparation. Requirements vary, depending on the nature of each project. (author)

  3. Software Defects, Scientific Computation and the Scientific Method

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    Computation has rapidly grown in the last 50 years so that in many scientific areas it is the dominant partner in the practice of science. Unfortunately, unlike the experimental sciences, it does not adhere well to the principles of the scientific method as espoused by, for example, the philosopher Karl Popper. Such principles are built around the notions of deniability and reproducibility. Although much research effort has been spent on measuring the density of software defects, much less has been spent on the more difficult problem of measuring their effect on the output of a program. This talk explores these issues with numerous examples suggesting how this situation might be improved to match the demands of modern science. Finally it develops a theoretical model based on an amalgam of statistical mechanics and Hartley/Shannon information theory which suggests that software systems have strong implementation independent behaviour and supports the widely observed phenomenon that defects clust...

  4. Knapsack problems

    CERN Document Server

    Kellerer, Hans; Pisinger, David

    2004-01-01

    Thirteen years have passed since the seminal book on knapsack problems by Martello and Toth appeared. On this occasion a former colleague exclaimed back in 1990: "How can you write 250 pages on the knapsack problem?" Indeed, the definition of the knapsack problem is easily understood even by a non-expert who will not suspect the presence of challenging research topics in this area at the first glance. However, in the last decade a large number of research publications contributed new results for the knapsack problem in all areas of interest such as exact algorithms, heuristics and approximation schemes. Moreover, the extension of the knapsack problem to higher dimensions both in the number of constraints and in the number of knapsacks, as well as the modification of the problem structure concerning the available item set and the objective function, leads to a number of interesting variations of practical relevance which were the subject of intensive research during the last few years. Hence, two years ago ...

  5. Neophilia Ranking of Scientific Journals.

    Science.gov (United States)

    Packalen, Mikko; Bhattacharya, Jay

    2017-01-01

    The ranking of scientific journals is important because of the signal it sends to scientists about what is considered most vital for scientific progress. Existing ranking systems focus on measuring the influence of a scientific paper (citations); these rankings do not reward journals for publishing innovative work that builds on new ideas. We propose an alternative ranking based on the proclivity of journals to publish papers that build on new ideas, and we implement this ranking via a text-based analysis of all published biomedical papers dating back to 1946. In addition, we compare our neophilia ranking to citation-based (impact factor) rankings; this comparison shows that the two ranking approaches are distinct. Prior theoretical work suggests an active role for our neophilia index in science policy. Absent an explicit incentive to pursue novel science, scientists underinvest in innovative work because of a coordination problem: for work on a new idea to flourish, many scientists must decide to adopt it in their work. Rankings that are based purely on influence thus do not provide sufficient incentives for publishing innovative work. By contrast, adoption of the neophilia index as part of journal-ranking procedures by funding agencies and university administrators would provide an explicit incentive for journals to publish innovative work and thus help solve the coordination problem by increasing scientists' incentives to pursue innovative work.

  6. Modelling of large-scale structures arising under developed turbulent convection in a horizontal fluid layer (with application to the problem of tropical cyclone origination)

    Directory of Open Access Journals (Sweden)

    G. V. Levina

    2000-01-01

    Full Text Available The work is concerned with the results of theoretical and laboratory modelling of the processes of large-scale structure generation under turbulent convection in a rotating horizontal layer of an incompressible fluid with unstable stratification. The theoretical model describes three alternative ways of creating unstable stratification: heating the layer from below, volumetric heating of the fluid by internal heat sources, and a combination of both factors. Analysis of the model equations shows that, under conditions of high intensity of the small-scale convection and a low level of heat loss through the horizontal layer boundaries, a long-wave instability may arise. The condition for the existence of the instability and a criterion identifying the threshold of its initiation have been determined. The principle of action of the discovered instability mechanism has been described. Theoretical predictions have been verified by a series of experiments on a laboratory model. The horizontal dimensions of the experimentally obtained long-lived vortices are 4÷6 times larger than the thickness of the fluid layer. This work presents a description of the laboratory setup and experimental procedure. From the geophysical viewpoint, the examined mechanism of long-wave instability is supposed to be adequate to describe the initial step in the evolution of such large-scale vortices as tropical cyclones - a transition from small-scale cumulus clouds to a state of the atmosphere involving cloud clusters (the stage of initial tropical perturbation).

  7. Problems with the dating of sediment core using excess 210Pb in a freshwater system impacted by large scale watershed changes

    International Nuclear Information System (INIS)

    Baskaran, Mark; Nix, Joe; Kuyper, Clark; Karunakara, N.

    2014-01-01

    210Pb dating of freshwater and coastal sediments has been extensively conducted over the past 40 years for historical pollution reconstruction studies, sediment focusing, and determination of sediment accumulation and mixing rates. In areas where there is large-scale disturbance of sediments and the watershed, the vertical profiles of excess 210Pb (210Pbxs) can provide erroneous or less reliable information on sediment accumulation rates. We analyzed one sediment core from Hendrix Lake in southwestern Arkansas for excess 210Pb and 137Cs. There is no decrease in excess 210Pb activity with depth, while the 137Cs profile shows a sharp peak corresponding to 1963 and a 137Cs penetration depth corresponding to 1952. Historical data show that accelerated mercury mining during 1931-1944 resulted in large-scale Hg input to this watershed. Using the peak Hg activity as a time marker, the obtained sediment accumulation rates agree well with the 137Cs-based rates. Four independent lines of evidence (two marker events based on 137Cs and two based on Hg mining activity) yield about the same sedimentation rates; thus, we endorse the earlier suggestion that a 210Pb profile always needs to be validated with at least one other independent method. We also present a concise discussion of the factors that can affect the vertical profiles of 210Pbxs in relatively small lakes
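The marker-horizon arithmetic behind both the 137Cs and Hg dating in this record is simple: the average accumulation rate is the marker's depth divided by the time elapsed since the marker event. A minimal sketch (the numbers below are hypothetical, not taken from the study):

```python
def sedimentation_rate(marker_depth_cm, marker_year, core_year):
    """Average sediment accumulation rate (cm/yr) implied by a dated
    marker horizon, e.g. the 1963 137Cs fallout maximum."""
    return marker_depth_cm / (core_year - marker_year)

# Hypothetical example: a 1963 137Cs peak found 25.5 cm down a core
# collected in 2014 implies an average rate of 0.5 cm/yr.
rate = sedimentation_rate(25.5, 1963, 2014)
```

Agreement between rates computed from independent markers (the 1963 peak, the 1952 onset, and the Hg mining horizon) is what validates, or invalidates, the 210Pb-based chronology.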

  8. The relative importance of relational and scientific characteristics of psychotherapy: Perceptions of community members vs. therapists.

    Science.gov (United States)

    Farrell, Nicholas R; Deacon, Brett J

    2016-03-01

    Although client preferences are an integral component of evidence-based practice in psychology (American Psychological Association, 2006), relatively little research has examined what potential mental health consumers value in the psychotherapy they may receive. The present study was conducted to examine community members' preferences for the scientific and relational aspects of psychotherapy for different types of presenting problems, and how accurately therapists perceive these preferences. Community members (n = 200) were surveyed about the importance of scientific (e.g., demonstrated efficacy in clinical trials) and relational (e.g., therapist empathy) characteristics of psychotherapy both for anxiety disorders (e.g., obsessive-compulsive disorder) and disorder-nonspecific issues (e.g., relationship difficulties). Therapists (n = 199) completed the same survey and responded how they expected the average mental health consumer would. Results showed that although community members valued relational characteristics significantly more than scientific characteristics, the gap between these two was large for disorder-nonspecific issues (d = 1.24) but small for anxiety disorders (d = .27). Community members rated scientific credibility as important across problem types. Therapists significantly underestimated the importance of scientific characteristics to community members, particularly in the treatment of disorder-nonspecific issues (d = .74). Therapists who valued research less in their own practice were more likely to underestimate the importance of scientific credibility to community members. The implications of the present findings for understanding the nature of client preferences in evidence-based psychological practice are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Proportional Reasoning: An Essential Component of Scientific Understanding

    Science.gov (United States)

    Hilton, Annette; Hilton, Geoff

    2016-01-01

    In many scientific contexts, students need to be able to use mathematical knowledge in order to engage in scientific reasoning and problem-solving, and their understanding of scientific concepts relies heavily on their ability to understand and use mathematics in often new or unfamiliar contexts. Not only do science students need high levels of…

  10. Tools for Observation: Art and the Scientific Process

    Science.gov (United States)

    Pettit, E. C.; Coryell-Martin, M.; Maisch, K.

    2015-12-01

Art can support the scientific process during different phases of a scientific discovery. Art can help explain and extend scientific concepts for the general public; in this way art is a powerful tool for communication. Art can aid the scientist in processing and interpreting data towards an understanding of concepts and processes; in this way art is a powerful - if often subconscious - tool to inform the process of discovery. Less often acknowledged, art can help engage students and inspire scientists during the initial development of ideas, observations, and questions; in this way art is a powerful tool to develop scientific questions and hypotheses. When we use art as a tool for communication of scientific discoveries, it helps break down barriers and makes science concepts less intimidating and more accessible and understandable for the learner. Scientists themselves use artistic concepts and processes - directly or indirectly - to help deepen their understanding. Teachers are following suit by using art more to stimulate students' creative thinking and problem solving. We show the value of teaching students to use the artistic "way of seeing" to develop their skills in observation, questioning, and critical thinking. In this way, art can be a powerful tool to engage students (from elementary to graduate) in the beginning phase of a scientific discovery, which is catalyzed by inquiry and curiosity. Through qualitative assessment of the Girls on Ice program, we show that many of the specific techniques taught by art teachers are valuable for science students to develop their observation skills. In particular, the concepts of contour drawing, squinting, gesture drawing, inverted drawing, and others can provide valuable training for student scientists. These art techniques encourage students to let go of preconceptions and "see" the world (the "data") in new ways; they help students focus on both large-scale patterns and small-scale details.

  11. Scientific and technical challenges of radioactive waste management

    International Nuclear Information System (INIS)

    Vira, J.

    1996-01-01

In spite of considerable spending on research and technical development, the management of nuclear wastes continues to be a difficult issue in public decision making. The nuclear industry says that it has safe solutions for the ultimate disposal of nuclear wastes, but the message has not really got through to the public at large. Although communications problems reflect the general stigmatization of nuclear power, there are obvious issues in safety and performance assessment of nuclear waste disposal which evade scientific resolution. Any scientist who is concerned for his personal credibility must respect the rules and limits of scientific practice, but the intriguing question is whether he would not do better to address the layman's worries about radioactive substances. The discussion in this paper points out the intricacies of the distinction between scientific proof and judgement, with emphasis on safety assessment for nuclear waste disposal. Who are the final arbitrators? In a democratic society it is probably those who vote. Building confidence in expert judgements is a challenge for waste managers and scientists. The media may create their own 'experts', whose only necessary credential is the trust of their audience, but scientific judgements must stand the test of time. 'Confidence building' is currently a key word on the whole nuclear waste management scene, and confidence in science and scientists is certainly needed for any progress towards practical implementation of plans. The means for building confidence in the decision-making process are probably different from those applied for science and scientists. (author)

  12. Paradigm for new scientific technology; Shinkagaku gijutsu paradigm

    Energy Technology Data Exchange (ETDEWEB)

    Shindo, Y [National Chemical Lab. for Industry, Tokyo (Japan)

    1995-01-05

This paper reviews the current status from the standpoint of chemical engineers facing the coming of the 21st century, and surveys the paradigm for new scientific technologies. The criticism is mixed with unique opinions throughout, such as 'the departure of students from scientific and engineering faculties is none other than the result of a market principle', 'a national burden of trillions of yen should not be imposed solely in the name of advancing science', and 'global civilization itself has no other way but to change from the conventional expansive development type of the western style to the internal development type of the oriental style'. Values that define the paradigm for new scientific technologies may include such keywords as saturation in technology, baseless expansion of research projects, criticism of science, market principle, and centering on the human being. It should be taken seriously that profit from research and development should exceed the cost invested therein in the future, and that scientific technologies which truly serve society should be aimed at. These efforts will become one of the large pillars that support a future in which the creation of new functions is pursued through the structuring of new systems. Trying to overcome environmental problems is one of them.

  13. Scientific Computing in Electrical Engineering

    CERN Document Server

    Amrhein, Wolfgang; Zulehner, Walter

    2018-01-01

    This collection of selected papers presented at the 11th International Conference on Scientific Computing in Electrical Engineering (SCEE), held in St. Wolfgang, Austria, in 2016, showcases the state of the art in SCEE. The aim of the SCEE 2016 conference was to bring together scientists from academia and industry, mathematicians, electrical engineers, computer scientists, and physicists, and to promote intensive discussions on industrially relevant mathematical problems, with an emphasis on the modeling and numerical simulation of electronic circuits and devices, electromagnetic fields, and coupled problems. The focus in methodology was on model order reduction and uncertainty quantification. This extensive reference work is divided into six parts: Computational Electromagnetics, Circuit and Device Modeling and Simulation, Coupled Problems and Multi‐Scale Approaches in Space and Time, Mathematical and Computational Methods Including Uncertainty Quantification, Model Order Reduction, and Industrial Applicat...

  14. Some thoughts on alternative methods and their scientific implications

    NARCIS (Netherlands)

    Mansvelt, van J.D.

    1983-01-01

Reflections upon our agricultural problems cannot be limited to those problems themselves, but should incorporate a reflection upon our social and scientific traditions. An alternative agriculture calls for a participatory approach to nature research.

  15. The Associative Basis of Scientific Creativity: A Model Proposal

    Directory of Open Access Journals (Sweden)

    Esra Kanli

    2014-06-01

Full Text Available Creativity is accepted as an important part of scientific skills. Scientific creativity proceeds from a need or urge to solve a problem, and involves the production of original and useful ideas or products. Existing scientific creativity theories and tests do not feature the very important thinking processes, such as analogical and associative thinking, which can be considered crucial in creative scientific problem solving. The current study's aim is to provide an alternative model and explicate the associative basis of scientific creativity. Emerging from the reviewed theoretical framework, the Scientific Associations Model is proposed. This model claims that similarity and mediation constitute the basis of creativity and focuses on three components, namely: associative thinking, analogical thinking (analogical reasoning & analogical problem solving), and insight, which are considered to be the main elements of scientific associative thinking.

  16. Hypothesis testing of scientific Monte Carlo calculations

    Science.gov (United States)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
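The paper's concrete test procedures are not reproduced here, but the core idea can be sketched with a toy example: treat a Monte Carlo estimator as a random variable with a known expectation under the null hypothesis "the code is correct", and flag a bug only when the observed estimate deviates beyond statistical noise. The π-estimation task, the significance threshold, and the function names below are illustrative assumptions, not taken from the paper.

```python
import math
import random

def estimate_quarter_circle_fraction(n, rng):
    """Monte Carlo estimate of the fraction of random points in the unit
    square that fall inside the quarter circle (true value: pi/4)."""
    hits = sum(1 for _ in range(n) if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return hits / n

def z_test_proportion(p_hat, p_true, n):
    """Two-sided z-test p-value for a binomial proportion against p_true,
    using the normal approximation (valid for large n)."""
    se = math.sqrt(p_true * (1.0 - p_true) / n)
    z = (p_hat - p_true) / se
    return math.erfc(abs(z) / math.sqrt(2.0))  # two-sided tail probability

rng = random.Random(42)
n = 100_000
p_hat = estimate_quarter_circle_fraction(n, rng)
p_value = z_test_proportion(p_hat, math.pi / 4.0, n)

# Unlike a deterministic unit test, this check fails only when the result
# is statistically incompatible with the known answer (hypothetical alpha).
assert p_value > 0.001, f"possible bug: p-value {p_value:.2e}"
```

A failing assertion here signals either a genuine bug or a rare statistical fluke (probability equal to the chosen alpha), which is exactly the trade-off the abstract describes for stochastic test suites.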

  17. Is writing style predictive of scientific fraud?

    DEFF Research Database (Denmark)

    Braud, Chloé Elodie; Søgaard, Anders

    2017-01-01

The problem of detecting scientific fraud using machine learning was recently introduced, with initial, positive results from a model taking into account various general indicators. The results seem to suggest that writing style is predictive of scientific fraud. We revisit these initial experiments, and show that the leave-one-out testing procedure they used likely leads to a slight over-estimate of the predictability, but also that simple models can outperform their proposed model by some margin. We go on to explore more abstract linguistic features, such as linguistic complexity...
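The abstract's caution about leave-one-out evaluation can be illustrated with a classic toy case (not taken from the paper, and biased in the opposite direction from the over-estimate reported there): a majority-vote classifier on a perfectly balanced dataset scores 0% under leave-one-out, even though its true accuracy is 50%, because removing an example always makes its own class the minority in the training fold.

```python
from collections import Counter

def majority_classifier(train_labels):
    """Predict the single most common label in the training set."""
    return Counter(train_labels).most_common(1)[0][0]

labels = [0] * 5 + [1] * 5  # perfectly balanced binary dataset

# Leave-one-out: hold out each example, "train" on the rest, score the prediction.
correct = 0
for i, true_label in enumerate(labels):
    train = labels[:i] + labels[i + 1:]
    if majority_classifier(train) == true_label:
        correct += 1

loo_accuracy = correct / len(labels)
# Removing one example always makes its own class the minority (4 vs 5),
# so the majority vote is wrong on every held-out point.
print(loo_accuracy)  # 0.0, although the classifier's true accuracy is 0.5
```

The direction and size of the bias depend on the learner and the data, but the example shows why leave-one-out estimates should not be read as unbiased measures of predictability.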

  18. Problematizing as a scientific endeavor

    Directory of Open Access Journals (Sweden)

    Anna McLean Phillips

    2017-08-01

Full Text Available The work of physics learners at all levels revolves around problems. Physics education research has inspired attention to the forms of these problems, whether conceptual or algorithmic, closed or open response, well or ill structured. Meanwhile, it has been the work of curriculum developers and instructors to develop these problems. Physics education research has supported these efforts with studies of students' problem solving and the effects of different kinds of problems on learning. In this article we argue, first, that developing problems is central to the discipline of physics. It involves noticing a gap of understanding, identifying and articulating its precise nature, and motivating a community of its existence and significance. We refer to this activity as problematizing, and we show its importance by drawing from writings in physics and philosophy of science. Second, we argue that students, from elementary age to adults, can problematize as part of their engaging in scientific inquiry. We present four cases, drawing from episodes vetted by a panel of collaborating faculty in science departments as clear instances of students doing science. Although neither we nor the scientists had problematizing in mind when screening cases, we found it across the episodes. We close with implications for instruction, including the value of helping students recognize and manage the situation of being confused but not yet having a clear question, and implications for research, including the need to build problematizing into our models of learning.

  19. Problematizing as a scientific endeavor

    Science.gov (United States)

    Phillips, Anna McLean; Watkins, Jessica; Hammer, David

    2017-12-01

The work of physics learners at all levels revolves around problems. Physics education research has inspired attention to the forms of these problems, whether conceptual or algorithmic, closed or open response, well or ill structured. Meanwhile, it has been the work of curriculum developers and instructors to develop these problems. Physics education research has supported these efforts with studies of students' problem solving and the effects of different kinds of problems on learning. In this article we argue, first, that developing problems is central to the discipline of physics. It involves noticing a gap of understanding, identifying and articulating its precise nature, and motivating a community of its existence and significance. We refer to this activity as problematizing, and we show its importance by drawing from writings in physics and philosophy of science. Second, we argue that students, from elementary age to adults, can problematize as part of their engaging in scientific inquiry. We present four cases, drawing from episodes vetted by a panel of collaborating faculty in science departments as clear instances of students doing science. Although neither we nor the scientists had problematizing in mind when screening cases, we found it across the episodes. We close with implications for instruction, including the value of helping students recognize and manage the situation of being confused but not yet having a clear question, and implications for research, including the need to build problematizing into our models of learning.

  20. Age and Scientific Performance.

    Science.gov (United States)

    Cole, Stephen

    1979-01-01

    The long-standing belief that age is negatively associated with scientific productivity and creativity is shown to be based upon incorrect analysis of data. Studies reported in this article suggest that the relationship between age and scientific performance is influenced by the operation of the reward system. (Author)