WorldWideScience

Sample records for 100-year tool applied

  1. 100 years of superconductivity

    Rogalla, Horst

    2011-01-01

    Even a hundred years after its discovery, superconductivity continues to bring us new surprises, from superconducting magnets used in MRI to quantum detectors in electronics. 100 Years of Superconductivity presents a comprehensive collection of topics on nearly all the subdisciplines of superconductivity. Tracing the historical developments in superconductivity, the book includes contributions from many pioneers who are responsible for important steps forward in the field. The text first discusses interesting stories of the discovery and gradual progress of theory and experimentation. Emphasizi…

  2. Convergence: Human Intelligence The Next 100 Years

    Fluellen, Jerry E., Jr.

    2005-01-01

    How might human intelligence evolve over the next 100 years? This issue paper explores that idea. First, the paper summarizes five emerging perspectives about human intelligence: Howard Gardner's multiple intelligences theory, Robert Sternberg's triarchic theory of intelligence, Ellen Langer's mindfulness theory, David Perkins' learnable…

  3. Applied regression analysis: a research tool

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
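
    The least squares machinery the book builds on can be illustrated in a few lines. A minimal sketch (illustrative data and numpy only, not from the book):

```python
import numpy as np

# Simulated data: a straight-line relationship with noise (assumed for illustration)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 + 0.8 * x + rng.normal(scale=1.0, size=x.size)

# Design matrix with an intercept column; solve min ||Xb - y||^2
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"intercept = {beta[0]:.3f}, slope = {beta[1]:.3f}")
```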

  4. Analysis of 100 Years of Curriculum Designs

    Lynn Kelting-Gibson

    2013-01-01

    Fifteen historical and contemporary curriculum designs were analyzed for elements of assessment that support student learning and inform instructional decisions. Educational researchers are purposely paying attention to the role assessment plays in a well-designed planning and teaching process. Assessment is a vital component to educational planning and teaching because it is a way to gather accurate evidence of student learning and information to inform instructional decisions. The purpose of this review was to analyze 100 years of curriculum designs to uncover elements of assessment that will support teachers in their desire to improve student learning. Throughout this research the author seeks to answer the question: Do historical and contemporary curriculum designs include elements of assessment that help teachers improve student learning? The results of the review reveal that assessment elements were addressed in all of the curricular designs analyzed, but not all elements of assessment were identified using similar terminology. Based on the analysis of this review, it is suggested that teachers not be particular about the terminology used to describe assessment elements, as all curriculum models discussed use one or more elements similar to the context of pre, formative, and summative assessments.

  5. 100 Years of the Physics of Diodes

    Luginsland, John

    2013-10-01

    The Child-Langmuir Law (CL), discovered 100 years ago, gives the maximum current that can be transported across a planar diode in the steady state. As a quintessential example of the impact of space-charge shielding near a charged surface, it is central to the studies of high current diodes, such as high power microwave sources, vacuum microelectronics, electron and ion sources, and high current drivers used in high-energy density physics experiments. CL remains a touchstone of fundamental sheath physics, including contemporary studies of nano-scale quantum diodes and plasmonic devices. Its solid state analog is the Mott-Gurney law, governing the maximum charge injection in solids, such as organic materials and other dielectrics, which is important to energy devices, such as solar cells and light-emitting diodes. This paper reviews the important advances in the physics of diodes since the discovery of CL, including virtual cathode formation and extension of CL to multiple dimensions, to the quantum regime, and to ultrafast processes. We will review the influence of magnetic fields, multiple species in bipolar flow, electromagnetic and time dependent effects in both short pulse and high frequency THz limits, and single electron regimes. Transitions from various emission mechanisms (thermionic, field, and photo-emission) to the space charge limited state (CL) will be addressed, especially highlighting important simulation and experimental developments in selected contemporary areas of study. This talk will stress the fundamental physical links between the physics of beams and limiting currents in other areas, such as low temperature plasmas, laser plasmas, and space propulsion. Also emphasized is the role of non-equilibrium phenomena associated with materials and plasmas in close contact. Work supported by the Air Force Office of Scientific Research.
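
    The planar Child-Langmuir limit itself is compact enough to evaluate directly: J = (4*eps0/9) * sqrt(2e/m_e) * V^(3/2) / d^2. A quick numerical sketch (the gap and voltage values are assumed for illustration):

```python
import math
from scipy.constants import epsilon_0, elementary_charge as e, electron_mass as m_e

def child_langmuir_j(voltage_v: float, gap_m: float) -> float:
    """Space-charge-limited current density (A/m^2) for a planar vacuum diode."""
    return (4 * epsilon_0 / 9) * math.sqrt(2 * e / m_e) * voltage_v**1.5 / gap_m**2

# Example (assumed operating point): 10 kV across a 1 mm gap
print(f"J_CL = {child_langmuir_j(1e4, 1e-3):.3e} A/m^2")  # ~2.3e6 A/m^2
```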

  6. Beam Line: 100 years of elementary particles

    Pais, A.; Weinberg, S.; Quigg, C.; Riordan, M.; Panofsky, W. K. H.

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  7. Opening the 100-Year Window for Time-Domain Astronomy

    Grindlay, Jonathan; Tang, Sumin; Los, Edward; Servillat, Mathieu

    2012-04-01

    The large-scale surveys such as PTF, CRTS and Pan-STARRS-1 that have emerged within the past 5 years or so employ digital databases and modern analysis tools to accentuate research into Time Domain Astronomy (TDA). Preparations are underway for LSST which, in another 6 years, will usher in the second decade of modern TDA. By that time the Digital Access to a Sky Century @ Harvard (DASCH) project will have made available to the community the full sky Historical TDA database and digitized images for a century (1890-1990) of coverage. We describe the current DASCH development and some initial results, and outline plans for the ``production scanning'' phase and data distribution which is to begin in 2012. That will open a 100-year window into temporal astrophysics, revealing rare transients and (especially) astrophysical phenomena that vary on time-scales of a decade. It will also provide context and archival comparisons for the deeper modern surveys.

  8. Opening the 100-Year Window for Time Domain Astronomy

    Grindlay, Jonathan; Los, Edward; Servillat, Mathieu

    2012-01-01

    The large-scale surveys such as PTF, CRTS and Pan-STARRS-1 that have emerged within the past 5 years or so employ digital databases and modern analysis tools to accentuate research into Time Domain Astronomy (TDA). Preparations are underway for LSST which, in another 6 years, will usher in the second decade of modern TDA. By that time the Digital Access to a Sky Century @ Harvard (DASCH) project will have made available to the community the full sky Historical TDA database and digitized images for a century (1890-1990) of coverage. We describe the current DASCH development and some initial results, and outline plans for the "production scanning" phase and data distribution which is to begin in 2012. That will open a 100-year window into temporal astrophysics, revealing rare transients and (especially) astrophysical phenomena that vary on time-scales of a decade. It will also provide context and archival comparisons for the deeper modern surveys.

  9. Advances of Bioinformatics Tools Applied in Virus Epitopes Prediction

    Ping Chen; Simon Rayner; Kang-hong Hu

    2011-01-01

    In recent years, in silico epitope prediction tools have significantly facilitated progress in vaccine development, and many have been applied successfully to predict epitopes in viruses. Herein, a general overview of the different tools currently available, including T cell and B cell epitope prediction tools, is presented, and the principles of the different prediction algorithms are reviewed briefly. Finally, several examples are presented to illustrate the application of the prediction tools.

  10. Understanding General Relativity after 100 years: A matter of perspective

    Dadhich, Naresh

    2016-01-01

    This is the centenary year of general relativity; it is therefore natural to reflect on what perspective we have evolved in 100 years. I wish to share here a novel perspective, and the insights and directions that ensue from it.

  11. 100-Year Flood-It's All About Chance

    Holmes, Jr., Robert R.; Dinicola, Karen

    2010-01-01

    In the 1960s, the United States government decided to use the 1-percent annual exceedance probability (AEP) flood as the basis for the National Flood Insurance Program. The 1-percent AEP flood was thought to be a fair balance between protecting the public and overly stringent regulation. Because the 1-percent AEP flood has a 1 in 100 chance of being equaled or exceeded in any 1 year, and it has an average recurrence interval of 100 years, it often is referred to as the '100-year flood'. The term '100-year flood' is part of the national lexicon, but is often a source of confusion for those not familiar with flood science and statistics. This poster is an attempt to explain the concept, probabilistic nature, and inherent uncertainties of the '100-year flood' to the layman.
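
    The probabilistic point lends itself to a one-line calculation: the chance of seeing at least one 1-percent-AEP flood in n years is 1 - 0.99^n. A small sketch (my own illustration, not from the poster):

```python
# Probability of at least one 1%-AEP ("100-year") flood over a horizon of n years
def p_at_least_one(aep: float, years: int) -> float:
    return 1.0 - (1.0 - aep) ** years

for n in (1, 10, 30, 100):
    print(f"{n:>3} years: {p_at_least_one(0.01, n):.1%}")
# Over a 30-year mortgage the chance is ~26%; even over 100 years it is ~63%, not 100%.
```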

  12. Using Applied Theatre as a Tool to Address Netizenship

    Skeiker, Fadi Fayad

    2015-01-01

    This paper charts the ways in which a researcher uses applied theatre practice as a tool to address netizenship issues in the digital age by documenting a workshop he co-facilitated with graduate students at the University of Porto during the Future Places conference in 2013. The workshop used applied theatre both to catalyze…

  13. The Who and Why of 100 Year Bonds

    Richard Kish

    2014-09-01

    A unique category of debt, 100 year bonds, offers limited benefits to both buyers and sellers. From an issuer’s viewpoint, the entities are able to lock in low financing over an extended period of time, since these bonds are only offered during periods of low interest rates. From the buyer’s viewpoint, 100 year bonds provide investors with an option for lengthening the duration and convexity of their portfolios. Firms, such as life insurance companies, may be interested in lengthening the duration of their portfolios to match liabilities or other investment strategies. Besides supporting answers to the question of why 100 year bonds are issued, we outline the two waves of issuance, the issuers, and the industry categories of issuers and investors, and we provide examples showing the effects of changes in interest rates when investing in these bonds.
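
    Duration for such a bond is straightforward to compute from its cash flows. A minimal sketch (my own illustration with an assumed 6% annual coupon and flat yields, not data from the article) comparing a 30-year and a 100-year bond:

```python
def macaulay_duration(coupon: float, yld: float, years: int, face: float = 100.0) -> float:
    """Macaulay duration (years) of an annual-pay coupon bond."""
    cfs = [(t, face * coupon + (face if t == years else 0.0)) for t in range(1, years + 1)]
    price = sum(cf / (1 + yld) ** t for t, cf in cfs)
    return sum(t * cf / (1 + yld) ** t for t, cf in cfs) / price

for n in (30, 100):
    print(f"{n:>3}-year 6% bond at 6% yield: duration = {macaulay_duration(0.06, 0.06, n):.1f} years")
# Roughly 14.6 vs 17.6 years: the 100-year bond's duration is only modestly longer,
# because its distant cash flows are heavily discounted.
```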

  14. Semantic Differential applied to the evaluation of machine tool design

    Mondragón Donés, Salvador; Company, Pedro; Vergara Monedero, Margarita

    2005-01-01

    In this article, a study is presented showing that Product Semantics (PS) can be used to study the design of machine tools. Nowadays, different approaches to PS (Semantic Differential, Kansei Engineering, etc.) are being applied to consumer products with successful results, but commercial products have generally received less attention and machine tools in particular have not yet been studied. Our second objective is to measure the different sensitivities that the different groups of the popu...

  15. Centennial Calendar- 100 Years of the American Phytopathological Society

    I edited a 40-page publication (calendar) that covered 18 chapters written by members of our society. This covered pioneering researchers, departments, and epidemics of the last 100 years of plant pathology in the U. S. This was given to all members of the American Phytopathological Society who att...

  16. The 7 basic tools of quality applied to radiological safety

    This work seeks to establish a series of correspondences between the pursuit of quality and the optimization of the doses received by occupationally exposed personnel. The seven basic statistical tools of quality are treated: Pareto charts, cause-and-effect diagrams, stratification, check sheets, histograms, scatter diagrams, and graphs and control charts, as applied to radiological safety.
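
    Of the seven tools, the Pareto chart is the most direct fit for dose optimization: rank the contributors to collective dose and find the few that dominate. A minimal sketch with assumed (hypothetical) dose data, not figures from the paper:

```python
# Pareto analysis: rank dose contributors and report the cumulative share
tasks = {"valve maintenance": 4.2, "filter replacement": 2.7,
         "inspection rounds": 1.1, "decontamination": 0.6}  # person-mSv, assumed values
total = sum(tasks.values())
cum = 0.0
for task, dose in sorted(tasks.items(), key=lambda kv: kv[1], reverse=True):
    cum += dose
    print(f"{task:20s} {dose:4.1f} mSv  cumulative {cum / total:6.1%}")
```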

  17. Hygrothermal Numerical Simulation Tools Applied to Building Physics

    Delgado, João M P Q; Ramos, Nuno M M; Freitas, Vasco Peixoto

    2013-01-01

    This book presents a critical review of the development and application of hygrothermal analysis methods to simulate the coupled transport processes of Heat, Air, and Moisture (HAM) transfer for one or multidimensional cases. During the past few decades there has been relevant development in this field of study and an increase in the professional use of tools that simulate some of the physical phenomena that are involved in Heat, Air and Moisture conditions in building components or elements. Although a significant number of hygrothermal models are referred to in the literature, the vast majority of them are not easily available to the public outside the institutions where they were developed, which restricts the analysis of this book to only 14 hygrothermal modelling tools. The special features of this book are (a) a state-of-the-art of numerical simulation tools applied to building physics, (b) the boundary conditions importance, (c) the material properties, namely, experimental methods for the measuremen...

  18. Coating-substrate-simulations applied to HFQ® forming tools

    Leopold Jürgen

    2015-01-01

    In this paper a comparative analysis of coating-substrate simulations applied to HFQ™ forming tools is presented. When using the solution heat treatment, cold die forming and quenching process, known as HFQ™, for the forming of hardened aluminium alloy automotive panel parts, coating-substrate systems have to satisfy unique requirements. Numerical experiments, based on the Advanced Adaptive FE method, are finally presented.

  19. Relativity and Gravitation : 100 Years After Einstein in Prague

    Ledvinka, Tomáš; General Relativity, Cosmology and Astrophysics : Perspectives 100 Years After Einstein's Stay in Prague

    2014-01-01

    In early April 1911 Albert Einstein arrived in Prague to become full professor of theoretical physics at the German part of Charles University. It was there, for the first time, that he concentrated primarily on the problem of gravitation. Before he left Prague in July 1912 he had submitted the paper “Relativität und Gravitation: Erwiderung auf eine Bemerkung von M. Abraham” in which he remarkably anticipated what a future theory of gravity should look like. At the occasion of the Einstein-in-Prague centenary an international meeting was organized under a title inspired by Einstein's last paper from the Prague period: "Relativity and Gravitation, 100 Years after Einstein in Prague". The main topics of the conference included: classical relativity, numerical relativity, relativistic astrophysics and cosmology, quantum gravity, experimental aspects of gravitation, and conceptual and historical issues. The conference attracted over 200 scientists from 31 countries, among them a number of leading experts in ...

  20. NASA Administrator Dan Goldin greets 100-year-old VIP.

    2000-01-01

    Astronaut Andy Thomas (left) greets 100-year-old Captain Ralph Charles, one of the VIPs attending the launch of STS-99. Charles also met NASA Administrator Dan Goldin. An aviator who has the distinction of being the oldest licensed pilot in the United States, Charles is still flying. He has experienced nearly a century of flight history, from the Wright Brothers to the Space Program. He took flying lessons from one of the first fliers trained by Orville Wright, first repaired then built airplanes, went barnstorming, operated a charter service in the Caribbean, and worked as a test pilot for the Curtiss Wright Airplane Co. Charles watches all the Shuttle launches from his home in Ohio and his greatest wish is to be able to watch one in person from KSC.

  1. Overuse syndrome in musicians--100 years ago. An historical review.

    Fry, H J

    Overuse syndrome in musicians was extensively reported 100 years ago. The clinical features and results of treatment, which were recorded in considerable detail, match well the condition that is described today. The medical literature that is reviewed here extends from 1830 to 1911 and includes 21 books and 54 articles from the English language literature, apart from two exceptions; however, the writers of the day themselves reviewed French, German and Italian literature on the subject. The disorder was said to result from the overuse of the affected parts. Two theories of aetiology, not necessarily mutually exclusive, were argued. The central theory regarded the lesion as being in the central nervous system, the peripheral theory implied a primary muscle disorder. No serious case was put forward for a psychogenic origin, though emotional factors were believed to aggravate the condition. Advances in musical instrument manufacture--particularly the development of the concert piano and the clarinet--may have played a part in the prevalence of overuse syndrome in musicians. Total rest from the mechanical use of the hand was the only effective treatment recorded. PMID:3540544

  2. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  3. Creating Long Term Income Streams for the 100 Year Starship Study Initiative

    Sylvester, A. J.

    Development and execution of long term research projects are very dependent on a consistent application of funding to maximize the potential for success. The business structure for the 100 Year Starship Study project should allow for multiple income streams to cover the expenses of the research objectives. The following examples illustrate the range of potential avenues: 1) affiliation with a charitable foundation for creating a donation program to fund a long term endowment for research, 2) application for grants to fund initial research projects and establish the core expertise of the research entity, 3) development of intellectual property which can then be licensed for additional revenue, 4) creation of spinout companies with equity positions retained by the lab for funding the endowment, and 5) funded research which is dual use for the technology goals of the interstellar flight research objectives. With the establishment of a diversified stream of funding options, the endowment can be funded at a level to permit dedicated research on the interstellar flight topics. This paper will focus on the strategy of creating spinout companies to create income streams which would fund the endowment of the 100 Year Starship Study effort. This technique is widely used by universities seeking to commercially develop and market technologies developed by university researchers. An approach will be outlined for applying this technique to potentially marketable technologies generated as a part of the 100 Year Starship Study effort.

  4. 100 years of seismic research on the Moho

    Prodehl, Claus; Kennett, Brian; Artemieva, Irina;

    2013-01-01

    The detection of a seismic boundary, the “Moho”, between the outermost shell of the Earth, the Earth's crust, and the Earth's mantle by A. Mohorovičić was the consequence of increased insight into the propagation of seismic waves caused by earthquakes. Since the 1980s, passive seismology using distant earthquakes has played an increasingly important role in studies of crustal structure. The receiver function technique, exploiting conversions between P and SV waves at discontinuities in seismic wavespeed below a seismic station, has been extensively applied to the increasing numbers of permanent and portable broad-band seismic stations across the globe. Receiver function studies supplement controlled-source work with improved geographic coverage and now make a significant contribution to knowledge of the nature of the crust and the depth to Moho.

  5. Applying MDE Tools at Runtime: Experiments upon Runtime Models

    Song, Hui; Huang, Gang; Chauvel, Franck; Sun, Yanshun

    2010-01-01

    Runtime models facilitate the management of running systems in many different ways. One of the advantages of runtime models is that they enable the use of existing MDE tools at runtime to implement common auxiliary activities in runtime management, such as querying, visualization, and transformation. In this tool demonstration paper, we focus on this specific aspect of runtime models. We discuss the requirements of runtime models to enable the use of...

  6. Process for selecting engineering tools : applied to selecting a SysML tool.

    De Spain, Mark J.; Post, Debra S. (Sandia National Laboratories, Livermore, CA); Taylor, Jeffrey L.; De Jong, Kent

    2011-02-01

    Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature, and users could apply it to select most engineering tools and software applications.
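
    Selection processes of this kind typically reduce to a weighted scoring matrix; since the report's actual criteria are not given in the record, the criteria, weights, and scores below are assumed purely for illustration:

```python
# Weighted decision matrix for tool selection (criteria, weights, and scores assumed)
criteria = {"SysML coverage": 0.35, "usability": 0.25, "interoperability": 0.25, "cost": 0.15}
tools = {
    "Tool A": {"SysML coverage": 4, "usability": 3, "interoperability": 5, "cost": 2},
    "Tool B": {"SysML coverage": 5, "usability": 4, "interoperability": 3, "cost": 3},
}
for name, scores in tools.items():
    total = sum(weight * scores[c] for c, weight in criteria.items())
    print(f"{name}: weighted score = {total:.2f}")
```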

  7. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth in understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric sciences draws on core knowledge traditionally taught in physics, chemistry, and in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, understanding the causes and consequences of climate change becomes a task of memorizing seemingly disparate facts for a student. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build necessary understanding of conservation of mass, conservation of energy, particulate nature of matter, kinetic molecular theory, and particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles and then conceptual tools are deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and in some cases, re-envisioning the format of a science classroom. There are few financial barriers to implementation and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students developing conceptually rich understanding of the atmosphere and connections happening within.

  8. 100-Year Floodplains, Floodplains 100 year define in gold color, Published in 2009, 1:2400 (1in=200ft) scale, WABASH COUNTY GOVERNMENT.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:2400 (1in=200ft) scale, was produced all or in part from Published Reports/Deeds information as of 2009. It is...

  9. 100-Year Floodplains, 100 year flood plain data, Published in 2006, 1:1200 (1in=100ft) scale, Washoe County.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:1200 (1in=100ft) scale, was produced all or in part from Field Survey/GPS information as of 2006. It is described...

  10. Applying AI tools to operational space environmental analysis

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  11. Geo-environmental mapping tool applied to pipeline design

    Andrade, Karina de S.; Calle, Jose A.; Gil, Euzebio J. [Geomecanica S/A Tecnologia de Solo Rochas e Materiais, Rio de Janeiro, RJ (Brazil); Sare, Alexandre R. [Geomechanics International Inc., Houston, TX (United States); Soares, Ana Cecilia [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The Geo-Environmental Mapping is an improvement of the Geological-Geotechnical Mapping used for basic pipeline designs. The main purpose is to assemble the environmental, geotechnical and geological concepts in a methodological tool capable of predicting constraints and reducing the pipeline impact on the environment. The Geo-Environmental mapping was built to stress the influence of soil/structure interaction, related to the physical effect that comes from the contact between structures and soil or rock. A Geological-Geotechnical-Environmental strip (chart) was presented to emphasize the pipeline operational constraints and its influence on the environment. The mapping was developed to clearly show the occurrence and properties of geological materials divided into geotechnical domain units (zones). The strips present natural construction properties, such as excavability, stability of the excavation and soil re-use capability. Also, the environmental constraints were added to the geological-geotechnical mapping. The Geo-Environmental Mapping model helps the planning of the geotechnical and environmental inquiries to be carried out during executive design, the discussion on the types of equipment to be employed during construction and the analysis of the geological risks and environmental impacts to be faced during the construction of the pipeline. (author)

  12. Monitoring operational data production applying Big Data tooling

    Som de Cerff, Wim; de Jong, Hotze; van den Berg, Roy; Bos, Jeroen; Oosterhoff, Rijk; Klein Ikkink, Henk Jan; Haga, Femke; Elsten, Tom; Verhoef, Hans; Koutek, Michal; van de Vegte, John

    2015-04-01

    Within the KNMI Deltaplan programme for improving the KNMI operational infrastructure, a new fully automated system for monitoring the KNMI operational data production systems is being developed: PRISMA (PRocessflow Infrastructure Surveillance and Monitoring Application). Currently the KNMI operational (24/7) production systems consist of over 60 applications, running on different hardware systems and platforms. They are interlinked for the production of numerous data products, which are delivered to internal and external customers. All applications are individually monitored by different applications, complicating root cause and impact analysis. Also, the underlying hardware and network are monitored separately using Zabbix. The goal of the new system is to enable production chain monitoring, which enables root cause analysis (what is the root cause of the disruption) and impact analysis (what other products will be affected). The PRISMA system will make it possible to dispose of all the existing monitoring applications, providing one interface for monitoring the data production. For modeling the production chain, the Neo4j Graph database is used to store and query the model. The model can be edited through the PRISMA web interface, but is mainly automatically provided by the applications and systems which are to be monitored. The graph enables us to do root cause and impact analysis. The graph can be visualized in the PRISMA web interface on different levels. Each 'monitored object' in the model will have a status (OK, error, warning, unknown). This status is derived by combining all log information available. For collecting and querying the log information Splunk is used. The system is developed using Scrum, by a multi-disciplinary team consisting of analysts, developers, a tester and interaction designer. In the presentation we will focus on the lessons learned working with the 'Big data' tooling Splunk and Neo4J.
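
    The root-cause and impact queries described map naturally onto graph traversals. A minimal sketch of the idea (using networkx in place of Neo4j, with a toy production chain; the real model structure is assumed):

```python
import networkx as nx

# Toy production chain: edges point from producer to consumer (assumed structure)
chain = nx.DiGraph([
    ("radar_ingest", "precip_product"),
    ("model_run", "precip_product"),
    ("precip_product", "public_feed"),
])

failed = "radar_ingest"
print("impact:", nx.descendants(chain, failed))              # everything downstream of the failure
print("root candidates:", nx.ancestors(chain, "public_feed"))  # upstream causes of a disrupted product
```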

  13. Applied climate-change analysis: the climate wizard tool.

    Evan H Girvetz

    BACKGROUND: Although the message of "global climate change" is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has and is projected to change within specific geographic areas throughout the world. METHODOLOGY/PRINCIPAL FINDINGS: To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951-2002 occurred in northern hemisphere countries (especially during January-April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50 degrees N during February-March to 10 degrees N during August-September. Precipitation decreases occurred most commonly in countries between 0-20 degrees N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070-2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. CONCLUSIONS/SIGNIFICANCE: The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally
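
    The quantile ensemble analysis amounts to taking percentiles of the projected change across models at each location. A small sketch with synthetic numbers (not Climate Wizard output):

```python
import numpy as np

# Projected temperature change (deg C) for one grid cell from 16 GCMs (synthetic values)
rng = np.random.default_rng(1)
projections = rng.normal(loc=2.5, scale=0.8, size=16)

low, median, high = np.percentile(projections, [10, 50, 90])
print(f"ensemble median = {median:.2f} C (10th-90th percentile: {low:.2f} to {high:.2f})")
```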

  14. From Smallpox to Big Data: The Next 100 Years of Epidemiologic Methods.

    Gange, Stephen J; Golub, Elizabeth T

    2016-03-01

    For more than a century, epidemiology has seen major shifts in both focus and methodology. Taking into consideration the explosion of "big data," the advent of more sophisticated data collection and analytical tools, and the increased interest in evidence-based solutions, we present a framework that summarizes 3 fundamental domains of epidemiologic methods that are relevant for the understanding of both historical contributions and future directions in public health. First, the manner in which populations and their follow-up are defined is expanding, with greater interest in online populations whose definition does not fit the usual classification by person, place, and time. Second, traditional data collection methods, such as population-based surveillance and individual interviews, have been supplemented with advances in measurement. From biomarkers to mobile health, innovations in the measurement of exposures and diseases enable refined accuracy of data collection. Lastly, the comparison of populations is at the heart of epidemiologic methodology. Risk factor epidemiology, prediction methods, and causal inference strategies are areas in which the field is continuing to make significant contributions to public health. The framework presented herein articulates the multifaceted ways in which epidemiologic methods make such contributions and can continue to do so as we embark upon the next 100 years. PMID:26443419

  15. 100-Year Floodplains, FEMA FIRM Mapping, Published in 2014, Not Applicable scale, GIS.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at Not Applicable scale, was produced all or in part from Other information as of 2014. It is described as 'FEMA FIRM...

  16. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were: 1. a checklist without risk estimation (Tool A), 2. a checklist with a risk scale (Tool B), 3. a risk calculation without a formal hazard identification stage (Tool C), and 4. a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of a different nature. Tools C and D utilized more systematic approaches than tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1. its comprehensive structure with respect to the steps suggested in risk management, 2. its dynamic approach to hazard identification, and 3. its use of data resulting from the risk analysis. PMID:26864350
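
    The risk-matrix step behind Tool D is easy to make concrete. A generic sketch (a 3x3 matrix with assumed severity and likelihood bands, not the paper's actual matrix):

```python
# Generic risk matrix: combine likelihood and severity ratings (1-3 each, assumed bands)
def risk_level(likelihood: int, severity: int) -> str:
    score = likelihood * severity  # ranges 1..9
    if score >= 6:
        return "high"
    return "medium" if score >= 3 else "low"

# Example: frequent exposure (3) with moderate consequences (2)
print(risk_level(3, 2))  # -> "high"
```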

  17. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is being realized increasingly that the most prevalent desktop metaphor underpinning the majority of tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a … applying activity theory to GSE. We analyze and explain the fundamental concepts of activity theory, and how they can be applied by using examples of software architecture design and evaluation processes. We describe the kind of data model and architectural support required for applying activity theory in...

  18. LOW FREQUENCY VARIABILITY OF INTERANNUAL CHANGE PATTERNS FOR GLOBAL MEAN TEMPERATURE DURING THE RECENT 100 YEARS

    Liu Jingmiao (刘晶淼); Ding Yuguo (丁裕国); et al.

    2002-01-01

    The TEEOF method, which expands the EOF analysis temporally, is used to conduct a diagnostic study of the 1-, 3-, 6- and 10-year variation patterns of mean air temperature over the globe and the Southern and Northern Hemispheres over the course of 100 years. The results show that the first mode of the TEEOF accounts for more than 50% of the total variance, with the first mode of the interannual oscillations generally standing for annually varying patterns that are related to climate and reflect the long-term tendency of change in air temperature. This is particularly true for the first mode on the 10-year scale, which shows an obvious ascending trend in winter temperature, and its leading principal component tracks the sequence of observed temperature very closely. Apart from the first mode of all time sections of the TEEOF for the globe and the two hemispheres, and the second mode of the 1-year TEEOF, the interannual variations described by the other characteristic vectors show various patterns, with the corresponding principal components related to the long-term variability of specific interannual quasi-periodic oscillation structures. A T2 test applied to the annual variation pattern shows that the abrupt changes for the Southern Hemisphere and the globe come closer to the result of a single-element t test for mean temperature than those for the Northern Hemisphere do. This indicates that the T2 test, when carried out with patterns of multiple variables, seems more reasonable than the t test with single elements.
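
    The core of any EOF-type analysis is an eigen-decomposition of the anomaly matrix; the temporal extension stacks lagged copies of the field before decomposing. A minimal sketch with synthetic data (the stacking step only gestures at the TEEOF windowing; all values are assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
T, S = 100, 24                       # years x stations (synthetic anomalies)
X = rng.normal(size=(T, S))

# For a temporally extended EOF, each row is a lag-stacked window, e.g. 3-year windows
window = 3
Xe = np.stack([X[i:i + window].ravel() for i in range(T - window + 1)])

Xe -= Xe.mean(axis=0)                # remove the time mean
U, s, Vt = np.linalg.svd(Xe, full_matrices=False)
var_explained = s**2 / np.sum(s**2)
print(f"first mode explains {var_explained[0]:.1%} of total variance")
```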

  19. Discharge, gage height, and elevation of 100-year floods in the Hudson River basin, New York

    Archer, Roger J.

    1978-01-01

    The flood discharge that may be expected to be equaled or exceeded on the average of once in 100 years (100-year flood) was computed by the log-Pearson Type-III frequency relation for 72 stations in the Hudson River basin. These discharges and, where available, their corresponding gage height and elevation above mean sea level are presented in tabular form. A short explanation of computation methods is included. The data are to be used as part of a federally funded study of the water resources and related land resources of the Hudson River basin. (Woodard-USGS)
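
    The log-Pearson Type III computation reduces to fitting the distribution to log-transformed annual peak discharges and reading off the 1-percent-AEP quantile. A sketch with synthetic peaks (assuming scipy's pearson3 parameterization by skew, location, and scale; not the Hudson basin data):

```python
import numpy as np
from scipy import stats

# Synthetic annual peak discharges (cfs); a real study would use gaged records
rng = np.random.default_rng(3)
peaks = rng.lognormal(mean=9.0, sigma=0.5, size=60)

logq = np.log10(peaks)
skew = stats.skew(logq, bias=False)

# 1%-AEP (100-year) quantile of the fitted log-Pearson Type III distribution
q100 = 10 ** stats.pearson3.ppf(0.99, skew, loc=logq.mean(), scale=logq.std(ddof=1))
print(f"estimated 100-year flood: {q100:,.0f} cfs")
```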

  20. Applying observations of work activity in designing prototype data analysis tools

    Springmeyer, R.R.

    1993-07-06

    Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools, and customizing visualization environments to specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but initiated in-depth discussions about users' work, tools, technology, and requirements.

  1. Innovation and Involvement: 100 Years of Community Work with Older People.

    Glasby, Jon

    2000-01-01

    Birmingham Settlement has provided services to British older adults for over 100 years, including such innovations as adult day centers, meals on wheels, and transportation services. The participation of the clientele in research helped flesh out the history of the settlement through narratives that demonstrate its impact on the life of the…

  2. Ageing management of instrumentation and control systems for 100 years life of AHWR

    Currently, Nuclear Power Plants are being designed for a life of about 40 years. However, the Advanced Heavy Water Reactor (AHWR), being designed by BARC, is intended to have a life of 100 years. Instrumentation and Control (I and C) plays a crucial role in the safe operation of any nuclear reactor. Design of I and C, especially for a life of 100 years, poses considerable challenges. Experience has shown that ageing and obsolescence have the potential to cause the maintainability and operability of I and C systems in Nuclear Power Plants to deteriorate well before the end of plant life. Hence, all ageing effects are to be detected in time and eliminated by repair, upgrading and replacement measures. However, since no I and C system can survive such a long life of 100 years, special attention is to be paid in the design to effect easy replacement. Every aspect of design of hardware and software should deal with obsolescence. Design strategies like minimising the amount of cabling by resorting to networked data communication will go a long way in achieving the desired life extension. Hence it is essential that an effective Ageing Management Programme be established at the very initial stages of design, planning and engineering of I and C systems for AHWR. This will ensure reliable continued operation of I and C systems for 100 years of life. (author)

  3. 100 Years of the American Economic Review : The Top 20 Articles

    Kenneth J. Arrow; B. Douglas Bernheim; Martin S. Feldstein; Daniel L. McFadden; James M. Poterba; Robert M. Solow

    2011-01-01

    This paper presents a list of the top 20 articles published in the American Economic Review during its first 100 years. This list was assembled in honor of the AER's one-hundredth anniversary by a group of distinguished economists at the request of the AER's editor. A brief description accompanies the citations of each article.

  4. The Observation Of Defects Of School Buildings Over 100 Years Old In Perak

    Alauddin Kartina

    2016-01-01

    Malaysia is blessed with a rich legacy of heritage buildings with unique architectural and historical values. These heritage buildings have become a symbol of the national identity of our country. Therefore, heritage buildings, as important monuments, should be conserved well to ensure the extension of each building's life span and the continuity of its functions for future generations. The aim of this study is to analyze the types of defects found in school buildings over 100 years old located in Perak. The data were collected in four different schools aged over 100 years in Perak. The finding of the study highlighted the types of defects, which were categorized based on building elements, including external wall, roof, door, ceiling, staircase, column, internal wall, floor and windows. The findings showed that the types of defects occurring in school buildings over 100 years old in Perak are the same as in other heritage buildings. These findings can be used by all parties to take serious action in preventing defects from occurring in buildings over 100 years old. This would ensure that buildings' functional life span can be extended for future use.

  5. Simulations of the Greenland ice sheet 100 years into the future with the full Stokes model Elmer/Ice

    Seddik, H.; Greve, R.; Zwinger, T.; Gillet-Chaulet, F.; Gagliardini, O.

    2011-12-01

    Two sets of experiments were run: set C (three experiments) perturbs the surface precipitation and temperature, and set S (three experiments) applies an amplification factor to change the basal sliding velocity. The experiments are compared to a constant climate control run beginning at present (epoch 2004-1-1 0:0:0) and running up to 100 years holding the climate constant at its present state. The experiments with the amplification factor (set S) show high sensitivities. Relative to the control run, the scenario with an amplification factor of 3x applied to the sliding velocity produces a Greenland contribution to sea level rise of ~25 cm. An amplification factor of 2.5x produces a contribution of ~16 cm and an amplification factor 2x produces a contribution of ~9 cm. The experiments with the changes to the surface precipitation and temperature (set C) show a contribution to sea level rise of ~4 cm when a factor 1x is applied to the temperature and precipitation anomalies. A factor 1.5x produces a sea level rise of ~8 cm and a factor 2x produces a sea level rise of ~12 cm.

  6. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system to the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for process control) which facilitates an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full scale plant. The control parameters obtained by simulation were suitable for the full scale plant with only a few modifications to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP. PMID:21330730

  7. Lessons learned applying CASE methods/tools to Ada software development projects

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  8. 100-year history of the development of bread winter wheat breeding programs

    Литвиненко, М. А.

    2016-01-01

    Purpose. Review of the main achievements of the Wheat Breeding and Seed Production Department in the Plant Breeding and Genetic Institute – National Centre of Seed and Cultivar Investigation in developing the theoretical principles of breeding and the creation of winter wheat varieties of different types during the 100-year (1916–2016) period of breeding program realization. Results. The main theoretical, methodical developments and breeding achievements of Wheat Breeding and Seed Production Departme...

  9. 100 years of mapping the Holocene Rhine-Meuse delta plain: combining research and teaching

    Cohen, K.M.; Stouthamer, E.; Hoek, W.Z.; Middelkoop, H.

    2012-01-01

    The history of modern soil, geomorphological and shallow geological mapping in the Holocene Rhine-Meuse delta plain goes back about 100 years. The delta plain is of very heterogeneous build up, with clayey and peaty flood basins, dissected by sandy fluvial distributary channel belts with fine textured levees grading into tidal-influenced rivers and estuaries. Several generations of precursor rivers occur as alluvial ridges and buried ribbon sands. They form an intricate network originating fr...

  10. Comparative structural response of two steel bridges constructed 100 years apart

    Varum, Humberto; Sousa, Romain; Delgado, Walter; Fernandes, Catarina; Costa, Anibal; Jara, Jose M.; Jara, Manuel; Álvarez, Jose J.

    2011-01-01

    This paper presents a comparative numerical analysis of the structural behaviour and seismic performance of two existing steel bridges, the Infiernillo II Bridge and the Pinhao Bridge, one located in Mexico and the other in Portugal. The two bridges have similar general geometrical characteristics, but were constructed 100 years apart. Three-dimensional structural models of both bridges are developed and analysed for various load cases and several seismic conditions. The results of the compar...

  11. 100 years of Einstein's theory of Brownian motion: from pollen grains to protein trains

    Chowdhury, Debashish

    2005-01-01

    Experimental verification of the theoretical predictions made by Albert Einstein in his paper, published in 1905, on the molecular mechanisms of Brownian motion established the existence of atoms. In the last 100 years, discoveries of many facets of the ubiquitous Brownian motion have revolutionized our fundamental understanding of the role of thermal fluctuations in the exotic structures and complex dynamics exhibited by soft matter like, for example, colloids, gels, etc. The domain of B...
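
    Einstein's 1905 result ties the mean-squared displacement of a suspended particle to measurable constants: <x^2> = 2Dt with D = kT/(6*pi*eta*r), the Stokes-Einstein relation. A quick sketch for an assumed 1-micron sphere in water at room temperature:

```python
import math
from scipy.constants import k as k_B  # Boltzmann constant (J/K)

T = 293.0          # temperature (K), assumed
eta = 1.0e-3       # viscosity of water (Pa*s), assumed
r = 0.5e-6         # particle radius (m): a 1-micron sphere, assumed

D = k_B * T / (6 * math.pi * eta * r)   # Stokes-Einstein diffusion coefficient
rms_1s = math.sqrt(2 * D * 1.0)         # one-dimensional RMS displacement after 1 s
print(f"D = {D:.2e} m^2/s, RMS displacement in 1 s = {rms_1s * 1e6:.2f} um")
# ~0.9 um in one second: visible under a microscope, as Perrin confirmed
```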

  12. The Extent of Applying Strategic Management Accounting Tools in Jordanian Banks

    Musa Abdel Latif Ibrahim Alnawaiseh

    2013-01-01

    The study aims to determine the extent of applying strategic management accounting tools in Jordanian banks. The tools that this study tested are: Activity Based Costing, Benchmarking, Competitor Analysis, Valuing Customers, Integrated Performance Measurement, Life Cycle Costing, Cost of Quality, Brand Value Monitoring, Managing and Budgeting, Strategic Pricing, Target Costing, Value Chain Costing and Balanced Scorecard. To analyze the responses of the respondents (employees) in these banks, and the e...

  13. Accuracy assessment of the UT1 prediction method based on 100-year series analysis

    Malkin, Z.; Tissen, V.; Tolstikov, A.

    2013-01-01

    A new method has been developed at the Siberian Research Institute of Metrology (SNIIM) for highly accurate prediction of UT1 and Pole coordinates. The method is based on the construction of a general polyharmonic model of the variations of the Earth rotation parameters using all the data available for the last 80-100 years, and a modified autoregression technique. In this presentation, a detailed comparison was made of real-time UT1 predictions computed using this method in 2006-2010 with ...
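
    A polyharmonic model of this kind is ordinary least squares on sine/cosine regressors, with an autoregressive model left for the residuals. A minimal sketch on synthetic data (the periods and model form are assumed for illustration, not SNIIM's actual model):

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(3650.0)                               # days of synthetic UT1-like data
periods = [365.25, 182.63, 27.55, 13.66]            # annual, semiannual, tidal terms (days), assumed
y = 0.02 * np.sin(2 * np.pi * t / 365.25) + 0.005 * rng.normal(size=t.size)

# Design matrix: constant, trend, and a sine/cosine pair per period
cols = [np.ones_like(t), t]
for P in periods:
    cols += [np.sin(2 * np.pi * t / P), np.cos(2 * np.pi * t / P)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Extrapolate the harmonic model; residuals would feed the autoregressive step
t_future = np.arange(t[-1] + 1, t[-1] + 91)
A_future = np.column_stack([np.ones_like(t_future), t_future] +
                           [f(2 * np.pi * t_future / P) for P in periods for f in (np.sin, np.cos)])
pred = A_future @ coef
print(f"90-day prediction range: {pred.min():.4f} to {pred.max():.4f} s")
```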

  14. Oceanic environmental changes of subarctic Bering Sea in recent 100 years: Evidence from molecular fossils

    LU Bing; CHEN Ronghua; ZHOU Huaiyang; WANG Zipan; CHEN Jianfang; ZHU Chun

    2005-01-01

    The core sample B2-9 from the seafloor of the subarctic Bering Sea was dated with 210Pb to obtain a consecutive sequence of oceanic sedimentary environments at decadal intervals during 1890-1999. A variety of molecular fossils were detected, including n-alkanes, isoprenoids, fatty acids, sterols, etc. From the characteristics of these molecules (C27, C28, and C29 sterols) and their molecular indices (Pr/Ph, ∑C22+/∑C21-, CPI and C18:2/C18:0), and in consideration of the variation of organic carbon content, the 100-year evolution history of the subarctic sea paleoenvironment was reconstructed. It is indicated that during the past 100 years in the Arctic, there were two events of strong climate warming (1920-1950 and 1980-1999), which resulted in an oxidizing sedimentary environment owing to decreasing terrigenous organic matter and increasing marine-derived organic matter, and two events of transitory climate cooling (1910 and 1970-1980), which resulted in a slightly reducing sedimentary environment owing to increasing terrigenous organic matter and decreasing marine-derived organic matter. It is revealed that these processes of alternating warm and cool climate are directly related to the Arctic and global climate variations.
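
    The 210Pb chronology underneath rests on simple decay arithmetic: under a constant-initial-concentration assumption, the age at depth z is t = ln(C0/Cz)/lambda with lambda = ln(2)/22.3 per year. A small illustrative calculation (activity values assumed, not the core data):

```python
import math

HALF_LIFE = 22.3                      # Pb-210 half-life in years
lam = math.log(2) / HALF_LIFE         # decay constant (1/yr)

def cic_age(c_surface: float, c_depth: float) -> float:
    """Sediment age (years) under the constant-initial-concentration model."""
    return math.log(c_surface / c_depth) / lam

# Example: excess Pb-210 activity falls from 20 to 5 Bq/kg down-core (assumed values)
print(f"age = {cic_age(20.0, 5.0):.0f} years")   # two half-lives -> ~45 years
```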

  15. Extending dry storage of spent LWR fuel for up to 100 years

    Because of delays in closing the back end of the fuel cycle in the U.S., there is a need to extend dry inert storage of spent fuel beyond its originally anticipated 20-year duration. Many of the methodologies developed to support initial licensing for 20-year storage should be able to support the longer storage periods envisioned. This paper evaluates the applicability of existing information and methodologies to support dry storage up to 100 years. The thrust of the analysis is the potential behavior of the spent fuel. In the USA, the criteria for dry storage of LWR spent fuel are delineated in 10 CFR 72. The criteria fall into four general categories: maintain subcriticality, prevent the release of radioactive material above acceptable limits, ensure that radiation rates and doses do not exceed acceptable levels, and maintain retrievability of the stored radioactive material. These criteria need to be considered for normal, off-normal, and postulated accident conditions. The initial safety analysis report submitted for licensing evaluated the fuel's ability to meet the requirements for 20 years. It is not the intent to repeat these calculations, but to look at expected behavior over the additional 80 years, during which the temperatures and radiation fields are lower. During the first 20 years, the properties of the components may change because of elevated temperatures, presence of moisture, effects of radiation, etc. During normal storage in an inert atmosphere, there is potential for the cladding mechanical properties to change due to annealing or interaction with cask materials. The emissivity of the cladding could also change due to storage conditions. If there is air leakage into the cask, additional degradation could occur through oxidation in breached rods, which could lead to additional fission gas release and enlargement of cladding breaches. Air in-leakage could also affect cover gas conductivity, cladding oxidation, emissivity changes, and excessive

  17. Sustainable Foods and Medicines Support Vitality, Sex and Longevity for a 100-Year Starship Expedition

    Edwards, M. R.

    Extended space flight requires foods and medicines that sustain crew health and vitality. The health and therapeutic needs of the entire crew and their children over a 100-year space flight must be sustainable. The starship cannot depend on resupply or carry a large cargo of pharmaceuticals. Everything in the starship must be completely recyclable and reconstructable, including food, feed, textiles, building materials, pharmaceuticals, vaccines, and medicines. Smart microfarms will produce functional foods with superior nutrition and sensory attributes. These foods provide high-quality protein and nutralence (nutrient density) that helps avoid obesity, diabetes, and other Western diseases. The combination of functional foods, lifestyle actions, and medicines will support crew immunity, energy, vitality, sustained strong health, and longevity. Smart microfarms enable the production of fresh medicines in hours or days, eliminating the need for a large dispensary and with it any concern over drug shelf life. Smart microfarms are adaptable to the extreme growing-area, resource, and environmental constraints associated with an extended starship expedition.

  18. 100 Years of British military neurosurgery: on the shoulders of giants.

    Roberts, S A G

    2015-01-01

    Death from head injuries has been a feature of conflicts throughout the world for centuries. The burden of mortality has been variously affected by the evolution in weaponry from war-hammers to explosive ordnance, the influence of armour on survivability and the changing likelihood of infection as a complicating factor. Surgery evolved from haphazard trephination to valiant, yet disjointed, neurosurgery by a variety of great historical surgeons until the Crimean War of 1853-1856. However, it was events initiated by the Great War of 1914-1918 that not only marked the development of modern neurosurgical techniques, but our approach to military surgery as a whole. Here the author describes how 100 years of conflict and the input and intertwining relationships between the 20th century's great neurosurgeons established neurosurgery in the United Kingdom and beyond. PMID:26292388

  19. Were moas really hunted to extinction in less than 100 years?

    Three months ago New Zealand archaeologists were surprised to read in their daily newspapers that moas had been eaten to extinction by Maori moahunters in less than 100 years. The claim had been made in the US journal 'Science' by Richard Holdaway, formerly with Canterbury University, and Chris Jacomb of Canterbury Museum. It seems to me there are a number of weaknesses in the original paper, which should have been thrashed out locally before going for prestigious exposure overseas. The rapid extinction claim is based first of all on a 'Leslie matrix model' of moa population dynamics, and secondly on some recent carbon dates of a single archaeological site, Monck's Cave, near Christchurch. 21 refs

  20. Degradation of building materials over a lifespan of 30-100 years

    Following preliminary visits to four Magnox Nuclear Power Stations, a study was made of existing Central Electricity Generating Board (CEGB) reports on the condition of buildings at eight Power Stations. Sampling of building materials, non-destructive testing and inspections were carried out at Trawsfynydd, Oldbury and Dungeness 'A' Magnox Power Stations, and the samples were subsequently laboratory tested. From the results of this work it can be concluded that little major deterioration is likely to occur in the reactor buildings at Trawsfynydd and Oldbury over the next 50 years, and at Dungeness 'A' for at least 25 years, assuming reasonable maintenance and the continuation of suitable internal temperatures and relative humidities. Because of the limitations on taking samples from, and tests on, the reactor biological shields and prestressed concrete vessel, no sensible forecast can be made of their potential life in the 75-100 year range

  1. The volcanic contribution to climate change of the past 100 years

    Volcanic eruptions which inject large amounts of sulfur-rich gas into the stratosphere produce dust veils which last several years and cool the earth's surface. At the same time these dust veils absorb enough solar radiation to warm the stratosphere. Since these temperature changes at the earth's surface and in the stratosphere are both in the opposite direction to the hypothesized effects from greenhouse gases, they act to delay and mask the detection of greenhouse effects on the climate system. A large portion of the global climate change of the past 100 years may be due to the effects of volcanoes, but a definitive answer is not yet clear. While effects over several years have been demonstrated with both data studies and numerical models, long-term effects, while found in climate model calculations, await confirmation with more realistic models. In this paper chronologies of past volcanic eruptions and the evidence from data analyses and climate model calculations are reviewed

  2. Rapid warming in mid-latitude central Asia for the past 100 years

    Fahu CHEN; Jinsong WANG; Liya JIN; Qiang ZHANG; Jing LI; Jianhui CHEN

    2009-01-01

    Surface air temperature variations during the last 100 years (1901-2003) in mid-latitude central Asia were analyzed using Empirical Orthogonal Functions (EOFs). The results suggest that temperature variations in four major sub-regions, i.e. the eastern monsoonal area, central Asia, the Mongolian Plateau and the Tarim Basin, are coherent and characterized by a striking warming trend during the last 100 years. The annual mean temperature increase rates in the four sub-regions (representative stations) are 0.19℃, 0.16℃, 0.23℃ and 0.15℃ per decade, respectively. The average annual mean temperature increase rate of the four sub-regions is 0.18℃ per decade, with a greater rate in winter (0.21℃ per decade). In the Asian mid-latitude areas, surface air temperature increased relatively slowly from the 1900s to the 1970s and has increased rapidly since the 1970s. This pattern of temperature variation differs from that in the other areas of China. Notably, there was no obvious warming between the 1920s and 1940s, with temperature fluctuating between warming and cooling trends (e.g. 1920s, 1940s, 1960s, 1980s, 1990s). However, the warming trends are of greater magnitude and longer duration than the cooling periods, which leads to an overall warming. The amplitude of temperature variations in the study region is also larger than that in eastern China during different periods.
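
    The per-decade rates quoted above are linear trends; note also that the four sub-regional rates average to (0.19 + 0.16 + 0.23 + 0.15)/4 ≈ 0.18℃ per decade, the figure given for the region as a whole. A minimal sketch of such a trend computation on a synthetic annual series (the EOF decomposition itself is omitted):

        import numpy as np

        years = np.arange(1901, 2004)
        rng = np.random.default_rng(1)
        # Hypothetical annual mean temperature anomalies (deg C) for one sub-region.
        temps = 0.018 * (years - years[0]) + 0.3 * rng.standard_normal(years.size)

        slope_per_year = np.polyfit(years, temps, 1)[0]   # least-squares linear trend
        print(f"warming rate: {10 * slope_per_year:.2f} deg C per decade")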

  3. Prediction of Climatic Change for the Next 100 Years in the Apulia Region, Southern Italy

    Mladen Todorovic

    2007-12-01

    The impact of climate change on water resources and their use for agricultural production has become a critical question for sustainability. Our objective was to investigate the impact of the expected climate changes over the next 100 years on water balance variations, climatic classifications, and crop water requirements in the Apulia region (Southern Italy). The results indicated that an increase of temperature in the range between 1.3 and 2.5 °C is expected in the next 100 years. The reference evapotranspiration (ETo) variations would follow a similar trend; averaged over the whole region, the ETo increase would be about 15.4%. Precipitation will not change significantly on a yearly basis, although a slight decrease in summer months and a slight increase during the winter season are foreseen. The climatic water deficit (CWD) is largely driven by the ETo increase, and it would increase over the whole Apulia region by more than 200 mm on average. According to the Thornthwaite and Mather climate classification, the moisture index will decrease in the future, with humid areas shrinking and arid zones expanding. The net irrigation requirements (NIR), calculated for ten major crops in the Apulia region, would increase significantly in the future. By the end of the 21st century, the foreseen increase of NIR with respect to the present situation is greatest for olive tree (65%), wheat (61%), grapevine (49%), and citrus (48%), and slightly lower for maize (35%), sorghum (34%), sunflower (33%), tomato (31%), and winter and spring sugar beet (both 27%).

  4. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models based on partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES: [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I., & Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press).

  5. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.
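
    The authors' exact area-scale procedure is not reproduced here, but the underlying idea - evaluating a surface-metrology parameter over progressively smaller observation scales - can be sketched as follows, using the conventional areal roughness parameter Sa on a synthetic height map. Everything below is an illustrative assumption, not their published workflow.

        import numpy as np

        rng = np.random.default_rng(2)
        # Hypothetical confocal height map of a worn surface (arbitrary units).
        z = rng.standard_normal((256, 256)).cumsum(0).cumsum(1)

        def sa(patch):
            """Arithmetic mean height Sa: mean absolute deviation from the mean plane."""
            return np.mean(np.abs(patch - patch.mean()))

        # Evaluate Sa in square windows of decreasing size, averaging over all
        # windows at each scale, to characterize wear as a function of scale.
        for w in (256, 128, 64, 32, 16):
            vals = [sa(z[i:i + w, j:j + w])
                    for i in range(0, 256, w) for j in range(0, 256, w)]
            print(f"scale {w:3d} px: mean Sa = {np.mean(vals):.2f}")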

  6. DIII-D integrated plasma control tools applied to next generation tokamaks

    A complete software suite for integrated tokamak plasma control has been developed within the DIII-D program. The suite consists of software for real-time control of all aspects of the plasma, modeling, simulation and design tools for analysis and development of controllers, a flexible and modular architecture for implementation and testing of algorithms and many fully validated models. Many elements of the system have been applied to and implemented on NSTX and MAST. The DIII-D realtime plasma control system together with the integrated modeling and simulation suite have been selected for operational use by both the KSTAR and EAST tokamaks, and are also being used at General Atomics to investigate control issues for ITER

  7. Interferometric Techniques Applied to the Gemona (Friuli, Italy) Area as a Tool for Structural Analysis.

    Sternai, P.; Calcagni, L.; Crippa, B.

    2009-04-01

    We suggest a possible exploitation of radar interferometry for estimating many features of the brittle deformation occurring at the very surface of the Earth, such as the length of the dislocation front, the total amount of the dislocation, and the dislocation rate over the time interval considered. Interferometric techniques allow obtaining highly reliable vertical velocity values of the order of 1 mm/yr, with a maximum resolution of 80 m². The values obtained always refer to the temporal interval considered, which depends on the availability of SAR images. We demonstrate that it is possible to see the evolution and the behaviour of the main tectonic lineament of the considered area even over short periods of time (a few years). We describe the results of a procedure to calculate terrain motion velocity on highly correlated pixels of an area near Gemona (Friuli, Northern Italy), and then present some considerations, based on three successful examples of the analysis, on how to exploit these results in a structural-geological description of the area. The versatility of the technique, the large dimensions of the area that can be analyzed (10,000 km²), and the high precision and reliability of the results obtained make radar interferometry a powerful tool not only to monitor the dislocation occurring at the surface, but also to obtain important information on the structural evolution of mountain belts, otherwise very difficult to recognize.

  8. Strategy for 100-year life of the ACR-1000 concrete containment structure

    The purpose of this paper is to present the Plant Life Management (PLiM) strategy for the concrete containment structure of the ACR-1000 (Advanced CANDU Reactor) designed by AECL. The ACR-1000 is designed for a 100-year plant life, comprising a 60-year operating life and an additional 40-year decommissioning period. The approach adopted for the PLiM strategy of the concrete containment structure is a preventive one, the key areas being: 1) design methodology, 2) material performance, and 3) life cycle management and an ageing management program. In the design phase, in addition to strength and serviceability, durability is a major requirement during the service life and decommissioning phase of the ACR structure. Parameters affecting durability design include: a) concrete performance, b) structural application, and c) environmental conditions. Because of the complex nature of the environmental effects acting on structures during the service life of the project, genuinely improved in-service performance is best achieved by improving the material characteristics. Many recent innovations in advanced concrete materials technology have made it possible to produce modern concretes, such as high-performance concrete, with exceptional performance characteristics. In this paper, the PLiM strategy for the ACR-1000 concrete containment is presented. In addition to addressing the design methodology and material performance areas, a systematic approach to the ageing management program for the concrete containment structure is presented. (author)

  10. Lessons to be learned from an analysis of ammonium nitrate disasters in the last 100 years

    Pittman, William; Han, Zhe; Harding, Brian; Rosas, Camilo; Jiang, Jiaojun; Pineda, Alba; Mannan, M. Sam, E-mail: mannan@tamu.edu

    2014-09-15

    Highlights: • Root causes and contributing factors from ammonium nitrate incidents are categorized into 10 lessons. • The lessons learned from the past 100 years of ammonium nitrate incidents can be used to improve design, operation, and maintenance procedures. • Improving organizational memory to help improve safety performance. • Combating and changing organizational cultures.
    Abstract: Process safety, as well as the safe storage and transportation of hazardous or reactive chemicals, has been a topic of increasing interest in the last few decades. The increased interest in improving the safety of operations has been driven largely by a series of recent catastrophes that have occurred in the United States and the rest of the world. A continuous review of past incidents and disasters to look for common causes and lessons is an essential component of any process safety and loss prevention program. While analyzing the causes of an accident cannot prevent that accident from occurring, learning from it can help to prevent future incidents. The objective of this article is to review a selection of major incidents involving ammonium nitrate in the last century to identify common causes and lessons that can be gleaned from these incidents in the hopes of preventing future disasters. Ammonium nitrate has been involved in dozens of major incidents in the last century, so a subset of major incidents was chosen for discussion for the sake of brevity. Twelve incidents are reviewed and ten lessons from these incidents are discussed.

  11. Surveillance as an innovative tool for furthering technological development as applied to the plastic packaging sector

    Freddy Abel Vargas

    2010-04-01

    The demand for production process efficiency and quality has made it necessary to resort to new tools for development and technological innovation. Surveillance of the environment has thus been identified as being a priority, paying special attention to technology which (by its changing nature) is a key factor in competitiveness. Surveillance is a routine activity in developed countries' organisations; however, few suitable studies have been carried out in Colombia and few instruments produced for applying it to existing sectors of the economy. The present article attempts to define a methodology for technological awareness (based on transforming the information contained in databases by means of constructing technological maps) contributing useful knowledge to production processes. This methodology has been applied to the flexible plastic packaging sector. The main trends in this industry's technological development were identified, allowing strategies to be proposed for incorporating these advances and tendencies in national companies and research groups involved in flexible plastic packaging technological development and innovation. Technological mapping's possibilities as an important instrument for producing technological development in a given sector are then analysed, as are its possibilities for being used in other production processes.

  12. Quantitative tools for comparing animal communication systems: information theory applied to bottlenose dolphin whistle repertoires.

    McCowan; Hanser; Doyle

    1999-02-01

    Comparative analysis of nonhuman animal communication systems and their complexity, particularly in comparison to human language, has been generally hampered by both a lack of sufficiently extensive data sets and appropriate analytic tools. Information theory measures provide an important quantitative tool for examining and comparing communication systems across species. In this paper we use the original application of information theory, that of statistical examination of a communication system's structure and organization. As an example of the utility of information theory to the analysis of animal communication systems, we applied a series of information theory statistics to a statistically categorized set of bottlenose dolphin (Tursiops truncatus) whistle vocalizations. First, we use the first-order entropic relation in a Zipf-type diagram (Zipf 1949, Human Behavior and the Principle of Least Effort) to illustrate the application of temporal statistics as comparative indicators of repertoire complexity, and as possible predictive indicators of acquisition/learning in animal vocal repertoires. Second, we illustrate the need for more extensive temporal data sets when examining the higher entropic orders, indicative of higher levels of internal informational structure, of such vocalizations, which could begin to allow the statistical reconstruction of repertoire organization. Third, we propose using 'communication capacity' as a measure of the degree of temporal structure and complexity of statistical correlation, represented by the values of entropic order, as an objective tool for interspecies comparison of communication complexity. In doing so, we introduce a new comparative measure, the slope of Shannon entropies, and illustrate how it potentially can be used to compare the organizational complexity of vocal repertoires across a diversity of species. Finally, we illustrate the nature and predictive application of these higher-order entropies using a preliminary ...
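
    A minimal sketch of the two first-order quantities discussed above - the first-order Shannon entropy and the slope of a Zipf-type rank-frequency plot - applied to a hypothetical sequence of categorized whistle types (the letters are invented labels, not dolphin data):

        import numpy as np
        from collections import Counter

        seq = list("ABACABDABACABEABACAF")   # hypothetical categorized whistles
        counts = np.array(sorted(Counter(seq).values(), reverse=True), float)
        p = counts / counts.sum()

        # First-order Shannon entropy (bits per whistle).
        H1 = -np.sum(p * np.log2(p))

        # Zipf-type diagram: slope of log10(frequency) against log10(rank);
        # slopes near -1 are read as language-like repertoire structure.
        ranks = np.arange(1, counts.size + 1)
        zipf_slope = np.polyfit(np.log10(ranks), np.log10(counts), 1)[0]
        print(f"H1 = {H1:.2f} bits, Zipf slope = {zipf_slope:.2f}")

    Higher entropic orders would condition each whistle on progressively longer histories, which is why the authors stress the need for much larger data sets.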

  13. The Emergence of Gravitational Wave Science: 100 Years of Development of Mathematical Theory, Detectors, Numerical Algorithms, and Data Analysis Tools

    Holst, Michael; Tiglio, Manuel; Vallisneri, Michele

    2016-01-01

    On September 14, 2015, the newly upgraded Laser Interferometer Gravitational-wave Observatory (LIGO) recorded a loud gravitational-wave (GW) signal, emitted a billion light-years away by a coalescing binary of two stellar-mass black holes. The detection was announced in February 2016, in time for the hundredth anniversary of Einstein's prediction of GWs within the theory of general relativity (GR). The signal represents the first direct detection of GWs, the first observation of a black-hole binary, and the first test of GR in its strong-field, high-velocity, nonlinear regime. In the remainder of its first observing run, LIGO observed two more signals from black-hole binaries, one moderately loud, another at the boundary of statistical significance. The detections mark the end of a decades-long quest, and the beginning of GW astronomy: finally, we are able to probe the unseen, electromagnetically dark Universe by listening to it. In this article, we present a short historical overview of GW science: this youn...

  14. 100-Year Floodplains, flood plain, Published in 2009, 1:24000 (1in=2000ft) scale, Washington County.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Other information as of 2009. It is described as 'flood...

  15. To Humbly Go: Guarding Against Perpetuating Models of Colonization in the 100-Year Starship Study

    Kramer, W. R.

    Past patterns of exploration, colonization and exploitation on Earth continue to provide the predominant paradigms that guide many space programs. Any project of crewed space exploration, especially of the magnitude envisioned by the 100-Year Starship Study, must guard against the hubris that may emerge among planners, crew, and others associated with the project, including those industries and bureaucracies that will emerge from the effort. Maintaining a non-exploitative approach may be difficult in consideration of the century of preparatory research and development and the likely multigenerational nature of the voyage itself. Starting now with mission dreamers and planners, the purpose of the voyage must be cast as one of respectful learning and humble discovery, not of conquest (either actual or metaphorical) or other inappropriate models, including military. At a minimum, the Study must actively build non-violence into the voyaging culture it is beginning to create today. References to exploitive colonization, conquest, destiny and other terms from especially American frontier mythology, while tempting in their propagandizing power, should be avoided as they limit creative thinking about alternative possible futures. Future voyagers must strive to adapt to new environments wherever possible and be assimilated by new worlds both biologically and behaviorally rather than to rely on attempts to recreate the Earth they have left. Adaptation should be strongly considered over terraforming. This paper provides an overview of previous work linking the language of colonization to space programs and challenges the extension of the myth of the American frontier to the Starship Study. It argues that such metaphors would be counter-productive at best and have the potential to doom long-term success and survival by planting seeds of social decay and self-destruction. Cautions and recommendations are suggested.

  16. 100 years of California’s water rights system: patterns, trends and uncertainty

    Grantham, Theodore E.; Viers, Joshua H.

    2014-08-01

    For 100 years, California’s State Water Resources Control Board and its predecessors have been responsible for allocating available water supplies to beneficial uses, but inaccurate and incomplete accounting of water rights has made the state ill-equipped to satisfy growing societal demands for water supply reliability and healthy ecosystems. Here, we present the first comprehensive evaluation of appropriative water rights to identify where, and to what extent, water has been dedicated to human uses relative to natural supplies. The results show that water right allocations total 400 billion cubic meters, approximately five times the state’s mean annual runoff. In the state’s major river basins, water rights account for up to 1000% of natural surface water supplies, with the greatest degree of appropriation observed in tributaries to the Sacramento and San Joaquin Rivers and in coastal streams in southern California. Comparisons with water supplies and estimates of actual use indicate substantial uncertainty in how water rights are exercised. In arid regions such as California, over-allocation of surface water coupled with trends of decreasing supply suggest that new water demands will be met by re-allocation from existing uses. Without improvements to the water rights system, growing human and environmental demands portend an intensification of regional water scarcity and social conflict. California’s legal framework for managing its water resources is largely compatible with needed reforms, but additional public investment is required to enhance the capacity of the state’s water management institutions to effectively track and regulate water rights.

  17. Land use mapping from CBERS-2 images with open source tools by applying different classification algorithms

    Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.

    2016-02-01

    Land cover classification is often based on different characteristics between classes, but with great homogeneity within each one of them. This cover is obtained through field work or by means of processing satellite images. Field work involves high costs; therefore, digital image processing techniques have become an important alternative to perform this task. However, in some developing countries, and particularly in Casacoima municipality in Venezuela, there is a lack of geographic information systems due to the lack of updated information and the high costs of software license acquisition. This research proposes a low-cost methodology to develop thematic mapping of local land use and types of coverage in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network using open source tools. Supervised classification, both per pixel and per region, was applied using different classification algorithms, which were compared among themselves. Per-pixel classification was based on the Maxver (maximum likelihood) and Euclidean distance (minimum distance) algorithms, while per-region classification was based on the Bhattacharya algorithm. Satisfactory results were obtained from per-region classification, with an overall reliability of 83.93% and a kappa index of 0.81. The Maxver algorithm showed a reliability of 73.36% and a kappa index of 0.69, while Euclidean distance obtained 67.17% and 0.61 for reliability and kappa index, respectively. It was demonstrated that the proposed methodology is very useful for cartographic processing and updating, which in turn supports the development of management and land-use plans. Hence, open source tools proved to be an economically viable alternative not only for forestry organizations but for the general public, allowing them to develop projects in economically depressed and/or environmentally threatened areas.
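
    The reliability and kappa figures above are derived from a confusion matrix that compares mapped classes against reference data. A minimal sketch of both computations on a hypothetical three-class matrix:

        import numpy as np

        # Rows = reference classes, columns = mapped classes (hypothetical counts).
        cm = np.array([[52,  3,  1],
                       [ 4, 47,  5],
                       [ 2,  6, 40]], dtype=float)

        n = cm.sum()
        po = np.trace(cm) / n                 # overall accuracy (observed agreement)
        pe = (cm.sum(0) @ cm.sum(1)) / n**2   # chance agreement from the marginals
        kappa = (po - pe) / (1 - pe)          # Cohen's kappa index
        print(f"overall accuracy = {100 * po:.2f}%, kappa = {kappa:.2f}")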

  18. Underworld-GT Applied to Guangdong, a Tool to Explore the Geothermal Potential of the Crust

    Steve Quenette; Yufei Xi; John Mansour; Louis Moresi; David Abramson

    2015-01-01

    Geothermal energy potential is usually discussed in the context of conventional or engineered systems and at the scale of an individual reservoir. Whereas exploration for conventional reservoirs has been relatively easy, with expressions of resource found close to or even at the surface, exploration for non-conventional systems relies on temperature inherently increasing with depth and on searching for favourable geological environments that maximise this increase. To utilise the information we do have, we often assimilate available exploration data with models that capture the physics of the dominant underlying processes. Here, we discuss computational modelling approaches to exploration at a regional or crust scale, with application to geothermal reservoirs within basins or systems of basins. Target reservoirs have (at least) appropriate temperature and permeability, and are at accessible depths. We discuss the software development approach that leads to effective use of the tool Underworld. We explore its role in the process of modelling, understanding computational error, and importing and exporting geological knowledge as applied to the geological system underpinning the Guangdong Province, China.

  19. Climatic and Hydrological Changes of Past 100 Years in Asian Arid Zone

    Feng, Zhaodong; Salnikov, Vitaliy; Xu, Changchun

    2014-05-01

    The Asian Arid Zone (AAZ) is here defined to include the following regions: northwestern China, Mongolia, Kazakhstan, Kyrgyzstan, Tajikistan, Turkmenistan, and Uzbekistan. Generally speaking, the AAZ has experienced a temperature rise during the past 100 years that was significantly faster than the global average (0.14 ºC per decade). Specifically, the rate was 0.39 ºC per decade in northwestern China (1950-2010), 0.26 ºC per decade in Kazakhstan (1936-2005), 0.22 ºC per decade in Mongolia (1940-2010), 0.29 ºC per decade in Uzbekistan (1950-2005), and 0.18 ºC per decade in Turkmenistan (1961-1995). It should be noted that the mountainous parts of the AAZ seem to have experienced a slower rate of temperature rise. For example, the rate was 0.10 ºC per decade in Tajikistan (1940-2005) and 0.08 ºC per decade in Kyrgyzstan (1890-2005). Precipitation has a slight increasing trend in northwestern China, but it has fluctuated along a near-constant line in the rest of the AAZ. Hydrological data from high-elevation basins show that runoff has been increasing, primarily as a result of rising temperature that caused increases in ice melting. A natural decreasing trend of surface runoff in low-elevation basins is undeniable and is attributable to intensified evaporation under warming conditions. The total amount of runoff in the Tianshan Mountains and the associated basins has indeed increased, primarily as a result of temperature-driven increases in ice melting. However, approaching the turning point of glacier-melt supply to runoff will pose a great threat to socio-economic sustainability and to ecological security; the turning point refers to the transition from increasing to decreasing runoff within ice-melt-fed watersheds under a warming climate.

  20. A Concept for Testing Decision Support Tools in Participatory Processes Applied to the ToSIA Tool

    David Edwards

    2013-04-01

    ToSIA (Tool for Sustainability Impact Assessment) offers a transparent and consistent methodological framework to assess impacts of changes (technological, policy, management, etc.) in the forest-based sector. This tool is able to facilitate the decision making process within and between diverse groups of stakeholders (e.g., forest managers and policymakers), as it provides a neutral, transparent and data-driven platform for stakeholder interaction and communication. To test these capabilities of ToSIA, a practical approach to testing whether a decision support system is suitable for participatory processes was developed, based on a set of evaluation criteria for participatory processes. ToSIA's performance was assessed and discussed against a selection of criteria for successful participatory processes: six criteria were fulfilled by ToSIA, in nine ToSIA is potentially helpful, in two ToSIA has no influence, and for three criteria no experience exists so far. As a result, ToSIA's conceptual suitability as a participatory decision support system was confirmed for two interlinked roles: as a decision support system to assess alternative scenarios, and as a communication platform for stakeholder interaction.

  1. Statistical tools applied for the reduction of the defect rate of coffee degassing valves

    Giorgio Olmi

    2015-04-01

    Coffee is a very common beverage exported all over the world: just after roasting, coffee beans are packed in plastic or paper bags, which then experience long transfers with long storage times. Fresh roasted coffee emits large amounts of CO2 for several weeks. This gas must be gradually released to prevent package over-inflation and to preserve aroma; moreover, beans must be protected from oxygen coming from outside. Therefore, one-way degassing valves are applied to each package: their correct functionality is strictly related to the interference coupling between their bodies and covers and to the correct assembly of the other involved parts. This work takes inspiration from an industrial problem: a company that assembles valve components, supplied by different manufacturers, observed a high defect rate affecting its valve production. An integrated approach, consisting of the adoption of quality charts, an experimental campaign for the dimensional analysis of the mating parts, and statistical processing of the data, was necessary to tackle the question. In particular, a simple statistical tool was made available to predict the defect rate and to identify the best strategy for its reduction. The outcome was that requiring a strict protocol regarding the combinations of parts from different manufacturers for assembly would have been almost ineffective. Conversely, this study led to the identification of the weak point in the manufacturing process of the mating components and to the suggestion of a slight improvement to be performed, with the final result of a significant (one order of magnitude) decrease in the defect rate.
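
    The abstract does not reproduce the statistical tool itself, but defect-rate prediction for an interference coupling is classically done by treating the cover-body interference as a difference of normally distributed dimensions. A hypothetical sketch of that calculation (all dimensions and limits are invented):

        from scipy.stats import norm

        # Hypothetical machined dimensions (mm): valve body bore and cover diameter.
        mu_body, sd_body = 10.00, 0.015
        mu_cover, sd_cover = 10.08, 0.020

        # Interference = cover - body; a difference of normals is itself normal.
        mu_i = mu_cover - mu_body
        sd_i = (sd_body**2 + sd_cover**2) ** 0.5

        # Assumed functional window for a sound press fit (mm).
        lo, hi = 0.03, 0.13
        defect_rate = norm.cdf(lo, mu_i, sd_i) + norm.sf(hi, mu_i, sd_i)
        print(f"predicted defect rate: {100 * defect_rate:.2f}%")

    Under these assumptions the defect rate is dominated by the spread of the mating dimensions rather than by which manufacturers' parts are combined, which mirrors the paper's conclusion that a combination protocol alone would have been almost ineffective.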

  2. Reply to the Comment of Leclercq et al. on "100-year mass changes in the Swiss Alps linked to the Atlantic Multidecadal Oscillation"

    M. Huss

    2010-12-01

    In their comment, Leclercq et al. argue that Huss et al. (2010) overestimate the effect of the Atlantic Multidecadal Oscillation (AMO) on the 100-year mass balance variations in the Swiss Alps because time series of conventional balances instead of reference-surface balances were used. Applying the same model as in Huss et al., we calculate time series of reference-surface mass balance and show that the difference between conventional and reference-surface mass balance is significantly smaller than stated in the comment. Both series exhibit very similar multidecadal variations. The opposing effects of retreat and surface lowering on mass balance partly cancel each other.

  3. The Gender Analysis Tools Applied in Natural Disasters Management: A Systematic Literature Review

    Sohrabizadeh, Sanaz; Tourani, Sogand; Khankeh, Hamid Reza

    2014-01-01

    Background: Although natural disasters have caused considerable damages around the world, and gender analysis can improve community disaster preparedness or mitigation, there is little research about the gendered analytical tools and methods in communities exposed to natural disasters and hazards. These tools evaluate gender vulnerability and capacity in pre-disaster and post-disaster phases of the disaster management cycle. Objectives: Identifying the analytical gender tools and the strength...

  4. Tool for Experimenting with Concepts of Mobile Robotics as Applied to Children's Education

    Jimenez Jojoa, E. M.; Bravo, E. C.; Bacca Cortes, E. B.

    2010-01-01

    This paper describes the design and implementation of a tool for experimenting with mobile robotics concepts, primarily for use by children and teenagers, or by the general public, without previous experience in robotics. This tool helps children learn about science in an approachable and interactive way, using scientific research principles in…

  5. XVII International Botanical Congress. 100 years after the II IBC in Vienna 1905. Abstracts

    The program of XVII IBC 2005 includes all aspects of basic and applied botanical research. Progress in the different sub-disciplines is revealed through plenary talks, general lectures, symposia, and poster sessions. This conference emphasizes the newest developments in the botanical sciences worldwide. (botek)

  6. (Journal of Applied Toxicology) BMDExpress Data Viewer: A Visualization Tool to Analyze BMDExpress Datasets

    Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure in human risk assessments. BMDExpress applies BMD modeling to transcriptomics datasets and groups genes to biological processes and pathways for rapid assessment of doses at whic...

  7. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  8. Development of an intelligent system for tool wear monitoring applying neural networks

    A. Antić

    2005-12-01

    Purpose: The objective of the research presented in the paper is to investigate, in laboratory conditions, the application possibilities of the proposed system for tool wear monitoring in hard turning, using modern tools and artificial intelligence (AI) methods.
    Design/methodology/approach: The work builds on basic theoretical principles and uses computational methods of simulation and neural network training, together with conducted experiments, to investigate the adequacy of the proposed setting.
    Findings: The paper presents tool wear monitoring for hard turning for certain types of neural network configurations, for which there are preconditions for extension with dynamic neural networks.
    Research limitations/implications: Future research should include the integration of the proposed system into the CNC machine, instead of the current separate system, which would provide synchronisation between the system and the machine, i.e. an appropriate reaction by the machine after excessive tool wear is detected.
    Practical implications: Practical application of the conducted research is possible, with certain restrictions, once an adequate number of additional experiments are directed towards the particular combinations of machined materials and tools for which the neural networks are trained.
    Originality/value: The contribution of the conducted research lies in one possible view of the tool monitoring system model and its design on a modular principle, and in the principle of building the neural network.
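
    The paper's network architecture is not given in this abstract; as a generic illustration of the approach, the sketch below trains a tiny one-hidden-layer network on synthetic cutting-signal features to flag excessive wear. The feature set and all data are invented.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical features per cut: mean force, force variance, AE RMS.
        X = rng.standard_normal((200, 3))
        y = ((0.8 * X[:, 0] + 0.5 * X[:, 2]
              + 0.3 * rng.standard_normal(200)) > 0).astype(float).reshape(-1, 1)

        # Minimal one-hidden-layer network trained by batch gradient descent.
        W1, b1 = 0.5 * rng.standard_normal((3, 8)), np.zeros(8)
        W2, b2 = 0.5 * rng.standard_normal((8, 1)), np.zeros(1)
        sig = lambda a: 1.0 / (1.0 + np.exp(-a))
        for _ in range(2000):
            h = np.tanh(X @ W1 + b1)                # hidden layer
            out = sig(h @ W2 + b2)                  # predicted wear probability
            g = (out - y) / len(X)                  # cross-entropy gradient at logits
            W2 -= 0.5 * h.T @ g; b2 -= 0.5 * g.sum(0)
            gh = (g @ W2.T) * (1 - h**2)            # backpropagate through tanh
            W1 -= 0.5 * X.T @ gh; b1 -= 0.5 * gh.sum(0)
        print("training accuracy:", ((out > 0.5) == (y > 0.5)).mean())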

  9. Using the natural abundance of 13C and 15N to examine soil organic matter accumulated during 100 years of cropping

    The 13C natural abundance technique was applied to soils of a long-term experimental field in the study of organic matter turnover. The technique allowed evaluation of soil organic matter (SOM) originating from residues of different cropping systems that partially replaced the native prairie SOM mineralized during 100 years of cropping history. A large pool of the prairie SOM was highly resistant to decay, demonstrating a turnover time of 1000 years. Labile prairie SOM, lost when cultivation was initiated, had a half-life of 11 years. Accumulated SOM that originated from residues of a particular crop demonstrated a similar half-life as it decayed and was replaced by new SOM from residues of a different crop. Apparent turnover times for soil organic carbon, calculated from the annual input of crop residues to the soil for different cropping systems, ranged from 2 years for corn to 6.4 years for timothy sod. The natural abundance of 15N showed significant change for soil treated with chemical fertilizer or manure relative to the control soil. Manure applied to timothy for 100 years contributed 24% of the existing soil organic nitrogen. (author). 12 refs, 2 figs, 3 tabs
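
    The 13C technique rests on a two-pool mixing model: native prairie vegetation (C3) and corn (C4) imprint distinct δ13C signatures, so a measured bulk soil value fixes the fraction of corn-derived carbon. The sketch below uses typical literature end-member values, not the study's own measurements, and also shows the standard conversion between half-life and mean residence time.

        import math

        d13C_prairie = -27.0   # typical C3 end member (per mil)
        d13C_corn = -12.0      # typical C4 end member (per mil)
        d13C_soil = -21.0      # hypothetical measured bulk soil value

        f_corn = (d13C_soil - d13C_prairie) / (d13C_corn - d13C_prairie)
        print(f"fraction of SOM derived from corn: {f_corn:.2f}")

        # First-order decay: mean residence time tau = t_half / ln 2.
        tau_labile = 11 / math.log(2)   # from the 11-year half-life quoted above
        print(f"mean residence time of labile prairie SOM: {tau_labile:.1f} years")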

  10. 100 years of elementary particles [Beam Line, vol. 27, issue 1, Spring 1997

    Pais, Abraham; Weinberg, Steven; Quigg, Chris; Riordan, Michael; Panofsky, Wolfgang K.H.; Trimble, Virginia

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  13. Applying knowledge engineering tools for the personal computer to the operation and maintenance of radiopharmaceutical production systems

    A practical consequence of over three decades of Artificial Intelligence (AI) research has been the emergence of personal-computer-based AI programming tools. A special class of this microcomputer-based software, called expert system shells, is now applied routinely outside the realm of classical AI to solve many types of problems, particularly in analytical chemistry. These AI tools offer not only some of the advantages inherent in symbolic programming languages but, just as significantly, they bring with them advanced program development environments which can facilitate software development and maintenance. Exploitation of this enhanced programming environment was a major motivation for using an AI tool. The goal of this work is to evaluate the use of an example-based expert system shell (1st Class FUSION, 1st Class Expert Systems, Inc.) as a programming tool for developing software useful for automated radiopharmaceutical production

  14. Simulation of water-surface elevations for a hypothetical 100-year peak flow in Birch Creek at the Idaho National Engineering and Environmental Laboratory, Idaho

    Delineation of areas at the Idaho National Engineering and Environmental Laboratory that would be inundated by a 100-year peak flow in Birch Creek is needed by the US Department of Energy to fulfill flood-plain regulatory requirements. Birch Creek flows southward about 40 miles through an alluvium-filled valley onto the northern part of the Idaho National Engineering and Environmental laboratory site on the eastern Snake River Plain. The lower 10-mile reach of Birch Creek that ends in Birch Creek Playa near several Idaho National Engineering and Environmental Laboratory facilities is of particular concern. Twenty-six channel cross sections were surveyed to develop and apply a hydraulic model to simulate water-surface elevations for a hypothetical 100-year peak flow in Birch Creek. Model simulation of the 100-year peak flow (700 cubic feet per second) in reaches upstream from State Highway 22 indicated that flow was confined within channels even when all flow was routed to one channel. Where the highway crosses Birch Creek, about 315 cubic feet per second of water was estimated to move downstream--115 cubic feet per second through a culvert and 200 cubic feet per second over the highway. Simulated water-surface elevation at this crossing was 0.8 foot higher than the elevation of the highway. The remaining 385 cubic feet per second flowed southwestward in a trench along the north side of the highway. Flow also was simulated with the culvert removed. The exact location of flood boundaries on Birch Creek could not be determined because of the highly braided channel and the many anthropogenic features (such as the trench, highway, and diversion channels) in the study area that affect flood hydraulics and flow. Because flood boundaries could not be located exactly, only a generalized flood-prone map was developed
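
    The report's hydraulic model is not named in this summary, but models of this kind build on open-channel relations such as Manning's equation for conveyance. A hypothetical sketch for a trapezoidal section, in the US customary units used above (all channel numbers are invented):

        import math

        def manning_Q(n, area_ft2, wetted_perim_ft, slope):
            """Discharge (ft^3/s) by Manning's equation, US customary form."""
            R = area_ft2 / wetted_perim_ft          # hydraulic radius (ft)
            return (1.49 / n) * area_ft2 * R ** (2.0 / 3.0) * math.sqrt(slope)

        # Hypothetical trapezoidal channel: bottom width, flow depth, side slope.
        b, y, z = 20.0, 3.0, 2.0
        A = (b + z * y) * y                         # flow area (ft^2)
        P = b + 2 * y * math.sqrt(1 + z * z)        # wetted perimeter (ft)
        print(f"Q = {manning_Q(0.035, A, P, 0.004):.0f} ft^3/s")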

  15. Accumulation of pharmaceuticals, Enterococcus, and resistance genes in soils irrigated with wastewater for zero to 100 years in central Mexico.

    Philipp Dalkmann

    Irrigation with wastewater releases pharmaceuticals, pathogenic bacteria, and resistance genes, but little is known about the accumulation of these contaminants in the environment when wastewater is applied for decades. We sampled a chronosequence of soils that were variously irrigated with wastewater from zero up to 100 years in the Mezquital Valley, Mexico, and investigated the accumulation of ciprofloxacin, enrofloxacin, sulfamethoxazole, trimethoprim, clarithromycin, carbamazepine, bezafibrate, naproxen, and diclofenac, as well as the occurrence of Enterococcus spp. and sul and qnr resistance genes. Total concentrations of ciprofloxacin, sulfamethoxazole, and carbamazepine increased with irrigation duration, reaching 95% of their upper limits of 1.4 µg/kg (ciprofloxacin), 4.3 µg/kg (sulfamethoxazole), and 5.4 µg/kg (carbamazepine) in soils irrigated for 19-28 years. Accumulation was soil-type-specific, with the largest accumulation rates in Leptosols and no time-trend in Vertisols. Acidic pharmaceuticals (diclofenac, naproxen, bezafibrate) were not retained and thus did not accumulate in soils. We did not detect qnrA genes, but qnrS and qnrB genes were found in two of the irrigated soils. Relative concentrations of sul1 genes in irrigated soils were two orders of magnitude larger (3.15 × 10⁻³ ± 0.22 × 10⁻³ copies/16S rDNA) than in non-irrigated soils (4.35 × 10⁻⁵ ± 1.00 × 10⁻⁵ copies/16S rDNA), while those of sul2 exceeded the ones in non-irrigated soils by a factor of 22 (6.61 × 10⁻⁴ ± 0.59 × 10⁻⁴ versus 2.99 × 10⁻⁵ ± 0.26 × 10⁻⁵ copies/16S rDNA). Absolute numbers of sul genes continued to increase with prolonged irrigation, together with Enterococcus spp. 23S rDNA and total 16S rDNA contents. Increasing total concentrations of antibiotics in soil are not accompanied by increasing relative abundances of resistance genes. Nevertheless, wastewater irrigation enlarges the absolute concentration of resistance genes in soils due to a ...

  16. An assessment tool applied to manure management systems using innovative technologies

    Sørensen, Claus G.; Jacobsen, Brian H.; Sommer, Sven G.

    2003-01-01

    ... operational and cost-effective animal manure handling technologies. An assessment tool covering the whole chain of the manure handling system from the animal houses to the field has been developed. The tool enables a system-oriented evaluation of labour demand, machinery capacity and costs related to the ... tanker transport may reduce labour requirements, increase capacity, and open up new ways for reducing ammonia emission. In its most efficient configuration, the use of umbilical systems may reduce the labour requirement by about 40% and increase capacity by 80%. However, these systems are costly and will ...

  17. The Theory of Planned Behaviour Applied to Search Engines as a Learning Tool

    Liaw, Shu-Sheng

    2004-01-01

    Search engines have been developed for helping learners to seek online information. Based on theory of planned behaviour approach, this research intends to investigate the behaviour of using search engines as a learning tool. After factor analysis, the results suggest that perceived satisfaction of search engine, search engines as an information…

  18. Applying New Computer-Aided Tools for Wind Farm Planning and Environmental Impact Analysis

    Lybech Thoegersen, Morten; Nielsen, Per; Soerensen, Mads V. [Energi- og Miljoedata (EMD) Aalborg (Denmark); Toppenberg, Per [County of Northern Jutland, Aalborg (Denmark); Soee Christiansen, Erik [Municipality of Nibe, Nibe (Denmark)

    2005-07-01

    The demand for an environmental impact analysis (environmental assessment study) in any major Danish wind farm project has initiated the development of a set of computer-aided tools for wind turbine planning purposes. This paper gives an introduction to the newly developed computer-aided tools integrated in the wind farm design and planning tool WindPRO. The new module WindPLAN includes three interrelated spatial planning models: a weighted visibility calculation model, a conflict check calculation, and a wind resource weighted planning module. The application of the models is exemplified through a case study covering the municipality of Nibe, situated in Northern Jutland, Denmark. The different analyses are heavily dependent on detailed GIS data showing objects such as local housing, leisure areas, preservation areas, etc. Finally, a brief presentation of other valuable computer-aided tools integrated in the WindPRO/WindPLAN module is given, such as rendering of terrain profiles, user-defined map composing and saved pollution calculation.

  19. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  20. Adaptive Monte Carlo applied to uncertainty estimation in a five axis machine tool link errors identification

    Andolfatto, Loïc; Lavernhe, Sylvain; 10.1016/j.ijmachtools.2011.03.006

    2011-01-01

    Knowledge of a machine tool's axis-to-axis location errors allows compensation and correcting actions to be taken to enhance its volumetric accuracy. Several procedures exist, involving either lengthy individual tests for each geometric error or faster single tests to identify all errors at once. This study focuses on the closed kinematic Cartesian chain method, which uses a single setup test to identify the eight link errors of a five-axis machine tool. The identification is based on volumetric error measurements for different poses with a non-contact measuring instrument called CapBall, developed in-house. In order to evaluate the uncertainty on each identified error, a multi-output Monte Carlo approach is implemented. Uncertainty sources in the measurement and identification chain - such as sensor output, machine drift and frame transformation uncertainties - can be included in the model and propagated to the identified errors. The estimated uncertainties are finally compared to experimental results to assess...
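
    As a hedged illustration of the multi-output Monte Carlo approach described above, the sketch below perturbs simulated measurements with sensor noise and re-runs a linear least-squares identification to obtain an uncertainty for each of the eight link errors. The Jacobian, noise level, and error magnitudes are invented stand-ins, not the CapBall measurement chain:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative identification model: volumetric error measurements b are
# assumed linear in the eight link errors x, b = J @ x. J is a stand-in for
# the kinematic Jacobian of the real machine, not the CapBall model.
n_meas, n_err = 40, 8
J = rng.standard_normal((n_meas, n_err))
x_true = rng.uniform(-5e-6, 5e-6, n_err)     # "true" link errors (m), invented
b_nominal = J @ x_true

sigma_sensor = 0.5e-6                        # assumed sensor noise (m)

def identify(b):
    # Least-squares identification of the eight link errors from measurements.
    return np.linalg.lstsq(J, b, rcond=None)[0]

# Multi-output Monte Carlo: perturb the measurements, re-identify, and
# collect the distribution of every identified error simultaneously.
draws = np.array([
    identify(b_nominal + rng.normal(0.0, sigma_sensor, n_meas))
    for _ in range(10_000)
])
mean, u = draws.mean(axis=0), draws.std(axis=0, ddof=1)
for i, (m, s) in enumerate(zip(mean, u)):
    print(f"link error {i}: {m:+.2e} m (u = {s:.1e} m)")
```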

  1. Systems thinking tools as applied to community-based participatory research: a case study.

    BeLue, Rhonda; Carmack, Chakema; Myers, Kyle R; Weinreb-Welch, Laurie; Lengerich, Eugene J

    2012-12-01

    Community-based participatory research (CBPR) is being used increasingly to address health disparities and complex health issues. The authors propose that CBPR can benefit from a systems science framework to represent the complex and dynamic characteristics of a community and identify intervention points and potential "tipping points." Systems science refers to a field of study that posits a holistic framework that is focused on component parts of a system in the context of relationships with each other and with other systems. Systems thinking tools can assist in intervention planning by allowing all CBPR stakeholders to visualize how community factors are interrelated and by potentially identifying the most salient intervention points. To demonstrate the potential utility of systems science tools in CBPR, the authors show the use of causal loop diagrams by a community coalition engaged in CBPR activities regarding youth drinking reduction and prevention. PMID:22467637

  2. Changing patterns of infant death over the last 100 years: autopsy experience from a specialist children's hospital

    Pryce, J. W.; Weber, M A; Ashworth, M T; Roberts, S; Malone, M.; Sebire, N. J.

    2012-01-01

    OBJECTIVES: Infant mortality has undergone a dramatic reduction in the UK over the past century because of improvements in public health policy and medical advances. Postmortem examinations have been performed at Great Ormond Street Hospital for over 100 years, and analysis of cases across this period has been performed to assess changing patterns of infant deaths undergoing autopsy. DESIGN: Autopsy reports from 1909 and 2009 were examined. Age, major pathology and cause of death were reviewed...

  3. Automated Computer Systems for Manufacturability Analyses and Tooling Design : Applied to the Rotary Draw Bending Process

    Johansson, Joel

    2011-01-01

    Intensive competition on the global market puts great pressure on manufacturing companies to develop and produce products that meet requirements from customers and investors. One key factor in meeting these requirements is the efficiency of the product development and the production preparation processes. Design automation is a powerful tool to increase efficiency in these two processes. The benefits of automating the manufacturability analysis process, a part of the production preparation pr...

  4. Applying Model Driven Engineering Techniques and Tools to the Planets Game Learning Scenario

    Nodenot, Thierry; Caron, Pierre André; Le Pallec, Xavier; Laforcade, Pierre

    2008-01-01

    CPM (Cooperative Problem-Based learning Metamodel) is a visual language for the instructional design of Problem-Based Learning (PBL) situations. This language is a UML profile implemented on top of the Objecteering UML Case tool. In this article, we first present the way we used CPM language to bring about the pedagogical transposition of the planets game learning scenario. Then, we propose some related works conducted to improve CPM usability: on the one hand, we outline a MOF solution and a...

  5. Is Scores Derived from the Most Internationally Applied Patient Safety Culture Assessment Tool Correct?

    Javad Moghri; Ali Akbari Sari; Mehdi Yousefi; Hasan Zahmatkesh; Ranjbar Mohammad Ezzatabadi; Pejman Hamouzadeh; Satar Rezaei; Jamil Sadeghifar

    2013-01-01

    Abstract Background Hospital Survey on Patient Safety Culture, known as HSOPS, is an internationally well-known and widely used tool for measuring patient safety culture in hospitals. It includes 12 dimensions with positively and negatively worded questions. The distribution of these questions across dimensions is uneven and poses a risk of acquiescence bias. The aim of this study was to assess the questionnaire against this bias. Methods Three hundred nurses were assigned into study ...

  6. Surveillance as an innovative tool for furthering technological development as applied to the plastic packaging sector

    Freddy Abel Vargas; Óscar Fernando Castellanos Domínguez

    2010-01-01

    The demand for production process efficiency and quality has made it necessary to resort to new tools for development and technological innovation. Surveillance of the environment has thus been identified as being a priority, paying special attention to technology which (by its changing nature) is a key factor in competitiveness. Surveillance is a routine activity in developed countries' organisations; however, few suitable studies have been carried out in Colombia and few instruments produced...

  7. A new phase pattern recognition tool applied to field line resonances

    Plaschke, F.; Glassmeier, K.-H.; Milan, S. E.; Mann, I. R.; Motschmann, U.; Rae, I. J.

    2009-04-01

    The detection and characterization of geomagnetic pulsations (standing Alfvén waves on magnetospheric field lines, as produced by the field-line resonance (FLR) process) using ground magnetic field data has been based for decades on the interpretation of the longitudinal and latitudinal distributions of pulsation amplitudes and phases. By adopting this approach only clear and single FLRs can be correctly analyzed. Magnetometer array data, however, contain much more phase information due to the coherency of the ground-observed FLR wave structures across the array of stations, which remains undisclosed if phase pattern recognition or beamforming techniques are not used. We present theory and applications of such a new phase pattern recognition tool, the Field-Line Resonance Detector (FLRD), which is an adaptation of the wave telescope technique previously used in seismology and multi-spacecraft analysis. Unlike the traditional methods, the FLRD is able to detect and fully characterize multiple superposed or hidden FLR structures, which the tool can detect automatically. We show results of its application in a statistical analysis of one year (2002) of ground magnetometer data from the Canadian magnetometer array CANOPUS (now known as CARISMA, www.carisma.ca) and a comparison of FLRD results with other ground-based data from optical and radar instruments. The remarkable adaptability of the tool to other datasets and phase structures is also discussed.

  8. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures, in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process involving more than 100 employees and one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implementing regional and corporate procedures, focusing on the main operational processes. (author)

  9. Atomic Force Microscopy as a Tool for Applied Virology and Microbiology

    Zaitsev, Boris

    2003-12-01

    The atomic force microscope (AFM) can be successfully used for the simple and fast solution of many applied biological problems. In this paper, a survey of the results of the application of the atomic force microscope SolverP47BIO (NT-MDT, Russia) in the State Research Center of Virology and Biotechnology "Vector" is presented. The AFM has been used: - in applied virology for the counting of viral particles and examination of virus-cell interaction; - in microbiology for measurements and indication of bacterial spores and cells; - in biotechnology for control of biotechnological processes and evaluation of the particle size distribution for viral and bacterial diagnostic assays. The main advantages of AFM in applied research are the simplicity of sample preparation and the short examination time.

  10. Applying a Knowledge Management Modeling Tool for Manufacturing Vision (MV) Development

    Wang, Chengbo; Luxhøj, James T.; Johansen, John

    2004-01-01

    This paper introduces an empirical application of an experimental model for knowledge management within an organization, namely a case-based reasoning model for manufacturing vision development (CBRM). The model integrates the development process of manufacturing vision with the methodology of case...... that the CBRM is supportive to the decision-making process of applying and augmenting organizational knowledge. It provides a new angle to tackle strategic management issues within the manufacturing system of a business operation. Explores a new proposition within strategic manufacturing management by...... enriching and extending the concept of MV while trying to lead the CBR methodology into a new domain by applying it in strategic management....

  11. OPERATIONS MANAGEMENT TOOLS APPLIED TO THE OPERATING ROOM: A REVIEW OF CURRENT CONCEPTS AND A SINGLE CENTRE EXPERIENCE

    Lo, Charles Yuan-Hui

    2009-01-01

    Operations management tools can be applied to the operating room setting in order to improve throughput of the system. This is important because of the limitation of resources and funds available to hospitals in the public healthcare system. Hospitals must deal with variability in demand and uncertainty surrounding scheduling; these considerations can be placed in a queuing theory framework to better design processing capacity to minimize wait times and maximize utilization. Lean techniques c...
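
    As a sketch of the queuing-theory framing mentioned above, the classical Erlang-C formula for an M/M/s queue relates the number of staffed operating rooms to the probability and length of waits; the arrival and service rates below are purely hypothetical:

```python
import math

def erlang_c(s, lam, mu):
    """Probability that an arriving case must wait in an M/M/s queue."""
    a = lam / mu                                  # offered load
    rho = a / s                                   # utilisation, must be < 1
    if rho >= 1:
        return 1.0
    p = sum(a**k / math.factorial(k) for k in range(s))
    tail = a**s / (math.factorial(s) * (1 - rho))
    return tail / (p + tail)

lam, mu = 4.0, 0.5      # hypothetical: 4 cases/hour arrive, each room serves 0.5/hour
for s in range(9, 13):  # candidate numbers of staffed operating rooms
    pw = erlang_c(s, lam, mu)
    wq = pw / (s * mu - lam)                      # mean queue wait in hours
    print(f"{s} rooms: util {lam/(s*mu):.0%}, P(wait) {pw:.2f}, Wq {60*wq:.0f} min")
```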

  12. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, the urban-planning authorities need a range of tools that enable such a task to be performed. An essential step during the management of urban areas from a sound standpoint should be the evaluation of the soundscape in such an area. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step to evaluate it, providing a basis for designing or adapting it to match people's expectations as well. This work therefore proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended to be used as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented in developing the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified). PMID:24007752
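
    A minimal sketch of such a classifier using scikit-learn's SVC (which is itself fitted with an SMO-type solver), trained on synthetic stand-ins for the acoustical and perceptual descriptors; nothing here reproduces the paper's data, features, or tuning:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic stand-ins for acoustical/perceptual descriptors (e.g. equivalent
# level, spectral centroid, perceived loudness): 300 soundscapes, 10 features,
# 4 invented soundscape categories.
X = rng.standard_normal((300, 10))
y = rng.integers(0, 4, 300)
X += y[:, None] * 0.8               # shift class means so classes are learnable

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print(f"correctly classified instances: {clf.score(X_te, y_te):.1%}")
```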

  13. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

    As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. However, observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to get a rich picture of the influences on sequences of verbal or non-verbal behavior.

  14. The ZEW combined microsimulation-CGE model : innovative tool for applied policy analysis

    Clauss, Markus; Schubert, Stefanie

    2009-01-01

    This contribution describes the linkage of microsimulation models and computable general equilibrium (CGE) models using two already established models called "STSM" and "PACE-L" used by the Centre for European Economic Research. This state of the art research method for applied policy analysis combines the advantages of both model types: On the one hand, microsimulation models allow for detailed labor supply and distributional effects due to policy measures, as individual household data is us...

  15. The DPSIR approach applied to marine eutrophication in LCIA as a learning tool

    Cosme, Nuno Miguel Dias; Olsen, Stig Irving

    : environmentally sustainable, technologically feasible, economically viable, socially desirable, legally permissible, and administratively achievable. Specific LCIA indicators may provide preliminary information to support a precautionary approach to act earlier on D-P and contribute to sustainability. Impacts...... eutrophication. The goal is to promote an educational example of environmental impacts assessment through science-based tools to predict the impacts, communicate knowledge and support decisions. The example builds on the (D) high demand for fixation of reactive nitrogen that supports several socio......-economic secondary drivers. The nitrogen exported to marine coastal ecosystems (P), after point and nonpoint source emissions, promotes changes in the environmental conditions (S) such as low dissolved oxygen levels that cause the (I) effects on biota. These stimulate society into designing actions (R) to modify D...

  16. Applied Railway Optimization in Production Planning at DSB-S-tog - Tasks, Tools and Challenges

    Clausen, Jens

    2007-01-01

    customers, and has concurrently been met with demands for higher efficiency in the daily operation. The plans of timetable, rolling stock and crew must hence allow for a high level of customer service, be efficient, and be robust against disturbances of operations. It is a highly non-trivial task to meet...... scheduling. In addition we describe on-going efforts in using mathematical models in activities such as timetable design and work-force planning. We also identify some organizational key factors, which have paved the way for extended use of optimization methods in railway production planning....... these conflicting goals. S-tog has therefore on the strategic level decided to use software with optimization capabilities in the planning processes. We describe the current status for each activity using optimization or simulation as a tool: timetable evaluation, rolling stock planning, and crew...

  17. Teaching Strategies to Apply in the Use of Technological Tools in Technical Education

    Olga Arranz García

    2014-09-01

    The emergence of new technologies in the education area is changing the way educational processes are organized. Teachers are not detached from these changes and must employ new strategies to adapt their teaching methods to the new circumstances. One of these adaptations is framed within virtual learning, where learning management systems have been revealed as a very effective means within the learning process. In this paper we try to show teachers in engineering schools how to use appropriately the different technological tools that are present in a virtual platform. Thus, in the experimental framework we show the results obtained from the analysis of two data samples, collected before and after the implementation of the European Higher Education Area, which can be extrapolated for innovative application to learning techniques.

  18. 100 years of superconductivity

    Globe Info

    2011-01-01

    Public lecture by Philippe Lebrun, who works at CERN on applications of superconductivity and cryogenics for particle accelerators. He was head of CERN’s Accelerator Technology Department during the LHC construction period. Centre culturel Jean Monnet, route de Gex Tuesday 11 October from 8.30 p.m. to 10.00 p.m. » Suitable for all – Admission free - Lecture in French » Number of places limited For further information: +33 (0)4 50 42 29 37

  19. 100 years of radar

    Galati, Gaspare

    2016-01-01

    This book offers fascinating insights into the key technical and scientific developments in the history of radar, from the first patent, taken out by Hülsmeyer in 1904, through to the present day. Landmark events are highlighted and fascinating insights provided into the exceptional people who made possible the progress in the field, including the scientists and technologists who worked independently and under strict secrecy in various countries across the world in the 1930s and the big businessmen who played an important role after World War II. The book encourages multiple levels of reading. The author is a leading radar researcher who is ideally placed to offer a technical/scientific perspective as well as a historical one. He has taken care to structure and write the book in such a way as to appeal to both non-specialists and experts. The book is not sponsored by any company or body, either formally or informally, and is therefore entirely unbiased. The text is enriched by approximately three hundred ima...

  20. Quantitative seismic interpretation: Applying rock physics tools to reduce interpretation risk

    Yong Chen

    2007-01-01

    Seismic data analysis is one of the key technologies for characterizing reservoirs and monitoring subsurface pore fluids. While there have been great advances in 3D seismic data processing, the quantitative interpretation of the seismic data for rock properties still poses many challenges. This book demonstrates how rock physics can be applied to predict reservoir parameters, such as lithologies and pore fluids, from seismically derived attributes, as well as how the multidisciplinary combination of rock physics models with seismic data, sedimentological information, and stochastic techniques can lead to more powerful results than can be obtained from a single technique.

  1. Applied tools for determining low-activity radionuclides in large environmental samples

    Considerable amounts of biological material contaminated with artificial radionuclides were generated to obtain the efficiency curves for low-activity radionuclide analyses of large environmental samples. Likewise, improving detection geometry is also an important task, mainly for studies involving conservation units with a high level of biodiversity preservation. This study aimed to evaluate Monte Carlo efficiency curves for water and vegetation measurements without generating material contaminated with artificial radionuclides. An in-house adapted Marinelli geometry was applied to reduce the amount of biological material sampled from the ecosystem, and was combined with the Monte Carlo-assisted efficiency curve for a more sustainable radiometric analysis. (author)
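
    To show how a Monte Carlo-assisted efficiency curve is typically put to use, the sketch below fits a conventional log-log polynomial through simulated full-energy-peak efficiencies and converts a net peak area into an activity; all energies, efficiencies, and count values are illustrative, not the authors' data:

```python
import numpy as np

# Monte Carlo-computed full-energy-peak efficiencies for the adapted Marinelli
# geometry at a few gamma energies (all values invented for illustration).
energy_keV = np.array([122, 344, 662, 1173, 1332, 1461])
efficiency = np.array([0.062, 0.041, 0.028, 0.019, 0.017, 0.016])

# Conventional log-log polynomial efficiency curve.
coeffs = np.polyfit(np.log(energy_keV), np.log(efficiency), deg=2)

def eff(E_keV):
    return np.exp(np.polyval(coeffs, np.log(E_keV)))

# Activity from a net peak area: A = N / (eff * I_gamma * t_live).
net_counts, I_gamma, t_live = 1500.0, 0.851, 3600.0   # hypothetical 662 keV peak
print(f"efficiency at 662 keV: {eff(662):.4f}")
print(f"activity: {net_counts / (eff(662) * I_gamma * t_live):.2f} Bq")
```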

  2. Prediction of permafrost distribution on the Qinghai-Tibet Plateau in the next 50 and 100 years

    NAN; Zhuotong; LI; Shuxun; CHENG; Guodong

    2005-01-01

    The Intergovernmental Panel on Climate Change (IPCC) in 2001 reported that the Earth's air temperature would rise by 1.4-5.8℃ (2.5℃ on average) by the year 2100. China regional climate model results also showed that the air temperature on the Qinghai-Tibet Plateau (QTP) would increase by 2.2-2.6℃ in the next 50 years. A numerical permafrost model was used to predict the changes of permafrost distribution on the QTP over the next 50 and 100 years under two climatic warming scenarios, i.e. 0.02℃/a, the lower value of IPCC's estimation, and 0.052℃/a, the higher value predicted by Qin et al. Simulation results show that (i) in the case of a 0.02℃/a air-temperature rise, the permafrost area on the QTP will shrink by about 8.8% in the next 50 years, and high-temperature permafrost with mean annual ground temperature (MAGT) higher than -0.11℃ may turn into seasonally frozen soils. In the next 100 years, permafrost with MAGT higher than -0.5℃ will disappear and the permafrost area will shrink by up to 13.4%. (ii) In the case of a 0.052℃/a air-temperature rise, the permafrost area on the QTP will be reduced by about 13.5% after 50 years. More remarkable degradation will take place after 100 years, and the permafrost area will be reduced by about 46%. Permafrost with MAGT higher than -2℃ will turn into seasonally frozen soils and even unfrozen soils.

  3. Applied Circular Dichroism: A Facile Spectroscopic Tool for Configurational Assignment and Determination of Enantiopurity

    Macduff O. Okuom

    2015-01-01

    In order to determine if electronic circular dichroism (ECD) is a good tool for the qualitative evaluation of absolute configuration and enantiopurity in the absence of chiral high performance liquid chromatography (HPLC), ECD studies were performed on several prescription and over-the-counter drugs. Cotton effects (CE) were observed for both S and R isomers between 200 and 300 nm. For the drugs examined in this study, the S isomers showed a negative CE, while the R isomers displayed a positive CE. The ECD spectra of both enantiomers were nearly mirror images, with the amplitude proportional to the enantiopurity. Plotting the differential extinction coefficient (Δε) versus enantiopurity at the wavelength of maximum amplitude yielded linear standard curves with coefficients of determination (R2) greater than 97% for both isomers in all cases. As expected, Equate, Advil, and Motrin, each containing a racemic mixture of ibuprofen, yielded no chiroptical signal. ECD spectra of Suphedrine and Sudafed revealed that each of them is rich in 1S,2S-pseudoephedrine, while analysis of the Equate vapor inhaler showed it to be rich in R-methamphetamine.
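
    The linear standard-curve procedure can be reproduced schematically: fit Δε against enantiopurity, check linearity via the coefficient of determination, and invert the fit to estimate the enantiopurity of an unknown. All calibration values below are invented:

```python
import numpy as np

# Invented calibration data: differential extinction coefficient at the CE
# maximum versus enantiopurity, expressed as % of the S isomer.
enantiopurity = np.array([100.0, 80.0, 60.0, 50.0, 40.0, 20.0, 0.0])
delta_eps = np.array([2.10, 1.28, 0.45, 0.00, -0.41, -1.22, -2.05])

slope, intercept = np.polyfit(enantiopurity, delta_eps, 1)
pred = slope * enantiopurity + intercept
ss_res = np.sum((delta_eps - pred) ** 2)
ss_tot = np.sum((delta_eps - delta_eps.mean()) ** 2)
print(f"R^2 = {1 - ss_res / ss_tot:.3f}")        # linearity of the standard curve

# Invert the fit to estimate the enantiopurity of an unknown sample.
unknown_delta_eps = 0.90
print(f"estimated %S: {(unknown_delta_eps - intercept) / slope:.1f}")
```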

  4. Neutron tomography of particulate filters: a non-destructive investigation tool for applied and industrial research

    This research describes the development and implementation of high-fidelity neutron imaging and the associated analysis of the images. This advanced capability allows the non-destructive, non-invasive imaging of particulate filters (PFs) and of how the deposition of particulate and catalytic washcoat occurs within the filter. The majority of the efforts described here were performed at the High Flux Isotope Reactor (HFIR) CG-1D neutron imaging beamline at Oak Ridge National Laboratory; the current spatial resolution is approximately 50 μm. The sample holder is equipped with a high-precision rotation stage that allows 3D imaging (i.e., computed tomography) of the sample when combined with computerized reconstruction tools. What enables neutron-based imaging is the ability of some elements to absorb or scatter neutrons while other elements allow neutrons to pass through them with negligible interaction. Of particular interest in this study is the scattering of neutrons by hydrogen-containing molecules, such as hydrocarbons (HCs) and/or water, which are adsorbed to the surface of soot, ash and catalytic washcoat. Even so, the interactions with this adsorbed water/HC are weak, and computational techniques were required to enhance the contrast, primarily a modified simultaneous iterative reconstruction technique (SIRT). This effort describes the following systems: particulate randomly distributed in a PF, ash deposition in PFs, a catalyzed washcoat layer in a PF, and three particulate loadings in a SiC PF.
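
    For readers unfamiliar with SIRT, here is a minimal numpy implementation of the textbook iteration x ← x + C·Aᵀ·R·(b − A·x), where R and C hold inverse row and column sums; the study used a modified variant, so this is only the basic scheme on a toy system:

```python
import numpy as np

def sirt(A, b, n_iter=200):
    """Basic SIRT: x <- x + C * (A.T @ (R * (b - A @ x))), with R and C the
    inverse row and column sums of the (non-negative) system matrix A."""
    R = 1.0 / np.maximum(A.sum(axis=1), 1e-12)
    C = 1.0 / np.maximum(A.sum(axis=0), 1e-12)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x += C * (A.T @ (R * (b - A @ x)))
        np.clip(x, 0.0, None, out=x)             # enforce non-negativity
    return x

# Toy two-voxel "filter" probed by three rays (not real CG-1D data).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_true = np.array([0.3, 0.7])                    # attenuation per voxel
print(sirt(A, A @ x_true))                       # recovers ~[0.3, 0.7]
```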

  5. Environmental management systems tools applied to the nuclear fuel center of IPEN

    Mattos, Luis A. Terribile de; Meldonian, Nelson Leon; Madi Filho, Tufic, E-mail: mattos@ipen.br, E-mail: meldonia@ipen.br, E-mail: tmfilho@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    This work aims to identify and classify the major environmental aspects and impacts related to the operation of the Nuclear Fuel Center of IPEN (CCN), through a systematic data survey using interview questions and consultation of licensing documents and operational records. First, the facility's processes and activities, and the interactions between these processes, were identified. Then, an analysis of potential failures and their probable causes was conducted to establish the significance of environmental aspects, as well as the operational controls that are necessary to ensure the prevention of impacts on the environment. The results obtained so far demonstrate the validity of this study as a tool for the identification of environmental aspects and impacts of nuclear facilities in general, as a way of achieving compliance with the ISO 14001:2004 standard. Moreover, it can serve as an auxiliary method for resolving issues related to meeting the applicable regulatory and legal requirements of the National Nuclear Energy Commission (CNEN) and the Brazilian Institute of Environment (IBAMA). (author)

  6. Applying CBR to machine tool product configuration design oriented to customer requirements

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2016-03-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented. The customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help the enterprises select the key technical characteristics under the constraints of cost and time to serve the customer to maximal satisfaction. A new case retrieval approach that combines the self-organizing map and fuzzy similarity priority ratio method is proposed in case-based design. The self-organizing map can reduce the retrieval range and increase the retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method of similar cases based on grey correlation analysis is proposed to evaluate similar cases to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. The actual example and result on an ETC series machine tool product show that the proposed method is effective, rapid and accurate in the process of product configuration. The proposed methodology provides a detailed instruction for the product configuration design oriented to customer requirements.
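
    As a schematic of the retrieval step in case-based design, the sketch below ranks stored machine-tool cases by a weighted per-attribute similarity to a query derived from customer requirements. The attributes, weights, and scoring rule are invented and much simpler than the paper's self-organizing-map plus fuzzy similarity priority ratio method:

```python
import numpy as np

# Invented case base: machine-tool cases described by normalised technical
# characteristics (e.g. spindle speed, travel, power, cost index).
case_base = {
    "ETC-01": np.array([0.80, 0.60, 0.70, 0.50]),
    "ETC-02": np.array([0.40, 0.90, 0.50, 0.70]),
    "ETC-03": np.array([0.70, 0.70, 0.60, 0.60]),
}
weights = np.array([0.4, 0.2, 0.3, 0.1])    # invented importance of each attribute
query = np.array([0.75, 0.65, 0.70, 0.55])  # target derived from requirements

def similarity(case):
    # Weighted per-attribute nearness aggregated into a score in [0, 1].
    return float(np.sum(weights * (1.0 - np.abs(case - query))))

for name in sorted(case_base, key=lambda k: similarity(case_base[k]), reverse=True):
    print(name, round(similarity(case_base[name]), 3))
```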

  8. 3&4D Geomodeling Applied to Mineral Resources Exploration - A New Tool for Targeting Deposits.

    Royer, Jean-Jacques; Mejia, Pablo; Caumon, Guillaume; Collon-Drouaillet, Pauline

    2013-04-01

    3 & 4D geomodeling, a computer method for reconstructing the past deformation history of geological formations, has been used in oil and gas exploration for more than a decade to reconstruct fluid migration. It is nowadays beginning to be applied to explore old mature mining fields and new prospects with fresh eyes. We briefly describe the basic notions, concepts, and methodology of 3&4D geomodeling as applied to mineral resources assessment and the modeling of ore deposits, pointing out the advantages, recommendations and limitations, together with the new challenges they raise. Several 3D geomodels of mining explorations selected across Europe are presented as illustrative case studies achieved during the EU FP7 ProMine research project. They include: (i) the Cu-Au porphyry deposits in the Hellenic Belt (Greece); (ii) the VMS in the Iberian Pyrite Belt including the Neves Corvo deposit (Portugal); and (iii) the sediment-hosted polymetallic Cu-Ag (Au, PGE) Kupferschiefer ore deposit in the Foresudetic Belt (Poland). In each case, full 3D models using surfaces and regular grids (SGrid) were built from all datasets available from exploration and exploitation, including primary geological maps, 2D seismic cross-sections, and boreholes. The level of knowledge may differ from one site to another; however, the resulting 3D models were used to pilot additional field and exploration work. In the case of the Kupferschiefer, a sequential restoration-decompaction (4D geomodeling) from the Upper Permian to the Cenozoic was conducted in the Lubin-Sieroszowice district of Poland. The results help in better understanding the various superimposed mineralization events which occurred through time in this copper deposit. A hydro-fracturing index was then calculated from the estimated overpressures during a Late Cretaceous-Early Paleocene uplift, and seems to correlate with the copper content distribution in the ore series. These results are in agreement with an Early Paleocene

  9. A practical guide to applying lean tools and management principles to health care improvement projects.

    Simon, Ross W; Canacari, Elena G

    2012-01-01

    Manufacturing organizations have used Lean management principles for years to help eliminate waste, streamline processes, and cut costs. This pragmatic approach to structured problem solving can be applied to health care process improvement projects. Health care leaders can use a step-by-step approach to document processes and then identify problems and opportunities for improvement using a value stream process map. Leaders can help a team identify problems and root causes and consider additional problems associated with methods, materials, manpower, machinery, and the environment by using a cause-and-effect diagram. The team then can organize the problems identified into logical groups and prioritize the groups by impact and difficulty. Leaders must manage action items carefully to instill a sense of accountability in those tasked to complete the work. Finally, the team leaders must ensure that a plan is in place to hold the gains. PMID:22201573

  10. Microgravity: A New Tool for Basic and Applied Research in Space

    1985-01-01

    This brochure highlights selected aspects of the NASA Microgravity Science and Applications program. So that we can expand our understanding and control of physical processes, this program supports basic and applied research in electronic materials, metals, glasses and ceramics, biological materials, combustion, and fluids and chemicals. NASA facilities that provide weightless environments on the ground, in the air, and in space are available to U.S. and foreign investigators representing the academic and industrial communities. After a brief history of microgravity research, the text explains the advantages and methods of performing microgravity research. Illustrations follow of equipment used and experiments performed aboard the Shuttle, and of prospects for future research. The brochure concludes by describing the program goals and the opportunities for participation.

  11. Study description of ExternE Project and the EcoSense Tool applied to Brazil

    In the present work, an overview of the ExternE Project in Brazil, which has been conducted by the Brazilian National Nuclear Energy Commission (CNEN), is presented. To perform part of this evaluation study, a version of the EcoSense software developed by the Institute for Energy Economy and Rational Energy Application of the University of Stuttgart is used, applied to Brazil and other countries of South America and part of Central America. An important feature of this study is to establish a local and regional data bank with environmental and social parameters for Brazil and other countries, in order to estimate the externalities of the new generation units that are planned to be built during the next years to increase electricity availability and support economic growth. (author)

  12. SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling

    Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)

    1994-01-01

    The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches, which too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user with a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapotranspiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.
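
    The physical-consistency checking attributed to SIGMA can be sketched with dimension-exponent vectors; this toy (not SIGMA's actual mechanism) accepts an equation of the form result = numerator / denominator only when the dimensions agree:

```python
import numpy as np

# Dimensions encoded as exponent vectors over the base units (kg, m, s).
DIM = {
    "W/m^2":    np.array([1, 0, -3]),    # W/m^2 = kg * s^-3
    "J/kg":     np.array([0, 2, -2]),    # J/kg  = m^2 * s^-2
    "kg/m^2/s": np.array([1, -2, -1]),
}

def check_division(num, den, result):
    """Accept an equation result = num / den only if dimensions agree."""
    if not np.array_equal(DIM[num] - DIM[den], DIM[result]):
        raise ValueError("physically inconsistent equation rejected")

# Evaporation rate = available energy flux / latent heat of vaporisation.
check_division("W/m^2", "J/kg", "kg/m^2/s")      # passes the consistency check
print(400.0 / 2.45e6)                            # ~1.63e-4 kg m^-2 s^-1
```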

  13. Spatio-temporal analysis of rainfall trends over a maritime state (Kerala) of India during the last 100 years

    Nair, Archana; Ajith Joseph, K.; Nair, K. S.

    2014-05-01

    Kerala, a maritime state of India, is bestowed with abundant rainfall, about three times the national average. This study was conducted to gain a better understanding of rainfall variability and trends at the regional level for this state during the last 100 years. It is found that the rainfall variation between the northern and southern regions of Kerala is large and that the deviation occurs on different timescales. There is a shift in rainfall mean and variability across the seasons. Trend analysis of rainfall data over the last 100 years reveals a significant (99%) decreasing trend in most regions of Kerala, especially in the months of January, July and November. The annual and seasonal trends of rainfall in most regions of Kerala are also found to be decreasing significantly. This decreasing trend may be related to global anomalies resulting from anthropogenic greenhouse gas (GHG) emissions due to increased fossil fuel use, land-use change driven by urbanisation and deforestation, and the proliferation of transport-associated atmospheric pollutants. We have also studied the seasonality index (SI) and found that only one district in the northern region (Kasaragod) has a seasonality index of more than 1, meaning that the monthly rainfall in this district is mostly concentrated in 1 or 2 months. In the rest of the districts, the rainfall is markedly seasonal. The trend in SI reveals that the rainfall distribution in these districts has become asymmetric, with changes in rainfall distribution.
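
    Assuming the widely used Walsh-Lawler definition of the seasonality index, SI = (1/R) Σ|x̄_n − R/12|, where x̄_n are the mean monthly rainfalls and R the mean annual total, a short computation shows how SI separates even from strongly seasonal regimes; the monthly totals are invented:

```python
import numpy as np

def seasonality_index(monthly_mm):
    """Walsh-Lawler SI = (1/R) * sum(|x_n - R/12|) over the 12 months,
    where x_n are mean monthly rainfalls and R is the mean annual total."""
    x = np.asarray(monthly_mm, dtype=float)
    R = x.sum()
    return float(np.abs(x - R / 12.0).sum() / R)

uniform = [100] * 12                  # perfectly even rainfall -> SI = 0
monsoonal = [5, 5, 10, 40, 120, 650, 700, 420, 240, 280, 150, 20]  # invented
print(round(seasonality_index(uniform), 2))      # 0.0
print(round(seasonality_index(monsoonal), 2))    # ~0.9, markedly seasonal
```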

  14. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
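
    The node/link/institution pattern described above can be illustrated with a few lines of Python; this is a schematic imitation, not Pynsim's actual API, and the names, supply figure, and allocation rule are invented:

```python
# Schematic imitation of the node/link/institution pattern (not Pynsim's API).
class Node:
    def __init__(self, name, demand):
        self.name, self.demand, self.supplied = name, demand, 0.0
    def step(self, t):
        pass                                     # autonomous per-step behaviour

class Institution:
    """A grouping of nodes that acts on all of them, e.g. a water utility."""
    def __init__(self, name, nodes):
        self.name, self.nodes = name, nodes
    def step(self, t):
        available = 100.0                        # invented supply per time step
        total = sum(n.demand for n in self.nodes)
        for n in self.nodes:                     # proportional allocation rule
            n.supplied = available * n.demand / total

nodes = [Node("Amman", 60.0), Node("Irbid", 25.0), Node("Zarqa", 35.0)]
utility = Institution("water_authority", nodes)
for t in range(3):                               # the simulation's time loop
    utility.step(t)
    for n in nodes:
        n.step(t)
print({n.name: round(n.supplied, 1) for n in nodes})
```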

  15. Effects of 100 years wastewater irrigation on resistance genes, class 1 integrons and IncP-1 plasmids in Mexican soil

    Sven eJechalke

    2015-03-01

    Long-term irrigation with untreated wastewater can lead to an accumulation of antibiotic substances and antibiotic resistance genes in soil. However, little is known so far about the effects of wastewater applied for decades on the abundance of IncP-1 plasmids and class 1 integrons, which may contribute to the accumulation and spread of resistance genes in the environment, and about their correlation with heavy metal concentrations. Therefore, a chronosequence of soils that were irrigated with wastewater from zero to 100 years was sampled in the Mezquital Valley in Mexico in the dry season. The total community DNA was extracted and the absolute and relative abundance (relative to 16S rRNA genes) of antibiotic resistance genes (tet(W), tet(Q), aadA), class 1 integrons (intI1), quaternary ammonium compound resistance genes (qacE+qacEΔ1) and IncP-1 plasmids (korB) were quantified by real-time PCR. Except for intI1 and qacE+qacEΔ1, the abundances of the selected genes were below the detection limit in non-irrigated soil. Confirming the results of a previous study, the absolute abundance of 16S rRNA genes in the samples increased significantly over time (linear regression model, p < 0.05), suggesting an increase in bacterial biomass due to repeated irrigation with wastewater. Correspondingly, all tested antibiotic resistance genes as well as intI1 and korB significantly increased in abundance over the period of 100 years of irrigation. In parallel, concentrations of the heavy metals Zn, Cu, Pb, Ni, and Cr significantly increased. However, no significant positive correlations were observed between the relative abundance of the selected genes and years of irrigation, indicating no enrichment in the soil bacterial community due to repeated wastewater irrigation or due to potential co-selection by increasing concentrations of heavy metals.
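
    The distinction between absolute and relative abundance trends can be checked with a simple regression, as sketched below; the gene and 16S copy numbers are invented for illustration and do not reproduce the study's data:

```python
import numpy as np
from scipy import stats

# Invented qPCR-style data along the irrigation chronosequence: absolute
# copies of one resistance gene and of 16S rRNA genes per gram of soil.
years = np.array([0, 8, 14, 35, 50, 85, 100])
gene = np.array([2e3, 9e3, 2e4, 6e4, 1e5, 3e5, 5e5])
rrn16S = np.array([1e8, 3e8, 6e8, 2e9, 3e9, 9e9, 1.5e10])

relative = gene / rrn16S                 # relative abundance (copies per 16S)
for label, y in (("absolute (log10)", np.log10(gene)), ("relative", relative)):
    slope, _, r, p, _ = stats.linregress(years, y)
    print(f"{label}: slope={slope:.3g}, r={r:.2f}, p={p:.3f}")
```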

  16. 1,100 years after an earthquake: modification of the earthquake record by submergence, Puget Lowland, Washington State

    Arcos, M. E.

    2011-12-01

    Crustal faults can present a complicated record for earthquake reconstruction. In some cases, regional tectonic strain overprints the record of coseismic land-level changes. This study looks at the record of earthquakes at two sites in the Puget Lowland, Gorst and the Skokomish delta, and at how post-earthquake submergence modified the paleoseismic records. The Puget Lowland is the slowly subsiding forearc basin of the northern Cascadia subduction zone. A series of active thrust faults cross the lowland. Several of these faults generated large (M7+) earthquakes about 1,100 years ago, and both field sites have submerged at least 1.5 m since that time. This submergence masked the geomorphic record of uplift in some areas, resulting in a misreading of the zone of earthquake deformation and potential misinterpretation of the underlying fault structure. Earthquakes ~1,100 years ago uplifted both field localities and altered river dynamics. At Gorst, a tsunami and debris flow accompanied uplift of at least 3 m by the Seattle fault. The increased sediment load resulted in braided stream formation for a period after the earthquake. At the Skokomish delta, differential uplift trapped the river on the eastern side of the delta for the last 1,100 years, resulting in an asymmetric intertidal zone, 2 km wider on one side of the delta than the other. The delta slope or submergence may contribute to high rates of flooding on the Skokomish River. Preliminary results show that millennial-scale rates of submergence vary, with the southern Puget Lowland submerging at a faster rate than the northern Puget Lowland. This submergence complicates the reconstruction of past earthquakes and renders difficult, in several ways, the assessment of future hazards in those areas where it is based on uplifted marine platforms and other coastal earthquake signatures. 1) Post-earthquake submergence reduces the apparent uplift of marine terraces. 2) Submergence makes zones of earthquake deformation appear narrower. 3

  17. Simulated carbon and water processes of forest ecosystems in Forsmark and Oskarshamn during a 100-year period

    Gustafsson, David; Jansson, Per-Erik [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Land and Water Resources Engineering; Gaerdenaes, Annemieke [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Soil Sciences; Eckersten, Henrik [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Crop Production Ecology

    2006-12-15

    The Swedish Nuclear Fuel and Waste Management Co (SKB) is currently investigating the Forsmark and Oskarshamn areas for possible localisation of a repository for spent nuclear fuel. Important components of the investigations are characterizations of the land surface ecosystems in the areas with respect to hydrological and biological processes, and their implications for the fate of radionuclide contaminants entering the biosphere from a shallow groundwater contamination. In this study, we simulate water balance and carbon turnover processes in forest ecosystems representative for the Forsmark and Oskarshamn areas for a 100-year period using the ecosystem process model CoupModel. The CoupModel describes the fluxes of water and matter in a one-dimensional soil-vegetation-atmosphere system, forced by time series of meteorological variables. The model has previously been parameterized for many of the vegetation systems that can be found in the Forsmark and Oskarshamn areas: spruce/pine forests, willow, grassland and different agricultural crops. This report presents a platform for further use of models like CoupModel for investigations of radionuclide turnover in the Forsmark and Oskarshamn area based on SKB data, including a data set of meteorological forcing variables for Forsmark 1970-2004, suitable for simulations of a 100-year period representing the present day climate, a hydrological parameterization of the CoupModel for simulations of the forest ecosystems in the Forsmark and Oskarshamn areas, and simulated carbon budgets and process descriptions for Forsmark that correspond to a possible steady state of the soil storage of the forest ecosystem.

  18. Are Geodetically and Geologically Constrained Vertical Deformation Models Compatible With the 100-Year Coastal Tide Gauge Record in California?

    Smith-Konter, B. R.; Sandwell, D. T.

    2006-12-01

    Sea level change has been continuously recorded along the California coastline at several tide gauge stations for the past 50-100 years. These stations provide a temporal record of sea level change, generally attributed to post-glacial rebound and ocean climate phenomena. However, geological processes, including displacements from large earthquakes, have also been shown to produce sea level variations. Furthermore, the vertical tectonic response to interseismic strain accumulation in regions of major fault bends has been shown to produce uplift and subsidence rates consistent with sea level trends. To investigate the long-term extent and implication of tectonic deformation on sea level change, we compare time series data from California tide gauge stations to model estimates of vertical displacements produced by earthquake cycle deformation. Using a 3-D semi-analytic viscoelastic model, we combine geologic slip rates, geodetic velocities, and historical seismic data to simulate both horizontal and vertical deformation of the San Andreas Fault System. Using this model, we generate a time-series of vertical displacements spanning the 100-year sea level record and compare this to tide gauge data provided by the Permanent Service for Mean Sea Level (PSMSL). Comparison between sea level data and a variety of geologically and geodetically constrained models confirms that the two are highly compatible. Vertical displacements are largely controlled by interseismic strain accumulation, however displacements from major earthquakes are also required to explain varying trends in the sea level data. Models based on elastic plate thicknesses of 30-50 km and viscosities of 7×10¹⁸-2×10¹⁹ Pa·s produce vertical displacements at tide-gauge locations that explain long-term trends in the sea level record to a high degree of accuracy at nearly all stations. However, unmodeled phenomena are also present in the sea level data and require further inspection.

  20. Evaluating the 100 year floodplain as an indicator of flood risk in low-lying coastal watersheds

    Sebastian, A.; Brody, S.; Bedient, P. B.

    2013-12-01

    The Gulf of Mexico is the fastest-growing region in the United States. Since 1960, the number of housing units built in the low-lying coastal counties has increased by 246%. The region experiences some of the most intense rainfall events in the country, and coastal watersheds are prone to severe flooding characterized by wide floodplains and ponding. This flooding is further exacerbated as urban development encroaches on existing streams and waterways. While the 100-year floodplain should play an important role in our ability to develop disaster-resilient communities, recent research has indicated that existing floodplain delineations are a poor indicator of actual flood losses in low-lying coastal regions. Between 2001 and 2005, more than 30% of insurance claims made to FEMA in the Gulf Coast region were outside of the 100-year floodplain, and residential losses amounted to more than $19.3 billion. As population density and investments in this region continue to increase, addressing flood risk in coastal communities should become a priority for engineers, urban planners, and decision makers. This study compares the effectiveness of 1-D and 2-D modeling approaches in spatially capturing flood claims from historical events. Initial results indicate that 2-D models perform much better in coastal environments and may serve better for floodplain mapping, helping to prevent unintended losses. The results of this study encourage a shift towards better engineering practices using existing 2-D models in order to protect resources and provide guidance for urban development in low-lying coastal regions.
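
    One simple way to quantify how well a floodplain delineation "captures" historical losses is the share of claim locations that fall inside the mapped polygon. Below is a self-contained point-in-polygon sketch with invented coordinates rather than real FEMA claims:

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: is point pt inside the polygon (list of (x, y))?"""
    x, y = pt
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

# Invented 100-year floodplain polygon and insurance-claim locations.
floodplain = [(0, 0), (4, 0), (4, 3), (0, 3)]
claims = [(1, 1), (2, 2), (5, 1), (3, 4), (0.5, 2.5)]

captured = sum(point_in_polygon(c, floodplain) for c in claims)
print(f"claims inside the mapped floodplain: {captured}/{len(claims)} "
      f"({captured / len(claims):.0%})")
```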

  1. Participatory tools working with crops, varieties and seeds. A guide for professionals applying participatory approaches in agrobiodiversity management, crop improvement and seed sector development

    Boef, de W.S.; Thijssen, M.H.

    2007-01-01

    Outline of the guide: Within our training programmes on local management of agrobiodiversity, participatory crop improvement and the support of local seed supply, participatory tools get ample attention. Tools are dealt with theoretically, are practised in class situations, but are also applied in fie

  2. 100-Year Floodplains, FloodZone; FEMA; Update Frequency is every five or ten years, Published in 2008, Athens-Clarke County Planning Department.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, was produced all or in part from Field Survey/GPS information as of 2008. It is described as 'FloodZone; FEMA; Update Frequency...

  3. 100-Year Floodplains, FEMA Floodway and Flood Boundary Maps, Published in 2005, 1:24000 (1in=2000ft) scale, Lafayette County Land Records.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Other information as of 2005. It is described as 'FEMA...

  4. 100-Year Floodplains, FEMA flood insurance rate map vector data, Published in 2009, 1:7200 (1in=600ft) scale, Portage County.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:7200 (1in=600ft) scale, was produced all or in part from Orthoimagery information as of 2009. It is described as...

  5. 100-Year Floodplains, Flood plains from FEMA, Published in 2003, 1:600 (1in=50ft) scale, Town of Cary NC.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:600 (1in=50ft) scale, was produced all or in part from LIDAR information as of 2003. It is described as 'Flood...

  6. 100-Year Floodplains, Published in 2008, 1:100000 (1in=8333ft) scale, City of Americus & Sumter County, GA GIS.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:100000 (1in=8333ft) scale, was produced all or in part from Road Centerline Files information as of 2008. Data by...

  7. 100-Year Floodplains, St James FEMA Flood Map, Published in 2010, 1:24000 (1in=2000ft) scale, St James Parish Government.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Hardcopy Maps information as of 2010. It is described...

  8. 100-Year Floodplains, Data provided by FEMA and WI DNR, Published in 2009, 1:2400 (1in=200ft) scale, Dane County Land Information Office.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:2400 (1in=200ft) scale as of 2009. It is described as 'Data provided by FEMA and WI DNR'. Data by this publisher...

  9. Acidophilic denitrifiers dominate the N2O production in a 100-year-old tea orchard soil.

    Huang, Ying; Long, Xi-En; Chapman, Stephen J; Yao, Huaiying

    2015-03-01

    Aerobic denitrification is the main process for high N2O production in acid tea field soil. However, the biological mechanisms for the high emission are not fully understood. In this study, we examined N2O emission and denitrifier communities in 100-year-old tea soils at four pH levels (3.71, 5.11, 6.19, and 7.41) and four levels of nitrate addition (0, 50, 200, and 1000 mg kg−1 of NO3−-N). Results showed the highest N2O emission (10.1 mg kg−1 over 21 days) from the soil at pH 3.71 with 1000 mg kg−1 NO3− addition. N2O reduction and denitrification enzyme activity differed markedly between the acid soils and the soil at pH 7.41. Moreover, TRF 78 of nirS and TRF 187 of nosZ dominated in soils of pH 3.71, suggesting an important role of acidophilic denitrifiers in N2O production and reduction. CCA analysis also showed a negative correlation between the dominant denitrifier ecotypes (nirS TRF 78, nosZ TRF 187) and soil pH. Phylogenetic tree analysis showed that the representative sequences were identical to those of cultivated denitrifiers from acidic soils. Our results show that the adaptation of acidophilic denitrifiers to the acid environment results in high N2O emission in this highly acidic tea soil. PMID:25273518

  10. GEodesy Tools for Societal Issues (GETSI): Undergraduate curricular modules that feature geodetic data applied to critical social topics

    Douglas, B. J.; Pratt-Sitaula, B.; Walker, B.; Miller, M. S.; Charlevoix, D.

    2014-12-01

    The GETSI project is a three-year NSF-funded project to develop and disseminate teaching and learning materials that feature geodesy data applied to critical societal issues such as climate change, water resource management, and natural hazards (http://serc.carleton.edu/getsi). GETSI was born out of requests from geoscience faculty for more resources with which to educate future citizens and future geoscience professionals on the power and breadth of geodetic methods to address societally relevant topics. Development of the first two modules started at a February 2014 workshop, and initial classroom testing begins in fall 2014. The introductory Year 1 module "Changing Ice and Sea Level" includes geodetic data such as gravity, satellite altimetry, and GPS time series. The majors-level Year 1 module is "Imaging Active Tectonics", and it has students analyzing InSAR and LiDAR data to assess infrastructure vulnerability to demonstrably active faults. Additional resources such as animations and interactive data tools are also being developed. The full modules will take about two weeks of class time; module design will permit portions of the material to be used as individual projects or assignments of shorter duration. Ultimately a total of four modules will be created and disseminated, two each at the introductory and majors levels. GETSI is working in close partnership with the Science Education Resource Center's (SERC) InTeGrate project on the module development, assessment, and dissemination to ensure compatibility with the growing number of resources for geoscience education. This will allow for an optimized module development process based on successful practices defined by these earlier efforts.

  11. Floodplain sediment from a 100-year-recurrence flood in 2005 of the Ping River in northern Thailand

    S. H. Wood

    2008-07-01

    The tropical storm, floodwater, and the floodplain-sediment layer of a 100-year recurrence flood are examined to better understand characteristics of large monsoon floods on medium-sized rivers in northern Thailand. Storms producing large floods in northern Thailand occur early or late in the summer rainy season (May–October). These storms are associated with tropical depressions evolving from typhoons in the South China Sea that travel westward across the Indochina Peninsula. In late September 2005, the tropical depression from Typhoon Damrey swept across northern Thailand, delivering 100–200 mm/day at stations in mountainous areas. Peak flow from the 6355-km2 drainage area of the Ping River upstream of the city of Chiang Mai was 867 m3s−1 (river-gage height of 4.93 m), and flow greater than 600 m3s−1 lasted for 2.5 days. Parts of the city of Chiang Mai and some parts of the floodplain in the intermontane Chiang Mai basin were flooded up to 1 km from the main channel. Suspended-sediment concentrations in the floodwater were measured and estimated to be 1000–1300 mg l−1.

    The mass of dry sediment (32.4 kg m−2), measured over a 0.32-km2 area of the floodplain, is relatively high compared to reports from European and North American river floods. Average wet sediment thickness over the area was 3.3 cm. Sediment thicker than 8 cm covered 16 per cent of the area, and sediment thicker than 4 cm covered 44 per cent of the area. High suspended-sediment concentration in the floodwater, flow to the floodplain through a gap in the levee afforded by the mouth of a tributary stream as well as flow over levees, and floodwater depths of 1.2 m explain the relatively large amount of sediment in the measured area.

    Grain-size analyses and examination of the flood layer showed about 15-cm thickness of massive fine-sandy silt on the levee within 15

  12. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis, and from principal component analysis to dimensionality reduction. Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  13. Portable hyperspectral device as a valuable tool for the detection of protective agents applied on historical buildings

    Vettori, S.; Pecchioni, E.; Camaiti, M.; Garfagnoli, F.; Benvenuti, M.; Costagliola, P.; Moretti, S.

    2012-04-01

    In the recent past, a wide range of protective products (in most cases, synthetic polymers) have been applied to the surfaces of ancient buildings/artefacts to preserve them from alteration [1]. The lack of a detailed mapping of the permanence and efficacy of these treatments, in particular when applied on large surfaces such as building facades, may be particularly harmful when new restoration treatments are needed and the best choice of restoration protocols has to be made. The presence of protective compounds on stone surfaces may be detected in the laboratory by relatively simple diagnostic tests, which, however, normally require invasive (or micro-invasive) sampling methodologies and are time-consuming, thus limiting their use to a restricted number of samples and sampling sites. On the contrary, hyperspectral sensors are rapid, non-invasive and non-destructive tools capable of analyzing different materials on the basis of their different patterns of absorption at specific wavelengths, and thus particularly suitable for the field of cultural heritage [2,3]. In addition, they can be successfully used to discriminate between inorganic (i.e. rocks and minerals) and organic compounds, as well as to acquire, in a short time, many spectra and compositional maps at relatively low cost. In this study we analyzed a number of stone samples (Carrara Marble and the biogenic calcarenites "Lecce Stone" and "Maastricht Stone") after treatment of their surfaces with synthetic polymers (synthetic wax, acrylic, perfluorinated and silicon-based polymers) in common use in conservation-restoration practice. The hyperspectral device used for this purpose was an ASD FieldSpec FR Pro spectroradiometer, a portable, high-resolution instrument designed to acquire Visible and Near-Infrared (VNIR: 350-1000 nm) and Short-Wave Infrared (SWIR: 1000-2500 nm) punctual reflectance spectra with a rapid data collection time (about 0.1 s for each spectrum). The reflectance spectra so far obtained in

  14. A re-collection of Diplocentrum recurvum Lindl. (Orchidaceae) after a lapse of 100 years or more from Andhra Pradesh, India

    Mitta Mahendranath

    2015-08-01

    Diplocentrum recurvum Lindl. (Orchidaceae) has been recollected from the Horsley Hills of Chittoor district, Andhra Pradesh, after a lapse of 100 years or more. The present paper provides a detailed description, photographs of old herbarium specimens, and the distribution of the species.

  15. Dr Margaretha Brongersma-Sanders (1905-1996), Dutch scientist: an annotated bibliography of her work to celebrate 100 years since her birth

    Turner, S.; Cadée, G.C.

    2006-01-01

    Dr Margaretha Brongersma-Sanders, palaeontologist, pioneer geochemist, geobiologist and oceanographer, Officer of the Order of Oranje Nassau, was born 100 years ago (February 20th, 1905) in Kampen in The Netherlands. The fields of research that she covered during her lifetime include the taxonomy of recent and fossil, principally freshwater, fish; "fish kills" and mass mortality in the sea (especially of fish); taphonomy and preservation of fish; upwelling; and anoxic conditions linked to fish mortality…

  16. NAA applied to the study of metallic ion transfer induced by orthopedic surgical tools or by metallic prostheses

    After implantation of a metallic prosthesis in a patient, damage can occur and surrounding tissues may be modified. These effects were related by characterizing the soft tissue content. However, the variations in element concentrations can be small and it is necessary to evaluate the instrumental contamination, measured in muscular and capsular tissues surrounding the hips of selected corpses. From corpses, which never undergo surgical operations, samples have been cut with different surgical tools. From corpses with hip prosthesis, samples have been cut with scalpels to determine the contamination induced by metallic prostheses. Results give the mineral composition of surgical tools and of muscular and capsular tissues. (author)

  17. Experiences and Results of Applying Tools for Assessing the Quality of a mHealth App Named Heartkeeper.

    Martínez-Pérez, Borja; de la Torre-Díez, Isabel; López-Coronado, Miguel

    2015-11-01

    Currently, many incomplete mobile apps can be found in the commercial stores, apps with bugs or low quality that needs to be seriously improved. The aim of this paper is to use two different tools for assessing the quality of a mHealth app for the self-management of heart diseases by the own patients named Heartkeeper. The first tool measures the compliance with the Android guidelines given by Google and the second measures the users' Quality of Experience (QoE). The results obtained indicated that Heartkeeper follows in many cases the Android guidelines, especially in the structure, and offers a satisfactory QoE for its users, with special mention to aspects such as the learning curve, the availability and the appearance. As a result, Heartkeeper has proved to be a satisfactory app from the point of view of Google and the users. The conclusions obtained are that the type of tools that measure the quality of an app can be very useful for developers in order to find aspects that need improvements before releasing their apps. By doing this, the number of low-quality applications released will decrease dramatically, so these techniques are strongly recommended for all the app developers. PMID:26345452

  18. Applied acoustics concepts, absorbers, and silencers for acoustical comfort and noise control alternative solutions, innovative tools, practical examples

    Fuchs, Helmut V

    2013-01-01

    The author gives a comprehensive overview of materials and components for noise control and acoustical comfort. Sound absorbers must meet acoustical and architectural requirements, which fibrous or porous material alone can meet. Basics and applications are demonstrated, with representative examples for spatial acoustics, free-field test facilities and canal linings. Acoustic engineers and construction professionals will find some new basic concepts and tools for developments in order to improve acoustical comfort. Interference absorbers, active resonators and micro-perforated absorbers of different materials and designs complete the list of applications.

  19. Applying value engineering and modern assessment tools in managing NEPA: Improving effectiveness of the NEPA scoping and planning process

    ECCLESTON, C.H.

    1998-09-03

    While the National Environmental Policy Act (NEPA) implementing regulations focus on describing "what" must be done, they provide surprisingly little direction on "how" such requirements are to be implemented. Specific implementation of these requirements has largely been left to the discretion of individual agencies. More than a quarter of a century after NEPA's enactment, few rigorous tools, techniques, or methodologies have been developed or widely adopted for implementing the regulatory requirements. In preparing an Environmental Impact Statement, agencies are required to conduct a public scoping process to determine the range of actions, alternatives, and impacts that will be investigated. Determining the proper scope of analysis is an essential element in the successful planning and implementation of future agency actions. Lack of rigorous tools and methodologies can lead to project delays, cost escalation, and increased risk that the scoping process may not adequately capture the scope of decisions that eventually might need to be considered. Recently, selected Value Engineering (VE) techniques were successfully used in managing a pre-scoping effort. A new strategy is advanced for conducting a pre-scoping/scoping effort that combines NEPA with VE. Consisting of five distinct phases, this approach has potentially widespread implications for the way NEPA, and scoping in particular, is practiced.

  1. Finite element code in Python as a universal and modular tool applied to Kohn-Sham equations

    Cimrman, R.; Vackář, Jiří; Novák, M.; Čertík, O.; Rohan, E.; Tůma, Miroslav

    Vienna: Vienna University of Technology, 2012 - (Eberhardsteiner, J.; Böhm, H.; Rammerstorfer, F.), s. 5212-5221 ISBN 9783950353709. [European Congress on Computational Methods in Applied Sciences and Engineering (ECCOMAS 2012) /6./. Vienna (AT), 10.09.2012-14.09.2012] R&D Projects: GA ČR GA101/09/1630; GA ČR(CZ) GAP108/11/0853 Institutional support: RVO:68378271 ; RVO:67985807 Keywords : electronic structure * density-functional theory * pseudopotentials * molecules * clusters * finite-element method Subject RIV: BE - Theoretical Physics; IN - Informatics, Computer Science (UIVT-O)
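
    The record above is citation metadata only, but the idea it points to, assembling and solving a differential equation with finite elements in a few lines of modular Python, can be shown on a toy problem. A minimal sketch (illustrative only, not the authors' code), solving -u'' = 1 on [0, 1] with u(0) = u(1) = 0 using piecewise-linear elements:

    ```python
    # Toy 1-D finite element solver for -u'' = 1, u(0) = u(1) = 0.
    # Illustrates FEM-in-Python modularity; not the code from the record above.
    import numpy as np

    n = 20                # number of elements
    h = 1.0 / n           # uniform element size
    nodes = n + 1

    K = np.zeros((nodes, nodes))   # global stiffness matrix
    f = np.zeros(nodes)            # global load vector
    for e in range(n):
        Ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness
        fe = (h / 2.0) * np.array([1.0, 1.0])                  # load for f(x) = 1
        idx = [e, e + 1]
        K[np.ix_(idx, idx)] += Ke
        f[idx] += fe

    # Homogeneous Dirichlet conditions: solve on interior nodes only
    interior = slice(1, nodes - 1)
    u = np.zeros(nodes)
    u[interior] = np.linalg.solve(K[interior, interior], f[interior])

    # Exact solution is u(x) = x(1 - x)/2, so the midpoint value is 0.125
    print(u[nodes // 2])
    ```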

  2. Network analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish water management system

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-01-01

    We applied Network Analysis (NA) for assessing the sustainability of a Danish municipal Water Management System (WMS), studying its patterns of growth and development. We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices of the system. The WMS is highly efficient at processing the water resource, but the rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future patterns remain.
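
    The flow-network representation lends itself to simple structural indicators. A toy sketch of the idea on a made-up three-node water system using networkx (the NA indices actually used in such studies, e.g. ascendancy-type growth and development measures, are more elaborate):

    ```python
    # Toy water-flow network: nodes are water users, edge weights are flows.
    # Illustrates the representation only; real NA indices are more elaborate.
    import networkx as nx

    G = nx.DiGraph()
    G.add_weighted_edges_from([
        ("waterworks", "households", 1000.0),           # flows in m3/yr (made up)
        ("households", "treatment_plant", 900.0),
        ("treatment_plant", "recipient_stream", 850.0),
    ])

    # Extensive indicator: total system throughflow (sum of all flows)
    tst = sum(w for _, _, w in G.edges(data="weight"))

    # Structural indicator: link density; a chain of nodes with a single path
    # reflects the rigid, almost linear structure described in the abstract
    link_density = G.number_of_edges() / G.number_of_nodes()

    print(f"Total system throughflow: {tst:.0f} m3/yr")
    print(f"Link density: {link_density:.2f}")
    ```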

  3. FEMA 100 year Flood Data

    California Department of Resources — The Q3 Flood Data product is a digital representation of certain features of FEMA's Flood Insurance Rate Map (FIRM) product, intended for use with desktop mapping...

  4. 100 Years of Reality Learning

    Zimpher, Nancy L.; Wright Ron, D.

    2006-01-01

    One may have heard of reality TV, but what about reality learning? The latter is probably a term one hasn't seen much, although it is in many ways a clearer and more concise name for a concept that in 2006 marks its 100th anniversary: cooperative education, or "co-op." Co-op, a breakthrough idea pioneered at the University of Cincinnati by Herman…

  5. 100 years of Planck's quantum

    Duck, Ian M

    2000-01-01

    This invaluable book takes the reader from Planck's discovery of the quantum in 1900 to the most recent interpretations and applications of nonrelativistic quantum mechanics.The introduction of the quantum idea leads off the prehistory of quantum mechanics, featuring Planck, Einstein, Bohr, Compton, and de Broglie's immortal contributions. Their original discovery papers are featured with explanatory notes and developments in Part 1.The invention of matrix mechanics and quantum mechanics by Heisenberg, Born, Jordan, Dirac, and Schrödinger is presented next, in Part 2.Following that, in Part 3,

  6. 100 Years of Brownian motion

    Hänggi, Peter; Marchesoni, Fabio

    2005-01-01

    In the year 1905 Albert Einstein published four papers that raised him to a giant in the history of science of all times. These works encompass the photon hypothesis (for which he obtained the Nobel prize in 1921), his first two papers on (special) relativity theory and, of course, his first paper on Brownian motion, entitled "Über die von der molekularkinetischen Theorie der Wärme geforderte Bewegung von in ruhenden Flüssigkeiten suspendierten Teilchen" (submitted on May 11, 1905). Th...

  7. Effects of applying an external magnetic field during the deep cryogenic heat treatment on the corrosion resistance and wear behavior of 1.2080 tool steel

    Highlights: ► Deep cryogenic treatment increases the carbide percentage and makes the carbide distribution more homogeneous. ► Deep cryogenic treatment improves the wear resistance and corrosion behavior of 1.2080 tool steel. ► Applying the magnetic field weakens the carbide distribution and decreases the carbide percentage. ► Magnetized samples showed weaker corrosion and wear behavior. -- Abstract: This work concerns the effect of applying an external magnetic field during deep cryogenic heat treatment on the corrosion behavior, wear resistance and microstructure of 1.2080 (D2) tool steel. The analyses were performed via scanning electron microscopy (SEM), optical microscopy (OM), transmission electron microscopy (TEM) and X-ray diffraction (XRD) to study the microstructure; a pin-on-disk wear testing machine to study the wear behavior; and linear sweep voltammetry to study the corrosion behavior of the samples. It was shown that the deep cryogenic heat treatment eliminates retained austenite and produces a more uniform carbide distribution with a higher carbide percentage. It was also observed that the deep cryogenic heat treatment improves the wear behavior and corrosion resistance of 1.2080 tool steel. In the magnetized samples, by comparison, the carbide percentage decreased and the carbide distribution weakened; consequently, the wear behavior and corrosion resistance were attenuated relative to the non-magnetized samples.

  8. Applying TRIZ and Fuzzy AHP Based on Lean Production to Develop an Innovative Design of a New Shape for Machine Tools

    Ho-Nien Hsieh

    2015-03-01

    Companies are facing cutthroat competition and are forced to continuously perform better than their competitors. In order to enhance their position in the competitive world, organizations are improving at a faster pace. Industrial organizations must embrace new ideals, such as innovation. Today, innovative design in the development of new products has become a core value in most companies, while innovation is recognized as the main driving force in the market. This work applies TRIZ, the Russian theory of inventive problem solving, and the fuzzy analytic hierarchy process (FAHP) to design a new shape for machine tools. TRIZ offers several concepts and tools to facilitate concept creation and problem solving, while FAHP is employed as a decision support tool that can adequately represent qualitative and subjective assessments in a multiple-criteria decision-making environment. In the machine tools industry, this is the first study to develop an innovative design under the concept of lean production. We used TRIZ to propose principles relevant to the shape design under innovative design considerations, and used FAHP to evaluate and select the best feasible alternative from independent factors in a multiple-criteria decision-making environment. The contribution of this research is a scientific method, based on the lean production concept, for designing a new product and improving the old design process.
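
    The FAHP step can be pictured with its crisp special case: priorities derived from a pairwise comparison matrix by the geometric-mean method. The sketch below is a simplified stand-in (FAHP replaces the crisp judgments with triangular fuzzy numbers before deriving weights), and the criteria and numbers are invented for illustration:

    ```python
    # Crisp AHP priority vector via the geometric-mean method -- a simplified
    # stand-in for the fuzzy AHP used in the study. Criteria and judgments
    # below are invented for illustration.
    import numpy as np

    # Pairwise comparison matrix for three hypothetical design criteria
    # (rigidity, ease of assembly, appearance); A[i, j] = importance of i over j
    A = np.array([
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 2.0],
        [1 / 5, 1 / 2, 1.0],
    ])

    gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])  # row geometric means
    weights = gm / gm.sum()                        # normalized priorities
    print(weights)  # roughly [0.65, 0.23, 0.12]
    ```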

  9. Lead-time reduction utilizing lean tools applied to healthcare: the inpatient pharmacy at a local hospital.

    Al-Araidah, Omar; Momani, Amer; Khasawneh, Mohammad; Momani, Mohammed

    2010-01-01

    The healthcare arena, much like the manufacturing industry, benefits from many aspects of the Toyota lean principles. Lean thinking contributes to reducing or eliminating non-value-added time, money, and energy in healthcare. In this paper, we apply selected principles of lean management aiming at reducing the wasted time associated with drug dispensing at an inpatient pharmacy at a local hospital. Thorough investigation of the drug dispensing process revealed unnecessary complexities that contribute to delays in delivering medications to patients. We utilize DMAIC (Define, Measure, Analyze, Improve, Control) and 5S (Sort, Set-in-order, Shine, Standardize, Sustain) principles to identify and reduce wastes that contribute to increasing the lead-time in healthcare operations at the pharmacy under study. The results obtained from the study revealed potential savings of > 45% in the drug dispensing cycle time. PMID:20151593

  10. Covalent perturbation as a tool for validation of identifications and PTM mapping applied to bovine alpha-crystallin

    Bunkenborg, Jakob; Falkenby, Lasse Gaarde; Harder, Lea Mørch;

    2016-01-01

    Proteomic identifications hinge on the measurement of both parent and fragment masses and matching these to amino acid sequences via database search engines. The correctness of the identifications is assessed by statistical means. Here we present an experimental approach to test identifications. Chemical modification of all peptides in a sample leads to shifts in masses depending on the chemical properties of each peptide. The identification of a native peptide sequence and its perturbed version with a different parent mass and fragment ion masses provides valuable information. Labeling all peptides using reductive alkylation with formaldehyde is one such perturbation where the ensemble of peptides shifts mass depending on the number of reactive amine groups. Matching covalently perturbed fragmentation patterns from the same underlying peptide sequence increases confidence in the assignments and can salvage low scoring post-translationally modified peptides. Applying this strategy to bovine alpha-crystallin, we identify 9 lysine acetylation sites, 4 O-GlcNAc sites and 13 phosphorylation sites.
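
    The mass bookkeeping behind this perturbation is simple: reductive dimethylation with formaldehyde adds a fixed increment at the peptide N-terminus and at each lysine side chain. A minimal sketch, assuming the common light dimethyl label of about 28.0313 Da per reactive amine and ignoring edge cases such as proline N-termini:

    ```python
    # Expected parent-mass shift from reductive dimethylation with formaldehyde.
    # Assumes the light dimethyl label (+28.0313 Da per reactive amine, i.e. the
    # peptide N-terminus plus each lysine); edge cases are ignored.
    DIMETHYL_SHIFT_DA = 28.0313

    def dimethyl_mass_shift(peptide: str) -> float:
        """Expected mass shift (Da) for a fully labeled peptide."""
        reactive_amines = 1 + peptide.upper().count("K")  # N-terminus + lysines
        return reactive_amines * DIMETHYL_SHIFT_DA

    # Example peptide strings (arbitrary sequences, not study data):
    print(dimethyl_mass_shift("ALSPEETR"))     # no lysine: 28.0313 Da
    print(dimethyl_mass_shift("VQDDFVEIHGK"))  # N-terminus + one lysine: 56.0626 Da
    ```

    Matching a peptide at both its native and its shifted mass, with fragment ions displaced accordingly, is what provides the extra confidence described above.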

  11. Architecture of the global land acquisition system: applying the tools of network science to identify key vulnerabilities

    Global land acquisitions, often dubbed 'land grabbing', are increasingly becoming drivers of land change. We use the tools of network science to describe the connectivity of the global acquisition system. We find that 126 countries participate in this form of global land trade. Importers are concentrated in the Global North, the emerging economies of Asia, and the Middle East, while exporters are confined to the Global South and Eastern Europe. A small handful of countries account for the majority of land acquisitions (particularly China, the UK, and the US), the cumulative distribution of which is best described by a power law. We also find that countries with many land trading partners play a disproportionately central role in providing connectivity across the network, with the shortest trading path between any two countries traversing either China, the US, or the UK over a third of the time. The land acquisition network contains very few trading cliques and is therefore characterized by a low degree of preferential trading or regionalization. We also show that countries with many export partners trade land with countries with few import partners, and vice versa, meaning that less developed countries have a large array of export partnerships with developed countries, but very few import partnerships (a disassortative relationship). Finally, we find that the structure of the network is potentially prone to propagating crises (e.g., if importing countries become dependent on crops exported from their land trading partners). This network analysis approach can be used to quantitatively analyze and understand telecoupled systems as well as to anticipate and diagnose the potential effects of telecoupling. (letter)
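
    The connectivity claims above map onto standard graph measures: betweenness centrality identifies the connector countries, and a negative degree assortativity coefficient captures the disassortative mixing. A sketch on a made-up miniature trade network using networkx (the edges are placeholders, not the study's data):

    ```python
    # Miniature land-trade network; edges are hypothetical trading partnerships.
    # Betweenness picks out connector nodes; a negative degree assortativity
    # coefficient indicates hubs trading with low-degree partners.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("China", "Sudan"), ("China", "Laos"), ("China", "Cambodia"),
        ("UK", "Sudan"), ("UK", "Ghana"), ("US", "Brazil"),
        ("US", "Sudan"), ("US", "Ghana"),
    ])

    bc = nx.betweenness_centrality(G)
    hub = max(bc, key=bc.get)
    print(f"Most central connector: {hub} ({bc[hub]:.2f})")
    print(f"Degree assortativity: {nx.degree_assortativity_coefficient(G):.2f}")
    ```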

  12. Apply Web-based Analytic Tool and Eye Tracking to Study The Consumer Preferences of DSLR Cameras

    Jih-Syongh Lin

    2013-11-01

    Consumers' preferences and purchase motivations often arise from the combined evaluation of a product's form features, color, function, and price. If an enterprise can bring these criteria under control, it can grasp opportunities in the marketplace. In this study, the product form, brand, and prices of five DSLR digital cameras from Nikon, Lumix, Pentax, Sony, and Olympus were investigated through image evaluation and eye tracking. A web-based two-dimensional analytical tool was used to present information on three layers. Layer A provided information on product form and brand name; Layer B added product price, for the evaluation of purchase intention (X axis) and product form attraction (Y axis). On Layer C, Nikon J1 image samples in five color series were presented for the evaluation of attraction and purchase intention. The study results revealed that, among the five Japanese brands of digital cameras, the LUMIX GF3 is most preferred and serves as the major competitive product, with a product price of US$630. Eye tracking showed that the lens, the curved handgrip, the curved part and shutter button above the lens, and the flexible flash of the LUMIX GF3 are the parts that attract consumers' eyes. From the verbal descriptions, it was found that consumers emphasize functions such as 3D lens support, continuous focusing while shooting video, the iA intelligent scene mode, and full manual control support. In the color preference for the Nikon J1, red and white are most preferred while pink is least favored. These findings can serve as references for designers and marketing personnel in new product design and development.

  13. Abstracts of the International conference 'Geological and geophysical studies of the Republic of Kazakhstan's sites', devoted to 100-year jubilee of K.I. Satpaev

    The International conference 'Geological and geophysical studies of the Republic of Kazakhstan's sites' was devoted to the 100-year jubilee of K.I. Satpaev. Satpaev was a well-known Kazakh scientist-geologist and the first President of the Academy of Sciences of the Kazakh Soviet Socialist Republic. The conference was held on 26-29 April 1999 in the city of Kurchatov, on the territory of the former Semipalatinsk test site. The conference was mainly dedicated to problems of geological and geophysical examination and monitoring of objects exposed to effects from underground nuclear explosions. The collection of abstracts comprises 21 papers.

  14. Diagnosing Integrity of Transformer Windings by Applying Statistical Tools to Frequency Response Analysis Data Obtained at Site

    M. Prameela

    2014-03-01

    This study presents the results of Sweep Frequency Response Analysis (SFRA) measurements carried out on a number of power transformers at various sites, involving problems like shorting of winding turns, core faults and related issues, On-Load Tap Changer (OLTC) open contacts, and winding displacement issues. The numerical parameters, viz. Min-Max ratio (MM), Mean Square Error (MSE), Maximum Absolute difference (MABS), Absolute Sum of Logarithmic Error (ASLE), Standard Deviation (S.D.) and Correlation Coefficient (CC), computed in three different frequency bands, are presented to aid the interpretation of SFRA data. Comparison of frequency responses among different phases of the same transformer and with sister units was carried out to interpret the data. The study presents limits for the various numerical parameters to diagnose the condition of the transformer and discriminate the faulty winding, after accounting for manufacturing, design and asymmetry of the winding. The results presented in the study will help in interpreting SFRA data by applying numerical techniques and assessing the condition of the transformer.
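
    These indices are direct computations on two magnitude traces sampled at the same frequencies. A sketch using definitions commonly found in the SFRA literature (the study's exact formulas may differ slightly):

    ```python
    # Common SFRA comparison indices for a reference trace x and a comparison
    # trace y on the same frequency grid. Definitions follow common SFRA
    # literature usage; the study's exact formulas may differ slightly.
    import numpy as np

    def sfra_indices(x, y):
        diff = x - y
        return {
            "CC": np.corrcoef(x, y)[0, 1],                # Correlation Coefficient
            "MSE": np.mean(diff ** 2),                    # Mean Square Error
            "MABS": np.max(np.abs(diff)),                 # Maximum Absolute difference
            "ASLE": np.mean(np.abs(np.log10(np.abs(y) / np.abs(x)))),  # log error
            "MM": np.sum(np.minimum(x, y)) / np.sum(np.maximum(x, y)), # Min-Max ratio
            "SD": np.std(diff),                           # Standard Deviation
        }

    # Example: two nearly identical synthetic traces
    f = np.linspace(20.0, 2e6, 1000)
    x = 1.0 / (1.0 + (f / 1e5) ** 2)
    y = 1.01 * x
    print(sfra_indices(x, y))
    ```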

  15. VERONA V6.22 – An enhanced reactor analysis tool applied for continuous core parameter monitoring at Paks NPP

    Végh, J., E-mail: janos.vegh@ec.europa.eu [Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Pós, I., E-mail: pos@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Horváth, Cs., E-mail: csaba.horvath@energia.mta.hu [Centre for Energy Research, Hungarian Academy of Sciences, H-1525 Budapest 114, P.O. Box 49 (Hungary); Kálya, Z., E-mail: kalyaz@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Parkó, T., E-mail: parkot@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Ignits, M., E-mail: ignits@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary)

    2015-10-15

    Between 2003 and 2007 the Hungarian Paks NPP performed a large modernization project to upgrade its VERONA core monitoring system. The modernization work resulted in a state-of-the-art system that was able to support the reactor thermal power increase to 108% by more accurate and more frequent core analysis. Details of the new system are given in Végh et al. (2008); the most important improvements were as follows: complete replacement of the hardware and the local area network; application of a new operating system and porting of a large fraction of the original application software to the new environment; implementation of a new human-system interface; and, last but not least, introduction of new reactor physics calculations. The basic novelty of the modernized core analysis was the introduction of an on-line core-follow module based on the standard Paks NPP core design code HELIOS/C-PORCA. The new calculations also provided much finer spatial resolution, both in terms of axial node numbers and within the fuel assemblies. The new system was able to calculate the fuel applied during the first phase of the power increase accurately, but it was not tailored to determine the effects of burnable absorbers such as gadolinium. However, in the second phase of the power increase process the application of fuel assemblies containing three fuel rods with gadolinium content was intended (in order to optimize fuel economy); therefore the off-line and on-line VERONA reactor physics models had to be further modified to be able to handle the new fuel according to the accuracy requirements. In the present paper, first a brief overview of the system version (V6.0) commissioned after the first modernization step is outlined; then details of the modified off-line and on-line reactor physics calculations are described. Validation results for the new modules are treated extensively, in order to illustrate the extent and complexity of the V&V procedure associated with the development and licensing of the new

  16. Undergraduate teaching modules featuring geodesy data applied to critical social topics (GETSI: GEodetic Tools for Societal Issues)

    Pratt-Sitaula, B. A.; Walker, B.; Douglas, B. J.; Charlevoix, D. J.; Miller, M. M.

    2015-12-01

    The GETSI project, funded by NSF TUES, is developing and disseminating teaching and learning materials that feature geodesy data applied to critical societal issues such as climate change, water resource management, and natural hazards (serc.carleton.edu/getsi). It is a collaboration between UNAVCO (NSF's geodetic facility), Mt San Antonio College, and Indiana University. GETSI was initiated after requests by geoscience faculty for geodetic teaching resources for introductory and majors-level students. Full modules take two weeks, but module subsets can also be used. Modules are developed and tested by two co-authors and also tested in a third classroom. GETSI is working in partnership with the Science Education Resource Center's (SERC) InTeGrate project on the development, assessment, and dissemination to ensure compatibility with the growing number of resources for geoscience education. Two GETSI modules are being published in October 2015. "Ice mass and sea level changes" includes geodetic data from GRACE, satellite altimetry, and GPS time series. "Imaging Active Tectonics" has students analyzing InSAR and LiDAR data to assess infrastructure earthquake vulnerability. Another three modules are in testing during fall 2015 and will be published in 2016. "Surface process hazards" investigates mass wasting hazard and risk using LiDAR data. "Water resources and geodesy" uses GRACE, vertical GPS, and reflection GPS data to have students investigate droughts in California and the High Great Plains. "GPS, strain, and earthquakes" helps students learn about infinitesimal and coseismic strain through analysis of horizontal GPS data and includes an extension module on the Napa 2014 earthquake. In addition to teaching resources, the GETSI project is compiling recommendations on successful development of geodesy curricula. The chief recommendations so far are the critical importance of including scientific experts in the authorship team and investing significant resources in

  17. The development of vat dyes in 100 years (to be continued)

    陈荣圻

    2015-01-01

    The first vat dye (Vat Dye RSN) was synthesized and produced by BASF in 1901, more than 100 years ago; considering that synthetic indigo had already been born at BASF in 1897, the history is even longer. Vat dyes are expensive because of their complex chemical structures and long synthesis routes, which produce large amounts of waste that is difficult to treat. However, vat dyes are brilliant in color and high in color strength, and cannot be replaced by any other class of cotton dyes. Beyond printing and dyeing, some vat dyes can, after pigmentation, be made into high-grade organic pigments, and some of these extend to high-tech fields such as optical physics and liquid-crystal and photoconductive materials in electrochemistry; they are indispensable functional materials that have taken on an entirely new look.

  18. Geographical information systems as a tool in limnological studies: an applied case study in a shallow lake of a plain area, Buenos Aires province, Argentina

    Understanding the hydrological functioning and the interaction among the different water bodies in an area is essential when a sustainable use of the water resources is considered. The aim of the present paper is to assess both hydrological-limnological methods and GIS as an integrated methodology applied to the study of shallow lakes and the hydrological behavior of shallow wetlands in plain areas. La Salada is an areic (without surface outflow) permanent shallow lake with an area of 5.78 km2, located near La Dulce town (SE of Buenos Aires Province, Argentina). In this paper we applied methods and tools of Geographical Information Systems in order to assess both the evolution and the state of the wetland. Topographic profiles showing the relationship between the lake and the other aquatic systems, as well as a multi-temporal assessment of the morphometric parameters, were derived using a Digital Terrain Model of the area. A sample grid was designed to obtain bathymetric, hydrogeochemical and isotopic data. The chemical water composition is homogeneous in area and depth. Changes in the conductivity values with depth, the isotopic contents and the Gibbs diagram showed that evaporation is the main process controlling the water chemistry. Physical-chemical parameters were used to establish the water quality and the uses of the lake.

  1. Bottom-Up modeling, a tool for decision support for long-term policy on energy and environment - The TIMES model applied to the energy intensive industries

    Among the energy users in France and Europe, some industrial sectors are very important and should have a key role when assessing future final energy demand patterns. The aim of our work is to apply a prospective model for the long-range analysis of energy/technology choices in the industrial sector, focusing on the energy-intensive sectors. The modelling tool applied in this study is the TIMES model (of the same family as the better-known MARKAL model). It is an economic linear-programming model generator for local, national or multi-regional energy systems, which provides a technology-rich basis for estimating energy dynamics over a long-term, multi-period horizon. We illustrate our work with nine energy-intensive industrial sectors: paper, steel, glass, cement, lime, tiles, brick, ceramics and plaster. The model includes a detailed description of the processes involved in the production of industrial products, providing typical energy uses in each process step. In our analysis, we identified for each industry several commercially available state-of-the-art technologies, characterized and chosen by the model on the basis of cost effectiveness. Furthermore, we calculated potential energy savings and carbon dioxide emission reductions, and we estimated the energy impact of a technological rupture. This work indicates that a significant potential for energy savings and carbon dioxide emission reductions still exists in all industries. (author)
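
    At its core, a TIMES/MARKAL run is a (very large) linear program: choose technology activity levels that minimize total system cost subject to demand satisfaction and emission constraints. A deliberately tiny sketch of that structure, with made-up numbers for two competing cement-making technologies (not data from the study):

    ```python
    # Toy TIMES-like linear program: two technologies jointly meet a product
    # demand under a CO2 cap at minimum cost. All numbers are made up.
    from scipy.optimize import linprog

    # Decision variables: x = [output of old kiln, output of efficient kiln] (Mt)
    cost = [30.0, 45.0]              # unit production cost

    # Demand: x0 + x1 >= 10  (written as -x0 - x1 <= -10)
    # Emission cap: 0.9*x0 + 0.6*x1 <= 7.5 (MtCO2)
    A_ub = [[-1.0, -1.0],
            [0.9, 0.6]]
    b_ub = [-10.0, 7.5]

    res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, res.fun)  # optimal mix (5, 5) and total cost 375 here
    ```

    The cheaper technology alone would violate the CO2 cap, so the optimum splits production, which is exactly the kind of cost-versus-emissions trade-off the full model resolves across thousands of processes.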

  2. Success at 100 is easier said than done--comments on Araújo et al: successful aging at 100 years.

    Martin, Peter; Poon, Leonard W

    2016-02-01

    Few would argue that achieving the age of 100 years is extraordinary, but what about the quality of life at this extreme age? Is it worth it to live to 100 and beyond? The study by Araújo, Ribero, Teixeira, and Paúl (2015) provided an answer to this question in three ways, substantiating and complementing recent findings about successful aging in extreme old age (Poon and Perls, 2007; Martin et al., 2015). First, the study joined other investigators in asking whether the criteria for successful aging posed by Rowe and Kahn (1997) are applicable to older adults at the end stage of a very long life. Second, the study shed light on whether objective or subjective criteria are more appropriate to gauge levels of successful aging for the oldest old (e.g. Pruchno et al., 2010; Cho et al., 2012). Finally, the study provided additional data on psychological, social, and economic resources that enhance the needed ingredients of successful aging at the century mark. PMID:26781990

  3. Organochlorine pesticides (OCPs) in wetland soils under different land uses along a 100-year chronosequence of reclamation in a Chinese estuary

    Bai, Junhong; Lu, Qiongqiong; Zhao, Qingqing; Wang, Junjing; Gao, Zhaoqin; Zhang, Guangliang

    2015-12-01

    Soil profiles were collected to a depth of 30 cm in ditch wetlands (DWs), riverine wetlands (RiWs) and reclaimed wetlands (ReWs) along a 100-year chronosequence of reclamation in the Pearl River Delta. In total, 16 OCPs were measured to investigate the effects of wetland reclamation and reclamation history on OCP levels. Our results showed that average ∑DDTs, HCB, MXC, and ∑OCPs were higher in surface soils of DWs compared to RiWs and ReWs. Both D30 and D20 soils contained the highest ∑OCP levels, followed by D40 and D100 soils; lower ∑OCP levels occurred in D10 soils. Higher ∑OCP levels were observed in the younger RiWs than in the older ones, and surface soils exhibited higher ∑OCP concentrations in the older ReWs compared with younger ReWs. The predominant percentages of γ-HCH in ∑HCHs (>42%) and aldrin in ∑DRINs (>46%) in most samples reflected the recent use of lindane and aldrin. The presence of the dominant DDT isomers (p,p'-DDE and p,p'-DDD) indicated the historical input of DDT and significant aerobic degradation of the compound. Generally, DW soils had a higher ecotoxicological risk of OCPs than RiW and ReW soils, and the top 30 cm of soils had higher ecotoxicological risks from HCHs than from DDTs.

  4. Fractionation, transfer, and ecological risks of heavy metals in riparian and ditch wetlands across a 100-year chronosequence of reclamation in an estuary of China.

    Xiao, Rong; Bai, Junhong; Lu, Qiongqiong; Zhao, Qingqing; Gao, Zhaoqin; Wen, Xiaojun; Liu, Xinhui

    2015-06-01

    The effect of reclamation on heavy metal concentrations and the ecological risks in ditch wetlands (DWs) and riparian wetlands (RWs) across a 100-year chronosequence in the Pearl River Estuary of China was investigated. Concentrations of 4 heavy metals (Cd, Cu, Pb, and Zn) in soil and plant samples, and sequential extracts of soil samples were determined, using inductively coupled plasma atomic absorption spectrometry. Results showed that heavy metal concentrations were higher in older DW soils than in the younger ones, and that the younger RW soils contained higher heavy metal concentrations compared to the older ones. Although the increasing tendency of heavy metal concentrations in soil was obvious after wetland reclamation, the metals Cu, Pb, and Zn exhibited low or no risks to the environment based on the risk assessment code (RAC). Cd, on the other hand, posed a medium or high risk. Cd, Pb, and Zn were mainly bound to Fe–Mn oxide, whereas most of Cu remained in the residual phase in both ditch and riparian wetland soils, and the residual proportions generally increased with depth. Bioconcentration and translocation factors for most of these four heavy metals significantly decreased in the DWs with older age (p < 0.05), whereas they increased in the RWs with younger age (p < 0.05). The DW soils contained higher concentrations of heavy metals in the organic fractions, whereas there were more carbonate and residual fractions in the RW soils. The non-bioavailable fractions of Cu and Zn, and the organic-bound Cd and Pb significantly inhibited plant growth. PMID:25723958
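
    The risk and transfer measures used here reduce to simple ratios over the sequential-extraction fractions and the plant and soil concentrations. A sketch using the usual definitions (RAC as the exchangeable-plus-carbonate share of the total metal; BCF as plant-to-soil and TF as shoot-to-root concentration ratios); the study may define them slightly differently, and the numbers below are invented:

    ```python
    # Usual definitions of RAC, BCF and TF for a heavy metal; treat as a sketch,
    # since the study may define these slightly differently.

    def rac_percent(fractions):
        """Risk Assessment Code: exchangeable + carbonate share of total (%)."""
        total = sum(fractions.values())
        return 100.0 * (fractions["exchangeable"] + fractions["carbonate"]) / total

    def bcf(plant_conc, soil_conc):
        """Bioconcentration factor: plant vs. soil concentration."""
        return plant_conc / soil_conc

    def tf(shoot_conc, root_conc):
        """Translocation factor: shoot vs. root concentration."""
        return shoot_conc / root_conc

    # Hypothetical Cd fractions (mg/kg); RAC above ~30% is commonly read as
    # high risk, 11-30% as medium risk.
    cd = {"exchangeable": 0.08, "carbonate": 0.05, "Fe-Mn oxide": 0.20,
          "organic": 0.10, "residual": 0.12}
    print(f"RAC(Cd) = {rac_percent(cd):.1f}%")
    print(f"BCF(Cd) = {bcf(0.30, 0.55):.2f}, TF(Cd) = {tf(0.12, 0.25):.2f}")
    ```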

  5. A geochemical record of environmental changes in sediments from Sishili Bay, northern Yellow Sea, China: Anthropogenic influence on organic matter sources and composition over the last 100 years

    Highlights: • Increased TOC and TN in the sediment cores indicated a eutrophic trend since 1975. • Marine organic matter sources dominated in Sishili Bay. • Scallop culture mitigated eutrophication pressures in Sishili Bay. • Increased fertilizer use matched the eutrophication process in Sishili Bay starting in 1975. -- Abstract: Total organic carbon (TOC), total nitrogen (TN), δ13C and δ15N were measured in sediment cores at three sites in Sishili Bay, China, to track the impacts of anthropogenic activities on the coastal environment over the last 100 years. The increased TOC and TN in the upper section of the sediment cores indicated a eutrophic process since 1975. In comparison, the TOC and TN in the sediment core near a scallop aquaculture area displayed a much slower increase, indicating the contribution of scallop aquaculture to mitigating eutrophication. Combined information from δ13C, δ15N and TOC:TN indicated an increased terrestrial signal, although the organic matter sources in Sishili Bay featured a mixture of terrestrial and marine sources, with phytoplankton being dominant. Increased fertilizer use since the 1970s contributed to the eutrophic process in Sishili Bay since 1975, and increased sewage discharge from the 1990s has added to this process.

  6. Quantification of uncertainties in the 100-year flow at an ungaged site near a gaged station and its application in Georgia

    Cho, Huidae; Bones, Emma

    2016-08-01

    The Federal Emergency Management Agency has introduced the concept of the "1-percent plus" flow to incorporate various uncertainties in estimation of the 100-year or 1-percent flow. However, to the best of the authors' knowledge, no clear directions for calculating the 1-percent plus flow have been defined in the literature. Although information about standard errors of estimation and prediction is provided along with the regression equations that are often used to estimate the 1-percent flow at ungaged sites, uncertainty estimation becomes more complicated when there is a nearby gaged station, because regression flows and the peak flow estimate from a gage analysis should be weighted to compute the weighted estimate of the 1-percent flow. In this study, an equation for calculating the 1-percent plus flow at an ungaged site near a gaged station is analytically derived. Also, a detailed process for calculating the 1-percent plus flow at an ungaged site near a gaged station in Georgia is introduced as an example, and a case study is performed. This study provides engineers and practitioners with a method that helps them better assess flood risks and develop mitigation plans accordingly.
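
    For the weighting step, one common convention (used, for example, in USGS flood-frequency practice) combines the gage and regression estimates in log space with weights inversely proportional to their variances; the paper's derivation extends this kind of estimate to an uncertainty ("plus") bound. A sketch of the basic weighted estimate, with the caveat that the paper's exact formulation may differ:

    ```python
    # Variance-weighted combination of gage-analysis and regression estimates
    # of the 100-year flow, in log10 space with inverse-variance weights.
    # A common convention; the paper's 1-percent-plus formulation differs.
    import math

    def weighted_flow(q_gage, var_gage, q_regr, var_regr):
        """Weighted 100-year flow from two estimates and their log10 variances."""
        lg, lr = math.log10(q_gage), math.log10(q_regr)
        # The more certain estimate receives the larger weight
        lw = (lg * var_regr + lr * var_gage) / (var_gage + var_regr)
        return 10.0 ** lw

    # Hypothetical example: gage estimate 850 cfs (variance 0.01 log10-units^2),
    # regression estimate 1200 cfs (variance 0.04 log10-units^2)
    print(round(weighted_flow(850.0, 0.01, 1200.0, 0.04)))  # lands nearer 850
    ```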

  7. Application of a stent splint to protect intraoral organs from radiation injury in a 97-year-old patient with multiple oral cancers who survived past 100 years of age

    Radiation therapy has been used with increasing frequency in recent years in the management of oral cancers in patients of advanced age. In such cases, good care must be taken to maintain the oral health of patients undergoing cancerocidal doses of radiation therapy. Using splints as tissue displacers during radiation, we were able to treat a 99-year-old female patient without serious radiation sequelae, and she survived beyond 100 years of age. When she first visited us at 97 years old, the primary lesions, located on the left upper lip, nose, and upper and lower gums, were diagnosed histologically as multiple verrucous carcinomas. Seventeen months after the first radiotherapy to the lip, nose and upper jaw, we planned radiotherapy to a recurrent tumor of the lower gum. In order to eliminate or minimize side effects of the second irradiation on the adjacent intraoral organs, we devised a splint to keep the tongue and upper gum out of the radiation field. The splint, serving as a tissue displacer, was made of heat-cured acrylic resin and divided into two pieces shaped like full dentures without artificial teeth. They were applied to the upper and lower jaws. The lower piece had a large wing to exclude the tongue from the irradiation field. After setting of the splint, she clenched slightly with the aid of a chin cap. We could then complete the radiotherapy with 10 MV X-rays at 40 Gy as scheduled, without serious complications. (author)

  8. 100-Year Floodplains, Digital Floodplain maps created by WI DNR added to our website in 2013, Published in 2013, 1:24000 (1in=2000ft) scale, Oneida County Wisconsin.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Published Reports/Deeds information as of 2013. It is...

  9. Polychlorinated biphenyls (PCBs) in sediments/soils of different wetlands along 100-year coastal reclamation chronosequence in the Pearl River Estuary, China.

    Zhao, Qingqing; Bai, Junhong; Lu, Qiongqiong; Gao, Zhaoqin; Jia, Jia; Cui, Baoshan; Liu, Xinhui

    2016-06-01

    PCBs (polychlorinated biphenyls) were determined in sediment/soil profiles to a depth of 30 cm from three different wetlands (i.e., ditch wetlands, riparian wetlands and reclaimed wetlands) of the Pearl River Estuary to elucidate their levels, distribution and toxic risks along a 100-year chronosequence of reclamation. All detected PCB congeners and the total of 15 PCBs (∑15 PCBs) decreased with depth along sediment/soil profiles in these three wetlands. The ∑15 PCB concentrations ranged from 17.68 to 169.26 ng/g in surface sediments/soils. Generally, old wetlands tended to have higher PCB concentrations than younger ones. The dominant PCB congeners at all sampling sites were light PCB homologues (i.e., tetra-CBs and tri-CBs). According to the sediment quality guideline, the average PCB concentrations exceeded the threshold effects level (TEL, 21.6 ng/g) at most of the sampling sites, exhibiting possible adverse biological effects, which were dominantly caused by light PCB congeners. The total toxic equivalent (TEQ) concentrations of the 10 dioxin-like PCBs (DL-PCBs) detected at all sampling sites ranged from 0.04 to 852.7 × 10−3 ng/g, mainly driven by PCB126. Only DL-PCB concentrations in ditch and riparian wetland sediments with 40-year reclamation histories (i.e., D40 and Ri40) exhibited moderate adverse biological effects according to SQGQ values. Principal component analysis indicated that PCBs in the three wetland sediments/soils mainly originated from Aroclor 1016, 1242, and 1248. Correlation analysis showed that sediment/soil organic carbon content had a significant correlation with the concentrations of several PCB congeners (P < 0.05). PMID:27038573

  10. Changes in stable isotopes, lignin-derived phenols, and fossil pigments in sediments of Lake Biwa, Japan: Implications for anthropogenic effects over the last 100 years

    We measured stable nitrogen (N) and carbon (C) isotope ratios, lignin-derived phenols, and fossil pigments in sediments of known ages to elucidate the historical changes in the ecosystem status of Lake Biwa, Japan, over the last 100 years. Stable N isotope ratios and algal pigments in the sediments increased rapidly from the early 1960s to the 1980s, and then remained relatively constant, indicating that eutrophication occurred in the early 1960s but ceased in the 1980s. Stable C isotope ratios of the sediment increased from the 1960s, but decreased after the 1980s to the present. This decrease in stable C isotope ratios after the 1980s could not be explained by annual changes in either terrestrial input or algal production. However, when the C isotope ratios were corrected for the Suess effect, the shift to more negative isotopic values in atmospheric CO2 caused by fossil fuel burning, the isotopic values showed a trend consistent with the other biomarkers and the monitoring data. The trend was also mirrored by the relative abundance of lignin-derived phenols, a unique organic tracer of material originating from terrestrial plants, which decreased in the early 1960s and recovered to some degree in the 1980s. We detected no notable difference in the composition of lignin phenols, suggesting that the terrestrial plant composition did not change markedly. However, we found that the lignin accumulation rate increased around the 1980s. These results suggest that although eutrophication has stabilized since the 1980s, allochthonous organic matter input has changed in Lake Biwa over the past 25 years.
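
    The Suess-effect correction mentioned above amounts to removing the fossil-fuel-driven decline of atmospheric δ13C from the measured sediment values so that samples of different ages can be compared. A heavily simplified sketch; the linear atmospheric-decline function below is a rough placeholder (published corrections fit curves to ice-core and direct atmospheric CO2 records):

    ```python
    # Simplified Suess-effect correction for sediment delta13C values.
    # The linear atmospheric decline below is a rough placeholder assumption;
    # published corrections use curves fitted to atmospheric CO2 records.

    def atm_d13c_shift(year, ref_year=1900.0):
        """Assumed atmospheric delta13C change (per mil) since ref_year."""
        return -0.015 * (year - ref_year)   # placeholder rate, per mil per year

    def suess_corrected(d13c_measured, year):
        """Remove the atmospheric decline from a measured sediment value."""
        return d13c_measured - atm_d13c_shift(year)

    print(suess_corrected(-24.5, 1990))  # -24.5 - (-1.35) = -23.15 per mil
    ```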

  12. Upwelling and anthropogenic forcing on phytoplankton productivity and community structure changes in the Zhejiang coastal area over the last 100 years

    DUAN Shanshan; XING Lei; ZHANG Hailong; FENG Xuwen; YANG Haili; ZHAO Meixun

    2014-01-01

    Phytoplankton productivity and community structure in marginal seas have been altered significantly during the past three decades, but it is still a challenge to distinguish the forcing mechanisms between climate change and anthropogenic activities. High time-resolution biomarker records of two 210Pb-dated sediment cores (#34: 28.5°N, 122.272°E; CJ12-1269: 28.8619°N, 122.5153°E) from the Min-Zhe coastal mud area were compared to reveal changes of phytoplankton productivity and community structure over the past 100 years. Phytoplankton productivity started to increase gradually from the 1970s and increased rapidly after the late 1990s at Site #34; it started to increase gradually from the middle 1960s and increased rapidly after the late 1980s at Site CJ12-1269. Productivity of Core CJ12-1269 was higher than that of Core #34. Phytoplankton community structure variations displayed opposite patterns in the two cores. The decreasing D/B (dinosterol/brassicasterol) ratio of Core #34 since the 1960s revealed increased diatom contribution to total productivity. In contrast, the increasing D/B ratio of Core CJ12-1269 since the 1950s indicated increased dinoflagellate contribution to total productivity. Both the productivity increase and the increased dinoflagellate contribution in Core CJ12-1269 since the 1950s-1960s were mainly caused by anthropogenic activities, as the location was closer to the Changjiang River Estuary with higher nutrient concentration and decreasing Si/N ratios. However, increased diatom contribution in Core #34 is proposed to be caused by increased coastal upwelling, with higher nutrient concentration and higher Si/N ratios.

  14. 100 Years of benthic foraminiferal history on the inner Texas shelf inferred from fauna and stable isotopes: Preliminary results from two cores

    Strauss, Josiah; Grossman, Ethan L.; Carlin, Joseph A.; Dellapenna, Timothy M.

    2012-04-01

    Coastal regions, such as the Texas-Louisiana shelf, are subject to seasonal hypoxia that strongly depends on the magnitude of freshwater discharge from local and regional river systems. We have determined benthic foraminiferal fauna and isotopic compositions in two 210Pb-dated box cores (BR4 and BR5) to examine the evidence for nearshore hypoxia and freshwater discharge on the Texas shelf during the last 100 years. The 210Pb chronologies of both cores reveal sedimentation rates of 0.2 and 0.1 cm yr-1, translating to ~60 and ~90 year records. The fauna of both cores were almost exclusively composed of Ammonia parkinsoniana and Elphidium excavatum, indicating euryhaline ambient waters. The Ammonia-Elphidium (A-E) index, a qualitative measure of low-oxygen conditions, shows an increase from values between 20 and 50 to near 100 in both cores, suggesting low-oxygen conditions between 1960 and the core top. Between 1950 and 1960 (9-10 cm), low A-E values in BR4 coincide with high δ18O and δ13C values greater than 0‰ and -1‰, respectively. This event corresponds to severe drought (the Texas Drought of Record) over the Brazos River drainage basin and considerably reduced river discharge from 1948 to 1957. High A-E values prior to this event imply low-oxygen conditions were prevalent before anthropogenic exacerbation of Louisiana shelf hypoxia, and at least since the dredging of a new Brazos River delta in 1929. Elphidium excavatum δ13C values are very low (-4‰) and indicative of a significant vital effect. The δ13C values of A. parkinsoniana average -3‰ and exhibit little variability, most likely reflecting pore waters influenced by aerobic and anaerobic respiration. The association of lowered Brazos River discharge with more oxygenated shelf bottom waters suggests Brazos River discharge and shelf hypoxia are linked, although Mississippi-Atchafalaya discharge may also contribute to shelf stratification.
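
    The Ammonia-Elphidium index used above is a simple proportion of the two genera in a faunal count. A minimal sketch of the calculation, with hypothetical specimen counts:

```python
def ammonia_elphidium_index(n_ammonia, n_elphidium):
    """A-E index: 100 * NA / (NA + NE); higher values indicate
    lower-oxygen bottom waters."""
    total = n_ammonia + n_elphidium
    if total == 0:
        raise ValueError("no specimens counted")
    return 100.0 * n_ammonia / total

# Hypothetical counts down-core
print(ammonia_elphidium_index(180, 20))  # ~90: low-oxygen conditions
print(ammonia_elphidium_index(40, 60))   # 40: better-oxygenated interval
```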

  15. Indications of progressive desiccation of the Transvaal Lowveld over the past 100 years, and implications for the water stabilization programme in the Kruger National Park

    U. De V. Pienaar

    1985-12-01

    All available rainfall statistics recorded for the Kruger National Park area since 1907, coupled with an analysis of all the historical climatological data on hand, appear to confirm the quasi-twenty-year oscillation in the precipitation pattern of the summer rainfall area, first pointed out by Tyson & Dyer (1975). The dendrochronological data obtained by Hall (1976) from a study of growth rings of a very old yellowwood tree (Podocarpus falcatus) in Natal also appear to indicate a superimposed, long-term (80-100 years) pattern of alternating below-average and above-average rainfall periods. The historical data relating to climate in the park during the past century or two seem to bear out such a pattern. If this can be confirmed, it will be an enormous aid not only in wildlife-management planning, but also to agriculturists, demographic planners and others. It would appear that the long, relatively dry rainfall period of 1860-1970, with its concomitant progressive desiccation of the area in question, has passed over into the next above-average rainfall era. This does not mean that there will be no further cataclysmic droughts during future rainfall trough periods. It is therefore wise to plan ahead to meet such contingencies. The present water distribution pattern in the park (natural plus artificial water) is conspicuously still well below that which pertained during dry seasons at the turn of the century, when the Sabi and Shingwedzi game reserves were proclaimed. It is the declared policy of the National Parks Board of Trustees to simulate natural regulating mechanisms as closely as possible. In consequence the artificial water-for-game programme is a long way from completion. The large numbers of game animals in the park (including dominant species such as elephant Loxodonta africana and buffalo Syncerus caffer) can no longer migrate out of the area to escape natural catastrophes (such as the crippling droughts of 1911-1917, the

  16. Simulation tools

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools have made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A considerable number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  17. Centennial annual general meeting of the CIM/CMMI/MIGA. Montreal `98: a vision for the future; 100 years of ground subsidence studies

    Chrzanowski, A.; Szostak-Chrzanowski, A.; Forrester, D.J. [University of New Brunswick, Fredericton, NB (Canada)

    1998-12-31

    Some of the empirical methods developed in central Europe for monitoring and analysis of ground subsidence have been adapted to North American conditions. A century of subsidence observations in Cape Breton is outlined. Empirical methods are being replaced by deterministic modelling of rock behaviour, which applies numerical methods to the development of subsidence models. These deterministic models can be verified by monitoring under diverse geological and mining conditions. Some of the new monitoring methods developed in Canada are illustrated by case studies describing the use of hydrographic surveys to measure subsidence in offshore coal mines, a telemetric monitoring system for a coal mine in British Columbia, and deterministic monitoring and modelling of ground subsidence in a potash mine. 29 refs., 9 figs., 2 tabs.

  18. Decision support tool for Virtual Power Players: Hybrid Particle Swarm Optimization applied to Day-ahead Vehicle-To-Grid Scheduling

    Soares, João; Vale, Zita; Morais, Hugo

    2013-01-01

    This paper presents a decision support tool methodology to help virtual power players (VPPs) in the smart grid (SG) context to solve the day-ahead energy resource scheduling problem, considering the intensive use of Distributed Generation (DG) and Vehicle-To-Grid (V2G). The main focus is the application of a new hybrid method combining a particle swarm approach and a deterministic technique based on mixed-integer linear programming (MILP) to solve the day-ahead scheduling minimizing total operation cost...

  19. Proposition of a PLM tool to support textile design: A case study applied to the definition of the early stages of design requirements

    SEGONDS, Frédéric; Mantelet, Fabrice; Nelson, Julien; Gaillard, Stéphane

    2015-01-01

    The current climate of economic competition forces businesses to adapt more than ever to the expectations of their customers. Faced with new challenges, practices in textile design have evolved in order to be able to manage projects in new work environments. After presenting a state of the art overview of collaborative tools used in product design and making functional comparison between PLM solutions, our paper proposes a case study for the development and testing of a collaborative platform...

  20. Applying standards to ICT models, tools and data in Europe to improve river basin networks and spread innovation on water sector

    Pesquer, Lluís; Jirka, Simon; van de Giesen, Nick; Masó, Joan; Stasch, Christoph; Van Nooyen, Ronald; Prat, Ester; Pons, Xavier

    2015-04-01

    This work describes the strategy of the European Horizon 2020 project WaterInnEU. Its vision is to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to the water sector and to establish suitable conditions for new market opportunities based on these offerings. The main goals are: • Connect the research results and developments of previous EU-funded activities with the data already available at the European level, and with the companies that are able to offer products and services based on these tools and data. • Offer an independent marketplace platform, complemented by technical and commercial expertise as a service, that allows users to access the products and services best fitting their priorities, capabilities and procurement processes. One of the pillars of WaterInnEU is to stimulate and prioritize the application of international standards in ICT tools and policy briefs. The standardization of formats, services and processes will allow for harmonized water management across different sectors, fragmented areas and scales (local, regional or international). Several levels of interoperability will be addressed: • Syntactic: connecting systems and tools together. Syntactic interoperability allows client and service tools to automatically discover, access, and process data and information (query and exchange parts of a database) and to connect to each other in process chains. The discovery of water-related data is achieved using metadata cataloguing standards, in particular the one adopted by the INSPIRE directive: the OGC Catalogue Service for the Web (CSW). • Semantic: sharing a pan-European conceptual framework. This is the ability of computer systems to exchange data with unambiguous, shared meaning. The project therefore addresses not only the packaging of data (syntax), but also the simultaneous transmission of the meaning with the data (semantics). This is accomplished by linking

  1. An H-formulation-based three-dimensional hysteresis loss modelling tool in a simulation including time varying applied field and transport current: the fundamental problem and its solution

    When analytic solutions are not available, finite-element-based tools can be used to simulate hysteresis losses in superconductors with various shapes. A widely used tool for the corresponding magnetoquasistatic problem is based on the H-formulation of the eddy current model, where H is the magnetic field intensity. In this paper, we study this type of tool in a three-dimensional simulation problem. We consider a case where we simultaneously apply both a time-varying external magnetic field and a transport current to a twisted wire. We show how the modelling decisions (air has a high but finite resistivity, and the applied field determines the boundary condition) affect the current density distribution along the wire. According to the results, the wire carries the imposed net current only on the boundary of the modelling domain, but not inside it: the current diffuses into the air and back to the boundary. To fix this problem, we present another formulation in which air is treated as a region with zero conductivity. Correspondingly, we express H in the air with a scalar potential and a cohomology basis function that enforces the net-current condition. As shown in this paper, this formulation does not fail in these so-called AC-AC (time-varying transport current and applied magnetic field) simulations. (paper)

  2. 100 years of the main mine rescue service. A contribution to the protection against disasters in the coal mining industry; 100 Jahre Hauptstelle fuer das Grubenrettungswesen. Ein Beitrag zum Katastrophenschutz im Steinkohlenbergbau

    Hermuelheim, Walter [RAG Aktiengesellschaft, Herne (Germany). Zentralbereich Arbeits-, Gesundheits- und Umweltschutz

    2011-06-15

    A review of 100 years of protection against disasters in the coal mining industry impressively shows the way from an era of major accidents to a modern branch of industry, which justifiably and with good prospects of success can pursue the aim of "No accidents - no damage to health - no damage to the environment". However, the development of the mine rescue service over more than 100 years - represented in the Ruhr by the Main Mine Rescue Service established in 1910 in Essen - would be incomplete without consideration of the allied technical fields of underground fire protection and explosion protection. Cooperation between institutions such as the Tremonia test mine and the BVG has produced a safety level in all three fields which is regarded as exemplary worldwide and, in addition to the latest mining technology, is a good advertisement for the German coal mining industry. (orig.)

  3. Decision support tool for Virtual Power Players: Hybrid Particle Swarm Optimization applied to Day-ahead Vehicle-To-Grid Scheduling

    Soares, João; Vale, Zita; Morais, Hugo

    2013-01-01

    This paper presents a decision support tool methodology to help virtual power players (VPPs) in the smart grid (SG) context to solve the day-ahead energy resource scheduling problem, considering the intensive use of Distributed Generation (DG) and Vehicle-To-Grid (V2G). The main focus is the application of a new hybrid method combining a particle swarm approach and a deterministic technique based on mixed-integer linear programming (MILP) to solve the day-ahead scheduling minimizing total operation costs from the aggregator point of view. A realistic mathematical formulation, considering the electric network constraints and V2G charging and discharging efficiencies, is presented. Full AC power flow calculation is included in the hybrid method to allow taking the network constraints into account. A case study with a 33-bus distribution network and 1800 V2G resources is used to illustrate the performance...
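
    The particle swarm component of such a hybrid method can be illustrated with a generic minimization loop. The sketch below is not the authors' implementation: the cost function, bounds and coefficients are placeholders, and the deterministic MILP refinement stage of the paper is omitted.

```python
# A minimal particle swarm optimization (PSO) sketch; the objective is a
# placeholder standing in for a day-ahead operation-cost function.
import numpy as np

rng = np.random.default_rng(0)

def operation_cost(x):
    # Placeholder convex cost; a real VPP objective would price energy,
    # DG dispatch and V2G charging/discharging per period.
    return np.sum((x - 0.3) ** 2)

dim, n_particles = 24, 30          # e.g. one decision per hourly period
pos = rng.uniform(0, 1, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([operation_cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
for _ in range(200):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)  # keep particles inside simple bounds
    cost = np.array([operation_cost(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("best cost:", pbest_cost.min())
```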

  4. Modified Linear Theory Aircraft Design Tools and Sonic Boom Minimization Strategy Applied to Signature Freezing via F-function Lobe Balancing

    Jung, Timothy Paul

    Commercial supersonic travel has strong business potential; however, in order for the Federal Aviation Administration to lift its ban on supersonic flight overland, designers must reduce aircraft sonic boom strength to an acceptable level. An efficient methodology and associated tools for designing aircraft for minimized sonic booms are presented. The computer-based preliminary design tool, RapidF, based on modified linear theory, enables quick assessment of an aircraft's sonic boom, with run times of less than 30 seconds on a desktop computer. A unique feature of RapidF is that it tracks where on the aircraft each segment of the sonic boom came from, enabling precise modifications and speeding the design process. Sonic booms from RapidF are compared to flight test data, showing that it is capable of predicting sonic boom duration, overpressure, and interior shock locations. After the preliminary design is complete, scaled flight tests should be conducted to validate the low-boom design. When conducting such tests, it is insufficient to scale just the length; thus, equations to scale the weight and propagation distance are derived. Using RapidF, a conceptual supersonic business jet design is presented that uses F-function lobe balancing to create a frozen sonic boom using lifting surfaces. The leading shock is reduced from 1.4 to 0.83 psf, and the trailing shock from 1.2 to 0.87 psf, 41% and 28% reductions respectively. By changing the incidence angle of the surfaces, different sonic boom shapes can be created, allowing the lobes to be re-balanced for new flight conditions. Computational fluid dynamics is conducted to validate the sonic boom predictions. Off-design analysis is presented that varies weight, altitude, Mach number, and propagation angle, demonstrating that lobe balancing is robust. Finally, the Perceived Level of Loudness metric is analyzed, resulting in a modified design that incorporates other boom minimization techniques to further reduce

  5. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations

    Pinho, Ludmila A. G.; Sá-Barreto, Lívia C. L.; Infante, Carlos M. C.; Cunha-Filho, Marcílio S. S.

    2016-04-01

    The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at the wavelengths 259 and 321 nm, respectively. Various tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity for quantifying small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures proved to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form.
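
    The mixture analysis described above reduces to solving a 2x2 linear system: at each wavelength, the total absorbance is the sum of the two components' contributions. A minimal sketch with hypothetical molar absorptivities; real values would come from calibration curves of pure BNZ and ITZ standards.

```python
import numpy as np

# Hypothetical molar absorptivities (L mol^-1 cm^-1) at the two wavelengths;
# real values would come from calibration with pure standards.
E = np.array([[14500.0, 3200.0],   # 259 nm: [BNZ, ITZ]
              [1800.0,  9800.0]])  # 321 nm: [BNZ, ITZ]

A = np.array([0.652, 0.417])       # measured absorbances of the mixture

# Lambert-Beer law for a two-component mixture (1 cm path): A = E @ c
c = np.linalg.solve(E, A)
print(f"BNZ: {c[0]:.2e} mol/L, ITZ: {c[1]:.2e} mol/L")
```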

  6. Trends in nuclear physics. 100 years later

    In the first years after the discovery of radioactivity it became clear that nuclear physics was, par excellence, the science of small quantum systems. Between the fifties and the eighties, nuclear physics and elementary particle physics lived their own lives, without much interaction; during this period the basic concepts were defined. Recently, contrary to the specialization law often observed in science, the boundary between nuclear and elementary particle physics has become somewhat blurred. This Les Houches Summer School was set up with the aim of fighting off the excessive specialization evident in many international meetings, and of returning to the roots. The twofold challenge of setting up a fruitful exchange between experimentalists and theorists in the first place, and between nuclear and hadronic matter physicists in the second place, was successfully met. The volume presents high-quality, up-to-date reviews, starting with an account of the birth and first developments of nuclear physics. Further chapters discuss the description of nuclear structure, the physics of nuclei at very high spin, the existence of super-heavy nuclei as a consequence of shell structure, and the liquid-gas transition, including both a description and a review of the experimental situation. Other topics dealt with include the interactions between moderately relativistic heavy ions, the concept of a nucleon dressed by a cloud of pions, the presence of pions in the nucleus, and subnucleonic phenomena in nuclei and the quark-gluon deconfinement transition, in both theoretical and experimental aspects. Nuclear physics continues to influence many other fields, such as astrophysics, and is also inspired by these same fields. This cross-fertilisation is illustrated by the treatment of neutron stars in one of the final chapters. The last chapter provides an overview of a recent development in which particle and nuclear physicists have cooperated to revitalize an alternative method for nuclear energy production, associating high-energy accelerators with sub-critical neutron-multiplying assemblies.

  7. Appraising Schumpeter's "Essence" after 100 years

    Andersen, Esben Sloth

    Schumpeter's unique type of evolutionary analysis can hardly be understood unless we recognise that he developed it in relation to a study of the strengths and weaknesses of the Walrasian form of neoclassical economics. This development was largely performed in his first book 'Wesen und Hauptinhalt der theoretischen Nationalökonomie'. This German-language book - which in English might be called 'Essence and Scope of Theoretical Economics' - was published a century ago (in 1908). Different readings of Wesen provide many clues about the emergence and structure of Schumpeter's programme for teaching and research. This programme included a modernisation of static economic analysis, but he concentrated on the difficult extension of economic analysis to cover economic evolution. Schumpeter thought that this extension required a break with basic neoclassical assumptions, but he tried to avoid...

  8. Lurpak: Ready for another 100 years?

    Grunert, Klaus G.

    2001-01-01

    The Lur mark - the forerunner and very foundation of Lurpak butter - celebrates its 100th anniversary this year. That is an unusual and impressive lifetime for a consumer goods brand and something the Danish dairy sector can be proud of.

  9. Peacock: 100 years of servicing Canadian industry

    In 1997 Peacock Inc., a supplier of pipeline, filtration, pumping, materials handling and mechanical equipment of all kinds to the Canadian oil and natural gas industries, will celebrate its 100th year of servicing Canadian industry, and 50th year in the oil patch. The company has outlets in several Canadian cities from Halifax to Vancouver. It manufactures, distributes, maintains and repairs all types of industrial equipment. It also manages the Naval Engineering Test Establishment at LaSalle, PQ, for the Department of Defence. Peacock service centres provide 24-hour service response to emergency breakdowns anywhere in Canada; its engineers and technicians are ISO 9003 qualified or better, and are experts in turnarounds and planned maintenance outages, major overhauls of critical equipment, supplying mechanical crews for emergency equipment breakdowns, and grouting of heavy machinery. By close coordination of its four divisions, and by maintaining their dedication to service, the company looks to the future with pride and confidence

  10. Scoring functions--the first 100 years.

    Tame, Jeremy R H

    2005-06-01

    The use of simple linear mathematical models to estimate chemical properties is not a new idea. Albert Einstein used very simple 'gravity-like' forces to explain the capillarity of different liquids in 1900-1901. Today such models are used in more complicated situations, and a great many have been developed to analyse interactions between proteins and their ligands. This is not surprising, since proteins are too complicated to model accurately without lengthy numerical analysis, and simple models often do at least as good a job in predicting binding constants as much more computationally expensive methods. One hundred years after Einstein's 'miraculous year' in which he transformed physics, it is instructive to recall some of his even earlier work. As approximations, 'scoring functions' are excellent, but it is dangerous to read too much into them. A few cautionary tales are presented for the beginner to the field of ligand affinity prediction by linear models. PMID:16231202
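
    In the spirit of the simple linear models discussed above, a scoring function is essentially a weighted sum of interaction descriptors. A minimal sketch with hypothetical weights and descriptors; real scoring functions fit such coefficients to experimental binding data.

```python
# Hypothetical descriptor weights; real empirical scoring functions
# regress such coefficients against measured binding constants.
WEIGHTS = {"h_bonds": -1.2, "lipophilic_contacts": -0.17, "rotatable_bonds": 0.4}
INTERCEPT = -2.0

def score(descriptors):
    """Predicted binding free energy (kcal/mol) as a weighted sum of
    simple interaction counts, in the spirit of empirical scoring functions."""
    return INTERCEPT + sum(WEIGHTS[k] * v for k, v in descriptors.items())

print(score({"h_bonds": 3, "lipophilic_contacts": 25, "rotatable_bonds": 5}))
```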

  11. 100 years of ionizing radiation protection

    The development of radiation protection from the end of the 19th century and the evolution of opinion about the injurious effects of ionizing radiation are presented. Observations of the undesirable effects of exposure to ionizing radiation, together with progress in radiobiology and dosimetry, directed efforts toward radiation protection. These activities initially covered a limited number of persons and were subsequently extended to the whole population. The current means, goals and regulations of radiological control are discussed.

  12. The Flexner Report--100 years later.

    Duffy, Thomas P

    2011-09-01

    The Flexner Report of 1910 transformed the nature and process of medical education in America with a resulting elimination of proprietary schools and the establishment of the biomedical model as the gold standard of medical training. This transformation occurred in the aftermath of the report, which embraced scientific knowledge and its advancement as the defining ethos of a modern physician. Such an orientation had its origins in the enchantment with German medical education that was spurred by the exposure of American educators and physicians at the turn of the century to the university medical schools of Europe. American medicine profited immeasurably from the scientific advances that this system allowed, but the hyper-rational system of German science created an imbalance in the art and science of medicine. A catching-up is under way to realign the professional commitment of the physician with a revision of medical education to achieve that purpose. PMID:21966046

  13. THE CASE STUDY TASKS AS A BASIS FOR THE FUND OF THE ASSESSMENT TOOLS AT THE MATHEMATICAL ANALYSIS FOR THE DIRECTION 01.03.02 APPLIED MATHEMATICS AND COMPUTER SCIENCE

    Dina Aleksandrovna Kirillova

    2015-12-01

    The modern reform of Russian higher education involves the implementation of a competence-based approach, the main idea of which is the practical orientation of education. Mathematics is a universal language for the description, modeling and study of phenomena and processes of different natures. Therefore, creating a fund of assessment tools for mathematical disciplines based on applied problems is topical. The case method is the most appropriate means of monitoring learning outcomes, as it is aimed at bridging the gap between theory and practice. The aim of the research is the development of methodical materials for creating a fund of assessment tools based on case studies for mathematical analysis for the direction «Applied Mathematics and Computer Science». The aim follows from the contradiction between the need to introduce the case method into the educational process in higher education and the lack of study of the theoretical foundations of using this method as applied to mathematical disciplines, the insufficient theoretical basis, and the lack of description of the process of creating case problems for use in monitoring learning outcomes.

  14. Solar geometry tool applied to systems and bio-climatic architecture; Herramienta de geometria solar aplicada a sistemas y arquitectura bio-climatica

    Urbano, Antonio; Matsumoto, Yasuhiro; Aguilar, Jaime; Asomoza Rene [CIMVESTAV-IPN, Mexico, D.F (Mexico)

    2000-07-01

    The present article shows the annual solar paths by means of Cartesian graphs, as well as their use, taking as a basis the astronomical and geographical characteristics of the site. These graphs indicate the hours of sun along the day, month and year for a latitude of 19 degrees north, as well as the hourly solar radiation values for the most important declinations occurring annually (equinoxes, solstices and the intermediate months). The graphs help the user to assess obstacles in the surroundings and to determine, on site, the shadows cast on solar equipment or buildings (mountains, trees, buildings, windows, terraces, domes, etc.), the hours of sun, or the radiation required for the desired bio-climatic calculation. The present work is a site-engineering tool for architects, designers, manufacturers, planners, installers and energy auditors, among others, who require the use of solar energy for any of its multiple applications.
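
    The solar-path charts described above rest on two standard relations: the solar declination over the year and the sunrise hour angle, which together give the hours of sun for a given latitude. A minimal sketch for latitude 19° north, using Cooper's widely used approximation for the declination:

```python
import math

def declination_deg(day_of_year):
    """Cooper's approximation of the solar declination (degrees)."""
    return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

def day_length_hours(lat_deg, day_of_year):
    """Hours of sun from the sunrise hour angle: cos(ws) = -tan(lat)tan(dec)."""
    lat = math.radians(lat_deg)
    dec = math.radians(declination_deg(day_of_year))
    cos_ws = -math.tan(lat) * math.tan(dec)
    cos_ws = max(-1.0, min(1.0, cos_ws))  # clamp for polar edge cases
    return 2.0 * math.degrees(math.acos(cos_ws)) / 15.0

# Latitude 19 degrees north, as in the charts described above
for name, doy in [("March equinox", 80), ("June solstice", 172),
                  ("December solstice", 355)]:
    print(f"{name}: {day_length_hours(19.0, doy):.2f} h of sun")
```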

  15. Science serving people. IAEA-supported projects are helping countries apply the right tools to fight food, health, and water problems

    A new booklet 'Science Serving People' features stories about how IAEA-supported projects are making a difference in many poorer countries. The stories describe applications of nuclear science and technology that are being used through technical cooperation channels to overcome challenges of water scarcity, food shortage, malnutrition, malaria, environmental degradation and many other problems. They also illustrate how the complementary development, safety, and security initiatives of the IAEA are fostering atoms for peace in the developing world. Extreme poverty and deprivation remain a problem of monumental proportions at the dawn of the 21st century, notes IAEA Director General Mohamed ElBaradei in the booklet's Introduction. Through effective partnerships, collaborative research, and strategic direction, the IAEA is contributing to global efforts to help the poor. IAEA programmes have entered an important phase, he said, in which scientific contributions to Member States are yielding very sizeable human benefits. It's clear that science and technology must be better mobilized to meet the needs of the poor, emphasizes Jeffrey Sachs, Director of the Earth Institute at Columbia University, USA, and Special Advisor to UN Secretary-General Kofi Annan. The UN agencies, such as the IAEA, have a great role to play, he says in the booklet's Foreword. This is especially so, he points out, if they act as a bridge between the activities of advanced- country and developing country scientific centres, and if they help to harness the advances of world science for the poor as well as the rich. The bottom line, he concludes, is that rich countries should expand support for those United Nations organizations that can help in solving the unique problems confronting the world's poorest peoples. The booklet features stories on managing water resources, promoting food security, focusing science on health problems, new tools for environmental management, and strengthening nuclear

  16. FAMUS (Flow Assurance by Management of Uncertainty and Simulation): a new tool for integrating flow assurance effects in traditional RAM (Reliability, Availability and Maintainability) analysis applied on a Norwegian Offshore System

    Eisinger, Siegfried; Isaksen, Stefan; Grande, Oystein [Det Norske Veritas (DNV), Oslo (Norway); Chame, Luciana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Traditional RAM (Reliability, Availability and Maintainability) models fall short of taking flow assurance effects into account. In many oil and gas production systems, flow assurance issues like hydrate formation, wax deposition or particle erosion may cause a substantial number of production upsets. Flow assurance issues are complex and hard to quantify in a production forecast; however, without taking them into account the RAM model generally overestimates the predicted system production. This paper demonstrates the FAMUS concept, a method and a tool for integrating RAM and flow assurance into one model, providing a better foundation for decision support. FAMUS therefore utilises both discrete-event and thermo-hydraulic simulation. The method is currently applied as a decision support tool in an early phase of the development of an offshore oil field on the Norwegian continental shelf. (author)
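
    The core idea of combining RAM with flow assurance can be illustrated with a toy Monte Carlo simulation in which production is interrupted both by classical equipment failures and by flow-assurance upsets such as hydrate plugs. This is not the FAMUS tool itself; all rates below are invented for illustration.

```python
import random

random.seed(1)

def simulate_production(days=365, mtbf_equipment=60.0, repair_days=3,
                        hydrate_daily_prob=0.01, hydrate_days=2):
    """Toy daily-resolution Monte Carlo of production availability:
    classical equipment failures plus flow-assurance (hydrate) upsets.
    All rates are illustrative placeholders."""
    produced = 0
    down_until = 0
    for day in range(days):
        if day < down_until:
            continue                                # still down
        if random.random() < 1.0 / mtbf_equipment:  # equipment failure
            down_until = day + repair_days
        elif random.random() < hydrate_daily_prob:  # hydrate plug
            down_until = day + hydrate_days
        else:
            produced += 1
    return produced / days

runs = [simulate_production() for _ in range(1000)]
print("mean production availability:", sum(runs) / len(runs))
```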

  17. Multivariate curve resolution applied to in situ X-ray absorption spectroscopy data: An efficient tool for data processing and analysis

    Highlights: • Use of MCR algorithms to extract component spectra of different kinetic evolution. • Obtaining components and concentration profiles without use of reference spectra. • Automatic extraction of meaningful component profiles from large XAS datasets. - Abstract: Large datasets containing many spectra, commonly associated with in situ or operando experiments, call for new data treatment strategies, as conventional scan-by-scan data analysis methods have become a time-consuming bottleneck. Several convenient automated data processing procedures, like least-squares fitting of reference spectra, exist but are based on assumptions. Here we present the application of multivariate curve resolution (MCR) as a blind-source separation method to efficiently process a large data set of an in situ X-ray absorption spectroscopy experiment in which the sample undergoes a periodic concentration perturbation. MCR was applied to data from a reversible reduction–oxidation reaction of a rhenium-promoted cobalt Fischer–Tropsch synthesis catalyst. The MCR algorithm was capable of extracting, in a highly automated manner, the component spectra with different kinetic evolutions together with their respective concentration profiles, without the use of reference spectra. The modulated nature of our experiments allows for averaging of a number of identical periods and hence an increase in the signal-to-noise ratio (S/N), which is efficiently exploited by MCR. The practical and added value of the approach in extracting information from large and complex datasets, typical for in situ and operando studies, is highlighted.
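
    MCR is typically solved by alternating least squares: the data matrix D (scans × energies) is factored into non-negative concentration profiles C and component spectra S. A minimal sketch on synthetic two-component data; the actual analysis involves additional constraints and convergence checks.

```python
import numpy as np

def mcr_als(D, C0, n_iter=100):
    """Minimal MCR-ALS: factor D (scans x energies) into non-negative
    concentration profiles C and component spectra S with D ~ C @ S."""
    C = C0.copy()
    for _ in range(n_iter):
        S = np.linalg.lstsq(C, D, rcond=None)[0]        # spectra given C
        S = np.clip(S, 0.0, None)
        C = np.linalg.lstsq(S.T, D.T, rcond=None)[0].T  # concentrations given S
        C = np.clip(C, 0.0, None)
        C /= C.sum(axis=1, keepdims=True) + 1e-12       # closure constraint
    return C, S

# Synthetic two-component demo standing in for a modulated XAS experiment
t = np.linspace(0, 4 * np.pi, 200)
C_true = np.column_stack([(np.sin(t) + 1) / 2, (np.cos(t) + 1) / 2])
C_true /= C_true.sum(axis=1, keepdims=True)
e = np.linspace(0, 1, 150)
S_true = np.vstack([np.exp(-(e - 0.3) ** 2 / 0.01),
                    np.exp(-(e - 0.6) ** 2 / 0.02)])
D = C_true @ S_true + 0.01 * np.random.default_rng(0).normal(size=(200, 150))

C_hat, S_hat = mcr_als(D, C0=np.random.default_rng(1).random((200, 2)))
print("residual:", np.linalg.norm(D - C_hat @ S_hat))
```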

  18. FoodChain-Lab: A Trace-Back and Trace-Forward Tool Developed and Applied during Food-Borne Disease Outbreak Investigations in Germany and Europe.

    Weiser, Armin A; Thöns, Christian; Filter, Matthias; Falenski, Alexander; Appel, Bernd; Käsbohrer, Annemarie

    2016-01-01

    FoodChain-Lab is modular open-source software for trace-back and trace-forward analysis in food-borne disease outbreak investigations. Development of FoodChain-Lab has been driven by a need for appropriate software in several food-related outbreaks in Germany since 2011. The software allows integrated data management, data linkage, enrichment and visualization as well as interactive supply chain analyses. Identification of possible outbreak sources or vehicles is facilitated by calculation of tracing scores for food-handling stations (companies or persons) and food products under investigation. The software also supports consideration of station-specific cross-contamination, analysis of geographical relationships, and topological clustering of the tracing network structure. FoodChain-Lab has been applied successfully in previous outbreak investigations, for example during the 2011 EHEC outbreak and the 2013/14 European hepatitis A outbreak. The software is most useful in complex, multi-area outbreak investigations where epidemiological evidence may be insufficient to discriminate between multiple implicated food products. The automated analysis and visualization components would be of greater value if trading information on food ingredients and compound products was more easily available. PMID:26985673
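
    The trace-back idea can be illustrated with a toy delivery graph in which each station is scored by how many of the outbreak locations it can reach downstream. This simplified score is for illustration only and is not FoodChain-Lab's actual scoring algorithm.

```python
import networkx as nx

# Toy delivery network: edges point from supplier to recipient.
G = nx.DiGraph([
    ("farm_A", "wholesaler"), ("farm_B", "wholesaler"),
    ("wholesaler", "retailer_1"), ("wholesaler", "retailer_2"),
    ("farm_B", "retailer_3"),
])
outbreak_sites = {"retailer_1", "retailer_2", "retailer_3"}

# Score each station by the share of outbreak sites it can reach downstream;
# a source reaching all outbreak sites is a stronger candidate vehicle origin.
for station in G.nodes:
    reachable = nx.descendants(G, station)
    hits = len(reachable & outbreak_sites)
    print(station, hits / len(outbreak_sites))
```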

  19. Management Tools

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle, that can be used in such applications as scheduling, resource allocation project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that can be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage and professional services.

  20. How credible are the study results? Evaluating and applying internal validity tools to literature-based assessments of environmental health hazards.

    Rooney, Andrew A; Cooper, Glinda S; Jahnke, Gloria D; Lam, Juleen; Morgan, Rebecca L; Boyles, Abee L; Ratcliffe, Jennifer M; Kraft, Andrew D; Schünemann, Holger J; Schwingl, Pamela; Walker, Teneille D; Thayer, Kristina A; Lunn, Ruth M

    2016-01-01

    Environmental health hazard assessments are routinely relied upon for public health decision-making. The evidence base used in these assessments is typically developed from a collection of diverse sources of information of varying quality. It is critical that literature-based evaluations consider the credibility of individual studies used to reach conclusions through consistent, transparent and accepted methods. Systematic review procedures address study credibility by assessing internal validity or "risk of bias" - the assessment of whether the design and conduct of a study compromised the credibility of the link between exposure/intervention and outcome. This paper describes the commonalities and differences in risk-of-bias methods developed or used by five groups that conduct or provide methodological input for performing environmental health hazard assessments: the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) Working Group, the Navigation Guide, the National Toxicology Program's (NTP) Office of Health Assessment and Translation (OHAT) and Office of the Report on Carcinogens (ORoC), and the Integrated Risk Information System of the U.S. Environmental Protection Agency (EPA-IRIS). Each of these groups have been developing and applying rigorous assessment methods for integrating across a heterogeneous collection of human and animal studies to inform conclusions on potential environmental health hazards. There is substantial consistency across the groups in the consideration of risk-of-bias issues or "domains" for assessing observational human studies. There is a similar overlap in terms of domains addressed for animal studies; however, the groups differ in the relative emphasis placed on different aspects of risk of bias. Future directions for the continued harmonization and improvement of these methods are also discussed. PMID:26857180

  1. Advanced experimental tools designed for the assessment of the thermal load applied to the mixing tee and nozzle geometries in the PWR plant

    Thermal fatigue studies were restarted after the incident in the heat removal system of the Civaux NPP (May 1998). A thermal fatigue problem was suspected: the cracks that occurred in the mixing tee were probably due to temperature fluctuations of large amplitude. This paper introduces the experimental strategy directed by the CEA for assessing the thermal load in the mixing area. Two mixing-tee mockups of similar geometry are tested in the FATHERINO facility under similar thermal-hydraulic conditions: the first, in brass, for selecting the mixing areas where the temperature fluctuation is high, and the second, in stainless steel, for taking measurements with local specific sensors to determine the thermal load. The specific sensors - a fluid temperature (Tf) sensor, a fluxmeter sensor, and the Coefh sensor - record locally the fluctuations close to the wall, in the fluid and in the wall. The heat flux sensor ('fluxmeter') and the heat transfer sensor ('Coefh') are each equipped with 3 micro thermocouples in their body, are non-intrusive, and are specifically designed to catch fluctuations with low attenuation in the frequency range from 0 to 20 Hz. By applying an inverse heat conduction method to the output data of the fluxmeter, the wall temperature (mean and fluctuating values) at the internal surface can be accurately determined. The Coefh sensor is like a fluxmeter sensor using the same technology, but equipped with an additional thermocouple in the fluid to determine the heat transfer coefficient. In addition, the results from both experiments (brass and stainless steel mockups) are used as input data for CFD calculations. The FATHERINO experiment consists of emptying two vessels into the mockup, initially filled with water and conditioned at low (5 °C) and high (80 °C) temperature. There is a motor pump for each line (cold and hot legs) and the flow rate is controlled by a valve.
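
    The 0-20 Hz design target for the sensors reflects how quickly periodic temperature fluctuations decay with depth in a wall: for a semi-infinite solid, the amplitude at depth x falls off as exp(-x·sqrt(πf/α)). A minimal sketch with a typical stainless-steel diffusivity and a hypothetical thermocouple depth:

```python
import math

def amplitude_ratio(depth_m, freq_hz, alpha_m2_s):
    """Fraction of a surface temperature oscillation remaining at a given
    depth in a semi-infinite solid: exp(-x * sqrt(pi * f / alpha))."""
    return math.exp(-depth_m * math.sqrt(math.pi * freq_hz / alpha_m2_s))

ALPHA_STEEL = 4.0e-6  # m^2/s, typical austenitic stainless steel
DEPTH = 0.5e-3        # hypothetical micro-thermocouple depth: 0.5 mm

# Higher frequencies are damped much more strongly, which is why the
# thermocouples must sit very close to the wetted surface.
for f in (0.1, 1.0, 10.0, 20.0):
    print(f"{f:5.1f} Hz -> {amplitude_ratio(DEPTH, f, ALPHA_STEEL):.2f}")
```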

  2. The Magnetosusceptibility Stratigraphy (MS) Applied as a Correlation and High Precision Relative Dating Tool in Archaeology: Application to Caves in Spain and Portugal

    Ellwood, B. B.; Arbizu, M.; Arsuaga, J.; Harrold, F.; Zilhao, J.; Adán, G. E.; Aramburu, A.; Fombella, M. A.; Bedia, I. M.; Alvarez-Laó, D.; García, M.

    2005-05-01

    The magnetic susceptibility (MS) method, when carefully applied, can be used to correlate sediment sequences and to characterize the paleoclimate at the time the sediments were deposited in protected archaeological sites, such as within caves or deep rock shelters. This method works because the MS of sediments outside caves, which are eventually deposited in caves, is controlled by pedogenesis, which in turn is driven by climate. Here we summarize the method and discuss ways to identify anomalous samples that should not be used in relative dating or for correlations. We then present our results from Cueva del Conde, located in the Province of Asturias, northwestern Spain, and compare those results with results from other caves in Spain and Portugal. Cueva del Conde was first excavated in 1915, with additional excavations and studies performed in 1962, 1965, and 1999. The current excavations began in 2001. This body of work identified a transitional sequence from Middle Paleolithic (Mousterian) to early Upper Paleolithic (Aurignacian) artifacts, including perhaps the earliest art known from the Upper Paleolithic, thus establishing Cueva del Conde as an important Paleolithic cave site. We collected a continuous series of 44 samples, each covering about 0.027 m of section, from an exposed 1.2 m sequence within the cave. This section has been excavated and studied by archaeologists working at the site, and three 14C dates from charcoal have been reported. The MS of the samples collected for this study was measured using the susceptibility bridge at LSU. The MS shows a systematic cyclicity that, when constrained by the 14C ages, can be correlated to our MS standard curve for Europe (Ellwood et al., 2001; Harrold et al., 2004), and thus to other sites in the region. This cyclicity we interpret to result from climate fluctuations. By comparison with our MS standard curves, we are able to assign MS relative ages to Cueva del Conde that extend the sequence
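
    Correlating two MS depth series, as done against the standard curve above, can be illustrated by searching for the lag that maximizes their Pearson correlation. A minimal sketch on synthetic series; real MS correlation also involves depth-to-time scaling and expert judgment.

```python
import numpy as np

def best_lag(ms_a, ms_b, max_lag=10):
    """Lag (in samples) that maximises the Pearson correlation between two
    standardised magnetic-susceptibility depth series."""
    a = (ms_a - ms_a.mean()) / ms_a.std()
    b = (ms_b - ms_b.mean()) / ms_b.std()
    best = (0, -np.inf)
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[lag:], b[:len(b) - lag]
        else:
            x, y = a[:lag], b[-lag:]
        n = min(len(x), len(y))
        r = np.corrcoef(x[:n], y[:n])[0, 1]
        if r > best[1]:
            best = (lag, r)
    return best

# Synthetic demo: series B is series A shifted by 4 samples plus noise
rng = np.random.default_rng(0)
a = np.sin(np.linspace(0, 6 * np.pi, 120)) + 0.1 * rng.normal(size=120)
b = np.roll(a, 4)
print(best_lag(a, b))  # with this indexing convention the shift shows as lag -4
```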

  3. 100 years of thermal waste treatment - off-gas purification technology then and now. Performance results of the Stellinger Moor waste incineration plant at Hamburg; 100 Jahre thermische Abfallbehandlung - Abgasreinigungstechnik damals und heute, Betriebserfahrungen der MVA Stellinger Moor, Hamburg

    Franck, J. [Stadtreinigung Hamburg, MVA Stellinger Moor, Hamburg (Germany); Schellenberger, I. [Goepfert, Reimer und Partner, Hamburg (Germany); Karpinski, A. [Lentjes Energietechnik GmbH, Essen (Germany)

    1997-06-01

    The contribution outlines the history of thermal waste treatment, starting from the first such plant constructed at Hamburg-Bullerdeich. It also goes into the social context in which the decision to construct such a plant was made. As an example of a modern, up-to-date system, the Stellinger Moor plant at Hamburg is described for comparison. The flue gas purification process employed here reflects the most recent state of the art. If the technical and social boundary conditions of 100 years ago are compared with those of today, one sees how far plant technology has advanced since the first years, especially in the field of emission reduction. On the other hand, the acceptance problems facing operators of thermal waste incineration systems are still the same as 100 years ago. (orig.)

  4. The "Musical Ningboese" and the Emergence of China's Piano Industry in the Past 100 Years

    沈浩杰

    2015-01-01

    Without the piano, there would have been no development of China's new music over the past 100 years. The people of Ningbo, as the first Chinese people engaged in piano making, took advantage of their internal strengths and external environment to dominate the all-round development of the piano-related fields, including the production of spare parts and complete instruments, marketing, research and development, talent training, and piano education and service, and they have been the leaders throughout the 100 years in which China's piano making has grown from a small and weak industry into a large and strong one. This large and excellent group of piano makers, together with the other professional musicians of Ningbo origin, forms the "Musical Ningboese", a unique group in China's modern musical history.

  5. 100 years of Zukunft/Inden opencast mine. Lignite mining west of the Inde river between Eschweiler and Juelich; 100 Jahre Zukunft - Tagebau Inden. Braunkohlengewinnung westlich der Inde zwischen Eschweiler und Juelich

    Roeggener, Oliver; Oster, Arthur [RWE Power AG, Eschweiler (Germany). Tagebau Inden

    2010-11-15

    With the extraction of the first coal at the Zukunft opencast mine, industrial lignite mining in the west of the Rhenish mining area commenced 100 years ago, in September 1910. The setting up of "BIAG Zukunft" and the commissioning of the power station of the same name only a few years later triggered sustainable growth of this industry and of the entire region. In the course of the dynamic events in the years that followed, development of the Zukunft-West follow-up opencast mine was finally started, the Weisweiler briquette factory erected and extended, the Zukunft power plant's capacity built up to 230 MW and RWE's Weisweiler power station constructed. The merger of the four big Rhenish lignite-mining companies led to profound structural adjustments in the west of the mining area as well. Where the lignite had previously been extracted in different opencast mines, mining activities and power generation were now focused on dedicated companies whose evolution was driven forward strategically. The Inden mine, which was in the development phase in parallel with the Zukunft mine, was temporarily discontinued in this course of events. In 1981, it was successively re-commissioned to provide an offset for the incipient exhaustion of the Zukunft-West mine. In the further course of opencast mining, several towns have had to be resettled by today, and the Inde river and the mine's belt junction relocated. The Inden mine, too, will be exhausted around the year 2030 and will be recultivated, as the first of the still-operational Rhenish opencast mines, with a good-sized residual lake. By resolution of the Lignite Commission dated 5 December 2008, the Inden II Lignite Plan was therefore amended with the aim of creating a substantial lake instead of backfilling the final void with masses from the Hambach mine. Creating the roughly 11-km² lake will also be accompanied by economic structural change in the region. According to expert

  6. Applied Electromagnetics

    These proceedings contain papers relating to the 3rd Japanese-Bulgarian-Macedonian Joint Seminar on Applied Electromagnetics. Included are the following groups: Numerical Methods I; Electrical and Mechanical System Analysis and Simulations; Inverse Problems and Optimizations; Software Methodology; Numerical Methods II; Applied Electromagnetics

  7. Correlation Analysis Between the El Niño/La Niña Phenomenon During the Recent 100 Years and Beijing Climate

    刘桂莲; 张明庆

    2001-01-01

    Results of the analysis suggest that during the recent 100 years there exists a strong correlation between the El Niño/La Niña phenomenon and Beijing's summer rainfall (June-August), mean monthly maximum temperature (July), and mean monthly minimum temperature in winter (January). The El Niño phenomenon shows a negative correlation with summer rainfall and the winter mean monthly minimum temperature, and a positive correlation with the summer mean monthly maximum temperature, producing reduced rainfall, a larger annual temperature range, and a more continental climate. The La Niña phenomenon shows a positive correlation with summer rainfall and the winter mean monthly minimum temperature, and a negative correlation with the summer mean monthly maximum temperature, producing increased rainfall, a smaller annual temperature range, and a weaker continental character.
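
    The correlation analysis described above comes down to computing Pearson coefficients between an ENSO index and Beijing climate series. A minimal sketch with invented standardized series, not the study's data:

```python
import numpy as np

# Hypothetical standardised series: an ENSO index and Beijing summer rainfall
enso = np.array([1.2, -0.5, 0.8, -1.1, 0.3, 1.5, -0.9, -0.2, 0.7, -1.3])
rain = np.array([-0.8, 0.4, -0.6, 0.9, -0.1, -1.2, 0.7, 0.3, -0.5, 1.0])

r = np.corrcoef(enso, rain)[0, 1]
print(f"Pearson r = {r:.2f}")  # a negative r mirrors the reported relationship
```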

  8. Applied superconductivity

    Newhouse, Vernon L

    1975-01-01

    Applied Superconductivity, Volume II, is part of a two-volume series on applied superconductivity. The first volume dealt with electronic applications and radiation detection, and contains a chapter on liquid helium refrigeration. The present volume discusses magnets, electromechanical applications, accelerators, and microwave and rf devices. The book opens with a chapter on high-field superconducting magnets, covering applications and magnet design. Subsequent chapters discuss superconductive machinery such as superconductive bearings and motors; rf superconducting devices; and future prospec

  9. Applied Stratigraphy

    Lucas, Spencer G.

    Stratigraphy is a cornerstone of the Earth sciences. The study of layered rocks, especially their age determination and correlation, which are integral parts of stratigraphy, are key to fields as diverse as geoarchaeology and tectonics. In the Anglophile history of geology, in the early 1800s, the untutored English surveyor William Smith was the first practical stratigrapher, constructing a geological map of England based on his own applied stratigraphy. Smith has, thus, been seen as the first “industrial stratigrapher,” and practical applications of stratigraphy have since been essential to most of the extractive industries from mining to petroleum. Indeed, gasoline is in your automobile because of a tremendous use of applied stratigraphy in oil exploration, especially during the latter half of the twentieth century. Applied stratigraphy, thus, is a subject of broad interest to Earth scientists.

  10. Applied mathematics

    Logan, J David

    2013-01-01

    Praise for the Third Edition"Future mathematicians, scientists, and engineers should find the book to be an excellent introductory text for coursework or self-study as well as worth its shelf space for reference." -MAA Reviews Applied Mathematics, Fourth Edition is a thoroughly updated and revised edition on the applications of modeling and analyzing natural, social, and technological processes. The book covers a wide range of key topics in mathematical methods and modeling and highlights the connections between mathematics and the applied and nat

  11. Applied mineralogy

    Park, W.C.; Hausen, D.M.; Hagni, R.D. (eds.)

    1985-01-01

    A conference on applied mineralogy was held and papers were presented under the following headings: methodology (including image analysis); ore genesis; exploration; beneficiation (including precious metals); process mineralogy - low and high temperatures; and medical science applications. Two papers have been abstracted separately.

  12. Micromachining with Nanostructured Cutting Tools

    Jackson, Mark J

    2013-01-01

    The purpose of the brief is to explain how nanostructured cutting tools can be used to machine materials at the microscale, and how readers can apply such tools to micromachining applications. The book describes the application of nanostructured tools to machining engineering materials and includes methods for calculating basic features of micromachining. It explains the nature of contact between tools and workpieces to build a solid understanding of how nanostructured tools are made.
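
    The record does not say which calculations the book uses. As an illustration only, here is a minimal sketch of one standard way to estimate a basic machining quantity, the cutting force, using the Kienzle equation; the material constants below are hypothetical placeholder values:

        # Kienzle model: F_c = k_c1.1 * b * h**(1 - m_c), with chip width b
        # and uncut chip thickness h in mm, and material constants k_c1.1
        # (N/mm^2) and m_c.
        def kienzle_cutting_force(b_mm, h_mm, k_c11=1500.0, m_c=0.25):
            """Return the cutting force in newtons under the Kienzle model."""
            return k_c11 * b_mm * h_mm ** (1.0 - m_c)

        # Example: a micro-cut 0.1 mm wide, 5 micrometres uncut chip thickness.
        print(f"F_c = {kienzle_cutting_force(0.1, 0.005):.2f} N")  # ~2.8 N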

  13. Applied dynamics

    Schiehlen, Werner

    2014-01-01

    Applied Dynamics is an important branch of engineering mechanics widely applied to mechanical and automotive engineering, aerospace and biomechanics, as well as control engineering and mechatronics. The computational methods presented are based on common fundamentals. For this purpose analytical mechanics turns out to be very useful, where d'Alembert's principle in the Lagrangian formulation proves to be most efficient. The methods of multibody systems, finite element systems and continuous systems are treated consistently. Thus, students get a much better understanding of dynamical phenomena, and engineers in design and development departments using computer codes may check the results more easily by choosing models of different complexity for vibration and stress analysis.
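
    For reference, the textbook form of the mechanics the abstract alludes to (standard results, not quoted from this particular book): d'Alembert's principle in the Lagrangian formulation states that the virtual work of the lost forces vanishes, and for generalized coordinates q_i it yields Lagrange's equations of the second kind:

        \[
        \sum_k \left( m_k \ddot{\mathbf r}_k - \mathbf F_k \right) \cdot \delta \mathbf r_k = 0
        \quad\Longrightarrow\quad
        \frac{\mathrm d}{\mathrm d t}\,\frac{\partial T}{\partial \dot q_i}
        - \frac{\partial T}{\partial q_i} = Q_i ,
        \qquad
        \frac{\mathrm d}{\mathrm d t}\,\frac{\partial L}{\partial \dot q_i}
        - \frac{\partial L}{\partial q_i} = 0 \quad (L = T - V),
        \]

    where T is the kinetic energy, Q_i the generalized forces, and the second form holds when the forces derive from a potential V.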

  14. Applied optics

    The 1988 progress report of the Applied Optics Laboratory of the Polytechnic School, France, is presented. The optical fiber activities are focused on the development of an optical gyrometer containing a resonance cavity. The research program includes the following domains: infrared laser physics, laser sources, semiconductor physics, multiple-photon ionization, and nonlinear optics. Investigations in the biomedical, biological and biophysical domains are carried out. The published papers and the congress communications are listed

  15. Geometric reasoning about assembly tools

    Wilson, R.H.

    1997-01-01

    Planning for assembly requires reasoning about various tools used by humans, robots, or other automation to manipulate, attach, and test parts and subassemblies. This paper presents a general framework to represent and reason about geometric accessibility issues for a wide variety of such assembly tools. Central to the framework is a use volume encoding a minimum space that must be free in an assembly state to apply a given tool, and placement constraints on where that volume must be placed relative to the parts on which the tool acts. Determining whether a tool can be applied in a given assembly state is then reduced to an instance of the FINDPLACE problem. In addition, the author presents more efficient methods to integrate the framework into assembly planning. For tools that are applied either before or after their target parts are mated, one method pre-processes a single tool application for all possible states of assembly of a product in polynomial time, reducing all later state-tool queries to evaluations of a simple expression. For tools applied after their target parts are mated, a complementary method guarantees polynomial-time assembly planning. The author presents a wide variety of tools that can be described adequately using the approach, and surveys tool catalogs to determine coverage of standard tools. Finally, the author describes an implementation of the approach in an assembly planning system and experiments with a library of over one hundred manual and robotic tools and several complex assemblies.
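
    A minimal sketch of the paper's central applicability test as described in the abstract: a tool is applicable in an assembly state if its use volume admits some placement, allowed by the placement constraints, that intersects no part. Axis-aligned boxes and a brute-force search over candidate placements stand in for the paper's general geometric machinery; all structures below are illustrative assumptions:

        from dataclasses import dataclass

        @dataclass
        class Box:
            lo: tuple  # (min_x, min_y, min_z)
            hi: tuple  # (max_x, max_y, max_z)

            def intersects(self, other):
                return all(self.lo[i] < other.hi[i] and other.lo[i] < self.hi[i]
                           for i in range(3))

            def translated(self, d):
                return Box(tuple(a + b for a, b in zip(self.lo, d)),
                           tuple(a + b for a, b in zip(self.hi, d)))

        def tool_applicable(use_volume, placements, parts):
            """True if some allowed placement of the use volume is collision-free.

            placements: candidate offsets relative to the target part, i.e. a
            discretised stand-in for the placement constraints.
            parts: boxes occupied by the parts in the current assembly state.
            """
            for offset in placements:
                volume = use_volume.translated(offset)
                if not any(volume.intersects(p) for p in parts):
                    return True
            return False

    Each call is one instance of the FINDPLACE-style query the paper reduces tool application to; the pre-processing methods it describes amortise such queries over all assembly states of a product.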

  16. The Two-time Rise of Australian Competitive Sport in the 100 Years of the Olympic Games

    浦义俊; 吴贻刚

    2014-01-01

    This study examined the process of the two-time rise of Australian competitive sport in the 100 years of the Olympic Games and the factors contributing to its success. First, we conducted a stage division of the 100 years of Australian competitive sport. Then we looked at the event distribution of the two-time rise. After comparing the first-time rise and the second-time rise in terms of internal and external environment, we focused on the analysis of elements that contributed to the second-time rise of Australian competitive sport. These elements include the alternation of the ruling party and the change in its administrating conceptions, the federal government's policy design and financial support for the development of competitive sport, the maturing management system of competitive sport, and the agreement between Australia's national character and sport. The study concluded with inspirations that can be applied in China's sport context.

  17. Applied Literature for Healing,

    Susanna Marie Anderson

    2014-11-01

    Full Text Available In this qualitative research study, interviews conducted with elite participants serve to reveal the underlying elements that unite the richly diverse emerging field of Applied Literature. The basic interpretative qualitative method included a thematic analysis of data from the interviews, yielding numerous common elements that were then distilled into key themes that elucidated the beneficial effects of engaging consciously with literature. These themes included developing a stronger sense of self in balance with an increasing connection with community; providing a safe container to engage challenging and potentially overwhelming issues from a stance of empowered action; and fostering a healing space for creativity. The findings provide grounds for uniting the work being done in a range of helping professions into a cohesive field of Applied Literature, which offers effective tools for healing, transformation and empowerment. Keywords: Applied Literature, Bibliotherapy, Poetry Therapy, Arts in Corrections, Arts in Medicine

  18. Applied geodesy

    This volume is based on the proceedings of the CERN Accelerator School's course on Applied Geodesy for Particle Accelerators held in April 1986. The purpose was to record and disseminate the knowledge gained in recent years on the geodesy of accelerators and other large systems. The latest methods for positioning equipment to sub-millimetric accuracy in deep underground tunnels several tens of kilometers long are described, as well as such sophisticated techniques as the Navstar Global Positioning System and the Terrameter. Automation of better known instruments such as the gyroscope and Distinvar is also treated along with the highly evolved treatment of components in a modern accelerator. Use of the methods described can be of great benefit in many areas of research and industrial geodesy such as surveying, nautical and aeronautical engineering, astronomical radio-interferometry, metrology of large components, deformation studies, etc

  19. Applied mathematics

    The 1988 progress report of the Applied Mathematics Center (Polytechnic School, France) is presented. The research fields of the Center are scientific computing, probability and statistics, and video image synthesis. The research topics developed are: the analysis of numerical methods, the mathematical analysis of the fundamental models of physics and mechanics, the numerical solution of complex models related to industrial problems, stochastic calculus and Brownian motion, stochastic partial differential equations, the identification of adaptive filtering parameters, discrete element systems, statistics, stochastic control, and the development of image synthesis techniques for education and research programs. The published papers, the congress communications and the theses are listed

  20. Applying radiation

    The invention discloses a method and apparatus for applying radiation by producing X-rays of a selected spectrum and intensity and directing them to a desired location. Radiant energy is directed from a laser onto a target to produce such X-rays at the target, which is so positioned adjacent to the desired location as to emit the X-rays toward the desired location; or such X-rays are produced in a region away from the desired location, and are channeled to the desired location. The radiant energy directing means may be shaped (as with bends; adjustable, if desired) to circumvent any obstruction between the laser and the target. Similarly, the X-ray channeling means may be shaped (as with fixed or adjustable bends) to circumvent any obstruction between the region where the X-rays are produced and the desired location. For producing a radiograph in a living organism the X-rays are provided in a short pulse to avoid any blurring of the radiograph from movement of or in the organism. For altering tissue in a living organism the selected spectrum and intensity are such as to affect substantially the tissue in a preselected volume without injuring nearby tissue. Typically, the selected spectrum comprises the range of about 0.1 to 100 keV, and the intensity is selected to provide about 100 to 1000 rads at the desired location. The X-rays may be produced by stimulated emission thereof, typically in a single direction
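
    As an illustration only (not part of the patent), a tiny sketch that checks a planned exposure against the spectrum and dose windows quoted above; the function and its names are hypothetical:

        # Windows quoted in the record: spectrum ~0.1-100 keV, dose ~100-1000 rads.
        SPECTRUM_KEV = (0.1, 100.0)
        DOSE_RADS = (100.0, 1000.0)

        def exposure_in_window(photon_energy_kev, dose_rads):
            """True if the planned exposure lies inside both quoted windows."""
            return (SPECTRUM_KEV[0] <= photon_energy_kev <= SPECTRUM_KEV[1]
                    and DOSE_RADS[0] <= dose_rads <= DOSE_RADS[1])

        print(exposure_in_window(50.0, 500.0))   # True
        print(exposure_in_window(200.0, 500.0))  # False: outside the spectrum window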

  1. Fastener starter tool

    Chandler, Faith T. (Inventor); Valentino, William D. (Inventor); Garton, Harry L. (Inventor); Arnett, Michael C. (Inventor)

    2003-01-01

    A fastener starter tool includes a number of spring retention fingers for retaining a small part, or combination of parts. The tool has an inner housing, which holds the spring retention fingers, a hand grip, and an outer housing configured to slide over the inner housing and the spring retention fingers toward and away from the hand grip, exposing and opening, or respectively, covering and closing, the spring retention fingers. By sliding the outer housing toward (away from) the hand grip, a part can be released from (retained by) the tool. The tool may include replaceable inserts, for retaining parts, such as screws, and configured to limit the torque applied to the part, to prevent cross threading. The inner housing has means to transfer torque from the hand grip to the insert. The tool may include replaceable bits, the inner housing having means for transferring torque to the replaceable bit.

  2. Downhole tool adapted for telemetry

    Hall, David R.; Fox, Joe

    2010-12-14

    A cycleable downhole tool, such as a jar, a hydraulic hammer, or a shock absorber, adapted for telemetry. This invention applies to other tools where the active components of the tool are displaced when the tool is rotationally or translationally cycled. The invention consists of inductive or contact transmission rings that are connected by an extensible conductor. The extensible conductor permits the transmission of the signal before, after, and during the cycling of the tool. The signal may be continuous or intermittent during cycling. The invention also applies to downhole tools that do not cycle, but in operation are under such stress that an extensible conductor is beneficial. The extensible conductor may also consist of an extensible portion and a fixed portion. The extensible conductor also features clamps that maintain the conductor under stresses greater than those seen by the tool, and seals that are capable of protecting against downhole pressure and contamination.

  3. Preparing for the future today: The findings of the NEA RK and M initiative for the short term (This period covers several decades and likely more than 100 years. The actual duration will vary across national programmes)

    The NEA initiative on Records, Knowledge and Memory (RK and M) across Generations is an initiative that expresses, supports and aims to answer to an evolution in long-term radioactive waste management (RWM) thinking over the past decades. In the earlier days, the vision seems to have been that waste management ends with the closure of disposal sites. Oversight after closure was not an issue that was studied, (tacitly) assuming safe oblivion of geological repositories, or that archives, markers and other similar tools would suffice, e.g. to avoid human intrusion and/or to understand the nature of the underground facility. Today, it is recognised that oversight should take place for as long as practicable. The new vision includes the preservation of information to be used by future generations. In this paper we want to highlight that such a vision shift with regard to the future requires an accompanying shift with regard to present thinking and practices. To this aim, we will outline some of the studies undertaken within the RK and M initiative that substantiate the finding that the future starts today, and offer suggestions to support its concretisation. If we aim at societal oversight for as long as practicable, we should acknowledge the fact that RK and M loss takes place rapidly if it is not acted upon in a conscious and ongoing manner that involves various actors and does more than dumping records into archives. The success of RK and M preservation cannot be judged today by whether records will last for one or ten thousand years. Instead, it lies in establishing and maintaining awareness of the need and responsibility for RK and M preservation in the minds of regulators, operators, stakeholders and, especially in the longer term, the local and regional authorities and the general public. Therefore, we should not only think about future activities, but act upon the idea that the long term starts today, and that RK and M preservation needs to be prepared for in the present.

  4. Visualisation tools

    E. Dupont proposed that visualisation tools should be extended to Nuclear Data (ND) Information Systems in order to cover all data (and formats), all users and all needs. In particular, these ND Information Systems could both serve as an interface between data and users, as well as between data and codes (processing codes or nuclear reaction codes). It is expected that these systems will combine the advantages of processing codes and visualisation tools, as well as serving as a Tool Box to support various ND projects

  5. Applied ALARA techniques

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down, and Hanford was given a new mission to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread, and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work

  6. TS Tools

    Yvette Linders

    2012-12-01

    Full Text Available In this installment of TS Tools, doctoral candidate Yvette Linders (Radboud Universiteit Nijmegen) shows how software for qualitative data analysis can be applied in research on literary criticism.

  7. Safe Storage Tools Designed and Applied for Deformation Associated Core Components

    石中华; 邓志新; 张旭辉; 王玲彬

    2015-01-01

    Because the shape of a deformation associated core component has changed, it cannot be inserted into a fuel assembly or a storage rack, and it can only be placed in an empty cell of the spent fuel pool storage rack. A deformation associated core component stored in this state loses its support: its rods bend under their own weight and may be damaged if left in this state for a long time, so that the material inside leaks and contaminates the spent fuel pool. A tool that can safely store deformation associated core components is therefore required. Qinshan Phase II was selected as the test site for the whole development process, with three groups of deformation associated core components in its spent fuel pool as development objects (a group of primary neutron source assemblies, a group of burnable poison assemblies, and a group of rod cluster control assemblies). Ultimately a set of tools suitable for the safe storage of deformation associated core components was developed, which ensures the integrity of the components.

  8. Management Tools in Engineering Education.

    Fehr, M.

    1999-01-01

    Describes a teaching model that applies management tools such as delegation, total quality management, time management, teamwork, and Deming rules. Promotes the advantages of efficiency, reporting, independent scheduling, and quality. (SK)

  9. Qualification of the nuclear reactor core model DYN3D coupled to the thermohydraulic system code ATHLET, applied as an advanced tool for accident analysis of VVER-type reactors. Final report

    The nuclear reactor core model DYN3D with 3D neutron kinetics has been coupled to the thermohydraulic system code ATHLET. In the report, activities on the qualification of the coupled code complex ATHLET-DYN3D as a validated tool for the accident analysis of Russian VVER-type reactors are described. That includes: contributions to the validation of the single codes ATHLET and DYN3D by the analysis of experiments on natural circulation behaviour in thermohydraulic test facilities and the solution of benchmark tasks on reactivity-initiated transients; the acquisition and evaluation of measurement data on transients in nuclear power plants, and the validation of ATHLET-DYN3D by calculating an accident with delayed scram and a pump trip in VVER plants; the complementary improvement of the code DYN3D by extension of the neutron-physical data base, implementation of an improved coolant mixing model, and consideration of decay heat release and xenon transients; and the analysis of steam leak scenarios for VVER-440-type reactors with failure of different safety systems, investigating different model options. The analyses showed that, with realistic modelling of coolant mixing in the downcomer and the lower plenum, recriticality of the scrammed reactor due to overcooling can be reached. The application of the code complex ATHLET-DYN3D in the Czech Republic, Bulgaria and Ukraine has been started. Future work comprises the verification of ATHLET-DYN3D with a DYN3D version for the square fuel element geometry of Western PWRs. (orig.)
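
    A minimal sketch of the kind of explicit time-step coupling such a code complex performs: the neutron kinetics solver supplies power to the thermohydraulics solver, which returns fuel temperature and coolant density as feedback. The placeholder physics below merely stands in for DYN3D and ATHLET; none of it reflects the actual coupling interface:

        def neutronics_step(power, fuel_temp, coolant_density, dt):
            # Stand-in for the 3D neutron kinetics solve (DYN3D's role):
            # crude Doppler and coolant-density feedback on the power level.
            reactivity = -1e-5 * (fuel_temp - 900.0) + 0.05 * (coolant_density - 0.75)
            return power * (1.0 + reactivity * dt)

        def thermohydraulics_step(power, fuel_temp, coolant_density, dt):
            # Stand-in for the system thermohydraulics solve (ATHLET's role).
            fuel_temp += dt * (0.01 * power - 0.5 * (fuel_temp - 600.0))
            coolant_density -= dt * 1e-6 * power
            return fuel_temp, coolant_density

        power, fuel_temp, coolant_density = 3000.0, 900.0, 0.75  # MW, K, g/cm3
        dt = 0.01
        for _ in range(1000):
            # Operator splitting: the two solvers exchange fields once per step.
            power = neutronics_step(power, fuel_temp, coolant_density, dt)
            fuel_temp, coolant_density = thermohydraulics_step(
                power, fuel_temp, coolant_density, dt)
        print(f"power after 10 s: {power:.1f} MW")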

  10. Alternative affinity tools: more attractive than antibodies?

    Ruigrok, V.J.B.; Levisson, M.; Eppink, M.H.M.; Smidt, H.; Oost, van der J.

    2011-01-01

    Antibodies are the most successful affinity tools used today, in both fundamental and applied research (diagnostics, purification and therapeutics). Nonetheless, antibodies do have their limitations, including high production costs and low stability. Alternative affinity tools based on nucleic acids