WorldWideScience

Sample records for 100-year tool applied

  1. 100 years of superconductivity

    CERN Document Server

    Rogalla, Horst

    2011-01-01

    Even a hundred years after its discovery, superconductivity continues to bring us new surprises, from superconducting magnets used in MRI to quantum detectors in electronics. 100 Years of Superconductivity presents a comprehensive collection of topics on nearly all the subdisciplines of superconductivity. Tracing the historical developments in superconductivity, the book includes contributions from many pioneers who are responsible for important steps forward in the field. The text first discusses interesting stories of the discovery and gradual progress of theory and experimentation...

  2. Convergence: Human Intelligence The Next 100 Years

    Science.gov (United States)

    Fluellen, Jerry E., Jr.

    2005-01-01

    How might human intelligence evolve over the next 100 years? This issue paper explores that idea. First, the paper summarizes five emerging perspectives about human intelligence: Howard Gardner's multiple intelligences theory, Robert Sternberg's triarchic theory of intelligence, Ellen Langer's mindfulness theory, David Perkins' learnable…

  3. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
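
    As a concrete illustration of the least squares idea at the heart of the book (the data and code below are illustrative, not taken from the text), fitting a straight line by minimizing the sum of squared residuals takes only a few lines:

```python
import numpy as np

# Illustrative data (not from the book): y is roughly 2 + 3x with noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 4.9, 8.2, 10.9, 14.1])

# Design matrix with an intercept column; lstsq minimizes ||A @ b - y||^2.
A = np.column_stack([np.ones_like(x), x])
coef, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
b0, b1 = coef
print(f"intercept={b0:.2f}, slope={b1:.2f}")
```

    For this small data set the normal equations give a slope of exactly 3 and an intercept of 2.04, which is the kind of mechanical calculation the book builds intuition for without excessive mathematics.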

  4. Remembering Robert Goddard's vision 100 years later

    Science.gov (United States)

    Stern, David P.

    “Life, liberty, and the pursuit of happiness” —such are the goals of most of us. Yet a few always exist who feel called by a higher purpose. Society often owes them a great deal. Robert Hutchins Goddard, whose work made spaceflight possible, found his vision 100 years ago this October as a youth of 17. His family was staying on the farm of a relative, when he was asked to trim the branches of a cherry tree behind the barn.

  5. Healthcare, molecular tools and applied genome research.

    Science.gov (United States)

    Groves, M

    2000-11-01

    Biotechnology 2000 offered a rare opportunity for scientists from academia and industry to present and discuss data in fields as diverse as environmental biotechnology and applied genome research. The healthcare section of the meeting encompassed a number of gene therapy delivery systems that are successfully treating genetic disorders. Beta-thalassemia is being corrected in mice by continuous erythropoietin delivery from engineered muscle cells, and from naked DNA electrotransfer into muscles, as described by Dr JM Heard (Institut Pasteur, Paris, France). Dr Reszka (Max-Delbrueck-Centrum fuer Molekulare Medizin, Berlin, Germany), meanwhile, described a treatment for liver metastasis in the form of a drug carrier embolization system, DCES (Max-Delbrueck-Centrum fuer Molekulare Medizin), composed of surface-modified liposomes and a substance for chemo-occlusion, which drastically reduces the blood supply to the tumor and promotes apoptosis, necrosis and antiangiogenesis. In the molecular tools section, Willem Stemmer (Maxygen Inc, Redwood City, CA, USA) gave an insight into the importance that techniques such as molecular breeding (DNA shuffling) have in the evolution of molecules with improved function, over a range of fields including pharmaceuticals, vaccines, agriculture and chemicals. Technologies such as ribosome display, which can incorporate the evolution and the specific enrichment of proteins/peptides in cycles of selection, could play an enormous role in the production of novel therapeutics and diagnostics in future years, as explained by Andreas Plückthun (Institute of Biochemistry, University of Zurich, Switzerland). In the applied genome research section, technologies such as 'in vitro expression cloning', described by Dr Zwick (Promega Corp, Madison, WI, USA), are providing a functional analysis for the overwhelming flow of data emerging from high-throughput sequencing of genomes and from high-density gene expression microarrays (DNA chips)...

  6. Beam Line: 100 years of elementary particles

    Science.gov (United States)

    Pais, A.; Weinberg, S.; Quigg, C.; Riordan, M.; Panofsky, W. K. H.

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  7. Advances of Bioinformatics Tools Applied in Virus Epitopes Prediction

    Institute of Scientific and Technical Information of China (English)

    Ping Chen; Simon Rayner; Kang-hong Hu

    2011-01-01

    In recent years, in silico epitope prediction tools have significantly facilitated the progress of vaccine development, and many have been applied successfully to predict epitopes in viruses. Herein, a general overview of the different tools currently available, including T cell and B cell epitope prediction tools, is presented, and the principles of the different prediction algorithms are reviewed briefly. Finally, several examples are presented to illustrate the application of the prediction tools.

  8. 100-Year Flood-It's All About Chance

    Science.gov (United States)

    Holmes, Jr., Robert R.; Dinicola, Karen

    2010-01-01

    In the 1960s, the United States government decided to use the 1-percent annual exceedance probability (AEP) flood as the basis for the National Flood Insurance Program. The 1-percent AEP flood was thought to be a fair balance between protecting the public and overly stringent regulation. Because the 1-percent AEP flood has a 1 in 100 chance of being equaled or exceeded in any 1 year, and it has an average recurrence interval of 100 years, it often is referred to as the '100-year flood'. The term '100-year flood' is part of the national lexicon, but it is often a source of confusion for those not familiar with flood science and statistics. This poster is an attempt to explain the concept, probabilistic nature, and inherent uncertainties of the '100-year flood' to the layman.
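
    The probabilistic point behind the poster can be stated in one line of arithmetic: if p is the annual exceedance probability, the chance of at least one exceedance over an n-year horizon is 1 - (1 - p)^n. A minimal sketch (the function name is illustrative):

```python
# Probability of at least one exceedance of the p-AEP flood in n years.
def prob_at_least_one(p, n):
    """p: annual exceedance probability (0.01 for the '100-year flood');
    n: number of years in the horizon."""
    return 1 - (1 - p) ** n

# Over a 30-year mortgage, the '100-year flood' is far from unlikely:
print(round(prob_at_least_one(0.01, 30), 2))   # ~0.26
# Even over 100 years, the flood is not certain to occur:
print(round(prob_at_least_one(0.01, 100), 3))  # ~0.634
```

    The 26% chance over 30 years, and the fact that a '100-year flood' is only about 63% likely to occur in any given 100-year span, capture exactly the confusion about recurrence intervals that the poster addresses.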

  9. Understanding General Relativity after 100 years: A matter of perspective

    CERN Document Server

    Dadhich, Naresh

    2016-01-01

    This is the centenary year of general relativity, so it is natural to reflect on what perspective we have evolved in 100 years. I wish to share here a novel perspective, and the insights and directions that ensue from it.

  10. Semantic Differential applied to the evaluation of machine tool design

    OpenAIRE

    Mondragón Donés, Salvador; Company, Pedro; Vergara Monedero, Margarita

    2005-01-01

    In this article, a study is presented showing that Product Semantics (PS) can be used to study the design of machine tools. Nowadays, different approaches to PS (Semantic Differential, Kansei Engineering, etc.) are being applied to consumer products with successful results, but commercial products have generally received less attention and machine tools in particular have not yet been studied. Our second objective is to measure the different sensitivities that the different groups of the popu...

  11. The 7 basic tools of quality applied to radiological safety

    International Nuclear Information System (INIS)

    This work establishes a series of correspondences between the pursuit of quality and the optimization of the doses received by occupationally exposed personnel. The seven basic statistical tools of quality are treated: Pareto analysis, cause-and-effect diagrams, stratification, check sheets, histograms, scatter diagrams, and control charts, as applied to radiological safety.

  12. Bacteriophages, revitalized after 100 years in the shadow of antibiotics

    Institute of Scientific and Technical Information of China (English)

    Wei, Hongping

    2015-01-01

    The year 2015 marks 100 years since Dr. Frederick Twort discovered the "filterable lytic factor", which was later independently discovered and named "bacteriophage" by Dr. Felix d'Herelle. On this memorable centennial, it is exciting to see a special issue published by Virologica Sinica on Phages and Therapy. In this issue, readers will not only find that bacteriophage research is a...

  13. Hygrothermal Numerical Simulation Tools Applied to Building Physics

    CERN Document Server

    Delgado, João M P Q; Ramos, Nuno M M; Freitas, Vasco Peixoto

    2013-01-01

    This book presents a critical review on the development and application of hygrothermal analysis methods to simulate the coupled transport processes of Heat, Air, and Moisture (HAM) transfer for one or multidimensional cases. During the past few decades there has been considerable development in this field of study and an increase in the professional use of tools that simulate some of the physical phenomena that are involved in Heat, Air and Moisture conditions in building components or elements. Although a significant number of hygrothermal models are referred to in the literature, the vast majority of them are not easily available to the public outside the institutions where they were developed, which restricts the analysis of this book to only 14 hygrothermal modelling tools. The special features of this book are (a) a state-of-the-art of numerical simulation tools applied to building physics, (b) the boundary conditions importance, (c) the material properties, namely, experimental methods for the measuremen...

  14. Coating-substrate-simulations applied to HFQ® forming tools

    Directory of Open Access Journals (Sweden)

    Leopold Jürgen

    2015-01-01

    Full Text Available In this paper a comparative analysis of coating-substrate simulations applied to HFQ® forming tools is presented. When using the solution heat treatment, cold die forming and quenching process, known as HFQ®, for forming hardened aluminium alloy automotive panel parts, coating-substrate systems have to satisfy unique requirements. Numerical experiments, based on the Advanced Adaptive FE method, will finally be presented.

  15. Lorentz and Poincaré invariance 100 years of relativity

    CERN Document Server

    Hsu Jong Ping

    2001-01-01

    This collection of papers provides a broad view of the development of Lorentz and Poincaré invariance and spacetime symmetry throughout the past 100 years. The issues explored in these papers include: (1) formulations of relativity theories in which the speed of light is not a universal constant but which are consistent with the four-dimensional symmetry of the Lorentz and Poincaré groups and with experimental results, (2) analyses and discussions by Reichenbach concerning the concepts of simultaneity and physical time from a philosophical point of view, and (3) results achieved by the union of...

  16. Applying computer simulation models as learning tools in fishery management

    Science.gov (United States)

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  17. Total Hip Arthroplasty – over 100 years of operative history

    Directory of Open Access Journals (Sweden)

    Stephen Richard Knight

    2011-11-01

    Full Text Available Total hip arthroplasty (THA) has completely revolutionised the nature in which the arthritic hip is treated, and is considered to be one of the most successful orthopaedic interventions of its generation (1). With over 100 years of operative history, this review examines the progression of the operation from its origins, together with highlighting the materials and techniques that have contributed to its development. Knowledge of its history contributes to a greater understanding of THA, such as the reasons behind selection of prosthetic materials in certain patient groups, while demonstrating the importance of critically analyzing research to continually determine best operative practice. Finally, we describe current areas of research being undertaken to further advance techniques and improve outcomes.

  18. Relativity and Gravitation : 100 Years After Einstein in Prague

    CERN Document Server

    Ledvinka, Tomáš; General Relativity, Cosmology and Astrophysics : Perspectives 100 Years After Einstein's Stay in Prague

    2014-01-01

    In early April 1911 Albert Einstein arrived in Prague to become full professor of theoretical physics at the German part of Charles University. It was there, for the first time, that he concentrated primarily on the problem of gravitation. Before he left Prague in July 1912 he had submitted the paper “Relativität und Gravitation: Erwiderung auf eine Bemerkung von M. Abraham” in which he remarkably anticipated what a future theory of gravity should look like. At the occasion of the Einstein-in-Prague centenary an international meeting was organized under a title inspired by Einstein's last paper from the Prague period: "Relativity and Gravitation, 100 Years after Einstein in Prague". The main topics of the conference included: classical relativity, numerical relativity, relativistic astrophysics and cosmology, quantum gravity, experimental aspects of gravitation, and conceptual and historical issues. The conference attracted over 200 scientists from 31 countries, among them a number of leading experts in ...

  19. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  20. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment controls.

  1. Creating Long Term Income Streams for the 100 Year Starship Study Initiative

    Science.gov (United States)

    Sylvester, A. J.

    Development and execution of long term research projects are very dependent on a consistent application of funding to maximize the potential for success. The business structure for the 100 Year Starship Study project should allow for multiple income streams to cover the expenses of the research objectives. The following examples illustrate the range of potential avenues: 1) affiliation with a charitable foundation for creating a donation program to fund a long term endowment for research, 2) application for grants to fund initial research projects and establish the core expertise of the research entity, 3) development of intellectual property which can then be licensed for additional revenue, 4) creation of spinout companies with equity positions retained by the lab for funding the endowment, and 5) funded research which is dual use for the technology goals of the interstellar flight research objectives. With the establishment of a diversified stream of funding options, the endowment can be funded at a level that permits dedicated research on interstellar flight topics. This paper will focus on the strategy of creating spinout companies to create income streams which would fund the endowment of the 100 Year Starship Study effort. This technique is widely used by universities seeking to commercially develop and market technologies developed by university researchers. An approach will be outlined for applying this technique to potentially marketable technologies generated as a part of the 100 Year Starship Study effort.

  2. Progress of Cometary Science in the Past 100 Years

    Science.gov (United States)

    Sekanina, Zdenek

    1999-01-01

    Enormous strides made by cometary science during the 20th century defy any meaningful comparison of its state 100 years ago and now. The great majority of the subfields enjoying much attention nowadays did not exist in the year 1900. Dramatic developments, especially in the past 30-50 years, have equally affected observational and theoretical studies of comets. The profound diversification of observing techniques has been documented by the ever widening limits on the electromagnetic spectrum covered. While the time around 1900 marked an early period of slow and painful experimentation with photographic methods in cometary studies, observations of comets from the x-ray region to the radio waves have by now become routine. Many of the new techniques, and all those involved with the wavelengths shorter than about 300 nm, were made possible by another major breakthrough of this century - observing from space. Experiments on dedicated Earth-orbiting satellites as well as several deep-space probes have provided fascinating new information on the nature and makeup of comets. In broader terms, much of the progress has been achieved thanks to fundamental discoveries and major advances in electronics, whose applications resulted in qualitatively new instruments (e.g. radiotelescopes) and sensors or detectors (e.g. CCD arrays). The most universal effect on the entire cometary science, from observing to data handling to quantitative interpretations, has been, as in any other branch of science, due to the introduction of electronic computers, with their processing capabilities not only unheard of, but literally unimaginable, in the age of classical desk calculators. As if all this should not be enough, today's generations of comet scientists have, in addition, been blessed with nature's highly appreciated cooperation. Indeed, in the span of a dozen years, between 1985 and 1997, we were privileged to witness four remarkable cometary events: (i) a return of Halley...

  3. Reliability concepts applied to cutting tool change time

    Energy Technology Data Exchange (ETDEWEB)

    Patino Rodriguez, Carmen Elena, E-mail: cpatino@udea.edu.c [Department of Industrial Engineering, University of Antioquia, Medellin (Colombia); Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil); Francisco Martha de Souza, Gilberto [Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil)

    2010-08-15

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.
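
    The series-configuration logic described in the abstract can be sketched concretely: under independence, process reliability is the product of the operation reliabilities, each evaluated at its own cutting time. The Weibull wear models and all numbers below are illustrative assumptions, not the paper's data:

```python
import math

def weibull_reliability(t, beta, eta):
    """Probability that a tool survives cutting time t under a Weibull
    wear model with shape beta and characteristic life eta."""
    return math.exp(-((t / eta) ** beta))

# Illustrative operations: (name, cutting time in min, Weibull shape, scale).
operations = [
    ("turning", 12.0, 2.5, 45.0),
    ("drilling", 8.0, 1.8, 30.0),
]

# Series configuration: the part is good only if every operation succeeds,
# so the process reliability is the product over all operations.
process_reliability = 1.0
for name, t, beta, eta in operations:
    r = weibull_reliability(t, beta, eta)
    process_reliability *= r
    print(f"{name}: R({t} min) = {r:.3f}")

print(f"process: R = {process_reliability:.3f}")
```

    A tool-change rule of the kind the algorithm defines would then trigger a change whenever an operation's reliability at its scheduled cutting time falls below a chosen threshold.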

  4. 100 years of seismic research on the Moho

    DEFF Research Database (Denmark)

    Prodehl, Claus; Kennett, Brian; Artemieva, Irina;

    2013-01-01

    The detection of a seismic boundary, the “Moho”, between the outermost shell of the Earth, the Earth's crust, and the Earth's mantle by A. Mohorovičić was the consequence of increased insight into the propagation of seismic waves caused by earthquakes. This short history of seismic research on the Moho is primarily based on the comprehensive overview of the worldwide history of seismological studies of the Earth's crust using controlled sources from 1850 to 2005, by Prodehl and Mooney (2012). Though the art of applying explosions, so-called “artificial events”, as energy sources for studies of the uppermost crustal layers began in the early 1900s, its effective use for studying the entire crust only began at the end of World War II. From 1945 onwards, controlled-source seismology has been the major approach to study details of the crust and underlying crust–mantle boundary, the Moho. The subsequent...

  5. Applying MDE Tools at Runtime: Experiments upon Runtime Models

    OpenAIRE

    Song, Hui; Huang, Gang; Chauvel, Franck; Sun, Yanshun

    2010-01-01

    Runtime models facilitate the management of running systems in many different ways. One of the advantages of runtime models is that they enable the use of existing MDE tools at runtime to implement common auxiliary activities in runtime management, such as querying, visualization, and transformation. In this tool demonstration paper, we focus on this specific aspect of runtime models. We discuss the requirements of runtime models to enable the use of...

  6. Process for selecting engineering tools : applied to selecting a SysML tool.

    Energy Technology Data Exchange (ETDEWEB)

    De Spain, Mark J.; Post, Debra S. (Sandia National Laboratories, Livermore, CA); Taylor, Jeffrey L.; De Jong, Kent

    2011-02-01

    Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature and users could use the process to select most engineering tools and software applications.

  7. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    Science.gov (United States)

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth in understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric sciences draws on core knowledge traditionally taught in physics, chemistry, and in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, understanding the causes and consequences of climate change becomes, for a student, a task of memorizing seemingly disparate facts. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build the necessary understanding of conservation of mass, conservation of energy, the particulate nature of matter, kinetic molecular theory, and the particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles, and then conceptual tools are deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and in some cases, re-envisioning the format of a science classroom. There are few financial barriers to implementation, and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students in developing a conceptually rich understanding of the atmosphere and the connections happening within it.

  8. 100-Year Floodplains, Floodplains 100 year define in gold color, Published in 2009, 1:2400 (1in=200ft) scale, WABASH COUNTY GOVERNMENT.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:2400 (1in=200ft) scale, was produced all or in part from Published Reports/Deeds information as of 2009. It is...

  9. 100-Year Floodplains, 100 year flood plain data, Published in 2006, 1:1200 (1in=100ft) scale, Washoe County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:1200 (1in=100ft) scale, was produced all or in part from Field Survey/GPS information as of 2006. It is described...

  10. Geo-environmental mapping tool applied to pipeline design

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, Karina de S.; Calle, Jose A.; Gil, Euzebio J. [Geomecanica S/A Tecnologia de Solo Rochas e Materiais, Rio de Janeiro, RJ (Brazil); Sare, Alexandre R. [Geomechanics International Inc., Houston, TX (United States); Soares, Ana Cecilia [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The Geo-Environmental Mapping is an improvement of the Geological-Geotechnical Mapping used for basic pipeline designs. The main purpose is to assemble the environmental, geotechnical and geological concepts in a methodological tool capable of predicting constraints and reducing the pipeline's impact on the environment. The Geo-Environmental Mapping was built to stress the influence of soil/structure interaction, related to the physical effect that comes from the contact between structures and soil or rock. A Geological-Geotechnical-Environmental strip (chart) was presented to emphasize the pipeline operational constraints and their influence on the environment. The mapping was developed to clearly show the occurrence and properties of geological materials divided into geotechnical domain units (zones). The strips present natural construction properties, such as: excavability, stability of the excavation and soil re-use capability. Also, the environmental constraints were added to the geological-geotechnical mapping. The Geo-Environmental Mapping model helps the planning of the geotechnical and environmental inquiries to be carried out during executive design, the discussion of the types of equipment to be employed during construction, and the analysis of the geological risks and environmental impacts to be faced during construction of the pipeline. (author)

  11. Monitoring operational data production applying Big Data tooling

    Science.gov (United States)

    Som de Cerff, Wim; de Jong, Hotze; van den Berg, Roy; Bos, Jeroen; Oosterhoff, Rijk; Klein Ikkink, Henk Jan; Haga, Femke; Elsten, Tom; Verhoef, Hans; Koutek, Michal; van de Vegte, John

    2015-04-01

    Within the KNMI Deltaplan programme for improving the KNMI operational infrastructure, a new fully automated system for monitoring the KNMI operational data production systems is being developed: PRISMA (PRocessflow Infrastructure Surveillance and Monitoring Application). Currently the KNMI operational (24/7) production systems consist of over 60 applications, running on different hardware systems and platforms. They are interlinked for the production of numerous data products, which are delivered to internal and external customers. All applications are individually monitored by different applications, complicating root cause and impact analysis. Also, the underlying hardware and network is monitored separately using Zabbix. The goal of the new system is to enable production chain monitoring, which supports root cause analysis (what is the root cause of the disruption) and impact analysis (what other products will be affected). The PRISMA system will make it possible to dispose of all the existing monitoring applications, providing one interface for monitoring the data production. For modeling the production chain, the Neo4j graph database is used to store and query the model. The model can be edited through the PRISMA web interface, but is mainly provided automatically by the applications and systems that are to be monitored. The graph enables us to do root cause and impact analysis. The graph can be visualized in the PRISMA web interface on different levels. Each 'monitored object' in the model has a status (OK, error, warning, unknown). This status is derived by combining all available log information. For collecting and querying the log information, Splunk is used. The system is developed using Scrum, by a multi-disciplinary team consisting of analysts, developers, a tester and an interaction designer. In the presentation we will focus on the lessons learned working with the 'Big Data' tooling Splunk and Neo4j.
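
    The impact-analysis idea reduces to a reachability query over a directed dependency graph: everything downstream of a failed node is potentially affected. A minimal sketch without Neo4j, where the example production chain is illustrative and not KNMI's actual model:

```python
from collections import deque

# Illustrative production chain: edges point from producer to consumer.
depends_on = {
    "radar_ingest": ["precip_product"],
    "precip_product": ["web_portal", "customer_feed"],
    "web_portal": [],
    "customer_feed": [],
}

def impacted(graph, failed):
    """Breadth-first search: collect every product downstream of a failed node."""
    seen, queue = set(), deque([failed])
    while queue:
        node = queue.popleft()
        for downstream in graph.get(node, []):
            if downstream not in seen:
                seen.add(downstream)
                queue.append(downstream)
    return seen

print(sorted(impacted(depends_on, "radar_ingest")))
# → ['customer_feed', 'precip_product', 'web_portal']
```

    Root cause analysis is the same traversal with the edges reversed: walk upstream from a failing product toward the producers whose status is not OK.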

  12. Applied climate-change analysis: the climate wizard tool.

    Directory of Open Access Journals (Sweden)

    Evan H Girvetz

Full Text Available BACKGROUND: Although the message of "global climate change" is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty of accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. METHODOLOGY/PRINCIPAL FINDINGS: To demonstrate Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951-2002 occurred in northern hemisphere countries (especially during January-April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50 degrees N during February-March to 10 degrees N during August-September. Precipitation decreases occurred most commonly in countries between 0-20 degrees N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070-2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. CONCLUSIONS/SIGNIFICANCE: The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally…
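The quantile ensemble analysis over 16 GCMs can be illustrated in a few lines of Python: the median across models gives a central estimate that is robust to outlier models, and the quartiles summarize model spread. The anomaly values below are invented for illustration and are not Climate Wizard output:

```python
import statistics

# Hypothetical projected temperature anomalies (degrees C, 2070-2099 vs. baseline)
# for one country, one value per member of a 16-model GCM ensemble.
anomalies = [2.1, 2.4, 2.6, 2.8, 2.9, 3.0, 3.1, 3.2,
             3.3, 3.4, 3.6, 3.7, 3.9, 4.1, 4.4, 4.8]

median = statistics.median(anomalies)             # central ensemble estimate
quartiles = statistics.quantiles(anomalies, n=4)  # spread across models
print(median)      # 3.25
print(quartiles)   # [Q1, median, Q3]
```

Reporting the median with the interquartile range, rather than a single model's projection, is what makes the country-level maps comparable across regions.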

  13. 100-Year Floodplains, Floodplain, Published in 2000, Smaller than 1:100000 scale, Taylor County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at Smaller than 1:100000 scale, was produced all or in part from Hardcopy Maps information as of 2000. It is described...

  14. 100-Year Floodplains, FEMA Preliminary Map Mad, Published in unknown, Trempealeau County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, was produced all or in part from Other information as of unknown. It is described as 'FEMA Preliminary Map Mad'. Data by this...

  15. 100-Year Floodplains, FEMA FIRM Mapping, Published in 2014, Not Applicable scale, GIS.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at Not Applicable scale, was produced all or in part from Other information as of 2014. It is described as 'FEMA FIRM...

  16. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    Science.gov (United States)

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares four different risk analysis tools developed for confined spaces by applying them to three hazardous scenarios. The tools were namely (1) a checklist without risk estimation (Tool A), (2) a checklist with a risk scale (Tool B), (3) a risk calculation without a formal hazard identification stage (Tool C), and (4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of the more analytic tools in less time. Their main limitations were a lack of contextual information for the identified hazards and a greater dependency on the user's expertise and ability to tackle hazards of a different nature. Tools C and D utilized more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of (1) its comprehensive structure with respect to the steps suggested in risk management, (2) its dynamic approach to hazard identification, and (3) its use of data resulting from the risk analysis. PMID:26864350
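As a rough illustration of the risk-matrix style of Tool D, the sketch below maps qualitative severity and likelihood scores to a risk level that would drive the entry decision. The scales, thresholds and labels are assumptions for illustration, not the scales used in the paper:

```python
# Hypothetical 4x4 risk matrix: severity and likelihood are each scored
# 1 (lowest) to 4 (highest) and their product selects a risk level.
def risk_level(severity: int, likelihood: int) -> str:
    if not (1 <= severity <= 4 and 1 <= likelihood <= 4):
        raise ValueError("scores must be between 1 and 4")
    score = severity * likelihood  # 1..16
    if score >= 12:
        return "critical"          # prohibit entry until controls are in place
    if score >= 8:
        return "high"
    if score >= 4:
        return "moderate"
    return "low"

print(risk_level(4, 4))  # e.g. H2S exposure with frequent entries
print(risk_level(2, 1))  # e.g. minor abrasion hazard, rare entry
```

The checklist tools (A and B) skip this estimation step entirely, which is exactly the trade-off the paper examines.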

  17. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is being realized increasingly that the most prevalent desktop metaphor underpinning the majority of tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a promising alternative for building tools for GSE. However, significant effort is required to introduce a new paradigm; there is a need for a sound theoretical foundation based on activity theory to address the challenges faced by tools in GSE. This paper reports our effort aimed at building theoretical foundations for applying activity theory to GSE. We analyze and explain the fundamental concepts of activity theory, and how they can be applied by using examples of software architecture design and evaluation processes. We describe the kind of data model and architectural support required for applying activity theory…

  18. LOW FREQUENCY VARIABILITY OF INTERANNUAL CHANGE PATTERNS FOR GLOBAL MEAN TEMPERATURE DURING THE RECENT 100 YEARS

    Institute of Scientific and Technical Information of China (English)

    刘晶淼; 丁裕国; 等

    2002-01-01

The temporally extended EOF (TEEOF) method is used to conduct a diagnostic study of the 1-, 3-, 6- and 10-year variation patterns of mean air temperature over the globe and the Southern and Northern Hemispheres over the course of 100 years. The results show that the first mode of the TEEOF accounts for more than 50% of the total variance, with the first mode of the interannual oscillations generally standing for annually varying patterns that are related to climate and reflect the long-term tendency of change in air temperature. This is particularly true for the first mode on the 10-year scale, which shows an obvious ascending trend in winter temperature, and its leading principal component tracks the sequence of actual temperature very closely. Apart from the first mode of all time sections of the TEEOF for the globe and the two hemispheres, and the second mode of the 1-year TEEOF, the interannual variations described by the other characteristic vectors show various patterns, with the corresponding principal components related to the long-term variability of specific interannual quasi-periodic oscillation structures. A T2 test applied to the annual variation pattern shows that the abrupt changes for the Southern Hemisphere and the globe come closer to the result of a single-element t test for mean temperature than those for the Northern Hemisphere do. This indicates that the T2 test, carried out on patterns of multiple variables, seems more reasonable than the t test on single elements.
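The idea of a temporally extended EOF can be sketched numerically: embed the temperature series in overlapping multi-year windows and decompose the window matrix, so that the leading mode captures a pattern of variation across the window rather than a single-year anomaly. A hedged Python/NumPy illustration on synthetic data (not the paper's dataset):

```python
import numpy as np

# Synthetic "global mean temperature": a warming trend plus noise,
# for illustration only.
rng = np.random.default_rng(0)
years = np.arange(100)
temp = 0.01 * years + 0.1 * rng.standard_normal(100)

L = 10  # window length in years (the 10-year scale in the abstract)
# Lagged embedding: one row per overlapping L-year window.
windows = np.array([temp[i:i + L] for i in range(len(temp) - L + 1)])
windows = windows - windows.mean(axis=0)  # remove the mean window

# SVD of the window matrix; the leading right singular vector plays the
# role of the first TEEOF mode, the left one its principal component.
u, s, vt = np.linalg.svd(windows, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(f"variance explained by first mode: {explained[0]:.0%}")
```

Because the synthetic series is trend-dominated, the first mode explains well over half the variance, mirroring the >50% figure reported for the observed record.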

  19. Base (100-year) flood elevations for selected sites in Marion County, Missouri

    Science.gov (United States)

    Southard, Rodney E.; Wilson, Gary L.

    1998-01-01

    The primary requirement for community participation in the National Flood Insurance Program is the adoption and enforcement of floodplain management requirements that minimize the potential for flood damages to new construction and avoid aggravating existing flooding conditions. This report provides base flood elevations (BFE) for a 100-year recurrence flood for use in the management and regulation of 14 flood-hazard areas designated by the Federal Emergency Management Agency as approximate Zone A areas in Marion County, Missouri. The one-dimensional surface-water flow model, HEC-RAS, was used to compute the base (100-year) flood elevations for the 14 Zone A sites. The 14 sites were located at U.S., State, or County road crossings and the base flood elevation was determined at the upstream side of each crossing. The base (100-year) flood elevations for BFE 1, 2, and 3 on the South Fork North River near Monroe City, Missouri, are 627.7, 579.2, and 545.9 feet above sea level. The base (100-year) flood elevations for BFE 4, 5, 6, and 7 on the main stem of the North River near or at Philadelphia and Palmyra, Missouri, are 560.5, 539.7, 504.2, and 494.4 feet above sea level. BFE 8 is located on Big Branch near Philadelphia, a tributary to the North River, and the base (100-year) flood elevation at this site is 530.5 feet above sea level. One site (BFE 9) is located on the South River near Monroe City, Missouri. The base (100-year) flood elevation at this site is 619.1 feet above sea level. Site BFE 10 is located on Bear Creek near Hannibal, Missouri, and the base (100-year) elevation is 565.5 feet above sea level. The four remaining sites (BFE 11, 12, 13, and 14) are located on the South Fabius River near Philadelphia and Palmyra, Missouri. The base (100-year) flood elevations for BFE 11, 12, 13, and 14 are 591.2, 578.4, 538.7, and 506.9 feet above sea level.

  20. Applying observations of work activity in designing prototype data analysis tools

    Energy Technology Data Exchange (ETDEWEB)

    Springmeyer, R.R.

    1993-07-06

Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts that reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools and in customizing visualization environments for specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but also initiated in-depth discussions about users' work, tools, technology, and requirements.

  1. The Observation Of Defects Of School Buildings Over 100 Years Old In Perak

    Directory of Open Access Journals (Sweden)

    Alauddin Kartina

    2016-01-01

Full Text Available Malaysia is blessed with a rich legacy of heritage buildings with unique architectural and historical values. These heritage buildings have become a symbol of the national identity of the country. Therefore heritage buildings, as important monuments, should be well conserved to ensure the extension of each building's life span and the continuity of the building's functions for future generations. The aim of this study is to analyze the types of defects found in school buildings over 100 years old located in Perak. The data were collected in four different schools aged over 100 years in Perak. The findings highlighted the types of defects, categorized by building element: external wall, roof, door, ceiling, staircase, column, internal wall, floor and windows. The findings showed that the types of defects occurring in school buildings over 100 years old in Perak are the same as in other heritage buildings. These findings can be used by all parties to take serious action to prevent defects from occurring in buildings over 100 years old. This would ensure that the buildings' functional life span can be extended for future use.

  2. 100 Years of the American Economic Review: The Top 20 Articles

    OpenAIRE

    Kenneth J. Arrow; B. Douglas Bernheim; Martin S. Feldstein; Daniel L. McFadden; James M. Poterba; Solow, Robert M.

    2011-01-01

    This paper presents a list of the top 20 articles published in the American Economic Review during its first 100 years. This list was assembled in honor of the AER's one-hundredth anniversary by a group of distinguished economists at the request of the AER's editor. A brief description accompanies the citations of each article.

  3. Ageing management of instrumentation and control systems for 100 years life of AHWR

    International Nuclear Information System (INIS)

    Currently, nuclear power plants are designed for a life of about 40 years. However, the Advanced Heavy Water Reactor (AHWR), being designed by BARC, is intended to have a life of 100 years. Instrumentation and Control (I and C) plays a crucial role in the safe operation of any nuclear reactor, and designing I and C for a life of 100 years poses a great many challenges. Experience has shown that ageing and obsolescence have the potential to cause the maintainability and operability of I and C systems in nuclear power plants to deteriorate well before the end of plant life. Hence, all ageing effects are to be detected in time and eliminated by repair, upgrading and replacement measures. Since no I and C system can survive such a long life of 100 years, special attention must be paid in the design to enable easy replacement, and every aspect of hardware and software design should deal with obsolescence. Design strategies like minimising the amount of cabling by resorting to networked data communication will go a long way towards achieving the desired life extension. It is therefore essential that an effective Ageing Management Programme be established at the very initial stages of design, planning and engineering of the I and C systems for AHWR. This will ensure reliable continued operation of the I and C systems over the 100-year life. (author)

  4. Simulations of the Greenland ice sheet 100 years into the future with the full Stokes model Elmer/Ice

    Science.gov (United States)

    Seddik, H.; Greve, R.; Zwinger, T.; Gillet-Chaulet, F.; Gagliardini, O.

    2011-12-01

    the surface precipitation and temperature and the set S (three experiments) applies an amplification factor to change the basal sliding velocity. The experiments are compared to a constant climate control run beginning at present (epoch 2004-1-1 0:0:0) and running up to 100 years holding the climate constant to its present state. The experiments with the amplification factor (Set S) show high sensitivities. Relative to the control run, the scenario with an amplification factor of 3x applied to the sliding velocity produces a Greenland contribution to sea level rise of ~25 cm. An amplification factor of 2.5x produces a contribution of ~16 cm and an amplification factor 2x produces a contribution of ~9 cm. The experiments with the changes to the surface precipitation and temperature (set C) show a contribution to sea level rise of ~4 cm when a factor 1x is applied to the temperature and precipitation anomalies. A factor 1.5x produces a sea level rise of ~8 cm and a factor 2x produces a sea level rise of ~12 cm.

  5. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Science.gov (United States)

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system to the PC of the full-scale WWTP (wastewater treatment plants). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for process control) which facilitates an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full scale plant. The control parameters obtained by simulation were suitable for the full scale plant with only few modifications to improve the control performance. With the DSC tool, the control systems performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.

  6. Enhancing Social Presence in Online Learning: Mediation Strategies Applied to Social Networking Tools

    Science.gov (United States)

    Joyce, Kristopher M.; Brown, Abbie

    2009-01-01

    An exploration of the mediation strategies applied to social networking tools for purposes of enhancing social presence for students participating in online course work. The article includes a review of the literature, specific examples from the authors' professional practice and recommendations for creating a positive social experience for online…

  7. Liverpool's Discovery: A University Library Applies a New Search Tool to Improve the User Experience

    Science.gov (United States)

    Kenney, Brian

    2011-01-01

    This article features the University of Liverpool's arts and humanities library, which applies a new search tool to improve the user experience. In nearly every way imaginable, the Sydney Jones Library and the Harold Cohen Library--the university's two libraries that serve science, engineering, and medical students--support the lives of their…

  8. A MODEL TO EVALUATE 100-YEAR ENERGY-MIX SCENARIOS TO FACILITATE DEEP DECARBONIZATION IN THE SOUTHEASTERN UNITED STATES

    Energy Technology Data Exchange (ETDEWEB)

    Adkisson, Mary A [ORNL; Qualls, A L [ORNL

    2016-08-01

    The Southeast United States consumes approximately one billion megawatt-hours of electricity annually; roughly two-thirds from carbon dioxide (CO2) emitting sources. The balance is produced by non-CO2 emitting sources: nuclear power, hydroelectric power, and other renewables. Approximately 40% of the total CO2 emissions come from the electric grid. The CO2 emitting sources, coal, natural gas, and petroleum, produce approximately 372 million metric tons of CO2 annually. The rest is divided between the transportation sector (36%), the industrial sector (20%), the residential sector (3%), and the commercial sector (2%). An Energy Mix Modeling Analysis (EMMA) tool was developed to evaluate 100-year energy mix strategies to reduce CO2 emissions in the southeast. Current energy sector data was gathered and used to establish a 2016 reference baseline. The spreadsheet-based calculation runs 100-year scenarios based on current nuclear plant expiration dates, assumed electrical demand changes from the grid, assumed renewable power increases and efficiency gains, and assumed rates of reducing coal generation and deployment of new nuclear reactors. Within the model, natural gas electrical generation is calculated to meet any demand not met by other sources. Thus, natural gas is viewed as a transitional energy source that produces less CO2 than coal until non-CO2 emitting sources can be brought online. The annual production of CO2 and spent nuclear fuel and the natural gas consumed are calculated and summed. A progression of eight preliminary scenarios show that nuclear power can substantially reduce or eliminate demand for natural gas within 100 years if it is added at a rate of only 1000 MWe per year. Any increases in renewable energy or efficiency gains can offset the need for nuclear power. However, using nuclear power to reduce CO2 will result in significantly more spent fuel. More efficient advanced reactors can only marginally reduce the amount of spent fuel generated in…
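The residual-gas mechanics described for EMMA can be sketched in a few lines: each year, natural gas fills whatever demand other sources do not meet, and annual CO2 is summed over the horizon. All numbers below (emission factor, starting mix, capacity factor) are illustrative assumptions, not values from the report:

```python
# Assumed scenario inputs (illustrative only).
DEMAND_MWH = 1.0e9                  # ~1 billion MWh/yr consumed in the Southeast
GAS_CO2_T_PER_MWH = 0.4             # assumed gas emission factor, tonnes CO2/MWh
NEW_NUCLEAR_MWH_PER_YEAR = 1000 * 8760 * 0.9  # 1000 MWe/yr at 90% capacity factor

non_gas = 0.35 * DEMAND_MWH         # assumed non-gas generation at the start
total_co2 = 0.0
for year in range(100):
    gas = max(DEMAND_MWH - non_gas, 0.0)  # gas is the transitional residual
    total_co2 += gas * GAS_CO2_T_PER_MWH
    non_gas += NEW_NUCLEAR_MWH_PER_YEAR   # add 1000 MWe of nuclear each year

print(f"cumulative CO2 over 100 years: {total_co2 / 1e9:.1f} billion tonnes")
```

Under these assumptions gas generation reaches zero after roughly eight decades, consistent with the report's qualitative finding that a steady 1000 MWe/year nuclear build-out can eliminate gas demand within the century.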

  9. Lessons learned applying CASE methods/tools to Ada software development projects

    Science.gov (United States)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  10. Struggles for Perspective: A Commentary on "'One Story of Many to Be Told': Following Empirical Studies of College and Adult Writing through 100 Years of NCTE Journals"

    Science.gov (United States)

    Brandt, Deborah

    2011-01-01

    In this article, the author comments on Kevin Roozen and Karen Lunsford's insightful examination of empirical studies of college and adult writing published in NCTE journals over the last 100 years. One sees in their account the struggles for perspective that marked writing studies in this period, as researchers applied ever wider lenses to the…

  11. FMEA TOOL APPLYING: CASE STUDY IN A COMPANY OF PASSENGER TRANSPORTATION BUSINESS

    Directory of Open Access Journals (Sweden)

    Cristiano Roos

    2007-12-01

    Full Text Available This paper presents a case study in which the Failure Modes and Effects Analysis (FMEA) tool was applied in a company in the land/air transportation of passengers and cargo. The objective of this study was to determine, with the FMEA tool, actions that minimize or eliminate potential failure modes in one of the services provided by the company. The specific point where the tool was applied was the management of passenger land transportation vehicles, because failures related to this area increase the maintenance expenses of the company and tend to generate client dissatisfaction. Other quality tools, such as Brainstorming, the Ishikawa Diagram and the Pareto Chart, were used together with the FMEA tool. The results were reached after determining actions that serve the main objective of this study, that is, increasing the reliability and quality of the service provided. Carrying out the present case study thus led to a better understanding of the proposed theme, besides showing the importance of quality management nowadays in the face of the rising demands of clients.
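Although the abstract does not list the scoring details, standard FMEA practice ranks failure modes by a Risk Priority Number, RPN = Severity x Occurrence x Detection, with each factor scored 1-10; the Pareto step then focuses corrective action on the top-ranked modes. A sketch with invented failure modes:

```python
# Hypothetical failure modes for a passenger-vehicle fleet:
# (description, Severity, Occurrence, Detection), each scored 1-10.
failure_modes = [
    ("brake wear not detected in service", 8, 4, 6),
    ("engine overheating on long routes",  7, 3, 4),
    ("door sensor intermittent fault",     4, 6, 3),
]

# Rank by RPN, highest first: this is the Pareto-style prioritization.
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN={s * o * d:4d}  {name}")
```

The first-ranked mode (RPN 192) would receive corrective actions first, after which scores are re-evaluated to confirm the risk was actually reduced.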

  12. Oceanic environmental changes of subarctic Bering Sea in recent 100 years: Evidence from molecular fossils

    Institute of Scientific and Technical Information of China (English)

    LU; Bing; CHEN; Ronghua; ZHOU; Huaiyang; WANG; Zipan; CHEN

    2005-01-01

    The core sample B2-9 from the seafloor of the subarctic Bering Sea was dated with 210Pb to obtain a consecutive sequence of oceanic sedimentary environments at decadal intervals during 1890-1999. A variety of molecular fossils were detected, including n-alkanes, isoprenoids, fatty acids, sterols, etc. From the characteristics of these molecules (C27, C28, and C29 sterols) and their molecular indices (Pr/Ph, ∑C22+/∑C21−, CPI and C18:2/C18:0), and in consideration of the variation in organic carbon content, the 100-year evolution history of the subarctic sea paleoenvironment was reconstructed. It is indicated that during the past 100 years in the Arctic there were two events of strong climate warming (1920-1950 and 1980-1999), which resulted in an oxidized sedimentary environment owing to decreasing terrigenous organic matter and increasing marine-derived organic matter, and two events of transitory climate cooling (1910 and 1970-1980), which resulted in a slightly reducing sedimentary environment owing to increasing terrigenous organic matter and decreasing marine-derived organic matter. It is revealed that these alternating warming/cooling processes are directly related to Arctic and global climate variations.
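Of the indices listed, the CPI (carbon preference index) is easy to illustrate: it compares odd- to even-numbered n-alkane abundances, with values well above 1 pointing to terrigenous higher-plant input. A sketch using one common (Bray-Evans style) formulation and invented abundances, not the paper's data:

```python
# Hypothetical n-alkane abundances by carbon number (arbitrary units).
abundance = {24: 0.8, 25: 1.6, 26: 0.9, 27: 2.0, 28: 1.0, 29: 2.4,
             30: 1.1, 31: 2.2, 32: 0.9, 33: 1.4, 34: 0.7}

odd = sum(abundance[c] for c in range(25, 34, 2))      # C25..C33 (odd)
even_lo = sum(abundance[c] for c in range(24, 33, 2))  # C24..C32 (even)
even_hi = sum(abundance[c] for c in range(26, 35, 2))  # C26..C34 (even)

# Bray-Evans style CPI: average of the odd/even ratio over the two
# adjacent even-carbon windows.
cpi = 0.5 * (odd / even_lo + odd / even_hi)
print(f"CPI = {cpi:.2f}")  # values well above 1 suggest higher-plant input
```

Downcore shifts in this index are what let the authors track the changing balance of terrigenous versus marine-derived organic matter decade by decade.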

  13. A risk assessment tool applied to the study of shale gas resources.

    Science.gov (United States)

    Veiguela, Miguel; Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando; Roqueñi, Nieves; Loredo, Jorge

    2016-11-15

    The implementation of a risk assessment tool with the capacity to evaluate the risks to health, safety and the environment (HSE) from the extraction of non-conventional fossil fuel resources by the hydraulic fracturing (fracking) technique can be useful for boosting the development and progress of the technology and winning public trust and acceptance of it. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to the risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the approach that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows from Oldenburg's 'Screening and Ranking Framework (SRF)', developed to evaluate potential geologic carbon dioxide (CO2) storage sites. These two global characteristics, (1) characteristics centered on the natural aspects of the site and (2) characteristics centered on the technological aspects of the project, have been evaluated through user input of Property values, which define Attributes, which in turn define the Characteristics. In order to carry out an individual evaluation of each of the characteristics and the elements of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain) with three different technological options to test the approach. PMID:27453140
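The Property → Attribute → Characteristic roll-up lends itself to a simple spreadsheet-like calculation: attribute scores aggregate into the two characteristics named in the abstract (site versus technology), which combine into an overall screening score. The sketch below assumes equal weights and invented attribute names; the actual tool's properties, scales and weighting are defined in the paper:

```python
# Hypothetical attribute scores in [0, 1], where 1 is most favourable.
site = {"depth_suitability": 0.8, "fault_density": 0.5, "aquifer_separation": 0.7}
technology = {"well_integrity": 0.9, "water_management": 0.6, "monitoring_plan": 0.7}

def characteristic_score(attrs: dict) -> float:
    """Unweighted mean of attribute scores (an assumed aggregation rule)."""
    return sum(attrs.values()) / len(attrs)

site_score = characteristic_score(site)
tech_score = characteristic_score(technology)
overall = 0.5 * site_score + 0.5 * tech_score  # assumed equal weighting
print(f"site={site_score:.2f} technology={tech_score:.2f} overall={overall:.2f}")
```

Keeping the site and technology scores separate before combining them is what lets the tool compare the three technological options for the same Asturias site.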

  15. The DPSIR approach applied to marine eutrophication in LCIA as a learning tool

    DEFF Research Database (Denmark)

    Cosme, Nuno Miguel Dias; Olsen, Stig Irving

    … understanding that is well suited for sustainability teaching and communication purposes. Life Cycle Impact Assessment (LCIA) indicators aim at modelling the P-S-I parts and provide a good background for understanding D and R. As an example, the DPSIR approach was applied to the LCIA indicator marine eutrophication. The goal is to promote an educational example of environmental impact assessment through science-based tools to predict the impacts, communicate knowledge and support decisions. The example builds on the (D) high demand for fixation of reactive nitrogen that supports several socio… Assessment and response design ultimately benefit from spatial differentiation in the results. DPSIR based on LCIA seems a useful tool to improve communication and learning, as it bridges science and management while promoting the basic elements of sustainable development in a practical educational …

  16. Social Network Analysis and Big Data tools applied to the Systemic Risk supervision

    Directory of Open Access Journals (Sweden)

    Mari-Carmen Mochón

    2016-03-01

    Full Text Available After the financial crisis that began in 2008, the international market supervisors of the G20 agreed to reinforce their systemic risk supervisory duties. For this purpose, several regulatory reporting obligations were imposed on market participants. As a consequence, millions of trade details are now available to National Competent Authorities on a daily basis. Traditional monitoring tools may not be capable of analyzing such volumes of data and extracting the relevant information needed to identify the potential risks hidden behind the market. Big Data solutions currently applied to Social Network Analysis (SNA) can be successfully applied to systemic risk supervision. This case study proposes how the relations established between financial market participants could be analyzed in order to identify propagation risk and market behavior, without the need for expensive and demanding technical architectures.
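A minimal illustration of how reported trade details can be turned into a counterparty network and probed for propagation risk: a default imposes losses on everyone exposed to the defaulter, and counterparties whose losses exceed their capital fail in turn. Institutions, exposures and capital figures below are entirely fictitious:

```python
from collections import defaultdict

# Illustrative (reporter, counterparty, exposure) trade records.
trades = [("BankA", "BankB", 120.0), ("BankB", "BankC", 80.0),
          ("BankC", "BankD", 60.0), ("BankA", "BankC", 30.0)]

exposure_to = defaultdict(list)  # who is exposed to a given entity, and by how much
for reporter, counterparty, amount in trades:
    exposure_to[counterparty].append((reporter, amount))

def contagion(defaulting: str, capital: dict) -> set:
    """Propagate a default: a creditor fails when its losses exceed its capital."""
    failed, frontier = {defaulting}, [defaulting]
    losses = defaultdict(float)
    while frontier:
        entity = frontier.pop()
        for creditor, amount in exposure_to[entity]:
            losses[creditor] += amount
            if creditor not in failed and losses[creditor] > capital[creditor]:
                failed.add(creditor)
                frontier.append(creditor)
    return failed

capital = {"BankA": 100.0, "BankB": 100.0, "BankC": 50.0, "BankD": 40.0}
print(sorted(contagion("BankD", capital)))  # entities a BankD default drags down
```

Running this for each participant gives a crude ranking of systemic importance; production-scale SNA would apply the same idea to the millions of daily reported trades.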

  17. Degradation of building materials over a lifespan of 30-100 years

    International Nuclear Information System (INIS)

    Following preliminary visits to four Magnox nuclear power stations, a study was made of existing Central Electricity Generating Board (CEGB) reports on the condition of buildings at eight power stations. Sampling of building materials, non-destructive testing and inspections were carried out at Trawsfynydd, Oldbury and Dungeness 'A' Magnox power stations, and the samples were subsequently laboratory tested. From the results of this work it can be concluded that little major deterioration is likely to occur in the reactor buildings at Trawsfynydd and Oldbury over the next 50 years, and at Dungeness 'A' for at least 25 years, assuming reasonable maintenance and the continuation of suitable internal temperatures and relative humidities. Because of the limitations on taking samples from, and tests on, the reactor biological shields and prestressed concrete vessel, no sensible forecast can be made of their potential life in the 75-100 year range.

  18. 100 Years of British military neurosurgery: on the shoulders of giants.

    Science.gov (United States)

    Roberts, S A G

    2015-01-01

    Death from head injuries has been a feature of conflicts throughout the world for centuries. The burden of mortality has been variously affected by the evolution in weaponry from war-hammers to explosive ordnance, the influence of armour on survivability and the changing likelihood of infection as a complicating factor. Surgery evolved from haphazard trephination to valiant, yet disjointed, neurosurgery by a variety of great historical surgeons until the Crimean War of 1853-1856. However, it was events initiated by the Great War of 1914-1918 that not only marked the development of modern neurosurgical techniques, but our approach to military surgery as a whole. Here the author describes how 100 years of conflict and the input and intertwining relationships between the 20th century's great neurosurgeons established neurosurgery in the United Kingdom and beyond. PMID:26292388

  19. Sustainable Foods and Medicines Support Vitality, Sex and Longevity for a 100-Year Starship Expedition

    Science.gov (United States)

    Edwards, M. R.

    Extended space flight requires foods and medicines that sustain crew health and vitality. The health and therapeutic needs for the entire crew and their children for a 100-year space flight must be sustainable. The starship cannot depend on resupply or carry a large cargo of pharmaceuticals. Everything in the starship must be completely recyclable and reconstructable, including food, feed, textiles, building materials, pharmaceuticals, vaccines, and medicines. Smart microfarms will produce functional foods with superior nutrition and sensory attributes. These foods provide high-quality protein and nutralence (nutrient density), that avoids obesity, diabetes, and other Western diseases. The combination of functional foods, lifestyle actions, and medicines will support crew immunity, energy, vitality, sustained strong health, and longevity. Smart microfarms enable the production of fresh medicines in hours or days, eliminating the need for a large dispensary, which eliminates concern over drug shelf life. Smart microfarms are adaptable to the extreme growing area, resource, and environmental constraints associated with an extended starship expedition.

  20. The volcanic contribution to climate change of the past 100 years

    International Nuclear Information System (INIS)

    Volcanic eruptions which inject large amounts of sulfur-rich gas into the stratosphere produce dust veils which last several years and cool the earth's surface. At the same time, these dust veils absorb enough solar radiation to warm the stratosphere. Since these temperature changes at the earth's surface and in the stratosphere are both in the opposite direction to the hypothesized effects from greenhouse gases, they act to delay and mask the detection of greenhouse effects on the climate system. A large portion of the global climate change of the past 100 years may be due to the effects of volcanoes, but a definitive answer is not yet clear. While effects over several years have been demonstrated with both data studies and numerical models, long-term effects, while found in climate model calculations, await confirmation with more realistic models. In this paper, chronologies of past volcanic eruptions and the evidence from data analyses and climate model calculations are reviewed.

  1. Rapid warming in mid-latitude central Asia for the past 100 years

    Institute of Scientific and Technical Information of China (English)

    Fahu CHEN; Jinsong WANG; Liya JIN; Qiang ZHANG; Jing LI; Jianhui CHEN

    2009-01-01

    Surface air temperature variations during the last 100 years (1901-2003) in mid-latitude central Asia were analyzed using Empirical Orthogonal Functions (EOFs). The results suggest that temperature variations in four major sub-regions, i.e. the eastern monsoonal area, central Asia, the Mongolian Plateau and the Tarim Basin, respectively, are coherent and characterized by a striking warming trend during the last 100 years. The annual mean temperature increasing rates at each sub-region (representative station) are 0.19℃ per decade, 0.16℃ per decade, 0.23℃ per decade and 0.15℃ per decade, respectively. The average annual mean temperature increasing rate of the four sub-regions is 0.18℃ per decade, with a greater increasing rate in winter (0.21℃ per decade). In Asian mid-latitude areas, surface air temperature increased relatively slowly from the 1900s to the 1970s, and it has increased rapidly since the 1970s. This pattern of temperature variation differs from that in the other areas of China. Notably, there was no obvious warming between the 1920s and 1940s, with temperature fluctuating between warming and cooling trends (e.g. 1920s, 1940s, 1960s, 1980s, 1990s). However, the warming trends are of a greater magnitude and their durations are longer than those of the cooling periods, which leads to an overall warming. The amplitude of temperature variations in the study region is also larger than that in eastern China during different periods.

  2. 77 FR 66823 - Freedom of Information Act Request for Papers Submitted to DARPA for the 2011 100 Year Starship...

    Science.gov (United States)

    2012-11-07

    ... of the Secretary Freedom of Information Act Request for Papers Submitted to DARPA for the 2011 100 Year Starship Symposium AGENCY: Defense Advanced Research Projects Agency (DARPA), DoD. ACTION: Notice... panels at the 2011 100 Year Starship Symposium must provide DARPA a written response explaining...

  3. Applying the Case Management CourTools: Finding from an Urban Trial Court

    Directory of Open Access Journals (Sweden)

    Collins E. Ijoma

    2012-06-01

    The National Center for State Courts (NCSC) recently promulgated 10 trial court performance measures, referred to as CourTools. Measures 2, 3, 4, and 5 provide a methodology by which court managers can examine their management and processing of cases. The measures include clearance rate (measure 2), time to disposition (measure 3), age of active pending caseload (measure 4), and trial date certainty (measure 5). The objective of this research was threefold. The first aim was to assess the viability of using the case management measures to examine case processing trends in a New Jersey (NJ) urban trial court. Each measure was reviewed to determine the tool’s applicability to the criminal division of the court. The second objective (pursued as a parallel to the first) was to present the findings in the same context as the CourTools’ framework to determine its practicality. The final goal was to serve as a platform for other courts on the national and international level that do not yet use performance measures. These courts, diverse as they are, may use the methodologies and findings of this case study as a reference and guide to develop their own programs to measure the court’s productivity and efficiency. To that end, this case study sought to answer the following questions in determining the applicability of the CourTools to the selected court and, by extension, its potential for more universal application to other court systems. First, what is the relevance of measurement to the courts and why is it important, if at all? Second, what are the CourTools? Third, can the measurement model be applied to an actual court and, if so, how is it executed and illustrated in practice? Finally, what are the implications of the findings for the court in question, as well as other courts that seek to incorporate the CourTools to measure performance?
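
    The caseflow measures named above reduce to simple arithmetic. A minimal sketch of Measures 2 and 3 (the caseload figures and dates are invented for illustration, not taken from the NJ court in the study):

```python
from datetime import date

def clearance_rate(outgoing: int, incoming: int) -> float:
    """CourTools Measure 2: outgoing cases as a percentage of incoming cases."""
    return 100.0 * outgoing / incoming

def days_to_disposition(filed: date, disposed: date) -> int:
    """Input to CourTools Measure 3: elapsed days from filing to disposition."""
    return (disposed - filed).days

# A court disposing of 950 cases while 1000 are filed is slowly falling behind.
print(clearance_rate(950, 1000))                                 # 95.0
print(days_to_disposition(date(2012, 1, 10), date(2012, 4, 9)))  # 90
```

    A rate below 100% signals a growing backlog; Measure 3 is typically reported as the share of cases disposed within fixed day ranges.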

  4. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic mathematical models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES: [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I., & Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press).

  5. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    Science.gov (United States)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.

  6. Lessons to be learned from an analysis of ammonium nitrate disasters in the last 100 years

    International Nuclear Information System (INIS)

    Highlights: • Root causes and contributing factors from ammonium nitrate incidents are categorized into 10 lessons. • The lessons learned from the past 100 years of ammonium nitrate incidents can be used to improve design, operation, and maintenance procedures. • Improving organizational memory to help improve safety performance. • Combating and changing organizational cultures. - Abstract: Process safety, as well as the safe storage and transportation of hazardous or reactive chemicals, has been a topic of increasing interest in the last few decades. The increased interest in improving the safety of operations has been driven largely by a series of recent catastrophes that have occurred in the United States and the rest of the world. A continuous review of past incidents and disasters to look for common causes and lessons is an essential component of any process safety and loss prevention program. While analyzing the causes of an accident cannot prevent that accident from occurring, learning from it can help to prevent future incidents. The objective of this article is to review a selection of major incidents involving ammonium nitrate in the last century to identify common causes and lessons that can be gleaned from these incidents in the hope of preventing future disasters. Ammonium nitrate has been involved in dozens of major incidents in the last century, so a subset of major incidents was chosen for discussion for the sake of brevity. Twelve incidents are reviewed and ten lessons from these incidents are discussed.

  7. Revisiting extreme storms of the past 100 years for future safety of large water management infrastructures

    Science.gov (United States)

    Chen, Xiaodong; Hossain, Faisal

    2016-07-01

    Historical extreme storm events are widely used to make Probable Maximum Precipitation (PMP) estimates, which form the cornerstone of large water management infrastructure safety. Past studies suggest that extreme precipitation processes can be sensitive to land surface feedback and the planetary warming trend, which makes the future safety of large infrastructures questionable given the projected changes in land cover and temperature in the coming decades. In this study, a numerical modeling framework was employed to reconstruct 10 extreme storms over CONUS that occurred during the past 100 years, which are used by the engineering profession for PMP estimation for large infrastructures such as dams. Results show that the correlation in daily rainfall for such reconstruction can range between 0.4 and 0.7, while the correlation for maximum 3-day accumulation (a standard period used in infrastructure design) is always above 0.5 for post-1948 storms. This suggests that current numerical modeling and reanalysis data allow us to reconstruct big storms after 1948 with acceptable accuracy. For storms prior to 1948, however, reconstruction of storms shows inconsistency with observations. Our study indicates that numerical modeling and data may not have advanced to a sufficient level to understand how such old storms (pre-1948) may behave in future warming and land cover conditions. However, the infrastructure community can certainly rely on the use of model reconstructed extreme storms of the 1948-present period to reassess safety of our large water infrastructures under assumed changes in temperature and land cover.

  8. The Emergence of Gravitational Wave Science: 100 Years of Development of Mathematical Theory, Detectors, Numerical Algorithms, and Data Analysis Tools

    CERN Document Server

    Holst, Michael; Tiglio, Manuel; Vallisneri, Michele

    2016-01-01

    On September 14, 2015, the newly upgraded Laser Interferometer Gravitational-wave Observatory (LIGO) recorded a loud gravitational-wave (GW) signal, emitted a billion light-years away by a coalescing binary of two stellar-mass black holes. The detection was announced in February 2016, in time for the hundredth anniversary of Einstein's prediction of GWs within the theory of general relativity (GR). The signal represents the first direct detection of GWs, the first observation of a black-hole binary, and the first test of GR in its strong-field, high-velocity, nonlinear regime. In the remainder of its first observing run, LIGO observed two more signals from black-hole binaries, one moderately loud, another at the boundary of statistical significance. The detections mark the end of a decades-long quest, and the beginning of GW astronomy: finally, we are able to probe the unseen, electromagnetically dark Universe by listening to it. In this article, we present a short historical overview of GW science: this youn...

  9. Quantitative tools for comparing animal communication systems: information theory applied to bottlenose dolphin whistle repertoires.

    Science.gov (United States)

    McCowan; Hanser; Doyle

    1999-02-01

    Comparative analysis of nonhuman animal communication systems and their complexity, particularly in comparison to human language, has been generally hampered by both a lack of sufficiently extensive data sets and appropriate analytic tools. Information theory measures provide an important quantitative tool for examining and comparing communication systems across species. In this paper we use the original application of information theory, that of statistical examination of a communication system's structure and organization. As an example of the utility of information theory to the analysis of animal communication systems, we applied a series of information theory statistics to a statistically categorized set of bottlenose dolphin (Tursiops truncatus) whistle vocalizations. First, we use the first-order entropic relation in a Zipf-type diagram (Zipf 1949, Human Behavior and the Principle of Least Effort) to illustrate the application of temporal statistics as comparative indicators of repertoire complexity, and as possible predictive indicators of acquisition/learning in animal vocal repertoires. Second, we illustrate the need for more extensive temporal data sets when examining the higher entropic orders, indicative of higher levels of internal informational structure, of such vocalizations, which could begin to allow the statistical reconstruction of repertoire organization. Third, we propose using 'communication capacity' as a measure of the degree of temporal structure and complexity of statistical correlation, represented by the values of entropic order, as an objective tool for interspecies comparison of communication complexity. In doing so, we introduce a new comparative measure, the slope of Shannon entropies, and illustrate how it potentially can be used to compare the organizational complexity of vocal repertoires across a diversity of species. Finally, we illustrate the nature and predictive application of these higher-order entropies using a preliminary
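
    The first-order entropy and the Zipf-type rank/frequency slope described above can be computed for any repertoire that has already been categorized into discrete whistle types. A minimal sketch (the helper names are ours, not the authors'):

```python
import math
from collections import Counter

def shannon_entropy(symbols) -> float:
    """First-order Shannon entropy H1, in bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def zipf_slope(symbols) -> float:
    """Least-squares slope of log(frequency) vs. log(rank); values near -1
    are the Zipf-type signature discussed in the abstract."""
    freqs = sorted(Counter(symbols).values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Four equiprobable whistle types carry 2 bits per whistle.
print(shannon_entropy("abcd"))   # 2.0
```

    Higher-order entropies extend the same idea to n-grams of whistle types, which is why much longer sequences are needed to estimate them reliably.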

  10. 100-Year Floodplains, Published in 2008, 1:1200 (1in=100ft) scale, City of Milton.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:1200 (1in=100ft) scale, was produced all or in part from Road Centerline Files information as of 2008. Data by...

  11. 100-Year Floodplains, flood plain, Published in 2009, 1:24000 (1in=2000ft) scale, Washington County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Other information as of 2009. It is described as 'flood...

  12. The Archives of the Department of Terrestrial Magnetism: Documenting 100 Years of Carnegie Science

    Science.gov (United States)

    Hardy, S. J.

    2005-12-01

    The archives of the Department of Terrestrial Magnetism (DTM) of the Carnegie Institution of Washington document more than a century of geophysical and astronomical investigations. Primary source materials available for historical research include field and laboratory notebooks, equipment designs, plans for observatories and research vessels, scientists' correspondence, and thousands of expedition and instrument photographs. Yet despite its history, DTM long lacked a systematic approach to managing its documentary heritage. A preliminary records survey conducted in 2001 identified more than 1,000 linear feet of historically-valuable records languishing in dusty, poorly-accessible storerooms. Intellectual control at that time was minimal. With support from the National Historical Publications and Records Commission, the "Carnegie Legacy Project" was initiated in 2003 to preserve, organize, and facilitate access to DTM's archival records, as well as those of the Carnegie Institution's administrative headquarters and Geophysical Laboratory. Professional archivists were hired to process the 100-year backlog of records. Policies and procedures were established to ensure that all work conformed to national archival standards. Records were appraised, organized, and rehoused in acid-free containers, and finding aids were created for the project web site. Standardized descriptions of each collection were contributed to the WorldCat bibliographic database and the AIP International Catalog of Sources for History of Physics. Historic photographs and documents were digitized for online exhibitions to raise awareness of the archives among researchers and the general public. The success of the Legacy Project depended on collaboration between archivists, librarians, historians, data specialists, and scientists. This presentation will discuss key aspects (funding, staffing, preservation, access, outreach) of the Legacy Project and is aimed at personnel in observatories, research

  13. To Humbly Go: Guarding Against Perpetuating Models of Colonization in the 100-Year Starship Study

    Science.gov (United States)

    Kramer, W. R.

    Past patterns of exploration, colonization and exploitation on Earth continue to provide the predominant paradigms that guide many space programs. Any project of crewed space exploration, especially of the magnitude envisioned by the 100-Year Starship Study, must guard against the hubris that may emerge among planners, crew, and others associated with the project, including those industries and bureaucracies that will emerge from the effort. Maintaining a non-exploitative approach may be difficult in consideration of the century of preparatory research and development and the likely multigenerational nature of the voyage itself. Starting now with mission dreamers and planners, the purpose of the voyage must be cast as one of respectful learning and humble discovery, not of conquest (either actual or metaphorical) or other inappropriate models, including military. At a minimum, the Study must actively build non-violence into the voyaging culture it is beginning to create today. References to exploitive colonization, conquest, destiny and other terms from especially American frontier mythology, while tempting in their propagandizing power, should be avoided as they limit creative thinking about alternative possible futures. Future voyagers must strive to adapt to new environments wherever possible and be assimilated by new worlds both biologically and behaviorally rather than to rely on attempts to recreate the Earth they have left. Adaptation should be strongly considered over terraforming. This paper provides an overview of previous work linking the language of colonization to space programs and challenges the extension of the myth of the American frontier to the Starship Study. It argues that such metaphors would be counter-productive at best and have the potential to doom long-term success and survival by planting seeds of social decay and self-destruction. Cautions and recommendations are suggested.

  14. 100 years of California’s water rights system: patterns, trends and uncertainty

    Science.gov (United States)

    Grantham, Theodore E.; Viers, Joshua H.

    2014-08-01

    For 100 years, California’s State Water Resources Control Board and its predecessors have been responsible for allocating available water supplies to beneficial uses, but inaccurate and incomplete accounting of water rights has made the state ill-equipped to satisfy growing societal demands for water supply reliability and healthy ecosystems. Here, we present the first comprehensive evaluation of appropriative water rights to identify where, and to what extent, water has been dedicated to human uses relative to natural supplies. The results show that water right allocations total 400 billion cubic meters, approximately five times the state’s mean annual runoff. In the state’s major river basins, water rights account for up to 1000% of natural surface water supplies, with the greatest degree of appropriation observed in tributaries to the Sacramento and San Joaquin Rivers and in coastal streams in southern California. Comparisons with water supplies and estimates of actual use indicate substantial uncertainty in how water rights are exercised. In arid regions such as California, over-allocation of surface water coupled with trends of decreasing supply suggest that new water demands will be met by re-allocation from existing uses. Without improvements to the water rights system, growing human and environmental demands portend an intensification of regional water scarcity and social conflict. California’s legal framework for managing its water resources is largely compatible with needed reforms, but additional public investment is required to enhance the capacity of the state’s water management institutions to effectively track and regulate water rights.

  15. Land use mapping from CBERS-2 images with open source tools by applying different classification algorithms

    Science.gov (United States)

    Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.

    2016-02-01

    Land cover classification is often based on different characteristics between classes, but with great homogeneity within each one of them. This cover is obtained through field work or by means of processing satellite images. Field work involves high costs; therefore, digital image processing techniques have become an important alternative to perform this task. However, in some developing countries, and particularly in Casacoima municipality in Venezuela, there is a lack of geographic information systems due to the lack of updated information and the high costs of software license acquisition. This research proposes a low-cost methodology to develop thematic mapping of local land use and types of coverage in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network using open source tools. The supervised classification method per pixel and per region was applied using different classification algorithms and comparing them among themselves. Classification per pixel was based on the Maxver (maximum likelihood) and Euclidean distance (minimum distance) algorithms, while classification per region was based on the Bhattacharya algorithm. Satisfactory results were obtained from per-region classification, with an overall reliability of 83.93% and a kappa index of 0.81. The Maxver algorithm showed a reliability value of 73.36% and a kappa index of 0.69, while Euclidean distance obtained values of 67.17% and 0.61 for reliability and kappa index, respectively. It was demonstrated that the proposed methodology is very useful in cartographic processing and updating, which in turn serves as support to develop management plans and land management. Hence, open source tools showed themselves to be an economically viable alternative not only for forestry organizations, but for the general public, allowing them to develop projects in economically depressed and/or environmentally threatened areas.
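
    The two agreement statistics reported above (overall reliability and the kappa index) are both derived from a confusion matrix of reference versus classified pixels. A sketch with an invented two-class matrix, not the CBERS-2 results:

```python
def accuracy_and_kappa(matrix):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = reference classes, columns = classified classes)."""
    total = sum(sum(row) for row in matrix)
    observed = sum(matrix[i][i] for i in range(len(matrix))) / total
    # Chance agreement: product of row and column marginal totals per class.
    expected = sum(
        sum(matrix[i]) * sum(row[i] for row in matrix)
        for i in range(len(matrix))
    ) / total ** 2
    return observed, (observed - expected) / (1 - expected)

cm = [[45, 5], [10, 40]]        # made-up 2-class example
acc, kappa = accuracy_and_kappa(cm)
print(acc, kappa)               # 0.85 0.7
```

    Kappa discounts the agreement expected by chance, which is why it is lower than the raw accuracy and is reported as a unitless index rather than a percentage.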

  16. The Hunterian Neurosurgical Laboratory: the first 100 years of neurosurgical research.

    Science.gov (United States)

    Sampath, P; Long, D M; Brem, H

    2000-01-01

    Modern neurosurgery has long had a strong laboratory foundation, and much of this tradition can be traced to the Hunterian Neurosurgical Laboratory of the Johns Hopkins Hospital. Founded with the basic goals of investigating the causes and symptoms of disease and establishing the crucial role that surgeons may play in the treatment of disease, the Hunterian laboratory has adhered to these tenets, despite the dramatic changes in neurosurgery that have occurred in the last 100 years. Named for the famous English surgeon John Hunter (1728-1793), the Hunterian laboratory was conceived by William Welch and William Halsted as a special laboratory for experimental work in surgery and pathology. In 1904, Harvey Cushing was appointed by Halsted to direct the laboratory. With the three primary goals of student education, veterinary surgery that stressed surgical techniques, and meticulous surgical and laboratory record-keeping, the laboratory was quite productive, introducing the use of physiological saline solutions, describing the anatomic features and function of the pituitary gland, and establishing the field of endocrinology. In addition, the original development of hanging drop tissue culture, fundamental investigations into cerebrospinal fluid, and countless contributions to otolaryngology by Samuel Crowe all occurred during this "crucible" period. In 1912, Cushing was succeeded by Walter Dandy, whose work on experimental hydrocephalus and cerebrospinal fluid circulation led to the development of pneumoencephalography. The early days of neurosurgery evolved with close ties to general surgery, and so did the Hunterian laboratory. After Dandy began devoting his time to clinical work, general surgeons (first Jay McLean and then, in 1922, Ferdinand Lee) became the directors of the laboratory. Between 1928 and 1942, more than 150 original articles were issued from the Hunterian laboratory; these articles described significant advances in surgery, including pioneering

  17. Underworld-GT Applied to Guangdong, a Tool to Explore the Geothermal Potential of the Crust

    Institute of Scientific and Technical Information of China (English)

    Steve Quenette; Yufei Xi; John Mansour; Louis Moresi; David Abramson

    2015-01-01

    Geothermal energy potential is usually discussed in the context of conventional or engineered systems and at the scale of an individual reservoir. Whereas exploration for conventional reservoirs has been relatively easy, with expressions of resource found close to or even at the surface, exploration for non-conventional systems relies on temperature inherently increasing with depth and searching for favourable geological environments that maximise this increase. To utilise the information we do have, we often assimilate available exploration data with models that capture the physics of the dominant underlying processes. Here, we discuss computational modelling approaches to exploration at a regional or crust scale, with application to geothermal reservoirs within basins or systems of basins. Target reservoirs have (at least) appropriate temperature, permeability and are at accessible depths. We discuss the software development approach that leads to effective use of the tool Underworld. We explore its role in the process of modelling, understanding computational error, importing and exporting geological knowledge as applied to the geological system underpinning the Guangdong Province, China.

  18. Using Space Weather Forecast Tools for Understanding Planetary Magnetospheres: MESSENGER Experience Applied to MAVEN Studies

    Science.gov (United States)

    Baker, Daniel N.; Dewey, R. M.; Brain, D. A.; Jakosky, Bruce; Halekas, Jasper; Connerney, Jack; Odstrcil, Dusan; Mays, M. Leila; Luhmann, Janet

    2015-04-01

    The Wang-Sheeley-Arge (WSA)-ENLIL solar wind modeling tool has been used to calculate the values of interplanetary magnetic field (IMF) strength (B), solar wind speed (V), density (n), ram pressure (~nV2), cross-magnetosphere electric field (VxB), Alfvén Mach number (MA), and other derived quantities of relevance for space weather purposes at Earth. Such parameters as solar wind dynamic pressure can be key for estimating the magnetopause standoff distance, as just one example. The interplanetary electric field drives many magnetospheric dynamical processes and can be compared with general magnetic activity indices and with the occurrence of energetic particle bursts within the Earth’s magnetosphere. Such parameters also serve as input to the global magnetohydrodynamic and kinetic magnetosphere models that are used to forecast magnetospheric and ionospheric processes. Such modeling done for Earth space weather forecasting has helped assess near-real-time magnetospheric behavior for MESSENGER at Mercury (as well as other mission analysis and Mercury ground-based observational campaigns). This solar-wind forcing knowledge has provided a crucial continuing step toward bringing heliospheric science expertise to bear on solar-planetary interaction studies. The experience gained from MESSENGER at Mercury is now being applied to the new observations from the MAVEN (Mars Atmosphere and Volatile Evolution) mission at Mars. We compare the continuous WSA-ENLIL results derived from modeling to the MAVEN SWIA and MAG data from mid-December 2014 to the present time. This provides a broader contextual view of solar wind forcing at Mars and also allows a broader validation of the ENLIL model results throughout the inner heliosphere.
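
    Several of the derived solar wind quantities listed above follow from standard plasma formulas. A sketch with illustrative quiet-wind values near 1 AU (not MESSENGER or MAVEN data):

```python
import math

M_P = 1.6726e-27        # proton mass [kg]
MU0 = 4e-7 * math.pi    # vacuum permeability [H/m]

def ram_pressure(n_cm3: float, v_kms: float) -> float:
    """Dynamic pressure rho*V^2 in nPa, for proton density n and speed V."""
    rho = n_cm3 * 1e6 * M_P                  # kg/m^3
    return rho * (v_kms * 1e3) ** 2 * 1e9    # Pa -> nPa

def motional_efield(v_kms: float, b_nt: float) -> float:
    """|V x B| in mV/m, assuming V perpendicular to B."""
    return v_kms * 1e3 * b_nt * 1e-9 * 1e3

def alfven_mach(v_kms: float, n_cm3: float, b_nt: float) -> float:
    """M_A = V / V_A, with Alfven speed V_A = B / sqrt(mu0 * rho)."""
    rho = n_cm3 * 1e6 * M_P
    v_alfven = b_nt * 1e-9 / math.sqrt(MU0 * rho)
    return v_kms * 1e3 / v_alfven

# Typical quiet solar wind near 1 AU: n = 5 cm^-3, V = 400 km/s, B = 5 nT.
print(f"{ram_pressure(5, 400):.2f} nPa")      # ~1.34 nPa
print(f"{motional_efield(400, 5):.1f} mV/m")  # 2.0 mV/m
print(f"{alfven_mach(400, 5, 5):.1f}")        # ~8.2
```

    The same formulas scale the WSA-ENLIL output to other heliocentric distances, which is what makes the Earth-oriented tool reusable at Mercury and Mars.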

  19. Reply to the Comment of Leclercq et al. on "100-year mass changes in the Swiss Alps linked to the Atlantic Multidecadal Oscillation"

    Directory of Open Access Journals (Sweden)

    M. Huss

    2010-12-01

    In their comment, Leclercq et al. argue that Huss et al. (2010) overestimate the effect of the Atlantic Multidecadal Oscillation (AMO) on the 100-year mass balance variations in the Swiss Alps because time series of conventional balances instead of reference-surface balances were used. Applying the same model as in Huss et al., we calculate time series of reference-surface mass balance and show that the difference between conventional and reference-surface mass balance is significantly smaller than stated in the comment. Both series exhibit very similar multidecadal variations. The opposing effects of retreat and surface lowering on mass balance partly cancel each other.

  20. The Gender Analysis Tools Applied in Natural Disasters Management: A Systematic Literature Review

    OpenAIRE

    Sohrabizadeh, Sanaz; Tourani, Sogand; Khankeh, Hamid Reza

    2014-01-01

    Background: Although natural disasters have caused considerable damages around the world, and gender analysis can improve community disaster preparedness or mitigation, there is little research about the gendered analytical tools and methods in communities exposed to natural disasters and hazards. These tools evaluate gender vulnerability and capacity in pre-disaster and post-disaster phases of the disaster management cycle. Objectives: Identifying the analytical gender tools and the strength...

  1. Tool for Experimenting with Concepts of Mobile Robotics as Applied to Children's Education

    Science.gov (United States)

    Jimenez Jojoa, E. M.; Bravo, E. C.; Bacca Cortes, E. B.

    2010-01-01

    This paper describes the design and implementation of a tool for experimenting with mobile robotics concepts, primarily for use by children and teenagers, or by the general public, without previous experience in robotics. This tool helps children learn about science in an approachable and interactive way, using scientific research principles in…

  2. XVII International Botanical Congress. 100 years after the II IBC in Vienna 1905. Abstracts

    International Nuclear Information System (INIS)

    Full text: The program of XVII IBC 2005 includes all aspects of basic and applied botanical research. Progress in the different sub-disciplines is revealed through plenary talks, general lectures, symposia, and poster sessions. This conference emphasizes the newest developments in the botanical sciences worldwide. (botek)

  3. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    Science.gov (United States)

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  4. 100 years of elementary particles [Beam Line, vol. 27, number 1, Spring 1997

    International Nuclear Information System (INIS)

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe

  5. 100 years of elementary particles [Beam Line, vol. 27, issue 1, Spring 1997

    Energy Technology Data Exchange (ETDEWEB)

    Pais, Abraham; Weinberg, Steven; Quigg, Chris; Riordan, Michael; Panofsky, Wolfgang K.H.; Trimble, Virginia

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  6. 100 years of Elementary Particles [Beam Line, vol. 27, issue 1, Spring 1997

    Science.gov (United States)

    Pais, Abraham; Weinberg, Steven; Quigg, Chris; Riordan, Michael; Panofsky, Wolfgang K. H.; Trimble, Virginia

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  7. Development of an intelligent system for tool wear monitoring applying neural networks

    Directory of Open Access Journals (Sweden)

    A. Antić

    2005-12-01

Full Text Available Purpose: The objective of the research presented in this paper is to investigate, under laboratory conditions, the applicability of the proposed system for tool wear monitoring in hard turning, using modern tools and artificial intelligence (AI) methods. Design/methodology/approach: Basic theoretical principles, computational simulation, neural network training, and the conducted experiments were used to investigate the adequacy of the proposed setup. Findings: The paper presents tool wear monitoring in hard turning for certain types of neural network configurations, with preconditions for extending the approach with dynamic neural networks. Research limitations/implications: Future research should include integration of the proposed system into the CNC machine, instead of the current separate system, which would provide synchronisation between the system and the machine, i.e. an appropriate reaction by the machine once excessive tool wear is determined. Practical implications: Practical application of the conducted research is possible with certain restrictions, supplemented by an adequate number of experiments directed towards the particular combinations of machined materials and tools for which the neural networks are trained. Originality/value: The contribution of the conducted research lies in a possible model of the tool monitoring system, designed on a modular principle, and in the approach to building the neural network.

  8. Hamiltonian Systems and Optimal Control in Computational Anatomy: 100 Years Since D'Arcy Thompson.

    Science.gov (United States)

    Miller, Michael I; Trouvé, Alain; Younes, Laurent

    2015-01-01

    The Computational Anatomy project is the morphome-scale study of shape and form, which we model as an orbit under diffeomorphic group action. Metric comparison calculates the geodesic length of the diffeomorphic flow connecting one form to another. Geodesic connection provides a positioning system for coordinatizing the forms and positioning their associated functional information. This article reviews progress since the Euler-Lagrange characterization of the geodesics a decade ago. Geodesic positioning is posed as a series of problems in Hamiltonian control, which emphasize the key reduction from the Eulerian momentum with dimension of the flow of the group, to the parametric coordinates appropriate to the dimension of the submanifolds being positioned. The Hamiltonian viewpoint provides important extensions of the core setting to new, object-informed positioning systems. Several submanifold mapping problems are discussed as they apply to metamorphosis, multiple shape spaces, and longitudinal time series studies of growth and atrophy via shape splines.

  9. Accumulation of pharmaceuticals, Enterococcus, and resistance genes in soils irrigated with wastewater for zero to 100 years in central Mexico.

    Directory of Open Access Journals (Sweden)

    Philipp Dalkmann

Full Text Available Irrigation with wastewater releases pharmaceuticals, pathogenic bacteria, and resistance genes, but little is known about the accumulation of these contaminants in the environment when wastewater is applied for decades. We sampled a chronosequence of soils that were variously irrigated with wastewater from zero up to 100 years in the Mezquital Valley, Mexico, and investigated the accumulation of ciprofloxacin, enrofloxacin, sulfamethoxazole, trimethoprim, clarithromycin, carbamazepine, bezafibrate, naproxen, and diclofenac, as well as the occurrence of Enterococcus spp. and sul and qnr resistance genes. Total concentrations of ciprofloxacin, sulfamethoxazole, and carbamazepine increased with irrigation duration, reaching 95% of their upper limits of 1.4 µg/kg (ciprofloxacin), 4.3 µg/kg (sulfamethoxazole), and 5.4 µg/kg (carbamazepine) in soils irrigated for 19-28 years. Accumulation was soil-type-specific, with the largest accumulation rates in Leptosols and no time trend in Vertisols. Acidic pharmaceuticals (diclofenac, naproxen, bezafibrate) were not retained and thus did not accumulate in soils. We did not detect qnrA genes, but qnrS and qnrB genes were found in two of the irrigated soils. Relative concentrations of sul1 genes in irrigated soils were two orders of magnitude larger (3.15 × 10⁻³ ± 0.22 × 10⁻³ copies/16S rDNA) than in non-irrigated soils (4.35 × 10⁻⁵ ± 1.00 × 10⁻⁵ copies/16S rDNA), while those of sul2 exceeded the ones in non-irrigated soils by a factor of 22 (6.61 × 10⁻⁴ ± 0.59 × 10⁻⁴ versus 2.99 × 10⁻⁵ ± 0.26 × 10⁻⁵ copies/16S rDNA). Absolute numbers of sul genes continued to increase with prolonged irrigation, together with Enterococcus spp. 23S rDNA and total 16S rDNA contents. Increasing total concentrations of antibiotics in soil are not accompanied by increasing relative abundances of resistance genes. Nevertheless, wastewater irrigation enlarges the absolute concentration of resistance genes in soils due to a
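The enrichment factors quoted above can be checked directly from the reported relative abundances; this is a quick arithmetic sketch with variable names of our choosing.

```python
# Relative abundances (copies per 16S rDNA) reported for irrigated vs
# non-irrigated soils; check of the quoted enrichment factors.
sul1_irr, sul1_ctrl = 3.15e-3, 4.35e-5
sul2_irr, sul2_ctrl = 6.61e-4, 2.99e-5

ratio_sul1 = sul1_irr / sul1_ctrl  # ~72x, i.e. "two orders of magnitude"
ratio_sul2 = sul2_irr / sul2_ctrl  # ~22x, the stated "factor of 22"
```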

  10. An assessment tool applied to manure management systems using innovative technologies

    DEFF Research Database (Denmark)

    Sørensen, Claus G.; Jacobsen, Brian H.; Sommer, Sven G.

    2003-01-01

    operational and cost-effective animal manure handling technologies. An assessment tool covering the whole chain of the manure handling system from the animal houses to the field has been developed. The tool enables a system-oriented evaluation of labour demand, machinery capacity and costs related to the...... tanker transport may reduce labour requirements, increase capacity, and open up new ways for reducing ammonia emission. In its most efficient configuration, the use of umbilical systems may reduce the labour requirement by about 40% and increase capacity by 80%. However, these systems are costly and will...

  11. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

Most educational programs are designed to produce lower-level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a powerful, proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  12. The Theory of Planned Behaviour Applied to Search Engines as a Learning Tool

    Science.gov (United States)

    Liaw, Shu-Sheng

    2004-01-01

    Search engines have been developed for helping learners to seek online information. Based on theory of planned behaviour approach, this research intends to investigate the behaviour of using search engines as a learning tool. After factor analysis, the results suggest that perceived satisfaction of search engine, search engines as an information…

  13. Changing patterns of infant death over the last 100 years: autopsy experience from a specialist children's hospital

    OpenAIRE

    Pryce, J. W.; Weber, M A; Ashworth, M T; Roberts, S; Malone, M.; Sebire, N. J.

    2012-01-01

OBJECTIVES: Infant mortality has undergone a dramatic reduction in the UK over the past century because of improvements in public health policy and medical advances. Postmortem examinations have been performed at Great Ormond Street Hospital for over 100 years, and analysis of cases across this period has been performed to assess changing patterns of infant deaths undergoing autopsy. DESIGN: Autopsy reports from 1909 and 2009 were examined. Age, major pathology and cause of death were reviewed...

  14. Adaptive Monte Carlo applied to uncertainty estimation in a five axis machine tool link errors identification

    CERN Document Server

    Andolfatto, Loïc; Lavernhe, Sylvain; 10.1016/j.ijmachtools.2011.03.006

    2011-01-01

Knowledge of a machine tool's axis-to-axis location errors allows compensation and corrective actions to be taken to enhance its volumetric accuracy. Several procedures exist, involving either a lengthy individual test for each geometric error or faster single tests that identify all errors at once. This study focuses on the closed kinematic Cartesian chain method, which uses a single-setup test to identify the eight link errors of a five-axis machine tool. The identification is based on volumetric error measurements for different poses with a non-contact measuring instrument called CapBall, developed in house. In order to evaluate the uncertainty on each identified error, a multi-output Monte Carlo approach is implemented. Uncertainty sources in the measurement and identification chain - such as sensor output, machine drift and frame transformation uncertainties - can be included in the model and propagated to the identified errors. The estimated uncertainties are finally compared to experimental results to assess...
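The multi-output Monte Carlo propagation described above can be sketched as follows. The identification step here is a hypothetical linear placeholder (gain 0.5), not the paper's CapBall-based model; the structure, perturbing the inputs and taking per-output statistics, is what matters.

```python
import random
import statistics

def identify_errors(measurements):
    # Hypothetical identification step mapping volumetric error
    # measurements to link errors (stand-in for the real model):
    # here, a simple linear map with gain 0.5.
    return [m * 0.5 for m in measurements]

def monte_carlo_uncertainty(measurements, sigma, n_runs=5000, seed=0):
    """Propagate Gaussian measurement noise (std dev sigma) through the
    identification; return the std dev of each identified error."""
    rng = random.Random(seed)
    runs = []
    for _ in range(n_runs):
        noisy = [m + rng.gauss(0.0, sigma) for m in measurements]
        runs.append(identify_errors(noisy))
    # std dev over runs, per identified error (multi-output)
    return [statistics.stdev(col) for col in zip(*runs)]

# Three hypothetical volumetric error measurements (mm), noise 1 um
u = monte_carlo_uncertainty([0.010, 0.020, 0.030], sigma=0.001)
# each identified-error uncertainty ~ 0.5 * sigma = 5e-4 mm
```

With a linear identification, the propagated uncertainty is simply the gain times the input noise, which is a useful sanity check before adding drift and frame-transformation terms.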

  15. Social Network Analysis and Big Data tools applied to the Systemic Risk supervision

    OpenAIRE

    Mari-Carmen Mochón

    2016-01-01

After the financial crisis that started in 2008, international market supervisors of the G20 agreed to reinforce their systemic risk supervisory duties. For this purpose, several regulatory reporting obligations were imposed on market participants. As a consequence, millions of trade details are now available to National Competent Authorities on a daily basis. Traditional monitoring tools may not be capable of analyzing such volumes of data and extracting the relevant information, in order t...

  16. Applying quality management tools to medical photography services: a pilot project.

    Science.gov (United States)

    Murray, Peter

    2003-03-01

    The Medical Photography Department at Peterborough Hospitals NHS Trust set up a pilot project to reduce the turnaround time of fundus fluorescein angiograms to the Ophthalmology Department. Quality management tools were used to analyse current photographic practices and develop more efficient methods of service delivery. The improved service to the Ophthalmology Department demonstrates the value of quality management in developing medical photography services at Peterborough Hospitals.

  17. Surveillance as an innovative tool for furthering technological development as applied to the plastic packaging sector

    OpenAIRE

    Freddy Abel Vargas; Óscar Fernando Castellanos Domínguez

    2010-01-01

The demand for production process efficiency and quality has made it necessary to resort to new tools for development and technological innovation. Surveillance of the environment has thus been identified as a priority, paying special attention to technology which (by its changing nature) is a key factor in competitiveness. Surveillance is a routine activity in developed countries' organisations; however, few suitable studies have been carried out in Colombia and few instruments produced...

  18. Applying Model Driven Engineering Techniques and Tools to the Planets Game Learning Scenario

    OpenAIRE

    Nodenot, Thierry; Caron, Pierre André; Le Pallec, Xavier; Laforcade, Pierre

    2008-01-01

    CPM (Cooperative Problem-Based learning Metamodel) is a visual language for the instructional design of Problem-Based Learning (PBL) situations. This language is a UML profile implemented on top of the Objecteering UML Case tool. In this article, we first present the way we used CPM language to bring about the pedagogical transposition of the planets game learning scenario. Then, we propose some related works conducted to improve CPM usability: on the one hand, we outline a MOF solution and a...

  19. δ18O record and temperature change over the past 100 years in ice cores on the Tibetan Plateau

    Institute of Scientific and Technical Information of China (English)

    YAO; Tandong; GUO; Xuejun; Lonnie; Thompson; DUAN; Keqin; WANG; Ninglian; PU; Jianchen; XU; Baiqing; YANG; Xiaoxin; SUN; Weizhen

    2006-01-01

    The 213 m ice core from the Puruogangri Ice Field on the Tibetan Plateau facilitates the study of the regional temperature changes with its δ18O record of the past 100 years. Here we combine information from this core with that from the Dasuopu ice core (from the southern Tibetan Plateau), the Guliya ice core (from the northwestern Plateau) and the Dunde ice core (from the northeastern Plateau) to learn about the regional differences in temperature change across the Tibetan Plateau. The δ18O changes vary with region on the Plateau, the variations being especially large between South and North and between East and West. Moreover, these four ice cores present increasing δ18O trends, indicating warming on the Tibetan Plateau over the past 100 years. A comparative study of Northern Hemisphere (NH) temperature changes, the δ18O-reflected temperature changes on the Plateau, and available meteorological records show consistent trends in overall warming during the past 100 years.

  20. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and the standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process involving more than 100 employees and one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implement regional and corporate procedures, focusing on the main operational processes. (author)

  1. Atomic Force Microscopy as a Tool for Applied Virology and Microbiology

    Science.gov (United States)

    Zaitsev, Boris

    2003-12-01

An atomic force microscope (AFM) can be used successfully for the simple and fast solution of many applied biological problems. This paper surveys the results of applying the atomic force microscope SolverP47BIO (NT-MDT, Russia) at the State Research Center of Virology and Biotechnology "Vector". The AFM has been used: - in applied virology, for counting viral particles and examining virus-cell interaction; - in microbiology, for measuring and detecting bacterial spores and cells; - in biotechnology, for controlling biotechnological processes and evaluating the particle-size distribution for viral and bacterial diagnostic assays. The main advantages of AFM in applied research are the simplicity of sample preparation and the short examination time.

  2. Applying CRISPR-Cas9 tools to identify and characterize transcriptional enhancers.

    Science.gov (United States)

    Lopes, Rui; Korkmaz, Gozde; Agami, Reuven

    2016-09-01

    The development of the CRISPR-Cas9 system triggered a revolution in the field of genome engineering. Initially, the use of this system was focused on the study of protein-coding genes but, recently, a number of CRISPR-Cas9-based tools have been developed to study non-coding transcriptional regulatory elements. These technological advances offer unprecedented opportunities for elucidating the functions of enhancers in their endogenous context. Here, we discuss the application, current limitations and future development of CRISPR-Cas9 systems to identify and characterize enhancer elements in a high-throughput manner.

  3. QUALITY MANAGEMENT TOOLS APPLYING IN THE STRATEGY OF LOGISTICS SERVICES QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Agnieszka Czajkowska

    2015-11-01

Full Text Available A combination of factors such as a properly organized logistics process, absence of nonconformities, avoidance of transport damage, and transport in accordance with the Just-In-Time idea significantly reduces costs and streamlines the entire production process. This paper proposes a quality management tool for the assessment of logistics services, based on results obtained in a selected company operating in Eastern Europe. Customers' expectations and perceptions were compared using the SERVQUAL method, which assesses service quality in five areas: materiality, reliability, promptness, competency and empathy. The SERVQUAL method allows assessing the service quality level and identifying company areas that require corrective actions within the improvement process.
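A SERVQUAL-style gap computation (perception minus expectation per dimension, with the most negative gap flagged for corrective action) can be sketched as follows. The scores are invented for illustration and are not the study's data.

```python
# SERVQUAL-style gap analysis: gap = perception - expectation per
# dimension. Scores are illustrative 1-7 Likert means, not study data.
dimensions = ["materiality", "reliability", "promptness",
              "competency", "empathy"]
expectations = {"materiality": 6.2, "reliability": 6.8,
                "promptness": 6.5, "competency": 6.4, "empathy": 5.9}
perceptions = {"materiality": 5.9, "reliability": 6.1,
               "promptness": 5.2, "competency": 6.0, "empathy": 5.8}

gaps = {d: round(perceptions[d] - expectations[d], 2) for d in dimensions}
# The most negative gap marks the area needing corrective action first
worst = min(gaps, key=gaps.get)
```

A negative gap in every dimension is typical; the ranking of the gaps, rather than their absolute size, usually drives where the improvement process starts.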

  4. Pin Load Control Applied to Retractable Pin Tool Technology and its Characterization

    Science.gov (United States)

    Oelgoetz, Peter

    2000-01-01

Until the development of retractable pin tool (RPT) technology, friction stir welding (FSW) was limited to constant-thickness joining of aluminum materials, and the choices for keyhole elimination focused on traditional fusion and plug weld repair techniques. An invention, US Patent Number 5,893,507, "Auto-Adjustable Pin Tool for Friction Stir Welding", assigned to NASA, demonstrated an approach to resolve these serious drawbacks. This approach brings forth a technique that allows the crater, or keyhole, to be closed out automatically at the end of the weld joint without adding any additional equipment or material. Also, the probe length can be varied automatically in the weld joint to compensate for material thickness changes, such as in a tapered joint. This paper reports the effects of pin extension and retraction rates in the weld joint and their correlation to weld quality. The investigation utilized a pin load-detecting device that was integrated in the Phase 2A RPT designed by Boeing for NASA/MSFC. The RPT modification provided pin load data that was accessed and used to eliminate root-side indications and determine the pin manipulation rates necessary to produce consistent, homogeneous joints.

  5. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    Science.gov (United States)

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. However, observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to get a rich picture of the influences on sequences of verbal or nonverbal behavior.

  6. Orymold: ontology based gene expression data integration and analysis tool applied to rice

    Directory of Open Access Journals (Sweden)

    Segura Jordi

    2009-05-01

Full Text Available Abstract Background Integration and exploration of data obtained from genome-wide monitoring technologies has become a major challenge for many bioinformaticists and biologists due to its heterogeneity and high dimensionality. A widely accepted approach to solve these issues has been the creation and use of controlled vocabularies (ontologies). Ontologies allow for the formalization of domain knowledge, which in turn enables generalization in the creation of querying interfaces as well as in the integration of heterogeneous data, providing both human and machine readable interfaces. Results We designed and implemented a software tool that allows investigators to create their own semantic model of an organism and to use it to dynamically integrate expression data obtained from DNA microarrays and other probe-based technologies. The software provides tools to use the semantic model to postulate and validate hypotheses on the spatial and temporal expression and function of genes. In order to illustrate the software's use and features, we used it to build a semantic model of rice (Oryza sativa) and integrated experimental data into it. Conclusion In this paper we describe the development and features of a flexible software application for dynamic gene expression data annotation, integration, and exploration called Orymold. Orymold is freely available for non-commercial users from http://www.oryzon.com/media/orymold.html

  7. Enabling to Apply XP Process in Distributed Development Environments with Tool Support

    Directory of Open Access Journals (Sweden)

    Ali Akbar Ansari

    2012-07-01

Full Text Available Evaluation of the XP methodology, in both academic and industrial settings, has shown very good results when it is applied to small or medium co-located working groups. In this paper, we describe an approach that overcomes the XP constraint of collocation by introducing a process-support environment (called M.P.D.X.P) that helps software development teams and solves the problems which arise when XP is carried out by distributed teams.

  8. 100 years of radar

    CERN Document Server

    Galati, Gaspare

    2016-01-01

    This book offers fascinating insights into the key technical and scientific developments in the history of radar, from the first patent, taken out by Hülsmeyer in 1904, through to the present day. Landmark events are highlighted and fascinating insights provided into the exceptional people who made possible the progress in the field, including the scientists and technologists who worked independently and under strict secrecy in various countries across the world in the 1930s and the big businessmen who played an important role after World War II. The book encourages multiple levels of reading. The author is a leading radar researcher who is ideally placed to offer a technical/scientific perspective as well as a historical one. He has taken care to structure and write the book in such a way as to appeal to both non-specialists and experts. The book is not sponsored by any company or body, either formally or informally, and is therefore entirely unbiased. The text is enriched by approximately three hundred ima...

  9. 100 years of superconductivity

    CERN Multimedia

    Globe Info

    2011-01-01

    Public lecture by Philippe Lebrun, who works at CERN on applications of superconductivity and cryogenics for particle accelerators. He was head of CERN’s Accelerator Technology Department during the LHC construction period. Centre culturel Jean Monnet, route de Gex Tuesday 11 October from 8.30 p.m. to 10.00 p.m. » Suitable for all – Admission free - Lecture in French » Number of places limited For further information: +33 (0)4 50 42 29 37

  10. Applied Railway Optimization in Production Planning at DSB-S-tog - Tasks, Tools and Challenges

    DEFF Research Database (Denmark)

    Clausen, Jens

    2007-01-01

    these conflicting goals. S-tog has therefore on the strategic level decided to use software with optimization capabilities in the planning processes. We describe the current status for each activity using optimization or simulation as a tool: Timetable evaluation, rolling stock planning, and crew scheduling...... to the customers, and has concurrently been met with demands for higher efficiency in the daily operation. The plans of timetable, rolling stock and crew must hence allow for a high level of customer service, be efficient, and be robust against disturbances of operations. It is a highly non-trivial task to meet....... In addition we describe on-going efforts in using mathematical models in activities such as timetable design and work-force planning. We also identify some organizatorial key factors, which have paved the way for extended use of optimization methods in railway production planning....

  11. Teaching Strategies to Apply in the Use of Technological Tools in Technical Education

    Directory of Open Access Journals (Sweden)

    Olga Arranz García

    2014-09-01

Full Text Available The emergence of new technologies in the education area is changing the way educational processes are organized. Teachers are not unaffected by these changes and must employ new strategies to adapt their teaching methods to the new circumstances. One of these adaptations is framed within virtual learning, where learning management systems have been revealed as a very effective means within the learning process. In this paper we try to show teachers in engineering schools how to use the different technological tools present in a virtual platform in an appropriate way. Thus, in the experimental framework we show the outcomes of the analysis of two data samples obtained before and after the implementation of the European Higher Education Area, which may be extrapolated for innovative application to learning techniques.

  12. Quantitative seismic interpretation: Applying rock physics tools to reduce interpretation risk

    Institute of Scientific and Technical Information of China (English)

    Yong Chen

    2007-01-01

Seismic data analysis is one of the key technologies for characterizing reservoirs and monitoring subsurface pore fluids. While there have been great advances in 3D seismic data processing, the quantitative interpretation of the seismic data for rock properties still poses many challenges. This book demonstrates how rock physics can be applied to predict reservoir parameters, such as lithologies and pore fluids, from seismically derived attributes, as well as how the multidisciplinary combination of rock physics models with seismic data, sedimentological information, and stochastic techniques can lead to more powerful results than can be obtained from a single technique.

  13. Prediction of permafrost distribution on the Qinghai-Tibet Plateau in the next 50 and 100 years

    Institute of Scientific and Technical Information of China (English)

    NAN; Zhuotong; LI; Shuxun; CHENG; Guodong

    2005-01-01

    The Intergovernmental Panel on Climate Change (IPCC) reported in 2001 that the Earth's air temperature would rise by 1.4-5.8℃, and by 2.5℃ on average, by the year 2100. Chinese regional climate model results also showed that the air temperature on the Qinghai-Tibet Plateau (QTP) would increase by 2.2-2.6℃ in the next 50 years. A numerical permafrost model was used to predict the changes of permafrost distribution on the QTP over the next 50 and 100 years under two climatic warming scenarios, i.e. 0.02℃/a, the lower value of IPCC's estimation, and 0.052℃/a, the higher value predicted by Qin et al. Simulation results show that (i) in the case of a 0.02℃/a air-temperature rise, the permafrost area on the QTP will shrink by about 8.8% in the next 50 years, and high-temperature permafrost with mean annual ground temperature (MAGT) higher than -0.11℃ may turn into seasonally frozen soils. In the next 100 years, permafrost with MAGT higher than -0.5℃ will disappear and the permafrost area will shrink by up to 13.4%. (ii) In the case of a 0.052℃/a air-temperature rise, the permafrost area on the QTP will be reduced by about 13.5% after 50 years. More remarkable degradation will take place after 100 years, and the permafrost area will be reduced by about 46%. Permafrost with MAGT higher than -2℃ will turn into seasonally frozen soils and even unfrozen soils.
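
    As a rough illustration of the scenario arithmetic above (not the paper's numerical permafrost model, which accounts for the lag between air warming and ground warming), one can sketch a naive projection of MAGT under a constant warming rate. The one-to-one coupling of air and ground temperature and the 0 ℃ thaw threshold are simplifying assumptions:

```python
def projected_magt(magt_now, warming_rate, years):
    """Naively project mean annual ground temperature (MAGT, deg C),
    assuming ground temperature rises one-to-one with air temperature."""
    return magt_now + warming_rate * years

def degrades(magt_now, warming_rate, years, threshold=0.0):
    """True if permafrost at this MAGT would thaw (projected MAGT >= threshold)."""
    return projected_magt(magt_now, warming_rate, years) >= threshold

# The two warming scenarios from the study: 0.02 and 0.052 deg C per annum.
for rate in (0.02, 0.052):
    thawed = [magt for magt in (-0.5, -2.0, -4.0)
              if degrades(magt, rate, years=100)]
    print(rate, thawed)
```

Under this deliberately crude sketch, colder permafrost survives the low scenario but not the high one; the paper's model predicts smaller losses because ground temperatures respond more slowly than air temperatures.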

  14. The potential of social entrepreneurship: conceptual tools for applying citizenship theory to policy and practice.

    Science.gov (United States)

    Caldwell, Kate; Harris, Sarah Parker; Renko, Maija

    2012-12-01

    Contemporary policy encourages self-employment and entrepreneurship as a vehicle for empowerment and self-sufficiency among people with disabilities. However, such encouragement raises important citizenship questions concerning the participation of people with intellectual and developmental disabilities (IDD). As an innovative strategy for addressing pressing social and economic problems, "social entrepreneurship" has become a phrase that is gaining momentum in the IDD community--one that carries with it a very distinct history. Although social entrepreneurship holds the potential to be an empowering source of job creation and social innovation, it also has the potential to be used to further disenfranchise this marginalized population. It is crucial that, in moving forward, society takes care not to perpetuate existing models of oppression, particularly in regard to the social and economic participation of people with IDD. The conceptual tools addressed in this article can inform the way that researchers, policymakers, and practitioners approach complex issues, such as social entrepreneurship, to improve communication among disciplines while retaining an integral focus on rights and social justice by framing this issue within citizenship theory.

  15. Neutron tomography of particulate filters: a non-destructive investigation tool for applied and industrial research

    International Nuclear Information System (INIS)

    This research describes the development and implementation of high-fidelity neutron imaging and the associated analysis of the images. This advanced capability allows the non-destructive, non-invasive imaging of particulate filters (PFs) and of how the deposition of particulate matter and catalytic washcoat occurs within the filter. The majority of the efforts described here were performed at the High Flux Isotope Reactor (HFIR) CG-1D neutron imaging beamline at Oak Ridge National Laboratory; the current spatial resolution is approximately 50 μm. The sample holder is equipped with a high-precision rotation stage that allows 3D imaging (i.e., computed tomography) of the sample when combined with computerized reconstruction tools. What enables neutron-based imaging is the ability of some elements to absorb or scatter neutrons, while other elements allow neutrons to pass through with negligible interaction. Of particular interest in this study is the scattering of neutrons by hydrogen-containing molecules, such as hydrocarbons (HCs) and/or water, which are adsorbed to the surface of soot, ash and catalytic washcoat. Even so, the interactions with this adsorbed water/HC are low, and computational techniques were required to enhance the contrast, primarily a modified simultaneous iterative reconstruction technique (SIRT). This effort describes the following systems: particulate matter randomly distributed in a PF, ash deposition in PFs, a catalyzed washcoat layer in a PF, and three particulate loadings in a SiC PF.

  16. Environmental management systems tools applied to the nuclear fuel center of IPEN

    Energy Technology Data Exchange (ETDEWEB)

    Mattos, Luis A. Terribile de; Meldonian, Nelson Leon; Madi Filho, Tufic, E-mail: mattos@ipen.br, E-mail: meldonia@ipen.br, E-mail: tmfilho@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    This work aims to identify and classify the major environmental aspects and impacts related to the operation of the Nuclear Fuel Center of IPEN (CCN), through a systematic data survey, using interview questions and consultation of licensing documents and operational records. First, the facility processes and activities, and the interactions between these processes, were identified. Then, an analysis of potential failures and their probable causes was conducted to establish the significance of environmental aspects, as well as the operational controls which are necessary to ensure the prevention of impacts on the environment. The results obtained so far demonstrate the validity of this study as a tool for the identification of environmental aspects and impacts of nuclear facilities in general, as a way of achieving compliance with the ISO 14001:2004 standard. Moreover, it can serve as an auxiliary method for resolving issues related to meeting the applicable regulatory and legal requirements of the National Nuclear Energy Commission (CNEN) and the Brazilian Institute of Environment (IBAMA). (author)
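
    The abstract does not give the scoring scheme used to establish significance. A common way to rank environmental aspects in ISO 14001-style reviews is a probability × severity matrix; the sketch below uses hypothetical aspects, scales, and cutoff, purely for illustration:

```python
# Hypothetical environmental aspects with probability and severity on 1-5 scales.
aspects = [
    ("liquid effluent release", 2, 5),
    ("solid waste generation", 4, 3),
    ("power consumption", 5, 1),
]

def significance(probability, severity):
    """Simple risk-matrix score; aspects above a chosen cutoff are 'significant'."""
    return probability * severity

ranked = sorted(aspects, key=lambda a: significance(a[1], a[2]), reverse=True)
significant = [name for name, p, s in ranked if significance(p, s) >= 10]
```

Operational controls would then be attached to the aspects that clear the cutoff, with the matrix revisited as processes change.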

  17. Applied Circular Dichroism: A Facile Spectroscopic Tool for Configurational Assignment and Determination of Enantiopurity

    Directory of Open Access Journals (Sweden)

    Macduff O. Okuom

    2015-01-01

    Full Text Available In order to determine if electronic circular dichroism (ECD) is a good tool for the qualitative evaluation of absolute configuration and enantiopurity in the absence of chiral high performance liquid chromatography (HPLC), ECD studies were performed on several prescription and over-the-counter drugs. Cotton effects (CE) were observed for both S and R isomers between 200 and 300 nm. For the drugs examined in this study, the S isomers showed a negative CE, while the R isomers displayed a positive CE. The ECD spectra of both enantiomers were nearly mirror images, with the amplitude proportional to the enantiopurity. Plotting the differential extinction coefficient (Δε) versus enantiopurity at the wavelength of maximum amplitude yielded linear standard curves with coefficients of determination (R²) greater than 97% for both isomers in all cases. As expected, Equate, Advil, and Motrin, each containing a racemic mixture of ibuprofen, yielded no chiroptical signal. ECD spectra of Suphedrine and Sudafed revealed that each of them is rich in 1S,2S-pseudoephedrine, while analysis of the Equate vapor inhaler showed it is rich in R-methamphetamine.
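
    The linear standard curves described above are an ordinary least-squares fit of Δε against enantiopurity, which can then be inverted to estimate the purity of an unknown sample. A minimal sketch with invented calibration values (not the paper's measurements):

```python
def linear_fit(x, y):
    """Ordinary least squares: returns (slope, intercept, R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - slope * xi - intercept) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1.0 - ss_res / ss_tot

# Hypothetical calibration: enantiopurity (%) vs. delta-epsilon amplitude.
purity = [0.0, 25.0, 50.0, 75.0, 100.0]
d_eps = [0.02, 0.48, 1.01, 1.49, 2.03]
slope, intercept, r2 = linear_fit(purity, d_eps)

# An unknown sample's enantiopurity follows by inverting the fit:
unknown = (1.2 - intercept) / slope
```

With a coefficient of determination this close to 1, the inversion gives a usable enantiopurity estimate; in practice one would also propagate the fit uncertainty.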

  18. Applying CBR to machine tool product configuration design oriented to customer requirements

    Science.gov (United States)

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2016-03-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented. The customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help enterprises select the key technical characteristics under the constraints of cost and time so as to serve the customer with maximal satisfaction. A new case retrieval approach that combines the self-organizing map and the fuzzy similarity priority ratio method is proposed for case-based design. The self-organizing map can reduce the retrieval range and increase retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method of similar cases based on grey correlation analysis is proposed to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. The actual example and results on an ETC series machine tool product show that the proposed method is effective, rapid, and accurate in the process of product configuration. The proposed methodology provides detailed instruction for product configuration design oriented to customer requirements.
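
    The retrieval step described above can be illustrated with a plain weighted-similarity ranking. The paper combines a self-organizing map with a fuzzy similarity priority ratio method; the following is only a simplified stand-in, with made-up case names and attribute vectors normalized to [0, 1]:

```python
def similarity(query, case, weights):
    """Weighted similarity of attribute vectors whose entries lie in [0, 1]."""
    sims = [1.0 - abs(q - c) for q, c in zip(query, case)]
    return sum(w * s for w, s in zip(weights, sims)) / sum(weights)

def retrieve(query, case_base, weights):
    """Return the stored case most similar to the query, with its score."""
    scored = [(similarity(query, c, weights), name) for name, c in case_base]
    best_score, best_name = max(scored)
    return best_name, best_score

# Hypothetical machine-tool cases: (spindle speed, power, precision), scaled to [0, 1].
cases = [("MT-A", (0.8, 0.6, 0.9)), ("MT-B", (0.3, 0.9, 0.4))]
name, score = retrieve((0.7, 0.6, 0.8), cases, weights=(0.5, 0.2, 0.3))
```

In a real CBR system the weights would come from the requirement analysis, and the retrieved case would then be adapted and evaluated rather than used directly.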

  19. 3&4D Geomodeling Applied to Mineral Resources Exploration - A New Tool for Targeting Deposits.

    Science.gov (United States)

    Royer, Jean-Jacques; Mejia, Pablo; Caumon, Guillaume; Collon-Drouaillet, Pauline

    2013-04-01

    3 & 4D geomodeling, a computer method for reconstituting the past deformation history of geological formations, has been used in oil and gas exploration for more than a decade to reconstruct fluid migration. It is now beginning to be applied to explore, with new eyes, old mature mining fields and new prospects. We briefly describe the basic notions, concepts, and methodology of 3&4D geomodeling when applied to mineral resources assessment and the modeling of ore deposits, pointing out the advantages, recommendations and limitations, together with the new challenges they raise. Several 3D GeoModels of mining explorations selected across Europe will be presented as illustrative case studies, achieved during the EU FP7 ProMine research project. They include: (i) the Cu-Au porphyry deposits in the Hellenic Belt (Greece); (ii) the VMS in the Iberian Pyrite Belt, including the Neves Corvo deposit (Portugal); and (iii) the sediment-hosted polymetallic Cu-Ag (Au, PGE) Kupferschiefer ore deposit in the Foresudetic Belt (Poland). In each case, full 3D models using surfaces and regular grids (SGrid) were built from all datasets available from exploration and exploitation, including primary geological maps, 2D seismic cross-sections, and boreholes. The level of knowledge may differ from one site to another; however, the resulting 3D models were used to pilot additional field and exploration work. In the case of the Kupferschiefer, a sequential restoration-decompaction (4D geomodeling) from the Upper Permian to the Cenozoic was conducted in the Lubin-Sieroszowice district of Poland. The results help in better understanding the various superimposed mineralization events which occurred through time in this copper deposit. A hydro-fracturing index was then calculated from the estimated overpressures during a Late Cretaceous-Early Paleocene up-lifting, and seems to correlate with the copper content distribution in the ore series. These results are in agreement with an Early Paleocene

  20. A practical guide to applying lean tools and management principles to health care improvement projects.

    Science.gov (United States)

    Simon, Ross W; Canacari, Elena G

    2012-01-01

    Manufacturing organizations have used Lean management principles for years to help eliminate waste, streamline processes, and cut costs. This pragmatic approach to structured problem solving can be applied to health care process improvement projects. Health care leaders can use a step-by-step approach to document processes and then identify problems and opportunities for improvement using a value stream process map. Leaders can help a team identify problems and root causes and consider additional problems associated with methods, materials, manpower, machinery, and the environment by using a cause-and-effect diagram. The team then can organize the problems identified into logical groups and prioritize the groups by impact and difficulty. Leaders must manage action items carefully to instill a sense of accountability in those tasked to complete the work. Finally, the team leaders must ensure that a plan is in place to hold the gains.

  1. Applying a Knowledge Management Modeling Tool for Manufacturing Vision (MV) Development

    DEFF Research Database (Denmark)

    Wang, Chengbo; Luxhøj, James T.; Johansen, John

    2004-01-01

    This paper introduces an empirical application of an experimental model for knowledge management within an organization, namely a case-based reasoning model for manufacturing vision development (CBRM). The model integrates the development process of manufacturing vision with the methodology of case-based reasoning. This paper briefly describes the model's theoretical fundamentals and its conceptual structure; conducts a detailed introduction of the critical elements within the model; exhibits a real-world application of the model; and summarizes the review of the model through academia and practice. Finds that the CBRM is supportive to the decision-making process of applying and augmenting organizational knowledge. It provides a new angle to tackle strategic management issues within the manufacturing system of a business operation. Explores a new proposition within strategic manufacturing management by enriching...

  2. Spatio-temporal analysis of rainfall trends over a maritime state (Kerala) of India during the last 100 years

    Science.gov (United States)

    Nair, Archana; Ajith Joseph, K.; Nair, K. S.

    2014-05-01

    Kerala, a maritime state of India, is bestowed with abundant rainfall, which is about three times the national average. This study is conducted to gain a better understanding of rainfall variability and trend at the regional level for this state during the last 100 years. It is found that the rainfall variation between the northern and southern regions of Kerala is large and the deviation occurs on different timescales. There is a shift in rainfall mean and variability across the seasons. Trend analysis of rainfall data over the last 100 years reveals a significant (99%) decreasing trend in most regions of Kerala, especially in the months of January, July and November. The annual and seasonal trends of rainfall in most regions of Kerala are also found to be decreasing significantly. This decreasing trend may be related to global anomalies resulting from anthropogenic greenhouse gas (GHG) emissions due to increased fossil fuel use, land-use change due to urbanisation and deforestation, and a proliferation of transportation-associated atmospheric pollutants. We have also conducted a study of the seasonality index (SI) and found that only one district in the northern region (Kasaragod) has a seasonality index of more than 1, and that the distribution of monthly rainfall in this district is mostly attributed to 1 or 2 months. In the rest of the districts, the rainfall is markedly seasonal. The trend in SI reveals that the rainfall distribution in these districts has become asymmetric with changes in rainfall distribution.
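
    A seasonality index of the kind described (commonly the Walsh and Lawler formulation, assuming that is the index the authors used) compares each month's rainfall against a uniform spread of the annual total:

```python
def seasonality_index(monthly_rain):
    """Walsh-Lawler seasonality index: 0 = perfectly uniform rainfall,
    ~1.83 = all rain falling in a single month. Values above ~1 indicate
    rainfall concentrated in one or two months."""
    if len(monthly_rain) != 12:
        raise ValueError("expected 12 monthly totals")
    annual = sum(monthly_rain)
    return sum(abs(m - annual / 12.0) for m in monthly_rain) / annual

uniform = [100.0] * 12              # evenly spread: SI = 0.0
monsoonal = [0.0] * 11 + [1200.0]   # all rain in one month: SI = 11/6
```

A district with SI just above 1, like the Kasaragod case above, sits near the "most rain in 1-2 months" end of this scale.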

  3. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
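
    The node/institution structure and per-timestep loop described above can be sketched generically. This is not the Pynsim API — just an illustration of the pattern, with hypothetical node names and a made-up supply-decay rule:

```python
class Node:
    """A network node that runs its own code each time step."""
    def __init__(self, name, supply=0.0):
        self.name, self.supply = name, supply

    def step(self, t):
        self.supply *= 0.95  # hypothetical declining source

class Institution:
    """An institution groups nodes and can act on all of them at once."""
    def __init__(self, name, members):
        self.name, self.members = name, members
        self.allocation = {}

    def step(self, t):
        total = sum(n.supply for n in self.members)
        self.allocation = {n.name: n.supply / total for n in self.members}

def run(network, institutions, timesteps):
    for t in range(timesteps):
        for node in network:        # autonomous node behaviour
            node.step(t)
        for inst in institutions:   # institutions/engines act on subsets
            inst.step(t)

wells = [Node("north", 10.0), Node("south", 30.0)]
ministry = Institution("water_ministry", wells)
run(wells, [ministry], timesteps=12)
```

In the real framework, engines would also couple in the hydrological and economic submodels, and Hydra would supply the network topology and input data rather than the hard-coded values used here.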

  4. Snooker: a structure-based pharmacophore generation tool applied to class A GPCRs.

    Science.gov (United States)

    Sanders, Marijn P A; Verhoeven, Stefan; de Graaf, Chris; Roumen, Luc; Vroling, Bas; Nabuurs, Sander B; de Vlieg, Jacob; Klomp, Jan P G

    2011-09-26

    G-protein coupled receptors (GPCRs) are important drug targets for various diseases and of major interest to pharmaceutical companies. The function of individual members of this protein family can be modulated by the binding of small molecules at the extracellular side of the structurally conserved transmembrane (TM) domain. Here, we present Snooker, a structure-based approach to generate pharmacophore hypotheses for compounds binding to this extracellular side of the TM domain. Snooker does not require knowledge of ligands, is therefore suitable for apo-proteins, and can be applied to all receptors of the GPCR protein family. The method comprises the construction of a homology model of the TM domains and the prioritization of residues by their probability of being involved in ligand binding. Subsequently, protein properties are converted to ligand space, and pharmacophore features are generated at positions where protein-ligand interactions are likely. Using this semiautomated knowledge-driven bioinformatics approach we have created pharmacophore hypotheses for 15 different GPCRs from several different subfamilies. For the beta-2-adrenergic receptor we show that ligand poses predicted by Snooker pharmacophore hypotheses reproduce literature-supported binding modes for ∼75% of compounds fulfilling pharmacophore constraints. All 15 pharmacophore hypotheses represent interactions with residues essential for ligand binding as observed in mutagenesis experiments, and compound selections based on these hypotheses are shown to be target specific. For 8 out of 15 targets, enrichment factors above 10-fold are observed in the top 0.5% ranked compounds in a virtual screen. Additionally, prospectively predicted ligand binding poses in the human dopamine D3 receptor based on Snooker pharmacophores were ranked among the best models in the community-wide GPCR Dock 2010.
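
    The enrichment factors quoted above have a standard definition in virtual screening: the hit rate among the top x% of the ranked list, divided by the hit rate expected at random over the whole library. A small sketch with synthetic labels (not the paper's screening data):

```python
def enrichment_factor(ranked_is_active, fraction):
    """EF at a given fraction of a ranked screening list.
    ranked_is_active: 1/0 activity labels ordered best score first."""
    n = len(ranked_is_active)
    n_top = max(1, int(n * fraction))
    hit_rate_top = sum(ranked_is_active[:n_top]) / n_top
    hit_rate_all = sum(ranked_is_active) / n
    return hit_rate_top / hit_rate_all

# 1000 synthetic compounds, 10 actives, 5 of them ranked in the top 0.5%:
labels = [1] * 5 + [0] * 495 + [1] * 5 + [0] * 495
ef = enrichment_factor(labels, 0.005)
```

An EF above 10 at the top 0.5%, as reported for 8 of the 15 targets, means the pharmacophore ranks actives at least ten times better than random selection would.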

  5. Effects of 100 years wastewater irrigation on resistance genes, class 1 integrons and IncP-1 plasmids in Mexican soil

    Directory of Open Access Journals (Sweden)

    Sven eJechalke

    2015-03-01

    Full Text Available Long-term irrigation with untreated wastewater can lead to an accumulation of antibiotic substances and antibiotic resistance genes in soil. However, little is known so far about the effects of wastewater, applied for decades, on the abundance of IncP-1 plasmids and class 1 integrons, which may contribute to the accumulation and spread of resistance genes in the environment, and their correlation with heavy metal concentrations. Therefore, a chronosequence of soils that were irrigated with wastewater from zero to 100 years was sampled in the Mezquital Valley in Mexico in the dry season. The total community DNA was extracted and the absolute and relative abundance (relative to 16S rRNA genes) of antibiotic resistance genes (tet(W), tet(Q), aadA), class 1 integrons (intI1), quaternary ammonium compound resistance genes (qacE+qacEΔ1) and IncP-1 plasmids (korB) were quantified by real-time PCR. Except for intI1 and qacE+qacEΔ1, the abundances of the selected genes were below the detection limit in non-irrigated soil. Confirming the results of a previous study, the absolute abundance of 16S rRNA genes in the samples increased significantly over time (linear regression model, p < 0.05), suggesting an increase in bacterial biomass due to repeated irrigation with wastewater. Correspondingly, all tested antibiotic resistance genes as well as intI1 and korB significantly increased in abundance over the period of 100 years of irrigation. In parallel, concentrations of the heavy metals Zn, Cu, Pb, Ni, and Cr significantly increased. However, no significant positive correlations were observed between the relative abundance of the selected genes and years of irrigation, indicating no enrichment in the soil bacterial community due to repeated wastewater irrigation or due to a potential co-selection by increasing concentrations of heavy metals.
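
    The two quantities contrasted above — absolute qPCR copy numbers versus abundance normalized to 16S rRNA gene copies, each tested for a trend over the irrigation chronosequence — can be sketched as follows. All values are invented for illustration:

```python
def relative_abundance(gene_copies, rrna16s_copies):
    """qPCR target-gene copies normalized to 16S rRNA gene copies."""
    return gene_copies / rrna16s_copies

def ols_slope(x, y):
    """Least-squares slope of y on x; the sign gives the trend direction."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

years = [0, 25, 50, 75, 100]           # years of wastewater irrigation
absolute = [1e3, 2e5, 9e5, 3e6, 8e6]   # hypothetical tet(W) copies per g soil
rrna = [1e8, 2e8, 4e8, 7e8, 1e9]       # hypothetical 16S copies per g soil

relative = [relative_abundance(g, r) for g, r in zip(absolute, rrna)]
abs_trend = ols_slope(years, absolute)  # positive: absolute increase over time
rel_trend = ols_slope(years, relative)
```

With numbers like these, the absolute abundance climbs steeply while the 16S-normalized abundance changes far less — the pattern the study reports, where rising gene counts track rising bacterial biomass rather than enrichment within the community.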

  6. 1,100 years after an earthquake: modification of the earthquake record by submergence, Puget Lowland, Washington State

    Science.gov (United States)

    Arcos, M. E.

    2011-12-01

    Crustal faults may present a complicated story for earthquake reconstruction. In some cases, regional tectonic strain overprints the record of coseismic land-level changes. This study looks at the record of earthquakes at two sites in the Puget Lowland, Gorst and the Skokomish delta, and at how post-earthquake submergence modified the paleoseismic records. The Puget Lowland is the slowly subsiding forearc basin of the northern Cascadia subduction zone. A series of active thrust faults crosses this lowland. Several of these faults generated large (M7+) earthquakes about 1,100 years ago, and both field sites have submerged at least 1.5 m since that time. This submergence masked the geomorphic record of uplift in some areas, resulting in a misreading of the zone of earthquake deformation and potential misinterpretation of the underlying fault structure. Earthquakes ~1,100 years ago uplifted both field localities and altered river dynamics. At Gorst, a tsunami and debris flow accompanied uplift of at least 3 m by the Seattle fault. The increased sediment load resulted in braided stream formation for a period after the earthquake. At the Skokomish delta, differential uplift trapped the river on the eastern side of the delta for the last 1,100 years, resulting in an asymmetric intertidal zone, 2 km wider on one side of the delta than the other. The delta slope or submergence may contribute to high rates of flooding on the Skokomish River. Preliminary results show the millennial-scale rates of submergence vary, with the southern Puget Lowland submerging at a faster rate than the northern Puget Lowland. This submergence complicates the reconstruction of past earthquakes and renders assessment of future hazards difficult for those areas that are based on uplifted marine platforms and other coastal earthquake signatures, in several ways. 1) Post-earthquake submergence reduces the apparent uplift of marine terraces. 2) Submergence makes zones of earthquake deformation appear narrower. 3)

  7. Are Geodetically and Geologically Constrained Vertical Deformation Models Compatible With the 100-Year Coastal Tide Gauge Record in California?

    Science.gov (United States)

    Smith-Konter, B. R.; Sandwell, D. T.

    2006-12-01

    Sea level change has been continuously recorded along the California coastline at several tide gauge stations for the past 50-100 years. These stations provide a temporal record of sea level change, generally attributed to post-glacial rebound and ocean climate phenomena. However, geological processes, including displacements from large earthquakes, have also been shown to produce sea level variations. Furthermore, the vertical tectonic response to interseismic strain accumulation in regions of major fault bends has been shown to produce uplift and subsidence rates consistent with sea level trends. To investigate the long-term extent and implication of tectonic deformation on sea level change, we compare time series data from California tide gauge stations to model estimates of vertical displacements produced by earthquake cycle deformation. Using a 3-D semi-analytic viscoelastic model, we combine geologic slip rates, geodetic velocities, and historical seismic data to simulate both horizontal and vertical deformation of the San Andreas Fault System. Using this model, we generate a time-series of vertical displacements spanning the 100-year sea level record and compare this to tide gauge data provided by the Permanent Service for Mean Sea Level (PSMSL). Comparison between sea level data and a variety of geologically and geodetically constrained models confirms that the two are highly compatible. Vertical displacements are largely controlled by interseismic strain accumulation; however, displacements from major earthquakes are also required to explain varying trends in the sea level data. Models based on elastic plate thicknesses of 30-50 km and viscosities of 7×10^18-2×10^19 Pa·s produce vertical displacements at tide-gauge locations that explain long-term trends in the sea level record to a high degree of accuracy at nearly all stations. However, unmodeled phenomena are also present in the sea level data and require further inspection.

  8. Simulated carbon and water processes of forest ecosystems in Forsmark and Oskarshamn during a 100-year period

    International Nuclear Information System (INIS)

    The Swedish Nuclear Fuel and Waste Management Co (SKB) is currently investigating the Forsmark and Oskarshamn areas for possible localisation of a repository for spent nuclear fuel. Important components of the investigations are characterizations of the land surface ecosystems in the areas with respect to hydrological and biological processes, and their implications for the fate of radionuclide contaminants entering the biosphere from a shallow groundwater contamination. In this study, we simulate water balance and carbon turnover processes in forest ecosystems representative for the Forsmark and Oskarshamn areas for a 100-year period using the ecosystem process model CoupModel. The CoupModel describes the fluxes of water and matter in a one-dimensional soil-vegetation-atmosphere system, forced by time series of meteorological variables. The model has previously been parameterized for many of the vegetation systems that can be found in the Forsmark and Oskarshamn areas: spruce/pine forests, willow, grassland and different agricultural crops. This report presents a platform for further use of models like CoupModel for investigations of radionuclide turnover in the Forsmark and Oskarshamn area based on SKB data, including a data set of meteorological forcing variables for Forsmark 1970-2004, suitable for simulations of a 100-year period representing the present day climate, a hydrological parameterization of the CoupModel for simulations of the forest ecosystems in the Forsmark and Oskarshamn areas, and simulated carbon budgets and process descriptions for Forsmark that correspond to a possible steady state of the soil storage of the forest ecosystem

  9. Simulated carbon and water processes of forest ecosystems in Forsmark and Oskarshamn during a 100-year period

    Energy Technology Data Exchange (ETDEWEB)

    Gustafsson, David; Jansson, Per-Erik [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Land and Water Resources Engineering; Gaerdenaes, Annemieke [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Soil Sciences; Eckersten, Henrik [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Crop Production Ecology

    2006-12-15

    The Swedish Nuclear Fuel and Waste Management Co (SKB) is currently investigating the Forsmark and Oskarshamn areas for possible localisation of a repository for spent nuclear fuel. Important components of the investigations are characterizations of the land surface ecosystems in the areas with respect to hydrological and biological processes, and their implications for the fate of radionuclide contaminants entering the biosphere from a shallow groundwater contamination. In this study, we simulate water balance and carbon turnover processes in forest ecosystems representative for the Forsmark and Oskarshamn areas for a 100-year period using the ecosystem process model CoupModel. The CoupModel describes the fluxes of water and matter in a one-dimensional soil-vegetation-atmosphere system, forced by time series of meteorological variables. The model has previously been parameterized for many of the vegetation systems that can be found in the Forsmark and Oskarshamn areas: spruce/pine forests, willow, grassland and different agricultural crops. This report presents a platform for further use of models like CoupModel for investigations of radionuclide turnover in the Forsmark and Oskarshamn area based on SKB data, including a data set of meteorological forcing variables for Forsmark 1970-2004, suitable for simulations of a 100-year period representing the present day climate, a hydrological parameterization of the CoupModel for simulations of the forest ecosystems in the Forsmark and Oskarshamn areas, and simulated carbon budgets and process descriptions for Forsmark that correspond to a possible steady state of the soil storage of the forest ecosystem.

  10. A centennial to celebrate : energy and minerals science and technology 100 years of excellence : improving the quality of life of Canadians through natural resources

    Energy Technology Data Exchange (ETDEWEB)

    Udd, J.; Reeve, D.

    2007-07-01

    The year 2007 marked the 100th anniversary of Natural Resources Canada's (NRCan) contribution to science and technology excellence in energy and minerals. This publication discussed the 100 years of excellence of the energy and minerals science and technology sector. It discussed the history of Natural Resources Canada, with reference to the early years; first fuel testing efforts; first World War; the 1920s and 1930s; second World War; post-war years; the 1970s and 1980s; and the 1990s to the present. The publication discussed the creation of the Canada Centre for Mineral and Energy Technology (CANMET) as well as some current NRCan science and technology activities, such as alternative energy programs; energy efficiency for buildings, industries and communities; clean coal; oil sands tailings and water management; community energy systems; renewable energy efficient technology projects (RET) such as RETscreen; hybrid scoop; the anti-vibration rock drill handle; mine waste management; and green mines-green energy. Other NRCan science and technology programs that were presented in the publication included materials technology laboratory relocation; corrosion management tools for the oil and gas pipeline industry; lightweight magnesium engine cradle; mine environment neutral drainage program; metallurgical processing; counter-terrorism; and clean energy. figs.

  11. Aboveground and belowground legacies of native Sami land use on boreal forest in northern Sweden 100 years after abandonment.

    Science.gov (United States)

    Freschet, Grégoire T; Ostlund, Lars; Kichenin, Emilie; Wardle, David A

    2014-04-01

    Human activities that involve land-use change often cause major transformations to community and ecosystem properties both aboveground and belowground, and when land use is abandoned, these modifications can persist for extended periods. However, the mechanisms responsible for rapid recovery vs. long-term maintenance of ecosystem changes following abandonment remain poorly understood. Here, we examined the long-term ecological effects of two remote former settlements, regularly visited for ~300 years by reindeer-herding Sami and abandoned ~100 years ago, within an old-growth boreal forest that is considered one of the most pristine regions in northern Scandinavia. These human legacies were assessed through measurements of abiotic and biotic soil properties and vegetation characteristics at the settlement sites and at varying distances from them. Low-intensity land use by Sami is characterized by the transfer of organic matter towards the settlements by humans and reindeer herds, compaction of soil through trampling, disappearance of understory vegetation, and selective cutting of pine trees for fuel and construction. As a consequence, we found a shift towards early successional plant species and a threefold increase in soil microbial activity and nutrient availability close to the settlements relative to away from them. These changes in soil fertility and vegetation contributed to 83% greater total vegetation productivity, 35% greater plant biomass, and 23% and 16% greater concentrations of foliar N and P nearer the settlements, leading to a greater quantity and quality of litter inputs. Because decomposer activity was also 40% greater towards the settlements, soil organic matter cycling and nutrient availability were further increased, leading to likely positive feedbacks between the aboveground and belowground components resulting from historic land use. Although not all of the activities typical of Sami have left visible residual traces on the ecosystem after

  12. Regime Shifts in Shallow Lakes: Responses of Cyanobacterial Blooms to Watershed Agricultural Phosphorus Loading Over the Last ~100 Years.

    Science.gov (United States)

    Vermaire, J. C.; Taranu, Z. E.; MacDonald, G. K.; Velghe, K.; Bennett, E.; Gregory-Eaves, I.

    2015-12-01

    Rapid changes in ecosystem states have occurred naturally throughout Earth's history. However, environmental changes that have taken place since the start of the Anthropocene may be destabilizing ecosystems and increasing the frequency of regime shifts in response to abrupt changes in external drivers or local intrinsic dynamics. To evaluate the relative influence of these forcers and improve our understanding of the impact of future change, we examined the effects of historical catchment phosphorus loading associated with agricultural land use on lake ecosystems, and whether this caused a shift from a stable, clear-water, regime to a turbid, cyanobacteria-dominated, state. The sedimentary pigments, diatom, and zooplankton (Cladocera) records from a currently clear-water shallow lake (Roxton Pond) and a turbid-water shallow lake (Petit lac Saint-François; PSF) were examined to determine if a cyanobacteria associated pigment (i.e. echinenone) showed an abrupt non-linear response to continued historical phosphorus load index (determined by phosphorus budget) over the last ~100 years. While PSF lake is presently in the turbid-water state, pigment and diatom analyses indicated that both lakes were once in the clear-water state, and that non-linear increases in catchment phosphorus balance resulted in an abrupt transition to cyanobacteria dominated states in each record. These results show that phosphorus loading has resulted in state shifts in shallow lake ecosystems that has been recorded across multiple paleolimnological indicators preserved in the sedimentary record.

  13. Participatory tools working with crops, varieties and seeds. A guide for professionals applying participatory approaches in agrobiodiversity management, crop improvement and seed sector development

    NARCIS (Netherlands)

    Boef, de W.S.; Thijssen, M.H.

    2007-01-01

    Outline to the guide Within our training programmes on local management of agrobiodiversity, participatory crop improvement and the support of local seed supply participatory tools get ample attention. Tools are dealt with theoretically, are practised in class situations, but are also applied in fie

  14. 100-Year Floodplains, NC Floodplain Mapping Program data, Published in 2007, 1:12000 (1in=1000ft) scale, Iredell County GIS.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:12000 (1in=1000ft) scale, was produced all or in part from LIDAR information as of 2007. It is described as 'NC...

  15. 100-Year Floodplains, FEMA DFIRM preliminary map out now, to be published in 2009, Published in 2009, 1:12000 (1in=1000ft) scale, Brown County, WI.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:12000 (1in=1000ft) scale, was produced all or in part from Other information as of 2009. It is described as 'FEMA...

  16. 100-Year Floodplains, St James FEMA Flood Map, Published in 2010, 1:24000 (1in=2000ft) scale, St James Parish Government.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Hardcopy Maps information as of 2010. It is described...

  17. 100-Year Floodplains, FloodZone; FEMA; Update Frequency is every five or ten years, Published in 2008, Athens-Clarke County Planning Department.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, was produced all or in part from Field Survey/GPS information as of 2008. It is described as 'FloodZone; FEMA; Update Frequency...

  18. 100-Year Floodplains, FEMA Floodway and Flood Boundary Maps, Published in 2005, 1:24000 (1in=2000ft) scale, Lafayette County Land Records.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Other information as of 2005. It is described as 'FEMA...

  19. 100-Year Floodplains, Flood plains from FEMA, Published in 2003, 1:600 (1in=50ft) scale, Town of Cary NC.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:600 (1in=50ft) scale, was produced all or in part from LIDAR information as of 2003. It is described as 'Flood...

  20. 100-Year Floodplains, Data provided by FEMA and WI DNR, Published in 2009, 1:2400 (1in=200ft) scale, Dane County Land Information Office.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:2400 (1in=200ft) scale as of 2009. It is described as 'Data provided by FEMA and WI DNR'. Data by this publisher...

  1. The protection of Canfranc International Railway Station against natural risks: analysis and evaluation of its effectiveness 100 years later.

    Science.gov (United States)

    Fabregas, S.; Hurtado, R.; Mintegui, J.

    2012-04-01

    In the late 19th and early 20th centuries, the international railway station of Canfranc, "Los Arañones", was built in the Central Pyrenees of Huesca, Spain, on the border between France and Spain. Soon after construction of the huge station (250 m long) began, it was found that natural hazards such as flash floods, landslides, falling blocks and avalanches affected the site and compromised the safety of users and infrastructure. Hydrological restoration works were quickly carried out in the "Los Arañones" gorge basins to reduce the residual risks. Longitudinal and transversal dams were built against floods, and a large reforestation effort protected against falling blocks, erosion and flooding; against avalanches, stone walls were built, as well as benches of grit, snow rakes, and "empty dams", experimental structures created to dissipate the energy of the avalanche in the track zone, which do not exist anywhere else in the world. All the works were carried out mainly by hand, with materials such as stone, cement and iron. Over 2,500,000 holes were dug for planting more than 15 different species of trees, and more than 400,000 tons of stone were moved to build more than 12 different kinds of control measures. It is essential to emphasize the empirical nature of these works and Canfranc's function as a laboratory or field-test site, with most of its structures still effective 100 years after their construction. The works accounted for about 30% of the total cost of the station in the early 20th century; achieving equivalent protection with current technology would require an investment of around 100 million euro today. It is also necessary to validate the current effectiveness of these works, their maintenance, and the protective role of the forest.

  2. Acidophilic denitrifiers dominate the N2O production in a 100-year-old tea orchard soil.

    Science.gov (United States)

    Huang, Ying; Long, Xi-En; Chapman, Stephen J; Yao, Huaiying

    2015-03-01

    Aerobic denitrification is the main process behind the high N2O production in acid tea field soil. However, the biological mechanisms of the high emission are not fully understood. In this study, we examined N2O emission and denitrifier communities in 100-year-old tea soils at four pH levels (3.71, 5.11, 6.19, and 7.41) and four nitrate addition levels (0, 50, 200, and 1000 mg kg(-1) of NO3(-)-N). Results showed the highest N2O emission (10.1 mg kg(-1) over 21 days) from the soil at pH 3.71 with 1000 mg kg(-1) NO3(-) addition. The N2O reduction and denitrification enzyme activity in the acid soils differed markedly from those in the soil at pH 7.41. Moreover, TRF 78 of nirS and TRF 187 of nosZ dominated in soils of pH 3.71, suggesting an important role of acidophilic denitrifiers in N2O production and reduction. CCA analysis also showed a negative correlation between the dominant denitrifier ecotypes (nirS TRF 78, nosZ TRF 187) and soil pH. Phylogenetic tree analysis showed the representative sequences to be identical to those of denitrifiers cultivated from acidic soils. Our results show that the adaptation of acidophilic denitrifiers to the acid environment results in high N2O emission in this highly acidic tea soil. PMID:25273518

  3. The story of the Hawaiian Volcano Observatory -- A remarkable first 100 years of tracking eruptions and earthquakes

    Science.gov (United States)

    Babb, Janet L.; Kauahikaua, James P.; Tilling, Robert I.

    2011-01-01

    The year 2012 marks the centennial of the Hawaiian Volcano Observatory (HVO). With the support and cooperation of visionaries, financiers, scientists, and other individuals and organizations, HVO has successfully achieved 100 years of continuous monitoring of Hawaiian volcanoes. As we celebrate this milestone anniversary, we express our sincere mahalo—thanks—to the people who have contributed to and participated in HVO’s mission during this past century. First and foremost, we owe a debt of gratitude to the late Thomas A. Jaggar, Jr., the geologist whose vision and efforts led to the founding of HVO. We also acknowledge the pioneering contributions of the late Frank A. Perret, who began the continuous monitoring of Kīlauea in 1911, setting the stage for Jaggar, who took over the work in 1912. Initial support for HVO was provided by the Massachusetts Institute of Technology (MIT) and the Carnegie Geophysical Laboratory, which financed the initial cache of volcano monitoring instruments and Perret’s work in 1911. The Hawaiian Volcano Research Association, a group of Honolulu businessmen organized by Lorrin A. Thurston, also provided essential funding for HVO’s daily operations starting in mid-1912 and continuing for several decades. Since HVO’s beginning, the University of Hawaiʻi (UH), called the College of Hawaii until 1920, has been an advocate of HVO’s scientific studies. We have benefited from collaborations with UH scientists at both the Hilo and Mänoa campuses and look forward to future cooperative efforts to better understand how Hawaiian volcanoes work. The U.S. Geological Survey (USGS) has operated HVO continuously since 1947. Before then, HVO was under the administration of various Federal agencies—the U.S. Weather Bureau, at the time part of the Department of Agriculture, from 1919 to 1924; the USGS, which first managed HVO from 1924 to 1935; and the National Park Service from 1935 to 1947. For 76 of its first 100 years, HVO has been

  4. Applying decision trial and evaluation laboratory as a decision tool for effective safety management system in aviation transport

    Directory of Open Access Journals (Sweden)

    Ifeanyichukwu Ebubechukwu Onyegiri

    2016-10-01

    In recent years, weak engineering controls and lapses associated with safety management systems (SMSs) have been responsible for seemingly unprecedented disasters in the aviation industry. A previous study confirmed the difficulties experienced by safety managers with SMSs and the need to direct research to this area for more insight into, and progress in, the evaluation and maintenance of SMSs in the aviation industry. The purpose of this work is to examine the application of the Decision Making Trial and Evaluation Laboratory (DEMATEL) method to the aviation industry in developing countries, illustrated with Nigerian aviation survey data to validate the method. The advantage of the procedure over other decision-making methods is its ability to incorporate feedback in decision making. It also affords the opportunity to break down the complex, multivariate components and elements of an aviation SMS by analysing the contributions of the diverse system criteria from the perspective of cause and effect, which in turn yields easier and more effective pre-corrective actions against aviation transportation accidents. In this work, six revised components of an SMS were identified and DEMATEL was applied to obtain their direct and indirect impacts and influences on overall SMS performance. Data were collected by survey questionnaire, which served as the initial direct-relation matrix, coded in Matlab software to establish the impact-relation map (IRM). The IRM was then plotted in MS Excel spreadsheet software. From our results, safety structure and regulation has the highest impact level on an SMS, with a corresponding positive relation level value. In conclusion, the results agree with those of previous researchers who used grey relational analysis. Thus, DEMATEL serves as a great tool and resource for the safety manager.
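
    The DEMATEL computation behind such an impact-relation map can be sketched compactly: normalize the direct-relation matrix, sum direct plus indirect influences via a matrix inverse, then read off each factor's prominence and net cause/effect role. The 4-factor matrix below is a hypothetical example, not the paper's Nigerian survey data:

```python
import numpy as np

# Hypothetical direct-influence matrix among four SMS components
# (expert scores 0-4; values are illustrative only).
A = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 2],
    [2, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

# Step 1: normalize by the largest row/column sum.
s = max(A.sum(axis=1).max(), A.sum(axis=0).max())
N = A / s

# Step 2: total-relation matrix T = N (I - N)^-1, which sums the
# direct influence N plus all indirect feedback paths N^2 + N^3 + ...
I = np.eye(A.shape[0])
T = N @ np.linalg.inv(I - N)

# Step 3: dispatched (R) and received (C) influence per factor.
R = T.sum(axis=1)
C = T.sum(axis=0)
prominence = R + C   # how central the factor is overall
relation = R - C     # positive => net cause, negative => net effect
```

    Factors with the highest prominence and a positive relation value (as "safety structure and regulation" in the study) are the natural targets for pre-corrective action.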

  5. Applying the “WSUD potential”-tool in the framework of the Copenhagen Climate Adaptation and Cloudburst Management Plans

    DEFF Research Database (Denmark)

    Lerer, Sara Maria; Madsen, Herle Mo; Smit Andersen, Jonas;

    2016-01-01

    Water Sensitive Urban Design (WSUD) is still in the "Opportunity" phase of its stabilization process in Copenhagen, Denmark, indicating that there are controversies surrounding its proper use and that the regulatory framework is not completely adapted to the new technology. In 2015, private land owners in Denmark could get up to 100% of the construction costs of climate adaptation measures funded by the utility companies, which resulted in a race to apply for this co-funding plan. In this study we briefly review the climate adaptation framework in Copenhagen, and then discuss how well different scenarios of WSUD in a case study area interact with this framework. The impacts of the different scenarios are assessed using the "WSUD potential" tool, which builds upon the Three Points Approach. The results indicate that there is a schism between the city's Cloudburst Management Plan on one side and its Climate Adaptation Plan and general service goal on the other, which may result in over-sizing of the collective stormwater management system.

  7. Floodplain sediment from a 100-year-recurrence flood in 2005 of the Ping River in northern Thailand

    Directory of Open Access Journals (Sweden)

    S. H. Wood

    2008-07-01

    The tropical storm, floodwater, and floodplain-sediment layer of a 100-year recurrence flood are examined to better understand the characteristics of large monsoon floods on medium-sized rivers in northern Thailand. Storms producing large floods in northern Thailand occur early or late in the summer rainy season (May–October). These storms are associated with tropical depressions evolving from typhoons in the South China Sea that travel westward across the Indochina Peninsula. In late September 2005, the tropical depression from Typhoon Damrey swept across northern Thailand, delivering 100–200 mm/day at stations in mountainous areas. Peak flow from the 6355-km2 drainage area of the Ping River upstream of the city of Chiang Mai was 867 m3s−1 (river-gage height of 4.93 m), and flow greater than 600 m3s−1 lasted for 2.5 days. Parts of the city of Chiang Mai and some parts of the floodplain in the intermontane Chiang Mai basin were flooded up to 1 km from the main channel. Suspended-sediment concentrations in the floodwater were measured and estimated to be 1000–1300 mg l−1.

    The mass of dry sediment (32.4 kg m−2) measured over a 0.32-km2 area of the floodplain is relatively high compared to reports from European and North American river floods. Average wet sediment thickness over the area was 3.3 cm. Sediment thicker than 8 cm covered 16 per cent of the area, and sediment thicker than 4 cm covered 44 per cent. The high suspended-sediment concentration in the floodwater, flow onto the floodplain through a gap in the levee afforded by the mouth of a tributary stream as well as flow over the levees, and floodwater depths of 1.2 m explain the relatively large amount of sediment in the measured area.

    Grain-size analyses and examination of the flood layer showed about 15-cm thickness of massive fine-sandy silt on the levee within 15

  8. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    CERN Document Server

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction.Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  9. Portable hyperspectral device as a valuable tool for the detection of protective agents applied on historical buildings

    Science.gov (United States)

    Vettori, S.; Pecchioni, E.; Camaiti, M.; Garfagnoli, F.; Benvenuti, M.; Costagliola, P.; Moretti, S.

    2012-04-01

    In the recent past, a wide range of protective products (in most cases, synthetic polymers) have been applied to the surfaces of ancient buildings and artefacts to preserve them from alteration [1]. The lack of a detailed mapping of the permanence and efficacy of these treatments, in particular when applied on large surfaces such as building facades, may be particularly problematic when new restoration treatments are needed and the best restoration protocol has to be chosen. The presence of protective compounds on stone surfaces may be detected in the laboratory by relatively simple diagnostic tests, which, however, normally require invasive (or micro-invasive) sampling methodologies and are time-consuming, limiting their use to a restricted number of samples and sampling sites. In contrast, hyperspectral sensors are rapid, non-invasive and non-destructive tools capable of analyzing different materials on the basis of their different patterns of absorption at specific wavelengths, and are thus particularly suitable for the field of cultural heritage [2,3]. In addition, they can be successfully used to discriminate between inorganic (i.e. rocks and minerals) and organic compounds, as well as to acquire many spectra and compositional maps in short times at relatively low cost. In this study we analyzed a number of stone samples (Carrara Marble and the biogenic calcarenites "Lecce Stone" and "Maastricht Stone") after treatment of their surfaces with synthetic polymers (synthetic wax, acrylic, perfluorinated and silicon-based polymers) in common use in conservation-restoration practice. The hyperspectral device used for this purpose was the ASD FieldSpec FR Pro spectroradiometer, a portable, high-resolution instrument designed to acquire Visible and Near-Infrared (VNIR: 350-1000 nm) and Short-Wave Infrared (SWIR: 1000-2500 nm) punctual reflectance spectra with a rapid data collection time (about 0.1 s for each spectrum).
The reflectance spectra so far obtained in

  10. Dr Margaretha Brongersma-Sanders (1905-1996), Dutch scientist: an annotated bibliography of her work to celebrate 100 years since her birth

    NARCIS (Netherlands)

    Turner, S.; Cadée, G.C.

    2006-01-01

    Dr Margaretha Brongersma-Sanders, palaeontologist, pioneer geochemist, geobiologist and oceanographer, Officer of the Order of Oranje Nassau, was born 100 years ago (February 20th, 1905) in Kampen in The Netherlands. The fields of research that she covered during her lifetime include taxonomy of recent and fossil, principally freshwater, fish.

  11. Applying Total Quality Management Tools Using QFD at Higher Education Institutions in Gulf Area (Case Study: ALHOSN University)

    Directory of Open Access Journals (Sweden)

    Adnan Al-Bashir

    2016-07-01

    The quality of human capital plays the key role in the growth and development of societies, and it can be enriched by the high-quality education provided by higher education institutions. Higher education institutions are thereby an important sector of any society, since they help define the overall quality of human lives. This research investigates the application of Total Quality Management (TQM) tools at higher education institutions, specifically at ALHOSN University. In this study five tools were implemented at ALHOSN University's college of engineering: Quality Function Deployment, Affinity Diagrams, Tree Diagrams, Pareto Charts, and Fishbone Diagrams. The research reveals that the implementation of TQM tools is of great benefit to higher education institutions: they uncovered many areas of potential improvement as well as the main causes of some of the problems the Faculty of Engineering is facing. It also shows that implementing TQM tools in higher education institution systems enhances the performance of such institutions.
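
    Of the five tools, the Pareto chart is the most directly computational: issue categories are ranked by frequency and the "vital few" accounting for roughly 80% of occurrences are selected for attention first. A minimal sketch with hypothetical survey counts (the category names and numbers are invented for illustration, not taken from the study):

```python
from collections import Counter

# Hypothetical counts of issues raised in a faculty quality survey.
issues = Counter({
    "course scheduling": 48,
    "lab equipment": 31,
    "advising": 12,
    "classroom space": 6,
    "parking": 3,
})

total = sum(issues.values())
cumulative = 0
vital_few = []
for category, count in issues.most_common():   # descending frequency
    cumulative += count
    vital_few.append(category)
    if cumulative / total >= 0.8:              # 80/20 cut-off reached
        break
# vital_few now holds the few categories that dominate the complaints.
```

    The same ranked counts feed directly into the bar-plus-cumulative-line Pareto chart the study describes.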

  12. Applying a statewide geospatial leaching tool for assessing soil vulnerability ratings for agrochemicals across the contiguous United States.

    Science.gov (United States)

    Ki, Seo Jin; Ray, Chittaranjan; Hantush, Mohamed M

    2015-06-15

    A large-scale leaching assessment tool not only illustrates soil (or groundwater) vulnerability in unmonitored areas, but can also identify areas of potential concern for agrochemical contamination. This study describes how the statewide leaching tool in Hawaii, recently modified for use with pesticides and volatile organic compounds, can be extended to a national assessment of soil vulnerability ratings. For this study, the tool was updated by extending the soil and recharge maps to cover the lower 48 states of the United States (US). In addition, digital maps of annual pesticide use (at a national scale) as well as detailed soil properties and monthly recharge rates (at high spatial and temporal resolutions) were used to examine variations in the leaching (loads) of pesticides for the upper soil horizons. Results showed that the extended tool successfully delineated areas of high to low vulnerability to selected pesticides. The leaching potential was high for picloram, medium for simazine, and low to negligible for 2,4-D and glyphosate. The mass loadings of picloram moving below 0.5 m depth increased greatly in the northwestern and central US, which record its extensive use on agricultural crops. However, in addition to the amount of pesticide used, the annual leaching load of atrazine was also affected by other factors that determine intrinsic aquifer vulnerability, such as soil and recharge properties. The spatial and temporal resolutions of the digital maps had a great effect on the estimated leaching potential of pesticides, requiring a trade-off between data availability and accuracy. Potential applications of this tool include rapid, large-scale vulnerability assessments for emerging contaminants that are hard to quantify directly through vadose zone models due to a lack of full environmental data.
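
    Screening tools of this kind typically rank pesticides with an attenuation-factor style index: sorption retards travel through the soil, and decay during travel reduces the mass reaching a given depth. The sketch below is a hedged illustration of that general approach, not the Hawaii tool's actual implementation, and all parameter values (bulk density, organic carbon, Koc, half-lives, recharge) are illustrative assumptions:

```python
import math

def retardation_factor(bulk_density, foc, koc, theta_fc):
    """RF = 1 + rho_b * Kd / theta_FC, with Kd = Koc * foc.
    Units: g/cm3, fraction organic carbon, cm3/g, volumetric water content."""
    return 1.0 + bulk_density * (koc * foc) / theta_fc

def attenuation_factor(depth_m, recharge_m_day, theta_fc, half_life_days, rf):
    """Fraction of applied mass predicted to move past depth_m:
    AF = exp(-0.693 * travel_time / half_life)."""
    travel_time_days = depth_m * rf * theta_fc / recharge_m_day
    return math.exp(-0.693 * travel_time_days / half_life_days)

# Illustrative contrast echoing the abstract's ranking: weakly sorbed,
# persistent picloram vs strongly sorbed, short-lived glyphosate,
# evaluated at the 0.5 m depth used in the study.
rf_picloram = retardation_factor(1.4, 0.01, 20.0, 0.30)
af_picloram = attenuation_factor(0.5, 0.002, 0.30, 90.0, rf_picloram)

rf_glyphosate = retardation_factor(1.4, 0.01, 24000.0, 0.30)
af_glyphosate = attenuation_factor(0.5, 0.002, 0.30, 47.0, rf_glyphosate)
# af_picloram >> af_glyphosate: picloram is far more likely to leach.
```

    Mapping such an index cell-by-cell over gridded soil and recharge data yields exactly the kind of high-to-low vulnerability delineation the abstract describes.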

  13. Experiences and Results of Applying Tools for Assessing the Quality of a mHealth App Named Heartkeeper.

    Science.gov (United States)

    Martínez-Pérez, Borja; de la Torre-Díez, Isabel; López-Coronado, Miguel

    2015-11-01

    Currently, many incomplete mobile apps can be found in the commercial stores: apps with bugs or low quality that need serious improvement. The aim of this paper is to use two different tools to assess the quality of Heartkeeper, a mHealth app for the self-management of heart diseases by the patients themselves. The first tool measures compliance with the Android guidelines given by Google, and the second measures the users' Quality of Experience (QoE). The results indicated that Heartkeeper follows the Android guidelines in many cases, especially in its structure, and offers a satisfactory QoE for its users, with special mention of aspects such as the learning curve, availability and appearance. As a result, Heartkeeper has proved to be a satisfactory app from the point of view of both Google and the users. We conclude that tools of this type, which measure the quality of an app, can be very useful for developers in finding aspects that need improvement before releasing their apps. By doing this, the number of low-quality applications released will decrease dramatically, so these techniques are strongly recommended for all app developers. PMID:26345452

  14. Dr Margaretha Brongersma-Sanders (1905-1996), Dutch scientist: an annotated bibliography of her work to celebrate 100 years since her birth

    OpenAIRE

    Turner, S.; Cadée, G.C.

    2006-01-01

    Dr Margaretha Brongersma-Sanders, palaeontologist, pioneer geochemist, geobiologist and oceanographer, Officer of the Order of Oranje Nassau was born 100 years ago (February 20th, 1905) in Kampen in The Netherlands. The fields of research that she covered during her lifetime include taxonomy of recent and fossil, principally freshwater fish; “fish kills” and mass mortality in the sea (especially of fish); taphonomy and preservation of fish; upwelling; anoxic conditions, linked to fish mortali...

  15. [Changes in the therapy of pulpal diseases and periapical lesions according to the articles published in the journal Fogorvosi Szemle during the past 100 years (1908-2008)].

    Science.gov (United States)

    Nemes, Júlia; Duhaj, Szilvia; Nyárasdy, Ida

    2008-08-01

    This review attempts to summarize the Hungarian endodontic literature on pulpal and periapical diseases published during the past 100 years. The experimental examinations and clinical studies make it possible to follow the changes in the methods and medicaments used in pulpal treatment. The overview provides information about the problems of disinfection, shaping, measuring, and obturation of the root canal. PMID:19055128

  16. A Simulation Tool for Steady State Thermal Performance Applied to the SPL Double-Walled Tube RF Power Coupler

    CERN Document Server

    Bonomi, R

    2014-01-01

    This note reports on the study carried out to design a tool for steady-state thermal performance of the RF power coupler inside the SPL cryostat. To reduce the amount of heat penetrating into the helium bath where the cavity is placed, the main coupler is actively cooled by means of an adequate flow rate of helium gas. The knowledge of the temperature profiles and the overall thermal performance of the power coupler are fundamental for the estimation of the total heat load budget of the cryostat.

  17. Applied acoustics concepts, absorbers, and silencers for acoustical comfort and noise control alternative solutions, innovative tools, practical examples

    CERN Document Server

    Fuchs, Helmut V

    2013-01-01

    The author gives a comprehensive overview of materials and components for noise control and acoustical comfort. Sound absorbers must meet acoustical and architectural requirements, which fibrous or porous material alone can meet. Basics and applications are demonstrated, with representative examples for spatial acoustics, free-field test facilities and canal linings. Acoustic engineers and construction professionals will find some new basic concepts and tools for developments in order to improve acoustical comfort. Interference absorbers, active resonators and micro-perforated absorbers of different materials and designs complete the list of applications.

  18. Applying value engineering and modern assessment tools in managing NEPA: Improving effectiveness of the NEPA scoping and planning process

    Energy Technology Data Exchange (ETDEWEB)

    ECCLESTON, C.H.

    1998-09-03

    While the National Environmental Policy Act (NEPA) implementing regulations focus on describing "what" must be done, they provide surprisingly little direction on "how" such requirements are to be implemented. Specific implementation of these requirements has largely been left to the discretion of individual agencies. More than a quarter of a century after NEPA's enactment, few rigorous tools, techniques, or methodologies have been developed or widely adopted for implementing the regulatory requirements. In preparing an Environmental Impact Statement, agencies are required to conduct a public scoping process to determine the range of actions, alternatives, and impacts that will be investigated. Determining the proper scope of analysis is an essential element in the successful planning and implementation of future agency actions. Lack of rigorous tools and methodologies can lead to project delays, cost escalation, and increased risk that the scoping process may not adequately capture the scope of decisions that eventually might need to be considered. Recently, selected Value Engineering (VE) techniques were successfully used in managing a pre-scoping effort. A new strategy is advanced for conducting a pre-scoping/scoping effort that combines NEPA with VE. Consisting of five distinct phases, this approach has potentially widespread implications for the way NEPA, and scoping in particular, is practiced.

  19. Poor reliability between Cochrane reviewers and blinded external reviewers when applying the Cochrane risk of bias tool in physical therapy trials.

    Directory of Open Access Journals (Sweden)

    Susan Armijo-Olivo

    Full Text Available OBJECTIVES: To test the inter-rater reliability of the RoB tool applied to Physical Therapy (PT) trials by comparing ratings from Cochrane review authors with those of blinded external reviewers. METHODS: Randomized controlled trials (RCTs) in PT were identified by searching the Cochrane Database of Systematic Reviews for meta-analyses of PT interventions. RoB assessments were conducted independently by 2 reviewers blinded to the RoB ratings reported in the Cochrane reviews. Data on RoB assessments from Cochrane reviews and other characteristics of reviews and trials were extracted. Consensus assessments between the two reviewers were then compared with the RoB ratings from the Cochrane reviews. Agreement between Cochrane and blinded external reviewers was assessed using weighted kappa (κ). RESULTS: In total, 109 trials included in 17 Cochrane reviews were assessed. Inter-rater reliability on the overall RoB assessment between Cochrane review authors and blinded external reviewers was poor (κ = 0.02, 95% CI: -0.06, 0.06). Inter-rater reliability on individual domains of the RoB tool was poor (median κ = 0.19, ranging from κ = -0.04 ("Other bias") to κ = 0.62 ("Sequence generation")). There was also no agreement (κ = -0.29, 95% CI: -0.81, 0.35) in the overall RoB assessment at the meta-analysis level. CONCLUSIONS: Risk of bias assessments of RCTs using the RoB tool are not consistent across different research groups. Poor agreement was not only demonstrated at the trial level but also at the meta-analysis level. Results have implications for decision making since different recommendations can be reached depending on the group analyzing the evidence. Improved guidelines to consistently apply the RoB tool and revisions to the tool for different health areas are needed.
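Weighted kappa, the agreement statistic used above, down-weights near-miss disagreements on an ordinal scale. A minimal pure-Python sketch with linear weights (the ratings below are hypothetical, not data from the study):

```python
def weighted_kappa(rater1, rater2, categories):
    """Cohen's kappa with linear weights for two raters on an ordered
    categorical scale (e.g. RoB judgements low < unclear < high)."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater1)
    # observed joint proportions
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        obs[idx[a]][idx[b]] += 1.0 / n
    # marginal proportions for each rater
    p1 = [sum(row) for row in obs]
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    # linear disagreement weights: 0 on the diagonal, 1 at maximal distance
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    d_obs = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    d_exp = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1.0 - d_obs / d_exp

# Hypothetical ratings on six trials (invented for illustration)
categories = ["low", "unclear", "high"]
cochrane = ["low", "low", "high", "unclear", "high", "low"]
external = ["unclear", "high", "low", "low", "low", "high"]
kappa = weighted_kappa(cochrane, external, categories)  # negative here
```

Identical ratings give κ = 1, while systematic disagreement drives κ toward or below 0, which is how the near-zero values reported above should be read.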

  20. 100 years of Planck's quantum

    CERN Document Server

    Duck, Ian M

    2000-01-01

    This invaluable book takes the reader from Planck's discovery of the quantum in 1900 to the most recent interpretations and applications of nonrelativistic quantum mechanics. The introduction of the quantum idea leads off the prehistory of quantum mechanics, featuring Planck, Einstein, Bohr, Compton, and de Broglie's immortal contributions. Their original discovery papers are featured with explanatory notes and developments in Part 1. The invention of matrix mechanics and quantum mechanics by Heisenberg, Born, Jordan, Dirac, and Schrödinger is presented next, in Part 2. Following that, in Part 3,

  1. 100 Years of Reality Learning

    Science.gov (United States)

    Zimpher, Nancy L.; Wright Ron, D.

    2006-01-01

    One may have heard of reality TV, but what about reality learning? The latter is probably a term one hasn't seen much, although it is in many ways a clearer and more concise name for a concept that in 2006 marks its 100th anniversary: cooperative education, or "co-op." Co-op, a break-through idea pioneered at the University of Cincinnati by Herman…

  2. FEMA 100 year Flood Data

    Data.gov (United States)

    California Department of Resources — The Q3 Flood Data product is a digital representation of certain features of FEMA's Flood Insurance Rate Map (FIRM) product, intended for use with desktop mapping...

  3. Covalent perturbation as a tool for validation of identifications and PTM mapping applied to bovine alpha-crystallin

    DEFF Research Database (Denmark)

    Bunkenborg, Jakob; Falkenby, Lasse Gaarde; Harder, Lea Mørch;

    2016-01-01

    Proteomic identifications hinge on the measurement of both parent and fragment masses and matching these to amino acid sequences via database search engines. The correctness of the identifications is assessed by statistical means. Here we present an experimental approach to test identifications… and can salvage low-scoring post-translationally modified peptides. Applying this strategy to bovine alpha-crystallin, we identify 9 lysine acetylation sites, 4 O-GlcNAc sites and 13 phosphorylation sites.

  4. Lead-time reduction utilizing lean tools applied to healthcare: the inpatient pharmacy at a local hospital.

    Science.gov (United States)

    Al-Araidah, Omar; Momani, Amer; Khasawneh, Mohammad; Momani, Mohammed

    2010-01-01

    The healthcare arena, much like the manufacturing industry, benefits from many aspects of the Toyota lean principles. Lean thinking contributes to reducing or eliminating non-value-added time, money, and energy in healthcare. In this paper, we apply selected principles of lean management aiming at reducing the wasted time associated with drug dispensing at an inpatient pharmacy at a local hospital. Thorough investigation of the drug dispensing process revealed unnecessary complexities that contribute to delays in delivering medications to patients. We utilize DMAIC (Define, Measure, Analyze, Improve, Control) and 5S (Sort, Set-in-order, Shine, Standardize, Sustain) principles to identify and reduce wastes that contribute to increasing the lead-time in healthcare operations at the pharmacy under study. The results obtained from the study revealed potential savings of > 45% in the drug dispensing cycle time. PMID:20151593

  6. Network analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish water management system

    DEFF Research Database (Denmark)

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-01-01

    New insights into the sustainable use of natural resources in human systems can be gained through comparison with ecosystems via common indices. In both kinds of system, resources are processed by a number of users within a network, but we consider ecosystems as the only ones displaying sustainable patterns of growth and development. We applied Network Analysis (NA) for assessing the sustainability of a Danish municipal Water Management System (WMS). We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices… The WMS is highly efficient at processing the water resource, but its rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future…

  7. Applying TRIZ and Fuzzy AHP Based on Lean Production to Develop an Innovative Design of a New Shape for Machine Tools

    Directory of Open Access Journals (Sweden)

    Ho-Nien Hsieh

    2015-03-01

    Full Text Available Companies are facing cutthroat competition and are forced to continuously perform better than their competitors. In order to enhance their position in the competitive world, organizations are improving at a faster pace. Industrial organizations must embrace new ideals, such as innovation. Today, innovative design in the development of new products has become a core value in most companies, while innovation is recognized as the main driving force in the market. This work applies the Russian theory of inventive problem solving, TRIZ, and the fuzzy analytic hierarchy process (FAHP) to design a new shape for machine tools. TRIZ offers several concepts and tools to facilitate concept creation and problem solving, while FAHP is employed as a decision-support tool that can adequately represent qualitative and subjective assessments in a multiple-criteria decision-making environment. This is the first study in the machine tools industry to develop an innovative design under the concept of lean production. We used TRIZ to propose principles relevant to the shape's design, with innovative design considerations, and used FAHP to evaluate and select the best feasible alternative from independent factors in a multiple-criteria decision-making environment. The contribution of this research is a scientific method, based on the lean production concept, for designing a new product and improving the old design process.

  8. Abstracts of the International conference 'Geological and geophysical studies of the Republic of Kazakhstan's sites', devoted to 100-year jubilee of K.I. Satpaev

    International Nuclear Information System (INIS)

    The International conference 'Geological and geophysical studies of the Republic of Kazakhstan's sites' was devoted to the 100-year jubilee of K.I. Satpaev, the well-known Kazakh geologist and the first President of the Academy of Sciences of the Kazakh Soviet Socialist Republic. The conference was held on 26-29 April 1999 in the city of Kurchatov, on the territory of the former Semipalatinsk test site. The conference was mainly dedicated to problems of geological and geophysical examination and monitoring of objects exposed to effects from underground nuclear explosions. The collection of abstracts comprises 21 papers

  9. Architecture of the global land acquisition system: applying the tools of network science to identify key vulnerabilities

    International Nuclear Information System (INIS)

    Global land acquisitions, often dubbed ‘land grabbing’, are increasingly becoming drivers of land change. We use the tools of network science to describe the connectivity of the global acquisition system. We find that 126 countries participate in this form of global land trade. Importers are concentrated in the Global North, the emerging economies of Asia, and the Middle East, while exporters are confined to the Global South and Eastern Europe. A small handful of countries account for the majority of land acquisitions (particularly China, the UK, and the US), the cumulative distribution of which is best described by a power law. We also find that countries with many land trading partners play a disproportionately central role in providing connectivity across the network, with the shortest trading path between any two countries traversing either China, the US, or the UK over a third of the time. The land acquisition network is characterized by very few trading cliques and therefore by a low degree of preferential trading or regionalization. We also show that countries with many export partners trade land with countries with few import partners, and vice versa, meaning that less developed countries have a large array of export partnerships with developed countries, but very few import partnerships (disassortative relationship). Finally, we find that the structure of the network is potentially prone to propagating crises (e.g., if importing countries become dependent on crops exported from their land trading partners). This network analysis approach can be used to quantitatively analyze and understand telecoupled systems as well as to anticipate and diagnose the potential effects of telecoupling. (letter)

  10. Apply Web-based Analytic Tool and Eye Tracking to Study The Consumer Preferences of DSLR Cameras

    Directory of Open Access Journals (Sweden)

    Jih-Syongh Lin

    2013-11-01

    Full Text Available Consumers' preferences and purchase motivations often lie in the purchasing behaviors generated by the synthetic evaluation of a product's form features, color, function, and price. If an enterprise can bring these criteria under control, it can grasp opportunities in the marketplace. In this study, the product form, brand, and prices of five DSLR digital cameras from Nikon, Lumix, Pentax, Sony, and Olympus were investigated through image evaluation and eye tracking. A web-based two-dimensional analytical tool was used to present information on three layers. Layer A provided information on product form and brand name; Layer B added product price for the evaluation of purchase intention (X axis) and product form attraction (Y axis). On Layer C, Nikon J1 image samples of five color series were presented for the evaluation of attraction and purchase intention. The study results revealed that, among the five Japanese brands of digital cameras, the LUMIX GF3 is most preferred and serves as the major competitive product, with a product price of US$630. Eye tracking showed that the lens, the curved handle bar, the curved part and shutter button above the lens, as well as the flexible flash of the LUMIX GF3, are the parts that attract consumers' eyes. From the verbal descriptions, it is found that consumers emphasize 3D lens support, continuous focusing while shooting video, the iA intelligent scene mode, and full manual control support. In the color preference of the Nikon J1, the red and white colors are most preferred while pink is least favored. These findings can serve as references for designers and marketing personnel in new product design and development.

  11. Architecture of the global land acquisition system: applying the tools of network science to identify key vulnerabilities

    Science.gov (United States)

    Seaquist, J. W.; Li Johansson, Emma; Nicholas, Kimberly A.

    2014-11-01

    Global land acquisitions, often dubbed ‘land grabbing’, are increasingly becoming drivers of land change. We use the tools of network science to describe the connectivity of the global acquisition system. We find that 126 countries participate in this form of global land trade. Importers are concentrated in the Global North, the emerging economies of Asia, and the Middle East, while exporters are confined to the Global South and Eastern Europe. A small handful of countries account for the majority of land acquisitions (particularly China, the UK, and the US), the cumulative distribution of which is best described by a power law. We also find that countries with many land trading partners play a disproportionately central role in providing connectivity across the network, with the shortest trading path between any two countries traversing either China, the US, or the UK over a third of the time. The land acquisition network is characterized by very few trading cliques and therefore by a low degree of preferential trading or regionalization. We also show that countries with many export partners trade land with countries with few import partners, and vice versa, meaning that less developed countries have a large array of export partnerships with developed countries, but very few import partnerships (disassortative relationship). Finally, we find that the structure of the network is potentially prone to propagating crises (e.g., if importing countries become dependent on crops exported from their land trading partners). This network analysis approach can be used to quantitatively analyze and understand telecoupled systems as well as to anticipate and diagnose the potential effects of telecoupling.
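The "shortest trading path" statistic above can be made concrete with a toy computation. The sketch below builds a tiny, entirely hypothetical importer-to-exporter network (country codes and links are invented, not the paper's data) and measures the fraction of connected country pairs whose shortest path crosses a hub country as an intermediary:

```python
from collections import deque

def shortest_path(graph, src, dst):
    """BFS shortest path in a directed graph given as {node: [neighbors]}."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in graph.get(u, []):
            if v not in prev:
                prev[v] = u
                q.append(v)
    return None  # no directed path

def hub_traversal_fraction(graph, hubs):
    """Fraction of connected ordered pairs whose shortest path passes
    through at least one hub as an intermediate node."""
    nodes = set(graph) | {v for vs in graph.values() for v in vs}
    total = through = 0
    for s in nodes:
        for t in nodes:
            if s == t:
                continue
            p = shortest_path(graph, s, t)
            if p is None:
                continue
            total += 1
            if any(h in p[1:-1] for h in hubs):
                through += 1
    return through / total

# Hypothetical mini land-trade network (links invented for illustration)
trade = {
    "CN": ["ET", "SD", "KH"],
    "UK": ["ET", "MZ"],
    "US": ["BR", "ET"],
    "SA": ["CN"],   # reaches the other exporters only via the CN hub
    "ET": [], "SD": [], "KH": [], "MZ": [], "BR": [],
}
frac = hub_traversal_fraction(trade, hubs={"CN", "UK", "US"})
```

The paper's finding is the analogous statistic on the real 126-country network: over a third of shortest trading paths traverse China, the US, or the UK.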

  12. VERONA V6.22 – An enhanced reactor analysis tool applied for continuous core parameter monitoring at Paks NPP

    Energy Technology Data Exchange (ETDEWEB)

    Végh, J., E-mail: janos.vegh@ec.europa.eu [Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Pós, I., E-mail: pos@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Horváth, Cs., E-mail: csaba.horvath@energia.mta.hu [Centre for Energy Research, Hungarian Academy of Sciences, H-1525 Budapest 114, P.O. Box 49 (Hungary); Kálya, Z., E-mail: kalyaz@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Parkó, T., E-mail: parkot@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Ignits, M., E-mail: ignits@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary)

    2015-10-15

    Between 2003 and 2007 the Hungarian Paks NPP performed a large modernization project to upgrade its VERONA core monitoring system. The modernization work resulted in a state-of-the-art system that was able to support the reactor thermal power increase to 108% by more accurate and more frequent core analysis. Details of the new system are given in Végh et al. (2008); the most important improvements were as follows: complete replacement of the hardware and the local area network; application of a new operating system and porting of a large fraction of the original application software to the new environment; implementation of a new human-system interface; and, last but not least, introduction of new reactor physics calculations. The basic novelty of the modernized core analysis was the introduction of an on-line core-follow module based on the standard Paks NPP core design code HELIOS/C-PORCA. The new calculations also provided much finer spatial resolution, both in terms of axial node numbers and within the fuel assemblies. The new system was able to calculate accurately the fuel applied during the first phase of the power increase, but it was not tailored to determine the effects of burnable absorbers such as gadolinium. However, in the second phase of the power increase process the application of fuel assemblies containing three fuel rods with gadolinium content was intended (in order to optimize fuel economy); therefore, the off-line and on-line VERONA reactor physics models had to be further modified to handle the new fuel according to the accuracy requirements. In the present paper, a brief overview of the system version (V6.0) commissioned after the first modernization step is outlined first; then details of the modified off-line and on-line reactor physics calculations are described. Validation results for the new modules are treated extensively, in order to illustrate the extent and complexity of the V&V procedure associated with the development and licensing of the new

  13. Undergraduate teaching modules featuring geodesy data applied to critical social topics (GETSI: GEodetic Tools for Societal Issues)

    Science.gov (United States)

    Pratt-Sitaula, B. A.; Walker, B.; Douglas, B. J.; Charlevoix, D. J.; Miller, M. M.

    2015-12-01

    The GETSI project, funded by NSF TUES, is developing and disseminating teaching and learning materials that feature geodesy data applied to critical societal issues such as climate change, water resource management, and natural hazards (serc.carleton.edu/getsi). It is collaborative between UNAVCO (NSF's geodetic facility), Mt San Antonio College, and Indiana University. GETSI was initiated after requests by geoscience faculty for geodetic teaching resources for introductory and majors-level students. Full modules take two weeks but module subsets can also be used. Modules are developed and tested by two co-authors and also tested in a third classroom. GETSI is working in partnership with the Science Education Resource Center's (SERC) InTeGrate project on the development, assessment, and dissemination to ensure compatibility with the growing number of resources for geoscience education. Two GETSI modules are being published in October 2015. "Ice mass and sea level changes" includes geodetic data from GRACE, satellite altimetry, and GPS time series. "Imaging Active Tectonics" has students analyzing InSAR and LiDAR data to assess infrastructure earthquake vulnerability. Another three modules are in testing during fall 2015 and will be published in 2016. "Surface process hazards" investigates mass wasting hazard and risk using LiDAR data. "Water resources and geodesy" uses GRACE, vertical GPS, and reflection GPS data to have students investigating droughts in California and the High Great Plains. "GPS, strain, and earthquakes" helps students learn about infinitesimal and coseismic strain through analysis of horizontal GPS data and includes an extension module on the Napa 2014 earthquake. In addition to teaching resources, the GETSI project is compiling recommendations on successful development of geodesy curricula. 
The chief recommendations so far are the critical importance of including scientific experts in the authorship team and investing significant resources in

  14. The development of vat dyes in 100 years (to be continued) [还原染料百年发展史话(待续)]

    Institute of Scientific and Technical Information of China (English)

    陈荣圻

    2015-01-01

    The first vat dye (vat dye RSN) was synthesized and produced by BASF in 1901, more than 100 years ago; counting from BASF's synthesis of indigo in 1897, the history is longer still. Vat dyes are expensive because of their complex chemical structures and long synthesis routes, which generate large amounts of waste that is difficult to treat. However, vat dyes are bright in color and high in color density, and cannot be replaced by any other cotton dyes. Beyond printing and dyeing, some vat dyes can, after pigmentation, yield high-grade organic pigments; some varieties also extend to high-tech fields such as photophysics and electrochemistry (liquid-crystal and photoconductive materials), where they are indispensable functional materials that have taken on an entirely new look.

  15. A 100-Year Retrospective Landscape-Level Carbon Budget for the Sooke Lake Watershed, British Columbia: Constraining Estimates of Terrestrial to Aquatic DOC Transfers.

    Science.gov (United States)

    Trofymow, J. A.; Smiley, B. P. K.

    2014-12-01

    To address how natural disturbance, forest harvest, and deforestation from reservoir creation affect landscape-level carbon (C) budgets, a retrospective C budget for the 8500 ha Sooke watershed from 1911 - 2012 was developed using historic spatial inventory and disturbance data. Data was input to a spatially-explicit version of the Carbon Budget Model-Canadian Forest Sector (CBM-CFS3), an inventory-based C budget model used to simulate forest C dynamics at multiple scales. In 1911 the watershed was dominated by mature/old Douglas-fir forests with aboveground biomass C (ABC) of 262 Mg C/ha and net ecosystem production (NEP) of 0.63 Mg C/ha/yr. Land was cleared around Sooke Lake, a dam built and lake expanded from 370 to 450 ha in 1915, 610 ha in 1970, 670 ha in 1980 and 810 ha in 2002. Along with deforestation, fires and localized harvest occurred from 1920 - 1940, reducing ABC to 189 Mg C/ha, with NEP varying from -1.63 to 0.13 Mg C/ha/yr. Distributed harvest occurred 1954 - 1998, with a minimum ABC of 148 Mg C/ha in 1991. By 2012 ABC (177 Mg C/ha) and NEP (2.29 Mg C/ha/yr) had increased. Over 100 years, 2430 ha forest was cut and replanted and 640 ha deforested. CBM-CFS3 includes transfers of dissolved organic C (DOC) to aquatic systems, however data has not been available to parameterize DOC flux. DOC fluxes are modelled as a fraction of decay loss from humified soil C with a default of 100% of losses to CO2 and 0% to DOC. Stream flow and [DOC] data from 1996 - 2012 for 3 watershed catchments, Rithet, Judge and Council were used to estimate annual DOC fluxes. Rithet, Judge and Council differed both in area % disturbed (logging or fire) over 100 years (39%, 93%, 91%) and in area % mature/old forest (>80yrs in 2012) (67%, 56%, 21%). DOC flux for Rithet and Judge ranged from 0.037 - 0.057 Mg C/ha/yr, Council averaged 0.017 Mg C/ha/yr. Low DOC fluxes were likely due to influences of a small lake in the catchment. 
Constraining CBM-CFS3 to observed DOC fluxes required

  16. Bottom-Up modeling, a tool for decision support for long-term policy on energy and environment - The TIMES model applied to the energy intensive industries

    International Nuclear Information System (INIS)

    Among the energy users in France and Europe, some industrial sectors are very important and should have a key role when assessing future final energy demand patterns. The aim of our work is to apply a prospective model for the long-range analysis of energy/technology choices in the industrial sector, focusing on the energy-intensive sectors. The modelling tool applied in this study is the TIMES model (of the well-known MARKAL model family). It is an economic linear-programming model generator for local, national or multi-regional energy systems, which provides a technology-rich basis for estimating energy dynamics over a long-term, multi-period horizon. We illustrate our work with nine energy-intensive industrial sectors: paper, steel, glass, cement, lime, tiles, brick, ceramics and plaster. The model includes a detailed description of the processes involved in the production of industrial products, providing typical energy uses in each process step. In our analysis, we identified for each industry several commercially available state-of-the-art technologies, characterized and chosen by the model on the basis of cost-effectiveness. Furthermore, we calculated potential energy savings and carbon dioxide emission reductions, and we estimated the energy impact of a technological rupture. This work indicates that a significant potential for energy savings and carbon dioxide emission reductions still exists in all industries. (author)
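TIMES selects technologies by cost-minimizing linear programming; a full LP solver is out of scope here, but the flavour of "chosen by the model on the basis of cost-effectiveness" can be sketched as a merit-order choice in which cheaper process options are taken first. Technology names, costs and capacities below are invented for illustration:

```python
def merit_order_dispatch(technologies, demand):
    """Allocate production to technologies in order of increasing unit
    cost until demand is met; a toy stand-in for the cost-minimizing
    choice a TIMES/MARKAL solver makes over a full energy system."""
    plan, remaining = {}, demand
    for tech in sorted(technologies, key=lambda t: t["cost"]):
        if remaining <= 0:
            break
        take = min(tech["capacity"], remaining)  # capacity-limited share
        plan[tech["name"]] = take
        remaining -= take
    if remaining > 0:
        raise ValueError("demand exceeds total capacity")
    return plan

# Hypothetical cement-sector options (unit costs and capacities invented)
techs = [
    {"name": "state_of_the_art_kiln", "cost": 40.0, "capacity": 60.0},
    {"name": "existing_kiln", "cost": 55.0, "capacity": 100.0},
]
plan = merit_order_dispatch(techs, demand=90.0)
```

A real TIMES run differs in adding multi-period investment decisions, emission constraints, and inter-sector linkages, but the same cost-effectiveness ordering drives the technology choice.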

  17. Quantification of uncertainties in the 100-year flow at an ungaged site near a gaged station and its application in Georgia

    Science.gov (United States)

    Cho, Huidae; Bones, Emma

    2016-08-01

    The Federal Emergency Management Agency has introduced the concept of the "1-percent plus" flow to incorporate various uncertainties in estimation of the 100-year or 1-percent flow. However, to the best of the authors' knowledge, no clear directions for calculating the 1-percent plus flow have been defined in the literature. Although information about standard errors of estimation and prediction is provided along with the regression equations that are often used to estimate the 1-percent flow at ungaged sites, uncertainty estimation becomes more complicated when there is a nearby gaged station because regression flows and the peak flow estimate from a gage analysis should be weighted to compute the weighted estimate of the 1-percent flow. In this study, an equation for calculating the 1-percent plus flow at an ungaged site near a gaged station is analytically derived. Also, a detailed process is introduced for calculating the 1-percent plus flow for an ungaged site near a gaged station in Georgia as an example and a case study is performed. This study provides engineers and practitioners with a method that helps them better assess flood risks and develop mitigation plans accordingly.
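The weighting step described above, combining a gage-based peak-flow estimate with a regression estimate, is conventionally an inverse-variance average in log space. The sketch below illustrates that idea only; the flow values and variances are invented, and the study's exact formulation (including the 1-percent-plus uncertainty adjustment) is not reproduced here:

```python
import math

def weighted_log_estimate(q_gage, var_gage, q_reg, var_reg):
    """Inverse-variance weighting of a gage-based and a regression-based
    flow estimate, done in log10 space as is conventional for flood
    frequency statistics. Variances are of the log10 estimates."""
    lg, lr = math.log10(q_gage), math.log10(q_reg)
    # each estimate weighted by the variance of the *other* estimate
    lw = (lg * var_reg + lr * var_gage) / (var_gage + var_reg)
    # variance of the weighted estimate is smaller than either input's
    var_w = (var_gage * var_reg) / (var_gage + var_reg)
    return 10 ** lw, var_w

# Hypothetical 1-percent flow estimates in cfs (numbers invented)
q_w, var_w = weighted_log_estimate(q_gage=12000.0, var_gage=0.01,
                                   q_reg=9500.0, var_reg=0.04)
```

The weighted estimate always falls between the two inputs, closer to the one with smaller variance, and its variance is smaller than either input variance, which is what makes the weighting worthwhile near a gaged station.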

  18. Organochlorine pesticides (OCPs) in wetland soils under different land uses along a 100-year chronosequence of reclamation in a Chinese estuary

    Science.gov (United States)

    Bai, Junhong; Lu, Qiongqiong; Zhao, Qingqing; Wang, Junjing; Gao, Zhaoqin; Zhang, Guangliang

    2015-12-01

    Soil profiles were collected at a depth of 30 cm in ditch wetlands (DWs), riverine wetlands (RiWs) and reclaimed wetlands (ReWs) along a 100-year chronosequence of reclamation in the Pearl River Delta. In total, 16 OCPs were measured to investigate the effects of wetland reclamation and reclamation history on OCP levels. Our results showed that average ∑DDTs, HCB, MXC, and ∑OCPs were higher in surface soils of DWs compared to RiWs and ReWs. Both D30 and D20 soils contained the highest ∑OCP levels, followed by D40 and D100 soils; lower ∑OCP levels occurred in D10 soils. Higher ∑OCP levels were observed in the younger RiWs than in the older ones, and surface soils exhibited higher ∑OCP concentrations in the older ReWs compared with younger ReWs. The predominant percentages of γ-HCH in ∑HCHs (>42%) and aldrin in ∑DRINs (>46%) in most samples reflected the recent use of lindane and aldrin. The presence of dominant DDT isomers (p,p’-DDE and p,p’-DDD) indicated the historical input of DDT and significant aerobic degradation of the compound. Generally, DW soils had a higher ecotoxicological risk of OCPs than RiW and ReW soils, and the top 30 cm soils had higher ecotoxicological risks of HCHs than of DDTs.

  19. Microbe-mediated transformations of marine dissolved organic matter during 2,100 years of natural incubation in the cold, oxic crust of the Mid-Atlantic Ridge.

    Science.gov (United States)

    Shah Walter, S. R.; Jaekel, U.; Huber, J. A.; Dittmar, T.; Girguis, P. R.

    2015-12-01

    On the western flank of the Mid-Atlantic Ridge, oxic seawater from the deep ocean is downwelled into the basaltic crust, supplying the crustal aquifer with an initial inoculum of organic matter and electron acceptors. Studies have shown that fluids circulating within the crust are minimally altered from original seawater, making this subsurface environment a unique natural experiment in which the fate of marine organic matter and the limitations of microbial adaptability in the context of reduced carbon supply can be examined. To make the subsurface crustal aquifer accessible, two CORK (Circulation Obviation Retrofit Kit) observatories have been installed at North Pond, a sediment-filled depression beneath the oligotrophic Sargasso Sea. Radiocarbon analysis of dissolved inorganic (DIC) and organic carbon (DOC) in samples recovered from these observatories show uncoupled aging between DOC and DIC with Δ14C values of DOC as low as -933‰ despite isolation from the open ocean for, at most, 2,100 years. This extreme value is part of a general trend of decreasing DOC δ13C and Δ14C values with increasing incubation time within the aquifer. Combined with reduced concentrations of DOC, our results argue for selective microbial oxidation of the youngest, most 13C-enriched components of downwelled DOC, possibly identifying these as characteristics of the more bioavailable fractions of deep-ocean dissolved organic matter. They also suggest that microbial oxidation during low-temperature hydrothermal circulation could be an important sink for aged marine dissolved organic matter.

  20. Intraspecific variation in fine root respiration and morphology in response to in situ soil nitrogen fertility in a 100-year-old Chamaecyparis obtusa forest.

    Science.gov (United States)

    Makita, Naoki; Hirano, Yasuhiro; Sugimoto, Takanobu; Tanikawa, Toko; Ishii, Hiroaki

    2015-12-01

    Soil N fertility has an effect on belowground C allocation, but the physiological and morphological responses of individual fine root segments to variations in N availability under field conditions are still unclear. In this study, the direction and magnitude of the physiological and morphological function of fine roots in response to variable in situ soil N fertility in a forest site were determined. We measured the specific root respiration (Rr) rate, N concentration and morphology of fine root segments with 1-3 branching orders in a 100-year-old coniferous forest of Chamaecyparis obtusa. Higher soil N fertility induced higher Rr rates, root N concentration, and specific root length (SRL), and lower root tissue density (RTD). In all fertility levels, the Rr rates were significantly correlated positively with root N and SRL and negatively with RTD. The regression slopes of respiration with root N and RTD were significantly higher along the soil N fertility gradient. Although no differences in the slopes of Rr and SRL relationship were found across the levels, there were significant shifts in the intercept along the common slope. These results suggest that a contrasting pattern in intraspecific relationships between specific Rr and N, RTD, and SRL exists among soils with different N fertility. Consequently, substantial increases in soil N fertility would exert positive effects on organ-scale root performance by covarying the Rr, root N, and morphology for their potential nutrient and water uptake.

  1. 100-Year Floodplains, Digital Floodplain maps created by WI DNR added to our website in 2013, Published in 2013, 1:24000 (1in=2000ft) scale, Oneida County Wisconsin.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Published Reports/Deeds information as of 2013. It is...

  2. Changes in stable isotopes, lignin-derived phenols, and fossil pigments in sediments of Lake Biwa, Japan: Implications for anthropogenic effects over the last 100 years

    International Nuclear Information System (INIS)

    We measured stable nitrogen (N) and carbon (C) isotope ratios, lignin-derived phenols, and fossil pigments in sediments of known ages to elucidate the historical changes in the ecosystem status of Lake Biwa, Japan, over the last 100 years. Stable N isotope ratios and algal pigments in the sediments increased rapidly from the early 1960s to the 1980s, and then remained relatively constant, indicating that eutrophication occurred in the early 1960s but ceased in the 1980s. Stable C isotope ratios of the sediment increased from the 1960s, but decreased after the 1980s to the present. This decrease in stable C isotope ratios after the 1980s could not be explained by annual changes in either terrestrial input or algal production. However, when the C isotope ratios were corrected for the Suess effect, the shift toward more negative isotopic values in atmospheric CO2 caused by fossil-fuel burning, the corrected values showed a trend consistent with the other biomarkers and the monitoring data. The trend was also mirrored by the relative abundance of lignin-derived phenols, a unique organic tracer of material originating from terrestrial plants, which decreased in the early 1960s and recovered to some degree in the 1980s. We detected no notable difference in the composition of lignin phenols, suggesting that the terrestrial plant composition did not change markedly. However, we found that the lignin accumulation rate increased around the 1980s. These results suggest that although eutrophication has stabilized since the 1980s, allochthonous organic matter input has changed in Lake Biwa over the past 25 years.
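The Suess-effect correction described in this abstract can be sketched as follows: remove the decline in atmospheric CO2 δ13C caused by fossil-fuel burning, relative to a pre-industrial baseline. The atmospheric values below are illustrative round numbers, not data from the Lake Biwa study:

```python
# Hedged sketch of a Suess-effect correction for sediment d13C records.
# Approximate atmospheric d13C of CO2 (permil); illustrative values only.
ATM_D13C = {
    1900: -6.6,
    1950: -6.9,
    1980: -7.5,
    2000: -8.0,
}
PREINDUSTRIAL_D13C = -6.5  # assumed pre-industrial baseline (permil)

def suess_corrected(d13c_measured: float, year: int) -> float:
    """Remove the atmospheric Suess-effect shift for the given year."""
    shift = ATM_D13C[year] - PREINDUSTRIAL_D13C  # negative after ~1850
    return d13c_measured - shift
```

With these assumed values, a measured −28.0‰ in the year 2000 corrects to −26.5‰; applying such a correction is what reveals the biomarker-consistent trend the abstract describes.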

  3. Fractionation, transfer, and ecological risks of heavy metals in riparian and ditch wetlands across a 100-year chronosequence of reclamation in an estuary of China

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Rong [State Key Laboratory of Water Environment Stimulation, School of Environment, Beijing Normal University, Beijing 100875 (China); School of Nature Conservation, Beijing Forestry University, Beijing 100083 (China); Bai, Junhong, E-mail: junhongbai@163.com [State Key Laboratory of Water Environment Stimulation, School of Environment, Beijing Normal University, Beijing 100875 (China); Lu, Qiongqiong; Zhao, Qingqing; Gao, Zhaoqin; Wen, Xiaojun; Liu, Xinhui [State Key Laboratory of Water Environment Stimulation, School of Environment, Beijing Normal University, Beijing 100875 (China)

    2015-06-01

    The effect of reclamation on heavy metal concentrations and the ecological risks in ditch wetlands (DWs) and riparian wetlands (RWs) across a 100-year chronosequence in the Pearl River Estuary of China was investigated. Concentrations of 4 heavy metals (Cd, Cu, Pb, and Zn) in soil and plant samples, and sequential extracts of soil samples were determined, using inductively coupled plasma atomic absorption spectrometry. Results showed that heavy metal concentrations were higher in older DW soils than in the younger ones, and that the younger RW soils contained higher heavy metal concentrations compared to the older ones. Although the increasing tendency of heavy metal concentrations in soil was obvious after wetland reclamation, the metals Cu, Pb, and Zn exhibited low or no risks to the environment based on the risk assessment code (RAC). Cd, on the other hand, posed a medium or high risk. Cd, Pb, and Zn were mainly bound to Fe–Mn oxide, whereas most of Cu remained in the residual phase in both ditch and riparian wetland soils, and the residual proportions generally increased with depth. Bioconcentration and translocation factors for most of these four heavy metals significantly decreased in the DWs with older age (p < 0.05), whereas they increased in the RWs with younger age (p < 0.05). The DW soils contained higher concentrations of heavy metals in the organic fractions, whereas there were more carbonate and residual fractions in the RW soils. The non-bioavailable fractions of Cu and Zn, and the organic-bound Cd and Pb significantly inhibited plant growth. - Highlights: • Heavy metals in ditch wetland accumulated with increasing reclamation history. • Heavy metals exist in the Fe–Mn oxides and residual fractions in both wetlands. • Cd posed a medium to high environmental risk while low risk for other metals. • Long reclamation history caused lower BCFs and TFs in DWs and higher levels in RWs. • RW soils contained more heavy metals in the carbonate
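The risk assessment code (RAC) used in this abstract is commonly defined as the percentage of a metal held in the most mobile fractions (exchangeable plus carbonate-bound) of a sequential extraction. A minimal sketch, with threshold bands that follow the commonly cited RAC scheme and should be checked against the paper itself:

```python
# Hedged sketch of the risk assessment code (RAC). Thresholds follow the
# commonly cited classification, not values quoted in this abstract.

def rac_percent(exchangeable: float, carbonate: float, total: float) -> float:
    """Mobile-fraction share of the total metal concentration, in %."""
    return 100.0 * (exchangeable + carbonate) / total

def rac_category(rac: float) -> str:
    """Map an RAC percentage to its conventional risk band."""
    if rac < 1:
        return "no risk"
    if rac <= 10:
        return "low risk"
    if rac <= 30:
        return "medium risk"
    return "high risk" if rac <= 50 else "very high risk"

# Example: a metal with 12% of its total in mobile fractions falls in the
# "medium risk" band, consistent with the medium-to-high rating for Cd.
```
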

  4. Upwelling and anthropogenic forcing on phytoplankton productivity and community structure changes in the Zhejiang coastal area over the last 100 years

    Institute of Scientific and Technical Information of China (English)

    DUAN Shanshan; XING Lei; ZHANG Hailong; FENG Xuwen; YANG Haili; ZHAO Meixun

    2014-01-01

    Phytoplankton productivity and community structure in marginal seas have been altered significantly during the past three decades, but it is still a challenge to distinguish the forcing mechanisms between climate change and anthropogenic activities. High time-resolution biomarker records of two 210Pb-dated sediment cores (#34: 28.5°N, 122.272°E; CJ12-1269: 28.8619°N, 122.5153°E) from the Min-Zhe coastal mud area were compared to reveal changes of phytoplankton productivity and community structure over the past 100 years. Phytoplankton productivity started to increase gradually from the 1970s and increased rapidly after the late 1990s at Site #34, and it started to increase gradually from the middle 1960s and increased rapidly after the late 1980s at Site CJ12-1269. Productivity of Core CJ12-1269 was higher than that of Core #34. Phytoplankton community structure variations displayed opposite patterns in the two cores. The decreasing D/B (dinosterol/brassicasterol) ratio of Core #34 since the 1960s revealed increased diatom contribution to total productivity. In contrast, the increasing D/B ratio of Core CJ12-1269 since the 1950s indicated increased dinoflagellate contribution to total productivity. Both the productivity increase and the increased dinoflagellate contribution in Core CJ12-1269 since the 1950-1960s were mainly caused by anthropogenic activities, as the location was closer to the Changjiang River Estuary with higher nutrient concentration and decreasing Si/N ratios. However, increased diatom contribution in Core #34 is proposed to be caused by increased coastal upwelling, with higher nutrient concentration and higher Si/N ratios.

  5. Fractionation, transfer, and ecological risks of heavy metals in riparian and ditch wetlands across a 100-year chronosequence of reclamation in an estuary of China

    International Nuclear Information System (INIS)

    The effect of reclamation on heavy metal concentrations and the ecological risks in ditch wetlands (DWs) and riparian wetlands (RWs) across a 100-year chronosequence in the Pearl River Estuary of China was investigated. Concentrations of 4 heavy metals (Cd, Cu, Pb, and Zn) in soil and plant samples, and sequential extracts of soil samples were determined, using inductively coupled plasma atomic absorption spectrometry. Results showed that heavy metal concentrations were higher in older DW soils than in the younger ones, and that the younger RW soils contained higher heavy metal concentrations compared to the older ones. Although the increasing tendency of heavy metal concentrations in soil was obvious after wetland reclamation, the metals Cu, Pb, and Zn exhibited low or no risks to the environment based on the risk assessment code (RAC). Cd, on the other hand, posed a medium or high risk. Cd, Pb, and Zn were mainly bound to Fe–Mn oxide, whereas most of Cu remained in the residual phase in both ditch and riparian wetland soils, and the residual proportions generally increased with depth. Bioconcentration and translocation factors for most of these four heavy metals significantly decreased in the DWs with older age (p < 0.05), whereas they increased in the RWs with younger age (p < 0.05). The DW soils contained higher concentrations of heavy metals in the organic fractions, whereas there were more carbonate and residual fractions in the RW soils. The non-bioavailable fractions of Cu and Zn, and the organic-bound Cd and Pb significantly inhibited plant growth. - Highlights: • Heavy metals in ditch wetland accumulated with increasing reclamation history. • Heavy metals exist in the Fe–Mn oxides and residual fractions in both wetlands. • Cd posed a medium to high environmental risk while low risk for other metals. • Long reclamation history caused lower BCFs and TFs in DWs and higher levels in RWs. • RW soils contained more heavy metals in the carbonate

  6. Assessment and remediation of a historical pipeline release : tools, techniques and technologies applied to in-situ/ex-situ soil and groundwater remediation

    Energy Technology Data Exchange (ETDEWEB)

    Reid, N. [EBA Engineering Consultants Ltd., Calgary, AB (Canada); Kohlsmith, B. [Kinder Morgan Canada Inc., Calgary, AB (Canada)

    2008-07-01

    Tools, techniques, and technologies applied to in-situ/ex-situ soil and groundwater remediation were presented as part of the assessment and remediation of a historical pipeline release. The presentation covered the initial assessment, remediation of hydrophobic soils, re-assessment, site-specific criteria, a remediation trial involving bioventing and chemical oxidation, and full-scale remediation. The pipeline release occurred in the summer of 1977. The event was followed by a complete surface remediation, with a significant amount of topsoil being removed and replaced. In 2004, a landowner complained of poor crop growth in four patches near the area of the historical spill. An initial assessment was undertaken and several photographs were presented. It was concluded that a comprehensive assessment set the base for a careful, staged approach to the remediation of the site, including the establishment of site-specific criteria. The process was made possible by a high level of communication between all stakeholders, and the most appropriate solution for the site was realized. figs.

  7. 100 Years of benthic foraminiferal history on the inner Texas shelf inferred from fauna and stable isotopes: Preliminary results from two cores

    Science.gov (United States)

    Strauss, Josiah; Grossman, Ethan L.; Carlin, Joseph A.; Dellapenna, Timothy M.

    2012-04-01

    Coastal regions, such as the Texas-Louisiana shelf, are subject to seasonal hypoxia that strongly depends on the magnitude of freshwater discharge from local and regional river systems. We have determined benthic foraminiferal fauna and isotopic compositions in two 210Pb-dated box cores (BR4 and BR5) to examine the evidence for nearshore hypoxia and freshwater discharge on the Texas shelf during the last 100 years. The 210Pb chronologies of both cores reveal sedimentation rates of 0.2 and 0.1 cm yr-1, translating to ~60- and ~90-year records. The fauna of both cores were almost exclusively composed of Ammonia parkinsoniana and Elphidium excavatum, indicating euryhaline ambient waters. The Ammonia-Elphidium (A-E) index, a qualitative measure of low-oxygen conditions, shows an increase from values between 20 and 50 to near 100 in both cores, suggesting low-oxygen conditions between 1960 and the core top. Between 1950 and 1960 (9-10 cm), low A-E values in BR4 coincide with high δ18O and δ13C values greater than 0‰ and -1‰, respectively. This event corresponds to severe drought (the Texas Drought of Record) over the Brazos River drainage basin and considerably reduced river discharge from 1948 to 1957. High A-E values prior to this event imply low-oxygen conditions were prevalent before anthropogenic exacerbation of Louisiana shelf hypoxia, and at least since the dredging of a new Brazos River delta in 1929. Elphidium excavatum δ13C values are very low (-4‰), indicative of a significant vital effect. The δ13C values of A. parkinsoniana average -3‰ and exhibit little variability, most likely reflecting pore waters influenced by aerobic and anaerobic respiration. The association of lowered Brazos River discharge with more oxygenated shelf bottom waters suggests Brazos River discharge and shelf hypoxia are linked, but Mississippi-Atchafalaya discharge may also contribute to shelf stratification.
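The Ammonia-Elphidium (A-E) index cited in this abstract is commonly computed as the share of Ammonia among the two genera, scaled to 0-100, with higher values read as lower-oxygen conditions. A minimal sketch of that common definition (the paper should be checked for the exact variant used):

```python
# Hedged sketch of the Ammonia-Elphidium (A-E) index.
# Higher values are conventionally interpreted as lower-oxygen conditions.

def ammonia_elphidium_index(n_ammonia: int, n_elphidium: int) -> float:
    """A-E index = 100 * N_Ammonia / (N_Ammonia + N_Elphidium)."""
    return 100.0 * n_ammonia / (n_ammonia + n_elphidium)

# Example: 90 Ammonia and 10 Elphidium tests give an index of 90, in the
# near-100 range the abstract associates with low-oxygen conditions.
```
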

  8. Indications of progressive desiccation of the Transvaal Lowveld over the past 100 years, and implications for the water stabilization programme in the Kruger National Park

    Directory of Open Access Journals (Sweden)

    U. De V. Pienaar

    1985-12-01

    All available rainfall statistics recorded for the Kruger National Park area since 1907, coupled with an analysis of all the historical climatological data on hand, appear to confirm the quasi-twenty-year oscillation in the precipitation pattern for the summer rainfall area, first pointed out by Tyson & Dyer (1975). The dendrochronological data obtained by Hall (1976) from a study of growth rings of a very old yellowwood tree (Podocarpus falcatus) in Natal also appear to indicate a superimposed, long-term (80-100 year) pattern of alternating below-average and above-average rainfall periods. The historical data relating to climate in the park during the past century or two seem to bear out such a pattern. If this can be confirmed, it will be an enormous aid not only in wildlife-management planning, but also to agriculturists, demographic planners and others. It would appear that the long, relatively dry rainfall period of 1860-1970, with its concomitant progressive desiccation of the area in question, has passed over into the next above-average rainfall era. This does not mean that there will be no further cataclysmic droughts during future rainfall trough periods. It is therefore wise to plan ahead to meet such contingencies. The present water distribution pattern in the park (natural plus artificial water) is conspicuously still well below that which pertained, during dry seasons, at the turn of the century, when the Sabi and Shingwedzi game reserves were proclaimed. It is the declared policy of the National Parks Board of Trustees to simulate natural regulating mechanisms as closely as possible. In consequence the artificial water-for-game program is a long way from completion.
The large numbers of game animals in the park (including dominant species such as elephant Loxodonta africana and buffalo Syncerus caffer) can no longer migrate out of the area to escape natural catastrophes (such as the crippling droughts of 1911-1917, the

  9. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools made a significant contribution to the great progress in development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A comfortable number of powerful simulation tools is available. The users have to choose the best suitable for their application. Here a simple rule applies: The best available simulation tool is the tool the user is already used to (provided, it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved—even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  10. Performance-driven design with the support of digital tools: Applying discrete event simulation and space syntax on the design of the emergency department

    Directory of Open Access Journals (Sweden)

    David Morgareidge

    2014-09-01

    This case study demonstrates that DES and SSA are effective tools for facilitating decision-making related to design, reducing capital and operational costs, and improving organizational performance. DES focuses on operational processes and care flow. SSA complements DES with its strength in linking space to human behavior. Combining both tools can lead to high-performance ED design and can extend to broad applications in health care.

  11. Establishment and Application of an Economic Coefficient Model of Over-100-Year-Old Persimmons

    Institute of Scientific and Technical Information of China (English)

    巩文; 巩垠熙; 沈晓燕

    2014-01-01

    In this study, over-100-year-old persimmon trees were used as the object of study. Stem analysis was applied to estimate biomass, and, together with harvest records, the relationships between tree age, D.B.H. (diameter at breast height), biomass, fruit production, and the economic coefficient (Ec) were investigated. Statistical software was then used to establish regression models of individual-tree biomass, fruit production, and Ec. The results were as follows: 1) on an annual basis, the relationships between EC1 (the ratio of annual fruit production to cumulative biomass) and tree age and D.B.H. were both hyperbolic; 2) over a whole life cycle, EC3 (the ratio of cumulative fruit production to cumulative biomass) tended to be stable, with a value of 0.846 and a standard deviation of 0.036; 3) Ec therefore plays a significant role in biomass estimation and the study of economic forests and should be given full attention.
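The two economic coefficients described in this abstract are simple ratios, sketched below in Python; the variable names and units are illustrative, not from the paper:

```python
# Hedged sketch of the two economic coefficients described in the abstract.
# EC1: one year's fruit yield over the tree's cumulative biomass.
# EC3: cumulative fruit yield over cumulative biomass for the life cycle.

def ec1(annual_fruit_kg: float, cumulative_biomass_kg: float) -> float:
    """Annual economic coefficient: yearly fruit / cumulative biomass."""
    return annual_fruit_kg / cumulative_biomass_kg

def ec3(cumulative_fruit_kg: float, cumulative_biomass_kg: float) -> float:
    """Life-cycle economic coefficient: total fruit / total biomass."""
    return cumulative_fruit_kg / cumulative_biomass_kg

# Example: 423 kg of cumulative fruit on 500 kg of cumulative biomass
# gives EC3 = 0.846, the stable value reported in the abstract.
```
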

  12. Centennial annual general meeting of the CIM/CMMI/MIGA. Montreal `98: a vision for the future; 100 years of ground subsidence studies

    Energy Technology Data Exchange (ETDEWEB)

    Chrzanowski, A.; Szostak-Chrzanowski, A.; Forrester, D.J. [University of New Brunswick, Fredericton, NB (Canada)

    1998-12-31

    Some of the empirical methods developed in central Europe for monitoring and analysis of ground subsidence have been adapted to North American conditions. A century of subsidence observations in Cape Breton is outlined. Empirical methods are being replaced by deterministic modelling of rock behaviour, which applies numerical methods to the development of subsidence models. These deterministic models can be verified by monitoring under diverse geological and mining conditions. Some of the new monitoring methods developed in Canada are illustrated by case studies describing the use of hydrographic surveys to measure subsidence in offshore coal mines, a telemetric monitoring system for a coal mine in British Columbia, and deterministic monitoring and modelling of ground subsidence in a potash mine. 29 refs., 9 figs., 2 tabs.

  13. Brief Performance Analysis of Bearings Applied to Hand-held Motor-operated Electric Tools

    Institute of Scientific and Technical Information of China (English)

    张永恩; 方承志; 宋贵州; 李兴林

    2014-01-01

    Taking deep groove ball bearings as representative, and leaving aside tool assembly and other factors, this paper draws on the relevant sections of the power-tool safety standards to discuss the main performance requirements of rolling bearings used in hand-held motor-operated electric tools and their relationship to, and effect on, those safety standards. The content can also serve as a reference for design selection and quality appraisal.

  14. Applying standards to ICT models, tools and data in Europe to improve river basin networks and spread innovation on water sector

    Science.gov (United States)

    Pesquer, Lluís; Jirka, Simon; van de Giesen, Nick; Masó, Joan; Stasch, Christoph; Van Nooyen, Ronald; Prat, Ester; Pons, Xavier

    2015-04-01

    This work describes the strategy of the European Horizon 2020 project WaterInnEU. Its vision is to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to the water sector and to establish suitable conditions for new market opportunities based on these offerings. The main goals are: • Connect the research results and developments of previous EU-funded activities with the already existing data available on the European level, and also with the companies that are able to offer products and services based on these tools and data. • Offer an independent marketplace platform, complemented by technical and commercial expertise as a service, that allows users to access the products and services best fitting their priorities, capabilities and procurement processes. One of the pillars of WaterInnEU is to stimulate and prioritize the application of international standards in ICT tools and policy briefs. The standardization of formats, services and processes will allow for harmonized water management across different sectors, fragmented areas and scales (local, regional or international). Several levels of interoperability will be addressed: • Syntactic: connecting systems and tools together. Syntactic interoperability allows client and service tools to automatically discover, access, and process data and information (query and exchange parts of a database) and to connect to each other in process chains. The discovery of water-related data is achieved using metadata cataloguing standards, in particular the one adopted by the INSPIRE directive: the OGC Catalogue Service for the Web (CSW). • Semantic: sharing a pan-European conceptual framework. This is the ability of computer systems to exchange data with unambiguous, shared meaning. The project therefore addresses not only the packaging of data (syntax), but also the simultaneous transmission of the meaning with the data (semantics). This is accomplished by linking
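The catalogue-discovery step described in this abstract can be sketched as a minimal key-value-pair GetRecords request against an OGC Catalogue Service for the Web (CSW) endpoint. The endpoint URL below is a placeholder, not one from the WaterInnEU project:

```python
# Hedged sketch of a CSW 2.0.2 GetRecords request built as a KVP URL.
# The endpoint is a placeholder; parameter names follow the CSW spec.
from urllib.parse import urlencode

def csw_getrecords_url(endpoint: str, keyword: str) -> str:
    """Build a GetRecords URL that searches all record text for a keyword."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        "constraint": f"AnyText LIKE '%{keyword}%'",
    }
    return endpoint + "?" + urlencode(params)

# Example: discover water-related records on a hypothetical catalogue.
url = csw_getrecords_url("https://example.org/csw", "water")
```
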

  15. 100 years of the main mine rescue service. A contribution to the protection against disasters in the coal mining industry; 100 Jahre Hauptstelle fuer das Grubenrettungswesen. Ein Beitrag zum Katastrophenschutz im Steinkohlenbergbau

    Energy Technology Data Exchange (ETDEWEB)

    Hermuelheim, Walter [RAG Aktiengesellschaft, Herne (Germany). Zentralbereich Arbeits-, Gesundheits- und Umweltschutz

    2011-06-15

    A review of 100 years of protection against disasters in the coal mining industry impressively shows the way from an era of major accidents to a modern branch of industry, which justifiably and with good prospects of success can pursue the aim of "No accidents - no damage to health - no damage to the environment". However, the development of the mine rescue service over more than 100 years - represented in the Ruhr by the Main Mine Rescue Service established in 1910 in Essen - would be incomplete without consideration of the allied technical fields of underground fire protection and explosion protection. Cooperation between institutions such as the Tremonia test mine and the BVG has produced a safety level in all three fields which is regarded as exemplary worldwide and which, in addition to the latest mining technology, is a good advertisement for the German coal mining industry. (orig.)

  16. An H-formulation-based three-dimensional hysteresis loss modelling tool in a simulation including time varying applied field and transport current: the fundamental problem and its solution

    International Nuclear Information System (INIS)

    When analytic solutions are not available, finite-element-based tools can be used to simulate hysteresis losses in superconductors with various shapes. A widely used tool for the corresponding magnetoquasistatic problem is based on the H-formulation (where H is the magnetic field intensity) of the eddy current model. In this paper, we study this type of tool in a three-dimensional simulation problem. We consider a case where we simultaneously apply both a time-varying external magnetic field and a transport current to a twisted wire. We show how the modelling decisions (air has a high but finite resistivity, and the applied field determines the boundary condition) affect the current density distribution along the wire. According to the results, the wire carries the imposed net current only on the boundary of the modelling domain, but not inside it: the current diffuses into the air and back to the boundary. To fix this problem, we present another formulation in which air is treated as a region with zero conductivity. Correspondingly, we express H in the air with a scalar potential and a cohomology basis function that enforces the net current condition. As shown in this paper, this formulation does not fail in these so-called AC-AC (time-varying transport current and applied magnetic field) simulations. (paper)

  17. Modified Linear Theory Aircraft Design Tools and Sonic Boom Minimization Strategy Applied to Signature Freezing via F-function Lobe Balancing

    Science.gov (United States)

    Jung, Timothy Paul

    Commercial supersonic travel has strong business potential; however, in order for the Federal Aviation Administration to lift its ban on supersonic flight overland, designers must reduce aircraft sonic boom strength to an acceptable level. An efficient methodology and associated tools for designing aircraft for minimized sonic booms are presented. The computer-based preliminary design tool, RapidF, based on modified linear theory, enables quick assessment of an aircraft's sonic boom, with run times of less than 30 seconds on a desktop computer. A unique feature of RapidF is that it tracks where on the aircraft each segment of the sonic boom came from, enabling precise modifications and speeding the design process. Sonic booms from RapidF are compared to flight test data, showing that it is capable of predicting sonic boom duration, overpressure, and interior shock locations. After the preliminary design is complete, scaled flight tests should be conducted to validate the low-boom design. When conducting such tests, it is insufficient to scale just the length; thus, equations to scale the weight and propagation distance are derived. Using RapidF, a conceptual supersonic business jet design is presented that uses F-function lobe balancing to create a frozen sonic boom using lifting surfaces. The leading shock is reduced from 1.4 to 0.83 psf, and the trailing shock from 1.2 to 0.87 psf, 41% and 28% reductions respectively. By changing the incidence angle of the surfaces, different sonic boom shapes can be created, allowing the lobes to be re-balanced for new flight conditions. Computational fluid dynamics is conducted to validate the sonic boom predictions. Off-design analysis is presented that varies weight, altitude, Mach number, and propagation angle, demonstrating that lobe balancing is robust. Finally, the Perceived Level of Loudness metric is analyzed, resulting in a modified design that incorporates other boom minimization techniques to further reduce

  18. Spinoff 2003: 100 Years of Powered Flight

    Science.gov (United States)

    2003-01-01

    Today, NASA continues to reach milestones in space exploration with the Hubble Telescope, Earth-observing systems, the Space Shuttle, the Stardust spacecraft, the Chandra X-Ray Observatory, the International Space Station, the Mars rovers, and experimental research aircraft; these are only a few of the many initiatives that have grown out of NASA engineering know-how to drive the Agency's missions. The technical expertise gained from these programs has transferred into partnerships with academia, industry, and other Federal agencies, ensuring America stays capable and competitive. With Spinoff 2003, we once again highlight the many partnerships with U.S. companies that are fulfilling the 1958 Space Act stipulation that NASA's vast body of scientific and technical knowledge also benefit mankind. This year's issue showcases innovations such as the cochlear implant in health and medicine, a cockpit weather system in transportation, and a smoke mask benefiting public safety; many other products are featured in these disciplines, as well as in the additional fields of consumer/home/recreation, environment and resources management, computer technology, and industrial productivity/manufacturing technology. Also in this issue, we devote an entire section to NASA's history in the field of flight and showcase NASA's newest enterprise dedicated to education. The Education Enterprise will provide unique teaching and learning experiences for students and teachers at all levels in science, technology, engineering, and mathematics. The Agency also is committed, as never before, to engaging parents and families through NASA's educational resources, content, and opportunities. NASA's catalyst to intensify its focus on teaching and learning springs from our mission statement: to inspire the next generation of explorers as only NASA can.

  19. Lurpak: Ready for another 100 years?

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    2001-01-01

    The Lur mark - the forerunner and very foundation of Lurpak butter - celebrates its 100th anniversary this year. That is an unusual and impressive lifetime for a consumer goods brand and something the Danish dairy sector can be proud of.

  20. 100-Year UPS, "Swifter" Olympics

    Institute of Scientific and Technical Information of China (English)

    Guo Yan; Yang Wei

    2007-01-01

    As the official logistics and express delivery sponsor of the 2008 Beijing Olympics, UPS will manage all logistical operations at the Olympic Test Events (formally known as the "Good Luck Beijing Events") and the actual Games, through which the majority of equipment used at the events will flow.

  1. Leadership: reflections over the past 100 years.

    Science.gov (United States)

    Gregoire, Mary B; Arendt, Susan W

    2014-05-01

    Leadership, viewed by the American Dietetic Association as the ability to inspire and guide others toward building and achieving a shared vision, is a much written-about topic. Research on leadership has addressed the topic using many different approaches, from a very simplistic definition of traits to a more complex process involving interactions, emotions, and learning. Thousands of books and papers have been published on the topic of leadership. This review paper will provide examples of the varying foci of the writings on this topic and includes references for instruments used to measure leadership traits and behaviors. Research is needed to determine effective strategies for preparing dietitians to be effective leaders and assume leadership positions. Identifying ways to help dietitians better reflect on their leadership experiences to enhance their learning and leadership might be one strategy to explore.

  2. Appraising Schumpeter's "Essence" after 100 years

    DEFF Research Database (Denmark)

    Andersen, Esben Sloth

    The paper appraises Schumpeter's first book, 'Das Wesen und der Hauptinhalt der theoretischen Nationalökonomie'. This German-language book - which in English might be called 'Essence and Scope of Theoretical Economics' - was published a century ago (in 1908). Different readings of Wesen provide many clues about the emergence and structure of Schumpeter's programme for teaching. ... This reinterpretation helped him to sketch out his theory of economic business cycles as reflecting the waveform process of economic evolution under capitalism.

  3. 100 Years of General Theory of Relativity

    CERN Document Server

    2015-01-01

    The Symposium will celebrate the 100th anniversary of Einstein's four papers on General Relativity, which he submitted to the Preussische Akademie der Wissenschaften during November 1915. A review of the history of the creation of Einstein's masterpiece, from its roots in Bern, the important steps forward in Zurich and up to its completion in Berlin will be followed by an extensive overview covering the later developments up to present-day research. This will include discussions on the impact of the theory on our view of the universe as well as on progress in technology for everyday life.

  4. Analysis of 100 Years of Curriculum Designs

    Science.gov (United States)

    Kelting-Gibson, Lynn

    2013-01-01

    Fifteen historical and contemporary curriculum designs were analyzed for elements of assessment that support student learning and inform instructional decisions. Educational researchers are purposely paying attention to the role assessment plays in a well-designed planning and teaching process. Assessment is a vital component to educational…

  5. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations

    Science.gov (United States)

    Pinho, Ludmila A. G.; Sá-Barreto, Lívia C. L.; Infante, Carlos M. C.; Cunha-Filho, Marcílio S. S.

    2016-04-01

    The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at wavelengths of 259 and 321 nm, respectively. Diverse tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity to quantify small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures was shown to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form.
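    The two-wavelength procedure can be illustrated by solving the 2x2 Lambert-Beer system directly. The absorptivity values below are hypothetical placeholders, not the paper's calibration data:

    ```python
    import numpy as np

    # Hypothetical molar absorptivities (L/(mol*cm)) for BNZ and ITZ at the two
    # analytical wavelengths; real values must come from calibration standards.
    E = np.array([[12000.0, 3000.0],   # 259 nm: [eps_BNZ, eps_ITZ]
                  [1500.0,  9000.0]])  # 321 nm: [eps_BNZ, eps_ITZ]
    path_length_cm = 1.0

    def concentrations(a_259, a_321):
        """Solve the Beer-Lambert system A = (E * l) @ c for [c_BNZ, c_ITZ]."""
        A = np.array([a_259, a_321])
        return np.linalg.solve(E * path_length_cm, A)

    # Example: absorbances generated from known concentrations, then recovered
    c_true = np.array([1e-4, 5e-5])   # mol/L
    a_259, a_321 = E @ c_true
    c_est = concentrations(a_259, a_321)
    print(c_est)  # recovers the two concentrations up to rounding
    ```

    The same linear-solve generalizes to more wavelengths than components via least squares, which is how such assays gain robustness in practice.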

  6. Science serving people. IAEA-supported projects are helping countries apply the right tools to fight food, health, and water problems

    International Nuclear Information System (INIS)

    A new booklet 'Science Serving People' features stories about how IAEA-supported projects are making a difference in many poorer countries. The stories describe applications of nuclear science and technology that are being used through technical cooperation channels to overcome challenges of water scarcity, food shortage, malnutrition, malaria, environmental degradation and many other problems. They also illustrate how the complementary development, safety, and security initiatives of the IAEA are fostering atoms for peace in the developing world. Extreme poverty and deprivation remain a problem of monumental proportions at the dawn of the 21st century, notes IAEA Director General Mohamed ElBaradei in the booklet's Introduction. Through effective partnerships, collaborative research, and strategic direction, the IAEA is contributing to global efforts to help the poor. IAEA programmes have entered an important phase, he said, in which scientific contributions to Member States are yielding very sizeable human benefits. It is clear that science and technology must be better mobilized to meet the needs of the poor, emphasizes Jeffrey Sachs, Director of the Earth Institute at Columbia University, USA, and Special Advisor to UN Secretary-General Kofi Annan. The UN agencies, such as the IAEA, have a great role to play, he says in the booklet's Foreword. This is especially so, he points out, if they act as a bridge between the activities of advanced-country and developing-country scientific centres, and if they help to harness the advances of world science for the poor as well as the rich. The bottom line, he concludes, is that rich countries should expand support for those United Nations organizations that can help in solving the unique problems confronting the world's poorest peoples. The booklet features stories on managing water resources, promoting food security, focusing science on health problems, new tools for environmental management, and strengthening nuclear

  7. Solar geometry tool applied to systems and bio-climatic architecture; Herramienta de geometria solar aplicada a sistemas y arquitectura bio-climatica

    Energy Technology Data Exchange (ETDEWEB)

    Urbano, Antonio; Matsumoto, Yasuhiro; Aguilar, Jaime; Asomoza Rene [CIMVESTAV-IPN, Mexico, D.F (Mexico)

    2000-07-01

    This article presents annual solar path charts in Cartesian coordinates and explains their use, based on astronomical and geographical data for the site. The charts indicate the hours of sunshine through the day, month, and year for a latitude of 19 degrees north, as well as the hourly solar radiation values for the most important declinations occurring through the year (the equinoxes, the solstices, and the intermediate months). They help the user find an optimal location by evaluating obstacles in the surroundings and determining, on site, the shadows cast on solar equipment or buildings (mountains, trees, buildings, windows, terraces, domes, etc.), the hours of sunshine, or the radiation needed for the desired bio-climatic calculation. The work is a site-engineering tool for architects, designers, builders, planners, installers, and energy auditors, among others, who require the use of solar energy for any of its many applications.
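    The values behind such solar path charts follow from standard solar-position formulas. A minimal sketch for the 19 degrees north latitude used in the article, based on Cooper's declination approximation (illustrative only; refraction and the equation of time are ignored):

    ```python
    import math

    LAT = math.radians(19.0)  # site latitude, 19 degrees north

    def declination_deg(day_of_year):
        # Cooper's approximation for the solar declination angle
        return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

    def elevation_deg(day_of_year, solar_hour):
        """Solar elevation angle in degrees; solar_hour = 12.0 is solar noon."""
        delta = math.radians(declination_deg(day_of_year))
        hour_angle = math.radians(15.0 * (solar_hour - 12.0))
        sin_h = (math.sin(LAT) * math.sin(delta)
                 + math.cos(LAT) * math.cos(delta) * math.cos(hour_angle))
        return math.degrees(math.asin(sin_h))

    # Noon elevation on the June solstice (day ~172) at 19 N is close to
    # 90 - |19 - 23.45| = 85.5 degrees; on the December solstice (~day 355)
    # it drops to roughly 90 - (19 + 23.45) = 47.6 degrees.
    print(round(elevation_deg(172, 12.0), 1))
    print(round(elevation_deg(355, 12.0), 1))
    ```

    Sweeping `solar_hour` over the day and `day_of_year` over the year reproduces the Cartesian sun-path curves the article describes.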

  8. THE CASE STUDY TASKS AS A BASIS FOR THE FUND OF THE ASSESSMENT TOOLS AT THE MATHEMATICAL ANALYSIS FOR THE DIRECTION 01.03.02 APPLIED MATHEMATICS AND COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Dina Aleksandrovna Kirillova

    2015-12-01

    The modern reform of Russian higher education involves the implementation of a competence-based approach, the main idea of which is the practical orientation of education. Mathematics is a universal language for the description, modeling, and study of phenomena and processes of different natures; creating a fund of assessment tools for mathematical disciplines based on applied problems is therefore a current need. The case method is the most appropriate means of monitoring learning outcomes, as it is aimed at bridging the gap between theory and practice. The aim of the research is the development of methodical materials for creating a fund of assessment tools based on case studies for mathematical analysis in the programme 'Applied Mathematics and Computer Science'. The aim follows from the contradiction between the need to introduce the case method into the educational process in higher education and the lack of study of the theoretical foundations of using this method as applied to mathematical disciplines, including an insufficient theoretical basis and description of the process of creating case problems for use in monitoring learning outcomes.

  9. FAMUS (Flow Assurance by Management of Uncertainty and Simulation): a new tool for integrating flow assurance effects in traditional RAM (Reliability, Availability and Maintainability) analysis applied on a Norwegian Offshore System

    Energy Technology Data Exchange (ETDEWEB)

    Eisinger, Siegfried; Isaksen, Stefan; Grande, Oystein [Det Norske Veritas (DNV), Oslo (Norway); Chame, Luciana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Traditional RAM (Reliability, Availability and Maintainability) models fall short of taking flow assurance effects into account. In many oil and gas production systems, flow assurance issues like hydrate formation, wax deposition or particle erosion may cause a substantial number of production upsets. Flow assurance issues are complex and hard to quantify in a production forecast; however, without taking them into account the RAM model generally overestimates the predicted system production. This paper demonstrates the FAMUS concept, a method and a tool for integrating RAM and flow assurance into one model, providing a better foundation for decision support. FAMUS therefore utilises both discrete-event and thermo-hydraulic simulation. The method is currently applied as a decision-support tool in an early phase of the development of an offshore oil field on the Norwegian continental shelf. (author)
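    As a toy illustration of the point (not the FAMUS tool itself), a crude Monte Carlo shows how flow-assurance upsets lower the production estimate that a pure up/down availability model would give. All parameter values are invented:

    ```python
    import random

    random.seed(42)

    def simulate_production(days=365, n_runs=200,
                            mttf=90.0, mttr=5.0,     # equipment up/down, in days
                            upset_daily_p=0.01,      # chance of a flow-assurance upset
                            upset_loss=0.5):         # fraction of that day's output lost
        """Crude Monte Carlo of mean daily production efficiency.

        Equipment alternates exponentially distributed up and down periods
        (classical RAM); during uptime, random flow-assurance upsets (e.g.
        hydrate plugs) additionally cut part of a day's output.
        """
        totals = []
        for _ in range(n_runs):
            t, produced = 0.0, 0.0
            while t < days:
                up = random.expovariate(1.0 / mttf)
                for _ in range(int(min(up, days - t))):   # whole producing days
                    lost = upset_loss if random.random() < upset_daily_p else 0.0
                    produced += 1.0 - lost
                t += up + random.expovariate(1.0 / mttr)  # downtime for repair
            totals.append(produced / days)
        return sum(totals) / len(totals)

    eff = simulate_production()
    print(round(eff, 3))  # below the pure-RAM availability mttf/(mttf+mttr)
    ```

    In a real model the upset process would come from thermo-hydraulic simulation rather than a fixed daily probability, which is exactly the integration FAMUS provides.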

  10. Multivariate curve resolution applied to in situ X-ray absorption spectroscopy data: An efficient tool for data processing and analysis

    International Nuclear Information System (INIS)

    Highlights: • Use of MCR algorithms to extract component spectra of different kinetic evolution. • Obtaining components and concentration profiles without use of reference spectra. • Automatic extraction of meaningful component profiles from large XAS datasets. - Abstract: Large datasets containing many spectra, commonly associated with in situ or operando experiments, call for new data treatment strategies, as conventional scan-by-scan data analysis methods have become a time-consuming bottleneck. Several convenient automated data processing procedures, like least-squares fitting of reference spectra, exist but are based on assumptions. Here we present the application of multivariate curve resolution (MCR) as a blind-source separation method to efficiently process a large data set from an in situ X-ray absorption spectroscopy experiment in which the sample undergoes a periodic concentration perturbation. MCR was applied to data from a reversible reduction–oxidation reaction of a rhenium-promoted cobalt Fischer–Tropsch synthesis catalyst. The MCR algorithm was capable of extracting, in a highly automated manner, the component spectra with a different kinetic evolution together with their respective concentration profiles, without the use of reference spectra. The modulative nature of our experiments allows for averaging over a number of identical periods and hence an increase in the signal-to-noise ratio (S/N), which is efficiently exploited by MCR. The practical added value of the approach in extracting information from large and complex datasets, typical for in situ and operando studies, is highlighted.
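    The core of MCR is an alternating least-squares factorization of the data matrix into non-negative concentration profiles and component spectra. A bare-bones sketch on synthetic two-component data (real MCR-ALS toolboxes add closure/unimodality constraints and convergence tests; this is illustrative only):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mcr_als(D, k, n_iter=500):
        """Minimal MCR-ALS: factor D (time x energy) into C @ S with C, S >= 0,
        using clipped alternating least squares and no reference spectra."""
        C = rng.random((D.shape[0], k))
        for _ in range(n_iter):
            S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0.0, None)
            C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0.0, None)
        return C, S

    # Synthetic data mimicking a periodic redox modulation: two components
    # with opposite concentration trends and well-separated "spectra".
    t = np.linspace(0.0, 1.0, 60)
    c_true = np.stack([t, 1.0 - t], axis=1)
    e = np.linspace(0.0, 1.0, 120)
    s_true = np.stack([np.exp(-((e - 0.3) / 0.05) ** 2),
                       np.exp(-((e - 0.6) / 0.05) ** 2)])
    D = c_true @ s_true

    C, S = mcr_als(D, k=2)
    rel_err = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
    print(rel_err)  # small reconstruction residual
    ```

    The recovered rows of `S` and columns of `C` correspond to component spectra and concentration profiles up to scaling and permutation, the usual ambiguities of blind-source separation.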

  11. FoodChain-Lab: A Trace-Back and Trace-Forward Tool Developed and Applied during Food-Borne Disease Outbreak Investigations in Germany and Europe.

    Science.gov (United States)

    Weiser, Armin A; Thöns, Christian; Filter, Matthias; Falenski, Alexander; Appel, Bernd; Käsbohrer, Annemarie

    2016-01-01

    FoodChain-Lab is modular open-source software for trace-back and trace-forward analysis in food-borne disease outbreak investigations. Development of FoodChain-Lab has been driven by a need for appropriate software in several food-related outbreaks in Germany since 2011. The software allows integrated data management, data linkage, enrichment and visualization as well as interactive supply chain analyses. Identification of possible outbreak sources or vehicles is facilitated by calculation of tracing scores for food-handling stations (companies or persons) and food products under investigation. The software also supports consideration of station-specific cross-contamination, analysis of geographical relationships, and topological clustering of the tracing network structure. FoodChain-Lab has been applied successfully in previous outbreak investigations, for example during the 2011 EHEC outbreak and the 2013/14 European hepatitis A outbreak. The software is most useful in complex, multi-area outbreak investigations where epidemiological evidence may be insufficient to discriminate between multiple implicated food products. The automated analysis and visualization components would be of greater value if trading information on food ingredients and compound products was more easily available. PMID:26985673
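    A naive sketch of the trace-back idea (not FoodChain-Lab's actual scoring algorithm): score each station by the fraction of outbreak locations it can reach through the delivery network. The network below is hypothetical:

    ```python
    # Hypothetical delivery network: supplier -> receivers (direction of goods flow)
    deliveries = {
        "FarmA": ["Wholesaler1"],
        "FarmB": ["Wholesaler1", "Wholesaler2"],
        "Wholesaler1": ["CanteenX", "CanteenY"],
        "Wholesaler2": ["CanteenY", "CanteenZ"],
    }
    outbreak_sites = {"CanteenX", "CanteenY"}  # locations with reported cases

    def reachable(start):
        """All stations downstream of `start` (iterative depth-first search)."""
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            for nxt in deliveries.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

    # Naive tracing score: fraction of outbreak sites a station can supply.
    scores = {s: len(reachable(s) & outbreak_sites) / len(outbreak_sites)
              for s in deliveries}
    print(scores)
    ```

    Stations that can reach every outbreak site score 1.0 and become trace-back candidates; real investigations additionally weight delivery dates, lot numbers, and cross-contamination, as the abstract notes.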

  12. Management Tools

    Science.gov (United States)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool, originating from the Space Shuttle program, that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  13. How credible are the study results? Evaluating and applying internal validity tools to literature-based assessments of environmental health hazards.

    Science.gov (United States)

    Rooney, Andrew A; Cooper, Glinda S; Jahnke, Gloria D; Lam, Juleen; Morgan, Rebecca L; Boyles, Abee L; Ratcliffe, Jennifer M; Kraft, Andrew D; Schünemann, Holger J; Schwingl, Pamela; Walker, Teneille D; Thayer, Kristina A; Lunn, Ruth M

    2016-01-01

    Environmental health hazard assessments are routinely relied upon for public health decision-making. The evidence base used in these assessments is typically developed from a collection of diverse sources of information of varying quality. It is critical that literature-based evaluations consider the credibility of individual studies used to reach conclusions through consistent, transparent and accepted methods. Systematic review procedures address study credibility by assessing internal validity or "risk of bias" - the assessment of whether the design and conduct of a study compromised the credibility of the link between exposure/intervention and outcome. This paper describes the commonalities and differences in risk-of-bias methods developed or used by five groups that conduct or provide methodological input for performing environmental health hazard assessments: the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) Working Group, the Navigation Guide, the National Toxicology Program's (NTP) Office of Health Assessment and Translation (OHAT) and Office of the Report on Carcinogens (ORoC), and the Integrated Risk Information System of the U.S. Environmental Protection Agency (EPA-IRIS). Each of these groups have been developing and applying rigorous assessment methods for integrating across a heterogeneous collection of human and animal studies to inform conclusions on potential environmental health hazards. There is substantial consistency across the groups in the consideration of risk-of-bias issues or "domains" for assessing observational human studies. There is a similar overlap in terms of domains addressed for animal studies; however, the groups differ in the relative emphasis placed on different aspects of risk of bias. Future directions for the continued harmonization and improvement of these methods are also discussed. PMID:26857180

  14. The "Musical Ningboese" and the Emergence of China's Piano Industry in the Past 100 Years

    Institute of Scientific and Technical Information of China (English)

    沈浩杰

    2015-01-01

    Without the piano, there would have been no development of China's new music over the past 100 years. The people of Ningbo, among the first Chinese to engage in piano making, took advantage of their own strengths and their environment to lead all-round development in piano-related fields such as the production of parts and complete instruments, marketing, research and development, talent cultivation, and piano education and service, and they have remained leaders throughout the 100 years in which Chinese piano making has grown from a small, weak industry into a large, strong one. This large and accomplished group of piano makers, together with the other professional musicians of Ningbo origin, forms the "Musical Ningboese", a group unique in the modern musical history of China.

  15. 1,3:2,4-Dibenzylidene-D-sorbitol (DBS) and its derivatives--efficient, versatile and industrially-relevant low-molecular-weight gelators with over 100 years of history and a bright future.

    Science.gov (United States)

    Okesola, Babatunde O; Vieira, Vânia M P; Cornwell, Daniel J; Whitelaw, Nicole K; Smith, David K

    2015-06-28

    Dibenzylidene-D-sorbitol (DBS) has been a well-known low-molecular-weight gelator of organic solvents for over 100 years. As such, it constitutes a very early example of a supramolecular gel--a research field which has recently developed into one of intense interest. The ability of DBS to self-assemble into sample-spanning networks in numerous solvents is predicated upon its 'butterfly-like' structure, whereby the benzylidene groups constitute the 'wings' and the sorbitol backbone the 'body'--the two parts representing the molecular recognition motifs underpinning its gelation mechanism, with the nature of the solvent playing a key role in controlling the precise assembly mode. This gelator has found widespread applications in areas as diverse as personal care products and polymer nucleation/clarification, and has considerable potential in applications such as dental composites, energy technology and liquid crystalline materials. Some derivatives of DBS have also been reported which offer the potential to expand the scope and range of applications of this family of gelators and endow the nanoscale network with additional functionality. This review aims to explain current trends in DBS research, and provides insight into how, by combining a long history of application with modern methods of derivatisation and analysis, the future for this family of gelators is bright, with an increasing number of high-tech applications, from environmental remediation to tissue engineering, being within reach.

  16. Review and Preview of the Development of the World Women's 100 m Running over Nearly 100 Years

    Institute of Scientific and Technical Information of China (English)

    王刚; 辛飞庆; 辛飞兵

    2014-01-01

    This paper aims to understand and track the changes in the world women's 100 m running over nearly 100 years, and explores its current and future development trends using the methods of literature review, mathematical statistics, and comparison. The results show that the Olympic Games women's record has been repeatedly broken, that American sprinters still lead the world, that black sprinters have won the majority of Olympic women's 100 m titles, and that performances have reached the limit determined by the physiological structure of the female.

  17. Correlation Analysis Between the El Nino/La Nina Phenomenon During the Recent 100 Years and Beijing Climate

    Institute of Scientific and Technical Information of China (English)

    刘桂莲; 张明庆

    2001-01-01

    Results of the analysis suggest that during the recent 100 years there has been a strong correlation between the El Nino/La Nina phenomenon and Beijing's summer rainfall (June-August), mean monthly maximum temperature (July), and winter mean monthly minimum temperature (January). The El Nino phenomenon shows a negative correlation with summer rainfall and the winter mean monthly minimum temperature, and a positive correlation with the summer mean monthly maximum temperature, producing reduced rainfall, a larger annual temperature range, and a more continental climate. The La Nina phenomenon shows a positive correlation with summer rainfall and the winter mean monthly minimum temperature, and a negative correlation with the summer mean monthly maximum temperature, producing increased rainfall, a smaller annual temperature range, and a less continental climate.

  18. Applied Electromagnetics

    International Nuclear Information System (INIS)

    These proceedings contain papers relating to the 3rd Japanese-Bulgarian-Macedonian Joint Seminar on Applied Electromagnetics. Included are the following groups: Numerical Methods I; Electrical and Mechanical System Analysis and Simulations; Inverse Problems and Optimizations; Software Methodology; Numerical Methods II; Applied Electromagnetics

  19. Applying LEGO Mindstorms robots as a teaching tool in Agricultural Information and Automation education

    Institute of Scientific and Technical Information of China (English)

    冯雷; 郭亚芳; 王若青; 沈明卫; 何勇

    2012-01-01

    The objective is to present the details of a LEGO Mindstorms robot design challenge arranged for agricultural and biosystems engineering students at Zhejiang University. As a new teaching tool in courses on agricultural information and automation technology, such as precision agriculture, different groups of students built robotic agricultural machine models using LEGO Mindstorms kits, with Robolab as the programming environment, to foster innovation and self-directed, inquiry-based learning. A survey among 30 students showed that all students were challenged by the projects and were highly satisfied with the outcomes. They strongly agreed that the projects were effective in helping them work in teams, apply problem-solving techniques, and boost their programming and mechanical design skills.

  20. Applied superconductivity

    CERN Document Server

    Newhouse, Vernon L

    1975-01-01

    Applied Superconductivity, Volume II, is part of a two-volume series on applied superconductivity. The first volume dealt with electronic applications and radiation detection, and contains a chapter on liquid helium refrigeration. The present volume discusses magnets, electromechanical applications, accelerators, and microwave and rf devices. The book opens with a chapter on high-field superconducting magnets, covering applications and magnet design. Subsequent chapters discuss superconductive machinery such as superconductive bearings and motors; rf superconducting devices; and future prospects.

  1. Applied mathematics

    CERN Document Server

    Logan, J David

    2013-01-01

    Praise for the Third Edition: "Future mathematicians, scientists, and engineers should find the book to be an excellent introductory text for coursework or self-study as well as worth its shelf space for reference." -MAA Reviews. Applied Mathematics, Fourth Edition is a thoroughly updated and revised edition on the applications of modeling and analyzing natural, social, and technological processes. The book covers a wide range of key topics in mathematical methods and modeling, and highlights the connections between mathematics and the applied and natural sciences.

  2. Philosophical Foundation of Chinese Modernity and the Development of Chinese Philosophy in the Recent 100 Years

    Institute of Scientific and Technical Information of China (English)

    沈清松

    2013-01-01

    Focusing on the problematics of Chinese modernity and its philosophical foundation, this paper divides the development of philosophy in China over the recent 100 years into four periods. In the first period, from 1911 to 1927, philosophers in China were the most enthusiastic in introducing Western modernity and its philosophical foundations through various forms and doctrines of modern Western philosophy. This period was the most progressive and, indeed, impacted all Chinese intellectuals at the time. In the second period, from 1928 to 1949, during the time of national construction and the Japanese invasion, philosophers stepped back, serving as helpers in clarifying and articulating the Chinese spirit and Chinese subjectivity at the time of its awakening. This prepared the philosophical foundation for a Chinese model of modernity. In the third period, from 1949 to 1980, some major philosophical figures built philosophical systems synthesizing Western and Chinese philosophies. This was most precious, and indeed unique, in comparison with other disciplines in the humanities and the social sciences. These philosophical systems can be seen as diverse attempts to lay the philosophical foundation of Chinese modernity. Now, in the later part of the fourth period, Chinese philosophy faces the challenges of globalization and postmodernity; it should continue to explore conceptual systems, the life-world, and Chinese spiritual resources in a cross-cultural and worldwide context, pursuing philosophical reflection that is critical, reasonable, and attentive to the whole.

  3. Geometric reasoning about assembly tools

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, R.H.

    1997-01-01

    Planning for assembly requires reasoning about various tools used by humans, robots, or other automation to manipulate, attach, and test parts and subassemblies. This paper presents a general framework to represent and reason about geometric accessibility issues for a wide variety of such assembly tools. Central to the framework is a use volume encoding a minimum space that must be free in an assembly state to apply a given tool, and placement constraints on where that volume must be placed relative to the parts on which the tool acts. Determining whether a tool can be applied in a given assembly state is then reduced to an instance of the FINDPLACE problem. In addition, the author presents more efficient methods to integrate the framework into assembly planning. For tools that are applied either before or after their target parts are mated, one method pre-processes a single tool application for all possible states of assembly of a product in polynomial time, reducing all later state-tool queries to evaluations of a simple expression. For tools applied after their target parts are mated, a complementary method guarantees polynomial-time assembly planning. The author presents a wide variety of tools that can be described adequately using the approach, and surveys tool catalogs to determine coverage of standard tools. Finally, the author describes an implementation of the approach in an assembly planning system and experiments with a library of over one hundred manual and robotic tools and several complex assemblies.
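    The use-volume test can be sketched with axis-aligned boxes: a tool is applicable in a given assembly state only if its use volume intersects no placed part. This is a simplified stand-in for the paper's framework (which handles general geometry and placement constraints), with invented geometry:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Box:
        """Axis-aligned box given by its min and max corners (x, y, z)."""
        lo: tuple
        hi: tuple

        def intersects(self, other):
            # Open-interval overlap test on all three axes
            return all(self.lo[i] < other.hi[i] and other.lo[i] < self.hi[i]
                       for i in range(3))

    def tool_applicable(use_volume, placed_parts):
        """A tool can be applied if its use volume is free of all placed parts."""
        return not any(use_volume.intersects(p) for p in placed_parts)

    # Hypothetical state: a wrench needs a clearance box above a bolt.
    wrench_volume = Box((0, 0, 1), (1, 1, 3))
    state_bolt_only = [Box((0, 0, 0), (1, 1, 1))]                  # volume is free
    state_with_cover = state_bolt_only + [Box((0, 0, 2), (2, 2, 4))]  # cover blocks it

    print(tool_applicable(wrench_volume, state_bolt_only))   # True
    print(tool_applicable(wrench_volume, state_with_cover))  # False
    ```

    Pre-computing which assembly states leave each tool's volume free, as the paper does for tools applied before or after their target mates, turns the per-state query into a cheap lookup.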

  4. Applied dynamics

    CERN Document Server

    Schiehlen, Werner

    2014-01-01

    Applied Dynamics is an important branch of engineering mechanics widely applied to mechanical and automotive engineering, aerospace and biomechanics as well as control engineering and mechatronics. The computational methods presented are based on common fundamentals. For this purpose analytical mechanics turns out to be very useful where D’Alembert’s principle in the Lagrangian formulation proves to be most efficient. The method of multibody systems, finite element systems and continuous systems are treated consistently. Thus, students get a much better understanding of dynamical phenomena, and engineers in design and development departments using computer codes may check the results more easily by choosing models of different complexity for vibration and stress analysis.

  5. Applied optics

    International Nuclear Information System (INIS)

    The 1988 progress report of the Applied Optics laboratory of the Polytechnic School (France) is presented. The optical fiber activities are focused on the development of an optical gyrometer containing a resonance cavity. The research program includes the following domains: infrared laser physics, laser sources, semiconductor physics, multiple-photon ionization, and nonlinear optics. Investigations in the biomedical, biological, and biophysical domains are also carried out. The published papers and conference communications are listed.

  6. The Two-time Rise of Australian Competitive Sport in the 100 Years of the Olympic Games

    Institute of Scientific and Technical Information of China (English)

    浦义俊; 吴贻刚

    2014-01-01

    This study examined the process of the two-time rise of Australian competitive sport in the 100 years of the Olympic Games and the factors contributing to its success, using the methods of literature review, mathematical statistics, and logical analysis. First, we conducted a stage division of the 100 years of Australian Olympic competition. We then compared the event distributions of the two rises based on the characteristics of medal distribution. After comparing the first and second rises in terms of internal and external environment, we focused on the elements that contributed to the second rise of Australian competitive sport. These elements include the alternation of the ruling party and changes in its governing conceptions, which created an important precondition for the rebirth of competitive sport; the federal government's thorough policy design and growing financial support for competitive sport, the key to its return to the top; the maturing management system of competitive sport, an important safeguard for the second rise; and the deep fusion of the Australian national character with sport, an important foundation of its renewed success. The study concludes with inspirations that can be applied in China's sport context.

  7. Disclosure as a regulatory tool

    DEFF Research Database (Denmark)

    Sørensen, Karsten Engsig

    2006-01-01

    The chapter analyses how disclosure can be used as a regulatory tool and how it has been applied so far in the areas of financial market law and consumer law.

  8. Applied mathematics

    International Nuclear Information System (INIS)

    The 1988 progress report of the Applied Mathematics Center (Polytechnic School, France) is presented. The research fields of the Center are scientific computation, probability and statistics, and video image synthesis. The research topics developed are: the analysis of numerical methods, the mathematical analysis of fundamental models in physics and mechanics, the numerical solution of complex models related to industrial problems, stochastic calculus and Brownian motion, stochastic partial differential equations, the identification of adaptive filtering parameters, discrete element systems, statistics, stochastic control, and image synthesis techniques for education and research programs. The published papers, congress communications, and theses are listed

  9. Applied geodesy

    International Nuclear Information System (INIS)

    This volume is based on the proceedings of the CERN Accelerator School's course on Applied Geodesy for Particle Accelerators held in April 1986. The purpose was to record and disseminate the knowledge gained in recent years on the geodesy of accelerators and other large systems. The latest methods for positioning equipment to sub-millimetric accuracy in deep underground tunnels several tens of kilometers long are described, as well as such sophisticated techniques as the Navstar Global Positioning System and the Terrameter. Automation of better known instruments such as the gyroscope and Distinvar is also treated along with the highly evolved treatment of components in a modern accelerator. Use of the methods described can be of great benefit in many areas of research and industrial geodesy such as surveying, nautical and aeronautical engineering, astronomical radio-interferometry, metrology of large components, deformation studies, etc

  10. Downhole tool adapted for telemetry

    Science.gov (United States)

    Hall, David R.; Fox, Joe

    2010-12-14

    A cycleable downhole tool, such as a jar, a hydraulic hammer, or a shock absorber, adapted for telemetry. This invention applies to other tools where the active components of the tool are displaced when the tool is rotationally or translationally cycled. The invention consists of inductive or contact transmission rings that are connected by an extensible conductor. The extensible conductor permits the transmission of the signal before, after, and during the cycling of the tool. The signal may be continuous or intermittent during cycling. The invention also applies to downhole tools that do not cycle, but in operation are under such stress that an extensible conductor is beneficial. The extensible conductor may also consist of an extensible portion and a fixed portion. The extensible conductor also features clamps that maintain the conductor under stresses greater than those seen by the tool, and seals that are capable of protecting against downhole pressure and contamination.

  11. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    resistance against abrasive wear, and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide-forming elements (typically vanadium, tungsten, molybdenum, and chromium); furthermore, some steel types contain cobalt. Addition of alloying elements...

  12. TS Tools

    Directory of Open Access Journals (Sweden)

    Yvette Linders

    2012-12-01

    Full Text Available In this installment of TS Tools, doctoral candidate Yvette Linders (Radboud Universiteit Nijmegen) shows how software for qualitative data analysis can be applied in research on literary criticism.

  13. 100 years' evolution of fisheries higher education and its strategic transformation in China

    Institute of Scientific and Technical Information of China (English)

    宁波

    2011-01-01

    In the early 20th century, in order to safeguard the country's maritime rights and interests and to develop the national fisheries industry, the governments of the Qing Dynasty and the Republic of China, learning from the experiences of Japan, the United States, and other countries, began to develop China's fisheries education. After the founding of new China, Shanghai Fisheries College and other fisheries colleges were founded in succession from 1952. In the 1950s and 1960s, drawing on the experience of the Soviet Union, the system of fisheries higher education was established in China. After decades of development, fisheries higher education had made great achievements and contributed greatly to the development of the fisheries industry in China. In the late 20th century, to meet the needs of marine industry development, the self-development needs of fisheries higher education, and the needs of building a modern marine society, the fisheries colleges and universities changed their names to marine universities and transformed from single-discipline into multi-disciplinary marine universities. The transformation has promoted the development of marine higher education in China. For the effective development of marine higher education, it is suggested to take a higher starting point and the road of international education, to develop basic and applied sciences in a coordinated way, to further optimize the structure of marine higher education, and to build a good three-dimensional linkage mechanism between the government, marine universities, and society.

  14. Applied ALARA techniques

    Energy Technology Data Exchange (ETDEWEB)

    Waggoner, L.O.

    1998-02-05

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down and Hanford was given a new mission to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread, and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work.

  15. Management Tools in Engineering Education.

    Science.gov (United States)

    Fehr, M.

    1999-01-01

    Describes a teaching model that applies management tools such as delegation, total quality management, time management, teamwork, and Deming rules. Promotes the advantages of efficiency, reporting, independent scheduling, and quality. (SK)

  16. Safe Storage Tools Designed and Applied for Deformation Associated Core Components

    Institute of Scientific and Technical Information of China (English)

    石中华; 邓志新; 张旭辉; 王玲彬

    2015-01-01

    Because the shape of a deformation-associated core component is changed, it cannot be inserted into a fuel assembly or storage rack and can only be placed temporarily in an empty cell of the spent fuel pool storage rack. A deformation-associated core component in this state loses its support; its rods bend under their own weight and may become damaged after a long time in this state, so that the material inside leaks and contaminates the spent fuel pool. Therefore, a tool had to be designed that can safely store deformation-associated core components. Qinshan Phase II was selected as the test site for the whole development process, with three groups of deformation-associated core components (a primary neutron source assembly, a burnable poison assembly, and a rod cluster control assembly) in its spent fuel pool as development objects. Ultimately, a set of tools suitable for the safe storage of deformation-associated core components was developed, ensuring the integrity of the components.

  17. Alternative affinity tools: more attractive than antibodies?

    NARCIS (Netherlands)

    Ruigrok, V.J.B.; Levisson, M.; Eppink, M.H.M.; Smidt, H.; Oost, van der J.

    2011-01-01

    Antibodies are the most successful affinity tools used today, in both fundamental and applied research (diagnostics, purification and therapeutics). Nonetheless, antibodies do have their limitations, including high production costs and low stability. Alternative affinity tools based on nucleic acids

  18. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  19. Applied investigation on multimedia interactive teaching system of numerical control machine tools

    Institute of Scientific and Technical Information of China (English)

    施立钦

    2014-01-01

    The multimedia interactive teaching system for numerical control machine tool training consists of multiple subsystems, including a high-definition demonstration teaching system, a video monitoring system, a video display system, a voice intercom system, a centralized control system, and course recording, integrated into an organic whole through background software. The system addresses many problems currently facing numerical control technology programs in higher vocational colleges, such as lagging teaching methods, poor practical-training results, persistent safety hazards, and insufficient teacher resources. A new teaching mode for multimedia interactive video systems in numerical control machine tool training was thus explored.

  20. Nigeria Anopheles vector database: an overview of 100 years' research.

    Directory of Open Access Journals (Sweden)

    Patricia Nkem Okorie

    Full Text Available Anopheles mosquitoes are important vectors of malaria and lymphatic filariasis (LF), which are major public health diseases in Nigeria. Malaria is caused by infection with a protozoan parasite of the genus Plasmodium, and LF by the parasitic worm Wuchereria bancrofti. Updating our knowledge of the Anopheles species is vital in planning and implementing evidence-based vector control programs. To present a comprehensive report on the spatial distribution and composition of these vectors, all published data available were collated into a database. Details recorded for each source were the locality, latitude/longitude, time/period of study, species, abundance, sampling/collection methods, morphological and molecular species identification methods, insecticide resistance status, including evidence of the kdr allele, and P. falciparum sporozoite rate and W. bancrofti microfilaria prevalence. This collation resulted in a total of 110 publications, encompassing 484,747 Anopheles mosquitoes in 632 spatially unique descriptions at 142 georeferenced locations identified across Nigeria from 1900 to 2010. Overall, the most frequently reported vector species were the An. gambiae complex (65.2%), the An. funestus complex (17.3%), An. gambiae s.s. (6.5%), An. arabiensis (5.0%), and An. funestus s.s. (2.5%), with the molecular forms An. gambiae M and S identified at 120 locations. A variety of sampling/collection and species identification methods were used, with an increase in molecular techniques in recent decades. Insecticide resistance to pyrethroids and organochlorines was found in the main Anopheles species across 45 locations. Presence of P. falciparum and W. bancrofti varied between species, with the highest sporozoite rates found in An. gambiae s.s., An. funestus s.s., and An. moucheti, and the highest microfilaria prevalence in An. gambiae s.l., An. arabiensis, and An. gambiae s.s.
This comprehensive geo-referenced database provides an essential baseline on Anopheles vectors and will be an important resource for malaria and LF vector control programmes in Nigeria.
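The per-source details the review describes collating lend themselves to a simple flat record structure. As a minimal sketch (the field names below are assumptions for illustration, not the database's actual schema):

```python
from dataclasses import dataclass
from typing import List

# Sketch of one georeferenced source record: locality, coordinates, period,
# species, abundance, methods, and resistance/infection indicators.
# Field names are assumed, not taken from the actual database.
@dataclass
class VectorRecord:
    locality: str
    latitude: float
    longitude: float
    period: str
    species: str
    abundance: int
    sampling_method: str
    kdr_allele_present: bool
    sporozoite_rate: float  # P. falciparum sporozoite rate, as a proportion

def records_for_species(records: List[VectorRecord], species: str) -> List[VectorRecord]:
    """Select the georeferenced records for one species, e.g. for mapping."""
    return [r for r in records if r.species == species]
```

A flat record of this kind makes it straightforward to summarize species composition, map kdr or sporozoite status by location, or filter by period for trend analysis.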

  1. Developing Resilient Children: After 100 Years of Montessori Education

    Science.gov (United States)

    Drake, Meg

    2008-01-01

    In this millennium, educators are faced with a number of issues that Dr. Maria Montessori could not have predicted. Today, students are different from the children Dr. Montessori observed in her "Casa dei Bambini." They are influenced by technology in all its forms. Some suffer from medical problems such as complex food allergies, which wreak…

  2. I. P. PAVLOV: 100 YEARS OF RESEARCH ON ASSOCIATIVE LEARNING

    Directory of Open Access Journals (Sweden)

    GERMÁN GUTIÉRREZ

    2005-07-01

    Full Text Available A biographical summary of Ivan Pavlov is presented, emphasizing his academic formation and achievements, and his contributions to general science and psychology. His main findings on associative learning are described, and three areas of current development in this area are discussed: the study of behavioral mechanisms, the study of neurobiological mechanisms, and the functional role of learning.

  3. 100 years of refrigeration engineering in the chemical industry

    Energy Technology Data Exchange (ETDEWEB)

    Dietrich, K.

    1987-11-01

    The report reviews the many uses of cold in the chemical industry. As an example, the development of the refrigeration system of a large chemical plant is described which distributes cold nearly without losses from a central refrigeration unit and can be controlled for optimum adaptation to the various chemical production processes. The contribution of refrigeration to higher product quality, higher product yield, materials recycling to save feedstocks, energy conservation by means of heat pumps, and environmental protection is pointed out.

  4. The Journal de Radiologie is 100 years old

    International Nuclear Information System (INIS)

    In January 1914, the first edition of Le Journal de Radiologie et d'Electrologie, a monthly medical review, was published by Masson. It was organized by a committee of ten members, whose general secretary was J. Belot. The members of the committee were the pioneers of radiology in France at the time and remained leaders in the field for three decades. The relationships between the Journal and the Societe de Radiologie are obvious: J. Belot was president of the Society and remained so until 1920, G. Haret, a committee member, was the general secretary of the Society from 1909 and remained so until 1928. The Journal at the time did not claim to be national and all of its committee members were Parisian. There was, nonetheless, an impressive list of contributors from throughout France and from several foreign countries. The table of contents of this first edition clearly shows the many skills of these first generation radiologists who saw themselves not only as radio-diagnosticians, radiotherapists, and electrologists, but also as physicians. From this first edition, the ambitions of the Journal's chiefs were clear: unquestionable competence and the need for research, and the importance of innovation. Certainly the founders of this new journal saw themselves as the masters of French radiology, but they were nevertheless wide open to the world. The Journal reported several European meetings, book reviews and articles, particularly from Germany and Austria, but also from England, Belgium, Italy, Cuba, Egypt, the Philippines and the United States. America, however, was very far from being preeminent in 1914. French radiology books also began to be published. Radiology was everywhere. 
Above all, however was the advertising, promoting the products of innumerable companies producing radiology instrumentation: in the following decades, these merged and concentrated until the French radiology industry had almost completely disappeared after a merger with an American company. The dangers of X-rays were already well known in 1914 and there was a full understanding of the need for protection. Until August 1914, the Journal continued in this same spirit of quality and accuracy to examine all of the fields of radiology and electrology. The Journal supplements generally attracted considerable interest. Whereas the content of the Journal itself was purely scientific, the supplements were useful for radiologists at conferences and meetings and contained large numbers of classified advertisement and advertising. They therefore allowed the reader of the day to connect the radiology community to the neighboring world. From August 1914 to March 1915, the results of radiology's contribution to the Army Health Service was remarkable. The main subject remained battlefield radiology and very soon the decision was made to equip radiology vehicles. A new development was the automobile ambulances. In the navy, hospital boats equipped with radiology instruments were used to examine and treat the injured evacuated from Flandres or the Dardanelles. Collaboration between radiologists and surgeons, eye and hand, had therefore become unquestioned. Their respective roles were clearly described in the Journal de Radiologie articles. However, subjects other than battlefield radiology were not neglected. Radiology acquired legitimacy and its merits were no longer questioned. It advanced enormously: new methods and new instruments were born and a new organization was developed, which radiologists took advantage of to defend their points of view

  5. Media Storytelling, Curriculum, and the Next 100 Years

    Science.gov (United States)

    Lipschultz, Jeremy Harris

    2012-01-01

    Journalism as an academic field in the United States has frequently changed and grown through new professions and new industries coming under its umbrella (sometimes but not always driven by technological and/or economic changes) and academic developments such as cultural studies and media studies. But journalism is still rooted in good…

  6. 100 Years of the Quantum - the Glory and the Shame

    Science.gov (United States)

    Wheeler, John Archibald

    2001-03-01

    What is the greatest mystery that stands out on the books of physics today? Every one of us will have a different answer. Some will say it is the structure of the elementary particles. Others ask in what form is the mass or the attraction that has held the universe together thus far against perpetual expansion. Others will think of the November 11, 1999 dedication of America's first 6 kilometer by 6 kilometer gravity wave detector and will ask what information it will bring us about events going on in the deep and secret places of the universe. My own hope is that we'll see in time to come the answer to a question now almost a century old, "How come the quantum?" Gregory Breit would have sympathized with the investigations at each of these fronts, and could surely have filled us in on what the pioneers thought and said about the question, "How come the quantum?" ...

  7. 100 years of educational reforms in Europe: a contextual database

    OpenAIRE

    Garrouste, Christelle

    2010-01-01

    This report presents the macro data on educational reforms collected for the Survey on Health, Ageing and Retirement in Europe (SHARE). It is divided in two sections. The first and core part provides an analytical overview of the educational reforms that may have affected the skill level of Europe's elderly population. More specifically, it targets the national institutional plans or movements that have brought (or attempted to bring) systemic change in educational practices during the last ...

  8. [100 years of surgery in the Kosevo Hospital in Sarajevo].

    Science.gov (United States)

    Durić, O

    1994-01-01

    The Surgery Department of the Regional Hospital was opened on 1 July 1894 in Sarajevo, marking the beginning of the European surgical school's influence there. The school was in the second half of its activity, better known as the "century of surgery". The building, fittings, equipment, and staff continued their work here, following the achievements of the Viennese school. The Department was headed by the prominent European surgeon, primarius Dr Josef Preindisberger, first assistant to the great Dr Billroth. In this way the institution became a referral centre for two other hospitals in Sarajevo, the Vakuf's and the Military Hospital, as well as for some 17 more hospitals in Bosnia and Herzegovina built over the course of ten years. Because of the therapeutic success in the domain of general surgery and diseases of the eye, and according to the annual reports, the first 50 beds became insufficient for all those who sought treatment. The Department was therefore enlarged, and in 1905 a new regional hospital was planned, to act as clinics. World War 1 stopped the plans. During the period of the Kingdom of Yugoslavia, ravaged by war, the Surgery Department continued its work with doctors educated to carry on at the pre-war level. Given the broad base of pathology and the need for space, the then chief surgeon, primarius Milivoje Kostić, worked out in detail the earlier plan for the new hospital building with a base for clinics. It was accepted as a ten-year project which, regrettably, did not come into existence before World War 2.(ABSTRACT TRUNCATED AT 250 WORDS)

  9. 100 YEARS OF AUDI

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Since entering China in 1988, Audi has been ranked "China Auto Customer Service Satisfaction" champion and "China Auto Sales Satisfaction" champion ten times by J.D. Power, meanwhile becoming the first luxury car brand in the Chinese market to exceed one million units in cumulative sales. This success is due not only to the formidable technical background behind the idea of "Innovation through Technology," but also to the positive interaction between its brand planning and local users. On March 21, the "Heartbeat Shanghai" Audi art exhibition opened at the Shanghai Gallery of Art; its curator, Gu Zhenqing, was born in Shanghai.

  10. Global change and water resources in the next 100 years

    Science.gov (United States)

    Larsen, M. C.; Hirsch, R. M.

    2010-03-01

    We are in the midst of a continental-scale, multi-year experiment in the United States, in which we have not defined our testable hypotheses or set the duration and scope of the experiment, which poses major water-resources challenges for the 21st century. What are we doing? We are expanding population at three times the national growth rate in our most water-scarce region, the southwestern United States, where water stress is already great and modeling predicts decreased streamflow by the middle of this century. We are expanding irrigated agriculture from the west into the east, particularly to the southeastern states, where increased competition for ground and surface water has urban, agricultural, and environmental interests at odds, and increasingly, in court. We are expanding our consumption of pharmaceutical and personal care products to historic high levels and disposing them in surface and groundwater, through sewage treatment plants and individual septic systems. These substances are now detectable at very low concentrations and we have documented significant effects on aquatic species, particularly on fish reproduction function. We don’t yet know what effects on human health may emerge, nor do we know if we need to make large investments in water treatment systems, which were not designed to remove these substances. These are a few examples of our national-scale experiment. In addition to these water resources challenges, over which we have some control, climate change models indicate that precipitation and streamflow patterns will change in coming decades, with western mid-latitude North America generally drier. We have already documented trends in more rain and less snow in western mountains. This has large implications for water supply and storage, and groundwater recharge. We have documented earlier snowmelt peak spring runoff in northeastern and northwestern States, and western montane regions. 
Peak runoff is now about two weeks earlier than it was in the first half of the 20th century. Decreased summer runoff affects water supply for agriculture, domestic water supply, cooling needs for thermoelectric power generation, and ecosystem needs. In addition to the reduced volume of streamflow during warm summer months, less water results in elevated stream temperature, which also has significant effects on cooling of power generating facilities and on aquatic ecosystem needs. We are now required to include fish and other aquatic species in negotiation over how much water to leave in the river, rather than, as in the past, how much water we could remove from a river. Additionally, we must pay attention to the quality of that water, including its temperature. This is driven in the US by the Endangered Species Act and the Clean Water Act. Furthermore, we must now better understand and manage the whole hydrograph and the influence of hydrologic variability on aquatic ecosystems. Man has trimmed the tails off the probability distribution of flows. We need to understand how to put the tails back on but can’t do that without improved understanding of aquatic ecosystems. Sea level rise presents challenges for fresh water extraction from coastal aquifers as they are compromised by increased saline intrusion. A related problem faces users of ‘run-of-the-river’ water-supply intakes that are threatened by a salt front that migrates further upstream because of higher sea level. We face significant challenges with water infrastructure. The U.S. has among the highest quality drinking water in the world piped to our homes. However, our water and sewage treatment plants and water and sewer pipelines have not had adequate maintenance or investment for decades. The US Environmental Protection Agency estimates that there are up to 3.5M illnesses per year from recreational contact with sewage from sanitary sewage overflows. 
Infrastructure investment needs have been put at 5 trillion nationally. Global change and water resources challenges that we face this century include a combination of local and national management probl

  11. The FCS Body of Knowledge: Shaping the Next 100 Years

    Science.gov (United States)

    Journal of Family and Consumer Sciences, 2010

    2010-01-01

    This article shares the Body of Knowledge (BOK) as articulated in the new "Accreditation Documents for Undergraduate Programs in Family and Consumer Sciences" (2010). The purpose of sharing the BOK is to enhance awareness of the current knowledge base of family and consumer sciences (FCS), whether for new or lifelong AAFCS members, those exploring…

  12. Technique for estimating depth of 100-year floods in Tennessee

    Science.gov (United States)

    Gamble, Charles R.; Lewis, James G.

    1977-01-01

    Preface: A method is presented for estimating the depth of the 100-year flood in four hydrologic areas in Tennessee. Depths at 151 gaging stations on streams that were not significantly affected by man-made changes were related to basin characteristics by multiple regression techniques. Equations derived from the analysis can be used to estimate the depth of the 100-year flood if the size of the drainage basin is known.
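The report's approach, relating observed flood depths at gaging stations to basin characteristics by regression and then predicting depth from drainage area, can be sketched as follows. This is a minimal illustration with invented data and a single basin characteristic; the coefficients are not the report's actual equations, which vary by hydrologic area.

```python
import numpy as np

# Hypothetical gaging-station data: drainage area (sq mi) and observed
# 100-year flood depth (ft). Values are invented for illustration.
drainage_area_sqmi = np.array([12.0, 45.0, 110.0, 300.0, 850.0])
flood_depth_ft = np.array([6.1, 9.4, 12.8, 17.5, 24.0])

# Depth-vs-area relations are commonly fit as power laws, depth = a * A^b,
# which are linear in log-log space, so fit by ordinary least squares there.
X = np.column_stack([np.ones_like(drainage_area_sqmi), np.log(drainage_area_sqmi)])
y = np.log(flood_depth_ft)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def estimate_depth(area_sqmi: float) -> float:
    """Estimate 100-year flood depth (ft) from drainage area (sq mi)."""
    return float(np.exp(coef[0] + coef[1] * np.log(area_sqmi)))
```

A full multiple-regression version would simply add more basin characteristics (e.g. channel slope) as columns of `X`, one fitted per hydrologic area.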

  13. 100 Years of Alzheimer's disease (1906-2006).

    Science.gov (United States)

    Lage, José Manuel Martínez

    2006-01-01

    As we commemorate the first centennial since Alzheimer's disease (AD) was first diagnosed, this article casts back into the past while also looking to the future. It reflects on the life of Alois Alzheimer (1864-1915) and the scientific work he undertook in describing the disorder suffered by Auguste D. from age 51 to 56 and the neuropathological findings revealed by her brain, reminding us of the origin of the eponym. It highlights how, throughout the 1960's, the true importance of AD as the major cause of late life dementia ultimately came to light and narrates the evolution of the concepts related to AD throughout the years and its recognition as a major public health problem. Finally, the article pays homage to the work done by the Alzheimer's Association and the research undertaken at the Alzheimer's Disease Centres within the framework of the National Institute on Aging (NIA) Program, briefly discussing the long road travelled in the fight against AD in the past 25 years and the scientific odyssey that we trust will result in finding a cure. PMID:17004362

  14. 100 years after the Marsica earthquake: contribute of outreach activities

    Science.gov (United States)

    D'Addezio, Giuliana; Giordani, Azzurra; Valle, Veronica; Riposati, Daniela

    2015-04-01

    Many outreach events have been proposed by the scientific community to celebrate the centenary of the January 13, 1915 earthquake that devastated the Marsica territory, located in the Central Apennines. The Laboratorio Divulgazione Scientifica e Attività Museali of the Istituto Nazionale di Geofisica e Vulcanologia (INGV's Laboratory for Outreach and Museum Activities) in Rome has realised an interactive exhibition in the Castello Piccolomini, Celano (AQ), to retrace the many aspects of the earthquake disaster, in a region such as Abruzzo that has been affected by several destructive earthquakes during its history. The initiatives represent an ideal opportunity for the development of new programs of communication and training on seismic risk and for spreading a culture of prevention. The INGV is accredited with the Servizio Civile Nazionale (National Civic Service), and volunteers have been involved in the project "Science and Outreach: a comprehensive approach to the divulgation of knowledge of Earth Sciences" since 2014. In this context, volunteers had the opportunity to contribute fully to the exhibition, in particular by promoting and realising two panels concerning the social and environmental consequences of the Marsica earthquake. By describing the serious consequences of the earthquake, we may raise awareness about natural hazards and about the only effective action for earthquake defense: building with anti-seismic criteria. After studies and research conducted in libraries and via the web, two themes were developed: the serious problem of orphans and the difficult reconstruction. Heavy snowfalls and the presence of wolves coming down from the high, wild surrounding mountains complicated the scenario and slowed the rescue of the affected populations. It is important to underline that the earthquake was not the only devastating event in the country in 1915; another dramatic event was, in fact, the First World War. 
Whole families died and the still alive infants and children were sent to Rome in hospitals and in other suitable structures. Many stories of poor orphans are known but we decided to outlines stories that besides the dramma had an happy ending. To understand the hugeness of the tragedy, we may consider that the number of towns and villages completely destroyed by the earthquake was more than fifty. The reconstruction was very difficult and slow also because of the war, and involved the relocation of settlements in different places. The first shelters to be reconstructed were those for survivors: very small shacks built with anti seismic criteria. They are still on the territory, to be a symbol of the reconstruction and a remined evidence of the earthquake.

  15. Engineering and malaria control: learning from the past 100 years

    DEFF Research Database (Denmark)

    Konradsen, Flemming; van der Hoek, Wim; Amerasinghe, Felix P;

    2004-01-01

Traditionally, engineering and environment-based interventions have contributed to the prevention of malaria in Asia. However, with the introduction of DDT and other potent insecticides, chemical control became the dominating strategy. The renewed interest in environmental-management-based approaches for the control of malaria vectors follows the rapid development of resistance by mosquitoes to the widely used insecticides, the increasing cost of developing new chemicals, logistical constraints involved in the implementation of residual-spraying programs, and the environmental concerns linked... cases are discussed in the wider context of environment-based approaches for the control of malaria vectors, including their current relevance. Clearly, some of the interventions piloted and implemented early in the last century still have relevance today, but generally in a very site-specific manner...

  16. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
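The variable-selection chapter mentioned above centers on methods like the Lasso. As a generic illustration (not code from the book), a bare-bones coordinate-descent Lasso on synthetic data shows how the soft-thresholding step zeroes out irrelevant predictors:

```python
import random

def lasso_cd(X, y, lam, n_iter=100):
    """Coordinate-descent Lasso: minimize 0.5*||y - Xw||^2 + lam*||w||_1.
    Bare-bones pure Python, for illustration only."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual: leave feature j out of the current fit
            r = [y[i] - sum(w[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            # soft-thresholding: coefficients with |rho| <= lam are set to
            # exactly zero, which is what performs variable selection
            if rho > lam:
                w[j] = (rho - lam) / z
            elif rho < -lam:
                w[j] = (rho + lam) / z
            else:
                w[j] = 0.0
    return w

# synthetic data in which only features 0 and 2 carry signal
random.seed(0)
X = [[random.gauss(0, 1) for _ in range(5)] for _ in range(200)]
y = [3.0 * row[0] - 2.0 * row[2] + random.gauss(0, 0.1) for row in X]
w = lasso_cd(X, y, lam=20.0)
selected = [j for j, wj in enumerate(w) if abs(wj) > 1e-9]
print(selected)
```

On this synthetic data the penalty drives the three noise coefficients exactly to zero while slightly shrinking the two true ones — the behavior that makes the Lasso a variable-selection tool rather than just a regularizer.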

  17. Applied mechanics of solids

    CERN Document Server

    Bower, Allan F

    2009-01-01

    Modern computer simulations make stress analysis easy. As they continue to replace classical mathematical methods of analysis, these software programs require users to have a solid understanding of the fundamental principles on which they are based. Develop Intuitive Ability to Identify and Avoid Physically Meaningless Predictions Applied Mechanics of Solids is a powerful tool for understanding how to take advantage of these revolutionary computer advances in the field of solid mechanics. Beginning with a description of the physical and mathematical laws that govern deformation in solids, the text presents modern constitutive equations, as well as analytical and computational methods of stress analysis and fracture mechanics. It also addresses the nonlinear theory of deformable rods, membranes, plates, and shells, and solutions to important boundary and initial value problems in solid mechanics. The author uses the step-by-step manner of a blackboard lecture to explain problem solving methods, often providing...

  18. Engineering tools

    OpenAIRE

    2010-01-01

The aim of this report is to give an overview of the results of Work Package 5 “Engineering Tools”. In this work package, numerical tools have been developed for all relevant CHCP systems in the PolySMART demonstration projects (WP3). First, existing simulation platforms have been described and specific characteristics have been identified. Several different simulation platforms are in principle appropriate for the needs of the PolySMART project. The result is an evaluation of available simulat...

  19. Tool Gear: Infrastructure for Parallel Tools

    Energy Technology Data Exchange (ETDEWEB)

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  20. RSP Tooling Technology

    Energy Technology Data Exchange (ETDEWEB)

    None

    2001-11-20

RSP Tooling™ is a spray forming technology tailored for producing molds and dies. The approach combines rapid solidification processing and net-shape materials processing in a single step. The general concept involves converting a mold design described by a CAD file to a tooling master using a suitable rapid prototyping (RP) technology such as stereolithography. A pattern transfer is made to a castable ceramic, typically alumina or fused silica (Figure 1). This is followed by spray forming a thick deposit of a tooling alloy on the pattern to capture the desired shape, surface texture, and detail. The resultant metal block is cooled to room temperature and separated from the pattern. The deposit's exterior walls are machined square, allowing it to be used as an insert in a standard mold base. The overall turnaround time for tooling is about 3 to 5 days, starting with a master. Molds and dies produced in this way have been used in high-volume production runs in plastic injection molding and die casting. A Cooperative Research and Development Agreement (CRADA) between the Idaho National Engineering and Environmental Laboratory (INEEL) and Grupo Vitro has been established to evaluate the feasibility of using RSP Tooling technology for producing molds and dies of interest to Vitro. This report summarizes results from Phase I of this agreement, and describes the work scope and budget for Phase II activities. The main objective in Phase I was to demonstrate the feasibility of applying the Rapid Solidification Process (RSP) Tooling method to produce molds for the manufacture of glass and other components of interest to Vitro. This objective was successfully achieved.

  1. Indispensable tool

    International Nuclear Information System (INIS)

    Synchrotron radiation has become an indispensable research tool for a growing number of scientists in a seemingly ever expanding number of disciplines. We can thank the European Synchrotron Research Facility (ESRF) in Grenoble for taking an innovative step toward achieving the educational goal of explaining the nature and benefits of synchrotron radiation to audiences ranging from the general public (including students) to government officials to scientists who may be unfamiliar with x-ray techniques and synchrotron radiation. ESRF is the driving force behind a new CD-ROM playable on both PCs and Macs titled Synchrotron light to explore matter. Published by Springer-Verlag, the CD contains both English and French versions of a comprehensive overview of the subject

  2. A Chinese type 2 diabetic patient sports hindrance survey tool developed by applying the Rasch model

    Institute of Scientific and Technical Information of China (English)

    李庆雯; 朱为模; 李梅

    2014-01-01

In order to develop and standardize a diabetic patient sports hindrance survey tool, and to clarify the major sports hindrances suffered by type 2 diabetic patients in China, the authors applied the latest educational/psychological measurement technology to simplify the survey tool so that it can be better applied to large groups of people. The authors carried out a questionnaire survey of 197 type 2 diabetic patients (104 males, 93 females, average age: 53.6). The survey tool originated from a mature sports hindrance survey questionnaire established in the United States, including 43 sports hindrances, used to determine the degree to which each hindered the subjects' sports participation. The authors analyzed the data by applying the Rasch model, determined the magnitudes of the sports hindrances from their logit values, and then, based on the Rasch model results, reduced the questionnaire contents while maintaining the original content structure, screened out the best reduced questionnaire version by means of correlation analysis, and found that the lack of sports knowledge and the lack of professional guidance were major sports hindrances for type 2 diabetic patients in China. Applying the new item response theory (IRT) model, the authors found that all 4 reduced versions of the questionnaire were correlated with the original 43-hindrance version, and that the 16-hindrance questionnaire tool had the highest efficiency; its usage is recommended.
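The Rasch model used in the study places person ability and item (hindrance) difficulty on a common logit scale. A minimal sketch of the response probability and a crude difficulty estimate (illustrative values, not the authors' data):

```python
import math

def rasch_prob(theta, b):
    """Rasch model: probability of endorsing an item, for person ability
    theta and item difficulty b, both on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def difficulty_from_prop(prop):
    """Crude moment estimate: for persons centered at theta = 0, an item
    endorsed by a proportion `prop` has difficulty -logit(prop)."""
    return -math.log(prop / (1.0 - prop))

# an item one logit harder than a person is endorsed about 27% of the time
print(round(rasch_prob(0.0, 1.0), 3))

# a hindrance reported by 73% of respondents sits about one logit
# below the average person location (i.e. it is a "common" hindrance)
print(round(difficulty_from_prop(0.73), 3))
```

Full Rasch calibration iterates estimates of this kind jointly over persons and items; the logit values the abstract mentions are exactly these difficulty parameters.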

  3. Database Constraints Applied to Metabolic Pathway Reconstruction Tools

    Directory of Open Access Journals (Sweden)

    Jordi Vilaplana

    2014-01-01

Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented in a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives acceptable performance for low- or medium-size databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes.
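The kind of access-pattern analysis described here can be illustrated in miniature with the stdlib sqlite3 module (a stand-in — the study used MySQL and HBase, and the gene table below is invented): indexing a frequently filtered column turns a full scan into a lookup.

```python
import sqlite3
import time

# hypothetical gene-annotation table standing in for the shared organism database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE genes (id INTEGER PRIMARY KEY, organism TEXT, symbol TEXT)")
conn.executemany(
    "INSERT INTO genes (organism, symbol) VALUES (?, ?)",
    ((f"org{i % 500}", f"g{i}") for i in range(200_000)),
)

def timed_lookup():
    """Count annotations for one organism and report the elapsed time."""
    t0 = time.perf_counter()
    n = conn.execute(
        "SELECT COUNT(*) FROM genes WHERE organism = ?", ("org42",)
    ).fetchone()[0]
    return n, time.perf_counter() - t0

rows, t_scan = timed_lookup()           # full table scan: touches all 200k rows
conn.execute("CREATE INDEX idx_org ON genes(organism)")
rows2, t_index = timed_lookup()         # index lookup: touches only matching rows

print(rows, rows2, t_index < t_scan)
```

This is the flavor of tuning the abstract reports: the data and queries stay the same, and only server/schema configuration changes the runtime.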

  4. Applying mathematical finance tools to the competitive Nordic electricity market

    International Nuclear Information System (INIS)

This thesis models competitive electricity markets using the methods of mathematical finance. Fundamental problems of finance are market price modelling, derivative pricing, and optimal portfolio selection. The same questions arise in competitive electricity markets. The thesis presents an electricity spot price model based on the fundamental stochastic factors that affect electricity prices. The resulting price model has sound economic foundations, is able to explain spot market price movements, and offers a computationally efficient way of simulating spot prices. The thesis shows that the connection between spot prices and electricity forward prices is nontrivial because electricity is a commodity that must be consumed immediately. Consequently, forward prices for different delivery times are based on the supply-demand conditions at those times. This thesis introduces a statistical model that captures the main characteristics of observed forward price movements. The thesis presents the pricing problems relating to the common Nordic electricity derivatives, as well as the pricing relations between electricity derivatives. The special characteristics of electricity make the spot electricity market incomplete. The thesis assumes the existence of a risk-neutral martingale measure so that formal pricing results can be obtained. Some concepts introduced in financial markets are directly usable in the electricity markets. The risk management application in this thesis uses a static optimal portfolio selection framework where Monte Carlo simulation provides quantitative results. The application of mathematical finance requires careful consideration of the special characteristics of the electricity markets. Economic theory and reasoning have to be taken into account when constructing financial models in competitive electricity markets. (orig.)
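The abstract does not give the model's equations. As a hedged stand-in, a mean-reverting log-price process of the kind commonly fitted to electricity spot prices can be simulated in a few lines (all parameters are invented for illustration, not estimated from Nordic market data):

```python
import math
import random

def simulate_spot(s0=30.0, mu=30.0, kappa=5.0, sigma=0.4, days=365, seed=1):
    """Euler simulation of a mean-reverting log spot price:
        d ln S = kappa * (ln mu - ln S) dt + sigma dW,   dt = 1/365.
    kappa controls how fast price spikes decay back to the long-run level mu."""
    random.seed(seed)
    dt = 1.0 / 365.0
    x = math.log(s0)
    path = [s0]
    for _ in range(days):
        x += kappa * (math.log(mu) - x) * dt + sigma * math.sqrt(dt) * random.gauss(0, 1)
        path.append(math.exp(x))
    return path

path = simulate_spot()
avg = sum(path) / len(path)
print(round(avg, 1))  # the yearly average stays near the long-run level
```

Monte Carlo risk management of the kind the thesis describes would repeat such simulations many times and evaluate portfolio payoffs on each path; the mean reversion is what makes electricity forwards depend on conditions at delivery rather than on today's spot alone.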

  5. Scanning Probe Microscopy as a Tool Applied to Agriculture

    Science.gov (United States)

    Leite, Fabio Lima; Manzoli, Alexandra; de Herrmann, Paulo Sérgio Paula; Oliveira, Osvaldo Novais; Mattoso, Luiz Henrique Capparelli

    The control of materials properties and processes at the molecular level inherent in nanotechnology has been exploited in many areas of science and technology, including agriculture where nanotech methods are used in release of herbicides and monitoring of food quality and environmental impact. Atomic force microscopy (AFM) and related techniques are among the most employed nanotech methods, particularly with the possibility of direct measurements of intermolecular interactions. This chapter presents a brief review of the applications of AFM in agriculture that may be categorized into four main topics, namely thin films, research on nanomaterials and nanostructures, biological systems and natural fibers, and soils science. Examples of recent applications will be provided to give the reader a sense of the power of the technique and potential contributions to agriculture.

  6. Spectroscopic Tools Applied to Element Z = 115 Decay Chains

    Directory of Open Access Journals (Sweden)

    Forsberg U.

    2014-03-01

Nuclides that are considered to be isotopes of element Z = 115 were produced in the reaction 48Ca + 243Am at the GSI Helmholtzzentrum für Schwerionenforschung Darmstadt. The detector setup TASISpec was used. It was mounted behind the gas-filled separator TASCA. Thirty correlated α-decay chains were found, and the energies of the particles were determined with high precision. Two important spectroscopic aspects of the offline data analysis are discussed in detail: the handling of digitized preamplified signals from the silicon strip detectors, and the energy reconstruction of particles escaping to upstream detectors relying on pixel-by-pixel dead-layer thicknesses.

  7. Tools for Authentication

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.

  8. Tools for Authentication

    International Nuclear Information System (INIS)

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work

  9. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

A building energy analysis system includes: a building component library configured to store a plurality of building components; a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and selected building components stored in the library; a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models; and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
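The workflow the claim describes — build a baseline model, apply energy conservation measures (ECMs), rank the results — can be caricatured in a few lines. The end uses, ECM names, and savings fractions below are all invented for illustration:

```python
# toy baseline energy model: annual kWh per end use (numbers invented)
baseline = {"lighting": 12000.0, "hvac": 30000.0, "plug_loads": 8000.0}

# energy conservation measures as (end use, fractional saving) pairs
ecms = {
    "led_retrofit": ("lighting", 0.40),
    "smart_thermostat": ("hvac", 0.10),
}

def apply_ecm(model, ecm):
    """Return a new model with the ECM's fractional saving applied to its end use."""
    end_use, saving = ecm
    out = dict(model)
    out[end_use] *= (1.0 - saving)
    return out

def total(model):
    """Total annual consumption of a model, in kWh."""
    return sum(model.values())

# rank ECMs by absolute savings against the baseline, as a recommendation
# tool might, and pick the best one
savings = {name: total(baseline) - total(apply_ecm(baseline, ecm))
           for name, ecm in ecms.items()}
best = max(savings, key=savings.get)
print(best, round(savings[best], 1))
```

A real engine would replace the dictionary with a physics-based simulation, but the assess-and-rank loop around it has the same shape.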

  10. Cyber Security Evaluation Tool

    Energy Technology Data Exchange (ETDEWEB)

    2009-08-03

    CSET is a desktop software tool that guides users through a step-by-step process to assess their control system network security practices against recognized industry standards. The output from CSET is a prioritized list of recommendations for improving the cyber security posture of your organization’s ICS or enterprise network. CSET derives the recommendations from a database of cybersecurity standards, guidelines, and practices. Each recommendation is linked to a set of actions that can be applied to enhance cybersecurity controls.

  11. What Metadata Principles Apply to Scientific Data?

    Science.gov (United States)

    Mayernik, M. S.

    2014-12-01

    Information researchers and professionals based in the library and information science fields often approach their work through developing and applying defined sets of principles. For example, for over 100 years, the evolution of library cataloging practice has largely been driven by debates (which are still ongoing) about the fundamental principles of cataloging and how those principles should manifest in rules for cataloging. Similarly, the development of archival research and practices over the past century has proceeded hand-in-hand with the emergence of principles of archival arrangement and description, such as maintaining the original order of records and documenting provenance. This project examines principles related to the creation of metadata for scientific data. The presentation will outline: 1) how understandings and implementations of metadata can range broadly depending on the institutional context, and 2) how metadata principles developed by the library and information science community might apply to metadata developments for scientific data. The development and formalization of such principles would contribute to the development of metadata practices and standards in a wide range of institutions, including data repositories, libraries, and research centers. Shared metadata principles would potentially be useful in streamlining data discovery and integration, and would also benefit the growing efforts to formalize data curation education.

  12. Applying SF-Based Genre Approaches to English Writing Class

    Science.gov (United States)

    Wu, Yan; Dong, Hailin

    2009-01-01

    By exploring genre approaches in systemic functional linguistics and examining the analytic tools that can be applied to the process of English learning and teaching, this paper seeks to find a way of applying genre approaches to English writing class.

  13. Downhole tool with replaceable tool sleeve sections

    Energy Technology Data Exchange (ETDEWEB)

    Case, W. A.

    1985-10-29

    A downhole tool for insertion in a drill stem includes elongated cylindrical half sleeve tool sections adapted to be non-rotatably supported on an elongated cylindrical body. The tool sections are mountable on and removable from the body without disconnecting either end of the tool from a drill stem. The half sleeve tool sections are provided with tapered axially extending flanges on their opposite ends which fit in corresponding tapered recesses formed on the tool body and the tool sections are retained on the body by a locknut threadedly engaged with the body and engageable with an axially movable retaining collar. The tool sections may be drivably engaged with axial keys formed on the body or the tool sections may be formed with flat surfaces on the sleeve inner sides cooperable with complementary flat surfaces formed on a reduced diameter portion of the body around which the tool sections are mounted.

  14. Implementation of cutting tool management system

    Directory of Open Access Journals (Sweden)

    G. Svinjarević

    2007-07-01

Purpose: of this paper is to show the benefits of implementing cutting tool management in a company specializing in metal cutting, after which the production conditions allow new possibilities for improving tool management. Design/methodology/approach: applied in this paper was the identification of the current state and exploitation conditions of cutting tools on lathes and milling machines, and of the organization of the departments and other services directly involved in the cutting tool management system. Findings: from the controlled tests and analyses in every phase of tool management, in the departments and other services directly involved in the tool management system, will help to reduce stock and costs. It is possible to identify which operator makes errors and is responsible for inappropriate use of a cutting tool. Some disadvantages have been identified, and a few suggestions for improving the tool management system have been given. The results of this research are easy to apply in a company with a developed informatics infrastructure and are mostly of interest to CNC workshops. Small companies and specialized low-volume productions have to make an additional effort to integrate into clusters. Practical implications: are a reduction of cutting tools in stock, a reduction of employees, quick access to the necessary cutting tools and data, and simplicity in tool ordering and supply. Most important is the possibility to monitor and identify which cutting tools and employees are the weakest links in the tool management system. 
Management activity should be foreseeable in all its segments, which includes both the appropriate choice and use of cutting tools and the monitoring of unwanted phenomena during the cutting process, with these data used for further tool purchasing. Originality/value: in the paper, a turnover methodology is applied for determining management efficacy and the formation of employees from different departments in

  15. Sheet Bending using Soft Tools

    Science.gov (United States)

    Sinke, J.

    2011-05-01

Sheet bending is usually performed by air bending and V-die bending processes. Both processes apply rigid tools. These solid tools facilitate the generation of software for the numerical control of those processes. When the lower rigid die is replaced with a soft or rubber tool, the numerical control becomes much more difficult, since the soft tool deforms too. Compared to other bending processes, the rubber-backed bending process has some distinct advantages, like large radius-to-thickness ratios, applicability to materials with topcoats, well-defined radii, and the feasibility of forming details (ridges, beads). These advantages may give the process exclusive benefits over conventional bending processes, not only for industries related to mechanical engineering and sheet metal forming, but also for other disciplines like Architecture and Industrial Design. The largest disadvantage is that the soft (rubber) tool also deforms. Although the tool deformation is elastic and recovers after each process cycle, the applied force during bending is related to the deformation of the metal sheet and the deformation of the rubber. The deformation of the rubber interacts with the process but also with sheet parameters. This makes the numerical control of the process much more complicated. This paper presents a model for the bending of sheet materials using a rubber lower die. This model can be implemented in software in order to control the bending process numerically. The model itself is based on numerical and experimental research. In this research a number of variables related to the tooling and the material have been evaluated. The numerical part of the research was used to investigate the influence of the features of the soft lower tool, like the hardness and dimensions, and the influence of the sheet thickness, which also interacts with the soft tool deformation. The experimental research was focused on the relation between the machine control parameters and the most

  16. ECONOMETRIC TOOLS OF CONTROLLING

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-03-01

Econometrics is one of the most effective mathematical tools of controlling. The article deals with general problems of applying econometric methods to solving problems of controlling. Econometric methods are primarily the statistical analysis of concrete economic data, usually with the help of computers. In our country they are still relatively little known, even though we have the most powerful scientific school in the foundations of econometrics: probability theory. The article shows that solving the problems of controlling requires econometric methods. Classification of econometric tools can be carried out on various grounds: by methods, by type of data, by tasks, etc. Mass introduction of software products, including modern tools for the econometric analysis of concrete economic data, can be regarded as one of the most effective ways to accelerate scientific and technological progress. The whole arsenal of currently used econometric and statistical techniques (methods) can be divided into three streams: high econometric (statistical) technologies; classical econometric (statistical) technologies; and low (inadequate, obsolete) econometric (statistical) technologies. The main problem of modern econometrics is to ensure that concrete econometric and statistical studies use only the first two types of technology. To get a broader picture of the use of econometric methods in production management, we analyze the basic textbook "Organization and planning of engineering production (production management)", prepared by the Department of Economics and Organization of Production of the Bauman Moscow State Technical University. It uses econometric methods and models more than 20 times, testifying to the effectiveness of econometrics as a manager's tool.

  17. Sea Surface Temperature Variations during the Last 100 Years Recorded in a Porites Coral from the Mischief Reef of Sansha City

    Institute of Scientific and Technical Information of China (English)

    林紫云; 余克服; 施祺; 陈天然; 陶士臣

    2016-01-01

Sea surface temperature (SST) variation over the last 100 years was reconstructed from high-resolution Sr/Ca of a Porites coral from the Mischief Reef, Sansha City. The results showed that the annual average SST from 1906 to 2007 AD was 28.4°C, with a maximum of 29.2°C in 1998 AD and a minimum of 27.6°C in 1917 AD, a range of 1.6°C. The SST of the Nansha Islands increased by 0.025°C per decade during this period, similar to that of the Xisha Islands, with decadal fluctuations superimposed on the trend: cooling over 1906-1927 AD, warming over 1928-1938 AD, cooling over 1939-1957 AD, and a fluctuating rise after 1958 AD. The SST series displayed significant oscillations with periods of 3.5 and 33 years, suggesting a possible linkage between the SST and ENSO and the Pacific Decadal Oscillation (PDO). In addition, in the years when ENSO occurred, the SST was high, indicating the influence of ENSO in this area. In this paper, a Sr/Ca-SST relation was first established for the Mischief Reef, and the high-resolution SST record in this area was then extended to more than 100 years. The SST variations were characterized and related mechanisms were explored.
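Coral Sr/Ca paleothermometry of this kind rests on a linear calibration SST = a + b·(Sr/Ca) with a negative slope b, fitted against instrumental SST. A least-squares sketch with invented calibration points (the paper's actual coefficients are not given in the abstract):

```python
# ordinary least squares for a linear Sr/Ca-SST calibration.
# the data points below are invented for illustration; real calibrations
# pair instrumental SST (deg C) with coral Sr/Ca (mmol/mol), and the
# fitted slope is negative (warmer water -> less Sr in the skeleton)
sr_ca = [8.80, 8.85, 8.90, 8.95, 9.00, 9.05]
sst   = [29.5, 29.0, 28.6, 28.1, 27.7, 27.2]

n = len(sr_ca)
mx = sum(sr_ca) / n
my = sum(sst) / n
# slope and intercept from the usual closed-form OLS expressions
b = (sum((x - mx) * (y - my) for x, y in zip(sr_ca, sst))
     / sum((x - mx) ** 2 for x in sr_ca))
a = my - b * mx

def sst_from_srca(x):
    """Invert the calibration: hindcast SST from a measured Sr/Ca value."""
    return a + b * x

print(round(b, 2), round(sst_from_srca(8.92), 1))
```

Once calibrated against the instrumental era, the same equation is applied down-core to Sr/Ca measured in older coral bands, which is how the record is extended back beyond 100 years.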

  18. Ludic Educational Game Creation Tool

    DEFF Research Database (Denmark)

    Vidakis, Nikolaos; Syntychakis, Efthimios; Kalafatis, Konstantinos;

    2015-01-01

This paper presents initial findings and ongoing work on the game creation tool, a core component of the IOLAOS platform (IOLAOS, in ancient Greece, was a divine hero famed for helping with some of Heracles's labors), a general open authorable framework for educational and training games. The game creation tool features a web editor, where the game narrative can be manipulated according to specific needs. Moreover, this tool is applied to creating an educational game according to a reference scenario, namely teaching schoolers road safety. A ludic approach is used both in game creation and play. Helping children stay safe and preventing serious injury on the roads is crucial. In this context, this work presents an augmented version of the IOLAOS architecture, including an enhanced game creation tool and a new multimodality module. In addition, it presents a case study for creating educational games...

  19. Software Engineering applied to Manufacturing Problems

    Directory of Open Access Journals (Sweden)

    Jorge A. Ruiz-Vanoye

    2010-05-01

    Full Text Available Optimization approaches have traditionally been viewed as tools for solving manufacturing problems; however, optimization alone is not suitable for many problems arising in modern manufacturing systems, owing to their complexity and the involvement of qualitative factors. In this paper we use a software engineering tool applied to manufacturing problems: the Heuristics Lab software is used to determine and analyze the solutions obtained for manufacturing problems.

  20. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  1. Essays in applied economics

    Science.gov (United States)

    Arano, Kathleen

    Three independent studies in applied economics are presented. The first essay looks at the US natural gas industrial sector and estimates welfare effects associated with the changes in natural gas regulatory policy over the past three decades. Using a disequilibrium model suited to the natural gas industry, welfare transfers and deadweight losses are calculated. Results indicate that deregulation policies, beginning with the NGPA of 1978, have caused the industry to become more responsive to market conditions. Over time, regulated prices converge toward the estimated equilibrium prices. As a result of this convergence, deadweight losses associated with regulation are also diminished. The second essay examines the discounted utility model (DU), the standard model used for intertemporal decision-making. Prior empirical studies challenge the descriptive validity of the model. This essay addresses the four main inconsistencies that have been raised: domain dependence, magnitude effects, time effects, and gain/loss asymmetries. These inconsistencies, however, may be the result of the implicit assumption of linear utility and not a failure of the DU model itself. In order to test this hypothesis, data was collected from in-class surveys of economics classes at Mississippi State University. A random effects model for panel data estimation which accounts for individual specific effects was then used to impute discount rates measured in terms of dollars and utility. All four inconsistencies were found to be present when the dollar measures were used. Using utility measures of the discount rate resolved the inconsistencies in some cases. The third essay brings together two perspectives in the study of religion and economics: modeling religious behavior using economic tools and variables, and modeling economic behavior using religious variables. A system of ordered probit equations is developed to simultaneously model religious activities and economic outcomes. Using data

  2. 29 CFR 1915.132 - Portable electric tools.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Portable electric tools. 1915.132 Section 1915.132 Labor... § 1915.132 Portable electric tools. The provisions of this section shall apply to ship repairing... frames of portable electric tools and appliances, except double insulated tools approved by...

  3. Nanotechnology tools in pharmaceutical R&D

    OpenAIRE

    Kumar, Challa S. S. R.

    2010-01-01

    Nanotechnology is a new approach to problem solving and can be considered as a collection of tools and ideas which can be applied in the pharmaceutical industry. Application of nanotechnology tools in pharmaceutical R&D is likely to move the industry from the ‘blockbuster drug’ model to ‘personalized medicine’. There are compelling applications in the pharmaceutical industry where inexpensive nanotechnology tools can be utilized. The review explores the possibility of categorizing various nan...

  4. A Hybrid Pattern Recognition Architecture for Cutting Tool Condition Monitoring

    OpenAIRE

    Fu, Pan; Hope, A. D.

    2008-01-01

    An intelligent tool condition monitoring system has been established. Tool wear classification is realized by applying a unique fuzzy neural hybrid pattern recognition system. On the basis of this investigation, the following conclusions can be made.

  5. Software Release Procedure and Tools

    OpenAIRE

    Giammatteo, Gabriele; Frosini, Luca; Laskaris, Nikolas

    2015-01-01

    Deliverable D4.1 - "Software Release Procedures and Tools" aims to provide a detailed description of the procedures applied and tools used to manage releases of the gCube System within Work Package 4. gCube System is the software at the basis of all VREs applications, data management services and portals. Given the large size of the gCube system, its high degree of modularity and the number of developers involved in the implementation, a set of procedures that formalize and simplify the integ...

  6. A View from Agricultural and Applied Economics

    OpenAIRE

    Henry, Mark

    2000-01-01

    Is regional science too focused on abstract models, theorizing, and methodology with weak links to policy and practice? Not from the perspective of the land grant university, where many applied economists with a regional science interest reside. The job of these applied economists is, in part, to translate the models and methods into tools for the use in understanding regional development processes and for undertaking policy analysis at the regional level.

  7. Changes in phytoplankton productivity and impacts on environment in the Zhejiang coastal mud area during the last 100 years%浙江近岸泥质区百年来浮游植物生产力的变化及对环境的响应

    Institute of Scientific and Technical Information of China (English)

    冯旭文; 段杉杉; 石学法; 刘升发; 赵美训; 杨海丽; 朱德弟; 王奎

    2013-01-01

    Based on 210Pb dating, biomarkers including brassicasterol, dinosterol and long-chain alkenones were analyzed in a sediment core taken from the hypoxic mud area along the Zhejiang coast. From the distribution of biomarker contents and ratios, the changes in phytoplankton productivity and community structure in the mud area over the past 110 years were reconstructed. The results show that phytoplankton productivity along the Zhejiang coast has risen over the past century, increasing from the 1960s onward and markedly since the 1980s, while the community structure shows a rising proportion of dinoflagellates and a falling proportion of diatoms. The study concludes that the century-long rise in phytoplankton productivity in the Zhejiang coastal mud area is positively correlated with China's fertilizer consumption and the riverine nitrogen flux of the Changjiang River into the sea, and that increases in the nutrient N:P and N:Si ratios drove a shift of the dominant phytoplankton from diatoms to dinoflagellates; human activities such as rapid industrial and agricultural development and large hydraulic engineering projects since the 1960s, and especially since the 1980s, are the main factors behind the increased phytoplankton productivity and the changed community structure in the Zhejiang coastal waters.%A high resolution sediment core was selected in the Zhejiang coastal mud area, which was also located within the hypoxia area. The biomarkers, such as brassicasterol, dinosterol and C37-alkenones, were determined on the 210Pb-dated sediment core. According to the vertical distribution of the biomarkers and their ratios in the core sediments, we reconstructed the changes in phytoplankton productivity and community structure over the last 110 years in the Zhejiang coastal region. The results indicated increased phytoplankton productivity during the last 100 years in the mud area. Phytoplankton productivity increased gradually starting in the 1960s and accelerated after the 1980s. The change of phytoplankton community structure showed an increasing relative contribution of dinoflagellates and a decreasing relative contribution of diatoms over the last 100 years. The increase in phytoplankton productivity in the Zhejiang coastal mud area corresponded to the increased use of fertilizer and nitrogen

  8. Advances in Applied Mechanics

    OpenAIRE

    2014-01-01

    Advances in Applied Mechanics draws together recent significant advances in various topics in applied mechanics. Published since 1948, Advances in Applied Mechanics aims to provide authoritative review articles on topics in the mechanical sciences, primarily of interest to scientists and engineers working in the various branches of mechanics, but also of interest to the many who use the results of investigations in mechanics in various application areas, such as aerospace, chemical, civil, en...

  9. Perspectives on Applied Ethics

    OpenAIRE

    2007-01-01

    Applied ethics is a growing, interdisciplinary field dealing with ethical problems in different areas of society. It includes for instance social and political ethics, computer ethics, medical ethics, bioethics, envi-ronmental ethics, business ethics, and it also relates to different forms of professional ethics. From the perspective of ethics, applied ethics is a specialisation in one area of ethics. From the perspective of social practice applying eth-ics is to focus on ethical aspects and ...

  10. Applied Neuroscience Laboratory Complex

    Data.gov (United States)

    Federal Laboratory Consortium — Located at WPAFB, Ohio, the Applied Neuroscience lab researches and develops technologies to optimize Airmen individual and team performance across all AF domains....

  11. Complex reconfiguration - developing common tools

    International Nuclear Information System (INIS)

    Reconfiguring DOE sites, facilities, and laboratories to meet expected and evolving missions involves a number of disciplines and approaches formerly the preserve of private industry and defense contractors. This paper considers the process of identifying common tools for the various disciplines that can be exercised, assessed, and applied by team members to arrive at integrated solutions. The basic tools include systems, hardware, software, and procedures that can characterize a site/facility's environment to meet organizational goals, safeguards and security, ES&H, and waste requirements. Other tools, such as computer-driven inventory and auditing programs, can provide traceability of materials and product as they are processed and require added protection and control. This paper also discusses the use of integrated teams in a number of high-technology enterprises that could be adopted by DOE in high-profile programs from environmental remediation to weapons dismantling and arms control

  12. Navigating Towards Digital Tectonic Tools

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2006-01-01

    like opposites, the term tectonics deals with creating a meaningful relationship between the two. The aim of this paper is to investigate what a digital tectonic tool could be and what relationship with technology it should represent. An understanding of this relationship can help us not only to understand the conflicts in architecture and the building industry but also bring us further into a discussion of how architecture can use digital tools. The investigation is carried out firstly by approaching the subject theoretically through the term tectonics and by setting up a model of the values a tectonic tool should encompass. Secondly, the ability and validity of the model are shown by applying it to a case study of Jørn Utzon’s work on Minor Hall in Sydney Opera House; for the sake of exemplification, the technical field focused on in this paper is room acoustics. Thirdly, the relationship between...

  13. What are applied ethics?

    Science.gov (United States)

    Allhoff, Fritz

    2011-03-01

    This paper explores the relationships that various applied ethics bear to each other, both in particular disciplines and more generally. The introductory section lays out the challenge of coming up with such an account and, drawing a parallel with the philosophy of science, offers that applied ethics may either be unified or disunified. The second section develops one simple account through which applied ethics are unified, vis-à-vis ethical theory. However, this is not taken to be a satisfying answer, for reasons explained. In the third section, specific applied ethics are explored: biomedical ethics; business ethics; environmental ethics; and neuroethics. These are chosen not to be comprehensive, but rather for their traditions or other illustrative purposes. The final section draws together the results of the preceding analysis and defends a disunity conception of applied ethics.

  14. Force feedback facilitates multisensory integration during robotic tool use

    OpenAIRE

    Sengül, Ali; Rognini, Giulio; van Elk, Michiel; Aspell, Jane Elizabeth; Bleuler, Hannes; Blanke, Olaf

    2013-01-01

    The present study investigated the effects of force feedback in relation to tool use on the multisensory integration of visuo-tactile information. Participants learned to control a robotic tool through a surgical robotic interface. Following tool-use training, participants performed a crossmodal congruency task, by responding to tactile vibrations applied to their hands, while ignoring visual distractors superimposed on the robotic tools. In the first experiment it was found that tool-use tra...

  15. Special Functions for Applied Scientists

    CERN Document Server

    Mathai, A M

    2008-01-01

    Special Functions for Applied Scientists provides the required mathematical tools for researchers active in the physical sciences. The book presents a full suite of elementary functions for scholars at the PhD level, covers a wide array of topics, and begins by introducing elementary classical special functions. From there, differential equations and some applications to statistical distribution theory are examined. The fractional calculus chapter covers fractional integrals and fractional derivatives as well as their applications to reaction-diffusion problems in physics, input-output analysis, Mittag-Leffler stochastic processes and related topics. The authors then cover q-hypergeometric functions, Ramanujan's work and Lie groups. The latter half of this volume presents applications to stochastic processes, random variables, Mittag-Leffler processes, density estimation, order statistics, and problems in astrophysics. Professor Dr. A.M. Mathai is Emeritus Professor of Mathematics and Statistics, McGill ...

  16. Pre-Columbian monkey tools.

    Science.gov (United States)

    Haslam, Michael; Luncz, Lydia V; Staff, Richard A; Bradshaw, Fiona; Ottoni, Eduardo B; Falótico, Tiago

    2016-07-11

    Stone tools reveal worldwide innovations in human behaviour over the past three million years [1]. However, the only archaeological report of pre-modern non-human animal tool use comes from three Western chimpanzee (Pan troglodytes verus) sites in Côte d'Ivoire, aged between 4.3 and 1.3 thousand years ago (kya) [2]. This anthropocentrism limits our comparative insight into the emergence and development of technology, weakening our evolutionary models [3]. Here, we apply archaeological techniques to a distinctive stone tool assemblage created by a non-human animal in the New World, the Brazilian bearded capuchin monkey (Sapajus libidinosus). Wild capuchins at Serra da Capivara National Park (SCNP) use stones to pound open defended food, including locally indigenous cashew nuts [4], and we demonstrate that this activity dates back at least 600 to 700 years. Capuchin stone hammers and anvils are therefore the oldest non-human tools known outside of Africa, opening up to scientific scrutiny questions on the origins and spread of tool use in New World monkeys, and the mechanisms - social, ecological and cognitive - that support primate technological evolution. PMID:27404235

  18. Applied statistics: A review

    OpenAIRE

    Cox, D R

    2007-01-01

    The main phases of applied statistical work are discussed in general terms. The account starts with the clarification of objectives and proceeds through study design, measurement and analysis to interpretation. An attempt is made to extract some general notions.

  19. Applied eye tracking research

    NARCIS (Netherlands)

    Jarodzka, Halszka

    2011-01-01

    Jarodzka, H. (2010, 12 November). Applied eye tracking research. Presentation and Labtour for Vereniging Gewone Leden in oprichting (VGL i.o.), Heerlen, The Netherlands: Open University of the Netherlands.

  20. Applied Mathematics Seminar 1982

    International Nuclear Information System (INIS)

    This report contains the abstracts of the lectures delivered at 1982 Applied Mathematics Seminar of the DPD/LCC/CNPq and Colloquy on Applied Mathematics of LCC/CNPq. The Seminar comprised 36 conferences. Among these, 30 were presented by researchers associated to brazilian institutions, 9 of them to the LCC/CNPq, and the other 6 were given by visiting lecturers according to the following distribution: 4 from the USA, 1 from England and 1 from Venezuela. The 1981 Applied Mathematics Seminar was organized by Leon R. Sinay and Nelson do Valle Silva. The Colloquy on Applied Mathematics was held from october 1982 on, being organized by Ricardo S. Kubrusly and Leon R. Sinay. (Author)

  1. Mesothelioma Applied Research Foundation

    Science.gov (United States)

    ... Percentage Donations Tribute Wall Other Giving/Fundraising Opportunities Bitcoin Donation Form FAQs Help us raise awareness and ... © 2013 Mesothelioma Applied Research Foundation, ...

  2. Handbook of Applied Analysis

    CERN Document Server

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.

  3. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  4. Applied chemical engineering thermodynamics

    CERN Document Server

    Tassios, Dimitrios P

    1993-01-01

    Applied Chemical Engineering Thermodynamics provides the undergraduate and graduate student of chemical engineering with the basic knowledge, the methodology and the references needed to apply it in industrial practice. Thus, in addition to the classical topics of the laws of thermodynamics, pure-component and mixture thermodynamic properties, and phase and chemical equilibria, the reader will find: - history of thermodynamics - energy conservation - intermolecular forces and molecular thermodynamics - cubic equations of state - statistical mechanics. A great number of calculated problems with solutions and an appendix with numerous tables of numbers of practical importance are extremely helpful for applied calculations. The computer programs on the included disk help the student to become familiar with the typical methods used in industry for volumetric and vapor-liquid equilibria calculations.

  5. PSYCHOANALYSIS AS APPLIED AESTHETICS.

    Science.gov (United States)

    Richmond, Stephen H

    2016-07-01

    The question of how to place psychoanalysis in relation to science has been debated since the beginning of psychoanalysis and continues to this day. The author argues that psychoanalysis is best viewed as a form of applied art (also termed applied aesthetics) in parallel to medicine as applied science. This postulate draws on a functional definition of modernity as involving the differentiation of the value spheres of science, art, and religion. The validity criteria for each of the value spheres are discussed. Freud is examined, drawing on Habermas, and seen to have erred by claiming that the psychoanalytic method is a form of science. Implications for clinical and metapsychological issues in psychoanalysis are discussed. PMID:27428582

  6. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.
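The 3D viewing capability JRAT adds over the 2D map and altitude plots amounts to applying a rotation matrix to the radar-return point cloud and projecting it back onto the screen plane. A sketch of that core step, with invented point values and a hypothetical `rotate_view` helper (not JRAT's actual API):

```python
import numpy as np

# Radar returns as 3D points: x (km), y (km), altitude (km). Values invented.
points = np.array([
    [10.0, 4.0, 8.2],
    [11.5, 4.4, 7.9],
    [13.0, 4.9, 7.1],
])

def rotate_view(pts, yaw_deg, pitch_deg):
    """Rotate the point cloud so its (x, y) projection shows the chosen angle."""
    yaw, pitch = np.radians([yaw_deg, pitch_deg])
    rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],   # rotate about vertical axis
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0, 0.0, 1.0]])
    rx = np.array([[1.0, 0.0, 0.0],                    # tilt toward a side view
                   [0.0, np.cos(pitch), -np.sin(pitch)],
                   [0.0, np.sin(pitch),  np.cos(pitch)]])
    return pts @ (rx @ rz).T

# Zero angles reproduce the overhead map view; a 90-degree pitch brings
# altitude into the projection plane, mirroring the side altitude view.
side_view = rotate_view(points, yaw_deg=0.0, pitch_deg=90.0)
```

Stepping the angles with mouse input and redrawing the projection gives the "rotate to any desired viewing angle" behavior the abstract describes.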

  7. OOTW COST TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    HARTLEY, D.S.III; PACKARD, S.L.

    1998-09-01

    This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for OOTWs. It also recommends modifications to be included in future versions of COST and the development of an OOTW mission planning tool to supply valid input for costing.

  8. Retransmission Steganography Applied

    CERN Document Server

    Mazurczyk, Wojciech; Szczypiorski, Krzysztof

    2010-01-01

    This paper presents experimental results of the implementation of network steganography method called RSTEG (Retransmission Steganography). The main idea of RSTEG is to not acknowledge a successfully received packet to intentionally invoke retransmission. The retransmitted packet carries a steganogram instead of user data in the payload field. RSTEG can be applied to many network protocols that utilize retransmissions. We present experimental results for RSTEG applied to TCP (Transmission Control Protocol) as TCP is the most popular network protocol which ensures reliable data transfer. The main aim of the performed experiments was to estimate RSTEG steganographic bandwidth and detectability by observing its influence on the network retransmission level.
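The RSTEG mechanism can be modelled with a toy simulation: the receiver withholds an ACK for a small, chosen fraction of correctly received segments, and each resulting retransmission carries hidden data instead of user payload. A minimal sketch with invented parameters (`invoke_prob` and the payload size are illustrative, not measured values from the paper):

```python
import random

def simulate_rsteg(n_segments, invoke_prob, payload_bytes=1460, seed=42):
    """Return (user_bytes, stego_bytes) carried in a toy RSTEG session."""
    rng = random.Random(seed)
    user_bytes = 0
    stego_bytes = 0
    for _ in range(n_segments):
        user_bytes += payload_bytes          # original segment carries user data
        if rng.random() < invoke_prob:       # receiver skips the ACK on purpose
            stego_bytes += payload_bytes     # retransmission carries the steganogram
    return user_bytes, stego_bytes

user, stego = simulate_rsteg(n_segments=10_000, invoke_prob=0.005)
# Steganographic bandwidth as a share of total traffic; raising invoke_prob
# increases it but also raises the retransmission rate an observer could detect.
print(f"stego share of traffic: {stego / (user + stego):.4%}")
```

The trade-off the paper measures falls out directly: the hidden-channel capacity and the anomaly in the retransmission level both scale with the fraction of ACKs withheld.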

  9. Applied mathematics made simple

    CERN Document Server

    Murphy, Patrick

    1982-01-01

    Applied Mathematics: Made Simple provides an elementary study of the three main branches of classical applied mathematics: statics, hydrostatics, and dynamics. The book begins with discussion of the concepts of mechanics, parallel forces and rigid bodies, kinematics, motion with uniform acceleration in a straight line, and Newton's law of motion. Separate chapters cover vector algebra and coplanar motion, relative motion, projectiles, friction, and rigid bodies in equilibrium under the action of coplanar forces. The final chapters deal with machines and hydrostatics. The standard and conte

  10. Applied statistics with SPSS

    CERN Document Server

    Huizingh, Eelko K R E

    2007-01-01

    Accessibly written and easy to use, Applied Statistics Using SPSS is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. Based around the needs of undergraduate students embarking on their own research project, the text's self-help style is designed to boost the skills and confidence of those that will need to use SPSS in the course of doing their research project. The book is pedagogically well developed and contains many screen dumps and exercises, glossary terms and worked examples. Divided into two parts, Applied Statistics Using SPSS covers:

  11. Applied Electromagnetism and Materials

    CERN Document Server

    Moliton, André

    2007-01-01

    Applied Electromagnetism and Materials picks up where the author's Basic Electromagnetism and Materials left off by presenting practical and relevant technological information about electromagnetic material properties and their applications. This book is aimed at senior undergraduate and graduate students as well as researchers in materials science and is the product of many years of teaching basic and applied electromagnetism. Topics range from the spectroscopy and characterization of dielectrics and semiconductors, to non-linear effects and electromagnetic cavities, to ion-beam applications in materials science.

  12. On applying cognitive psychology.

    Science.gov (United States)

    Baddeley, Alan

    2013-11-01

    Recent attempts to assess the practical impact of scientific research prompted my own reflections on over 40 years worth of combining basic and applied cognitive psychology. Examples are drawn principally from the study of memory disorders, but also include applications to the assessment of attention, reading, and intelligence. The most striking conclusion concerns the many years it typically takes to go from an initial study, to the final practical outcome. Although the complexity and sheer timescale involved make external evaluation problematic, the combination of practical satisfaction and theoretical stimulation make the attempt to combine basic and applied research very rewarding.

  13. Introduction to applied thermodynamics

    CERN Document Server

    Helsdon, R M; Walker, G E

    1965-01-01

    Introduction to Applied Thermodynamics is an introductory text on applied thermodynamics and covers topics ranging from energy and temperature to reversibility and entropy, the first and second laws of thermodynamics, and the properties of ideal gases. Standard air cycles and the thermodynamic properties of pure substances are also discussed, together with gas compressors, combustion, and psychrometry. This volume is comprised of 16 chapters and begins with an overview of the concept of energy as well as the macroscopic and molecular approaches to thermodynamics. The following chapters focus o

  14. Sedimentary Records of Environmental Evolution During the Recent 100 Years in the Coastal Zone of Guangxi Province%广西海岸带近百年来人类活动影响下环境演变的沉积记录

    Institute of Scientific and Technical Information of China (English)

    夏鹏; 孟宪伟; 李珍; 丰爱平; 王湘芹

    2012-01-01

    Six sediment cores were collected during 2007 from the coastal zone of Guangxi province. The temporal evolution of biogenic element (C, N and P) and metal (Hg, Cu, Pb, Zn, Cd, Cr and As) inputs was clearly recorded by high 210Pbxs-derived sedimentation rates (0.25~1.68 cm·y-1), especially in the estuaries of the Qinjiang and Nanliu rivers (1.68 cm·y-1 and 0.70 cm·y-1, respectively), which could be attributed to high rates of river sediment transport. Based on the vertical distributions of enrichment factors and excess fluxes, heavy metals and total phosphorus were obviously enriched in the recent 20 years, but did not exceed the quality standard for marine sediment. The results indicated that natural inputs prevailed up to the early 1980s, except for Cu. After this period, the excess metal fluxes could be associated with the intensive use of phosphate fertilizers and the combustion of fossil fuels, which caused a slight enrichment. However, total organic carbon showed a decreasing trend toward the surface, which could be associated with the recent decrease of mangrove forest caused by tidal flat reclamation. According to all indicators, the environmental evolution of the Guangxi coast during the recent 100 years can be divided into two stages: (1) before the early 1980s, characterized by relatively low heavy metal pollution and scarce eutrophication; and (2) after the early 1980s, when the concentrations of heavy metals and total phosphorus increased significantly, indicating anthropogenic inputs.%Based on six short sediment cores (64~97 cm) collected from the Guangxi coastal zone in 2007, and within a 210Pb chronological framework, biogenic elements (C, N and P), major elements and heavy metals (Hg, Cu, Pb, Zn, Cd, Cr and As) were comprehensively analyzed, and the evolution of the sedimentary environment of the Guangxi coastal zone under human influence over the last 100 years was reconstructed. The study found that the surface enrichment and burial fluxes of heavy metals and total phosphorus have risen markedly over the last twenty years, but the overall pollution
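The enrichment factors and excess fluxes this abstract relies on are simple ratios: a metal is normalized to a conservative element (commonly Al) and compared against a pre-industrial background layer, and the excess flux scales the concentration surplus by the sedimentation rate and dry bulk density. A sketch with illustrative numbers (not values from the cores):

```python
def enrichment_factor(metal_sample, al_sample, metal_background, al_background):
    """EF = (Me/Al)_sample / (Me/Al)_background; EF near 1 suggests natural input."""
    return (metal_sample / al_sample) / (metal_background / al_background)

def excess_flux(conc_sample, conc_background, sed_rate_cm_yr, dry_bulk_density_g_cm3):
    """Burial flux above background, in ug cm^-2 yr^-1 for concentrations in ug/g."""
    return (conc_sample - conc_background) * sed_rate_cm_yr * dry_bulk_density_g_cm3

# Illustrative surface vs. pre-industrial values for Pb (ug/g) and Al (%),
# using the 1.68 cm/yr sedimentation rate quoted for the Qinjiang estuary.
ef_pb = enrichment_factor(45.0, 6.5, 25.0, 7.0)
flux_pb = excess_flux(45.0, 25.0, 1.68, 0.9)
print(f"EF = {ef_pb:.2f}, excess flux = {flux_pb:.1f} ug/cm2/yr")
```

Profiles of EF and excess flux down a dated core are what allow the abstract's division into a pre-1980s natural-input stage and a later anthropogenic stage.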

  15. Pro Tools HD

    CERN Document Server

    Camou, Edouard

    2013-01-01

    An easy-to-follow guide for using Pro Tools HD 11 effectively. This book is ideal for anyone who already uses Pro Tools and wants to learn more, or is new to Pro Tools HD and wants to use it effectively in their own audio workstations.

  16. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  17. Applied Statistics with SPSS

    Science.gov (United States)

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  18. Essays on Applied Microeconomics

    Science.gov (United States)

    Mejia Mantilla, Carolina

    2013-01-01

    Each chapter of this dissertation studies a different question within the field of Applied Microeconomics. The first chapter examines the mid- and long-term effects of the 1998 Asian Crisis on the educational attainment of Indonesian children ages 6 to 18, at the time of the crisis. The effects are identified as deviations from a linear trend for…

  19. Taxonomic Evidence Applying Intelligent Information

    Directory of Open Access Journals (Sweden)

    Félix Anibal Vallejos

    2005-12-01

    Full Text Available The Numeric Taxonomy aims to group operational taxonomic units (OTUs, or taxa) into clusters, using the so-called structure analysis by means of numeric methods. These clusters, which constitute families, are the purpose of this series of projects; they emerge from the structural analysis of the phenotypical characteristics of the OTUs, exhibiting their relationships in terms of grades of similarity, employing tools such as (i) the Euclidean distance and (ii) nearest neighbor techniques. Thus taxonomic evidence is gathered so as to quantify the similarity for each pair of OTUs (pair-group method) obtained from the basic data matrix, and in this way the significant concept of the spectrum of the OTUs, based on the states of their characters, is introduced. A new taxonomic criterion is thereby formulated and a new approach to Computational Taxonomy is presented, one that has already been employed, with reference to Data Mining, when applying Machine Learning techniques, in particular the C4.5 algorithm created by Quinlan, to assess the degree of efficiency achieved by the TDIDT family of algorithms when generating valid models of the data in classification problems, using the Gain of Entropy through the Maximum Entropy Principle.
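The pair-group similarity and nearest-neighbor grouping described in this record can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' implementation: the data matrix, threshold, and function names are hypothetical.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two OTU character-state vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_neighbour_clusters(matrix, threshold):
    """Greedy single-linkage grouping: repeatedly merge the two clusters
    whose closest pair of OTUs lies within `threshold`."""
    clusters = [[i] for i in range(len(matrix))]
    merged = True
    while merged:
        merged = False
        best = (threshold, None, None)
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(euclidean(matrix[a], matrix[b])
                        for a in clusters[i] for b in clusters[j])
                if d <= best[0]:
                    best = (d, i, j)
        if best[1] is not None:
            _, i, j = best
            clusters[i].extend(clusters.pop(j))
            merged = True
    return clusters

# Hypothetical basic data matrix: each row holds one OTU's character states.
otus = [[0.0, 0.1], [0.1, 0.0], [5.0, 5.1], [5.1, 5.0]]
print(nearest_neighbour_clusters(otus, threshold=1.0))
```

Two tight pairs of OTUs are merged while the distant pairs stay in separate clusters, mirroring how grades of similarity partition the OTUs into families.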

  20. Pickering tool management system

    International Nuclear Information System (INIS)

    Tools were being deployed in the station with no process in effect to ensure that they were maintained in good repair so as to effectively support the performance of maintenance activities. Current legislation requires that all employers have a process in place to ensure that tools are maintained in a safe condition; this is specified in the Ontario Health and Safety Act. The Pickering Tool Management System has been chosen as the process at Pickering N.D. to manage tools. Tools are identified by number etching and bar codes. The system is a Windows application installed on several file servers

  1. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  2. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    This paper describes and discusses two different Scheme documentation tools. The first is SchemeDoc, which is intended for documentation of the interfaces of Scheme libraries (APIs). The second is the Scheme Elucidator, which is for internal documentation of Scheme programs. Although the tools are separate and intended for different documentation purposes, they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which, in a systematic way, makes an XML language available...

  3. Applied data mining for business and industry

    CERN Document Server

    Giudici, Paolo

    2009-01-01

    The increasing availability of data in our current, information overloaded society has led to the need for valid tools for its modelling and analysis. Data mining and applied statistical methods are the appropriate tools to extract knowledge from such data. This book provides an accessible introduction to data mining methods in a consistent and application oriented statistical framework, using case studies drawn from real industry projects and highlighting the use of data mining methods in a variety of business applications. Introduces data mining methods and applications. Covers classical and Bayesian multivariate statistical methodology as well as machine learning and computational data mining methods. Includes many recent developments such as association and sequence rules, graphical Markov models, lifetime value modelling, credit risk, operational risk and web mining. Features detailed case studies based on applied projects within industry. Incorporates discussion of data mining software, with case studies a...

  4. Applying the WEAP Model to Water Resource

    DEFF Research Database (Denmark)

    Gao, Jingjing; Christensen, Per; Li, Wei

    Water resources assessment is a tool to provide decision makers with an appropriate basis to make informed judgments regarding the objectives and targets to be addressed during the Strategic Environmental Assessment (SEA) process. The study shows how water resources assessment can be applied in SEA in assessing the effects on water resources, using a case study on a Coal Industry Development Plan in an arid region in North Western China. In the case, the WEAP model (Water Evaluation And Planning System) was used to simulate various scenarios using a diversity of technological instruments like irrigation efficiency, treatment and reuse of water. The WEAP model was applied to the Ordos catchment, where it was used for the first time in China. The changes in water resource utilization in the Ordos basin were assessed with the model. It was found that the WEAP model is a useful tool for water resource assessment...

  5. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  6. ENVIRONMENTAL MANAGEMENT TOOLS: INTERNATIONAL PRACTICES FOR RUSSIA

    OpenAIRE

    Smetanina, T.; Pintassilgo, P.; Matias, A.

    2014-01-01

    This article deals with the basic tools of environmental management applied by developed countries and discusses their application to Russia. The focus is on environmental management instruments such as environmental taxes, subsidies, standards and permits, and also on the important role of voluntary tools. Russian practice is analyzed in terms of the current environmental management situation and the prospects of necessary legislative actions. The article refers to the formation of the basic parts...

  7. Applied Control Systems Design

    CERN Document Server

    Mahmoud, Magdi S

    2012-01-01

    Applied Control System Design examines several methods for building up systems models based on real experimental data from typical industrial processes and incorporating system identification techniques. The text takes a comparative approach to the models derived in this way judging their suitability for use in different systems and under different operational circumstances. A broad spectrum of control methods including various forms of filtering, feedback and feedforward control is applied to the models and the guidelines derived from the closed-loop responses are then composed into a concrete self-tested recipe to serve as a check-list for industrial engineers or control designers. System identification and control design are given equal weight in model derivation and testing to reflect their equality of importance in the proper design and optimization of high-performance control systems. Readers’ assimilation of the material discussed is assisted by the provision of problems and examples. Most of these e...

  8. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  9. Applied Economics in Teaching

    Institute of Scientific and Technical Information of China (English)

    朱红萍

    2009-01-01

    This paper explains some common phenomena in teaching and class management from an economic point of view. Some basic economic principles mentioned therein are: everything has its opportunity cost; the marginal utility of consumption of any kind is diminishing; game theory is everywhere. By applying these economic theories to teaching, teachers can better understand students' behavior and thus improve teaching effectiveness and efficiency.

  10. Essays in Applied Microeconomics

    OpenAIRE

    Buehler, Benno

    2010-01-01

    This thesis consists of 4 chapters in the field of applied microeconomics. Chapter 1 develops a model of international roaming. International alliances emerge endogenously and serve as a commitment device to soften competition on the retail market. Chapter 2 provides an explanation for why political leaders may want to adopt ideological positions. Because voters expect the perceived ideology of office holders to determine their future political actions, politicians are tempted to act ac...

  11. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  12. Methods of applied mathematics

    CERN Document Server

    Hildebrand, Francis B

    1992-01-01

    This invaluable book offers engineers and physicists working knowledge of a number of mathematical facts and techniques not commonly treated in courses in advanced calculus, but nevertheless extremely useful when applied to typical problems in many different fields. It deals principally with linear algebraic equations, quadratic and Hermitian forms, operations with vectors and matrices, the calculus of variations, and the formulations and theory of linear integral equations. Annotated problems and exercises accompany each chapter.

  13. Essays on Applied Microeconomics

    OpenAIRE

    Lee, Hoan Soo

    2013-01-01

    Empirical and theoretical topics in applied microeconomics are discussed in this dissertation. The first essay identifies and measures managerial advantages from access to high-quality deals in venture capital investments. The underlying social network of Harvard Business School MBA venture capitalists and entrepreneurs is used to proxy availability of deal access. Random section assignment of HBS MBA graduates provides a key exogenous variation for identification. Being socially connected to...

  14. Brain oxygenation patterns during the execution of tool use demonstration, tool use pantomime, and body-part-as-object tool use.

    Science.gov (United States)

    Helmich, Ingo; Holle, Henning; Rein, Robert; Lausberg, Hedda

    2015-04-01

    Divergent findings exist as to whether the left and right hemispheric pre- and postcentral cortices contribute to the production of tool-use-related hand movements. In order to clarify the neural substrates of tool use demonstrations with tool in hand, tool use pantomimes without tool in hand, and body-part-as-object presentations of tool use (BPO) in a naturalistic mode of execution, we applied functional Near InfraRed Spectroscopy (fNIRS) in twenty-three right-handed participants. Functional NIRS techniques allow for the investigation of brain oxygenation during the execution of complex hand movements with an unlimited movement range. Brain oxygenation patterns were retrieved from 16 channels of measurement above the pre- and postcentral cortices of each hemisphere. The results showed that tool use demonstration with tool in hand leads to increased oxygenation in the left hemispheric somatosensory gyrus as compared to tool use pantomime. Left hand executions of the demonstration of tool use, pantomime of tool use, and BPO of tool use led to increased oxygenation in the premotor and somatosensory cortices of the left hemisphere as compared to right hand executions of either condition. The results indicate that the premotor and somatosensory cortices of the left hemisphere constitute relevant brain structures for tool-related hand movement production when using the left hand, whereas the somatosensory cortex of the left hemisphere seems to provide specific mental representations when performing tool use demonstrations with the tool in hand.

  15. Recent Advances in Algal Genetic Tool Development

    Energy Technology Data Exchange (ETDEWEB)

    Dahlin, Lukas R.; Guarnieri, Michael T.

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  16. Miniaturised Spotter-Compatible Multicapillary Stamping Tool for Microarray Printing

    CERN Document Server

    Drobyshev, Alexei L; Verkhodanov, Nikolai N; Zasedatelev, Alexander S

    2007-01-01

    A novel microstamping tool for microarray printing is proposed. The tool can spot up to 127 droplets of different solutions in a single touch and is compatible with commercially available microarray spotters. It is based on a multichannel funnel with polypropylene capillaries inserted into its channels; superior flexibility is achieved by the ability to replace any printing capillary of the tool. As a practical implementation, hydrogel-based microarrays were stamped and successfully applied to identify Mycobacterium tuberculosis drug resistance.

  17. Capacitive tool standoff sensor for dismantlement tasks

    Energy Technology Data Exchange (ETDEWEB)

    Schmitt, D.J.; Weber, T.M. [Sandia National Labs., Albuquerque, NM (United States); Liu, J.C. [Univ. of Illinois, Urbana, IL (United States)

    1996-12-31

    A capacitive sensing technology has been applied to develop a Standoff Sensor System for control of robotically deployed tools utilized in Decontamination and Dismantlement (D and D) activities. The system combines four individual sensor elements to provide non-contact, multiple degree-of-freedom control of tools at distances up to five inches from a surface. The Standoff Sensor has been successfully integrated to a metal cutting router and a pyrometer, and utilized for real-time control of each of these tools. Experiments demonstrate that the system can locate stationary surfaces with a repeatability of 0.034 millimeters.
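The record does not describe the sensing model itself, but the relationship between measured capacitance and standoff distance can be illustrated with an idealized parallel-plate approximation, C = ε0·A/d, inverted to d = ε0·A/C. The element area and reading below are hypothetical, and fringing fields are ignored.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def standoff_distance(capacitance_f, plate_area_m2):
    """Idealized parallel-plate inversion: d = eps0 * A / C.
    A first-order model only; real capacitive sensors need calibration."""
    return EPS0 * plate_area_m2 / capacitance_f

# Hypothetical sensor element: a 1 cm^2 pad reading 0.1 pF.
d = standoff_distance(0.1e-12, 1e-4)
print(f"{d * 1000:.2f} mm")
```

Combining several such elements, as the four-element system above does, additionally allows tilt of the tool relative to the surface to be estimated from the differences between the per-element distances.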

  18. Capacitive tool standoff sensor for dismantlement tasks

    International Nuclear Information System (INIS)

    A capacitive sensing technology has been applied to develop a Standoff Sensor System for control of robotically deployed tools utilized in Decontamination and Dismantlement (D and D) activities. The system combines four individual sensor elements to provide non-contact, multiple degree-of-freedom control of tools at distances up to five inches from a surface. The Standoff Sensor has been successfully integrated to a metal cutting router and a pyrometer, and utilized for real-time control of each of these tools. Experiments demonstrate that the system can locate stationary surfaces with a repeatability of 0.034 millimeters

  19. The process-based stand growth model Formix 3-Q applied in a GIS environment for growth and yield analysis in a tropical rain forest.

    Science.gov (United States)

    Ditzer, T.; Glauner, R.; Förster, M.; Köhler, P.; Huth, A.

    2000-03-01

    Managing tropical rain forests is difficult because few long-term field data on forest growth and the impact of harvesting disturbance are available. Growth models may provide a valuable tool for managers of tropical forests, particularly if applied to the extended forest areas of up to 100,000 ha that typically constitute the so-called forest management units (FMUs). We used a stand growth model in a geographic information system (GIS) environment to simulate tropical rain forest growth at the FMU level. We applied the process-based rain forest growth model Formix 3-Q to the 55,000 ha Deramakot Forest Reserve (DFR) in Sabah, Malaysia. The FMU was considered to be composed of single and independent small-scale stands differing in site conditions and forest structure. Field data, which were analyzed with a GIS, comprised a terrestrial forest inventory, site and soil analyses (water, nutrients, slope), the interpretation of aerial photographs of the present vegetation and topographic maps. Different stand types were determined based on a classification of site quality (three classes), slopes (four classes), and present forest structure (four strata). The effects of site quality on tree allometry (height-diameter curve, biomass allometry, leaf area) and growth (increment size) are incorporated into Formix 3-Q. We derived allometric relations and growth factors for different site conditions from the field data. Climax forest structure at the stand level was shown to depend strongly on site conditions. Simulated successional pattern and climax structure were compared with field observations. Based on the current management plan for the DFR, harvesting scenarios were simulated for stands on different sites. The effects of harvesting guidelines on forest structure and the implications for sustainable forest management at Deramakot were analyzed. Based on the stand types and GIS analysis, we also simulated undisturbed regeneration of the logged-over forest in the DFR at

  20. PAT tools for fermentation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist; Bolic, Andrijana; Svanholm, Bent

    2012-01-01

    The publication of the Process Analytical Technology (PAT) guidance has been one of the most important milestones for pharmaceutical production during the past ten years. The ideas outlined in the PAT guidance are also applied in other industries, for example the fermentation industry. Process knowledge is central in PAT projects. This manuscript therefore gives a brief overview of a number of PAT tools for collecting process knowledge on fermentation processes: on-line sensors, mechanistic models and small-scale equipment for high-throughput experimentation. The manuscript ends with a short...

  1. Lunar hand tools

    Science.gov (United States)

    Bentz, Karl F.; Coleman, Robert D.; Dubnik, Kathy; Marshall, William S.; Mcentee, Amy; Na, Sae H.; Patton, Scott G.; West, Michael C.

    1987-01-01

    Tools useful for operations and maintenance tasks on the lunar surface were determined and designed. Primary constraints are the lunar environment, the astronaut's space suit and the strength limits of the astronaut on the moon. A multipurpose rotary motion tool and a collapsible tool carrier were designed. For the rotary tool, a brushless motor and controls were specified, a material for the housing was chosen, bearings and lubrication were recommended and a planetary reduction gear attachment was designed. The tool carrier was designed primarily for ease of access to the tools and fasteners. A material was selected and structural analysis was performed on the carrier. Recommendations were made about the limitations of human performance and about possible attachments to the torque driver.

  2. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  3. Inspection tools : tool selection and execution

    Energy Technology Data Exchange (ETDEWEB)

    Van Aelst, A. [Cimarron Engineering Ltd., Calgary, AB (Canada); Parker, C. [TransCanada PipeLines Ltd., Calgary, AB (Canada); Lussier, S.; Gates, J.; Revie, W. [Natural Resources Canada, Ottawa, ON (Canada). CANMET Materials Technology Lab

    2007-07-01

    Working Group 8 discussed oil and gas pipeline integrity issues with particular reference to inspection tools and techniques used by the upstream and downstream pipeline industries. The group addressed other options besides in-line inspection (ILI) for integrity assessments and presented emerging concerns with non-piggable piping. Inspection techniques such as hydrotesting, direct assessment, cameras, and X-ray imaging were reviewed. The actions taken in response to recommendations from past workshops were also presented, including measures to improve standardization of reporting practices. It was noted that tools currently available have limitations that vary with the environment in which they are used, the depth to which the pipeline is buried, and the size and orientation of the pipeline. It was determined that economics plays a key role in the selection of pipeline inspection techniques, and that failure modes should be considered when choosing inspection techniques. Given that environmental consequences are different for natural gas and liquid pipelines, this may influence the choice of inspection needed in environmentally sensitive areas. It was determined that new technologies have yet to gain the confidence of industry and that while some existing technologies (such as cameras) can provide data, they are not suitable as the only tool used in an integrity assessment. The accuracy of inspections was found to depend on the tools, methods and personnel, with variation in accuracy depending primarily on personnel. It was concluded that research and development of new inspection techniques should continue. tabs., figs.

  4. The Matecat Tool

    OpenAIRE

    Federico, Marcello; Bertoldi, Nicola; Cettolo, Mauro; Negri, Matteo; TURCHI Marco; Trombetti, Marco; Cattelan, Alessandro; Farina, Antonio; Lupinetti, Domenico; Martines, Andrea; Massidda, Alberto; Schwenk, Holger; Barrault, Loïc; Blain, Frédéric; Koehn, Philipp

    2014-01-01

    We present a new web-based CAT tool providing translators with a professional work environment, integrating translation memories, terminology bases, concordancers, and machine translation. The tool is completely developed as open source software and has been already successfully deployed for business, research and education. The MateCat Tool represents today probably the best available open source platform for investigating, integrating, and evaluating under realistic conditions the impact of...

  5. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

    A new edition of the definitive guide to logistic regression modeling for health science and other applications. This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-

  6. Applied Semantic Web Technologies

    CERN Document Server

    Sugumaran, Vijayan

    2011-01-01

    The rapid advancement of semantic web technologies, along with the fact that they are at various levels of maturity, has left many practitioners confused about the current state of these technologies. Focusing on the most mature technologies, Applied Semantic Web Technologies integrates theory with case studies to illustrate the history, current state, and future direction of the semantic web. It maintains an emphasis on real-world applications and examines the technical and practical issues related to the use of semantic technologies in intelligent information management. The book starts with

  7. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2005-01-01

    Master linear regression techniques with a new edition of a classic text. Reviews of the Second Edition: "I found it enjoyable reading and so full of interesting material that even the well-informed reader will probably find something new . . . a necessity for all of those who do linear regression." -Technometrics, February 1987. "Overall, I feel that the book is a valuable addition to the now considerable list of texts on applied linear regression. It should be a strong contender as the leading text for a first serious course in regression analysis." -American Scientist, May-June 1987

  8. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." -International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus

  9. Applied energy an introduction

    CERN Document Server

    Abdullah, Mohammad Omar

    2012-01-01

    Introduction to Applied Energy: General Introduction; Energy and Power Basics; Energy Equation; Energy Generation Systems; Energy Storage and Methods; Energy Efficiencies and Losses. Energy Industry and Energy Applications in Small-Medium Enterprises (SME) Industries: Energy Industry; Energy-Intensive Industry; Energy Applications in SME Energy Industries. Energy Sources and Supply: Energy Sources; Energy Supply and Energy Demand; Energy Flow Visualization and Sankey Diagram. Energy Management and Analysis: Energy Audits; Energy Use and Fuel Consumption Study; Energy Life-Cycle Analysis. Energy and Environment: Energy Pollutants, S

  10. Applied impulsive mathematical models

    CERN Document Server

    Stamova, Ivanka

    2016-01-01

    Using the theory of impulsive differential equations, this book focuses on mathematical models which reflect current research in biology, population dynamics, neural networks and economics. The authors provide the basic background from the fundamental theory and give a systematic exposition of recent results related to the qualitative analysis of impulsive mathematical models. Consisting of six chapters, the book presents many applicable techniques, making them available in a single source easily accessible to researchers interested in mathematical models and their applications. Serving as a valuable reference, this text is addressed to a wide audience of professionals, including mathematicians, applied researchers and practitioners.

  11. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. When his system is applied to problems identified by himself and his supporters, it is shown that it does not have some of the features he intended and does not solve the problems they have identified.

  12. SIFT applied to CBIR

    Directory of Open Access Journals (Sweden)

    ALMEIDA, J.

    2009-12-01

    Full Text Available Content-Based Image Retrieval (CBIR) is a challenging task. Common approaches use only low-level features. Notwithstanding, such CBIR solutions fail to capture some local features representing the details and nuances of scenes. Many techniques in image processing and computer vision can capture these scene semantics. Among them, the Scale Invariant Feature Transform (SIFT) has been widely used in many applications. This approach relies on the choice of several parameters which directly impact its effectiveness when applied to retrieve images. In this paper, we discuss the results obtained in several experiments proposed to evaluate the application of SIFT in CBIR tasks.
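The abstract does not say which SIFT parameters were varied, but one commonly tuned parameter in retrieval pipelines is the nearest-neighbour ratio threshold used to accept descriptor matches (Lowe's ratio test). A minimal sketch, assuming descriptors have already been extracted; the toy 2-D vectors below merely stand in for 128-D SIFT descriptors.

```python
import math

def l2(a, b):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def ratio_test_matches(query_desc, db_desc, ratio=0.8):
    """Keep a query descriptor only if its nearest database descriptor
    is clearly closer than the second nearest (Lowe's ratio test)."""
    matches = []
    for qi, q in enumerate(query_desc):
        dists = sorted((l2(q, d), di) for di, d in enumerate(db_desc))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((qi, dists[0][1]))
    return matches

# Toy 2-D "descriptors": each query vector has one clear match in the database.
query = [[1.0, 1.0], [9.0, 9.0]]
db = [[1.1, 1.0], [5.0, 5.0], [9.0, 9.1]]
print(ratio_test_matches(query, db))
```

Lowering the ratio threshold rejects more ambiguous matches, which is exactly the kind of parameter choice whose effect on retrieval effectiveness the paper evaluates.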

  13. Applied complex variables

    CERN Document Server

    Dettman, John W

    1965-01-01

    Analytic function theory is a traditional subject going back to Cauchy and Riemann in the 19th century. Once the exclusive province of advanced mathematics students, its applications have proven vital to today's physicists and engineers. In this highly regarded work, Professor John W. Dettman offers a clear, well-organized overview of the subject and various applications - making the often-perplexing study of analytic functions of complex variables more accessible to a wider audience. The first half of Applied Complex Variables, designed for sequential study, is a step-by-step treatment of fun

  14. PAT tools for fermentation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist

    The publication of the Process Analytical Technology (PAT) guidance has been one of the most important milestones for pharmaceutical production during the past ten years. The ideas outlined in the PAT guidance are also applied in other industries, for example the fermentation industry. Process knowledge is central in PAT projects. This presentation therefore gives a brief overview of a number of PAT tools for collecting process knowledge on fermentation processes: on-line sensors, where for example spectroscopic measurements are increasingly applied; mechanistic models, which can be used to summarize process knowledge, to support experimental work, and also within design of PAT systems; and small-scale equipment for high-throughput experimentation, a field which has been researched intensively during the past decade. The presentation ends with a short perspective on future developments...

  15. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    Tools for design management are on the agenda in building projects in order to set targets, to choose and prioritise between alternative environmental solutions, to involve stakeholders, and to document, evaluate and benchmark. Different types of tools are available, but what can we learn from the use or lack of use of current tools in the development of future design tools for sustainable buildings? Why are some used while others are not? Who is using them? The paper deals with design management, with special focus on sustainable building in Denmark, and the challenge of turning the generally...

  16. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well defined sections for the different parts of Spring.Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  17. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  18. Chimera Grid Tools

    Science.gov (United States)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  19. Java Power Tools

    CERN Document Server

    Smart, John

    2008-01-01

    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  20. Orbiter Entry Aerothermodynamics Practical Engineering and Applied Research

    Science.gov (United States)

    Campbell, Charles H.

    2009-01-01

    The contents include: 1) Organization of the Orbiter Entry Aeroheating Working Group; 2) Overview of the Principal RTF Aeroheating Tools Utilized for Tile Damage Assessment; 3) Description of the Integrated Tile Damage Assessment Team Analyses Process; 4) Space Shuttle Flight Support Process; and 5) JSC Applied Aerosciences and CFD Branch Applied Research Interests.

  1. Application of Automatic Generation Technology for Segment Tool Electrode Blank Geometry and Cutting Dimension Drawings

    Institute of Scientific and Technical Information of China (English)

    胡海明; 张浩

    2013-01-01

    To solve the inefficiency of drawing the tool electrode blank geometry and the cutting dimension drawing separately by hand, a program that automatically generates both at the same time was written in the GRIP language. Users need only select the three-dimensional model of the tool electrode and enter the unilateral margin; the tool electrode blank geometry and its cutting dimension drawing are then generated automatically, greatly improving work efficiency.

  2. Applied plasma physics

    International Nuclear Information System (INIS)

    Applied Plasma Physics is a major sub-organizational unit of the MFE Program. It includes Fusion Plasma Theory and Experimental Plasma Research. The Fusion Plasma Theory group has the responsibility for developing theoretical-computational models in the general areas of plasma properties, equilibrium, stability, transport, and atomic physics. This group has responsibility for giving guidance to the mirror experimental program. There is a formal division of the group into theory and computational; however, in this report the efforts of the two areas are not separated since many projects have contributions from members of both. Under the Experimental Plasma Research Program, we are developing the intense, pulsed neutral-beam source (IPINS) for the generation of a reversed-field configuration on 2XIIB. We are also studying the feasibility of utilizing certain neutron-detection techniques as plasma diagnostics in the next generation of thermonuclear experiments

  3. Applied partial differential equations

    CERN Document Server

    Logan, J David

    2015-01-01

    This text presents the standard material usually covered in a one-semester, undergraduate course on boundary value problems and PDEs.  Emphasis is placed on motivation, concepts, methods, and interpretation, rather than on formal theory. The concise treatment of the subject is maintained in this third edition covering all the major ideas: the wave equation, the diffusion equation, the Laplace equation, and the advection equation on bounded and unbounded domains. Methods include eigenfunction expansions, integral transforms, and characteristics. In this third edition, the text remains intimately tied to applications in heat transfer, wave motion, biological systems, and a variety of other topics in pure and applied science. The text offers flexibility to instructors who, for example, may wish to insert topics from biology or numerical methods at any time in the course. The exposition is presented in a friendly, easy-to-read style, with mathematical ideas motivated from physical problems. Many exercises and worked e...
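    One of the equations listed above, the 1-D diffusion equation, can be advanced with an explicit (FTCS) finite-difference scheme in a few lines. A minimal sketch; the grid sizes, time step and initial condition are illustrative choices, not taken from the text:

```python
# Explicit (FTCS) finite-difference sketch for the 1-D diffusion equation
# u_t = k * u_xx on [0, 1] with u = 0 held at both ends.
k = 1.0
nx = 51
dx = 1.0 / (nx - 1)
dt = 0.4 * dx * dx / k           # obeys the stability bound dt <= dx^2 / (2k)
u = [0.0] * nx
u[nx // 2] = 1.0                 # initial "hot spot" at the midpoint

for _ in range(200):
    un = u[:]                    # previous time level
    for i in range(1, nx - 1):
        u[i] = un[i] + k * dt / dx ** 2 * (un[i + 1] - 2 * un[i] + un[i - 1])

print(max(u))                    # the peak decays as heat spreads outward
```

    With the time step above the scheme is stable and positivity-preserving; doubling dt past dx²/(2k) would make the solution oscillate and blow up.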

  4. Applied number theory

    CERN Document Server

    Niederreiter, Harald

    2015-01-01

    This textbook effectively builds a bridge from basic number theory to recent advances in applied number theory. It presents the first unified account of the four major areas of application where number theory plays a fundamental role, namely cryptography, coding theory, quasi-Monte Carlo methods, and pseudorandom number generation, allowing the authors to delineate the manifold links and interrelations between these areas.  Number theory, which Carl-Friedrich Gauss famously dubbed the queen of mathematics, has always been considered a very beautiful field of mathematics, producing lovely results and elegant proofs. While only very few real-life applications were known in the past, today number theory can be found in everyday life: in supermarket bar code scanners, in our cars’ GPS systems, in online banking, etc.  Starting with a brief introductory course on number theory in Chapter 1, which makes the book more accessible for undergraduates, the authors describe the four main application areas in Chapters...
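    The cryptography application mentioned above can be illustrated with a toy RSA key pair built from textbook-sized primes (the classic 61/53 example; real keys use primes hundreds of digits long, and this sketch is purely illustrative):

```python
# Toy RSA sketch illustrating number theory in cryptography.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17                               # public exponent, coprime to phi

def modinv(a, m):
    """Modular inverse of a mod m via the extended Euclidean algorithm."""
    old_r, r = a, m
    old_s, s = 1, 0
    while r:
        quot = old_r // r
        old_r, r = r, old_r - quot * r
        old_s, s = s, old_s - quot * s
    return old_s % m

d = modinv(e, phi)                   # private exponent
msg = 65
cipher = pow(msg, e, n)              # encrypt: c = m^e mod n
plain = pow(cipher, d, n)            # decrypt: m = c^d mod n
print(d, cipher, plain)              # → 2753 2790 65
```

    The same modular-exponentiation machinery, scaled up, is what sits behind the online-banking example in the blurb.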

  5. Applied plasma physics

    International Nuclear Information System (INIS)

    Applied Plasma Physics is a major sub-organizational unit of the MFE Program. It includes Fusion Plasma Theory and Experimental Plasma Research. Fusion Plasma Theory has the responsibility for developing theoretical-computational models in the general areas of plasma properties, equilibrium, stability, transport, and atomic physics. This group has responsibility for giving guidance to the mirror experimental program. There is a formal division of the group into theory and computational; however, in this report the efforts of the two areas are not separated since many projects have contributions from members of both. Under Experimental Plasma Research, we are developing the intense, pulsed ion-neutral source (IPINS) for the generation of a reversed-field configuration on 2XIIB. We are also studying the feasibility of utilizing certain neutron-detection techniques as plasma diagnostics in the next generation of thermonuclear experiments

  6. Applied statistical thermodynamics

    CERN Document Server

    Lucas, Klaus

    1991-01-01

    The book guides the reader from the foundations of statistical thermodynamics including the theory of intermolecular forces to modern computer-aided applications in chemical engineering and physical chemistry. The approach is new. The foundations of quantum and statistical mechanics are presented in a simple way and their applications to the prediction of fluid phase behavior of real systems are demonstrated. A particular effort is made to introduce the reader to explicit formulations of intermolecular interaction models and to show how these models influence the properties of fluid systems. The established methods of statistical mechanics - computer simulation, perturbation theory, and numerical integration - are discussed in a style appropriate for newcomers and are extensively applied. Numerous worked examples illustrate how practical calculations should be carried out.
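    The computer-simulation method mentioned above can be sketched in miniature: a Metropolis Monte Carlo walk sampling the Boltzmann weight exp(-E(x)/T) for a harmonic potential E(x) = x²/2, in units with k_B = 1. All parameter values below are illustrative, not from the book:

```python
import math
import random

# Minimal Metropolis Monte Carlo sketch for a 1-D harmonic oscillator.
random.seed(0)
T, x, step = 1.0, 0.0, 1.0
samples = []
for i in range(200_000):
    x_new = x + random.uniform(-step, step)
    dE = 0.5 * (x_new ** 2 - x ** 2)
    if dE <= 0.0 or random.random() < math.exp(-dE / T):
        x = x_new                    # accept the move
    if i >= 10_000:                  # discard equilibration steps
        samples.append(x * x)

mean_x2 = sum(samples) / len(samples)
print(mean_x2)                       # ≈ <x^2> = T = 1.0 by equipartition
```

    The sampled mean of x² converging to T is exactly the kind of fluid-property prediction, in the simplest possible setting, that such simulations deliver for realistic interaction models.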

  7. Applied plasma physics

    International Nuclear Information System (INIS)

    Applied Plasma Physics is a major sub-organizational unit of the Magnetic Fusion Energy (MFE) Program. It includes Fusion Plasma Theory and Experimental Plasma Research. The Fusion Plasma Theory group has the responsibility for developing theoretical-computational models in the general areas of plasma properties, equilibrium, stability, transport, and atomic physics. This group has responsibility for giving guidance to the mirror experimental program. There is a formal division of the group into theory and computational; however, in this report the efforts of the two areas are not separated since many projects have contributions from members of both. Under the Experimental Plasma Research Program we are developing a neutral-beam source, the intense, pulsed ion-neutral source (IPINS), for the generation of a reversed-field configuration on 2XIIB. We are also studying the feasibility of using certain neutron-detection techniques as plasma diagnostics in the next generation of thermonuclear experiments

  8. Maailma suurim tool (The World's Largest Chair)

    Index Scriptorium Estoniae

    2000-01-01

    AS Tartu näitused, the Tartu Art School and the magazine 'Diivan' are organising the exhibition 'Tool 2000' ('Chair 2000') in Pavilion I of the Tartu exhibition centre on 9-11 March. 2000 chairs will be exhibited, from which a TOP 12 will be selected. The world's largest chair is planned to be erected on the grounds of the exhibition centre. At the same time, the twin fairs 'Sisustus 2000' ('Furnishings 2000') and 'Büroo 2000' ('Office 2000') run in Pavilion II.

  9. Study of Tools Interoperability

    NARCIS (Netherlands)

    Krilavičius, T.

    2007-01-01

    Interoperability of tools usually refers to a combination of methods and techniques that address the problem of making a collection of tools work together. In this study we survey different notions that are used in this context: interoperability, interaction and integration. We point out relations...

  10. WATERS Expert Query Tool

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Expert Query Tool is a web-based reporting tool using the EPA's WATERS database. There are just three steps to using Expert Query: 1. View Selection – Choose what...

  11. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer-assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  12. UniProt Tools.

    Science.gov (United States)

    Pundir, Sangya; Martin, Maria J; O'Donovan, Claire

    2016-01-01

    The Universal Protein Resource (UniProt) is a comprehensive resource for protein sequence and annotation data (UniProt Consortium, 2015). The UniProt Web site receives ∼400,000 unique visitors per month and is the primary means to access UniProt. Along with various datasets that you can search, UniProt provides three main tools. These are the 'BLAST' tool for sequence similarity searching, the 'Align' tool for multiple sequence alignment, and the 'Retrieve/ID Mapping' tool for using a list of identifiers to retrieve UniProtKB proteins and to convert database identifiers from UniProt to external databases or vice versa. This unit provides three basic protocols, three alternate protocols, and two support protocols for using UniProt tools. © 2016 by John Wiley & Sons, Inc. PMID:27010333

  13. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm's leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  14. OOTW Force Design Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  15. PREFACE: Celebrating 100 years of superconductivity: special issue on the iron-based superconductors Celebrating 100 years of superconductivity: special issue on the iron-based superconductors

    Science.gov (United States)

    Crabtree, George; Greene, Laura; Johnson, Peter

    2011-12-01

    In honor of this year's 100th anniversary of the discovery of superconductivity, this special issue of Reports on Progress in Physics is dedicated to the 'iron-based superconductors'—a new class of high-temperature superconductors discovered in 2008. This is the first time the journal has produced a 'theme issue', and we offer it to the community as a 'snapshot' of the present status, both for researchers working in this fast-paced field and for the general physics community. Reports on Progress in Physics publishes three classes of articles—comprehensive full Review Articles, Key Issues Reviews and, most recently, Reports on Progress articles that recount the current status of a rapidly evolving field, befitting the articles in this special issue. It has been an exciting year for superconductivity—there have been numerous celebrations for this centenary recounting the fascinating history of the field, from seven Nobel prizes to life-saving discoveries that brought us medically useful magnetic resonance imaging. The discovery of a completely new class of high-temperature superconductors, whose mechanism remains as elusive as that of the cuprates discovered in 1986, has injected new vitality into the field, and this year those new to the field were provided with the opportunity of interacting with those who have enjoyed a long history in superconductivity. Furthermore, as high-density current carriers with little or no power loss, high-temperature superconductors offer unique solutions to fundamental grid challenges of the 21st century and hold great promise in addressing our global energy challenges. The complexity and promise of these materials has caused our community to share ideas and results more freely than ever before, and it is gratifying to see how we have grown into an enthusiastic global network to advance the field.
This invited collection is true to this agenda and we are delighted to have received contributions from many of the world leaders for an initiative that is designed to benefit both newcomers and established researchers in superconductivity.

  16. Directory of Online Cataloging Tools

    Directory of Open Access Journals (Sweden)

    Mohamed Hamed Mu'awwad

    2007-09-01

    Full Text Available. A directory of online cataloging tools, collecting more than 200 essential tools used in libraries. The resources are divided into four categories: individual tools classified according to type of activity (such as cataloging, classification, and standard formats); collective tools; non-collective tools; and free general tools available on the internet.

  17. New tools for learning.

    Science.gov (United States)

    Dickinson, D

    1999-01-01

    more often to collaborate on creating new knowledge as well as mastering the basics. As technology becomes more ubiquitous, there is growing recognition of the importance of the arts in humanizing the curriculum. "More high-tech, more need for high-touch" is becoming the by-word of many schools. They recognize that the arts are not only culturally important and civilizing influences, but they can facilitate the learning of almost any subject. I believe that these four concepts--the plasticity of the brain, the modifiability of intelligence, the use of technology as a powerful new tool for learning, and the renaissance of the arts in education--have major implications specifically for educational systems and generally for the future of our world. In this time of rapid change, leading-edge educational systems are equipping people with the ability to learn, unlearn, and relearn continually. They are giving students meaningful opportunities to apply what they have learned in order to turn information into knowledge. And--of critical importance if any of this is to lead to a healthy future--they are helping students to learn to use knowledge responsibly, ethically, and with integrity. Furthermore, they are involving students in experiences that develop compassion and altruism in the process of their education. Our complex world urgently needs more people who have developed their fullest potential in mind, body, and spirit. PMID:10770084

  19. Applied Linguistics and the "Annual Review of Applied Linguistics."

    Science.gov (United States)

    Kaplan, Robert B.; Grabe, William

    2000-01-01

    Examines the complexities and differences involved in granting disciplinary status to the role of applied linguistics, discusses the role of the "Annual Review of Applied Linguistics" as a contributor to the development of applied linguistics, and highlights a set of publications for the future of applied linguistics. (Author/VWL)

  20. Micro and nano fabrication tools and processes

    CERN Document Server

    Gatzen, Hans H; Leuthold, Jürg

    2015-01-01

    For Microelectromechanical Systems (MEMS) and Nanoelectromechanical Systems (NEMS) production, each product requires a unique process technology. This book provides a comprehensive insight into the tools necessary for fabricating MEMS/NEMS and the process technologies applied. In addition, it describes enabling technologies which are necessary for a successful production, i.e., wafer planarization and bonding, as well as contamination control.

  1. Computational social networks tools, perspectives and applications

    CERN Document Server

    Abraham, Ajith

    2012-01-01

    Provides the latest advances in computational social networks, and illustrates how organizations can gain a competitive advantage by applying these ideas in real-world scenarios. Presents a specific focus on practical tools and applications. Provides experience reports, survey articles, and intelligence techniques and theories relating to specific problems in network technology.
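    A flavour of the practical network analysis such tool collections build on can be given in a few lines: degree centrality on a small social graph. The graph below is a made-up toy example, not taken from the book:

```python
# Degree centrality: the fraction of other nodes each node is
# directly connected to (undirected toy graph, illustrative names).
edges = [("ana", "bo"), ("ana", "cy"), ("ana", "dee"), ("bo", "cy")]

def degree_centrality(edges):
    nodes = {v for e in edges for v in e}
    deg = {v: 0 for v in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(nodes)
    return {v: d / (n - 1) for v, d in deg.items()}

print(degree_centrality(edges)["ana"])  # → 1.0 (connected to everyone)
```

    Centrality measures like this are the starting point for the competitive-advantage analyses the blurb alludes to, e.g. identifying influential members of an organisation's communication network.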

  2. Academic training: Applied superconductivity

    CERN Multimedia

    2007-01-01

    LECTURE SERIES 17, 18, 19 January from 11.00 to 12.00 hrs Council Room, Bldg 503 Applied Superconductivity : Theory, superconducting Materials and applications E. PALMIERI/INFN, Padova, Italy When hearing about persistent currents recirculating for several years in a superconducting loop without any appreciable decay, one realizes that we are dealing with a phenomenon which in nature is the closest known to the perpetual motion. Zero resistivity and perfect diamagnetism in Mercury at 4.2 K, the breakthrough during 75 years of several hundreds of superconducting materials, the revolution of the "liquid Nitrogen superconductivity"; the discovery of still a binary compound becoming superconducting at 40 K and the subsequent re-exploration of the already known superconducting materials: Nature discloses drop by drop its intimate secrets and nobody can exclude that the last final surprise must still come. After an overview of phenomenology and basic theory of superconductivity, the lectures for this a...

  3. Applied hydraulic transients

    CERN Document Server

    Chaudhry, M Hanif

    2014-01-01

    This book covers hydraulic transients in a comprehensive and systematic manner, from introductory to advanced level, and presents various methods of analysis for computer solution. The field of application of the book is very broad and diverse and covers areas such as hydroelectric projects, pumped storage schemes, water-supply systems, cooling-water systems, oil pipelines and industrial piping systems. Strong emphasis is given to practical applications, including several case studies, problems of applied nature, and design criteria. This will help design engineers and introduce students to real-life projects. This book also: presents modern methods of analysis suitable for computer solution, such as the method of characteristics, explicit and implicit finite-difference methods and matrix methods; includes case studies of actual projects; provides extensive and complete treatment of governed hydraulic turbines; and presents design charts, desi...
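    The method of characteristics mentioned above can be sketched in miniature: a single frictionless reservoir-pipe-valve system with instantaneous valve closure, which reproduces the Joukowsky surge ΔH = a·Q0/(g·A) at the valve. All parameter values below are illustrative, not taken from the book:

```python
# Method-of-characteristics sketch for water hammer in a frictionless
# reservoir-pipe-valve system with instantaneous valve closure.
a, g = 1000.0, 9.81              # wave speed (m/s), gravity (m/s^2)
L, n = 1000.0, 10                # pipe length (m), number of reaches
area, H_res, Q0 = 0.1, 100.0, 0.1
dx = L / n
dt = dx / a                      # Courant condition: dt = dx / a
B = a / (g * area)               # characteristic impedance

H = [H_res] * (n + 1)            # steady frictionless initial state
Q = [Q0] * (n + 1)

for _ in range(5):               # march a few steps after closure
    Hn, Qn = H[:], Q[:]
    for i in range(1, n):        # interior points: C+ and C- lines
        Cp = Hn[i - 1] + B * Qn[i - 1]
        Cm = Hn[i + 1] - B * Qn[i + 1]
        H[i] = 0.5 * (Cp + Cm)
        Q[i] = (Cp - H[i]) / B
    H[0] = H_res                             # upstream reservoir
    Q[0] = (H_res - (Hn[1] - B * Qn[1])) / B
    Q[n] = 0.0                               # valve shut instantly
    H[n] = Hn[n - 1] + B * Qn[n - 1]         # C+ line at the valve

print(round(H[n] - H_res, 2))    # → 101.94, the Joukowsky surge a*Q0/(g*area)
```

    Adding a friction term to the characteristic equations and a proper valve-closure law turns this toy into the standard industrial analysis the book develops.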

  4. Applying evolutionary anthropology.

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution.

  5. Applied research with cyclotrons

    International Nuclear Information System (INIS)

    During the past three decades the Flerov laboratory carried out research and development of a number of applications that have found or may find use in modern technologies. One of the applications is the so-called ion track technology enabling us to create micro- and nano-structured materials. Accelerated heavy ion beams are the unique tools for structuring insulating solids in a controllable manner. At FLNR JINR the U-400 cyclotron and the IC-100 cyclotron are employed for irradiation of materials to be modified by the track-etch technique. For practical applications, U-400 delivers the 86Kr ion beams with total energies of 250, 350, 430 and 750 MeV, and the 136Xe ion beams with the energy of 430 MeV. The cyclotron is equipped with a specialized channel for irradiation of polymer foils. IC-100 is a compact accelerator specially designed for technological uses. High-intensity krypton ion beams with the energy of ∼ 1 MeV/u are available now at IC-100. Production of track-etch membranes is an example of mature technology based on irradiation with accelerated ions. The track-etch membranes offer distinct advantages over other types of membranes due to their precisely determined structure. One-pore, oligo-pore and multi-pore samples can serve as models for studying the transport of liquids, gases, particles, solutes, and electrolytes in narrow channels. Track-etch pores are also used as templates for making nanowires, nanotubes, or arrays of nanorods. The microstructures obtained this way may find use in miniaturized devices such as sensors for biologically important molecules. Bulk and surface modification for the production of new composites and materials with special optical properties can be performed with ion beams. Flexible printed circuits, high-performance heat transfer modules, X-ray filters, and protective signs are examples of products developed in collaboration with research and industrial partners. Some recent achievements and promising ideas that...

  6. Tool Gear: Infrastructure for Building Parallel Programming Tools

    Energy Technology Data Exchange (ETDEWEB)

    May, J M; Gyllenhaal, J

    2002-12-09

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  7. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of the APT programming language for the control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  8. Applied large eddy simulation.

    Science.gov (United States)

    Tucker, Paul G; Lardeau, Sylvain

    2009-07-28

    Large eddy simulation (LES) is now seen more and more as a viable alternative to current industrial practice, usually based on problem-specific Reynolds-averaged Navier-Stokes (RANS) methods. Access to detailed flow physics is attractive to industry, especially in an environment in which computer modelling is bound to play an ever increasing role. However, the improvement in accuracy and flow detail comes at substantial cost, which has so far prevented wider industrial use of LES. The purpose of the applied LES discussion meeting was to address questions regarding what is achievable and what is not, given the current technology and knowledge, for an industrial practitioner who is interested in using LES. The use of LES was explored in an application-centred context across diverse fields. The general flow-governing equation form was explored along with various LES models, the errors occurring in LES were analysed, and the hybridization of RANS and LES was considered. The importance of modelling relative to boundary conditions, problem definition and other more mundane aspects was examined. It was concluded, to an extent, that for LES to make the most rapid industrial impact, pragmatic hybrid use of LES, implicit LES and RANS elements will probably be needed. Added to this, further industrial-sector-specific model parametrizations will be required, with clear thought on the key target design parameter(s). The combination of good numerical modelling expertise, a sound understanding of turbulence, along with artistry, pragmatism and the use of recent developments in computer science should dramatically add impetus to the industrial uptake of LES. In the light of the numerous technical challenges that remain, it appears that for some time to come LES will have echoes of the high levels of technical knowledge required for safe use of RANS, but with much greater fidelity. PMID:19531503
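    As background to the flow-governing equation form and LES models mentioned above, the filtered incompressible Navier-Stokes equations with the classical Smagorinsky closure (a standard textbook formulation, not one attributed to the meeting) can be written as:

    ```latex
    % Filtered incompressible momentum equation (overbars denote filtered quantities)
    \frac{\partial \bar{u}_i}{\partial t}
    + \frac{\partial (\bar{u}_i \bar{u}_j)}{\partial x_j}
    = -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x_i}
    + \nu \frac{\partial^2 \bar{u}_i}{\partial x_j \partial x_j}
    - \frac{\partial \tau_{ij}}{\partial x_j},
    \qquad
    \tau_{ij} = \overline{u_i u_j} - \bar{u}_i \bar{u}_j .

    % Smagorinsky model for the deviatoric subgrid-scale stress
    \tau_{ij} - \tfrac{1}{3}\delta_{ij}\tau_{kk} = -2\,\nu_t \bar{S}_{ij},
    \qquad
    \nu_t = (C_s \Delta)^2 \, |\bar{S}|,
    \qquad
    |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}},
    \qquad
    \bar{S}_{ij} = \tfrac{1}{2}\!\left(
      \frac{\partial \bar{u}_i}{\partial x_j}
      + \frac{\partial \bar{u}_j}{\partial x_i}\right).
    ```

    Here $\Delta$ is the filter width and $C_s$ the Smagorinsky constant; the unclosed subgrid stress $\tau_{ij}$ is the term that the various LES models discussed at the meeting approximate in different ways.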

  9. Essays in applied microeconomics

    Science.gov (United States)

    Wang, Xiaoting

    In this dissertation I use microeconomic theory to study firms' behavior. Chapter One introduces the motivations and main findings of this dissertation. Chapter Two studies the issue of information provision through advertisement when markets are segmented and consumers' price information is incomplete. Firms compete in prices and advertising strategies for consumers with transportation costs. High advertising costs contribute to market segmentation. Low advertising costs promote price competition among firms and improve consumer welfare. Chapter Three also investigates market power as a result of consumers' switching costs. A potential entrant can offer a new product bundled with an existing product to compensate consumers for their switching cost. If the primary market is competitive, bundling simply plays the role of price discrimination, and it does not dominate unbundled sales in the process of entry. If the entrant has market power in the primary market, then bundling also plays the role of leveraging market power and it dominates unbundled sales. The market for electric power generation has been opened to competition in recent years. Chapter Four looks at issues involved in the deregulated electricity market. By comparing the performance of the competitive market with the social optimum, we identify the conditions under which market equilibrium generates socially efficient levels of electric power. Chapters Two to Four investigate strategic behavior among firms. Chapter Five studies the interaction between firms and unemployed workers in a frictional labor market. We set up an asymmetric job auction model, where two types of workers apply for two types of job openings by bidding in auctions and firms hire the applicant offering them the most profits. The job auction model internalizes the determination of the share of surplus from a match, and therefore endogenously generates incentives for an efficient division of the matching surplus. Microeconomic

  10. Applied Historical Astronomy

    Science.gov (United States)

    Stephenson, F. Richard

    2014-01-01

    F. Richard Stephenson has spent most of his research career -- spanning more than 45 years -- studying various aspects of Applied Historical Astronomy. The aim of this interdisciplinary subject is the application of historical astronomical records to the investigation of problems in modern astronomy and geophysics. Stephenson has almost exclusively concentrated on pre-telescopic records, especially those preserved from ancient and medieval times -- the earliest reliable observations dating from around 700 BC. The records which have mainly interested him are of eclipses (both solar and lunar), supernovae, sunspots and aurorae, and Halley's Comet. The main sources of early astronomical data are fourfold: records from ancient and medieval East Asia (China, together with Korea and Japan); ancient Babylon; ancient and medieval Europe; and the medieval Arab world. A feature of Stephenson's research is the direct consultation of early astronomical texts in their original language -- either working unaided or with the help of colleagues. He has also developed a variety of techniques to help interpret the various observations. Most pre-telescopic observations are very crude by present-day standards. In addition, early motives for skywatching were more often astrological rather than scientific. Despite these drawbacks, ancient and medieval astronomical records have two remarkable advantages over modern data. Firstly, they can enable the investigation of long-term trends (e.g. in the terrestrial rate of rotation), which in the relatively short period covered by telescopic observations are obscured by short-term fluctuations. Secondly, over the lengthy time-scale which they cover, significant numbers of very rare events (such as Galactic supernovae) were reported, which have few -- if any-- counterparts in the telescopic record. In his various researches, Stephenson has mainly focused his attention on two specific topics. 
These are: (i) long-term changes in the Earth's rate of

  11. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty was considered synonymous with random, stochastic, statistical, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous, and in the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by engineers and scientists. The tool or method used to model uncertainty in a specific context should be chosen by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes its uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  12. Chatter and machine tools

    CERN Document Server

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  13. NWRS Survey Prioritization Tool

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — A SMART Tool and User's Guide for aiding NWRS Station staff when prioritizing their surveys for an Inventory and Monitoring Plan. This guide describes a process and...

  14. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  15. Tools and their uses

    CERN Document Server

    1973-01-01

    Teaches names, general uses, and correct operation of all basic hand and power tools, fasteners, and measuring devices you are likely to need. Also, grinding, metal cutting, soldering, and more. 329 illustrations.

  16. Chemical Data Access Tool

    Data.gov (United States)

    U.S. Environmental Protection Agency — This tool is intended to aid individuals interested in learning more about chemicals that are manufactured or imported into the United States. Health and safety...

  17. Recovery Action Mapping Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Recovery Action Mapping Tool is a web map that allows users to visually interact with and query actions that were developed to recover species listed under the...

  18. THOR: Metrics and Tools

    OpenAIRE

    Dasler, Robin

    2016-01-01

    This report describes the work of the THOR Project to develop a dashboard to monitor interoperability of persistent identifiers. The dashboard is an essential step towards a suite of tools to measure the impact of the project.

  19. Mapping Medicare Disparities Tool

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Minority Health has designed an interactive map, the Mapping Medicare Disparities Tool, to identify areas of disparities between subgroups of...

  20. Friction stir welding tool

    Science.gov (United States)

    Tolle; Charles R. , Clark; Denis E. , Barnes; Timothy A.

    2008-04-15

    A friction stir welding tool is described and which includes a shank portion; a shoulder portion which is releasably engageable with the shank portion; and a pin which is releasably engageable with the shoulder portion.

  1. Neighborhood Mapping Tool

    Data.gov (United States)

    Department of Housing and Urban Development — This tool assists the public and Choice Neighborhoods applicants to prepare data to submit with their grant application by allowing applicants to draw the exact...

  2. Cash Reconciliation Tool

    Data.gov (United States)

    US Agency for International Development — CART is a cash reconciliation tool that allows users to reconcile Agency cash disbursements with Treasury fund balances; track open unreconciled items; and create...

  3. Autism Teaching Tool

    CERN Multimedia

    2014-01-01

    CERN pattern recognition technologies transferred to a learning tool for autistic children. The state-of-the-art pattern recognition technology developed at CERN for High Energy Physics is transferred to the Computer Vision domain and is used to develop a new

  4. Wound assessment tools and nurses' needs: an evaluation study.

    Science.gov (United States)

    Greatrex-White, Sheila; Moxey, Helen

    2015-06-01

    The purpose of this study was to ascertain how well different wound assessment tools meet the needs of nurses in carrying out general wound assessment and whether current tools are fit for purpose. The methodology employed was evaluation research. In order to conduct the evaluation, a literature review was undertaken to identify the criteria of an optimal wound assessment tool which would meet nurses' needs. Several freely available wound assessment tools were selected based on predetermined inclusion and exclusion criteria and an audit tool was developed to evaluate the selected tools based on how well they met the criteria of the optimal wound assessment tool. The results provide a measure of how well the selected wound assessment tools meet the criteria of the optimal wound assessment tool. No tool was identified which fulfilled all the criteria, but two (the Applied Wound Management tool and the National Wound Assessment Form) met the most criteria of the optimal tool and were therefore considered to best meet nurses' needs in wound assessment. The study provides a mechanism for the appraisal of wound assessment tools using a set of optimal criteria which could aid practitioners in their search for the best wound assessment tool.
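    The audit described here scores each candidate tool against the criteria of an optimal tool and picks the best-covered one. A minimal sketch of that criteria-coverage computation is below; the criteria strings and tool names are invented for illustration and are not those used in the study:

    ```python
    # Hypothetical optimal-tool criteria (invented for illustration only).
    OPTIMAL_CRITERIA = {
        "measures wound size", "records exudate", "notes infection signs",
        "tracks pain", "supports reassessment over time",
    }

    def audit(tool_criteria):
        """Return the fraction of the optimal criteria that a tool meets."""
        return len(tool_criteria & OPTIMAL_CRITERIA) / len(OPTIMAL_CRITERIA)

    # Hypothetical tools, each described by the criteria it satisfies.
    tools = {
        "Tool A": {"measures wound size", "records exudate", "tracks pain"},
        "Tool B": {"measures wound size", "notes infection signs"},
    }

    # The tool meeting the most criteria best fits nurses' needs.
    best = max(tools, key=lambda name: audit(tools[name]))
    print(best, audit(tools[best]))  # → Tool A 0.6
    ```
    
    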

  5. Stochastic tools in turbulence

    CERN Document Server

    Lumley, John L

    2012-01-01

    Stochastic Tools in Turbulence discusses the available mathematical tools to describe stochastic vector fields to solve problems related to these fields. The book deals with the needs of turbulence in relation to stochastic vector fields, particularly, on three-dimensional aspects, linear problems, and stochastic model building. The text describes probability distributions and densities, including Lebesgue integration, conditional probabilities, conditional expectations, statistical independence, lack of correlation. The book also explains the significance of the moments, the properties of the

  6. Stone Tool Production

    OpenAIRE

    Hikade, Thomas

    2010-01-01

    In ancient Egypt, flint or chert was used for knapped stone tools from the Lower Palaeolithic down to the Pharaonic Period. The raw material was available in abundance on the desert surface, or it could be mined from the limestone formations along the Nile Valley. While the earliest lithic industries of Prehistoric Egypt resemble the stone tool assemblages from other parts of Africa, as well as Asia and Europe, the later Prehistoric stone industries in Egypt had very specific characteristics,...

  7. Tools and Behavioral Abstraction: A Direction for Software Engineering

    Science.gov (United States)

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  8. Essays in Applied Microeconomics

    Science.gov (United States)

    Ge, Qi

    This dissertation consists of three self-contained applied microeconomics essays on topics related to behavioral economics and industrial organization. Chapter 1 studies how sentiment resulting from sports event outcomes affects consumers' tipping behavior in the presence of social norms. I formulate a model of tipping behavior that captures consumer sentiment following a reference-dependent preference framework and empirically test its relevance using the game outcomes of the NBA and the trip and tipping data on New York City taxicabs. While I find that consumers' tipping behavior responds to unexpected wins and losses of their home team, particularly in close game outcomes, I do not find evidence for loss aversion. Coupled with the findings on default tipping, my empirical results on the asymmetric tipping responses suggest that while social norms may dominate loss aversion, affect and surprises can result in freedom on the upside of tipping. Chapter 2 utilizes a novel data source of airline entry and exit announcements and examines how incumbent airlines adjust quality provisions in response to their competitors' announcements, and the role of timing in such responses. I find no evidence that the incumbents engage in preemptive actions when facing probable entry and exit threats as signaled by the competitors' announcements, in either the short term or the long term. There is, however, evidence supporting their responses to the competitors' realized entry or exit. My empirical findings underscore the role of timing in determining preemptive actions and suggest that previous studies may have overestimated how incumbent airlines respond to entry threats. Chapter 3, written in collaboration with Benjamin Ho, investigates the habit formation of consumers' thermostat-setting behavior, an often implicitly made decision and yet a key determinant of home energy consumption and expenditures. We utilize a high-frequency dataset on household thermostat usage and find that

  9. Development of micro rotary swaging tools of graded tool steel via co-spray forming

    Directory of Open Access Journals (Sweden)

    Cui Chengsong

    2015-01-01

    Full Text Available In order to meet the requirements of micro rotary swaging, the local properties of the tools should be adjusted properly with respect to abrasive and adhesive wear, compressive strength, and toughness. These properties can be optimally combined by using different materials in specific regions of the tools, with a gradual transition in between to reduce critical stresses at the interface during heat treatment and in the rotary swaging process. In this study, a newly developed co-spray forming process was used to produce graded tool materials in the form of a flat product. The graded deposits were subsequently hot rolled and heat treated to achieve an optimal microstructure and advanced properties. Micro plunge rotary swaging tools with fine geometrical structures were machined from the hot rolled materials. The new forming tools were successfully applied in the micro plunge rotary swaging of wires of stainless steel.

  10. Development of micro rotary swaging tools of graded tool steel via co-spray forming

    Directory of Open Access Journals (Sweden)

    Cui Chengsong

    2015-01-01

    Full Text Available In order to meet the requirements of micro rotary swaging, the local properties of the tools should be adjusted properly with respect to abrasive and adhesive wear, compressive strength, and toughness. These properties can be optimally combined by using different materials in specific regions of the tools, with a gradual transition in between to reduce critical stresses at the interface during heat treatment and in the rotary swaging process. In this study, a newly developed co-spray forming process was used to produce graded tool material in the form of a flat product. The graded deposit was subsequently hot rolled and heat treated to achieve an optimal microstructure and advanced properties. Micro plunge rotary swaging tools with fine geometrical structures were machined from the hot rolled material. The new forming tools were successfully applied in the micro plunge rotary swaging of wires of stainless steel.

  11. Surface exploration geophysics applied to the moon

    International Nuclear Information System (INIS)

    With the advent of a permanent lunar base, the desire to explore the lunar near-surface for both scientific and economic purposes will arise. Applications of exploration geophysical methods to the earth's subsurface are highly developed. This paper briefly addresses some aspects of applying this technology to near surface lunar exploration. It is noted that both the manner of application of some techniques, as well as their traditional hierarchy as assigned on earth, should be altered for lunar exploration. In particular, electromagnetic techniques may replace seismic techniques as the primary tool for evaluating near-surface structure

  12. Kymenlaakso University of Applied Sciences Social Media Marketing Campaign

    OpenAIRE

    Bogdanov, Evgeny

    2012-01-01

    The popularity of social media has grown significantly in the course of the last few years and become a beneficial tool for promoting an organisation. Maintaining online presence on social media websites allows a company to reach its target audiences and raise public awareness on a global basis. The purpose of this study was to achieve a better understanding of how social media marketing of Kymenlaakso University of Applied Sciences can be improved and which social media marketing tools ...

  13. GIS Technology: Resource and Habitability Assessment Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a one-year project to apply a GIS analysis tool to new orbital data for lunar resource assessment and martian habitability identification.  We used...

  14. NASA's Applied Sciences for Water Resources

    Science.gov (United States)

    Doorn, Bradley; Toll, David; Engman, Ted

    2011-01-01

    The Earth Systems Division within NASA has the primary responsibility for the Earth Science Applied Science Program and the objective to accelerate the use of NASA science results in applications to help solve problems important to society and the economy. The primary goal of the Earth Science Applied Science Program is to improve future and current operational systems by infusing them with scientific knowledge of the Earth system gained through space-based observation, assimilation of new observations, and development and deployment of enabling technologies, systems, and capabilities. This paper discusses one of the major problems facing water resources managers, that of having timely and accurate data to drive their decision support tools. It then describes how NASA's science and space-based satellites may be used to overcome this problem. Opportunities for the water resources community to participate in NASA's Water Resources Applications Program are described.

  15. Fractal Description of the Shearing-Surface of Tools

    Institute of Scientific and Technical Information of China (English)

    WANG Bing-cheng; JING Chang; REN Zhao-hui; REN Li-yi

    2004-01-01

    In this paper, the basic methods are introduced to calculate the fractal dimensions of the shearing surface of some tools. The fractal dimension of the shearing surface of experimental sampling is obtained and the fractal characteristics are also discussed. We can apply the fractal method to identify types of tools used by burglars and to do the job of individual recognition. New theories and methods are provided to measure and process the shearing surface profile of tools.
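    The fractal dimension calculation mentioned here is commonly done by box counting: grid the surface profile at several box sizes and fit the slope of log N(ε) against log(1/ε). The sketch below is a generic illustration of that estimator, not the authors' specific method:

    ```python
    import math

    def box_counting_dimension(points, sizes=(1/2, 1/4, 1/8, 1/16, 1/32, 1/64)):
        """Estimate the box-counting dimension of a 2-D point set.

        For each box size eps, count the grid boxes containing at least one
        point, then fit the slope of log N(eps) vs. log(1/eps) by ordinary
        least squares. Points are assumed to lie in the unit square.
        """
        logs = []
        for eps in sizes:
            occupied = {(math.floor(x / eps), math.floor(y / eps)) for x, y in points}
            logs.append((math.log(1 / eps), math.log(len(occupied))))
        n = len(logs)
        mx = sum(x for x, _ in logs) / n
        my = sum(y for _, y in logs) / n
        return (sum((x - mx) * (y - my) for x, y in logs)
                / sum((x - mx) ** 2 for x, _ in logs))

    # Sanity check: a smooth straight-line profile has dimension close to 1;
    # a rough shearing-surface profile would give a larger value.
    profile = [(i / 1000, i / 1000) for i in range(1000)]
    print(round(box_counting_dimension(profile), 2))  # → 1.0
    ```
    
    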

  16. Some Notes About Artificial Intelligence as New Mathematical Tool

    Directory of Open Access Journals (Sweden)

    Angel Garrido

    2010-04-01

    Full Text Available Mathematics is a mere instance of First-Order Predicate Calculus, and therefore belongs to applied monotonic logic. Hence the limitations of classical logical reasoning, and the clear advantages of Fuzzy Logic and many other interesting new tools. We present here some of the most useful tools of this new field of mathematics, so-called Artificial Intelligence.
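    As a minimal illustration of the fuzzy-logic tools referred to above (a generic sketch, not taken from the article), graded membership and Zadeh's min/max operators replace the two-valued truth of classical logic. The fuzzy sets "warm" and "hot" below are invented for the example:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function: 0 outside [a, c], peaking at 1 when x == b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Membership degrees of a 24-degree temperature in two (invented) fuzzy sets.
    warm = tri(24, 18, 25, 32)
    hot = tri(24, 25, 32, 39)

    # Zadeh operators: AND = min, OR = max, NOT = 1 - membership.
    warm_and_not_hot = min(warm, 1 - hot)
    print(round(warm, 2), round(hot, 2), round(warm_and_not_hot, 2))  # → 0.86 0.0 0.86
    ```

    Unlike a monotonic two-valued predicate, a statement such as "24 degrees is warm" holds to a degree between 0 and 1, which is what makes fuzzy reasoning useful for vague concepts.
    
    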

  17. New Conceptual Design Tools

    DEFF Research Database (Denmark)

    Pugnale, Alberto; Holst, Malene; Kirkegaard, Poul Henning

    2010-01-01

    This paper discusses recent approaches to the increasingly frequent use of computer tools as supports for the conceptual design phase of the architectural project. The present state of the art of software as a conceptual design tool can be summarized in two parallel tendencies. On the one hand, the main software houses are trying to introduce powerful and effective user-friendly applications into the world of building designers, applications that are more and more able to fit their specific requirements; on the other hand, some groups of expert users with basic programming knowledge deal with the problem of software as a conceptual design tool by means of 'scripting', in other words by self-developing codes able to solve specific and well-defined design problems. Starting with a brief historical recall and a discussion of relevant research and practical experiences, this paper investigates...

  18. Applied Meteorology Unit (AMU) Quarterly Report Third Quarter FY-08

    Science.gov (United States)

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Dreher, Joseph

    2008-01-01

    This report summarizes the Applied Meteorology Unit (AMU) activities for the third quarter of Fiscal Year 2008 (April - June 2008). Tasks reported on are: Peak Wind Tool for User Launch Commit Criteria (LCC), Anvil Forecast Tool in AWIPS Phase II, Completion of the Edwards Air Force Base (EAFB) Statistical Guidance Wind Tool, Volume Averaged Height Integrated Radar Reflectivity (VAHIRR), Impact of Local Sensors, Radar Scan Strategies for the PAFB WSR-74C Replacement, VAHIRR Cost Benefit Analysis, and WRF Wind Sensitivity Study at Edwards Air Force Base.

  19. Applied Meteorology Unit (AMU) Quarterly Report - Fourth Quarter FY-09

    Science.gov (United States)

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Wheeler, Mark

    2009-01-01

    This report summarizes the Applied Meteorology Unit (AMU) activities for the fourth quarter of Fiscal Year 2009 (July - September 2009). Task reports include: (1) Peak Wind Tool for User Launch Commit Criteria (LCC), (2) Objective Lightning Probability Tool, Phase III, (3) Peak Wind Tool for General Forecasting, Phase II, (4) Update and Maintain Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS), (5) Verify MesoNAM Performance, and (6) Develop a Graphical User Interface to update selected parameters for the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT).

  20. Applied Ethics in Nowadays Society

    OpenAIRE

    Tomita CIULEI

    2013-01-01

    This special issue is dedicated to applied ethics in present-day society. It falls within the field of social sciences and humanities, hosting both theoretical approaches and empirical research in various areas of applied ethics. Applied ethics analyzes concrete moral situations in social or professional practice in order to make and adopt decisions. The field of applied ethics includes medical ethics, legal ethics, media ethics, professional ethics, environmental ethic...

  1. Morphometrics applied to medical entomology.

    Science.gov (United States)

    Dujardin, Jean-Pierre

    2008-12-01

    Morphometrics underwent a revolution more than one decade ago. In the modern morphometrics, the estimate of size is now contained in a single variable reflecting variation in many directions, as many as there are landmarks under study, and shape is defined as their relative positions after correcting for size, position and orientation. With these informative data, and the corresponding software freely available to conduct complex analyses, significant biological and epidemiological features can be quantified more accurately. We discuss the evolutionary significance of the environmental impact on metric variability, mentioning the importance of concepts like genetic assimilation, genetic accommodation, and epigenetics. We provide examples of measuring the effect of selection on metric variation by comparing (unpublished) Qst values with corresponding (published) Fst. The primary needs of medical entomologists are to distinguish species, especially cryptic species, and to detect them where they are not expected. We explain how geometric morphometrics could apply to these questions, and where there are deficiencies preventing the approach from being utilized at its maximum potential. Medical entomologists in connection with control programs aim to identify isolated populations where the risk of reinfestation after treatment would be low ("biogeographical islands"). Identifying them can be obtained from estimating the number of migrants per generation. Direct assessment of movement remains the most valid approach, but it scores active movement only. Genetic methods estimating gene flow levels among interbreeding populations are commonly used, but gene flow does not necessarily mean the current flow of migrants. Methods using the morphometric variation are neither suited to evaluate gene flow, nor are they adapted to estimate the flow of migrants. They may provide, however, the information needed to create a preliminary map pointing to relevant areas where one could
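    The "correcting for size, position and orientation" step described above is a Procrustes superimposition of landmark configurations. For 2-D landmarks it has a compact closed form using complex arithmetic; the sketch below is a generic illustration of one common convention of the full Procrustes distance, not the software referred to in the article, and it assumes the two configurations are not orthogonal (nonzero inner product):

    ```python
    import cmath
    import math

    def procrustes_distance(shape_a, shape_b):
        """Full Procrustes distance between two 2-D landmark configurations.

        Landmarks are complex numbers x + iy. Each configuration is centred
        (removing position) and scaled to unit centroid size (removing size);
        the optimal rotation is the phase of the complex inner product.
        """
        def normalize(shape):
            mean = sum(shape) / len(shape)
            centred = [z - mean for z in shape]
            size = math.sqrt(sum(abs(z) ** 2 for z in centred))
            return [z / size for z in centred]

        a, b = normalize(shape_a), normalize(shape_b)
        inner = sum(za * zb.conjugate() for za, zb in zip(a, b))
        rotation = inner / abs(inner)  # unit complex number; assumes inner != 0
        return math.sqrt(sum(abs(za - rotation * zb) ** 2 for za, zb in zip(a, b)))

    # A translated, scaled and rotated copy of a shape is at distance ~0:
    # only true shape differences survive the superimposition.
    square = [0 + 0j, 1 + 0j, 1 + 1j, 0 + 1j]
    copy = [(3 - 2j) + 2.5 * cmath.exp(0.7j) * z for z in square]
    print(round(procrustes_distance(square, copy), 6))  # → 0.0
    ```
    
    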

  2. CMS tracker visualization tools

    CERN Document Server

    Zito, G; Osborne, I; Regano, A

    2005-01-01

    This document will review the design considerations, implementations and performance of the CMS Tracker Visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired in using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  3. CMS tracker visualization tools

    Energy Technology Data Exchange (ETDEWEB)

    Mennea, M.S. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy); Osborne, I. [Northeastern University, 360 Huntington Avenue, Boston, MA 02115 (United States); Regano, A. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy); Zito, G. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy)]. E-mail: giuseppe.zito@ba.infn.it

    2005-08-21

    This document reviews the design considerations, implementations and performance of the CMS Tracker visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  4. Tool nimega Sacco

    Index Scriptorium Estoniae

    1998-01-01

    The Zanotta bag chair "Sacco", designed in 1968 by P. Gatti, C. Paolini and F. Teodoro, has turned thirty. "Sacco" is a bag filled with polystyrene granules. The Zanotta company's inflatable chair "Blow" (1967, Scholari, D'Urbino, Lomazzi, De Pas) also attracted attention. E. Lucie-Smith comments on them. An exhibition at the Düsseldorf Kunstmuseum, "Legends and Symbols of 1968", is dedicated to the year 1968 and presents nearly 500 objects and several reconstructed interiors

  5. The Routledge Applied Linguistics Reader

    Science.gov (United States)

    Wei, Li, Ed.

    2011-01-01

    "The Routledge Applied Linguistics Reader" is an essential collection of readings for students of Applied Linguistics. Divided into five sections: Language Teaching and Learning, Second Language Acquisition, Applied Linguistics, Identity and Power and Language Use in Professional Contexts, the "Reader" takes a broad interpretation of the subject…

  6. The science writing tool

    Science.gov (United States)

    Schuhart, Arthur L.

    This is a two-part dissertation. The primary part is the text of a science-based composition rhetoric and reader called The Science Writing Tool. This textbook has seven chapters dealing with topics in Science Rhetoric. Each chapter includes a variety of examples of science writing, discussion questions, writing assignments, and instructional resources. The purpose of this text is to introduce lower-division college science majors to the role that rhetoric and communication play in the conduct of Science, and how these skills contribute to a successful career in Science. The text is designed as a "tool kit" for use by an instructor constructing a science-based composition course or a writing-intensive Science course. The second part of this dissertation reports on student reactions to draft portions of The Science Writing Tool text. In this report, students of English Composition II at Northern Virginia Community College-Annandale were surveyed about their attitudes toward the course materials and topics included. The findings were used to revise and expand The Science Writing Tool.

  7. Nitrogen Trading Tool (NTT)

    Science.gov (United States)

    The Natural Resources Conservation Service (NRCS) recently developed a prototype web-based nitrogen trading tool to facilitate water quality credit trading. The development team has worked closely with the Agriculture Research Service Soil Plant Nutrient Research Unit (ARS-SPNR) and the Environmenta...

  8. C-TOOL

    DEFF Research Database (Denmark)

    Taghizadeh-Toosi, Arezoo; Christensen, Bent Tolstrup; Hutchings, Nicholas John;

    2014-01-01

    agricultural soils. C-TOOL simulates observed losses of SOC in soils under intensive agricultural use and the gain in SOC derived from large inputs of animal manure and inclusion of perennial grassland. The model simulates changes in SOC for the entire profile, but lack of data on subsoil SOC storage hampers...

  9. Digital Tectonic Tools

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due

    2005-01-01

    in particular. A model of the aspects in the term tectonics – representation, ontology and culture – will be presented and used to discuss the current digital tools’ ability in tectonics. Furthermore it will be discussed what a digital tectonic tool is and could be and how a connection between the digital...

  10. Apple Shuns Tracking Tool

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Apple Inc. is advising software developers to stop using a feature in software for its iPhones and iPads that has been linked to privacy concerns, a move that would also take away a widely used tool for tracking users and their behavior. Developers who write programs for Apple's iOS operating system have been using a unique.

  11. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of and social...

  12. Community Profiles Tool

    OpenAIRE

    Institute of Public Health in Ireland (IPH)

    2014-01-01

    The Community Profiles Tool can be used to develop local health and wellbeing profiles from over 200 health-related indicators compiled from a range of data sources. Users can create tables, maps and charts of health-related indicators, and integrate this with key public health documents from the Health Well website such as relevant interventions, policies, and evidence related to each indicator.

  13. Google - Security Testing Tool

    OpenAIRE

    Staykov, Georgi

    2007-01-01

    Using Google as a security testing tool: basic and advanced search techniques using advanced Google search operators. Examples of obtaining control over security cameras, VoIP systems, and web servers, and of collecting valuable information such as credit card details and CVV codes, using only Google.

  14. Clean Cities Tools

    Energy Technology Data Exchange (ETDEWEB)

    None

    2014-12-19

    The U.S. Department of Energy's Clean Cities offers a large collection of Web-based tools on the Alternative Fuels Data Center. These calculators, interactive maps, and data searches can assist fleets, fuels providers, and other transportation decision makers in their efforts to reduce petroleum use.

  15. Photutils: Photometry tools

    Science.gov (United States)

    Bradley, Larry; Sipocz, Brigitta; Robitaille, Thomas; Tollerud, Erik; Deil, Christoph; Vinícius, Zè; Barbary, Kyle; Günther, Hans Moritz; Bostroem, Azalee; Droettboom, Michael; Bray, Erik; Bratholm, Lars Andersen; Pickering, T. E.; Craig, Matt; Pascual, Sergio; Greco, Johnny; Donath, Axel; Kerzendorf, Wolfgang; Littlefair, Stuart; Barentsen, Geert; D'Eugenio, Francesco; Weaver, Benjamin Alan

    2016-09-01

    Photutils provides tools for detecting and performing photometry of astronomical sources. It can estimate the background and background rms in astronomical images, detect sources in astronomical images, estimate morphological parameters of those sources (e.g., centroid and shape parameters), and perform aperture and PSF photometry. Written in Python, it is an affiliated package of Astropy (ascl:1304.002).
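    As an illustration of the aperture photometry that Photutils automates, the following numpy-only sketch (a toy example, not Photutils' actual API) sums the pixels inside a circular aperture and subtracts a median-based background estimate:

```python
import numpy as np

def aperture_sum(image, cx, cy, r):
    """Sum pixels whose centers fall inside a circular aperture of radius r."""
    y, x = np.indices(image.shape)
    mask = (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2
    return image[mask].sum(), mask.sum()

# Synthetic frame: flat background of 10 counts plus a 100-count point source
img = np.full((50, 50), 10.0)
img[25, 25] += 100.0

raw, npix = aperture_sum(img, 25, 25, 5)
background = np.median(img)      # crude per-pixel sky estimate from the frame
flux = raw - background * npix   # background-subtracted source flux (= 100 here)
```

    Photutils itself handles partial-pixel overlap, local background annuli, source detection, and PSF fitting; this sketch only conveys the basic sum-and-subtract idea.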

  16. Rapid Tooling via Stereolithography

    OpenAIRE

    Montgomery, Eva

    2006-01-01

    Approximately three years ago, composite stereolithography (SL) resins were introduced to the marketplace, offering performance features beyond what traditional SL resins could offer. In particular, the high heat deflection temperatures and high stiffness of these highly filled resins have opened the door to several new rapid prototyping (RP) applications, including wind tunnel test modelling and, more recently, rapid tooling.

  17. Incident Information Management Tool

    CERN Document Server

    Pejovic, Vladimir

    2015-01-01

    Flaws of current incident information management at CMS and CERN are discussed. A new data model for a future incident database is proposed and briefly described. A recently developed draft version of a GIS-based tool for incident tracking is presented.

  18. Linux programming tools unveiled

    CERN Document Server

    Venkateswarlu, N B

    2007-01-01

    In recent years, Linux, a freely available Unix variant in the public domain, has attracted a great deal of attention. Today's complex production environments demand superior application performance. Linux offers extraordinary advantages, such as complete source-code access and the availability of exceptional optimization and testing tools. This book explores this facet of Linux.

  19. Essential marketing tools

    OpenAIRE

    Potter, Ned

    2012-01-01

    This chapter from The Library Marketing Toolkit introduces several essential tools which every library needs as part of its marketing toolkit. These include developing the library website (including a mobile version), utilising word-of-mouth promotion, obtaining and interpreting user feedback, perfecting the elevator pitch, and signs and displays. It includes case studies from David Lee King, Aaron Tay, and Rebecca Jones.

  20. Surgical tool detection and tracking in retinal microsurgery

    Science.gov (United States)

    Alsheakhali, Mohamed; Yigitsoy, Mehmet; Eslami, Abouzar; Navab, Nassir

    2015-03-01

    Visual tracking of surgical instruments is an essential part of eye surgery: it plays an important role for surgeons and is a key component of robotic assistance during the operation. The difficulty of detecting and tracking medical instruments in in-vivo images comes from their deformable shape, changes in brightness, and the presence of the instrument's shadow. This paper introduces a new approach to detect the tip of a surgical tool and its width regardless of its head shape and the presence of shadows or vessels. The approach relies on integrating structural information about the strong edges from the RGB color model with tool location-based information from the L*a*b color model. The probabilistic Hough transform is applied to obtain the strongest straight lines in the RGB images, and based on information from the L* and a* channels, one of these candidate lines is selected as the edge of the tool shaft. From that line, the tool slope, the tool centerline and the tool tip can be detected. Tracking is performed by keeping track of the last detected tool tip and tool slope, and filtering the Hough lines within a box around the last detected tool tip based on slope differences. Experimental results demonstrate high accuracy in terms of detecting the tool tip position, the tool joint point position, and the tool centerline. The approach also meets real-time requirements.
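    The line-candidate step described above can be illustrated with a minimal standard Hough transform. The paper uses the probabilistic variant on RGB edge maps; this numpy-only sketch, with a hypothetical synthetic edge set, shows the underlying (rho, theta) voting:

```python
import numpy as np

def hough_peak(edge_points, thetas, diag):
    """Vote in a (rho, theta) accumulator and return the strongest line."""
    rhos = np.arange(-diag, diag + 1)            # 1-pixel rho resolution
    acc = np.zeros((len(rhos), len(thetas)), dtype=int)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in edge_points:
        r = np.round(x * cos_t + y * sin_t).astype(int) + diag
        acc[r, np.arange(len(thetas))] += 1      # one vote per theta per point
    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return rhos[i], thetas[j]

# Synthetic "edge image": points along the diagonal y = x
points = [(i, i) for i in range(40)]
thetas = np.deg2rad(np.arange(0.0, 180.0))       # theta grid, 1-degree steps
rho, theta = hough_peak(points, thetas, diag=60) # peak: rho = 0, theta = 135 deg
```

    A real pipeline would first run edge detection, use the randomized (probabilistic) variant for speed, and then apply the color-based filtering the abstract describes to pick the shaft edge among the candidates.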

  1. Designer nanoparticle: nanobiotechnology tool for cell biology

    Science.gov (United States)

    Thimiri Govinda Raj, Deepak B.; Khan, Niamat Ali

    2016-09-01

    This article discusses the use of nanotechnology for subcellular compartment isolation and its application to subcellular omics. This technology review contributes significantly to our understanding of the use of nanotechnology for subcellular systems biology. Here we elaborate a nanobiotechnology approach using superparamagnetic nanoparticles (SPMNPs) optimized with different surface coatings for subcellular organelle isolation. Using a pulse-chase approach, we show that SPMNPs interact differently with the cell depending on their surface functionalization. The article focuses on the use of functionalized SPMNPs as a nanobiotechnology tool to isolate high-quality (in both purity and yield) plasma membranes and endosomes or lysosomes. Such a nanobiotechnology tool can be applied to generating subcellular compartment inventories. As a future perspective, this strategy could be applied in areas such as immunology, cancer and stem cell research.

  2. EVALUATION OF MACHINE TOOL QUALITY

    OpenAIRE

    Ivan Kuric

    2011-01-01

    Paper deals with aspects of the quality and accuracy of machine tools. As the accuracy of machine tools is a key factor in product quality, it is important to know the methods for evaluating machine tool quality and accuracy. Several aspects of the diagnostics of machine tools are described, such as reliability.

  3. Web Tools: The Second Generation

    Science.gov (United States)

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which falls under the 21st-century skills. The second-generation tools are growing in popularity…

  4. Tool use by aquatic animals.

    Science.gov (United States)

    Mann, Janet; Patterson, Eric M

    2013-11-19

    Tool-use research has focused primarily on land-based animals, with less consideration given to aquatic animals and the environmental challenges and conditions they face. Here, we review aquatic tool use and examine the contributing ecological, physiological, cognitive and social factors. Tool use among aquatic animals is rare but taxonomically diverse, occurring in fish, cephalopods, mammals, crabs, urchins and possibly gastropods. While additional research is required, the scarcity of tool use is likely attributable to the characteristics of aquatic habitats, which are generally not conducive to tool use. Nonetheless, studying tool use by aquatic animals provides insights into the conditions that promote and inhibit tool-use behaviour across biomes. Like land-based tool users, aquatic animals tend to find tools on the substrate and use tools during foraging. However, unlike on land, tool users in water often use other animals (and their products) and water itself as tools. Among sea otters and dolphins, the two aquatic tool users studied in greatest detail, some individuals specialize in tool use, which is vertically socially transmitted, possibly because of their long dependency periods. In all, the contrasts between aquatic and land-based tool users enlighten our understanding of the adaptive value of tool-use behaviour. PMID:24101631

  5. EVALUATION OF MACHINE TOOL QUALITY

    Directory of Open Access Journals (Sweden)

    Ivan Kuric

    2011-12-01

    Full Text Available Paper deals with aspects of the quality and accuracy of machine tools. As the accuracy of machine tools is a key factor in product quality, it is important to know the methods for evaluating machine tool quality and accuracy. Several aspects of the diagnostics of machine tools are described, such as reliability.

  6. Dynamic optimization case studies in DYNOPT tool

    Science.gov (United States)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    Dynamic programming is typically applied to optimization problems. As analytical solutions are generally very difficult, dedicated software tools are widely used. These software packages are often third-party products built on standard simulation software tools on the market. As typical examples of such tools, TOMLAB and DYNOPT can be effectively applied to dynamic programming problems. DYNOPT is presented in this paper due to its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory given a description of the process, the cost to be minimized, and equality and inequality constraints, using orthogonal collocation on finite elements. The optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the optimized dynamic model may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT in the field of dynamic optimization by means of case studies based on selected laboratory physical educational models.
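    The "complete parameterization of the control vector" idea can be sketched outside MATLAB as well. The toy problem below (my own example, not from the DYNOPT distribution) discretizes the control of the scalar system x' = u, x(0) = 0, x(1) = 1 into piecewise-constant parameters and minimizes the quadratic cost by projected gradient descent; the analytic optimum is u = 1 everywhere, with cost 1:

```python
import numpy as np

# minimize  J = integral of u(t)^2 over [0, 1]
# subject to x' = u, x(0) = 0, x(1) = 1
N = 50
dt = 1.0 / N
u = np.zeros(N)                       # piecewise-constant control parameters
a = np.full(N, dt)                    # terminal constraint: a @ u = x(1) - x(0) = 1

for _ in range(200):
    u -= 0.1 * (2.0 * u * dt)         # gradient step on J = dt * sum(u**2)
    u -= (a @ u - 1.0) / (a @ a) * a  # project back onto the terminal constraint

cost = dt * np.sum(u ** 2)            # converges to the analytic optimum, 1.0
```

    DYNOPT itself uses orthogonal collocation and handles ODE/DAE models with path constraints; this sketch only conveys how parameterizing the control turns an optimal control problem into a finite-dimensional optimization.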

  7. Using corporate social responsibility tools while forming the enterprise internationalization marketing strategy

    OpenAIRE

    Ilchuk, P. H.; Мuzhelyak, М. М.; Кots, О. О.

    2014-01-01

    The article investigates the possibility of improving the marketing strategy of Ukrainian enterprises by using new tools, among them the tools of corporate social responsibility. The definition of the corporate social responsibility concept is generalized. The main tools of corporate social responsibility, as well as the relationship between these tools and the enterprise internationalization marketing strategy, are determined. Feasibility of applying corporate social responsibility tools in...

  8. An intelligent condition monitoring system for on-line classification of machine tool wear

    Energy Technology Data Exchange (ETDEWEB)

    Fu Pan; Hope, A.D.; Javed, M. [Systems Engineering Faculty, Southampton Institute (United Kingdom)

    1997-12-31

    The development of intelligent tool condition monitoring systems is a necessary requirement for the successful automation of manufacturing processes. This presentation introduces a tool wear monitoring system for milling operations. The system utilizes power, force, acoustic emission and vibration sensors to monitor tool condition comprehensively. Features relevant to tool wear are extracted from time- and frequency-domain signals, and a fuzzy pattern recognition technique is applied to combine the multisensor information and provide reliable classification of tool wear states. (orig.) 10 refs.
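    A fuzzy pattern recognition step of the kind described can be sketched as follows. The membership shapes, feature scaling, and class names here are hypothetical, chosen only to show how memberships computed from several sensor features are combined into a single wear-state decision:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical wear classes defined over a normalized feature scale [0, 1]
CLASSES = {
    "sharp":  (-0.01, 0.0, 0.5),
    "worn":   (0.2, 0.5, 0.8),
    "severe": (0.5, 1.0, 1.01),
}

def classify(features):
    """Average each class's membership across sensor features; pick the max."""
    scores = {cls: sum(tri(f, *params) for f in features) / len(features)
              for cls, params in CLASSES.items()}
    return max(scores, key=scores.get)
```

    In a real system each sensor channel would contribute its own calibrated features, and the membership functions would be tuned from labeled wear data rather than fixed by hand.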

  9. A Review of Applied Mathematics

    OpenAIRE

    Ó Náraigh, Lennon; Ní Shúilleabháin, Aoibhinn

    2015-01-01

    Applied Mathematics is a subject which deals with problems arising in the physical, life, and social sciences as well as in engineering, and provides a broad body of knowledge for use in a wide spectrum of research and industry. Applied Mathematics is an important school subject which builds students' mathematical and problem-solving skills. The subject has remained on the periphery of school timetables and, without the commitment and enthusiasm of Applied Maths teachers, would likely be omit...

  10. Algal functional annotation tool

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, D. [UCLA; Casero, D. [UCLA; Cokus, S. J. [UCLA; Merchant, S. S. [UCLA; Pellegrini, M. [UCLA

    2012-07-01

    The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of genes on KEGG pathway maps and batch gene identifier conversion.
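    The enrichment search described above typically boils down to a hypergeometric over-representation test: how surprising is the overlap between a gene list and a functional term's annotated genes? A minimal, stdlib-only sketch (the gene counts below are made up for illustration; the tool's actual statistics may differ in detail):

```python
from math import comb

def enrichment_p(N, K, n, k):
    """Hypergeometric upper tail P(X >= k): probability of seeing k or more
    term-annotated genes in a list of n genes drawn from a genome of N genes,
    K of which carry the annotation."""
    denom = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / denom

# Hypothetical numbers: genome of 1000 genes, 50 annotated to a pathway,
# a list of 20 genes with 10 hits -- far above the expected overlap of 1.
p = enrichment_p(1000, 50, 20, 10)
```

    Annotation tools of this kind usually also correct the resulting p-values for multiple testing across the many terms examined.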

  11. LIKWID: Lightweight Performance Tools

    CERN Document Server

    Treibig, Jan; Wellein, Gerhard

    2011-01-01

    Exploiting the performance of today's microprocessors requires intimate knowledge of the microarchitecture as well as an awareness of the ever-growing complexity in thread and cache topology. LIKWID is a set of command line utilities that addresses four key problems: Probing the thread and cache topology of a shared-memory node, enforcing thread-core affinity on a program, measuring performance counter metrics, and microbenchmarking for reliable upper performance bounds. Moreover, it includes an mpirun wrapper allowing for portable thread-core affinity in MPI and hybrid MPI/threaded applications. To demonstrate the capabilities of the tool set we show the influence of thread affinity on performance using the well-known OpenMP STREAM triad benchmark, use hardware counter tools to study the performance of a stencil code, and finally show how to detect bandwidth problems on ccNUMA-based compute nodes.

  12. Automated Standard Hazard Tool

    Science.gov (United States)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to integrate easily into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. The project's success is reflected in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  13. Remote vehicle survey tool

    International Nuclear Information System (INIS)

    The Remote Vehicle Survey Tool (RVST) is a color graphical display tool for viewing remotely acquired scientific data. The RVST displays the data in the form of a color two-dimensional world model map. The world model map allows movement of the remote vehicle to be tracked by the operator and the data from sensors to be graphically depicted in the interface. Linear and logarithmic meters, dual-channel oscilloscopes, and directional compasses are used to display sensor information. The RVST is user-configurable through ASCII text files. The operator can configure the RVST to work with any remote data acquisition system and teleoperated or autonomous vehicle. The modular design of the RVST and its ability to be quickly configured for varying system requirements make the RVST ideal for remote scientific data display in all environmental restoration and waste management programs

  14. Applied Ethics in Nowadays Society

    Directory of Open Access Journals (Sweden)

    Tomita CIULEI

    2013-12-01

    Full Text Available This special issue is dedicated to applied ethics in today's society. It falls within the field of the social sciences and humanities, hosting both theoretical approaches and empirical research in various areas of applied ethics. Applied ethics analyzes a series of concrete moral situations in social or professional practice in order to make and adopt decisions. The field of applied ethics encompasses medical ethics, legal ethics, media ethics, professional ethics, environmental ethics, business ethics, etc. Classification-JEL: A23

  15. Open ICT tools project

    OpenAIRE

    Turnock, Chris; Bohemia, Erik; Woodhouse, Jed; Smith, Neil; Lovatt, Ben

    2009-01-01

    The paper will introduce a project titled the ‘Open ICT Tools’ which aims to explore and trial out ICT tools to facilitate a global collaborative and secured engagement with external business and community partners. The challenge is to facilitate a communication and multimedia data exchange between Northumbria University and participating external educational and business organisations without compromising the security of either Northumbria University IT infrastructure or that of the partner ...

  16. Channel nut tool

    Energy Technology Data Exchange (ETDEWEB)

    Olson, Marvin

    2016-01-12

    A method, system, and apparatus for installing channel nuts includes a shank, a handle formed on the first end of the shank, and an end piece with a threaded shaft, configured to receive a channel nut, formed on the second end of the shank. The tool can be used to insert or remove a channel nut in a channel framing system and then be removed from the channel nut.

  17. Program Management Tool

    Science.gov (United States)

    Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil

    2007-01-01

    The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include data on goals, deliverables, milestones, business processes, personnel, task plans, monthly reports, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Application (VBA) libraries. The PMT has an extensible architecture that enables integration of PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC). The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity

  18. Java Vertexing Tools

    International Nuclear Information System (INIS)

    This document describes the implementation of the topological vertex finding algorithm ZVTOP within the org.lcsim reconstruction and analysis framework. At the present date, Java vertexing tools allow users to perform topological vertexing on tracks that have been obtained from a Fast MC simulation. An implementation that will be able to handle fully reconstructed events is being designed from the ground up for longevity and maintainability

  19. Friction Stir Weld Tools

    Science.gov (United States)

    Carter, Robert W. (Inventor); Payton, Lewis N. (Inventor)

    2007-01-01

    A friction stir weld tool sleeve is supported by an underlying support pin. The pin material is preferably selected for toughness and fracture characteristics. The pin sleeve preferably has a geometry which employs the use of an interrupted thread, a plurality of flutes and/or eccentric path to provide greater flow through. Paddles have been found to assist in imparting friction and directing plastic metal during the welding process.

  20. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter-location files for numerically-controlled machine tools. APT, an acronym for Automatically Programed Tools, is among the most widely used software systems for computerized machine tools. APT was developed for the explicit purpose of providing an effective software system for programing NC machine tools. The APT system includes the specification of the APT programing language and a language processor, which executes APT statements and generates the NC machine-tool motions they specify.

  1. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  2. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
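    The frequency response measure the tool computes is essentially the megawatts of lost generation divided by the frequency decline expressed in 0.1 Hz units. The sketch below uses pre-event and settling-window averages in the spirit of BAL-003-1; the exact window bounds here are an assumption for illustration, not the standard's authoritative definition:

```python
import numpy as np

def frequency_response(t, f, t_event, mw_lost):
    """Frequency response measure in MW per 0.1 Hz.
    Value A: average frequency over a window before the disturbance.
    Value B: average frequency over a settling window after it.
    (Window bounds below are illustrative assumptions.)"""
    a = f[(t >= t_event - 16.0) & (t < t_event)].mean()
    b = f[(t >= t_event + 20.0) & (t <= t_event + 52.0)].mean()
    return mw_lost / ((a - b) * 10.0)   # (a - b) * 10 converts Hz to 0.1-Hz units

# Synthetic under-frequency event: 60.00 Hz before, settling at 59.95 Hz
t = np.arange(0.0, 120.0, 0.1)
f = np.where(t < 60.0, 60.00, 59.95)
frm = frequency_response(t, f, t_event=60.0, mw_lost=1000.0)
```

    With a 0.05 Hz decline, a 1000 MW loss corresponds to a response of 2000 MW per 0.1 Hz; the FRAT additionally manages the event database, baselines, and the NERC FRS reporting forms around this core calculation.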

  3. Knowing about tools: neural correlates of tool familiarity and experience.

    Science.gov (United States)

    Vingerhoets, Guy

    2008-04-15

    The observation of tools is known to elicit a distributed cortical network that reflects close-knit relations of semantic, action-related, and perceptual knowledge. The neural correlates underlying the critical knowledge of skilled tool use, however, remain to be elucidated. In this study, functional magnetic resonance imaging in 14 volunteers compares neural activation during the observation of familiar tools versus equally graspable unfamiliar tools of which the observers have little, if any, functional knowledge. In a second paradigm, the level of tool experience is investigated by comparing the neural effects of observing frequently versus infrequently used familiar tools. Both familiar and unfamiliar tools activate the classic neural network associated with tool representations. Direct comparison of the activation patterns during the observation of familiar and unfamiliar tools in a priori determined regions of interest points to parietal storage of tool-use knowledge, with the supramarginal gyrus storing information about limb and hand positions, and the precuneus storing visuospatial information about hand-tool interactions. As no frontal activation survived this contrast, it appears that premotor activity is unrelated to experience-based motor knowledge of tool use/function, but rather is elicited by any graspable tool. Confrontation with unfamiliar or infrequently used tools reveals an increase in inferior temporal and medial and lateral occipital activation, predominantly in the left hemisphere, suggesting that these regions reflect visual feature processing for tool identification.

  4. Abductive networks applied to electronic combat

    Science.gov (United States)

    Montgomery, Gerard J.; Hess, Paul; Hwang, Jong S.

    1990-08-01

    A practical approach to dealing with combinatorial decision problems and uncertainties associated with electronic combat through the use of networks of high-level functional elements called abductive networks is presented. The paper describes the application of the Abductory Induction Mechanism (AIM™), a supervised inductive learning tool for synthesizing polynomial abductive networks, to the electronic combat problem domain. From databases of historical, expert-generated, or simulated combat engagements, AIM can often induce compact and robust network models for making effective real-time electronic combat decisions despite significant uncertainties or a combinatorial explosion of possible situations. The feasibility of applying abductive networks to realize advanced combat decision aiding capabilities was demonstrated by applying AIM to a set of electronic combat simulations. The networks synthesized by AIM generated accurate assessments of the intent, lethality, and overall risk associated with a variety of simulated threats, and produced reasonable estimates of the expected effectiveness of a group of electronic countermeasures for a large number of simulated combat scenarios. This paper presents the application of abductive networks to electronic combat, summarizes the results of experiments performed using AIM, discusses the benefits and limitations of applying abductive networks to electronic combat, and indicates why abductive networks can often yield capabilities not attainable with alternative approaches.

  5. Algal functional annotation tool

    Energy Technology Data Exchange (ETDEWEB)

    2012-07-12

    BACKGROUND: Progress in genome sequencing is proceeding at an exponential pace, and several new algal genomes become available every year. One challenge facing the community is the association of protein sequences encoded in the genomes with biological function. While most genome assembly projects generate annotations for predicted protein sequences, these are usually limited and integrate functional terms from only a few databases. Another challenge is the use of annotations to interpret the large lists of 'interesting' genes generated by genome-scale datasets. Previously, these gene lists had to be analyzed across several independent biological databases, often on a gene-by-gene basis. In contrast, several annotation databases, such as DAVID, integrate data from multiple functional databases and reveal underlying biological themes of large gene lists. While several such databases have been constructed for animals, none is currently available for the study of algae. Due to renewed interest in algae as potential sources of biofuels and the emergence of multiple algal genome sequences, a significant need has arisen for such a database to process the growing compendiums of algal genomic data. DESCRIPTION: The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii and will include additional genomes in the future. The site allows users to interpret large gene lists by identifying associated functional terms and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of genes
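The "enrichment" of a functional term in a gene list is conventionally scored with a hypergeometric tail probability: how surprising it is to find k annotated genes in a list of n drawn from a genome of N genes, K of which carry the term. A self-contained sketch with toy numbers (the tool's actual statistics are not documented in this abstract, so this is only the standard approach):

```python
from math import comb

def enrichment_p(N, K, n, k):
    """P(X >= k) for a hypergeometric draw: n genes sampled from a
    genome of N genes, K of which carry the annotation term."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Toy genome of 15,000 genes, 100 annotated with some term; a gene
# list of 200 containing 8 of them is strongly enriched (about 1.3
# would be expected by chance).
p = enrichment_p(N=15000, K=100, n=200, k=8)
```

In practice the p-values are corrected for multiple testing, since many terms are scored against the same gene list.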

  6. Writing, Literacy, and Applied Linguistics.

    Science.gov (United States)

    Leki, Ilona

    2000-01-01

    Discusses writing and literacy in the domain of applied linguistics. Focus is on needs analysis for literacy acquisition; second language learner identity; longitudinal studies as extensions of identity work; and applied linguistics contributions to second language literacy research. (Author/VWL)

  7. Conversation Analysis and Applied Linguistics.

    Science.gov (United States)

    Schegloff, Emanuel A.; Koshik, Irene; Jacoby, Sally; Olsher, David

    2002-01-01

    Offers biographical guidance on several major areas of conversation-analytic work--turn-taking, repair, and word selection--and indicates past or potential points of contact with applied linguistics. Also discusses areas of applied linguistic work. (Author/VWL)

  8. The Observation Of Defects Of School Buildings Over 100 Years Old In Perak

    OpenAIRE

    Alauddin Kartina; Ishakt Mohd Fisal; Mohd Isa Haryati; Mohamad Sohod Fariz

    2016-01-01

    Malaysia is blessed with a rich legacy of heritage buildings with unique architectural and historical values. The heritage buildings become a symbol of the national identity of our country. Therefore, heritage buildings, as important monuments should be conserved well to ensure the extension of the building’s life span and to make sure continuity functions of the building for future generations. The aim of this study is to analyze the types of defects attached in school buildings over 100 yea...

  9. How has our interest in the airway changed over 100 years?

    Science.gov (United States)

    Kim, Ki Beom

    2015-11-01

    Since the beginning of our specialty, our understanding of the link between function and facial growth and development has progressively improved. Today, we know that children with sleep-related breathing problems will often develop distinctive facial characteristics. In adults, sleep apnea can result in serious morbidity and mortality. Orthodontists can ask sleep-related questions in the health history to help identify sleep breathing disorders. Treating these patients presents unique opportunities for orthodontists to collaborate with other medical specialties to improve a patient's health and treatment outcome. Research presented in our Journal in the next century may shed new light that will help us better identify the problem and aid the specialty in developing more effective evidence-based treatment. Additional efforts are needed to understand the physiology, neurology, and genetics of sleep breathing disorders. PMID:26522033

  10. 100 Years of Glacier Photographs: Available Online at the National Snow and Ice Data Center

    Science.gov (United States)

    Ballagh, L. M.; Wolfe, J.; Wang, I.; Casey, A.; Fetterer, F.

    2004-12-01

    Historic glacier photographs can be used to study fluctuations in glacier extent over time in response to climate change. Researchers can also use the photographs to approximate changes in glacier terminus location and mass balance. The "Glacier Photograph Collection" at the National Snow and Ice Data Center (NSIDC) contains approximately 5,000 photographs, including both aerial and terrestrial images. NSIDC received funding from the NOAA Climate Database Modernization Program (CDMP) to digitize a portion of the photographs and make an Online Glacier Photograph Database available. The CDMP's primary objective is to preserve climate data and facilitate access to the data. Although digitizing images is expensive, long-term data preservation is a major benefit. When historic photographs are stored on film, images can easily be scratched or damaged. Scanning the images and having them online makes browsing images easier for users. At present, there are 1,313 glacier photographs available online. Additional photos and metadata are being added. The Online Glacier Photograph Database will date from 1883 to 1995, totaling nearly 3,000 photographs available as high resolution TIFF images and lower resolution reference images and thumbnails by the end of 2004. Maintaining accurate metadata records for each photograph is very important. The database is searchable by various fields, including photographer name, photograph date, glacier name, glacier coordinates, state/province, and keyword.

  11. Industrial production of acetone and butanol by fermentation—100 years later

    Science.gov (United States)

    Sauer, Michael

    2016-01-01

    Microbial production of acetone and butanol was one of the first large-scale industrial fermentation processes of global importance. During the first part of the 20th century, it was indeed the second largest fermentation process, superseded in importance only by the ethanol fermentation. After a rapid decline after the 1950s, acetone-butanol-ethanol (ABE) fermentation has recently gained renewed interest in the context of biorefinery approaches for the production of fuels and chemicals from renewable resources. The availability of new methods and knowledge opens many new doors for industrial microbiology, and a comprehensive view on this process is worthwhile due to the new interest. This thematic issue of FEMS Microbiology Letters, dedicated to the 100th anniversary of the first industrial exploitation of Chaim Weizmann's ABE fermentation process, covers the main aspects of old and new developments, thereby outlining a model development in biotechnology. All major aspects of industrial microbiology are exemplified by this single process. This includes new technologies, such as the latest developments in metabolic engineering, the exploitation of biodiversity and discoveries of new regulatory systems such as for microbial stress tolerance, as well as technological aspects, such as bio- and down-stream processing. PMID:27199350

  13. KIC 8462852 did likely not fade during the last 100 years

    CERN Document Server

    Hippke, Michael

    2016-01-01

    A recent analysis found a "completely unprecedented" dimming of $0.165\pm0.013$ magnitudes per century in the F3 main sequence star KIC8462852. This star is interesting, as it shows episodes of day-long dips with up to 20% dimming of unknown origin. We re-analyze the same Harvard archival Johnson B photometry and find comparable dimmings, and structural breaks, for 18 of 28 checked F dwarfs (64%) in the Kepler field of view. We conclude that the Harvard plate photometry suffers from imperfect long-term (1890--1989) calibration. The most likely explanation for the century-long dimming of KIC8462852 is thus a data artefact, and it is probably not of astrophysical origin.
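The dimming rate quoted here is simply the slope of an ordinary least-squares fit of photographic magnitude against time, rescaled to magnitudes per century. A sketch with synthetic data (not the Harvard measurements):

```python
def trend_mag_per_century(years, mags):
    """OLS slope of magnitude vs. year, rescaled to mag/century.
    A positive slope means the star is fading (magnitudes grow)."""
    n = len(years)
    ybar = sum(years) / n
    mbar = sum(mags) / n
    num = sum((y - ybar) * (m - mbar) for y, m in zip(years, mags))
    den = sum((y - ybar) ** 2 for y in years)
    return (num / den) * 100.0

# Synthetic plate photometry, 1890-1985: B magnitude increasing
# (i.e. the star fading) by exactly 0.0016 mag/yr.
years = list(range(1890, 1990, 5))
mags = [12.0 + 0.0016 * (y - 1890) for y in years]
rate = trend_mag_per_century(years, mags)  # 0.16 mag/century
```

The structural-break test of the paper goes further, asking whether one such trend fits the whole series better than separate trends on sub-intervals, but the underlying fit is this one.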

  14. Channel and floodplain change analysis over a 100-year period : Lower Yuba River, California

    OpenAIRE

    Rolf Aalto; L. Allan James; Michael B. Singer; Subhajit Ghoshal

    2010-01-01

    Hydraulic gold mining in the Sierra Nevada, California (1853–1884) displaced ~1.1 billion m3 of sediment from upland placer gravels that were deposited along piedmont rivers below dams where floods can remobilize them. This study uses topographic and planimetric data from detailed 1906 topographic maps, 1999 photogrammetric data, and pre- and post-flood aerial photographs to document historic sediment erosion and deposition along the lower Yuba River due to individual floods at the reach scal...
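Erosion and deposition volumes of the kind reported here come from differencing co-registered elevation surfaces (1906 topography vs. 1999 photogrammetry). A toy sketch of the bookkeeping, with hypothetical 2x2 grids rather than the study's data:

```python
def net_volume_change(dem_old, dem_new, cell_area_m2):
    """Net volume change in m^3 between two equally sized DEM grids:
    positive = net deposition, negative = net erosion."""
    return sum((z_new - z_old) * cell_area_m2
               for row_old, row_new in zip(dem_old, dem_new)
               for z_old, z_new in zip(row_old, row_new))

# 2x2 toy grids (elevations in m) on 10 m x 10 m cells: one cell
# incised by 0.5 m, one aggraded by 0.2 m.
dem_1906 = [[10.0, 10.0], [10.0, 10.0]]
dem_1999 = [[ 9.5, 10.0], [10.2, 10.0]]
dv = net_volume_change(dem_1906, dem_1999, cell_area_m2=100.0)  # -30.0 m^3
```

Real analyses additionally propagate the vertical uncertainty of each surface, since differences smaller than the combined error are not meaningful.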

  15. The InSiGHT database : utilizing 100 years of insights into Lynch Syndrome

    NARCIS (Netherlands)

    Plazzer, J. P.; Sijmons, R. H.; Woods, M. O.; Peltomaki, P.; Thompson, B.; Den Dunnen, J. T.; Macrae, F.

    2013-01-01

    This article provides a historical overview of the online database (www.insight-group.org/mutations) maintained by the International Society for Gastrointestinal Hereditary Tumours. The focus is on the mismatch repair genes which are mutated in Lynch Syndrome. APC, MUTYH and other genes are also an

  16. 100 Years later: Celebrating the contributions of x-ray crystallography to allergy and clinical immunology.

    Science.gov (United States)

    Pomés, Anna; Chruszcz, Maksymilian; Gustchina, Alla; Minor, Wladek; Mueller, Geoffrey A; Pedersen, Lars C; Wlodawer, Alexander; Chapman, Martin D

    2015-07-01

    Current knowledge of molecules involved in immunology and allergic disease results from the significant contributions of x-ray crystallography, a discipline that just celebrated its 100th anniversary. The histories of allergens and x-ray crystallography are intimately intertwined. The first enzyme structure to be determined was lysozyme, also known as the chicken food allergen Gal d 4. Crystallography determines the exact 3-dimensional positions of atoms in molecules. Structures of molecular complexes in the disciplines of immunology and allergy have revealed the atoms involved in molecular interactions and mechanisms of disease. These complexes include peptides presented by MHC class II molecules, cytokines bound to their receptors, allergen-antibody complexes, and innate immune receptors with their ligands. The information derived from crystallographic studies provides insights into the function of molecules. Allergen function is one of the determinants of environmental exposure, which is essential for IgE sensitization. Proteolytic activity of allergens or their capacity to bind LPSs can also contribute to allergenicity. The atomic positions define the molecular surface that is accessible to antibodies. In turn, this surface determines antibody specificity and cross-reactivity, which are important factors for the selection of allergen panels used for molecular diagnosis and the interpretation of clinical symptoms. This review celebrates the contributions of x-ray crystallography to clinical immunology and allergy, focusing on new molecular perspectives that influence the diagnosis and treatment of allergic diseases.

  17. Interaction between ice conditions and hydro power regulations over 100 years : the Norwegian case

    Energy Technology Data Exchange (ETDEWEB)

    Pytte Asvall, R. [Norwegian Water Resources and Energy Directorate, Oslo (Norway). Dept. of Hydrology

    2008-07-01

    The many variations in Norway's climate can be attributed to the country's diverse geography, topography and latitude. The large gradients in elevation and seasonal variations in climate influence both runoff and ice conditions. This paper presented results of ice studies regarding water power regulations, as most of the electricity produced in Norway is based on hydropower. Although the period of constructing new large hydroelectric facilities in the country is over, focus is now on building facilities on smaller rivers and on increasing production at existing power plants. As such, possible changes in water regulation for hydroelectric power production relevant to ice conditions were discussed. Most ice problems associated with hydroelectric development are caused by increased water discharge. Some of the problems and solutions were outlined in this paper, notably stabilizing ice conditions to gradually increase winter discharge; extreme frazil formation and flooding; sudden unexpected increases in discharge; ice conditions in reservoirs; selective withdrawal to reduce effects on water temperature and ice coverage to reduce effects on salmon; increased winter freshwater flow to fjords; and the influence of climate change.

  18. [Karl Jaspers. 100 years of “Allgemeine Psychopathologie” (General Psychopathology)].

    Science.gov (United States)

    Häfner, H

    2013-11-01

    With his "Allgemeine Psychopathologie" (General Psychopathology) published in 1913, Karl Jaspers laid a comprehensive methodological and systematic foundation in psychiatry. Following Edmund Husserl, the founder of philosophical phenomenology, Jaspers introduced "static understanding" into psychopathology, i.e. the unprejudiced reproduction of conscious phenomena. From the philosopher Wilhelm Dilthey he further adopted the distinction between causal understanding as a means of accessing nature and pathological processes and hermeneutic understanding, also called genetic understanding, as a way of accessing mental phenomena. The intrusion of an event that is incomprehensible in terms of an understandable development is seen as indicating an extraconscious phenomenon or a transition to a somatic process. Jaspers opted for philosophy early in his life. After quitting law studies he graduated in medicine, came to psychopathology without any psychiatric training, to psychology without ever studying psychology, and to a chair in philosophy without a degree in philosophy. Despite believing himself to be chronically ill and destined to die early, Jaspers produced a life's work almost immeasurable in scope. He died in 1969 aged 86 years.

  19. General relativity the most beautiful of theories : applications and trends after 100 years

    CERN Document Server

    2015-01-01

    Generalising Newton's law of gravitation, general relativity is one of the pillars of modern physics. On the occasion of general relativity's centennial, leading scientists in the different branches of gravitational research review the history and recent advances in the main fields of applications of the theory, which was referred to by Lev Landau as “the most beautiful of the existing physical theories”.

  20. Breast fat and fallacies: more than 100 years of anatomical fantasy.

    Science.gov (United States)

    Nickell, William B; Skelton, Jackie

    2005-05-01

    The authors studied the anatomy of 136 patients who underwent breast reduction surgery from 1998 to 2003 to determine the relationship of breast fat to the glandular tissue of the breast. Histological sections of freshly preserved breast tissue taken from representative patients were examined and compared to depictions of normal breast anatomy as portrayed in standard anatomical texts, from the classic work of Sir Astley Cooper in 1845 to current publications such as Auerbach and Riordan's Breastfeeding and Human Lactation. Most texts portray little intermixing of fat with the glandular tissue of the breast. Our studies confirm the texts that demonstrate the fat and glandular tissue to be inseparable and present in continuity with each other, except in the subcutaneous plane where only fat is present. The implications of this anatomical fact as it relates to breast surgery and human lactation are discussed.

  1. What History Is Teaching Us: 100 Years of Advocacy in "Music Educators Journal"

    Science.gov (United States)

    Hedgecoth, David M.; Fischer, Sarah H.

    2014-01-01

    As "Music Educators Journal" celebrates its centennial, it is appropriate to look back over the past century to see how advocacy in music education has evolved. Of the more than 200 submitted articles on advocacy, four main themes emerged: music education in community, the relevancy of music education, the value of music education, and…

  2. Oral delivery of macromolecular drugs: Where we are after almost 100 years of attempts.

    Science.gov (United States)

    Moroz, Elena; Matoori, Simon; Leroux, Jean-Christophe

    2016-06-01

    Since the first attempt to administer insulin orally in humans more than 90 years ago, the oral delivery of macromolecular drugs (>1000 g/mol) has been rather disappointing. Although several clinical pilot studies have demonstrated that the oral absorption of macromolecules is possible, the bioavailability remains generally low and variable. This article reviews the formulations and biopharmaceutical aspects of orally administered biomacromolecules on the market and in clinical development for local and systemic delivery. The most successful approaches for systemic delivery often involve a combination of enteric coating, protease inhibitors, and permeation enhancers in relatively high amounts. However, some of these excipients have induced local or systemic adverse reactions in preclinical and clinical studies, and long-term studies are often missing. Therefore, strategies aimed at increasing the oral absorption of macromolecular drugs should carefully take the benefit-risk ratio into account. In the absence of specific uptake pathways, small and potent peptides that are resistant to degradation and that present a large therapeutic window certainly represent the best candidates for systemic absorption. While we acknowledge the need for systemic delivery of biomacromolecules, it is our opinion that oral delivery to local gastrointestinal targets is currently more promising because of their accessibility and the absence of a requirement for enhanced intestinal permeability. PMID:26826437

  3. [Primal psychoanalytic manuscript. 100 years "Studies of Hysteria" by Josef Breuer and Sigmund Freud].

    Science.gov (United States)

    Grubrich-Simitis, I

    1995-12-01

    In 1895 Breuer and Freud jointly published the Studies on Hysteria, a work that Grubrich-Simitis regards as the very first psychoanalytic monograph. The author begins by outlining the intellectual context in which the work took shape and the initial reception accorded to it by contemporary medical science and sexology. The main focus of the discussion centres on those aspects of the book that mark it out as a genuinely psychoanalytic work: a hitherto unknown quality of seeing and hearing, a radical change in the relationship between doctor and patient, the establishment of a new form of case presentation, and the development of approaches adumbrating psychoanalytic theory and technique. In conclusion the author describes the scientific cooperation between Freud and Breuer, assigning to the latter his rightful place in the history of psychoanalysis, a status frequently denied him by Freudians.

  4. Is psychiatry only neurology? Or only abnormal psychology? Déjà vu after 100 years.

    Science.gov (United States)

    de Leon, Jose

    2015-04-01

    Forgetting history, which frequently repeats itself, is a mistake. In General Psychopathology, Jaspers criticised early 20th century psychiatrists, including those who thought psychiatry was only neurology (Wernicke) or only abnormal psychology (Freud), or who did not see the limitations of the medical model in psychiatry (Kraepelin). Jaspers proposed that some psychiatric disorders follow the medical model (Group I), while others are variations of normality (Group III), or comprise schizophrenia and severe mood disorders (Group II). In the early 21st century, the players' names have changed but the game remains the same. The US NIMH is reprising both Wernicke's brain mythology and Kraepelin's marketing promises. The neo-Kraepelinian revolution started at Washington University, became pre-eminent through the DSM-III developed by Spitzer, but reached a dead end with the DSM-5. McHugh, who described four perspectives in psychiatry, is the leading contemporary representative of the Jaspersian diagnostic approach. Other neo-Jaspersians are: Berrios, Wiggins and Schwartz, Ghaemi, Stanghellini, Parnas and Sass. Can psychiatry learn from its mistakes? The current psychiatric language, organised at its three levels, symptoms, syndromes, and disorders, was developed in the 19th century but is obsolete for the 21st century. Scientific advances in Jaspers' Group III disorders require collaborating with researchers in the social and psychological sciences. Jaspers' Group II disorders, redefined by the author as schizophrenia, catatonic syndromes, and severe mood disorders, are the core of psychiatry. Scientific advancement in them is not easy because we are not sure how to delineate between and within them correctly.

  5. Jacobson v Massachusetts at 100 years: police power and civil liberties in tension.

    Science.gov (United States)

    Gostin, Lawrence O

    2005-04-01

    A century ago, the US Supreme Court in Jacobson v Massachusetts upheld the exercise of the police power to protect the public's health. Despite intervening scientific and legal advances, public health practitioners still struggle with Jacobson's basic tension between individual liberty and the common good. In affirming Massachusetts' compulsory vaccination law, the Court established a floor of constitutional protections that consists of 4 standards: necessity, reasonable means, proportionality, and harm avoidance. Under Jacobson, the courts are to support public health matters insofar as these standards are respected. If the Court today were to decide Jacobson once again, the analysis would likely differ--to account for developments in constitutional law--but the outcome would certainly reaffirm the basic power of government to safeguard the public's health. PMID:15798112

  7. Publicly Open Virtualized Gaming Environment For Simulation of All Aspects Related to '100 Year Starship Study'

    Science.gov (United States)

    Obousy, R. K.

    2012-09-01

    Sending a mission to distant stars will require our civilization to develop new technologies and change the way we live. The complexity of the task is enormous [1]; the thought is therefore to involve people from around the globe through the "citizen scientist" paradigm. The suggestion is a "Gaming Virtual Reality Network" (GVRN) to simulate the sociological and technological aspects involved in this project. Work is currently being done [2] on a technology for constructing computer games within GVRN. This technology will provide quick and easy ways for individuals to develop game scenarios related to various aspects of the "100YSS" project, so that people are involved in solving certain tasks just by playing games. Players will be able to modify conditions, add new technologies, geological conditions, and social movements, and assemble new strategies just by writing scenarios. The system will interface with textual and video information, extract scenarios written in millions of texts, and use them to assemble new games, letting players simulate an enormous number of possibilities. The information technologies involved will require us to build the system from the start in such a way that any module can easily be replaced; GVRN should therefore be modular and open to the community.

  8. A Centennial Milestone (1910-2010): 100 Years of Youth Suicide Prevention

    Science.gov (United States)

    Miller, David N.

    2010-01-01

    Anniversaries are appropriate times for reflecting on the past and planning for the future, and in this 100th anniversary year of Sigmund Freud's famous group meeting--a meeting among a large group of prominent mental health professionals that provides a useful marker and an arguable "starting point" for contemporary youth suicide prevention efforts,…

  9. Turnover of Species and Guilds in Shrub Spider Communities in a 100-Year Postlogging Forest Chronosequence.

    Science.gov (United States)

    Haraguchi, Takashi F; Tayasu, Ichiro

    2016-02-01

    Disturbance of forests by logging and subsequent forest succession causes marked changes in arthropod communities. Although vegetation cover provides important habitat for arthropods, studies of the changes in their community structure associated with forest succession have been conducted mostly at ground level. To evaluate how forests of different ages contribute to arthropod biodiversity in shrub habitat, spiders were collected from shrubs in 12 forests ranging in age from 1 to 107 yr after logging. We found marked changes in spider community structure about 10 yr after logging: the number of species and individuals declined rapidly after this time. These changes were likely caused by a decrease in shrub cover in association with forest succession. Changes in spider species composition associated with stand age were small in forests at least 11 yr old and were not clustered by forest age. After the exclusion of species of which we sampled only one or two individuals incidentally, just 0.9 ± 0.5 (mean ± SD) species were unique to these older forests. The other 41.2 ± 4.3 species found in these forests were common to both older and young forests, although some of these species in common were found mainly in forests at least 11 yr old. These results suggest that preservation of old-growth forests contributes to the abundance of these common species, although old-growth forests contribute little to species diversity.
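Species turnover between stands of different ages, as discussed above, can be summarized with a similarity index such as Jaccard's. A minimal sketch (the species sets are hypothetical, not the study's data):

```python
def jaccard(a, b):
    """Jaccard similarity of two species sets (1.0 = identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Two of the four species observed are shared between a young and
# an old stand, so similarity is 2/4.
young = {"Araneus sp.", "Tetragnatha sp.", "Clubiona sp."}
old = {"Tetragnatha sp.", "Clubiona sp.", "Philodromus sp."}
sim = jaccard(young, old)  # 0.5
```

Presence/absence indices like this one ignore abundance; an abundance-weighted index would be needed to reflect the decline in individuals reported above.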

  10. The development of incorporated structures for charities: a 100 year comparison of England and New Zealand

    OpenAIRE

    Cordery, Carolyn J; Fowler, Carolyn J; Morgan, Gareth G

    2016-01-01

    This article contrasts the emergence of two specific incorporated structures for non-profit organizations which came into being more than a century apart. We compare the ability for charities to form as “incorporated societies” under the New Zealand Incorporated Societies Act 1908 (effective from 1909), and to form as “charitable incorporated organizations” (CIOs), which were enacted in England and Wales under the Charities Act 2006 (effective from 2013). The article emphasizes th...

  11. Structure and dynamics of monohydroxy alcohols-Milestones towards their microscopic understanding, 100 years after Debye

    Science.gov (United States)

    Böhmer, Roland; Gainaru, Catalin; Richert, Ranko

    2014-12-01

    In 1913 Debye devised a relaxation model for application to the dielectric properties of water and alcohols. These hydrogen-bonded liquids continue to be studied extensively because they are vital for biophysical processes, of fundamental importance as solvents in industrial processes, and in everyday use. Nevertheless, the way to a microscopic understanding of their properties has been beset with apparently conflicting observations and conceptual difficulties. Much of this remains true for water, but fortunately the situation for monohydroxy alcohols is different. Here, with the experimental progress witnessed in recent years and with the growing recognition of the importance of specific supramolecular structures, a coherent microscopic understanding of the structure and dynamics of these hydrogen-bonded liquids is within reach.
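    The Debye model referenced in this record treats the complex permittivity as a single exponential relaxation, ε*(ω) = ε∞ + Δε/(1 + iωτ), whose dielectric loss ε'' peaks at ω = 1/τ with height Δε/2. A minimal numerical sketch (illustrative parameter values, not taken from the article):

```python
import numpy as np

def debye_permittivity(omega, eps_inf, delta_eps, tau):
    """Complex permittivity of a single Debye relaxation process:
    eps*(w) = eps_inf + delta_eps / (1 + i*w*tau)."""
    return eps_inf + delta_eps / (1 + 1j * omega * tau)

# Illustrative values (not from the article); tau in seconds.
eps_inf, delta_eps, tau = 3.0, 20.0, 1e-9

omega = np.logspace(6, 12, 601)              # angular-frequency sweep, rad/s
eps = debye_permittivity(omega, eps_inf, delta_eps, tau)

# The dielectric loss -Im(eps*) peaks at omega = 1/tau with height delta_eps/2.
peak_idx = np.argmax(-eps.imag)
print(omega[peak_idx], -eps.imag[peak_idx])  # ~1e9 rad/s, ~10.0
```

Real monohydroxy alcohols show an additional, slower supramolecular (Debye) peak on top of the structural relaxation; the single-process formula above is only the building block.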

  12. A CTE Legacy Built on Chocolate: Milton Hershey School's 100 Years

    Science.gov (United States)

    Kemmery, Robert

    2010-01-01

    One hundred years ago, chocolate magnate Milton S. Hershey and his wife Catherine signed the deed of trust creating the Hershey Industrial School in the heart of their Pennsylvania farming community. They had no children of their own and wanted to help orphan boys get a good education. The couple eventually left their entire fortune to the school.…

  13. [Evidence in support of Florence Nightingale's theories. 100 years after her death].

    Science.gov (United States)

    Zapico Yáñez, Florentina

    2010-05-01

    This article pays homage to Florence Nightingale as a pioneer and a beacon of scientific curiosity that should guide nursing professionals toward good praxis. To this end, the author reviewed the evidence regarding Nightingale's theories, arranged here into four large groups to make the findings that support them easier to understand. PMID:20617658

  14. Projected changes to growth and mortality of Hawaiian corals over the next 100 years

    Science.gov (United States)

    Hoeke, R.K.; Jokiel, P.L.; Buddemeier, R.W.; Brainard, R.E.

    2011-01-01

    Background: Recent reviews suggest that the warming and acidification of ocean surface waters predicted by most accepted climate projections will lead to mass mortality and declining calcification rates of reef-building corals. This study investigates the use of modeling techniques to quantitatively examine rates of coral cover change due to these effects. Methodology/Principal Findings: Broad-scale probabilities of change in shallow-water scleractinian coral cover in the Hawaiian Archipelago for years 2000-2099 A.D. were calculated assuming a single middle-of-the-road greenhouse gas emissions scenario. These projections were based on ensemble calculations of a growth and mortality model that used sea surface temperature (SST), atmospheric carbon dioxide (CO2), observed coral growth (calcification) rates, and observed mortality linked to mass coral bleaching episodes as inputs. SST and CO2 predictions were derived from the World Climate Research Programme (WCRP) multi-model dataset, statistically downscaled with historical data. Conclusions/Significance: The model calculations illustrate a practical approach to systematic evaluation of climate change effects on corals, and also show the effect of uncertainties in current climate predictions and in coral adaptation capabilities on estimated changes in coral cover. Despite these large uncertainties, this analysis quantitatively illustrates that a large decline in coral cover is highly likely in the 21st Century, but that there are significant spatial and temporal variances in outcomes, even under a single climate change scenario.
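    The ensemble growth-and-mortality bookkeeping described in the record can be caricatured as a Monte Carlo projection: cover grows each year, occasionally suffers a bleaching-driven loss, and the spread of ensemble outcomes expresses the uncertainty. A toy sketch with invented parameters (not the authors' model or data):

```python
import numpy as np

rng = np.random.default_rng(42)

def project_coral_cover(cover0, years, growth, bleach_prob, bleach_loss, n_runs):
    """Toy ensemble projection: each year, cover grows by `growth` (fractional),
    and with probability `bleach_prob` a bleaching event removes `bleach_loss`
    of the current cover. Cover is capped at 100%."""
    covers = np.full(n_runs, cover0, dtype=float)
    for _ in range(years):
        covers *= 1 + growth                          # calcification-driven growth
        bleached = rng.random(n_runs) < bleach_prob   # stochastic bleaching events
        covers[bleached] *= 1 - bleach_loss
        np.clip(covers, 0.0, 100.0, out=covers)
    return covers

# Invented illustrative parameters, not values from the study:
final = project_coral_cover(cover0=30.0, years=100, growth=0.01,
                            bleach_prob=0.15, bleach_loss=0.3, n_runs=5000)
print(final.mean(), np.percentile(final, [5, 95]))
```

With these invented numbers the expected annual multiplier is below one, so the ensemble mean declines over the century while the percentile spread shows the variance in outcomes the study emphasizes.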

  15. Playback from the Victrola to MP3, 100 years of music, machines, and money

    CERN Document Server

    Coleman, Mark

    2009-01-01

    "Playback is the first book to place the fascinating history of sound reproduction within its larger social, economic, and cultural context, and includes appearances by everyone from Thomas Edison to En…"

  16. How Earth works 100 years after Wegener's continental drift theory and IGCP 648

    Science.gov (United States)

    Li, Z. X.; Evans, D. A.; Zhong, S.; Eglington, B. M.

    2015-12-01

    It took half a century for Wegener's continental drift theory to be accepted as a fundamental element of plate tectonic theory. Another half century on, we are still unsure of the driving mechanism for plate tectonics: is it dominated by thermal convection, by gravitational forces, or by a combination of mechanisms? Nonetheless, breakthroughs in the past decades put us in a position to make a major stride in answering this question. These include: (1) the now widely accepted cyclic occurrence of supercontinent assembly and break-up (whereas random occurrence of supercontinents was an equal possibility in the 1990s); (2) the discovery of two equatorial and antipodal large low seismic velocity provinces (LLSVPs) that dominate the lower mantle and appear to have been the base for almost all mantle plumes since at least the Mesozoic, and of subduction of oceanic slabs all the way to the core-mantle boundary, which together suggest whole-mantle convection; (3) the recognition of true polar wander (TPW) as an important process in Earth history, likely reflecting Earth's major internal mass redistribution events; and (4) rapidly growing computing power that enables us to simulate all aspects of Earth's dynamic inner workings. Many new yet often controversial ideas have been proposed, such as a possible coupling in time (with an offset) and space between the supercontinent cycle and superplume (LLSVP) events, which opposes the idea of static and long-lived LLSVPs, and the orthoversion vs. introversion or extroversion models for supercontinent transitions. To fully utilise these advances, as well as the rapidly expanding global geoscience databases, to address the question of how Earth works, the UNESCO-IUGS sponsored IGCP project No. 648 was formed to coordinate a global cross-disciplinary effort. We aim to achieve a better understanding of the supercontinent cycle and to examine the relationship between the supercontinent cycle and global plume events. We will establish a series of global geological and geophysical databases to enable the geoscience community to make data-rich visual paleogeographic reconstructions using software such as GPlates. In addition, the project will bring the geotectonic and geodynamic modelling communities together to test global geodynamic models against the geological deep-time record.

  17. Industrial production of acetone and butanol by fermentation—100 years later

    OpenAIRE

    Sauer, Michael

    2016-01-01

    Microbial production of acetone and butanol was one of the first large-scale industrial fermentation processes of global importance. During the first part of the 20th century, it was indeed the second largest fermentation process, superseded in importance only by the ethanol fermentation. After a rapid decline after the 1950s, acetone-butanol-ethanol (ABE) fermentation has recently gained renewed interest in the context of biorefinery approaches for the production of fuels and chemicals from ...

  18. Channel and Floodplain Change Analysis over a 100-Year Period: Lower Yuba River, California

    Directory of Open Access Journals (Sweden)

    Rolf Aalto

    2010-07-01

    Hydraulic gold mining in the Sierra Nevada, California (1853–1884) displaced ~1.1 billion m3 of sediment from upland placer gravels that were deposited along piedmont rivers below dams where floods can remobilize them. This study uses topographic and planimetric data from detailed 1906 topographic maps, 1999 photogrammetric data, and pre- and post-flood aerial photographs to document historic sediment erosion and deposition along the lower Yuba River due to individual floods at the reach scale. Differencing of 3 × 3-m topographic data indicates substantial changes in channel morphology and documents 12.6 × 106 m3 of erosion and 5.8 × 106 m3 of deposition in these reaches since 1906. Planimetric and volumetric measurements document spatial and temporal variations of channel enlargement and lateral migration. Over the last century, channels incised up to ~13 m into mining sediments, which dramatically decreased local flood frequencies and increased flood conveyance. These adjustments were punctuated by event-scale geomorphic changes that redistributed sediment and associated contaminants to downstream lowlands.
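    The volumetric bookkeeping behind this kind of DEM differencing is simple: subtract the older elevation grid from the newer one, then sum negative and positive cells separately, weighted by cell area. A minimal sketch on synthetic grids (not the study's data):

```python
import numpy as np

def cut_fill_volumes(dem_old, dem_new, cell_size):
    """Return (erosion_m3, deposition_m3) from two co-registered DEMs.
    Erosion = total elevation loss, deposition = total elevation gain."""
    diff = dem_new - dem_old                 # metres of change per cell
    cell_area = cell_size ** 2               # m^2 per cell (e.g. 3 x 3 m)
    erosion = -diff[diff < 0].sum() * cell_area
    deposition = diff[diff > 0].sum() * cell_area
    return erosion, deposition

# Synthetic example: a 4 x 4 grid of 3-m cells in which the channel incises
# 2 m on the left half while 0.5 m of sediment accumulates on the right half.
dem_1906 = np.full((4, 4), 50.0)
dem_1999 = dem_1906.copy()
dem_1999[:, :2] -= 2.0
dem_1999[:, 2:] += 0.5

erosion, deposition = cut_fill_volumes(dem_1906, dem_1999, cell_size=3.0)
print(erosion, deposition)   # 144.0 m^3 eroded, 36.0 m^3 deposited
```

In practice the hard part is what this sketch assumes away: co-registering the 1906 map-derived surface with the 1999 photogrammetric surface and propagating their elevation uncertainties into the volume estimates.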

  19. 100 Years of Deuterostomia (Grobben, 1908): Cladogenetic and Anagenetic Relations within the Notoneuralia Domain

    OpenAIRE

    Gudo, Michael; Syed, Tareq

    2008-01-01

    Results from molecular systematics and comparative developmental genetics changed the picture of metazoan and especially bilaterian radiation. According to this new animal phylogeny (introduced by Adoutte et al. 1999/2000), Grobben's (1908) widely favoured protostome-deuterostome division of the Bilateria can be upheld, but only with major rearrangements within these superphyla. On the cladogenetic level, the Protostomia are split into two unexpected subgroups, the Lophotrochozoa and Ecdysozoa...

  20. Accuracy assessment of the ERP prediction method based on analysis of 100-year ERP series

    Science.gov (United States)

    Malkin, Z.; Tissen, V. M.

    2012-12-01

    A new method has been developed at the Siberian Research Institute of Metrology (SNIIM) for highly accurate prediction of UT1 and polar motion (PM). In this study, a detailed comparison was made between real-time UT1 predictions made in 2006-2011 and PM predictions made in 2009-2011 using the SNIIM method, and simultaneous predictions computed at the International Earth Rotation and Reference Systems Service (IERS), USNO. The results show that the proposed method provides better accuracy at different prediction lengths.
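    Accuracy comparisons of this kind are typically scored as mean error of each predicted series against the finally determined ERP values, reported per day of prediction length. A hypothetical sketch on synthetic forecasts (the error model and numbers are invented, not SNIIM or IERS data):

```python
import numpy as np

def mae_by_horizon(predictions, truth):
    """predictions, truth: arrays of shape (n_forecasts, horizon_days).
    Returns the mean absolute error per day of prediction length."""
    return np.abs(predictions - truth).mean(axis=0)

rng = np.random.default_rng(0)
horizon, n = 10, 200
truth = np.zeros((n, horizon))
# Synthetic error model: prediction errors grow with prediction length,
# and "method B" has a larger per-day error scale than "method A".
scale = np.arange(1, horizon + 1)
method_a = truth + rng.normal(0, 0.10, (n, horizon)) * scale
method_b = truth + rng.normal(0, 0.15, (n, horizon)) * scale

mae_a = mae_by_horizon(method_a, truth)
mae_b = mae_by_horizon(method_b, truth)
print(mae_a[-1], mae_b[-1])   # method A is more accurate at the longest horizon
```

The real comparison additionally has to align the prediction epochs of the two services and use the final (not rapid) IERS series as the truth reference.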