WorldWideScience

Sample records for 100-year tool applied

  1. 100 years of superconductivity

    Rogalla, Horst

    2011-01-01

    Even a hundred years after its discovery, superconductivity continues to bring us new surprises, from superconducting magnets used in MRI to quantum detectors in electronics. 100 Years of Superconductivity presents a comprehensive collection of topics on nearly all the subdisciplines of superconductivity. Tracing the historical developments in superconductivity, the book includes contributions from many pioneers who are responsible for important steps forward in the field. The text first discusses interesting stories of the discovery and gradual progress of theory and experimentation. Emphasizi...

  2. Convergence: Human Intelligence The Next 100 Years

    Fluellen, Jerry E., Jr.

    2005-01-01

    How might human intelligence evolve over the next 100 years? This issue paper explores that idea. First, the paper summarizes five emerging perspectives about human intelligence: Howard Gardner's multiple intelligences theory, Robert Sternberg's triarchic theory of intelligence, Ellen Langer's mindfulness theory, David Perkins' learnable…

  3. Applied regression analysis a research tool

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
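
    A minimal sketch of the ordinary least squares the book builds on, in Python with NumPy; the data points are invented for illustration:

      import numpy as np

      # Invented illustrative data: response y versus predictor x.
      x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
      y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9])

      # Design matrix with an intercept column; solve min ||Xb - y||^2.
      X = np.column_stack([np.ones_like(x), x])
      b, *_ = np.linalg.lstsq(X, y, rcond=None)
      print(f"intercept = {b[0]:.3f}, slope = {b[1]:.3f}")

      # R^2, the share of variance explained by the fit.
      ss_res = np.sum((y - X @ b) ** 2)
      ss_tot = np.sum((y - y.mean()) ** 2)
      print(f"R^2 = {1 - ss_res / ss_tot:.3f}")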

  4. Analysis of 100 Years of Curriculum Designs

    Lynn Kelting-Gibson

    2013-01-01

    Fifteen historical and contemporary curriculum designs were analyzed for elements of assessment that support student learning and inform instructional decisions. Educational researchers are purposely paying attention to the role assessment plays in a well-designed planning and teaching process. Assessment is a vital component to educational planning and teaching because it is a way to gather accurate evidence of student learning and information to inform instructional decisions. The purpose of this review was to analyze 100 years of curriculum designs to uncover elements of assessment that will support teachers in their desire to improve student learning. Throughout this research the author seeks to answer the question: Do historical and contemporary curriculum designs include elements of assessment that help teachers improve student learning? The results of the review reveal that assessment elements were addressed in all of the curricular designs analyzed, but not all elements of assessment were identified using similar terminology. Based on the analysis of this review, it is suggested that teachers not be particular about the terminology used to describe assessment elements, as all curriculum models discussed use one or more elements similar to the context of pre, formative, and summative assessments.

  5. 100 Years of the Physics of Diodes

    Luginsland, John

    2013-10-01

    The Child-Langmuir Law (CL), discovered 100 years ago, gives the maximum current that can be transported across a planar diode in the steady state. As a quintessential example of the impact of space-charge shielding near a charged surface, it is central to the studies of high current diodes, such as high power microwave sources, vacuum microelectronics, electron and ion sources, and high current drivers used in high-energy density physics experiments. CL remains a touchstone of fundamental sheath physics, including contemporary studies of nano-scale quantum diodes and plasmonic devices. Its solid state analog is the Mott-Gurney law, governing the maximum charge injection in solids, such as organic materials and other dielectrics, which is important to energy devices, such as solar cells and light-emitting diodes. This paper reviews the important advances in the physics of diodes since the discovery of CL, including virtual cathode formation and extension of CL to multiple dimensions, to the quantum regime, and to ultrafast processes. We will review the influence of magnetic fields, multiple species in bipolar flow, electromagnetic and time dependent effects in both short pulse and high frequency THz limits, and single electron regimes. Transitions from various emission mechanisms (thermionic, field, and photo-emission) to the space charge limited state (CL) will be addressed, especially highlighting important simulation and experimental developments in selected contemporary areas of study. This talk will stress the fundamental physical links between the physics of beams and limiting currents in other areas, such as low temperature plasmas, laser plasmas, and space propulsion. Also emphasized is the role of non-equilibrium phenomena associated with materials and plasmas in close contact. Work supported by the Air Force Office of Scientific Research.
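
    For reference, the law the abstract celebrates can be evaluated in a few lines. A minimal sketch (the 10 kV, 1 cm operating point is our illustration, not from the talk):

      import math

      EPS0 = 8.8541878128e-12     # vacuum permittivity, F/m
      E_CHARGE = 1.602176634e-19  # elementary charge, C
      M_E = 9.1093837015e-31      # electron mass, kg

      def child_langmuir_j(voltage_v, gap_m):
          """Space-charge-limited current density (A/m^2) of a planar vacuum
          diode: J = (4*eps0/9) * sqrt(2e/m) * V^(3/2) / d^2."""
          return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * E_CHARGE / M_E) \
                 * voltage_v ** 1.5 / gap_m ** 2

      # Example: 10 kV across a 1 cm gap -> roughly 2.3 A/cm^2.
      print(f"J = {child_langmuir_j(10e3, 0.01):.3e} A/m^2")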

  6. Beam Line: 100 years of elementary particles

    Pais, A.; Weinberg, S.; Quigg, C.; Riordan, M.; Panofsky, W. K. H.

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  7. Opening the 100-Year Window for Time-Domain Astronomy

    Grindlay, Jonathan; Tang, Sumin; Los, Edward; Servillat, Mathieu

    2012-04-01

    The large-scale surveys such as PTF, CRTS and Pan-STARRS-1 that have emerged within the past 5 years or so employ digital databases and modern analysis tools to accentuate research into Time Domain Astronomy (TDA). Preparations are underway for LSST which, in another 6 years, will usher in the second decade of modern TDA. By that time the Digital Access to a Sky Century @ Harvard (DASCH) project will have made available to the community the full sky Historical TDA database and digitized images for a century (1890-1990) of coverage. We describe the current DASCH development and some initial results, and outline plans for the "production scanning" phase and data distribution which is to begin in 2012. That will open a 100-year window into temporal astrophysics, revealing rare transients and (especially) astrophysical phenomena that vary on time-scales of a decade. It will also provide context and archival comparisons for the deeper modern surveys.

  8. Opening the 100-Year Window for Time Domain Astronomy

    Grindlay, Jonathan; Los, Edward; Servillat, Mathieu

    2012-01-01

    The large-scale surveys such as PTF, CRTS and Pan-STARRS-1 that have emerged within the past 5 years or so employ digital databases and modern analysis tools to accentuate research into Time Domain Astronomy (TDA). Preparations are underway for LSST which, in another 6 years, will usher in the second decade of modern TDA. By that time the Digital Access to a Sky Century @ Harvard (DASCH) project will have made available to the community the full sky Historical TDA database and digitized images for a century (1890-1990) of coverage. We describe the current DASCH development and some initial results, and outline plans for the "production scanning" phase and data distribution which is to begin in 2012. That will open a 100-year window into temporal astrophysics, revealing rare transients and (especially) astrophysical phenomena that vary on time-scales of a decade. It will also provide context and archival comparisons for the deeper modern surveys.

  9. Advances of Bioinformatics Tools Applied in Virus Epitopes Prediction

    Ping Chen; Simon Rayner; Kang-hong Hu

    2011-01-01

    In recent years, in silico epitope prediction tools have significantly facilitated the progress of vaccine development, and many have been successfully applied to predict epitopes in viruses. Herein, a general overview of the different tools currently available, including T-cell and B-cell epitope prediction tools, is presented. The principles of the different prediction algorithms are reviewed briefly. Finally, several examples are presented to illustrate the application of the prediction tools.

  10. 100-Year Flood-It's All About Chance

    Holmes, Jr., Robert R.; Dinicola, Karen

    2010-01-01

    In the 1960s, the United States government decided to use the 1-percent annual exceedance probability (AEP) flood as the basis for the National Flood Insurance Program. The 1-percent AEP flood was thought to be a fair balance between protecting the public and overly stringent regulation. Because the 1-percent AEP flood has a 1 in 100 chance of being equaled or exceeded in any 1 year, and it has an average recurrence interval of 100 years, it often is referred to as the '100-year flood'. The term '100-year flood' is part of the national lexicon, but it is often a source of confusion for those not familiar with flood science and statistics. This poster is an attempt to explain the concept, probabilistic nature, and inherent uncertainties of the '100-year flood' to the layman.
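
    The arithmetic behind the poster is compact: with an independent 1-percent chance each year, the probability of seeing at least one such flood in n years is 1 - 0.99^n. A short Python illustration (the horizons are our choice, not the poster's):

      # Chance of at least one "100-year flood" (1% AEP) over an n-year horizon.
      p = 0.01
      for years in (1, 10, 30, 100):
          print(f"{years:>3} years: {1 - (1 - p) ** years:.1%}")
      # Over a 30-year mortgage the chance is about 26%; over 100 years, about 63%.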

  11. Understanding General Relativity after 100 years: A matter of perspective

    Dadhich, Naresh

    2016-01-01

    This is the centenary year of general relativity; it is therefore natural to reflect on what perspective we have evolved in 100 years. I wish to share here a novel perspective, and the insights and directions that ensue from it.

  12. Using Applied Theatre as a Tool to Address Netizenship

    Skeiker, Fadi Fayad

    2015-01-01

    This paper charts the ways in which a researcher uses applied theatre practice as a tool to address netizenship issues in the digital age by documenting a workshop he co-facilitated with graduate students at the University of Porto during the Future Places conference in 2013. The workshop used applied theatre both to catalyze…

  13. The Who and Why of 100 Year Bonds

    Richard Kish

    2014-09-01

    A unique category of debt, 100 year bonds, offers limited benefits to both buyers and sellers. From an issuer's viewpoint, the entities are able to lock in low financing over an extended period of time, since these bonds are only offered during periods of low interest rates. From the buyer's viewpoint, 100 year bonds provide investors with an option for lengthening the duration and convexity of their portfolios. Firms, such as life insurance companies, may be interested in lengthening the duration of their portfolios to match liabilities or other investment strategies. Besides supporting answers to the question of why 100 year bonds are issued, we outline the two waves of issuance, the issuers, and the industry categories of issuers and investors; and we provide examples showing the effects of changes in interest rates when investing in these bonds.
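
    A minimal sketch of the duration effect the abstract describes: the same yield move changes a 100-year bond's price far more than a 30-year bond's. The 6% coupon and the one-point yield drop are invented for illustration:

      def bond_price(face, coupon_rate, yield_rate, years):
          """Present value of annual coupons plus the face value at maturity."""
          c = face * coupon_rate
          return sum(c / (1 + yield_rate) ** t for t in range(1, years + 1)) \
                 + face / (1 + yield_rate) ** years

      # Priced at par (6% coupon, 6% yield), then the yield falls to 5%.
      for years in (30, 100):
          p0 = bond_price(100, 0.06, 0.06, years)
          p1 = bond_price(100, 0.06, 0.05, years)
          print(f"{years:>3}-year bond: {p0:.2f} -> {p1:.2f} ({p1 / p0 - 1:+.1%})")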

  14. Semantic Differential applied to the evaluation of machine tool design

    Mondragón Donés, Salvador; Company, Pedro; Vergara Monedero, Margarita

    2005-01-01

    In this article, a study is presented showing that Product Semantics (PS) can be used to study the design of machine tools. Nowadays, different approaches to PS (Semantic Differential, Kansei Engineering, etc.) are being applied to consumer products with successful results, but commercial products have generally received less attention and machine tools in particular have not yet been studied. Our second objective is to measure the different sensitivities that the different groups of the popu...

  15. Centennial Calendar- 100 Years of the American Phytopathological Society

    I edited a 40-page publication (calendar) that covered 18 chapters written by members of our society. This covered pioneering researchers, departments, and epidemics of the last 100 years of plant pathology in the U. S. This was given to all members of the American Phytopathological Society who att...

  16. The 7 basic tools of quality applied to radiological safety

    This work seeks to establish a series of correspondences between the pursuit of quality and the optimization of the doses received by occupationally exposed personnel. The seven basic statistical quality tools are treated: the Pareto technique, cause-effect diagrams, stratification, check sheets, histograms, scatter diagrams, and control charts, as applied to radiological safety.
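
    A sketch of the first of the seven tools, a Pareto analysis, applied to invented annual dose contributions by task; the point is to expose the "vital few" tasks that dominate collective dose:

      # Invented person-mSv contributions per task type.
      doses = {"refuelling": 42.0, "maintenance": 27.0, "inspection": 9.0,
               "waste handling": 6.0, "decontamination": 4.0, "other": 2.0}
      total = sum(doses.values())
      running = 0.0
      for task, dose in sorted(doses.items(), key=lambda kv: -kv[1]):
          running += dose
          print(f"{task:<16} {dose:5.1f} person-mSv  cumulative {running / total:5.1%}")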

  17. Hygrothermal Numerical Simulation Tools Applied to Building Physics

    Delgado, João M P Q; Ramos, Nuno M M; Freitas, Vasco Peixoto

    2013-01-01

    This book presents a critical review on the development and application of hygrothermal analysis methods to simulate the coupled transport processes of Heat, Air, and Moisture (HAM) transfer for one or multidimensional cases. During the past few decades there has been relevant development in this field of study and an increase in the professional use of tools that simulate some of the physical phenomena that are involved in Heat, Air and Moisture conditions in building components or elements. Although there is a significant number of hygrothermal models referred to in the literature, the vast majority of them are not easily available to the public outside the institutions where they were developed, which restricts the analysis of this book to only 14 hygrothermal modelling tools. The special features of this book are (a) a state-of-the-art of numerical simulation tools applied to building physics, (b) the boundary conditions importance, (c) the material properties, namely, experimental methods for the measuremen...
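
    A toy flavour of the simplest building block inside such HAM tools, a 1-D transient heat-conduction solve by explicit finite differences; the wall and material values are illustrative assumptions, not taken from the book:

      import numpy as np

      L, n = 0.2, 41                 # wall thickness (m), grid points
      alpha = 7e-7                   # thermal diffusivity (m^2/s), concrete-like
      dx = L / (n - 1)
      dt = 0.4 * dx * dx / alpha     # stable explicit time step (r = 0.4 < 0.5)

      T = np.full(n, 20.0)           # initial temperature, deg C
      T[0], T[-1] = 0.0, 20.0        # cold outdoor face, warm indoor face

      for _ in range(20000):         # march roughly three days in time
          T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

      print(f"mid-wall temperature: {T[n // 2]:.2f} C")  # tends toward ~10 C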

  18. Coating-substrate-simulations applied to HFQ® forming tools

    Leopold Jürgen

    2015-01-01

    In this paper a comparative analysis of coating-substrate simulations applied to HFQ® forming tools is presented. When using the solution heat treatment cold die forming and quenching process, known as HFQ®, for forming hardened aluminium alloy automotive panel parts, coating-substrate systems have to satisfy unique requirements. Numerical experiments, based on the Advanced Adaptive FE method, are finally presented.

  19. Relativity and Gravitation : 100 Years After Einstein in Prague

    Ledvinka, Tomáš

    2014-01-01

    In early April 1911 Albert Einstein arrived in Prague to become full professor of theoretical physics at the German part of Charles University. It was there, for the first time, that he concentrated primarily on the problem of gravitation. Before he left Prague in July 1912 he had submitted the paper “Relativität und Gravitation: Erwiderung auf eine Bemerkung von M. Abraham” in which he remarkably anticipated what a future theory of gravity should look like. On the occasion of the Einstein-in-Prague centenary an international meeting was organized under a title inspired by Einstein's last paper from the Prague period: "Relativity and Gravitation, 100 Years after Einstein in Prague". The main topics of the conference included: classical relativity, numerical relativity, relativistic astrophysics and cosmology, quantum gravity, experimental aspects of gravitation, and conceptual and historical issues. The conference attracted over 200 scientists from 31 countries, among them a number of leading experts in ...

  20. NASA Administrator Dan Goldin greets 100-year-old VIP.

    2000-01-01

    Astronaut Andy Thomas (left) greets 100-year-old Captain Ralph Charles, one of the VIPs attending the launch of STS-99. Charles also met NASA Administrator Dan Goldin. An aviator who has the distinction of being the oldest licensed pilot in the United States, Charles is still flying. He has experienced nearly a century of flight history, from the Wright Brothers to the Space Program. He took flying lessons from one of the first fliers trained by Orville Wright, first repaired then built airplanes, went barnstorming, operated a charter service in the Caribbean, and worked as a test pilot for the Curtiss Wright Airplane Co. Charles watches all the Shuttle launches from his home in Ohio and his greatest wish is to be able to watch one in person from KSC.

  1. Overuse syndrome in musicians--100 years ago. An historical review.

    Fry, H J

    Overuse syndrome in musicians was extensively reported 100 years ago. The clinical features and results of treatment, which were recorded in considerable detail, match well the condition that is described today. The medical literature that is reviewed here extends from 1830 to 1911 and includes 21 books and 54 articles from the English language literature, apart from two exceptions; however, the writers of the day themselves reviewed French, German and Italian literature on the subject. The disorder was said to result from the overuse of the affected parts. Two theories of aetiology, not necessarily mutually exclusive, were argued. The central theory regarded the lesion as being in the central nervous system, the peripheral theory implied a primary muscle disorder. No serious case was put forward for a psychogenic origin, though emotional factors were believed to aggravate the condition. Advances in musical instrument manufacture--particularly the development of the concert piano and the clarinet--may have played a part in the prevalence of overuse syndrome in musicians. Total rest from the mechanical use of the hand was the only effective treatment recorded. PMID:3540544

  2. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment...

  3. Creating Long Term Income Streams for the 100 Year Starship Study Initiative

    Sylvester, A. J.

    Development and execution of long term research projects are very dependent on a consistent application of funding to maximize the potential for success. The business structure for the 100 Year Starship Study project should allow for multiple income streams to cover the expenses of the research objectives. The following examples illustrate the range of potential avenues: 1) affiliation with a charitable foundation for creating a donation program to fund a long term endowment for research, 2) application for grants to fund initial research projects and establish the core expertise of the research entity, 3) development of intellectual property which can then be licensed for additional revenue, 4) creation of spinout companies with equity positions retained by the lab for funding the endowment, and 5) funded research which is dual use for the technology goals of the interstellar flight research objectives. With the establishment of a diversified stream of funding options, the endowment can be funded at a level to permit dedicated research on the interstellar flight topics. This paper will focus on the strategy of creating spinout companies to create income streams which would fund the endowment of the 100 Year Starship Study effort. This technique is widely used by universities seeking to commercially develop and market technologies developed by university researchers. An approach will be outlined for applying this technique to potentially marketable technologies generated as a part of the 100 Year Starship Study effort.

  4. 100 years of seismic research on the Moho

    Prodehl, Claus; Kennett, Brian; Artemieva, Irina

    2013-01-01

    The detection of a seismic boundary, the “Moho”, between the outermost shell of the Earth, the Earth's crust, and the Earth's mantle by A. Mohorovičić was the consequence of increased insight into the propagation of seismic waves caused by earthquakes. This short history of seismic research on the Moho... Since the 1980s, passive seismology using distant earthquakes has played an increasingly important role in studies of crustal structure. The receiver function technique, exploiting conversions between P and SV waves at discontinuities in seismic wavespeed below a seismic station, has been extensively applied to the increasing numbers of permanent and portable broad-band seismic stations across the globe. Receiver function studies supplement controlled source work with improved geographic coverage and now make a significant contribution to knowledge of the nature of the crust and the depth to Moho.

  5. Applying MDE Tools at Runtime: Experiments upon Runtime Models

    Song, Hui; Huang, Gang; Chauvel, Franck; Sun, Yanshun

    2010-01-01

    Runtime models facilitate the management of running systems in many different ways. One of the advantages of runtime models is that they enable the use of existing MDE tools at runtime to implement common auxiliary activities in runtime management, such as querying, visualization, and transformation. In this tool demonstration paper, we focus on this specific aspect of runtime models. We discuss the requirements of runtime models to enable the use of...

  6. Process for selecting engineering tools : applied to selecting a SysML tool.

    De Spain, Mark J.; Post, Debra S. (Sandia National Laboratories, Livermore, CA); Taylor, Jeffrey L.; De Jong, Kent

    2011-02-01

    Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature, and users could apply it to select most engineering tools and software applications.

  7. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth in understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric sciences draws on core knowledge traditionally taught in physics, chemistry, and in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, understanding the causes and consequences of climate change becomes a task of memorizing seemingly disparate facts for a student. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build necessary understanding of conservation of mass, conservation of energy, particulate nature of matter, kinetic molecular theory, and particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles and then conceptual tools are deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and in some cases, re-envisioning the format of a science classroom. There are few financial barriers to implementation and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students developing conceptually rich understanding of the atmosphere and connections happening within.

  8. 100-Year Floodplains, Floodplains 100 year define in gold color, Published in 2009, 1:2400 (1in=200ft) scale, WABASH COUNTY GOVERNMENT.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:2400 (1in=200ft) scale, was produced all or in part from Published Reports/Deeds information as of 2009. It is...

  9. 100-Year Floodplains, 100 year flood plain data, Published in 2006, 1:1200 (1in=100ft) scale, Washoe County.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:1200 (1in=100ft) scale, was produced all or in part from Field Survey/GPS information as of 2006. It is described...

  10. Applying AI tools to operational space environmental analysis

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines...

  11. Geo-environmental mapping tool applied to pipeline design

    Andrade, Karina de S.; Calle, Jose A.; Gil, Euzebio J. [Geomecanica S/A Tecnologia de Solo Rochas e Materiais, Rio de Janeiro, RJ (Brazil); Sare, Alexandre R. [Geomechanics International Inc., Houston, TX (United States); Soares, Ana Cecilia [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The Geo-Environmental Mapping is an improvement of the Geological-Geotechnical Mapping used for basic pipeline designs. The main purpose is to assemble the environmental, geotechnical and geological concepts in a methodological tool capable of predicting constraints and reducing the pipeline's impact on the environment. The Geo-Environmental mapping was built to stress the influence of soil/structure interaction, related to the physical effect that comes from the contact between structures and soil or rock. A Geological-Geotechnical-Environmental strip (chart) was presented to emphasize the pipeline operational constraints and their influence on the environment. The mapping was developed to clearly show the occurrence and properties of geological materials divided into geotechnical domain units (zones). The strips present construction natural properties, such as: excavability, stability of the excavation and soil re-use capability. Also, the environmental constraints were added to the geological-geotechnical mapping. The Geo-Environmental Mapping model helps the planning of the geotechnical and environmental inquiries to be carried out during executive design, the discussion on the types of equipment to be employed during construction and the analysis of the geological risks and environmental impacts to be faced during the construction of the pipeline. (author)

  12. Monitoring operational data production applying Big Data tooling

    Som de Cerff, Wim; de Jong, Hotze; van den Berg, Roy; Bos, Jeroen; Oosterhoff, Rijk; Klein Ikkink, Henk Jan; Haga, Femke; Elsten, Tom; Verhoef, Hans; Koutek, Michal; van de Vegte, John

    2015-04-01

    Within the KNMI Deltaplan programme for improving the KNMI operational infrastructure a new fully automated system for monitoring the KNMI operational data production systems is being developed: PRISMA (PRocessflow Infrastructure Surveillance and Monitoring Application). Currently the KNMI operational (24/7) production systems consist of over 60 applications, running on different hardware systems and platforms. They are interlinked for the production of numerous data products, which are delivered to internal and external customers. All applications are individually monitored by different applications, complicating root cause and impact analysis. Also, the underlying hardware and network is monitored separately using Zabbix. The goal of the new system is to enable production chain monitoring, which supports root cause analysis (what is the root cause of the disruption) and impact analysis (what other products will be affected). The PRISMA system will make it possible to dispose of all the existing monitoring applications, providing one interface for monitoring the data production. For modeling the production chain, the Neo4j graph database is used to store and query the model. The model can be edited through the PRISMA web interface, but is mainly provided automatically by the applications and systems which are to be monitored. The graph enables us to do root cause and impact analysis. The graph can be visualized in the PRISMA web interface on different levels. Each 'monitored object' in the model will have a status (OK, error, warning, unknown). This status is derived by combining all available log information. For collecting and querying the log information, Splunk is used. The system is developed using Scrum, by a multi-disciplinary team consisting of analysts, developers, a tester and an interaction designer. In the presentation we will focus on the lessons learned working with the 'Big data' tooling Splunk and Neo4j.
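
    The production-chain idea reduces to graph traversal. A plain-Python sketch (the real model lives in Neo4j; the node names here are invented): impact analysis is everything reachable downstream of a failed node:

      from collections import deque

      # Directed edges point from a source to the products depending on it.
      depends_on = {
          "radar_ingest": ["radar_composite"],
          "radar_composite": ["precip_product", "website_feed"],
          "precip_product": ["website_feed"],
          "website_feed": [],
      }

      def impact(failed):
          """All products downstream of a failed node (impact analysis);
          root cause analysis walks the same edges in reverse."""
          seen, queue = set(), deque([failed])
          while queue:
              for nxt in depends_on[queue.popleft()]:
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return seen

      print(impact("radar_ingest"))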

  13. Applied climate-change analysis: the climate wizard tool.

    Evan H Girvetz

    BACKGROUND: Although the message of "global climate change" is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. METHODOLOGY/PRINCIPAL FINDINGS: To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951-2002 occurred in northern hemisphere countries (especially during January-April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50 degrees N during February-March to 10 degrees N during August-September. Precipitation decreases occurred most commonly in countries between 0-20 degrees N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070-2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. CONCLUSIONS/SIGNIFICANCE: The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally...
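
    A sketch of the quantile ensemble step described above, with random stand-ins for the 16 GCM projections of warming in one region:

      import numpy as np

      rng = np.random.default_rng(0)
      projected_dT = rng.normal(loc=2.5, scale=0.8, size=16)  # 16 GCMs, deg C

      lo, med, hi = np.percentile(projected_dT, [10, 50, 90])
      print(f"10th/50th/90th percentile warming: {lo:.1f} / {med:.1f} / {hi:.1f} C")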

  14. From Smallpox to Big Data: The Next 100 Years of Epidemiologic Methods.

    Gange, Stephen J; Golub, Elizabeth T

    2016-03-01

    For more than a century, epidemiology has seen major shifts in both focus and methodology. Taking into consideration the explosion of "big data," the advent of more sophisticated data collection and analytical tools, and the increased interest in evidence-based solutions, we present a framework that summarizes 3 fundamental domains of epidemiologic methods that are relevant for the understanding of both historical contributions and future directions in public health. First, the manner in which populations and their follow-up are defined is expanding, with greater interest in online populations whose definition does not fit the usual classification by person, place, and time. Second, traditional data collection methods, such as population-based surveillance and individual interviews, have been supplemented with advances in measurement. From biomarkers to mobile health, innovations in the measurement of exposures and diseases enable refined accuracy of data collection. Lastly, the comparison of populations is at the heart of epidemiologic methodology. Risk factor epidemiology, prediction methods, and causal inference strategies are areas in which the field is continuing to make significant contributions to public health. The framework presented herein articulates the multifaceted ways in which epidemiologic methods make such contributions and can continue to do so as we embark upon the next 100 years. PMID:26443419

  15. 100-Year Floodplains, FEMA FIRM Mapping, Published in 2014, Not Applicable scale, GIS.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at Not Applicable scale, was produced all or in part from Other information as of 2014. It is described as 'FEMA FIRM...

  16. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were namely (1) a checklist without risk estimation (Tool A), (2) a checklist with a risk scale (Tool B), (3) a risk calculation without a formal hazard identification stage (Tool C), and (4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of different nature. Tools C and D utilized more systematic approaches than tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of (1) its comprehensive structure with respect to the steps suggested in risk management, (2) its dynamic approach to hazard identification, and (3) its use of data resulting from the risk analysis. PMID:26864350
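
    A minimal sketch of the final step of a Tool-D-style analysis, a likelihood x severity risk matrix; the 1-5 scales and band cut-offs below are invented, not taken from the paper:

      def risk_level(likelihood, severity):
          """Map a likelihood and a severity (both 1-5) to an action band."""
          score = likelihood * severity
          if score >= 15:
              return "high - no entry until additional controls are in place"
          if score >= 8:
              return "medium - entry permit plus specific controls"
          return "low - standard confined-space procedure"

      print(risk_level(4, 5))  # e.g. likely oxygen deficiency, fatal consequence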

  17. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is being realized increasingly that the most prevalent desktop metaphor underpinning the majority of tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a... This paper lays the theoretical foundations for applying activity theory to GSE. We analyze and explain the fundamental concepts of activity theory, and how they can be applied by using examples of software architecture design and evaluation processes. We describe the kind of data model and architectural support required for applying activity theory in...

  18. LOW FREQUENCY VARIABILITY OF INTERANNUAL CHANGE PATTERNS FOR GLOBAL MEAN TEMPERATURE DURING THE RECENT 100 YEARS

    刘晶淼; 丁裕国; et al.

    2002-01-01

    The temporally extended EOF (TEEOF) method is used to conduct a diagnostic study of the 1-, 3-, 6- and 10-year variation patterns of mean air temperature over the globe and the Southern and Northern Hemispheres over the course of 100 years. The results show that the first TEEOF mode accounts for more than 50% of the total variance, with the first mode of the interannual oscillations generally standing for annually varying patterns that are related to climate and reflect the long-term tendency of change in air temperature. This is particularly true for the first mode on the 10-year scale, which shows an obvious ascending trend in winter temperature, and its primary component over time follows the sequence of actual temperature very closely. Apart from the first mode of all time sections of TEEOF for the globe and the two hemispheres, and the second mode of the 1-year TEEOF, the interannual variations described by the other characteristic vectors show various patterns, with the corresponding primary components related to the long-term variability of specific interannual quasi-periodic oscillation structures. A T2 test applied to the annual variation pattern shows that the abrupt changes for the Southern Hemisphere and the globe come closer to the result of a single-element t test for mean temperature than those for the Northern Hemisphere do. This indicates that the T2 test, when carried out with patterns of multiple variables, seems more reasonable than the t test with single elements.
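
    A sketch of a time-extended EOF decomposition of the kind described: embed the series in sliding windows, take an SVD, and read off the variance fraction of the first mode. The monthly series and the 10-year window are synthetic assumptions:

      import numpy as np

      rng = np.random.default_rng(1)
      t = np.arange(1200.0)                  # 100 years of monthly values
      series = 0.002 * t + np.sin(2 * np.pi * t / 12) \
               + 0.3 * rng.standard_normal(t.size)

      window = 120                           # 10-year windows
      emb = np.stack([series[i:i + window] for i in range(t.size - window + 1)])
      emb -= emb.mean(axis=0)                # remove the mean state

      U, s, Vt = np.linalg.svd(emb, full_matrices=False)
      var_frac = s ** 2 / np.sum(s ** 2)
      print(f"first mode explains {var_frac[0]:.1%} of the variance")
      # Vt[0] is the dominant 10-year pattern; U[:, 0] * s[0] is its amplitude.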

  19. Discharge, gage height, and elevation of 100-year floods in the Hudson River basin, New York

    Archer, Roger J.

    1978-01-01

    The flood discharge that may be expected to be equaled or exceeded on the average of once in 100 years (100-year flood) was computed by the log-Pearson Type-III frequency relation for 72 stations in the Hudson River basin. These discharges and, where available, their corresponding gage height and elevation above mean sea level are presented in tabular form. A short explanation of computation methods is included. The data are to be used as part of a federally funded study of the water resources and related land resources of the Hudson River basin. (Woodard-USGS)
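
    A sketch of a log-Pearson Type III 100-year estimate from an annual peak series, using the Wilson-Hilferty approximation for the frequency factor; the peak flows below are synthetic, not the report's gauged data:

      import numpy as np

      rng = np.random.default_rng(2)
      peaks = rng.lognormal(mean=9.0, sigma=0.5, size=60)  # annual peaks, cfs

      logq = np.log10(peaks)
      mean, std = logq.mean(), logq.std(ddof=1)
      n = logq.size
      skew = (n / ((n - 1) * (n - 2))) * np.sum(((logq - mean) / std) ** 3)

      # Frequency factor K for 1% AEP; z is the normal quantile for 99%.
      z = 2.32635
      k = z if abs(skew) < 1e-6 else \
          (2 / skew) * ((1 + skew * z / 6 - skew ** 2 / 36) ** 3 - 1)

      q100 = 10 ** (mean + k * std)
      print(f"estimated 100-year flood: {q100:,.0f} cfs")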

  20. Applying observations of work activity in designing prototype data analysis tools

    Springmeyer, R.R.

    1993-07-06

    Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools and in customizing visualization environments to specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but initiated in-depth discussions about users' work, tools, technology, and requirements.

  1. Innovation and Involvement: 100 Years of Community Work with Older People.

    Glasby, Jon

    2000-01-01

    Birmingham Settlement has provided services to British older adults for over 100 years, including such innovations as adult day centers, meals on wheels, and transportation services. The participation of the clientele in research helped flesh out the history of the settlement through narratives that demonstrate its impact on the life of the…

  2. Ageing management of instrumentation and control systems for 100 years life of AHWR

    Currently, nuclear power plants are designed for a life of about 40 years. However, the Advanced Heavy Water Reactor (AHWR), being designed by BARC, is intended to have a life of 100 years. Instrumentation and Control (I and C) plays a crucial role in the safe operation of any nuclear reactor, and designing I and C for a life of 100 years poses a great many challenges. Experience has shown that ageing and obsolescence have the potential to cause the maintainability and operability of I and C systems in nuclear power plants to deteriorate well before the end of plant life. Hence, all ageing effects are to be detected in time and eliminated by repair, upgrading and replacement measures. Since no I and C system can survive a life of 100 years, special attention must be paid in the design to enable easy replacement, and every aspect of hardware and software design should deal with obsolescence. Design strategies like minimising the amount of cabling by resorting to networked data communication will go a long way towards achieving the desired life extension. Hence it is essential that an effective Ageing Management Programme be established at the very initial stages of design, planning and engineering of I and C systems for AHWR. This will ensure reliable continued operation of I and C systems for 100 years of life. (author)

  3. 100 Years of the American Economic Review: The Top 20 Articles

    Kenneth J. Arrow; B. Douglas Bernheim; Martin S. Feldstein; Daniel L. McFadden; James M. Poterba; Robert M. Solow

    2011-01-01

    This paper presents a list of the top 20 articles published in the American Economic Review during its first 100 years. This list was assembled in honor of the AER's one-hundredth anniversary by a group of distinguished economists at the request of the AER's editor. A brief description accompanies the citations of each article.

  4. The Observation Of Defects Of School Buildings Over 100 Years Old In Perak

    Alauddin Kartina

    2016-01-01

    Malaysia is blessed with a rich legacy of heritage buildings with unique architectural and historical values. Heritage buildings are a symbol of our country's national identity. Therefore, heritage buildings, as important monuments, should be conserved well to extend their life span and to ensure the continued functioning of the buildings for future generations. The aim of this study is to analyze the types of defects found in school buildings over 100 years old located in Perak. The data were collected in four different schools aged over 100 years in Perak. The findings of the study highlight the types of defects, categorized by building element, including external walls, roof, doors, ceilings, staircases, columns, internal walls, floors and windows. The findings showed that the types of defects occurring in school buildings over 100 years old in Perak are the same as in other heritage buildings. These findings can be used by all parties to take serious action in preventing defects from occurring in buildings over 100 years old. This would ensure that buildings' functional life span can be extended for future use.

  5. Simulations of the Greenland ice sheet 100 years into the future with the full Stokes model Elmer/Ice

    Seddik, H.; Greve, R.; Zwinger, T.; Gillet-Chaulet, F.; Gagliardini, O.

    2011-12-01

    The set C (three experiments) applies anomalies to the surface precipitation and temperature, and the set S (three experiments) applies an amplification factor to change the basal sliding velocity. The experiments are compared to a constant climate control run beginning at present (epoch 2004-1-1 0:0:0) and running up to 100 years holding the climate constant to its present state. The experiments with the amplification factor (set S) show high sensitivities. Relative to the control run, the scenario with an amplification factor of 3x applied to the sliding velocity produces a Greenland contribution to sea level rise of ~25 cm. An amplification factor of 2.5x produces a contribution of ~16 cm and an amplification factor of 2x produces a contribution of ~9 cm. The experiments with the changes to the surface precipitation and temperature (set C) show a contribution to sea level rise of ~4 cm when a factor 1x is applied to the temperature and precipitation anomalies. A factor 1.5x produces a sea level rise of ~8 cm and a factor 2x produces a sea level rise of ~12 cm.

  6. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system to the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for Process Control) which facilitates an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the control systems performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP. PMID:21330730
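
    A toy flavour of what such simulation-based design evaluates before deployment: a PI controller holding dissolved oxygen (DO) at a set-point in a one-tank model. The plant model and every coefficient are invented for illustration:

      dt, do, setpoint = 0.01, 0.5, 2.0   # step (h), DO (mg/L), target (mg/L)
      kp, ki, integral = 1.0, 0.5, 0.0    # PI gains and integrator state

      for _ in range(int(24 / dt)):       # one simulated day
          err = setpoint - do
          integral += err * dt
          air = max(0.0, kp * err + ki * integral)   # PI law, airflow >= 0
          kla = 0.8 * air                            # transfer grows with airflow
          do += dt * (kla * (8.0 - do) - 3.0)        # saturation ~8, uptake 3 mg/L/h

      print(f"DO after 24 h: {do:.2f} mg/L (set-point {setpoint})")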

  7. Lessons learned applying CASE methods/tools to Ada software development projects

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  8. 100 years of mapping the Holocene Rhine-Meuse delta plain: combining research and teaching

    Cohen, K.M.; Stouthamer, E.; Hoek, W.Z.; Middelkoop, H.

    2012-01-01

    The history of modern soil, geomorphological and shallow geological mapping in the Holocene Rhine-Meuse delta plain goes back about 100 years. The delta plain has a very heterogeneous build-up, with clayey and peaty flood basins, dissected by sandy fluvial distributary channel belts with fine-textured levees grading into tidally influenced rivers and estuaries. Several generations of precursor rivers occur as alluvial ridges and buried ribbon sands. They form an intricate network originating fr...

  9. Comparative structural response of two steel bridges constructed 100 years apart

    Varum, Humberto; Sousa, Romain; Delgado, Walter; Fernandes, Catarina; Costa, Anibal; Jara, Jose M.; Jara, Manuel; Álvarez, Jose J.

    2011-01-01

    This paper presents a comparative numerical analysis of the structural behaviour and seismic performance of two existing steel bridges, the Infiernillo II Bridge and the Pinhao Bridge, one located in Mexico and the other in Portugal. The two bridges have similar general geometrical characteristics, but were constructed 100 years apart. Three-dimensional structural models of both bridges are developed and analysed for various load cases and several seismic conditions. The results of the compar...

  10. 100 years of Einstein's theory of Brownian motion: from pollen grains to protein trains

    Chowdhury, Debashish

    2005-01-01

    Experimental verification of the theoretical predictions made by Albert Einstein in his paper, published in 1905, on the molecular mechanisms of Brownian motion established the existence of atoms. In the last 100 years, discoveries of many facets of the ubiquitous Brownian motion have revolutionized our fundamental understanding of the role of thermal fluctuations in the exotic structures and complex dynamics exhibited by soft matter like, for example, colloids, gels, etc. The domain of B...
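
    Einstein's result made concrete through the Stokes-Einstein relation D = kT / (6*pi*eta*r), with mean squared displacement 2Dt per dimension; the pollen-sized particle and water viscosity below are illustrative values:

      import math

      KB = 1.380649e-23   # Boltzmann constant, J/K
      T = 298.0           # temperature, K
      ETA = 1.0e-3        # viscosity of water, Pa*s
      R = 0.5e-6          # particle radius, m (pollen-fragment scale)

      D = KB * T / (6 * math.pi * ETA * R)
      msd_1d = 2 * D * 1.0     # one second of diffusion, one dimension
      print(f"D = {D:.2e} m^2/s; rms displacement in 1 s = "
            f"{math.sqrt(msd_1d) * 1e6:.2f} um")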

  11. 100-year history of the development of bread winter wheat breeding programs

    Литвиненко, М. А.

    2016-01-01

    Purpose. Review of the main achievements of the Wheat Breeding and Seed Production Department in the Plant Breeding and Genetic Institute – National Centre of Seed and Cultivar Investigation in developing theoretical principles of breeding and the creation of winter wheat varieties of different types during the 100-year (1916–2016) period of breeding program realization. Results. The main theoretical, methodical developments and breeding achievements of Wheat Breeding and Seed Production Departme...

  12. The Extent of Applying Strategic Management Accounting Tools in Jordanian Banks

    Musa Abdel Latif Ibrahim Alnawaiseh

    2013-01-01

    The study aims to determine the extent of applying strategic management accounting tools in Jordanian banks. The tools that this study tested are: Activity Based Costing, Benchmarking, Competitor Analysis, Valuing Customers, Integrated Performance Measurement, Life Cycle Costing, Cost of Quality, Brand Value Monitoring, Managing and Budgeting, Strategic Pricing, Target Costing, Value Chain Costing and Balanced Scorecard. To analyze the responses of the respondents (employees) in these banks, and the e...

  13. Accuracy assessment of the UT1 prediction method based on 100-year series analysis

    Malkin, Z.; Tissen, V.; Tolstikov, A.

    2013-01-01

    A new method has been developed at the Siberian Research Institute of Metrology (SNIIM) for highly accurate prediction of UT1 and Pole coordinates. The method is based on the construction of a general polyharmonic model of the variations of the Earth rotation parameters using all the data available for the last 80-100 years, and a modified autoregression technique. In this presentation, a detailed comparison was made of real-time UT1 predictions computed using this method in 2006-2010 with ...
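
    A sketch of the polyharmonic idea: least-squares fit of a trend plus a handful of harmonics, then extrapolation. The periods and the synthetic series are our illustrative assumptions, not the SNIIM model:

      import numpy as np

      rng = np.random.default_rng(3)
      t = np.arange(3650.0)               # 10 years of daily values
      ut1 = 1e-4 * t + 0.02 * np.sin(2 * np.pi * t / 365.25) \
            + 0.005 * rng.standard_normal(t.size)

      periods = [365.25, 182.62, 13.66]   # annual, semi-annual, fortnightly
      cols = [np.ones_like(t), t]
      for p in periods:
          cols += [np.sin(2 * np.pi * t / p), np.cos(2 * np.pi * t / p)]
      A = np.column_stack(cols)
      coef, *_ = np.linalg.lstsq(A, ut1, rcond=None)

      tf = t[-1] + 30                     # predict 30 days ahead
      row = [1.0, tf] + [f(2 * np.pi * tf / p)
                         for p in periods for f in (np.sin, np.cos)]
      print(f"prediction at t+30 d: {np.dot(row, coef):.4f}")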

  14. Oceanic environmental changes of subarctic Bering Sea in recent 100 years: Evidence from molecular fossils

    LU Bing; CHEN Ronghua; ZHOU Huaiyang; WANG Zipan; CHEN Jianfang; ZHU Chun

    2005-01-01

    The core sample B2-9 from the seafloor of the subarctic Bering Sea was dated with 210Pb to obtain a consecutive sequence of oceanic sedimentary environments at decadal intervals during 1890-1999. A variety of molecular fossils were detected, including n-alkanes, isoprenoids, fatty acids, sterols, etc. From the characteristics of these molecules (C27, C28, and C29 sterols) and their molecular indices (Pr/Ph, ∑C22+/∑C21−, CPI and C18:2/C18:0), and in consideration of the variation of organic carbon content, the 100-year evolution history of the subarctic sea paleoenvironment was re-established. It is indicated that during the past 100 years in the Arctic there were two events of strong climate warming (1920-1950 and 1980-1999), which resulted in an oxidized sediment environment owing to decreasing terrigenous organic matter and increasing marine-derived organic matter, and two events of transitory climate cooling (1910 and 1970-1980), which resulted in a slightly reduced sediment environment owing to increasing terrigenous organic matter and decreasing marine-derived organic matter. It is revealed that these alternating warming/cooling processes are directly related to Arctic and global climate variations.
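
    One widely used form of the carbon preference index (CPI) behind such n-alkane readings, computed on invented abundances; values well above 1 favour a terrigenous (land-plant) source:

      # Invented C24-C34 n-alkane abundances (arbitrary units).
      abundance = {24: 0.6, 25: 1.1, 26: 0.7, 27: 1.5, 28: 0.8, 29: 1.9,
                   30: 0.9, 31: 1.7, 32: 0.8, 33: 1.2, 34: 0.6}

      odd = sum(abundance[c] for c in range(25, 34, 2))      # C25..C33
      even_lo = sum(abundance[c] for c in range(24, 33, 2))  # C24..C32
      even_hi = sum(abundance[c] for c in range(26, 35, 2))  # C26..C34

      cpi = 0.5 * (odd / even_lo + odd / even_hi)
      print(f"CPI = {cpi:.2f}")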

  15. Extending dry storage of spent LWR fuel for up to 100 years

    Because of delays in closing the back end of the fuel cycle in the U.S., there is a need to extend dry inert storage of spent fuel beyond its originally anticipated 20-year duration. Many of the methodologies developed to support initial licensing for 20-year storage should be able to support the longer storage periods envisioned. This paper evaluates the applicability of existing information and methodologies to support dry storage up to 100 years. The thrust of the analysis is the potential behavior of the spent fuel. In the USA, the criteria for dry storage of LWR spent fuel are delineated in 10 CFR 72 [1]. The criteria fall into four general categories: maintain subcriticality, prevent the release of radioactive material above acceptable limits, ensure that radiation rates and doses do not exceed acceptable levels, and maintain retrievability of the stored radioactive material. These criteria need to be considered for normal, off-normal, and postulated accident conditions. The initial safety analysis report submitted for licensing evaluated the fuel's ability to meet the requirements for 20 years. It is not the intent to repeat these calculations, but to look at expected behavior over the additional 80 years, during which the temperatures and radiation fields are lower. During the first 20 years, the properties of the components may change because of elevated temperatures, presence of moisture, effects of radiation, etc. During normal storage in an inert atmosphere, there is potential for the cladding mechanical properties to change due to annealing or interaction with cask materials. The emissivity of the cladding could also change due to storage conditions. If there is air leakage into the cask, additional degradation could occur through oxidation in breached rods, which could lead to additional fission gas release and enlargement of cladding breaches. Air in-leakage could also affect cover gas conductivity, cladding oxidation, emissivity changes, and

  17. Degradation of building materials over a lifespan of 30-100 years

    Following preliminary visits to four Magnox nuclear power stations, a study was made of existing Central Electricity Generating Board (CEGB) reports on the condition of buildings at eight power stations. Sampling of building materials, non-destructive testing, and inspections were carried out at the Trawsfynydd, Oldbury, and Dungeness 'A' Magnox power stations, and the samples were subsequently laboratory tested. From the results of this work it can be concluded that little major deterioration is likely to occur in the reactor buildings at Trawsfynydd and Oldbury over the next 50 years, and at Dungeness 'A' for at least 25 years, assuming reasonable maintenance and the continuation of suitable internal temperatures and relative humidities. Because of the limitations on taking samples from, and performing tests on, the reactor biological shields and prestressed concrete vessels, no sensible forecast can be made of their potential life in the 75-100 year range.

  18. Sustainable Foods and Medicines Support Vitality, Sex and Longevity for a 100-Year Starship Expedition

    Edwards, M. R.

    Extended space flight requires foods and medicines that sustain crew health and vitality. The health and therapeutic needs of the entire crew and their children over a 100-year space flight must be met sustainably. The starship cannot depend on resupply or carry a large cargo of pharmaceuticals. Everything in the starship must be completely recyclable and reconstructable, including food, feed, textiles, building materials, pharmaceuticals, vaccines, and medicines. Smart microfarms will produce functional foods with superior nutrition and sensory attributes. These foods provide high-quality protein and nutralence (nutrient density) that helps avoid obesity, diabetes, and other Western diseases. The combination of functional foods, lifestyle actions, and medicines will support crew immunity, energy, vitality, sustained strong health, and longevity. Smart microfarms enable the production of fresh medicines in hours or days, eliminating the need for a large dispensary and with it any concern over drug shelf life. Smart microfarms are adaptable to the extreme growing-area, resource, and environmental constraints associated with an extended starship expedition.

  19. 100 Years of British military neurosurgery: on the shoulders of giants.

    Roberts, S A G

    2015-01-01

    Death from head injuries has been a feature of conflicts throughout the world for centuries. The burden of mortality has been variously affected by the evolution in weaponry from war-hammers to explosive ordnance, the influence of armour on survivability, and the changing likelihood of infection as a complicating factor. Surgery evolved from haphazard trephination to valiant, yet disjointed, neurosurgery by a variety of great historical surgeons until the Crimean War of 1853-1856. However, it was the events initiated by the Great War of 1914-1918 that marked the development not only of modern neurosurgical techniques but of our approach to military surgery as a whole. Here the author describes how 100 years of conflict, and the contributions and intertwining relationships of the 20th century's great neurosurgeons, established neurosurgery in the United Kingdom and beyond. PMID:26292388

  20. Were moas really hunted to extinction in less than 100 years?

    Three months ago New Zealand archaeologists were surprised to read in their daily newspapers that moas had been eaten to extinction by Maori moahunters in less than 100 years. The claim had been made in the US journal 'Science' by Richard Holdaway, formerly with Canterbury University, and Chris Jacomb of Canterbury Museum. It seems to me there are a number of weaknesses in the original paper, which should have been thrashed out locally before going for prestigious exposure overseas. The rapid extinction claim is based first of all on a 'Leslie matrix model' of moa population dynamics, and secondly on some recent carbon dates of a single archaeological site, Monck's Cave, near Christchurch. 21 refs

  1. The volcanic contribution to climate change of the past 100 years

    Volcanic eruptions which inject large amounts of sulfur-rich gas into the stratosphere produce dust veils that last several years and cool the Earth's surface. At the same time, these dust veils absorb enough solar radiation to warm the stratosphere. Since these temperature changes at the Earth's surface and in the stratosphere are both in the opposite direction to the hypothesized effects of greenhouse gases, they act to delay and mask the detection of greenhouse effects on the climate system. A large portion of the global climate change of the past 100 years may be due to the effects of volcanoes, but a definitive answer is not yet clear. Effects lasting several years have been demonstrated in both data studies and numerical models; long-term effects, although found in climate model calculations, await confirmation with more realistic models. In this paper, chronologies of past volcanic eruptions and the evidence from data analyses and climate model calculations are reviewed.

  2. Prediction of Climatic Change for the Next 100 Years in the Apulia Region, Southern Italy

    Mladen Todorovic

    2007-12-01

    The impact of climate change on water resources and their use for agricultural production has become a critical question for sustainability. Our objective was to investigate the impact of the expected climate changes over the next 100 years on water balance variations, climatic classifications, and crop water requirements in the Apulia region (Southern Italy). The results indicated that a temperature increase in the range of 1.3 to 2.5 °C is expected over the next 100 years. Reference evapotranspiration (ETo) would follow a similar trend; averaged over the whole region, the ETo increase would be about 15.4%. Precipitation will not change significantly on a yearly basis, although a slight decrease in the summer months and a slight increase during the winter season are foreseen. The climatic water deficit (CWD) is largely driven by the ETo increase, and it would grow over the whole Apulia region by more than 200 mm on average. According to the Thornthwaite and Mather climate classification, the moisture index will decrease in the future, with humid areas shrinking and arid zones expanding. The net irrigation requirements (NIR), calculated for ten major crops in the Apulia region, would increase significantly. By the end of the 21st century, the foreseen increase of NIR with respect to the present situation is greatest for olive tree (65%), wheat (61%), grapevine (49%), and citrus (48%), and slightly lower for maize (35%), sorghum (34%), sunflower (33%), tomato (31%), and winter and spring sugar beet (both 27%).

  3. Rapid warming in mid-latitude central Asia for the past 100 years

    Fahu CHEN; Jinsong WANG; Liya JIN; Qiang ZHANG; Jing LI; Jianhui CHEN

    2009-01-01

    Surface air temperature variations during the last 100 years (1901-2003) in mid-latitude central Asia were analyzed using Empirical Orthogonal Functions (EOFs). The results suggest that temperature variations in four major sub-regions, i.e. the eastern monsoonal area, central Asia, the Mongolian Plateau and the Tarim Basin, are coherent and characterized by a striking warming trend during the last 100 years. The annual mean temperature increase at each sub-region (representative station) is 0.19℃, 0.16℃, 0.23℃ and 0.15℃ per decade, respectively. The average annual mean temperature increase over the four sub-regions is 0.18℃ per decade, with a greater rate in winter (0.21℃ per decade). In Asian mid-latitude areas, surface air temperature increased relatively slowly from the 1900s to the 1970s and has increased rapidly since the 1970s. This pattern of temperature variation differs from that in other areas of China. Notably, there was no obvious warming between the 1920s and 1940s, with temperature fluctuating between warming and cooling trends (e.g. 1920s, 1940s, 1960s, 1980s, 1990s). However, the warming trends are of greater magnitude and longer duration than the cooling periods, which leads to an overall warming. The amplitude of temperature variations in the study region is also larger than that in eastern China during different periods.

  4. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior of such systems is predicted by means of computational models whose basic mathematical models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). References: [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243 p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", Numer. Meth. Part. D. E., 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I., & Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (in press).
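
    As a rough illustration of the class of method referred to here (not the DVS scheme itself), the following sketch runs a textbook non-overlapping domain decomposition: a Dirichlet-Neumann iteration coupling two subdomains of a 1-D Poisson problem through the interface value and flux.

      import numpy as np

      # Two-subdomain Dirichlet-Neumann iteration for -u'' = f on (0,1) with
      # u(0) = u(1) = 0: a generic non-overlapping DDM, not the DVS discretization.
      n = 101
      h = 1.0 / (n - 1)
      x = np.linspace(0.0, 1.0, n)
      f = np.pi**2 * np.sin(np.pi * x)     # manufactured so the exact solution is sin(pi*x)
      m = n // 2                           # interface node (x = 0.5)
      lam, theta = 0.0, 0.5                # interface value and relaxation factor

      for it in range(1, 101):
          # Left subdomain: Dirichlet problem with u(0) = 0 and u(x_m) = lam.
          sz = m - 1
          Tl = 2.0 * np.eye(sz) - np.eye(sz, k=1) - np.eye(sz, k=-1)
          rhs = h**2 * f[1:m]
          rhs[-1] += lam
          ul = np.linalg.solve(Tl, rhs)            # u at interior nodes 1..m-1
          flux = (lam - ul[-1]) / h                # one-sided du/dx at the interface

          # Right subdomain: Neumann flux at x_m (ghost-node elimination), u(1) = 0.
          sz = n - 1 - m                           # unknowns at nodes m..n-2
          Tr = 2.0 * np.eye(sz) - np.eye(sz, k=1) - np.eye(sz, k=-1)
          Tr[0, 1] = -2.0                          # node m couples doubly to node m+1
          rhs = h**2 * f[m:n - 1]
          rhs[0] -= 2.0 * h * flux
          ur = np.linalg.solve(Tr, rhs)

          lam_new = theta * ur[0] + (1.0 - theta) * lam   # relaxed interface update
          done = abs(lam_new - lam) < 1e-12
          lam = lam_new
          if done:
              break

      print(f"{it} iterations, u(0.5) = {lam:.4f} (exact value 1.0000)")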

  5. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.
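
    By way of illustration, areal texture parameters of the kind drawn from surface metrology can be computed directly from a height map. The sketch below evaluates the common Sa parameter (mean absolute height deviation, ISO 25178) on synthetic data after smoothing at several scales, a crude stand-in for the multi-scale characterizations the authors describe.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      # Sa: mean absolute deviation of surface heights from the mean plane, here
      # evaluated after Gaussian low-pass filtering at several scales as a crude
      # multi-scale characterization. The height map is synthetic.
      rng = np.random.default_rng(1)
      z = gaussian_filter(rng.normal(size=(256, 256)), 2) + 0.1 * rng.normal(size=(256, 256))

      for sigma in (0, 2, 8, 32):                  # smoothing scales in pixels
          zs = gaussian_filter(z, sigma) if sigma else z
          sa = np.abs(zs - zs.mean()).mean()
          print(f"sigma = {sigma:>2}  Sa = {sa:.4f}")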

  6. DIII-D integrated plasma control tools applied to next generation tokamaks

    A complete software suite for integrated tokamak plasma control has been developed within the DIII-D program. The suite consists of software for real-time control of all aspects of the plasma; modeling, simulation and design tools for analysis and development of controllers; a flexible and modular architecture for implementation and testing of algorithms; and many fully validated models. Many elements of the system have been applied to and implemented on NSTX and MAST. The DIII-D real-time plasma control system, together with the integrated modeling and simulation suite, has been selected for operational use by both the KSTAR and EAST tokamaks, and is also being used at General Atomics to investigate control issues for ITER.

  7. Interferometric Techniques Applied to the Gemona (Friuli, Italy) Area as a Tool for Structural Analysis.

    Sternai, P.; Calcagni, L.; Crippa, B.

    2009-04-01

    We suggest a possible exploitation of radar interferometry for estimating many features of the brittle deformation occurring at the very surface of the Earth, such as the length of the dislocation front, the total amount of dislocation, and the dislocation rate over the time interval considered. Interferometric techniques allow obtaining highly reliable vertical velocity values of the order of 1 mm/yr, with a maximum resolution of 80 m². The values obtained always refer to the temporal interval considered, which depends on the availability of SAR images. We demonstrate that it is possible to see the evolution and behaviour of the main tectonic lineaments of the considered area even over short periods of time (a few years). We describe the results of a procedure to calculate terrain motion velocity on highly correlated pixels of an area near Gemona (Friuli, Northern Italy), and present some considerations, based on three successful examples of the analysis, on how to exploit these results in a structural-geological description of the area. The versatility of the technique, the large size of the area that can be analyzed (10,000 km²), and the high precision and reliability of the results make radar interferometry a powerful tool not only for monitoring dislocation occurring at the surface, but also for obtaining important information on the structural evolution of mountain belts that is otherwise very difficult to recognize.
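
    The basic quantitative step in such work is converting an unwrapped interferometric phase change into line-of-sight displacement, d = φλ/(4π). A minimal sketch with illustrative C-band values (not those of the Gemona study):

      import numpy as np

      # Unwrapped phase change (radians) -> line-of-sight displacement,
      # d = phi * wavelength / (4 * pi). A C-band wavelength of ~5.6 cm is
      # typical of ERS/Envisat-class sensors; all values are illustrative.
      wavelength = 0.056                        # metres
      phase = np.array([0.5, 1.2, -0.8])        # unwrapped phase change at three pixels (rad)
      d_los = phase * wavelength / (4 * np.pi)  # metres; sign convention varies by processor

      dt_years = 1.0                            # temporal baseline of the image pair
      v_mm_yr = d_los / dt_years * 1000.0       # the ~1 mm/yr level cited above
      print(v_mm_yr)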

  8. Strategy for 100-year life of the ACR-1000 concrete containment structure

    The purpose of this paper is to present the Plant Life Management (PLiM) strategy for the concrete containment structure of the ACR-1000 (Advanced CANDU Reactor) designed by AECL. The ACR-1000 is designed for a 100-year plant life, comprising a 60-year operating life and an additional 40-year decommissioning period. The approach adopted for the PLiM strategy of the concrete containment structure is a preventive one, the key areas being: 1) design methodology, 2) material performance, and 3) life cycle management and ageing management program. In the design phase, in addition to strength and serviceability, durability is a major requirement during the service life and decommissioning phase of the ACR structure. Parameters affecting durability design include: a) concrete performance, b) structural application, and c) environmental conditions. Owing to the complex nature of the environmental effects acting on structures during the service life of the project, truly improved performance during the service life is best achieved by improving the material characteristics. Many recent innovations in advanced concrete materials technology have made it possible to produce modern concretes, such as high-performance concrete, with exceptional performance characteristics. In this paper, the PLiM strategy for the ACR-1000 concrete containment is presented. In addition to addressing design methodology and material performance, a systematic approach to the ageing management program for the concrete containment structure is presented. (author)

  10. Lessons to be learned from an analysis of ammonium nitrate disasters in the last 100 years

    Pittman, William; Han, Zhe; Harding, Brian; Rosas, Camilo; Jiang, Jiaojun; Pineda, Alba; Mannan, M. Sam, E-mail: mannan@tamu.edu

    2014-09-15

    Highlights: • Root causes and contributing factors from ammonium nitrate incidents are categorized into 10 lessons. • The lessons learned from the past 100 years of ammonium nitrate incidents can be used to improve design, operation, and maintenance procedures. • Improving organizational memory to help improve safety performance. • Combating and changing organizational cultures. - Abstract: Process safety, as well as the safe storage and transportation of hazardous or reactive chemicals, has been a topic of increasing interest in the last few decades. The increased interest in improving the safety of operations has been driven largely by a series of recent catastrophes that have occurred in the United States and the rest of the world. A continuous review of past incidents and disasters to look for common causes and lessons is an essential component to any process safety and loss prevention program. While analyzing the causes of an accident cannot prevent that accident from occurring, learning from it can help to prevent future incidents. The objective of this article is to review a selection of major incidents involving ammonium nitrate in the last century to identify common causes and lessons that can be gleaned from these incidents in the hopes of preventing future disasters. Ammonium nitrate has been involved in dozens of major incidents in the last century, so a subset of major incidents were chosen for discussion for the sake of brevity. Twelve incidents are reviewed and ten lessons from these incidents are discussed.

  11. Surveillance as an innovative tool for furthering technological development as applied to the plastic packaging sector

    Freddy Abel Vargas

    2010-04-01

    The demand for efficiency and quality in production processes has made it necessary to resort to new tools for development and technological innovation. Surveillance of the environment has thus been identified as a priority, paying special attention to technology, which (by its changing nature) is a key factor in competitiveness. Surveillance is a routine activity in organisations in developed countries; however, few suitable studies have been carried out in Colombia and few instruments produced for applying it to existing sectors of the economy. The present article attempts to define a methodology for technological awareness, based on transforming the information contained in databases by means of constructing technological maps, that contributes useful knowledge to production processes. This methodology has been applied to the flexible plastic packaging sector. The main trends in this industry's technological development were identified, allowing strategies to be proposed for incorporating these advances and tendencies into national companies and research groups involved in flexible plastic packaging technological development and innovation. Technological mapping's possibilities as an important instrument for producing technological development in a given sector are then analysed, as are its possibilities for being used in other production processes.

  12. Quantitative tools for comparing animal communication systems: information theory applied to bottlenose dolphin whistle repertoires.

    McCowan; Hanser; Doyle

    1999-02-01

    Comparative analysis of nonhuman animal communication systems and their complexity, particularly in comparison to human language, has generally been hampered by both a lack of sufficiently extensive data sets and a lack of appropriate analytic tools. Information theory measures provide an important quantitative tool for examining and comparing communication systems across species. In this paper we use the original application of information theory, that of statistical examination of a communication system's structure and organization. As an example of the utility of information theory in the analysis of animal communication systems, we applied a series of information theory statistics to a statistically categorized set of bottlenose dolphin (Tursiops truncatus) whistle vocalizations. First, we use the first-order entropic relation in a Zipf-type diagram (Zipf 1949, Human Behavior and the Principle of Least Effort) to illustrate the application of temporal statistics as comparative indicators of repertoire complexity, and as possible predictive indicators of acquisition/learning in animal vocal repertoires. Second, we illustrate the need for more extensive temporal data sets when examining the higher entropic orders, indicative of higher levels of internal informational structure, of such vocalizations, which could begin to allow the statistical reconstruction of repertoire organization. Third, we propose using 'communication capacity' as a measure of the degree of temporal structure and complexity of statistical correlation, represented by the values of entropic order, as an objective tool for interspecies comparison of communication complexity. In doing so, we introduce a new comparative measure, the slope of Shannon entropies, and illustrate how it can potentially be used to compare the organizational complexity of vocal repertoires across a diversity of species. Finally, we illustrate the nature and predictive application of these higher-order entropies using a preliminary
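
    As a concrete illustration of these first-order statistics, the sketch below computes zero- and first-order Shannon entropies and a Zipf-type slope (log rank versus log frequency) from an invented whistle-type frequency table:

      import numpy as np

      # First-order repertoire statistics from a categorized signal inventory.
      # The counts are invented stand-ins for occurrences of categorized whistle types.
      counts = np.array([120, 75, 40, 22, 13, 8, 5, 3, 2, 1], dtype=float)
      p = counts / counts.sum()

      h0 = np.log2(len(p))                       # zero-order entropy (repertoire size)
      h1 = -(p * np.log2(p)).sum()               # first-order Shannon entropy (bits)

      rank = np.arange(1, len(counts) + 1)
      slope, _ = np.polyfit(np.log10(rank), np.log10(counts), 1)   # Zipf-type slope

      print(f"H0 = {h0:.2f} bits, H1 = {h1:.2f} bits, Zipf slope = {slope:.2f}")
      # A slope near -1 is the classic Zipf value reported for human language samples.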

  13. The Emergence of Gravitational Wave Science: 100 Years of Development of Mathematical Theory, Detectors, Numerical Algorithms, and Data Analysis Tools

    Holst, Michael; Tiglio, Manuel; Vallisneri, Michele

    2016-01-01

    On September 14, 2015, the newly upgraded Laser Interferometer Gravitational-wave Observatory (LIGO) recorded a loud gravitational-wave (GW) signal, emitted a billion light-years away by a coalescing binary of two stellar-mass black holes. The detection was announced in February 2016, in time for the hundredth anniversary of Einstein's prediction of GWs within the theory of general relativity (GR). The signal represents the first direct detection of GWs, the first observation of a black-hole binary, and the first test of GR in its strong-field, high-velocity, nonlinear regime. In the remainder of its first observing run, LIGO observed two more signals from black-hole binaries, one moderately loud, another at the boundary of statistical significance. The detections mark the end of a decades-long quest, and the beginning of GW astronomy: finally, we are able to probe the unseen, electromagnetically dark Universe by listening to it. In this article, we present a short historical overview of GW science: this youn...

  14. 100-Year Floodplains, flood plain, Published in 2009, 1:24000 (1in=2000ft) scale, Washington County.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Other information as of 2009. It is described as 'flood...

  15. 100 years of California’s water rights system: patterns, trends and uncertainty

    Grantham, Theodore E.; Viers, Joshua H.

    2014-08-01

    For 100 years, California’s State Water Resources Control Board and its predecessors have been responsible for allocating available water supplies to beneficial uses, but inaccurate and incomplete accounting of water rights has made the state ill-equipped to satisfy growing societal demands for water supply reliability and healthy ecosystems. Here, we present the first comprehensive evaluation of appropriative water rights to identify where, and to what extent, water has been dedicated to human uses relative to natural supplies. The results show that water right allocations total 400 billion cubic meters, approximately five times the state’s mean annual runoff. In the state’s major river basins, water rights account for up to 1000% of natural surface water supplies, with the greatest degree of appropriation observed in tributaries to the Sacramento and San Joaquin Rivers and in coastal streams in southern California. Comparisons with water supplies and estimates of actual use indicate substantial uncertainty in how water rights are exercised. In arid regions such as California, over-allocation of surface water coupled with trends of decreasing supply suggest that new water demands will be met by re-allocation from existing uses. Without improvements to the water rights system, growing human and environmental demands portend an intensification of regional water scarcity and social conflict. California’s legal framework for managing its water resources is largely compatible with needed reforms, but additional public investment is required to enhance the capacity of the state’s water management institutions to effectively track and regulate water rights.

  16. To Humbly Go: Guarding Against Perpetuating Models of Colonization in the 100-Year Starship Study

    Kramer, W. R.

    Past patterns of exploration, colonization and exploitation on Earth continue to provide the predominant paradigms that guide many space programs. Any project of crewed space exploration, especially of the magnitude envisioned by the 100-Year Starship Study, must guard against the hubris that may emerge among planners, crew, and others associated with the project, including those industries and bureaucracies that will emerge from the effort. Maintaining a non-exploitative approach may be difficult in consideration of the century of preparatory research and development and the likely multigenerational nature of the voyage itself. Starting now with mission dreamers and planners, the purpose of the voyage must be cast as one of respectful learning and humble discovery, not of conquest (either actual or metaphorical) or other inappropriate models, including military. At a minimum, the Study must actively build non-violence into the voyaging culture it is beginning to create today. References to exploitive colonization, conquest, destiny and other terms from especially American frontier mythology, while tempting in their propagandizing power, should be avoided as they limit creative thinking about alternative possible futures. Future voyagers must strive to adapt to new environments wherever possible and be assimilated by new worlds both biologically and behaviorally rather than to rely on attempts to recreate the Earth they have left. Adaptation should be strongly considered over terraforming. This paper provides an overview of previous work linking the language of colonization to space programs and challenges the extension of the myth of the American frontier to the Starship Study. It argues that such metaphors would be counter-productive at best and have the potential to doom long-term success and survival by planting seeds of social decay and self-destruction. Cautions and recommendations are suggested.

  17. Land use mapping from CBERS-2 images with open source tools by applying different classification algorithms

    Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.

    2016-02-01

    Land cover classification is often based on large differences between classes but great homogeneity within each of them. This cover information is obtained through field work or by processing satellite images. Field work involves high costs, so digital image processing techniques have become an important alternative for this task. However, in some developing countries, and particularly in Casacoima municipality in Venezuela, geographic information systems are scarce owing to a lack of updated information and the high cost of software licenses. This research proposes a low-cost methodology to develop thematic mapping of local land use and coverage types in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network, using open source tools. Supervised per-pixel and per-region classification methods were applied using different algorithms and comparing them among themselves. Per-pixel classification was based on the Maxver (maximum likelihood) and Euclidean distance (minimum distance) algorithms, while per-region classification was based on the Bhattacharya algorithm. Satisfactory results were obtained from per-region classification, with an overall reliability of 83.93% and a kappa index of 0.81. The Maxver algorithm showed a reliability of 73.36% and a kappa index of 0.69, while Euclidean distance obtained 67.17% and 0.61, respectively. The proposed methodology proved very useful for cartographic processing and updating, which in turn supports the development of management and land-use plans. Open source tools thus represent an economically viable alternative not only for forestry organizations but for the general public, allowing projects to be developed in economically depressed and/or environmentally threatened areas.
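
    Of the algorithms compared, the Euclidean (minimum-distance) classifier is the simplest to state: assign each pixel to the class whose mean spectrum is nearest. A sketch with invented 3-band statistics standing in for CBERS-2 training data:

      import numpy as np

      # Minimum-distance-to-means (Euclidean) supervised classification.
      # Class means would normally come from training polygons on the image bands;
      # here both the means and the pixels are invented 3-band spectra.
      class_means = np.array([[40.0, 60.0, 30.0],    # e.g. water
                              [90.0, 80.0, 55.0],    # e.g. bare soil
                              [50.0, 95.0, 40.0]])   # e.g. vegetation

      pixels = np.array([[42.0, 58.0, 33.0],
                         [88.0, 77.0, 60.0],
                         [55.0, 90.0, 42.0]])

      # Distance from every pixel to every class mean; assign the nearest class.
      d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
      labels = d.argmin(axis=1)
      print(labels)        # -> [0 1 2]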

  18. Climatic and Hydrological Changes of Past 100 Years in Asian Arid Zone

    Feng, Zhaodong; Salnikov, Vitaliy; Xu, Changchun

    2014-05-01

    The Asian Arid Zone (AAZ) is here defined to include the following regions: northwestern China, Mongolia, Kazakhstan, Kyrgyzstan, Tajikistan, Turkmenistan, and Uzbekistan. Generally speaking, the AAZ has experienced temperature rise during the past 100 years that was significantly faster than the global average (0.14 °C per decade). Specifically, the rate was 0.39 °C per decade in northwestern China (1950-2010), 0.26 °C per decade in Kazakhstan (1936-2005), 0.22 °C per decade in Mongolia (1940-2010), 0.29 °C per decade in Uzbekistan (1950-2005), and 0.18 °C per decade in Turkmenistan (1961-1995). It should be noted that the mountainous parts of the AAZ seem to have experienced slower warming; for example, the rate was 0.10 °C per decade in Tajikistan (1940-2005) and 0.08 °C per decade in Kyrgyzstan (1890-2005). Precipitation shows a slight increasing trend in northwestern China but has fluctuated around a near-constant level in the rest of the AAZ. Hydrological data from high-elevation basins show that runoff has been increasing, primarily because rising temperature has increased ice melt. A natural decreasing trend of surface runoff in low-elevation basins is undeniable, and this trend is attributable to intensified evaporation under warming conditions. The total amount of runoff in the Tianshan Mountains and the associated basins has indeed increased, primarily as a result of warming-driven increases in ice melt. However, approaching the turning point of glacier-melt contributions to runoff will pose a great threat to socio-economic sustainability and ecological security; the turning point refers to the transition from increasing to decreasing runoff within melt-fed watersheds under a warming climate.

  19. Underworld-GT Applied to Guangdong, a Tool to Explore the Geothermal Potential of the Crust

    Steve Quenette; Yufei Xi; John Mansour; Louis Moresi; David Abramson

    2015-01-01

    Geothermal energy potential is usually discussed in the context of conventional or engineered systems and at the scale of an individual reservoir. Whereas exploration for conventional reservoirs has been relatively easy, with expressions of the resource found close to or even at the surface, exploration for non-conventional systems relies on temperature inherently increasing with depth and on searching for favourable geological environments that maximise this increase. To utilise the information we do have, we often assimilate available exploration data with models that capture the physics of the dominant underlying processes. Here, we discuss computational modelling approaches to exploration at a regional or crust scale, with application to geothermal reservoirs within basins or systems of basins. Target reservoirs have (at least) appropriate temperature and permeability and are at accessible depths. We discuss the software development approach that leads to effective use of the tool Underworld. We explore its role in the process of modelling, understanding computational error, and importing and exporting geological knowledge as applied to the geological system underpinning Guangdong Province, China.

  20. A Concept for Testing Decision Support Tools in Participatory Processes Applied to the ToSIA Tool

    David Edwards

    2013-04-01

    ToSIA (Tool for Sustainability Impact Assessment) offers a transparent and consistent methodological framework to assess impacts of changes (technological, policy, management, etc.) in the forest-based sector. The tool can facilitate decision making within and between diverse groups of stakeholders (e.g., forest managers and policymakers), as it provides a neutral, transparent and data-driven platform for stakeholder interaction and communication. To test these capabilities of ToSIA, a practical approach for testing whether a decision support system is suitable for participatory processes was developed, based on a set of evaluation criteria for such processes. ToSIA's performance was assessed and discussed against a selection of criteria for successful participatory processes: six criteria were fulfilled by ToSIA, in nine ToSIA is potentially helpful, in two ToSIA has no influence, and for three criteria no experience exists so far. As a result, ToSIA's conceptual suitability as a participatory decision support system was confirmed for two interlinked roles: as a decision support system to assess alternative scenarios, and as a communication platform for stakeholder interaction.

  1. Reply to the Comment of Leclercq et al. on "100-year mass changes in the Swiss Alps linked to the Atlantic Multidecadal Oscillation"

    M. Huss

    2010-12-01

    In their comment, Leclercq et al. argue that Huss et al. (2010) overestimate the effect of the Atlantic Multidecadal Oscillation (AMO) on the 100-year mass balance variations in the Swiss Alps because time series of conventional balances instead of reference-surface balances were used. Applying the same model as in Huss et al., we calculate time series of reference-surface mass balance and show that the difference between conventional and reference-surface mass balance is significantly smaller than stated in the comment. Both series exhibit very similar multidecadal variations. The opposing effects of retreat and surface lowering on mass balance partly cancel each other.

  2. Statistical tools applied for the reduction of the defect rate of coffee degassing valves

    Giorgio Olmi

    2015-04-01

    Coffee is a very common beverage exported all over the world: just after roasting, coffee beans are packed in plastic or paper bags, which then experience long transfers and long storage times. Freshly roasted coffee emits large amounts of CO2 for several weeks. This gas must be gradually released to prevent package over-inflation and to preserve aroma; moreover, the beans must be protected from oxygen coming from outside. Therefore, one-way degassing valves are applied to each package: their correct functioning depends on the interference coupling between their bodies and covers and on the correct assembly of the other parts involved. This work takes its inspiration from an industrial problem: a company that assembles valve components supplied by different manufacturers observed a high defect rate affecting its valve production. An integrated approach, consisting of the adoption of quality charts, an experimental campaign for the dimensional analysis of the mating parts, and statistical processing of the data, was necessary to tackle the question. In particular, a simple statistical tool was made available to predict the defect rate and to identify the best strategy for its reduction. The outcome was that requiring a strict protocol regarding the combinations of parts from different manufacturers for assembly would have been almost ineffective. Conversely, this study led to the identification of the weak point in the manufacturing process of the mating components and to the suggestion of a slight improvement, with the final result of a significant (one order of magnitude) decrease in the defect rate.
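
    The defect-rate prediction involved can be illustrated with a standard tolerance-stack calculation: treating the mating diameters as independent normal variables, the fraction of assemblies whose interference falls outside specification follows directly. All dimensions below are invented, not the valve's actual tolerances:

      from scipy.stats import norm

      # Defect rate of an interference coupling, assuming the cover outer diameter
      # and the body bore are independent normal variables. Dimensions (mm) invented.
      mu_cover, sd_cover = 20.10, 0.015
      mu_body, sd_body = 20.00, 0.020

      mu_i = mu_cover - mu_body                      # mean interference
      sd_i = (sd_cover**2 + sd_body**2) ** 0.5       # std of the difference

      lo, hi = 0.05, 0.16                            # acceptable interference band (mm)
      p_defect = norm.cdf(lo, mu_i, sd_i) + norm.sf(hi, mu_i, sd_i)
      print(f"predicted defect rate = {p_defect:.2%}")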

  3. The Gender Analysis Tools Applied in Natural Disasters Management: A Systematic Literature Review

    Sohrabizadeh, Sanaz; Tourani, Sogand; Khankeh, Hamid Reza

    2014-01-01

    Background: Although natural disasters have caused considerable damage around the world, and gender analysis can improve community disaster preparedness or mitigation, there is little research on gendered analytical tools and methods in communities exposed to natural disasters and hazards. These tools evaluate gender vulnerability and capacity in the pre-disaster and post-disaster phases of the disaster management cycle. Objectives: Identifying the analytical gender tools and the strength...

  4. Tool for Experimenting with Concepts of Mobile Robotics as Applied to Children's Education

    Jimenez Jojoa, E. M.; Bravo, E. C.; Bacca Cortes, E. B.

    2010-01-01

    This paper describes the design and implementation of a tool for experimenting with mobile robotics concepts, primarily for use by children and teenagers, or by the general public, without previous experience in robotics. This tool helps children learn about science in an approachable and interactive way, using scientific research principles in…

  5. XVII International Botanical Congress. 100 years after the II IBC in Vienna 1905. Abstracts

    Full text: The program of XVII IBC 2005 includes all aspects of basic and applied botanical research. Progress in the different sub-disciplines is revealed through plenary talks, general lectures, symposia, and poster sessions. This conference emphasizes the newest developments in the botanical sciences worldwide. (botek)

  6. (Journal of Applied Toxicology) BMDExpress Data Viewer: A Visualization Tool to Analyze BMDExpress Datasets

    Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure in human risk assessments. BMDExpress applies BMD modeling to transcriptomics datasets and groups genes into biological processes and pathways for rapid assessment of doses at whic...

  7. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  8. Development of an intelligent system for tool wear monitoring applying neural networks

    A. Antić

    2005-12-01

    Purpose: The objective of the research presented in this paper is to investigate, under laboratory conditions, the application possibilities of the proposed system for tool wear monitoring in hard turning, using modern tools and artificial intelligence (AI) methods. Design/methodology/approach: Basic theoretical principles, computational methods of simulation and neural network training, and the conducted experiments were directed at investigating the adequacy of the proposed setting. Findings: The paper presents tool wear monitoring for hard turning for certain types of neural network configurations, for which there are preconditions for extension with dynamic neural networks. Research limitations/implications: Future research should include the integration of the proposed system into the CNC machine, instead of the current separate system, which would provide synchronisation between the system and the machine, i.e. an appropriate reaction by the machine once excessive tool wear is detected. Practical implications: Practical application of the research is possible with certain restrictions, supplemented by an adequate number of experiments directed at the particular combinations of machined materials and tools for which the neural networks are trained. Originality/value: The contribution of the research lies in one possible view of the tool monitoring system model, its design on a modular principle, and the principles of building the neural network.
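
    A minimal sketch of the kind of classifier involved, with synthetic features standing in for the measured cutting signals (the actual system's network architecture and inputs are not reproduced here):

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      # Feed-forward network mapping cutting-process features to a wear class.
      # Features (force level, vibration RMS) and labels are synthetic stand-ins
      # for quantities extracted from sensor signals recorded during hard turning.
      rng = np.random.default_rng(0)
      n = 300
      wear = rng.integers(0, 3, n)                       # 0 = fresh, 1 = worn, 2 = severe
      X = np.column_stack([
          100 + 40 * wear + rng.normal(0, 10, n),        # cutting force grows with wear
          0.2 + 0.15 * wear + rng.normal(0, 0.05, n),    # vibration RMS grows with wear
      ])
      X = (X - X.mean(axis=0)) / X.std(axis=0)           # standardize to help training

      clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
      clf.fit(X[:200], wear[:200])
      print("held-out accuracy:", clf.score(X[200:], wear[200:]))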

  9. Using the natural abundance of 13C and 15N to examine soil organic matter accumulated during 100 years of cropping

    The 13C natural abundance technique was applied to soils of a long-term experimental field in a study of organic matter turnover. The technique allowed evaluation of the soil organic matter (SOM) originating from residues of different cropping systems, which partially replaced the native prairie SOM mineralized during 100 years of cropping history. A large pool of the prairie SOM was highly resistant to decay, with a turnover time of 1000 years. Labile prairie SOM, lost when cultivation was initiated, had a half-life of 11 years. Accumulated SOM that originated from residues of a particular crop showed a similar half-life as it decayed and was replaced by new SOM from residues of a different crop. Apparent turnover times for soil organic carbon, calculated from the annual input of crop residues to the soil for different cropping systems, ranged from 2 years for corn to 6.4 years for timothy sod. The natural abundance of 15N showed significant change for soil treated with chemical fertilizer or manure relative to the control soil. Manure applied to timothy for 100 years contributed 24% of the existing soil organic nitrogen. (author). 12 refs, 2 figs, 3 tabs
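
    The 13C technique rests on a two-pool mixing relation: because native C3 prairie vegetation and a C4 crop such as corn differ in δ13C, the fraction of crop-derived carbon follows from the measured bulk soil value. The end-member and soil values below are typical literature figures and a hypothetical measurement, not those of the study:

      import math

      # Two-pool delta-13C mixing: fraction of SOM carbon derived from a C4 crop.
      # End-members are typical literature figures (C3 ~ -27, C4 ~ -12 permil).
      d13c_c3, d13c_c4 = -27.0, -12.0
      d13c_soil = -21.0                 # hypothetical measured bulk soil value
      f_c4 = (d13c_soil - d13c_c3) / (d13c_c4 - d13c_c3)
      print(f"fraction of crop-derived carbon = {f_c4:.2f}")   # 0.40 here

      # Half-life -> first-order decay constant: k = ln(2) / t_half.
      t_half = 11.0                     # years, labile prairie SOM (from the abstract)
      k = math.log(2) / t_half
      print(f"k = {k:.3f} per year, mean residence time = {1 / k:.1f} years")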

  10. 100 years of Elementary Particles [Beam Line, vol. 27, issue 1, Spring 1997

    Pais, Abraham; Weinberg, Steven; Quigg, Chris; Riordan, Michael; Panofsky, Wolfgang K. H.; Trimble, Virginia

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  13. Applying knowledge engineering tools for the personal computer to the operation and maintenance of radiopharmaceutical production systems

    A practical consequence of over three decades of Artificial Intelligence (AI) research has been the emergence of personal-computer-based AI programming tools. A special class of this microcomputer-based software, called expert system shells, is now applied routinely outside the realm of classical AI to solve many types of problems, particularly in analytical chemistry. These AI tools offer not only some of the advantages inherent in symbolic programming languages but, just as significantly, advanced program development environments that can facilitate software development and maintenance. Exploitation of this enhanced programming environment was a major motivation for using an AI tool. The goal of this work is to evaluate the use of an example-based expert system shell (1st Class FUSION, 1st Class Expert Systems, Inc.) as a programming tool for developing software useful for automated radiopharmaceutical production.

  14. Simulation of water-surface elevations for a hypothetical 100-year peak flow in Birch Creek at the Idaho National Engineering and Environmental Laboratory, Idaho

    Delineation of areas at the Idaho National Engineering and Environmental Laboratory that would be inundated by a 100-year peak flow in Birch Creek is needed by the US Department of Energy to fulfill flood-plain regulatory requirements. Birch Creek flows southward about 40 miles through an alluvium-filled valley onto the northern part of the Idaho National Engineering and Environmental Laboratory site on the eastern Snake River Plain. The lower 10-mile reach of Birch Creek, which ends in Birch Creek Playa near several Idaho National Engineering and Environmental Laboratory facilities, is of particular concern. Twenty-six channel cross sections were surveyed to develop and apply a hydraulic model to simulate water-surface elevations for a hypothetical 100-year peak flow in Birch Creek. Model simulation of the 100-year peak flow (700 cubic feet per second) in reaches upstream from State Highway 22 indicated that flow was confined within channels even when all flow was routed to one channel. Where the highway crosses Birch Creek, about 315 cubic feet per second of water was estimated to move downstream: 115 cubic feet per second through a culvert and 200 cubic feet per second over the highway. Simulated water-surface elevation at this crossing was 0.8 foot higher than the elevation of the highway. The remaining 385 cubic feet per second flowed southwestward in a trench along the north side of the highway. Flow also was simulated with the culvert removed. The exact location of flood boundaries on Birch Creek could not be determined because of the highly braided channel and the many anthropogenic features (such as the trench, highway, and diversion channels) in the study area that affect flood hydraulics and flow. Because flood boundaries could not be located exactly, only a generalized flood-prone map was developed.
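
    Hydraulic models of this kind build on open-channel flow relations such as Manning's equation. As a hedged illustration only (the study used surveyed cross sections in a full hydraulic model), the sketch below finds the normal depth passing the 700 cubic-feet-per-second peak flow through a simplified rectangular channel with invented geometry and roughness:

      # Normal-depth estimate from Manning's equation (US customary units):
      #   Q = (1.49 / n) * A * R**(2/3) * sqrt(S)
      # Rectangular geometry, roughness and slope are invented placeholders.
      Q, n, S, b = 700.0, 0.035, 0.004, 30.0    # discharge (cfs), Manning n, slope, width (ft)

      def manning_q(depth):
          area = b * depth
          rh = area / (b + 2 * depth)           # hydraulic radius = area / wetted perimeter
          return (1.49 / n) * area * rh ** (2.0 / 3.0) * S ** 0.5

      lo, hi = 0.01, 20.0                       # bracket for the depth (ft)
      for _ in range(60):                       # bisection on the monotone rating curve
          mid = 0.5 * (lo + hi)
          if manning_q(mid) < Q:
              lo = mid
          else:
              hi = mid
      print(f"normal depth ~ {mid:.2f} ft, giving Q = {manning_q(mid):.0f} cfs")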

  15. Accumulation of pharmaceuticals, Enterococcus, and resistance genes in soils irrigated with wastewater for zero to 100 years in central Mexico.

    Philipp Dalkmann

    Irrigation with wastewater releases pharmaceuticals, pathogenic bacteria, and resistance genes, but little is known about the accumulation of these contaminants in the environment when wastewater is applied for decades. We sampled a chronosequence of soils that were irrigated with wastewater for zero up to 100 years in the Mezquital Valley, Mexico, and investigated the accumulation of ciprofloxacin, enrofloxacin, sulfamethoxazole, trimethoprim, clarithromycin, carbamazepine, bezafibrate, naproxen, and diclofenac, as well as the occurrence of Enterococcus spp. and sul and qnr resistance genes. Total concentrations of ciprofloxacin, sulfamethoxazole, and carbamazepine increased with irrigation duration, reaching 95% of their upper limits of 1.4 µg/kg (ciprofloxacin), 4.3 µg/kg (sulfamethoxazole), and 5.4 µg/kg (carbamazepine) in soils irrigated for 19-28 years. Accumulation was soil-type-specific, with the largest accumulation rates in Leptosols and no time trend in Vertisols. Acidic pharmaceuticals (diclofenac, naproxen, bezafibrate) were not retained and thus did not accumulate in soils. We did not detect qnrA genes, but qnrS and qnrB genes were found in two of the irrigated soils. Relative concentrations of sul1 genes in irrigated soils were two orders of magnitude larger (3.15 × 10⁻³ ± 0.22 × 10⁻³ copies/16S rDNA) than in non-irrigated soils (4.35 × 10⁻⁵ ± 1.00 × 10⁻⁵ copies/16S rDNA), while those of sul2 exceeded the ones in non-irrigated soils by a factor of 22 (6.61 × 10⁻⁴ ± 0.59 × 10⁻⁴ versus 2.99 × 10⁻⁵ ± 0.26 × 10⁻⁵ copies/16S rDNA). Absolute numbers of sul genes continued to increase with prolonged irrigation, together with Enterococcus spp. 23S rDNA and total 16S rDNA contents. Increasing total concentrations of antibiotics in soil are not accompanied by increasing relative abundances of resistance genes. Nevertheless, wastewater irrigation enlarges the absolute concentration of resistance genes in soils due to a

  16. An assessment tool applied to manure management systems using innovative technologies

    Sørensen, Claus G.; Jacobsen, Brian H.; Sommer, Sven G.

    2003-01-01

    operational and cost-effective animal manure handling technologies. An assessment tool covering the whole chain of the manure handling system from the animal houses to the field has been developed. The tool enables a system-oriented evaluation of labour demand, machinery capacity and costs related to the...... tanker transport may reduce labour requirements, increase capacity, and open up new ways for reducing ammonia emission. In its most efficient configuration, the use of umbilical systems may reduce the labour requirement by about 40% and increase capacity by 80%. However, these systems are costly and will...

  17. The Theory of Planned Behaviour Applied to Search Engines as a Learning Tool

    Liaw, Shu-Sheng

    2004-01-01

    Search engines have been developed to help learners seek online information. Based on the theory of planned behaviour, this research investigates the behaviour of using search engines as a learning tool. After factor analysis, the results suggest that perceived satisfaction with the search engine, search engines as an information…

  18. Applying New Computer-Aided Tools for Wind Farm Planning and Environmental Impact Analysis

    Lybech Thoegersen, Morten; Nielsen, Per; Soerensen, Mads V. [Energi- og Miljoedata (EMD) Aalborg (Denmark); Toppenberg, Per [County of Northern Jutland, Aalborg (Denmark); Soee Christiansen, Erik [Municipality of Nibe, Nibe (Denmark)

    2005-07-01

    The demand for an environmental impact analysis (environmental assessment study) in any major Danish wind farm project has initiated the development of a set of computer-aided tools for wind turbine planning purposes. This paper gives an introduction to the newly developed computer-aided tools integrated in the wind farm design and planning tool WindPRO. The new module WindPLAN includes three interrelated spatial planning models: a weighted visibility calculation model, a conflict check calculation, and a wind resource weighted planning module. The application of the models is exemplified through a case study covering the municipality of Nibe, situated in northern Jutland, Denmark. The different analyses are heavily dependent on detailed GIS data showing objects such as local housing, leisure areas, preservation areas, etc. Finally, a brief presentation of other valuable computer-aided tools integrated in the WindPRO/WindPLAN module is given, such as rendering of terrain profiles, user-defined map composing, and saved pollution calculation.

  19. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  20. Changing patterns of infant death over the last 100 years: autopsy experience from a specialist children's hospital

    Pryce, J. W.; Weber, M A; Ashworth, M T; Roberts, S; Malone, M.; Sebire, N. J.

    2012-01-01

    OBJECTIVES: Infant mortality has undergone a dramatic reduction in the UK over the past century because of improvements in public health policy and medical advances. Postmortem examinations have been performed at Great Ormond Street Hospital for over 100 years, and analysis of cases across this period has been performed to assess changing patterns of infant deaths undergoing autopsy. DESIGN: Autopsy reports from 1909 and 2009 were examined. Age, major pathology and cause of death were reviewed...

  1. Adaptive Monte Carlo applied to uncertainty estimation in a five axis machine tool link errors identification

    Andolfatto, Loïc; Lavernhe, Sylvain; 10.1016/j.ijmachtools.2011.03.006

    2011-01-01

    Knowledge of a machine tool's axis-to-axis location errors allows compensation and correcting actions to be taken to enhance its volumetric accuracy. Several procedures exist, involving either lengthy individual tests for each geometric error or faster single tests to identify all errors at once. This study focuses on the closed kinematic Cartesian chain method, which uses a single setup test to identify the eight link errors of a five axis machine tool. The identification is based on volumetric error measurements for different poses with a non-contact measuring instrument called CapBall, developed in house. In order to evaluate the uncertainty on each identified error, a multi-output Monte Carlo approach is implemented. Uncertainty sources in the measurement and identification chain - such as sensor output, machine drift and frame transformation uncertainties - can be included in the model and propagated to the identified errors. The estimated uncertainties are finally compared to experimental results to assess...
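    The multi-output Monte Carlo approach described above can be illustrated with a generic linear identification problem: perturb the measurements with assumed sensor noise, re-identify the link errors in each trial, and read the uncertainty off the resulting distribution. A sketch under those assumptions (the sensitivity matrix and noise level are stand-ins, not the CapBall processing chain):

      import numpy as np

      rng = np.random.default_rng(0)
      n_meas, n_err = 40, 8                      # measurement poses, link errors
      A = rng.normal(size=(n_meas, n_err))       # stand-in sensitivity matrix
      x_true = rng.normal(scale=1e-5, size=n_err)
      y_nominal = A @ x_true                     # noise-free volumetric errors

      sigma = 1e-6                               # assumed sensor noise level
      trials = []
      for _ in range(5000):                      # Monte Carlo trials
          y = y_nominal + rng.normal(scale=sigma, size=n_meas)
          x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)   # re-identify link errors
          trials.append(x_hat)

      print("std dev of each identified error:", np.array(trials).std(axis=0))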

  2. Systems thinking tools as applied to community-based participatory research: a case study.

    BeLue, Rhonda; Carmack, Chakema; Myers, Kyle R; Weinreb-Welch, Laurie; Lengerich, Eugene J

    2012-12-01

    Community-based participatory research (CBPR) is being used increasingly to address health disparities and complex health issues. The authors propose that CBPR can benefit from a systems science framework to represent the complex and dynamic characteristics of a community and identify intervention points and potential "tipping points." Systems science refers to a field of study that posits a holistic framework that is focused on component parts of a system in the context of relationships with each other and with other systems. Systems thinking tools can assist in intervention planning by allowing all CBPR stakeholders to visualize how community factors are interrelated and by potentially identifying the most salient intervention points. To demonstrate the potential utility of systems science tools in CBPR, the authors show the use of causal loop diagrams by a community coalition engaged in CBPR activities regarding youth drinking reduction and prevention. PMID:22467637

  3. Automated Computer Systems for Manufacturability Analyses and Tooling Design : Applied to the Rotary Draw Bending Process

    Johansson, Joel

    2011-01-01

    Intensive competition on the global market puts great pressure on manufacturing companies to develop and produce products that meet requirements from customers and investors. One key factor in meeting these requirements is the efficiency of the product development and the production preparation processes. Design automation is a powerful tool to increase efficiency in these two processes. The benefits of automating the manufacturability analysis process, a part of the production preparation pr...

  4. Applying Model Driven Engineering Techniques and Tools to the Planets Game Learning Scenario

    Nodenot, Thierry; Caron, Pierre André; Le Pallec, Xavier; Laforcade, Pierre

    2008-01-01

    CPM (Cooperative Problem-Based learning Metamodel) is a visual language for the instructional design of Problem-Based Learning (PBL) situations. This language is a UML profile implemented on top of the Objecteering UML Case tool. In this article, we first present the way we used CPM language to bring about the pedagogical transposition of the planets game learning scenario. Then, we propose some related works conducted to improve CPM usability: on the one hand, we outline a MOF solution and a...

  5. Is Scores Derived from the Most Internationally Applied Patient Safety Culture Assessment Tool Correct?

    Javad Moghri; Ali Akbari Sari; Mehdi Yousefi; Hasan Zahmatkesh; Ranjbar Mohammad Ezzatabadi; Pejman Hamouzadeh; Satar Rezaei; Jamil Sadeghifar

    2013-01-01

    Abstract Background Hospital Survey on Patient Safety Culture, known as HSOPS, is an internationally well-known and widely used tool for measuring patient safety culture in hospitals. It includes 12 dimensions with positively and negatively worded questions. The distribution of these questions across dimensions is uneven and poses a risk of acquiescence bias. The aim of this study was to assess the questionnaire against this bias. Methods Three hundred nurses were assigned into study ...

  6. Surveillance as an innovative tool for furthering technological development as applied to the plastic packaging sector

    Freddy Abel Vargas; Óscar Fernando Castellanos Domínguez

    2010-01-01

    The demand for production process efficiency and quality has made it necessary to resort to new tools for development and technological innovation. Surveillance of the environment has thus been identified as being a priority, paying special attention to technology which (by its changing nature) is a key factor in competitiveness. Surveillance is a routine activity in developed countries' organisations; however, few suitable studies have been carried out in Colombia and few instruments produced...

  7. A new phase pattern recognition tool applied to field line resonances

    Plaschke, F.; Glassmeier, K.-H.; Milan, S. E.; Mann, I. R.; Motschmann, U.; Rae, I. J.

    2009-04-01

    The detection and characterization of geomagnetic pulsations (standing Alfven waves on magnetospheric field lines, as produced by the field-line resonance (FLR) process) using ground magnetic field data have been based for decades on the interpretation of the longitudinal and latitudinal distributions of pulsation amplitudes and phases. By adopting this approach only clear and single FLRs can be correctly analyzed. Magnetometer array data, however, contain much more phase information due to the coherency of the ground-observed FLR wave structures across the array of stations, which remains undisclosed if phase pattern recognition or beamforming techniques are not used. We present the theory and applications of such a new phase pattern recognition tool, the Field-Line Resonance Detector (FLRD), which is an adaptation of the wave telescope technique previously used in seismology and multi-spacecraft analysis. Unlike the traditional methods, the FLRD is able to detect and fully characterize multiple superposed or hidden FLR structures, for which the tool allows automated detection. We show results of its application in a statistical analysis of one year (2002) of ground magnetometer data from the Canadian magnetometer array CANOPUS (now known as CARISMA, www.carisma.ca) and a comparison of FLRD results with other ground-based data from optical and radar instruments. The remarkable adaptability of the tool to other datasets and phase structures is also discussed.

  8. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures, in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process, involving more than 100 employees, with one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implementing regional and corporate procedures, focusing on the main operational processes. (author)

  9. Atomic Force Microscopy as a Tool for Applied Virology and Microbiology

    Zaitsev, Boris

    2003-12-01

    The atomic force microscope (AFM) can be successfully used for the simple and fast solution of many applied biological problems. This paper surveys the results of applying the atomic force microscope SolverP47BIO (NT-MDT, Russia) at the State Research Center of Virology and Biotechnology "Vector". The AFM has been used: - in applied virology, for the counting of viral particles and examination of virus-cell interaction; - in microbiology, for measurement and identification of bacterial spores and cells; - in biotechnology, for control of biotechnological processes and evaluation of the particle size distribution in viral and bacterial diagnostic assays. The main advantages of AFM in applied research are the simplicity of sample preparation and the short examination time.

  10. Applying a Knowledge Management Modeling Tool for Manufacturing Vision (MV) Development

    Wang, Chengbo; Luxhøj, James T.; Johansen, John

    2004-01-01

    This paper introduces an empirical application of an experimental model for knowledge management within an organization, namely a case-based reasoning model for manufacturing vision development (CBRM). The model integrates the development process of manufacturing vision with the methodology of case...... that the CBRM is supportive to the decision-making process of applying and augmenting organizational knowledge. It provides a new angle to tackle strategic management issues within the manufacturing system of a business operation. Explores a new proposition within strategic manufacturing management by...... enriching and extending the concept of MV while trying to lead the CBR methodology into a new domain by applying it in strategic management....

  11. OPERATIONS MANAGEMENT TOOLS APPLIED TO THE OPERATING ROOM: A REVIEW OF CURRENT CONCEPTS AND A SINGLE CENTRE EXPERIENCE

    Lo, Charles Yuan-Hui

    2009-01-01

    Operations management tools can be applied to the operating room setting in order to improve throughput of the system. This is important because of the limitation of resources and funds available to hospitals in the public healthcare system. Hospitals must deal with variability in demand and uncertainty surrounding scheduling; these considerations can be placed in a queuing theory framework to better design processing capacity to minimize wait times and maximize utilization. Lean techniques c...

  12. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

    As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. However, observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to build a rich picture of the influences on sequences of verbal or non-verbal behavior.

  13. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step in the management of urban areas from a sound standpoint should be the evaluation of the soundscape in such an area. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step to evaluate it, providing a basis for designing or adapting it to match people's expectations as well. Accordingly, this work proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended for use as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented in developing the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified). PMID:24007752
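    As a concrete illustration of the classification step (not the authors' feature set or data), the sketch below trains an RBF-kernel SVM on synthetic acoustic feature vectors with scikit-learn, whose SVC solver is itself SMO-based, and reports the percentage of correctly classified instances:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(1)
      X = rng.normal(size=(300, 5))                    # synthetic acoustic/perceptual features
      y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # synthetic soundscape categories

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
      clf.fit(X_tr, y_tr)
      print(f"instances correctly classified: {100 * clf.score(X_te, y_te):.1f}%")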

  14. The ZEW combined microsimulation-CGE model : innovative tool for applied policy analysis

    Clauss, Markus; Schubert, Stefanie

    2009-01-01

    This contribution describes the linkage of microsimulation models and computable general equilibrium (CGE) models using two established models, "STSM" and "PACE-L", used by the Centre for European Economic Research. This state-of-the-art research method for applied policy analysis combines the advantages of both model types: on the one hand, microsimulation models allow for detailed labor supply and distributional effects of policy measures, as individual household data is us...

  15. 100 years of superconductivity

    Globe Info

    2011-01-01

    Public lecture by Philippe Lebrun, who works at CERN on applications of superconductivity and cryogenics for particle accelerators. He was head of CERN’s Accelerator Technology Department during the LHC construction period. Centre culturel Jean Monnet, route de Gex Tuesday 11 October from 8.30 p.m. to 10.00 p.m. » Suitable for all – Admission free - Lecture in French » Number of places limited For further information: +33 (0)4 50 42 29 37

  16. 100 years of radar

    Galati, Gaspare

    2016-01-01

    This book offers fascinating insights into the key technical and scientific developments in the history of radar, from the first patent, taken out by Hülsmeyer in 1904, through to the present day. Landmark events are highlighted and fascinating insights provided into the exceptional people who made possible the progress in the field, including the scientists and technologists who worked independently and under strict secrecy in various countries across the world in the 1930s and the big businessmen who played an important role after World War II. The book encourages multiple levels of reading. The author is a leading radar researcher who is ideally placed to offer a technical/scientific perspective as well as a historical one. He has taken care to structure and write the book in such a way as to appeal to both non-specialists and experts. The book is not sponsored by any company or body, either formally or informally, and is therefore entirely unbiased. The text is enriched by approximately three hundred ima...

  17. The DPSIR approach applied to marine eutrophication in LCIA as a learning tool

    Cosme, Nuno Miguel Dias; Olsen, Stig Irving

    : environmentally sustainable, technologically feasible, economically viable, socially desirable, legally permissible, and administratively achievable. Specific LCIA indicators may provide preliminary information to support a precautionary approach to act earlier on D-P and contribute to sustainability. Impacts...... eutrophication. The goal is to promote an educational example of environmental impacts assessment through science-based tools to predict the impacts, communicate knowledge and support decisions. The example builds on the (D) high demand for fixation of reactive nitrogen that supports several socio......-economic secondary drivers. The nitrogen exported to marine coastal ecosystems (P), after point and nonpoint source emissions, promotes changes in the environmental conditions (S), such as low dissolved oxygen levels, that cause the (I) effects on biota. These stimulate society into designing actions (R) to modify D...

  18. Applied Railway Optimization in Production Planning at DSB-S-tog - Tasks, Tools and Challenges

    Clausen, Jens

    2007-01-01

    customers, and has concurrently been met with demands for higher efficiency in the daily operation. The plans of timetable, rolling stock and crew must hence allow for a high level of customer service, be efficient, and be robust against disturbances of operations. It is a highly non-trivial task to meet...... scheduling. In addition we describe on-going efforts in using mathematical models in activities such as timetable design and work-force planning. We also identify some organizatorial key factors, which have paved the way for extended use of optimization methods in railway production planning....... these conflicting goals. S-tog has therefore on the strategic level decided to use software with optimization capabilities in the planning processes. We describe the current status for each activity using optimization or simulation as a tool: Timetable evaluation, rolling stock planning, and crew...

  19. Teaching Strategies to Apply in the Use of Technological Tools in Technical Education

    Olga Arranz García

    2014-09-01

    Full Text Available The emergence of new technologies in the education area is changing the way educational processes are organized. Teachers are not unaffected by these changes and must employ new strategies to adapt their teaching methods to the new circumstances. One of these adaptations is framed in virtual learning, where learning management systems have been revealed as a very effective means within the learning process. In this paper we try to show teachers in engineering schools how to use appropriately the different technological tools that are present in a virtual platform. Thus, in the experimental framework we show the outcomes of an analysis of two data samples obtained before and after the implementation of the European Higher Education Area, which can be extrapolated to the innovative application of these learning techniques.

  20. Quantitative seismic interpretation: Applying rock physics tools to reduce interpretation risk

    Yong Chen

    2007-01-01

    Seismic data analysis is one of the key technologies for characterizing reservoirs and monitoring subsurface pore fluids. While there have been great advances in 3D seismic data processing, the quantitative interpretation of the seismic data for rock properties still poses many challenges. This book demonstrates how rock physics can be applied to predict reservoir parameters, such as lithologies and pore fluids, from seismically derived attributes, as well as how the multidisciplinary combination of rock physics models with seismic data, sedimentological information, and stochastic techniques can lead to more powerful results than can be obtained from a single technique.

  1. Applied tools for determining low-activity radionuclides in large environmental samples

    Considerable amounts of biological material contaminated with artificial radionuclides were generated to obtain the efficiency curves for low-activity radionuclide analyses of large environmental samples. Likewise, improving the detection geometry is also an important task, mainly for studies involving conservation units with a high level of biodiversity preservation. This study aimed to evaluate Monte Carlo efficiency curves, without generating material contaminated with artificial radionuclides, for water and vegetation measurements. An in-house adapted Marinelli geometry was applied to reduce the amount of biological material sampled from the ecosystem, and was combined with the Monte Carlo assisted efficiency curve for a more sustainable radiometric analysis. (author)

  2. Prediction of permafrost distribution on the Qinghai-Tibet Plateau in the next 50 and 100 years

    NAN Zhuotong; LI Shuxun; CHENG Guodong

    2005-01-01

    Intergovernmental Panel on Climate Change (IPCC) in 2001 reported that the Earth's air temperature would rise by 1.4-5.8℃, and 2.5℃ on average, by the year 2100. China regional climate model results also showed that the air temperature on the Qinghai-Tibet Plateau (QTP) would increase by 2.2-2.6℃ in the next 50 years. A numerical permafrost model was used to predict the changes of permafrost distribution on the QTP over the next 50 and 100 years under two climatic warming scenarios, i.e. 0.02℃/a, the lower value of IPCC's estimation, and 0.052℃/a, the higher value predicted by Qin et al. Simulation results show that (i) in the case of a 0.02℃/a air-temperature rise, the permafrost area on the QTP will shrink about 8.8% in the next 50 years, and high temperature permafrost with mean annual ground temperature (MAGT) higher than -0.11℃ may turn into seasonally frozen soils. In the next 100 years, permafrost with MAGT higher than -0.5℃ will disappear and the permafrost area will shrink by up to 13.4%. (ii) In the case of a 0.052℃/a air-temperature rise, the permafrost area on the QTP will shrink about 13.5% after 50 years. More remarkable degradation will take place after 100 years, and the permafrost area will shrink by about 46%. Permafrost with MAGT higher than -2℃ will turn into seasonally frozen soils and even unfrozen soils.

  3. Applied Circular Dichroism: A Facile Spectroscopic Tool for Configurational Assignment and Determination of Enantiopurity

    Macduff O. Okuom

    2015-01-01

    Full Text Available In order to determine if electronic circular dichroism (ECD) is a good tool for the qualitative evaluation of absolute configuration and enantiopurity in the absence of chiral high performance liquid chromatography (HPLC), ECD studies were performed on several prescription and over-the-counter drugs. Cotton effects (CE) were observed for both S and R isomers between 200 and 300 nm. For the drugs examined in this study, the S isomers showed a negative CE, while the R isomers displayed a positive CE. The ECD spectra of both enantiomers were nearly mirror images, with the amplitude proportional to the enantiopurity. Plotting the differential extinction coefficient (Δε) versus enantiopurity at the wavelength of maximum amplitude yielded linear standard curves with coefficients of determination (R2) greater than 97% for both isomers in all cases. As expected, Equate, Advil, and Motrin, each containing a racemic mixture of ibuprofen, yielded no chiroptical signal. ECD spectra of Suphedrine and Sudafed revealed that each of them is rich in 1S,2S-pseudoephedrine, while analysis of the Equate vapor inhaler showed that it is rich in R-methamphetamine.
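    The standard-curve step lends itself to a short worked example. The sketch below fits Δε at the wavelength of maximum amplitude against enantiopurity and reports the coefficient of determination; the numbers are hypothetical, not the paper's measurements:

      import numpy as np

      enantiopurity = np.array([10.0, 25.0, 50.0, 75.0, 100.0])   # % of one enantiomer
      delta_eps = np.array([0.21, 0.52, 1.05, 1.49, 2.08])        # hypothetical CD amplitudes

      slope, intercept = np.polyfit(enantiopurity, delta_eps, 1)  # linear standard curve
      pred = slope * enantiopurity + intercept
      r2 = 1 - ((delta_eps - pred) ** 2).sum() / ((delta_eps - delta_eps.mean()) ** 2).sum()
      print(f"slope={slope:.4f} per %, R^2={r2:.3f}")             # R^2 > 0.97 mirrors the reported linearity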

  5. Environmental management systems tools applied to the nuclear fuel center of IPEN

    Mattos, Luis A. Terribile de; Meldonian, Nelson Leon; Madi Filho, Tufic, E-mail: mattos@ipen.br, E-mail: meldonia@ipen.br, E-mail: tmfilho@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    This work aims to identify and classify the major environmental aspects and impacts related to the operation of the Nuclear Fuel Center of IPEN (CCN), through a systematic survey data, using interviews questions and consulting of licensing documents and operational records. First, the facility processes and activities, and the interactions between these processes were identified. Then, an analysis of potential failures and their probable causes was conducted to establish the significance of environmental aspects, as well as the operational controls, which are necessary to ensure the prevention of impacts on the environment. The results obtained so far demonstrate the validity of this study as a tool for identification of environmental aspects and impacts of nuclear facilities in general, as a way to achieving compliance with the ISO 14001:2004 standard. Moreover, it can serve as an auxiliary method for resolving issues related to the attendance of applicable regulatory and legal requirements of National Nuclear Energy Commission (CNEN) and Brazilian Institute of Environment (IBAMA). (author)

  6. Applying CBR to machine tool product configuration design oriented to customer requirements

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2016-03-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented. The customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help the enterprises select the key technical characteristics under the constraints of cost and time to serve the customer to maximal satisfaction. A new case retrieval approach that combines the self-organizing map and fuzzy similarity priority ratio method is proposed in case-based design. The self-organizing map can reduce the retrieval range and increase the retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method of similar cases based on grey correlation analysis is proposed to evaluate similar cases to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. The actual example and result on an ETC series machine tool product show that the proposed method is effective, rapid and accurate in the process of product configuration. The proposed methodology provides a detailed instruction for the product configuration design oriented to customer requirements.
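    The retrieval idea, stripped of the SOM pre-clustering and the fuzzy similarity priority ratio, reduces to scoring stored cases against the query requirements and returning the closest one. A deliberately simplified sketch of such weighted case retrieval (feature names, weights and values are invented for illustration):

      import numpy as np

      # Each case: normalized technical characteristics of a past machine tool design.
      case_base = {
          "ETC-A": np.array([0.8, 0.3, 0.5]),   # e.g. spindle speed, travel, rigidity
          "ETC-B": np.array([0.4, 0.9, 0.6]),
          "ETC-C": np.array([0.7, 0.7, 0.8]),
      }
      weights = np.array([0.5, 0.3, 0.2])       # importance of each characteristic
      query = np.array([0.75, 0.6, 0.7])        # new customer requirements, normalized

      def similarity(case_vec: np.ndarray) -> float:
          # Weighted similarity in [0, 1]; 1 means identical to the query.
          return 1.0 - float(np.sum(weights * np.abs(case_vec - query)))

      best = max(case_base, key=lambda name: similarity(case_base[name]))
      print(best, round(similarity(case_base[best]), 3))   # -> ETC-C 0.925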

  7. Neutron tomography of particulate filters: a non-destructive investigation tool for applied and industrial research

    This research describes the development and implementation of high-fidelity neutron imaging and the associated analysis of the images. This advanced capability allows the non-destructive, non-invasive imaging of particulate filters (PFs), showing how the deposition of particulate and catalytic washcoat occurs within the filter. The majority of the efforts described here were performed at the High Flux Isotope Reactor (HFIR) CG-1D neutron imaging beamline at Oak Ridge National Laboratory; the current spatial resolution is approximately 50 μm. The sample holder is equipped with a high-precision rotation stage that allows 3D imaging (i.e., computed tomography) of the sample when combined with computerized reconstruction tools. What enables the neutron-based image is the ability of some elements to absorb or scatter neutrons while other elements allow neutrons to pass through them with negligible interaction. Of particular interest in this study is the scattering of neutrons by hydrogen-containing molecules, such as hydrocarbons (HCs) and/or water, which are adsorbed to the surface of soot, ash and catalytic washcoat. Even so, the interaction with this adsorbed water/HC is weak, and computational techniques were required to enhance the contrast, primarily a modified simultaneous iterative reconstruction technique (SIRT). This effort describes the following systems: particulate randomly distributed in a PF, ash deposition in PFs, a catalyzed washcoat layer in a PF, and three particulate loadings in a SiC PF.

  8. 3&4D Geomodeling Applied to Mineral Resources Exploration - A New Tool for Targeting Deposits.

    Royer, Jean-Jacques; Mejia, Pablo; Caumon, Guillaume; Collon-Drouaillet, Pauline

    2013-04-01

    3 & 4D geomodeling, a computer method for reconstructing the past deformation history of geological formations, has been used in oil and gas exploration for more than a decade to reconstitute fluid migration. It is nowadays beginning to be applied to exploring with new eyes old mature mining fields and new prospects. We briefly describe the basic notions, concepts, and methodology of 3&4D geomodeling as applied to mineral resources assessment and ore deposit modeling, pointing out the advantages, recommendations and limitations, together with the new challenges they raise. Several 3D GeoModels of mining explorations selected across Europe are presented as illustrative case studies achieved during the EU FP7 ProMine research project. These include: (i) the Cu-Au porphyry deposits in the Hellenic Belt (Greece); (ii) the VMS in the Iberian Pyrite Belt including the Neves Corvo deposit (Portugal); and (iii) the sediment-hosted polymetallic Cu-Ag (Au, PGE) Kupferschiefer ore deposit in the Foresudetic Belt (Poland). In each case full 3D models using surfaces and regular grids (SGrid) were built from all datasets available from exploration and exploitation, including primary geological maps, 2D seismic cross-sections, and boreholes. The level of knowledge differs from one site to another; the resulting 3D models were nevertheless used to pilot additional field and exploration work. In the case of the Kupferschiefer, a sequential restoration-decompaction (4D geomodeling) from the Upper Permian to the Cenozoic was conducted in the Lubin-Sieroszowice district of Poland. The results help in better understanding the various superimposed mineralization events which occurred through time in this copper deposit. A hydro-fracturing index was then calculated from the estimated overpressures during a Late Cretaceous-Early Paleocene uplift, and seems to correlate with the copper content distribution in the ore series. These results are in agreement with an Early Paleocene...

  9. A practical guide to applying lean tools and management principles to health care improvement projects.

    Simon, Ross W; Canacari, Elena G

    2012-01-01

    Manufacturing organizations have used Lean management principles for years to help eliminate waste, streamline processes, and cut costs. This pragmatic approach to structured problem solving can be applied to health care process improvement projects. Health care leaders can use a step-by-step approach to document processes and then identify problems and opportunities for improvement using a value stream process map. Leaders can help a team identify problems and root causes and consider additional problems associated with methods, materials, manpower, machinery, and the environment by using a cause-and-effect diagram. The team then can organize the problems identified into logical groups and prioritize the groups by impact and difficulty. Leaders must manage action items carefully to instill a sense of accountability in those tasked to complete the work. Finally, the team leaders must ensure that a plan is in place to hold the gains. PMID:22201573

  10. Microgravity: A New Tool for Basic and Applied Research in Space

    1985-01-01

    This brochure highlights selected aspects of the NASA Microgravity Science and Applications program. So that we can expand our understanding and control of physical processes, this program supports basic and applied research in electronic materials, metals, glasses and ceramics, biological materials, combustion, and fluids and chemicals. NASA facilities that provide weightless environments on the ground, in the air, and in space are available to U.S. and foreign investigators representing the academic and industrial communities. After a brief history of microgravity research, the text explains the advantages and methods of performing microgravity research. Illustrations follow of equipment used and experiments performed aboard the Shuttle and of prospects for future research. The brochure concludes by describing the program goals and the opportunities for participation.

  11. Study description of ExternE Project and the EcoSense Tool applied to Brazil

    In the present work an overview of the ExternE Project in Brazil, which has been conducted by the Brazilian National Nuclear Energy Commission (CNEN), is presented. To perform part of this evaluation study, a version of the EcoSense software developed by the Institute for Energy Economy and Rational Energy Application of the University of Stuttgart is used, applied to Brazil and other countries of South America and part of Central America. An important feature of this study is the establishment of a local and regional data bank with environmental and social parameters for Brazil and other countries, to estimate the externalities of the energy that will be introduced by the new generation units planned to be built in the coming years to increase electricity availability and support economic growth. (author)

  12. Spatio-temporal analysis of rainfall trends over a maritime state (Kerala) of India during the last 100 years

    Nair, Archana; Ajith Joseph, K.; Nair, K. S.

    2014-05-01

    Kerala, a maritime state of India, is bestowed with abundant rainfall, about three times the national average. This study was conducted to gain a better understanding of rainfall variability and trend at the regional level for this state during the last 100 years. It is found that the rainfall variation in the northern and southern regions of Kerala is large and the deviation is on different timescales. There is a shifting of rainfall mean and variability during the seasons. The trend analysis of rainfall data over the last 100 years reveals a significant (99%) decreasing trend in most of the regions of Kerala, especially in the months of January, July and November. The annual and seasonal trends of rainfall in most regions of Kerala are also found to be decreasing significantly. This decreasing trend may be related to global anomalies resulting from anthropogenic greenhouse gas (GHG) emissions due to increased fossil fuel use, land-use change driven by urbanisation and deforestation, and the proliferation of transport-associated atmospheric pollutants. We also conducted a study of the seasonality index (SI) and found that only one district in the northern region (Kasaragod) has an SI of more than 1, and that the distribution of monthly rainfall in this district is mostly attributed to 1 or 2 months. In the rest of the districts, the rainfall is markedly seasonal. The trend in SI reveals that the rainfall distribution in these districts has become asymmetric, with changes in rainfall distribution.
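    For readers unfamiliar with the seasonality index, one common definition (Walsh and Lawler, 1981) sums the absolute deviations of monthly totals from a uniform distribution; values above roughly 1 indicate that most rain falls in one or two months, matching the interpretation used above. A sketch with hypothetical monthly totals:

      import numpy as np

      def seasonality_index(monthly_mm) -> float:
          # SI = (1/R) * sum over months of |x_n - R/12|, with R the annual total.
          x = np.asarray(monthly_mm, dtype=float)
          R = x.sum()
          return float(np.abs(x - R / 12.0).sum() / R)

      # Hypothetical monthly rainfall (mm) with a strong monsoon peak:
      print(round(seasonality_index([5, 5, 10, 40, 120, 600, 650, 300, 150, 80, 20, 5]), 2))  # ~1.06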

  13. SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling

    Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)

    1994-01-01

    The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches that too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user with a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapotranspiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.

  14. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc) and the model itself, allowing them to manage data, run the model and visualise results, all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and...
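    The node/link/institution pattern is easy to picture with a toy time-stepped loop. The sketch below is purely conceptual and does not use the actual Pynsim API; class names and behaviours are invented for illustration:

      class Node:
          def __init__(self, name: str, storage: float = 0.0):
              self.name, self.storage = name, storage
          def step(self, t: int) -> None:
              self.storage += 1.0               # placeholder: e.g. inflow at timestep t

      class Institution:
          """A grouping of nodes that can act on all its members at once."""
          def __init__(self, members):
              self.members = members
          def step(self, t: int) -> None:
              for n in self.members:            # e.g. an allocation decision over members
                  n.storage *= 0.9

      nodes = [Node("reservoir"), Node("city")]
      ministry = Institution(nodes)
      for t in range(3):                        # each entity acts autonomously per step
          for n in nodes:
              n.step(t)
          ministry.step(t)
      print({n.name: round(n.storage, 2) for n in nodes})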

  15. Effects of 100 years wastewater irrigation on resistance genes, class 1 integrons and IncP-1 plasmids in Mexican soil

    Sven Jechalke

    2015-03-01

    Full Text Available Long-term irrigation with untreated wastewater can lead to an accumulation of antibiotic substances and antibiotic resistance genes in soil. However, little is known so far about the effects of wastewater, applied for decades, on the abundance of IncP-1 plasmids and class 1 integrons, which may contribute to the accumulation and spread of resistance genes in the environment, and about their correlation with heavy metal concentrations. Therefore, a chronosequence of soils that were irrigated with wastewater from zero to 100 years was sampled in the Mezquital Valley in Mexico in the dry season. The total community DNA was extracted and the absolute and relative abundance (relative to 16S rRNA genes) of antibiotic resistance genes (tet(W), tet(Q), aadA), class 1 integrons (intI1), quaternary ammonium compound resistance genes (qacE+qacEΔ1) and IncP-1 plasmids (korB) were quantified by real-time PCR. Except for intI1 and qacE+qacEΔ1, the abundances of the selected genes were below the detection limit in non-irrigated soil. Confirming the results of a previous study, the absolute abundance of 16S rRNA genes in the samples increased significantly over time (linear regression model, p < 0.05), suggesting an increase in bacterial biomass due to repeated irrigation with wastewater. Correspondingly, all tested antibiotic resistance genes as well as intI1 and korB significantly increased in abundance over the period of 100 years of irrigation. In parallel, concentrations of the heavy metals Zn, Cu, Pb, Ni, and Cr significantly increased. However, no significant positive correlations were observed between the relative abundance of selected genes and years of irrigation, indicating no enrichment in the soil bacterial community due to repeated wastewater irrigation or due to potential co-selection by increasing concentrations of heavy metals.
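    The time-trend statement above (linear regression model, p < 0.05) corresponds to a simple slope test of gene abundance against years of irrigation. A sketch with synthetic stand-in data, not the study's measurements:

      import numpy as np
      from scipy.stats import linregress

      years = np.array([0, 8, 14, 19, 28, 35, 50, 85, 100], dtype=float)
      # Synthetic log10 gene copy numbers with an assumed upward trend plus noise:
      log10_copies = 4.0 + 0.02 * years + np.random.default_rng(2).normal(0, 0.1, years.size)

      res = linregress(years, log10_copies)     # slope test for the time trend
      print(f"slope = {res.slope:.3f} log10(copies)/yr, p = {res.pvalue:.2e}")
      if res.pvalue < 0.05:
          print("abundance increases significantly with irrigation duration")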

  16. 1,100 years after an earthquake: modification of the earthquake record by submergence, Puget Lowland, Washington State

    Arcos, M. E.

    2011-12-01

    Crustal faults may present a complicated record for earthquake reconstruction. In some cases, regional tectonic strain overprints the record of coseismic land-level changes. This study looks at the record of earthquakes at two sites in the Puget Lowland, Gorst and the Skokomish delta, and at how post-earthquake submergence modified the paleoseismic records. The Puget Lowland is the slowly subsiding forearc basin of the northern Cascadia subduction zone. A series of active thrust faults cross this lowland. Several of these faults generated large (M7+) earthquakes about 1,100 years ago, and both field sites have submerged at least 1.5 m since that time. This submergence masked the geomorphic record of uplift in some areas, resulting in a misreading of the zone of earthquake deformation and potential misinterpretation of the underlying fault structure. Earthquakes ~1,100 years ago uplifted both field localities and altered river dynamics. At Gorst, a tsunami and debris flow accompanied uplift of at least 3 m by the Seattle fault. The increased sediment load resulted in braided stream formation for a period after the earthquake. At the Skokomish delta, differential uplift trapped the river on the eastern side of the delta for the last 1,100 years, resulting in an asymmetric intertidal zone, 2 km wider on one side of the delta than the other. The delta slope or submergence may contribute to high rates of flooding on the Skokomish River. Preliminary results show that millennial-scale rates of submergence vary, with the southern Puget Lowland submerging at a faster rate than the northern Puget Lowland. This submergence complicates the reconstruction of past earthquakes and renders assessment of future hazards difficult for those areas that are based on uplifted marine platforms and other coastal earthquake signatures in several ways. 1) Post-earthquake submergence reduces the apparent uplift of marine terraces. 2) Submergence makes zones of earthquake deformation appear narrower. 3...

  17. Are Geodetically and Geologically Constrained Vertical Deformation Models Compatible With the 100-Year Coastal Tide Gauge Record in California?

    Smith-Konter, B. R.; Sandwell, D. T.

    2006-12-01

    Sea level change has been continuously recorded along the California coastline at several tide gauge stations for the past 50-100 years. These stations provide a temporal record of sea level change, generally attributed to post-glacial rebound and ocean climate phenomena. However, geological processes, including displacements from large earthquakes, have also been shown to produce sea level variations. Furthermore, the vertical tectonic response to interseismic strain accumulation in regions of major fault bends has been shown to produce uplift and subsidence rates consistent with sea level trends. To investigate the long-term extent and implication of tectonic deformation on sea level change, we compare time series data from California tide gauge stations to model estimates of vertical displacements produced by earthquake cycle deformation. Using a 3-D semi-analytic viscoelastic model, we combine geologic slip rates, geodetic velocities, and historical seismic data to simulate both horizontal and vertical deformation of the San Andreas Fault System. Using this model, we generate a time series of vertical displacements spanning the 100-year sea level record and compare this to tide gauge data provided by the Permanent Service for Mean Sea Level (PSMSL). Comparison between sea level data and a variety of geologically and geodetically constrained models confirms that the two are highly compatible. Vertical displacements are largely controlled by interseismic strain accumulation; however, displacements from major earthquakes are also required to explain varying trends in the sea level data. Models based on elastic plate thicknesses of 30-50 km and viscosities of 7×10^18 to 2×10^19 Pa·s produce vertical displacements at tide-gauge locations that explain long-term trends in the sea level record to a high degree of accuracy at nearly all stations. However, unmodeled phenomena are also present in the sea level data and require further inspection.

  18. Evaluating the 100 year floodplain as an indicator of flood risk in low-lying coastal watersheds

    Sebastian, A.; Brody, S.; Bedient, P. B.

    2013-12-01

    The Gulf of Mexico is the fastest growing region in the United States. Since 1960, the number of housing units built in the low-lying coastal counties has increased by 246%. The region experiences some of the most intense rainfall events in the country, and coastal watersheds are prone to severe flooding characterized by wide floodplains and ponding. This flooding is further exacerbated as urban development encroaches on existing streams and waterways. While the 100 year floodplain should play an important role in our ability to develop disaster-resilient communities, recent research has indicated that existing floodplain delineations are a poor indicator of actual flood losses in low-lying coastal regions. Between 2001 and 2005, more than 30% of insurance claims made to FEMA in the Gulf Coast region were for properties outside the 100 year floodplain, and residential losses amounted to more than $19.3 billion. As population density and investments in this region continue to increase, addressing flood risk in coastal communities should become a priority for engineers, urban planners, and decision makers. This study compares the effectiveness of 1-D and 2-D modeling approaches in spatially capturing flood claims from historical events. Initial results indicate that 2-D models perform much better in coastal environments and may serve better for floodplain modeling, helping to prevent unintended losses.

  19. Simulated carbon and water processes of forest ecosystems in Forsmark and Oskarshamn during a 100-year period

    Gustafsson, David; Jansson, Per-Erik [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Land and Water Resources Engineering; Gaerdenaes, Annemieke [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Soil Sciences; Eckersten, Henrik [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Crop Production Ecology

    2006-12-15

    The Swedish Nuclear Fuel and Waste Management Co (SKB) is currently investigating the Forsmark and Oskarshamn areas for possible localisation of a repository for spent nuclear fuel. Important components of the investigations are characterizations of the land surface ecosystems in the areas with respect to hydrological and biological processes, and their implications for the fate of radionuclide contaminants entering the biosphere from a shallow groundwater contamination. In this study, we simulate water balance and carbon turnover processes in forest ecosystems representative for the Forsmark and Oskarshamn areas for a 100-year period using the ecosystem process model CoupModel. The CoupModel describes the fluxes of water and matter in a one-dimensional soil-vegetation-atmosphere system, forced by time series of meteorological variables. The model has previously been parameterized for many of the vegetation systems that can be found in the Forsmark and Oskarshamn areas: spruce/pine forests, willow, grassland and different agricultural crops. This report presents a platform for further use of models like CoupModel for investigations of radionuclide turnover in the Forsmark and Oskarshamn area based on SKB data, including a data set of meteorological forcing variables for Forsmark 1970-2004, suitable for simulations of a 100-year period representing the present day climate, a hydrological parameterization of the CoupModel for simulations of the forest ecosystems in the Forsmark and Oskarshamn areas, and simulated carbon budgets and process descriptions for Forsmark that correspond to a possible steady state of the soil storage of the forest ecosystem.

  1. Participatory tools working with crops, varieties and seeds. A guide for professionals applying participatory approaches in agrobiodiversity management, crop improvement and seed sector development

    Boef, de W.S.; Thijssen, M.H.

    2007-01-01

    Outline of the guide: Within our training programmes on local management of agrobiodiversity, participatory crop improvement and support of the local seed supply, participatory tools receive ample attention. Tools are dealt with theoretically, practised in class situations, and also applied in field settings.

  2. 100-Year Floodplains, Flood plains from FEMA, Published in 2003, 1:600 (1in=50ft) scale, Town of Cary NC.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:600 (1in=50ft) scale, was produced all or in part from LIDAR information as of 2003. It is described as 'Flood...

  3. 100-Year Floodplains, Published in 2008, 1:100000 (1in=8333ft) scale, City of Americus & Sumter County, GA GIS.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:100000 (1in=8333ft) scale, was produced all or in part from Road Centerline Files information as of 2008. Data by...

  4. 100-Year Floodplains, FEMA Floodway and Flood Boundary Maps, Published in 2005, 1:24000 (1in=2000ft) scale, Lafayette County Land Records.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Other information as of 2005. It is described as 'FEMA...

  5. 100-Year Floodplains, FEMA flood insurance rate map vector data, Published in 2009, 1:7200 (1in=600ft) scale, Portage County.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:7200 (1in=600ft) scale, was produced all or in part from Orthoimagery information as of 2009. It is described as...

  6. 100-Year Floodplains, Data provided by FEMA and WI DNR, Published in 2009, 1:2400 (1in=200ft) scale, Dane County Land Information Office.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:2400 (1in=200ft) scale as of 2009. It is described as 'Data provided by FEMA and WI DNR'. Data by this publisher...

  7. 100-Year Floodplains, St James FEMA Flood Map, Published in 2010, 1:24000 (1in=2000ft) scale, St James Parish Government.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Hardcopy Maps information as of 2010. It is described...

  8. 100-Year Floodplains, FloodZone; FEMA; Update Frequency is every five or ten years, Published in 2008, Athens-Clarke County Planning Department.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, was produced all or in part from Field Survey/GPS information as of 2008. It is described as 'FloodZone; FEMA; Update Frequency...

  9. Acidophilic denitrifiers dominate the N2O production in a 100-year-old tea orchard soil.

    Huang, Ying; Long, Xi-En; Chapman, Stephen J; Yao, Huaiying

    2015-03-01

    Aerobic denitrification is the main process for high N2O production in acid tea field soil. However, the biological mechanisms behind the high emission are not fully understood. In this study, we examined N2O emission and denitrifier communities in 100-year-old tea soils at four pH levels (3.71, 5.11, 6.19, and 7.41) and four levels of nitrate addition (0, 50, 200, and 1000 mg kg−1 NO3−-N). Results showed the highest N2O emission (10.1 mg kg−1 over 21 days) from the soil at pH 3.71 with 1000 mg kg−1 NO3− addition. The N2O reduction and denitrification enzyme activity were lower in the acid soils than in the soil at pH 7.41. Moreover, TRF 78 of nirS and TRF 187 of nosZ dominated in soils of pH 3.71, suggesting an important role of acidophilic denitrifiers in N2O production and reduction. Canonical correspondence analysis (CCA) also showed a negative correlation between the dominant denitrifier ecotypes (nirS TRF 78, nosZ TRF 187) and soil pH. Phylogenetic tree analysis showed the representative sequences to be identical to those of cultivated denitrifiers from acidic soils. Our results showed that the adaptation of acidophilic denitrifiers to the acid environment results in high N2O emission in this highly acidic tea soil. PMID:25273518

  10. GEodesy Tools for Societal Issues (GETSI): Undergraduate curricular modules that feature geodetic data applied to critical social topics

    Douglas, B. J.; Pratt-Sitaula, B.; Walker, B.; Miller, M. S.; Charlevoix, D.

    2014-12-01

    The GETSI project is a three-year NSF-funded project to develop and disseminate teaching and learning materials that feature geodesy data applied to critical societal issues such as climate change, water resource management, and natural hazards (http://serc.carleton.edu/getsi). GETSI was born out of requests from geoscience faculty for more resources with which to educate future citizens and future geoscience professionals on the power and breadth of geodetic methods to address societally relevant topics. Development of the first two modules started at a February 2014 workshop and initial classroom testing begins in fall 2014. The Year 1 introductory module "Changing Ice and Sea Level" includes geodetic data such as gravity, satellite altimetry, and GPS time series. The majors-level Year 1 module, "Imaging Active Tectonics", has students analyzing InSAR and LiDAR data to assess infrastructure vulnerability to demonstrably active faults. Additional resources such as animations and interactive data tools are also being developed. The full modules will take about two weeks of class time; module design will permit portions of the material to be used as individual projects or assignments of shorter duration. Ultimately a total of four modules will be created and disseminated, two each at the introductory and majors levels. GETSI is working in tight partnership with the Science Education Resource Center's (SERC) InTeGrate project on the module development, assessment, and dissemination to ensure compatibility with the growing number of resources for geoscience education. This will allow for an optimized module development process based on successful practices defined by these earlier efforts.

  11. Floodplain sediment from a 100-year-recurrence flood in 2005 of the Ping River in northern Thailand

    S. H. Wood

    2008-07-01

    The tropical storm, floodwater, and the floodplain-sediment layer of a 100-year recurrence flood are examined to better understand characteristics of large monsoon floods on medium-sized rivers in northern Thailand. Storms producing large floods in northern Thailand occur early or late in the summer rainy season (May–October). These storms are associated with tropical depressions evolving from typhoons in the South China Sea that travel westward across the Indochina Peninsula. In late September 2005, the tropical depression from Typhoon Damrey swept across northern Thailand delivering 100–200 mm/day at stations in mountainous areas. Peak flow from the 6355-km2 drainage area of the Ping River upstream of the city of Chiang Mai was 867 m3s−1 (river-gage height of 4.93 m) and flow greater than 600 m3s−1 lasted for 2.5 days. Parts of the city of Chiang Mai and some parts of the floodplain in the intermontane Chiang Mai basin were flooded up to 1 km distant from the main channel. Suspended-sediment concentrations in the floodwater were measured and estimated to be 1000–1300 mg l−1.

    The mass of dry sediment (32.4 kg m−2), measured over a 0.32-km2 area of the floodplain, is relatively high compared to reports from European and North American river floods. Average wet sediment thickness over the area was 3.3 cm. Sediment thicker than 8 cm covered 16 per cent of the area, and sediment thicker than 4 cm covered 44 per cent of the area. The high suspended-sediment concentration in the floodwater, flow onto the floodplain both through a gap in the levee at the mouth of a tributary stream and over the levees, and floodwater depths of 1.2 m explain the relatively large amount of sediment in the measured area.
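
    A back-of-the-envelope check (ours, not the paper's) makes the last point concrete: a single settling of the 1.2 m water column at the measured concentration accounts for only a small fraction of the deposit, implying repeated exchange of sediment-laden water over the floodplain.

```python
# If a 1.2 m deep water column with 1300 mg/L suspended sediment settled out
# completely just once, the deposit would be far less than the measured mass.

depth_m = 1.2                 # floodwater depth (m)
conc_mg_per_L = 1300.0        # suspended-sediment concentration (mg/L)

# 1 m^3 of water = 1000 L; convert mg to kg.
mass_single_settling = depth_m * 1000.0 * conc_mg_per_L * 1e-6  # kg/m^2
measured = 32.4                                                  # kg/m^2

print(mass_single_settling)             # ~1.56 kg/m^2 per full settling
print(measured / mass_single_settling)  # ~21 water-column equivalents needed
```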

    Grain-size analyses and examination of the flood layer showed about 15-cm thickness of massive fine-sandy silt on the levee within 15

  12. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction.Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  13. Portable hyperspectral device as a valuable tool for the detection of protective agents applied on historical buildings

    Vettori, S.; Pecchioni, E.; Camaiti, M.; Garfagnoli, F.; Benvenuti, M.; Costagliola, P.; Moretti, S.

    2012-04-01

    In the recent past, a wide range of protective products (in most cases, synthetic polymers) have been applied to the surfaces of ancient buildings/artefacts to preserve them from alteration [1]. The lack of a detailed mapping of the permanence and efficacy of these treatments, in particular when applied on large surfaces such as building facades, may be particularly noxious when new restoration treatments are needed and the best choice of restoration protocols has to be made. The presence of protective compounds on stone surfaces may be detected in the laboratory by relatively simple diagnostic tests, which, however, normally require invasive (or micro-invasive) sampling methodologies and are time-consuming, thus limiting their use to a restricted number of samples and sampling sites. On the contrary, hyperspectral sensors are rapid, non-invasive and non-destructive tools capable of analyzing different materials on the basis of their different patterns of absorption at specific wavelengths, and are thus particularly suitable for the field of cultural heritage [2,3]. In addition, they can be successfully used to discriminate between inorganic (i.e. rocks and minerals) and organic compounds, as well as to acquire, in a short time, many spectra and compositional maps at relatively low cost. In this study we analyzed a number of stone samples (Carrara Marble and the biogenic calcarenites "Lecce Stone" and "Maastricht Stone") after treatment of their surfaces with synthetic polymers (synthetic wax, acrylic, perfluorinated and silicon-based polymers) in common use in conservation-restoration practice. The hyperspectral device used for this purpose was an ASD FieldSpec FR Pro spectroradiometer, a portable, high-resolution instrument designed to acquire Visible and Near-Infrared (VNIR: 350-1000 nm) and Short-Wave Infrared (SWIR: 1000-2500 nm) punctual reflectance spectra with a rapid data collection time (about 0.1 s for each spectrum). The reflectance spectra so far obtained in

  14. A re-collection of Diplocentrum recurvum Lindl. (Orchidaceae) after a lapse of 100 years or more from Andhra Pradesh, India

    Mitta Mahendranath

    2015-08-01

    Diplocentrum recurvum Lindl. (Orchidaceae) has been re-collected from the Horsley Hills of Chittoor district in Andhra Pradesh after a lapse of 100 years or more. The present paper provides a detailed description, photographs of old herbarium specimens, and the distribution of the species.

  15. Dr Margaretha Brongersma-Sanders (1905-1996), Dutch scientist: an annotated bibliography of her work to celebrate 100 years since her birth

    Turner, S.; Cadée, G.C.

    2006-01-01

    Dr Margaretha Brongersma-Sanders, palaeontologist, pioneer geochemist, geobiologist and oceanographer, Officer of the Order of Oranje Nassau was born 100 years ago (February 20th, 1905) in Kampen in The Netherlands. The fields of research that she covered during her lifetime include taxonomy of rece

  16. NAA applied to the study of metallic ion transfer induced by orthopedic surgical tools or by metallic prostheses

    After implantation of a metallic prosthesis in a patient, damage can occur and the surrounding tissues may be modified. These effects were investigated by characterizing the element content of the soft tissue. However, the variations in element concentrations can be small, and it is therefore necessary to evaluate the instrumental contamination, measured here in muscular and capsular tissues surrounding the hips of selected corpses. From corpses that had never undergone surgical operations, samples were cut with different surgical tools. From corpses with hip prostheses, samples were cut with scalpels to determine the contamination induced by metallic prostheses. The results give the mineral composition of the surgical tools and of the muscular and capsular tissues. (author)

  17. Experiences and Results of Applying Tools for Assessing the Quality of a mHealth App Named Heartkeeper.

    Martínez-Pérez, Borja; de la Torre-Díez, Isabel; López-Coronado, Miguel

    2015-11-01

    Currently, many incomplete mobile apps can be found in the commercial stores: apps with bugs or of low quality that need to be seriously improved. The aim of this paper is to use two different tools to assess the quality of Heartkeeper, a mHealth app for the self-management of heart diseases by the patients themselves. The first tool measures compliance with the Android guidelines given by Google, and the second measures the users' Quality of Experience (QoE). The results obtained indicated that Heartkeeper in many cases follows the Android guidelines, especially in its structure, and offers a satisfactory QoE for its users, with special mention of aspects such as the learning curve, availability and appearance. As a result, Heartkeeper has proved to be a satisfactory app from the point of view of both Google and its users. The conclusion is that tools measuring the quality of an app can be very useful for developers in finding aspects that need improvement before releasing their apps. By doing this, the number of low-quality applications released will decrease dramatically, so these techniques are strongly recommended to all app developers. PMID:26345452

  18. Dr Margaretha Brongersma-Sanders (1905-1996), Dutch scientist: an annotated bibliography of her work to celebrate 100 years since her birth

    Turner, S.; Cadée, G.C.

    2006-01-01

    Dr Margaretha Brongersma-Sanders, palaeontologist, pioneer geochemist, geobiologist and oceanographer, Officer of the Order of Oranje Nassau was born 100 years ago (February 20th, 1905) in Kampen in The Netherlands. The fields of research that she covered during her lifetime include taxonomy of recent and fossil, principally freshwater fish; “fish kills” and mass mortality in the sea (especially of fish); taphonomy and preservation of fish; upwelling; anoxic conditions, linked to fish mortali...

  19. Applied acoustics concepts, absorbers, and silencers for acoustical comfort and noise control alternative solutions, innovative tools, practical examples

    Fuchs, Helmut V

    2013-01-01

    The author gives a comprehensive overview of materials and components for noise control and acoustical comfort. Sound absorbers must meet acoustical and architectural requirements, which fibrous or porous material alone can meet. Basics and applications are demonstrated, with representative examples for spatial acoustics, free-field test facilities and canal linings. Acoustic engineers and construction professionals will find some new basic concepts and tools for developments in order to improve acoustical comfort. Interference absorbers, active resonators and micro-perforated absorbers of different materials and designs complete the list of applications.

  20. Applying value engineering and modern assessment tools in managing NEPA: Improving effectiveness of the NEPA scoping and planning process

    ECCLESTON, C.H.

    1998-09-03

    While the National Environmental Policy Act (NEPA) implementing regulations focus on describing "what" must be done, they provide surprisingly little direction on "how" such requirements are to be implemented. Specific implementation of these requirements has largely been left to the discretion of individual agencies. More than a quarter of a century after NEPA's enactment, few rigorous tools, techniques, or methodologies have been developed or widely adopted for implementing the regulatory requirements. In preparing an Environmental Impact Statement, agencies are required to conduct a public scoping process to determine the range of actions, alternatives, and impacts that will be investigated. Determining the proper scope of analysis is an element essential to the successful planning and implementation of future agency actions. Lack of rigorous tools and methodologies can lead to project delays, cost escalation, and increased risk that the scoping process may not adequately capture the scope of decisions that eventually might need to be considered. Recently, selected Value Engineering (VE) techniques were successfully used in managing a pre-scoping effort. A new strategy is advanced for conducting a pre-scoping/scoping effort that combines NEPA with VE. Consisting of five distinct phases, this approach has potentially widespread implications for the way NEPA, and scoping in particular, is practiced.

  1. Finite element code in Python as a universal and modular tool applied to Kohn-Sham equations

    Cimrman, R.; Vackář, Jiří; Novák, M.; Čertík, O.; Rohan, E.; Tůma, Miroslav

    Vienna: Vienna University of Technology, 2012 - (Eberhardsteiner, J.; Böhm, H.; Rammerstorfer, F.), s. 5212-5221 ISBN 9783950353709. [European Congress on Computational Methods in Applied Sciences and Engineering (ECCOMAS 2012) /6./. Vienna (AT), 10.09.2012-14.09.2012] R&D Projects: GA ČR GA101/09/1630; GA ČR(CZ) GAP108/11/0853 Institutional support: RVO:68378271 ; RVO:67985807 Keywords : electronic structure * density-functional theory * pseudopotentials * molecules * clusters * finite-element method Subject RIV: BE - Theoretical Physics; IN - Informatics, Computer Science (UIVT-O)

  2. Network analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish water management system

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-01-01

    … patterns of growth and development. We applied Network Analysis (NA) for assessing the sustainability of a Danish municipal Water Management System (WMS). We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices of … [The WMS is] highly efficient at processing the water resource, but the rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future …
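
    For readers unfamiliar with the indices used in this style of ecological network analysis, the sketch below computes two standard ones, total system throughput (TST) and average mutual information (AMI), whose product is the ascendency often used to quantify how rigid and "linear" a flow network is. The tiny 3-node flow matrix is invented for illustration, not the Danish WMS data.

```python
import numpy as np

# T[i, j] = flow from compartment i to compartment j (invented values).
T = np.array([[0.0, 8.0, 0.0],
              [0.0, 0.0, 7.5],
              [0.5, 0.0, 0.0]])

tst = T.sum()              # total system throughput
row = T.sum(axis=1)        # total outflow per compartment
col = T.sum(axis=0)        # total inflow per compartment

# Average mutual information (bits): sum over nonzero flows of
# (Tij/TST) * log2(Tij * TST / (Ti. * T.j)).
ami = 0.0
for i in range(T.shape[0]):
    for j in range(T.shape[1]):
        if T[i, j] > 0:
            ami += (T[i, j] / tst) * np.log2(T[i, j] * tst / (row[i] * col[j]))

ascendency = tst * ami
print(tst, ami, ascendency)
```

    A network whose AMI sits close to its upper bound is efficient but has few parallel pathways, which is exactly the vulnerability pattern the abstract describes.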

  3. 100 years of Planck's quantum

    Duck, Ian M

    2000-01-01

    This invaluable book takes the reader from Planck's discovery of the quantum in 1900 to the most recent interpretations and applications of nonrelativistic quantum mechanics.The introduction of the quantum idea leads off the prehistory of quantum mechanics, featuring Planck, Einstein, Bohr, Compton, and de Broglie's immortal contributions. Their original discovery papers are featured with explanatory notes and developments in Part 1.The invention of matrix mechanics and quantum mechanics by Heisenberg, Born, Jordan, Dirac, and Schrödinger is presented next, in Part 2.Following that, in Part 3,

  4. 100 Years of Brownian motion

    Hänggi, Peter; Marchesoni, Fabio

    2005-01-01

    In the year 1905 Albert Einstein published four papers that raised him to a giant in the history of science of all times. These works encompass the photon hypothesis (for which he obtained the Nobel prize in 1921), his first two papers on (special) relativity theory and, of course, his first paper on Brownian motion, entitled "Über die von der molekularkinetischen Theorie der Wärme geforderte Bewegung von in ruhenden Flüssigkeiten suspendierten Teilchen" (submitted on May 11, 1905). Th...

  5. FEMA 100 year Flood Data

    California Department of Resources — The Q3 Flood Data product is a digital representation of certain features of FEMA's Flood Insurance Rate Map (FIRM) product, intended for use with desktop mapping...

  6. 100 Years of Reality Learning

    Zimpher, Nancy L.; Wright Ron, D.

    2006-01-01

    One may have heard of reality TV, but what about reality learning? The latter is probably a term one hasn't seen much, although it is in many ways a clearer and more concise name for a concept that in 2006 marks its 100th anniversary: cooperative education, or "co-op." Co-op, a breakthrough idea pioneered at the University of Cincinnati by Herman…

  7. Effects of applying an external magnetic field during the deep cryogenic heat treatment on the corrosion resistance and wear behavior of 1.2080 tool steel

    Highlights: ► Deep cryogenic treatment increases the carbide percentage and makes the distribution more homogeneous. ► Deep cryogenic treatment improves the wear resistance and corrosion behavior of 1.2080 tool steel. ► Applying the magnetic field weakens the carbide distribution and decreases the carbide percentage. ► Magnetized samples showed weaker corrosion and wear behavior. -- Abstract: This work concerns the effect of applying an external magnetic field during deep cryogenic heat treatment on the corrosion behavior, wear resistance and microstructure of 1.2080 (D2) tool steel. The analyses were performed via scanning electron microscopy (SEM), optical microscopy (OM), transmission electron microscopy (TEM) and X-ray diffraction (XRD) to study the microstructure, a pin-on-disk wear testing machine to study the wear behavior, and linear sweep voltammetry to study the corrosion behavior of the samples. It was shown that the deep cryogenic heat treatment eliminates retained austenite and produces a more uniform carbide distribution with a higher carbide percentage. It was also observed that the deep cryogenic heat treatment improves the wear behavior and corrosion resistance of 1.2080 tool steel. Comparing the magnetized and non-magnetized samples, the carbide percentage decreased and the carbide distribution weakened in the magnetized samples; consequently, their wear behavior and corrosion resistance were attenuated.

  8. Applying TRIZ and Fuzzy AHP Based on Lean Production to Develop an Innovative Design of a New Shape for Machine Tools

    Ho-Nien Hsieh

    2015-03-01

    Companies are facing cut-throat competition and are forced to continuously perform better than their competitors. In order to enhance their position in the competitive world, organizations are improving at a faster pace. Industrial organizations must adapt to new ideals such as innovation. Today, innovative design in the development of new products has become a core value in most companies, while innovation is recognized as the main driving force in the market. This work applies the Russian theory of inventive problem solving (TRIZ) and the fuzzy analytic hierarchy process (FAHP) to design a new shape for machine tools. TRIZ offers several concepts and tools to facilitate concept creation and problem solving, while FAHP is employed as a decision support tool that can adequately represent qualitative and subjective assessments in a multiple-criteria decision-making environment. In the machine tool industry, this is the first study to develop an innovative design under the concept of lean production. We used TRIZ to propose principles relevant to the shape design, with innovative design considerations, and FAHP to evaluate and select the best feasible alternative from independent factors in a multiple-criteria decision-making setting. The contribution of this research is a scientific method, based on the lean production concept, for designing a new product and improving the old design process.
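
    As a pointer to the mechanics behind (F)AHP, the sketch below shows the classical crisp AHP step of deriving criterion weights from a pairwise comparison matrix and checking judgment consistency; FAHP replaces the crisp judgments with fuzzy numbers plus a defuzzification step, but the weighting logic is the same. The 3x3 judgments are invented for illustration.

```python
import numpy as np

# Pairwise comparison matrix (Saaty scale): A[i, j] = how much more important
# criterion i is than criterion j. Invented judgments for three criteria.
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized priority (weight) vector

n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)           # consistency index
ri = 0.58                              # Saaty's random index for n = 3
cr = ci / ri                           # consistency ratio; < 0.1 is acceptable

print(w, cr)
```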

  9. Lead-time reduction utilizing lean tools applied to healthcare: the inpatient pharmacy at a local hospital.

    Al-Araidah, Omar; Momani, Amer; Khasawneh, Mohammad; Momani, Mohammed

    2010-01-01

    The healthcare arena, much like the manufacturing industry, benefits from many aspects of the Toyota lean principles. Lean thinking contributes to reducing or eliminating non-value-added time, money, and energy in healthcare. In this paper, we apply selected principles of lean management aimed at reducing the wasted time associated with drug dispensing at an inpatient pharmacy at a local hospital. A thorough investigation of the drug dispensing process revealed unnecessary complexities that contribute to delays in delivering medications to patients. We utilize DMAIC (Define, Measure, Analyze, Improve, Control) and 5S (Sort, Set-in-order, Shine, Standardize, Sustain) principles to identify and reduce wastes that contribute to increasing the lead-time in healthcare operations at the pharmacy under study. The results obtained from the study revealed potential savings of more than 45% in the drug dispensing cycle time. PMID:20151593

  10. Covalent perturbation as a tool for validation of identifications and PTM mapping applied to bovine alpha-crystallin

    Bunkenborg, Jakob; Falkenby, Lasse Gaarde; Harder, Lea Mørch;

    2016-01-01

    Chemical modification of all peptides in a sample leads to shifts in masses depending on the chemical properties of each peptide. The identification of a native peptide sequence and its perturbed version with a different parent mass and fragment ion masses provides valuable information. Labeling all peptides using reductive alkylation with formaldehyde is one such perturbation where the ensemble of peptides shifts mass depending on the number of reactive amine groups. Matching covalently perturbed fragmentation patterns from the same underlying peptide sequence increases confidence in the assignments and can salvage low scoring post-translationally modified peptides. Applying this strategy to bovine alpha-crystallin we identify 9 lysine acetylation sites, 4 O-GlcNAc sites and 13 phosphorylation sites.

  11. Abstracts of the International conference 'Geological and geophysical studies of the Republic of Kazakhstan's sites', devoted to 100-year jubilee of K.I. Satpaev

    The International conference 'Geological and geophysical studies of the Republic of Kazakhstan's sites' was devoted to the 100-year jubilee of K.I. Satpaev. Satpaev was a well-known Kazakh scientist-geologist and the first President of the Academy of Science of the Kazakh Soviet Socialist Republic. The conference was held on 26-29 April 1999 in the city of Kurchatov, on the territory of the former Semipalatinsk test site. It was mainly dedicated to problems of geological and geophysical examination and monitoring of objects exposed to the effects of underground nuclear explosions. The collection of abstracts comprises 21 papers.

  12. Architecture of the global land acquisition system: applying the tools of network science to identify key vulnerabilities

    Global land acquisitions, often dubbed 'land grabbing', are increasingly becoming drivers of land change. We use the tools of network science to describe the connectivity of the global acquisition system. We find that 126 countries participate in this form of global land trade. Importers are concentrated in the Global North, the emerging economies of Asia, and the Middle East, while exporters are confined to the Global South and Eastern Europe. A small handful of countries account for the majority of land acquisitions (particularly China, the UK, and the US), the cumulative distribution of which is best described by a power law. We also find that countries with many land trading partners play a disproportionately central role in providing connectivity across the network, with the shortest trading path between any two countries traversing either China, the US, or the UK over a third of the time. The land acquisition network is characterized by very few trading cliques and therefore by a low degree of preferential trading or regionalization. We also show that countries with many export partners trade land with countries with few import partners, and vice versa, meaning that less developed countries have a large array of export partnerships with developed countries, but very few import partnerships (a disassortative relationship). Finally, we find that the structure of the network is potentially prone to propagating crises (e.g., if importing countries become dependent on crops exported from their land trading partners). This network analysis approach can be used to quantitatively analyze and understand telecoupled systems as well as to anticipate and diagnose the potential effects of telecoupling. (letter)
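
    The two structural claims above, the centrality of a few hubs and the disassortative mixing, correspond to standard graph measures. The sketch below computes both with networkx on an invented importer-to-exporter edge list; a real analysis would use the land-deal database.

```python
import networkx as nx

# Invented toy network: an edge (A, B) means country A acquires land in B.
G = nx.DiGraph()
G.add_edges_from([
    ("China", "Ethiopia"), ("China", "Laos"), ("China", "Ukraine"),
    ("UK", "Ghana"), ("UK", "Ukraine"), ("US", "Brazil"),
    ("US", "Ethiopia"), ("SaudiArabia", "Ethiopia"),
    ("Ethiopia", "SouthSudan"),   # gives Ethiopia both in- and out-edges
])

# Betweenness centrality: which countries sit on many shortest trading paths?
bc = nx.betweenness_centrality(G)

# Out-degree vs in-degree assortativity across edges: negative values indicate
# the disassortative mixing described above (many-partner countries trading
# with few-partner countries).
r = nx.degree_assortativity_coefficient(G, x="out", y="in")

print(sorted(bc.items(), key=lambda kv: -kv[1])[:3])
print(r)
```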

  13. Apply Web-based Analytic Tool and Eye Tracking to Study The Consumer Preferences of DSLR Cameras

    Jih-Syongh Lin

    2013-11-01

    Consumers' preferences and purchase motivations often arise from purchasing behaviors generated by the combined evaluation of a product's form features, color, function, and price. If an enterprise can bring these criteria under control, it can grasp opportunities in the marketplace. In this study, the product form, brand, and price of five DSLR digital cameras from Nikon, Lumix, Pentax, Sony, and Olympus were investigated through image evaluation and eye tracking. A web-based two-dimensional analytical tool was used to present information on three layers. Layer A provided information on product form and brand name; Layer B added product price, for the evaluation of purchase intention (X axis) and product form attraction (Y axis). On Layer C, Nikon J1 image samples in five color series were presented for the evaluation of attraction and purchase intention. The results revealed that, among the five Japanese brands of digital cameras, the LUMIX GF3 is most preferred and serves as the major competitive product, at a price of US$630. Through the visual focus of eye tracking, the lens, the curved handle bar, the curved part and shutter button above the lens, and the flexible flash of the LUMIX GF3 are the parts that attract consumers' eyes. From the verbal descriptions, it was found that consumers emphasize the functions of 3D lens support, continuous focusing while shooting video, iA intelligent scene mode, and full manual control support. In the color preference for the Nikon J1, red and white were most preferred while pink was least favored. These findings can serve as references for designers and marketing personnel in new product design and development.

  14. Covalent perturbation as a tool for validation of identifications and PTM mapping applied to bovine alpha-crystallin.

    Bunkenborg, Jakob; Falkenby, Lasse Gaarde; Harder, Lea Mørch; Molina, Henrik

    2016-02-01

    Proteomic identifications hinge on the measurement of both parent and fragment masses and matching these to amino acid sequences via database search engines. The correctness of the identifications is assessed by statistical means. Here we present an experimental approach to test identifications. Chemical modification of all peptides in a sample leads to shifts in masses depending on the chemical properties of each peptide. The identification of a native peptide sequence and its perturbed version with a different parent mass and fragment ion masses provides valuable information. Labeling all peptides using reductive alkylation with formaldehyde is one such perturbation where the ensemble of peptides shifts mass depending on the number of reactive amine groups. Matching covalently perturbed fragmentation patterns from the same underlying peptide sequence increases confidence in the assignments and can salvage low scoring post-translationally modified peptides. Applying this strategy to bovine alpha-crystallin, we identify 9 lysine acetylation sites, 4 O-GlcNAc sites and 13 phosphorylation sites. PMID:26644245
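
    The parent-mass arithmetic behind the labeling step is simple enough to sketch: light-formaldehyde reductive dimethylation adds approximately 28.0313 Da per reactive amine (the peptide N-terminus plus each lysine side chain), so the expected shift of the perturbed ion can be predicted from the sequence alone. The peptide strings below are arbitrary examples, not peptides from the paper.

```python
# Monoisotopic mass added per dimethylated amine with light formaldehyde (Da).
DIMETHYL_SHIFT = 28.0313

def predicted_shift(peptide_seq: str) -> float:
    """Predicted parent-mass shift after reductive dimethylation.

    Assumes one reactive N-terminal amine plus one per lysine (K); blocked
    N-termini and side reactions are ignored in this simplified sketch.
    """
    n_sites = 1 + peptide_seq.count("K")
    return n_sites * DIMETHYL_SHIFT

# A peptide with no lysine has one labeling site; a tryptic peptide ending
# in K has two, so its native and perturbed parent masses differ by ~56 Da.
print(predicted_shift("MDIAIHHPWIR"))   # 1 site  -> ~28.03 Da
print(predicted_shift("VQDDFVEIHGK"))   # 2 sites -> ~56.06 Da
```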

  15. Diagnosing Integrity of Transformer Windings by Applying Statistical Tools to Frequency Response Analysis Data Obtained at Site

    M. Prameela

    2014-03-01

    This study presents the results of Sweep Frequency Response Analysis (SFRA) measurements carried out on a number of power transformers at various sites, involving problems such as shorted winding turns, core faults and related issues, On-Load Tap Changer (OLTC) open contacts, and winding displacement. The numerical parameters, viz. Min-Max ratio (MM), Mean Square Error (MSE), Maximum Absolute difference (MABS), Absolute Sum of Logarithmic Error (ASLE), Standard Deviation (S.D.) and Correlation Coefficient (CC), computed in three different frequency bands, are presented to aid the interpretation of SFRA data. Comparisons of frequency responses among different phases of the same transformer and with sister units were carried out to interpret the data. The study presents limits for the various numerical parameters to diagnose the condition of the transformer and discriminate the faulty winding, after accounting for manufacturing, design and winding asymmetry. The results presented in the study will help in interpreting SFRA data by applying numerical techniques and in assessing the condition of the transformer.
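
    The comparison indices named above are simple functions of two magnitude responses sampled at the same frequencies. The sketch below implements one plausible set of definitions; exact formulas vary between SFRA references and should be checked against the paper before reuse.

```python
import numpy as np

def sfra_indices(x, y):
    """Compare two positive magnitude responses sampled at the same points."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    mm   = np.minimum(x, y).sum() / np.maximum(x, y).sum()    # min-max ratio
    mse  = ((x - y) ** 2).sum() / n                           # mean square error
    mabs = np.abs(x - y).max()                                # max absolute difference
    asle = np.abs(20*np.log10(y) - 20*np.log10(x)).sum() / n  # abs sum of log error
    sd   = np.sqrt(((x - y) ** 2).sum() / (n - 1))            # std. dev. of difference
    cc   = np.corrcoef(x, y)[0, 1]                            # correlation coefficient
    return dict(MM=mm, MSE=mse, MABS=mabs, ASLE=asle, SD=sd, CC=cc)

# Usage: compare a test trace against a fingerprint over one frequency band.
f = np.logspace(4, 5, 200)                    # 10 kHz - 100 kHz
ref = 1.0 / (1.0 + (f / 3e4) ** 2)            # synthetic reference response
test = ref * (1.0 + 0.02 * np.sin(f / 1e4))   # slightly deformed response
print(sfra_indices(ref, test))
```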

  16. VERONA V6.22 – An enhanced reactor analysis tool applied for continuous core parameter monitoring at Paks NPP

    Végh, J., E-mail: janos.vegh@ec.europa.eu [Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Pós, I., E-mail: pos@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Horváth, Cs., E-mail: csaba.horvath@energia.mta.hu [Centre for Energy Research, Hungarian Academy of Sciences, H-1525 Budapest 114, P.O. Box 49 (Hungary); Kálya, Z., E-mail: kalyaz@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Parkó, T., E-mail: parkot@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Ignits, M., E-mail: ignits@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary)

    2015-10-15

    Between 2003 and 2007 the Hungarian Paks NPP performed a large modernization project to upgrade its VERONA core monitoring system. The modernization work resulted in a state-of-the-art system that was able to support the reactor thermal power increase to 108% by more accurate and more frequent core analysis. Details of the new system are given in Végh et al. (2008), the most important improvements were as follows: complete replacement of the hardware and the local area network; application of a new operating system and porting a large fraction of the original application software to the new environment; implementation of a new human-system interface; and last but not least, introduction of new reactor physics calculations. Basic novelty of the modernized core analysis was the introduction of an on-line core-follow module based on the standard Paks NPP core design code HELIOS/C-PORCA. New calculations also provided much finer spatial resolution, both in terms of axial node numbers and within the fuel assemblies. The new system was able to calculate the fuel applied during the first phase of power increase accurately, but it was not tailored to determine the effects of burnable absorbers as gadolinium. However, in the second phase of the power increase process the application of fuel assemblies containing three fuel rods with gadolinium content was intended (in order to optimize fuel economy), therefore off-line and on-line VERONA reactor physics models had to be further modified, to be able to handle the new fuel according to the accuracy requirements. In the present paper first a brief overview of the system version (V6.0) commissioned after the first modernization step is outlined; then details of the modified off-line and on-line reactor physics calculations are described. Validation results for new modules are treated extensively, in order to illustrate the extent and complexity of the V&V procedure associated with the development and licensing of the new

  17. VERONA V6.22 – An enhanced reactor analysis tool applied for continuous core parameter monitoring at Paks NPP

    Between 2003 and 2007 the Hungarian Paks NPP performed a large modernization project to upgrade its VERONA core monitoring system. The modernization work resulted in a state-of-the-art system that was able to support the reactor thermal power increase to 108% by more accurate and more frequent core analysis. Details of the new system are given in Végh et al. (2008), the most important improvements were as follows: complete replacement of the hardware and the local area network; application of a new operating system and porting a large fraction of the original application software to the new environment; implementation of a new human-system interface; and last but not least, introduction of new reactor physics calculations. Basic novelty of the modernized core analysis was the introduction of an on-line core-follow module based on the standard Paks NPP core design code HELIOS/C-PORCA. New calculations also provided much finer spatial resolution, both in terms of axial node numbers and within the fuel assemblies. The new system was able to calculate the fuel applied during the first phase of power increase accurately, but it was not tailored to determine the effects of burnable absorbers as gadolinium. However, in the second phase of the power increase process the application of fuel assemblies containing three fuel rods with gadolinium content was intended (in order to optimize fuel economy), therefore off-line and on-line VERONA reactor physics models had to be further modified, to be able to handle the new fuel according to the accuracy requirements. In the present paper first a brief overview of the system version (V6.0) commissioned after the first modernization step is outlined; then details of the modified off-line and on-line reactor physics calculations are described. Validation results for new modules are treated extensively, in order to illustrate the extent and complexity of the V&V procedure associated with the development and licensing of the new

  18. Undergraduate teaching modules featuring geodesy data applied to critical social topics (GETSI: GEodetic Tools for Societal Issues)

    Pratt-Sitaula, B. A.; Walker, B.; Douglas, B. J.; Charlevoix, D. J.; Miller, M. M.

    2015-12-01

    The GETSI project, funded by NSF TUES, is developing and disseminating teaching and learning materials that feature geodesy data applied to critical societal issues such as climate change, water resource management, and natural hazards (serc.carleton.edu/getsi). It is a collaboration between UNAVCO (NSF's geodetic facility), Mt San Antonio College, and Indiana University. GETSI was initiated after requests by geoscience faculty for geodetic teaching resources for introductory and majors-level students. Full modules take two weeks, but module subsets can also be used. Modules are developed and tested by two co-authors and also tested in a third classroom. GETSI is working in partnership with the Science Education Resource Center's (SERC) InTeGrate project on the development, assessment, and dissemination to ensure compatibility with the growing number of resources for geoscience education. Two GETSI modules are being published in October 2015. "Ice mass and sea level changes" includes geodetic data from GRACE, satellite altimetry, and GPS time series. "Imaging Active Tectonics" has students analyzing InSAR and LiDAR data to assess infrastructure earthquake vulnerability. Another three modules are in testing during fall 2015 and will be published in 2016. "Surface process hazards" investigates mass-wasting hazard and risk using LiDAR data. "Water resources and geodesy" uses GRACE, vertical GPS, and reflection GPS data to have students investigate droughts in California and the High Plains. "GPS, strain, and earthquakes" helps students learn about infinitesimal and coseismic strain through analysis of horizontal GPS data and includes an extension module on the 2014 Napa earthquake. In addition to teaching resources, the GETSI project is compiling recommendations on successful development of geodesy curricula. The chief recommendations so far are the critical importance of including scientific experts in the authorship team and investing significant resources in

  19. The development of vat dyes in 100 years (to be continued)

    陈荣圻

    2015-01-01

    The first vat dye (vat dye RSN) was synthesized and produced by BASF in 1901, which was more than 100 years. Considering indigo blue was synthesized by BASF in 1987, it has a history of far more than 100 years. Vat dyes are expensive because of its complex chemical structure and long synthesis process with large amount of "three wastes" that are difficult to deal with. However, vat dyes are colourful, high color density and impossible to be replaced by any other dyes. Besides printing and dyeing, vat dyes can obtain high-grade organic pigments after pigmentation. Some of those high-grade organic pigments can extend to optical physics, liquid crystal in the electrochemistry optical material fields and other high-tech fields, which are indispensable function materials and completely changed.%1901年,BASF合成并生产了第1只还原染料(还原染料RSN),距今超过一百年,如果说1987年合成靛蓝早已诞生于BASF,那离百年就更远了.还原染料化学结构复杂,合成过程冗长、三废量大、难于处理,所以价格昂贵.但因其色彩鲜艳、密度高,非其他棉用染料所能取代.还原染料除了印染,经颜料化后,某些染料可制得高档有机颜料,有些品种还能拓展到光物理、电化学领域的液晶、光导材料等高科技领域,是不可缺失的功能材料,旧貌换新颜.

  20. Geographical information systems as a tool in limnological studies: an applied case study in a shallow lake of a plain area, Buenos Aires province, Argentina

    The understanding of the hydrological functioning and the interaction among the different water bodies in an area is essential when a sustainable use of the water resources is considered. The aim of the present paper is to assess hydrological-limnological methods and GIS as an integrated methodology applied to the study of shallow lakes and the hydrological behavior of shallow wetlands in plain areas. La Salada is an areic permanent shallow lake with an area of 5.78 km2 located near La Dulce town (SE of Buenos Aires Province, Argentina). In this paper we applied methods and tools of Geographical Information Systems in order to assess both the evolution and the state of the wetland. Topographic profiles showing the relationship between the lake and the other aquatic systems, as well as a multi-temporal assessment of the morphometric parameters, were produced using a Digital Terrain Model of the area. A sample grid was designed to obtain bathymetric, hydrogeochemical and isotopic data. The chemical water composition is homogeneous in area and depth. Changes in the conductivity values with depth, the isotopic contents and the Gibbs diagram showed that evaporation is the main process controlling the water chemistry. Physical-chemical parameters established the water quality and uses of the lake.

  1. Bottom-Up modeling, a tool for decision support for long-term policy on energy and environment - The TIMES model applied to the energy intensive industries

    Among the energy users in France and Europe, some industrial sectors are very important and should play a key role when assessing final energy demand patterns in the future. The aim of our work is to apply a prospective model for the long-range analysis of energy/technology choices in the industrial sector, focusing on the energy-intensive sectors. The modelling tool applied in this study is the TIMES model, a member of the well-known MARKAL model family. It is an economic linear-programming model generator for local, national or multi-regional energy systems, which provides a technology-rich basis for estimating energy dynamics over a long-term, multi-period horizon. We illustrate our work with nine energy-intensive industrial sectors: paper, steel, glass, cement, lime, tiles, brick, ceramics and plaster. The model includes a detailed description of the processes involved in the production of industrial products, providing typical energy uses in each process step. In our analysis, we identified for each industry several commercially available state-of-the-art technologies, characterized and chosen by the model on the basis of cost effectiveness. Furthermore, we calculated potential energy savings and reductions in carbon dioxide emissions, and we estimated the energy impact of a technological rupture. This work indicates that a significant potential for energy savings and carbon dioxide emission reductions still exists in all industries. (author)
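
    The core of a MARKAL/TIMES-type model is a cost-minimizing linear program over technology activities subject to demand and environmental constraints. The toy below reduces this to one product, two process routes, and one period; all coefficients are invented for illustration.

```python
from scipy.optimize import linprog

# Decision variables: x = [output via conventional route, output via improved
# route], in Mt. Costs and emission factors are invented.
cost = [60.0, 75.0]          # production cost per Mt (arbitrary units)

# CO2 cap: 1.8 tCO2/t conventional, 1.1 tCO2/t improved, total cap 140.
A_ub = [[1.8, 1.1]]
b_ub = [140.0]

# Demand: total output >= 100 Mt, written as -(x1 + x2) <= -100.
A_ub.append([-1.0, -1.0])
b_ub.append(-100.0)

res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)   # optimal technology mix and total cost
```

    The optimizer pushes output onto the cheaper conventional route until the CO2 cap binds, then fills the remaining demand with the cleaner route, which is the technology-switching behavior such models exhibit at larger scale.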

  2. Success at 100 is easier said than done--comments on Araújo et al: successful aging at 100 years.

    Martin, Peter; Poon, Leonard W

    2016-02-01

    Few would argue that achieving the age of 100 years is extraordinary, but what about the quality of life at this extreme age? Is it worth it to live to 100 and beyond? The study by Araújo, Ribero, Teixeira, and Paúl (2015) answered this question in three ways, substantiating and complementing recent findings about successful aging in extreme old age (Poon and Perls, 2007; Martin et al., 2015). First, the study joined other investigators in asking whether the criteria for successful aging posed by Rowe and Kahn (1997) are applicable to older adults at the end stage of a very long life. Second, the study shed light on whether objective or subjective criteria are more appropriate for gauging levels of successful aging among the oldest old (e.g. Pruchno et al., 2010; Cho et al., 2012). Finally, the study provided additional data on the psychological, social, and economic resources that enhance the needed ingredients of successful aging at the century mark. PMID:26781990

  3. Quantification of uncertainties in the 100-year flow at an ungaged site near a gaged station and its application in Georgia

    Cho, Huidae; Bones, Emma

    2016-08-01

    The Federal Emergency Management Agency has introduced the concept of the "1-percent plus" flow to incorporate various uncertainties in estimation of the 100-year or 1-percent flow. However, to the best of the authors' knowledge, no clear directions for calculating the 1-percent plus flow have been defined in the literature. Although information about standard errors of estimation and prediction is provided along with the regression equations that are often used to estimate the 1-percent flow at ungaged sites, uncertainty estimation becomes more complicated when there is a nearby gaged station because regression flows and the peak flow estimate from a gage analysis should be weighted to compute the weighted estimate of the 1-percent flow. In this study, an equation for calculating the 1-percent plus flow at an ungaged site near a gaged station is analytically derived. Also, a detailed process is introduced for calculating the 1-percent plus flow for an ungaged site near a gaged station in Georgia as an example and a case study is performed. This study provides engineers and practitioners with a method that helps them better assess flood risks and develop mitigation plans accordingly.
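
    Although the paper derives its own expression, the usual starting point is a variance-weighted combination of the gage-based and regression-based estimates in log space, with a "plus" value read one standard error above the weighted estimate. The sketch below is our paraphrase of that standard practice, with invented numbers; it is not the equation derived in the paper.

```python
import math

def weighted_q100(q_gage, var_gage, q_reg, var_reg):
    """Combine two 1-percent flow estimates (cfs) with log10-space variances."""
    lg, lr = math.log10(q_gage), math.log10(q_reg)
    # Inverse-variance weighting: each estimate is weighted by the other's variance.
    lw = (lg * var_reg + lr * var_gage) / (var_gage + var_reg)
    var_w = (var_gage * var_reg) / (var_gage + var_reg)
    q_w = 10 ** lw
    q_plus = 10 ** (lw + math.sqrt(var_w))   # one standard error above
    return q_w, q_plus

# Invented numbers: gage analysis 12,000 cfs (variance 0.010 log10-units^2),
# regression estimate 9,500 cfs (variance 0.025 log10-units^2).
print(weighted_q100(12000.0, 0.010, 9500.0, 0.025))
```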

  4. Fractionation, transfer, and ecological risks of heavy metals in riparian and ditch wetlands across a 100-year chronosequence of reclamation in an estuary of China.

    Xiao, Rong; Bai, Junhong; Lu, Qiongqiong; Zhao, Qingqing; Gao, Zhaoqin; Wen, Xiaojun; Liu, Xinhui

    2015-06-01

    The effect of reclamation on heavy metal concentrations and the ecological risks in ditch wetlands (DWs) and riparian wetlands (RWs) across a 100-year chronosequence in the Pearl River Estuary of China was investigated. Concentrations of 4 heavy metals (Cd, Cu, Pb, and Zn) in soil and plant samples, and sequential extracts of soil samples were determined, using inductively coupled plasma atomic absorption spectrometry. Results showed that heavy metal concentrations were higher in older DW soils than in the younger ones, and that the younger RW soils contained higher heavy metal concentrations compared to the older ones. Although the increasing tendency of heavy metal concentrations in soil was obvious after wetland reclamation, the metals Cu, Pb, and Zn exhibited low or no risks to the environment based on the risk assessment code (RAC). Cd, on the other hand, posed a medium or high risk. Cd, Pb, and Zn were mainly bound to Fe-Mn oxide, whereas most of Cu remained in the residual phase in both ditch and riparian wetland soils, and the residual proportions generally increased with depth. Bioconcentration and translocation factors for most of these four heavy metals significantly decreased in the DWs with older age (p < 0.05), whereas they increased in the RWs with younger age (p < 0.05). The DW soils contained higher concentrations of heavy metals in the organic fractions, whereas there were more carbonate and residual fractions in the RW soils. The non-bioavailable fractions of Cu and Zn, and the organic-bound Cd and Pb significantly inhibited plant growth. PMID:25723958
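
    For reference, the indices used above have simple standard definitions: the bioconcentration factor is the plant-to-soil concentration ratio, the translocation factor is the shoot-to-root ratio, and the RAC assigns risk classes from the share of a metal in the mobile (exchangeable plus carbonate-bound) fractions. The sketch below encodes the commonly used class boundaries; the example numbers are invented.

```python
def bcf(conc_plant, conc_soil):
    """Bioconcentration factor: plant concentration over soil concentration."""
    return conc_plant / conc_soil

def tf(conc_shoot, conc_root):
    """Translocation factor: aboveground over root concentration."""
    return conc_shoot / conc_root

def rac_class(mobile_pct):
    """RAC category from % of total metal in the mobile (F1 + F2) fractions."""
    if mobile_pct < 1:   return "no risk"
    if mobile_pct <= 10: return "low risk"
    if mobile_pct <= 30: return "medium risk"
    if mobile_pct <= 50: return "high risk"
    return "very high risk"

# Example: Cd with 35% of its total in mobile fractions classifies as "high
# risk", consistent with the medium-to-high Cd risk reported above.
print(rac_class(35.0), bcf(0.8, 1.6), tf(0.3, 0.6))
```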

  5. Organochlorine pesticides (OCPs) in wetland soils under different land uses along a 100-year chronosequence of reclamation in a Chinese estuary

    Bai, Junhong; Lu, Qiongqiong; Zhao, Qingqing; Wang, Junjing; Gao, Zhaoqin; Zhang, Guangliang

    2015-12-01

    Soil profiles were collected at a depth of 30 cm in ditch wetlands (DWs), riverine wetlands (RiWs) and reclaimed wetlands (ReWs) along a 100-year chronosequence of reclamation in the Pearl River Delta. In total, 16 OCPs were measured to investigate the effects of wetland reclamation and reclamation history on OCP levels. Our results showed that average ∑DDTs, HCB, MXC, and ∑OCPs were higher in surface soils of DWs compared to RiWs and ReWs. Both D30 and D20 soils contained the highest ∑OCP levels, followed by D40 and D100 soils; lower ∑OCP levels occurred in D10 soils. Higher ∑OCP levels were observed in the younger RiWs than in the older ones, and surface soils exhibited higher ∑OCP concentrations in the older ReWs compared with younger ReWs. The predominant percentages of γ-HCH in ∑HCHs (>42%) and aldrin in ∑DRINs (>46%) in most samples reflected the recent use of lindane and aldrin. The presence of dominant DDT isomers (p,p’-DDE and p,p’-DDD) indicated the historical input of DDT and significant aerobic degradation of the compound. Generally, DW soils had a higher ecotoxicological risk of OCPs than RiW and ReW soils, and the top 30 cm soils had higher ecotoxicological risks of HCHs than of DDTs.
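
    The source attributions above rest on standard diagnostic ratios that are easy to compute once congener concentrations are in hand. The thresholds below follow common usage (technical HCH has an alpha/gamma ratio of roughly 4-7 while lindane is nearly pure gamma-HCH, and degradates exceeding half of total DDTs indicate aged input); the concentrations are invented.

```python
def hch_source(alpha_hch, gamma_hch):
    """Low alpha/gamma ratio suggests recent lindane rather than technical HCH."""
    ratio = alpha_hch / gamma_hch
    return ratio, ("lindane input likely" if ratio < 1 else "technical HCH likely")

def ddt_age(ddt, dde, ddd):
    """Degradates (DDE + DDD) dominating total DDTs suggests historical input."""
    share = (dde + ddd) / (ddt + dde + ddd)
    return share, ("historical input" if share > 0.5 else "recent input")

print(hch_source(alpha_hch=0.4, gamma_hch=1.1))
print(ddt_age(ddt=0.6, dde=1.8, ddd=1.1))
```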

  6. A geochemical record of environmental changes in sediments from Sishili Bay, northern Yellow Sea, China: Anthropogenic influence on organic matter sources and composition over the last 100 years

    Highlights: • Increased TOC and TN in the sediment cores indicated a eutrophic trend since 1975. • Marine organic matter sources dominated in Sishili Bay. • Scallop culture mitigated eutrophication pressures in Sishili Bay. • Increased fertilizer use matched the onset of eutrophication in Sishili Bay around 1975. -- Abstract: Total organic carbon (TOC), total nitrogen (TN), δ13C and δ15N were measured in sediment cores at three sites in Sishili Bay, China, to track the impacts of anthropogenic activities on the coastal environment over the last 100 years. The increased TOC and TN in the upper sections of the sediment cores indicated a eutrophic process starting around 1975. In comparison, the TOC and TN in the sediment core near a scallop aquaculture area displayed a much slower increase, indicating the contribution of scallop aquaculture to mitigating eutrophication. Combined information from δ13C, δ15N and TOC:TN indicated an increased terrestrial signal, although the organic matter sources in Sishili Bay were a mixture of terrestrial and marine sources, with phytoplankton dominant. Increased fertilizer use since the 1970s contributed to the eutrophication of Sishili Bay after 1975, and increased sewage discharge since the 1990s has added to this process.
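
    The terrestrial-versus-marine source apportionment implied by the δ13C discussion is commonly done with a two-end-member mixing model. The sketch below uses typical literature end-members (about -27 per mil for terrestrial C3 material and -20 per mil for marine phytoplankton); these are assumptions here, not values from this study.

```python
def terrestrial_fraction(d13c_sample, d13c_marine=-20.0, d13c_terr=-27.0):
    """Fraction of organic matter from terrestrial sources by linear mixing."""
    f = (d13c_sample - d13c_marine) / (d13c_terr - d13c_marine)
    return min(1.0, max(0.0, f))   # clamp to the physical range [0, 1]

# Example: a sample at -22.5 per mil is ~36% terrestrial under these end-members.
print(terrestrial_fraction(-22.5))
```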

  7. Application of a stent splint to protect intraoral organs from radiation injury in a 97-year-old patient with multiple oral cancers who survived past 100 years of age

    Radiation therapy has been used with increasing frequency in recent years in the management of oral cancers in patients of advanced age. In such cases, great care must be taken to maintain the oral health of patients undergoing cancerocidal doses of radiation therapy. Using splints as tissue displacers during irradiation, we were able to treat a 99-year-old female patient without serious radiation sequelae, and she survived past 100 years of age. When she first visited us at 97 years old, the primary lesions, located on the left upper lip, nose, and upper and lower gums, were diagnosed histologically as multiple verrucous carcinomas. Seventeen months after the first radiotherapy to the lip, nose and upper jaw, we planned a second course of radiotherapy for the recurrent tumor of the lower gum. In order to eliminate or minimize the side effects of the second irradiation on the contiguous intraoral organs, we devised a splint to keep the tongue and upper gum out of the radiation field. The splint, serving as a tissue displacer, was made of heat-cured acrylic resin and divided into two pieces formed like full dentures without artificial teeth; they were applied to the upper and lower jaws. The lower piece had a large wing to exclude the tongue from the irradiation field. After the splint was set, the patient clenched slightly with the aid of a chin cap. We were then able to complete the radiotherapy (10 MV X-rays, 40 Gy) as scheduled without serious trouble. (author)

  8. 100-Year Floodplains, Digital Floodplain maps create by WI DNR added to our website in 2013, Published in 2013, 1:24000 (1in=2000ft) scale, Oneida County Wisconsin.

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Published Reports/Deeds information as of 2013. It is...

  9. Upwelling and anthropogenic forcing on phytoplankton productivity and community structure changes in the Zhejiang coastal area over the last 100 years

    DUAN Shanshan; XING Lei; ZHANG Hailong; FENG Xuwen; YANG Haili; ZHAO Meixun

    2014-01-01

    Phytoplankton productivity and community structure in marginal seas have been altered significantly during the past three decades, but it is still a challenge to distinguish the forcing mechanisms between climate change and anthropogenic activities. High time-resolution biomarker records of two 210Pb-dated sediment cores (#34: 28.5°N, 122.272°E; CJ12-1269: 28.8619°N, 122.5153°E) from the Min-Zhe coastal mud area were compared to reveal changes of phytoplankton productivity and community structure over the past 100 years. Phytoplankton productivity started to increase gradually from the 1970s and increased rapidly after the late 1990s at Site #34; and it started to increase gradually from the middle 1960s and increased rapidly after the late 1980s at Site CJ12-1269. Productivity of Core CJ12-1269 was higher than that of Core #34. Phytoplankton community structure variations displayed opposite patterns in the two cores. The decreasing D/B (dinosterol/brassicasterol) ratio of Core #34 since the 1960s revealed increased diatom contribution to total productivity. In contrast, the increasing D/B ratio of Core CJ12-1269 since the 1950s indicated increased dinoflagellate contribution to total productivity. Both the productivity increase and the increased dinoflagellate contribution in Core CJ12-1269 since the 1950s-1960s were mainly caused by anthropogenic activities, as the location is closer to the Changjiang River Estuary, with higher nutrient concentrations and decreasing Si/N ratios. However, the increased diatom contribution in Core #34 is proposed to be caused by increased coastal upwelling, with higher nutrient concentrations and higher Si/N ratios.

  10. Polychlorinated biphenyls (PCBs) in sediments/soils of different wetlands along 100-year coastal reclamation chronosequence in the Pearl River Estuary, China.

    Zhao, Qingqing; Bai, Junhong; Lu, Qiongqiong; Gao, Zhaoqin; Jia, Jia; Cui, Baoshan; Liu, Xinhui

    2016-06-01

    PCBs (polychlorinated biphenyls) were determined in sediment/soil profiles to a depth of 30 cm from three different wetlands (i.e., ditch wetlands, riparian wetlands and reclaimed wetlands) of the Pearl River Estuary to elucidate their levels, distribution and toxic risks along a 100-year chronosequence of reclamation. All detected PCB congeners and the total of 15 PCBs (∑15 PCBs) decreased with depth along the sediment/soil profiles in all three wetlands. The ∑15 PCBs concentrations ranged from 17.68 to 169.26 ng/g in surface sediments/soils. Generally, older wetlands tended to have higher PCB concentrations than younger ones. The dominant PCB congeners at all sampling sites were light PCB homologues (i.e., tetra-CBs and tri-CBs). According to the sediment quality guideline, the average PCB concentrations exceeded the threshold effects level (TEL, 21.6 ng/g) at most of the sampling sites, indicating possible adverse biological effects, dominantly caused by the light PCB congeners. The total toxic equivalent (TEQ) concentrations of the 10 dioxin-like PCBs (DL-PCBs) detected at all sampling sites ranged from 0.04 to 852.7 × 10−3 ng/g, mainly governed by PCB126. Only DL-PCB concentrations in ditch and riparian wetland sediments with 40-year reclamation histories (i.e., D40 and Ri40) exhibited moderate adverse biological effects according to SQGQ values. Principal component analysis indicated that PCBs in the three wetland sediments/soils mainly originated from Aroclor 1016, 1242, and 1248. Correlation analysis showed that sediment/soil organic carbon content had a significant correlation with the concentrations of several PCB congeners (P < 0.05). PMID:27038573
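
    The TEQ figures quoted above follow the standard bookkeeping: each dioxin-like congener's concentration is multiplied by its toxic equivalency factor (TEF) and the products are summed. The sketch below uses the WHO-2005 TEF values for the non-ortho DL-PCBs as recalled here; verify them against the WHO tables before any real use, and note that the concentrations are invented.

```python
# WHO-2005 TEFs for the non-ortho dioxin-like PCBs (as recalled; verify).
WHO_2005_TEF = {"PCB77": 0.0001, "PCB81": 0.0003, "PCB126": 0.1, "PCB169": 0.03}

def teq(concentrations_ng_per_g):
    """Sum of concentration x TEF over the congeners present (ng TEQ/g)."""
    return sum(c * WHO_2005_TEF[name] for name, c in concentrations_ng_per_g.items())

# PCB126's TEF is orders of magnitude above the others, which is why it
# dominates the TEQ, as the abstract notes.
sample = {"PCB77": 2.0, "PCB81": 0.5, "PCB126": 0.8, "PCB169": 0.6}
print(teq(sample))   # ~0.098 ng TEQ/g, ~81% of it contributed by PCB126
```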

  11. Changes in stable isotopes, lignin-derived phenols, and fossil pigments in sediments of Lake Biwa, Japan: Implications for anthropogenic effects over the last 100 years

    We measured stable nitrogen (N) and carbon (C) isotope ratios, lignin-derived phenols, and fossil pigments in sediments of known ages to elucidate the historical changes in the ecosystem status of Lake Biwa, Japan, over the last 100 years. Stable N isotope ratios and algal pigments in the sediments increased rapidly from the early 1960s to the 1980s, and then remained relatively constant, indicating that eutrophication occurred in the early 1960s but ceased in the 1980s. Stable C isotope ratios of the sediment increased from the 1960s, but decreased after the 1980s to the present. This decrease in stable C isotope ratios after the 1980s could not be explained by annual changes in either terrestrial input or algal production. However, when the C isotope ratios were corrected for the Suess effect, the shift toward more negative isotopic values in atmospheric CO2 caused by fossil-fuel burning, the corrected values showed a trend consistent with the other biomarkers and the monitoring data. The trend was also mirrored by the relative abundance of lignin-derived phenols, a unique organic tracer of material originating from terrestrial plants, which decreased in the early 1960s and recovered to some degree in the 1980s. We detected no notable difference in the composition of lignin phenols, suggesting that the terrestrial plant composition did not change markedly. However, we found that the lignin accumulation rate increased around the 1980s. These results suggest that although eutrophication has stabilized since the 1980s, allochthonous organic matter input has changed in Lake Biwa over the past 25 years.
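
    A Suess-effect correction of the kind mentioned above adjusts each measured value by the decline in atmospheric CO2 δ13C since a reference year. The sketch below shows only the mechanics, with coarse, illustrative atmospheric anchor points; a real analysis would interpolate a published ice-core/atmospheric record rather than these assumed values.

```python
import numpy as np

# Illustrative atmospheric delta-13C anchor points (per mil); NOT a published
# record -- substitute a measured ice-core/atmospheric time series in practice.
atm_years = np.array([1900.0, 1950.0, 1980.0, 2000.0])
atm_d13c  = np.array([-6.6,  -6.9,  -7.6,  -8.0])

def suess_corrected(d13c_measured, year, ref_year=1900.0):
    """Remove the fossil-fuel-driven atmospheric delta-13C decline since ref_year."""
    d13c_atm = np.interp(year, atm_years, atm_d13c)
    d13c_ref = np.interp(ref_year, atm_years, atm_d13c)
    # Add back the atmospheric decline accumulated since the reference year.
    return d13c_measured + (d13c_ref - d13c_atm)

print(suess_corrected(-26.0, 1990.0))   # -26.0 + (-6.6 - (-7.8)) = -24.8
```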

  12. Fractionation, transfer, and ecological risks of heavy metals in riparian and ditch wetlands across a 100-year chronosequence of reclamation in an estuary of China

    Xiao, Rong [State Key Laboratory of Water Environment Simulation, School of Environment, Beijing Normal University, Beijing 100875 (China); School of Nature Conservation, Beijing Forestry University, Beijing 100083 (China); Bai, Junhong, E-mail: junhongbai@163.com [State Key Laboratory of Water Environment Simulation, School of Environment, Beijing Normal University, Beijing 100875 (China); Lu, Qiongqiong; Zhao, Qingqing; Gao, Zhaoqin; Wen, Xiaojun; Liu, Xinhui [State Key Laboratory of Water Environment Simulation, School of Environment, Beijing Normal University, Beijing 100875 (China)

    2015-06-01

    The effect of reclamation on heavy metal concentrations and the associated ecological risks in ditch wetlands (DWs) and riparian wetlands (RWs) across a 100-year chronosequence in the Pearl River Estuary of China was investigated. Concentrations of four heavy metals (Cd, Cu, Pb, and Zn) in soil and plant samples, and in sequential extracts of soil samples, were determined using inductively coupled plasma atomic absorption spectrometry. Results showed that heavy metal concentrations were higher in older DW soils than in younger ones, whereas younger RW soils contained higher heavy metal concentrations than older ones. Although heavy metal concentrations in soil clearly tended to increase after wetland reclamation, Cu, Pb, and Zn exhibited low or no risk to the environment based on the risk assessment code (RAC); Cd, on the other hand, posed a medium or high risk. Cd, Pb, and Zn were mainly bound to Fe–Mn oxides, whereas most of the Cu remained in the residual phase in both ditch and riparian wetland soils, and the residual proportions generally increased with depth. Bioconcentration and translocation factors for most of these four heavy metals significantly decreased in the DWs with older age (p < 0.05), whereas they increased in the RWs with younger age (p < 0.05). The DW soils contained higher concentrations of heavy metals in the organic fractions, whereas there were more carbonate and residual fractions in the RW soils. The non-bioavailable fractions of Cu and Zn, and the organic-bound Cd and Pb, significantly inhibited plant growth. - Highlights: • Heavy metals in ditch wetland soils accumulated with increasing reclamation history. • Heavy metals exist in the Fe–Mn oxide and residual fractions in both wetlands. • Cd posed a medium to high environmental risk, while other metals posed low risk. • Long reclamation history caused lower BCFs and TFs in DWs and higher levels in RWs. • RW soils contained more heavy metals in the carbonate
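
    For orientation, the three indices named in this abstract are simple ratios. The RAC is the percentage of a metal held in the most mobile (exchangeable plus carbonate-bound) fractions, classified against commonly cited thresholds; the BCF and TF are plant/soil and shoot/root concentration ratios. The sketch below uses invented fraction values, not data from the study.

        # Sketch of the RAC classification and the BCF/TF ratios used above.
        # Thresholds follow the commonly cited RAC scheme; values are invented.

        def rac_percent(exchangeable, carbonate, total):
            return 100.0 * (exchangeable + carbonate) / total

        def rac_class(rac):
            if rac < 1:
                return "no risk"
            if rac <= 10:
                return "low risk"
            if rac <= 30:
                return "medium risk"
            if rac <= 50:
                return "high risk"
            return "very high risk"

        def bcf(c_plant, c_soil):   # bioconcentration factor
            return c_plant / c_soil

        def tf(c_shoot, c_root):    # translocation factor
            return c_shoot / c_root

        cd_rac = rac_percent(exchangeable=0.08, carbonate=0.12, total=0.9)
        print(f"Cd RAC = {cd_rac:.1f}% -> {rac_class(cd_rac)}")  # medium risk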

  14. 100 Years of benthic foraminiferal history on the inner Texas shelf inferred from fauna and stable isotopes: Preliminary results from two cores

    Strauss, Josiah; Grossman, Ethan L.; Carlin, Joseph A.; Dellapenna, Timothy M.

    2012-04-01

    Coastal regions, such as the Texas-Louisiana shelf, are subject to seasonal hypoxia that strongly depends on the magnitude of freshwater discharge from local and regional river systems. We have determined benthic foraminiferal fauna and isotopic compositions in two 210Pb-dated box cores (BR4 and BR5) to examine the evidence for nearshore hypoxia and freshwater discharge on the Texas shelf during the last 100 years. The 210Pb chronologies of the cores reveal sedimentation rates of 0.2 and 0.1 cm yr-1, translating to ~60- and ~90-year records. The fauna of both cores was almost exclusively composed of Ammonia parkinsoniana and Elphidium excavatum, indicating euryhaline ambient waters. The Ammonia-Elphidium (A-E) index, a qualitative measure of low-oxygen conditions, shows an increase from values between 20 and 50 to near 100 in both cores, suggesting low-oxygen conditions between 1960 and the core top. Between 1950 and 1960 (9-10 cm), low A-E values in BR4 coincide with high δ18O and δ13C values, greater than 0‰ and -1‰ respectively. This event corresponds to severe drought (the Texas Drought of Record) over the Brazos River drainage basin and considerably reduced river discharge from 1948 to 1957. High A-E values prior to this event imply low-oxygen conditions were prevalent before anthropogenic exacerbation of Louisiana shelf hypoxia, and at least since the dredging of a new Brazos River delta in 1929. Elphidium excavatum δ13C values are very low (-4‰) and indicative of a significant vital effect. The δ13C values of A. parkinsoniana average -3‰ and exhibit little variability, most likely reflecting pore waters influenced by aerobic and anaerobic respiration. The association of lowered Brazos River discharge with more oxygenated shelf bottom waters suggests Brazos River discharge and shelf hypoxia are linked, but Mississippi-Atchafalaya discharge can also contribute to shelf stratification.
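
    The A-E index used here is conventionally defined as the percentage of Ammonia specimens among the combined Ammonia plus Elphidium counts. A minimal sketch with invented counts:

        # Ammonia-Elphidium (A-E) index: percentage of Ammonia among combined
        # Ammonia + Elphidium counts; higher values lean toward hypoxia.

        def ae_index(n_ammonia, n_elphidium):
            return 100.0 * n_ammonia / (n_ammonia + n_elphidium)

        print(ae_index(45, 55))  # ~45: better-oxygenated interval (invented)
        print(ae_index(98, 2))   # ~98: hypoxia-leaning assemblage (invented)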

  15. Indications of progressive desiccation of the Transvaal Lowveld over the past 100 years, and implications for the water stabilization programme in the Kruger National Park

    U. De V. Pienaar

    1985-12-01

    All available rainfall statistics recorded for the Kruger National Park area since 1907, coupled with an analysis of all the historical climatological data on hand, appear to confirm the quasi-twenty-year oscillation in the precipitation pattern of the summer rainfall area, first pointed out by Tyson & Dyer (1975). The dendrochronological data obtained by Hall (1976) from a study of growth rings of a very old yellowwood tree (Podocarpus falcatus) in Natal also appear to indicate a superimposed, long-term (80-100 years) pattern of alternating below-average and above-average rainfall periods. The historical data relating to climate in the park during the past century or two seem to bear out such a pattern. If this can be confirmed, it will be an enormous aid not only in wildlife-management planning, but also to agriculturists, demographic planners and others. It would appear that the long, relatively dry rainfall period of 1860-1970, with its concomitant progressive desiccation of the area in question, has passed over into the next above-average rainfall era. This does not mean that there will be no further cataclysmic droughts during future rainfall trough periods. It is therefore wise to plan ahead to meet such contingencies. The present water distribution pattern in the park (natural plus artificial water) is conspicuously still well below that which pertained, during dry seasons, at the turn of the century, when the Sabi and Shingwedzi game reserves were proclaimed. It is the declared policy of the National Parks Board of Trustees to simulate natural regulating mechanisms as closely as possible. In consequence, the artificial water-for-game programme is a long way from completion. The large numbers of game animals in the park (including dominant species such as elephant Loxodonta africana and buffalo Syncerus caffer) can no longer migrate out of the area to escape natural catastrophes (such as the crippling droughts of 1911-1917, the

  16. Simulation tools

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools have made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A considerable number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though the tools are already powerful and convenient. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as the properties of such tools, several tools are presented. Starting with simplified models ...

  17. Centennial annual general meeting of the CIM/CMMI/MIGA. Montreal `98: a vision for the future; 100 years of ground subsidence studies

    Chrzanowski, A.; Szostak-Chrzanowski, A.; Forrester, D.J. [University of New Brunswick, Fredericton, NB (Canada)

    1998-12-31

    Some of the empirical methods developed in central Europe for the monitoring and analysis of ground subsidence have been adapted to North American conditions. A century of subsidence observations in Cape Breton is outlined. Empirical methods are being replaced by deterministic modelling of rock behaviour, which applies numerical methods to the development of subsidence models. These deterministic models can be verified by monitoring under diverse geological and mining conditions. Some of the new monitoring methods developed in Canada are illustrated by case studies describing the use of hydrographic surveys to measure subsidence in offshore coal mines, a telemetric monitoring system for a coal mine in British Columbia, and deterministic monitoring and modelling of ground subsidence in a potash mine. 29 refs., 9 figs., 2 tabs.

  18. Decision support tool for Virtual Power Players: Hybrid Particle Swarm Optimization applied to Day-ahead Vehicle-To-Grid Scheduling

    Soares, João; Valle, Zita; Morais, Hugo

    2013-01-01

    This paper presents a decision support tool methodology to help virtual power players (VPPs) in the Smart Grid (SG) context to solve the day-ahead energy resource scheduling considering the intensive use of Distributed Generation (DG) and Vehicle-To-Grid (V2G). The main focus is the application of a new hybrid method combining a particle swarm approach and a deterministic technique based on mixed-integer linear programming (MILP) to solve the day-ahead scheduling minimizing total operation cost...

  19. Proposition of a PLM tool to support textile design: A case study applied to the definition of the early stages of design requirements

    SEGONDS, Frédéric; Mantelet, Fabrice; Nelson, Julien; Gaillard, Stéphane

    2015-01-01

    The current climate of economic competition forces businesses to adapt more than ever to the expectations of their customers. Faced with new challenges, practices in textile design have evolved in order to manage projects in new work environments. After presenting a state-of-the-art overview of collaborative tools used in product design and making a functional comparison between PLM solutions, our paper proposes a case study for the development and testing of a collaborative platform...

  20. Applying standards to ICT models, tools and data in Europe to improve river basin networks and spread innovation on water sector

    Pesquer, Lluís; Jirka, Simon; van de Giesen, Nick; Masó, Joan; Stasch, Christoph; Van Nooyen, Ronald; Prat, Ester; Pons, Xavier

    2015-04-01

    This work describes the strategy of the European Horizon 2020 project WaterInnEU. Its vision is to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to the water sector and to establish suitable conditions for new market opportunities based on these offerings. The main goals are: • Connect the research results and developments of previous EU-funded activities with the already existing data available at European level, and also with the companies that are able to offer products and services based on these tools and data. • Offer an independent marketplace platform, complemented by technical and commercial expertise as a service, that allows users to access the products and services best fitting their priorities, capabilities and procurement processes. One of the pillars of WaterInnEU is to stimulate and prioritize the application of international standards in ICT tools and policy briefs. The standardization of formats, services and processes will allow for harmonized water management across different sectors, fragmented areas and scales (local, regional or international). Several levels of interoperability will be addressed: • Syntactic: Connecting systems and tools together. Syntactic interoperability allows client and service tools to automatically discover, access, and process data and information (query and exchange parts of a database) and to connect to each other in process chains. The discovery of water-related data is achieved using metadata cataloguing standards and, in particular, the one adopted by the INSPIRE directive: the OGC Catalogue Service for the Web (CSW); a minimal request sketch follows below. • Semantic: Sharing a pan-European conceptual framework. This is the ability of computer systems to exchange data with unambiguous, shared meaning. The project therefore addresses not only the packaging of data (syntax), but also the simultaneous transmission of the meaning with the data (semantics). This is accomplished by linking
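
    As a concrete illustration of the syntactic level, the sketch below issues a CSW 2.0.2 GetRecords request using the standard key-value-pair parameters. The endpoint URL is a placeholder, not a real WaterInnEU service, and the snippet assumes the third-party requests package.

        # CSW 2.0.2 GetRecords sketch using standard key-value-pair parameters.
        # The endpoint is a placeholder; requires the requests package.

        import requests

        CSW_ENDPOINT = "https://example.org/csw"  # placeholder, not a real service

        params = {
            "service": "CSW",
            "version": "2.0.2",
            "request": "GetRecords",
            "typeNames": "csw:Record",
            "resultType": "results",
            "elementSetName": "brief",
            "constraintLanguage": "CQL_TEXT",
            "constraint_language_version": "1.1.0",
            "constraint": "csw:AnyText LIKE '%river basin%'",
        }

        response = requests.get(CSW_ENDPOINT, params=params, timeout=30)
        print(response.status_code)
        print(response.text[:500])  # XML listing of matching metadata records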

  1. 100 years of the main mine rescue service. A contribution to the protection against disasters in the coal mining industry; 100 Jahre Hauptstelle fuer das Grubenrettungswesen. Ein Beitrag zum Katastrophenschutz im Steinkohlenbergbau

    Hermuelheim, Walter [RAG Aktiengesellschaft, Herne (Germany). Zentralbereich Arbeits-, Gesundheits- und Umweltschutz

    2011-06-15

    A review of 100 years of protection against disasters in the coal mining industry impressively shows the way from an era of major accidents to a modern branch of industry, which justifiably and with good prospects of success can pursue the aim of "No accidents - no damage to health - no damage to the environment". However, the development of the mine rescue service over more than 100 years - represented in the Ruhr by the Main Mine Rescue Service established in 1910 in Essen - would be incomplete without consideration of the allied technical fields of underground fire protection and explosion protection. Cooperation between institutions such as the Tremonia test mine and the BVG has produced a safety level in all three fields which is regarded as exemplary worldwide and, in addition to the latest mining technology, is a good advertisement for the German coal mining industry. (orig.)

  2. An H-formulation-based three-dimensional hysteresis loss modelling tool in a simulation including time varying applied field and transport current: the fundamental problem and its solution

    When analytic solutions are not available, finite-element-based tools can be used to simulate hysteresis losses in superconductors with various shapes. A widely used tool for the corresponding magnetoquasistatic problem is an eddy current model based on the H-formulation, where H is the magnetic field intensity. In this paper, we study this type of tool in a three-dimensional simulation problem. We consider a case where we simultaneously apply both a time-varying external magnetic field and a transport current to a twisted wire. We show how the modelling decisions (air has a high but finite resistivity, and the applied field determines the boundary condition) affect the current density distribution along the wire. According to the results, the wire carries the imposed net current only on the boundary of the modelling domain, but not inside it: the current diffuses into the air and back to the boundary. To fix this problem, we present another formulation where air is treated as a region with zero conductivity. Correspondingly, we express H in the air with a scalar potential and a cohomology basis function which enforces the net current condition. As shown in this paper, this formulation does not fail in these so-called AC-AC (time-varying transport current and applied magnetic field) simulations. (paper)

  3. Decision support tool for Virtual Power Players: Hybrid Particle Swarm Optimization applied to Day-ahead Vehicle-To-Grid Scheduling

    Soares, João; Valle, Zita; Morais, Hugo

    2013-01-01

    This paper presents a decision support tool methodology to help virtual power players (VPPs) in the Smart Grid (SG) context to solve the day-ahead energy resource scheduling considering the intensive use of Distributed Generation (DG) and Vehicle-To-Grid (V2G). The main focus is the application of a new hybrid method combining a particle swarm approach and a deterministic technique based on mixed-integer linear programming (MILP) to solve the day-ahead scheduling minimizing total operation costs from the aggregator point of view. A realistic mathematical formulation, considering the electric network constraints and V2G charging and discharging efficiencies, is presented. Full AC power flow calculation is included in the hybrid method to allow taking into account the network constraints. A case study with a 33-bus distribution network and 1800 V2G resources is used to illustrate the performance...
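
    For readers unfamiliar with the metaheuristic half of the hybrid, the sketch below shows a bare-bones particle swarm optimizer on a toy objective standing in for the day-ahead cost function. The MILP refinement and AC power flow of the paper are not reproduced; all parameter values are generic defaults, not the authors' settings.

        # Bare-bones particle swarm optimization (PSO) sketch on a toy cost.

        import random

        def pso(cost, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
            pos = [[random.uniform(-5, 5) for _ in range(dim)]
                   for _ in range(n_particles)]
            vel = [[0.0] * dim for _ in range(n_particles)]
            pbest = [p[:] for p in pos]          # personal best positions
            gbest = min(pbest, key=cost)         # global best position
            for _ in range(iters):
                for i, p in enumerate(pos):
                    for d in range(dim):
                        r1, r2 = random.random(), random.random()
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * r1 * (pbest[i][d] - p[d])
                                     + c2 * r2 * (gbest[d] - p[d]))
                        p[d] += vel[i][d]
                    if cost(p) < cost(pbest[i]):
                        pbest[i] = p[:]
                gbest = min(pbest, key=cost)
            return gbest

        # Toy stand-in: two charging set-points minimizing a quadratic cost.
        print(pso(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2, dim=2))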

  4. Modified Linear Theory Aircraft Design Tools and Sonic Boom Minimization Strategy Applied to Signature Freezing via F-function Lobe Balancing

    Jung, Timothy Paul

    Commercial supersonic travel has strong business potential; however, in order for the Federal Aviation Administration to lift its ban on supersonic flight overland, designers must reduce aircraft sonic boom strength to an acceptable level. An efficient methodology and associated tools for designing aircraft for minimized sonic booms are presented. The computer-based preliminary design tool, RapidF, based on modified linear theory, enables quick assessment of an aircraft's sonic boom with run times less than 30 seconds on a desktop computer. A unique feature of RapidF is that it tracks where on the aircraft each segment of the sonic boom came from, enabling precise modifications and speeding the design process. Sonic booms from RapidF are compared to flight test data, showing that it is capable of predicting sonic boom duration, overpressure, and interior shock locations. After the preliminary design is complete, scaled flight tests should be conducted to validate the low-boom design. When conducting such tests, it is insufficient to scale just the length; thus, equations to scale the weight and propagation distance are derived. Using RapidF, a conceptual supersonic business jet design is presented that uses F-function lobe balancing to create a frozen sonic boom using lifting surfaces. The leading shock is reduced from 1.4 to 0.83 psf, and the trailing shock from 1.2 to 0.87 psf, 41% and 28% reductions respectively. By changing the incidence angle of the surfaces, different sonic boom shapes can be created, allowing the lobes to be re-balanced for new flight conditions. Computational fluid dynamics is conducted to validate the sonic boom predictions. Off-design analysis is presented that varies weight, altitude, Mach number, and propagation angle, demonstrating that lobe balance is robust. Finally, the Perceived Level of Loudness metric is analyzed, resulting in a modified design that incorporates other boom minimization techniques to further reduce

  5. Scoring functions--the first 100 years.

    Tame, Jeremy R H

    2005-06-01

    The use of simple linear mathematical models to estimate chemical properties is not a new idea. Albert Einstein used very simple 'gravity-like' forces to explain the capillarity of different liquids in 1900-1901. Today such models are used in more complicated situations, and a great many have been developed to analyse interactions between proteins and their ligands. This is not surprising, since proteins are too complicated to model accurately without lengthy numerical analysis, and simple models often do at least as good a job in predicting binding constants as much more computationally expensive methods. One hundred years after Einstein's 'miraculous year' in which he transformed physics, it is instructive to recall some of his even earlier work. As approximations, 'scoring functions' are excellent, but it is dangerous to read too much into them. A few cautionary tales are presented for the beginner to the field of ligand affinity prediction by linear models. PMID:16231202

  6. 100 years of ionizing radiation protection

    The development of radiation protection from the end of the 19th century and the evolution of opinion about the injurious effects of ionizing radiation are presented. Observations of the undesirable effects of exposure to ionizing radiation, together with progress in radiobiology and dosimetry, directed efforts toward radiation protection. These activities initially covered a limited number of persons and were subsequently extended to the whole population. The current means, goals and regulations of radiological control are discussed.

  7. Appraising Schumpeter's "Essence" after 100 years

    Andersen, Esben Sloth

    Schumpeter's unique type of evolutionary analysis can hardly be understood unless we recognise that he developed it in relation to a study of the strengths and weaknesses of the Walrasian form of neoclassical economics. This development was largely performed in his first book, 'Wesen und Hauptinhalt der theoretischen Nationalökonomie'. This German-language book, which in English might be called 'Essence and Scope of Theoretical Economics', was published a century ago (in 1908). Different readings of Wesen provide many clues about the emergence and structure of Schumpeter's programme for teaching and research. This programme included a modernisation of static economic analysis, but he concentrated on the difficult extension of economic analysis to cover economic evolution. Schumpeter thought that this extension required a break with basic neoclassical assumptions, but he tried to avoid...

  8. Lurpak: Ready for another 100 years?

    Grunert, Klaus G.

    2001-01-01

    The Lur mark - the forerunner and very foundation of Lurpak butter - celebrates its 100th anniversary this year. That is an unusual and impressive lifetime for a consumer goods brand and something the Danish dairy sector can be proud of.

  9. Peacock: 100 years of servicing Canadian industry

    In 1997 Peacock Inc., a supplier of pipeline, filtration, pumping, materials handling and mechanical equipment of all kinds to the Canadian oil and natural gas industries, will celebrate its 100th year of servicing Canadian industry, and 50th year in the oil patch. The company has outlets in several Canadian cities from Halifax to Vancouver. It manufactures, distributes, maintains and repairs all types of industrial equipment. It also manages the Naval Engineering Test Establishment at LaSalle, PQ, for the Department of Defence. Peacock service centres provide 24-hour service response to emergency breakdowns anywhere in Canada; its engineers and technicians are ISO 9003 qualified or better, and are experts in turnarounds and planned maintenance outages, major overhauls of critical equipment, supplying mechanical crews for emergency equipment breakdowns, and grouting of heavy machinery. By close coordination of its four divisions, and by maintaining their dedication to service, the company looks to the future with pride and confidence

  10. The Flexner Report--100 years later.

    Duffy, Thomas P

    2011-09-01

    The Flexner Report of 1910 transformed the nature and process of medical education in America with a resulting elimination of proprietary schools and the establishment of the biomedical model as the gold standard of medical training. This transformation occurred in the aftermath of the report, which embraced scientific knowledge and its advancement as the defining ethos of a modern physician. Such an orientation had its origins in the enchantment with German medical education that was spurred by the exposure of American educators and physicians at the turn of the century to the university medical schools of Europe. American medicine profited immeasurably from the scientific advances that this system allowed, but the hyper-rational system of German science created an imbalance in the art and science of medicine. A catching-up is under way to realign the professional commitment of the physician with a revision of medical education to achieve that purpose. PMID:21966046

  11. Trends in nuclear physics. 100 years later

    In the first years after the discovery of radioactivity it became clear that nuclear physics was, par excellence, the science of small quantum systems. Between the fifties and the eighties, nuclear physics and elementary particle physics lived their own lives, without much interaction. During this period the basic concepts were defined. Recently, contrary to the specialization law often observed in science, the boundary between nuclear and elementary particle physics has become somewhat blurred. This Les Houches Summer School was set up with the aim of fighting off the excessive specialization evident in many international meetings and returning to the roots. The twofold challenge of setting up a fruitful exchange between experimentalists and theorists on the one hand, and between nuclear and hadronic matter physicists on the other, was successfully met. The volume presents high-quality, up-to-date reviews, starting with an account of the birth and first developments of nuclear physics. Further chapters discuss the description of nuclear structure, the physics of nuclei at very high spin, the existence of super-heavy nuclei as a consequence of shell structure, and the liquid-gas transition, including both a description and a review of the experimental situation. Other topics dealt with include the interactions between moderately relativistic heavy ions, the concept of a nucleon dressed by a cloud of pions, the presence of pions in the nucleus, and subnucleonic phenomena in nuclei and the quark-gluon deconfinement transition, covering both theoretical and experimental aspects. Nuclear physics continues to influence many other fields, such as astrophysics, and is also inspired by these same fields. This cross-fertilisation is illustrated by the treatment of neutron stars in one of the final chapters. The last chapter provides an overview of a recent development in which particle and nuclear physicists have cooperated to revitalize an alternative method for nuclear energy production, associating high-energy accelerators with sub-critical neutron-multiplying assemblies.

  12. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations

    Pinho, Ludmila A. G.; Sá-Barreto, Lívia C. L.; Infante, Carlos M. C.; Cunha-Filho, Marcílio S. S.

    2016-04-01

    The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at the wavelengths 259 and 321 nm, respectively. Diverse tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity to quantify small amounts of the analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures was shown to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form.
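
    The analysis of mixtures reduces to a small linear system: at each wavelength, the mixture's absorbance is the sum of each analyte's absorptivity times its concentration. The sketch below solves the resulting 2x2 system; the absorptivity and absorbance numbers are invented for illustration and are not the paper's validated calibration values.

        # Two-component Lambert-Beer sketch: absorbances at two wavelengths
        # give a 2x2 linear system in the unknown concentrations.

        import numpy as np

        # Rows: 259 nm and 321 nm; columns: BNZ, ITZ (absorptivity x path).
        K = np.array([[0.030, 0.004],   # 259 nm: BNZ absorbs strongly
                      [0.002, 0.025]])  # 321 nm: ITZ absorbs strongly

        A = np.array([0.62, 0.51])      # measured mixture absorbances

        c_bnz, c_itz = np.linalg.solve(K, A)
        print(f"BNZ = {c_bnz:.1f}, ITZ = {c_itz:.1f} (units set by K)")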

  13. THE CASE STUDY TASKS AS A BASIS FOR THE FUND OF THE ASSESSMENT TOOLS AT THE MATHEMATICAL ANALYSIS FOR THE DIRECTION 01.03.02 APPLIED MATHEMATICS AND COMPUTER SCIENCE

    Dina Aleksandrovna Kirillova

    2015-12-01

    The modern reform of Russian higher education involves the implementation of a competence-based approach, the main idea of which is the practical orientation of education. Mathematics is a universal language for the description, modeling and study of phenomena and processes of different natures. Creating a fund of assessment tools for mathematical disciplines based on applied problems is therefore a timely task. The case method is the most appropriate means of monitoring learning outcomes, as it is aimed at bridging the gap between theory and practice. The aim of the research is the development of methodical materials for creating a fund of assessment tools based on case studies for mathematical analysis for the direction «Applied Mathematics and Computer Science». The aim follows from the contradiction between the need to introduce the case method into the educational process in higher education and the lack of study of the theoretical foundations of using this method as applied to mathematical disciplines, as well as the insufficient theoretical basis for, and description of, the process of creating case problems for use in monitoring learning outcomes.

  14. Solar geometry tool applied to systems and bio-climatic architecture; Herramienta de geometria solar aplicada a sistemas y arquitectura bio-climatica

    Urbano, Antonio; Matsumoto, Yasuhiro; Aguilar, Jaime; Asomoza Rene [CIMVESTAV-IPN, Mexico, D.F (Mexico)

    2000-07-01

    This article presents the annual solar paths by means of Cartesian graphs, and shows how to use them, taking as a basis astronomical and geographical data for the site. The graphs indicate the hours of sunshine through the day, month and year for a latitude of 19 degrees north, as well as hourly solar radiation values for the most important declinations occurring annually (equinoxes, solstices and the intermediate months). They help the user find the optimal location, evaluate obstacles in the surroundings, and determine on site the shadows cast on solar equipment or buildings (by mountains, trees, buildings, windows, terraces, domes, et cetera), the hours of sunshine, or the radiation needed for the desired bio-climatic calculation. The present work is a site-engineering tool for architects, designers, builders, planners, installers and energy auditors, among others, who require the use of solar energy for any of its multiple applications.
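
    The geometry behind such charts can be sketched in a few lines: solar declination from the day of year (Cooper's approximation) and solar elevation from latitude, declination, and hour angle. The sketch below fixes the latitude at 19 degrees north as in the article; it is an illustration, not the authors' tool.

        # Solar geometry sketch: declination (Cooper's approximation) and
        # elevation angle for a site at 19 degrees north latitude.

        import math

        LAT = math.radians(19.0)  # latitude of the site, 19 degrees north

        def declination(day_of_year):
            """Solar declination (radians), Cooper's approximation."""
            return math.radians(23.45) * math.sin(
                2 * math.pi * (284 + day_of_year) / 365)

        def elevation(day_of_year, solar_hour):
            """Solar elevation angle (degrees) at local solar time."""
            dec = declination(day_of_year)
            hour_angle = math.radians(15.0 * (solar_hour - 12.0))  # 15 deg/hour
            sin_el = (math.sin(LAT) * math.sin(dec)
                      + math.cos(LAT) * math.cos(dec) * math.cos(hour_angle))
            return math.degrees(math.asin(sin_el))

        # Solar noon on the June solstice (~day 172): sun nearly overhead.
        print(f"{elevation(172, 12.0):.1f} degrees")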

  15. Science serving people. IAEA-supported projects are helping countries apply the right tools to fight food, health, and water problems

    A new booklet 'Science Serving People' features stories about how IAEA-supported projects are making a difference in many poorer countries. The stories describe applications of nuclear science and technology that are being used through technical cooperation channels to overcome challenges of water scarcity, food shortage, malnutrition, malaria, environmental degradation and many other problems. They also illustrate how the complementary development, safety, and security initiatives of the IAEA are fostering atoms for peace in the developing world. Extreme poverty and deprivation remain a problem of monumental proportions at the dawn of the 21st century, notes IAEA Director General Mohamed ElBaradei in the booklet's Introduction. Through effective partnerships, collaborative research, and strategic direction, the IAEA is contributing to global efforts to help the poor. IAEA programmes have entered an important phase, he said, in which scientific contributions to Member States are yielding very sizeable human benefits. It's clear that science and technology must be better mobilized to meet the needs of the poor, emphasizes Jeffrey Sachs, Director of the Earth Institute at Columbia University, USA, and Special Advisor to UN Secretary-General Kofi Annan. The UN agencies, such as the IAEA, have a great role to play, he says in the booklet's Foreword. This is especially so, he points out, if they act as a bridge between the activities of advanced- country and developing country scientific centres, and if they help to harness the advances of world science for the poor as well as the rich. The bottom line, he concludes, is that rich countries should expand support for those United Nations organizations that can help in solving the unique problems confronting the world's poorest peoples. The booklet features stories on managing water resources, promoting food security, focusing science on health problems, new tools for environmental management, and strengthening nuclear

  16. FAMUS (Flow Assurance by Management of Uncertainty and Simulation): a new tool for integrating flow assurance effects in traditional RAM (Reliability, Availability and Maintainability) analysis applied on a Norwegian Offshore System

    Eisinger, Siegfried; Isaksen, Stefan; Grande, Oystein [Det Norske Veritas (DNV), Oslo (Norway); Chame, Luciana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Traditional RAM (Reliability, Availability and Maintainability) models fall short of taking flow assurance effects into account. In many oil and gas production systems, flow assurance issues like hydrate formation, wax deposition or particle erosion may cause a substantial share of production upsets. Flow assurance issues are complex and hard to quantify in a production forecast; without taking them into account, however, a RAM model generally overestimates the predicted system production. This paper demonstrates the FAMUS concept, a method and tool for integrating RAM and flow assurance into one model, providing a better foundation for decision support. FAMUS therefore utilises both discrete-event and thermo-hydraulic simulation. The method is currently applied as a decision support tool in an early phase of the development of an offshore oil field on the Norwegian continental shelf. (author)
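
    A minimal way to picture the FAMUS idea is a Monte Carlo availability estimate in which flow-assurance upsets are sampled alongside ordinary equipment failures. The sketch below does exactly that with made-up rates and durations; the real tool couples discrete-event simulation with thermo-hydraulic models, which this toy does not attempt.

        # Toy Monte Carlo: production availability with equipment failures
        # plus flow-assurance (hydrate) upsets. All rates are invented.

        import random

        HOURS = 24 * 365
        P_EQUIP_FAIL = 1e-4   # per-hour equipment failure probability (assumed)
        P_HYDRATE = 5e-5      # per-hour hydrate-formation probability (assumed)
        REPAIR_H, REMEDIATION_H = 48, 24

        def simulate_year():
            up, t = 0, 0
            while t < HOURS:
                if random.random() < P_EQUIP_FAIL:
                    t += REPAIR_H          # down for repair
                elif random.random() < P_HYDRATE:
                    t += REMEDIATION_H     # down while clearing the plug
                else:
                    up += 1
                    t += 1
            return up / HOURS

        runs = [simulate_year() for _ in range(200)]
        print(f"mean production availability: {sum(runs) / len(runs):.4f}")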

  17. FoodChain-Lab: A Trace-Back and Trace-Forward Tool Developed and Applied during Food-Borne Disease Outbreak Investigations in Germany and Europe.

    Weiser, Armin A; Thöns, Christian; Filter, Matthias; Falenski, Alexander; Appel, Bernd; Käsbohrer, Annemarie

    2016-01-01

    FoodChain-Lab is modular open-source software for trace-back and trace-forward analysis in food-borne disease outbreak investigations. Development of FoodChain-Lab has been driven by a need for appropriate software in several food-related outbreaks in Germany since 2011. The software allows integrated data management, data linkage, enrichment and visualization as well as interactive supply chain analyses. Identification of possible outbreak sources or vehicles is facilitated by calculation of tracing scores for food-handling stations (companies or persons) and food products under investigation. The software also supports consideration of station-specific cross-contamination, analysis of geographical relationships, and topological clustering of the tracing network structure. FoodChain-Lab has been applied successfully in previous outbreak investigations, for example during the 2011 EHEC outbreak and the 2013/14 European hepatitis A outbreak. The software is most useful in complex, multi-area outbreak investigations where epidemiological evidence may be insufficient to discriminate between multiple implicated food products. The automated analysis and visualization components would be of greater value if trading information on food ingredients and compound products was more easily available. PMID:26985673

  18. Multivariate curve resolution applied to in situ X-ray absorption spectroscopy data: An efficient tool for data processing and analysis

    Highlights: • Use of MCR algorithms to extract component spectra of different kinetic evolution. • Obtaining components and concentration profiles without use of reference spectra. • Automatic extraction of meaningful component profiles from large XAS datasets. - Abstract: Large datasets containing many spectra, commonly associated with in situ or operando experiments, call for new data treatment strategies, as conventional scan-by-scan data analysis methods have become a time-consuming bottleneck. Several convenient automated data processing procedures, like least-squares fitting of reference spectra, exist but are based on assumptions. Here we present the application of multivariate curve resolution (MCR) as a blind-source separation method to efficiently process a large data set of an in situ X-ray absorption spectroscopy experiment where the sample undergoes a periodic concentration perturbation. MCR was applied to data from a reversible reduction–oxidation reaction of a rhenium-promoted cobalt Fischer–Tropsch synthesis catalyst. The MCR algorithm was capable of extracting, in a highly automated manner, the component spectra with different kinetic evolution together with their respective concentration profiles, without the use of reference spectra. The modulative nature of our experiments allows for averaging over a number of identical periods and hence an increase in the signal-to-noise ratio (S/N), which is efficiently exploited by MCR. The practical and added value of the approach in extracting information from large and complex datasets, typical for in situ and operando studies, is highlighted
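
    For orientation, MCR by alternating least squares factors the data matrix D (spectra over time) into non-negative concentration profiles C and component spectra S, with D ≈ C Sᵀ, without reference spectra. A minimal sketch on random placeholder data, not the authors' implementation:

        # MCR-ALS sketch: alternately solve D ~ C S^T for S and C, clipping
        # to enforce non-negativity. Random data stand in for the XAS series.

        import numpy as np

        rng = np.random.default_rng(0)
        n_times, n_energies, n_components = 60, 200, 2

        D = rng.random((n_times, n_energies))    # placeholder data matrix
        C = rng.random((n_times, n_components))  # initial concentration guess

        for _ in range(200):
            S = np.linalg.lstsq(C, D, rcond=None)[0].T.clip(min=0)
            C = np.linalg.lstsq(S, D.T, rcond=None)[0].T.clip(min=0)

        residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
        print(f"relative residual: {residual:.3f}")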

  19. Management Tools

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle program that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  20. How credible are the study results? Evaluating and applying internal validity tools to literature-based assessments of environmental health hazards.

    Rooney, Andrew A; Cooper, Glinda S; Jahnke, Gloria D; Lam, Juleen; Morgan, Rebecca L; Boyles, Abee L; Ratcliffe, Jennifer M; Kraft, Andrew D; Schünemann, Holger J; Schwingl, Pamela; Walker, Teneille D; Thayer, Kristina A; Lunn, Ruth M

    2016-01-01

    Environmental health hazard assessments are routinely relied upon for public health decision-making. The evidence base used in these assessments is typically developed from a collection of diverse sources of information of varying quality. It is critical that literature-based evaluations consider the credibility of individual studies used to reach conclusions through consistent, transparent and accepted methods. Systematic review procedures address study credibility by assessing internal validity or "risk of bias" - the assessment of whether the design and conduct of a study compromised the credibility of the link between exposure/intervention and outcome. This paper describes the commonalities and differences in risk-of-bias methods developed or used by five groups that conduct or provide methodological input for performing environmental health hazard assessments: the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) Working Group, the Navigation Guide, the National Toxicology Program's (NTP) Office of Health Assessment and Translation (OHAT) and Office of the Report on Carcinogens (ORoC), and the Integrated Risk Information System of the U.S. Environmental Protection Agency (EPA-IRIS). Each of these groups have been developing and applying rigorous assessment methods for integrating across a heterogeneous collection of human and animal studies to inform conclusions on potential environmental health hazards. There is substantial consistency across the groups in the consideration of risk-of-bias issues or "domains" for assessing observational human studies. There is a similar overlap in terms of domains addressed for animal studies; however, the groups differ in the relative emphasis placed on different aspects of risk of bias. Future directions for the continued harmonization and improvement of these methods are also discussed. PMID:26857180

  1. Advanced experimental tools designed for the assessment of the thermal load applied to the mixing tee and nozzle geometries in the PWR plant

    Thermal fatigue studies were restarted after the incident in the heat removal system of the Civaux NPP (May 1998). A thermal fatigue problem was suspected: the cracks that occurred in the mixing tee were probably due to fluctuations across a large temperature difference. This paper introduces the experimental strategy directed by the CEA for assessing the thermal load in the mixing area. Two mockups of the mixing tee, similar in geometry, are tested in the FATHERINO facility under similar thermal-hydraulic conditions: the first, in brass, for selecting the mixing areas where the temperature fluctuation is high, and the second, in stainless steel, for taking measurements with local specific sensors to determine the thermal load. The specific sensors (the Tf fluid sensor, the fluxmeter sensor, and the Coefh sensor) record locally the fluctuations close to the wall, in the fluid and in the wall. The heat flux sensor ('fluxmeter') and the heat transfer sensor ('Coefh') are each equipped with 3 micro-thermocouples in their body; they are non-intrusive and designed to capture fluctuations with low attenuation in the frequency range from 0 to 20 Hz. By applying an inverse heat conduction method to the output data given by the fluxmeter, the wall temperature (mean and fluctuating values) at the internal surface can be accurately determined. The Coefh sensor is like a fluxmeter sensor using the same technology, but equipped with an additional thermocouple in the fluid to determine the heat transfer coefficient. In addition, the results from both experiments (brass and stainless steel mockups) serve as input data for the CFD calculations. The FATHERINO experiment consists of emptying two vessels through the mockup, initially filled with water and conditioned at low (5 C) and high (80 C) temperatures. There is a motor pump for each line (cold and hot legs), and the flow rate is controlled by a temperature valve

  2. The Magnetosusceptibility Stratigraphy (MS) Applied as a Correlation and High Precision Relative Dating Tool in Archaeology: Application to Caves in Spain and Portugal

    Ellwood, B. B.; Arbizu, M.; Arsuaga, J.; Harrold, F.; Zilhao, J.; Adán, G. E.; Aramburu, A.; Fombella, M. A.; Bedia, I. M.; Alvarez-Laó, D.; García, M.

    2005-05-01

    The magnetic susceptibility (MS) method, when carefully applied, can be used to correlate sediment sequences and to characterize the paleoclimate at the time the sediments were deposited in protected archaeological sites, such as caves or deep rock shelters. The method works because the MS of sediments outside caves, which are eventually deposited in caves, is controlled by pedogenesis, which in turn is driven by climate. Here we summarize the method and discuss ways designed to identify anomalous samples that should not be used in relative dating or for correlations. We then present our results from Cueva del Conde, located in the Province of Asturias, northwestern Spain, and compare them with results from other caves in Spain and Portugal. Cueva del Conde was first excavated in 1915, with additional excavations and studies performed in 1962, 1965, and 1999. The current excavations began in 2001. This body of work identified a transitional sequence from Middle Paleolithic (Mousterian) to early Upper Paleolithic (Aurignacian) artifacts, including perhaps the earliest art known from the Upper Paleolithic, thus establishing Cueva del Conde as an important Paleolithic cave site. We collected a continuous series of 44 samples, each covering about 0.027 m of section, from an exposed 1.2 m sequence within the cave. This section has been excavated and studied by archaeologists working at the site, and three 14C dates from charcoal have been reported. The MS of the samples collected for this study was measured using the susceptibility bridge at LSU. The MS shows a systematic cyclicity that, when constrained by the 14C ages, can be correlated to our MS standard curve for Europe (Ellwood et al., 2001; Harrold et al., 2004), and thus to other sites in the region. We interpret this cyclicity to result from climate fluctuations. By comparison to our MS standard curves, we are able to assign MS relative ages to Cueva del Conde that extend the sequence

  3. 100 years of thermal waste treatment - off-gas purification technology then and now. Performance results of the Stellinger Moor waste incineration plant at Hamburg; 100 Jahre thermische Abfallbehandlung - Abgasreinigungstechnik damals und heute, Betriebserfahrungen der MVA Stellinger Moor, Hamburg

    Franck, J. [Stadtreinigung Hamburg, MVA Stellinger Moor, Hamburg (Germany); Schellenberger, I. [Goepfert, Reimer und Partner, Hamburg (Germany); Karpinski, A. [Lentjes Energietechnik GmbH, Essen (Germany)

    1997-06-01

    The contribution outlines the history of thermal waste treatment, starting from the first such plant, constructed at Hamburg-Bullerdeich, and shows the technical refinements with which that plant was built and operated. It also goes into the social context in which the decision to construct such a plant was made. As an example of a modern, up-to-date system, the Stellinger Moor plant at Hamburg is described, both technically and operationally, for comparison; the flue gas purification process employed there fully meets today's demanding legislative requirements. If the technical and social boundary conditions of 100 years ago are compared with those of today, one sees how far plant technology has advanced since the first years, especially in the field of emission reduction. The acceptance problems facing operators of thermal waste treatment plants, however, are still the same as 100 years ago. (orig.)

  4. The “Musical Ningboese” and the Emergence of China's Piano Industry in the Past 100 Years; 音乐“宁波帮”与中国百年钢琴制作的崛起

    沈浩杰

    2015-01-01

    Without the piano, there would have been no development of China's new music over the past 100 years. Drawing on their own strengths and on favourable external conditions, the people of Ningbo, the first Chinese to engage in piano making, have led the all-round development of the piano-related fields, covering the production of parts and complete instruments, market development, research and development, talent training, and piano education and service, and they have guided the 100-year history in which Chinese piano making has grown from small and weak to large and strong. This large and distinguished body of piano makers, together with the other professional musicians of Ningbo origin, forms the “Musical Ningboese”, a group unique in the modern musical history of China.

  5. 100 years of Zukunft/Inden opencast mine. Lignite mining west of the Inde river between Eschweiler and Juelich; 100 Jahre Zukunft - Tagebau Inden. Braunkohlengewinnung westlich der Inde zwischen Eschweiler und Juelich

    Roeggener, Oliver; Oster, Arthur [RWE Power AG, Eschweiler (Germany). Tagebau Inden

    2010-11-15

    With the extraction of the first coal at the Zukunft opencast mine, industrial lignite mining in the west of the Rhenish mining area commenced 100 years ago, in September 1910. The setting up of "BIAG Zukunft" and the commissioning of the power station of the same name only a few years later triggered sustainable growth of this industry and of the entire region. In the course of the dynamic events of the years that followed, development of the Zukunft-West follow-up opencast mine was started, the Weisweiler briquette factory erected and extended, the Zukunft power plant's capacity built up to 230 MW, and RWE's Weisweiler power station constructed. The merger of the four big Rhenish lignite-mining companies led to profound structural adjustments in the west of the mining area as well. Where the lignite had previously been extracted in different opencast mines, mining activities and power generation were now focused on dedicated companies whose evolution was driven forward strategically. The Inden mine, which was in the development phase in parallel with the Zukunft mine, was temporarily discontinued in this course of events. In 1981, it was successively re-commissioned to provide an offset for the incipient exhaustion of the Zukunft-West mine. In the further course of opencast mining, several towns have had to be resettled by today, and the Inde river and the mine's belt junction relocated. The Inden mine, too, will be exhausted around the year 2030 and will be recultivated, as the first of the still-operational Rhenish opencast mines, with a good-sized residual lake. By resolution of the Lignite Commission dated 5 December 2008, the Inden II Lignite Plan was therefore amended with the aim of creating a substantial lake instead of backfilling the final void with masses from the Hambach mine. Creating the roughly 11-km² lake will also be accompanied by economic structural change in the region. According to expert

  6. Correlation Analysis Between El Nino/La Nina Phenomenon During the Recent 100 Years and Beijing Climate; 近百年El Nino/La Nina事件与北京气候相关性分析

    刘桂莲; 张明庆

    2001-01-01

    Results of the analysis suggest that during the recent 100 years there exists a strong correlation between the El Nino/La Nina phenomenon and Beijing's rainfall in summer (June-August), mean monthly maximum temperature (July) and mean monthly minimum temperature in winter (January). The El Nino phenomenon shows a negative correlation with the summer rainfall and the winter mean monthly minimum temperature, and a positive correlation with the summer mean monthly maximum temperature, producing reduced precipitation, a larger annual temperature range and a more continental climate. The La Nina phenomenon shows a positive correlation with the summer rainfall and the winter mean monthly minimum temperature, and a negative correlation with the summer mean monthly maximum temperature, producing increased precipitation, a smaller annual temperature range and a less continental climate.

  7. Applied Electromagnetics

    These proceedings contain papers relating to the 3rd Japanese-Bulgarian-Macedonian Joint Seminar on Applied Electromagnetics. Included are the following groups: Numerical Methods I; Electrical and Mechanical System Analysis and Simulations; Inverse Problems and Optimizations; Software Methodology; Numerical Methods II; Applied Electromagnetics

  8. Applied superconductivity

    Newhouse, Vernon L

    1975-01-01

    Applied Superconductivity, Volume II, is part of a two-volume series on applied superconductivity. The first volume dealt with electronic applications and radiation detection, and contains a chapter on liquid helium refrigeration. The present volume discusses magnets, electromechanical applications, accelerators, and microwave and rf devices. The book opens with a chapter on high-field superconducting magnets, covering applications and magnet design. Subsequent chapters discuss superconductive machinery such as superconductive bearings and motors; rf superconducting devices; and future prospec

  9. Applied Stratigraphy

    Lucas, Spencer G.

    Stratigraphy is a cornerstone of the Earth sciences. The study of layered rocks, especially their age determination and correlation, which are integral parts of stratigraphy, are key to fields as diverse as geoarchaeology and tectonics. In the Anglophile history of geology, in the early 1800s, the untutored English surveyor William Smith was the first practical stratigrapher, constructing a geological map of England based on his own applied stratigraphy. Smith has, thus, been seen as the first “industrial stratigrapher,” and practical applications of stratigraphy have since been essential to most of the extractive industries from mining to petroleum. Indeed, gasoline is in your automobile because of a tremendous use of applied stratigraphy in oil exploration, especially during the latter half of the twentieth century. Applied stratigraphy, thus, is a subject of broad interest to Earth scientists.

  10. Applied mathematics

    Logan, J David

    2013-01-01

    Praise for the Third Edition: "Future mathematicians, scientists, and engineers should find the book to be an excellent introductory text for coursework or self-study as well as worth its shelf space for reference." -MAA Reviews. Applied Mathematics, Fourth Edition is a thoroughly updated and revised edition on the applications of modeling and analyzing natural, social, and technological processes. The book covers a wide range of key topics in mathematical methods and modeling and highlights the connections between mathematics and the applied and natural sciences

  11. Applied mineralogy

    Park, W.C.; Hausen, D.M.; Hagni, R.D. (eds.)

    1985-01-01

    A conference on applied mineralogy was held and figures were presented under the following headings: methodology (including image analysis); ore genesis; exploration; beneficiations (including precious metals); process mineralogy - low and high temperatures; and medical science applications. Two papers have been abstracted separately.

  12. Micromachining with Nanostructured Cutting Tools

    Jackson, Mark J

    2013-01-01

    The purpose of the brief is to explain how nanostructured tools can be used to machine materials at the microscale.  The aims of the brief are to explain to readers how to apply nanostructured tools to micromachining applications. This book describes the application of nanostructured tools to machining engineering materials and includes methods for calculating basic features of micromachining. It explains the nature of contact between tools and work pieces to build a solid understanding of how nanostructured tools are made.

  13. Applied dynamics

    Schiehlen, Werner

    2014-01-01

    Applied Dynamics is an important branch of engineering mechanics widely applied to mechanical and automotive engineering, aerospace and biomechanics as well as control engineering and mechatronics. The computational methods presented are based on common fundamentals. For this purpose analytical mechanics turns out to be very useful where D’Alembert’s principle in the Lagrangian formulation proves to be most efficient. The method of multibody systems, finite element systems and continuous systems are treated consistently. Thus, students get a much better understanding of dynamical phenomena, and engineers in design and development departments using computer codes may check the results more easily by choosing models of different complexity for vibration and stress analysis.

  14. Applied optics

    The 1988 progress report, of the Applied Optics laboratory, of the (Polytechnic School, France), is presented. The optical fiber activities are focused on the development of an optical gyrometer, containing a resonance cavity. The following domains are included, in the research program: the infrared laser physics, the laser sources, the semiconductor physics, the multiple-photon ionization and the nonlinear optics. Investigations on the biomedical, the biological and biophysical domains are carried out. The published papers and the congress communications are listed

  15. Geometric reasoning about assembly tools

    Wilson, R.H.

    1997-01-01

    Planning for assembly requires reasoning about various tools used by humans, robots, or other automation to manipulate, attach, and test parts and subassemblies. This paper presents a general framework to represent and reason about geometric accessibility issues for a wide variety of such assembly tools. Central to the framework is a use volume encoding a minimum space that must be free in an assembly state to apply a given tool, and placement constraints on where that volume must be placed relative to the parts on which the tool acts. Determining whether a tool can be applied in a given assembly state is then reduced to an instance of the FINDPLACE problem. In addition, the author presents more efficient methods to integrate the framework into assembly planning. For tools that are applied either before or after their target parts are mated, one method pre-processes a single tool application for all possible states of assembly of a product in polynomial time, reducing all later state-tool queries to evaluations of a simple expression. For tools applied after their target parts are mated, a complementary method guarantees polynomial-time assembly planning. The author presents a wide variety of tools that can be described adequately using the approach, and surveys tool catalogs to determine coverage of standard tools. Finally, the author describes an implementation of the approach in an assembly planning system and experiments with a library of over one hundred manual and robotic tools and several complex assemblies.
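    A rough sketch of the central test described in the abstract, under strong simplifying assumptions: axis-aligned boxes stand in for the use volume and part geometry, and a single translation stands in for the placement constraints (the paper's FINDPLACE formulation is far more general). All names are illustrative, not the paper's API:

```python
# Hedged sketch of the applicability test described above: a tool can be
# applied in an assembly state if its use volume, positioned by the
# placement constraint, intersects no part already in the assembly.
# Axis-aligned boxes stand in for real geometry; names are illustrative.
from dataclasses import dataclass

@dataclass
class Box:                      # axis-aligned box: (min corner, max corner)
    lo: tuple
    hi: tuple

    def overlaps(self, other: "Box") -> bool:
        # boxes intersect iff they overlap on every axis
        return all(a < d and c < b
                   for a, b, c, d in zip(self.lo, self.hi, other.lo, other.hi))

def tool_applicable(use_volume: Box, offset, parts) -> bool:
    """True if the use volume, translated to its required placement,
    is free of every part in the current assembly state."""
    placed = Box(tuple(l + o for l, o in zip(use_volume.lo, offset)),
                 tuple(h + o for h, o in zip(use_volume.hi, offset)))
    return not any(placed.overlaps(p) for p in parts)

parts = [Box((0, 0, 0), (2, 2, 2))]            # occupied space
wrench_volume = Box((0, 0, 0), (1, 1, 3))      # space the tool needs
print(tool_applicable(wrench_volume, (3, 0, 0), parts))  # True: clear
print(tool_applicable(wrench_volume, (1, 0, 0), parts))  # False: collides
```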

  16. The Two-time Rise of Australian Competitive Sport in the 100 Years of the Olympic Games%百年奥运视角下澳大利亚竞技体育的二次崛起历程分析及启示

    浦义俊; 吴贻刚

    2014-01-01

    This study examined the process of the two-time rise of Australian competitive sport in the 100 years of the Olympic Games and the factors contributing to its success, using literature review, statistical and logical analysis. First, we conducted a stage division of the 100 years of Australian competitive sport. Then we looked at the event distribution of the two-time rise based on medal distribution. After comparing the first-time rise and the second-time rise in terms of internal and external environment, we focused on the elements that contributed to the second-time rise of Australian competitive sport. These elements include the alternation of the ruling party and change in administrative conceptions, the federal government's policy design and financial support for the development of competitive sport, the maturing management system of competitive sport, and the agreement between Australia's national character and sport. The study concluded with inspirations that can be applied in China's sport context.

  17. Applied Literature for Healing,

    Susanna Marie Anderson

    2014-11-01

    In this qualitative research study, interviews conducted with elite participants serve to reveal the underlying elements that unite the richly diverse emerging field of Applied Literature. The basic interpretative qualitative method included a thematic analysis of data from the interviews, yielding numerous common elements that were then distilled into key themes that elucidated the beneficial effects of engaging consciously with literature. These themes included developing a stronger sense of self in balance with an increasing connection with community; providing a safe container to engage challenging and potentially overwhelming issues from a stance of empowered action; and fostering a healing space for creativity. The findings provide grounds for uniting the work being done in a range of helping professions into a cohesive field of Applied Literature, which offers effective tools for healing, transformation and empowerment. Keywords: Applied Literature, Bibliotherapy, Poetry Therapy, Arts in Corrections, Arts in Medicine

  18. Applied geodesy

    This volume is based on the proceedings of the CERN Accelerator School's course on Applied Geodesy for Particle Accelerators held in April 1986. The purpose was to record and disseminate the knowledge gained in recent years on the geodesy of accelerators and other large systems. The latest methods for positioning equipment to sub-millimetric accuracy in deep underground tunnels several tens of kilometers long are described, as well as such sophisticated techniques as the Navstar Global Positioning System and the Terrameter. Automation of better known instruments such as the gyroscope and Distinvar is also treated along with the highly evolved treatment of components in a modern accelerator. Use of the methods described can be of great benefit in many areas of research and industrial geodesy such as surveying, nautical and aeronautical engineering, astronomical radio-interferometry, metrology of large components, deformation studies, etc

  19. Applied mathematics

    The 1988 progress report of the Applied Mathematics center (Polytechnic School, France), is presented. The research fields of the Center are the scientific calculus, the probabilities and statistics and the video image synthesis. The research topics developed are: the analysis of numerical methods, the mathematical analysis of the physics and mechanics fundamental models, the numerical solution of complex models related to the industrial problems, the stochastic calculus and the brownian movement, the stochastic partial differential equations, the identification of the adaptive filtering parameters, the discrete element systems, statistics, the stochastic control and the development, the image synthesis techniques for education and research programs. The published papers, the congress communications and the thesis are listed

  20. Applying radiation

    The invention discloses a method and apparatus for applying radiation by producing X-rays of a selected spectrum and intensity and directing them to a desired location. Radiant energy is directed from a laser onto a target to produce such X-rays at the target, which is so positioned adjacent to the desired location as to emit the X-rays toward the desired location; or such X-rays are produced in a region away from the desired location, and are channeled to the desired location. The radiant energy directing means may be shaped (as with bends; adjustable, if desired) to circumvent any obstruction between the laser and the target. Similarly, the X-ray channeling means may be shaped (as with fixed or adjustable bends) to circumvent any obstruction between the region where the X-rays are produced and the desired location. For producing a radiograph in a living organism the X-rays are provided in a short pulse to avoid any blurring of the radiograph from movement of or in the organism. For altering tissue in a living organism the selected spectrum and intensity are such as to affect substantially the tissue in a preselected volume without injuring nearby tissue. Typically, the selected spectrum comprises the range of about 0.1 to 100 keV, and the intensity is selected to provide about 100 to 1000 rads at the desired location. The X-rays may be produced by stimulated emission thereof, typically in a single direction

  1. Fastener starter tool

    Chandler, Faith T. (Inventor); Valentino, William D. (Inventor); Garton, Harry L. (Inventor); Arnett, Michael C. (Inventor)

    2003-01-01

    A fastener starter tool includes a number of spring retention fingers for retaining a small part, or combination of parts. The tool has an inner housing, which holds the spring retention fingers, a hand grip, and an outer housing configured to slide over the inner housing and the spring retention fingers toward and away from the hand grip, exposing and opening, or respectively, covering and closing, the spring retention fingers. By sliding the outer housing toward (away from) the hand grip, a part can be released from (retained by) the tool. The tool may include replaceable inserts, for retaining parts, such as screws, and configured to limit the torque applied to the part, to prevent cross threading. The inner housing has means to transfer torque from the hand grip to the insert. The tool may include replaceable bits, the inner housing having means for transferring torque to the replaceable bit.

  2. Downhole tool adapted for telemetry

    Hall, David R.; Fox, Joe

    2010-12-14

    A cycleable downhole tool such as a Jar, a hydraulic hammer, and a shock absorber adapted for telemetry. This invention applies to other tools where the active components of the tool are displaced when the tool is rotationally or translationally cycled. The invention consists of inductive or contact transmission rings that are connected by an extensible conductor. The extensible conductor permits the transmission of the signal before, after, and during the cycling of the tool. The signal may be continuous or intermittent during cycling. The invention also applies to downhole tools that do not cycle, but in operation are under such stress that an extensible conductor is beneficial. The extensible conductor may also consist of an extensible portion and a fixed portion. The extensible conductor also features clamps that maintain the conductor under stresses greater than that seen by the tool, and seals that are capable of protecting against downhole pressure and contamination.

  3. Preparing for the future today: The findings of the NEA RK and M initiative for the short term (This period covers several decades and likely more than 100 years. The actual duration will vary across national programmes)

    The NEA initiative on Records, Knowledge and Memory (RK and M) across Generations expresses, supports and aims to respond to an evolution in long-term radioactive waste management (RWM) thinking over the past decades. In the earlier days, the vision seems to have been that waste management ends with the closure of disposal sites. Oversight after closure was not an issue that was studied, it being (tacitly) assumed that geological repositories could safely be forgotten, or that archives, markers and other similar tools would suffice, e.g. to avoid human intrusion and/or to understand the nature of the underground facility. Today, it is recognised that oversight should take place for as long as practicable. The new vision includes the preservation of information to be used by future generations. In this paper we want to highlight that such a vision shift with regard to the future requires an accompanying shift with regard to present thinking and practices. To this aim, we outline some of the studies undertaken within the RK and M initiative that substantiate the finding that the future starts today, and offer suggestions to support its concretization. If societal oversight is to be maintained for as long as practicable, we should acknowledge the fact that RK and M loss takes place rapidly if it is not acted upon in a conscious and ongoing manner that involves various actors and does more than dumping records into archives. The success of RK and M preservation cannot be judged today by whether records will last for one or ten thousand years. Instead, it lies in establishing and maintaining awareness of the need and responsibility for RK and M preservation in the minds of regulators, operators, stakeholders and, especially in the longer term, the local and regional authorities and general public. Therefore, we should not only think about future activities, but act upon the idea that the long term starts today, and that RK and M preservation needs to be prepared for in the present, while the

  4. Visualisation tools

    E. Dupont proposed that visualisation tools should be extended to Nuclear Data (ND) Information Systems in order to cover all data (and formats), all users and all needs. In particular, these ND Information Systems could serve both as an interface between data and users and as an interface between data and codes (processing codes or nuclear reaction codes). It is expected that these systems will combine the advantages of processing codes and visualisation tools, as well as serving as a Tool Box to support various ND projects

  5. Applied ALARA techniques

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down and Hanford was given a new mission to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes clean-up of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes that are located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread, and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work

  6. TS Tools

    Yvette Linders

    2012-12-01

    In this installment of TS Tools, doctoral candidate Yvette Linders (Radboud University Nijmegen) shows how software for qualitative data analysis can be applied in research on literary criticism.

  7. 核电厂变形相关组件安全贮存工器具研发及应用%Safe Storage Tools Designed and Applied for Deformation Associated Core Components

    石中华; 邓志新; 张旭辉; 王玲彬

    2015-01-01

    Because the shape of a deformation associated core component has changed, it cannot be inserted into a fuel assembly or storage rack for storage and is generally placed temporarily in an empty storage cell of the spent fuel pool storage rack. In this state the deformation associated core component loses support: its rods bend under their own weight, and remaining in this state for a long time may damage the rods, so that the material inside leaks out and contaminates the spent fuel pool. Tools for the safe storage of such components are therefore required. Qinshan Phase II was selected as the test site for the whole development process, with three groups of deformation associated core components in its spent fuel pool (a primary neutron source assembly, a burnable poison assembly, and a rod cluster control assembly) as development objects. Ultimately a set of tools suitable for the safe storage of deformation associated core components was developed, ensuring the integrity of the components.

  8. Management Tools in Engineering Education.

    Fehr, M.

    1999-01-01

    Describes a teaching model that applies management tools such as delegation, total quality management, time management, teamwork, and Deming rules. Promotes the advantages of efficiency, reporting, independent scheduling, and quality. (SK)

  9. Qualification of the nuclear reactor core model DYN3D coupled to the thermohydraulic system code ATHLET, applied as an advanced tool for accident analysis of VVER-type reactors. Final report

    The nuclear reactor core model DYN3D with 3D neutron kinetics has been coupled to the thermohydraulic system code ATHLET. The report describes activities on the qualification of the coupled code complex ATHLET-DYN3D as a validated tool for the accident analysis of Russian VVER-type reactors. These include: contributions to the validation of the single codes ATHLET and DYN3D through the analysis of experiments on natural circulation behaviour in thermohydraulic test facilities and the solution of benchmark tasks on reactivity-initiated transients; the acquisition and evaluation of measurement data on transients in nuclear power plants, and the validation of ATHLET-DYN3D by calculating an accident with delayed scram and a pump trip in VVER plants; the complementary improvement of the code DYN3D by extension of the neutron-physical database, implementation of an improved coolant mixing model, and consideration of decay heat release and xenon transients; and the analysis of steam leak scenarios for VVER-440-type reactors with failure of different safety systems, investigating different model options. The analyses showed that, with realistic modelling of coolant mixing in the downcomer and the lower plenum, recriticality of the scrammed reactor due to overcooling can be reached. The application of the code complex ATHLET-DYN3D in the Czech Republic, Bulgaria and Ukraine has been started. Future work comprises the verification of ATHLET-DYN3D with a DYN3D version for the square fuel element geometry of Western PWRs. (orig.)

  10. Alternative affinity tools: more attractive than antibodies?

    Ruigrok, V.J.B.; Levisson, M.; Eppink, M.H.M.; Smidt, H.; Oost, van der J.

    2011-01-01

    Antibodies are the most successful affinity tools used today, in both fundamental and applied research (diagnostics, purification and therapeutics). Nonetheless, antibodies do have their limitations, including high production costs and low stability. Alternative affinity tools based on nucleic acids

  11. Built Environment Analysis Tool: April 2013

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. The documentation also provides guidance on how to apply the tool.

  12. Tool steels

    Højerslev, C.

    2001-01-01

    On designing a tool steel, its composition and heat treatment parameters are chosen to provide a hardened and tempered martensitic matrix in which carbides are evenly distributed. In this condition the matrix has an optimum combination of hardness and toughness, the primary carbides provide resistance against abrasive wear, and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide-forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore, some steel types contain cobalt. Addition of alloying elements serves primarily two purposes: (i) to improve the hardenability and (ii) to provide harder and thermally more stable carbides than cementite. Assuming proper heat treatment, the properties of a tool steel depend on which alloying elements are added and their respective concentrations.

  13. Applied mechanics of solids

    Bower, Allan F

    2009-01-01

    Modern computer simulations make stress analysis easy. As they continue to replace classical mathematical methods of analysis, these software programs require users to have a solid understanding of the fundamental principles on which they are based. Develop Intuitive Ability to Identify and Avoid Physically Meaningless Predictions. Applied Mechanics of Solids is a powerful tool for understanding how to take advantage of these revolutionary computer advances in the field of solid mechanics. Beginning with a description of the physical and mathematical laws that govern deformation in solids, the text presents modern constitutive equations, as well as analytical and computational methods of stress analysis and fracture mechanics. It also addresses the nonlinear theory of deformable rods, membranes, plates, and shells, and solutions to important boundary and initial value problems in solid mechanics. The author uses the step-by-step manner of a blackboard lecture to explain problem-solving methods, often providing...

  14. Applied multivariate statistical analysis

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  15. Developing Resilient Children: After 100 Years of Montessori Education

    Drake, Meg

    2008-01-01

    In this millennium, educators are faced with a number of issues that Dr. Maria Montessori could not have predicted. Today, students are different from the children Dr. Montessori observed in her "Casa dei Bambini." They are influenced by technology in all its forms. Some suffer from medical problems such as complex food allergies, which wreak…

  16. Media Storytelling, Curriculum, and the Next 100 Years

    Lipschultz, Jeremy Harris

    2012-01-01

    Journalism as an academic field in the United States has frequently changed and grown through new professions and new industries coming under its umbrella (sometimes but not always driven by technological and/or economic changes) and academic developments such as cultural studies and media studies. But journalism is still rooted in good…

  17. Global change and water resources in the next 100 years

    Larsen, M. C.; Hirsch, R. M.

    2010-03-01

    We are in the midst of a continental-scale, multi-year experiment in the United States, in which we have not defined our testable hypotheses or set the duration and scope of the experiment, which poses major water-resources challenges for the 21st century. What are we doing? We are expanding population at three times the national growth rate in our most water-scarce region, the southwestern United States, where water stress is already great and modeling predicts decreased streamflow by the middle of this century. We are expanding irrigated agriculture from the west into the east, particularly to the southeastern states, where increased competition for ground and surface water has urban, agricultural, and environmental interests at odds, and increasingly, in court. We are expanding our consumption of pharmaceutical and personal care products to historic high levels and disposing them in surface and groundwater, through sewage treatment plants and individual septic systems. These substances are now detectable at very low concentrations and we have documented significant effects on aquatic species, particularly on fish reproduction function. We don’t yet know what effects on human health may emerge, nor do we know if we need to make large investments in water treatment systems, which were not designed to remove these substances. These are a few examples of our national-scale experiment. In addition to these water resources challenges, over which we have some control, climate change models indicate that precipitation and streamflow patterns will change in coming decades, with western mid-latitude North America generally drier. We have already documented trends in more rain and less snow in western mountains. This has large implications for water supply and storage, and groundwater recharge. We have documented earlier snowmelt peak spring runoff in northeastern and northwestern States, and western montane regions. Peak runoff is now about two weeks earlier than it was in the first half of the 20th century. Decreased summer runoff affects water supply for agriculture, domestic water supply, cooling needs for thermoelectric power generation, and ecosystem needs. In addition to the reduced volume of streamflow during warm summer months, less water results in elevated stream temperature, which also has significant effects on cooling of power generating facilities and on aquatic ecosystem needs. We are now required to include fish and other aquatic species in negotiation over how much water to leave in the river, rather than, as in the past, how much water we could remove from a river. Additionally, we must pay attention to the quality of that water, including its temperature. This is driven in the US by the Endangered Species Act and the Clean Water Act. Furthermore, we must now better understand and manage the whole hydrograph and the influence of hydrologic variability on aquatic ecosystems. Man has trimmed the tails off the probability distribution of flows. We need to understand how to put the tails back on but can’t do that without improved understanding of aquatic ecosystems. Sea level rise presents challenges for fresh water extraction from coastal aquifers as they are compromised by increased saline intrusion. A related problem faces users of ‘run-of-the-river’ water-supply intakes that are threatened by a salt front that migrates further upstream because of higher sea level. We face significant challenges with water infrastructure. The U.S. 
has among the highest quality drinking water in the world piped to our homes. However, our water and sewage treatment plants and water and sewer pipelines have not had adequate maintenance or investment for decades. The US Environmental Protection Agency estimates that there are up to 3.5M illnesses per year from recreational contact with sewage from sanitary sewer overflows. Infrastructure investment needs have been put at $5 trillion nationally. Global change and water resources challenges that we face this century include a combination of local and national management problems

  18. Technique for estimating depth of 100-year floods in Tennessee

    Gamble, Charles R.; Lewis, James G.

    1977-01-01

    Preface: A method is presented for estimating the depth of the 100-year flood in four hydrologic areas in Tennessee. Depths at 151 gaging stations on streams that were not significantly affected by man-made changes were related to basin characteristics by multiple regression techniques. Equations derived from the analysis can be used to estimate the depth of the 100-year flood if the size of the drainage basin is known.
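    The report's actual regression equations are not reproduced above; the sketch below only illustrates the general technique of relating flood depth to a basin characteristic by least squares on logarithms, with invented data:

```python
# Illustrative sketch only: fit a power-law relation depth = a * A**b
# between 100-year flood depth and drainage area by least squares on
# logarithms, as in regional regression studies. Data are invented;
# the report's actual Tennessee equations are not reproduced here.
import numpy as np

area_sq_mi = np.array([10., 50., 120., 400., 900.])   # drainage areas
depth_ft   = np.array([6.1, 9.8, 12.5, 17.0, 21.3])   # observed 100-yr depths

# polyfit on log-log data returns [slope, intercept] = [b, log(a)]
b, log_a = np.polyfit(np.log(area_sq_mi), np.log(depth_ft), 1)
a = np.exp(log_a)
print(f"depth ~ {a:.2f} * A^{b:.2f}")
print(f"estimate for A = 200 sq mi: {a * 200**b:.1f} ft")
```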

  19. Engineering and malaria control: learning from the past 100 years

    Konradsen, Flemming; van der Hoek, Wim; Amerasinghe, Felix P;

    2004-01-01

    Traditionally, engineering and environment-based interventions have contributed to the prevention of malaria in Asia. However, with the introduction of DDT and other potent insecticides, chemical control became the dominating strategy. The renewed interest in environmental-management-based approaches for the control of malaria vectors follows the rapid development of resistance by mosquitoes to the widely used insecticides, the increasing cost of developing new chemicals, logistical constraints involved in the implementation of residual-spraying programs, and the environmental concerns linked to insecticide use. Cases are discussed in the wider context of environment-based approaches for the control of malaria vectors, including their current relevance. Clearly, some of the interventions piloted and implemented early in the last century still have relevance today, but generally in a very site-specific manner and in

  20. The Journal de Radiologie is 100 years old

    In January 1914, the first edition of Le Journal de Radiologie et d'Electrologie, a monthly medical review, was published by Masson. It was organized by a committee of ten members, whose general secretary was J. Belot. The members of the committee were the pioneers of radiology in France at the time and remained leaders in the field for three decades. The relationships between the Journal and the Societe de Radiologie are obvious: J. Belot was president of the Society and remained so until 1920; G. Haret, a committee member, was the general secretary of the Society from 1909 and remained so until 1928. The Journal at the time did not claim to be national and all of its committee members were Parisian. There was, nonetheless, an impressive list of contributors from throughout France and from several foreign countries. The table of contents of this first edition clearly shows the many skills of these first-generation radiologists, who saw themselves not only as radio-diagnosticians, radiotherapists, and electrologists, but also as physicians. From this first edition, the ambitions of the Journal's chiefs were clear: unquestionable competence, the need for research, and the importance of innovation. Certainly the founders of this new journal saw themselves as the masters of French radiology, but they were nevertheless wide open to the world. The Journal reported several European meetings, book reviews and articles, particularly from Germany and Austria, but also from England, Belgium, Italy, Cuba, Egypt, the Philippines and the United States. America, however, was very far from being preeminent in 1914. French radiology books also began to be published. Radiology was everywhere. Above all, however, there was the advertising, promoting the products of innumerable companies producing radiology instrumentation; in the following decades, these merged and concentrated until the French radiology industry had almost completely disappeared after a merger with an American company. The dangers of X-rays were already well known in 1914 and there was a full understanding of the need for protection. Until August 1914, the Journal continued in this same spirit of quality and accuracy to examine all of the fields of radiology and electrology. The Journal supplements generally attracted considerable interest. Whereas the content of the Journal itself was purely scientific, the supplements were useful for radiologists at conferences and meetings and contained large numbers of classified advertisements and advertising. They therefore allowed the reader of the day to connect the radiology community to the neighboring world. From August 1914 to March 1915, the contribution of radiology to the Army Health Service was remarkable. The main subject remained battlefield radiology and very soon the decision was made to equip radiology vehicles. A new development was the automobile ambulance. In the navy, hospital boats equipped with radiology instruments were used to examine and treat the injured evacuated from Flandres or the Dardanelles. Collaboration between radiologists and surgeons, eye and hand, had therefore become unquestioned. Their respective roles were clearly described in the Journal de Radiologie articles. However, subjects other than battlefield radiology were not neglected. Radiology acquired legitimacy and its merits were no longer questioned.
It advanced enormously: new methods and new instruments were born and a new organization was developed, which radiologists took advantage of to defend their points of view

  1. Nigeria Anopheles vector database: an overview of 100 years' research.

    Patricia Nkem Okorie

    Anopheles mosquitoes are important vectors of malaria and lymphatic filariasis (LF), which are major public health diseases in Nigeria. Malaria is caused by infection with a protozoan parasite of the genus Plasmodium and LF by the parasitic worm Wuchereria bancrofti. Updating our knowledge of the Anopheles species is vital in planning and implementing evidence-based vector control programs. To present a comprehensive report on the spatial distribution and composition of these vectors, all published data available were collated into a database. Details recorded for each source were the locality, latitude/longitude, time/period of study, species, abundance, sampling/collection methods, morphological and molecular species identification methods, insecticide resistance status, including evidence of the kdr allele, and P. falciparum sporozoite rate and W. bancrofti microfilaria prevalence. This collation resulted in a total of 110 publications, encompassing 484,747 Anopheles mosquitoes in 632 spatially unique descriptions at 142 georeferenced locations being identified across Nigeria from 1900 to 2010. Overall, the vector species reported most often included An. gambiae complex (65.2%), An. funestus complex (17.3%), An. gambiae s.s. (6.5%), An. arabiensis (5.0%) and An. funestus s.s. (2.5%), with the molecular forms An. gambiae M and S identified at 120 locations. A variety of sampling/collection and species identification methods were used, with an increase in molecular techniques in recent decades. Insecticide resistance to pyrethroids and organochlorines was found in the main Anopheles species across 45 locations. Presence of P. falciparum and W. bancrofti varied between species, with the highest sporozoite rates found in An. gambiae s.s., An. funestus s.s. and An. moucheti, and the highest microfilaria prevalence in An. gambiae s.l., An. arabiensis, and An. gambiae s.s. This comprehensive geo-referenced database provides an essential baseline on Anopheles vectors and will be an important resource for malaria and LF vector control programmes in Nigeria.

  2. The FCS Body of Knowledge: Shaping the Next 100 Years

    Journal of Family and Consumer Sciences, 2010

    2010-01-01

    This article shares the Body of Knowledge (BOK) as articulated in the new "Accreditation Documents for Undergraduate Programs in Family and Consumer Sciences" (2010). The purpose of sharing the BOK is to enhance awareness of the current knowledge base of family and consumer sciences (FCS), whether for new or lifelong AAFCS members, those exploring…

  3. Spectroscopic Binaries: Towards the 100-Year Time Domain

    Griffin, R. F.

    2012-04-01

    Good measurements of visual binary stars (position angle and angular separation) have been made for nearly 200 years. Radial-velocity observers have exhibited less patience; when the orbital periods of late-type stars in the catalogue published in 1978 are sorted into bins half a logarithmic unit wide, the modal bin is the one with periods between 3 and 10 days. The same treatment of the writer's orbits shows the modal bin to be the one between 1000 and 3000 days. Of course the spectroscopists cannot quickly catch up the 200 years that the visual observers have been going, but many spectroscopic orbits with periods of decades, and a few of the order of a century, have been published. Technical developments have also been made in `visual' orbit determination, and orbits with periods of only a few days have been determined for certain `visual' binaries. In principle, therefore, the time domains of visual and spectroscopic binaries now largely overlap. Overlap is essential, as it is only by combining both techniques that orbits can be determined in three dimensions, as is necessary for the important objective of determining stellar masses accurately. Nevertheless the actual overlap-objects with accurate measurements by both techniques-remains disappointingly small. There have, however, been unforeseen benefits from the observation of spectroscopic binaries that have unconventionally long orbital periods, not a few of which have proved to be interesting and significant objects in their own right. It has also been shown that binary membership is more common than was once thought (orbits have even been determined for some of the IAU standard radial-velocity stars!); a recent study of the radial velocities of K giants that had been monitored for 45 years found a binary incidence of 30%, whereas a figure of 13.7% was given as recently as 2005 for a similar group.

  4. 100 years from the birth of Prof. Frantisek Behounek

    It was the scholarship in the Curie laboratory that directed Behounek's scientific interest to radioactivity and nuclear physics, and it was his participation in the Italia expedition to the North Pole that revealed his talent as a storyteller and writer. Behounek's role and significance as a scientist and as a writer of popular science books and romantic sci-fi stories for children are reviewed in this historical note. (author)

  5. 100-year anniversary of the First international seismological conference

    Kozák, Jan

    2001-01-01

    Roč. 45, č. 2 (2001), s. 200-209. ISSN 0039-3169. Institutional research plan: CEZ:AV0Z3012916. Keywords: International seismological conference * Strasbourg * 1901. Subject RIV: DC - Seismology, Volcanology, Earth Structure. Impact factor: 0.680, year: 2001

  6. 100 YEARS SINCE THE BIRTH OF MILTON FRIEDMAN

    Marek Loužek

    2012-01-01

    The paper deals with the economic theory of Milton Friedman. Its first part outlines the life of Milton Friedman. The second part examines his economic theories - “Essays in Positive Economics” (1953), “Studies in the Quantity Theory of Money“ (1956), “A Theory of the Consumption Function” (1957), “A Program for Monetary Stability” (1959), “A Monetary History of the United States 1897 to 1960” (1963), and “Price Theory” (1976). His Nobel Prize lecture and American Economic Association lecture...

  7. 100 YEARS OF AUDI%百年奥迪

    2012-01-01

    Since entering China in 1988, Audi has been ranked "China Auto Customer Service Satisfaction" champion and "China Auto Sales Satisfaction" champion ten times by J.D. Power, meanwhile becoming the first luxury car brand in the Chinese market to exceed one million units in cumulative sales. This success is due not only to the formidable technical background behind the idea of "Innovation through Technology," but also to the brand's planning and its positive interaction with local users. On March 21, the Audi art exhibition "Heartbeat Shanghai" (Xindong Shanghai) opened at the Shanghai Gallery of Art; its curator, Gu Zhenqing, was born in Shanghai,

  8. 100 Years of Alzheimer's disease (1906-2006).

    Lage, José Manuel Martínez

    2006-01-01

    As we commemorate the first centennial since Alzheimer's disease (AD) was first diagnosed, this article casts back into the past while also looking to the future. It reflects on the life of Alois Alzheimer (1864-1915) and the scientific work he undertook in describing the disorder suffered by Auguste D. from age 51 to 56 and the neuropathological findings revealed by her brain, reminding us of the origin of the eponym. It highlights how, throughout the 1960s, the true importance of AD as the major cause of late-life dementia ultimately came to light, and narrates the evolution of the concepts related to AD throughout the years and its recognition as a major public health problem. Finally, the article pays homage to the work done by the Alzheimer's Association and the research undertaken at the Alzheimer's Disease Centres within the framework of the National Institute on Aging (NIA) Program, briefly discussing the long road travelled in the fight against AD in the past 25 years and the scientific odyssey that we trust will result in finding a cure. PMID:17004362

  9. American Child Care: Lessons from the First 100 Years.

    Anderson, Susan D.

    Child care has been part of American culture for nearly a century. This paper takes a backward glance at the history of child care in the United States. During the industrial revolution, child care was disguised as child labor. As child labor laws were enacted, schooling became the focus of ideas about caring for groups of children. The idea of a…

  10. I. P. PAVLOV: 100 YEARS OF RESEARCH ON ASSOCIATIVE LEARNING

    GERMÁN GUTIÉRREZ

    2005-07-01

    A biographical summary of Ivan Pavlov is presented, emphasizing his academic formation and achievements, and his contributions to general science and psychology. His main findings on associative learning are described and three areas of current development in this area are discussed: the study of behavioral mechanisms, the study of neurobiological mechanisms, and the functional role of learning.

  11. 100 Years of the Quantum - the Glory and the Shame

    Wheeler, John Archibald

    2001-03-01

    What is the greatest mystery that stands out on the books of physics today? Every one of us will have a different answer. Some will say it is the structure of the elementary particles. Others ask in what form is the mass or the attraction that has held the universe together thus far against perpetual expansion. Others will think of the November 11, 1999 dedication of America's first 6 kilometer by 6 kilometer gravity wave detector and will ask what information it will bring us about events going on in the deep and secret places of the universe. My own hope is that we'll see in time to come the answer to a question now almost a century old, "How come the quantum?" Gregory Breit would have sympathized with the investigations at each of these fronts, and could surely have filled us in on what the pioneers thought and said about the question, "How come the quantum?" ...

  12. Calibration apparatus for a machine-tool

    The invention proposes a calibration apparatus for a machine-tool comprising a torque measuring device, in which the tool is driven by a motor whose supply current is proportional to the torque applied to the tool and can be controlled and measured, and a housing having an aperture through which the rotatable tool can pass. This device allows a torque to be applied to the tool and measured from the motor's supply current. The invention applies more particularly to the screwing machines used for mounting the core containment plates
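    A minimal sketch of the measurement principle, assuming a calibrated proportionality (torque constant) between motor supply current and applied torque; the constant and function names below are hypothetical:

```python
# Minimal sketch of the measurement principle above: for a motor whose
# torque is proportional to supply current, an applied torque can be
# read off as tau = k_t * I. The torque constant below is hypothetical
# and would come from calibrating the specific screwing machine.
K_T_NM_PER_A = 0.85          # hypothetical motor torque constant [N*m/A]

def torque_from_current(current_a: float) -> float:
    """Estimate the torque applied on the tool from the motor current."""
    return K_T_NM_PER_A * current_a

print(f"{torque_from_current(4.2):.2f} N*m at 4.2 A")
```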

  13. Engineering tools

    2010-01-01

    The aim of this report is to give an overview of the results of Work Package 5 “Engineering Tools”. In this work package, numerical tools have been developed for all relevant CHCP systems in the PolySMART demonstration projects (WP3). First, existing simulation platforms are described and their specific characteristics identified. Several different simulation platforms are in principle appropriate for the needs of the PolySMART project. The result is an evaluation of available simulat...

  14. Applied Estimation of Mobile Environments

    Weekly, Kevin Pu

    2014-01-01

    For many research problems, controlling and estimating the position of the mobile elements within an environment is desired. Realistic mobile environments are unstructured, but share a set of common features, such as position, speed, and constraints on mobility. To estimate within these real-world environments requires careful selection of the best-suited estimation tools and software and hardware technologies. This dissertation discusses the design and implementation of applied estimation...

  15. Tool Gear: Infrastructure for Parallel Tools

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  16. RSP Tooling Technology

    None

    2001-11-20

    RSP Tooling™ is a spray forming technology tailored for producing molds and dies. The approach combines rapid solidification processing and net-shape materials processing in a single step. The general concept involves converting a mold design described by a CAD file to a tooling master using a suitable rapid prototyping (RP) technology such as stereolithography. A pattern transfer is made to a castable ceramic, typically alumina or fused silica (Figure 1). This is followed by spray forming a thick deposit of a tooling alloy on the pattern to capture the desired shape, surface texture, and detail. The resultant metal block is cooled to room temperature and separated from the pattern. The deposit's exterior walls are machined square, allowing it to be used as an insert in a standard mold base. The overall turnaround time for tooling is about 3 to 5 days, starting with a master. Molds and dies produced in this way have been used in high-volume production runs in plastic injection molding and die casting. A Cooperative Research and Development Agreement (CRADA) between the Idaho National Engineering and Environmental Laboratory (INEEL) and Grupo Vitro has been established to evaluate the feasibility of using RSP Tooling technology for producing molds and dies of interest to Vitro. This report summarizes results from Phase I of this agreement, and describes the work scope and budget for Phase II activities. The main objective in Phase I was to demonstrate the feasibility of applying the Rapid Solidification Process (RSP) Tooling method to produce molds for the manufacture of glass and other components of interest to Vitro. This objective was successfully achieved.

  17. Development of applied genomics tools for cucumber breeding

    The past three years have witnessed rapid accumulation of whole genome sequences and other genomics resources in cucumber. So far, draft genomes of three cucumber inbred lines have been released; many cucumber lines are being re-sequenced using next-generation sequencing technologies; nearly three m...

  18. Scanning Probe Microscopy as a Tool Applied to Agriculture

    Leite, Fabio Lima; Manzoli, Alexandra; de Herrmann, Paulo Sérgio Paula; Oliveira, Osvaldo Novais; Mattoso, Luiz Henrique Capparelli

    The control of materials properties and processes at the molecular level inherent in nanotechnology has been exploited in many areas of science and technology, including agriculture where nanotech methods are used in release of herbicides and monitoring of food quality and environmental impact. Atomic force microscopy (AFM) and related techniques are among the most employed nanotech methods, particularly with the possibility of direct measurements of intermolecular interactions. This chapter presents a brief review of the applications of AFM in agriculture that may be categorized into four main topics, namely thin films, research on nanomaterials and nanostructures, biological systems and natural fibers, and soils science. Examples of recent applications will be provided to give the reader a sense of the power of the technique and potential contributions to agriculture.

  19. Applying mathematical finance tools to the competitive Nordic electricity market

    Vehviläinen, Iivo

    2004-01-01

    This thesis models competitive electricity markets using the methods of mathematical finance. Fundamental problems of finance are market price modelling, derivative pricing, and optimal portfolio selection. The same questions arise in competitive electricity markets. The thesis presents an electricity spot price model based on the fundamental stochastic factors that affect electricity prices. The resulting price model has sound economic foundations, is able to explain spot market price movements, and offers a computationally efficient way of simulating spot prices.

  20. Spectroscopic Tools Applied to Element Z = 115 Decay Chains

    Forsberg U.

    2014-03-01

    Nuclides that are considered to be isotopes of element Z = 115 were produced in the reaction 48Ca + 243Am at the GSI Helmholtzzentrum für Schwerionenforschung Darmstadt. The detector setup TASISpec was used. It was mounted behind the gas-filled separator TASCA. Thirty correlated α-decay chains were found, and the energies of the particles were determined with high precision. Two important spectroscopic aspects of the offline data analysis are discussed in detail: the handling of digitized preamplified signals from the silicon strip detectors, and the energy reconstruction of particles escaping to upstream detectors relying on pixel-by-pixel dead-layer thicknesses.
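    A hedged sketch of the energy-reconstruction idea for escaping particles, with hypothetical pixel dead-layer thicknesses and a toy constant stopping power standing in for a proper dE/dx treatment:

```python
# Hedged sketch of the reconstruction idea above: an alpha particle that
# escapes into an upstream detector loses energy in the dead layers it
# crosses, so the full energy is the measured energy plus the dead-layer
# loss. Thicknesses and the constant stopping power are hypothetical;
# the real analysis uses pixel-by-pixel thicknesses and a proper dE/dx.
DEAD_LAYER_UM = {(12, 34): 0.6, (12, 35): 0.7}   # pixel -> thickness [um]
STOPPING_KEV_PER_UM = 130.0                       # toy dE/dx for MeV alphas in Si

def reconstruct_energy(e_measured_kev: float, pixel, path_factor: float = 1.0):
    """Add back the estimated dead-layer loss for an escaping particle.
    path_factor > 1 accounts for a slanted trajectory through the layer."""
    loss = DEAD_LAYER_UM[pixel] * path_factor * STOPPING_KEV_PER_UM
    return e_measured_kev + loss

print(f"{reconstruct_energy(9820.0, (12, 34), path_factor=1.3):.0f} keV")
```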

  1. Database Constraints Applied to Metabolic Pathway Reconstruction Tools

    Jordi Vilaplana

    2014-01-01

    Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented by a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives an acceptable performance for low or medium size databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes.
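    As a generic illustration of the before/after measurements behind such tuning studies (not the authors' MySQL/HBase setup, and with an index change standing in for server-parameter tuning), the sketch below times a lookup in an in-memory SQLite database before and after a structural change:

```python
# Generic illustration (not the paper's MySQL/HBase setup): the kind of
# before/after timing used to judge whether a database change pays off.
# Here the "tuning" is simply adding an index in an in-memory SQLite DB.
import sqlite3
import time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE genes (organism TEXT, name TEXT)")
db.executemany("INSERT INTO genes VALUES (?, ?)",
               ((f"org{i % 500}", f"gene{i}") for i in range(200_000)))

def timed_lookup() -> float:
    t0 = time.perf_counter()
    db.execute("SELECT COUNT(*) FROM genes WHERE organism = 'org42'").fetchone()
    return time.perf_counter() - t0

before = timed_lookup()                          # full table scan
db.execute("CREATE INDEX idx_org ON genes(organism)")
after = timed_lookup()                           # index lookup
print(f"full scan: {before*1e3:.1f} ms, with index: {after*1e3:.1f} ms")
```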

  2. Applying mathematical finance tools to the competitive Nordic electricity market

    This thesis models competitive electricity markets using the methods of mathematical finance. Fundamental problems of finance are market price modelling, derivative pricing, and optimal portfolio selection. The same questions arise in competitive electricity markets. The thesis presents an electricity spot price model based on the fundamental stochastic factors that affect electricity prices. The resulting price model has sound economic foundations, is able to explain spot market price movements, and offers a computationally efficient way of simulating spot prices. The thesis shows that the connection between spot prices and electricity forward prices is nontrivial because electricity is a commodity that must be consumed immediately. Consequently, forward prices of different times are based on the supply-demand conditions at those times. This thesis introduces a statistical model that captures the main characteristics of observed forward price movements. The thesis presents the pricing problems relating to the common Nordic electricity derivatives, as well as the pricing relations between electricity derivatives. The special characteristics of electricity make spot electricity market incomplete. The thesis assumes the existence of a risk-neutral martingale measure so that formal pricing results can be obtained. Some concepts introduced in financial markets are directly usable in the electricity markets. The risk management application in this thesis uses a static optimal portfolio selection framework where Monte Carlo simulation provides quantitative results. The application of mathematical finance requires careful consideration of the special characteristics of the electricity markets. Economic theory and reasoning have to be taken into account when constructing financial models in competitive electricity markets. (orig.)

  3. Indispensable tool

    Synchrotron radiation has become an indispensable research tool for a growing number of scientists in a seemingly ever-expanding number of disciplines. We can thank the European Synchrotron Radiation Facility (ESRF) in Grenoble for taking an innovative step toward achieving the educational goal of explaining the nature and benefits of synchrotron radiation to audiences ranging from the general public (including students) to government officials to scientists who may be unfamiliar with x-ray techniques and synchrotron radiation. ESRF is the driving force behind a new CD-ROM, playable on both PCs and Macs, titled Synchrotron light to explore matter. Published by Springer-Verlag, the CD contains both English and French versions of a comprehensive overview of the subject

  4. Cosmo-geo-anthropo-logical history and political and deep future events in climate and life evolution conveyed by a physical/virtual installation at a scale of 1 mm per 100 years across Denmark during the COP15 climate summit meeting.

    Holm Jacobsen, Bo

    2010-05-01

    During the COP15 climate summit meeting, a physical and virtual installation of time was performed at a linear scale of 1 mm per 100 years. The "track of time" was carefully anchored geographically so that highlights in time history coincided with landmarks of historical and cultural significance to both tourists and the local Danish population: Big Bang at the site of early royal settlements from the Viking age (13.7 billion years ~ 137 km from now), Earth origin at Kronborg in Elsinore (4.6 bil. years ~ 46 km), and fish going on land at The Little Mermaid (390 mil. years ~ 3900 m). The venue of the COP15 meeting coincided with the position of the severe global warming, driven by the steady increase of the solar constant, to be expected 600 million years into the future. Nested in this grand track of time were the Quaternary ice ages (2.6 mil. years ~ 26 m), human origin as a species (100,000 years ~ 1 m), and human history, life and the scope of political consequences of voting action (100 years ~ 1 mm). This installation of time involved several media. Highlights in time history and future were installed as a kml-file so that the convenient user interface of Google Earth could be utilized to provide both an overview of time and an understanding of the details and proportions of events in anthropo-geo-cosmo-history. Each Google Earth marker balloon gave short explanations and linked to "on location" video narratives. A classical printed text folder was prepared as a tour guide for those who wanted to actually walk the Phanerozoic (~5 km). Credit-card-shaped graphs of temperature, CO2 and sea-level development and scenarios were prepared to scale for the period 4000 BP to 1000 years into the future. Along the time line from "fish on land" to the present, 3900 chalk marks were placed on the street surface, one for every metre = the time span of Man as a species so far. A "NowGate" marking the present was implemented physically as a door frame, where citizens could meet and discuss time and political and
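
    The scale arithmetic is easy to verify: at 1 mm per 100 years, an event t years before present lies t/100 mm, i.e. t x 10^-5 m, from the "NowGate". A short sketch (the function name is mine):

        def distance_from_now_m(years_before_present):
            """Distance along the track in metres at 1 mm per 100 years."""
            return years_before_present / 100.0 / 1000.0  # mm -> m

        for label, years in [("Big Bang", 13.7e9), ("Earth origin", 4.6e9),
                             ("fish go on land", 390e6),
                             ("Quaternary ice ages", 2.6e6),
                             ("human origin", 1e5), ("one century", 100)]:
            print(f"{label:>20}: {distance_from_now_m(years):12,.3f} m")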

  5. Tools for Authentication

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.
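
    ROSE itself is a C/C++/Fortran compiler infrastructure whose API is not shown in this record; as a far simpler illustration of the same idea (flagging suspicious constructs from a parsed syntax tree rather than from raw text), the sketch below scans Python source with the standard ast module for calls to eval/exec/compile. The rule set is invented and is in no way a real authentication rule set.

        import ast

        SUSPICIOUS_CALLS = {"eval", "exec", "compile"}  # illustrative rules only

        def scan(source, filename="<input>"):
            """Yield (line, name) for each call to a suspicious builtin."""
            for node in ast.walk(ast.parse(source, filename)):
                if (isinstance(node, ast.Call)
                        and isinstance(node.func, ast.Name)
                        and node.func.id in SUSPICIOUS_CALLS):
                    yield node.lineno, node.func.id

        code = "x = eval(input())\nprint(x)\n"
        for line, name in scan(code):
            print(f"line {line}: suspicious call to {name}()")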

  6. Building energy analysis tool

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes: a building component library configured to store a plurality of building components; a modeling tool configured to access the building component library and create a building model of a building under analysis, using building spatial data and selected building components from the library; a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models; and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
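
    The patent-style description maps naturally onto a few cooperating objects. The sketch below is a hypothetical skeleton of that architecture; all class and method names are mine, and the "energy model" is reduced to a single annual-kWh number per component.

        from dataclasses import dataclass, field

        @dataclass
        class Component:
            name: str
            annual_kwh: float  # toy stand-in for a real energy model

        @dataclass
        class BuildingModel:
            components: list = field(default_factory=list)
            def baseline_kwh(self):
                return sum(c.annual_kwh for c in self.components)

        def apply_ecm(model, factor):
            """Return an 'optimized' model with every load scaled by `factor`."""
            return BuildingModel([Component(c.name, c.annual_kwh * factor)
                                  for c in model.components])

        def recommend(baseline, optimized):
            saving = baseline.baseline_kwh() - optimized.baseline_kwh()
            return f"estimated saving: {saving:.0f} kWh/yr"

        m = BuildingModel([Component("lighting", 12000), Component("HVAC", 45000)])
        print(recommend(m, apply_ecm(m, 0.85)))  # e.g. a retrofit at -15% load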

  7. What Metadata Principles Apply to Scientific Data?

    Mayernik, M. S.

    2014-12-01

    Information researchers and professionals based in the library and information science fields often approach their work through developing and applying defined sets of principles. For example, for over 100 years, the evolution of library cataloging practice has largely been driven by debates (which are still ongoing) about the fundamental principles of cataloging and how those principles should manifest in rules for cataloging. Similarly, the development of archival research and practices over the past century has proceeded hand-in-hand with the emergence of principles of archival arrangement and description, such as maintaining the original order of records and documenting provenance. This project examines principles related to the creation of metadata for scientific data. The presentation will outline: 1) how understandings and implementations of metadata can range broadly depending on the institutional context, and 2) how metadata principles developed by the library and information science community might apply to metadata developments for scientific data. The development and formalization of such principles would contribute to the development of metadata practices and standards in a wide range of institutions, including data repositories, libraries, and research centers. Shared metadata principles would potentially be useful in streamlining data discovery and integration, and would also benefit the growing efforts to formalize data curation education.

  8. Implementation of cutting tool management system

    G. Svinjarević

    2007-07-01

    Full Text Available Purpose: of this paper is to show the benefits of implementing cutting tool management in a company specializing in metal cutting, where production conditions allow new possibilities for improving the tool management. Design/methodology/approach: applied in this paper was the identification of the current state and exploitation conditions of cutting tools on lathes and milling machines, and of the organization of the departments and other services directly involved in the cutting tool management system. Findings: of the controlled tests and analyses in every phase of tool management, in the departments and other services directly involved in the tool management system, will help to reduce stock and costs. It is possible to identify which operator makes errors and is responsible for the inappropriate use of a cutting tool. Some disadvantages have been identified and a few suggestions for improving the tool management system have been given. The results of the research are easy to apply in a company with a developed informatics infrastructure and are mostly of interest to CNC workshops. Small companies and specialized low-volume productions have to make an additional effort to integrate in clusters. Practical implications: are the reduction of cutting tools in stock, a reduction in staffing, quick access to the necessary cutting tools and data, and simplicity in tool ordering and supply. Most important is the possibility to monitor and identify which cutting tools and employees are the weakest parts of the chain in the tool management system. Management activity should be foreseeable in all its segments, which includes both the appropriate choice and use of cutting tools and the monitoring of unwanted phenomena during the cutting process, with these data used for further purchases of tools. Originality/value: in the paper, a turnover methodology is applied for determining management efficacy and the formation of employees from different departments in

  9. CD-SEM tool stability and tool-to-tool matching management using image sharpness monitor

    Abe, Hideaki; Ishibashi, Yasuhiko; Yamazaki, Yuichiro; Kono, Akemi; Maeda, Tatsuya; Miura, Akihiro; Koshihara, Shunsuke; Hibino, Daisuke

    2009-03-01

    As device feature sizes continue to shrink, the requirements for Critical Dimension (CD) metrology tools are becoming stricter. For the sub-32 nm node, it is important to establish a CD-SEM tool management system with higher sensitivity to tool fluctuation and short turnaround time (TAT). We have developed a new image sharpness monitoring method, PG monitor. The key feature of this monitoring method is the quantification of tool-induced image sharpness deterioration. The image sharpness index is calculated by a convolution method from an image sharpness deterioration function determined by the SEM optics. The sensitivity of this methodology was tested by altering the beam diameter using astigmatism. The PG monitor result can be related to the beam diameter variation that causes CD variation through image sharpness. PG monitor can detect slight image sharpness changes that cannot be noticed by an engineer's visual check. Furthermore, PG monitor was applied to tool matching and long-term stability monitoring for multiple tools. As a result, PG monitor was found to have sufficient sensitivity to CD variation in tool matching and long-term stability assessment. The investigation showed that PG monitor can detect CD variation equivalent to ~0.1 nm. The CD-SEM tool management system using PG monitor is effective for CD metrology in production.
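
    The record does not disclose the exact convolution used by PG monitor; as a generic stand-in for quantifying image sharpness, the sketch below computes a Tenengrad-style index (mean squared gradient magnitude) that drops as a toy SEM-like image is blurred. It illustrates the concept only and is not the authors' algorithm.

        import numpy as np

        def sharpness_index(img):
            """Tenengrad-style index: mean squared gradient magnitude."""
            gy, gx = np.gradient(img.astype(float))
            return float(np.mean(gx**2 + gy**2))

        def blur(img, passes=1):
            """Crude box blur by repeated 4-neighbour averaging."""
            out = img.astype(float)
            for _ in range(passes):
                out = (out + np.roll(out, 1, 0) + np.roll(out, -1, 0)
                           + np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 5.0
            return out

        rng = np.random.default_rng(0)
        img = (rng.random((128, 128)) > 0.5).astype(float)  # toy test pattern
        for p in range(3):
            print(f"blur passes={p}: index={sharpness_index(blur(img, p)):.4f}")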

  10. Software Engineering applied to Manufacturing Problems

    Jorge A. Ruiz-Vanoye

    2010-05-01

    Full Text Available Optimization approaches have traditionally been viewed as tools for solving manufacturing problems; however, the optimization approach is not suitable for many problems arising in modern manufacturing systems due to their complexity and the involvement of qualitative factors. In this paper we use a software engineering tool applied to manufacturing problems: the HeuristicLab software, with which we determine and analyze the solutions obtained for manufacturing problems.

  11. ECONOMETRIC TOOLS OF CONTROLLING

    Orlov A. I.

    2015-03-01

    Full Text Available Econometrics is one of the most effective mathematical tools of controlling. The article deals with general problems of applying econometric methods to solving problems of controlling. Econometric methods are primarily the statistical analysis of concrete economic data, naturally with the help of computers. In our country they are still relatively little known, even though we have the most powerful scientific school in the foundations of econometrics - probability theory. The article shows that solving the problems of controlling requires applying econometric methods. Classification of econometric tools can be carried out on various grounds: by method, by type of data, by task, etc. The mass introduction of software products that include modern tools for the econometric analysis of concrete economic data can be regarded as one of the most effective ways to accelerate scientific and technological progress. The whole arsenal of currently used econometric and statistical techniques (methods) can be divided into three streams: high econometric (statistical) technologies; classical econometric (statistical) technologies; and low (inadequate, obsolete) econometric (statistical) technologies. The main problem of modern econometrics is to ensure that concrete econometric and statistical studies use only the first two types of technology. To get a broader picture of the use of econometric methods in the management of a production organization, we analyze the basic textbook "Organization and Planning of Engineering Production (Production Management)", prepared by the Department of Economics and Organization of Production of the Bauman Moscow State Technical University. It uses econometric methods and models more than 20 times, which testifies to the effectiveness of econometrics as a manager's tool.

  12. Essays in applied economics

    Arano, Kathleen

    Three independent studies in applied economics are presented. The first essay looks at the US natural gas industrial sector and estimates welfare effects associated with the changes in natural gas regulatory policy over the past three decades. Using a disequilibrium model suited to the natural gas industry, welfare transfers and deadweight losses are calculated. Results indicate that deregulation policies, beginning with the NGPA of 1978, have caused the industry to become more responsive to market conditions. Over time, regulated prices converge toward the estimated equilibrium prices. As a result of this convergence, deadweight losses associated with regulation are also diminished. The second essay examines the discounted utility model (DU), the standard model used for intertemporal decision-making. Prior empirical studies challenge the descriptive validity of the model. This essay addresses the four main inconsistencies that have been raised: domain dependence, magnitude effects, time effects, and gain/loss asymmetries. These inconsistencies, however, may be the result of the implicit assumption of linear utility and not a failure of the DU model itself. In order to test this hypothesis, data was collected from in-class surveys of economics classes at Mississippi State University. A random effects model for panel data estimation which accounts for individual specific effects was then used to impute discount rates measured in terms of dollars and utility. All four inconsistencies were found to be present when the dollar measures were used. Using utility measures of the discount rate resolved the inconsistencies in some cases. The third essay brings together two perspectives in the study of religion and economics: modeling religious behavior using economic tools and variables, and modeling economic behavior using religious variables. A system of ordered probit equations is developed to simultaneously model religious activities and economic outcomes. Using data

  13. CoC GIS Tools (GIS Tool)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  14. Improving Forest Operations Management through Applied Research

    Mark Brown

    2011-09-01

    Full Text Available A great challenge of applied research is translating results into industry innovation. Increasingly, forest managers do not have the capacity to interpret research results but prefer to be presented with tools based on the research results that can be readily implemented. The Cooperative Research Centre (CRC for Forestry, based in Australia, has focused on delivering research results to industry partners in novel ways that can be easily applied in the field. This paper discusses six approaches taken by the CRC to help transfer applied research results to industry, including basic benchmarking curves for feller-bunchers, a toolbox for operational machine evaluation, a productivity model, a method to predict productivity with existing data, a guide for effective use of onboard computers and an optimised transportation planning tool. For each approach the paper will discuss how these approaches were developed and applied with industry collaboration.

  15. Radiotracers: Essential Nuclear Tools to Understand Oceans

    Radiotracer studies can be applied in floating or seabed tent structures called mesocosms. This valuable experimental tool allows natural environments to be studied under controlled conditions, combining the benefits of laboratory and field work.

  16. A View from Agricultural and Applied Economics

    Henry, Mark

    2000-01-01

    Is regional science too focused on abstract models, theorizing, and methodology, with weak links to policy and practice? Not from the perspective of the land grant university, where many applied economists with a regional science interest reside. The job of these applied economists is, in part, to translate the models and methods into tools for use in understanding regional development processes and for undertaking policy analysis at the regional level.

  17. Design of the ITER tokamak assembly tools

    ITER tokamak assembly is mainly composed of lower cryostat activities, sector sub-assembly, sector assembly, in-vessel activities and ex-vessel activities. The main tools for the sector sub-assembly procedures consist of an upending tool, a sector lifting tool, a vacuum vessel support and bracing tool, and a sector sub-assembly tool. The conceptual design of the assembly tools for the sector sub-assembly procedures is described herein. The basic structure of the upending tool has been developed under the assumption that upending is performed with the crane that will be installed in the Tokamak building. The sector lifting tool is designed to adjust the position of a sector so as to minimize the offset between the center of the Tokamak building crane and the center of gravity of the sector. The sector sub-assembly tool incorporates a special frame for the fine adjustment of position with 6 degrees of freedom. The design of the VV support and bracing tool for the four kinds of VV 40 deg. sectors has been developed. In addition, structural analyses of the upending tool and the sector sub-assembly tool have been performed using ANSYS, for an applied load equal to the dead weight multiplied by 3/4. The results of the structural analyses for these tools were below the allowable values.

  18. Nanotechnology tools in pharmaceutical R&D

    Kumar, Challa S. S. R.

    2010-01-01

    Nanotechnology is a new approach to problem solving and can be considered a collection of tools and ideas which can be applied in the pharmaceutical industry. Application of nanotechnology tools in pharmaceutical R&D is likely to move the industry from the 'blockbuster drug' model to 'personalized medicine'. There are compelling applications in the pharmaceutical industry where inexpensive nanotechnology tools can be utilized. The review explores the possibility of categorizing various nan...

  19. A Hybrid Pattern Recognition Architecture for Cutting Tool Condition Monitoring

    Fu, Pan; Hope, A. D.

    2008-01-01

    An intelligent tool condition monitoring system has been established. Tool wear classification is realized by applying a unique fuzzy neural hybrid pattern recognition system. On the basis of this investigation, the following conclusions can be made.

  1. Software Release Procedure and Tools

    Giammatteo, Gabriele; Frosini, Luca; Laskaris, Nikolas

    2015-01-01

    Deliverable D4.1 - "Software Release Procedures and Tools" aims to provide a detailed description of the procedures applied and tools used to manage releases of the gCube System within Work Package 4. gCube System is the software at the basis of all VREs applications, data management services and portals. Given the large size of the gCube system, its high degree of modularity and the number of developers involved in the implementation, a set of procedures that formalize and simplify the integ...

  2. Advances in Applied Mechanics

    2014-01-01

    Advances in Applied Mechanics draws together recent significant advances in various topics in applied mechanics. Published since 1948, Advances in Applied Mechanics aims to provide authoritative review articles on topics in the mechanical sciences, primarily of interest to scientists and engineers working in the various branches of mechanics, but also of interest to the many who use the results of investigations in mechanics in various application areas, such as aerospace, chemical, civil, en...

  3. Perspectives on Applied Ethics

    2007-01-01

    Applied ethics is a growing, interdisciplinary field dealing with ethical problems in different areas of society. It includes for instance social and political ethics, computer ethics, medical ethics, bioethics, environmental ethics, business ethics, and it also relates to different forms of professional ethics. From the perspective of ethics, applied ethics is a specialisation in one area of ethics. From the perspective of social practice applying ethics is to focus on ethical aspects and ...

  4. Applied Neuroscience Laboratory Complex

    Federal Laboratory Consortium — Located at WPAFB, Ohio, the Applied Neuroscience lab researches and develops technologies to optimize Airmen individual and team performance across all AF domains....

  5. Applied Linguistics: Brazilian Perspectives

    Cavalcanti, Marilda C.

    2004-01-01

    The aim of this paper is to present perspectives in Applied Linguistics (AL) against the background of a historical overview of the field in Brazil. I take the stance of looking at AL as a field of knowledge and as a professional area of research. This point of view directs my reflections towards research-based Applied Linguistics carried out from…

  6. Evaluating Tidal Marsh Sustainability in the Face of Sea-Level Rise: A Hybrid Modeling Approach Applied to San Francisco Bay

    Stralberg, Diana; Brennan, Matthew; Callaway, John C.; Wood, Julian K.; Schile, Lisa M.; Jongsomjit, Dennis; Kelly, Maggi; Parker, V. Thomas; Crooks, Stephen

    2011-01-01

    Background: Tidal marshes will be threatened by increasing rates of sea-level rise (SLR) over the next century. Managers seek guidance on whether existing and restored marshes will be resilient under a range of potential future conditions, and on prioritizing marsh restoration and conservation activities. Methodology: Building upon established models, we developed a hybrid approach that involves a mechanistic treatment of marsh accretion dynamics and incorporates spatial variation at a scale relevant for conservation and restoration decision-making. We applied this model to San Francisco Bay, using best-available elevation data and estimates of sediment supply and organic matter accumulation developed for 15 Bay subregions. Accretion models were run over 100 years for 70 combinations of starting elevation, mineral sediment, organic matter, and SLR assumptions. Results were applied spatially to evaluate eight Bay-wide climate change scenarios. Principal Findings: Model results indicated that under a high rate of SLR (1.65 m/century), short-term restoration of diked subtidal baylands to mid-marsh elevations (−0.2 m MHHW) could be achieved over the next century with sediment concentrations greater than 200 mg/L. However, suspended sediment concentrations greater than 300 mg/L would be required for 100-year mid-marsh sustainability (i.e., no elevation loss). Organic matter accumulation had minimal impacts on this threshold. Bay-wide projections of marsh habitat area varied substantially, depending primarily on SLR and sediment assumptions. Across all scenarios, however, the model projected a shift in the mix of intertidal habitats, with a loss of high marsh and gains in low marsh and mudflats. Conclusions/Significance: Results suggest a bleak prognosis for long-term natural tidal marsh sustainability under a high-SLR scenario. To minimize marsh loss, we recommend conserving adjacent uplands for marsh migration, redistributing dredged sediment to raise elevations, and
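
    A toy version of the scenario-grid idea (emphatically not the authors' calibrated accretion model): each year the marsh gains mineral accretion proportional to suspended sediment concentration (SSC) and inundation depth, plus a fixed organic term, and loses elevation to SLR. All coefficients are invented; only the qualitative behaviour mirrors the findings above.

        import numpy as np

        def marsh_trajectory(z0, ssc_mg_l, slr_m_per_century, years=100,
                             k_min=3.0e-4, organic=0.001):
            """Yearly marsh elevation (m relative to MHHW). Mineral accretion
            is k_min * SSC * inundation depth; it stops above MHHW."""
            z, out = z0, []
            for _ in range(years):
                depth = max(0.0, -z)                     # inundation below MHHW
                z += k_min * ssc_mg_l * depth + organic  # accretion (m/yr)
                z -= slr_m_per_century / 100.0           # sea-level rise
                out.append(z)
            return np.array(out)

        for ssc in (100, 200, 300):
            z = marsh_trajectory(z0=-0.2, ssc_mg_l=ssc, slr_m_per_century=1.65)
            print(f"SSC={ssc} mg/L -> elevation after 100 yr: {z[-1]:+.2f} m MHHW")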

  7. Navigating Towards Digital Tectonic Tools

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2006-01-01

    While architecture and technology can seem like opposites, the term tectonics deals with creating a meaningful relationship between the two. The aim of this paper is to investigate what a digital tectonic tool could be and what relationship with technology it should represent. An understanding of this relationship can help us not only to understand the conflicts in architecture and the building industry but also bring us further into a discussion of how architecture can use digital tools. The investigation is carried out firstly by approaching the subject theoretically through the term tectonics and by setting up a model of the values a tectonic tool should encompass. Secondly the ability and validity of the model are shown by applying it to a case study of Jørn Utzon's work on the Minor Hall in Sydney Opera House - for the sake of exemplification, the technical field focused on in this paper is room acoustics. Thirdly the relationship between...

  8. Complex reconfiguration - developing common tools

    Reconfiguring DOE sites, facilities, and laboratories to meet expected and evolving missions involves a number of disciplines and approaches formerly the preserve of private industry and defense contractors. This paper considers the process of identifying common tools for the various disciplines that can be exercised, assessed, and applied by team members to arrive at integrated solutions. The basic tools include systems, hardware, software, and procedures that can characterize a site/facility's environment to meet organizational goals and safeguards and security, ES&H, and waste requirements. Other tools, such as computer-driven inventory and auditing programs, can provide traceability of materials and product as they are processed and require added protection and control. This paper will also discuss the use of integrated teams in a number of high-technology enterprises that could be adopted by DOE in high-profile programs ranging from environmental remediation to weapons dismantling and arms control.

  9. PAT tools for fermentation processes

    Gernaey, Krist

    The publication of the Process Analytical Technology (PAT) guidance has been one of the most important milestones for pharmaceutical production during the past ten years. The ideas outlined in the PAT guidance are also applied in other industries, for example the fermentation industry. Process knowledge is central in PAT projects. This presentation therefore gives a brief overview of a number of PAT tools for collecting process knowledge on fermentation processes: - On-line sensors, where for example spectroscopic measurements are increasingly applied - Mechanistic models, which can be used to...

  10. Force feedback facilitates multisensory integration during robotic tool use

    Sengül, Ali; Rognini, Giulio; van Elk, Michiel; Aspell, Jane Elizabeth; Bleuler, Hannes; Blanke, Olaf

    2013-01-01

    The present study investigated the effects of force feedback in relation to tool use on the multisensory integration of visuo-tactile information. Participants learned to control a robotic tool through a surgical robotic interface. Following tool-use training, participants performed a crossmodal congruency task, by responding to tactile vibrations applied to their hands, while ignoring visual distractors superimposed on the robotic tools. In the first experiment it was found that tool-use tra...

  11. Aligators for Arrays (Tool Paper)

    Henzinger, Thomas A.; Hottelier, Thibaud; Kovács, Laura; Rybalchenko, Andrey

    This paper presents Aligators, a tool for the generation of universally quantified array invariants. Aligators leverages recurrence solving and algebraic techniques to carry out inductive reasoning over array content. The Aligators' loop extraction module allows treatment of multi-path loops by exploiting their commutativity and serializability properties. Our experience in applying Aligators on a collection of loops from open source software projects indicates the applicability of recurrence and algebraic solving techniques for reasoning about arrays.

  12. Mesothelioma Applied Research Foundation


  13. Applied Mathematics Seminar 1982

    This report contains the abstracts of the lectures delivered at the 1982 Applied Mathematics Seminar of the DPD/LCC/CNPq and the Colloquy on Applied Mathematics of LCC/CNPq. The Seminar comprised 36 conferences. Among these, 30 were presented by researchers associated with Brazilian institutions, 9 of them with the LCC/CNPq, and the other 6 were given by visiting lecturers distributed as follows: 4 from the USA, 1 from England and 1 from Venezuela. The 1981 Applied Mathematics Seminar was organized by Leon R. Sinay and Nelson do Valle Silva. The Colloquy on Applied Mathematics was held from October 1982 onward and was organized by Ricardo S. Kubrusly and Leon R. Sinay. (Author)

  14. Handbook of Applied Analysis

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.

  15. Applying contemporary statistical techniques

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible. * Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods * Covers the latest developments on multiple comparisons * Includes recent advanc

  16. Pre-Columbian monkey tools.

    Haslam, Michael; Luncz, Lydia V; Staff, Richard A; Bradshaw, Fiona; Ottoni, Eduardo B; Falótico, Tiago

    2016-07-11

    Stone tools reveal worldwide innovations in human behaviour over the past three million years [1]. However, the only archaeological report of pre-modern non-human animal tool use comes from three Western chimpanzee (Pan troglodytes verus) sites in Côte d'Ivoire, aged between 4.3 and 1.3 thousand years ago (kya) [2]. This anthropocentrism limits our comparative insight into the emergence and development of technology, weakening our evolutionary models [3]. Here, we apply archaeological techniques to a distinctive stone tool assemblage created by a non-human animal in the New World, the Brazilian bearded capuchin monkey (Sapajus libidinosus). Wild capuchins at Serra da Capivara National Park (SCNP) use stones to pound open defended food, including locally indigenous cashew nuts [4], and we demonstrate that this activity dates back at least 600 to 700 years. Capuchin stone hammers and anvils are therefore the oldest non-human tools known outside of Africa, opening up to scientific scrutiny questions on the origins and spread of tool use in New World monkeys, and the mechanisms - social, ecological and cognitive - that support primate technological evolution. PMID:27404235

  17. PSYCHOANALYSIS AS APPLIED AESTHETICS.

    Richmond, Stephen H

    2016-07-01

    The question of how to place psychoanalysis in relation to science has been debated since the beginning of psychoanalysis and continues to this day. The author argues that psychoanalysis is best viewed as a form of applied art (also termed applied aesthetics) in parallel to medicine as applied science. This postulate draws on a functional definition of modernity as involving the differentiation of the value spheres of science, art, and religion. The validity criteria for each of the value spheres are discussed. Freud is examined, drawing on Habermas, and seen to have erred by claiming that the psychoanalytic method is a form of science. Implications for clinical and metapsychological issues in psychoanalysis are discussed. PMID:27428582

  18. Applied chemical engineering thermodynamics

    Tassios, Dimitrios P

    1993-01-01

    Applied Chemical Engineering Thermodynamics provides the undergraduate and graduate student of chemical engineering with the basic knowledge, the methodology and the references needed to apply it in industrial practice. Thus, in addition to the classical topics of the laws of thermodynamics, pure component and mixture thermodynamic properties, and phase and chemical equilibria, the reader will find: - history of thermodynamics - energy conservation - intermolecular forces and molecular thermodynamics - cubic equations of state - statistical mechanics. A great number of calculated problems with solutions and an appendix with numerous tables of numbers of practical importance are extremely helpful for applied calculations. The computer programs on the included disk help the student to become familiar with the typical methods used in industry for volumetric and vapor-liquid equilibria calculations.

  19. Introduction to applied thermodynamics

    Helsdon, R M; Walker, G E

    1965-01-01

    Introduction to Applied Thermodynamics is an introductory text on applied thermodynamics and covers topics ranging from energy and temperature to reversibility and entropy, the first and second laws of thermodynamics, and the properties of ideal gases. Standard air cycles and the thermodynamic properties of pure substances are also discussed, together with gas compressors, combustion, and psychrometry. This volume is comprised of 16 chapters and begins with an overview of the concept of energy as well as the macroscopic and molecular approaches to thermodynamics. The following chapters focus o

  20. Applied statistics with SPSS

    Huizingh, Eelko K R E

    2007-01-01

    Accessibly written and easy to use, Applied Statistics Using SPSS is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. Based around the needs of undergraduate students embarking on their own research project, the text's self-help style is designed to boost the skills and confidence of those that will need to use SPSS in the course of doing their research project. The book is pedagogically well developed and contains many screen dumps and exercises, glossary terms and worked examples. Divided into two parts, Applied Statistics Using SPSS covers:

  1. Applied mathematics made simple

    Murphy, Patrick

    1982-01-01

    Applied Mathematics: Made Simple provides an elementary study of the three main branches of classical applied mathematics: statics, hydrostatics, and dynamics. The book begins with discussion of the concepts of mechanics, parallel forces and rigid bodies, kinematics, motion with uniform acceleration in a straight line, and Newton's law of motion. Separate chapters cover vector algebra and coplanar motion, relative motion, projectiles, friction, and rigid bodies in equilibrium under the action of coplanar forces. The final chapters deal with machines and hydrostatics. The standard and conte

  2. Retransmission Steganography Applied

    Mazurczyk, Wojciech; Szczypiorski, Krzysztof

    2010-01-01

    This paper presents experimental results of the implementation of network steganography method called RSTEG (Retransmission Steganography). The main idea of RSTEG is to not acknowledge a successfully received packet to intentionally invoke retransmission. The retransmitted packet carries a steganogram instead of user data in the payload field. RSTEG can be applied to many network protocols that utilize retransmissions. We present experimental results for RSTEG applied to TCP (Transmission Control Protocol) as TCP is the most popular network protocol which ensures reliable data transfer. The main aim of the performed experiments was to estimate RSTEG steganographic bandwidth and detectability by observing its influence on the network retransmission level.
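
    A toy event-level simulation of the idea (not the authors' TCP implementation): the receiver deliberately withholds the acknowledgement of every k-th segment, and the retransmission of that segment carries one covert byte instead of user data.

        def rsteg_transfer(user_data: bytes, steg: bytes, k: int = 4):
            """Simulate RSTEG on a perfect channel: every k-th segment is left
            unacknowledged on purpose; its retransmission carries a covert byte."""
            received, covert, pending = bytearray(), bytearray(), list(steg)
            for seq, byte in enumerate(user_data):
                received.append(byte)             # original segment arrives
                if pending and seq % k == k - 1:  # receiver withholds the ACK
                    covert.append(pending.pop(0)) # retransmission = steganogram
                # otherwise: ACK sent, no retransmission triggered
            return bytes(received), bytes(covert)

        data, secret = rsteg_transfer(b"the quick brown fox jumps over it", b"HI!")
        print(data)    # user data is delivered intact
        print(secret)  # covert bytes recovered from the retransmissions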

  3. LensTools: Weak Lensing computing tools

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.

  4. Applying Mathematical Processes (AMP)

    Kathotia, Vinay

    2011-01-01

    This article provides insights into the "Applying Mathematical Processes" resources, developed by the Nuffield Foundation. It features Nuffield AMP activities--and related ones from Bowland Maths--that were designed to support the teaching and assessment of key processes in mathematics--representing a situation mathematically, analysing,…

  5. Applied Behavior Analysis

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  6. Essays on Applied Microeconomics

    Mejia Mantilla, Carolina

    2013-01-01

    Each chapter of this dissertation studies a different question within the field of Applied Microeconomics. The first chapter examines the mid- and long-term effects of the 1998 Asian Crisis on the educational attainment of Indonesian children ages 6 to 18, at the time of the crisis. The effects are identified as deviations from a linear trend for…

  7. Applied singular integral equations

    Mandal, B N

    2011-01-01

    The book is devoted to varieties of linear singular integral equations, with special emphasis on their methods of solution. It introduces the singular integral equations and their applications to researchers as well as graduate students of this fascinating and growing branch of applied mathematics.

  8. Applied Statistics with SPSS

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  9. Taxonomic Evidence Applying Intelligent Information Taxonomic Evidence Applying Intelligent Information

    Félix Anibal Vallejos

    2005-12-01

    Full Text Available The Numeric Taxonomy aims to group operational taxonomic units (OTUs, or taxons or taxa) in clusters, using the so-called structure analysis by means of numeric methods. These clusters, which constitute families, are the purpose of this series of projects; they emerge from the structural analysis of the OTUs' phenotypical characteristics, exhibiting the relationships among the OTUs in terms of grades of similarity. The tools employed are (i) the Euclidean distance and (ii) nearest-neighbor techniques. Taxonomic evidence is thus gathered so as to quantify the similarity for each pair of OTUs (pair-group method) obtained from the basic data matrix, and in this way the significant concept of the spectrum of the OTUs is introduced, based on the states of their characters. A new taxonomic criterion is thereby formulated, and a new approach to Computational Taxonomy is presented, one that has already been employed with reference to Data Mining, applying Machine Learning techniques (in particular the C4.5 algorithm created by Quinlan) and assessing the degree of efficiency achieved by the TDIDT family's algorithms when generating valid models of the data in classification problems, with the Gain of Entropy through the Maximum Entropy Principle.
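
    A minimal sketch of the two tools named above: pairwise Euclidean distances over a basic data matrix and nearest-neighbor (single-linkage) grouping of the OTUs. The data matrix is invented; scipy is assumed to be available.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import pdist

        # Rows = OTUs, columns = states of phenotypic characters (invented).
        X = np.array([[1.0, 0.2, 3.1],
                      [1.1, 0.1, 3.0],
                      [4.0, 2.2, 0.5],
                      [3.9, 2.4, 0.4],
                      [2.5, 1.0, 1.8]])

        d = pdist(X, metric="euclidean")    # condensed pairwise distance matrix
        tree = linkage(d, method="single")  # nearest-neighbor (single linkage)
        labels = fcluster(tree, t=1.0, criterion="distance")
        print("cluster label per OTU:", labels)  # OTUs 0,1 group; OTUs 2,3 group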

  10. Sedimentary Records of Environmental Evolution During the Recent 100 Years in the Coastal Zone of Guangxi Province

    Xia Peng; Meng Xianwei; Li Zhen; Feng Aiping; Wang Xiangqin

    2012-01-01

    Six sediment cores were collected in 2007 from the coastal zone of Guangxi province. The temporal evolution of biogenic element (C, N and P) and metal (Hg, Cu, Pb, Zn, Cd, Cr and As) inputs was clearly recorded under high excess-210Pb-derived sedimentation rates (0.25-1.68 cm/yr), especially in the estuaries of the Qinjiang and Nanliu rivers (1.68 cm/yr and 0.70 cm/yr, respectively), which can be attributed to high rates of river sediment transport. Based on the vertical distributions of enrichment factors and excess fluxes, heavy metals and total phosphorus have been noticeably enriched over the recent 20 years, but they do not exceed the quality standard for marine sediment. The results indicate that natural inputs prevailed up to the early 1980s, except for Cu. After this period, the excess metal fluxes can be associated with the intensive use of phosphate fertilizers and the combustion of fossil fuels, which caused a slight enrichment. Total organic carbon, however, shows a decreasing trend toward the surface, which may be associated with the recent loss of mangrove forest to tidal flat reclamation. According to all indicators, the environmental evolution of the Guangxi coast during the recent 100 years can be divided into two stages: (1) before the early 1980s, characterized by relatively low heavy metal pollution and scarce eutrophication; and (2) after the early 1980s, when the concentrations of heavy metals and total phosphorus increased significantly, indicating anthropogenic inputs.
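
    The record does not state how the enrichment factors were computed; a common formulation, assuming normalization to a conservative reference element such as Al, is

        \mathrm{EF} = \frac{(C_\mathrm{M}/C_\mathrm{Al})_{\mathrm{sample}}}{(C_\mathrm{M}/C_\mathrm{Al})_{\mathrm{background}}}

    where EF values near 1 point to a natural (crustal) origin and values increasingly above 1 indicate anthropogenic enrichment.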

  11. Java Radar Analysis Tool

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  12. OOTW COST TOOLS

    HARTLEY, D.S.III; PACKARD, S.L.

    1998-09-01

    This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for OOTWs. It also recommends modifications to be included in future versions of COST, and the development of an OOTW mission planning tool to supply valid input for costing.

  13. Pro Tools HD

    Camou, Edouard

    2013-01-01

    An easy-to-follow guide for using Pro Tools HD 11 effectively. This book is ideal for anyone who already uses Pro Tools and wants to learn more, or is new to Pro Tools HD and wants to use it effectively in their own audio workstations.

  14. Software engineering tools.

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development. PMID:10131419

  15. Applied data mining for business and industry

    Giudici, Paolo

    2009-01-01

    The increasing availability of data in our current, information-overloaded society has led to the need for valid tools for its modelling and analysis. Data mining and applied statistical methods are the appropriate tools to extract knowledge from such data. This book provides an accessible introduction to data mining methods in a consistent and application-oriented statistical framework, using case studies drawn from real industry projects and highlighting the use of data mining methods in a variety of business applications. It introduces data mining methods and applications; covers classical and Bayesian multivariate statistical methodology as well as machine learning and computational data mining methods; includes many recent developments such as association and sequence rules, graphical Markov models, lifetime value modelling, credit risk, operational risk and web mining; features detailed case studies based on applied projects within industry; and incorporates discussion of data mining software, with case studies a...

  16. Applying the WEAP Model to Water Resource

    Gao, Jingjing; Christensen, Per; Li, Wei

    Water resources assessment is a tool to provide decision makers with an appropriate basis to make informed judgments regarding the objectives and targets to be addressed during the Strategic Environmental Assessment (SEA) process. The study shows how water resources assessment can be applied in SEA to assess the effects on water resources, using a case study on a Coal Industry Development Plan in an arid region in North Western China. In the case study, the WEAP model (Water Evaluation And Planning System) was used to simulate various scenarios using a diversity of technological instruments such as irrigation efficiency and the treatment and reuse of water. The WEAP model was applied to the Ordos catchment, where it was used for the first time in China. The changes in water resource utilization in the Ordos basin were assessed with the model. It was found that the WEAP model is a useful tool for water resource...
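
    WEAP itself is driven through its own interface; the sketch below is merely a toy annual water-balance comparison across scenarios of the kind the study describes. Every number and scenario name is invented.

        def unmet_demand(supply_mcm, demands_mcm, reuse_fraction=0.0):
            """Toy annual balance: treated wastewater reuse augments supply."""
            available = supply_mcm * (1.0 + reuse_fraction)
            total = sum(demands_mcm.values())
            return max(0.0, total - available)

        demands = {"coal industry": 380.0, "irrigation": 520.0, "municipal": 110.0}
        scenarios = {"reference":       (900.0, 0.00),
                     "water reuse":     (900.0, 0.10),
                     "high efficiency": (900.0, 0.10)}
        for name, (supply, reuse) in scenarios.items():
            d = dict(demands)
            if name == "high efficiency":
                d["irrigation"] *= 0.8  # improved irrigation efficiency
            gap = unmet_demand(supply, d, reuse)
            print(f"{name:>16}: unmet demand = {gap:.0f} MCM")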

  17. Fundamentals of applied probability and random processes

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  18. Applied Control Systems Design

    Mahmoud, Magdi S

    2012-01-01

    Applied Control System Design examines several methods for building up systems models based on real experimental data from typical industrial processes and incorporating system identification techniques. The text takes a comparative approach to the models derived in this way judging their suitability for use in different systems and under different operational circumstances. A broad spectrum of control methods including various forms of filtering, feedback and feedforward control is applied to the models and the guidelines derived from the closed-loop responses are then composed into a concrete self-tested recipe to serve as a check-list for industrial engineers or control designers. System identification and control design are given equal weight in model derivation and testing to reflect their equality of importance in the proper design and optimization of high-performance control systems. Readers’ assimilation of the material discussed is assisted by the provision of problems and examples. Most of these e...

  19. Applying Technology Roadmapping to Exploratory Forecasting

    Dmitry Belousov; Alexander Frolov; Irina Sukhareva

    2012-01-01

    Technology roadmaps are typically used in terms of normative approach to long-term S&T forecasting thus serving as a visual tool to identify optimal ways to reach defined future goals. Nonetheless roadmaps can be applied to exploratory studies as well. The latter are aimed at identifying key potentially transformative events, whose dynamics could be influenced by a wide range of externalities. Trends in some fields appear to dominate and shape movement in other areas. Roadmapping allows visua...

  20. Essays in Applied Econometrics

    Michèle A. Weynandt

    2014-01-01

    This thesis includes three essays in applied econometrics. The first and third chapters focus on labor market outcomes of minority group members, while the second focuses on education. Chapter 1 deals with the relationship between sexual orientation, gender, partnership, and labor outcomes. I suggest that if there are compensating differentials and a gender gap in potential wages, an income effect can lead partnered gay men to jobs with lower wages and higher amenities than partnered straight...

  1. Applied Economics in Teaching

    Zhu Hongping

    2009-01-01

    This paper explains some plain phenomena in teaching and class management with an economic view. Some basic economic principles mentioned therein are: everything has its opportunity cost; the marginal utility of consumption of any kind is diminishing; Game theory is everywhere. By applying the economic theories to teaching, it is of great help for teachers to understand the students' behavior and thus improve the teaching effectiveness and efficiency.

  2. Methods of applied mathematics

    Hildebrand, Francis B

    1992-01-01

    This invaluable book offers engineers and physicists working knowledge of a number of mathematical facts and techniques not commonly treated in courses in advanced calculus, but nevertheless extremely useful when applied to typical problems in many different fields. It deals principally with linear algebraic equations, quadratic and Hermitian forms, operations with vectors and matrices, the calculus of variations, and the formulations and theory of linear integral equations. Annotated problems and exercises accompany each chapter.

  3. Applied Logic in Engineering

    Spichkova, Maria

    2016-01-01

    Logic not only helps to solve complicated and safety-critical problems, but also disciplines the mind and helps to develop abstract thinking, which is very important for any area of Engineering. In this technical report, we present an overview of common challenges in teaching of formal methods and discuss our experiences from the course Applied Logic in Engineering. This course was taught at TU Munich, Germany, in Winter Semester 2012/2013.

  4. Applied longitudinal analysis

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." - Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  5. Essays on Applied Microeconomics

    Lee, Hoan Soo

    2013-01-01

    Empirical and theoretical topics in applied microeconomics are discussed in this dissertation. The first essay identifies and measures managerial advantages from access to high-quality deals in venture capital investments. The underlying social network of Harvard Business School MBA venture capitalists and entrepreneurs is used to proxy availability of deal access. Random section assignment of HBS MBA graduates provides a key exogenous variation for identification. Being socially connected to...

  6. Applied statistics for economists

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  7. Ludic Educational Game Creation Tool

    Vidakis, Nikolaos; Syntychakis, Efthimios; Kalafatis, Konstantinos;

    2015-01-01

    This paper presents initial findings and ongoing work on the game creation tool, a core component of the IOLAOS platform (Iolaos in ancient Greece was a divine hero famed for helping with some of Heracles's labors), a general open authorable framework for educational and training games. The game creation tool features a web editor, where the game narrative can be manipulated according to specific needs. Moreover, this tool is applied to creating an educational game according to a reference scenario, namely teaching schoolers road safety. A ludic approach is used both in game creation and play, employing ludic interfaces for both the game creator and the game player, as well as ludic game design.

  8. Machine tool structures

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  9. Pickering tool management system

    Tools were being deployed in the station with no process in effect to ensure that they were maintained in good repair so as to effectively support the performance of maintenance activities. Current legal requirements specify that all employers have a process in place to ensure that tools are maintained in a safe condition; this is set out in the Ontario Health and Safety Act. The Pickering Tool Management System has been chosen as the process at Pickering N.D. to manage tools. Tools are identified by number etching and bar codes. The system is a Windows application installed on several file servers.

  10. A Shape Optimization Study for Tool Design in Resistance Welding

    Bogomolny, Michael; Bendsøe, Martin P.; Hattel, Jesper Henri

    2009-01-01

    The purpose of this study is to apply shape optimization tools for the design of resistance welding electrodes. The numerical simulation of the welding process has been performed by a simplified FEM model implemented in COMSOL. The design process is formulated as an optimization problem where ... to simplify the calculation of shape sensitivities and to generate a generic tool that can be interfaced with other simulation tools. An example numerical study shows the potential of applying optimal design techniques in this area.
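
    The optimization loop described above can be sketched in a few lines. The following is a minimal illustration, assuming a stand-in analytic objective in place of the COMSOL FEM simulation; weld_quality() and its target values are hypothetical and not taken from the paper.

        # Hedged sketch: a bounded 1-D search over an electrode design
        # parameter, with a stand-in objective replacing the FEM solve.
        from scipy.optimize import minimize_scalar

        def weld_quality(radius_mm):
            """Hypothetical cost: deviation from a fictitious ideal nugget size."""
            nugget = 4.0 * radius_mm / (1.0 + radius_mm)  # assumed response curve
            return (nugget - 3.0) ** 2                    # target: 3 mm nugget

        result = minimize_scalar(weld_quality, bounds=(1.0, 10.0), method="bounded")
        print(f"optimal electrode radius: {result.x:.2f} mm")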

  11. Applied Research Study

    Leach, Ronald J.

    1997-01-01

    The purpose of this project was to study the feasibility of reusing major components of a software system that had been used to control the operations of a spacecraft launched in the 1980s. The study was done in the context of a ground data processing system that was to be rehosted from a large mainframe to an inexpensive workstation. The study concluded that a systematic approach using inexpensive tools could aid in the reengineering process by identifying a set of certified reusable components. The study also developed procedures for determining duplicate versions of software, which were created because of inadequate naming conventions. Such procedures reduced reengineering costs by approximately 19.4 percent.
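
    The report does not spell out its duplicate-detection procedures, but hashing file contents is one common way to flag duplicate versions of software; the sketch below illustrates that general idea and is an assumption, not the study's actual method.

        # Group files by a hash of their contents; groups with more than
        # one member are candidate duplicate versions.
        import hashlib
        from collections import defaultdict
        from pathlib import Path

        def find_duplicates(root):
            groups = defaultdict(list)
            for path in Path(root).rglob("*"):
                if path.is_file():
                    digest = hashlib.sha256(path.read_bytes()).hexdigest()
                    groups[digest].append(path)
            return [paths for paths in groups.values() if len(paths) > 1]

        for dupes in find_duplicates("."):
            print("duplicate set:", *dupes)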

  12. ENVIRONMENTAL MANAGEMENT TOOLS: INTERNATIONAL PRACTICES FOR RUSSIA

    Smetanina, T.; Pintassilgo, P.; Matias, A.

    2014-01-01

    This article deals with the basic tools of environmental management applied by developed countries and discusses their application to Russia. The focus is on environmental management instruments such as environmental taxes, subsidies, standards and permits, and also on the important role of voluntary tools. Russian practice is analyzed in terms of the current environmental management situation and the prospects of necessary legislative actions. The article refers to the formation of the basic parts...

  13. The process-based stand growth model Formix 3-Q applied in a GIS environment for growth and yield analysis in a tropical rain forest.

    Ditzer, T.; Glauner, R.; Förster, M.; Köhler, P.; Huth, A.

    2000-03-01

    Managing tropical rain forests is difficult because few long-term field data on forest growth and the impact of harvesting disturbance are available. Growth models may provide a valuable tool for managers of tropical forests, particularly if applied to the extended forest areas of up to 100,000 ha that typically constitute the so-called forest management units (FMUs). We used a stand growth model in a geographic information system (GIS) environment to simulate tropical rain forest growth at the FMU level. We applied the process-based rain forest growth model Formix 3-Q to the 55,000 ha Deramakot Forest Reserve (DFR) in Sabah, Malaysia. The FMU was considered to be composed of single and independent small-scale stands differing in site conditions and forest structure. Field data, which were analyzed with a GIS, comprised a terrestrial forest inventory, site and soil analyses (water, nutrients, slope), the interpretation of aerial photographs of the present vegetation and topographic maps. Different stand types were determined based on a classification of site quality (three classes), slopes (four classes), and present forest structure (four strata). The effects of site quality on tree allometry (height-diameter curve, biomass allometry, leaf area) and growth (increment size) are incorporated into Formix 3-Q. We derived allometric relations and growth factors for different site conditions from the field data. Climax forest structure at the stand level was shown to depend strongly on site conditions. Simulated successional pattern and climax structure were compared with field observations. Based on the current management plan for the DFR, harvesting scenarios were simulated for stands on different sites. The effects of harvesting guidelines on forest structure and the implications for sustainable forest management at Deramakot were analyzed. Based on the stand types and GIS analysis, we also simulated undisturbed regeneration of the logged-over forest in the DFR at
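
    As an illustration of the kind of allometric relation the study derives per site class, the sketch below fits a saturating height-diameter curve to synthetic data; the functional form and the numbers are assumptions, not the Formix 3-Q equations.

        # Fit h(d) = 1.3 + h_max * (1 - exp(-k*d)) to synthetic field data.
        import numpy as np
        from scipy.optimize import curve_fit

        def height(d, h_max, k):
            return 1.3 + h_max * (1.0 - np.exp(-k * d))  # 1.3 m = breast height

        d_cm = np.array([5, 10, 20, 40, 60, 80], dtype=float)
        h_m = np.array([7, 12, 20, 30, 35, 38], dtype=float)  # synthetic data

        (h_max, k), _ = curve_fit(height, d_cm, h_m, p0=(40.0, 0.03))
        print(f"h_max = {h_max:.1f} m, k = {k:.3f} per cm")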

  14. Miniaturised Spotter-Compatible Multicapillary Stamping Tool for Microarray Printing

    Drobyshev, A L; Zasedatelev, A S; Drobyshev, Alexei L; Verkhodanov, Nikolai N; Zasedatelev, Alexander S

    2007-01-01

    A novel microstamping tool for microarray printing is proposed. The tool is capable of spotting up to 127 droplets of different solutions in a single touch. It is easily compatible with commercially available microarray spotters. The tool is based on a multichannel funnel with polypropylene capillaries inserted into its channels. Superior flexibility is achieved by the ability to replace any printing capillary of the tool. As a practical implementation, hydrogel-based microarrays were stamped and successfully applied to identify Mycobacterium tuberculosis drug resistance.

  15. Recent Advances in Algal Genetic Tool Development

    Dahlin, Lukas R.; Guarnieri, Michael T.

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  16. Applied linear regression

    Weisberg, Sanford

    2005-01-01

    Master linear regression techniques with a new edition of a classic text. Reviews of the Second Edition: "I found it enjoyable reading and so full of interesting material that even the well-informed reader will probably find something new . . . a necessity for all of those who do linear regression." -Technometrics, February 1987. "Overall, I feel that the book is a valuable addition to the now considerable list of texts on applied linear regression. It should be a strong contender as the leading text for a first serious course in regression analysis." -American Scientist, May-June 1987

  17. Applied impulsive mathematical models

    Stamova, Ivanka

    2016-01-01

    Using the theory of impulsive differential equations, this book focuses on mathematical models which reflect current research in biology, population dynamics, neural networks and economics. The authors provide the basic background from the fundamental theory and give a systematic exposition of recent results related to the qualitative analysis of impulsive mathematical models. Consisting of six chapters, the book presents many applicable techniques, making them available in a single source easily accessible to researchers interested in mathematical models and their applications. Serving as a valuable reference, this text is addressed to a wide audience of professionals, including mathematicians, applied researchers and practitioners.

  18. Applied logistic regression

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

     A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-
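
    A minimal sketch of the model class the book treats - a dichotomous outcome regressed on a set of covariables - is shown below on synthetic data; scikit-learn is just one of several packages that fit it.

        # Simulate a binary outcome from two covariates, then fit
        # a logistic regression and recover the coefficients.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 2))                 # two covariates
        p = 1.0 / (1.0 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1])))
        y = rng.binomial(1, p)                        # dichotomous outcome

        model = LogisticRegression().fit(X, y)
        print("coefficients:", model.coef_, "intercept:", model.intercept_)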

  19. SIFT applied to CBIR

    ALMEIDA, J.

    2009-12-01

    Content-Based Image Retrieval (CBIR) is a challenging task. Common approaches use only low-level features. Notwithstanding, such CBIR solutions fail to capture some local features representing the details and nuances of scenes. Many techniques in image processing and computer vision can capture these scene semantics. Among them, the Scale Invariant Feature Transform (SIFT) has been widely used in many applications. This approach relies on the choice of several parameters which directly impact its effectiveness when applied to retrieve images. In this paper, we discuss the results obtained in several experiments proposed to evaluate the application of SIFT in CBIR tasks.
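
    The SIFT matching step of a CBIR pipeline can be sketched as below, assuming OpenCV 4.4 or later (where SIFT ships in the main package) and two hypothetical image files; the paper's actual parameter settings are not reproduced here.

        # Extract SIFT descriptors from two images and count good matches
        # using Lowe's ratio test; more good matches suggests more similar images.
        import cv2

        query = cv2.imread("query.jpg", cv2.IMREAD_GRAYSCALE)         # hypothetical paths
        candidate = cv2.imread("candidate.jpg", cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create(nfeatures=500)  # nfeatures is one tunable parameter
        kp1, des1 = sift.detectAndCompute(query, None)
        kp2, des2 = sift.detectAndCompute(candidate, None)

        matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
        good = [m for m, n in matches if m.distance < 0.75 * n.distance]
        print(f"{len(good)} good matches")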

  20. Applied linear regression

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." -International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus

  1. Applied energy an introduction

    Abdullah, Mohammad Omar

    2012-01-01

    Contents: Introduction to Applied Energy; General Introduction; Energy and Power Basics; Energy Equation; Energy Generation Systems; Energy Storage and Methods; Energy Efficiencies and Losses; Energy Industry and Energy Applications in Small-Medium Enterprises (SME) Industries; Energy Industry; Energy-Intensive Industry; Energy Applications in SME Energy Industries; Energy Sources and Supply; Energy Sources; Energy Supply and Energy Demand; Energy Flow Visualization and Sankey Diagram; Energy Management and Analysis; Energy Audits; Energy Use and Fuel Consumption Study; Energy Life-Cycle Analysis; Energy and Environment; Energy Pollutants, S

  2. Applied nonparametric statistical methods

    Sprent, Peter

    2007-01-01

    While preserving the clear, accessible style of previous editions, Applied Nonparametric Statistical Methods, Fourth Edition reflects the latest developments in computer-intensive methods that deal with intractable analytical problems and unwieldy data sets. Reorganized and with additional material, this edition begins with a brief summary of some relevant general statistical concepts and an introduction to basic ideas of nonparametric or distribution-free methods. Designed experiments, including those with factorial treatment structures, are now the focus of an entire chapter. The text also e
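
    As a taste of the distribution-free methods the book covers, the sketch below runs a Mann-Whitney U test on two synthetic samples; it compares the groups without assuming normality.

        # Compare two small synthetic samples without a normality assumption.
        from scipy.stats import mannwhitneyu

        group_a = [12.1, 14.3, 11.8, 15.0, 13.2]
        group_b = [16.4, 17.1, 15.8, 18.0, 16.9]

        stat, p = mannwhitneyu(group_a, group_b, alternative="two-sided")
        print(f"U = {stat}, p = {p:.4f}")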

  3. Applied Semantic Web Technologies

    Sugumaran, Vijayan

    2011-01-01

    The rapid advancement of semantic web technologies, along with the fact that they are at various levels of maturity, has left many practitioners confused about the current state of these technologies. Focusing on the most mature technologies, Applied Semantic Web Technologies integrates theory with case studies to illustrate the history, current state, and future direction of the semantic web. It maintains an emphasis on real-world applications and examines the technical and practical issues related to the use of semantic technologies in intelligent information management. The book starts with

  4. Applied complex variables

    Dettman, John W

    1965-01-01

    Analytic function theory is a traditional subject going back to Cauchy and Riemann in the 19th century. Once the exclusive province of advanced mathematics students, its applications have proven vital to today's physicists and engineers. In this highly regarded work, Professor John W. Dettman offers a clear, well-organized overview of the subject and various applications - making the often-perplexing study of analytic functions of complex variables more accessible to a wider audience. The first half of Applied Complex Variables, designed for sequential study, is a step-by-step treatment of fun

  5. Capacitive tool standoff sensor for dismantlement tasks

    A capacitive sensing technology has been applied to develop a Standoff Sensor System for control of robotically deployed tools utilized in Decontamination and Dismantlement (D and D) activities. The system combines four individual sensor elements to provide non-contact, multiple degree-of-freedom control of tools at distances up to five inches from a surface. The Standoff Sensor has been successfully integrated to a metal cutting router and a pyrometer, and utilized for real-time control of each of these tools. Experiments demonstrate that the system can locate stationary surfaces with a repeatability of 0.034 millimeters.

  6. Capacitive tool standoff sensor for dismantlement tasks

    Schmitt, D.J.; Weber, T.M. [Sandia National Labs., Albuquerque, NM (United States); Liu, J.C. [Univ. of Illinois, Urbana, IL (United States)

    1996-12-31

    A capacitive sensing technology has been applied to develop a Standoff Sensor System for control of robotically deployed tools utilized in Decontamination and Dismantlement (D and D) activities. The system combines four individual sensor elements to provide non-contact, multiple degree-of-freedom control of tools at distances up to five inches from a surface. The Standoff Sensor has been successfully integrated to a metal cutting router and a pyrometer, and utilized for real-time control of each of these tools. Experiments demonstrate that the system can locate stationary surfaces with a repeatability of 0.034 millimeters.
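
    How four capacitance readings could be turned into standoff and tilt can be sketched with the textbook parallel-plate relation d = epsilon*A/C followed by a plane fit; the geometry and readings below are hypothetical and are not taken from the Sandia system.

        # Estimate a gap per sensor, then fit a plane z = a*x + b*y + c
        # through the four gap estimates; c is the standoff at the tool center.
        import numpy as np

        EPS0, AREA = 8.854e-12, 1e-4   # F/m; assumed electrode area in m^2
        xy = np.array([[0.02, 0.0], [-0.02, 0.0],
                       [0.0, 0.02], [0.0, -0.02]])            # sensor positions (m)
        cap = np.array([8.9e-13, 8.7e-13, 8.8e-13, 8.8e-13])  # readings (F)

        gaps = EPS0 * AREA / cap                    # per-sensor gaps (m)
        design = np.column_stack([xy, np.ones(4)])
        (a, b, c), *_ = np.linalg.lstsq(design, gaps, rcond=None)
        print(f"standoff at center: {c*1e3:.3f} mm, tilt slopes: {a:.4f}, {b:.4f}")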

  7. Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel

    Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly, as it affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing since they affect component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and residual stresses in machining are absolutely necessary. In this work, a two-dimensional Finite Element model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate this model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses in the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on the residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The results obtained permit the conclusion that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and machining with honed tools having large cutting edge radii produces better results than chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.

  8. Orbiter Entry Aerothermodynamics Practical Engineering and Applied Research

    Campbell, Charles H.

    2009-01-01

    The contents include: 1) Organization of the Orbiter Entry Aeroheating Working Group; 2) Overview of the Principal RTF Aeroheating Tools Utilized for Tile Damage Assessment; 3) Description of the Integrated Tile Damage Assessment Team Analyses Process; 4) Space Shuttle Flight Support Process; and 5) JSC Applied Aerosciences and CFD Branch Applied Research Interests.

  9. Open Health Tools: Tooling for Interoperable Healthcare

    Skip McGaughey

    2008-11-01

    The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will 'raise the interoperability bar' as a result of having tools that just work. To achieve these lofty goals, careful consideration must be given to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase 'code is king' underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  10. The Matecat Tool

    Federico, Marcello; Bertoldi, Nicola; Cettolo, Mauro; Negri, Matteo; Turchi, Marco; Trombetti, Marco; Cattelan, Alessandro; Farina, Antonio; Lupinetti, Domenico; Martines, Andrea; Massidda, Alberto; Schwenk, Holger; Barrault, Loïc; Blain, Frédéric; Koehn, Philipp

    2014-01-01

    We present a new web-based CAT tool providing translators with a professional work environment, integrating translation memories, terminology bases, concordancers, and machine translation. The tool is completely developed as open source software and has been already successfully deployed for business, research and education. The MateCat Tool represents today probably the best available open source platform for investigating, integrating, and evaluating under realistic conditions the impact of...

  11. Agreement Workflow Tool (AWT)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  12. Java Power Tools

    Smart, John

    2008-01-01

    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  13. Chimera Grid Tools

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  14. Instant Spring Tool Suite

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well defined sections for the different parts of Spring.Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  15. Applied plasma physics

    Applied Plasma Physics is a major sub-organizational unit of the MFE Program. It includes Fusion Plasma Theory and Experimental Plasma Research. The Fusion Plasma Theory group has the responsibility for developing theoretical-computational models in the general areas of plasma properties, equilibrium, stability, transport, and atomic physics. This group has responsibility for giving guidance to the mirror experimental program. There is a formal division of the group into theory and computational; however, in this report the efforts of the two areas are not separated since many projects have contributions from members of both. Under the Experimental Plasma Research Program, we are developing the intense, pulsed neutral-beam source (IPINS) for the generation of a reversed-field configuration on 2XIIB. We are also studying the feasibility of utilizing certain neutron-detection techniques as plasma diagnostics in the next generation of thermonuclear experiments

  16. Contributions to Applied Cartography

    Radovan Pavić

    2012-12-01

    With increasing awareness of the importance, advantages and feasibility of representing/visualizing spatial relations and spatial content through corresponding cartography, maps are becoming increasingly frequent and elaborate whenever some aspect of reality needs to be represented from an economic, natural-scientific or politological standpoint. Some contents practically impose the need for applied cartography; this is especially true of international-political, military, geopolitical and transport issues. Mass communication media have therefore increasingly accepted and adopted specific cartography as significant content which successfully competes with the importance of the text itself - this is the case everywhere, including in Croatia. The French geographical-political-cartographic school is the model and an exceptional accomplishment. It also has predecessors in the German/Nazi geopolitical school from the first half of the 20th century.

  17. Applied number theory

    Niederreiter, Harald

    2015-01-01

    This textbook effectively builds a bridge from basic number theory to recent advances in applied number theory. It presents the first unified account of the four major areas of application where number theory plays a fundamental role, namely cryptography, coding theory, quasi-Monte Carlo methods, and pseudorandom number generation, allowing the authors to delineate the manifold links and interrelations between these areas.  Number theory, which Carl-Friedrich Gauss famously dubbed the queen of mathematics, has always been considered a very beautiful field of mathematics, producing lovely results and elegant proofs. While only very few real-life applications were known in the past, today number theory can be found in everyday life: in supermarket bar code scanners, in our cars’ GPS systems, in online banking, etc.  Starting with a brief introductory course on number theory in Chapter 1, which makes the book more accessible for undergraduates, the authors describe the four main application areas in Chapters...

  18. Applied plasma physics

    Applied Plasma Physics is a major sub-organizational unit of the MFE Porgram. It includes Fusion Plasma Theory and Experimental Plasma Research. Fusion Plasma Theory has the responsibility for developing theoretical-computational models in the general areas of plasma properties, equilibrium, stability, transport, and atomic physics. This group has responsibility for giving guidance to the mirror experimental program. There is a formal division of the group into theory and computational; however, in this report the efforts of the two areas are not separated since many projects have contributions from members of both. Under Experimental Plasma Research, we are developing the intense, pulsed ion-neutral source (IPINS) for the generation of a reversed-field configuration on 2XIIB. We are also studying the feasibility of utilizing certain neutron-detection techniques as plasma diagnostics in the next generation of thermonuclear experiments

  19. Applied partial differential equations

    Logan, J David

    2015-01-01

    This text presents the standard material usually covered in a one-semester, undergraduate course on boundary value problems and PDEs. Emphasis is placed on motivation, concepts, methods, and interpretation, rather than on formal theory. The concise treatment of the subject is maintained in this third edition, covering all the major ideas: the wave equation, the diffusion equation, the Laplace equation, and the advection equation on bounded and unbounded domains. Methods include eigenfunction expansions, integral transforms, and characteristics. In this third edition, the text remains intimately tied to applications in heat transfer, wave motion, biological systems, and a variety of other topics in pure and applied science. The text offers flexibility to instructors who, for example, may wish to insert topics from biology or numerical methods at any time in the course. The exposition is presented in a friendly, easy-to-read style, with mathematical ideas motivated from physical problems. Many exercises and worked e...
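
    A standard worked example of the eigenfunction-expansion method the text covers is the diffusion equation on a bounded interval; the statement below is the classical result, not a quotation from the book.

        % u_t = k u_xx on 0 < x < L, with u(0,t) = u(L,t) = 0 and u(x,0) = f(x):
        \[
        u(x,t) = \sum_{n=1}^{\infty} b_n \, e^{-k (n\pi/L)^2 t} \sin\frac{n\pi x}{L},
        \qquad
        b_n = \frac{2}{L}\int_0^L f(x)\sin\frac{n\pi x}{L}\,dx .
        \]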

  20. Applied plasma physics

    Applied Plasma Physics is a major sub-organizational unit of the Magnetic Fusion Energy (MFE) Program. It includes Fusion Plasma Theory and Experimental Plasma Research. The Fusion Plasma Theory group has the responsibility for developing theoretical-computational models in the general areas of plasma properties, equilibrium, stability, transport, and atomic physics. This group has responsibility for giving guidance to the mirror experimental program. There is a formal division of the group into theory and computational; however, in this report the efforts of the two areas are not separated since many projects have contributions from members of both. Under the Experimental Plasma Research Program we are developing a neutral-beam source, the intense, pulsed ion-neutral source (IPINS), for the generation of a reversed-field configuration on 2XIIB. We are also studying the feasibility of using certain neutron-detection techniques as plasma diagnostics in the next generation of thermonuclear experiments

  1. Applied statistical thermodynamics

    Lucas, Klaus

    1991-01-01

    The book guides the reader from the foundations of statistical thermodynamics, including the theory of intermolecular forces, to modern computer-aided applications in chemical engineering and physical chemistry. The approach is new. The foundations of quantum and statistical mechanics are presented in a simple way and their applications to the prediction of fluid phase behavior of real systems are demonstrated. A particular effort is made to introduce the reader to explicit formulations of intermolecular interaction models and to show how these models influence the properties of fluid systems. The established methods of statistical mechanics - computer simulation, perturbation theory, and numerical integration - are discussed in a style appropriate for newcomers and are extensively applied. Numerous worked examples illustrate how practical calculations should be carried out.
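
    The canonical explicit intermolecular interaction model of the kind the book introduces is the Lennard-Jones 12-6 potential; the sketch below evaluates it with commonly quoted argon parameters, used here purely as an illustration.

        # V(r) = 4*eps*[(sigma/r)^12 - (sigma/r)^6]; at r = 2^(1/6)*sigma
        # the potential reaches its minimum, V = -eps.
        def lennard_jones(r_nm, eps_kJmol=0.996, sigma_nm=0.3405):
            sr6 = (sigma_nm / r_nm) ** 6
            return 4.0 * eps_kJmol * (sr6 ** 2 - sr6)

        r_min = 2 ** (1 / 6) * 0.3405
        print(f"V(r_min) = {lennard_jones(r_min):.3f} kJ/mol")  # prints -0.996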

  2. Strategic decision analysis applied to borehole seismology

    Strategic Decision Analysis (SDA) is the evolving body of knowledge on how to achieve high quality in the decisions that shape an organization's future. SDA comprises philosophy, process concepts, methodology, and tools for making good decisions. It specifically incorporates many concepts and tools from economic evaluation and risk analysis. Chevron Petroleum Technology Company (CPTC) has applied SDA to evaluate and prioritize a number of its most important and most uncertain R and D projects, including borehole seismology. Before SDA, there were significant issues and concerns about the value to CPTC of continuing to work on borehole seismology. The SDA process created a cross-functional team of experts to structure and evaluate this project. A credible economic model was developed, discrete risks and continuous uncertainties were assessed, and an extensive sensitivity analysis was performed. The results, even when applied to a very restricted drilling program for a few years, were good enough to demonstrate the value of continuing the project. This paper explains the SDA philosophy, concepts, and process, and demonstrates the methodology and tools using the borehole seismology project as an example. SDA is useful in the upstream industry not just in R and D/technology decisions, but also in major exploration and production decisions. Since a major challenge for upstream companies today is to create and realize value, the SDA approach should have very broad applicability.

  3. Maailma suurim tool

    2000-01-01

    AS Tartu Näitused, the Tartu Art School and the magazine 'Diivan' are organizing the exhibition 'Tool 2000' ('Chair 2000') in Pavilion I of the Tartu fair centre on 9-11 March. 2,000 chairs will be exhibited, from which a TOP 12 will be selected. The world's largest chair is planned to be erected on the grounds of the fair centre. At the same time, Pavilion II hosts the twin fairs 'Sisustus 2000' (Furnishings 2000) and 'Büroo 2000' (Office 2000).

  4. Security Tools: cops & tiger

    Lehle, Bernd; Reutter, Oliver

    1996-01-01

    After SATAN was presented in the issue before last of the BI. as a security-check tool for complete networks, it is now the turn of two 'colleagues' that test individual computers locally for security holes. In particular, we also want to place emphasis on the assessment of these tools and present guidelines for their use in typical working environments.

  5. Tools for negotiating meaning

    Seale, Jane

    2004-01-01

    In this issue of ALT-J we have seven articles that explore how we as learning technologists can use a variety of tools to explore, evaluate, develop and understand our practice and experience. These tools include concepts, theories, symbols and metaphors and are used to: ... DOI: 10.1080/0968776042000216174

  6. Scheme Program Documentation Tools

    Nørmark, Kurt

    2004-01-01

    This paper describes and discusses two different Scheme documentation tools. The first is SchemeDoc, which is intended for documentation of the interfaces of Scheme libraries (APIs). The second is the Scheme Elucidator, which is for internal documentation of Scheme programs. Although the tools are ...

  7. Software engineering methodologies and tools

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  8. EPR design tools. Integrated data processing tools

    In all technical areas, planning and design have been supported by electronic data processing for many years. New data processing tools had to be developed for the European Pressurized Water Reactor (EPR). The work to be performed was split between KWU and Framatome and laid down in the Basic Design contract. The entire plant was reduced to a logical data structure; the circuit diagrams and flowsheets of the systems were drafted, the central data pool was established, the outlines of building structures were defined, the layout of plant components was planned, and the electrical systems were documented. Building construction engineering was also supported by data processing. The tasks laid down in the Basic Design were completed as so-called milestones. Additional data processing tools, also based on the central data pool, are required for the phases following the Basic Design phase, i.e. Basic Design Optimization; Detailed Design; Management; Construction; and Commissioning. (orig.)

  9. PREFACE: Celebrating 100 years of superconductivity: special issue on the iron-based superconductors Celebrating 100 years of superconductivity: special issue on the iron-based superconductors

    Crabtree, George; Greene, Laura; Johnson, Peter

    2011-12-01

    In honor of this year's 100th anniversary of the discovery of superconductivity, this special issue of Reports on Progress in Physics is dedicated to the 'iron-based superconductors'—a new class of high-temperature superconductors that were discovered in 2008. This is the first time the journal has generated a 'theme issue', and we offer it to the community as a 'snapshot' of the present status, both for researchers working in this fast-paced field, and for the general physics community. Reports on Progress in Physics publishes three classes of articles—comprehensive full Review Articles, Key Issues Reviews and, most recently, Reports on Progress articles that recount the current status of a rapidly evolving field, befitting the articles in this special issue. It has been an exciting year for superconductivity—there have been numerous celebrations for this centenary recounting the fascinating history of this field, from seven Nobel prizes to life-saving discoveries that brought us medically useful magnetic resonance imaging. The discovery of a completely new class of high-temperature superconductors, whose mechanism remains as elusive as that of the cuprates discovered in 1986, has injected a new vitality into this field, and this year those new to the field were provided with the opportunity to interact with those who have enjoyed a long history in superconductivity. Furthermore, as high-density current carriers with little or no power loss, high-temperature superconductors offer unique solutions to fundamental grid challenges of the 21st century and hold great promise in addressing our global energy challenges. The complexity and promise of these materials have caused our community to share our ideas and results more freely than ever before, and it is gratifying to see how we have grown into an enthusiastic global network to advance the field. This invited collection is true to this agenda and we are delighted to have received contributions from many of the world leaders for an initiative that is designed to benefit both newcomers and established researchers in superconductivity.

  10. Academic training: Applied superconductivity

    2007-01-01

    LECTURE SERIES, 17, 18, 19 January from 11.00 to 12.00 hrs, Council Room, Bldg 503. Applied Superconductivity: Theory, Superconducting Materials and Applications. E. Palmieri / INFN, Padova, Italy. When hearing about persistent currents recirculating for several years in a superconducting loop without any appreciable decay, one realizes that we are dealing with a phenomenon which in nature is the closest known to perpetual motion. Zero resistivity and perfect diamagnetism in Mercury at 4.2 K, the breakthrough during 75 years of several hundreds of superconducting materials, the revolution of the "liquid Nitrogen superconductivity", the discovery of still a binary compound becoming superconducting at 40 K and the subsequent re-exploration of the already known superconducting materials: Nature discloses drop by drop its intimate secrets and nobody can exclude that the last final surprise must still come. After an overview of phenomenology and basic theory of superconductivity, the lectures for this a...

  11. Applied hydraulic transients

    Chaudhry, M Hanif

    2014-01-01

    This book covers hydraulic transients in a comprehensive and systematic manner from introductory to advanced level and presents various methods of analysis for computer solution. The field of application of the book is very broad and diverse and covers areas such as hydroelectric projects, pumped storage schemes, water-supply systems, cooling-water systems, oil pipelines and industrial piping systems. Strong emphasis is given to practical applications, including several case studies, problems of an applied nature, and design criteria. This will help design engineers and introduce students to real-life projects. This book also:
    - Presents modern methods of analysis suitable for computer analysis, such as the method of characteristics, explicit and implicit finite-difference methods and matrix methods
    - Includes case studies of actual projects
    - Provides extensive and complete treatment of governed hydraulic turbines
    - Presents design charts, desi...

  12. Applied research with cyclotrons

    During the past three decades the Flerov laboratory carried out research and development of a number of applications that have found or may find use in modern technologies. One of the applications is the so-called ion track technology enabling us to create micro- and nano-structured materials. Accelerated heavy ion beams are the unique tools for structuring insulating solids in a controllable manner. At FLNR JINR the U-400 cyclotron and the IC-100 cyclotron are employed for irradiation of materials to be modified by the track-etch technique. For practical applications, U-400 delivers the 86Kr ion beams with total energies of 250, 350, 430 and 750 MeV, and the 136Xe ion beams with the energy of 430 MeV. The cyclotron is equipped with a specialized channel for irradiation of polymer foils. IC-100 is a compact accelerator specially designed for the technological uses. High-intensity krypton ion beams with the energy of ∼ 1 MeV/u are available now at IC-100. Production of track-etch membranes is an example of mature technology based on irradiation with accelerated ions. The track-etch membranes offer distinct advantages over other types of membranes due to their precisely determined structure. One-pore, oligo-pore and multi-pore samples can serve as models for studying the transport of liquids, gases, particles, solutes, and electrolytes in narrow channels. Track-etch pores are also used as templates for making nano wires, nano tubes or array of nano rods. The microstructures obtained this way may find use in miniaturized devices such as sensors for biologically important molecules. Bulk and surface modification for the production of new composites and materials with special optical properties can be performed with ion beams. Flexible printed circuits, high-performance heat transfer modules, X-ray filters, and protective signs are examples of products developed in collaboration with research and industrial partners. Some recent achievements and promising ideas that

  13. Language Management Tools

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm's leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  14. UniProt Tools.

    Pundir, Sangya; Martin, Maria J; O'Donovan, Claire

    2016-01-01

    The Universal Protein Resource (UniProt) is a comprehensive resource for protein sequence and annotation data (UniProt Consortium, 2015). The UniProt Web site receives ∼400,000 unique visitors per month and is the primary means to access UniProt. Along with various datasets that you can search, UniProt provides three main tools. These are the 'BLAST' tool for sequence similarity searching, the 'Align' tool for multiple sequence alignment, and the 'Retrieve/ID Mapping' tool for using a list of identifiers to retrieve UniProtKB proteins and to convert database identifiers from UniProt to external databases or vice versa. This unit provides three basic protocols, three alternate protocols, and two support protocols for using UniProt tools.
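
    Besides the website, UniProt data can also be fetched programmatically; the sketch below is an assumption about the service's current REST interface (the endpoint and accession are illustrative) and is not part of the protocols in this unit.

        # Fetch one UniProtKB record in FASTA format over HTTP.
        import urllib.request

        accession = "P12345"  # illustrative accession
        url = f"https://rest.uniprot.org/uniprotkb/{accession}.fasta"  # assumed endpoint
        with urllib.request.urlopen(url) as resp:
            print(resp.read().decode()[:200])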

  15. OOTW Force Design Tools

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  16. Directory of Online Cataloging Tools

    Mohamed Hamed Mu'awwad

    2007-09-01

    A directory of online cataloging tools that collects more than 200 essential tools used in libraries. The resources are divided into four categories: individual tools classified according to type of activity (such as cataloging, classification, and standard formats); collective tools; non-collective tools; and free general tools available on the internet.

  17. Micro and nano fabrication tools and processes

    Gatzen, Hans H; Leuthold, Jürg

    2015-01-01

    For Microelectromechanical Systems (MEMS) and Nanoelectromechanical Systems (NEMS) production, each product requires a unique process technology. This book provides a comprehensive insight into the tools necessary for fabricating MEMS/NEMS and the process technologies applied. In addition, it describes the enabling technologies that are necessary for successful production, i.e., wafer planarization and bonding, as well as contamination control.

  18. Computational social networks tools, perspectives and applications

    Abraham, Ajith

    2012-01-01

    This book:
    - Provides the latest advances in computational social networks, and illustrates how organizations can gain a competitive advantage by applying these ideas in real-world scenarios
    - Presents a specific focus on practical tools and applications
    - Provides experience reports, survey articles, and intelligence techniques and theories relating to specific problems in network technology
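
    As a small, hedged example of the practical tool focus described above, the sketch below computes degree centrality on a classic social network with the networkx package; the book itself is not tied to this library.

        # Rank members of Zachary's karate club network by degree centrality.
        import networkx as nx

        G = nx.karate_club_graph()              # classic social network dataset
        centrality = nx.degree_centrality(G)
        top = sorted(centrality, key=centrality.get, reverse=True)[:3]
        print("most central members:", top)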

  19. Essays in applied microeconomics

    Wang, Xiaoting

    In this dissertation I use microeconomic theory to study firms' behavior. Chapter One introduces the motivations and main findings of this dissertation. Chapter Two studies the issue of information provision through advertisement when markets are segmented and consumers' price information is incomplete. Firms compete in prices and advertising strategies for consumers with transportation costs. High advertising costs contribute to market segmentation. Low advertising costs promote price competition among firms and improve consumer welfare. Chapter Three also investigates market power as a result of consumers' switching costs. A potential entrant can offer a new product bundled with an existing product to compensate consumers for their switching cost. If the primary market is competitive, bundling simply plays the role of price discrimination, and it does not dominate unbundled sales in the process of entry. If the entrant has market power in the primary market, then bundling also plays the role of leveraging market power and it dominates unbundled sales. The market for electric power generation has been opened to competition in recent years. Chapter Four looks at issues involved in the deregulated electricity market. By comparing the performance of the competitive market with the social optimum, we identify the conditions under which market equilibrium generates socially efficient levels of electric power. Chapters Two to Four investigate strategic behavior among firms. Chapter Five studies the interaction between firms and unemployed workers in a frictional labor market. We set up an asymmetric job auction model, where two types of workers apply for two types of job openings by bidding in auctions and firms hire the applicant offering them the most profits. The job auction model internalizes the determination of the share of surplus from a match, and therefore endogenously generates incentives for an efficient division of the matching surplus. Microeconomic

  20. Applied large eddy simulation.

    Tucker, Paul G; Lardeau, Sylvain

    2009-07-28

    Large eddy simulation (LES) is now seen more and more as a viable alternative to current industrial practice, usually based on problem-specific Reynolds-averaged Navier-Stokes (RANS) methods. Access to detailed flow physics is attractive to industry, especially in an environment in which computer modelling is bound to play an ever increasing role. However, the improvement in accuracy and flow detail comes at substantial cost. This has so far prevented wider industrial use of LES. The purpose of the applied LES discussion meeting was to address questions regarding what is achievable and what is not, given the current technology and knowledge, for an industrial practitioner who is interested in using LES. The use of LES was explored in an application-centred context between diverse fields. The general flow-governing equation form was explored along with various LES models. The errors occurring in LES were analysed. Also, the hybridization of RANS and LES was considered. The importance of modelling relative to boundary conditions, problem definition and other more mundane aspects was examined. It was to an extent concluded that for LES to make the most rapid industrial impact, pragmatic hybrid use of LES, implicit LES and RANS elements will probably be needed. Added to this, further highly industry-sector-specific model parametrizations will be required, with clear thought on the key target design parameter(s). The combination of good numerical modelling expertise, a sound understanding of turbulence, along with artistry, pragmatism and the use of recent developments in computer science should dramatically add impetus to the industrial uptake of LES. In the light of the numerous technical challenges that remain, it appears that for some time to come LES will have echoes of the high levels of technical knowledge required for safe use of RANS, but with much greater fidelity.
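
    For orientation, the closure at the heart of classical LES can be stated in two lines: filtering the Navier-Stokes equations leaves subgrid stresses that the Smagorinsky model closes with an eddy viscosity (C_s is an empirical constant, Delta the filter width). This is the textbook form, not a result from the meeting.

        \[
        \tau_{ij} - \tfrac{1}{3}\tau_{kk}\,\delta_{ij} = -2\,\nu_t\,\bar{S}_{ij},
        \qquad
        \nu_t = (C_s \Delta)^2\,|\bar{S}|,
        \qquad
        |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}} .
        \]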

  1. Applied research in uncertainty modeling and analysis

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  2. Tool Gear: Infrastructure for Building Parallel Programming Tools

    May, J M; Gyllenhaal, J

    2002-12-09

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  3. Benchmarking expert system tools

    Riley, Gary

    1988-01-01

    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object oriented, frame based expert system tool. The benchmarks used for testing are studied.

  4. Machine Tool Software

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer-Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  5. Vibration absorber modeling for handheld machine tool

    Abdullah, Mohd Azman; Mustafa, Mohd Muhyiddin; Jamil, Jazli Firdaus; Salim, Mohd Azli; Ramli, Faiz Redza

    2015-05-01

    Handheld machine tools transmit continuous vibration to their users during operation. Over long periods of repeated operation, this vibration has harmful effects on users' health. In this paper, a dynamic vibration absorber (DVA) is designed and modeled to reduce the vibration generated by a handheld machine tool. Several designs and models of vibration absorbers with various stiffness properties are simulated, tested and optimized in order to diminish the vibration. Ordinary differential equations are used to derive and formulate the vibration phenomena in the machine tool with and without the DVA. The final transfer function of the DVA is later analyzed using commercially available mathematical software. The DVA with optimum mass and stiffness properties is developed and applied to the actual handheld machine tool. The performance of the DVA is experimentally tested and validated by the final result of vibration reduction.
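
    The classical undamped-absorber result behind this kind of modelling can be stated briefly; the equations below are the textbook two-degree-of-freedom system, not the paper's specific model.

        \[
        m\ddot{x} + kx + k_a(x - x_a) = F_0\sin\omega t,
        \qquad
        m_a\ddot{x}_a + k_a(x_a - x) = 0 ,
        \]
        where the steady-state amplitude of the primary mass vanishes when the
        absorber is tuned so that \(\omega = \sqrt{k_a/m_a}\).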

  6. Tools and Behavioral Abstraction: A Direction for Software Engineering

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  7. Chatter and machine tools

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.
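
    The classical single-degree-of-freedom result underlying regenerative chatter analysis is that the critical depth of cut is b_lim(w) = -1/(2*Kf*Re G(iw)), where G is the structural frequency response and Kf the specific cutting force; chatter is only possible where Re G < 0. A hedged Python sketch with invented parameters:

      import numpy as np

      # Single-DOF regenerative chatter limit with invented parameters.
      # b_lim(w) = -1 / (2 * Kf * Re G(iw)), valid only where Re G < 0.
      m, c, k = 0.5, 60.0, 2.0e6     # mass (kg), damping (N.s/m), stiffness (N/m)
      Kf = 8.0e8                     # specific cutting force coefficient (N/m^2)

      w = np.linspace(1500.0, 4000.0, 2000)      # candidate chatter frequencies (rad/s)
      G = 1.0 / (k - m * w**2 + 1j * c * w)      # structural frequency response
      neg = G.real < 0                           # chatter occurs only here
      b_lim = -1.0 / (2.0 * Kf * G.real[neg])    # critical depth of cut (m)

      # The minimum over all frequencies is the unconditionally stable depth;
      # spindle-speed lobes would follow from the FRF phase (omitted here).
      print(f"chatter-free below {b_lim.min() * 1e3:.3f} mm depth of cut")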

  8. Cash Reconciliation Tool

    US Agency for International Development — CART is a cash reconciliation tool that allows users to reconcile Agency cash disbursements with Treasury fund balances; track open unreconciled items; and create...

  9. Chemical Data Access Tool

    U.S. Environmental Protection Agency — This tool is intended to aid individuals interested in learning more about chemicals that are manufactured or imported into the United States. Health and safety...

  10. Tools and their uses

    1973-01-01

    Teaches names, general uses, and correct operation of all basic hand and power tools, fasteners, and measuring devices you are likely to need. Also, grinding, metal cutting, soldering, and more. 329 illustrations.

  11. Friction stir welding tool

    Tolle, Charles R.; Clark, Denis E.; Barnes, Timothy A.

    2008-04-15

    A friction stir welding tool is described which includes a shank portion; a shoulder portion which is releasably engageable with the shank portion; and a pin which is releasably engageable with the shoulder portion.

  12. TENCompetence tool demonstration

    Kluijfhout, Eric

    2010-01-01

    Kluijfhout, E. (2009). TENCompetence tool demonstration. Presented at Zorgacademie Parkstad (Health Academy Parkstad), Limburg Leisure Academy, Life Long Learning Limburg and a number of regional educational institutions. May 18, 2009, Heerlen, The Netherlands: Open University of the Netherlands, TENCompetence.

  13. Laser Prepared Cutting Tools

    Konrad, Wegener; Claus, Dold; Marcel, Henerichs; Christian, Walter

    Laser pulses with a pulsewidth of a few picoseconds have recently received a lot of attention, solving the problem of manufacturing tools for new materials of superior mechanical properties. Thermally sensitive materials, such as diamond and CBN structures, can be processed without major material deterioration. The breakthrough of this new technology becomes possible if the accuracy and lifetime requirements of those tools are met. The paper shows the potential of laser manufacturing of cutting tools in three applications. Cutting CFRP requires sharp and stable cutting edges, which are prepared in PCD tools by laser sources in the picosecond pulsewidth regime. Profiling of hybrid-bond grinding wheels yields geometric flexibility that has so far been impossible by mechanical treatment. Touch dressing of grinding wheels substantially reduces cutting forces.

  14. Recovery Action Mapping Tool

    National Oceanic and Atmospheric Administration, Department of Commerce — The Recovery Action Mapping Tool is a web map that allows users to visually interact with and query actions that were developed to recover species listed under the...

  15. THOR: Metrics and Tools

    Dasler, Robin

    2016-01-01

    This report describes the work of the THOR Project to develop a dashboard to monitor interoperability of persistent identifiers. The dashboard is an essential step towards a suite of tools to measure the impact of the project.

  16. Game development tool essentials

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  17. Autism Teaching Tool

    2014-01-01

    CERN pattern recognition technologies transferred to a learning tool for autistic children. The state of the art of pattern recognition technology developed at CERN for High Energy Physics is transferred to the Computer Vision domain and is used to develop a new...

  18. Health Check Tools

    MedlinePlus (National Library of Medicine) collection of interactive health check tools, including an immunization schedule from the Centers for Disease Control and Prevention and a Body Mass Index calculator for children and teens: https://www.nlm.nih.gov/medlineplus/healthchecktools.html

  19. Neighborhood Mapping Tool

    Department of Housing and Urban Development — This tool assists the public and Choice Neighborhoods applicants to prepare data to submit with their grant application by allowing applicants to draw the exact...

  20. Mapping Medicare Disparities Tool

    U.S. Department of Health & Human Services — The CMS Office of Minority Health has designed an interactive map, the Mapping Medicare Disparities Tool, to identify areas of disparities between subgroups of...

  1. NWRS Survey Prioritization Tool

    US Fish and Wildlife Service, Department of the Interior — A SMART Tool and User's Guide for aiding NWRS Station staff when prioritizing their surveys for an Inventory and Monitoring Plan. This guide describes a process and...

  2. Surface exploration geophysics applied to the moon

    With the advent of a permanent lunar base, the desire to explore the lunar near-surface for both scientific and economic purposes will arise. Applications of exploration geophysical methods to the earth's subsurface are highly developed. This paper briefly addresses some aspects of applying this technology to near surface lunar exploration. It is noted that both the manner of application of some techniques, as well as their traditional hierarchy as assigned on earth, should be altered for lunar exploration. In particular, electromagnetic techniques may replace seismic techniques as the primary tool for evaluating near-surface structure

  3. Tools used for hand deburring

    Gillespie, L.K.

    1981-03-01

    This guide is designed to help in quick identification of those tools most commonly used to deburr hand-sized or smaller parts. Photographs and textual descriptions are used to provide rapid yet detailed information. The data presented include the Bendix Kansas City Division coded tool number, the tool description, the tool crib in which the tool can be found, the maximum and minimum inventory requirements, the cost of each tool, and the number of the illustration that shows the tool.
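
    The catalogue fields described (coded tool number, description, crib, inventory limits, cost, illustration number) map directly onto a small record structure. A hypothetical Python sketch; the field names and example values are invented, not Bendix's actual schema:

      from dataclasses import dataclass

      # One catalogue entry with the fields the guide describes.
      # Field names and values are invented, not Bendix's actual schema.
      @dataclass
      class DeburringTool:
          tool_number: str      # coded tool number
          description: str
          tool_crib: str        # crib where the tool is stocked
          max_inventory: int
          min_inventory: int
          unit_cost: float      # dollars
          illustration: int     # number of the illustration showing the tool

      scraper = DeburringTool("DB-104", "three-corner hand scraper", "Crib 7",
                              24, 6, 4.15, 12)
      print(scraper)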

  4. CMS offline web tools

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools.

  5. Stochastic tools in turbulence

    Lumley, John L.

    2012-01-01

    Stochastic Tools in Turbulence discusses the available mathematical tools to describe stochastic vector fields to solve problems related to these fields. The book deals with the needs of turbulence in relation to stochastic vector fields, particularly, on three-dimensional aspects, linear problems, and stochastic model building. The text describes probability distributions and densities, including Lebesgue integration, conditional probabilities, conditional expectations, statistical independence, lack of correlation. The book also explains the significance of the moments, the properties of the

  6. Stone Tool Production

    Hikade, Thomas

    2010-01-01

    In ancient Egypt, flint or chert was used for knapped stone tools from the Lower Palaeolithic down to the Pharaonic Period. The raw material was available in abundance on the desert surface, or it could be mined from the limestone formations along the Nile Valley. While the earliest lithic industries of Prehistoric Egypt resemble the stone tool assemblages from other parts of Africa, as well as Asia and Europe, the later Prehistoric stone industries in Egypt had very specific characteristics,...

  7. Delila system tools.

    Schneider, T D; Stormo, G D; Yarus, M A; Gold, L

    1984-01-01

    We introduce three new computer programs and associated tools of the Delila nucleic-acid sequence analysis system. The first program, Module, allows rapid transportation of new sequence analysis tools between scientists using different computers. The second program, DBpull, allows efficient access to the large nucleic-acid sequence databases being collected in the United States and Europe. The third program, Encode, provides a flexible way to process sequence data for analysis by other programs.

  8. Quality management tool

    This book introduces the basic concepts of quality (characteristics, price, cost, and function), the basic concepts of quality management, the introduction and operation of quality management, quality assurance and claims handling (handling of claims on goods, standards, and quality assurance methods), the seven basic tools of quality management (Pareto diagram, cause-and-effect or fishbone diagram, check sheet, histogram, scatter diagram, graph, and stratification), the new seven QC tools, quality function deployment, and measurement systems.
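
    Of the basic tools listed, the Pareto diagram is the most directly algorithmic: rank the categories by frequency and accumulate percentages to isolate the vital few. A small Python sketch with invented defect counts:

      # Pareto analysis: rank defect categories and accumulate percentages
      # to find the "vital few". The counts below are invented.
      defects = {"scratch": 104, "dent": 62, "misalignment": 31,
                 "discoloration": 12, "other": 9}

      total = sum(defects.values())
      cumulative = 0.0
      for category, count in sorted(defects.items(), key=lambda kv: -kv[1]):
          cumulative += 100.0 * count / total
          print(f"{category:15s} {count:4d}   cumulative {cumulative:5.1f}%")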

  9. Tools for software visualization

    Stojanova, Aleksandra; Stojkovic, Natasa; Bikov, Dusan

    2015-01-01

    Software visualization is a kind of computer art and, at the same time, a science of generating visual representations of different aspects of software and of the software development process. There are many tools for software visualization, but we focus on some of them. In this paper, just four tools are examined in detail: Jeliot 3, SRec, jGrasp and DDD. The visualizations they produce are reviewed and analyzed, and possible places for their application are mentioned. A...

  10. Development of micro rotary swaging tools of graded tool steel via co-spray forming

    Cui Chengsong

    2015-01-01

    Full Text Available In order to meet the requirements of micro rotary swaging, the local properties of the tools should be adjusted properly with respect to abrasive and adhesive wear, compressive strength, and toughness. These properties can be optimally combined by using different materials in specific regions of the tools, with a gradual transition in between to reduce critical stresses at the interface during heat treatment and in the rotary swaging process. In this study, a newly developed co-spray forming process was used to produce graded tool materials in the form of a flat product. The graded deposits were subsequently hot rolled and heat treated to achieve an optimal microstructure and advanced properties. Micro plunge rotary swaging tools with fine geometrical structures were machined from the hot rolled materials. The new forming tools were successfully applied in the micro plunge rotary swaging of wires of stainless steel.

  11. Kymenlaakso University of Applied Sciences Social Media Marketing Campaign

    Bogdanov, Evgeny

    2012-01-01

    The popularity of social media has grown significantly in the course of the last few years and become a beneficial tool for promoting an organisation. Maintaining online presence on social media websites allows a company to reach its target audiences and raise public awareness on a global basis. The purpose of this study was to achieve a better understanding of how social media marketing of Kymenlaakso University of Applied Sciences can be improved and which social media marketing tools ...

  12. NASA's Applied Sciences for Water Resources

    Doorn, Bradley; Toll, David; Engman, Ted

    2011-01-01

    The Earth Systems Division within NASA has the primary responsibility for the Earth Science Applied Science Program and the objective to accelerate the use of NASA science results in applications to help solve problems important to society and the economy. The primary goal of the Earth Science Applied Science Program is to improve future and current operational systems by infusing them with scientific knowledge of the Earth system gained through space-based observation, assimilation of new observations, and development and deployment of enabling technologies, systems, and capabilities. This paper discusses one of the major problems facing water resources managers, that of having timely and accurate data to drive their decision support tools. It then describes how NASA's science and space based satellites may be used to overcome this problem. Opportunities for the water resources community to participate in NASA's Water Resources Applications Program are described.

  13. GIS Technology: Resource and Habitability Assessment Tool Project

    National Aeronautics and Space Administration — This is a one-year project to apply a GIS analysis tool to new orbital data for lunar resource assessment and Martian habitability identification. We used...

  14. Nuclear Science References as a Tool for Data Evaluation

    Winchell, D. F.

    2004-09-26

    For several decades, the Nuclear Science References database has been maintained as a tool for data evaluators and for the wider pure and applied research community. This contribution will describe the database and recent developments in web-based access.

  15. Physics analysis tools

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level tools, such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages, which are also classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages; thus it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.
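
    The five stages read naturally as a pipeline in which each stage consumes the previous stage's output. The Python sketch below is purely schematic, with stub stages and invented data rather than real HEP code:

      # Schematic of the five stages as a pipeline of stub functions.
      # Every body here is a placeholder, not real HEP code.
      def generate(n_events):            # particle generation
          return [{"id": i} for i in range(n_events)]

      def simulate(events):              # detector simulation
          return [dict(e, hits=[0.1, 0.2, 0.3]) for e in events]

      def reconstruct_tracks(events):    # pattern recognition on space points
          return [dict(e, n_tracks=len(e["hits"])) for e in events]

      def reconstruct_events(events):    # assign masses, build 4-vectors
          return [dict(e, n_four_vectors=e["n_tracks"]) for e in events]

      def display_and_fit(events):       # histogram and fit the statistics
          print("events through the pipeline:", len(events))

      data = generate(100)
      for stage in (simulate, reconstruct_tracks, reconstruct_events):
          data = stage(data)
      display_and_fit(data)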

  16. Tool steel for tool holder applications : microstructure and mechanical properties

    Medvedeva, Anna

    2008-01-01

    Large improvements in cutting tool design and technology, including the application of advanced surface engineering treatments on the cemented carbide insert, have been achieved in the last decades to enhance tool performance. However, the problem of improving the tool body material has not been adequately studied. Fatigue is the most common failure mechanism in cutting tool bodies. Rotating tools, going in and out of cutting engagement, impose dynamic stresses and require adequate fatigue str...

  17. Machining M42 tool steel using nanostructured coated cutting tools

    M.J. Jackson; G.M. Robinson; J.S. Morrell

    2007-01-01

    Purpose: This paper discusses improvements associated with the life of cutting tools used to machine M42 tool steel. To achieve this in an efficient way, experiments on a variety of tool coatings are conducted on AISI M42 tool steel (58-63 HRC). Design/methodology/approach: In order to assess the impact of different tool coatings on the machining process, initial experiments simulate existing machining operations; this provides a standard for tool life and surface finish. Findings: The finding...

  18. Applied Meteorology Unit (AMU) Quarterly Report Third Quarter FY-08

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Dreher, Joseph

    2008-01-01

    This report summarizes the Applied Meteorology Unit (AMU) activities for the third quarter of Fiscal Year 2008 (April - June 2008). Tasks reported on are: Peak Wind Tool for User Launch Commit Criteria (LCC), Anvil Forecast Tool in AWIPS Phase II, Completion of the Edwards Air Force Base (EAFB) Statistical Guidance Wind Tool, Volume Averaged Height Integrated Radar Reflectivity (VAHIRR), Impact of Local Sensors, Radar Scan Strategies for the PAFB WSR-74C Replacement, VAHIRR Cost Benefit Analysis, and WRF Wind Sensitivity Study at Edwards Air Force Base.

  19. Applied Meteorology Unit (AMU) Quarterly Report - Fourth Quarter FY-09

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Wheeler, Mark

    2009-01-01

    This report summarizes the Applied Meteorology Unit (AMU) activities for the fourth quarter of Fiscal Year 2009 (July - September 2009). Task reports include: (1) Peak Wind Tool for User Launch Commit Criteria (LCC), (2) Objective Lightning Probability Tool, Phase III, (3) Peak Wind Tool for General Forecasting, Phase II, (4) Update and Maintain Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS), (5) Verify MesoNAM Performance, and (6) Develop a Graphical User Interface to update selected parameters for the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT).

  20. Applied Ethics in Nowadays Society

    Tomita CIULEI

    2013-01-01

    This special issue is dedicated to Nowadays Applied Ethics in Society, and falls in the field of social sciences and humanities, hosting both theoretical approaches and empirical research in various areas of applied ethics. Applied ethics analyzes a series of concrete moral situations of social or professional practice in order to make/adopt decisions. In the field of applied ethics are integrated medical ethics, legal ethics, media ethics, professional ethics, environmental ethic...

  1. The Routledge Applied Linguistics Reader

    Wei, Li, Ed.

    2011-01-01

    "The Routledge Applied Linguistics Reader" is an essential collection of readings for students of Applied Linguistics. Divided into five sections: Language Teaching and Learning, Second Language Acquisition, Applied Linguistics, Identity and Power and Language Use in Professional Contexts, the "Reader" takes a broad interpretation of the subject…

  2. Certifying Tools for Test Reduction in Open Architecture

    Valdis Berzins

    2012-01-01

    In this paper, we describe a method for evaluating tools that can be used to guide decisions about how much retesting is needed and to check conditions under which testing of unmodified components can be reduced or avoided. The approach uses a combination of dependency analysis applied to source code and automated testing applied to executable component implementations. Dependability of such tools is a key concern in this context, which our ongoing research addresses. We also discuss other ap...

  3. Some Notes About Artificial Intelligence as New Mathematical Tool

    Angel Garrido

    2010-04-01

    Full Text Available Mathematics is a mere instance of First-Order Predicate Calculus. Therefore it belongs to applied Monotonic Logic. So we find the limitations of classical logic reasoning and the clear advantages of Fuzzy Logic and many other new and interesting tools. We present here some of the most useful tools of this new field of Mathematics, so-called Artificial Intelligence.
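
    As a concrete taste of the fuzzy-logic tools mentioned, the Python sketch below implements a triangular membership function and the usual min/max connectives; the linguistic sets and numbers are invented:

      # Triangular fuzzy membership and min/max connectives.
      # The linguistic sets and numbers are invented.
      def triangular(x, a, b, c):
          """Membership rising from a to a peak at b, falling to c."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      warm = triangular(23.0, 18.0, 25.0, 32.0)         # "23 degC is warm"
      comfortable = triangular(23.0, 15.0, 21.0, 27.0)  # "23 degC is comfortable"

      print(f"warm: {warm:.2f}, comfortable: {comfortable:.2f}")
      print(f"warm AND comfortable: {min(warm, comfortable):.2f}")  # fuzzy AND
      print(f"warm OR comfortable:  {max(warm, comfortable):.2f}")  # fuzzy OR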

  4. A Tool for Assessing Social Climate in University Classrooms

    Sánchez, Carles Rostan; Ortiz, Dolors Cañabate; Carrasco, Mònica González; Carbo, Pilar Albertín; Burriel, Marc Pérez

    2015-01-01

    Introduction: Despite academic climate being a key aspect of teaching quality in academic institutions, few studies conducted in the university context have analyzed this construct systematically. Method: Given the absence of specific tools to apply to university, we propose the construction of a tool for assessing college students' perceptions of…

  5. A cross-species alignment tool (CAT)

    Li, Heng; Guan, Liang; Liu, Tao;

    2007-01-01

    sensitive methods which are usually applied in aligning inter-species sequences. RESULTS: Here we present a new algorithm called CAT (for Cross-species Alignment Tool). It is designed to align mRNA sequences to mammalian-sized genomes. CAT is implemented using C scripts and is freely available on the web at http://xat.sourceforge.net/. CONCLUSIONS: Examined from different angles, CAT outperforms other extant alignment tools. Tested against all available mouse-human and zebrafish-human orthologs, we demonstrate that CAT combines the specificity and speed of the best intra-species algorithms, like BLAT and...
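
    The record does not detail CAT's algorithm, but the scoring idea behind local sequence alignment can be illustrated with a generic Smith-Waterman matrix fill (shown here in Python; this is not CAT's actual method):

      # Generic Smith-Waterman local-alignment scoring, shown only to
      # illustrate the idea; this is not CAT's actual algorithm.
      def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
          H = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
          best = 0
          for i in range(1, len(a) + 1):
              for j in range(1, len(b) + 1):
                  diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
                  best = max(best, H[i][j])
          return best

      print(smith_waterman_score("ACACACTA", "AGCACACA"))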

  6. New Conceptual Design Tools

    Pugnale, Alberto; Holst, Malene; Kirkegaard, Poul Henning

    2010-01-01

    This paper aims to discuss recent approaches in using more and more frequently computer tools as supports for the conceptual design phase of the architectural project. The present state-of-the-art about software as conceptual design tool could be summarized in two parallel tendencies. On the one hand, the main software houses are trying to introduce powerful and effective user-friendly applications in the world of building designers, that are more and more able to fit their specific requirements; on the other hand, some groups of expert users with a basic programming knowledge seem to deal with the problem of software as conceptual design tool by means of 'scripting', in other words by self-developing codes able to solve specific and well defined design problems. Starting with a brief historical recall and the discussion of relevant researches and practical experiences, this paper...

  7. Assembly tool design

    The reactor core of the International Thermonuclear Experimental Reactor (ITER) is assembled from a number of large and asymmetric components within tight tolerances in order to assure the structural integrity for various loads and to provide the tritium confinement. In addition, the assembly procedure should be compatible with remote operation, since the core structures will be activated by 14-MeV neutrons once operation starts and personnel access will thus be prohibited. Accordingly, the assembly procedure and tool design are quite essential and should be designed from the beginning to facilitate remote operation. According to the ITER Design Task Agreement, the Japan Atomic Energy Research Institute (JAERI) has performed a design study to develop the assembly procedures and associated tool design for the ITER tokamak assembly. This report describes outlines of the assembly tools and the remaining issues obtained in this design study. (author)

  8. Cataract Surgery Tool

    1977-01-01

    The NASA-McGannon cataract surgery tool is a tiny cutter-pump which liquefies and pumps the cataract lens material from the eye. Inserted through a small incision in the cornea, the tool can be used on the hardest cataract lens. The cutter is driven by a turbine which operates at about 200,000 revolutions per minute. Incorporated in the mechanism are two passages for saline solutions, one to maintain constant pressure within the eye, the other for removal of the fragmented lens material and fluids. Three years of effort have produced a design, now being clinically evaluated, with excellent potential for improved cataract surgery. The use of this tool is expected to reduce the patient's hospital stay and recovery period significantly.

  9. CMS tracker visualization tools

    Zito, G; Osborne, I; Regano, A

    2005-01-01

    This document will review the design considerations, implementations and performance of the CMS Tracker Visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  10. CMS tracker visualization tools

    Mennea, M.S. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy); Osborne, I. [Northeastern University, 360 Huntington Avenue, Boston, MA 02115 (United States); Regano, A. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy); Zito, G. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy)]. E-mail: giuseppe.zito@ba.infn.it

    2005-08-21

    This document will review the design considerations, implementations and performance of the CMS Tracker Visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  11. A chair named Sacco

    1998-01-01

    The Zanotta bag chair "Sacco", designed in 1968 by P. Gatti, C. Paolini and F. Teodoro, has turned thirty. "Sacco" is a bag filled with polystyrene granules. The Zanotta company's inflatable chair "Blow" (1967, Scholari, D'Urbino, Lomazzi, De Pas) also attracted attention. E. Lucie-Smith on them. The Düsseldorf Kunstmuseum exhibition "Legends and Symbols of 1968" is dedicated to the year 1968, exhibiting nearly 500 objects and several reconstructed interiors.

  12. General purpose MDE tools

    Juan Manuel Cueva Lovelle

    2008-12-01

    Full Text Available The MDE paradigm promises to release developers from writing code. The basis of this paradigm consists in working at a level of abstraction that makes it easier for analysts to detail the project to be undertaken. Using the model described by the analysts, software tools do the rest of the task, generating software that complies with the customer's defined requirements. The purpose of this study is to compare the general-purpose tools available right now that enable the principles of this paradigm to be put into practice, aimed at generating a wide variety of applications composed of interactive multimedia and artificial intelligence components.
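
    The code-generation idea at the heart of MDE can be shown in miniature: the analyst writes only a declarative model, and a generator emits the source. The model schema in the following Python sketch is invented:

      # The MDE idea in miniature: the analyst writes only a declarative
      # model; a generator emits the code. The model schema is invented.
      model = {
          "entity": "Customer",
          "fields": [("name", "str"), ("email", "str"), ("age", "int")],
      }

      def generate_class(m):
          lines = [f"class {m['entity']}:"]
          args = ", ".join(f"{n}: {t}" for n, t in m["fields"])
          lines.append(f"    def __init__(self, {args}):")
          lines += [f"        self.{n} = {n}" for n, _ in m["fields"]]
          return "\n".join(lines)

      print(generate_class(model))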

  13. CMS tracker visualization tools

    Mennea, M. S.; Osborne, I.; Regano, A.; Zito, G.

    2005-08-01

    This document will review the design considerations, implementations and performance of the CMS Tracker Visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.


  15. Nitrogen Trading Tool (NTT)

    The Natural Resources Conservation Service (NRCS) recently developed a prototype web-based nitrogen trading tool to facilitate water quality credit trading. The development team has worked closely with the Agriculture Research Service Soil Plant Nutrient Research Unit (ARS-SPNR) and the Environmenta...

  16. HSI Trade Space Tool

    2007-01-01

    The HSI Trade Space Tool (TST) was designed to help HSI practitioners, program managers, and other acquisition professionals visualize the relationships among: the HSI domains of Manpower, Personnel, Training, Human Factors Engineering, System Safety, Survivability, Health Hazards, and Habitability; the dimensions of Cost, Schedule, and Risk; and, the result of Total System Performance.

  17. Linux programming tools unveiled

    Venkateswarlu, N B

    2007-01-01

    In recent years, Linux, a public-domain, freely available Unix variant, has attracted a great deal of attention. Today's complex production environments demand superior application performance. Linux offers extraordinary advantages, such as complete source code access and the availability of exceptional optimization and testing tools. This book explores this facet of Linux.

  18. Rapid Tooling via Stereolithography

    Montgomery, Eva

    2006-01-01

    Approximately three years ago, composite stereolithography (SL) resins were introduced to the marketplace, offering performance features beyond what traditional SL resins could offer. In particular, the high heat deflection temperatures and high stiffness of these highly filled resins have opened the door to several new rapid prototyping (RP) applications, including wind tunnel test modelling and, more recently, rapid tooling.

  19. Clean Cities Tools

    None

    2014-12-19

    The U.S. Department of Energy's Clean Cities offers a large collection of Web-based tools on the Alternative Fuels Data Center. These calculators, interactive maps, and data searches can assist fleets, fuels providers, and other transportation decision makers in their efforts to reduce petroleum use.

  20. Google - Security Testing Tool

    Staykov, Georgi

    2007-01-01

    Using Google as a security testing tool: basic and advanced search techniques using advanced Google search operators. Examples of obtaining control over security cameras, VoIP systems and web servers, and of collecting valuable information such as credit card details and CVV codes, using only Google.

  1. Incident Information Management Tool

    Pejovic, Vladimir

    2015-01-01

    Flaws of current incident information management at CMS and CERN are discussed. A new data model for a future incident database is proposed and briefly described. A recently developed draft version of a GIS-based tool for incident tracking is presented.

  2. Essential marketing tools

    Potter, Ned

    2012-01-01

    This chapter from The Library Marketing Toolkit introduces several essential tools which every library needs as part of their marketing toolkit. These include developing the library website (including a mobile version), utilising word-of-mouth promotion, obtaining and interpreting user feedback, perfecting the elevator pitch, and signs and displays. It includes case studies from David Lee King, Aaron Tay, and Rebecca Jones.

  3. Apple Shuns Tracking Tool

    2011-01-01

    Apple Inc. is advising software developers to stop using a feature in software for its iPhones and iPads that has been linked to privacy concerns, a move that would also take away a widely used tool for tracking users and their behavior. Developers who write programs for Apple's iOS operating system have been using a unique...

  4. Verification of Simulation Tools

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met: - An auditable quality assurance process complying with international development standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or a Quality Assurance report. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  5. Risk Management Implementation Tool

    Wright, Shayla L.

    2004-01-01

    Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making: continually assessing what could go wrong, determining which risks are important to deal with, implementing strategies to deal with those risks, and measuring the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control. With these steps and the various methods and tools that go along with them, identifying and dealing with risk is clear-cut. The office I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals and purposes of this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA for managing risk, and to take an aggressive approach to advertising and advocating the use of RMIT at each NASA center.
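
    The analysis step of CRM is commonly supported by scoring each risk on a likelihood-times-consequence matrix. The Python sketch below illustrates such scoring; the 5x5 scale, thresholds and example risks are invented and are not NASA's actual criteria:

      # Likelihood x consequence scoring on a 5x5 scale. The thresholds
      # and example risks are invented, not NASA's actual criteria.
      risks = [
          {"title": "sensor data dropout", "likelihood": 4, "consequence": 3},
          {"title": "late software delivery", "likelihood": 2, "consequence": 4},
          {"title": "budget overrun", "likelihood": 3, "consequence": 2},
      ]

      def classify(score):
          return "high" if score >= 12 else "medium" if score >= 6 else "low"

      for r in sorted(risks, key=lambda r: -r["likelihood"] * r["consequence"]):
          score = r["likelihood"] * r["consequence"]
          print(f"{r['title']:25s} score={score:2d}  {classify(score)}")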

  6. Spray-formed tooling

    McHugh, K. M.; Key, J. F.

    The United States Council for Automotive Research (USCAR) has formed a partnership with the Idaho National Engineering Laboratory (INEL) to develop a process for the rapid production of low-cost tooling based on spray forming technology developed at the INEL. Phase 1 of the program will involve bench-scale system development, materials characterization, and process optimization. In Phase 2, prototype systems will be designed, constructed, evaluated, and optimized. Process control and other issues that influence commercialization will be addressed during this phase of the project. Technology transfer to USCAR, or a tooling vendor selected by USCAR, will be accomplished during Phase 3. The approach INEL is using to produce tooling, such as plastic injection molds and stamping dies, combines rapid solidification processing and net-shape materials processing into a single step. A bulk liquid metal is pressure-fed into a de Laval spray nozzle transporting a high velocity, high temperature inert gas. The gas jet disintegrates the metal into fine droplets and deposits them onto a tool pattern made from materials such as plastic, wax, clay, ceramics, and metals. The approach is compatible with solid freeform fabrication techniques such as stereolithography, selective laser sintering, and laminated object manufacturing. Heat is extracted rapidly, in-flight, by convection as the spray jet entrains cool inert gas to produce undercooled and semi-solid droplets. At the pattern, the droplets weld together while replicating the shape and surface features of the pattern. Tool formation is rapid; deposition rates in excess of 1 ton/h have been demonstrated for bench-scale nozzles.

  7. Applied Academics. Applied Mathematics: Drafting. Curriculum Bulletin VE-53.

    Cincinnati Public Schools, OH. Div. of Student Services.

    This publication contains the Applied Mathematics Curriculum (Drafting) for grades 11 and 12 for the Cincinnati (Ohio) Public Schools. The curriculum is part of a larger program (the Applied Academics Program), which emphasizes the integration of mathematics and the language arts with vocational content. Included in the document is a description…

  8. A Review of Applied Mathematics

    Ó Náraigh, Lennon; Ní Shúilleabháin, Aoibhinn

    2015-01-01

    Applied Mathematics is a subject which deals with problems arising in the physical, life, and social sciences as well as in engineering, and provides a broad body of knowledge for use in a wide spectrum of research and industry. Applied Mathematics is an important school subject which builds students' mathematical and problem-solving skills. The subject has remained on the periphery of school time-tables and, without the commitment and enthusiasm of Applied Maths teachers, would likely be omit...

  9. Applied Ethics in Nowadays Society

    Tomita CIULEI

    2013-12-01

    Full Text Available This special issue is dedicated to Nowadays Applied Ethics in Society, and falls in the field of social sciences and humanities, hosting both theoretical approaches and empirical research in various areas of applied ethics. Applied ethics analyzes a series of concrete moral situations of social or professional practice in order to make/adopt decisions. In the field of applied ethics are integrated medical ethics, legal ethics, media ethics, professional ethics, environmental ethics, business ethics etc. Classification-JEL: A23

  10. Using corporate social responsibility tools while forming the enterprise internationalization marketing strategy

    Ilchuk, P. H.; Мuzhelyak, М. М.; Кots, О. О.

    2014-01-01

    The article investigates the possibility of improving the marketing strategy of Ukrainian enterprises by using new tools, among them the tools of corporate social responsibility. The definition of the corporate social responsibility concept is generalized. The main tools of corporate social responsibility, as well as the relationship between these tools and the enterprise internationalization marketing strategy, are determined. The feasibility of applying corporate social responsibility tools in...

  11. An intelligent condition monitoring system for on-line classification of machine tool wear

    Fu Pan; Hope, A.D.; Javed, M. [Systems Engineering Faculty, Southampton Institute (United Kingdom)

    1997-12-31

    The development of intelligent tool condition monitoring systems is a necessary requirement for successful automation of manufacturing processes. This presentation introduces a tool wear monitoring system for milling operations. The system utilizes power, force, acoustic emission and vibration sensors to monitor tool condition comprehensively. Features relevant to tool wear are drawn from time and frequency domain signals and a fuzzy pattern recognition technique is applied to combine the multisensor information and provide reliable classification results of tool wear states. (orig.) 10 refs.

  12. EVALUATION OF MACHINE TOOL QUALITY

    Ivan Kuric

    2011-01-01

    The paper deals with aspects of the quality and accuracy of machine tools. As machine tool accuracy is a key factor in product quality, it is important to know the methods for evaluating the quality and accuracy of machine tools. Several aspects of machine tool diagnostics are described, such as aspects of reliability.

  13. Tool use by aquatic animals.

    Mann, Janet; Patterson, Eric M

    2013-11-19

    Tool-use research has focused primarily on land-based animals, with less consideration given to aquatic animals and the environmental challenges and conditions they face. Here, we review aquatic tool use and examine the contributing ecological, physiological, cognitive and social factors. Tool use among aquatic animals is rare but taxonomically diverse, occurring in fish, cephalopods, mammals, crabs, urchins and possibly gastropods. While additional research is required, the scarcity of tool use can likely be attributable to the characteristics of aquatic habitats, which are generally not conducive to tool use. Nonetheless, studying tool use by aquatic animals provides insights into the conditions that promote and inhibit tool-use behaviour across biomes. Like land-based tool users, aquatic animals tend to find tools on the substrate and use tools during foraging. However, unlike on land, tool users in water often use other animals (and their products) and water itself as a tool. Among sea otters and dolphins, the two aquatic tool users studied in greatest detail, some individuals specialize in tool use, which is vertically socially transmitted possibly because of their long dependency periods. In all, the contrasts between aquatic- and land-based tool users enlighten our understanding of the adaptive value of tool-use behaviour. PMID:24101631

  14. OPERATIONS MANAGEMENT TOOLS IN BRAZILIAN SMALL COMPANIES

    Tonny Kerley de Alencar Rodrigues; Átila de Melo Lira; Irenilza de Alencar Naas

    2014-01-01

    The objective of this research was to characterize small Brazilian companies with regard to their knowledge of operations management tools that help improve the administrative process in these organizations. For that, we chose a more positivist strand which values quantitative aspects. Research can be descriptive and explanatory, applied and/or interventionist. As for means, research can be classified as documentary, bibliographic and/or participant. The population for this study is composed ...

  15. Dynamic optimization case studies in DYNOPT tool

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    Dynamic programming is typically applied to optimization problems. As analytical solutions are generally very difficult, dedicated software tools are widely used. These software packages are often third-party products bound to standard simulation software tools on the market. As typical examples of such tools, TOMLAB and DYNOPT can be effectively applied to the solution of dynamic programming problems. DYNOPT is presented in this paper due to its licensing policy (a free product under the GPL) and simplicity of use. DYNOPT is a set of MATLAB functions for the determination of an optimal control trajectory, given a description of the process, the cost to be minimized, and equality and inequality constraints, using orthogonal collocation on finite elements. The actual optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the optimized dynamic model may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT in the field of dynamic optimization problems by means of case studies regarding chosen laboratory physical educational models.
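
    DYNOPT itself is a MATLAB package built on orthogonal collocation; as a rough illustration of the same direct-transcription idea, the following Python/SciPy sketch solves a toy double-integrator problem (minimize the integral of u^2, drive the state from 0 to 1) with Euler defect constraints. It is an analogy, not DYNOPT:

      import numpy as np
      from scipy.optimize import minimize

      # Direct transcription of a toy problem: double integrator x'' = u,
      # minimize the integral of u^2, drive x from 0 to 1 with zero end
      # velocity. A crude Euler analogue of collocation; not DYNOPT itself.
      N, T = 20, 1.0
      h = T / N

      def unpack(z):
          return z[:N + 1], z[N + 1:2 * (N + 1)], z[2 * (N + 1):]

      def objective(z):
          _, _, u = unpack(z)
          return h * np.sum(u**2)

      def dynamics(z):                   # Euler defects must vanish
          x, v, u = unpack(z)
          dx = x[1:] - x[:-1] - h * v[:-1]
          dv = v[1:] - v[:-1] - h * u
          return np.concatenate([dx, dv])

      def boundary(z):
          x, v, _ = unpack(z)
          return np.array([x[0], v[0], x[-1] - 1.0, v[-1]])

      res = minimize(objective, np.zeros(3 * N + 2), method="SLSQP",
                     constraints=[{"type": "eq", "fun": dynamics},
                                  {"type": "eq", "fun": boundary}])
      print("optimal cost:", res.fun)  # continuous-time optimum is 12 here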

  16. A Tool for Safety Officers Investigating "Simple" Accidents

    Jørgensen, Kirsten

    2010-01-01

    accidents normally caused by apparent banalities occur much more frequently and with a higher rate of fatalities, disablements and other serious injuries than the ostensibly most dangerous kinds of accidents. In 1999 a practical tool for use by safety officers was developed; this tool is based on the investigation methods applied in major accidents, but comprises a simpler and more user-friendly presentation. The tool involves three steps: mapping the facts, analysing the events, and developing preventive solutions. Practical application of the tool has shown that it affords managers and workers...

  17. The Development of a Climate Time Line Information Tool

    Kowal, D.; McCaffery, M.; Anderson, D.; Habermann, D. E.

    2001-12-01

    The "Climate Time Line" or CTL tool currently in development at the National Geophysical Data Center will provide a climatic and "place-based" context for current weather patterns and a pre-instrumental context for current climate trends. Two audiences-GLOBE students and water managers involved with the Western Water Assessment--are targeted in the pilot project phase to test the CTL as a learning and decision-making support tool. Weather, climate and paleoclimatic observations will be integrated through a web-based interface that can be used for comparing data collected over 10 year, 100 year and 1000+ year periods, and made accessible and meaningful to non-technical users. The Climate Time Line prototype will include the following features: 1) Access to diverse data sets such as NCDC's Historic Climate Network, GLOBE Student Data Archive, World Data Center for Paleoclimatology and historical streamflow data from the USGS; 2) Map Locator/Search Utility for regional inquiries and comparison views; 3) Varying temporal and spatial displays; 4) Tutorial and help sections to guide and support users; 5) Supporting materials including a "Powers of Ten" primer examining variability at various timescales; and 6) Statistical assessment tools. The CTL prototype offers a novel approach in the scientific analysis of climate and hydrology data. It will facilitate inquiries by simplifying access to environmental data. Additionally, it will provide historical timelines for the intended user to compare the development of human cultures in relation to climate trends and variability--promoting an inquiry-rich learning environment. Throughout the pilot project phase, the CTL will undergo evaluation particularly in the area of usability, followed by a pre- and post- assessment of its educational impact on the targeted, non-technical audience. A hypernews workspace has been created to facilitate the development of the CTL. >http://HyperNews.ngdc.noaa.gov/HyperNews/get/ ClimateTimelineProject.html.

  18. Automated Standard Hazard Tool

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. Results of this project's success are outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  19. Avionics System Architecture Tool

    Chau, Savio; Hall, Ronald; Traylor, marcus; Whitfield, Adrian

    2005-01-01

    Avionics System Architecture Tool (ASAT) is a computer program intended for use during the avionics-system-architecture-design phase of the process of designing a spacecraft for a specific mission. ASAT enables simulation of the dynamics of the command-and-data-handling functions of the spacecraft avionics in the scenarios in which the spacecraft is expected to operate. ASAT is built upon I-Logix Statemate MAGNUM, providing a complement of dynamic system modeling tools, including a graphical user interface (GUI), model-checking capabilities, and a simulation engine. ASAT augments this with a library of predefined avionics components and additional software to support building and analyzing avionics hardware architectures using these components.

  20. Remote vehicle survey tool

    The Remote Vehicle Survey Tool (RVST) is a color graphical display tool for viewing remotely acquired scientific data. The RVST displays the data in the form of a color two-dimensional world model map. The world model map allows movement of the remote vehicle to be tracked by the operator and the data from sensors to be graphically depicted in the interface. Linear and logarithmic meters, dual-channel oscilloscopes, and directional compasses are used to display sensor information. The RVST is user-configurable by the use of ASCII text files. The operator can configure the RVST to work with any remote data acquisition system and teleoperated or autonomous vehicle. The modular design of the RVST and its ability to be quickly configured for varying system requirements make the RVST ideal for remote scientific data display in all environmental restoration and waste management programs.

  1. Algal functional annotation tool

    Lopez, D. [UCLA; Casero, D. [UCLA; Cokus, S. J. [UCLA; Merchant, S. S. [UCLA; Pellegrini, M. [UCLA

    2012-07-01

    The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of genes on KEGG pathway maps and batch gene identifier conversion.
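
    Functional-term enrichment of a gene list is conventionally computed with a hypergeometric test; the record does not specify the tool's exact statistics, so the Python sketch below, with invented counts, is only an illustration of that standard approach:

      from scipy.stats import hypergeom

      # Hypergeometric term-enrichment test with invented counts.
      M = 15000   # annotated genes in the genome
      n = 300     # genome genes carrying the term of interest
      N = 200     # genes in the user's list
      k = 12      # list genes carrying the term

      p_value = hypergeom.sf(k - 1, M, n, N)   # P(X >= k)
      print(f"enrichment p-value: {p_value:.3g}")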

  2. LIKWID: Lightweight Performance Tools

    Treibig, Jan; Wellein, Gerhard

    2011-01-01

    Exploiting the performance of today's microprocessors requires intimate knowledge of the microarchitecture as well as an awareness of the ever-growing complexity in thread and cache topology. LIKWID is a set of command line utilities that addresses four key problems: Probing the thread and cache topology of a shared-memory node, enforcing thread-core affinity on a program, measuring performance counter metrics, and microbenchmarking for reliable upper performance bounds. Moreover, it includes an mpirun wrapper allowing for portable thread-core affinity in MPI and hybrid MPI/threaded applications. To demonstrate the capabilities of the tool set we show the influence of thread affinity on performance using the well-known OpenMP STREAM triad benchmark, use hardware counter tools to study the performance of a stencil code, and finally show how to detect bandwidth problems on ccNUMA-based compute nodes.

  3. Automatically-Programed Machine Tools

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, an acronym for Automatically Programed Tools, is among the most widely used software systems for computerized machine tools. APT was developed for the explicit purpose of providing an effective software system for programing NC machine tools. The APT system includes a specification of the APT programing language and a language processor, which executes APT statements and generates the NC machine-tool motions specified by the APT statements.

  4. Java Vertexing Tools

    This document describes the implementation of the topological vertex finding algorithm ZVTOP within the org.lcsim reconstruction and analysis framework. At present, the Java vertexing tools allow users to perform topological vertexing on tracks that have been obtained from a Fast MC simulation. An implementation that will be able to handle fully reconstructed events is being designed from the ground up for longevity and maintainability.

  5. Communication tools in Canada

    This document deals with the means and tools that are used for communicating with elected representatives. First, messages need to be simple, few in number and accurate. It is also advisable to seek a first briefing because there is information to be given, not because help is needed. On top of that, contacts should be made often enough to assure continued interest. (TEC)

  6. C-TOOL

    Taghizadeh-Toosi, Arezoo; Christensen, Bent Tolstrup; Hutchings, Nicholas John;

    2014-01-01

    Soil organic carbon (SOC) is a significant component of the global carbon (C) cycle. Changes in SOC storage affect atmospheric CO2 concentrations on decadal to centennial timescales. The C-TOOL model was developed to simulate farm- and regional-scale effects of management on medium- to long-term SOC storage. ... proper model evaluation. Experimental verification of management effects on subsoil C storage, subsoil C inputs from roots, and vertical transport of C in the soil profile remains a prioritised research area.
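
    Models of this kind typically represent SOC as a small number of pools decomposing by first-order kinetics, with fresh inputs entering a fast pool and a fraction of the decomposed material being humified into a slower pool. The sketch below implements that generic structure; the pool layout, rate constants, and humification fraction are illustrative assumptions, not C-TOOL's actual parameterization.

    # Minimal sketch of a two-pool, first-order SOC turnover model of the
    # general family C-TOOL belongs to. All parameter values are illustrative.

    def simulate_soc(c_input, years, k_fast=0.12, k_slow=0.006, h=0.3,
                     fast0=10.0, slow0=60.0):
        """c_input: annual C input (t C/ha/yr); returns total SOC by year."""
        fast, slow = fast0, slow0
        trajectory = []
        for _ in range(years):
            decomposed = k_fast * fast
            fast += c_input - decomposed             # inputs enter the fast pool
            slow += h * decomposed - k_slow * slow   # humified fraction moves on
            trajectory.append(fast + slow)
        return trajectory

    soc = simulate_soc(c_input=2.0, years=100)
    print(f"SOC after 100 years: {soc[-1]:.1f} t C/ha")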

  7. Channel nut tool

    Olson, Marvin

    2016-01-12

    A method, system, and apparatus for installing channel nuts includes a shank, a handle formed on the first end of the shank, and an end piece, formed on the second end of the shank, with a threaded shaft configured to receive a channel nut. The tool can be used to insert or remove a channel nut in a channel framing system and can then be removed from the channel nut.

  8. Friction Stir Weld Tools

    Carter, Robert W. (Inventor); Payton, Lewis N. (Inventor)

    2007-01-01

    A friction stir weld tool sleeve is supported by an underlying support pin. The pin material is preferably selected for toughness and fracture characteristics. The pin sleeve preferably has a geometry employing an interrupted thread, a plurality of flutes, and/or an eccentric path to provide greater material flow-through. Paddles have been found to assist in imparting friction and directing plastic metal during the welding process.

  9. Program Management Tool

    Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil

    2007-01-01

    The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include monthly reports as well as data on goals, deliverables, milestones, business processes, personnel, task plans, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Applications (VBA) libraries. The PMT has an extensible architecture that enables integration of the PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC). The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity ...

  10. Open ICT tools project

    Turnock, Chris; Bohemia, Erik; Woodhouse, Jed; Smith, Neil; Lovatt, Ben

    2009-01-01

    The paper will introduce a project titled ‘Open ICT Tools’, which aims to explore and trial ICT tools to facilitate globally collaborative and secure engagement with external business and community partners. The challenge is to facilitate communication and multimedia data exchange between Northumbria University and participating external educational and business organisations without compromising the security of either the Northumbria University IT infrastructure or that of the partner ...

  11. Applied probability and stochastic processes

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  12. Frequency Response Analysis Tool

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received much attention at the national level in recent years, culminating in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report describes the work conducted by Pacific Northwest National Laboratory (PNNL), in collaboration with the Bonneville Power Administration and the Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS), to develop a frequency response analysis tool (FRAT). The document provides the details of the methodology and main features of the FRAT. The tool manages a database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and a balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating the NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
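
    At its core, the frequency response measure is the change in an area's net MW output divided by the change in interconnection frequency, conventionally reported in MW per 0.1 Hz. The sketch below computes such a measure from time-series data; the averaging windows stand in for the standard's pre-event "Value A" and post-event "Value B" points and are simplified relative to the actual BAL-003-1 form instructions.

    # Minimal sketch of a frequency-response-measure calculation in the spirit
    # of NERC BAL-003-1. Window bounds are simplified assumptions, not the
    # standard's exact Value A / Value B definitions.

    import numpy as np

    def frm(freq_hz, net_mw, t, t_event, pre=(-16.0, 0.0), post=(20.0, 52.0)):
        """freq_hz, net_mw, t: equal-length arrays; t and windows in seconds."""
        pre_mask = (t >= t_event + pre[0]) & (t < t_event + pre[1])
        post_mask = (t >= t_event + post[0]) & (t <= t_event + post[1])
        delta_f = freq_hz[post_mask].mean() - freq_hz[pre_mask].mean()
        delta_mw = net_mw[post_mask].mean() - net_mw[pre_mask].mean()
        return delta_mw / (delta_f / 0.1)   # MW per 0.1 Hz; sign conventions vary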

  13. Balancing the tools

    Leroyer, Patrick

    The purpose of this article is to describe the potential of a new combination of functions in lexicographic tools for tourists. So far lexicography has focused on the communicative information needs of tourists, i.e. helping tourists decide what to say in a number of specific tourist situations, in other words communicative functions. However, this kind of help should not stand alone. It is argued that tourists also have experiential information needs that are lexicographically relevant. These needs can be satisfied by lexicographic tools that help tourists decide what to do in various specific tourist situations, wherever and whenever needed. It is demonstrated how this type of objective knowledge, which is conventionally represented in tourist guides and on tourist web sites, could benefit from being arranged in a lexicographic design.

  14. Dynamic Contingency Analysis Tool

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: it uses a hybrid dynamic and steady-state approach to simulating cascading outage sequences that includes fast dynamic and slower steady-state events; it integrates dynamic models with protection scheme models for generation, transmission, and load; and it models special protection systems (SPSs)/remedial action schemes (RASs) as well as automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
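
    The hybrid approach described above alternates a fast dynamic phase for the disturbance itself with steady-state solutions for the slower events, re-evaluating protection after each pass until no further elements trip. The skeleton below sketches that loop; the four simulation stages are passed in as callables because the record gives no details of PSS/E's programming interface, and no claim is made about it here.

    # Minimal sketch of a hybrid cascading-outage loop. The four callables are
    # hypothetical placeholders for stages the real DCAT performs via PSS/E.

    def simulate_cascade(case, initiating_events, run_dynamics,
                         solve_steady_state, apply_corrective_actions,
                         evaluate_protection, max_steps=50):
        sequence = []
        events = list(initiating_events)
        for _ in range(max_steps):
            if not events:
                break                                 # cascade has settled
            case = run_dynamics(case, events)         # fast dynamic phase
            case = solve_steady_state(case)           # slower steady-state phase
            case = apply_corrective_actions(case)     # SPS/RAS, manual actions
            events = evaluate_protection(case)        # newly tripped elements
            sequence.extend(events)
        return sequence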

  15. Distributed computing applied to the identification of new drugs

    Isea, Raul; Mayo, Rafael

    2010-01-01

    This work emphasizes the benefits of implementing distributed computing for computationally intensive work in the sciences devoted to the search for new medicines that could be applied to public health problems.

  16. Algal functional annotation tool

    2012-07-12

    BACKGROUND: Progress in genome sequencing is proceeding at an exponential pace, and several new algal genomes are becoming available every year. One of the challenges facing the community is the association of protein sequences encoded in the genomes with biological function. While most genome assembly projects generate annotations for predicted protein sequences, they are usually limited and integrate functional terms from only a small number of databases. Another challenge is the use of annotations to interpret large lists of 'interesting' genes generated by genome-scale datasets. Previously, these gene lists had to be analyzed across several independent biological databases, often on a gene-by-gene basis. In contrast, several annotation databases, such as DAVID, integrate data from multiple functional databases and reveal underlying biological themes of large gene lists. While several such databases have been constructed for animals, none is currently available for the study of algae. Due to renewed interest in algae as potential sources of biofuels and the emergence of multiple algal genome sequences, a significant need has arisen for such a database to process the growing compendiums of algal genomic data. DESCRIPTION: The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii and will include additional genomes in the future. The site allows users to interpret large gene lists by identifying associated functional terms and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of genes on KEGG pathway maps and batch gene identifier conversion.

  17. Channel and floodplain change analysis over a 100-year period : Lower Yuba River, California

    Rolf Aalto; L. Allan James; Michael B. Singer; Subhajit Ghoshal

    2010-01-01

    Hydraulic gold mining in the Sierra Nevada, California (1853–1884) displaced ~1.1 billion m3 of sediment from upland placer gravels; the sediment was deposited along piedmont rivers below dams, where floods can remobilize it. This study uses topographic and planimetric data from detailed 1906 topographic maps, 1999 photogrammetric data, and pre- and post-flood aerial photographs to document historic sediment erosion and deposition along the lower Yuba River due to individual floods at the reach scale...

  18. How Earth works 100 years after Wegener's continental drift theory and IGCP 648

    Li, Z. X.; Evans, D. A.; Zhong, S.; Eglington, B. M.

    2015-12-01

    It took half a century for Wegener's continental drift theory to be accepted as a fundamental element of plate tectonic theory. Another half-century on, we are still unsure of the driving mechanism for plate tectonics: is it dominated by thermal convection, by gravitational forces, or by a combination of mechanisms? Nonetheless, breakthroughs of the past decades put us in a position to make a major stride in answering this question. These include: (1) the widely accepted cyclic occurrence of supercontinent assembly and break-up (whereas random occurrence of supercontinents was an equal possibility in the 1990s); (2) the discovery of two equatorial and antipodal large low seismic velocity provinces (LLSVPs) that dominate the lower mantle and appear to have been the base for almost all mantle plumes since at least the Mesozoic, and of subduction of oceanic slabs all the way to the core-mantle boundary, which together suggest whole-mantle convection; (3) the recognition of true polar wander (TPW) as an important process in Earth history, likely reflecting Earth's major internal mass redistribution events; and (4) rapidly increasing computational power enabling us to simulate all aspects of Earth's dynamic inner workings. Many new yet often controversial ideas have been proposed, such as a possible coupling in time (with an offset) and space between the supercontinent cycle and superplume (LLSVP) events, which opposes the idea of static and long-lived LLSVPs, and the orthoversion vs. introversion or extroversion models for supercontinent transitions. To fully utilise these advances, as well as the rapidly expanding global geoscience databases, to address the question of how Earth works, the UNESCO-IUGS-sponsored IGCP project No. 648 was formed to coordinate a global cross-disciplinary effort. We aim to achieve a better understanding of the supercontinent cycle and to examine the relationship between the supercontinent cycle and global plume events. We will establish a series of global geological and geophysical databases to enable the geoscience community to make data-rich visual paleogeographic reconstructions using software such as GPlates. In addition, the project will bring the geotectonic and geodynamic modelling communities together to test global geodynamic models in geological deep time.

  19. A CTE Legacy Built on Chocolate: Milton Hershey School's 100 Years

    Kemmery, Robert

    2010-01-01

    One hundred years ago, chocolate magnate Milton S. Hershey and his wife Catherine signed the deed of trust creating the Hershey Industrial School in the heart of their Pennsylvania farming community. They had no children of their own and wanted to help orphan boys get a good education. The couple eventually left their entire fortune to the school.…

  20. Playback: from the Victrola to MP3, 100 years of music, machines, and money

    Coleman, Mark

    2009-01-01

    ""Playback is the first book to place the fascinating history of sound reproduction within its larger social, economic, and cultural context-and includes appearances by everyone from Thomas Edison to En""