WorldWideScience

Sample records for 100-year tool applied

  1. Solving the Supreme Problem: 100 years of selection and recruitment at the Journal of Applied Psychology.

    Science.gov (United States)

    Ployhart, Robert E; Schmitt, Neal; Tippins, Nancy T

    2017-03-01

This article reviews 100 years of research on recruitment and selection published in the Journal of Applied Psychology. Recruitment and selection research has been present in the Journal from the very first issue, where Hall (1917) suggested that the challenge of recruitment and selection was the Supreme Problem facing the field of applied psychology. As this article shows, the various topics related to recruitment and selection have ebbed and flowed over the years in response to business, legal, and societal changes, but this Supreme Problem has captivated the attention of scientist-practitioners for a century. Our review starts by identifying the practical challenges and macro forces that shaped the science of recruitment and selection and helped to define the research questions the field has addressed. We then describe the evolution of recruitment and selection research and the ways the resulting scientific advancements have contributed to staffing practices. We conclude with speculations on how recruitment and selection research may proceed in the future. Supplemental material posted online provides additional depth by including a summary of practice challenges and scientific advancements that affected the direction of selection and recruitment research, along with an outline of seminal articles published in the Journal and a corresponding timeline. The 100-year anniversary of the Journal of Applied Psychology is very much a celebration of recruitment and selection research, although predictions about the future suggest there is still much exciting work to be done.

  2. 100 years of applied psychology research on individual careers: From career management to retirement.

    Science.gov (United States)

    Wang, Mo; Wanberg, Connie R

    2017-03-01

This article surveys 100 years of research on career management and retirement, with a primary focus on work published in the Journal of Applied Psychology. Research on career management took off in the 1920s, with most attention devoted to the development and validation of career interest inventories. Over time, research expanded to attend to broader issues such as the predictors and outcomes of career interests and choice; the nature of career success and who achieves it; career transitions and adaptability to change; retirement decision making and adjustment; and bridge employment. In this article, we provide a timeline for the evolution of the career management and retirement literature, review major theoretical perspectives and findings on career management and retirement, and discuss important future research directions.

  3. 100 years of superconductivity

    CERN Document Server

    Rogalla, Horst

    2011-01-01

    Even a hundred years after its discovery, superconductivity continues to bring us new surprises, from superconducting magnets used in MRI to quantum detectors in electronics. 100 Years of Superconductivity presents a comprehensive collection of topics on nearly all the subdisciplines of superconductivity. Tracing the historical developments in superconductivity, the book includes contributions from many pioneers who are responsible for important steps forward in the field.The text first discusses interesting stories of the discovery and gradual progress of theory and experimentation. Emphasizi

  4. [Osteology--100 years].

    Science.gov (United States)

    Götte, S

    2001-10-01

As is the case for many other subspecialties of medical science, osteology has developed in tandem with technological progress over the last 100 years. The discovery of X-rays made visualization of the skeletal system possible. Progress in surgery and hygiene permitted examination and treatment of bones in vivo. Optical techniques made it possible to gain insight into the microarchitecture of the bone. Chemistry and biochemistry opened the door for pathophysiology and microcellular assessment of the bone, so that modern osteology deals with interventions in cellular mechanisms, in particular for the treatment of bone diseases. The realization that bone represents a dynamic tissue, characterized by processes of generation and degeneration, was decisive. These insights have a profound influence on the treatment of osteoporosis. Questions pertaining to osteology have been subject to heightened interdisciplinary debate in the past few years, which is reflected in interdisciplinary associations and co-operative groups, and ultimately the umbrella Society of Osteology. Contemplation of the subject from an interdisciplinary viewpoint shows what a significant and natural role orthopedics plays in research on bone metabolism, but also in the treatment of bone diseases. Interdisciplinary cooperation aids quality control and is also reflected in the formulation of common guidelines for the clinical picture of osteoporosis, a disease of major epidemiological significance.

  5. 100 years of Philips Research

    Science.gov (United States)

    van Delft, Dirk

    2014-03-01

    On Thursday 23 October 1913, a Dutch newspaper published the following advertisement: Hiring: A capable young scientist with a doctorate in physics. Must be a good experimenter. Letters containing information on age, life history and references may be submitted to Philips in Eindhoven. Two days later, a candidate applied: Gilles Holst. At that time, Holst was working in Leiden as an assistant to Heike Kamerlingh Onnes, a recent Nobel Prize winner.

  6. 100-Year Floodplains, 100 Year Floodplains, Published in 2007, Not Applicable scale, Dunn County, WI.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at Not Applicable scale, was produced all or in part from LIDAR information as of 2007. It is described as '100 Year...

  7. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
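
The record above presents least squares as a working research tool; as a minimal illustration (synthetic data and parameter values are assumed here, not taken from the book), a straight-line least squares fit in Python:

```python
import numpy as np

# Synthetic data: noisy observations around an assumed true line y = 2x + 1.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# Ordinary least squares fit of a degree-1 polynomial: np.polyfit minimizes
# the sum of squared residuals and returns coefficients, highest degree first.
slope, intercept = np.polyfit(x, y, deg=1)

# The estimates should recover the assumed parameters to within sampling noise.
print(f"slope={slope:.2f}, intercept={intercept:.2f}")
```

With only 50 noisy points the fitted coefficients land close to the assumed values of 2 and 1, which is the sense in which the blurb calls least squares a powerful research tool when used appropriately.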

  8. Energizer keep going: 100 years of superconductivity

    Institute of Scientific and Technical Information of China (English)

    Pengcheng Dai; Xing-jiang Zhou; Dao-xin Yao

    2011-01-01

It has been 100 years since Heike Kamerlingh Onnes discovered superconductivity on April 8, 1911. Amazingly, this field is still very active and keeps booming, like magic. A lot of new phenomena and materials have been found, and superconductors have been used in many different fields to improve our lives. Onnes won the Nobel Prize for this incredible discovery in 1913 and used the word superconductivity for the first time. Onnes believed that quantum mechanics would explain the effect, but he could not produce a theory at that time. Now we know superconductivity is a macroscopic quantum phenomenon.

  9. Analysis of 100 Years of Curriculum Designs

    Directory of Open Access Journals (Sweden)

    Lynn Kelting-Gibson

    2013-01-01

Fifteen historical and contemporary curriculum designs were analyzed for elements of assessment that support student learning and inform instructional decisions. Educational researchers are purposely paying attention to the role assessment plays in a well-designed planning and teaching process. Assessment is a vital component to educational planning and teaching because it is a way to gather accurate evidence of student learning and information to inform instructional decisions. The purpose of this review was to analyze 100 years of curriculum designs to uncover elements of assessment that will support teachers in their desire to improve student learning. Throughout this research the author seeks to answer the question: Do historical and contemporary curriculum designs include elements of assessment that help teachers improve student learning? The results of the review reveal that assessment elements were addressed in all of the curricular designs analyzed, but not all elements of assessment were identified using similar terminology. Based on the analysis of this review, it is suggested that teachers not be particular about the terminology used to describe assessment elements, as all curriculum models discussed use one or more elements similar to the context of pre, formative, and summative assessments.

  10. Healthcare, molecular tools and applied genome research.

    Science.gov (United States)

    Groves, M

    2000-11-01

Biotechnology 2000 offered a rare opportunity for scientists from academia and industry to present and discuss data in fields as diverse as environmental biotechnology and applied genome research. The healthcare section of the meeting encompassed a number of gene therapy delivery systems that are successfully treating genetic disorders. Beta-thalassemia is being corrected in mice by continuous erythropoietin delivery from engineered muscle cells, and from naked DNA electrotransfer into muscles, as described by Dr JM Heard (Institut Pasteur, Paris, France). Dr Reszka (Max-Delbrueck-Centrum fuer Molekulare Medizin, Berlin, Germany), meanwhile, described a treatment for liver metastasis in the form of a drug carrier embolization system, DCES (Max-Delbrueck-Centrum fuer Molekulare Medizin), composed of surface-modified liposomes and a substance for chemo-occlusion, which drastically reduces the blood supply to the tumor and promotes apoptosis, necrosis and antiangiogenesis. In the molecular tools section, Willem Stemmer (Maxygen Inc, Redwood City, CA, USA) gave an insight into the importance that techniques such as molecular breeding (DNA shuffling) have in the evolution of molecules with improved function, over a range of fields including pharmaceuticals, vaccines, agriculture and chemicals. Technologies such as ribosome display, which can incorporate the evolution and the specific enrichment of proteins/peptides in cycles of selection, could play an enormous role in the production of novel therapeutics and diagnostics in future years, as explained by Andreas Plückthun (Institute of Biochemistry, University of Zurich, Switzerland). In applied genome research, technologies such as 'in vitro expression cloning', described by Dr Zwick (Promega Corp, Madison, WI, USA), are providing a functional analysis for the overwhelming flow of data emerging from high-throughput sequencing of genomes and from high-density gene expression microarrays (DNA chips).

  11. Beam Line: 100 years of elementary particles

    Science.gov (United States)

    Pais, A.; Weinberg, S.; Quigg, C.; Riordan, M.; Panofsky, W. K. H.

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  12. Opening the 100-Year Window for Time Domain Astronomy

    CERN Document Server

    Grindlay, Jonathan; Los, Edward; Servillat, Mathieu

    2012-01-01

    The large-scale surveys such as PTF, CRTS and Pan-STARRS-1 that have emerged within the past 5 years or so employ digital databases and modern analysis tools to accentuate research into Time Domain Astronomy (TDA). Preparations are underway for LSST which, in another 6 years, will usher in the second decade of modern TDA. By that time the Digital Access to a Sky Century @ Harvard (DASCH) project will have made available to the community the full sky Historical TDA database and digitized images for a century (1890--1990) of coverage. We describe the current DASCH development and some initial results, and outline plans for the "production scanning" phase and data distribution which is to begin in 2012. That will open a 100-year window into temporal astrophysics, revealing rare transients and (especially) astrophysical phenomena that vary on time-scales of a decade. It will also provide context and archival comparisons for the deeper modern surveys

  13. Advances of Bioinformatics Tools Applied in Virus Epitopes Prediction

    Institute of Scientific and Technical Information of China (English)

    Ping Chen; Simon Rayner; Kang-hong Hu

    2011-01-01

In recent years, in silico epitope prediction tools have significantly facilitated the progress of vaccine development, and many have been applied successfully to predict epitopes in viruses. Herein, a general overview of the different tools currently available, including T cell and B cell epitope prediction tools, is presented, and the principles of the different prediction algorithms are reviewed briefly. Finally, several examples are presented to illustrate the application of the prediction tools.

  14. The Royal Aircraft Establishment - 100 Years of Research.

    Science.gov (United States)

    1981-10-02

[OCR residue from the scanned report cover; recoverable details: A. J. Smith, "The Royal Aircraft Establishment - 100 years of research", RAE-TN-FS-432, DRIC-BR-80894, unclassified.]

  15. Spring wheat gliadins: Have they changed in 100 years?

    Science.gov (United States)

    There have been many hard red spring (HRS) wheat cultivars released in North Dakota during the last 100 years. These cultivars have been improved for various characteristics such as, adaptation to weather conditions, high yield, and good milling and baking quality. The objectives of this study wer...

  16. Bacteriophages, revitalized after 100 years in the shadow of antibiotics

    Institute of Scientific and Technical Information of China (English)

    Hongping; Wei

    2015-01-01

The year 2015 marks 100 years since Dr. Frederick Twort discovered the "filterable lytic factor", which was later independently discovered and named "bacteriophage" by Dr. Felix d'Herelle. On this memorable centennial, it is exciting to see a special issue published by Virologica Sinica on Phages and Therapy. In this issue, readers will not only find that bacteriophage research is a

  17. Hygrothermal Numerical Simulation Tools Applied to Building Physics

    CERN Document Server

    Delgado, João M P Q; Ramos, Nuno M M; Freitas, Vasco Peixoto

    2013-01-01

This book presents a critical review on the development and application of hygrothermal analysis methods to simulate the coupled transport processes of Heat, Air, and Moisture (HAM) transfer for one or multidimensional cases. During the past few decades there has been relevant development in this field of study and an increase in the professional use of tools that simulate some of the physical phenomena that are involved in Heat, Air and Moisture conditions in building components or elements. Although a significant number of hygrothermal models are referred to in the literature, the vast majority of them are not easily available to the public outside the institutions where they were developed, which restricts the analysis of this book to only 14 hygrothermal modelling tools. The special features of this book are (a) a state-of-the-art of numerical simulation tools applied to building physics, (b) the boundary conditions importance, (c) the material properties, namely, experimental methods for the measuremen...

  18. Improving durability of hot forging tools by applying hybrid layers

    Directory of Open Access Journals (Sweden)

    Z. Gronostajski

    2015-10-01

This paper deals with problems relating to the durability of the dies used for the hot forging of spur gears. The results of industrial tests carried out on dies with a hybrid layer (a nitrided layer (PN) + a physical vapor deposition (PVD) coating) applied to improve their durability are presented. Two types of hybrid layers, differing in their PVD coating, were evaluated with regard to their durability improvement effectiveness. The tests have shown that by applying hybrid layers of the nitrided layer/PVD coating type one can effectively increase the durability of hot forging tools.

  19. Coating-substrate-simulations applied to HFQ® forming tools

    Directory of Open Access Journals (Sweden)

    Leopold Jürgen

    2015-01-01

In this paper a comparative analysis of coating-substrate simulations applied to HFQ® forming tools is presented. When using the solution heat treatment cold die forming and quenching process, known as HFQ®, for forming hardened aluminium alloy automotive panel parts, coating-substrate systems have to satisfy unique requirements. Numerical experiments, based on the Advanced Adaptive FE method, are finally presented.

  20. Extending dry storage of spent LWR fuel for 100 years.

    Energy Technology Data Exchange (ETDEWEB)

    Einziger, R. E.

    1998-12-16

    Because of delays in closing the back end of the fuel cycle in the U.S., there is a need to extend dry inert storage of spent fuel beyond its originally anticipated 20-year duration. Many of the methodologies developed to support initial licensing for 20-year storage should be able to support the longer storage periods envisioned. This paper evaluates the applicability of existing information and methodologies to support dry storage up to 100 years. The thrust of the analysis is the potential behavior of the spent fuel. In the USA, the criteria for dry storage of LWR spent fuel are delineated in 10 CFR 72 [1]. The criteria fall into four general categories: maintain subcriticality, prevent the release of radioactive material above acceptable limits, ensure that radiation rates and doses do not exceed acceptable levels, and maintain retrievability of the stored radioactive material. These criteria need to be considered for normal, off-normal, and postulated accident conditions. The initial safety analysis report submitted for licensing evaluated the fuel's ability to meet the requirements for 20 years. It is not the intent to repeat these calculations, but to look at expected behavior over the additional 80 years, during which the temperatures and radiation fields are lower. During the first 20 years, the properties of the components may change because of elevated temperatures, presence of moisture, effects of radiation, etc. During normal storage in an inert atmosphere, there is potential for the cladding mechanical properties to change due to annealing or interaction with cask materials. The emissivity of the cladding could also change due to storage conditions. If there is air leakage into the cask, additional degradation could occur through oxidation in breached rods, which could lead to additional fission gas release and enlargement of cladding breaches. Air in-leakage could also affect cover gas conductivity, cladding oxidation, emissivity changes, and

  1. Relativity and Gravitation : 100 Years After Einstein in Prague

    CERN Document Server

    Ledvinka, Tomáš; General Relativity, Cosmology and Astrophysics : Perspectives 100 Years After Einstein's Stay in Prague

    2014-01-01

    In early April 1911 Albert Einstein arrived in Prague to become full professor of theoretical physics at the German part of Charles University. It was there, for the first time, that he concentrated primarily on the problem of gravitation. Before he left Prague in July 1912 he had submitted the paper “Relativität und Gravitation: Erwiderung auf eine Bemerkung von M. Abraham” in which he remarkably anticipated what a future theory of gravity should look like. At the occasion of the Einstein-in-Prague centenary an international meeting was organized under a title inspired by Einstein's last paper from the Prague period: "Relativity and Gravitation, 100 Years after Einstein in Prague". The main topics of the conference included: classical relativity, numerical relativity, relativistic astrophysics and cosmology, quantum gravity, experimental aspects of gravitation, and conceptual and historical issues. The conference attracted over 200 scientists from 31 countries, among them a number of leading experts in ...

  2. Total Hip Arthroplasty – over 100 years of operative history

    Directory of Open Access Journals (Sweden)

    Stephen Richard Knight

    2011-11-01

Total hip arthroplasty (THA) has completely revolutionised the manner in which the arthritic hip is treated, and is considered to be one of the most successful orthopaedic interventions of its generation (1). With over 100 years of operative history, this review examines the progression of the operation from its origins, together with highlighting the materials and techniques that have contributed to its development. Knowledge of its history contributes to a greater understanding of THA, such as the reasons behind selection of prosthetic materials in certain patient groups, while demonstrating the importance of critically analyzing research to continually determine best operative practice. Finally, we describe current areas of research being undertaken to further advance techniques and improve outcomes.

  3. 100 years after Smoluchowski: stochastic processes in cell biology

    Science.gov (United States)

    Holcman, D.; Schuss, Z.

    2017-03-01

100 years after Smoluchowski introduced his approach to stochastic processes, they are now at the basis of mathematical and physical modeling in cellular biology: they are used, for example, to analyse and to extract features from a large number (tens of thousands) of single molecular trajectories or to study the diffusive motion of molecules, proteins or receptors. Stochastic modeling is a new step in large data analysis that serves to extract cell biology concepts. We review here Smoluchowski's approach to stochastic processes and provide several applications for coarse-graining diffusion, studying polymer models for understanding nuclear organization and finally, we discuss the stochastic jump dynamics of telomeres across cell division and stochastic gene regulation.
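
The abstract mentions studying the diffusive motion of molecules from large numbers of single-particle trajectories; a minimal sketch (all parameter values assumed for illustration, not from the paper) of recovering a diffusion coefficient from simulated 2-D Brownian trajectories via the mean squared displacement, MSD(t) = 4Dt:

```python
import numpy as np

# Assumed parameters for the simulation (arbitrary units).
rng = np.random.default_rng(1)
D = 0.25        # diffusion coefficient the analysis should recover
dt = 0.01       # time step
n_steps = 1000
n_traj = 500    # number of independent trajectories

# Each Brownian increment is Gaussian with variance 2*D*dt per coordinate.
steps = rng.normal(scale=np.sqrt(2.0 * D * dt), size=(n_traj, n_steps, 2))
paths = np.cumsum(steps, axis=1)

# Estimate D from the ensemble mean squared displacement at the final time:
# in two dimensions, <r^2(t)> = 4*D*t.
t_final = n_steps * dt
msd = np.mean(np.sum(paths[:, -1, :] ** 2, axis=1))
D_est = msd / (4.0 * t_final)
print(f"estimated D = {D_est:.3f}")
```

Averaging over hundreds of trajectories is what makes the estimate stable, which is the coarse-graining step the review describes: single trajectories are noisy, but ensemble statistics recover the underlying transport coefficient.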

  4. Applying computer simulation models as learning tools in fishery management

    Science.gov (United States)

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  5. [Sheehan's syndrome--a forgotten disease with 100 years' history].

    Science.gov (United States)

    Krysiak, Robert; Okopień, Bogusław

    2015-01-01

    Although named after Harold Sheehan, postpartum ischemic pituitary necrosis was reported for the first time 100 years ago in Przeglad Lekarski by Leon Konrad Gliński. In the majority of cases, the syndrome is a consequence of severe postpartum bleeding episode resulting in severe hypotension or hemorrhagic shock. The frequency of Sheehan's syndrome has decreased in developed countries as a result of improved obstetrical care, but this clinical entity remains a common cause of hypopituitarism in developing countries. The syndrome is characterized by varying degrees of anterior pituitary dysfunction resulting from the deficiency of multiple pituitary hormones. The order of frequency of hormone loss has generally been found to be growth hormone and prolactin, gonadotropins, ACTH and thyrotropin. Women with Sheehan's syndrome exhibit a variety of signs and symptoms including failure to lactate or resume menses, loss of genital and axillary hair, and often occurring long after delivery clinical manifestations of central hypothyroidism and secondary adrenal insufficiency. Diagnosis is based on laboratory studies, including hormone levels and hormone stimulation tests. Treatment of Sheehan's syndrome involves hormone replacement therapy. The aim of this study is to review current knowledge on clinically relevant aspects of this clinical entity and to provide the reader with recommendations concerning its diagnosis and treatment.

  6. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  7. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings are collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  8. Creating Long Term Income Streams for the 100 Year Starship Study Initiative

    Science.gov (United States)

    Sylvester, A. J.

    Development and execution of long term research projects are very dependent on a consistent application of funding to maximize the potential for success. The business structure for the 100 Year Starship Study project should allow for multiple income streams to cover the expenses of the research objectives. The following examples illustrate the range of potential avenues: 1) affiliation with a charitable foundation for creating a donation program to fund a long term endowment for research, 2) application for grants to fund initial research projects and establish the core expertise of the research entity, 3) development of intellectual property which can then be licensed for additional revenue, 4) creation of spinout companies with equity positions retained by the lab for funding the endowment, and 5) funded research which is dual use for the technology goals of the interstellar flight research objectives. With the establishment of a diversified stream of funding options, then the endowment can be funded at a level to permit dedicated research on the interstellar flight topics. This paper will focus on the strategy of creating spinout companies to create income streams which would fund the endowment of the 100 Year Starship Study effort. This technique is widely used by universities seeking to commercially develop and market technologies developed by university researchers. An approach will be outlined for applying this technique to potentially marketable technologies generated as a part of the 100 Year Starship Study effort.

  9. Progress of Cometary Science in the Past 100 Years

    Science.gov (United States)

    Sekanina, Zdenek

    1999-01-01

    Enormous strides made by cometary science during the 20th century defy any meaningful comparison of its state 100 years ago and now. The great majority of the subfields enjoying much attention nowadays did not exist in the year 1900. Dramatic developments, especially in the past 30-50 years, have equally affected observational and theoretical studies of comets. The profound diversification of observing techniques has been documented by the ever widening limits on the electromagnetic spectrum covered. While the time around 1900 marked an early period of slow and painful experimentation with photographic methods in cometary studies, observations of comets from the x-ray region to the radio waves have by now become routine. Many of the new techniques, and all those involved with wavelengths shorter than about 300 nm, were made possible by another major breakthrough of this century - observing from space. Experiments on dedicated Earth-orbiting satellites as well as several deep-space probes have provided fascinating new information on the nature and makeup of comets. In broader terms, much of the progress has been achieved thanks to fundamental discoveries and major advances in electronics, whose applications resulted in qualitatively new instruments (e.g. radiotelescopes) and sensors or detectors (e.g. CCD arrays). The most universal effect on the entire field of cometary science, from observing to data handling to quantitative interpretations, has been, as in any other branch of science, due to the introduction of electronic computers, with their processing capabilities not only unheard of, but literally unimaginable, in the age of classical desk calculators. As if all this were not enough, today's generations of comet scientists have, in addition, been blessed with nature's highly appreciated cooperation. Indeed, in the span of a dozen years, between 1985 and 1997, we were privileged to witness four remarkable cometary events: (i) a return of Halley

  10. 100 years of seismic research on the Moho

    DEFF Research Database (Denmark)

    Prodehl, Claus; Kennett, Brian; Artemieva, Irina

    2013-01-01

    The detection of a seismic boundary, the “Moho”, between the outermost shell of the Earth, the Earth's crust, and the Earth's mantle by A. Mohorovičić was the consequence of increased insight into the propagation of seismic waves caused by earthquakes. This short history of seismic research on the Moho is primarily based on the comprehensive overview of the worldwide history of seismological studies of the Earth's crust using controlled sources from 1850 to 2005, by Prodehl and Mooney (2012). Though the art of applying explosions, so-called “artificial events”, as energy sources for studies......

  11. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.
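MatNet itself bridges MATLAB and NetLogo; as a language-neutral illustration of the ABM-plus-metabolic-model coupling it enables, here is a toy Python sketch. All rates, the oxygen decay profile, and the function names are invented for illustration, not taken from MatNet or the paper's model.

```python
# Toy hybrid model: each cell in a 1-D biofilm column is an "agent" whose
# growth rate comes from a stub metabolic evaluation under local conditions.
# Illustrative only; MatNet couples a NetLogo ABM to MATLAB-based
# constraint-based (FBA) metabolic models.

def metabolic_growth_rate(oxygen, nitrate):
    """Stub for a constraint-based evaluation: aerobic growth if oxygen is
    available, slower anaerobic respiration on nitrate otherwise."""
    if oxygen > 0.1:
        return 1.0          # aerobic respiration
    if nitrate > 0.1:
        return 0.4          # anaerobic respiration on nitrate
    return 0.0              # no growth

def step(biomass, oxygen, nitrate):
    """One ABM time step over a biofilm column (surface at index 0)."""
    new = []
    for depth, b in enumerate(biomass):
        o2 = oxygen * (0.3 ** depth)     # oxygen attenuates with depth
        mu = metabolic_growth_rate(o2, nitrate)
        new.append(b * (1.0 + 0.1 * mu))
    return new

biofilm = [1.0, 1.0, 1.0, 1.0]
no_nitrate = step(biofilm, oxygen=1.0, nitrate=0.0)
with_nitrate = step(biofilm, oxygen=1.0, nitrate=1.0)
# Deep, oxygen-limited cells grow only when nitrate is supplied,
# mirroring the paper's qualitative prediction for nitrate addition.
```

The point of the pattern is the call boundary: the agent layer asks the metabolic layer for a growth rate given local conditions, so either side can be swapped for a more detailed model without changing the other.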

  12. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  13. 100-Year Floodplains, 100 year flood plain data, Published in 2006, 1:1200 (1in=100ft) scale, Washoe County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:1200 (1in=100ft) scale, was produced all or in part from Field Survey/GPS information as of 2006. It is described...

  14. Process for selecting engineering tools : applied to selecting a SysML tool.

    Energy Technology Data Exchange (ETDEWEB)

    De Spain, Mark J.; Post, Debra S. (Sandia National Laboratories, Livermore, CA); Taylor, Jeffrey L.; De Jong, Kent

    2011-02-01

    Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature and users could use the process to select most engineering tools and software applications.

  15. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    Science.gov (United States)

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth in understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric sciences draws on core knowledge traditionally taught in physics, chemistry, and in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, understanding the causes and consequences of climate change becomes, for a student, a task of memorizing seemingly disparate facts. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build the necessary understanding of conservation of mass, conservation of energy, the particulate nature of matter, kinetic molecular theory, and the particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles, and then conceptual tools are deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and in some cases, a re-envisioning of the format of a science classroom. There are few financial barriers to implementation, and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students in developing a conceptually rich understanding of the atmosphere and the connections happening within it.

  16. INNOVATIVE SOLUTIONS APPLIED IN TOOLS FOR DETERMINING COAL MECHANICAL PROPRERTIES

    Directory of Open Access Journals (Sweden)

    Witold BIAŁY

    2015-10-01

    Full Text Available Due to the very specific working conditions of machines and equipment used in the coal mining industry, the manner of their selection, taking into account changing working conditions, is very important. Appropriate selection increases the durability and reliability of machines and equipment, which translates into the economic effects achieved. As the measurement and evaluation of coal mechanical properties (including coal workability) is of great importance, previously applied methods for coal workability evaluation are briefly reviewed. The importance of the problem is confirmed by the number of methods developed in various research centres all over the world. The article presents new instruments for determining and evaluating the mechanical properties (workability) of coal material. The instruments were developed in Poland and the author of this article is their co-inventor. The construction, principle of operation and innovative character of the solutions applied in the instruments are presented.

  17. Applying macro design tools to the design of MEMS accelerometers

    Energy Technology Data Exchange (ETDEWEB)

    Davies, B.R.; Rodgers, M.S.; Montague, S.

    1998-02-01

    This paper describes the design of two different surface micromachined (MEMS) accelerometers and the use of design and analysis tools intended for macro sized devices. This work leverages a process for integrating both the micromechanical structures and microelectronics circuitry of a MEMS accelerometer on the same chip. In this process, the mechanical components of the sensor are first fabricated at the bottom of a trench etched into the wafer substrate. The trench is then filled with oxide and sealed to protect the mechanical components during subsequent microelectronics processing. The wafer surface is then planarized in preparation for CMOS processing. Next, the CMOS electronics are fabricated and the mechanical structures are released. The mechanical structure of each sensor consists of two polysilicon plate masses suspended by multiple springs (cantilevered beam structures) over corresponding polysilicon plates fixed to the substrate to form two parallel plate capacitors. One polysilicon plate mass is suspended using compliant springs forming a variable capacitor. The other polysilicon plate mass is suspended using very stiff springs acting as a fixed capacitor. Acceleration is measured by comparing the variable capacitance with the fixed capacitance during acceleration.
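The two-capacitor readout described above lends itself to a quick first-principles check. The following sketch works through the ideal parallel-plate arithmetic; all geometry values are hypothetical, not the dimensions of the actual Sandia devices.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(area_m2, gap_m):
    """Ideal parallel-plate capacitance C = eps0 * A / d (air gap)."""
    return EPS0 * area_m2 / gap_m

# Hypothetical sensor geometry: 200 um x 200 um plates, 2 um nominal gap.
AREA = 200e-6 * 200e-6
GAP = 2e-6

def sense(displacement_m):
    """Compare the compliant (variable) capacitor, whose gap shrinks as the
    proof mass deflects, against the stiff (fixed) reference capacitor."""
    c_var = plate_capacitance(AREA, GAP - displacement_m)
    c_ref = plate_capacitance(AREA, GAP)
    return c_var - c_ref

# At rest the pair is balanced; under acceleration the proof mass deflects
# and the differential capacitance grows with displacement.
at_rest = sense(0.0)
deflected = sense(0.2e-6)   # 0.2 um proof-mass deflection
```

Comparing against a co-fabricated stiff reference rather than an absolute value is the design choice that cancels common-mode effects such as temperature and process variation.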

  18. Geo-environmental mapping tool applied to pipeline design

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, Karina de S.; Calle, Jose A.; Gil, Euzebio J. [Geomecanica S/A Tecnologia de Solo Rochas e Materiais, Rio de Janeiro, RJ (Brazil); Sare, Alexandre R. [Geomechanics International Inc., Houston, TX (United States); Soares, Ana Cecilia [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The Geo-Environmental Mapping is an improvement of the Geological-Geotechnical Mapping used for basic pipeline designs. The main purpose is to assemble the environmental, geotechnical and geological concepts in a methodological tool capable of predicting constraints and reducing the pipeline's impact on the environment. The Geo-Environmental Mapping was built to stress the influence of soil/structure interaction, related to the physical effect that comes from the contact between structures and soil or rock. A Geological-Geotechnical-Environmental strip (chart) was presented to emphasize the pipeline's operational constraints and its influence on the environment. The mapping was developed to clearly show the occurrence and properties of geological materials divided into geotechnical domain units (zones). The strips present natural construction properties, such as: excavability, stability of the excavation and soil re-use capability. Also, the environmental constraints were added to the geological-geotechnical mapping. The Geo-Environmental Mapping model helps the planning of the geotechnical and environmental surveys to be carried out during executive design, the discussion on the types of equipment to be employed during construction and the analysis of the geological risks and environmental impacts to be faced during construction of the pipeline. (author)

  19. Quality control tools applied to a PV microgrid in Ecuador

    Energy Technology Data Exchange (ETDEWEB)

    Camino-Villacorta, M.; Egido-Aguilera, M.A. [Ciudad Univ., Madrid (Spain). Inst. de Energia Solar - UPM; Gamez, J.; Arranz-Piera, P. [Trama Tecnoambiental (TTA), Barcelona (Spain)

    2010-07-01

    The Instituto de Energia Solar has been dealing with quality control issues for rural electrification for many years. In the framework of project DOSBE (Development of Electricity Service Operators for Poverty Alleviation in Ecuador and Peru), a technical toolkit has been developed to implement adapted integral quality control procedures for photovoltaic systems (covering all components and equipment, installation and servicing), applicable at a local and regional scale, with the overall aim of increasing the confidence in photovoltaic systems. This toolkit was applied in the evaluation of an existing microgrid in Ecuador, which is described in this paper. The toolkit and the detailed results of its application are presented in a published document which is being widely distributed among the stakeholders of rural electrification in Ecuador and Peru. It can be downloaded from the web page of the DOSBE project: www.dosbe.org (orig.)

  20. Big Data Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2017-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and to...

  1. 100 Years Jubilee for the discovery of the enzymes in yeast

    DEFF Research Database (Denmark)

    Berg, Rolf W.

    1997-01-01

    The work by Prof. E. Buchner 100 years ago, which led to the discovery of the enzymes in yeast for brewing beer, is reviewed.

  2. Under Connecticut Skies: Exploring 100 Years of Astronomy at Van Vleck Observatory in Middletown, Connecticut

    Science.gov (United States)

    Kilgard, Roy E.; Williams, Amrys; Erickson, Paul; Herbst, William; Redfield, Seth

    2017-01-01

    Under Connecticut Skies examines the history of astronomy at Van Vleck Observatory, located on the campus of Wesleyan University in Middletown, Connecticut. Since its dedication in June of 1916, Van Vleck has been an important site of astronomical research, teaching, and public outreach. Over a thousand visitors pass through the observatory each year, and regular public observing nights happen year-round in cooperation with the Astronomical Society of Greater Hartford. Our project explores the place-based nature of astronomical research, the scientific instruments, labor, and individuals that have connected places around the world in networks of observation, and the broader history of how observational astronomy has linked local people, amateur observers, professional astronomers, and the tools and objects that have facilitated their work under Connecticut’s skies over the past 100 years. Our research team has produced a historical exhibition to help commemorate the observatory’s centennial that opened to the public in May of 2016. Our work included collecting, documenting, and interpreting this history through objects, archival documents, oral histories, photographs, and more. The result is both a museum and a working history "laboratory" for use by student and professional researchers. In addition to the exhibit itself, we have engaged in new interpretive programs to help bring the history of astronomy to life. Future work will include digitization of documents and teaching slides, further collection of oral histories, and expanding the collection to the web for use by off-site researchers.

  3. Individual differences and their measurement: A review of 100 years of research.

    Science.gov (United States)

    Sackett, Paul R; Lievens, Filip; Van Iddekinge, Chad H; Kuncel, Nathan R

    2017-03-01

    This article reviews 100 years of research on individual differences and their measurement, with a focus on research published in the Journal of Applied Psychology. We focus on 3 major individual differences domains: (a) knowledge, skill, and ability, including both the cognitive and physical domains; (b) personality, including integrity, emotional intelligence, stable motivational attributes (e.g., achievement motivation, core self-evaluations), and creativity; and (c) vocational interests. For each domain, we describe the evolution of the domain across the years and highlight major theoretical, empirical, and methodological developments, including relationships between individual differences and variables such as job performance, job satisfaction, and career development. We conclude by discussing future directions for individual differences research. Trends in the literature include a growing focus on substantive issues rather than on the measurement of individual differences, a differentiation between constructs and measurement methods, and the use of innovative ways of assessing individual differences, such as simulations, other-reports, and implicit measures. (PsycINFO Database Record

  4. 100-Year Floodplains, Floodplain, Published in 2000, Smaller than 1:100000 scale, Taylor County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at Smaller than 1:100000 scale, was produced all or in part from Hardcopy Maps information as of 2000. It is described...

  5. 100-Year Floodplains, FEMA FIRM Mapping, Published in 2014, Not Applicable scale, GIS.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at Not Applicable scale, was produced all or in part from Other information as of 2014. It is described as 'FEMA FIRM...

  6. 100-Year Floodplains, Digitized FEMA flood maps, Published in unknown, Eureka County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, was produced all or in part from Hardcopy Maps information as of unknown. It is described as 'Digitized FEMA flood maps'. Data by...

  7. LOW FREQUENCY VARIABILITY OF INTERANNUAL CHANGE PATTERNS FOR GLOBAL MEAN TEMPERATURE DURING THE RECENT 100 YEARS

    Institute of Scientific and Technical Information of China (English)

    刘晶淼; 丁裕国; et al.

    2002-01-01

    The temporally extended EOF (TEEOF) method is used to conduct a diagnostic study of the 1-, 3-, 6- and 10-year variation patterns of mean air temperature over the globe and the Southern and Northern Hemispheres over the course of 100 years. The results show that the first TEEOF mode accounts for more than 50% of the total variance, with the first mode of the interannual oscillations generally representing annually varying, climate-related patterns that reflect the long-term tendency of change in air temperature. This is particularly true for the first mode on the 10-year scale, which shows an obvious ascending trend in winter temperature, and its leading principal component closely follows the sequence of actual temperature. Apart from the first mode of all time sections of the TEEOF for the globe and the two hemispheres, and the second mode of the 1-year TEEOF, the interannual variations described by the other characteristic vectors show various patterns, with the corresponding principal components related to the long-term variability of specific interannual quasi-periodic oscillation structures. A T2 test applied to the annual variation pattern shows that the abrupt changes for the Southern Hemisphere and the globe come closer to the result of a single-element t test for mean temperature than those for the Northern Hemisphere do. This indicates that the T2 test, when carried out on patterns of multiple variables, seems more reasonable than the t test on single elements.
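Temporally extended EOF analysis embeds several consecutive time steps into each state vector before the eigen-decomposition, so the modes capture evolving patterns rather than static maps. A minimal sketch on synthetic data follows; the window length and the series coefficients are assumptions for illustration, not the study's settings.

```python
import numpy as np

def teeof(series, window):
    """Temporally extended EOF: embed `window` consecutive time steps into
    each row, remove the mean, and take the SVD. Rows of vt are the
    extended EOFs; variance fractions come from the singular values."""
    rows = [series[i:i + window] for i in range(len(series) - window + 1)]
    x = np.asarray(rows)
    x = x - x.mean(axis=0)          # center each embedded coordinate
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)
    return vt, var_frac

# Synthetic "100-year" temperature series: warming trend + decadal
# oscillation + noise (coefficients are invented for the demo).
rng = np.random.default_rng(0)
t = np.arange(100)
temp = 0.02 * t + 0.3 * np.sin(2 * np.pi * t / 10) + 0.05 * rng.standard_normal(100)

eofs, var_frac = teeof(temp, window=10)
# As in the study, the leading mode dominates the variance and carries
# the long-term trend signal.
```

Because the trend is perfectly correlated across the embedded coordinates, it collapses into a single leading mode, while the quasi-periodic oscillation splits into a sine/cosine mode pair — the same qualitative separation the abstract reports.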

  8. Base (100-year) flood elevations for selected sites in Marion County, Missouri

    Science.gov (United States)

    Southard, Rodney E.; Wilson, Gary L.

    1998-01-01

    The primary requirement for community participation in the National Flood Insurance Program is the adoption and enforcement of floodplain management requirements that minimize the potential for flood damages to new construction and avoid aggravating existing flooding conditions. This report provides base flood elevations (BFE) for a 100-year recurrence flood for use in the management and regulation of 14 flood-hazard areas designated by the Federal Emergency Management Agency as approximate Zone A areas in Marion County, Missouri. The one-dimensional surface-water flow model, HEC-RAS, was used to compute the base (100-year) flood elevations for the 14 Zone A sites. The 14 sites were located at U.S., State, or County road crossings and the base flood elevation was determined at the upstream side of each crossing. The base (100-year) flood elevations for BFE 1, 2, and 3 on the South Fork North River near Monroe City, Missouri, are 627.7, 579.2, and 545.9 feet above sea level. The base (100-year) flood elevations for BFE 4, 5, 6, and 7 on the main stem of the North River near or at Philadelphia and Palmyra, Missouri, are 560.5, 539.7, 504.2, and 494.4 feet above sea level. BFE 8 is located on Big Branch near Philadelphia, a tributary to the North River, and the base (100-year) flood elevation at this site is 530.5 feet above sea level. One site (BFE 9) is located on the South River near Monroe City, Missouri. The base (100-year) flood elevation at this site is 619.1 feet above sea level. Site BFE 10 is located on Bear Creek near Hannibal, Missouri, and the base (100-year) elevation is 565.5 feet above sea level. The four remaining sites (BFE 11, 12, 13, and 14) are located on the South Fabius River near Philadelphia and Palmyra, Missouri. The base (100-year) flood elevations for BFE 11, 12, 13, and 14 are 591.2, 578.4, 538.7, and 506.9 feet above sea level.

  9. An assessment tool applied to manure management systems using innovative technologies

    DEFF Research Database (Denmark)

    Sørensen, Claus G.; Jacobsen, Brian H.; Sommer, Sven G.

    2003-01-01

    of operational and cost-effective animal manure handling technologies. An assessment tool covering the whole chain of the manure handling system from the animal houses to the field has been developed. The tool enables a system-oriented evaluation of labour demand, machinery capacity and costs related to the handling of manure. By applying the tool to a pig farm and a dairy farm scenario, the competitiveness of new technologies was compared with traditional manure handling. The concept of a continuous flow of transport and application of slurry using umbilical transportation systems rather than traditional...

  10. 100 years of mapping the Holocene Rhine-Meuse delta plain: combining research and teaching

    NARCIS (Netherlands)

    Cohen, K. M.; Stouthamer, E.; Hoek, W. Z.; Middelkoop, H.

    2012-01-01

    The history of modern soil, geomorphological and shallow geological mapping in the Holocene Rhine-Meuse delta plain goes back about 100 years. The delta plain is of very heterogeneous build up, with clayey and peaty flood basins, dissected by sandy fluvial distributary channel belts with fine textured levees grading into tidal-influenced rivers and estuaries.

  11. A MODEL TO EVALUATE 100-YEAR ENERGY-MIX SCENARIOS TO FACILITATE DEEP DECARBONIZATION IN THE SOUTHEASTERN UNITED STATES

    Energy Technology Data Exchange (ETDEWEB)

    Adkisson, Mary A [ORNL; Qualls, A L [ORNL

    2016-08-01

    The Southeast United States consumes approximately one billion megawatt-hours of electricity annually; roughly two-thirds from carbon dioxide (CO2) emitting sources. The balance is produced by non-CO2 emitting sources: nuclear power, hydroelectric power, and other renewables. Approximately 40% of the total CO2 emissions come from the electric grid. The CO2 emitting sources, coal, natural gas, and petroleum, produce approximately 372 million metric tons of CO2 annually. The rest is divided between the transportation sector (36%), the industrial sector (20%), the residential sector (3%), and the commercial sector (2%). An Energy Mix Modeling Analysis (EMMA) tool was developed to evaluate 100-year energy mix strategies to reduce CO2 emissions in the southeast. Current energy sector data was gathered and used to establish a 2016 reference baseline. The spreadsheet-based calculation runs 100-year scenarios based on current nuclear plant expiration dates, assumed electrical demand changes from the grid, assumed renewable power increases and efficiency gains, and assumed rates of reducing coal generation and deployment of new nuclear reactors. Within the model, natural gas electrical generation is calculated to meet any demand not met by other sources. Thus, natural gas is viewed as a transitional energy source that produces less CO2 than coal until non-CO2 emitting sources can be brought online. The annual production of CO2 and spent nuclear fuel and the natural gas consumed are calculated and summed. A progression of eight preliminary scenarios shows that nuclear power can substantially reduce or eliminate demand for natural gas within 100 years if it is added at a rate of only 1000 MWe per year. Any increases in renewable energy or efficiency gains can offset the need for nuclear power. However, using nuclear power to reduce CO2 will result in significantly more spent fuel. More efficient advanced reactors can only marginally reduce the amount of spent fuel generated in
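The residual role of natural gas in the model can be sketched in a few lines: each year, gas fills whatever demand the non-gas sources do not meet, and annual CO2 is summed over the horizon. Every capacity, rate, and emission factor below is an illustrative assumption, not an actual EMMA input.

```python
# Sketch of an EMMA-style residual calculation. All numbers are assumed.
CO2_PER_MWH = {"coal": 1.0, "gas": 0.45}   # metric tons CO2 per MWh (assumed)

def run_scenario(years=100,
                 demand_mwh=1.0e9,          # ~1 billion MWh/yr in the Southeast
                 nuclear_mwh=0.25e9,        # starting nuclear generation
                 nuclear_added_per_year=8.76e6,  # ~1000 MWe of new capacity
                 coal_mwh=0.3e9,
                 coal_retired_per_year=3.0e6,
                 other_mwh=0.1e9):          # hydro + renewables, held constant
    total_co2 = 0.0
    gas_by_year = []
    for _ in range(years):
        # Natural gas is the transitional residual: it covers unmet demand.
        gas = max(0.0, demand_mwh - nuclear_mwh - coal_mwh - other_mwh)
        total_co2 += coal_mwh * CO2_PER_MWH["coal"] + gas * CO2_PER_MWH["gas"]
        gas_by_year.append(gas)
        nuclear_mwh += nuclear_added_per_year
        coal_mwh = max(0.0, coal_mwh - coal_retired_per_year)
    return total_co2, gas_by_year

co2, gas = run_scenario()
# With steady nuclear additions outpacing coal retirements, the gas
# residual shrinks year over year and reaches zero within the horizon.
```

The structure mirrors the abstract's logic: gas is never an input assumption, only the computed slack, so any added nuclear or renewable generation shows up directly as reduced gas consumption and CO2.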

  12. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Science.gov (United States)

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant, with only a few modifications needed to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
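The workflow of tuning an aeration controller against a simulated plant before deployment can be illustrated with a toy dissolved-oxygen on/off controller run against a first-order oxygen balance. All coefficients and setpoints below are assumptions; this is not the DSC tool's API or its OPC interface.

```python
def simulate_do_control(steps=200, dt=0.01, setpoint=2.0, band=0.2):
    """Toy dissolved-oxygen (DO) loop: an on/off aeration controller with a
    dead band, evaluated against a simple simulated oxygen balance, in the
    spirit of simulation-based controller tuning. Coefficients assumed."""
    do = 0.0            # mg/L dissolved oxygen
    aeration_on = False
    trace = []
    for _ in range(steps):
        # Controller: switch aeration around the setpoint with hysteresis.
        if do < setpoint - band:
            aeration_on = True
        elif do > setpoint + band:
            aeration_on = False
        # Simulated plant: oxygen transfer minus microbial uptake.
        kla = 10.0 if aeration_on else 0.0     # transfer rate when aerating
        our = 5.0                              # oxygen uptake rate
        do += dt * (kla * (8.0 - do) - our)    # 8.0 mg/L ~ saturation
        do = max(do, 0.0)
        trace.append(do)
    return trace

trace = simulate_do_control()
# After the start-up transient the DO oscillates around the 2.0 mg/L
# setpoint, which is the behavior one would verify before deployment.
```

Evaluating the closed loop like this, then reconnecting the same controller code to the real plant via a standard interface such as OPC, is the design pattern the DSC tool supports.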

  13. Liverpool's Discovery: A University Library Applies a New Search Tool to Improve the User Experience

    Science.gov (United States)

    Kenney, Brian

    2011-01-01

    This article features the University of Liverpool's arts and humanities library, which applies a new search tool to improve the user experience. In nearly every way imaginable, the Sydney Jones Library and the Harold Cohen Library--the university's two libraries that serve science, engineering, and medical students--support the lives of their…

  14. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server

    Science.gov (United States)

    2016-09-01

    Report documentation page extract. Subject terms: Applied Anomaly Detection Tool (AADT), Windows, server, web service, installation. The report covers setup of the AADT web server on Microsoft Windows; figure captions include the Web Platform Installer Products page and Web Platform Installer search results for PHP.

  15. Struggles for Perspective: A Commentary on ""One Story of Many to Be Told": Following Empirical Studies of College and Adult Writing through 100 Years of NCTE Journals"

    Science.gov (United States)

    Brandt, Deborah

    2011-01-01

    In this article, the author comments on Kevin Roozen and Karen Lunsford's insightful examination of empirical studies of college and adult writing published in NCTE journals over the last 100 years. One sees in their account the struggles for perspective that marked writing studies in this period, as researchers applied ever wider lenses to the…

  16. FMEA TOOL APPLYING: CASE STUDY IN A COMPANY OF PASSENGER TRANSPORTATION BUSINESS

    Directory of Open Access Journals (Sweden)

    Cristiano Roos

    2007-12-01

    Full Text Available This paper presents a case study in which the Failure Modes and Effects Analysis (FMEA) tool was applied in a company in the land/air passenger and cargo transportation business. The objective of this study was to determine, with the FMEA tool, actions that minimize or eliminate potential failure modes in one of the services provided by the company. The specific point where the tool was applied was the management of passenger land transportation vehicles, because failures related to this component increase the maintenance expenses of the company and tend to generate client dissatisfaction. Other quality tools, such as Brainstorming, the Ishikawa Diagram and the Pareto Chart, were used in conjunction with the FMEA tool. The results of the study were reached after determining actions that serve its main objective, that is, to increase the reliability and quality of the service provided. Carrying out the present case study thus brought a better understanding of the proposed theme, besides showing the importance of quality management nowadays in the face of the rising demands of clients.
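FMEA implementations conventionally score each failure mode by a Risk Priority Number, RPN = severity × occurrence × detection (each on a 1-10 scale), and then prioritize Pareto-style. A minimal sketch follows; the failure modes and scores are invented examples for a transportation service, not those from the case study.

```python
# Minimal FMEA ranking: RPN = severity * occurrence * detection (1-10 each).
# The failure modes below are hypothetical, not the case study's data.
failure_modes = [
    {"mode": "brake pad wear",        "sev": 9, "occ": 6, "det": 2},
    {"mode": "door mechanism jam",    "sev": 6, "occ": 4, "det": 3},
    {"mode": "ticket system outage",  "sev": 4, "occ": 3, "det": 2},
]

for fm in failure_modes:
    fm["rpn"] = fm["sev"] * fm["occ"] * fm["det"]

# Pareto-style prioritization: corrective actions target the highest-RPN
# modes first, since a few modes typically account for most of the risk.
ranked = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)
```

This ranking step is where the complementary tools from the abstract slot in: Brainstorming and the Ishikawa Diagram generate and structure the candidate failure modes, and the Pareto Chart visualizes the ranked RPNs.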

  17. REVIEW OF MODERN NON‐SURGICAL TOOLS APPLIED IN CARDIAC SURGERY

    Directory of Open Access Journals (Sweden)

    Marcin MARUSZEWSKI

    2013-04-01

    Full Text Available Surgical intervention is commonly associated with the use of hardware that facilitates invasive medical treatment. Nowadays surgeons apply a new set of tools that help them anticipate the outcome of an intervention and define potential risk factors. Increasing patient migration has inspired healthcare professionals to introduce universal standards of care, supported by medical guidelines and checklists. Today, prior to skin incision, every modern cardiac surgeon is equipped with a whole range of tools designed to increase patient safety and provide thorough information to the whole medical team.

  18. 100 years of mapping the Holocene Rhine-Meuse delta plain: combining research and teaching

    Science.gov (United States)

    Cohen, K. M.; Stouthamer, E.; Hoek, W. Z.; Middelkoop, H.

    2012-04-01

The history of modern soil, geomorphological and shallow geological mapping in the Holocene Rhine-Meuse delta plain goes back about 100 years. The delta plain is of very heterogeneous build-up, with clayey and peaty flood basins dissected by sandy fluvial distributary channel belts with fine-textured levees, grading into tidally influenced rivers and estuaries. Several generations of precursor rivers occur as alluvial ridges and buried ribbon sands. They form an intricate network originating from repeated avulsions, going back 8000 years. The present rivers have been embanked since ca. 1250 AD, and the delta plain (~ 3000 km2) has been reclaimed for agriculture. Soils are young and subject to oxidation and compaction. The first detailed field map of channel belts and flood basins was made in 1926 by Vink, a geography teacher from Amsterdam. Soil mapping and Holocene geology gained interest after WW-II, with the Wageningen soil scientists Edelman, Hoeksema and Pons taking the lead. Utrecht University started teaching and research on the subject in 1959, launching an undergraduate mapping field course based on hand augering and field observation. An archive of borehole logs and local maps started to build up. Initially focused on soil mapping, from 1973 the course shifted to a geomorphological-geological focus. Berendsen took over supervision, introduced standard description protocols and legends, and increased coring depth. This culminated in 1982 in his influential PhD thesis on the Rhine delta's genesis. New coring and sampling methods came, and extensive 14C dating campaigns began. With steadily increasing numbers of students, the accumulation of data sped up, and increasingly larger parts of the delta were mapped. The academic mapping ran in parallel with soil survey and geological survey mapping campaigns. The computer was introduced in the field course and digital data archiving began in 1989.
A series of PhD studies on thematic aspects of delta evolution and an increasing number

  19. Accuracy assessment of the UT1 prediction method based on 100-year series analysis

    CERN Document Server

    Malkin, Z; Tolstikov, A

    2013-01-01

A new method has been developed at the Siberian Research Institute of Metrology (SNIIM) for highly accurate prediction of UT1 and pole coordinates. The method is based on the construction of a general polyharmonic model of the variations of the Earth rotation parameters using all data available for the last 80-100 years, combined with a modified autoregression technique. In this presentation, a detailed comparison is made between real-time UT1 predictions computed with this method in 2006-2010 and simultaneous predictions computed at the International Earth Rotation and Reference Systems Service (IERS). The results show that the proposed method provides better accuracy at different prediction lengths.
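The "general polyharmonic model" step of such a method can be illustrated with an ordinary least-squares fit of sine/cosine terms at fixed periods; in the full method, the residuals would then feed the autoregressive stage. The series, periods, and amplitudes below are invented for illustration, since the abstract does not specify the SNIIM model's actual terms:

```python
import numpy as np

def fit_polyharmonic(t, y, periods):
    """Least-squares fit of y(t) to a constant plus sin/cos pairs at the
    given periods: the polyharmonic-model step of the prediction scheme."""
    cols = [np.ones_like(t)]
    for p in periods:
        w = 2 * np.pi / p
        cols += [np.sin(w * t), np.cos(w * t)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, A @ coef

# Hypothetical daily UT1-like series with annual and semiannual terms (seconds):
t = np.arange(0, 2000.0)
y = 0.1 * np.sin(2 * np.pi * t / 365.25) + 0.05 * np.cos(2 * np.pi * t / 182.63)
coef, fitted = fit_polyharmonic(t, y, [365.25, 182.63])
print(np.round(coef, 3))
```

Extrapolating the fitted harmonics beyond the data span yields the deterministic part of the prediction.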

  20. Fascia Research Congress evidence from the 100 year perspective of Andrew Taylor Still.

    Science.gov (United States)

    Findley, Thomas W; Shalwala, Mona

    2013-07-01

More than 100 years ago, A.T. Still MD founded osteopathic medicine and specifically described fascia as a covering, with common origins for the layers of the fascial system despite diverse names for its individual parts. Fascia assists gliding and fluid flow and is highly innervated. Fascia is intimately involved with respiration and with the nourishment of all cells of the body, including those of disease and cancer. This paper reviews information presented at the first three International Fascia Research Congresses in 2007, 2009 and 2012 from the perspective of Dr Still: that fascia is vital for an organism's growth and support, and that it is where disease is sown.

  1. 100 years of Epilepsia: landmark papers and their influence in neuropsychology and neuropsychiatry.

    Science.gov (United States)

    Hermann, Bruce

    2010-07-01

    As part of the 2009 International League Against Epilepsy (ILAE) Centenary Celebration, a special symposium was dedicated to Epilepsia (100 Years of Epilepsia: Landmark Papers and Their Influence). The Associate Editors were asked to identify a particularly salient and meaningful paper in their areas of expertise. From the content areas of neuropsychology and neuropsychiatry two very interesting papers were identified using quite different ascertainment techniques. One paper addressed the problem of psychosis in temporal lobe epilepsy, whereas the other represents the first paper to appear in Epilepsia presenting quantitative assessment of cognitive status in epilepsy. These two papers are reviewed in detail and placed in historical context.

  2. Did Open Solar Magnetic Field Increase during the Last 100 Years: A Reanalysis of Geomagnetic Activity

    CERN Document Server

    Mursula, K; Karinen, A

    2004-01-01

Long-term geomagnetic activity as represented by the aa index has been used to show that the heliospheric magnetic field has more than doubled during the last 100 years. However, serious concerns have been raised about the long-term consistency of the aa index and about the centennial rise of the solar magnetic field. Here we reanalyze geomagnetic activity during the last 100 years by calculating the recently suggested IHV (Inter-Hour Variability) index as a measure of local geomagnetic activity for seven stations. We find that local geomagnetic activity at all stations follows the same qualitative long-term pattern: an increase from the early 1900s to 1960, a dramatic dropout in the 1960s, and a (mostly weaker) increase thereafter. Moreover, at all stations, activity at the end of the 20th century has a higher average level than at the beginning of the century. This agrees with the result based on the aa index that global geomagnetic activity, and thereby the open solar magnetic field, has indeed increased during the last 100...
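As a rough illustration, an IHV-style index is computed as the mean absolute difference between successive hourly values of a geomagnetic field component over a fixed local-time window; the seven hourly values below are invented, and the exact component and window used by Mursula et al. are not given in the abstract:

```python
import numpy as np

def ihv_index(hourly_h: np.ndarray) -> float:
    """Inter-Hour Variability: mean absolute difference between
    successive hourly means of a field component (nT)."""
    return float(np.mean(np.abs(np.diff(hourly_h))))

# Hypothetical hourly horizontal-component values (nT) around local midnight:
h = np.array([21050.0, 21048.0, 21055.0, 21040.0, 21042.0, 21041.0, 21047.0])
print(ihv_index(h))  # mean of |diffs| = (2+7+15+2+1+6)/6 = 5.5
```

A centennial activity series is then built by averaging such daily values per year and station.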

  3. Oceanic environmental changes of subarctic Bering Sea in recent 100 years: Evidence from molecular fossils

    Institute of Scientific and Technical Information of China (English)

    LU; Bing; CHEN; Ronghua; ZHOU; Huaiyang; WANG; Zipan; CHEN

    2005-01-01

The core sample B2-9 from the seafloor of the subarctic Bering Sea was dated with 210Pb to obtain a consecutive sequence of oceanic sedimentary environments at decadal intervals during 1890-1999. A variety of molecular fossils were detected, including n-alkanes, isoprenoids, fatty acids, sterols, etc. From the characteristics of these molecules (C27, C28, and C29 sterols) and their molecular indices (Pr/Ph, ∑C22+/∑C21-, CPI and C18:2/C18:0), and in consideration of the variation of organic carbon content, the 100-year evolution of the subarctic sea paleoenvironment was reconstructed. It is indicated that during the past 100 years in the Arctic there were two events of strong climate warming (1920-1950 and 1980-1999), which resulted in an oxidized sedimentary environment owing to decreasing terrigenous and increasing marine-derived organic matter, and two events of transitory climate cooling (1910 and 1970-1980), which resulted in a slightly reducing sedimentary environment owing to increasing terrigenous and decreasing marine-derived organic matter. It is revealed that these alternating warming/cooling processes are directly related to Arctic and global climate variations.

  4. A risk assessment tool applied to the study of shale gas resources.

    Science.gov (United States)

    Veiguela, Miguel; Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando; Roqueñi, Nieves; Loredo, Jorge

    2016-11-15

The implementation of a risk assessment tool capable of evaluating the risks to health, safety and the environment (HSE) from the extraction of non-conventional fossil fuel resources by hydraulic fracturing (fracking) can be useful for fostering development of the technology and for winning public trust and acceptance of it. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the premise that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows Oldenburg's 'Screening and Ranking Framework (SRF)', developed to evaluate potential geologic carbon dioxide (CO2) storage sites. Two global characteristics, (1) characteristics centered on the natural aspects of the site and (2) characteristics centered on the technological aspects of the project, are evaluated through user input of Property values, which define Attributes, which in turn define the Characteristics. To allow an individual evaluation of each characteristic and each element of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain), with three different technological options, to test the approach.
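One plausible reading of the Property → Attribute → Characteristic roll-up, in the spirit of SRF-style spreadsheet tools, is a weighted average at each level. The property names, scores, and weights below are invented; the actual scoring scheme of the tool is not detailed in the abstract:

```python
def aggregate(weighted_scores):
    """Weighted average of (score, weight) pairs, scores normalized to [0, 1]."""
    total_w = sum(w for _, w in weighted_scores)
    return sum(s * w for s, w in weighted_scores) / total_w

# Characteristic 1: natural aspects of the site (hypothetical properties).
site = aggregate([(0.7, 2.0),   # e.g. depth of the target formation
                  (0.4, 1.0)])  # e.g. distance to a freshwater aquifer

# Characteristic 2: technological aspects of the project (hypothetical).
tech = aggregate([(0.6, 1.0),   # e.g. well-casing design
                  (0.8, 1.0)])  # e.g. flowback-water management

overall = aggregate([(site, 1.0), (tech, 1.0)])
print(round(overall, 3))
```

Keeping the two characteristics separate until the final step lets the tool compare technological options on the same site, as in the Asturias application.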

  5. Sedimentary records of eutrophication and hypoxia in the Changjiang Estuary over the last 100 years

    Science.gov (United States)

    Xuwen, F.; Hongliang, L.; Zhao, M.; Xuefa, S.

    2012-12-01

We selected two cores in the Changjiang Estuary: one located in the Changjiang Estuary mud area (CEMA), within the region of seasonal hypoxia, and the other located in the Cheju Island mud area (SCIMA), outside the hypoxia region. Grain size, total organic carbon (TOC), stable carbon isotopic ratios (δ13Corg), biomarkers (the sum of brassicasterol, dinosterol and alkenone) and some redox-sensitive elements (RSEs) were determined on the 210Pb-dated sediment cores to study eutrophication and hypoxia over the past hundred years. The sediment record in the CEMA showed an increase in TOC (21%), biomarkers (141%) and δ13Corg (1.6‰ PDB) since the 1950s, with a marked increase since the 1970s. These distributions indicated enhanced productivity and established the history of eutrophication in the Changjiang Estuary during the past 100 years. Some RSEs have been enriched significantly since the late 1960s to 1970s: the Mo/Al, Cd/Al and As/Al ratios increased by about 83%, 73% and 50%, respectively. These data may indicate the onset of hypoxia in the Changjiang Estuary during the last 100 years. The increasing accumulation of marine organic matter and RSEs corresponded with fertilizer consumption and high nutrient inputs from the Changjiang River. The riverine runoff of fertilizers and nutrients stimulated algal blooms (e.g. brassicasterol, dinosterol). Enhanced primary production resulted in an enrichment of organic matter, and hypoxia favored the preservation of organic matter in the sediment. For the core sediment in the SCIMA, the geochemical indicators (TOC, biomarkers and δ13Corg) increased to differing degrees before the 1950s-1970s and then remained nearly constant. Productivity in the SCIMA has been mainly influenced by climate and ocean circulation changes over the last 100 years. The RSEs were controlled by "grain size effects", which indicated that no hypoxia occurred. This study concluded that δ13Corg, RSEs and biomarkers in sediment could be used to trace or

  6. Social Network Analysis and Big Data tools applied to the Systemic Risk supervision

    Directory of Open Access Journals (Sweden)

    Mari-Carmen Mochón

    2016-03-01

Full Text Available After the financial crisis that began in 2008, the international market supervisors of the G20 agreed to reinforce their systemic risk supervisory duties. For this purpose, several regulatory reporting obligations were imposed on market participants. As a consequence, millions of trade details are now available to National Competent Authorities on a daily basis. Traditional monitoring tools may not be capable of analyzing such volumes of data and extracting the relevant information needed to identify the potential risks hidden in the market. Big Data solutions currently applied to Social Network Analysis (SNA) can be successfully applied to systemic risk supervision. This case study proposes how the relations established between financial market participants could be analyzed in order to identify propagation risk and market behavior, without the need for expensive and demanding technical architectures.
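A toy version of the propagation-risk idea: treat reported trades as directed exposure edges between counterparties and ask which participants are reachable from an initial default. The institutions and edges below are invented; real trade-repository graphs carry weights, netting, and thresholds this sketch omits:

```python
from collections import deque

# Hypothetical exposure network built from trade reports: "A owes B".
exposures = {
    "BankA": ["BankB", "FundC"],
    "BankB": ["FundC"],
    "FundC": ["BankD"],
    "BankD": [],
}

def contagion_set(network, defaulted):
    """Counterparties reachable from an initial default via breadth-first
    search: a crude measure of propagation risk in the exposure graph."""
    seen, queue = {defaulted}, deque([defaulted])
    while queue:
        node = queue.popleft()
        for nbr in network.get(node, []):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen - {defaulted}

print(sorted(contagion_set(exposures, "BankA")))
```

SNA tooling then layers centrality and community measures on the same graph to flag participants whose failure would touch the largest set of others.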

  7. Sustainable Foods and Medicines Support Vitality, Sex and Longevity for a 100-Year Starship Expedition

    Science.gov (United States)

    Edwards, M. R.

Extended space flight requires foods and medicines that sustain crew health and vitality. The health and therapeutic needs of the entire crew and their children must be sustainable over a 100-year space flight. The starship cannot depend on resupply or carry a large cargo of pharmaceuticals. Everything in the starship must be completely recyclable and reconstructable, including food, feed, textiles, building materials, pharmaceuticals, vaccines, and medicines. Smart microfarms will produce functional foods with superior nutrition and sensory attributes. These foods provide high-quality protein and nutralence (nutrient density) that helps avoid obesity, diabetes, and other Western diseases. The combination of functional foods, lifestyle actions, and medicines will support crew immunity, energy, vitality, sustained strong health, and longevity. Smart microfarms enable the production of fresh medicines in hours or days, eliminating the need for a large dispensary and with it any concern over drug shelf life. Smart microfarms are adaptable to the extreme growing-area, resource, and environmental constraints associated with an extended starship expedition.

  8. [The 100-year anniversary of Eugene Jamot's (1879-1937) admittance to the Pharo School].

    Science.gov (United States)

    Milleliri, J M; Louis, F J

    2010-04-01

    For the 100-year anniversary of Dr. Eugene Jamot's (1879-1937) admittance to the Pharo School (then known as the Training School of the Colonial Army Health Corps), the authors describe the life of a French military physician working in Africa. Eugene Jamot devoted 22 years of his life to fighting sleeping sickness. Using a standardized approach that has become a textbook example, he was highly successful in controlling this dreaded tropical disease. Despite being criticized by some officials of the colonial administration and becoming the target of an obvious smear campaign because of his strong personality and growing fame, Jamot handed down a set of values that are recognized by most physicians working to improve the living conditions of the unfortunately still suffering African population.

  9. Prediction of Climatic Change for the Next 100 Years in the Apulia Region, Southern Italy

    Directory of Open Access Journals (Sweden)

    Mladen Todorovic

    2007-12-01

Full Text Available The impact of climate change on water resources and their use for agricultural production has become a critical question for sustainability. Our objective was to investigate the impact of the expected climate changes over the next 100 years on water balance variations, climatic classifications, and crop water requirements in the Apulia region (Southern Italy). The results indicated that an increase of temperature in the range between 1.3 and 2.5 °C is expected in the next 100 years. Reference evapotranspiration (ETo) variations would follow a similar trend; averaged over the whole region, the ETo increase would be about 15.4%. Precipitation will not change significantly on a yearly basis, although a slight decrease in the summer months and a slight increase during the winter season are foreseen. The climatic water deficit (CWD) is largely driven by the ETo increase, and it would grow over the whole Apulia region by more than 200 mm on average. According to the Thornthwaite and Mather climate classification, the moisture index will decrease in the future, with humid areas shrinking and arid zones expanding. The net irrigation requirements (NIR), calculated for ten major crops in the Apulia region, would increase significantly in the future. By the end of the 21st century, the foreseen increase of NIR with respect to the present situation is greatest for olive (65%), wheat (61%), grapevine (49%), and citrus (48%), and slightly lower for maize (35%), sorghum (34%), sunflower (33%), tomato (31%), and winter and spring sugar beet (both 27%).
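The NIR figures in such studies come from a full crop-water-balance model; the basic monthly bookkeeping, FAO-56-style, is crop evapotranspiration (a crop coefficient Kc times ETo) minus effective precipitation, floored at zero. The crop, months, and values below are invented for illustration:

```python
def net_irrigation_requirement(eto_mm, peff_mm, kc):
    """Monthly NIR: ETc = Kc * ETo, minus effective precipitation,
    floored at zero (simplified FAO-56-style water balance)."""
    return [max(kc_m * eto - p, 0.0) for eto, p, kc_m in zip(eto_mm, peff_mm, kc)]

# Hypothetical May-August values for an olive orchard (mm/month):
eto  = [120.0, 160.0, 190.0, 180.0]   # reference evapotranspiration
peff = [30.0, 15.0, 5.0, 10.0]        # effective precipitation
kc   = [0.55, 0.60, 0.65, 0.65]       # crop coefficients by month

print(net_irrigation_requirement(eto, peff, kc))
```

Raising ETo by a projected percentage while holding precipitation roughly constant directly inflates every monthly term, which is why the study's NIR increases track the ETo trend.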

  10. 77 FR 66823 - Freedom of Information Act Request for Papers Submitted to DARPA for the 2011 100 Year Starship...

    Science.gov (United States)

    2012-11-07

    ... of the Secretary Freedom of Information Act Request for Papers Submitted to DARPA for the 2011 100 Year Starship Symposium AGENCY: Defense Advanced Research Projects Agency (DARPA), DoD. ACTION: Notice... panels at the 2011 100 Year Starship Symposium must provide DARPA a written response explaining...

  11. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

Most geophysical systems are macroscopic physical systems. The behavior of such systems is predicted by means of computational models whose basic mathematical models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1]. Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3]. Herrera, I., & Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press).

  12. Lessons to be learned from an analysis of ammonium nitrate disasters in the last 100 years

    Energy Technology Data Exchange (ETDEWEB)

    Pittman, William; Han, Zhe; Harding, Brian; Rosas, Camilo; Jiang, Jiaojun; Pineda, Alba; Mannan, M. Sam, E-mail: mannan@tamu.edu

    2014-09-15

    Highlights: • Root causes and contributing factors from ammonium nitrate incidents are categorized into 10 lessons. • The lessons learned from the past 100 years of ammonium nitrate incidents can be used to improve design, operation, and maintenance procedures. • Improving organizational memory to help improve safety performance. • Combating and changing organizational cultures. - Abstract: Process safety, as well as the safe storage and transportation of hazardous or reactive chemicals, has been a topic of increasing interest in the last few decades. The increased interest in improving the safety of operations has been driven largely by a series of recent catastrophes that have occurred in the United States and the rest of the world. A continuous review of past incidents and disasters to look for common causes and lessons is an essential component to any process safety and loss prevention program. While analyzing the causes of an accident cannot prevent that accident from occurring, learning from it can help to prevent future incidents. The objective of this article is to review a selection of major incidents involving ammonium nitrate in the last century to identify common causes and lessons that can be gleaned from these incidents in the hopes of preventing future disasters. Ammonium nitrate has been involved in dozens of major incidents in the last century, so a subset of major incidents were chosen for discussion for the sake of brevity. Twelve incidents are reviewed and ten lessons from these incidents are discussed.

  13. Physiological and morphological acclimation to height in cupressoid leaves of 100-year-old Chamaecyparis obtusa.

    Science.gov (United States)

    Shiraki, Ayumi; Azuma, Wakana; Kuroda, Keiko; Ishii, H Roaki

    2016-10-15

    Cupressoid (scale-like) leaves are morphologically and functionally intermediate between stems and leaves. While past studies on height acclimation of cupressoid leaves have focused on acclimation to the vertical light gradient, the relationship between morphology and hydraulic function remains unexplored. Here, we compared physiological and morphological characteristics between treetop and lower-crown leaves of 100-year-old Chamaecyparis obtusa Endl. trees (~27 m tall) to investigate whether height-acclimation compensates for hydraulic constraints. We found that physiological acclimation of leaves was determined by light, which drove the vertical gradient of evaporative demand, while leaf morphology and anatomy were determined by height. Compared with lower-crown leaves, treetop leaves were physiologically acclimated to water stress. Leaf hydraulic conductance was not affected by height, and this contributed to higher photosynthetic rates of treetop leaves. Treetop leaves had higher leaf area density and greater leaf mass per area, which increase light interception but could also decrease hydraulic efficiency. We inferred that transfusion tissue flanking the leaf vein, which was more developed in the treetop leaves, contributes to water-stress acclimation and maintenance of leaf hydraulic conductance by facilitating osmotic adjustment of leaf water potential and efficient water transport from xylem to mesophyll. Our findings may represent anatomical adaptation that compensates for hydraulic constraints on physiological function with increasing height.

  14. Revisiting extreme storms of the past 100 years for future safety of large water management infrastructures

    Science.gov (United States)

    Chen, Xiaodong; Hossain, Faisal

    2016-07-01

    Historical extreme storm events are widely used to make Probable Maximum Precipitation (PMP) estimates, which form the cornerstone of large water management infrastructure safety. Past studies suggest that extreme precipitation processes can be sensitive to land surface feedback and the planetary warming trend, which makes the future safety of large infrastructures questionable given the projected changes in land cover and temperature in the coming decades. In this study, a numerical modeling framework was employed to reconstruct 10 extreme storms over CONUS that occurred during the past 100 years, which are used by the engineering profession for PMP estimation for large infrastructures such as dams. Results show that the correlation in daily rainfall for such reconstruction can range between 0.4 and 0.7, while the correlation for maximum 3-day accumulation (a standard period used in infrastructure design) is always above 0.5 for post-1948 storms. This suggests that current numerical modeling and reanalysis data allow us to reconstruct big storms after 1948 with acceptable accuracy. For storms prior to 1948, however, reconstruction of storms shows inconsistency with observations. Our study indicates that numerical modeling and data may not have advanced to a sufficient level to understand how such old storms (pre-1948) may behave in future warming and land cover conditions. However, the infrastructure community can certainly rely on the use of model reconstructed extreme storms of the 1948-present period to reassess safety of our large water infrastructures under assumed changes in temperature and land cover.

  15. [Nutrient dynamics over the past 100 years and its restoration baseline in Dianshan Lake].

    Science.gov (United States)

    Li, Xiao-Ping; Chen, Xiao-Hua; Dong, Xu-Hui; Dong, Zhi; Sun, Dun-Ping

    2012-10-01

The restoration of eutrophic lakes requires a good knowledge of the history and baseline of nutrients in the lakes. This work conducted analyses of 210Pb/137Cs, water content, loss-on-ignition, sedimentary total phosphorus (TP), total nitrogen (TN), total organic carbon (TOC) and diatoms in four sediment cores from Dianshan Lake (near Shanghai City). Good coherence in the palaeoproxies between the cores indicates a relatively stable sedimentary environment. With increasing human impact, diatom communities shifted from oligotrophic species (Cyclotella bodanica, C. ocelata, Achnanthes minutissima, Cocconeis placentula var. lineata, Cymbella sp., Fragilaria pinnata, F. brevistriata, F. construens var. venter) to recent eutrophic species including Cyclostephanos dubius, C. atomus, Stephanodiscus minutulus, S. hantzschii and Aulacoseira alpigena. The epilimnetic TP over the past 100 years, reconstructed using an established diatom-TP transfer function, matches well with monitoring TP where such data exist. Based on the sedimentary nutrient characteristics and the diatom-reconstructed nutrient dynamics, we propose nutrient baselines for Dianshan Lake of 50-60 microg/L for water TP concentration, and 500 mg/kg and 550 mg/kg for sedimentary TP and TN, respectively.
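A common form of diatom-TP transfer function is weighted averaging: the inferred TP is the abundance-weighted mean of each species' TP optimum from a training set, usually computed in log10 space and back-transformed. The assemblage percentages and optima below are invented; the actual calibration behind the Dianshan reconstruction is not given in the abstract:

```python
import numpy as np

def wa_reconstruct(abundances, log10_tp_optima):
    """Weighted-averaging reconstruction: abundance-weighted mean of
    species TP optima in log10 space, back-transformed to ug/L."""
    a = np.asarray(abundances, dtype=float)
    opt = np.asarray(log10_tp_optima, dtype=float)
    return 10 ** (np.sum(a * opt) / np.sum(a))

# Hypothetical fossil assemblage (% abundance) with training-set optima
# of roughly 20, 60, and 100 ug/L TP:
tp = wa_reconstruct([40, 35, 25], [np.log10(20), np.log10(60), np.log10(100)])
print(round(tp, 1))  # inferred epilimnetic TP in ug/L
```

Applying this down-core, level by level, yields the reconstructed TP curve that is then compared against monitoring data.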

  16. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    Science.gov (United States)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.
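Among the "conventional parameters" used in surface metrology are the areal height parameters Sa (arithmetic mean deviation) and Sq (root-mean-square deviation), computed from a levelled height map such as one measured by confocal microscopy. A sketch with an invented 3×3 patch (the paper's multi-scale characterizations go well beyond this):

```python
import numpy as np

def sa_sq(height_map: np.ndarray):
    """Areal texture parameters from a height map: Sa (arithmetic mean
    deviation) and Sq (RMS deviation), after removing the mean height
    (a stand-in for proper plane levelling)."""
    z = height_map - height_map.mean()
    sa = float(np.mean(np.abs(z)))
    sq = float(np.sqrt(np.mean(z ** 2)))
    return sa, sq

# Hypothetical 3x3 patch of confocal heights (micrometres):
patch = np.array([[0.2, 0.1, 0.3],
                  [0.0, 0.2, 0.1],
                  [0.3, 0.1, 0.5]])
sa, sq = sa_sq(patch)
print(sa, sq)
```

Comparing such parameters between experimental and archaeological surfaces, as a function of scale, is what lets wear patterns be matched statistically rather than by eye.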

  17. The Emergence of Gravitational Wave Science: 100 Years of Development of Mathematical Theory, Detectors, Numerical Algorithms, and Data Analysis Tools

    CERN Document Server

    Holst, Michael; Tiglio, Manuel; Vallisneri, Michele

    2016-01-01

    On September 14, 2015, the newly upgraded Laser Interferometer Gravitational-wave Observatory (LIGO) recorded a loud gravitational-wave (GW) signal, emitted a billion light-years away by a coalescing binary of two stellar-mass black holes. The detection was announced in February 2016, in time for the hundredth anniversary of Einstein's prediction of GWs within the theory of general relativity (GR). The signal represents the first direct detection of GWs, the first observation of a black-hole binary, and the first test of GR in its strong-field, high-velocity, nonlinear regime. In the remainder of its first observing run, LIGO observed two more signals from black-hole binaries, one moderately loud, another at the boundary of statistical significance. The detections mark the end of a decades-long quest, and the beginning of GW astronomy: finally, we are able to probe the unseen, electromagnetically dark Universe by listening to it. In this article, we present a short historical overview of GW science: this youn...

  18. Applying a visual language for image processing as a graphical teaching tool in medical imaging

    Science.gov (United States)

    Birchman, James J.; Tanimoto, Steven L.; Rowberg, Alan H.; Choi, Hyung-Sik; Kim, Yongmin

    1992-05-01

Typical user interaction in image processing is with command-line entries, pull-down menus, or text menu selections from a list, and as such is not generally graphical in nature. Although applying these interactive methods to construct more sophisticated algorithms from a series of simple image processing steps may be clear to engineers and programmers, it may not be clear to clinicians. A solution to this problem is to implement a visual programming language that uses visual representations to express image processing algorithms. Visual representations promote a more natural and rapid understanding of image processing algorithms by providing more visual insight into what the algorithms do than the interactive methods mentioned above can provide. Individuals accustomed to dealing with images are more likely to understand an algorithm that is represented visually. This is especially true of referring physicians, such as surgeons in an intensive care unit. With the increasing acceptance of picture archiving and communication system (PACS) workstations and the trend toward increasing clinical use of image processing, referring physicians will need to learn more sophisticated concepts than simply image access and display. If the procedures that they perform commonly, such as window width and window level adjustment and image enhancement using unsharp masking, are depicted visually in an interactive environment, it will be easier for them to learn and apply these concepts. The software described in this paper is a visual programming language for image processing which has been implemented on the NeXT computer using NeXTstep user interface development tools and other tools in an object-oriented environment. The concept is based upon the description of a visual language titled `Visualization of Vision Algorithms' (VIVA).
Iconic representations of simple image processing steps are placed into a workbench screen and connected together into a dataflow path by the user. As
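The icon-and-wire composition described above can be mimicked in code by treating each icon as a pure function and each wire as function composition. Window/level mapping is a standard display transform; the `pipeline` helper and the sample values are illustrative, not part of the VIVA system:

```python
def window_level(pixels, level, width):
    """Map raw pixel values into [0, 1] display range: values below
    level - width/2 clamp to 0, above level + width/2 clamp to 1."""
    lo, hi = level - width / 2, level + width / 2
    return [min(max((p - lo) / (hi - lo), 0.0), 1.0) for p in pixels]

def pipeline(*stages):
    """Compose processing stages into a single dataflow path,
    like wiring icons together on the workbench screen."""
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

# One-stage dataflow: window/level for display (hypothetical settings).
display = pipeline(lambda px: window_level(px, level=40, width=400))
print(display([-160, 40, 240]))  # -> [0.0, 0.5, 1.0]
```

Adding an unsharp-masking icon would simply mean appending another stage to the same `pipeline` call, which is the point of the dataflow model.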

  19. 100-Year Floodplains, flood plain, Published in 2009, 1:24000 (1in=2000ft) scale, Washington County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Other information as of 2009. It is described as 'flood...

  20. Surveillance as an innovative tool for furthering technological development as applied to the plastic packaging sector

    Directory of Open Access Journals (Sweden)

    Freddy Abel Vargas

    2010-04-01

Full Text Available The demand for production process efficiency and quality has made it necessary to resort to new tools for development and technological innovation. Surveillance of the environment has thus been identified as a priority, paying special attention to technology, which (by its changing nature) is a key factor in competitiveness. Surveillance is a routine activity in organisations in developed countries; however, few suitable studies have been carried out in Colombia and few instruments produced for applying it to existing sectors of the economy. The present article attempts to define a methodology for technological awareness, based on transforming the information contained in databases by means of constructing technological maps, that contributes useful knowledge to production processes. This methodology has been applied to the flexible plastic packaging sector. The main trends in this industry's technological development were identified, allowing strategies to be proposed for incorporating these advances and tendencies into national companies and research groups involved in flexible plastic packaging technological development and innovation. Technological mapping's possibilities as an important instrument for producing technological development in a given sector are then analysed, as are its possibilities for use in other production processes.

  1. The Archives of the Department of Terrestrial Magnetism: Documenting 100 Years of Carnegie Science

    Science.gov (United States)

    Hardy, S. J.

    2005-12-01

    The archives of the Department of Terrestrial Magnetism (DTM) of the Carnegie Institution of Washington document more than a century of geophysical and astronomical investigations. Primary source materials available for historical research include field and laboratory notebooks, equipment designs, plans for observatories and research vessels, scientists' correspondence, and thousands of expedition and instrument photographs. Yet despite its history, DTM long lacked a systematic approach to managing its documentary heritage. A preliminary records survey conducted in 2001 identified more than 1,000 linear feet of historically-valuable records languishing in dusty, poorly-accessible storerooms. Intellectual control at that time was minimal. With support from the National Historical Publications and Records Commission, the "Carnegie Legacy Project" was initiated in 2003 to preserve, organize, and facilitate access to DTM's archival records, as well as those of the Carnegie Institution's administrative headquarters and Geophysical Laboratory. Professional archivists were hired to process the 100-year backlog of records. Policies and procedures were established to ensure that all work conformed to national archival standards. Records were appraised, organized, and rehoused in acid-free containers, and finding aids were created for the project web site. Standardized descriptions of each collection were contributed to the WorldCat bibliographic database and the AIP International Catalog of Sources for History of Physics. Historic photographs and documents were digitized for online exhibitions to raise awareness of the archives among researchers and the general public. The success of the Legacy Project depended on collaboration between archivists, librarians, historians, data specialists, and scientists. This presentation will discuss key aspects (funding, staffing, preservation, access, outreach) of the Legacy Project and is aimed at personnel in observatories, research

  2. Rainfall and Drought In Equatorial East Africa During The Past 1,100 Years

    Science.gov (United States)

    Verschuren, D.; Laird, K. R.; Cumming, B. F.

    Knowledge of natural long-term rainfall variability is essential for water-resource and land-use management in all dry-land regions of the world. In tropical Africa, data relevant to determining this variability are scarce because of the lack of long instrumental climate records and the limited potential of some high-resolution proxy climate archives such as tree rings and ice cores. An 1,100-year reconstruction of decadal-scale variability in African rainfall and drought based on lake-level and salinity fluctuations of Lake Naivasha (Eastern Rift Valley, Kenya) now indicates that eastern equatorial Africa within the last millennium has alternated between strongly contrasting climatic conditions, including significantly reduced effective moisture during the 'Medieval Warm Period' (~AD 900-1270), and a generally higher effective moisture than today during the 'Little Ice Age' (~AD 1270-1850), interrupted by three episodes of prolonged aridity (~AD 1380-1420, 1560-1620, and 1760-1840) more severe than any historically recorded drought. Pattern and timing of the reconstructed Little Ice Age climate fluctuations correlate with the residual record of atmospheric radiocarbon production, suggesting that long-term variations in solar radiation may have contributed to African rainfall variability at these time scales. In agreement with other recently documented instances of a solar influence on long-term variations in hydrological balance, solar minima correlated with increases in effective moisture and solar maxima with increased aridity. It remains unclear, however, whether variation in solar radiation generated fluctuations in effective moisture mainly by means of a direct or indirect influence on rainfall, or rather through the influence of temperature variations on evaporation rates.

  3. To Humbly Go: Guarding Against Perpetuating Models of Colonization in the 100-Year Starship Study

    Science.gov (United States)

    Kramer, W. R.

    Past patterns of exploration, colonization and exploitation on Earth continue to provide the predominant paradigms that guide many space programs. Any project of crewed space exploration, especially of the magnitude envisioned by the 100-Year Starship Study, must guard against the hubris that may emerge among planners, crew, and others associated with the project, including those industries and bureaucracies that will emerge from the effort. Maintaining a non-exploitative approach may be difficult in consideration of the century of preparatory research and development and the likely multigenerational nature of the voyage itself. Starting now with mission dreamers and planners, the purpose of the voyage must be cast as one of respectful learning and humble discovery, not of conquest (either actual or metaphorical) or other inappropriate models, including military. At a minimum, the Study must actively build non-violence into the voyaging culture it is beginning to create today. References to exploitive colonization, conquest, destiny and other terms from especially American frontier mythology, while tempting in their propagandizing power, should be avoided as they limit creative thinking about alternative possible futures. Future voyagers must strive to adapt to new environments wherever possible and be assimilated by new worlds both biologically and behaviorally rather than to rely on attempts to recreate the Earth they have left. Adaptation should be strongly considered over terraforming. This paper provides an overview of previous work linking the language of colonization to space programs and challenges the extension of the myth of the American frontier to the Starship Study. It argues that such metaphors would be counter-productive at best and have the potential to doom long-term success and survival by planting seeds of social decay and self-destruction. Cautions and recommendations are suggested.

  4. Land use mapping from CBERS-2 images with open source tools by applying different classification algorithms

    Science.gov (United States)

    Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.

    2016-02-01

    Land cover classification is often based on differing characteristics between classes but great homogeneity within each of them. This cover information is obtained through field work or by processing satellite images. Field work involves high costs; therefore, digital image processing techniques have become an important alternative for performing this task. However, in some developing countries, and particularly in Casacoima municipality in Venezuela, geographic information systems are lacking due to outdated information and the high cost of software license acquisition. This research proposes a low-cost methodology to develop thematic mapping of local land use and cover types in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network using open source tools. The supervised classification method, per pixel and per region, was applied using different classification algorithms, which were compared among themselves. Per-pixel classification was based on the Maxver (maximum likelihood) and Euclidean distance (minimum distance) algorithms, while per-region classification was based on the Bhattacharya algorithm. Satisfactory results were obtained from per-region classification, with an overall reliability of 83.93% and a kappa index of 0.81. The Maxver algorithm showed a reliability of 73.36% and a kappa index of 0.69, while Euclidean distance obtained values of 67.17% and 0.61 for reliability and kappa index, respectively. It was demonstrated that the proposed methodology was very useful for cartographic processing and updating, which in turn supports the development of management and land-use plans. Hence, open source tools showed themselves to be an economically viable alternative not only for forestry organizations, but for the general public, allowing them to develop projects in economically depressed and/or environmentally threatened areas.
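
    The reliability figures quoted above (overall accuracy and kappa index) are computed from a confusion matrix of classified versus reference pixels. A minimal sketch of both calculations follows; the 3×3 matrix is illustrative, not the study's data:

```python
# Sketch: overall accuracy and kappa index from a confusion matrix, as used
# to compare the Maxver, Euclidean-distance, and Bhattacharya classifiers.
# The matrix below is illustrative only (rows = reference, cols = classified).
import numpy as np

def accuracy_and_kappa(cm):
    """Return (overall accuracy, kappa) for a square confusion matrix."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    observed = np.trace(cm) / n                      # overall accuracy
    expected = (cm.sum(0) * cm.sum(1)).sum() / n**2  # chance agreement
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

cm = [[50, 3, 2],
      [4, 45, 6],
      [1, 5, 40]]
acc, kappa = accuracy_and_kappa(cm)
print(f"overall accuracy = {acc:.2%}, kappa = {kappa:.2f}")
```

    Kappa corrects the raw agreement for the agreement expected by chance alone, which is why it is reported alongside the percentage reliability.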

  5. The Hunterian Neurosurgical Laboratory: the first 100 years of neurosurgical research.

    Science.gov (United States)

    Sampath, P; Long, D M; Brem, H

    2000-01-01

    Modern neurosurgery has long had a strong laboratory foundation, and much of this tradition can be traced to the Hunterian Neurosurgical Laboratory of the Johns Hopkins Hospital. Founded with the basic goals of investigating the causes and symptoms of disease and establishing the crucial role that surgeons may play in the treatment of disease, the Hunterian laboratory has adhered to these tenets, despite the dramatic changes in neurosurgery that have occurred in the last 100 years. Named for the famous English surgeon John Hunter (1728-1793), the Hunterian laboratory was conceived by William Welch and William Halsted as a special laboratory for experimental work in surgery and pathology. In 1904, Harvey Cushing was appointed by Halsted to direct the laboratory. With the three primary goals of student education, veterinary surgery that stressed surgical techniques, and meticulous surgical and laboratory record-keeping, the laboratory was quite productive, introducing the use of physiological saline solutions, describing the anatomic features and function of the pituitary gland, and establishing the field of endocrinology. In addition, the original development of hanging drop tissue culture, fundamental investigations into cerebrospinal fluid, and countless contributions to otolaryngology by Samuel Crowe all occurred during this "crucible" period. In 1912, Cushing was succeeded by Walter Dandy, whose work on experimental hydrocephalus and cerebrospinal fluid circulation led to the development of pneumoencephalography. The early days of neurosurgery evolved with close ties to general surgery, and so did the Hunterian laboratory. After Dandy began devoting his time to clinical work, general surgeons (first Jay McLean and then, in 1922, Ferdinand Lee) became the directors of the laboratory. Between 1928 and 1942, more than 150 original articles were issued from the Hunterian laboratory; these articles described significant advances in surgery, including pioneering

  6. Evolution of iron minerals in a 100-year-old Technosol. Consequences on Zn mobility

    Energy Technology Data Exchange (ETDEWEB)

    Coussy, Samuel; Grangeon, Sylvain; Bataillard, Philippe; Khodja, Hicham; Maubec, Nicolas; Faure, Pierre; Schwartz, Christophe; Dagois, Robin (BRGM, France); (CNRS-UMR)

    2017-03-01

    The prediction of long-term trace element mobility in anthropogenic soils would be a way to anticipate land management and should help in reusing slightly contaminated materials. In the present study, the evolution of iron (Fe) and zinc (Zn) status was investigated in a 100-year-old Technosol. The site of investigation is an old brownfield located in the Nord-Pas-de-Calais region (France) which has not been reshaped since the beginning of the last century. The whole soil profile was sampled as a function of depth, and trace element mobility at each depth was determined by batch leaching test. A specific focus on Fe and Zn status was carried out by bulk analyses, such as selective dissolution, X-ray diffraction (XRD) and X-ray absorption spectroscopy (XAS). Fe and Zn status in the profile samples was also studied using laterally resolved techniques such as μ-particle induced X-ray emission (μ-PIXE) and μ-Rutherford backscattering spectroscopy (μ-RBS). The results indicate that (i) Fe is mainly in Fe(III) form, except for a minor contribution of Fe(II) in the deeper samples, (ii) some Fe species inherited from the past have been weathered and secondary minerals are constituted of metal-bearing sulphates and Fe (hydr)oxides, (iii) ferrihydrite is formed during pedogenesis, (iv) 20 to 30% more Fe (hydr)oxides are present at the surface than at depth and (v) Zn has tetrahedral coordination and is sorbed to phases of increasing crystallinity as depth increases. Zn-bearing phases identified in the present study are: complex Fe, Mn, Zn sulphides, sulphates, organic matter, and ferrihydrite. Soil formation on such material does not induce a dramatic increase of Zn solubility since efficient scavengers are concomitantly formed in the system. However, Technosols are highly heterogeneous and widely differ from one place to another. The behavior examined in this study is not generic and will depend on the type of Technosol and on the secondary minerals formed as well as on

  7. Underworld-GT Applied to Guangdong, a Tool to Explore the Geothermal Potential of the Crust

    Institute of Scientific and Technical Information of China (English)

    Steve Quenette; Yufei Xi; John Mansour; Louis Moresi; David Abramson

    2015-01-01

    Geothermal energy potential is usually discussed in the context of conventional or engineered systems and at the scale of an individual reservoir. Whereas exploration for conventional reservoirs has been relatively easy, with expressions of resource found close to or even at the surface, exploration for non-conventional systems relies on temperature inherently increasing with depth and searching for favourable geological environments that maximise this increase. To utilise the information we do have, we often assimilate available exploration data with models that capture the physics of the dominant underlying processes. Here, we discuss computational modelling approaches to exploration at a regional or crust scale, with application to geothermal reservoirs within basins or systems of basins. Target reservoirs have (at least) appropriate temperature and permeability and are at accessible depths. We discuss the software development approach that leads to effective use of the tool Underworld. We explore its role in the process of modelling, understanding computational error, and importing and exporting geological knowledge as applied to the geological system underpinning the Guangdong Province, China.

  8. Is Scores Derived from the Most Internationally Applied Patient Safety Culture Assessment Tool Correct?

    Directory of Open Access Journals (Sweden)

    Javad Moghri

    2013-09-01

    Full Text Available Background: The Hospital Survey on Patient Safety Culture, known as HSOPS, is an internationally well known and widely used tool for measuring patient safety culture in hospitals. It includes 12 dimensions with positively and negatively worded questions. The distribution of these questions across dimensions is uneven and creates a risk of acquiescence bias. The aim of this study was to assess the questionnaire against this bias. Methods: Three hundred nurses were randomly assigned to study and control groups. The short form of HSOPS was distributed in the control group and a fully reversed form of it was given to the study group. Percent positive scores and t-tests were applied for data analysis. Statistical analyses were conducted using SPSS Version 16. Results: A total of 272 nurses completed the questionnaire. In both groups, all dimensions with positively worded items had higher scores than their negatively worded counterparts. The first dimension, "organizational learning and continued improvement", which showed the only statistically significant difference, scored 16.2% lower in the study group than in the control group. In addition, six out of 18 differences at the question level were statistically significant. Conclusion: The popular and widely used HSOPS is subject to acquiescence bias. The bias might exaggerate the status of some patient safety culture composites. Balancing the number of positively and negatively worded items in each composite could mitigate this bias and provide a more valid estimation of the different elements of patient safety culture.
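
    The analysis described, percent positive scores compared between groups with a t-test, can be sketched as follows. The response vectors and the use of a Welch t-statistic are illustrative assumptions, not the study's data:

```python
# Sketch of HSOPS-style percent-positive scoring: the share of respondents
# answering 4 or 5 on a 5-point agreement scale, compared between the
# standard-wording (control) and reverse-wording (study) groups with a
# Welch t-test. All response vectors below are made up for illustration.
from statistics import mean, stdev
import math

def percent_positive(responses):
    return 100.0 * sum(r >= 4 for r in responses) / len(responses)

def welch_t(a, b):
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

control = [4, 5, 4, 3, 5, 4, 4, 2, 5, 4]  # positively worded items
study = [3, 4, 2, 3, 4, 3, 2, 3, 4, 3]    # reverse-worded items
print(percent_positive(control), percent_positive(study))
print(round(welch_t(control, study), 2))
```

    A systematic gap between the two groups on items that differ only in wording direction is the signature of acquiescence bias the study tested for.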

  9. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools is available for Global Software Engineering (GSE) teams, it is being realized increasingly that the desktop metaphor underpinning the majority of these tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can... be a promising alternative for building tools for GSE. However, significant effort is required to introduce a new paradigm; there is a need for a sound theoretical foundation based on activity theory to address the challenges faced by tools in GSE. This paper reports our effort aimed at building theoretical foundations... in building supporting infrastructure for GSE, and describes a proof-of-concept prototype.

  10. Applying New Diabetes Teaching Tools in Health-Related Extension Programming

    Science.gov (United States)

    Grenci, Alexandra

    2010-01-01

    In response to the emerging global diabetes epidemic, health educators are searching for new and better education tools to help people make positive behavior changes to successfully prevent or manage diabetes. Conversation Maps[R] are new learner-driven education tools that have been developed to empower individuals to improve their health…

  11. Tool for Experimenting with Concepts of Mobile Robotics as Applied to Children's Education

    Science.gov (United States)

    Jimenez Jojoa, E. M.; Bravo, E. C.; Bacca Cortes, E. B.

    2010-01-01

    This paper describes the design and implementation of a tool for experimenting with mobile robotics concepts, primarily for use by children and teenagers, or by the general public, without previous experience in robotics. This tool helps children learn about science in an approachable and interactive way, using scientific research principles in…

  12. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    Science.gov (United States)

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  13. 100 years of elementary particles [Beam Line, vol. 27, issue 1, Spring 1997]

    Energy Technology Data Exchange (ETDEWEB)

    Pais, Abraham; Weinberg, Steven; Quigg, Chris; Riordan, Michael; Panofsky, Wolfgang K.H.; Trimble, Virginia

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  14. 100 years of Elementary Particles [Beam Line, vol. 27, issue 1, Spring 1997]

    Science.gov (United States)

    Pais, Abraham; Weinberg, Steven; Quigg, Chris; Riordan, Michael; Panofsky, Wolfgang K. H.; Trimble, Virginia

    1997-04-01

    This issue of Beam Line commemorates the 100th anniversary of the April 30, 1897 report of the discovery of the electron by J.J. Thomson and the ensuing discovery of other subatomic particles. In the first three articles, theorists Abraham Pais, Steven Weinberg, and Chris Quigg provide their perspectives on the discoveries of elementary particles as well as the implications and future directions resulting from these discoveries. In the following three articles, Michael Riordan, Wolfgang Panofsky, and Virginia Trimble apply our knowledge about elementary particles to high-energy research, electronics technology, and understanding the origin and evolution of our Universe.

  15. 100 years of training and development research: What we know and where we should go.

    Science.gov (United States)

    Bell, Bradford S; Tannenbaum, Scott I; Ford, J Kevin; Noe, Raymond A; Kraiger, Kurt

    2017-03-01

    Training and development research has a long tradition within applied psychology dating back to the early 1900s. Over the years, not only has interest in the topic grown but there have been dramatic changes in both the science and practice of training and development. In the current article, we examine the evolution of training and development research using articles published in the Journal of Applied Psychology (JAP) as a primary lens to analyze what we have learned and to identify where future research is needed. We begin by reviewing the timeline of training and development research in JAP from 1918 to the present in order to elucidate the critical trends and advances that define each decade. These trends include the emergence of more theory-driven training research, greater consideration of the role of the trainee and training context, examination of learning that occurs outside the classroom, and understanding training's impact across different levels of analysis. We then examine in greater detail the evolution of 4 key research themes: training criteria, trainee characteristics, training design and delivery, and the training context. In each area, we describe how the focus of research has shifted over time and highlight important developments. We conclude by offering several ideas for future training and development research. (PsycINFO Database Record

  16. Predictive Maintenance--An Effective Money Saving Tool Being Applied in Industry Today.

    Science.gov (United States)

    Smyth, Tom

    2000-01-01

    Looks at preventive/predictive maintenance as it is used in industry. Discusses core preventive maintenance tools that must be understood to prepare students. Includes a list of websites related to the topic. (JOW)

  17. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis.

    Science.gov (United States)

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can improve variant analysis. Existing tools such as HotNet2 and dmGWAS provide various analytical methods. We developed a prototype pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f).

  18. Cast Iron And Mineral Cast Applied For Machine Tool Bed - Dynamic Behavior Analysis

    OpenAIRE

    2015-01-01

    Cast iron and mineral cast are the materials most often used in the design of machine structural elements (bodies, housings, machine tool beds, etc.). The materials differ significantly in physical and mechanical properties. The ability to suppress vibration is one of the most important factors determining the dynamic properties of a machine and has a significant impact on the machining capabilities of a machine tool. Recent research and development trends show that there is a clear tendency t...

  19. Hamiltonian Systems and Optimal Control in Computational Anatomy: 100 Years Since D'Arcy Thompson.

    Science.gov (United States)

    Miller, Michael I; Trouvé, Alain; Younes, Laurent

    2015-01-01

    The Computational Anatomy project is the morphome-scale study of shape and form, which we model as an orbit under diffeomorphic group action. Metric comparison calculates the geodesic length of the diffeomorphic flow connecting one form to another. Geodesic connection provides a positioning system for coordinatizing the forms and positioning their associated functional information. This article reviews progress since the Euler-Lagrange characterization of the geodesics a decade ago. Geodesic positioning is posed as a series of problems in Hamiltonian control, which emphasize the key reduction from the Eulerian momentum with dimension of the flow of the group, to the parametric coordinates appropriate to the dimension of the submanifolds being positioned. The Hamiltonian viewpoint provides important extensions of the core setting to new, object-informed positioning systems. Several submanifold mapping problems are discussed as they apply to metamorphosis, multiple shape spaces, and longitudinal time series studies of growth and atrophy via shape splines.

  20. Accumulation of pharmaceuticals, Enterococcus, and resistance genes in soils irrigated with wastewater for zero to 100 years in central Mexico.

    Directory of Open Access Journals (Sweden)

    Philipp Dalkmann

    Full Text Available Irrigation with wastewater releases pharmaceuticals, pathogenic bacteria, and resistance genes, but little is known about the accumulation of these contaminants in the environment when wastewater is applied for decades. We sampled a chronosequence of soils that were variously irrigated with wastewater from zero up to 100 years in the Mezquital Valley, Mexico, and investigated the accumulation of ciprofloxacin, enrofloxacin, sulfamethoxazole, trimethoprim, clarithromycin, carbamazepine, bezafibrate, naproxen, and diclofenac, as well as the occurrence of Enterococcus spp. and sul and qnr resistance genes. Total concentrations of ciprofloxacin, sulfamethoxazole, and carbamazepine increased with irrigation duration, reaching 95% of their upper limits of 1.4 µg/kg (ciprofloxacin), 4.3 µg/kg (sulfamethoxazole), and 5.4 µg/kg (carbamazepine) in soils irrigated for 19-28 years. Accumulation was soil-type-specific, with the largest accumulation rates in Leptosols and no time trend in Vertisols. Acidic pharmaceuticals (diclofenac, naproxen, bezafibrate) were not retained and thus did not accumulate in soils. We did not detect qnrA genes, but qnrS and qnrB genes were found in two of the irrigated soils. Relative concentrations of sul1 genes in irrigated soils were two orders of magnitude larger (3.15 × 10⁻³ ± 0.22 × 10⁻³ copies/16S rDNA) than in non-irrigated soils (4.35 × 10⁻⁵ ± 1.00 × 10⁻⁵ copies/16S rDNA), while those of sul2 exceeded the ones in non-irrigated soils by a factor of 22 (6.61 × 10⁻⁴ ± 0.59 × 10⁻⁴ versus 2.99 × 10⁻⁵ ± 0.26 × 10⁻⁵ copies/16S rDNA). Absolute numbers of sul genes continued to increase with prolonged irrigation, together with Enterococcus spp. 23S rDNA and total 16S rDNA contents. Increasing total concentrations of antibiotics in soil are not accompanied by increasing relative abundances of resistance genes. Nevertheless, wastewater irrigation enlarges the absolute concentration of resistance genes in soils due to a

  1. Operation reliability assessment for cutting tools by applying a proportional covariate model to condition monitoring information.

    Science.gov (United States)

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-09-25

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools.
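
    The core assumption of the proportional covariate model, as described, is that the condition-monitoring covariate is proportional to the failure rate, so a baseline covariate function fitted on a small historical sample lets the hazard of a tool in service be estimated from its monitored features. A minimal numeric sketch; all feature and hazard values are illustrative, not the paper's data:

```python
# Minimal sketch of the proportional-covariate idea: assume the monitored
# feature satisfies feature(t) = c * hazard(t), estimate c from historical
# run-to-failure data, then scale a new tool's feature trace by 1/c to get
# its estimated hazard. Numbers are illustrative only.
import numpy as np

# baseline: historical feature values and the hazards observed with them
hist_feature = np.array([0.10, 0.14, 0.22, 0.35, 0.60])
hist_hazard = np.array([0.02, 0.028, 0.044, 0.07, 0.12])
c = float(np.mean(hist_feature / hist_hazard))  # proportionality constant

# tool being assessed: vibration-derived feature over successive checks
new_feature = np.array([0.12, 0.18, 0.30, 0.50, 0.90])
estimated_hazard = new_feature / c
print(np.round(estimated_hazard, 3))
```

    In the paper the covariate is built from wavelet-packet features of vibration signals; here a single scalar feature stands in for that covariate function to keep the proportionality step visible.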

  2. Integrated tools and techniques applied to the TES ground data system

    Science.gov (United States)

    Morrison, B. A.

    2000-01-01

    The author of this paper will discuss the selection of CASE tools, a decision-making process, requirements tracking, and a review mechanism that together lead to a highly integrated approach to software development, one that must deal with the constant pressure to change software requirements and design that is associated with research and development.

  3. The DPSIR approach applied to marine eutrophication in LCIA as a learning tool

    DEFF Research Database (Denmark)

    Cosme, Nuno Miguel Dias; Olsen, Stig Irving

    eutrophication. The goal is to promote an educational example of environmental impacts assessment through science-based tools to predict the impacts, communicate knowledge and support decisions. The example builds on the (D) high demand for fixation of reactive nitrogen that supports several socio...

  4. The Theory of Planned Behaviour Applied to Search Engines as a Learning Tool

    Science.gov (United States)

    Liaw, Shu-Sheng

    2004-01-01

    Search engines have been developed for helping learners to seek online information. Based on theory of planned behaviour approach, this research intends to investigate the behaviour of using search engines as a learning tool. After factor analysis, the results suggest that perceived satisfaction of search engine, search engines as an information…

  5. Applying Behavior-Based Robotics Concepts to Telerobotic Use of Power Tooling

    Energy Technology Data Exchange (ETDEWEB)

    Noakes, Mark W [ORNL]; Hamel, Dr. William R. [University of Tennessee, Knoxville (UTK)]

    2011-01-01

    While it has long been recognized that telerobotics has potential advantages to reduce operator fatigue, to permit lower skilled operators to function as if they had higher skill levels, and to protect tools and manipulators from excessive forces during operation, relatively little laboratory research in telerobotics has actually been implemented in fielded systems. Much of this has to do with the complexity of the implementation and its lack of ability to operate in complex unstructured remote systems environments. One possible solution is to approach the tooling task using an adaptation of behavior-based techniques to facilitate task decomposition to a simpler perspective and to provide sensor registration to the task target object in the field. An approach derived from behavior-based concepts has been implemented to provide automated tool operation for a teleoperated manipulator system. The generic approach is adaptable to a wide range of typical remote tools used in hot-cell and decontamination and dismantlement-type operations. Two tasks are used in this work to test the validity of the concept. First, a reciprocating saw is used to cut a pipe. The second task is bolt removal from mockup process equipment. This paper explains the technique, its implementation, and covers experimental data, analysis of results, and suggestions for implementation on fielded systems.

  6. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  7. δ18O record and temperature change over the past 100 years in ice cores on the Tibetan Plateau

    Institute of Scientific and Technical Information of China (English)

    YAO Tandong; GUO Xuejun; Lonnie Thompson; DUAN Keqin; WANG Ninglian; PU Jianchen; XU Baiqing; YANG Xiaoxin; SUN Weizhen

    2006-01-01

    The 213 m ice core from the Puruogangri Ice Field on the Tibetan Plateau facilitates the study of the regional temperature changes with its δ18O record of the past 100 years. Here we combine information from this core with that from the Dasuopu ice core (from the southern Tibetan Plateau), the Guliya ice core (from the northwestern Plateau) and the Dunde ice core (from the northeastern Plateau) to learn about the regional differences in temperature change across the Tibetan Plateau. The δ18O changes vary with region on the Plateau, the variations being especially large between South and North and between East and West. Moreover, these four ice cores present increasing δ18O trends, indicating warming on the Tibetan Plateau over the past 100 years. A comparative study of Northern Hemisphere (NH) temperature changes, the δ18O-reflected temperature changes on the Plateau, and available meteorological records show consistent trends in overall warming during the past 100 years.

  8. Systems thinking tools as applied to community-based participatory research: a case study.

    Science.gov (United States)

    BeLue, Rhonda; Carmack, Chakema; Myers, Kyle R; Weinreb-Welch, Laurie; Lengerich, Eugene J

    2012-12-01

    Community-based participatory research (CBPR) is being used increasingly to address health disparities and complex health issues. The authors propose that CBPR can benefit from a systems science framework to represent the complex and dynamic characteristics of a community and identify intervention points and potential "tipping points." Systems science refers to a field of study that posits a holistic framework that is focused on component parts of a system in the context of relationships with each other and with other systems. Systems thinking tools can assist in intervention planning by allowing all CBPR stakeholders to visualize how community factors are interrelated and by potentially identifying the most salient intervention points. To demonstrate the potential utility of systems science tools in CBPR, the authors show the use of causal loop diagrams by a community coalition engaged in CBPR activities regarding youth drinking reduction and prevention.

  9. Is Scores Derived from the Most Internationally Applied Patient Safety Culture Assessment Tool Correct?

    OpenAIRE

    Javad Moghri; Ali Akbari Sari; Mehdi Yousefi; Hasan Zahmatkesh; Ranjbar Mohammad Ezzatabadi; Pejman Hamouzadeh; Satar Rezaei; Jamil Sadeghifar

    2013-01-01

    Abstract. Background: Hospital Survey on Patient Safety Culture, known as HSOPS, is an internationally well-known and widely used tool for measuring patient safety culture in hospitals. It includes 12 dimensions with positively and negatively worded questions. The distribution of these questions across dimensions is uneven, which introduces the risk of acquiescence bias. The aim of this study was to assess the questionnaire against this bias. Methods: Three hundred nurses were assigned into study ...

  10. Applying Dataflow Architecture and Visualization Tools to In Vitro Pharmacology Data Automation.

    Science.gov (United States)

    Pechter, David; Xu, Serena; Kurtz, Marc; Williams, Steven; Sonatore, Lisa; Villafania, Artjohn; Agrawal, Sony

    2016-12-01

    The pace and complexity of modern drug discovery places ever-increasing demands on scientists for data analysis and interpretation. Data flow programming and modern visualization tools address these demands directly. Three different requirements, one for allosteric modulator analysis, one for a specialized clotting analysis, and one for enzyme global progress curve analysis, are reviewed, and their execution in a combined data flow/visualization environment is outlined.

  11. Applying quality management tools to medical photography services: a pilot project.

    Science.gov (United States)

    Murray, Peter

    2003-03-01

    The Medical Photography Department at Peterborough Hospitals NHS Trust set up a pilot project to reduce the turnaround time of fundus fluorescein angiograms to the Ophthalmology Department. Quality management tools were used to analyse current photographic practices and develop more efficient methods of service delivery. The improved service to the Ophthalmology Department demonstrates the value of quality management in developing medical photography services at Peterborough Hospitals.

  12. Drugs on the internet, part IV: Google's Ngram viewer analytic tool applied to drug literature.

    Science.gov (United States)

    Montagne, Michael; Morgan, Melissa

    2013-04-01

    Google Inc.'s digitized book library can be searched based on key words and phrases over a five-century time frame. Application of the Ngram Viewer to drug literature was assessed for its utility as a research tool. The results appear promising as a method for noting changes in the popularity of specific drugs over time, historical epidemiology of drug use and misuse, and adoption and regulation of drug technologies.

  13. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico]; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)]

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and the standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process involving more than 100 employees and one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implement regional and corporate procedures, focusing on the main operational processes. (author)

  14. Atomic Force Microscopy as a Tool for Applied Virology and Microbiology

    Science.gov (United States)

    Zaitsev, Boris

    2003-12-01

    The atomic force microscope (AFM) can be successfully used for the simple and fast solution of many applied biological problems. This paper surveys results obtained with the atomic force microscope SolverP47BIO (NT-MDT, Russia) at the State Research Center of Virology and Biotechnology "Vector". The AFM has been used: - in applied virology, for counting viral particles and examining virus-cell interaction; - in microbiology, for measuring and detecting bacterial spores and cells; - in biotechnology, for controlling biotechnological processes and evaluating the particle-size distribution of viral and bacterial diagnostic assays. The main advantages of AFM in applied research are simple sample preparation and short examination times.

  15. Applying CRISPR-Cas9 tools to identify and characterize transcriptional enhancers.

    Science.gov (United States)

    Lopes, Rui; Korkmaz, Gozde; Agami, Reuven

    2016-09-01

    The development of the CRISPR-Cas9 system triggered a revolution in the field of genome engineering. Initially, the use of this system was focused on the study of protein-coding genes but, recently, a number of CRISPR-Cas9-based tools have been developed to study non-coding transcriptional regulatory elements. These technological advances offer unprecedented opportunities for elucidating the functions of enhancers in their endogenous context. Here, we discuss the application, current limitations and future development of CRISPR-Cas9 systems to identify and characterize enhancer elements in a high-throughput manner.

  16. 100 years of radar

    CERN Document Server

    Galati, Gaspare

    2016-01-01

    This book offers fascinating insights into the key technical and scientific developments in the history of radar, from the first patent, taken out by Hülsmeyer in 1904, through to the present day. Landmark events are highlighted and fascinating insights provided into the exceptional people who made possible the progress in the field, including the scientists and technologists who worked independently and under strict secrecy in various countries across the world in the 1930s and the big businessmen who played an important role after World War II. The book encourages multiple levels of reading. The author is a leading radar researcher who is ideally placed to offer a technical/scientific perspective as well as a historical one. He has taken care to structure and write the book in such a way as to appeal to both non-specialists and experts. The book is not sponsored by any company or body, either formally or informally, and is therefore entirely unbiased. The text is enriched by approximately three hundred ima...

  17. 100 years of superconductivity

    CERN Multimedia

    Globe Info

    2011-01-01

    Public lecture by Philippe Lebrun, who works at CERN on applications of superconductivity and cryogenics for particle accelerators. He was head of CERN’s Accelerator Technology Department during the LHC construction period. Centre culturel Jean Monnet, route de Gex Tuesday 11 October from 8.30 p.m. to 10.00 p.m. » Suitable for all – Admission free - Lecture in French » Number of places limited For further information: +33 (0)4 50 42 29 37

  18. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    Science.gov (United States)

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step in managing an urban area from a sound standpoint is the evaluation of its soundscape. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step in evaluating it, providing a basis for designing or adapting it to match people's expectations as well. This work therefore proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended to serve as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented in developing the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified).
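    The classification pipeline described, acoustic features in, max-margin class labels out, can be sketched as follows. As a rough stand-in for the paper's SMO-trained SVMs, this trains a linear SVM with the Pegasos stochastic subgradient method on synthetic two-cluster "soundscape" features; the data, features, and parameters are illustrative, not from the study.

```python
import numpy as np

def pegasos_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Train a linear SVM by stochastic subgradient descent (Pegasos).

    A simple stand-in for an SMO-trained SVM; X is (n_samples, n_features),
    y holds labels in {-1, +1}.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * (X[i] @ w + b)
            w *= 1 - eta * lam            # shrink weights (regularisation)
            if margin < 1:                # hinge-loss subgradient step
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

# Two toy "soundscape" clusters in a 2-D acoustic feature space
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
w, b = pegasos_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)   # training accuracy on separable data
```

    In practice a kernel SVM trained by SMO (as in the paper) would replace this linear sketch once the feature set is fixed.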

  19. Deriving efficient policy portfolios promoting sustainable energy systems-Case studies applying Invert simulation tool

    Energy Technology Data Exchange (ETDEWEB)

    Kranzl, Lukas; Stadler, Michael; Huber, Claus; Haas, Reinhard [Energy Economics Group, Vienna University of Technology, Gusshausstrasse 28/29/373-2A, 1040 Vienna (Austria)]; Ragwitz, Mario; Brakhage, Anselm [Fraunhofer Institute for Systems and Innovation Research, Breslauer Strasse 48, D-76139 Karlsruhe (Germany)]; Gula, Adam; Figorski, Arkadiusz [Faculty of Fuels and Energy, AGH University of Science and Technology, Al. Mickiewicza 30, PL-30-059 Krakow (Poland)]

    2006-12-15

    Within recent years, energy policies have imposed a number of targets at European and national level for rational use of energy (RUE), renewable energy sources (RES) and related CO{sub 2} reductions. As a result, a wide variety of policy instruments is currently implemented, and hence the question arises: how can these instruments be designed to reach the maximum policy target with the minimum public money spent? The objective of this paper is to derive a methodology for obtaining efficient policy portfolios promoting sustainable energy systems depending on the policy target, and to show corresponding results from case studies in Austria, Germany and Poland. The investigations were carried out by applying the Invert simulation tool, a computer model developed for simulating the impacts of various promotion schemes for renewable and efficient energy systems. With this tool, the CO{sub 2} reductions and related public expenses have been calculated for various policy mixes. In the building-related energy sector, it turned out that in all investigated regions support schemes for supply-side measures are the most cost-efficient instruments. However, their potential is restricted, and for achieving higher levels of CO{sub 2} reduction, promotion of demand-side measures is indispensable. The paper shows that for a comprehensive comparison of policy portfolios there are always two dimensions to consider: efficiency and effectiveness. The more effective a scheme, i.e. the higher its implementation rate, the more essential the efficiency criterion becomes. (author)

  20. Orymold: ontology based gene expression data integration and analysis tool applied to rice

    Directory of Open Access Journals (Sweden)

    Segura Jordi

    2009-05-01

    Abstract. Background: Integration and exploration of data obtained from genome-wide monitoring technologies has become a major challenge for many bioinformaticists and biologists due to its heterogeneity and high dimensionality. A widely accepted approach to solving these issues has been the creation and use of controlled vocabularies (ontologies). Ontologies allow for the formalization of domain knowledge, which in turn enables generalization in the creation of querying interfaces as well as in the integration of heterogeneous data, providing both human- and machine-readable interfaces. Results: We designed and implemented a software tool that allows investigators to create their own semantic model of an organism and to use it to dynamically integrate expression data obtained from DNA microarrays and other probe-based technologies. The software provides tools to use the semantic model to postulate and validate hypotheses on the spatial and temporal expression and function of genes. In order to illustrate the software's use and features, we used it to build a semantic model of rice (Oryza sativa) and integrated experimental data into it. Conclusion: In this paper we describe the development and features of a flexible software application for dynamic gene expression data annotation, integration, and exploration called Orymold. Orymold is freely available for non-commercial users from http://www.oryzon.com/media/orymold.html

  1. Enabling to Apply XP Process in Distributed Development Environments with Tool Support

    Directory of Open Access Journals (Sweden)

    Ali Akbar Ansari

    2012-07-01

    Evaluations of the XP methodology in both academic and industrial settings have shown very good results when it is applied to small or medium co-located working groups. In this paper, we describe an approach that overcomes the XP constraint of collocation by introducing a process-support environment (called M.P.D.X.P) that helps software development teams and solves the problems that arise when XP is carried out by distributed teams.

  2. Teaching Strategies to Apply in the Use of Technological Tools in Technical Education

    Directory of Open Access Journals (Sweden)

    Olga Arranz García

    2014-09-01

    The emergence of new technologies in the education area is changing the way educational processes are organized. Teachers are not unrelated to these changes and must employ new strategies to adapt their teaching methods to the new circumstances. One of these adaptations is framed in virtual learning, where learning management systems have been revealed as a very effective means within the learning process. In this paper we aim to show teachers in engineering schools how to use appropriately the different technological tools present in a virtual platform. Thus, in the experimental framework we show the results of analysing two data samples obtained before and after the implementation of the European Higher Education Area, which can be extrapolated for innovative application to learning techniques.

  3. Prediction of permafrost distribution on the Qinghai-Tibet Plateau in the next 50 and 100 years

    Institute of Scientific and Technical Information of China (English)

    NAN Zhuotong; LI Shuxun; CHENG Guodong

    2005-01-01

    The Intergovernmental Panel on Climate Change (IPCC) reported in 2001 that the Earth's air temperature would rise by 1.4-5.8℃, and by 2.5℃ on average, by the year 2100. China regional climate model results also showed that the air temperature on the Qinghai-Tibet Plateau (QTP) would increase by 2.2-2.6℃ in the next 50 years. A numerical permafrost model was used to predict the changes of permafrost distribution on the QTP over the next 50 and 100 years under two climatic warming scenarios, i.e. 0.02℃/a, the lower value of IPCC's estimation, and 0.052℃/a, the higher value predicted by Qin et al. Simulation results show that (i) in the case of a 0.02℃/a air-temperature rise, the permafrost area on the QTP will shrink about 8.8% in the next 50 years, and high-temperature permafrost with mean annual ground temperature (MAGT) higher than -0.11℃ may turn into seasonally frozen soils. In the next 100 years, permafrost with MAGT higher than -0.5℃ will disappear and the permafrost area will shrink by up to 13.4%. (ii) In the case of a 0.052℃/a air-temperature rise, the permafrost area on the QTP will be reduced by about 13.5% after 50 years. More remarkable degradation will take place after 100 years, when the permafrost area will be reduced by about 46%. Permafrost with MAGT higher than -2℃ will turn into seasonally frozen soils or even unfrozen soils.
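    The two warming rates and horizons quoted in the abstract imply the following cumulative temperature rises; this is a quick arithmetic restatement of the scenarios, not new data.

```python
# Cumulative warming implied by the two scenarios (0.02 and 0.052 degC
# per year) over the 50- and 100-year prediction horizons.
rates = (0.02, 0.052)
horizons = (50, 100)
warming = {(r, h): round(r * h, 2) for r in rates for h in horizons}
for (r, h), dt in warming.items():
    print(f"{r} degC/a over {h} years -> +{dt} degC")
```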

  4. Quantitative seismic interpretation: Applying rock physics tools to reduce interpretation risk

    Institute of Scientific and Technical Information of China (English)

    Yong Chen

    2007-01-01

    Seismic data analysis is one of the key technologies for characterizing reservoirs and monitoring subsurface pore fluids. While there have been great advances in 3D seismic data processing, the quantitative interpretation of the seismic data for rock properties still poses many challenges. This book demonstrates how rock physics can be applied to predict reservoir parameters, such as lithologies and pore fluids, from seismically derived attributes, as well as how the multidisciplinary combination of rock physics models with seismic data, sedimentological information, and stochastic techniques can lead to more powerful results than can be obtained from a single technique.

  5. Applying a Knowledge Management Modeling Tool for Manufacturing Vision (MV) Development

    DEFF Research Database (Denmark)

    Wang, Chengbo; Luxhøj, James T.; Johansen, John

    2004-01-01

    This paper introduces an empirical application of an experimental model for knowledge management within an organization, namely a case-based reasoning model for manufacturing vision development (CBRM). The model integrates the development process of manufacturing vision with the methodology of case-based reasoning. The study finds that the CBRM is supportive to the decision-making process of applying and augmenting organizational knowledge, provides a new angle for tackling strategic management issues within the manufacturing system of a business operation, and explores a new proposition within strategic manufacturing management by enriching…

  6. The potential of social entrepreneurship: conceptual tools for applying citizenship theory to policy and practice.

    Science.gov (United States)

    Caldwell, Kate; Harris, Sarah Parker; Renko, Maija

    2012-12-01

    Contemporary policy encourages self-employment and entrepreneurship as a vehicle for empowerment and self-sufficiency among people with disabilities. However, such encouragement raises important citizenship questions concerning the participation of people with intellectual and developmental disabilities (IDD). As an innovative strategy for addressing pressing social and economic problems, "social entrepreneurship" has become a phrase that is gaining momentum in the IDD community--one that carries with it a very distinct history. Although social entrepreneurship holds the potential to be an empowering source of job creation and social innovation, it also has the potential to be used to further disenfranchise this marginalized population. It is crucial that in moving forward society takes care not to perpetuate existing models of oppression, particularly in regard to the social and economic participation of people with IDD. The conceptual tools addressed in this article can inform the way that researchers, policymakers, and practitioners approach complex issues, such as social entrepreneurship, to improve communication among disciplines while retaining an integral focus on rights and social justice by framing this issue within citizenship theory.

  7. Neutron tomography of particulate filters: a non-destructive investigation tool for applied and industrial research

    Energy Technology Data Exchange (ETDEWEB)

    Toops, Todd J., E-mail: toopstj@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN (United States)]; Bilheux, Hassina Z.; Voisin, Sophie [Oak Ridge National Laboratory, Oak Ridge, TN (United States)]; Gregor, Jens [University of Tennessee, Knoxville, TN (United States)]; Walker, Lakeisha; Strzelec, Andrea; Finney, Charles E.A.; Pihl, Josh A. [Oak Ridge National Laboratory, Oak Ridge, TN (United States)]

    2013-11-21

    This research describes the development and implementation of high-fidelity neutron imaging and the associated analysis of the images. This advanced capability allows the non-destructive, non-invasive imaging of particulate filters (PFs) and of how the deposition of particulate and catalytic washcoat occurs within the filter. The majority of the efforts described here were performed at the High Flux Isotope Reactor (HFIR) CG-1D neutron imaging beamline at Oak Ridge National Laboratory; the current spatial resolution is approximately 50 μm. The sample holder is equipped with a high-precision rotation stage that allows 3D imaging (i.e., computed tomography) of the sample when combined with computerized reconstruction tools. What enables the neutron-based image is the ability of some elements to absorb or scatter neutrons where other elements allow the neutron to pass through them with negligible interaction. Of particular interest in this study is the scattering of neutrons by hydrogen-containing molecules, such as hydrocarbons (HCs) and/or water, which are adsorbed to the surface of soot, ash and catalytic washcoat. Even so, the interactions with this adsorbed water/HC are weak, and computational techniques were required to enhance the contrast, primarily a modified simultaneous iterative reconstruction technique (SIRT). This effort describes the following systems: particulate randomly distributed in a PF, ash deposition in PFs, a catalyzed washcoat layer in a PF, and three particulate loadings in a SiC PF.
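    SIRT, the reconstruction technique named above, iteratively refines an image estimate from projection data. A minimal sketch of the plain (unmodified) method on a toy linear system follows; the small matrix stands in for a real projection geometry, and ORNL's modified variant is not reproduced here.

```python
import numpy as np

def sirt(A, b, iterations=200):
    """Basic SIRT: x_{k+1} = x_k + C A^T R (b - A x_k),
    where R and C hold the inverse row and column sums of A."""
    R = 1.0 / np.maximum(A.sum(axis=1), 1e-12)   # inverse row sums
    C = 1.0 / np.maximum(A.sum(axis=0), 1e-12)   # inverse column sums
    x = np.zeros(A.shape[1])
    for _ in range(iterations):
        x += C * (A.T @ (R * (b - A @ x)))       # damped back-projection step
    return x

# Tiny consistent system standing in for projection measurements
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
x = sirt(A, A @ x_true)   # recovers x_true on this toy problem
```

    Real tomographic systems are huge and sparse, so production codes use matrix-free projectors, but the update rule is the same.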

  8. Applied Circular Dichroism: A Facile Spectroscopic Tool for Configurational Assignment and Determination of Enantiopurity

    Directory of Open Access Journals (Sweden)

    Macduff O. Okuom

    2015-01-01

    In order to determine if electronic circular dichroism (ECD) is a good tool for the qualitative evaluation of absolute configuration and enantiopurity in the absence of chiral high performance liquid chromatography (HPLC), ECD studies were performed on several prescription and over-the-counter drugs. Cotton effects (CE) were observed for both S and R isomers between 200 and 300 nm. For the drugs examined in this study, the S isomers showed a negative CE, while the R isomers displayed a positive CE. The ECD spectra of both enantiomers were nearly mirror images, with the amplitude proportional to the enantiopurity. Plotting the differential extinction coefficient (Δε) versus enantiopurity at the wavelength of maximum amplitude yielded linear standard curves with coefficients of determination (R2) greater than 97% for both isomers in all cases. As expected, Equate, Advil, and Motrin, each containing a racemic mixture of ibuprofen, yielded no chiroptical signal. ECD spectra of Suphedrine and Sudafed revealed that each of them is rich in 1S,2S-pseudoephedrine, while the Equate vapor inhaler is rich in R-methamphetamine.
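    The standard-curve procedure described, fitting Δε against enantiopurity and checking the coefficient of determination, can be sketched as follows. The data points below are hypothetical illustrations, not values from the paper.

```python
import numpy as np

# Hypothetical standard-curve data: differential extinction coefficient
# (delta-epsilon) at the wavelength of maximum amplitude for samples of
# known enantiopurity (illustrative numbers; S isomer -> negative CE).
enantiopurity = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # % enantiomeric excess
delta_eps = np.array([0.02, -0.55, -1.08, -1.62, -2.15])

slope, intercept = np.polyfit(enantiopurity, delta_eps, 1)  # linear standard curve
pred = slope * enantiopurity + intercept
ss_res = np.sum((delta_eps - pred) ** 2)
ss_tot = np.sum((delta_eps - delta_eps.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot   # the paper reports R2 > 0.97 in all cases
```

    An unknown sample's enantiopurity is then read off the fitted line from its measured Δε.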

  9. Applying CBR to machine tool product configuration design oriented to customer requirements

    Science.gov (United States)

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2016-03-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented. The customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help the enterprises select the key technical characteristics under the constraints of cost and time to serve the customer with maximal satisfaction. A new case retrieval approach that combines the self-organizing map and fuzzy similarity priority ratio method is proposed in case-based design. The self-organizing map can reduce the retrieval range and increase the retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method of similar cases based on grey correlation analysis is proposed to evaluate similar cases to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. An actual example and results on an ETC series machine tool product show that the proposed method is effective, rapid and accurate in the process of product configuration. The proposed methodology provides detailed instructions for the product configuration design oriented to customer requirements.
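    The grey correlation (relational) analysis used to rank retrieved cases can be sketched as below: each candidate case is scored against a reference requirement vector, and the highest grade wins. The reference vector, candidate cases, and the ρ = 0.5 distinguishing coefficient are illustrative assumptions, not values from the paper.

```python
import numpy as np

def grey_relational_grades(reference, candidates, rho=0.5):
    """Grey correlation analysis: grade each candidate case against a
    reference vector; higher grade means more similar. Features are
    assumed pre-normalised to [0, 1]."""
    diffs = np.abs(candidates - reference)        # pointwise deviations
    d_min, d_max = diffs.min(), diffs.max()
    coeff = (d_min + rho * d_max) / (diffs + rho * d_max)
    return coeff.mean(axis=1)                     # relational grade per case

reference = np.array([0.8, 0.6, 0.9])             # target technical characteristics
cases = np.array([[0.7, 0.5, 0.85],               # close to the requirements
                  [0.2, 0.9, 0.30]])              # far from the requirements
grades = grey_relational_grades(reference, cases)
```

    In the paper's workflow this scoring step would follow the self-organizing-map retrieval that narrows the candidate set.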

  11. Can the FAST and ROSIER adult stroke recognition tools be applied to confirmed childhood arterial ischemic stroke?

    Directory of Open Access Journals (Sweden)

    Babl Franz E

    2011-10-01

    Abstract. Background: Stroke recognition tools have been shown to improve diagnostic accuracy in adults. Development of a similar tool in children is needed to reduce the lag time to diagnosis. A critical first step is to determine whether adult stroke scales can be applied in childhood stroke. Our objective was to assess the applicability of adult stroke scales in childhood arterial ischemic stroke (AIS). Methods: Children aged 1 month to … Results: 47 children with AIS were identified; 34 had anterior, 12 had posterior, and 1 child had both anterior and posterior circulation infarcts. Median age was 9 years and 51% were male. Median time from symptom onset to ED presentation was 21 hours, but one third of children presented within 6 hours. The most common presenting stroke symptoms were arm weakness (63%), face weakness (62%), leg weakness (57%), speech disturbance (46%) and headache (46%). The most common signs were arm weakness (61%), face weakness (70%), leg weakness (57%) and dysarthria (34%). 36 (78%) of the children had at least one positive variable on FAST and 38 (81%) had a positive score of ≥1 on the ROSIER scale. Positive scores were less likely in children with posterior circulation stroke. Conclusion: The presenting features of pediatric stroke appear similar to those of adult stroke. Two adult stroke recognition tools have fair to good sensitivity in radiologically confirmed childhood AIS but require further development and modification. The specificity of the tools also needs to be determined in a prospective cohort of children with stroke and non-stroke brain attacks.

  12. A practical guide to applying lean tools and management principles to health care improvement projects.

    Science.gov (United States)

    Simon, Ross W; Canacari, Elena G

    2012-01-01

    Manufacturing organizations have used Lean management principles for years to help eliminate waste, streamline processes, and cut costs. This pragmatic approach to structured problem solving can be applied to health care process improvement projects. Health care leaders can use a step-by-step approach to document processes and then identify problems and opportunities for improvement using a value stream process map. Leaders can help a team identify problems and root causes and consider additional problems associated with methods, materials, manpower, machinery, and the environment by using a cause-and-effect diagram. The team then can organize the problems identified into logical groups and prioritize the groups by impact and difficulty. Leaders must manage action items carefully to instill a sense of accountability in those tasked to complete the work. Finally, the team leaders must ensure that a plan is in place to hold the gains.
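    The grouping-and-prioritization step described above can be sketched as a simple sort on team-assigned scores; the group names and the 1–5 impact/difficulty scores are invented for illustration:

```python
# each problem group is scored 1-5 for impact and difficulty by the team
groups = [
    {"name": "handoff delays", "impact": 5, "difficulty": 2},
    {"name": "supply stockouts", "impact": 4, "difficulty": 4},
    {"name": "form redesign", "impact": 2, "difficulty": 1},
]

# work highest-impact, lowest-difficulty groups first
ranked = sorted(groups, key=lambda g: (-g["impact"], g["difficulty"]))
action_items = [g["name"] for g in ranked]
```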

  13. Ecoinformatics for integrated pest management: expanding the applied insect ecologist's tool-kit.

    Science.gov (United States)

    Rosenheim, Jay A; Parsa, Soroush; Forbes, Andrew A; Krimmel, William A; Law, Yao Hua; Segoli, Michal; Segoli, Moran; Sivakoff, Frances S; Zaviezo, Tania; Gross, Kevin

    2011-04-01

    Experimentation has been the cornerstone of much of integrated pest management (IPM) research. Here, we aim to open a discussion on the possible merits of expanding the use of observational studies, and in particular the use of data from farmers or private pest management consultants in "ecoinformatics" studies, as tools that might complement traditional, experimental research. The manifold advantages of experimentation are widely appreciated: experiments provide definitive inferences regarding causal relationships between key variables, can produce uniform and high-quality data sets, and are highly flexible in the treatments that can be evaluated. Perhaps less widely considered, however, are the possible disadvantages of experimental research. Using the yield-impact study to focus the discussion, we address some reasons why observational or ecoinformatics approaches might be attractive as complements to experimentation. A survey of the literature suggests that many contemporary yield-impact studies lack sufficient statistical power to resolve the small, but economically important, effects on crop yield that shape pest management decision-making by farmers. Ecoinformatics-based data sets can be substantially larger than experimental data sets and therefore hold out the promise of enhanced power. Ecoinformatics approaches also address problems at the spatial and temporal scales at which farming is conducted, can achieve higher levels of "external validity," and can allow researchers to efficiently screen many variables during the initial, exploratory phases of research projects. Experimental, observational, and ecoinformatics-based approaches may, if used together, provide more efficient solutions to problems in pest management than can any single approach, used in isolation.
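    The statistical-power argument above can be made concrete with a normal-approximation power calculation; the standardized effect size (d ≈ 0.2, i.e. a small but economically important yield effect) and the sample sizes are illustrative assumptions:

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_two_sample(effect_size, n_per_group, alpha_z=1.959964):
    """Approximate power of a two-sided two-sample z-test for a
    standardized mean difference (Cohen's d), equal group sizes."""
    z = effect_size / math.sqrt(2.0 / n_per_group)
    return phi(z - alpha_z) + phi(-z - alpha_z)

small_trial = power_two_sample(0.2, n_per_group=20)    # typical replicated experiment
big_dataset = power_two_sample(0.2, n_per_group=500)   # ecoinformatics-scale records
```

    With 20 replicates per treatment the chance of detecting the effect is well under 20%, while hundreds of grower records push power above 80%, which is the point the abstract makes about under-powered yield-impact studies.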

  14. SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling

    Science.gov (United States)

    Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)

    1994-01-01

    The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches, which too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user with a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically-consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapo-transpiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.
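    The physical-consistency check described above can be sketched with a minimal unit-tracking class; this is an illustration of the idea (tracking base-unit exponents and rejecting dimensionally inconsistent sums), not SIGMA's implementation:

```python
# units as exponent tuples over (m, kg, s); only dimensionally
# consistent additions are accepted, mirroring the consistency check
class Quantity:
    def __init__(self, value, units):
        self.value, self.units = value, units

    def __add__(self, other):
        if self.units != other.units:
            raise ValueError("dimensionally inconsistent equation")
        return Quantity(self.value + other.value, self.units)

    def __mul__(self, other):
        units = tuple(a + b for a, b in zip(self.units, other.units))
        return Quantity(self.value * other.value, units)

METER = (1, 0, 0)
SECOND = (0, 0, 1)

length = Quantity(3.0, METER)
time = Quantity(2.0, SECOND)
total = length + Quantity(1.0, METER)   # accepted: metres + metres
area = length * length                  # multiplication combines exponents
try:
    bad = length + time                 # rejected: metres + seconds
except ValueError:
    bad = None
```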

  15. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation is performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
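    The node/institution hierarchy and per-time-step execution described above can be sketched as follows; the class and method names are illustrative of the pattern only, not Pynsim's actual API:

```python
# minimal sketch of the agent/institution pattern: nodes hold state,
# an institution groups nodes and can act on all of them at once
class Node:
    def __init__(self, name):
        self.name, self.storage = name, 0.0

    def step(self, inflow):
        self.storage += inflow      # autonomous per-step behaviour

class Institution:
    """A grouping of nodes that an engine can target as one unit."""
    def __init__(self, members):
        self.members = members

    def step(self, allocation):
        # an engine-style decision applied over the whole group
        for node in self.members:
            node.step(allocation / len(self.members))

farms = [Node("farm_a"), Node("farm_b")]
district = Institution(farms)
for _ in range(3):          # three simulated time steps
    district.step(10.0)     # 10 units shared equally each step
```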

  16. Effects of 100 years wastewater irrigation on resistance genes, class 1 integrons and IncP-1 plasmids in Mexican soil

    Directory of Open Access Journals (Sweden)

    Sven eJechalke

    2015-03-01

    Full Text Available Long-term irrigation with untreated wastewater can lead to an accumulation of antibiotic substances and antibiotic resistance genes in soil. However, little is known so far about the effects of wastewater, applied for decades, on the abundance of IncP-1 plasmids and class 1 integrons, which may contribute to the accumulation and spread of resistance genes in the environment, and their correlation with heavy metal concentrations. Therefore, a chronosequence of soils that were irrigated with wastewater from zero to 100 years was sampled in the Mezquital Valley in Mexico in the dry season. The total community DNA was extracted and the absolute and relative abundance (relative to 16S rRNA genes) of antibiotic resistance genes (tet(W), tet(Q), aadA), class 1 integrons (intI1), quaternary ammonium compound resistance genes (qacE+qacEΔ1) and IncP-1 plasmids (korB) were quantified by real-time PCR. Except for intI1 and qacE+qacEΔ1, the abundances of the selected genes were below the detection limit in non-irrigated soil. Confirming the results of a previous study, the absolute abundance of 16S rRNA genes in the samples increased significantly over time (linear regression model, p < 0.05), suggesting an increase in bacterial biomass due to repeated irrigation with wastewater. Correspondingly, all tested antibiotic resistance genes as well as intI1 and korB significantly increased in abundance over the period of 100 years of irrigation. In parallel, concentrations of the heavy metals Zn, Cu, Pb, Ni, and Cr significantly increased. However, no significant positive correlations were observed between the relative abundance of the selected genes and years of irrigation, indicating no enrichment in the soil bacterial community due to repeated wastewater irrigation or due to a potential co-selection by increasing concentrations of heavy metals.
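    The quantification logic above — relative abundance as gene copies divided by 16S rRNA copies, with a regression over the chronosequence — can be sketched as follows; the copy numbers are invented for illustration, not the study's data:

```python
import numpy as np

# invented qPCR copy numbers per gram of soil along the chronosequence
years = np.array([0.0, 25, 50, 85, 100])
gene = np.array([0.0, 2e5, 5e5, 1.2e6, 2.1e6])   # e.g. a tet(W)-like gene
rrna = np.array([1e8, 3e8, 6e8, 1.5e9, 2.5e9])   # 16S rRNA genes

# absolute 16S abundance grows with years of irrigation ...
slope_16s, _ = np.polyfit(years, np.log10(rrna), 1)

# ... while relative abundance (gene / 16S) stays roughly flat in the
# irrigated soils, i.e. no enrichment within the community
rel = gene[1:] / rrna[1:]
roughly_flat = rel.max() / rel.min() < 2
```

    This is the distinction the abstract draws: rising absolute gene counts can simply track rising bacterial biomass, so only the relative abundance reveals enrichment.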

  17. Snooker: a structure-based pharmacophore generation tool applied to class A GPCRs.

    Science.gov (United States)

    Sanders, Marijn P A; Verhoeven, Stefan; de Graaf, Chris; Roumen, Luc; Vroling, Bas; Nabuurs, Sander B; de Vlieg, Jacob; Klomp, Jan P G

    2011-09-26

    G-protein coupled receptors (GPCRs) are important drug targets for various diseases and of major interest to pharmaceutical companies. The function of individual members of this protein family can be modulated by the binding of small molecules at the extracellular side of the structurally conserved transmembrane (TM) domain. Here, we present Snooker, a structure-based approach to generate pharmacophore hypotheses for compounds binding to this extracellular side of the TM domain. Snooker does not require knowledge of ligands, is therefore suitable for apo-proteins, and can be applied to all receptors of the GPCR protein family. The method comprises the construction of a homology model of the TM domains and prioritization of residues by their probability of being involved in ligand binding. Subsequently, protein properties are converted to ligand space, and pharmacophore features are generated at positions where protein-ligand interactions are likely. Using this semiautomated knowledge-driven bioinformatics approach we have created pharmacophore hypotheses for 15 different GPCRs from several different subfamilies. For the beta-2-adrenergic receptor we show that ligand poses predicted by Snooker pharmacophore hypotheses reproduce literature-supported binding modes for ∼75% of compounds fulfilling pharmacophore constraints. All 15 pharmacophore hypotheses represent interactions with essential residues for ligand binding as observed in mutagenesis experiments, and compound selections based on these hypotheses are shown to be target specific. For 8 out of 15 targets enrichment factors above 10-fold are observed in the top 0.5% ranked compounds in a virtual screen. Additionally, prospectively predicted ligand binding poses in the human dopamine D3 receptor based on Snooker pharmacophores were ranked among the best models in the community-wide GPCR Dock 2010 assessment.
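    The enrichment factor reported for the virtual screen is conventionally computed as the hit rate in the top-ranked slice divided by the overall hit rate; a sketch with invented counts (10,000 ranked compounds, 100 actives, 30 of them in the top 0.5%):

```python
def enrichment_factor(ranked_labels, top_fraction=0.005):
    """EF = fraction of actives in the top-ranked slice divided by
    the fraction expected at random; labels are 1 (active) / 0 (decoy)
    in rank order."""
    n_top = max(1, int(len(ranked_labels) * top_fraction))
    hits_top = sum(ranked_labels[:n_top])
    hits_all = sum(ranked_labels)
    return (hits_top / n_top) / (hits_all / len(ranked_labels))

# invented screen: 30 actives in the top 50, rest spread lower
labels = [1] * 30 + [0] * 20 + [1] * 70 + [0] * 9880
ef = enrichment_factor(labels)   # 60-fold enrichment at 0.5%
```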

  18. Simulated carbon and water processes of forest ecosystems in Forsmark and Oskarshamn during a 100-year period

    Energy Technology Data Exchange (ETDEWEB)

    Gustafsson, David; Jansson, Per-Erik [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Land and Water Resources Engineering; Gaerdenaes, Annemieke [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Soil Sciences; Eckersten, Henrik [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Crop Production Ecology

    2006-12-15

    The Swedish Nuclear Fuel and Waste Management Co (SKB) is currently investigating the Forsmark and Oskarshamn areas for possible localisation of a repository for spent nuclear fuel. Important components of the investigations are characterizations of the land surface ecosystems in the areas with respect to hydrological and biological processes, and their implications for the fate of radionuclide contaminants entering the biosphere from a shallow groundwater contamination. In this study, we simulate water balance and carbon turnover processes in forest ecosystems representative for the Forsmark and Oskarshamn areas for a 100-year period using the ecosystem process model CoupModel. The CoupModel describes the fluxes of water and matter in a one-dimensional soil-vegetation-atmosphere system, forced by time series of meteorological variables. The model has previously been parameterized for many of the vegetation systems that can be found in the Forsmark and Oskarshamn areas: spruce/pine forests, willow, grassland and different agricultural crops. This report presents a platform for further use of models like CoupModel for investigations of radionuclide turnover in the Forsmark and Oskarshamn area based on SKB data, including a data set of meteorological forcing variables for Forsmark 1970-2004, suitable for simulations of a 100-year period representing the present day climate, a hydrological parameterization of the CoupModel for simulations of the forest ecosystems in the Forsmark and Oskarshamn areas, and simulated carbon budgets and process descriptions for Forsmark that correspond to a possible steady state of the soil storage of the forest ecosystem.
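    A one-dimensional water balance of the kind CoupModel resolves can be caricatured as a single soil "bucket" forced by daily precipitation and evapotranspiration; the storage capacity and forcing values below are toy numbers, not the report's parameterization:

```python
# toy daily bucket water balance (all values in mm)
def run_water_balance(precip, evapotranspiration, capacity=150.0):
    storage, runoff = capacity / 2.0, 0.0
    for p, et in zip(precip, evapotranspiration):
        storage += p - min(et, storage)   # ET limited by available water
        if storage > capacity:            # excess drains as runoff
            runoff += storage - capacity
            storage = capacity
    return storage, runoff

# 30 wet days: 10 mm rain, 2 mm evapotranspiration per day
storage, runoff = run_water_balance(
    precip=[10.0] * 30, evapotranspiration=[2.0] * 30)
```

    Mass balance closes by construction: initial storage plus net input equals final storage plus runoff, which is the kind of budget the report presents for carbon and water.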

  19. A centennial to celebrate : energy and minerals science and technology 100 years of excellence : improving the quality of life of Canadians through natural resources

    Energy Technology Data Exchange (ETDEWEB)

    Udd, J.; Reeve, D.

    2007-07-01

    The year 2007 marked the 100th anniversary of Natural Resources Canada's (NRCan) contribution to science and technology excellence in energy and minerals. This publication discussed the 100 years of excellence of the energy and minerals science and technology sector. It discussed the history of Natural Resources Canada, with reference to the early years; first fuel testing efforts; first World War; the 1920s and 1930s; second World War; post-war years; the 1970s and 1980s; and the 1990s to the present. The publication discussed the creation of the Canada Centre for Mineral and Energy Technology (CANMET) as well as some current NRCan science and technology activities, such as alternative energy programs; energy efficiency for buildings, industries and communities; clean coal; oil sands tailings and water management; community energy systems; renewable energy efficient technology projects (RET) such as RETscreen; hybrid scoop; the anti-vibration rock drill handle; mine waste management; and green mines-green energy. Other NRCan science and technology programs that were presented in the publication included materials technology laboratory relocation; corrosion management tools for the oil and gas pipeline industry; lightweight magnesium engine cradle; mine environment neutral drainage program; metallurgical processing; counter-terrorism; and clean energy. figs.

  20. CoSMoS Southern California v3.0 Phase 1 (100-year storm) storm hazard projections

    Science.gov (United States)

    Barnard, Patrick; Erikson, Li; Foxgrover, Amy; O'Neill, Andrea; Herdman, Liv

    2016-01-01

    The Coastal Storm Modeling System (CoSMoS) makes detailed predictions (meter-scale) over large geographic scales (100s of kilometers) of storm-induced coastal flooding and erosion for both current and future sea-level rise (SLR) scenarios. CoSMoS v3.0 for Southern California shows projections for future climate scenarios (sea-level rise and storms) to provide emergency responders and coastal planners with critical storm-hazards information that can be used to increase public safety, mitigate physical damages, and more effectively manage and allocate resources within complex coastal settings. Phase I data for Southern California include flood-hazard information for the coast from the Mexican Border to Pt. Conception for a 100-year storm scenario and sea-level rise 0 - 2 m. Changes from previous data releases may be reflected in some areas. Data are complete for the information presented but are considered preliminary; changes may be reflected in the full data release (Phase II) in summer 2016.

  1. Participatory tools working with crops, varieties and seeds. A guide for professionals applying participatory approaches in agrobiodiversity management, crop improvement and seed sector development

    NARCIS (Netherlands)

    Boef, de W.S.; Thijssen, M.H.

    2007-01-01

    Outline to the guide: Within our training programmes on local management of agrobiodiversity, participatory crop improvement and the support of local seed supply, participatory tools get ample attention. Tools are dealt with theoretically, are practised in class situations, but are also applied in fie

  2. 100-Year Floodplains, Flood plains from FEMA, Published in 2003, 1:600 (1in=50ft) scale, Town of Cary NC.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:600 (1in=50ft) scale, was produced all or in part from LIDAR information as of 2003. It is described as 'Flood...

  3. 100-Year Floodplains, FEMA DFIRM preliminary map out now, to be published in 2009, Published in 2009, 1:12000 (1in=1000ft) scale, Brown County, WI.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:12000 (1in=1000ft) scale, was produced all or in part from Other information as of 2009. It is described as 'FEMA...

  4. 100-Year Floodplains, Data provided by FEMA and WI DNR, Published in 2009, 1:2400 (1in=200ft) scale, Dane County Land Information Office.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:2400 (1in=200ft) scale as of 2009. It is described as 'Data provided by FEMA and WI DNR'. Data by this publisher...

  5. 100-Year Floodplains, NC Floodplain Mapping Program data, Published in 2007, 1:12000 (1in=1000ft) scale, Iredell County GIS.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:12000 (1in=1000ft) scale, was produced all or in part from LIDAR information as of 2007. It is described as 'NC...

  6. 100-Year Floodplains, FEMA Floodway and Flood Boundary Maps, Published in 2005, 1:24000 (1in=2000ft) scale, Lafayette County Land Records.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Other information as of 2005. It is described as 'FEMA...

  7. 100-Year Floodplains, FEMA Flood Zones, Published in 2010, 1:2400 (1in=200ft) scale, Effingham County Board Of Commissioners.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This 100-Year Floodplains dataset, published at 1:2400 (1in=200ft) scale, was produced all or in part from Published Reports/Deeds information as of 2010. It is...

  8. The protection of Canfranc International Railway Station against natural risks. Analysis and evaluation of its effectiveness 100 years later.

    Science.gov (United States)

    Fabregas, S.; Hurtado, R.; Mintegui, J.

    2012-04-01

    In the late XIXth and early XXth century, the international railway station of Canfranc, "Los Arañones", was built in the Central Pyrenees of Huesca in Spain, on the border between France and Spain. Just after construction of the huge station (250 m long) began, it was found that natural hazards such as flash floods, landslides, falling blocks and avalanches affected it and compromised the safety of users and infrastructure. Hydrological restoration works were quickly carried out in the basins of the "Los Arañones" gorges to reduce the joint residual risks. Longitudinal and transversal dams were built against floods, and a large reforestation effort protected against falling blocks, erosion and flooding. Against avalanches, stone walls were built, as well as benches of grit, snow rakes, and "empty dams", which were created as experimental structures to dissipate the energy of the avalanche in the track zone and which do not exist anywhere else in the world. All the works were carried out mainly by hand, with materials such as stone, cement and iron. Over 2,500,000 holes were dug for planting more than 15 different species of trees, and more than 400,000 tons of stone were moved to build more than 12 different kinds of control measures. It is essential to emphasize the empirical nature of these works and Canfranc's function as a "laboratory or field test site", with most of its structures still effective 100 years after their construction. The works represented about 30% of the total cost of the station in the early XXth century. To obtain "equivalent protection" with current technology, around 100 million euro would have to be invested today. It is also necessary to validate the current effectiveness of these works, their maintenance, and the protective role of the forest.

  9. The story of the Hawaiian Volcano Observatory -- A remarkable first 100 years of tracking eruptions and earthquakes

    Science.gov (United States)

    Babb, Janet L.; Kauahikaua, James P.; Tilling, Robert I.

    2011-01-01

    The year 2012 marks the centennial of the Hawaiian Volcano Observatory (HVO). With the support and cooperation of visionaries, financiers, scientists, and other individuals and organizations, HVO has successfully achieved 100 years of continuous monitoring of Hawaiian volcanoes. As we celebrate this milestone anniversary, we express our sincere mahalo (thanks) to the people who have contributed to and participated in HVO's mission during this past century. First and foremost, we owe a debt of gratitude to the late Thomas A. Jaggar, Jr., the geologist whose vision and efforts led to the founding of HVO. We also acknowledge the pioneering contributions of the late Frank A. Perret, who began the continuous monitoring of Kīlauea in 1911, setting the stage for Jaggar, who took over the work in 1912. Initial support for HVO was provided by the Massachusetts Institute of Technology (MIT) and the Carnegie Geophysical Laboratory, which financed the initial cache of volcano monitoring instruments and Perret's work in 1911. The Hawaiian Volcano Research Association, a group of Honolulu businessmen organized by Lorrin A. Thurston, also provided essential funding for HVO's daily operations starting in mid-1912 and continuing for several decades. Since HVO's beginning, the University of Hawaiʻi (UH), called the College of Hawaii until 1920, has been an advocate of HVO's scientific studies. We have benefited from collaborations with UH scientists at both the Hilo and Mānoa campuses and look forward to future cooperative efforts to better understand how Hawaiian volcanoes work. The U.S. Geological Survey (USGS) has operated HVO continuously since 1947. Before then, HVO was under the administration of various Federal agencies: the U.S. Weather Bureau, at the time part of the Department of Agriculture, from 1919 to 1924; the USGS, which first managed HVO from 1924 to 1935; and the National Park Service from 1935 to 1947. For 76 of its first 100 years, HVO has been

  10. Applying decision trial and evaluation laboratory as a decision tool for effective safety management system in aviation transport

    Directory of Open Access Journals (Sweden)

    Ifeanyichukwu Ebubechukwu Onyegiri

    2016-10-01

    Full Text Available In recent years in the aviation industry, weak engineering controls and lapses associated with safety management systems (SMSs) have been responsible for seemingly unprecedented disasters. A previous study confirmed the difficulties experienced by safety managers with SMSs and the need to direct research to this area of investigation for more insights and progress in the evaluation and maintenance of SMSs in the aviation industry. The purpose of this work is to examine the application of the Decision Trial and Evaluation Laboratory (DEMATEL) method to the aviation industry in developing countries, illustrated using Nigerian aviation survey data to validate the method. The advantage of the procedure over other decision-making methods is its ability to apply feedback in decision making. It also affords the opportunity of breaking down the complex, multivariate aviation SMS components and elements by analysing the contributions of the diverse system criteria from the perspective of cause and effect, which in turn yields easier and yet more effective pre-corrective actions against aviation transportation accidents. In this work, six revised components of an SMS were identified and DEMATEL was applied to obtain their direct and indirect impacts and influences on overall SMS performance. Data collection was by survey questionnaire, which served as the initial direct-relation matrix, coded in Matlab software for establishing the impact relation map (IRM). The IRM was then plotted in MS Excel spreadsheet software. From our results, safety structure and regulation has the highest impact level on an SMS, with a corresponding positive relation level value. In conclusion, the results agree with those of previous researchers who used grey relational analysis. Thus, DEMATEL serves as a great tool and resource for the safety manager.
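    The core DEMATEL computation described above — normalizing the survey-derived direct-relation matrix and deriving the total relation matrix T = N(I - N)^-1, from which prominence (D+R) and cause/effect (D-R) follow — can be sketched as follows; the 3x3 matrix is invented, not the six-component Nigerian survey data:

```python
import numpy as np

def dematel(direct):
    """Normalize the direct-relation matrix, compute the total
    relation matrix T = N @ inv(I - N), and derive each component's
    prominence (D + R) and net cause/effect role (D - R)."""
    direct = np.asarray(direct, dtype=float)
    s = max(direct.sum(axis=1).max(), direct.sum(axis=0).max())
    n = direct / s
    t = n @ np.linalg.inv(np.eye(direct.shape[0]) - n)
    d, r = t.sum(axis=1), t.sum(axis=0)   # row and column sums of T
    return t, d + r, d - r

# invented 3-component influence matrix (0-4 survey-scale averages)
direct = [[0, 3, 2],
          [1, 0, 3],
          [2, 1, 0]]
t, prominence, relation = dematel(direct)
net_cause = int(np.argmax(relation))   # strongest "cause" component
```

    Components with positive D - R are net causes (like "safety structure and regulation" in the study) and those with negative D - R are net effects; plotting prominence against relation gives the impact relation map.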

  11. GEodesy Tools for Societal Issues (GETSI): Undergraduate curricular modules that feature geodetic data applied to critical social topics

    Science.gov (United States)

    Douglas, B. J.; Pratt-Sitaula, B.; Walker, B.; Miller, M. S.; Charlevoix, D.

    2014-12-01

    The GETSI project is a three-year NSF-funded project to develop and disseminate teaching and learning materials that feature geodesy data applied to critical societal issues such as climate change, water resource management, and natural hazards (http://serc.carleton.edu/getsi). GETSI was born out of requests from geoscience faculty for more resources with which to educate future citizens and future geoscience professionals on the power and breadth of geodetic methods to address societally relevant topics. Development of the first two modules started at a February 2014 workshop and initial classroom testing begins in fall 2014. The Year 1 introductory module "Changing Ice and Sea Level" includes geodetic data such as gravity, satellite altimetry, and GPS time series. The majors-level Year 1 module is "Imaging Active Tectonics"; it has students analyzing InSAR and LiDAR data to assess infrastructure vulnerability to demonstrably active faults. Additional resources such as animations and interactive data tools are also being developed. The full modules will take about two weeks of class time; module design will permit portions of the material to be used as individual projects or assignments of shorter duration. Ultimately a total of four modules will be created and disseminated, two each at the introductory and majors levels. GETSI is working in tight partnership with the Science Education Resource Center's (SERC) InTeGrate project on module development, assessment, and dissemination to ensure compatibility with the growing number of resources for geoscience education. This will allow for an optimized module development process based on successful practices defined by these earlier efforts.

  12. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    CERN Document Server

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction.Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  13. Portable hyperspectral device as a valuable tool for the detection of protective agents applied on historical buildings

    Science.gov (United States)

    Vettori, S.; Pecchioni, E.; Camaiti, M.; Garfagnoli, F.; Benvenuti, M.; Costagliola, P.; Moretti, S.

    2012-04-01

    In the recent past, a wide range of protective products (in most cases, synthetic polymers) have been applied to the surfaces of ancient buildings/artefacts to preserve them from alteration [1]. The lack of a detailed mapping of the permanence and efficacy of these treatments, in particular when applied on large surfaces such as building facades, may be particularly problematic when new restoration treatments are needed and the best choice of restoration protocols has to be made. The presence of protective compounds on stone surfaces may be detected in the laboratory by relatively simple diagnostic tests, which, however, normally require invasive (or micro-invasive) sampling methodologies and are time-consuming, thus limiting their use to a restricted number of samples and sampling sites. On the contrary, hyperspectral sensors are rapid, non-invasive and non-destructive tools capable of analyzing different materials on the basis of their different patterns of absorption at specific wavelengths, and are thus particularly suitable for the field of cultural heritage [2,3]. In addition, they can be successfully used to discriminate between inorganic (i.e. rocks and minerals) and organic compounds, as well as to acquire, in a short time, many spectra and compositional maps at relatively low cost. In this study we analyzed a number of stone samples (Carrara Marble, and the biogenic calcarenites "Lecce Stone" and "Maastricht Stone") after treatment of their surfaces with synthetic polymers (synthetic wax, acrylic, perfluorinated and silicon-based polymers) in common use in conservation-restoration practice. The hyperspectral device used for this purpose was the ASD FieldSpec FR Pro spectroradiometer, a portable, high-resolution instrument designed to acquire Visible and Near-Infrared (VNIR: 350-1000 nm) and Short-Wave Infrared (SWIR: 1000-2500 nm) punctual reflectance spectra with a rapid data collection time (about 0.1 s for each spectrum).
The reflectance spectra so far obtained in
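    A typical way such reflectance spectra are used to flag a polymer treatment is band-depth analysis against a straight-line continuum: a treated surface shows an extra absorption feature that bare stone lacks. The wavelengths and reflectance values below are purely illustrative, not measurements from the study:

```python
import numpy as np

def band_depth(wavelengths, reflectance, center, shoulders):
    """Depth of an absorption feature at `center` relative to a
    straight-line continuum between the two shoulder wavelengths."""
    left, right = shoulders
    r = np.interp([left, center, right], wavelengths, reflectance)
    continuum = (r[0] + r[2]) / 2.0   # linear continuum at the center
    return 1.0 - r[1] / continuum

# illustrative SWIR spectra (nm, reflectance): treated stone dips
# near 1700 nm where an organic C-H overtone might absorb
wl = np.array([1600.0, 1650, 1700, 1750, 1800])
bare = np.array([0.80, 0.79, 0.78, 0.77, 0.76])
treated = np.array([0.80, 0.74, 0.62, 0.73, 0.76])

d_bare = band_depth(wl, bare, 1700, (1600, 1800))
d_treated = band_depth(wl, treated, 1700, (1600, 1800))
```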

  14. Dr Margaretha Brongersma-Sanders (1905-1996), Dutch scientist: an annotated bibliography of her work to celebrate 100 years since her birth

    NARCIS (Netherlands)

    Turner, S.; Cadée, G.C.

    2006-01-01

    Dr Margaretha Brongersma-Sanders, palaeontologist, pioneer geochemist, geobiologist and oceanographer, Officer of the Order of Oranje-Nassau, was born 100 years ago (February 20th, 1905) in Kampen in The Netherlands. The fields of research that she covered during her lifetime include taxonomy of rece

  15. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis [version 1; referees: 3 approved]

    Directory of Open Access Journals (Sweden)

    Eric Moyer

    2016-04-01

    Network analysis can improve variant analysis. Existing tools such as HotNet2 and dmGWAS provide a variety of analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f).

  16. Applying Total Quality Management Tools Using QFD at Higher Education Institutions in Gulf Area (Case Study: ALHOSN University)

    Directory of Open Access Journals (Sweden)

    Adnan Al-Bashir

    2016-07-01

    The quality of human capital plays a key role in the growth and development of societies, and it can be enriched by the high-quality education provided by higher education institutions. Higher education is thereby an important sector of any society, since it shapes the overall quality of human lives. This research investigates the application of Total Quality Management (TQM) tools at higher education institutions, specifically at ALHOSN University. In this study five tools were implemented at ALHOSN University's college of engineering: Quality Function Deployment, Affinity Diagrams, Tree Diagrams, Pareto Charts, and Fishbone Diagrams. The research reveals that the implementation of TQM tools has great benefit for higher education institutions: they uncovered many areas of potential improvement as well as the main causes of some of the problems the Faculty of Engineering is facing. It also shows that implementing TQM tools in higher education institutions will enhance their performance.

  17. Applying a statewide geospatial leaching tool for assessing soil vulnerability ratings for agrochemicals across the contiguous United States.

    Science.gov (United States)

    Ki, Seo Jin; Ray, Chittaranjan; Hantush, Mohamed M

    2015-06-15

    A large-scale leaching assessment tool not only illustrates soil (or groundwater) vulnerability in unmonitored areas, but also can identify areas of potential concern for agrochemical contamination. This study describes the methodology of how the statewide leaching tool in Hawaii modified recently for use with pesticides and volatile organic compounds can be extended to the national assessment of soil vulnerability ratings. For this study, the tool was updated by extending the soil and recharge maps to cover the lower 48 states in the United States (US). In addition, digital maps of annual pesticide use (at a national scale) as well as detailed soil properties and monthly recharge rates (at high spatial and temporal resolutions) were used to examine variations in the leaching (loads) of pesticides for the upper soil horizons. Results showed that the extended tool successfully delineated areas of high to low vulnerability to selected pesticides. The leaching potential was high for picloram, medium for simazine, and low to negligible for 2,4-D and glyphosate. The mass loadings of picloram moving below 0.5 m depth increased greatly in northwestern and central US that recorded its extensive use in agricultural crops. However, in addition to the amount of pesticide used, annual leaching load of atrazine was also affected by other factors that determined the intrinsic aquifer vulnerability such as soil and recharge properties. Spatial and temporal resolutions of digital maps had a great effect on the leaching potential of pesticides, requiring a trade-off between data availability and accuracy. Potential applications of this tool include the rapid, large-scale vulnerability assessments for emerging contaminants which are hard to quantify directly through vadose zone models due to lack of full environmental data.

  18. MATLAB-Simulink tool development for process identification and tuning tool applied to a HDT pilot plant; Desenvolvimento de ferramenta MATLAB-Simulink para identificacao de processos e sintonia de malhas aplicada a uma planta piloto HDT

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, Viviane; Araujo, Ofelia; Vaz Junior, Carlos Andre [Universidade Federal, Rio de Janeiro, RJ (Brazil). Escola de Quimica]. E-mail: vfonseca@eq.ufrj.br

    2003-07-01

    This research presents a process identification and PID tuning tool applied to a HDT pilot plant located at the School of Chemistry of the Federal University of Rio de Janeiro, Brazil, in collaboration with the petrochemical company COPENE. MATLAB and its Simulink library are used to build the tool, which carries out system optimization and process simulation, respectively. Plant input and output data are collected and used for open-loop process identification, estimating transfer functions by the least-squares method. The control loop of the reactor pressure is then tuned, using the transfer function parameters estimated in the identification step, according to the ITAE criterion. Estimates are accepted only if the convergence criterion is met. The resulting tuning yields a successful pressure control response, and the tool could be applied to other chemical processes. (author)
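The abstract names least-squares transfer-function estimation but does not give the model structure; a minimal sketch, assuming a first-order discrete (ARX) model y[k] = a*y[k-1] + b*u[k-1] fitted to noiseless step-response data (the plant parameters below are invented for illustration), could look like:

```python
# Hypothetical first-order ARX model: y[k] = a*y[k-1] + b*u[k-1].
# Least-squares estimate of (a, b) via the 2x2 normal equations.

def identify_first_order(u, y):
    """Fit y[k] = a*y[k-1] + b*u[k-1] by ordinary least squares."""
    s_yy = s_uu = s_yu = s_y1 = s_u1 = 0.0
    for k in range(1, len(y)):
        y1, u1, yk = y[k - 1], u[k - 1], y[k]
        s_yy += y1 * y1      # sum of y[k-1]^2
        s_uu += u1 * u1      # sum of u[k-1]^2
        s_yu += y1 * u1      # cross term
        s_y1 += y1 * yk      # regressor . target terms
        s_u1 += u1 * yk
    det = s_yy * s_uu - s_yu * s_yu
    a = (s_uu * s_y1 - s_yu * s_u1) / det
    b = (s_yy * s_u1 - s_yu * s_y1) / det
    return a, b

# Noiseless step response of a known plant (a = 0.9, b = 0.5) as a sanity check
u = [1.0] * 200
y = [0.0] * 200
for k in range(1, 200):
    y[k] = 0.9 * y[k - 1] + 0.5 * u[k - 1]

a_hat, b_hat = identify_first_order(u, y)
```

From the estimates, the continuous-time gain and time constant follow as Kp = b/(1-a) and tau = -Ts/ln(a) for sample time Ts, which would then feed ITAE-based tuning rules for the pressure loop.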

  19. A Simulation Tool for Steady State Thermal Performance Applied to the SPL Double-Walled Tube RF Power Coupler

    CERN Document Server

    Bonomi, R

    2014-01-01

    This note reports on the study carried out to design a tool for steady-state thermal performance of the RF power coupler inside the SPL cryostat. To reduce the amount of heat penetrating into the helium bath where the cavity is placed, the main coupler is actively cooled by means of an adequate flow rate of helium gas. The knowledge of the temperature profiles and the overall thermal performance of the power coupler are fundamental for the estimation of the total heat load budget of the cryostat.

  20. Applied acoustics concepts, absorbers, and silencers for acoustical comfort and noise control alternative solutions, innovative tools, practical examples

    CERN Document Server

    Fuchs, Helmut V

    2013-01-01

    The author gives a comprehensive overview of materials and components for noise control and acoustical comfort. Sound absorbers must meet acoustical and architectural requirements, which fibrous or porous material alone can meet. Basics and applications are demonstrated, with representative examples for spatial acoustics, free-field test facilities and canal linings. Acoustic engineers and construction professionals will find some new basic concepts and tools for developments in order to improve acoustical comfort. Interference absorbers, active resonators and micro-perforated absorbers of different materials and designs complete the list of applications.

  1. Applying value engineering and modern assessment tools in managing NEPA: Improving effectiveness of the NEPA scoping and planning process

    Energy Technology Data Exchange (ETDEWEB)

    ECCLESTON, C.H.

    1998-09-03

    While the National Environmental Policy Act (NEPA) implementing regulations focus on describing "what" must be done, they provide surprisingly little direction on "how" such requirements are to be implemented. Specific implementation of these requirements has largely been left to the discretion of individual agencies. More than a quarter of a century after NEPA's enactment, few rigorous tools, techniques, or methodologies have been developed or widely adopted for implementing the regulatory requirements. In preparing an Environmental Impact Statement, agencies are required to conduct a public scoping process to determine the range of actions, alternatives, and impacts that will be investigated. Determining the proper scope of analysis is an element essential to the successful planning and implementation of future agency actions. The lack of rigorous tools and methodologies can lead to project delays, cost escalation, and increased risk that the scoping process may not adequately capture the scope of decisions that eventually need to be considered. Recently, selected Value Engineering (VE) techniques were successfully used in managing a pre-scoping effort. A new strategy is advanced for conducting a pre-scoping/scoping effort that combines NEPA with VE. Consisting of five distinct phases, this approach has potentially widespread implications for the way NEPA, and scoping in particular, is practiced.

  2. Population Dynamics P system (PDP) models: a standardized protocol for describing and applying novel bio-inspired computing tools.

    Science.gov (United States)

    Colomer, Maria Àngels; Margalida, Antoni; Pérez-Jiménez, Mario J

    2013-01-01

    Today, the volume of data and knowledge of processes necessitates more complex models that integrate all available information. This obstacle has been overcome thanks to technological advances in both software and hardware. The computational tools available today have allowed the development of a new family of models, known as computational models. Describing these models is difficult because they cannot be expressed analytically, so protocols must be created to serve as guidelines for future users. Population Dynamics P systems (PDP) models are a novel and effective computational tool for modeling complex problems: they work in parallel (simultaneously interrelating different processes), are modular, and have high computational efficiency. The difficulty of describing these models, however, calls for a protocol that unifies their presentation and the steps to follow. We use two case studies to demonstrate the use and implementation of these computational models for population dynamics and ecological process studies, discussing briefly their potential applicability to simulating complex ecosystem dynamics.

  3. 100 years of Planck's quantum

    CERN Document Server

    Duck, Ian M

    2000-01-01

    This invaluable book takes the reader from Planck's discovery of the quantum in 1900 to the most recent interpretations and applications of nonrelativistic quantum mechanics. The introduction of the quantum idea leads off the prehistory of quantum mechanics, featuring Planck, Einstein, Bohr, Compton, and de Broglie's immortal contributions. Their original discovery papers are featured with explanatory notes and developments in Part 1. The invention of matrix mechanics and quantum mechanics by Heisenberg, Born, Jordan, Dirac, and Schrödinger is presented next, in Part 2. Following that, in Part 3,

  4. 100 years of radionuclide metrology.

    Science.gov (United States)

    Judge, S M; Arnold, D; Chauvenet, B; Collé, R; De Felice, P; García-Toraño, E; Wätjen, U

    2014-05-01

    The discipline of radionuclide metrology at national standards institutes started in 1913 with the certification by Curie, Rutherford and Meyer of the first primary standards of radium. In early years, radium was a valuable commodity and the aim of the standards was largely to facilitate trade. The focus later changed to providing standards for the new wide range of radionuclides, so that radioactivity could be used for healthcare and industrial applications while minimising the risk to patients, workers and the environment. National measurement institutes responded to the changing demands by developing new techniques for realising primary standards of radioactivity. Looking ahead, there are likely to be demands for standards for new radionuclides used in nuclear medicine, an expansion of the scope of the field into quantitative imaging to facilitate accurate patient dosimetry for nuclear medicine, and an increasing need for accurate standards for radioactive waste management and nuclear forensics.

  5. FEMA 100 year Flood Data

    Data.gov (United States)

    California Department of Resources — The Q3 Flood Data product is a digital representation of certain features of FEMA's Flood Insurance Rate Map (FIRM) product, intended for use with desktop mapping...

  6. Poor reliability between Cochrane reviewers and blinded external reviewers when applying the Cochrane risk of bias tool in physical therapy trials.

    Directory of Open Access Journals (Sweden)

    Susan Armijo-Olivo

    OBJECTIVES: To test the inter-rater reliability of the RoB tool applied to Physical Therapy (PT) trials by comparing ratings from Cochrane review authors with those of blinded external reviewers. METHODS: Randomized controlled trials (RCTs) in PT were identified by searching the Cochrane Database of Systematic Reviews for meta-analyses of PT interventions. RoB assessments were conducted independently by 2 reviewers blinded to the RoB ratings reported in the Cochrane reviews. Data on RoB assessments from Cochrane reviews and other characteristics of reviews and trials were extracted. Consensus assessments between the two reviewers were then compared with the RoB ratings from the Cochrane reviews. Agreement between Cochrane and blinded external reviewers was assessed using weighted kappa (κ). RESULTS: In total, 109 trials included in 17 Cochrane reviews were assessed. Inter-rater reliability on the overall RoB assessment between Cochrane review authors and blinded external reviewers was poor (κ = 0.02, 95% CI: -0.06, 0.06). Inter-rater reliability on individual domains of the RoB tool was poor (median κ = 0.19), ranging from κ = -0.04 ("Other bias") to κ = 0.62 ("Sequence generation"). There was also no agreement (κ = -0.29, 95% CI: -0.81, 0.35) in the overall RoB assessment at the meta-analysis level. CONCLUSIONS: Risk of bias assessments of RCTs using the RoB tool are not consistent across different research groups. Poor agreement was demonstrated not only at the trial level but also at the meta-analysis level. The results have implications for decision making, since different recommendations can be reached depending on the group analyzing the evidence. Improved guidelines to consistently apply the RoB tool, and revisions to the tool for different health areas, are needed.
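The abstract reports weighted κ but not the weighting scheme; a minimal sketch of linearly weighted Cohen's kappa for ordinal risk-of-bias codes (the 0 = low, 1 = unclear, 2 = high coding and the toy ratings are assumptions, not the study's data), in plain Python:

```python
from collections import Counter

def weighted_kappa(r1, r2, n_cat=3):
    """Linearly weighted Cohen's kappa for two raters' ordinal codes 0..n_cat-1."""
    n = len(r1)
    # disagreement weights: 0 on the diagonal, up to 1 for extreme disagreement
    w = [[abs(i - j) / (n_cat - 1) for j in range(n_cat)] for i in range(n_cat)]
    # observed joint proportions
    obs = [[0.0] * n_cat for _ in range(n_cat)]
    for a, b in zip(r1, r2):
        obs[a][b] += 1.0 / n
    # expected joint proportions under independent marginals
    c1, c2 = Counter(r1), Counter(r2)
    exp = [[(c1[i] / n) * (c2[j] / n) for j in range(n_cat)] for i in range(n_cat)]
    po = sum(w[i][j] * obs[i][j] for i in range(n_cat) for j in range(n_cat))
    pe = sum(w[i][j] * exp[i][j] for i in range(n_cat) for j in range(n_cat))
    return 1.0 - po / pe

# Toy example: two raters scoring eight trials on one RoB domain
ratings_a = [0, 0, 1, 2, 1, 0, 2, 1]
ratings_b = [0, 1, 1, 2, 0, 0, 2, 2]
kappa = weighted_kappa(ratings_a, ratings_b)
```

With full agreement the statistic equals 1; values near 0 (as in the abstract) mean agreement is no better than chance.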

  7. The Black Top Hat function applied to a DEM: A tool to estimate recent incision in a mountainous watershed (Estibère Watershed, Central Pyrenees)

    Science.gov (United States)

    Rodriguez, Felipe; Maire, Eric; Courjault-Radé, Pierre; Darrozes, José

    2002-03-01

    The Top Hat Transform is a grey-level image analysis tool for extracting peaks and valleys from a non-uniform background. It can be applied to a grey-level Digital Elevation Model (DEM), and is used here to quantify the volume of recently incised material in a Pyrenean mountain watershed. A grey-level closing operation applied to the present-day DEM gives a new image called the "paleo" DEM. The Black Top Hat function then consists in subtracting the present-day DEM from the "paleo" DEM. This yields a new DEM representing all valleys whose sizes range between the size of the structuring element and the null value, as no threshold is used. The incised volume is derived directly from the subtraction between the two DEMs. The geological significance of the quantitative results is discussed.
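Sketched in Python/NumPy (the 3x3 flat structuring element and the toy 5x5 DEM are illustrative assumptions, not the paper's parameters), the closing-minus-original operation looks like:

```python
import numpy as np

def grey_dilate(img, size=3):
    """Max filter with a flat square structuring element (edge-padded)."""
    r = size // 2
    p = np.pad(img, r, mode="edge")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + size, j:j + size].max()
    return out

def grey_erode(img, size=3):
    """Min filter with the same structuring element."""
    r = size // 2
    p = np.pad(img, r, mode="edge")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + size, j:j + size].min()
    return out

def black_top_hat(dem, size=3):
    """closing(DEM) - DEM: positive where valleys are narrower than the element."""
    paleo = grey_erode(grey_dilate(dem, size), size)  # morphological closing
    return paleo - dem

# Toy DEM: a 100 m plateau with a one-cell-wide channel incised to 90 m
dem = np.full((5, 5), 100.0)
dem[:, 2] = 90.0
incision = black_top_hat(dem, size=3)
```

Summing the positive residual and multiplying by the cell area gives the incised-volume estimate the abstract describes; here the channel cells each carry 10 m of incision and the plateau carries none.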

  8. Applying the “WSUD potential”-tool in the framework of the Copenhagen Climate Adaptation and Cloudburst Management Plans

    DEFF Research Database (Denmark)

    Lerer, Sara Maria; Madsen, Herle Mo; Smit Andersen, Jonas;

    2016-01-01

    Water Sensitive Urban Design (WSUD) is still in the “Opportunity”-phase of its stabilization process in Copenhagen, Denmark, indicating that there are controversies surrounding its proper use and the regulatory framework is not completely adapted to the new technology. In 2015 private land owners...... in Denmark could get up to 100% of the construction costs of climate adaptation measures funded by the utility companies, which resulted in a race to apply for this co-funding plan. In this study we briefly review the climate adaptation framework in Copenhagen, and then discuss how well different scenarios...

  9. Network analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish water management system

    DEFF Research Database (Denmark)

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-01-01

    patterns of growth and development. We applied Network Analysis (NA) for assessing the sustainability of a Danish municipal Water Management System (WMS). We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices......: it is highly efficient at processing the water resource, but the rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future...

  10. Applying TRIZ and Fuzzy AHP Based on Lean Production to Develop an Innovative Design of a New Shape for Machine Tools

    Directory of Open Access Journals (Sweden)

    Ho-Nien Hsieh

    2015-03-01

    Companies face cutthroat competition and are forced to continuously outperform their competitors. To strengthen their position in a competitive world, organizations must improve at an ever faster pace and embrace new ideals such as innovation. Today, innovative design in the development of new products has become a core value in most companies, and innovation is recognized as the main driving force in the market. This work applies the Russian theory of inventive problem solving (TRIZ) and the fuzzy analytic hierarchy process (FAHP) to design a new shape for machine tools. TRIZ offers several concepts and tools to facilitate concept creation and problem solving, while FAHP is employed as a decision support tool that can adequately represent qualitative and subjective assessments in a multiple-criteria decision-making environment. In the machine tools industry, this is the first study to develop an innovative design under the concept of lean production. We used TRIZ to propose principles relevant to the shape's design and FAHP to evaluate and select the best feasible alternative from independent factors in a multiple-criteria decision-making setting. The contribution of this research is a scientific method, based on the lean production concept, for designing a new product and improving the old design process.

  11. Lead-time reduction utilizing lean tools applied to healthcare: the inpatient pharmacy at a local hospital.

    Science.gov (United States)

    Al-Araidah, Omar; Momani, Amer; Khasawneh, Mohammad; Momani, Mohammed

    2010-01-01

    The healthcare arena, much like the manufacturing industry, benefits from many aspects of the Toyota lean principles. Lean thinking contributes to reducing or eliminating non-value-added time, money, and energy in healthcare. In this paper, we apply selected principles of lean management aimed at reducing the wasted time associated with drug dispensing at the inpatient pharmacy of a local hospital. Thorough investigation of the drug dispensing process revealed unnecessary complexities that contribute to delays in delivering medications to patients. We utilize DMAIC (Define, Measure, Analyze, Improve, Control) and 5S (Sort, Set-in-order, Shine, Standardize, Sustain) principles to identify and reduce wastes that contribute to increasing the lead-time in healthcare operations at the pharmacy under study. The results obtained from the study revealed potential savings of > 45% in the drug dispensing cycle time.

  12. Inspecting what you expect: Applying modern tools and techniques to evaluate the effectiveness of household energy interventions

    Science.gov (United States)

    Pillarisetti, Ajay

    Exposure to fine particles (PM2.5) resulting from solid fuel use for household energy needs - including cooking, heating, and lighting - is one of the leading causes of ill-health globally, responsible for approximately 4 million premature deaths and 84 million lost disability-adjusted life years. The well-established links between cooking and ill-health are modulated by complex social, behavioral, technological, and environmental issues that pose unique challenges to efforts that seek to reduce this large health burden. Despite growing interest in the field - and numerous technical solutions that, in the laboratory at least, reduce emissions of harmful air pollutants from solid fuel combustion - there is a need for refined tools, models, and techniques (1) for measuring environmental pollution in households using solid fuel, (2) for tracking adoption of interventions, and (3) for estimating the potential health benefits attributable to an intervention. Part of the need for higher spatial and temporal resolution data on particle concentrations and dynamics is being met by low-cost sensing platforms that provide large amounts of time-resolved data on critical parameters of interest, including PM2.5 concentrations and time-of-use metrics for heat-generating appliances, like stoves. Use of these sensors can pose non-trivial challenges, including those related to data management, analysis, and field logistics, but it also enables novel lines of inquiry and insight. Chapter 2 presents a long-term deployment of real-time PM2.5 sensors in rural, solid-fuel-using kitchens, specifically seeking to evaluate how well commonly measured 24- or 48-hour samples represent long-term means. While short-term measures were poor predictors of long-term means, the dataset enabled evaluation of numerous sampling strategies - including sampling once per week, month, or season - that had much lower errors and higher probabilities of estimating the true mean.

  13. Network Analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish Water Management System.

    Science.gov (United States)

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-03-30

    New insights into the sustainable use of natural resources in human systems can be gained through comparison with ecosystems via common indices. In both kinds of system, resources are processed by a number of users within a network, but we consider ecosystems as the only ones displaying sustainable patterns of growth and development. This study aims at using Network Analysis (NA) to move such "ecosystem perspective" from theory into practice. A Danish municipal Water Management System (WMS) is used as case study to test the NA methodology and to discuss its generic applicability. We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices of system-level performance for seven different network configurations illustrating past conditions (2004-2008) and future scenarios (2015 and 2020). We also computed the same indices for other 24 human systems and for 12 ecosystems, by using information from the existing scientific literature on NA. The comparison of these results reveals that the WMS is similar to the other human systems and that human systems generally differ from ecosystems. The WMS is highly efficient at processing the water resource, but the rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future performance of the WMS are marginal. We argue that future interventions should create alternative pathways for reusing rainwater within the WMS, increasing its potential to withstand the occurrence of flooding. We discuss advantages, limitations, and general applicability of NA as a tool for assessing environmental sustainability in human systems.

  14. Architecture of the global land acquisition system: applying the tools of network science to identify key vulnerabilities

    Science.gov (United States)

    Seaquist, J. W.; Li Johansson, Emma; Nicholas, Kimberly A.

    2014-11-01

    Global land acquisitions, often dubbed ‘land grabbing’, are increasingly becoming drivers of land change. We use the tools of network science to describe the connectivity of the global acquisition system. We find that 126 countries participate in this form of global land trade. Importers are concentrated in the Global North, the emerging economies of Asia, and the Middle East, while exporters are confined to the Global South and Eastern Europe. A small handful of countries account for the majority of land acquisitions (particularly China, the UK, and the US), the cumulative distribution of which is best described by a power law. We also find that countries with many land-trading partners play a disproportionately central role in providing connectivity across the network, with the shortest trading path between any two countries traversing China, the US, or the UK over a third of the time. The land acquisition network contains very few trading cliques and is therefore characterized by a low degree of preferential trading or regionalization. We also show that countries with many export partners trade land with countries with few import partners, and vice versa, meaning that less developed countries have a large array of export partnerships with developed countries but very few import partnerships (a disassortative relationship). Finally, we find that the structure of the network is potentially prone to propagating crises (e.g., if importing countries become dependent on crops exported from their land-trading partners). This network analysis approach can be used to quantitatively analyze and understand telecoupled systems, as well as to anticipate and diagnose the potential effects of telecoupling.

  15. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    Science.gov (United States)

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve the differences of opinion among the different parties. In our project that sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of the addition of new mandatory recycled waste, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information that formed an incomplete hierarchy structure in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgment of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the project's outcomes derived from the project as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis on the top three items recommended by the results of the evaluation for recycling, namely, Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
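As an illustration of the AHP mechanics the abstract leans on (the 3x3 pairwise comparison matrix below is a made-up example, not the project's survey data), the geometric-mean priority vector and Saaty's consistency ratio can be sketched as:

```python
import math

def ahp_priorities(M):
    """Priority vector from a pairwise comparison matrix (geometric-mean method)."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]                   # normalize to sum to 1

def consistency_ratio(M, w):
    """CR = CI/RI; CR < 0.1 is the usual acceptability threshold."""
    n = len(M)
    # lambda_max estimated by averaging (M w)_i / w_i over the rows
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]              # Saaty's random indices
    return ci / ri

# Hypothetical comparison of three candidate waste items on one criterion
M = [[1.0, 3.0, 5.0],
     [1.0 / 3.0, 1.0, 3.0],
     [1.0 / 5.0, 1.0 / 3.0, 1.0]]
w = ahp_priorities(M)
cr = consistency_ratio(M, w)
```

If CR exceeds 0.1, the pairwise judgments are revisited; splitting a large item set into groups, as the project did, keeps each comparison matrix small enough to hold inconsistency down.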

  16. Simple model of dissolved oxygen consumption in a bay within high organic loading: an applied remediation tool.

    Science.gov (United States)

    Ahumada, Ramón; Vargas, José; Pagliero, Liliana

    2006-07-01

    San Vicente Bay is a shallow coastal embayment in central Chile with multiple uses, one of which is receiving wastewater from industrial fisheries, steel mill effluents, and domestic sewage. A simulation model of dissolved oxygen consumption by the organic residues released into the embayment was developed and applied. Three compartments were established as a function of depth, circulation and outfall location. The model compartments had different volumes, and their oxygen saturation value was used as the baseline. The model parameters included: (a) BOD5 of the industrial and urban effluents, (b) oxygen demand by organic sediments, (c) respiration, (d) photosynthesis and (e) re-aeration. Iterating the model showed severe alterations in Compartment 1, with oxygen falling 65% below saturation. Compartment 2 showed a small decline (10%) and Compartment 3 showed no apparent change in oxygen values. The measure recommended for remediation was to decrease the BOD5 loading by 30% in the affected sector. Iterating the model for 200 h under this recommendation produced an increase in saturation of 60% (5 ml O2 L(-1)), suggesting an improvement in environmental conditions.
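A minimal sketch of one compartment's oxygen balance under the five terms the abstract lists (re-aeration, BOD decay, sediment oxygen demand, photosynthesis, respiration), integrated with a simple Euler scheme. All coefficients below are invented placeholders, not the paper's calibration:

```python
# dC/dt = k_a*(C_sat - C) - k_d*L - SOD + P - R, with BOD decaying as dL/dt = -k_d*L

def simulate(hours, C0=5.0, L0=8.0, C_sat=7.0,
             k_a=0.02, k_d=0.01, SOD=0.005, P=0.01, R=0.008, dt=1.0):
    """Euler integration; concentrations in mg O2/L, first-order rates per hour."""
    C, L = C0, L0
    for _ in range(int(hours / dt)):
        dC = (k_a * (C_sat - C)   # re-aeration toward saturation
              - k_d * L           # oxygen consumed by BOD decay
              - SOD               # sediment oxygen demand
              + P - R)            # photosynthesis minus respiration
        C = max(C + dC * dt, 0.0)
        L = L * (1.0 - k_d * dt)  # BOD load decays over time
    return C

base = simulate(200)                       # current effluent loading
remediated = simulate(200, L0=8.0 * 0.7)   # 30% reduction in BOD loading
```

Comparing the 200 h runs with and without the 30% BOD reduction mirrors the remediation experiment in the abstract: the reduced-loading compartment ends at a higher dissolved oxygen level.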

  17. Undergraduate teaching modules featuring geodesy data applied to critical social topics (GETSI: GEodetic Tools for Societal Issues)

    Science.gov (United States)

    Pratt-Sitaula, B. A.; Walker, B.; Douglas, B. J.; Charlevoix, D. J.; Miller, M. M.

    2015-12-01

    The GETSI project, funded by NSF TUES, is developing and disseminating teaching and learning materials that feature geodesy data applied to critical societal issues such as climate change, water resource management, and natural hazards (serc.carleton.edu/getsi). It is collaborative between UNAVCO (NSF's geodetic facility), Mt San Antonio College, and Indiana University. GETSI was initiated after requests by geoscience faculty for geodetic teaching resources for introductory and majors-level students. Full modules take two weeks but module subsets can also be used. Modules are developed and tested by two co-authors and also tested in a third classroom. GETSI is working in partnership with the Science Education Resource Center's (SERC) InTeGrate project on the development, assessment, and dissemination to ensure compatibility with the growing number of resources for geoscience education. Two GETSI modules are being published in October 2015. "Ice mass and sea level changes" includes geodetic data from GRACE, satellite altimetry, and GPS time series. "Imaging Active Tectonics" has students analyzing InSAR and LiDAR data to assess infrastructure earthquake vulnerability. Another three modules are in testing during fall 2015 and will be published in 2016. "Surface process hazards" investigates mass wasting hazard and risk using LiDAR data. "Water resources and geodesy" uses GRACE, vertical GPS, and reflection GPS data to have students investigating droughts in California and the High Great Plains. "GPS, strain, and earthquakes" helps students learn about infinitesimal and coseismic strain through analysis of horizontal GPS data and includes an extension module on the Napa 2014 earthquake. In addition to teaching resources, the GETSI project is compiling recommendations on successful development of geodesy curricula. 
The chief recommendations so far are the critical importance of including scientific experts in the authorship team and investing significant resources in

  18. VERONA V6.22 – An enhanced reactor analysis tool applied for continuous core parameter monitoring at Paks NPP

    Energy Technology Data Exchange (ETDEWEB)

    Végh, J., E-mail: janos.vegh@ec.europa.eu [Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Pós, I., E-mail: pos@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Horváth, Cs., E-mail: csaba.horvath@energia.mta.hu [Centre for Energy Research, Hungarian Academy of Sciences, H-1525 Budapest 114, P.O. Box 49 (Hungary); Kálya, Z., E-mail: kalyaz@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Parkó, T., E-mail: parkot@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary); Ignits, M., E-mail: ignits@npp.hu [Paks Nuclear Power Plant Ltd., H-7031 Paks, P.O. Box 71 (Hungary)

    2015-10-15

Between 2003 and 2007 the Hungarian Paks NPP carried out a large modernization project to upgrade its VERONA core monitoring system. The work resulted in a state-of-the-art system able to support the increase of reactor thermal power to 108% through more accurate and more frequent core analysis. Details of the new system are given in Végh et al. (2008); the most important improvements were as follows: complete replacement of the hardware and the local area network; application of a new operating system and porting of a large fraction of the original application software to the new environment; implementation of a new human-system interface; and, last but not least, introduction of new reactor physics calculations. The basic novelty of the modernized core analysis was the introduction of an on-line core-follow module based on the standard Paks NPP core design code HELIOS/C-PORCA. The new calculations also provided much finer spatial resolution, both in terms of the number of axial nodes and within the fuel assemblies. The new system was able to calculate accurately the fuel applied during the first phase of the power increase, but it was not tailored to determine the effects of burnable absorbers such as gadolinium. However, in the second phase of the power increase the use of fuel assemblies containing three fuel rods with gadolinium content was intended (in order to optimize fuel economy); therefore the off-line and on-line VERONA reactor physics models had to be modified further to handle the new fuel within the accuracy requirements. In the present paper, a brief overview of the system version (V6.0) commissioned after the first modernization step is given first; then details of the modified off-line and on-line reactor physics calculations are described. Validation results for the new modules are treated extensively, in order to illustrate the extent and complexity of the V&V procedure associated with the development and licensing of the new

  19. History of views on the relative positions of Antarctica and South America: A 100-year tango between Patagonia and the Antarctic Peninsula

    Science.gov (United States)

    Miller, H.

    2007-01-01

Discussion of continental drift around Antarctica began nearly 100 years ago. While the Gondwana connections of Antarctica to Africa and Australia have been well defined for decades, the relative pre-drift positions of the Antarctic Peninsula and Patagonia continue to be subjects of controversy. Older reconstructions, which showed a paleo-position of the Peninsula crossing over continental crust of the Falkland Plateau or even South Africa or Patagonia, are now out of consideration. But contradictory opinions remain over the relative paleo-position of the Peninsula as a more or less straight prolongation of the Patagonian Andes, versus a position parallel to Patagonia along the Pacific coast. Geological arguments can be found for both opinions, but geophysical observations on the adjacent ocean floors, particularly the evolution of the Weddell Sea crust, favor the latter reconstruction.

  20. CoSMoS Southern California v3.0 Phase 1 (100-year storm) flood hazard projections: Los Angeles, San Diego and Orange counties

    Science.gov (United States)

    Barnard, Patrick; Erikson, Li; Foxgrover, Amy; O'Neill, Andrea; Herdman, Liv

    2015-01-01

    The Coastal Storm Modeling System (CoSMoS) makes detailed predictions (meter-scale) over large geographic scales (100s of kilometers) of storm-induced coastal flooding and erosion for both current and future sea-level rise (SLR) scenarios. CoSMoS v3.0 for Southern California shows projections for future climate scenarios (sea-level rise and storms) to provide emergency responders and coastal planners with critical storm-hazards information that can be used to increase public safety, mitigate physical damages, and more effectively manage and allocate resources within complex coastal settings. Phase I data for Southern California include flood-hazard information for the coast from the Mexican Border to Pt. Conception for a 100-year storm scenario. Data are complete for the information presented but are considered preliminary; changes may be reflected in the full data release (Phase II) in summer 2016.

  1. Changes in C37 alkenones flux on the eastern continental shelf of the Bering Sea: the record of Emiliania huxleyi bloom over the past 100 years

    Science.gov (United States)

    Harada, N.; Sato, M.; Okazaki, Y.; Oguri, K.; Tadai, O.; Saito, S.; Konno, S.; Jordan, R. W.; Katsuki, K.; Shin, K.; Narita, H.

    2008-12-01

Blooms of coccolithophores can be detected by ocean color imagery using data from the satellite-borne Sea-viewing Wide Field-of-view Sensor (SeaWiFS), launched in 1997. Temporally and spatially large-scale blooms of Emiliania huxleyi (E. huxleyi) have thus been distinguished annually on the eastern continental shelf of the Bering Sea since 1997. In 1997, a combination of atmospheric mechanisms produced summer weather anomalies such as calm winds, clear skies, and warm air temperatures over the Bering Sea, and these anomalies caused depletion of the subpycnocline nutrient reservoir (Napp and Hunt, 2001). After depletion of nitrate and silicate, a sustained (more than 4-month-long) bloom of E. huxleyi was observed (Stockwell et al., 2001). Because of the speed and magnitude with which parts of the Bering Sea ecosystem responded to changes in atmospheric factors (Napp and Hunt, 2001), and because a bloom of the coccolithophorid Coccolithus pelagicus has also been detected in the northeastern Atlantic Ocean off Iceland every year since 1997 (Ostermann, 2001), the appearance of an E. huxleyi bloom in the Bering Sea could be related to atmospherically forced decadal oscillations or global factors. We have investigated the spatial expansion and temporal development of the E. huxleyi bloom on the continental shelf of the Bering Sea using a biomarker of E. huxleyi, the C37 alkenone flux recorded in sediments during the past 100 years. The record shows that the E. huxleyi bloom has been prominent since the 1970s at the latest. In this presentation, we will discuss the relationship between the E. huxleyi bloom and the activity of the Aleutian Low, as well as changes in diatom assemblages. References: Napp and Hunt, 2001, Fish. Oceanogr., 10, 61-68. Ostermann, 2001, WHOI annual report, pp. 17-18. Stockwell et al., 2001, Fish. Oceanogr., 10, 99-116.

  2. The development of vat dyes over 100 years (to be continued)

    Institute of Scientific and Technical Information of China (English)

    陈荣圻

    2015-01-01

The first vat dye (Vat Dye RSN) was synthesized and produced by BASF in 1901, more than a hundred years ago; counting from the synthesis of indigo, already commercialized by BASF in 1897, the history is even longer. Vat dyes are expensive because of their complex chemical structures and long synthesis routes, which produce large amounts of the "three wastes" that are difficult to treat. However, their bright shades and high color strength cannot be matched by other dyes for cotton. Beyond printing and dyeing, some vat dyes can be converted into high-grade organic pigments after pigmentation, and some varieties extend into high-tech fields such as liquid crystals and photoconductive materials in optical physics and electrochemistry. They are indispensable functional materials that have taken on a new look.

  3. A 100-Year Retrospective Landscape-Level Carbon Budget for the Sooke Lake Watershed, British Columbia: Constraining Estimates of Terrestrial to Aquatic DOC Transfers.

    Science.gov (United States)

    Trofymow, J. A.; Smiley, B. P. K.

    2014-12-01

To address how natural disturbance, forest harvest, and deforestation from reservoir creation affect landscape-level carbon (C) budgets, a retrospective C budget for the 8500 ha Sooke watershed from 1911 to 2012 was developed using historic spatial inventory and disturbance data. Data were input to a spatially explicit version of the Carbon Budget Model-Canadian Forest Sector (CBM-CFS3), an inventory-based C budget model used to simulate forest C dynamics at multiple scales. In 1911 the watershed was dominated by mature/old Douglas-fir forests with aboveground biomass C (ABC) of 262 Mg C/ha and net ecosystem production (NEP) of 0.63 Mg C/ha/yr. Land was cleared around Sooke Lake, a dam was built, and the lake expanded from 370 to 450 ha in 1915, 610 ha in 1970, 670 ha in 1980, and 810 ha in 2002. Along with deforestation, fires and localized harvest occurred from 1920 to 1940, reducing ABC to 189 Mg C/ha, with NEP varying from -1.63 to 0.13 Mg C/ha/yr. Distributed harvest occurred from 1954 to 1998, with a minimum ABC of 148 Mg C/ha in 1991. By 2012, ABC (177 Mg C/ha) and NEP (2.29 Mg C/ha/yr) had increased. Over 100 years, 2430 ha of forest was cut and replanted and 640 ha was deforested. CBM-CFS3 includes transfers of dissolved organic C (DOC) to aquatic systems; however, data have not been available to parameterize DOC flux. DOC fluxes are modelled as a fraction of the decay loss from humified soil C, with a default of 100% of losses to CO2 and 0% to DOC. Stream flow and [DOC] data from 1996 to 2012 for three watershed catchments, Rithet, Judge, and Council, were used to estimate annual DOC fluxes. The catchments differed both in the percentage of area disturbed (logging or fire) over 100 years (39%, 93%, 91%) and in the percentage of mature/old forest (>80 yr in 2012) (67%, 56%, 21%). DOC fluxes for Rithet and Judge ranged from 0.037 to 0.057 Mg C/ha/yr; Council averaged 0.017 Mg C/ha/yr. The low DOC flux was likely due to the influence of a small lake in the catchment. 
Constraining CBM-CFS3 to observed DOC fluxes required
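The annual DOC export fluxes used to constrain the model can be estimated from paired stream-flow and [DOC] records roughly as follows; this is a minimal sketch with made-up numbers (the study's actual flux calculation may differ, e.g. in flow-weighting or gap-filling):

```python
def annual_doc_flux(discharges_m3_per_s, doc_mg_per_l, catchment_ha):
    """Estimate annual DOC export (Mg C/ha/yr) from mean daily
    discharge (m^3/s) and DOC concentration (mg C/L) records.
    Unit conversions: 86400 s/day, 1000 L/m^3, 1e-9 Mg/mg."""
    total_mg = sum(q * 86400 * 1000 * c          # mg C exported that day
                   for q, c in zip(discharges_m3_per_s, doc_mg_per_l))
    return total_mg * 1e-9 / catchment_ha

# Hypothetical year: constant 0.5 m^3/s discharge and 2 mg C/L DOC
# over a 1000 ha catchment.
flux = annual_doc_flux([0.5] * 365, [2.0] * 365, 1000.0)  # ~0.0315 Mg C/ha/yr
```

A value of ~0.03 Mg C/ha/yr from these illustrative inputs sits within the 0.017-0.057 Mg C/ha/yr range reported for the three catchments.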

  4. Organochlorine pesticides (OCPs) in wetland soils under different land uses along a 100-year chronosequence of reclamation in a Chinese estuary

    Science.gov (United States)

    Bai, Junhong; Lu, Qiongqiong; Zhao, Qingqing; Wang, Junjing; Gao, Zhaoqin; Zhang, Guangliang

    2015-12-01

    Soil profiles were collected at a depth of 30 cm in ditch wetlands (DWs), riverine wetlands (RiWs) and reclaimed wetlands (ReWs) along a 100-year chronosequence of reclamation in the Pearl River Delta. In total, 16 OCPs were measured to investigate the effects of wetland reclamation and reclamation history on OCP levels. Our results showed that average ∑DDTs, HCB, MXC, and ∑OCPs were higher in surface soils of DWs compared to RiWs and ReWs. Both D30 and D20 soils contained the highest ∑OCP levels, followed by D40 and D100 soils; lower ∑OCP levels occurred in D10 soils. Higher ∑OCP levels were observed in the younger RiWs than in the older ones, and surface soils exhibited higher ∑OCP concentrations in the older ReWs compared with younger ReWs. The predominant percentages of γ-HCH in ∑HCHs (>42%) and aldrin in ∑DRINs (>46%) in most samples reflected the recent use of lindane and aldrin. The presence of dominant DDT isomers (p,p’-DDE and p,p’-DDD) indicated the historical input of DDT and significant aerobic degradation of the compound. Generally, DW soils had a higher ecotoxicological risk of OCPs than RiW and ReW soils, and the top 30 cm soils had higher ecotoxicological risks of HCHs than of DDTs.

  5. Intraspecific variation in fine root respiration and morphology in response to in situ soil nitrogen fertility in a 100-year-old Chamaecyparis obtusa forest.

    Science.gov (United States)

    Makita, Naoki; Hirano, Yasuhiro; Sugimoto, Takanobu; Tanikawa, Toko; Ishii, Hiroaki

    2015-12-01

Soil N fertility has an effect on belowground C allocation, but the physiological and morphological responses of individual fine root segments to variations in N availability under field conditions are still unclear. In this study, the direction and magnitude of the physiological and morphological responses of fine roots to variable in situ soil N fertility at a forest site were determined. We measured the specific root respiration (Rr) rate, N concentration, and morphology of fine root segments of the first three branching orders in a 100-year-old coniferous forest of Chamaecyparis obtusa. Higher soil N fertility induced higher Rr rates, root N concentration, and specific root length (SRL), and lower root tissue density (RTD). At all fertility levels, Rr rates were significantly positively correlated with root N and SRL and negatively correlated with RTD. The regression slopes of respiration against root N and RTD were significantly higher along the soil N fertility gradient. Although no differences in the slopes of the Rr-SRL relationship were found across the levels, there were significant shifts in the intercept along the common slope. These results suggest that contrasting patterns in the intraspecific relationships between specific Rr and N, RTD, and SRL exist among soils with different N fertility. Consequently, substantial increases in soil N fertility would exert positive effects on organ-scale root performance by covarying Rr, root N, and morphology for their potential nutrient and water uptake.

  6. Quantification of uncertainties in the 100-year flow at an ungaged site near a gaged station and its application in Georgia

    Science.gov (United States)

    Cho, Huidae; Bones, Emma

    2016-08-01

    The Federal Emergency Management Agency has introduced the concept of the "1-percent plus" flow to incorporate various uncertainties in estimation of the 100-year or 1-percent flow. However, to the best of the authors' knowledge, no clear directions for calculating the 1-percent plus flow have been defined in the literature. Although information about standard errors of estimation and prediction is provided along with the regression equations that are often used to estimate the 1-percent flow at ungaged sites, uncertainty estimation becomes more complicated when there is a nearby gaged station because regression flows and the peak flow estimate from a gage analysis should be weighted to compute the weighted estimate of the 1-percent flow. In this study, an equation for calculating the 1-percent plus flow at an ungaged site near a gaged station is analytically derived. Also, a detailed process is introduced for calculating the 1-percent plus flow for an ungaged site near a gaged station in Georgia as an example and a case study is performed. This study provides engineers and practitioners with a method that helps them better assess flood risks and develop mitigation plans accordingly.
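The weighting step described above, combining a regression-equation estimate with a gage-based estimate before adding an uncertainty margin, can be sketched as a generic inverse-variance weighting in log space. The function below is an illustrative sketch, not the equation derived in the paper; the flow values and log10-space variances are made-up assumptions:

```python
import math

def weighted_log_estimate(q_regression, var_regression, q_gage, var_gage):
    """Combine a regression flow estimate with a gage-based estimate by
    inverse-variance weighting in log10 space (a common weighting scheme
    for flood-frequency estimates; details here are illustrative)."""
    log_qr, log_qg = math.log10(q_regression), math.log10(q_gage)
    w_r, w_g = 1.0 / var_regression, 1.0 / var_gage
    log_qw = (w_r * log_qr + w_g * log_qg) / (w_r + w_g)
    var_qw = 1.0 / (w_r + w_g)      # variance of the weighted log estimate
    return 10 ** log_qw, var_qw

# Hypothetical 1-percent (100-year) flow estimates and variances:
q_w, var_w = weighted_log_estimate(12000.0, 0.040, 9000.0, 0.020)
# "1-percent plus" idea: inflate the weighted estimate by one standard
# error of the log estimate to account for estimation uncertainty.
q_plus = q_w * 10 ** math.sqrt(var_w)
```

The weighted estimate falls between the two inputs, closer to the lower-variance gage value, and the "plus" flow exceeds it by one log-space standard error.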

  7. Application of a stent splint to protect intraoral organs from radiation injury in a 97-year-old patient with multiple oral cancers who survived past 100 years of age

    Energy Technology Data Exchange (ETDEWEB)

    Yanagisawa, Shigetaka; Kawamura, Tetsuo; Shimizu, Masatsugu; Aoki, Hirooki; Mizuki, Harumi; Ashizawa, Akira (Oita Medical Coll., Hasama (Japan))

    1989-06-01

Radiation therapy has been used with increasing frequency in recent years in the management of oral cancers in patients of advanced age. In such cases, good care must be taken to maintain the oral health of patients receiving cancerocidal doses of radiation. Using splints as tissue displacers during radiation, we were able to treat a 99-year-old female patient without serious radiation sequelae, and she survived past 100 years of age. When she first visited us at 97 years of age, the primary lesions, located on the left upper lip, nose, and upper and lower gums, were diagnosed histologically as multiple verrucous carcinoma. Seventeen months after the first radiotherapy to the lip, nose, and upper jaw, we planned a second course of radiotherapy for the recurrent tumor of the lower gum. In order to eliminate or minimize the side effects of the second irradiation on the contiguous intraoral organs, we devised a splint to keep the tongue and upper gum out of the radiation field. The splint, acting as a tissue displacer, was made of heat-cured acrylic resin and divided into two pieces shaped like full dentures without artificial teeth, which were applied to the upper and lower jaws. The lower piece had a large wing to exclude the tongue from the irradiation field. After setting of the splint, the patient clenched slightly with the aid of a chin cap. We were then able to complete the radiotherapy with 10 MV X-rays at 40 Gy as scheduled, without serious complications. (author).

  8. Upwelling and anthropogenic forcing on phytoplankton productivity and community structure changes in the Zhejiang coastal area over the last 100 years

    Institute of Scientific and Technical Information of China (English)

    DUAN Shanshan; XING Lei; ZHANG Hailong; FENG Xuwen; YANG Haili; ZHAO Meixun

    2014-01-01

Phytoplankton productivity and community structure in marginal seas have been altered significantly during the past three decades, but it is still a challenge to distinguish between the forcing mechanisms of climate change and anthropogenic activities. High time-resolution biomarker records of two 210Pb-dated sediment cores (#34: 28.5°N, 122.272°E; CJ12-1269: 28.861 9°N, 122.515 3°E) from the Min-Zhe coastal mud area were compared to reveal changes in phytoplankton productivity and community structure over the past 100 years. Phytoplankton productivity started to increase gradually from the 1970s and increased rapidly after the late 1990s at Site #34; it started to increase gradually from the mid-1960s and increased rapidly after the late 1980s at Site CJ12-1269. Productivity at Core CJ12-1269 was higher than at Core #34. Phytoplankton community structure variations displayed opposite patterns in the two cores. The decreasing D/B (dinosterol/brassicasterol) ratio of Core #34 since the 1960s revealed an increased diatom contribution to total productivity. In contrast, the increasing D/B ratio of Core CJ12-1269 since the 1950s indicated an increased dinoflagellate contribution to total productivity. Both the productivity increase and the increased dinoflagellate contribution in Core CJ12-1269 since the 1950s-1960s were mainly caused by anthropogenic activities, as the site is closer to the Changjiang River Estuary, with higher nutrient concentrations and decreasing Si/N ratios. The increased diatom contribution in Core #34, however, is proposed to have been caused by increased coastal upwelling, with higher nutrient concentrations and higher Si/N ratios.

  9. Fractionation, transfer, and ecological risks of heavy metals in riparian and ditch wetlands across a 100-year chronosequence of reclamation in an estuary of China

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Rong [State Key Laboratory of Water Environment Stimulation, School of Environment, Beijing Normal University, Beijing 100875 (China); School of Nature Conservation, Beijing Forestry University, Beijing 100083 (China); Bai, Junhong, E-mail: junhongbai@163.com [State Key Laboratory of Water Environment Stimulation, School of Environment, Beijing Normal University, Beijing 100875 (China); Lu, Qiongqiong; Zhao, Qingqing; Gao, Zhaoqin; Wen, Xiaojun; Liu, Xinhui [State Key Laboratory of Water Environment Stimulation, School of Environment, Beijing Normal University, Beijing 100875 (China)

    2015-06-01

    The effect of reclamation on heavy metal concentrations and the ecological risks in ditch wetlands (DWs) and riparian wetlands (RWs) across a 100-year chronosequence in the Pearl River Estuary of China was investigated. Concentrations of 4 heavy metals (Cd, Cu, Pb, and Zn) in soil and plant samples, and sequential extracts of soil samples were determined, using inductively coupled plasma atomic absorption spectrometry. Results showed that heavy metal concentrations were higher in older DW soils than in the younger ones, and that the younger RW soils contained higher heavy metal concentrations compared to the older ones. Although the increasing tendency of heavy metal concentrations in soil was obvious after wetland reclamation, the metals Cu, Pb, and Zn exhibited low or no risks to the environment based on the risk assessment code (RAC). Cd, on the other hand, posed a medium or high risk. Cd, Pb, and Zn were mainly bound to Fe–Mn oxide, whereas most of Cu remained in the residual phase in both ditch and riparian wetland soils, and the residual proportions generally increased with depth. Bioconcentration and translocation factors for most of these four heavy metals significantly decreased in the DWs with older age (p < 0.05), whereas they increased in the RWs with younger age (p < 0.05). The DW soils contained higher concentrations of heavy metals in the organic fractions, whereas there were more carbonate and residual fractions in the RW soils. The non-bioavailable fractions of Cu and Zn, and the organic-bound Cd and Pb significantly inhibited plant growth. - Highlights: • Heavy metals in ditch wetland accumulated with increasing reclamation history. • Heavy metals exist in the Fe–Mn oxides and residual fractions in both wetlands. • Cd posed a medium to high environmental risk while low risk for other metals. • Long reclamation history caused lower BCFs and TFs in DWs and higher levels in RWs. • RW soils contained more heavy metals in the carbonate
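The risk assessment code (RAC) classification used above grades a metal's environmental risk by the share of its total concentration held in the most mobile (exchangeable plus carbonate-bound) fractions. A minimal sketch using commonly cited RAC thresholds follows; the paper's exact scheme and cut-offs are an assumption here:

```python
def rac_category(exchangeable_pct, carbonate_pct):
    """Risk Assessment Code: classify ecological risk from the percentage
    of a metal in the exchangeable + carbonate-bound phases.
    Thresholds are the commonly cited ones (<1, 1-10, 11-30, 31-50, >50%);
    the study's exact scheme may differ."""
    mobile = exchangeable_pct + carbonate_pct
    if mobile < 1:
        return "no risk"
    if mobile <= 10:
        return "low risk"
    if mobile <= 30:
        return "medium risk"
    if mobile <= 50:
        return "high risk"
    return "very high risk"

# Hypothetical Cd sample with 25% of its total in the mobile fractions:
category = rac_category(20.0, 5.0)  # -> "medium risk"
```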

  10. Changes in stable isotopes, lignin-derived phenols, and fossil pigments in sediments of Lake Biwa, Japan: implications for anthropogenic effects over the last 100 years.

    Science.gov (United States)

    Hyodo, Fujio; Tsugeki, Narumi; Azuma, Jun-Ichi; Urabe, Jotaro; Nakanishi, Masami; Wada, Eitaro

    2008-09-15

We measured stable nitrogen (N) and carbon (C) isotope ratios, lignin-derived phenols, and fossil pigments in sediments of known ages to elucidate the historical changes in the ecosystem status of Lake Biwa, Japan, over the last 100 years. Stable N isotope ratios and algal pigments in the sediments increased rapidly from the early 1960s to the 1980s and then remained relatively constant, indicating that eutrophication occurred in the early 1960s but ceased in the 1980s. Stable C isotope ratios of the sediment increased from the 1960s but decreased after the 1980s to the present. This decrease in stable C isotope ratios after the 1980s could not be explained by annual changes in either terrestrial input or algal production. However, when the C isotope ratios were corrected for the Suess effect, the shift toward more negative isotopic values in atmospheric CO2 caused by fossil fuel burning, the corrected values showed a trend consistent with the other biomarkers and the monitoring data. The trend was also mirrored by the relative abundance of lignin-derived phenols, a unique organic tracer of material originating from terrestrial plants, which decreased in the early 1960s and recovered to some degree in the 1980s. We detected no notable difference in the composition of lignin phenols, suggesting that the terrestrial plant composition did not change markedly. However, we found that the lignin accumulation rate increased around the 1980s. These results suggest that although eutrophication has stabilized since the 1980s, allochthonous organic matter input has changed in Lake Biwa over the past 25 years.
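The Suess-effect correction described above amounts to subtracting the fossil-fuel-driven decline of atmospheric δ13C from each measured sediment value. The sketch below uses a simplified linear decline with an illustrative rate; it is not the correction curve actually applied in the study:

```python
def suess_correction(delta13c_measured, year, ref_year=1900,
                     rate_per_year=-0.015):
    """Remove the Suess effect (the decline of atmospheric delta13C from
    fossil fuel burning) by subtracting an assumed linear atmospheric
    trend since ref_year. rate_per_year (per mil/yr) is an illustrative
    figure, not a value from the paper."""
    atmospheric_shift = rate_per_year * max(0, year - ref_year)
    return delta13c_measured - atmospheric_shift

# Example: a 1990 sediment value of -26.5 per mil; the correction adds
# back the ~1.35 per mil atmospheric decline accumulated since 1900.
corrected = suess_correction(-26.5, 1990)  # -> -25.15
```

With a correction of this shape, the apparent post-1980s decrease in sediment δ13C can be separated from the purely atmospheric component.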

  11. From Rail-Oriented to Automobile-Oriented Urban Development and Back. 100 Years of Paradigm Change and Transport Policy in Berlin

    Directory of Open Access Journals (Sweden)

    Friedemann Kunst

    2016-10-01

Transport and its side effects are major problems in rapidly growing cities. Car traffic dominates these cities and pollutes the environment without being able to sufficiently secure the mobility of the urban population and goods. A paradigm shift in urban and transport policy will be necessary to change this situation. In spite of its different development dynamics, Berlin is an interesting example for discussing development strategies for rapidly growing cities, because in the course of more than 100 years a twofold paradigm shift has occurred in the city, both conceptually and practically: Berlin shifted from a city dominated by rail traffic to an automobile-oriented city, and then gradually transformed back into a city in which an intertwined system of public and non-motorized individual means of transport secures the mobility of the urban population. The interdependencies on the conceptual level between urban planning and transport policies, as well as on the practical level between urban structures and transport systems, can be studied using the example of Berlin. Experiences with the implementation of automobile-oriented planning and the special conditions in the first decade after reunification led to protests, reflection, and a revision of the transport policy. A strategically designed process of integrated planning has brought about a trend reversal and steered the development of transport in the direction of clearly formulated sustainability-oriented objectives. In this process, the reintegration of transport and spatial planning and a reorganization of institutional structures at the administrative level were of particular importance. Compact, rail-oriented settlement structures like those in the metropolitan region of Berlin make it easier to dispense with automobiles than sprawled structures. The residual role that qualitatively improved automobiles will take in the cities of the future will have to be determined by research and

  12. Central control of information transmission through the intraspinal arborizations of sensory fibers examined 100 years after Ramón y Cajal.

    Science.gov (United States)

    Rudomin, Pablo

    2002-01-01

About 100 years ago, Santiago Ramón y Cajal reported that sensory fibers entering the spinal cord have ascending and descending branches, and that each of them sends collaterals to the gray matter, where they have profuse ramifications. To him this was a fundamental discovery, and he proposed that the intraspinal branches of the sensory fibers were "centripetal conductors by which sensory excitation is propagated to the various neurons in the gray matter". In addition, he assumed that "conduction of excitation within the intraspinal arborizations of the afferent fibers would be proportional to the diameters of the conductors", and that excitation would preferentially flow through the coarsest branches. The invariability of some elementary reflexes, such as the knee jerk, would be the result of a long history of plastic adaptations and natural selection of the safest neuronal organizations. There is now evidence suggesting that in the adult cat, the intraspinal branches of sensory fibers are not hard-wired routes that distribute excitation to spinal neurons in an invariable manner, but rather dynamic pathways in which excitation flow can be centrally addressed to reach specific neuronal targets. This central control of information flow is achieved by means of specific sets of GABAergic interneurons that produce primary afferent depolarization (PAD) via axo-axonic synapses and reduce transmitter release (presynaptic inhibition). The PAD produced by single GABAergic interneurons, or by small groups of them, in group I muscle afferents can remain confined to some sets of intraspinal arborizations of the afferent fibers and not spread to nearby collaterals. In muscle spindle afferents, this local character of PAD allows cutaneous and descending inputs to differentially inhibit the PAD in segmental and ascending collaterals of individual fibers, which may be an effective way to decouple the information flow arising from common sensory inputs. 
This feature appears to play an important role

  13. Assessment and remediation of a historical pipeline release : tools, techniques and technologies applied to in-situ/ex-situ soil and groundwater remediation

    Energy Technology Data Exchange (ETDEWEB)

    Reid, N. [EBA Engineering Consultants Ltd., Calgary, AB (Canada); Kohlsmith, B. [Kinder Morgan Canada Inc., Calgary, AB (Canada)

    2008-07-01

Tools, techniques, and technologies applied to in-situ/ex-situ soil and groundwater remediation were presented as part of the assessment and remediation of a historical pipeline release. The presentation covered the initial assessment, remediation of hydrophobic soils, re-assessment, site-specific criteria, a remediation trial involving bioventing and chemical oxidation, and full-scale remediation. The pipeline release occurred in the summer of 1977. The event was followed by a complete surface remediation, with a significant amount of topsoil being removed and replaced. In 2004, a landowner complained of poor crop growth in four patches near the area of the historical spill. An initial assessment was undertaken, and several photographs were presented. It was concluded that a comprehensive assessment set the base for a careful, staged approach to the remediation of the site, including the establishment of site-specific criteria. The process was made possible by a high level of communication among all stakeholders, and the most appropriate solution for the site was realized. figs.

  14. Indications of progressive desiccation of the Transvaal Lowveld over the past 100 years, and implications for the water stabilization programme in the Kruger National Park

    Directory of Open Access Journals (Sweden)

    U. De V. Pienaar

    1985-12-01

All available rainfall statistics recorded for the Kruger National Park area since 1907, coupled with an analysis of all the historical climatological data on hand, appear to confirm the quasi-twenty-year oscillation in the precipitation pattern of the summer rainfall area, first pointed out by Tyson & Dyer (1975). The dendrochronological data obtained by Hall (1976) from a study of growth rings of a very old yellowwood tree (Podocarpus falcatus) in Natal also appear to indicate a superimposed long-term (80-100 year) pattern of alternating below-average and above-average rainfall periods. The historical data relating to climate in the park during the past century or two seem to bear out such a pattern. If this can be confirmed, it will be an enormous aid not only in wildlife-management planning, but also to agriculturists, demographic planners, and others. It would appear that the long, relatively dry rainfall period of 1860-1970, with its concomitant progressive desiccation of the area in question, has passed over into the next above-average rainfall era. This does not mean that there will be no further cataclysmic droughts during future rainfall trough periods; it is therefore wise to plan ahead to meet such contingencies. The present water distribution pattern in the park (natural plus artificial water) is conspicuously still well below that which pertained during dry seasons at the turn of the century, when the Sabi and Shingwedzi game reserves were proclaimed. It is the declared policy of the National Parks Board of Trustees to simulate natural regulating mechanisms as closely as possible. In consequence, the artificial water-for-game program is a long way from completion. 
The large numbers of game animals in the park (including dominant species such as elephant Loxodonta africana and buffalo Syncerus caffer) can no longer migrate out of the area to escape natural catastrophes (such as the crippling droughts of 1911-1917, the

  15. Uncertainty Analysis of Climate Warming During the Last 100 Years%近百年气候变暖的不确定性分析

    Institute of Scientific and Technical Information of China (English)

    赵宗慈; 王绍武; 罗勇; 江滢

    2009-01-01

    As mentioned in the IPCC report, the linear trend of the global annual mean surface air temperature during the last 100 years (1906 to 2005) is 0.74℃ (in the range of 0.56 to 0.92℃). The warming trend in China is 0.53~0.86℃ during the same period. But it should be emphasized that there are some uncertainties and gaps in the information about the recent climate warming, especially in some special regions such as China. The uncertainties of climate warming in China come from both the lack of observed data during the first half of the 20th century and the urbanization process (heat island effects) during the second half of the 20th century, which might contribute 25% of the total warming. The climate warming in China during the last 50 years is contributed not only by human activity, but also by urbanization, natural periodicity and decadal variability, dimming and brightening of the solar radiation, as well as other forcing factors such as solar activity, volcanic activity and interactions inside the climate system. The fractional uncertainties of future climate predictions and projections vary due to internal variability of the global climate system, climate model uncertainty and scenario uncertainty. The global climate models do not have adequately fine resolutions due to the lack of high-speed computers, resulting in inaccurate simulations, especially at regional scales. The parameterizations of the physical, chemical and biological processes in the global climate system are complicated. The present climate models can hardly describe those processes, interactions and feedback mechanisms, such as clouds-aerosols-radiation feedbacks. The future climate changes will be caused by both natural and anthropogenic forcing factors. It is difficult to predict solar and volcanic activities over a long period. The human activities in the future are provided as some scenarios, not the real human emissions. Therefore, the reliability of climate
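As an illustration only, the kind of linear trend reported above can be estimated by least-squares fitting of an annual anomaly series. The data below are synthetic (a built-in 0.74℃-per-century trend plus noise), not the observational record discussed in the abstract.

```python
import numpy as np

# Synthetic 1906-2005 annual temperature anomalies: a 0.0074 deg C/yr
# trend plus Gaussian noise (illustrative, not real observations).
rng = np.random.default_rng(42)
years = np.arange(1906, 2006)
anomaly = 0.0074 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

# Least-squares linear fit; slope is in deg C per year.
slope, intercept = np.polyfit(years, anomaly, 1)
trend_per_century = slope * 100.0
print(f"fitted trend: {trend_per_century:.2f} deg C per 100 years")
```

With realistic noise levels the fit recovers the built-in trend to within a few hundredths of a degree per century, which is why the quoted uncertainty range (0.56 to 0.92℃) is dominated by data gaps and urbanization effects rather than by the fitting itself.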

  16. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools have made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A considerable number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  17. Performance-driven design with the support of digital tools: Applying discrete event simulation and space syntax on the design of the emergency department

    Directory of Open Access Journals (Sweden)

    David Morgareidge

    2014-09-01

    This case study demonstrates that DES and SSA are effective tools for facilitating decision-making related to design, reducing capital and operational costs, and improving organizational performance. DES focuses on operational processes and care flow. SSA complements DES with its strength in linking space to human behavior. Combining both tools can lead to high-performance ED design and can extend to broad applications in health care.
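To make the DES idea concrete, here is a minimal sketch of the kind of quantity a discrete event simulation of patient flow computes. It assumes a single treatment room with invented exponential inter-arrival and service times; it is not the authors' ED model.

```python
import random

# Minimal discrete-event sketch of patient flow through one treatment
# room (an M/M/1-style queue). All rates are invented for illustration.
random.seed(1)

def simulate(n_patients=1000, mean_interarrival=10.0, mean_service=8.0):
    """Return the average patient wait (minutes) before treatment starts."""
    arrivals, t = [], 0.0
    for _ in range(n_patients):
        t += random.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)

    free_at = 0.0          # time at which the room next becomes free
    total_wait = 0.0
    for arrival in arrivals:
        start = max(arrival, free_at)   # patient waits if the room is busy
        total_wait += start - arrival
        free_at = start + random.expovariate(1.0 / mean_service)
    return total_wait / n_patients

print(f"average wait: {simulate():.1f} min")
```

Even this toy model shows why DES supports design decisions: changing the number of rooms or the service-time distribution immediately changes the simulated waits, which can then be traded off against capital cost.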

  18. The Screening Tool of Feeding Problems Applied to Children (STEP-CHILD): Psychometric Characteristics and Associations with Child and Parent Variables

    Science.gov (United States)

    Seiverling, Laura; Hendy, Helen M.; Williams, Keith

    2011-01-01

    The present study evaluated the 23-item Screening Tool for Feeding Problems (STEP; Matson & Kuhn, 2001) with a sample of children referred to a hospital-based feeding clinic to examine the scale's psychometric characteristics and then demonstrate how a children's revision of the STEP, the STEP-CHILD is associated with child and parent variables.…

  19. 100 years of occupational safety research: From basic protections and work analysis to a multilevel view of workplace safety and risk.

    Science.gov (United States)

    Hofmann, David A; Burke, Michael J; Zohar, Dov

    2017-03-01

    Starting with initiatives dating back to the mid-1800s, we provide a high-level review of the key trends and developments in the application of applied psychology to the field of occupational safety. Factory laws, basic worker compensation, and research on accident proneness comprised much of the early work. Thus, early research and practice very much focused on the individual worker, the design of their work, and their basic protection. Gradually and over time, the focus began to navigate further into the organizational context. One of the early efforts to broaden beyond the individual worker was a significant focus on safety-related training during the middle of the 20th century. Toward the latter years of the 20th century and continuing the move from the individual worker to the broader organizational context, there was a significant increase in leadership and organizational climate (safety climate) research. Ultimately, this resulted in the development of a multilevel model of safety culture/climate. After discussing these trends, we identify key conclusions and opportunities for future research. (PsycINFO Database Record

  20. 手持式电动工具用轴承性能浅析%Brief Performance Analysis on the Bearings Applied to Hand-held Motor-operated Electric Tools

    Institute of Scientific and Technical Information of China (English)

    张永恩; 方承志; 宋贵州; 李兴林

    2014-01-01

    Taking deep groove ball bearings as a representative case, and leaving tool assembly and other factors aside, this paper combines the relevant sections of the power tool safety standards to discuss the main performance requirements of rolling bearings applied to hand-held motor-operated electric tools and their relationship with, and effect on, the safety standards. The content can also be referenced in technical design and quality appraisal.%以深沟球轴承为代表,在不考虑工具装配等因素的前提下,结合电动工具安全标准的相关章节,讨论并分析手持式电动工具用滚动轴承的主要性能和要求,可供设计选用和质量评价时参考。

  1. Applying standards to ICT models, tools and data in Europe to improve river basin networks and spread innovation on water sector

    Science.gov (United States)

    Pesquer, Lluís; Jirka, Simon; van de Giesen, Nick; Masó, Joan; Stasch, Christoph; Van Nooyen, Ronald; Prat, Ester; Pons, Xavier

    2015-04-01

    This work describes the strategy of the European Horizon 2020 project WaterInnEU. Its vision is to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to the water sector and to establish suitable conditions for new market opportunities based on these offerings. The main goals are: • Connect the research results and developments of previous EU-funded activities with the already existing data available at the European level, and also with the companies that are able to offer products and services based on these tools and data. • Offer an independent marketplace platform, complemented by technical and commercial expertise as a service, allowing users access to the products and services best fitting their priorities, capabilities and procurement processes. One of the pillars of WaterInnEU is to stimulate and prioritize the application of international standards in ICT tools and policy briefs. The standardization of formats, services and processes will allow for harmonized water management across different sectors, fragmented areas and scales (local, regional or international). Several levels of interoperability will be addressed: • Syntactic: connecting systems and tools together. Syntactic interoperability allows client and service tools to automatically discover, access, and process data and information (query and exchange parts of a database) and to be connected to each other in process chains. The discovery of water-related data is achieved using metadata cataloguing standards and, in particular, the one adopted by the INSPIRE directive: the OGC Catalogue Service for the Web (CSW). • Semantic: sharing a pan-European conceptual framework. This is the ability of computer systems to exchange data with unambiguous, shared meaning. The project therefore addresses not only the packaging of data (syntax), but also the simultaneous transmission of the meaning with the data (semantics). This is accomplished by linking

  2. Ammonia synthesis catalyst 100 years:Practice, enlightenment and challenge%氨合成催化剂100年:实践、启迪和挑战

    Institute of Scientific and Technical Information of China (English)

    刘化章

    2014-01-01

    Haber-Bosch发明的氨合成催化剂创立已经100周年。介绍了氨合成催化剂在理论和实践方面的发展、成就及其启迪,展望了氨合成催化剂的未来和面临的新挑战。催化合成氨技术在20世纪化学工业的发展中起着核心的作用。一个世纪以来,氨合成催化剂经历了Fe3O4基熔铁催化剂、Fe1-xO基熔铁催化剂、Ru基催化剂等发展阶段,以及钴钼双金属氮化物催化剂的发现。实践表明,氨合成催化剂是多相催化领域中许多基础研究的起点和试金石,没有别的反应象氨合成反应一样,能够把理论、模型催化剂和实验连接起来。催化合成氨反应仍然是多相催化理论研究的一个理想的模型体系。理解该反应机理并转换成完美技术成为催化研究领域发展的基本标准。这个永不结束的故事仍然没有结束。除了关于反应的基本步骤、真实结构、亚氮化物这些问题之外,催化合成氨在理论上一个新的挑战是关于在室温和常压下氨合成的预测,包括电催化合成氨、光催化合成氨和化学模拟生物固氮以及包括氮分子在内的催化化学研究中几种最稳定的小分子的活化方法等。%The ammonia synthesis catalyst invented by Haber and Bosch has reached its centenary. This paper reviews the development, achievements and lessons of ammonia synthesis catalysts in theory and practice, and looks ahead to their future and the new challenges they face. Catalytic ammonia synthesis technology played a central role in the development of the chemical industry during the 20th century. Over the past 100 years, ammonia synthesis catalysts have passed through successive stages such as Fe3O4-based fused iron catalysts, Fe1-xO-based fused iron catalysts and ruthenium-based catalysts, along with the discovery of cobalt-molybdenum bimetallic nitride catalysts. Practice shows that the ammonia synthesis catalyst has been the starting point and touchstone of much fundamental research in heterogeneous catalysis; no other reaction connects theory, model catalysts and experiment the way ammonia synthesis does. The reaction remains an ideal model system for theoretical studies of heterogeneous catalysis, and understanding its mechanism and turning that understanding into a mature technology has become a basic benchmark for the field. This never-ending story is still not over. Beyond questions about the elementary steps of the reaction, the real structure of the catalyst and subnitrides, a new theoretical challenge is the prediction of ammonia synthesis at room temperature and atmospheric pressure, including electrocatalytic and photocatalytic ammonia synthesis, chemical simulation of biological nitrogen fixation, and methods for activating the most stable small molecules in catalytic chemistry, including molecular nitrogen.

  3. Decision support tool for Virtual Power Players: Hybrid Particle Swarm Optimization applied to Day-ahead Vehicle-To-Grid Scheduling

    DEFF Research Database (Denmark)

    Soares, João; Valle, Zita; Morais, Hugo

    2013-01-01

    This paper presents a decision support tool methodology to help virtual power players (VPPs) in the Smart Grid (SG) context solve the day-ahead energy resource scheduling considering the intensive use of Distributed Generation (DG) and Vehicle-To-Grid (V2G). The main focus is the application...... of a new hybrid method combining a particle swarm approach and a deterministic technique based on mixed-integer linear programming (MILP) to solve the day-ahead scheduling, minimizing total operation costs from the aggregator point of view. A realistic mathematical formulation, considering the electric network...... constraints and V2G charging and discharging efficiencies, is presented. Full AC power flow calculation is included in the hybrid method to allow taking the network constraints into account. A case study with a 33-bus distribution network and 1800 V2G resources is used to illustrate the performance......
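For readers unfamiliar with the metaheuristic half of such hybrid methods, here is a bare-bones particle swarm optimisation sketch. The cost function is a toy quadratic, not the VPP scheduling objective, and the parameters are generic textbook values, not those of the paper.

```python
import random

# Minimal particle swarm optimisation (PSO) on a toy quadratic cost.
random.seed(0)

def cost(x):
    return sum(xi * xi for xi in x)

def pso(dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best point
    gbest = min(pbest, key=cost)[:]             # swarm's best point so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + pull toward personal best + pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso()
print(f"best point found: {best}")
```

In a hybrid scheme of the kind described above, a stochastic search like this explores the combinatorial decisions while a deterministic MILP solver refines the continuous dispatch, which is what keeps the day-ahead problem tractable at realistic network sizes.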

  4. Modified Linear Theory Aircraft Design Tools and Sonic Boom Minimization Strategy Applied to Signature Freezing via F-function Lobe Balancing

    Science.gov (United States)

    Jung, Timothy Paul

    Commercial supersonic travel has strong business potential; however, in order for the Federal Aviation Administration to lift its ban on supersonic flight overland, designers must reduce aircraft sonic boom strength to an acceptable level. An efficient methodology and associated tools for designing aircraft for minimized sonic booms are presented. The computer-based preliminary design tool, RapidF, based on modified linear theory, enables quick assessment of an aircraft's sonic boom, with run times under 30 seconds on a desktop computer. A unique feature of RapidF is that it tracks where on the aircraft each segment of the sonic boom came from, enabling precise modifications and speeding the design process. Sonic booms from RapidF are compared to flight test data, showing that it is capable of predicting sonic boom duration, overpressure, and interior shock locations. After the preliminary design is complete, scaled flight tests should be conducted to validate the low-boom design. When conducting such tests, it is insufficient to scale just the length; thus, equations to scale the weight and propagation distance are derived. Using RapidF, a conceptual supersonic business jet design is presented that uses F-function lobe balancing to create a frozen sonic boom using lifting surfaces. The leading shock is reduced from 1.4 to 0.83 psf, and the trailing shock from 1.2 to 0.87 psf, 41% and 28% reductions respectively. By changing the incidence angle of the surfaces, different sonic boom shapes can be created, allowing the lobes to be re-balanced for new flight conditions. Computational fluid dynamics is conducted to validate the sonic boom predictions. Off-design analysis is presented that varies weight, altitude, Mach number, and propagation angle, demonstrating that lobe balancing is robust. Finally, the Perceived Level of Loudness metric is analyzed, resulting in a modified design that incorporates other boom minimization techniques to further reduce

  5. 100 years of the physics of diodes

    Science.gov (United States)

    Zhang, Peng; Valfells, Ágúst; Ang, L. K.; Luginsland, J. W.; Lau, Y. Y.

    2017-03-01

    The Child-Langmuir Law (CL), discovered a century ago, gives the maximum current that can be transported across a planar diode in the steady state. As a quintessential example of the impact of space charge shielding near a charged surface, it is central to the studies of high current diodes, such as high power microwave sources, vacuum microelectronics, electron and ion sources, and high current drivers used in high energy density physics experiments. CL remains a touchstone of fundamental sheath physics, including contemporary studies of nanoscale quantum diodes and nano gap based plasmonic devices. Its solid state analog is the Mott-Gurney law, governing the maximum charge injection in solids, such as organic materials and other dielectrics, which is important to energy devices, such as solar cells and light emitting diodes. This paper reviews the important advances in the physics of diodes since the discovery of CL, including virtual cathode formation and extension of CL to multiple dimensions, to the quantum regime, and to ultrafast processes. We review the influence of magnetic fields, multiple species in bipolar flow, electromagnetic and time dependent effects in both short pulse and high frequency THz limits, and single electron regimes. Transitions from various emission mechanisms (thermionic-, field-, and photoemission) to the space charge limited state (CL) will be addressed, especially highlighting the important simulation and experimental developments in selected contemporary areas of study. We stress the fundamental physical links between the physics of beams to limiting currents in other areas, such as low temperature plasmas, laser plasmas, and space propulsion.
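The planar Child-Langmuir limit itself is a one-line formula, J = (4/9) ε₀ √(2e/m) V^{3/2}/d², and a quick numerical sketch makes the scale of the space-charge limit concrete. The gap voltage and spacing below are arbitrary example values.

```python
import math

# Child-Langmuir space-charge-limited current density for a planar
# vacuum diode: J = (4/9) * eps0 * sqrt(2e/m) * V^(3/2) / d^2.
EPS0 = 8.854e-12      # vacuum permittivity, F/m
E_CHARGE = 1.602e-19  # elementary charge, C
M_E = 9.109e-31       # electron rest mass, kg

def child_langmuir_current_density(voltage_v, gap_m):
    """Maximum steady-state current density (A/m^2) across a planar gap."""
    return (4.0 / 9.0) * EPS0 * math.sqrt(2.0 * E_CHARGE / M_E) \
        * voltage_v ** 1.5 / gap_m ** 2

j = child_langmuir_current_density(1000.0, 1e-3)  # 1 kV across a 1 mm gap
print(f"J_CL = {j:.3e} A/m^2")
```

The V^{3/2}/d² scaling is the signature of space-charge limitation; the extensions reviewed in the paper (multi-dimensional, quantum, ultrafast) modify the prefactor and scaling but start from this classical result.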

  6. Lurpak: Ready for another 100 years?

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    2001-01-01

    The Lur mark - the forerunner and very foundation of Lurpak butter - celebrates its 100th anniversary this year. That is an unusual and impressive lifetime for a consumer goods brand and something the Danish dairy sector can be proud of.

  7. 100-Year UPS,"Swifter" Olympics

    Institute of Scientific and Technical Information of China (English)

    Guo Yan; Yang Wei

    2007-01-01

    As the official logistics and express delivery sponsor of the 2008 Beijing Olympics, UPS will manage all logistical operations at the Olympic Test Events (formally known as "Good Luck Beijing" events) and the actual Games, through which the majority of equipment used at the events will flow.

  8. The 100 year DASCH Transient Search

    Science.gov (United States)

    Miller, George F.; Grindlay, J. E.; Tang, S.; Los, E.

    2014-01-01

    The Digital Access to a Sky Century at Harvard (DASCH) project is currently digitizing the roughly 500,000 photographic plates maintained by the Harvard College Observatory. The Harvard plate collection covers each point of the sky roughly 500 to 3000 times from 1885 to 1992, with limiting magnitudes ranging from B=14-18 mag and photometric accuracy within ±0.1 mag. Production scanning (up to 400 plates/day) is proceeding in Galactic coordinates from the North Galactic Pole and is currently at roughly 50 degrees galactic latitude. The vastness of these data makes the DASCH project ideal for searching for transient behavior. In particular, the large time base of the DASCH collection gives an unprecedented advantage when searching for outbursting systems with recurrence rates of decades or longer. These include recurrent novae, rare WZ Sge Cataclysmic Variables, blazars, X-ray binaries, and supernovae in the Virgo Supercluster. We report here the discovery of previously unidentified stellar-like objects that underwent abnormally large (Δm=5-9) outbursts discovered with DASCH. We also report the discovery of outbursts from previously quiet AM CVn stars and attempt to characterize their recurrence rates.

  9. Appraising Schumpeter's "Essence" after 100 years

    DEFF Research Database (Denmark)

    Andersen, Esben Sloth

    Schumpeter's unique type of evolutionary analysis can hardly be understood unless we recognise that he developed it in relation to a study of the strength and weaknesses of the Walrasian form of Neoclassical Economics. This development was largely performed in his first book 'Wesen und Hauptinhalt...

  10. 100 years of Weyl’s law

    Directory of Open Access Journals (Sweden)

    Victor Ivrii

    2016-08-01

    Full Text Available Abstract We discuss the asymptotics of the eigenvalue counting function for partial differential operators and related expressions paying the most attention to the sharp asymptotics. We consider Weyl asymptotics, asymptotics with Weyl principal parts and correction terms and asymptotics with non-Weyl principal parts. Semiclassical microlocal analysis, propagation of singularities and related dynamics play crucial role. We start from the general theory, then consider Schrödinger and Dirac operators with the strong magnetic field and, finally, applications to the asymptotics of the ground state energy of heavy atoms and molecules with or without a magnetic field.

  11. Analysis of 100 Years of Curriculum Designs

    Science.gov (United States)

    Kelting-Gibson, Lynn

    2013-01-01

    Fifteen historical and contemporary curriculum designs were analyzed for elements of assessment that support student learning and inform instructional decisions. Educational researchers are purposely paying attention to the role assessment plays in a well-designed planning and teaching process. Assessment is a vital component to educational…

  12. [100 years of the Babinski sign].

    Science.gov (United States)

    Estañol Vidal, B; Huerta Díaz, E; García Ramos, G

    1997-01-01

    In 1896 Joseph Francois Felix Babinski described for the first time the phenomenon of the toes. In his first paper he simply described extension of all toes with noxious stimulation of the sole of the foot. It was not until 1898 that he specifically described the extension of the hallux with stimulation of the lateral border of the sole. Babinski was probably not aware at the time that E. Remak, a German physician, had previously described the sign. In his third paper of 1903 Babinski concludes that if other authors had described the abnormal reflex before him, they found it fortuitously and did not realize its semiologic value. Babinski probably discovered it by a combination of chance, careful observation and intuition. He also had in mind practical applications of the sign particularly in the differential diagnosis with hysteria and in medico-legal areas. Several of his observations and the physiopathological mechanism proposed by him are still valid today. He realized since 1896 that the Babinski reflex was part of the flexor reflex synergy. He observed that several patients during the first hours of an acute cerebral or spinal insult had absent extensor reflexes. He realized that most patients with the abnormal reflex had weakness of the toes and ankles. He found a lack of correlation between hyperactive myotatic reflexes and the presence of an upgoing hallux. He discovered that not all patients with hemiplegia or paraplegia had the sign. He thought erroneously that some normal subjects could have an upgoing toe. His dream of a practical application of the sign has been fully achieved. The motto of Babinski was Observatio summa lex. Perhaps there is no better dictum in clinical neurology.

  13. Spinoff 2003: 100 Years of Powered Flight

    Science.gov (United States)

    2003-01-01

    Today, NASA continues to reach milestones in space exploration with the Hubble Telescope, Earth-observing systems, the Space Shuttle, the Stardust spacecraft, the Chandra X-Ray Observatory, the International Space Station, the Mars rovers, and experimental research aircraft; these are only a few of the many initiatives that have grown out of NASA engineering know-how to drive the Agency's missions. The technical expertise gained from these programs has transferred into partnerships with academia, industry, and other Federal agencies, ensuring America stays capable and competitive. With Spinoff 2003, we once again highlight the many partnerships with U.S. companies that are fulfilling the 1958 Space Act stipulation that NASA's vast body of scientific and technical knowledge also benefit mankind. This year's issue showcases innovations such as the cochlear implant in health and medicine, a cockpit weather system in transportation, and a smoke mask benefiting public safety; many other products are featured in these disciplines, as well as in the additional fields of consumer/home/recreation, environment and resources management, computer technology, and industrial productivity/manufacturing technology. Also in this issue, we devote an entire section to NASA's history in the field of flight and showcase NASA's newest enterprise dedicated to education. The Education Enterprise will provide unique teaching and learning experiences for students and teachers at all levels in science, technology, engineering, and mathematics. The Agency also is committed, as never before, to engaging parents and families through NASA's educational resources, content, and opportunities. NASA's catalyst to intensify its focus on teaching and learning springs from our mission statement: to inspire the next generation of explorers as only NASA can.

  14. Leadership: reflections over the past 100 years.

    Science.gov (United States)

    Gregoire, Mary B; Arendt, Susan W

    2014-05-01

    Leadership, viewed by the American Dietetic Association as the ability to inspire and guide others toward building and achieving a shared vision, is a much written-about topic. Research on leadership has addressed the topic using many different approaches, from a very simplistic definition of traits to a more complex process involving interactions, emotions, and learning. Thousands of books and papers have been published on the topic of leadership. This review paper will provide examples of the varying foci of the writings on this topic and includes references for instruments used to measure leadership traits and behaviors. Research is needed to determine effective strategies for preparing dietitians to be effective leaders and assume leadership positions. Identifying ways to help dietitians better reflect on their leadership experiences to enhance their learning and leadership might be one strategy to explore.

  15. Mendelism in human genetics: 100 years on.

    Science.gov (United States)

    Majumdar, Sisir K

    2003-01-01

    Genetics (from the Greek 'genes' = born) is a science without an objective past, but the idea of genetics has roamed the corridors of the human psyche since antiquity. Accounts of heritable deformities in humans often appear in myths and legends. The ancient Hindu caste system was based on the assumption that both desirable and undesirable traits are passed from generation to generation. In Babylonia, 60 birth defects were listed on clay tablets written around 5,000 years ago. The Jewish Talmud contains an accurate description of the inheritance of haemophilia, a human genetic disorder. The Upanisads (Vedanta, 800-200 BC) provide instructions for the choice of a wife, emphasizing that no heritable illness should be present and that the family should show evidence of good character for several preceding generations. Examples indicating that heritable human traits played a significant role in social customs are presented in this article.

  16. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations.

    Science.gov (United States)

    Pinho, Ludmila A G; Sá-Barreto, Lívia C L; Infante, Carlos M C; Cunha-Filho, Marcílio S S

    2016-04-15

    The aim of this work was the development of an analytical procedure using spectrophotometry for simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. In order to achieve this goal, the analysis of mixtures was performed applying the Lambert-Beer law through the absorbances of BNZ and ITZ at the wavelengths 259 and 321 nm, respectively. Diverse tests were carried out for development and validation of the method, which proved to be selective, robust, linear, and precise. The lower limits of detection and quantification demonstrate its sensitivity to quantify small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures proved to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form.
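The two-wavelength mixture analysis reduces, via the Lambert-Beer law, to a 2x2 linear system: each measured absorbance is a linear combination of the two concentrations. The sketch below illustrates this; the absorptivity matrix is invented for illustration and would in practice be measured from standards of BNZ and ITZ.

```python
import numpy as np

# Beer-Lambert mixture analysis: A = E @ c, where rows of E are the
# molar absorptivities at 259 nm and 321 nm (values here are made up).
E = np.array([[0.045, 0.012],   # 259 nm: (BNZ, ITZ) absorptivities
              [0.008, 0.051]])  # 321 nm: (BNZ, ITZ) absorptivities

c_true = np.array([10.0, 5.0])           # "unknown" concentrations, ug/mL
absorbances = E @ c_true                 # simulated two-wavelength reading
c_est = np.linalg.solve(E, absorbances)  # recover both concentrations
print(f"estimated concentrations: {c_est}")
```

The method is well-conditioned only when each analyte dominates at one of the chosen wavelengths, which is why the wavelengths 259 and 321 nm were selected for BNZ and ITZ respectively.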

  17. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide.

    Science.gov (United States)

    Darwish, Hany W; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include the Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibration methods used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully in the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated in the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively.
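Of the two multivariate approaches, PCR is the simpler to sketch: spectra are projected onto their leading principal components and concentrations are regressed on the resulting scores. The "spectra" below are synthetic, not real AML/VAL/HCT data, and the implementation is a minimal numpy-only illustration, not the authors' calibration.

```python
import numpy as np

# Minimal Principal Component Regression (PCR) on synthetic 3-component
# mixture spectra: project onto leading PCs, then least-squares regress.
rng = np.random.default_rng(0)

n_samples, n_wavelengths, n_components = 30, 50, 3
pure = rng.random((3, n_wavelengths))        # 3 pure-component "spectra"
conc = rng.random((n_samples, 3)) * 20.0     # training concentrations
X = conc @ pure + rng.normal(0.0, 1e-3, (n_samples, n_wavelengths))

# Center, take the leading right-singular vectors, regress on scores.
x_mean, y_mean = X.mean(axis=0), conc.mean(axis=0)
_, _, vt = np.linalg.svd(X - x_mean, full_matrices=False)
scores = (X - x_mean) @ vt[:n_components].T
coef, *_ = np.linalg.lstsq(scores, conc - y_mean, rcond=None)

def predict(spectra):
    """Concentrations predicted from (possibly new) spectra."""
    return ((spectra - x_mean) @ vt[:n_components].T) @ coef + y_mean

pred = predict(X)
print(f"max training error: {np.abs(pred - conc).max():.4f}")
```

PLS differs from PCR in choosing components that maximize covariance with the concentrations rather than spectral variance alone, which usually lets it reach the same accuracy with fewer components.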

  18. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations

    Science.gov (United States)

    Pinho, Ludmila A. G.; Sá-Barreto, Lívia C. L.; Infante, Carlos M. C.; Cunha-Filho, Marcílio S. S.

    2016-04-01

    The aim of this work was the development of an analytical procedure using spectrophotometry for simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. In order to achieve this goal, the analysis of mixtures was performed applying the Lambert-Beer law through the absorbances of BNZ and ITZ at the wavelengths 259 and 321 nm, respectively. Diverse tests were carried out for development and validation of the method, which proved to be selective, robust, linear, and precise. The lower limits of detection and quantification demonstrate its sensitivity to quantify small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures proved to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form.

  19. THE CASE STUDY TASKS AS A BASIS FOR THE FUND OF THE ASSESSMENT TOOLS AT THE MATHEMATICAL ANALYSIS FOR THE DIRECTION 01.03.02 APPLIED MATHEMATICS AND COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Dina Aleksandrovna Kirillova

    2015-12-01

    Full Text Available The modern reform of Russian higher education involves the implementation of a competence-based approach, the main idea of which is the practical orientation of education. Mathematics is a universal language for the description, modeling and study of phenomena and processes of different natures. Creating a fund of assessment tools for mathematical disciplines based on applied problems is therefore a topical task. The case method is the most appropriate means of monitoring learning outcomes, as it is aimed at bridging the gap between theory and practice. The aim of the research is the development of methodical materials for creating a fund of assessment tools based on case studies for mathematical analysis for the direction «Applied Mathematics and Computer Science». The aim follows from the contradiction between the need to introduce the case method into the educational process in higher education and the lack of study of the theoretical foundations of applying this method to mathematical disciplines, as well as the insufficient description of the process of creating case problems for use in monitoring learning outcomes.

  20. FAMUS (Flow Assurance by Management of Uncertainty and Simulation): a new tool for integrating flow assurance effects in traditional RAM (Reliability, Availability and Maintainability) analysis applied on a Norwegian Offshore System

    Energy Technology Data Exchange (ETDEWEB)

    Eisinger, Siegfried; Isaksen, Stefan; Grande, Oystein [Det Norske Veritas (DNV), Oslo (Norway); Chame, Luciana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Traditional RAM (Reliability, Availability and Maintainability) models fall short of taking flow assurance effects into account. In many oil and gas production systems, flow assurance issues like hydrate formation, wax deposition or particle erosion may cause a substantial share of production upsets. Flow assurance issues are complex and hard to quantify in a production forecast; however, without taking them into account the RAM model generally overestimates the predicted system production. This paper demonstrates the FAMUS concept, which is a method and a tool for integrating RAM and flow assurance into one model, providing a better foundation for decision support. FAMUS therefore utilises both discrete event and thermo-hydraulic simulation. The method is currently applied as a decision support tool in an early phase of the development of an offshore oil field on the Norwegian continental shelf. (author)
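To see why ignoring flow assurance inflates a RAM forecast, consider a toy Monte Carlo model that combines classic failure/repair cycles with an additional partial production penalty for flow-assurance upsets. All rates below are invented for illustration; this is not the FAMUS model, which couples discrete event simulation with thermo-hydraulic simulation.

```python
import random

# Toy RAM + flow-assurance Monte Carlo: exponential failure/repair
# cycles, plus a chance each running hour of a partial upset
# (e.g. hydrate formation reducing throughput). Rates are invented.
random.seed(7)

def simulate_production(hours=8760, mtbf=1000.0, mttr=24.0,
                        upset_prob_per_hour=0.001, upset_loss=0.3):
    """Return production efficiency in [0, 1] over the simulated horizon."""
    t, produced = 0.0, 0.0
    while t < hours:
        uptime = random.expovariate(1.0 / mtbf)       # run until failure
        for _ in range(int(min(uptime, hours - t))):
            # each running hour may suffer a partial flow-assurance upset
            loss = upset_loss if random.random() < upset_prob_per_hour else 0.0
            produced += 1.0 - loss
        t += uptime + random.expovariate(1.0 / mttr)  # add repair downtime
    return produced / hours

print(f"simulated production efficiency: {simulate_production():.3f}")
```

A pure RAM model would report only the failure/repair availability; the upset term lowers the forecast further, which is the gap FAMUS is designed to close with physically grounded thermo-hydraulic inputs instead of a flat probability.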

  1. Management Tools

    Science.gov (United States)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle program that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  2. Solar geometry tool applied to systems and bio-climatic architecture

    Energy Technology Data Exchange (ETDEWEB)

    Urbano, Antonio; Matsumoto, Yasuhiro; Aguilar, Jaime; Asomoza Rene [CIMVESTAV-IPN, Mexico, D.F (Mexico)

    2000-07-01

    The present article shows the annual solar paths by means of Cartesian graphs, as well as their use, taking as a basis astronomical, geographical and site data. These graphs indicate the hours of sunshine throughout the day, month and year for a latitude of 19 degrees north, as well as the hourly solar radiation values for the most important declinations occurring annually (equinoxes, solstices and the intermediate months). The graphs help the user evaluate obstacles in the surroundings and determine, on site, the shadows cast on solar equipment or buildings by mountains, trees, other buildings, windows, terraces, domes, et cetera, as well as the hours of sunshine or the radiation needed for the desired bio-climatic calculation. The present work is a site-engineering tool for architects, designers, builders, planners, installers and energy auditors, among others, who require the use of solar energy for any of its multiple applications.
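As an aside, the sun-position geometry behind such solar-path charts can be sketched in a few lines. The snippet below is an illustrative approximation (Cooper's declination formula and the standard elevation relation), not the authors' tool; the latitude of 19 degrees north is taken from the record.

```python
import math

def solar_declination(day_of_year: int) -> float:
    """Solar declination in degrees (Cooper's approximation)."""
    return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

def solar_elevation(latitude_deg: float, day_of_year: int, solar_hour: float) -> float:
    """Sun elevation angle in degrees; solar_hour = 12 is solar noon."""
    decl = math.radians(solar_declination(day_of_year))
    lat = math.radians(latitude_deg)
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))  # 15 degrees per hour
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_el))

# Noon sun at 19 degrees N on the June solstice (day 172): roughly 85.5 degrees.
noon_el = solar_elevation(19.0, 172, 12.0)
```

Sweeping `solar_hour` from sunrise to sunset reproduces one curve of the kind of solar-path chart the article describes.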

  3. Structural engineering developments in power plant cooling tower construction. 100 years of natural draught cooling towers - from tower cooler to cooling tower

    Energy Technology Data Exchange (ETDEWEB)

    Damjakob, H.; Depe, T.; Vrangos, V. (Balcke-Duerr AG, Ratingen (Germany))

    1992-06-01

    Almost exactly 100 years ago, tower-type structures were first used to produce artificial ventilation for cooling purposes. The shell of these so-called tower coolers, today known as 'natural draught cooling towers', was from the outset the subject of multiple structural engineering developments in respect of design, material, construction and static calculation. These developments were stimulated especially by the step-wise increase in the dimensions of power plant cooling towers and, more recently, by ecological requirements. (orig.).

  4. The extreme dry/wet events in northern China during recent 100 years

    Institute of Scientific and Technical Information of China (English)

    马柱国; 丹利; 胡跃文

    2004-01-01

    Using monthly precipitation and monthly mean temperature, a surface humidity index was proposed. Based on this index, the distribution characteristics of extreme dryness were analyzed. The results indicate an obvious increasing trend of extreme dryness in the central part of northern China and in northeastern China over the last 10 years, marking a high-frequency period of extreme dryness, whereas most of the last 100 years in these regions constituted a low-frequency period. Comparison with the temperature trends shows that the regions of frequent extreme dryness coincide with the regions of warming.
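The record does not give the exact definition of the surface humidity index, so the sketch below is a hypothetical stand-in: a precipitation-over-PET ratio with a crude temperature-driven PET proxy, plus a z-score rule for flagging extreme-dry months.

```python
from statistics import mean, pstdev

def humid_index(precip_mm: float, temp_c: float) -> float:
    """Toy humidity index: precipitation over a crude, temperature-driven
    potential-evapotranspiration proxy (the record gives no exact formula)."""
    pet = max(1.0, 10.0 + 5.0 * temp_c)  # hypothetical PET proxy, mm/month
    return precip_mm / pet

def extreme_dry_months(precip, temp, z_threshold=-1.0):
    """Indices of months whose humidity index is an extreme-dry anomaly."""
    idx = [humid_index(p, t) for p, t in zip(precip, temp)]
    mu, sigma = mean(idx), pstdev(idx)
    return [i for i, v in enumerate(idx) if (v - mu) / sigma < z_threshold]
```

Applied to a gridded record, counting flagged months per decade would give the kind of high/low-frequency-period comparison the abstract describes.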

  5. FoodChain-Lab: A Trace-Back and Trace-Forward Tool Developed and Applied during Food-Borne Disease Outbreak Investigations in Germany and Europe.

    Science.gov (United States)

    Weiser, Armin A; Thöns, Christian; Filter, Matthias; Falenski, Alexander; Appel, Bernd; Käsbohrer, Annemarie

    2016-01-01

    FoodChain-Lab is modular open-source software for trace-back and trace-forward analysis in food-borne disease outbreak investigations. Development of FoodChain-Lab has been driven by a need for appropriate software in several food-related outbreaks in Germany since 2011. The software allows integrated data management, data linkage, enrichment and visualization as well as interactive supply chain analyses. Identification of possible outbreak sources or vehicles is facilitated by calculation of tracing scores for food-handling stations (companies or persons) and food products under investigation. The software also supports consideration of station-specific cross-contamination, analysis of geographical relationships, and topological clustering of the tracing network structure. FoodChain-Lab has been applied successfully in previous outbreak investigations, for example during the 2011 EHEC outbreak and the 2013/14 European hepatitis A outbreak. The software is most useful in complex, multi-area outbreak investigations where epidemiological evidence may be insufficient to discriminate between multiple implicated food products. The automated analysis and visualization components would be of greater value if trading information on food ingredients and compound products was more easily available.
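The core trace-back/trace-forward idea can be sketched as reachability queries over a delivery graph. The station names and delivery records below are hypothetical, and this is not FoodChain-Lab's actual data model:

```python
from collections import defaultdict, deque

# Hypothetical delivery records: (supplier, recipient) pairs along the chain.
deliveries = [
    ("farm_A", "packer_1"), ("farm_B", "packer_1"),
    ("packer_1", "retailer_X"), ("packer_1", "retailer_Y"),
    ("farm_B", "retailer_Z"),
]

forward = defaultdict(set)   # supplier -> recipients (trace-forward)
backward = defaultdict(set)  # recipient -> suppliers (trace-back)
for src, dst in deliveries:
    forward[src].add(dst)
    backward[dst].add(src)

def trace(start, graph):
    """All stations reachable from `start` along `graph` (BFS)."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Trace-back: who could have supplied retailer_X?
# Trace-forward: who received goods originating from farm_B?
```

Tracing scores, cross-contamination and geographic clustering, as described in the abstract, are refinements layered on top of this basic reachability structure.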

  6. Applying Genetic Algorithms and RIA technologies to the development of Complex-VRP Tools in real-world distribution of petroleum products

    Directory of Open Access Journals (Sweden)

    Antonio Moratilla Ocaña

    2014-12-01

    Full Text Available Distribution problems have generated a large body of research and development covering the VRP problem and its many variants, but few investigations examine it as an information system, and far fewer address how it should be approached from a development and implementation point of view. This paper describes the characteristics of a real information system for fuel-distribution problems at country scale, joining VRP research and development based on genetic algorithms with the design of a web-based information system. A view of the traditional workflow in this area is shown, together with the new approach on which the proposed system is based. Taking into account all constraints in the field, the authors have developed a web-based VRP solution using genetic algorithms, with multiple web frameworks for each architecture layer, focusing on functionality and usability in order to minimize human error and maximize productivity. To achieve these goals, the authors used SmartGWT as a powerful web-based RIA single-page-application framework with Java integration, together with multiple server frameworks and OSS-based solutions, applied to the development of a very complex VRP system for a logistics operator of petroleum products.
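As a toy illustration of the genetic-algorithm core of such a system (not the authors' implementation), the sketch below evolves a single closed route over a distance matrix with order crossover and swap mutation; a real VRP adds vehicles, capacities and time windows:

```python
import random

def route_length(route, dist):
    """Total closed-tour length over the distance matrix."""
    return sum(dist[route[i]][route[(i + 1) % len(route)]]
               for i in range(len(route)))

def order_crossover(p1, p2):
    """OX crossover: keep a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    hole = p1[a:b]
    rest = [c for c in p2 if c not in hole]
    return rest[:a] + hole + rest[a:]

def swap_mutate(route, rate=0.2):
    route = route[:]
    if random.random() < rate:
        i, j = random.sample(range(len(route)), 2)
        route[i], route[j] = route[j], route[i]
    return route

def evolve(dist, pop_size=20, generations=50):
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda r: route_length(r, dist))
        survivors = pop[:pop_size // 2]          # elitist selection
        children = [swap_mutate(order_crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=lambda r: route_length(r, dist))
```

Permutation encodings with OX crossover keep every child a valid visiting order, which is why they are a common choice for routing GAs.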

  7. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  8. Predicting pathogen transport and risk of infection from land-applied biosolids

    Science.gov (United States)

    Olson, M. S.; Teng, J.; Kumar, A.; Gurian, P.

    2011-12-01

    Biosolids have been recycled as fertilizer to sustainably improve and maintain productive soils and to stimulate plant growth for over forty years, but may contain low levels of microbial pathogens. The Spreadsheet Microbial Assessment of Risk: Tool for Biosolids ("SMART Biosolids") is an environmental transport, exposure and risk model that compiles knowledge on the occurrence, environmental dispersion and attenuation of biosolids-associated pathogens to estimate microbial risk from biosolids land application. The SMART Biosolids model calculates environmental pathogen concentrations and assesses risk associated with exposure to pathogens from land-applied biosolids through five pathways: 1) inhalation of aerosols from land application sites, 2) consumption of groundwater contaminated by land-applied biosolids, 3) direct ingestion of biosolids-amended soils, 4) ingestion of plants contaminated by land-applied biosolids, and 5) consumption of surface water contaminated by runoff from a land application site. The SMART Biosolids model can be applied under a variety of scenarios, thereby providing insight into effective management practices. This study presents example results of the SMART Biosolids model, focusing on the groundwater and surface water pathways, following biosolids application to a typical site in Michigan. Volumes of infiltration and surface water runoff are calculated following a 100-year storm event. Pathogen transport and attenuation through the subsurface and via surface runoff are modeled, and pathogen concentrations in a downstream well and an adjacent pond are calculated. Risks are calculated for residents of nearby properties. 
For a 100-year storm event occurring immediately after biosolids application, the surface water pathway produces risks that may be of some concern, but best estimates do not exceed the bounds of what has been considered acceptable risk for recreational water use (Table 1); groundwater risks are very uncertain and at the
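The transport-plus-risk chain described above can be sketched generically: first-order attenuation during transport, then a dose-response model at the exposure point. All parameter values below are hypothetical and are not taken from the SMART Biosolids model:

```python
import math

def attenuate(c0: float, k_per_day: float, days: float) -> float:
    """First-order pathogen die-off/attenuation during transport."""
    return c0 * math.exp(-k_per_day * days)

def exponential_dose_response(dose: float, r: float = 0.01) -> float:
    """P(infection) for an ingested dose under the exponential model."""
    return 1.0 - math.exp(-r * dose)

# Hypothetical scenario: 1e5 organisms/L at the field edge, 2 days of travel
# with k = 1.5/day, and 0.05 L incidental ingestion at the receiving pond.
conc_pond = attenuate(1e5, 1.5, 2.0)
risk = exponential_dose_response(conc_pond * 0.05)
```

Chaining such steps per pathway, and per pathogen, is the general shape of a multi-pathway screening model like the one the abstract describes.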

  9. 100 years of power plant technology - 100 years of material technology

    Energy Technology Data Exchange (ETDEWEB)

    Schoch, W.

    1983-07-01

    The introduction of the steam turbine demanded a leap in steam boiler technology and the development of new boilers. The difficulties that occurred in this process owing to the lack of suitable steels are indicated in this paper; questions of design and manufacture had nevertheless not yet been resolved satisfactorily. With the founding of the VGB, the operators endeavoured to find solutions. Further developments up to the technical maturity of present-day conventional and nuclear power plant technology are described.

  10. A summary review of the photos of the ethnic groups in Yunnan in the past 100 years

    Institute of Scientific and Technical Information of China (English)

    尹绍亭

    2015-01-01

    This paper gives a comprehensive review of the photos of the ethnic groups in Yunnan taken by Chinese and international ethnologists and anthropologists in the past 100 years. These photos are divided into three historical periods according to their characteristics. The paper discusses the theoretical orientations and cultural implications, as well as the value, significance and weaknesses, of these photos as precious data of visual anthropology.

  11. 1,3:2,4-Dibenzylidene-D-sorbitol (DBS) and its derivatives--efficient, versatile and industrially-relevant low-molecular-weight gelators with over 100 years of history and a bright future.

    Science.gov (United States)

    Okesola, Babatunde O; Vieira, Vânia M P; Cornwell, Daniel J; Whitelaw, Nicole K; Smith, David K

    2015-06-28

    Dibenzylidene-D-sorbitol (DBS) has been a well-known low-molecular-weight gelator of organic solvents for over 100 years. As such, it constitutes a very early example of a supramolecular gel--a research field which has recently developed into one of intense interest. The ability of DBS to self-assemble into sample-spanning networks in numerous solvents is predicated upon its 'butterfly-like' structure, whereby the benzylidene groups constitute the 'wings' and the sorbitol backbone the 'body'--the two parts representing the molecular recognition motifs underpinning its gelation mechanism, with the nature of solvent playing a key role in controlling the precise assembly mode. This gelator has found widespread applications in areas as diverse as personal care products and polymer nucleation/clarification, and has considerable potential in applications such as dental composites, energy technology and liquid crystalline materials. Some derivatives of DBS have also been reported which offer the potential to expand the scope and range of applications of this family of gelators and endow the nanoscale network with additional functionality. This review aims to explain current trends in DBS research, and provide insight into how by combining a long history of application, with modern methods of derivatisation and analysis, the future for this family of gelators is bright, with an increasing number of high-tech applications, from environmental remediation to tissue engineering, being within reach.

  12. Review and Preview on the Development of the World Women's 100 m Running in Nearly 100 Years

    Institute of Scientific and Technical Information of China (English)

    王刚; 辛飞庆; 辛飞兵

    2014-01-01

    This paper aims to understand and grasp the changes in the world women's 100 m sprint over nearly 100 years and to explore its current and future development trends, using the methods of literature review, mathematical statistics and comparison. The results show that the Olympic women's record has been refreshed again and again, that United States sprinters still lead the world, that black sprinters have won the majority of Olympic women's 100 m titles, and that performance has approached the limit determined by female physiology.

  13. Bank erosion history of a mountain stream determined by means of anatomical changes in exposed tree roots over the last 100 years (Bílá Opava River — Czech Republic)

    Science.gov (United States)

    Malik, Ireneusz; Matyja, Marcin

    2008-06-01

    The date of exposure of spruce roots as a result of bank erosion was investigated on the Bílá Opava River in the northeastern Czech Republic. Following exposure, wood cells in the tree rings divide into early wood and late wood, and root cells within the rings become smaller and more numerous. These processes permit dating of the erosion episodes in which roots were exposed. Sixty root samples were taken from seven sampling sites selected on two riverbed reaches, and the results of root exposure dating were compared with historical data on flooding. Using this method, several erosion episodes were recorded over the last 100 years. The greatest bank erosion was recorded as a consequence of an extraordinary flood in July 1997. In the upper, rocky part of the valley studied, bank erosion often took place during large floods that occurred in the early 20th century. In the lower, alluvial part of the valley, erosion in the exposed roots was first recorded in 1973 and has been intensive ever since. It is suggested that banks in the lower part are more frequently undercut, which leads to the falling of trees within whose roots older erosion episodes were recorded. Locally, bank erosion is often intensified by 1- to 2-m boulders in the riverbed, which direct water into the parts of the banks where erosion occurs. Selective bank erosion could also be intensified by debris dams and by hillslope material supplied to the riverbed.

  14. Correlation Analysis Between El Nino/La Nina Phenomenon During the Recent 100 Years and Beijing Climate

    Institute of Scientific and Technical Information of China (English)

    刘桂莲; 张明庆

    2001-01-01

    Results of the analysis suggest that during the recent 100 years there exists a strong correlation between the El Nino/La Nina phenomenon and Beijing's summer (June-August) rainfall, mean monthly maximum temperature (July) and winter (January) mean monthly minimum temperature. The El Nino phenomenon shows a negative correlation with the summer rainfall and the winter mean monthly minimum temperature, and a positive correlation with the summer mean monthly maximum temperature, producing reduced rainfall, a larger annual temperature range and a more continental climate. The La Nina phenomenon shows a positive correlation with the summer rainfall and the winter mean monthly minimum temperature, and a negative correlation with the summer mean monthly maximum temperature, bringing increased rainfall, a smaller annual temperature range and a weaker continental character.
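Correlations of the kind reported here are typically Pearson coefficients; a minimal sketch (with made-up example series, not the study's data):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical ENSO-index values against summer rainfall anomalies (mm):
enso = [1.2, -0.5, 0.3, -1.1, 0.8, -0.2]
rain = [-30.0, 15.0, -5.0, 28.0, -18.0, 4.0]
```

A strongly negative `pearson_r(enso, rain)` would correspond to the El Nino/reduced-rainfall relationship the abstract reports.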

  15. The Causal Chain Analysis of Natural Factors for China Surface Temperature Variation during the Recent 100 Years

    Institute of Scientific and Technical Information of China (English)

    朱玉祥; 赵亮

    2014-01-01

    Using the Granger causality test, an attribution analysis of China's surface temperature (TC) over the recent 100 years is carried out from the angle of natural factors, namely an astronomical factor (the sunspot number, SSN) and earth-movement factors (the polar shift x and y directions). The results are as follows: (1) SSN is not a Granger cause of TC at any lag from 1 to 11 years; (2) confidence is highest at a lag of 6 years, at which the polar shift x direction is a Granger cause of TC (87% confidence); (3) at a lag of 12 years, TC is a Granger cause of the polar shift y direction (86% confidence). These results suggest that changes in the polar shift x direction may lead to changes in China's surface temperature six years later, and that changes in TC may in turn influence the polar shift y direction.
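A single-lag Granger test of the kind used here can be sketched as two OLS fits: regress the temperature series on its own lag, then on its own lag plus the candidate factor's lag, and compare residuals with an F-statistic. This is an illustrative stand-in, not the authors' code:

```python
import math

def _solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def _rss(X, y):
    """Residual sum of squares of an OLS fit via the normal equations."""
    k = len(X[0])
    A = [[sum(row[a] * row[b] for row in X) for b in range(k)] for a in range(k)]
    rhs = [sum(row[a] * yi for row, yi in zip(X, y)) for a in range(k)]
    beta = _solve(A, rhs)
    return sum((yi - sum(bj * xj for bj, xj in zip(beta, row))) ** 2
               for row, yi in zip(X, y))

def granger_f(x, y):
    """F-statistic for 'x Granger-causes y' using a single lag."""
    target = y[1:]
    restricted = [[1.0, y[t - 1]] for t in range(1, len(y))]
    unrestricted = [[1.0, y[t - 1], x[t - 1]] for t in range(1, len(y))]
    rss_r, rss_u = _rss(restricted, target), _rss(unrestricted, target)
    df = len(target) - 3  # observations minus unrestricted parameters
    return (rss_r - rss_u) / (rss_u / df)
```

The paper's 1-to-11-year lag scan generalizes this by adding more lagged columns to both regressions and adjusting the F-test's degrees of freedom.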

  16. Oil exploration and production activities after the flexibilizing of the strategical state monopoly in Brazil: environmental control tools applied by governmental bodies; Activites d'exploration et de production du petrole dans le nouveau scenario de flexibilite du monopole d'Etat au Bresil. Les controles gouvernementaux pour la protection de l'environnement

    Energy Technology Data Exchange (ETDEWEB)

    Malheiros, T.M.M. [IBAMA, Institut bresilien pour l' Environnement et les Ressources Naturelles Renouvelables, Rio de Janeiro, RI (Brazil); La Rovere, E.L. [Centro de Tecnologia, PPE/COPPE/UFRJ, Rio de Janeiro (Brazil)

    2000-07-01

    The goal of this paper is to discuss the environmental control tools applied by Brazilian governmental bodies to oil exploration and production activities after the flexibilizing of the strategic state monopoly in this sector. An analysis of the environmental control tools applied up to now is needed owing to the fast growth of these activities in the last few months and to the entrance of new players in the sector. This work presents the new scenario created by the flexibilizing of the state oil monopoly in Brazil and the current situation of the environmental control tools applied to oil exploration and production activities. There follow proposals for changes in the environmental licensing procedures and for the adoption of environmental audits, aiming at improved environmental control of these activities in the current Brazilian context. (authors)

  17. Applying LEGO Mindstorms robots as a teaching tool in Agricultural Information and Automation education

    Institute of Scientific and Technical Information of China (English)

    冯雷; 郭亚芳; 王若青; 沈明卫; 何勇

    2012-01-01

    The objective is to present details of a LEGO Mindstorms robot design challenge arranged for agriculture and biosystems engineering students at Zhejiang University. To cultivate students' creativity and encourage independent, exploratory learning, LEGO Mindstorms kits were adopted as a teaching tool in Precision Agriculture and other Agricultural Information and Automation courses, and an intelligent, programmable farm-machinery simulation system was developed. Different groups of students built robotic agricultural machines using the kits, with Robolab as the programming environment. A survey among 30 students showed that all were challenged by the projects and highly satisfied with the outcomes. They strongly agreed that the projects were effective in helping them work in teams, apply problem-solving techniques, and improve their programming and mechanical design skills, while fostering a good atmosphere of teamwork among students.

  18. Applied superconductivity

    CERN Document Server

    Newhouse, Vernon L

    1975-01-01

    Applied Superconductivity, Volume II, is part of a two-volume series on applied superconductivity. The first volume dealt with electronic applications and radiation detection, and contains a chapter on liquid helium refrigeration. The present volume discusses magnets, electromechanical applications, accelerators, and microwave and rf devices. The book opens with a chapter on high-field superconducting magnets, covering applications and magnet design. Subsequent chapters discuss superconductive machinery such as superconductive bearings and motors; rf superconducting devices; and future prospects.

  19. Micromachining with Nanostructured Cutting Tools

    CERN Document Server

    Jackson, Mark J

    2013-01-01

    The purpose of the brief is to explain how nanostructured tools can be used to machine materials at the microscale and how readers can apply nanostructured tools to micromachining applications. The book describes the application of nanostructured tools to the machining of engineering materials, includes methods for calculating basic features of micromachining, and explains the nature of contact between tools and workpieces to build a solid understanding of how nanostructured tools are made.

  20. Philosophical Foundation of Chinese Modernity and the Development of Chinese Philosophy in Recent 100 Years

    Institute of Scientific and Technical Information of China (English)

    沈清松

    2013-01-01

    Focusing on the problematics of Chinese modernity and its philosophical foundation, this paper divides the development of philosophy in China over the recent 100 years into four periods. In the first period, from 1911 to 1927, philosophers in China were the most enthusiastic in introducing Western modernity and its philosophical foundations in the various forms and doctrines of modern Western philosophy. This period was the most progressive and, indeed, impacted all Chinese intellectuals at the time. In the second period, from 1928 to 1949, during the time of national construction and the Japanese invasion, philosophers stepped behind, serving as helpers in clarifying and articulating the Chinese spirit and Chinese subjectivity at the time of its awakening. This prepared the philosophical foundation of a Chinese model of modernity. In the third period, from 1949 to 1980, some major philosophical figures built philosophical systems synthesizing Western and Chinese philosophies. This was most precious, and indeed unique, in comparison with other disciplines in the humanities and the social sciences; these systems can be seen as diverse attempts to lay the philosophical foundation of Chinese modernity. Now, in the later part of the fourth period, facing the challenges of globalization and postmodernity, Chinese philosophy should continue to explore conceptual systems, the life-world and Chinese spiritual resources in a cross-cultural and worldwide context, engaging in critical, rational and holistic philosophical reflection.

  1. Applied mathematics

    CERN Document Server

    Logan, J David

    2013-01-01

    Praise for the Third Edition: "Future mathematicians, scientists, and engineers should find the book to be an excellent introductory text for coursework or self-study as well as worth its shelf space for reference." -MAA Reviews. Applied Mathematics, Fourth Edition is a thoroughly updated and revised edition on the applications of modeling and analyzing natural, social, and technological processes. The book covers a wide range of key topics in mathematical methods and modeling and highlights the connections between mathematics and the applied and natural sciences.

  2. Four self-management assessment tools applied in the health education of CRF

    Institute of Scientific and Technical Information of China (English)

    寇琳; 官计; 张娅

    2012-01-01

    Objective: To study the feasibility and effectiveness of applying four self-management assessment tools in the health education of patients with chronic renal failure (CRF). Methods: According to the diagnostic criteria, 200 CRF patients in our hospital were selected and randomly divided into four groups; each group received the same form of health education. Before and after the education, four self-management assessment tools were used respectively to evaluate its effect: the multifaceted health status and health function index scale (group I), the chronic disease self-management research assessment scale (group II), the Partners health scale (group III) and the inpatient activity self-rating scale (group IV). Results: In each group the evaluation results before and after health education differed significantly (P < 0.05). The test-retest reliabilities of groups I, II, III and IV were 0.75, 0.71, 0.82 and 0.88, respectively (P < 0.05), and the Cronbach's alpha coefficients were 0.81, 0.71, 0.93 and 0.91 (P < 0.05). By comparison, the Partners health scale used in group III showed greater applicability and effectiveness in evaluating the effect of CRF health education. Conclusion: The Partners health scale has good reliability and validity when applied in the health education of CRF patients, better guarantees the authenticity, accuracy, validity and reliability of the results, and is worth clinical promotion.

  3. Geometric reasoning about assembly tools

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, R.H.

    1997-01-01

    Planning for assembly requires reasoning about various tools used by humans, robots, or other automation to manipulate, attach, and test parts and subassemblies. This paper presents a general framework to represent and reason about geometric accessibility issues for a wide variety of such assembly tools. Central to the framework is a use volume encoding a minimum space that must be free in an assembly state to apply a given tool, and placement constraints on where that volume must be placed relative to the parts on which the tool acts. Determining whether a tool can be applied in a given assembly state is then reduced to an instance of the FINDPLACE problem. In addition, the author presents more efficient methods to integrate the framework into assembly planning. For tools that are applied either before or after their target parts are mated, one method pre-processes a single tool application for all possible states of assembly of a product in polynomial time, reducing all later state-tool queries to evaluations of a simple expression. For tools applied after their target parts are mated, a complementary method guarantees polynomial-time assembly planning. The author presents a wide variety of tools that can be described adequately using the approach, and surveys tool catalogs to determine coverage of standard tools. Finally, the author describes an implementation of the approach in an assembly planning system and experiments with a library of over one hundred manual and robotic tools and several complex assemblies.
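The use-volume test described above can be sketched with axis-aligned boxes standing in for the general geometry (an assumption; the paper reduces the real test to a FINDPLACE instance):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Box:
    """Axis-aligned box from corner `lo` to corner `hi`, each an (x, y, z)."""
    lo: tuple
    hi: tuple

def overlaps(a: Box, b: Box) -> bool:
    """Strict interval overlap on every axis (touching faces do not count)."""
    return all(a.lo[i] < b.hi[i] and b.lo[i] < a.hi[i] for i in range(3))

def translate(box: Box, offset) -> Box:
    return Box(tuple(l + o for l, o in zip(box.lo, offset)),
               tuple(h + o for h, o in zip(box.hi, offset)))

def tool_applicable(use_volume: Box, placements, parts, target_idx) -> bool:
    """True if some allowed placement of the tool's use volume clears every
    part in the assembly state except the target part the tool acts on."""
    for offset in placements:
        placed = translate(use_volume, offset)
        if all(not overlaps(placed, p)
               for i, p in enumerate(parts) if i != target_idx):
            return True
    return False
```

Checking such a query per assembly state is the polynomial-time pre-processing the paper describes; real use volumes are general solids rather than boxes.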

  4. Forums as tools for theoretical discussion in Statistics applied to Administration

    Directory of Open Access Journals (Sweden)

    Daielly Melina Nassif Mantovani

    2010-08-01

    Full Text Available This paper aims to analyze the use of the forum tool in teaching Statistics applied to Administration. The study examined a forum held in a hybrid (partly on-campus) course, in which students were to discuss a case study they had to solve (a research project on the viability of a new business). Most enrolled students participated in the activity, probably because it was mandatory. Most messages were classified as relevant, correct and adequate, characterizing genuine debate. There was, however, considerable repetition, probably owing to the large number of students and to the concentration of participation on the last day of the activity. There was little access to the forum after the discussion closed, indicating that the content built there was little used as a reference source. Overall, the experience was successful, indicating that this tool can indeed be used as a pedagogical strategy for teaching statistics, provided that careful planning is undertaken.

  5. Applied Enzymology.

    Science.gov (United States)

    Manoharan, Asha; Dreisbach, Joseph H.

    1988-01-01

    Describes some examples of chemical and industrial applications of enzymes. Includes a background, a discussion of structure and reactivity, enzymes as therapeutic agents, enzyme replacement, enzymes used in diagnosis, industrial applications of enzymes, and immobilizing enzymes. Concludes that applied enzymology is an important factor in…

  6. Applied dynamics

    CERN Document Server

    Schiehlen, Werner

    2014-01-01

    Applied Dynamics is an important branch of engineering mechanics widely applied to mechanical and automotive engineering, aerospace and biomechanics as well as control engineering and mechatronics. The computational methods presented are based on common fundamentals. For this purpose analytical mechanics turns out to be very useful where D’Alembert’s principle in the Lagrangian formulation proves to be most efficient. The method of multibody systems, finite element systems and continuous systems are treated consistently. Thus, students get a much better understanding of dynamical phenomena, and engineers in design and development departments using computer codes may check the results more easily by choosing models of different complexity for vibration and stress analysis.

  7. Nutrient Dynamics over the Past 100 Years and Its Restoration Baseline in Dianshan Lake%淀山湖百年营养演化历史及营养物基准的建立

    Institute of Scientific and Technical Information of China (English)

    李小平; 陈小华; 董旭辉; 董志; 孙敦平

    2012-01-01

    Cyclotella bodanica, C. ocelata, Achnanthes minutissima, Cocconeis placentula var. lineata, Cymbella sp., Fragilaria pinnata, F. brevistriata and F. construens var. venter to recent eutrophic species including Cyclostephanos dubius, C. atomus, Stephanodiscus minutulus, S. hantzschii and Aulacoseira alpigena. The epilimnetic TP over the past 100 years, reconstructed using an established diatom-TP transfer function, matches well with monitoring TP where such data exist. Based on the sedimentary nutrient characteristics and the diatom-reconstructed nutrient dynamics, we propose a nutrient baseline for Dianshan Lake of 50-60 μg·L⁻¹ for water TP concentration, and of 500 mg·kg⁻¹ and 550 mg·kg⁻¹ for sedimentary TP and TN, respectively.
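Diatom-based TP reconstructions of the kind described above typically rest on a weighted-averaging (WA) transfer function: the inferred TP of a sediment sample is the abundance-weighted mean of the taxa's TP optima from a calibration set. A minimal sketch follows; the optima are invented for illustration, and the paper's actual calibrated function is not reproduced here.

```python
# Weighted-averaging (WA) inference, the standard building block of
# diatom-based transfer functions. Optima below are hypothetical.

def wa_reconstruct(abundances, optima):
    """abundances: taxon -> relative abundance; optima: taxon -> TP optimum (ug/L)."""
    common = [taxon for taxon in abundances if taxon in optima]
    total = sum(abundances[t] for t in common)
    return sum(abundances[t] * optima[t] for t in common) / total

optima = {
    "Cyclotella bodanica": 15.0,        # oligotrophic indicator (hypothetical optimum)
    "Achnanthes minutissima": 25.0,
    "Cyclostephanos dubius": 90.0,      # eutrophic indicator (hypothetical optimum)
    "Stephanodiscus hantzschii": 120.0,
}
sample = {"Cyclotella bodanica": 0.5, "Cyclostephanos dubius": 0.5}
print(wa_reconstruct(sample, optima))  # -> 52.5
```

Applied down a dated sediment core, sample by sample, this yields the TP trajectory that is then compared with monitoring data.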

  8. Disclosure as a regulatory tool

    DEFF Research Database (Denmark)

    2006-01-01

    The chapter analyses how disclosure can be used as a regulatory tool and how it has been applied so far in the areas of financial market law and consumer law.

  9. Applied Literature for Healing,

    Directory of Open Access Journals (Sweden)

    Susanna Marie Anderson

    2014-11-01

    In this qualitative research study, interviews conducted with elite participants serve to reveal the underlying elements that unite the richly diverse emerging field of Applied Literature. The basic interpretative qualitative method included a thematic analysis of the interview data, yielding numerous common elements that were then distilled into key themes elucidating the beneficial effects of engaging consciously with literature. These themes included developing a stronger sense of self in balance with an increasing connection with community; providing a safe container to engage challenging and potentially overwhelming issues from a stance of empowered action; and fostering a healing space for creativity. The findings provide grounds for uniting the work being done in a range of helping professions into a cohesive field of Applied Literature, which offers effective tools for healing, transformation and empowerment. Keywords: Applied Literature, Bibliotherapy, Poetry Therapy, Arts in Corrections, Arts in Medicine

  10. Applied combustion

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-12-31

    From the title, the reader is led to expect a broad practical treatise on combustion and combustion devices. Remarkably, for a book of modest dimension, the author is able to deliver. The text is organized into 12 chapters, broadly treating three major areas: combustion fundamentals -- introduction (Ch. 1), thermodynamics (Ch. 2), fluid mechanics (Ch. 7), and kinetics (Ch. 8); fuels -- coal, municipal solid waste, and other solid fuels (Ch. 4), liquid (Ch. 5) and gaseous (Ch. 6) fuels; and combustion devices -- fuel cells (Ch. 3), boilers (Ch. 4), Otto (Ch. 10), diesel (Ch. 11), and Wankel (Ch. 10) engines and gas turbines (Ch. 12). Although each topic could warrant a complete text on its own, the author addresses each of these major themes with reasonable thoroughness. Also, the book is well documented with a bibliography, references, a good index, and many helpful tables and appendices. In short, Applied Combustion does admirably fulfill the author's goal of a wide engineering science introduction to the general subject of combustion.

  11. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    resistance against abrasive wear, and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide-forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore, some steel types contain cobalt. Addition of alloying elements...

  12. 100 years' evolution of fisheries higher education and its strategic transformation in China%我国水产高等教育的百年沿革与战略转型

    Institute of Scientific and Technical Information of China (English)

    宁波

    2011-01-01

    In the early 20th century, in order to safeguard the country's maritime rights and interests and to develop the national fisheries industry, the governments of the Qing Dynasty and the Republic of China, learning from the experiences of Japan, the United States and other countries, began to develop China's fisheries education. After the founding of new China, Shanghai Fisheries College and other fisheries colleges were founded from 1952 onwards. In the 1950s and 1960s, drawing on the experience of the Soviet Union, the system of fisheries higher education was established in China. After decades of development, fisheries higher education has made great achievements and outstanding contributions to the development of the fisheries industry in China. In the late 20th century, to meet the needs of marine industry development, the self-development needs of fisheries higher education, and the needs of building a modern marine society, the fisheries colleges and universities changed their names to marine universities and transformed from single-discipline institutions into multi-disciplinary marine universities. This transformation has promoted the development of marine higher education in China. For its effective development, it is suggested to start from a higher point and take the road of international education, to develop basic and applied sciences in a coordinated way, to further optimize the structure of marine higher education, and to build a good three-dimensional linkage mechanism between the government, marine universities and society.

  13. 代码重构工具在面向对象教学的应用探索%Research on Apply of Refactor Tools in Object-Orient Programming Teaching

    Institute of Scientific and Technical Information of China (English)

    沈健

    2013-01-01

    This paper presents an approach to object-oriented programming lab teaching in which several teaching cases are designed and code refactoring tools are applied to restructure and improve the programs. The method improves students' understanding of code reuse and software refactoring, helps them grasp object-oriented ideas, and strengthens their programming ability.

  14. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  15. Applied ALARA techniques

    Energy Technology Data Exchange (ETDEWEB)

    Waggoner, L.O.

    1998-02-05

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down, and Hanford was given a new mission: to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and to involve radiological control and/or ALARA committee personnel early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread, and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work.

  16. Ludic Educational Game Creation Tool

    DEFF Research Database (Denmark)

    Vidakis, Nikolaos; Syntychakis, Efthimios; Kalafatis, Konstantinos

    2015-01-01

    creation tool features a web editor, where the game narrative can be manipulated according to specific needs. Moreover, this tool is applied to create an educational game following a reference scenario, namely teaching schoolchildren road safety. A ludic approach is used both in game creation and play...

  17. Celebrating 100 Years of Flight: Testing Wing Designs in Aircraft

    Science.gov (United States)

    Pugalee, David K.; Nusinov, Chuck; Giersch, Chris; Royster, David; Pinelli, Thomas E.

    2005-01-01

    This article describes an investigation involving several designs of airplane wings in trial flight simulations based on a NASA CONNECT program. Students' experiences with data collection and interpretation are highlighted. (Contains 5 figures.)

  18. Engineering and malaria control: learning from the past 100 years

    DEFF Research Database (Denmark)

    Konradsen, Flemming; van der Hoek, Wim; Amerasinghe, Felix P

    2004-01-01

    Traditionally, engineering and environment-based interventions have contributed to the prevention of malaria in Asia. However, with the introduction of DDT and other potent insecticides, chemical control became the dominating strategy. The renewed interest in environmental-management-based approaches...

  19. 100 Years of Curriculum History, Theory, and Research

    Science.gov (United States)

    Schoenfeld, Alan H.

    2016-01-01

    This article reviews a collection of papers written by the American Educational Research Association's first 50 presidents that deal specifically with curricular issues. It characterizes the ways in which curricula were conceptualized, implemented, and assessed, with an eye toward the epistemological and methodological framings that the authors…

  20. History of the Shaped Charge Effect: The First 100 Years

    Science.gov (United States)

    1990-03-22

    transferred, inasmuch as both originators of the effect were in proximity: southern Germany and Switzerland border each other. Dr. Mohaupt's... Mistel (Mistletoe) referred to the parasitic mounting of the top aircraft on the host aircraft. In the tactical version, the bomber's nose was replaced... in the patents (Ref. 32) issued in France in 1940 and in Australia in 1941, wherein the inventors (Mohaupt and his two associates) had claimed the

  1. Evolutionary Lightsailing Missions for the 100-Year Starship

    Science.gov (United States)

    Friedman, L.; Garber, D.; Heinsheimer, T.

    Incremental milestones towards interstellar flight can be achieved in this century by building on first steps with lightsailing, the only known technology that might someday take us to the stars. That this is now possible is enabled by the achievements of the first solar sail flights, the use of nano-technology for the miniaturization of spacecraft, advances in information processing, and the decoding of our genomes into transportable form. This paper quantifies a series of robotic steps through and beyond the solar system that are practical and would stimulate the development of new technologies in guidance, navigation, materials, communication, sensors, information processing, etc., while exploring ever-more distant, exciting space objectives at distances impractical for classical rocket-based technologies. These robotic steps may be considered precursors to human interstellar flight, but they may also be considered evolutionary steps that provide for a different future: one of virtual human interstellar flight that may bypass the ideas of the past (big rockets launching heavy people) in favour of those of the future: networking amongst the stars with information, and the physical transport of digital and biological genomes.

  2. [100 years of surgery in the Kosevo Hospital in Sarajevo].

    Science.gov (United States)

    Durić, O

    1994-01-01

    The Surgery Department of the Regional Hospital in Sarajevo opened on 1 July 1894, marking the beginning of the European surgery school's influence here. That school was then in the second half of its activity, the period better known as the "century of surgery". The building, fittings, equipment and staff followed the achievements of the Viennese school. The department was headed by the prominent European surgeon, primarius Dr Josef Preindisberger, first assistant to the great Dr Billroth. In this way the institution became a referral centre for the two other hospitals in Sarajevo, the Vakuf's and the Military Hospital, and for some 17 more in Bosnia and Herzegovina built in the course of ten years. Owing to its therapeutic success in general surgery and diseases of the eye, the annual reports show, the first 50 beds soon became insufficient for all those seeking treatment. The Department was therefore enlarged, and in 1905 a new regional hospital, intended to act as clinics, was planned; World War I stopped the plans. During the period of the Kingdom of Yugoslavia, the war-damaged Surgery Department continued its work with doctors trained to maintain the pre-war level. Faced with a broad range of pathology and a shortage of space, the then chief surgeon, primarius Milivoje Kostić, worked out in detail the former plan for the new hospital building with a base for clinics. It was accepted as a ten-year project which, regrettably, was not realized before World War II. (ABSTRACT TRUNCATED AT 250 WORDS)

  3. 100 years of public electricity supply in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Gersdorff, B. von

    1984-05-08

    On May 8th, 1884, Emil Rathenau founded Germany's first power utility company, the Municipal Electricity Works, Berlin, the predecessor of today's Berlin Power and Light Co. (Bewag). Rathenau had recognised the significance of Thomas Alva Edison's light bulb for a broad application of electricity. Block power plants with machine outputs of 150 PS were at the beginning of their development in 1884. Today, in nuclear power plants with machine outputs of around 1300 MW, the technology of power supply has reached the "large technology" prophesied by Rathenau at the start of the century. With today's challenge of environmental conservation, the power utilities are confronted with new tasks for the future.

  4. 100 Years of Attempts to Transform Physics Education

    Science.gov (United States)

    Otero, Valerie K.; Meltzer, David E.

    2016-01-01

    As far back as the late 1800s, U.S. physics teachers expressed many of the same ideas about physics education reform that are advocated today. However, several popular reform efforts eventually failed to have wide impact, despite strong and enthusiastic support within the physics education community. Broad-scale implementation of improved…

  5. Huck Finn: 100 Years of Durn Fool Problems.

    Science.gov (United States)

    Stanek, Lou Willett

    1985-01-01

    Discusses the censorship of Mark Twain's "Huckleberry Finn" since it was first published in 1885. Highlights include Twain's public image, viewpoints of censors, the banning of the book and school censorship cases, and the celebration of the centennial of "Huckleberry Finn." Nine references are cited. (EJS)

  6. Developing Resilient Children: After 100 Years of Montessori Education

    Science.gov (United States)

    Drake, Meg

    2008-01-01

    In this millennium, educators are faced with a number of issues that Dr. Maria Montessori could not have predicted. Today, students are different from the children Dr. Montessori observed in her "Casa dei Bambini." They are influenced by technology in all its forms. Some suffer from medical problems such as complex food allergies, which wreak…

  7. 100 years after the Marsica earthquake: contribute of outreach activities

    Science.gov (United States)

    D'Addezio, Giuliana; Giordani, Azzurra; Valle, Veronica; Riposati, Daniela

    2015-04-01

    Many outreach events have been proposed by the scientific community to celebrate the centenary of the January 13, 1915 earthquake that devastated the Marsica territory in the Central Apennines. The Laboratorio Divulgazione Scientifica e Attività Museali of the Istituto Nazionale di Geofisica e Vulcanologia (INGV's Laboratory for Outreach and Museum Activities) in Rome has realised an interactive exhibition in the Castello Piccolomini, Celano (AQ), to retrace the many aspects of the earthquake disaster in a region such as Abruzzo, affected by several destructive earthquakes during its history. The initiatives represent an ideal opportunity to develop new programmes of communication and training on seismic risk and to spread the culture of prevention. The INGV is accredited with the Servizio Civile Nazionale (National Civic Service), and volunteers have been involved in the project "Science and Outreach: a comprehensive approach to the divulgation of knowledge of Earth Sciences" since 2014. In this context, the volunteers contributed fully to the exhibition, in particular promoting and realising two panels concerning the social and environmental consequences of the Marsica earthquake. By describing the serious consequences of the earthquake, we can raise awareness about natural hazards and about the only effective action for earthquake defence: building with anti-seismic criteria. After studies and research conducted in libraries and on the web, two themes were developed: the serious problem of orphans and the difficult reconstruction. Heavy snowfalls and the presence of wolves coming down from the high, wild surrounding mountains complicated the scenario and slowed the rescue of the affected populations. It is important to underline that the earthquake was not the only devastating event in the country in 1915; another dramatic event was, in fact, the First World War.
Whole families died, and the surviving infants and children were sent to hospitals and other suitable structures in Rome. Many stories of poor orphans are known, but we decided to highlight stories that, despite the drama, had a happy ending. To grasp the scale of the tragedy, consider that more than fifty towns and villages were completely destroyed by the earthquake. The reconstruction was very difficult and slow, also because of the war, and involved the relocation of settlements to different places. The first shelters to be reconstructed were those for survivors: very small shacks built with anti-seismic criteria. They still stand on the territory, a symbol of the reconstruction and a reminder of the earthquake.

  8. [100 years of studying poliomyelitis virus and nonpoliomyelitis enteroviruses].

    Science.gov (United States)

    Lashkevich, V A

    2008-01-01

    M. P. Chumakov Institute of Poliomyelitis and Viral Encephalitis, Russian Academy of Medical Sciences, Moscow. The paper deals with the history of the discovery of poliomyelitis virus by K. Landsteiner and E. Popper in 1908, the identification of three immunological types of the virus in 1949, the discovery by J. Enders in 1949 of viral multiplication, with a cytopathogenic effect, in cultures of non-nerve cells, the development of new diagnostic techniques, and the design of the inactivated poliovirus vaccine by J. Salk in 1953 and of the live vaccine by A. Sabin in 1957. The advantages and disadvantages of these vaccines and the prospects for further poliomyelitis control are discussed. The characteristics and role of nonpoliomyelitis enteroviruses are considered, and the most important scientific discoveries made in the study of enteroviruses are noted.

  9. Mário de Sá-Carneiro, 100 years later

    Directory of Open Access Journals (Sweden)

    Paula Cristina Costa

    2016-12-01

    http://dx.doi.org/10.5007/2175-7917.2016v21n2p112 Mário de Sá-Carneiro's work remains current even a hundred years after his death: the modernity of his style is not limited to the modernist fashions of his time, but remains contemporary today. The work has a large thematic coherence. Throughout his poetry, prose and drama, the theme of the Self/Other recurs, together with the desire to achieve absolute perfection, like Icarus, and the failure of this fulfillment. In texts like the well-known poems «Quasi» and «7», as well as in the novel «A Confissão de Lúcio», this wish for the merging of the Self and the Other, and the impossibility of achieving it, is clearly present. Mário de Sá-Carneiro was, together with Fernando Pessoa, one of the founders of Portuguese Modernism and one of the directors of the «Orpheu» magazine. Both poets created several isms important for Portuguese Modernism: paulismo, interseccionismo, sensacionismo. Nevertheless, they remained loyal to their own styles.

  10. 100 YEARS OF AUDI%百年奥迪

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Since entering China in 1988, Audi has ten times been ranked "China Auto Customer Service Satisfaction" champion and "China Auto Sales Satisfaction" champion by J.D. Power, meanwhile becoming the first luxury car brand in the Chinese market to exceed one million units in cumulative sales. This success is due not only to the formidable technical background behind the idea of "Innovation through Technology," but also to the positive interaction between the brand's planning and local customers. On March 21, the "心动上海" Audi art exhibition opened at 沪申画廊; its curator, 顾振清, was born in Shanghai,

  11. The death of Florence Nightingale: BJN 100 years ago.

    Science.gov (United States)

    Castledine, Sir George

    This August marks the centenary of the death of Florence Nightingale, who died at 2 o'clock on Saturday 13 August 1910 at her home, 10 South Street, Park Lane, London. The following are some snippets which appeared in the BJN of the 20 and 27 August 1910. It was not until the announcement of her death in the morning papers of Monday 15 August that the country heard about Nightingale's death. In her last hours she was attended by Sir Thomas Barlow and two nurses from the Nursing Sisters' Institution, Devonshire Square, founded by Mrs Elizabeth Fry in 1840.

  12. 100 Years of Attempts to Transform Physics Education

    Science.gov (United States)

    Otero, Valerie K.; Meltzer, David E.

    2016-12-01

    As far back as the late 1800s, U.S. physics teachers expressed many of the same ideas about physics education reform that are advocated today. However, several popular reform efforts eventually failed to have wide impact, despite strong and enthusiastic support within the physics education community. Broad-scale implementation of improved instructional models today may be just as elusive as it has been in the past, and for similar reasons. Although excellent instructional models exist and have been available for decades, effective and scalable plans for transforming practice on a national basis have yet to be developed and implemented. Present-day teachers, education researchers, and policy makers can find much to learn from past efforts, both in their successes and their failures. To this end, we present a brief outline of some key ideas in U.S. physics education during the past 130 years. We address three core questions that are prominent in the literature: (a) Why and how should physics be taught? (b) What physics should be taught? (c) To whom should physics be taught? Related issues include the role of the laboratory and attempts to make physics relevant to everyday life. We provide here only a brief summary of the issues and debates found in primary-source literature; an extensive collection of historical resources on physics education is available at https://sites.google.com/site/physicseducationhistory/home.

  13. I. P. PAVLOV: 100 YEARS OF RESEARCH ON ASSOCIATIVE LEARNING

    Directory of Open Access Journals (Sweden)

    GERMÁN GUTIÉRREZ

    2005-07-01

    A biographical summary of Ivan Pavlov is presented, emphasizing his academic formation and achievements, and his contributions to general science and psychology. His main findings on associative learning are described, and three areas of current development in this area are discussed: the study of behavioral mechanisms, the study of neurobiological mechanisms, and the functional role of learning.

  14. Applied investigation on multimedia interactive teaching system of numerical control machine tools%多媒体视频互动系统在数控机床实训中的应用研究

    Institute of Scientific and Technical Information of China (English)

    施立钦

    2014-01-01

    The multimedia interactive teaching system for practical training on numerical control machine tools consists of several subsystems: a high-definition demonstration teaching system, a video monitoring system, a video display system, a voice intercom system, a centralized control system, and course recording; background software integrates them into an organic whole. The system addresses many of the problems currently facing numerical control technology programmes in higher vocational colleges, such as outdated teaching methods, poor practical teaching results, persistent safety hazards, and insufficient teacher resources. A new teaching mode was thus explored.

  15. C-TOOL

    DEFF Research Database (Denmark)

    Taghizadeh-Toosi, Arezoo; Christensen, Bent Tolstrup; Hutchings, Nicholas John

    2014-01-01

    Soil organic carbon (SOC) is a significant component of the global carbon (C) cycle. Changes in SOC storage affect atmospheric CO2 concentrations on decadal to centennial timescales. The C-TOOL model was developed to simulate farm- and regional-scale effects of management on medium- to long-term SOC storage in the profile of well-drained agricultural mineral soils. C-TOOL uses three SOC pools for both the topsoil (0–25 cm) and the subsoil (25–100 cm), and applies temperature-dependent first-order kinetics to regulate C turnover. C-TOOL also enables the simulation of 14C turnover. The simple model structure facilitates calibration and requires few inputs (mean monthly air temperature, soil clay content, soil C/N ratio and C in organic inputs). The model was parameterised using data from 19 treatments drawn from seven long-term field experiments in the United Kingdom, Sweden and Denmark...
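The temperature-dependent first-order turnover described above can be sketched as follows. This is a loose illustration of the pool-cascade idea, not C-TOOL itself: the pool names, rate constants, humification fractions and the Q10-style temperature response are hypothetical placeholders, not the model's calibrated parameterisation.

```python
# Illustrative first-order C turnover in a three-pool topsoil cascade
# (FOM -> HUM -> ROM). All parameter values are hypothetical.

def temp_factor(temp_c, q10=2.0, ref_temp=10.0):
    """Assumed Q10-type multiplier on decomposition rates."""
    return q10 ** ((temp_c - ref_temp) / 10.0)

def step_month(pools, rates, temp_c, c_input):
    """Advance the pools by one month.

    pools: pool name -> C stock (t C/ha); rates: base monthly decay (1/month).
    """
    f = temp_factor(temp_c)
    out = dict(pools)
    out["FOM"] += c_input                    # fresh organic matter input
    for name, successor, frac in (("FOM", "HUM", 0.3),
                                  ("HUM", "ROM", 0.1),
                                  ("ROM", None, 0.0)):
        decay = rates[name] * f * out[name]  # first-order kinetics
        out[name] -= decay
        if successor is not None:            # part of decayed C is humified
            out[successor] += frac * decay
    return out

pools = {"FOM": 2.0, "HUM": 40.0, "ROM": 15.0}
rates = {"FOM": 0.08, "HUM": 0.004, "ROM": 0.0001}
for t in (1, 3, 7, 12, 16, 18, 17, 14, 10, 6, 3, 1):  # mean monthly air temp, deg C
    pools = step_month(pools, rates, t, c_input=0.1)
print({k: round(v, 2) for k, v in pools.items()})
```

Running the step over many years with measured temperatures and C inputs yields the medium- to long-term SOC trajectories that models of this family are calibrated against.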

  16. Applied mechanics of solids

    CERN Document Server

    Bower, Allan F

    2009-01-01

    Modern computer simulations make stress analysis easy. As they continue to replace classical mathematical methods of analysis, these software programs require users to have a solid understanding of the fundamental principles on which they are based. Develop Intuitive Ability to Identify and Avoid Physically Meaningless Predictions Applied Mechanics of Solids is a powerful tool for understanding how to take advantage of these revolutionary computer advances in the field of solid mechanics. Beginning with a description of the physical and mathematical laws that govern deformation in solids, the text presents modern constitutive equations, as well as analytical and computational methods of stress analysis and fracture mechanics. It also addresses the nonlinear theory of deformable rods, membranes, plates, and shells, and solutions to important boundary and initial value problems in solid mechanics. The author uses the step-by-step manner of a blackboard lecture to explain problem solving methods, often providing...

  17. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
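The variable-selection methods covered in the new chapter (Lasso, SCAD, Elastic Net) all shrink coefficients toward zero; for the Lasso, the core one-dimensional mechanism is the soft-thresholding operator, sketched below. This is a generic pure-Python illustration, not material from the book.

```python
# Soft-thresholding: the exact solution of the 1-D Lasso problem
#   argmin_b 0.5*(b - z)**2 + lam*abs(b)
# Coefficients whose least-squares estimate z falls below the penalty lam
# are set exactly to zero -- this is what makes the Lasso select variables.

def soft_threshold(z, lam):
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

print([soft_threshold(z, 1.0) for z in (-3.0, -0.5, 0.2, 2.5)])  # -> [-2.0, 0.0, 0.0, 1.5]
```

Coordinate-descent Lasso solvers apply this operator to one coefficient at a time until convergence; SCAD and Elastic Net replace it with differently shaped thresholding rules.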

  18. Herramientas y propuestas de innovación basadas en la tecnología de realidad aumentada aplicadas a la literatura infantil y juvenil / Tools and proposals for innovation based on augmented reality technology applied to children's and young adult literature

    Directory of Open Access Journals (Sweden)

    Noelia Margarita Moreno Martínez

    2017-01-01

    In the information society in which we are immersed, the current proliferation and boom of devices such as smartphones, tablets and phablets is reflected in the assumption of new learning models, lifestyles, forms of communication, relationships and entertainment by the citizens of the new digital era. The development of strategies to implement these new technological tools in the classroom is therefore an opportunity to rethink educational practice in line with the new characteristics, demands and needs of the diverse student body being served, taking advantage of the possibilities offered by emerging technologies such as augmented reality (AR) under a new learning modality based on Mobile Learning. In this paper we review and analyse mobile applications based on augmented reality technology for Android and iOS environments, and then present proposals for activities implementing this technology in the teaching of children's and young adult literature.

  19. Mathematical tools

    Science.gov (United States)

    Capozziello, Salvatore; Faraoni, Valerio

    In this chapter we discuss certain mathematical tools which are used extensively in the following chapters. Some of these concepts and methods are part of the standard baggage taught in undergraduate and graduate courses, while others enter the tool-box of more advanced researchers. These mathematical methods are very useful in formulating ETGs and in finding analytical solutions. We begin by studying conformal transformations, which allow for different representations of scalar-tensor and f(R) theories of gravity, in addition to being useful in GR. We continue by discussing variational principles in GR, which are the basis for presenting ETGs in the following chapters. We close the chapter with a discussion of Noether symmetries, which are used elsewhere in this book to obtain analytical solutions.

  20. Research on the Acceptance of Web2.0 Tools Applied in Small and Micro Businesses’ Marketing Competitive Intelligence: A Case in Zhenjiang

    Institute of Scientific and Technical Information of China (English)

    宋新平; 朱鹏云

    2016-01-01

    Web2.0 tools are now in widespread use and have been applied to marketing competitive intelligence, improving the competitive advantage and performance of salespeople. Through a questionnaire survey and semi-structured interviews, this paper explores the acceptance of Web2.0 tools, and the problems encountered, in the marketing competitive intelligence work of small and micro businesses in the Zhenjiang area. The results indicate that awareness of Web2.0 tools in this setting is still low, most enterprises have not built a mature social-media communication platform, and the value of the various Web2.0 tools has not been fully developed, so considerable barriers to adoption remain. The paper concludes with suggestions for addressing these problems.

  1. 29 CFR 1915.133 - Hand tools.

    Science.gov (United States)

    2010-07-01

    Title 29 (Labor), vol. 7, 2010-07-01 edition, § 1915.133 Hand tools (Occupational Safety and Health Standards for Shipyard Employment, Tools and Related Equipment). The provisions of this section shall apply to ship repairing, shipbuilding and shipbreaking....

  2. RSP Tooling Technology

    Energy Technology Data Exchange (ETDEWEB)

    None

    2001-11-20

    RSP Tooling™ is a spray forming technology tailored for producing molds and dies. The approach combines rapid solidification processing and net-shape materials processing in a single step. The general concept involves converting a mold design described by a CAD file to a tooling master using a suitable rapid prototyping (RP) technology such as stereolithography. A pattern transfer is made to a castable ceramic, typically alumina or fused silica (Figure 1). This is followed by spray forming a thick deposit of a tooling alloy on the pattern to capture the desired shape, surface texture, and detail. The resultant metal block is cooled to room temperature and separated from the pattern. The deposit's exterior walls are machined square, allowing it to be used as an insert in a standard mold base. The overall turnaround time for tooling is about 3 to 5 days, starting with a master. Molds and dies produced in this way have been used in high volume production runs in plastic injection molding and die casting. A Cooperative Research and Development Agreement (CRADA) between the Idaho National Engineering and Environmental Laboratory (INEEL) and Grupo Vitro has been established to evaluate the feasibility of using RSP Tooling technology for producing molds and dies of interest to Vitro. This report summarizes results from Phase I of this agreement, and describes work scope and budget for Phase II activities. The main objective in Phase I was to demonstrate the feasibility of applying the Rapid Solidification Process (RSP) Tooling method to produce molds for the manufacture of glass and other components of interest to Vitro. This objective was successfully achieved.

  3. Tools for Authentication

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.

  4. A Chinese type 2 diabetic patient sports hindrance survey tool developed by applying the Rasch model

    Institute of Scientific and Technical Information of China (English)

    李庆雯; 朱为模; 李梅

    2014-01-01

    To develop and standardize a survey tool for sports hindrances among diabetic patients, and to clarify the major hindrances faced by type 2 diabetic patients in China, the authors applied recent educational/psychological measurement techniques to simplify the instrument so that it can be applied more readily to large populations. A questionnaire survey was administered to 197 type 2 diabetic patients (104 men, 93 women, mean age 53.6 years). The instrument was derived from an established American questionnaire covering 43 sports hindrances, used to rate the degree to which each hindrance impedes participation in sports. The data were analyzed with the Rasch model: the magnitude of each hindrance was determined from its logit value, the questionnaire was then shortened while preserving its original content structure, and the best reduced version was selected through correlation analysis. Lack of sports knowledge and lack of professional guidance were found to be the major hindrances for type 2 diabetic patients in China. Analysis with this item response theory (IRT) model showed that all four reduced versions correlated well with the original 43-hindrance version, and that the 16-hindrance version was the most efficient; its use is recommended.
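The logit scale used in this analysis can be illustrated with a crude first-pass calibration in code. The sketch below is hypothetical (invented response data, a simple proportion-based estimate); a full Rasch fit would estimate person and item parameters jointly:

```python
import math

def rasch_item_difficulties(responses):
    """First-pass item difficulties (in logits) from 0/1 endorsement data.

    Uses the logit of the proportion NOT endorsing each item, centered to
    mean zero. A full Rasch fit would iterate over person and item
    parameters jointly; this is only the usual starting approximation.
    """
    n_persons = len(responses)
    n_items = len(responses[0])
    diffs = []
    for j in range(n_items):
        endorsed = sum(row[j] for row in responses)
        # clamp so items endorsed by everyone/no one keep a finite logit
        p = min(max(endorsed / n_persons, 1e-6), 1 - 1e-6)
        diffs.append(math.log((1 - p) / p))
    mean = sum(diffs) / n_items
    return [d - mean for d in diffs]

# hypothetical data: 5 patients x 3 hindrance items (1 = hindrance applies)
data = [
    [1, 0, 0],
    [1, 1, 0],
    [1, 0, 1],
    [0, 0, 0],
    [1, 1, 0],
]
logits = rasch_item_difficulties(data)
# the rarely endorsed third item gets the largest (hardest) logit
```

Centering to mean zero puts all items on the common logit scale that the magnitude comparisons in the abstract rely on.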

  5. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes: a building component library configured to store a plurality of building components; a modeling tool configured to access the library and create a model of the building under analysis from building spatial data and selected components from the library; a building analysis engine configured to operate the model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline model to generate corresponding optimized energy models; and a recommendation tool configured to assess the optimized energy models against the baseline and generate recommendations for substitute building components or modifications.
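The baseline-versus-ECM workflow claimed here can be sketched in a few lines; all names and numbers below (component labels, kWh figures, savings fractions) are illustrative, not the patented system's API:

```python
# Baseline annual consumption by building component (illustrative kWh).
BASELINE_KWH = {"lighting": 12000, "hvac": 30000}

def apply_ecm(baseline, ecm):
    """Return a new energy model with one conservation measure applied."""
    component, savings_fraction = ecm
    model = dict(baseline)
    model[component] *= (1 - savings_fraction)
    return model

def recommend(baseline, ecms):
    """Rank ECMs by annual kWh saved relative to the baseline model."""
    base_total = sum(baseline.values())
    scored = []
    for name, ecm in ecms.items():
        optimized_total = sum(apply_ecm(baseline, ecm).values())
        scored.append((base_total - optimized_total, name))
    return sorted(scored, reverse=True)

ecms = {"LED retrofit": ("lighting", 0.5), "HVAC tune-up": ("hvac", 0.1)}
ranking = recommend(BASELINE_KWH, ecms)
# ranking: [(6000.0, 'LED retrofit'), (3000.0, 'HVAC tune-up')]
```

Each candidate measure is evaluated against the same baseline, mirroring the baseline-model-plus-optimized-models comparison in the abstract.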

  6. Applied epidemiology: another tool in dairy herd health programs?

    Science.gov (United States)

    Frankena, K; Noordhuizen, J P; Stassen, E N

    1994-01-01

    Databases of herd health programs mainly contain data on individual animals. Several parameters that determine herd performance can be calculated from these programs, and by comparing actual values with standard values, areas for further improvement of health (and production) can be identified. However, such advice is usually not backed up by proper statistical analyses. Moreover, data concerning the environment of the animals are not present, and hence advice concerning multifactorial diseases is based on common knowledge and experience. Veterinary epidemiology offers methods that might improve the value of herd health programs through the identification and quantification of factors and conditions contributing to multifactorial disease occurrence. Implementation of these methods within herd health programs will lead to more scientifically sound advice.

  7. Spectroscopic Tools Applied to Element Z = 115 Decay Chains

    Directory of Open Access Journals (Sweden)

    Forsberg U.

    2014-03-01

    Nuclides that are considered to be isotopes of element Z = 115 were produced in the reaction 48Ca + 243Am at the GSI Helmholtzzentrum für Schwerionenforschung Darmstadt. The detector setup TASISpec was used. It was mounted behind the gas-filled separator TASCA. Thirty correlated α-decay chains were found, and the energies of the particles were determined with high precision. Two important spectroscopic aspects of the offline data analysis are discussed in detail: the handling of digitized preamplified signals from the silicon strip detectors, and the energy reconstruction of particles escaping to upstream detectors relying on pixel-by-pixel dead-layer thicknesses.
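The per-pixel energy reconstruction mentioned here can be illustrated schematically. The dead-layer table and energies below are invented toy numbers, not TASISpec calibration data:

```python
# Toy sketch of per-pixel energy reconstruction for alpha particles that
# escape into an upstream detector: sum the signals from both detectors
# and add back the energy lost in the dead layers. The table below is an
# invented stand-in for measured pixel-by-pixel thicknesses.

DEAD_LAYER_LOSS_KEV = {(3, 7): 60.0, (12, 1): 45.0}  # (x, y) strip pair

def reconstruct_energy(pixel, e_implant_kev, e_upstream_kev):
    """Full alpha energy = implantation signal + escape signal + correction."""
    return e_implant_kev + e_upstream_kev + DEAD_LAYER_LOSS_KEV[pixel]

# escaping alpha: 2500 keV deposited before escape, 7200 keV upstream
total = reconstruct_energy((3, 7), 2500.0, 7200.0)
# total: 9760.0 keV
```

Keying the correction on the strip pair is what makes the reconstruction pixel-by-pixel rather than a single average dead-layer value.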

  8. Database Constraints Applied to Metabolic Pathway Reconstruction Tools

    Directory of Open Access Journals (Sweden)

    Jordi Vilaplana

    2014-01-01

    Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented by a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives an acceptable performance for low or medium size databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes.
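The populate-query-time loop behind this kind of access benchmarking can be sketched with a self-contained stand-in: SQLite replaces MySQL here so the example runs anywhere, and the table and column names are invented:

```python
import sqlite3
import time

# Populate a small annotated-genes table. SQLite stands in for MySQL so
# the sketch is self-contained; schema and names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE genes (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO genes (name) VALUES (?)",
    [(f"gene_{i}",) for i in range(10000)],
)
conn.commit()

# Time a batch of primary-key lookups, the kind of access pattern one
# would measure before and after changing server parameters.
start = time.perf_counter()
for i in range(1, 1001):
    row = conn.execute("SELECT name FROM genes WHERE id = ?", (i,)).fetchone()
elapsed = time.perf_counter() - start
# compare `elapsed` across configurations to quantify each tuning change
```

Repeating the timed loop under different server configurations (cache sizes, buffer parameters) is the tuning methodology the authors describe.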

  9. Scanning Probe Microscopy as a Tool Applied to Agriculture

    Science.gov (United States)

    Leite, Fabio Lima; Manzoli, Alexandra; de Herrmann, Paulo Sérgio Paula; Oliveira, Osvaldo Novais; Mattoso, Luiz Henrique Capparelli

    The control of materials properties and processes at the molecular level inherent in nanotechnology has been exploited in many areas of science and technology, including agriculture, where nanotech methods are used in the release of herbicides and the monitoring of food quality and environmental impact. Atomic force microscopy (AFM) and related techniques are among the most employed nanotech methods, particularly with the possibility of direct measurements of intermolecular interactions. This chapter presents a brief review of the applications of AFM in agriculture, which may be categorized into four main topics, namely thin films, research on nanomaterials and nanostructures, biological systems and natural fibers, and soil science. Examples of recent applications are provided to give the reader a sense of the power of the technique and its potential contributions to agriculture.

  10. Cyber Security Evaluation Tool

    Energy Technology Data Exchange (ETDEWEB)

    2009-08-03

    CSET is a desktop software tool that guides users through a step-by-step process to assess their control system network security practices against recognized industry standards. The output from CSET is a prioritized list of recommendations for improving the cyber security posture of your organization’s ICS or enterprise network. CSET derives the recommendations from a database of cybersecurity standards, guidelines, and practices. Each recommendation is linked to a set of actions that can be applied to enhance cybersecurity controls.

  11. What Metadata Principles Apply to Scientific Data?

    Science.gov (United States)

    Mayernik, M. S.

    2014-12-01

    Information researchers and professionals based in the library and information science fields often approach their work through developing and applying defined sets of principles. For example, for over 100 years, the evolution of library cataloging practice has largely been driven by debates (which are still ongoing) about the fundamental principles of cataloging and how those principles should manifest in rules for cataloging. Similarly, the development of archival research and practices over the past century has proceeded hand-in-hand with the emergence of principles of archival arrangement and description, such as maintaining the original order of records and documenting provenance. This project examines principles related to the creation of metadata for scientific data. The presentation will outline: 1) how understandings and implementations of metadata can range broadly depending on the institutional context, and 2) how metadata principles developed by the library and information science community might apply to metadata developments for scientific data. The development and formalization of such principles would contribute to the development of metadata practices and standards in a wide range of institutions, including data repositories, libraries, and research centers. Shared metadata principles would potentially be useful in streamlining data discovery and integration, and would also benefit the growing efforts to formalize data curation education.

  12. Applying SF-Based Genre Approaches to English Writing Class

    Science.gov (United States)

    Wu, Yan; Dong, Hailin

    2009-01-01

    By exploring genre approaches in systemic functional linguistics and examining the analytic tools that can be applied to the process of English learning and teaching, this paper seeks to find a way of applying genre approaches to English writing class.

  13. Downhole tool with replaceable tool sleeve sections

    Energy Technology Data Exchange (ETDEWEB)

    Case, W. A.

    1985-10-29

    A downhole tool for insertion in a drill stem includes elongated cylindrical half sleeve tool sections adapted to be non-rotatably supported on an elongated cylindrical body. The tool sections are mountable on and removable from the body without disconnecting either end of the tool from a drill stem. The half sleeve tool sections are provided with tapered axially extending flanges on their opposite ends which fit in corresponding tapered recesses formed on the tool body and the tool sections are retained on the body by a locknut threadedly engaged with the body and engageable with an axially movable retaining collar. The tool sections may be drivably engaged with axial keys formed on the body or the tool sections may be formed with flat surfaces on the sleeve inner sides cooperable with complementary flat surfaces formed on a reduced diameter portion of the body around which the tool sections are mounted.

  14. Sheet Bending using Soft Tools

    Science.gov (United States)

    Sinke, J.

    2011-05-01

    Sheet bending is usually performed by air bending and V-die bending processes. Both processes apply rigid tools. These solid tools facilitate the generation of software for the numerical control of those processes. When the lower rigid die is replaced with a soft or rubber tool, the numerical control becomes much more difficult, since the soft tool deforms too. Compared to other bending processes the rubber-backed bending process has some distinct advantages, like large radius-to-thickness ratios, applicability to materials with topcoats, well-defined radii, and the feasibility of forming details (ridges, beads). These advantages may give the process exclusive benefits over conventional bending processes, not only for industries related to mechanical engineering and sheet metal forming, but also for other disciplines like Architecture and Industrial Design. The largest disadvantage is that the soft (rubber) tool also deforms. Although the tool deformation is elastic and recovers after each process cycle, the applied force during bending is related to the deformation of the metal sheet and the deformation of the rubber. The deformation of the rubber interacts with the process but also with sheet parameters. This makes the numerical control of the process much more complicated. This paper presents a model for the bending of sheet materials using a rubber lower die. This model can be implemented in software in order to control the bending process numerically. The model itself is based on numerical and experimental research. In this research a number of variables related to the tooling and the material have been evaluated. The numerical part of the research was used to investigate the influence of the features of the soft lower tool, like the hardness and dimensions, and the influence of the sheet thickness, which also interacts with the soft tool deformation. The experimental research was focused on the relation between the machine control parameters and the most
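For orientation, conventional air bending with rigid tools has a widely used press-brake force rule of thumb, F = k * Rm * t^2 * b / W. The sketch below implements only that conventional estimate; the rubber-die model of this paper additionally couples the elastic deformation of the soft tool, which this formula does not capture:

```python
def air_bending_force_kn(tensile_mpa, thickness_mm, length_mm, die_width_mm,
                         k=1.33):
    """Rule-of-thumb press-brake force F = k * Rm * t^2 * b / W.

    With Rm in MPa (N/mm^2) and lengths in mm the product is in newtons;
    dividing by 1000 gives kN. k ~ 1.33 is a common die-width factor.
    """
    return k * tensile_mpa * thickness_mm**2 * length_mm / die_width_mm / 1000.0

f = air_bending_force_kn(tensile_mpa=400, thickness_mm=2.0,
                         length_mm=1000, die_width_mm=16)
# f: 133.0 kN for a 1 m bend of 2 mm sheet over a 16 mm V-die opening
```

In the rubber-backed process the effective die stiffness would add a sheet- and tool-dependent term to this force, which is exactly what makes the numerical control harder.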

  15. Software Engineering applied to Manufacturing Problems

    Directory of Open Access Journals (Sweden)

    Jorge A. Ruiz-Vanoye

    2010-05-01

    Optimization approaches have traditionally been viewed as tools for solving manufacturing problems, but they are not suitable for many problems arising in modern manufacturing systems because of their complexity and the involvement of qualitative factors. In this paper we apply a software engineering tool to manufacturing problems, using the HeuristicLab software to determine and analyze the solutions obtained.

  16. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — A no-cost downloadable software tool that allows users to interact with professional-quality GIS maps. Users access pre-compiled projects through...

  17. Evolution of the lignite industry in the Rhineland across 100 years of 'Rheinische Aktiengesellschaft fuer Braunkohlenbergbau und Brikettfabrikation''; Entwicklung der Braunkohlenindustrie Im Rheinland im Spiegel von 100 Jahren ''Rheinische Aktiengesellschaft fuer Braunkohlenbergbau und Brikettfabrikation''

    Energy Technology Data Exchange (ETDEWEB)

    Hartung, M. [RWE Power AG, Ressort Braunkohlengewinnung, -stromerzeugung und -veredlung, Koeln (Germany); Kulik, L. [RWE Power AG, Bereich Tagebauplanung und -genehmigung, Koeln (Germany). PBT Bereich Tagebauplanung und -genehmigung; Gaertner, D. [RWE Power AG, Sparte Tagebaue, Bergheim (Germany)

    2008-09-15

    Rheinische Aktiengesellschaft fuer Braunkohlenbergbau und Brikettfabrikation was set up in the year 1908, and this event heralded an unprecedented streamlining of the lignite industry as it grew and developed in the Rhenish mining area. It paved the way for a bundling of forces that had become necessary in the sector's recent history if it was to successfully face the upcoming challenges on the commodity and energy markets and in the mining sector. The amalgamation of the many Rhenish mining companies also enabled the industry to safeguard its interests more forcefully and effectively in its dealings with other market players and policy-makers. In the light of today's discussions about raw-material shortages, sustainable energy supply, climate protection and emissions trading, lignite is again facing huge challenges, just as it once did. Even if these are new and of a different nature, the course taken 100 years ago may provide pointers for the future alignment of the Rhenish lignite-mining industry. Through ongoing efforts to gain acceptance among citizens, policy-makers and business, backed by internal promotion of knowledge transfer and further education, the systematic use of technical progress to improve its competitive situation, and a far-sighted gearing of the product portfolio to growth markets, Rhenish lignite is creating a sound basis that will enable it to successfully meet tomorrow's challenges in a time-tested manner. (orig.)

  18. 29 CFR 1915.132 - Portable electric tools.

    Science.gov (United States)

    2010-07-01

    Title 29 (Labor), vol. 7, 2010-07-01 edition, § 1915.132 Portable electric tools. The provisions of this section shall apply to ship repairing... frames of portable electric tools and appliances, except double insulated tools approved by...

  19. Refrigerated cutting tools improve machining of superalloys

    Science.gov (United States)

    Dudley, G. M.

    1971-01-01

    Freon-12 applied to tool cutting edge evaporates quickly, leaves no residue, and permits higher cutting rate than with conventional coolants. This technique increases cutting rate on Rene-41 threefold and improves finish of machined surface.

  20. Industrial biotechnology: tools and applications.

    Science.gov (United States)

    Tang, Weng Lin; Zhao, Huimin

    2009-12-01

    Industrial biotechnology involves the use of enzymes and microorganisms to produce value-added chemicals from renewable sources. Because of its association with reduced energy consumption, greenhouse gas emissions, and waste generation, industrial biotechnology is a rapidly growing field. Here we highlight a variety of important tools for industrial biotechnology, including protein engineering, metabolic engineering, synthetic biology, systems biology, and downstream processing. In addition, we show how these tools have been successfully applied in several case studies, including the production of 1,3-propanediol, lactic acid, and biofuels. It is expected that industrial biotechnology will be increasingly adopted by chemical, pharmaceutical, food, and agricultural industries.

  1. Tool Changer For Robot

    Science.gov (United States)

    Voellmer, George M.

    1992-01-01

    Mechanism enables robot to change tools on end of arm. Actuated by motion of robot: requires no additional electrical or pneumatic energy to make or break connection between tool and wrist at end of arm. Includes three basic subassemblies: wrist interface plate attached to robot arm at wrist, tool interface plate attached to tool, and holster. Separate tool interface plate and holster provided for each tool robot uses.

  2. Biomimetics: process, tools and practice.

    Science.gov (United States)

    Fayemi, P E; Wanieck, K; Zollfrank, C; Maranzana, N; Aoussat, A

    2017-01-23

    Biomimetics applies principles and strategies abstracted from biological systems to engineering and technological design. With huge potential for innovation, biomimetics could evolve into a key process in business. Yet challenges remain within the process of biomimetics, especially from the perspective of potential users. We work to clarify the understanding of the process of biomimetics. We briefly summarize the terminology of biomimetics and bioinspiration, and, because implementing biomimetics requires a stated process, we present a model of the problem-driven process of biomimetics that can be used for problem-solving activity. The process of biomimetics can be facilitated by existing tools and creative methods. We mapped a set of tools to the biomimetic process model and set up assessment sheets to evaluate their theoretical and practical value. We analyzed the tools in interdisciplinary research workshops and present the characteristics of the tools. We also present a draft utility tree which, once finalized, could guide users through the process by choosing tools appropriate to their own expertise. The aim of this paper is to foster dialogue and facilitate closer collaboration within the field of biomimetics.

  3. Route Availability Planning Tool

    Data.gov (United States)

    Department of Transportation — The Route Availability Planning Tool (RAPT) is a weather-assimilated decision support tool (DST) that supports the development and execution of departure management...

  4. Changes in phytoplankton productivity and impacts on the environment in the Zhejiang coastal mud area during the last 100 years

    Institute of Scientific and Technical Information of China (English)

    冯旭文; 段杉杉; 石学法; 刘升发; 赵美训; 杨海丽; 朱德弟; 王奎

    2013-01-01

    On the basis of 210Pb dating, biomarkers including brassicasterol, dinosterol, and long-chain alkenones were analyzed in a sediment core taken from the hypoxic Zhejiang coastal mud area. From the distribution of biomarker contents and ratios, changes in phytoplankton productivity and community structure over the past 110 years were reconstructed. The results indicate that phytoplankton productivity in the Zhejiang coastal region has trended upward over the past century, rising from the 1960s onward and increasing markedly since the 1980s, while the community structure shows a rising relative contribution of dinoflagellates and a declining relative contribution of diatoms. The increase in productivity is positively correlated with China's fertilizer use and the riverine nitrogen flux delivered to the sea by the Changjiang (Yangtze), and rising N:P and N:Si nutrient ratios drove the shift in dominant species from diatoms to dinoflagellates. This suggests that human activities since the 1960s, and especially since the 1980s, including rapid industrial and agricultural development and the construction of large water-conservancy projects, have been the main factors behind the increased phytoplankton productivity and altered community structure in the Zhejiang coastal mud area.

  5. Perspectives on Applied Ethics

    OpenAIRE

    2007-01-01

    Applied ethics is a growing, interdisciplinary field dealing with ethical problems in different areas of society. It includes for instance social and political ethics, computer ethics, medical ethics, bioethics, environmental ethics, business ethics, and it also relates to different forms of professional ethics. From the perspective of ethics, applied ethics is a specialisation in one area of ethics. From the perspective of social practice applying ethics is to focus on ethical aspects and ...

  6. Advances in Applied Mechanics

    OpenAIRE

    2014-01-01

    Advances in Applied Mechanics draws together recent significant advances in various topics in applied mechanics. Published since 1948, Advances in Applied Mechanics aims to provide authoritative review articles on topics in the mechanical sciences, primarily of interest to scientists and engineers working in the various branches of mechanics, but also of interest to the many who use the results of investigations in mechanics in various application areas, such as aerospace, chemical, civil, en...

  7. PAT tools for fermentation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist

    The publication of the Process Analytical Technology (PAT) guidance has been one of the most important milestones for pharmaceutical production during the past ten years. The ideas outlined in the PAT guidance are also applied in other industries, for example the fermentation industry. Process knowledge is central in PAT projects. This presentation therefore gives a brief overview of a number of PAT tools for collecting process knowledge on fermentation processes: - On-line sensors, where for example spectroscopic measurements are increasingly applied - Mechanistic models, which can be used...

  8. Applied Neuroscience Laboratory Complex

    Data.gov (United States)

    Federal Laboratory Consortium — Located at WPAFB, Ohio, the Applied Neuroscience lab researches and develops technologies to optimize Airmen individual and team performance across all AF domains....

  9. Evaluating tidal marsh sustainability in the face of sea-level rise: a hybrid modeling approach applied to San Francisco Bay.

    Directory of Open Access Journals (Sweden)

    Diana Stralberg

    BACKGROUND: Tidal marshes will be threatened by increasing rates of sea-level rise (SLR) over the next century. Managers seek guidance on whether existing and restored marshes will be resilient under a range of potential future conditions, and on prioritizing marsh restoration and conservation activities. METHODOLOGY: Building upon established models, we developed a hybrid approach that involves a mechanistic treatment of marsh accretion dynamics and incorporates spatial variation at a scale relevant for conservation and restoration decision-making. We applied this model to San Francisco Bay, using best-available elevation data and estimates of sediment supply and organic matter accumulation developed for 15 Bay subregions. Accretion models were run over 100 years for 70 combinations of starting elevation, mineral sediment, organic matter, and SLR assumptions. Results were applied spatially to evaluate eight Bay-wide climate change scenarios. PRINCIPAL FINDINGS: Model results indicated that under a high rate of SLR (1.65 m/century), short-term restoration of diked subtidal baylands to mid marsh elevations (-0.2 m MHHW) could be achieved over the next century with sediment concentrations greater than 200 mg/L. However, suspended sediment concentrations greater than 300 mg/L would be required for 100-year mid marsh sustainability (i.e., no elevation loss). Organic matter accumulation had minimal impacts on this threshold. Bay-wide projections of marsh habitat area varied substantially, depending primarily on SLR and sediment assumptions. Across all scenarios, however, the model projected a shift in the mix of intertidal habitats, with a loss of high marsh and gains in low marsh and mudflats. CONCLUSIONS/SIGNIFICANCE: Results suggest a bleak prognosis for long-term natural tidal marsh sustainability under a high-SLR scenario.
To minimize marsh loss, we recommend conserving adjacent uplands for marsh migration, redistributing dredged sediment to raise
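As an illustration of the kind of elevation-trajectory calculation such accretion models perform, the sketch below steps a marsh elevation through 100 years of sea-level rise. The linear accretion rule and every parameter value (`organic_m_per_yr`, `mineral_coeff`) are illustrative assumptions, not the paper's calibrated model.

```python
# Toy 100-year marsh elevation trajectory. The linear accretion rule and all
# parameter values here are illustrative assumptions, not the paper's model.

def simulate_marsh(elev_m, ssc_mg_l, slr_m_per_century=1.65, years=100,
                   organic_m_per_yr=0.002, mineral_coeff=1e-5):
    """Return final marsh elevation (m) relative to the risen sea level."""
    sea_level = 0.0
    for _ in range(years):
        sea_level += slr_m_per_century / 100.0
        depth = max(sea_level - elev_m, 0.0)        # inundation depth
        mineral = mineral_coeff * ssc_mg_l * depth  # sediment-driven accretion
        elev_m += mineral + organic_m_per_yr
    return elev_m - sea_level

# More suspended sediment means less elevation loss over the century.
low_sed = simulate_marsh(-0.2, ssc_mg_l=100)
high_sed = simulate_marsh(-0.2, ssc_mg_l=300)
```

Even this toy version reproduces the qualitative finding above: under high SLR, accretion cannot keep pace at low sediment concentrations, and the sediment supply controls how much elevation is lost.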

  10. What are applied ethics?

    Science.gov (United States)

    Allhoff, Fritz

    2011-03-01

    This paper explores the relationships that various applied ethics bear to each other, both in particular disciplines and more generally. The introductory section lays out the challenge of coming up with such an account and, drawing a parallel with the philosophy of science, offers that applied ethics may either be unified or disunified. The second section develops one simple account through which applied ethics are unified, vis-à-vis ethical theory. However, this is not taken to be a satisfying answer, for reasons explained. In the third section, specific applied ethics are explored: biomedical ethics; business ethics; environmental ethics; and neuroethics. These are chosen not to be comprehensive, but rather for their traditions or other illustrative purposes. The final section draws together the results of the preceding analysis and defends a disunity conception of applied ethics.

  11. Special Functions for Applied Scientists

    CERN Document Server

    Mathai, A M

    2008-01-01

    Special Functions for Applied Scientists provides the required mathematical tools for researchers active in the physical sciences. The book presents a full suite of elementary functions for scholars at the PhD level, covers a wide array of topics, and begins by introducing elementary classical special functions. From there, differential equations and some applications to statistical distribution theory are examined. The fractional calculus chapter covers fractional integrals and fractional derivatives as well as their applications to reaction-diffusion problems in physics, input-output analysis, Mittag-Leffler stochastic processes and related topics. The authors then cover q-hypergeometric functions, Ramanujan's work and Lie groups. The latter half of this volume presents applications to stochastic processes, random variables, Mittag-Leffler processes, density estimation, order statistics, and problems in astrophysics. Professor Dr. A.M. Mathai is Emeritus Professor of Mathematics and Statistics, McGill ...

  12. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    Tools for design management are on the agenda in building projects in order to set targets, to choose and prioritise between alternative environmental solutions, to involve stakeholders and to document, evaluate and benchmark. Different types of tools are available, but what can we learn from...... the use or lack of use of current tools in the development of future design tools for sustainable buildings? Why are some used while others are not? Who is using them? The paper deals with design management, with special focus on sustainable building in Denmark, and the challenge of turning the generally...... vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning...

  13. Mesothelioma Applied Research Foundation

    Science.gov (United States)

    ... Percentage Donations Tribute Wall Other Giving/Fundraising Opportunities Bitcoin Donation Form FAQs Speak with Mary Hesdorffer, Nurse ... © 2017 Mesothelioma Applied Research Foundation, ...

  14. Applied eye tracking research

    NARCIS (Netherlands)

    Jarodzka, Halszka

    2011-01-01

    Jarodzka, H. (2010, 12 November). Applied eye tracking research. Presentation and Labtour for Vereniging Gewone Leden in oprichting (VGL i.o.), Heerlen, The Netherlands: Open University of the Netherlands.

  15. Computer and Applied Ethics

    OpenAIRE

    越智, 貢

    2014-01-01

    In this essay I treat some problems raised by the new developments in science and technology, namely those of Computer Ethics, to show how and how far Applied Ethics differs from traditional ethics. I take up the backgrounds on which Computer Ethics rests, particularly the historical conditions of morality. Differences of conditions in time and space explain how, in concrete cases, Computer Ethics and Applied Ethics depart from traditional ethics. But I also investigate the normative rea...

  16. LensTools: Weak Lensing computing tools

    Science.gov (United States)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
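One of the measurements listed above, the power spectrum of a convergence map, can be sketched in plain numpy as an azimuthally averaged 2D Fourier power. This is only an illustration of what the package automates; it is not the LensTools API, and the binning choices are assumptions.

```python
import numpy as np

# Azimuthally averaged 2D power spectrum of a square map, the kind of
# measurement LensTools provides for convergence maps (plain-numpy sketch;
# not the LensTools API).

def power_spectrum_2d(kappa, n_bins=16):
    """Return (bin centers, binned power) for a square map `kappa`."""
    n = kappa.shape[0]
    fk = np.fft.fftshift(np.fft.fft2(kappa))
    power = np.abs(fk) ** 2 / n**2
    freq = np.fft.fftshift(np.fft.fftfreq(n))
    kx, ky = np.meshgrid(freq, freq)
    k = np.hypot(kx, ky).ravel()                      # radial frequency
    edges = np.linspace(0.0, k.max(), n_bins + 1)
    which = np.clip(np.digitize(k, edges) - 1, 0, n_bins - 1)
    p = np.bincount(which, weights=power.ravel(), minlength=n_bins)
    counts = np.bincount(which, minlength=n_bins)
    return 0.5 * (edges[1:] + edges[:-1]), p / np.maximum(counts, 1)

rng = np.random.default_rng(0)
centers, pk = power_spectrum_2d(rng.standard_normal((64, 64)))
```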

  17. Personal Wellness Tools

    Science.gov (United States)

    ... Public Service Announcements Partnering with DBSA Personal Wellness Tools The Merriam-Webster dictionary gives several definitions for ... home to a wealth of customizable, personal wellness tools to help you live a full, healthy, and ...

  18. Critical review of prostate cancer predictive tools.

    Science.gov (United States)

    Shariat, Shahrokh F; Kattan, Michael W; Vickers, Andrew J; Karakiewicz, Pierre I; Scardino, Peter T

    2009-12-01

    Prostate cancer is a very complex disease, and the decision-making process requires the clinician to balance clinical benefits, life expectancy, comorbidities and potential treatment-related side effects. Accurate prediction of clinical outcomes may help in the difficult process of making decisions related to prostate cancer. In this review, we discuss attributes of predictive tools and systematically review those available for prostate cancer. Types of tools include probability formulas, look-up and propensity scoring tables, risk-class stratification prediction tools, classification and regression tree analysis, nomograms and artificial neural networks. Criteria to evaluate tools include discrimination, calibration, generalizability, level of complexity, decision analysis and ability to account for competing risks and conditional probabilities. The available predictive tools and their features, with a focus on nomograms, are described. While some tools are well-calibrated, few have been externally validated or directly compared with other tools. In addition, the clinical consequences of applying predictive tools need thorough assessment. Nevertheless, predictive tools can facilitate medical decision-making by showing patients tailored predictions of their outcomes with various alternatives. Additionally, accurate tools may improve clinical trial design.
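Discrimination, the first evaluation criterion named above, is usually reported as the concordance (c) statistic: the probability that a tool assigns a higher predicted risk to a patient who has the event than to one who does not. The sketch below computes it by direct pair counting; the predictions and outcomes are made-up numbers for illustration.

```python
# c-statistic (discrimination) by exhaustive pair comparison; ties count half.
# Predictions and outcomes below are fabricated for illustration only.

def concordance(preds, outcomes):
    """Fraction of event/non-event pairs ranked correctly by the predictions."""
    events = [p for p, y in zip(preds, outcomes) if y == 1]
    nonevents = [p for p, y in zip(preds, outcomes) if y == 0]
    score = 0.0
    for pe in events:
        for pn in nonevents:
            score += 1.0 if pe > pn else 0.5 if pe == pn else 0.0
    return score / (len(events) * len(nonevents))

c = concordance([0.9, 0.7, 0.4, 0.2], [1, 1, 0, 0])  # perfect ranking -> 1.0
```

A c of 0.5 corresponds to chance ranking (e.g. identical predictions for everyone), which is why the review stresses calibration and external validation alongside discrimination.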

  19. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.
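The rotate-to-any-viewing-angle behavior described above comes down to applying rotation matrices to the 3D points before projecting them onto the screen plane. The sketch below shows that step in Python for illustration (JRAT itself is a Java program; the yaw-then-pitch convention here is an assumption).

```python
import math

# Viewing-angle rotation for a 3D scatter viewer: yaw about the z axis, then
# pitch about the x axis. Illustrative only; not JRAT's actual code.

def rotate(points, yaw, pitch):
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    out = []
    for x, y, z in points:
        x, y = cy * x - sy * y, sy * x + cy * y   # yaw (about z)
        y, z = cp * y - sp * z, sp * y + cp * z   # pitch (about x)
        out.append((x, y, z))
    return out

# A quarter-turn yaw carries the +x axis onto the +y axis.
(px, py, pz), = rotate([(1.0, 0.0, 0.0)], yaw=math.pi / 2, pitch=0.0)
```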

  20. OOTW COST TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    HARTLEY, D.S.III; PACKARD, S.L.

    1998-09-01

    This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for OOTWs. It also recommends modifications to be included in future versions of COST and the development of an OOTW mission planning tool to supply valid input for costing.

  1. Corn nitrogen fertilization rate tools compared over eight Midwest states

    Science.gov (United States)

    Publicly-available nitrogen (N) rate recommendation tools are utilized to help maximize yield in corn production. These tools often fail when N is over-applied, resulting in excess N being lost to the environment, or when N is under-applied, resulting in decreased yield and economic returns. Perfo...

  2. Ammonia synthesis industry: Past, present and future -- Retrospect, enlightenment and challenge from 100 years of ammonia synthesis industry

    Institute of Scientific and Technical Information of China (English)

    刘化章

    2013-01-01

    The catalytic ammonia synthesis technology invented by Haber and Bosch has now reached its 100th anniversary. The enormous success of the ammonia synthesis industry changed the history of world food production, met the food demand created by population growth, and laid the scientific foundations of heterogeneous catalysis and chemical engineering. Catalytic ammonia synthesis played a central role in the development of the chemical industry during the 20th century. This paper reviews the founding and development of the ammonia synthesis industry and the lessons drawn from it, and looks ahead to the industry's future and the new challenges it faces. The classical ammonia synthesis industry is closely related to the emerging industries and can be regarded as their foundation, since it embodies a series of high and new technologies. Familiarity with ammonia synthesis process flows and equipment, and with the mature technologies and practical experience of the process, offers strong enlightenment and reference for understanding a series of common, key technologies in modern chemical engineering, energy, materials, and environmental protection, especially the modern coal chemical industry.

  3. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  4. Pro Tools HD

    CERN Document Server

    Camou, Edouard

    2013-01-01

    An easy-to-follow guide to using Pro Tools HD 11 effectively. This book is ideal for anyone who already uses Pro Tools and wants to learn more, or is new to Pro Tools HD and wants to use it effectively in their own audio workstations.

  5. Applied chemical engineering thermodynamics

    CERN Document Server

    Tassios, Dimitrios P

    1993-01-01

    Applied Chemical Engineering Thermodynamics provides the undergraduate and graduate student of chemical engineering with the basic knowledge, the methodology and the references needed to apply it in industrial practice. Thus, in addition to the classical topics of the laws of thermodynamics, pure component and mixture thermodynamic properties as well as phase and chemical equilibria, the reader will find: - history of thermodynamics - energy conservation - intermolecular forces and molecular thermodynamics - cubic equations of state - statistical mechanics. A great number of calculated problems with solutions and an appendix with numerous tables of numbers of practical importance are extremely helpful for applied calculations. The computer programs on the included disk help the student to become familiar with the typical methods used in industry for volumetric and vapor-liquid equilibria calculations.
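A minimal sketch of the volumetric calculations that cubic equations of state support: Newton's method on the van der Waals equation for the molar volume of CO2 vapor. The constants `a` and `b` are standard literature values for CO2; this is an illustration, not the book's bundled programs.

```python
# Solve the van der Waals equation R*T/(v - b) - a/v**2 = P for molar volume
# v, starting from the ideal-gas guess. Illustrative sketch; a and b are
# literature van der Waals constants for CO2.

R = 8.314  # J/(mol K)

def vdw_molar_volume(p, t, a, b, iters=50):
    """Newton iteration for the vapor-root molar volume (m^3/mol)."""
    v = R * t / p  # ideal-gas starting guess
    for _ in range(iters):
        f = R * t / (v - b) - a / v**2 - p
        fp = -R * t / (v - b) ** 2 + 2 * a / v**3
        v -= f / fp
    return v

# CO2 at 1 bar, 300 K: close to, but slightly below, the ideal-gas volume.
v = vdw_molar_volume(1e5, 300.0, a=0.3640, b=4.267e-5)
```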

  6. PSYCHOANALYSIS AS APPLIED AESTHETICS.

    Science.gov (United States)

    Richmond, Stephen H

    2016-07-01

    The question of how to place psychoanalysis in relation to science has been debated since the beginning of psychoanalysis and continues to this day. The author argues that psychoanalysis is best viewed as a form of applied art (also termed applied aesthetics) in parallel to medicine as applied science. This postulate draws on a functional definition of modernity as involving the differentiation of the value spheres of science, art, and religion. The validity criteria for each of the value spheres are discussed. Freud is examined, drawing on Habermas, and seen to have erred by claiming that the psychoanalytic method is a form of science. Implications for clinical and metapsychological issues in psychoanalysis are discussed.

  7. Applied mathematics made simple

    CERN Document Server

    Murphy, Patrick

    1982-01-01

    Applied Mathematics: Made Simple provides an elementary study of the three main branches of classical applied mathematics: statics, hydrostatics, and dynamics. The book begins with discussion of the concepts of mechanics, parallel forces and rigid bodies, kinematics, motion with uniform acceleration in a straight line, and Newton's law of motion. Separate chapters cover vector algebra and coplanar motion, relative motion, projectiles, friction, and rigid bodies in equilibrium under the action of coplanar forces. The final chapters deal with machines and hydrostatics. The standard and conte

  8. Introduction to applied thermodynamics

    CERN Document Server

    Helsdon, R M; Walker, G E

    1965-01-01

    Introduction to Applied Thermodynamics is an introductory text on applied thermodynamics and covers topics ranging from energy and temperature to reversibility and entropy, the first and second laws of thermodynamics, and the properties of ideal gases. Standard air cycles and the thermodynamic properties of pure substances are also discussed, together with gas compressors, combustion, and psychrometry. This volume is comprised of 16 chapters and begins with an overview of the concept of energy as well as the macroscopic and molecular approaches to thermodynamics. The following chapters focus o

  9. Retransmission Steganography Applied

    CERN Document Server

    Mazurczyk, Wojciech; Szczypiorski, Krzysztof

    2010-01-01

    This paper presents experimental results of the implementation of network steganography method called RSTEG (Retransmission Steganography). The main idea of RSTEG is to not acknowledge a successfully received packet to intentionally invoke retransmission. The retransmitted packet carries a steganogram instead of user data in the payload field. RSTEG can be applied to many network protocols that utilize retransmissions. We present experimental results for RSTEG applied to TCP (Transmission Control Protocol) as TCP is the most popular network protocol which ensures reliable data transfer. The main aim of the performed experiments was to estimate RSTEG steganographic bandwidth and detectability by observing its influence on the network retransmission level.
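The mechanism can be made concrete with a toy model: for as long as steganogram bytes remain, the receiver withholds the ACK for a segment, and the forced retransmission carries a hidden byte in its payload field. This is a simplified simulation of the idea, not a real TCP stack; the function and its segment/byte interface are invented for illustration.

```python
# Toy RSTEG model (not a real TCP implementation): each deliberately
# withheld ACK forces one retransmission, whose payload field carries a
# steganogram byte instead of user data.

def rsteg_transfer(segments, steganogram):
    """Return (overt data, covert data, retransmission rate)."""
    hidden = list(steganogram)
    covert, retransmissions = [], 0
    for seg in segments:
        # First transmission delivers the overt payload as usual.
        if hidden:
            # Receiver skips the ACK on purpose -> sender retransmits,
            # and the retransmitted segment carries the hidden byte.
            retransmissions += 1
            covert.append(hidden.pop(0))
    overt = b"".join(segments)
    return overt, bytes(covert), retransmissions / len(segments)

overt, covert, rate = rsteg_transfer([b"data"] * 10, b"hi")
```

The returned rate mirrors the paper's trade-off: steganographic bandwidth grows with the induced retransmission level, which is exactly what makes the method detectable if overused.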

  10. On applying cognitive psychology.

    Science.gov (United States)

    Baddeley, Alan

    2013-11-01

    Recent attempts to assess the practical impact of scientific research prompted my own reflections on over 40 years' worth of combining basic and applied cognitive psychology. Examples are drawn principally from the study of memory disorders, but also include applications to the assessment of attention, reading, and intelligence. The most striking conclusion concerns the many years it typically takes to go from an initial study to the final practical outcome. Although the complexity and sheer timescale involved make external evaluation problematic, the combination of practical satisfaction and theoretical stimulation makes the attempt to combine basic and applied research very rewarding.

  11. Applied Electromagnetism and Materials

    CERN Document Server

    Moliton, André

    2007-01-01

    Applied Electromagnetism and Materials picks up where the author's Basic Electromagnetism and Materials left off by presenting practical and relevant technological information about electromagnetic material properties and their applications. This book is aimed at senior undergraduate and graduate students as well as researchers in materials science and is the product of many years of teaching basic and applied electromagnetism. Topics range from the spectroscopy and characterization of dielectrics and semiconductors, to non-linear effects and electromagnetic cavities, to ion-beam applications in materials science.

  12. Applied Astronomy: Asteroid Prospecting

    Science.gov (United States)

    Elvis, M.

    2013-09-01

    In the age of asteroid mining the ability to find promising ore-bearing bodies will be valuable. This will give rise to a new discipline: "Applied Astronomy". Just as most geologists work in industry, not in academia, the same will be true of astronomers. Just how rare or common ore-rich asteroids are likely to be, and the skills needed to assay their value, are discussed here, with an emphasis on remote (telescopic) methods. Also considered are the resources needed to conduct extensive surveys of asteroids for prospecting purposes, and the cost and timescale involved. The longer-term need for applied astronomers is also covered.

  13. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    This paper describes and discusses two different Scheme documentation tools. The first is SchemeDoc, which is intended for documentation of the interfaces of Scheme libraries (APIs). The second is the Scheme Elucidator, which is for internal documentation of Scheme programs. Although the tools...... are separate and intended for different documentation purposes they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which---in a systematic way---makes an XML language available...

  14. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  15. Africa and Applied Linguistics.

    Science.gov (United States)

    Makoni, Sinfree, Ed.; Meinhof, Ulrike H., Ed.

    2003-01-01

    This collection of articles includes: "Introducing Applied Linguistics in Africa" (Sinfree Makoni and Ulrike H. Meinhof); "Language Ideology and Politics: A Critical Appraisal of French as Second Official Language in Nigeria" (Tope Omoniyi); "The Democratisation of Indigenous Languages: The Case of Malawi" (Themba…

  16. Applying Literature to ELT

    Institute of Scientific and Technical Information of China (English)

    翟悦

    2007-01-01

    Literature is no longer a frightening word to English language learners. Interactive teaching methods and attractive activities can help motivate Chinese university English learners. This essay first elaborates the reasons for using literature in ELT (English Language Teaching) classes and then shows how to apply literature to the ELT class.

  17. Essays on Applied Microeconomics

    Science.gov (United States)

    Mejia Mantilla, Carolina

    2013-01-01

    Each chapter of this dissertation studies a different question within the field of Applied Microeconomics. The first chapter examines the mid- and long-term effects of the 1998 Asian Crisis on the educational attainment of Indonesian children ages 6 to 18, at the time of the crisis. The effects are identified as deviations from a linear trend for…

  18. Applied Statistics with SPSS

    Science.gov (United States)

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  19. Applied data mining for business and industry

    CERN Document Server

    Giudici, Paolo

    2009-01-01

    The increasing availability of data in our current, information overloaded society has led to the need for valid tools for its modelling and analysis. Data mining and applied statistical methods are the appropriate tools to extract knowledge from such data. This book provides an accessible introduction to data mining methods in a consistent and application oriented statistical framework, using case studies drawn from real industry projects and highlighting the use of data mining methods in a variety of business applications. Introduces data mining methods and applications. Covers classical and Bayesian multivariate statistical methodology as well as machine learning and computational data mining methods. Includes many recent developments such as association and sequence rules, graphical Markov models, lifetime value modelling, credit risk, operational risk and web mining. Features detailed case studies based on applied projects within industry. Incorporates discussion of data mining software, with case studies a...

  20. Applying the WEAP Model to Water Resource

    DEFF Research Database (Denmark)

    Gao, Jingjing; Christensen, Per; Li, Wei

    Water resources assessment is a tool to provide decision makers with an appropriate basis to make informed judgments regarding the objectives and targets to be addressed during the Strategic Environmental Assessment (SEA) process. The study shows how water resources assessment can be applied in SEA...... in assessing the effects on water resources using a case study on a Coal Industry Development Plan in an arid region in North Western China. In the case the WEAP model (Water Evaluation And Planning System) were used to simulate various scenarios using a diversity of technological instruments like irrigation...... efficiency, treatment and reuse of water. The WEAP model was applied to the Ordos catchment where it was used for the first time in China. The changes in water resource utilization in Ordos basin were assessed with the model. It was found that the WEAP model is a useful tool for water resource assessment...

  1. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  2. Brain oxygenation patterns during the execution of tool use demonstration, tool use pantomime, and body-part-as-object tool use.

    Science.gov (United States)

    Helmich, Ingo; Holle, Henning; Rein, Robert; Lausberg, Hedda

    2015-04-01

    Divergent findings exist as to whether left and right hemispheric pre- and postcentral cortices contribute to the production of tool use related hand movements. In order to clarify the neural substrates of tool use demonstrations with tool in hand, tool use pantomimes without tool in hand, and body-part-as-object presentations of tool use (BPO) in a naturalistic mode of execution, we applied functional Near InfraRed Spectroscopy (fNIRS) in twenty-three right-handed participants. Functional NIRS techniques allow for the investigation of brain oxygenation during the execution of complex hand movements with an unlimited movement range. Brain oxygenation patterns were retrieved from 16 channels of measurement above pre- and postcentral cortices of each hemisphere. The results showed that tool use demonstration with tool in hand leads to increased oxygenation as compared to tool use pantomimes in the left hemispheric somatosensory gyrus. Left hand executions of the demonstration of tool use, pantomime of tool use, and BPO of tool use led to increased oxygenation in the premotor and somatosensory cortices of the left hemisphere as compared to right hand executions of either condition. The results indicate that the premotor and somatosensory cortices of the left hemisphere constitute relevant brain structures for tool related hand movement production when using the left hand, whereas the somatosensory cortex of the left hemisphere seems to provide specific mental representations when performing tool use demonstrations with the tool in hand.

  3. Miniaturised Spotter-Compatible Multicapillary Stamping Tool for Microarray Printing

    CERN Document Server

    Drobyshev, A L; Zasedatelev, A S; Drobyshev, Alexei L; Verkhodanov, Nikolai N; Zasedatelev, Alexander S

    2007-01-01

    A novel microstamping tool for microarray printing is proposed. The tool can spot up to 127 droplets of different solutions in a single touch, and is easily compatible with commercially available microarray spotters. The tool is based on a multichannel funnel with polypropylene capillaries inserted into its channels. Superior flexibility is achieved by the ability to replace any printing capillary of the tool. As a practical implementation, hydrogel-based microarrays were stamped and successfully applied to identify Mycobacterium tuberculosis drug resistance.

  4. Recent Advances in Algal Genetic Tool Development

    Energy Technology Data Exchange (ETDEWEB)

    R. Dahlin, Lukas; T. Guarnieri, Michael

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  5. Applying WCET Analysis at Architectural Level

    OpenAIRE

    Gilles, Olivier; Hugues, Jérôme

    2008-01-01

    Real-Time embedded systems must enforce strict timing constraints. In this context, achieving precise Worst Case Execution Time is a prerequisite to apply scheduling analysis and verify system viability. WCET analysis is usually a complex and time-consuming activity. It becomes increasingly complex when one also considers code generation strategies from high-level models. In this paper, we present an experiment made on the coupling of the WCET analysis tool Bound-T and our AADL to code ...

  6. Capacitive tool standoff sensor for dismantlement tasks

    Energy Technology Data Exchange (ETDEWEB)

    Schmitt, D.J.; Weber, T.M. [Sandia National Labs., Albuquerque, NM (United States); Liu, J.C. [Univ. of Illinois, Urbana, IL (United States)

    1996-12-31

    A capacitive sensing technology has been applied to develop a Standoff Sensor System for control of robotically deployed tools utilized in Decontamination and Dismantlement (D and D) activities. The system combines four individual sensor elements to provide non-contact, multiple degree-of-freedom control of tools at distances up to five inches from a surface. The Standoff Sensor has been successfully integrated to a metal cutting router and a pyrometer, and utilized for real-time control of each of these tools. Experiments demonstrate that the system can locate stationary surfaces with a repeatability of 0.034 millimeters.

  7. PAT tools for fermentation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist; Bolic, Andrijana; Svanholm, Bent

    2012-01-01

    The publication of the Process Analytical Technology (PAT) guidance has been one of the most important milestones for pharmaceutical production during the past ten years. The ideas outlined in the PAT guidance are also applied in other industries, for example the fermentation industry. Process...... knowledge is central in PAT projects. This manuscript therefore gives a brief overview of a number of PAT tools for collecting process knowledge on fermentation processes: on-line sensors, mechanistic models and small-scale equipment for high-throughput experimentation. The manuscript ends with a short...

  8. Applied Control Systems Design

    CERN Document Server

    Mahmoud, Magdi S

    2012-01-01

    Applied Control System Design examines several methods for building up systems models based on real experimental data from typical industrial processes and incorporating system identification techniques. The text takes a comparative approach to the models derived in this way judging their suitability for use in different systems and under different operational circumstances. A broad spectrum of control methods including various forms of filtering, feedback and feedforward control is applied to the models and the guidelines derived from the closed-loop responses are then composed into a concrete self-tested recipe to serve as a check-list for industrial engineers or control designers. System identification and control design are given equal weight in model derivation and testing to reflect their equality of importance in the proper design and optimization of high-performance control systems. Readers’ assimilation of the material discussed is assisted by the provision of problems and examples. Most of these e...

  9. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo...

  10. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  11. Applied Economics in Teaching

    Institute of Scientific and Technical Information of China (English)

    朱红萍

    2009-01-01

    This paper explains some everyday phenomena in teaching and class management from an economic point of view. The basic economic principles involved include: everything has an opportunity cost; the marginal utility of consumption of any kind is diminishing; and game theory is everywhere. Applying these economic theories to teaching helps teachers understand students' behavior and thus improve teaching effectiveness and efficiency.

  12. Methods of applied mathematics

    CERN Document Server

    Hildebrand, Francis B

    1992-01-01

    This invaluable book offers engineers and physicists working knowledge of a number of mathematical facts and techniques not commonly treated in courses in advanced calculus, but nevertheless extremely useful when applied to typical problems in many different fields. It deals principally with linear algebraic equations, quadratic and Hermitian forms, operations with vectors and matrices, the calculus of variations, and the formulations and theory of linear integral equations. Annotated problems and exercises accompany each chapter.

  13. The NPP spatiotemporal variation of global grassland ecosystems in response to climate change over the past 100 years

    Institute of Scientific and Technical Information of China (English)

    刚成诚; 王钊齐; 杨悦; 陈奕兆; 张艳珍; 李建龙; 程积民

    2016-01-01

    Classification System (CSCS) and a segmentation model. Correlation analysis was also conducted to reveal the responses of grassland types to different climate variables. The results showed that the total global area of grassland ecosystems declined from 5175.73×10^4 km^2 in the 1920s to 5102.16×10^4 km^2 in the 1990s. The largest decrease, 192.35×10^4 km^2, occurred in tundra & alpine steppe ecosystems. The areas of desert grassland, typical grassland and temperate humid grassland decreased by 14.31, 34.15 and 70.81×10^4 km^2 respectively, while tropical savanna expanded by 238.06×10^4 km^2. Climate warming forced most grasslands to shift northwards, particularly in the northern hemisphere. Global grassland NPP increased from 25.93 Pg DW/yr in the 1920s to 26.67 Pg DW/yr in the 1990s. In terms of each grassland type, the NPP of the tundra and alpine steppe, desert grassland, typical grassland and temperate humid grassland decreased by 709.57, 24.98, 115.74 and 291.56 Tg DW/yr respectively. The NPP of tropical savanna increased by 1887.37 Tg DW/yr. At the global scale, precipitation was the dominant factor affecting grassland NPP. In general, grassland ecosystems have been substantially affected by climate change over the past 100 years. Although the global grassland NPP showed an overall increasing trend, the structure and distribution of particular grassland ecosystems had been adversely affected by the warmer and wetter climate.
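As a quick sanity check, the per-type area changes quoted in the abstract sum to the reported net decline (all figures copied from the abstract above; units of 10^4 km^2; the small residual is input rounding):

```python
# Area changes by grassland type, 1920s -> 1990s, in units of 10^4 km^2
# (all figures copied from the abstract above)
changes = {
    "tundra & alpine steppe": -192.35,
    "desert grassland": -14.31,
    "typical grassland": -34.15,
    "temperate humid grassland": -70.81,
    "tropical savanna": +238.06,
}
net = sum(changes.values())       # sum of per-type changes
reported = 5102.16 - 5175.73      # reported total decline
print(round(net, 2), round(reported, 2))  # -73.56 vs -73.57, consistent to rounding
```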

  14. Lunar hand tools

    Science.gov (United States)

    Bentz, Karl F.; Coleman, Robert D.; Dubnik, Kathy; Marshall, William S.; Mcentee, Amy; Na, Sae H.; Patton, Scott G.; West, Michael C.

    1987-01-01

    Tools useful for operations and maintenance tasks on the lunar surface were determined and designed. Primary constraints are the lunar environment, the astronaut's space suit and the strength limits of the astronaut on the moon. A multipurpose rotary motion tool and a collapsible tool carrier were designed. For the rotary tool, a brushless motor and controls were specified, a material for the housing was chosen, bearings and lubrication were recommended and a planetary reduction gear attachment was designed. The tool carrier was designed primarily for ease of access to the tools and fasteners. A material was selected and structural analysis was performed on the carrier. Recommendations were made about the limitations of human performance and about possible attachments to the torque driver.

  15. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be given to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  16. Authoring tool evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, A.L.; Klenk, K.S.; Coday, A.C.; McGee, J.P.; Rivenburgh, R.R.; Gonzales, D.M.; Mniszewski, S.M.

    1994-09-15

    This paper discusses and evaluates a number of authoring tools currently on the market. The tools evaluated are Visix Galaxy, NeuronData Open Interface Elements, Sybase Gain Momentum, XVT Power++, Aimtech IconAuthor, Liant C++/Views, and Inmark Technology zApp. Also discussed is the LIST project and how this evaluation is being used to fit an authoring tool to the project.

  17. Population Density Modeling Tool

    Science.gov (United States)

    2014-02-05

    NAWCADPAX/TR-2012/194: Population Density Modeling Tool, by Davy Andrew Michael Knott and David Burke, 26 June 2012.

  18. CMS offline web tools

    CERN Document Server

    Metson, S; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Evans, D; Fanfani, A; Feichtinger, D; Kavka, C; Kuznetsov, V; Van Lingen, F; Newbold, D; Tuura, L; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and to interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments.

  19. Qlikview Audit Tool (QLIKVIEW) -

    Data.gov (United States)

    Department of Transportation — This tool supports the cyclical financial audit process. Qlikview supports large volumes of financial transaction data that can be mined, summarized and presented to...

  20. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well defined sections for the different parts of Spring.Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  1. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  2. Java Power Tools

    CERN Document Server

    Smart, John

    2008-01-01

    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  3. Applying lean thinking in construction

    Directory of Open Access Journals (Sweden)

    Remon Fayek Aziz

    2013-12-01

    Full Text Available The productivity of the construction industry worldwide has been declining over the past 40 years. One approach to improving the situation is lean construction, which results from applying a new form of production management to construction. Essential features of lean construction include a clear set of objectives for the delivery process, aimed at maximizing performance for the customer at the project level; concurrent design and construction; and the application of project control throughout the life cycle of the project, from design to delivery. An increasing number of construction academics and professionals have been storming the ramparts of conventional construction management in an effort to deliver better value to owners while making real profits. As a result, lean-based tools have emerged and have been successfully applied to simple and complex construction projects. In general, lean construction projects are easier to manage, safer, completed sooner, cost less, and are of better quality. Significant research remains to complete the translation of lean thinking to construction practice in Egypt. This research discusses the principles, methods, and implementation phases of lean construction, showing the waste in construction and how it can be minimized. The Last Planner System, one of the most widely adopted applications of lean construction concepts and methodologies, has proved able to enhance construction management practices in various aspects. The research also aims to develop a methodology for process evaluation and to define areas for improvement based on lean principles.

  4. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2005-01-01

    Master linear regression techniques with a new edition of a classic text. Reviews of the Second Edition: "I found it enjoyable reading and so full of interesting material that even the well-informed reader will probably find something new . . . a necessity for all of those who do linear regression." -Technometrics, February 1987. "Overall, I feel that the book is a valuable addition to the now considerable list of texts on applied linear regression. It should be a strong contender as the leading text for a first serious course in regression analysis." -American Scientist, May-June 1987
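The one-formula core of the subject can be shown in a minimal ordinary least-squares fit (synthetic data invented here purely for illustration):

```python
import numpy as np

# Ordinary least squares for y = b0 + b1*x (synthetic data, for illustration)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.9])
X = np.column_stack([np.ones_like(x), x])   # design matrix with intercept column
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta                    # basis for the fit diagnostics the book covers
print(beta)  # [intercept, slope] = [0.11, 1.97]
```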

  5. SIFT applied to CBIR

    Directory of Open Access Journals (Sweden)

    ALMEIDA, J.

    2009-12-01

    Full Text Available Content-Based Image Retrieval (CBIR) is a challenging task. Common approaches use only low-level features. However, such CBIR solutions fail to capture some local features representing the details and nuances of scenes. Many techniques in image processing and computer vision can capture these scene semantics. Among them, the Scale Invariant Feature Transform (SIFT) has been widely used in many applications. This approach relies on the choice of several parameters which directly impact its effectiveness when applied to retrieve images. In this paper, we discuss the results obtained in several experiments proposed to evaluate the application of SIFT in CBIR tasks.
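One of the parameter choices alluded to above is the descriptor-matching criterion. The sketch below (toy made-up descriptors, numpy only; a real CBIR system would extract descriptors with a SIFT implementation such as OpenCV's) illustrates Lowe's ratio test, the standard rule for accepting a SIFT match:

```python
import numpy as np

def ratio_test_matches(query, database, ratio=0.75):
    """Match each query descriptor to its nearest database descriptor,
    keeping it only if it passes Lowe's ratio test (nearest distance
    clearly smaller than the second-nearest)."""
    matches = []
    for i, q in enumerate(query):
        d = np.linalg.norm(database - q, axis=1)  # distance to every descriptor
        nearest, second = np.argsort(d)[:2]
        if d[nearest] < ratio * d[second]:
            matches.append((i, int(nearest)))
    return matches

# Toy 4-D "descriptors": query 0 has one clear neighbour, query 1 is ambiguous
database = np.array([[0.0, 0.0, 0.0, 0.0],
                     [1.0, 1.0, 1.0, 1.0],
                     [5.0, 5.0, 5.0, 5.0]])
query = np.array([[1.1, 1.0, 1.0, 1.0],
                  [3.0, 3.0, 3.0, 3.0]])
print(ratio_test_matches(query, database))  # [(0, 1)] -- the ambiguous one is dropped
```

Tightening or loosening `ratio` is exactly the kind of parameter sensitivity the paper's experiments evaluate.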

  6. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.

  7. Applied energy an introduction

    CERN Document Server

    Abdullah, Mohammad Omar

    2012-01-01

    Introduction to Applied Energy: General Introduction; Energy and Power Basics; Energy Equation; Energy Generation Systems; Energy Storage and Methods; Energy Efficiencies and Losses. Energy Industry and Energy Applications in Small-Medium Enterprises (SME) Industries: Energy Industry; Energy-Intensive Industry; Energy Applications in SME Energy Industries. Energy Sources and Supply: Energy Sources; Energy Supply and Energy Demand; Energy Flow Visualization and Sankey Diagram. Energy Management and Analysis: Energy Audits; Energy Use and Fuel Consumption Study; Energy Life-Cycle Analysis. Energy and Environment: Energy Pollutants, S...

  8. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." -International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus...

  9. Applied impulsive mathematical models

    CERN Document Server

    Stamova, Ivanka

    2016-01-01

    Using the theory of impulsive differential equations, this book focuses on mathematical models which reflect current research in biology, population dynamics, neural networks and economics. The authors provide the basic background from the fundamental theory and give a systematic exposition of recent results related to the qualitative analysis of impulsive mathematical models. Consisting of six chapters, the book presents many applicable techniques, making them available in a single source easily accessible to researchers interested in mathematical models and their applications. Serving as a valuable reference, this text is addressed to a wide audience of professionals, including mathematicians, applied researchers and practitioners.
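The objects of study here take the standard impulsive form (the notation below is the generic textbook one, not necessarily the authors'): a continuous flow between impulse times, with state jumps at each impulse.

```latex
% Generic impulsive differential system: continuous dynamics between
% impulse times \tau_k, with a state jump I_k applied at t = \tau_k
\dot{x}(t) = f\bigl(t, x(t)\bigr), \quad t \neq \tau_k, \qquad
\Delta x(\tau_k) = x(\tau_k^{+}) - x(\tau_k^{-}) = I_k\bigl(x(\tau_k^{-})\bigr), \quad k = 1, 2, \dots
```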

  10. Applied Semantic Web Technologies

    CERN Document Server

    Sugumaran, Vijayan

    2011-01-01

    The rapid advancement of semantic web technologies, along with the fact that they are at various levels of maturity, has left many practitioners confused about the current state of these technologies. Focusing on the most mature technologies, Applied Semantic Web Technologies integrates theory with case studies to illustrate the history, current state, and future direction of the semantic web. It maintains an emphasis on real-world applications and examines the technical and practical issues related to the use of semantic technologies in intelligent information management. The book starts with

  11. Applied Chaos Control

    Science.gov (United States)

    Spano, Mark

    1997-04-01

    The publication by Ott, Grebogi and Yorke(E. Ott, C. Grebogi and J. A. Yorke, Phys. Rev. Lett. 64, 1196 (1990).) of their theory of chaos control in 1990 led to an explosion of experimental work applying their theory to mechanical systems and electronic circuits, lasers and chemical reactors, and heart and brain tissue, to name only a few. In this talk the basics of chaos control as implemented in a simple mechanical system will be described, as well as extensions of the method to biological applications. Finally, current advances in the field, including the maintenance of chaos and the control of high dimensional chaos, will be discussed.
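The flavour of the OGY scheme can be conveyed with a one-dimensional toy version (logistic map; parameter values chosen here for illustration, not taken from the talk): tiny parameter perturbations are applied only when the chaotic orbit wanders close to the unstable fixed point, which then becomes effectively stable.

```python
# Toy 1-D OGY-style control of the logistic map x' = r*x*(1-x)
# (illustrative parameter values, not from the talk)
r0 = 3.9                       # nominal parameter: chaotic regime
xstar = 1.0 - 1.0 / r0         # fixed point, unstable since |f'(x*)| = 1.9 > 1
lam = 2.0 - r0                 # f'(x*) = 2 - r
g = xstar * (1.0 - xstar)      # df/dr evaluated at the fixed point
x = 0.4
for _ in range(5000):
    dr = 0.0
    if abs(x - xstar) < 0.01:            # control only near the fixed point
        dr = -lam * (x - xstar) / g      # cancel the linearized deviation
        dr = max(-0.1, min(0.1, dr))     # OGY demands only small perturbations
    x = (r0 + dr) * x * (1.0 - x)
print(abs(x - xstar))  # tiny: the orbit is pinned to the formerly unstable point
```

The same idea, with the linearization taken along an unstable periodic orbit of a higher-dimensional system, is what was applied to the mechanical, laser, chemical, and biological experiments mentioned above.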

  12. Applied complex variables

    CERN Document Server

    Dettman, John W

    1965-01-01

    Analytic function theory is a traditional subject going back to Cauchy and Riemann in the 19th century. Once the exclusive province of advanced mathematics students, its applications have proven vital to today's physicists and engineers. In this highly regarded work, Professor John W. Dettman offers a clear, well-organized overview of the subject and various applications - making the often-perplexing study of analytic functions of complex variables more accessible to a wider audience. The first half of Applied Complex Variables, designed for sequential study, is a step-by-step treatment of fun

  13. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

    A new edition of the definitive guide to logistic regression modeling for health science and other applications. This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-...
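The dichotomous-outcome model at the heart of the book can be fitted in a few lines; below is a minimal sketch (toy data invented for illustration; real analyses would use the statistical software the book assumes) using gradient ascent on the log-likelihood.

```python
import numpy as np

# Minimal logistic regression: dichotomous outcome y vs. one covariable x
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
y = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0])
X = np.column_stack([np.ones_like(x), x])      # intercept + covariable

beta = np.zeros(2)
for _ in range(20000):
    p = sigmoid(X @ beta)
    beta += 0.1 * X.T @ (y - p) / len(y)       # gradient ascent on log-likelihood
p = sigmoid(X @ beta)
print(p.round(2))  # fitted probabilities rise with x
```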

  14. Application of automatic generation technology for pattern-block tool electrode blank geometry and cutting dimension drawings

    Institute of Scientific and Technical Information of China (English)

    胡海明; 张浩

    2013-01-01

    To solve the inefficiency of relying on traditional manual drawing of the tool electrode blank geometry and the cutting dimension drawing separately, a program that generates both automatically and simultaneously was written in the GRIP language. Users need only select the three-dimensional model of the tool electrode and enter the unilateral margin value, and the tool electrode blank geometry and its cutting dimension drawing are generated automatically, greatly improving work efficiency.

  15. Applied Impact Physics Research

    Science.gov (United States)

    Wickert, Matthias

    2013-06-01

    Applied impact physics research is based on the capability to examine impact processes for a wide range of impact conditions with respect to velocity as well as the mass and shape of the projectile. For this reason, Fraunhofer EMI operates a large variety of launchers that cover velocities up to ordnance velocities as single-stage powder guns but can also be operated as two-stage light-gas guns reaching the regime of low-Earth-orbit velocity. Thereby, for projectile masses of up to 100 g, hypervelocity impact phenomena up to 7.8 km/s can be addressed. Advanced optical diagnostic techniques such as microsecond video are used as commercial systems, but, since impact phenomena are mostly associated with debris or dust, specialized diagnostics such as x-ray cinematography and x-ray tomography are developed in-house. Selected topics from the field of applied impact physics will be presented, such as the interesting behavior of long rods penetrating low-density materials, experimental findings at hypervelocity for this class of materials, and new x-ray diagnostic techniques.
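A back-of-the-envelope number conveys the regime quoted above: the heaviest projectile mass at the highest velocity carries about 3 MJ of kinetic energy.

```python
# Kinetic energy of the heaviest/fastest shot quoted in the abstract
m = 0.100           # projectile mass in kg (100 g)
v = 7800.0          # impact velocity in m/s (7.8 km/s)
E = 0.5 * m * v**2  # kinetic energy in joules
print(round(E / 1e6, 3))  # -> 3.042 (MJ)
```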

  16. Maailma suurim tool (The world's largest chair)

    Index Scriptorium Estoniae

    2000-01-01

    AS Tartu Näitused, the Tartu Art School and the magazine 'Diivan' are organising the exhibition 'Tool 2000' ('Chair 2000') in pavilion I of the Tartu fair centre on 9-11 March. 2,000 chairs will be exhibited, from which a TOP 12 will be chosen. The world's largest chair is to be erected on the grounds of the fair centre. At the same time, pavilion II hosts the twin fairs 'Sisustus 2000' ('Furnishings 2000') and 'Büroo 2000' ('Office 2000').

  17. Study of Tools Interoperability

    NARCIS (Netherlands)

    Krilavičius, T.

    2007-01-01

    Interoperability of tools usually refers to a combination of methods and techniques that address the problem of making a collection of tools work together. In this study we survey different notions that are used in this context: interoperability, interaction and integration. We point out the relation...

  18. WATERS Expert Query Tool

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Expert Query Tool is a web-based reporting tool using the EPA’s WATERS database.There are just three steps to using Expert Query:1. View Selection – Choose what...

  19. Coring Sample Acquisition Tool

    Science.gov (United States)

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.

    2012-01-01

    A sample acquisition tool (SAT) has been developed that can be used autonomously to sample drill and capture rock cores. The tool is designed to accommodate core transfer using a sample tube to the IMSAH (integrated Mars sample acquisition and handling) SHEC (sample handling, encapsulation, and containerization) without ever touching the pristine core sample in the transfer process.

  20. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  1. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm’s leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  2. Software Tool Issues

    Science.gov (United States)

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.

  3. OOTW Force Design Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  4. PREFACE: Celebrating 100 years of superconductivity: special issue on the iron-based superconductors Celebrating 100 years of superconductivity: special issue on the iron-based superconductors

    Science.gov (United States)

    Crabtree, George; Greene, Laura; Johnson, Peter

    2011-12-01

    In honor of this year's 100th anniversary of the discovery of superconductivity, this special issue of Reports on Progress in Physics is a dedicated issue to the 'iron-based superconductors'—a new class of high-temperature superconductors that were discovered in 2008. This is the first time the journal has generated a 'theme issue', and we provide this to the community to provide a 'snapshot' of the present status, both for researchers working in this fast-paced field, and for the general physics community. Reports on Progress in Physics publishes three classes of articles—comprehensive full Review Articles, Key Issues Reviews and, most recently, Reports on Progress articles that recount the current status of a rapidly evolving field, befitting of the articles in this special issue. It has been an exciting year for superconductivity—there have been numerous celebrations for this centenary recounting the fascinating history of this field, from seven Nobel prizes to life-saving discoveries that brought us medically useful magnetic resonance imaging. The discovery of a completely new class of high-temperature superconductors, whose mechanism remains as elusive as the cuprates discovered in 1986, has injected a new vitality into this field, and this year those new to the field were provided with the opportunity of interacting with those who have enjoyed a long history in superconductivity. Furthermore, as high-density current carriers with little or no power loss, high-temperature superconductors offer unique solutions to fundamental grid challenges of the 21st century and hold great promise in addressing our global energy challenges. The complexity and promise of these materials has caused our community to more freely share our ideas and results than ever before, and it is gratifying to see how we have grown into an enthusiastic global network to advance the field. 
This invited collection is true to this agenda and we are delighted to have received contributions from many of the world leaders for an initiative that is designed to benefit both newcomers and established researchers in superconductivity.

  5. Computational social networks tools, perspectives and applications

    CERN Document Server

    Abraham, Ajith

    2012-01-01

    Provides the latest advances in computational social networks, and illustrates how organizations can gain a competitive advantage by applying these ideas in real-world scenarios Presents a specific focus on practical tools and applications Provides experience reports, survey articles, and intelligence techniques and theories relating to specific problems in network technology

  6. A cross-species alignment tool (CAT)

    DEFF Research Database (Denmark)

    Li, Heng; Guan, Liang; Liu, Tao;

    2007-01-01

    sensitive methods which are usually applied in aligning inter-species sequences. RESULTS: Here we present a new algorithm called CAT (for Cross-species Alignment Tool). It is designed to align mRNA sequences to mammalian-sized genomes. CAT is implemented using C scripts and is freely available on the web...

  7. Applied partial differential equations

    CERN Document Server

    Logan, J David

    2015-01-01

    This text presents the standard material usually covered in a one-semester, undergraduate course on boundary value problems and PDEs.  Emphasis is placed on motivation, concepts, methods, and interpretation, rather than on formal theory. The concise treatment of the subject is maintained in this third edition covering all the major ideas: the wave equation, the diffusion equation, the Laplace equation, and the advection equation on bounded and unbounded domains. Methods include eigenfunction expansions, integral transforms, and characteristics. In this third edition, the text remains intimately tied to applications in heat transfer, wave motion, biological systems, and a variety of other topics in pure and applied science. The text offers flexibility to instructors who, for example, may wish to insert topics from biology or numerical methods at any time in the course. The exposition is presented in a friendly, easy-to-read style, with mathematical ideas motivated from physical problems. Many exercises and worked e...
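
One of the methods this abstract names, eigenfunction expansion, lends itself to a short numerical sketch (not taken from the book): solving the diffusion equation u_t = k·u_xx on [0, L] with zero boundary values by computing Fourier sine coefficients with a trapezoid rule. The initial profile f(x) = x(L − x) is an illustrative choice.

```python
import math

def sine_coeffs(f, L, n_terms, n_quad=1000):
    """Fourier sine coefficients b_n = (2/L) * integral_0^L f(x) sin(n*pi*x/L) dx,
    approximated with the trapezoid rule."""
    h = L / n_quad
    coeffs = []
    for n in range(1, n_terms + 1):
        s = 0.0
        for i in range(n_quad + 1):
            x = i * h
            w = 0.5 if i in (0, n_quad) else 1.0
            s += w * f(x) * math.sin(n * math.pi * x / L)
        coeffs.append(2.0 / L * s * h)
    return coeffs

def heat_solution(x, t, coeffs, L, k):
    """u(x, t) = sum_n b_n * exp(-k * (n*pi/L)^2 * t) * sin(n*pi*x/L)."""
    return sum(
        b * math.exp(-k * (n * math.pi / L) ** 2 * t) * math.sin(n * math.pi * x / L)
        for n, b in enumerate(coeffs, start=1)
    )

L, k = 1.0, 0.1
b = sine_coeffs(lambda x: x * (L - x), L, n_terms=25)
u0 = heat_solution(0.5, 0.0, b, L, k)   # reproduces f(0.5) = 0.25 at t = 0
u1 = heat_solution(0.5, 5.0, b, L, k)   # much smaller: the heat has diffused away
print(round(u0, 3), u1 < u0)
```

Each mode decays at its own exponential rate, which is the essential content of the expansion method for the diffusion equation.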

  8. Applied number theory

    CERN Document Server

    Niederreiter, Harald

    2015-01-01

    This textbook effectively builds a bridge from basic number theory to recent advances in applied number theory. It presents the first unified account of the four major areas of application where number theory plays a fundamental role, namely cryptography, coding theory, quasi-Monte Carlo methods, and pseudorandom number generation, allowing the authors to delineate the manifold links and interrelations between these areas.  Number theory, which Carl-Friedrich Gauss famously dubbed the queen of mathematics, has always been considered a very beautiful field of mathematics, producing lovely results and elegant proofs. While only very few real-life applications were known in the past, today number theory can be found in everyday life: in supermarket bar code scanners, in our cars’ GPS systems, in online banking, etc.  Starting with a brief introductory course on number theory in Chapter 1, which makes the book more accessible for undergraduates, the authors describe the four main application areas in Chapters...
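
As a concrete taste of the cryptography application area the abstract mentions, here is a toy RSA-style sketch built on modular exponentiation and the extended Euclidean algorithm; the primes and exponents are illustrative choices, not examples from the book. Real keys use primes hundreds of digits long.

```python
def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    """Modular inverse of a mod m, if it exists."""
    g, x, _ = egcd(a, m)
    if g != 1:
        raise ValueError("no modular inverse")
    return x % m

p, q = 61, 53                 # toy primes
n, phi = p * q, (p - 1) * (q - 1)
e = 17                        # public exponent, coprime to phi
d = modinv(e, phi)            # private exponent

msg = 42
cipher = pow(msg, e, n)       # encrypt: m^e mod n
plain = pow(cipher, d, n)     # decrypt: c^d mod n
print(plain)                  # recovers 42
```

Python's three-argument `pow` performs the fast modular exponentiation that makes such schemes practical.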

  9. Applied statistical thermodynamics

    CERN Document Server

    Lucas, Klaus

    1991-01-01

    The book guides the reader from the foundations of statistical thermodynamics including the theory of intermolecular forces to modern computer-aided applications in chemical engineering and physical chemistry. The approach is new. The foundations of quantum and statistical mechanics are presented in a simple way and their applications to the prediction of fluid phase behavior of real systems are demonstrated. A particular effort is made to introduce the reader to explicit formulations of intermolecular interaction models and to show how these models influence the properties of fluid systems. The established methods of statistical mechanics - computer simulation, perturbation theory, and numerical integration - are discussed in a style appropriate for newcomers and are extensively applied. Numerous worked examples illustrate how practical calculations should be carried out.

  10. New tools for learning.

    Science.gov (United States)

    Dickinson, D

    1999-01-01

    more often to collaborate on creating new knowledge as well as mastering the basics. As technology becomes more ubiquitous, there is growing recognition of the importance of the arts in humanizing the curriculum. "More high-tech, more need for high-touch" is becoming the by-word of many schools. They recognize that the arts are not only culturally important and civilizing influences, but they can facilitate the learning of almost any subject. I believe that these four concepts--the plasticity of the brain, the modifiability of intelligence, the use of technology as a powerful new tool for learning, and the renaissance of the arts in education--have major implications specifically for educational systems and generally for the future of our world. In this time of rapid change, leading-edge educational systems are equipping people with the ability to learn, unlearn, and relearn continually. They are giving students meaningful opportunities to apply what they have learned in order to turn information into knowledge. And--of critical importance if any of this is to lead to a healthy future--they are helping students to learn to use knowledge responsibly, ethically, and with integrity. Furthermore, they are involving students in experiences that develop compassion and altruism in the process of their education. Our complex world urgently needs more people who have developed their fullest potential in mind, body, and spirit.

  11. Applied physiology of cycling.

    Science.gov (United States)

    Faria, I E

    1984-01-01

    Historically, the bicycle has evolved through the stages of a machine for efficient human transportation, a toy for children, a finely-tuned racing machine, and a tool for physical fitness development, maintenance and testing. Recently, major strides have been made in the aerodynamic design of the bicycle. These innovations have resulted in new land speed records for human powered machines. Performance in cycling is affected by a variety of factors, including aerobic and anaerobic capacity, muscular strength and endurance, and body composition. Bicycle races range from a 200m sprint to approximately 5000km. This vast range of competitive racing requires special attention to the principle of specificity of training. The physiological demands of cycling have been examined through the use of bicycle ergometers, rollers, cycling trainers, treadmill cycling, high speed photography, computer graphics, strain gauges, electromyography, wind tunnels, muscle biopsy, and body composition analysis. These techniques have been useful in providing definitive data for the development of a work/performance profile of the cyclist. Research evidence strongly suggests that when measuring the cyclist's aerobic or anaerobic capacity, a cycling protocol employing a high pedalling rpm should be used. The research bicycle should be modified to resemble a racing bicycle and the cyclist should wear cycling shoes. Prolonged cycling requires special nutritional considerations. Ingestion of carbohydrates, in solid form and carefully timed, influences performance. Caffeine appears to enhance lipid metabolism. Injuries, particularly knee problems which are prevalent among cyclists, may be avoided through the use of proper gearing and orthotics. Air pollution has been shown to impair physical performance. When pollution levels are high, training should be altered or curtailed. Effective training programmes simulate competitive conditions. Short and long interval training, blended with long

  12. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer-Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of the APT programming language for the control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  13. Benchmarking expert system tools

    Science.gov (United States)

    Riley, Gary

    1988-01-01

    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object oriented, frame based expert system tool. The benchmarks used for testing are studied.
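
For readers unfamiliar with the tools being benchmarked, a minimal sketch of the forward-chaining match-fire cycle that production systems such as CLIPS and OPS5 implement. This naive loop is an illustration only: real engines use the Rete algorithm to avoid re-matching every rule on every cycle, and the example rules are invented.

```python
def run(facts, rules):
    """Fire rules until no rule adds a new fact; returns the final fact set.
    Each rule is (set_of_condition_facts, conclusion_fact)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    ({"has-feathers"}, "is-bird"),
    ({"is-bird", "can-fly"}, "can-migrate"),
]
print(run({"has-feathers", "can-fly"}, rules))
```

Benchmarks of such tools typically measure how the match step scales as the numbers of rules and facts grow, which is exactly where Rete-style engines pay off.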

  14. Applied Linguistics and the "Annual Review of Applied Linguistics."

    Science.gov (United States)

    Kaplan, Robert B.; Grabe, William

    2000-01-01

    Examines the complexities and differences involved in granting disciplinary status to the role of applied linguistics, discusses the role of the "Annual Review of Applied Linguistics" as a contributor to the development of applied linguistics, and highlights a set of publications for the future of applied linguistics. (Author/VWL)

  15. Wound assessment tools and nurses' needs: an evaluation study.

    Science.gov (United States)

    Greatrex-White, Sheila; Moxey, Helen

    2015-06-01

    The purpose of this study was to ascertain how well different wound assessment tools meet the needs of nurses in carrying out general wound assessment and whether current tools are fit for purpose. The methodology employed was evaluation research. In order to conduct the evaluation, a literature review was undertaken to identify the criteria of an optimal wound assessment tool which would meet nurses' needs. Several freely available wound assessment tools were selected based on predetermined inclusion and exclusion criteria and an audit tool was developed to evaluate the selected tools based on how well they met the criteria of the optimal wound assessment tool. The results provide a measure of how well the selected wound assessment tools meet the criteria of the optimal wound assessment tool. No tool was identified which fulfilled all the criteria, but two (the Applied Wound Management tool and the National Wound Assessment Form) met the most criteria of the optimal tool and were therefore considered to best meet nurses' needs in wound assessment. The study provides a mechanism for the appraisal of wound assessment tools using a set of optimal criteria which could aid practitioners in their search for the best wound assessment tool.

  16. Applied hydraulic transients

    CERN Document Server

    Chaudhry, M Hanif

    2014-01-01

    This book covers hydraulic transients in a comprehensive and systematic manner from introduction to advanced level and presents various methods of analysis for computer solution. The field of application of the book is very broad and diverse and covers areas such as hydroelectric projects, pumped storage schemes, water-supply systems, cooling-water systems, oil pipelines and industrial piping systems. Strong emphasis is given to practical applications, including several case studies, problems of applied nature, and design criteria. This will help design engineers and introduce students to real-life projects. This book also: ·         Presents modern methods of analysis suitable for computer analysis, such as the method of characteristics, explicit and implicit finite-difference methods and matrix methods ·         Includes case studies of actual projects ·         Provides extensive and complete treatment of governed hydraulic turbines ·         Presents design charts, desi...

  17. Academic training: Applied superconductivity

    CERN Multimedia

    2007-01-01

    LECTURE SERIES 17, 18, 19 January from 11.00 to 12.00 hrs Council Room, Bldg 503 Applied Superconductivity: Theory, Superconducting Materials and Applications E. PALMIERI/INFN, Padova, Italy When hearing about persistent currents recirculating for several years in a superconducting loop without any appreciable decay, one realizes that we are dealing with the phenomenon in nature closest to perpetual motion. Zero resistivity and perfect diamagnetism in mercury at 4.2 K, the discovery over 75 years of several hundred superconducting materials, the revolution of "liquid-nitrogen superconductivity", the discovery of a simple binary compound becoming superconducting at 40 K and the subsequent re-exploration of the already known superconducting materials: Nature discloses its intimate secrets drop by drop, and nobody can exclude that the last surprise is still to come. After an overview of the phenomenology and basic theory of superconductivity, the lectures for this a...

  18. Applying evolutionary anthropology.

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution.

  19. Tools and Behavioral Abstraction: A Direction for Software Engineering

    Science.gov (United States)

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  20. Smart Growth Tools

    Science.gov (United States)

    This page describes a variety of tools useful to federal, state, tribal, regional, and local government staff and elected officials; community leaders; developers; and others interested in smart growth development.

  1. Neighborhood Mapping Tool

    Data.gov (United States)

    Department of Housing and Urban Development — This tool assists the public and Choice Neighborhoods applicants to prepare data to submit with their grant application by allowing applicants to draw the exact...

  2. TENCompetence tool demonstration

    NARCIS (Netherlands)

    Kluijfhout, Eric

    2010-01-01

    Kluijfhout, E. (2009). TENCompetence tool demonstration. Presented at Zorgacademie Parkstad (Health Academy Parkstad), Limburg Leisure Academy, Life Long Learning Limburg and a number of regional educational institutions. May 18, 2009, Heerlen, The Netherlands: Open University of the Netherlands, T

  3. Tools and their uses

    CERN Document Server

    1973-01-01

    Teaches names, general uses, and correct operation of all basic hand and power tools, fasteners, and measuring devices you are likely to need. Also, grinding, metal cutting, soldering, and more. 329 illustrations.

  4. NWRS Survey Prioritization Tool

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — A SMART Tool and User's Guide for aiding NWRS Station staff when prioritizing their surveys for an Inventory and Monitoring Plan. This guide describes a process and...

  5. Smart tool holder

    Science.gov (United States)

    Day, Robert Dean; Foreman, Larry R.; Hatch, Douglas J.; Meadows, Mark S.

    1998-01-01

    There is provided an apparatus for machining surfaces to accuracies within the nanometer range by use of electrical current flow through the contact of the cutting tool with the workpiece as a feedback signal to control depth of cut.
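
The feedback principle the patent abstract describes, contact current used to control depth of cut, can be sketched as a simple proportional loop. The linear current model, gain, and units below are hypothetical illustrations, not taken from the patent.

```python
def control_depth(target_current, read_current, adjust_depth, steps=200, gain=0.01):
    """Nudge the depth of cut until the measured contact current matches the setpoint."""
    for _ in range(steps):
        error = target_current - read_current()
        adjust_depth(gain * error)
    return read_current()

# Hypothetical plant: contact current grows linearly with engagement depth.
state = {"depth": 0.0}
read = lambda: 5.0 * state["depth"]                       # mA per micrometre, invented
adjust = lambda d: state.update(depth=state["depth"] + d)

final = control_depth(target_current=2.0, read_current=read, adjust_depth=adjust)
print(round(final, 3))   # converges toward the 2.0 mA setpoint
```

The point of the closed loop is that the cut tracks the electrical setpoint rather than a pre-programmed depth, which is how nanometer-range accuracy becomes feasible despite tool and workpiece variation.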

  6. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  7. Mapping Medicare Disparities Tool

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Minority Health has designed an interactive map, the Mapping Medicare Disparities Tool, to identify areas of disparities between subgroups of...

  8. ATO Resource Tool -

    Data.gov (United States)

    Department of Transportation — Cru-X/ART is a shift management tool designed for?use by operational employees in Air Traffic Facilities.? Cru-X/ART is used for shift scheduling, shift sign in/out,...

  9. Chatter and machine tools

    CERN Document Server

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  10. Chemical Data Access Tool

    Data.gov (United States)

    U.S. Environmental Protection Agency — This tool is intended to aid individuals interested in learning more about chemicals that are manufactured or imported into the United States. Health and safety...

  11. Recovery Action Mapping Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Recovery Action Mapping Tool is a web map that allows users to visually interact with and query actions that were developed to recover species listed under the...

  12. Cash Reconciliation Tool

    Data.gov (United States)

    US Agency for International Development — CART is a cash reconciliation tool that allows users to reconcile Agency cash disbursements with Treasury fund balances; track open unreconciled items; and create an...

  13. Friction stir welding tool

    Science.gov (United States)

    Tolle; Charles R. , Clark; Denis E. , Barnes; Timothy A.

    2008-04-15

    A friction stir welding tool is described and which includes a shank portion; a shoulder portion which is releasably engageable with the shank portion; and a pin which is releasably engageable with the shoulder portion.

  14. Autism Teaching Tool

    CERN Multimedia

    2014-01-01

    CERN pattern recognition technologies transferred to a learning tool for autistic children. The state-of-the-art pattern recognition technology developed at CERN for High Energy Physics is transferred to the computer vision domain and is used to develop a new

  15. Quality tools - systematic use in process industry

    Directory of Open Access Journals (Sweden)

    M. Sokovic

    2007-11-01

    Purpose: The paper deals with one segment of broader research into the universality and systematicness of the application of the seven basic quality tools (7QC tools). The research was carried out in different areas that include a power plant, the process industry, government, and health and tourism services. The aim of the research was to show, through practical examples, that there is a real possibility of applying the 7QC tools. Furthermore, the research had to show to what extent the selected tools are in use and what the reasons are for avoiding their broader application. A simple example of the successful application of one of the quality tools is shown for a selected company in the process industry. Design/methodology/approach: In the research, each of the 7QC tools was tested and its applicability within the selected business was shown. The systematic approach is explained on the example of a selected company in the process industry which is ISO 9000:2000 certified. Findings: The research has shown that systematic application of all of the 7QC tools is possible within a company's overall quality management system. The research has also shown that the 7QC tools are not as widespread as expected, although they are quite simple to apply and easy to interpret. Further investigation should be carried out to find the reasons for this situation and, accordingly, to define appropriate corrective actions to eliminate or minimize the problem. Practical implications: The partners in the research were from different areas of business, as stated above. Each partner was shown how the quality tools can be applied and how a formal quality management system can help in day-to-day process improvement. Originality/value: The main value of the conducted research is, in the authors' opinion, to show each partner in the research that all of the 7QC tools can be successfully applied within their own business. Also, the correlation between formal quality
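
One of the 7QC tools the paper covers, the Pareto chart, reduces to a simple computation: rank defect categories and find the "vital few" that account for most occurrences. A minimal sketch with invented defect counts (not data from the paper):

```python
def pareto(counts, threshold=0.80):
    """Return the smallest set of categories, in descending order of frequency,
    that together cover at least `threshold` of the total count."""
    total = sum(counts.values())
    vital, cum = [], 0
    for cat, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
        vital.append(cat)
        cum += n
        if cum / total >= threshold:
            break
    return vital

defects = {"leaks": 48, "scratches": 27, "misalignment": 14, "wrong-label": 7, "other": 4}
print(pareto(defects))   # the "vital few" covering 80% of defects
```

In practice the same ranked-and-cumulated data is plotted as bars with a cumulative-percentage line, but the prioritization decision comes from exactly this calculation.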

  16. Tools used for hand deburring

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, L.K.

    1981-03-01

    This guide is designed to help in quick identification of those tools most commonly used to deburr hand size or smaller parts. Photographs and textual descriptions are used to provide rapid yet detailed information. The data presented include the Bendix Kansas City Division coded tool number, tool description, tool crib in which the tool can be found, the maximum and minimum inventory requirements, the cost of each tool, and the number of the illustration that shows the tool.

  17. Manual bamboo cutting tool.

    Science.gov (United States)

    Bezerra, Mariana Pereira; Correia, Walter Franklin Marques; da Costa Campos, Fabio Ferreira

    2012-01-01

    The paper presents the development of a cutting tool guide, specifically for the harvest of bamboo. The development was made based on precepts of eco-design and ergonomics, for prioritizing the physical health of the operator and the maintenance of the environment, as well as meet specific requirements of bamboo. The main goal is to spread the use of bamboo as construction material, handicrafts, among others, from a handy, easy assembly and material available tool.

  18. Stochastic tools in turbulence

    CERN Document Server

    Lumley, John L

    2012-01-01

    Stochastic Tools in Turbulence discusses the available mathematical tools to describe stochastic vector fields to solve problems related to these fields. The book deals with the needs of turbulence in relation to stochastic vector fields, particularly, on three-dimensional aspects, linear problems, and stochastic model building. The text describes probability distributions and densities, including Lebesgue integration, conditional probabilities, conditional expectations, statistical independence, lack of correlation. The book also explains the significance of the moments, the properties of the

  19. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...
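
At the statistical end of the uncertainty-modeling spectrum the book surveys, the workhorse technique is Monte Carlo propagation: sample the uncertain inputs, push each sample through the model, and summarize the output distribution. A minimal sketch; the model and input distributions below are invented for illustration.

```python
import random
import statistics

def propagate(model, samplers, n=20000, seed=1):
    """Propagate input uncertainty through `model` by Monte Carlo sampling.
    `samplers` is a list of callables, each drawing one input from its distribution."""
    rng = random.Random(seed)
    outputs = [model(*(s(rng) for s in samplers)) for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Invented example: a deflection-like quantity y = load * length**3.
model = lambda load, length: load * length ** 3
samplers = [
    lambda rng: rng.gauss(10.0, 0.5),   # load: mean 10, sd 0.5
    lambda rng: rng.gauss(2.0, 0.05),   # length: mean 2, sd 0.05
]
mean, sd = propagate(model, samplers)
print(round(mean, 1), round(sd, 1))
```

The same sampling skeleton accommodates the non-probabilistic alternatives the book discusses (e.g. interval or fuzzy inputs) by swapping what the samplers draw from.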

  20. Applied Historical Astronomy

    Science.gov (United States)

    Stephenson, F. Richard

    2014-01-01

    F. Richard Stephenson has spent most of his research career -- spanning more than 45 years -- studying various aspects of Applied Historical Astronomy. The aim of this interdisciplinary subject is the application of historical astronomical records to the investigation of problems in modern astronomy and geophysics. Stephenson has almost exclusively concentrated on pre-telescopic records, especially those preserved from ancient and medieval times -- the earliest reliable observations dating from around 700 BC. The records which have mainly interested him are of eclipses (both solar and lunar), supernovae, sunspots and aurorae, and Halley's Comet. The main sources of early astronomical data are fourfold: records from ancient and medieval East Asia (China, together with Korea and Japan); ancient Babylon; ancient and medieval Europe; and the medieval Arab world. A feature of Stephenson's research is the direct consultation of early astronomical texts in their original language -- either working unaided or with the help of colleagues. He has also developed a variety of techniques to help interpret the various observations. Most pre-telescopic observations are very crude by present-day standards. In addition, early motives for skywatching were more often astrological rather than scientific. Despite these drawbacks, ancient and medieval astronomical records have two remarkable advantages over modern data. Firstly, they can enable the investigation of long-term trends (e.g. in the terrestrial rate of rotation), which in the relatively short period covered by telescopic observations are obscured by short-term fluctuations. Secondly, over the lengthy time-scale which they cover, significant numbers of very rare events (such as Galactic supernovae) were reported, which have few -- if any -- counterparts in the telescopic record. In his various researches, Stephenson has mainly focused his attention on two specific topics. These are: (i) long-term changes in the Earth's rate of

  1. Vygotsky in applied neuropsychology

    Directory of Open Access Journals (Sweden)

    Glozman J. M.

    2016-12-01

    The aims of this paper are: (1) to show the role of clinical experience in the theoretical contributions of L.S. Vygotsky, and (2) to analyze the development of these theories in contemporary applied neuropsychology. An analysis of disturbances of mental functioning is impossible without a systemic approach to the evidence observed. Therefore, medical psychology is fundamental for forming a systemic approach to psychology. The assessment of neurological patients at the neurological hospital of Moscow University permitted L.S. Vygotsky to create, in collaboration with A.R. Luria, the theory of the systemic dynamic localization of higher mental functions and their relationship to cultural conditions. In his studies of patients with Parkinson's disease, Vygotsky also set out three steps of systemic development: first interpsychological, then extrapsychological, then intrapsychological. In the late 1920s, L.S. Vygotsky and A.R. Luria created a program to compensate for the motor subcortical disturbances in Parkinson's disease (PD) through cortical (visual) mediation of movements. We propose to distinguish the objective mediating factors — like teaching techniques and modalities — from subjective mediating factors, like the individual's internal representation of his/her own disease. The cultural-historical approach in contemporary neuropsychology compels neuropsychologists to re-analyze and re-interpret the classic neuropsychological syndromes; to develop new assessment procedures more in accordance with the patient's conditions of life; and to reconsider the concept of the social brain as a social and cultural determinant and regulator of brain functioning. L.S. Vygotsky and A.R. Luria proved that a defect interferes with a child's appropriation of his/her culture, but cultural means can help the child overcome the defect. In this way, the cultural-historical approach became, and still is, a methodological basis for remedial education.

  2. Essays in applied microeconomics

    Science.gov (United States)

    Wang, Xiaoting

    In this dissertation I use microeconomic theory to study firms' behavior. Chapter One introduces the motivations and main findings of this dissertation. Chapter Two studies the issue of information provision through advertisement when markets are segmented and consumers' price information is incomplete. Firms compete in prices and advertising strategies for consumers with transportation costs. High advertising costs contribute to market segmentation. Low advertising costs promote price competition among firms and improve consumer welfare. Chapter Three also investigates market power as a result of consumers' switching costs. A potential entrant can offer a new product bundled with an existing product to compensate consumers for their switching cost. If the primary market is competitive, bundling simply plays the role of price discrimination, and it does not dominate unbundled sales in the process of entry. If the entrant has market power in the primary market, then bundling also plays the role of leveraging market power, and it dominates unbundled sales. The market for electric power generation has been opened to competition in recent years. Chapter Four looks at issues involved in the deregulated electricity market. By comparing the performance of the competitive market with the social optimum, we identify the conditions under which market equilibrium generates socially efficient levels of electric power. Chapters Two through Four investigate the strategic behavior among firms. Chapter Five studies the interaction between firms and unemployed workers in a frictional labor market. We set up an asymmetric job auction model, where two types of workers apply for two types of job openings by bidding in auctions and firms hire the applicant offering them the most profits. The job auction model internalizes the determination of the share of surplus from a match, and therefore endogenously generates incentives for an efficient division of the matching surplus. Microeconomic

  3. Applied large eddy simulation.

    Science.gov (United States)

    Tucker, Paul G; Lardeau, Sylvain

    2009-07-28

    Large eddy simulation (LES) is now seen more and more as a viable alternative to current industrial practice, usually based on problem-specific Reynolds-averaged Navier-Stokes (RANS) methods. Access to detailed flow physics is attractive to industry, especially in an environment in which computer modelling is bound to play an ever increasing role. However, the improvement in accuracy and flow detail comes at substantial cost, which has so far prevented wider industrial use of LES. The purpose of the applied LES discussion meeting was to address questions regarding what is achievable and what is not, given the current technology and knowledge, for an industrial practitioner who is interested in using LES. The use of LES was explored in an application-centred context between diverse fields. The general flow-governing equation form was explored along with various LES models. The errors occurring in LES were analysed. Also, the hybridization of RANS and LES was considered. The importance of modelling relative to boundary conditions, problem definition and other more mundane aspects was examined. It was to an extent concluded that for LES to make most rapid industrial impact, pragmatic hybrid use of LES, implicit LES and RANS elements will probably be needed. Added to this, further industrial-sector-specific model parametrizations will be required, with clear thought given to the key target design parameter(s). The combination of good numerical modelling expertise, a sound understanding of turbulence, along with artistry, pragmatism and the use of recent developments in computer science should dramatically add impetus to the industrial uptake of LES. In the light of the numerous technical challenges that remain it appears that for some time to come LES will have echoes of the high levels of technical knowledge required for safe use of RANS but with much greater fidelity.

  4. Essays in Applied Microeconomics

    Science.gov (United States)

    Ge, Qi

    This dissertation consists of three self-contained applied microeconomics essays on topics related to behavioral economics and industrial organization. Chapter 1 studies how sentiment resulting from sports event outcomes affects consumers' tipping behavior in the presence of social norms. I formulate a model of tipping behavior that captures consumer sentiment following a reference-dependent preference framework and empirically test its relevance using the game outcomes of the NBA and the trip and tipping data on New York City taxicabs. While I find that consumers' tipping behavior responds to unexpected wins and losses of their home team, particularly in close game outcomes, I do not find evidence for loss aversion. Coupled with the findings on default tipping, my empirical results on the asymmetric tipping responses suggest that while social norms may dominate loss aversion, affect and surprises can result in freedom on the upside of tipping. Chapter 2 utilizes a novel data source of airline entry and exit announcements and examines how incumbent airlines adjust quality provision in response to their competitors' announcements, and the role of timing in such responses. I find no evidence that the incumbents engage in preemptive actions when facing probable entry and exit threats as signaled by the competitors' announcements, in either the short term or the long term. There is, however, evidence supporting their responses to competitors' realized entry or exit. My empirical findings underscore the role of timing in determining preemptive actions and suggest that previous studies may have overestimated how incumbent airlines respond to entry threats. Chapter 3, written in collaboration with Benjamin Ho, investigates the habit formation of consumers' thermostat setting behavior, an often implicitly made decision and yet a key determinant of home energy consumption and expenditures. We utilize a high frequency dataset on household thermostat usage and find that

  5. GIS Technology: Resource and Habitability Assessment Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a one-year project to apply a GIS analysis tool to new orbital data for lunar resource assessment and Martian habitability identification. We used...

  6. Fractal Description of the Shearing-Surface of Tools

    Institute of Scientific and Technical Information of China (English)

    WANG Bing-cheng; JING Chang; REN Zhao-hui; REN Li-yi

    2004-01-01

    In this paper, basic methods are introduced for calculating the fractal dimension of the shearing surface of tools. The fractal dimension of the shearing surface of an experimental sample is obtained and its fractal characteristics are discussed. The fractal method can be applied to identify the types of tools used by burglars and to perform individual identification. New theories and methods are provided for measuring and processing the shearing-surface profiles of tools.
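The paper does not spell out its algorithm, but box counting is the standard way to estimate such a fractal dimension. A minimal sketch (function names and the sample profile are invented for illustration): cover the profile with grids of shrinking box size, count occupied boxes, and fit the slope of log N(eps) against log(1/eps).

```python
import math

def box_count(points, eps):
    """Count the grid boxes of side `eps` occupied by the point set."""
    return len({(int(x // eps), int(y // eps)) for x, y in points})

def fractal_dimension(points, sizes):
    """Estimate the box-counting dimension as the least-squares slope
    of log N(eps) against log(1/eps)."""
    xs = [math.log(1.0 / eps) for eps in sizes]
    ys = [math.log(box_count(points, eps)) for eps in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check: a smooth profile should come out with dimension near 1;
# a rough shearing profile would score noticeably higher.
profile = [(i / 1000.0, i / 1000.0) for i in range(1001)]
print(round(fractal_dimension(profile, [0.1, 0.05, 0.02, 0.01]), 2))
```

On a real tool mark the points would be the digitized surface profile rather than a synthetic line.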

  7. Fault Injection Techniques and Tools

    Science.gov (United States)

    Hsueh, Mei-Chen; Tsai, Timothy K.; Iyer, Ravishankar K.

    1997-01-01

    Dependability evaluation involves the study of failures and errors. The destructive nature of a crash and long error latency make it difficult to identify the causes of failures in the operational environment. It is particularly hard to recreate a failure scenario for a large, complex system. To identify and understand potential failures, we use an experiment-based approach for studying the dependability of a system. Such an approach is applied not only during the conception and design phases, but also during the prototype and operational phases. To take an experiment-based approach, we must first understand a system's architecture, structure, and behavior. Specifically, we need to know its tolerance for faults and failures, including its built-in detection and recovery mechanisms, and we need specific instruments and tools to inject faults, create failures or errors, and monitor their effects.
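A software-implemented fault injector of the kind this experiment-based approach relies on can be sketched in a few lines. The wrapper, rate, and campaign layout below are illustrative only, not taken from the authors' tools:

```python
import random

def inject_faults(func, fault_rate, rng):
    """Wrap `func` so each call fails with probability `fault_rate` --
    a minimal software-implemented fault injector."""
    def wrapper(*args, **kwargs):
        if rng.random() < fault_rate:
            raise RuntimeError("injected fault")
        return func(*args, **kwargs)
    return wrapper

def campaign(func, trials, fault_rate, seed=0):
    """Run an injection campaign and count how often the target's
    error-handling path was exercised."""
    faulty = inject_faults(func, fault_rate, random.Random(seed))
    activated = 0
    for _ in range(trials):
        try:
            faulty()
        except RuntimeError:
            activated += 1  # the fault was activated and observed
    return activated

print(campaign(lambda: "ok", trials=1000, fault_rate=0.1))
```

Real injectors operate at lower levels (registers, memory, messages), but the monitor-and-count structure of a campaign is the same.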

  8. Performance management tools motivate change at the frontlines.

    Science.gov (United States)

    Smith, Christopher; Christiansen, Tanya; Dick, Don; Howden, Jane Squire; Wasylak, Tracy; Werle, Jason

    2014-01-01

    Performance management tools commonly used in business, such as incentives and the balanced scorecard, can be effectively applied in the public healthcare sector to improve quality of care. The province of Alberta applied these tools with the Institute for Health Improvement Learning Collaborative method to accelerate adoption of a clinical care pathway for hip and knee replacements. The results showed measurable improvements in all quality dimensions, including shorter hospital stays and wait times, higher bed utilization, earlier patient ambulation, and better patient outcomes.

  9. Tool Gear Documentation

    Energy Technology Data Exchange (ETDEWEB)

    May, J; Gyllenhaal, J

    2002-04-03

    Tool Gear is designed to allow tool developers to insert instrumentation code into target programs using the DPCL library. This code can gather data and send it back to the Client for display or analysis. Tools can use the Tool Gear client without using the DPCL Collector. Any collector using the right protocols can send data to the Client for display and analysis. However, this document will focus on how to gather data with the DPCL Collector. There are three parts to the task of using Tool Gear to gather data through DPCL: (1) Write the instrumentation code that will be loaded and run in the target program. The code should be in the form of one or more functions, which can pass data structures back to the Client by way of DPCL. The collection of functions is compiled into a library, as described in this report. (2) Write the code that tells the DPCL Collector about the instrumentation and how to forward data back to the Client. (3) Extend the client to accept data from the Collector and display it in a useful way. The rest of this report describes how to carry out each of these steps.

  10. Some Notes About Artificial Intelligence as New Mathematical Tool

    Directory of Open Access Journals (Sweden)

    Angel Garrido

    2010-04-01

    Full Text Available Mathematics is a mere instance of First-Order Predicate Calculus. Therefore it belongs to applied Monotonic Logic. So, we find the limitations of classical logic reasoning and the clear advantages of Fuzzy Logic and many other new and interesting tools. We present here some of the most useful tools of this new field of Mathematics, so-called Artificial Intelligence.
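As a small illustration of the fuzzy-logic tools the article alludes to, here is the textbook triangular membership function with Zadeh's min/max connectives (a standard construction, not code from this article; the "warm"/"hot" sets are invented):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Zadeh's connectives: AND -> min, OR -> max, NOT -> 1 - mu.
def f_and(m1, m2):
    return min(m1, m2)

def f_not(m):
    return 1.0 - m

# Degree to which 22.5 degrees belongs to "warm", and to "warm AND NOT hot":
warm = tri(22.5, 15.0, 22.5, 30.0)
hot = tri(22.5, 25.0, 30.0, 35.0)
print(warm, f_and(warm, f_not(hot)))
```

Unlike classical (monotonic) two-valued logic, membership here varies continuously in [0, 1], which is exactly the departure the article highlights.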

  11. Verified System Development with the AutoFocus Tool Chain

    OpenAIRE

    Maria Spichkova; Florian Hölzl; David Trachtenherz

    2012-01-01

    This work presents a model-based development methodology for verified software systems, as well as tool support for it: an applied AutoFocus tool chain and its basic principles, emphasizing verification of the system under development as well as the check mechanisms we used to raise the level of confidence in the correctness of the implementation of the automatic generators.

  12. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe...... a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...
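To convey what a stochastically branching, reward-annotated workflow means, here is a toy Monte Carlo reading of one; the tasks, costs, and probabilities are invented, and SBAT itself analyses such models formally rather than by simulation:

```python
import random

# A toy workflow in the spirit of BPMN with stochastic branching and
# cost (reward) annotations: task -> (cost, [(probability, next_task)]).
WORKFLOW = {
    "mix":    (2.0, [(1.0, "bake")]),
    "bake":   (5.0, [(0.9, "pack"), (0.1, "rework")]),
    "rework": (3.0, [(1.0, "bake")]),
    "pack":   (1.0, []),
}

def simulate(workflow, start, rng):
    """Follow one trace through the workflow, summing task costs."""
    task, total = start, 0.0
    while True:
        cost, branches = workflow[task]
        total += cost
        if not branches:          # end event reached
            return total
        r, acc = rng.random(), 0.0
        for p, nxt in branches:
            acc += p
            if r < acc:
                task = nxt
                break
        else:
            task = branches[-1][1]   # guard against float rounding

rng = random.Random(1)
mean = sum(simulate(WORKFLOW, "mix", rng) for _ in range(20000)) / 20000
print(round(mean, 1))
```

The estimated mean cost converges on the analytic value (here 8 + 8/9, since each bake loops back with probability 0.1), which is the kind of property a stochastic model checker would compute exactly.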

  13. NASA's Applied Sciences for Water Resources

    Science.gov (United States)

    Doorn, Bradley; Toll, David; Engman, Ted

    2011-01-01

    The Earth Systems Division within NASA has the primary responsibility for the Earth Science Applied Science Program and the objective to accelerate the use of NASA science results in applications to help solve problems important to society and the economy. The primary goal of the Earth Science Applied Science Program is to improve future and current operational systems by infusing them with scientific knowledge of the Earth system gained through space-based observation, assimilation of new observations, and development and deployment of enabling technologies, systems, and capabilities. This paper discusses one of the major problems facing water resources managers, that of having timely and accurate data to drive their decision support tools. It then describes how NASA's science and space-based satellites may be used to overcome this problem. Opportunities for the water resources community to participate in NASA's Water Resources Applications Program are described.

  14. Climate Change and Water Tools

    Science.gov (United States)

    EPA tools and workbooks guide users to mitigate and adapt to climate change impacts. Various tools can help manage risks, others can visualize climate projections in maps. Included are comprehensive tool kits hosted by other federal agencies.

  15. Cataract Surgery Tool

    Science.gov (United States)

    1977-01-01

    The NASA-McGannon cataract surgery tool is a tiny cutter-pump which liquefies and pumps the cataract lens material from the eye. Inserted through a small incision in the cornea, the tool can be used on the hardest cataract lens. The cutter is driven by a turbine which operates at about 200,000 revolutions per minute. Incorporated in the mechanism are two passages for saline solutions, one to maintain constant pressure within the eye, the other for removal of the fragmented lens material and fluids. Three years of effort have produced a design, now being clinically evaluated, with excellent potential for improved cataract surgery. The use of this tool is expected to reduce the patient's hospital stay and recovery period significantly.

  16. New Conceptual Design Tools

    DEFF Research Database (Denmark)

    Pugnale, Alberto; Holst, Malene; Kirkegaard, Poul Henning

    2010-01-01

    This paper aims to discuss recent approaches in using more and more frequently computer tools as supports for the conceptual design phase of the architectural project. The present state-of-the-art about software as conceptual design tool could be summarized in two parallel tendencies. On the one...... hand, the main software houses are trying to introduce powerful and effective user-friendly applications in the world of building designers, that are more and more able to fit their specific requirements; on the other hand, some groups of expert users with a basic programming knowledge seem to deal...... with the problem of software as conceptual design tool by means of 'scripting', in other words by self-developing codes able to solve specific and well defined design problems. Starting with a brief historical recall and the discussion of relevant researches and practical experiences, this paper investigates...

  17. Tool nimega Sacco

    Index Scriptorium Estoniae

    1998-01-01

    The Zanotta bag chair "Sacco", designed in 1968 by P. Gatti, C. Paolini and F. Teodoro, has turned thirty. "Sacco" is a bag filled with polystyrene granules. Zanotta's inflatable chair "Blow" (1967, Scholari, D'Urbino, Lomazzi, De Pas) also attracted attention. E. Lucie-Smith on them. The exhibition "Legends and Symbols of 1968" at the Düsseldorf Kunstmuseum is dedicated to the year 1968, displaying nearly 500 objects and several reconstructed interiors.

  18. General purpose MDE tools

    Directory of Open Access Journals (Sweden)

    Juan Manuel Cueva Lovelle

    2008-12-01

    Full Text Available The MDE paradigm promises to release developers from writing code. The basis of this paradigm consists in working at such a level of abstraction that it becomes easier for analysts to detail the project to be undertaken. Using the model described by analysts, software tools will do the rest of the task, generating software that will comply with the customer's defined requirements. The purpose of this study is to compare the general purpose tools available right now that make it possible to put the principles of this paradigm into practice, aimed at generating a wide variety of applications composed of interactive multimedia and artificial intelligence components.

  19. Applied Meteorology Unit (AMU) Quarterly Report - Fourth Quarter FY-09

    Science.gov (United States)

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Wheeler, Mark

    2009-01-01

    This report summarizes the Applied Meteorology Unit (AMU) activities for the fourth quarter of Fiscal Year 2009 (July - September 2009). Task reports include: (1) Peak Wind Tool for User Launch Commit Criteria (LCC), (2) Objective Lightning Probability Tool, Phase III, (3) Peak Wind Tool for General Forecasting, Phase II, (4) Update and Maintain Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS), (5) Verify MesoNAM Performance, and (6) Develop a Graphical User Interface to update selected parameters for the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT).

  20. Applied Meteorology Unit (AMU) Quarterly Report Third Quarter FY-08

    Science.gov (United States)

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Dreher, Joseph

    2008-01-01

    This report summarizes the Applied Meteorology Unit (AMU) activities for the third quarter of Fiscal Year 2008 (April - June 2008). Tasks reported on are: Peak Wind Tool for User Launch Commit Criteria (LCC), Anvil Forecast Tool in AWIPS Phase II, Completion of the Edwards Air Force Base (EAFB) Statistical Guidance Wind Tool, Volume Averaged Height Integrated Radar Reflectivity (VAHIRR), Impact of Local Sensors, Radar Scan Strategies for the PAFB WSR-74C Replacement, VAHIRR Cost Benefit Analysis, and WRF Wind Sensitivity Study at Edwards Air Force Base.

  1. The Virtual Physiological Human ToolKit.

    Science.gov (United States)

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  2. Applied mathematics in the world of complexity

    Directory of Open Access Journals (Sweden)

    Kazaryan V. P.

    2016-01-01

    Full Text Available In modern mathematics the value of applied research is increasing; for this reason, modern mathematics is focused from the outset on resolving situations that actually arise, in this respect on a par with other disciplines. Using a new tool - computer systems - applied mathematics has turned to a new object: not to nature, not to society or the practical activity of man. In fact, the subject of modern applied mathematics is a problem situation for the actor-person, and the study is aimed at solving material and practical problems. Developing the laws of its internal logic, mathematical science today covers various practices and makes its own demands of them, subjugating and organizing them in its own way as a subject of study. This mathematical activity is influenced by such external factors as the condition of the actor-person, the capacity of computers together with service professionals, and the characteristics of the object; these external circumstances act as constituting the essence of the case and not as minor issues. Modern applied mathematics is becoming more and more similar to the engineering sciences, and the importance of the mathematical modelling problem is rising. In these studies the author draws on the broader context of modern science, including works of philosophy and methodology, as well as of mathematicians and specialists in the field of the natural sciences.

  3. Applied Ethics in Nowadays Society

    OpenAIRE

    Tomita CIULEI

    2013-01-01

    This special issue is dedicated to Applied Ethics in Nowadays Society, and falls in the field of social sciences and humanities, hosting both theoretical approaches and empirical research in various areas of applied ethics. Applied ethics analyzes a series of concrete moral situations in social or professional practice in order to make and adopt decisions. In the field of applied ethics are integrated medical ethics, legal ethics, media ethics, professional ethics, environmental ethic...

  4. A Framework for IT-based Design Tools

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    The thesis presents a new approach to developing design tools that can be integrated, by presenting a framework consisting of a set of guidelines for design tools, an integration and communication scheme, and a set of design tool schemes. This framework has been based on analysis of requirements to integrated design environments and analysis of engineering design and design problem-solving methods. The developed framework has been tested by applying it to the development of prototype design tools for realistic design scenarios.

  5. Morphometrics applied to medical entomology.

    Science.gov (United States)

    Dujardin, Jean-Pierre

    2008-12-01

    Morphometrics underwent a revolution more than one decade ago. In the modern morphometrics, the estimate of size is now contained in a single variable reflecting variation in many directions, as many as there are landmarks under study, and shape is defined as their relative positions after correcting for size, position and orientation. With these informative data, and the corresponding software freely available to conduct complex analyses, significant biological and epidemiological features can be quantified more accurately. We discuss the evolutionary significance of the environmental impact on metric variability, mentioning the importance of concepts like genetic assimilation, genetic accommodation, and epigenetics. We provide examples of measuring the effect of selection on metric variation by comparing (unpublished) Qst values with corresponding (published) Fst. The primary needs of medical entomologists are to distinguish species, especially cryptic species, and to detect them where they are not expected. We explain how geometric morphometrics could apply to these questions, and where there are deficiencies preventing the approach from being utilized at its maximum potential. Medical entomologists in connection with control programs aim to identify isolated populations where the risk of reinfestation after treatment would be low ("biogeographical islands"). Identifying them can be obtained from estimating the number of migrants per generation. Direct assessment of movement remains the most valid approach, but it scores active movement only. Genetic methods estimating gene flow levels among interbreeding populations are commonly used, but gene flow does not necessarily mean the current flow of migrants. Methods using the morphometric variation are neither suited to evaluate gene flow, nor are they adapted to estimate the flow of migrants. They may provide, however, the information needed to create a preliminary map pointing to relevant areas where one could
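The "correcting for size, position and orientation" step described above is ordinary Procrustes superimposition. A minimal 2-D version (function names and the triangle landmarks are invented for illustration):

```python
import math

def centre_and_scale(shape):
    """Remove position (centroid) and size (centroid size) from a
    2-D landmark configuration."""
    n = len(shape)
    cx = sum(x for x, _ in shape) / n
    cy = sum(y for _, y in shape) / n
    pts = [(x - cx, y - cy) for x, y in shape]
    s = math.sqrt(sum(x * x + y * y for x, y in pts))
    return [(x / s, y / s) for x, y in pts]

def procrustes_distance(a, b):
    """Superimpose b onto a: after centring and scaling, rotate b by
    the optimal angle and report the residual root-sum-of-squares
    distance (0 means the two configurations have identical shape)."""
    a, b = centre_and_scale(a), centre_and_scale(b)
    num = sum(ya * xb - xa * yb for (xa, ya), (xb, yb) in zip(a, b))
    den = sum(xa * xb + ya * yb for (xa, ya), (xb, yb) in zip(a, b))
    t = math.atan2(num, den)   # closed-form optimal rotation in 2-D
    rb = [(x * math.cos(t) - y * math.sin(t),
           x * math.sin(t) + y * math.cos(t)) for x, y in b]
    return math.sqrt(sum((xa - xb) ** 2 + (ya - yb) ** 2
                         for (xa, ya), (xb, yb) in zip(a, rb)))

# The same triangle, doubled in size, rotated 30 degrees and translated,
# has Procrustes distance ~0 from the original:
ref = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
c, s = math.cos(math.pi / 6), math.sin(math.pi / 6)
moved = [(2 * (c * x - s * y) + 5, 2 * (s * x + c * y) - 3) for x, y in ref]
print(round(procrustes_distance(ref, moved), 6))
```

What remains after this alignment is exactly the shape variable the abstract refers to: relative landmark positions free of size, position and orientation.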

  6. The Routledge Applied Linguistics Reader

    Science.gov (United States)

    Wei, Li, Ed.

    2011-01-01

    "The Routledge Applied Linguistics Reader" is an essential collection of readings for students of Applied Linguistics. Divided into five sections: Language Teaching and Learning, Second Language Acquisition, Applied Linguistics, Identity and Power and Language Use in Professional Contexts, the "Reader" takes a broad…


  8. Digital Tectonic Tools

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due

    2005-01-01

    in particular. A model of the aspects in the term tectonics – representation, ontology and culture – will be presented and used to discuss the current digital tools’ ability in tectonics. Furthermore it will be discussed what a digital tectonic tool is and could be and how a connection between the digital

  9. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  10. Google - Security Testing Tool

    OpenAIRE

    Staykov, Georgi

    2007-01-01

    Using Google as a security testing tool: basic and advanced search techniques using advanced Google search operators. Examples of obtaining control over security cameras, VoIP systems and web servers, and of collecting valuable information such as credit card details and CVV codes – using only Google.

  11. Incident Information Management Tool

    CERN Document Server

    Pejovic, Vladimir

    2015-01-01

    Flaws of current incident information management at CMS and CERN are discussed. A new data model for a future incident database is proposed and briefly described. A recently developed draft version of a GIS-based tool for incident tracking is presented.

  12. Photutils: Photometry tools

    Science.gov (United States)

    Bradley, Larry; Sipocz, Brigitta; Robitaille, Thomas; Tollerud, Erik; Deil, Christoph; Vinícius, Zè; Barbary, Kyle; Günther, Hans Moritz; Bostroem, Azalee; Droettboom, Michael; Bray, Erik; Bratholm, Lars Andersen; Pickering, T. E.; Craig, Matt; Pascual, Sergio; Greco, Johnny; Donath, Axel; Kerzendorf, Wolfgang; Littlefair, Stuart; Barentsen, Geert; D'Eugenio, Francesco; Weaver, Benjamin Alan

    2016-09-01

    Photutils provides tools for detecting and performing photometry of astronomical sources. It can estimate the background and background rms in astronomical images, detect sources in astronomical images, estimate morphological parameters of those sources (e.g., centroid and shape parameters), and perform aperture and PSF photometry. Written in Python, it is an affiliated package of Astropy (ascl:1304.002).
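The core of aperture photometry, summing the flux inside a circular aperture, can be illustrated without the package. Note this is not the Photutils API (which additionally handles subpixel pixel overlap, background estimation, errors, and PSF fitting); the function name and toy image are invented:

```python
def aperture_sum(image, cx, cy, r):
    """Sum pixel values whose centres fall inside a circular aperture
    of radius r around (cx, cy) -- the basic idea behind aperture
    photometry, with whole-pixel membership only."""
    return sum(val
               for y, row in enumerate(image)
               for x, val in enumerate(row)
               if (x - cx) ** 2 + (y - cy) ** 2 <= r * r)

# A flat background of 1 with a bright source of 100 at pixel (2, 2):
img = [[1] * 5 for _ in range(5)]
img[2][2] = 100
print(aperture_sum(img, 2, 2, 1.1))  # source + 4 background pixels = 104
```

In practice one would subtract the estimated background per pixel before summing, which is one of the services Photutils provides.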

  13. Change Detection Tools

    NARCIS (Netherlands)

    Dekker, R.J.; Kuenzer, C.; Lehner, M.; Reinartz, P.; Niemeyer, I.; Nussbaum, S.; Lacroix, V.; Sequeira, V.; Stringa, E.; Schöpfer, E.

    2009-01-01

    In this chapter a wide range of change detection tools is addressed. They are grouped into methods suitable for optical and multispectral data, synthetic aperture radar (SAR) images, and 3D data. Optical and multispectral methods include unsupervised approaches, supervised and knowledge-based approaches.
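The simplest unsupervised method for optical data, image differencing with a threshold, can be sketched as follows (the pixel values are a toy example, not from the chapter):

```python
def change_map(before, after, threshold):
    """Pixelwise unsupervised change detection by image differencing:
    mark 1 where the absolute radiometric difference exceeds the
    threshold, 0 elsewhere."""
    return [[1 if abs(a - b) > threshold else 0
             for b, a in zip(row_b, row_a)]
            for row_b, row_a in zip(before, after)]

before = [[10, 10, 10],
          [10, 10, 10]]
after  = [[10, 60, 12],
          [11, 10, 90]]
print(change_map(before, after, threshold=20))
```

Choosing the threshold is the hard part in practice (illumination and sensor differences mimic change), which is what motivates the supervised and knowledge-based alternatives the chapter surveys.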

  14. Field Information Support Tool

    Science.gov (United States)

    2010-09-01

    assessment and analysis tool developed by CASOS at Carnegie Mellon. According to the developer’s Web site (Carley, 2010): It [ORA] contains hundreds of... clinics, and the availability of pharmacies for medical supplies and were mapped in both Google Earth and FusionView. Once collected, the information

  15. Clean Cities Tools

    Energy Technology Data Exchange (ETDEWEB)

    None

    2014-12-19

    The U.S. Department of Energy's Clean Cities offers a large collection of Web-based tools on the Alternative Fuels Data Center. These calculators, interactive maps, and data searches can assist fleets, fuels providers, and other transportation decision makers in their efforts to reduce petroleum use.

  16. Tools and Concepts.

    Science.gov (United States)

    Artis, Margaret, Ed.; And Others

    This guide provides enrichment for students to develop tools and concepts used in various areas of mathematics. The first part presents arithmetic progressions, geometric progressions, and harmonic progressions. In the second section, the concept of mathematical induction is developed from intuitive induction, using concrete activities, to the…

  17. The science writing tool

    Science.gov (United States)

    Schuhart, Arthur L.

    This is a two-part dissertation. The primary part is the text of a science-based composition rhetoric and reader called The Science Writing Tool. This textbook has seven chapters dealing with topics in Science Rhetoric. Each chapter includes a variety of examples of science writing, discussion questions, writing assignments, and instructional resources. The purpose of this text is to introduce lower-division college science majors to the role that rhetoric and communication plays in the conduct of Science, and how these skills contribute to a successful career in Science. The text is designed as a "tool kit," for use by an instructor constructing a science-based composition course or a writing-intensive Science course. The second part of this dissertation reports on student reactions to draft portions of The Science Writing Tool text. In this report, students of English Composition II at Northern Virginia Community College-Annandale were surveyed about their attitudes toward course materials and topics included. The findings were used to revise and expand The Science Writing Tool.

  18. Balancing the tools

    DEFF Research Database (Denmark)

    Leroyer, Patrick

    2009-01-01

    The purpose of this article is to describe the potential of a new combination of functions in lexicographic tools for tourists. So far lexicography has focused on the communicative information needs of tourists, i.e. helping tourists decide what to say in a number of specific tourist situations, ...

  19. Apple Shuns Tracking Tool

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Apple Inc. is advising software developers to stop using a feature in software for its iPhones and iPads that has been linked to privacy concerns, a move that would also take away a widely used tool for tracking users and their behavior. Developers who write programs for Apple's iOS operating system have been using a unique.

  20. Nitrogen Trading Tool (NTT)

    Science.gov (United States)

    The Natural Resources Conservation Service (NRCS) recently developed a prototype web-based nitrogen trading tool to facilitate water quality credit trading. The development team has worked closely with the Agriculture Research Service Soil Plant Nutrient Research Unit (ARS-SPNR) and the Environmenta...

  1. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
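The isolation reports described above reduce to reasoning over a failure-mode/test detection matrix. A toy version (the matrix, failure modes, and tests are invented and do not reflect TEAMS Designer's actual file format):

```python
# Hypothetical detection matrix: which tests fire for each failure mode.
MATRIX = {
    "valve_stuck":   {"t_pressure": 1, "t_flow": 1, "t_temp": 0},
    "pump_degraded": {"t_pressure": 1, "t_flow": 1, "t_temp": 0},
    "sensor_drift":  {"t_pressure": 0, "t_flow": 0, "t_temp": 1},
}

def signatures(matrix):
    """Detection signature of each failure mode: the set of tests that
    detect it (the information behind a Detectability Report)."""
    return {fm: frozenset(t for t, hit in tests.items() if hit)
            for fm, tests in matrix.items()}

def ambiguity_groups(matrix):
    """Failure modes with identical signatures cannot be isolated from
    one another (the information behind a Failure Mode Isolation
    Report): group them into ambiguity sets."""
    groups = {}
    for fm, sig in signatures(matrix).items():
        groups.setdefault(sig, []).append(fm)
    return sorted(sorted(g) for g in groups.values())

print(ambiguity_groups(MATRIX))
```

Dropping a sensor's tests from the matrix and regrouping is, in miniature, the sensor-loss variation study the ETA Tool automates.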

  2. Risk Management Implementation Tool

    Science.gov (United States)

    Wright, Shayla L.

    2004-01-01

    Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making, in order to continually assess what could go wrong, determine which risks are important to deal with, implement strategies to deal with those risks, and measure the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control. These steps, and the various methods and tools that go along with them, make identifying and dealing with risk clear-cut. The office I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals of this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA, and to support an aggressive approach to advertising and advocating the use of RMIT at each NASA center.

  3. Designer nanoparticle: nanobiotechnology tool for cell biology

    Science.gov (United States)

    Thimiri Govinda Raj, Deepak B.; Khan, Niamat Ali

    2016-09-01

    This article discusses the use of nanotechnology for subcellular compartment isolation and its application to subcellular omics. This technology review contributes significantly to our understanding of the use of nanotechnology for subcellular systems biology. Here we elaborate on the nanobiotechnology approach of using superparamagnetic nanoparticles (SPMNPs), optimized with different surface coatings, for subcellular organelle isolation. Using a pulse-chase approach, we review how SPMNPs interact differently with the cell depending on their surface functionalization. The article focuses on the use of functionalized SPMNPs as a nanobiotechnology tool to isolate high-quality (in both purity and yield) plasma membranes and endosomes or lysosomes. Such a nanobiotechnology tool can be applied to generating subcellular compartment inventories. As a future perspective, this strategy could be applied in areas such as immunology, cancer, and stem cell research.

  4. An intelligent condition monitoring system for on-line classification of machine tool wear

    Energy Technology Data Exchange (ETDEWEB)

    Fu Pan; Hope, A.D.; Javed, M. [Systems Engineering Faculty, Southampton Institute (United Kingdom)

    1997-12-31

    The development of intelligent tool condition monitoring systems is a necessary requirement for successful automation of manufacturing processes. This presentation introduces a tool wear monitoring system for milling operations. The system utilizes power, force, acoustic emission and vibration sensors to monitor tool condition comprehensively. Features relevant to tool wear are drawn from time and frequency domain signals and a fuzzy pattern recognition technique is applied to combine the multisensor information and provide reliable classification results of tool wear states. (orig.) 10 refs.

  5. EVALUATION OF MACHINE TOOL QUALITY

    Directory of Open Access Journals (Sweden)

    Ivan Kuric

    2011-12-01

    The paper deals with aspects of the quality and accuracy of machine tools. As machine tool accuracy is a key factor in product quality, it is important to know the methods for evaluating machine tool quality and accuracy. Several aspects of machine tool diagnostics, such as reliability, are described.

  6. Web Tools: The Second Generation

    Science.gov (United States)

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second-generation tools, help districts save time and money and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which fall under 21st-century skills. The second-generation tools are growing in popularity…

  7. Graphics-Based Parallel Programming Tools

    Science.gov (United States)

    1991-09-01

    Final report. Janice E. Cuny, Principal Investigator, Department of… …suggest parallel (either because we use a parallel graph rewriting mechanism or because we apply our results to parallel programming), we interpret it to… …was to provide support for the explicit representation of graphs for use within a parallel programming environment. In our environment, we view a…

  8. Mass Casualty Triage Performance Assessment Tool

    Science.gov (United States)

    2015-02-01

    …tactical tasks for which Soldiers are supposed to be trained to complete and the lack of more precise measurement tools, one key gap identified for… • …(between the wound and the heart) and elevate the wound above the level of the heart to slow the flow of blood to the wound. • Apply a clean… TERMS: Assessment, Triage, Performance measurement, Feedback, Tasks-Collective, Brigade Combat Teams, Task analysis

  9. Dynamic optimization case studies in DYNOPT tool

    Science.gov (United States)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    Dynamic programming is typically applied to optimization problems. As analytical solutions are generally very difficult to obtain, software tools are widely used. These software packages are often third-party products bound to standard simulation software tools on the market. As typical examples of such tools, TOMLAB and DYNOPT can be effectively applied to the solution of dynamic programming problems. DYNOPT is presented in this paper due to its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory, given a description of the process, the cost to be minimized, and equality and inequality constraints, using orthogonal collocation on finite elements. The optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the optimized dynamic model may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT in the field of dynamic optimization by means of case studies on selected laboratory physical educational models.
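    The direct-transcription idea underlying tools like DYNOPT can be illustrated on a deliberately tiny problem. This sketch uses SciPy rather than DYNOPT's MATLAB collocation, and the problem itself is made up: minimize the integral of u² subject to x' = u, x(0) = 0, x(1) = 1, whose analytic optimum is u(t) = 1 with cost 1:

```python
# Direct transcription of a toy optimal control problem (a sketch of the
# general technique, not DYNOPT): parameterize the control on N intervals
# and hand the resulting finite-dimensional NLP to an optimizer.
import numpy as np
from scipy.optimize import minimize

N = 20                       # number of control intervals
h = 1.0 / N                  # step size on t in [0, 1]

def cost(u):                 # rectangle-rule approximation of integral(u^2)
    return h * np.sum(u**2)

def terminal_state(u):       # forward-Euler state at t = 1 minus the target
    return h * np.sum(u) - 1.0

res = minimize(cost, x0=np.zeros(N), method="SLSQP",
               constraints={"type": "eq", "fun": terminal_state})
print(res.fun)               # close to the analytic minimum, 1.0
```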

  10. Performance Auditing: A Management Tool for School Business Services.

    Science.gov (United States)

    Dierdorff, William H.

    1989-01-01

    Performance auditing is a tool designed to assist public officials in meeting their responsibility to apply resources efficiently and effectively. Applies performance auditing to school business support services; identifies benefits and obstacles; and provides selected alternatives and examples of performance auditing. (MLF)

  11. C++ Software Quality in the ATLAS experiment: Tools and Experience

    CERN Document Server

    Martin-Haugh, Stewart; The ATLAS collaboration

    2017-01-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  12. Algal functional annotation tool

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, D. [UCLA; Casero, D. [UCLA; Cokus, S. J. [UCLA; Merchant, S. S. [UCLA; Pellegrini, M. [UCLA

    2012-07-01

    The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of genes on KEGG pathway maps and batch gene identifier conversion.
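    A term-enrichment search of the kind described above is commonly based on a hypergeometric test. The sketch below illustrates that statistic with made-up numbers; it is not the Algal Functional Annotation Tool's actual implementation:

```python
# Hypergeometric enrichment sketch (hypothetical numbers): is a functional
# term over-represented in a user's gene list relative to the genome?
from scipy.stats import hypergeom

M = 15000   # annotated genes in the genome (population size)
n = 300     # genes carrying the term of interest (successes in population)
N = 200     # genes in the user's list (draws)
k = 15      # genes in the list carrying the term (observed successes)

# P(X >= k): probability of at least k hits by chance.
p_value = hypergeom.sf(k - 1, M, n, N)
print(f"enrichment p-value: {p_value:.3g}")
```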

  13. Automated Standard Hazard Tool

    Science.gov (United States)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for the system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to integrate easily into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. The project's success is outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  14. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, acronym for Automatically Programed Tools, is among most widely used software systems for computerized machine tools. APT developed for explicit purpose of providing effective software system for programing NC machine tools. APT system includes specification of APT programing language and language processor, which executes APT statements and generates NC machine-tool motions specified by APT statements.

  15. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development, and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work.

  16. Channel nut tool

    Energy Technology Data Exchange (ETDEWEB)

    Olson, Marvin

    2016-01-12

    A method, system, and apparatus for installing channel nuts includes a shank, a handle formed on a first end of the shank, and an end piece with a threaded shaft, configured to receive a channel nut, formed on the second end of the shank. The tool can be used to insert or remove a channel nut in a channel framing system and then be removed from the channel nut.

  17. Program Management Tool

    Science.gov (United States)

    Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil

    2007-01-01

    The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include monthly reports as well as data on goals, deliverables, milestones, business processes, personnel, task plans, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Applications (VBA) libraries. The PMT has an extensible architecture that enables integration of the PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC). The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity…

  18. Udder Hygiene Analysis tool

    OpenAIRE

    2013-01-01

    In this report, the pilot of UHC is described. The main objective of the pilot is to make farmers more aware of how to increase udder health in dairy herds. This goes through changing management aspects related to hygiene. The report firstly provides general information about antibiotics and the processes that influence udder health. Secondly, six subjects are described related to udder health. Thirdly, the tools (checklists and roadmap) are shown, and fourthly, the advice written by UH…

  19. Knowing about tools: neural correlates of tool familiarity and experience.

    Science.gov (United States)

    Vingerhoets, Guy

    2008-04-15

    The observation of tools is known to elicit a distributed cortical network that reflects close-knit relations of semantic, action-related, and perceptual knowledge. The neural correlates underlying the critical knowledge of skilled tool use, however, remain to be elucidated. In this study, functional magnetic resonance imaging in 14 volunteers compares neural activation during the observation of familiar tools versus equally graspable unfamiliar tools of which the observers have little, if any, functional knowledge. In a second paradigm, the level of tool experience is investigated by comparing the neural effects of observing frequently versus infrequently used familiar tools. Both familiar and unfamiliar tools activate the classic neural network associated with tool representations. Direct comparison of the activation patterns during the observation of familiar and unfamiliar tools in a priori determined regions of interest points to tool-use knowledge, with the supramarginal gyrus storing information about limb and hand positions and the precuneus storing visuospatial information about hand-tool interactions. As no frontal activation survived this contrast, it appears that premotor activity is unrelated to experience-based motor knowledge of tool use/function but rather is elicited by any graspable tool. Confrontation with unfamiliar or infrequently used tools reveals an increase in inferior temporal and medial and lateral occipital activation, predominantly in the left hemisphere, suggesting that these regions reflect visual feature processing for tool identification.

  20. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
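    The steady-state half of a cascading-outage sequence can be caricatured in a few lines. This toy model is not DCAT and performs no power flow; it simply trips the worst overloaded line and pushes its flow onto the remaining lines in proportion to their headroom, repeating until no overloads remain:

```python
# Toy cascading-outage loop (illustrative only, not DCAT or PSS/E):
# trip overloaded lines one at a time and redistribute their flow.
def cascade(lines):
    """lines: {name: (flow, limit)}; returns the sequence of tripped lines."""
    tripped = []
    while True:
        overloaded = [n for n, (f, lim) in lines.items() if f > lim]
        if not overloaded:
            return tripped
        # Trip the most heavily overloaded line first.
        worst = max(overloaded, key=lambda n: lines[n][0] / lines[n][1])
        flow, _ = lines.pop(worst)
        tripped.append(worst)
        # Redistribute its flow in proportion to remaining headroom.
        headroom = sum(max(lim - f, 0) for f, lim in lines.values()) or 1.0
        lines = {n: (f + flow * max(lim - f, 0) / headroom, lim)
                 for n, (f, lim) in lines.items()}

print(cascade({"L1": (120.0, 100.0), "L2": (80.0, 100.0), "L3": (50.0, 100.0)}))
```

In this example the initial overload on L1 cascades until all three lines trip, which is the qualitative behavior such tools are built to predict and mitigate.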

  1. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
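    The frequency response calculation referred to above can be sketched as simple arithmetic. This is an illustrative reading of the BAL-003-1 convention of expressing response in MW per 0.1 Hz, not FRAT code, and the event numbers are hypothetical:

```python
# Frequency response measure (FRM) sketch: MW of resource loss divided by
# ten times the frequency decline, giving MW per 0.1 Hz.
def frm(mw_loss, freq_a, freq_b):
    """mw_loss: generation lost (MW); freq_a: pre-event (Value A) frequency;
    freq_b: settled post-event (Value B) frequency, both in Hz."""
    delta_f = freq_a - freq_b            # frequency decline, Hz
    return mw_loss / (10.0 * delta_f)    # MW per 0.1 Hz

# Hypothetical event: 1,200 MW loss, frequency settles 60.000 -> 59.940 Hz.
print(frm(1200.0, 60.000, 59.940))       # 2000.0 MW per 0.1 Hz
```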

  3. Subcycling applied to the inflation of an automotive airbag

    NARCIS (Netherlands)

    Bruijs, W.E.M.; Buijk, A.J.; Coo, P.J.A. de; Sauren, A.A.H.J.

    1990-01-01

    Even when a supercomputer is used for a full-scale crash simulation, the computer time is too large to apply crash simulation as an efficient and fast design tool. A method to reduce the computer time is subcycling. In this paper the subcycling algorithm as implemented in the PISCES-3DELK code is discussed.
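    The subcycling idea can be sketched on a toy stiff system: the fast equation takes n explicit substeps inside each large step of the slow equation, instead of forcing one tiny global step on everything. This is only an illustration of the concept, not the PISCES-3DELK implementation:

```python
# Subcycling sketch: slow variable x' = -x advances with step H, while the
# stiff fast variable y' = -k (y - x) takes n substeps of size H/n,
# interpolating the slow state across the big step.
def subcycled_step(x, y, H, n, k=200.0):
    x_new = x + H * (-x)                 # one big explicit step for x
    h = H / n
    for i in range(n):                   # n small explicit steps for y
        x_mid = x + (i + 0.5) * h * (x_new - x) / H   # interpolated x
        y = y + h * (-k * (y - x_mid))
    return x_new, y

x, y, H = 1.0, 0.0, 0.01
for _ in range(100):                     # integrate to t = 1, 40 substeps/step
    x, y = subcycled_step(x, y, H, n=40)
print(round(x, 4), round(y, 4))          # y relaxes toward x, roughly exp(-1)
```

Without subcycling, the whole system would need the fast step h everywhere; here the slow equation is evaluated only once per large step, which is the source of the computer-time savings the abstract describes.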

  4. Viral load: Roche applies for marketing approval for ultrasensitive test.

    Science.gov (United States)

    1998-08-07

    Roche Molecular Systems has applied for FDA permission to market a more sensitive viral load test. The Amplicor HIV-1 Monitor UltraSensitive Method tests viral load as low as 50 copies; current tests are only accurate to 400 copies. There is a widespread consensus among physicians that testing below 400 copies would be a valuable treatment tool.

  5. Applied Ethics in Nowadays Society

    Directory of Open Access Journals (Sweden)

    Tomita CIULEI

    2013-12-01

    This special issue is dedicated to applied ethics in present-day society and falls within the field of social sciences and humanities, hosting both theoretical approaches and empirical research in various areas of applied ethics. Applied ethics analyzes concrete moral situations in social or professional practice in order to make and adopt decisions. The field of applied ethics encompasses medical ethics, legal ethics, media ethics, professional ethics, environmental ethics, business ethics, etc. Classification-JEL: A23

  6. Abductive networks applied to electronic combat

    Science.gov (United States)

    Montgomery, Gerard J.; Hess, Paul; Hwang, Jong S.

    1990-08-01

    A practical approach to dealing with combinatorial decision problems and uncertainties associated with electronic combat through the use of networks of high-level functional elements called abductive networks is presented. It describes the application of the Abductory Induction Mechanism (AIM), a supervised inductive learning tool for synthesizing polynomial abductive networks, to the electronic combat problem domain. From databases of historical, expert-generated, or simulated combat engagements, AIM can often induce compact and robust network models for making effective real-time electronic combat decisions despite significant uncertainties or a combinatorial explosion of possible situations. The feasibility of applying abductive networks to realize advanced combat decision aiding capabilities was demonstrated by applying AIM to a set of electronic combat simulations. The networks synthesized by AIM generated accurate assessments of the intent, lethality, and overall risk associated with a variety of simulated threats and produced reasonable estimates of the expected effectiveness of a group of electronic countermeasures for a large number of simulated combat scenarios. This paper presents the application of abductive networks to electronic combat, summarizes the results of experiments performed using AIM, discusses the benefits and limitations of applying abductive networks to electronic combat, and indicates why abductive networks can often result in capabilities not attainable using alternative approaches.

  7. The practice of applying the quality improvement tool of "quality control circle" to decrease the fall rate of hospitalized patients

    Institute of Scientific and Technical Information of China (English)

    蔡学联; 郑芝芬; 唐晓英; 孙香爱; 裘文娟; 贾勤; 杜丽萍; 裘丹英

    2011-01-01

    Objective: To explore the effect of quality control circle activities in decreasing the fall rate of hospitalized patients. Methods: Based on patient fall reports, the quality control circle team reviewed 39 falls of hospitalized patients in our hospital from January 2006 to September 2008, analyzed the causes, set goals, developed and implemented countermeasures, and measured the fall rate of hospitalized patients after the application of quality control circle methods (January 2009 to September 2011). Results: The fall rate of hospitalized patients decreased from 0.03‰ before the quality control circle program to 0.01‰ afterwards, an improvement of 66.7%; the difference was statistically significant (P<0.05). Conclusions: Proper application of the quality improvement tool of "quality control circle" can effectively decrease the fall rate of hospitalized patients and improve the quality improvement skills of circle members.

  8. Accuracy assessment of the ERP prediction method based on analysis of 100-year ERP series

    Science.gov (United States)

    Malkin, Z.; Tissen, V. M.

    2012-12-01

    A new method has been developed at the Siberian Research Institute of Metrology (SNIIM) for highly accurate prediction of UT1 and polar motion (PM). In this study, a detailed comparison was made of real-time UT1 predictions made in 2006-2011 and PM predictions made in 2009-2011 using the SNIIM method with simultaneous predictions computed at the International Earth Rotation and Reference Systems Service (IERS), USNO. The results show that the proposed method provides better accuracy at different prediction lengths.

  9. What History Is Teaching Us: 100 Years of Advocacy in "Music Educators Journal"

    Science.gov (United States)

    Hedgecoth, David M.; Fischer, Sarah H.

    2014-01-01

    As "Music Educators Journal" celebrates its centennial, it is appropriate to look back over the past century to see how advocacy in music education has evolved. Of the more than 200 submitted articles on advocacy, four main themes emerged: music education in community, the relevancy of music education, the value of music education, and…

  10. Centennial Aerospace Power: The ’US Air Force’ at 100 Years

    Science.gov (United States)

    2000-01-01

    Walker, Tom, "Bomb video took fight out of Milosevic," London Sunday Times, January 30, 2000. (Walker, 2000) The article implied that the video was the turning point when Milosevic saw what lay in store. Was this the deciding factor…

  11. The role of the expedition doctor: lessons from 100 years ago.

    Science.gov (United States)

    Guly, Henry R

    2012-06-01

    This paper explores the role of the doctor on the expeditions of the heroic age of Antarctic exploration. The medical role includes medical screening of prospective expedition members, choosing medical equipment so as to maintain a balance between being able to cope with any eventuality and the cost and weight of equipment and drugs, health screening during an expedition, first aid training for field parties without a doctor, and, obviously, treatment of any injury or disease that occurs. If injury or illness occurs, the presence of a doctor is of great psychological benefit to the expedition. Although medical experience is important, it is probably more important that the doctor is a "team member," playing a full part in the expedition's aims, whether these are scientific, exploration, or reaching some goal. Most of the lessons learned during these expeditions a hundred years ago are just as relevant today.

  12. A personal perspective: 100-year history of the humoral theory of transplantation.

    Science.gov (United States)

    Terasaki, Paul I

    2012-04-27

    The humoral theory states that antibodies cause the rejection of allografts. From 1917 to 1929, extensive efforts were made to produce antibodies against tumors. It was finally realized that the antibodies were produced against the transplant antigens present on transplantable tumors, not against the tumor-specific antigens. To get around this problem, inbred mouse strains were developed, leading to identification of the transplant antigens determined by the H-2 locus of mice. The antibodies were hemagglutinating and cytotoxic antibodies. The analogous human leukocyte antigen system was established by analysis of lymphocytotoxic alloantibodies that were made by pregnant women, directed against mismatched antigens of the fetus. The human leukocyte antigen antibodies were then found to cause hyperacute rejection, acute rejection, and chronic rejection of kidneys. Antibodies appeared in almost all patients after rejection of kidneys. With Luminex single antigen bead technology, donor-specific antibodies could be identified before a rise in serum creatinine and graft failure. Antibodies were shown to be predictive of subsequent graft failure in kidney, heart, and lung transplants: patients without antibodies had superior 4-year graft survival compared with those who did have antibodies. New evidence that antibodies are also associated with chronic failure has appeared for liver and islet transplants. Four studies have now shown that removal or reduction of antibodies results in higher graft survival. If removal of antibodies prevents chronic graft failure, final validation of the humoral theory can be achieved.

  13. Piping: Over 100 years of experience: From empiricism towards reliability-based design

    NARCIS (Netherlands)

    Van Beek, V.M.; Knoeff, H.G.; Schweckendiek, T.

    2011-01-01

    Backward piping is the process of channel formation in a sandy aquifer under river dikes. During high water periods this process manifests itself by the formation of sand boils. A long history of cases and experiments has contributed to the insights into this phenomenon and has improved the ability

  14. Publicly Open Virtualized Gaming Environment For Simulation of All Aspects Related to '100 Year Starship Study'

    Science.gov (United States)

    Obousy, R. K.

    2012-09-01

    Sending a mission to distant stars will require our civilization to develop new technologies and change the way we live. The complexity of the task is enormous [1]; thus, the thought is to involve people from around the globe through the "citizen scientist" paradigm. The suggestion is a "Gaming Virtual Reality Network" (GVRN) to simulate sociological and technological aspects involved in this project. Currently there is work being done [2] on developing a technology which will construct computer games within GVRN. This technology will provide quick and easy ways for individuals to develop game scenarios related to various aspects of the "100YSS" project. People will be involved in solving certain tasks just by playing games. Players will be able to modify conditions, add new technologies, geological conditions, and social movements, and assemble new strategies just by writing scenarios. The system will interface with textual and video information, extract scenarios written in millions of texts, and use them to assemble new games. Thus, players will be able to simulate an enormous number of possibilities. Information technologies will be involved, which will require us to start building the system in a way that any module can be easily replaced. Thus, GVRN should be modular and open to the community.

  15. 100 Years later: Celebrating the contributions of x-ray crystallography to allergy and clinical immunology.

    Science.gov (United States)

    Pomés, Anna; Chruszcz, Maksymilian; Gustchina, Alla; Minor, Wladek; Mueller, Geoffrey A; Pedersen, Lars C; Wlodawer, Alexander; Chapman, Martin D

    2015-07-01

    Current knowledge of molecules involved in immunology and allergic disease results from the significant contributions of x-ray crystallography, a discipline that just celebrated its 100th anniversary. The histories of allergens and x-ray crystallography are intimately intertwined. The first enzyme structure to be determined was lysozyme, also known as the chicken food allergen Gal d 4. Crystallography determines the exact 3-dimensional positions of atoms in molecules. Structures of molecular complexes in the disciplines of immunology and allergy have revealed the atoms involved in molecular interactions and mechanisms of disease. These complexes include peptides presented by MHC class II molecules, cytokines bound to their receptors, allergen-antibody complexes, and innate immune receptors with their ligands. The information derived from crystallographic studies provides insights into the function of molecules. Allergen function is one of the determinants of environmental exposure, which is essential for IgE sensitization. Proteolytic activity of allergens or their capacity to bind LPSs can also contribute to allergenicity. The atomic positions define the molecular surface that is accessible to antibodies. In turn, this surface determines antibody specificity and cross-reactivity, which are important factors for the selection of allergen panels used for molecular diagnosis and the interpretation of clinical symptoms. This review celebrates the contributions of x-ray crystallography to clinical immunology and allergy, focusing on new molecular perspectives that influence the diagnosis and treatment of allergic diseases.

  16. General relativity, the most beautiful of theories: applications and trends after 100 years

    CERN Document Server

    2015-01-01

    Generalising Newton's law of gravitation, general relativity is one of the pillars of modern physics. On the occasion of general relativity's centennial, leading scientists in the different branches of gravitational research review the history and recent advances in the main fields of applications of the theory, which was referred to by Lev Landau as “the most beautiful of the existing physical theories”.

  17. [Primal psychoanalytic manuscript. 100 years "Studies of Hysteria" by Josef Breuer and Sigmund Freud].

    Science.gov (United States)

    Grubrich-Simitis, I

    1995-12-01

    In 1895 Breuer and Freud jointly published the Studies on Hysteria, a work that Grubrich-Simitis regards as the very first psychoanalytic monograph. The author begins by outlining the intellectual context in which the work took shape and the initial reception accorded to it by contemporary medical science and sexology. The main focus of the discussion centres on those aspects of the book that mark it out as a genuinely psychoanalytic work - a hitherto unknown quality of seeing and hearing, a radical change in the relationship between doctor and patient, the establishment of a new form of case presentation, and the development of approaches adumbrating psychoanalytic theory and technique. In conclusion the author describes the scientific cooperation between Freud and Breuer, assigning to the latter his rightful place in the history of psychoanalysis, a status frequently denied him by Freudians.

  18. Amazon drought worst in 100 years

    Institute of Scientific and Technical Information of China (English)

    宋燕波

    2005-01-01

    The Amazon rainforest is the world's largest tropical rainforest: covering 300 million hectares, it stretches from the lower slopes of the Andes to Brazil's Atlantic coast and accounts for one third of the world's remaining tropical rainforest. Abundant rainfall, a hot and humid climate, and long periods of intense sunshine have made the region rich in natural resources and species, ecologically complex, and well preserved in its biodiversity, earning it the reputations of "paradise for biologists" and "lungs of the Earth."

  19. Is psychiatry only neurology? Or only abnormal psychology? Déjà vu after 100 years.

    Science.gov (United States)

    de Leon, Jose

    2015-04-01

    Forgetting history, which frequently repeats itself, is a mistake. In General Psychopathology, Jaspers criticised early 20th century psychiatrists, including those who thought psychiatry was only neurology (Wernicke) or only abnormal psychology (Freud), or who did not see the limitations of the medical model in psychiatry (Kraepelin). Jaspers proposed that some psychiatric disorders follow the medical model (Group I), while others are variations of normality (Group III), or comprise schizophrenia and severe mood disorders (Group II). In the early 21st century, the players' names have changed but the game remains the same. The US NIMH is reprising both Wernicke's brain mythology and Kraepelin's marketing promises. The neo-Kraepelinian revolution started at Washington University, became pre-eminent through the DSM-III developed by Spitzer, but reached a dead end with the DSM-5. McHugh, who described four perspectives in psychiatry, is the leading contemporary representative of the Jaspersian diagnostic approach. Other neo-Jaspersians are: Berrios, Wiggins and Schwartz, Ghaemi, Stanghellini, Parnas and Sass. Can psychiatry learn from its mistakes? The current psychiatric language, organised at its three levels, symptoms, syndromes, and disorders, was developed in the 19th century but is obsolete for the 21st century. Scientific advances in Jaspers' Group III disorders require collaborating with researchers in the social and psychological sciences. Jaspers' Group II disorders, redefined by the author as schizophrenia, catatonic syndromes, and severe mood disorders, are the core of psychiatry. Scientific advancement in them is not easy because we are not sure how to delineate between and within them correctly.

  20. Flood dynamics in urbanised landscapes: 100 years of climate and humans’ interaction

    Science.gov (United States)

    Sofia, G.; Roder, G.; Dalla Fontana, G.; Tarolli, P.

    2017-01-01

    Understanding how humans and climate drivers have interacted to shape the past and current development of floods in urbanised landscapes is of great importance. This study presents a regional screening of land-use, rainfall regime, and flood dynamics in north-eastern Italy, covering the timeframe 1900–2010. The analysis suggests that, statistically, both climate and land-use have contributed significantly to the growing share of short-duration floods in the increase in the number of flooded locations. It also suggests that the two interact: land-use dynamics couple with climatic changes, simultaneously influencing flood aggressiveness. Given that the climatic trend cannot be controlled, effective disaster management clearly needs an integrated approach to land planning and supervision. This research shows that land management and planning should consider the locations of past and future social and economic drivers of development, as well as past and current climatic trends. PMID:28079147